
Indie Game Development Discussion Thread | Of Being Professionally Poor


Turfster

Member
Bleeding enemies. :D


Q: Is there a systray util (Win7) for switching color modes quickly? I need
to switch between 256 colors and 32bit truecolor mode. Anyone?

Not really a systray utility, but you could keep the exe's properties open on the compatibility tab and toggle "run in 256 colors"
 

Turfster

Member
^ Ohh, awesome. Works great. Thx! :+

You're welcome.

No, I mean pathfinding for AI agents. I ask because I had the idea of making a Descent-like game too, but I wasn't sure how to implement pathfinding for the enemies, since Unity's navmesh can't be used for a fully 3D environment (with enemies that can chase you through corridors leading upwards or downwards).

Or do you just have a system for obstacle avoidance for the enemies?

While it's not entirely trivial, it's not impossible to scale up a navigation graph to use 3D volumes instead of polygons. (A connecting edge is a connecting edge, after all.)
Alternatively, depending on the actual map layout and complexity, it could be reducible to a 2D navmesh with some extra information. (Not saying this is the case, I'm just freewheeling how I'd do it here ;)

(Sorry, I'm in a navmesh mood. I just got past another milestone in rolling my own for runtime dynamic generation, because the Unity navmesh really is rather useless and limited, especially in Indie)
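
To illustrate the "a connecting edge is a connecting edge" point, the rough shape of a volume-based nav node isn't much different from a polygon one. Purely an illustrative TypeScript sketch with made-up names, nothing from my actual code:

Code:
// Illustrative sketch only: a nav "node" that's a 3D volume instead of a polygon.
// The graph search on top (A*, Dijkstra, whatever) doesn't care what a node is.
type Vec3 = [number, number, number];

interface NavVolume {
  id: number;
  min: Vec3;              // axis-aligned bounds of the flyable/walkable volume
  max: Vec3;
  neighbours: number[];   // ids of volumes sharing a connecting face/portal
}

function volumeCentre(v: NavVolume): Vec3 {
  return [
    (v.min[0] + v.max[0]) / 2,
    (v.min[1] + v.max[1]) / 2,
    (v.min[2] + v.max[2]) / 2,
  ];
}

// Edge cost between two connected volumes, e.g. distance between their centres.
function linkCost(a: NavVolume, b: NavVolume): number {
  const ca = volumeCentre(a), cb = volumeCentre(b);
  return Math.hypot(cb[0] - ca[0], cb[1] - ca[1], cb[2] - ca[2]);
}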
 

Paul F

Neo Member
No, I mean pathfinding for AI agents. I ask because I had the idea of making a Descent-like game too, but I wasn't sure how to implement pathfinding for the enemies, since Unity's navmesh can't be used for a fully 3D environment (with enemies that can chase you through corridors leading upwards or downwards).

Or do you just have a system for obstacle avoidance for the enemies?

I rolled my own system for this. NeonXSZ has as many as 1000 enemy ships flying around at once, and I needed a system that was ultra lightweight. I don't have the time to write up how I did it right now, but I should have some time tomorrow.
 

AxiomVerge

Neo Member
IntroShotAnimated.gif


I added the suggested vegetation (and some buildings), as well as a wrapping particle snow effect. Looks a bit more fleshed out now, I think.
 

Alts

Member
I finally finished a game I've been working on (off and on) since last April.
Umbragram
It's a puzzle game about perception. It plays similarly, in certain ways, to Picross. You have the two shapes on the left and right, and you have to build the isometric structure that casts those shadows. It's pretty short, just 12 puzzles. I estimate it'd take maybe a half hour to get through. Give it a try and let me know what you think.
http://games.evilrobotstuff.com/umbragram/

ss02-crs.png
 
Hey Wiiu-DevGAF,

Anyone else having problems where the dimensions of the GUILayer (I think that would best describe it) sent to the GamePad are significantly smaller than the actual resolution of the GamePad itself?

It's driving me goddamn insane.

In other news, I finally figured out the implementation of rhythm gameplay in the game.
EXCITE GET
 

Blizzard

Banned
Hey Wiiu-DevGAF,

Anyone else having problems where the dimensions of the GUILayer (I think that would best describe it) sent to the GamePad are significantly smaller than the actual resolution of the GamePad itself?

It's driving me goddamn insane.

In other news, I finally figured out the implementation of rhythm gameplay in the game.
EXCITE GET
If you haven't posted it to the corresponding subforum on the Nintendo indie dev site, do that; someone using the same tools might have had similar experiences, and no one will be able to answer here without breaking NDA by talking about the tools.
 
If you haven't posted it to the corresponding subforum on the Nintendo indie dev site, do that; someone using the same tools might have had similar experiences, and no one will be able to answer here without breaking NDA by talking about the tools.

Yeah, I definitely did that a few days ago.
I was wondering if talking about the Unity side of it would still be breaking the NDA; I guess it's better to be safe than sorry.
But Jesus H Lopez, the lack of communication can be really frustrating sometimes.
 

bumpkin

Member
Here's my take:


The ideal framework is to update state once per frame, render, wait for v-sync, repeat. That's awesome when you know the power of the machine you're running on and you *know* your render will never take too long (which would result in dropped frames). Problem is, that's not usually very realistic.

The alternative of taking the time difference between this frame and the previous one and feeding it into your update plays havoc with physics and the feel of your game, whether the game runs slowly or quickly. Weirdness lies this way.


The best framework I've seen is keyframed state update with interpolation for render. This lets you decouple the frequency of state update from render frequency.

The update of things like enemy positions, or anything else that changes over time (the state update), is separated from the render update. The state update can run zero, one, or many times per frame. Interpolation and rendering are done once.

First, decide how many times per second your state update will run; from this you calculate your fixed timestep. The lower the timestep, the better your collisions will be, since things can't travel as far per update. You can pick anything, but 60Hz is common. Depends on the game.

At the start of each game loop you figure out how many state updates should have been generated since the start of the game, starting at 2 (explained below). NOTE Real time is not used to update the internal state, just to work out how many state updates should have been generated at this point in real time.

You then step the state update until you are at that number. Each step generates a new frame of state for the game, and stores the previous frame's state. Let's call the previous frame State Frame 0 and the new frame State Frame 1. So, every time you step, you copy State Frame 1 to State Frame 0, and generate a new State Frame 1.

State Frame 1 is always at or up to one time step ahead of real time. You need this because once you've generated past the current time you can interpolate the state for the actual real time. You figure out how far you are between the last two frames, generate a number between 0.0 and 1.0, and use that number to lerp all the state from state frame 0 to state frame 1. You then render things at their interpolated positions. Simple!


Your physics is always updated with a fixed timestep, meaning your controls feel the same. Your display can run vsynced, non-vsynced, slow, fast, whatever. You still play the exact same internal game. It'll feel great at 500fps, and the physics won't do anything weird! Same if it's displaying slowly: 20fps won't mean a 1/20th second update for the physics, which could let fast-moving objects start to pass through walls, and other such crap. It'll still look like shit, because 20fps is... shit! :D

One thing I have noticed with this is that you feel like you have more control at higher frame rates, even though you don't. Weird effect, but we tested it years ago at work. The game could update its state at 20Hz and feel fine as long as the display ran at 60Hz. We picked 60Hz state frequency for physics issues though.

NOTE This doesn't require threading. You can do this all on one thread. Start frame, run state updates as required, interpolate, render, optionally wait for v-sync.

MASSIVE NOTE Settle on a state timestep before fine tuning any physics handling/controls. Your magic numbers will only feel right for one time step. Doubling the timestep and halving your settings WILL NOT WORK. It'll be subtly off, and feel wrong.
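
In code, the skeleton of that loop ends up looking something like this. (Rough single-threaded sketch in TypeScript, not code from any of the games mentioned; initialGameState, step, lerpState and render are stand-ins for your own game code.)

Code:
// Fixed timestep state update with interpolated rendering (illustrative sketch only).
const STEP_HZ = 60;
const DT = 1 / STEP_HZ;                     // fixed timestep, in seconds

type GameState = { /* positions, velocities, ... */ };
declare function initialGameState(): GameState;
declare function step(prev: GameState, dt: number): GameState;
declare function lerpState(a: GameState, b: GameState, t: number): GameState;
declare function render(s: GameState): void;

let frame0 = initialGameState();            // "State Frame 0"
let frame1 = step(frame0, DT);              // "State Frame 1", one step ahead of real time
let generatedSteps = 2;                     // we start at 2, as described above
const startTime = performance.now() / 1000;

function gameLoop() {
  const elapsed = performance.now() / 1000 - startTime;

  // How many state frames *should* exist by now? (Real time is only used for this.)
  const targetSteps = Math.floor(elapsed / DT) + 2;

  // Run the state update zero, one, or many times to catch up.
  while (generatedSteps < targetSteps) {
    frame0 = frame1;                        // copy State Frame 1 to State Frame 0
    frame1 = step(frame0, DT);              // generate a new State Frame 1
    generatedSteps++;
  }

  // How far real time sits between the last two state frames: 0.0 .. 1.0
  const alpha = (elapsed - (generatedSteps - 2) * DT) / DT;

  render(lerpState(frame0, frame1, alpha)); // draw everything at interpolated positions

  requestAnimationFrame(gameLoop);          // repeat (optionally waiting for v-sync)
}
gameLoop();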


A few of the games I've worked on (Colin McRae Rally, SEGA Rally) worked like this, and Unity does too:

Code:
... the execution order for any given script:
All Awake calls
All Start calls
while (stepping towards variable delta time)
    All FixedUpdate functions
    Physics simulation
    OnEnter/Exit/Stay trigger functions
    OnEnter/Exit/Stay collision functions
Rigidbody interpolation applies transform.position and rotation
OnMouseDown/OnMouseUp etc. events
All Update functions
Animations are advanced, blended and applied to transform
All LateUpdate functions
Rendering

https://docs.unity3d.com/Documentation/Manual/ExecutionOrder.html


So yeah, what Mental Atrophy said, but with more words :D

Hope that helps someone! :D
hehe, thanks razu! I appreciate the added detail. It clarifies what's going on a little bit more. In the previous iteration of my engine, I was letting it run as fast as it possibly could for everything -- updates, render, etc. -- and was passing a delta time calculation into 'em. While things like movement and animation seemed to behave as I expected, collision detection was spotty as hell. It made troubleshooting the issues a nightmare. I'd wager that it wasn't possible to "fix" the problems without having some sort of fixed time-step managing things. It didn't stop me from trying (and failing miserably) though.

I've incorporated something similar to what Mental Atrophy said into my main game loop. Hopefully this will help me keep things working reliably and logically this time around.
 

Paul F

Neo Member
No, I mean pathfinding for AI agents. I ask because I had the idea of making a Descent-like game too, but I wasn't sure how to implement pathfinding for the enemies, since Unity's navmesh can't be used for a fully 3D environment (with enemies that can chase you through corridors leading upwards or downwards).

Or do you just have a system for obstacle avoidance for the enemies?

Okay, hopefully I have enough time to make a decent explanation now. It's a multi-tiered solution.
Firstly, ships in Neon don't just spawn in a room and hang around there. The game world has about 50km of spaceship highways which connect up the various bases, and about one thousand AI ships are free to travel around the whole world doing their own thing.

Waypointing:
To solve getting from one place to another, they use a spiderweb of manually placed waypoints. This is nearly invisible to the player because the waypoints are not points in space but large volumes. When a ship reaches a waypoint it asks the waypoint where to go, and the waypoint gives a new destination based on a pre-defined weighting system that controls the flow of traffic, taking into account things like the faction and level of the ship and the faction and level of where it might go next.

Although ships appear to always be going 'somewhere', they don't really know where they are going. They just make decisions as they hit each waypoint. As far as nav systems go this is almost free. Most of it is precalculated on game generation.
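
(To make the waypoint decision concrete, it's essentially a weighted random pick. The sketch below is illustrative TypeScript with made-up names, not the actual NeonXSZ code.)

Code:
// Illustrative only: a waypoint hands out the next destination via weighted random choice.
type Vec3 = [number, number, number];

interface Ship { faction: string; level: number; }
interface WaypointLink { target: Waypoint; baseWeight: number; faction: string; level: number; }
interface Waypoint { position: Vec3; links: WaypointLink[]; }

function pickNextWaypoint(current: Waypoint, ship: Ship): Waypoint {
  // Scale each link's precomputed weight by how suitable it is for this particular ship.
  const weights = current.links.map(link => {
    let w = link.baseWeight;
    if (link.faction !== ship.faction) w *= 0.25;          // discourage hostile territory
    w /= 1 + Math.abs(link.level - ship.level);            // prefer areas of a similar level
    return w;
  });

  // Standard weighted random selection.
  const total = weights.reduce((a, b) => a + b, 0);
  let r = Math.random() * total;
  for (let i = 0; i < current.links.length; i++) {
    r -= weights[i];
    if (r <= 0) return current.links[i].target;
  }
  return current.links[current.links.length - 1].target;   // numerical fallback
}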

Empty Space Volume Navigation:
The second system is hundreds of manually placed boxes that fill out most of the empty space in the game. AI ships pick destinations within these boxes to ensure that they don't run into walls. The boxes overlap and this allows ships to fly around the entire internal space by swapping from one box to another. Again, this is extremely fast.

Ships can swap between using the waypointing system and the box system. Some ships cruise around using the waypointing system, while others are more inclined to mooch around the same area for a while using the box system. If the player shoots a waypointing ship it will instantly switch to the box system.
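
(Again purely illustrative, not the shipping code: picking a destination inside the current box, with a chance of hopping to an overlapping one.)

Code:
// Illustrative sketch of the empty-space box navigation.
type Vec3 = [number, number, number];

interface SpaceBox { min: Vec3; max: Vec3; overlapping: SpaceBox[]; }

function randomPointIn(box: SpaceBox): Vec3 {
  return [
    box.min[0] + Math.random() * (box.max[0] - box.min[0]),
    box.min[1] + Math.random() * (box.max[1] - box.min[1]),
    box.min[2] + Math.random() * (box.max[2] - box.min[2]),
  ];
}

function nextDestination(current: SpaceBox): { box: SpaceBox; point: Vec3 } {
  // Sometimes mooch around the same box, sometimes swap to an overlapping neighbour.
  const swap = current.overlapping.length > 0 && Math.random() < 0.3;
  const box = swap
    ? current.overlapping[Math.floor(Math.random() * current.overlapping.length)]
    : current;
  return { box, point: randomPointIn(box) };
}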

Combat Movement:
The movement of ships in combat is based on combat logic, but they do it within the confines of the box system. Things like:
  • Do I want to get closer?
  • Should I move further away?
  • Should I try to get behind my opponent?
  • Should I dodge?

Breadcrumb Pathfinding:
Finally, the game would seem a bit crap if enemies couldn't follow the player around a corner. To further avoid heavy pathfinding I use a breadcrumb system for this:
The player leaves a trail of 5 invisible breadcrumbs behind him (one dropped every 1.5 seconds). If the AI can't see the player, it will move towards the last visible breadcrumb that was dropped (basically the last place it saw the player). When it gets there it will repeat this process until it sees the player or runs out of visible breadcrumbs. This is much closer to how real players follow an opponent, so it feels way more realistic than traditional pathfinding and is much more efficient. It also lets the player lose a chasing enemy in a realistic way. The intelligence of each individual AI determines their skill at using this system.
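
(A rough sketch of the idea, in illustrative TypeScript with a placeholder line-of-sight check; not the actual game code.)

Code:
// Breadcrumb chasing, sketch only.
type Vec3 = [number, number, number];

interface Breadcrumb { position: Vec3; droppedAt: number; }

const MAX_CRUMBS = 5;
const DROP_INTERVAL = 1.5;   // seconds

function dropBreadcrumb(trail: Breadcrumb[], playerPos: Vec3, time: number): void {
  const last = trail[trail.length - 1];
  if (!last || time - last.droppedAt >= DROP_INTERVAL) {
    trail.push({ position: [playerPos[0], playerPos[1], playerPos[2]], droppedAt: time });
    if (trail.length > MAX_CRUMBS) trail.shift();   // keep only the newest 5
  }
}

// Placeholder: in the real game this would be a raycast against the level geometry.
function hasLineOfSight(from: Vec3, to: Vec3): boolean { return true; }

function chaseTarget(aiPos: Vec3, playerPos: Vec3, trail: Breadcrumb[]): Vec3 | null {
  if (hasLineOfSight(aiPos, playerPos)) return playerPos;   // chase the player directly
  // Otherwise head for the most recently dropped breadcrumb the AI can actually see.
  for (let i = trail.length - 1; i >= 0; i--) {
    if (hasLineOfSight(aiPos, trail[i].position)) return trail[i].position;
  }
  return null;   // no visible crumbs left: fall back to box patrol / waypointing
}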

If the AI loses track of the player it will likely patrol around that area for a while using the box system before finally deciding to switch back to waypointing to go 'somewhere'.
So the whole thing creates a totally emergent gameworld where the AI can go anywhere and do anything but in reality they never had a pre-defined destination.

There is much more to it than that as far as individual ship decision logic goes, but that's basically how the ships navigate. During gameplay this all blends together seamlessly.

There is a demo available for the game (85Mb) if you are interested in seeing it in action.
 

GulAtiCa

Member
For my next Wii U game, I am currently building my own physics engine in JavaScript. It's going well so far. I got the basics down on my own, and now I'm reading a tutorial on engine design to help me with the more complex stuff like angular rotation.

Mainly doing this in-house so I can have full control of everything and allow for more advanced options. More importantly, there aren't that many physics engines out there in JavaScript at the moment. The bigger ones, the Box2D JavaScript versions, are unofficial ports of the original C++ (or w/e) Box2D, which leads to lots of old versions by different, unrelated people that haven't been updated in a long time, each of them either bloated, broken, or just not well documented...

Clearly I'm a masochist. lol
 

Kamaki

Member
3oqR7f9.jpg

Coming back to this game, decided to move on without my programmer. Should hopefully have everything wrapped up by tomorrow and then it's just sound!

Don't worry about scanlines and stuff, everything shall be disable-able in the menus so people can put it all on or off or anything in between!
 
You're one of the Olympia Rising devs, right? That game looks beautiful!

Yep, that's me! And thanks! I'm finally getting the chance to create some new environments now that we're funded, excited to show everyone when they're finished!

I can't stop staring at this scene you've created - so much depth!
 

SHARKvince

Neo Member
I haven't really seen any posts on GAF about it, but pretty recently the Godot Engine has been released - it's a 2D and 3D open-source game engine. It looks pretty capable to me, and it's been in development for about a decade apparently.

I know there are tons of different open-source engines, but after digging around a little I saw more info on the dev's website. It seems it can be used for making games on 3DS as well; I haven't seen many simple engines that can "export" to 3DS yet, so it's noteworthy imho! :) And that's in addition to PS Vita, PS3, and Win/Lin/OS X support.

Just finding out about the engine made me want to make a list of open-source tools that are great for game development. And so I did. Feel free to expand it. Hope this post is of use to someone here :D
 

razu

Member
hehe, thanks razu! I appreciate the added detail. It clarifies what's going on a little bit more. In the previous iteration of my engine, I was letting it run as fast as it possibly could for everything -- updates, render, etc. -- and was passing a delta time calculation into 'em. While things like movement and animation seemed to behave as I expected, collision detection was spotty as hell. It made troubleshooting the issues a nightmare. I'd wager that it wasn't possible to "fix" the problems without having some sort of fixed time-step managing things. It didn't stop me from trying (and failing miserably) though.

I've incorporated something similar to what Mental Atrophy said into my main game loop. Hopefully this will help me keep things working reliably and logically this time around.


No problem! Writing this was a warm up for an article I'm trying to write on the subject! :D
 

RawNuts

Member
I did an underground cutaway section; it repeats seamlessly on both ends.
Found it difficult to apply the style that I'm going for to an area that is so shallow, but I think it ended up alright. The rocks pop out quite far so it should provide at least some depth.


IntroShotAnimated.gif


I added the suggested vegetation (and some buildings), as well as a wrapping particle snow effect. Looks a bit more fleshed out now, I think.
I feel like the mountains shouldn't move as much as they do, but I still love this scene and the layers of clouds provide such a great illusion of depth; good use of violets in pixel art does something for me that I can't even explain.
I have the weirdest boner right now.
 

Bollocks

Member
What do you use for project management, tickets?
Way back someone posted a screenshot of the tool they're using, but I forgot the name.
 
I hate how I'm still using a bazillion placeholders for my game

But the functionality is growing so nicely. Been working more and more on it as things get fleshed out. And I finally got GameMaker to handle collisions properly, so I'm even more excited.
 

Turfster

Member
Ah, thank you very much :).

How is your prototype coming along?

Currently working on my pathfinding implementation.
Pondering if I should precompute edge centers and use those for navmesh movement, or just follow triangle vertices and "collapse" skippable ones on the fly... or a combination of both.
(a.k.a. the boring scutwork ;)

I hate how I'm still using a bazillion placeholders for my game

Pretty much everything in my prototype uses standard size grid textures ;)
 

Mr. Virus

Member
Currently working on my pathfinding implementation.
Pondering if I should precompute edge centers and use those for navmesh movement, or just follow triangle vertices and "collapse" skippable ones on the fly... or a combination of both.
(a.k.a. the boring scutwork ;)

I'm an audio guy by trade, so I only have a basic understanding of most coding stuff, ha ha. But yeah, I remember some of the programmers taking some time to get it right on our side.

Hope it doesn't take too long for you!
 

-Winnie-

Member
I finally finished a game I've been working on (off and on) since last April.
Umbragram
It's a puzzle game about perception. It plays similarly, in certain ways, to Picross. You have the two shapes on the left and right, and you have to build the isometric structure that casts those shadows. It's pretty short, just 12 puzzles. I estimate it'd take maybe a half hour to get through. Give it a try and let me know what you think.
http://games.evilrobotstuff.com/umbragram/

ss02-crs.png

I really love the minimalist aesthetic, and the music and sound effects go a long way towards creating a zen atmosphere. The puzzles and gameplay are clever too. Good job! :D

Only feedback I have is I didn't like the jarring music cuts at the end of each level. Kind of breaks the flow. Aside from that, very cool game!
 

Turfster

Member
I'm an audio guy by trade, so I only have a basic understanding of most coding stuff, ha ha. But yeah, I remember some of the programmers taking some time to get it right on our side.

Hope it doesn't take too long for you!

Well, I eventually decided on vertices with smoothing, which works... for now.
Code is horribly unoptimized at the moment, so I'll most definitely have to revisit this.
Still, I can start adding NPCs now and work on their AI ;)
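
(For the curious, the "collapse skippable vertices" part is basically a greedy line-of-sight pass over the raw path. Rough illustrative sketch, not my real code, with a placeholder visibility check.)

Code:
// Drop any path vertex you can skip while keeping a clear line of sight.
type Vec3 = [number, number, number];

// Placeholder: in practice this tests against the navmesh / level geometry.
function hasLineOfSight(a: Vec3, b: Vec3): boolean { return true; }

function smoothPath(path: Vec3[]): Vec3[] {
  if (path.length <= 2) return path;
  const smoothed: Vec3[] = [path[0]];
  let anchor = 0;
  for (let i = 2; i < path.length; i++) {
    // If vertex i is no longer visible from the anchor, vertex i-1 is a real corner.
    if (!hasLineOfSight(path[anchor], path[i])) {
      smoothed.push(path[i - 1]);
      anchor = i - 1;
    }
  }
  smoothed.push(path[path.length - 1]);
  return smoothed;
}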
 

Makai

Member
Yeah, Barry's music is awesome.

I don't mean to sound stupid, but could you explain this a little?
It's easier to compete for high scores if the average game length is short. One reason Flappy Bird (and other endless runners) is successful is that you die quickly, so going for "just one more run" is a low cost to the player's time. I didn't die for a few minutes on my first go with your game, even though I was still learning the controls. Experienced players will be playing for much longer than that to compete for a high score. I think you could decrease the average playtime by increasing the acceleration of the screen, so it starts getting faster a lot sooner.

Anyway, that's just my $0.02 on the difficulty. I think the theme is very cute.
 

bumpkin

Member
So I know that with game programming, there's always a hundred ways to do something. What I'm curious about is the general consensus on approaches to giving the player control over a specific entity in the game. It seems like in a lot of the pre-fab engines out there, the player's avatar/sprite is one of many in a big bucket. What do you consider the best way to designate which is controllable and why?

I ask because I'm struggling with how to best approach it in my own engine. Right now, the first sprite added to the collection is *it* and is affected by keyboard/button input. I've thought about adding a controllable flag to my sprite object too. Is it just a case of tomato-tomatoe?
 

Turfster

Member
I ask because I'm struggling with how to best approach it in my own engine. Right now, the first sprite added to the collection is *it* and is affected by keyboard/button input. I've thought about adding a controllable flag to my sprite object too. Is it just a case of tomato-tomatoe?

My 2 cents:
Do what feels best to you, really, and what fits the game you're currently building.
When I only have one player, they're usually separate from the NPCs... although both have the same basic Interface, so I can hook up a recorded input replayer to every one of them that calls their generic move and respective "button" functions.
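
(Very roughly, the shape of that interface, sketched in illustrative TypeScript with made-up names, not my actual code:)

Code:
// A shared interface so live input and a recorded-input replayer can drive anything.
type Vec2 = [number, number];

interface Controllable {
  move(direction: Vec2): void;    // generic movement
  press(button: string): void;    // generic "button" press
}

// Live input forwards polled keyboard/gamepad state to its target...
class LiveInput {
  constructor(private target: Controllable) {}
  update(): void {
    // poll the keyboard/gamepad here and call this.target.move()/press() accordingly
  }
}

// ...while a replayer feeds a recording into any Controllable the exact same way.
class InputReplayer {
  private cursor = 0;
  constructor(private target: Controllable,
              private recording: { time: number; button: string }[]) {}
  update(time: number): void {
    while (this.cursor < this.recording.length &&
           this.recording[this.cursor].time <= time) {
      this.target.press(this.recording[this.cursor].button);
      this.cursor++;
    }
  }
}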
 

Five

Banned
So I know that with game programming, there's always a hundred ways to do something. What I'm curious about is the general consensus on approaches to giving the player control over a specific entity in the game. It seems like in a lot of the pre-fab engines out there, the player's avatar/sprite is one of many in a big bucket. What do you consider the best way to designate which is controllable and why?

I ask because I'm struggling with how to best approach it in my own engine. Right now, the first sprite added to the collection is *it* and is affected by keyboard/button input. I've thought about adding a controllable flag to my sprite object too. Is it just a case of tomato-tomatoe?

I usually make one module that polls for and holds all of the input, which is then exposed to everything else in the game. The PC is treated quite similarly to NPCs and other objects for the purposes of collision code, hit boxes and such.
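
(A minimal sketch of that kind of module, in illustrative TypeScript assuming a browser-style keyboard API; adapt to whatever your engine gives you.)

Code:
// One central module polls for and holds all input; everything else just reads it.
class Input {
  private static down = new Set<string>();

  static init(): void {
    window.addEventListener('keydown', e => Input.down.add(e.code));
    window.addEventListener('keyup', e => Input.down.delete(e.code));
  }

  // PC controller, NPCs, UI... all read the same exposed state.
  static isDown(code: string): boolean {
    return Input.down.has(code);
  }
}

// e.g. in the player's update: if (Input.isDown('ArrowLeft')) moveLeft();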
 

SeanNoonan

Member
It's easier to compete for high scores if the average game length is short. One reason Flappy Bird (and other endless runners) is successful is that you die quickly, so going for "just one more run" is a low cost to the player's time. I didn't die for a few minutes on my first go with your game, even though I was still learning the controls. Experienced players will be playing for much longer than that to compete for a high score. I think you could decrease the average playtime by increasing the acceleration of the screen, so it starts getting faster a lot sooner.

Anyway, that's just my $0.02 on the difficulty. I think the theme is very cute.
Ah, I understand. The difficulty was previously more towards the Flappy Bird end of the scale, but people were finding the jump-to-whip mechanic too difficult if they had to master that and concentrate on the speed (reacting to jumps in tight windows) at the same time. I could see this in the average play time, scores, and player drop-off. When I lowered the difficulty I saw an overall increase in players and a good curve to their scores.

The way I see it, Flappy Bird was (predominantly) downloaded as an iOS app - in that situation the game is an executable that can be accessed easily at any time and requires some form of uninstalling, so there's a pretty good chance a player will come back out of convenience whether or not they were frustrated. A web game, on the other hand, is dropped and forgotten as quickly as a tab closes - so frustrating players straight out of the gate may not be the best route in the browser world. Obviously there's a way to use both methods, but I've seen an encouraging curve from this new difficulty change.

Seriously though, thanks for the feedback - if I'm ever able to put the game up on iOS I'll be sure to readdress the difficulty curve (and probably add an item that, if whipped, would reduce the speed a little - a skill challenge to extend play time).
 

Noogy

Member
IntroShotAnimated.gif


I added the suggested vegetation (and some buildings), as well as a wrapping particle snow effect. Looks a bit more fleshed out now, I think.

Man, this is looking really nice. Every time I see some new assets I boot up your original demo and dream of what may come.
 

Alts

Member
I really love the minimalist aesthetic, and the music and sound effects go a long way towards creating a zen atmosphere. The puzzles and gameplay are clever too. Good job! :D

Only feedback I have is I didn't like the jarring music cuts at the end of each level. Kind of breaks the flow. Aside from that, very cool game!

Thanks! Yeah, I've been hearing about the jarring audio from several people now, so I'll probably change that in a future build.
 

gabbo

Member
3oqR7f9.jpg

Coming back to this game, decided to move on without my programmer. Should hopefully have everything wrapped up by tomorrow and then it's just sound!

Don't worry about scanlines and stuff, everything shall be disable-able in the menus so people can put it all on or off or anything in between!

More info on this plz!
 

Turfster

Member
In today's "WTF Unity" corner: add a second copy of my rigidbody, gravity-using player prefab to the scene to test some pathing code, watch it slowly start floating upwards until it hits the ceiling... What?
Edit: Of course, the second I paste my frustration, I discover it's the animator's fault.
 