
GAF Indie Game Development Thread 2: High Res Work for Low Res Pay

Jobbs, I think a cool "enemy" can be made out of some of the plants in your game. Not necessarily an enemy, but you could have plants that produce spores that cause lingering effects, like slower speed or health drain. The spores should also affect the enemies, so that in areas where you have to fight there is a tactical decision: do I take out the plants and remove my ill effects, or do I let them be to weaken the enemies? I think you could create some compelling situations where the spore effects complement the enemies' strengths for a more challenging fight, or where they weaken enemies significantly yet drain your life quickly (so now you have to race against time).

Sounds pretty neat tbh. I was trying to think of ideas myself but failed horribly.
 

missile

Member
A cute little creature that starts following you like a cat/dog. It can damage enemies for you, but it builds up from the damage it deals, and eventually it grows and goes berserk on the player. Maybe something like: if you try to kill it right away, it goes into the same berserk form but with tougher defense, while the berserk form it reaches after following you is easier to kill but deals more damage.

It would present an interesting risk/reward situation: if I take it with me it can help deal with enemies, but will I be able to deal with the berserk form? There's a chance that it will go berserk in a bad room, or at a particularly bad time.

Though I imagine it would be difficult to balance.
Cool idea! Signed.

Jobbs, I think a cool "enemy" can be made out of some of the plants in your game. Not necessarily an enemy, but you could have plants that produce spores that cause lingering effects, like slower speed or health drain. The spores should also affect the enemies, so that in areas where you have to fight there is a tactical decision: do I take out the plants and remove my ill effects, or do I let them be to weaken the enemies? I think you could create some compelling situations where the spore effects complement the enemies' strengths for a more challenging fight, or where they weaken enemies significantly yet drain your life quickly (so now you have to race against time).
Sounds interesting. Would even fit his game somehow.

... As a plugin for UE4? I'd buy it.
UE4? That's an option. How does the plugin structure actually work? Is it
possible to load a compiled platform plugin (C/C++ & shader code) and
have its interface exposed via Blueprint? Considering the code, I'm
completely open to fully rewriting it in any other language if it's worth
the effort.
 

missile

Member
Today I'm trying to get Retrotron to spit out some colors. I'm halfway
through. One problem here is the color-carrier's frequency, which ideally
should match the sampling frequency if you want the cycles (periods) to
line up at the end of the line; that would ease the problem, esp. from a
computational perspective. On the other hand, many video signal
specifications work with fractional cycles, i.e. resolution, carrier
frequency, and sampling rate won't match up. Sure, I can build it however
I want (even defining my own video standard), but the problem is that
each approach requires its own adjusted video filters. Unfortunately, my
filter tool isn't ready yet to easily recompute the filters again and
again (it will be in the future), so readjusting everything is a little
bit daunting.

I think this is also the problem of similar projects that work at a given
resolution (sampling rate, frequency, etc.): the filters are constructed
just for that resolution. Changing any of the parameters requires
recomputing all the filters -- deriving all of their coefficients,
judging each filter's frequency and phase response, and deriving an
efficient implementation again -- which is all serious business. That's
why I'm doing them from scratch here, and it's also what takes quite some
time. Of course, there are filter packages everywhere (the good ones are
very expensive), but I need to build mine from scratch because I need
them for video games, i.e. they need to be adjusted for games and not for
DSP hardware with its long filter taps, and they ideally should run in
realtime, i.e. with coefficients recomputed on the fly (and code changes
if the filter changes order or type).

There is another reason to put such a huge effort into all of this
(considering Retrotron). Many unique video effects and distortions (esp.
when the signal is on VHF/UHF, sort of) stem from misaligned filter
curves (frequency, phase, and delay response) that don't match the
specification. It's a huge advantage if you can manipulate the filter
curves (hence, the filter itself) at runtime, leading to all sorts of
dynamically adjustable effects controlled with just a few parameters.
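
To give an idea of what "recomputing the filters" means in practice,
here's a minimal windowed-sinc low-pass design (a generic sketch, not the
actual Retrotron tool): every change of cutoff, sampling rate, or tap
count re-derives all coefficients.

```cpp
// Generic sketch: recomputing a windowed-sinc FIR low-pass at runtime.
// Hypothetical helper, not the actual Retrotron filter tool.
#include <cmath>
#include <vector>

std::vector<double> makeLowpass(double cutoffHz, double sampleRateHz, int taps)
{
    const double PI = std::acos(-1.0);
    const double fc = cutoffHz / sampleRateHz;    // normalized cutoff
    const int    M  = taps - 1;
    std::vector<double> h(taps);
    double sum = 0.0;
    for (int n = 0; n < taps; ++n) {
        const double x = n - M / 2.0;
        const double s = (x == 0.0)               // ideal sinc low-pass
            ? 2.0 * fc
            : std::sin(2.0 * PI * fc * x) / (PI * x);
        const double w = 0.54 - 0.46 * std::cos(2.0 * PI * n / M);  // Hamming
        h[n] = s * w;
        sum += h[n];
    }
    for (double& coeff : h) coeff /= sum;         // unity gain at DC
    return h;
}

// e.g. a chroma low-pass at a 4x-subcarrier sampling rate:
// auto h = makeLowpass(1.3e6, 4.0 * 3.579545e6, 31);
```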

Regarding the implementation of the color, I may do both versions; I may
even recompute the carrier's samples again and again, because this would
allow me to simulate the color shifting (tint) seen on many old NTSC TVs,
where you had this tint knob which basically adjusted the phase of the
TV's color-carrier. Another interesting aspect of NTSC was that the local
oscillator in old TVs wasn't as stable as it should be (phase drift),
leading to color shifts along the line. So an old NTSC color TV has its
colors shifted towards the right side of the screen. That's why NTSC was
also known as Never The Same Color. PAL solved this problem by converting
the color phase (tint) error into a saturation error, which isn't as
noticeable to the eye as the same error in tint. Hence, there were no
tint knobs on PAL TVs.
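
The tint knob has a neat mathematical reading: a carrier phase error of
phi simply rotates the (I,Q) color vector, shifting hue while leaving
saturation alone. A small sketch:

```cpp
// Sketch: a tint error as a rotation of the (I,Q) vector. phiRad models
// the knob position or the oscillator's phase drift.
#include <cmath>

struct YIQ { double y, i, q; };

YIQ applyTintError(const YIQ& c, double phiRad)
{
    return { c.y,
             c.i * std::cos(phiRad) - c.q * std::sin(phiRad),
             c.i * std::sin(phiRad) + c.q * std::cos(phiRad) };
}

// Growing phiRad with the horizontal position x reproduces the "colors
// drift towards the right side of the screen" behavior:
//   YIQ out = applyTintError(in, driftPerPixel * x);
```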
 

RhysD85

Member
Congrats to those who participated in LD! I'm currently downloading and will play/rate all the games linked so far.

Here's mine for anyone interested - http://ludumdare.com/compo/ludum-dare-34/?action=preview&uid=30493

SWAMP WITCH

UP4wGYX.jpg


Please excuse the controls, still trying to figure out a nicer way of doing them.
 
All of those Ludum Dare projects look really cool. I am amazed that you guys can get so much work done in so little time.

As for me, no work today, so I spent the morning doing new mocap attack animations (managed to get two done) and setting up the characters' skeletons and physics assets for fatty tissue simulation (bellies and breasts):
neat, how are you finding UE4's physics support? Have you done any cloth sims yet? I'll probably have to get into it some point soon, and hoping it's not a nightmare.

finally got something working that I've been stumped with for ages - a fully functioning in-game camera system that doesn't kill framerate! Can also write out images to HDD; an automatic upload option to imgur or something would be cool, but that's way out of my league at the moment.

obligatory pretentious DOF shots incoming:




 
neat, how are you finding UE4's physics support? Have you done any cloth sims yet? I'll probably have to get into it some point soon, and hoping it's not a nightmare.

Physics is pretty good and easy to use in UE4 (characters' physics assets need a lot of manual setup though). I've done some cloth sims using the old-school joint method, and that worked fairly well, but we are looking to use APEX objects for all our cloth in game. I've downloaded and installed the PhysX plugin for Maya and messed around with it, but I haven't actually made an APEX cloth object for the game yet (I watched some tutorials, and it looks fairly straightforward). The only thing we've had problems with was trying to use physics simulations for character collisions (i.e., physics-based melee attacks), and that proved to be a huge pain in the dick. Admittedly, I think that was a bad choice on our part, because I do not think the system is set up to work like that (or maybe we just did it all wrong), so we switched to the more traditional way of using overlap events for our collision capsules.
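
If it helps anyone, the overlap-event route looks roughly like the sketch below in UE4 C++ (class and member names are hypothetical; the exact delegate signature varies slightly between engine versions):

```cpp
// Sketch (hypothetical names) of melee hits via overlap events in UE4 C++.
// The bound function must be declared with UFUNCTION() for AddDynamic.

// In the character's constructor or BeginPlay:
WeaponCapsule->OnComponentBeginOverlap.AddDynamic(
    this, &AMeleeCharacter::OnWeaponOverlap);

void AMeleeCharacter::OnWeaponOverlap(UPrimitiveComponent* OverlappedComp,
                                      AActor* OtherActor,
                                      UPrimitiveComponent* OtherComp,
                                      int32 OtherBodyIndex, bool bFromSweep,
                                      const FHitResult& SweepResult)
{
    // Only count hits while an attack is actually active.
    if (bAttackActive && OtherActor && OtherActor != this)
    {
        UGameplayStatics::ApplyDamage(OtherActor, MeleeDamage,
                                      GetController(), this, nullptr);
    }
}
```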


UE4? That's an option. How does the plugin structure actually work? Is it
possible to load a compiled platform plugin (C/C++ & shader code) and
have its interface exposed via Blueprint? Considering the code, I'm
completely open to fully rewriting it in any other language if it's worth
the effort.

I have never made a plugin for UE4, but I have seen some really impressive stuff others have made; you could probably do whatever you want. More info at the following links:

https://docs.unrealengine.com/latest/INT/Programming/Plugins/index.html
https://wiki.unrealengine.com/An_Introduction_to_UE4_Plugins
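
That said, on the Blueprint question specifically: the usual way to expose native plugin code to Blueprint is a UBlueprintFunctionLibrary. A minimal sketch (untested; the "Retrotron" names are made up):

```cpp
// Minimal sketch of exposing native plugin code to Blueprint through a
// UBlueprintFunctionLibrary. The "Retrotron" names are hypothetical.
#pragma once
#include "Kismet/BlueprintFunctionLibrary.h"
#include "RetrotronBlueprintLibrary.generated.h"

UCLASS()
class URetrotronBlueprintLibrary : public UBlueprintFunctionLibrary
{
    GENERATED_BODY()
public:
    // Appears as a callable node in any Blueprint graph.
    UFUNCTION(BlueprintCallable, Category = "Retrotron")
    static void SetCarrierPhase(float PhaseDegrees);
};
```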
 

snarge

Member
Congrats to those who participated in LD! I'm currently downloading and will play/rate all the games linked so far.

Here's mine for anyone interested - http://ludumdare.com/compo/ludum-dare-34/?action=preview&uid=30493

SWAMP WITCH

UP4wGYX.jpg


Please excuse the controls, still trying to figure out a nicer way of doing them.

I played and enjoyed Swamp Witch! Great concept and execution. The background art was very atmospheric. I kept wanting to press the left and right arrow keys to fly. It was tough, but tough in the good, Luftrausers way. On the same note as Luftrausers, I really like how the intuitive interface makes it easy to get right back into the game. I liked the upgrade system, though I myself would've given out a little more sooner. Maybe start players off with some coins to spend. Also, the crash the witch does on the ground is fantastic, ha. Great job!
 
i'm interested in getting into vr development. is there any consensus on which is the better engine between unity and ue4 when it comes to vr? i'm interested in easily being able to deploy apps to multiple platforms. also, i have a lot of experience using xna and built a couple of custom engines from scratch. i prefer doing that over having to work within an existing setup. i get the vibe that ue4 may be more flexible, but just thought i'd ask here.
 

Subtilia

Neo Member
Hello, indie people!

I've been lurking GAF for ages, and got an account just some weeks ago. And since I've been developing for some years now I guess I should stop by and say hi!

Now, they say it's not polite to visit and don't bring anything, so here's a shader I've worked on and recently wrote about on Gamasutra:
/A_Subtle_shader_making_an_invisible_thing_visible*code included*
 

RhysD85

Member
I played and enjoyed Swamp Witch! Great concept and execution. The background art was very atmospheric. I kept wanting to press the left and right arrow keys to fly. It was tough, but tough in the good, Luftrausers way. On the same note as Luftrausers, I really like how the intuitive interface makes it easy to get right back into the game. I liked the upgrade system, though I myself would've given out a little more sooner. Maybe start players off with some coins to spend. Also, the crash the witch does on the ground is fantastic, ha. Great job!

Thanks :) Sometimes I'll purposely crash into the ground to see how many flips I can do.
 

bumpkin

Member
So something I've been wondering, why does it seem like so many pre-fab engines use a model of "nodes" that all (or a lot) of the other objects are subclasses of? Is it just programmer laziness or is there some sort of underlying principle that I don't understand?
 
i'm interested in getting into vr development. is there any consensus on which is the better engine between unity and ue4 when it comes to vr? i'm interested in easily being able to deploy apps to multiple platforms. also, i have a lot of experience using xna and built a couple of custom engines from scratch. i prefer doing that over having to work within an existing setup. i get the vibe that ue4 may be more flexible, but just thought i'd ask here.


Not sure about Unity, but VR support is fully integrated into UE4, and it is fairly easy to set up.

https://docs.unrealengine.com/latest/INT/Platforms/VR/index.html

Frankly, you can't go wrong with UE4; it's an amazing game engine.


So something I've been wondering, why does it seem like so many pre-fab engines use a model of "nodes" that all (or a lot) of the other objects are subclasses of? Is it just programmer laziness or is there some sort of underlying principle that I don't understand?


I wouldn't call myself a programmer and this could be way off, but isn't that just efficient object oriented programming?
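
For what it's worth, there is an underlying principle behind it: the scene graph (the composite pattern). If everything derives from a common Node with children, the engine can update, transform, and draw the whole world as one uniform tree walk. A toy sketch, not any particular engine's API:

```cpp
// Toy sketch of the "everything is a Node" idea -- the composite pattern.
// Not any particular engine's API.
#include <memory>
#include <vector>

struct Node {
    std::vector<std::unique_ptr<Node>> children;
    virtual ~Node() = default;
    virtual void update(float dt) {}          // subclasses override this

    void updateTree(float dt) {               // one uniform tree walk
        update(dt);
        for (auto& child : children)
            child->updateTree(dt);
    }
};

// Sprites, cameras, lights etc. are all just Nodes, so the engine can
// update (and likewise transform or draw) the whole world with a single
// root.updateTree(dt) and no special cases per object type.
struct Sprite : Node { /* position, texture, ... */ };
struct Camera : Node { /* view parameters, ... */ };
```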
 

Subtilia

Neo Member
I'm wondering if you could implement VR support without owning any hardware?

Uhm... Unless you have some way of testing what you're building without the hardware, I don't believe it's a good idea. Especially in this vomit-inducing world of VR, I think playtesting is paramount.
 

Davision

Neo Member
Uhm... Unless you have some way of testing what you're building without the hardware, I don't believe it's a good idea. Especially in this vomit-inducing world of VR, I think playtesting is paramount.
I checked around now; unfortunately you can't test inside the editor in stereo without a device plugged in, but there is this workaround: https://answers.unrealengine.com/qu...ble-split-screen-without-oculus-or-multi.html
Setting up the rotation from the Oculus seems to be as simple as making sure the camera has Use Controller View Rotation enabled.
For a packaged project the player just has to press alt+enter for fullscreen, plug in the device, and it changes automatically to stereo.
My brother has an Oculus, so he can test it.
Not sure what I could get wrong: wrong FOV? Wrong eye height? Something that would result in motion sickness? I mean for a default UE4 First Person Character. I just want to give it a try with Forklift Man for the fun of it.
 

anteevy

Member
Pub check: http://digeratidistribution.com/

Anyone familiar with them? They just contacted me. They are currently publishing several games but their most notable is Slain: http://www.wolfbrewgames.com/slain/

I'm honestly unsure how I feel about a publisher at the moment. I've spoken with a very popular indie pub already, and I'm supposed to throw them a bunch of stuff eventually. Just not quite ready to yet.
Don't know them, but they seem to be doing a good job with the marketing for Slain. I'd say you should talk to them and see what they can offer. But if the deal with the other publisher you're talking about is a sure thing and they are good, you might not need to.
 

missile

Member
xvluG14.png

(work in progress)

Colors, finally!

The raster you see within the colors is the color-carrier (NTSC,
3.579545 MHz); it produces a raster if not filtered out properly. So far
I have only built some simple filters whose roll-off isn't steep enough
to filter the color-carrier out fully, but I will improve on this over
the weekend. The whole thing needs much more work, of course (well, it's
work in progress, mind you), but I'm pretty proud of the result so far,
because the colors are actually generated with the help of the colorburst
(a signal similar to the pilot tone for stereo), which I have now
simulated. Usually you would fake all of this by converting the RGB
colors to YIQ, messing around, and converting back to RGB. That may work
for some things (emulators), but it won't reproduce the color distortions
seen on a TV whose signal undergoes distortions. Let me explain a bit for
those interested:

Roughly speaking, the colorburst (a very short signal) is sort of an
indicator of whether the video signal contains color or not. The
broadcasting station needs to send the colorburst out as well so that
your TV can decode the colors contained within the signal. But there is
more. For decoding the colors, your TV actually needs the color-carrier's
frequency and phase. However, it was found that one can save a lot of
energy if the color-carrier signal isn't sent out at all (a suppressed
carrier, as it is called). But your TV needs the carrier, because the
colors were encoded with it! Uhh! The carrier's phase actually encodes
the tint of the colors stored within the video signal. So how to decode
the colors when the carrier is missing? Well, instead of sending the
color-carrier out continuously, only a small copy of it is sent out,
known as a burst, i.e. the colorburst. Your TV (locally) runs an
oscillator at the frequency of the transmitter's color-carrier,
3.579545 MHz (NTSC). Hence, the transmitter's color-carrier is sort of
regenerated within the TV itself. But the colorburst is needed because it
contains the phase! For, despite having the same frequency, the
regenerated color-carrier doesn't necessarily have the same phase as the
transmitted one, and the right phase is needed to decode the colors
(unless you want cool color errors). The TV now syncs its own
color-carrier in phase with the colorburst and will as such generate a
local copy of the transmitting station's color-carrier, which can then be
used to decode the colors. That's how it works, basically. And that's
what I've simulated.
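
In code terms, the decode step boils down to product detection against
the regenerated carrier, which is exactly why the burst phase matters. A
simplified sketch (assuming the burst phase has already been measured;
the low-pass filters that isolate I and Q from the products are omitted):

```cpp
// Sketch: chroma decoding as product detection against the regenerated
// carrier. The burst phase is assumed to have been measured already; the
// low-pass filtering that isolates I and Q is omitted.
#include <cmath>

const double PI  = std::acos(-1.0);
const double FSC = 3.579545e6;        // NTSC color subcarrier in Hz

void demodulateChroma(double composite, double t, double burstPhase,
                      double& iOut, double& qOut)
{
    const double w = 2.0 * PI * FSC;
    // Local carrier: same frequency, phase locked to the colorburst.
    iOut = 2.0 * composite * std::cos(w * t + burstPhase);
    qOut = 2.0 * composite * std::sin(w * t + burstPhase);
    // If burstPhase is wrong (bad lock), the (I,Q) plane rotates and
    // every hue shifts -- the classic NTSC tint error.
}
```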

The advantage gained from simulating this behavior is nothing less than
real TV color distortions when the video signal is heavily distorted for
whatever reason, or when the TV is at fault. All that's needed is to
disturb the colorburst and make it difficult for the TV to lock onto its
phase. That will be Retrotron for you!

I haven't made any attempt to disturb the colorburst yet, or to let the
TV's phase-locking mechanism fail or whatever, but you may imagine that
there will be some cool things coming out of all of this. For today I'm
very happy that it actually works. Cheers! :+
 

Jobbs

Banned
xvluG14.png

(work in progress)

Colors, finally!
...

Awesome. Can you demo it with different images so I can contextualize it more?
 

snarge

Member
Any suggestions for Combo stage names? I've got a system similar to DmC, where the first letter indicates the rank: Deadly, Carnage, Brutal, Atomic, Stylish, SS-word

But my world is a semi-cutesy type of world, so stuff like Deadly, Carnage, Brutal wouldn't fit. My theme is more like Splatoon-y.

Gameplay, for reference: http://youtu.be/zBXn-k0T7UQ (although the combo system is not in that build)

Also, any ideas for a non-letter based system are welcome. I like the letter based system because it's familiar and you know that you are progressing up. Plus, SS would be "Snail Storm!" for my game, and that just works too conveniently.
 

missile

Member
Awesome. Can you demo it with different images so I can contextualize it more?
Currently not, simply because it wouldn't even represent π% of what will
be under the hood and may as such lead to a wrong/degraded perception of
the program. If I start to use real pictures now, people may start to
share them around. So I will stay with technical ones for the time being,
which also serve a real purpose for me, to say the least. But once I
think the program has reached sort of an alpha state, I'll start to use
random or specific pictures from other games and game developers to
actually get some feedback before entering the beta stage, which will
make the program feature-complete for release 1.0.
 

Popstar

Member
What's the use case for the Unity store? Buy once, share it to 8 different users without needing a group license?
I mean in the sense that if you use something from the Unreal marketplace, you're on the hook for the engine royalty even if you're not using the Unreal Engine itself.

The Unity asset store is more liberal. You can buy and use stuff from it in a non-Unity project without additional fees.
 

Blizzard

Banned
I mean in the sense that if you use something from the Unreal marketplace, you're on the hook for the engine royalty even if you're not using the Unreal Engine itself.

The Unity asset store is more liberal. You can buy and use stuff from it in a non-Unity project without additional fees.
Oh, that makes sense. I'm actually surprised the Unity store sells things and lets you use them on competing engines.
 

Subtilia

Neo Member
Not sure what I could get wrong: wrong FOV? Wrong eye height? Something that would result in motion sickness? I mean for a default UE4 First Person Character. I just want to give it a try with Forklift Man for the fun of it.

Welp, I've never developed anything for VR, so I can't say what exactly could go wrong. But with a new medium with so few established conventions and good practices for user interaction, UI and stuff, I'd rather do as much real-life testing as possible.
 
Finally got the APEX cloth working in engine... working through it the first time was a huge pain in the ass, but it looks pretty damn nice.

7TCb.gif



As for VR, I have an Oculus on loan from my dev partner and frankly I am not on board the hype train. It's novel, but I do not think it is the next big thing in gaming. It definitely feels like first-gen technology (the screen resolution/visual fidelity is really bad) and I have yet to see any compelling VR game experiences that would sway me. Don't get me wrong, there are some cool demos, but they're essentially virtual tours. Although I did watch Tested's video on Crytek's climbing VR game, and that actually looks interesting. But I don't know if a game like that has any lasting appeal. I guess that's the biggest issue right now: how do you create an engaging game experience with the tech?

I also don't think you'd be able to develop anything serious without a headset. You can always set up any game in UE4 to output in VR mode, but it doesn't really work well if the game wasn't designed with it in mind.
 

missile

Member
Finally got the APEX cloth working in engine... working through it the first time was a huge pain in the ass, but it looks pretty damn nice.

7TCb.gif
...
Damn nice, bro! :+

... As for VR, I have an Oculus on loan from my dev partner and frankly I am not on board the hype train. It's novel, but I do not think it is the next big thing in gaming. It definitely feels like first-gen technology (the screen resolution/visual fidelity is really bad) and I have yet to see any compelling VR game experiences that would sway me. Don't get me wrong, there are some cool demos, but they're essentially virtual tours. Although I did watch Tested's video on Crytek's climbing VR game, and that actually looks interesting. But I don't know if a game like that has any lasting appeal. I guess that's the biggest issue right now: how do you create an engaging game experience with the tech?

I also don't think you'd be able to develop anything serious without a headset. You can always set up any game in UE4 to output in VR mode, but it doesn't really work well if the game wasn't designed with it in mind.
Have had a DK1 right next to me since day 1. I really love it and also
want to make a game with it (or equivalent) further down the road. But
you are right about the lack of fidelity, which will only be accepted by
the diehards who also know a lil about some of the issues with VR. Well,
let's see how PSVR performs. If it takes off, with Sony also improving
the product over time (investing further), then it may become a real
option.
 

_machine

Member
Woah, it's been quite a while since I've last posted (though naturally I have lurked that whole time), but it's been a busy end to an already busy autumn. I will be moving away from the northern wastes of Finland (and Kajaani) to the south for a few weeks and then off to Germany. That also means that I just spent my final day of development with the most wonderful team and project.

We also released another content update yesterday, which potentially could be the final one. There's still some WIP content that faced major technical issues, but a large portion of the team got poached to "actual" studios and the fact is that we couldn't come close to sustaining development further than was originally planned. We would love to work further on the game, especially to expand the single-player aspect, but right now it's really unclear whether it can happen.

Here's also the little promo shot we did for the update (link to patch notes):

Finally got the APEX cloth working in engine... working through it the first time was a huge pain in the ass, but it looks pretty damn nice.

7TCb.gif
Looks bloody stunning! Easily rivals any AAA cloth physics I've seen.
 
do i need the innovator edition of the gear vr headset to do development or can i use the retail model? the development documentation claims you need the innovator edition, but i am wondering if the documentation is just outdated. the headset doesn't actually contain any tech other than the lenses, so i'm not seeing any reason you would need a special version of the headset to do development...
 

missile

Member
I improved the filters, but they're still not as good as I want them to
be (I need to address phase delay).

KXfffyb.png

original (RGB)

xvluG14.png

old (composite, YIQ, bad filter)

2fU3Yu3.png

new (composite, YIQ, better filter)

Looks better now, more composite-like. There are still some artifacts
left; some of them will never disappear unless you filter the video's YIQ
components to death.

As one can see, yellow and cyan are a bit off. Some color combinations
produce an out-of-range composite video signal K = Y+C (Y luma, C chroma)
for the transmitter. Limiting the signal was necessary so as not to
overdrive the amplifiers (due to over-modulation), which leads to various
distortions; hence, the transmitter needed to "clip" the signal somehow.
Problem is, if you simply clip the signal (like I've done here), your
colors will suffer in hue, saturation, and value. Look at cyan in the
last picture. Well, you might have heard that emulator people try to
recreate the correct colors of a console when the video signal is sent
over RF. But if you just clip the video signal, you will not only distort
your colors (esp. in hue, which is rather bad), you will also not get the
right broadcast look of the colors sent via RF. What's needed is a gamut
mapping. One needs to map the YIQ gamut into one which results in safe
colors, i.e. colors safe for broadcasting (which stay within the limits
of the composite video signal when combined) yet as close as possible to
the originals. That's not an easy task, and there are multiple ways to
define an optimum here. Over the weekend I will deal with this problem --
which should bring me a step closer to a proper broadcast video signal,
hopefully. Note: I could skip clamping the signal, and yellow, cyan etc.
would be perfectly fine, but that wouldn't be broadcast video.
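
For reference, the naive clip is roughly this (a sketch under the usual
NTSC quadrature conventions; the gamut mapping would replace the final
clamp):

```cpp
// Sketch: building one composite sample K = Y + C and clipping it
// naively. This is the "wrong" approach described above -- a proper
// gamut mapping would keep K in range instead of flattening it after
// the fact.
#include <algorithm>
#include <cmath>

double compositeSample(double y, double i, double q, double wt)
{
    const double chroma = i * std::cos(wt) + q * std::sin(wt);  // quadrature
    const double k = y + chroma;
    return std::clamp(k, 0.0, 1.2);   // e.g. hard limit at 120 IRE
}
```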
 

Urishizu

Member
Hey all!

I just wanted to share a game that I've been working on for the last eight months or so. It was made as an excuse to get to know the new Windows 10 development environment. It utilizes both XAML and Win2D to handle the menu/game rendering.


Eternal Heart Website
Eternal Heart on the Windows 10 Store (Trial available)

Eternal Heart is a 2D 'rogue-lite' where you control a character exploring a dungeon searching for loot (and ultimately the Eternal Heart). It's not quite a real rogue-like, as it's more of a 2D action game with light RPG elements. As you defeat monsters and break open chests, you gather loot which you can spend to buy potions and level up. The game itself isn't overly complex, but the hook is the included game editor that allows you to make your own adventures. I like to include editors in my games and the entire included campaign was made directly inside of it. It's meant to be an iteration on design from my previous project, Venatio Creo.

I'm not an artist, so all of the assets used in the game are either licensed from free sites, or purchased by me. I have a full list of credits on the Eternal Heart website.

You can use the touch screen, KB&M or an Xbox gamepad to play. The editor is usable using touch or KB&M. The game works on Windows 10 PCs, tablets and Windows 10 phones since it's a universal app. It should also function pretty well with the Windows Phone Continuum if you have a 950/950XL and a dock. Just press the button (or tap the screen) on your control-method of choice, and the game's UI will transition over to it automatically.

Anyway, I'm fairly happy with how it turned out and I'm going to use it as the basis for my next game. Let me know if anyone wants a free code and I'll PM it to you. Interested to hear any thoughts!
 

Dewfreak83

Neo Member
That also means that I just spent my final day of development with the most wonderful team and project.

What!? You guys are all done? Did the game not perform as well as you'd hoped? Seems like it was done well from this end! :(

Sorry if I missed this before - but how big is your team?
 
Whelp, I finally put down monies on steam for greenlight. Now I just need to get my head around how to make a trailer. Wish me luck :p

Hope everyone else is doing well as xmas looms over us all XD
 

missile

Member
uKspar8.png

RGB

7TtUQoX.png

non-NTSC broadcast colors
(clipping the video signal to Y+|C| <= 1.20)


DfCG5Do.png

NTSC broadcast colors
(scaling the colors to satisfy Y+|C| <= 1.20 (120 IRE), |C| <= 0.34)


That's my take on the colors for the time being. I don't know if it fits
on all accounts, but it looks good for a first try. How is it done? Well,
I tried many variants of scaling the colors to limit the composite video
signal to 120 IRE according to the NTSC broadcast specification. The most
important thing is to keep the color's hue constant while scaling. It's
also good to keep the luma (Y) constant as much as possible, because the
eye is very sensitive to changes in luminance (which Y encodes, more or
less) and also because it preserves the grays of the colors. However, if
the limit on the composite signal becomes rather low, then luma needs to
be scaled as well. So my strategy was as follows: if Y+|C| > A (A given,
e.g. 1.20), scale the chroma (C) component down (desaturating) to some B
(which may become 0, depending on A) no larger than a given chroma limit,
while keeping Y constant, and see if Y+|B| <= A (*). If that's the case,
fine. If not, scale Y down until (*) holds. Works fine!
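
Expressed as code, the strategy reads roughly like this (a sketch of the
steps just described, with A and the chroma cap as parameters):

```cpp
// Sketch of the limiting strategy described above (not the actual code):
// if Y+|C| > A, first desaturate (scale I and Q by a common factor,
// which keeps hue and Y constant), capped at the chroma limit; if Y+|C|
// still exceeds A, scale Y down as well.
#include <algorithm>
#include <cmath>

void limitForBroadcast(double& y, double& i, double& q,
                       double A = 1.20, double chromaCap = 0.34)
{
    double c = std::sqrt(i * i + q * q);          // chroma magnitude |C|
    if (y + c <= A && c <= chromaCap) return;     // already legal

    // Step 1: desaturate toward the smaller of the cap and the headroom.
    const double b = std::min(chromaCap, std::max(0.0, A - y));
    if (c > 0.0) { i *= b / c; q *= b / c; }
    c = b;

    // Step 2: if Y+|C| still exceeds A (i.e. Y alone was too big), pull
    // Y down until the constraint holds.
    if (y + c > A) y = A - c;
}
```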

One problem when scaling colors is that you may leave the color gamut.
The scaling procedure above leads to small YIQ values which do not fall
into the NTSC_RGB color space (i.e. negative RGB values). I have
therefore written a clamping method which projects these out-of-range RGB
values onto the NTSC_RGB gamut, following a similar strategy: keeping the
hue constant while desaturating the offending color by moving it towards
the center axis (connecting black and white of the gamut) within the
plane perpendicular to this axis that contains the color.

The actual limit of 0.34 for |C|, in conjunction with Y+|C| <= 1.2, was
derived in such a way that almost all clamped YIQ colors lead to valid
NTSC_RGB colors. So this is an upper limit for chroma if you want maximum
Y with maximum C under the constraint that Y+|C| <= 1.2 while producing
valid NTSC_RGB colors. Of course, I can raise the limit for |C|, which
would produce more vibrant colors, but that requires lowering Y (and thus
the gray scale will be off) and also requires serious NTSC_RGB clamping.
What's interesting is that the maximal excursion of the composite signal
Y+|C| is about ~1.334, and for |C| about ~0.632. Hence, about half the
chroma component of the composite video signal is lost (if you want to
stay true to and within the bounds of the NTSC specification). And this
can be seen on many NTSC TVs.

Now there is another issue. Many operations on the composite video
signal, i.e. filtering etc., indirectly manipulate the YIQ values
contained within the signal, producing out-of-range NTSC_RGB colors on
decoding. So usually one would have to clamp these values as well, but I
haven't done so, because a real TV had no color mapping inside producing
correct NTSC_RGB values. The circuit of resistors etc. will "clip" the
signal its own way, with the result that the colors change in hue,
saturation, and value to a given yet small degree. So I also just clipped
them to [0,1], whereas soft-clipping (like a vacuum tube does) could be
more advantageous, I guess. Hence, I only clamped the colors on the
broadcasting side (which is reasonable) and hard-clipped them at the TV
end. The "overshoot" between the bars is due to the filters and gamma
correction. It can be dimmed a bit by better adjusting the filters'
phase, but it can never be eliminated unless your filter becomes trivial.
So you will always see this behavior on color NTSC TVs to some degree
(while displaying color bars; it's barely visible otherwise).
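
The soft-clipping idea could look like this (a sketch; tanh is just a
stand-in for a tube-style saturation curve):

```cpp
// Sketch: hard clip vs. a tube-style soft clip of the composite signal.
// tanh is only a stand-in for a valve's saturation curve; note it also
// squeezes in-range values a bit, where a real design would only bend
// the signal near the rails.
#include <algorithm>
#include <cmath>

double hardClip(double k) { return std::clamp(k, 0.0, 1.0); }

double softClip(double k)
{
    return 0.5 + 0.5 * std::tanh(2.0 * (k - 0.5));
}
```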

The colors you see aren't absolute colors. To see the true colors, the
coordinates of the primary colors of one's monitor are needed to map the
NTSC_RGB gamut (defined with different phosphors) to the RGB gamut of
one's monitor. To actually see the colors as they appear on a given TV
requires the primaries of that TV as well as the primaries of one's own
monitor, to map NTSC_RGB to TV_RGB to Monitor_RGB. That's something I'm
going to implement further down the road.

Here is a demonstration of what happens if one just clips the composite
video signal to, for example, 0-100 IRE:

ESTFOmp.png


and if properly clamped

vgjQQjK.png


The first picture looks a lil more vibrant, but the colors are off. Look,
for example, at cyan: it turns into green, whereas in the second picture
the colors are all fine, yet a bit more desaturated (which is the
trade-off). Hue errors are much more problematic, esp. when picturing
natural scenes.



Edit: I should also say that the colors are filtered according to the NTSC
standard -- for those trying to judge or recreate the result.
 