How exactly does DirectX work?

Zaptruder

Just doing a little wondering... what exactly does DirectX help developers do? And where does the divide between what DirectX does and what the developers code actually start?

I'm wondering because I think it would be neat if they could plug-in bits of code into Direct X, such as a physics and character animation engine (creating a solid standard for everyone)... but it doesn't seem like Direct X is the thing that handles this type of stuff?
 
Zaptruder said:
Just doing a little wondering... what exactly does DirectX help developers do? And where does the divide between what DirectX does and what the developers code actually start?

I'm wondering because I think it would be neat if they could plug-in bits of code into Direct X, such as a physics and character animation engine (creating a solid standard for everyone)... but it doesn't seem like Direct X is the thing that handles this type of stuff?

http://www.microsoft.com/windows/di...dows/directx/productinfo/overview/default.htm

At the core of DirectX are its application programming interfaces, or APIs. The APIs act as a kind of bridge for the hardware and the software to "talk" to each other. The DirectX APIs give multimedia applications access to the advanced features of high-performance hardware such as three-dimensional (3-D) graphics acceleration chips and sound cards. They control low-level functions, including two-dimensional (2-D) graphics acceleration; support for input devices such as joysticks, keyboards, and mice; and control of sound mixing and sound output. Because of DirectX, what you experience with your computer is better 3-D graphics and immersive music and audio effects.

Basically, each piece of hardware, like a Radeon, or a GeForce, or an Audigy, or whatever, understands different commands.

Developers don't really want to code unique commands for every different piece of hardware you could possibly have in your computer, so they just talk to DirectX instead. DirectX takes the generic commands and translates them to commands your particular hardware understands.

DirectX does not currently do physics for you; it does not provide an actual game engine, and it doesn't handle things like character animation. Basically, it is a way for a developer to tell the hardware what to do without having to know all the specifics of that particular piece of hardware.
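To make that concrete, here's a rough sketch (not from any real game) of what "talking to DirectX instead of the hardware" looks like using the Direct3D 9 API. The same handful of calls run on a Radeon, a GeForce, or anything else with a DX9 driver; DirectX and the driver translate them into that chip's own commands. Assume hWnd is a window the application already created.

#include <d3d9.h>

// Rough sketch: render one frame through Direct3D 9 instead of
// programming the graphics chip directly.
void renderOneFrame(HWND hWnd)
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed         = TRUE;
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;

    // Ask DirectX for a device; the driver maps it onto the real hardware.
    IDirect3DDevice9* device = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                      D3DCREATE_SOFTWARE_VERTEXPROCESSING, &pp, &device);

    // Generic "clear to dark blue, draw, show it" commands.
    device->Clear(0, NULL, D3DCLEAR_TARGET, D3DCOLOR_XRGB(0, 0, 64), 1.0f, 0);
    device->BeginScene();
    // ... draw calls would go here ...
    device->EndScene();
    device->Present(NULL, NULL, NULL, NULL);

    device->Release();
    d3d->Release();
}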
 
I see...

so any improvements and 'plug-ins' to DirectX would have to be matched by corresponding support on the hardware side.

I guess the plug-in physics and character animation engines wouldn't work too well in that sense.
 
Zaptruder said:
Just doing a little wondering... what exactly does DirectX help developers do? And where does the divide between what DirectX does and what the developers code actually start?

DirectX provides the low level layer encapsulating the functionality of the hardware in a (relatively) standard API. Without DirectX developers would be working with the hardware directly and life would REALLY suck.

I'm wondering because I think it would be neat if they could plug-in bits of code into Direct X, such as a physics and character animation engine (creating a solid standard for everyone)... but it doesn't seem like Direct X is the thing that handles this type of stuff?

There are already vertex shaders that handle character animation in more than a few engines, but that sort of thing is a utility provided ABOVE DirectX. There used to be a Direct3D Retained Mode, a higher-level scene-graph style API that would have provided that functionality, but for the most part those kinds of things don't go in DirectX. They are either written, painfully, by engine developers who go through the Maya, SoftImage, or 3D Studio Max (ha) APIs, gather the kinematic information about the animation, and store it in some more convenient proprietary format for that developer's game engine, or, as is more common these days, things like physics, character animation, etc. are provided by third parties. No one REALLY wants to write a physics engine. It's a fairly complex volume of code built on higher-level math that many game developers don't really understand. The same goes for the vertex-weighted skinning algorithms that drive modern animation systems in games.
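For the curious, here's a minimal sketch of what vertex-weighted skinning boils down to, in plain C++ rather than a vertex shader. The struct layout and the four-influence limit are just illustrative, not any particular engine's format.

#include <cstddef>

struct Vec3 { float x, y, z; };
struct Mat4 { float m[16]; };   // column-major 4x4 bone transform

// Transform a point by a 4x4 matrix (assuming w = 1).
Vec3 transformPoint(const Mat4& M, const Vec3& p) {
    return {
        M.m[0]*p.x + M.m[4]*p.y + M.m[8] *p.z + M.m[12],
        M.m[1]*p.x + M.m[5]*p.y + M.m[9] *p.z + M.m[13],
        M.m[2]*p.x + M.m[6]*p.y + M.m[10]*p.z + M.m[14]
    };
}

struct SkinnedVertex {
    Vec3  bindPos;     // position in the bind (rest) pose
    int   bone[4];     // indices of up to four influencing joints
    float weight[4];   // per-joint weights, summing to 1
};

// Linear blend skinning: each vertex ends up at a weighted average of
// where its neighboring joints would carry it.
void skinVertices(const SkinnedVertex* in, Vec3* out, std::size_t count,
                  const Mat4* boneMatrices /* bind-pose-to-current transforms */) {
    for (std::size_t i = 0; i < count; ++i) {
        Vec3 p = {0, 0, 0};
        for (int j = 0; j < 4; ++j) {
            Vec3 t = transformPoint(boneMatrices[in[i].bone[j]], in[i].bindPos);
            p.x += in[i].weight[j] * t.x;
            p.y += in[i].weight[j] * t.y;
            p.z += in[i].weight[j] * t.z;
        }
        out[i] = p;
    }
}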

Anyways, I hope I answered your question. In summary DirectX == low level API for accessing a variety of hardware in a standard manner. Physics and animation == high level concepts that aren't provided for in the API.
 
A plug-in physics model or character animation stuff would be at a layer above DirectX.

Edit: Pretty much what Phoenix said.
 
Basically, the hardware makers (NVIDIA, ATI) will come out with a DirectX 9 spec card (like the Radeon 9800) after consulting with the DX team, and Microsoft will release DX9 around the same time, if not before.

A game like Half-Life 2 will then come out, detect which DirectX codepath it can run, and enable certain DX9 effects: things like pixel shaders, texture compression, normal maps, and so on.
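In practice that just means the game asks the device what it supports and picks a path. Something roughly like this (illustrative only, not Half-Life 2's actual code):

#include <d3d9.h>

// 'device' is assumed to be an already-created IDirect3DDevice9*.
void chooseCodepath(IDirect3DDevice9* device) {
    D3DCAPS9 caps;
    device->GetDeviceCaps(&caps);

    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0)) {
        // DX9-class card (Radeon 9800 etc.): enable the ps_2_0 effects,
        // e.g. per-pixel lighting with normal maps.
    } else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1)) {
        // DX8-class card: fall back to simpler shaders.
    } else {
        // Fixed-function fallback.
    }
}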
 
Phoenix said:
Anyways, I hope I answered your question. In summary DirectX == low level API for accessing a variety of hardware in a standard manner. Physics and animation == high level concepts that aren't provided for in the API.

I see... that's pretty interesting :)

So... what are the chances of something like muscle on skin and bones weighted by exterior forces along with realistic cloth clipping/animation becoming standardized?
 
Eventually, though (when we all have flying cars), you'll program games in something like PostScript: you just supply a high-level description of the game, and the device interprets it to the best of its ability.
 
Zaptruder said:
I see... that's pretty interesting :)

So... what are the chances of something like muscle on skin and bones weighted by exterior forces along with realistic cloth clipping/animation becoming standardized?


The algorithms are fairly standard. While you can't tell visually, most companies are using the same neighboring-joint averaging algorithms to do their skinning. At the same time you have to realize that everything is also going to be different based upon the needs of a particular design. If a game doesn't need realistic cloth animation, there isn't really any reason not to just animate it in the modelling package and export it out - perhaps with some basic bones to do some basic collision detection on the mesh.

If you're really interested in these topics I would encourage you to go to GDC one year and hit the sessions on these topics :)
 
Well... the cloth animations I'm thinking about I've never seen in a game...

where the clothing is pretty much removable from the character (developers can 'pin' it to the character for the sake of modesty) and deforms like real cloth, folding correctly over the shape of the body and reacting correctly to movement as well as collisions.

The only thing I've seen approaching this is the cloth animation from the Meqon physics tech demo.
 
Zaptruder said:
Well... the cloth animations I'm thinking about I've never seen in a game...

where the clothing is pretty much removable from the character (developers can 'pin' it to the character for the sake of modesty) and deforms like real cloth, folding correctly over the shape of the body and reacting correctly to movement as well as collisions.

The only thing I've seen approaching this is the cloth animation from the Meqon physics tech demo.


You get to a point where you have to ask yourself: is that cloth important enough that it's worth 20% of the CPU all the time? :) It's always a tradeoff. Even on the most powerful machines we can't do realtime, accurate cloth dynamics 'cheaply'. As such, most games just approximate it, because it's really not important enough to gameplay to justify that much CPU/GPU bandwidth. If, however, it were gameplay-relevant to have insanely accurately modelled cloth (like some magic clothing or something - I dunno, I can't think of a case where it really makes a difference), I'm sure you'll see it. The algorithms for doing it are very well known; they just aren't cheap.
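For reference, the usual 'cheap' approximation is something along the lines of a Verlet-integrated particle grid held together by distance constraints (the approach from Jakobsen's GDC paper). A tiny sketch, with the constants and names made up for illustration; the pinned flag is exactly the kind of 'pinning' you mentioned:

#include <vector>
#include <cmath>

struct Particle   { float x, y, z, px, py, pz; bool pinned; };
struct Constraint { int a, b; float restLength; };

void stepCloth(std::vector<Particle>& p, const std::vector<Constraint>& c,
               float dt, int iterations) {
    // Verlet integration with gravity; pinned particles never move.
    for (auto& q : p) {
        if (q.pinned) continue;
        float nx = 2*q.x - q.px;
        float ny = 2*q.y - q.py - 9.8f*dt*dt;
        float nz = 2*q.z - q.pz;
        q.px = q.x; q.py = q.y; q.pz = q.z;
        q.x = nx;   q.y = ny;   q.z = nz;
    }
    // Relax the distance constraints; more iterations means stiffer,
    // more accurate cloth - and more CPU.
    for (int it = 0; it < iterations; ++it) {
        for (const auto& k : c) {
            Particle& a = p[k.a];
            Particle& b = p[k.b];
            float dx = b.x - a.x, dy = b.y - a.y, dz = b.z - a.z;
            float d  = std::sqrt(dx*dx + dy*dy + dz*dz);
            if (d == 0.0f) continue;
            float corr = 0.5f * (d - k.restLength) / d;
            if (!a.pinned) { a.x += corr*dx; a.y += corr*dy; a.z += corr*dz; }
            if (!b.pinned) { b.x -= corr*dx; b.y -= corr*dy; b.z -= corr*dz; }
        }
    }
}

Accurate cloth needs far more constraints, self-collision, and many more iterations per frame, which is exactly where the cost comes from.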
 
I saw a documentary on The Incredibles... they said that making realistic clothes and having them animate was the hardest part and took the most time. Of course, now that they've done it, you could probably scale the code down to real-time. Still, it would require ridiculous CPU power.

Virtua Fighter 4 has pretty decent cloth movement (take a look at Lei-Fei or Aoi) but it's obviously not perfect. And it's all preprogrammed.
 
SiegfriedFM said:
I saw a documentary on The Incredibles... they said that making realistic clothes and having them animate was the hardest part and took the most time

Pixar always say stuff like that. For Monsters, Inc. it was the fur, for Finding Nemo it was the water.

I don't see how clothes are more difficult than the water in Finding Nemo. That had to interact with complex surfaces, inside the whale and so on.

Maybe they just hadn't programmed anything for clothes. They usually end up rewriting half of RenderMan every bloody movie anyway.
 