Is it weird that I'm not impressed at all?
Bokeh, particles, and lighting look stunning. Is there a reason why the geometry, textures, and shaders look so shoddy, though? Everything looks really rough, like the hammer and the door. I recall the Samaritan demo having really good models and textures.
Hm, not impressed at all. Sorry Epic, but you teased this thing way too long; you should have shown it six months ago. Now the FF demo, out of nowhere, stole your thunder.
Just curious: does anyone know if Squenix plans to use the engine for more than just Final Fantasy? I'd love to see a new game or genre made with it.
The UE4 demo looked great. I didn't care much for the indoor parts, but the transition to the snowy mountains was stunning.
The engine is designed to be modifiable, so more than just RPGs can be built on it. It's going to be the go-to engine for all internal SE studios, including Eidos, Crystal Dynamics, and IO, along with Square Japan.
Some of the posts in this thread man...
Do you guys know what a tech demo is? Why are people even mentioning the assets and the texture resolution? You do realize that what appears as "styrofoam rocks" comes down to a couple of settings in the physics engine, not a deficiency in the engine itself? Come on, Gaf.
Hopefully we'll be seeing some human characters soon enough. Devs seemingly had some issues making those look good in UE3. Anyone know why that was? I'm not well versed in UE3.
I agree with this. The assets look terrible, TBH. You'd think they would have used higher-quality textures and mapping.
They explicitly said that they are not doing that.

Don't IO and CD have their own toolsets? Is Square really gonna pull an EA and just shove the engine down every team's throat like EA is doing with Frostbite?
I think that's unfair to The Witcher 2. Unlike this tech demo, W2 has fantastic, consistent assets and art. This is also the reason, IMO, why there are so many people out there who think Uncharted looks better than The Witcher 2 on PC and are oblivious to the amount of power needed for the little things they don't even notice.
ITT Gaf shows they do not understand the purpose of a technology demonstration.
Demo seems fairly impressive; let's hope the animations are all on par!
They do support MSAA in DX11 now.

Looks like it's using FXAA; I hate how that looks in motion. Couldn't they at least use MSAA?
Despite MSAA's popularity, "it is relatively costly in the demo because Samaritan uses deferred shading," explains Ignacio Llamas, a Senior Research Scientist at NVIDIA who worked with Epic on the FXAA implementation. By writing pixel attributes to off-screen render targets prior to final shading, deferred shading enables complex, realistic lighting effects that would otherwise be impossible using forward rendering, the lighting technique commonly used in many game engines. There are a couple of downsides to this: first, the render targets require four times the memory, since they must hold the information of four samples per pixel; and second, the deferred shading work is also multiplied by four in areas with numerous pieces of intersecting geometry.
"Without anti-aliasing, Samaritan's lighting pass uses about 120MB of GPU memory. Enabling 4x MSAA consumes close to 500MB, or a third of what's available on the GTX 580. This increased memory pressure makes it more challenging to fit the demo's highly detailed textures into the GPU's available VRAM, and led to increased paging and GPU memory thrashing, which can sometimes decrease framerates."
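The quoted figures can be sanity-checked with some back-of-the-envelope arithmetic. The 120 MB and ~500 MB numbers come from the quote; the simple linear-scaling model is my own assumption and ignores per-sample bookkeeping and other overhead:

```python
# Rough estimate of deferred-shading render-target memory under MSAA.
# With MSAA, the G-buffer must store one set of pixel attributes per
# sample, so memory scales roughly linearly with the sample count.

def lighting_pass_memory_mb(base_mb: float, msaa_samples: int) -> float:
    """Estimated lighting-pass memory for the given MSAA sample count."""
    return base_mb * msaa_samples

no_aa_mb = 120.0  # quoted: Samaritan's lighting pass without anti-aliasing
msaa_4x_mb = lighting_pass_memory_mb(no_aa_mb, 4)
print(msaa_4x_mb)  # 480.0 -- in the same ballpark as the quoted ~500 MB
```

The gap between the modeled 480 MB and the quoted ~500 MB would be the extra overhead this naive per-sample model leaves out.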
Hahahaha, surely u jest. What the fuck is a standard human being going to do with 16 GB of RAM?

16 GB RAM in a PC is pretty much the standard these days.
I'm eerily unimpressed by this shot; nothing in it looks particularly amazing. However, in motion I can't help but be impressed.

Check the amazing high-resolution screenshots:
http://images.eurogamer.net/2012/articles//a/1/4/9/1/0/9/2/UE4_Elemental_Cine_screen_00014.jpg.jpg
-.-

Don't IO and CD have their own toolsets? Is Square really gonna pull an EA and just shove the engine down every team's throat like EA is doing with Frostbite?
And this is why people probably aren't at 16 GB en masse, even among new buyers. I'll go there someday, perhaps, especially if we start seeing 64-bit games that actually use that much RAM, but there's really not much need for more than 4 GB for gaming alone, or much above 8 GB unless you're doing serious graphical work or something else that demands absurd amounts of RAM.

The standard for what? For whom? Most PC games are 32-bit and can't even address that much RAM.
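The 32-bit point is just pointer arithmetic. A quick sketch; the 2 GiB default user-space split is the usual 32-bit Windows figure, an assumption of mine rather than something stated in the thread:

```python
# Why a 32-bit game can't touch 16 GB: a 32-bit pointer can only
# address 2**32 bytes, and the OS reserves part of that for itself.

total_addr_space_gib = 2**32 / 2**30  # whole 32-bit address space
print(total_addr_space_gib)           # 4.0

# On 32-bit Windows a process normally gets half of that as user
# address space (more if built large-address-aware on a 64-bit OS).
user_space_gib = total_addr_space_gib / 2
print(user_space_gib)                 # 2.0
```

So even on a 16 GB machine, a typical 32-bit game sees at most a few GiB of usable address space.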
Only one non-DICE studio has ever used Frostbite, and that was Black Box.
Indeed, the number will grow to three with Warfighter and Generals 2.
But that's far, far from every team when you consider they have some 30 teams.
Criterion used 1.5 for Hot Pursuit and 2.0 for Most Wanted, and Visceral's next project (the Army of Two sequel) uses it as well.
hahaha, go kill yourself.
Ignoring your childish response, Most Wanted uses EAGL, and the 2010 edition of Hot Pursuit uses Chameleon (either/both of which could simply be renamed modified variants of FB -- who knows), but, judging from industry rumblings, he's right about the next entry in the Army of Two series.
Hahahaha, surely u jest. What the fuck is a standard human being going to do with 16 GB of RAM?

I think even 8 GB is excessive.
I've only ever managed to break 6 GB when I had a high-poly model open in 3DS while rendering another high-poly model in Octane.
With 7 or 8 tabs open in Chrome on top of that, then and only then, did I feel my 8 GB of RAM was worth it.
HW Spec for Epic's Programmers
Lenovo ThinkStation D20 (Model 4158-C95)
Windows 7 64-bit
Dual Quad-Core Xeon Nehalem Processors (3.17GHz)
24 GB DDR3 RAM
nVidia GeForce GTX 285 (1 GB DDR3)
3x500 GB Hard Drives (1x OS Drive, 2x Data Drives in a RAID 0 configuration)
HW Spec for Epic's Level Designers
Dell Precision Workstation T7400
Windows 7 64-bit
Dual Quad-Core Xeon Processors (3.0GHz)
16 GB DDR2 RAM
nVidia GeForce GTX 285 (1 GB DDR3)
3x500 GB Hard Drives (1x OS Drive, 2x Data Drives in a RAID 0 configuration)
HW Spec for Epic's Artists (same specs as L.D.)
Dell Precision Workstation T7400
Windows 7 64-bit
Dual Quad-Core Xeon Processors (3.0GHz)
16 GB DDR2 RAM
nVidia GeForce GTX 285 (1 GB DDR3)
3x500 GB Hard Drives (1x OS Drive, 2x Data Drives in a RAID 0 configuration)
To help with hitting the NextGen Game asset wall, we do the following:
All machines have 16 GB or more RAM.
...
Basically, the bottleneck is I/O, so getting a RAID 0 setup and lots of RAM can help out a lot.
Most people at Epic are running Windows 7 64-bit as their primary development OS. The biggest benefit is having more than 4 GB of RAM, which speeds up iteration time immensely since you are no longer constantly swapping to disk.
For programmers, the most obvious benefits will be in:
Compile C++; link C++
Compile UnrealScript
Run editor; run 1-2 copies of the game
The old Most Wanted used EAGL.
As a developer, the new editor demo vid was FAR more impressive to me.
Really impressive debugging tools
I just checked Wulfen's texture pack for Doom 3, and those textures look "next gen" to me.
These are the specs they used back when the GTX 285 was the most powerful single-GPU card.
http://udn.epicgames.com/Three/UE3MinSpecs.html
Not impressed. Though that may have more to do with Epic's terrible asset design.