
007 First Light recommended specs revealed and... oof.

I'm here to talk about games, not about my feelings, sorry.
I said this in the other thread, but that writer should've never opened his mouth and said what he said. You could do everything right in today's society, but if you say one mildly concerning thing, the wolves come for you. And sadly, the sheep follow.

This game has a straight white male lead, a straight white attractive female Bond girl, shows Bond as a playboy ogling bikini-clad women on beaches and hanging out on yachts with scantily clad women, has ray-traced graphics, incredible physics and destruction finally using the next-gen CPUs, great new melee combat animations, and all of that is ignored because the writer hinted at making the game for a modern audience.

Devs need to understand that the very gamers they are trying to sell this game to are extremely sensitive and insecure, extremely jaded, ready to pounce on every word, ready to dismiss every game if it does not appeal to their own sensibilities. We have to treat them with kid gloves and give them absolutely fuck all to complain about, because they do make up the vast majority of gamers. This game just might end up flopping because of that interview, and all of the incredible work done by engineers, artists and level designers will have gone to waste. Everything even remotely negative about this game will now be viewed through that lens.
 
Lol wait, my 12th gen Intel isn't even good enough for recommended anymore? I thought Moore's Law was dead?

The GPU requirements seem low compared to everything else, interestingly enough.
 
I would assume they mean native 1080p, which is DLSS Performance at 4K output. So possibly just above a 3060 Ti for 4K60, and people are freaking out?
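For reference, the "native 1080p = 4K DLSS Performance" arithmetic is just the per-axis render scale: Performance mode renders at 50% of the output resolution on each axis, so a quarter of the pixels. A quick sketch, using the standard DLSS quality-preset scale factors (the dictionary of modes is my own illustrative helper, not anything from the spec sheet):

```python
# DLSS per-axis render scales for each quality mode (standard published factors).
DLSS_SCALES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

def internal_resolution(out_w: int, out_h: int, mode: str = "Performance") -> tuple[int, int]:
    """Internal render resolution for a given output resolution and DLSS mode."""
    s = DLSS_SCALES[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(3840, 2160))             # 4K Performance -> (1920, 1080)
print(internal_resolution(3840, 2160, "Quality"))  # 4K Quality -> (2560, 1440)
```

So a "1080p recommended" line genuinely can mean "4K output with Performance upscaling", which is why the GPU tier looks lower than the rest of the specs.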
God, I remember being able to play Diablo 3 near launch on a Dell UltraSharp 10-bit screen in true 1440p, and it fucking blasting my eyes in high res. And now we have fake res and fake frames.

Hungry Monster GIF
 


Let's see how this woke shit does. I was looking forward to it, but with this news and the modern Bond shit the other day, I'm now turned away from it. Maybe the specs are over the top, we have seen that recently, but the official gameplay footage was running like shit, and that was from the devs, so I have my doubts.
 
Why defend a game that comes out for consoles with 16GB of RAM or less, while FS2024 requires 32GB in its recommended specs? Do you think the two games are equal in scope and graphics?
 
Specs for games are rarely correct. MANY games have recommended 32GB, but when Hardware Unboxed tested this recently, 90% of them ran the same on 16GB and 32GB, as long as you have enough VRAM and the game isn't spilling into main memory.
This is clown territory though. Not only does RAM not affect your FPS this way, you also generally don't put six-freaking-year-old GPUs in your spec sheet.
 
32GB of RAM for 1080p/60fps? Why?

This seems really unoptimized, and going off the trailer they showed a while back, it seems like it's not going to run well at all :|
My mid-range laptop has 64GB...
Only 8GB VRAM though..

It's also only a 1080p screen, so I'm good. It's essentially an "I don't wanna pack my Pro for this trip" system.
 
So the minimum VRAM is 8GB, yet the minimum GPU is a 1660, which comes with 6GB.
It's a reverse 970, it has a third more VRAM, ofc.

My mid-range laptop has 64GB...
Only 8GB VRAM though..

It's also only a 1080p screen, so I'm good. It's essentially an "I don't wanna pack my Pro for this trip" system.
Das not midrange, fam. It might not be cheap, but midrange it is not.

I know it "looks good", but all those copy-and-paste shop stall assets in the car scene are so hilarious.
They've seen a movie with a guy driving through a fruit market stall once. Watchu want, fruit stall man, are you an agent of Big Fruit Stall?
 
I said this in the other thread, but that writer should've never opened his mouth and said what he said. You could do everything right in today's society, but if you say one mildly concerning thing, the wolves come for you. And sadly, the sheep follow.

This game has a straight white male lead, a straight attractive female Bond girl, shows Bond as a playboy ogling bikini-clad women on beaches and hanging out on yachts with scantily clad women, has ray-traced graphics, incredible physics and destruction finally using the next-gen CPUs, great new melee combat animations, and all of that is ignored because the writer hinted at making the game for a modern audience.

Devs need to understand that the very gamers they are trying to sell this game to are extremely sensitive and insecure, extremely jaded, ready to pounce on every word, ready to dismiss every game if it does not appeal to their own sensibilities. We have to treat them with kid gloves and give them absolutely fuck all to complain about, because they do make up the vast majority of gamers. This game just might end up flopping because of that interview, and all of the incredible work done by engineers, artists and level designers will have gone to waste. Everything even remotely negative about this game will now be viewed through that lens.
God I love this forum.
 
I have 32GB of RAM and a 3080 with 10GB of VRAM, so I'm under spec for this bullshit at 1080p.
Lol, you will be fine. I had the 3080 for 3+ years and the VRAM was not an issue for 99.99% of games. Path-traced Indy was the only game that couldn't be run. And I played everything at 4K textures using DLSS, never 1440p.

Worst-case scenario, you will have to settle for 1440p. But even that could likely be mitigated by turning some VRAM-heavy texture or draw-distance settings down a notch or two.
 
We'll see what happens, but their previous games have usually run pretty well, and they're good about fixing the most egregious stuff.

I don't have enough interest to buy day one, but I'm not expecting this to be like Monster Hunter Wilds's launch.
 
God, I remember being able to play Diablo 3 near launch on a Dell UltraSharp 10-bit screen in true 1440p, and it fucking blasting my eyes in high res. And now we have fake res and fake frames.

Hungry Monster GIF

Diablo 3 released in 2012. That was the SEVENTH year of the PS360 generation.

Back then, PCs were so insanely far ahead of consoles that you could run anything at super high resolutions and framerates.
Consoles are always the minimum spec, and Diablo 3 was already planned for consoles as well.
Hence, you could crank it like crazy.

You could get PC cards with 2x console performance in 2006 already... now imagine six years later.
The upper-tier 2012 GeForce cards were more than 10x as fast as the 360 GPU.

So if you had a GTX 600 series card, there was a good chance you were running a game designed to work at 60fps on a GPU with only 10% of your card's power.
And they didn't even run the lowest settings on console either... they in fact kept the pretty demanding soft shadows enabled, and ran with 2x MSAA.
 
Diablo 3 released in 2012. That was the SEVENTH year of the PS360 generation.

Back then, PCs were so insanely far ahead of consoles that you could run anything at super high resolutions and framerates.
Consoles are always the minimum spec, and Diablo 3 was already planned for consoles as well.
Hence, you could crank it like crazy.

You could get PC cards with 2x console performance in 2006 already... now imagine six years later.
The upper-tier 2012 GeForce cards were more than 10x as fast as the 360 GPU.

So if you had a GTX 600 series card, there was a good chance you were running a game designed to work at 60fps on a GPU with only 10% of your card's power.
And they didn't even run the lowest settings on console either... they in fact kept the pretty demanding soft shadows enabled, and ran with 2x MSAA.
The 600 series launched the same year as Diablo 3. You need to go back to cards from 2007-ish, since they are recommending five-year-old cards here. And mid-range cards from 2007, at that.
 
The 600 series launched the same year as Diablo 3. You need to go back to cards from 2007-ish, since they are recommending five-year-old cards here. And mid-range cards from 2007, at that.

Well yes, but that wasn't the point.

My point is that running almost any game at high resolutions in 2012 was super easy due to how much more powerful PCs were in comparison to the base spec, which was the Xbox 360.

Meanwhile, you can't even buy a GPU that is 10x as powerful as the Series X now, even if you wanted to.
Even the 5090 is only ~6x as powerful, maybe 7x, and it's an insane high-end card.
While a GTX 660 Ti in 2012 was already enough to get 10x Xbox 360 GPU performance.
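Those multipliers roughly track published spec-sheet throughput. A quick sanity check on the ratios (the TFLOPS figures below are approximate public FP32 numbers I'm assuming, and paper FP32 flatters post-2020 Nvidia cards because of dual-issue, which is why the real-world 5090 gap lands nearer 6-7x than the paper figure):

```python
# Rough FP32 throughput ratios from approximate spec-sheet numbers (TFLOPS).
gpus_tflops = {
    "Xbox 360 (Xenos)": 0.24,    # ~240 GFLOPS
    "GTX 660 Ti (2012)": 2.46,
    "Series X": 12.15,
    "RTX 5090 (paper)": 104.8,   # dual-issue FP32 inflates this vs real-world gains
}

ratio_2012 = gpus_tflops["GTX 660 Ti (2012)"] / gpus_tflops["Xbox 360 (Xenos)"]
ratio_now = gpus_tflops["RTX 5090 (paper)"] / gpus_tflops["Series X"]

print(f"660 Ti vs Xbox 360: {ratio_2012:.1f}x")   # ~10.3x
print(f"5090 vs Series X:   {ratio_now:.1f}x")    # ~8.6x on paper, ~6-7x in practice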
 
Well yes, but that wasn't the point.

My point is that running almost any game at high resolutions in 2012 was super easy due to how much more powerful PCs were in comparison to the base spec, which was the Xbox 360.

Meanwhile, you can't even buy a GPU that is 10x as powerful as the Series X now, even if you wanted to.
Even the 5090 is only ~6x as powerful, maybe 7x, and it's an insane high-end card.
While a GTX 660 Ti in 2012 was already enough to get 10x Xbox 360 GPU performance.
This is true, but AMD had affordable GPUs 2x better than the consoles out in 2020. People wanted Nvidia and refused to buy them, instead going for 8GB GPUs that we all said were garbage even back then. Now it's coming back to bite them.

IIRC, the 7800 XT, which offers even better performance than the 6800 XT, was going for $450-500 a few years ago, maybe 2023 or 2024. No one bought them. Not to mention the fact that even the lowly 6700 XT shipped with 12GB of VRAM.

People holding on to their 3060s, Zen 2 CPUs, and 16GB of RAM played it all wrong and are paying for it.

I picked up a 3080 after a year and a half of searching for it, and paid a $150 markup over MSRP. Ultimately it was worth it, because in ray tracing games it was like 3x more powerful, and combined with my powerful CPU, I was literally getting 4x more performance than the base PS5 in some Nvidia-sponsored games. I understand not everyone has $850 to waste on GPUs, but the 5070 has been around $500 for a good year now, and it's even more powerful than my 3080. 4x more for the price of a PS5 is really not bad, even if 10x-better performance is no longer attainable on the best, most expensive cards.

P.S. The Xbox 360 came out in 2005, so you should be looking at 2010 cards for your comparisons, not 2012. The 560 was only around 4x more powerful. I ended up going with the GTX 570 in 2011 and was able to run every game at 1080p 60fps, so double the resolution and double the framerate: 4x more powerful, with some settings set to high instead of the medium or low on consoles. Maybe it was around 5x more powerful. We are very close to that when you look at ray-traced games; less so in rasterized games, but still around 2-3x better.
 
Steam Machine rapidly becoming obsolete before we even get a price, lol.

I'm sorry, but the more time passes, the more I wonder if Valve shot too low raiding the AMD junk pile for "4K (with upscaling)".
 
I said this in the other thread, but that writer should've never opened his mouth and said what he said. You could do everything right in today's society, but if you say one mildly concerning thing, the wolves come for you. And sadly, the sheep follow.

This game has a straight white male lead, a straight white attractive female Bond girl, shows Bond as a playboy ogling bikini-clad women on beaches and hanging out on yachts with scantily clad women, has ray-traced graphics, incredible physics and destruction finally using the next-gen CPUs, great new melee combat animations, and all of that is ignored because the writer hinted at making the game for a modern audience.

Devs need to understand that the very gamers they are trying to sell this game to are extremely sensitive and insecure, extremely jaded, ready to pounce on every word, ready to dismiss every game if it does not appeal to their own sensibilities. We have to treat them with kid gloves and give them absolutely fuck all to complain about, because they do make up the vast majority of gamers. This game just might end up flopping because of that interview, and all of the incredible work done by engineers, artists and level designers will have gone to waste. Everything even remotely negative about this game will now be viewed through that lens.
Monday Night Raw Lol GIF by WWE
 
At the end of the day, I can't take this avatar seriously as 007. I don't want to control that character in a game, pretending he's supposed to be James Bond. Suspension of disbelief will be nonexistent; they've made this game for the Tom Holland generation, but it holds no appeal to me.

Should have just called it Kingsman instead, I'd have loved that, tbh.
 
Their engine, while running great now, was also a bit of a mess when the first Hitman (reboot) hit the scene.

Hitman 3 runs great without RTX, but it's also a pretty old game built on an engine that still targeted older tech (since all three games ran on the same tech).

So I'm not surprised 007 is a hog; I just hope they sort things out and we don't have to wait three games into the franchise.

Also, if they keep that continuity with Bond where sequels can be packed into one... that would be awesome.
 