I love Todd Howard because he's so shameless about everything. He reminds me of Elon Musk, except Todd is not a genius who builds rockets and electric cars.
Calm down, Phil. Pat yourself on the back, "hack." We're all waiting on the game you're directing from the development studio you've been running for the past 25 years.
Why would you get so mad about what he said? Are you playing Starfield? Are you a fan of any Bethesda game? Why?
You sound so mentally unhealthy. Remember, it's a video game. Entertainment purposes only.
A lot of gaslighting going on in this thread. Not sure why we are cheering on objectively bad-performing PC games.
The biggest takeaway from Todd's comments, however, is that it seems they aren't going to put any resources into making the game perform better, which is very disappointing because, again... the game objectively runs like horse shit.
The game picks up after the first three main missions. However, I do agree. The engine is absolute ass.
Just done the intro on a 5800X and 3070 in 1440p and the performance is appalling.
Around 35fps during that fight with the pirates, and that's with the AMD solution thing on. Considering how shit the game looks, this performance is terrible.
How is this acceptable?
Creation engine garbage strikes again.
Do we have a thread yet about how the women in Starfield have awkward heads, are wearing too much clothing, and have been ruined by wokeness? That could be another dollar!
If I had a dollar for every Starfield thread, I'd buy my own island.
On topic: it's kind of BS to have to upgrade. I thought he said it's optimized to run on several CPUs.
Someone should tell SlimySnake that PC gamers don't game at 1080p. You know, the resolution where having the latest CPU actually matters.
This has nothing to do with the 5800X3D. You are the only one getting triggered by me simply posting benchmarks showing this CPU get trounced by AMD's own CPUs.
I had no idea the CPU had a spouse on this board taking everything so personally.
Just found this video showing optimized settings. Not a big visual hit, and he's able to get the game running at 60 fps even in Atlantis on your CPU.
Great info. Thank you!
No offense, but a 6-core processor paired with a 3080 isn't exactly a balanced build.
Nvidia 3080 GPU and AMD Ryzen 5 5600X CPU. I'm using optimized settings off of Nexus, which drop the shadows to low, for instance.
It's not constant sub-30 in New Atlantis, but I'd prefer it never go that low. I just haven't heard a compelling argument from Todd Howard or otherwise as to what this game is doing that other games couldn't and why it's so CPU dependent.
Then again, I just looked up what the Xbox Series X's equivalent-ish PC CPU would be, and it's a Ryzen 7 3700X, which is actually better than the 5600X. It's starting to make more sense now.
First I'm hearing of it. Really, only BG3 and Starfield have been disappointing, performance-wise, on it. I was plenty happy with Cyberpunk and all the other games I've played on it the past two years.
No offense, but a 6-core processor paired with a 3080 isn't exactly a balanced build.
Just downloaded DLSS and for some reason it is making my CPU go into overdrive. Indoors, outdoors, doesn't matter; I'm hitting 140 watts on my i7-11700K and temps are jumping up to 75C. I actually saw it jump to 80 degrees at one point. I'm legit scared. If I play this game for around 100 hours, am I killing my CPU if it stays around 75 degrees for dozens of hours?
As somebody that has a 3080 and an i9… to any of my Intel bros struggling with fps, enter the BIOS and make sure your CPU is running in turbo mode. Makes a huge difference. It's CPU heavy. I rarely ever drop below 60; indoors I'm around 120fps.
Cyberpunk is heavy on the CPU, especially with ray tracing on, so you are leaving a ton of performance on the table. A better CPU won't always show its worth in benchmarks, since those graphs tend to only show average framerate on a PC with nothing else running in the background. Stutters and/or strange low frame drops with a 3080 mean either a CPU issue or you are running out of VRAM. Starfield seems to be pretty good at managing VRAM, so your CPU would most likely be the issue.
First I'm hearing of it. Really, only BG3 and Starfield have been disappointing, performance-wise, on it. I was plenty happy with Cyberpunk and all the other games I've played on it the past two years.
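If you want to sanity-check this on your own machine, here's a rough sketch of the usual heuristic: frames dropping while the GPU sits well below full utilization usually points at the CPU. This assumes an NVIDIA card with nvidia-smi on the PATH and the psutil package installed; the 90/95% thresholds here are arbitrary, not anything official, and this says nothing about VRAM.

```python
# Rough CPU-bound vs GPU-bound check while a game is running.
# Assumes an NVIDIA GPU (nvidia-smi available) and `pip install psutil`.
import subprocess

import psutil


def gpu_utilization() -> float:
    """Return current GPU utilization in percent, queried via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return float(out.stdout.strip().splitlines()[0])


def sample(seconds: int = 30, interval: float = 1.0) -> None:
    gpu_samples, cpu_samples = [], []
    for _ in range(int(seconds / interval)):
        gpu_samples.append(gpu_utilization())
        # Track the busiest core, so a single maxed-out game thread is visible.
        cpu_samples.append(max(psutil.cpu_percent(interval=interval, percpu=True)))
    avg_gpu = sum(gpu_samples) / len(gpu_samples)
    busiest_core = sum(cpu_samples) / len(cpu_samples)
    print(f"avg GPU util: {avg_gpu:.0f}%, busiest CPU core avg: {busiest_core:.0f}%")
    if avg_gpu < 90 and busiest_core > 90:
        print("Looks CPU-bound: the GPU is waiting on the CPU.")
    elif avg_gpu >= 95:
        print("Looks GPU-bound: lowering resolution/settings should help.")
    else:
        print("No obvious bottleneck from utilization alone (could be VRAM, I/O, etc.).")


if __name__ == "__main__":
    sample()
```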
Willing to bet a lot of it has to do with driver overhead on Nvidia GPUs affecting performance. And as far as we can see from all the CPU testing online, Starfield is a very CPU-dependent game. That, and this is an AMD-sponsored title.
I really wonder what CPU people are running with a 4070 at 1080p to only hit just 60fps. I'm sorry, but that's some whacked-out component upgrade pathing if I've seen any.
I have a 6800XT and a Ryzen 3800XT, playing the game at 1440p 144Hz. I'm getting on average 100-140 fps with everything on ultra. I've only turned motion blur off and dynamic res off while pushing to 100% render resolution. How am I able to achieve this yet others are struggling with what is technically a better card?
For once we have a game actually using the CPU instead of four cores max and going GPU heavy. That is a far better balance in my opinion, and what next-gen development should be focusing on.
You can cull 80% of the game world in top-down. There's a limit to what you can optimize for draw distance in true perspective. C'mon now...
Just because a game is top-down view doesn't mean it stops rendering 3D stuff.
Both games are rendering 3D worlds, with the big difference that open areas in BG3 look good, while Starfield looks like crap while running like crap.
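To make the culling point above concrete, here's a toy sketch (invented numbers, nothing to do with how either engine actually works): a fixed top-down camera only ever covers a small ground rectangle, so almost everything can be rejected outright, while a ground-level perspective camera's frustum reaches out to the far plane and keeps far more of the world in play.

```python
# Toy illustration: how much of a flat world a top-down camera can skip
# versus a ground-level perspective camera. Purely illustrative numbers.
import math
import random

random.seed(1)
WORLD = 4000.0  # world is a WORLD x WORLD square
objects = [(random.uniform(0, WORLD), random.uniform(0, WORLD)) for _ in range(100_000)]

def visible_topdown(pos, cam_center, half_extent=60.0):
    """Orthographic top-down camera: sees only a small ground rectangle."""
    x, y = pos
    cx, cy = cam_center
    return abs(x - cx) <= half_extent and abs(y - cy) <= half_extent

def visible_perspective(pos, cam_pos, cam_dir_deg, fov_deg=90.0, far=3000.0):
    """Ground-level perspective camera: a wedge reaching out to the far plane."""
    dx, dy = pos[0] - cam_pos[0], pos[1] - cam_pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0 or dist > far:
        return False
    angle = math.degrees(math.atan2(dy, dx)) - cam_dir_deg
    angle = (angle + 180) % 360 - 180  # wrap to [-180, 180)
    return abs(angle) <= fov_deg / 2

cam = (WORLD / 2, WORLD / 2)
top = sum(visible_topdown(o, cam) for o in objects)
persp = sum(visible_perspective(o, cam, cam_dir_deg=0.0) for o in objects)
print(f"top-down camera keeps {top / len(objects):.2%} of objects")
print(f"perspective camera keeps {persp / len(objects):.2%} of objects")
```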
Engines are more than just rendering. I don't entirely disagree that the engine is being stretched to its limits, but it isn't just about graphics.
Get a better engine, Todd. This game isn't a visual masterpiece.
The 4070 performs way worse in this game, which seems to prefer AMD cards. Your 6800XT is performing more like a 3090 Ti.
I really wonder what CPU people are running with a 4070 at 1080p to only hit just 60fps. I'm sorry, but that's some whacked-out component upgrade pathing if I've seen any.
I have a 6800XT and a Ryzen 3800XT, playing the game at 1440p 144Hz. I'm getting on average 100-140 fps with everything on ultra. I've only turned motion blur off and dynamic res off while pushing to 100% render resolution. How am I able to achieve this yet others are struggling with what is technically a better card?
For once we have a game actually using the CPU instead of four cores max and going GPU heavy. That is a far better balance in my opinion, and what next-gen development should be focusing on.
I would say you are probably fine, but somebody more knowledgeable on this can probably clarify?
Just downloaded DLSS and for some reason it is making my CPU go into overdrive. Indoors, outdoors, doesn't matter; I'm hitting 140 watts on my i7-11700K and temps are jumping up to 75C. I actually saw it jump to 80 degrees at one point. I'm legit scared. If I play this game for around 100 hours, am I killing my CPU if it stays around 75 degrees for dozens of hours?
I have no idea why FSR was holding back my CPU, but switching to DLSS has made me gain roughly 10% more performance. Sadly it comes with more wattage and higher temps.
Jensen is the only guy who takes physics seriously.
Ah yes, basic-ass Havok that did the same thing almost 13 years ago.
When was the last time that PhysX got an upgrade or was featured in a game?
Jensen is the only guy who takes physics seriously.
It was a pun for Bo…
Google "3080" and "5600X" and see what people say about the combo. The 5600X bottlenecking the 3080 is a new phenomena.Cyberpunk is heavy on the cpu, especially with ray tracing on, so you are leaving a ton of performance on the table. A better CPU won’t always show it’s worth in benchmarks, since those graphs tend to only show avg framerate running on a pc with nothing else running in the background. Stutters and/or strange low frame drops with a 3080 is either a cpu issue or you are running out of vram. Starfield seems to be pretty good with managing vram so your cpu would most likely be the issue.
Mods should at least do that on older games. I know it doesn't make sense; the devs do share our thoughts, BTW.
When was the last time that PhysX got an upgrade or was featured in a game?
Creative Optimization™
He's right, it is optimized... for 30 fps... on PC.
Of course not, but this game mechanically isn't doing anything new that older systems can't do with other engines. I don't think Creation is horrible, but it's dated and old. There's a reason they shouldn't be using this, and it boils down to 30fps on Series X and unstable frames on mid-tier systems.
Engines are more than just rendering. I don't entirely disagree that the engine is being stretched to its limits, but it isn't just about graphics.
75? Dude, my i5-2500K stayed at 90+ for like 5 years, and it's still working in my nephew's PC. It could be over 10 years old already.
Just downloaded DLSS and for some reason it is making my CPU go into overdrive. Indoors, outdoors, doesn't matter; I'm hitting 140 watts on my i7-11700K and temps are jumping up to 75C. I actually saw it jump to 80 degrees at one point. I'm legit scared. If I play this game for around 100 hours, am I killing my CPU if it stays around 75 degrees for dozens of hours?
I have no idea why FSR was holding back my CPU, but switching to DLSS has made me gain roughly 10% more performance. Sadly it comes with more wattage and higher temps.
It's doing two things that nobody else really does. One is object persistence and the other one is mod support.
Of course not, but this game mechanically isn't doing anything new that older systems can't do with other engines. I don't think Creation is horrible, but it's dated and old. There's a reason they shouldn't be using this, and it boils down to 30fps on Series X and unstable frames on mid-tier systems.
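For anyone unsure what "object persistence" means here: roughly, any loose object the player disturbs keeps its own saved state (position, cell, owner) and is restored from the save instead of from the shipped level data, which is why junk you drop stays put hundreds of hours later. Below is a minimal toy sketch of the idea; all the names and structure are invented for illustration and are not how the Creation Engine actually stores things.

```python
# Toy sketch of per-object persistence: any object the player disturbs gets a
# record in the save, and that record overrides the shipped level data when the
# cell is rebuilt. All names are invented for illustration (Python 3.10+).
from dataclasses import dataclass, field, asdict
import json


@dataclass
class ObjectState:
    object_id: str
    cell: str                              # which area/cell the object lives in
    position: tuple[float, float, float]
    owner: str | None = None               # e.g. stolen items remember their owner


@dataclass
class SaveGame:
    # Only objects the player has disturbed are recorded; untouched objects
    # simply respawn from the static level data.
    disturbed: dict[str, ObjectState] = field(default_factory=dict)

    def record(self, state: ObjectState) -> None:
        self.disturbed[state.object_id] = state

    def load_cell(self, cell: str, static_objects: dict[str, ObjectState]) -> dict[str, ObjectState]:
        """Rebuild a cell: saved overrides win over the shipped level data."""
        result = dict(static_objects)
        for obj in self.disturbed.values():
            if obj.cell == cell:
                result[obj.object_id] = obj
        return result

    def to_json(self) -> str:
        """Serialize just the disturbed objects, as a save file would."""
        return json.dumps({k: asdict(v) for k, v in self.disturbed.items()})


# The player knocks a sandwich off a table and comes back much later:
save = SaveGame()
save.record(ObjectState("sandwich_0042", cell="bar_cell", position=(1.2, 0.0, 3.4)))
shipped = {"sandwich_0042": ObjectState("sandwich_0042", "bar_cell", (0.0, 1.0, 3.4))}
restored = save.load_cell("bar_cell", shipped)
print(restored["sandwich_0042"].position)   # (1.2, 0.0, 3.4) — where the player left it
```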
Well, you've got a high-end GPU with a lower-end CPU. Now that we are getting games that demand more power, the lower-end part isn't keeping up with the higher-end part.
Google "3080" and "5600X" and see what people say about the combo. The 5600X bottlenecking the 3080 is a new phenomenon.
Good of you to take care of your nephew's heating needs.
75? Dude, my i5-2500K stayed at 90+ for like 5 years, and it's still working in my nephew's PC. It could be over 10 years old already.
Based on what knowledge? It would be interesting to know.
I mean the game looks great, but it still doesn't justify the HW it needs to run at 60+ fps, so try harder Bethesda.
Todd Howard's resume: Game director and creator on Starfield, Fallout, and Elder Scrolls for the last 30 years.
I love Todd Howard because he's so shameless about everything. He reminds me of Elon Musk, except Todd is not a genius who builds rockets and electric cars.
He didn't buy PayPal; he was just the largest shareholder when it sold. He owned X.com, which merged with Confinity and soon after became PayPal.
Todd Howard's resume: Game director and creator on Starfield, Fallout, and Elder Scrolls for the last 30 years.
Elon Musk's resume: Bought PayPal. Bought Tesla. Bought SpaceX. Bought Twitter. He's the Phil Spencer of his industry. Todd Howard is the real deal.
A game from 2004?
I bet someone out there who barely gets 30fps in Half-Life 2 expects Starfield to be playable on their PC.
And it's not optimized if they can't play it well or at all.
A game from 2004?
You've got to be kidding me; Half-Life 2 is the benchmark of game optimization.
The GeForce2 MX 400 was a freaking potato by late 2004 and it still managed to run it.
That explains the name of the engine: CREATION engine!!!!!
Creative Optimization™
The 7XXX and 13XXX chips by default intentionally rev themselves up to 95C and stay there full-time at load. The 11700 is current enough that I would assume it's spec'd to target at least 85C, if not 95. Modern CPUs are spec'd to run much hotter than older ones.
Just downloaded DLSS and for some reason it is making my CPU go into overdrive. Indoors, outdoors, doesn't matter; I'm hitting 140 watts on my i7-11700K and temps are jumping up to 75C. I actually saw it jump to 80 degrees at one point. I'm legit scared. If I play this game for around 100 hours, am I killing my CPU if it stays around 75 degrees for dozens of hours?
I have no idea why FSR was holding back my CPU, but switching to DLSS has made me gain roughly 10% more performance. Sadly it comes with more wattage and higher temps.
It looks good, but it won't let me remove the piss filter.
I mean the game looks great, but it still doesn't justify the HW it needs to run at 60+ fps, so try harder Bethesda.
I don't entirely disagree. The missing piece of info for me is what this game is doing, CPU-computationally, that other games haven't done or couldn't do better with a "low-end" CPU if Bethesda had optimized better.
Well, you've got a high-end GPU with a lower-end CPU. Now that we are getting games that demand more power, the lower-end part isn't keeping up with the higher-end part.