
"I need a new PC!" 2010 Edition

Odious Tea said:
How does that 5770 compare to the one I just ordered? It went up in price since I purchased it (I got it at $164.99+$10 MIR).

http://www.newegg.com/Product/Product.aspx?Item=N82E16814102873

Well, they'll both "perform" the same; it's only the price/bundle/cooling/warranty that really differs. That's a custom design, so I dunno how well it cools the card — wait, no dashes: That's a custom design, so I dunno how well it cools the card; I'd guess better, but I couldn't say. At the price you paid you can't go wrong though, really.


Kintaro said:
Good point. It only ends up adding $3 to the total cost in exchange for a better CPU and a game. Original post edited to reflect this. :D

Cool, that seems like a fair trade! :D Happy to help.
 
I'm looking to get a new keyboard. I currently have a gen-1 Logitech G15, and it's a bit too big for my tastes. I've been looking at the G110 (Logitech) and the Lycosa (Razer). Has anyone had any experience with these keyboards? Are they good? Bad? Any suggestions are welcome, thanks guys.
 
evil solrac v3.0 said:
any news updates on how much the new six-core processors from AMD will cost? what board would i need to buy to go with it?


They're going to fit not only AM3 but AM2+ sockets, which is obviously sweet.

Source
 
Details about the ATI 5830. Will wait for benchmarks/price before I decide on purchasing and returning my 5770.

My initial plan was to Crossfire my 5770 to get ~5850 performance, since I couldn't afford a 5850 with the rest of my build. My 5770 arrives tomorrow; if I'm pleased with its performance and not convinced by the 5830's price, I'll probably end up keeping it.
 
Odious Tea said:
Details about the ATI 5830. Will wait for benchmarks/price before I decide on purchasing and returning my 5770.

Ouch, I was hoping for a lot more from the 5830 (I was considering upgrading from my 4850). We'll see how it benchmarks, but with those specs and the price (if accurate), no way is it worth it to me.
 
Girlfriend talked me out of the upgrade. Damn her logic and the obvious craziness of Blu-rays and games in March. So, I'll just get Win7 for now...

I'm running an E8400 at 3.0 GHz, 8 GB of DDR2 800, and a GeForce GTX250. I should still be good at 1080p for a while yet, huh?
 
Kintaro said:
Girlfriend talked me out of the upgrade. Damn her logic and the obvious craziness of Blu-rays and games in March. So, I'll just get Win7 for now...

I'm running an E8400 at 3.0 GHz, 8 GB of DDR2 800, and a GeForce GTX250. I should still be good at 1080p for a while yet, huh?

What were you upgrading? All you need to upgrade is the GPU; just overclock that beast of a CPU. You can get it close to 4.00 GHz! I would personally sell some of that RAM (it's kinda excessive) and use that money to buy a cooler/GPU.
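(For anyone curious where that ~4 GHz figure comes from: the E8400 runs a locked 9x multiplier on a 333 MHz FSB, so the core clock is just multiplier x FSB. Rough numbers only; how far you actually get depends on the chip, the board, the cooling and keeping the DDR2 on a sane divider:

stock: 9 x 333 MHz ≈ 3.0 GHz
target: 9 x ~445 MHz ≈ 4.0 GHz)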
 
Kintaro said:
Girlfriend talked me out of the upgrade. Damn her logic and the obvious craziness of Blu-rays and games in March. So, I'll just get Win7 for now...

I'm running an E8400 at 3.0 GHz, 8 GB of DDR2 800, and a GeForce GTX250. I should still be good at 1080p for a while yet, huh?

You have a slightly better build than I do (e8400, 4850, 4gb ddr2), and I'm still getting 1080p in most games with completely playable framerates (30-40) and high settings. So you should be fine.

I'm seriously considering OCing my e8400. I've never done anything like it though, so I'm trying to read up on it. I don't want to go anywhere near danger levels though, so I doubt I'll get that far with it.
 
Kintaro said:
Girlfriend talked me out of the upgrade. Damn her logic and the obvious craziness of Blu-rays and games in March. So, I'll just get Win7 for now...

I'm running an E8400 at 3.0 GHz, 8 GB of DDR2 800, and a GeForce GTX250. I should still be good at 1080p for a while yet, huh?
You could swap out the E8400 for a Q9550 if you really want quad performance. The GTS 250 will hold you back the most.
 
Kintaro said:
Girlfriend talked me out of the upgrade. Damn her logic and the obvious craziness of Blu-rays and games in March. So, I'll just get Win7 for now...

I'm running an E8400 at 3.0 GHz, 8 GB of DDR2 800, and a GeForce GTX250. I should still be good at 1080p for a while yet, huh?

Holy hell, send that piece of shit to the scrapyard. I'm surprised you can run Quake 1 in software rendering.
 
For light Photoshop work, just save your money and go for a PII X4 945/955 (which can later be upgraded to an X6 if you need more juice). It will perform about the same.

Put the savings toward a better GPU.


ChoklitReign said:
Instead of 250 + 4770 --> 240 + HD5750

Both procs will overclock to the same level.
 
Any particular reason why? This is my first build so I'm trying to get all the info I can get. Not to mention I know nothing about computers.:lol
 
SePhoBroth said:
Any particular reason why? This is my first build so I'm trying to get all the info I can get. Not to mention I know nothing about computers.:lol
The socket 1366 motherboards needed to run the i7 920 tend to be a whole lot more expensive than the socket 1156 boards for the i5 750, meaning the overall system price difference is more than just the CPU price difference.
 
Ahhh, I didn't know that. I thought I could use the same motherboard for either one of them? Any recommendations on a motherboard for the i5, then?

edit: around $150 or lower.
 
bhlaab said:
Holy hell, send that piece of shit to the scrapyard. I'm surprised you can run Quake 1 in software rendering.

I know. It's nuts. =/ GTX260, not 250 as I said before. =/

Doom is slow as hell. I can't play like this! <tosses PC case>
 
ZoddGutts said:
Performance test numbers of the 5830:


http://forums.anandtech.com/showthread.php?t=2053231

It's basically on par with the 4890.

Even with 8xAA applied in most games (which is pretty much broken on GTX 2xx cards), it still struggles to beat out a GTX 275. That's not particularly appealing performance imo, considering you could pick up GTX 260s for $150 nine months ago (which is what I did, but at the equivalent UK prices) and they'd easily clock to GTX 275 performance. We're going backwards if anything.
 
I'm definitely skipping this generation; nothing seems to be offering a significant upgrade over my highly OCed GTX 260 (216), and I'm positive that the 1GB VRAM on these cards is going to prove a major bottleneck in the future. ~1.5GB should be standard on the GTX 480 (I hope), but even then, it's less than I'd want. I hope 2GB is standard on high end cards next generation. VRAM on high end cards hasn't really moved anywhere since the 8800GTX launched with 768MB, and you can't tell me those new cards can only use ~200MB more than a four year old GPU.
 
brain_stew said:
Even with 8xAA applied in most games (which is pretty much broken on GTX 2xx cards), it still struggles to beat out a GTX 275. That's not particularly appealing performance imo, considering you could pick up GTX 260s for $150 nine months ago (which is what I did, but at the equivalent UK prices) and they'd easily clock to GTX 275 performance. We're going backwards if anything.

A GTX 260 is $200 today though, so if it is priced under that, it may be a good value. I'll have to see more benches than 3Dmark though.

The March 26th launch of Fermi will be interesting... I'm expecting there to be a very limited quantity of cards, so they will sell out. What I want to see is the real MSRPs, and if demand inflates them.

Edit: I didn't notice the other benches... it really isn't much better than a GTX 260 on those, except for HAWX, so hopefully they price it accordingly.
 
brain_stew said:
I'm definitely skipping this generation; nothing seems to be offering a significant upgrade over my highly OCed GTX 260 (216), and I'm positive that the 1GB VRAM on these cards is going to prove a major bottleneck in the future. ~1.5GB should be standard on the GTX 480 (I hope), but even then, it's less than I'd want. I hope 2GB is standard on high end cards next generation. VRAM on high end cards hasn't really moved anywhere since the 8800GTX launched with 768MB, and you can't tell me those new cards can only use ~200MB more than a four year old GPU.
Well, my last PC is a 2005 laptop... so as soon as I get enough money together, I'm gettin' something new :P
 
Minsc said:
A GTX 260 is $200 today though, so if it is priced under that, it may be a good value. I'll have to see more benches than 3Dmark though.

The March 26th launch of Fermi will be interesting... I'm expecting there to be a very limited quantity of cards, so they will sell out. What I want to see is the real MSRPs, and if demand inflates them.

Oh, I definitely see where the 5830 fits; it just doesn't look all that appealing to me at anything more than $200. It's been gimped way too much: 16 ROPs is a pretty tough pill to swallow, and all that extra bandwidth over the 5770 looks like it'll mostly go to waste. Judging by its rumoured TDP (which is higher than the 5850's), it looks like ATI will be able to sell some really crappy dies now, so it's a smart move I guess, and it does fill the one major gap in ATI's DX11 lineup.

It's just a little disheartening to see that the price:performance ratio has fallen so much in the last few months, and unless the 5830 is available at under $200 en masse (which it won't be, I guarantee you that), that isn't changing anytime soon either. I guess I should be happy; in a bizarre twist of events the value of my GPU has risen since I bought it nearly a year ago. But alas, I'm not one to be sentimental about hardware; I want to see progress and we just don't seem to be getting it.

You know, I kinda predicted this happening. Back before the 58xx launch there were countless people holding off for some "mythical" new price drop due to new hardware. I never saw that drop coming, and despite pointing out just how much GPUs had dropped in the preceding 12-18 months, most seemed convinced it would continue, while I kinda knew it couldn't.

With memory prices way up on last year as well, PC gaming "on a budget" is definitely more difficult now than it was 6-12 months ago, however bizarre that may seem. AMD's excellent $100 quad cores are a nice recent development though, I must admit. Those Athlon II X4s are perfect budget chips and enough to satisfy all but the most demanding of consumers.
 
Glass Brain said:
I recently built an LGA 1156 PC (thanks to everyone in this thread for the recommendations). I'm considering better and quieter cooling, and I'm looking at this: http://www.newegg.com/Product/Product.aspx?Item=N82E16835103065 . Can anyone steer me in the right direction, or will that be good?

That cooler gets excellent reviews. If you want silence though, I'd recommend buying a 120mm Yate Loon or Noctua to replace the stock fan.
 
brain_stew said:
Oh, I definitely see where the 5830 fits; it just doesn't look all that appealing to me at anything more than $200. It's been gimped way too much: 16 ROPs is a pretty tough pill to swallow, and all that extra bandwidth over the 5770 looks like it'll mostly go to waste. Judging by its rumoured TDP (which is higher than the 5850's), it looks like ATI will be able to sell some really crappy dies now, so it's a smart move I guess, and it does fill the one major gap in ATI's DX11 lineup.

It's just a little disheartening to see that the price:performance ratio has fallen so much in the last few months, and unless the 5830 is available at under $200 en masse (which it won't be, I guarantee you that), that isn't changing anytime soon either. I guess I should be happy; in a bizarre twist of events the value of my GPU has risen since I bought it nearly a year ago. But alas, I'm not one to be sentimental about hardware; I want to see progress and we just don't seem to be getting it.

You know, I kinda predicted this happening. Back before the 58xx launch there were countless people holding off for some "mythical" new price drop due to new hardware. I never saw that drop coming, and despite pointing out just how much GPUs had dropped in the preceding 12-18 months, most seemed convinced it would continue, while I kinda knew it couldn't.

With memory prices way up on last year as well, PC gaming "on a budget" is definitely more difficult now than it was 6-12 months ago, however bizarre that may seem. AMD's excellent $100 quad cores are a nice recent development though, I must admit. Those Athlon II X4s are perfect budget chips and enough to satisfy all but the most demanding of consumers.

[Image: Starcraft2-CPUs.png (Starcraft 2 CPU benchmark chart)]


Since it is almost impossible to play the same match over and over again, we made use of the replay feature to run our benchmarks. This means that the A.I. is not calculated, but the effect is negligible. We fast-forward our replay to 15:30 and have Fraps record the framerate of the following 20 seconds, while more than 60 Zerglings attack a Protoss base defended by several Carriers and other units. This is not the kind of worst-case scenario we usually use for our benchmarks, but it represents a common gaming situation.

According to the Windows 7 Task Manager the Starcraft 2 Beta utilizes only two cores and our benchmarks confirm this: The Intel quad-core Q6600 is only slightly faster than its dual-core sibling, the E6600. But Starcraft 2 reacts very well to additional cache - especially the E8400 and the two Lynnfield CPUs benefit from that, while it is a disadvantage for the Athlon II X2 250. AMD's Phenoms are doing very well, but can't reach Intel's Core i5/i7. The Athlon 64 array is rather slow.

Given our results and the fact that our little brawl isn't a really challenging scenario, a Core i5/i7 is the ideal CPU for the Starcraft 2 Beta. But a lot cheaper and not much slower is a Phenom II X2 545 at 3.0 GHz - our current recommendation. According to reports in the Battle.net forums, Core i7 CPUs have performance problems in certain situations, and we can confirm this from what we saw during our gameplay tests - our benchmark scene is not affected though.

It's sort of relevant here too; I don't know how many people caught the Starcraft 2 benchmark I put in the beta thread.

Chalk it up to poor optimization, but you cannot buy a computer that maxes Starcraft 2 unless you have like a 5GHz CPU. $900 worth of SLI GPUs will make little difference over a $150 card.

What's surprising, but not really, is seeing the X2 do so well (what's weird is that they recommend it but don't have it in their benchmark). I guess, like they say, additional cache improves performance big time. If the game were able to use more than ~1.5 cores, you'd see much higher results (basically double) I bet, and that X2 wouldn't look so hot.

I agree with your comments on the hardware situation completely; it's nice and it isn't. I'm still shocked by rising prices, and don't see the trend turning around with the launch of Fermi either.
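For anyone wanting to run that kind of test themselves, the number crunching after the Fraps capture is the easy part. Here's a minimal sketch that turns a per-frame timestamp log (one millisecond value per line; the exact Fraps CSV layout differs, so treat the parsing and the "frametimes.csv" filename as assumptions) into average and worst-case FPS over the same 20-second window the article uses:

# Rough sketch only: compute avg / worst-case FPS from a list of per-frame
# timestamps in milliseconds. The one-value-per-line log format is an
# assumption, not the exact Fraps CSV layout.

def fps_stats(timestamps_ms, window_start_ms, window_len_ms=20_000):
    end_ms = window_start_ms + window_len_ms
    frames = [t for t in timestamps_ms if window_start_ms <= t < end_ms]
    if len(frames) < 2:
        raise ValueError("not enough frames in the chosen window")
    frametimes = [b - a for a, b in zip(frames, frames[1:])]   # ms per frame
    avg_fps = 1000.0 * len(frametimes) / (frames[-1] - frames[0])
    worst_fps = 1000.0 / max(frametimes)                       # longest single frame = worst dip
    return avg_fps, worst_fps

if __name__ == "__main__":
    with open("frametimes.csv") as f:                          # hypothetical log file
        stamps = [float(line) for line in f if line.strip()]
    avg, worst = fps_stats(stamps, window_start_ms=15.5 * 60 * 1000)   # the 15:30 mark
    print("avg: %.1f fps, worst dip: %.1f fps" % (avg, worst))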
 
I'd hardly use Starcraft 2 as a general benchmark; for whatever reason Blizzard feel they can phone it in and buck the industry trend towards proper multi-core development. It's probably going to be the only major release of 2010 that only uses two cores. They'll get away with it of course, but it's seriously poor form considering the resources they have. You really can't buy yourself all that much extra dual core performance, and it's a wasted effort anyway, so don't judge your purchasing decision on Blizzard's shitty move.

The major sites should be slamming Blizzard for not optimising for more than two cores when their engine clearly needs it and every other developer worth their salt is doing it. Heck, little 4A Games can manage to code an engine that spawns dozens of threads at once; with all the resources Blizzard have, they can surely muster up more than they've managed. This isn't 2006 anymore.
 
Don't see where the 5830 fits in, really. Higher power draw, poor performance for the price. Seems ATI aren't really pushing themselves due to there being no competition.

Nvidia need to bring something to the table akin to when they brought out the 8800GT, because nothing hits the sweet spot yet with this gen of cards. The 5850 is by far the best buy but the prices are seriously inflated. The 5770 is underpowered and overpriced when you take into account the performance of last gen's cards.

I want a 5850 but refuse to pay more than £200 for one. It's ridiculous how far over its RRP it is right now. It would have been cheaper to buy one when it launched, FFS. It's reached a price so high that it's better value buying a 5870 right now for just £30-£40 more.
 
brain_stew said:
I'd hardly use Starcraft 2 as a general benchmark; for whatever reason Blizzard feel they can phone it in and buck the industry trend towards proper multi-core development. It's probably going to be the only major release of 2010 that only uses two cores. They'll get away with it of course, but it's seriously poor form considering the resources they have. You really can't buy yourself all that much extra dual core performance, and it's a wasted effort anyway, so don't judge your purchasing decision on Blizzard's shitty move.

The major sites should be slamming Blizzard for not optimising for more than two cores when their engine clearly needs it and every other developer worth their salt is doing it. Heck, little 4A Games can manage to code an engine that spawns dozens of threads at once; with all the resources Blizzard have, they can surely muster up more than they've managed. This isn't 2006 anymore.

Yea, I really wish Starcraft was using more cores; I don't think they even use two cores fully. I just noticed that the review said those benches (unlike the Dragon Age ones, which I was wrong about earlier) do not even represent worst-case scenarios. On top of that, they are limiting the beta so far to 2v2 games, which means it could get significantly worse if you add in more players for 3v3 and so on.
 
Minsc said:
Yea, I really wish Starcraft was using more cores; I don't think they even use two cores fully. I just noticed that the review said those benches (unlike the Dragon Age ones, which I was wrong about earlier) do not even represent worst-case scenarios. On top of that, they are limiting the beta so far to 2v2 games, which means it could get significantly worse if you add in more players for 3v3 and so on.

Well, hopefully they've got some proper multithreaded optimisations ready for the final release, as their engine clearly demands it. Ponder this: with single core performance starting to stagnate, it may never be possible to get a perfect 60fps in Starcraft 2 without resorting to extreme OCing. That's a fucking travesty tbh, and it needs sorting asap.
 
brain_stew said:
Well, hopefully they've got some proper multithreaded optimisations ready for the final release, as their engine clearly demands it. Ponder this: with single core performance starting to stagnate, it may never be possible to get a perfect 60fps in Starcraft 2 without resorting to extreme OCing. That's a fucking travesty tbh, and it needs sorting asap.

Yea, but in the interest of being fair, this is sort of like the GTAIV situation too, in the sense that if you don't play on max settings, obviously the framerate will be significantly higher and you'll hit 60 no problem all the time. It's a simple thing to just lower a couple of settings and have 60fps, but why should we have to, when the GPU and CPU aren't being properly used?! :)

Let's hope the final release or one of the expansions fixes things up a bit!
 
Minsc said:
Yea, but in the interest of being fair, this is sort of like the GTAIV situation too, in the sense that if you don't play on max settings, obviously the framerate will be significantly higher and you'll hit 60 no problem all the time. It's a simple thing to just lower a couple of settings and have 60fps, but why should we have to, when the GPU and CPU aren't being properly used?! :)

Let's hope the final release or one of the expansions fixes things up a bit!

Wait, so lowering the graphics settings significantly increases your framerate in CPU-bottlenecked situations? That... makes little sense. I mean, you're not introducing extra units (and thus more draw calls), right? So why the hell do the higher settings bring down CPU performance so much? Hmm, it just smacks of some seriously shitty coding on Blizzard's part. I mean, sheesh, if you're doing some of the visual work on the CPU at high settings (which they shouldn't be, but whatever), at least spawn a new thread for it. I'm well aware that multicore development isn't "as simple as that", but the rest of the industry managed to make the transition years ago now, and none of them had the resources and backing that Blizzard do, so there's really no excuse.


The reason GTA's settings taxed high end CPUs was that they were increasing the traffic density and the draw distance, and thus the amount of draw calls and AI being calculated, and even then, at least Rockstar had proper multi-core optimisations.
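Just to illustrate the pattern being argued for here (this is obviously not how Blizzard's engine, or any real engine, is structured; the work function and chunk sizes below are made up), the basic idea of farming independent, CPU-heavy per-tick work out across all the cores instead of serialising it on one looks roughly like this in Python:

# Toy sketch of splitting a CPU-bound per-tick job across cores with a
# process pool. Purely illustrative; simulate_chunk is a made-up stand-in
# for per-unit work (AI, pathing, etc.), not anything from an actual engine.

from concurrent.futures import ProcessPoolExecutor

def simulate_chunk(units):
    # stand-in for expensive per-unit work
    return [u * u % 97 for u in units]

def split(seq, parts):
    # carve seq into roughly equal chunks, one per worker
    size = max(1, len(seq) // parts)
    return [seq[i:i + size] for i in range(0, len(seq), size)]

if __name__ == "__main__":
    units = list(range(10_000))                  # pretend these are game units
    with ProcessPoolExecutor() as pool:          # defaults to one worker per core
        results = list(pool.map(simulate_chunk, split(units, 4)))
    print(sum(len(r) for r in results), "units updated this tick")

Threads vs processes is a detail; the point is just that the expensive work gets split across cores instead of pinned to one.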
 
brain_stew said:
Wait, so lowering the graphics settings significantly increases your framerate in CPU-bottlenecked situations? That... makes little sense. I mean, you're not introducing extra units (and thus more draw calls), right? So why the hell do the higher settings bring down CPU performance so much? Hmm, it just smacks of some seriously shitty coding on Blizzard's part. I mean, sheesh, if you're doing some of the visual work on the CPU at high settings (which they shouldn't be, but whatever), at least spawn a new thread for it. I'm well aware that multicore development isn't "as simple as that", but the rest of the industry managed to make the transition years ago now, and none of them had the resources and backing that Blizzard do, so there's really no excuse.

Yea, I guess this was a bit more sidetracking than I had expected, but TheExodus5 figured out that the shaders setting is causing a huge performance hit: like a 400%+ improvement in framerate just by changing shaders from ultra to low. It's weird to see a CPU limited game gain such a framerate boost by lowering a feature that's normally handled entirely by the GPU. Clearly something is strange...
 
brain_stew said:
Wait, so lowering the graphics settings significantly increases your framerate in CPU-bottlenecked situations? That... makes little sense. I mean, you're not introducing extra units (and thus more draw calls), right? So why the hell do the higher settings bring down CPU performance so much? Hmm, it just smacks of some seriously shitty coding on Blizzard's part. I mean, sheesh, if you're doing some of the visual work on the CPU at high settings (which they shouldn't be, but whatever), at least spawn a new thread for it. I'm well aware that multicore development isn't "as simple as that", but the rest of the industry managed to make the transition years ago now, and none of them had the resources and backing that Blizzard do, so there's really no excuse.


The reason GTA's settings taxed high end CPUs was that they were increasing the traffic density and the draw distance, and thus the amount of draw calls and AI being calculated, and even then, at least Rockstar had proper multi-core optimisations.
http://www.neogaf.com/forum/showpost.php?p=19879295&postcount=1704
http://www.neogaf.com/forum/showpost.php?p=19893261&postcount=1850
http://www.neogaf.com/forum/showpost.php?p=19893949&postcount=1881

From there just read them arguing.
 
Minsc said:
Yea, I guess this was a bit more sidetracking than I had expected, but TheExodus5 figured out that the shaders setting is causing a huge performance hit: like a 400%+ improvement in framerate just by changing shaders from ultra to low. It's weird to see a CPU limited game gain such a framerate boost by lowering a feature that's normally handled entirely by the GPU. Clearly something is strange...

Shaders!? The hell!? Well hopefully that's just a bug/fixable quirk, because that really shouldn't be happening, especially to the degree that it is. At least that means there's hope that, even without a properly multithreaded engine, they can sort out the CPU performance.

Anyway, enough sidetracking; I'll do some more digging in the SC2 BETA thread to satisfy my curiosity and leave it out of here.
 
Don't put a lot of hope in big optimizations at this point.

When public betas like this hit, they're at a point where they're doing about 10% optimization and 90% gameplay balancing.
 