
Official 2008 "I Need A New PC" Thread

zoku88

Member
Kadey said:
Yes, true. But the thing is, one is stock-clocked at 2.4GHz and the other at 3.0GHz, and it being 45nm makes it slightly faster core for core. That's the difference. I'll still maintain it's "way better for games" unless someone rids my memory of all the benchmarks that state it and the comments I've seen from owners. Hell, the E6850 is better than the Q6600 for games, and the E8400 stomps the E6850.
http://www.techarp.com/showarticle.aspx?artno=499&pgno=3

Do you see how it's almost constant? You actually see more of a difference with Supreme Commander, since strategy games tend to use the CPU more than other genres because of the way their AI works.
 

Kadey

Mrs. Harvey
The article makes some great points (the whole dual core and quad thing) but I don't see anything that disproves my point. I'll take real-time performance over any CPU chart any day. Many of the latest games perform better with an E8400. Many people stepping up from a Q6600 to an E8400 can tell the difference.
 

Servbot #42

Unconfirmed Member
Andokuky said:
I rushed into my PC build and bought a mobo with only 1 PCI Express x16 slot. If I eventually want to utilize SLI, can you do it with two cards using two different types of PCI slots? Do they have to be the exact same two cards?


You need two PCIe x16 slots in order to use SLI. They don't have to be the exact same cards, but I think they have to be in the same family (GeForce 8, GeForce 9, etc.), and if you do SLI with two cards that are not the same, both will run at the speed of the less powerful card, or something like that.
 

Andokuky

Banned
Gexecuter said:
You need two PCIe x16 slots in order to use SLI. They don't have to be the exact same cards, but I think they have to be in the same family (GeForce 8, GeForce 9, etc.), and if you do SLI with two cards that are not the same, both will run at the speed of the less powerful card, or something like that.

Thanks.

If I just buy a new mobo, will that make my Windows run all wonky?
 

Servbot #42

Unconfirmed Member
Andokuky said:
Thanks.

If I just buy a new mobo, will that make my Windows run all wonky?

Probably, since different motherboards use different Windows drivers. I suggest you back up and reinstall Windows if you do end up buying a new mobo.
 

zoku88

Member
Kadey said:
The article makes some great points (the whole dual core and quad thing) but I don't see anything that disproves my point. I'll take real-time performance over any CPU chart any day. Many of the latest games perform better with an E8400. Many people stepping up from a Q6600 to an E8400 can tell the difference.
Unless they actually had solid benchmarks to back themselves up, I'd be inclined not to believe them, mainly because most newer games are GPU-bottlenecked, which means gains in processor speed make a negligible difference.
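
To make the bottleneck argument concrete, here's a toy frame-time model (the millisecond figures are made up for illustration, not taken from any benchmark): a frame can't finish faster than its slowest stage, so once the GPU is the slow stage, a faster CPU barely moves the needle.

Code:
# Toy frame-time model: each frame waits on whichever unit finishes last.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

gpu_ms = 33.0  # hypothetical GPU-bound frame: a ~30fps ceiling
print(fps(cpu_ms=20.0, gpu_ms=gpu_ms))  # slower CPU  -> ~30.3 fps
print(fps(cpu_ms=15.0, gpu_ms=gpu_ms))  # faster CPU -> still ~30.3 fps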

Anyway, I believe it's not unnatural for people to think there is a difference when there isn't. Your mind, more or less, tries to justify every decision you make as the best one (which is natural. Kinda like that one M&M experiment with monkeys.)
 

SleazyC

Member
For now the E8400 (or E8500) produces slightly better results in most games because of its higher clock speed and the fact that most games aren't optimized for multicore CPUs to the point where you see a drastic performance increase. Just recently you see companies like Epic saying that their games are optimized for two cores, and I think most games' sweet spot is two cores right now. Supreme Commander is an example of a game that works better on a quad core, but from what I can see it is in the minority. The next cycle of engines and big games will probably see a push into quad+ optimization; as I recall, Alan Wake is set up to take advantage of 4-8 cores.
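
A quick Amdahl's-law sketch shows why extra cores buy so little when an engine only parallelizes part of its work (the 50% parallel fraction below is a made-up example, not a measurement of any real engine):

Code:
# Amdahl's law: speedup = 1 / ((1 - p) + p/n), where p is the fraction of
# the frame's work that scales across cores and n is the core count.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

p = 0.5  # hypothetical "two-core era" engine: half the work parallelizes
for n in (1, 2, 4):
    print(n, "cores:", round(speedup(p, n), 2), "x")
# 1 cores: 1.0x, 2 cores: 1.33x, 4 cores: 1.6x -- going from two cores to
# four gains less than the clock-speed edge a faster dual core brings.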
 

eznark

Banned
godhandiscen said:
Are you serious? Check Tom's Hardware's reviews. Crysis gives you <20fps once you enable AA with any freaking card, including the 9800GX2.

You were lied to, then. I have a damn 3870X2, 4 gigs of RAM, and 4 cores each running at 2.9. And I cannot fucking get a constant 30fps. The game will always drop to 20fps every single time a massive firefight occurs. BTW, I use DX10 and 1280x800 res. Everything on High.

I've got an 8800GT, a Q6600 and 4 gigs of RAM. I'm getting 40FPS inside and 30FPS in big outdoor firefights. To claim, flat out, that it doesn't run at all is incorrect. As far as settings go, I'm running an awesome custom settings mod that I got, which makes the game look amazing.
 

mr stroke

Member
SRG01 said:
Crysis performance is highly dependent on resolution.

In other words, stop trying to play it on super-high 1080p-ish ones. It won't work.

Playing at 1280x768 with everything on High...
System:
AMD X2 6400+
4 gigs Kingston DDR2 RAM
Sapphire 3870X2

Considering the 3870X2 outperforms or is equal to an 8800GT, I can only come up with two explanations here:
1. Everyone is full of shit and Crysis can't run right on anything.
2. My system (and godhandiscen's) is misconfigured?? (Keep in mind everything else plays like butter maxed out w/ high res.)

It's confusing as all hell because every person has a completely different experience with the game's performance. AnandTech says one thing and HardOCP says something completely different... WTF!
 

SRG01

Member
Durante said:
... except on Crysis, because it doesn't play well with multi-GPU configs.

Oops, I forgot about this little detail too. :lol The X2 version of the 3870 effectively acts like Crossfire.
 

Mr. Hyde

Member
How do I change my GPU fan speed? Everest is showing it running at 37%.

My Q6600's temperature is at 22C, but why does each core report its own temperature?

The GPU is running at 50C, so should I raise the fan speed on it some?
 

aznpxdd

Member
Mr. Hyde said:
How do I change my GPU fan speed? Everest is showing it running at 37%.

My Q6600's temperature is at 22C, but why does each core report its own temperature?

The GPU is running at 50C, so should I raise the fan speed on it some?

Download RivaTuner.
 

SleazyC

Member
Mr. Hyde said:
How do I change my GPU fan speed? Everest is showing it running at 37%.

My Q6600's temperature is at 22C, but why does each core report its own temperature?

The GPU is running at 50C, so should I raise the fan speed on it some?
Nvidia card? Download RivaTuner and you can change it.
 

Antagon

Member
Mr. Hyde said:
How do I change my GPU fan speed? Everest is showing it running at 37%.

My Q6600's temperature is at 22C, but why does each core report its own temperature?

The GPU is running at 50C, so should I raise the fan speed on it some?

It probably does that automatically. 50C is high if it's idle, but still not really a problem. If you start gaming and it gets hotter, the fan should kick up a few notches.
 

godhandiscen

There are millions of whiny 5-year olds on Earth, and I AM THEIR KING.
Durante said:
... except on Crysis, because it doesn't play well with multi-GPU configs.
Crysis is just coded like shit. Bring up the console and you will see all the fucking errors and failed loads the damn code produces. Every time it tries to allocate memory and fails to load, that's wasted time. In my opinion, a game with the graphics of Crysis could run at 40fps without the need for any custom mod if it were coded for quad-core processors and dual video cards. Fuck Crysis. I don't give a fuck anymore. I'll just wait for the next generation of video cards before I try to run it.
Kadey said:
Aren't you the one with a Q6600? You do know the E8400 is way better for games, right? And like I said, having the thing overclocked to 4.0GHz, with a stock 9800GX2, will get you this.

[image: crysisveryhigh9800gx2de1.jpg]


Now it's possible to overclock the thing even more, up to 4.5GHz with an air cooler, before it reaches the highest temp I'd be willing to go. 3DMark scores are over 20k.
My GPU temps are pretty normal. I could overclock it as well and produce even better results.
AVG fps means shit in Crysis. I get 40fps in those benchmarks. However, it still drops below fucking 25 when the battles get too intense. Try running the CPU demo 2 and give me your avg there; that's the most demanding benchmark IMO. I will drop down to 15fps in that damn demo for like 5 seconds.
 

Kadey

Mrs. Harvey
godhandiscen said:
Crysis is just coded like shit. Bring up the console and you will see all the fucking errors and failed loads the damn code produces. Every time it tries to allocate memory and fails to load, that's wasted time. In my opinion, a game with the graphics of Crysis could run at 40fps without the need for any custom mod if it were coded for quad-core processors and dual video cards. Fuck Crysis. I don't give a fuck anymore. I'll just wait for the next generation of video cards before I try to run it.

AVG fps means shit in Crysis. I get 40fps in those benchmarks. However, it still drops below fucking 25 when the battles get too intense. Try running the CPU demo 2 and give me your avg there; that's the most demanding benchmark IMO. I will drop down to 15fps in that damn demo for like 5 seconds.

I agree it's coded badly. But you're exaggerating it being unplayable. I've already run through the game from beginning to end. At 1680x1050, it's as smooth as butter. Rarely, if ever, any problems. At 1080p, it's as playable as most console FPS: fine most of the time, with hiccups here and there. Crysis is one of those games that still runs pretty well around the 20fps mark, and it rarely dips below that. I'd put the 8800GT in the same boat, from 1280x1024 high DX9 up to 1680x1050 high DX9. What really kills it is turning AA on at any res. For some reason, the game really hates it. Anyone with one of the two high-end cards from either company should run it fine.

3870x2 on very high.
http://www.youtube.com/watch?v=-C186GK5Enw&feature=related

9800GX2 ditto.
http://www.youtube.com/watch?v=sA8waCvKv48&feature=related
 

zoku88

Member
Really, since DDR2 RAM is so cheap, I don't see why anybody would make a build now with less than 4GB, especially when it costs maybe an extra $50.
 

zoku88

Member
avaya said:
Is 8-core Nehalem in early 2009 or winter 2008?

My upgrade is tied to it.
Never trust release dates :p When's the last time we got something on time?

EDIT: Anyway, I would advise against getting Nehalem chips right when they come out. Sure, you would have a faster processor, but you would also most likely have less RAM (because DDR3 is so wickedly expensive). Since Windows PCs tend to be more RAM-dependent than processor-dependent (as opposed to Macs), the day-to-day use of your computer might be slower.
 

VictimOfGrief

Member
From my guys at Intel, Nehalem will be ready for consumer consumption by Winter '08. What that means is it will first show up in Dell OEM machines and then start hitting TigerDirect and Newegg, most likely Jan/Feb '09. Quite frankly, though, I wouldn't buy one right off the bat since it's a new chipset (think first-model-year car); I'd wait till they get the bugs worked out.

Also, DDR3 and DDR4 prices are going to be re-cockliously expensive. Whether or not it will use DDR2 (doubtful) has yet to be seen, but I would expect these bleeding-edge procs to simply be DDR3 and up.

We won't see DDR3 prices where DDR2 is now for at least another 18 months. The best bet is to get a Q6600, Wolfdale, or Yorkfield revision come late summer/fall if you can wait a little while longer.
 

zoku88

Member
VictimOfGrief said:
Also, DDR3 and DDR4 prices are going to be re-cockliously expensive. Whether or not it will use DDR2 (doubtful) has yet to be seen, but I would expect these bleeding-edge procs to simply be DDR3 and up.
Also, DDR3 and DDR4 prices are going to be re-cockliously expensive.
Also, DDR3 and DDR4 prices

Wait, what?

EDIT: Anyway, I thought Nehalem was DDR3-only because of the integrated memory controller. I guess any memory type that is backwards compatible with DDR3 also works?
 

godhandiscen

There are millions of whiny 5-year olds on Earth, and I AM THEIR KING.
Kadey said:
I agree it's coded badly. But you're exaggerating it being unplayable. I've already run through the game from beginning to end. At 1680x1050, it's as smooth as butter. Rarely, if ever, any problems. At 1080p, it's as playable as most console FPS: fine most of the time, with hiccups here and there. Crysis is one of those games that still runs pretty well around the 20fps mark, and it rarely dips below that. I'd put the 8800GT in the same boat, from 1280x1024 high DX9 up to 1680x1050 high DX9. What really kills it is turning AA on at any res. For some reason, the game really hates it. Anyone with one of the two high-end cards from either company should run it fine.

3870x2 on very high.
http://www.youtube.com/watch?v=-C186GK5Enw&feature=related

9800GX2 ditto.
http://www.youtube.com/watch?v=sA8waCvKv48&feature=related
Well, it's not unplayable, it's just that I'm too sensitive to framerate changes. Crysis's framerate jumps back and forth too much, and that throws me off. It's a great game, I just don't want to play it in this state. I can wait for it.
 

Luthair

Member
Quazar said:
Is subscribing to file planet worth it??
I have an account. Sometimes it's awesome to have when I need patches, and the betas they get (they're even getting them for consoles now) are pretty cool, but the price is a bit steep. I might not renew.
 

Cheeto

Member
zoku88 said:
Really, since DDR2 RAM is so cheap, I don't see why anybody would make a build now with less than 4GB, especially when it costs maybe an extra $50.
Maybe because it doesn't perform better, and often performs worse for games on 32-bit?
 

zoku88

Member
WhatRuOn said:
Maybe because it doesn't perform better, and often performs worse for games on 32-bit?
Windows PCs get more of a speed boost from having more RAM than from having a better processor, in general.
 

SleazyC

Member
Anyone see the new WD Raptors?

[image: 24l8v3d.jpg]


Overclockers and gamers, prepare to meet your next hard drive: the 300GB VelociRaptor from Western Digital. Said to be 35% faster than previous WD Raptors, the 10,000 RPM drive features a 3Gbps SATA interface, 16MB cache, and impressive 1.4 million hour MTBF thanks in part to the IcePack Mounting Frame. The IcePack heat sink not only keeps the drive spinning extra cool, it also bumps the 2.5-inch HDD to a required 3.5-inch drive bay. Available exclusively on Alienware's ALX gaming desktop this month and then up for grabs for everybody with $300 to burn starting mid-May.

Beats SSDs in burst, read, and average write tests.
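
To put that 1.4-million-hour MTBF in perspective, here's the standard conversion to an annualized failure rate (vendor MTBF figures assume a constant failure rate, so treat this as a rough figure):

Code:
# MTBF to annualized failure rate, using the constant-failure-rate
# assumption that vendor MTBF figures imply.
mtbf_hours = 1.4e6
hours_per_year = 24 * 365          # 8760
afr = hours_per_year / mtbf_hours  # expected failures per drive-year
print(round(afr * 100, 2), "failures per 100 drive-years")  # 0.63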

Also...

[image: 106ai5z.jpg]


Pretty sick: 6GHz on an LN2-cooled Skulltrail system with all 8 cores active. That's about 48GHz of combined processing power!

[image: 2gtso3k.png]


Crazy 3DMark score with a 9800GX2

[image: 2u92xc7.gif]
 

zoku88

Member
SRG01 said:
Dell.ca has an incredible offer for their Vostro 400. Basically, E8400, 2GB RAM, 8800 512MB for about $729 CDN.
Wait, aren't all Vostros laptops? Do they mean T8400 instead? But I thought the 8+ series only had odd third digits so far...

/confused.
 

SRG01

Member
zoku88 said:
Wait, aren't all Vostros laptops? Do they mean T8400 instead? But I thought the 8+ series only had odd third digits so far...

/confused.

No, Vostros are also desktop systems.
 

SRG01

Member
Is there a processor that's equivalent to the X2 5000+ or X2 BE-2400 on the Intel side? Something that's around the same price range, preferably... No OC, since I don't like to play around with that stuff.
 

bee

Member
Looks like there's a new price/performance king in the world of CPUs: the Intel Core 2 Duo E7200. It's a Wolfdale chip, costs $133 in the US, and is available to buy right now in the UK for £87.50.

Nice review HERE.
 
CPU Previous Price Current Price % Change
Intel® Core™2 Quad Processor Q6700 (8M Cache, 2.66 GHz, 1066 MHz FSB) $530 $266 50%
Intel® Core™2 Quad Processor Q6600 (8M Cache, 2.40 GHz, 1066 MHz FSB) $266 $224 16%
Intel® Core™2 Duo Processor E8300 (6M Cache, 2.83 GHz, 1333 MHz FSB) - $163 0%
Intel® Core™2 Duo Processor E7200 (3M Cache, 2.53 GHz, 1066 MHz FSB) - $133 0%
Intel® Core™2 Duo Processor E6850 (4M Cache, 3.00 GHz, 1333 MHz FSB) $266 $183 31%
Intel® Core™2 Duo Processor E4600 (2M Cache, 2.40 GHz, 800 MHz FSB) $133 $113 15%
Intel® Pentium® Dual-Core Processor E2200 (1M Cache, 2.20 GHz, 800 MHz FSB) $84 $74 12%
Intel® Pentium® Dual-Core Processor E2180 (1M Cache, 2.00 GHz, 800 MHz FSB) $74 $64 14%
Intel® Celeron® Dual-Core Processor E1400 (512K Cache, 2.00 GHz, 800 MHz FSB) - $53 0%
Intel® Celeron® Dual-Core Processor E1200 (512K Cache, 1.60 GHz, 800 MHz FSB) $53 $43 19%
Intel® Celeron® Processor 440 (512K Cache, 2.00 GHz, 800 MHz FSB) $53 $44 17%
Intel® Celeron® Processor 430 (512K Cache, 1.80 GHz, 800 MHz FSB) $44 $34 23%
Intel® Celeron® Processor 570 (1M Cache, 2.66 GHz, 533 MHz FSB) - $134 0%
Intel® Celeron® Processor 560 (1M Cache, 2.13 GHz, 533 MHz FSB) $134 $107 20%
Intel® Celeron® Processor 550 (1M Cache, 2.00 GHz, 533 MHz FSB) $107 $86 20%
Quad-Core Intel® Xeon® Processor X3230 (8M Cache, 2.66 GHz, 1066 MHz FSB) $530 $266 50%
Quad-Core Intel® Xeon® Processor X3220 (8M Cache, 2.40 GHz, 1066 MHz FSB) $266 $224 16%
Dual-Core Intel® Xeon® Processor 3085 (4M Cache, 3.00 GHz, 1333 MHz FSB) $266 $188 29%
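
As a rough sanity check of the price/performance claim, here's each chip scored by clock times cores per dollar (a crude proxy that ignores IPC, cache, and FSB differences; prices are the current ones from the table above):

Code:
# Crude price/performance score: GHz x cores per $100 spent. Ignores IPC,
# cache, and FSB, so treat it as a rough ranking only.
chips = {
    # name: (GHz, cores, current USD price from the table above)
    "E7200": (2.53, 2, 133),
    "E8300": (2.83, 2, 163),
    "E6850": (3.00, 2, 183),
    "Q6600": (2.40, 4, 224),
}
for name, (ghz, cores, usd) in chips.items():
    print(name, round(ghz * cores / usd * 100, 2), "GHz per $100")
# E7200 3.8, E8300 3.47, E6850 3.28, Q6600 4.29 -- among the dual cores the
# E7200 wins, though the Q6600 is still the deal if software can use 4 cores.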
 

Cheeto

Member
zoku88 said:
Windows PCs get more of a speed boost from having more RAM than from having a better processor, in general.
Games depend on memory speed, not so much total memory. Compromising the speed boost you get from dual channel is often detrimental to game performance. Now, we're only talking about a 2 or 3 frames per second difference, but still... if your machine is mainly for playing games, why spend more money for less performance?
 

Rabhw

Member
WhatRuOn said:
Games depend on memory speed, not so much total memory. Compromising the speed boost you get from dual channel is often detrimental to game performance. Now, we're only talking about a 2 or 3 frames per second difference, but still... if your machine is mainly for playing games, why spend more money for less performance?
Okay, so from what you're saying...

2x1GB sticks in dual channel mode at, say, DDR2-1066 are better than 2x2GB of DDR2-800 in dual channel mode?

In some respects I can see that as being valid, given the memory speed. However, say you had the same speed RAM for a moment:

2x1GB DDR2-800 sticks vs. 2x2GB sticks, both in dual channel mode. To the average joe, the performance difference vs. the increase in available RAM is going to be pretty small.

However, I have noticed that most benchmarks the sites run only use 2GB of RAM, which is interesting... but again, I have a limited view of the trade-off between available memory and in-game performance. And if it's only 2-3 FPS, people would obviously have to weigh that out.
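
For reference, the theoretical peak bandwidth of the configurations being compared works out as below (standard DDR2 math: transfers per second times 8 bytes per transfer per channel; real-world numbers land well under these peaks):

Code:
# Theoretical peak DDR2 bandwidth: MT/s x 8 bytes per transfer x channels.
def peak_gb_s(mt_per_s, channels):
    return mt_per_s * 8 * channels / 1000.0

print("DDR2-800  dual channel:", peak_gb_s(800, 2), "GB/s")   # 12.8
print("DDR2-1066 dual channel:", peak_gb_s(1066, 2), "GB/s")  # ~17.1
# Capacity (2GB vs 4GB) doesn't change bandwidth at all; only the speed
# grade and the channel count do.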
 

Cheeto

Member
With 2GBx2, you aren't breaking the dual channel setup. However, you also aren't using your entire pool. 32-bit Windows can address a total of 4GB of memory, video included. So if your GPU has a 512MB memory pool, then Windows will only address 3.5GB of your 4GB. When Windows does this, it's inefficient speed-wise. You will have more physical memory for applications that will use it, but most games do not. In fact, on my G15 I can monitor real-time memory and CPU usage, and every game I play peaks at 60% usage of 2GB.
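
The arithmetic behind that, as a quick sketch (simplified: in reality other devices also claim address space, so 32-bit systems usually see a bit less than this):

Code:
# 32-bit address-space squeeze: RAM and the video card's memory window
# must share the same 4GB (2**32 bytes) of addresses.
TOTAL_ADDRESS_SPACE_GB = 4.0
installed_ram_gb = 4.0
video_memory_gb = 0.5  # a 512MB card

usable_ram_gb = min(installed_ram_gb,
                    TOTAL_ADDRESS_SPACE_GB - video_memory_gb)
print(usable_ram_gb)  # 3.5 -- the missing half gig went to the GPU window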

Rabhw makes a good point though.
 