When will we see another PC technology jump?

Status
Not open for further replies.

olimario

Banned
In the 5th grade we bought a 133 MHz computer.
In the 8th we bought a 500 MHz computer.
In the 10th we bought a 2.0 GHz computer.

Here I am, three and a half years later, and the highest-end machine I can find is 3.4 GHz. Hardly the jump the others were, and over a greater span of time. Is technology slowing down, or am I going to purchase a 3.4 GHz computer and get blindsided by a technology jump later this year?
 

gofreak

GAF's Bob Woodward
The march of technology is no longer defined by clock speed increases... we're no longer going vertically, we're going horizontally (more cores, more parallelism). It'll take software a little while to catch up, though.

I know what you mean though, the move to 64-bit aside, for the last 18 months or so things have seemed pretty "static" at least as far as CPUs go.
 

Bregor

Member
Clock speed increases are hitting their limit. Both Intel and AMD will soon have dual-core processors, but the actual performance increase will not be significant until more software is written to take advantage of it. Eventually quad-core (and higher) processors are planned.
 

olimario

Banned
So even though the clock speed increase doesn't look so impressive, will a current 3.4 GHz machine provide the same jump the 500 MHz --> 2 GHz jump did?

And when do you think software will catch up? When will the next noticeable jump be, do you think?
 

Bregor

Member
BTW, if that 3.4 GHz CPU you referred to is a P4, then it isn't the best available. Lower-speed Athlon 64s are clearly superior.
 

teh_pwn

"Saturated fat causes heart disease as much as Brawndo is what plants crave."
Probably once code takes advantage of multi-core 64-bit tech.

I'm thinking there's going to be a lot of job openings for programmers to make this happen.


Bregor is right about the Intel chip. Just avoid Intel entirely. They've been sucking for the past 1.5 years. AMD has had the superior chip in just about every respect. Get a 3500+.
 

pxleyes

Banned
Silicon can only get so hot before it melts.

Give it another 5-10 years before we use another material. Until then, the other posts apply.
 

olimario

Banned
I just don't want to purchase a current high end machine and then get blindsided by a sizable jump. Are there any predictions as to when these said events will occur?
 

teh_pwn

"Saturated fat causes heart disease as much as Brawndo is what plants crave."
I'm guessing in about 1-2 years.

Hopefully they'll ditch DDR2 in favor of next gen RAMBUS.
 

Bregor

Member
olimario said:
I just don't want to purchase a current high end machine and then get blindsided by a sizable jump. Are there any predictions as to when these said events will occur?

You will always have to worry about this no matter when you purchase your PC. There is always some new and great tech on the horizon. Wait till your current PC no longer serves your needs, and then go ahead and buy a new one. And don't worry about whether or not it will soon be obsolete - because it is certain to happen no matter what you do.
 

Borys

Banned
olimario said:
I just don't want to purchase a current high end machine and then get blindsided by a sizable jump. Are there any predictions as to when these said events will occur?

I can't see any big jump in the next 2-3 years. Everything has been deadlocked for like two years now. According to my math we should be buying a P4 6 GHz now... pity it isn't happening, because the MHz race was entertaining and it was fantastic to benchmark your games/3DMarks before and after upgrading.

At least I have nice memories of the PC hardware scene in the late '90s.
 

olimario

Banned
Bregor said:
You will always have to worry about this no matter when you purchase your PC. There is always some new and great tech on the horizon. Wait till your current PC no longer serves your needs, and then go ahead and buy a new one. And don't worry about whether or not it will soon be obsolete - because it is certain to happen no matter what you do.


Do the people developing new technology keep tight lipped on their progress and on when they plan to release certain tech? If so, how stupid.
 

Culex

Banned
olimario said:
I just don't want to purchase a current high end machine and then get blindsided by a sizable jump. Are there any predictions as to when these said events will occur?

Desktop dual-core CPUs from AMD are going to be out by June. I'd go with them, and unlike Intel, AMD's new cores don't need new mobos to go along with them. Just slap them in your current 939 board and get a new BIOS.

I think if you keep the mindset that you have now, you'll never buy a new PC. There's always better and more sophisticated tech around the corner. Buy what you want NOW, not later, or you'll just be waiting indefinitely.
 

olimario

Banned
Culex said:
Desktop dual-core CPUs from AMD are going to be out by June. I'd go with them, and unlike Intel, AMD's new cores don't need new mobos to go along with them. Just slap them in your current 939 board and get a new BIOS.

I think if you keep the mindset that you have now, you'll never buy a new PC. There's always better and more sophisticated tech around the corner. Buy what you want NOW, not later, or you'll just be waiting indefinitely.

I'll buy when there is an equivalent of a 500 MHz --> 2 GHz jump. That is what I am waiting for. Do you think the dual-core AMDs will provide that?
 

Bregor

Member
olimario said:
Do the people developing new technology keep tight lipped on their progress and on when they plan to release certain tech? If so, how stupid.

No, you usually know months in advance when a new tech is coming out (if you keep up with the PC tech news). It's not always clear what the actual gains from the tech will be, however.

My point is that as soon as one new tech comes out, another is usually just a couple of months after that, and so on and so forth. There is always something new just around the corner. So there is little point in trying to avoid obsolescence; it's better to just wait till you need the PC and buy then.
 

Slo

Member
I've definitely noticed a lot less pressure to upgrade my PC than in previous years. Coding, web activities, and gaming are the only needs I have for my PC, and honestly I can think of no real reason to upgrade aside from gaming. I'm not really a framerate whore, but as of yet there hasn't been a single game my rusty ol' 2100+ XP/1GB PC266/9800 Pro can't run at 1024x768.
 

Zaptruder

Banned
When Longhorn is released.

As it is... I think most people are content with the modern OS and the speed at which it operates, so the explosive growth we saw in the '90s has really plateaued.

A few things might shake up the equation a bit... such as Sony including a version of Linux for the PS3 right off the bat.

Can you imagine millions of cheap but super powerful computers being made available to the general public? More than capable of taking care of next-gen computing needs such as HD video support and whatnot...
Given that large user base, you'll soon have a good deal of people writing software for PS3 Linux, which would give MS a massive scare and maybe even topple them if all things go right for the PS3.

Longhorn will also promote the drive for more powerful computers... but truthfully, given that most people are relatively satisfied with XP, I'm not sure what huge advances Longhorn can provide over XP that won't be included in their service packs.

The physics processing units/cards that speed up physics simulation the way a graphics card speeds up graphics will also provide an interesting boost to PC power. Increasingly, the bottleneck in graphical fidelity is the strength of animation; the difference with hair and clothing billowing around in the wind will be quite noticeable with a PPU.

But honestly... I can't see computers even requiring a significant boost or jump in power until someone gets off their asses and perfects a 3D display solution, as well as a 3D viewing environment/user paradigm to fit the display. Or maybe the other way around... when that happens, things will get interesting. I predict that a bit after that, computers will hit a sort of market limit... where gains may still be possible, but not required by consumers.
 
For quite some time we were measuring jumps in computing power by clock frequency. Of course the internal workings of the CPU and the rest of the system's configuration all factored into the bottom line, but you could pretty much tell what kind of power your machine had by the clock speed of its CPU.

A major reason for this is that most of the general-user "desktop" applications we use are single-threaded and do their work in a nice and usually organized serial manner. Thus, increasing clock speed gave an overall performance boost because the app could simply do those serial steps faster. Even currently multi-threaded apps benefited, because the serial events in each thread could go faster.

Now, for a while at least, it seems to me that the free "performance boost" ride is over. Because clock frequency won't be bumped in leaps and bounds anymore, hardware makers are going to give software developers speed through the ability to do more things at once. With multiple cores able to execute multiple instructions simultaneously, as well as other hardware modifications that facilitate parallelism, applications are going to get performance boosts by being smarter.

Unfortunately, in modern software engineering, we still for the most part live in a single-threaded world. In the short term, OSes and applications that lend themselves to multiple threads doing serial work in a moderately independent way are going to see the biggest gains from the new hardware design paradigm. Future increases in power, for the first time in a while, are going to take cooperation between hardware and software developers. The hardware guys have offloaded a portion of their "performance-increasing workload" onto the people using their hardware.
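The shift described above can be sketched in a few lines of Python (a hypothetical workload, not anything from this thread): the same batch of independent tasks run serially on one core, then farmed out to a pool of worker processes, which is the kind of restructuring multi-core chips ask of software.

```python
# Sketch: serial vs. parallel execution of independent, CPU-bound tasks.
# The workload (work) is a made-up stand-in for one frame, one file, etc.
from concurrent.futures import ProcessPoolExecutor

def work(n):
    # An independent CPU-bound task; nothing here depends on other tasks.
    return sum(i * i for i in range(n))

def run_serial(jobs):
    # Single-threaded style: one core, tasks done one after another.
    # Faster clocks were the only way to speed this up.
    return [work(n) for n in jobs]

def run_parallel(jobs, workers=2):
    # Multi-core style: the same tasks spread across worker processes.
    # Speed now comes from doing more things at once, not a faster clock.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(work, jobs))

if __name__ == "__main__":
    jobs = [100_000] * 4
    # Both paths compute identical results; only the scheduling differs.
    assert run_serial(jobs) == run_parallel(jobs)
```

The catch, as the post says, is that this only pays off when the tasks really are independent; code written as one long serial chain gets nothing from the extra cores until someone restructures it.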

Of course, things like increases in cache size and memory bandwidth will give us some "free" performance increases in software. Better tools will also improve the final executable's performance on newer hardware (although it may require recompiling, relinking, and redistributing to end users).

For the moment, 64-bit CPUs and multi-core systems are not worth the high prices they are fetching for end users. Personally, until I need over 4 GB of RAM, or I see 64-bit OSes with features I can't get on 32-bit, or I see applications that I use every day outperforming their counterparts on equally clocked machines of the current hardware design paradigm, I'm going to wait it out...
 