
This new 'skyscraper' chip could make computers run 1,000 times faster

Herne

Member
Wut? Maybe not this gen. But the Cell was generally one of the faster CPUs around at the time, as was the 360's GPU at launch.

And look at what it cost them. Cutting-edge, high-performance tech generally just does not go into small, cheap mass-consumer devices. It can be done, don't get me wrong, but it usually causes some headaches.
 
PC development can't produce games worth porting to console.

Star Citizen. Also, again, these chips will not be ready for the next consoles. In a dream world the next set would have stacked CPUs made from graphene (gotta get the bandgap sorted first), which would have astronomically high clock speeds and amazing performance across all cores. But hey, they were working on the same stuff 5 years ago.
 
It's late 2015. If this can be easily mass produced, who doesn't think this will be in consoles by late 2018? In three years' time I think it's a real possibility, looking at current tech trends. VR is very thirsty for speed.
 
It's late 2015. If this can be easily mass produced, who doesn't think this will be in consoles by late 2018? In three years' time I think it's a real possibility, looking at current tech trends. VR is very thirsty for speed.

Clearly you are the one who doesn't follow tech. 3 years is nothing! Really nothing at all when it comes to completely redesigning the entire structure.

Where is your Rockstar?

Right here.
 

bomblord1

Banned
3D ICs are being worked on from multiple angles. Their arrival is inevitable and has been for decades. The reason it hasn't happened sooner is that we've gotten enormous performance and efficiency gains yearly simply by making the process node smaller, fitting more transistors per chip at lower voltages. The benefits of that miniaturization are drying up as we hit physical limits, and building upwards is starting to look enticing. You can be sure that all relevant chipmakers are researching 3D ICs.

https://en.wikipedia.org/wiki/3D_IC
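The node-shrink gains described above come down to simple geometry. Here's a minimal Python sketch, assuming idealized scaling where shrinking the linear feature size by a factor s packs s² more transistors into the same area (real nodes deviate from this, and the classic Dennard voltage scaling that went with it broke down in the mid-2000s):

```python
# Idealized process-node scaling: a linear shrink by factor s
# yields s^2 more transistors per unit area.

def density_gain(old_nm, new_nm):
    """Transistor-density gain from shrinking old_nm -> new_nm."""
    s = old_nm / new_nm
    return s ** 2

# A full node shrink (e.g. the hypothetical 28nm -> 14nm step)
# halves the linear dimension, quadrupling density.
print(density_gain(28, 14))  # -> 4.0
```

This is why shrinking has been the easier path for decades: every full node step quadrupled the transistor budget without any architectural rethink.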

A 1,000x performance increase is just clickbait. I'm sure we'll get there in time, but first you have to make chips that are even equal to the competition. You need billions of dollars' worth of research in materials, fabs, EDA software, probably new ISAs, schedulers, etc. It's a massive task that has been put off because shrinking the process node has been easier (while still possibly the most daunting endeavor in human history) until now.

This was my initial thought as well; "too good to be true" usually is.

It would also require a very expensive overhaul of existing manufacturing processes and facilities in order to be viable to mass produce and that's not necessarily something that your average factory would be interested in while existing chips are still selling.
 
Will X4 and PS5 be able to utilize this?

The PS5 and X4, no, but the consoles after will literally just be the chip, heat dissipation + I/O options. Everything else will be emulation running in its own virtualisation on the chip. New GPUs will be unlocked in software, misterxmedia style, and you will finally be able to download more RAM.

Trust me on this, I have inside sources.
 

StevieP

Banned
PC development can't produce games worth porting to console.

There is so much more available in the PC market that it isn't worth devolving this thread into list wars on its second bloody page. Grow up.

It's late 2015. If this can be easily mass produced, who doesn't think this will be in consoles by late 2018? In three years' time I think it's a real possibility, looking at current tech trends. VR is very thirsty for speed.

And yet Sony is grafting it onto a console that can barely keep pace with a mid range PC. Again, the console market isn't about cutting edge any longer. Not to mention tech like this has been in the works for a lot longer than a few years.
 
It's late 2015. If this can be easily mass produced, who doesn't think this will be in consoles by late 2018? In three years' time I think it's a real possibility, looking at current tech trends. VR is very thirsty for speed.

Price it at 299.99, good sir.

 
So can anyone please confirm whether or not this is going to be relevant for us consumers in the near future?

I doubt the major benefactors of this kind of technology will be gamers (because let's face it, we don't all run Xeon processors, do we?) - I would say industrial design and any venture with mathematical computing at its core will make the most use of this.
I'm running a Xeon in my gaming machine. Back then in 2012 it used to be on pair with the latest in and more than 100€ cheaper. Could you please explicate what this has to do with Xeon in particular?

EDIT: I meant on par with the latest i7, sorry. Damn auto correction on mobile device.
 

Skinpop

Member
And look at what it cost them. Cutting edge, high performance tech generally just does not go with small and cheap mass consumer devices. It can be done, don't get me wrong, but it usually causes some headaches.

there is more to "cutting edge" than raw power.

So can anyone please confirm whether or not this is going to be relevant for us consumers in the near future?
I'd say at least a decade.
 

DocSeuss

Member
I'm not sure how this is that much different than stacked RAM as manufacturing goes, so I would think the industry as a whole is moving that direction.

Still not going to bring PCs and consoles together, since, at the end of the day, it's all down to how much wattage you can put into your machine, and you can put 700+ into a PC vs a cap of around 300-400 in a console.

So can anyone please confirm whether or not this is going to be relevant for us consumers in the near future?


I'm running a Xeon in my gaming machine. Back then in 2012 it used to be on pair with the latest in and more than 100€ cheaper. Could you please explicate what this has to do with Xeon in particular?

Why would you do that? It's not particularly efficient for gaming purposes, as I understand it. Was never on par with gaming processors.
 
Feels good to see a glimpse of the future.

It's one possible solution. Let's hope the technology becomes mainstream enough that the price won't be sky-high.
 

Caayn

Member
Why would you do that? It's not particularly efficient for gaming purposes, as I understand it. Was never on par with gaming processors.
No, it doesn't work like that for CPUs. A Xeon can perfectly well be used for gaming. It's an x86 CPU, after all.

The major difference is that a Xeon usually works with a different chipset and supports enterprise features such as ECC memory. It can also have more on-die cache and other benefits, such as better binning, which results in lower power usage and cooler running. They're not at all slower than an equally specced "gaming processor" in games.
 

daveo42

Banned
It's late 2015. If this can be easily mass produced, who doesn't think this will be in consoles by late 2018? In three years' time I think it's a real possibility, looking at current tech trends. VR is very thirsty for speed.

I don't think you realize what they are suggesting. This isn't even using existing manufacturing. The whole infrastructure would need to change to support this. We might see it in consoles in the generation after next if there are still dedicated consoles or something completely different. This is a sea change, not just an enhancement to existing tech.
 
Remember back when everyone and their mom was speculating that the Xbox One was using stacked chips, because there was no possible way MS's console would have such weak (leaked) specs?

Fun times.

The launch window was a magical time. The juniors, the bans, the rants, the tags, the insiders, 720p, IGN, the gifs. It was too much fun.
 
I don't think you realize what they are suggesting. This isn't even using existing manufacturing. The whole infrastructure would need to change to support this. We might see it in consoles in the generation after next if there are still dedicated consoles or something completely different. This is a sea change, not just an enhancement to existing tech.

Gotcha. This will take much longer.
 
The key takeaway here is that Moore's Law probably isn't coming to an end soon. We'll continue the steady march of computing progress with a few hitches here and there. So while this might not be revolutionizing computers in a few years, when you look out over a 10-20 year time span things start to look pretty incredible.
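For a rough sense of what that 10-20 year span means, here's a back-of-the-envelope Python sketch assuming the classic ~2-year doubling period (an idealization, not a guarantee):

```python
# Projected transistor-count growth under an idealized Moore's Law
# with a fixed doubling period of two years.

def moores_law_factor(years, doubling_period=2.0):
    """Growth factor after `years` at the given doubling period."""
    return 2 ** (years / doubling_period)

for span in (10, 20):
    print(f"{span} years -> ~{moores_law_factor(span):.0f}x")
# 10 years -> ~32x
# 20 years -> ~1024x
```

Note that 20 years of uninterrupted doubling works out to roughly a 1,000x gain, which puts the article's headline number in perspective: it's less a sudden revolution than two more decades of the usual pace.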
 

Arttemis

Member
I doubt the major benefactors of this kind of technology will be gamers (because let's face it, we don't all run Xeon processors, do we?) - I would say industrial design and any venture with mathematical computing at its core will make the most use of this.

The article mentions replacing silicon transistors with carbon nanotubes. I see no reason why this can't be applied to any and every chip manufacturing process.
 

Instro

Member
I wouldn't expect it for a while, at least not at the consumer level. They haven't quite tapped out on shrinking just yet.
 
Why would you do that? It's not particularly efficient for gaming purposes, as I understand it. Was never on par with gaming processors.

Sorry for the typo. I meant my Xeon was comparable to the latest i7 in terms of performance.
I initially built my PC for pretty much everything but gaming back then (just browsing, multimedia, etc.) but decided to go with a strong CPU so I wouldn't have to worry about performance for years to come. According to the advice I got from other people, the i-series would only be a must if I were going to overclock, which I wasn't. So I went for the best bang for my buck.

Looks like it was a good decision though. It performs surprisingly well with the gtx780 OC-version, which I just got this weekend.
 

Herne

Member
there is more to "cutting edge" than raw power.


I'd say at least a decade.

There was meant to be an "and" in there, but I was in a rush to get food done and was typing too quickly. Of course there is far more than just power; I didn't mean to imply otherwise.
 