arkhamguy123
Member
Serious question: Should I invest?
Don't worry this is all temporary, it will slowly start sliding down once the AI boom calms down. Same shit happened during the dot com era.
AI boom will calm down once it replaces human intelligence completely.
Don't worry this is all temporary, it will slowly start sliding down once the AI boom calms down. Same shit happened during the dot com era.
The AI boom will never slow down. It will only accelerate until it becomes impossible to keep up (it already is).
Not specifically Geforce Now, there are enough alternatives.
Unlike fads (e.g. metaverse, shitcoins, nfts), AI is actually a huge revolution with tangible impacts on the vast majority of industries. Just look at practical applications. Also, the .com era opened the way to the internet of today and to having the majority of white collar jobs done electronically. It completely changed the world economy we had in the '90s and before.
True, like image processing. The last company I worked at used AI via PyTorch to process images with no human intervention. They used Nvidia cards. I also interviewed at a health science company that was using image processing on pathology slides to look for and identify cancers. This field is still finding new uses.
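For anyone curious what "PyTorch on Nvidia cards with no human intervention" looks like in practice, here's a toy sketch (not their actual pipeline — the model, the two-class head, and the image size are all made up for illustration). The point is just that the same code runs on CPU or GPU, and falls back automatically:

```python
# Toy sketch of hands-off image classification with PyTorch.
# Uses the GPU when an NVIDIA card is available, CPU otherwise.
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Sequential(              # stand-in for a real trained network
    nn.Conv2d(3, 8, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(8, 2),                # e.g. "abnormal" vs "normal" slide
).to(device).eval()

# fake batch of four 64x64 RGB images; a real pipeline would load scans
images = torch.rand(4, 3, 64, 64, device=device)

with torch.no_grad():               # inference only, no gradients needed
    labels = model(images).argmax(dim=1)

print(labels.shape)  # one predicted class per image
```

Swap in a pretrained network and a real data loader and that's the whole loop — no human in it.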
And what do you think most of those alternatives are running in terms of hardware?
Xcloud is surely not on Nvidia hardware.
Sorry, the time to invest was last year when it was heading towards its lows.
If you want to invest for the long haul at these multiples, then good luck; you'll need it.
AI isn’t going away. NVIDIA’s valuation will though.
Will it if they’re the #1 global supplier of compute to train these AI models?
Maybe they will be, but more likely they’ll face significant competition.
Even a one-year lead would be a significant advantage. Not to mention the volumes enabling better pricing. A couple years back, I thought ASICs like the Google TPU would mean the big players would stop using NVIDIA's GPUs for training. But clearly I was wrong: OpenAI still uses them, Google and Meta will use H100s, Tesla too even though they're working on their Dojo.
Chips are used in everything already. There's no reason they should be valued at current levels just because the types of chips they make give them a short-term advantage.
All I'm saying is I was fucking SHOOK at my stocks this morning. This is one time I'm happy that I bought a dip!
Yeah, on one hand I am happy Nvidia, MS and Apple have been doing well, on the other hand...
Even a one-year lead would be a significant advantage. Not to mention the volumes enabling better pricing. A couple years back, I thought ASICs like the Google TPU would mean the big players would stop using NVIDIA's GPUs for training. But clearly I was wrong: OpenAI still uses them, Google and Meta will use H100s, Tesla too even though they're working on their Dojo.
Time will tell what happens, but we're soon to the point where one or two years could be the difference between GPT-x and AGI. NVIDIA has the momentum, the ecosystem, the API, the network... a lot of inertia, very little time to catch up.
Serious question: Should I invest?
You kinda missed the boat for big gains ... but I'm sure over time they will keep going up.
Don't worry this is all temporary, it will slowly start sliding down once the AI boom calms down. Same shit happened during the dot com era.
Yeah probably. Nvidia rode the crypto bubble hard and then it popped, something which caught them a bit off guard with their high-end card pricing this gen.
higher GPU price confirmed.
Pretty much, these cards are for workstations and data centers, not gaming. There's more money in selling AI processing in bulk than in trying to sell workstation cards exclusively anymore.
How does AMD compete with that?
Joint venture with Apple and Intel to push some open ML language.
Jensen: