HellBlazer
Member
http://www.tomshardware.com/news/openai-nvidia-dgx-1-ai-supercomputer,32476.html
Elon Musk has been one of the more prominent tech figures to warn about centralized artificial intelligence control. Musk believes that no one company or government should have a monopoly on artificial intelligence. The thing about artificial intelligence is that once it passes a certain threshold, it could quickly become orders of magnitude more intelligent than any other AI solution out there.
Musk thinks that if artificial intelligence solutions are decentralized, then should one of them become too strong and somehow go rogue, all the other AI solutions out there would have a chance at fighting back.
It's also possible that in the long run, a crowdsourced version of an artificial intelligence could beat any solution developed by a single company or country. This crowdsourced solution would also be distributed to multiple players in various countries and industries, thus reducing the risks involved with artificial intelligence.
Elon Musk, together with a group of partners, created OpenAI last year, a billion-dollar-funded project that has since been dubbed the Xerox PARC of AI. The idea is to get as many smart people as possible from all over the world to improve this AI solution so it can at least keep up with the other centralized and commercial solutions out there.
To achieve its goal of keeping up with centralized competition, OpenAI needs not just good machine learning software, but also cutting-edge hardware. That's why the OpenAI non-profit is the first to get Nvidia's new DGX-1 "AI supercomputer in a box."
Nvidia calls it that because the DGX-1 is a $129,000 system that integrates eight Nvidia Tesla P100 GPUs. The P100 is Nvidia's highest-end GPU right now, capable of up to 21.2 teraflops of FP16 performance (the type of low-precision computation that machine learning software typically uses). The whole system is capable of up to 170 teraflops.
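The 170-teraflop headline number is just the per-GPU peak multiplied out, which is easy to sanity-check (figures are the ones quoted in the article; real-world throughput would of course be lower than peak):

```python
# Back-of-the-envelope check of the DGX-1's quoted FP16 throughput.
# Numbers are the article's figures, not measured performance.
NUM_GPUS = 8                    # Tesla P100s in one DGX-1
PEAK_FP16_TFLOPS_PER_GPU = 21.2 # per-P100 peak FP16 cited above

total = NUM_GPUS * PEAK_FP16_TFLOPS_PER_GPU
print(f"Aggregate peak FP16: {total:.1f} teraflops")  # 169.6, i.e. the ~170 quoted
```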
One of OpenAI's goals is to use generative modeling to train an AI that can not only recognize speech patterns, but can also learn how to generate appropriate responses to the questions that people ask it. However, the researchers behind OpenAI say that until now they've been limited by the computation power of their systems.
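For anyone unfamiliar with the term: a generative model learns the statistics of a corpus and then produces new text in the same style. As a toy illustration only (OpenAI's actual work uses neural networks, not this), a word-level Markov chain shows the basic "learn from conversations, then generate" idea:

```python
# Toy word-level Markov chain: a minimal generative text model.
# This is an illustrative sketch, not OpenAI's method.
from collections import defaultdict
import random

def build_model(text):
    """Map each word to the list of words that follow it in the corpus."""
    words = text.split()
    model = defaultdict(list)
    for cur, nxt in zip(words, words[1:]):
        model[(cur,)].append(nxt)
    return model

def generate(model, seed, length=10, rng=None):
    """Extend the seed word-by-word by sampling observed successors."""
    rng = rng or random.Random(0)
    out = list(seed)
    for _ in range(length):
        successors = model.get((out[-1],))
        if not successors:
            break  # dead end: the last word never had a successor
        out.append(rng.choice(successors))
    return " ".join(out)
```

Train it on more conversations and the continuations get more plausible, which is exactly why the "month of Reddit vs. years of Reddit" point quoted below matters.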
The Nvidia DGX-1 should help the OpenAI researchers get the significant boost in performance they need to jump to the next level in AI research. According to Ilya Sutskever, the DGX-1 will shorten the time each experiment takes by weeks. It will allow the researchers to pursue ideas that were perhaps not practical before, because they couldn't allocate weeks of research time to every single idea they had.
Sounds pretty cool. The prospects of advanced AI are both exciting and scary. Hopefully an open project like this can help mitigate the risks.
"So right now, if we're training on, say, a month of conversations on Reddit, we can, instead, train on entire years of conversations of people talking to each other on all of Reddit."
...Welp. Here comes the apocalypse.