The 2-exaflop Aurora supercomputer is a beast of a machine, and it will be used to power the Aurora genAI model. Announced at today's ISC23 keynote, the Aurora genAI model will be trained on general text, scientific texts, scientific data, and domain-related code. It is intended as a purely science-focused generative AI model with a range of potential applications.
The foundations of the Intel Aurora genAI model are Megatron and DeepSpeed. Most importantly, the target size for the new model is 1 trillion parameters. By comparison, the model behind the free public version of ChatGPT has just 175 billion parameters. That's roughly a 5.7x increase in parameter count.
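The scale comparison is simple arithmetic; a quick sketch (using the parameter counts quoted above, nothing Intel-specific):

```python
# Quoted parameter counts: GPT-3-class model behind the free public
# ChatGPT vs. the stated Aurora genAI target.
gpt3_params = 175e9        # 175 billion
aurora_target = 1e12       # 1 trillion

ratio = aurora_target / gpt3_params
print(f"{ratio:.1f}x")     # roughly a 5.7x increase
```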