The 2-exaflops Aurora supercomputer is a beast of a machine, and the system will be used to power the Aurora genAI model. Announced at today's ISC23 keynote, the Aurora genAI model will be trained on general text, scientific texts, scientific data, and code related to those domains. It is intended to be a purely science-focused generative AI model with a range of potential applications.
The Intel Aurora genAI model is built on the foundations of Megatron and DeepSpeed. Most importantly, the target size for the new model is 1 trillion parameters. By comparison, the model behind the free, public version of ChatGPT has just 175 billion parameters, so the target is roughly 5.7x as many parameters.
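The parameter-count comparison is quick arithmetic to verify; the sketch below assumes the commonly cited 175-billion-parameter figure for GPT-3, the model behind the free tier of ChatGPT.

```python
# Rough comparison of the two parameter counts cited in the article.
# 175B is the widely reported GPT-3 size (an assumption, not an official
# figure for every ChatGPT variant).
aurora_genai_params = 1_000_000_000_000   # 1 trillion (target)
chatgpt_params = 175_000_000_000          # 175 billion

ratio = aurora_genai_params / chatgpt_params
print(f"Aurora genAI targets ~{ratio:.1f}x the parameters of ChatGPT")
```

Running this prints a ratio of about 5.7, matching the figure quoted above.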