Curated Resource

Intel Announces Aurora genAI, Generative AI Model With 1 Trillion Parameters

my notes

The 2-exaflops Aurora supercomputer is a beast of a machine, and it will be used to power the Aurora genAI model. Announced at today's ISC23 keynote, Aurora genAI will be trained on general text, scientific texts, scientific data, and code related to those domains. It is intended as a purely science-focused generative AI model, with potential applications including:

  • Systems Biology
  • Cancer Research
  • Climate Science
  • Cosmology
  • Polymer Chemistry and Materials
  • Science

The Intel Aurora genAI model is built on the Megatron and DeepSpeed frameworks. Most importantly, the target size for the new model is 1 trillion parameters. By comparison, the model behind the free, public version of ChatGPT has 175 billion parameters, so Aurora genAI would represent roughly a 5.7x increase in parameter count.
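To get a rough feel for what these parameter counts mean in transformer terms, here is a minimal sketch using the standard ~12 × layers × hidden_size² approximation for a dense decoder-only transformer. The GPT-3 figures (96 layers, hidden size 12,288) are public; the 1-trillion-parameter configuration is purely hypothetical, since Intel has not disclosed Aurora genAI's architecture.

```python
# Rough parameter-count estimate for a dense decoder-only transformer.
# Approximation: ~12 * num_layers * hidden_size^2 (ignores embeddings and biases).

def approx_params(num_layers: int, hidden_size: int) -> int:
    """Approximate parameter count for a dense decoder-only transformer."""
    return 12 * num_layers * hidden_size ** 2

# GPT-3, the 175B-parameter model behind the original ChatGPT:
# 96 layers, hidden size 12,288 (published figures).
gpt3 = approx_params(num_layers=96, hidden_size=12_288)

# Hypothetical 1-trillion-parameter configuration, for illustration only
# (Aurora genAI's real architecture is not public): 128 layers, hidden size 25,600.
one_t = approx_params(num_layers=128, hidden_size=25_600)

print(f"GPT-3 estimate:    {gpt3 / 1e9:.0f}B parameters")   # ~174B
print(f"1T-class estimate: {one_t / 1e9:.0f}B parameters")  # ~1007B
print(f"Scale-up factor:   {one_t / gpt3:.1f}x")            # ~5.8x
```

Run as-is, the estimate lands within a few percent of the published 175-billion figure and shows why a 1-trillion-parameter model works out to roughly the 5.7x step up mentioned above.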


The above notes were curated from the full post at wccftech.com/intel-aurora-genai-chatgpt-competitor-generative-ai-model-with-1-trillion-parameters/.
