Climbing Olympus: Amazon’s 2 Trillion Parameter Language Model Takes on AI Giants

Virginia Backaitis
2 min read · Nov 9, 2023
Climbing Mount Olympus

Amazon is making waves in the world of artificial intelligence with the development of its colossal language model, codenamed “Olympus.” The project, long shrouded in secrecy until Reuters recently reported on it, marks a significant step for the tech giant as it seeks to compete with industry leaders like OpenAI, Microsoft, and Google.

Olympus, boasting a staggering 2 trillion parameters, is set to become one of the largest language models in existence. By comparison, OpenAI’s GPT-4, one of its closest rivals in parameter count, reportedly cost more than $100 million to develop.

Parameters are the internal values a model learns during training — the weights that determine how it interprets and responds to input data. Unlike hyperparameters, which developers set before training begins, parameters are adjusted automatically as the model learns. Generally, the more parameters a model has, the wider the range of tasks it can handle.
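To make the distinction concrete, here is a minimal PyTorch sketch (unrelated to Amazon’s code; the model and sizes are purely illustrative) showing hyperparameters chosen by a developer versus parameters counted and updated during training:

```python
import torch
import torch.nn as nn

# Hyperparameters: chosen by developers before training begins.
hidden_size = 128       # width of the hidden layer
learning_rate = 1e-3    # step size used by the optimizer

# A tiny model. Its weights and biases are the "parameters":
# values the training process adjusts to fit the data.
model = nn.Sequential(
    nn.Linear(10, hidden_size),
    nn.ReLU(),
    nn.Linear(hidden_size, 1),
)

# Count the learnable parameters -- the same quantity quoted as
# "2 trillion" for Olympus (this toy model has only a few thousand).
num_params = sum(p.numel() for p in model.parameters())
print(f"Learnable parameters: {num_params:,}")

# The optimizer updates the parameters during training; the learning
# rate stays fixed because it is a hyperparameter, not a parameter.
optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)
```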

This ambitious undertaking is led by Rohit Prasad, Amazon’s Senior Vice President and Head Scientist for Artificial General Intelligence, best known for leading the development of the popular Alexa voice assistant. Prasad has assembled a team of experts from the Alexa unit and Amazon Science to spearhead Olympus.
