Microsoft Develops MAI-1 To Compete With Google, OpenAI, And Anthropic

Microsoft is developing a new large language model, called MAI-1, that will compete with similar models from Google, OpenAI, and Anthropic. This marks a major shift for Microsoft, which has so far depended on OpenAI's models such as GPT-4.

MAI-1 could end up rivaling the models built by Google, Anthropic, and OpenAI; its success will depend on how development goes and how well it performs against those existing models.
This is the first time Microsoft has built an AI model of this scale on its own since it began investing heavily in OpenAI. The company has paid over $10 billion for access to OpenAI's models; one of them, GPT-4, powers ChatGPT and Microsoft Copilot.

Mustafa Suleyman, who previously worked at Google and was CEO of Inflection, is leading the development of MAI-1. Microsoft hired most of Inflection's team and acquired its intellectual property for $650 million in March. Although some techniques used in MAI-1 may come from Inflection, it is said to be an entirely new large language model, according to two Microsoft employees familiar with the project.

MAI-1 will be far larger than Microsoft's earlier models, such as Phi-3. It is said to have about 500 billion parameters, which means it will need significantly more compute power and training data. That puts it in the same class as OpenAI's GPT-4, which is reportedly even bigger at over 1 trillion parameters. Smaller models, such as those from Meta and Mistral, have around 70 billion parameters.
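To put these parameter counts in perspective, a rough back-of-envelope calculation shows the memory needed just to hold each model's weights at 16-bit precision (2 bytes per parameter). This is an illustrative sketch using the figures reported above; it excludes optimizer state, activations, and other training overhead, which multiply the real hardware requirement considerably.

```python
def weight_memory_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Memory in GB to store the raw weights at a given precision (fp16 = 2 bytes)."""
    return num_params * bytes_per_param / 1e9

# Parameter counts as reported in the article
for name, params in [("MAI-1 (reported)", 500e9),
                     ("GPT-4 (reported)", 1e12),
                     ("70B-class model", 70e9)]:
    print(f"{name}: ~{weight_memory_gb(params):,.0f} GB at fp16")
# MAI-1 (reported): ~1,000 GB at fp16
# GPT-4 (reported): ~2,000 GB at fp16
# 70B-class model: ~140 GB at fp16
```

Even the weights alone for a 500-billion-parameter model span many GPUs, which is why training requires the large Nvidia server clusters described later in the article, while 70-billion-parameter models can fit on a small number of accelerators.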

Microsoft is pursuing two tracks of AI development. One focuses on small language models that can run on mobile devices. The other focuses on large, advanced models that need to be served from data centers, known as "the cloud." Apple is reportedly considering a similar approach.

This shows that Microsoft is willing to pursue AI on its own rather than relying solely on OpenAI. Today, OpenAI's technology powers Microsoft's most prominent AI features, such as the Copilot chatbot in Windows.

The exact purpose of MAI-1 isn't clear yet, even within Microsoft; how it will be used depends on how well it performs, according to a source cited by The Information. To train the model, Microsoft is using a large cluster of servers equipped with Nvidia GPUs. It is gathering training data from various sources, including text generated by OpenAI's GPT-4 and publicly available web content.

According to the report, if development goes well over the next few weeks, Microsoft may preview MAI-1 at its Build developer conference later this month, per one of the sources cited in the article.

