Microsoft unveils Phi-2, the next of its smaller, more nimble genAI models


Microsoft has announced the next of its suite of smaller, more nimble artificial intelligence (AI) models targeted at more specific use cases.

Earlier this month, Microsoft unveiled Phi-1, the first of what it calls small language models (SLMs); they have far fewer parameters than their large language model (LLM) predecessors. For example, the GPT-3 LLM — the basis for ChatGPT — has 175 billion parameters, and GPT-4, OpenAI's latest LLM, has about 1.7 trillion. Phi-1 was followed by Phi-1.5, which, by comparison, has 1.3 billion parameters.


