Big Tech firm Microsoft (NASDAQ: MSFT) has announced the release of a small language model (SLM) designed for lightweight artificial intelligence (AI) tasks, placing a keen emphasis on cost-effectiveness.
In its official statement, Microsoft says the new AI model will offer enterprises and other end users an alternative to the mainstream large language models (LLMs) that typically require significant computing power. Dubbed Phi-3 mini, Microsoft says its lightweight AI model demonstrates impressive coding, language, and math abilities when placed side by side with larger models.
The Phi-3 mini contains only 3.8 billion parameters but still packs a heavy punch thanks to Microsoft’s advanced training systems. In comparison, the parameters for OpenAI’s GPT-4 and Meta’s Llama 2 hover around 1.76 trillion and 70 billion, respectively, for the largest versions.
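The gap in those parameter counts translates directly into hardware requirements. As a rough illustration, the sketch below estimates the memory needed just to hold each model's weights at 16-bit precision (a common rule of thumb of 2 bytes per parameter; it ignores activation and KV-cache overhead, and uses the figures cited above):

```python
# Rough memory estimate for holding model weights at 16-bit precision.
# 2 bytes per parameter (fp16/bf16) is a common rule of thumb and
# ignores activation and KV-cache overhead.
BYTES_PER_PARAM = 2  # fp16

models = {
    "Phi-3 mini": 3.8e9,
    "Llama 2 70B": 70e9,
    "GPT-4 (reported)": 1.76e12,
}

def weight_gb(params: float) -> float:
    """Return the approximate size of the model weights in gigabytes."""
    return params * BYTES_PER_PARAM / 1e9

for name, params in models.items():
    print(f"{name}: ~{weight_gb(params):,.0f} GB of weights")
```

By this back-of-the-envelope measure, Phi-3 mini's weights fit in under 8 GB, small enough for a single consumer GPU or a laptop, while the largest models demand data-center-class hardware.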
Microsoft’s researchers are already rubbing their hands in glee at the prospect of mainstream acceptance of SLMs. For starters, the company says enterprises will have broader options to pick from in their pivot to AI but clarifies that all sizes of AI models can coexist in the same ecosystem.
“What we’re going to start to see is not a shift from large to small, but a shift from a singular category of models to a portfolio of models where customers get the ability to make a decision on what is the best model for their scenario,” remarked Sonali Yadav, generative AI product manager at Microsoft.
Another potential use case for SLMs is running AI models locally on devices without requiring internet connectivity. Firms or enterprises that want AI capabilities but are wary of data leaks may opt for SLMs to “keep data on their own premises.”
Apart from heightened levels of security, turning to SLMs will reduce latency while encouraging AI adoption in rural areas, especially those without internet connectivity.
The lightweight SLM will debut on Hugging Face, the Microsoft Azure AI Model Catalog, and Ollama, with Microsoft hinting at an interoperable API.
Microsoft says it will double down on SLMs by releasing new versions of Phi-3 in the coming months. The company is eyeing 7-billion and 14-billion parameter versions that expand the line’s capabilities without demanding excessive computational power.
Headlong into AI
Microsoft’s latest AI offering builds upon its previous experiments with lightweight AI models, culminating in the release of Phi-1 and Phi-2 in 2023. Awed by the “surprising power” of SLMs, the big tech company committed to smaller AI models.
The company has been inking partnership deals with leading industry players, including the UAE’s G42, Vodafone (NASDAQ: VODPF), and KPMG. Other high-profile deals see Microsoft invest in the emerging technologies space in Australia, the U.K., and the U.S. to deepen the talent pool and improve cloud capabilities.
For artificial intelligence (AI) to work within the law and thrive in the face of growing challenges, it needs to integrate an enterprise blockchain system that ensures data input quality and ownership, keeping data safe while also guaranteeing its immutability. Check out coverage of this emerging tech to learn more about why enterprise blockchain will be the backbone of AI.
Watch: How blockchain will keep AI honest