Mistral AI, a French AI firm, has launched Les Ministraux, its first collection of generative AI models designed to run on edge devices such as laptops and phones.
On Wednesday (October 16, 2024), Mistral AI introduced the two new models that make up Les Ministraux: Ministral 3B and Ministral 8B.
The models can handle a range of tasks, including understanding and generating text, reasoning, and managing complex processes.
The company says the models are designed to be flexible and efficient, making them suitable for a variety of applications.
“These models set a new frontier in knowledge, commonsense, reasoning, function-calling, and efficiency in the sub-10B category,” Mistral AI noted in an official statement. “They can be used or tuned to a variety of uses, from orchestrating agentic workflows to creating specialist task workers.”
Both Ministral 3B and Ministral 8B support a context length of up to 128,000 tokens, although this is currently capped at 32,000 tokens when the models are served with vLLM, an open-source inference engine. That is a substantial increase over the context windows of the company’s earlier small models.
Faster inference and longer sequences
According to the French company, Les Ministraux can manage more complex tasks and longer inputs than its previous models, which handled much shorter sequences.
Mistral AI said that Ministral 8B has a special interleaved sliding-window attention pattern for faster and memory-efficient inference.
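As a rough illustration of the general idea (not Mistral’s actual interleaved implementation), sliding-window attention limits each token to attending only to a fixed number of recent tokens instead of the whole sequence, which keeps the cost of attention bounded as inputs grow. A minimal sketch in Python, with the window size chosen purely for demonstration:

```python
# Illustrative sliding-window attention mask; not Mistral's implementation.
# Each query position may attend only to itself and the previous `window - 1` tokens.
import torch

def sliding_window_mask(seq_len: int, window: int) -> torch.Tensor:
    i = torch.arange(seq_len).unsqueeze(1)  # query positions (column vector)
    j = torch.arange(seq_len).unsqueeze(0)  # key positions (row vector)
    causal = j <= i                         # never attend to future tokens
    local = (i - j) < window                # only the most recent `window` tokens
    return causal & local                   # True where attention is allowed

mask = sliding_window_mask(seq_len=12, window=4)
print(mask[10].nonzero().flatten().tolist())  # token 10 attends to [7, 8, 9, 10]
```

Because each row of the mask allows at most `window` positions, memory for the attention computation grows with the window size rather than with the full context length.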
According to Mistral AI, its customers and partners have been reducing their reliance on an internet connection by running its AI technology directly on devices such as phones and robots.
“Our most innovative customers and partners have increasingly been asking for local, privacy-first inference for critical applications such as on-device translation, internet-less smart assistants, local analytics, and autonomous robotics,” stated the firm.
Mistral AI designed Les Ministraux to provide a compute-efficient and low-latency solution for multiple use cases.
The models are aimed at a wide range of users, from independent hobbyists to global manufacturing organisations.
The new gen AI models can also be used in combination with larger language models (LLMs) such as Mistral Large. According to the makers of Les Ministraux, the models are efficient intermediaries for function-calling in multi-step agentic workflows.
They can be tuned to handle input parsing, task routing, and calling APIs based on user intent across multiple contexts at extremely low latency and cost.
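To make “calling APIs based on user intent” concrete, here is a minimal, hypothetical sketch of function-calling, assuming the official mistralai Python client (v1.x); the model identifier and the get_order_status tool are illustrative placeholders rather than anything Mistral ships:

```python
# Minimal function-calling sketch. Assumes the `mistralai` Python client (v1.x);
# the model identifier and the tool definition are illustrative placeholders.
import os
from mistralai import Mistral

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

tools = [{
    "type": "function",
    "function": {
        "name": "get_order_status",          # hypothetical tool exposed by your app
        "description": "Look up the shipping status of a customer order.",
        "parameters": {
            "type": "object",
            "properties": {"order_id": {"type": "string"}},
            "required": ["order_id"],
        },
    },
}]

response = client.chat.complete(
    model="ministral-8b-latest",             # assumed model name; check Mistral's docs
    messages=[{"role": "user", "content": "Where is my order 12345?"}],
    tools=tools,
)

# If the model decides a tool is needed, it returns a structured call
# (function name plus JSON arguments) instead of free-form text.
message = response.choices[0].message
if message.tool_calls:
    call = message.tool_calls[0]
    print(call.function.name, call.function.arguments)
else:
    print(message.content)
```

The application would then execute the named function itself and send the result back to the model in a follow-up message to complete the workflow.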
Mistral AI says its priority is to keep customer data secure, especially when translating languages, running smart assistants offline, analysing data locally, and controlling robots.
“At Mistral AI, we continue pushing the state-of-the-art for frontier models. It’s been only a year since the release of Mistral 7B, and yet our smallest model today (Ministral 3B) already outperforms it on most benchmarks,” the company stated.
What is Mistral AI?
Mistral AI is a French firm that specialises in AI products. It is known for its fast, open-source and secure language models, and for helping businesses build specialised models around their own use cases by leveraging private data and usage feedback.
The gen AI models developed by Mistral AI can create various types of text and other content. Its most notable product to date is Mistral 7B, a 7.3B-parameter model released under the Apache 2.0 licence, designed for use on devices such as laptops and phones and usable with few restrictions.
Now Mistral AI has launched two new gen AI models as part of its ‘Les Ministraux’ collection – Ministral 3B and Ministral 8B.
Is Mistral AI open source?
Yes. Mistral AI offers open-source generative technologies to foster trust and transparency in decentralised technology development. The company has made customisable open-weight models publicly available, which users can deploy on the system of their choice.
Mistral NeMo, a small AI model built in collaboration with NVIDIA and launched in July this year, is also available to use with few restrictions. Mistral Large 2 has been made available under both a free non-commercial licence and a commercial licence.
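As one example of deploying an open-weight model on a system of your choice, the sketch below loads an Apache 2.0-licensed Mistral checkpoint locally with the Hugging Face transformers library; the checkpoint name is a public example (you may need to accept its terms on Hugging Face), and any open-weight Mistral model your hardware can hold would work the same way:

```python
# Minimal local-inference sketch using Hugging Face transformers.
# "mistralai/Mistral-7B-Instruct-v0.3" is one publicly listed open-weight checkpoint;
# substitute another open-weight Mistral model if you prefer.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-Instruct-v0.3"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")  # needs `accelerate`

prompt = "Explain in one sentence why on-device inference helps privacy."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=60)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```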
How to use Mistral AI models?
Mistral AI’s open-source models, which are licensed under Apache 2.0, can be downloaded and run in a user’s own environment, or accessed through the API the company provides.
Users can then start exploring and building on Mistral AI’s La Plateforme, the company’s hosted platform, which is available immediately and promises speed and quality control; signing up is all that is needed to get started.
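For the hosted route, the sketch below sends a request to Mistral’s chat completions endpoint with plain HTTP; it assumes an API key created on La Plateforme, and the model name is an assumed identifier to be checked against Mistral’s documentation:

```python
# Minimal sketch of calling Mistral's hosted chat completions endpoint.
# Assumes an API key from La Plateforme; the model name is an assumed identifier.
import os
import requests

resp = requests.post(
    "https://api.mistral.ai/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
    json={
        "model": "ministral-3b-latest",   # assumed identifier; check the docs
        "messages": [{"role": "user", "content": "Give a one-line summary of Les Ministraux."}],
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```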