LFM2 (Liquid AI) vs. Phi-3 (Microsoft)
About LFM2
LFM2 is a next-generation series of on-device foundation models built to deliver the fastest generative-AI experience across a wide range of endpoints. It employs a new hybrid architecture that achieves up to 2x faster decode and prefill than comparable models, and up to 3x better training efficiency than the previous generation. The models balance quality, latency, and memory for deployment on embedded systems, enabling real-time, on-device AI on smartphones, laptops, vehicles, wearables, and other endpoints, with millisecond inference, offline resilience, and full data sovereignty. Available in three dense checkpoints (0.35B, 0.7B, and 1.2B parameters), LFM2 outperforms similarly sized models on benchmarks covering knowledge recall, mathematics, multilingual instruction following, and conversational dialogue.
About Phi-3
Phi-3 is a family of powerful small language models (SLMs) with groundbreaking performance at low cost and low latency. It maximizes AI capability while lowering resource use, enabling cost-effective generative-AI deployments and faster response times in real-time interactions, autonomous systems, and other latency-sensitive scenarios. Phi-3 runs in the cloud, at the edge, or on device, for greater deployment and operational flexibility, and operates effectively in offline environments where data privacy is paramount or connectivity is limited. An expanded context window produces more coherent, accurate, and contextually relevant outputs. Phi-3 models were developed in accordance with Microsoft's AI principles: accountability, transparency, fairness, reliability and safety, privacy and security, and inclusiveness.
Platforms Supported (LFM2)
Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook
Platforms Supported (Phi-3)
Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook
Audience (LFM2)
Developers and engineering teams that need foundation models without reliance on cloud infrastructure
Audience (Phi-3)
Anyone interested in a family of powerful small language models
Support (LFM2)
Phone Support
24/7 Live Support
Online
Support (Phi-3)
Phone Support
24/7 Live Support
Online
API (LFM2)
Offers API
API (Phi-3)
Offers API
Pricing (LFM2)
No information available.
Free Version
Free Trial
Pricing (Phi-3)
No information available.
Free Version
Free Trial
Training (LFM2)
Documentation
Webinars
Live Online
In Person
Training (Phi-3)
Documentation
Webinars
Live Online
In Person
Company Information: Liquid AI
Founded: 2023
United States
www.liquid.ai/blog/liquid-foundation-models-v2-our-second-series-of-generative-ai-models
Company Information: Microsoft
Founded: 1975
United States
azure.microsoft.com/en-us/products/phi-3
Integrations (LFM2)
Azure AI Services
Azure OpenAI Service
Cake AI
Database Mart
Falcon-7B
Hugging Face
LM-Kit.NET
Molmo
Msty
NativeMind
Integrations (Phi-3)
Azure AI Services
Azure OpenAI Service
Cake AI
Database Mart
Falcon-7B
Hugging Face
LM-Kit.NET
Molmo
Msty
NativeMind