AI case studies using Llama 3.1

Neuromnia uses Meta's Llama 3.1 to develop Nia, an AI assistant that automates applied behavior analysis (ABA) therapy tasks.

MetaLearner uses Meta's Llama 3.1 to make enterprise resource planning (ERP) systems such as SAP and Oracle easier to work with.

NVIDIA applies structured weight pruning and knowledge distillation to Meta's Llama 3.1 models to produce the smaller, more efficient Llama-Minitron models.
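
Knowledge distillation, one of the two techniques named here, trains a compact "student" model to match the softened output distribution of a larger "teacher" model (in this case, Llama 3.1). The snippet below is a minimal, generic PyTorch sketch of that loss, not NVIDIA's Minitron training code; the temperature and alpha weighting are illustrative values.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend a soft teacher-matching term with the usual hard-label loss."""
    # Soften both output distributions, then measure how far the student
    # is from the teacher (KL divergence), rescaled by temperature**2.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_loss = F.kl_div(log_student, soft_teacher,
                         reduction="batchmean") * temperature ** 2

    # Standard cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)

    # alpha controls how much the student imitates the teacher
    # versus learning directly from the labels.
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```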

SNCF, in collaboration with Alp Valley and other partners, integrated Meta's Llama 3.1 to create a multilingual, voice-enabled virtual assistant for Paris' Gare du Nord. Unveiled at VivaTech Paris and during the Paris 2024 Summer Olympics, the assistant enhances the passenger experience with real-time train information and multilingual support, while also demonstrating how quickly such modules can be developed and validated.

Envision, an assistive technology company, uses Meta's Llama 3.1 to empower people who are blind or have low vision. By combining Llama's language capabilities with computer vision, Envision's products, including a mobile app and a prototype built on Meta's Project Aria glasses, convert visual information into spoken descriptions. This lets users access everyday information independently, enhancing their autonomy and quality of life.
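
At a high level, a pipeline like the one described would recognize what the camera sees, turn that into a natural-language description, and read it aloud. The sketch below assembles such a flow from off-the-shelf components as an illustration only; the captioning model, the specific Llama checkpoint, and the pyttsx3 speech engine are assumptions, not Envision's actual stack.

```python
from transformers import pipeline
import pyttsx3

# 1. Caption the current camera frame with an off-the-shelf vision model.
captioner = pipeline("image-to-text", model="Salesforce/blip-image-captioning-base")
caption = captioner("frame.jpg")[0]["generated_text"]

# 2. Ask an instruction-tuned Llama model to turn the caption into a
#    short, user-friendly description.
describer = pipeline("text-generation", model="meta-llama/Llama-3.1-8B-Instruct")
prompt = f"Describe this scene for a blind user in one short sentence: {caption}"
description = describer(prompt, max_new_tokens=60,
                        return_full_text=False)[0]["generated_text"]

# 3. Speak the description aloud with an offline text-to-speech engine.
engine = pyttsx3.init()
engine.say(description)
engine.runAndWait()
```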