AI Weekly Digest #37: AI Performance & Speed Improvements Across the Board

Air Canada’s Chatbot Fiasco, Groq's LPU Inference, Stable Video's Release, Mistral Large & Magic's AI Software Engineer!

Hello, tech enthusiasts! This is Wassim Jouini, and welcome to my AI newsletter, where I bring you the latest advancements in Artificial Intelligence without the unnecessary hype.

You can find me on LinkedIn, Twitter and Medium! Let’s connect!

Now let's dive into this week's news and explore the practical applications of AI across various sectors.

Main Headlines

Here are the main trends to keep in mind if you are working in AI today.

#1 Air Canada’s Chatbot Fiasco: When AI Fails, The Company Pays The Price

If you are building AI-based chatbots for your website or clients, you need to read this!

The eye-opening story of Air Canada’s chatbot fiasco is a reminder that when AI fails, it’s the company that pays the price!

#2 Microsoft & Mistral AI Announce A Multi-Year Partnership

  • Microsoft and Mistral AI have announced a multi-year partnership to accelerate AI innovation, starting with the release of Mistral Large, Mistral's latest large language model, available first on Azure (a quick API sketch follows below)!

  • The partnership focuses on supercomputing infrastructure, scale to market through Azure AI Studio and Azure Machine Learning, and AI research and development, including for European public sector workloads.

Mistral Large - Performance compared to other LLMs
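
If you want to try Mistral Large yourself, the most direct route today is Mistral's hosted API ("la Plateforme"); the same model is also offered as a managed deployment through Azure AI Studio. Below is a minimal Python sketch, under the assumption that you have a MISTRAL_API_KEY set and that the `mistral-large-latest` model alias is enabled on your plan; verify both against Mistral's documentation.

```python
# Minimal sketch: querying Mistral Large through Mistral's hosted chat-completions API.
# Assumptions: MISTRAL_API_KEY is set and the "mistral-large-latest" alias is enabled
# on your plan; double-check the endpoint and model name against Mistral's docs.
import os

import requests

resp = requests.post(
    "https://api.mistral.ai/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
    json={
        "model": "mistral-large-latest",
        "messages": [
            {"role": "user", "content": "In two sentences, what sets you apart from Mistral 7B?"}
        ],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```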

#3 Groq & The Promise Of Much Faster LLM Inference!

With Nvidia's market capitalization soaring past $2T, it's evident that computational power has become the new gold, and the current limiting factor!

Therefore, it's not surprising to see processors designed for LLM inference acceleration receiving significant attention.

This is the context in which Groq emerges: its LPU (Language Processing Unit) delivers up to a 20x speed-up in LLM inference, offering a glimpse of what real-time LLM inference could look like!

Fast & Cost Efficient Inference with LPUs
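
Want to feel that speed yourself? Groq exposes an OpenAI-compatible chat endpoint, so the standard `openai` Python client works once you point it at a different base URL. Treat the base URL, the GROQ_API_KEY variable and the hosted model name (`mixtral-8x7b-32768`) in this sketch as assumptions to verify against Groq's current documentation.

```python
# Minimal sketch: calling Groq's OpenAI-compatible endpoint and timing the response.
# Assumptions: GROQ_API_KEY is set, the "mixtral-8x7b-32768" model is still hosted,
# and the base URL below matches Groq's current docs; verify before relying on it.
import os
import time

from openai import OpenAI  # pip install openai

client = OpenAI(
    api_key=os.environ["GROQ_API_KEY"],
    base_url="https://api.groq.com/openai/v1",  # Groq's OpenAI-compatible endpoint
)

start = time.perf_counter()
response = client.chat.completions.create(
    model="mixtral-8x7b-32768",
    messages=[{"role": "user", "content": "Summarize this week's AI news in one sentence."}],
)
elapsed = time.perf_counter() - start

print(response.choices[0].message.content)
print(f"Generated {response.usage.completion_tokens} tokens in {elapsed:.2f}s")
```

Timing the call client-side like this is rough, but it's a practical way to compare tokens-per-second across providers.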

#4 The Story behind Magic’s $100M Investment to Build Superhuman AI Software Engineers

In a surprising move, Magic has secured a $100 million investment to pioneer the development of superhuman AI software engineers!

  • Investment Lead: The investment is led by Nat Friedman, former GitHub CEO. After a demo, “Daniel and I were so impressed, we are investing $100M in the company today.” - Nat Friedman

  • Active Reasoning Breakthrough: Internal rumors claim that Magic's innovation is a breakthrough in active reasoning, enabling AI to make more autonomous decisions and solve complex problems.

  • Revolutionizing Software Development: This project aims to transcend traditional AI coding assistants, promising an AI that collaborates like a human engineer.

To be continued!

#5 Stable Video Is Out, And It's Open Source!

I know, I know… after OpenAI's Sora, it will take some effort to impress you with AI-generated videos! Nevertheless, Sora isn't available yet; Stable Video is! And it's open source (so you can fine-tune it!)

What you need to know:

  • Availability: It’s already available online!

  • Cost: You get 150 free credits to test it! One image-to-video generation costs about 10 credits, for instance. You can buy 500 credits for $10.

  • Video Length: 2-5 second high-quality videos (30 FPS).

  • Generation Time: A couple of minutes to generate a video. Expect a bit of delay in case of high traffic.

  • License: Code and model weights are released for research and non-commercial applications for now (see the local-inference sketch below).
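
Since the weights are public, you can also run Stable Video Diffusion locally instead of going through the hosted service. Here is a minimal sketch using Hugging Face `diffusers`; it assumes a CUDA GPU with plenty of VRAM, a recent `diffusers` install, and that you have accepted the model's research / non-commercial license on the Hugging Face Hub. The input image URL is only a placeholder.

```python
# Minimal sketch: running Stable Video Diffusion locally via Hugging Face diffusers.
# Assumptions: a CUDA GPU with enough VRAM, a recent diffusers release installed, and
# acceptance of the model's research / non-commercial license on the Hub.
import torch
from diffusers import StableVideoDiffusionPipeline
from diffusers.utils import export_to_video, load_image

pipe = StableVideoDiffusionPipeline.from_pretrained(
    "stabilityai/stable-video-diffusion-img2vid-xt",
    torch_dtype=torch.float16,
    variant="fp16",
)
pipe.to("cuda")

# Any still image works as the conditioning frame; this URL is just a placeholder.
image = load_image("https://example.com/input_frame.png").resize((1024, 576))

# Generates a short clip (roughly two dozen frames) conditioned on the input image.
frames = pipe(image, decode_chunk_size=8).frames[0]
export_to_video(frames, "generated.mp4", fps=7)
```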

This is it for Today!

Until next time, this is Wassim Jouini, signing off. See you in the next edition!

Have a great week and may AI always be on your side!