Why the future of AI is Big, Efficient and Open
The second edition of ai-PULSE, held at Paris's iconic Station F on November 7, kicked off with a powerful morning session that highlighted some of the most important voices in AI today. The morning’s keynote speakers shared their insights into the latest advancements, the growing power of open-source AI, and the urgent need for Europe to position itself as a hub for sovereign, open, and efficient AI solutions.
Europe is not only catching up with AI powerhouses in the US and China, but is emerging as a formidable force in its own right. Xavier Niel, Founder of Iliad Group, began his address with a bold statement: "18 months ago, when we said we wanted to create a tech AI conference in Paris, people told us, 'Guys, you're completely crazy, Europe has no AI ecosystem!' Today, we are proving them wrong." He added, "thanks to our talent, we’re building amazing startups like Mistral, Poolside, and Argil. And to keep that talent, we’ve created Kyutai, where we’re assembling the ‘Avengers of AI’, bringing together the best minds to develop open source AI models in Europe, built according to our values and regulations."
One of the most exciting innovations to emerge from Kyutai is Moshi, an open source conversational AI model. "Moshi is the first truly open-source AI model created in Europe, and it’s completely free to use. It’s a game-changer for the industry," Niel declared.
Damien Lucas, CEO of Scaleway, and Aude Durand, Deputy CEO of Iliad Group, took to the stage next, announcing that Moshi is now available as a one-click deployment service on Scaleway's platform.
Durand then emphasized the theme of this year's event: "Big, Efficient, Open." "In AI, we know that top-tier models require immense computational power," she said. Lucas added: "Scaleway is proud to support cutting-edge AI models by offering access to 3,000 NVIDIA H100 GPUs today, triple our total this time last year, and in the coming weeks we will surpass 5,000 GPUs. This will allow developers across Europe to leverage the power of AI while minimizing environmental impact."
As part of its commitment to sustainable AI, Scaleway has also launched its new Environmental Footprint Calculator. "Transparency matters," said Lucas. "We want to give developers the tools to understand the environmental impact of their AI workloads, covering everything from emissions to water consumption. It’s the most comprehensive calculator available today, and it’s already in beta." Lucas also unveiled Generative APIs, a new Scaleway service that allows developers to move from OpenAI to alternative open source models in just one line of code. More about Scaleway's ai-PULSE announcements here.
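To illustrate what that one-line switch could look like, here is a minimal sketch using the OpenAI Python client, assuming Generative APIs exposes an OpenAI-compatible endpoint; the base URL and model name below are hypothetical placeholders, so check Scaleway's documentation for the actual values.

```python
# Minimal sketch: pointing an OpenAI-compatible client at an open source model,
# assuming Scaleway's Generative APIs accepts the same request format.
# The base_url and model name are illustrative placeholders, not confirmed values.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.scaleway.ai/v1",  # hypothetical endpoint: the one line that changes
    api_key="YOUR_SCALEWAY_API_KEY",
)

response = client.chat.completions.create(
    model="llama-3.1-8b-instruct",  # illustrative open source model name
    messages=[{"role": "user", "content": "Summarize the ai-PULSE keynotes in one sentence."}],
)
print(response.choices[0].message.content)
```

Because the request and response formats stay the same, the rest of an existing application would not need to change.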
Michael Dell, Chairman and CEO of Dell Technologies, then shared his thoughts on the accelerating pace of AI innovation, drawing comparisons with the rise of the internet. "If you remember the internet in the mid-90s, people were wondering what it really meant," he said. "It took about a decade before the full impact was understood. We're in that same phase with AI today." Dell then asserted his belief in the transformative power of AI, both in the cloud and on-device: "AI will be used both in the cloud and on devices. It will be used at the edge, at the point of activity closest to where the action is. It will run on cloud platforms, on your phone, in cars, factories, and retail stores," he said.
Dell also spoke passionately about open-source AI, emphasizing that it is a critical driver of innovation. "Open-source AI has already revolutionized the field, and it’s just getting started," he enthused. "Last year would’ve been more about closed models; the world has shifted now. Two-thirds of activity is now happening in open models: derivative, small models that don’t necessarily need to run on massive clusters."
As AI continues to evolve, Dell predicted, "we’re transitioning from calculating to thinking. The future of AI will require new architectures that bring together data, computing, and cognition in a way that helps humans become more successful."
Renee J. James, Chairman and CEO of Ampere Computing, took the stage next to talk about the future of AI hardware. She reflected on the importance of power efficiency in AI infrastructure. “We set out to build something twice as powerful, but with half the energy consumption, and we ended up getting 3 to 4 times the performance,” said James.
In a speech echoing her CPO Jeff Wittich’s assertion last year that Ampere CPUs could handle many inference workloads whilst using 3-5 times less energy than comparable GPUs, she stressed that AI hardware, particularly processors, needs to evolve to meet the increasing demands of AI models. “Today, we are all about training models, we have people building unlimited clusters to train,” she explained. “But as [AI] starts to moderate and become mainstream, we have to make it affordable and environmentally efficient, because we don’t have more power. That's the journey we are going to begin.”
James also pointed out that Europe has an untapped wealth of intellectual talent, especially in computer architecture, and she believes the continent must focus on increasing investment levels to continue pushing the boundaries of AI hardware. “Access to capital is one of the most important things that needs to continue to improve and the ability for investors to take more risks,” she concluded.
Renen Hallak, Founder and CEO of VAST Data, agreed with James on the growing importance of efficient data management in AI scaling. “AI is changing from allowing us to analyze numbers and rows and columns to pictures and videos, natural language and sound. Older systems were not built for that. So we need to break that paradigm, and that requires a new software architecture,” said Hallak. He then discussed the shift from CPUs to GPUs for data analysis, emphasizing the need for new software architectures “when you build exabyte-scale clusters that need to manage themselves.” Hallak also addressed the growing role of AI agents in problem-solving. “AI agents will soon be communicating with each other, generating ideas, and solving problems that humans cannot solve on their own,” he said.
Bryan Catanzaro, VP of Applied Deep Learning Research at NVIDIA, and Jean-Baptiste Kempf, Scaleway’s Chief Technical Officer, shared insights on the future of AI infrastructure. Catanzaro discussed the challenges of scaling AI models, particularly when it comes to GPU clusters. “We’re going to find that instead of putting a million GPUs in a single data center, we're going to run training and inference in a more distributed way, so we can have better access to electricity, and scale in a more efficient way,” he said, highlighting the importance of optimizing AI for both performance and sustainability.
Kempf echoed this sentiment, explaining that new AI models will need to be designed with compute efficiency top of mind. “We talked about thousands or hundreds of thousands of GPUs for training, but 90% of the compute power will actually be dedicated to inference. So, I don't believe we’ve reached the maximum capacity for inference yet,” he said.
Catanzaro added, “we change everything at NVIDIA. My job is to figure out how to change our models and make them much more efficient, both for training and deployment. NVIDIA is not just a chip company; we focus on accelerating computing through full-stack optimization. The level of specialization that goes into our GPUs for AI is something we decide on every day, focusing most importantly on machine learning and intelligence.”
Patrick Perez, CEO of Kyutai, then spoke about the groundbreaking work behind Moshi. “Since launching in mid-September, Moshi has already been experienced by over half a million people,” said Perez. He explained that Moshi is more than just a conversational AI; it is a multimodal foundation model capable of performing various tasks, from speech synthesis to question answering. “What we’ve created with Moshi is not just a voice model, but a platform for developers to innovate on top of,” he added. The open-source nature of Moshi is a key part of Kyutai’s mission to democratize AI and ensure that European models can compete on the global stage.
Charles Kantor, CEO of H, then addressed the potential of action models in AI, highlighting the importance of combining LLMs (Large Language Models) and VLMs (Vision-Language Models) for specialized tasks. "Action models are incredibly powerful today. By orchestrating LLMs that generate plans and workflows and VLMs that understand interfaces like desktop or Android, you can create systems that can take action effectively," said Kantor. He also emphasized the need for sector-specific focus: "The goal is to focus on one industry and master it, building a holistic system with well-orchestrated layers."
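To make the orchestration idea concrete, here is a minimal, purely illustrative sketch of how an LLM planner and a VLM interface reader could be wired together into an action loop; the functions below are hypothetical stand-ins, not H's actual system.

```python
# Illustrative sketch of an "action model" loop: a VLM interprets the interface,
# an LLM turns a goal into a plan, and an executor carries out each step.
# Every function here is a hypothetical stand-in for a real model or UI driver.
from dataclasses import dataclass


@dataclass
class ScreenState:
    description: str  # what the VLM sees on the desktop or Android screen


def read_screen_with_vlm(screenshot: bytes) -> ScreenState:
    # Stand-in: a real system would send the screenshot to a vision-language model.
    return ScreenState(description="login form with username and password fields")


def plan_with_llm(goal: str, screen: ScreenState) -> list[str]:
    # Stand-in: a real system would prompt an LLM with the goal and the screen state.
    return [f"click the username field ({screen.description})",
            "type the credentials",
            "press the submit button"]


def execute(step: str) -> None:
    # Stand-in: a real system would drive the UI via mouse, keyboard, or APIs.
    print(f"executing: {step}")


def run_agent(goal: str, screenshot: bytes) -> None:
    screen = read_screen_with_vlm(screenshot)  # the VLM understands the interface
    for step in plan_with_llm(goal, screen):   # the LLM generates the workflow
        execute(step)                          # actions are taken step by step


run_agent("log into the dashboard", screenshot=b"")
```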
Clara Chappaz, France’s Secretary of State for AI and Digital Affairs, concluded the morning's sessions by discussing the importance of building a distinctive AI ecosystem in Europe. "We are at the very beginning of this technology. It’s crucial to ensure AI is inclusive, frugal, and ethical," said Chappaz. She praised France’s strong educational system for producing top AI talent, and emphasized the role of infrastructure and private companies, like Scaleway, in providing the necessary tools for that talent to thrive and launch companies in France.
With AI’s growing energy demands in mind, she stressed the need for green energy to power AI infrastructure. “We [in France] have some of the greenest (lowest carbon) energy in the world. You all know how much these machines consume when it comes to power, and we need to think about how we feed these models with energy that is green, how we continue fostering these technologies without damaging the planet.”
For Chappaz, what's required is a 'third way' for AI: a new, collaborative, and inclusive approach to AI development, distinct from the competition-driven dynamics seen in the US and China. “The way we want to build this ecosystem and want to use AI is a way that can resonate with a lot of people. We like to call it the third way: one that brings people together," she concluded.