In This Article:
IBM (IBM) and NASA are partnering to develop an open-source AI model for weather and climate analysis. The foundational model, trained on NASA's data, is now available on Hugging Face, the AI startup's model-sharing platform.
Hugging Face CEO Clem Delangue joins Catalysts to discuss the company's partnership with IBM and NASA and its outlook as generative AI seeps into all corners of the market.
Hugging Face now has more than one million public models available on its platform, with one new repository being created every 10 seconds. These models encompass everything from chatbots to areas like biology and chemistry.
Delangue calls Hugging Face's partnership with IBM and NASA an example of "AI for good," explaining, "The number of people who die from our inability to predict weather events is massive, and if you can use AI to reduce that number just by predicting these events, maybe a few hours before, this is a massive positive impact that AI has on the world."
While Hugging Face is an open-source platform, Delangue notes that the company has always been "very intentional about picking the revenue streams and the offerings that are very much creating value for customers." In this way, Hugging Face has become profitable, which Delangue notes is "quite rare for AI startups."
Well, IBM and NASA are partnering to develop an open-source AI model for weather and climate analysis, including improving localized forecasts, predicting severe weather, enhancing global climate solutions, and more. Now, the foundational model trained on NASA's data is available on Hugging Face. This is the artificial intelligence startup that hosts thousands of open-source models for developers to test new large language models. And joining me now is Hugging Face CEO Clem Delangue, here in studio with us. Before it was thousands, but now I understand it's a million public models on Hugging Face. Just talk us through the scaling of this business and partnerships like this that you've got with IBM, with NVIDIA, with Google. I mean, you've really racked up some of the largest names in mega-cap tech right now that are leveraging Hugging Face and partnering with you.
Yeah, we're actually crossing 1 million public models on the platform today. There's one new repository created every 10 seconds on Hugging Face. And what's interesting is that it's not only text chatbot models, like ChatGPT; a lot of them are in other domains like biology, chemistry, image, and video. So this example of a model and a dataset released by IBM and NASA is interesting because, by forecasting climate events, it's actually AI for good, right? Because the number of people who die from our inability to predict weather events is massive. And if you can use AI to reduce that number just by predicting these events maybe a few hours earlier, this is a massive positive impact that AI has on the world.
Yeah, absolutely. And it really speaks to the strength of artificial intelligence, and inferencing as well, and that's something that your business is leaning a lot into: inferencing as a service. How does that change the profitability perspective and trajectory for the business, because you've already become profitable at this point?
Yeah, we've always been very intentional about picking the revenue streams and the offerings that are very much creating value for customers, and as a result can be high margin. That's how we've managed to be profitable today, which is quite rare for AI startups, which are usually at the opposite end of the spectrum. Most of them, even the biggest ones, are actually burning much more than their revenue.
Is that OpenAI?
All of them. You know, it's incredible that we maintain such uncertainty at such high levels of revenue. That's a little bit the nature of the field, but we've taken a little bit of a different direction where we focus more on profitable, sustainable revenue, because we've built a platform for the community. And so we want to build that for the long term.
You know, I heard this question asked on one of the earnings calls this season, and I think it was Oracle's. It was about the market transitioning from an AI training phase, with all of the purchasing around chips that has really run away with so much of the attention, to an AI inferencing phase, and what that change, or at least evolution in the cycle, would mean for companies like yours in getting some of that investment. You know, what does that look like as the revenue model for Hugging Face becomes more solidified as well?
Well, I think that means looking more at AI in production. You know, not only looking at how you build experiments with AI, but also how you put those experiments into production for millions of users. So for Hugging Face, one interesting thing, and I mentioned the 1 million public models on the platform, is that people don't really know we have almost as many private models on the platform that companies are using internally in production for their use cases. So we're seeing this transition from AI for prototypes and experiments to AI in production. And in terms of revenue, I think that means more companies using our enterprise hub offering, which has been very, very successful, and in general looking more at the cost of AI, especially when you scale, to make sure you can keep margins that are sustainable.
You know, I'm going to take us a step back here, and then we'll go kind of rapid fire with the time that we have left. But just where are we in the generative AI cycle right now, and how far are we from the point where it feels like my grandmother might even know the difference that artificial intelligence is making in her day-to-day life?
So I think we're quite mainstream on the usage side. We've seen some sort of a catch-up from investors and the public on AI. And maybe we're at the phase where it has slowed down a little bit, or at least we have a more realistic view of what AI can and can't do. It's more mature. And hopefully, in the next few years, we'll keep improving the use cases, expand from just text to all the other domains, as I mentioned, and really make it mainstream.
All right, we've got to go. But is this, you know, a period where you think about Hugging Face being a publicly traded company at some point in the future?
Not yet. I think we're happy to be private right now. We're doing some acquisitions; we've done two in the past three months. So the idea is to keep building the usage and the revenue in the next few years.
Thank you so much for joining us here in studio, Clem Delangue, founder and CEO of Hugging Face. Thanks so much.
Pleasure.
"It's incredible that we maintain such uncertainty at such high levels of revenue. That's the nature of the field. But we've taken a little bit of a different direction where we focus more on profitable, sustainable revenue because we've built a platform for the community. And so we want to build that for the long term," he adds.
In addition to the one million public models offered by the platform, Hugging Face has nearly as many private models that are being used by companies internally. "So we're seeing this transition from AI for prototypes, for experiments, to AI in production. And in terms of revenue, I think that means more companies using our enterprise hub offering that has been very, very successful... in general, kind of like looking at more of the costs for AI, especially when you scale to make sure you can keep margins that are sustainable," he tells Yahoo Finance.
As more people and companies adopt generative AI, Delangue believes that right now, "We're quite mainstream on the usage side." He explains that there's been a "catch-up" from investors and the public on the technology, adding, "We're at the phase where it's slowed down a little bit, or at least we have a more realistic view of what AI can and can't do. It's more mature. And hopefully in the next few years, we'll keep improving the use cases, expand from just text to all the other domains... and really make it mainstream."