
Hugging Face

World's leading open-source ML platform; default hub for AI model distribution and research.

Last refreshed: 22 April 2026 · Appears in 1 active topic

Key Question

Why is a Hugging Face co-founder backing a UK gas plant with AI at its core?

Common Questions
What is Hugging Face and why is it important for AI?
Hugging Face is an open-source machine learning platform founded in France in 2016. Its Hub hosts millions of public models and datasets, and its Transformers library is the standard tool for loading and fine-tuning AI models across industry and academia.
Why did Thomas Wolf invest in a UK energy startup?
In April 2026, Hugging Face co-founder Thomas Wolf joined the angel round for Rivan's Project Steadfast, a synthetic natural gas plant in Wiltshire. The investment reflects growing interest from AI infrastructure founders in energy-sector AI applications. Source: Lowdown
How does Hugging Face differ from OpenAI?
Hugging Face releases model weights openly, making frontier-class AI accessible to any developer or researcher, whereas OpenAI restricts access to its models through a paid API. Hugging Face does not operate its own closed frontier lab.
What is the Hugging Face Hub?
The Hugging Face Hub is a hosted platform where researchers and companies publish open-weight AI models, datasets, and interactive demos. It is the primary distribution point for open-source AI research globally.
Where is Hugging Face based and where was it founded?
Hugging Face was founded in France in 2016 and is now headquartered in Brooklyn, New York, while retaining a significant European presence and playing a central role in EU AI policy debates.

Background

In April 2026, Hugging Face co-founder Thomas Wolf joined the angel round for Rivan's Project Steadfast, a 15 MW synthetic natural gas plant in Wiltshire slated to become Europe's largest SNG facility. Wolf's move signals a broader pattern: open-source AI infrastructure capital is rotating into adjacent deep-tech and energy sectors where AI-driven optimisation sits at the core of the value proposition.

Hugging Face is the world's leading open-source machine learning platform. Founded in France in 2016 by Clément Delangue, Julien Chaumond, and Thomas Wolf, the company now operates from Brooklyn, New York. Its central product, the Hugging Face Hub, hosts millions of public models, datasets, and interactive demos and has become the default distribution layer for open-weight AI research worldwide. The company's Transformers Python library is the de facto standard for loading and fine-tuning transformer-based language and vision models, used by researchers at Google, Meta, and universities on every continent.

Hugging Face occupies a structurally distinct position in the AI industry as an open-source counterweight to closed labs such as OpenAI and Anthropic. Where those firms treat model weights as proprietary assets, Hugging Face's model is to commoditise access to frontier-class architectures and lower the barrier for enterprise and academic deployment. This positioning makes it a key reference point in European AI sovereignty debates: the EU AI Act's tiering of foundation models explicitly draws on the open/closed distinction Hugging Face has championed. The company sits at the centre of ongoing competitive tension between open-weight accessibility and the safety arguments advanced by proponents of restricted access.