
Fractile
UK startup building in-memory AI inference accelerators that compute alongside model weights rather than fetching them from separate memory.
Last refreshed: 1 May 2026 · Appears in 1 active topic
Will the AI Hardware Plan give Fractile a fab commitment or just a name-check?
Timeline for Fractile
Named by Kendall as British AI hardware supply-chain candidate
UK Startups and Innovation: Kendall names UK chip five at RUSI
- What does Fractile's in-memory inference chip actually do?
- Fractile builds accelerators that execute AI inference directly alongside the model weights in memory, eliminating the round-trip penalty of fetching weights from separate DRAM that throttles GPU-based inference. Source: Lowdown reporting
- Why did the UK government name Fractile in a ministerial speech?
- Secretary of State Liz Kendall named Fractile alongside four other British AI hardware startups at RUSI on 28 April 2026, framing them as British supply-chain candidates the AI Hardware Plan is designed to underwrite. Source: Lowdown reporting
- What is the AI Hardware Plan and when does it launch?
- The AI Hardware Plan was pre-announced by DSIT for launch at London Tech Week in June 2026. Its specific mechanism (procurement, equity or grant) has not been confirmed. Source: Lowdown reporting
- Who are the five British AI chip companies named by the UK government in 2026?
- Fractile, Olix, Lumai, Optalysys and Salience Labs were named by Liz Kendall at RUSI on 28 April 2026 as British supply-chain candidates for the AI Hardware Plan. Source: Lowdown reporting
Background
Fractile entered the public record on 28 April 2026 when Secretary of State Liz Kendall named it alongside four other British AI hardware companies in a RUSI address pre-announcing the AI Hardware Plan for London Tech Week in June. The ministerial endorsement moved Fractile onto procurement and investor watchlists overnight; none of the five had previously been named in a Cabinet-level speech.
Fractile, founded in 2022, builds in-memory inference accelerators that compute alongside the model weights rather than fetching them from separate DRAM. The approach attacks the memory-bandwidth bottleneck that throttles large-language-model inference on conventional GPU clusters: by keeping the weights in compute-adjacent storage, Fractile's chips execute matrix operations without the round-trip penalty that limits throughput at scale. The company is part of a cluster of British deep-tech hardware companies working across the photonic, edge, and in-memory layers of the AI silicon stack.
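The bottleneck described above can be made concrete with a back-of-envelope calculation: in batch-1 decoding, every generated token must stream the full set of weights across the memory bus, so throughput is capped at bandwidth divided by model size. The sketch below uses illustrative, publicly known GPU-class figures; none of the numbers or names are Fractile's specifications.

```python
# Back-of-envelope model of the memory-bandwidth ceiling in batch-1
# LLM decoding. Illustrative numbers only, not Fractile benchmarks.

def decode_tokens_per_second(param_count: float,
                             bytes_per_param: float,
                             mem_bandwidth_gb_s: float) -> float:
    """Each decoded token streams every weight from memory once,
    so throughput is capped at bandwidth / model size in bytes."""
    model_bytes = param_count * bytes_per_param
    return (mem_bandwidth_gb_s * 1e9) / model_bytes

# A 70B-parameter model at 2 bytes/param (fp16) on an accelerator
# with ~3,350 GB/s of HBM bandwidth (H100-class, for illustration):
gpu_cap = decode_tokens_per_second(70e9, 2, 3350)
print(f"Bandwidth-bound ceiling: ~{gpu_cap:.0f} tokens/s per stream")
# In-memory compute removes the weight round-trip, so the ceiling
# shifts from DRAM bandwidth to compute and interconnect instead.
```

The point of the arithmetic is that the ceiling scales with memory bandwidth, not raw FLOPs, which is why co-locating weights with compute changes the constraint rather than just the constant.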
The AI Hardware Plan, whose mechanism (procurement, equity or grant) Kendall left unspecified, is the immediate upstream pressure on Fractile's trajectory. If the plan underwrites a multi-year purchase commitment or fab-line finance, it would give Fractile the customer anchor typically needed to raise a Series A at industrial scale. The plan arrives in a context where London's datacentre grid is at saturation, and the UK's AI Growth Zones in Scotland and the north are the intended deployment sites for domestic compute running on domestic silicon.