
Boltzmann Live-Learning Machine

Building the future of AI on foundational research

The birth of the Boltzmann Machine
Geoffrey Hinton and Terry Sejnowski publish "A Learning Algorithm for Boltzmann Machines," introducing a biologically-inspired stochastic model for learning internal representations. This sets the stage for how machines could "imagine" and reconstruct data.
Get the PDF >
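To make the idea concrete, here is a minimal, illustrative sketch (not code from the paper) of the classic Boltzmann machine learning rule: binary units are resampled by Gibbs sampling, and each weight moves toward the correlation seen with the data clamped and away from the correlation the free-running network produces on its own.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

def gibbs_sweep(s, W, b, clamped=()):
    """One Gibbs sweep: resample each unclamped binary unit given all the others."""
    for i in range(len(s)):
        if i in clamped:
            continue
        s[i] = float(rng.random() < sigmoid(W[i] @ s + b[i]))
    return s

def learning_step(W, b, v_data, n_visible, lr=0.05, sweeps=25):
    """Hebbian / anti-Hebbian rule: dW ~ <s s^T>_clamped - <s s^T>_free.
    (Biases follow the same rule on the unit activations.)"""
    n = W.shape[0]
    # Positive ("wake") phase: visible units clamped to the data vector.
    s = rng.integers(0, 2, n).astype(float)
    s[:n_visible] = v_data
    for _ in range(sweeps):
        gibbs_sweep(s, W, b, clamped=range(n_visible))
    pos = np.outer(s, s)
    # Negative ("dream") phase: the network runs freely and reconstructs on its own.
    s = rng.integers(0, 2, n).astype(float)
    for _ in range(sweeps):
        gibbs_sweep(s, W, b)
    neg = np.outer(s, s)
    W += lr * (pos - neg)
    np.fill_diagonal(W, 0.0)   # no self-connections; W stays symmetric
    return W

# Tiny usage example: 6 visible + 4 hidden units, a single binary data vector.
n_vis, n_hid = 6, 4
W = np.zeros((n_vis + n_hid, n_vis + n_hid))
b = np.zeros(n_vis + n_hid)
v = np.array([1, 0, 1, 1, 0, 1], dtype=float)
for _ in range(100):
    W = learning_step(W, b, v, n_vis)
```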
A fast learning algorithm for deep belief nets

The moment "Deep Learning" became viable. Hinton’s breakthrough showed that stacks of Restricted Boltzmann Machines (RBMs) could be trained efficiently, sparking the modern neural network revolution and overcoming the vanishing gradient problem.

Get the PDF >
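As a rough illustration of the building block involved (a sketch under simplifying assumptions, not the paper's exact procedure), one contrastive-divergence (CD-1) update for a binary RBM looks like the following; a deep belief net is then assembled greedily, using each trained layer's hidden activations as the data for the next layer.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

def cd1_update(W, b_v, b_h, v0, lr=0.1):
    """One contrastive-divergence (CD-1) step for a binary RBM.
    v0: batch of visible vectors, shape (batch, n_visible)."""
    # Up pass: sample hidden units from the data.
    p_h0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
    # Down pass and back up: one step of reconstruction.
    p_v1 = sigmoid(h0 @ W.T + b_v)
    v1 = (rng.random(p_v1.shape) < p_v1).astype(float)
    p_h1 = sigmoid(v1 @ W + b_h)
    # Gradient estimate: <v h>_data - <v h>_reconstruction.
    batch = v0.shape[0]
    W   += lr * (v0.T @ p_h0 - v1.T @ p_h1) / batch
    b_v += lr * (v0 - v1).mean(axis=0)
    b_h += lr * (p_h0 - p_h1).mean(axis=0)
    return W, b_v, b_h

# Example: 6 visible units, 3 hidden units, a batch of 2 binary vectors.
W = 0.01 * np.random.default_rng(1).normal(size=(6, 3))
b_v, b_h = np.zeros(6), np.zeros(3)
batch = np.array([[1, 0, 1, 1, 0, 1], [0, 1, 1, 0, 1, 0]], dtype=float)
for _ in range(200):
    W, b_v, b_h = cd1_update(W, b_v, b_h, batch)
```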
Attention is all you need

Google researchers introduce the Transformer architecture. By replacing recurrence with self-attention, the industry gained the ability to process massive datasets in parallel, sparking the LLM explosion and leaving recurrent architectures behind.

Get the PDF >
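The core operation is scaled dot-product self-attention. The sketch below shows it in plain NumPy; the shapes and the tiny random example are illustrative only.

```python
import numpy as np

def self_attention(X, W_q, W_k, W_v):
    """Scaled dot-product self-attention over a sequence.
    X: (seq_len, d_model); W_q / W_k / W_v: (d_model, d_k) projection matrices."""
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    scores = Q @ K.T / np.sqrt(K.shape[-1])            # pairwise token affinities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over the sequence
    return weights @ V                                  # every token attends to every other token

# Tiny usage example with random projections.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 16))                            # 5 tokens, d_model = 16
W_q, W_k, W_v = (rng.normal(size=(16, 8)) for _ in range(3))
out = self_attention(X, W_q, W_k, W_v)                  # shape (5, 8)
```

Because every pair of positions is scored at once, the whole sequence can be processed in a single matrix multiply rather than one step at a time, which is what makes the parallel training of very large models practical.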
Boltzmann Live-Learning Machines

The missing link in AI evolution. By making Boltzmann learning scalable and integrating it with the Transformer architecture, we’ve moved beyond static training. This paper introduces a system that learns in real-time, offering biological efficiency with transformer-scale power.

Get the PDF >
The Nobel Prize-winning AI architecture
Learn more

We conduct research in the foundations of AI for groundbreaking innovation.

Boltzbit lab was founded by top AI researchers to advance fundamental AI research. We specialise in some of the most challenging areas of Generative AI, particularly learning and inference algorithms.

We believe that only this level of research can meaningfully expand the frontier of AI technology and lead to fundamental breakthroughs.


Features
View all

Chat

Our most intelligent LLM to date
Try it with
Initial idea >

Space

A family of easy-to-use office applications
Try it with
Initial idea >

LLMs

Create a custom LLM specifically for your needs
Try it with
Initial idea >

API

Integrate everything, anywhere you need it
Try it with
Initial idea >
Research papers
Research
10/10/2025

A Gradient Based Strategy for Hamiltonian Monte Carlo Hyperparameter Optimization

This paper presents a gradient-based framework for hyperparameter optimisation in Hamiltonian Monte Carlo, directly maximising convergence speed by optimising the expected log-target under the chain’s final state.

Learn more >
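For readers unfamiliar with HMC, the sketch below shows a single standard HMC transition. The arguments `step_size` and `n_leapfrog` are exactly the kind of hyperparameters a gradient-based tuning scheme adjusts; the paper's optimisation procedure itself is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def hmc_step(q, log_p, grad_log_p, step_size, n_leapfrog):
    """One Hamiltonian Monte Carlo transition for a target density exp(log_p).
    `step_size` and `n_leapfrog` are the hyperparameters a tuning scheme adjusts."""
    p = rng.normal(size=q.shape)                     # resample momentum
    q_new, p_new = q.copy(), p.copy()
    # Leapfrog integration of the Hamiltonian dynamics.
    p_new += 0.5 * step_size * grad_log_p(q_new)
    for _ in range(n_leapfrog - 1):
        q_new += step_size * p_new
        p_new += step_size * grad_log_p(q_new)
    q_new += step_size * p_new
    p_new += 0.5 * step_size * grad_log_p(q_new)
    # Metropolis accept/reject corrects the discretisation error.
    h_old = -log_p(q) + 0.5 * p @ p
    h_new = -log_p(q_new) + 0.5 * p_new @ p_new
    return q_new if np.log(rng.random()) < h_old - h_new else q

# Example: sample a standard Gaussian in 3 dimensions.
log_p = lambda q: -0.5 * q @ q
grad_log_p = lambda q: -q
q = np.zeros(3)
samples = [q := hmc_step(q, log_p, grad_log_p, step_size=0.2, n_leapfrog=10)
           for _ in range(1000)]
```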
Research
10/10/2023

Quasi-Newton Methods for Markov Chain Monte Carlo

A novel MCMC sampler makes use of quasi-Newton approximations from the optimization literature, which approximate the Hessian of the target distribution using the previous samples and gradients generated by the sampler.

Learn more >
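As background (and not necessarily the paper's exact construction), a quasi-Newton idea in this spirit maintains a BFGS-style inverse-Hessian approximation from successive positions and gradients along the chain, which can then precondition the sampler's proposals.

```python
import numpy as np

def bfgs_inverse_hessian_update(H, x_old, x_new, g_old, g_new):
    """BFGS-style update of an inverse-Hessian approximation H, built from the
    position and gradient differences produced along a chain.
    g_* are gradients of the negative log target, so H captures local curvature."""
    s = x_new - x_old
    y = g_new - g_old
    sy = s @ y
    if sy <= 1e-12:                      # skip the update if the curvature condition fails
        return H
    rho = 1.0 / sy
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# A preconditioned proposal could then draw
#   x' = x + L @ z,  z ~ N(0, I),  with L the Cholesky factor of H,
# so steps are long in flat directions of the target and short in steep ones.
```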
Research
11/1/2022

Continuous Relaxations for Discrete Hamiltonian Monte Carlo

This article discusses recent advances in the transformation of Boltzmann machines for gradient-based Hamiltonian Monte Carlo inference.

Learn more >
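As a simple illustration of the general idea of continuous relaxation (shown here with a Gumbel/Concrete-style relaxation of a binary variable, which is not necessarily the construction used in the article), a discrete spin can be replaced by a smooth variable that gradient-based samplers such as HMC can move through.

```python
import numpy as np

rng = np.random.default_rng(0)

def relaxed_bernoulli(logit, temperature=0.5, size=None):
    """Concrete/Gumbel-style continuous relaxation of a Bernoulli variable.
    As temperature -> 0 the samples concentrate on {0, 1}; at higher
    temperatures they stay smooth, so gradients are available everywhere."""
    u = rng.uniform(1e-6, 1 - 1e-6, size=size)
    logistic_noise = np.log(u) - np.log1p(-u)
    return 1.0 / (1.0 + np.exp(-(logit + logistic_noise) / temperature))

# Example: relax a binary spin with P(s = 1) ~ 0.73 (logit = 1.0).
samples = relaxed_bernoulli(logit=1.0, temperature=0.3, size=10_000)
print(samples.mean())   # close to the discrete mean, yet every sample is continuous
```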
© All rights reserved Boltzbit 2026