Harmoniq: What a Quantum State Sounds Like
I built a quantum synth. It’s called Harmoniq, it’s open source, and it lives at harmoniq.quantum.dev. You build a quantum circuit by placing gates on a grid, hit play, and the quantum state turns into sound.
It’s also a community project. The source is on GitHub under MIT, we’re accepting contributions, and we’d love to see what people do with it. If you’re interested, join us on Discord.
I want to talk about why I built it, how the sonification actually works, and what some of this stuff sounds like. If you want to skip ahead and just play with it, the link is up there.
Why audio
Most of the work I do on quantum games is about translating quantum mechanics into things you can see and touch. Pieces in superposition, boards that get entangled, measurement collapses you can watch happen. It works, mostly. People build intuition that way.
But I’ve always been a little frustrated by how much information gets lost when you flatten a quantum state down to a visual. A two-qubit state has four complex amplitudes. By the time you’ve reduced it to “this piece is 70% likely to exist,” you’ve thrown away phase and entanglement, the stuff that makes quantum… quantum. Visuals can hint at all of that, but they struggle to show it without becoming dense and unreadable.
Sound is different. Sound is inherently parallel. You can hear a chord and pick out the individual tones, but you also hear the whole thing as one gestalt. Volume, pitch, timing, timbre, panning, harmonic content: it’s a multi-dimensional channel that humans have spent their whole lives learning to parse. And quantum states are also multi-dimensional. The mapping isn’t free, but the pieces fit better than I expected.
I also wanted to do something a little weird with the public release of Quantum Forge. Harmoniq is the first thing I’ve open sourced on top of the public npm package. It’s not a game, it’s not a tutorial, it’s not a benchmark. It’s an experiment in what else this stuff is good for. I think there’s a lot more.
What it actually does
You drop gates onto a circuit grid. Hadamard, X, Z, T, CNOT, CZ, iSWAP. There’s a playhead, a red line that scrubs through the circuit one step at a time. At each step, Quantum Forge evaluates the state of the circuit so far, and Harmoniq turns that state into sound.
The sound model is built around continuous oscillators. Each basis state of an n-qubit system gets its own oscillator, and the oscillators are always running. What changes is their volume. As the playhead moves and the quantum state evolves, the volumes ramp smoothly to match the new probability distribution. With the decay knob all the way down, the sound is sustained and drone-like; the chord just morphs from one shape to the next as the state evolves. The sound is the state.
Turn the decay up and the same engine becomes percussive. Each circuit step retriggers the gain envelope, so instead of a continuous morph you get the chord for the current quantum state struck at each gate position, fading out before the next one hits. Same mapping, same oscillators, just a different envelope shape. The decay knob is the difference between an ambient quantum drone and a sequenced quantum arpeggio.
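To make the engine concrete, here’s a minimal sketch of the oscillator-bank idea in plain Web Audio. The names and structure are mine, not Harmoniq’s actual code; the square-root gain mapping is explained in the next section, and the envelope details are in the implementation notes near the end.

```js
// One always-running oscillator + gain node per basis state; only the
// gains change as the quantum state evolves. (Sketch; names are mine.)
const ctx = new AudioContext();

function makeVoice(freq) {
  const osc = ctx.createOscillator();
  const gain = ctx.createGain();
  osc.frequency.value = freq;
  gain.gain.value = 0;                  // silent until the state says otherwise
  osc.connect(gain).connect(ctx.destination);
  osc.start();                          // never stopped; we only modulate gain
  return gain.gain;                     // the AudioParam we'll ramp
}

// 3 qubits -> 8 basis states -> 8 voices. Frequencies come from the
// scale mapping described in the next section; C major shown here.
const frequencies = [261.63, 293.66, 329.63, 349.23, 392.0, 440.0, 493.88, 523.25];
const voiceGains = frequencies.map(makeVoice);

// Called at each playhead step with the new probability distribution.
function updateVoices(probabilities) {
  const now = ctx.currentTime;
  voiceGains.forEach((g, i) => {
    g.cancelScheduledValues(now);
    g.setValueAtTime(g.value, now);
    g.linearRampToValueAtTime(Math.sqrt(probabilities[i]), now + 0.015);
  });
}
```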
Visuals run alongside the audio. Probability bars, a state vector readout with phases, a real-time waveform oscilloscope, entanglement and correlation meters. It’s a synth and a quantum state inspector at the same time. The visuals are there to help you correlate what you’re hearing with what’s actually going on in the circuit.
The mapping
There are two pieces to the sonification: mapping basis states to notes, and mapping probabilities to volumes.
For notes, each basis state gets assigned a pitch from a configurable musical scale. C major, pentatonic, blues, whole tone, dorian, chromatic, a few others. State |000⟩ gets the root, |001⟩ gets the next note in the scale, and so on. The number of tones grows exponentially with qubits, which is appropriate, because the quantum state space does too. Three qubits gives you eight tones. Four gives you sixteen.
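In code, the note assignment is just indexing into a scale table and converting scale degrees to frequencies with equal temperament. A sketch; the interval patterns are standard, though Harmoniq’s actual tables and tuning may differ:

```js
// Scale intervals in semitones from the root (one octave's worth).
const SCALES = {
  major:      [0, 2, 4, 5, 7, 9, 11],
  pentatonic: [0, 2, 4, 7, 9],
  wholeTone:  [0, 2, 4, 6, 8, 10],
  chromatic:  [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11],
};

// Basis state index -> frequency. |000⟩ is index 0 and gets the root;
// higher indices walk up the scale, wrapping into higher octaves.
function basisStateToFreq(index, scaleName = 'major', rootHz = 261.63 /* C4 */) {
  const scale = SCALES[scaleName];
  const octave = Math.floor(index / scale.length);
  const degree = scale[index % scale.length];
  return rootHz * Math.pow(2, octave + degree / 12);
}

// Three qubits, eight tones:
const freqs = Array.from({ length: 8 }, (_, i) => basisStateToFreq(i));
```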
For volumes, the obvious choice would be to map probability directly to gain. I tried that first and it sounded wrong. The problem is that human loudness perception is roughly logarithmic, not linear, so a state with probability 0.25 doesn’t sound half as loud as probability 0.5. It sounds much quieter. The fix is to use the square root of the probability, which is also the magnitude of the amplitude in the state vector. That gives a perceptually linear loudness curve, and as a bonus it lines up with how amplitudes actually compose under quantum operations. Probability is just a derived quantity; amplitude is the real thing.
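The fix itself is one line, shown below. A pleasant side effect: because the probabilities sum to 1, the squared gains always sum to 1 too, so the total power coming out of the oscillator bank stays roughly constant no matter how the state is spread across basis states.

```js
// Map probability to gain via the amplitude magnitude.
// A state at p = 0.25 plays at half the gain of p = 1.0, not a quarter.
function probabilityToGain(p) {
  return Math.sqrt(p);
}
```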
So: amplitude magnitude becomes loudness, basis state becomes pitch. That’s enough to sonify a pure state in one basis. But this is where it gets more interesting.
Two bases at once
The thing about quantum mechanics is that the choice of measurement basis is part of the experiment. If you have a qubit in the state (|0⟩ + |1⟩)/√2 and you measure it in the Z-basis, you get 0 or 1 with equal probability. If you measure the same state in the X-basis, you get + with certainty. Same state, completely different outcomes; what changed is the question you asked.
Harmoniq sonifies the quantum state in two bases simultaneously. The Z-basis runs through one set of oscillators (sine waves by default). The X-basis runs through a separate set, by default an octave up and with a triangle waveform. They mix together at the master output. You hear both at once.
The reason this matters is that a lot of what’s interesting about a quantum state is invisible if you only look at one basis. Apply a phase gate (Z, T) to a qubit in superposition. Nothing changes in the Z-basis. The probabilities of |0⟩ and |1⟩ are still 50/50, the volumes don’t move, the visualization doesn’t budge. But the X-basis is doing something completely different. Phase gates rotate amplitudes in the complex plane, and rotations in one basis are visible as probability shifts in the rotated basis.
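You can check this with a few lines of complex arithmetic, no simulator required. A self-contained single-qubit sketch:

```js
// A T gate changes nothing in the Z basis but shifts the X-basis
// probabilities. Complex amplitudes as [re, im] pairs.
const mul   = ([a, b], [c, d]) => [a * c - b * d, a * d + b * c];
const add   = ([a, b], [c, d]) => [a + c, b + d];
const scale = ([a, b], k) => [a * k, b * k];
const prob  = ([re, im]) => re * re + im * im;

// Start in (|0⟩ + |1⟩)/√2, then apply T: multiply the |1⟩ amplitude
// by e^{iπ/4}.
const s  = 1 / Math.SQRT2;
const a0 = [s, 0];
const a1 = mul([s, 0], [Math.cos(Math.PI / 4), Math.sin(Math.PI / 4)]);

// Z basis: still 50/50. The cyan tones don't move.
console.log(prob(a0), prob(a1));                 // 0.5 0.5

// X basis: run the amplitudes through a Hadamard and look again.
const plus  = scale(add(a0, a1), s);             // (a0 + a1)/√2
const minus = scale(add(a0, scale(a1, -1)), s);  // (a0 - a1)/√2
console.log(prob(plus), prob(minus));            // ≈ 0.854, ≈ 0.146
```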
In Harmoniq, this means a phase gate is silent in the cyan tones (Z-basis) and loud in the gold tones (X-basis). You can hear the underlying phase. The thing that’s invisible from one direction is the loudest thing from another.
To read the X-basis without collapsing the state, the simulator applies a Hadamard to every qubit, reads the probabilities, and then reverses the Hadamard. This is the standard trick for measuring in a rotated basis, but it’s cool to be able to do it inside an audio loop running at 60 frames per second. Quantum Forge is fast enough that the overhead is invisible.
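In simulator terms the trick looks roughly like this. The method names are illustrative, not the actual Quantum Forge API; treat it as pseudocode over the API shape described later in this post:

```js
// Read X-basis probabilities without collapsing the state.
// (Illustrative method names, not the real Quantum Forge surface.)
function readXBasisProbabilities(qpm, qubits) {
  for (const q of qubits) qpm.applyGate('H', q);  // rotate X into Z
  const probs = qpm.getProbabilities();           // read in the Z basis
  for (const q of qubits) qpm.applyGate('H', q);  // H is its own inverse: undo
  return probs;
}
```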
What you can hear
Some specific things to listen for once you start playing with it.
Superposition sounds like a chord. A single Hadamard on one qubit takes you from a single note to a two-note chord. The tones don’t fade in or beat against each other; they just appear, equally loud. It sounds like flipping a switch from “one thing” to “two things at the same time,” because that’s what it is.
Entanglement sounds like two tones that move together. The GHZ state on three qubits, (|000⟩ + |111⟩)/√2, only has two non-zero basis states, so you hear two notes. But the interesting thing isn’t the chord; it’s that those two notes are linked. They’re not two independent tones. They’re one quantum object that happens to be expressing itself as two notes. The visualization shows it (entanglement entropy goes up), but the audio gives you something the visualization can’t, which is the perceptual quality of “these belong together.”
Phase gates are quiet in Z and loud in X. Apply a T gate to a qubit in superposition and nothing changes in the cyan tones. But the gold tones shift; new notes appear, old ones get quieter. The state didn’t get more “uncertain” in any classical sense. It rotated. You can hear the rotation as a basis-dependent shift.
Walking through basis states sounds like an arpeggio. The “Scale” preset uses X gates to step through the basis states one at a time, which gives you the assigned scale played sequentially. Harmonically boring, but useful for hearing how the basis-state-to-note mapping works.
The “Full Superposition” preset puts H on every qubit, which gives you equal probability across all 2^n states. In Z-basis you get a chord with every tone active at equal loudness; in X-basis you get a single tone (the all-zeros basis state in X). It’s a great demonstration of how the same state looks completely different in two bases.
The tech
Harmoniq is vanilla JavaScript. No React, no Vue, no Svelte. Vite for the build, Web Audio API for the sound, Quantum Forge for the quantum simulation, hand-written HTML and CSS for everything else. The whole thing is a few thousand lines.
I went vanilla for two reasons. First, I wanted the published version to be a reasonable starting point for anyone who wants to fork it and try their own sonification ideas. Less framework lock-in, less to learn. Second, the audio loop runs at 60 frames per second and updates oscillator gains continuously, and I didn’t want to fight a framework’s reconciliation cycle for that. Direct DOM and direct Web Audio API calls are simpler and faster.
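The per-frame loop has roughly this shape. Every name here is a stand-in: the voice helpers are the sketches from earlier, the cached evaluator is described in the implementation notes below, and drawProbabilityBars is hypothetical.

```js
// Render loop sketch: read the state, ramp gains, redraw. Direct Web
// Audio and DOM calls; nothing for a framework to reconcile.
function frame() {
  const state = evaluator.stateAt(circuit, playhead.step); // cached prefix, see below
  updateVoices(zBasisProbabilities(state)); // ramp oscillator gains (sketch above)
  drawProbabilityBars(state);               // hypothetical direct-DOM draw call
  requestAnimationFrame(frame);
}
requestAnimationFrame(frame);
```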
The quantum side runs on the public Quantum Forge npm package. Same WASM module that powers Quantum Chess, Quantris, Ponq, and Bloch Invaders. You get a QuantumPropertyManager, you allocate qubit properties, you apply gates, you read probabilities or density matrices, you release the properties when you’re done.
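Against that lifecycle, a Bell-state sketch would look roughly like this. I’m paraphrasing the flow from the description above; the actual import path, method names, and signatures live in the package docs, so treat all of them as assumptions:

```js
// Lifecycle sketch with illustrative names: allocate, apply gates,
// read probabilities, release.
import { QuantumPropertyManager } from 'quantum-forge'; // hypothetical package id

const qpm = new QuantumPropertyManager();
const [a, b] = qpm.allocateQubits(2);

qpm.applyGate('H', a);        // put qubit a in superposition
qpm.applyGate('CNOT', a, b);  // entangle: (|00⟩ + |11⟩)/√2

const probs = qpm.getProbabilities(); // feed these into the oscillator bank
qpm.releaseQubits([a, b]);
```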
The important thing about Quantum Forge isn’t just that it’s fast or that it runs in the browser. It’s that the API enforces modes of interaction that are compatible with real quantum hardware. You can’t copy a quantum state. You can’t inspect amplitudes without measuring. The operations you’re allowed to perform are the operations that actual quantum processors support. This is the same principle that let us run Quantum Chess on a Google Sycamore processor. The game didn’t need to change; the backend did. Harmoniq is built the same way. The circuits you build in it today could, in principle, run on real hardware with the same API and the same code. What you’d hear would be different, because real measurement is genuinely random rather than seeded, but the structure of the sonification would carry over unchanged.
This is the first thing I’ve open sourced that uses the public package, so if you’ve been wanting a working example of how to actually wire Quantum Forge into a real app, the source is right there.
A couple of implementation notes for anyone who looks at the code:
The circuit evaluator caches state. As the playhead moves forward, it doesn’t rebuild the circuit from scratch each step. It just applies the new gates on top of the cached state. When you edit the circuit or scrub backward, the cache invalidates and the relevant prefix gets replayed. Without the cache, every playhead move would mean re-simulating the whole circuit, which wouldn’t hold up in an interactive synth.
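A sketch of the caching logic, simplified and with my own names rather than the repo’s:

```js
// Prefix cache for circuit evaluation. `initialState` and `applyStep`
// are injected so the cache stays simulator-agnostic.
class CircuitEvaluator {
  constructor({ initialState, applyStep }) {
    this.initialState = initialState;
    this.applyStep = applyStep;
    this.invalidate();
  }

  // State after executing circuit columns 0..step (inclusive).
  stateAt(circuit, step) {
    if (step < this.cachedStep) this.invalidate(); // backward scrub: replay prefix
    if (this.cachedStep < 0) this.state = this.initialState();
    for (let s = this.cachedStep + 1; s <= step; s++) {
      this.state = this.applyStep(this.state, circuit[s]); // only the new gates
    }
    this.cachedStep = step;
    return this.state;
  }

  invalidate() { // call on any circuit edit
    this.cachedStep = -1;
    this.state = null;
  }
}
```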
The audio engine pre-allocates oscillators per basis state and only modulates their gain. Creating and destroying oscillators on every step would click and pop and use way too much CPU. The trick is to use linear ramps for attack (15ms, fast enough to feel instant but slow enough to avoid pops) and exponential ramps for decay (variable, controlled by a knob). When the qubit count changes, the old oscillators fade out over 20 milliseconds before getting cleaned up.
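The ramp scheduling itself is plain Web Audio. Roughly, per voice (constants from above; exponential ramps can’t reach zero, hence the floor):

```js
// Retrigger one voice's envelope at a circuit step (simplified).
const FLOOR = 1e-4; // exponentialRampToValueAtTime forbids zero targets

function trigger(gainParam, level, decaySeconds, now) {
  gainParam.cancelScheduledValues(now);
  gainParam.setValueAtTime(Math.max(gainParam.value, FLOOR), now);
  gainParam.linearRampToValueAtTime(Math.max(level, FLOOR), now + 0.015); // 15 ms attack
  if (decaySeconds > 0) {
    gainParam.exponentialRampToValueAtTime(FLOOR, now + 0.015 + decaySeconds);
  }
  // With decay at zero, the gain holds at `level`: the drone mode.
}
```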
What I want this to be
Harmoniq is a small thing. It’s an experiment, not a product. But I think the broader idea, that you can take a real quantum simulator and use it as the heart of something that isn’t a game and isn’t an algorithm benchmark, is worth more attention than it gets. Quantum Forge is fast, deterministic, and runs in the browser. There’s a lot of stuff you could build with it that nobody has built yet, partly because the audience for “build something with a quantum simulator” has historically been physicists and computer scientists, and the audience for “build something interesting in the browser” has historically been everyone else.
The point of opening up Quantum Forge as a public npm package was to bridge that gap. Harmoniq is one example of what happens when you do. I’d love to see more.
I want Harmoniq to be a community project. The repo is open on GitHub under MIT, and we’re accepting contributions. If you have ideas for new sonification modes, new gates, new visualizations, better scales, accessibility improvements, whatever, open an issue or a PR. If you’re interested in Quantum Forge but haven’t had a project to try it on, this is a good one. The codebase is small, the feedback loop is immediate (you can hear whether your change worked), and the quantum side is well-contained.
We’re also building a broader community of quantum game and app developers on Discord. If you’re experimenting with Quantum Forge, whether it’s sonification or something else entirely, come share what you’re working on.
harmoniq.quantum.dev. Headphones recommended.