Michaela Eichinger, PhD

29 posts

@drmichaela_e

Product Solutions Physicist @Quantum Machines. Writing on scalable quantum systems.

Switzerland · Joined October 2020
192 Following · 145 Followers
Michaela Eichinger, PhD@drmichaela_e·
Just because you're a physicist working in quantum doesn't mean you're an expert in quantum applications and quantum advantage. It's a bitter truth, and one that seems to be ignored a lot.

Have I been in a situation where I was asked about the applications that quantum computers can enable? I surely have. Have I been uncomfortable answering those questions? Even bigger yes.

The problem is that there is a lot to learn between QPU hardware, cryo, control, QEC, and the application layer. And it is no simple task to gather a broad overview of where the field is at. At least, one would hope that papers coming out from the leading quantum groups on algorithms and advantage could be trusted. But I recently learned from a quantum chemist who has worked in quantum computing for a few years that one should be 𝗘𝗫𝗧𝗥𝗘𝗠𝗘𝗟𝗬 cautious. I was already skeptical about new advantage claims before this conversation, but now even more so. Apparently there are many quantum chemists out there who roll their eyes massively when "quantum computing will allow for new drug discovery" phrases drop. And they can't believe what classical, outdated techniques are often used as comparison benchmarks.

I'm taking it as a wake-up call: to be more skeptical, but also more deliberate about expanding my knowledge up the stack, from what are, let's say, unnatural angles for a physicist. How do you approach learning about quantum algorithms and applications, especially when your natural home is lower in the stack?
Michaela Eichinger, PhD@drmichaela_e·
If you think @Microsoft is only focusing on topological qubits, you missed the bigger picture. "We work with every platform on the planet," says Zulfi Alam (CVP, Microsoft Quantum).

This might surprise those who thought Microsoft was only betting on the topological "moonshot". While they are absolutely still engineering the Majorana chips, they have been focusing on another essential part of building a quantum processor: 𝗼𝗿𝗰𝗵𝗲𝘀𝘁𝗿𝗮𝘁𝗶𝗼𝗻.

Microsoft isn't waiting for the perfect hardware to arrive. They are building their version of a Quantum OS. Like any enterprise giant, Microsoft is playing a long game: if their own hardware takes longer, they aim to own the ecosystem that everyone else's hardware runs on.
Michaela Eichinger, PhD@drmichaela_e·
You need people who understand:
• Statistical estimation techniques
• Optimization algorithms
• Real-time systems orchestration
• Hardware-centric logic

These are software engineering and computer science skills. But look at what we're teaching: quantum mechanics, circuit QED, pulse engineering, decoherence mechanisms. All critical. But statistical estimation? Optimization on FPGAs? Real-time control? We're not teaching them.

And if your team doesn't have that expertise, you're building a lab setup that publishes papers. Not a computer that runs.
Michaela Eichinger, PhD@drmichaela_e·
A computer doesn't run calibration sequences between every operation and hope the parameters hold. It measures, detects drift, corrects in real time, and keeps running. That requires closed-loop feedback operating fast enough to track fluctuations that happen on timescales shorter than your calibration routines. And that's a completely different problem.
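The loop described here can be sketched in a few lines. A minimal toy model (all numbers and function names are illustrative, not from any real controller): a parameter drifts as a random walk, and a proportional feedback update tracks it between operations instead of halting for a full recalibration.

```python
import random

def run_closed_loop(n_steps=1000, gain=0.5, drift_scale=0.01, seed=0):
    """Toy closed-loop tracker: a control parameter drifts as a random
    walk; each step we 'measure' the residual error and apply a
    proportional correction instead of pausing for recalibration."""
    rng = random.Random(seed)
    drift = 0.0        # true offset between setpoint and hardware
    correction = 0.0   # what the controller currently applies
    max_residual = 0.0
    for _ in range(n_steps):
        drift += rng.gauss(0.0, drift_scale)   # slow environmental drift
        residual = drift - correction          # error seen by a probe measurement
        correction += gain * residual          # feedback update, no downtime
        max_residual = max(max_residual, abs(residual))
    return max_residual

# With feedback the residual stays bounded near the per-step drift;
# with gain=0 (no feedback) the error random-walks away.
assert run_closed_loop(gain=0.5) < run_closed_loop(gain=0.0)
```

The point of the sketch: the feedback version never stops to recalibrate, yet its worst-case error stays pinned to the single-step drift scale, while the open-loop version accumulates error without bound.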
Michaela Eichinger, PhD@drmichaela_e·
I am scared for your calibration stack. Because the skills we've been teaching most grad students and quantum engineers will not be enough to turn a multi-qubit chip into a high-performing quantum processor.
Michaela Eichinger, PhD@drmichaela_e·
How do you prove a quantum computer did something a classical computer genuinely cannot do? It sounds simple. It's not.

𝗧𝗵𝗲 𝗣𝗿𝗼𝗯𝗹𝗲𝗺: By definition, a "quantum advantage" task is too hard to simulate classically. If you can't simulate it, you can't verify the answer is correct. Most claims fall into the "𝗘𝘅𝘁𝗿𝗮𝗽𝗼𝗹𝗮𝘁𝗶𝗼𝗻 𝗧𝗿𝗮𝗽": trusting that because a quantum computer works on small, checkable systems, it's still working when we scale it into the "dark".

𝗧𝗵𝗲 𝗛𝗶𝘀𝘁𝗼𝗿𝘆:
• @GoogleQuantumAI (𝟮𝟬𝟭𝟵): Used Random Circuit Sampling. It was a breakthrough, but verification (XEB) becomes exponentially expensive at scale.
• @IBM (𝟮𝟬𝟮𝟯): Used a "utility" approach on a 127-qubit problem. It sparked a massive debate: did they beat classical computing, or just the specific classical methods they tried?
• 𝗦𝗰𝗼𝘁𝘁 𝗔𝗮𝗿𝗼𝗻𝘀𝗼𝗻 famously called this "supremacy theater". Scientifically real, but not yet "useful".

𝗔 𝗡𝗲𝘄 𝗦𝗼𝗹𝘂𝘁𝗶𝗼𝗻 𝗯𝘆 @BlueQubitIO: Instead of a totally random circuit that looks like noise, they engineer 𝗣𝗲𝗮𝗸𝗲𝗱 𝗖𝗶𝗿𝗰𝘂𝗶𝘁𝘀: they "hide" a single, specific bitstring (s*) that has an anomalously high probability of appearing (e.g., a 10% peak).

𝗪𝗵𝘆 𝘁𝗵𝗶𝘀 𝗶𝘀 𝗶𝗻𝘁𝗲𝗿𝗲𝘀𝘁𝗶𝗻𝗴 𝗳𝗼𝗿 𝘃𝗲𝗿𝗶𝗳𝗶𝗰𝗮𝘁𝗶𝗼𝗻:
• 𝗜𝗻𝘀𝘁𝗮𝗻𝘁 𝗦𝘂𝗰𝗰𝗲𝘀𝘀 𝗖𝗵𝗲𝗰𝗸: The person who builds the circuit knows the "peak" bitstring in advance.
• 𝗡𝗼 𝗦𝘂𝗽𝗲𝗿𝗰𝗼𝗺𝗽𝘂𝘁𝗲𝗿 𝗡𝗲𝗲𝗱𝗲𝗱: You run the circuit 1,000 times on quantum hardware. If the peak bitstring appears ~100 times, you've verified the hardware is working.
• 𝗖𝗹𝗮𝘀𝘀𝗶𝗰𝗮𝗹 𝗛𝗮𝗿𝗱𝗻𝗲𝘀𝘀: To an attacker, the circuit looks like random noise. So-called "identity obfuscation" (swaps, sweeps, and masks) can be used to hide the structure so classical simulators can't find the shortcut.

𝗧𝗵𝗲 𝗥𝗲𝘀𝘂𝗹𝘁𝘀: Recent demonstrations on @QuantinuumQC's 𝗛𝟮 𝗽𝗿𝗼𝗰𝗲𝘀𝘀𝗼𝗿 (56 qubits, all-to-all connectivity) show a massive "Heuristic Advantage":
• 𝗤𝘂𝗮𝗻𝘁𝘂𝗺 𝗥𝘂𝗻 𝗧𝗶𝗺𝗲: Under 2 hours.
• 𝗖𝗹𝗮𝘀𝘀𝗶𝗰𝗮𝗹 𝗥𝘂𝗻 𝗧𝗶𝗺𝗲: Leading techniques (tensor networks, Pauli path simulators) are estimated to take 𝘆𝗲𝗮𝗿𝘀 on exascale supercomputers like Frontier.

Looks like we are moving out of the theater, no?
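To see how cheap the verification step is, here is a toy sketch of the verifier's side (hypothetical 8-bit bitstring and thresholds, not BlueQubit's actual circuits): all it has to do is count how often the planted s* shows up.

```python
from collections import Counter
import random

def verify_peaked_run(samples, peak_bitstring, expected_peak=0.10, tol=0.05):
    """Verifier's check for a peaked-circuit run: the planted bitstring s*
    should appear at roughly its engineered probability; a noisy or
    spoofed device returns it at the uniform rate ~2^-n instead."""
    freq = Counter(samples)[peak_bitstring] / len(samples)
    return freq >= expected_peak - tol

# Hypothetical 8-bit example: a 'working' device emits the planted
# string s* about 10% of the time, the rest near-uniform noise.
rng = random.Random(1)
s_star = "10110010"
samples = [s_star if rng.random() < 0.10
           else format(rng.getrandbits(8), "08b")
           for _ in range(1000)]
assert verify_peaked_run(samples, s_star)
```

No simulation anywhere: the check is a single frequency count over the shots, which is exactly why no supercomputer is needed on the verification side.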
Michaela Eichinger, PhD@drmichaela_e·
As quantum computing scales, the challenges extend beyond qubit fidelity and error rates. Incorporating non-Clifford gates, the key to universal quantum computation, places extraordinary demands on control systems. Clifford gates form the backbone of error correction, but non-Clifford gates, such as the T gate, are essential to surpass their limitations and enable algorithms like Shor's factorization or advanced quantum simulations.

Fault-tolerant implementations of non-Clifford circuits require in particular:
➡️ decoding-dependent quantum gates (feed-forward)
➡️ ultra-low controller-decoder latencies
➡️ seamless integration of classical and quantum components for real-time operation

At @QuantumQM, we explored these requirements in depth:
➡️ We analyzed execution of Shor's algorithm for factoring 21, mapping it from logical circuits to the surface code and finally to the physical level.
➡️ We showed that decoding must run in parallel, with latency requirements as low as tens of microseconds.
➡️ We demonstrated that a pulse-processor approach meets these demands, enabling scalable, fault-tolerant computation for 1,000 qubits and beyond.
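Why decoding must keep pace with the syndrome cycle can be shown with a toy backlog model (illustrative microsecond numbers, not figures from the QM analysis): if decoding one round takes longer than the round itself, the feed-forward latency grows without bound.

```python
def decoder_backlog(rounds=1000, syndrome_period_us=1.0, decode_time_us=1.2):
    """Toy model of the decoder-throughput requirement: syndrome data
    arrives every QEC round; if decoding a round takes longer than the
    round itself, the unprocessed backlog (and hence the latency of any
    decode-dependent feed-forward gate) grows without bound."""
    backlog_us = 0.0
    for _ in range(rounds):
        backlog_us += decode_time_us - syndrome_period_us
        backlog_us = max(backlog_us, 0.0)  # an idle decoder can't go negative
    return backlog_us

# A decoder slower than the syndrome cycle accumulates latency linearly;
# one that keeps pace (or runs in parallel) stays bounded at zero.
assert decoder_backlog(decode_time_us=1.2) > 100.0
assert decoder_backlog(decode_time_us=0.8) == 0.0
```

Parallelizing the decoder is what turns the per-round decode time into an effective rate that stays below the syndrome period, which is the point of the second bullet above.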
Michaela Eichinger, PhD@drmichaela_e·
The first time I saw machine learning applied to quantum computing was during my time at the Niels Bohr Institute in Copenhagen. Anasua Chatterjee and her team were exploring AI-driven methods to automate the tune-up of spin qubits. I didn't pay much attention at the time.

Fast forward to today, and AI feels like the secret sauce accelerating almost every aspect of quantum computing. Quantum computing is all about mastering exponentially complex systems. AI thrives in high-dimensional, data-rich environments. The pairing is like finding the perfect dance partner.

What's exciting: AI isn't just helping debug or optimize. It's diving deep into core quantum research. It's designing qubits, discovering novel error correction codes, and making circuit synthesis far more efficient. Tasks that once took research teams weeks are now becoming automated, adaptive, and scalable.

One example I particularly like is AI-enhanced quantum error correction. Researchers are using neural networks and transformers to achieve error rates below those of traditional methods, at a fraction of the computational cost. Another promising direction is quantum feedback control using transformers. This approach could transform how we stabilize and steer quantum systems in real time by leveraging AI to predict and counteract noise.
Michaela Eichinger, PhD@drmichaela_e·
Ever wondered what role resonators play in superconducting qubits?

Resonators are typically used as readout components in quantum computing, serving as intermediaries between the quantum and classical worlds. A resonator is a circuit element that stores energy at a specific frequency. When paired with a qubit, it forms a hybridized system, enabling dispersive readout—the most common method for measuring qubit states.

In the dispersive regime, the qubit and resonator are coupled but operate at different frequencies. Instead of exchanging energy, the qubit slightly shifts the resonator's frequency depending on whether it is in the |0⟩ or |1⟩ state. By sending a microwave tone through the resonator, we can measure this shift and infer the qubit's state without directly disturbing it.

However, achieving high-fidelity readout is far from straightforward. The process must be fast enough to support high-throughput quantum operations while minimizing errors and avoiding back-action that could disturb the qubit. This balance requires careful tuning of the coupling between the qubit, resonator, and feedline. Too much coupling risks qubit decoherence, while too little slows down the readout. To address this, hardware tricks like Purcell filters are used to protect qubit coherence while enabling fast and efficient readout.

Hardware is only part of the equation. On the software side, optimizing the microwave pulses used for readout is critical for improving fidelity and speed. One particularly exciting approach is reinforcement learning, which can autonomously explore the qubit-resonator landscape and design novel readout waveforms. If you're curious, Yvonne Gao's paper on this topic (arXiv:2412.04053) is a great place to dive deeper.
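A back-of-the-envelope sketch of the dispersive picture (illustrative MHz numbers and the simplified two-level formula χ ≈ g²/Δ; exact prefactors and sign conventions vary by reference): the resonator frequency is pulled up or down by χ depending on the qubit state, and readout amounts to deciding which pulled frequency the probe response is closer to.

```python
def dispersive_shift(g_mhz, delta_mhz):
    """Dispersive shift chi = g^2 / Delta in the two-level approximation:
    the resonator is pulled by +chi or -chi depending on whether the
    qubit is in |0> or |1> (sign convention chosen for illustration)."""
    return g_mhz ** 2 / delta_mhz

def infer_state(measured_freq_mhz, bare_freq_mhz, chi_mhz):
    """Assign the qubit state by which dressed resonator frequency the
    probe response is closer to."""
    f0 = bare_freq_mhz + chi_mhz   # resonator frequency for qubit |0>
    f1 = bare_freq_mhz - chi_mhz   # resonator frequency for qubit |1>
    if abs(measured_freq_mhz - f0) <= abs(measured_freq_mhz - f1):
        return 0
    return 1

# Illustrative numbers (not from the post): g = 100 MHz coupling,
# Delta = 2000 MHz qubit-resonator detuning -> chi = 5 MHz, so the
# two dressed frequencies sit 10 MHz apart around a 7000 MHz resonator.
chi = dispersive_shift(100.0, 2000.0)
assert infer_state(7004.0, 7000.0, chi) == 0
assert infer_state(6996.5, 7000.0, chi) == 1
```

The 2χ separation between the two dressed frequencies is what the readout tone has to resolve, which is why the coupling trade-off above matters: larger χ separates the states faster but costs coherence.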
Michaela Eichinger, PhD retweeted
Craig Gidney@CraigGidney·
Chevignard et al show residues also reduce the qubit cost of quantum attacks on elliptic curves: eprint.iacr.org/2026/280 The space savings are less dramatic than for factoring (1.6x instead of 6x), and they again pay a big gate-count penalty (256x), but it's very interesting.
Michaela Eichinger, PhD@drmichaela_e·
5/5 So this new work doesn't validate a qubit. It validates their error model.
Michaela Eichinger, PhD@drmichaela_e·
4/5 This isn't a failure; it's a diagnostic map. The long Z-lifetime shows external noise is well-controlled. The short X-lifetime pinpoints the real bottleneck: internal "residual coupling" of the Majorana modes.
Michaela Eichinger, PhD@drmichaela_e·
3/5 The core finding? A massive ~1000x difference in stability for their two measurement types:
🔹 Z-measurement (on one wire): 12.4 ms lifetime
🔹 X-measurement (across two wires): 14.5 µs lifetime
Michaela Eichinger, PhD@drmichaela_e·
2/5 A true qubit needs low-error operations. Their new paper reports a 16% error on a key measurement. Crucially, the test to prove their X & Z measurements anti-commute - which would confirm a qubit - is still on their to-do list. This is Milestone 1, not a finished device.
Michaela Eichinger, PhD@drmichaela_e·
1/5 @Microsoft's new Majorana paper quietly says what the headlines didn’t: They didn’t build a qubit. They built a tool to understand why they can’t - yet. Their own data proves they haven’t crossed that threshold. Here's the real story:
Michaela Eichinger, PhD@drmichaela_e·
How do you actually use reinforcement learning in quantum computing? We often hear about applying AI to quantum—but what does that really mean when you're working with real hardware? Here's how 𝗿𝗲𝗶𝗻𝗳𝗼𝗿𝗰𝗲𝗺𝗲𝗻𝘁 𝗹𝗲𝗮𝗿𝗻𝗶𝗻𝗴 (𝗥𝗟) can be used to optimize quantum gates and calibrations:

𝟭. 𝗔𝗴𝗲𝗻𝘁 → 𝗖𝗼𝗻𝘁𝗿𝗼𝗹 𝗣𝗼𝗹𝗶𝗰𝘆
A classical neural network learns a policy: it takes in some description of the circuit context (e.g. qubit rotation angles or layer structure) and outputs control parameters—like the pulse shape for a specific gate.

𝟮. 𝗘𝗻𝘃𝗶𝗿𝗼𝗻𝗺𝗲𝗻𝘁 → 𝗤𝘂𝗮𝗻𝘁𝘂𝗺 𝗛𝗮𝗿𝗱𝘄𝗮𝗿𝗲
The quantum processor executes a circuit using those parameters. The physical outcome (fidelity, population transfer, etc.) reflects how well the suggested control worked.

𝟯. 𝗙𝗲𝗲𝗱𝗯𝗮𝗰𝗸 → 𝗥𝗲𝘄𝗮𝗿𝗱 𝗦𝗶𝗴𝗻𝗮𝗹
A reward is calculated—ideally a fast, scalable proxy for gate fidelity. This could be based on outcome probabilities, expectation values, or simplified channel fidelity estimation.

𝟰. 𝗧𝗿𝗮𝗶𝗻𝗶𝗻𝗴 𝗟𝗼𝗼𝗽 → 𝗣𝗼𝗹𝗶𝗰𝘆 𝗨𝗽𝗱𝗮𝘁𝗲
The agent updates its internal parameters to improve future suggestions, gradually learning how to steer the hardware toward higher-performance operations.

This is 𝘮𝘰𝘥𝘦𝘭-𝘧𝘳𝘦𝘦 𝘭𝘦𝘢𝘳𝘯𝘪𝘯𝘨—meaning the agent doesn't need a detailed understanding of the quantum system's dynamics. It learns directly from experience, even in the presence of unknown or fluctuating noise.

One powerful use case: 𝗰𝗼𝗻𝘁𝗲𝘅𝘁-𝗮𝘄𝗮𝗿𝗲 𝗰𝗮𝗹𝗶𝗯𝗿𝗮𝘁𝗶𝗼𝗻, where the RL agent learns to adjust gate parameters depending on surrounding circuit elements, suppressing coherent errors like crosstalk that vary with input conditions.
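The four steps above can be sketched as a minimal loop: a toy fidelity function stands in for the real hardware, and a 1-D Gaussian policy replaces the neural network. Everything here (function names, numbers, the 0.8 "optimal amplitude") is a hypothetical illustration, not a real control stack.

```python
import math
import random

def toy_gate_fidelity(amplitude, optimum=0.8, noise=0.01, rng=None):
    """Stand-in 'environment': gate fidelity peaks at an optimal pulse
    amplitude (unknown to the agent), with shot noise on the estimate."""
    rng = rng or random
    return math.exp(-(amplitude - optimum) ** 2 / 0.5) + rng.gauss(0.0, noise)

def calibrate(n_iters=500, lr=0.3, sigma=0.2, seed=0):
    """Model-free 1-D REINFORCE with a running reward baseline
    (the 1/sigma^2 factor of the exact gradient is absorbed into lr)."""
    rng = random.Random(seed)
    mean, baseline = 0.0, 0.0
    for _ in range(n_iters):
        action = mean + rng.gauss(0.0, sigma)         # 1. policy proposes a pulse amplitude
        reward = toy_gate_fidelity(action, rng=rng)   # 2. 'hardware' runs it, returns a fidelity proxy
        baseline = 0.9 * baseline + 0.1 * reward      # 3. running baseline reduces gradient variance
        mean += lr * (reward - baseline) * (action - mean)  # 4. policy update toward high-reward actions
    return mean

# The learned amplitude should score much better than the starting guess;
# no model of the 'hardware' is ever used inside the loop.
final = calibrate()
assert toy_gate_fidelity(final, noise=0.0) > toy_gate_fidelity(0.0, noise=0.0)
```

The loop never differentiates through the environment: it only correlates sampled actions with rewards, which is exactly the model-free property the post describes, and why the same structure survives unknown or drifting noise.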