Yesterday, I attended a Qiskit Advocate seminar on Sample-Based Quantum Diagonalization (SQD). The session was a fascinating deep dive into how we can extract eigenvalues from complex Hamiltonians using advanced sampling techniques. It was a reminder of just how powerful hybrid quantum algorithms are becoming for research.
But it also prompted a personal challenge.
As I watched the presentation, I realized that while I grasp the concepts, I want to get my hands dirty with the implementation details of the foundational solvers that lead up to SQD. To truly appreciate the cutting edge, one must master the building blocks.
So, I am kicking off a new learning sprint. I have officially enrolled in the IBM Quantum Learning course on Quantum Diagonalization Algorithms. Over the coming weeks, this blog series will serve as my lab notebook—documenting the code, the math, and the results as I attempt to diagonalize matrices on real quantum backends.
Why “Diagonalization” Matters (Beyond Physics)
If you look at most quantum computing tutorials, “diagonalization” is almost exclusively framed as a chemistry problem: Find the ground state energy of a molecule.
While simulating nature is a killer application, as a computer scientist I want to broaden the aperture. Mathematically, diagonalizing a matrix \(A\) means finding the basis in which \(A\) acts as simple scaling. We are solving for eigenvectors (\(v\)) and eigenvalues (\(\lambda\)):
\[ Av = \lambda v \]
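To make that concrete, here is the classical baseline these quantum algorithms are ultimately competing with: NumPy's dense eigensolver, shown on a hand-picked 2×2 symmetric matrix (a minimal sketch, not anything from the course):

```python
import numpy as np

# A small symmetric matrix; diagonalizing it means finding the
# basis (eigenvectors) in which it acts as pure scaling (eigenvalues).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh handles symmetric/Hermitian matrices; eigenvalues come back ascending.
eigenvalues, eigenvectors = np.linalg.eigh(A)
print(eigenvalues)  # [1. 3.]

# Verify A v = lambda v for each eigenpair
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)
```

This works instantly for a 2×2 matrix; the whole point of the quantum algorithms below is the regime where \(A\) is a Hamiltonian too large to even write down.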
In my view, this isn’t just about electron orbitals. This is about extracting the defining features of a system. The algorithms I’ll be exploring in this course—VQE, Quantum Krylov, and Phase Estimation—are actually universal mathematical tools that apply directly to classic CS and Data Science problems:
1. Graph Theory & Spectral Clustering
In social network analysis, we use the graph Laplacian matrix. Diagonalizing it yields the “Fiedler vector” (the eigenvector of the second-smallest eigenvalue), whose signs give a near-optimal way to split a graph into loosely connected communities. It’s not energy; it’s clustering.
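A minimal classical sketch of that idea, on a toy graph of my own choosing (two triangles joined by a single bridge edge):

```python
import numpy as np

# Two triangles joined by one edge: nodes 0-2 and nodes 3-5.
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
n = 6
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

# Graph Laplacian: degree matrix minus adjacency matrix
L = np.diag(A.sum(axis=1)) - A

vals, vecs = np.linalg.eigh(L)
fiedler = vecs[:, 1]  # eigenvector of the second-smallest eigenvalue

# The sign of each Fiedler-vector entry assigns its node to a community.
labels = fiedler > 0
print(labels)
```

Running this separates the two triangles, cutting only the bridge edge: exactly the partition a human would draw by eye.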

2. Data Science & PCA
In Machine Learning, Principal Component Analysis (PCA) relies on diagonalizing the Covariance Matrix. The eigenvectors with the largest eigenvalues represent the features that matter most in a massive dataset. A quantum eigensolver is effectively a pattern recognition engine.
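As a quick illustration (with synthetic data invented for the example), PCA really is just the eigendecomposition of the covariance matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 2-D data, deliberately stretched along the x-axis
X = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0],
                                          [0.0, 0.5]])

Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(Xc) - 1)   # sample covariance matrix

vals, vecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
pc1 = vecs[:, -1]                 # principal component: largest eigenvalue
explained = vals[-1] / vals.sum() # fraction of variance it captures
print(pc1, explained)
```

The first principal component lines up with the stretched direction and captures nearly all of the variance, which is the sense in which eigenvectors are “the features that matter most.”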
3. System Dynamics
In Markov chains (like the math powering PageRank), the steady state of a system is simply the eigenvector of the transition matrix with eigenvalue 1.
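A small sketch with an arbitrary 3-state transition matrix of my own invention:

```python
import numpy as np

# Column-stochastic transition matrix of a 3-state Markov chain
# (each column sums to 1: probabilities of leaving that state)
P = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.6, 0.1],
              [0.2, 0.2, 0.6]])

vals, vecs = np.linalg.eig(P)

# Steady state = eigenvector with eigenvalue 1, renormalized to sum to 1
k = np.argmin(np.abs(vals - 1.0))
pi = np.real(vecs[:, k])
pi = pi / pi.sum()
print(pi)
```

No matter where the chain starts, repeated application of \(P\) converges to this vector, which is why PageRank is, at heart, an eigenvector computation.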
The Syllabus: My Roadmap
This course moves beyond the “Hello World” of quantum. Here is the roadmap of algorithms I plan to implement and stress-test:
- The Heuristic Layer: Deep diving into VQE (Variational Quantum Eigensolver), specifically looking at efficient ansatz construction for non-chemistry problems.
- The Subspace Layer: This is the bridge to the seminar I attended. I will be exploring Quantum Subspace Expansion and Krylov methods, which project large problems into smaller, manageable subspaces.
- The Precision Layer: Finally, looking at Quantum Phase Estimation (QPE), the “textbook” algorithm that offers exponential speedup but demands significant resources.
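Before any real hardware enters the picture, the variational idea behind the first layer can be sketched classically: pick a parameterized state, evaluate its energy, and let a classical optimizer push it down. Here is a toy version with a single-qubit Hamiltonian and an Ry ansatz, both invented for illustration, with SciPy standing in for the quantum backend:

```python
import numpy as np
from scipy.optimize import minimize

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Toy single-qubit Hamiltonian: H = Z + 0.5*X
# Exact ground energy is -sqrt(1.25) ~ -1.118
H = Z + 0.5 * X

def ansatz(theta):
    # |psi(theta)> = Ry(theta)|0> = (cos(theta/2), sin(theta/2))
    return np.array([np.cos(theta / 2), np.sin(theta / 2)], dtype=complex)

def energy(params):
    # On hardware this expectation value would come from measurements;
    # here we compute <psi|H|psi> exactly.
    psi = ansatz(params[0])
    return np.real(psi.conj() @ H @ psi)

result = minimize(energy, x0=[0.1], method="COBYLA")
print(result.fun)  # approaches -sqrt(1.25)
```

The real engineering challenge, and what I want to study in the course, is that on a device the `energy` function is noisy and expensive, which changes everything about how the optimization loop must be designed.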
The Commitment
I intend to approach this not just as a student, but as an engineer and educator.
I won’t just post the “happy path” where the code works perfectly on a simulator. I plan to run these algorithms on IBM’s utility-scale quantum systems. I will document the noise, the convergence failures, and the error mitigation strategies (like Zero Noise Extrapolation) required to get a clean signal out of a real device.
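For reference, the core of Zero Noise Extrapolation is a simple curve fit: deliberately amplify the noise by known factors (e.g. via gate folding), then extrapolate the measured expectation value back to the zero-noise limit. A sketch with made-up measurement values:

```python
import numpy as np

# Hypothetical noisy <Z> measurements at noise amplification
# factors 1, 2, 3 (the values are illustrative, not real data).
noise_factors = np.array([1.0, 2.0, 3.0])
measured = np.array([0.81, 0.66, 0.53])

# Richardson-style linear extrapolation back to noise factor 0
coeffs = np.polyfit(noise_factors, measured, deg=1)
zero_noise_estimate = np.polyval(coeffs, 0.0)
print(zero_noise_estimate)
```

Whether a linear fit is the right model, and how the extrapolation amplifies statistical error, is exactly the kind of thing I expect to find out the hard way on real devices.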
Next up: I’ll be tackling the Graph Laplacian and seeing if we can use a quantum computer to solve a classic clustering problem.