
Quantum Shift — Why the Time Has Come for Financial Institutions to Pay Attention to Quantum Computing

Written by Tom Brown

Tom Brown is Senior Counsel at Paul Hastings, specializing in FinTech law with deep expertise in payment systems, digital assets, and financial services regulation. He invested in quantum computing company BlueQubit during his time at Nyca Partners and is currently collaborating with BlueQubit on educational initiatives about quantum threats to cryptography.

Open Banker curates and shares policy perspectives in the evolving landscape of financial services for free.

The Great Quantum Divide

Joe Altepeter, the Program Manager at DARPA who oversees the agency's quantum computing portfolio, recently observed that members of the tech world have long fallen into one of two camps: those who believe that building a quantum computer will prove both impossible and useless, and those who believe it will be the most important technology of the 21st century.

In 2025, the signal shifted. As Scott Aaronson recently explained, the developments of the past year moved the debate over quantum computing from “will it ever work” to “when will specific applications become viable.” Quantum computing companies raised $3.77 billion in equity during the first nine months of 2025 — nearly triple the $1.3 billion raised in all of 2024. National governments announced plans to invest an additional $10 billion in quantum computing initiatives in the first seven months of the year, bringing the three-year total to $25 billion. The Nobel committee awarded the 2025 Physics prize to three scientists for their work on superconducting quantum circuits. The year also saw a steady drip of progress on quantum hardware, software, and quantum-addressable problems.

For those in secret-keeping industries, including defense, financial services, telecommunications, and health care, these developments mean that quantum computing and the existential threat it poses to public-key encryption can no longer be ignored. 

Understanding the Threat: Quantum Computing in Plain English

First, let’s back up and explain why quantum computers threaten current encryption.

Classical vs. Quantum: Different Physics, Different Rules

Traditional computers process information using bits. Each bit functions like a switch — on or off — and a machine's capacity grows linearly: each additional bit adds one more switch. Computers manipulate these bits through logical operations, following the rules of classical physics.

Quantum computers operate differently. Instead of bits, they use quantum bits or “qubits.” Due to quantum superposition, a qubit can exist as 0, 1, or both simultaneously — until measured. Think of a classical bit as a coin lying flat on a table (definitely heads or tails) versus a qubit as a spinning coin in the air (both, until you catch it).

Multiple qubits in superposition can represent many states at once. Two classical bits hold one of four values at any time (00, 01, 10, or 11); two qubits represent all four simultaneously. The scaling is exponential: 300 qubits can represent more states than there are atoms in the observable universe.
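The exponential scaling can be illustrated with a short numpy sketch, representing an n-qubit register by its 2^n complex amplitudes (the names below are illustrative, not from any quantum library):

```python
import numpy as np

# A classical n-bit register holds ONE of 2**n values at a time.
# A quantum n-qubit register is described by 2**n complex amplitudes
# held simultaneously -- one per basis state.

def statevector_size(n_qubits: int) -> int:
    """Number of complex amplitudes needed to describe n qubits."""
    return 2 ** n_qubits

# Two qubits in equal superposition: all four basis states at once.
two_qubits = np.full(4, 0.5)            # amplitudes for |00>, |01>, |10>, |11>
print(np.sum(np.abs(two_qubits) ** 2))  # probabilities sum to 1.0

# 300 qubits need 2**300 amplitudes -- more than the estimated
# ~10**80 atoms in the observable universe.
print(statevector_size(300) > 10 ** 80)  # True
```

This is also why classical simulation of quantum logic hits a wall: the memory needed to track the amplitudes doubles with every added qubit.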

Quantum computers also exploit entanglement. When qubits become entangled, measuring one instantly constrains the others. Together, superposition and entanglement let a quantum system explore vast numbers of possibilities at once, while interference amplifies the amplitudes of correct answers and cancels those of wrong ones.

Why This Breaks RSA

RSA encryption relies on a mathematical asymmetry: multiplying two large prime numbers is easy, but factoring the product back into those primes is extraordinarily hard. RSA-2048 uses numbers that are 617 digits long. Finding the two prime factors classically requires trying trillions upon trillions of combinations.
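The asymmetry is easy to demonstrate at toy scale. A minimal sketch, using primes far below cryptographic size:

```python
import math

# Multiplying two primes is instant; this is the "easy" direction of RSA.
p, q = 10007, 10009
n = p * q

def factor(n: int) -> tuple[int, int]:
    """Naive trial division: fine for toy sizes, hopeless for RSA-2048."""
    for candidate in range(3, math.isqrt(n) + 1, 2):
        if n % candidate == 0:
            return candidate, n // candidate
    raise ValueError("no odd factor found")

print(factor(n))  # (10007, 10009)
# An RSA-2048 modulus has ~617 decimal digits. Even checking only odd
# candidates up to sqrt(n) would take on the order of 10**305 steps.
```

Real factoring attacks use far better algorithms than trial division, but the best known classical methods still scale super-polynomially, which is the gap RSA depends on.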

In 1994, mathematician Peter Shor proved that a sufficiently powerful quantum computer could factor large numbers exponentially faster using quantum superposition and entanglement. Shor’s algorithm exploits a property of numbers described by the Swiss mathematician Leonhard Euler in the 18th century: repeated modular exponentiation is periodic, and finding that period reveals the prime factors.

Where classical computers check possibilities one at a time, Shor’s algorithm exploits quantum interference to test them simultaneously. Assuming a quantum computer of sufficient scale (still a big assumption), Shor’s algorithm could crack codes in hours or days that would take a classical computer billions of years.
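The number-theoretic core of Shor’s algorithm can be run classically on toy numbers; only the period-finding step gets the exponential quantum speedup. A minimal sketch, brute-forcing the period that a quantum computer would find quickly:

```python
import math

def find_order(a: int, n: int) -> int:
    """Smallest r with a**r % n == 1 -- the period Euler's theorem guarantees
    for a coprime to n. A quantum computer finds r exponentially faster;
    here we simply brute-force it."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n: int, a: int = 2) -> tuple[int, int]:
    """Classical sketch of the number-theoretic core of Shor's algorithm."""
    r = find_order(a, n)
    assert r % 2 == 0, "odd period -- retry with a different a"
    f = math.gcd(pow(a, r // 2) - 1, n)
    return f, n // f

print(shor_classical(15))  # period of 2 mod 15 is 4; factors (3, 5)
```

For n = 15 the period of 2 is 4 (2, 4, 8, 1, …), and gcd(2² − 1, 15) = 3 recovers a factor. At RSA scale, the classical `while` loop above is hopeless; the quantum Fourier transform replaces it.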

2025: A Year of Notable Developments in Quantum Computing

To this point, robust quantum computers have only been hypothesized, not actualized. Building a working quantum computer requires creating qubits, keeping them in a quantum state, and correcting errors without destroying them. For such a computer to be commercially viable, it must operate at a scale beyond what a classical computer can simulate — a threshold that advances every year. Last year delivered several significant developments that appear to move quantum computing closer to practical reality.

Hardware: Creating, Sustaining, and Correcting Qubits

The fundamental hardware challenges — creating stable qubits, maintaining quantum states long enough to compute, and correcting errors without destroying quantum information — all saw measurable progress in 2025.

Error correction breakthroughs: Google demonstrated that adding more qubits to a system can reduce errors rather than increase them, solving a challenge that had persisted for nearly 30 years; it published the results in Nature and introduced a new quantum processor, Willow. IBM demonstrated real-time error-correction decoding 10 times faster than previous approaches, completing this milestone a year ahead of schedule and introducing its own new processor, Nighthawk.

Extended operation times: Harvard physicists built a quantum computer that operated continuously for over two hours, compared to previous systems that ran for seconds. An MIT physicist who worked on the project suggested indefinite operation might be achievable within 2-3 years, down from previous 5+ year estimates.

Coherence improvements: Princeton engineers achieved the ability to keep qubits in a quantum state — what experts call “coherence” — three times longer than previous records. Caltech researchers demonstrated superposition lasting 13 seconds in a 6,100-qubit array. Longer coherence means qubits can maintain quantum states long enough for more complex calculations.

Software: Better Algorithms and Programming Tools

Algorithmic innovations reduced the hardware requirements for cryptographically-relevant quantum computing, while improved programming tools extracted more value from existing quantum systems.

Google's Quantum Echoes: Google announced the first verifiable quantum-advantage algorithm running on hardware. Its out-of-time-order correlator (OTOC) algorithm ran 13,000 times faster on Willow than classical supercomputers for learning the structure of molecules and magnetic systems.

Improved error mitigation: IBM's Qiskit capabilities showed 24% accuracy increases with dynamic circuits and decreased costs of extracting accurate results by over 100 times through HPC-powered error mitigation. Quantum control solutions from companies like Q-CTRL, working with Nvidia and others, overcame computational bottlenecks in error suppression.

Hybrid quantum-classical frameworks: Co-design methodologies became prevalent, with hardware and software teams collaborating to optimize systems for specific applications. IBM's Qiskit Functions allowed partners like E.ON, Yonsei, and ColibriTD to publish research combining quantum and classical computing resources.

Applications: Problems Beyond Breaking Encryption

Although breaking RSA encryption remains the most discussed quantum computing application, 2025 brought demonstrations in other problem domains. Several organizations reported quantum systems outperforming classical computers in specific applications.

Molecular simulation: One of the primary motivations for creating a working quantum computer is simulating quantum dynamics. Quantinuum demonstrated the ability to simulate high-temperature superconductors on its Helios machine. 

Materials science: D-Wave's Advantage2 annealing quantum computer performed magnetic materials simulations in minutes that would take classical supercomputers nearly one million years, according to peer-reviewed research published in Science. The simulation used programmable spin glasses with known applications to business and science.

Primitive quantum encryption: BlueQubit researchers demonstrated a potential encryption approach using “peaked circuits” on Quantinuum’s H2 quantum processor. These circuits hide a secret bitstring that quantum hardware can extract in under 2 hours, while classical supercomputers would require years to find the same answer. The team has launched a public challenge (with a Bitcoin bounty) inviting cryptanalysts to attempt classical attacks, similar to how RSA’s security was stress-tested in the 1990s.

Certified randomness: One limitation of classical computers is their inability to generate truly random numbers: classical generators are deterministic, producing sequences fully determined by their internal state. Two teams demonstrated the ability of quantum computers to generate truly random numbers with classical verification — a milestone with applications in cryptography and secure protocols.
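The classical limitation is easy to see in a few lines: a pseudorandom generator's output is reproducible from its seed, so it is never truly random. A minimal sketch:

```python
import random

def sequence(seed: int, n: int = 5) -> list[int]:
    """Draw n "random" values from a seeded classical generator."""
    rng = random.Random(seed)
    return [rng.randrange(100) for _ in range(n)]

# The same seed always reproduces the same sequence: the output is
# fully determined by internal state, not random at all.
print(sequence(42) == sequence(42))  # True
```

Quantum measurement, by contrast, is intrinsically unpredictable, which is what the certified-randomness demonstrations exploit.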

Sampling problems: China's Jiuzhang 4.0 photonic quantum computer achieved quantum advantage in Gaussian boson sampling, with classical supercomputer El Capitan requiring longer than the universe's age for equivalent computation. While this specific problem has limited direct applications and has seen the claimed advantage chipped away, it demonstrates quantum systems can solve certain mathematical problems classical computers cannot.

Discerning The Implications For Financial Institutions And Other Secret Keepers

It is tempting to dismiss these developments as quantum-investment bro hype. But they came alongside institutional and scientific validation, as well as research explaining how these advances could pull forward the timeline for compromising public-key encryption to 2029.

DARPA’s quantum computing initiative has two components: (1) a Quantum Benchmarking Initiative, launched in July 2024, that seeks to determine whether utility-scale quantum computing — i.e., where computational value exceeds cost — can be achieved by 2033; and (2) a catalog of problems to which to apply such a computer. 

DARPA has documented progress on both components over the course of 2025. In April, DARPA selected eleven companies for the first phase of the benchmarking initiative, giving each $15 million. By November, it had advanced eighteen companies to the second phase of the program, a year-long evaluation of their R&D plans and prototype roadmaps. It has also identified ten potential applications, ranging from fuel discovery to drug discovery.

IBM established a community-led quantum advantage tracker to systematically monitor and verify emerging demonstrations, acknowledging that rigorous validation requires industry consensus. IBM anticipates the first cases of verified quantum advantage will be confirmed by the wider community by the end of 2026.

A paper published by Craig Gidney, a quantum researcher at Google, documents how these developments could accelerate the cracking of RSA-2048, the most widely deployed public-key encryption standard. Coming into the year, the prevailing estimate for the number of qubits needed to crack RSA-2048 was twenty million. Gidney’s paper shows how improvements in quantum programming techniques could reduce that number twenty-fold. Several researchers, including those at Gartner and Protiviti, have speculated that RSA-2048 could be compromised by 2029, more than five years ahead of the schedule laid out by the National Institute of Standards and Technology just four years ago.

2026: A Year to Begin Preparing

This shift invites comparisons to Y2K. The Y2K problem arose because older mainframe systems used two-digit year fields. Computers would read "00" as 1900. The world spent approximately $300-600 billion on Y2K remediation, and it worked. Although the date shift broke a few systems, the vast majority continued to operate normally. 

The parallel is imperfect. Y2K had a fixed date. It could be fixed relatively easily through date shifting, windowing, or expanding the year field from two characters to four. Failures were also easy to spot: systems shut down or produced absurd results. None of this is true of the demise of public-key encryption. It could arrive at any time, will be very difficult to fix, and may not be known until long after the fact.

Given the developments in 2025, financial institutions should treat 2026 as the year to begin understanding their potential exposure and monitoring quantum computing progress more closely.

  • Begin mapping systems using quantum-vulnerable encryption: external communications (TLS/SSL, VPNs), internal systems (databases, authentication, APIs), digital signatures and code signing, stored data with long-term value, and third-party dependencies.

  • Establish processes to track quantum computing progress: hardware milestones (qubit counts, error rates, coherence times), algorithmic breakthroughs, commercial deployments, and regulatory guidance.

  • Begin conversations with critical technology vendors and service providers: What is their post-quantum cryptography roadmap? When will quantum-safe versions be available? What migration support will they provide? What timeline do they envision?
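The first step above — mapping systems that use quantum-vulnerable encryption — can be sketched as a simple classifier. The inventory format and system names below are hypothetical; the algorithm split is grounded in the threat model: Shor's algorithm breaks public-key schemes (RSA, elliptic curves, Diffie-Hellman), while symmetric ciphers like AES-256 are only quadratically weakened by Grover's algorithm.

```python
# Public-key algorithms broken by Shor's algorithm at sufficient scale.
QUANTUM_VULNERABLE = {"RSA", "ECDSA", "ECDH", "DH", "DSA"}

def classify(inventory: dict[str, str]) -> dict[str, list[str]]:
    """Split a system -> algorithm inventory into vulnerable vs. safer."""
    report: dict[str, list[str]] = {"vulnerable": [], "safer": []}
    for system, algo in inventory.items():
        bucket = "vulnerable" if algo.upper() in QUANTUM_VULNERABLE else "safer"
        report[bucket].append(system)
    return report

# Example inventory -- names and algorithms are illustrative only.
print(classify({
    "customer-portal TLS": "RSA",
    "internal VPN": "ECDH",
    "data-at-rest encryption": "AES-256",
}))
```

A real inventory would also record key sizes, data shelf-life, and vendor dependencies, but even this coarse split identifies where migration effort will concentrate.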

Measured Attention, Not Panic

The cryptography that protects today's sensitive data will be vulnerable if quantum computing continues on its current trajectory. Indeed, it may be impossible to protect some valuable information if the time required to implement post-quantum cryptography exceeds the time until quantum computers can crack existing public-key techniques, as Michele Mosca outlined a decade ago.
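Mosca's test can be stated concretely: data is at risk when the time it must stay secret plus the time needed to migrate exceeds the time until a cryptographically relevant quantum computer arrives. A minimal sketch; the numbers below are illustrative only, not estimates from this article:

```python
def mosca_at_risk(shelf_life_yrs: float,
                  migration_yrs: float,
                  time_to_quantum_yrs: float) -> bool:
    """Mosca's inequality: at risk when shelf_life + migration > time_to_quantum.
    Data encrypted today can be harvested now and decrypted once a
    sufficiently powerful quantum computer exists."""
    return shelf_life_yrs + migration_yrs > time_to_quantum_yrs

# Illustrative only: 10-year retention + 5-year migration against a
# hypothetical ~4-year horizon to a capable machine.
print(mosca_at_risk(10, 5, 4))   # True  -- exposure window exists
print(mosca_at_risk(1, 2, 10))   # False -- short-lived secrets are fine
```

The "harvest now, decrypt later" point is why the inequality bites even before any quantum computer exists: long-lived secrets intercepted today remain at risk.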

“If” is, obviously, the key word in that sentence, but institutions that wait for certainty will find themselves exposed. Again, the Y2K parallel is instructive: organizations that began preparing early — in the late 1980s and early 1990s — managed the transition relatively easily. Those that waited until 1998 or 1999 had to scramble. And a small handful failed entirely. We're likely in the quantum equivalent of the mid-1990s — awareness exists, the threat appears real, and the time has come to prepare.

The opinions shared in this article are the author’s own and do not reflect the views of any organization they are affiliated with.


If an idea matters, you’ll find it here. If you find an idea here, it matters. 

Interested in contributing to Open Banker? Send us an email at [email protected].
