What quantum technology can learn from AI’s development

Quantum has reached an inflection point. Across computing, sensing, communications and navigation, the science is progressing quickly and the commercial pull is growing just as fast.

But if we want quantum technologies to move beyond pilots and prototypes into reliable, scalable products, we need to pay attention to something that rarely makes headlines: standards.


Tim Prior

Quantum Programme Manager, National Physical Laboratory (NPL).

AI’s uptake has been extraordinarily rapid, but it also demonstrates the pressures that emerge when powerful technologies are adopted faster than the frameworks needed to compare them, integrate them safely and build long-term trust.

Quantum can, and should, take a different path, embedding measurement and standardization early so innovation can scale with confidence.

Standardization isn’t a brake on innovation. It’s how you scale it.

For many businesses, “standards” can sound like bureaucracy: extra time, extra cost, and a risk of locking in today’s best guess before the science is settled. Historically, that perception has encouraged companies to treat standardization as something that happens later, once products are mature and markets are established. But quantum flips that logic on its head.

In quantum, the barriers to adoption often aren’t about whether the technology is transformative. They’re about whether it can be evaluated fairly, compared objectively, and integrated reliably into complex systems and supply chains.

Without shared terminology, agreed performance metrics and trusted test methods, it becomes very difficult to attract investment or build a supply chain, because neither customers nor investors can easily answer the practical questions that matter.

When we talk about standards in this context, we’re not talking about slowing progress. We’re talking about building the foundations of progress: best practice, interoperability, methods of comparison, and shared terminology—a common language that enables innovators to prove what they have built, and buyers to trust what they are adopting.

Learning from graphene: hype isn’t enough without measurement infrastructure

A useful parallel is the story of graphene and other 2D materials. Graphene’s remarkable properties sparked intense excitement, rapid investment and big expectations.

Yet commercial uptake was slower and more uneven than many predicted, partly because the measurement and comparison infrastructure wasn’t mature enough to let the market distinguish consistently between materials, performance claims and real-world outcomes. In the UK, this gap became prominent enough to trigger serious scrutiny into why the supporting measurement infrastructure wasn’t in place to capitalize on the opportunity.

The takeaway isn’t that graphene failed. It’s that breakthrough science doesn’t automatically translate into commercial certainty. Without agreed definitions and consistent ways to measure and validate performance, markets struggle to reward the best solutions, and supply chains struggle to form around them. Standardization helps solve that by creating the basis for comparability, repeatability and trust.

The AI lesson: adoption can outpace frameworks, then everyone pays the price

AI shows the other side of the story. It has scaled at astonishing speed and delivered real value across sectors. But it has also highlighted what happens when capability outpaces common frameworks and guardrails.

Organizations have faced challenges around consistent evaluation, governance, and the absence of widely accepted protocols for things like comparing performance and making robust decisions about deployment.

Quantum has the advantage of foresight: we can see the challenges that appear when “everyone moves fast” without shared approaches to evaluation, interoperability and risk management.

We have an opportunity to put the “control rods” in early, building the measurement and standards environment that allows innovation to grow safely and sustainably.

Why measurement sits at the heart of quantum standards

Quantum technologies are built on exquisitely sensitive physical effects. In many cases, using the end product isn’t the hard part. The hard part is manufacturing it consistently, and proving how it behaves, across devices, environments, suppliers and time.

That’s why metrology (the science of measurement) is so central. It provides the tools to test, verify and benchmark performance, and it creates the evidence base that standards require.

This is also why National Metrology Institutes are so deeply involved. Many SI units already have strong links to quantum principles: precision timing is fundamentally quantum, and modern electrical standards are quantum-based. The UK has decades of heritage here in work that stretches back to landmark achievements like the first atomic clock in 1955.

Collaboration is non-negotiable in a field this complex

There’s another reason quantum needs early standardization: no single organization, and no single country, can do everything. Quantum computing, networking, sensing and quantum-enabled position, navigation and timing span multiple technologies, multiple engineering challenges and multiple supply chains. Fragmented approaches don’t just slow progress; they can also create incompatible ecosystems that limit market access and reduce investor confidence.

That’s why international cooperation is so important, and why the standards conversation has become more strategic. The recently announced NMI-Q initiative brings together National Metrology Institutes from G7 countries and Australia to accelerate pre-standardization research and develop “best measurement practices” that can shape global standards for quantum technologies.

In the UK, initiatives like the UK Quantum Standards Network Pilot provide a way for industry, academia and government to feed into standards development and represent UK interests within European and international standards bodies, helping ensure UK quantum companies can access global markets and supply chains as the sector matures.

Standards de-risk investment by making performance comparable

For startups and scale-ups, standardization is often framed as a “nice to have”. In reality, it’s increasingly a commercial necessity.

Investors and end-users don’t primarily want a physics lecture. They want confidence. Questions come up repeatedly: Does it work? How does it compare with existing technologies? Is it reliable? What does it cost to run? Is there a robust supply chain? Can it scale? Are there relevant standards or regulation?

Standards help answer these questions because they make performance measurable in a way that is credible across the ecosystem, thereby reducing the risk of vendor lock-in, improving interoperability, and creating minimum quality expectations that buyers can trust. Over time, that enables economies of scale and multi-vendor supply chains, which in turn helps drive adoption.

Crucially, the old model of “build the product first, then standardize it” doesn’t fit quantum. As industry has told us, standards now often need to emerge before technology is fully mature, because without shared approaches you risk building something that is locked out of the wider stack, or you end up with fragmented standards shaped by whoever moved first rather than by what works best.

A practical call to action for quantum innovators

If you’re building quantum technologies today, it’s important not to view early engagement with standards as giving away IP or slowing down your roadmap, but as a way of shaping the market you want to enter.

Get involved early so you’re not excluded later. Contribute to the shared terminology and measurement practices that will define “good” in your segment. Use independent benchmarks to validate performance, strengthen credibility with investors, and build trust with partners across your supply chain.

Quantum computing will open new possibilities, and quantum sensing and timing technologies can deliver powerful capabilities in sectors that depend on precision and reliability.

But those opportunities will only translate at scale if the ecosystem can compare approaches fairly, integrate them reliably and prove value with confidence. By building the measurement and standards foundations now, innovation can scale when the market is ready.

