
Quantum Pulse

Quantum Random Features Achieve 89.3% Accuracy on Fashion-MNIST with Scalable Qubits

Quantum Zeitgeist • February 2, 2026

Why It Matters

The results show that practical quantum classifiers can match classical performance using far fewer quantum resources, accelerating the deployment of quantum AI on near‑term hardware.

Key Takeaways

  • QRF achieves 86% accuracy with 11 qubits
  • QDRF reaches 89% accuracy, matching classical models
  • Preprocessing cost scales as O(N L d), far below 2^N
  • Layer depth of ~20 approximates random Fourier feature statistics
  • Performance improves with qubit count, up to 15 qubits

Pulse Analysis

The promise of quantum machine learning has long been hampered by the need for deep, entangling circuits that quickly exhaust the limited coherence of today’s quantum processors. By borrowing the random Fourier feature (RFF) paradigm from classical kernel methods, the new Quantum Random Features (QRF) and its dynamical variant (QDRF) replace costly optimisation with fixed, randomly sampled parameters encoded through simple Z‑rotations. This spectral framework preserves the expressive power of high‑dimensional feature maps while keeping circuit depth to a handful of layers, making it compatible with noisy intermediate‑scale quantum (NISQ) devices.
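The random Fourier feature paradigm the article builds on can be illustrated classically. The sketch below is a minimal NumPy analogue of RFF, not the authors' quantum circuit: it shows the fixed, randomly sampled parameters (here a random matrix and phase offsets, which QRF would instead encode through Z‑rotations) producing features whose inner products approximate an RBF kernel without any training.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_fourier_features(X, n_features, gamma=1.0):
    """Map X (n_samples, d) to random cosine features whose inner products
    approximate the RBF kernel exp(-gamma * ||x - y||^2)."""
    d = X.shape[1]
    # Fixed, randomly sampled parameters -- no optimisation, as in QRF.
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Toy check: the randomized feature map reproduces the exact RBF kernel.
X = rng.normal(size=(5, 3))
Z = random_fourier_features(X, n_features=5000)
approx = Z @ Z.T  # randomized kernel estimate
exact = np.exp(-np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1))
print(np.max(np.abs(approx - exact)))  # error shrinks as n_features grows
```

The quantum versions replace this explicit feature matrix with shallow circuits whose measurement statistics play the same role, which is why a modest layer depth suffices.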

From a computational standpoint, the authors demonstrate that preprocessing scales linearly with the product of qubits (N), layer count (L) and data dimension (d), i.e., O(N L d), instead of the exponential O(2^N) required for full quantum state tomography. In practice, a depth of about twenty layers already reproduces the statistical properties of classical RFF, allowing an 11‑qubit circuit to reach 86 % accuracy on Fashion‑MNIST and a 15‑qubit circuit to push performance to 89 %. These figures rival conventional convolutional networks while consuming far fewer quantum resources.
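The O(N L d) versus O(2^N) separation is asymptotic, which a back-of-envelope script makes concrete. The depth (20) and input dimension (784) come from the article; the qubit counts beyond 15 are illustrative extrapolations showing where the exponential tomography cost overtakes the linear preprocessing cost.

```python
L_DEPTH = 20   # layer depth reported to approximate RFF statistics
D_INPUT = 784  # Fashion-MNIST input dimension (28 x 28 pixels)

def preprocessing_cost(n_qubits):
    """Linear O(N * L * d) preprocessing cost claimed for QRF."""
    return n_qubits * L_DEPTH * D_INPUT

def tomography_cost(n_qubits):
    """Exponential O(2^N) cost of full quantum state tomography."""
    return 2 ** n_qubits

# 11 and 15 qubits are the article's configurations; larger counts
# are hypothetical, to show the exponential term dominating.
for n in (11, 15, 20, 30, 40):
    print(f"N={n:2d}: O(NLd) = {preprocessing_cost(n):>13,}   "
          f"O(2^N) = {tomography_cost(n):>16,}")
```

At the 11–15 qubit scale both costs are modest in absolute terms; the advantage the authors claim lies in how the two curves diverge as hardware grows.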

The implications extend beyond image classification. Because QRF and QDRF generate high‑dimensional embeddings with minimal overhead, they can be repurposed for time‑series forecasting, generative modeling, and reinforcement learning—domains where classical kernels already excel. Moreover, the ability to trade qubit count for accuracy offers a clear roadmap for hardware vendors: incremental improvements in qubit quality directly translate into better machine‑learning outcomes. As the community refines spectral engineering and quantifies generalization bounds, scalable quantum feature maps could become a cornerstone of commercial quantum AI solutions.
