Biohacking News and Headlines

Computer Run on Human Brain Cells Learned to Play ‘Doom’

Biohacking • BioTech

Popular Science • March 2, 2026

Why It Matters

Demonstrating complex, vision‑based learning in living neural tissue shows biocomputers can tackle real‑world tasks, opening pathways for bio‑integrated AI and robotics.

Key Takeaways

  • DishBrain has evolved into CL1, which runs Doom on living neurons
  • Visual data is converted to electrical stimulation for neuronal processing
  • Training was achieved faster than comparable silicon machine learning
  • Performance is still far from human, but surpasses a random baseline
  • Future biocomputers could control robotics and complex software

Pulse Analysis

The rise of biocomputing has moved beyond proof‑of‑concept demos, with Cortical Labs’ CL1 representing a tangible leap from simple reflex games to environments that require visual perception. By culturing hundreds of thousands of human cortical neurons on a hydrogel substrate and coupling them to a programmable silicon interface, the team created a living processor capable of interpreting pixel data. This hybrid architecture sidesteps traditional silicon bottlenecks, leveraging the brain’s innate parallelism and plasticity to adapt in real time, a feat that took the researchers just weeks to implement after the *Doom* benchmark was set.

Technical hurdles centered on translating the game’s visual feed into stimulation patterns the neurons could understand. Engineers devised a Python‑driven pipeline that maps screen pixels to electrical pulses, effectively giving the neural network a synthetic “eye.” The rapid learning curve—outpacing conventional deep‑learning models—suggests that biological substrates can internalize complex mappings with fewer training cycles, thanks to synaptic plasticity that silicon lacks. While the current system still loses most encounters, its ability to outperform random action policies highlights a nascent form of embodied intelligence emerging from living tissue.
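To make the pixel-to-pulse idea concrete, here is a minimal sketch of how such a pipeline might downsample a grayscale game frame onto an electrode grid, scaling brightness to stimulation amplitude. All function and parameter names here are illustrative assumptions, not Cortical Labs' actual API, and the amplitude scaling is invented for the example.

```python
# Hypothetical sketch of a frame-to-stimulation mapping.
# Names and units are illustrative; this is not Cortical Labs' real interface.

def frame_to_stimulation(frame, grid_rows=8, grid_cols=8, max_uamps=4.0):
    """Downsample a grayscale frame (rows of 0-255 ints) to an
    electrode grid, mapping mean brightness to pulse amplitude (uA)."""
    h, w = len(frame), len(frame[0])
    block_h, block_w = h // grid_rows, w // grid_cols
    pattern = []
    for r in range(grid_rows):
        row = []
        for c in range(grid_cols):
            # Average pixel brightness within this electrode's block
            total = 0
            for y in range(r * block_h, (r + 1) * block_h):
                for x in range(c * block_w, (c + 1) * block_w):
                    total += frame[y][x]
            mean = total / (block_h * block_w)
            # Scale 0-255 brightness linearly to a stimulation amplitude
            row.append(round(mean / 255 * max_uamps, 2))
        pattern.append(row)
    return pattern
```

A bright 16x16 frame would yield an 8x8 grid of maximum-amplitude pulses, while a dark frame would yield zeros; the real system would additionally encode timing and spatial patterns that the neurons learn to associate with in-game events.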

Looking ahead, the implications stretch across sectors. Biocomputers could eventually manage low‑power robotic limbs, process sensory streams for autonomous vehicles, or run specialized algorithms where energy efficiency and adaptability are paramount. Challenges remain, including scalability, reproducibility, and regulatory concerns around human‑derived cells. Nonetheless, mastering a benchmark as iconic as *Doom* signals that organic computing is transitioning from novelty to a viable component of future hybrid AI ecosystems.

Computer run on human brain cells learned to play ‘Doom’

Read Original Article