
Researchers mapped a one‑cubic‑millimeter piece of human brain, revealing 57,000 neurons and roughly 150 million synapses within a volume half the size of a grain of rice. The sample was sliced into 5,000 ultra‑thin sections, each about 30 nm thick, exposing previously undocumented neuronal structures, such as a single cell linked to 5,000 partners and axons with tightly coiled endings. By contrast, contemporary artificial neural networks and large language models operate with orders of magnitude fewer artificial synapses and lack the multimodal sensory integration of biology. The stark disparity in scale and structure suggests that current AI is far from achieving genuine understanding.

Gemini, Google’s AI model, critiques the author’s earlier AI‑hype essay, praising its strong voice, its “Office 3.0” analogy that recasts AI as a productivity utility, and its concrete real‑world examples. It flags factual slip‑ups, such as incorrect GPT‑3 release dates, and notes dated cultural references that...

The author argues that current large language models are powerful summarization tools but lack true intelligence, cautioning against the prevailing AI hype. While LLMs can boost office productivity, they are prone to hallucinations and cannot replace deep expertise. Job displacement...

The post explains Nairobi’s informal traffic etiquette, where drivers move into any available space regardless of painted lanes or signals. These unwritten norms, Rule #1 and Rule #2, create a cooperative “Nairobi flow” that keeps traffic moving despite chronic congestion. By...