
The latest AI research roundup highlights a pivot from scaling raw compute toward efficiency‑first designs. Notable advances include calibrated sparse attention that accelerates text‑to‑video diffusion without retraining, and an object‑centric self‑improving loop that refines image generation alignment autonomously. A hybrid LLM‑driven evolutionary search (LLEMA) demonstrates practical materials discovery by coupling scientific intuition with synthesis constraints. Additional work demystifies transformer activation spikes as architectural artifacts and introduces cubic discrete diffusion, setting a new token‑based ImageNet benchmark.
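The idea behind sparse attention is that each query can attend to only a small subset of keys, cutting compute while approximating the dense result. The snippet below is a minimal illustrative sketch of top-k sparse attention in NumPy; it is not the calibrated method from the paper, and all names here are our own.

```python
import numpy as np

def topk_sparse_attention(q, k, v, keep=8):
    """Illustrative top-k sparse attention: each query attends only to
    its `keep` highest-scoring keys. A toy approximation of dense
    attention, not the paper's calibrated variant."""
    scores = q @ k.T / np.sqrt(q.shape[-1])          # (n_q, n_k) logits
    # Keep only the top-`keep` scores per query; mask the rest to -inf.
    thresh = np.sort(scores, axis=-1)[:, -keep][:, None]
    masked = np.where(scores >= thresh, scores, -np.inf)
    # Standard numerically stable softmax over the surviving keys.
    weights = np.exp(masked - masked.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

rng = np.random.default_rng(0)
q = rng.normal(size=(4, 16))   # 4 queries
k = rng.normal(size=(32, 16))  # 32 keys
v = rng.normal(size=(32, 16))
out = topk_sparse_attention(q, k, v, keep=8)
print(out.shape)  # (4, 16)
```

With `keep=8` of 32 keys, each query mixes only a quarter of the values, which is where the speedup in video diffusion backbones comes from.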

A new benchmark called SWE‑CI, developed by Sun Yat‑sen University and Alibaba, reframes AI coding evaluation from single‑snapshot bug fixes to continuous maintenance of evolving repositories. Each project is tracked over 233 days and an average of 71 commits, simulating...
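The shift from snapshot to continuous evaluation can be sketched as scoring an agent at every commit in a repository's history rather than once. The harness below is purely hypothetical (the `Commit` record and pass-rate metric are our own inventions, not SWE‑CI's actual protocol); it only illustrates aggregating success over an evolving history.

```python
from dataclasses import dataclass

@dataclass
class Commit:
    """One step in the repository's evolution (hypothetical record)."""
    sha: str
    tests_passed: bool  # did the agent's patch keep the suite green?

def maintenance_score(history):
    """Fraction of commits at which the agent's patch left the test
    suite passing. SWE-CI's real metrics may differ; this sketches
    scoring over a commit history instead of a single snapshot."""
    if not history:
        return 0.0
    return sum(c.tests_passed for c in history) / len(history)

history = [
    Commit("a1f", True),
    Commit("b2e", True),
    Commit("c3d", False),
    Commit("d4c", True),
]
print(maintenance_score(history))  # 0.75
```

A snapshot benchmark would score only the final commit; averaging over the whole history rewards agents that keep a repository healthy as it changes.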