Researchers from Bar-Ilan University challenge the prevailing “deeper is better” paradigm in AI by demonstrating that the brain’s learning mechanism—which relies on a wide, shallow architecture with very few layers—can compete with deep learning on complex classification tasks. Published in Physica A, the study contrasts the “skyscraper” structure of artificial neural networks with the “wide building” structure of the brain, arguing that network width can effectively substitute for depth. The authors note, however, that adopting this biological approach faces a significant hurdle: modern GPU technology is engineered specifically to accelerate deep architectures and cannot yet efficiently implement the wide, shallow dynamics found in nature.
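To make the width-for-depth trade-off concrete, here is a minimal sketch (not from the paper; the layer sizes are illustrative assumptions) comparing the parameter budgets of a deep, narrow “skyscraper” network and a shallow, wide “wide building” network. With the widths chosen below, the two fully connected architectures end up with roughly the same number of trainable parameters, which is one way to see why width can stand in for depth as a source of capacity.

```python
def mlp_param_count(layer_sizes):
    """Total trainable parameters (weights + biases) of a fully
    connected network with the given layer widths."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]))

# "Skyscraper": ten narrow hidden layers (deep-learning style).
# Input/output sizes (784 -> 10) are hypothetical, MNIST-like values.
deep = [784] + [64] * 10 + [10]

# "Wide building": a single broad hidden layer (shallow, brain-inspired).
# The width 110 is chosen so the parameter budgets roughly match.
wide = [784, 110, 10]

print(mlp_param_count(deep))  # 88330
print(mlp_param_count(wide))  # 87460
```

Parameter count alone says nothing about accuracy, of course; the study’s claim is that, trained appropriately, the shallow-wide configuration can reach competitive classification performance, while current GPU kernels remain optimized for the layer-by-layer computation of the deep variant.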