AI News

Nous Research Proposes Lighthouse Attention: A Training-Only Selection-Based Hierarchical Attention That Delivers 1.4–1.7× Pretraining Speedup at Long Context

Severity: Low · Region: Global
Date Occurred: May 16, 2026 22:23 UTC
Event Type: AI News
Source: MarkTechPost
Recorded: May 16, 2026
Full Description

Nous Research has published Lighthouse Attention, a selection-based hierarchical attention mechanism that wraps around standard scaled dot-product attention during pretraining and is removed afterward. Unlike prior methods such as NSA and HISA that pool only keys and values, Lighthouse pools Q, K, and V symmetrically across a multi-resolution pyramid, reducing the attention call from O(N·S·d) to O(S²·d) and running stock FlashAttention on a small dense sub-sequence. Tested on a 530M Llama-3-s
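To make the complexity reduction concrete, here is a minimal sketch of the core idea: pool queries, keys, and values symmetrically down to a short sub-sequence of length S = N / stride, then run ordinary scaled dot-product attention on that pooled sequence, so the quadratic cost is paid in S rather than N. This is an illustrative NumPy sketch under stated assumptions (mean pooling at a single resolution, no selection step, no multi-resolution pyramid); the function names `pool`, `sdpa`, and `lighthouse_sketch` are hypothetical and not from the Nous Research release.

```python
import numpy as np

def sdpa(q, k, v):
    """Standard scaled dot-product attention (dense, no masking)."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    w = np.exp(scores)
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v

def pool(x, stride):
    """Mean-pool the sequence dimension by `stride` (one pyramid level)."""
    n, d = x.shape
    n_trim = (n // stride) * stride  # drop any ragged tail
    return x[:n_trim].reshape(-1, stride, d).mean(axis=1)

def lighthouse_sketch(q, k, v, stride):
    """Pool Q, K, and V symmetrically, then run stock attention on the
    short pooled sub-sequence: O(S^2 * d) with S = N // stride, instead
    of paying the full-length quadratic cost."""
    qs, ks, vs = pool(q, stride), pool(k, stride), pool(v, stride)
    return sdpa(qs, ks, vs)
```

Note the symmetry: because Q is pooled along with K and V, the attention call itself only ever sees an S-length sequence, which is what lets an off-the-shelf dense kernel such as FlashAttention be used unchanged on the pooled inputs.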

AI Intelligence Layer

Mentioned Models

LLaMA

AI Categories

Research, Product, Performance
Event Metadata
  • ID #1670
  • Type AI News
  • Region Global
  • Severity Low
  • Indexed May 16, 2026