Language Models Pack Billions of Concepts into 12,000 Dimensions
Link: nickyoder.com/johnson-lindenst…
Discussion: news.ycombinator.com/item?id=4…
Beyond Orthogonality: How Language Models Pack Billions of Concepts into 12,000 Dimensions
In a recent 3Blue1Brown video series on transformer models, Grant Sanderson posed a fascinating question: How can a relatively modest embedding space of 12,288 dimensions (GPT-3) accommodate millions of distinct real-world concepts? The answer lies …

Nicholas Yoder (Nick Yoder)
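To make the question concrete, here is a minimal Python sketch (not from the article) of the phenomenon the linked post builds on: randomly chosen unit vectors in a high-dimensional space are nearly orthogonal, so the number of almost-perpendicular directions that can coexist far exceeds the dimension count. The vector counts and dimensions below are illustrative choices, not figures from the post.

```python
# Minimal sketch: random unit vectors in high dimensions are nearly orthogonal.
# As the dimension grows, the worst-case |cosine similarity| between any two
# of 1,000 random vectors shrinks toward zero.
import numpy as np

rng = np.random.default_rng(0)

def max_abs_cosine(n_vectors: int, dim: int) -> float:
    """Draw n_vectors random unit vectors in R^dim and return the largest
    |cosine similarity| between any distinct pair."""
    v = rng.standard_normal((n_vectors, dim))
    v /= np.linalg.norm(v, axis=1, keepdims=True)   # normalize to unit length
    sims = v @ v.T                                  # all pairwise cosines
    np.fill_diagonal(sims, 0.0)                     # ignore self-similarity
    return float(np.max(np.abs(sims)))

for dim in (64, 512, 4096, 12288):
    print(f"dim={dim:>6}  worst pairwise |cos| among 1000 random vectors: "
          f"{max_abs_cosine(1000, dim):.3f}")
```

Running this shows the largest off-diagonal cosine dropping steadily as the dimension rises, which is the intuition the Johnson-Lindenstrauss lemma (referenced in the link above) makes precise.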