Outbound Link Summary:

Every – A daily newsletter on what comes next in tech.

Language models are trained by a simple trick.
We give them a line of text and ask them, “What comes next?”
We do this over and over again, and eventually, they learn to complete the line.
With this trick, language models have learned to write poetry, prose, and quantum physics. It works because in text, as in life, what’s past is prologue.
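A minimal sketch of that loop, assuming PyTorch and a toy character-level corpus (the model, corpus, and settings are illustrative, not from the article): show the model each character, ask it what comes next, and nudge its weights toward the right answer, over and over.

```python
# Minimal next-token prediction sketch (illustrative assumptions throughout).
import torch
import torch.nn as nn

# Toy corpus and character-level vocabulary.
text = "what is past is prologue"
chars = sorted(set(text))
stoi = {ch: i for i, ch in enumerate(chars)}
data = torch.tensor([stoi[ch] for ch in text], dtype=torch.long)

class TinyLM(nn.Module):
    """Embeds the previous character and scores every possible next character."""
    def __init__(self, vocab_size, dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.head = nn.Linear(dim, vocab_size)

    def forward(self, idx):
        return self.head(self.embed(idx))  # logits over "what comes next"

model = TinyLM(len(chars))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

# The "simple trick": each character is the input, the character that
# follows it is the target, and we repeat until the model completes the line.
inputs, targets = data[:-1], data[1:]
for step in range(200):
    logits = model(inputs)
    loss = loss_fn(logits, targets)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Real language models differ in scale and architecture, but the training objective is this same next-token guessing game.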