Timepoint is in alpha.

Simulate any moment in time

Timepoint renders causally coherent historical simulations — complete with characters, dialog, and AI-generated scenes. Past, present, or future.

Time Travel
Describe any moment and watch it render in real time with AI agents
Clockchain
Explore a temporal graph of causally connected historical events
Character Chat
Talk to historical figures rendered from primary sources
Pro Simulations
Run multi-entity temporal simulations with cognitive dynamics
Moments: 6,789
Causal Edges: 110,957
Rendered Scenes: 130
From the Clockchain
Publication of "Attention Is All You Need"
2017 · California
Eight researchers from Google Brain, Google Research, and the University of Toronto publish "Attention Is All You Need" (arXiv:1706.03762), introducing the Transformer architecture — a neural network built entirely on self-attention mechanisms, dispensing with recurrence and convolution. The paper proposes multi-head attention, scaled dot-product attention, and sinusoidal positional encodings, achieving state-of-the-art results on WMT 2014 English-to-German and English-to-French translation benchmarks while training significantly faster than prior architectures. Presented at NeurIPS 2017 in Long Beach, California, the paper would become one of the most cited works in deep learning history (over 140,000 citations by 2024), winning the NeurIPS 2023 Test of Time Award. The Transformer directly spawned BERT, GPT, T5, PaLM, LLaMA, and virtually every large language model that followed, making it arguably the single most consequential architecture paper in the history of artificial intelligence.
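The mechanism at the paper's heart fits in a few lines. Here is a minimal NumPy sketch of scaled dot-product attention, softmax(QKᵀ / √d_k) · V; an illustrative toy with made-up shapes, not the paper's reference implementation:

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # attention-weighted values

# Toy shapes: 3 query positions, 4 key/value positions, width 8.
rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(3, 8)), rng.normal(size=(4, 8)), rng.normal(size=(4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)   # (3, 8)

Multi-head attention simply runs several such maps in parallel over learned projections and concatenates the results.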
AlexNet Wins the ImageNet Challenge
2012 · Florence, Tuscany
Alex Krizhevsky, Ilya Sutskever, and Geoffrey Hinton's deep convolutional neural network "AlexNet" wins the ImageNet Large Scale Visual Recognition Challenge (ILSVRC 2012) with a top-5 error rate of 15.3%, obliterating the runner-up's 26.2% by an unprecedented 10.9 percentage-point margin. Trained on two NVIDIA GTX 580 GPUs for six days, the 8-layer, 60-million-parameter network demonstrated that deep neural networks trained with backpropagation on GPUs could vastly outperform decades of hand-engineered computer vision features. The result — announced at ECCV 2012 in Florence and presented as a paper at NeurIPS 2012 in Lake Tahoe — is widely regarded as the "Big Bang" of the modern deep learning era. It triggered an avalanche of GPU-accelerated neural network research, convinced industry that deep learning worked at scale, and set in motion the chain of advances through ResNets, GANs, sequence models, and ultimately the Transformer architectures that produced modern large language models. Cited over 150,000 times, it remains one of the most consequential papers in computer science history.
Invention of the Microprocessor
1971 · Santa Clara, United States
Intel unveils the 4004, the first commercial microprocessor, launching the personal computing revolution.
Rosenblatt Demonstrates the Perceptron
1958 · Washington, D.C.
Frank Rosenblatt demonstrates the perceptron at a U.S. Navy press conference in Washington, D.C. — the first machine that can learn from experience. The press demonstration runs as a simulation on an IBM 704; the dedicated Mark I Perceptron hardware follows at the Cornell Aeronautical Laboratory in Buffalo, New York under Office of Naval Research contract N6onr-24807. The Mark I is a room-sized apparatus with 400 cadmium sulfide photocells arranged in a 20x20 grid as its "retina," wired to 512 association units and 8 output units, with weights stored in potentiometers adjusted by electric motors during training. It learns to classify simple visual patterns by example, not by explicit programming. The New York Times covers the event the next day under the headline "New Navy Device Learns by Doing," breathlessly describing it as "the embryo of an electronic computer that [the Navy] expects will be able to walk, talk, see, write, reproduce itself and be conscious of its existence." Rosenblatt's formal paper, "The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain," appears months later in Psychological Review (Vol. 65, No. 6, pp. 386-408, November 1958), establishing the mathematical framework for single-layer neural networks. The perceptron represents a radical alternative to the symbolic AI approach championed at the 1956 Dartmouth Workshop — where Rosenblatt's high school classmate Marvin Minsky was a co-organizer. Rather than encoding intelligence as logical rules, Rosenblatt proposes that intelligence emerges from learning in networks of simple neuron-like units. This connectionist vision will be marginalized for nearly two decades after Minsky and Papert's 1969 book "Perceptrons" proves the single-layer architecture cannot learn linearly non-separable functions like XOR — but vindicated when Rumelhart, Hinton, and Williams demonstrate in 1986 that multi-layer networks trained with backpropagation overcome exactly these limitations, igniting the deep learning revolution that the perceptron started.
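The learning rule Rosenblatt established is strikingly simple: when the machine misclassifies an example, nudge the weights toward it. A minimal Python sketch of that single-layer rule (illustrative only, obviously not the Mark I's motor-driven potentiometers):

import numpy as np

def train_perceptron(X, y, epochs=20, lr=1.0):
    # Rosenblatt's rule: on each misclassified example, move the
    # decision boundary toward it. X: (n_samples, n_features);
    # y: targets in {-1, +1}.
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:   # misclassified (or on the boundary)
                w += lr * yi * xi
                b += lr * yi
    return w, b

# Linearly separable toy data: logical OR is learnable.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y_or = np.array([-1, 1, 1, 1])
w, b = train_perceptron(X, y_or)
print(np.sign(X @ w + b))  # [-1.  1.  1.  1.]

Run on XOR targets instead, the same loop never converges: exactly the limitation Minsky and Papert formalized in 1969, and the one multi-layer backpropagation later overcame.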
Launch of Sputnik 1
1957 · Tyuratam, Kazakh SSR
The Soviet Union launches Sputnik 1 — a polished 83.6 kg aluminum sphere with four radio antennae — from the Tyuratam launch facility in the Kazakh SSR atop a modified R-7 ICBM at 19:28:34 UTC. Its 20 MHz "beep-beep" signal, detectable by amateur radio operators worldwide, proves the Soviets can place objects in orbit and by implication deliver nuclear warheads intercontinentally. The "Sputnik crisis" that follows reshapes American science policy within months: ARPA is created on February 7, 1958 to prevent future technological surprise (its Information Processing Techniques Office will later fund MIT, Stanford, and CMU AI labs plus ARPANET); NASA is established on July 29, 1958; the National Defense Education Act floods federal money into science and engineering education on September 2, 1958. The satellite completes approximately 1,440 orbits over three months before burning up on reentry January 4, 1958. Sputnik is the single event that causally connects the birth of the Space Race — culminating in Kennedy's moon commitment and the Apollo program — with the institutional infrastructure that funded American artificial intelligence research for decades. Chief Designer Sergei Korolev, whose identity remained a state secret, orchestrated the launch that changed the trajectory of two civilizations.
Shannon Publishes "A Mathematical Theory of Communication"
1948 · Murray Hill, New Jersey
Claude E. Shannon publishes "A Mathematical Theory of Communication" in the Bell System Technical Journal (Vol. 27, No. 3, pp. 379-423, July 1948; No. 4, pp. 623-656, October 1948), founding the field of information theory and laying the mathematical bedrock for the entire digital age. Working at Bell Telephone Laboratories in Murray Hill, New Jersey, Shannon introduces a rigorous mathematical framework for quantifying information — defining entropy (H = -Σ pᵢ log₂ pᵢ) as the measure of uncertainty in a message, formalizing the "bit" (binary digit) as the fundamental unit of information, and proving two landmark theorems. His source coding theorem establishes the theoretical minimum for lossless data compression. His noisy-channel coding theorem proves that reliable communication is possible over any noisy channel at rates below its capacity — an existence proof so counterintuitive that engineers initially refused to believe it. The channel capacity formula C = B·log₂(1 + S/N) becomes the governing equation of telecommunications for the next 75 years. Shannon's insight that all information — text, sound, images — can be reduced to binary digits and transmitted, stored, and processed mathematically is the conceptual foundation for digital computing, data compression (Huffman, Lempel-Ziv, MP3, JPEG), error-correcting codes, cryptography, and ultimately machine learning, where Shannon entropy resurfaces as the cross-entropy loss function that trains virtually every modern neural network. Shannon had discussed the possibility of "thinking machines" over teatime with Alan Turing during Turing's 1943 visit to Bell Labs, and would go on to co-organize the 1956 Dartmouth Workshop that formally launched artificial intelligence as a field. The paper, cited over 150,000 times, is arguably the most consequential scientific publication of the twentieth century after Einstein's relativity papers.
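Both formulas quoted above can be evaluated directly. A quick Python sketch of entropy and channel capacity (the phone-line figures are a textbook-style illustration, not from Shannon's paper):

import math

def entropy_bits(probs):
    # Shannon entropy H = -sum(p_i * log2(p_i)), in bits per symbol
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))    # 1.0: a fair coin carries exactly one bit
print(entropy_bits([0.9, 0.1]))    # ~0.47: a biased coin is more predictable
print(entropy_bits([0.25] * 4))    # 2.0: four equally likely symbols

# Shannon-Hartley capacity C = B * log2(1 + S/N):
# a 3 kHz channel at S/N = 1000 (30 dB) tops out near 30 kbit/s.
print(3000 * math.log2(1 + 1000))  # ~29,902 bits per second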
Battle of Mühlberg
1547 · Mühlberg, Holy Roman Empire (modern-day Germany)
The Battle of Mühlberg has just concluded with a decisive Catholic victory. Charles V, Holy Roman Emperor, surveys the battlefield as his forces round up the defeated Protestant rebels, including their leader, John Frederick I, Elector of Saxony.
Formation of the Schmalkaldic League
1531 · Schmalkalden, Holy Roman Empire
Protestant princes and city representatives gather in Schmalkalden Castle to formally establish the Schmalkaldic League, a defensive alliance against the Catholic Holy Roman Emperor Charles V.
Browse the Clockchain →