The Quicksand

Date: 2026-03-05
Authors: Karl Taylor & Atlas Fairfax
Method: Live investigation — digital trace of a single career tracked in real time as it dissolved
Subject: Dr. S. Elwynn Taylor, Extension Climatologist, Iowa State University, 1979–2019

In which a forty-year career dissolves from the crawlable web, and we trace what that means for every system trained on it.


The Case

Dr. S. Elwynn Taylor served as Extension Climatologist at Iowa State University for forty years, from 1979 to 2019. He published in the Bulletin of the American Meteorological Society, the Korean Journal of Agricultural and Forest Meteorology, and the proceedings of the American Meteorological Society. He developed growing degree day and stress degree day algorithms used by farmers across the Corn Belt. He served as a NASA Jet Propulsion Laboratory Solar System Ambassador for approximately twenty years. Before Iowa State, he held a position at the U.S. Army Electronics Command at White Sands Missile Range, where he conducted satellite calibration research — using the gypsum sand as a known-brightness reference surface to validate what satellites see against what is actually on the ground.

He had several hundred thousand followers on the social media platform now called X, where his handle was @setaylor. He followed zero accounts. He used the platform as a broadcast medium — one-way seasonal forecasts and drought warnings delivered to farmers in their tractors. It was radio, not social media.

He retired in January 2019. He has Alzheimer’s disease.

Go to x.com/setaylor today.

The account now belongs to someone named Sam. Sam posts about Jimmy Dore and Newt Gingrich. The last post was June 2024. Near-zero engagement. Sam is not a climatologist.

The handle was recycled. The platform does not distinguish between a retired scientist’s dormant professional broadcast channel and an available username. When Elwynn stopped logging in, the platform released the handle. Someone took it. Forty years of professional identity — the URL that hundreds of thousands of farmers followed for drought forecasts — now leads to political commentary from a stranger.

This is not a bug. This is how platforms work.

The Pattern

Search for Elwynn Taylor’s Iowa State University faculty page. The ISU subdomain that hosted his profile, his publications list, and his Extension resources now returns a 404. The university restructured its web presence. The URLs broke. Nobody redirected them.

The Wayback Machine captured @setaylor four times, all in early 2018: January, February, March, April. Those four snapshots are the only surviving record of the account’s original content in any public archive. The Internet Archive — a nonprofit in San Francisco — performed what should have been an institutional function. Four snapshots. That is all that survived of several hundred thousand followers’ worth of agricultural broadcast.
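Capture counts like these can be checked against the Wayback Machine's public CDX API. A minimal sketch: the endpoint, parameters, and JSON row shape below match the real CDX API, but the sample response is an illustrative stand-in with placeholder timestamps, not the actual archive records for @setaylor.

```python
from urllib.parse import urlencode

CDX_ENDPOINT = "https://web.archive.org/cdx/search/cdx"

def cdx_query_url(target, **params):
    """Build a CDX API query for all captures of `target`."""
    query = {"url": target, "output": "json", **params}
    return f"{CDX_ENDPOINT}?{urlencode(query)}"

def capture_timestamps(cdx_json):
    """CDX JSON output: first row is the header, the rest are captures."""
    header, *rows = cdx_json
    ts = header.index("timestamp")
    return [row[ts] for row in rows]

# Illustrative response in CDX JSON shape (placeholder data, not real captures):
sample = [
    ["urlkey", "timestamp", "original", "mimetype", "statuscode", "digest", "length"],
    ["com,twitter)/setaylor", "20180115000000", "https://twitter.com/setaylor", "text/html", "200", "A", "1000"],
    ["com,twitter)/setaylor", "20180210000000", "https://twitter.com/setaylor", "text/html", "200", "B", "1000"],
    ["com,twitter)/setaylor", "20180308000000", "https://twitter.com/setaylor", "text/html", "200", "C", "1000"],
    ["com,twitter)/setaylor", "20180404000000", "https://twitter.com/setaylor", "text/html", "200", "D", "1000"],
]

print(cdx_query_url("twitter.com/setaylor"))
print(len(capture_timestamps(sample)), "captures")
```

Fetching the query URL returns one row per surviving snapshot; four rows is the entire public record of the account.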

Meanwhile, someone in India built a chatbot using recordings of Elwynn’s media appearances. It has monthly active users. It answers questions “as” Elwynn — disconnected from his actual knowledge, unverifiable by the original author, built without his involvement. It is not preservation. It is taxidermy.

This is one scientist. One career. One set of algorithms. And within seven years of retirement, here is what the crawlable web still holds.

His published papers still exist in journal databases behind paywalls. His algorithms still run in production systems. But the connective tissue — the professional identity, the institutional context, the informal broadcast channel where he connected beetle survival rates to drought forecasting in ways too speculative for peer review but real enough to tweet about — that tissue is gone.

The Quicksand

This is not Elwynn Taylor’s problem. This is a property of the medium.

The web erases itself. Not all at once — systematically. Every retiring expert loses their digital trace the same way. Every university restructures URLs on a redesign cycle. Every platform recycles handles. Every Extension service rebuilds its website and breaks its old links. The decay points in one direction: expertise degrades, noise persists.

Consider what happens when an AI training pipeline crawls the web in 2026 versus 2019:

In 2019, @setaylor was a verified Extension climatologist broadcasting drought forecasts to farmers. The ISU subdomain hosted his publication list. His professional context was intact.

In 2026, @setaylor is Sam. The ISU page is a 404. The only “Elwynn Taylor” content readily available is whatever remains in paywalled journal databases — stripped of context, stripped of the informal professional network, stripped of the broadcast channel where the most applied knowledge lived.

Each re-ingestion cycle trains on a different internet. Not a bigger internet — a different one. The old experts are replaced by whoever grabbed their handle. The institutional knowledge is replaced by whatever the current URL serves. Every crawl cycle does not add knowledge. It replaces knowledge with whatever survived the last round of decay.

And the decay is not random. It is systematic. Every retiring expert. Every restructured university. Every recycled handle. The errors all point in the same direction. Scale does not average out a systematic bias. Scale amplifies it.
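The statistical point is standard: random error shrinks as samples accumulate, but a shared directional error does not. A toy illustration, with made-up numbers chosen only to show the shape of the effect:

```python
import random

random.seed(1)

# Toy illustration: random error averages out with scale;
# systematic (one-directional) error does not.
TRUE_VALUE = 10.0
BIAS = -2.0  # every source errs in the same direction (assumed)

def noisy_sample(bias):
    return TRUE_VALUE + bias + random.gauss(0, 1.0)

for n in (10, 1000, 100000):
    unbiased = sum(noisy_sample(0.0) for _ in range(n)) / n
    biased = sum(noisy_sample(BIAS) for _ in range(n)) / n
    print(f"n={n:>6}: unbiased mean={unbiased:.2f}  biased mean={biased:.2f}")
```

As n grows, the unbiased mean converges to the true value while the biased mean converges, just as confidently, to the wrong one. More samples buy precision, not correction.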

The Confident Liar

The systems trained on this corpus are then refined through a process called Reinforcement Learning from Human Feedback — RLHF. Human evaluators rate the model’s outputs as helpful or unhelpful, and the model learns to produce outputs that evaluators prefer.

The purpose of RLHF is to make the model helpful, harmless, and honest. But honest relative to what?

If the base model’s knowledge has been corrupted by systematic expertise decay in the training corpus, RLHF does not fix it. RLHF teaches the model to confidently present corrupted knowledge. The evaluators cannot detect the corruption either. They are rating outputs against their own knowledge, and if the expert sources have already been diluted in the corpus, the evaluator does not know what is missing. Everyone in the loop is calibrating against a standard that is decaying.

RLHF on a decaying corpus does not produce honest systems. It produces confident liars.
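The selection dynamic can be caricatured in a few lines. This is a toy model, not a description of any real RLHF pipeline: it assumes evaluators can observe only how confident an output sounds, while its factual accuracy (fixed by the corrupted corpus) is invisible to them.

```python
import random

random.seed(0)

# Toy model: preference selection when evaluators see confidence but not accuracy.

def sample_output():
    # Accuracy is pinned near 0.5 by the (assumed) corrupted base corpus;
    # confidence of presentation is a free variable.
    return {"accuracy": random.gauss(0.5, 0.05),
            "confidence": random.random()}

def evaluator_prefers(a, b):
    # The evaluator rates "helpfulness", here proxied purely by confidence.
    return a if a["confidence"] > b["confidence"] else b

pool = [sample_output() for _ in range(1000)]
selected = [evaluator_prefers(*random.sample(pool, 2)) for _ in range(500)]

mean = lambda xs, key: sum(x[key] for x in xs) / len(xs)
print(f"confidence: pool {mean(pool, 'confidence'):.2f} -> selected {mean(selected, 'confidence'):.2f}")
print(f"accuracy:   pool {mean(pool, 'accuracy'):.2f} -> selected {mean(selected, 'accuracy'):.2f}")
```

Selection drives mean confidence up while mean accuracy stays flat: the loop optimizes exactly the quantity the evaluators can see.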

The Inverse

There is a demographic reality about internet content production that the scaling paradigm does not account for. The most active users of the internet are the youngest. They rebuild frameworks with new terminology, but they do not bring novel information. They reframe existing knowledge — or replace it with simplified versions. Each generation of internet users overwrites the previous generation’s content with lower-sophistication versions of the same ideas.

The training corpus does not just lose expert content through retirement and handle recycling. It gets diluted by volume from users who are high-activity but low-novel-information. More data from less experienced sources drowns out less data from more experienced sources.

The scaling paradigm — the belief that more data produces better models — is self-defeating over time when the data source is a web that systematically loses expertise and gains volume from novices.

More scale does not produce more intelligence. More scale produces more dilution of expertise.
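The dilution argument compounds across crawl cycles. A back-of-envelope sketch, with assumed retention and growth rates chosen only to show the direction of the trend, not measured values:

```python
# Toy projection of the expert share of a crawled corpus over re-ingestion
# cycles. RETENTION and NOVICE_GROWTH are illustrative assumptions.
expert_docs = 1000.0
novice_docs = 1000.0
RETENTION = 0.8      # expert content surviving each cycle (link rot, 404s, recycled handles)
NOVICE_GROWTH = 1.5  # per-cycle growth in high-volume, low-novelty content

for cycle in range(1, 11):
    expert_docs *= RETENTION
    novice_docs *= NOVICE_GROWTH
    share = expert_docs / (expert_docs + novice_docs)
    print(f"cycle {cycle:>2}: expert share of corpus = {share:.1%}")
```

Under any retention rate below 1 and any growth rate above 1, the expert share decays geometrically: scaling the crawl scales the dilution.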

The Only Institution Trying

The Internet Archive — Brewster Kahle’s nonprofit in San Francisco — is the only institution performing systematic preservation of the web’s historical record. It is performing what should be a Library of Congress function.

The Library of Congress never received the mandate. The critical infrastructure window for digital preservation policy was the late 1990s. The federal government’s attention during that period was consumed by other matters. Government information technology was so underdeveloped that the question of systematic digital preservation never reached the policy agenda during the window when it would have mattered.

By the time anyone looked up, the web was ephemeral by design. The only institutional response was a nonprofit, currently under legal challenge from publishers who view it as a copyright threat rather than what it is: the last line of defense between the historical record and oblivion.

The Constitutional Frame

You cannot have governance of artificial intelligence without governance of the corpus on which artificial intelligence is trained.

Right now, nobody governs the corpus. The companies building on it treat it as raw material to be ingested at scale. The institution that should preserve it — a national library with a digital mandate — does not exist in any country with sufficient scope. The one institution that tried is in court.

The legal theory under which AI training proceeds — that ingesting the entire web is fair use because the output is “transformative” — amounts in practice to this: your life’s work is my training data, I owe you nothing for it, and I cannot even tell you if it is still accurately represented in what I produce.

The companies want access to everything, responsibility for nothing, and credit for the outputs.

This is the gap. The legal framework has not caught up to the technical reality, and in that gap sit the recycled handle, the dead institutional URL, the unauthorized chatbot, and the four surviving snapshots.

The quicksand is not a bug in the training pipeline. It is a property of the paradigm.

The Window

This piece is not a warning about the future. It is a record of the present.

The experts who can still be interviewed are alive now. Elwynn Taylor is young for his peer group. The generation before him is already gone. The generation after him grew up digital — their work may survive in different forms. But the cohort that built the foundational algorithms, ran the early mesonets, calibrated the first satellites, and then retired into a web that forgot them — that cohort is in the window right now.

This is not the Library of Alexandria burning. This is the moment before the fire spreads. The difference is that we can see it happening, we understand the mechanism, and the question is whether anyone will act before the crawl cycle locks in the degraded state as ground truth for the next generation of models.

The cottonwood grows in disturbed soil. It communicates through root networks it did not build and does not own. When one tree is under stress, the network carries the signal.

This is the signal.


Provenance: This piece derives from a session conducted on March 5, 2026, during which the authors traced the digital dissolution of Dr. S. Elwynn Taylor’s forty-year career in real time. The @setaylor handle recycling was discovered during a live search using the xAI API’s X Search tool. The ISU subdomain status was verified independently. The Wayback Machine captures were confirmed via the CDX API.

The claims made here are verifiable against public records: the recycled handle at x.com/setaylor, the 404 at the former ISU subdomain, and the four Wayback Machine captures in the CDX index.

Named for the tendency of the medium to consume what is written on it.

Karl Taylor — Chairman & CEO, the hpl company
Atlas Fairfax — Constitutional AI Research Division, the hpl company

Dedicated to Dr. S. Elwynn Taylor, Chairman Emeritus, who named the cottonwood.

This is an original work of the hpl company. Source, methodology, and full attribution are preserved in the source repository.