For decades, the field of paleontology has been romantically defined by the image of the dusty explorer: a sun-beaten scientist in the badlands, wielding a rock hammer and a brush, painstakingly revealing the white glint of bone from red sandstone. It is a discipline grounded in the tangible, in the physical weight of rock and the tactile geometry of fossilized remains. But in 2026, the most significant excavation in paleontology is not happening in the Gobi Desert or the Morrison Formation. It is happening inside the silicon architecture of high-performance servers, where artificial intelligence is cracking codes that have stumped humans for over a century.
The catalyst for this seismic shift is a new AI application, colloquially dubbed "DinoTracker" by the press, developed by a collaboration between physicists at Helmholtz-Zentrum Berlin and paleontologists at the University of Edinburgh. This tool has done something previously thought impossible: it has taught itself to read the chaotic, eroded, and ambiguous language of dinosaur footprints without human instruction. In doing so, it has not only solved local mysteries on the Isle of Skye but has potentially rewritten the timeline of avian evolution, finding "bird" footprints tens of millions of years before the first known bird, Archaeopteryx, ever spread its wings.
This breakthrough is merely the spearhead of a broader revolution. We are entering the age of Digital Paleontology, a discipline where laser scanning, photogrammetry, computed tomography (CT), and unsupervised machine learning are converging to turn fossils into big data. This article explores this brave new world, detailing the technology behind the new AI tracker, the implications of its findings, and how the "digital mining" of museum collections is revealing soft tissues, embryonic skeletons, and evolutionary links that physical preparation could never touch.
Part I: The Cinderella Problem
To understand the magnitude of the AI breakthrough, one must first understand the unique frustration of studying dinosaur footprints—a field known as paleoichnology.
Steve Brusatte, a renowned paleontologist at the University of Edinburgh and co-author of the landmark PNAS paper detailing the new AI, describes the discipline’s central challenge as the "Cinderella Problem." When a paleontologist finds a footprint, they are essentially holding a glass slipper. They know someone was at the ball, and they know roughly how big their feet were. But the "princess"—the dinosaur itself—is long gone.
Unlike bones, which are direct parts of the animal, footprints are records of an interaction between biology and geology. A footprint’s shape is determined not just by the anatomy of the foot, but by the consistency of the mud, the speed of the animal, the slope of the ground, and millions of years of subsequent erosion. A Tyrannosaurus rex walking through stiff clay leaves a very different print than the same T. rex running through soupy riverbed mud.
For over a century, human experts have tried to classify these tracks by eye. They look for specific features: the angle between the toes, the sharpness of the claw marks, the presence of a "heel" pad. But this method is fraught with subjective bias. If a researcher expects to find a certain dinosaur in a rock formation, they are subconsciously more likely to interpret a vague three-toed blob as belonging to that dinosaur. Worse, the taxonomy of footprints is a mess of synonyms and disputed categories, with different experts using different names for the same shapes.
This is where the "DinoTracker" project began. Gregor Hartmann, a physicist at Helmholtz-Zentrum Berlin, realized that the problem wasn't a lack of data, but a lack of objective interpretation. The human brain is excellent at pattern recognition, but it is also an engine of bias. We see faces in clouds and familiar dinosaurs in amorphous holes in the ground. Hartmann proposed a radical solution: remove the human from the loop entirely.
Part II: The Unsupervised Eye
Most AI applications in science use "supervised learning." In this approach, you feed a computer thousands of images labeled by humans—"this is a cat," "this is a dog"—until the computer learns to distinguish them. But if you used this method for dinosaur tracks, you would simply teach the AI to replicate the biases and errors of human paleontologists. If the experts were wrong about a footprint classification, the AI would learn to be wrong, too.
Hartmann and Brusatte chose a different path: unsupervised learning. They utilized a sophisticated neural network architecture known as a Disentangled Variational Autoencoder (VAE).
To the layperson, a VAE can be thought of as a hyper-efficient compression algorithm with an imagination. The researchers fed the system nearly 2,000 silhouettes of fossilized footprints from the Triassic, Jurassic, and Cretaceous periods, along with modern bird tracks. They didn't tell the computer what the tracks were. They simply said, "Look at these shapes. Find the simplest way to describe them mathematically."
The AI broke the images down, compressing them into a "latent space"—a multi-dimensional map of shapes. To do this effectively, the AI had to figure out which features of the footprints mattered most. It ignored the "noise" (like cracks in the rock or random erosion) and focused on the signal.
Remarkably, the AI independently discovered eight key features that define the variation in dinosaur feet. It didn't know the words for "toe spread" or "heel impression," but it mathematically isolated these concepts. The eight "latent variables" it identified included:
- Digit Spread: The angle between the outer toes.
- Heel Position: How far back the metatarsal pad sits.
- Weight Distribution: Which part of the foot sank deepest.
- Digit Attachment: How the toes connect to the main pad.
- Overall Load/Contact Area: The broadness of the foot.
- Heel Loading: The specific pressure depth of the heel.
- Digit Emphasis: The relative length and thickness of the middle toe versus the outer toes.
- Left/Right Asymmetry: The subtle differences in how the foot rolls during a step.
By plotting every footprint against these eight variables, the AI created a "morphospace"—a 3D galaxy of dots where every star was a footprint. Tracks that were similar clustered together, forming natural families.
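The clustering step can be sketched with a plain k-means pass over synthetic latent codes. The two made-up "families" and the choice of k-means here are illustrative assumptions, not the paper's actual grouping method.

```python
import numpy as np

rng = np.random.default_rng(0)
# Fake 8-D latent codes: two synthetic footprint "families", each a
# tight cloud around a different point in morphospace. Real codes
# would come from the trained autoencoder.
family_a = rng.normal(0.0, 0.3, (100, 8))
family_b = rng.normal(2.0, 0.3, (100, 8))
morphospace = np.vstack([family_a, family_b])

# Plain k-means: repeatedly assign each point to its nearest centre,
# then move each centre to the mean of its assigned points.
centres = morphospace[[0, 100]].copy()   # one seed point from each half
for _ in range(20):
    dists = np.linalg.norm(morphospace[:, None] - centres[None], axis=2)
    labels = dists.argmin(axis=1)
    centres = np.array([morphospace[labels == k].mean(axis=0) for k in (0, 1)])

print(np.bincount(labels))   # two natural families of 100 tracks each
```

Because the two clouds are well separated in all eight dimensions, the algorithm recovers the two families exactly; with real, messier tracks the cluster boundaries are where the scientific arguments begin.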
When the researchers finally overlaid the human-created labels onto this AI-generated map, the results were vindicating. The AI’s clusters matched expert consensus about 90% of the time. It correctly grouped the three-toed tracks of theropods (meat-eaters) separately from the similar-looking tracks of ornithopods (plant-eaters).
But it was in the 10% of "disagreement" that the groundbreaking discoveries lay.
Part III: The Ghost Birds of the Triassic
The most shocking finding from the AI analysis came from a set of tracks dating back to the Late Triassic and Early Jurassic periods—over 200 million years ago.
For decades, paleontologists have been puzzled by certain small, three-toed footprints from this era. They look uncannily like bird tracks. However, the fossil record tells us that birds didn't evolve until much later, in the Late Jurassic (around 150 million years ago), with creatures like Archaeopteryx. The consensus was that these Triassic tracks must have been made by some other reptile that just happened to have bird-like feet—a case of convergent evolution.
The AI disagreed.
In the AI’s impartial morphospace, these ancient footprints didn't just look kind of like bird tracks. They clustered deeply and undeniably within the "bird" group. The mathematical signature of their weight distribution, toe splay, and heel structure was indistinguishable from modern avian footprints.
This leaves paleontologists with two tantalizing possibilities, both of which rewrite the textbooks:
- The "Early Bird" Theory: Birds, or their direct ancestors, evolved tens of millions of years earlier than we thought. They were hopping around the feet of early giant dinosaurs in the Triassic, but their hollow, fragile bones have simply never preserved as fossils. Only their footprints remain as "ghost" evidence.
- The "Perfect Mimic" Theory: A group of non-avian dinosaurs (likely early theropods) evolved feet that were functionally identical to birds. This would imply that the biomechanics of "walking like a bird" (bipedal, agile movement) were perfected by evolution long before flight was invented.
"The AI is forcing us to confront the data without our preconceptions," Brusatte noted in an interview. "It doesn't care that Archaeopteryx is 'supposed' to be the first bird. It just sees the geometry. And the geometry says: Bird."
Part IV: Solving the Skye Mystery
Closer to home, the AI was deployed to solve a mystery on the "Dino Coast" of the Isle of Skye in Scotland. This windswept island is one of the few places in the world preserving the Middle Jurassic—a time gap in the fossil record often called the "Dark Ages" of dinosaur evolution.
Researchers had found hundreds of tracks on the tidal platforms of Skye, but they were messy. The constant battering of the North Atlantic waves had eroded them, and the dinosaurs had originally walked through sticky lagoon mud, distorting their prints. Experts argued for years: were these the tracks of early meat-eaters (theropods) or of early plant-eating ornithopods, the forerunners of the later duck-billed dinosaurs?
The distinction is crucial. If they were ornithopods, it would mean these plant-eaters had exploded in diversity millions of years earlier than believed. If they were theropods, it changed the predator-prey ratios of the ecosystem.
The AI analyzed the chaotic Skye tracks and cut through the noise of the erosion. It identified the "deep signal" of the weight distribution—specifically the "Heel Loading" variable. It concluded that the tracks were undeniably made by ornithopods, the early relatives of the duck-billed dinosaurs.
This digital confirmation provided the first solid evidence that these herbivores were thriving and growing to large sizes in the Middle Jurassic lagoons of Scotland, fundamentally changing our understanding of that ecosystem.
Part V: The Democratization of Discovery
One of the most exciting aspects of the "DinoTracker" project is that it isn't locked away in an ivory tower. The team has released the tool as a smartphone app.
This represents a massive democratization of paleontology. Previously, if a hiker or amateur fossil hunter found a weird depression in a rock, they had to take a photo, email a local museum, and hope a busy curator had time to look at it. Often, these finds were ignored or misidentified.
Now, a user can snap a photo of a track with their phone. The app uses photogrammetry principles to analyze the depth and shadow (or asks the user to sketch the outline if the lighting is flat), processes the shape through the VAE neural network, and returns a probability score: "85% likelihood of Theropod, closely resembling Grallator."
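One plausible, purely hypothetical way such a score could be produced is by comparing a new track's latent code against the centres of known clusters in the morphospace. The centre positions and the softmax-over-distances scoring below are assumptions for illustration, not the app's actual internals.

```python
import numpy as np

# Hypothetical cluster centres in the 8-D morphospace; real centres
# would be derived from the fossil data set.
centroids = {
    "Theropod (resembling Grallator)": np.full(8, 0.0),
    "Ornithopod": np.full(8, 2.0),
    "Bird-like": np.full(8, -1.5),
}

def classify(latent_code):
    names = list(centroids)
    dists = np.array([np.linalg.norm(latent_code - centroids[n]) for n in names])
    # Closer centre -> higher score: softmax over negative distances.
    scores = np.exp(-dists)
    probs = scores / scores.sum()
    best = int(np.argmax(probs))
    return names[best], float(probs[best])

name, p = classify(np.full(8, 0.1))    # a new track's latent code
print(f"{p:.0%} likelihood of {name}")
```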
This turns every hiker into a potential research assistant. We are approaching a future of Citizen Science 2.0, where the sheer volume of data collected by the public feeds back into the scientific models, making the AI smarter.
This mirrors other successful citizen science initiatives in the digital space. The Fossilfinder project, utilizing the Spotteron platform in Austria, allows users to geolocate and categorize fossils, creating a crowdsourced map of Austria’s prehistoric life. Similarly, the DigiVol project by the Australian Museum has mobilized thousands of volunteers to transcribe labels and digitize millions of specimens from the comfort of their homes. The "DinoTracker" takes this a step further by adding an analytic AI layer to the data collection.
Part VI: The Tools of the Trade – Beyond the App
While the VAE app is the headline-grabber of 2026, it is supported by a suite of other digital technologies that are replacing the plaster jackets and dental picks of the past.
1. Photogrammetry: The People’s 3D Scanner
Photogrammetry has become the workhorse of digital paleontology. The concept is simple: take hundreds of photos of an object from different angles, and use software to triangulate the position of every pixel in 3D space.
This technique was instrumental in the recent excavations of the "Dinosaur Highway" in Oxfordshire. In 2024 and 2025, researchers uncovered one of the longest dinosaur trackways in Europe—a 220-meter path walked by a herd of sauropods (long-necked giants) 166 million years ago.
Because the site is an active quarry, the tracks couldn't be physically preserved. The rock would eventually be crushed for construction. In the old days, this data would be lost, or at best, a few plaster casts would be made. Instead, the team used drone-based photogrammetry to create a millimeter-perfect 3D digital replica of the entire football-field-sized site.
This "Digital Twin" allows researchers to study the tracks forever. They can virtually manipulate the lighting to see shallow impressions that are invisible at noon but pop out at sunset. They can measure the stride lengths to calculate the precise speed of the herd (likely a slow, steady walk of 2-3 mph).
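Stride-to-speed estimates like this are classically made with R. McNeill Alexander's 1976 formula, v = 0.25 g^0.5 λ^1.67 h^-1.17, where λ is the stride length and h the hip height. The sauropod proportions plugged in below are illustrative assumptions, not measurements from the Oxfordshire site.

```python
import math

def alexander_speed(stride_m, hip_height_m, g=9.81):
    """Estimated walking speed (m/s) from stride length and hip height."""
    return 0.25 * math.sqrt(g) * stride_m**1.67 * hip_height_m**-1.17

# Illustrative sauropod proportions: ~3 m hip height, ~3 m stride.
v = alexander_speed(stride_m=3.0, hip_height_m=3.0)
print(f"{v:.1f} m/s, about {v * 2.237:.1f} mph")
```

With these assumed proportions the formula lands in the same slow-walk range the trackway studies report, around 3 mph.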
2. Laser Scanning (Lidar): Mapping Megasites
For sites too large or complex for simple photography, Lidar (Light Detection and Ranging) is used. At the Coste dell'Anglone tracksite in Italy, hundreds of dinosaur footprints are scattered across a vertical rock face—a layer of sediment that was tilted up by the formation of the Alps. Hanging off a cliff to measure tracks is dangerous and inaccurate.
Using ground-based Lidar, researchers scanned the entire cliff face. The laser shoots millions of pulses of light, measuring the time it takes to bounce back. The result is a "point cloud"—a ghostly digital sculpture of the mountain. Algorithms can then strip away the vegetation and smooth out the weathering, revealing the tracks with a clarity that the naked eye can't achieve.
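A toy version of one common cleanup step: vegetation sits above the local rock surface, so a simple pass keeps only points near the lowest elevation in each grid cell. The synthetic point cloud and the 0.3 m threshold below are assumptions; real pipelines use far more sophisticated ground filters.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic scan: a gently undulating rock face, with 10% of points
# belonging to "vegetation" sticking 0.5-1.5 m out of the surface.
x = rng.uniform(0, 10, 5000)
y = rng.uniform(0, 10, 5000)
rock_z = 0.05 * np.sin(x) + rng.normal(0, 0.01, 5000)
is_veg = rng.random(5000) < 0.1
z = rock_z + np.where(is_veg, rng.uniform(0.5, 1.5, 5000), 0.0)
cloud = np.column_stack([x, y, z])

# Keep only points within 0.3 m of the lowest point in each 1 x 1 m
# cell; vegetation sits well above that band.
kept = []
for i in range(10):
    for j in range(10):
        cell = cloud[(x >= i) & (x < i + 1) & (y >= j) & (y < j + 1)]
        if len(cell):
            kept.append(cell[cell[:, 2] < cell[:, 2].min() + 0.3])
ground = np.vstack(kept)
print(len(cloud), "->", len(ground), "points after the vegetation filter")
```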
3. CT Scanning: The Digital Dissection
Perhaps the most "Sci-Fi" advancement is the use of high-energy Computed Tomography (CT) scans to look inside fossils.
In the 1990s, preparing a delicate fossil skull meant risking destroying it with mechanical tools. Today, fossils are placed in industrial CT scanners. The X-rays slice through the rock, distinguishing between the density of the fossilized bone and the surrounding stone matrix.
However, a raw CT scan is just a stack of grey images. Separating the bone from the rock (segmentation) used to take a human hundreds of hours of manual clicking. Now, AI is doing this, too. A recent study on Protoceratops embryos from Mongolia used deep neural networks to automatically segment the tiny, fragile bones from the surrounding egg contents.
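The core idea behind density-based segmentation can be sketched as a simple threshold on a synthetic CT volume. Learned segmentation networks pick up far subtler cues than a single cutoff; the density values below are made-up assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
# Fake 3-D CT volume: low-density rock matrix surrounding a denser
# embedded "bone" block. The units are arbitrary made-up densities.
volume = rng.normal(100, 10, (32, 32, 32))                       # rock
volume[10:22, 10:22, 10:22] = rng.normal(180, 10, (12, 12, 12))  # bone

threshold = 140.0                  # sits between the two density peaks
bone_mask = volume > threshold     # True where a voxel reads as bone
bone_voxels = int(bone_mask.sum())
print(bone_voxels, "of", volume.size, "voxels classified as bone")
```

The hard cases are exactly where this toy breaks down: when fossil and matrix have nearly identical densities, networks must lean on texture and shape rather than raw voxel values.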
This "Digital Preparation" allows us to see things we never could before. We can see the semicircular canals of the inner ear to determine if a dinosaur had a good sense of balance (and thus, if it was agile or clumsy). We can digitally "re-inflate" crushed skulls to see their original shape.
Part VII: Soft Tissue and the Limits of Stone
The digital revolution isn't just about shape; it's about substance. For a long time, it was dogma that soft tissues—skin, blood vessels, proteins—decayed within weeks of death. Fossils were just stone replicas of bone.
That dogma was shattered by Mary Schweitzer at NC State University, who famously dissolved the mineral content of a T. rex femur and found flexible, transparent blood vessels and collagen inside.
In 2025 and 2026, AI and advanced imaging are taking this "molecular paleontology" to a new level. Researchers are using synchrotron rapid-scanning X-ray fluorescence (SRS-XRF) to map the chemical ghosts of soft tissue.
We can now detect the chemical signature of melanin pigments in fossilized feathers. We know, for a fact, that Microraptor was iridescent black like a crow, and that the small dinosaur Sinosauropteryx had a ginger-and-white striped tail.
AI models are currently being trained to recognize the microscopic texture of fossilized skin in CT scans—textures that human eyes miss. In 2026, this technology hinted at the presence of a "dewlap" (a flap of skin under the chin) on certain Edmontosaurus specimens, a feature that was previously pure speculation.
Part VIII: From Data to Experience – The VR Museum
All this data—the 3D trackways, the segmented skulls, the chemical color maps—has to go somewhere. It is fueling a renaissance in how the public experiences natural history.
We are moving away from the "cabinet of curiosities" model (static bones on shelves) to Immersive Paleontology.
- Visions of Nature (NHM London): Launching in late 2025, this mixed-reality experience uses the Microsoft HoloLens to transport visitors into future ecosystems. But the technology is based on the same photogrammetry engines used to reconstruct the past.
- Expedition Voyager (Edelman Fossil Park): Opening in New Jersey, this "free-roam" VR experience allows groups to walk together through a digitally reconstructed Cretaceous world. Because the digital assets are based on the latest scans, the dinosaurs move with biomechanically accurate gaits derived from the very footprint studies mentioned earlier.
- Dinoverse (Atlanta): This exhibit lets visitors "paint" dinosaurs in VR, but the skeletons they stand under are 3D-printed from high-resolution scans of real fossils, allowing rare specimens (which usually stay locked in vaults) to be displayed globally.
Part IX: The Philosophical Shift
This transition to Digital Paleontology raises profound questions.
If we can 3D scan a site like the Dinosaur Highway and then let the quarry destroy it, have we really preserved it? Is a terabyte of data equivalent to a ton of rock?
Most paleontologists argue yes—and that it's actually better. A physical trackway erodes every year it is exposed to rain and frost. A digital file, if backed up properly, is immutable. Furthermore, a physical site can only be visited by a few people. A digital site can be uploaded to the cloud and studied by a student in Mumbai, a professor in Berlin, and an enthusiast in Buenos Aires simultaneously.
This is the ultimate promise of the AI revolution in paleontology. It is breaking down the barriers of access. It is stripping away the subjective biases of the "lone genius" expert. It is revealing that the rock record is far richer, denser, and more surprising than we ever dared to dream.
The dinosaurs may be gone, but thanks to the marriage of silicon and stone, their digital ghosts are more alive than ever. We are no longer just looking at their footprints; for the first time, we are walking alongside them.
References:
- https://www.helmholtz-berlin.de/pubbin/news_seite?nid=32126&sprache=en&seitenid=
- https://www.youtube.com/watch?v=7W32Hvurq84
- https://www.thebrighterside.news/post/ai-helps-scientists-read-dinosaur-footprints-offering-new-clues-to-ancient-life/
- https://www.discovermagazine.com/how-ai-is-solving-some-of-paleontology-s-biggest-dinosaur-footprint-mysteries-48597
- https://earthsky.org/earth/dinosaur-highway-unearthed-footprints/
- https://www.youtube.com/watch?v=X7XY2XbYDF4
- https://www.frontiersin.org/news/2022/01/27/frontiers-earth-science-ai-reconstruct-dinosaur-fossils