The open-source map shows the changes that happen to city streetscapes over time.
In the 20 years he’d lived in New York, Raimondas Kiveris had seen the city change immensely. “It was a completely different place, a different town,” says Kiveris, a software engineer at Google Research. This got him wondering what his neighborhood looked like even before that—before he’d lived there, before he’d even been born. “There’s really no easy way to find that information in any organized way,” he says. “So I was starting to think, can we somehow enable this kind of virtual time travel?”
Three years later, his attempt at virtual time travel is taking shape as an open-source map that can show, in both a bird’s-eye view and a pedestrian-level view, the changes that happen to city streetscapes over time. With a slider to control the year, the map displays a historically accurate representation of development in almost any U.S. city dating back to 1800. Automatically generated 3D models of buildings rise from the landscape as the slider moves forward through time. It can even show a rough approximation of what a city would have looked like from a pedestrian’s view, like a low-res Google Street View.
The map, called “rǝ,” is a project Kiveris has led through his research into artificial intelligence and machine learning at Google. Though still in a very early form, the map is functional enough to offer a glimpse of what someone would have seen on a city street decades in the past.
The map was created using historical fire insurance maps, a rich source of information about the built environment that includes precise details on building ages, sizes, heights, roof shapes, and even construction materials. From these records, the tool generates simplified 3D models of the buildings, and the time slider lets a user see, for example, Washington, D.C.’s Dupont Circle nearly devoid of buildings in the 1870s and almost fully developed in the 1920s.
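The core idea behind the slider is simple: each building record carries a construction date (and sometimes a demolition date), and the map only renders buildings that existed in the selected year. Here is a minimal sketch of that filtering logic; this is not the project’s actual code, and the `Building` class and sample data are hypothetical, invented for illustration.

```python
# A minimal sketch (not the rǝ project's code) of how a time slider might
# filter building records parsed from historical fire insurance maps.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Building:
    name: str
    built: int                 # year of construction (from the insurance map)
    demolished: Optional[int]  # None if still standing

def visible_in(buildings: list[Building], year: int) -> list[Building]:
    """Return the buildings that existed in the given year."""
    return [
        b for b in buildings
        if b.built <= year and (b.demolished is None or year < b.demolished)
    ]

# Hypothetical records for a single city block:
block = [
    Building("rowhouse A", built=1875, demolished=None),
    Building("stable", built=1880, demolished=1925),
    Building("apartment tower", built=1930, demolished=None),
]

print([b.name for b in visible_in(block, 1870)])  # []
print([b.name for b in visible_in(block, 1890)])  # ['rowhouse A', 'stable']
print([b.name for b in visible_in(block, 1940)])  # ['rowhouse A', 'apartment tower']
```

In a real renderer, each surviving record would then be extruded into a simple 3D box using the height and footprint noted on the insurance map, which is how the blocky models described above would arise from the landscape as the slider moves.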
Kiveris wants the map to do more than model buildings over time. He and his team created it as an open-source project so that people such as librarians and map enthusiasts can contribute their own historical sources to add detail. It can even integrate photographs of buildings, using deep learning to analyze images and augment the blocky 3D models with architectural details.
“If we have photos of a building showing the facade in some detail, we can do much more,” he says. “We can essentially do semantic parsing of that facade and figure out this area here is a window, this area is a cornice, this is a stair, this is a door.”
This level of detail has already been visualized in some parts of Manhattan, such as the Chelsea neighborhood, where a user can enter the map’s 3D street-level mode and see streets lined with porches and stoops.
Eventually, with enough visual data contributed, Kiveris says the map will be able to create lifelike representations of entire neighborhoods that could be good enough to use as the setting for video games or even movies. “If it’s not possible today, it will be possible in five years,” he says.
That level of detail will require a lot more information, from a wider pool of contributors. “Where we think we really could get good coverage is, for instance, you going to your parents or grandparents and digging through shoeboxes and finding photos,” he says. With good photographs along with known dates and locations, the model can create 3D versions of buildings in a matter of days.
He’s hoping the map will eventually be able to model even more detail, such as the interiors of spaces. “What was inside the building in the 1920s, what did the kitchen look like in the 1940s,” he says. “It becomes a compendium of everyday, mundane life.”
Kiveris says it can also be a way to help create an archive of neighborhoods through time, even those places that might not seem worth preserving. “It’s not a historic building—maybe only a couple of people in the world care about it. But that’s kind of the point of this,” he says. “Landmarks are covered and preserved well enough, but the rest of the world is kind of disappearing.”