When Pokémon Go launched in July, the game quickly swept the world, capturing more than 20 million active users in less than two weeks. Children and young adults alike, many of whom would otherwise have stayed indoors, went outside and socialised. Hundreds of players flocked to parks to catch Pokémon. Cafés, bars, and pizza restaurants busied themselves setting up “Lure Modules” – the purchasable in-game item that attracts the virtual monsters and, in turn, draws real customer traffic.
But the game’s popularity fell almost as quickly as it had risen. The BBC reported that some 10 million players had given it up since mid-July. The decline came even as Pokémon Go was rolling out across dozens of countries in Asia and Latin America, including Hong Kong, Taiwan, and Japan – which means the fall in the United States and Europe must have been steeper still.
Even so, the rise of Pokémon Go, however short-lived it might be, has forever changed the development of augmented reality (AR).
Augmented reality is far from new. The idea of overlaying (or augmenting) a view of the real-world environment with computer-generated imagery was articulated by Tom Caudell, a Boeing researcher, as early as 1990. He coined the term “augmented reality” to describe an electronic system that guided workers installing electrical wiring in aircraft fuselages: on a head-mounted display, virtual graphics were blended with the physical scene in front of the worker. Many found the concept intriguing, but AR remained little more than a technological curiosity with few tangible benefits, and most research was confined to universities and independent hobbyists.
Early AR innovators focused almost entirely on industrial applications, and the bulky, expensive head-mounted displays held little appeal for consumers. More than two decades later, the closest thing to a genuinely useful tool was Volkswagen’s iPad app, which projected visual labels and instructions in real time to guide mechanics through repairing car parts.