Recently, Microsoft announced a new $480 million contract with the United States Army to provide around 100,000 HoloLens units for a combat and training program focused on augmented reality. Besides illustrating a huge new boost for Microsoft as it quietly becomes one of the most valuable companies in the world, this deal also demonstrates the continued interest in wearable technology, big data and logistics, and modern digital communication to enhance combat capabilities on the ground.
While some at Microsoft protest this joint venture, others point out the historical relationship between military necessity and technological development. Augmented reality is no exception to this: the majority of AR technology derives, directly or indirectly, from military investment and research stretching back to the Cold War.
To better understand how we got to this point, it helps to trace how digital technology has evolved since the mid-20th century.
Digital History: The Military Roots of Modern Digital Technology
The military's interest in technological research is evident in the way innovation tends to "burst" during and after wartime. This, in part, reflects the development of scientific inquiry alongside the conflicts of the early 20th century.
For example, during World War I, advances in military science and technology shaped the deployment of armored vehicles, submarines, naval technology, and combat aircraft. In World War II, the already-established relationship between science and war led to the development of guided rocket systems, nuclear energy, and radar tracking systems.
However, it was during the post-WWII, Cold War era that communication, and communication technology, emerged as a huge area of development and investment. As the United States and the Soviet Union settled in for an arms race that would last for decades, the U.S. government partnered with (and funded) public and private universities to pursue research in several new fields, like wireless communication, digital imaging, and computing. The idea was that the future of combat would rest on the ability of armies to communicate quickly and respond rapidly to changing conditions.
Accordingly, the United States military awarded several research contracts for new innovations like decentralized computing machines, automated hardware for radar detection, and a new, decentralized communication method that used digital data and a technique called "packet-switching". This method lay at the heart of the ARPANET project, the precursor to the Internet.
While technological leaps occurred relatively soon after the war, continuing research stretched from the 1950s through the 1990s as funding from government and military contracts flooded into universities to support the work of scientists and researchers. Over those decades, developments in computation and communication led to more interest in digital imaging, virtual reality, and geolocation, all of which would set the stage for the development of augmented reality.
Augmented Reality as an Academic and Military Technology
One of the first technologies that can make a claim to the title of "augmented reality" (although the term wasn't coined until much later) is the "Sword of Damocles". Developed by Ivan Sutherland and his student Bob Sproull at Harvard University, this system consisted of a set of goggles connected to a room-sized computer system that tracked head position, coordinating that data with sensors to display a wireframe model of the room for the viewer. Although more accurately described as a virtual reality system, the Sword of Damocles was a prototype for many of the technologies that would comprise augmented reality, including wearables, location-aware devices, and the use of digital objects within a spatial framework.
Research into VR and AR continued for years with varying degrees of success, although development was slowed by the fact that the only regular applications of such technology were in academic research or military innovation. In 1978, Myron Krueger developed the "Videoplace" lab at the University of Connecticut, one of the first virtual reality technologies that allowed users to interact with digital objects rather than just view them. This key aspect of augmented reality shifted the viability of VR and AR, as the digital information utilized by devices could be translated, in some fashion, into a real, physical space.
In the following decades, practical and theoretical research would shape the field of augmented reality into what we see today. Gavan Lintern, from the University of Illinois, published a paper in 1980 demonstrating the benefits of an augmented heads-up display for improving pilots' flight skills. This coincided with several efforts to develop Heads-Up Display (HUD) technologies, including Steve Mann's development of the first wearable computer in 1980. Mann's work superimposed digital images on photographs of the environment through an eyepiece, serving as a precursor to technologies like the HoloLens and Google Glass.
This development reflected a continued focus on HUDs and augmented reality in the military, specifically the Army and Air Force. Through the 80s and 90s, the U.S. military invested heavily in AR research, working to develop accurate HUDs that would give pilots and soldiers an edge in combat situations. In a complementary avenue of research, several advances in robotics led to military-funded, AR-equipped vehicles for training and simulation-building purposes. These devices provided new technology to help soldiers better prepare for war, while serving as a constant test bed for emerging AR technology.
From Military to Business and Consumer Technology
Like most technology, visions of AR as a consumer technology and social reality were always part of the public imagination. In science fiction, the blending of digital space with real-world environments permeated films like Aliens and The Terminator, as well as literary subgenres like cyberpunk and space opera. Images of an individual viewing the world through an optical device that overlays real-time information on the environment were plentiful, and in many ways were influenced by the continued academic and military development of AR technology.
Much like wireless communication and its portrayal on shows like Star Trek, AR technology was at once both an emerging reality (for those "in the know") and an unreal science fantasy. But the influence that popular culture and existing technological research have on one another provided the foundation for contemporary AR. In a case of art influencing life (and life influencing art once again), technology companies began to develop their own augmented reality tools like wearables and software. While this started slowly with ill-conceived gaming systems and devices, the arrival of smartphones, portable computers equipped with cameras, GPS location tracking, and copious processing power, provided a baseline for AR adoption. Digital maps and games began to mix real-world images with digital information to support user interaction, and the rise of games like Pokémon Go and apps like Snapchat and Instagram demonstrated the potential for AR as a consumer technology.
Meanwhile, AR developers looking to fully realize AR as an always-on tool started to invest in wearables. With Google Glass released for beta testing in 2013, and Microsoft's HoloLens following suit in 2015, wearable AR technology began to leave the confines of military research and science fiction and enter the public space.
The Future of Military AR Technology
In a reversal of the historical trend, private companies are now providing research and support for military development of AR.

Military investment in AR technology never really waned, however. The U.S. Naval Research Laboratory continues to develop the Battlefield Augmented Reality System (BARS), and active military interest in AR remains a prominent part of overall military strategy.
Still, the announcement of the Army's investment in Microsoft HoloLens units signals both a sustained interest in AR technology for combat purposes and an increased role for private companies as leaders in hardware development for military applications.
This is a good sign for the future of AR. While AR adoption is increasing (and expected to continue to increase) in several consumer-grade applications like video games and retail, major adoption has yet to occur. As of this writing, the HoloLens is reported to have sold around 50,000 units, only half the size of the Army contract. In an interesting reversal of historical relationships, it seems the private sector has driven innovation in AR and wearables to the point that they are finding a life in industrial and military applications. This is supported by the Google Glass wearable, considered less than successful on the consumer market, finding new life as a manufacturing and enterprise technology.
The news of the Microsoft agreement signals a new era of collaboration between the private sector and the military in the AR market. While augmented reality is already projected to attract billions of dollars in investment over the next decade, the addition of military and industrial interest only supports the prediction that AR is a foundational technology for the future.