Whether you own any of the gadgets or not, it’s becoming increasingly difficult to avoid wearable technology. With established tech giants like Apple, multinational banks, and left-field start-ups all buckling up for the wearable ride, the future has gone from being held in our hands to being strapped to our wrists.
From experiments like Google Glass to the Apple Watch, many new wearables are marketed as extensions of the self, theoretically allowing wearers to walk through the world with added ease and awareness of their surroundings. With wrist-held maps telling you exactly where to walk and turn, you’ll never get lost again. With contactless payment, the awkwardness of PIN machines is a thing of the past. And with unlocking doors as easy as a flick of the wrist, say goodbye to the frustration of lost keys.
However, as yet, this interconnected world remains a tech company’s dream. It’s safe to say that most people don’t have Apple Watch-compatible locks at home, nor are they planning to install them any time soon.
The fact is that most new technology elicits some form of suspicion. This is rooted partly in our unease with the new and unfamiliar, and partly in the potential for faults or oversights in gadgets that are brand new and, as yet, largely incompatible with the world around them. Without an iPhone, an Apple Watch is just a watch. An expensive watch.
This is perhaps not a new phenomenon; throughout history we’ve been slow to adopt new technology, thanks to unfamiliarity and the stereotypes surrounding early adopters. Before Google Glass (and the “Glasshole”) there was the eyeglass, one of the first wearables designed to enhance a person’s experience of the world, allowing for clarity and sharper perception. Whilst acceptable and ubiquitous now, throughout much of their history glasses were stigmatised, with wearers viewed as pious, bookish, and even weak. This stereotype stems from their widespread mediaeval use amongst monks and the clergy, at the time one of the only groups who could read.
Another reason for the slow adoption of technology is the cumbersome nature of initial releases. Take, for example, one of the first portable timekeeping devices: the Nuremberg egg. A far cry from today’s discreet and highly accurate watches, Nuremberg eggs were large, worn around the neck, and fairly inaccurate. However, rather than being simply utilitarian pieces, they were status symbols, incredibly expensive to make and purchase. For this reason, they were worn only by the rich, powerful, and influential.
This has been the case with much wearable tech throughout history. With many new gadgets expensive to produce and limited in supply, only the most well-off and influential had any chance of acquiring them. Whilst this is still true to an extent – simply look at the £8,000 solid gold Apple Watch – in recent years wearable tech has become increasingly accessible. Whilst the “internet of things” might be a little way off, just about anyone can purchase and put a fitness tracker to good use.
The most successful wearable tech is perhaps the most accessible wearable tech; precisely the reason why you might not have heard of the air-conditioned top hat, or ever considered lighting up your house with a girl in a dress. However, every wearable arguably has its place, and whilst a sneaker phone might seem like a relic now, to those living in the ’90s it could very well have been the future. Take a look at the infographic to see what could have been, and what might be.