[vc_row][vc_column width=”1/2″][vc_column_text]I remember watching Steven Spielberg’s epic ‘Minority Report’ in the cinema when it came out back in 2002 – a prediction of how we might be living in the future, more specifically the year 2054.
At the time, I remember being in awe of every facet of the blockbuster and, truth be told, a little scared – for obvious reasons! As a 13-year-old in 2002, there were no smartphones and only a few people owned a mobile.[/vc_column_text][/vc_column][vc_column width=”1/2″][vc_single_image image=”3530″ img_size=”medium”][/vc_column][/vc_row][vc_row][vc_column][vc_column_text]My young mind wasn’t able to process the inevitability of the film. I just chalked it up to science fiction and moved on. Looking back on it 15 years later, I can see more and more how it presented a plausible scenario of life in the not-so-distant future…
Far from the dystopian reality of many sci-fi films, Minority Report now feels realistic – as though scenes throughout the movie might be only a few years away, most notably in how we interact with devices.
We already interact with our computers and phones on an unprecedented level, with some reports suggesting we click, tap or swipe our phones on average 2,617 times a day. However, these interactions aren’t nearly as advanced as the way in which Minority Report’s protagonist, Chief John Anderton, uses gestures to control and communicate with the computer. What seemed like fantasy then now seems more plausible than ever.
In his latest book, ‘The Inevitable’, Kevin Kelly describes how today’s devices – laptops, tablets and phones – are largely ignorant of their owners’ use of them. He cites a quip from Nicholas Negroponte (head of the MIT Media Lab) that the urinal in his men’s restroom was smarter than his computer, because it knew he was there and would flush when he left – while his computer had no idea he was sitting in front of it all day. So true!
Unless I swipe my phone or tap my keyboard, how does a device know I’m there? We’re not utilising all our senses or bodies to communicate and interact with our devices… yet.[/vc_column_text][/vc_column][/vc_row][vc_row][vc_column][vc_single_image image=”3531″ img_size=”large”][/vc_column][/vc_row][vc_row][vc_column][vc_column_text]However, with the introduction of Virtual Reality (VR), we are beginning to get a glimpse of how we might interact with technology in the future.
I recently tested the Oculus Touch and immediately understood the potential this platform offers – and what we can learn about how the mind and body behave in a virtual world. Using the Touch controllers hijacked three of my five senses, launching my brain into a heightened state of anxiety. I knew I was standing in Big Motive’s studio playing Bullet Train, yet my sense of touch was overwhelmed. I found myself reaching out to grab objects in a virtual world – whilst understanding they weren’t actually there.
VR provides a deeper, more immersive experience – incomparable to tapping a screen. However, achieving it requires donning a headset, using hand controllers and, in some cases, even a full bodysuit. This doesn’t look or feel as cool as Tom Cruise did in Minority Report. Nonetheless, VR is still only at the ‘Slope of Enlightenment’ stage of the Gartner Hype Cycle and, according to a recent report, it won’t go mainstream until approximately 2019 – so we still have a few years yet to refine the tech and figure out how to make the goggles a little sleeker.
The devices we use today may not know we’re present, but we are beginning to use technology on a deeper level than ever before – with eye-tracking mechanisms built into phones and voice-controlled Artificial Intelligence such as Apple’s Siri and Amazon’s Alexa.
This development begs the question: in a world of AIs and virtual experiences, will interfaces disappear altogether? Will we still be swiping, pinching and tapping or, like Tom Cruise’s John Anderton, will we use our bodies to control and communicate with machines?[/vc_column_text][/vc_column][/vc_row][vc_row][vc_column][vc_single_image image=”3533″ img_size=”large”][/vc_column][/vc_row][vc_row][vc_column][vc_column_text]John Underkoffler – the man who invented ‘G-Speak’, the gesture interface system used in Minority Report – claims that’s exactly where we’re headed.
Underkoffler believes the old mantra of ‘one machine, one human, one mouse, one screen’ will soon be redundant. In his TED Talk ‘Pointing to the future of UI’, he declares that in the future we’ll collaborate with each other more, requiring more intuitive interfaces that support this teamwork. This could involve microphones, heat sensors, accelerometers and cameras opening up deeper opportunities for interaction in our UIs – ultimately helping devices to ‘hear us, see us and feel us’. This intuitive UI could be built into the bezel of every device, or use our own bodies – like MIT’s Smart Tattoo, which can turn your skin into a touchpad. While this technology is at a very early stage, it’s a tantalising glimpse of a possible future.
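To make the idea of devices ‘feeling us’ a little more concrete, here is a minimal sketch of how raw accelerometer readings might be turned into a simple gesture – detecting a deliberate shake. Everything here (the function names, the thresholds, the data format) is an invented illustration, not the API of any real device or of Underkoffler’s G-Speak system:

```python
# Minimal sketch: classifying a "shake" gesture from raw accelerometer
# samples. All names and thresholds are illustrative assumptions.

def magnitude(sample):
    """Euclidean magnitude of one (x, y, z) accelerometer reading."""
    x, y, z = sample
    return (x * x + y * y + z * z) ** 0.5

def is_shake(samples, threshold=2.5, min_peaks=3):
    """Treat a burst of high-magnitude readings as a shake gesture.

    samples: list of (x, y, z) tuples in g-force units.
    threshold: magnitude (in g) above which a reading counts as a peak.
    min_peaks: how many peaks constitute a deliberate shake.
    """
    peaks = sum(1 for s in samples if magnitude(s) > threshold)
    return peaks >= min_peaks

# A still phone reads roughly 1 g (gravity only); a shake spikes well above it.
still = [(0.0, 0.0, 1.0)] * 10
shaken = [(0.0, 0.0, 1.0), (3.0, 0.1, 1.0), (-2.8, 0.2, 1.0),
          (2.9, -0.3, 1.0), (0.0, 0.0, 1.0)]
```

Even this toy version shows the shift Underkoffler describes: the device is no longer waiting for a tap – it is continuously sensing the body and interpreting what it feels.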
So what else did Minority Report foresee?
If you guessed predicting the future, you’re right!
If you can cast your memory back, it predicted the use of algorithms to forecast people’s future actions based on their past behaviour. Again, my 13-year-old self treated this prophesied technology with a large spoonful of scepticism. And guess what… I was wrong again.
Recently, Admiral Insurance was forced to pull its latest product, ‘First Car Quote’, due to privacy issues (it was later launched with reduced functionality). The insurer had planned to use an algorithm to analyse Facebook posts, likes and comments (not photos), looking for personality traits associated with driving. For instance, the way you communicate with your friends online – your grammatical patterns, semantics and general punctuation – could be scrutinised to ‘predict’ your driving style. Frequent use of “always” or “never” rather than “maybe” would be held against you, ultimately affecting the price you pay for insurance.
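The basic mechanics of that kind of language scoring are surprisingly simple. Below is a toy sketch in the spirit of the reported approach – counting absolutist words against hedging words across a user’s posts. The word lists, function name and scoring are invented for illustration; Admiral never published its actual model:

```python
# Toy illustration only: scoring "overconfident" language in social posts.
# Word lists and scoring are invented assumptions, not Admiral's real model.

ABSOLUTIST = {"always", "never", "definitely"}
HEDGING = {"maybe", "perhaps", "possibly"}

def overconfidence_score(posts):
    """Fraction of flagged words that are absolutist, across all posts."""
    words = [w.strip(".,!?").lower() for post in posts for w in post.split()]
    absolute = sum(1 for w in words if w in ABSOLUTIST)
    hedged = sum(1 for w in words if w in HEDGING)
    flagged = absolute + hedged
    return absolute / flagged if flagged else 0.0

posts = ["I always arrive on time, never late!",
         "Maybe we could meet on Tuesday?"]
```

Real systems would of course be far more sophisticated, but the principle is the same: everyday word choices become a numeric signal, and that signal feeds a prediction about you.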
Sound familiar? It reminds me of Minority Report’s ‘Precrime’ programme, powered by the precognitive ‘Precogs’ – where citizens were held responsible for crimes they hadn’t (yet) committed, based on past indicators. Although Facebook pulled the plug on Admiral’s use of its data over concerns about personal data, this is just the tip of the iceberg for ‘prediction innovation’: a recent report from Stanford University states that by 2030 we will rely heavily on ‘predictive policing’ – just like in the movie.[/vc_column_text][/vc_column][/vc_row][vc_row][vc_column][vc_single_image image=”3534″ img_size=”large”][/vc_column][/vc_row][vc_row][vc_column][vc_column_text]Steven Spielberg’s film was derived from a short story by Philip K. Dick and, at the time, featured many unheard-of technologies. More surprisingly, it painted a vivid and strikingly accurate picture of how we would use technology in the future – measuring and tracking everything; human-to-machine interactions becoming more visceral and immersive; autonomous cars; 3D holographic videos.
One version of the future that Minority Report suggests is that we may not be dealing with interfaces at all. We could simply be talking to AIs, as we frequently see Chief John Anderton do.
The digital prophet Scott Galloway predicts that Amazon could become the first trillion-dollar company. His reasoning? Its ‘foray into voice’ and its ability to harness customer data. Thanks to Amazon, we are about to experience the birth of ‘pure frictionless commerce’ through zero-click ordering via its Alexa AI – user experience nirvana, where the brand knows the customer so intimately that the interface disappears altogether.
Whatever the future holds, the interfaces we depend on will need to evolve rapidly from the traditional tap-swipe-pinch approach we’ve become a little too used to.
Collectively, we need to embrace how technology is enabling greater utility for the user, and how this impacts user experience as well as the very nature of how we use digital products.
It’s out with the old and in with the new, or I can guarantee we’ll find ourselves a little out of the loop in the not-too-distant future.[/vc_column_text][/vc_column][/vc_row][vc_row][vc_column width=”1/2″][vc_column_text]
Carol McHugh is Business and Marketing Analyst at digital design and innovation studio Big Motive in Belfast.[/vc_column_text][/vc_column][vc_column width=”1/2″][vc_single_image image=”5581″ img_size=”large”][/vc_column][/vc_row]