Express & Star

How Google is trying to change the way you see technology

Is our tech becoming more like us?

Technology such as Google Assistant is at a tipping point, its product management director believes (Jonathan Brady)

From “OK Google” to “Alexa” or “Hey Siri”… there’s a revolution under way in how we use our computers, and even in what we consider to be a computer. For technology giant Google, it’s about building the most natural conversation possible. How does the man charged with directing that conversation think the world of human-computer interaction is changing?

In the morning, just before I leave the house I ask: “OK Google, how’s my journey to work looking?”

Sat in a corner of my living room, my Google Home smart speaker chirps up and gives me an update on road conditions, throwing the directions and latest traffic details to my Google Pixel phone.

It’s a simple, frictionless interaction which masks a staggering amount of complexity.

“We started with wanting to build an interface that allowed you to have a conversation with Google,” explains Gummi Hafsteinsson, the man charged with leading Google Assistant.

Mr Hafsteinsson is one of the pioneers of the technology. He worked at Siri, the company bought by Apple to bring intelligence to its devices, later joining the Cupertino firm.

He’s now product management director for Google Assistant – his second stint at the firm. The technology launched 18 months ago and, he says, has reached a tipping point.

“Once we started to look into what did it mean to have a conversation with technology, it opened up a lot of opportunities,” he said.

“(With Assistant), you can have a conversation with technology and you shouldn’t have to learn how to use it.”

He describes Google Assistant as the interface that adapts to us as people, rather than requiring us to adapt to the technology.

Thirty years ago, if you wanted to use a computer you most likely sat at a fixed spot in your home or an office, typed on a keyboard and used a mouse to control a graphical user interface.

A decade later you could shift the same experience to a laptop – the interface was the same; only the location could change. Ten years ago the birth of the first modern smartphone fundamentally altered our relationship with computers, and while it helped remove barriers, it still felt like a relationship between user and tool.

Today we are on the brink of a shift, from computing power being held in a connected device that we touch to something far more ambient, with a resulting shift in how we interface with that technology.

Is the technology becoming invisible?

Mr Hafsteinsson says: “To a point, yes. Invisible, ephemeral – it should be seamless. It should be there for you. But it shouldn’t feel like you are having to think about how to do something. It should be natural.”

The logo of internet search engine Google reflected in a pair of glasses
(Dominic Lipinski/PA)

Mr Hafsteinsson says this is a golden age of innovation for such technology.

“We don’t want the consumer to think about the complexity,” he says.

“There’s been a lot of progress. We’ve got really good at speech recognition, industry wide. At Google we have a really good technology stack. We are making really good progress. But we are not done.”

From playing music and starting videos to setting timers and controlling your lights or heating, Google Assistant is well named – it’s a helpful companion that removes some of the manual steps from many perfunctory tasks.

It’s easy to mistake it for a different kind of remote control, but where the technology is becoming genuinely transformative is in its ability to answer a surprisingly diverse range of questions.

A Google logo on the screen of a mobile phone
(Yui Mok/PA)

Despite being the architect of Google Assistant, Mr Hafsteinsson laughs when asked if it still surprises him.

“I asked Assistant how I could become a ninja, and it had an answer,” he chuckles.

“We didn’t program it (to respond to that question) – it found information on the web. It was a paragraph that made sense when read out.”

He says the next step is a fully natural conversation with the technology.

In the coming year Assistant will start to inhabit a greater array of devices – from televisions to car systems to wearables and more Android phones.

Google’s approach is to take Assistant into many more contexts than Amazon’s Alexa, which seems confined to the living room, while Apple’s HomePod is being marketed as a speaker first and a smart speaker second.

A living room with a television and a guitar
(Getty Images/Customdesigner)

He says: “Because it is so natural, it scales across so many different devices – today on speakers and phones and it’s coming to display (dedicated consumer electronic devices with screens). It’s available on TV, in your car, on your watch; it’s the same concept to have a conversation but it scales very naturally.

“We are getting to the point where I shouldn’t have to think about the device – it’s there when I need it and doesn’t get in the way.

“Once you have all these points working together, it is a tremendous amount of delight through sheer ease of use.”

For Mr Hafsteinsson, the technology’s rapid development fulfils a goal that sparked his interest in human-computer interaction many years ago.

“When I was studying engineering way, way back, I was fascinated by the idea of the technology becoming more like us than the other way around.

“I was a huge techie and always trying to get my family to use technology. And my dad would say, ‘I will use it when I can talk to it’.”

That day is certainly upon us.
