Business thinker and author Rachel Botsman, a speaker at Fujitsu Forum Munich 2018, reveals how technology is impacting the way we trust, and the opportunities and consequences that presents.
Trust: it seems a simple, universal concept. But try to define it and you quickly run into trouble. Explain to someone why you trust something and it’s difficult to get beyond ‘gut feel’ or other intangibles like ‘belief.’ Yet while trust is intrinsic to our everyday lives and to the functioning of society as a whole, we rarely consider its basis or its importance.
Remove trust and the world would descend into bleak chaos, with little getting achieved. Who would look after your children when you’re at work? How could you use public transport with any confidence? What mechanisms would enable even the simplest of business transactions? None of that happens without a solid foundation of trust.
However, the nature of trust is going through an evolution, as it becomes intrinsically tied to technology and our increasingly digital lives. The consequences are both wonderful and fearful.
Rachel Botsman, a lecturer at Oxford University’s Saïd Business School and author of business bestsellers including Who Can You Trust?, has been fascinated by the complex relationship between trust and technology for more than a decade.
“‘What is trust?’ is actually one of the hardest questions to answer. A lot of people think of trust being about certainty, but it’s actually the opposite of that. If you’re sure of the outcome, if there is no risk, no trust is actually required.” She defines trust as “a confident relationship with the unknown. When you think of it through this lens it becomes the bridge between something certain and something uncertain. Trust enables us to place faith in unknown people, unknown products and services, unknown ideas. It allows us to be vulnerable. It allows us to place our faith in new innovations and for society to move forward.”

Trust and technology
We’re at a crossroads with trust: technology has accelerated the time it takes for individuals, businesses, even countries to place trust in new products or processes. Think about offering your house keys to an unknown person (Airbnb) or getting into a stranger’s private car (Uber or BlaBlaCar). That is giving life to a whole world of disruptive innovation. At the same time, this new world of trust is having unintended consequences that aren’t yet fully understood.
As Botsman, a renowned TED Talk speaker, says: “The science behind the way trust works hasn’t changed, but the process for actually giving our trust has. Technology can’t fundamentally change how trust works but it is changing our behaviors and who we trust. Digital tools are changing what we expect of people and how we interact with one another.”
She adds: “Technology has impacted trust in fascinating ways. On a positive note, it’s allowed us to place our faith in complete strangers. It’s enabled innovations and marketplaces to flourish; it’s enabled people to collaborate and connect in ways that we’ve never seen before.
“But we’re also starting to see the consequences of people giving their trust away too easily,” says Botsman. “Technology has enabled us to assess people [remotely] and to give our trust much more easily and quickly. If we’re using a new service we have instant access to real-time reviews and ratings, and we’re so used to trusting these sources that we’ve developed a habit to just click, accept and share. We want the benefits immediately, so we let convenience trump trust.” But this is leading us into worrying territory, she observes. While there is much discussion today about the nature and sources of fake news and misinformation, in a few years AI may be generating false content so sophisticated that observers will be unable to distinguish it from the authentic.
“With the rise of ‘deep fakes’ you won’t be able to watch a video of a politician and know whether they actually said the words [you’re hearing]. It’s very confronting for the future of society.”

Enforced friction
Botsman points to a surprising potential solution: the need for “more friction.” She says: “The efficiency that technology creates can be the enemy of trust. There is a degree of friction that is really healthy in society. What technology often does is put us in this almost automatic mode. For example, sharing an article solely based on the headline without reading it or looking at the source is a form of accelerated trust.”
She thinks technology companies should consider how they can introduce a degree of friction into the customer experience so the customer asks: ‘Do I have the right information to decide whether this person, product or company is actually worthy of my trust?’
She calls this the ‘trust pause.’ When technology companies champion frictionless transactions and processes that are seamless, accelerated and efficient, that very ideal can carry unintended consequences. Yet introducing a slight element of friction presents an ethical challenge for companies because, as Botsman says: “They know how to do this.”