
Digital addiction by design

The ‘like’ is a “bright ding of pseudo pleasure”

Our infatuation with devices is no accident, raising ethical and legal questions about how tech companies are exploiting our digital addiction.


A news video from Sydney in June last year, showing a pedestrian injured by a car while transfixed by her phone as she walked against the crossing lights, was shocking but not surprising.

On any busy street in Australia people can be seen glued to their screens: walking through crowds, sitting at bus stops, crossing the road, and waiting in their cars at red lights.

In a report from professional services firm Ernst & Young, 30 per cent of respondents who own smartphones said they were addicted to their devices.

It is a well-recognised downside of our infatuation with digital devices, but one that is not merely an unanticipated side effect, explains University of Wollongong ethicist Dr David Neil.

Addiction to these devices is no accident: the technology is designed to feed the multi-billion-dollar global attention economy, says Neil, a philosophy lecturer from the University's Faculty of Law, Humanities and the Arts.



University of Wollongong ethicist Dr David Neil. Photo: Paul Jones

Unlike more transparent traditional business models, the attention economy depends on personal information from a range of sources, from reward cards to digital devices, which can be harvested and sold for commercial purposes.

"Most of what we use online, like Facebook, Twitter, Snapchat, shopping and game sites, is free," Neil says. "The way they make money is by collecting and packaging the attention of users and selling it to advertisers who, in turn, use it to target individuals very specifically in terms of behaviour and consumption patterns."

To that end, the technology has been designed to maximise time on a device.

Fun features with an agenda

Features that users might consider innocuous fun, such as Facebook 'likes' or the red notification dot on many apps, are among the intentionally addictive designs that encourage us to fritter away more time than we intended.

Justin Rosenstein, co-designer of the Facebook 'like' button, is among the reflective tech apostates who are now re-evaluating their contribution to this technology.

He recently described the 'like' as a "bright ding of pseudo pleasure".

"The fundamental question is whether having the intention of causing people to become addicted is an acceptable business strategy," Neil says.

"The way they make money is by collecting and packaging the attention of users and selling it to advertisers who, in turn, use it to target individuals very specifically in terms of behaviour and consumption patterns" - David Neil

The Digital Australia report found 25 per cent of respondents with smartphones and tablets spend more time on their devices than talking with family and friends.

In the 2015 report it was 21 per cent and in 2016 it was 23 per cent.

A quarter of respondents said their social life would be non-existent without their digital device, a figure that has also increased marginally over the past three years.

"Those trends give us reason to be concerned that devices and apps can be addictive," Neil says.

"Nobody sits down and thinks, 'I should use this device five hours a day' or 'I need to check my phone 400 times a day because those are my communicative needs and requirements'.

"We develop habits in our electronic device use, and our neural reward systems are progressively conditioned into expectation and reward patterns that can form the basis of an addiction."

Dr David Neil looking through screens. Photo: Paul Jones

Free choice undermined

But we all have a choice, you say, to turn off the phone, the computer or even those seductive notifications.

"Free choice is an easy term to use, but it is much harder to explain what we actually mean by that," Neil says. "We typically feel that we have acted freely when we experience a want or desire and then act on that desire."

Neil argues we have far less free choice than we think.

"We tend to overrate our capacity to intentionally regulate our behaviours, and we tend to underrate the extent to which many of our behaviours are conditioned, repetitive and patterned, triggered by causes we are not entirely aware of."

The problem with addictive technologies, Neil says, is that we are being encouraged and manipulated into time-consuming behaviours for the benefit of tech companies and data brokers.

As an ethicist he is willing to raise and discuss society's problematic issues, particularly in the technology and biomedical fields, but is reluctant to be too prescriptive.

But he does believe some form of regulation of the technology industry is overdue.

"We typically feel that we have acted freely when we experience a want or desire and then act on that desire" - David Neil

Reflection and action

The only warnings consumers receive are buried in dense, legalistic Terms of Service agreements that are largely ignored. A recent US survey by Deloitte found that more than 90 per cent of people consent to legal terms and conditions without reading them.

Another disenchanted tech insider, Tristan Harris, a former product philosopher with Google, is one of the influential voices speaking out about questionable motivations of the attention economy.

He is a co-founder of the advocacy group Time Well Spent, formed to bring moral integrity to software design and to lobby the tech industry to help people disengage more easily from devices.

He has described the attention economy as "a race to the bottom of the brain stem".

"You could say it's my responsibility to exert self-control … but that's not acknowledging that there's a thousand people on the other side of the screen whose job is to break down whatever responsibility I can maintain," Harris says.

That loss of control will be exacerbated by the growing use of virtual reality technologies.


Yvonne Apolo from UOW's School of Law reflected on screen. Photo: Paul Jones

"As virtual reality becomes cheaper and better in terms of how real it is, it offers the ultimate form of escapism, the ability to opt out of the real world and spend your time in a virtual world that is more exciting and more geared to satisfy your individual needs and preferences," Neil says.

"The harm is that it encourages and enables a profound form of narcissism because people can spend their time in a world that is entirely structured around continual gratification of their needs and desires."

He argues this is particularly dangerous for younger people, who are still learning the social and psychological skills needed to navigate the world.

"The case for regulation is strongest for products that are aimed at children because they are more susceptible to manipulation and are less able to understand or resist the strategies that are used to hook them," Neil says.

"When you are targeting children and trying to modify their behaviour in ways that are economically good for you but not good for their health and welfare, that is when the state needs to step in."

A need to tighten existing privacy laws

UOW's School of Law lecturer Yvonne Apolo, who is completing her PhD, was asked to theorise how this could be done. She says Australia's Privacy Act 1988 is a good place to start; however, it would need reform to keep pace with the complex digital landscape of modern society.

"This framework's significant shortcoming is that it's built on a premise of privacy management that was developed in the 1970s and allows individual consent to legitimise virtually any form of collection, use or disclosure of personal information," Apolo says.


UOW School of Law lecturer Yvonne Apolo. Photo: Paul Jones

"Each time we download an app and agree to the 'permissions' sought, such as access to our public profile and location settings, we're essentially consenting to having our information collected and shared.

"While we ask consent to do a lot of work in many areas of law, it is particularly problematic in the context of privacy protection as it is difficult to weigh up the immediate benefits associated with using such apps against the more elusive future detriments caused by a very complex process of data aggregation and accumulation."

Apolo says we need to rethink how we approach the dual concepts of privacy and consent, in a world where all is not as it seems.

"If this industry is drawing upon social and behavioural sciences' insights to condition our information-sharing choices, then law too should engage with these insights to rethink the reliance it places on consent in the context of problematic privacy practices.

"It is possible to beef up our existing privacy framework by incorporating more substantive rules that govern how personal information can be collected, used or disclosed.

"Reliance on consent is inappropriate when deception is being used to condition our choices and render the divulging of personal facts addictive or pleasurable."

Banner graphic by Jasper Smith.