Cognitive Control in an Attention Economy
‘Attention and Memory in the Digital Age’ was a well-attended event that opened up an intriguing array of questions about our present and future in a technologically saturated society. The event took place at THECUBE on 21st January, a night of extreme conditions that brought our first taste of sleet and snow for 2019. Despite the weather, the four anchor speakers and audience soon warmed up over an enthusiastic debate, bringing together a range of perspectives, some of them sharply contrasting.
The real focus of this roundtable discussion was the function of attention. I originally selected this subject for its basis in neuroscience, and because of its prominence in the wider neurological debate over the impact of digital technology. Since none of us has yet lived a lifetime alongside our handheld devices, the debate is a contentious one. The scientists brave enough to forecast these effects put their reputations at risk, as without hard evidence their research can only be read as prediction.
One of these scientists, Richard Cytowic, observes that amid the growing debate over the addictive quality of our devices, whether the internet weakens our memory or makes it smarter by showering us with facts, and whether social apps connect or isolate us, “what no one seems to dispute is that our attention spans have gone to hell” [1]. We will address the function of attention from a range of contexts, beginning with this statement and the biological perspective.
Nilli Lavie, UCL Professor of Psychology and Brain Sciences, who specialises in attention and cognitive control, opened with a definition of attention that helped ground our foundational knowledge of the subject. Attention, she explained, has a capacity limit that cannot be altered by anything we do. Neurologically, Lavie held the opinion that technology cannot affect this capacity, as attention is biological, with a genetic component. There are certainly strategies that can help one focus, and some people find focusing easier than others, but we cannot create or generate more attention.
Cytowic similarly states that attention has a limited capacity which cannot be expanded by any amount of exercise or lifestyle choice. He emphasises that the relationship between technology and attention must be approached through the lens of energy cost. Digital applications promote media multitasking, which demands frequent task switching and the spreading of one’s attention across a number of tasks. Cytowic explains that the average human brain, which accounts for 2% of the body’s mass, consumes 20% of the body’s daily glucose [2]. To be energy efficient, the brain must use as little cell signalling as possible at any one time, carrying the most information for the least energy. Multitasking therefore consumes more energy, as our brains must increase signalling across a greater number of neurons. Every time we switch task, our neural circuits must take a break, which can reduce brain efficiency by up to 50% [1].
The response to Cytowic’s research sparked a debate central to the discussion: is technology really the problem, causing us to act in a certain way, or should we be blaming ourselves?
Lavie stated with conviction that we are responsible for our own actions, and that just as we have the power to control ourselves around alcohol or sugar, we can do the same with technology. This draws on the fact that technology doesn’t have the capacity to want us to do anything, as it has no logic or direction of its own. As critical theorist Dale Carrico emphasises, there is no such thing as technology as a thing in its own right. It makes no sense to declare oneself ‘for’ or ‘against’ technology, as there are many existing technologies advancing and stagnating at different rates [3]. Professor of Physics Richard Jones adds to this interpretation, stressing that:
‘The agency belongs to the people who make technology and the people who use it. Technology doesn’t want anything, people do’ [4].
This is a mistake I sometimes make myself, slipping into the easy generalisation of ‘technology’ while describing the effects of our digital devices.
Vanessa Bartlett, UNSW researcher and curator, raised a similar point about the language we were using: technology is not a thing in itself, but an infrastructure. If technology really is having an adverse effect on its users’ attention spans, whose interests does this serve? The ecosystem behind technology is built on power, data and advertising, and it is these components that manipulate what we pay attention to. Human behaviour is a key driver in the design of digital applications because of the financial interest at stake. Perhaps our immersion in the attention economy stems from the marketing campaigns behind our digital devices.
Btihaj Ajana, KCL lecturer in digital culture, held the opinion that we share responsibility with technology, and that neither party is blameless. Digital technology, as an application, is designed to frame the way the user acts. Twitter, for example, restricts its users to posts of a very short length, so information is posted and processed in small bites. If a particular social media platform happens to be the dominant site for social interaction, its format will influence the way we engage with one another.
Ajana also reflected on the caution many people feel towards digital technology, reminding us that our fears of modern technologies are historical and our anxieties aren’t new. Looking at patterns of behaviour in our relationship with technology, humans have repeatedly feared new tools before accepting them as the norm. Plato once predicted that writing would weaken memory, causing us to ‘mistake the truth for its shadow’. In reality, Ajana reflects, writing has instead re-mediated memory. We have replaced mental arithmetic with calculators, hand-drawn maps with GPS systems, and even handwriting with the printing press and the keyboard. At the centre of all these tools is their power to free the brain from those tasks, making room for us to carry out other forms of cognition.
Designer Ben Koslowski, who holds a PhD in Communication Design from the RCA, questioned how bad distractions really are, given that they help open up new journeys of discovery. In a traditional setting such as a library, Koslowski has found himself searching for a book and coming across additional sources of interest along the way. After spending several more hours than expected searching and reading, he hadn’t even found the book he had been looking for, yet he had gained a far richer experience uncovering unexpected information and unknown sources. This is no different from searching on one’s computer and becoming distracted by related articles or tasks along the way. He also pointed out that in a busy, fast-paced information society, losing focus can be a refreshing break for an overactive mind.
Another argument put forward in the discussion stressed that cultural expectation is a driving force behind the digital age. Many corporations now expect employees to take part in digital health schemes, where the tracking and monitoring of staff physiological data and exercise have become the norm. Ajana described how opting out reflects negatively on your career, as you risk no longer being seen as an active team player.
China has introduced a new social scoring system, which plans to rank all of its citizens by their ‘social credit’ by 2020 [5]. Ajana described how citizens will be ranked according to factors such as their internet use, smoking habits or bad driving, and can be punished or rewarded depending on their score. The system is largely run by city councils or private tech companies that hold personal data, and will penalise individuals who decide to ‘opt out’. Ajana illustrated that what we perceive as a choice is slowly shrinking, and that before we know it, it will be an expectation.
Analogue alternatives are disappearing as their technological replacements become the accepted norm. The rise of online banking has led many bank branches to close. The elderly generation, who until now haven’t always needed the resources technology provides, are being forced to learn and adapt as their physical alternatives diminish. In contrast, the younger generation, often described as digital natives, have been brought up on this digital diet and are well accustomed to a society where spending time on one’s smartphone is the norm. For them, not being on one’s phone carries its own cost: the fear of what they might be missing. What do you lose when you don’t join?
Withstanding these pressures of digital social presence comes down to discipline. One audience member asked what it takes to decide what we assign our attention to. It is not just the younger generation whose peers expect them to be online; the same pressure exists among adults. Instant messaging has instilled an expectation in society that we are always online and therefore always available, particularly for freelancers, who in a sense can never clock off from their working hours. At times, not answering an email can feel like risking appearing disengaged.
This social expectation is another driver behind our constant immersion in the digital age, and another nudge towards behaviour that is easily distracted. What happens in the virtual realm will often influence what happens to us in the physical realm. The two are not separate; they are interrelated.
Attention is part of an ecosystem that is economic, social, political and ethical; it is not purely about biology. Technology serves the intentions of its makers, which can also be measured in economic, social, political and ethical terms. It is evident from the examples discussed that we won’t always have the option of maintaining complete control over our experiences with technology, as it is evolving to become an integral part of our health and professional lives. With a future that will one day be reshaped by the minds and motives of our current digital natives, it may take some discipline to overcome the powers in control of attention, but it really is down to the individual to assess how they will direct and share their attention.
We must therefore consider, in our behaviour with technology, what this relationship means to us and to the people behind it. Assessing an appropriate level of commitment and contact with our devices will help us maintain control and put us in the driver’s seat over how we manage our attention. Although we might not always have the option to ‘opt out’ of certain systems, we do have the discipline and power to choose what we assign our attention to, and should exercise this ability to maintain a healthy relationship with our digital devices.
This essay is part of a series examining the research developed through the programme Cognitive Sensations. To read Fields of Perception, an essay exploring the impact of virtual reality and digital technology on the relationship between perception and the environment, please click here.
To listen to the audio recording of this talk, please click here.
References
1. Cytowic, R. (2015) ‘Your Brain on Screens’, The American Interest, Vol. 10(6), 9 June 2015. [Available online]
2. Cytowic, R. (2015) ‘What percentage of your brain do you use?’, TED-Ed. [Available online]
3. Carrico, D. (2013) ‘Futurological Discourses and Posthuman Terrains’, International Journal in Philosophy, Religion, Politics, and the Arts, Vol. 8, No. 2, Fall 2013. [Available online]
4. Jones, R. (2016) ‘Against Transhumanism: The delusion of technological transcendence’, Soft Machines. [Available online]
5. Ma, A. (2018) ‘China has started ranking citizens with a creepy “social credit” system — here’s what you can do wrong, and the embarrassing, demeaning ways they can punish you’, Business Insider. [Available online]