I recently had the chance to give a talk at a conference in Chartres as part of the Human Tech Days series. I was a guest of the DANE (Academic Delegation for Digital-based Learning) of the Orléans-Tours academy and the Centre-Val de Loire region, which took advantage of this series of events dedicated to digital to devote one day to education.
The title of the talk was: “An overview of the digital age and of its implications for education”. You can revisit it under a Creative Commons licence on this page.
So here are the five implications that I set out during this talk. It is, of course, a highly subjective and non-exhaustive list, intended to open up a debate rather than give definitive answers.
1 : The digital age amplifies the effects of our cognitive and social biases
Following on from progress made in the fields of neuroscience and cognitive science, cognitive and social biases are becoming better and better identified and understood by researchers. We are all biased – our brains play tricks on us, and they are not completely reliable. Let me be clear: they mislead us and distort our perception of reality.
These cognitive and social biases are part of the way our brains work; they did not come into existence with the digital age, but digital – and in particular algorithmic processing – considerably amplifies their effects.
A few of our cognitive biases
Confirmation bias: we primarily search for information which confirms a belief or prejudice of ours. This bias is particularly problematic in our use of search engines and social networks. Our way of typing requests into Google is biased (Google’s algorithm itself also contains bias), and we have a tendency to follow people on Twitter and Instagram who have the same tastes, the same political views, and the same values as us. We subconsciously isolate ourselves from information which could contradict our beliefs. In the same way, we will more readily go and watch a film which matches our preferences rather than take a ‘risk’. This is a form of bias very well understood by Netflix and other VOD platforms.
Negativity bias: we have a tendency to remember things that are negative better and to place a greater emphasis on them. This trend is already reflected in the news items on offer from the press, which are often pessimistic and can even go as far as scaremongering. Yet it is amplified on social networks and platforms like YouTube, where negative and extreme videos are actually shared more widely than positive or more moderate videos.
Conformity bias: we have a tendency to think and act like other people. On a social network, we are permanently exposed to opinions which end up influencing us.
Repetition bias: we place emphasis on information – even information that is completely untrue, ridiculous or offensive – when it is regularly passed on to us. Here it is clearly social networks that relay information – including fake news – on a loop, and this information can end up influencing us too.
Fads (‘the bandwagon effect’): in a situation where we do not have an opinion, we have a tendency to fall in behind the opinion of the majority (or at least our view of what the majority thinks). Again, this bias applies perfectly to social networks like Twitter, where it is almost impossible not to have an opinion. Furthermore, as Twitter is not representative of what the French population thinks, for example, it shapes opinions which are wrongly seen as those of the majority.
More information about our cognitive biases can be found on this Wikipedia page.
Our biases play tricks on us on two levels
A software developer designs their program while introducing into it biases that are both cognitive and social (sexist bias, racist bias, social class bias etc.). Facial recognition technology based on Artificial Intelligence (AI) – which has difficulty recognising Black people – is one example of this, as is the Apple Card algorithm, which offers less advantageous banking conditions to women. Developers today are by and large educated white men from English-speaking countries, and they introduce numerous social and cognitive biases into their programs.
Subsequently, users bring their own biases to these programs, as mentioned above. As the image above shows, there are three levels of understanding for the machine:
- Information that we share voluntarily. Here we can control our biases as this information is handed over in a conscious and thought-out way. An example of this would be me sharing my favourite music bands on Facebook to give off a certain image of myself.
- What our behaviour shows. Here is where things get complicated and biases show up. I may well have shared with the machine that I like Chopin and David Bowie, yet it records my musical history where the latest hits follow on from one another – Maître Gims, M Pokora and Aya Nakamura, for example. This is what my behaviour says to the machine about my musical tastes, and this is what generally happens with all of my interactions online: Google searches, likes on Facebook, retweets on Twitter, Spotify listens, etc.
- What the machine thinks about us. Once the machine has enough data about us – that which we share and that which is generated by our behaviours – a procession of algorithms then gathers speed. The machine can then (attempt to) calculate our IQ, our political views, our religion, our sexual preferences, our musical tastes, what repels or excites us, etc. It does this by combining and calculating tons of scattered information, and by comparing this information with other similar profiles in order to carry out predictions. If I like the same music as such and such a profile, I like the same books as them, and I carry out the same searches as them, then undoubtedly I would be interested in the same products as them on Amazon.
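The profile-matching logic described above can be sketched in a few lines. This is a toy illustration under invented assumptions – the profiles, the items and the Jaccard similarity measure are all made up for brevity – and not a description of any platform’s actual system:

```python
# Toy sketch of neighbour-based recommendation: compare a user's "likes"
# with other profiles and suggest what the closest profile enjoys.
# All profiles here are invented; real systems combine far richer signals.

profiles = {
    "me":    {"chopin", "bowie", "sci-fi novels"},
    "alice": {"chopin", "bowie", "sci-fi novels", "astronomy"},
    "bob":   {"rap", "crime fiction", "sneakers"},
}

def similarity(a: set, b: set) -> float:
    """Jaccard similarity: shared likes divided by all likes."""
    return len(a & b) / len(a | b)

def recommend(user: str) -> set:
    """Return items liked by the most similar other profile."""
    mine = profiles[user]
    best = max(
        (likes for name, likes in profiles.items() if name != user),
        key=lambda likes: similarity(mine, likes),
    )
    return best - mine

print(recommend("me"))  # alice is the closest match → {'astronomy'}
```

The interesting (and worrying) part is how little data this needs: three shared likes are enough for the machine to start predicting what I might want next.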
To go further:
2 : The digital age thrives on the ‘attention economy’
The concept of the ‘attention economy’ is recent, but as with cognitive bias, its existence is not linked to the digital age. The attention economy is based upon the fact that our attention is increasingly sought out (through advertising, for example) and that it is becoming a “scarce resource”.
The digital age and the modern world in general have considerably amplified the trends of attention capture. We are literally bombarded by ever more varied media: traditional poster campaigns, the wide range of screens on offer, as well as auditory, olfactory or even gustatory stimuli.
Many companies who have become giants of the digital age have built their business model on this famous attention economy. While free for users, they still manage to charge advertisers who want to show you ever more personalised adverts. To increase their revenues, these companies ardently want to keep hold of you… for as long as possible on their platforms. To always target you that bit better, they want you to interact and to hand over data (with a few likes here and a few shares there).
The problem is that the most advanced companies are now calling on cognitive science and neuroscience in order to influence their users and encourage them to use their application through an extreme use of what is known as ‘UX design’, or user experience techniques. Here are a few examples:
- The autoplay at the end of a YouTube or Netflix video, which makes you want to watch just one more episode. Last one, promise!
- Multiple notifications which build expectation and momentary excitement. Was my last post liked?
- Endless scrolling which means that you can spend hours on a Facebook or Twitter news feed.
More and more designers are distancing themselves from what they consider to be a deviation from their working practices. Collective groups are being formed to demand more ethics. As Tristan Harris – a former designer at Google – states, “Absolutely everybody – without exception – is influenced by motives that they cannot see.”
To go further:
- Our Minds Have Been Hijacked by Our Phones. Tristan Harris Wants to Rescue Them
- Designing for Human Attention
- 8 Strategies to Survive in the Attention Economy
3 : The digital age moves at several speeds
Today, those on the ground who are working on access to digital agree that, strictly speaking, the ‘digital divide’ doesn’t exist. Why?
Because there are not two clearly identified groups – those who embrace digital and those who are excluded from it. Of course there are differences in people’s equipment and their level of access, but there is above all a very wide range in the usage and understanding of digital.
Studies by Dominique Pasquier have shown perfectly that the working classes have completely got to grips with certain uses of digital, in particular to keep themselves informed and to communicate. We also now know that digital habits within a single generation are very diverse, and it was indeed necessary to put an end to the myth of the ‘digital native’: this idea of a young person born into the digital age who would master all of its uses and challenges.
It would nevertheless seem that digital, and in particular the web – although it was engineered against the idea of a centralised state system and intended to encourage the sharing, publication of and access to knowledge – is itself being confronted with the realities of its users. Everyone can express themselves thanks to the web, and that is revolutionary… for better or for worse, as all types of publication (videos, images, text) and all levels of quality are possible. The rules remain largely implicit, and the uninitiated are frequently unaware of them. There is a great deal of self-regulation, and it is the law of the jungle that prevails, as the recent Mila affair sadly reminded us.
On balance, ‘knowledgeable’ and ‘popular’ uses of digital stand alongside each other, but don’t blend very much. Is this the same as in real life? Digital is therefore developing at several speeds, both when it comes to the level of equipment (number of devices, the range of devices being used, the extent to which they are personalised etc.) and when it comes to quantitative uses (time spent on devices) and qualitative uses (the activities being carried out). It therefore perpetuates the socio-economic inequalities which have already been studied, and it consolidates them.
To go further:
- Digital Divide Is Wider Than We Think, Study Says
- Americans and Digital Knowledge
- A socioeconomic related ‘digital divide’ exists in how, not if, young people use computers
4 : The digital age is revolutionising our relationship with traces
On this issue, I would invite you to read my dedicated article on the topic: The culture of digital: photos and traces.
I’d like to talk again about a video where Louise Merzeau raises the subject of traces and of the impact of the digital age on these traces. To keep things very brief, for a very long time, being forgotten was the norm and keeping hold of traces required effort, political will, techniques, and money. Today, leaving traces has become automatic with digital objects.
How we went from a quest for traces to a quest to be forgotten
5 : The digital age has to (again) become a political topic
Digital is a pervasive trend (D. Boullier); in other words, it permeates all parts of our society: our social life, our consumption (Yuka is an exceptional example of this), our work, the way we relate to information and our loved ones. But digital is also about the economy, the army, geopolitics, energy management, etc.
It’s quite simple, today, everything operates with digital technology, including the power grid which allows digital to work. The cycle is complete, and if ever there were a shutdown, we would have serious problems.
Paradoxically, it seems to me that digital is not a very politicised topic, in the sense that citizens invest little thought in it and it is rarely debated seriously. With the notable exception of a few specialised organisations, the laws which affect digital attract very few protests, discussions and petitions. Undoubtedly because of this palpable lack of interest, digital rarely features in political manifestos during elections, even though we could reasonably expect it to, since the subject very much concerns security, jobs, and the transition to a greener society.
More generally, digital is one of those technologies that pursue a policy of “carrying on regardless”, without any real democratic control. Today, where are the debates taking place to reflect upon the necessity of 5G? And where are the debates to discuss the now-regular surveillance and facial recognition laws, the laws on the digitisation of public services, or the financial assistance given to French Tech start-ups?
I mentioned earlier that there was no real digital divide. That being said, if we considered that being digitally excluded is about citizens who are insufficiently informed to take up the challenges posed by digital, then I would say almost all French people are in a position of digital exclusion. That includes influencers, big decision-makers and political leaders. And that is quite a big problem…
Conclusion: the role of education
The implications of digital are plentiful, and they are felt at every level of society, from the most trivial – such as using a search engine – to the most complex, such as the country’s energy strategy. It is imperative that citizens regain their power over digital.
The exercise of democracy assumes informed choices, a critical mind, and curiosity. Understanding the implications of digital is therefore essential if politics is to take back control, and that requires a real education in the culture of digital.
Since parents are not all equally able to pass this knowledge on by themselves, the schools of the Republic and further education colleges have a massive role to play. Now more than ever, it would seem that the job of schools and the education system is to alleviate the social gaps between children, as digital widens the socio-economic and cultural gaps which already exist.
In conclusion, I would like to mention this beautiful quote from Benjamin Bayart: “Printing enabled people to read. The Internet will enable them to write.”
What do you think? Let’s talk about it in the comments!