Terms & Conditions Apply:
If you want to re-build trust, a new conversation, consensus and compact is needed…
By Margaret Heffernan and Robert Phillips
5 minute read
The march of technology brings with it awesome power but also great responsibility. The opportunities for a new, digital democracy and a better society are manifest but the revolution is not without its darker side: fake news, addiction, tax avoidance, election fraud, turbo-consumption, “surveillance capitalism”, cybercrime, market monopolies, terrorism and more besides. The “techlash” – brought to a head this week by the revelations swirling around Cambridge Analytica and Facebook – is a response to much of this.
Legislators – and, in many instances, the media – fail to understand the complex realities of the advance of technology – at exceptional speed – and its many (un)intended consequences. All sides need to recognise not just the entrepreneurial spirit that has driven the tech corporates but also the social production of much of the data that drives the sector and creates its value.
Although the tide of popular opinion now appears to be turning, politicians are too quick to laud and fetishise the likes of Google, Amazon and Facebook – seeing them as “the future” without looking carefully at the many downsides. Other players are in danger of being drawn into issues that are not in their genuine domain – e.g. surveillance, data, hate crimes – simply by being found guilty by association, and all may soon be the victims of knee-jerk, catch-all regulation, the politicians’ favoured response.
Taxation raised an early red flag but is only one issue among many. Legislators appear not to have read the bit in the seminal Cluetrain Manifesto that speaks not only to the hordes but to the fools and the marauders, too. Maybe they have not even read the Manifesto itself.
This is not to say that we, the authors, are anti-tech or anti-progress. Far from it. Both of us have written extensively on the subject in very positive ways, and Margaret has been CEO of a number of tech companies in the US. We celebrate technology as a potential force for good, with a reminder that we must never lose sight of the human in the face of inevitable, and often welcome, automation. However, without an open and adaptable model – inviting all parties into a bigger conversation – the “techlash” will likely only intensify, at national, European and global levels, and “trust” will be eroded further still. This ultimately benefits no-one.
We also believe in responsible leadership and a responsible future – one led by the needs of citizens and society, not by the profit motives or market domination of a select Silicon Valley few; nor by political agendas and interests.
We take inspiration from the interesting precedent of the Warnock Commission. In the face of vast and fast change in fertilisation technologies, it managed to create an ethical consensus around the new technologies which people actually understood. Creating such a consensus – and codifying it into Terms & Conditions to underpin a new compact and possibly legislation – would be a terrific goal. It is also about getting ahead of the tech, instead of, like government, always being way behind.
Data protection or privacy is a fundamental human rights issue.
We citizens happily sign away our privacy and rights in return for anything from free WiFi (the new opiate of the masses) to the endless temptation of “special offers” and, of course, “free” personality profiles. It has become a standing joke (but a well-researched one) that no-one reads the Terms & Conditions on any agreement, from telecoms providers to train operators to the local coffee shop. The question of who is collecting and harvesting our data – and who stands to profit – has been troubling us for some time, even before Alexander Nix and his worrying, if colourful, claims.
In the early days of “customer loyalty”, data harvesting was bilateral and openly transactional. We signed up for a supermarket club-card in return for discounts or offers and, in turn, the supermarket supposedly knew whether – and when – to sell us nappies or prosecco. Sophistication has since reached exceptional new levels: data harvesters operating at scale are less interested in giving something back to you or me than in selling on our data for use in bigger data sets elsewhere – from governments mapping transport in cities, to shadowy strategists, as we have now learned, figuring out how to skew national elections and referenda.
And, still, nobody is prepared to tell us what is really going on.
In parallel, there is of course much noise around “fake news” and/or the excuse perpetuated by the tech corporates that they are platforms, not media owners. We argue that this is a false and ultimately irresponsible distinction – which may be why support for this argument is beginning to recede. To borrow from the philosopher Tony Judt, this is less the “post-truth” age than a “post-ethical” one. An ethical re-set would be most welcome and a key step towards improving levels of trust.
As part of this re-set, we might consider a new Consumer Protection Act, with the tech corporates signing up to Terms & Conditions based on citizen rights, thus turning the tables and returning power to the people. Such an Act would protect consumers, who are their data. Silicon Valley accomplishes a neat rhetorical trick by talking about data as though it were distinct from “me”, but in fact it is not: I am my data; my data is me. Many consumer protection groups, such as Which?, and BEUC at a European level, should be natural allies in helping prosecute this argument. What do we, as citizens, have a right to expect from both corporations and government?
Governance: new models required
The key question is therefore this: are the “tech” issues really tech issues or are they fundamental governance issues that concern us all?
For example, we elect governments to carry the responsibility of keeping us safe. But if we accept that we are our data (more than that too, of course, but also that), then the state is responsible for keeping our data safe. This protection consists of (but isn’t limited to) data security, data ownership, and defence against online aggression, harassment and exploitation.
In other words, we have a right as citizens of a democracy to expect the same protection of our digital selves as we expect of our physical selves. Tech is not different – except insofar as legislators are dazzled, ignorant and prey to populism. The digital citizen is a citizen and vice versa. And the digital citizen, currently, is wandering around the equivalent of an 18th century turnpike, surrounded by what can feel like thieves, marauders and conmen, with a number of the tech corporates seemingly effectively out of public and social control.
Issues remain around pricing our data. The immediate risk is that selling becomes the default. It is what Silicon Valley might agree to, as long as they think they can afford it. But is it what we – as citizens – want? It is what Jaron Lanier suggests in Who Owns the Future?, but tactically it could be ceding too much too soon. Meanwhile, the Financial Times’ John Thornhill has an interesting take on how data might fund new models of Universal Basic Income.
We, the authors, would be uncomfortable with a scenario in which the impoverished state milks its citizens to provide the revenue its economic policies can no longer generate. This is, however, a direction in which we could easily and dangerously head.
If you are interested in Jericho’s Terms & Conditions Apply project, please get in touch.