From the Guardian in 2012:
From the Economist last year:
They play the stockmarket, decide whether you can have a mortgage and may one day drive your car for you. They search the internet when commanded, stick carefully chosen advertisements into the sites you visit and decide what prices to show you in online shops... Algorithmically curated “filter bubbles” may even affect the way a country votes.
An algorithm is, essentially, a brainless way of doing clever things. It is a set of precise steps that need no great mental effort to follow but which, if obeyed exactly and mechanically, will lead to some desirable outcome.
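A classic illustration of that definition (my example, not the Economist's) is Euclid's algorithm for the greatest common divisor: one precise step, obeyed exactly and mechanically, with no judgement required, yet it reliably produces the desired result.

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: repeat one mechanical step until done."""
    while b != 0:          # the step needs no thought, only obedience
        a, b = b, a % b    # replace the pair (a, b) with (b, a mod b)
    return a               # the desirable outcome: the greatest common divisor

print(gcd(1071, 462))  # 21
```

No insight is needed at any point; the cleverness lives entirely in the recipe, not in whoever (or whatever) follows it.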
From the New Yorker also last year:
Satirical Cartoons - The New Yorker
And they now rule the world:
2017 Was The Year We Fell Out of Love with Algorithms | WIRED
Police use Experian Marketing Data for AI Custody Decisions – Big Brother Watch
THE SCORED SOCIETY: DUE PROCESS FOR AUTOMATED PREDICTIONS - datascienceassn.org
As reported a few times in this blog:
Futures Forum: Reforming the data economy
Futures Forum: Brexit: and Exeter's MP once 'regarded as a crank' >>> but now questions are multiplying over the roles of Cambridge Analytica and its parent company Strategic Communications Laboratories
Futures Forum: Brexit: and bots
Futures Forum: How to fix the labour market: disruption, technological automation and the gig economy
Futures Forum: The explosion of online propaganda and the use of our own data to manipulate us
Futures Forum: Brexit: and the use of data analytics
Futures Forum: A simple guide to algorithms
Futures Forum: Artificial Intelligence on the farm >>> 'Though machines with AI are surprising in their adaptability and prospects for improvement, they still lack a very human factor..... common sense'
Futures Forum: Brexit: and post-fact politics
Futures Forum: A post-work society >>> transitioning to a fully-automated economy
Futures Forum: The promises of technological innovation >>> "So, if networked communication and cybernetic technologies are so potentially liberating, why are they so authoritarian in the forms they currently take?"
Futures Forum: Alternative currencies >>> responding to market and government failures
This week, the Guardian looked at how we are increasingly scored and ranked on the basis of our algorithmic footprint:
The radical geographer and equality evangelist Danny Dorling tried to explain to me once why an algorithm could be bad for social justice.
Imagine if email inboxes became intelligent: your messages would be prioritised on arrival, so if the recipient knew you and often replied to you, you’d go to the top; I said that was fine. That’s how it works already. If they knew you and never replied, you’d go to the bottom, he continued. I said that was fair – it would teach me to stop annoying that person.
If you were a stranger, but typically other people replied to you very quickly – let’s say you were Barack Obama – you’d sail right to the top. That seemed reasonable. And if you were a stranger who others usually ignored, you’d fall off the face of the earth.
“Well, maybe they should get an allotment and stop emailing people,” I said.
“Imagine how angry those people would be,” Dorling said. “They already feel invisible and they [would] become invisible by design.”
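Dorling's thought experiment can be sketched as a simple scoring rule. This is a hypothetical illustration, not anything described in the article: the function name, weights, and reply rates are all assumptions. A known sender is ranked by how often you reply to them; a stranger is ranked by how often everyone else replies to them, so a stranger whom others ignore scores near zero and drops out of sight.

```python
def inbox_rank(your_reply_rate: float, global_reply_rate: float,
               known_sender: bool) -> float:
    """Score a sender: known senders by your own reply history,
    strangers by how often other people reply to them."""
    return your_reply_rate if known_sender else global_reply_rate

# Hypothetical senders and reply rates, for illustration only.
senders = {
    "friend":  inbox_rank(0.9, 0.5, known_sender=True),    # you often reply
    "obama":   inbox_rank(0.0, 0.95, known_sender=False),  # stranger others answer
    "ignored": inbox_rank(0.0, 0.02, known_sender=False),  # stranger others ignore
}
ranking = sorted(senders, key=senders.get, reverse=True)
print(ranking)  # ['obama', 'friend', 'ignored']
```

The last sender never surfaces in anyone's inbox, which is the point of the anecdote: the already-ignored are made invisible by design.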
The capacity of tech to outstrip the worst imaginings of its detractors is truly incredible. Prioritising emails turned out to be small fry for big data, which turned its attentions instead to simply ranking people, not for the interest they might hold in an inbox, but for their value as customers, employees, tenants – for all practical purposes, their value as human beings.
The Chinese government is working towards assigning its citizens a “social score”: by 2020, an algorithm will rate citizens as a “desirable employee, reliable tenant, valuable customer – or a deadbeat, shirker, menace and waste of time”, in the words of two US academics. “Waste of time”, it strikes me, is a more searing criticism than “deadbeat”, which sounds quite rakish and rebellious. Algorithms don’t understand nuance, because it saves time not to. But the erasure of small degrees of human difference is the least bad thing about it. The scored society, as the New Economics Foundation calls it in its report, What’s Your Score?, is everywhere: it is just more pronounced in China because the government is not embarrassed about it.
All our debates about the use of big data have centred on privacy, and all seem a bit distant: I care, in principle, whether or not Ocado knows what I bought on Amazon. But in my truest heart, I don’t really care whether or not my Frube vendor knows that I also like dystopian fiction of the 1970s.
I do, however, care that a program exists that will determine my eligibility for a loan by how often I call my mother. I care if landlords are using tools to rank their tenants by compliant behaviour, to create a giant, shared platform of desirable tenants, who never complain about black mould and greet each rent increase with a basket of muffins. I care if the police in Durham are using Experian credit scores to influence their custodial decisions, an example – as you may have guessed by its specificity – that is already real. I care that the same credit-rating company has devised a Mosaic score, which splits households into comically bigoted stereotypes: if your name is Liam and you are an “avid texter”, that puts you in “disconnected youth”, while if you’re Asha you’re in “crowded kaleidoscope”. It’s not a privacy issue so much as a profiling one, although, as anyone who has ever been the repeated victim of police stop-and-search could have told me years ago, these are frequently the same thing.
Privacy isn’t the right to keep secrets: it’s the right to be an individual, not a type; the right to make a choice that’s entirely your own; the right to be private. The answer is structural. I’m as sick now of being told to delete my Facebook account as I was 20 years ago of being told that turning plugs off would halt climate change. We need better laws, fast, or we’ll all be deadbeats in the end.
Algorithms are taking over – and woe betide anyone they class as a 'deadbeat' | Zoe Williams | World news | The Guardian
Here the report from the New Economics Foundation:
WELCOME TO THE SCORED SOCIETY
OPENING THE BLACK BOX
What’s your score? | New Economics Foundation