Saturday, 14 July 2018

Algorithms and Big Brother

Algorithms are everywhere:

From the Guardian in 2012:

Keynes's observation (in his General Theory) that "practical men who believe themselves to be quite exempt from any intellectual influences, are usually the slaves of some defunct economist" needs updating. Replace "economist" with "algorithm". And delete "defunct", because the algorithms that now shape much of our behaviour are anything but defunct. They have probably already influenced your Christmas shopping, for example. They have certainly determined how your pension fund is doing, and whether your application for a mortgage has been successful. And one day they may effectively determine how you vote.


From the Economist last year:

They play the stockmarket, decide whether you can have a mortgage and may one day drive your car for you. They search the internet when commanded, stick carefully chosen advertisements into the sites you visit and decide what prices to show you in online shops... Algorithmically curated “filter bubbles” may even affect the way a country votes.
An algorithm is, essentially, a brainless way of doing clever things. It is a set of precise steps that need no great mental effort to follow but which, if obeyed exactly and mechanically, will lead to some desirable outcome.
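
To see what that definition means in practice, here is a minimal sketch in Python (my own illustration, not from the Economist piece): Euclid’s ancient recipe for finding the greatest common divisor of two numbers. Every step is mechanical and needs no insight, yet following the steps exactly always produces the right answer.

    def gcd(a: int, b: int) -> int:
        # A set of precise steps needing no great mental effort:
        # while anything is left over, replace the pair with
        # (old divisor, remainder) and repeat.
        while b != 0:
            a, b = b, a % b
        # Obeyed exactly and mechanically, this always yields
        # the desired outcome: the greatest common divisor.
        return a

    print(gcd(48, 18))  # prints 6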



From the New Yorker, also last year:

Satirical Cartoons - The New Yorker

And they now rule the world:
2017 Was The Year We Fell Out of Love with Algorithms | WIRED
Police use Experian Marketing Data for AI Custody Decisions – Big Brother Watch
THE SCORED SOCIETY: DUE PROCESS FOR AUTOMATED PREDICTIONS - datascienceassn.org

As reported a few times in this blog:
Futures Forum: Reforming the data economy
Futures Forum: Brexit: and Exeter's MP once 'regarded as a crank' >>> but now questions are multiplying over the roles of Cambridge Analytica and its parent company Strategic Communications Laboratories
Futures Forum: Brexit: and bots
Futures Forum: How to fix the labour market: disruption, technological automation and the gig economy
Futures Forum: The explosion of online propaganda and the use of our own data to manipulate us
Futures Forum: Brexit: and the use of data analytics
Futures Forum: A simple guide to algorithms
Futures Forum: Artificial Intelligence on the farm >>> 'Though machines with AI are surprising in their adaptability and prospects for improvement, they still lack a very human factor..... common sense'
Futures Forum: Brexit: and post-fact politics
Futures Forum: A post-work society >>> transitioning to a fully-automated economy
Futures Forum: The promises of technological innovation >>> "So, if networked communication and cybernetic technologies are so potentially liberating, why are they so authoritarian in the forms they currently take?"
Futures Forum: Alternative currencies >>> responding to market and government failures

This week, the Guardian looked at how public policy will be based on our algorithmic footprint: 



The radical geographer and equality evangelist Danny Dorling tried to explain to me once why an algorithm could be bad for social justice.
Imagine if email inboxes became intelligent: your messages would be prioritised on arrival, so if the recipient knew you and often replied to you, you’d go to the top; I said that was fine. That’s how it works already. If they knew you and never replied, you’d go to the bottom, he continued. I said that was fair – it would teach me to stop annoying that person.
If you were a stranger, but typically other people replied to you very quickly – let’s say you were Barack Obama – you’d sail right to the top. That seemed reasonable. And if you were a stranger who others usually ignored, you’d fall off the face of the earth.
“Well, maybe they should get an allotment and stop emailing people,” I said.
“Imagine how angry those people would be,” Dorling said. “They already feel invisible and they [would] become invisible by design.”
The capacity of tech to outstrip the worst imaginings of its detractors is truly incredible. Prioritising emails turned out to be small fry for big data, which turned its attentions instead to simply ranking people, not for the interest they might hold in an inbox, but for their value as customers, employees, tenants – for all practical purposes, their value as human beings.
The Chinese government is working towards assigning its citizens a “social score”: by 2020, an algorithm will rate citizens as a “desirable employee, reliable tenant, valuable customer – or a deadbeat, shirker, menace and waste of time”, in the words of two US academics. “Waste of time”, it strikes me, is a more searing criticism than “deadbeat”, which sounds quite rakish and rebellious. Algorithms don’t understand nuance, because it saves time not to. But the erasure of small degrees of human difference is the least bad thing about it. The scored society, as the New Economics Foundation calls it in its report, What’s Your Score?, is everywhere: it is just more pronounced in China because the government is not embarrassed about it.
All our debates about the use of big data have centred on privacy, and all seem a bit distant: I care, in principle, whether or not Ocado knows what I bought on Amazon. But in my truest heart, I don’t really care whether or not my Frube vendor knows that I also like dystopian fiction of the 1970s.
I do, however, care that a program exists that will determine my eligibility for a loan by how often I call my mother. I care if landlords are using tools to rank their tenants by compliant behaviour, to create a giant, shared platform of desirable tenants, who never complain about black mould and greet each rent increase with a basket of muffins. I care if the police in Durham are using Experian credit scores to influence their custodial decisions, an example – as you may have guessed by its specificity – that is already real. I care that the same credit-rating company has devised a Mosaic score, which splits households into comically bigoted stereotypes: if your name is Liam and you are an “avid texter”, that puts you in “disconnected youth”, while if you’re Asha you’re in “crowded kaleidoscope”. It’s not a privacy issue so much as a profiling one, although, as anyone who has ever been the repeated victim of police stop-and-search could have told me years ago, these are frequently the same thing.
Privacy isn’t the right to keep secrets: it’s the right to be an individual, not a type; the right to make a choice that’s entirely your own; the right to be private. The answer is structural. I’m as sick now of being told to delete my Facebook account as I was 20 years ago of being told that turning plugs off would halt climate change. We need better laws, fast, or we’ll all be deadbeats in the end.

Algorithms are taking over – and woe betide anyone they class as a 'deadbeat' | Zoe Williams | World news | The Guardian
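
Dorling’s intelligent inbox is simple enough to sketch in code. The short Python example below is my own illustration, not anything from the article (the senders and reply rates are invented): messages are ranked by how often the recipient replies to the sender, falling back on how often everyone else does, so a stranger whom nobody answers sinks out of sight by design.

    def inbox_rank(messages, my_reply_rate, global_reply_rate):
        # Sort incoming messages so often-answered senders rise to the top.
        def score(msg):
            sender = msg["from"]
            if sender in my_reply_rate:
                # The recipient knows you: replies often -> top, never -> bottom.
                return my_reply_rate[sender]
            # A stranger is judged by how often everyone else replies to them.
            return global_reply_rate.get(sender, 0.0)
        return sorted(messages, key=score, reverse=True)

    # Invented example data: the ignored stranger becomes invisible by design.
    messages = [{"from": "obama"}, {"from": "colleague"}, {"from": "ignored_stranger"}]
    my_reply_rate = {"colleague": 0.9}
    global_reply_rate = {"obama": 0.99, "ignored_stranger": 0.0}
    print([m["from"] for m in inbox_rank(messages, my_reply_rate, global_reply_rate)])
    # -> ['obama', 'colleague', 'ignored_stranger']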

Here is the report from the New Economics Foundation:

WHAT’S YOUR SCORE?

HOW DISCRIMINATORY ALGORITHMS CONTROL ACCESS AND OPPORTUNITY


MIRANDA HALL
DUNCAN MCCANN

In China, a new system using data from public and private sources aims to score every citizen according to their ‘trustworthiness’ by 2020. Cheating on a video game could lower your score. Buying a lot of nappies could give you extra points. This number determines whether you can buy plane tickets, how long you wait to see a doctor, the cost of your electricity bills and your visibility on online dating sites.
Across Europe and the US, people are shocked by this dystopian IT-backed authoritarianism. But citizens of these countries are already being scored by systems based on the same logic – they just haven’t noticed.
Public debate in the UK on ‘datafication’ has been overwhelmingly concerned with individual privacy and the protection of personal data. Understandably, a lot of people don’t really care, feeling they have ‘nothing to hide’. We should care. But for this to happen, public debate needs to shift focus to the ways our digital footprint is used to produce scoring systems that shape our lives.
Similar to the Chinese ‘social score’, these algorithms rank and rate every member of society to determine what we get access to and the condition of that access: from sorting job applications and allocating social services to deciding who sees advertisements for housing and products. They decide whether you are a “desirable employee, reliable tenant, valuable customer — or a deadbeat, shirker, menace, and waste of time”.
The current trend in the corporate sharing economy has been to promote the democratising potential of access to goods and services over ownership. The actual impact of this shift has in fact been to make conditional access to private property the norm, amplifying wealth inequalities. At the same time, in the public sector, an obsession with ‘innovative’ automated decision-making has reframed access to rights and resources (like housing or healthcare) as an issue of efficient allocation rather than fairness or justice.

WELCOME TO THE SCORED SOCIETY

Credit scoring has always ranked and rated citizens based on their consumer behaviour. But ‘fintech’ (financial technology) is now pioneering the use of ‘alternative data’ to reflect individuals’ ‘true’ personality. With tools like Tala, whether you organise your phone contacts by their first and last names, or call your mother regularly, will generate a score that could dictate your eligibility for a loan. Other startups, like the insurance company Kin, are looking to use Internet of Things devices in people’s homes to price their home insurance, for example using water sensors to detect leaks.
The same systems are creeping into the rental sector, with the advent of ‘proptech’. Desiree Fields has shown how the financialisation of the rental sector has resulted in new software platforms for private equity firms like Blackstone to manage massive portfolios of geographically dispersed homes. These technologies have gamified the tenant’s experience of renting by automating everything from maintenance requests and rent payments to evictions.
INCENTCO offers a platform through which tenants who consistently act in a way that aligns with landlords’ interests (such as paying rent on time) can build up enough points to get rewards, like new appliances, smart home technologies and general home upgrades. This automated system has worrying ramifications: if you are a single mother working on a zero-hours contract whose shift is cancelled and who unexpectedly has to pay for her kid to go to the dentist — tough. Now your score is too low to get a new fridge.
A number of new apps, such as Canopy, have developed similar ‘RentPassport’ features, allowing users to demonstrate their ‘financial prudence’, building up a ‘reliability score’ over years of renting that informs their credit scores.
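
A toy sketch of how such a points-and-rewards ledger could work is below. This is my own invention for illustration, not INCENTCO’s or Canopy’s actual systems; the events, point values and reward threshold are all made up.

    from dataclasses import dataclass, field

    @dataclass
    class TenantLedger:
        # Toy landlord-aligned points ledger, loosely modelled on the
        # reward schemes described above. All values are invented.
        points: int = 0
        history: list = field(default_factory=list)

        def record(self, event: str):
            # Behaviour aligned with the landlord's interests earns points;
            # anything else (a late payment, a complaint) earns nothing.
            awards = {"rent_on_time": 10, "lease_renewal": 25}
            self.points += awards.get(event, 0)
            self.history.append(event)

        def can_redeem(self, reward_cost: int) -> bool:
            return self.points >= reward_cost

    ledger = TenantLedger()
    for event in ["rent_on_time"] * 5 + ["rent_late"]:
        ledger.record(event)
    print(ledger.points, ledger.can_redeem(100))  # 50 False: no new fridge yet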
The power of algorithms to make decisions about our lives is also growing in the public sector. In Automating Inequality, Virginia Eubanks exposes how governments and local authorities are increasingly using digital tools to determine which families most deserve support. This comes as more and more people are living in poverty while fewer resources are allocated to help them under austerity.
In the US, algorithms have replaced nurses in determining how many hours of home care visits a patient is entitled to. In some places, funding dropped by as much as 42% as a result, and when service users tried to understand why their hours had been cut, the state refused to share the algorithm’s decision-making process. Similar calculations sift through survey data to create a ranking of ‘deservingness’ for housing waiting lists in places like Los Angeles.
And these developments are underway in the UK in a number of sectors. These range from automated screening processes for housing benefits (at a time when 78,000 families are homeless or in temporary accommodation) to algorithmic prediction tools such as Xantura’s ‘Early Help Profiling System’, which aims to determine which children are at risk of abuse. While some councils develop systems in-house using their own data sets, there is also a trend towards partnering with private companies to acquire tools that incorporate other kinds of data.
Experian, who are leading the way in the use of alternative data for credit scores, now also offer to help the public sector “make better decisions”. An investigation by Big Brother Watch revealed that police in Durham were using Experian’s services to make custody decisions. The company’s ‘Mosaic’ system ranks individuals and households according to crude and offensive stereotypes, from ‘disconnected youth’ (‘avid texters’ with ‘low wages’ and names like ‘Liam’ or ‘Chelsea’) to ‘crowded kaleidoscope’ (‘multicultural’ families in ‘cramped’ flats with names like ‘Abdi’ and ‘Asha’) and ‘penthouse chic’ (young professionals on ‘astronomical salaries’ who drink a lot of champagne).
On top of deciding who gets access to basic public services, algorithms are being developed to decide who gets citizenship. Trump’s ‘extreme vetting initiative’ in the US would use available data to predict a visa applicant’s likelihood of becoming a terrorist versus a contributing member of society.

OPENING THE BLACK BOX

Algorithms with no accountability are dividing society up into ‘haves’ and ‘have-nots’. Seemingly innocuous data on location or patterns of behaviour and consumption, used to assess an individual’s ‘reliability’, also function as proxies for gender, race and class. The result is that these scores end up amplifying existing social inequalities.
These scoring systems are often described as ‘black boxes’. It’s almost impossible to find out how they work because they are run by private companies (often delivering services for the state) and therefore their inner workings qualify as ‘trade secrets’. Even when algorithms are made public, their overwhelming complexity and scale often make them almost impossible to understand.
If these algorithms can’t be seen or understood, how can we assign responsibility for harm when they produce discriminatory outcomes? 
This emphasis on ‘smart’ and ‘efficient’ technological solutions to social problems obscures the political choices that produce them. Virginia Eubanks puts it best when she says that “we outsource these inhuman choices to machines because they are too difficult for us… we know there is no ethical way to prioritize one life over the next”.
For more on how algorithms and data are reinforcing inequality, read our report, Controlled by Calculations.


What’s your score? | New Economics Foundation
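
The report’s central technical point, that seemingly innocuous inputs act as proxies for protected characteristics, can be shown with a deliberately crude sketch. In the invented Python example below, the scorer never reads the protected attribute, yet because a neutral-looking feature it does use (postcode) is correlated with that attribute in the toy data, the scores reproduce the bias anyway.

    # Invented toy data: "group" stands in for a protected characteristic.
    applicants = [
        {"name": "A", "group": "x", "postcode": "EX1", "contacts_full_names": True},
        {"name": "B", "group": "x", "postcode": "EX1", "contacts_full_names": False},
        {"name": "C", "group": "y", "postcode": "EX9", "contacts_full_names": True},
        {"name": "D", "group": "y", "postcode": "EX9", "contacts_full_names": False},
    ]

    def score(applicant):
        # An 'alternative data' score built only from neutral-looking features.
        s = 0
        if applicant["postcode"] == "EX1":
            s += 2  # proxy: in this data, postcode tracks the protected group
        if applicant["contacts_full_names"]:
            s += 1  # the Tala-style contact-book signal mentioned above
        return s

    for applicant in applicants:
        print(applicant["name"], applicant["group"], score(applicant))
    # Every member of group "x" outscores every member of group "y",
    # even though the scorer never looks at "group" at all.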
