12 October 2021. Algorithms | Trends
Bringing the algorithms under control; the trends that will shape the next decade
Welcome to Just Two Things, which I try to publish daily, five days a week. (For the next few weeks this might be four days a week while I do a course: we’ll see how it goes). Some links may also appear on my blog from time to time. Links to the main articles are in cross-heads as well as the story.
#1: Bringing the algorithms under control
There’s been more Facebook-related coverage over the weekend than one can really take in, but since I’m mostly interested in the likely routes to bringing our tech giants under some degree of social and political control, I’m going to write about algorithms.
Because one of the ways that you can bring tech giants under control, possibly, is to audit their algorithms. Well, it’s a bit more complicated than that, but that’s the line of argument made by Cathy O’Neil in a column in Bloomberg. O’Neil is the author of Weapons of Math Destruction, which is about the social impact of algorithms, so she is literally talking her book here.
(Diagram by Lubaochuan, via Wikipedia. CC BY-SA 4.0)
Just as relevant, she makes her living these days auditing algorithms:
When I take on a job, I first consider whom the algorithm affects. The stakeholders of an exam-grading algorithm, for example, might include students, teachers and schools, as well as subgroups defined by race, gender and income. Usually there’s a tractable number of categories, like 10 or 12. Then I develop statistical tests to see if the algorithm is treating or is likely to treat any groups unfairly — is it biased against Black or poor students, or against schools in certain neighborhoods? Finally, I suggest ways to mitigate or eliminate those harms.
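To make that concrete, here is a minimal sketch of the kind of statistical check an audit like this might run: comparing an exam-grading algorithm’s pass rates across subgroups. The data, the group names and the 80% threshold (the “four-fifths rule” used in US employment-discrimination practice) are illustrative assumptions, not details from O’Neil’s column.

```python
def pass_rate(outcomes):
    """Fraction of students the algorithm graded as passing."""
    return sum(outcomes) / len(outcomes)

def disparate_impact(group_a, group_b):
    """Ratio of the lower group's pass rate to the higher one's.
    Values below 0.8 are a common red flag for unfair treatment."""
    rate_a, rate_b = pass_rate(group_a), pass_rate(group_b)
    lower, higher = sorted([rate_a, rate_b])
    return lower / higher

# Hypothetical graded outcomes (1 = pass, 0 = fail) for two subgroups
school_a = [1, 1, 1, 0, 1, 1, 0, 1]   # pass rate 0.75
school_b = [1, 0, 0, 1, 0, 1, 0, 0]   # pass rate 0.375

ratio = disparate_impact(school_a, school_b)
print(f"disparate impact ratio: {ratio:.2f}")  # prints 0.50
print("flag for review" if ratio < 0.8 else "no flag")
```

A real audit would repeat a test like this for each stakeholder group — the “10 or 12” categories O’Neil mentions — which is exactly why the approach stops scaling at Facebook size.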
In her short piece, she goes on to note that this approach is hard to use when you’re dealing with a company as large as Facebook (or Google, come to that). The reason?
They’re just too big. The list of potential stakeholders is endless. The audit would never be complete, and would invariably miss something important. I can’t imagine, for example, that an auditor could have reasonably anticipated in 2016 how Facebook would become a tool for genocide in Myanmar.
So, a different approach would be needed. Her solution is to turn the process around, so it focuses on the outcomes rather than the immediate effects. The approach is to look at specific harms (in the same way that competition authorities might when considering the impact of a takeover). So this might include the harms to teenage girls raised by the whistle-blower Frances Haugen in her testimony to Congress last week:
Suppose the Federal Trade Commission (FTC) made a list of outcomes it wants to prevent.... It could then order Facebook to provide the data needed to test whether its algorithms are contributing to those outcomes — for example, by seeking causal connections between certain types of posts and young female users’ reported concerns about body image. To provide a robust picture, there should be multiple measures of each phenomenon, with daily or weekly updates.
As with the interoperability argument that I discussed last week, this probably doesn’t require legislation. Facebook can go about its business as long as the data suggests it’s not committing harm. If it is, the FTC can ask for remedies.
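The monitoring idea in the quote above can be sketched very simply: track a weekly measure of the harm alongside exposure to the relevant class of posts, and escalate when they move together. Everything here is invented for illustration — the series, the threshold, the measure names — and a bare correlation is only a trigger for investigation, not the causal test O’Neil’s proposal would ultimately require.

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical weekly series: exposure to body-image-related posts and
# a survey measure of reported concern among young female users.
weekly_exposure = [120, 150, 180, 220, 260, 300]
reported_concern = [2.1, 2.3, 2.6, 2.9, 3.3, 3.6]

r = pearson(weekly_exposure, reported_concern)
if r > 0.7:  # illustrative threshold a regulator might set
    print(f"correlation {r:.2f}: escalate for causal investigation")
```

The “multiple measures of each phenomenon” point matters here: one series co-moving could be noise or confounding, so a regulator would want several independent measures flagging together before acting.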
Of course, extending the argument beyond O’Neil’s Bloomberg column, it might be that there are other ways to view algorithms. In the industrial age, for example, we would be able to identify the harms from toxins or carcinogens in a product by simply buying it and testing it in a lab.
So maybe there’s an argument that says we need to be able to test algorithms in public the same way. Despite lobbying, algorithms aren’t generally classified as intellectual property. As a society we could just say—as a social principle—that your algorithms need to be visible as a condition of doing business, so we can see what they are doing.
These businesses are so large now that bringing them back under control will probably involve everything we can do rather than picking one solution.
So it’s also worth noting here John Naughton’s weekend column in the Guardian that contrasted the recent Chinese approach to its social media giants with that in the West:
The other question is whether Xi Jinping and co understand something that we seem unwilling to accept – that social media companies, no matter how large and apparently powerful, are ultimately disposable. What really matters is what the west still has and China lacks, namely the ability to create (and modernise) the technological infrastructure that underpins companies that, basically, are just doing tricks with old technology such as the web.
#2: The trends that will shape the next decade
The academic Paul Rogers marked the impressive milestone of 1,000 columns for Open Democracy over the last 20 years with a ‘big picture’ article taking the temperature of the next 10 years.
As he notes, some of his early articles were informed by a view that three big trends would shape conflict in the coming decades:
These were an increasing rich/poor divide, environmental limits to growth and a global defence culture that prioritised military responses to challenges.
Given that this view has been borne out by events, he uses his 1,000th column to check where we are now.
On the rich/poor divide, things are not much different (and maybe even worse):
There is plenty of bright new thinking coming to the fore in many countries as think tanks and gifted economists advocate for more just economic systems. And the fact that at least 12% of humanity is a member of one of the three million cooperatives worldwide is promising. Even so, wealthy elites across the world are deeply resistant to change. 'Tax the rich' has not yet become a general mantra – but that may be only a matter of time.
He’s more optimistic on the environment, at least up to a point:
Public awareness has risen markedly in many countries, aided by impressive new campaigns, and the repeated experience of extreme weather events is at last having an impact on public opinion. Meanwhile, the technology of decarbonising economies has come on apace, especially in the field of renewables such as wind and solar power. The two catches are that the neoliberal system has not responded remotely fast enough to curbing carbon dioxide emissions sufficiently, and inter-governmental political cooperation remains hopelessly limited.
And as for military security, well, things are not much different; they are very much the same:
The ability of the US, Australia and the UK to move seamlessly from the failures in Afghanistan to locking horns with China in barely a month has indeed been quite a feat for the military-industrial complex.
(Photo: Extinction Rebellion Sverige/flickr, CC BY 2.0)
As with Mark Carney, the former Bank of England Governor, Rogers sees the twin crises of COVID-19 and climate breakdown dominating the next decade.
There will be moves to address both, but they will not create sufficient effect and by the end of the 2020s, global insecurity will have substantially increased... At some point, most likely around the turn of the next decade, the global predicament will have become so dire that radical change will be forced upon society, even on those elements that are singularly wealthy and powerful.
As I have argued here before, the choice we have isn’t whether we adapt to these changes. The choice is whether we adapt to the changes in a managed way, or whether these changes overwhelm us, and we are forced to adapt any which way we can. As Rogers says in his article:
The truly urgent task is to speed up the rate of change, with the rest of this decade being the key period... If prophecy is indeed a matter of 'suggesting the possible', then there is an awful lot of suggesting that needs to be done right now, and we can all use our voices to speak up and play a part in being the change.
j2t#185
If you are enjoying Just Two Things, please do send it on to a friend or colleague.