28 March 2022. War | Time
‘Urbicide’, or why civilians are in the front line of modern warfare. Saving daylight.
Welcome to Just Two Things, which I try to publish daily, five days a week. Some links may also appear on my blog from time to time. Links to the main articles are in cross-heads as well as in the story. Recent editions are archived and searchable on WordPress.
1: ‘Urbicide’, or why civilians are in the front line of modern warfare
My policy on the war is to stay away from it unless there’s something to see that’s different from the mainstream coverage. Adam Tooze’s weekend Chartbook—on ‘urbicide’ as a military strategy—definitely came into that category.
The word was coined by the British science fiction writer Michael Moorcock in 1963, and adopted in the ‘60s by campaigners opposed to urban clearance schemes, which often seemed like a ‘war’ on people of colour and on poorer communities.
It moved into the military domain in the 1990s, to make sense of some of the tactics used in the Balkan wars, notably against Bosnia, and specifically to describe “the destruction of the historic city of Mostar.”
As Mary Kaldor noted in her essential work on “New Wars”, one way to read the Yugoslav conflict was as a collapse back into archaic ethnic strife. But as the decade wore on that became increasingly implausible. Instead, what we seemed to be witnessing was a new configuration of war, or mode of war-making.
To simplify a huge debate brutally, “new wars” were more informal, unconventional and asymmetric, and they took place in a world that was increasingly urbanised. Ever more often, cities became battlefields.
At one level, as Tooze suggests, this was just a function of urbanisation—cities are now where the majority of the world’s people are.
But the trend pre-dates this by some way. I was also reminded of listening to a talk a few years ago by the historian Antony Beevor, who observed that one of the long trends of the 20th century was the increasing proportion of civilians among the casualties of war. Air power had quite a lot to do with this. From memory, the Great War was the last significant conflict where military casualties outnumbered civilian casualties.
This creates some dissonance. Tooze talks about ‘war among the high rises’, recalling Mahmoud Darwish’s description in Memory of Forgetfulness of making coffee in his modern apartment under bombardment by Israeli jets.
But equally, most modern urban development is low rise—informal settlements, usually with poor infrastructure.
Tooze notes the influential work of the Newcastle University researcher Stephen Graham in this area, as an editor and a writer. His book Cities Under Siege may be a definitive text. Tooze quotes Graham in his piece:
In the 'new' wars of the post-Cold War era—wars which increasingly straddle the 'technology gaps' that separate advanced industrial nations from informal fighters—the world's burgeoning cities are the key sites. Indeed, urban areas have become the lightning conductors for our planet's political violence. Warfare, like everything else, is being urbanized. The great geopolitical contests—of cultural change, ethnic conflict and diasporic social mixing; of economic re-regulation and liberalization; of militarization, informatization and resource exploitation; of ecological change—are, to a growing extent, boiling down to violent conflicts in the key strategic sites of our age: contemporary cities.
Two things happen in such a setting.
The first is that everyone becomes a target, as we have seen repeatedly in the Russian assaults on Mariupol.
The second is that—if cities are ‘key sites’—then military ‘doctrine’ is not going to be far behind. This is ‘MOUT’, or ‘Military Operations in Urban Terrain’. But, as Tooze notes, the urban space is challenging for modern mechanised infantry. The city may not be fortified any longer, but urban layouts are more helpful to defenders than attackers.
It tends to help, in other words, if you have the overwhelming share of force. Tooze doesn’t mention this, but Stephen Graham notes in his book that Israeli assaults in Palestine by the Orwellian-named ‘Israeli Defence Force’ were monitored by US military observers for research purposes.
Grozny looms large in MOUT accounts, because it represents something of a controlled experiment: the failure by Russian troops in 1994-95, the brutal ‘success’ in 1999-2000. But unlike the Americans and Israelis, Russia has done little urban fighting since then. During its Georgia campaign it bombed Tbilisi but didn’t attempt to attack it on the ground. In Syria, Russian forces were generally well away from the urban front lines.
Tooze notes a LinkedIn post from last November that suggests that the Russians have been working on their urban warfare doctrine since then—it describes using precision military fire to isolate parts of the city being attacked. But we haven’t seen much of that in Ukraine.
And the number of troops needed to capture a city is huge. Mariupol—population 400,000—is close to the size of Stalingrad in 1942, and about the same size as Dublin. Kyiv is six times larger. A serious assault on even part of the city would take more men than Russia has so far committed to the whole campaign, even if you could keep them supplied.
Looking at the war from this perspective suggests why we have seen the military outcomes we have. Despite their superiority in numbers, the Russians don’t have enough troops to take control of cities on the ground against committed defenders. (The myth of the ‘liberating Russian army’ must have done a lot of work in the planning phase.) Which means that civilians, and their urban spaces, will go on being military targets.
Ukraine notes
Food researcher Sarah Taber had a Twitter thread at the weekend noting that the impact of the war on global wheat production was a lot smaller than has been suggested. Yes, 25% of world wheat production comes from Ukraine and Russia, but only about 0.9% of world output has been affected by the war—something like seven million tonnes out of a world total of 778 million tonnes.
That sounds like a lot, but market signals seem to have worked. Globally, farmers planted more wheat this spring in response to higher forward prices driven by concerns about the war. This will get harvested in the autumn. Most wheat is also eaten locally, in the country where it is produced. There are some problems—notably in the MENA countries that are dependent on imports—but this is, she says, a shipping problem, not a growing problem.
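The arithmetic behind those figures is easy to check. A minimal sketch (only the 778-million-tonne world total, the seven million tonnes affected, and the 25% Russia/Ukraine share come from the thread as reported above):

```python
# Sanity check on the wheat figures quoted above.
world_output_mt = 778    # world wheat output, million tonnes
affected_mt = 7          # output affected by the war, million tonnes
ru_ua_share = 0.25       # Russia + Ukraine share of world production

# Share of world output actually affected by the war:
affected_share = affected_mt / world_output_mt
print(f"Affected: {affected_share:.1%} of world output")  # ~0.9%

# For contrast, the worst case if all Russian/Ukrainian wheat were lost:
print(f"Worst case: {ru_ua_share:.0%} of world output")
```

The gap between 0.9% and 25% is the point of the thread: the price spike reflected fears about the worst case, not the actual loss.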
2: Saving daylight
We switched from GMT to British Summer Time in the UK at the weekend (‘spring forward, fall back’, as the American mnemonic has it). This reminded me that I’d written previously—and before the Scottish independence referendum—about the politics involved in inventing ‘daylight saving’. I’m republishing that piece here, slightly amended.
Saving the Daylight, by David Prerau, is a history of the daylight saving movement. It’s an interesting read for a number of reasons – one being that social institutions which seem mundane now were fiercely contested when they were introduced.
The original notion of getting people to wake earlier was aired by the American polymath Benjamin Franklin in 1784, and came complete with calculations of the amount of tallow that would be saved by ordinary citizens. But the movement didn’t gain any momentum until 1907 – more than a century later – when a British daylight activist (how strange that phrase seems), William Willett, wrote a pamphlet, “The Waste of Daylight”, advocating the change, for reasons both of economy and public health (children and others would be able to play outside after school).
I’d expected to discover a tale of years of campaigning, but in fact Willett got lucky. The then Prime Minister, Asquith, opposed the measure (probably because he thought there were too many noisy losers, such as farmers), but it gained support from an influential House of Commons Committee. When the Germans borrowed the plan during the Great War to increase productivity and reduce energy costs, Britain promptly followed suit – introducing it as a measure which had to be re-approved each year, to save the face of opponents. (It became permanent after the War, and has been tinkered with since, but not challenged).
In the United States, it was a different story; daylight saving was more complex on a continent with multiple time zones, and although it was introduced during World War I the legislation was subsequently repealed, and state and city jurisdictions got to make their own decisions – a bit like the US response to Kyoto.
London time and local time
What do we learn from the story? Well, the whole plan was only possible because clocks had become accurate enough to be set, and it was only possible to discuss it conceptually because the railways had effectively constructed a single national time zone in the UK. Prior to this, Dublin Mean Time (Ireland was then part of the UK) had been 25 minutes behind London, because Dublin lies to the west and the sun rises later there; in 1840, a timetable told passengers that
“London time is kept at all stations on the railway, which is 4 minutes earlier than Reading time, 5 1/2 minutes before Steventon time, 7 1/2 minutes before Cirencester time, 8 minutes before Chippenham time, and 14 minutes before Bridgewater time”.
The great Tom Tower clock in Oxford was fitted with two separate minute hands to show both local and London time.
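The sizes of those local offsets follow directly from longitude: the sun’s apparent position moves 15 degrees per hour, so local mean time lags Greenwich by four minutes per degree west. A quick sketch (the longitude values are approximate figures I’ve supplied for illustration, not taken from the timetable):

```python
# Local mean (solar) time runs 4 minutes behind Greenwich for every
# degree of longitude west: 360 degrees / 24 hours = 15 degrees per hour.
MINUTES_PER_DEGREE = 24 * 60 / 360   # = 4.0

def local_offset_minutes(longitude_west_deg: float) -> float:
    """Minutes by which local mean time lags London (Greenwich) time."""
    return longitude_west_deg * MINUTES_PER_DEGREE

# Approximate longitudes west of Greenwich (illustrative values):
towns = {"Reading": 0.97, "Dublin": 6.26}
for town, lon in towns.items():
    print(f"{town}: ~{local_offset_minutes(lon):.0f} minutes behind London")
```

On these rough figures, Reading comes out around four minutes behind London and Dublin around 25 — consistent with the timetable quote and the Dublin Mean Time offset mentioned above.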
(Tom Tower, Oxford. Photo via Wikipedia)
Some of the opposition argued that the daylight-saving advocates were tinkering with ‘natural time’ – as if time wasn’t an artificial social construct in the first place. Some of the arguments against seem absurd in hindsight (and possibly at the time). There was a peer who argued that if aristocratic twins were born either side of 2am when the clocks were being put back, the second born would be recorded with an earlier time of birth, which would affect inheritance of titles.
But this is not to pretend that this is only of historic interest. A map on Wikipedia suggests that only about half of the planet uses DST, a quarter used to use it, and another quarter has never used it. The countries that do use it are mostly richer, and in the north.
There are relatively persistent suggestions that the UK should shift its time in line with western Europe—which would mean double summer time in summer. This would be bad for Scotland, where it would lead to dark winter mornings. With a Scottish Parliament, it’s quite possible that Scotland might choose not to follow the UK’s lead, and move itself into a different time zone. Social constructs have symbolic meanings.
Notes from readers
Charles Tallack of the Health Foundation checked the claim in a recent post here that nine out of ten of the poorest regions in northern Europe were in the UK, and found that a better number was six (courtesy of Full Fact). The data is from 2016, not 2014.
The figure measures GDP per head for the region—a measure of average income. There are also notes in Full Fact about the definition of ‘northern Europe’, since the (poorer) Baltic states are excluded.
All the same, if the incidence of such regions was distributed randomly across northern Europe, the UK would have about two of them. And according to The Guardian, when Eurostat revisited the figures in 2017, the number of UK regions in the top ten had increased to seven—up from three in 2008. The source for this story was the High Pay Centre, but I couldn’t track it back to them. Either way, this suggests that it’s an effect of austerity.
It’s worth adding that this is a moving target; it’s measuring relative poverty, not absolute. So if GDP growth is higher across northern Europe as a whole than it is in the UK, UK regions will slide down the table. It’s also worth noting that regions within London figure on both sides of this version of the table. Eurostat no longer includes the UK in this data (because, Brexit), which the government probably regards as a Brexit bonus.
j2t#288
If you are enjoying Just Two Things, please do send it on to a friend or colleague.