Tech-savvy developer types often ride bikes, and they instinctively back the idea of robot vehicles. After all, the subconscious asks: if I can code a computer to play a game, what’s so hard about getting it to move around a real map?
But driverless cars are not like ‘normal’ AI. They exist in a world with all-too-real consequences. For the first time we’re asking billions of people to cede control of a lethal machine to a still highly experimental, incredibly complicated autonomous system. Noise, edge cases and unusual behaviours not present in training data all abound. Autonomous vehicles will be more dangerous for cyclists and pedestrians, not less.
Bikes and people can suddenly reverse, move sideways, or cut across diagonally. They can even jump! This means that dealing safely with these types of road users – call it the ‘Friday Night In Peckham Test’ – is a far bigger challenge than a motorway, where three lanes of traffic move in predictable patterns, all at between 40 and 80MPH.
There’s a reason car manufacturers are testing away from towns, and won’t share testing data or test specifications. They’d rather not have to deal safely with pedestrians and bikes, because they know how much more costly that is – so they’d prefer those users simply went away. They’ll probably get their wish, because car-makers are marquee ‘heavy industry’ employers, and because AI belongs to the sleek, shiny, optimistic future of tech that all politicians are desperate to court.
Instead we will see two worrying developments designed to shift responsibility away from car-makers. First, there will be pressure to gradually remove troublesome bikes and pedestrians from parts of the road network ‘for their own safety’. We’ve already started to see this.
Second, manufacturers will introduce two levels of AI: a truly autonomous ‘safe mode’ which overreacts to every stimulus at infuriatingly (unworkably) slow speeds, and a ‘sport setting’ which makes much riskier decisions to enable faster speeds, under the flimsy caveat that users ‘monitor the system more actively’. Most will prefer to travel faster, but few will be bothered to supervise the AI closely or consistently when Netflix and social media are available distractions.
Finally, a growing body of evidence shows that automating too much of a routine task can make it more dangerous, not less. Airline pilots’ skills atrophy when autopilot handles most of their flying time – with literally lethal consequences when an emergency occurs. Car drivers – most of whom already take driving far less seriously than such a dangerous activity merits – will suffer the same fate. Who would bet on a driver, suddenly handed back control in an emergency, making the right split-second decision when they have perhaps hardly driven for years?
We’d only just started to halt the decades-long decline in cycling and walking rates, and to lower urban speed limits to 20MPH (‘twenty’s plenty’ initiatives). The rise of autonomous vehicles threatens to undo this, relegating people walking and cycling to third place – behind fleets of AI cars travelling bumper to bumper at far higher speeds than today, and a few die-hard petrolheads defiantly navigating their own vintage tanks among the whizzing fleet.
One of the best things about my teenage years was romping around town on foot or by bike with my mates. My daughter turns 16 in 2030. It’s terrifying to think that by then, the killer robots that end her life might not be gun-toting drones, but plain old delivery vans.