This week, in the wake of the Parsons Green bombing, Channel 4 News revealed something pretty disturbing about Amazon. Users who go on the website and add certain chemicals to their basket will find themselves presented with suggestions of other ingredients that are commonly used to make explosives. Some are even pointed in the direction of remote detonators and ball bearings (which are often used as shrapnel in terrorist bombs).
It’s embarrassing, and some would say outrageous, on Amazon’s part. But of course it’s not deliberate. It’s not the fault of some marketing guy at Amazon who thought, ‘well, if we’re going to cater properly to terrorists then we need to make sure they can find all the gear they need with ease.’ Instead it’s the product of Amazon’s automated algorithms, which seek to cleverly tempt you into splashing out on more things than you planned to when you logged on.
It’s not the first such algorithmic embarrassment to have afflicted tech firms in recent weeks. After Friday’s incident, taxi app Uber was slammed for unwittingly putting up its prices in the area around Parsons Green. It soon rectified that and refunded affected passengers, but that didn’t stop it getting a tonne of social media grief.