Issue link: https://maltatoday.uberflip.com/i/1397092
My bipolar AI manager just fired me!

OPINION 29.7.2021 Alexiei Dingli

Prof Alexiei Dingli is a Professor of AI at the University of Malta and has been conducting research and working in the field of AI for more than two decades, assisting different companies to implement AI solutions. He forms part of the Malta.AI task force, set up by the Maltese government, aimed at making Malta one of the top AI countries in the world.

The pandemic brought chaos all over the globe and spared no one. One of those harshly affected was a school bus driver operating in a small rural town. When the numbers of infected people began to rise, the school shut, and what had seemed like a steady job suddenly fizzled into nothingness. Essentially, an unexpected series of events changed her life overnight. Having children dependent on her complicated her situation by various orders of magnitude, and the problem quickly shifted towards obtaining the basic needs: how to bring food to the table?

Luckily for her, the pandemic also opened new opportunities. Since people were spending more time at home and couldn't go out, they desperately needed deliveries. This situation boosted the small-goods delivery business overnight. She had already performed a few trips before the pandemic as a part-time gig, but it quickly became her only source of income following the loss of her main job.

The task was not easy: running around and delivering thousands of packages within a limited time frame. But the worst part was not that; it was the Artificial Intelligence (AI) manager.

The AI manager had no face. It was not someone you could crack a joke with, get angry at, or even raise concerns with. The manager app became the main communication channel; it could track the vehicle's movement and sometimes also demand impossible feats. These demanding requirements arise from the fact that some companies are making very bold promises to their customers. Rather than making their clients wait, they offer them same-day deliveries. So the algorithm monitors whether the drivers reached the delivery station, whether they completed the route within the predefined time window, whether they left the package on the porch hidden from thieves, and so on. The algorithms scan the incoming data, analyse it and decide whether a driver should get more routes or be deactivated. As simple as switching a button on or off.

But the algorithm does not seem to give much weight to factors beyond the driver's control, like driving through kilometres of winding dirt roads in the snow or waiting for an hour to retrieve a package because the delivery station is overflowing with other drivers. These issues and many others throw the drivers behind schedule, thus negatively affecting their delivery ratings.

In one particular case, the driver had a puncture. When she reported her situation, the company asked her to return the package, which she did, even though the tire was almost flat. Her rating fell from "great" to "at-risk" almost immediately because she had technically abandoned her delivery. Following the incident, she did receive emails from the company reassuring her that she was still one of their best drivers. Most probably, it was just a bipolar AI trying hard to be empathetic, because the very next day the algorithm re-evaluated her score and coldly terminated her by simply blocking her app.

She was stunned! She had delivered more than 8,000 packages and was rated as one of the best drivers, and because of a flat tire, they fired her on the spot!
Luckily the system provides for an appeal, so she applied. Once again, the empathetic bipolar AI sent her an email a few days later telling her that it was sorry for the delays and that her appeal was being processed. I'm sure that the AI lost many sleepless hours of processing time thinking about her precarious situation and her kids. But a few days later, the bipolar AI sent her another email informing her that the company's position had not changed after the appeal and that they would not resume her employment. At the end of the email, it "genuinely" wished her all the success in her future endeavours.

The effect of this decision was that she began to struggle financially. She stopped paying her mortgage, the bank took her car, she almost lost her house, she became dependent on government assistance, and her children spent one of the most terrible Christmases of their lives.

This episode is not a horror story situated in a distant dystopian future governed by an AI. It happened last year to a 42-year-old single mother in the United States.

Such a system has many faults. We cannot create algorithms and make them demand unrealistic goals. It makes no sense. While we should aim for productivity, we cannot treat people like machines. If something is wrong with a person's performance, it should be discussed humanely while considering their backstory. If an algorithm assesses a person, that person should have the right to examine the assessment, counter the arguments and appeal it. Finally, we should never remove the human judges from the loop. Algorithms are not infallible, they are subject to biases, and they cannot determine the future of human lives based upon a discrete mathematical formula.

This story is not a one-off mistake; many other similar cases are sprouting every day. Companies are resorting to automating their human resources operations, banks are approving loans with the help of computerised systems, and courts are relying on algorithms to grant parole. If we want to resolve this situation, algorithms that affect people need to be transparent about their decisions; they have to justify them and give people the information they need. People should have the ability to call out mistakes and have access to simple corrective mechanisms. Legislators are duty-bound to enact rules which prevent harm before it is too late. Only then can we hope to create a fair society where AI helps every one of us improve our lives, rather than acting as a digital executioner over our livelihoods.