Issue link: https://maltatoday.uberflip.com/i/1408316
OPINION | 9.9.2021

Is it too late to adopt Artificial Intelligence?

Alexiei Dingli

Prof Alexiei Dingli is a Professor of AI at the University of Malta and has been conducting research and working in the field of AI for more than two decades, assisting various companies in implementing AI solutions. He forms part of the Malta.AI task force, set up by the Maltese government and aimed at making Malta one of the top AI countries in the world.

At the dawn of this millennium, Steven Spielberg released a science fiction movie called Artificial Intelligence (AI). The film was originally conceived by the great American director Stanley Kubrick, who was inspired by the 1969 short story "Supertoys Last All Summer Long". Before this, Hollywood had been toying with AI concepts for decades, before the term AI was even conceived! The first movie to feature a robot was Metropolis, produced in 1927. What followed was a chain of blockbuster movies featuring AI, including "2001: A Space Odyssey", "Blade Runner", "The Terminator", "The Matrix", "I, Robot" and many others. However, none of them pushed the term AI into collective use as much as Spielberg's AI movie.

Since then, AI has not only found its way into our dictionaries, but it is increasingly entering everyday use. The main reason it took so long is that systems that use AI do not acknowledge it! Have you ever seen a car manufacturer prominently advertise AI? Yet many standard cars have parking sensors, auto-parking features, lane-centring steering and traffic-sign recognition, to name a few. We can say the same for air conditioners, ovens, washing machines and most mundane household appliances. So we have all been using AI for quite a while without even knowing it!

But AI did not gain popularity just because of the movies. Something else happened in the first decade of this millennium.

First of all, AI is data-hungry and, as such, requires large volumes of information to operate well. Ever since the beginning of this millennium, the world has experienced a massive surge in data production, which is still ongoing. Computers, smartphones and wearable devices constantly capture videos, photos, sounds, 3D and text-based content. Social media has turned everyone into a content creator. Internet of Things (IoT) sensors are being plugged in all over the place, promptly reporting all sorts of things: who's ringing the doorbell, the state of the garage door, the temperature or air quality inside the home. According to Forbes, 90% of all the data in the world was created during the past two years!

Second, just having data is not enough if the algorithms cannot handle it. Luckily, around 2006, a new breed of AI algorithms called Deep Learning (DL) was developed. These programs build upon Artificial Neural Networks (ANNs), a set of techniques created in 1958 and inspired by the internal workings of the human brain. DL algorithms can learn from massive data sets, and their accuracy is extremely high, in some cases even higher than what a human can achieve. Most of the systems used today within consumer applications make use of such techniques.

Third, since these algorithms process vast amounts of data, they need powerful computers. During the same period, powerful processors initially designed to cater for high-end graphics became the processors of choice for DL algorithms. But even these were not enough to quench the thirst for processing power.
Luckily, cloud computing was on the rise, allowing AI developers to source cheap processing power located remotely around the globe.

So the right conditions suddenly formed around 2010, which is why AI took off so rapidly immediately afterwards and is today on everyone's lips. But the window of opportunity in any innovation is not perpetual and typically lasts around 15 years. If you do the math, to gain the most advantage out of AI, a company needs to take action by 2025.

A survey conducted by the Massachusetts Institute of Technology (MIT) amongst 3,000 managers shows that one-third of companies still have little to no understanding of AI and will most likely miss this opportunity. 48% are piloting AI projects only to appear forward-looking, but have no intention of putting these projects to use. Only 20% of the companies are pioneers; they have deep knowledge of AI and plan to integrate it into their processes. This tallies very much with research conducted by MMC Ventures, a London-based investment firm. They looked at 2,830 AI companies based in Europe and discovered that 40% of them don't use any AI at all! Trying to keep up appearances rather than gaining real benefit from AI is rather tragic. Unfortunately, according to the Diffusion of Innovations theory developed by E.M. Rogers in 1962, this seems to be a relatively common trend in all innovations, and AI is no exception!

Based on the same survey, while almost 90% of companies believe that AI will offer them an advantage over others, only 60% have an AI strategy in place. Of these, only 10% are using it to generate financial benefits. If these companies do not act, the gap with the leading companies will widen further. The analysis also shows that companies that take the initial steps of AI adoption have a 20% advantage over their competitors in gaining market leadership. The percentage increases to 40% for those who successfully implement AI into their systems, and shoots up to more than 70% for those who revamp their processes using an intelligent combination of humans and machines. Their profits also increase fivefold compared to those who made only minor changes to a few business processes.

Of course, the technology is here, and the ball is in your court. The time window to reap the most advantages from AI is rapidly closing. It's up to you to decide whether your company will reap the rewards or lament a missed opportunity in the coming years!