Making large AI models available on devices that operate in isolated environments, such as agricultural areas, forests, or oceans: that, in a nutshell, is the mission of Dolly Sapra, assistant professor at the UvA, who works on 'Edge AI'. To make that possible, energy efficiency is key.

AI offers unprecedented possibilities, but an AI model such as ChatGPT will not work on a small device without internet access. Yet areas with little or no coverage could also benefit from autonomous intelligent systems: think of forest fire prevention, poacher detection or crop monitoring, with the system built into a small device or a drone. In healthcare, too, it may be undesirable to send privacy-sensitive data to a server. In such situations, Edge AI can be deployed: AI that runs at the edges of a network, far away from the servers it would normally rely on. Making AI work on such devices requires considerable modifications. Current AI models are very large and need to perform a lot of calculations, so a battery-powered device would quickly run out of power. By tinkering with the software, AI models should be able to do their work at the edge as well.

Dolly Sapra, photo Marius Herget

Via India and London to Amsterdam

Since late March, Dolly Sapra has been an assistant professor in the Parallel Computing Systems (PCS) department of the Informatics Institute, focusing on Edge AI. Originally from India, Sapra studied computer science and engineering and then worked as a software developer. When her husband got a job in London, she moved with him, and they planned to stay there for a long time. But then he was asked to join the Amsterdam branch for a year. Sapra went along and found a temporary role as a visiting researcher at PCS. She liked the freedom of academia, so after a year she started applying for PhD positions. Initially she looked in London, but department head Andy Pimentel asked: why not stay here? ‘I hadn't even considered that option, but actually thought it was a good idea.’ Sapra and her family stayed in Amsterdam, and her PhD research was followed by a postdoc position and, recently, the appointment as assistant professor.

Vertical agriculture

The difference from her previous research is that, in addition to her fundamental work, Sapra now wants to think more about applications. Not that she wants to develop those herself, but she wants to connect with people who can. ‘I would like my research to be applied, but that is difficult with fundamental science. If you don't actively put effort into it yourself, it can take 10 to 20 years for your idea to be taken up.’ Sapra is therefore now working on a proposal to apply Edge AI in devices that take care of crop pollination in vertical farming, for example. Here, crops are grown in buildings under LED lights, where insects cannot be used for pollination. Small devices should be able to estimate automatically when pollination is needed and where exactly on the crop to apply it. For this, a small, energy-efficient AI model is needed.

Optimal use

One way to create such a model is simply to see whether parts of the model can be left out, but how far do you go? ‘If you leave out more, your model becomes less accurate. You need to find a trade-off.’ Another way is frequency scaling: varying the clock frequency of chip components. When a task needs to be performed, multiple components can be put to work. ‘Some components run at a higher frequency and finish quickly, but they consume more energy as a result. Other components work more slowly. We are investigating how to bring down the frequency without slowing down the task as a whole. So it's about making the most of all the resources of the software and hardware.’
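To make the first idea a bit more concrete, here is a minimal sketch in PyTorch, not Sapra's own method: it removes the smallest-magnitude weights from a tiny example network and reports how sparse the model has become. The two-layer network and the 50% pruning ratio are illustrative assumptions only; in practice one would retrain and re-measure accuracy after pruning to find the trade-off described above.

```python
# Sketch of the pruning trade-off: dropping a fraction of a network's
# weights shrinks the model, but too much pruning costs accuracy.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# A small network that could plausibly run on a battery-powered edge device
# (sizes are arbitrary, for illustration only).
model = nn.Sequential(
    nn.Linear(64, 32),
    nn.ReLU(),
    nn.Linear(32, 4),
)

def sparsity(m: nn.Module) -> float:
    """Fraction of weights that are exactly zero across all Linear layers."""
    zeros, total = 0, 0
    for layer in m.modules():
        if isinstance(layer, nn.Linear):
            zeros += int(torch.sum(layer.weight == 0))
            total += layer.weight.numel()
    return zeros / total

print(f"sparsity before pruning: {sparsity(model):.2f}")

# Remove the 50% of weights with the smallest magnitude in each layer.
# A larger `amount` saves more memory and energy, but accuracy drops:
# that is the trade-off Sapra mentions.
for layer in model.modules():
    if isinstance(layer, nn.Linear):
        prune.l1_unstructured(layer, name="weight", amount=0.5)
        prune.remove(layer, "weight")  # make the pruning permanent

print(f"sparsity after pruning:  {sparsity(model):.2f}")
```

Frequency scaling, by contrast, is a hardware-level knob (often referred to as dynamic voltage and frequency scaling) that is typically controlled through the operating system rather than in the model code itself.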


Footprint must go down

In addition, making AI models more energy-efficient is a personal mission for Sapra, not only for edge applications but also for larger systems connected to servers in data centres. One of her new lines of research will therefore be hybrid AI models that combine current, energy-intensive techniques with older, more energy-efficient ones. ‘The models behind ChatGPT are huge. One query to ChatGPT 3 apparently costs 36 cents in energy. The big tech companies are currently only focused on making bigger models with even more features, and they are not concerned with energy consumption. But this cannot go on forever. At some point, energy consumption will become the limiting factor. So reducing the carbon footprint is a very important issue for me.’