AI on Low-Cost Hardware

Software Subgroup

Abstract

Artificial intelligence has become a dominant part of our lives; however, complex AI models tend to demand substantial energy, computationally expensive operations, and large amounts of memory. This effectively excludes a whole class of hardware from their applicability: relatively resource-constrained, low-cost devices. This paper investigates learning methods that are potentially better suited to such devices: the forward-forward algorithm and Hebbian learning rules. The results are compared against backpropagation with equivalent network configurations, training hyperparameters, and internal data types on several types of low-cost hardware. Backpropagation consistently outperformed the other algorithms, exhibiting higher accuracy, faster training, and faster inference than forward-forward models. Forward-forward models can come close to matching backpropagation's accuracy, but they suffer from longer training times and degraded performance in multi-layer networks. Additionally, a poorly trained forward-forward model is sensitive to quantization, which causes a significant drop in accuracy. On the other hand, forward-forward models offer the benefit of training each layer independently, allowing more flexibility in optimizing the training process. Hebbian models were not found to be competitive, performing below the required threshold; smaller Hebbian models targeting MCUs and FPGAs would likely perform even worse.
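The layer-independent training that the abstract credits to the forward-forward algorithm can be sketched in a few lines. The following is a minimal NumPy illustration, not the paper's implementation: all shapes, hyperparameters (learning rate, goodness threshold), and the toy data are assumptions chosen for clarity. Each layer maximizes a local "goodness" (sum of squared activations) for positive samples and minimizes it for negative samples, so no gradient ever flows between layers.

```python
import numpy as np

rng = np.random.default_rng(0)

class FFLayer:
    """One forward-forward layer with a purely local update rule.

    Hyperparameters (lr, threshold) are illustrative assumptions,
    not values from the paper.
    """

    def __init__(self, n_in, n_out, lr=0.03, threshold=2.0):
        self.W = rng.normal(0.0, 0.1, (n_in, n_out))
        self.lr = lr
        self.threshold = threshold

    def _normalize(self, x):
        # Length-normalize inputs so goodness from the previous
        # layer cannot leak into this layer's goodness.
        return x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-8)

    def forward(self, x):
        return np.maximum(self._normalize(x) @ self.W, 0.0)  # ReLU

    def train_step(self, x_pos, x_neg):
        # Push goodness above the threshold for positive samples
        # (sign = +1) and below it for negative samples (sign = -1),
        # via gradient ascent on log sigmoid(sign * (goodness - thr)).
        for x, sign in ((x_pos, 1.0), (x_neg, -1.0)):
            xn = self._normalize(x)
            h = np.maximum(xn @ self.W, 0.0)
            goodness = (h ** 2).sum(axis=1)
            p = 1.0 / (1.0 + np.exp(-sign * (goodness - self.threshold)))
            grad_h = (sign * (1.0 - p))[:, None] * 2.0 * h
            self.W += self.lr * xn.T @ grad_h / len(x)

# Toy data: two separable clusters standing in for real
# positive/negative samples (an assumption for the sketch).
x_pos = rng.normal(1.0, 0.3, (64, 8))
x_neg = rng.normal(-1.0, 0.3, (64, 8))

# Layers are trained one at a time, each to completion, with no
# backward pass: this is the flexibility noted in the abstract.
layers = [FFLayer(8, 16), FFLayer(16, 16)]
hp, hn = x_pos, x_neg
for layer in layers:
    for _ in range(200):
        layer.train_step(hp, hn)
    hp, hn = layer.forward(hp), layer.forward(hn)

goodness_pos = (hp ** 2).sum(axis=1).mean()
goodness_neg = (hn ** 2).sum(axis=1).mean()
```

Because each layer's loss depends only on its own activations, layers could in principle be trained sequentially, on separate devices, or with different precisions, which is what makes the approach attractive for resource-constrained hardware.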