Although it can sometimes feel like it, AI is not a ‘silver bullet’. As with any technology, AI must be used responsibly.
There has been a trend towards ever-larger AI models to achieve better performance on tasks, especially in Natural Language Processing (NLP). Larger models (those with more tunable parameters) require more data and computing power to train and run, creating more emissions, higher costs, and steeper technical barriers to entry for developers. The cost of training state-of-the-art AI models (essentially, teaching them how to perform a given task) has increased by an overall factor of 300,000x in 6 years:
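To put that growth rate in perspective, a small arithmetic sketch (the function name is ours, purely for illustration) converts the 300,000x-over-6-years figure into an implied doubling time:

```python
import math

def doubling_time_months(growth_factor: float, period_years: float) -> float:
    """Months per doubling implied by a total growth factor over a period."""
    doublings = math.log2(growth_factor)  # how many 2x steps fit in the factor
    return (period_years * 12) / doublings

# A 300,000x increase over 6 years implies a doubling roughly every 4 months.
print(round(doubling_time_months(300_000, 6), 1))  # → 4.0
```

In other words, training budgets on this trajectory double far faster than hardware efficiency improves, which is why the cost curve matters.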
This large increase in computational requirements has raised barriers to participation in AI research due to the rising cost of generating state-of-the-art results — annual compute budgets at many academic research labs, for example, are often smaller than the cost of a single training run for one of the state-of-the-art models in the figure above. It also leads to negative environmental impacts; Strubell et al. (2019) showed that training a modern AI model can emit a similar amount of CO2 as an average car does in 5 years. Developing and deploying ‘greener’ AI should be a priority if we are to alleviate these issues.
The Allen Institute for AI has argued for prioritizing Green AI: research that yields novel results while taking computational cost into account, encouraging a reduction in the resources spent. Developers should consider reporting model efficiency and the computational ‘price tag’ of finding, training, and running their models. For example, research papers could be required to plot accuracy as a function of computational cost and training set size, providing a baseline for more data-efficient research in the future.
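A training ‘price tag’ can be approximated from hardware power draw and run time. The sketch below is illustrative only — the function name is ours, and the default overhead multiplier (PUE) and grid carbon intensity are assumptions loosely following the values used by Strubell et al. (2019), not measurements:

```python
def training_price_tag(gpu_count: int, gpu_power_kw: float, hours: float,
                       pue: float = 1.58, kg_co2_per_kwh: float = 0.43):
    """Rough energy (kWh) and CO2 (kg) estimate for one training run.

    pue: data-centre power usage effectiveness (cooling/overhead multiplier).
    kg_co2_per_kwh: grid carbon intensity. Both defaults are illustrative
    assumptions, not universal constants.
    """
    energy_kwh = gpu_count * gpu_power_kw * hours * pue
    co2_kg = energy_kwh * kg_co2_per_kwh
    return energy_kwh, co2_kg

# e.g. 8 GPUs drawing 0.3 kW each for 120 hours:
energy, co2 = training_price_tag(gpu_count=8, gpu_power_kw=0.3, hours=120)
print(f"{energy:.0f} kWh, {co2:.0f} kg CO2")
```

Even a rough figure like this, reported alongside accuracy, would let readers weigh a result against the resources it consumed.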
More positively, the AI ecosystem is strongly integrated with the wider open-source community. AI and tech companies frequently release trained models, so developers can benefit from their data and AI expertise without incurring the financial or environmental costs of re-training.
There has also been progress in techniques for reducing the size of a model before it is deployed. Neural networks (a popular type of AI model, often with a large footprint) can be pruned: after iterations of training, their weakest neuron connections are removed, shrinking the model without significantly harming performance.
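One pruning step of the train-then-prune cycle described above can be sketched as magnitude pruning — zeroing the smallest-magnitude weights. This is a minimal NumPy illustration (the function name is ours), not a production pruning implementation:

```python
import numpy as np

def prune_by_magnitude(weights: np.ndarray, fraction: float) -> np.ndarray:
    """Zero out the given fraction of weights with the smallest magnitude."""
    flat = np.abs(weights).ravel()
    k = int(len(flat) * fraction)  # number of connections to remove
    if k == 0:
        return weights.copy()
    # k-th smallest magnitude becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0  # ties may prune slightly more
    return pruned

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
p = prune_by_magnitude(w, 0.5)
print(f"sparsity: {np.mean(p == 0):.0%}")
```

In practice, frameworks repeat this over several train/prune rounds and often fine-tune after each one, so the surviving connections can compensate for those removed.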
AI has proven its commercial value but, with climate change beginning to seriously impact industries, ecosystems and livelihoods, now is the time for stakeholders across the public and private sectors to collaborate. Together we can make a purposeful, lasting and necessary impact with responsible AI applications.
Original article on Medium