Artificial Intelligence Is Not as Helpful as It Seems

A new paper published earlier this month by researchers at the University of Massachusetts Amherst, and flagged by the MIT Technology Review, reveals the energy and costs involved in training natural-language processing (NLP) models (i.e., teaching machines to understand and use human language).

Recent progress in hardware and methodology for training neural networks has ushered in a new generation of large networks trained on abundant data. These models have achieved notable gains in accuracy across many NLP tasks. However, those accuracy improvements depend on the availability of significant computational resources, which demand equally substantial energy consumption. As a result, these models are expensive to train and develop, both financially, owing to the cost of hardware and electricity or cloud compute time, and environmentally, owing to the carbon footprint required to power modern tensor processing hardware.
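The arithmetic behind such cost estimates is straightforward: total energy is roughly the hardware's average power draw multiplied by training time and a datacenter overhead factor (PUE), and emissions follow from the grid's carbon intensity. Below is a minimal Python sketch of that back-of-the-envelope calculation; the GPU count, power draw, training duration, PUE, electricity price, and carbon-intensity figures are all illustrative assumptions, not numbers taken from the paper.

```python
# Back-of-the-envelope estimate of the energy, carbon, and dollar cost
# of one neural-network training run. All numeric inputs below are
# illustrative assumptions, not measured values from the paper.

def training_footprint(num_gpus: int,
                       avg_power_watts: float,
                       hours: float,
                       pue: float = 1.58,            # datacenter overhead factor (assumed)
                       kg_co2_per_kwh: float = 0.4,  # grid carbon intensity (assumed)
                       usd_per_kwh: float = 0.12):   # electricity price (assumed)
    """Return (energy_kwh, co2_kg, cost_usd) for a single training run."""
    # Energy: watts * hours -> watt-hours, scaled by PUE, converted to kWh.
    energy_kwh = num_gpus * avg_power_watts * hours * pue / 1000.0
    co2_kg = energy_kwh * kg_co2_per_kwh
    cost_usd = energy_kwh * usd_per_kwh
    return energy_kwh, co2_kg, cost_usd

# Example: 8 GPUs averaging ~300 W each, training for two weeks (hypothetical).
kwh, co2, usd = training_footprint(num_gpus=8, avg_power_watts=300, hours=14 * 24)
print(f"~{kwh:,.0f} kWh, ~{co2:,.0f} kg CO2, ~${usd:,.0f} in electricity")
```

Scaled up to much larger models, and multiplied across the many repeated runs that hyperparameter tuning and architecture search require, figures like these grow quickly.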

Like Bitcoin mining (which was once done on desktops in geeky coders’ bedrooms but is now carried out at industrial scale on specialized GPUs [graphics processing units] in large data centers in places like China and Mongolia), as NLP technology has progressed, so have its computing and hardware requirements. This is a problem from both an environmental and a financial standpoint.
