ARK Invest: AI training costs dropped 100-fold between 2017 and 2019
Machine learning systems are cheaper to train now than ever before. That’s the assertion of ARK Invest, which today published a meta-analysis indicating that training costs are improving at 50 times the pace of Moore’s law, the observation that the number of transistors on a chip doubles roughly every two years.
In its report, ARK found that while compute devoted to training doubled in step with Moore’s law from 1960 to 2010, training compute, measured in petaflop/s-days (a petaflop/s is a quadrillion operations per second; a petaflop/s-day is that rate sustained for a full day), has grown tenfold each year since 2010. Coinciding with this, training costs have declined tenfold annually over the past three years: in 2017, training an image classifier like ResNet-50 on a public cloud cost around $1,000, while in 2019 it cost around $10.
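As a quick sanity check on the arithmetic, the two cost figures imply exactly the tenfold-per-year decline the report describes. A minimal sketch (the dollar amounts are the report's approximations, not precise measurements):

```python
# Annual cost-decline factor implied by ARK's two ResNet-50 data points.
cost_2017 = 1_000  # approx. cost (USD) to train ResNet-50 on a public cloud, 2017
cost_2019 = 10     # approx. cost (USD) in 2019
years = 2

total_improvement = cost_2017 / cost_2019          # 100x over two years
annual_factor = total_improvement ** (1 / years)   # geometric mean: ~10x per year

print(f"{total_improvement:.0f}x total, about {annual_factor:.0f}x per year")
```

The geometric mean is the right average here because the decline compounds multiplicatively year over year rather than by a fixed dollar amount.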
That’s surely music to the ears of startups competing with well-financed firms like Google’s DeepMind, which last year recorded losses of $572 million and took on a billion-dollar debt. While some experts argue that labs outmatched by tech giants are pushed by their resource constraints toward novel research, training remains an unavoidable expense in AI work, whether in the enterprise, academia, or elsewhere.
The findings would appear to agree with, and indeed draw on, those in a recent OpenAI report, which suggested that since 2012 the amount of compute needed to train an AI model to the same performance on classifying images in the ImageNet benchmark has been decreasing by a factor of 2 every 16 ...
More on: venturebeat.com