India, April 19 -- "Three or four years ago, a lot of the work in deep learning was testing and experimentation, finding out what could be done with the technology. At the time it was a mix of training and inference, and the tools of the trade were not highly optimized. Between then and now, deep learning frameworks such as TensorFlow arrived. That allowed the creation of a lot of new models and topologies, and it allowed us to build an optimized software stack from those deployment frameworks down to the CPU. This optimization, with new libraries and new graph compilers, gives us a more than 200x improvement. The CPU that you probably have your enterprise application running on is one you trust and are familiar with, and now it is performing ver...
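To make the "framework down to the CPU" optimization path concrete, here is a minimal sketch of CPU inference in TensorFlow with oneDNN-backed kernels and graph compilation. The TF_ENABLE_ONEDNN_OPTS environment variable, tf.function tracing, and the Grappler graph optimizer are real TensorFlow features; the toy model and input shapes are illustrative assumptions, not details from the article.

```python
# Minimal sketch (not from the article): TensorFlow CPU inference with
# oneDNN-optimized kernels and graph compilation. Model and shapes are
# illustrative assumptions.
import os

# oneDNN-optimized CPU kernels; on by default in recent TensorFlow builds,
# set explicitly here. Must be set before importing tensorflow.
os.environ["TF_ENABLE_ONEDNN_OPTS"] = "1"

import tensorflow as tf

# A toy classifier standing in for an enterprise model.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# tf.function traces the Python call into a graph, which TensorFlow's
# graph optimizer (Grappler) can fuse and rewrite for the CPU.
@tf.function
def predict(batch):
    return model(batch, training=False)

batch = tf.random.uniform((32, 784))
print(predict(batch).shape)  # (32, 10)
```

The key point of the sketch is that the speed-up described in the quote comes from the stack, not the user's code: the same Python model definition is lowered through the framework's graph compiler and optimized kernel libraries onto the CPU.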