Apple’s Carlos Guestrin cautions AI leaders to think very carefully about how they use their data

In an engaging, wide-ranging talk on the history of AI (including a very early mechanical perceptron), Carlos Guestrin outlined four trends he sees shaping the future:
1. a shift from data-parallel workloads (e.g. Spark) to HPC workloads (e.g. deep learning on massive GPU clusters) as the engine for driving insights from huge (rather than merely "massive") data
2. a return to specialized hardware (e.g. custom chips and FPGAs for HPC) and the engineering work needed to target it (e.g. AutoTVM)
3. commoditization of deep learning tools to lower barriers to adoption (e.g. pretrained models, higher-level libraries)
4. ensuring that models are inclusive by making them interpretable
The rise of machine learning has been one of the most exciting developments in modern technology, but according to Carlos Guestrin, one of the oldest principles of computing still applies: garbage in…