The Best Things We Read in AI This Week
“Artificial Intelligence” is Vintra’s weekly round-up of AI-related articles, blogs, videos, and papers we liked.
- The Fastest Way to Train Neural Nets … A deep, deep dive into AdamW & super-convergence. Read the article here.
The journey of the Adam optimizer has been quite a roller coaster. First introduced in 2014, it is, at its heart, a simple and intuitive idea: why use the same learning rate for every parameter when we know that some surely need to be moved further and faster than others?
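That per-parameter scaling can be sketched in a few lines of NumPy. This is a minimal, illustrative version of the Adam update rule (not code from the linked article); the hyperparameter defaults are the ones commonly cited from the 2014 paper.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update. Each parameter gets its own effective step size,
    scaled by running estimates of its gradient's mean (m) and
    uncentered variance (v)."""
    m = beta1 * m + (1 - beta1) * grad        # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment estimate
    m_hat = m / (1 - beta1 ** t)              # bias correction for warm-up
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Two parameters with wildly different gradient magnitudes still take
# comparably sized steps, because each step is normalized by that
# parameter's own gradient history.
theta = np.zeros(2)
m, v = np.zeros(2), np.zeros(2)
grad = np.array([100.0, 0.001])
theta, m, v = adam_step(theta, grad, m, v, t=1)
```

After one step, both parameters have moved by roughly the learning rate (about 0.001) despite a 100,000x difference in gradient magnitude, which is exactly the "move each parameter at its own pace" intuition.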
Check out our latest case study, The Running Man.
- From the Jan/Feb 2018 edition of the Harvard Business Review, Artificial Intelligence for the Real World. Read the article here.
We believe that every large company should be exploring cognitive technologies. There will be some bumps in the road, and there is no room for complacency on issues of workforce displacement and the ethics of smart machines. But with the right planning and development, cognitive technology could usher in a golden age of productivity, work satisfaction, and prosperity.
- DeepMind is helping AI imagine scenes it hasn’t specifically seen before. Read the article here.
Given a handful of “snapshots” of a virtual scene, the software—known as a generative query network (GQN)—uses a neural network to build a compact mathematical representation of that scene. It then uses that representation to render images of the room from new perspectives—perspectives the network hasn’t seen before.
- It’s not quite The Sims, but it’s getting us one step closer to robot assistants in the home. Read the article here.
… VirtualHome, a new project out of the MIT Computer Science and Artificial Intelligence Laboratory, a place where AI can rehearse complicated tasks and household chores in a simulated, virtual world. The idea is that by mastering certain commands in the simulation, we could someday have robots with AI systems that can more easily perform these tasks in the real world.
Subscribe to our blog to stay up to date!