In December 2015, Google announced a new tool for app developers: it lets apps, robots and drones “see.”
The tech is called the Google Cloud Vision API and it allows any app developer to tap into Google’s smart “machine learning” service that can identify objects, including faces and emotions.
Cloud Vision tackles the hard computing problem of “seeing.” For instance, your computer can scan and reproduce an image of a mountain or an image of a baby, but to the computer, both look the same: a bunch of pixels.
Cloud Vision can detect various emotions on a face such as a happy smile or an angry frown, Google says.
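For developers curious what calling the service involves, here is a minimal Python sketch. It builds an `images:annotate` request for Google's public v1 REST endpoint and pulls emotion likelihoods out of a response; the canned response at the bottom is illustrative only, and no network call is made.

```python
import base64

# Public endpoint for the Cloud Vision v1 REST API (requires an API key
# or OAuth credentials when actually called).
VISION_ENDPOINT = "https://vision.googleapis.com/v1/images:annotate"

def build_annotate_request(image_bytes: bytes) -> dict:
    """Build a request body asking for face (emotion) and label detection."""
    return {
        "requests": [{
            "image": {"content": base64.b64encode(image_bytes).decode("ascii")},
            "features": [
                {"type": "FACE_DETECTION", "maxResults": 5},
                {"type": "LABEL_DETECTION", "maxResults": 5},
            ],
        }]
    }

def summarize_faces(response: dict) -> list:
    """Extract the per-face emotion likelihoods from a Vision API response."""
    faces = response.get("responses", [{}])[0].get("faceAnnotations", [])
    return [
        {"joy": f.get("joyLikelihood"), "anger": f.get("angerLikelihood")}
        for f in faces
    ]

# Illustrative canned response, shaped like the API's faceAnnotations output:
fake_response = {
    "responses": [{
        "faceAnnotations": [
            {"joyLikelihood": "VERY_LIKELY", "angerLikelihood": "VERY_UNLIKELY"}
        ]
    }]
}
print(summarize_faces(fake_response))
# → [{'joy': 'VERY_LIKELY', 'anger': 'VERY_UNLIKELY'}]
```

In a real app you would POST `build_annotate_request(...)` as JSON to `VISION_ENDPOINT` with your API key and feed the decoded response to `summarize_faces`.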
To demonstrate the power of Cloud Vision, Google built a cute little robot that runs the service, using a DIY robot kit known as GoPiGo.
At 1:14 in the video below, the robot demonstrates how it can follow faces and detect emotions. At 2:08, it demonstrates how it can detect and identify other objects, like glasses, a banana and money.