How to Create AI Apps with Emotion and Image Recognition

Artificial intelligence and machine learning are among the most talked-about subjects in computer science. To this day, computers only understand machine languages, but leading companies around the world are working to make them understand human behaviour, recognise our activities, and talk to us much as another person would. Good examples include Google Assistant, Apple's Siri, and Facebook's ads system, which delivers ads relevant to your interests.

Detecting emotions and objects in a camera image requires complex image analysis and computation, but today, if you have a good idea for a consumer product built on these technologies, it is actually not that hard to realise. Microsoft's Cognitive Services expose these intelligent capabilities through a REST API that you can use in any of your applications; Uber, for example, has started using it to let drivers authenticate with their faces. In essence, the API scans your image and returns text/JSON describing everything it found related to that object or image.

Today I'll demonstrate how this technology works using Thunkable, a free drag-and-drop app creation platform for Android (originally based on App Inventor, but with support for Material Design). This should give you a clear overview of how to use the emotion and image recognition services.
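For context, here is a minimal sketch in Python (using the `requests` library) of what such a REST call can look like outside of Thunkable. The region in the endpoint URL, the placeholder subscription key, and the image URL are assumptions; you would substitute the values from your own Azure portal.

```python
import requests

# Assumed values: replace the region and key with your own from the Azure portal.
FACE_API_URL = "https://westus.api.cognitive.microsoft.com/face/v1.0/detect"
SUBSCRIPTION_KEY = "<your-subscription-key>"

# Ask the Face API to include emotion scores for each detected face.
params = {"returnFaceAttributes": "emotion"}
headers = {
    "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
    "Content-Type": "application/json",
}
# The image is passed by URL here; raw bytes can also be POSTed with
# Content-Type: application/octet-stream.
body = {"url": "https://example.com/photo.jpg"}

response = requests.post(FACE_API_URL, params=params, headers=headers, json=body)
response.raise_for_status()

# The service responds with JSON: one entry per detected face, each with a
# bounding box and a dictionary of emotion scores (anger, happiness, ...).
for face in response.json():
    print(face["faceRectangle"], face["faceAttributes"]["emotion"])
```

In the Thunkable app we will build, the same request and JSON parsing are handled for you by the platform's built-in components, so no code like this is required.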

Videocast
