Artificial Intelligence has caught our attention once again since the announcements of Google’s new products and services. Machine Learning, as a method for achieving Artificial Intelligence, will soon be integrated into the majority of features on Android.

The newest smartphones will launch with Android P, the most recent version of the operating system, letting users experience how machine learning can make their mobile devices smarter.

For developers whose clients or bosses are asking them to try out this trendy technology, Google launched the ML Kit beta. This SDK is available on Firebase and aims to make Google’s Machine Learning expertise easier to work with, lowering the entry barrier for developers who don’t have much experience in this particular field.

ML Kit for mobile developers

First of all, ML Kit aims to give all mobile developers ready-to-use base APIs for common use cases, and it is compatible with both Android and iOS.

The kit is presented as a powerful package that can make your apps more engaging, given that users expect services to keep getting smarter. Plus, being a resource offered by Google means you’ll get technology that is already used in many of their own apps, which makes it worth trying.

ML Kit is not only meant for developers with little Machine Learning experience; it is also a solution for skilled ones, since it offers an option to deploy your own TensorFlow Lite models, saving you time and resources. As mentioned before, this is done through the Firebase mobile development platform.
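
For the custom-model path, the rough flow is: register the model you uploaded to the Firebase console (plus a bundled local fallback), create an interpreter, describe the input and output tensors, and run it. The Kotlin sketch below is based on the beta’s firebase-ml-model-interpreter library as documented at the time; the model names, asset path, and tensor shapes are placeholders, and class or package names may differ in later SDK releases.

```kotlin
import com.google.firebase.ml.custom.FirebaseModelDataType
import com.google.firebase.ml.custom.FirebaseModelInputOutputOptions
import com.google.firebase.ml.custom.FirebaseModelInputs
import com.google.firebase.ml.custom.FirebaseModelInterpreter
import com.google.firebase.ml.custom.FirebaseModelManager
import com.google.firebase.ml.custom.FirebaseModelOptions
import com.google.firebase.ml.custom.model.FirebaseCloudModelSource
import com.google.firebase.ml.custom.model.FirebaseLocalModelSource

fun runCustomModel(input: Array<Array<Array<FloatArray>>>) {
    // Register the model hosted on Firebase ("my_hosted_model" is a placeholder name)
    // and a local .tflite file bundled in the app's assets as an offline fallback.
    val cloudSource = FirebaseCloudModelSource.Builder("my_hosted_model")
        .enableModelUpdates(true)
        .build()
    val localSource = FirebaseLocalModelSource.Builder("my_local_model")
        .setAssetFilePath("my_model.tflite")
        .build()
    FirebaseModelManager.getInstance().apply {
        registerCloudModelSource(cloudSource)
        registerLocalModelSource(localSource)
    }

    // Build an interpreter that prefers the hosted model and falls back to the local one.
    val options = FirebaseModelOptions.Builder()
        .setCloudModelName("my_hosted_model")
        .setLocalModelName("my_local_model")
        .build()
    val interpreter = FirebaseModelInterpreter.getInstance(options) ?: return

    // Describe the tensors (illustrative shapes: 224x224 RGB input, 1001-class output).
    val ioOptions = FirebaseModelInputOutputOptions.Builder()
        .setInputFormat(0, FirebaseModelDataType.FLOAT32, intArrayOf(1, 224, 224, 3))
        .setOutputFormat(0, FirebaseModelDataType.FLOAT32, intArrayOf(1, 1001))
        .build()

    val inputs = FirebaseModelInputs.Builder().add(input).build()
    interpreter.run(inputs, ioOptions)
        .addOnSuccessListener { outputs ->
            val probabilities = outputs.getOutput<Array<FloatArray>>(0)[0]
            // Use the probabilities, e.g. pick the highest-scoring class.
        }
        .addOnFailureListener { e -> /* handle the error */ }
}
```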

For developers with less experience in Machine Learning, the SDK now has five base APIs you can work with and integrate into your apps (a minimal call sketch follows the list):

Image labeling
Text recognition
Face detection
Barcode scanning
Landmark detection
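
All of these base APIs follow roughly the same pattern on Android: wrap your input in a FirebaseVisionImage, request a detector from FirebaseVision, and handle the result asynchronously. Here is a minimal face detection sketch in Kotlin, assuming the firebase-ml-vision dependency is set up through Firebase and that you already have a Bitmap; exact getter names shifted a bit between beta releases.

```kotlin
import android.graphics.Bitmap
import android.util.Log
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage

fun detectFaces(bitmap: Bitmap) {
    // Wrap the captured Bitmap; camera frames and byte buffers are also supported.
    val image = FirebaseVisionImage.fromBitmap(bitmap)

    // Default on-device face detector; options can additionally enable landmarks
    // and classifications such as smile probability.
    val detector = FirebaseVision.getInstance().visionFaceDetector

    detector.detectInImage(image)
        .addOnSuccessListener { faces ->
            for (face in faces) {
                Log.d("MLKit", "Face found at ${face.boundingBox}")
            }
        }
        .addOnFailureListener { e -> Log.e("MLKit", "Face detection failed", e) }
}
```

The other base APIs swap in a different detector and result type, but the overall shape of the call stays the same.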

We say now because Google is planning to add more APIs in the coming months, such as high-density face contours and Smart Reply. There is no exact date for these releases yet; however, you can apply for early access to them by filling in the corresponding Google form.

It is worth mentioning that ML Kit provides both on-device and cloud APIs: the former lets you process data quickly without needing a network connection, while the latter uses the Google Cloud Platform to process data more accurately.
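
In code, the difference mostly comes down to which detector you request; the result handling is the same. A minimal sketch using text recognition, assuming the firebase-ml-vision Kotlin API (getter names varied across beta releases) and a hypothetical hasNetwork flag:

```kotlin
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage

fun recognizeTextSmartly(image: FirebaseVisionImage, hasNetwork: Boolean) {
    val vision = FirebaseVision.getInstance()
    // Same task, two backends: the on-device model is fast and works offline,
    // the cloud model is more accurate but needs connectivity.
    val recognizer = if (hasNetwork) vision.cloudTextRecognizer else vision.onDeviceTextRecognizer

    recognizer.processImage(image)
        .addOnSuccessListener { result -> println(result.text) }
        .addOnFailureListener { e -> e.printStackTrace() }
}
```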

So, what can be done with these features?

Well, there are many examples of how Machine Learning expands the possibilities of an app:

TurboTax, a tax preparation service, has a feature in its app that lets you take a picture of, for example, a Wage and Tax Statement (Form W-2 in the US); it automatically recognizes the text on it, easing the process of data entry.
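
A feature like that maps onto ML Kit’s text recognition API. Building on the recognizer choice sketched above, here is a rough Kotlin example that walks the structured result (blocks and lines) of a photographed form; the Bitmap and the idea of mapping lines to form fields are assumptions:

```kotlin
import android.graphics.Bitmap
import android.util.Log
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage

fun extractFormText(photo: Bitmap) {
    val image = FirebaseVisionImage.fromBitmap(photo)
    FirebaseVision.getInstance().onDeviceTextRecognizer
        .processImage(image)
        .addOnSuccessListener { result ->
            // Results come back structured: blocks contain lines, lines contain elements,
            // which makes it easier to map regions of the form to specific fields.
            for (block in result.textBlocks) {
                for (line in block.lines) {
                    Log.d("MLKit", "Line: ${line.text}")
                }
            }
        }
        .addOnFailureListener { e -> Log.e("MLKit", "Text recognition failed", e) }
}
```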

Barcode scanning, on the other hand, is something eBay has introduced into its app (built in-house rather than with ML Kit). The company has made it possible to add items to the store just by scanning the barcode on a product’s box: a basic description, an image, and even a starting price (predicted by analytics) are filled in on the website automatically.
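
With ML Kit, the scanning part of such a flow takes only a few lines. A sketch assuming the on-device barcode detector with default options and a captured Bitmap; the product-lookup step is hypothetical:

```kotlin
import android.graphics.Bitmap
import android.util.Log
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.barcode.FirebaseVisionBarcode
import com.google.firebase.ml.vision.common.FirebaseVisionImage

fun scanBarcodes(photo: Bitmap) {
    val image = FirebaseVisionImage.fromBitmap(photo)
    val detector = FirebaseVision.getInstance().visionBarcodeDetector

    detector.detectInImage(image)
        .addOnSuccessListener { barcodes ->
            for (barcode in barcodes) {
                // rawValue holds the encoded string, e.g. an EAN/UPC product code
                // you could use to look up a catalog entry.
                Log.d("MLKit", "Barcode value: ${barcode.rawValue}")
                if (barcode.valueType == FirebaseVisionBarcode.TYPE_PRODUCT) {
                    // Prefill the listing from your own product database (hypothetical step).
                }
            }
        }
        .addOnFailureListener { e -> Log.e("MLKit", "Barcode scanning failed", e) }
}
```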

Lose It!, a weight-loss app, has been working with image labeling. The app lets you snap a picture of your food and identify the ingredients, calories, and nutrients it contains. It now also uses text recognition and barcode scanning on food labels to upload nutritional information.
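
Image labeling follows the same call pattern as the earlier sketches; the main knob to tune is a minimum confidence so weak guesses are dropped. A sketch assuming the beta SDK’s on-device label detector and an illustrative 0.7 threshold (the options and result class names changed in later SDK versions):

```kotlin
import android.graphics.Bitmap
import android.util.Log
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage
import com.google.firebase.ml.vision.label.FirebaseVisionLabelDetectorOptions

fun labelFoodPhoto(photo: Bitmap) {
    val image = FirebaseVisionImage.fromBitmap(photo)

    // Only keep labels the model is at least 70% confident about (illustrative threshold).
    val options = FirebaseVisionLabelDetectorOptions.Builder()
        .setConfidenceThreshold(0.7f)
        .build()
    val detector = FirebaseVision.getInstance().getVisionLabelDetector(options)

    detector.detectInImage(image)
        .addOnSuccessListener { labels ->
            for (label in labels) {
                Log.d("MLKit", "${label.label} (${label.confidence})")
            }
        }
        .addOnFailureListener { e -> Log.e("MLKit", "Labeling failed", e) }
}
```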

As you can see, integrating Machine Learning into your apps can make them more engaging by offering helpful features that save your users time and give them a smarter experience.

Now, with ML Kit, you have access to Google’s technology in this area. Tell us, what features would you like to add to your apps?