Prediction API: Machine Learning from Google
Introduction
One of the exciting APIs among the 50+ APIs offered by Google is the Prediction API. It provides pattern-matching and machine learning capabilities such as recommendations and categorization. The notion is similar to the machine learning capabilities found in other solutions (e.g. in Apache Mahout): we can train the system with a set of training data, and then applications based on the Prediction API can recommend ("predict") what products a user might like, or they can categorize spam, etc. In this post we go through an example of how to categorize SMS messages - whether they are spam or valuable texts ("ham").

Using Prediction API
In order to be able to use the Prediction API, the service needs to be enabled via the Google API Console. To upload training data, the Prediction API also requires Google Cloud Storage. The dataset used in this post is from the UCI Machine Learning Repository, which has 235 datasets publicly available; this post is based on the SMS Spam Collection dataset.
To upload the training data, we first need to create a bucket in Google Cloud Storage. From the Google API Console we need to click on Google Cloud Storage and then on Google Cloud Storage Manager: this opens a web page where we can create new buckets and upload or delete files.
The UCI SMS Spam Collection file is not suitable as-is for the Prediction API; it needs to be converted into the following format (the categories - ham/spam - need to be quoted, as does the SMS text):
"ham" "go until jurong point, crazy.. available only in bugis n great world la e buffet... cine there got amore wat..."
Google Prediction API offers a handful of commands that can be invoked via a REST interface. The simplest way of testing the Prediction API is to use the Prediction API Explorer.
Once the training data is available on Google Cloud Storage, we can start training the machine learning system behind the Prediction API. To begin training our model, we need to run prediction.trainedmodels.insert. All commands require authentication, which is based on the OAuth 2.0 standard.
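Outside the Explorer, a command-line client has to perform the OAuth 2.0 handshake itself. Below is a minimal sketch using the oauth2client library of that era; client_secrets.json is the file downloaded from the API Console, and prediction.dat is an arbitrary local token cache:

import httplib2
from oauth2client.client import flow_from_clientsecrets
from oauth2client.file import Storage
from oauth2client.tools import run

# Run the OAuth 2.0 flow for the Prediction API scope and cache the tokens.
flow = flow_from_clientsecrets('client_secrets.json',
                               scope='https://www.googleapis.com/auth/prediction')
storage = Storage('prediction.dat')
credentials = storage.get()
if credentials is None or credentials.invalid:
    credentials = run(flow, storage)  # opens a browser consent page

# All subsequent API calls go through this authorized HTTP session.
http = credentials.authorize(httplib2.Http())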

In the insert menu we need to specify the fields that we want to be included in the response. In the request body we need to define an id (this will be used as a reference to the model in the commands used later on), a storageDataLocation where we uploaded the training data (the Google Cloud Storage path), and the modelType (this can be regression or classification; for spam filtering it is classification):
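The same insert can be issued from the Python client library. A minimal sketch, assuming the authorized http object from the previous snippet and a hypothetical bucket path bighadoop/sms_spam_training.txt:

from apiclient.discovery import build

# Build a client for Prediction API v1.4 over the authorized HTTP session.
service = build('prediction', 'v1.4', http=http)

# Request body: the model id, the Cloud Storage path of the training
# data, and the model type (classification for spam filtering).
training = {
    'id': 'bighadoop-00001',
    'storageDataLocation': 'bighadoop/sms_spam_training.txt',
    'modelType': 'classification'
}
service.trainedmodels().insert(body=training).execute()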

The training runs for a while; we can check the status using the prediction.trainedmodels.get command. The status field is going to be RUNNING and then will change to DONE once the training is finished.
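To poll for completion from code, something along these lines should work (a sketch; the trainingStatus field name is taken from the v1.4 API reference, so treat it as an assumption on other versions):

import time

# Poll the model until training is no longer RUNNING.
while True:
    status = service.trainedmodels().get(id='bighadoop-00001').execute()
    if status.get('trainingStatus') != 'RUNNING':
        break
    time.sleep(10)  # be gentle with the API quota
print(status)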


Now we are ready to run our test against the machine learning system, and it is going to classify whether the given text is spam or ham. The Prediction API command for this action is prediction.trainedmodels.predict. In the id field we have to refer to the id that we defined for the prediction.trainedmodels.insert command (bighadoop-00001), and we also need to specify the request body: the input will be csvInstance, and then we enter the text that we want to get categorized (e.g. "free entry").

The system then returns the category (spam) and the scores (0.822158 for spam, 0.177842 for ham):
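The equivalent call from the Python client library is short (again a sketch; outputLabel and outputMulti are the v1.4 response fields holding the winning category and the per-category scores):

# Classify a single text with the trained model.
body = {'input': {'csvInstance': ['free entry']}}
result = service.trainedmodels().predict(id='bighadoop-00001',
                                         body=body).execute()

print(result['outputLabel'])  # e.g. 'spam'
print(result['outputMulti'])  # per-label scores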

Google Prediction API Libraries
Google also offers a featured sample application that includes all the code required to run it on Google App Engine. It is called Try-Prediction, and the code is written in Python and also in Java. The application can be tested at http://try-prediction.appspot.com. For instance, if we enter a quote from Niels Bohr for the language detection model - "Prediction is very difficult, especially if it's about the future." - it will return that the text is likely to be English (54.4%).
The key part of the Python code is in predict.py:
class PredictAPI(webapp.RequestHandler):
  '''This class handles Ajax prediction requests, i.e. not user initiated
     web sessions but remote procedure calls initiated from the Javascript
     client code running the browser.
  '''
  def get(self):
    try:
      # Read server-side OAuth 2.0 credentials from datastore and
      # raise an exception if credentials not found.
      credentials = StorageByKeyName(CredentialsModel, USER_AGENT,
                                     'credentials').locked_get()
      if not credentials or credentials.invalid:
        raise Exception('missing OAuth 2.0 credentials')

      # Authorize HTTP session with server credentials and obtain
      # access to prediction API client library.
      http = credentials.authorize(httplib2.Http())
      service = build('prediction', 'v1.4', http=http)
      papi = service.trainedmodels()

      # Read and parse JSON model description data.
      models = parse_json_file(MODELS_FILE)

      # Get reference to user's selected model.
      model_name = self.request.get('model')
      model = models[model_name]

      # Build prediction data (csvInstance) dynamically based on form input.
      vals = []
      for field in model['fields']:
        label = field['label']
        val = str(self.request.get(label))
        vals.append(val)
      body = {'input': {'csvInstance': vals}}
      logging.info('model:' + model_name + ' body:' + str(body))

      # Make a prediction and return JSON results to Javascript client.
      ret = papi.predict(id=model['model_id'], body=body).execute()
      self.response.out.write(json.dumps(ret))

    except Exception, err:
      # Capture any API errors here and pass response from API back to
      # Javascript client embedded in a special error indication tag.
      err_str = str(err)
      if err_str[0:len(ERR_TAG)] != ERR_TAG:
        err_str = ERR_TAG + err_str + ERR_END
      self.response.out.write(err_str)

The Java version of the prediction web application is as follows:
public class PredictServlet extends HttpServlet {

  @Override
  protected void doGet(HttpServletRequest request,
                       HttpServletResponse response)
      throws ServletException, IOException {
    Entity credentials = null;
    try {
      // Retrieve server credentials from App Engine datastore.
      DatastoreService datastore =
          DatastoreServiceFactory.getDatastoreService();
      Key credsKey = KeyFactory.createKey("Credentials", "credentials");
      credentials = datastore.get(credsKey);
    } catch (EntityNotFoundException ex) {
      // If can't obtain credentials, send exception back to Javascript client.
      response.setContentType("text/html");
      response.getWriter().println("exception: " + ex.getMessage());
    }

    // Extract tokens from retrieved credentials.
    AccessTokenResponse tokens = new AccessTokenResponse();
    tokens.accessToken = (String) credentials.getProperty("accessToken");
    tokens.expiresIn = (Long) credentials.getProperty("expiresIn");
    tokens.refreshToken = (String) credentials.getProperty("refreshToken");
    String clientId = (String) credentials.getProperty("clientId");
    String clientSecret = (String) credentials.getProperty("clientSecret");
    tokens.scope = IndexServlet.scope;

    // Set up the HTTP transport and JSON factory.
    HttpTransport httpTransport = new NetHttpTransport();
    JsonFactory jsonFactory = new JacksonFactory();

    // Get user requested model, if specified.
    String model_name = request.getParameter("model");

    // Parse model descriptions from models.json file.
    Map models = IndexServlet.parseJsonFile(IndexServlet.modelsFile);

    // Setup reference to user specified model description.
    Map selectedModel = (Map) models.get(model_name);

    // Obtain model id (the name under which model was trained),
    // and iterate over the model fields, building a list of strings
    // to pass into the prediction request.
    String modelId = (String) selectedModel.get("model_id");
    List params = new ArrayList();
    List<Map> fields = (List<Map>) selectedModel.get("fields");
    for (Map field : fields) {
      // This loop is populating the input csv values for the prediction call.
      String label = (String) field.get("label");
      String value = request.getParameter(label);
      params.add(value);
    }

    // Set up OAuth 2.0 access of protected resources using the retrieved
    // refresh and access tokens, automatically refreshing the access token
    // whenever it expires.
    GoogleAccessProtectedResource requestInitializer =
        new GoogleAccessProtectedResource(tokens.accessToken, httpTransport,
                                          jsonFactory, clientId, clientSecret,
                                          tokens.refreshToken);

    // Now populate the prediction data, issue the API call and return the
    // JSON results to the Javascript AJAX client.
    Prediction prediction =
        new Prediction(httpTransport, requestInitializer, jsonFactory);
    Input input = new Input();
    InputInput inputInput = new InputInput();
    inputInput.setCsvInstance(params);
    input.setInput(inputInput);
    Output output =
        prediction.trainedModels().predict(modelId, input).execute();
    response.getWriter().println(output.toPrettyString());
  }
}

Besides Python and Java support, Google also offers .NET, Objective-C, Ruby, Go, JavaScript, PHP, etc. libraries for the Prediction API.