Scoring New Data with ModelOps Webservices from within RapidMiner
sgenzer
This is a KB article to help illustrate an interesting situation: you have a RapidMiner model in deployment (via ModelOps) on a remote RapidMiner Server and wish to use RapidMiner to score new data via a webservice.
Sample Deployment
I have built a decision tree model from the standard Titanic training dataset and deployed it on a remote RapidMiner Server. I am not going to explain how to do this here; please see the RapidMiner Academy, which has good explanations of how to do this. Most importantly, I have activated the "Integrations" feature in ModelOps to generate a Scoring URL:
If I test this webservice from the ModelOps screen, you can see the request in the upper right, and the response on the lower right:
Scoring One Row of Data
Now let's score a new data row using a variety of methods from within RapidMiner.
Method 1: Using a GET request with the Get Page operator (Web Mining extension)
The Web Mining extension is available for free via the RapidMiner Marketplace. It is not the most modern set of tools we have, but it will serve the purpose at hand. This is fairly straightforward. Just enter the URL exactly as you see it in the Test URL window from the ModelOps screen (see above) with the data you want. For example, we can score the following:
passenger_class: First
no_of_siblings_or_spouses_on_board: 2
sex: Female
no_of_parents_or_children_on_board: 1
passenger_fare: 123
age: 32
The URL would be:
http://partnersrv.rapidminer.com:8080/api/rest/process/titanic_deployment_score?passenger_class=First&no_of_siblings_or_spouses_on_board=2&sex=Female&no_of_parents_or_children_on_board=1&passenger_fare=123&age=32
The process looks like this:
And the response looks like this:
I personally like to use a JSON reader such as this one to look at responses. If you do, you will see a proper response from the model scoring engine:
So this passenger has a 78.8% chance of survival from the model's prediction.
If you wanted to convert this response into an ExampleSet, the easiest way to do this is via the JSON to Data operator (Text Processing extension):
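If you want to sanity-check the same call from outside RapidMiner, here is a minimal Python sketch of the equivalent GET request using the requests library. The server URL and field names are the ones from the example above; substitute your own Scoring URL (and whatever authentication your Server requires), and note that the exact structure of the returned JSON depends on your deployment.

```python
import requests

# Scoring URL generated by the ModelOps "Integrations" feature (example server from above)
SCORING_URL = "http://partnersrv.rapidminer.com:8080/api/rest/process/titanic_deployment_score"

# The passenger we want to score - the same values used in the example above
params = {
    "passenger_class": "First",
    "no_of_siblings_or_spouses_on_board": 2,
    "sex": "Female",
    "no_of_parents_or_children_on_board": 1,
    "passenger_fare": 123,
    "age": 32,
}

# GET request: the parameters are appended to the URL as a query string,
# exactly like the URL pasted into the Get Page operator
response = requests.get(SCORING_URL, params=params, timeout=30)
response.raise_for_status()

# The scoring engine answers with JSON; print it (or hand it to any JSON viewer)
print(response.json())
```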
Method 2: Using a POST request with the Get Page operator (Web Mining extension)
Using a POST request with the Get Page operator is very similar to the GET request, with one key difference: you are NOT going to submit the queries in the URL. Rather, you are going to enter the queries in the "query parameters" section instead.
Inside the query parameters menu:
You will get a similar response as above:
[Note: this is not really a POST request in the normal sense; for that we would normally send a JSON array as a data object. But ModelOps web service queries do not currently accept JSON arrays...]
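For completeness, here is a rough Python equivalent of the POST variant, again using the example Scoring URL and field names from above. It is only a sketch under the assumption that the endpoint accepts the fields as form-encoded body parameters; if your deployment does not, sending them as URL parameters as in Method 1 works just as well.

```python
import requests

SCORING_URL = "http://partnersrv.rapidminer.com:8080/api/rest/process/titanic_deployment_score"

# Same passenger as before, but now carried in the request body rather than the URL
form_data = {
    "passenger_class": "First",
    "no_of_siblings_or_spouses_on_board": 2,
    "sex": "Female",
    "no_of_parents_or_children_on_board": 1,
    "passenger_fare": 123,
    "age": 32,
}

# POST with form-encoded data (data=...), not json=..., because the endpoint
# does not currently accept a JSON array as the request body
response = requests.post(SCORING_URL, data=form_data, timeout=30)
response.raise_for_status()
print(response.json())
```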
Scoring More Than One Row of Data
As ModelOps does not currently accept the JSON arrays that would allow us to score more than one row of data in one query, we cannot send more than one row at a time via a web service call. However, this is easily accomplished with a loop operator.
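Conceptually, the loop does nothing more than fire one web service request per example and collect the answers. Before walking through the RapidMiner process, here is a rough Python sketch of that idea; the URL and field names are the ones from the example above, and the three rows are made-up stand-ins for the first rows of the Titanic Unlabeled data set.

```python
import requests

SCORING_URL = "http://partnersrv.rapidminer.com:8080/api/rest/process/titanic_deployment_score"

# Made-up stand-ins for the first three rows of the Titanic Unlabeled data set;
# replace these with your real unlabeled examples
rows = [
    {"passenger_class": "First", "sex": "Female", "age": 32,
     "no_of_siblings_or_spouses_on_board": 2,
     "no_of_parents_or_children_on_board": 1, "passenger_fare": 123},
    {"passenger_class": "Third", "sex": "Male", "age": 45,
     "no_of_siblings_or_spouses_on_board": 0,
     "no_of_parents_or_children_on_board": 0, "passenger_fare": 8},
    {"passenger_class": "Second", "sex": "Female", "age": 27,
     "no_of_siblings_or_spouses_on_board": 1,
     "no_of_parents_or_children_on_board": 0, "passenger_fare": 26},
]

# One request per row, results appended back together - the same pattern as
# Loop Examples followed by Append in the RapidMiner process below
results = []
for row in rows:
    response = requests.get(SCORING_URL, params=row, timeout=30)
    response.raise_for_status()
    results.append(response.json())

print(results)
```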
Here I take the Titanic Unlabeled data set, filter for only the first three rows (just for illustration purposes), and then loop over them using the Loop Examples operator. The iteration macro is named %{example}, which will increment row by row starting at 1. Afterwards I append all the results back together into one ExampleSet:
Inside Loop Examples, I use the Extract Macro operator to grab the parameters for the nth example we are looking at in the loop:
The additional macros look like this:
The Get Page operator is exactly the same as above, but the query parameters are tweaked slightly:
Note that the query value is now the macro value set by Extract Macro.
Here is the result:
I hope that makes sense. Good luck and get those models into deployment!!
Scott