Version: 4.5

Batch Inference

Once the best model is deployed, you'll get an API endpoint and an authorization token that let you call the model from web applications, BI tools, and other clients.

For batch inference, we'll use the previously cleaned data, so we don't need to repeat the preprocessing; the data can be passed directly to the model for inference.
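As a minimal sketch, the previously cleaned data could be reloaded like this (the file name is an assumption; use whatever you saved during the preprocessing step):

```python
# A minimal sketch, assuming the cleaned data was saved as "cleaned_data.csv"
# during preprocessing (the file name is illustrative).
import pandas as pd

cleaned_df = pd.read_csv("cleaned_data.csv")
print(cleaned_df.head())  # quick sanity check before sending it for inference
```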

You need the API endpoint along with the API token to make predictions.

Here is how you can make predictions using the API token.

[Screenshot: batch inference]
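Below is a minimal sketch of how such a request could look in Python. The endpoint URL, token placeholder, and payload shape are assumptions for illustration; use the exact values and request format shown on your deployment page.

```python
# A minimal sketch of calling the deployed model's API for batch inference.
# The endpoint URL, token, and payload format ("data" as a list of records)
# are assumptions; adjust them to match your deployment's API contract.
import pandas as pd
import requests

API_ENDPOINT = "https://<your-deployment-url>/predict"  # from the deployment page
API_TOKEN = "<your-api-token>"                           # from the deployment page

cleaned_df = pd.read_csv("cleaned_data.csv")             # previously cleaned data

headers = {
    "Authorization": f"Bearer {API_TOKEN}",
    "Content-Type": "application/json",
}
payload = {"data": cleaned_df.to_dict(orient="records")}

response = requests.post(API_ENDPOINT, json=payload, headers=headers)
response.raise_for_status()
predictions = response.json()
print(predictions)
```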

The response contains the predicted values for all the genres: the output is 1 if the record belongs to a genre and 0 otherwise.

We can convert these predictions to lists to map them back to the genre names.

[Screenshot: batch inference results]
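For example, a minimal sketch of that conversion, assuming each prediction is a row of 0/1 flags aligned with a known list of genre columns (the genre labels and sample rows below are illustrative placeholders; in practice the rows come from the API response above):

```python
# A minimal sketch of mapping 0/1 predictions back to genre names.
# The genre labels and the sample prediction rows are placeholders.
genres = ["Action", "Comedy", "Drama", "Horror", "Romance"]  # example label order

predictions = [
    [1, 0, 1, 0, 0],   # record 0 -> Action, Drama
    [0, 1, 0, 0, 1],   # record 1 -> Comedy, Romance
]

predicted_genres = [
    [genre for genre, flag in zip(genres, row) if flag == 1]
    for row in predictions
]
print(predicted_genres)  # [['Action', 'Drama'], ['Comedy', 'Romance']]
```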