User Question: Deploying Models and Predictions to an API with MindsDB

As a developer, does MindsDB have a way for me to deploy my models and predictions to an API that'll be used by applications?

MindsDB has a native library that you can install with pip3 install mindsdb. Models you train with it can be saved and moved around. If, for instance, you want a model to run natively on some sort of device or inside your application, you can use it that way. If you instead want to expose it through an API, MindsDB ships with MindsDB Server, which lets you do exactly that.
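For a rough picture of the native workflow, here is a sketch using the Predictor interface from the native library. The class name, method names, and file/column names are assumptions based on an older version of MindsDB and may differ in yours, so the import is guarded and the training lines only run when the library and data file are actually present.

```python
import os

# Sketch of the native workflow; assumes `pip3 install mindsdb` and the
# older Predictor interface -- names may differ across MindsDB versions.
try:
    from mindsdb import Predictor
    HAVE_MINDSDB = True
except ImportError:
    HAVE_MINDSDB = False  # library not installed; skip the training sketch

if HAVE_MINDSDB and os.path.exists("home_rentals.csv"):
    predictor = Predictor(name="home_rentals")      # hypothetical model name
    predictor.learn(from_data="home_rentals.csv",   # hypothetical training data
                    to_predict="rental_price")      # hypothetical target column
    result = predictor.predict(when={"number_of_rooms": 2})
```

Because the trained model is an ordinary saved artifact, this is the path to take when you want predictions generated inside the application process itself rather than over the network.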

MindsDB Server is essentially a wrapper around the MindsDB native interface. Through a RESTful API, it lets you train models, query a model that has already been trained, or upload a model that you trained on your own machine.
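As a sketch of what consuming such an API from an application could look like, the snippet below builds a prediction request using only the Python standard library. The host, port, route, and payload field names here are assumptions for illustration; check the MindsDB Server documentation for the routes and port your version actually exposes.

```python
import json
from urllib import request

# Assumption for illustration: a MindsDB Server reachable locally.
# The address, route, and payload shape may differ in your version.
HOST = "http://127.0.0.1:47334"

def build_predict_request(predictor_name, when):
    """Build a JSON POST request asking a trained predictor for a prediction."""
    url = "{}/predictors/{}/predict".format(HOST, predictor_name)
    body = json.dumps({"when": when}).encode("utf-8")
    return request.Request(url, data=body,
                           headers={"Content-Type": "application/json"})

req = build_predict_request("home_rentals", {"number_of_rooms": 2})
# With a MindsDB Server running, send it with:
# response = json.load(request.urlopen(req))
```

Any application that can issue HTTP requests can consume the model this way, which is the point of wrapping the native interface in a server.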

You can find MindsDB Server in the MindsDB open source repository. If you install our graphical user interface, Scout, and the machine where you install it is reachable by other machines, those machines can connect to its IP address on port 555. They'll then have access to the same API that the open source server exposes.

If you want to move this into a production environment, we have a set of tools that help you do it efficiently, such as authentication and, if you're working with a constantly changing data set, retraining your model at different cadences.
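One way to handle the retraining cadence yourself is a small scheduler in your application that periodically issues a retrain call to the server. As before, the host, route, and field names below are hypothetical, chosen for illustration; consult your server version's API documentation for the real ones.

```python
import json
import time
from urllib import request

HOST = "http://127.0.0.1:47334"  # assumption: local MindsDB Server

def build_retrain_request(predictor_name, data_source, target):
    """Build a request that re-creates a predictor from fresh data.

    The route and field names here are hypothetical; check your MindsDB
    Server version's API docs for the actual retraining endpoint.
    """
    url = "{}/predictors/{}".format(HOST, predictor_name)
    body = json.dumps({"data_source_name": data_source,
                       "to_predict": target}).encode("utf-8")
    return request.Request(url, data=body, method="PUT",
                           headers={"Content-Type": "application/json"})

req = build_retrain_request("home_rentals", "home_rentals.csv", "rental_price")
# A simple daily cadence (uncomment to run against a live server):
# while True:
#     request.urlopen(build_retrain_request("home_rentals",
#                                           "home_rentals.csv", "rental_price"))
#     time.sleep(24 * 60 * 60)
```

In production you would typically replace the sleep loop with a proper scheduler (cron, or your orchestrator's equivalent) so retraining survives process restarts.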

If you have a question on how to deploy MindsDB in a production environment, you can contact us for direct help. We'll be happy to help you set MindsDB up for your specific production environment. If your interest is primarily in the open source version, feel free to try the MindsDB Server (available on our GitHub page). If you have any questions about setting up this deployment, you can log a question on GitHub or email us for help.