# Model Distribution Methods

After our team trains your custom model on your app's data, we can deploy it directly to your users. A custom-trained model ensures optimal performance tailored specifically to your application and user base.

There are two distribution methods:

{% tabs %}
{% tab title="Over-the-Air (Recommended)" %}
We recommend distributing your machine learning models to your app over-the-air. This method provides instant updates, requires minimal system resources, and lets us iterate quickly on further improvements to your app's performance.

Our OTA rollouts are safe, reliable, and won't affect your app's performance. Our systems continuously monitor each rollout and the resulting conversion performance to ensure everything is working as expected.
{% endtab %}

{% tab title="Custom SDK Binary" %}
Typically, our team will let you know, based on your app's flows, whether over-the-air updates are possible. If they aren't, or if you simply prefer not to use over-the-air updates, we offer the option of embedding your custom model within a custom-built SDK binary.

To request a custom SDK binary, contact us at <support@contextsdk.com>.
{% endtab %}
{% endtabs %}

See [analytics-and-reporting](https://docs.contextsdk.com/context-decision/advanced/analytics-and-reporting "mention") to learn how to monitor model rollouts.
