## Endpoint
## Request Body
| Field | Type | Required | Description |
|---|---|---|---|
| `model` | string | Yes | Model slug (e.g., `flux-1-1-pro`) |
| `version` | string | Yes | Model version (e.g., `1.0.0`) |
| `input` | object | Yes | Input parameters (varies by model) |
| `webhook_url` | string | No | URL that receives the prediction result via webhook |
| `webhook_secret` | string | No | Secret used to sign webhook requests |
The `input` object accepts model-specific parameters. Some models also support:
| Input Field | Type | Default | Description |
|---|---|---|---|
| `enable_safety_checker` | boolean | `true` | Set to `false` to disable NSFW filtering. Only works on supported models. |
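Putting the two tables together, a request body might look like the following. The `prompt` field inside `input` and the webhook URL are illustrative assumptions; the real `input` fields come from each model's `request_schema`:

```python
import json

# Illustrative request body. "prompt" is a hypothetical model input --
# the actual input fields are defined by the model's request_schema.
payload = {
    "model": "flux-1-1-pro",            # required: model slug
    "version": "1.0.0",                 # required: model version
    "input": {
        "prompt": "a watercolor fox",   # model-specific parameter (assumed)
        "enable_safety_checker": False, # optional: disable NSFW filtering
    },
    "webhook_url": "https://example.com/hooks/prediction",  # optional
}

body = json.dumps(payload)
```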
## Code Examples
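A minimal Python sketch using only the standard library. The base URL and the `Authorization: Bearer` header format are assumptions; substitute your actual endpoint and the auth scheme your API key documentation specifies:

```python
import json
import urllib.request

API_URL = "https://api.example.com/v1/predictions"  # hypothetical endpoint

def create_prediction(api_key: str, payload: dict) -> dict:
    """POST a prediction request and return the parsed JSON response."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Bearer auth is an assumption; use your API's documented scheme.
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

On success, the returned dict carries the `status`, `message`, and `predictionID` fields described below.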
## Response
### Response Fields
| Field | Type | Description |
|---|---|---|
| `status` | string | `"success"` if the prediction was created |
| `message` | string | Human-readable status message |
| `predictionID` | string | Unique ID used to track the prediction |
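A success response can be parsed directly from the fields above; the example body here is illustrative, including the made-up prediction ID:

```python
import json

# Illustrative success body matching the response fields above.
raw = '{"status": "success", "message": "Prediction created", "predictionID": "pred_abc123"}'

resp = json.loads(raw)
if resp["status"] == "success":
    prediction_id = resp["predictionID"]  # note the capitalized "ID"
```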
### Error Responses
| Status | Body | Description |
|---|---|---|
| 400 | `{"error": "Invalid input for model"}` | Input doesn't match the model schema |
| 400 | `{"error": "model field is required"}` | Missing required field |
| 401 | `{"error": "Invalid or missing API key"}` | Authentication failure |
| 500 | `{"error": "Failed to create prediction"}` | Server error |
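Since every error body shares the single `error` key, client-side handling can branch on the status code. A sketch, with the exception mapping chosen for illustration:

```python
import json

def raise_for_prediction_error(status_code: int, body: str) -> None:
    """Translate the documented error responses into Python exceptions."""
    if status_code == 200:
        return
    message = json.loads(body).get("error", "Unknown error")
    if status_code == 400:
        raise ValueError(message)        # bad input or missing required field
    if status_code == 401:
        raise PermissionError(message)   # invalid or missing API key
    raise RuntimeError(message)          # 500 and anything unexpected
```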
## What Happens Next
After creating a prediction, the model processes your input asynchronously. There are two ways to get your results:

- **Poll**: Use Get Prediction to check the status.
- **Webhook**: Provide a `webhook_url` and results are delivered automatically when they're ready.
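The polling option can be sketched as a loop around Get Prediction. Here `fetch` stands in for whatever call retrieves a prediction by ID, and the terminal status names are assumptions; check the Get Prediction reference for the real ones:

```python
import time

def wait_for_prediction(prediction_id, fetch, interval=2.0, timeout=120.0):
    """Poll fetch(prediction_id) until it reports a terminal status.

    `fetch` is any callable returning a dict with a "status" key.
    The terminal names ("succeeded"/"failed") are assumptions -- see
    Get Prediction for the actual status values.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = fetch(prediction_id)
        if result.get("status") in ("succeeded", "failed"):
            return result
        time.sleep(interval)
    raise TimeoutError(f"prediction {prediction_id} not done after {timeout}s")
```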
Every model has different input parameters. Use Get Model to fetch the `request_schema` for any model and see exactly what it expects.