Create a new inference job.
The job is created in 'pending' status. Add files using the upload endpoints; uploaded files are queued for processing automatically.
cURL
curl --request POST \
  --url https://api.example.com/api/v1/inference-jobs \
  --header 'Content-Type: application/json' \
  --header 'X-API-Key: <api-key>' \
  --data '{
    "model_id": "3c90c3cc-0d44-4b50-8888-8dd25736052a",
    "config": {
      "threshold": 0.5,
      "merge_window_ms": 200,
      "min_duration_ms": 50
    }
  }'
{
  "id": "3c90c3cc-0d44-4b50-8888-8dd25736052a",
  "tenant_id": "3c90c3cc-0d44-4b50-8888-8dd25736052a",
  "model_id": "3c90c3cc-0d44-4b50-8888-8dd25736052a",
  "config": {},
  "status": "<string>",
  "total_files": 123,
  "processed_files": 123,
  "failed_files": 123,
  "total_detections": 123,
  "error_message": "<string>",
  "created_at": "2023-11-07T05:31:56Z",
  "started_at": "2023-11-07T05:31:56Z",
  "completed_at": "2023-11-07T05:31:56Z"
}
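The request body shown in the cURL example can be assembled programmatically before posting. A minimal Python sketch; the function name, defaults, and the 0–1 range check on threshold are illustrative assumptions, not part of the API:

```python
def build_job_request(model_id, threshold=0.5, merge_window_ms=200,
                      min_duration_ms=50):
    """Build the JSON body for POST /api/v1/inference-jobs.

    Defaults mirror the cURL example above. The 0-1 bound on
    threshold is an assumption; consult the config schema.
    """
    if not 0.0 <= threshold <= 1.0:
        raise ValueError("threshold must be between 0 and 1")
    return {
        "model_id": model_id,
        "config": {
            "threshold": threshold,
            "merge_window_ms": merge_window_ms,
            "min_duration_ms": min_duration_ms,
        },
    }
```

Serialize the returned dict with `json.dumps` and send it with the `Content-Type: application/json` and `X-API-Key` headers shown above.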
Request to create an inference job.
Configuration for the inference job.
Successful Response
Inference job response.
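Client code often wraps the response fields listed above in a small typed object. A hedged sketch; the class name, the subset of fields tracked, and the terminal-status check are illustrative assumptions (verify the actual status enum against the API):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class InferenceJob:
    """Subset of the inference-job response relevant to polling."""
    id: str
    status: str
    total_files: int
    processed_files: int
    failed_files: int
    total_detections: int
    error_message: Optional[str] = None

    @classmethod
    def from_response(cls, data: dict) -> "InferenceJob":
        # Keep only the fields this wrapper tracks; timestamps and
        # tenant/model ids from the response are ignored here.
        keys = ("id", "status", "total_files", "processed_files",
                "failed_files", "total_detections", "error_message")
        return cls(**{k: data.get(k) for k in keys})

    def is_done(self) -> bool:
        # Assumed terminal statuses -- confirm against the API docs.
        return self.status in ("completed", "failed")
```

A poller would call the job-detail endpoint, pass the decoded JSON to `from_response`, and stop once `is_done()` returns true.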