Overview of the Prompts API endpoints
The Prompts API lets you run the AI prompts defined in your project. You can trigger a prompt with different streaming options depending on your application's needs.
Run a specific prompt and get the complete response in a single API call.
Recommended for classifiers and other decision-making flows, since the entire response is available before a decision is made.
Returns a complete JSON response.
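A minimal sketch in TypeScript of calling this endpoint, assuming a hypothetical path of the form /v1/prompts/{promptId}/run, a JSON request body with a `variables` object, and bearer-token auth. The base URL, payload shape, and header names are placeholders, not the documented API.

```typescript
// Placeholder credentials - replace with your real API key and base URL.
const API_KEY = "YOUR_API_KEY";
const BASE_URL = "https://api.example.com";

async function runPrompt(
  promptId: string,
  variables: Record<string, string>
): Promise<unknown> {
  const response = await fetch(`${BASE_URL}/v1/prompts/${promptId}/run`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ variables }),
  });
  if (!response.ok) {
    throw new Error(`Prompt run failed with status ${response.status}`);
  }
  // The endpoint returns the full result as a single JSON document,
  // so the caller can inspect it before making any decision.
  return response.json();
}
```

Because the whole response is parsed in one step, this variant is the simplest to use when the next action in your code depends on the model's complete answer, as in a classifier.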
Run a specific prompt and stream the response over HTTP/2. This is useful for real-time applications where you want to display the response as it is being generated.
Streams chunks of data as they become available.
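A minimal sketch of consuming this chunked stream, assuming a hypothetical /v1/prompts/{promptId}/stream path that writes raw text chunks into a single response body; the URL, payload, and headers are placeholders. Protocol negotiation (HTTP/1.1 vs HTTP/2) is handled by the runtime; in browsers, fetch uses HTTP/2 automatically when the server supports it.

```typescript
// Placeholder credentials - replace with your real API key and base URL.
const API_KEY = "YOUR_API_KEY";
const BASE_URL = "https://api.example.com";

async function streamPrompt(
  promptId: string,
  variables: Record<string, string>,
  onChunk: (text: string) => void
): Promise<void> {
  const response = await fetch(`${BASE_URL}/v1/prompts/${promptId}/stream`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ variables }),
  });
  if (!response.ok || !response.body) {
    throw new Error(`Prompt stream failed with status ${response.status}`);
  }
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  // Read chunks as they arrive and hand each one to the caller for display.
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    onChunk(decoder.decode(value, { stream: true }));
  }
}
```

A typical usage is `streamPrompt(id, vars, (text) => outputElement.append(text))`, so the UI fills in progressively instead of waiting for the full response.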
Run a specific prompt and stream the response using the Server-Sent Events (SSE) protocol. This is useful for web applications that need to display the response as it's being generated.
Streams events with data as they become available.
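A minimal sketch of reading the SSE stream, assuming a hypothetical /v1/prompts/{promptId}/sse path that emits standard `data:` lines separated by blank lines; the URL, headers, and event payload format are placeholders. Manual parsing with fetch is used here because the browser's EventSource only supports GET requests.

```typescript
// Placeholder credentials - replace with your real API key and base URL.
const API_KEY = "YOUR_API_KEY";
const BASE_URL = "https://api.example.com";

async function streamPromptSSE(
  promptId: string,
  variables: Record<string, string>,
  onEvent: (data: string) => void
): Promise<void> {
  const response = await fetch(`${BASE_URL}/v1/prompts/${promptId}/sse`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${API_KEY}`,
      "Content-Type": "application/json",
      Accept: "text/event-stream",
    },
    body: JSON.stringify({ variables }),
  });
  if (!response.ok || !response.body) {
    throw new Error(`Prompt SSE stream failed with status ${response.status}`);
  }
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    // SSE events are separated by a blank line; each data line starts with "data: ".
    const events = buffer.split("\n\n");
    buffer = events.pop() ?? ""; // keep any incomplete event for the next read
    for (const event of events) {
      for (const line of event.split("\n")) {
        if (line.startsWith("data: ")) onEvent(line.slice(6));
      }
    }
  }
}
```

Each `data:` payload is passed to the callback as soon as it arrives, which is what lets a web page render the response incrementally.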