By default, the HorayAI API platform generates unstructured text as output. However, in certain applications, you may need the model to produce output in a structured format. Simply instructing the model through prompts often fails to yield properly structured output.
JSON is a standardized, lightweight data-exchange format that is easy for humans to read and write and easy for machines to parse and generate. JSON mode lets large language models (LLMs) return their responses to API requests as structured JSON output.
Most language models on the HorayAI platform now support JSON mode, which guarantees that the model outputs a valid JSON string. This lets you parse the output programmatically and rely on it following the expected structure.
For instance, you can use the HorayAI API to produce structured output in cases such as the following:
Include the following parameter in your request:
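A minimal sketch of the relevant field, assuming the platform follows the OpenAI-compatible `response_format` convention (add it alongside the usual fields of your request body):

```json
{
  "response_format": {"type": "json_object"}
}
```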
Currently, all available large language models support the above parameter.
Here is an example using the openai library:
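The sketch below assumes an OpenAI-compatible endpoint; the base URL, API key placeholder, and model name are illustrative, so substitute the values from your HorayAI account.

```python
from openai import OpenAI

# Base URL and model name are placeholders; replace them with the
# endpoint and JSON-mode-capable model you actually use.
client = OpenAI(
    api_key="YOUR_API_KEY",
    base_url="https://api.horay.ai/v1",
)

response = client.chat.completions.create(
    model="your-model-name",
    messages=[
        {"role": "system", "content": "You are a helpful assistant that replies in JSON."},
        {"role": "user", "content": "List the 2020 and 2024 Summer Olympic host cities as JSON."},
    ],
    # Enable JSON mode so the response content is a valid JSON string.
    response_format={"type": "json_object"},
)

print(response.choices[0].message.content)
```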
The model will output:
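For the sketch above, the returned content might resemble the following (illustrative only; the exact output depends on the model and prompt):

```json
{
  "2020": "Tokyo",
  "2024": "Paris"
}
```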