Configuring Custom Input & Output Parameters
Written by Edward Hu

"Custom Input Parameters" and "Custom Output Parameters" allow you to tailor the input and output of your API calls to the LLM pipeline. This customization provides greater control over the pipeline, helping you achieve the desired output through the API.

When creating a new project, you can find "Custom Input Parameters" by clicking on the "Input" action:

This will open a drawer with the interface for customizing your input parameters:

And you can find "Custom Output Parameters" by clicking on the "Output" action:

This will open a drawer with the interface for customizing your output parameters:

Custom Input Parameters

Configuration

The default input parameter will always be payload. It cannot be changed and is always required when calling via the API.
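For reference, a minimal call that sends only the default payload parameter might look like this (a sketch using the same placeholder endpoint and API key as the full example later in this article):

curl -X POST \
  -H 'Content-Type: application/json' \
  -H 'Apikey: Api-Key <API_KEY>' \
  -d '{"payload": "Hello World!"}' \
  'https://payload.vextapp.com/hook/xxxxx/catch/$(channel_token)'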

You can add a new parameter by clicking on the "+ Add New Parameter" button, and a new row will appear right below.

Note

You can only add up to 4 custom input parameters in total (plus 1 default).

A "key" is always required, serving as the reference for this variable in your project. It is crucial for assigning values when making API calls.

In the following example, let's set up a custom input parameter with the key location:

We can leave the value empty, as we will be passing this value in via the API later.

Now, if you have an LLM action (or any other action), you will see under its input that location is available as a variable you can add to your prompt or input:

So whenever this prompt executes, the location variable will hold whatever value you passed in when calling the API.
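For illustration, a system prompt that uses this variable might read like the following (a sketch; the {{location}} placeholder stands in for however the editor inserts the variable reference when you add it):

You are a helpful assistant, assist users with their questions. The user is from {{location}}.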

API Call

Based on the example above, the API call will look like this:

curl -X POST \
  -H 'Content-Type: application/json' \
  -H 'Apikey: Api-Key <API_KEY>' \
  -d '{
    "payload": "Hello World!",
    "custom_variables": {"location": "San Francisco, CA"}
  }' \
  'https://payload.vextapp.com/hook/xxxxx/catch/$(channel_token)'

And with this call, the LLM system prompt upon execution might become something like:

You are a helpful assistant, assist users with their questions. The user is from San Francisco, CA.

User: Hello World!
You:

Fixed Input Parameter

If you need to provide a fixed value for a parameter, you can do so by entering the value directly and saving:

Now, any action that refers to the location parameter will always get the "fixed input here" value. Again, using the example above, when the LLM system prompt executes, it will look like:

You are a helpful assistant, assist users with their questions. The user is from fixed input here.

User: Hello World!
You:
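In other words, the fixed value wins over anything sent via the API (an assumption based on "always" above). As a sketch, even a call that passes a different location:

curl -X POST \
  -H 'Content-Type: application/json' \
  -H 'Apikey: Api-Key <API_KEY>' \
  -d '{
    "payload": "Hello World!",
    "custom_variables": {"location": "San Francisco, CA"}
  }' \
  'https://payload.vextapp.com/hook/xxxxx/catch/$(channel_token)'

would still render the prompt with "fixed input here" rather than "San Francisco, CA".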

Custom Output Parameters

Configuration

The default output parameter will always be text. It cannot be removed or modified, and it will always carry the output of the last action in the LLM pipeline.

For example, if a project looks like this:
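(A sketch of a simple three-action chain; your actual pipeline may differ:)

Input → Action 1 → Action 2 → Action 3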

The text output parameter will always contain the output generated by "Action 3", the last action in this pipeline, and it will look like this when you check the output:

You can add a new parameter by clicking on the "+ Add New Parameter" button, and a new row will appear right below.

Note

  • You can add up to 4 custom output parameters in total (plus 1 default).

  • Spaces and special characters are not allowed in the Key.

  • The Value cannot be empty.

A good example might look something like this:

  • test_1 and test_2 both contain the generated output from the pipeline's actions

  • test_3 has a fixed output of "false", so the returned result for test_3 will always be "false"
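Put together, that configuration might look roughly like this (a sketch; the angle-bracket values stand in for the actual output variables you would select in the drawer):

Key       Value
test_1    <Action 1 output>
test_2    <Action 2 output>
test_3    false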

Note

You can only include one variable per parameter.

API Call

When calling via API, the returned result for the above example will look like:

{
  "text": "Hello world",
  "test_1": "foo",
  "test_2": "bar",
  "test_3": "false",
  "citation": {
    ...
  },
  "request_id": "xxxxxxxx"
}
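If you are scripting against the API, you can read any of these fields directly from the response. For example, a quick sketch using jq with the same placeholder endpoint and key as above:

curl -s -X POST \
  -H 'Content-Type: application/json' \
  -H 'Apikey: Api-Key <API_KEY>' \
  -d '{"payload": "Hello World!"}' \
  'https://payload.vextapp.com/hook/xxxxx/catch/$(channel_token)' | jq -r '.test_3'

# prints "false" for this example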

This makes the output more controllable and gives you additional data and flags your app can act on.
