CREATE PROVIDER
The CREATE PROVIDER statement defines an LLM provider along with its model and parameters. Providers are reusable, and their parameters can be overridden when creating a model.
General Syntax
CREATE PROVIDER [IF NOT EXISTS] provider_name
ENGINE = ProviderEngine(
parameter1 = 'value1',
parameter2 = 'value2',
...
);
Option | Description | Possible Values | Mandatory |
---|---|---|---|
IF NOT EXISTS | If specified, the provider will only be created if it does not already exist. | N/A | Optional |
ENGINE | The LLM provider engine to be used | String (e.g., OpenAI, Bedrock, Gemini, AwsLambda) | Yes |
Provider-Specific Syntaxes
OpenAI
CREATE PROVIDER [IF NOT EXISTS] open_ai_provider
ENGINE = OpenAI(
api_key='api-key',
model_name = 'gpt-4o-mini'
);
AWS Bedrock
CREATE PROVIDER [IF NOT EXISTS] aws_provider
ENGINE = Bedrock(
access_key = 'xxxx',
access_secret = 'xxxx',
region = 'xxxx'
);
Gemini
CREATE PROVIDER [IF NOT EXISTS] gemini_provider
ENGINE = Gemini(
api_key = 'your-api-key'
);
AWS Lambda
CREATE PROVIDER [IF NOT EXISTS] my_lambda_provider
ENGINE = AwsLambda(
function_name = 'your-lambda-function-name',
region = 'us-east-1',
access_key = 'your-access-key',
access_secret = 'your-access-secret'
);
Common Parameters
While each provider may have unique parameters, some common ones include:
Parameter | Description | Applicable to | Mandatory |
---|---|---|---|
api_key | API key for accessing the provider's services | OpenAI, Anthropic, Gemini | Yes |
model_name | Name of the model to be used | OpenAI, Anthropic, Gemini | Yes |
access_key | Access key for AWS services | Bedrock, AWS Lambda | Yes |
access_secret | Secret key for AWS services | Bedrock, AWS Lambda | Yes |
region | AWS region | Bedrock, AWS Lambda | Yes |
function_name | Name of the Lambda function | AWS Lambda | Yes |
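As an illustration of how these parameters combine, a Gemini provider can also pin a specific model. This is a minimal sketch; the api_key value is a placeholder and the model identifier shown is only an assumption, so substitute a model name your account supports.
CREATE PROVIDER IF NOT EXISTS gemini_flash_provider
ENGINE = Gemini(
api_key = 'your-api-key',
model_name = 'gemini-1.5-flash'
);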
Using the Provider with the Model
Instead of defining the engine inline, we can reference a provider with the USING keyword.
CREATE PROMPT story_prompt (
system "You are a helpful assistant. Write a short story about {{topic}}"
);
CREATE PROVIDER open_ai_provider
ENGINE = OpenAI(
api_key = 'sk-proj-xxxxxxxxxxxxxxxxxxx',
model_name = 'gpt-3.5-turbo',
temperature = 0.3
);
CREATE MODEL IF NOT EXISTS story_writer (
topic COMMENT 'This is the first argument'
) USING open_ai_provider()
PROMPT story_prompt
SETTINGS retries = 2;
To change the model parameters, such as raising temperature to 0.7 and switching the model to gpt-4-turbo, we can override the provider's parameters while creating the model.
CREATE MODEL upgraded_story_writer (
topic COMMENT 'This is the first argument'
) USING open_ai_provider(model_name = 'gpt-4-turbo', temperature = 0.7)
PROMPT story_prompt
SETTINGS retries = 2;
This allows the same provider to be reused with different configurations across multiple models.
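For example, the same open_ai_provider could back a second model with its own overrides. The prompt, model name, and parameter values below are purely illustrative.
CREATE PROMPT summary_prompt (
system "You are a concise assistant. Summarize the following text: {{text}}"
);
CREATE MODEL IF NOT EXISTS text_summarizer (
text COMMENT 'Text to summarize'
) USING open_ai_provider(temperature = 0.0)
PROMPT summary_prompt
SETTINGS retries = 1;
Here story_writer keeps the provider's defaults while text_summarizer overrides only the temperature, with both sharing the same credentials defined once on the provider.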