CREATE PROVIDER

The CREATE PROVIDER statement defines an LLM provider along with its model and parameters. Providers are reusable, and their parameters can be overridden when creating a model if needed.

General Syntax

CREATE PROVIDER [IF NOT EXISTS] provider_name
ENGINE = ProviderEngine(
parameter1 = 'value1',
parameter2 = 'value2',
...
);
| Option | Description | Possible Values | Mandatory |
|---|---|---|---|
| IF NOT EXISTS | If specified, the provider is only created if it does not already exist. | N/A | Optional |
| ENGINE | The LLM provider engine to be used | String (e.g., OpenAI, Bedrock, Gemini, AwsLambda) | Yes |

Provider-Specific Syntaxes

OpenAI

CREATE PROVIDER [IF NOT EXISTS] open_ai_provider
ENGINE = OpenAI(
api_key='api-key',
model_name = 'gpt-4o-mini'
);

AWS Bedrock

CREATE PROVIDER [IF NOT EXISTS] aws_provider
ENGINE = Bedrock(
access_key = 'xxxx',
access_secret = 'xxxx',
region = 'xxxx'
);

Gemini

CREATE PROVIDER [IF NOT EXISTS] gemini_provider
ENGINE = Gemini(
api_key = 'your-api-key'
);
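
The common parameters table below also lists model_name as applicable to Gemini. A minimal sketch, assuming the Gemini engine accepts it in the same way as api_key (both values are placeholders):

CREATE PROVIDER [IF NOT EXISTS] gemini_provider
ENGINE = Gemini(
api_key = 'your-api-key',
model_name = 'your-model-name'
);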

AWS Lambda

CREATE PROVIDER [IF NOT EXISTS] my_lambda_provider
ENGINE = AwsLambda(
function_name = 'your-lambda-function-name',
region = 'us-east-1',
access_key = 'your-access-key',
access_secret = 'your-access-secret'
);

Common Parameters

While each provider may have unique parameters, some common ones include:

| Parameter | Description | Applicable to | Mandatory |
|---|---|---|---|
| api_key | API key for accessing the provider's services | OpenAI, Anthropic, Gemini | Yes |
| model_name | Name of the model to be used | OpenAI, Anthropic, Gemini | Yes |
| access_key | Access key for AWS services | Bedrock, AWS Lambda | Yes |
| access_secret | Secret key for AWS services | Bedrock, AWS Lambda | Yes |
| region | AWS region | Bedrock, AWS Lambda | Yes |
| function_name | Name of the Lambda function | AWS Lambda | Yes |
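
Anthropic appears in the table above but has no dedicated example in this section. Assuming it follows the same api_key/model_name pattern as the other key-based providers, a definition would look roughly like this (the engine name and values are placeholders, not a confirmed syntax):

CREATE PROVIDER [IF NOT EXISTS] anthropic_provider
ENGINE = Anthropic(
api_key = 'your-api-key',
model_name = 'your-model-name'
);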

Using the Provider with the Model

Instead of defining the engine inline, we can reference a provider with the USING keyword.

CREATE PROMPT story_prompt (
system "You are a helpful assistant. Write a short story about {{topic}}"
);

CREATE PROVIDER open_ai_provider
ENGINE = OpenAI(
api_key = 'sk-proj-xxxxxxxxxxxxxxxxxxx',
model_name = 'gpt-3.5-turbo',
temperature = 0.3
);

CREATE MODEL IF NOT EXISTS story_writer (
topic COMMENT 'This is the first argument'
) USING open_ai_provider()
PROMPT story_prompt
SETTINGS retries = 2;

If we want to change model parameters, for example raising temperature to 0.7 and switching the model to gpt-4-turbo, we can override the provider's parameters while creating the model.

CREATE MODEL upgraded_story_writer (
topic COMMENT 'This is the first argument'
) USING open_ai_provider(model_name = 'gpt-4-turbo', temperature = 0.7)
PROMPT story_prompt
SETTINGS retries = 2;

This allows for flexibility in using the same provider with different configurations for various models.
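
Overrides can also be partial; a sketch reusing the same open_ai_provider while changing only the temperature and keeping gpt-3.5-turbo from the provider definition (assuming single-parameter overrides are supported):

CREATE MODEL IF NOT EXISTS creative_story_writer (
topic COMMENT 'This is the first argument'
) USING open_ai_provider(temperature = 0.9)
PROMPT story_prompt
SETTINGS retries = 2;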