Providers

Fireworks AI

Learn how to connect Fireworks AI to your CodinIT projects for ultra-fast inference of open-source models.

CodinIT supports accessing high-performance AI models through the Fireworks AI platform, which provides fast inference for Llama 3.1, Mixtral, Code Llama, and other open-source models.

Fireworks AI specializes in low-latency, production-ready serving of open-source models, making it a good fit for the rapid, iterative requests a coding assistant generates.

Getting an API Key

Sign Up for Fireworks AI

Visit the Fireworks AI Platform and create an account.

Once logged in, go to the API Keys section in your account dashboard.

Create Your API Key

Click on "Create API Key" and give it a descriptive name.

Secure Your Key

Copy the generated API key immediately and store it securely. Treat it like a password: do not share it or commit it to version control.
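If you also use the key from scripts or tooling outside CodinIT, keeping it in an environment variable avoids hardcoding it in files that might be committed. A minimal sketch in Python, assuming you export the key under the illustrative name FIREWORKS_API_KEY:

```python
# Minimal sketch: read the Fireworks key from an environment variable rather
# than hardcoding it. The variable name FIREWORKS_API_KEY is illustrative.
import os

api_key = os.environ.get("FIREWORKS_API_KEY")
if not api_key:
    raise RuntimeError("FIREWORKS_API_KEY is not set; export it before running.")
```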

Supported Models

Llama 3.1

Meta's latest open-weight model family, available in multiple sizes for different speed and quality trade-offs.

Code Llama

Meta's Llama variant fine-tuned for code generation, completion, and related programming tasks.

Mixtral

Mistral AI's sparse mixture-of-experts model, offering a strong balance of quality and speed.

StarCoder

An open-source code model from the BigCode project, trained primarily on source code and suited to completion-style tasks.
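On Fireworks, models are referenced by fully qualified identifiers rather than friendly names. The sketch below maps the families above to example identifiers; the exact IDs are assumptions, so check the Fireworks model catalog for the current names.

```python
# Illustrative mapping from the model families above to Fireworks model IDs.
# These identifiers are examples only -- verify them against the Fireworks
# model catalog before use.
FIREWORKS_MODELS = {
    "llama-3.1-70b":  "accounts/fireworks/models/llama-v3p1-70b-instruct",
    "llama-3.1-8b":   "accounts/fireworks/models/llama-v3p1-8b-instruct",
    "mixtral-8x7b":   "accounts/fireworks/models/mixtral-8x7b-instruct",
    "code-llama-34b": "accounts/fireworks/models/code-llama-34b-instruct",
    "starcoder-16b":  "accounts/fireworks/models/starcoder-16b",
}
```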

Configuration in CodinIT

Open CodinIT Settings

Click the settings gear icon (⚙️) in the CodinIT panel.

Select Provider

Choose "Fireworks AI" from the "API Provider" dropdown menu.

Enter API Key

Paste your Fireworks AI API key into the "Fireworks AI API Key" field.

Select Model

Choose your desired model from the "Model" dropdown list.
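If CodinIT reports authentication or model errors after setup, you can verify the key and model combination outside CodinIT by calling Fireworks' OpenAI-compatible chat completions endpoint directly. A short sketch using Python and the requests library; the model ID is an example, so substitute the one you selected above:

```python
# Sketch: send a single test request to Fireworks' OpenAI-compatible endpoint
# to confirm the API key and model work. The model ID below is an example.
import os
import requests

resp = requests.post(
    "https://api.fireworks.ai/inference/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['FIREWORKS_API_KEY']}"},
    json={
        "model": "accounts/fireworks/models/llama-v3p1-70b-instruct",
        "messages": [{"role": "user", "content": "Reply with a one-line greeting."}],
        "max_tokens": 32,
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

A successful response with generated text confirms the key is valid; an authentication error typically points to a bad key, while a not-found error usually means the model identifier is wrong.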