Get the list of models

Introduction

To find out which models you can access through Paradigm, a dedicated endpoint lets you retrieve the list of available models.

Model visibility

Please note that the listed models are the models accessible with your API key.
If your API key has not been granted access to a model, that model will not appear in the endpoint's response.

If a specific model is missing from the response, check from the admin interface that:

  • The model exists in Paradigm
  • Your API key has access to this model

Prerequisites

In order to use the models endpoint, here are the prerequisites:

  • Having a Paradigm API key: if you do not have one, go to your Paradigm profile and generate a new API key.
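
Before running the examples below, you can quickly check that the key is actually exposed to your script. This is only a minimal sanity check; the PARADIGM_API_KEY variable name simply matches the snippets used in this article.

import os

# Fail early if the Paradigm API key is missing from the environment
if not os.getenv("PARADIGM_API_KEY"):
    raise RuntimeError("Set the PARADIGM_API_KEY environment variable first")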

Usage methods

There are several ways to call the endpoint:

  1. With the OpenAI Python client (recommended)
  2. With the Python requests package: to avoid the OpenAI layer
  3. Through a cURL request: to do a quick check or a first test

OpenAI Python client

Setup

Here is a code snippet setting up the OpenAI Python client:

from openai import OpenAI as OpenAICompatibleClient
import os

# Get API key from environment
api_key = os.getenv("PARADIGM_API_KEY")
# Our base url
base_url = "https://paradigm.lighton.ai/api/v2"

# Configure the OpenAI client
client = OpenAICompatibleClient(api_key=api_key, base_url=base_url)

Here the Paradigm API key is read from an environment variable, but you could just as well pass it through an argument-parsing method such as argparse, as sketched below.
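
As an illustration, here is a minimal sketch of that alternative; the --api-key flag name is only an example, not part of the Paradigm API.

import argparse
from openai import OpenAI as OpenAICompatibleClient

# Read the API key from the command line instead of the environment
parser = argparse.ArgumentParser()
parser.add_argument("--api-key", required=True, help="Your Paradigm API key")
args = parser.parse_args()

# Configure the OpenAI client with the parsed key
client = OpenAICompatibleClient(
    api_key=args.api_key,
    base_url="https://paradigm.lighton.ai/api/v2",
)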

Usage

To get the list of models available in Paradigm, you can use the client.models.list() method.
An example is described below:

models_response = client.models.list()

The response then follows the OpenAI format:

SyncPage[Model](
    data=[
        Model(
            id=None, 
            created=None, 
            object='model', 
            owned_by=None, 
            name='alfred-40b-1123', 
            model_type='Large Language Model', 
            deployment_type='SageMaker', 
            enabled=True, 
            technical_name='alfred-40b-1123'
        )
    ], 
    object='list'
)
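
For instance, you can iterate over the returned models to list their names. This is a minimal sketch assuming the Paradigm-specific fields shown above (such as technical_name and enabled) are exposed as attributes by the client:

# Print the technical name of every model enabled for this API key
for model in models_response.data:
    if model.enabled:
        print(model.technical_name)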

Python requests package

You can also avoid the OpenAI Python client and directly send a request to the API endpoint with the requests package.

import requests
import os

# Get API key from environment
api_key = os.getenv("PARADIGM_API_KEY")

response = requests.request(
    method="GET",
    url="https://paradigm.lighton.ai/api/v2/models",
    headers={
        'accept': "application/json",
        'Authorization': f"Bearer {api_key}"
    }
)

print(response.json())

You would then get a JSON answer as a dictionary:

{
    'object': 'list', 
    'data': [
        {
            'object': 'model', 
            'name': 'alfred-40b-1123', 
            'model_type': 'Large Language Model', 
            'deployment_type': 'SageMaker', 
            'enabled': True, 
            'technical_name': 'alfred-40b-1123'
        }
    ]
}
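
From this dictionary, you can for example check whether a given model is available for your API key. The alfred-40b-1123 name below is taken from the example response above and is only illustrative:

# Check whether a specific model is listed and enabled for this API key
models = response.json()["data"]
target = "alfred-40b-1123"
is_available = any(m["technical_name"] == target and m["enabled"] for m in models)
print(f"{target} available: {is_available}")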

cURL request

If you prefer to send a request to Paradigm with a simple cURL command, here is an example:

curl --request GET \
  --url https://paradigm.lighton.ai/api/v2/models \
  --header 'Authorization: Bearer <YOUR_API_KEY>'
