Connecting to an LLM

When integrating a Large Language Model (LLM) into your application, whether you use Streamlit or FastAPI, it's important to manage the connection credentials (such as an API key) securely and to ensure smooth communication with the model. The general steps are outlined below.

1. Set up your LLM API Key

To securely store and access your LLM API key, use the Secrets Management process.

For example:

import os

# Retrieve the API key stored via Secrets Management
LLM_API_KEY = os.environ.get("LLM_API_KEY")
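If the key is missing (for example, because the secret was never configured), failing fast with a clear message makes the misconfiguration easier to diagnose than a confusing provider error later. A minimal sketch, reusing the `LLM_API_KEY` environment variable from the example above:

```python
import os

def get_llm_api_key() -> str:
    """Read the LLM API key from the environment, failing fast if it is absent."""
    key = os.environ.get("LLM_API_KEY")
    if not key:
        raise RuntimeError(
            "LLM_API_KEY is not set. Add it via Secrets Management "
            "rather than hardcoding it in your source."
        )
    return key
```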

2. Connect to the LLM from your code:

Once you have the API key, you can establish a connection to the LLM. The steps for connecting to the model will depend on your specific LLM provider, but here's a general example of how you might set it up:

  • Install the required library: If your LLM provider requires a specific Python library, ensure it’s installed. For example: pip install some-llm-library

  • Set up the connection in your Streamlit or FastAPI app:

import some_llm_library
import os

# Set up the API key
some_llm_library.api_key = os.environ.get("LLM_API_KEY")

# Example function to call the LLM
def query_llm(prompt):
    response = some_llm_library.Completion.create(
        model="your-model-name",  # Adjust model as needed
        prompt=prompt,
        max_tokens=150
    )
    return response["choices"][0]["text"].strip()
  • Integrate with Streamlit or FastAPI:

In Streamlit, you could use the query_llm() function to process user input in real-time:

import streamlit as st

st.title("LLM Query Example")

user_input = st.text_input("Ask something to the model:")
if user_input:
    response = query_llm(user_input)
    st.write(response)

In FastAPI, you could expose an endpoint for querying the LLM:

from fastapi import FastAPI

app = FastAPI()

@app.get("/query_llm")
async def query_llm_endpoint(prompt: str):
    response = query_llm(prompt)
    return {"response": response}

Additional Notes:

  • Security Considerations: Always retrieve your API keys securely using the Secrets Management process. Never hardcode your API keys directly into your codebase.

  • Rate Limits and Costs: Be mindful of the rate limits and associated costs of using the LLM service. Ensure that your application handles errors and retries gracefully.

  • Error Handling: Implement error handling for cases like invalid API keys, rate limits being exceeded, or network issues.

try:
    response = query_llm(user_input)
except some_llm_library.LLMError as e:  # substitute your provider's exception type
    st.error(f"Error connecting to the model: {e}")
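The retry advice above can be kept independent of any particular provider. The sketch below wraps an arbitrary call in retries with exponential backoff; the exception types to retry on and the delay parameters are assumptions you should tune for your provider's rate limits:

```python
import time

def call_with_retries(fn, *args, max_attempts=3, base_delay=1.0,
                      retry_on=(Exception,), **kwargs):
    """Call fn, retrying on the given exceptions with exponential backoff.

    The delay doubles after each failed attempt (base_delay, 2x, 4x, ...).
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return fn(*args, **kwargs)
        except retry_on:
            if attempt == max_attempts:
                raise  # out of attempts: surface the error to the caller
            time.sleep(base_delay * 2 ** (attempt - 1))

# Hypothetical usage with the query_llm() helper defined earlier:
# response = call_with_retries(query_llm, "Summarize this note",
#                              retry_on=(some_llm_library.LLMError,))
```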

By following these steps, you can securely integrate an LLM into your Streamlit or FastAPI application, ensuring both security and functionality.
