Langfuse
This app can send traces of all LLM interactions to Langfuse for debugging and usage analytics. Langfuse is open source; see their GitHub repository and their Open Source statement for details.
Prerequisites
- Create a Langfuse project (self‑hosted or using their cloud offering).
- Copy the public key and secret key from the project's settings.
Configuration
Set the following environment variables for the Rails app:
LANGFUSE_PUBLIC_KEY=your_public_key
LANGFUSE_SECRET_KEY=your_secret_key
# Optional if self‑hosting or using a non‑default domain
LANGFUSE_HOST=https://your-langfuse-domain.com
In Docker setups, add the variables to compose.yml and the accompanying .env file.
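As an illustration, the relevant compose entry might look like the following (the service name `app` and the exact file layout are assumptions about your setup, not taken from this repository):

```yaml
# compose.yml (service name "app" is a placeholder)
services:
  app:
    env_file: .env        # keys live in the accompanying .env file
    environment:
      LANGFUSE_PUBLIC_KEY: ${LANGFUSE_PUBLIC_KEY}
      LANGFUSE_SECRET_KEY: ${LANGFUSE_SECRET_KEY}
      # Only needed when self-hosting or using a non-default domain
      LANGFUSE_HOST: ${LANGFUSE_HOST:-}
```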
The initializer reads these values on boot and automatically enables tracing. If the keys are absent, the app runs normally without Langfuse.
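A minimal sketch of that guard clause, assuming a helper like the one below (the method name and its internals are illustrative, not the app's actual initializer code):

```ruby
# Returns true only when both Langfuse keys are set and non-empty,
# mirroring the "run normally without Langfuse" fallback described above.
# (Hypothetical helper; the real initializer may differ.)
def langfuse_enabled?(env = ENV)
  %w[LANGFUSE_PUBLIC_KEY LANGFUSE_SECRET_KEY].all? do |key|
    value = env[key]
    !value.nil? && !value.empty?
  end
end
```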
What Gets Tracked
- chat_response
- auto_categorize
- auto_detect_merchants
Each call records the prompt, model, response, and token usage when available.
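Conceptually, each traced call captures a record shaped roughly like the one below (the struct and field names are illustrative only, not the app's actual schema; the model string is a placeholder, since the app does not hardcode the model in use):

```ruby
# Illustrative shape of one recorded LLM generation.
LlmTrace = Struct.new(:name, :prompt, :model, :response, :token_usage, keyword_init: true)

trace = LlmTrace.new(
  name: "chat_response",                     # one of the tracked operations
  prompt: "Categorize this transaction...",  # placeholder prompt
  model: "example-model",                    # whatever model the app is configured to use
  response: "Groceries",
  token_usage: { input: 42, output: 3 }      # recorded when the provider reports it
)
```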
Viewing Traces
After starting the app with these variables set, open your Langfuse dashboard to see traces and generations, grouped under trace names matching openai.*.