AI Gateway: LLM request logs and Langfuse integration
The AI Gateway now gives you full visibility into your LLM requests — both in the Aptible UI for day-to-day troubleshooting and via log drain integrations for long-term compliance and observability.
View request and response logs in the UI
You can now view the full contents of your LLM requests and responses directly in the Aptible UI. From the key details page, click into any request to see the complete request and response payloads. Use this to verify your gateway configuration, refine prompts, and troubleshoot issues without leaving the product.
Stream logs to Langfuse
You can now configure a Langfuse log drain from the Aptible UI to send your full LLM request and response logs to Langfuse for prompt management, observability, and analysis. LLM trace drains are configured at the environment level in the Integrations tab.
A note about Anthropic models
We now support accessing Anthropic models through Bedrock, e.g. bedrock/anthropic.claude-sonnet-4-6 instead of anthropic/claude-sonnet-4-6. In the near future, we plan to support Anthropic models exclusively through the bedrock/anthropic prefix, so please update your existing Anthropic model references accordingly.
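As a sketch of the migration, the change amounts to swapping the provider prefix on the model identifier in your request payload. The payload shape below (an OpenAI-style chat completions body) is an assumption for illustration; only the model identifier change itself comes from this announcement.

```python
# Hypothetical request payload; the "messages" structure is an assumed
# OpenAI-style shape, not documented Aptible AI Gateway behavior.
old_payload = {
    "model": "anthropic/claude-sonnet-4-6",  # direct Anthropic prefix (being phased out)
    "messages": [{"role": "user", "content": "Hello"}],
}

# Same model routed through Bedrock: only the provider prefix changes.
new_payload = {
    **old_payload,
    "model": "bedrock/anthropic.claude-sonnet-4-6",
}

print(new_payload["model"])
```

Everything else in the request stays the same; updating the model string is the only change callers need to make.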