### **Subtitle**:
An all-in-one product analytics platform with LLM observability capabilities.
### **Core Idea**:
PostHog is a comprehensive product analytics platform that helps developers track usage, implement feature flags, and monitor LLM costs and performance across applications.
### **Key Principles**:
- **All-in-One Platform**: Combines multiple product development tools (analytics, feature flags, experiments, LLM observability) in a single solution.
- **Developer-Focused**: Built for engineering teams to implement and maintain without requiring dedicated data analysts.
- **Open Architecture**: Integrates with a range of AI providers, either directly or through frameworks such as the Vercel AI SDK.
### **Why It Matters**:
- **Cost Transparency**: Prevents unexpected AI bills by tracking LLM usage costs in near real time.
- **Resource Optimization**: Helps identify inefficient AI usage patterns and opportunities for cost reduction.
- **Performance Monitoring**: Surfaces LLM response times, error rates, and other performance metrics.
### **How to Implement**:
- **Initial Setup**: Create a PostHog client and integrate it with your application.
- **API Integration**:
  - For OpenAI, use PostHog's wrapped OpenAI client, passing the PostHog client into its constructor.
  - For other LLM providers, add tracing through wrappers such as the `withTracing` function used with the Vercel AI SDK.
- **Custom Analytics**: Write SQL queries inside PostHog to analyze metrics that are not available out of the box.
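The initial setup step can be sketched as follows. This assumes the `posthog-node` and `@posthog/ai` packages; the API key and host values are placeholders to replace with your project's own.

```typescript
import { PostHog } from 'posthog-node';
import { OpenAI } from '@posthog/ai'; // PostHog's wrapper around the OpenAI client

// Placeholder key/host: substitute your project's values
const phClient = new PostHog('<ph_project_api_key>', {
  host: 'https://us.i.posthog.com',
});

// The wrapped client behaves like the regular OpenAI SDK but automatically
// sends generation events (tokens, latency, model) to PostHog.
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY!,
  posthog: phClient,
});
```

Remember to call `phClient.shutdown()` before the process exits so queued events are flushed.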
### **Example**:
- **Scenario**:
  - Tracking Gemini API costs across an application.
- **Application**:

```typescript
import { PostHog } from 'posthog-node';
import { withTracing } from '@posthog/ai';
import { google } from '@ai-sdk/google';
import { generateObject } from 'ai';

const phClient = new PostHog('<ph_project_api_key>', {
  host: 'https://us.i.posthog.com',
});

// Wrap the model (not the generate function) with PostHog tracing
const model = withTracing(google('gemini-2.0-flash'), phClient, {
  posthogDistinctId: userId,
  posthogProperties: {
    // Additional properties
  },
});

// Use the traced model with any Vercel AI SDK call
const result = await generateObject({ model, schema, prompt });
```

```sql
-- Custom query for daily cost estimates (example per-token rates)
SELECT
  date_trunc('day', timestamp) AS day,
  sum(toFloat(properties.$ai_input_tokens)) * 0.00001 AS input_cost,
  sum(toFloat(properties.$ai_output_tokens)) * 0.00004 AS output_cost
FROM events
WHERE event = '$ai_generation'
GROUP BY day
```
- **Result**:
- Accurate daily tracking of LLM usage costs, preventing surprise bills at month-end.
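The per-token arithmetic behind the SQL query above can also be sanity-checked in application code. The rates below are illustrative placeholders, not Gemini's actual pricing, and the `dailyCost` helper is a hypothetical name:

```typescript
// Estimate daily LLM spend from token counts.
// Rates mirror the SQL example and are placeholders, not real pricing.
const INPUT_RATE = 0.00001;  // $ per input token (placeholder)
const OUTPUT_RATE = 0.00004; // $ per output token (placeholder)

interface UsageDay {
  inputTokens: number;
  outputTokens: number;
}

function dailyCost({ inputTokens, outputTokens }: UsageDay): number {
  return inputTokens * INPUT_RATE + outputTokens * OUTPUT_RATE;
}

// e.g. 500k input + 100k output tokens in one day:
// 500000 * 0.00001 = $5 input, 100000 * 0.00004 = $4 output
const cost = dailyCost({ inputTokens: 500_000, outputTokens: 100_000 });
console.log(cost.toFixed(2)); // prints "9.00"
```

Running this kind of check against the dashboard numbers makes it easy to catch misconfigured rates before they distort the cost report.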
---
### **Connections**:
- **Related Concepts**:
- LLM Cost Tracking: PostHog provides specialized observability for LLM costs.
- Vercel AI SDK: Common integration method for non-OpenAI models with PostHog.
- **Broader Concepts**:
- Product Analytics: PostHog is a comprehensive platform in this space.
- Feature Flagging: Another core capability of the PostHog platform.
---
### **References**:
1. **Primary Source**:
- PostHog Documentation (posthog.com)
2. **Additional Resources**:
- "How to set up LLM analytics" tutorials on PostHog website
- Vercel AI SDK documentation for integration methods
---
### **Tags**:
#analytics #LLM #observability #costTracking #developerTools #productAnalytics #featureFlags
---
**Sources:**
- From: Your Average Tech Bro - "How I track LLM usage in my apps so I don't run out of money"