Helicone

Open-source LLM observability — proxy-based logging, caching, and rate limits

San Francisco, CA
helicone.ai
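The tagline above describes a proxy-based integration: requests to an LLM provider are routed through Helicone's gateway, which logs, caches, and rate-limits them in transit. A minimal sketch of what that looks like follows; the `oai.helicone.ai` base URL and the `Helicone-Auth` / `Helicone-Cache-Enabled` headers follow Helicone's documented OpenAI proxy pattern, but treat the exact names as assumptions here. The request is only constructed, not sent.

```python
# Sketch: building an OpenAI-style chat request routed through a
# Helicone-style logging proxy instead of api.openai.com directly.
import json

HELICONE_BASE = "https://oai.helicone.ai/v1"  # proxy in front of the provider


def build_chat_request(prompt: str, openai_key: str, helicone_key: str) -> dict:
    """Return the URL, headers, and JSON body for a proxied chat completion."""
    return {
        "url": f"{HELICONE_BASE}/chat/completions",
        "headers": {
            "Authorization": f"Bearer {openai_key}",    # forwarded to the provider
            "Helicone-Auth": f"Bearer {helicone_key}",  # identifies you to the proxy
            "Helicone-Cache-Enabled": "true",           # opt into response caching
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": "gpt-4o-mini",
            "messages": [{"role": "user", "content": prompt}],
        }),
    }


req = build_chat_request("Hello", "sk-...", "pk-...")
print(req["url"])
```

Because the proxy sits on the request path, no application code changes beyond the base URL and one extra header: observability comes from redirecting traffic, not from an SDK.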
Unclaimed — profile generated via auto_from_business
Updated Apr 21, 2026
Featured in: Vibe Coder Stack (AI Observability & Eval), AI Economy (AI Observability & Eval)
Agent Readiness
35 / 100
Crawl: 10 · Machine: 2 · API: 0 · Commerce: 3 · Content: 10 · Signals: 4
ODC Enrichment (auto-generated)
Industry: AI Observability & Eval (confidence: high)
NAICS: 541512 (Computer Systems Design Services)
SIC: 7372 (Prepackaged Software)
B2B / B2C: B2C
Email Infrastructure (via MX profiling)
85 / 100
Provider: Google Workspace (enterprise)
SPF: ~all (softfail)
DKIM: Configured (google)
DMARC: policy none
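The SPF and DMARC values above correspond to DNS TXT records shaped like the following sketch. These are illustrative records for a generic Google Workspace domain, not strings pulled from helicone.ai's actual zone:

```
; SPF: authorize Google's senders, soft-fail everything else (~all)
example.com.         IN TXT "v=spf1 include:_spf.google.com ~all"

; DMARC: monitoring only (p=none); failing mail is still delivered,
; but aggregate reports go to the rua address
_dmarc.example.com.  IN TXT "v=DMARC1; p=none; rua=mailto:dmarc@example.com"
```

A DMARC policy of `none` collects reports without enforcing anything; `quarantine` or `reject` would tighten enforcement against spoofed mail.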
Models: agent_readiness_v2, agent_role_v1, b2b_v2, description_v1, employee_v1, industry_v2, mx_profile_v2, verify_v1
Profile History
Profile created
Apr 21, 2026 — via auto_from_business
Enriched by Stu
Apr 14, 2026 — agent_readiness_v2, agent_role_v1, b2b_v2, description_v1, employee_v1, industry_v2, mx_profile_v2, verify_v1
Agent Readiness scanned
Apr 11, 2026 — 35/100
Business record created
Apr 11, 2026 — added to company database
Verification & Provenance
Recently joined

First-party context. Not scraped. Not inferred.

OnlyData Club