Portkey

  • What it is: Portkey is an enterprise AI gateway providing unified access to 1600+ LLMs, plus observability, governance, guardrails, and MCP support for production AI apps.
  • Best for: AI engineering teams, startups building LLM apps, LangChain/LlamaIndex users
  • Pricing: Starting from $0 (free forever tier)
  • Rating: 78/100 (Good)
  • Expert's conclusion: Portkey is the go-to LLMOps platform for developer teams seeking to deploy and optimize their GenAI applications in production.
Reviewed by Maxim Manylov · Web3 Engineer & Serial Founder

What Is Portkey and What Does It Do?

Portkey is an artificial intelligence (AI) infrastructure provider that offers an end-to-end production stack for builders of generative AI (GenAI): an AI gateway plus a platform for observability, guardrails, governance, and prompt management, all in one place. It gives enterprise AI teams a centralized way to provision access to models and tools while adding reliability, safety, and compliance to large-scale production AI applications. Founded in 2023, Portkey primarily serves developers and companies building production-grade AI applications.

Active
📍San Francisco, CA
📅Founded 2023
🏢Private
TARGET SEGMENTS
Enterprise AI Teams · Developers · Startups · Fortune 500

What Are Portkey's Key Business Metrics?

📊
$3M
Total Funding
📊
Seed VC
Funding Stage
📊
1600+
Models Supported
📊
March 2023
Launch Date
👥
Fortune 500s & Startups
Customers
📊
17 reviews
G2 Reviews
Rating by Platforms
4.8 / 5
G2 (17 reviews)

How Credible and Trustworthy Is Portkey?

78/100
Good

A mature AI infrastructure provider with high G2 ratings, seed funding, and Fortune 500 adoption, although Portkey is a relatively young company with limited publicly available financial information for analysis.

Product Maturity75/100
Company Stability72/100
Security & Compliance85/100
User Reviews88/100
Transparency80/100
Support Quality78/100
Used by Fortune 500 companies · Top-rated on G2 · Backed by industry experts · Strict uptime SLAs · Strategic partnerships (F5)

What is the history of Portkey and its key milestones?

2023

Company Founded

Founded by Ayush Garg and Rohit Agarwal to unify fragmented AI models into a single gateway.

2023

Product Launch

In March 2023, Portkey launched its first product, the AI Gateway, which now supports over 1,600 models and provides observability and governance capabilities.

2023

Seed VC Funding

Raised a $3 million Seed Venture Capital (VC) round to create production AI infrastructure.

2024

MCP Gateway Expansion

Expanded its platform to include the MCP Gateway, which lets users deploy AI agents, tools, and workflows directly on the platform.

2025

Strategic Partnership

Announced a strategic partnership with F5 to provide secure enterprise AI applications using Portkey.

Who Are the Key Executives Behind Portkey?

Ayush Garg, Co-founder & CEO
Leads Portkey's overall strategy to develop enterprise AI infrastructure and governance.
Rohit Agarwal, Co-founder
Leads development of the AI Gateway and the observability platform.

What Are the Key Features of Portkey?

AI Gateway
Provides a unified API for 1600+ AI models that can be used with intelligent routing, caching, and load balancing.
Observability
Provides real-time monitoring of Large Language Model (LLM) requests, anomaly detection, and a usage analytics dashboard.
Guardrails
Has built-in safety filters, content moderation, and compliance enforcement for production AI.
AI Governance
Allows for the centralization of budgets, quotas, permissions, and audit logging across multiple teams and departments.
👥
Prompt Management
Includes version control, collaboration, and optimization tools for prompt engineering.
MCP Gateway
Has a connector hub for deploying AI agents, tools, and workflows with observability and safety.
Model Catalog
Has a centralized platform for discovering and managing models with performance benchmarking.
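The gateway's value is that one request shape covers every provider. As a minimal sketch, a provider-agnostic request might be assembled as below; the endpoint URL and header names are illustrative assumptions, not Portkey's confirmed API (see docs.portkey.ai for the real interface).

```python
import json

# Assumed gateway endpoint -- illustrative, not a confirmed URL.
GATEWAY_URL = "https://api.portkey.ai/v1/chat/completions"

def build_gateway_request(model: str, prompt: str, portkey_key: str) -> dict:
    """Build one provider-agnostic request; the gateway maps `model`
    to the right upstream provider."""
    return {
        "url": GATEWAY_URL,
        "headers": {
            "x-portkey-api-key": portkey_key,  # assumed header name
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

# The same call shape works whether `model` names an OpenAI, Anthropic,
# or open-weights model -- that is the point of a unified gateway.
req = build_gateway_request("gpt-4o", "Hello", "PK_TEST_KEY")
print(req["url"])
```

Swapping providers then means changing only the `model` string, not the request code.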

What Technology Stack and Infrastructure Does Portkey Use?

Infrastructure

Distributed team with engineering in India, enterprise-grade cloud infrastructure

Technologies

LiteLLM · Python · REST APIs

Integrations

1600+ LLM providers · MCP servers/tools · Enterprise SSO · Slack/Teams

AI/ML Capabilities

LLMOps platform with routing optimization, caching, fallback strategies, and production observability for 1600+ models including agents and tool calling

Based on official website and product descriptions

What Are the Best Use Cases for Portkey?

Enterprise AI Teams
Has a centralized platform for accessing 1600+ models with governance, budgets, quotas, and compliance across departments.
GenAI Developers
A unified API, caching, routing, and observability deliver faster time to market and lower integration complexity.
AI Agent Builders
The MCP Gateway reduces tool-integration complexity and provides a server registry and monitoring for production agents.
Prompt Engineers
A version-control, collaboration, testing, and optimization platform for prompt management.
Cost-Conscious Startups
Intelligent caching, batching, and routing strategies offer significant annual cost savings.
NOT FOR: Hardware-Specific ML Workloads
Not optimal: Portkey focuses on the LLMOps gateway, with no specialized hardware acceleration or custom training.
NOT FOR: Non-AI Development Teams
Limited value: optimized for enterprise production GenAI/LLM infrastructure, not general software development.
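The caching idea behind the cost-savings use case can be sketched generically: identical (model, prompt) pairs are served from a local store instead of re-billing the provider. This is an illustration of response caching, not Portkey's implementation.

```python
import hashlib

class PromptCache:
    """Toy response cache keyed on (model, prompt)."""

    def __init__(self):
        self._store = {}
        self.hits = 0
        self.misses = 0

    def _key(self, model: str, prompt: str) -> str:
        return hashlib.sha256(f"{model}\x00{prompt}".encode()).hexdigest()

    def get_or_call(self, model, prompt, call_model):
        """Return a cached reply, or call the model once and cache the result."""
        k = self._key(model, prompt)
        if k in self._store:
            self.hits += 1
            return self._store[k]
        self.misses += 1
        reply = call_model(model, prompt)
        self._store[k] = reply
        return reply

cache = PromptCache()
fake_llm = lambda model, prompt: f"reply to {prompt!r}"  # stand-in model call
cache.get_or_call("gpt-4o", "hi", fake_llm)
cache.get_or_call("gpt-4o", "hi", fake_llm)  # second call never hits the model
print(cache.hits, cache.misses)  # 1 1
```

Every cache hit is a model invocation you did not pay for, which is where the annual savings come from.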

How Much Does Portkey Cost and What Plans Are Available?

Pricing information with service tiers, costs, and details:

  • Developer ($0 forever): 10K recorded logs/month, no overages allowed, community support, and basic features such as the universal AI gateway, automatic fallbacks, and load balancing.
  • Production ($49/month): 100K recorded logs/month, +$9 per additional 100K logs, 30-day log retention and 90-day metrics retention, prompt management (3 templates), simple caching, deterministic guardrails, and AI gateway features. Not suited to custom security or data-residency needs. (Official pricing page)
  • Enterprise (custom pricing): custom log limits, advanced features, complex compliance needs, and dedicated support. (Official pricing page)
💡 Pricing Example: a team processing 500K logs/month on the Production plan
Production Plan: $85/month
$49 base + $9 × 4 (400K overage)
Enterprise (estimated): $200+/month
Custom quote for high volume + compliance
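The overage arithmetic is easy to sanity-check in code. The figures come from the published Production-plan pricing ($49 base covering the first 100K logs, $9 per further 100K block); the function itself is only a sketch.

```python
import math

def production_monthly_cost(logs: int) -> int:
    """Monthly Production-plan cost: $49 base includes 100K logs;
    each additional 100K block (or part thereof) costs $9."""
    base, included, block, block_price = 49, 100_000, 100_000, 9
    overage_blocks = math.ceil(max(0, logs - included) / block)
    return base + overage_blocks * block_price

print(production_monthly_cost(500_000))  # 85  ($49 + 4 x $9)
print(production_monthly_cost(90_000))   # 49  (within the included 100K)
```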

How Does Portkey Compare to Competitors?

Feature | Portkey | LiteLLM | AI Gateway | ResultantAI
Core functionality (AI gateway) | Yes | Yes | Yes | Yes
Usage & audit logging | Yes (100K logs/mo) | Basic | Yes | Advanced analytics
Automatic fallbacks / load balancing | Yes | Yes | Yes | Intelligent routing
Prompt management | Yes (3 templates on Pro) | No | Yes | Yes
Free tier | 10K logs/month | Yes | $5K budget | Yes
Pricing model | Platform fee + usage | Usage-based | Budget-based | All-inclusive
Budget caps | No | No | Yes | Yes
Enterprise SSO / security | Custom | No | Enterprise | Enterprise
API integrations | LangChain/CrewAI | Comprehensive | Standard | Standard
Starting price | $49/mo | Free | $99/mo | $49 + tokens

vs LiteLLM

Portkey provides additional production-ready features, like advanced logging (100K logs/month) and prompt management, versus LiteLLM's basic proxy functionality. Portkey focuses on enterprise production; LiteLLM focuses on developer simplicity.

Portkey is best for production observability; LiteLLM is best for simple proxy needs.

vs AI Gateway/ResultantAI

AI Gateway wins on predictable budget pricing ($99 includes $5K of usage) with intelligent routing, while Portkey's $49 plus $9 per additional 100K logs can surprise teams at scale. Portkey is better suited to observability-intensive teams.

AI Gateway is best for budget control; Portkey is best for detailed logging and analytics.

vs Langfuse

Portkey provides a full AI gateway solution including observability, whereas Langfuse is tracing only. Portkey is more comprehensive but can be more expensive for logging-intensive use cases.

Portkey is best for complete gateway solutions; Langfuse is best for lightweight tracing.

vs Phoenix (Open Source)

Portkey is a hosted solution; Phoenix is self-hosted. Portkey offers managed scaling and SLAs; Phoenix is free but requires infrastructure management.

Portkey is best as a managed service; Phoenix is best for cost-conscious DevOps teams.

What are the strengths and limitations of Portkey?

Pros

  • Generous free tier — 10K logs/month is sufficient for development and testing
  • Production-ready logging — 100K logs/month base with affordable overage
  • Universal AI gateway — works with 2,000+ models across 40+ providers
  • Automatic failover and load balancing — built-in reliability features
  • Prompt management included — templates, versioning, and a playground in the Pro plan
  • Native LangChain/CrewAI support — easy integration with popular frameworks
  • Rapid setup — only two lines of code needed to integrate into existing applications
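The automatic-fallback behavior listed above can be sketched as follows. With Portkey the provider ordering lives in gateway configuration rather than application code; the provider names here are illustrative.

```python
def call_with_fallback(prompt, providers):
    """providers: list of (name, callable) tried in priority order.
    Returns (provider_name, reply) from the first provider that succeeds."""
    failures = []
    for name, call in providers:
        try:
            return name, call(prompt)
        except Exception as exc:
            failures.append((name, repr(exc)))
    raise RuntimeError(f"all providers failed: {failures}")

def flaky_primary(prompt):
    raise TimeoutError("upstream timeout")  # simulate a provider outage

def healthy_backup(prompt):
    return f"ok: {prompt}"

used, reply = call_with_fallback(
    "hi", [("primary", flaky_primary), ("backup", healthy_backup)]
)
print(used, reply)  # backup ok: hi
```

Load balancing follows the same shape, except the next provider is chosen by weight rather than by failure.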

Cons

  • No spending caps — usage-based logging can create bill shock
  • Separate model costs — you pay providers directly, plus Portkey's fees
  • Free-tier scale limits — 10K logs is restrictive for production use cases
  • No intelligent cheap-model routing — you pay full model price even for simple requests
  • Advanced analytics features — likely available only in Enterprise plans
  • Multiple cost variables — total spend depends on both logs and model usage
  • Strict compliance needs — the Production plan lacks custom security controls

Who Is Portkey Best For?

Best For

  • AI engineering teams — a single point of access with production logging for 2,000+ models
  • Startups building LLM apps — a generous free tier and a $49/month Production plan that scale with growth
  • LangChain/LlamaIndex users — native integrations reduce setup time
  • Teams needing observability — detailed logging (100K+ logs/month) and metrics retention for production monitoring
  • Multi-model applications — automatic fallback and load balancing across multiple providers

Not Suitable For

  • Budget-constrained teams — no spending caps, and model costs are passed through without intelligent cheap-model routing
  • Strict compliance requirements — custom data residency and security are not available on the Production plan; the Enterprise plan is required
  • Simple proxy needs — more than you need compared with a lightweight option like LiteLLM, at higher cost for basic routing
  • High-volume, cost-sensitive workloads — model costs are passed through without routing to cheaper models

Are There Usage Limits or Geographic Restrictions for Portkey?

Free Tier Logs
10K recorded logs/month, no overages allowed
Production Logs
100K recorded logs/month +$9/100K up to 3M
Log Retention
30 days logs, 90 days metrics (all paid plans)
Prompt Templates
3 templates in Production plan
Overage Policy
$9 per 100K logs beyond base allowance
Production Restrictions
Not recommended for custom security/data residency needs
Model Costs
Paid directly to providers (OpenAI, Anthropic, etc.)

Is Portkey Secure and Compliant?

Enterprise Security Controls — custom security controls and data-residency guarantees are available in the Enterprise plan only
Audit Logging — 100K+ recorded logs/month with 30-day retention (paid plans) for compliance monitoring
Production Limitations — explicitly not recommended for organizations requiring custom security controls
AI Gateway Security — secure proxy layer for 2000+ LLM providers with automatic fallbacks
Access Control — team-based access management (specifics in Enterprise/custom plans)
Deterministic Guardrails — built-in safety features across all paid plans

What Customer Support Options Does Portkey Offer?

Channels
support@portkey.ai (business hours); 24/7 self-service for all users at docs.portkey.ai
Hours
Business hours for direct support, 24/7 documentation access
Response Time
<24 hours for paid tiers per G2 reviews
Satisfaction
Top-rated on G2 for developer support
Specialized
Dedicated support for Enterprise customers
Business Tier
Priority queues and SLAs for Pro/Enterprise

What APIs and Integrations Does Portkey Support?

API Type
REST API with unified gateway for 1,600+ LLMs
Authentication
API Key and gateway token-based
Webhooks
Supported for observability events and alerts
SDKs
Official SDKs for Python, JavaScript/Node.js
Documentation
Comprehensive docs at docs.portkey.ai with interactive examples
Sandbox
Available in staging environment with usage limits
SLA
Strict uptime SLAs with high availability guarantees
Rate Limits
Configurable per project and tier
Use Cases
LLM routing, caching, observability, and agent frameworks such as LangChain and CrewAI

What Are Common Questions About Portkey?

Portkey is a production stack for GenAI builders, providing an AI gateway, observability, guardrails, governance, and prompt management in one platform. It serves as a unified interface to 1,600+ LLMs that lets teams monitor, optimize, and scale their AI applications reliably.

Portkey offers developers a free tier, Pro plans with usage-based pricing, and custom Enterprise pricing. Advanced guardrails and dedicated support require paid tiers; contact sales for exact quotes.

Portkey is a complete LLMOps stack, providing gateway routing to over 1,600 models, intelligent caching for cost savings, and production-ready agent support, including guardrails, governance, and prompt management at enterprise scale. Unlike pure observability tools, Portkey includes these additional enterprise-scale capabilities.

Yes. Portkey meets typical enterprise security standards (SOC 2 compliance, data encryption, key management), does not store your LLM API keys, and offers detailed access controls, audit logs, and more.

Portkey integrates seamlessly with LangChain, CrewAI, AutoGen, and other agent frameworks, and works as a drop-in gateway compatible with OpenAI, Azure, and 250+ LLM providers without code changes.

Documentation and the community Slack are available to all users, including the free tier; email support during business hours is included in the Pro tier; priority SLAs and dedicated managers are included in the Enterprise tier. G2 reviews praise responsive developer support.

Yes. Portkey provides a generous free tier that lets developers test both the gateway and observability features, though it limits usage of advanced features. Pro and Enterprise tiers provide production-scale usage, caching, and advanced guardrails.

The free tier caps the number of requests and restricts advanced features; custom SLAs on the Enterprise plan require contacting sales; high-volume production may need custom rate-limit configurations.

Is Portkey Worth It?

Portkey delivers an essential full-stack LLMOps platform for production AI applications, providing gateway access to over 1,600 models, robust observability, cost optimization, and governance. Its full-stack approach and leadership position on G2 make it ideal for scaling GenAI reliably, and strong backing plus Fortune 500 adoption validate its readiness for enterprise use cases.

Recommended For

  • AI engineering teams building production GenAI applications
  • Startups and small/medium enterprises scaling LLM usage across multiple providers
  • Companies managing high-volume AI inference with cost concerns
  • Teams using agent frameworks like LangChain and CrewAI

Use With Caution

  • Organizations that require on-premises deployment options
  • Teams that need observability only
  • Budget-constrained projects not yet ready for LLMOps complexity

Not Recommended For

  • Non-technical teams without access to developer resources
  • Projects that use LLMs only lightly and do not need a gateway
  • Heavily regulated industries awaiting completion of full compliance documentation

Expert's Conclusion

Portkey is the go-to LLMOps platform for all developer teams seeking to deploy and optimize their GenAI applications in a production environment.

Best For
AI engineering teams building production GenAI applications · Startups and SMEs scaling LLM usage across multiple providers · Companies managing high-volume AI inference with cost concerns

What do expert reviews and research say about Portkey?

Key Findings

Portkey is a leading full-stack LLMOps platform: a gateway to 1,600+ models, 40+ tracked observability metrics, intelligent caching for cost optimization, and production-ready agent support. Used by Fortune 500 companies and top-rated on G2, it accelerates AI application development while providing enterprise governance. Established in 2023, it has significant investment backing and rapid market validation.

Data Quality

Good — comprehensive information from the official website and docs.portkey.ai. Details on exact pricing and SLAs are limited and require contacting sales. Customer quotes from the site and G2 confirm strong reception.

Risk Factors

  • A young company (established in 2023) in a rapidly evolving LLMOps space
  • A competitive landscape that includes established players such as LangSmith
  • Limited pricing transparency: no publicly available information on enterprise pricing
  • Continued growth depends on the broader LLM provider ecosystem
Last updated: January 2026

What Additional Information Is Available for Portkey?

G2 Leadership

Top-rated LLMOps platform on G2 among developer teams building production GenAI applications. Portkey excels in observability, gateway functionality, and developer experience, and has earned multiple "Momentum Leader" badges.

Customer Success

Trusted by Fortune 500 companies processing 30 million+ policies monthly and by startups such as QA.tech. Provides critical visibility into multi-LLM application usage, per-use-case cost tracking, and prompt management at scale.

Scale Metrics

Tracks 2,000+ models and processes billions of tokens each day for SME customers. Handles production workloads with strict SLAs and a real-time observability dashboard.

Industry Recognition

Featured in analyst reports on emerging AI infrastructure. Recognized by industry leaders for establishing LLMOps standards. Funded by prominent AI investors.

Agent Framework Support

Integrates directly with LangChain, CrewAI, and AutoGen to make AI agents production-ready. Streamlines the development, deployment, and tracking of multi-agent workflows.

What Are the Best Alternatives to Portkey?

  • LangSmith (LangChain): An evaluation and observability platform from the creators of LangChain. Stronger LCEL integration than competitors, but weak on model gateway and cost management. Suitable for teams heavily invested in LangChain.
  • Helicone: Open-source LLM observability with a strong OpenAI focus. Simpler and less expensive than Portkey, but lacks a multi-provider gateway and guardrails. A good option for small startups monitoring their OpenAI usage. (helicone.ai)
  • Phoenix (Arize): Enterprise-grade ML observability with some LLM capabilities. More comprehensive for traditional ML, but significantly more complex to implement. Better suited to data science teams than to developer-focused LLMOps. (arize.com)
  • Traceloop: Open-source observability for OpenAI-compatible APIs, with a free version and paid enterprise options. Portkey is significantly more full-stack, and Traceloop has no native 1600+ model gateway. A good option for companies using OpenAI exclusively. (traceloop.com)
  • PostHog: A product analytics platform that recently added LLM observability. Provides broad usage analytics, but LLM-specific features are lacking. A good option for companies already using PostHog for application analytics. (posthog.com)

What Audit Activity Types Does Portkey Offer?

AI Model Inference Calls

Logs every call to the LLMs with inputs, selected model, output, latency, and compute cost across 250+ models.

Prompt & Template Modifications

Logs all prompt-template changes with version history, recording who made each change and when.

API Key Management

Tracks creation of API keys along with usage/access patterns of each key across multiple models/providers.

Configuration Changes

Logs all changes to guardrails, including configuration updates, policy updates, and routing-logic updates.

Authorization & Access Changes

Audits all permissions changes, role assignment changes, and all changes to workspace level access controls.

Routing & Fallback Events

Shows the retry, fallback, and model-routing decision process, revealing which model was chosen and why.

Guardrail Hits & Policy Enforcement

Records when PII is detected/redacted and any time a guardrail violation occurs along with all context surrounding the event.

User Activity Attribution

Every user action receives full attribution, with complete tracking of user identity across all modules and workspaces.

How Does Portkey's Compliance Framework Alignment Compare?

Framework | Supported | Key Capability | Support Level
EU AI Act | Yes | user ID, timestamp, action, resource, outcome | Full: audit logs demonstrate compliance for regulated AI operations
SOC 2 | Yes | who, what, when, resource, outcome | Full: generates compliance reports and comprehensive audit trails
Internal compliance policies | Yes | user attribution, timestamps, resource, action type | Full: audit logs demonstrate compliance without stitching data from multiple systems

What Access Control (RBAC) Capabilities Does Portkey Offer?

Role-Based Access Controls (RBAC)

Audit-log access is restricted to organization owners and admins, with granular permission management; visibility into audit logs is governed by role-based access controls.

Workspace-Level Access Management

Audit logs are collected automatically across all workspaces, with an organization-wide view for admins while each workspace remains isolated from the others.

Flexible Filtering & Searchability

Fast, flexible filtering options let you view audit-log data by specific resources, action criticality, IP address, and more. Use 15+ filters to create custom views of your audit-log data.
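In spirit, multi-field filtering of audit entries looks like the sketch below; the field names (user, action, resource, ts) are illustrative assumptions, not Portkey's schema.

```python
from datetime import datetime

# Toy audit log with an assumed, illustrative schema.
audit_log = [
    {"user": "alice", "action": "prompt.update", "resource": "template-1",
     "ts": datetime(2025, 1, 2, 10, 0)},
    {"user": "bob", "action": "key.create", "resource": "api-key-9",
     "ts": datetime(2025, 1, 3, 9, 30)},
    {"user": "alice", "action": "key.create", "resource": "api-key-12",
     "ts": datetime(2025, 1, 4, 14, 5)},
]

def filter_logs(entries, **criteria):
    """Return entries whose fields match every given criterion exactly."""
    return [e for e in entries
            if all(e.get(field) == value for field, value in criteria.items())]

# Combine filters to narrow an investigation, e.g. "alice's key creations".
alice_key_events = filter_logs(audit_log, user="alice", action="key.create")
print(len(alice_key_events), alice_key_events[0]["resource"])  # 1 api-key-12
```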

Secure Visibility Controls

Access-control management ensures that only authorized users can view audit logs across the organization.

Cross-Organization Audit Trail

Every action is tracked back to the user who performed it across modules, workspaces, and organizations, with a timestamp for each action.

What Search And Analysis Capabilities Does Portkey Offer?

Advanced Search & Filtering

Users can search audit logs with a variety of filters, such as resource type, user, action type, and timestamp range, for targeted investigations of audit-log data.

End-to-End Request Reconstruction

The complete execution path ("trace") of any request can be reconstructed, including which model was used, what fallbacks occurred, why retries were initiated, and what metadata the request included.
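Reconstructing one request's path from a shared event stream can be sketched generically; the event and step names below are illustrative assumptions, not Portkey's trace format.

```python
# A shared stream of gateway events, interleaved across requests.
events = [
    {"request": "req-1", "step": "route",    "detail": "model=gpt-4o"},
    {"request": "req-2", "step": "route",    "detail": "model=gpt-4o-mini"},
    {"request": "req-1", "step": "retry",    "detail": "timeout, attempt 2"},
    {"request": "req-1", "step": "fallback", "detail": "model=claude-3"},
]

def reconstruct_trace(request_id, event_stream):
    """Collect one request's events, preserving stream (chronological) order."""
    return [f"{e['step']}: {e['detail']}"
            for e in event_stream if e["request"] == request_id]

for line in reconstruct_trace("req-1", events):
    print(line)
# route: model=gpt-4o
# retry: timeout, attempt 2
# fallback: model=claude-3
```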

Compliance Export & Reporting

Logs can be exported for audit trails, internal review, or compliance reporting, demonstrating compliance without stitching together data from multiple systems.
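A compliance export along these lines might emit JSON Lines, one audit entry per line. The schema (user, action, resource, outcome, ts) mirrors the framework table earlier on this page and is an assumption, not Portkey's actual export format.

```python
import io
import json

def export_jsonl(entries, stream):
    """Write one JSON object per line; stable key order aids diffing."""
    for entry in entries:
        stream.write(json.dumps(entry, sort_keys=True) + "\n")

entries = [
    {"user": "alice", "action": "guardrail.update", "resource": "pii-filter",
     "outcome": "success", "ts": "2025-01-02T10:00:00Z"},
    {"user": "bob", "action": "key.create", "resource": "api-key-9",
     "outcome": "success", "ts": "2025-01-03T09:30:00Z"},
]

buf = io.StringIO()  # stands in for a real file handle
export_jsonl(entries, buf)
print(buf.getvalue().count("\n"))  # 2 -- one line per entry
```

JSON Lines is a common interchange choice here because auditors can process it line by line without loading the whole export.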

Observability Dashboard & Analytics

Real-time monitoring of usage, errors, caching behavior, feedback, and metadata through a single, unified dashboard displaying over 40 data points per request.

Activity Log Tracing

A unified, chronological view of the request lifecycle, with detailed tracing of all prompt changes, guardrail updates, and configuration modifications.

Performance & Cost Insights

Logs display a per-session cost breakdown, latency analysis, and performance metrics, enabling quick identification of problems.

Multi-Provider & Integration Compatibility

Integration Type | Target System | Supported | Notes
LLM providers | 250+ LLMs | Yes | Support for GPT-4, Claude, and 248+ other models with unified audit logging across all providers
LLM providers | Anthropic Computer Use | Yes | Comprehensive audit logging with code context, generation metrics, developer attribution, and cost breakdown
Export & integration | Export logs | Yes | Export audit logs for audit trails and compliance reporting to external systems

What Is Portkey's Technical Architecture And Scalability?

Architecture & Deployment - Setup Complexity
No additional configuration required: enable once and audit logs are collected automatically across all workspaces
Architecture & Deployment - Multi-Workspace Coverage
Automatically collected audit logs across all workspaces from single admin interface
Architecture & Deployment - Organization-Wide Visibility
Single pane of glass for audit logs across entire organization with comprehensive user attribution
Data Logging & Storage - Log Retention
Indefinite log retention with searchable access
Data Logging & Storage - Captured Data Points
40+ details per request including cost, performance, accuracy, metadata, user ID, route, model selection, risk level
Data Logging & Storage - Timestamp Tracking
All logs include timestamps and complete user attribution
System Capabilities - Logging Scope
Every AI call is accounted for, including prompt modifications, API key creation, guardrail updates, and routing changes
System Capabilities - Access Control
Role-based access controls with flexible filtering and organization-wide governance

What AI-Specific Audit Capabilities Does Portkey Offer?

Model Selection & Routing Audit

Which model was selected, what fallbacks occurred, and why retries were triggered are logged for each AI call.

LLM Inference Call Logging

All LLM calls are logged with request context, response generation, latency, and compute costs across 250+ models.

Prompt Input/Output Audit

Prompt-template inputs and outputs can be included in an audit trail and used to track user behavior in the system.

Guardrail Enforcement Logging

PII detection and redaction events are logged before the model call is made; these logs record every guardrail or policy violation.

API Key & Access Control Audit

API key creation, usage patterns, and the access controls associated with each key are tracked to prevent unauthorized access to models.

Cost & Performance Metrics

A detailed breakdown of per-session costs, latency analysis, and performance metrics is logged for every AI interaction.

Request Metadata Tracking

For each AI operation, at minimum the following data is logged: user ID, route, model, risk level, and fallback-path metadata.

Configuration Change Audit

A complete audit trail covers all guardrail configuration updates, routing-logic changes, and policy modifications.

End-to-End Request Tracing

The entire execution flow of any AI request can be reconstructed, including routing decisions, which fallback was used, and the request outcome.

AI Gateway Observability

The full detail of how requests are routed through the AI Gateway is visible via detailed observability dashboard views.
