SAN FRANCISCO — Visa is consuming roughly 1.9 trillion AI tokens every month as of March 2026, doubling its February tally, as the payments giant pushes AI adoption deep into its workforce and rewards teams that use the technology to ship products faster. The figure, disclosed by Visa's president of technology Rajat Taneja in an interview with Business Insider, represents one of the largest confirmed enterprise AI usage metrics from any non-technology company.
The scale is staggering. To put 1.9 trillion tokens in context, that is roughly equivalent to processing 1.4 billion pages of text per month, or roughly 17 billion pages per year. For a payments company that processes over 200 billion transactions annually, the implication is clear: Visa is embedding AI not as a side project but as a core operational layer that touches every function from software engineering to fraud detection to product development.
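A back-of-envelope conversion shows where a figure like that comes from. The words-per-token and words-per-page values below are rough assumptions for illustration, not numbers Visa has disclosed:

```python
# Tokens → pages, back of the envelope.
# Assumptions (illustrative, not from Visa): ~0.75 words per token,
# and a dense page of ~1,000 words.
TOKENS_PER_MONTH = 1.9e12
WORDS_PER_TOKEN = 0.75
WORDS_PER_PAGE = 1_000

pages_per_month = TOKENS_PER_MONTH * WORDS_PER_TOKEN / WORDS_PER_PAGE
pages_per_year = pages_per_month * 12

print(f"{pages_per_month / 1e9:.1f}B pages/month")  # 1.4B pages/month
print(f"{pages_per_year / 1e9:.1f}B pages/year")    # 17.1B pages/year
```

With a lighter ~500-word page the monthly figure roughly doubles, which is why any pages-based comparison should be read as an order-of-magnitude gauge rather than a precise measure.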
A Company-Wide AI Push | 89% Employee Engagement
Taneja told Business Insider that the company's focus is not on raw token counts but on what he called the "volume of impact," measuring how AI usage translates into faster product cycles, fewer manual processes, and better decision-making across the organization. The numbers, however, tell a compelling story on their own.
| Visa AI Metric | Value |
|---|---|
| Monthly AI token consumption (March 2026) | 1.9 trillion |
| Month-over-month growth | 2x (doubled from February) |
| Employee AI engagement rate | 89% |
| Power users (25+ prompts/day for half the month) | 44% of workforce |
| Most popular AI tool | Anthropic Claude |
| Second most popular | OpenAI ChatGPT |
| Third most popular | Google Gemini |
| Fastest product delivery case | New API built and shipped in 6 days |
Across the company, 89 percent of employees now engage with AI tools regularly. More striking is the depth of that engagement: 44 percent of Visa's workforce is classified as "power users" who average at least 25 prompts per day for half the month. That is not casual experimentation. That is systematic integration of AI into daily workflows at a level that few enterprises outside of pure technology companies have achieved.
The heaviest usage is concentrated in software engineering, which is consistent with the broader industry pattern where code generation, code review, and debugging are the highest-value AI use cases. But Taneja emphasized that adoption is spreading rapidly to other parts of the organization, including product management, risk analysis, compliance documentation, and customer operations.
Anthropic Claude Leads | Visa's AI Tool Stack
Anthropic's Claude is currently the most popular AI tool among Visa's workforce, followed by OpenAI's ChatGPT and Google's Gemini. The preference for Claude aligns with broader enterprise trends: Anthropic's run-rate revenue has rocketed from $9 billion to $30 billion over the past several months, driven largely by enterprise API adoption from exactly this type of large-scale deployment.
Visa's selection of Claude as the preferred tool is notable because Visa operates under some of the most stringent data security and compliance requirements in any industry. The Payment Card Industry Data Security Standard (PCI DSS) imposes strict controls on how cardholder data is processed, stored, and transmitted. That Visa has cleared Claude for widespread internal use suggests that Anthropic's enterprise security and data handling have withstood scrutiny that most AI vendors cannot.
The multi-model approach, Claude plus ChatGPT plus Gemini, is also strategically significant. Rather than standardizing on a single provider, Visa is maintaining optionality across all three major model families. This reduces vendor lock-in risk, allows teams to select the best model for each specific task, and creates internal competitive pressure that keeps each provider incentivized to improve. The Blackstone-Anthropic $10 billion joint venture has accelerated this kind of enterprise multi-model deployment across the financial services sector.
Rewarding Speed, Not Just Usage | The 6-Day API
Visa has begun incentivizing effective AI use through an internal awards program that recognizes teams for measurable productivity gains. In the most striking example, a team used Anthropic's Claude Sonnet model to build and launch a new API in under six days, a process that would typically take weeks or months through traditional development cycles. The team earned an internal recognition award that came with company points redeemable for perks.
The six-day API story illustrates why the SaaS sector has been in crisis. If a payments company can build, test, and deploy an API in under a week using AI tools, the entire value chain of enterprise software development, from consulting firms to project management platforms to QA vendors, faces compression. The speed advantage is not marginal. It is categorical.
Taneja framed the incentive structure deliberately. The awards do not reward token consumption for its own sake. They reward velocity and impact: how fast did the AI-assisted team deliver compared to the baseline? How much manual work was eliminated? Did the AI usage result in a product that shipped to customers? The distinction matters. Companies that measure AI adoption by usage volume alone risk encouraging busywork. Visa is measuring by outcome.
What 1.9 Trillion Tokens Costs | The Economics of Enterprise AI
At current API pricing, 1.9 trillion tokens per month represents a substantial but manageable cost for a company of Visa's scale. Using Anthropic's published Claude Sonnet pricing as a baseline ($3 per million input tokens, $15 per million output tokens), and assuming a typical 3:1 input-to-output ratio, Visa's monthly AI spend would be in the range of $8 to $12 million. That figure is a rounding error on Visa's $35 billion annual revenue, yet the productivity gains, if the 6-day API case is representative, deliver returns that dwarf the cost.
| Cost Estimate | Detail |
|---|---|
| Monthly tokens | 1.9 trillion |
| Estimated monthly API cost (Claude Sonnet rates) | $8-12 million |
| Visa annual revenue (FY2025) | ~$35 billion |
| AI spend as % of revenue | ~0.3-0.4% (annualized) |
| Traditional enterprise software dev cost/API | $200K-500K+ over 3-6 months |
| AI-assisted API cost (6-day build) | Fraction of traditional cost |
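The cost estimate can be sanity-checked in a few lines. The 3:1 input-to-output split is the assumption stated above, and published list prices may differ from whatever rates Visa has actually negotiated:

```python
# Monthly cost check at published Claude Sonnet list rates:
# $3 per million input tokens, $15 per million output tokens.
# Assumption (from the article, not confirmed by Visa): 3:1 input:output.
TOTAL_TOKENS = 1.9e12
INPUT_SHARE = 0.75            # 3:1 ratio → 75% input, 25% output
INPUT_RATE = 3 / 1e6          # dollars per token
OUTPUT_RATE = 15 / 1e6

input_tokens = TOTAL_TOKENS * INPUT_SHARE
output_tokens = TOTAL_TOKENS * (1 - INPUT_SHARE)
monthly_cost = input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE
print(f"${monthly_cost / 1e6:.1f}M/month")  # $11.4M/month

annual_pct_of_revenue = monthly_cost * 12 / 35e9 * 100
print(f"{annual_pct_of_revenue:.2f}% of revenue")  # 0.39% of revenue
```

The point estimate of roughly $11.4 million lands inside the $8-12 million range quoted above, and the annualized spend comes out near 0.4 percent of revenue, consistent with the table.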
Enterprise AI pricing is also falling rapidly. Anthropic, OpenAI, and Google have all cut API prices by 50-80% over the past 12 months as model efficiency improves and competition intensifies. Visa's token consumption could triple again by year-end without meaningfully impacting the cost structure. The economic moat for AI-native enterprises is not the cost of the tools. It is the organizational capacity to deploy them effectively, and that is where Visa's 89% adoption rate becomes the real competitive advantage.
Visa Defines the Next Era of Commerce | AI as the Customer
In a separate investor presentation this week, Visa outlined a vision for what it calls the "next era of commerce," a world where AI agents, not humans, become the primary customers initiating transactions. The thesis: as AI agents increasingly handle purchasing decisions, travel booking, subscription management, and business procurement on behalf of human users, the payments infrastructure must evolve to authenticate and process agent-initiated transactions at machine speed.
Visa is positioning itself to be that infrastructure layer. The company's internal AI adoption is not just about workforce productivity. It is about building institutional understanding of how AI systems behave, what data they need, and how to trust them as counterparties in financial transactions. An organization where 89% of employees interact with AI daily develops an intuitive understanding of AI capabilities and limitations that cannot be acquired through strategy decks alone.
The February sell-off that hit Mastercard, Visa, and American Express was driven by fears that AI would disintermediate the payments layer entirely. Visa's response has been to become one of the largest enterprise AI consumers in the financial sector, effectively arguing that the AI revolution does not bypass payments infrastructure but instead demands more of it, processing higher transaction volumes at faster speeds with AI-native fraud detection.
What This Signals for Enterprise AI Adoption
Visa's disclosure is significant because it provides a concrete benchmark for enterprise AI adoption at scale. Most large companies report vague metrics about AI "pilots" and "initiatives." Visa is reporting specific numbers: 1.9 trillion tokens, 89% engagement, 44% power users, 25+ prompts per day, a 6-day API build. These are operational metrics, not aspirational ones.
The doubling from February to March suggests the adoption curve is still in its exponential phase. Even if growth cools well below the current monthly doubling, Visa could be consuming 5 to 8 trillion tokens per month by the end of 2026, a figure that would make it one of Anthropic's, OpenAI's, and Google's largest individual enterprise customers globally. For the model providers, landing a Visa-scale deployment is worth far more than the API revenue alone. It validates the enterprise thesis, provides real-world training signal on financial services workflows, and creates a reference customer that makes every other Fortune 500 CFO more willing to sign a similar contract.
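A quick sketch shows what that year-end range implies about the growth rate. The monthly rates below are illustrative assumptions, not Visa guidance; note that if the current 2x pace actually held for nine more months, the total would be far beyond 8 trillion:

```python
# Compound March's 1.9T tokens/month forward to December 2026
# at flat monthly growth rates (illustrative assumptions).
MARCH_TOKENS = 1.9e12
MONTHS_TO_DECEMBER = 9

def project(monthly_growth: float) -> float:
    """December volume if March's total compounds at a flat monthly rate."""
    return MARCH_TOKENS * monthly_growth ** MONTHS_TO_DECEMBER

for rate in (1.12, 1.17):
    print(f"{rate:.2f}x/month → {project(rate) / 1e12:.1f}T tokens by December")
# 1.12x/month → 5.3T tokens by December
# 1.17x/month → 7.8T tokens by December
```

In other words, the 5-8 trillion range corresponds to growth cooling to roughly 12-17 percent month over month, still a steep curve by any conventional enterprise-IT standard.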
For competing payments companies and financial institutions that have been slower to adopt AI, Visa's numbers represent a rapidly widening gap. A 6-day API build versus a 6-month one is not a marginal efficiency gain. It is a structural advantage that compounds over time. Every product Visa ships faster, every fraud pattern it detects earlier, and every compliance workflow it automates is a capability gap that non-AI-native competitors must close or concede.
Written by
Jack Brennan