OpenAI’s Endgame: The Everything App
OpenAI isn’t stopping at models. It’s building the interface of the future. And that puts Sam Altman on a direct collision course with xAI, Google, and every legacy platform still playing by the old rules.
OpenAI reported $4.3B in revenue for the first half of 2025. That puts it on pace for a full-year figure close to $13B, according to internal projections; the target implies revenue keeps accelerating through the second half, since simply doubling H1 would land closer to $9B. Over the same six months, the company burned $2.5B in cash and spent $6.7B on R&D. With $17.5B still in reserves, OpenAI is not profitable, but it’s operating at a scale few can match — and few understand.
The company’s spending points to a long-term infrastructure buildout. The $6.7B in R&D for just six months suggests OpenAI isn’t just training models — it’s building an internal stack to support commercial deployment at scale. That includes inference optimization, API delivery, and new products like Sora, which could push the company beyond enterprise and into direct consumer markets.
In Q2, Palantir passed the $1B mark in quarterly revenue for the first time — so OpenAI’s ~$2B per quarter already puts it on a much higher level, even if the comparison isn’t quite fair. It’s just to illustrate the scale. More importantly, OpenAI is still at the very beginning of its monetization journey.
Anthropic, one of its main competitors, is estimated to generate around $3B in annual revenue — about one-third of OpenAI’s current pace. But it’s clear that OpenAI will desperately need more cash going forward. Just one of its infrastructure partnerships, with Oracle, is reportedly worth $60B annually — that’s nearly 5x OpenAI’s projected yearly revenue.
What’s remarkable is how fast this company has grown. It’s only been three years since the first demo — and yet we’re already seeing OpenAI directly move markets. Yesterday, after it enabled in-chat purchases on Etsy and Shopify, both stocks jumped — +6% for SHOP and +17% for ETSY. That was the best day for Etsy in years. OpenAI now moves billions in market cap.
Personally, I don’t go to Amazon anymore without asking ChatGPT first. The difference in UX is obvious. And this leads to a broader question: is Amazon’s retail monopoly really untouchable? Sure, they have logistics. But if consumer attention shifts, so do the economics. OpenAI is not just competing with Google or Meta for ads — it’s starting to capture retail data and transactional flow.
I’m not suggesting Big Tech is dead. That’s a trap. But OpenAI is becoming an everything app — the same way Robinhood once tried to redefine personal finance. It makes sense: ChatGPT has become a personal assistant, maybe even a friend, for millions. Loneliness monetizes well. That’s a macro theme, and OpenAI is at the center of it.
Take Reddit. For some people, it’s their entire social life. But that’s a separate thread.
Today, another former pandemic darling surged — UiPath jumped +23% after announcing a major integration with OpenAI. Expect more of this.
UiPath + OpenAI: The Strategic Core
UiPath (PATH) has announced a deep integration of OpenAI models through a new ChatGPT connector. This allows OpenAI’s LLMs to be embedded directly into enterprise workflows managed by UiPath.
The goal is to accelerate time-to-value, simplify AI agent deployment, and boost ROI for enterprise clients.
What this means:
OpenAI models (GPT-4, Codex, DALL·E, etc.) are now directly available within enterprise automation systems.
This isn’t just about a chatbot — it’s a full agent-based architecture. The models can trigger external APIs, access knowledge bases, generate documents, and make decisions in real time. UiPath handles orchestration — meaning it controls the workflows, roles, and execution logic (a rough sketch of the tool-calling side follows below).
This makes it possible for large companies to use OpenAI not as a toy, but as a serious production tool — for financial reports, customer service, RPA, CRM, and more.
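To make that concrete, here is a minimal sketch of what the tool-calling side of such an agent could look like, using OpenAI’s standard Chat Completions API. It is an illustration built on assumptions: the function submit_invoice_to_erp, the ERP workflow behind it, and the approval logic are all invented, and the actual UiPath connector interface is not shown in this piece.

```python
# Minimal, hypothetical sketch of an agent requesting an enterprise action via
# tool calling. "submit_invoice_to_erp" is an invented workflow step, not a
# documented UiPath interface.
import json

from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

# Describe an enterprise action the model is allowed to request.
tools = [{
    "type": "function",
    "function": {
        "name": "submit_invoice_to_erp",  # hypothetical workflow step
        "description": "Post a validated invoice to the ERP system.",
        "parameters": {
            "type": "object",
            "properties": {
                "vendor": {"type": "string"},
                "amount": {"type": "number"},
                "currency": {"type": "string"},
            },
            "required": ["vendor", "amount", "currency"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{
        "role": "user",
        "content": "Invoice received from Acme GmbH for 12,400 EUR. Process it.",
    }],
    tools=tools,
)

# In the setup described above, the orchestration layer (UiPath, in the
# article's scenario) would inspect the requested call, apply roles and
# approval logic, and only then execute the real action against the ERP.
for call in response.choices[0].message.tool_calls or []:
    args = json.loads(call.function.arguments)
    print(f"Model requested: {call.function.name}({args})")
```

The point is the division of labor: the model decides what should happen, and the orchestration layer decides whether and how it actually happens.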
Why this matters:
The shift from isolated LLM usage to orchestrated AI agents in real enterprise environments is a major inflection point. This is the “second wave” of generative AI — moving from PowerPoint demos to real day-to-day business applications.
Unlike Microsoft or Salesforce, UiPath didn’t just “add AI” — it made it part of the core enterprise logic.
UiPath is a company that helps other companies automate routine tasks. In the past, that meant rule-based bots: "If an invoice arrives — open Excel, calculate, send it." Robots followed scripts. Now, with OpenAI plugged in, those bots can actually think: write emails, talk to customers, analyze documents, search for information, handle back-and-forth chats — all embedded directly in enterprise workflows.
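Here is a toy contrast of the two eras, with invented names and data. Nothing in it is UiPath code; it is only a sketch of the gap between a scripted rule and a model that reasons over free-form text.

```python
# Old-style RPA rule vs. an LLM-backed step. All names and inputs are made up.
from openai import OpenAI


def rule_based_bot(invoice: dict) -> str:
    # Classic scripted automation: rigid, and it breaks as soon as the input
    # stops matching the expected shape.
    if invoice.get("type") == "invoice" and "amount" in invoice:
        return f"Pay {invoice['amount']} EUR to {invoice['vendor']}"
    return "Escalate to a human"


def llm_bot(raw_email: str) -> str:
    # LLM-backed step: takes messy, unstructured input and drafts the response.
    client = OpenAI()
    reply = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Extract the vendor and amount from the email and "
                        "draft a one-sentence payment confirmation."},
            {"role": "user", "content": raw_email},
        ],
    )
    return reply.choices[0].message.content


print(rule_based_bot({"type": "invoice", "vendor": "Acme GmbH", "amount": 12400}))
print(llm_bot("hey, attaching our october invoice, 12,400 eur. thanks! - acme gmbh"))
```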
Earlier, I compared OpenAI to Palantir: both have leveraged political proximity to land massive enterprise deals. The same playbook is now being used by OpenAI and NVIDIA — with Microsoft and Oracle in the background — to turn AI into a national strategic asset.
Who wins?
NVIDIA, clearly. All of this justifies GPU hoarding. And now that leasing GPUs is part of the financial model, even more so.
Oracle, as the go-to data center operator for OpenAI, becomes a proxy for AI infrastructure spending.
CoreWeave, with new $14B deals (e.g. with Meta), is pushing hyperscale HPC forward — and dragging the bitcoin miners pivoting into HPC along with it: IREN, CIFR, HIVE.
UiPath, obviously, with deep integration of GPT agents into its enterprise automation stack.
Arista Networks, Astera Labs (ALAB), and AMD, all riding the wave of GPU, AI server, and network expansion.
The UiPath deal could have ripple effects across the enterprise software landscape — accelerating both AI adoption and partnerships with OpenAI or its rivals. Many of these names dropped today as investors began to reprice the implications.
MongoDB ($MDB) — A flexible NoSQL database used across ML infrastructures. As AI agents become more prevalent, demand for scalable, adaptive databases grows.
Elastic ($ESTC) — Real-time search and log analytics, increasingly critical for LLM integration and observability.
C3.ai ($AI) — A speculative name, but their enterprise POCs are real. Likely to benefit from renewed interest in industrial AI deployments.
Zscaler ($ZS) & Okta ($OKTA) — Cybersecurity and user identity become core infrastructure for deploying AI inside enterprises. LLMs can’t operate safely without them.
Fastly ($FSLY) — A key player in edge infrastructure and low-latency delivery. Real-time AI applications will bring it back into focus.
Snowflake ($SNOW) — Enterprise-scale data storage and analytics — especially relevant when OpenAI is accessed via Azure, as Snowflake is deeply embedded in that ecosystem.
But the real game isn’t just revenue or R&D. Sam Altman is building an Everything App — an AI-native platform that merges search, commerce, productivity, social, and personal agents into one interface. Not a chatbot, not a plugin store — a full-stack ecosystem. And that brings him head-to-head with Elon Musk’s xAI, Google’s Gemini, and anyone else still clinging to the old web.
This publication is for educational and informational purposes only and does not constitute financial, investment, or trading advice. Readers are solely responsible for their own investment decisions. The author may hold positions in the securities mentioned.

The scale is impressive, but the economics still don’t line up with the hype. $13B projected revenue sounds huge, but burning $2.5B in cash and spending $6.7B on R&D in six months shows the model is still massively capital-intensive. That’s not SaaS economics — it’s infrastructure economics.
The “everything app” vision might play out, but in the long run stock prices track cash flows. Until inference costs collapse or OpenAI proves durable margins, it’s not Amazon 2.0 — it’s a company subsidized by investors to chase dominance.