How Uber uses AI for development: inside look
👋 Hi, this is Gergely with a subscriber-only issue of the Pragmatic Engineer Newsletter. In every issue, I cover challenges at Big Tech and startups through the lens of engineering managers and senior engineers. If you’ve been forwarded this email, you can subscribe here.

How Uber built Minion, Shepherd, uReview, and other internal agentic AI tools. Also: new challenges in rolling out AI tools, such as greater platform investment and growing concern about token costs.

Before we start: all The Pragmatic Summit videos are now available to view. Paid newsletter subscribers also have access to each session’s Q&A.

I spent four years working at Uber until 2020, and experienced the company’s standout engineering culture firsthand. Uber did the speed run from small startup, through hypergrowth, to large company facing major risk during the pandemic, when the rideshare business briefly collapsed. Today, it is maturing as a publicly traded, profitable company, and employs almost 3,000 people in its tech function.

At the recent Pragmatic Summit in San Francisco, one of the most interesting behind-the-scenes sessions came from the ridesharing company’s principal engineer Ty Smith and director of engineering Anshu Chada, who pulled back the curtain on what Uber has been doing with AI tools internally. They were candid about how much work it took to build up Uber’s internal “AI stack,” why all that work was necessary, and discussed the drawbacks as well as the benefits of this rapidly spreading technology. In today’s issue, we cover:
Longtime readers might recall we’ve covered Uber’s engineering culture over time:
The bottom of this article could be cut off in some email clients. Read the full article uninterrupted, online. Let’s get into it.

AI is not new at Uber, but rolling it out companywide is. The company has long used machine learning and AI technologies in many systems, including its Marketplace platform, which is responsible for routing, matching drivers with riders, forecasting demand, and more. What is relatively new, at nearly all tech companies, is the process of integrating AI across engineering and beyond. The official strategy at the ridesharing giant is to become a “GenAI-powered” company.

I appreciate Uber sharing this approach openly, because while most companies say they want to be “AI-powered” – however cliché that claim might be – not all provide as much transparency. It’s worth engineers internalizing how leadership views AI. These folks generally see a tool that can bring efficiency everywhere. My take is that in some ways, AI is viewed much like the cloud once was: as a means to reduce costs and improve the flexibility and elasticity of hardware resources. Today, AI is seen as the way to increase efficiency and lower costs in customer support, software development, finance – or any function.

Uber is not focused on automating everything possible in engineering. Instead, it wants to:
As Anshu Chada, Engineering Director on Uber’s Dev Platform, puts it:
1. Agentic layers & systems

Uber’s “agentic system” for software engineering is actually made up of several systems. Categories of systems:
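The systems above share one underlying pattern: an agent repeatedly picks an action, calls a tool, and feeds the observation back into its reasoning. As a minimal, runnable sketch of that loop (all names here – `grep_codebase`, `TOOLS`, `run_agent` – are hypothetical illustrations, not Uber’s actual implementation):

```python
# Minimal sketch of an agentic tool-calling loop. In a real system an
# LLM chooses the tool and its arguments each step; here one step is
# hard-coded so the sketch runs standalone.

def grep_codebase(query: str) -> str:
    """Stand-in tool: pretend to search a monorepo."""
    return f"3 matches for '{query}'"

TOOLS = {"grep_codebase": grep_codebase}

def run_agent(task: str, max_steps: int = 5) -> list:
    """Drive the loop: pick an action, execute the tool, record the observation."""
    transcript = []
    for _ in range(max_steps):
        tool, args = "grep_codebase", {"query": task}  # a model would decide this
        observation = TOOLS[tool](**args)
        transcript.append(observation)
        break  # a real agent keeps looping until the model declares it is done
    return transcript

print(run_agent("feature flag cleanup"))
# → ["3 matches for 'feature flag cleanup'"]
```

The value of the layered systems described in this section comes from standardizing what goes into `TOOLS` and who is allowed to call each entry, rather than from the loop itself.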
2. Internal tooling: MCP Gateway, Uber Agent Builder, and the AIFX CLI

MCP – the Model Context Protocol – has quickly become the standard way to connect agents and data sources with one another. A frequent analogy is that MCP is like a “USB-C interface for AI agents.” We published a deep dive on the MCP protocol and covered real-world MCP server use cases.

Uber put together a “tiger team” (a temporary unit that gets things done fast) to design the MCP strategy and build the central MCP gateway, which looks like this:

This MCP gateway allows:
The MCP gateway also provides:
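To make the gateway idea concrete: MCP messages are JSON-RPC 2.0, and a central gateway’s core job is to accept a request such as `tools/call` and route it to the backend MCP server that owns the requested tool, while enforcing auth and quotas in one place. A hedged sketch follows; the backend names and tools are invented, and only the JSON-RPC/MCP method shape comes from the MCP spec – this is not Uber’s gateway code.

```python
import json

# Hypothetical MCP gateway routing sketch. Tool-to-backend mapping is
# invented for illustration.
BACKENDS = {
    "search_code": "code-search-mcp",
    "query_metrics": "metrics-mcp",
}

def handle(request_json: str) -> dict:
    """Accept a JSON-RPC 2.0 request and decide which backend MCP server owns it."""
    req = json.loads(request_json)
    assert req["jsonrpc"] == "2.0"
    if req["method"] == "tools/call":
        tool = req["params"]["name"]
        backend = BACKENDS[tool]  # central routing: one place for auth, quotas, audit
        # A real gateway would forward the request to `backend` over HTTP or stdio;
        # here we only report the routing decision.
        return {"jsonrpc": "2.0", "id": req["id"],
                "result": {"routed_to": backend}}
    return {"jsonrpc": "2.0", "id": req["id"],
            "error": {"code": -32601, "message": "method not found"}}

resp = handle(json.dumps({
    "jsonrpc": "2.0", "id": 1, "method": "tools/call",
    "params": {"name": "search_code", "arguments": {"q": "riders"}},
}))
print(resp["result"]["routed_to"])
# → code-search-mcp
```

The design choice worth noting is that clients only ever talk to the gateway, so adding, deprecating, or access-controlling an MCP server never requires touching agent code.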
Uber Agent Builder

Uber’s Agent Builder product is a no-code solution for building agents that can access Uber’s internal data sources (both MCP servers and Uber data sets) and hand off work to other agents. The platform includes a tool called Agent Studio, where multi-agent workflows can be visualized, debugged, traced, versioned, and evaluated. This is how it looks:

The agents built in Agent Builder become discoverable through a registry.

Accessing agents via the AIFX CLI

Uber’s Developer Experience platform team had a few issues with deploying AI agent tooling at scale:
The Dev Experience team built the AIFX CLI, the AI tooling command line used by all engineers there. Here’s what it looks like:

This tool supports:
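The pattern behind a tool like AIFX – one CLI entry point fronting many agents – can be sketched briefly. Everything here is hypothetical: the subcommands, agent names, and program name are invented for illustration and are not Uber’s actual interface.

```python
import argparse

# Invented stand-in agents; in practice each would call a real backend.
AGENTS = {
    "review": lambda prompt: f"[review-agent] {prompt}",
    "codegen": lambda prompt: f"[codegen-agent] {prompt}",
}

def main(argv=None) -> str:
    """Single CLI entry point that dispatches a task to a named agent."""
    parser = argparse.ArgumentParser(prog="aifx-sketch")
    sub = parser.add_subparsers(dest="command", required=True)
    run = sub.add_parser("run", help="run a task against a named agent")
    run.add_argument("agent", choices=AGENTS)
    run.add_argument("prompt")
    args = parser.parse_args(argv)
    # One entry point means auth, logging, and model selection live in a
    # single place instead of in every team's bespoke wrapper.
    return AGENTS[args.agent](args.prompt)

print(main(["run", "review", "check this diff"]))
# → [review-agent] check this diff
```

Centralizing the entry point is what lets a platform team roll out new agents, deprecate old ones, and measure usage without every engineer changing their workflow.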
3. How AI changes developer workflows

The traditional way of building software: devs spent some time planning, most of their time writing code (code authorship), and some time in code review. The first AI agent-based workflows were single-threaded: devs worked with a single agent in the command line or inside their IDE. At Uber, the latest workflows that many software engineers follow are quite different, involving parallel agents, each kicked off with its own task. As Ty explained, running multiple agents comes naturally to most devs:
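The parallel-agent workflow described above can be sketched as fanning out independent tasks and collecting results as they finish. `run_agent` below is a stand-in for a real coding agent, not any specific Uber tool:

```python
from concurrent.futures import ThreadPoolExecutor
import time

def run_agent(task: str) -> str:
    """Stand-in for a background coding agent working on one task."""
    time.sleep(0.01)  # pretend the agent is doing real work
    return f"done: {task}"

# Each agent gets its own task, mirroring the parallel workflow above.
tasks = ["fix flaky test", "write migration", "update docs"]

with ThreadPoolExecutor(max_workers=len(tasks)) as pool:
    results = list(pool.map(run_agent, tasks))  # preserves task order

for r in results:
    print(r)
```

The developer’s role shifts from writing each change serially to dispatching tasks and reviewing whichever agent finishes first, which is why review capacity, rather than authorship speed, becomes the bottleneck.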
4. Minion: running background agents at scale