
Ben Horowitz on Investing in AI: AI Bubbles, Economic Impact, and VC Acceleration - YouTube

Ben Horowitz explains why managing genius-level VCs requires focusing on conversation process over direction, why AI demand is unprecedented enough to justify "bubble" valuations, and why the application layer is more defensible than the "biggest foundation model wins" thesis suggested.

· startups business

My Notes (8)

AI as a new computing platform

"AI is a new computing platform. So, you kind of have to look at it as like how many winners were there who built applications on computers. And like that's the order of kind of the size of what this is"

"it's a very big design space like it's an enormous design space like one we've never seen that before in technology"

"one of the reasons why people are so worried about it being a bubble is the valuations have gone up so fast but like if you look at what's going on underneath in terms of the customer adoption the revenue growth rates etc. Like we've never seen demand like this, we've never seen valuations rise like this, but we've never seen demand rise like this either"

Foundational model companies vs application layer companies

"if you go back say three or four years, I think people believe that the big foundation models would be these giant brains that could do anything better than anybody. It has not played out quite like that"

"for any particular use case the long tail of not only kind of scenarios but the long tail or the fat tail I should say of human behavior ends up itself being something that you have to model and understand very very well"

"the complexity of the application itself is very high and is not subsumed in the foundation model"

On giving people a shot

"the best thing society can do for a person is give them a shot. Give them a shot at life, a chance to contribute, a chance to do something larger than themselves and make the world a better place."

"what's been good for humanity historically is when people have a chance to kind of do something larger than themselves and contribute"

"there are many systems ideas like well what if we could make utopia or everybody equal or this and that the other and you know that's kind of ended up doing the opposite. If you look at the history of communism or what have you, it's kind of everybody has an equal chance of getting no shot is much more what occurs"

"you really want to enable contribution"

"AI is such a disruptive phenomenon that every incumbent is under threat from AI in general. A lot of the way you deal with the threat is that you acquire the DNA of the future, so there's gonna be a lot of M&A"

"if you wanna change the world, you gotta believe you can change the world"

Clarity

"if you have clarity, you can move"

"a lot of what an organization needs often is clarity not correctness"

On decision making and judgment

  • decision making is a combination of intelligence and judgment
  • judgment is a combination of intelligence and knowledge
  • what do you know, and how smart are you at turning that into the correct judgment

Verticalization

  • "an investing team like shouldn't be too much bigger than a basketball team. You know, a basketball team's like five people who start uh and the reason for that is the conversation around the investments really needs to be a conversation"

Summary used for search

• Managing ultra-high-IQ talent (like Martin Casado, "best networking architect in 20 years") means helping them understand how conversation process affects investment decisions, not giving them direction
• Judge VCs on inputs at point of investment (deal quality, winning ability) not 10-year portfolio outputs—feedback loop is too long to wait
• Verticalization works because investing teams max out at "basketball team size" (~5 people) for real conversation; Dave Swensen's insight that shaped a16z's structure
• AI application layer is more complex than expected: Cursor uses 13 different models for programming because "fat tail of human behavior" isn't subsumed by foundation models
• AI isn't a bubble—valuations rising fast but so is demand/adoption at rates "we've never seen before"; customer revenue growth supports the multiples

Ben Horowitz reveals his management philosophy for a16z: when you have people like Martin Casado (the best networking software architect of the last 20 years) as investors, you're not giving direction—you're helping them understand how the process of conversation affects investment decisions. The key mistake VCs make is getting "wrapped around the axle about some weakness" instead of focusing on whether founders are "literally the best in the world at a thing." That singular excellence is what's worth backing, not being "pretty good at a lot of things."

On firm structure, Ben credits Dave Swensen's insight that investing teams shouldn't be bigger than a basketball team (~5 people) because investment decisions need to be real conversations, not presentations. As software ate the world and a16z had to scale, verticalization was the only way to maintain small team dynamics. He judges VCs on inputs—quality at point of investment, ability to win deals with exceptional founders like Mira or Ilya—not waiting 10-15 years for portfolio outcomes. The firm avoided ESG/cleantech verticals because "investing is hard enough without introducing other criteria" beyond "is this going to be a giant company."

On AI, Ben challenges the bubble narrative: valuations are rising at unprecedented rates, but so is demand and revenue growth. "We've never seen demand like this" in customer adoption rates. The market structure is evolving differently than expected—the "biggest foundation model" thesis isn't playing out. Cursor, for example, uses 13 different AI models because the "fat tail of human behavior" in programming requires specialized modeling that foundation models don't capture. They even released their own coding-specific foundation model. This application-layer complexity means more winners than the "one brain to rule them all" scenario people predicted 3-4 years ago. The design space is "enormous" and "bigger than anything I've seen in my career."