Tech CEOs Need to Leave Their ‘God Complex’ Behind, Says A16z Partner David Ulevitch

Semafor
Mar 24, 2026

Why It Matters

Ulevitch’s warning signals that AI leaders must prioritize product excellence over moral gatekeeping, shaping how the industry engages with government contracts and regulatory scrutiny.

Key Takeaways

  • Tech CEOs must avoid assuming authority over government decisions.
  • Government contracts can be lost without harming generational AI firms.
  • Anthropic chose not to work with the Pentagon, accepting the trade-offs.
  • Legal and moral liability deter CEOs from policing client usage.
  • A16z partner urges humility over “god complex” in tech leadership.

Summary

In a candid interview, a16z partner David Ulevitch warned tech CEOs against adopting a “god complex” when dealing with government customers, using the Pentagon’s recent stance on Anthropic as a case study.

Ulevitch explained that while the Pentagon may publicly criticize a company like Anthropic, it rarely forces a total market shutdown. He noted that losing a single defense contract does not threaten a generational AI firm, but refusing to work with the government can have strategic repercussions.

He said, “I don’t have a god complex,” and added, “I would not want to be the person they call to decide whether to hit the enter button.” The remarks underscore the legal and moral liability CEOs would assume if they tried to police client usage of their models.

The takeaway for leaders is to focus on building robust technology rather than positioning themselves as arbiters of policy. Humility, Ulevitch argues, preserves credibility and keeps AI firms agile amid evolving regulatory pressures.

Original Description

"What a crazy dynamic to want to be in ... But if I did have a God complex, I'd be like, 'Yes, you can just call me every time you want to use my software and ask me if it's ok to use it.'"
a16z partner David Ulevitch talks about Anthropic and the Pentagon on Compound Interest from Semafor Business.
