What’s 🔥 in Enterprise IT/VC #391
ServiceNow, Microsoft...closing sales in the enterprise - demonstrable ROI + AI Pixie dust
NOTICE: This is last week’s post - I accidentally unpublished it, so I am republishing and also updating the issue number to the correct #391 instead of #390. Thanks for bearing with me!
Greetings from London! Just wrapped a week-long trip over the pond, and there is no difference here versus the US - many a VC lamenting the massive Inception rounds for AI companies at valuations which are unsustainable. Yes, it’s a worldwide phenomenon. Now on to the news - this was a huge week for the Street to see if the AI hype train is becoming reality. Along those lines, here are some nuggets from ServiceNow CEO Bill McDermott and President and COO CJ Desai on closing enterprise sales and where we are in the AI cycle (Earnings call)
Bill McDermott -- Chairman and Chief Executive Officer
We are in a race to put AI to work for people, and that's a race ServiceNow intends to win for our customers.
As I said, process optimization is the single biggest gen AI use case in the enterprise. And any process that exists in the enterprise today will be reengineered or engineered, depending on how messy the process is, with gen AI.
So, every workflow in every enterprise will be rethought.
Demonstrable ROI with many approvals and some AI pixie dust
CJ Desai -- President and Chief Operating Officer
So, I would say, Karl, I would start first is environment, and we shared this in January, Bill, Gina, and I, it remains pretty much the same from our perspective as it -- and what we mean by that, it is not 2021, specifically. So, it still takes many approvals and all the things that we discussed from a sales perspective in trying to get business validation done or a purchase being made.
And pretty much, I would say, that's a standard across industries and geographies. We are absolutely executing well within that environment given our promise of efficiency and automation so that is absolutely resonating and combine that with our in-platform generative AI, which also resonates really well because that is an accelerant to the productivity enhancements that an organization can take.
So, whether it's Wall Street banks, whether it's a life sciences corporation, whether it's governments, that story of automation, digital, productivity, enhanced via gen AI is absolutely resonating and that is what is helping us, despite the environment continuing to be the same.
Budgets going 📈 because of AI and it’s becoming a business imperative
Bill McDermott -- Chairman and Chief Executive Officer
And I would just build on that, Karl, just for your benefit on the budgets themselves. The budgets are going up. And what I definitely see is the preference for gen AI now. I think we're ending one era in the enterprise, and we've begun another.
And we're into a new frontier now where gen AI has opened up the eyes of the customer to say, there might be a different way of doing this. And that's creating real opportunity for us. So, CJ is exactly right on the value-based economy, but also, I do see the budgets not only going up in IT but also just see gen AI becoming more of a business imperative. And if you can increase productivity, take cost out, and show that in a value case, this money will be spent, and maybe different people approving it, but the money will be spent.
So, all the data from those systems of record in terms of how we run this company, we run the whole company on ServiceNow. And now we have 20 different gen AI use cases across all the departments of the company. So, my full expectation is that someday, we could do the earnings call where we're all in this room together and we'll take you through the living, learning lab of a gen AI-run company here at ServiceNow.
Gina Mastantuono -- Chief Financial Officer
Yes. And I would just add -- I think I would just add, we are absolutely customer zero, 100% on all of our gen AI use cases. Deflection rates have doubled for both our employees and customers and they're improving every month, right? It's really early days.
So, it's learning faster and faster. Software engineers are accepting 48% of text-to-code generation.
Speed of growth for AI Products is fastest ever seen
Gina Mastantuono -- Chief Financial Officer
Yes. So, we'll definitely give you a lot more details on all things of gen AI at Investor Day, and that's in a week and a bit. So, stay tuned there. Yes, the adoption curve is stronger than we've seen in any new product category launch, but that's starting from zero, right? So, it's a small dollar at this point in time, but the speed at which it's going to grow to be a really meaningful contributor is faster than anything we've seen.
Bill McDermott -- Chairman and Chief Executive Officer
Yes. It's a really important question, Greg. I really believe the IT budgets in their own right will go up on a standard rate basis as we've seen now for many, many years. The business executives, however, are inserting their will into the generative AI revolution because the CEO is in a boardroom with her senior team sitting around a table with the board of directors, and they're like, "Hey, what are you guys doing on gen AI," and they know now that they got to go into that room with a story because this is a lot like when we had the internet, then we had the iPhone moment, everything went mobile.
Everything is going gen AI. It's just a question of how quickly you get there. So, I believe that a lot of the business operating spend will be moved to gen AI technology use cases that serve the business. And the reason I believe I'm right on that, if you look at great companies, some of them in this quarter like Microsoft and Novartis, or Hitachi Energy or Equinix or IBM, they're looking at this as, hey, what does this mean to my employees, to my customers, to my partners and they're very well aware of the fact that inflation is sticky and rates are high, and they're on their own.
GenAI winners defined by same old story - TIME to VALUE MATTERS. Speed kills
And they're looking at not only gen AI, but they're also looking at ServiceNow as a fresh new platform designed to take on some of the tougher process challenges that have slowed companies down. And as CJ said, which I think is a major point, the time to implementation on these gen AI use cases has been faster than anything I've seen, not just against Pro but against anything. They want it in now. So, there's an urgency and that urgency is coming from the C-suite, and it's a movement.
And I've never seen a desire for implementation speed like I have for gen AI. And that, to me, is a big factor as you navigate and the way you think about this business, this business model and gen AI as a category, who's going to win, who's going to lose, and which customers really want the solution, how quickly do they want the solution; if they see the value, they want it yesterday, and that's a great sign for us.
Check out 🧵 here
As always, 🙏🏼 for reading and please share with your friends and colleagues!
Scaling Startups
#love this from David Senra (Founder’s Podcast) on Michael Bloomberg, reminder that for startups, a series of small wins compounded over time = path to potentially massive success
Michael Bloomberg: Take lots of chances, and make lots of individual, spur-of-the-moment decisions!
To succeed, you must string together many small incremental advances—rather than count on hitting the lottery jackpot once.
Trusting to great luck is a strategy not likely to work for most people.
As a practical matter, constantly enhance your skills, put in as many hours as possible, and make tactical plans for the next few steps.
Then, based on what actually occurs, look one more move ahead and adjust the plan.
Take lots of chances, and make lots of individual, spur-of-the-moment decisions.
Don't devise a Five-Year Plan or a Great Leap Forward. Central planning didn't work for Stalin or Mao, and it won't work for an entrepreneur either.
#ambitious goals powered by AI
#what founders and board members do on Saturdays - if you love what you do, then it’s just like any other day. And yes, I’m fired up for what’s ahead at Roadie, batteries-included Backstage
#Spot on regarding startup speed vs. big company bloat - from Emery Wells, cofounder of Frame.io (sold to Adobe)
Startup vs Big Company Dynamics
Startup:
• Observation: Users want a new feature.
• Designer (60 min later): Here are some figma prototypes.
• Engineer: We can ship this by the end of the week.
Big Company:
• Observation: Let’s discuss our observations when Suzy’s back; targeting end of month.
• End of Month: Bob should really be in this meeting, let’s reschedule.
• Meeting: Users want this feature. We’ll need Jessica's buy-in.
• Jessica Meeting: Presented six weeks of research for the new feature.
• Jessica: This can fit into our H2 planning.
• H2 Kickoff: Remember users want this feature?
• Product Manager: Will draft a product brief that will be ready for H2's second cycle.
• Designer: Drafted designs in Figma, ready for the next review.
• Design Review: Why this feature? Should we prioritize X instead?
• Outcome: Feature sidelined. Cycle repeats.
**Note:** This isn’t subtweeting, but the reality of software development at scale. Whether you're a startup or a big company, it’s crucial to streamline to avoid this. Success breeds complexity, but no startup is immune.
Here’s Dharmesh, co-founder and CTO of HubSpot, chiming in on Emery’s post
HubSpot's a pretty big company (~8,000 people) but we've managed to avoid this death spiral.
Things that have helped:
1) Small teams with high autonomy
2) Hold regular "science fairs" where teams show off what they *shipped*
3) Closely connect our product team to customers, so they can hear needs first hand
4) Have a culture that prioritizes *value* delivered to the customer
5) Have founders that use the product regularly and are perpetually impatient. 😀
#🐶fooding - if you don’t use your own product and love it, how are you going to get others to love your product? Here’s my boldstart partner Eliot Durbin sharing thoughts along with some great comments…
No secret the best teams dogfood their own product. I was thinking about the different ways founders create this culture:
- internal / external hackathons
- dedicated demo / prototype time during team meetings
- bug bashes, weekly wins, celebrate the small wins!
What are some other ways? Please share / brag here :)
Enterprise Tech
#🤯 the future of software? Worth a click…
#when it comes to AI, is the model the product, or is it the apps and all that is built around it? Here’s Zuck’s viewpoint, which I agree with 💯 (full Dwarkesh pod here)
Zuck on Dwarkesh
TLDR: AI winter is here. Zuck is a realist, and believes progress will be incremental from here on. No AGI for you in 2025.
1) Zuck is essentially a real-world growth pessimist. He thinks the bottlenecks start appearing soon for energy and they will take decades to resolve. AI growth will thus be gated on real-world constraints.
> "I actually think before we hit that, you're going to run into energy constraints. I don't think anyone's built a gigawatt single training cluster yet. You run into these things that just end up being slower in the world."
> "I just think that there's all these physical constraints that make that unlikely to happen. I just don't really see that playing out. I think we'll have time to acclimate a bit."
2) Zuck would stop open sourcing if the model is the product
> "Maybe the model ends up being more of the product itself. I think it's a trickier economic calculation then, whether you open source that."...
#every $1B market cap company turned down an offer for $100M or more, and the same goes for every $10B company turning down $1B - so how about Meta? Here’s Zuck on why he turned down a $1B offer early in Facebook’s life (Brian Ji - must watch!). Part of the answer is that he would have just started another company, and he loved the one he had - now that’s a true founder!
#Platform vs. app debate - who wins for AI - Steven Sinofsky
Right now we're in the "systems-led" view of AI. It means the platforms are being built out and with that the UX, scenarios, and use cases. The problem is those people/teams are notoriously bad at building "Apps".
The "apps" led revolution follows. The systems people will be far behind the apps on UX, use cases, etc. The fear that the platform will "eat all the good apps" is counter to the history of computing.
Apps and Systems are not just different people but different cultures, lenses, and toolkits. What problems to solve and how to solve them end up being different. When Systems builds Apps they think an App just wraps the platform and defer to the platform for a solution. When Apps people build Systems they think about the "least" amount of work that can get locked up in a platform so they have the most flexibility to solve a domain problem.
Building systems is hard. Building apps is hard. The same teams never end up doing both. It sells both sides short to think one team can do everything. Then toss in hardware and you get a third dimension where vertical integration doesn't work as universally as many believe.
tl;dr OpenAI saying they will eat all the apps defies history in a big way but also makes little sense technically.
#Andrew Ng on agentic workflows + how using these is like assigning discrete tasks to different employees - also great to see boldstart port co CrewAI listed
Multi-agent collaboration has emerged as a key AI agentic design pattern. Given a complex task like writing software, a multi-agent approach would break down the task into subtasks to be executed by different roles -- such as a software engineer, product manager, designer, QA (quality assurance) engineer, and so on -- and have different agents accomplish different subtasks.
Different agents might be built by prompting one LLM (or, if you prefer, different LLMs) to carry out different tasks. For example, to build a software engineer agent, we might prompt the LLM: "You are an expert in writing clear, efficient code. Write code to perform the task …"...
In many companies, managers routinely decide what roles to hire, and then how to split complex projects -- like writing a large piece of software or preparing a research report -- into smaller tasks to assign to employees with different specialties. Using multiple agents is analogous. Each agent implements its own workflow, has its own memory (itself a rapidly evolving area in agentic technologies -- how can an agent remember enough of its past interactions to perform better on upcoming ones?), and may ask other agents for help. Agents themselves can also engage in Planning and Tool Use. This results in a cacophony of LLM calls and message passing between agents that can result in very complex workflows.
While managing people is hard, it's a sufficiently familiar idea that it gives us a mental framework for how to "hire" and assign tasks to our AI agents. Fortunately, the damage from mismanaging an AI agent is much lower than that from mismanaging humans!
Emerging frameworks like AutoGen, Crew AI, and LangGraph support building these multi-agent workflows.
More here:
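To make the pattern concrete, here’s a rough sketch (mine, not from Ng’s letter) of the role-prompting idea: one LLM, several role prompts, with one agent’s output feeding the next. The `call_llm` helper is a hypothetical placeholder - in practice you’d swap in a real client or one of the frameworks above (CrewAI, AutoGen, LangGraph).

```python
# Minimal sketch of the multi-agent pattern described above: the same LLM,
# prompted with different role "personas", each handling one subtask.
# `call_llm` is a hypothetical stand-in for a real LLM client or framework.

def call_llm(system_prompt: str, user_prompt: str) -> str:
    """Placeholder LLM call; returns a canned string so the sketch runs as-is."""
    return f"[{system_prompt.split('.')[0]}] response to: {user_prompt[:60]}..."

ROLES = {
    "product_manager": "You are a product manager. Break the request into a short spec.",
    "engineer": "You are an expert in writing clear, efficient code. Write code to perform the task.",
    "qa": "You are a QA engineer. List the test cases that would catch regressions.",
}

def run_pipeline(task: str) -> dict:
    """Each 'agent' is the same LLM with a different role prompt;
    one agent's output becomes the next agent's input (message passing)."""
    spec = call_llm(ROLES["product_manager"], task)
    code = call_llm(ROLES["engineer"], spec)
    tests = call_llm(ROLES["qa"], f"Spec: {spec}\nCode: {code}")
    return {"spec": spec, "code": code, "tests": tests}

if __name__ == "__main__":
    for step, output in run_pipeline("Build a CLI that summarizes a CSV file").items():
        print(step.upper(), "->", output)
```

In a real system each role could also get its own memory and tools, as Ng notes, but the core idea is just this: decompose the task, assign each piece to a differently prompted agent, and pass the intermediate results along.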
#What comes after LLMs? (Gary Marcus)
If all we had was ChatGPT, we could say, hmm “maybe hallucinations are just a bug”, and fantasize that they weren’t hard to fix.
If all we had was Gemini, we could say, hmm “maybe hallucinations are just a bug”.
If all we had was Mistral, we could say, hmm “maybe hallucinations are just a bug”.
If all we had was LLAMA, we could say, hmm “maybe hallucinations are just a bug”.
If all we had was Grok, we could say, hmm “maybe hallucinations are just a bug”.
Instead we need to wake up and realize that hallucinations are absolutely core to how LLMs work, and that we need new approaches based on different ideas.
#Every AI Market Map you may want from agents to apps to AI security to deepfake detection and more (Chief AI Agent)
#the good ol’ days - they don’t build them like they used to
#🤯 no revenue and a $2B valuation…Cognition, creator of Devin, just raised a new round at $2B despite having no revenue (Rowan Cheung)
#to that end, here’s another coding agent, Augment, backed by Eric Schmidt and emerging from stealth with 🤯 $252M of funding (TechCrunch)
In 2022, Ostrovsky and Guy Gur-Ari, previously an AI research scientist at Google, teamed up to create Augment’s MVP. To fill out the startup’s executive ranks, Ostrovsky and Gur-Ari brought on Scott Dietzen, ex-CEO of Pure Storage, and Dion Almaer, formerly a Google engineering director and a VP of engineering at Shopify.
Augment remains a strangely hush-hush operation.
In our conversation, Ostrovsky wasn’t willing to say much about the user experience or even the generative AI models driving Augment’s features (whatever they may be) — save that Augment is using fine-tuned “industry-leading” open models of some sort.
He did say how Augment plans to make money: standard software-as-a-service subscriptions. Pricing and other details will be revealed later this year, Ostrovsky added, closer to Augment’s planned GA release.
“Our funding provides many years of runway to continue to build what we believe to be the best team in enterprise AI,” he said. “We’re accelerating product development and building out Augment’s product, engineering and go-to-market functions as the company gears up for rapid growth.”
#But GitHub is cranking…1.8M paid Copilot users, up from 1.3M the prior quarter
Developers, developers, developers
One of best acquisitions of all time
🤯 1.8M paid subs for GitHub Copilot, with 90% of the Fortune 100 as GitHub customers overall
Founders, if you don’t have an AI story for a devtools co, you’re toast + yeah, we have enough code generators
#Revisiting Rippling from its initial vision at Inception to its most recent $13.5B valuation
Markets
#Most great companies are a by-product of M&A (Bain) - and here are the “four areas of focus that have been systematically developed by the best acquirers over the past 20 years.”
The last two decades have upended that paradox to the point that now most great companies are the by-product of M&A, and those that have mastered the art of frequently adding new businesses to their portfolio (we call such companies “Mountain Climbers”) unequivocally perform the best.
It’s not as if M&A isn’t still risky—the landscape is littered with failures. Yet, while some companies made difficult missteps, others learned, deal after deal, how they could substantially boost the odds of success in their favor. To put some data behind this assertion, from 2000 to 2010 companies that were frequent acquirers earned 57% higher shareholder returns vs. those that stayed out of the market. Now that advantage is about 130% (see Figure 1). Sitting on the M&A sideline is generally a losing strategy.
Still tons of opportunity for improvement when it comes to what is most important to successful M&A, the people and culture…
In the end, the success or failure of almost all deals also comes down to people and culture, yet this is where companies have advanced the least. There is much more to be done as companies manage the intersection of business aspirations and employee engagement.
Where are they falling short? Companies underinvest in communications. They underinvest in establishing a “sponsorship spine” to ensure everyone is on board with the inevitable changes. They underinvest in tech tools to measure employee sentiment and employee understanding. They don’t perform retrospectives to see who stayed, who left, who got promoted, who didn’t, and why—information that can help them hone future integrations. And many don’t make any effort to attach economic value to their culture efforts. It may be the first thing CEOs talk about but the last thing they ask their teams to actually do something about.
#Speaking of M&A, HashiCorp, cloud infra software provider and creator of Terraform, was bought by IBM for $6.4B, a 42.6% premium to Monday’s closing price - from Arvind Krishna, IBM Chairman and CEO
"Enterprise clients are wrestling with an unprecedented expansion in infrastructure and applications across public and private clouds, as well as on-prem environments. The global excitement surrounding generative AI has exacerbated these challenges and CIOs and developers are up against dramatic complexity in their tech strategies," said Arvind Krishna, IBM chairman and chief executive officer. "HashiCorp has a proven track record of enabling clients to manage the complexity of today's infrastructure and application sprawl. Combining IBM's portfolio and expertise with HashiCorp's capabilities and talent will create a comprehensive hybrid cloud platform designed for the AI era."
The rise of cloud-native workloads and associated applications is driving a radical expansion in the number of cloud workloads enterprises are managing. In addition, generative AI deployment continues to grow alongside traditional workloads. As a result, developers are working with increasingly heterogeneous, dynamic, and complex infrastructure strategies. This represents a massive challenge for technology professionals.
Here’s Adam Jacob breaking it down (Adam was the founder of Chef, so he’s an OG of IaC)
Okay, since @sogrady is a pal, here are my thoughts on IBM buying Hashicorp. First - congratulations to all my Hashicorp people - it's an incredible accomplishment to have built a company like that, and a crazy thing to have someone value what you built at $4B+. Congratulations.
My read on it is pretty straightforward. Hashicorp had two main drivers of revenue afaik, Terraform and Vault. IPO to today they're down from $85.70 to $23.97 before the announcement. Absolute bottom was $19.68, I think?
More here 🧵