Looking to escape the slowdown in deal activity and exit values, venture capital firms are going all-in on emerging AI opportunities that show the potential to drive long-term growth. PitchBook’s latest Artificial Intelligence and Machine Learning Report, released today, reflects the ongoing challenges facing venture capital firms, starting with declining deal activity and exit values. PitchBook’s analysis shows that AI data centers, local large language models (LLMs) and domain-specific foundation models are three of the many growth catalysts VCs need to sustain their businesses and generate returns.
More market turbulence for venture capital firms
AI and machine learning (ML) deal activity fell 19% in just one year, from 8,968 deals in 2022 to 7,238 in 2023. The value of AI and ML deals also fell: PitchBook tracked $2.7 billion in disclosed deal value in Q4 2023, the lowest quarterly total since Q1 2019. Mergers and acquisitions (M&A) activity continues to decline as major technology companies focus more on partnerships with LLM startups.
PitchBook notes that the exceptions to this trend are AMD’s acquisition of Nod.AI in machine learning operations (MLOps), IBM’s acquisition of Mantis in database management and ServiceNow’s acquisition of Ultimate Suite in predictive analysis. The IPO of semiconductor startup Astera Labs is expected to reinvigorate deal value in the first or second quarter of this year.
Amid plummeting deal activity and declining deal values, signs of long-term growth are also emerging. Generative AI leaders raised $6 billion in the fourth quarter of 2023 alone, across 194 deals, largely backed by Microsoft, Google and other tech giants seeking access to the latest LLM technologies. PitchBook notes that momentum for horizontal platforms has also increased, setting a venture capital record in 2023 with $33 billion raised, while investments in vertical applications have fallen to levels not seen since 2020.
Where venture capitalists say new opportunities lie
Building an organizational structure and product strategy that can capitalize on Nvidia’s many innovations, including rapid advancements in GPUs, is at the heart of new investment opportunities. PitchBook’s analysis reveals that three emerging areas, AI data centers, local LLMs and domain-specific foundation models, are well-positioned to benefit from Nvidia’s momentum as a key driver of the AI market.
Nvidia reported revenue of $22.1 billion for its fourth quarter of fiscal 2024, up 265% year-over-year and 22% sequentially. The data center segment grew 409% year-over-year and 27% sequentially to $18.4 billion. Jensen Huang, founder and CEO of Nvidia, said, “Our data center platform is powered by increasingly diverse drivers: demand for data processing, training and inference from large cloud service providers and GPU-specialized providers, as well as from enterprise software and consumer internet companies. Vertical industries – led by automotive, financial services and healthcare – are now at a multibillion-dollar level.”
AI data centers show potential for skyrocketing growth
Designed from the infrastructure layer up to scale and support AI-intensive workloads, these data centers are optimized to make the most of high-performance servers, storage, networking and specialized accelerators. AI data centers must also be designed to manage the power consumption and heat output of high-performance GPUs, while placing a strong emphasis on sustainability.
IDC estimates that $8 billion has been invested in processors, storage and networking for generative AI, generating $2.1 billion in cloud revenue and $4.5 billion in application sales. PitchBook predicts that AI data centers won’t reach software-as-a-service (SaaS)-level margins until 2027. Startups are focused on offering cost-effective solutions and significant savings on GPU hours.
PitchBook notes that “based on on-demand hourly pricing, startups offer 50-70% savings on GPU hours for advanced users” of the Nvidia A100, while providing unique access to the latest H100 chips. The report adds that Lambda, the leading GPU cloud provider for startups, built the largest H100 chip cluster of any public cloud, surpassing Google and Oracle.
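To put that savings range in rough perspective, the sketch below works through the arithmetic for a single training job. All hourly rates and job sizes are hypothetical assumptions chosen for illustration; they are not figures from the PitchBook report or any provider’s price list.

```python
# Back-of-the-envelope illustration of the 50-70% GPU-hour savings claim.
# All hourly rates and workload sizes are hypothetical assumptions, not
# figures from the PitchBook report or any provider's published pricing.

HYPERSCALER_A100_RATE = 4.00   # assumed hyperscaler on-demand $/GPU-hour
SAVINGS_RANGE = (0.50, 0.70)   # the 50-70% savings range cited in the report

def training_job_cost(gpu_count: int, hours: float, rate: float) -> float:
    """Total cost of an on-demand training job at a flat hourly GPU rate."""
    return gpu_count * hours * rate

# Example: a 64-GPU job running for 200 hours (12,800 GPU-hours, assumed).
baseline = training_job_cost(64, 200, HYPERSCALER_A100_RATE)

for savings in SAVINGS_RANGE:
    discounted_rate = HYPERSCALER_A100_RATE * (1 - savings)
    cost = training_job_cost(64, 200, discounted_rate)
    print(f"{savings:.0%} savings: ${cost:,.0f} vs ${baseline:,.0f} baseline")
```

Under these assumed numbers, the same 12,800 GPU-hour job drops from roughly $51,000 to between about $15,000 and $26,000, which is the scale of discount the report describes.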
Venture capitalists will also evaluate the possibility of building partnerships with ecosystems of colocation providers. PitchBook notes that specialty cloud providers have carved out a $4.6 billion market within the nearly $150 billion infrastructure-as-a-service (IaaS) market, more than 90% of which goes to U.S.-based hyperscalers and Chinese cloud giants. What makes specialty cloud providers unique is their ability to differentiate based on AI chip availability, local presence, multi-cloud support, and support for multiple legacy hardware types.