Results

Microsoft Azure sees 40% revenue growth in Q1

Microsoft reported better-than-expected first quarter results and delivered Azure revenue growth of 40%.
 
The software and cloud giant reported first quarter net income of $27.7 billion, or $3.72 a share, on revenue of $77.7 billion, up 18% from a year ago. Non-GAAP earnings for the quarter were $4.13 a share.
 
Wall Street was looking for Microsoft first quarter earnings of $3.67 a share on revenue of $75.33 billion.
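
A quick sketch of the beat math from the figures above (dollar amounts in billions; variable names are illustrative, not from any Microsoft filing):

```python
# Compare Microsoft's reported Q1 results against Wall Street expectations.
wall_st_revenue = 75.33   # $B expected
actual_revenue = 77.7     # $B reported
wall_st_eps = 3.67        # $ per share expected
actual_eps = 4.13         # $ per share, non-GAAP reported

revenue_beat_pct = (actual_revenue - wall_st_revenue) / wall_st_revenue * 100
eps_beat_pct = (actual_eps - wall_st_eps) / wall_st_eps * 100

print(f"Revenue beat: {revenue_beat_pct:.1f}%")  # ~3.1%
print(f"EPS beat: {eps_beat_pct:.1f}%")          # ~12.5%
```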
 
Microsoft Cloud revenue was $49.1 billion, up 26%. The company's Intelligent Cloud unit had revenue of $30.9 billion, up 28% from a year ago. Azure revenue was up 40% from a year ago in the first quarter.
 
CEO Satya Nadella said the company's AI factory and high-end copilots were paying off. "It’s why we continue to increase our investments in AI across both capital and talent to meet the massive opportunity ahead," he said.
 

Amy Hood, Microsoft CFO, said Microsoft Cloud saw strong customer demand across the board.  Hood said that Microsoft will be capacity constrained at least through the fiscal year. She said Microsoft will deliver second quarter revenue of $79.5 billion to $80.6 billion. 

Nadella said on the earnings call that the new OpenAI agreement creates more certainty in terms of AGI as well as IP rights. He played down how fast AGI would be here. "AGI will not be achieved any time soon," he said.

He said Microsoft will create value by building systems of AI agents and stringing them together. 

 
By the numbers:
 
  • Capital expenses including assets under finance leases were $34.9 billion, up 74% from a year ago. Half of that sum went to GPUs and CPUs for Azure.
  • Productivity and Business Process revenue was $33 billion, up 17% in the first quarter.
  • Microsoft Commercial Cloud revenue was up 17% and consumer cloud sales were up 26%.
  • Dynamics 365 revenue was up 18%.
  • Windows OEM and devices revenue was up 6%.

Google Cloud Q3 revenue surges 34% as backlog hits $155 billion

Google Cloud revenue surged 34% in the third quarter and is hitting an annual run rate of nearly $61 billion.

In the third quarter, Google Cloud delivered revenue of $15.2 billion with operating income of $3.6 billion. Google Cloud saw strength in core AI infrastructure and generative AI products.
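
The run-rate figure follows from simple annualization of the quarterly number (a back-of-envelope sketch; variable names are illustrative):

```python
# Annualize Google Cloud's quarterly revenue (article figures, in $B).
quarterly_revenue = 15.2
annual_run_rate = quarterly_revenue * 4

print(f"${annual_run_rate:.1f}B annual run rate")  # $60.8B, i.e. nearly $61B
```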

The Google Cloud gains landed as Alphabet reported better-than-expected financial results overall. Alphabet reported third quarter net income of $35 billion, or $2.87 a share, on revenue of $102.35 billion, up 16% from a year ago.

Wall Street was looking for third quarter revenue of $99.89 billion with earnings of $2.26 a share.

Alphabet said that Google Search, YouTube ads, Google subscriptions and Google Cloud all delivered double-digit growth in the third quarter.

CEO Sundar Pichai said the company saw strength across every unit to deliver its first $100 billion quarter. "Our full stack approach to AI is delivering strong momentum and we’re shipping at speed," said Pichai.

Capital expenses in the third quarter were $23.95 billion, up 83% from a year ago.

By the numbers:

  • Gemini processes 7 billion tokens per minute via direct API use.
  • The Gemini App has more than 650 million monthly active users.
  • Google Cloud ended the quarter with a backlog of $155 billion.
  • YouTube Premium and Google One drove more than 300 million subscriptions.
  • Alphabet's other bets segment, which includes Waymo, had third quarter revenue of $344 million with a loss of $1.43 billion.
  • Free cash flow in the third quarter was $24.46 billion, up 39% from a year ago.
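
The per-minute token figure above extrapolates to a striking daily throughput (a rough calculation; variable names are illustrative):

```python
# Extrapolate Gemini's direct-API token throughput (per-minute figure from the article).
tokens_per_minute = 7_000_000_000
tokens_per_day = tokens_per_minute * 60 * 24

print(f"{tokens_per_day:,} tokens/day")  # roughly 10 trillion per day
```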

On a conference call, Pichai said:

  • "Our extensive and reliable infrastructure, which powers all of Google's products, is the foundation of our stack and a key differentiator. We are scaling the most advanced chips in our data centers, including GPUs from our partner, Nvidia, as well as our own purpose-built TPUs, and we are the only company providing a wide range of both."
  • "We are investing in TPU capacity to meet the tremendous demand we are seeing from customers and partners."
  • "Over the last quarter, we rolled out AI Mode globally across 40 languages. It now has over 75 million daily active users, and we shipped over 100 improvements to the product in Q3, an incredibly fast pace. Most importantly, AI Mode is already driving incremental total query growth for Search."
  • "The number of new Google Cloud customers increased by nearly 34% year over year. We are signing larger deals. We have signed more deals over $1 billion through Q3 this year than we did in the previous two years combined. More than 70% of existing Google Cloud customers use our AI products."
  • "Over the past 12 months, nearly 150 Google Cloud customers each processed approximately 1 trillion tokens with our models for a wide range of applications."

AWS fires up Project Rainier, Trainium2 cluster for Anthropic

Amazon Web Services said Project Rainier, an AI compute cluster powered by 500,000 Trainium2 chips, is now in use for Anthropic.

The AI infrastructure project is critical for AWS since it is being used by Anthropic to train its Claude models as well as run other workloads. AWS said that Project Rainier will ultimately scale to 1 million Trainium2 processors.

Project Rainier was announced a year ago. Anthropic has pursued a multi-cloud approach and recently said it would procure TPUs from Google Cloud. Anthropic's models will now run across Nvidia GPUs, AWS Trainium and Google Cloud TPUs.

Key facts include:

  • AWS said Project Rainier will have more than 1 million Trainium2 chips by the end of the year.
  • The AI compute power is being used to build and deploy future versions of Claude.
  • Project Rainier is AWS' largest infrastructure project to date.
  • Project Rainier is designed as a massive “EC2 UltraCluster of Trainium2 UltraServers.”
  • The architecture strings together UltraServers, each of which contains four physical Trainium2 servers with 16 Trainium2 chips apiece. The servers communicate via high-speed connections called NeuronLinks.
  • These UltraServers combine to form an UltraCluster.
  • AWS said the vertical integration will enable it to continually optimize Project Rainier for cost and energy efficiency.
  • Given that AWS is highly likely to announce Trainium3, the next question will revolve around the replacement cadence and depreciation for Trainium2.
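
The hierarchy in the bullets above supports some back-of-envelope math (a hypothetical sketch; the constants come from the article, the function name is illustrative):

```python
# Model the Project Rainier hierarchy: 4 physical Trainium2 servers per
# UltraServer, 16 chips per server, UltraServers linked into an UltraCluster.
SERVERS_PER_ULTRASERVER = 4
CHIPS_PER_SERVER = 16
CHIPS_PER_ULTRASERVER = SERVERS_PER_ULTRASERVER * CHIPS_PER_SERVER  # 64

def ultraservers_needed(total_chips: int) -> int:
    """UltraServers required to field a given Trainium2 chip count."""
    return -(-total_chips // CHIPS_PER_ULTRASERVER)  # ceiling division

print(ultraservers_needed(500_000))    # chips in use today  -> 7,813
print(ultraservers_needed(1_000_000))  # end-of-year target  -> 15,625
```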

Constellation Research analyst Holger Muller said:

"It's good to see AWS being on track to build its first supercomputer. Traditionally AWS would scale through many machines, not large machines, which require different engineering and different fault tolerances. We will see more details when the new machine is in production. Obviously, AWS is confident it will work and wants a portion of the news during GTC week. And finally, it's a great proof point for AWS wanting to keep workloads in-house and living the build vs. buy mantra."

Google Public Sector: AI agents and the future of government

Google Public Sector CEO Karen Dahut said there's urgency in government to leverage AI and transform in a nationally secure way. The company positioned itself as a key public sector AI infrastructure provider that can enable AI agents to carry out missions.

“We believe agencies of the future will be powered by AI and agents that are ubiquitous and multimodal. This means the public sector will become more productive and more efficient,” said Dahut.

The Google unit, which launched in 2022, has increasingly gained traction in public sector accounts and has hooked up with key integrators such as Lockheed Martin. As outlined a year ago, Google Public Sector is also built on Google Cloud's security foundation and various controls, as well as its Mandiant unit and security operations. Google Public Sector is an independent entity that leverages Google Cloud technology but takes it the last mile (with isolated instances in some cases). Google rolled out Gemini for Government before Gemini for Enterprise.

"The pace truly is unlike anything we've ever seen. We're talking days, not months or years, and there is a heightened sense of global urgency. We've got to move fast, and there's a new national security imperative," said Dahut. "The high ground is no longer just air and space, it's the digital domain. This is our new reality, and the stakes are super high at the same time you are being asked to do more with less in mission critical environments that are constantly evolving."

Google Public Sector 2025 kicked off amid a US government shutdown that Dahut addressed at the top.

"I know that for so many of you in this room, and not in this room, our federal agency customers, leaders, dedicated public servants and the entire contracting community, this government shutdown is creating profound uncertainty and difficulty," said Dahut. "Your missions are critical. The work you do matters."

Dahut added that Google is upping its capital expenses to $85 billion to build out its AI infrastructure. She also emphasized security and options to deploy AI off the cloud. “There is security everywhere all the time. We believe agencies of the future will be powered by a zero trust security foundation that shifts the advantage back to cyber defenders,” said Dahut.

The big themes

At Google Public Sector 2025, executives hit on a few big themes.

  • There was an emphasis on Google Cloud's integrated AI stack, including its custom TPUs that deliver AI performance and cost efficiency. That said, Google Public Sector's keynote featured an extended partnership with Nvidia. Nvidia's Ian Buck, VP of Hyperscale and HPC, filled in for CEO Jensen Huang, who was in the neighborhood for Nvidia GTC Washington, DC, but had to depart for trade talks with China.
  • Google noted that it has an extensive global network to support missions, including 42 regions, 124 zones and 202 edge locations. Google executives didn't say it directly, but the subtext is that the company's AI infrastructure is already built, rather than merely announced in gigawatt press releases extending into the next decade.
  • Google Public Sector is gaining beyond just the US federal market and outlined a series of state and city customer wins.
  • There's still a healthy dose of federal AI deals, as Google Public Sector ran through multiple use cases for US departments and agencies such as the Department of Defense, Pacific Northwest National Laboratory, Department of Energy, FAA, NOAA, National Cancer Institute and others.

Gemini for Government as an orchestrator

Thomas Kurian, CEO of Google Cloud, highlighted Gemini for Government and the ability to build models and AI agents.

Kurian cited the US Postal Service, which is using Vertex AI to modernize legacy systems, and the National Cancer Institute, which is automating research processes.

According to Kurian, AI agents and AI conversational platforms will be embedded into "every employee and process workflow for every government agency."

Gemini for Government is designed to connect to data stores and keep context as well as use multiple models.

In a demo of Gemini for Government, executives highlighted agents and connectors to various systems as well as mission specific efforts from partners. 

Kurian also emphasized on-premises and air-gapped deployments of Gemini.

"We also bring our AI to wherever your mission sits, whatever your mission is. The same Gemini model that is available in the public cloud is also available on Google Distributed Cloud," said Kurian. "We call it GDC for connected on premise environments and fully air gap deployments. Together with our partner, Nvidia, we're bringing Gemini on Blackwell GPUs to your data centers."

Google Public Sector, Lockheed Martin pair up for on-premises AI

Lockheed Martin will integrate Google Gemini models into the Lockheed Martin AI Factory in a partnership with Google Public Sector.

The partnership will bring Google's AI tools into Lockheed Martin's on-premises air-gapped infrastructure.

Google Public Sector announced the deal at its Google Public Sector Summit in Washington DC.

According to the companies, Google AI will be deployed in Lockheed Martin infrastructure in a phased deployment. Google's generative AI, including Google Gemini on Google Distributed Cloud, will be deployed in unclassified systems and then classified systems for aerospace, space exploration and cybersecurity.

Google's AI tools and Gemini will be used for accelerated multi-modal data analysis, advanced research and development and logistics including supply chain management and route optimization.

Lockheed Martin said the Google Public Sector partnership "equips our engineers with powerful tools—safely and at scale—to accelerate innovation in support of our business and critical missions."

Old Dominion, Google Public Sector Create AI Incubator

Google Public Sector and Old Dominion University are aiming to embed AI throughout the university including research, teaching and operations workflows via an incubator.

Through the incubator, called MonarchSphere, Old Dominion University (ODU) is looking to use AI to accelerate discovery, personalize learning and advance student career paths.

Google Public Sector announced the deal at its Google Public Sector Summit in Washington DC.

The ODU partnership includes Google Cloud's Vertex AI platform, various Gemini models and agentic AI services. MonarchSphere is designed to be a central hub for education connecting various ODU departments.

Dr. Brian O. Hemphill, President of Old Dominion University, said the Google Public Sector pact is a "defining moment" that will make ODU a "future-ready institution." ODU is one of many universities looking for ways to embed AI into curriculum, research and operations and advance student outcomes.

Here's a look at the moving parts of the ODU-Google Public Sector partnership.

  • Research. ODU researchers will have access to Google Cloud compute for AI and big data projects. Faculty can run models quickly for use cases such as genomics. ODU previously relied on overtaxed on-premises clusters.
  • Google AI for Education Accelerator. ODU is an early member of the accelerator designed to bring AI to education institutions. Google AI for Education Accelerator includes no-cost access to Google certificates and AI training that can be integrated across curriculum and workforce training.
  • Curriculum will be mapped to Google's career pathway tools. ODU faculty are also piloting Google Colab Enterprise in advanced AI courses to give students access to GPUs for model training.
  • Tools for courses including Gemini Pro and NotebookLM. Course designers can generate summaries, outlines and learning materials using genAI. Workflow tools can also speed up delivery. ODU will develop an AI course assistant tool.
  • ODU said it is planning to move beyond a one-size-fits-all model to one that's personalized for each student. MonarchSphere will be extended to local municipalities and small businesses in Virginia.

Dr. Chrysoula Malogianni, Associate Vice President of Innovation at ODU, said the key to the project with Google Public Sector is data management: your data is everything.

Malogianni said AI success largely depends on data. "Have a data plan," she said. "AI is not a catastrophe or panacea. AI can't do anything. You need robust data. You need infrastructure and a data foundation so you can validate AI. You need to also start preparing your target population for AI adoption. If we don't understand the AI, you won't have a plan."

She added that ODU put a lot of work into the data foundation along with Google Public Sector. Key assets included:

  • 20 years of recorded courses and data can be combined with real time data from interactions. 
  • Notebooks for mind maps and course outlines to create assistants with the help of instructional designers. 
  • Data types from transcripts, advisors and student interests.
  • The combination of course data and public data with students to create personalized journeys. 

And don't forget the leadership. "It's important to have visionary leadership because transformation doesn't start from technology. It starts from visionary leadership, appropriate partnership and having a good plan," said Malogianni.

City of Los Angeles Bets on Google Workspace with Gemini as It Preps for Big Events

The City of Los Angeles will deploy Google Workspace with Gemini across its workforce of 27,500 employees as part of its broader digital transformation strategy.

Google Public Sector announced the deal at its Google Public Sector Summit in Washington DC. The City of Los Angeles will leverage Google's AI for communications, project planning and ultimately public services. The gains in Los Angeles come as Google Public Sector is landing other states and local governments.

For instance, Google announced that the Maryland Department of Information Technology (DoIT) will roll out Google Workspace with Gemini to nearly 43,000 employees as its usage grows across 59 state agencies. Maryland already had 12,500 state employees who were active users of Google Workspace with Gemini making the DoIT account a classic land and expand.

Los Angeles' Google Public Sector usage comes as the city is rolling out a broader digital transformation effort called SmartLA 2028. LA is preparing for the 2026 World Cup, 2027 Super Bowl and 2028 Olympic and Paralympic Games.

At a high level, SmartLA 2028 includes multiple levels of the resident journey.

  • Smart city infrastructure to use technology in LA's physical assets such as 5G, IoT and fiber.
  • Data tools and practices to work across departments and deliver government to resident and business services.
  • Digital services and applications to deliver services.
  • Connectivity and digital inclusion efforts.
  • Governance and coordination of investments across LA departments.

Ted Ross, CIO for the City of Los Angeles, said Google Public Sector and Google Workspace will be key to "modernizing the city’s approach to constituent services."

Here's a look at where Google Workspace with Gemini fits into LA's technology plans.

  • Workspace is being used to improve communication and information with residents. LA creates multilingual content, public announcements and emergency communications via Workspace and uses Gemini to rewrite at the most accessible reading level.
  • Workflows are being revamped across LA's 45 departments with Gemini summaries and data analysis.
  • City employees are using NotebookLM to analyze grant documents for project funding.
  • AI training for LA's workforce.

Speaking on a panel, Ross outlined the Gemini for Government plans and offered some best practices. Here's a look:

Use cases are not hard to find. Ross said that in government, use cases abound where AI can improve efficiency. Information dissemination and analysis are big ones. "In emergency management, AI has the ability to synthesize real-time information from utilities, cities, counties, states and multiple jurisdictions," said Ross. This information also has to be multilingual.

In addition, managers have been able to leverage NotebookLM and query it for new grant opportunities.

Ross added that it helps to think through AI use cases in terms of personas. "Think from the perspective of personas like the broad workforce, managers, front lines," said Ross.

Don't scrimp on training. "I'm a huge fan of training and giving employees an intro to AI," said Ross, who added that the training and use of AI is critical to employee engagement. "AI is a once in a generation shift of how people are computing and you have to train the workforce so you can launch them into the future and build AI fluency. Make the investment in training now."

Contractual protections. If adopting AI in the public sector, make sure you are using tools with contractual protections, said Ross. Since the City of LA was already using Google Workspace, it made sense to adopt Gemini because it fell under the same contract.

The roadmap ahead. Ross said using AI for transportation modeling is a big focus for the city. "We're getting into predictive modeling to see what happens when people go to an event," said Ross, who noted that the 2028 Olympics will be like having a Super Bowl every day for two weeks. "That includes multilingual traveling assistance."

Nvidia's GTC Washington DC news barrage lands amid US vs China AI backdrop

Nvidia CEO Jensen Huang laid out a series of announcements covering an investment in Nokia to meld 6G and AI, quantum computing connectivity with GPUs and a vision to build AI infrastructure as a means for national security.

At Nvidia GTC Washington, Huang laid out a bevy of news items. Nvidia's GTC coincides with Google Cloud's Public Sector Summit in Washington DC. The backdrop of AI competition with China was also hard to avoid.

The need for US-based infrastructure was a big theme as Huang talked about the 6G and AI intersection and an investment in Nokia. Huang said wireless technology is largely deployed on foreign technology. "Our fundamental communication technology is built on foreign technology, and that has to stop — and we have an opportunity to do that," said Huang, who said it's time to get back in the game.

Nvidia is building an AI-native stack for 6G with Nvidia ARC-Pro. Nokia will put Nvidia's wireless AI technology in its future base stations. Nvidia will invest $1 billion in Nokia as a way to layer AI into the transition from 5G to 6G. Nvidia also partnered with Palantir and CrowdStrike and expanded ties with Google Cloud.

In addition, Nvidia said its AI Aerial platform will add integrated multimodal sensing and communications over 6G. Nvidia is also partnering with Booz Allen, Cisco, MITRE, ODC and T-Mobile to build an American AI-RAN stack.

On the quantum front, Nvidia launched NVQLink, a high-speed interconnect that lets quantum processors connect to GPU supercomputers. The company has 17 quantum labs and nine scientific labs in the fold.

According to Nvidia, NVQLink gives quantum computing researchers a system for the control algorithms needed for large-scale quantum error correction. "In the near future, every NVIDIA GPU scientific supercomputer will be hybrid, tightly coupled with quantum processors to expand what is possible with computing," said Huang.

On the manufacturing front, Nvidia also outlined a "mega" Nvidia Omniverse Blueprint to expand libraries for building factory-scale digital twins and physical AI systems for robotics. Nvidia is also working with the U.S. Department of Energy’s national labs to develop AI factory buildouts using Nvidia Omniverse.

The Mega Nvidia Omniverse Blueprint can simulate robot fleets to include technology for designing and simulating factory digital twins.

China vs US provides context

Huang's talk landed as the US and China engage in trade talks. The two sides are also in the middle of an AI battle.

At a panel at Constellation Research's Connected Enterprise, experts noted the following:

  • The AI battle between the US and China is really just one front. Bio, quantum and robotics are other areas.
  • Depending on how the China and US relationship goes there could be global destabilization.
  • Half of the AI researchers are from China.
  • China dominates in renewable energy and is leading in nuclear reactor construction to power AI.
  • China controls the critical mineral supply chain.

"I think we're trying to handicap a three-dimensional chess game and the rules aren't fully developed. It's a very fluid situation and we're approaching it from different perspectives," said George Chanos, founder and CEO of Uvolution.io. "In my view the US and China are marching towards a finish line of singularity."

The big question is what the No. 2 player will do when there's a winner declared--real or perceived, said Chanos. He added:

"If you have dueling Manhattan Projects you have two players moving towards an end game that they think can give them global supremacy. But that's not going to be a gentlemanly type of endeavor. It's going to get heated, and when number two feels that number one is approaching the finish line, I don't know that number two is going to allow number one to cross that finish line. I think the instability that we're seeing around the world today is in large part due to this looming conflict potential."

Constellation Research CEO R "Ray" Wang noted that China is "basically going after US AI and industrial complex by giving up everything free with open source. We've been in an economic war for the last 15 years."

Scaling AI Innovation Globally: DataMasque's Journey with Synthetic Data and AWS Marketplace

We are LIVE from the Amazon Web Services (AWS) Startup Partner Summit. R "Ray" Wang and Bob O'Donnell sat down with Grant de Leeuw, CEO & co-founder of DataMasque, to discuss how high-fidelity synthetic data is empowering regulated industries to unlock AI and ML innovation—while protecting sensitive data. 

DataMasque's “marketplace-first” strategy, leveraging the AWS Marketplace, enabled global growth and rapid US success even before building a local sales force. Now, they’re leading the shift from generative to agentic AI with cutting-edge in-flight and API-based data masking.

Watch the full interview to learn more about a startup paving the way for responsible, scalable AI.

Microsoft owns 27% of OpenAI worth $135 billion

Microsoft's stake in OpenAI has a number: $135 billion, or roughly 27% of the AI company.

The disclosure was made as part of OpenAI's recapitalization that solidifies its structure as a public benefit corporation (PBC).

OpenAI and Microsoft previously said they had an understanding about the moving parts in the partnership. The valuation of Microsoft's stake in OpenAI also comes in handy since the software giant reports earnings Wednesday. Questions about the spending and losses related to Microsoft's OpenAI investment were starting to percolate.

In its annual report, Microsoft listed $4.7 billion in OpenAI expenses in an “other net” line that included other items. Microsoft also hasn't disclosed a carrying valuation for OpenAI.

Microsoft's 27% stake in OpenAI Group PBC, the for-profit entity controlled by the nonprofit OpenAI Foundation, is down from its prior 32.5% stake in the for-profit entity. The OpenAI Foundation holds a stake in the for-profit OpenAI valued at roughly $130 billion.

Nevertheless, the new agreement gives Microsoft plenty of upside should OpenAI deliver on its growth projections. Microsoft also gets some protection in case OpenAI doesn't: it can now pursue AGI on its own, remains a key compute provider (though without right of first refusal) and has rights to OpenAI's models through 2032.
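
The two disclosed figures imply a headline valuation for OpenAI (a quick arithmetic check; variable names are illustrative):

```python
# Implied OpenAI valuation from Microsoft's disclosed stake (article figures, in $B).
stake_value = 135.0
stake_fraction = 0.27

implied_valuation = stake_value / stake_fraction
print(f"${implied_valuation:.0f}B implied valuation")  # $500B
```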

Even as the two companies increasingly compete, Microsoft remains OpenAI's lead frontier model partner. Microsoft also retains exclusive IP rights and Azure API exclusivity until artificial general intelligence (AGI) is declared.

OpenAI's new deal with Microsoft has a few other wrinkles worth noting:

  • Once AGI is declared by OpenAI, the claim has to be verified by independent experts.
  • Microsoft's IP rights for models and products are extended through 2032 and now include models post-AGI.
  • Microsoft's IP rights to research will remain until AGI or through 2030, whichever comes first. Research IP doesn't include model architecture, weights, inference code, finetuning and any IP related to data center hardware and software.
  • Microsoft doesn't have IP rights to OpenAI's consumer hardware.
  • OpenAI can jointly develop some products with third parties even though API products are exclusive to Azure.
  • Microsoft can pursue AGI independently or with other partners.
  • OpenAI will purchase an incremental $250 billion of Azure services, and Microsoft no longer has right of first refusal to be OpenAI's compute provider.
  • OpenAI can provide API access to US government national security customers.
