
Google Public Sector 2024: US Space Force General Saltzman on innovation, scale, leadership


General Chance Saltzman, Chief of Space Operations for the United States Space Force, outlined the agency's growing challenges, but noted that innovation at scale and pace is possible through public-private partnership.

Speaking at the Google Public Sector Summit 2024 in Washington DC, Saltzman hit on multiple themes that apply to both the public and private sectors: leadership and innovation within a large organization.

The Space Force recently outlined its pillars for change, including developing science and technology processes and making them operational at pace and scale. Saltzman's talk came a few days after the Space Force tested the X-37B Orbital Test Vehicle (OTV-7).

Here's a look at the key points from Saltzman's talk:

Scale issues and tracking threats. Saltzman said that around 2008, he was concerned that the database used to track space objects would struggle at 10,000 objects. Today, that database is handling more than 40,000 objects. US Space Force is working with the private sector, including Google Public Sector, to scale its tracking of space traffic.

"The number of satellites launched has dramatically changed since I arrived in 2008. The cost per kilo to orbit has gone from $30,000 to $1,500 and these are game-changing shifts in the space domain," said Saltzman.
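Those figures imply some stark multiples. A quick back-of-the-envelope sketch (the 2008 and current values are the ones quoted above; the growth factors are simple arithmetic):

```python
# Figures quoted in Saltzman's talk; the multiples are derived from them.
objects_2008 = 10_000      # tracked space objects the database was sized for
objects_today = 40_000     # objects the database handles today (a floor; "more than")
cost_per_kg_2008 = 30_000  # dollars per kilogram to orbit, circa 2008
cost_per_kg_today = 1_500  # dollars per kilogram to orbit today

object_growth = objects_today / objects_2008           # 4x more objects tracked
cost_reduction = cost_per_kg_2008 / cost_per_kg_today  # 20x cheaper to launch

print(f"Tracked objects grew {object_growth:.0f}x; launch cost fell {cost_reduction:.0f}x")
# Tracked objects grew 4x; launch cost fell 20x
```

A 20x drop in launch cost alongside at least a 4x jump in tracked objects is why the tracking problem now demands commercial-scale compute.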


The US Department of Defense works with multiple cloud vendors directly under its Joint Warfighting Cloud Capabilities contract vehicle.

Space security. Saltzman said GPS jamming and interference with satellite communications are also becoming security threats. Threats to space infrastructure can also hamper operations on Earth. "It used to be that all we had to do was maintain access to space and then exploit it for its advantages," said Saltzman. "Today to control the domain and protect it, we have to deny an adversary. This control aspect is what has shifted."

Saltzman said space, which once simply provided efficient services for navigation and communication, is now contested, and that requires more technological prowess.

Innovation within government. Saltzman said, "the government does not innovate well." He added that budget and funding for projects are set up years in advance. "We write our requirements several years before the funding lines up. That creates problems with working capital and being more agile with our resources," he said.

Budget issues aside, innovation is also a mindset. The government is good at creating what Saltzman calls "new-old." In a nutshell, new-old refers to developing new versions of old capabilities. "We take the F-16 and we build the F-22. The system is designed that way. The industrial support is available and we have our concept of operations," he said. "It's all based on standard operating procedure and it is seen as low risk to enhance what we already have."

The catch is that space requires a system that's "new-new" and breaks from tradition. "You really need to break from your own patterns. No one in government wants to do risky things with taxpayer dollars," he said. "The system is designed to give you new-old."

Leadership and innovation. Saltzman said he is aiming to develop new kinds of leaders who can be innovative. "I ask questions like 'are we following tried and true processes?'" he said. "It's a leading question because then we're probably not right. If we start building requirements on things we know, we're already off base. We're already going to limit the options of what's possible."

Thinking horizontally. Saltzman said his goal is to rethink operations more horizontally so capabilities can be delivered vertically and adapted to multiple use cases.

Saltzman's take rhymes with how companies are thinking through digital transformation, AI and cloud computing. Start small, notch wins and build capabilities. "You have to test assumptions and recognize that new-old is not innovation. It is not going to get us where we need to go," he said. "We're trying to put it in terms where it doesn't sound like change is inherently risky. We are already experiencing the risk."

Public-private partnership. Saltzman said the private sector is critical to public sector innovation, but it will be frustrating at times. "Curb your frustration working with the government. At least you get to go home to a different office," he quipped. "Nobody is more frustrated sometimes with the way we do our business. But periodically, I see value in the slow delivery process. We shouldn't be entrepreneurial with taxpayer dollars. Just recognize that together we can operationalize these good ideas."


Zuora goes private in $1.7 billion deal with Silver Lake, GIC


Zuora said it will be acquired by Silver Lake and GIC in a $1.7 billion deal that will take the company private.

Under the terms of the deal, Zuora shareholders will get $10 a share.

The company will continue to be led by founder and CEO Tien Tzuo, who said going private will help Zuora build out its monetization suite.

According to Zuora, the deal is expected to close in the first quarter of 2025. Tzuo will roll over the majority of his existing ownership.

Zuora recently acquired Togai and said it would offer a platform for usage-based pricing as well as subscriptions. Generative AI has led to new models for businesses that previously relied solely on subscriptions.

Enterprise software is a hotbed for private equity; New Relic, Alteryx and Smartsheet are among the companies that have recently gone private.

 


Google Public Sector Summit: 9 takeaways you need to know


The Google Public Sector Summit featured a packed lineup of AI leaders, panels on use cases and real-world government challenges.

The gist of the conference is that government generative AI customers can leverage commercial Google Cloud while remaining walled off. Google Public Sector is an independent entity that leverages Google Cloud technology, but takes it the last mile (with isolated instances in some cases). In an interview with analysts, Google Public Sector CEO Karen Dahut said the company's goal is to bring commercial cloud capabilities to the public sector for government use.

"When we came into this market, what we found was traditional gov clouds. They're walled off and lack parity. It lacks the compute scale and doesn't have resiliency. What if we made our commercial cloud available to government by a software defined community cloud with all of the guardrails built in? OMB came to that same conclusion independent from us."

Here's a look at all the takeaways from the conference, lessons and best practices that emerged:

If you invested in data infrastructure, architecture and governance, you're able to drive value from generative AI projects. Lakshmi Raman, Director of AI at the Central Intelligence Agency (CIA), said the agency was able to drive value quickly due to investments made in AI, data and tooling over the last decade. "That investment enabled us to evaluate generative AI capabilities quickly," said Raman.

Improving data quality may be your most important genAI use case. Dr. Ted Kaouk, Chief Data & AI Officer and Director, Division of Data at the CFTC, said his agency is focused on the quality of data ingestion, with an emphasis on anomaly detection and "developing prototypes to detect bad actors."

Look at your data as a product. Zach Whitman, Chief Data Scientist & Chief AI Officer at GSA, said he's been using generative AI to focus on "how to enable better data productization and groundwork that maximizes value in the future."

Ron Robinette, Deputy Secretary, Innovation and Technology & AIO at CA GovOps, seconded Whitman's take. "We have five proofs of concept in the state of California and we need our data better prepared to take advantage of that opportunity," he said.

Mark Munsell, Director of Data and Digital Innovation and founder of Moonshot Labs at the National Geospatial-Intelligence Agency, said the agency is improving its data by making sure everything possible is entered into a database, giving it structure that can later improve model training. This structure can then be combined with computer vision.

"We have hundreds of petabytes of data from sensors and traditionally humans would look at the data and find signals, but now we need computer vision and models to cover places we can't," said Munsell.

Invest in metadata. Whitman said part of that data productization effort is to invest in metadata. "Overinvest in metadata so you can make the data explainable to the AI systems," he said. "Sometimes that's hard work and it's hard to get investment, but it's worth it."

"Metadata is critical," said Gulam Shakir CTO at National Archives & Records Administration (NARA). "We are leveraging several pilots."

Generative AI is breaking down silos. Whitman noted that conversations about generative AI use cases are going well beyond technology. Use case conversations are involving risk and safety, technology and the business. "We are seeing this cross pollination of great ideas," said Whitman. "It's a game changer that breaks down silos."

AI at the edge and hybrid use cases. At the Google Public Sector Summit, the company spent a lot of time talking to agency leaders about being the "best on-premises cloud" for workloads that are air-gapped and separated from networks but still need to run models.

There's a reason for AI systems designed for the field: The public sector--especially the military--often has spotty connectivity. During a panel, Jane Overslaugh Rathbun, CIO of the US Navy, said sailors are "disconnected continuously." She added that the Navy is looking for edge AI capabilities that can process sensor data from ships in contested theaters and get sailors the data to make decisions.

Young J. Bang, Principal Deputy Assistant Secretary of the Army for Acquisition, Logistics & Technology, noted that the Army is rarely connected at the edge. Bang said a hybrid approach to genAI will emerge where models are trained centrally, fine-tuned and sent to the edge.

Smaller models are seen as key. Mark James, Director of Infrastructure and Support Services at the Department of Homeland Security, said AI at the edge is going to require smaller models. "We're exploring smaller language models to support AI at the edge," said James. For DHS, ports are a key edge location where smaller models can augment officers' day-to-day activities by scanning documents.

Talent. Brig. Gen. Heather W. Blackwell of the US Air Force's JFHQ-DODIN said generative AI is critical to making sure your limited talent resources are used on high-value projects. "We need AI to find those things that my analysts can't see so we can use our limited analytics assets on things only humans can do," said Blackwell.

Maj. Gen. Anthony Genatempo, Program Executive Officer for Cyber and Networks at the Air Force Life Cycle Management Center (C3I&N), said you also need talent to ensure generative AI use cases work out. "I want to tackle one aspect of our business we do to see if AI can help us out. Right now, I want to cut my contracting timeline from 18 months to 14 days," said Genatempo. "There are aspects of the workforce who think AI is about getting rid of them. I'm not getting rid of one person. People that know how to use these tools will replace people who don't."

Generative AI is a cultural opportunity. Raman noted that "culture eats strategy for breakfast" so AI leaders need to make sure "the AI journey is aligned with organizational beliefs."

Culture was a theme echoed by General Chance Saltzman, Chief of Space Operations for US Space Force. He said government needs a different type of leader who knows how to innovate within government. Critical thinking will be critical.

Urs Hölzle, Google Fellow, said cultures need to evolve with an eye on longer term projects and a tolerance for failure. Takeaways from Hölzle on culture include:

  • Cultural change is key to enabling transformative innovation within organizations.
  • Embracing failure as part of the innovation process is crucial. Different projects should be categorized (e.g., core, experimental) to manage risk appropriately.
  • It's tempting to rely on legacy methods in moments of pressure, but true progress requires focusing on new solutions and resisting this tendency.
  • Structured prioritization helps ensure that resources are allocated effectively, avoiding the pitfall of focusing only on short-term wins.
  • Effective leaders foster a culture that embraces learning from failures while being clear about project expectations.

Amazon invests in X-Energy Reactor, fuels small modular nuclear reactor run


Amazon is investing in X-Energy Reactor Company's $500 million venture round as it becomes clear that AI factories will be increasingly tethered to nuclear reactors.

The company's investment lands a day after Google made a similar move. Nuclear power is seeing a renaissance due to the energy needs of AI workloads.

Investors in the X-energy round include Amazon's Climate Pledge Fund, Citadel founder and CEO Ken Griffin, affiliates of Ares Management Corporation, NGP and the University of Michigan.

X-energy aims to bring more than 5 gigawatts online in the US by 2039. If X-energy hits its target, it will have the largest commercial deployment of small modular reactors (SMRs). As part of the deal, Amazon committed to support an initial 320-megawatt project with Energy Northwest.

The money will be used to fund X-energy's reactor design and licensing and the first phase of its fuel fabrication facility. X-energy and Amazon will also collaborate to standardize deployments and financing models.

These SMR companies are in the early stages, but raking in funding. Many of the timelines for commercial deployments extend into 2030 or later.

X-energy's key features are its Xe-100 SMR design and TRISO-X fuel. Each reactor unit is engineered to provide 80 MW of electricity and is optimized for multi-unit plants ranging from 320 MW to 960 MW. These SMRs can be shipped via road, which should enable easier scaling.
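The plant sizes are simple multiples of the 80 MW unit. A rough sketch (the unit and plant figures are from the article; converting the ~5 GW target into a unit count assumes it is met entirely with Xe-100 units, which is an illustration, not X-energy's stated plan):

```python
# Xe-100 figures as cited above; unit counts are derived arithmetic.
unit_mw = 80        # electrical output of one Xe-100 reactor unit
plant_min_mw = 320  # smallest multi-unit plant configuration
plant_max_mw = 960  # largest multi-unit plant configuration
target_mw = 5_000   # 2039 US deployment goal ("more than 5 gigawatts")

units_min = plant_min_mw // unit_mw          # 4 units in the smallest plant
units_max = plant_max_mw // unit_mw          # 12 units in the largest plant
units_for_target = -(-target_mw // unit_mw)  # ceiling division: ~63 units for 5 GW

print(units_min, units_max, units_for_target)  # 4 12 63
```

Under those assumptions, the 320-megawatt Energy Northwest project would correspond to a four-unit plant, and the 2039 goal implies on the order of 60-plus reactor units.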

 


SuperNova Award interview: How IBM used Adobe Firefly to speed up ideation and iteration


IBM used Adobe Firefly over the last year to lower its content spend by 80% and reduce its ideation time from 15 days to 2 days in a marketing campaign executed in and around the Sphere in Las Vegas.

Perhaps the bigger takeaway is that IBM's returns were largely driven by using Firefly at the front end of the creative process. Joe Prota, Director of Brand Marketing at IBM, is a Constellation Research SuperNova Award finalist. In an interview with Constellation's editor-in-chief Larry Dignan, Prota shared the project's goal of scaling Adobe Firefly in IBM's marketing efforts and leveraging IBM's watsonx for governance.

Watch the full interview, which covers leveraging generative AI in the creative process; pilot, process and production; returns; and ideation.

Read Larry's full recap article here: https://www.constellationr.com/blog-news/insights/supernova-award-interview-how-ibm-used-adobe-firefly-speed-ideation-and-iteration


Google Public Sector: A look at the strategy


Google Cloud CEO Thomas Kurian said the company is committed to serving the public sector and government "in all of its domains" and said Google Public Sector is the only completely independent division, which gives it the leeway to serve customers.

Kurian's comments, which were delivered at the Google Public Sector Summit in Washington DC, come as the company's government playbook revolves around showing what it can do for various services with Gemini and its AI portfolio and being seen as an AI partner.

"We want everyone to leave today with 100% certainty that we are committed to supporting the govt in all of its domains. Period. It's not just building our products and accrediting them, but bringing our expertise. Google Public Sector is the only independent division giving it the ability to serve," said Kurian.

In a talk with Google Public Sector CEO Karen Dahut, Kurian drew parallels with how Google Cloud's enterprise strategy has evolved. In multiple cases, Google Cloud has come in as a provider of compute and storage, then differentiated with data and AI services such as BigQuery, Vertex AI and Gemini throughout its platform. That is how Google Cloud has moved up the cloud food chain in enterprises. Constellation Research previously covered Equifax's cloud transformation in a customer story; that case study highlights how Google Cloud leverages AI and machine learning to be more strategic in multi-cloud environments.

Dahut and Kurian noted that Google Public Sector is offering innovation faster since it's not just another government cloud. "Governments will want access to the latest innovation and the choice of where you want to deploy," said Kurian, who also argued that Google Public Sector will provide a platform for multiple models.

Google Public Sector's core argument is that its tech stack can be more adaptive, responsible, secure and intelligent.

Kurian said Google Public Sector is also set up on Google Cloud's security foundation and various controls as well as its Mandiant unit and security operations. "Your data is your data and no one else's," said Kurian.

Here are a few key points from the Google Public Sector Summit keynote and the overall strategy:

  • Google Public Sector's core AI pitch revolves around being multimodal so its government clients can "now analyze 90% of the data gathered rather than just the 10%," said Dahut.
  • The company is deploying a show-don't-tell strategy with its customer references at Google Public Sector Summit. The lineup of CxOs from government agencies is impressive. Demos also highlighted how generative AI can be built into government infrastructure for everything from video analysis and mapping to processes for citizen services and security analysts.

  • Google Public Sector's partner strategy has evolved quickly since the way into agencies is often through integrators such as Accenture and Carahsoft. The company is also systematically checking off various clearances for its AI models.
  • Kurian said AI agents have the opportunity to scale government services, automate processes for loans, permits, tax forms and claims and process unstructured data to "fundamentally change how information is processed and decisions are made."

Google Public Sector lands new clearances for Gemini, authorizations for Air Force Cloud One


Google Public Sector said Gemini in Google Distributed Cloud for secret and top-secret workloads will be available in early 2025, as the two-year-old unit announced key customer wins including NIH STRIDES and CalHEERS as well as a Federal AI Solution Factory with Accenture. Google Cloud also won new authorizations for Air Force Cloud One to provide cloud services to the Department of the Air Force.

At its Google Public Sector Summit in Washington DC, Google Cloud laid more groundwork for government AI workloads. The summit featured a bevy of AI leaders across federal agencies as well as Google Cloud CEO Thomas Kurian. The event came as each federal agency is required to fill a Chief Artificial Intelligence Officer role by the end of the year under the White House Executive Order on AI.

With government agencies going multicloud, Google Public Sector is positioning itself as an AI provider that can adapt across hybrid cloud deployments and multiple models with Google Cloud services such as BigQuery and Vertex AI. A demo showcase highlighted how Google generative AI tools can be used for everything from disability claims processing to research grants to geospatial applications and edge computing.

In a keynote, Google Public Sector CEO Karen Dahut said the Public Sector Summit drew more than 1,000 public sector leaders. "The public sector is ready to adopt the latest generative AI technology with the proper guardrails built in," she said. "We are demonstrating what's possible now."

Dahut cited multiple Google Public Sector customers including the US Air Force for preventive maintenance, local governments such as Dearborn, Michigan for citizen services, and the Department of Defense, which has built an AI-driven microscope to be deployed at military treatment centers.

Here's the rundown of what was announced.

  • Gemini in Google Distributed Cloud Hosted and its clearance for secret and top-secret workloads mean public sector agencies can build out AI agents across departments for workflows, code development and cybersecurity.
  • Google Public Sector has achieved Impact Level (IL) 4 and IL5 Authorization to Operate (ATO) for Air Force Cloud One. Air Force Cloud One is a contract vehicle for providing cloud services to the Air Force. The authorization, which builds on top of Google Public Sector's FedRAMP authorizations, means the US Air Force will be able to access Google Cloud for infrastructure and applications.
  • National Institutes of Health STRIDES Marketplace. Google Cloud and 11 independent software vendors launched the Google Cloud NIH STRIDES Marketplace, which will give NIH scientists the ability to find, purchase and deploy services such as compute, storage, analytics, AI and machine learning for biomedical research. The marketplace's inaugural partners include Redis, Box for Life Sciences, Augmedix (a Commure company), Sorcero, Egnyte, MongoDB, Weka.io, Form Bio, Red Hat, Rhino Health and Aiforia.

  • Deloitte and Google Public Sector will deploy Google Security Operations to secure CalHEERS, the State of California's health benefits exchange system. CalHEERS runs Covered California, the state's health insurance marketplace. Google Cloud will provide its suite of cybersecurity tools for threat detection, incident response and security analytics with built-in Gemini AI. Covered California said in April it would use Google Cloud's Document AI.
  • Google Public Sector expanded a partnership with Accenture to launch a Federal AI Solution Factory to prototype and pilot AI use cases for agencies. In addition, Slalom Solution Factory and Google Public Sector expanded a partnership aimed at US state and local government and education use cases.
  • The Accenture and Slalom Solution Factory partnerships landed as Google Public Sector enhanced its Partner Program, which includes improved incentives, more training and co-marketing support, accelerated go-to-market and clear delivery frameworks.
  • Google said it will dedicate $15 million in new Google.org funding to Partnership for Public Service and InnovateUS to upskill government workers.

DSAG: SAP's innovation focus on cloud discriminates against on-premises users


The German-speaking SAP User Group (DSAG) said SAP on-premises customers are being discriminated against because the software vendor requires that new innovations, notably generative AI, be delivered on its cloud platform.

DSAG is holding its annual conference as users try to navigate digital transformation and AI as well as their own IT budgets.

Jens Hungershausen, DSAG Chairman of the Board, said in a letter that SAP is mistaken in its strategy to deliver innovation solely through the public cloud. SAP has been pushing customers through programs like RISE with SAP to move to S/4HANA Cloud and a clean core.

In the letter, DSAG noted:

"DSAG also believes that on-premises systems will remain highly relevant for some time to come, e.g. in industries with high process complexity or due to legal or data protection framework conditions or individual requirements. For example, the cloud offering in public administration must meet the sovereignty requirements set by the public administration for certain specialist processes and take into account the applicable legal framework conditions."

This debate over on-premises vs. public cloud between DSAG and SAP is notable given that many vendors and enterprises are betting that on-premises AI infrastructure will gain in popularity due to costs, data proximity and various regulations. "The discrimination of on-premises customers when it comes to innovation, the perceived pressure to switch to the cloud and the increasing dependence on SAP are just a few examples," said Hungershausen.

Holger Mueller, Constellation Research analyst, said the DSAG-SAP debate over innovation delivery will continue. "It's clear that DSAG and SAP are at odds on innovation strategy, as to where innovation needs to be delivered," he said. "SAP is focused on public cloud; DSAG wants innovation on premises and for private cloud. The core challenge remains that SAP needs to show value to get customers to upgrade to public cloud. And needs to do more."

Constellation Research analyst Ray Wang noted:

"Customers signed up for the promise of enterprise software where advancements and innovation would be delivered in exchange for the license of software and maintenance. When software companies break that promise, customers suffer. In the case of SAP’s on-premises customers, this promise continues to be broken and DSAG is correct that this type of discrimination should be addressed."


Dell Technologies launches Nvidia Blackwell PowerEdge system, new rack design


Dell Technologies launched its Dell PowerEdge XE9712, a system built on Nvidia's GB200 NVL72 platform with 36 Nvidia Grace CPUs and 72 Nvidia Blackwell GPUs in a rack.

The system, available with liquid cooling, builds on Dell Technologies' AI Factory strategy for scale-out AI workloads. The Dell PowerEdge XE9712's 72 Nvidia Blackwell GPUs use NVLink to act as one GPU.
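The rack composition follows from Nvidia's GB200 superchip layout, in which one Grace CPU is paired with two Blackwell GPUs. A quick check against the per-rack totals cited above:

```python
# GB200 NVL72 per-rack totals as cited above; the 1 CPU : 2 GPU pairing
# is Nvidia's published GB200 superchip design.
grace_cpus_per_rack = 36
blackwell_gpus_per_rack = 72

gpus_per_cpu = blackwell_gpus_per_rack // grace_cpus_per_rack  # 2 GPUs per Grace CPU
superchips_per_rack = grace_cpus_per_rack                      # one superchip per CPU

print(gpus_per_cpu, superchips_per_rack)  # 2 36
```

So a fully populated XE9712 rack amounts to 36 GB200 superchips, with all 72 GPUs presented as a single NVLink domain.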

Dell said the PowerEdge XE9712 is designed to fit into its latest rack design. The Dell Integrated Rack 7000 complies with Open Compute Project (OCP) standards and is designed to accommodate multiple generations of systems and heterogeneous GPU providers. Last week, Dell launched an integrated AI system powered by AMD and its portfolio of CPUs, GPUs and software.

Key facts about the Integrated Rack 7000 include:

  • The 21-inch rack is designed to support CPU and GPU density.
  • The rack has wider and taller server sleds for larger CPUs and GPUs.
  • Dell designed the integrated rack for native liquid cooling; it is capable of cooling future deployments of up to 480 kW.
  • The rack supports Dell and third-party networking systems.

Along with the rack and new Nvidia AI system, Dell Technologies launched Dell PowerScale, Ethernet storage certified for Nvidia DGX SuperPOD.

PowerScale includes integration with RAG frameworks, Dell Data Lakehouse and an upcoming open-source document loader for Nvidia NeMo.

Dell also said it will jointly engineer systems with Intel for AI deployments with the Dell PowerEdge XE9680 and Intel Gaudi 3.


Databricks, AWS expand partnership with Mosaic AI, more integration


Databricks broadened its partnership with Amazon Web Services in a move that will put Databricks Mosaic AI on AWS for custom models. In addition, Databricks will use AWS' Trainium chips as its preferred infrastructure for model training.

Under the partnership, Databricks and AWS will give joint customers the ability to use Mosaic AI to pretrain, tune and serve large language models (LLMs) on AWS.

The two companies will also provide new integrations for Databricks on AWS Marketplace, which is becoming a vehicle for enterprise software sales. Databricks' AWS business has passed a $1 billion annual revenue run rate, driven in part by AWS Marketplace.

Key items in the collaboration pact include:

  • Databricks, which recently expanded Mosaic AI to include Mosaic AI Model Serving, will support multiple model providers via Amazon Bedrock.
  • Mosaic AI models can scale up on AWS Trainium chips.
  • Joint customers can launch AI applications while keeping their data secure.
  • The two companies will expand go-to-market efforts and joint offerings. Databricks and AWS will develop custom models built with Mosaic AI on AWS Trainium. The companies also said they will collaborate with systems integrators to migrate data stores from on-premises to AWS.
  • Databricks and AWS will launch industry genAI accelerators for media and entertainment and financial services with more on deck.
  • AWS Marketplace integrations with Databricks will simplify onboarding, configuration and serverless compute.
