Databricks adds on to the Lakehouse, acquires MosaicML for $1.3 billion

Databricks said it will acquire MosaicML, a generative AI platform specializing in large language models (LLMs), for $1.3 billion.

The news lands as data platforms such as Snowflake, Databricks and MongoDB race to give enterprises ways to build their own generative AI models while keeping corporate data secure. The competition centers on fast LLM training paired with strong data governance.

MosaicML is best known for its MPT family of LLMs. Its MPT-7B and MPT-30B models have been downloaded more than 3.3 million times.

Databricks will integrate MosaicML's models into the Databricks Lakehouse. According to Databricks, MosaicML will enable customers to train LLMs in hours, not days, and for "thousands of dollars, not millions."

Constellation Research analyst Doug Henschen said:

“Databricks has spent the last few years building up the house side of its Lakehouse platform, but the company’s beginnings were as a data science platform. It can’t afford to lose its distinction and differentiation as a platform for data science, so the acquisition of MosaicML makes complete sense. What’s more, it’s a good fit in terms of company culture and location.”

The $1.3 billion price tag includes retention packages. Databricks said it expects the entire MosaicML team to join the company, and retaining that team will be critical as Databricks integrates and scales the combined platform.

In a blog post, MosaicML said it started talking to Databricks about a partnership, but it became clear the effort would scale better combined. MosaicML said:

“Generative AI is at an inflection point. Will the future rely mostly on large generic models owned by a few? Or will we witness a true Cambrian explosion of custom AI models that are built by many developers and companies from every corner of the world? MosaicML’s expertise in generative AI software infrastructure, model training, and model deployment, combined with Databricks’ customer reach and engineering capacity, will allow us to tip the scales in the favor of the many.”

IBM acquires Apptio for $4.6 billion, wants to optimize, automate your IT

IBM said it has acquired Apptio, which makes IT management and optimization software, for $4.6 billion. Big Blue said the move will bolster its IT automation offerings.

Vista Equity Partners bought Apptio in 2018 for $1.94 billion.

IBM said it will combine Apptio with its IT automation software, including Turbonomic, Instana and AIOps, as well as its Watsonx AI platform. Enterprises are increasingly looking to automate their IT operations and maximize financial returns (FinOps). IBM said Apptio will bring anonymized IT spending data to provide insights.

Big Blue noted that it is at the early stages of integrating Apptio and developing roadmaps. 

Constellation Research CEO Ray Wang said:

"It’s sign of the times. Companies want to know how to manage their cloud budgets and Apptio is one of the tools with cost management tools and technology portfolio management or FinOPs. IBM is betting that customers will want to buy software to manage cloud costs and tech spending."

Apptio has more than 1,500 customers and has integrations with multiple IT vendors including Amazon Web Services, Microsoft Azure, Google Cloud Platform, ServiceNow, Salesforce, Oracle, SAP and others.

With the Apptio purchase, IBM will own three core SaaS offerings:

  • ApptioOne, which tracks hybrid cloud spend management and optimization.
  • Apptio Cloudability, which provides public cloud spend management and optimization visibility.
  • Apptio Targetprocess, which aligns IT projects with business outcomes.

IBM said the plan is to scale Apptio's products via Red Hat, IBM's portfolio of software and AI products and IBM Consulting. 

MongoDB launches Atlas Vector Search, Atlas Stream Processing to enable AI, LLM workloads

MongoDB added Atlas Vector Search and Atlas Stream Processing to its MongoDB Atlas platform along with other enhancements as it aims to be the top choice for data application developers.

The news, announced at its MongoDB.local NYC developer conference, highlights the race to win over enterprise developers looking to create modern applications that can readily incorporate generative AI capabilities at scale.

MongoDB's announcements come days after Databricks launched Lakehouse Apps to broaden its development platform ambitions. In addition, Snowflake will unveil updates at its Snowflake Summit next week. Snowflake CEO Frank Slootman last month promised "significant product announcements" at Snowflake Summit.

Dev Ittycheria, CEO of MongoDB, said during a keynote that developers spend most of their time working with data instead of creating software. Multiple clouds, endpoints and data stores have made development more complicated, and streaming data technologies are heterogeneous. "AI is about building smarter and more intelligent applications," said Ittycheria. "There has been an explosion of AI companies running and building apps on MongoDB. We believe there are 1,500 companies building AI workloads on MongoDB today."

MongoDB Atlas Vector Search will bring generative AI capabilities to the Atlas platform by enabling highly relevant information retrieval and personalization.

Doug Henschen, analyst at Constellation Research, put Atlas Vector Search in context:

"Vector Search isn't a generative AI capability on its own, it's an enabler for companies interested in developing their own generative AI capabilities. In announcing this feature, which is entering public preview, Mongo DB is joining a group of leading-edge data platform companies that have recently made, or are about to make, vector-search-related announcements."

In addition, MongoDB Atlas Stream Processing will surface high-velocity streams of complex data. Atlas Stream Processing, which is in private preview, will enable enterprises to leverage large language models (LLMs) and process streams of real-time data in one unified experience.

Henschen said Atlas Stream Processing is a key addition for MongoDB. He said:

"Atlas Stream Processing is the most important announcement at this week’s event, with Vector Search being the second most important announcement in my book. Low-latency workloads and requirements are only becoming more prevalent, so MongoDB really had to step up on this front if it is to live up to the company’s billing as a "developer data platform." Rival data platforms associated with analytics, such as Snowflake and Databricks, have already addressed real-time needs, so MongoDB is filling a competitive gap."

Vector Search and Stream Processing are likely to appeal to developers building AI-based applications. MongoDB said Beamable, Pureinsights, Anywhere Real Estate and Hootsuite are building next-gen applications with the new Atlas capabilities.

To round out the Atlas updates, MongoDB also added Atlas Search Nodes with dedicated resources for search workloads, efficiency improvements with MongoDB Time Series collections and new Atlas Data Federation for queries and isolating workloads on Microsoft Azure.

For MongoDB, the race to build enterprise-grade generative AI apps is an opportunity to grow its multi-cloud developer data platform. Although enterprises aren't scaling LLMs and generative AI applications yet, the interest is there.

To capitalize on the generative AI and LLM interest, MongoDB is looking to address the following with Atlas Vector Search:

  • Provide the flexibility to store and process different types of data. LLMs require data in the form of vectors to represent data types such as text, images and audio.
  • Store vectors so LLMs can use them without needing a specialized database that lacks integration with existing technology stacks.
  • Enable developers to deploy new workloads such as semantic search, text and image search and personalized product recommendations in one platform.
  • Provide developers with the ability to augment pre-trained generative AI models with their own data.
  • Integrate frameworks such as open source LangChain and LlamaIndex and use them so developers can access LLMs from partners.
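The goals above can be sketched as an Atlas aggregation query. This is an illustrative sketch only: the index name, collection fields and example embedding below are hypothetical, and the `$vectorSearch` stage shape follows MongoDB's Atlas Vector Search documentation rather than details from the announcement.

```python
# Hypothetical Atlas Vector Search query, built as a plain aggregation
# pipeline. The index name ("embedding_index"), the "embedding" field
# and the query vector are illustrative assumptions.

query_embedding = [0.12, -0.07, 0.33]  # in practice, produced by an embedding model

pipeline = [
    {
        "$vectorSearch": {
            "index": "embedding_index",      # vector index defined on the collection (assumed name)
            "path": "embedding",             # document field that stores the vector
            "queryVector": query_embedding,  # vector representing the user's query
            "numCandidates": 100,            # candidates scanned before final ranking
            "limit": 5,                      # top-k documents returned
        }
    },
    # Return the title plus the similarity score computed by the search.
    {"$project": {"title": 1, "score": {"$meta": "vectorSearchScore"}}},
]

# Against a live Atlas cluster this would run with pymongo as:
# results = client["shop"]["products"].aggregate(pipeline)
```

Because the pipeline is ordinary aggregation syntax, it slots alongside existing `$match` and `$project` stages, which is the "no specialized database" point MongoDB is making.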

With Atlas Stream Processing, MongoDB is looking to do the following:

  • Provide developers with real-time streaming data from IoT devices, browsing and inventory feeds to create real-time experience and optimize on the fly.
  • Leverage streaming data without specialized programming languages, APIs and drivers.
  • Give developers one interface to extract insights from streaming data across multiple data types, connectors and technologies.
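Atlas Stream Processing reuses the same aggregation syntax for continuous queries. The sketch below is an assumption-laden illustration: the connection names, topic, and database/collection names are hypothetical, and the `$source`/`$merge` stage shapes are based on MongoDB's Atlas Stream Processing documentation, not on the announcement itself.

```python
# Hypothetical continuous pipeline: read a Kafka topic, filter events
# in flight, and continuously write matches into an Atlas collection.
# All names here ("kafka_orders", "atlas_sink", "retail", etc.) are
# illustrative assumptions.

stream_pipeline = [
    # Read a high-velocity event stream (e.g. a Kafka topic).
    {"$source": {"connectionName": "kafka_orders", "topic": "orders"}},
    # Standard aggregation operators work on the stream in flight.
    {"$match": {"status": "shipped"}},
    # Continuously materialize results into an Atlas collection.
    {
        "$merge": {
            "into": {
                "connectionName": "atlas_sink",
                "db": "retail",
                "coll": "shipped_orders",
            }
        }
    },
]
```

The point of the design is the middle stage: developers filter and transform streams with the same `$match`-style operators they already use on stored data, rather than a separate streaming language.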

While Atlas Vector Search and Atlas Stream Processing were the headliners, MongoDB had a series of other launches. Here's the breakdown.

  • MongoDB Atlas Search Nodes give developers dedicated resources so enterprises can scale search workloads independent of the database.
  • MongoDB Time Series collections provide options to modify data that has already been ingested. Time Series collections will also improve storage efficiency and query speeds.
  • The company is adding Microsoft Azure support to MongoDB Atlas Online Archive and Atlas Data Federation to go with Amazon Web Services. Support for Microsoft Azure Blob Storage means MongoDB customers can work with Azure and AWS datasets.
  • MongoDB launched MongoDB Relational Migrator, a tool that streamlines the process of migrating applications and legacy databases.
  • Google Cloud Vertex AI LLMs will integrate with MongoDB Atlas so developers can use Google Cloud foundational models across MongoDB Atlas Vector Search.
  • The company outlined MongoDB Atlas for Industries, which is a set of integrated tools for vertical use cases. The first industry targeted by MongoDB Atlas for Industries is financial services.
  • MongoDB also outlined additional programming language support for deploying MongoDB Atlas on AWS, the Kotlin Driver for MongoDB for server-side applications and more streamlined functionality for Kubernetes and Python.

Bottom line: MongoDB sent the message that developers can leverage the Atlas platform for generative AI capabilities. Henschen said:

"I think the 5% of companies that are innovators and the next 20% to 25% of companies that are fast followers will be the ones that are most interested in these features. It promises to make MongoDB stickier for developers at these companies as they now know they can turn to MongoDB, a tool they already know and love, for help in developing generative AI capabilities."

Amazon vs. Walmart: 8 innovation takeaways

The Amazon vs. Walmart battle is one of the great American business case studies happening in real time. DisrupTV caught up with Jason Del Rey, author of Winner Sells All, about his reporting on Amazon and Walmart for his book.

Here's a look at the takeaways from the DisrupTV interview, which starts at the 20-minute mark:

  1. "(Walmart) is one of the greatest case studies in the innovator's dilemma that we’ve ever seen in modern business history," said Del Rey. Amazon and Walmart are the two biggest private sector employers in the US and affect our lives in so many ways. "But the rivalry has impacted each other's decisions," said Del Rey.
  2. Walmart CEO Doug McMillon isn't a risk taker but recognizes the company had to take risks to survive. Del Rey said McMillon in an interview said Walmart is on the right path toward transformation, but it took years to get there. Walmart initially thought that Amazon and e-commerce wouldn't be a big threat.
  3. How did Walmart miss Amazon? Part of the Walmart blind spot toward Amazon was arrogance but a lot of it was incentives and how brick and mortar managers didn't want to cannibalize physical sales. "The one thing I learned is that incentives really matter at a business and how different units interact for better or worse," said Del Rey.
  4. Top down culture changes. Del Rey noted that McMillon has changed Walmart culture from the top and Jeff Bezos clearly drove Amazon. What's changing now is that Bezos has handed off the CEO role to Andy Jassy. Del Rey added that the Jassy tenure has been marked by cost cutting and diving into the retail business. "Where there is more of a difference is with the cost cutting. I don't know if Amazon under Jeff Bezos would have had it in them to pull back as harshly," said Del Rey.
  5. What do Walmart and Amazon have in common? Del Rey said Amazon's leadership principles about customer focus and a bias for action came from Walmart. Each company has strayed at different points. "In early 2000s, Walmart got fat on its profits and success," said Del Rey. Today's Amazon added hundreds of thousands of employees and lost its bias toward action and is now focusing operations.
  6. Healthcare potential. Both Walmart and Amazon are targeting healthcare because there's a customer need and the profit margins are better, said Del Rey. "Both of them have been involved in this space in some way," said Del Rey, who noted that both companies have also dueled over healthcare acquisitions. "Both have had failures over the years, but both are really giving healthcare a go."
  7. Will it always be Amazon and Walmart as a duopoly? Del Rey said the competition between the two retail giants is good for competition, but "my fear is competition alone will not be enough." He added that it would be great to see a new company delivering convenience and a serious No. 3 rival to Walmart and Amazon. Del Rey added that Target gets overlooked, but it's more likely that a currently underestimated rival or adjacent player like Shopify will be a threat.
  8. What's the follow up? Del Rey had to stop writing as the generative AI craze took off. Another storyline for the future will be how Amazon and Walmart expand into India.

AI Regulation, Culture Transformation, Google Talk | ConstellationTV Episode 60

ConstellationTV hits episode 60! 🎉 Tune into this segment and you'll get...

  • 00:00 - Introduction with co-hosts Holger Mueller and Liz Miller.
  • 00:56 - #tech news updates with Liz Miller and Holger Mueller around #ai regulation, #transparency, and more.
  • 11:19 - An interview with Avaya CEO Alan Masarek about Avaya's transformation and its firm foundation of #culture that's been crucial to success.
  • 20:22 - Analysis from Holger and Doug Henschen about Google Talk 2023, and the direction Google is heading with its products and services.
  • 31:55 - Classic CRTV bloopers, this week Liz describes her approach to college dating...

Subscribe to our YouTube channel and never miss an episode! https://lnkd.in/eGCDxfXE

Watch on ConstellationTV: https://www.youtube.com/embed/p2tP23UoCKg

Hyundai Motor's innovation strategy: What we can learn

Hyundai Motor said it will expand its electric vehicle production and outlined a new strategy called the "Hyundai Motor Way," but the most interesting items had nothing to do with automobiles.

Today's Hyundai is best known for its Hyundai, Kia and Genesis brands. Tomorrow's Hyundai may be better known for autonomous vehicles, flying cars and robots that perform a variety of functions.

Hyundai held a "2023 CEO Investor Day" in Seoul, and the broad strategy highlights how the company plans to innovate away from internal combustion engines and become a "smart mobility solution provider." See: Inside the Continuum of Growth and Innovation

Here's what we can learn about innovation from Hyundai's big plan for 2032.

Innovation requires long-term planning. Hyundai outlined a 10-year investment plan to electrify and develop multiple businesses. Hyundai plans to invest $85 billion over 10 years. About $27 billion of that total will go toward electrification, which will feature a value chain that also serves as a bridge to the future.

Software is everything. Hyundai updated its software defined vehicle (SDV) strategy and plans to build an app ecosystem and an open operating system that will cover everything from autonomous driving, over-the-air updates and other items.

Invest in startups that can advance the SDV strategy. Hyundai plans to use Hyundai-backed startup 42dot as its global software base. Hyundai said:

42dot will start developing its own software platform called Titan by 2024 and validate the platform by 2026 in order to launch an autonomous driving purpose-built vehicle (PBV) business after 2027 with the aim of turning a profit after 2028, according to a phased technology development roadmap.

From there, 42dot will develop new businesses based on PBVs and its software for the mobility and logistics industries. The move makes sense since 42dot can run faster than Hyundai as a whole.

Robotics is a play on the future and requires some patience. Hyundai acquired Boston Dynamics in 2021 and has built out its Robotics Lab. For the market to expand, Hyundai plans the following:

  • Development of cognitive judgement and natural language technology.
  • Spatial navigation and movement technologies.
  • A robot management system that can lead to motion sensing wearable robots as well as new models for multiple purposes.

Mobility will also include air travel so partner up. Hyundai is betting that advanced air mobility will be key to developing cities of the future. Infrastructure for flying vehicles will require partnerships with the likes of Microsoft, Rolls-Royce, Hyundai units and other partners.

Today's sustainability plays may be different tomorrow (think hydrogen). Hyundai plans to become carbon neutral by using hydrogen, including biogas and waste-plastic based hydrogen, to power its EV production facilities and surrounding infrastructure. The company said it will present its hydrogen business vision at CES 2024.

PegaWorld iNspire 2023: Wrapping things up with Liz Miller

Constellation analyst Liz Miller gives her analysis and key takeaways from PegaWorld iNspire 2023.

Watch on ConstellationTV: https://www.youtube.com/embed/NTZup5uvPsI

HPE launches GreenLake for LLMs, aims to democratize Cray supercomputing for AI training

Hewlett Packard Enterprise is mobilizing its Cray supercomputing knowledge to launch HPE GreenLake for Large Language Models (LLMs) as enterprises look for options to privately train, tune and deploy AI.

At HPE Discover, HPE outlined its plans for HPE GreenLake for LLMs, which is expected to be generally available by the end of the year. The move is designed to address enterprise concerns about security, data privacy and compliance for generative AI deployments.

Vendors in recent weeks have moved to address corporate data concerns as Salesforce launched an AI trust layer and Oracle said it will offer a cloud service to keep corporate data protected.

HPE is taking a hybrid and private cloud approach to deploying AI with a focus on industries including healthcare and life sciences, financial services, manufacturing and transportation. HPE GreenLake for LLMs is also running on infrastructure powered by nearly 100% renewable energy, which could be critical given enterprises are increasingly tracking carbon footprints.

HPE GreenLake for LLMs is well timed since Constellation Research analyst Dion Hinchcliffe recently published a report outlining how CXOs are moving to private cloud models for cost savings.

According to HPE CEO Antonio Neri, HPE GreenLake for LLMs is the first of a series of AI applications planned. "HPE is making AI, once the domain of well-funded government labs and the global cloud giants, accessible to all by delivering a range of AI applications, starting with large language models," he said.

"We are experiencing an exponential growth of data everywhere, but only 50% of it is used for decisions. The reality is we have been data rich and insight poor. AI and generative AI has accelerated and we now have the ability to harness the power of our data...You don't have to spend millions to acquire supercomputing infrastructure." 

HPE GreenLake for LLMs will be delivered with Aleph Alpha, a German AI startup that will provide a proven LLM for use cases requiring text and image processing and analysis.

With the launch of HPE GreenLake for LLMs, HPE is looking to expand its market beyond HPC users to R&D innovators, Chief AI Officers and CXOs who are looking to develop models faster and more efficiently. HPE GreenLake for LLMs will be browser based with role-specific tooling.

HPE noted that it is accepting orders now for HPE GreenLake for LLMs with availability at the end of 2023 in North America. Europe will follow in early 2024.

Mobilizing Cray

HPE acquired supercomputing giant Cray in 2019 in a move that propelled the company to the top of the supercomputer rankings.

With that purchase of Cray, HPE has been able to scale AI training and simulation workloads across CPUs and GPUs at once.

And now that generative AI is likely to spread across the enterprise and be democratized, HPE GreenLake for LLMs is able to leverage that Cray infrastructure.

HPE GreenLake for LLMs will be available on-demand and powered by HPE Cray XD supercomputers as well as the HPE Cray Programming Environment, a software suite to optimize HPC and AI applications. There's also a set of tools for developing, porting, debugging and tuning code.

According to HPE, Luminous, the pre-trained LLM from Aleph Alpha, was tuned for multiple use cases for banks, hospitals and law firms on HPE GreenLake for LLMs.

HPE GreenLake for LLMs will also use HPE's AI software, including HPE Machine Learning Development Environment and HPE Machine Learning Data Management Software.

In addition, HPE GreenLake for LLMs will run on supercomputers initially hosted in QScale’s Quebec colocation facility that provides power from 99.5% renewable sources.

To round out the AI push, HPE expanded its inferencing compute offerings. New HPE ProLiant Gen11 servers are optimized for AI workloads, using advanced GPUs. The HPE ProLiant DL380a and DL320 Gen11 servers boost AI inference performance by more than 5X over previous models, said HPE. That performance comparison is based on image generative AI performance of NVIDIA L40 (TensorRT 8.6.0) versus T4 (TensorRT 8.5.2), Stable diffusion v2.1 (512x512).

The Constellation Research take

Constellation Research analyst Holger Mueller said HPE left a few open questions to ponder. For instance, what is the connectivity between the data and Cray systems? What happens if HPE machines are on-premises and data is in the cloud?

Mueller added that the real cost savings may be in the ProLiant systems optimized for AI workloads. He added that HPE will compete with Oracle Cloud at Customer as well as IBM and others.

Andy Thurai, who covers AIOps and AI at Constellation Research, said HPE GreenLake for LLMs appears to be going "after the mature AI workloads NOT the innovative workloads."

He said:

"In order for any enterprise to experiment on AI models they need strong ecosystems, data availability, skills and immediate need. HPE doesn’t have that right now. Many organizations are expected to use hyperscale cloud providers and decide if it is going to worthwhile to move to HPE. That can be good and bad. Good: Enterprises already know what they want. Bad: Volume and TAM will be very limited to those customers. Public clouds will come up with a mechanism to keep the initial innovative cloud workloads from moving out. It will be an interesting battle to see."

Thurai also noted the following:

  • HPE GreenLake for LLMs is a unique effort outside of the public cloud vendors.
  • Focusing on domain specific use cases in specific industries can bring value that's hard to replicate in public clouds.
  • Strong governance and control of sensitive models and data, no data egress fees, sustainability, and lower costs are all strong claims worth investigation by enterprises.
  • HPE is hoping its partnership with Aleph Alpha will show other AI players that they can train private LLMs just as easily.


HPE expands GreenLake services amid private cloud renaissance

Hewlett Packard Enterprise added private cloud enhancements to HPE GreenLake, the company's cloud platform, and built out services for backup, machine learning and network as a service.

At HPE Discover, the company outlined a bevy of GreenLake additions. The timing is notable since Constellation Research analyst Dion Hinchcliffe recently published a report outlining how CXOs are moving to private cloud models for cost savings. In a nutshell, public cloud providers haven't been passing on savings and encouraging enterprises to move workloads such as AI on premises.

For stable, ongoing workloads with heavy data movement, the public cloud can be more expensive.

HPE CEO Antonio Neri said during a keynote that GreenLake is enabling hybrid and private cloud choice. Indeed, HPE expanded partnerships with Equinix and AWS. During his HPE Discover keynote, Neri said enterprises will be edge and cloud focused and data driven. He also touted HPE's supercomputing prowess and noted that "only supercomputing can accelerate AI."

He added:

"We are experiencing an exponential growth of data everywhere, but only 50% of it is used for decisions. The reality is we have been data rich and insight poor. AI and generative AI has accelerated and we now have the ability to harness the power of our data."

Among the HPE announcements at Discover:

  • HPE closed its OpsRamp acquisition and launched OpsRamp integration with HPE GreenLake's platform and sustainability dashboard. HPE GreenLake customers can use OpsRamp for observability and automation across multi-cloud assets.
  • OpsRamp’s sustainability dashboard will provide visibility into multi-vendor and multi-cloud IT assets.
  • HPE is adding HPE NonStop Development Environment delivered as an Amazon Machine Image (AMI) and HPE Fraud Risk Management as SaaS in AWS Marketplace.
  • HPE's Machine Learning Development Environment is available through HPE GreenLake for High Performance Computing (HPC).
  • HPE built out its network as a service (NaaS) offering with the addition of the HPE Aruba Networking CX 8000, HPE Aruba Networking 9000 and HPE Aruba Networking 10000 series.
  • HPE GreenLake for Private Cloud Business Edition enables customers to spin up virtual machines (VMs) across hybrid clouds on demand and self-manage their private cloud.
  • HPE GreenLake for Private Cloud Enterprise added capabilities for edge use cases and the ability to connect to distributed IT locations.
  • HPE expanded its portfolio for private cloud with partnerships with Equinix. HPE launched HPE GreenLake for Private Cloud Enterprise and HPE GreenLake for Private Cloud Business Edition housed at Equinix's global data centers.
  • HPE will provide pre-configured and tested cloud modules optimized for VMware Cloud Foundation.

Databricks launches Lakehouse Apps, aims to be development platform

Databricks launched Lakehouse Apps, an effort by the data and AI company to become more of a platform for cloud and data-driven apps.

The news comes as Snowflake, MongoDB and Databricks are all holding events in the next few days. There's a race to be a platform for data and AI applications amid generative AI and large language models. Data vendors are trying to be the "locus of modern, cloud/data-driven app development," according to Constellation Research analyst Doug Henschen.

Henschen added:

"Snowflake and MongoDB are also encouraging customers to think of and use their products as platforms for building applications. So last year Snowflake acquired Streamlit, a company that offered a framework for building data applications, and it introduced lightweight transactional capabilities, which had been a bit of a gap. Similarly, MongoDB, which already had plenty of traction with developers, significantly increased its analytical capabilities, which was a bit of a gap. Databricks has announced several development partners, and I’m assuming we’ll see more in the way of native services from Databricks to meet transactional requirements."

Databricks said Lakehouse Apps are designed to enable customers to access applications that run within their Lakehouse instance with their own data. Lakehouse Apps will be available in the Databricks Marketplace in preview in the coming year. Databricks Marketplace will be generally available at Databricks' Data + AI Summit next week.

Databricks added that it will offer AI model sharing within Databricks Marketplace and curate various models for common use cases.

Key points about Lakehouse Apps include:

  • Lakehouse Apps can integrate with Databricks' customer data and use Databricks services with single sign-on.
  • Lakehouse Apps inherit the same security, privacy and compliance controls as Databricks.
  • Early development partners for Lakehouse Apps include Retool, Posit, Kumo.ai and Lamini. Those companies are focused on data science, AI and LLMs.
