Results

AI Regulation, Culture Transformation, Google Talk | ConstellationTV Episode 60

ConstellationTV hits episode 60! 🎉 Tune into this segment and you'll get...

  • 00:00 - Introduction with co-hosts Holger Mueller and Liz Miller.
  • 00:56 - #tech news updates with Liz Miller and Holger Mueller around #ai regulation, #transparency, and more.
  • 11:19 - An interview with Avaya CEO Alan Masarek about Avaya's transformation and its firm foundation of #culture that's been crucial to success.
  • 20:22 - Analysis from Holger and Doug Henschen about Google Talk 2023, and the direction Google is heading with its products and services.
  • 31:55 - Classic CRTV bloopers; this week, Liz describes her approach to college dating...

Subscribe to our YouTube channel and never miss an episode! https://lnkd.in/eGCDxfXE

On ConstellationTV: https://www.youtube.com/embed/p2tP23UoCKg

Hyundai Motor's innovation strategy: What we can learn

Hyundai Motor said it will expand its electric vehicle production and outlined a new strategy called the "Hyundai Motor Way," but the most interesting items had nothing to do with automobiles.

Today's Hyundai is best known for its Hyundai, Kia and Genesis brands. Tomorrow's Hyundai may be better known for autonomous vehicles, flying cars and robots that perform a variety of functions.

Hyundai held a "2023 CEO Investor Day" in Seoul, and the broad strategy highlights how the company plans to innovate away from internal combustion engines and become a "smart mobility solution provider." See: Inside the Continuum of Growth and Innovation

Here's what we can learn about innovation from Hyundai's big plan for 2032.

Innovation requires long-term planning. Hyundai outlined a 10-year investment plan to electrify and develop multiple businesses. Hyundai plans to invest $85 billion over 10 years. About $27 billion of that total will go toward electrification, which will feature a value chain that also serves as a bridge to the future.

Software is everything. Hyundai updated its software defined vehicle (SDV) strategy and plans to build an app ecosystem and an open operating system covering everything from autonomous driving to over-the-air updates.

Invest in startups that can advance the SDV strategy. Hyundai plans to use Hyundai-backed startup 42dot as its global software base. Hyundai said:

42dot will start developing its own software platform called Titan by 2024 and validate the platform by 2026 in order to launch an autonomous driving purpose-built vehicle (PBV) business after 2027 with the aim of turning a profit after 2028, according to a phased technology development roadmap.

From there, 42dot will develop new businesses based on PBVs and its software for the mobility and logistics industries. The move makes sense since 42dot can run faster than Hyundai as a whole.

Robotics is a play on the future and requires some patience. Hyundai acquired Boston Dynamics in 2021 and has built out its Robotics Lab. For the market to expand, Hyundai plans the following:

  • Development of cognitive judgement and natural language technology.
  • Spatial navigation and movement technologies.
  • A robot management system that can lead to motion sensing wearable robots as well as new models for multiple purposes.

Mobility will also include air travel, so partner up. Hyundai is betting that advanced air mobility will be key to developing cities of the future. Infrastructure for flying vehicles will require partnerships with the likes of Microsoft, Rolls-Royce and Hyundai's own units.

Today's sustainability plays may be different tomorrow (think hydrogen). Hyundai plans to become carbon neutral by using hydrogen, including biogas and waste-plastic based hydrogen, to power its EV production facilities and surrounding infrastructure. The company said it will present its hydrogen business vision at CES 2024.

PegaWorld iNspire 2023: Wrapping things up with Liz Miller

Constellation analyst Liz Miller gives her analysis and key takeaways from PegaWorld iNspire 2023.

On ConstellationTV: https://www.youtube.com/embed/NTZup5uvPsI

HPE launches GreenLake for LLMs, aims to democratize Cray supercomputing for AI training

Hewlett Packard Enterprise is mobilizing its Cray supercomputing knowledge to launch HPE GreenLake for Large Language Models (LLMs) as enterprises look for options to privately train, tune and deploy AI.

At HPE Discover, HPE outlined its plans for HPE GreenLake for LLMs, which is expected to be generally available by the end of the year. The move is designed to address enterprise concerns about security, data privacy and compliance for generative AI deployments.

Vendors in recent weeks have moved to address corporate data concerns as Salesforce launched an AI trust layer and Oracle said it will offer a cloud service to keep corporate data protected.

HPE is taking a hybrid and private cloud approach to deploying AI with a focus on industries including healthcare and life sciences, financial services, manufacturing and transportation. HPE GreenLake for LLMs is also running on infrastructure powered by nearly 100% renewable energy, which could be critical given enterprises are increasingly tracking carbon footprints.

HPE GreenLake for LLMs is well timed since Constellation Research analyst Dion Hinchcliffe recently published a report outlining how CXOs are moving to private cloud models for cost savings.

According to HPE CEO Antonio Neri, HPE GreenLake for LLMs is the first of a series of AI applications planned. "HPE is making AI, once the domain of well-funded government labs and the global cloud giants, accessible to all by delivering a range of AI applications, starting with large language models," he said.

"We are experiencing an exponential growth of data everywhere, but only 50% of it is used for decisions. The reality is we have been data rich and insight poor. AI and generative AI has accelerated and we now have the ability to harness the power of our data...You don't have to spend millions to acquire supercomputing infrastructure." 

HPE GreenLake for LLMs will be delivered with Aleph Alpha, a German AI startup that will provide a proven LLM for use cases requiring text and image processing and analysis.

With the launch of HPE GreenLake for LLMs, HPE is looking to expand its market beyond HPC users to R&D innovators, Chief AI Officers and CXOs who are looking to develop models faster and more efficiently. HPE GreenLake for LLMs will be browser based with role-specific tooling.

HPE noted that it is accepting orders now for HPE GreenLake for LLMs with availability at the end of 2023 in North America. Europe will follow in early 2024.

Mobilizing Cray

HPE acquired supercomputing giant Cray in 2019 in a move that propelled the company to the top of the supercomputer rankings.

With that purchase of Cray, HPE has been able to scale AI training and simulation workloads across CPUs and GPUs at once.

And now that generative AI is likely to spread across the enterprise and be democratized, HPE GreenLake for LLMs is able to leverage that Cray infrastructure.

HPE GreenLake for LLMs will be available on-demand and powered by HPE Cray XD supercomputers as well as the HPE Cray Programming Environment, a software suite to optimize HPC and AI applications. There's also a set of tools for developing, porting, debugging and tuning code.

According to HPE, Luminous, the pre-trained LLM from Aleph Alpha, was tuned for multiple use cases for banks, hospitals and law firms on HPE GreenLake for LLMs.

HPE GreenLake for LLMs will also use HPE's AI software, including the HPE Machine Learning Development Environment and HPE Machine Learning Data Management Software.

In addition, HPE GreenLake for LLMs will run on supercomputers initially hosted in QScale’s Quebec colocation facility that provides power from 99.5% renewable sources.

To round out the AI push, HPE expanded its inferencing compute offerings. New HPE ProLiant Gen11 servers are optimized for AI workloads, using advanced GPUs. The HPE ProLiant DL380a and DL320 Gen11 servers boost AI inference performance by more than 5X over previous models, said HPE. That performance comparison is based on image generative AI performance of NVIDIA L40 (TensorRT 8.6.0) versus T4 (TensorRT 8.5.2), Stable diffusion v2.1 (512x512).

The Constellation Research take

Constellation Research analyst Holger Mueller said HPE left a few open questions to ponder. For instance, what is the connectivity between the data and Cray systems? What happens if HPE machines are on-premises and data is in the cloud?

Mueller added that the real cost savings may be in the ProLiant systems optimized for AI workloads. He added that HPE will compete with Oracle Cloud at Customer as well as IBM and others.

Andy Thurai, who covers AIOps and AI at Constellation Research, said HPE GreenLake for LLMs appears to be going "after the mature AI workloads NOT the innovative workloads."

He said:

"In order for any enterprise to experiment on AI models they need strong ecosystems, data availability, skills and immediate need. HPE doesn’t have that right now. Many organizations are expected to use hyperscale cloud providers and decide if it is going to worthwhile to move to HPE. That can be good and bad. Good: Enterprises already know what they want. Bad: Volume and TAM will be very limited to those customers. Public clouds will come up with a mechanism to keep the initial innovative cloud workloads from moving out. It will be an interesting battle to see."

Thurai also noted the following:

  • HPE GreenLake for LLMs is a unique effort outside of the public cloud vendors.
  • Focusing on domain specific use cases in specific industries can bring value that's hard to replicate in public clouds.
  • Strong governance and control of sensitive models and data, no data egress fees, sustainability, and lower costs are all strong claims worth investigation by enterprises.
  • HPE is hoping its partnership with Aleph Alpha will show other AI players that they can train private LLMs just as easily.


HPE expands GreenLake services amid private cloud renaissance

Hewlett Packard Enterprise added private cloud enhancements to HPE GreenLake, the company's cloud platform, and built out services for backup, machine learning and network as a service.

At HPE Discover, the company outlined a bevy of GreenLake additions. The timing is notable since Constellation Research analyst Dion Hinchcliffe recently published a report outlining how CXOs are moving to private cloud models for cost savings. In a nutshell, public cloud providers haven't been passing on savings, which is encouraging enterprises to move workloads such as AI on premises.

For stable, always-on workloads with high data movement, the public cloud can be more expensive.

HPE CEO Antonio Neri said during a keynote that GreenLake is enabling hybrid and private cloud choice. Indeed, HPE expanded partnerships with Equinix and AWS. During his HPE Discover keynote, Neri said enterprises will be edge and cloud focused and data driven. He also touted HPE's supercomputing prowess and noted that "only supercomputing can accelerate AI."

He added:

"We are experiencing an exponential growth of data everywhere, but only 50% of it is used for decisions. The reality is we have been data rich and insight poor. AI and generative AI has accelerated and we now have the ability to harness the power of our data."

Among the HPE announcements at Discover:

  • HPE closed its OpsRamp acquisition and launched OpsRamp integration with HPE GreenLake's platform and sustainability dashboard. HPE GreenLake customers can use OpsRamp for observability and automation across multi-cloud assets.
  • OpsRamp’s sustainability dashboard will provide visibility into multi-vendor and multi-cloud IT assets.
  • HPE is adding HPE NonStop Development Environment delivered as an Amazon Machine Image (AMI) and HPE Fraud Risk Management as SaaS in AWS Marketplace.
  • HPE's Machine Learning Development Environment is available through HPE GreenLake for High Performance Computing (HPC).
  • HPE built out its network as a service (NaaS) offering with the addition of the HPE Aruba Networking CX 8000, HPE Aruba Networking 9000, and HPE Aruba Networking 10000 series.
  • HPE GreenLake for Private Cloud Business Edition enables customers to spin up virtual machines (VMs) across hybrid clouds on demand and self-manage their private cloud.
  • HPE GreenLake for Private Cloud Enterprise added capabilities for edge use cases and the ability to connect to distributed IT locations.
  • HPE expanded its private cloud portfolio through a partnership with Equinix. HPE launched HPE GreenLake for Private Cloud Enterprise and HPE GreenLake for Private Cloud Business Edition housed in Equinix's global data centers.
  • HPE will provide pre-configured and tested cloud modules optimized for VMware Cloud Foundation.

Databricks launches Lakehouse Apps, aims to be development platform

Databricks launched Lakehouse Apps, an effort by the data and AI company to become more of a platform for cloud and data-driven apps.

The news comes as Snowflake, MongoDB and Databricks are all holding events in the next few days. There's a race to be a platform for data and AI applications amid generative AI and large language models. Data vendors are trying to be the "locus of modern, cloud/data-driven app development," according to Constellation Research analyst Doug Henschen.

Henschen added:

"Snowflake and MongoDB are also encouraging customers to think of and use their products as platforms for building applications. So last year Snowflake acquired Streamlit, a company that offered a framework for building data applications, and it introduced lightweight transactional capabilities, which had been a bit of a gap. Similarly, MongoDB, which already had plenty of traction with developers, significantly increased its analytical capabilities, which was a bit of a gap. Databricks has announced several development partners, and I’m assuming we’ll see more in the way of native services from Databricks to meet transactional requirements."

Databricks said Lakehouse Apps are designed to enable customers to access applications that run within their Lakehouse instance with their own data. Lakehouse Apps will be available in the Databricks Marketplace in preview in the coming year. Databricks Marketplace will be generally available at Databricks' Data + AI Summit next week.

Databricks added that it will offer AI model sharing within Databricks Marketplace and curate various models for common use cases.

Key points about Lakehouse Apps include:

  • Lakehouse Apps can integrate with Databricks' customer data and use Databricks services with single sign-on.
  • Lakehouse Apps inherit the same security, privacy and compliance controls as Databricks.
  • Early development partners for Lakehouse Apps include Retool, Posit, Kumo.ai and Lamini. Those companies are focused on data science, AI and LLMs.


AI is everywhere including your supermarket, homebuilder and soup

Generative AI is the hot topic on earnings calls and investor presentations and it's clear we're in a boom for technology. AI is becoming so pervasive that the topic is appearing in industries you wouldn't consider high tech.

Are these real use cases or generative AI washing? It's too early to tell. What is clear is that enterprises are actively checking out AI and implementing it to boost returns. Here's a tour of what a few companies are saying about AI.

Kroger: AI fuels digital engagement among customers

Kroger said that AI will play a role in digital engagement, personalization and summarizing customer data sets including surveys and customer service logs.

Rodney McMullen, CEO of Kroger, said on the company’s first quarter earnings conference call:

“By applying our data and AI-based personalization, we can better understand what truly matters to our customers and deliver more targeted and effective experiences.

As customers' digital engagement increases, we have new and more efficient channels to present the most relevant products and the right promotions at the right times, no matter where and how customers choose to shop with us.”

McMullen said Kroger will work to evaluate new use cases throughout its business. The supermarket chain plans to use search algorithms and generative AI to improve personalized recommendations and substitution accuracy.

He said:

"We are already piloting several large language models to summarize customer database sets. By applying AI to customer surveys and customer service logs, our team can analyze and categorize them in minutes versus days before. This allows the business to react to customer feedback more quickly and accurately, and then reflect these learnings in the customer experience."

McMullen said that Kroger is also looking to simplify work for associates by using AI to optimize store orders, reduce out of stocks and improve inventory management.

Lennar: Homebuilding, marketing meets AI

Stuart Miller, Executive Chairman of Lennar, said the company has deployed an AI-driven digital marketing and dynamic pricing model to drive order volume and manage deliveries. Miller made the comments on Lennar’s fiscal second quarter earnings call.

The system, which Miller called the Lennar Machine, helped the company backfill cancellations in its most recent quarter and boost deliveries. The Lennar Machine also provides the homebuilder's production group with predictability.

Miller said if you look at the Lennar Machine, "I think you'll get a better sense of our strategy, and you just might start to imagine where the often talked about AI might find its way into the sometimes-stodgy home building industry and improve productivity." Miller added that Lennar is looking to use AI to enable the company to focus on selling homes in inventory: Lennar is aiming to clear homes that are complete and closable rather than sell homes that are quarters away from delivery. Miller said:

"Clearing the homes that are complete and closable rather than selling homes that are many quarters in the future is exactly what drives cash flow and we're focused on this part of our business every day."

Miller said Lennar Machine is part of a data-driven approach to homebuilding and the company has made progress. On the other hand, Miller noted that Lennar doesn't "want to get out over our skis."

Nevertheless, Lennar said that the home building process is data driven and can leverage generative AI.

"(Homebuilding is) a very integrated set of systems that is dependent on feedback loops. And any time that you find a process that becomes data-driven and the data improves to the point that it's actually relevant, at some point, there are large learning models that can be helpful in enhancing productivity."

Campbell Soup Co.: A spicy AI tale

Campbell Soup scored a product win when it launched Chunky Ghost Pepper Chicken Noodle and generated social media buzz.

That soup was reportedly AI driven. According to NJBiz, Chunky Ghost Pepper Chicken Noodle was created with the help of Campbell Soup's Insight Engine. The Insight Engine analyzes more than 300 billion data points annually to find trends in social media, restaurants and menus.

With Insight Engine, Campbell Soup has been able to speed up its product development cycle and feed the product pipeline with innovation. The Insight Engine initiative started years ago but is now paying dividends.


Live from HPE Greenlake: Interview with Vishal Lall, SVP & GM, HPE GreenLake Cloud Services Solutions

"What is the right #workload placement strategy?" This is the main question customers are asking about #data storage in the #cloud, says Vishal Lall, GM of Hewlett Packard Enterprise Software and Greenlake Cloud Solution Services.

Enterprises are shifting from storing all #data workloads in the cloud to a #hybrid approach for #optimization. To achieve customer success, Vishal outlines the "holy grail" of optimizing the cloud workload mix for performance, economics, and agility...

Watch the full interview to learn how multi-cloud #hybrid strategies are finally coming to fruition, and how to make the most of #AI and #ML technologies relating to cloud.

On ConstellationTV: https://www.youtube.com/embed/7JWPbp85f6U

Private Cloud a Compelling Option for CIOs: Insights from New Research

As the world finds itself hurtling ever further into the digital age, the race toward the public cloud seems to be a foregone conclusion for most IT leaders. The public cloud, with its allure of almost infinitely scalable resources and minimal maintenance, has long been heralded as the inevitable destination for the vast majority of IT workloads. As a long-time proponent of the public cloud, I've long understood its many benefits (elasticity, modern architecture, quick provisioning, and a bulwark against technical debt).

There has been an undeniably gilded narrative surrounding the public cloud in recent years, viewing it as a digital knight in shining armor that has come to liberate our data from the parochial and inefficient clutches of traditional data centers.

The Private/Public Cloud Decision Point Has Changed

But what if the story turned out to be not quite as simple as that? What if, like all good journeys of self-discovery, there were unexpected plot twists as the real story actually emerged? This is in fact the case today when it comes to a fuller and clearer view of private cloud vs. public cloud. I discovered this during a recent study I embarked upon, involving in-depth interviews with 22 Chief Information Officers (CIOs) and senior IT executives in the spring of 2023, uncovering an unexpected protagonist: The private cloud.

The New 2023 Cloud Reality: A Rebalancing Between Private and Public, by Dion Hinchcliffe

This study sought to venture beyond the buzzwords and industry assumptions to dig into the reality of cloud solutions in today's organizations, exploring aspects ranging from cost and control to data movement and the handling of high-cost, always-on workloads. The findings, as seen in the summary below, challenge much of the prevailing wisdom about public cloud. They paint a more nuanced picture, one where the private cloud, often overlooked in our rush toward the hyperscalers, emerges as a compelling, and in many cases superior, choice.

I've just published a detailed new research report that delves into the study and its findings. It certainly challenges many long-held assumptions and will perhaps, help rewrite the narrative of our IT future. As I noted in a recent industry podcast, "to increase cloud value, it's essential to manage complexity and control costs."

The key findings of the report, for those IT organizations that have begun to proactively incorporate private cloud into their workload mix, include:

  • 50% lower cost
  • 65% more performant
  • Twice as agile for development/DevOps cycles

These are surprising outcomes, and not what I was necessarily expecting to uncover. Private cloud of yore was often a so-called "brownfield" of highly heterogeneous legacy IT, was often not consistently managed or governed, and did not use a modern architecture such as cloud-native. Private cloud was seen as rife with high costs and inefficiency. It was to be avoided if possible. However, data from serious adopters shows that this is often no longer the case. Private cloud today is often as sleek and contemporary as public cloud from a technology and run-time perspective. And it turns out that in particular use cases it can genuinely shine.

Note: These numbers are median figures, and are only for workloads in which private cloud is specifically advantageous.


My video exploration of the private/public cloud workload placement research.

Private Cloud Is Closing the Workload Decision Gap

Thus, we can now see that many of the disadvantages of private cloud have dropped away, while it appears that some of its other attributes have gained in comparison with public cloud. This is especially the case in certain increasingly well-understood yet vital scenarios. The interviewees I spoke with often struggled with public cloud for certain key use cases. The result was costly bills for frequently-on workloads, poorly performing workloads they had trouble optimizing, needs for data residency, poor multi-year cloud budget predictability, or high data egress to support analytics or high bandwidth digital experiences. 

It's important to emphasize that this study did not delve into the causal factors involved, merely what the CIOs and senior IT leaders experienced in terms of benefits as they rebalanced their IT portfolio with a combination of public and private cloud, putting their workloads where they made the most sense for them at the time. I will seek to explore the root causes for these benefits in a future report.

There is no question that some of these results can seem counterintuitive: Don't hyperscalers have higher economies of scale, and shouldn't they therefore be cheaper? Isn't it more agile to be able to spin up instances quickly in public cloud? The findings were that no, for various reasons hyperscalers now appear not to be passing their full cost savings on to customers (perhaps putting them into R&D, marketing, growth, or profit). Meanwhile, FinOps and the cloud operations bureaucracies that have accumulated around cloud usage in organizations put considerable red tape between teams trying to innovate and the ability to easily spin up new cloud instances. There was much less worry about such budget risks with private cloud in this study.

Determining Where Your Cloud Workloads Belong

An IT Decision Heuristic for Cloud Workload Sourcing

In the course of conducting this study, I realized we would also be able to use the rich data from the interviews to create an up-to-date cloud workload heuristic that can help identify when a workload might be a good candidate for private cloud vs. public cloud today. You can see this in the figure above. The hope is that it will serve as a lighthouse in an era of more pragmatic and nuanced adoption of cloud in the enterprise.
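To make the shape of such a heuristic concrete, here is a minimal sketch. It is not the heuristic from the report's figure; the factor names, the equal weighting, and the simple majority-vote rule are all invented for illustration, loosely based on the pain points the interviewees cited (always-on workloads, heavy data egress, data residency, and budget predictability).

```python
# Illustrative workload-placement sketch (invented factors and scoring,
# not the report's actual decision heuristic).
from dataclasses import dataclass

@dataclass
class Workload:
    always_on: bool                   # steady, frequently-on compute
    high_egress: bool                 # heavy data movement out of the cloud
    data_residency: bool              # requirement to keep data in a locale
    needs_budget_predictability: bool # multi-year cost certainty matters
    spiky_demand: bool                # bursty, elastic load favors public cloud

def placement(w: Workload) -> str:
    """Tally signals favoring each side and return the stronger candidate."""
    private_signals = sum([w.always_on, w.high_egress,
                           w.data_residency, w.needs_budget_predictability])
    public_signals = int(w.spiky_demand)
    if private_signals > public_signals:
        return "private cloud candidate"
    return "public cloud candidate"

# A steady, egress-heavy analytics workload with residency requirements:
analytics = Workload(always_on=True, high_egress=True, data_residency=True,
                     needs_budget_predictability=True, spiky_demand=False)
print(placement(analytics))  # private cloud candidate
```

A real version would weight the factors and fold in actual cost data; the point of the sketch is the decision inputs, not the scoring.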

Be sure to read the report itself, with many quotes, more data, and additional color. The path to cloud is not only more complex but more fascinating than we might have first thought. Subscribers to Constellation Research's Research Unlimited subscription can download the new report, The New 2023 Cloud Reality: A Rebalancing Between Private and Public.

Update: HPE has graciously made this report available to the general public for a limited time. You can download a copy here.

New Private Cloud Research for the CIO from HPE

Findings Summary for this Cloud Computing Research

The New 2023 Cloud Reality: A Rebalancing Between Private and Public

My Related Research

Digital Assets in the Cloud for Public Sector CIOs

An Update on IBM Cloud for the CIO

An Oracle NetSuite Roadmap for the CIO and CFO

Digital Transformation Target Platforms: A ShortList

AWS re:Invent 2022: Perspectives for the CIO

The Cloud Reaches an Inflection Point for the CIO

How a Transformation Platform Reimagines Success

Digital Transformation Blueprint for the Office of the CFO

The CIO Must Lead Business Strategy Now

The Strategic New Digital Commerce Category of Product-to-Consumer (P2C) Management

How DataStax is Emerging as a Strategic Anchor in Cloud Data Management


Massive datasets, models need new visualizations, data exploration, says Virtualitics

As data proliferates along with artificial intelligence models, the need to spot data bias, collaborate, prioritize use cases and tell stories becomes more important. The problem? Traditional visualization approaches can't keep up.

Speaking on DisrupTV Episode 326, Michael Amori, CEO of Virtualitics, outlined how 3D visualizations and AI can better navigate massive datasets and make recommendations. Virtualitics aims to explore and visualize massive, interrelated datasets. Its platform integrates across descriptive, diagnostic, predictive and prescriptive analytics platforms as well as cloud companies storing large datasets.

What intelligent data exploration does. Amori said data is growing at an exponential pace and new approaches are needed to find insights that are unbiased. "We are focused on the data exploration process, which to a lot of people sounds like it's an already solved problem but it actually isn't," said Amori. "In a typical data analytics process, you start out preprocessing the data and then explore the data set to figure out what you want to do with it. Then you come up with predictive models using AI. The problem with the traditional way is that humans approach the data set already with a bias and preconceived hypothesis and what they want to see."

Amori said AI can explore the data and tell the humans which hypotheses in the data are worth pursuing. "Humans can decide what path to take as opposed to starting with a path in mind that leaves gems in the data," said Amori.

Generated visualizations and explanations. "AI can handle huge amounts of data and we can take advantage of natural language to really explain what's going on with the data," said Amori. "Data visualizations and explanations can really open up for the user what's going on in the data."

Time to intelligent data exploration. Amori said it typically takes one or two months to configure Virtualitics to an enterprise's data sets and training. "Once they are ready, they can load up a new data set and see all the possibilities and areas of interest," said Amori. Virtualitics takes regular tabular data in columns and rows and turns it into a network graph with explanations generated by AI.
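The general idea of turning tabular data into a network graph can be sketched simply: treat each row as a node and connect rows whose features are similar. The sketch below is a minimal, stdlib-only illustration of that concept; the customer names, features, and similarity threshold are hypothetical, and this is not Virtualitics' actual pipeline or its AI-generated explanations.

```python
# Minimal sketch: rows of tabular data become nodes, and an edge links
# rows whose numeric features are sufficiently similar (cosine similarity).
import math

rows = {                       # hypothetical per-customer feature vectors
    "cust_a": [0.9, 0.1, 0.8],
    "cust_b": [0.85, 0.15, 0.75],
    "cust_c": [0.1, 0.9, 0.2],
}

def similarity(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Build an adjacency list: connect rows above a similarity threshold.
THRESHOLD = 0.95
graph = {name: set() for name in rows}
names = list(rows)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        if similarity(rows[a], rows[b]) >= THRESHOLD:
            graph[a].add(b)
            graph[b].add(a)

print(graph)  # cust_a and cust_b link; cust_c stands alone
```

The payoff of this representation is that graph structure (hubs, clusters, isolated nodes) surfaces relationships that a flat table of rows hides.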

Storytelling and collaboration with data. 3D analytics and visualizations from different end points of data help with storytelling, said Amori. One example would be a company that wants to understand market segmentation and data describing customers. A visualization could explore socioeconomic metrics as well as buying patterns. "There are hundreds of metrics for each of these customers and now you have AI look for patterns in the data," said Amori. "It looks like a network representation of communities of customers across quantitative, categorical and unstructured features. AI can come up with recommendations based on 8 different things going on in your data."
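The "communities of customers" Amori describes can be illustrated with the simplest possible community notion: connected components of a similarity graph. The sketch below uses a hypothetical hand-built graph and plain depth-first search; production platforms use far richer community-detection algorithms than this.

```python
# Toy sketch: customer "communities" as connected components of a
# similarity graph. The adjacency list here is hypothetical.

graph = {
    "cust_a": {"cust_b"},
    "cust_b": {"cust_a"},
    "cust_c": {"cust_d"},
    "cust_d": {"cust_c"},
    "cust_e": set(),          # an isolated customer: a community of one
}

def communities(adj):
    """Return connected components found via iterative depth-first search."""
    seen, groups = set(), []
    for start in adj:
        if start in seen:
            continue
        stack, group = [start], set()
        while stack:
            node = stack.pop()
            if node in seen:
                continue
            seen.add(node)
            group.add(node)
            stack.extend(adj[node] - seen)
        groups.append(group)
    return groups

result = [sorted(g) for g in communities(graph)]
print(result)  # [['cust_a', 'cust_b'], ['cust_c', 'cust_d'], ['cust_e']]
```

Each component is a candidate segment the analyst can then interrogate: what metrics do its members share, and what story does that tell?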
