Results

Connecting the human dots between Apple, Bill Walton and skill building

As new technologies such as generative AI and robotics proliferate, the connection between humans will become even more important. That's a high-level takeaway from DisrupTV Episode 366, which took a few interesting turns.

Christopher Lochhead, thirteen-time No. 1 bestselling author and a "godfather" of category design, and Matt Beane, author of The Skill Code: How to Save Human Ability in an Age of Intelligent Machines and a UCSB professor, were the guests who connected the human dots between three seemingly disparate topics.

Apple

Lochhead, a godfather of category design, said Apple pulled off a massive coup with its AI presentation this week largely because it took a new technology and made it human. "Tim Cook pulled something absolutely legendary here," said Lochhead. "What we had this week was a master class in category design and business strategy. In category design, one of the things we teach entrepreneurs and marketers is listen to the words. Listen to the words. Most people don't pay attention to the words. Apple this week did not announce a new product. Apple announced a new category design, a new category of AI called a personal intelligence system. And they branded it Apple Intelligence."

He added that most people forget that Apple was the category designer of personal computing. Apple put the focus where it should be for Apple: on the people.

"Strategically, it's beyond genius," he said. "AI is not a new category of technology. AI is every category of technology. It's not a product. It's an enabling technology. Apple is going to use AI as a personal system."

"The last piece of this is that you don't have a strategy unless you can put it on one page. You lead the future and that's exactly what Tim Cook did. It was the result of clarity of strategy and a focus on the categories where Apple wins."

Bill Walton, the teacher

Lochhead met Walton through complete serendipity. He was speaking at an Oracle event where Walton was the closing speaker.

"If you know anything about Bill and that magical mystical deadhead, he read everything. He read because he had that stutter. He read because his mother was a librarian. He spent his entire childhood reading, playing basketball and on his bike. He was an incredibly learned man."

Naturally, Walton read Lochhead's books, notably Play Bigger, and they became fast friends.

"After one of my dear friends was murdered, Bill called me three times a week for the six months after it happened. He was on the road doing calling games doing all this stuff at an incredibly busy time in his life and he always wanted to make sure how I was. A text message from Bill Walton or an email from Bill Walton would just go on and on about how he loved you, and 'thank you for my life.' He said, thank you for my life to everybody. Thank you for my life.

"He was a dichotomy because he could talk about his stories and his life forever, and you would think a person like that might be egotistical. Yet as he was doing it, he was connecting with you, empathizing with you and he wanted to know how you were. He deeply gave a shit about other people.

"He made you feel like the greatest person in the world, he was the greatest, he taught me and everybody how to be a fan.

"He's left me with many things, and one is teaching. He said to me at the time I was calling myself retired just like an uncle: "Chris, you can't use the word retired. You're not retired. You're just like John Wooden. You're a teacher. Go be a teacher."

"What there is to do? I think it is to live like Bill. Bill embraced different. He followed the things that he loved and the people that he loved, he allowed himself to fall in love quickly and to support other people."

Skills and humans in the AI and robotic age

That human connection is also going to be critical for skill building, argued Beane. In his book, The Skill Code: How to Save Human Ability in an Age of Intelligent Machines, Beane examined various technologies through the lens of skill building: that ongoing connection between an expert and a novice. "To have table stakes, you've got to have that knowledge to be able to play. But to build skill, there's 160,000 years' worth of archaeological evidence that we build skill with elbow-to-elbow contact with somebody who knows more, trying to get some real work done," said Beane.

Beane used robotic surgery as an example of how new technologies are inventing new ways to work and build skill. What's lost is that human skill-building connection and mentorship.

"A novice by definition is slower and makes more mistake than an expert. You put a tool in an expert hand, that allows them to do more better by themselves. They're gonna love that deal. They take that deal. And it means they're just gonna need help from that novice less."

The trick to leveraging today's new technologies decades from now will be capturing productivity gains in a way where people also build their capabilities. Beane argued that roles requiring a physical presence will adapt better to new technologies, while skill building will be harder for workers who are remote. "If you have authority, run a budget, can invest and are developing tools, you can build skills, but you have great responsibility to bring novices along for the ride," said Beane.

"If you're going to make healthy progress toward skill and keep it healthy for other people, the challenge, complexity, and connection matters. Human connections that built Walton's story, bonds of trust and respect. We don't think of those as connected to your skill journey. They are essential. The challenge is that the world has become a bit of a padded playground in places, and that is dangerous to skill. You've got to struggle; you've got sweat and you have to be uncomfortable humans. Humans don't like being uncomfortable, but it is required."

GPUs, Arm instances account for larger portion of cloud costs, says Datadog

GPU instances are taking a larger share of enterprise cloud spending and now account for 14% of compute costs, up from 10% a year ago, according to a Datadog report analyzing AWS customer usage.

The report highlights how enterprises are experimenting with training and inference for large language models. A report from Flexera also highlighted how enterprises were experimenting with AI workloads. Datadog said:

"GPU-based EC2 instance types generally cost more than instances that don’t use GPUs. But the most widely used type—the G4dn, used by 74 percent of GPU adopters—is also the least expensive. This suggests that many customers are experimenting with AI, applying the G4dn to their early efforts in adaptive AI, machine learning (ML) inference, and small-scale training. We expect that as these organizations expand their AI activities and move them into production, they will be spending a larger proportion of their cloud compute budget on GPU."

That increased spending is good for Nvidia as well as AWS customers using the cloud vendor's Trainium and Inferentia chips. The focus on GPU instances may also benefit AMD, which is rolling out its new accelerators.

Arm appears to be another downstream winner even as cloud workloads go GPU-based. Arm-based CPUs are also popular on AWS as enterprises leverage Graviton2 processors. Arm-based instances account for only 18% of EC2 compute costs, but that's double the share from a year ago.
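
For teams sizing up a Graviton move, a minimal boto3 sketch (region and example output are illustrative) lists the EC2 instance types that run on arm64 processors:

```python
# A minimal sketch, assuming boto3: enumerate EC2 instance types backed by
# Arm (arm64) processors, such as the Graviton family.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

paginator = ec2.get_paginator("describe_instance_types")
pages = paginator.paginate(
    Filters=[{"Name": "processor-info.supported-architecture",
              "Values": ["arm64"]}]
)
arm_types = sorted(
    it["InstanceType"] for page in pages for it in page["InstanceTypes"]
)
print(arm_types)  # e.g. ['c6g.large', 'm7g.xlarge', ...]
```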

Datadog noted:

"Arm-based instances still account for only a minority of EC2 compute spending, but the increase we’ve seen over the last year has been steady and sustained. This looks to us as if organizations are beginning to update their applications and take advantage of more efficient processors to slow the growth of their compute spend overall."

Overall, enterprises are mixing various compute instances and containers to optimize costs, but many aren't adopting the latest technologies: Datadog found that 83% of organizations are still using previous-generation EC2 instance types.

Adobe delivers strong Q2, ups outlook on AI monetization, new customers

Adobe reported a better-than-expected second quarter as the company expanded its customer base due to generative AI features.

The company reported second-quarter earnings of $3.49 a share on revenue of $5.31 billion, up 10% from a year ago. Non-GAAP earnings in the quarter were $4.48 a share.

Wall Street was expecting Adobe to deliver second quarter earnings of $4.39 a share on revenue of $5.29 billion.

Going into the earnings report, Wall Street was most concerned about Adobe's ability to monetize generative AI. Those concerns appeared following Adobe's first quarter report and were only magnified as other enterprise software vendors disappointed investors.

Adobe CEO Shantanu Narayen said the company's "highly differentiated approach to AI and innovative product delivery" is delivering value to current customers and attracting new ones.

The company's Digital Media revenue was $3.91 billion, up 11% from a year ago. Most of that was Creative Cloud revenue, but Document Cloud delivered 19% revenue growth compared to a year ago.

Digital Experience revenue was $1.33 billion, up 9% from a year ago.

In prepared remarks, Narayen said:

"In Creative Cloud, we have invested in training our Firefly family of creative generative AI models with a proprietary data set and delivering AI functionality within our flagship products including Photoshop, Illustrator, Lightroom and Premiere."

He added that Firefly has been used to generate more than nine billion images.

As for Document Cloud, Narayen said Acrobat AI Assistant is now available as an add-on subscription for Reader and Acrobat enterprise customers.

Adobe raised its guidance, projecting third quarter revenue of $5.33 billion to $5.38 billion with non-GAAP earnings of $4.50 to $4.55 a share. For fiscal 2024, Adobe projected revenue of $21.4 billion to $21.5 billion with non-GAAP earnings of $18 to $18.20 a share.

Other key points:

  • Adobe is extending its applications to integrate third-party multi-modal LLMs.
  • The company is seeing early success monetizing AI across its Digital Media and Digital Experience platforms.
  • Adobe is seeing strong usage and demand for AI across all customer segments.
  • The company said it was seeing new demand for Creative Cloud apps powered by new releases and digital channels.

Epicor beefs up AI-driven ERP vision via acquisition, new launches

Epicor has acquired two companies in recent months as it rounds out its strategy to infuse artificial intelligence across its ERP platform.

The company, which recently passed the $1 billion mark in annual recurring revenue, on Wednesday announced the acquisition of KYKLO, which provides product information management and lead-gen tools for manufacturers and distributors.

Epicor CEO Steve Murphy said the purchase is part of the company's AI-driven cognitive ERP vision that aims to turn systems of record into an insights engine.

KYKLO will complement Epicor's Commerce software with product information, real-time catalogs and content syndication.

The KYKLO acquisition follows last month's purchase of Smart Software, which provides cloud inventory planning and optimization applications. Smart Software was already an Epicor independent software vendor partner and is integrated into multiple Epicor ERP modules.

At Epicor's Insights 2024 user conference last month, the company launched its Epicor Grow portfolio, which includes AI and business intelligence tools aimed at the supply chain.

Epicor Grow includes generative AI, machine learning, analytics and natural language processing for more than 200 industry use cases.

The Epicor Grow portfolio includes Epicor Prism, a genAI service across Epicor Industry ERP Cloud, and Epicor Grow AI, which surfaces insights across industries. Arturo Buzzalino, VP of Products and Innovation at Epicor, said Prism is the company's first LLM pipeline.

Epicor also launched Epicor Grow Inventory Forecasting, which leverages forecasting engines from Smart Software, Epicor FP&A and Epicor Grow BI.

The company also launched Epicor Grow Data Platform to manage enterprise data, create pipelines and leverage business intelligence.

CR CX Convos: LIVE from PegaWorld with Matt Healy

Liz Miller comes to you LIVE again from PegaWorld with another CR CX convo! This time, with Matt Healy, director of product strategy and marketing at Pegasystems.

Learn how Pegasystems brings together applications, systems, AI, workflows and automation to improve customer experience and customer journeys.

Plus the ever-engaging commentary from Liz Miller about marketing, digital technology and more! Watch the full convo below.

On CR Conversations: https://www.youtube.com/embed/XS-rBwyrqps?si=qDi450gA0KAUlnuR

Broadcom delivers strong fiscal Q2 on AI demand, splits stock 10-for-1

Broadcom saw strong AI demand in the fiscal second quarter and said VMware accelerated its software business.

The company reported fiscal second quarter net income of $2.12 billion, or $4.42 a share, on revenue of $12.49 billion, up 43% from a year ago. Non-GAAP earnings were $10.96 a share. The company also said it will split its stock 10-for-1 on July 15.

Wall Street was looking for second quarter non-GAAP earnings of $10.84 a share on revenue of $12.1 billion.

Broadcom also raised its revenue guidance for fiscal 2024 to $51 billion compared to estimates of $50.28 billion.

CEO Hock Tan said the second quarter results were "once again driven by AI demand and VMware." AI product revenue was $3.1 billion in the quarter.

By unit, Broadcom's semiconductor revenue, which is being driven by AI data centers, was $7.2 billion, up 6% from a year ago. Infrastructure software revenue was $5.28 billion, up 175% from a year ago due to the VMware acquisition.

Broadcom ended the quarter with cash and cash equivalents of $9.81 billion.

Key points from the earnings conference call include:

  • VMware revenue in the second quarter was $2.7 billion, up from $2.1 billion in the first quarter. Tan said the integration was going well and the company is making "good progress on the transition." The company has signed up nearly 3,000 of its largest 10,000 customers to deals, mostly multi-year contracts.
  • Tan reiterated that VMware would deliver $4 billion a quarter in revenue, but declined to give a time frame. 
  • Networking revenue for Broadcom is expected to grow 40% in the third quarter. Tan said networking products were benefiting from AI data centers. 
  • Tan downplayed any Nvidia competition. He said Broadcom uses its IP portfolio to create custom AI accelerators, so the companies don't compete much. "We are not competitors to them and don't try to be either," said Tan. "On networking that may be different, but we are approaching it from different angles. We are very deep in Ethernet and have been doing it for over 25 years. It's a natural extension for us to go into AI, but we don't do GPUs. We enable GPUs to work very well."

CR CX Convos: LIVE from PegaWorld with Matt Camuso

Liz Miller had a chance to catch up with Matthew Camuso, Senior Product Marketing Manager at Pegasystems, while attending PegaWorld. What does Moose (you gotta watch...there's even a hand signal) think about the intersection of Marketing, Experience, Data, Analytics and AI? A lot...and they have a blast catching up and talking strategy in the blazing heat of cool Las Vegas.

On CR Conversations: https://www.youtube.com/embed/WwYVzDe7Yfo?si=LovDXmtcD46ObyoM

GM builds its data factory, eyes genAI

Brian Ames, senior manager of production AI and data products at General Motors, said the company has stood up its data factory and plans to layer in generative AI capabilities in the next year.

Ames' talk at the Databricks Summit conference is one part of a broader effort at GM to improve its data and software game. In February, GM CEO Mary Barra said the company is "determined to get the software right" to deliver good customer experiences. GM is working to improve its software in autos as well as revamp the technology behind Cruise.

In May at an investor conference, Barra noted that artificial intelligence will be critical to Cruise and the company overall. "When you talk about artificial intelligence, the ultimate application of that is autonomy, as in our Cruise operations that use machine learning and AI. But there's a lot that we're working on to leverage AI in some of the business processes to take cost and complexity out of what we do," said Barra.

With that backdrop, Ames' talk highlighted how companies need to get their cloud and data strategies right before embarking on generative AI. We've documented this trend repeatedly in customer stories on CVS, Wayfair, Equifax, Intuit, Rocket and JPMorgan Chase, to name a few.

Ames said GM's first move was to create an infrastructure that could surface data more easily. "GM has a ton of data. That's not the problem. We had a beautiful on prem infrastructure. Why change? Well, two reasons. Number one was data efficiency. More importantly, the world changed. And GM understood that if we didn't have AI and ML in our arsenal, we could find ourselves at a competitive disadvantage," he said.

GM set out to transform its data infrastructure about 15 months ago to go all cloud. The plan was to build a data insight factory that could democratize information that can be used for safety and vehicle telemetry. "We needed to move from solution silos to single sources of truth with rapid collaboration. We needed to move away from fragmented governance into a simple unified governance. And we felt if we did those two things extremely well, that we'd be able to go from pockets of limited AI and ML execution to really building AI and ML into the DNA of GM," said Ames.

Today, GM has this architecture built on Databricks. GM also built an interface that can distribute insights by brand and product. The killer app for this data platform has been telemetry data from GM's fleet, used to gauge the health of autos on the road as well as for predictive maintenance and safety. The fleet includes tens of millions of cars with multiple combinations of sensors. The goal? Zero crashes.
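
Ames didn't share GM's implementation, but the shape of that telemetry workload is easy to sketch. Here is a hypothetical PySpark rollup over a governed Delta table; every table and column name below is invented for illustration, and a Databricks-style Spark session is assumed.

```python
# Hypothetical fleet-health rollup. The table fleet.telemetry and the columns
# (brand, vehicle_id, battery_health, fault_code) are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

telemetry = spark.read.table("fleet.telemetry")  # governed Delta table

fleet_health = (
    telemetry.groupBy("brand")
    .agg(
        F.countDistinct("vehicle_id").alias("vehicles_reporting"),
        F.avg("battery_health").alias("avg_battery_health"),
        # count rows where a fault code was reported
        F.count(F.when(F.col("fault_code").isNotNull(), True)).alias("faults"),
    )
)
fleet_health.show()
```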

Going forward, Ames said GM's plan is to break down silos for various AI and machine learning projects around the company and move them to production faster. "We're reducing the time to insight and we're finding ways to contribute value. In year two, we're going to layer on genAI and perhaps take another step at GM towards our mission of zero crashes," said Ames.

Databricks launches Data Intelligence Platform, melds data, AI workflows

Databricks is adding generative AI capabilities via Mosaic AI across its data and AI platform, upping its data warehousing game and helping customers get more out of data via business intelligence tools. The launches come a week after Databricks acquired Tabular.

At its Databricks Summit, which arrives a week after Snowflake's customer conference, the company laid out a strategy that revolves around tightly coupling data management and AI. What Databricks is building now are the tools to add governance to data and AI and deliver real insights. The idea is that large language models (LLMs) will let you interact with data through simple queries.

In a briefing, Databricks officials noted that about 85% of generative AI experiments in enterprises fail to make it to production. The problem, in a nutshell, revolves around cost at scale, data privacy and getting the right answers out of models. The question is not which model is best in the abstract, but which model delivers the best results for your data.
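
Databricks didn't spell out the mechanics in the briefing, but one concrete expression of "interact with data through simple queries" is its ai_query() SQL function. Below is a minimal sketch using the databricks-sql-connector package; the hostname, HTTP path, token, serving endpoint and table are all placeholders, and treating ai_query(endpoint, request) as the signature is an assumption here.

```python
# Hypothetical sketch: ask an LLM serving endpoint to summarize rows in a
# governed table via Databricks SQL. All names and credentials below are
# placeholders, not real resources.
from databricks import sql

with sql.connect(
    server_hostname="dbc-example.cloud.databricks.com",  # placeholder
    http_path="/sql/1.0/warehouses/abc123",              # placeholder
    access_token="dapi-REDACTED",                        # placeholder
) as conn:
    with conn.cursor() as cursor:
        cursor.execute("""
            SELECT ticket_id,
                   ai_query('my-llm-endpoint',
                            CONCAT('Summarize this ticket: ', body)) AS summary
            FROM support.tickets
            LIMIT 10
        """)
        for row in cursor.fetchall():
            print(row)
```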

If you zoom out, the battle between Snowflake and Databricks boils down to this: Snowflake needs to prove it can evolve from a data warehouse and data management vendor into an AI platform. Databricks is a data and AI platform that can also offer data warehouse and business intelligence capabilities. Constellation Research analyst Doug Henschen boiled down the race: "Databricks is leading on AI and genAI, but it has a lot to prove on data warehousing and is behind on data marketplace and data apps," he said.

With generative AI, Databricks is creating a Data Intelligence Platform that includes Delta Lake, a unified data storage system; Tabular, which will bridge Databricks with the Iceberg crowd; the generally available Unity Catalog; and Mosaic AI, Databricks SQL, dashboards and other tools. The Databricks Data Intelligence Platform will be 100% serverless.

The vision and Databricks strategy

During a keynote, Databricks CEO Ali Ghodsi said data and AI are converging. "In the last 18 months, every CEO from a Fortune 500 company or small company I've talked with thinks that data and AI is going to be super strategic for them over the next five years. They think that that's how they're going to win," said Ghodsi. "That's going to be the main differentiating factor, whether it's the financial sector, retail, media, healthcare or the public sector. Doesn't matter, all of it. It's going to be data and AI."

Ghodsi said that the last 18 months have only increased the pressure to bring use cases into production. He said:

"There's a food fight inside organizations over who owns an AI. That's number one. Number two, everybody's worried about security and privacy of their data with genAI. They are worried about security and privacy for the whole data estate. And that data estates today is super fragmented."

Ghodsi said the fragmentation can be solved by storing data in open formats so enterprises don't hand data to vendors that'll only lock them in. "Our vision starts with the lakehouse. We said stop giving your data to vendors. It doesn't matter if it's a proprietary data warehouse in the cloud, Snowflake or Databricks. Don't give it to us either."

Databricks' acquisition of Tabular was designed to create a USB-like standard for data. Ghodsi said interoperability between Delta Lake and Iceberg will solve a lot of enterprise problems. He added that Databricks will work with the Delta Lake and Iceberg communities to bring the formats together over time. He likened the data lakehouse format rivalry to a Betamax vs. VHS type of challenge.
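
Ghodsi didn't demo the bridge, but Delta's UniForm feature points at the mechanism: a Delta table that also writes Iceberg-readable metadata. Here is a minimal Spark SQL sketch, assuming the UniForm table properties as documented for Delta Lake; the table name is invented, and a Delta-enabled session (such as a Databricks cluster) is assumed.

```python
# Hypothetical sketch: create a Delta table that also exposes Iceberg
# metadata via UniForm. Table name is illustrative; property names follow
# the Delta UniForm documentation.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("""
    CREATE TABLE sales.orders (
        order_id BIGINT,
        amount   DOUBLE,
        ts       TIMESTAMP
    )
    USING DELTA
    TBLPROPERTIES (
        'delta.enableIcebergCompatV2' = 'true',
        'delta.universalFormat.enabledFormats' = 'iceberg'
    )
""")
```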

The other broad theme from Databricks was combining its Lakehouse platform with Mosaic AI. Data intelligence will enable enterprises to ask common questions with genAI and create an AI-meets-business-intelligence format. "That's what the whole company is working on," said Ghodsi.

Nvidia CEO Jensen Huang also showed up at the keynote with Ghodsi and delivered a few interesting nuggets:

  • Huang argued that open source LLMs "were probably the most important events this year" because they enable enterprises to better leverage genAI.
  • GenAI will have the most impact on customer service. "Customer service represents probably several trillion dollars worth of expenses and every company is deciding between the chatbot or the customer service agent. It is partly about the fact that you could automate but it's mostly about the data flywheel," said Huang. "You want to capture the engagement for the data flywheel. We're going to have proactive customer support."
  • AI factories shouldn't be built near populations where the energy grid is already challenged. "Earth has a lot more energy," said Huang. "It's just in the wrong places."

News overview

The news out of Databricks Summit adds up to building out this data and AI platform that's accessible. Here's a look at a few of the announcements.

  • Mosaic AI Agent Framework is in preview and includes tools to build agent applications, evaluate them with human and LLM judges, and deploy them with real-time APIs.
  • Mosaic AI Model Training lets customers fine-tune small open-source models. Databricks said customers in private preview can use smaller models to reduce costs and latency. Some customers are seeing 10x improvements in inference costs and 2x improvements in latency.
  • A text-to-image model trained on Shutterstock data by Mosaic AI.

  • Support for usage tracking, rate limits and guardrails as well as hooks into Unity Catalog.
  • Databricks SQL is now 70% faster, and Databricks showed price/performance comparisons vs. Snowflake.
  • Databricks AI/BI with dashboards is generally available. A tool called Genie for querying data is in public preview. To date, Databricks hasn't tried to tackle BI use cases, but going forward BI will be a focus.

  • Genie will learn from your data and semantics and feature an ensemble of AI agents that leverage Unity Catalog metadata. Genie will also query data across all workloads and related assets, remember and learn, and seek clarifications.
  • Lakeflow Connectors will ingest data from SaaS applications and databases. Lakeflow won't have all of its components right away, but Databricks said there will be a steady cadence over the next 12 months. The aim is to simplify the data engineering process.
  • Unity Catalog OSS, which will be an open catalog that's available now and combines data and AI with interoperability across formats, open APIs and governance.
  • Enhanced Federation that includes Lakehouse Federation to connect data sources to Unity Catalog with policies and Hive Metastore Federation, which can read/write for internal or external Hive Metastore or AWS Glue.
  • Secure Collaboration via Clean Rooms and Foreign Catalog Sharing.
  • Business Metrics, which will pull from your lakehouse assets, leverage a central inventory of certified metrics and make them accessible.

The launches across Databricks Summit have a heavy dose of combining genAI and data warehousing as well as its usual data engineering fare. The case Databricks is making is that it can consolidate your data platforms and silos. In the end, Snowflake and Databricks will compete for customers. 

Constellation Research analyst Holger Mueller said:

"CxOs always need to remind themselves where the vendor came from. Snowflake came out of the data warehouse and showed it could add cloud elasticity. A large part of Snowflake's success was the familiarity with data warehousing. Databricks lived in the big data and cloud world from its inception in a model less familiar to CxOs. If Snowflake manages to add good enough lakehouse capabilities soon, it will win as it has the transactional data. If Snowflake is slow or fumbles, it's Databricks' game to win because the ability to master large amounts of unstructured data is the harder engineering challenge and Databricks has mastered it."

Splunk adds genAI tools, more Cisco touchpoints across observability and security

Splunk launched a new set of generative AI tools across its products, security operations additions, and data management applications as well as more Cisco integrations.

The announcements made at Splunk's .conf24 conference come a week after Cisco's customer conference, where the two companies outlined integrations to connect observability platforms. At Cisco Live, the companies announced a new single sign-on system that streamlines workflows between Cisco AppDynamics and Splunk, plus Splunk Log Observer Connect for Cisco AppDynamics. Cisco AppDynamics will also integrate with Splunk Enterprise, Splunk Cloud and Splunk ITSI. Overall, Cisco and Splunk will look to unify the observability experiences across both platforms as they ultimately integrate them.

Observability and security customers of both Splunk and Cisco are watching the integrations closely for clues to how the platforms will come together. A Splunk report found that the total cost of downtime for Global 2000 companies is $400 billion annually, or roughly $200 million per company.

Here's what Splunk announced at .conf24:

  • Splunk added generative AI tools for observability, security and IT Service Intelligence. AI Assistant in Observability Cloud adds a natural language interface for engineering teams to detect and correct issues. AI Assistant in Security brings genAI to workflows in a move to speed up analyst investigations. Splunk AI Assistant in SPL makes the insights from the company's unified security and observability platform more accessible to customers; a sketch of the kind of SPL involved follows this list.
  • Advanced AI for IT Service Intelligence has a new Configuration Assistant and gets Drift Detection for KPIs and Adaptive Thresholds for entities.
  • Splunk Enterprise 8.0 adds a bevy of security operations center advances to simplify how analysts detect, investigate and respond to threats.
  • Federated Analytics enables customers to analyze data directly where it resides starting with Amazon Security Lake.
  • Splunk Attack Analyzer, Splunk Enterprise Security and Splunk SOAR customers will see integrations that leverage Cisco Talos threat intelligence.
  • Splunk's Data Management portfolio will get Pipeline Builders to enable customers to filter, mask and transform data to simplify processing and Ingest Processor, which will give customers the ability to convert logs to metrics and route them to Splunk Observability Cloud, Splunk Platform or Amazon S3.
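
To make the SPL bullet above concrete: SPL is terse and unforgiving, which is why a natural language front end matters. Here is a hedged sketch using the splunk-sdk Python package to run the kind of query an assistant might generate from "show me error counts by host over the last day"; the host, credentials, index and field names are placeholders.

```python
# Hypothetical sketch, assuming the splunk-sdk package. Connection details
# and the index/field names are placeholders.
import splunklib.client as client
import splunklib.results as results

service = client.connect(
    host="splunk.example.com",  # placeholder
    port=8089,
    username="admin",           # placeholder
    password="changeme",        # placeholder
)

# The kind of SPL an AI assistant might emit for "error counts by host, last 24h"
spl = "search index=main log_level=ERROR earliest=-24h | stats count BY host"

oneshot = service.jobs.oneshot(spl, output_mode="json")
for result in results.JSONResultsReader(oneshot):
    print(result)
```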

Constellation Research's take

Constellation Research analyst Andy Thurai said:

"The natural language interface Splunk AI assistant for SPL can be very useful to power Splunk users. SPL is not easy to write and needs an expert level understanding to write it. By providing a natural language interface AI assistant, Splunk/Cisco hopes to democratize the SPL creation.

Natural language queries of incidents, related observability data and ITSI, plus finding fixes quicker, can be good. However, I have found some of Splunk's closest competitors are way ahead of them in this regard.

Data ingestion pipeline and log/data optimization are areas where competitors have an advantage over Splunk, even after these announcements.

Log Observer Connect as a full two-way centralized log mover could be powerful with all telemetry in one place, but I do anticipate scaling issues. My guess is Log Observer is a temporary fix to connect the log and observability clouds.

Splunk's AI announcements today are nothing earth-shattering, and competitors are already ahead. Cisco/Splunk now has the problem of integrating Splunk Observability Cloud, Log Enterprise, AppDynamics, ThousandEyes and network observability data into one meaningful solution. While Cisco and Splunk each serve portions of customer needs well, the combined solution is going to take a while to build. I estimate that it might take two years or longer.
 
There are also overlapping solutions such as FSO, RUM, synthetic monitoring, incident intelligence and logs that all need to be redesigned.
 
I still stand by my original comments that it might take two years at the earliest for this to come to fruition. It is going to be difficult to decide which architecture will win and whether all of it can be cloud or hybrid. The good thing is that they have one chief product officer who will drive the product and strategy. But it is too early to make a call. I haven't seen enough yet."
