
Retailers hope business, digital transformation efforts pay off for holiday shopping 2023

With the holiday shopping season underway, retailers face uncertain consumer spending but head into the season with better inventory positions, lower supply chain costs and customer experience investments they hope will pay off.

Adobe is forecasting US online holiday sales of $221.8 billion (Nov. 1 through Dec. 31) in 2023. Salesforce noted that 2023 is all about keeping loyal customers happy and focusing on the experience metrics that matter.

For our purposes, retail in 2023 matters because the industry is arguably the best lab for digital and business transformation, analytics, customer experience and a bevy of technologies.

Here's a tour of how retailers are setting up for the holidays.

Walmart

Walmart's results are the barometer for consumer health and digital transformation.

"In the U.S., we may be managing through a period of deflation in the months to come. And while that would put more unit pressure on us, we welcome it because it's better for our customers," said Walmart CEO Doug McMillon.

Those comments rattled the stock market. CFO John Rainey added that consumers are making trade-offs and weekly performance metrics were softening at the end of October.

Like other retailers, however, Walmart is focusing on what it can control across its omnichannel experience. McMillon said:

"We're making shopping easier and more convenient. Our net promoter scores for pickup and delivery in Walmart U.S. are improving and we've started using generative AI to improve our search and chat experience. We've released an improved beta version of search to some of our customers who are using our app on iOS. In the coming weeks and months, we will enhance this experience and roll it out to more customers."

Rainey said omnichannel services such as pickup and store-fulfilled delivery are driving growth. Walmart posted 24% e-commerce revenue growth in the third quarter and gained share among higher-income households.

On the digital front, McMillon said Walmart is looking to blend its core retail business with new services including membership, third-party marketplaces, and advertising. The combination of these businesses with automation in the supply chain should provide a good mix of operating margins, growth and predictability.

Retail experiences all start at the back end. To that end, Rainey said:

"During the quarter, we opened our third next-generation e-commerce fulfillment center. These 1.5 million square feet facilities are expected to more than double the storage capacity, enable 2X the number of customer orders fulfilled daily, and will expand next and two-day shipping to nearly 90% of the U.S. including marketplace items shipped by Walmart Fulfillment Services. They also unlock new opportunities for our associates to transition into higher skilled tech focused positions."

Macy's

Macy's CEO-elect Tony Spring said customers at Macy's, Bloomingdale's and Bluemercury will continue "to be under pressure and discerning in how they spend in discretionary categories we offer." He said Macy's is ready to fulfill orders online, in store and through its gift guides.

Spring also argued that the company's department stores can pivot with content and merchandise that goes where customers are headed.

On the digital front, Spring said that Macy's transformation is on track. The company is looking to run three distinct retailers with their own identities and a common data and technology approach. Spring said:

"We can learn from each other without becoming one another as we remove silos to optimize our collective customer insights.

We are also balancing art and science. I like to say that this is STEAM, not STEM. We are embracing data science tools, including AI and machine learning, to drive more accurate and agile decision-making based on changes in demand. This, married with the art of human judgment, helps us become more proactive and customer influenced."

Spring said Macy's will leverage data to create better experiences and scale its growth vectors. The growth vectors include private brands, small format stores, a digital marketplace, a focus on luxury and personalized offers and communications.

Macy's CFO and COO Adrian Mitchell said the company's supply chain is working well and the retailer has a good inventory position that's down 6% from a year ago and down 17% from 2019 levels. The company is seeing lower freight expenses and a better merchandise mix to boost gross margins. Mitchell said Macy's has also improved delivery expenses due to reductions in packages per order and distance traveled.

More retail and commerce: How Home Depot Blends Art and Science of Customer Experience | How Wayfair's tech transformation aims to drive revenue while saving money | Connecting Experiences From Employees to Customers

Target

Target CEO Brian Cornell said, "consumers continue to rebalance their spending between goods and experiences and make tough choices in the face of persistent inflation."

Cornell added that Target is cautious about the near-term outlook but playing the long game by "investing in our stores, our supply chain, our team, our digital capabilities and our assortment."

Same-day services saw high single digit growth on a same-store basis. Cornell said customers are pressured by interest rates, loan payments and less savings. With discretionary income down, consumers are making tradeoffs and waiting for sales.

John Mulligan, Chief Operating Officer at Target, said the retailer has faced multiple challenges over the last 3.5 years including a lack of inventory, a demand boom, a downturn, inventory bloat and normalization.

Mulligan added that Target is cautious about its inventory position, but in a good place so far. Target has also benefited from lower supply chain costs. Mulligan said:

"We've seen improvements in metrics relating to backroom inventory accuracy and the percentage of new assortments set on time. In the digital channel, the percentage of orders picked and shipped on time and the average Drive-Up wait time have all improved from last year. Also, in support of the digital business, the percentage of items ordered but not found has declined from a year ago, meaning that we are fulfilling more items per order and canceling fewer, a key factor in guest satisfaction."

Target is also looking to improve its guest experience as measured by Net Promoter Scores in categories such as checkout, front-of-the store interactions, and digital services.

The Children's Place

Jane Elfers, CEO of The Children's Place, said the company managed inventory well in the third quarter but saw higher distribution and fulfillment costs as it pivots to more e-commerce.

The plan for The Children's Place is to hone its digital game given that's how Gen Z will buy when that generation has children.

Elfers said:

"Our digital channels are clearly where our current core millennial customer prefers to shop for her kids. And based on the data, digital is where our future Gen Z moms will overwhelmingly prefer to transact. Almost all new digital buyers will come from Gen Z. Gen Z digital buyers nationwide are expected to surge from 45 million today to over 61 million in 2027, only four short years away. The importance of the digitally native Gen Z demographic to our future business cannot be underestimated, and we remain laser-focused on ensuring that digital is at the core of everything we do."

The current generation of core customers for the retailer remains under pressure, but Elfers said The Children's Place is playing the long game with accelerated digital transformation and fleet optimization. The goal is to "operate the company with less resources, including less stores, less inventory, less people and less expense, allowing us to better service our customer online where she prefers to shop, resulting in what we believe will translate to more consistent and sustainable results over time," she added.

According to The Children's Place, the pivot to digital marketing has paid off including a focus on social media presence.

Gap Inc.

New Gap CEO Richard Dickson said the company is looking to strengthen its "operating platform" as it retools its core brands--Gap, Old Navy, Banana Republic and Athleta. He said:

"In some areas, we are in good shape, but we have more work to do. Our supply chain is a pillar of strength at Gap Inc., where our scale gives us unique cost leverage, but we need to accelerate innovation. Our financial strategy is driving early value, but we need to continue our focus on rigor and efficiency. In technology, we’ve made strategic investments and now it’s about optimizing those investments and driving adoption across the organization."

Dickson said the scale of Gap Inc. should be able to boost operating margins. Gap Inc. has four billion-dollar brands with 2,600 company operated stores and 1.4 billion annual visits to the company's websites. The company also has 58 million active customers.

Williams Sonoma

Williams Sonoma, led by CEO Laura Alber, delivered record third quarter operating margins of 17% as it benefited from customer experience improvements and lower supply chain costs.

The company projected fiscal year revenue to be down 10% to 12% but raised its operating margin outlook to 16% to 16.5%.

Alber acknowledged that consumer spending remains challenged, but said the company's portfolio of brands has enabled it to weather uncertainty even as same-store sales fall.

"Our in-house design capabilities and vertically integrated supply chain are also key in producing proprietary products at the best quality value relationship in the market," she said.

During the company's third quarter earnings call, Alber outlined the following strategies going into the holiday season.

  • Introduce more new products at mid-tier and lower-tier price points without discounting.
  • Sell through inventories with lower supply chain costs by reducing out-of-market and multiple shipments.
  • Improve customer service. Alber noted that Williams Sonoma's customer service metrics have returned to pre-pandemic levels, as have on-time deliveries.
  • Leverage investments in last-mile delivery to reduce customer accommodations, returns, damages and replacements.
  • Continue to invest in digital experiences with content, tools for design projects and AI.

Dick's Sporting Goods

Dick's Sporting Goods CEO Lauren Hobart said the company is targeting omnichannel athletes and giving them a good digital experience.

Hobart said:

"In combination with our stores, our digital experience remains an integral part of our success, and the investments we are making in technology are strengthening our athletes’ omnichannel experience and driving increased engagement. This quarter, we added 1.6 million new athletes and are further growing our base of omnichannel athletes. Omni channel athletes make up the majority of our sales and they spend more and shop with us more frequently than single channel athletes."

Hobart also said that the sporting goods retailer is investing in data science and personalization to create one-to-one relationships with athletes.

The company said that its consumer base is holding up well as they prioritize a healthy and active lifestyle.

Best Buy

Best Buy CEO Corrie Barry said "consumer demand has been even more uneven and difficult to predict" and the company lowered its fourth quarter revenue outlook. The technology retailer is focused on customer experiences, driving recurring revenue and offering new services.

"We continue to increase our paid membership base and now have 6.6 million members. This compares to 5.8 million at the start of the year," said Barry. My Best Buy Total is a $179.99 a year service that includes 24/7 Geek Squad Service, AppleCare Plus and two-years of product protection. My Best Buy Plus is a $49.99 a year tier that includes access to new products, two-day shipping, a 60-day return and exchange window and exclusive pricing.

Barry said the company is looking to drive interactions too.

"We have also seen growth in sales from customers who are getting help from our virtual sales associates. These interactions, which can be via phone, chat or our virtual store, drive much higher conversion rates and average order values than our general dot.com levels. This quarter, we had 140,000 customer interactions by a video chat with associates, specifically out of our virtual store locations."

In addition, the company continues to invest in its multichannel fulfillment operations. "As a reminder, while almost one-third of our domestic sales are online, 43% of those sales were picked up in one of our stores by customers in Q3. And most customers shop us in multiple channels," said Barry.

Best Buy is also bolstering its supply chain network to optimize the company's ship-from-store hubs and shipping locations to deliver with speed. About 62% of e-commerce small packages were delivered to customers from automated distribution centers. Those operations are supplemented with a delivery partnership with DoorDash.


How much generative AI model choice is too much?

This post first appeared in the Constellation Insight newsletter, which features bespoke content weekly.

With enterprises still kicking the tires on large language models (LLMs), use cases and generative AI applications, vendors are big on providing choice, bring-your-own-model options and the ability to mix and match foundation models.

How much LLM choice is too much? We'll probably find out in the months ahead, but I’d argue we’re getting close. And the OpenAI kerfuffle is only going to invite more startups with LLMs looking to take share. 

Consider the following:

  • Microsoft's Ignite conference featured Copilot Studio, a platform that will enable enterprises to use multiple models, create custom ones and extend Microsoft copilots. Microsoft will have a copilot for every experience. Meanwhile, Scott Guthrie, Executive Vice President, Cloud + AI Group at Microsoft, told analysts the company is committed to model choice beyond its OpenAI partnership. Guthrie said enterprises will ultimately be able to use one model to fine tune another and mix models with the same interaction context.

  • Dell Technologies and Hugging Face announced a partnership aimed at on-premises generative AI. The aim is to give enterprises the ability to procure systems and tune open-source models more easily.
  • Microsoft launched Azure Models as a Service with Stable Diffusion, Llama2 from Meta, Mistral and Jais, the world's largest Arabic language model. Command from Cohere will also be available.
  • Google Cloud has its Model Garden, which features more than 100 models including foundational ones and task-specific ones.
  • Amazon Web Services has Amazon Bedrock, which features models from AI21 Labs, Anthropic, Cohere, Meta, Stability AI and Amazon in a managed service. I'm no psychic but rest assured there will be a heavy dose of Bedrock at re:Invent 2023 later this month.
  • OpenAI outlined plans to build out its ecosystem with custom GPTs and a marketplace to follow. 
  • Software vendors are also enabling bring-your-own-model arrangements, partnerships and development of domain-specific models. ServiceNow's strategy plays both sides of the LLM equation--open LLMs as well as ServiceNow-developed models. ServiceNow's Jon Sigler, senior vice president of Now Platform, said:

"We have a dual approach to models. We integrate general purpose models through a controller. We are also very focused on the smaller faster, more secure, less expensive models that do domain specific things. The combination of the large language models for general purpose and domain specific gives us the opportunity to do something that nobody else can do."

My hunch is that we'll reach some level of appropriate foundational model choice and then focus on small models aimed at specific use cases. Consider what Baker Hughes did with C3 AI and ESG materiality assessments: a specific model for a specific use case.

The current LLM landscape rhymes with the early big data buildout, when the industry assumed enterprises would adopt open-source technology and then expand. The problem: for many enterprises, big data implementations never moved past the science-experiment stage. There was no need for them to run their own Hadoop/MapReduce clusters.

Will the average enterprise really know the difference between the latest GPT from OpenAI vs. Meta's Llama 2 vs. Anthropic's Claude vs. MosaicML? Probably not. That choice just brings complexity. The real market in LLMs will be domain-specific models that drive returns. Foundational models will be commoditized quickly, and the difference between choosing among 50, 100 or 1,000 models won't matter. A booming marketplace of custom models, however, will drive returns.

In the end, the massive selection of models will require an abstraction layer to make enterprise life easier.
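To make that concrete, here's a minimal sketch of what such an abstraction layer could look like, assuming entirely hypothetical provider adapters and names; it isn't any vendor's actual API, just an illustration of routing requests by use case so back-end models stay swappable.

```python
from abc import ABC, abstractmethod


class ModelBackend(ABC):
    """One adapter per provider; swapping providers shouldn't touch app code."""

    @abstractmethod
    def complete(self, prompt: str, **params) -> str:
        ...


class GeneralPurposeBackend(ModelBackend):
    """Stand-in for a hosted foundation model (hypothetical)."""

    def complete(self, prompt: str, **params) -> str:
        return f"[general-purpose model] {prompt[:40]}..."


class DomainModelBackend(ModelBackend):
    """Stand-in for a smaller, domain-specific model (hypothetical)."""

    def complete(self, prompt: str, **params) -> str:
        return f"[domain model] {prompt[:40]}..."


class ModelRouter:
    """The abstraction layer: applications name a use case, not a vendor."""

    def __init__(self) -> None:
        self._backends: dict[str, ModelBackend] = {}
        self._routes: dict[str, str] = {}

    def register(self, name: str, backend: ModelBackend) -> None:
        self._backends[name] = backend

    def route(self, use_case: str, backend_name: str) -> None:
        self._routes[use_case] = backend_name

    def complete(self, use_case: str, prompt: str, **params) -> str:
        backend = self._backends[self._routes[use_case]]
        return backend.complete(prompt, **params)


router = ModelRouter()
router.register("general", GeneralPurposeBackend())
router.register("domain", DomainModelBackend())
router.route("draft-email", "general")        # broad, general-purpose task
router.route("summarize-ticket", "domain")    # narrow, domain-specific task

print(router.complete("summarize-ticket", "Customer reports checkout errors on mobile..."))
```

The point is that application code targets a use case rather than a specific vendor, so swapping out or commoditizing the underlying foundation model becomes a configuration change rather than a rewrite.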


OpenAI ousts Sam Altman as CEO, citing lack of candor with board

Sam Altman is out as CEO of OpenAI.

In a surprise move, OpenAI's board of directors said Altman was out as CEO and would leave the board. According to an OpenAI blog post, Mira Murati, the company's CTO, will be the interim CEO effective immediately. OpenAI President and Co-Founder Greg Brockman is also out. He addressed the ouster on X, and The Information reported that other key executives are resigning. Brockman wrote:

"We too are still trying to figure out exactly what happened...The outpouring of support has been really nice; thank you, but please don’t spend any time being concerned. We will be fine. Greater things coming soon."

The board said that Altman's departure follows "a deliberative review process by the board, which concluded that he was not consistently candid in his communications with the board, hindering its ability to exercise its responsibilities."

OpenAI recently held its developer conference where it outlined plans for custom GPTs with Altman on stage. Microsoft also held its Ignite 2023 conference this week and CEO Satya Nadella called out OpenAI as a key partner.

In a statement, Nadella said:

"We have a long-term agreement with OpenAI with full access to everything we need to deliver on our innovation agenda and an exciting product roadmap; and remain committed to our partnership, and to Mira and the team. Together, we will continue to deliver the meaningful benefits of this technology to the world."

OpenAI launches GPTs as it courts developers, models for use cases | Software development becomes generative AI's flagship use case

Speaking at Ignite, Nadella said:

"They're just doing stunning breakthrough work. We are thrilled to be all-in on this partnership together. As OpenAI innovates we will bring that innovation to Azure AI."

While OpenAI said Murati is a good successor due to her unique skill set and "close engagement" with all aspects of the company, the AI firm will conduct a formal search for a CEO.

OpenAI was founded as a non-profit in 2015 and restructured in 2019 so it could raise capital. Microsoft has been OpenAI's key partner and has built its copilot strategy on OpenAI's models. That said, Microsoft is also offering other models on Azure.

For enterprise buyers, the big question is whether the OpenAI management saga has any impact on Microsoft's copilot strategy.

In regulatory filings, Microsoft noted OpenAI in risk factors but didn't address management changes at its AI partner. In January, Microsoft announced the third phase of its partnership with OpenAI. 

Here's what Microsoft said in its annual report.

"We have a long-term partnership with OpenAI, a leading AI research and deployment company. We deploy OpenAI’s models across our consumer and enterprise products. As OpenAI’s exclusive cloud provider, Azure powers all of OpenAI's workloads. We have also increased our investments in the development and deployment of specialized supercomputing systems to accelerate OpenAI’s research."

In Microsoft's most recent quarterly SEC filing, the company said:

"We are building AI into many of our offerings, including our productivity services, and we are also making AI available for our customers to use in solutions that they build. This AI may be developed by Microsoft or others, including our strategic partner, OpenAI. We expect these elements of our business to grow. We envision a future in which AI operating in our devices, applications, and the cloud helps our customers be more productive in their work and personal lives. As with many innovations, AI presents risks and challenges that could affect its adoption, and therefore our business. AI algorithms or training methodologies may be flawed. Datasets may be overbroad, insufficient, or contain biased information. Content generated by AI systems may be offensive, illegal, or otherwise harmful. Ineffective or inadequate AI development or deployment practices by Microsoft or others could result in incidents that impair the acceptance of AI solutions or cause harm to individuals, customers, or society, or result in our products and services not working as intended. Human review of certain outputs may be required."


Microsoft Ignite 2023: Three big AI takeaways

At Microsoft's Ignite 2023 conference, the company fleshed out its generative AI offerings and strategy and, in some places, put serious distance between itself and the competition.

Here's a look at three Ignite 2023 takeaways.

Copilot for Azure

With Copilot for Azure, Microsoft is taking direct shots at Duet AI from Google.

Scaling up and down and managing cloud applications and infrastructure has always been difficult for many IT shops. Copilot for Azure lets users discover app configurations and infrastructure details and optimize workloads through a natural language chat interface--chores that have been a persistent headache for almost any enterprise running on Azure. Copilot for Azure can be particularly helpful for organizations that want visibility into all the services they use and what they cost. Especially interesting is the option to analyze observability data with Copilot for Azure, both to optimize cloud applications and to diagnose incidents or confirm the right configuration. Copilot for Azure will compete directly with a lot of AIOps and observability vendors.

Copilot for Azure will also simplify management a good bit--a lot of customers have found managing Azure at the infrastructure level more complicated relative to other hyperscalers.

The caveat is that Copilot for Azure is a first version that still needs to prove its accuracy and usefulness. Beyond accuracy and hallucination issues, a wrong recommendation could cost customers money. Copilot for Azure could become the foundation for effectively managing the application and infrastructure layer: if it works as advertised when it reaches general availability, enterprises could create a blueprint every time a new app is deployed and continuously optimize from there. It remains to be seen how usage develops.

Other items worth noting about Copilot for Azure:

  • The what-if analysis in the cost and performance module will compete directly against a lot of FinOps vendors that found a niche to play in for a while.
  • If Microsoft offers the same blanket legal liability/indemnity coverage as it does for the OpenAI services, Copilot for Azure could gain some traction.

GitHub Copilot Chat

Copilot started as an experiment in GitHub-based code completion for developers, but it is now a front-and-center initiative for Microsoft. GitHub Copilot Chat is embedded into Microsoft 365, the security offerings, Dynamics 365 Service Copilot and more. Interestingly, Microsoft is moving away from the failed Bing experiment: Bing Chat and Bing Chat Enterprise will be rebranded as Copilot, a name that has gained far more traction than the Bing naming convention.

One of the major announcements is Copilot Studio, which will let users design, test and publish copilots in a model much like OpenAI's custom GPTs. Microsoft is figuring out how to engage the community in development and build an ecosystem that makes adoption of its technology more viral. With the latest announcements, Microsoft is turning GitHub into an AI-powered developer platform instead of just the (open) source code platform it used to be. GitHub Copilot Chat in many ways competes with Microsoft Visual Studio, except that developers can freely write open-source code on the platform, use it as a repository, and compile and deploy to Azure. There is tighter integration with Azure now.

GitHub plans to infuse Copilot throughout its platform | OpenAI launches GPTs as it courts developers, models for use cases | Software development becomes generative AI's flagship use case

Copilot Studio and GitHub Copilot Chat move Microsoft in the direction of citizen programmers. The original Copilot options let programmers finish lines of code or partial code in development IDEs. The GitHub chat interface lets developers ask for code for certain types of programs being written; in other words, you don't have to start writing code to have the copilot suggest and finish the task. The risk is that the copilot could suggest half-baked code that lands in production. As a productivity enhancement tool, especially for junior and entry-level developers, GitHub Copilot Chat can offer a lot.

Given the potential for hallucination, accuracy, copyright and IP issues, Microsoft will probably back the offering with blanket indemnity and legal liability protection.

Azure DevOps

With the introduction of Azure Migrate application and code assessment, Microsoft is hoping large existing .NET workloads will move to the Azure cloud faster, joining the AI and innovation workloads already heading there.

Azure Container Apps, a serverless offering, is a good addition for moving large AI workloads that aren't just OpenAI API calls to the Azure cloud. With dedicated GPU workload profiles, vector database add-ins and Azure Container Apps, Microsoft is hoping enterprises will use Azure to build general-purpose or context-sensitive LLMs and SLMs instead of just using OpenAI for inferencing. Building LLMs is where the big money is for now.

With the additions to Azure Kubernetes Service, Microsoft is going after AI training workloads and the hosting of LLMs--a massive market. Today, LLMs run where they are trained. With optimized workloads in Azure there will be fewer manual configurations. In particular, the Kubernetes AI toolchain operator offers LLMOps functionality optimized across CPUs and GPUs. Especially noteworthy is the optimization between GPU and CPU based on availability: enterprises could move workloads to CPU clusters instead of waiting for costly, high-demand GPUs for inferencing.
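As a rough illustration of that trade-off (not the actual Kubernetes AI toolchain operator, just a simplified placement heuristic with made-up latency and cost figures), a scheduler could fall back to CPU inference whenever GPUs are scarce and the latency budget still allows it:

```python
from dataclasses import dataclass


@dataclass
class Pool:
    name: str
    free_nodes: int
    est_latency_ms: float  # rough per-request latency on this hardware (illustrative)
    cost_per_hour: float   # made-up list price, for illustration only


def place_inference(latency_budget_ms: float, gpu: Pool, cpu: Pool) -> str:
    """Prefer GPUs when available; otherwise run on CPUs if they still meet
    the latency budget, rather than queueing for scarce, expensive GPUs."""
    if gpu.free_nodes > 0:
        return gpu.name
    if cpu.free_nodes > 0 and cpu.est_latency_ms <= latency_budget_ms:
        return cpu.name
    return "queue-for-gpu"


gpu_pool = Pool("gpu-pool", free_nodes=0, est_latency_ms=40.0, cost_per_hour=32.0)
cpu_pool = Pool("cpu-pool", free_nodes=12, est_latency_ms=450.0, cost_per_hour=3.5)

print(place_inference(latency_budget_ms=1000.0, gpu=gpu_pool, cpu=cpu_pool))  # cpu-pool
print(place_inference(latency_budget_ms=100.0, gpu=gpu_pool, cpu=cpu_pool))   # queue-for-gpu
```

A real operator weighs far more (batching, model size, quantization, spot capacity), but the core idea is the same: treat hardware as an availability- and cost-driven placement decision rather than a fixed assignment.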

Bottom line: While other vendors are fighting for LLM creation and model traction, Microsoft has moved into operationalizing LLMs and AI. Those moves will leave Azure rivals scrambling once again to catch up.


CR CX Convos: Intelligence and Context in Modern CX

Is personalization really where CX wants to land? Is it even close to enough? In this installment of CR CX Convos, Liz Miller continues to dive into the details of her last convo with SAP's Nitin Badjatia, where they discussed the shifting tides of CX and, more specifically, the rising call for contextualization over personalization. As we continue to innovate how experiences are crafted and delivered, rapidly experimenting and iterating with tools like generative AI, now is the time to better understand the difference between knowing the customer and understanding their circumstance.

This conversation explores the idea of modern CX in the context of a customer and their business. Miller talks through whether a bigger shift is in the works that will turn the conversation from personalization to contextualization. Breakthrough experiences that cut through the bland sameness and directly engage a person in the context of their specific industry, acknowledging the uniqueness of their circumstance and journey, are more possible than ever. As organizations look past channels of delivery, it will be the journeys and the "moments" that matter.

If you missed the first convo, check out Miller's CR CX Convo with Nitin Badjatia here.


Alibaba hits brakes on cloud spin off plans

Alibaba said it won't spin off its cloud intelligence group citing US sanctions and an uncertain demand environment.

The company reported cloud revenue of $3.79 billion, up 2% from a year ago, for the quarter ended Sept. 30. EBITA for the cloud unit was $193 million. Alibaba recently launched a new proprietary large language model, built out its AI developer ecosystem and AI application development platform.

However, expanded US export controls are likely to hamper Alibaba's cloud unit and its ability to procure processors for AI inference and model training. Previously: Alibaba plans to spin off its Cloud Intelligence Group within 12 months

The company said in a statement:

"We believe that these new restrictions may materially and adversely affect Cloud Intelligence Group’s ability to offer products and services and to perform under existing contracts, thereby negatively affecting our results of operations and financial condition. These new restrictions may also affect our businesses more generally by limiting our ability to upgrade our technological capabilities."

Due to the uncertainty, Alibaba said it will hit the brakes on its cloud spin off. The company said it will focus on a "sustainable growth model for Cloud Intelligence Group under fluid circumstances."


ServiceNow's latest Now Assist generative AI features highlight its strategy

ServiceNow launched a series of Now Assist generative AI services across its platform.

The features, available today, are part of a systematic effort to infuse generative AI use cases across workflows courtesy of Nvidia GPUs, ServiceNow large language models (LLMs) and the Now Platform.

Specifically, ServiceNow launched Now Assist in Virtual Agent, flow generation and Now Assist for Field Service Management (FSM). ServiceNow's generative AI strategy revolves around what it calls "practical generative AI applications" that focus on use cases and smaller models that are more efficient.

ServiceNow CEO McDermott talks business transformation, generative AI, processes

During a recent analyst briefing, CJ Desai, ServiceNow's Chief Product Officer, said:

"If we have smaller LLMs, they are cost efficient. They serve a particular use case. Small is better and a specific use case will run faster and provide a great experience. Our strategy is simple: Have ServiceNow specific small language models."

Among the ServiceNow generative AI features now available:

  • Now Assist in Virtual Agent makes it easier to create and deploy generative AI chat experiences. Updates include Q&A in knowledge management and multi-turn conversations for service requests and catalog orders.
  • Flow generation, which enables a broader base of admins and developers to automate more workflows using generative AI by turning text prompts into workflows. Workflows can then be tweaked and refined using App Engine's no-code interface.
  • Now Assist for FSM uses generative AI to access activity, parts and incidental data to summarize field work and provide more proactive service.

The three additions are available in the ServiceNow Store.



Cisco sees weak Q2 as customers digest, implement shipped orders

Cisco Systems reported better-than-expected first quarter results, but its fiscal 2024 outlook fell short of expectations. Cisco said it saw "a slowdown of new product orders in the first quarter of fiscal 2024."

The company reported first quarter earnings of 89 cents a share on revenue of $14.7 billion, up 8% from a year ago. Non-GAAP earnings were $1.11 a share. CEO Chuck Robbins said the first quarter was a "solid start" to the fiscal year. 

Wall Street expected first quarter earnings of $1.03 a share on revenue of $14.63 billion. Cisco recently announced plans to acquire Splunk to expand its observability, AI and cybersecurity footprint.

The issue for Cisco is its second quarter outlook. CFO Scott Herren said customers are implementing "large amounts of recently shipped product." "We expect to see product order growth rates accelerate in the second half of the year," he said.

Cisco added that it believes the primary reason for the slowdown hitting the second quarter is that customers are digesting "exceptionally strong product delivery over the past three quarters." Cisco estimated that there are one to two quarters of shipped product orders waiting to be implemented. 

As for the outlook, Cisco projected second quarter revenue of $12.6 billion to $12.8 billion with non-GAAP earnings of 82 cents a share to 84 cents a share. For fiscal 2024, Cisco is projecting revenue of $53.8 billion to $55 billion with non-GAAP earnings of $3.87 a share to $3.93 a share. Analysts were expecting second quarter revenue of $14.2 billion with non-GAAP earnings of 99 cents a share, and fiscal 2024 revenue of $57.8 billion with non-GAAP earnings of $4.05 a share.

By the numbers:

  • Americas revenue was up 14% with EMEA flat from a year ago and APJC down 3%.
  • Networking revenue in the first quarter was up 10%.
  • Security revenue in the first quarter was up 4%.
  • Observability revenue was up 21%.
  • Collaboration was up 3%.

Cisco closed five acquisitions in the first quarter: Accedian, Working Group Two, Oort Inc., SamKnows, and Code BGP.


Microsoft launches Azure Models as a Service

Microsoft CEO Satya Nadella said the company will launch Azure Models as a Service, which will aim to build out its OpenAI model offerings with a broader selection.

Nadella, speaking at Microsoft's Ignite 2023 conference, said the company wants to "bring the best selection of open-source models to Azure and do so responsibly."

He added that Azure's models-as-a-service offering will include Stable Diffusion, Llama2 from Meta, Mistral and Jais, the world's largest Arabic language model. Command from Cohere will also be available.

Microsoft launches AI chips, Copilot Studio at Ignite 2023

"With models-as-a-service, developers won't have to provision GPUs so they can focus on development and not back-end operations," said Nadella, who noted that customers can fine tune foundational models with their own data. "We want to support models in every language in every country," said Nadella.

Specifics include:

  • Models as a Service will offer inference APIs and hosted fine tuning for Llama 2 in Azure AI model catalog.
  • PayGo inference APIs are billed by the number of tokens used (a rough cost sketch follows this list).
  • Llama2 can be fine-tuned with your own data.
  • Content moderation is built into the service.
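Since the pay-as-you-go APIs meter usage in tokens, per-request cost is essentially token count times rate. Here's a back-of-the-envelope sketch; the per-1,000-token prices are invented placeholders, not Azure's actual rates:

```python
def estimate_request_cost(input_tokens: int, output_tokens: int,
                          price_in_per_1k: float, price_out_per_1k: float) -> float:
    """Pay-per-token billing: input and output tokens are metered separately."""
    return (input_tokens / 1000) * price_in_per_1k + (output_tokens / 1000) * price_out_per_1k


# Hypothetical prices purely for illustration; check the vendor's current price list.
cost = estimate_request_cost(input_tokens=1_200, output_tokens=300,
                             price_in_per_1k=0.003, price_out_per_1k=0.004)
print(f"${cost:.4f} per request")  # $0.0048
```

Multiplying that per-request figure by expected volume is usually the first gut check on whether a hosted pay-as-you-go model or a fine-tuned smaller one makes economic sense.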

Microsoft launches AI chips, Copilot Studio at Ignite 2023

Microsoft fleshed out its generative AI infrastructure plans with the launch of its custom processors for model training and inference along with Copilot Studio for new use cases.

At its Ignite 2023 conference, Microsoft Azure joined its rivals in offering custom processors for model training. Nvidia has been dominating the field, but now has competition from AMD along with custom processors such as Amazon Web Services Trainium and Inferentia and Google Cloud TPUs.

Microsoft launched Azure Maia 100 AI Accelerator, the company's first in-house custom AI system on a chip. Azure Maia is designed to optimize training and inferencing efficiency for large language models (LLMs). The launch of Azure Maia highlights how Microsoft is becoming a systems company, not just a software company. Microsoft in a blog post reiterated that it will work closely with Nvidia and will offer Nvidia's H200 GPUs as well as AMD's MI300X.

"We need to be even the world's best systems company across heterogeneous infrastructure," said Microsoft CEO Satya Nadella during his Ignite 2023 keynote. "We work closely with our partners across the industry to incorporate the best innovation. In this new age, AI will be defining everything across the fleet in the data center. As a hyperscaler, we see the workloads, we learn from them, and then get this opportunity to optimize the entirety of the stack."

According to Microsoft, Azure Maia features:

  • Support for open-standard MX sub-8bit data types.
  • Industry-standard Ethernet connectivity.
  • 4x Maia per server with liquid-cooled design.
  • 5nm process technology from TSMC.
  • The ability to run OpenAI models like GPT3.5 with testing for GitHub Copilot and Bing.

Rani Borkar, corporate vice president, Azure Hardware Systems and Architecture, said Microsoft is taking a systems approach to its infrastructure and optimizing for AI workloads. Azure has 60 data center regions today.

Microsoft uses Oracle Cloud Infrastructure for Bing conversational workloads

Borkar said Microsoft Azure is supporting a wide range of models from OpenAI and others. She also emphasized that Azure would continue to roll out instances based on the latest GPUs from Nvidia and AMD. Borkar said:

"To support these models, we are in that reimagining the entire cloud infrastructure, with a systems approach top to bottom end to end systems approach with a deep understanding of model architecture and workload requirements. We are optimizing our systems with our custom silicon so that our customers can benefit from performance, power efficiency, and the cloud promise of trust and reliability."

Borkar said that Azure plans to optimize every layer of the AI stack to reduce costs and improve energy efficiency.

To that end, Microsoft also outlined Azure Cobalt 100 CPU, the first Arm-based compute system on a chip from the company. Cobalt has up to a 40% performance/core improvement over previous Azure Arm servers. Features include:

  • Microsoft designed security and power management.
  • 128 cores; 12 channels of DDR.
  • Built on 5nm process technology.
  • Powers Microsoft Teams and Azure SQL Server.

When asked about better price/performance with Azure Maia, Borkar said each workload is different. The game plan is to give customers options to choose from a wide range of chips and ecosystems.

Constellation Research analyst Andy Thurai said:

"If Microsoft can prove that training and inferencing LLM custom models on their chipset can be more cost/power effieicient than the competitors there may be an opportunity to get customers to engage. At this point, I don’t see much traction for the chips when they are released next year."

Copilot Studio

Along with Microsoft's custom processors, the company also launched Copilot Studio. Copilot Studio will be of interest to enterprises looking to customize copilots for new use cases.

GitHub plans to infuse Copilot throughout its platform | OpenAI launches GPTs as it courts developers, models for use cases | Software development becomes generative AI's flagship use case

The game plan for Microsoft and Copilot Studio is straightforward: Build copilots for every Microsoft experience. Ignite featured copilot hooks for everything from Teams to PowerApps to security and various business applications.

Omar Aftab, Vice President of Conversational AI at Microsoft, said Microsoft is building out a copilot ecosystem and Copilot Studio is the next iteration of the strategy. "Copilot Studio allows users to extend various first party Copilots or build their own custom enterprise Copilots," said Aftab. "You essentially have a number of conversational tools at your fingertips."

Microsoft's Copilot Studio launch will also enable customers to build custom GPTs from OpenAI. Copilot Studio is also integrated with Microsoft Azure's various services.

Key points about Copilot Studio include:

  • End-to-end lifecycle management.
  • Managed components, APIs, Azure services and low-code tools and plug-ins in one place.
  • Ability to supplement LLMs with business-critical topics.
  • Built-in analytics and telemetry on Copilot performance.
  • Access to corporate documents as well as public searches.
  • Publishing on websites, SharePoint and other options.
  • Ability to segment access by roles.

Ultimately, Microsoft plans to build a Copilot marketplace, but that will happen over time, said Aftab.
