
Using AI for ESG Assessments, BT150 Lessons Learned | ConstellationTV Episode 68

🎬 This week on ConstellationTV episode 68, co-hosts Doug Henschen & Dion Hinchcliffe give a rundown of the latest enterprise tech news, Larry Dignan interviews SuperNova winner Marie Merle Caekebeke from Baker Hughes about using AI and LLMs for ESG Materiality Assessments, and a CCE 2023 panel recap shares lessons learned from our BT150 alumni.

00:00 - Introduction
01:19 - Tech News (Tech #earnings, digital transformation, generative AI)
14:26 - SuperNova Winner Interview about AI uses for ESG initiatives
22:58 - CCE 2023 Highlights - Lessons Learned from BT150
31:30 - Bloopers!

ConstellationTV is a bi-weekly Web series hosted by Constellation analysts. The show airs live at 9:00 a.m. PT/ 12:00 p.m. ET every other Wednesday. Subscribe to our YouTube Channel: https://lnkd.in/gsFWq66W

On ConstellationTV: https://www.youtube.com/watch?v=gxdKte5vniU

Celonis launches Process Intelligence Graph, makes case process enables automation, AI applications

Celonis is betting that process mining data will be the enabler for automation and generative AI across enterprises with the launch of its Process Intelligence (PI) graph.

The argument is worth considering as the intersection of process automation, intelligence, machine learning and artificial intelligence is getting crowded. Multiple vendors are gunning to be the platform that connects the systems and processes behind business transformation.

At Celonis' Celosphere conference, the company outlined how it wants PI Graph to be the "Wikipedia of Process Intelligence." The idea here is that the system agnostic PI Graph, which was introduced a day after Celonis acquired Symbio, will leverage process knowledge from customer deployments and provide an intelligence layer that will form a digital twin of the business and ultimately value chains. Disclosure: I used to work for Celonis. 

This process layer will orchestrate systems, processes and optimizations to continually improve operations. According to Celonis, PI Graph will be the common process language to unify enterprises. Alex Rinke, co-CEO of Celonis, noted that the PI Graph is "the connective tissue that’s been missing in modern enterprises." The data from the PI Graph could give AI and automation models process knowhow. 
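The "common process language" idea rests on a classic process-mining primitive: deriving a directly-follows graph from event data. The toy sketch below (a hypothetical event log, not Celonis' PI Graph format) shows the core idea of extracting process structure that AI and automation models could consume:

```python
from collections import Counter

# Toy event log: each case is an ordered list of activities.
# Purely illustrative data -- not Celonis' PI Graph format.
event_log = {
    "order-1": ["Create Order", "Approve", "Ship", "Invoice"],
    "order-2": ["Create Order", "Approve", "Invoice", "Ship"],
    "order-3": ["Create Order", "Reject"],
}

def directly_follows(log):
    """Count how often activity A is immediately followed by activity B."""
    edges = Counter()
    for trace in log.values():
        for a, b in zip(trace, trace[1:]):
            edges[(a, b)] += 1
    return edges

graph = directly_follows(event_log)
# ("Create Order", "Approve") appears in 2 of the 3 cases.
```

From a graph like this, a process layer can spot deviations (such as invoicing before shipping) that an automation or AI model could act on.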

Multiple vendors are aiming to be this connective tissue between enterprise processes. The game is to meld process models, KPIs, insights and workflows into one experience. For instance, Microsoft bought Minit in 2022 and then wrapped that process mining technology into a Power Automate wrapper. SAP acquired Signavio and has its platform as well as Datasphere as a wrapper with additions such as LeanIX. UIPath has built out its automation platform that puts process mining together with other technologies including its AutoPilot generative AI.

Constellation Research CEO Ray Wang said Celonis' PI Graph gives companies the ability to embed process intelligence into "their operating models and technology stacks, enabling a new wave of powerful applications and use cases."

At Celosphere, Celonis also launched the following:

  • Process Copilot, a system to identify value via the PI Graph.
  • A new Celonis Studio to enhance app testing, create dashboards and launch actions.
  • Transformation Hub, which provides a hub to measure the value from Celonis implementations.
  • Process Adherence Manager, formerly Process Sphere, is generally available.   

New Relic brings observability to AI stack

New Relic launched New Relic AI monitoring, which aims to bring observability to AI operations and applications. New Relic also expanded its partnership with AWS to integrate Amazon Bedrock with its AI monitoring platform.

The company’s launch is timely given that boards of directors are pressuring CXOs to deliver generative AI applications and productivity gains, but enterprises are trying to avoid large language models (LLMs) and applications that aren't tracked. New Relic recently went private in a deal valued at $6.5 billion. Meanwhile, vendors are scrambling to provide a generative AI magic bullet with fun names, domain specific LLMs and add-ons that add up, but the reality has been more challenging.

According to New Relic, AI Monitoring (AIM) will bring visibility across AI applications so enterprises can optimize performance, quality and costs. New Relic AIM will have more than 50 integrations and include LLM model comparisons and response tracing. New Relic said AIM is designed to monitor LLMs and vector databases to surface inaccuracies, biases, security issues and telemetry data to give engineers insights to the AI stack.

Constellation ShortList™ Observability | Constellation ShortList™ AIOps | A CIO's Guide to Observability

Since New Relic already has application performance monitoring (APM) tools, the extension into AI gives enterprises a suite to observe the entire enterprise stack. Cisco’s acquisition of Splunk is similarly driven by the expanding observability market.

New Relic AIM will monitor the following AI platforms:

  • Orchestration framework: LangChain
  • LLM: OpenAI, PaLM2, HuggingFace
  • Machine learning libraries: Pytorch, TensorFlow
  • Model serving: Amazon SageMaker, AzureML
  • Vector databases: Pinecone, Weaviate, Milvus, FAISS
  • AI infrastructure: Azure, AWS, Google Cloud

Features in New Relic AIM include auto instrumentation, a holistic view across AI applications integrated with application performance monitoring, deep trace insights for LLM responses, performance and cost comparisons and tools to enable responsible use of AI.
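To make "auto instrumentation" concrete, the generic sketch below wraps an LLM call to capture latency and rough token counts. It illustrates the kind of telemetry an observability agent collects; it is not New Relic's actual agent API, and the model call is a stub:

```python
import time
from functools import wraps

TELEMETRY = []  # a real APM agent would export these spans to a backend

def trace_llm(fn):
    """Record latency and rough token counts for an LLM call.
    Generic instrumentation sketch -- not New Relic's agent API."""
    @wraps(fn)
    def wrapper(prompt, **kwargs):
        start = time.perf_counter()
        response = fn(prompt, **kwargs)
        TELEMETRY.append({
            "name": fn.__name__,
            "latency_ms": (time.perf_counter() - start) * 1000,
            "prompt_tokens": len(prompt.split()),        # crude whitespace proxy
            "completion_tokens": len(response.split()),  # crude whitespace proxy
        })
        return response
    return wrapper

@trace_llm
def stub_model(prompt):
    """Stand-in for a real LLM client call."""
    return "stubbed model answer"

stub_model("why is the sky blue")
```

Response tracing and cost comparison features build on exactly this kind of per-call span data.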



Dell Technologies, Hugging Face aim to make on-premises generative AI deployments easier

Dell Technologies and Hugging Face are teaming up to target on-premises generative AI deployments to make the jump from enterprise proof-of-concepts to production easier.

While cloud hyperscalers have dominated the generative AI discussion, enterprises are wary of costs as well as protecting data and see hybrid approaches more viable. In addition, CXOs are being asked by boards of directors to produce generative AI projects, but the move to production use cases is challenging. Vendors are scrambling to provide a generative AI magic bullet with fun names, domain specific LLMs and add-ons that add up, but the reality has been more challenging.

"Customers are reporting--and we observe this every day--that they're frustrated by the complexity associated with building AI that includes applications in closed ecosystems," said Dell Technologies' Matt Baker, Senior Vice President of AI Strategy. "If you recall with the era of big data, there was a real challenge progressing from proof of concept to production."

The theory behind the Dell and Hugging Face partnership is that AI is likely to go to where the data lives--mostly on premises--instead of the other way around. Dell has been publishing validated designs that include storage, servers and accelerators.

Jeff Boudier, Head of Product and Growth at Hugging Face, added that "going from a model to a solution you can deploy is a completely different ballgame."

Under the partnership, Dell customers will be able to select open-source models for use cases on Hugging Face, pick optimized infrastructure for that workload and get help tuning models within a container. The model from there is on-premises and owned by the customer.

Boudier added that enterprises are looking to take control of their machine learning and AI destiny. "The only way to do that is to become a builder," he said.

The two companies will also aggregate libraries, data sets and tutorials on training. Baker said that there will be templates and models designed for specific outcomes. Some models will be trained for use cases and an enterprise would then couple it with proprietary data in a container for tuning.

Initially, Dell Technologies and Hugging Face will focus on PowerEdge and data center gear designed for models. Baker, however, noted that workstations will also be included. The Dell and Hugging Face connection will also be available through Apex.


How Wayfair's Tech Transformation Aims to Drive Revenue While Saving Money | BT150 Executive Interview

 

Constellation Insights Editor in Chief Larry Dignan sits down with Fiona Tan, CTO of Wayfair, to discuss new revenue initiatives using AI and ML use cases...

Wayfair saw breakneck growth three years ago and an ensuing hangover that required a focus on operating margins and execution, but a technology transformation has the company thinking big again.

The to-do list: build out a flexible technology infrastructure, drive revenue while saving the business money, and leverage years of experience in data analytics, artificial intelligence, and machine learning to create generative AI use cases.

On Insights: https://www.youtube.com/watch?v=Wg59pof50qE

Nvidia launches H200 GPU, shipments Q2 2024

Nvidia launched its Nvidia H200 GPU, which will offer faster memory and more bandwidth for generative AI workloads. The H200 will ship in the second quarter of 2024.

The launch comes as Wall Street waits for Nvidia's earnings and a read on whether the company could meet demand. In addition, Nvidia is about to see competition from AMD and hyperscale cloud players have their own proprietary chips for model training.

Nvidia's H200 is the first GPU to offer HBM3e memory, which gives the H200 the ability to deliver 141GB of memory at 4.8 terabytes per second. That tally is a big jump in capacity and bandwidth relative to the Nvidia A100. The H200 serves as the base for the Nvidia HGX H200 AI computing platform based on the company's Hopper architecture.
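A back-of-envelope calculation shows why the bandwidth figure matters for generative AI: at inference time, each generated token must stream the model's weights from memory, so bandwidth caps token throughput. The model size below is an assumed round number, not an Nvidia benchmark:

```python
# Back-of-envelope: memory bandwidth as a ceiling on LLM inference speed.
# Assumes a hypothetical 70B-parameter model in FP16 (2 bytes/parameter);
# real serving uses batching and KV caches, so this is only a rough bound.
h200_bandwidth_tb_s = 4.8          # per Nvidia's H200 spec
model_bytes = 70e9 * 2             # 140 GB of weights, just fits in 141 GB

# Every generated token reads (roughly) all weights from memory once.
seconds_per_token = model_bytes / (h200_bandwidth_tb_s * 1e12)
tokens_per_second = 1 / seconds_per_token
print(f"{tokens_per_second:.0f} tokens/s upper bound for a single stream")
```

By the same math, the A100's roughly 2 TB/s of bandwidth supports less than half that throughput, which is why memory upgrades, not just compute, drive inference gains.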

According to Nvidia, H200 will nearly double the inference speed on the Llama 2 large language model. Nvidia said software updates will boost performance more.

The latest Nvidia GPU will be available across server makers such as ASRock Rack, ASUS, Dell Technologies, Eviden, GIGABYTE, Hewlett Packard Enterprise, Ingrasys, Lenovo, QCT, Supermicro, Wistron and Wiwynn.

Amazon Web Services, Google Cloud, Microsoft Azure and Oracle Cloud Infrastructure will also deploy H200-based instances as will CoreWeave, Lambda and Vultr.

One thing to watch going forward is whether Nvidia's GPU supply can meet demand. Also watch which vendors get GPU allocations relative to others.

For instance, Super Micro CEO Charles Liang said on the company's fiscal first quarter earnings call:

"We navigated tight AI GPU and key components supply conditions to deliver total solutions and large compute clusters, especially for generative AI workloads where our backorders continue to expand faster than our forecast.

During the first quarter, demand for our leading AI platforms in plug-and-play rack-scale, especially for the LLM-optimized NVIDIA HGX-H100 solutions, was the primary growth driver.

Many customers have started to request direct-attached cold-plate liquid-cooling solutions to address the energy costs, power grid constraints and thermal challenges of these new GPU infrastructures."


Celonis acquires Symbio, aims to meld AI, process intelligence in one package

Celonis said it will acquire Symbioworld GmbH, a business process management software vendor, in a move that brings together process mining with AI-driven process modeling.

Terms of the deal weren't disclosed. Celonis and Symbioworld have also launched a version of Symbio's Process Cockpit in beta with select customers that will meld process insights and data in one experience. Disclosure: I used to work for Celonis.

Vendors in the process automation and management market are aiming to make process mining more accessible via AI, generative AI and various customer experiences.

The game is to meld process models, KPIs, insights and workflows into one experience. For instance, Microsoft bought Minit in 2022 and then wrapped that process mining technology into a Power Automate wrapper. SAP acquired Signavio and has its platform as well as Datasphere as a wrapper with additions such as LeanIX. UIPath has built out its automation platform that puts process mining together with other technologies including its AutoPilot generative AI. Celonis had its execution management system. Ultimately, generative AI has the potential to build process models as well as make them more accessible to a broader range of users. Rest assured there will be more process copilots and assistants in the near future.

In a statement, Celonis said Symbio would help it "provide managers and employees with a unified end-to-end process experience." Ultimately, Celonis is looking to use its process mining technology to surface how processes actually run and then leverage Symbio to continually improve them with a "process-first approach."

Symbio's core products include:

  • Navigator, which guides employees in daily work via an AI Copilot.
  • Action Center, a part of Navigator that simplifies task tracking and compliance.
  • Process Cockpit, which gives enterprises a holistic view of processes, metrics and optimizations.
  • Business Manager, which maps processes via AI into a flow.

Celonis and Symbio initially collaborated on Process Cockpit, but the companies noted that enhanced process visibility is available in Navigator, which is embedded into Microsoft Teams. The companies added that the combination of generative AI assistants, process data and process knowledge can provide intelligence to a broader base of users.


How Wayfair's tech transformation aims to drive revenue while saving money

Wayfair saw breakneck growth three years ago and an ensuing hangover that required a focus on operating margins and execution, but a technology transformation has the company thinking big again. The to-do list: build out a flexible technology infrastructure, drive revenue while saving the business money, and leverage years of experience in data analytics, artificial intelligence, and machine learning to create generative AI use cases.

During the COVID-19 pandemic, Wayfair became a home goods retailing giant as loyal customers and new ones outfitted home offices and decor. The numbers tell the tale. In 2020, Wayfair revenue surged to $14.1 billion from $9.1 billion in 2019 and it ended the year with 31 million active customers and 61 million orders delivered. In 2021, Wayfair revenue was $13.71 billion with 27 million active customers and 52 million orders delivered. By 2022, Wayfair revenue was $12.2 billion with 40 million active customers and 40 million orders delivered.

Niraj Shah, CEO of Wayfair, said 2023 was about returning to profitability, becoming more efficient, and investing in growth. On Wayfair's third quarter earnings conference call, Shah said, "I'm confident the overarching theme of 2023 will be execution. Our team came into this year with a plan, a plan to see our core recipe return to form, to return our business to profitability, and to continue pushing our major growth initiatives forward."


Indeed, Wayfair is delivering positive free cash flow and improving its active customer metric. Shah said Wayfair's plan is to "nail the basics--driving customer and supplier loyalty and cost efficiency." He added that Wayfair is focusing on what it can control, such as costs, amid a volatile economy and moving the needle on active customers, order values, repeat orders, and market share.

The average Wayfair shopper places about two orders per year totaling about $540. "This shopper isn't someone that's typically refitting an entire room or house but going through their home item by item, project by project making small updates on a much more frequent cadence," said Shah. "If our customers stay in their homes for longer, we're well-positioned to be their retailer of choice the next time they decide that they'd like a new lamp for the living room or want a new set of chairs for their dining table."

Shah said Wayfair's growth trajectory revolves around its specialty and luxury brands, international efforts, physical retail, supplier advertising, and professional services. Wayfair will be vigilant about the returns on these efforts relative to the investment thesis. "Even with a turbulent macro, we remain committed to being adjusted EBITDA profitable in good times and bad," said Shah.

Technology and business alignment

For Wayfair Chief Technology Officer Fiona Tan, a Constellation Research BT150 member, business alignment with information technology is critical--especially as conditions change. In an interview with Constellation Insights at Connected Enterprise, Tan said Wayfair's migration to the cloud paid off well during the COVID-19 pandemic since it was able to scale up when demand spiked.

Nevertheless, Wayfair, founded in 2002, invested heavily in supply chain, logistics, and scaling as demand surged in 2020. Wayfair wasn't alone as many pandemic winners invested in people, technology, and infrastructure only to scale back later.

"As we've seen the category pullback, it's been a good forcing function to make sure that we’re going back to really being proficient with our costs, not only from a technology perspective, but also from an operational perspective," said Tan. "We've been able to focus on growth and profitability in a home market that's still depressed. The fact that we're able to grow our market share in this environment will really set us up well when people start buying more for their home again."

Tan said her team's goal is to work with the business to ensure Wayfair has the "right platform and infrastructure from a technology perspective to enable us to continue to grow in a flexible, scalable, and lean way."

As of Wayfair's Investor Day in August, the company said it had more than 2,000 people focused on technology out of more than 13,000 employees.

Tan added that Wayfair is fortunate that it is a digitally native company when it comes to business and operations. "I think a lot of our stakeholders also realize that tech enablement of the business is super important," said Tan. "We're always very focused on the customer experience and what outcomes we're trying to drive."

Wayfair's platform also has to accommodate a model that aggregates suppliers on the back end, retails the product to the customer, and fulfills the orders. This orchestration means Wayfair has to be skilled at merchandising and curating products, fulfilling orders, and shipping and handling expensive, large items.

Tan's technology organization is improving customer experience by helping to deliver the right products at the right price at the right time, leveraging available technologies, and using analytics powered by artificial intelligence and machine learning. Her team also works on marketing technology and generative AI use cases powered by Wayfair's first party data.

In addition, Wayfair has to continually cut technical debt. "We've been doing a lot of work to pay down some of our technical debt," said Tan. "We also see some cool things that were written a long time ago with old code that can potentially be reused."

The tech, data, and customer journey platform

Tan said the technology platform behind Wayfair’s business is built around data driven insights, AI, and machine learning to create a personalized shopping journey and seamless experience for customers and suppliers. The main characteristic of Wayfair's technology platform is that it must be flexible.

"Everything we do with the tech platform has been custom built to enable our end-to-end customer journey," said Tan, who also said Wayfair's tech transformation is currently in flight and critical to scaling revenue going forward. "Our tech transformation encompasses both organization and technology improvements and we want to make sure that we're going to be able to build a world class platform while creating an environment where our teams can do their best work."

To date, Wayfair has moved from a monolithic model to cloud native microservices on Google Cloud. The move has enabled it to deploy and scale capabilities faster while being easier to maintain, she said. Moving to the cloud and a microservices model has enabled Wayfair to integrate machine learning, AI, and generative AI into the platform for everything from forecasting to marketing to UX.

Tan added that Wayfair is using Google's large language models for semantic searches and machine learning to help customers find products when they aren't exactly sure what they’re looking for. "There is a considerable amount of machine learning that's powering the customer journey," said Tan.
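Semantic search of this kind generally works by embedding queries and product text as vectors and ranking by similarity. The sketch below uses toy, hand-made vectors; in production an embedding model (such as Google's) would produce them, and a vector database would do the ranking at scale:

```python
import math

# Toy embeddings -- in practice an LLM embedding model produces these vectors.
catalog = {
    "mid-century walnut floor lamp": [0.9, 0.1, 0.2],
    "velvet accent chair":           [0.1, 0.9, 0.3],
    "brass reading light":           [0.8, 0.2, 0.1],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def search(query_vec, k=2):
    """Rank catalog items by similarity to the query embedding."""
    ranked = sorted(catalog,
                    key=lambda name: cosine(query_vec, catalog[name]),
                    reverse=True)
    return ranked[:k]

# A vague query like "something to light my living room" would embed
# near the lighting items rather than the chair.
print(search([0.85, 0.15, 0.15]))
```

The appeal over keyword search is that a shopper who "isn't exactly sure what they're looking for" still lands near the right products.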

On the supplier side, Tan said Wayfair uses machine learning to help suppliers automate and manage pricing, inventory merchandizing, and post-service needs. "With machine learning, we are actually able to extract, augment, and correct the information that we get from our suppliers," said Tan.

Technology drives experience

At Wayfair's Investor Day, Steve Conine, Co-Founder and Co-Chairman with Shah, said technology impacts every facet of the company's business including experience. "We have a massive product catalog and one of the things about home is it is shopped differently. It's very emotive and people often can't tell you exactly what they want, but need to be led to a solution," said Conine. "We've invested a lot in trying to figure out how do you help rationalize a massive catalog and make it exciting for consumers to shop."

Conine, who was CTO for Wayfair's first decade, added, "80% of ecommerce is really the operation side of it and it is very easy for us to throw up a website that promises everything’s in stock and going to ship same day and be delivered to you seamlessly. It is very hard to durably deliver that and make it seamless to a customer. You come on and it feels just like the shopping experience you expect. To make that happen is really what differentiates great retailers from mediocre retailers or retailers that go out of business."

According to Paul Toms, Wayfair Chief Marketing Officer, the customer experience is designed to address multiple life stages, budgets, and ages. "It could be everything for your first dorm room, really reflecting who you are and who you want to be when you go away to college or could be an empty nester couple who are redecorating their child's room and turning it into a guest room. And anything in between and frankly after," said Toms.

Data informs how Wayfair approaches different emotional states, financial needs and constraints, and life stages. "We want them to have good feelings about Wayfair early so that as they grow in their spending power, size of their home, or size of their family that they're thinking about Wayfair as being there for them along that journey," said Toms, who noted that Wayfair has 85 million contacts and can disintermediate media companies and talk to people directly.

Wayfair has a proprietary attribution and measurement platform called Themis, which aggregates customer history and offline activity and allocates incremental revenue to each touchpoint such as Google, Instagram, Facebook, Pinterest, and TV. Toms said Themis is powered by more than 4,000 machine learning models and 5 years of continuous investment.
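Themis itself is proprietary, but the underlying multi-touch attribution idea can be illustrated with the simplest possible scheme: splitting each order's revenue equally across the touchpoints in the customer's path. Real incrementality platforms weight touchpoints with ML models rather than equal shares, and the journeys below are hypothetical:

```python
from collections import defaultdict

# Hypothetical customer journeys: (touchpoint path, order revenue).
journeys = [
    (["Google", "Instagram", "TV"], 540.0),
    (["Pinterest", "Google"], 120.0),
]

def linear_attribution(journeys):
    """Split each order's revenue equally across its touchpoints.
    A stand-in for ML-based incrementality models like Themis."""
    credit = defaultdict(float)
    for path, revenue in journeys:
        share = revenue / len(path)
        for touchpoint in path:
            credit[touchpoint] += share
    return dict(credit)

print(linear_attribution(journeys))
# Google earns 540/3 + 120/2 = 240.0 of attributed revenue
```

Summed over millions of journeys, this kind of allocation tells a marketer which channels actually drive incremental revenue.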

Once the connection to a customer is made, Wayfair's merchandising engine kicks in where the retailer along with supplier partners identify the highest potential SKUs, merchandise them well with images and reviews, and show those products across various touchpoints.

"It starts with identifying the best items and the highest potential SKUs in our catalog. This is not just today's winners, but tomorrow's winners. We do this with both algorithms and merchants. So, we have human eyes on all of these products," explained Liza Lefkowski, Vice President of Curated Merchandising, Brands and Store at Wayfair.

Lefkowski, speaking at the company's Investor Day, added that Wayfair is opening physical locations to serve as retail labs where it has staged floor sets for every department.

Pricing also matters and Wayfair tests and models price levels to create optimal margins while competing in the market.

Once a sale closes, Wayfair's proprietary supply chain takes over. "We built that supply chain because we truly believe it is core. It is a key enabler for the business. And without that supply chain, we are not able to provide the quality, the convenience, and the cost base for the business to be successful," said Thomas Netzer, Wayfair's chief operating officer.

Wayfair's proprietary supply chain capabilities include in-house logistics, partnership management, and information management. Wayfair has also invested in robotics and algorithms designed to cut costs with "profit aware sorts," which prioritize products with low costs and low incident rates delivered at high speeds. Wayfair has more than 70 initiatives in supply chain designed to save more than $500 million.

Closing the experience is a sales and service team that is broken up into specialties to guide customers to the best items for their homes. Wayfair has more than 3,000 sales and service agents behind more than 15 million interactions a year. Machine learning and AI drive everything from intent identification to channel routing, agent matching, and follow-ups.

What's next?

Tan said the biggest item on her 2024 to-do list is enabling Wayfair's transformation with new capabilities such as experiences powered by generative AI and large language models.

"I think there are going to be some interesting changes to how customers interact with their user experiences," said Tan, adding that the home category requires new experiences. "It's an interesting category where you can filter with visual searching and conversational searching together. I think that's interesting for us because it's both text and imagery on Wayfair."

At Wayfair's Investor Day, Tan said the company is looking at generative AI "in a very pragmatic way." Specifically, generative AI use cases are framed around experiences and productivity. Wayfair also has a framework for generative AI use cases based on whether there will be a human in the loop to evaluate output before going to a customer or supplier.

One proof of concept revolves around customer sales and service and generated response text. Results have shown a reduction in call resolution time as well as improvements in agent productivity and customer satisfaction, said Tan. Another effort revolves around developer productivity using Microsoft's GitHub Copilot as well as Google's Duet AI. "We are excited about how some of this code completion and generative code can actually help accelerate our tech transformation efforts," she said.

Tan also said that generative AI can play a role in the customer design experience. There are tests with customer facing generative AI in controlled environments. "We believe that some form of conversational search or AI guided assisted shopping will become prevalent in the next couple of years," said Tan. "Our goal is always to make sure that our search experience is the best in the home category. So, customers come to us first and come back whenever they want to search for something for their home."

Generative AI can also be used for the design experience so customers can imagine spaces and style preferences by combining text and images and create photorealistic spaces just based on uploading a picture of a room.

"We believe that we have multiple distinct advantages when it comes to our ability to win using generative AI. We have a very large, rich, and proprietary first party data set that is unique to us and can be used to fine tune and train these foundational LLMs specific to our use cases," said Tan.

According to Tan, Wayfair can leverage pre-trained models as a service and then tailor them to use cases. "Our ability to figure out good use cases for this capability is going to be important," she said.

Wayfair's technology strategy in recent years has revolved around digesting an investment surge and then becoming more efficient. Going forward, Wayfair's technology focus will be more on experience and the art of the customer journey. "We'll really be focused more on the art and that'll be an emphasis as well," said Tan.

In any case, Tan said Wayfair's technology platform will be ready.



GitHub plans to infuse Copilot throughout its platform

GitHub is doubling down on its copilot strategy with the launch of Copilot Chat, Copilot Enterprise and a series of tools designed to enhance software development productivity and collaboration.

At GitHub Universe 2023, GitHub outlined its latest offerings and roadmap. The company, owned by Microsoft, is copilot-happy just like its parent. However, GitHub Copilot is aimed at one of the top generative AI use cases: software development productivity. GitHub noted a study from last year citing a 55% boost in developer productivity with copilots.

Real-world developer productivity gains are on par with GitHub's findings. Pauline Yang, a Partner at Altimeter Capital, laid out the software development economics on a panel at Constellation Research's Connected Enterprise. Yang said:

"One of the big use cases that we've seen really take off is developer productivity. If you talk to CTOs, they have all these different metrics--how happy their developers are, how many more pull requests they're getting, or how much more productive their senior engineers are. We believe that a lot of companies are becoming software companies, even if you're not selling software, and the costs of engineers right now are so high that a 40% productivity gain with your engineers is massive and so is happiness of paid developers. All of those gains are economic value."
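Yang's point about engineering costs can be made concrete with a back-of-envelope calculation. The figures below (headcount and fully loaded cost per engineer) are illustrative assumptions, not from the article; only the 40% gain comes from Yang's remarks:

```python
# Back-of-envelope: economic value of a productivity gain for an engineering org.
# Headcount and cost figures are illustrative assumptions.
def productivity_value(engineers: int, cost_per_engineer: float, gain: float) -> float:
    """Annual value recovered by a fractional productivity gain, treating
    engineer output as worth at least its fully loaded cost."""
    return engineers * cost_per_engineer * gain

# e.g., 100 engineers at a $200k fully loaded annual cost, with a 40% gain
value = productivity_value(100, 200_000, 0.40)
print(f"${value:,.0f} per year")  # $8,000,000 per year
```

At those assumed numbers, even a modest team recovers millions of dollars of output annually, which is why CTOs track these metrics so closely.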

Given that backdrop, it's not surprising that GitHub CEO Thomas Dohmke said the company is "re-founded on Copilot." In a blog post, Dohmke said: "It is our guiding conviction to make it easier for developers to do it all, from the creative spark to the commit, pull request, code review, and deploy—and to do it all with GitHub Copilot deeply integrated into the developer experience."

To that end, GitHub announced the following:

  • GitHub Copilot Enterprise, which can be personalized for an enterprise's code base, including private code. This private code personalization capability was also cited by Amazon CEO Andy Jassy as a big use case for Amazon CodeWhisperer. GitHub Copilot Enterprise will be available in February for $39 per user per month. Copilot Business is $19 per user per month.
  • GitHub Copilot Chat, which will be generally available in December as part of the GitHub Copilot subscription. GitHub Copilot Chat will be powered by OpenAI's GPT-4, offer inline chat within the context of code editing, support slash commands for big tasks and be available in the JetBrains suite. It will also be integrated into the GitHub website and the company's mobile app.
  • GitHub Copilot Partner Program, an ecosystem that will add third-party developer tools to GitHub Copilot. There are more than 25 debut partners, including Datadog, Postman and DataStax.
  • GitHub Advanced Security, which adds AI-powered vulnerability prevention and remediation built on large language models. The AI-driven tools are available in preview and will be included in GitHub Advanced Security subscriptions.
  • GitHub Copilot Workspace, an early demo from GitHub Next. GitHub said its next-generation Copilot Workspace will be delivered in 2024.

Related:


How Baker Hughes Used AI and LLMs for ESG Materiality Assessments | 2023 SNA Award Winners

Baker Hughes’ Marie Merle Caekebeke, Sustainability Executive – Strategic Engagement, was initially skeptical about AI. Now she’s thinking next phases and leveraging AI to make ESG more strategic. Here’s what she learned from a project with C3 AI.

On Insights <iframe width="560" height="315" src="https://www.youtube.com/embed/mREEJVBeyv4?si=Mo9iX648J_akmRq2" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>