Splunk’s Acquisition by Cisco Accelerates Convergence of Network, Security, and Observability, Fueled by AI

Two questions have haunted me for two decades: first, can we really address security without addressing networks? Second, are observability and security like oil and water? With customers adopting SASE (Secure Access Service Edge) solutions, we have begun to see a convergence of networks and security. Now, with Splunk's acquisition, Cisco has answered my second question.

Cisco officially closed its acquisition of Splunk a few days ago. Last week, leadership from Cisco and Splunk communicated their joint vision in an executive roundtable that I attended. The roundtable featured Liz Centoni, Cisco EVP and Chief Customer Experience Officer; Tom Casey, SVP of Splunk Products & Technology; and Jeetu Patel, Cisco EVP and GM of Security and Collaboration. Together, they offered a glimpse into how they see Cisco and Splunk working together to better serve their customers. Here's a breakdown of the key points, my brief analysis, and recommendations for customers.

Leaders' Vision: A Unique Combination of Security, Observability, and AI

The core message from Cisco and Splunk was clear: data, observability, and AI are the cornerstones of modern security. Jeetu Patel emphasized,

"If you want to be a world-class security company, you have to be a world-class AI company, and if you want to be a world-class AI company, you have to be a world-class data company."

What excites both companies is the unique combination their offerings bring. Cisco's strength in networking and security complements Splunk's expertise in data platform and observability. As Tom Casey highlighted, "There's a lot of complements between the two areas."

This unique combination, according to Jeetu, "doesn't exist in the market today."  A fully integrated security and observability platform with AI at its core has the potential to revolutionize how organizations approach security.

Jeetu outlined three key objectives: enhancing efficacy through generative AI, improving user experience, and optimizing economics. In his closing remarks, he reiterated the companies' commitment to customer-centric innovation, “We will always start from the customer first and work backwards. We love our customers keeping us honest and making sure that we can actually drive to the outcomes.”

My Insights

  • Cisco's acquisition of Splunk presents a unique opportunity to address customers' end-to-end cybersecurity needs, leveraging the power of AI and data analytics. Although integrations require time and effort, when executed effectively, they have the potential to solve complex challenges and enhance operational efficiency.
  • Cisco and Splunk are culturally different companies: Cisco is a mature networking player known for its robust partner ecosystem, while Splunk boasts a broader developer reach with its security and observability offerings. By harnessing each other's strengths, they can foster a thriving cybersecurity ecosystem that paves a path for companies to build compelling solutions on their platform. A complementary acquisition typically benefits customers more than an overlapping one.
  • Traditionally, observability and security were seen as distinct areas. Yet, their merging offers Cisco and Splunk an unprecedented opportunity to tackle enduring cybersecurity issues, aggravated by data silos. In a time when organizations struggle with a scarcity of cybersecurity expertise, Cisco's ambition to democratize cybersecurity via AI reflects prevailing industry patterns we observe, placing emphasis on enhanced tool adoption and fortified security posture.

Recommendations for CxOs

  • As you navigate the convergence of networking and security and formulate your AI strategy, reassess your current cybersecurity landscape. Look for opportunities to streamline your tools to drive increased adoption. Remember, increased adoption is superior to the allure of extravagant features.
  • Advocate for improved integration between Cisco and Splunk by articulating specific business outcomes you aim to achieve. Communicate your expectations to both companies' leadership regarding the enhancements you anticipate in the coming weeks and months. Consider attending their conferences this summer, Cisco Live and Splunk .conf24, to gain insights into their respective product roadmaps and provide your valuable feedback.
  • As you craft your security strategy and execution plan, check out our "11 Top Cybersecurity Trends of 2024 and Beyond." (If you're a vendor and don't have access to the report please contact me for a courtesy copy.) Drawing insights from numerous conversations with security, technology, and business leaders as well as extensive market research, this cybersecurity trends report offers a holistic view into the broader cybersecurity landscape. It also offers tangible recommendations for CxOs who are frantically navigating the cybersecurity maze to design and operationalize their cybersecurity strategy, with the objective to improve their defenses against increasingly sophisticated attacks.

BT150 CXO zeitgeist: Data lakehouses, large models vs. small, genAI hype vs. reality

Enterprises need to focus on data lakehouse strategies in 2024 to properly take advantage of generative AI; model architecture will be critical to managing large and small models; fine tuning is more difficult than you'd think; and CXOs were weary of database vendors glomming on to genAI hype.

Those were some of the takeaways from Constellation Research's April 5 BT150 CXO meetup.

These gatherings, held under the Chatham House Rule, are a venue to share information and emerging trends.

Here's a look at the topics from our April meetup.

Fine tuning isn't as easy as you'd think. While fine tuning and customizing a foundational model should be easier than training a large language model from scratch, the process is more involved. The tooling isn't mature enough yet for fine tuning at scale, and enterprises are still evaluating where to host their data.

Get that data lakehouse. Enterprises are coming around to the reality that they need to have a data strategy before even thinking about AI. Considerations include:

  • Ability to move data to models in real time.
  • Need to combine enterprise data with third party data.
  • Costs.
  • Need for real-time data ingestion.
  • Build your own enterprise data lakehouse.
  • A data lakehouse strategy also delivers business intelligence and analytics benefits, along with the promise of big data.
  • Take 2024 to nail the data lakehouse strategy to prepare your company for AI.
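The considerations above can be illustrated with a toy sketch of the lakehouse flow they describe: ingesting data as it arrives, then combining enterprise records with third-party data. This is a minimal, hypothetical pure-Python illustration of the pattern, not any particular lakehouse product's API.

```python
from datetime import datetime, timezone

# Toy "bronze -> silver" lakehouse flow: append-only ingestion of raw
# events with arrival timestamps, then enrichment of enterprise rows
# with third-party data keyed by customer. Names are illustrative.

def ingest(stream, table):
    """Bronze layer: land raw events as they arrive, stamped on ingestion."""
    for event in stream:
        table.append({**event, "ingested_at": datetime.now(timezone.utc)})
    return table

def enrich(enterprise_rows, third_party):
    """Silver layer: join enterprise rows to third-party attributes by key."""
    return [
        {**row, **third_party.get(row["customer_id"], {})}
        for row in enterprise_rows
    ]

bronze = ingest([{"customer_id": "c1", "amount": 120}], [])
firmographics = {"c1": {"industry": "retail"}}   # third-party data source
silver = enrich(bronze, firmographics)
```

In a real lakehouse the same two steps would run continuously against object storage and open table formats; the point of the sketch is that the enrichment step depends on having the ingestion layer in place first, which is the "data strategy before AI" argument.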

Foundational model strategy. CXOs and Constellation Research analysts expect industry and role specific models to emerge. In addition, enterprises will need to have model strategies that incorporate approaches that use language that applies to industries and companies specifically. Enterprises will need to think through model architectures to manage models for finance, HR, manufacturing, and other roles.

Kill switches. There was a good debate on our call about the need for an AI kill switch. On one side, models will be hacked and when that happens, you'll need to be able to pull the plug and recover. The argument against the kill switch concept is that other parts of the enterprise don't automatically shut down.
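A minimal sketch of what an application-level AI kill switch could look like, under the assumption that every model call goes through a shared gate. The class and method names here are hypothetical, purely for illustration.

```python
import threading

class ModelKillSwitch:
    """Process-wide flag that gates every model call; tripping it
    disables inference without shutting down the rest of the system."""

    def __init__(self):
        self._enabled = threading.Event()
        self._enabled.set()  # inference allowed by default

    def trip(self):
        """Pull the plug: all subsequent model calls are refused."""
        self._enabled.clear()

    def reset(self):
        """Restore inference after the incident is contained."""
        self._enabled.set()

    def guarded_call(self, model_fn, *args, **kwargs):
        if not self._enabled.is_set():
            raise RuntimeError("AI kill switch engaged; inference disabled")
        return model_fn(*args, **kwargs)

# Usage: wrap any model invocation behind the switch.
switch = ModelKillSwitch()
answer = switch.guarded_call(lambda prompt: f"echo: {prompt}", "hello")
```

The counterargument from the call maps to this design too: only the `guarded_call` path is disabled, so the rest of the enterprise keeps running while the compromised model is fenced off.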

Database vendors and genAI hype. CXOs were exhausted by transactional database vendors that are glomming on to the generative AI hype. These vendors are mostly concerned about you moving your data away from them. Wait until you see a feasible generative AI solution from database vendors before falling for the hype.

Small models vs. large ones. LLMs could be seen by enterprises as boiling the ocean and many CXOs and vendors are talking about smaller models that are specific to a task or process. The reality is that models will require a hybrid approach. Some models will be large, some small and some will be run locally too.

Model suites will always win? CXOs will take a best-of-breed approach to models due to conditions and hardware limitations, but ultimately the suite approach is likely to win. Specialist models will be better for some tasks, but economies of scale over time favor generalists and suites. The feedback loop of more data and context is likely to favor large models. Beware of small model chatter from vendors without a comprehensive AI strategy or access to a large language model.

Generative experiences with avatars. One CXO was piloting a series of avatars to personalize generative experiences by language and use case. This avatar meets genAI approach appears to be positive for the host and participant. The CXO noted that starting with a framework, governance and privacy controls is a key enabler for generative AI use cases.

AI will create interesting dynamics in the labor pool. Analytics and data science roles are likely to be impacted despite what recent surveys have indicated. A college student with the free time to experiment with prompts can replicate the experience of someone who has done predictive analytics and data science for decades. Simply put, the entire skill model for enterprises is going to change.

High performance computing will change due to generative AI. HPC is going to have to evolve since it is in the middle of the generative AI revolution. Nvidia's Blackwell launch featured a series of GPU clusters that will likely compete with supercomputers. Generative AI workloads will fundamentally change compute.


Will demographics mitigate the genAI hit to workers?

The debate over generative AI and its impact on the workforce is just heating up since the technology hasn't scaled at most enterprises. One of the biggest questions to ponder is whether genAI's impact will be muted by demographics.

Last week, we covered the intersection of tech layoffs, generative AI and middle management, which is taking the biggest hit. The Daily Show's Jon Stewart also had an interesting riff on the "promise of AI."

It's easy to conclude that generative AI is going to take jobs from humans. But there's another argument that genAI will be needed just to maintain and improve productivity levels because there will be fewer workers. There’s a demographic donut hole in the workforce that may be partially ameliorated by genAI.

During Paychex's third quarter earnings call, CEO John Gibson highlighted how small and mid-sized businesses were struggling to find employees. Gibson also noted that the pace of retirements from Baby Boomers is only going to pick up. Meanwhile, generation X doesn't have the numbers to fill the institutional knowledge gap.

This post first appeared in the Constellation Insight newsletter, which features bespoke content weekly and is brought to you by Hitachi Vantara.

According to the Bureau of Labor Statistics, the civilian labor force participation rate hasn't recovered to its pre-COVID-19 pandemic level. And the US Census Bureau said that 16.8% of the population is 65 and older, a share projected to grow to 22% by 2040.

Gibson said Paychex is using AI to drive insights on retention as well as integrations with Indeed. Paychex's partnership with Visier will offer compensation insights.

He added:

"The simple fact is we have a generational change happening in the labor force. Participation rates remain below pre-pandemic levels and it's going to be very difficult given the rate of retirements that we're seeing in Baby Boomers to really see that change. And what you see in the prime age workers were actually at record highs. The problem is not enough prime age people to fill all the opportunities."

"We need to do more to allow businesses to invest in productivity and drive productivity enhancements and that's not going to replace workers. That's going to enable them to get the work done with less workers than are going to exist in the marketplace. I think this is a systemic problem."

Related: Middle managers and genAI | Why you'll need a chief AI officer | Enterprise generative AI use cases, applications about to surge | CEOs aim genAI at efficiency, automation, says Fortune/Deloitte survey

Gibson said there's a productivity gap that'll occur as younger workers replace older ones. The only nuance here is that older workers may not all leave the workforce on schedule.

Pew Research found that 19% of Americans ages 65 and older were employed in 2023, nearly double the share from 35 years ago. The typical worker age 65 or older earns $22 an hour.

Workers ages 75 and older are the fastest-growing age group in the workforce. Today, 9% of people that age are employed. BlackRock CEO Larry Fink should be excited about that development, given he sounded the alarm bells about retirement funding and noted that one fix to Social Security would be working longer. Living to age 80 isn't terribly uncommon today.

How these workforce dynamics play out is anyone’s guess. Today, it’s hard to reconcile layoffs in technology with rosier IT employment prospects outlined by CompTIA.

Bottom line: The generative AI hit to the workforce is inevitable, but there's a more nuanced position to take amid the doom and gloom. One thing is certain: Generative AI is going to be a public policy issue.


Wipro names Pallia CEO to replace Delaporte

Wipro has named Srini Pallia CEO, effective immediately, replacing Thierry Delaporte, who stepped down to pursue other interests.

Pallia most recently was CEO of Wipro's Americas 1 unit and has been with the company for more than three decades. He was responsible for Wipro Americas' vision, strategy and industries, and previously was president of Wipro's consumer business and head of business applications services. Delaporte will remain with Wipro through the end of May to support the transition to Pallia.

Wipro's Americas 1 unit includes healthcare and medical devices, consumer goods and life sciences, retail, transportation and services, communications, media and information services, technology products and platforms in the US and Latin America. Wipro's Americas 2 division is focused on financial services, manufacturing, technology and energy and utilities in the US and Canada.

The US accounted for 56% of Wipro's fiscal 2023 revenue.

In a statement, Pallia said he was "excited to build on the strong foundation established by Thierry and lead Wipro on its next growth trajectory."

Delaporte led Wipro through a transformational phase in his four years at the helm. Under his leadership, the company launched its ai360 strategy, which revolves around embedding AI throughout its services and offerings.

For the nine months ended Dec. 31, Wipro reported revenue of $8.12 billion with profit of $993 million. Revenue was modestly higher relative to a year ago. For fiscal 2023, Wipro reported revenue of $11 billion, up 14% from 2022, with profits of $1.38 billion.


Archetype AI raises $13 million in seed funding, launches Newton physical world foundation model

Archetype AI has raised $13 million in seed funding and launched Newton, a foundational model that is built to understand the physical world via data signals from accelerometers, gyroscopes, radars, cameras, microphones, thermometers and other environmental sensors.

Newton aims to take physical data and combine it with natural language to provide insights about the physical world. Archetype AI's funding round was led by Venrock and included Amazon Industrial Innovation Fund, Hitachi Ventures, Buckley Ventures and Plug and Play Ventures.

Archetype AI's Newton highlights how foundational models continue to evolve at a rapid clip. While large language models have focused on language and image patterns, there is plenty of room for more niche use cases. Archetype AI describes Newton as "a first-of-its-kind physical AI foundational model that is capable of perceiving, understanding and reasoning about the world."

Ivan Poupyrev, CEO and co-founder of Archetype AI, said the company's mission is to solve the biggest problems which are "physical, not digital." "Our goal is to encode the entire physical world so we can derive meaning from the signals all around us and create new solutions to problems that we previously couldn’t understand," he said.

Newton is designed to scale across any kind of sensor. In theory, Newton could bring insights to the Internet of things as well as trillions of sensors in multiple industries. Archetype AI is an example of a foundational model company that can work through multiple verticals and use cases.

Speaking at an AWS analyst meetup, Matt Wood, VP of AI at AWS, was asked about whether foundational models would be commoditized quickly. After all, the LLM layer is likely to be abstracted with models being swapped as easily as cloud instances. Wood said foundational models are unlikely to be commoditized. Instead, these models will become more specialized.

Wood said:

"There is so much utility for generative AI. You're starting to see divergence in terms of price per capability, but I think that we're talking about task models, industry-focused models, vertically focused models and more. There's so much utility that I doubt these models are going to become commoditized."

Related: CEOs aim genAI at efficiency, automation, says Fortune/Deloitte survey | Why you'll need a chief AI officer | Enterprise generative AI use cases, applications about to surge


DataStax acquires Logspace and its Langflow platform

DataStax said it will acquire Logspace, which is the company behind Langflow, an open-source framework for retrieval-augmented generation (RAG) applications. Logspace's Langflow team will continue to run independently with a focus on development and community.

Langflow features visual tools to iterate on data flows and build LangChain RAG applications and deploy them in a click.

With the move, DataStax aims to create an integrated generative AI stack to create applications. DataStax will integrate Langflow with its DataStax Astra DB and Python libraries.
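To make the RAG pattern behind this acquisition concrete, here is a minimal sketch of the retrieve-then-prompt loop: find the most relevant document for a query, then build an augmented prompt for the model. This is a hypothetical pure-Python stand-in (bag-of-words cosine similarity instead of a real vector store like Astra DB), not the Langflow or RAGStack API.

```python
import math
from collections import Counter

def embed(text):
    """Toy 'embedding': a bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs):
    """Retrieval step: pick the document most similar to the query."""
    q = embed(query)
    return max(docs, key=lambda d: cosine(q, embed(d)))

def build_prompt(query, docs):
    """Augmentation step: prepend the retrieved context to the question."""
    context = retrieve(query, docs)
    return f"Answer using this context:\n{context}\n\nQuestion: {query}"

docs = ["Langflow builds RAG data flows visually.",
        "Astra DB stores vector embeddings."]
prompt = build_prompt("How do I build RAG flows?", docs)
```

Tools like Langflow wrap exactly these steps (embedding, vector retrieval, prompt assembly) into visual components, which is why the acquisition targets ease of building rather than new model capability.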

Constellation Research analyst Andy Thurai noted that DataStax needed to upgrade its platform for building RAG-based applications. "DataStax had its own version of software to build RAG-enabled applications called RAGStack, which combined LangChain, LlamaIndex, and more," said Thurai. "RAGStack was positioned as the best open-source option for implementing RAG, but it was difficult to use."

Thurai added Langflow will give DataStax the ability to create conversation flows with large language models, chatbots and virtual agents without extensive coding.

Doug Henschen, analyst at Constellation Research, said DataStax's purchase of Logspace keeps pace with where the industry is going. He said:

"For DataStax, this acquisition clearly improves DataStax’s ability to support RAG. Other DB services, like MongoDB and DataStax competitor Azure Cosmos DB, have also added support for vector embedding, vector search and RAG, but it’s early days for everybody. Another competitor adding vector embedding capabilities is Amazon with DynamoDB, so they’re clearly heading in the same direction. It takes time to add such capabilities and gain adoption, so the LangFlow acquisition is all about acceleration while AI interest is peaking."


AWS' Matt Wood on model choice, orchestration, Q and evaluating LLMs

Matt Wood, Vice President of AI at AWS, outlined how enterprises will mix and match multiple models depending on use case, the need for orchestration and how regulated industries may have an advantage in adopting genAI.

Wood spoke with industry analysts including me and Doug Henschen at AWS' New York offices. Here are some of the key themes to note.

AWS is differentiating on ensuring customers retain IP and confidentiality. Wood says that, unlike competitors, AWS "does not mix training data for models with customer data" and "does not allow any human review" of training data. The key competitor cited was Azure OpenAI.

Wood said:

"There is a schism appearing in some customers' minds, where you have to be willing to give up some level of IP protection or confidentiality or privacy of your data in order to be successful with generative AI. And that just is not the case on AWS. Customers are unwilling to do that in any of the industries, particularly those in regulated industries."

Regulated industries are traditionally known as technical laggards, but these enterprises are all over generative AI. Regulated industries such as financial services, insurance and health care are "moving slightly faster than the average" on generative AI.

Wood said:

"A lot of the regulatory approaches and compliance that those organizations have been working on for the past 20 years actually set them up very well to work with generative AI. All the governance, privacy and security data standards allow you to be able to get to utility and value with generative AI very quickly."

"Companies are starting to develop muscle around model evaluation" and are "no longer tightly coupling to single models," said Wood. Doug Henschen's take: that's consistent with what we hear from GCP and even Microsoft, but Wood insists AWS will be "Switzerland" and Bedrock will remain differentiated on model diversity.

Wood said model choice is the only way to go in the long run. "Other providers are very married to a very small subset of models. And what that means is that customers end up having to approach a model a bit like a Swiss Army knife, which sounds great, except if a contractor turned up to fix your house and all they had was a Swiss Army knife, you would not be very happy."

Ultimately, enterprises will toggle between speed, use case and cost when managing model portfolios. Wood added that this portfolio management of models is already carrying over to compute. He said if training speed is needed Nvidia GPUs get the call, but if cost is more of a factor AWS' Trainium chip is an option. Wood said interest among the customer base is split about 50/50 between the need for speed and cost.

Model orchestration will become critical. "What we've seen is that a big part of success in the actual broad production use of generative AI is being able to access the right model for the right use case," said Wood. "Different models operate and have different use cases and have different sweet spots. The real kind of superpower for generative AI is the combination of those models and the compounding effect on the aggregate intelligence of the system."

Orchestration of models in Bedrock will evolve over time so enterprises can leverage multiple models. Wood said orchestration of models today requires hardcoded rules to data, but as models improve to handle more tasks generative AI will be able to play point guard better.
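The "hardcoded rules" stage of orchestration that Wood describes can be sketched as a simple rule table that routes each request to the model best suited to its use case. The model names and rule predicates below are placeholders for illustration, not actual Bedrock model IDs or AWS APIs.

```python
# Rule-based model orchestration: the first matching rule decides
# which model handles the request; a generalist catches the rest.

ROUTES = [
    (lambda req: req.get("task") == "code",       "code-specialist-model"),
    (lambda req: len(req.get("prompt", "")) > 2000, "large-context-model"),
    (lambda req: req.get("latency") == "low",     "small-fast-model"),
]
DEFAULT_MODEL = "general-purpose-model"

def route(request):
    """Return the model name to invoke for this request."""
    for matches, model in ROUTES:
        if matches(request):
            return model
    return DEFAULT_MODEL

# Usage: each request carries the metadata the rules inspect.
chosen = route({"task": "code", "prompt": "fix this function"})
```

As models get better at deciding which tool or sub-model to call, the hand-written `ROUTES` table is what gets replaced by the model "playing point guard" itself.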

Q leverages multiple models based on specialty and use cases. "Amazon Q is the easy button for generative AI," said Wood, and it provides all the genAI many companies want to use. He specifically touted "a step-function change" in digital transformation and migration projects, such as Windows-to-Linux moves. Henschen said cloud migration will be an important use of genAI and an accelerator and cost saver for customers looking to migrate off legacy platforms onto modern cloud platforms.

Wood noted that Q runs on a variety of models via a series of expert agents.


Palantir will move workloads to Oracle Cloud as both court governments and enterprises

Palantir will move its workloads to Oracle Cloud Infrastructure in a partnership that also includes the two companies jointly selling to governments as well as enterprises.

The partnership will combine Palantir's AI platform and Oracle Cloud Infrastructure. The deal has multiple parts to it, but the gist is that the two companies plan to jointly target governments and enterprises with Oracle distributed cloud and AI infrastructure and Palantir's AI Platform (AIP) and Foundry decision support platform.

Both companies are seeing strong growth among governments looking to keep data in-country.

Related: Enterprise generative AI use cases, applications about to surge | Palantir posts strong Q4, sees enterprise traction in US | Palantir's commercial business scales with help of AI boot camps

Here's a look at the moving parts of the partnership:

  • Palantir will move its Foundry workloads to OCI.
  • Palantir's Gotham and AIP will be available in OCI's public cloud as well as Oracle Cloud Infrastructure Dedicated Regions, Oracle Alloy, Oracle EU Sovereign Cloud, Oracle Government Cloud and Oracle Roving Edge.
  • Oracle will gain workloads and an increase in ongoing cloud revenue.
  • Both companies will jointly sell and support cloud and AI services across government and commercial accounts.

The Oracle and Palantir combination will line up against C3 AI in government and enterprise accounts. C3 AI has partnerships with Microsoft Azure, AWS and Google Cloud.


CEOs aim genAI at efficiency, automation, says Fortune/Deloitte survey

CEOs are ramping up generative AI adoption as they shift from pilots to active usage, according to the Winter 2024 Fortune/Deloitte CEO survey.

The survey, based on 107 CEOs mostly in the US, found 56% of respondents rank efficiency and productivity as the primary benefit. According to the survey, 58% of CEOs say they are already implementing genAI to automate manual tasks, up from 40% in October, and 45% say they are reducing operating costs.

Going forward, CEOs are looking to balance costs and innovation. Fortune/Deloitte said 51% of CEOs are implementing genAI to accelerate innovation.

A few key generative AI adoption stats from the survey:

  • 50% of CEOs say they are using AI to automate content generation.
  • 42% say genAI is writing code.
  • 56% are using genAI to increase efficiencies.

Related: Middle managers and genAI | Why you'll need a chief AI officer | Enterprise generative AI use cases, applications about to surge

Other takeaways include:

  • Geopolitical instability was rated as the primary disruptor by 65% of CEOs surveyed.
  • 27% of CEOs express optimism about the global economy, up from 7% in the fall of 2023.
  • 22% of CEOs have high optimism about their companies.


Low code platforms will become strategic to CxOs

Generative AI and automation mean low-code platforms will become strategic for enterprise transformation efforts. By 2025, Constellation Research estimates, more than two-thirds of enterprises will have a standardized low-code tool in house.

That's a big takeaway from Constellation Research analyst Holger Mueller in a recently published report, Key Trends for Low Code in 2024 and Beyond. Mueller wrote:

"Enterprises need to own their digital transformation destiny by creating software assets that enable what matters. Enterprise Acceleration is an all-encompassing enterprise strategy that aligns people and technology so they can become more agile and move faster. The emergence of generative artificial intelligence (GenAI) only exacerbates the need for low-code platforms that enable enterprises to build the next-generation platforms they need in order to be winners in the era of digital transformation."

In many ways, low-code platforms will be critical as a bridge between human built code and code via software robots and agents.

Mueller outlined the following CxO considerations when thinking about low code strategies this year and beyond.

  • Begin with a low-code mindset. Enterprises can't ignore the importance of low-code platforms as an enabler of digital transformation efforts.
  • Standardize low-code platforms, but make sure governance is not so restrictive that it hampers developers.
  • Select low-code platform vendors that can leverage generative AI because processes are going to be automated at a rapid rate.
  • Know how low-code platforms interact with your standard applications. CxOs will have to balance enterprise application platforms and low-code offerings.
  • Understand your total cost of ownership with low-code platforms to avoid higher software portfolio costs.

