Results

Snowflake Summit 2022: Educating the Masses on Cutting-Edge Innovation

The gap between Snowflake’s bold and innovative vision and its product and customer realities was on display at Snowflake Summit 2022.

Attend any big annual enterprise tech show, and you’re likely to hear about the vendor’s far-reaching vision for innovation. That’s why we’re shown those “forward-looking statements” disclaimer slides that nobody ever bothers to read.

So it went at Snowflake Summit 2022 in Las Vegas: on one side, Snowflake’s product announcements and presentations; on the other, the realities of the company’s cross-cloud platform and its customer base as of the June 13-16 event.

The most palpable reality, evidenced by the packed Caesar’s Forum Conference Center, was the scale of Snowflake’s success. The event drew 7,000-plus attendees -- a massive increase over the 2,000-plus attending the company’s last live Summit in 2019. The numbers also spoke for themselves: Snowflake became the fastest enterprise software company to reach $1 billion in revenue last year, and its fiscal year 2022 (ended January 31) closed at $1.2 billion in revenue, as reported by CEO Frank Slootman during his opening keynote. The company now has more than 6,300 customers -- more than several of its well-known independent competitors combined.

The reality is that Snowflake’s highly automated, low-touch, cross-cloud platform has attracted a great big tent of organizations that love the Data Cloud’s ease of deployment, scaling and use. But Snowflake is not just an easy-to-use alternative to running data warehouses on-premises. Or so executives including Slootman, co-founder and President of Product Benoit Dageville, and Senior VP of Product Christian Kleinerman kept telling Summit attendees.

Snowflake co-founder and President of Product, Benoit Dageville, kicks off Snowflake Summit 2022

“Disrupting analytics” was the company’s vision way back in 2014, Dageville explained. That mission was succeeded in 2018 by “disrupting collaboration” through data sharing and the Snowflake Data Marketplace. And the new mission, introduced at Snowflake Summit 2022, is “disrupting app development.”

This new vision, detailed largely by Kleinerman in a series of product announcements, is about building modern applications that might include transactional as well as analytical capabilities. Mind you, Snowflake is not going after traditional Oracle Database type workloads so much as after modern, cloud-native apps blending transactional and analytical requirements.

Here are some of the key components announced, along with their development and release status:

  • Hybrid Tables (in private preview) are a new Snowflake table type supporting fast, single-row operations and suitable for both operational and analytical queries.
  • Unistore (in development and powered by Hybrid Tables) brings together transactional and analytical data to support transactional applications that can also instantly query historical data for analytical context.
  • Native Applications (in private preview) is a framework for building apps using Snowflake functions including UDFs, stored procedures, and (in-development) integrations with the Streamlit low-code development framework, acquired by Snowflake in March. Apps that run on Snowflake take advantage of the Data Cloud’s management and governance capabilities.

Importantly, the Native App Framework is tied to the Snowflake Marketplace (renamed from “Data Marketplace” because it’s now also about apps, models and more). Using the Marketplace, companies will be able to distribute and monetize their apps with built-in commerce capabilities (which, along with Snowflake’s community and networking power, differentiate the company from competitors that have added data-sharing support, often through third-party marketplaces). Apps also offer the advantage of securely harnessing Snowflake data without copying it or exposing it to app users.

Snowflake Senior VP of Product, Christian Kleinerman, details the big announcements at Snowflake Summit 2022

The promise of Native Apps was certainly palpable during Wednesday’s “Building in the Data Cloud - Machine Learning and Application Development” keynote. Private-preview customer 84.51°, a retail data science, insights and media company owned by Kroger, presented on an app it’s developing that will securely blend transactional grocery store sales and inventory data with privacy-safeguarded customer loyalty card data to deliver insights on customer buying trends to store chains and consumer packaged goods companies. App users won’t be able to see or touch the encrypted transactional sales data or the customer loyalty card data used inside the app, but they will get the trend insights derived from these data. Similarly, LiveRamp, a data-enablement, measurement and marketing company, is building an app that will help with identity resolution while respecting data privacy safeguards.

I felt a bit sorry for these innovators, however, as the room was more than half empty by the time they presented. The content was quite compelling, I thought, but attendees either had competing breakout sessions or the material wasn’t (yet) relevant for the less sophisticated Snowflake customers among them. It didn’t help that the first hour of the keynote was dedicated to Snowflake’s data science capabilities. Here the announcements included:

  • External Table access to on-premises object stores (entering private preview by the end of June) will enable Snowflake users to bring Parquet files from on-premises object stores (with S3-compatible APIs) into Snowflake analyses without moving that data. This was a big ask among firms that were either not ready or unwilling to move data into the Snowflake Data Cloud, perhaps for data-residency reasons. (This feature is clearly relevant to many customers, not just those interested in data science.)
  • Iceberg Tables (in development) will introduce support for Apache Iceberg to supplement Snowflake-native tables. This open-source choice opens up the platform to tools beyond Snowflake, such as Spark, Flink and Trino. Performance promises to be close to that of Snowflake-native tables without forgoing the governance capabilities they provide (features for Iceberg Tables such as encryption and replication are said to be in development).
  • Snowpark for Python (in public preview) adds to the Java and Scala support previously offered by Snowpark. To Snowflake’s credit, the offering includes popular Python open-source libraries (and package-management and update capabilities) through an integration with Anaconda.
  • Snowflake Worksheets for Python (in private preview) will support the building of Python user-defined functions in conjunction with Snowpark for Python.
  • Large-Memory Instances (in development) will support demanding data science workloads, such as feature engineering and model training on large datasets.

Most of the big crowd there for the day-two keynote stayed through the first 45 minutes or so, but seats gradually started to empty once the demos involving data scientists and data engineers coding in Python commenced. If you’re not a data scientist, I suppose a demo involving Python coding is not likely to be terribly compelling. When, at the start of the app development half of the keynote, the crowd was asked, “How many of you are data scientists, data engineers or data developers?” I’d guess fewer than 20% of the remaining attendees raised their hands.

This apparent gap in interest left me wondering: why did Snowflake scarcely mention the SQL Machine Learning feature also announced at the Summit? SQL ML is in the same vein as what several rival vendors offer as either “AutoML” features or built-in data-science algorithms designed for in-database execution. These features are invoked through simple SQL commands, so SQL ML would certainly seem to be very relevant to the mainstream crowd at Snowflake Summit. I’m guessing Snowflake didn’t highlight SQL ML because the (private-preview) feature only supports time-series forecasting at this point.

This brings me to the gap between Snowflake Summit announcements and the realities of what’s generally available today. Snowflake’s development stages typically last four to six months. So something just announced as “in development” isn’t likely to be generally available until a year to 18 months later. Something just entering “private preview” is likely to be eight to 12 months away, and “now in public preview” generally means GA is four to six months away.  

Almost all vendors pre-announce capabilities (some more aggressively than others). My sense is that Snowflake’s data science and app-development announcements are aimed at the larger and more cutting-edge customers (and attracting more of them) that will be invited into private previews. The keynote talks were about preparing the rest of the crowd for capabilities that won’t see general availability for at least another year.

Doug's Analysis

I came away from the Summit impressed by a bold and forward-looking company that has a fast-growing base of enthusiastic customers. The cutting-edge innovators taking part in those private previews include the likes of GEICO, CapitalOne, WarnerMedia, JetBlue, AT&T and Fidelity Investments, all of which presented at Snowflake Summit.

I also came away recognizing that Snowflake is sometimes in the position of having to backfill very mainstream, highly requested features, as Slootman acknowledged the company had to do early in his three-year tenure to step up governance capabilities. Over the last year to 18 months, Snowflake also has been stepping up cost-control features, including consumption analytics, budget guard rails and optimization features, with more such features said to be in the works.

Don’t get me wrong: a good annual customer conference should get customers dreaming about cutting-edge capabilities, but I heard more whoops and hollers when Kleinerman introduced new budget and resource group features and new account replication and pipeline replication features than I did at any point during the data science and data apps presentations.

Forward-looking vision and leadership are important for every vendor, but there also has to be a balance of crowd-pleasing upgrades and pain-fixes tied to core functionality. Snowflake delivered a mix of both types of announcements at the Summit, but at times it seemed like the band leader was getting a little too far out ahead of the parade.

Related Research:
Market Overview: What to Look for in Analytical Data Platforms for a Cloud-Centric World
Trend Report: What to Consider When Choosing a Cloud-Centric Analytical Data Platform
ThoughtSpot Rides the Wave of Customer Cloud Transitions


The CUBE Appearance: Day 1 Keynote Analysis | Snowflake Summit 2022

Doug Henschen, VP & Principal Analyst, Constellation Research, sits with Lisa Martin & Dave Vellante, as they Kick-Off Day 1 at Snowflake Summit 2022 at the Caesar’s Forum Convention Center in Las Vegas, NV.

<iframe width="560" height="315" src="https://www.youtube.com/embed/H7Y3OXcxigM" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>

ConstellationTV Episode 35

ConstellationTV is here to bring you the latest in what is disruptive and reshaping business and technology. In every episode, you’ll hear from our fellow analysts, from leaders across our network of business transformation experts and influencers, as well as from cutting-edge vendors.

ConstellationTV is a twice monthly Web series with Constellation Research analysts via LinkedIn & Twitter. The show airs live at 9:00 a.m. PT/ 12:00 p.m. ET every other Wednesday. Follow us on Twitter @CRTV_Show & #CRTVShow.

<iframe src="https://player.vimeo.com/video/718039693?h=defa86deaf&amp;badge=0&amp;autopause=0&amp;player_id=0&amp;app_id=75194" width="1280" height="720" frameborder="0" allow="autoplay; fullscreen; picture-in-picture" allowfullscreen title="CRTV Episode 35 Final.mp4"></iframe>

Totally OVER Ordinary: Time to Rethink Whose Experience Needs a Redesign

Examples of powerful experiences that move us, delight us, and even transform us are all around. Consumer-focused experience design is everywhere. And increasingly we are hearing amazing tales of enterprise-focused transformations that are revolutionizing everything from manufacturing to transportation.

We spend an extraordinary amount of time thinking through where and how our customers expect to transact, interact, and even explore. Engagements are carefully crafted, technologies secured, and workflows established to ensure that customers traverse their “choose your own adventure” style of connected, repeatable experiences. Every effort is made to serve customers rightsized engagements—not just for a moment but for their moment. The “work” of being a customer has never been so easy.

So why is the frontline work of delivering experiences the last to get a redesign?

This isn’t to say that the solutions and systems that craft, organize, and operationalize experiences haven’t undergone a radical upgrade. A wealth of technologies and solutions exists today—from the tools we use to reimagine websites to the solutions we leverage to deliver self-determined, self-service engagements via chatbots, virtual agents, and highly contextual content our brave buyers may choose to experience for themselves at any hour of the day, anywhere in the world, and on any and every device imaginable.

But the actual task of delivering on customer experience (CX) strategies in a contact center is hard work that requires an agent to relearn fundamental human actions and reactions. Some of the tools we put in front of agents can be counterintuitive, unnatural, and just plain hard. In our quest to automate processes and digitally transform our enterprisewide communications, we have (rightfully) spent time focused on how our communications systems serve our customers. Then we’ve asked our agents to spend hours (and yes, I know that is an understatement) relearning systems, devising shortcuts (aka workarounds) for disconnected systems, and essentially retraining their brains to get the job done.

The Importance of Agent Experience

Ordinary is over. That ordinary experience of expecting our contact center agents to conform to the tools and processes built in a one-size-fits-all model of work is over. Thinking that “it isn’t over until we say it is over” or “we have plenty of agents, and there are plenty of people who would love to work here” is no longer defensible. Why? Well, unfortunately for contact centers, what has been dubbed “the Great Resignation” is real. Across all industries and job functions, employees aren’t just resigning; they are also reevaluating and redefining their work lives. Although pay and benefits will always be important for employees, this “Great Refactoring” has empowered people to actively walk away from the ordinary and wait for roles that better align with their life goals, outlooks, and even culture.

Think of this: In the U.S., a record-breaking 4.5 million people quit their jobs in November 2021. This means that about 3% of the U.S. workforce walked out on their job, according to the U.S. Department of Labor (DOL). Also consider that at the end of December 2021, the DOL reported nearly 11 million job openings—an increase of nearly 2% from the month before. By the end of the year, the DOL concluded, there were 4.6 million more jobs than unemployed workers in the U.S.

Translation: Nobody is rushing back for any ordinary job.

Workforce experts all agree: This isn’t about higher wages alone. The current job market is ripe for job jumping, and with that jump often comes a higher wage. But this is about why employees think about jumping in the first place. This is about workers needing to feel successful, knowing that the infrastructure, processes, and operations of the company they choose to join are set up for them to succeed.

Designing Beyond the Ordinary

So, what does moving beyond ordinary look like? It starts where it stops—with the agent. Agent-led design should go hand in hand with customer-led design. One begets the other, in the same way a positive agent experience empowers the agent to deliver an exceptional customer experience.

The time of ordinary experience is over. The questions we need to ask in order to move beyond the ordinary won’t be comfortable or easy. What is your agent experience, and who is at the center of that design exercise? What does personalization mean for your agent? Is that capacity for personalization mirrored in the personalization your customers expect? If your bottom line is directly impacted by your front line, can you really afford to ignore the frontline users and their work experiences?

It’s time to empower agents with contextualized workspaces that adapt to every interaction and provide customer data and information to support their interaction workflow. It’s time to put agents back into the design picture so that they can be the CX front line our customers demand and expect. Need some extra inspiration? Check out this recent conversation between Constellation Research’s own R “Ray” Wang and Dhwani Sonhi, vice president of product management and user experience design at 8x8. From the discussion of the new experiences powered by analytics, automation, and artificial intelligence to information about bringing what matters to an agent into a flexible workspace, this is a dynamic conversation about getting the wheels of new agent-led design moving.


Taming Complexity in Digital Business and IT: Next Generation Approaches

If one were to conduct an inventory of all the different digital technologies, platforms, services, channels, and devices that an organization uses -- and certainly most CIOs do this today -- the results would come as a surprise to most. For the average organization, the items on that tech inventory would number in the hundreds, or even thousands in large organizations. And because nearly everything in business is becoming digitized, all while needing to be made more contextualized and personalized, a growing percentage of this technology has to work closely together to effectively solve business problems, meet customer needs, and create value. This imperative, however, tends toward tight coupling and combinatorial situations that are now reaching levels that are difficult for many organizations to recognize and manage.

In short, complexity has become one of the leading challenges cited by executives when it comes to dealing with IT today. One strategy is simply to reduce the number of IT vendors being used (making them deal with making everything work together), and indeed, vendor consolidation has been steadily rising. I've even recently explored the rise of unified software service providers as a distinct new way to cope with this. But the strategy of exporting complexity outside the organization tends to break down when a new digital solution or IT system has to be best-of-breed, such as when it runs the core business, providing differentiated capabilities, at least when compared to key competitors. That's because one generally can't consolidate leading-edge solutions from a newer, often smaller, vendor as easily.

Instead, in today's world of hyperconverged IT services, new lightweight integration techniques, and new methods to accelerate or streamline user experience, my research shows that a new crop of vendors is using compelling new design and architectural breakthroughs to collapse time to value and create operational simplicity across large numbers of simultaneously integrated systems, all while delivering enough leverage in the user experience to make these many integrations perform significantly more useful work per unit of user effort.

Reducing Digital Complexity Through Crux Efficiency

What appears to be happening is that in the competitive lab of the industry, certain tech developers are hitting upon highly effective approaches that address the very core of what is holding organizations back from tackling the next levels of complexity in their digital landscapes. The key obstacles: 1) the very high levels of integration required to deliver solutions that are much better than previous ones; 2) overly complex user experiences that result from having so many underlying systems brought together; and 3) the tedious, time-consuming nature of the sophisticated user interfaces required to deliver on end-to-end digitized business processes. Usability has long been an adoption and efficiency barrier in the enterprise, but it has often gone unaddressed for a variety of reasons outside the scope of this discussion.

So, rather than barely achieving minimum acceptable levels of integration or a sufficient level of usability, a new generation of solutions is emerging that goes directly to the core, or crux, of the problem: If integrations are valuable, then we should build them often and have many of them, each designed to be as useful and efficient as possible. If today's digital processes, as represented in the user experience, are too cumbersome and lengthy, then ways should be found to make them dramatically more streamlined and efficient. As I examine more of this class of solution, I find several central design patterns emerging consistently in their designs.

Streamlining Complex Integrated Digital Solutions with User Experience and Domain Microservices

The Design Patterns of Highly Streamlined Digital Solutions

While my research shows that there are likely other patterns still to be uncovered, the patterns below appear frequently at the very core of many of an emerging next generation of highly streamlined, lower-complexity (to the user) digital/IT solutions that have significantly higher levels of useful integration and effective usability -- and therefore much higher user productivity and business impact.

Pattern 1.1: Accelerated User Experience

The hallmark of this pattern is a maximal reduction in the number of steps, context switches, and individual user actions needed to accomplish a goal. This can be achieved through automatic completion of fields for the user, offering up the most common and recently used entries for fields, or even asking the user for just the key bits of information and otherwise carrying the entire task out on their behalf. Notably, this pattern is now being delivered as an entire product category in its own right, in the form of digital adoption platforms, which seek to reduce the overall user burden of a digital system in every way possible. The essence here is to eliminate every source of friction for the user in carrying out a task to the fullest extent possible, converting intention into business outcome with an absolute minimum of physical or mental effort, using automation as much as possible. My research shows that 5-to-1 and even 10-to-1 efficiencies in usage -- and therefore in productivity and business results -- are possible.

Pattern 1.2: Aggregated User Experience

More than just acceleration, highly streamlined IT aggregates underlying experiences into new, briefer, more succinct user experiences. Instead of using a Concur or Workday user experience directly to schedule travel or file expense reports, the streamlined solution might have its own user experience that aggregates the underlying solutions into an end-to-end travel experience. It would not expose the user to fields not relevant to the task, and it would collect the minimum amount of information from the user, instead doing the work on its own to gather information from underlying systems elsewhere (travel data in an airline app, client names from meeting calendars, dining expenses from a bank statement), without making the user go to different systems to collect and then manually enter contextual information into another system. This approach is endlessly adaptable to the majority of activities in digital employee experience, customer experience, and line-of-business processes. However, much of the value and the range of possibilities is determined by how good the integration is with the other IT systems in which relevant data is stored. Which brings us to the next layer of highly complementary patterns.

Pattern 2.1: Accumulated Integration

When the value of integration is high and its combined cost/difficulty continues to decrease, far more contextually aware solutions can be developed than previously, especially with the aggregated user experience pattern above. This is where the value of a solution based on a platform comes in, one that systematically integrates other IT systems in a way that builds network effects. In fact, the realization is now that platforms that move the cost/difficulty/depth of integration out of individual solutions and down to the platform level allow solutions to be developed that already have dozens or even hundreds of other IT systems pre-integrated. We can see in the marketplace today particularly successful solutions that have high levels of third-party applications pre-integrated in a high-quality way, such as what the popular collaboration service Slack has done for chat apps or Asana has provided with its work coordination solution. This accumulated integration continually raises the bar for an entire organization, or even the broader industry, each time another application is integrated. The crux of this pattern is valuing, emphasizing, and amortizing many high-quality integrations in large numbers. It's only then that combinatorial complexity can start to be converted into a more linear problem. The most successful new highly streamlined digital/IT solutions have large numbers of integrations out of the box (dozens to hundreds) and also make it easy, through low-code approaches, for new integrations to be added. Importantly, adding a new integration to the platform makes it available to every other new solution built on it. This means that, over time, integration complexity is systematically reduced.

Pattern 2.2: Differential Integration

Building on accumulated integration, this pattern recognizes the value that accrues when the previous pattern is fully realized: the next new solution developed on the platform requires the least possible amount of new, differential integration. In other words, if an organization has diligently added its proprietary and/or obscure systems as deep, quality integrations to the platform, then when a new solution is needed, only the minimum possible amount of new integration must be developed. This is a significant force multiplier in time to value, given that integration is still one of the hardest, most complex, and highest-risk parts of any IT project or solution development effort. Aggressive pursuit of systematic integration pays off down the road -- not at the first 10-20 integrations, but when a critical mass of applications is available to easily build new contextual solutions.

Decreasing Coupling and Complexity Through Domain Integration

Pattern 3: Domain Services 

The lesson in this pattern is that the patchwork of APIs and microservices accumulating in silos in organizations must come to an end and evolve to the next level of utility. A single, consistent (but eminently decentralized by its very nature) set of business-level domain services must become the norm. All of IT -- systems and people both -- must speak the same language. The business must speak it too. That means the APIs must work as a coherent business graph at the domain level. I've been strongly suggesting for years that microservices have to become a core business strategy and common master artifact, and that time has come, particularly in the form of a fully consistent business graph. This will take years for most organizations, but realizing it at the platform level is where it must happen, for the reasons above and more. Low code/no code, which can unleash an order of magnitude more productivity on its own, outright demands domain services that anyone can consume. And doing this ensures that integrations can be used by anyone and that all such solutions use the right master data, guardrails, compliance, and security approaches in the most orderly, operations-friendly, manageable, and governable way possible.

An Example: Brightspot

As evidence that these types of highly streamlined new solutions are emerging, I would hold up Brightspot, an advanced content management platform that has been around for a few years now but that has been steadily gaining market share and notable wins against much larger competitors because it so well embodies the design patterns above. It dramatically reduces complexity and increases user leverage throughout its user experience and its deeply integrated design. Brightspot is designed around the understanding that it's part of a much larger ecosystem and value chain and must work as closely with it as possible to make it as easy as it can for users to be effective in their work and in driving business outcomes.

Brightspot has many user experience accelerators, aggregated experiences, workflow guides, and other ways to ensure that a given business process is carried out as easily and quickly as possible. It does this by pursuing solid, deep, high-quality integrations with common applications in its industry (advertising, analytics, DAM, search, social, images, transactions, etc.) to the fullest extent possible. If you're going to build on Brightspot, you can be confident that it will have the widest range of already-built integrations, making the creation of business-critical workflows and processes within it as simple and relevant as possible.

Related Research: How Brightspot Transformed Content Management in the Research Industry

I'll be exploring the capabilities of Brightspot and others in more detail to see what these new highly streamlined platforms actually do to provide dramatically improved time-to-value, productivity, and efficiency across the entire development-to-operations-to-maintenance/evolution/innovation lifecycle. But the patterns themselves are clear. Understanding how these approaches address the very crux of the problems in IT complexity that are still holding back most solutions from reaching new plateaus of productivity is an investment in time that I encourage solution architects, user experience designers, and business solution owners to make if they wish to tap into the next generation of what is possible today.

Additional Reading

To Strategically Scale Digital, Enterprises Must Have a Multicloud Experience Integration Stack <-- What a highly streamlined IT platform realization looks like

How Tailored Experiences Deliver Efficiency and Productivity

How Headless Revolutionized Content Management

The future of enterprise content is modular and headless | ZDNet

How CXOs Can Attain Minimum Viable Digital Experience for Customers, Employees, and Partners

Cloud Reaches an Inflection Point for the CIO in 2022


Combining Two Modern Practices Propels E-Commerce Success: Product Content Syndication (PCS) and Product-to-Consumer (P2C)


As I noted last year when describing the emerging Product-to-Consumer (P2C) category, as the e-commerce sector has grown to high levels of maturity it has accumulated "ever-growing overhead in time, resources, and management attention on making the many moving pieces -- product catalogs, commerce systems, feeds, channels, and marketplaces -- fit together and properly operational in a way that is truly sustainable as a business." E-commerce will be a $7.3 trillion global industry by 2023, but only those prepared to evolve and modernize their ecosystems will thrive in an ever-more digitally sophisticated operational environment.

This is particularly true of the activity that is the lifeblood of e-commerce: the process of optimizing and maximizing product content to create sales. This product content syndication, or PCS, is increasingly seen as central to driving growth. The flow of product data has become both a technical and strategic advantage when it comes to omnichannel sales in today’s fast-paced e-commerce space, particularly in competitive environments. In short, having the best, richest, and most accurate set of product listings is now of pre-eminent importance for capturing market share and attracting buyers.

Related Research Report: The New E-Commerce Category of P2C Management

Creating a Winning Product Content Syndication Approach

Since a rich variety of product content, combined with a maximal distribution strategy for that content, makes the difference between merely surviving and actually competing, e-commerce managers are looking beyond the more static product information management approach of the last decade to more dynamic methods that understand the individual details and optimal operating needs of the full end-to-end ecosystem.

Product Content Syndication with Product-to-Consumer as the Organizing Model for E-Commerce and Digital Business

To help navigate the choices, I've researched some of the top means for syndicating product content and determined what the best and most capable methods are in a new research report, Driving E-Commerce Growth With Product Content Syndication. As noted in the report, syndication puts product content into motion and therefore is the step that brings an e-commerce ecosystem to life. Selecting the approach and technical solution for PCS is thus a pivotal decision and a determining factor in the ultimate success of a digital business and all of its often far-flung constituent elements.

Key Takeaway: Make PCS an Integral Part of an E-Commerce Strategy

Without a more systemic and contextual approach -- I found the overall approach of using P2C management to be the most effective -- every kind of digital business, from existing traditional e-commerce stores to the hot new emerging D2C channels from manufacturers, faces the same set of management challenges when it comes to keeping product content updated and optimally used across its ecosystem. Namely, not just to maximize the product content but also to fundamentally transform engagement with the market in a far more dynamic, intelligent, and compelling way.

The syndication options outlined in my report range from the most basic and fundamental all the way up to the most holistic vision currently available to e-commerce firms. Brands, retailers, and stores will succeed primarily through effective management of the product content ecosystem. In fact, it is by making PCS a fully integrated aspect of an e-commerce strategy that they can properly realize the insights and knowledge contained within the feedback loops that link them to the market. Harnessing this feedback with PCS in a contextual and data-driven way is how to deliver the highest-impact results.

The best sustainable strategy for succeeding with product content is to contextually address each and every channel via automation. E-commerce firms and digital businesses that can do this from a holistic strategy and a matching platform will be in a better position to seize opportunity and survive rapid shifts in the market. Digital business has moved away from the simplistic models of years past to much more deeply integrated systems that can cope with today’s operating requirements, regardless of how sophisticated they are. For most organizations, developing and operating a more robust, enlightened, and sustainable PCS strategy will be essential to their long-term growth, maturity, and success.

Additional Reading

The Strategic New Digital Commerce Category of Product-to-Consumer (P2C) Management

The P2C Management Vendor ShortList for 2021

Realizing a Decisive Advantage in Digital Commerce Through Economic Flexibility

How Headless Revolutionized Content Management

The Future of Enterprise Content Management

A New Digital Experience Maturity Model for Improved Business Outcomes

How CXOs Can Attain Minimum Viable Digital Experience for Customers, Employees, and Partners 

To Strategically Scale Digital, Enterprises Must Have a Multicloud Experience Integration Stack


Tableau Gets Back on the Conference Circuit in a Time of Change


It was Albert Einstein who said, “the measure of intelligence is the ability to change.” With apologies to Einstein, I’d say the ability to change is also the measure of a good business intelligence (BI) platform and vendor. I had changing customer needs and expectations very much in mind when I attended Tableau Conference 2022 (TC22), May 16-19, in Las Vegas.

In one sense, the TC22 reunion of the “data fam” was a trip back in time, from the 1990s/2000s rock-and-pop-hit keynote soundtrack, to the return of the “Devs on Stage” and “IronViz” sessions, to the packed and playful “Data Village” opening night reception, complete with playground equipment and Elvis impersonators. It was also nostalgic seeing more than 5,000 people streaming into the Mandalay Bay Convention Center (about a third of the event’s record attendance, but it felt like a TC, just like it used to be).

Outward appearances notwithstanding, Tableau executives acknowledged from the start of TC22 that it’s a time of change for Tableau -- and for BI and analytics more broadly. Much of the change for Tableau is related to its absorption into Salesforce, which acquired the company two-and-a-half years ago. Mind you, the Tableau name is absolutely NOT going away – it’s the analytics unit within Salesforce, just as MuleSoft has retained its identity and integration role since its acquisition by Salesforce. But the one-million-customer-strong Tableau community is definitely gaining closer ties with the 16-million-customer-strong Salesforce community.

CEO Mark Nelson said during his opening keynote that Tableau can now draw on Salesforce as “a superpower.” A concrete example was the announcement of a coming Model Builder feature for Tableau. The no-code predictive-modeling capability is based on Salesforce’s Einstein Discovery, and it will be integrated into Tableau by the end of 2022.

In his Tableau Conference keynote, CEO Mark Nelson says Tableau is drawing on Salesforce as a "superpower."

Another sign of Salesforce’s superpower influence and contributed strengths has been Tableau’s emphasis on adding enterprise capabilities since the acquisition. At TC21, held virtually in November, enterprise-oriented announcements included a Centralized Row-Level Security feature and a Connected Applications feature. The latter enables administrators to set up trusted relationships with external services.

The enterprise push continued at TC22 with the introduction of Advanced Management capabilities, including Customer-Managed Encryption Keys, an Activity Log feature that tracks how individuals are using Tableau, and an Admin Insights feature that uses Tableau analyses to help admins track dataset usage, license adoption, and visualization load times.

What’s in a Name

Some TC announcements seemed more about branding than dramatically new capabilities (a Salesforce influence?), but sometimes it’s important to get names right. For example, the company announced the rebranding of Tableau Online as Tableau Cloud -- in line with Salesforce naming conventions. The name change is justified in that 70% of Tableau customers now deploy first (and often only) in the cloud. So it’s not about taking the Tableau experience “online,” the roots of the old name; it’s about delivering a cloud-first Tableau. That evolution will certainly require more than the cloud-centric accelerators that were added to the Tableau Exchange as part of the Tableau Cloud unveiling. For example, I’m hoping Tableau Cloud will soon take advantage of the Salesforce superpower known as Hyperforce, which would give it the ability to run with consistency across multiple public clouds. Tableau execs say support for Hyperforce is on the roadmap, but target release dates have yet to be announced.   

Another brand change was the unveiling of CRM Analytics, which is the new name for what was previously called Tableau CRM (and before that Einstein Analytics and before that Wave Analytics). The change to CRM Analytics was actually announced in April, but the move was explained more deeply to analysts at the Tableau Conference. The new name was greeted by customers with a “sigh of relief,” according to one executive, because the product has always been a part of Salesforce and was not developed by Tableau. The naming helps make it super clear that you choose CRM Analytics when you want to bring insights and predictive guidance into sales and service rep Salesforce workflows.

Increasingly, CRM Analytics will be available through even more targeted applications, such as the Revenue Intelligence application, introduced in February, and through five industry-focused versions of Revenue Intelligence – for the Salesforce Financial Services, Manufacturing, Consumer Goods, Communications, and Energy and Utilities clouds – set to be released this summer.

Long Live Dashboards

Despite all the talk of dashboards being dead – talk mostly heard at Tableau-rival events – it was clear from the armies of analysts at TC22 and the rowdy throng at the IronViz competition that data visualizations and drillable dashboards are still very much in demand (something natural-language-search-centric vendor ThoughtSpot acknowledged at its May 9-12 Beyond event, which I also attended.)

The whole self-service movement came about because organizations wanted to make it easier to create pre-defined views of business conditions without having to wait in line for IT to create new reports. All the better that visualization-centric dashboards also supported lower-latency monitoring and drill-down analysis, unlike the static reports that they often replaced.

To promote easier consumption by a broad base of business users, Tableau introduced a Data Orientation Pane feature that drew big applause during the popular “Devs On Stage” session. The purpose of the pane is to guide new and novice users in the use of a dashboard by providing descriptions, links to resources (such as how-to-use videos), a list of data sources, and details on which fields are being used, which filters are in effect, and what outliers might indicate.

Seeing Tableau With a Beginner's Mindset

In another sign of change afoot at Tableau, there were presentations on new initiatives led by new executives who are taking a look at Tableau offerings with a beginner’s mindset. For example, a Data Fabric initiative led by Amazon/AWS veteran Volker Metten, now a Tableau VP, Product Management, is aimed at better aligning capabilities including Tableau Prep, Tableau Catalog, Virtual Connections, Row-Level Security and Governance Rules. The goal is to ease data access and data usage while ensuring that the right data gets to the right people.

Another initiative led by Metten is working toward greater consistency and interoperability among Tableau products and capabilities that have matured at different rates. Tableau Prep, for example, currently has one version for Tableau Server and a slightly different version for Tableau Cloud, so the company plans to align the two. The team also wants to ensure that Prep will be able to draw data from Salesforce and publish transformed data directly into CRM Analytics – a capability expected to be available before the end of this year.

Francois Ajenstat, Tableau's Chief Product Officer, announces Tableau Model Builder, which is being built on Salesforce Einstein Discovery.

As for changes within the larger BI/analytics market, there’s big push today for what’s variously called “actionable analytics” or “actionable insights.” As I’ve often observed in my research and on social media, the problem that Tableau helped to solve 15 years ago was that of having too much data and not enough insight. Now that self-service capabilities are pervasive, the challenge is often having too many insights, sometimes conflicting insights, and not enough clarity on what actions to take.

At TC22, the announcements geared to improving and clarifying insight included the Data Orientation Pane and new “Data Stories” automated plain-language explanations. This natural-language-explanation capability, based on the Narrative Science acquisition, will be showing up within Tableau as well as within Salesforce applications, bringing insight closer to the action. The coming Model Builder feature for Tableau will go even further to move the needle toward predictions and recommended actions.

These examples are clearly just a start on a journey that will bring more change. As long-time Tableau Chief Product Officer Francois Ajenstat told analysts at TC22, Tableau has gone through “many, many” changes since its founding in 2003 and its initial public offering in 2013. Most notably, there was an executive leadership change when Tableau’s stock slumped in 2016 (due largely to the introduction of Microsoft Power BI), yet Tableau managed to reach new heights.

At TC22 we saw a company that is benefitting from the marketing muscle, extensive tech talent, and intellectual property portfolio of its new corporate parent. And because there is new blood and open discussion and exploration of BI/analytics market changes by Tableau executives, I think we also saw a company that is embracing change rather than clinging to the glory days of the self-service era.


The CUBE Appearance: Couchbase Application Modernization Event


A "power panel" of analysts including Tony Baer, Doug Henschen and Sanjeev Mohan join Dave Vellante of The CUBE for coverage of the Couchbase Application Modernization event.

Watch the full discussion: https://www.youtube.com/embed/FqSVJH_a0PY

Monday's Musings: Decision Velocity Will Determine Winners and Losers In A Digital Age


Everybody Wants To Rule The World 

Speed Provides Exponential Advantage

Speed has always been a critical success factor in winning wars on the battlefield. You need to move troops faster, reach targets more quickly, and strike with speed and precision. However, what is often not talked about is how the speed with which decisions are made plays a role in claiming victory. Alexander the Great’s success on the battlefield is often credited to the rapid decision-making capabilities of his armies. Enabled by trust and a decentralized command structure, his troops were able to beat their enemies by “out-decisioning” them. In most cases, his opponents had bureaucratic decision architectures, where minor decisions would travel up multiple levels of command before traveling back down to be executed. In the 330s BC, that could mean it took days to make a decision on the battlefield. Such a centralized control and detailed micro-management approach was no match for Alexander the Great’s nimble teams. British military strategist J. F. C. Fuller, writing on Alexander the Great, explained, “Time was his constant ally; he capitalized every moment, never pondered on it, and thereby achieved his ends before others had settled on their means.”1

The speed of decision making plays a similar role in the age of digital giants. Any organization that can make decisions twice as fast or one hundred times faster than its competitors will decimate them. Time is a friend to those who can make faster, more accurate decisions. While the human brain may take minutes to make a decision and it takes hours for a decision to work through an internal organizational structure, in the digital world machines and artificial intelligence engines can make a decision in milliseconds. Whoever masters these automated decisions at high velocity will have an exponential advantage over those who don’t.

To succeed, businesses must achieve decision velocity: First you have to amass a huge number of users and collect rich data and insights about their interactions—what I call data supremacy. Then you must train artificial intelligence to recognize patterns in that data and automate decisions, processes, and tasks based on those patterns. The higher the number of users, the higher the number of interactions, the higher the amount of data, the higher the quality of insights that AI can learn from, the higher the level of automation of your decisions in your organization. The higher the level of automation of the organization’s decisions, the higher chances you’ll rule your market.
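The compounding advantage of decision velocity can be made concrete with a toy calculation (mine, not from the original text; all numbers are invented for illustration): compare how many decisions per day an automated pipeline can make versus one that routes each decision through a human approval chain.

```python
# A toy model of the "decision velocity" flywheel described above.
# Illustrative only: the latencies and value-per-decision are invented.

def simulate_flywheel(decision_latency_ms, hours, value_per_decision=1.0):
    """Return cumulative 'value captured' over a period, given how long
    each individual decision takes to make."""
    decisions_per_hour = 3_600_000 / decision_latency_ms  # ms in an hour
    return decisions_per_hour * hours * value_per_decision

# An automated engine deciding in ~50 ms vs. a bureaucratic chain that
# takes ~5 minutes per decision (echoing Alexander's opponents above).
machine = simulate_flywheel(decision_latency_ms=50, hours=24)
human = simulate_flywheel(decision_latency_ms=5 * 60 * 1000, hours=24)

print(f"Decision-velocity advantage: {machine / human:,.0f}x per day")
```

Even with these made-up figures, the gap is multiplicative, which is the point of the flywheel: each extra order of magnitude in decision speed compounds through more interactions, more data, and better-trained models.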

It All Starts With Quality Data - Lots of It

Data is the foundation and the first priority for every organization’s growth and development. You must find and harvest all relevant sources of data and control, if not own, the upstream raw data sources. On the downstream side, you must control access to how the data is shared, monetized, and controlled.  This means identifying where the biggest pools of quality data reside and understanding how data is consumed inside the organization.

However, the battle for data is often misunderstood. Many think data supremacy is only about accumulating the greatest troves of data. But having the most data does not necessarily mean you win. This is a battle for the most insight from well-curated, highly contextual data. Quality trumps quantity. The real goal is to understand the relationships among data. You want to learn how different data sets interact with each other and what patterns arise from these interactions.

Where does the raw data come from? Successful organizations mine their organizations top to bottom, harvesting data from enterprise transactional systems like their accounting systems, supply chain, operations, and performance data. Then they pair their baseline back office data with front office data that includes customer interactions from sales, marketing, service, and commerce. They also mine “machine-generated data”—log files from equipment—and external sources such as social media feeds and feedback surveys.

The next source of data organizations rely on is user-generated. Every organization gets excited whenever users provide data on their own, whether through an online resume, a social profile, a customer account for a website, payment information, location data when they “check in” to a restaurant or shop, or photos that can be used for facial recognition and image recognition. The more organizations drive engagement with their users, the richer the data sets they collect and the more opportunities they have to find insight in the data.

These insights come from correlations, associations, and relationships—their “interactions”—among all the data produced and captured. Successful organizations are masters at identifying “signal intelligence,” the meaningful patterns or trends that emerge from the cacophony of data interactions. And they use this signal intelligence to make all sorts of “precision decisions,” from how much to charge for a product, to what customers ought to be targeted for what marketing campaign, to what product should be recommended to what customers.
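As a minimal, hypothetical sketch of what surfacing such a "signal" looks like at the smallest possible scale, here is a hand-rolled Pearson correlation between two invented data streams. Real systems do this with machine learning over massive interaction streams; the data and scenario below are mine, purely for illustration.

```python
# Toy "signal intelligence": quantify how strongly two data streams move
# together. The engagement/purchases numbers are invented examples.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series:
    +1 means they rise together, -1 means they move oppositely."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented example: weekly customer engagement events vs. purchases.
engagement = [12, 30, 45, 60, 80, 95]
purchases = [1, 3, 4, 6, 8, 10]

r = pearson(engagement, purchases)
# A strong positive r is the kind of signal that feeds a "precision
# decision" -- e.g., which customers to target with which campaign.
```

The correlation itself is only the raw signal; the precision decision comes from acting on it, such as prioritizing high-engagement customers in the next campaign.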

Thus, the combination of good analytics, automation, and AI will help organizations improve decision velocity and carry the learnings forward throughout the enterprise.

 

Your POV

Have you organized your enterprise to optimize for decision velocity? Ready to move from data to decisions?

Add your comments to the blog or reach me via email: R (at) ConstellationR (dot) com or R (at) SoftwareInsider (dot) org. Please let us know if you need help with your strategy efforts. Here’s how we can assist:

  • Developing your metaverse and digital business strategy
  • Connecting with other pioneers
  • Sharing best practices
  • Vendor selection
  • Implementation partner selection
  • Providing contract negotiations and software licensing support
  • Demystifying software licensing

Reprints can be purchased through Constellation Research, Inc. To request official reprints in PDF format, please contact Sales.

Disclosures

Although we work closely with many mega software vendors, we want you to trust us. For the full disclosure policy, stay tuned for the full client list on the Constellation Research website. * Not responsible for any factual errors or omissions. However, happy to correct any errors upon email receipt.

Constellation Research recommends that readers consult a stock professional for their investment guidance. Investors should understand the potential conflicts of interest analysts might face. Constellation does not underwrite or own the securities of the companies the analysts cover. Analysts themselves sometimes own stocks in the companies they cover—either directly or indirectly, such as through employee stock-purchase pools in which they and their colleagues participate. As a general matter, investors should not rely solely on an analyst’s recommendation when deciding whether to buy, hold, or sell a stock. Instead, they should also do their own research—such as reading the prospectus for new companies or for public companies, the quarterly and annual reports filed with the SEC—to confirm whether a particular investment is appropriate for them in light of their individual financial circumstances.

Copyright © 2001 – 2022 R Wang and Insider Associates, LLC All rights reserved.

Contact the Sales team to purchase this report on an a la carte basis or join the Constellation Executive Network


News Analysis: Inside Disney's Earnings and Streaming Wars Among A Tech Market Rout


Disney+ Logo 

Disney's Performance Shows Strength and Depth Of Portfolio

Quality balance sheets and predictable revenues are key to sustaining stock prices during the current market rout. Investors care only about future forecast guidance, despite current quarterly performance. While Disney's Q1 2022 showed 23% YoY gains with $1.4 billion in operating profit, guidance has been muted in spite of comparisons against pandemic-depressed quarters.

Disney+ Continues To Grow While Netflix Falters

Subscriber growth slowed in Q1, but Disney's streaming offering still grew revenue 5% and added 7.9M subscribers for a total of 137.7M. Disney+ as a standalone offering is the clear #3 in the market. When the complete Disney streaming offerings are tabulated, they now surpass Amazon Prime with 205 million total subscribers.
 
Good news for investors as Disney contemplates a new ad-supported subscription tier and continued international expansion. International expansion will definitely drive down average revenue per user. However, the streaming player faces additional headwinds with content libraries being pulled back. Lack of content availability may have an impact on near-term subscriber adds. Further, costs are up as Disney plans $32B in content spend.

Figure 1. The Key Streaming Players (total subscribers)

Netflix: 221.8M
Amazon Prime: 200M
Disney+: 137.7M
HBO Max: 73.8M
Paramount+: 56M
Hulu: 45.3M
Discovery+: 22M
Apple TV+: 20M

Parks

The park business showed massive demand for revenge travel. Disney doubled its revenues to $6.8B as hotel, cruise, and concessions showed growth. Disney’s parks business is a shining light for reopening, but inflation will impact Disney later in the year as labor, energy, and supply chain costs eat at profit margins. Disney could see more growth upside if Asia finally opens up, as Hong Kong is open but Shanghai is closed.

Studios

Movie openings will be a bright spot, though, as this is one revenue stream with room to grow as Americans flock to movie theaters for openings this summer. Disney could see upside with future box office hits.
 

The Bottom Line

Meanwhile, the culture wars continue to roil Disney internally, with 200 employees protesting a move to Florida, and the battle with the state continues as backlash. Overall, Disney has weathered the streaming wars well during lockdown and is poised for success with more re-openings. Add in potential metaverse opportunities, and expect Disney to move from media giant to tech giant in the next five years.

Your POV

Who do you think will win the streaming wars? Where do you see Disney in the future of the metaverse?
