
Why isn’t identity easy?


My presentation to this year's Identiverse (handouts: SWILSON Why Isnt Identity Easy HANDOUTS)

From the beginning of e-commerce, we have tended to complicate "digital identity". Its relationship with "real" identity has never been clear, and loosely defined metaphorical efforts like "electronic passports" have never fixed liability. The Anglophone (Five Eyes) countries have no tradition of national ID; occasional proposals have been roundly rejected by the public, which has poisoned much of the discourse around identity capability building. Private-sector initiatives in the banking and technology sectors (e.g. Identrus and Infocard) came and went. British and American efforts at grand public-private federations attracted very few Relying Parties (despite the urgency of better identity) and spawned no commercially sustainable identity businesses. Scandinavia and the Baltics have seemed successful with multi-purpose digital IDs, but these are backed by specific legislation. A few free-market identity frameworks remain in development in Australia and Canada; progress has been disappointing and their futures remain uncertain.

The hottest developments in digital identity today are not about identity at all. Instead, Verifiable Credentials are digitally signed data structures which convey details of the credential issuer and other conditions specific to different use cases. Verifiable Credentials are bound to cryptographic keys controlled by the credential holders, so they can't be copied, spoofed or counterfeited. Fresh Verifiable Credential standards are being developed that update earlier PKI certificates. Multiple credentials and identifiers can be conveniently carried in personal data wallets or data carriers, which ideally feature secure elements to house the all-important private keys.
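To make the pattern concrete, here is a minimal, purely illustrative Python sketch of the issue/verify cycle. All names are hypothetical, and it uses HMAC as a stand-in for a real digital signature; HMAC is symmetric, so this toy assumes the verifier shares the issuer's secret, whereas real Verifiable Credentials use public-key signatures (e.g. Ed25519) so anyone holding the issuer's public key can verify.

```python
import hashlib
import hmac
import json

def issue_credential(issuer_key: bytes, claims: dict, holder_key_id: str) -> dict:
    """Issuer signs the claims and binds them to the holder's key identifier."""
    payload = {"claims": claims, "holder_key": holder_key_id}
    body = json.dumps(payload, sort_keys=True).encode()
    signature = hmac.new(issuer_key, body, hashlib.sha256).hexdigest()
    return {"payload": payload, "signature": signature}

def verify_credential(issuer_key: bytes, credential: dict) -> bool:
    """Relying party recomputes the signature; any tampering breaks the match."""
    body = json.dumps(credential["payload"], sort_keys=True).encode()
    expected = hmac.new(issuer_key, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["signature"])

cred = issue_credential(b"issuer-secret", {"over_18": True}, "holder-key-001")
assert verify_credential(b"issuer-secret", cred)

# Altering any claim invalidates the credential.
cred["payload"]["claims"]["over_18"] = False
assert not verify_credential(b"issuer-secret", cred)
```

The point of the sketch is the binding: the verifier checks the data and its provenance together, without ever needing to "identify" the holder in any richer sense.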

In digital life and work, we need to show things about ourselves: typically discrete pieces of important data, backed up by metadata that proves origin, Ts&Cs, regulatory commitments, consent and so on, depending on context. If we generalise from credit cards (including terminals, data carriers, standardised contracts and service providers) we could build truly global infostructure for verifying everything we routinely need to know.

In conclusion, we in the industry have made digital identity hard by solving for the wrong problem! Personal human identity is always going to be rich, relative and analog but the challenges in the digital domain boil down to trustable data. We know how to solve for data reliability, by blending cryptography and governance. Our industry has been evolving and shifting focus from identity through attributes to arrive now at Verifiable Credentials. Let’s keep up the good work, let’s be clear about where actual identity problems lie, and use our tools to build infostructure for verifiable data across cyberspace and the digital economy. 

The post Why isn’t identity easy? appeared first on Lockstep.


HOT TAKE: Qualtrics Just Turned UP the Volume on Voice with Clarabridge


It is officially summer and the shopping spree continues. Some are saying this latest trip to the Billion-Dollar-Mall is set to upend markets…and yes, that’s the short-term news for those following the customer feedback charts. I’m more inclined to see a bigger shake up beyond that as new and interesting moves across multiple players could be signaling a bigger win. But more on that in a bit. Let’s get to the meat of it: Last week’s late announcement has Qualtrics scooping up feedback and analytics darling Clarabridge for a cool $1.1 billion in stock.

Qualtrics has been on a streak since spinning out of SAP in summer 2020. By January 2021, the experience intelligence firm had crushed expectations on an IPO…paid off debt…and still managed to leave some acquisition bucks burning a hole in their pockets. In Clarabridge, Qualtrics has a worthwhile expansion of customer feedback and voice aggregation capabilities that they need to fully realize the vision of being the ultimate engine for experience decision-making and engagement optimization.

Clarabridge has made a name for itself as an AI powerhouse taking natural language processing to new levels, turning customer service and call center interactions into the highest of high-fidelity signals informing and molding customer engagements. With the capacity to analyze ALL interactions, from human to bot, the Clarabridge solution unifies customer interactions into a single analytics pane, including attributes such as scores on customer effort, emotion and sentiment, and a tally of topics covered over time. Now just imagine, for a second, where else Qualtrics can point all that AI and NLP power. Did I hear someone say enterprise-wide unstructured data, perhaps?

The beauty of this move is that it pairs the direct feedback customers provide through Qualtrics interactions with the direct dialogue customers are having with any agent, be they human or AI.

OK…so let’s address the big elephant in my post. That’s right…I can’t with calling either platform an “experience management” solution as I am 100% in the “customers are and will remain the only source of managing their own experiences” camp. But…I will say that when it comes to customer AND experience intelligence, Qualtrics is racing to the top of that hill. But they are NOT alone. With the Customerville pickup by IFS, you have Qualtrics running up the experience side of customer mountain while IFS races up via customer service, field service, enterprise asset management (EAM) and even ERP lanes. That doesn’t even start to address what competitors like Medallia and InMoment are cooking up.

So now I’m sitting here looking around at players like Invoca, SupportLogic and UserTesting and just…well…wondering. Remember how last week with the Zoom/Five9 acquisition news I said a reckoning was coming? Waves are being made…in crazy directions and it couldn’t happen at a better time when CX juggernauts across Marketing Automation and Sales Engagement are starting down an evolutionary path we are just starting to decipher.

But here is my off the wall hot take: The biggest winners in this just might be SAP...and SAP’s CDP and CX portfolio customers.

WUT? Has she lost it? Well yes, but that is a totally separate blog post. It is easy to forget…but don’t…Qualtrics is still owned by SAP (or more specifically, SAP is still the majority shareholder) and SAP customers have been bathing in the CX messaging of Qualtrics as a critical inbound connection to customer voice that needs to be ingested into SAP’s CDP.

Words like “native” and “out of the box” integrations have been core to the SAP/Qualtrics CX decision velocity story for some time now. So now…with Clarabridge…turn on the fire hose SAP CDP customers! It takes customer voice programs to a new level with AI powered analysis of natural language, connected to a persistent, unified and harmonized customer record that now feeds all that joy BACK into a system of engagement that spans sales, service, marketing and commerce. Suddenly asking about an NPS score looks even more outdated as customers can start asking, no really, why did LIZ walk away from that last experience putting out maximum effort for minimal reward…and what do we do next to flip that?


How Companies Embed Analytics for Digital Transformation | Sisense and Constellation Research


Doug Henschen of Constellation Research and Ryan Segar of Sisense discuss the market challenges of BI, and how companies use embedded intelligence, predictive analytics, and ML to innovate. For further insights, we invite you to examine the evolution of embedded analytics now at https://www.sisense.com/reports/

Watch the video: https://www.youtube.com/watch?v=qwzBFhBU4D8

HOT TAKE: Zoom to Scoop Up Five9…and I’m Not Mad At It


For a first trip to the Billion-Dollar-Baby Mall, Zoom made quite the purchase with their $14.7 billion deal for cloud contact center software player Five9. The obvious conversations about this deal center around Zoom’s explosive COVID-fueled growth that needed someplace to crash. Sure, Zoom grew, by their own estimates, 326% in 2020, and that steep climb was going to slow. Acquisition felt like the obvious answer for anyone watching. Sorry…that’s the boring story.

Let’s establish a timeline…

April 2021 Zoom announces the Zoom Apps Fund. It’s a fund to jumpstart the growth of the Zoom ecosystem of apps. Everybody and their momma has an ecosystem and a fund or “ventures”. But look past Zoom’s whole these apps are an “important component in building the future of video communications” bit. Instead turn your eye to a new line that has crept into almost every corporate communication positioning Zoom as “core to how customers meet, communicate and collaborate.”

Core to how customers meet, communicate and collaborate.

May 2021 Zoom announces Zoom Event Platform for Virtual Experiences. This one flew past a lot of folks. The new capabilities combined customers, communication, collaboration and, in the case of events, commercialization. With the new event platform, events can be ticketed, with controlled access and billing in one portal, plus a new integrated networking module for attendee connection and back-end tracking across everything from registration to revenue.

Core to how customers meet, communicate and collaborate.

June 2021 Zoom launches Zoom Phone Appliances, representing a move into hardware where Zoom tech meets hardware from big names like Poly and Yealink to bring video and voice calling to an all-in-one desk phone. And yes…thanks to that Zoom Rooms tech…this slicer and dicer even does whiteboards.

June 2021 Zoom dipped a toe into the buying pond with the pickup of Kites GmbH, whose cross-language interaction solution ramps up Zoom’s machine translation capabilities. So now, not only is Zoom going to be core to how customers meet, communicate and collaborate…they can take it all global…with a fancy new phone.

Core to how ANY customer meets, communicates and collaborates…anywhere.

So now…let’s talk Five9. First it helps to understand that Five9 isn’t your average cloud contact center software kit. It too saw explosive growth in 2020 as brands across both B2B and B2C markets had to send their front line of customer engagement and experience home. Call center agents didn't just start working at home; they needed to be able to connect and engage with customer callers who were in the same motion of uncertainty and flux.

Five9’s solution is underpinned with AI tools that turn everything into smarter things…smart dialing, smarter self-service, smart routing, virtual agents, virtual assistants…basically everything smarter that employees and customers crave to make life (and work life) just a little bit easier. The cloud solution is also untethered to a single communications channel allowing a customer to choose not just when, but also where they need to or want to engage from mobile to messaging.

Dig a bit deeper into the portfolio and you start to see a common thread…it isn’t just about how call center employees could connect and, in essence, collaborate on problem solving and question resolution with a customer; the platform could also enable employees to connect and collaborate with other employees…and suddenly Five9 isn’t “just” a call center tool but part of a larger human capital management story...and dare I say a larger enterprise communication and collaboration story.

Dare I say it…it starts to sound like solutions that are core to how people (customers, employees, families, whoever) meet, communicate and collaborate.

So now I have to ask myself…what DOES Zoom want to be when it grows up? All signs point to leaving folks like RingCentral and even Cisco in the dust for something higher up the chain and even broader in the realm of customer experience. And they might just make it. For most platform players in the wild and bizarre world called “Customer Experience,” where everyone from surveys to emails is claiming the title of “customer experience platform,” the foundation of most of these solutions is still some Kuato-like vestige of a CRM solution, feeding off of its host waiting for Quaid to arrive.

But…what happens to CX when the center of gravity leaves the familiar trappings of CRM and takes up residence in the land of communication and service? What happens when the center point for a CX platform is literally the center point for communication with any customer or any employee? My obvious questions are not what networking, router or VoIP system Zoom is eyeing next…but rather what CDP, omnichannel engagement solution, DXP, marketing automation or even sales engagement system Zoom could be eyeing next.

Sure, Zoom started as the “better and sassier” video conference alternative to Cisco. But there is something bigger at play in the opportunity…something bigger than shaking up a stale market and a stale industry. It is the chance to unexpectedly reshape it all and drive analysts like me INSANE as we grapple with new acronyms to segment this new space. This Five9 pickup hints at just how big and bold Zoom CEO Eric Yuan can go. At least to me, this is just the opening salvo.

In the end, this is how I'd summarize this acquisition: Together, Zoom and Five9 dare to be core to how customers and employees meet, communicate and collaborate.


Monday's Musings: Get Ready For Big Government Vs Big Tech


THE AGE OF BIG TECH REGULATIONS HAS ARRIVED

Governments across the world seek to rein in the digital giants for political, economic, and societal reasons. While most digital giants have not flouted the law, the public perception of their enormous size, massive scale, and unfair competitive advantage drives political motivations to check the power of big tech ahead of any economic or societal costs.

The five recent anti-trust bills proposed by the House Democrats and the House Republicans' anti-trust agenda reflect the growing bi-partisan sentiment to rein in digital giants. Further, China’s crackdown on its digital giants such as Alibaba’s Alipay and Didi highlights both the threat digital giants pose to the power structure of the Chinese Communist Party and the populist sentiment that government must do something to protect citizens. And the recent global minimum tax deal endorsed by the G7 seeks to close loopholes that digital giants have effectively used to reduce their tax burdens.

Consequently, in the age of big tech regulations, both built-from-the-ground-up digital giants and joint venture digital giants will experience a high level of scrutiny for potential marketplace abuses. In highly regulated jurisdictions such as the EU, market dominance is generally presumed at more than 40% market share. But while the percentage thresholds for what constitutes market dominance may change, how dominance is defined will play a significant role in identifying digital giants that have the power to abuse their market position.

LOOKING FOR ANTI-COMPETITIVE PRACTICES

As we saw in the battle for mobile operating systems between Google Android and Apple iOS, market share alone does not necessarily convey market dominance. Apple, with less than 20 percent adoption versus Google’s 80 percent market share, has managed to drive more profits than Google thanks to a more attractive ecosystem. In the future, market dominance might be defined as the percentage of paid users, the percentage of total transactions, the percentage of personal data controlled, or the percentage of economic value created.

Once a digital giant is deemed a dominant company, regulators and politicians seeking to create a fairer playing field should look out for these anti-competitive practices:

  1. Forced upsell or cross-sell (Tying). When a company requires that the purchase of one product or service be made with the sale of another to restrict consumer choice. For instance, you want to purchase a video game and you are forced to buy the augmented reality headset with it whether you need it or not.
  2. Bundling. Similar to forced upsell or cross-sell, when a supplier will only sell products that are put into a combined package and will not sell the individual item. For example, you are required to purchase an extended digital warranty for each digital product you buy.
  3. Collusion and price fixing. When competitors work together to pre-determine and agree to either set, increase, or lower prices to impact the market. Imagine every digital mortgage broker got together to agree that they will charge no commission below 2.5 percent. In this case, fixing the floor in pricing would be a form of collusion.
  4. Exclusive dealing. When a customer must purchase a majority or all of a particular type of good or service from a company and is excluded from purchasing from its competitors. One example would be when a hospital is forced to buy only one brand of hardware to run a type of healthcare software, even though the software could run perfectly on any device.
  5. Exclusive rebates. Plans or loyalty programs that force a customer to purchase the majority or all of their goods or services from one company, preventing them from purchasing from a competitor in order to receive a discount. An example of an exclusive rebate would be a vendor providing rebates and volume discounts only if the customer makes all their purchases exclusively through that vendor.
  6. Margin squeezing. When an integrated firm sells a product that is an essential input for a downstream rival at a similar price to the integrated firm’s finished product, in order to hamper the rival’s ability to survive or compete. For instance, a software vendor that resells its component code to competitors as well as builds its own software on the same code decides to increase the price of the component code it charges competitors while charging its own internal teams less.
  7. Predatory pricing. When a dominant entity reduces prices that create market side losses in order to force competitors out of a market. For example, a digital giant offers free shipping and returns for purchases at a loss to increase the cost of business for a competitor.
  8. Price discrimination. When some market participants are arbitrarily charged higher prices unrelated to the actual costs of supplying, creating, or distributing the service or good. Charging small businesses one price for a product and large enterprises a higher price without a regard for how much volume they order could be considered a price discrimination violation.
  9. Refusal to provide IP. When a dominant firm refuses to license critical intellectual property to potential competitors. An example would be if a critical security technology is banned for use in a competitor’s product, leaving them vulnerable to security attacks.
  10. Refusal to supply. Similar to the refusal to provide IP, when a dominant firm refuses to supply a competitor with a good or service, limiting access in order to eliminate competition.

ENCOURAGING INNOVATION THROUGH ANTI-TRUST LAWS

One of the negative consequences of a duopoly market for consumers is that the incumbents block competitors from gaining critical mass. In the course of a challenger’s lifecycle, leading digital giants will try to partner with it, acquire it, or otherwise threaten it with retaliatory measures. The result? Each value chain will be left with only a couple of digital giants and very small players in niche markets.

In order to encourage competition, governments must enforce anti-trust laws that ensure a free-market system, and corporations must respect these laws. In the US, three pieces of legislation form the core of our antitrust laws. The Sherman Anti-Trust Act sets rules to prevent restraint of trade or the conspiracy to restrain trade. Fines for violating these rules include up to $100 million for organizations and $1 million for individuals, with up to 10 years of imprisonment. The Clayton Antitrust Act regulates mergers and acquisitions, pricing, discounts, and other unfair practices that reduce competition and enable the creation of monopolies. The Federal Trade Commission Act bans all unfair or deceptive acts or practices and unfair methods of competition.

In the US, the Federal Trade Commission (FTC) plays a significant role in supporting free and open markets through competition. The creation, adoption, and enforcement of these rigorous anti-trust rules allow for vigorous competition on the merits of a company’s offerings. Anti-trust rules often remove the impediments to economic opportunity and power economic growth. Without these laws, consumers would face limited access to products and services and would pay higher prices for goods and services. Some of the common anti-trust rules involve the prevention of bid rigging, market allocation, anti-competitive mergers and acquisitions, and price fixing.

PROTECTING DIGITAL GIANTS FROM OVER-REGULATION

The balance between over-regulation and no regulation is delicate. To achieve it, regulatory bodies have to engage in a cost-benefit analysis. Most regulation is designed to protect the consumer and the smaller competitors. But regulation must also consider the benefits that come from a duopoly market: the stable creation of a new market, increase in the number of jobs, and market efficiencies gained for the consumer. New markets create and expand new categories for spending and for hiring. New types of jobs are created along with new opportunities for both the new market and its ancillary ecosystem. Market efficiencies include cheaper pricing, easier access, and more choices for customers.

In some markets, the minimum efficient scale for a given product, service, or experience is one or two per market—or even per planet. At that point these entities may emerge as regulated duopolies like Airbus and Boeing. They both sell planes, everyone can board them at an airport, and few passengers care what type of plane they are boarding. These factors should be taken into account when considering regulations of digital duopolies. 

A key guideline to keep regulation in check is to put the burden of proof on governments to show how an organization’s actions or behaviors harm consumers in a market as opposed to measuring how a policy change may improve competition. For example, dominant streaming services built on membership and advertising revenue may end up creating new models that disrupt traditional cable, satellite, and movie theater distribution. The net result is that consumers will pay less per view, content creators and consumers will avoid massive distribution fees, and more money will go back to the content creator. These benefits could outweigh the fact that there are only two streaming players with a dominant market share, and that more expensive, traditional competitors are being disrupted in the market.

The successful balance between growth and regulation will require a deft hand and smart regulators. The massive influence of digital giants on politics, society, and the economy requires the smartest and most skilled professionals to ensure that policies not only address the policy needs of today, but also build in a futurist view of the impact today’s policies may have for generations to come.

My book, "Everybody Wants to Rule the World," talks about what needs to be done to regulate big tech while balancing the costs and benefits. Get the book on pre-order now and receive the first 3 chapters today: https://amzn.to/3uR9Q9I

Your POV

How will we balance innovation with regulation?  What's required for free and fair markets?

Add your comments to the blog or reach me via email: R (at) ConstellationR (dot) com or R (at) SoftwareInsider (dot) org. Please let us know if you need help with your AI and Digital Business transformation efforts. Here’s how we can assist:

  • Developing your digital business strategy
  • Connecting with other pioneers
  • Sharing best practices
  • Vendor selection
  • Implementation partner selection
  • Providing contract negotiations and software licensing support
  • Demystifying software licensing

Reprints can be purchased through Constellation Research, Inc. To request official reprints in PDF format, please contact Sales.

Disclosures

Although we work closely with many mega software vendors, we want you to trust us. For the full disclosure policy, stay tuned for the full client list on the Constellation Research website. * Not responsible for any factual errors or omissions. However, happy to correct any errors upon email receipt.

Constellation Research recommends that readers consult a stock professional for their investment guidance. Investors should understand the potential conflicts of interest analysts might face. Constellation does not underwrite or own the securities of the companies the analysts cover. Analysts themselves sometimes own stocks in the companies they cover—either directly or indirectly, such as through employee stock-purchase pools in which they and their colleagues participate. As a general matter, investors should not rely solely on an analyst’s recommendation when deciding whether to buy, hold, or sell a stock. Instead, they should also do their own research—such as reading the prospectus for new companies or, for public companies, the quarterly and annual reports filed with the SEC—to confirm whether a particular investment is appropriate for them in light of their individual financial circumstances.

Copyright © 2001 – 2021 R Wang and Insider Associates, LLC All rights reserved.

Contact the Sales team to purchase this report on an a la carte basis or join the Constellation Executive Network


How DataStax is Emerging as a Strategic Anchor in Cloud Data Management


DataStax famously has its roots in the 2008 open source release of the Apache Cassandra “NoSQL” database, which itself brought global scale database capabilities to companies seeking to overcome the growing limitations of their traditional relational databases. Though founded two years later in 2010 to support enterprise customers, the company has increasingly become a key player in how businesses think about managing data in the cloud, as well as how they hedge the risk of lock-in with large commercial cloud vendors.

Cassandra itself, and therefore DataStax, is a Java-based product that makes for easy uptake and management in IT shops because most such organizations have Java skills, which remain widely available. At its heart, DataStax is aiming at companies that have the classic problem of coping with large-scale data in the cloud, along with high geographic dispersion of said data. The Home Depot, for example, uses DataStax to connect its logistics, delivery, supply chain, customers, digital channels, and associates. With Cassandra and DataStax, the home improvement retail giant successfully launched curbside pickup quickly. DataStax’s key strength is in making distributed data straightforward and workable for any business.

DataStax: Agility, Scale, and Choice in the Cloud

On its face, DataStax is to Cassandra what Red Hat is to Linux: a strategic offering that makes it easy to really bet on the product. The company provides a multicloud-ready, serverless database-as-a-service named Astra DB so that organizations can just use it as a service with little to no provisioning. It also provides commercial support for Cassandra and an enterprise-optimized version for on-premises deployments known as DataStax Enterprise. The company recently launched a new data-streaming-as-a-service offering called Astra Streaming that’s built on Apache Pulsar to take advantage of that major trend as well. The offerings are fully cloud-native, run in Kubernetes, and generally play quite well within an assembled modern cloud IT stack that IT needs now to be as future-ready and future-proof as possible.

Perhaps the single most important aspect to understand about DataStax is its ability to easily scale data either outside or within the confines of a large commercial cloud. The underlying technology, Cassandra, can easily handle Internet services the size of Facebook, which is where it originally came from. Organizations can build on and wield DataStax as their core operational database, assured that they can grow the largest business possible on it. And perhaps most significantly, all without being locked into the various data storage offerings of Amazon Web Services (AWS), Azure, or Google Cloud.
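The scaling model behind that claim is easy to picture: Cassandra hashes each row's partition key onto a token ring, and each node owns ranges of tokens, so data spreads across nodes deterministically and adding nodes redistributes only a slice of the data. A simplified, illustrative Python sketch of that placement idea follows; it is not the driver's actual API, and all names are invented for the example.

```python
import bisect
import hashlib

class TokenRing:
    """Toy consistent-hash ring approximating Cassandra-style data placement."""

    def __init__(self, nodes, vnodes=8):
        # Each node owns several virtual tokens to smooth the distribution.
        self.ring = sorted(
            (self._token(f"{node}:{i}"), node)
            for node in nodes
            for i in range(vnodes)
        )
        self.tokens = [t for t, _ in self.ring]

    @staticmethod
    def _token(key: str) -> int:
        # Hash a key to a position on the ring.
        return int.from_bytes(hashlib.md5(key.encode()).digest()[:8], "big")

    def node_for(self, partition_key: str) -> str:
        """Walk clockwise from the key's token to the next node token."""
        idx = bisect.bisect(self.tokens, self._token(partition_key)) % len(self.ring)
        return self.ring[idx][1]

ring = TokenRing(["node-a", "node-b", "node-c"])
placements = {k: ring.node_for(k) for k in ("order:1001", "order:1002", "order:1003")}
# Every key deterministically maps to exactly one node on the ring.
```

Because placement is a pure function of the key, any client can route reads and writes without a central coordinator, which is a large part of why this architecture scales so readily.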

An Agile Cloud of Data for the Future

The digital world is still in its infancy and data growth is rapidly outpacing growth of compute speed, network speed, and storage. Managing truly vast amounts of data and feeding it through highly analytical and artificially intelligent systems to extract insights and value is becoming the top objective of just about every organization today and in coming years. Consequently, selecting the best database is a truly strategic decision today. The pernicious effects of forces like data gravity, which makes it harder to move from cloud to cloud as data volume increases, means the choice of the right database is crucial.

I’ve had a chance to talk with executives at DataStax over the last few weeks to understand their strategy, and also so I can explain it to CIOs and other IT executives and digital teams. The company has its eye closely on where the industry is headed. They bring true database elasticity, high performance, cloud-native support, and the ability, when used appropriately, to substantially reduce the risks of data gravity and commercial cloud lock-in. DataStax also understands that the cloud is steadily moving towards models for radical ease of consumption, with serverless models for just about everything. The company also appreciates the painful lessons of the 1980s and 1990s, when putting all the eggs of IT in one basket proved unwise, as well as the era after that, when commercial SQL databases like Oracle, with its near-predatory pricing, dominated. These are eras that most companies do not wish to return to as they migrate to the cloud. In short, my analysis is that DataStax is at present one of the clearest routes to ensuring a move to the cloud doesn’t end up like these previous journeys.

Ultimately, DataStax offers the industry a model for enterprise data that gives companies moving to the cloud a high degree of choice, sustainability, and agility. It enables CIOs to craft a best-of-breed, multicloud-ready IT stack from the data layer up, one with genuine long-term legs and capabilities repeatedly proven within the Fortune 100. The company itself is healthy and growing, and recently received funding led by Goldman Sachs.

In the end, data is the only truly irreplaceable asset organizations have, yet it remains underused and under-leveraged in most of them, in part due to over-reliance on schema-heavy, harder-to-access, more difficult-to-consume traditional databases. Having a strategic cloud data capability that is nimble, consumable, scalable, sustainable, and IaaS-neutral must be a key objective for IT today. Making the right choice unlocks the most value, helps achieve successful digital transformation, and increases innovation, all while preserving the most flexibility and options among cloud vendors.


Google Cloud Maturing Into a Five-Tool Player in Data

Google Cloud’s Dataplex, Datastream and Vertex AI announcements point to a well-rounded platform.

The star performers in baseball are known as five-tool players, meaning they hit for average, hit for power, and excel at base-running, throwing and fielding. Think Hank Aaron, Willie Mays and Ken Griffey, Jr. The five-tool equivalents in data excel at data integration, data platforms, data governance, analytics and data science.

Enterprises scouting for star-caliber vendors have certainly had their heads turned by Google Cloud on the strength of its data platforms, with BigQuery being a standout, and its data science capabilities, leading with TensorFlow. (Indeed, Major League Baseball itself is a Google customer that uses BigQuery, among other services). But Google has been akin to an up-and-coming baseball star like Fernando Tatis, Jr. or Vladimir Guerrero, Jr.: It’s clear that both are going to be superstars based on their prodigious hitting, but Tatis needs to work on his fielding while Guerrero is just average when it comes to running the bases. These players need time to mature and develop all five tools.

So it goes with Google Cloud, which is maturing into a five-tool data platform. During the Google Data Cloud Summit last month, it became clear that the pace of maturation is accelerating, with three crucial services announced: Dataplex, Datastream and Vertex AI. These services will help to fill out an integrated, end-to-end platform for data engineers, data scientists, developers, data analysts and business users (see slide below).

Google Cloud is filling out an integrated data platform aimed at a broad spectrum of users.

Google describes Dataplex, now in preview, as an “intelligent data fabric.” We’ve seen fabrics (also known as data virtualization and data federation) before. What is now TIBCO Data Virtualization, for example, was founded as Composite Software in the mid-2000s. Microsoft SQL Server PolyBase and Teradata Query Grid are two other examples. IBM announced an all-new fabric offering, called AutoSQL, at the IBM Think event last month.

The idea with data fabrics is to virtualize access to data, with queries reaching out to myriad, distributed sources without having to move or copy that information into a centralized data warehouse. Fabrics increasingly extend across data lakes and data science environments.
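The virtualization idea can be sketched in miniature. The example below is a minimal, hypothetical illustration (the "orders" and "customers" sources and all table names are invented, and SQLite's `ATTACH` stands in for a real fabric's federated query engine): one query joins two independent databases in place, with nothing copied into a central warehouse.

```python
import sqlite3

# Two independent data sources, standing in for distributed systems.
# (Hypothetical "orders" and "customers" stores; names are illustrative.)
orders_db = sqlite3.connect("orders.db")
orders_db.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, customer_id INTEGER, total REAL)")
orders_db.execute("DELETE FROM orders")
orders_db.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                      [(1, 10, 99.0), (2, 11, 25.5)])
orders_db.commit()
orders_db.close()

crm_db = sqlite3.connect("crm.db")
crm_db.execute("CREATE TABLE IF NOT EXISTS customers (id INTEGER, name TEXT)")
crm_db.execute("DELETE FROM customers")
crm_db.executemany("INSERT INTO customers VALUES (?, ?)",
                   [(10, "Acme"), (11, "Globex")])
crm_db.commit()
crm_db.close()

# The "fabric": one query engine attaches both sources and joins them
# in place -- no copy into a centralized warehouse.
fabric = sqlite3.connect("orders.db")
fabric.execute("ATTACH DATABASE 'crm.db' AS crm")
rows = fabric.execute("""
    SELECT c.name, SUM(o.total)
    FROM orders o JOIN crm.customers c ON o.customer_id = c.id
    GROUP BY c.name ORDER BY c.name
""").fetchall()
print(rows)  # [('Acme', 99.0), ('Globex', 25.5)]
fabric.close()
```

Real fabrics add the pieces this sketch omits: query pushdown to heterogeneous engines, caching, and the governance layer discussed below.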

The “intelligent” part of Dataplex promises “automatic data discovery, metadata harvesting… and data quality with built-in AI.” Google is also touting centralized security and governance capabilities, including “data policy management, monitoring and auditing for data authorization, retention and classification.”

It’s pretty clear that Dataplex is combining data virtualization, which I would put in the data integration camp, with the data governance role, addressed elsewhere by metadata management and governance offerings such as the independent Collibra, IBM Watson Knowledge Catalog and (also in preview) Microsoft Azure Purview.

Given Google’s multi-cloud efforts with Google Anthos and BigQuery Omni (which now extends BigQuery data access to AWS and Azure), Dataplex will surely extend beyond Google Cloud, but we’ll have to wait to see what’s available at launch and what comes later. Google has stated that support for other data sources is coming “soon.”

Datastream, a second big announcement at last month’s Google Data Cloud Summit, is a change data capture (CDC) and replication service, now in preview, that will support low-latency requirements including real-time analytics, heterogeneous database synchronization and event-driven architectures. Squarely in the data integration camp, CDC technology is also not new, with long-standing market leaders being Oracle GoldenGate and Qlik Data Streaming (CDC) (formerly Attunity). Nonetheless, the company says its offering takes a different approach by offering a serverless service that automatically scales while replicating and synchronizing data with minimal latency. It integrates with services including BigQuery, Cloud Spanner, Dataflow and Data Fusion.
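The core CDC pattern is simple enough to sketch. The toy example below (the event shapes are illustrative, not any vendor's actual log format) shows the essential mechanic: the source emits an ordered stream of change events, and a consumer replays them to keep a replica in sync without bulk re-copies.

```python
# Minimal sketch of log-based change data capture (CDC): the source
# database emits an ordered stream of change events, and a consumer
# applies them to keep a replica in sync without bulk re-copies.
# Event shapes here are illustrative, not any vendor's actual format.

change_log = [
    {"op": "insert", "key": 1, "row": {"name": "Ada", "plan": "pro"}},
    {"op": "insert", "key": 2, "row": {"name": "Bo", "plan": "free"}},
    {"op": "update", "key": 2, "row": {"name": "Bo", "plan": "pro"}},
    {"op": "delete", "key": 1},
]

def apply_changes(replica: dict, events) -> dict:
    """Replay change events, in order, onto a keyed replica table."""
    for ev in events:
        if ev["op"] in ("insert", "update"):
            replica[ev["key"]] = ev["row"]
        elif ev["op"] == "delete":
            replica.pop(ev["key"], None)
    return replica

replica = apply_changes({}, change_log)
print(replica)  # {2: {'name': 'Bo', 'plan': 'pro'}}
```

Production services like Datastream layer ordering guarantees, schema handling, and low-latency delivery to downstream targets on top of this basic replay loop.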

Vertex AI has been enhanced with new services to ease the production deployment of AI and ML models.

The third notable announcement last month was Vertex AI, (formerly “Unified AI Platform”) announced at Google I/O. Though this platform for model builders is not entirely new, it has many new components (shown above), including a feature store, pipeline capabilities and model monitoring capabilities. This builds on Google’s already strong AI capabilities by rounding out the tools needed to get models into production more quickly.

In addition to the new integration, governance and data science tools detailed above, Google also announced a preview Analytics Hub service that will provide a collaborative library for curating analytic assets and sharing and monetizing data. So here again, Google is building on core strengths like BigQuery and rounding out the portfolio to be a complete data player. I’m looking forward to an increasingly competitive public cloud playoff season that will extend over the months and years to come.


It’s a Hot Summer for Sales Tech

The thermometer outside says it’s 90 degrees, but as I sit gently cooking at my desk, the real heat these days is in the sales tech market. The last four weeks have seen four major funding announcements in what I’ve broadly called the B2B seller enablement space. The amounts raised range from $80 million to $250 million with valuations from several hundred million to several billion dollars.

Don’t let the stupefying numbers (and temperatures) fool you—these companies are hot for good reason. Though none of the four are direct competitors (yet), they all address some of the fundamental challenges of selling effectively, each with its own particular focus and specialisms. They also fill one essential gap that has existed in traditional CRM systems for the past two decades: automating data capture and updates. This simple yet powerful ability to liberate sellers from administrative work is the single unifying factor among every vendor in this category, including the four companies here. It’s also the foundation of all the other capabilities that help improve the ways companies sell.

Here are the big announcements in order of recency:

Introhive Dials in Relationship Intelligence

On June 16, Introhive announced $100 million in Series C funding, led by PSG and including several existing investors. That’s a substantial increase from previous funding rounds totaling $40 million. Introhive, founded in 2012, has developed its offerings around the needs of businesses that build long-term relationships with their clients—think consultancies, law firms, recruitment companies, and, increasingly, financial services and technology companies. For these businesses, building customer relationships is often the job of partners and principals, not necessarily sales teams.

What distinguishes Introhive’s offerings in the market, aside from a clear understanding of how relationship-driven businesses operate, are relationship mapping and an almost obsessive focus on data quality. The company’s automated data capture uses machine learning to populate CRM systems with the timing and nature of customer interactions. At the same time, the system uses that data to build and score relationship maps across organizations, while also identifying important insights about customer accounts.

Introhive also has a specific offering around data cleansing, using AI to incorporate both internal and external data sources to validate CRM data. This emphasis on data quality has a substantial impact on the accuracy and reliability of the analysis Introhive provides, which spans account intelligence, relationship maps, coaching opportunities, and pipeline analysis.

Gong’s Conversational AI Drives Revenue Intelligence

Gong announced $250 million (the largest in our assortment here) in Series E funding on June 3. The round was led by Franklin Templeton along with numerous other existing investors. This brings the company’s total funding so far to $584 million and values Gong at $7.25 billion.

Founded in 2015, Gong’s claim to fame is using natural language processing (NLP) to understand and assess sales calls and communications. In addition to automated data capture and CRM updates from email, chat, phone, and video calls, Gong uses NLP to analyze the substance of those conversations. By identifying and extracting key words, Gong filters insights into deal intelligence, people intelligence, and market intelligence.

Deal intelligence provides sellers with a clear view into how opportunities are trending and what actions are most likely to close deals. People intelligence gives sales managers and leaders insight into what the most effective sellers are doing and when, as well as identifying where individuals may need coaching, training, or additional support. Market intelligence highlights trends across opportunities like key competitors, emerging trends, and the most effective value propositions.

Outreach Ups the Ante on Remote Selling

On June 2, one day before Gong, Outreach announced $200 million in—wait for it—Series G funding, increasing the company’s valuation to $4.2 billion and total funding to $489 million. This latest round was led by Premji Invest and Steadfast Capital Ventures, with additional investors participating.

Outreach’s founders initially developed the company’s technology when they were running a previous startup and realized their biggest challenge was identifying and pursuing customer leads with limited manpower. They needed to make sales but, with a small staff, couldn’t afford to waste time on low-value admin tasks or on pursuing apparent prospects that weren’t really interested.

To make the best use of what time and resources were available, they used machine learning to capture data from email exchanges and leveraged calendar plugins to automate meeting scheduling. From there, Outreach, founded in 2014, began to focus on developing automated cadences to maintain communication with prospects, gauge their level of interest, and focus on those most likely to close. Initially, the company’s bread and butter was supporting inside sales teams responsible for opportunity identification and qualification. With the pandemic-driven move to remote sales for all kinds of sellers, Outreach has expanded the scope of sales constituents it serves.

Dooly Does It

News about Dooly’s somewhat stealthy $80 million Series B funding round came out on May 20. The round, hot on the heels of its Series A funding, was led by Spark Capital with participation from several other investors. According to TechCrunch, this round, which brings total funding to $100 million, values the company at $300 million. Dooly was founded in 2016.

Dooly brings a straightforward yet extremely powerful offering to the seller enablement market: automated note taking and data syncing across a wide range of applications, including CRM. Through integrations with videoconference platforms, collaboration tools, email, calendar, document stores, and even other seller enablement tools, Dooly uses NLP to identify relevant keywords. This allows users and ops teams alike to build playbooks, connect useful information to keywords, and surface relevant supporting documents at the moment they’re needed.

The flexibility and extensibility of Dooly have broad application across customer-facing teams. By focusing on capturing insights and synchronizing them across multiple systems, Dooly removes a significant admin burden while improving transparency and coordination.

There’s a Big Waterfront to Play In

Looking across these four announcements, what strikes me most is the tremendous potential these companies (and others in this space) have to fundamentally change the way we work, starting with the way we sell. All these technologies are focused on making the job of selling easier and the experience of buyers better as a result. That’s a big reason why the people using each of them seem to be such big fans.

And, fascinatingly, it’s not uncommon to see two or three of the vendors here happily coexisting with a single customer. I wouldn’t be at all surprised to hear of all four of them being used in the same business. At some point, growth, expansion, and eventual market consolidation will draw starker competitive lines between them. For now, though, the diversity of approaches to tackling the challenges of B2B selling is a good thing. It means plenty of options for companies trying to find the right solutions to match the way they work, and the way they want to work in the future.

When you consider how much of the economy is comprised of companies that sell to other businesses, it’s easy to see just how big the opportunity really is. Every company has to sell, and everyone wants to do it better. Just wait until marketing is brought into the fold. What we’ve seen so far is still just a drop in the bucket.

On that note, time to go for a swim to cool off…

For more information and detailed discussion of the seller enablement market, see “B2B Relationship Selling in the Virtual Age: New Seller Enablement Tools Facilitate Old-School Fundamentals,” by yours truly.


Objectives and Key Results (OKRs) Turn COOs into Transformation Leaders

As a general rule of thumb, management theory has steadily fallen behind what new technologies are making possible. Digital performance management, talent analytics, work coordination and other powerful new Future of Work capabilities are now commonplace because of new breakthroughs in technologies and design.

But these developments are often poorly accounted for in how we strategically manage our businesses. Now a growing body of evidence has demonstrated that a potent enterprise-wide management approach that was originally pioneered by leading technology companies beginning a few decades ago has demonstrated real value in improving how we operate and transform our organizations. What's more, this approach can be combined with the aforementioned Future of Work technologies to consistently achieve better business outcomes.

Known as Objectives and Key Results, or OKRs, the approach was first used widely by senior managers at Intel, where it then spread to Google and was subsequently adopted by LinkedIn, Twitter, Dropbox, Spotify, AirBnB and Uber. It's hard not to notice that these companies are leaders in their industries, and it's widely believed that OKRs helped them get there.

The Spectrum of Managing Objectives and Key Results (OKRs)

The idea behind OKRs themselves is simple, and that's also why they work so well: OKRs help organizations better understand and achieve their objectives through clearly defining and then measuring concrete, specific outcomes.
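The structure itself is concrete enough to model directly. The sketch below is an illustrative rendering of that structure, not any tool's data model: one qualitative objective, several measurable key results, and a simple progress score (the field names and the 0.0-1.0 grading convention are my assumptions, loosely modeled on the common Google-style practice).

```python
from dataclasses import dataclass, field

# Illustrative sketch of the OKR structure: a qualitative objective,
# measurable key results, and a 0.0-1.0 progress score per key result.
# (Field names and the grading convention are assumptions, not any
# specific platform's schema.)

@dataclass
class KeyResult:
    description: str
    target: float
    current: float = 0.0

    def score(self) -> float:
        """Progress toward the target, capped at 1.0."""
        return min(self.current / self.target, 1.0) if self.target else 0.0

@dataclass
class Objective:
    description: str
    key_results: list = field(default_factory=list)

    def score(self) -> float:
        """An objective's score is the average of its key results."""
        if not self.key_results:
            return 0.0
        return sum(kr.score() for kr in self.key_results) / len(self.key_results)

okr = Objective("Improve the customer onboarding experience", [
    KeyResult("Run 10 onboarding workshops", target=10, current=10),
    KeyResult("Raise onboarding NPS to 40", target=40, current=20),
])
print(round(okr.score(), 2))  # 0.75
```

The point of the exercise: because each key result is a number, progress toward an otherwise qualitative objective becomes something a team can inspect and discuss at every review.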

It's the simplicity, and therefore the accessibility, of OKRs that makes them interesting right away. But it's the results they bring about that keep them in place. In my experience, when I run across a top technology team in the industry, I can tell soon after we meet whether they are using OKRs: the team is directed and focused, and each member clearly knows what they are about. This relentless focus on bringing about a desired result in the operations of a company has led OKRs to spread well beyond their tech roots and into the broader functioning of organizations today.

I've mentioned OKRs in the context of operations several times now. Although the Chief Information Officer (CIO) has often been the entry point for OKRs in a typical organization in years past, it is increasingly the Chief Operating Officer (COO), the role most responsible for the day-to-day operations of an organization, that is bringing them to bear operationally. Intriguingly, when I come across a COO using OKRs, they are often grappling with major changes that they have struggled to activate progressively in the day-to-day functioning of the organization.

Because OKRs are tied so intimately to the results an organization seeks -- including those of its teams and individual contributors -- the COO can use the forward-looking view that defines the approach to drive large-scale changes, needed enterprise-wide shifts, and even overarching digital and business transformation.

The secret, of course, is being able to adopt and wield OKRs effectively across the organization. Given the growing interest COOs have in tapping into the results OKRs have demonstrated in the field, I've recently published a new research report that explores how COOs can get the most effective results from the approach. The key is balancing the maturity of OKRs with their scale, which can be greatly assisted by an enabling platform shaped for the purpose (see figure above).

This new research is one of the very first that explores how to activate and succeed with OKRs in operations. Highlights include:

  • Developing a rollout and operations plan for OKRs
  • The key review dimensions for OKRs
  • How to use OKRs as a top-level operating model
  • Using automation to assist the OKR process
  • Building an effective operational capability around OKRs

If you're seeking the how, what, and whys of COOs and OKRs, please read my new report — OKRs: COOs Can Drive Sustainable Change with This Breakthrough Approach — which provides a thorough examination of OKR methodology through the lens of the COO role. You can download a complimentary copy of the report for a limited time, courtesy of GTMHub.

Objectives and Key Results (OKRs) for the Chief Operating Officer

For another exploration of OKRs and the COO in more detail please join an all-star cast on the topic of OKRs for The Horizontal Thinkers Roundtable: Chief Operating Officer Edition. It's a valuable chance to participate in OKR-centric conversations with other COOs. Please register for the event to get the latest perspectives.

The Horizontal Thinkers Roundtable: Chief Operating Officer Edition

My Current Future of Work Research and Analysis

Building a new, better, and more collaborative future of work post-pandemic | Citrix

The Crisis-Accelerated Digital Revolution of Work

Reimagining the Post-2020 Employee Experience

It's Time to Think About the Post-2020 Employee Experience

How Work Will Evolve in a Digital Post-Pandemic Society

Revisiting How to Cultivate Connected Organizations in an Age of Coronavirus

Working in a coronavirus world: Strategies and tools for staying productive | ZDNet

A Checklist for a Modern Core Digital Workplace and/or Intranet

Creating the Modern Digital Workplace and Employee Experience

The Challenging State of Employee Experience and Digital Workplace Today


Planful Gets Predictive, Heating Up Augmented Planning Era

Planful Predict portfolio starts with signal detection and forecasts aimed at improving financial and operational planning, but there’s more to come.

Financial and operational planning is all about preparing for future outcomes. The better you can see into the future, the better prepared you will be to proactively respond to whatever comes your way.

Accurate foresight is the promise of Predict: Signals, a new product released June 9 by Planful, the cloud-based financial planning, analysis and consolidation vendor. Predict: Signals has been in private preview with ten Planful customers over the last six months, and June 9 marked general availability to all customers. The promise of this optional new feature is to augment human capabilities by using machine learning (ML) to:

  • Surface anomalies, including those that are the root causes of variances from plans
  • Identify notable patterns, particularly those that point to risks
  • Augment human planning and decision-making efforts with ML-supported analysis of forecasts.

Prediction has been squarely in the domain of data scientists for decades, but in recent years we’ve seen automation and augmentation features designed to democratize these capabilities. AutoML features, for example, are making predictive techniques accessible to data warehouse professionals, while augmented analytics features are doing the same for business intelligence users.

Augmented predictive features are a much more recent phenomenon within the planning space, and they promise to improve the efficiency and effectiveness of financial planning and analysis (FP&A) professionals. Predict: Signals, for example, trains predictive models using a customer's historical data. When trained on at least 36 months of data, Planful says Predict: Signals can apply forward-looking analyses to financial projections and deliver insights with 95%-plus confidence levels.
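Planful hasn't published its algorithm, but the general idea behind validating a forecast against trained history can be sketched. The toy example below is my own assumption-laden illustration (synthetic data, a plain linear trend, and invented low/medium/high thresholds): fit 36 months of history, derive a rough 95% band from residual spread, and flag a human-submitted forecast by how far it falls outside that band.

```python
import numpy as np

# Sketch of the general idea behind ML-validated forecasts (NOT
# Planful's actual algorithm): fit a linear trend to 36 months of
# history, derive a ~95% band from residual spread, and classify a
# human-submitted forecast by how far it falls outside the band.

rng = np.random.default_rng(0)
months = np.arange(36)
history = 100 + 2.5 * months + rng.normal(0, 4, size=36)  # synthetic revenue

slope, intercept = np.polyfit(months, history, 1)
residual_sd = np.std(history - (slope * months + intercept))

def risk_of(month: int, forecast_value: float) -> str:
    """Classify a forecast as low/medium/high risk vs the trend band.

    Thresholds (2 and 3 residual standard deviations) are illustrative.
    """
    expected = slope * month + intercept
    z = abs(forecast_value - expected) / residual_sd
    if z <= 2.0:       # inside the ~95% band
        return "low"
    elif z <= 3.0:
        return "medium"
    return "high"

print(risk_of(36, slope * 36 + intercept))                       # low
print(risk_of(36, slope * 36 + intercept + 20 * residual_sd))    # high
```

A production system would use far richer models, seasonality handling, and the abnormal-period normalization discussed below, but the low/medium/high risk flags it surfaces serve the same purpose as this sketch's classification.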

Given the pandemic experience, I naturally wondered what happens when part of that history includes an abnormal year. Planful says the feature's built-in algorithm can identify stretches of data, like those seen during last year's business swings, that have the potential to skew the model, and can normalize that data.

Predict: Signals does its training behind the scenes, without any need for data science expertise on the part of Planful customers. Pricing of this add-on feature is based on the volume of data used for training (measured in gigabytes), and there are multiple subscription and custom pricing options. Once the model is trained, it will validate any forward-looking forecast, checking for abnormalities. Confident forecasts are a great starting point for more realistic, on-target what-if scenario planning, and there's no limit to the number of scenarios you can analyze.

Predict: Signals highlights forecast values that are at low-, medium- or high-risk of not being realized, supporting variance analysis, replanning and proactive action.

As actual performance data rolls in, Predict: Signals supports variance analysis, spotting risks and the underlying causes of exceptions. It also delivers new sets of predicted values, including upper, lower and median values – good goal posts for base-case, best-case and worst-case planning. The feature won’t make any decisions for the planner – it’s meant to augment and not replace the human -- but it does save them time by surfacing the real problems and risks they should address and the positive surprises they should try to maximize.

Planful has more capabilities on the Predict roadmap, so Planful customers can expect a continuing rollout of new augmented capabilities over the next few years. We've seen a similar pattern of machine-assisted features gaining traction in the BI and analytics space in recent years, and it has helped those platforms reach a wider community of users. I'm eager to see whether computer augmentation will help accelerate the move of planning into sales, human resources and other operational areas.
