Results

Modern Digital Leadership: Exploring the 2019 Business Transformation 150


It is more evident in 2018 than ever before that a new breed of executive leader has emerged among the digital ranks of the world's leading organizations. To better understand and learn from this emerging type of contemporary leadership, we've spent the last few years analyzing the mindset of this new type of leader, which we've dubbed the New C-Suite.

From this we can see several fundamental forces at work among this group. They are on the front lines of seismic shifts in overall stakeholder expectations (faster, better, cheaper, more connected, and personalized everything). They have a more intense focus on the human side of technology and high expectations for the art of the possible, with collaborative, human-centered concepts like design thinking and DevOps high on their list of new approaches. Perhaps most importantly, they tend to think big and are actively cultivating a fresh set of enterprise priorities that put a premium on fundamental and effective digital improvement at scale.

Collectively, these overarching influences are having a profound effect on the personal and professional development of the top thinkers and doers in the digital space today.

However, we find that showing what these leaders look like is more effective than hypothesizing about a theoretical new generation of digital leader, even as the latest Harvard research shows that "challenge-led" leaders, which most of these executives are, will inevitably be more effective in times of great change. As a result, Ray Wang and I are extremely pleased to introduce what we believe are the exemplars of the current digital revolution taking place globally in business, industry, government, and society today.

Exploring the Business Transformation 150 for 2019

Early last month we inaugurated the 2019 Business Transformation 150 (BT150), each member of which stands out in some way as this new type of digital leader.

This year's BT150, like last year's list, come from a wide range of experiences, backgrounds, accomplishments, and skill sets. A few vignettes of this year's inductees will serve to show the variety, the big-idea thinking, and the effective, real-world business and digital transformation experience that each brings to the table.

For example, they could be like 2019 BT150 inductee Sven Gerjets, Chief Technology Officer (CTO) of Mattel, who is strategically melding the traditional world of play with today's fast-emerging digital environments to develop and realize a highly accessible new vision for connected toys that more fully engages young minds. Mattel is doing this by proactively investing in and using innovative new digital platforms such as Tynker while redesigning its product experience in a far more immersive and digital fashion.

Or they could be like Dr. Karen Croxson, Deputy Chief Economist of the UK’s Financial Conduct Authority, who has a real passion for using big data and advanced statistics to systematically promote competition, innovation and ethical behavior by businesses in order to enhance the effectiveness and safety of the UK financial system. She's doing this by employing the very latest artificial intelligence and machine learning-based techniques.

Other BT150 inductees are grappling head-on with massive restructuring of both their businesses and their industries at the same time, such as Dow Chemical's CIO Melanie Kalmar, who heads up a diverse team of executives, line-of-business presidents, and functional vice presidents, collectively called North Star, charged with setting Dow's new firm-wide digital strategy by systematically "harnessing the power of our long history of data collection to drive growth and new business opportunities." Another example is Mike Macrie, the CIO of Land O'Lakes, who has been promoting a more innovation-focused form of IT and, in particular, espousing how to better and more mindfully measure IT outcomes to ensure digital transformation is actually happening broadly across the organization.

These four stories are just a small sampling of what today's digital leaders are faced with, and of how they are both becoming and fostering digital change agents everywhere to transform their organizations. Most importantly, they are building the kind of future for their colleagues, partners, customers, and the world that they would like to see.

Here is the full list of Business Transformation 150 inductees for 2019:

Nearly a third of last year's inductees had in-person representation at Constellation Connected Enterprise (CCE) to accept their induction on stage into the BT150, and we expect many of the BT150 for 2019 to attend in person as well. The goal of building the BT150 list each year is to a) help improve industry storytelling in an important sector that badly needs it, b) foster innovation and cross-pollination of significant new ideas among leaders, c) identify new and more effective digital leadership techniques, d) better foster and encourage positive digital change and transformation, and e) recognize major accomplishments and leadership in the world of the digital enterprise.

Ray and I hope that you welcome these leaders to the BT150, follow them on social media (you can find their accounts on the detail pages in the links above for their names), and engage them in industry conversation and storytelling. Hope to see you in Half Moon Bay in October!


Google Next 2018: A Deeper Dive on AI and Machine Learning Advances


Google Cloud announcements bring deep learning and big data analytics beyond data scientists, but enterprises will want more.

If last week’s Google Next 2018 event is any indication, Google Cloud is growing quickly. Registrations for the July 23-26 event topped 25,000, and actual attendance easily doubled the 10,000 at Google Next 2017. That’s good, but if this public cloud is going to catch up with also-fast-growing rivals Amazon Web Services (AWS) and Microsoft Azure, Google is going to have to play to its strengths.

From my perspective, Google’s biggest appeals to big businesses are its deep learning (DL), machine learning (ML) and data platform capabilities (though I’m biased and my Constellation colleagues who follow G Suite and the rest of Google Cloud Platform (GCP) cloud infrastructure might see it otherwise). Among the many announcements at Google Next 18, the biggest steps forward – and the ones I see as most likely to accelerate growth – were those aimed at expanding the use of Google’s DL, ML and data platform capabilities. Here’s a closer look.

Cloud AutoML Democratizes Data Science

If I had to cite the single biggest announcement of Google Next 18, I’d say it was the beta release of Cloud AutoML, which promises to bring custom DL model building capabilities to organizations even if they don’t have data scientists on staff. It’s a self-service, democratized option that builds on the Google Cloud ML Engine, the data-scientist-oriented offering that became generally available in March 2017.

To review, Cloud ML Engine is a managed machine learning service that lets you train, deploy and export custom models based on Google's open-sourced TensorFlow ML framework or Keras (an open neural net framework written in Python that can run on top of TensorFlow). Cloud ML Engine features automatic hyperparameter tuning and tools for job management and graphics processing unit (GPU)-based training and prediction. Models are also portable, so you can build and train on GCP but then export the models and run them on premises.

Of note to Cloud ML fans, the company announced at Google Next that the engine has added support for training and prediction using scikit-learn (for Python-based machine learning) and XGBoost (for gradient boosting in C++, Java, Python or R).

You need to know what you're doing to use the Cloud ML Engine, so to make things easier for non-data-science experts, Google introduced a series of machine learning services based on pre-built models. Developers can simply invoke application programming interfaces (APIs) to tap into services for Natural Language text analysis, Speech-to-Text and Text-to-Speech conversion, and Vision image detection.
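As a sketch of how lightweight these pre-built services are, here is roughly what the request body for the Natural Language sentiment method looks like when assembled in Python (endpoint shape per Google's v1 REST reference; the sample text and variable names are mine):

```python
import json

# Hypothetical request body for Google's Natural Language sentiment method:
# POST https://language.googleapis.com/v1/documents:analyzeSentiment
# The document content below is made up for illustration.
body = {
    "document": {
        "type": "PLAIN_TEXT",
        "content": "The new release is fast and the support team was helpful.",
    },
    "encodingType": "UTF8",
}

# Serialize to JSON, as it would be sent over HTTP.
payload = json.dumps(body, indent=2)
print(payload)
```

No model building, training or hosting is involved on the caller's side; the developer just sends text and gets structured analysis back.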

Invoking a service through an API is easy enough, but the downside of these general-purpose, pre-built models is that they are generic. The idea with AutoML is to start with the pre-built models, but then enable non-data-scientists to customize them through an easy graphical user interface (GUI) and their own data. By taking advantage of all the training that went into the pre-built model, AutoML customers save development time, and they also benefit from more accurate, custom models trained on data that's specific to their industry and organization.

Since its initial alpha release in February, Cloud AutoML Vision has been used by selected customers. At Google Next we heard how retailer Urban Outfitters has used AutoML Vision to build a custom model that recognizes attributes unique to its product imagery. The company says the custom model has improved the search experience on its web site, helping customers find what they're after based on visual cues, such as fabric patterns and necklines. These visual cues don't necessarily show up in textual metadata, and they're also not trained into the model behind Google's standard Vision service.

As announced last week, Cloud AutoML is now in beta (so it’s available to all customers) and it has been extended to include AutoML Natural Language and Translation as well as Vision.

MyPOV on Cloud AutoML. This is a great step forward for Google, and it will clearly appeal to any company interested in tapping into the power of deep learning without hiring a data scientist. It's readily apparent to anybody who has compared Google Assistant to the likes of Amazon Alexa, Apple Siri and Microsoft Cortana that Google's voice and language capabilities are the best available. Cloud AutoML makes state-of-the-art DL accessible to a broad audience, and I think it will appeal to mainstream developers and data scientists alike.

I also appreciate that Google has broadened the appeal of the Cloud ML Engine by adding support for scikit-learn and XGBoost. Not all modeling challenges fit TensorFlow, and these open source options expand the possibilities both on GCP and for exporting and deploying models on premises.

BigQuery ML Democratizes Machine Learning

The second big Google Next announcement in the theme of democratization was BigQuery ML, a beta release designed to support machine learning through simple, broadly understandable SQL statements. As the name suggests, this new ML capability has been added to BigQuery, Google's highly popular data warehousing service. It's popular largely because it's serverless, meaning it elastically scales up to petabytes and back down on demand, without requiring database administration. It also supports SQL:2011 standard expressions, query federation, high availability, streaming analytics, encryption and other good stuff, but ease of use is BigQuery's calling card -- and a differentiator versus more administratively challenging rivals AWS Redshift and Azure SQL Data Warehouse.

BigQuery ML extends SQL functionality to support machine learning through simple CREATE MODEL and ML.PREDICT SQL statements. At launch, BigQuery ML supports linear regression and binary logistic regression, but Google plans to add many more algorithms and supporting expressions. BigQuery ML applies what's known as an in-database technique; the alternative is the conventional approach of exporting data to a separate machine learning and analytics environment, which is obviously more cumbersome, time consuming and expensive.
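To make that concrete, a hypothetical churn model in BigQuery ML might look like this (dataset, table and column names are invented for illustration; statement shapes per the beta documentation):

```sql
-- Train a binary logistic regression model directly in the warehouse.
CREATE MODEL `mydataset.churn_model`
OPTIONS (model_type = 'logistic_reg') AS
SELECT
  did_churn AS label,   -- binary label column
  tenure_days,
  monthly_spend
FROM `mydataset.customers`;

-- Score new rows with the trained model, again in plain SQL.
SELECT *
FROM ML.PREDICT(MODEL `mydataset.churn_model`,
                (SELECT tenure_days, monthly_spend
                 FROM `mydataset.new_customers`));
```

The point is that an analyst who already knows SQL never leaves BigQuery: no data export, no separate modeling environment.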

MyPOV on BigQuery ML. Techniques for in-database execution of advanced analytics, including machine learning, have been around for nearly a decade, implemented in IBM Db2, Microsoft SQL Server, Oracle Database, and Teradata, among others. All of these databases are now available as cloud services, but where Google's top hyperscale cloud competitors are concerned, AWS Redshift doesn't have anything like BigQuery ML. Microsoft SQL Server supports in-database ML, but this functionality has yet to be extended to its Azure SQL and Azure SQL Data Warehouse cloud service counterparts.

So BigQuery ML is not a huge breakthrough, but Google is ahead of its chief cloud rivals in introducing it. I'm sure Microsoft will now bring support for SQL Server Machine Learning Services to its Azure SQL service ASAP. I also won't be surprised if AWS makes a similar announcement by re:Invent 2018, in November, as in-database techniques are no longer rocket science. Once rivals are in the game, I'm sure we'll see one-upmanship in the depth and breadth of ML capabilities. As I've seen with earlier in-database initiatives, regression and logistic regression are just the start of what companies will want to do with the masses of data in their data warehouses.

Going Vertical With AI Solutions

Google Cloud had a lot to say about its partner ecosystem at Google Next 18, and it even says it now has a commitment to include at least one partner in 100 percent of its new deals. The company also pivoted at Google Next by deemphasizing products and focusing instead on solutions. That’s another sign of maturation to go along with Google’s growth.

On the theme of playing to its strengths, Google made two other important announcements last week: early examples of an expected wave of AI solutions built with partners. The first announcement was Contact Center AI, which is designed to bring Google's virtual agent capabilities -- including speech-to-text, text-to-speech, natural language processing and Dialogflow automated workflow -- into partner call center environments. Contact Center AI is now in alpha release, so customers can sign up, but they are being screened for initial deployments. The list of partners is extensive, including Cisco, Genesys, Mitel, Twilio, Vonage and leading systems integrators.

The second AI solution announcement was a planned set of services with partner Iron Mountain. Set for release in September, these services will make Google’s TensorFlow image and optical character recognition capabilities available to Iron Mountain’s content analytics, archiving and storage customers. The services will help customers know what physical and digital documents they have and, according to Iron Mountain, it will help them create new services based on AI-based understanding of and access to this content.

MyPOV on Google Solutions: Call center and document-oriented services are about as broad as you can get when it comes to solutions. Any company with a sizeable number of customers has a call center and Iron Mountain has literally hundreds of thousands of customers. Google also has a partnership with SAP, which is using TensorFlow for ML/DL solutions of its own. But Google has hardly scratched the surface where solutions are concerned.

When enterprise software companies introduce industry vertical solutions, they’re typically drawing on years of experience in multiple verticals. It’s not unusual to see these companies roll out with half a dozen examples in an initial release, and they’ll have at least a few more on the roadmap. That Google announced just two solutions and had no roadmap for additional releases tells you it’s very early days for this company’s solutions and vertical industry offerings.

My Overall Take on Google Next 18

Studies suggest that we’re moving into a multi-cloud world, and, indeed, I’ve talked to plenty of companies that use more than one public cloud provider. The most common pattern I see is companies building and running applications on AWS. In fewer cases Azure is their primary cloud, but it’s very often their choice for email services and desktop applications via Microsoft Office 365. When Google Cloud is in the mix, nine times out of ten I hear it was chosen for its data platforms and ML/DL capabilities. Of course my sampling is biased precisely because these are my research domains. Nonetheless, this is the key reason why I think it’s so important for Google to play up its data-to-decisions strengths. 

Beyond the AutoML and BigQuery ML announcements, Google offered a number of other AI- and ML-related announcements. Kubeflow, for example, promises to support complete machine learning stacks on Kubernetes. And low-power Edge TPU (Tensor Processing Unit) chips promise to bring Google's DL wizardry to mobile and remote sensors and devices. Thus I'd say Google Cloud did a good job of doubling down on these strengths, but much work remains to be done.

While Google has focused on democratizing data science with AutoML, AWS seems to have more to say about end-to-end model management with SageMaker. Microsoft, meanwhile, is addressing model management as well as data and model lineage and governance with Azure Machine Learning. As the number of models and versions mounts, model management and data lineage and governance become increasingly important. Outside of the basic topic of security, I didn’t hear much about these topics at Google Next. As for AI solutions, the partnerships with Iron Mountain, SAP and contact center vendors are a start, but the company is clearly at the start of the runway where AI-based industry solutions are concerned.

Related Research:
Amazon Web Services Adds Yet More Data and ML Services, But When is Enough Enough?
Microsoft Stresses Choice, From SQL Server 2017 to Azure Machine Learning
Google Cloud Invests In Data Services and ML/AI, Scales Business

 


Event Report: Google Next 18


This week Google held its annual user conference, Google Cloud Next. The event covers the entire range of Google offerings, but my focus was primarily on the G Suite family of collaboration products.

Here are my key takeaways:

 

Video: "Event Report: Google Next 18" (Lepofsky), from Constellation Research on Vimeo.


Some Thoughts on IBM Notes and Domino in 2018


After spending time with executives from both IBM and HCL, as well as having discussions with several IBM Business Partners, here are some of my thoughts on the current state of Notes and Domino, and the opportunity available.


Tableau Takes Next Steps Toward Smart Analytics


Tableau’s Empirical acquisition is its latest move toward machine-augmented analytics. Here’s a look at the company’s ‘smart’ features.

Tableau last month announced the acquisition of Empirical Systems, an artificial intelligence (AI) startup with an automated discovery and analysis engine designed to spot influencers, key drivers and exceptions in data. It was Tableau’s second acquisition over the last year aimed at accelerating so-called “smart” capabilities and part of a larger push that began in 2016.

As I wrote in my January report, “How Machine Learning and Artificial Intelligence will change BI and Analytics,” consumers and businesses alike are increasingly interested in smart capabilities powered by heuristics, machine learning (ML) and natural language processing. In the area of analytics, these smart capabilities promise to take us beyond the limits of self-service.

Despite the embrace and success of self-service over the last decade, it’s increasingly clear that this approach alone is not enough to truly democratize data-driven decision-making. Self-service tools aren’t always intuitive for nontechnical business users. Even more data-savvy users sometimes need help when selecting data, determining how to analyze that information, and deciding how best to visualize and share insights.

To make things easier for novice and experienced users alike, BI and analytics vendors are developing smart capabilities in at least four areas: data prep, data analysis and discovery, NL query, and prediction. In my latest report, “Tableau Advances the Era of Smart Analytics,” I detail the smart capabilities that Tableau has delivered to date, where it needs to fill gaps, and the strength and weaknesses of what it calls its augmented analytics strategy.

Tableau started stepping up its smart capabilities in 2016 with automated clustering and forecasting capabilities. It followed in 2017 with smart table, join and data-source recommendations. This year Tableau also introduced a number of smart features within Tableau Prep, the data-preparation offering it launched in April.

Tableau’s recommended-data-source feature delivers user- and context-specific suggestions. 
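Tableau doesn't expose the internals of its automated clustering feature, but features like this in BI tools are generally built on k-means-style algorithms. A minimal, purely illustrative one-dimensional sketch of that idea (all names and data are mine, not Tableau's implementation):

```python
def kmeans_1d(points, k, iters=20):
    """Toy 1-D k-means (points assumed sorted, k >= 2)."""
    # Naive init: pick k roughly evenly spaced input points as centroids.
    centroids = [points[i * (len(points) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid's cluster.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: (p - centroids[c]) ** 2)
            clusters[nearest].append(p)
        # Update step: move each centroid to the mean of its cluster.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

# Two obvious one-dimensional groups, around 1 and around 10:
data = [0.9, 1.0, 1.1, 9.9, 10.0, 10.1]
centroids = kmeans_1d(data, 2)
print(centroids)
```

The "smart" part of products like Tableau's is not the algorithm itself but automating the choices around it, such as picking k and the input measures for the user.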

One of the gaps in Tableau's smart lineup, at this writing, is natural language query, a feature that lets users ask questions of data in plain English rather than using SQL code. This gap sparked Tableau's 2017 acquisition of ClearGraph, a startup focused on natural language query. It's well known that Tableau is working on bringing natural language query capabilities into its products, but it has yet to announce release dates. I'm not the only analyst predicting that we'll see Tableau's NL query announcement in 2018 -- most likely at the Tableau Conference in October.

This brings us back to the Empirical acquisition. If the turnaround on the ClearGraph acquisition is any guide, I would expect a 2019 announcement of new smart features based on Empirical's assets and expertise. (As with the ClearGraph acquisition, Tableau hired Empirical's leadership and staff as well as acquiring its assets.)

As noted in my report, Tableau is far from alone in delivering smart features, and it has not been the first to deliver all of the smart capabilities it now offers, but the company's pace of investment has accelerated over the past three years. I see Tableau as now having a solid start on delivering expected smart capabilities, and it's adding these features as built-in (no-extra-cost) aspects of its core products. Given that Tableau has more than 74,000 paying customers and hundreds of thousands of users, its efforts will go a long way toward bringing smart capabilities to the market.

Related Resources:
Tableau Advances the Era of Smart Analytics
How Machine Learning and Artificial Intelligence Will Change BI and Analytics
Tableau Conference 2017: What’s New, What’s Coming, What’s Missing


Musings - Enterprise Acceleration - and what every HR Leader should know about it


Enterprises have always faced competition; competition is a key mechanism of success in a free-market economy. But the need to adapt and react to the competition and to create new strategies has never been as high as it is today. Digital transformation has substantially changed the game, forcing enterprises to move faster toward new objectives while having less room for error than ever before. Constellation Research has shown that in digitally transformed industries, the leaders take more than 70% of revenue and over 77% of profit. This means the risk of being left behind is bigger than ever, and worse, once left behind it is almost impossible to catch up. As a consequence, what matters for an enterprise leader is how much their enterprise can accelerate as a unit. Since it is unrealistic for an enterprise to catch every successful trend in its early phase, it is even more important that CxOs look at the speed at which their enterprise can adapt to challenges thrown at it from existing and new markets.

 
 

Financial Indices tell the story

To understand the increasing rate of change in the markets, look at the financial indices across the world: they tell the story of markets moving faster. Enterprises have been falling out of financial indices at a faster rate than ever before, with many indices having seen a complete turnover of members since their start, usually 30+ years ago. The main reason is that many of these enterprises have been acquired; beyond that, others have simply not been able to keep up with the rate of change created by fast-moving markets. In essence, these enterprises were not able to accelerate fast enough to adapt to the rate of change they were operating in.

For example:

- The DAX has seen 100% member turnover in the last 30 years.

- The DOW has seen over 52% of its members disappear since 2003.

- The FTSE has lost 2.6 times as many members as it has listed enterprises (100) in 34 years.

- The S&P 500 has seen the average age of new members joining reach 10 years.

 

Enterprise Acceleration Formula

So how do we define enterprise acceleration? We look at two key criteria: technology, manifested in software, and people. For the sake of this blog post we look at people only. People speed is determined by the talent that people have and their ability to learn new skills, divided by the speed at which they forget skills and the rate at which skills become obsolete.
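That ratio can be made concrete with a toy calculation (the function name, inputs and numbers below are entirely illustrative, not a published Constellation formula):

```python
def people_speed(talent, learning_rate, forgetting_rate, obsolescence_rate):
    """Illustrative 'people speed' score: capability gained over capability lost.

    All parameters are hypothetical, unitless rates chosen for illustration.
    """
    return talent * learning_rate / (forgetting_rate + obsolescence_rate)

# A team that gains skills twice as fast as it loses them scores 2.0.
score = people_speed(talent=1.0, learning_rate=2.0,
                     forgetting_rate=0.5, obsolescence_rate=0.5)
print(score)
```

The practical takeaway is in the denominator: forgetting and obsolescence drag the score down just as surely as hiring and learning push it up.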

For HR leaders, the variables are:
  • Hire extraordinary talent, with applicability to the enterprise's current and future talent needs.
  • Understand that people have to learn continuously and need modern tools to enable this.
  • Make sure skills do not get lost, especially the relevant, just-trained ones.
  • Minimize the percentage of obsolete skills in the enterprise's people base.

This requires a very different approach to people skills and to their augmentation and conservation.
 
 

The three key focus areas for Enterprise Acceleration

Strategy

Enterprises need to shift their strategy from satisfying executive, reporting and compliance needs to a focus on helping the people leader, who has the task of achieving business results for the enterprise. These people leaders manage teams of a handful or two of employees and are chronically undersupported by executives, tools and processes. Most requirements coming down the chain of command from enterprise leadership neglect the situation, needs and challenges of these people leaders.

CxOs need to analyze what people leaders need for their own and their teams' work, and what would make them more successful. This requires a change of focus from administrative and reporting needs to the productivity and efficiency needs of the people leaders, both in the enterprise's current situation and in its targeted strategic setup. This makes business-user focus a key success factor in implementing an enterprise acceleration strategy.


 

SaaS

CxOs need to have conversations with their software providers on five key topics on the enterprise automation side:

1. Talent depth chart. The talent depth chart is crucial for an enterprise to understand where its talent is. As seen in the enterprise acceleration formula, the innate talent of people is one of the key success factors for people speed. This means finding the right internal talent for the right job is key: first to identify it, then to move it to the next position where the enterprise needs people with the right skills.

2. TransBoarding. This is the ability of a company to "transport and onboard" talent faster and better than ever before, meaning that transferees are trained for their next position even before they take it on. It also means a seamless transfer between jobs can happen with significantly lower cost and friction for the employee and the people leader.

3. 21st-century learning. CxOs need to have a conversation about modernizing their learning systems for the needs of the 21st-century workforce. This means that learning content has to be pervasive, can be self-created and self-curated, and can be consumed at the pace of the modern workforce. Too much learning happens at the wrong time in the wrong place, and the speed of forgetting newly learned information and capabilities kills any speed an enterprise was planning to pick up from the training in the first place.

4. Fixed performance management. For an enterprise to know where its good people are, it needs to know who is performing well. Unfortunately, performance management in most enterprises is in a sorry state. Fixing whatever needs to be fixed to make performance management work is crucial for an enterprise that wants and needs to accelerate.

5. Lean recruiting. As people's talent is critical for the enterprise, the ability to recruit fast is crucial. Taking the friction out of the talent acquisition process, by directly enabling hiring managers to see what talent is available in the market and by facilitating the conversation between manager and future candidate, is a key step toward accelerating on the recruiting side.

So, enterprise executives need to ask their software providers how they can help achieve these five capabilities. Are any of them available, are any on the roadmap? Or is this the first time the vendor has heard of them?

 

Technology Platform

These are three topics for a conversation that people leaders need to have with their CIO / CTO colleagues:

1. Machine learning. No technology will change working life more than machine learning. An enterprise that wants to accelerate needs a strategy for how to leverage machine learning for its needs. HR leaders need to understand what their enterprise wants to and can do in the near future, and understand the repercussions and benefits of the technology for the people leaders and their teams. And machine learning needs to run automatically on top of the enterprise's data, which is ideally stored in Big Data clusters.

2. Big Data. All planning starts with data. In a faster-moving economy, enterprises can no longer afford long data collection, cleansing and acquisition projects. All electronic information at the enterprise's disposal needs to be available in Big Data clusters. This is the only way for an enterprise to make decisions directly based on where its data is, and to move and react toward the competition in its respective markets. Establishing enterprise-wide Big Data capabilities is a must-have, so enterprises don't lose time at the starting blocks when enterprise acceleration needs to happen.

 
3. Non-Monolithic. Traditionally, enterprise software provided only one way of doing things. That meant that when the enterprise needed different automation for different divisions, organizations, or departments, customization was needed. Today a more modern approach to enterprise architecture is needed, one that allows each area of the enterprise to run its own set of automation: a non-monolithic approach to enterprise software.
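A minimal sketch of what non-monolithic automation could look like in code (all names here are hypothetical, not any vendor's API): each area of the enterprise registers its own workflow behind a shared contract, with a default for everyone else.

```python
# Hypothetical sketch: a shared contract with per-area automation, instead of
# one hard-coded process for the whole enterprise.

class AutomationRegistry:
    """Maps an enterprise area (division, org, department) to its own workflow."""

    def __init__(self):
        self._handlers = {}

    def register(self, area, handler):
        self._handlers[area] = handler

    def run(self, area, request):
        # Fall back to the default workflow when an area has no custom automation.
        handler = self._handlers.get(area, self._handlers.get("default"))
        if handler is None:
            raise LookupError(f"no automation registered for {area!r}")
        return handler(request)


registry = AutomationRegistry()
registry.register("default", lambda req: f"standard approval for {req}")
# Sales runs a different expense-approval flow than the rest of the enterprise.
registry.register("sales", lambda req: f"fast-track approval for {req}")

print(registry.run("sales", "expense-123"))    # sales-specific automation
print(registry.run("finance", "expense-456"))  # falls back to the default
```

The point of the sketch is that adding a new area-specific workflow is a registration, not a customization project.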

MyPOV

There is only one thing that is certain: business will never again be as slow as it is today. It's vital for enterprises to find ways that help them move faster and to be able to accelerate when it matters. It is also clear that more moments lie ahead where the ability of an enterprise to accelerate is crucial. Reaching the speed necessary for any transformation ahead is something CxOs need to focus on. This blog post looked specifically at the people side of enterprise acceleration, something that is near and dear to HR leaders, but ultimately to all people leaders of an enterprise.

Looking forward to hearing back from you on how the conversations with the people leaders, the CIO / CTO, and the vendors are going.
Future of Work Next-Generation Customer Experience Tech Optimization Innovation & Product-led Growth AI Analytics Automation CX EX Employee Experience HCM Machine Learning ML SaaS PaaS Cloud Digital Transformation Enterprise Software Enterprise IT Leadership HR Chief Customer Officer Chief People Officer

Event Report: IPsoft Digital Workforce Summit 2018


Summary

  • Digital Workforce Summit: On June 7 IPsoft held its annual customer event in NYC, where it showcased the newest features of its digital assistant, Amelia.
  • Amelia is a software platform that uses artificial intelligence to create human-like interactions for helpdesk/support experiences, both with employees and customers. People usually interact with Amelia via a chat user interface, where they enter queries and Amelia responds. What makes Amelia different from just a “Q&A chatbot” is that Amelia can understand the context of the conversation, the sentiment, and even the history of the interactions, making for a much more natural (and useful) conversation.
  • Amelia Marketplace: Provides Amelia training skills "out of the box" in areas like banking, healthcare and insurance.
  • Amelia City:  Opening of IPsoft’s highly interactive customer briefing centre (at their NYC HQ) that showcases Amelia’s features and use-cases.
  • 1DESK: IPsoft's new platform that bridges front-office (UI) experience with back-end (ITSM) services

We’ve all been there. An application stops working, wifi goes down, you forget a password, or you need an answer to a business process question like when to enroll for benefits. Raise your hand if you loved the experience you had looking up the solution or dealing with customer service agents. I’ll wait. No hands up? I didn’t think so. The current breed of support tools is usually frustrating and challenging. Does anyone actually like using those automated phone systems? Push 1, 3, *, 7 if you do or 2, #, 5 if you don’t. How about searching through help documentation or an FAQ for answers? You get my point.

So how do we solve these challenges? How do we get the accuracy and understanding that a human can provide, while still getting the speed, scalability and ease of use of digital solutions? 

On June 7th I travelled to New York to attend IPsoft’s Digital Workforce Summit, where they discussed their vision of how support should work. Notice the event was called “Digital Workforce” not “Digital Workplace” (a very common term these days) because IPsoft develops solutions it refers to as digital or cognitive assistants. These new “digital coworkers” are quickly becoming part of the workforce, as they help augment the capabilities of internal help desks, customer support call centres, and websites.

 

Who is IPsoft?

IPsoft is a privately owned company, headquartered in New York City and founded in 1998. With over 2,500 employees and offices in 13 countries, the company develops solutions for automating and managing IT and business processes.

Below is an image of their product portfolio:

More Than A Chatbot, Meet Amelia

I imagine most of you reading this have at some point interacted with the growing number of “digital assistants” such as Apple Siri, Google Assistant, Microsoft Cortana or Amazon Alexa. These tools can perform a variety of actions from answering questions, to setting alarms, to interacting with appliances and lighting. But have you used such a tool at work? Vendors such as IPsoft and its competitors are hoping that as our trust and comfort with these digital assistants improves, their usage will become as common as using an ATM for doing your banking.

Amelia is more than just a chatbot or a tool that responds to a simple question by searching for the most relevant answer. IPsoft has spent more than 15 years creating a digital model that mimics many of the ways the human brain works. Amelia’s “brain” (a deeper discussion of artificial intelligence is outside the scope of this article) uses several different elements to determine how to respond to people. These include:

  1. Semantic Memory - this is where knowledge is stored
  2. Process Memory - this is where business processes and workflows are stored
  3. Analytical Memory - predictive models for next best actions and constant refinement for accuracy
  4. Episodic Memory - this is where past experiences are stored
  5. Affective Memory - empathy and contextual understanding

MyPOV

Think of the first three as the understanding of your business’s workflows. For example, if an employee asks how to enrol for healthcare benefits, Amelia will process the information she’s been trained on about healthcare providers, deadlines, common questions, etc., and formulate an answer. Where things get really interesting is with the last two facets: experiences and empathy. Rather than just providing a generic answer, Amelia can leverage a personalized history of interactions (ex: Alan, last year you enrolled in plan A) and understand the sentiment of the question (ex: Alan, I see you’ve submitted two claims for chiropractic work, I hope your back is feeling better). By combining these five elements, Amelia becomes more than just a “static chatbot” that gathers input and performs a query into knowledge bases to provide a generic response. This elevates Amelia beyond 1:1 question and answer, and enables employees and customers to have more detailed conversations, ones that can become more granular or even branch in a different direction from the original question.
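As a purely illustrative sketch (not IPsoft’s actual design or API), the way stored knowledge, per-user history, and sentiment could combine into a single reply might look like this:

```python
# Illustrative-only sketch of blending "memory" elements into one reply.
# All function and variable names here are hypothetical.

def answer(question, semantic, episodic, sentiment):
    """Blend stored knowledge with per-user history and detected sentiment."""
    reply = semantic.get(question, "Let me route you to a human agent.")
    # Episodic memory personalizes the generic answer.
    if question in episodic:
        reply += f" Last year you chose {episodic[question]}."
    # Affective memory adjusts tone for a frustrated user.
    if sentiment == "negative":
        reply = "Sorry for the trouble. " + reply
    return reply

semantic = {"enroll benefits": "Enrollment opens Nov 1 on the HR portal."}
episodic = {"enroll benefits": "plan A"}

print(answer("enroll benefits", semantic, episodic, "neutral"))
print(answer("enroll benefits", semantic, episodic, "negative"))
```

Even in this toy form, the same question produces different, more personal responses depending on history and sentiment, which is the qualitative difference over a static Q&A lookup.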

At the Digital Workforce Summit, IPsoft demonstrated two use-cases that impressed me. The first started off as a pretty standard internal helpdesk situation, where an employee had a question about vacation benefits. However, after answering a few questions Amelia then proactively asked the person if they would like to schedule paid time off, much like a real person might do when having this conversation.

The second demonstration was around customer engagement. It highlighted how Amelia can be used to help a customer find products while shopping online. Amelia is able to guide people through the buying process, asking them questions about things like size, style, colour, availability, etc., ideally leading to a more successful “surf to sale” (ok, quote to cash) experience.

Augmented Assistance Requires Industry Expertise 

While there are several platforms that software developers can use to add artificial intelligence or machine learning into their applications/chatbots, for the most part they don’t provide the knowledge that is needed to train them to work in specific industries. Think of it like hiring a new employee who, while skilled in their area, does not actually know anything specific about your company’s products, rules, policies, workflows, etc. You have to invest a lot of time and money training them after they are hired. Digital assistants are similar in that they are only as good as what they have been trained on and what they’ve experienced and learned from. The cost and complexity of training these digital assistants are two of the challenges that early adopters have faced. At the Digital Workforce Summit, IPsoft announced it intends to reduce these issues by providing 16 pre-trained roles for Amelia, with more than 140 skills, via a new industry marketplace.

Here is a sample of the skills that IPsoft will be making available to customers using Amelia:

Banking
  • Credit Card Concierge: payments, lost/stolen cards, travel alerts, reward programs
  • Personal Teller: balances, ordering checks, helping transfers
  • Financial Advisor: investing services
  • Mortgage Agent: approvals, balances, interest calculations
Insurance
  • Automotive Agent: policy quotes, payments, accident claims
  • Property Rental: coverage queries, claim processing
  • Life Insurance: variable vs universal life, coverage and rider queries
Healthcare
  • Insurance Benefits: coverage, plan enrolments
  • Care Concierge: booking appointments, refilling prescriptions


Automating ITSM with IPsoft 1Desk

One of the big announcements at the Digital Workforce summit was the launch of IPsoft 1Desk, which they describe as the “convergence of front and back-office functions into a single autonomic framework."

What this means is that automated support for employees can go beyond “Q&A type chat conversations” with Amelia; conversations can now result in actions being taken to resolve the issues. For example, if an employee asks questions about wi-fi or email issues, once Amelia determines what the problem is, it can attempt to take corrective actions automatically, or at least step people through the actions they need to take, including showing screenshots or visual guides.
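The “diagnose, then act” pattern described here can be sketched as follows. This is a toy illustration with hypothetical names, not the actual 1Desk API: once an issue is classified, the system either runs an automated fix, returns guided steps, or escalates to a human.

```python
# Hypothetical sketch of diagnose-then-act support automation.
# Issue names, fixes, and steps are invented for illustration.

AUTOMATED_FIXES = {
    "wifi": lambda: "restarted wireless adapter",
    "password": lambda: "sent self-service reset link",
}

GUIDED_STEPS = {
    "email": ["Open account settings", "Re-enter mail server", "Restart client"],
}

def resolve(issue):
    # Prefer a fully automated fix, fall back to guided steps,
    # and escalate to a human when neither applies.
    if issue in AUTOMATED_FIXES:
        return {"mode": "automatic", "result": AUTOMATED_FIXES[issue]()}
    if issue in GUIDED_STEPS:
        return {"mode": "guided", "steps": GUIDED_STEPS[issue]}
    return {"mode": "escalate", "result": "ticket created for human agent"}

print(resolve("wifi"))
print(resolve("email"))
print(resolve("printer"))
```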

Building Trust, Understanding and Administration 

While speaking with customers at the summit, it became clear that while they are very interested in the features Amelia provides, one of the hurdles is understanding the administration of “digital agents.” As with any nascent technology, companies need to know what new skills they are going to require from their IT staff, their support staff, their developers and their employees (“users”). For example, several customers told me they want detailed analytics around what’s happening (what types of questions are being asked), what’s working well (which questions have high resolution rates or fast resolution times) and which processes need to be improved (where are automated responses failing, what’s frustrating users, etc.).
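The analytics these customers are asking for largely boil down to simple aggregations over interaction logs. A minimal sketch, assuming a hypothetical log schema (one record per conversation, with an intent label and a resolved flag):

```python
from collections import Counter

# Hypothetical log schema: one record per digital-agent conversation.
log = [
    {"intent": "password-reset", "resolved": True},
    {"intent": "password-reset", "resolved": True},
    {"intent": "vpn-setup", "resolved": False},
    {"intent": "vpn-setup", "resolved": False},
    {"intent": "benefits", "resolved": True},
]

def resolution_rate(records):
    """Share of conversations the agent resolved without a human."""
    return sum(r["resolved"] for r in records) / len(records)

def failing_intents(records, top=3):
    """Intents with the most unresolved conversations: candidates for retraining."""
    fails = Counter(r["intent"] for r in records if not r["resolved"])
    return fails.most_common(top)

print(f"resolution rate: {resolution_rate(log):.0%}")  # 60%
print("needs improvement:", failing_intents(log))      # vpn-setup tops the list
```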

Augment, not Annihilate. Enhance, not Eliminate

Any time you speak to someone about AI or automation these days, the same question always comes up… “What impact will this have on my headcount?" While it is impossible for me to give a specific answer without learning the specifics of each company, I still find it important to emphasize that solutions like Amelia are not intended to replace humans, but rather to augment their ability to get their jobs done "better". Better could mean faster, more accurate, higher volume, more satisfying, more creative, or a host of other improvements.

Conclusion: Business not Buzzwords

One of the challenges for IPsoft and its competitors is establishing what market “digital assistants” are part of. Is it RPA? Is it IT Service/Support? Is it Customer Experience? Accurately and simply explaining to customers what problems IPsoft can solve is critical. IPsoft is doing a very good job at focusing on the “business” of automation, rather than the technology. Chief Marketing Officer Anurag Harsh gave an excellent opening keynote, where he focused mainly on “what Amelia can do” and not “how it can do it.” I find it too common these days for tech companies to get too detailed about artificial intelligence and machine learning, neural networks and training data sets, confusing customers and raising too many questions. It would be like a car salesperson telling you how the airbags work instead of just telling you that they keep you safe. Don’t get me wrong, there is a need for technical details to help customers understand scale, security and other differentiators, but I think focusing on business use-cases is always the right way to start the conversation. IPsoft is doing well with their use-case focused messaging and the amazing new Amelia City lab/briefing centre.

For further information about IPsoft’s Digital Workforce Summit, here is a short video I recorded while at the event:

Also, here is a customer case study my colleague Cindy Zhou and I wrote:

How SEB Bank Uses IPsoft to Increase Customer and Employee Engagement
This case study examines SEB Bank's experience implementing IPsoft's Amelia artificial intelligence (AI) platform for use by both internal employees and external customers. Since the Swedish bank deployed Amelia, which it renamed Aida, internal employees have dramatically reduced the resolution times for common IT support issues. Externally, 91 percent of the customers that used Aida rated the solution as "very good" or "good." Aida also helps customers book branch sales appointments for more-personalized service and reduced wait times.

 

Future of Work

Alteryx Offers Gift of Time for Analytical Innovation


Alteryx adds repeatability, automation features so analysts can streamline data prep, step up predictive analysis.

Alteryx eliminates tedious and time-consuming data-prep work so users can spend more time on innovative, advanced analytical work.

This was the core appeal articulated at Alteryx Inspire, the company’s annual customer conference held June 4-7 in Anaheim, CA. Riffing on the event theme “Alter Everything,” CEO Dean Stoecker encouraged the more than 3,000 Inspire attendees to use Alteryx to eliminate error-prone, repetitive spreadsheet drudgery and drive transformations and new business opportunities through data science.

Alteryx CEO Dean Stoecker shares examples of customers that gained time through repeatable data prep and transformation that they could redirect toward analytical innovation.

Alteryx laid out more than a dozen significant product enhancements at Inspire. It also set forth a roadmap toward machine-learning-based recommendation features and cloud-based deployment options coming to future releases. Here’s a closer look.

Alteryx Upgrades for 2018

Alteryx offers four products that it’s moving to synthesize into a tightly integrated platform. The core product that all customers use is Alteryx Designer, a desktop data-prep, blending and analytical tool that was introduced in 2006. Alteryx Server, added in 2010, provides a platform for connecting to data sources, collaborating around and automating Alteryx data workflows, and executing at big data scale. Today the vast majority of Alteryx revenue comes from an even split of Designer and Server licenses, but the company introduced two new products in 2017.

Using proceeds of its March 2017 IPO, Alteryx made two acquisitions in June 2017. Building on the acquisition of Semanta, a data cataloging and governance vendor, the company introduced Alteryx Connect, which offers data-discovery and collaboration capabilities. The assets of Brooklyn, NY-based startup Yhat were developed into Alteryx Promote, which supports analytical model deployment, monitoring and ongoing optimization.

The key announcements at Inspire were around more than a dozen upgrades across all four products. Some of these upgrades are already available, some are set for later this year and a few won't be available until next year. Demos of several of these upgrades drew raves from attendees, while others seemed overdue. Highlights included:

A social data catalog and asset recommendations. Search enhancements now generally available make it easier for users to find the data they’re after. In Q3 Alteryx will beta release an upgrade whereby users will get recommendations on the most popular data sources blended with a particular data source when they drag and drop it into a workflow. 

Workflow caching. This upgrade, set for beta release in Q3, makes it easier to develop reusable data-prep and data-processing workflows by saving interim steps during an iterative design session.

Data profiling. Now generally available, this feature provides statistical details on the number of records, range of values, average value and other details about a data set. Quality stats, such as the percentage and number of missing or exception values, will be useful in driving cleanup recommendations.
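The kind of profile such a feature produces can be approximated in a few lines for a single column (an illustrative sketch, not Alteryx's actual output format):

```python
# Back-of-envelope column profile: record count, value range, average,
# and missing-value percentage, as described above.

def profile(values):
    present = [v for v in values if v is not None]
    return {
        "records": len(values),
        "min": min(present),
        "max": max(present),
        "mean": sum(present) / len(present),
        "missing_pct": 100 * (len(values) - len(present)) / len(values),
    }

col = [12, None, 7, 9, None, 14]
print(profile(col))
```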

Python SDK. Alteryx already offered extensive support for use of R-based models, but with so much analytical development moving to Python, Alteryx has added an SDK supporting analytic development in this ascendant language.
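To illustrate the plugin pattern such an SDK typically exposes, an init / per-record / close lifecycle, here is a plain-Python sketch. The class and method names are hypothetical and are not the actual Alteryx SDK API:

```python
# Hypothetical sketch of a custom tool with an init / per-record / close
# lifecycle, the general shape engine-level SDKs expose. Not the Alteryx API.

class UppercaseTool:
    def init(self, config):
        # Configuration tells the tool which field to transform.
        self.field = config["field"]
        self.out = []

    def process_record(self, record):
        # Engine pushes records one at a time; transform and buffer them.
        record = dict(record)  # don't mutate the caller's record
        record[self.field] = record[self.field].upper()
        self.out.append(record)

    def close(self):
        # Engine signals end of stream; emit the transformed records.
        return self.out

tool = UppercaseTool()
tool.init({"field": "city"})
for rec in [{"city": "anaheim"}, {"city": "nyc"}]:
    tool.process_record(rec)
print(tool.close())  # [{'city': 'ANAHEIM'}, {'city': 'NYC'}]
```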

Interactive Data Grid. These enhancements to Alteryx Designer’s core interface, planned for Q3 and beyond, drew hearty applause from customers, as they will make it easier for users to search, sort and filter their data without leaving Alteryx.

Alteryx VP of Product Management Ashley Kramer introduces the long list of upgrades the company has planned for 2018 and beyond. 

MyPOV on Alteryx Upgrades: As noted, some of these upgrades, like key data-source connectors for Alteryx Connect, SAML support in Alteryx Connect and Alteryx Server, and the addition of data profiling and the Python SDK, seemed overdue. The workflow caching, Interactive Data Grid and a new Insight Tool for dashboards (which are all future enhancements) were real crowd pleasers. They look like they’ll deliver more problem-solving shortcuts around what would otherwise require tedious and repetitive manual work. Overall Alteryx seems to have stepped up its pace of development, so I’d expect less catch-up development and more innovative new feature development in future releases.

On the Horizon

Alteryx has plenty of competition both from dedicated data-prep vendors, such as Trifacta and Paxata, on the one hand, and advanced analytics vendors, such as RapidMiner, Knime and Domino Data, on the other (DataRobot and H2O are partners). Adding to the competitive pressure on Alteryx have been efforts by the likes of Tableau, Microsoft Power BI, Qlik and other mainstream analytics vendors to add basic data-prep capabilities.

Roadmap themes going forward for Alteryx include:

  • Knit together Connect, Designer, Server and Promote more closely to support seamless, end-to-end analytic workflows.
  • Add machine-learning based capabilities across the platform to augment the skill of humans and support automation.
  • Build out a container-based cloud deployment approach to support consistent, reusable use of the platform in hybrid and multi-cloud deployment scenarios.

Specific developments on the horizon include a Solution-Based Modeling Utility that will walk business analysts through the predictive modeling steps of data prep, analysis, model building and deployment through a wizard-driven interface. Also coming, in the ML vein, are data formatting, cleansing and join recommendations and natural language query capabilities.

MyPOV on Alteryx. I know from prior research (like this case study on Ford) that Alteryx is a great, time-saving and democratizing option for companies looking to make data-driven decision-making more of a self-service proposition. As was abundantly evident at Inspire, the company is growing quickly and has plenty of enthusiastic users of Alteryx Designer and Alteryx Server.

What remains to be seen is how successfully the company’s new products, Connect and Promote, can be knitted tightly into an inseparable platform. Plenty of customers at Inspire told me they are considering adding these components, but few have done so as yet. A notable exception was Shell Oil Company, which offered a presentation on how it’s using Alteryx Promote as part of an effort to spread ML-based predictive analytical capabilities across the company.

What also remains to be seen is the extent to which native data-prep capabilities coming from the likes of Tableau will sap demand for more Alteryx licenses. Alteryx execs contended that these moves will only seed more demand for its platform. Whether entry-level prep happens in Alteryx or not, I see the company’s expanding analytical and model-deployment capabilities as the company’s bigger opportunity. Prep is something to get out of the way and make repeatable. Analytics is where companies are going to get more bang and business impact for the buck.

Related Reading:
Qlik Hits Reset Button, Rolls Out New Cloud, AI & Developer Capabilities
MicroStrategy Makes Case for Agile Analytics on its Enterprise Platform
Ford Analytics Team Democratizes Data-Driven Analysis

 

Data to Decisions Tech Optimization tableau Chief Information Officer Chief Digital Officer

News Analysis - Unit4 releases People Platform Extension Kit - A new way to fit ERP to business needs


 

It's not often that ERP vendors change the game on customization and extension. This press release from Unit4 deserves attention because it does exactly that.

 

Let's dissect the press release in our customary style – it can be found here:


Unit4, a world leader in enterprise software for service organizations, announces today the release of new cloud services designed for customers and partners to easily extend its enterprise solutions with custom industry specific apps.

MyPOV – Ok – good summary of the press release.

The People Platform Extension Kit is designed to meet the specific challenges of service-based business models. As organizations modernize their business models to engage people in new ways, Unit4 is providing customers and partners with the freedom to develop differentiating front-end applications, that benefit from the critical data held in their back-office systems.

MyPOV – The innovation here is not the extension kit itself – ERP vendors have offered these for decades in various forms – but the focus / choice of capabilities for service-heavy industries.

The People Platform is the foundation for creating intelligent enterprise applications, providing services enabling Unit4 applications to become self-driving by offering access to machine learning capabilities, based on data collection and mining. The Extension Kit gives partners and customers access to the full breadth of Unit4 technology. They can construct custom tailored extensions or complete applications, benefiting from the powerful capabilities of the People Platform and intelligence in Unit4 Business World, becoming first-class citizens of the Unit4 application eco-system. Due to the loosely coupled, micro-service based architecture, partners and customers can develop using their preferred tooling and offer their solutions through any industry-standard marketplace.

MyPOV – Investment in platforms always pays off once they work and SaaS offerings run on them – this is another proof point, with the extension kit being based on and running on the Unit4 People Platform. Making extensions seamless to users is a key feature, as users should not have to be aware of what is vendor capability versus extension capability. For the business user, the result of an extension has to be seamless UX integration.


"We're in an age of business process uncertainty where for the first time technology can do more than what traditional business best practice demands," said Holger Mueller, VP and Principal Analyst at Constellation Research. "We're seeing enterprise acceleration at unprecedented rates with organizations moving faster than ever before. They can be a disruptor or be disrupted and it's their people that can make the difference. By empowering them to purpose build services and small apps in the areas that matter, connected to their enterprise applications, they can break away from the monolithic nature of ERP. Through low-code technology like this, people in business become smarter and empowered to work more effectively, producing better value in their work."

MyPOV – Solid quote... ok. Might be a little biased here. The key aspect is the low-code and small-apps ability. It expands the number of people available to build extensions – a key aspect, as often SaaS / ERP software does not work well for areas of the business that will never receive its attention or justify the cost of a consultant or even a developer to make things run smoothly. The other key aspect and innovation is that this makes ERP less monolithic (monolithic meaning there is only one way to do things). Being able to create lots of small applications with the extension kit – which may have similar / duplicate capabilities but work for different users – is a major breakthrough in ERP architecture.


"The everything as a service economy is driving business model changes around the world," said Stephan Sieber, CEO of Unit4. "As customer demand for simple online subscription services and rapid value grows, organizations are modernizing business models to create greater efficiencies and to engage customers, employees, and business partners in new ways. Core enterprise systems are vital, but do not deliver competitive differentiation on their own. Our customers have unique strategic processes, and by opening our solution platform for simple application development irrespective of programming language or industry marketplace, they can build very specific apps that deliver rapid value and seamless user experience. Essentially this is the next generation of customization technology enabling organizations to have exactly what they want and need to be successful."

MyPOV – Good quote from Sieber, focusing on what matters – smoothly running software, now enabled by technically savvy, but non-developer, resources in the enterprise.

Pricing and Availability
The People Platform Extension Kit and pricing details will be available in Fall '18. Customers and partners can sign up to the Early Adopter Program from June.
MyPOV – General availability is a few months out in Fall '18, but good to see that early adopters can already sign up in June.

Overall MyPOV

Enterprise software will never be a 100% fit for enterprise automation needs. The move to the cloud has made the category even more resistant / unable to move to the perfect fit. It's good to see that in the more modern version of SaaS / cloud-based ERP software, vendors are starting to overcome these limitations. That requires building on a modern platform and exposing these services not only to technically savvy developers, but also to reasonably technology-aware business users. The ability for business users to create smaller, department-level or division-wide applications, which can overlap and be redundant, is a key step to move off the monolithic heritage of ERP. Good progress by Unit4.

On the concern side, there is the risk of the situation Disney showed well with Mickey Mouse in The Sorcerer's Apprentice: with too many custom apps being created, performance, reliability, compliance, and other issues can arise from the approach. But it's better to try and learn than not make the move, so it will be interesting to see what Unit4 customers will build and what the overall experience will be.

But for now, congrats to Unit4 on a major milestone of making ERP fit better to the needs of service businesses.

Tech Optimization Data to Decisions Digital Safety, Privacy & Cybersecurity Innovation & Product-led Growth Future of Work New C-Suite PaaS Chief Information Officer

Discussion with Logitech: The Rise of Video Meetings: Smart, Affordable Collaboration


Improvements in both technology and culture are contributing to a rise in the use of video during webconferences. I sat down (over video of course) with Scott Wharton, GM of Logitech's Video Collaboration division to talk about the state of the market.