
Musings - Why splitting Windows is Nadella's first major mistake

On March 29th Microsoft shared that the head of its Windows team, Terry Myerson, was leaving and that, as a consequence, the Windows team would be split into two large development teams under Rajesh Jha and Scott Guthrie (see Nadella's memo here, kudos for transparency).

Here is what we don't know: Did Myerson quit, or was he compelled to leave because progress on Windows did not meet the board's and shareholders' expectations? Or did he leave knowing his team would be split, with no interest in the remaining roles? Those are missing pieces that may surface – or not – and would change the analysis here.

So why could this be a major mistake for Microsoft and its users? Here are my musings:

Windows was finally 'fixed'. You can't blame Microsoft for not investing in Windows. With Windows 10, Microsoft had finally fixed practically all the sins of the past and pulverized the skeletons in the closet from the fast-paced '90s that had left traces in the Windows source code until Windows 10. And Windows 10 has been steadily growing, even though it may have hit a slower pace or a temporary backlash (see ComputerWorld here). Yes – Microsoft was no longer on track to get to 1B Windows 10 devices in 2018, but since when does that faze people… all market adoption projections need to be taken with a grain of salt. And with a Windows 7 end-of-life date only a few years out – and only and always Windows 10 devices on display when walking into any PC store – that gap would have been addressed sooner rather than later.

Major Platform – with no leader? According to Statcounter (see here), Windows is in a neck-and-neck race with Android for overall platform leadership. And that's not a fair competition – different platforms, monetization, sales channels, purchase prices and so on. The real comparable competition is Apple's OS X, and that's hovering well under 10%... so despite all those 'Hello, I'm a Mac' advertisements of years past, Apple's OS X hasn't gained much on Windows. Would Apple split OS X? I don't think so. Would anyone split up responsibility for a platform with well over 1B installs? Everybody wants such a platform, and they look for leaders and teams to get them there… A platform with no leader has typically ceased to be a platform only a few quarters in. Let's watch the next major Microsoft conference, which is Build in May. I expect some chaperoning by Nadella, and then Jha and Guthrie to merge the messages with their existing and new assets. And then watch for the cracks to appear… first small, then bigger, then visible, then obvious…

Platform Morphing beats Platform Abandonment. You don't split a platform, even when it is old. You renovate it (see e.g. IBM with z/OS), you re-platform it (see Microsoft with Windows 10), you innovate (see Apple's OS X) or you morph it to where it needs to be and evolve it from there. Nadella is right that in the very long run the PC is dead. And certainly the cloud and the edge are showing more growth. But the industry has not come up with an alternative to the PC – yet. You can call a Chromebook something other than a PC, but in form factor and connectivity it is practically the same device. This is where Windows may have to morph, maybe to a browser-based OS, and Microsoft very much has the assets (and the ambition) in play with Edge. And a micro Edge browser could very well work on the IoT edge. Wait – we even have a perfect branding headline – Microsoft Edge for the IoT Edge – with all the good Windows DNA, should that IoT edge platform need to get a little beefier. And there is also Windows Server… so morph, position… even "embrace and extend" – remember that? Why not in 2018?

Warning – Major Brand Implosion. I searched the interwebs for a bit on the Windows brand value… with no success. But it must be out there… (please let me know if you find it). What is the #1 brand associated with Microsoft? Windows, then Office. Ask anyone. Why give that up? Yes, it may be old, but so are the affluent, aging populations of the 1st world – and they have known Windows for their whole computing lifetime. It may not be the snazziest brand and may need some maintenance… but in the B+ brand area this is just destruction of brand value… again – you morph a brand, you don't… split it and make it disappear (I know Microsoft will of course argue this, but let's watch what happens to the Windows brand in the next 48 months).
 
What's the platform message? Microsoft tried with the Universal Windows Platform (UWP) – a very attractive value proposition for developers. Yes, the mobile part fell flat, but Microsoft has successfully provided tools to run on iOS and Android, and a great testing capability with Xamarin. Developers still have to build for and on Windows devices... so what is the message to the developer community with the Windows split? At the moment I can't come up with a good answer for that... it will be interesting to see Microsoft address it at Build in May in Seattle.
 
What does it mean for the future of computing? Microsoft has done remarkable footwork with the HoloLens, which runs Windows 10. I called it the first 'headable' PC. Will Windows 10 slim down into a more device-centric OS? What about the synergies of running the same apps in a familiar OS? More questions that don't bode well if we see a fragmentation of Windows going forward.
 

MyPOV

Certainly a bold move by Nadella, probably his boldest. I am sure he has major shareholder (aka Bill Gates, Steve Ballmer) support. Both of them have dedicated decades of their lives to making Windows what it is today. They may know something that we don't, and I am happy to correct this blog… when I turn out to be wrong. I can't imagine Ballmer giving Nadella a hard time for not moving fast enough to split up Windows... but hey, maybe. For now I am pretty comfortable with the POV… what is yours? Please share!

 

[April 10th 2018] Needless to say, Microsoft wants to stress some points here: Windows remains an important part of Microsoft's future, in combination with the Microsoft 365 offerings. Windows also powers the devices on the "intelligent" edge. Microsoft states that customers have been asking to bring Office, Windows and devices closer together for a better experience. Fair enough – make up your mind. The 1st data point will be... Build. Or any major announcements before.


Event Report - IBM Think 2018 - IBM is back...

We had the opportunity to attend IBM's Think conference, held in Las Vegas across the MGM and Mandalay Bay from March 19th till 22nd 2018. Falling in the busiest week of the spring conference season, I could make it for only one day, invited to the IBM partner conference that was happening in parallel.

Prefer to watch – here is my event video … (if the video doesn't show up – check here)

Here is the 1 slide condensation (if the slide doesn't show up, check here):

Want to read on? Here you go:

IBM brings Partner program into the 21st century – I had the opportunity to attend the PartnerWorld program events, and it was good to learn that the partner program is being jolted into the 21st century, with simplifications for partners doing business with IBM, a cut in the number of incentives from triple digits to single digits, and sandboxes for partners to evaluate, create and sell joint offerings. To a certain extent I am surprised this is only happening now - but better now than later or never. Talking to partners, the top concern remains channel conflict with IBM's direct sales force, while the top wish was for IBM to build up dedicated partner pre-sales capacity (in North America). Common concerns, fears and wishes from partners towards their product / platform vendor.

Teltsch in the Partner Keynote


ICP picks up speed – IBM has re-positioned its hybrid cloud offering, formerly Bluemix Local, as IBM Cloud Private (ICP), providing basically the same value proposition as before – a platform for enterprises to build their next-generation applications on. Apart from the development tools, ICP has a twist on operations management and monitoring, an important aspect of a hybrid cloud. Being able to securely scale code and monitor next-generation applications is important for enterprises. And it being 2018 – Kubernetes is a key ingredient, and ICP leverages Kubernetes to achieve workload portability across clouds. The product team gave me a private preview demo of what is coming later in the year, and it looks promising in terms of capability and usability, especially for IBM shops, and potentially beyond.

Teltsch in convo with Wylie

IBM ML comes to Swift - IBM has been working closely with Apple on Swift pretty much since the launch of Apple's newer programming language. The joint solutions are built on the IBM platform, and given the IBM push on cognitive / Watson / ML / AI, it's key that developers building Swift applications in and for the IBM ecosystem can leverage IBM AI / ML services from their iOS applications. Something joint customers expected and IBM has now (finally) delivered. I had some good conversations with early adopters.

MyPOV

After a one-year hiatus with no conferences and events, IBM is back with a user conference. It has consolidated the many separate conferences IBM used to run into one single event, which is of course a major challenge for all involved... but a good change in my view. Customers fully bought into the IBM offerings could not afford to attend 4-5 conferences a year... moreover, customers had to connect the dots between the various offerings, which at times were not synced... as each product team would plan its product and announcement cycles around its individual conference... Think makes a difference here, aligning messaging, and likely over time product release cycles... that makes it easier for customers and prospects to get an overview of the many IBM products, offerings and services at a coordinated point in time. It was good to see the focus of the new partner management regime on making it easier and simpler for partners to do business with (or for?) IBM. Always a good true north for a partner organization.

On the concern side, IBM needs to learn how to put on a mega conference that maximizes value for attendees and return on event dollars. It has massive experience with single-property events in Las Vegas - you name the casino and IBM has been there. Multi-property events in Las Vegas are hard for all vendors who have outgrown a single property, but IBM can do better connecting them. And on the product innovation side, it felt at times that announcements could have been made earlier but were held for Think. Understandable, but a fine line to walk for any vendor. It will be good to see what IBM can create and deliver in the next 12 months, giving a better insight into the innovation power that IBM can harness.

But for now, it is good to see a single event aligning all messaging, product, offering and services cycles. For a first combined event, Think 2018 was a good start. Stay tuned.


Also - check out a Twitter Moment of IBM Think 2018 here


Event Report - Oracle HCM World 2018 Dallas - Steady Progress

We had the opportunity to attend Oracle's HCM World conference, held from March 20th till 22nd in Dallas. Falling in the busiest week of the spring conference circuit, the conference had good analyst and influencer representation and was overall well attended (Oracle claimed 2,200+).

Prefer to watch – here is my event video … (if the video doesn't show up – check here)

Here is the 1 slide condensation (if the slide doesn't show up, check here):

Want to read on? Here you go:

Oracle keeps delivering in HCM – Since its early days the Oracle HCM product has done well, growing at an almost metronomic takt. New capabilities are created in the product, and customer adoption follows in the next 6-12 months. Customers, partners and the overall ecosystem are now on the twice-yearly announcement schedule of HCM World in spring and OpenWorld in fall, and in between, customer adoption manifests itself in go-lives. The result is likely the most comprehensive single-platform HCM suite in the market.

Oracle HCM Cloud Spring 2018 Release Highlights

Good DNA in Spring 2018 Release – Along the same lines, the Spring 2018 release of Oracle HCM Cloud is a rich release that pushes the boundaries. Oracle replaced and strengthened the Onboarding capabilities in the suite. Given the best-practice uncertainty that currently plagues Performance Management, Oracle has (wisely) opted for a suite of performance management tools. People leaders can choose from four different approaches to address performance management in their enterprise. The good news is that they can even choose to use different best practices / flavors of performance management across the enterprise – slice and dice by operating company, division, people type etc. The right approach, and it will be interesting to see adoption. And last but not least, more AI – no conference without AI in 2018… Oracle pointed out that it has always had some form of 'intelligence', but now the offerings are serious, and it's good to see Oracle speak of AI (vs. the a-tad-misaligned 'adaptive intelligence' term of the past).

The new Oracle HCM Cloud UX paradigm

A new UX - Having been a critic of the Oracle HCM UI for a long time, I see the new UX as a welcome first step in the right direction. Not surprisingly, Oracle opts to focus on mobile, the largest user population, and on high-volume transactions. The new UX looks modern and easy to use, and it has some of the key inner workings a UX in 2018 should have – it's responsive, can be used across devices and form factors, and takes an aggressive stance on defaulting and suggesting entries. At the core is a newsfeed paradigm that users are familiar with from consumer websites. The newsfeed manages to collapse menu structures and to surface the relevant information at the right time. That is not easy to get right - think of the challenges Facebook has had with this UX - but from what we saw of the Oracle HCM newsfeed, it's a well-working first implementation in an Oracle enterprise application. All welcome news for busy enterprise users, who have a real job to do and can't afford to be held up too long by administrative systems (like any HCM system). The next step is to check in on customer feedback, rollout plans and the roadmap.

Oracle has invested to get the Newsfeed right

MyPOV

A good HCM World for Oracle – the product keeps progressing, and customers and ecosystem are positive. Customers tap more and more into the suite benefits, adding modules after originally going live on more administrative functions such as ESS / MSS and Payroll. Needless to say, suite-level benefits are tangible and create productivity for users as well as HR departments, all leading to a more positive stance towards the product, in this case Oracle HCM Cloud.

On the concern side, the event seemed smaller than last year's. Maybe the timing and the Dallas location did not help, but in general you expect a growing attendee number. Equally, we saw less partner activity in the ecosystem... it looks like Deloitte (for the North American SIs) and Infosys (for the Indian SIs) have captured the pole position. Next year's HCM World will be a data point on how much Oracle can activate users and prospects to come to an event like this. For the record, happy users do not need to travel to user conferences as much, as they know what's coming and are busy implementing and using the software.

But overall a good event for Oracle customers and prospects. Oracle HCM Cloud is probably the most complete, single platform, single code base HCM Suite out there. Stay tuned.


Nvidia Accelerates Artificial Intelligence, Analytics with an Ecosystem Approach

Nvidia’s GTC 2018 event spotlights a playbook that goes far beyond chips and servers. Get set for the next era of training, inferencing and accelerated analytics.

“We're not a chip company; we're a computing architecture and software company.”

This proclamation, from Nvidia co-founder, president and CEO Jensen Huang at the GPU Technology Conference (GTC), March 26-29 in San Jose, CA, only hints at this company’s growing impact on state-of-the-art computing. Nvidia’s physical products are accelerators (for third-party hardware) and the company’s own GPU-powered workstations and servers. But it’s the company’s GPU-optimized software that’s laying the groundwork for emerging applications such as autonomous vehicles, robotics and AI while redefining the state of the art in high-performance computing, medical imaging, product design, oil and gas exploration, logistics, and security and intelligence applications.

Jensen Huang, co-founder, president and CEO, Nvidia, presents the sweep of the
company's growing AI Platform at GTC 2018 in San Jose, Calif.

On Hardware

On the hardware front, the headlines from GTC built on the foundation of Nvidia’s graphics processing unit advances.

  • The latest upgrade of Nvidia’s Tesla V100 GPU doubles memory to 32 gigabytes, improving its capacity for data-intensive applications such as training of deep-learning models.
  • A new NVSwitch interconnect fabric enables up to 16 Tesla V100 GPUs to share memory and simultaneously communicate at 2.4 terabytes per second -- five times the bandwidth and performance of industry standard PCI switches, according to Huang. Coupled with the new, higher-memory V100 GPUs, the switch greatly scales up computational capacity for deep-learning models.
  • The DGX-2, a new flagship server announced at GTC, combines 16 of the latest V100 GPUs and the new NVSwitch to deliver two petaflops of computational power. Set for release in the third quarter, it’s a single server geared to data science and deep-learning that can replace 15 racks of conventional CPU-based servers at far lower initial cost and operational expense, according to Nvidia.

If the “feeds and speeds” stats mean nothing to you, let’s put them into the context of real workloads. SAP tested the new V100 GPUs with its SAP Leonardo Brand Impact application, which delivers analytics about the presence and exposure time of brand logos within media to help marketers calculate returns on their sponsorship investments. With the doubling of memory to 32 gigabytes per GPU, SAP was able to use higher-definition images and a larger deep-learning model than previously used. The result was higher accuracy, with a 40 percent reduction in the average error rate yet with faster, near-real-time performance.

In another example, based on a FAIRSeq neural machine translation model benchmark test, training that took 15 days on Nvidia’s six-month-old DGX-1 server took less than 1.5 days on the DGX-2. That’s a 10x improvement in performance and productivity that any data scientist can appreciate.

On Software

Nvidia’s software is what’s enabling workloads, particularly deep learning workloads, to migrate from CPUs to GPUs. On this front Nvidia unveiled TensorRT 4, the latest version of its deep-learning inferencing (a.k.a. scoring) software, which optimizes performance and, therefore, reduces the cost of operationalizing deep learning models in applications such as speech recognition, natural language processing, image recognition and recommender systems.

Here’s where the breadth of Nvidia’s impact on the AI ecosystem was apparent. Google, for one, has integrated TensorRT 4 into TensorFlow 1.7 to streamline development and make it easier to run deep-learning inferencing on GPUs. Huang’s keynote included a visual demo showing the dramatic performance difference between TensorFlow-based image recognition peaking at 300 images per second without TensorRT and then boosted to 2,600 images per second with TensorRT integrated with TensorFlow.
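
For developers curious what that integration looks like in practice, here is a minimal sketch assuming TensorFlow 1.7, where the TensorRT integration lives in the tf.contrib.tensorrt module. The frozen-graph filename and output node name are hypothetical placeholders, not anything Nvidia or Google showed at GTC.

```python
# A minimal sketch, assuming TensorFlow 1.7 with tf.contrib.tensorrt.
import tensorflow as tf
from tensorflow.contrib import tensorrt as trt

# Load a frozen TensorFlow graph (weights folded into constants).
with tf.gfile.GFile("frozen_model.pb", "rb") as f:  # hypothetical file
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())

# Rewrite supported subgraphs into TensorRT engine ops; anything
# TensorRT does not support keeps running as regular TensorFlow ops.
trt_graph = trt.create_inference_graph(
    input_graph_def=graph_def,
    outputs=["softmax"],               # hypothetical output node name
    max_batch_size=8,                  # largest batch the engine will see
    max_workspace_size_bytes=1 << 30,  # scratch GPU memory TensorRT may use
    precision_mode="FP16")             # reduced precision boosts throughput

# The optimized graph imports and runs like any other TensorFlow graph.
graph = tf.Graph()
with graph.as_default():
    tf.import_graph_def(trt_graph, name="")
with tf.Session(graph=graph) as sess:
    pass  # sess.run(...) with real inputs would go here
```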

Nvidia also announced that Kaldi, the popular speech recognition framework, has been optimized to run on its GPUs, and the company says it’s working with Amazon, Facebook and Microsoft to ensure that developers using ONNX-compatible frameworks, such as Caffe 2, CNTK, MXNet and PyTorch, can easily deploy on Nvidia deep learning platforms.

In a show of support from the data science world, MathWorks announced TensorRT integration with its popular MATLAB software. This will enable data scientists using MATLAB to automatically generate high-performance inference engines optimized to run on Nvidia GPU platforms.

On Cloud

The cloud is a frequent starting point for GPU experimentation, and it’s an increasingly popular deployment choice for spiky, come-and-go data science workloads. With this in mind, Nvidia announced support for Kubernetes to facilitate GPU-based inferencing in the cloud for hybrid bursting scenarios and multi-cloud deployments. Executives stressed that Nvidia’s not trying to compete with a Kubernetes distribution of its own. Rather, it’s contributing enhancements to the open-source community, making crucial GPU-optimized Kubernetes modules available.
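
To give a sense of what GPU scheduling on Kubernetes looks like today, here is a minimal sketch using the official Kubernetes Python client. It assumes a cluster where the NVIDIA device plugin is installed, which exposes GPUs as the extended resource "nvidia.com/gpu"; the container image tag, entrypoint and pod name are illustrative assumptions, not announced products.

```python
# A minimal sketch: schedule a GPU workload via the Kubernetes Python client.
from kubernetes import client, config

config.load_kube_config()  # use the current kubectl context

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="gpu-inference-demo"),
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="inference",
                image="nvcr.io/nvidia/tensorrt:latest",  # hypothetical NGC image tag
                command=["python", "run_inference.py"],  # hypothetical entrypoint
                resources=client.V1ResourceRequirements(
                    # The scheduler will only place this pod on a node
                    # with a free GPU advertised by the device plugin.
                    limits={"nvidia.com/gpu": "1"},
                ),
            )
        ],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```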

The ecosystem-support message was much the same around Nvidia GPU Cloud (NGC). Rather than offering competing cloud compute and storage services, NGC is a cloud registry and certification program that ensures that Nvidia GPU-optimized software is available on third-party clouds. At GTC Nvidia announced that NGC software is now available on AWS, Google Cloud Platform, Alibaba’s AliCloud, and Oracle Cloud. This adds to the support already offered by Microsoft Azure, Tencent, Baidu Cloud, Cray, Dell, Hewlett Packard, IBM and Lenovo. Long story short, companies can deploy Nvidia GPU capacity and optimized software on just about any cloud, be it public or private.

In an example of GPU-accelerated analytics, this MapD geospatial analysis shows six years of shipping traffic - 11.6 billion records without aggregation - along the West Coast.

MyTake on GTC and Nvidia

I was blown away by the range and number of AI-related sessions, demos and applications in evidence at GTC. Yes, it’s an Nvidia event, and GPUs were the ever-present enabler behind the scenes. But the focus of GTC and of Nvidia is clearly on easing the path to development and operationalization of applications harnessing deep learning, high-performance computing, accelerated analytics, virtual and augmented reality, and state-of-the-art rendering, imaging or geospatial analysis.

Analyst discussions with Huang, Bill Dally, Nvidia’s chief scientist and SVP of Research, and Bob Pette, VP and GM of pro visualization, underscored that Nvidia has spent the last half of its 25-year history building out its depth and breadth across industries ranging from manufacturing, automotive, and oil and gas exploration to healthcare, telecom, and architecture, engineering and construction. Indeed, Nvidia Research placed its bets on AI – which will have a dramatic impact across all industries – back in 2010. That planted the seeds, as Dally put it, for the depth and breadth of deep learning framework support that the company has in place today.

Nvidia can’t be a market maker entirely on its own. My discussions at GTC with accelerated analytics vendors Kinetica, MapD, Fast Data and BlazingDB, for example, revealed that they’re moving beyond a technology-focused sell on the benefits of GPU query, visualization and geospatial analysis performance. They’re moving to a vertical-industry, applications and solutions sell catering to oil and gas, logistics, financial services, telcos, retail and other industries. That’s a sign of maturation and mainstream readiness for GPU-based computing. In one of my latest research reports, “Danske Bank Fights Fraud with Machine Learning and AI,” you can read about why a 147-year-old bank invested in Nvidia GPU clusters on the strength of convincing proof-of-concept tests around deep-learning-based fraud detection.

Of course, there’s still work to do to broaden the GPU ecosystem. At GTC Nvidia announced a partnership through which its open sourced deep learning accelerator architecture will be integrated into mobile chip maker Arm’s Project Trillium platform. The collaboration will make it easier for internet-of-things chip companies to integrate AI into their designs and deliver the billions of smart, connected consumer devices envisioned in our future. It was one more sign to me that Nvidia has a firm grasp on where its technology is needed and how to lay the groundwork for next-generation applications powered by GPUs. 

Related Reading:
Danske Bank Fights Fraud with Machine Learning and AI
How Machine Learning & Artificial Intelligence Will Change BI & Analytics
Amazon Web Services Adds Yet More Data and ML Services, But When is Enough Enough?


Event Report - ADP Meeting of the Minds 2018 - Stay the course

We had the opportunity to attend ADP's 25th Meeting of the Minds (MOTM) user conference, held in Orlando at the Waldorf Astoria / Hilton from March 18th till 23rd 2018.

Take a look at the event video first (if it does not show up - please check here):

Here is the 1 slide condensation (if the slide doesn't show up, check here):
Event Report - ADP Meeting of the Minds 2018 - Stay the course from Holger Mueller

Want to read on? Here you go:

If you want to learn more about the keynote, key tweets are collected in this Twitter Moment here.

MyPOV

A good event for ADP customers and prospects. ADP keeps delivering at a steady pace and creates value for its customer base. Diversity and inclusion are high on people leaders' agendas, so it is no surprise that ADP also focuses on this important topic. Good to see the first TMBC assets making it to the mainstream North American customer base that ADP targets with its Meeting of the Minds conference.

On the concern side, ADP is moving at a conservative, maybe too slow, speed. One year is enough time for most vendors to create integrated value from an acquisition like TMBC. And ADP announced its new payroll product, Pi, back at the HR Tech conference in the fall - so an update for the MOTM attendees would have been timely. Not to mention ADP's new HR core system Lifion, which ADP advertises for talent publicly but chose not to mention in Orlando. It's always good for enterprise software vendors to be conservative and quality focused, and ADP customers certainly expect that, but vendors can't be too slow rolling out differentiating capabilities either.

Overall a good event - customers and ecosystem are happy with the progress. A lot of new innovation should see the light of day at ADP MOTM 2019, fingers crossed. Stay tuned.


Monday's Musings: Designing Five Pillars For Level 1 Artificial Intelligence Ethics

Focus On Humanizing AI

As organizations begin their journey into artificial intelligence (AI), ethics often enter the design process.  While achieving a uniform set of ethics may seem insurmountable, some design points will help facilitate the humanization of artificial intelligence and provide appropriate checks and balances.  Constellation has identified five design pillars for Level 1 AI.  Level 1 AI is defined as machine learning proficiency (see Figure 1).

Figure 1.  Five Levels of Artificial Intelligence Requires Different Design Points

The five pillars include (see Figure 2):

  1. Transparent.  Algorithms, attributes, and correlations should be open to inspection for all participants.
  2. Explainable.  Humans should be able to understand how AI systems come to their contextual decisions.
  3. Reversible.  Organizations must be able to reverse the learnings and adjust as needed.
  4. Trainable.  AI systems must have the ability to learn from humans and other systems.
  5. Human-led.  All decisions should begin and end with human decision points.

Figure 2. Five Pillars For Level 1 AI Ethics Focus On Humanizing AI

The Bottom Line.  Instill The Five Design Pillars For AI Ethics In All Projects

Prospects of universal AI ethics seem slim.  However, the five design pillars will serve organizations well beyond the social fads and fears.  The goal – build controls that will identify biases, show attribution, and enable course correction as needed.
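
As a thought experiment, here is a minimal sketch of how the five pillars could show up as concrete controls in code – a decision audit record plus a human sign-off gate. This is an illustration, not Constellation's specification; every class, field and function name is a hypothetical assumption.

```python
# A minimal sketch: one hook per Level 1 pillar. All names are hypothetical.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class AIDecisionRecord:
    model_version: str                  # Transparent: which model and weights decided
    input_attributes: Dict[str, float]  # Transparent: attributes open to inspection
    explanation: str                    # Explainable: human-readable rationale
    training_snapshot_id: str           # Reversible: learnings can be rolled back here
    feedback: List[str] = field(default_factory=list)  # Trainable: human corrections
    human_approved: bool = False        # Human-led: no effect without sign-off

def finalize(record: AIDecisionRecord,
             approve: Callable[[AIDecisionRecord], bool]) -> bool:
    """A decision takes effect only after a human decision point approves it."""
    record.human_approved = approve(record)
    return record.human_approved
```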


Domo Focuses Its Cloud-Based Analytics Message, Adds Predictive Options

Domo insists its platform is aimed at business people, not data analysts. Here’s the appeal to CXOs and line-of-business types.

The key message at Domopalooza 2018, March 13-15 in Salt Lake City, was that Domo is a platform for business, not a tool for techies. I’ve heard this platform messaging before, but it made sense for cloud-based analytics vendor Domo to emphasize its new “for the good of the company” slogan to try to set itself apart from competitors Tableau, Microsoft Power BI and Qlik.

“Domo is an operating system that lets you run your business on your phone,” declared Domo CEO and founder Josh James in his opening keynote.

It’s an apt description, though Domo is most often used to run the marketing and sales aspects of businesses. Once adopted, Domo usage tends to expand and, in some cases, go companywide. Customers I spoke to at Domopalooza were extending their deployments into finance, customer support, supply chain management and other operational areas. Broad use is most common among corporate customers (meaning those with less than $1 billion in annual revenue), but enterprise customers including Target, UnitedHealth Group, Telus and L’Oréal are expanding their Domo footprints.

These announcements from Domopalooza 2018 will see beta release in spring and
general availability this summer.

To recap some basics (from my Domopalooza 2017 analysis), Domo is a cloud-based, multi-tenant platform onto which you can load diverse data at scale. There are more than 500 connectors to common data sources, and Domo’s Magic ETL supports integration and transformation. Domo’s Vault back end and infrastructure run primarily on Amazon Web Services, but the platform is also available on Azure for customers (such as Target) that aren’t comfortable storing data on AWS.

Domo introduced a Bring Your Own Key encryption option last year. That won over many customers with demanding security requirements. The company is also rolling out a federated data-access option for those who want to retain certain data on-premises without loading it or copying it into the cloud. Performance with this remote-query option depends on the bandwidth and latency of the customer’s data-center connections and query engine.

Once data is loaded into Domo, admins expose data sets with appropriate access privileges. Business intelligence/data analyst types tend to build out the initial cards and pages (akin to data visualizations and dashboards), but it’s common for business types to create and edit their own cards. Once users learn the platform, departmental and line-of-business power users often develop cards, pages and even “Beast Mode” custom calculations. There are hundreds of prebuilt and templated cards and pages available.

Domo execs say it’s not uncommon for CXOs to be big Domo users. Jeremy Andrus, CEO at Traeger Grills and a keynote guest, said he’s the number-one user at his company, which is a $450-million maker of barbeque grills. Andrus said he looks at insights on revenue, day sales, margins, channel productivity and marketing efficiency on a daily basis, usually from his phone. At larger companies the buyers and biggest users are typically line-of-business leaders. A keynote panel of media customers brought together marketing, advertising and operations vice presidents from Domo customers ESPN, The New York Times, Univision and The Washington Post.

Upgrades Announced at Domopalooza

The major announcements at Domopalooza fell into four categories: storytelling, Mr. Roboto, certified content and data center. Most of these upgrades are expected to see beta release this spring with general availability expected by summer. Here’s a quick summary of what’s in store:

Storytelling: Highlights include auto-suggested page layouts, more templated page layouts, custom charting capabilities, and support for guided, interactive analyses.

Mr. Roboto: This is Domo’s intelligence, alerting and machine learning layer. Upgrades here include natural-language generation, automated predictions, forecast alerts, and anomaly and correlation detection in third-party data. Also planned is R- and Python-based data science integration.

Mobile views of coming Mr. Roboto capabilities including, left to right, natural language
generation, forecasting and correlation.

Certified content: Coming certification capabilities for cards, data sets and Beast Mode custom calculations will beef up governance and compliance capabilities with granular controls. This will help analysts and administrators ensure sound data and sound, sanctioned analyses, but it’s not a one-way street. Business users will be able to submit the new cards they develop for certification. Beefed-up statistics for admins will reveal who’s using what data, cards and Beast Modes and whether there are overlaps, redundancies or inconsistencies.

Data center: These upgrades will help admin and data-management types with data cleansing and validation. Here, too, beefed-up statistics will help admins understand and tag data and then track usage. Collaboration and group controls will help with managing data sets at scale.

MyPOV on Domopalooza 2018 and Domo’s Direction

I assumed Domopalooza’s move from The Grand America Hotel in 2017 to the far larger Salt Palace Convention Center for 2018 would mean a much bigger event, but attendance was roughly the same as last year at around 3,000 people. The extra space made things more commodious and comfortable, but last year’s event seemed to have a bit more energy.

As for the keynotes, Domo leaned heavily on fireside chats with celebrity guests. I prefer hearing from customers, particularly innovators. On that note, Simone Knight, VP, Marketing Strategy and Media Intelligence at Univision, was fantastic both as a guest and in leading the media panel. Ben Schein, Senior Director, Enterprise Data, Analytics and BI at Target, made a reprise appearance, updating the details of the retailer’s massive Domo footprint. Target now has 800 billion (with a B) rows of data in Domo, and usage now averages 3,000 weekly users, up from 1,500 weekly users last year.

As for the announcements, every customer I talked to was eager to adopt the new features. The certifications, storytelling and data admin upgrades were described as must-haves that can’t come soon enough. The Mr. Roboto capabilities are nice-to-haves that will drive innovation. If you read my January report on “How Machine Learning and Artificial Intelligence will Change BI and Analytics,” you know that Domo is among the leading vendors I detailed that are investing in ML and AI. I like the platform approach of Mr. Roboto, which will enable customers and partners to work with APIs and add their own code and customized capabilities.

Domo creates early anticipation for features by holding a “Sneak Peek” and customer-wish-list session at the end of every Domopalooza. The upside is that Domo’s direction is very customer driven. The downside is that it might be 14 to 18 months between initial public discussions about features and upgrades and general availability. That can make the process seem slow when, in fact, it’s just more open. Companies that are more secretive about their development work for many months before announcing new features run the risk of getting too little input and facing surprises in the beta and release stages.

Domo is still maturing. It started as a platform designed to run in the cloud and give business users web and mobile access to insights. The market messaging is now in line with that original vision and it’s building out the deeper levels of management control and customization capabilities that the developers and administrators are demanding as deployments scale up and out.

Related Reading:
MicroStrategy Makes Case for Agile Analytics on its Enterprise Platform
Tableau Conference 2017: What’s New, What’s Coming, What’s Missing
Qlik Plots Course to Big Data, Cloud and ‘AI’ Innovation


Event Report - Ultimate Connections 2018 - More HCM, more WfM and more Xander

Ultimate Software held its yearly user conference, Ultimate Connections, at the Wynn in Las Vegas from March 11th till 14th 2018. The event was very well attended, with over 4,000 customers and prospects; Ultimate had to find a new home for Ultimate Connections as the conference has outgrown the Bellagio. I wasn't able to attend in person, but Ultimate was so kind as to rope me in remotely for the keynote and analyst meeting sessions, which was very much appreciated.

Take a look at the event video first (if it does not show up - please check here):

Here is the 1 slide condensation (if the slide doesn't show up, check here):
Want to read on? Here you go:

[Factual Correction] UltiPro Workforce Management, including both Time and Scheduling, has been generally available since January 2018.

Overall MyPOV

Good to see Ultimate follow up on the substantial roadmap promises made at Ultimate Connections 2017 and deliver product across the announced areas. Equally good to see more application of Xander in 'natural' AI areas such as Recruiting, elimination of bias and improving Performance Management. The move into Workforce Management is a natural expansion of Ultimate's footprint, welcomed by Ultimate clients. Realistically, though, it will be a few years until the Ultimate Scheduling capabilities match the leading best-of-breed workforce management solutions.
 
On the concern side, the variety within the Ultimate platform sticks out. While all the technologies assembled are proven, the platform looks more like a yearbook of technologies that were up and coming three to four years back. While the variety does not hurt Ultimate customers at the moment, the vendor will have to strive for a more unified and harmonized platform in the not too distant future.
 
But overall good to see the progress by Ultimate, delivering on promises and pushing the AI yardstick further. It remains interesting that no major HCM player except for Ultimate has named its assistant (yet), and it's unlikely the key HCM players will soon match or exceed the scope that Ultimate offers today and has announced for 2018 with Xander. A good position for Ultimate customers. Stay tuned.

On a robot fatality

Tragically, we have seen the first fatal accident involving a self-driving car, with a pedestrian in Tempe, Arizona dying after being hit by an Uber in autonomous mode. My thoughts are with the family and the Uber operator. It would be a horrific experience.

I don’t know the facts of this case, but people are already opining that the victim was jay-walking.  

I sincerely hope commentators don’t simply line up around the clinical rights & wrongs of the road rules, as if the SDC shouldn’t be expected to cope with an errant human. I always thought the point of Self Driving Cars was that they’d work on real roads, without special signposts, beacons or machine-readable lane markings.  Truly autonomous SDCs must adapt to the real world, where the rule is, people don’t always follow the rules. 

As our cities fill with rule-bound robot vehicles, jay-walkers should not have to fear for their lives.  

Recently I wrote:

No algorithm is ever taken by surprise, in the way a human can recognise the unexpected. A computer can be programmed to fail safe (hopefully) if some input value exceeds a design limit, but logically, it cannot know what to do in circumstances the designers did not foresee. Algorithms cannot think (or indeed do anything at all) "outside the dots". 
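
To make the fail-safe point concrete, here is a minimal sketch, with purely hypothetical names, thresholds and actions, of what "fail safe if an input exceeds a design limit" looks like in code – and of why such a guard cannot cover circumstances the designers never enumerated.

```python
# A minimal sketch of "fail safe when an input exceeds a design limit".
MAX_SENSOR_RANGE_M = 100.0  # assumed design limit for a range reading

def plan_action(distance_to_object_m: float) -> str:
    # Designed-for failure: the input violates a known limit, so fail safe.
    if not (0.0 <= distance_to_object_m <= MAX_SENSOR_RANGE_M):
        return "FAIL_SAFE_STOP"
    # Designed-for normal cases follow.
    if distance_to_object_m < 10.0:
        return "EMERGENCY_BRAKE"
    return "CRUISE"
    # Note: there is no branch for circumstances the designers did not
    # foresee; the algorithm cannot recognise "the unexpected" as such.
```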


Event Report - SAP Ariba Live 2018 - Las Vegas - Sustainability and UX

We had the opportunity to attend SAP Ariba's user conference, Ariba Live, held in Las Vegas at Caesars Palace from March 5th till 8th 2018. The conference saw record attendance; it had to move from the Cosmopolitan.

Don't want to read - but prefer to watch - here is a short video (if it does not show up - please check here)
And here is the 1 Slide summary from Slideshare:

Prefer to read - here is my 5 Tweet event sequence on Twitter:


MyPOV

A good event for Ariba. Progress on product, and a new UX, is always positive. It was short on new announcements and capabilities - but that's the usual 'lull' with an executive transition. Padgett seems to be getting started, and the next quarters will be important; next year's Ariba Live will be the full report card.

On the concern side, SAP Ariba did not talk about and show as much AI / ML and Blockchain as it did in 2017. Though the contract text analysis from the partnership with IBM has been delivered, that is not the broad uptake the technology needs in Procurement. Did anybody say software agents?

But for now, a good event. The mix of more soft / feel-good topics vs. product and customer news is new - and I am very curious to talk more to customers, prospects and partners to see how it was received. Stay tuned.
