Results

Event Report - Oracle HCM World 2018 Dallas - Steady Progress


We had the opportunity to attend Oracle's HCM World conference, held March 20th to 22nd in Dallas. Despite falling in the busiest week of the spring conference circuit, the conference had good analyst and influencer representation and was overall well attended (Oracle claimed 2,200+).

Prefer to watch? Here is my event video (if the video doesn't show up, check here):

Here is the 1 slide condensation (if the slide doesn't show up, check here):

Want to read on? Here you go:

Oracle keeps delivering in HCM – Since its early beginnings, the Oracle HCM product has done well and is growing at an almost metronomic cadence. New capabilities are created in the product, and Oracle delivers customer adoption of them in the following 6-12 months. Customers, partners and the overall ecosystem are now on the twice-yearly announcement schedule of HCM World in spring and OpenWorld in fall, and in between, customer adoption manifests itself in go-lives. The result is likely the most comprehensive single-platform HCM suite in the market.

Oracle HCM Cloud HCM World Holger Mueller Constellation Research
Oracle HCM Cloud Spring 2018 Release Highlights

Good DNA in Spring 2018 Release – Along the same lines, the Spring 2018 release of Oracle HCM Cloud is a rich release that pushes the boundaries. Oracle replaced and strengthened the onboarding capabilities in the suite. Given the best-practice uncertainty that currently plagues performance management, Oracle has (wisely) opted for a suite of performance management tools. People leaders can choose from four different approaches to address performance management in their enterprise. The good news is that they can even choose to use different best practices / flavors of performance management across the enterprise – slice and dice by operating company, division, people type etc. This is the right approach, and it will be interesting to see adoption. And last but not least, more AI – no conference without AI in 2018. Oracle pointed out that it has always had some form of 'intelligence', but now the offerings are serious, and it's good to see Oracle speak of AI (vs. the a-tad-misaligned 'adaptive intelligence' term of the past).

The new Oracle HCM Cloud UX paradigm

A new UX – Having been a critic of the Oracle HCM UI for a long time, I find the new UX a welcome first step in the right direction. Not surprisingly, Oracle opts for the mobile focus of the largest user population and for the high-volume transactions. The new UX looks modern and easy to use and has some of the key inner workings a UX in 2018 should have – it is responsive, can be used across devices and form factors, and takes an aggressive stance on defaulting and suggesting entries. At the core is a newsfeed paradigm that users are familiar with from consumer websites. The newsfeed manages to collapse menu structures and to surface the relevant information at the right time. This is not easy to get right – think of the challenges Facebook has had with this UX – but from what we saw of the Oracle HCM newsfeed, it is a well-working first implementation in an Oracle enterprise application. All of this is welcome to busy enterprise users, who have a real job to do and can't afford to be held up too long by administrative systems (like any HCM system). The next step is to check in on customer feedback, rollout plans and roadmap.

Oracle has invested to get the Newsfeed right

MyPOV

A good HCM World for Oracle: the product keeps progressing, and customers and the ecosystem are positive. Customers tap more and more into the suite benefits, adding modules after originally going live on more administrative functions such as ESS / MSS and payroll. Regardless of the starting point though, suite-level benefits are tangible and create productivity for users as well as HR departments, all leading to a more positive stance towards the product, in this case Oracle HCM Cloud.

On the concern side, the event seemed smaller than last year's. Maybe the timing and the location in Dallas did not help, but in general you expect a growing attendee number. Equally, in the ecosystem we saw less partner activity; it looks like Deloitte (among the NA SIs) and Infosys (among the Indian SIs) have captured the pole position. Next year's HCM World will be a data point on how well Oracle can activate users and prospects to come to an event like this one. For the record, happy users do not need to travel to user conferences as much, as they know what's coming and are busy implementing and using the software.

But overall a good event for Oracle customers and prospects. Oracle HCM Cloud is probably the most complete, single platform, single code base HCM Suite out there. Stay tuned.


Nvidia Accelerates Artificial Intelligence, Analytics with an Ecosystem Approach


Nvidia’s GTC 2018 event spotlights a play book that goes far beyond chips and servers. Get set for next era of training, inferencing and accelerated analytics.

“We're not a chip company; we're a computing architecture and software company.”

This proclamation, from NVIDIA co-founder, president and CEO Jensen Huang at the GPU Technology Conference (GTC), March 26-29 in San Jose, CA, only hints at this company’s growing impact on state-of-the-art computing. Nvidia’s physical products are accelerators (for third-party hardware) and the company’s own GPU-powered workstations and servers. But it’s the company’s GPU-optimized software that’s laying the groundwork for emerging applications such as autonomous vehicles, robotics and AI while redefining the state of the art in high-performance computing, medical imaging, product design, oil and gas exploration, logistics, and security and intelligence applications.

Jensen Huang, co-founder, president and CEO, Nvidia, presents the sweep of the
company's growing AI Platform at GTC 2018 in San Jose, Calif.

On Hardware

On the hardware front, the headlines from GTC built on the foundation of Nvidia’s graphical processing unit advances.

  • The latest upgrade of Nvidia’s Tesla V100 GPU doubles memory to 32 gigabytes, improving its capacity for data-intensive applications such as training of deep-learning models.
  • A new NVSwitch interconnect fabric enables up to 16 Tesla V100 GPUs to share memory and simultaneously communicate at 2.4 terabytes per second -- five times the bandwidth and performance of industry standard PCI switches, according to Huang. Coupled with the new, higher-memory V100 GPUs, the switch greatly scales up computational capacity for deep-learning models.
  • The DGX-2, a new flagship server announced at GTC, combines 16 of the latest V100 GPUs and the new NVSwitch to deliver two petaflops of computational power. Set for release in the third quarter, it’s a single server geared to data science and deep-learning that can replace 15 racks of conventional CPU-based servers at far lower initial cost and operational expense, according to Nvidia.
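The two-petaflops figure above is simply the aggregate of the 16 GPUs. A back-of-the-envelope sketch (the per-GPU figure is an assumption based on Nvidia's published ~125 TFLOPS tensor-core peak for the V100, not a number from this report):

```python
# Back-of-the-envelope check on the DGX-2 claim above.
# Assumption: ~125 TFLOPS tensor-core peak per V100 (Nvidia's published
# spec, not a figure from this report).
V100_TENSOR_TFLOPS = 125
NUM_GPUS = 16                      # V100s in a DGX-2

aggregate_pflops = NUM_GPUS * V100_TENSOR_TFLOPS / 1000
print(aggregate_pflops)            # 2.0 -- the two petaflops cited above
```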

If the “feeds and speeds” stats mean nothing to you, let’s put them into the context of real workloads. SAP tested the new V100 GPUs with its SAP Leonardo Brand Impact application, which delivers analytics about the presence and exposure time of brand logos within media to help marketers calculate returns on their sponsorship investments. With the doubling of memory to 32 gigabytes per GPU, SAP was able to use higher-definition images and a larger deep-learning model than previously used. The result was higher accuracy, with a 40 percent reduction in the average error rate yet with faster, near-real-time performance.

In another example based on a FAIRSeq neural machine translation model benchmark test, training that took 15 days on Nvidia’s six-month-old DGX-1 server took less than 1.5 days on the DGX-2. That’s a 10x improvement in performance and productivity that any data scientist can appreciate.

On Software

Nvidia’s software is what’s enabling workloads—particularly deep learning workloads--to migrate from CPUs to GPUs. On this front Nvidia unveiled TensorRT 4, the latest version of its deep-learning inferencing (a.k.a. scoring) software, which optimizes performance and, therefore, reduces the cost of operationalizing deep learning models in applications such as speech recognition, natural language processing, image recognition and recommender systems.

Here’s where the breadth of Nvidia’s impact on the AI ecosystem was apparent. Google, for one, has integrated TensorRT 4 into TensorFlow 1.7 to streamline development and make it easier to run deep-learning inferencing on GPUs. Huang’s keynote included a visual demo showing the dramatic performance difference between TensorFlow-based image recognition peaking at 300 images per second without TensorRT and then boosted to 2,600 images per second with TensorRT integrated into TensorFlow.

Nvidia also announced that Kaldi, the popular speech recognition framework, has been optimized to run on its GPUs, and the company says it’s working with  Amazon, Facebook and Microsoft to ensure that developers using ONNX frameworks, such as Caffe 2, CNTK, MXNet and Pytorch, can easily deploy using Nvidia deep learning platforms.

In a show of support from the data science world, MathWorks announced TensorRT integration with its popular MATLAB software. This will enable data scientists using MATLAB to automatically generate high-performance inference engines optimized to run on Nvidia GPU platforms.

On Cloud

The cloud is a frequent starting point for GPU experimentation and it’s an increasingly popular deployment choice for spikey, come-and-go data science workloads. With this in mind, Nvidia announced support for Kubernetes to facilitate GPU-based inferencing in the cloud for hybrid bursting scenarios and multi-cloud deployments. Executives stressed that Nvidia’s not trying to compete with a Kubernetes distribution of its own. Rather, it’s contributing enhancements to the open-source community, making crucial Kubernetes modules available that are GPU optimized.
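As a rough illustration of what GPU scheduling on Kubernetes looks like in practice, here is a generic, hypothetical pod spec using the standard Nvidia device-plugin resource name (this is not something Nvidia showed at GTC; the pod name and image tag are illustrative):

```yaml
# Hypothetical pod spec: requesting one GPU via the Nvidia device plugin.
apiVersion: v1
kind: Pod
metadata:
  name: gpu-inference-worker                # name is illustrative
spec:
  containers:
  - name: inference
    image: nvcr.io/nvidia/tensorrt:latest   # NGC registry image; tag is illustrative
    resources:
      limits:
        nvidia.com/gpu: 1                   # schedules the pod onto a GPU node
```

The `nvidia.com/gpu` resource name is what Nvidia's Kubernetes device plugin registers with the scheduler, which is the kind of GPU-optimized module Nvidia contributes to the open-source community.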

The ecosystem-support message was much the same around Nvidia GPU Cloud (NGC). Rather than offering competing cloud compute and storage services, NGC is a cloud registry and certification program that ensures that Nvidia GPU-optimized software is available on third-party clouds. At GTC Nvidia announced that NGC software is now available on AWS, Google Cloud Platform, Alibaba’s AliCloud, and Oracle Cloud. This adds to the support already offered by Microsoft Azure, Tencent, Baidu Cloud, Cray, Dell, Hewlett Packard, IBM and Lenovo. Long story short, companies can deploy Nvidia GPU capacity and optimized software on just about any cloud, be it public or private.

In an example of GPU-accelerated analytics, this MapD geospatial analysis shows six years of shipping traffic - 11.6 billion records without aggregation - along the West Coast.

MyTake on GTC and Nvidia

I was blown away at the range and number of AI-related sessions, demos and applications in evidence at GTC. Yes, it’s an Nvidia event and GPUs were the ever-present enabler behind the scenes. But the focus of GTC and of Nvidia is clearly on easing the path to development and operationalization of applications harnessing deep learning, high-performance computing, accelerated analytics, virtual and augmented reality, and state-of-the art rendering, imaging or geospatial analysis.

Analyst discussions with Huang, Bill Dally, Nvidia’s chief scientist and SVP of Research, and Bob Pette, VP and GM of pro visualization, underscored that Nvidia has spent the last half of its 25-year history building out its depth and breadth across industries ranging from manufacturing, automotive, and oil and gas exploration to healthcare, telecom, and architecture, engineering and construction. Indeed, Nvidia Research placed its bets on AI – which will have a dramatic impact across all industries – back in 2010. That planted the seeds, as Dally put it, for the depth and breadth of deep learning framework support that the company has in place today.

Nvidia can’t be a market maker entirely on its own. My discussions at GTC with accelerated analytics vendors Kinetica, MapD, Fast Data and BlazingDB, for example, revealed that they’re moving beyond a technology-focused sell on the benefits of GPU query, visualization and geospatial analysis performance. They’re moving to a vertical-industry, applications and solutions sell catering to oil and gas, logistics, financial services, telcos, retail and other industries. That’s a sign of maturation and mainstream readiness for GPU-based computing. In one of my latest research reports, “Danske Bank Fights Fraud with Machine Learning and AI,” you can read about why a 147-year-old bank invested in Nvidia GPU clusters on the strength of convincing proof-of-concept tests around deep-learning-based fraud detection.

Of course, there’s still work to do to broaden the GPU ecosystem. At GTC Nvidia announced a partnership through which its open sourced deep learning accelerator architecture will be integrated into mobile chip maker Arm’s Project Trillium platform. The collaboration will make it easier for internet-of-things chip companies to integrate AI into their designs and deliver the billions of smart, connected consumer devices envisioned in our future. It was one more sign to me that Nvidia has a firm grasp on where its technology is needed and how to lay the groundwork for next-generation applications powered by GPUs. 

Related Reading:
Danske Bank Fights Fraud with Machine Learning and AI
How Machine Learning & Artificial Intelligence Will Change BI & Analytics
Amazon Web Services Adds Yet More Data and ML Services, But When is Enough Enough?


Event Report - ADP Meeting of the Minds 2018 - Stay the course


We had the opportunity to attend ADP's 25th Meeting of the Minds (MOTM) user conference, held in Orlando at the Waldorf Astoria / Hilton from March 18th to 23rd, 2018.

Take a look at the event video first (if it does not show up - please check here):

Here is the 1 slide condensation (if the slide doesn't show up, check here):
 
Event Report - ADP Meeting of the Minds 2018 - Stay the course from Holger Mueller

Want to read on? Here you go:

If you want to learn more about the keynote, key tweets are collected in this Twitter Moment here.
 
MyPOV

A good event for ADP customers and prospects. ADP keeps delivering at a steady pace and creates value for its customer base. The focus on diversity and inclusion is high on people leaders' agendas, so it is no surprise that ADP also focuses on this important topic. It is also good to see the first TMBC assets making it to the mainstream North American customer base that ADP is targeting with its Meeting of the Minds conference.

On the concern side, ADP is moving at a conservative, maybe too slow, speed. One year is enough time for most vendors to create integrated value from an acquisition like TMBC. And ADP announced its new payroll product, Pi, back at the HR Tech conference in the fall, so an update to the MOTM attendees would have been timely. Not to mention ADP's new HR core system Lifion, which ADP advertises for talent publicly but chose not to mention in Orlando. It's always good for enterprise software vendors to be conservative and quality focused, and ADP customers certainly expect that, but vendors can't be too slow rolling out differentiating capabilities either.

Overall a good event; customers and ecosystem are happy with the progress. A lot of new innovation should see the light of day at ADP MOTM 2019, fingers crossed. Stay tuned.


 

Monday's Musings: Designing Five Pillars For Level 1 Artificial Intelligence Ethics


 

Focus On Humanizing AI

As organizations begin their journey into artificial intelligence (AI), ethics often enters the design process. While achieving a uniform set of ethics may seem insurmountable, some design points will help facilitate the humanization of artificial intelligence and provide appropriate checks and balances. Constellation has identified design pillars for Level 1 AI, defined as machine learning proficiency (see Figure 1).

Figure 1.  Five Levels of Artificial Intelligence Require Different Design Points

The five pillars include (see Figure 2):

  1. Transparent.  Algorithms, attributes, and correlations should be open to inspection for all participants.
  2. Explainable.  Humans should be able to understand how AI systems come to their contextual decisions.
  3. Reversible.  Organizations must be able to reverse the learnings and adjust as needed.
  4. Trainable.  AI systems must have the ability to learn from humans and other systems.
  5. Human-led.  All decisions should begin and end with human decision points.
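To make the framework concrete, here is a minimal sketch (entirely illustrative; the function and field names are my own, not Constellation's) of how a project team might encode the five pillars as a pre-deployment checklist:

```python
# Illustrative only: encoding the five Level 1 AI ethics pillars as a
# checklist so unmet pillars surface before deployment.
PILLARS = ["transparent", "explainable", "reversible", "trainable", "human_led"]

def unmet_pillars(assessment: dict) -> list:
    """Return the pillars the project has not yet satisfied."""
    return [p for p in PILLARS if not assessment.get(p, False)]

project = {"transparent": True, "explainable": True, "reversible": False,
           "trainable": True, "human_led": True}
print(unmet_pillars(project))      # ['reversible']
```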

Figure 2. Five Pillars For Level 1 AI Ethics Focus On Humanizing AI

The Bottom Line.  Instill The Five Design Pillars For AI Ethics In All Projects

Prospects of universal AI ethics seem slim. However, the five design pillars will serve organizations well beyond the social fads and fears. The goal: build controls that will identify biases, show attribution, and enable course correction as needed.

 

 


Domo Focuses Its Cloud-Based Analytics Message, Adds Predictive Options


Domo insists its platform is aimed at business people, not data analysts. Here’s the appeal to CXOs and line-of-business types.

The key message at Domopalooza 2018, March 13-15 in Salt Lake City, was that Domo is a platform for business, not a tool for techies. I’ve heard this platform messaging before, but it made sense for cloud-based analytics vendor Domo to emphasize its new “for the good of the company” slogan to try to set itself apart from competitors Tableau, Microsoft Power BI and Qlik.

“Domo is an operating system that lets you run your business on your phone,” declared Domo CEO and founder Josh James in his opening keynote.

It’s an apt description, though Domo is most often used to run the marketing and sales aspects of businesses. Once adopted, Domo usage tends to expand and, in some cases, go companywide. Customers I spoke to at Domopalooza were extending their deployments into finance, customer support, supply chain management and other operational areas. Broad use is most common among corporate customers (meaning those with less than $1 billion in annual revenue), but enterprise customers including Target, United Health Group, Telus and L’Oréal are expanding their Domo footprints.

These announcements from Domopalooza 2018 will see beta release in spring and
general availability this summer.

To recap some basics (from my Domopalooza 2017 analysis), Domo is a cloud-based, multi-tenant platform onto which you can load diverse data at scale. There are more than 500 connectors to common data sources, and Domo’s Magic ETL supports integration and transformation. Domo’s Vault back end and infrastructure runs primarily on Amazon Web Services, but it’s also available on Azure for customers (such as Target) that aren’t comfortable storing data on AWS.

Domo introduced a Bring Your Own Key encryption option last year. That won over many customers with demanding security requirements. The company is also rolling out a federated data-access option for those who want to retain certain data on-premises without loading it or copying it into the cloud. Performance with this remote-query option depends on the bandwidth and latency of the customer’s data-center connections and query engine.

Once data is loaded into Domo, admins expose data sets with appropriate access privileges. Business intelligence/data analyst types tend to build out the initial cards and pages (akin to data visualizations and dashboards), but it’s common for business types to create and edit their own cards. Once users learn the platform, departmental and line-of-business power users often develop cards, pages and even “Beast Mode” custom calculations. There are hundreds of prebuilt and templated cards and pages available.

Domo execs say it’s not uncommon for CXOs to be big Domo users. Jeremy Andrus, CEO at Traeger Grills and a keynote guest, said he’s the number-one user at his company, which is a $450-million maker of barbeque grills. Andrus said he looks at insights on revenue, day sales, margins, channel productivity and marketing efficiency on a daily basis, usually from his phone. At larger companies the buyers and biggest users are typically line-of-business leaders. A keynote panel of media customers brought together marketing, advertising and operations vice presidents from Domo customers ESPN, The New York Times, Univision and The Washington Post.

Upgrades Announced at Domopalooza

The major announcements at Domopalooza fell into four categories: storytelling, Mr. Roboto, certified content and data center. Most of these upgrades are expected to see beta release this spring, with general availability expected by summer. Here’s a quick summary of what’s in store:

Storytelling: Highlights include auto-suggested page layouts, more templated page layouts, custom charting capabilities, and support for guided, interactive analyses.

Mr. Roboto: This is Domo’s intelligence, alerting and machine learning layer. Upgrades here include natural-language generation, automated predictions, forecast alerts, and anomaly and correlation detection in third-party data. Also planned is R- and Python-based data science integration.

Mobile views of coming Mr. Roboto capabilities including, left to right, natural language
generation, forecasting and correlation.

Certified content: Coming certification capabilities for cards, data sets and Beast Mode custom calculations will beef up governance and compliance capabilities with granular controls. This will help analysts and administrators ensure sound data and sound, sanctioned analyses, but it’s not a one-way street. Business users will be able to submit the new cards they develop for certification. Beefed up statistics for admins will reveal who’s using what data, cards and Beast Modes and whether there are overlaps, redundancies or inconsistencies.

Data center: These upgrades will help admin and data-management types with data cleansing and validation. Here, too, beefed-up statistics will help admins understand and tag data and then track usage. Collaboration and group controls will help with managing data sets at scale.

MyPOV on Domopalooza 2018 and Domo’s Direction

I assumed Domopalooza’s move from The Grand America Hotel in 2017 to the far larger Salt Palace Convention Center for 2018 would mean a much bigger event, but attendance was roughly the same as last year at around 3,000 people. The extra space made things more commodious and comfortable, but last year’s event seemed to have a bit more energy.

As for the keynotes, Domo leaned heavily on fireside chats with celebrity guests. I prefer hearing from customers, particularly innovators. On that note, Simone Knight, VP, Marketing Strategy and Media Intelligence at Univision, was fantastic both as a guest and in leading the media panel. Ben Schein, Senior Director, Enterprise Data, Analytics and BI at Target, made a reprise appearance, updating the details of the retailer’s massive Domo footprint. Target now has 800 billion (with a B) rows of data in Domo, and demand now averages 3,000 weekly users, up from 1,500 weekly users last year.

As for the announcements, every customer I talked to was eager to adopt the new features. The certifications, storytelling and data admin upgrades were described as must-haves that can’t come soon enough. The Mr. Roboto capabilities are nice-to-haves that will drive innovation. If you read my January report, “How Machine Learning and Artificial Intelligence Will Change BI and Analytics,” you know that Domo is among the leading vendors I detailed that are investing in ML and AI. I like the platform approach of Mr. Roboto, which will enable customers and partners to work with APIs and add their own code and customized capabilities.

Domo creates early anticipation for features by holding a “Sneak Peek” and customer-wish-list session at the end of every Domopalooza. The upside is that Domo’s direction is very customer driven. The downside is that it might be 14 to 18 months between initial public discussions about features and upgrades and general availability. That can make the process seem slow when, in fact, it’s just more open. Companies that are more secretive about their development work for many months before announcing new features run the risk of getting too little input and facing surprises in the beta and release stages.

Domo is still maturing. It started as a platform designed to run in the cloud and give business users web and mobile access to insights. The market messaging is now in line with that original vision and it’s building out the deeper levels of management control and customization capabilities that the developers and administrators are demanding as deployments scale up and out.

Related Reading:
MicroStrategy Makes Case for Agile Analytics on its Enterprise Platform
Tableau Conference 2017: What’s New, What’s Coming, What’s Missing
Qlik Plots Course to Big Data, Cloud and ‘AI’ Innovation


Event Report - Ultimate Connections 2018 - More HCM, more WfM and more Xander


Ultimate Software held its yearly user conference, Ultimate Connections, at the Wynn in Las Vegas from March 11th to 14th, 2018. The event was very well attended, with over 4,000 customers and prospects; Ultimate had to find a new home for Ultimate Connections as the conference has outgrown the Bellagio. I wasn't able to attend, but Ultimate was kind enough to rope me in remotely to the keynote and analyst meeting sessions, which was much appreciated.

Take a look at the event video first (if it does not show up - please check here):

 

Here is the 1 slide condensation (if the slide doesn't show up, check here):
 
Want to read on? Here you go:

[Factual Correction] UltiPro Workforce Management, including both Time and Scheduling, has been generally available since January 2018.

Overall MyPOV

Good to see Ultimate follow up on the substantial roadmap promises made at Ultimate Connections 2017 and deliver product across the announced areas. Equally good to see more application of Xander in 'natural' AI application areas such as recruiting, elimination of bias and improving performance management. The move into workforce management is a natural expansion of Ultimate's footprint, welcomed by Ultimate clients. Realistically though, it will be a few years until the Ultimate scheduling capabilities match the leading best-of-breed workforce management solutions.
 
On the concern side, the variety of the Ultimate platform sticks out. While all the technologies assembled are proven, the platform looks more like a yearbook of technologies that were up and coming over the last three to four years. While the variety does not hurt Ultimate customers at the moment, the vendor will have to strive for a more unified and harmonized platform in the not-too-distant future.
 
But overall it is good to see the progress by Ultimate, delivering on promises and pushing the AI yardstick further. It remains interesting that no major HCM player except for Ultimate has named its assistant (yet), and it's unlikely the key HCM players will match or expand the scope that Ultimate offers today and announced for 2018 with Xander. A good position for Ultimate customers. Stay tuned.
 

On a robot fatality


Tragically, we have seen the first fatal accident involving a self-driving car, with a pedestrian in Tempe, Arizona dying after being hit by an Uber in autonomous mode. My thoughts are with the family, and the Uber operator. It would be a horrific experience.

I don’t know the facts of this case, but people are already opining that the victim was jay-walking.  

I sincerely hope commentators don’t simply line up around the clinical rights & wrongs of the road rules, as if the SDC shouldn’t be expected to cope with an errant human. I always thought the point of Self Driving Cars was that they’d work on real roads, without special signposts, beacons or machine-readable lane markings.  Truly autonomous SDCs must adapt to the real world, where the rule is, people don’t always follow the rules. 

As our cities fill with rule-bound robot vehicles, jay-walkers should not have to fear for their lives.  

Recently I wrote

No algorithm is ever taken by surprise, in the way a human can recognise the unexpected. A computer can be programmed to fail safe (hopefully) if some input value exceeds a design limit, but logically, it cannot know what to do in circumstances the designers did not foresee. Algorithms cannot think (or indeed do anything at all) "outside the dots". 

Event Report - SAP Ariba Live 2018 - Las Vegas - Sustainability and UX

We had the opportunity to attend SAP Ariba's user conference, Ariba Live, held in Las Vegas at Caesars Palace from March 5th till 8th 2018. The conference saw record attendance - it even had to move from the Cosmopolitan to accommodate the crowd.

 

 

Don't want to read but prefer to watch? Here is a short video (if it does not show up, please check here).
 
 
And here is the 1 Slide summary from Slideshare:
 

Prefer to read - here is my 5 Tweet event sequence on Twitter:

 
 

MyPOV



A good event for Ariba. Progress on the product, and a new UX is always positive. Short on new announcements and capabilities - but that's the usual 'lull' that comes with an executive transition. Padgett seems to be getting started, and the next quarters will be important; next year's Ariba Live will be the full report card.

On the concern side, SAP Ariba did not talk about and show as much AI / ML and Blockchain as it did in 2017. Though the contract text analysis from the partnership with IBM has been delivered, it is not the broad uptake the technology needs in Procurement. Did anybody say software agents?

But for now, a good event. The mix of more soft / feel-good topics vs. product and customer news is new - and I am very curious to talk more to customers, prospects and partners to see how it was received. Stay tuned.


 

Event Takeaways - SAP at MWC 2018

During the Mobile World Congress in Barcelona, held from the 26th of February till the 1st of March at the Fira, I had the chance to meet with a number of SAP executives and gather some important takeaways from the briefings - worth a blog post of the media collected:

Prefer to watch – here is my event video …
 
 
Here is the 1 slide condensation (if the slide doesn't show up, check here):
 

Want to read on? Here you go (trying the Twitter thread format - let me know what you think of it):

MyPOV

A good point in time to check-in where SAP stands. Consumption based pricing is likely the most impactful announcement, giving CxOs better options to experiment, evaluate and operate next generation applications. The transformation of the SAP IoT portfolio, moving from templates, APIs and examples to separate products is equally positive.
 
On the concern side - I was not so sure SAP got its marketing dollars' worth back from a large booth at MWC. Apart from the Telco and Comms industry vendors, MWC is not a mainstream IT event. But it was certainly good for SAP to be at the party.
 
Overall it was good to catch up with SAP on many of its technology offerings. Sapphire looms large - only 3 months till the gathering in the Florida swamps.

Time to Bring Back the Software User Conference

As an industry observer, I attend something like 40+ user conferences a year, and from a longitudinal perspective I have been attending them for 30+ years, going back to my days working for software vendors. My first industry conference, as a German native, was of course... CeBIT in Hannover back in 1986...

So a few observations based on my recent experiences...

 

  • It's about the product. Users take time from their busy jobs to learn more about a vendor's software that they already use or plan to use. So the product needs to get a lot of room. Yes, services matter, too - but it's the product that matters in the first place. Savvy vendors will make sure they have enough in the product pipeline for a key user conference. If there is nothing new to show at the user conference - not a good sign.
     
  • Have a motivational speaker that matters. Motivational speakers are great to inspire a user group, but the message has to relate to the software and to the attendees' business. As great as the Cake Master may be, that is not the motivational speaker to have at a user conference. It's fine to have one motivational speaker off topic - but they better be inspirational, and the attendee's boss should get value already from the name-dropping of the speaker.
     
  • Demo software. Many attendees are expert users. Vendors need to show that they are experts, too, and know their product. The more hands-on an executive, the better. But software needs to be shown - it's the one powerful opportunity for a vendor to have ALL of the attendees see something. Wasting the opportunity is like setting fire to the marketing budget - for nothing. And a demo needs to be live. Too many demos are screenshots, screencams etc. Vendors need to remember: their users are in the audience - their customers - and thousands of them have to use the product every day. It does not look good when attendees come back and have to tell their colleagues that the vendor did not show live software.
     
  • Subject Expertise beats Celebrity. Yes, user conferences are also about inspiration. But a show's star, a soap opera protagonist or a talk show host is not someone an enterprise software user can relate to for their work, nor why they spend 3-4 days and a few thousand budget dollars / euros to come to a conference. Vendors should offer subject-matter experts - drawn from the user base, if push comes to shove. There is instant validation, trust and respect when a user presents to other users. There is a direct bond from being in the same boat and sharing experiences. No celebrity can do that. Glamour effects don't last.

  • Limit the Philanthropy. It's great for vendors to give back and share a purpose beyond the software. But it should not be 50% of a keynote. That takes away from the value of the philanthropy and raises questions about the purpose of the whole user conference.
  • Users want to network. Vendors should give users a chance to network. Not just informally, but in a planned way. We are only quarters away from Facebook / LinkedIn et al waking up to the opportunity to connect the right users at the right time at the right conference. Vendors have the choice to provide the platform - with all benefits - or stand by and watch.
      
  • Party Hard but responsibly. Yes, there is customer appreciation, and it's important. But vendors are in charge of making sure their attendees have fun and are safe. Limit late-night and early-morning events, give attendees a chance to sleep (so they retain what's being said the next day) and make the conference a safe environment.
 

MyPOV 

So what do you see out there? If you are a user or a vendor reading this, share your observations from recent user conferences. Is the industry heading in the right direction on this - or the not-so-right one (my view)? And most importantly: What should happen at a user conference from your POV? Please comment. See you at the next conference - always happy to say Hi!
 