
Event Report - WorkForce Software Vision 2018 - More Global and Crew Management

We had the opportunity to attend WorkForce Software's Vision 2018 user conference, held at the Grand Sheraton at Wild Horse Pass, south of Phoenix. The conference was well attended by customers, prospects and partners, showing substantial growth over the 2017 event in New Orleans.

Prefer to watch? Here is my event video (if the video doesn't show up, check here; pardon the bad sound quality at times, will have to record when time allows).

Here is the one-slide condensation (if the slide doesn't show up, check here):

Want to read on? Here are the key takeaways summed up in a Twitter Moment:

MyPOV

Good progress by WorkForce Software, which keeps growing and investing in product and infrastructure. New capabilities are key, and the new Crew Management capability is an important step forward. Even more importantly, it's the first time I have seen a modern and adequate UX from WorkForce Software, a UX that not only does the job but is easy and intuitive to use. More to come, says the vendor.
 
On the concern side, a user conference in 2018 with no story on AI / ML, no virtual assistant and no chat integration is... a little dated. WorkForce needs to understand the implications of these technologies and move its products forward, toward them.
 
But for now, good times at WorkForce Software, which remains the viable alternative to the 800-pound gorilla of workforce management, Kronos, in its five focus industries. Stay tuned.
 
 
 

Event Report: Workfront LEAP 2018

We all struggle to get work done. Too many tools, too much chaos, not enough time. That's where Work Coordination Platforms (WCP) come in. These tools help people bring organization and structure to projects and processes. It's a highly competitive market with several exciting things taking place, including IPOs, funding rounds and acquisitions. This week I had the pleasure of both attending and speaking at Workfront LEAP 2018, Workfront's annual user conference, where they introduced new features, new customer stories and a new product.

The two main product announcements were:

  • Workfront Home: A new starting page that makes it easier for people to see the items that need their attention and manage their time with highly functional calendar integration. Workfront customers will find this a very welcome addition, as it will make it easier to focus on where you need to spend your time without clicking through multiple projects. Interestingly, this comes just a few weeks after another WCP vendor introduced a similar feature, Trello Home.
  • Resource Capacity Planner: One of the key elements of project management is the assignment and tracking of resources... meaning how many people are available and how much time do they have to work on things. This new Workfront feature uses AI-enabled planning to help find the right people at the right time to assign work.

The new product announcement was the launch of Workfront Fusion, a tool that enables people to connect business applications to Workfront and automate the data transfer between applications with no coding. It uses a drag-and-drop process to define events and actions that outline what should happen and when. This is along the same lines as commercial tools like IFTTT and Zapier, and enterprise tools like Workato, Nintex, Microsoft Flow and Zoho Flow.
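For readers who want a feel for the pattern these tools automate, here is a minimal, hypothetical Python sketch of the "when this event happens, do that action" wiring that a no-code integration builder generates behind the scenes. The event names, payload fields and task-creation helper are invented for illustration; this is not Workfront Fusion's actual API.

```python
# Hypothetical illustration of the "event -> action" pattern that no-code
# integration tools such as Workfront Fusion, IFTTT or Zapier automate.
# Event names, payload fields and actions below are invented for this sketch.
from typing import Callable, Dict

# Registry mapping an event type to the action that should run when it fires.
ACTIONS: Dict[str, Callable[[dict], None]] = {}

def on_event(event_type: str):
    """Register a handler for a given event type (the 'when this happens' part)."""
    def register(handler: Callable[[dict], None]) -> Callable[[dict], None]:
        ACTIONS[event_type] = handler
        return handler
    return register

@on_event("proof.approved")
def create_followup_task(payload: dict) -> None:
    # The 'do this' part: map fields from the source system into the target one.
    task = {
        "name": f"Publish asset {payload['asset_id']}",
        "assignee": payload.get("owner", "unassigned"),
    }
    print(f"Would create task in work system: {task}")

def dispatch(event_type: str, payload: dict) -> None:
    """Route an incoming event to its registered action, if any."""
    handler = ACTIONS.get(event_type)
    if handler:
        handler(payload)

# Example: an approval event arriving from a connected application.
dispatch("proof.approved", {"asset_id": "A-123", "owner": "jane.doe"})
```

The point of the sketch is simply that a drag-and-drop integration is, underneath, an event-to-action mapping like this one, defined without the user having to write the code.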

Here is a short video where I share my thoughts on the items mentioned above:

 

More Thoughts

As mentioned in the video above, Workfront's mission is to become the "Operational System of Record" for work.

I view this as another way of saying they want to become the place that aggregates the plethora of tools people use to get their jobs done. That's a popular theme these days, as several software vendors are realizing that the lack of context caused by using so many tools is a huge barrier to getting work done. Workfront has taken several smart steps toward accomplishing this goal, starting in 2015 with the acquisition of ProofHQ to provide native asset management and publishing capabilities (called Workfront DAM) and continuing with this week's announcement of Workfront Fusion. They also currently provide integrations with Outlook, Jira, Slack, Box, Adobe and others, and announced upcoming integrations with Microsoft Teams and Salesforce.

 

MyPOV

While I'm impressed with the Workfront products, I'm equally impressed with their executive team and their customers. CEO Alex Shootman has had an impressive career in enterprise software, including Eloqua, Vignette, BMC and IBM. Chief Product & Technology Officer Steven ZoBell is one of the more forward-thinking and creative people I've worked with as an industry analyst and advisor, and they've recently brought on a very experienced CMO, Heidi Melin. The overall feeling I get from Workfront, both via my direct work with them as well as watching them at their events, is one of humility and passion, not arrogance and entitlement. I was particularly impressed with Alex's opening keynote this week, as he discussed the hot topic of Digital Transformation but emphasized that the changes taking place today are not the end goal, but rather just the beginning of the next era in business.

While listening to what's new at these conferences is important, the more valuable experience is speaking with customers. Workfront has a very strong portfolio of customer references, including Disney, Prudential, Marriott, Thomson Reuters, Bristol-Myers Squibb, Black + Decker, Fender, Trek, Alaska Airlines, Columbia and many more.

The Work Coordination Platform market is just getting started. Everyone has email, file-sharing, some type of messaging, web-conferencing and a few other tools, but not every employee currently uses a tool to structure and organize work. We're at an inflection point where people are overloaded; we can't simply keep producing more content, asking people to do more work, adding more tools and having more conversations. As awareness of the benefits of these tools increases, their adoption will rise. Workfront is well positioned both in strategy and features, earning it a spot on the Constellation Research Shortlist for WCP.

What I'd like to see next from Workfront is more innovation on automation across the entire lifecycle of tasks and processes: adding machine learning/artificial intelligence to proactively prioritize tasks, find expertise and create teams, and create or curate content, along with other items that reduce the need for employees to be involved in task management at all, freeing them up to spend more time doing the actual work.

 

 

 


Progress Report - SAP SuccessFactors - More SAP (technology)

We had the opportunity to attend SAP's SuccessFactors Analyst Summit 2018, held from April 30th to May 2nd 2018 in San Francisco at the St. Regis. Attendance, with over 15 analysts, was good, and there was a good atmosphere to get a pulse on where the vendor is now, especially as it was the first analyst event under the new leadership of Tomb and the product leadership of Wilson and Harvey.

Prefer to watch? Here is my event video (if the video doesn't show up, check here; pardon the bad sound quality at times, will have to record when time allows).

Here is the one-slide condensation (if the slide doesn't show up, check here):

Want to read on? Here you go:

SuccessFactors is on the 'Enslin March'. Since SAP put the '5 sisters' (Ariba, Concur, Fieldglass, Hybris, SuccessFactors) under the leadership of Rob Enslin, it has been clear that the long-stalled technology conversion of these five businesses towards the SAP technology stack was about to happen. More specifically, it means the long-discussed but little-executed adoption of SAP HANA as the database, the adoption of SAP Cloud Platform as a PaaS, the use of SAP Analytics as the reporting and analytics platform, the use of UX elements coming from Fiori (here the SAP Cockpit) and of course SAP Leonardo for machine learning and more. SAP customers will appreciate that change in direction; non-SAP customers using the 5 sisters may be a little more nervous, but as all of it is deployed in the cloud, they should not notice at all – except for improvement. Behind the 'Enslin March' is the expectation that the '5 sisters' will be able to build more business capability – an objective of great priority to customers, who always want more capabilities in return for their SaaS subscriptions.
 
Enslin presents (photo: Holger Mueller, Constellation Research)
Focus on Mobile, with Leonardo ML and Co-Pilot. Mobile was the big headliner at SuccessConnect 2017 last year, so it is somewhat surprising that it is also the big roadmap item for 2018. SAP has built a very good iOS app together with Apple, but ML wasn't ready inside of SAP 12 months ago, and SuccessFactors is now adding ML capabilities to the mobile application. More interesting is the adoption of Co-Pilot, SAP's chatbot platform, enabling conversation-as-a-platform use cases. Most importantly, it has support for Google's speech recognition baked in (pretty much the clear leader in this capability), which is crucial to make this application use case work. Still, it feels like SuccessFactors is working around the main issue that it has: a necessary refresh of the UX and new consistency for the browser-based UX. Many vendors try to buy time for this with mobile apps and chatbots, but a lot of users still have to go back to the browser as a platform, and an otherwise nice UX falls apart. SuccessFactors needs to tackle that area to keep its (power) users happy and remain competitive.
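To make the speech-recognition point concrete, here is a minimal sketch of the transcription step using Google's public Cloud Speech-to-Text client library. It illustrates the underlying capability only; it is not SAP Co-Pilot's actual integration, and the file name, encoding and language settings are assumptions.

```python
# Minimal sketch: transcribe a short utterance with Google Cloud Speech-to-Text,
# the kind of speech-recognition step a chatbot front end relies on.
# Assumes `pip install google-cloud-speech` and valid Google Cloud credentials.
from google.cloud import speech

client = speech.SpeechClient()

# Hypothetical 16 kHz, single-channel WAV recording of a user request.
with open("utterance.wav", "rb") as audio_file:
    audio = speech.RecognitionAudio(content=audio_file.read())

config = speech.RecognitionConfig(
    encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
    sample_rate_hertz=16000,
    language_code="en-US",
)

response = client.recognize(config=config, audio=audio)
for result in response.results:
    # The top alternative is what a chatbot would hand to its intent engine.
    print(result.alternatives[0].transcript)
```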
 
Tomb talks, Harvey listens (photo: Holger Mueller, Constellation Research)

Talent Acquisition bolstered by Onboarding and CRM. The other larger automation item from SuccessConnect was the new Onboarding, and it's good to see SuccessFactors delivering this capability now. Adding Candidate Relationship Management is a good extension of the Recruiting capabilities, and if done right, recruiters using SuccessFactors will appreciate it. But both capabilities are built around an older Recruiting core, which needs an overhaul. And the overall Talent suite inside of SuccessFactors needs an overhaul, maybe a renovation if not a replacement. At the core of Talent Management is Performance Management, and SuccessFactors has done well innovating with the automation of the 1:1 meeting… but it needs to keep innovating and experimenting in this area and reflect the move to Performance Management suites (which offer enterprises all ways to do Performance Management and allow them to configure the fitting practices for the right users and organizational units).

 
Wilson presents the Job Analyzer (photo: Holger Mueller, Constellation Research)

Compliance Investment, Permit and Visa Module. One area where SAP, and now SuccessFactors, has a clear leg up on all the competition is compliance… SAP offers support for 58 countries directly as part of its payroll portfolio, and 41 additional countries are covered by partners (99 in total). Keeping payroll compliant is continuous work, and the regulatory uncertainty it has to deal with is only rising. To cope, SAP has dedicated professionals in each country who follow the regulatory changes. Apart from payroll, this allows SAP to pop out capabilities like the Permit and Visa Module. This is a very relevant capability for global enterprises that need to manage global talent, and close to impossible for enterprises to manage inhouse… they use immigration lawyers, and those can make mistakes as they decide on a case-by-case basis. Much better to address it in software, and kudos to SAP for coming out with this capability, likely as a response to its Global Mobility capabilities. The best talent somewhere in the world can't help in country X if you can't even get the individual to visit.
 

MyPOV

SuccessFactors is making progress, but the pace seems glacial. Realistically, using more SAP technology, a good decision overall, means even more 'plumbing' work by R&D resources vs. 'sun deck' work (building SaaS capabilities). But it is an investment SAP should make, with the expectation of better productivity and more functionality being shipped in a few years at the latest. And SAP needs to build a lead in capabilities, as it's the only large ERP suite vendor not having HCM integrated in a traditional way (single sign-on, same schema, same application, etc.) compared to its key competitors. SAP argues correctly that there are new and better ways of integrating in 2018 than there ever were (microservices, REST, APIs in general, etc.), but SuccessFactors itself is living proof that this is hard to achieve… I am still confident I would win the bet that one can find more save buttons differing in look, feel and location than there are SuccessFactors modules. And UX is only the tip of the iceberg. But SuccessFactors is committed and now must execute, fast. Next stop: Sapphire in a few weeks.
 



Qlik Hits Reset Button, Rolls Out New Cloud, AI & Developer Capabilities

Qlik introduces a new management team, licensing changes and new hybrid/multi-cloud, augmented intelligence and development features. Here's my take from Qonnections 2018.

“Under new management.” When you see these words in front of a local business, you take note and wonder what has really changed.

Keynoting at the April 24-26 Qlik Qonnections event in Orlando, Qlik’s new CEO, Mike Capone, after just four months on the job, cut right to the chase and spelled out some important business changes made early in his tenure.

Explaining a reorganization that took place in January, Capone said the goal was to refocus Qlik sales and support efforts on large Enterprise customers while “getting out of the way” of channel partners catering to midsize and smaller commercial customers. The reorg brought a fresh round of layoffs (on top of the layoffs carried out in 2016 when private equity firm Thoma Bravo took Qlik private). Downsizing seldom looks good (except to Wall Street), but in a message to partners, Capone noted that Qlik is reinvesting savings from the reorg to boost partner support programs, including qualified lead generation.

Qlik's new CEO, Mike Capone, was previously COO at Medidata and VP, SVP/GM and CIO, successively, at ADP.

In a message to customers, Capone promised accelerated development of advanced authoring and visualization capabilities in Qlik Sense, the vendor's newer, webbier and more self-service oriented product. And despite an "inviolable" promise to maintain support for Qlik View, the company's original product, Capone also announced new migration tools, a shared Hub interface for all Qlik apps and, most importantly, new licensing terms designed to make it easier to move from Qlik View to Qlik Sense. The licensing change will eliminate the need for a second license. Instead, customers will pay a fractional increase in maintenance (somewhere less than 50% of current fees) to add support for Qlik Sense. The Hub and licensing announcements each got big applause from Qonnections' 3,500-plus attendees.

Cloud, AI and Developer Announcements               

On product and product strategy, executives laid out Qlik’s evolved vision and capabilities on three fronts: cloud, smart capabilities (a.k.a., augmented intelligence) and developer support. Here’s the rundown:


Qlik will promote hybrid- and multi-cloud deployment in part by adding support for containerization and Linux beginning in June.

Next steps to the cloud. Qlik's cloud vision is to support hybrid- and multi-cloud deployment of Qlik applications and services with a microservices-based architecture that will eventually span and unite all deployment approaches. Qlik will take the next step to deliver on this promise in June by enabling Qlik subscription customers to move Qlik Sense applications into containerized, cloud-based instances that can run on Linux in public or private clouds. The Kubernetes, Docker and Linux support is new to Qlik, and it will enable customers to self-manage instances on Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform (GCP) or their own private clouds.

Containerization and Linux support will also benefit managed-cloud services, which are available from Qlik on the Qlik Cloud or from many partners on private or public clouds of the customer’s choice. Qlik’s two-year-old, software-as-a-service-based Qlik Sense Cloud was scarcely mentioned at Qonnections, but it remains an option for workgroup and departmental multi-tenant instances.

To better support the new range of deployment options, Qlik is introducing a Hub interface that gives users one view -- with one log-in, one user ID and one user entitlement -- of all their Qlik applications, irrespective of where they're running. What's more, Qlik's management console will enable administrators to set governance policies across all public cloud, private cloud and on-premises instances. Qlik demonstrated a role-playing scenario in which non-regulated (but still access- and security-sensitive) applications and data from a German subsidiary were exposed through a U.S.-based cloud instance. In contrast, privacy-sensitive data subject to compliance requirements in Germany was retained on premises in the demo, but it was made accessible with strict access controls.

“Smart” capabilities explained. Qlik announced an “augmented intelligence” initiative last year, stressing that it would support rather than replace human analysis. The company has been developing a Cognitive Engine that will power a number of features in Qlik Sense. An Insight Advisor feature added in April recommends the most appropriate visualization options when you select a particular field for analysis. If you don't actively choose one of these recommendations, the engine will automatically choose what it sees as the best-fit option.

Coming in June, Qlik Sense will gain a hands-off approach whereby you can simply point the Insight Advisor at a data set (rather than a particular field) and it will automatically surface all statistically significant insights. After choosing one of these starting points, users can then select the particular field of interest, as noted above, and get additional recommendations to take the analysis even deeper.

Insight Advisor is designed to speed time to analysis for untrained users, but in the spirit of augmenting human intelligence you can interact with the tool and go against its recommendations. In this case, Insight Advisor will flash textual warnings explaining why a particular approach is not recommended. During a keynote demo, a Qlik exec selected a tree map view even though the Insight Advisor recommended a different visualization. Messages appeared noting that there were negative values in the selected data set and that tree maps can’t show negative values. Professionals who don’t want or need this sort of assistance can simply turn the Insight Advisor feature off.

In next steps expected sometime later this year, Qlik plans to enable its Cognitive Engine to start learning from the selections that users choose from among its recommendations. In addition, Qlik plans to team the Cognitive Engine with its associative QIX engine, which keeps entire data sets visible for analysis even as you focus in on selected dimensions of data. If you select customers who are buying X product, for example, QIX also shows you which customers are not buying that product. It’s an advantage over drill-down, SQL analysis where you filter out information as you explore, and Qlik says the combination will recommend powerful, serendipitous insights that other technologies would fail to see.
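To illustrate the associative idea in plain terms, here is a small pandas sketch, with made-up data, that keeps both the selected set and its complement visible rather than filtering the complement away as a SQL WHERE clause would. It is a conceptual illustration only, not Qlik's QIX engine.

```python
# Conceptual illustration of associative selection: keep the excluded set visible
# instead of filtering it away. Data is made up for this sketch.
import pandas as pd

orders = pd.DataFrame({
    "customer": ["Acme", "Beta", "Acme", "Cygnus", "Delta"],
    "product":  ["X",    "Y",    "X",    "Z",      "Y"],
})

# Selection: customers who bought product X.
buyers_of_x = set(orders.loc[orders["product"] == "X", "customer"])

# Associative complement: customers in the data set who did NOT buy product X.
all_customers = set(orders["customer"])
non_buyers_of_x = all_customers - buyers_of_x

print("Bought X:     ", sorted(buyers_of_x))      # ['Acme']
print("Did not buy X:", sorted(non_buyers_of_x))  # ['Beta', 'Cygnus', 'Delta']
```

The second set is exactly what a plain drill-down query discards, and it is the part Qlik argues can hold the serendipitous insight.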

Targeting developers and embedded analytics. Perhaps the most surprising and, to me, forward-looking initiative at Qlik has been its effort to promote analytic customization, extension and development both within and outside of the confines of Qlik applications. As one Qlik executive put it, “analytics shouldn’t just be a destination [as in a report or a dashboard], it should be part of the journey.” The idea is to support embedding of analytics into other applications, including transactional applications, so you can turn insights into actions. An app could even be autonomous, quickly taking action based on insights without even requiring a human interface or interpretation.


Qlik Core, set for beta release this summer, exposes Qlik analytics capabilities as API-accessible microservices for cloud-oriented development.


To open up new possibilities, Qlik introduced Qlik Core, set for beta release this summer. Able to run on containers and Linux in the cloud, Qlik Core exposes the QIX engine through a microservices architecture and APIs. I talked to the CTO at Measur.io, a development partner and customer, who has been using and helping to guide the development of Qlik Core for more than a year. He said Qlik Core is powering the analytics behind the company’s Internet-of-Things (IoT) sensor applications and giving it all the cloud scalability and deployment flexibility the company needs.

Pricing for Qlik Core has yet to be set, but the licensing approach will combine free development with consumption-based charges for production workloads. Qlik has also open-sourced libraries of capabilities, such as data loading with Halyard.JS, Qlik backend services with Enigma.JS, and charting with Picasso.JS, to facilitate cost-effective development and deployment. Qlik's Qlik Branch developer community is promoting all of the above, and it's said to be growing quickly, with more than 27,000 registered members.
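As a sense of what "API-accessible" means here: the QIX engine exposed by Qlik Core speaks JSON-RPC over WebSockets (the protocol that enigma.js wraps for JavaScript developers). The sketch below is a minimal Python illustration that assumes a locally running engine container on ws://localhost:9076 and uses the Engine API's GetDocList call on the global handle; treat the endpoint, port and method details as assumptions to verify against Qlik's Engine API documentation.

```python
# Minimal sketch: talk to a QIX engine over its JSON-RPC/WebSocket API.
# Assumes a local engine (e.g., a Qlik Core container) listening on port 9076
# and `pip install websockets`. Method and handle values follow Qlik's published
# Engine API conventions; verify them against the official documentation.
import asyncio
import json

import websockets

async def list_apps() -> None:
    async with websockets.connect("ws://localhost:9076/app/") as ws:
        request = {
            "jsonrpc": "2.0",
            "id": 1,
            "handle": -1,          # -1 addresses the engine's Global object
            "method": "GetDocList",
            "params": [],
        }
        await ws.send(json.dumps(request))
        # The engine may send a session notification first; read until our id appears.
        while True:
            message = json.loads(await ws.recv())
            if message.get("id") == 1:
                print(message.get("result"))
                break

asyncio.run(list_apps())
```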

MyPOV on Qlik’s Progress

Overall I was impressed by Qlik’s new management team. Capone went straight to the customer hot buttons, including licensing and accelerated development. He’s also putting an emphasis on driving success through better customer outcomes, whether that’s through direct sales or channel sales. Customers that have better outcomes will naturally buy more software, so the company says it will do more to help customers succeed. Shelfware, meanwhile, is viewed as a failure and an outcome that should be avoided (and perhaps even dis-incented in sales incentive programs).

My consistent lament about Qlik, shared in my analyses from Qlik Qonnections 2016 and 2017, has been its slow pace of innovation. The criticism still applies on the cloud, AI and big data fronts, where Qlik is still delivering on capabilities initially announced 12 to 24 months ago. On cloud, for instance, the multi-cloud deployment option set for June release will initially be limited to consumption. Apps will have to be authored on-premises and then deployed for read-only access in the cloud. Cloud authoring, write-back capabilities and pull-through access to on-premises data sources are all on the roadmap. Similarly, the (late-to-the-game) Big Data Index that Qlik announced last year and demonstrated at Qonnections 2018 won't see beta release until June or July. I don't expect general availability before Q3, at the earliest.

I do like Qlik’s platform and QIX associative analysis strengths. I also like the push to enable developers to customize and extend Qlik apps and to create original apps that transcend conventional thinking about analytics. Where there’s confidence in an analysis, human interaction shouldn’t be required. That’s particularly true in emerging streaming and low-latency scenarios where there’s no time for human analysis.

My hope is that Qlik’s new management team continues to be responsive and that it will accelerate innovation. If I had to set the priorities I’d accelerate the practical stuff that customers care about today, such as hybrid-, multi-cloud and developer capabilities, while letting demand for AI capabilities build over the longer term.

Related Reading:
Domo Focuses Its Cloud-Based Analytics Message, Adds Predictive Options
MicroStrategy Makes Case for Agile Analytics on its Enterprise Platform
How Machine Learning & Artificial Intelligence Will Change BI & Analytics


Event Report - NetSuite SuiteWorld 2018 - NetSuite goes Global and Oracle

We had the opportunity to attend NetSuite's user conference SuiteWorld, held from April 23rd to 26th 2018 in Las Vegas at the Sands Convention Center. Attendance of over 7,000, and a better partner / exhibitor mix than in 2017, shows that NetSuite is doing well under Oracle leadership (my first take on the acquisition is here).

Prefer to watch? Here is my event video (if the video doesn't show up, check here).

Here is the one-slide condensation (if the slide doesn't show up, check here):

Want to read on? Here you go:

NetSuite leverages Oracle. Not surprisingly, NetSuite uses Oracle on all possible levels, starting with go-to-market, branding, events, sales force, support, the platform, localization ability and so on. That comes as no surprise, as the stand-alone NetSuite never made a real push from a global perspective… the attempt to go to English-speaking countries shows the often-seen but wrong approach to globalization, as those countries are not where the largest opportunities are… see more below. Good to see the platform leverage – NetSuite using Oracle Cloud Infrastructure (OCI) in Germany soon, NetSuite using 12c pluggable database capabilities and even mentioning it is looking at Oracle Graal – all good proof points that the Oracle technology and platform are in good shape to power NetSuite (and other SaaS vendors, if they dare).

 
NetSuite unveils the Intelligent Suite. The key announcement was Goldberg unveiling the Intelligent Suite in his keynote. NetSuite has been behind in regard to AI / Machine Learning / Assistants (notably different), so it is good to see the initiative. Who does not want their ERP software to get more intelligent? But the announcement was light on details, also because the Oracle AI story is not complete (and of course NetSuite uses Oracle for this). What benchmark customers and prospects must ask themselves is: where would the Intelligent Suite be today had, say, Google acquired NetSuite? That is relevant as a scenario, as the NetSuite competition can partner with, for example, Google for AI and Machine Learning… so NetSuite is off to a good, overdue start, but needs to show tangible benefits soon. The four scenarios selected are a solid start, but need more work and quick expansion… and lastly, an ERP conference with no virtual assistant (with a name) and no conversational demonstration is not really up to speed with the technological capabilities that are available, and that customers demand from their ERP vendors in 2018.
 


Globalization is the Focus. Good to see NetSuite moving faster on the topic, which has been a sore spot for customers and a missed opportunity for the vendor. NetSuite is committing to Brazil, China, Germany, India, Italy and Japan – all large economies with a need for an ERP suite that runs in the cloud. The good news for NetSuite and prospects is that it is not too late, but local competition and alternatives are growing faster and faster… so speed and urgency are of the essence, and NetSuite management showed that at SuiteWorld. Enterprises in the target / priority countries should take a thorough look at the offering; even with all the Oracle backing, making a North America-centric product a global product is not trivial. Also, it is time for NetSuite to unveil the next set of countries as well; customers need to know more than the first step of the globalization journey.

 
Goldberg with the investment (photo: Holger Mueller, Constellation Research)

Continued SuitePeople Momentum. One year ago NetSuite unveiled SuitePeople, its inhouse, organically built HCM suite. At SuiteWorld NetSuite showed some progress, especially on the customer traction side. Oracle owning NetSuite – contrary to some (not my) expectations – has not hurt NetSuite HCM prospects. Leveraging the pretty successful Taleo SMB suite as a Recruiting tool makes sense for NetSuite, as long as the integration is handled well. In demos (the keynote) it was one common user experience, something that is the new gold standard for integrating products (REST, declarative UX components rendered in a UX familiar to the user), so that's a good start. For the observer, though, it's another pivot, away from organically building, to leveraging a partner (now owner), Oracle. At some other time we must see how well the Taleo Recruiting capabilities stand up to 2018 best practices… but for a suite-minded NetSuite customer, Recruiting that is good enough is better than no Recruiting. The irony is that NetSuite was already there in 2013, when it announced an HCM partnership with… Oracle. It did not make much sense back then to use the Oracle large-enterprise HCM for the NetSuite SMB customer base, but it plugged a hole at the time. Overall NetSuite needs to understand that it has to stop pivoting in HCM (this is the 6th pivot! Before 2013, then Oracle HCM, then many startups, then TribeHR, then the partnership with Ultimate, then SuitePeople, now SuitePeople and a shot of Taleo) and build capabilities for a key enterprise function as its customer base moves into the contractor, contingent workforce and gig economy.


 

MyPOV

Rumors of NetSuite's demise under Oracle ownership (there were plenty) have been largely debunked and proven completely false by now. To the contrary, NetSuite now has the capital and know-how to take the product global, something customers were already doing (190+ operating countries set up in NetSuite overall) but where the vendor was not helping them much. Good to see the focus. Equally good (and no surprise) is the focus on Oracle technology. Knowing Larry Ellison, I am sure he is enjoying having two application suites in the Oracle fold that can take up new Oracle technology – the Cloud Suite and NetSuite. Ellison is a master at getting the competitive spirits going – all good news as long as the technology works and the quality of the SaaS apps remains right.
 
On the concern side, NetSuite may not be moving fast enough yet. It needs to move to 30, then 50 supported countries fast, to counter and lead vis-a-vis, for example, other large ERP vendors' SMB products (starts with an S…). And while I always like it when vendors go back and fix something (the Commerce offering), it does not make sense for NetSuite to play in that space while, for example, not having a full HCM suite, no clear CRM strategy, and some holes in SCM. So NetSuite not only needs to move fast, but also smart, which the vendor is doing on most aspects, but it needs to do more. For instance, NetSuite needs to announce the next 10 or better 20 target countries for its Globalization / Localization plans, so customers can plan their rollouts accordingly.

But overall it was a good event for NetSuite (and Oracle): the North American customer base is energized, and partners are present and investing, so it's execution time for NetSuite – to become more global, more complete as a suite, and to leverage the Oracle technology. Stay tuned.

Want to learn more? Check out the Twitter Moment below (if it doesn't show up, check here). And here is a Twitter Moment for the Day #1 keynote and here for the analyst meeting.
 

How To Close The Talent Gap With Machine Learning


  • 80% of the positions open in the U.S. alone were due to attrition. On average, it costs $5,000 to fill an open position and takes an average of two months to find a new employee. Reducing attrition removes a major impediment to any company's productivity.
  • The average employee's tenure at a cloud-based enterprise software company is 19 months; in Silicon Valley this trends down to 14 months due to intense competition for talent, according to C-level executives.
  • Eightfold.ai can quantify hiring bias and has found it occurs 35% of the time within in-person interviews and 10% during online or virtual interview sessions.
  • Adroll Group launched nurture campaigns leveraging the insights gained using Eightfold.ai for a data scientist open position and attained a 48% open rate, nearly double what they observed from other channels.
  • A leading cloud services provider has seen response rates to recruiting campaigns soar from 20% to 50% using AI-based candidate targeting in the company's community.

The essence of every company's revenue growth plan is based on how well they attract, nurture, hire, grow and challenge the best employees they can find. Often relying on manual techniques and systems decades old, companies are struggling to find the right employees to help them grow. Anyone who has hired and managed people can appreciate the upside potential of talent management today.

How AI and Machine Learning Are Revolutionizing Talent Management

Strip away the hype swirling around AI in talent management and what's left is the urgent, unmet need companies have for greater contextual intelligence and knowledge about every phase of talent management. Many CEOs are also making greater diversity and inclusion their highest priority. Using advanced AI and machine learning techniques, a company founded by former Google and Facebook AI scientists is showing potential in meeting these challenges. Founders Ashutosh Garg and Varun Kacholia have over 6,000 research citations and 80+ search and personalization patents between them. Together they founded Eightfold.ai, as Varun Kacholia, CTO and Co-Founder, puts it, "to help companies find and match the right person to the right role at the right time and, for the first time, personalize the recommendations at scale." He added that "historically, companies have not been able to recognize people's core capabilities and have unnecessarily exacerbated the talent crisis."

What makes Eightfold.ai noteworthy is that it's the first AI-based Talent Intelligence Platform that combines analysis of publicly available data, internal data repositories, Human Resource Management (HRM) systems, ATS tools and spreadsheets, and then creates ontologies based on organization-specific success criteria. Each ontology, or area of talent management interest, is customizable for further queries using the app's easily understood and navigated user interface.

Based on conversations with customers, it's clear that integration is one of the company's core strengths. Eightfold.ai relies on an API-based integration strategy to connect with legacy back-end systems. The company averages two to three system integrations per customer and supports 20 unique system integrations today, with more planned. The following diagram explains how the Eightfold Talent Intelligence Platform is constructed and how it works.

For all the sophisticated analysis, algorithms, system integration connections, and mathematics powering the Eightfold.ai platform, the company's founders have done an amazing job creating a simple, easily understood user interface. The elegant simplicity of the Eightfold.ai interface reflects the same precision of the AI and machine learning code powering this platform.

I had a chance to speak with Adroll Group and DigitalOcean regarding their experiences using Eightfold.ai. Both said being able to connect the dots between their candidate communities, diversity and inclusion goals, and end-to-end talent management objectives were important goals that the streamlined user experience was helping enable. The following is a drill-down of a candidate profile, showing the depth of external and internal data integration that provides contextual intelligence throughout the Eightfold.ai platform.

Talent Management's Inflection Point Has Arrived 

Every interaction with a candidate, current associate, and high-potential employee is a learning event for the system.

AI and machine learning make it possible to shift focus away from being transactional and toward building relationships. AdRoll Group and DigitalOcean both mentioned how Eightfold.ai's advanced analytics and machine learning help them create and fine-tune nurturing campaigns to keep candidates in high-demand fields aware of opportunities in their companies. AdRoll Group used this technique of concentrating on insights to build relationships with potential data scientists and ultimately made a hire assisted by the Eightfold.ai platform. DigitalOcean is also actively using nurturing campaigns to recruit for its most in-demand positions. "As DigitalOcean continues to experience rapid growth, it's critical we move fast to secure top talent, while taking time to nurture the phenomenal candidates already in our community," said Olivia Melman, Manager, Recruiting Operations at DigitalOcean. "Eightfold.ai's platform helps us improve operational efficiencies so we can quickly engage with high quality candidates and match past applicants to new openings."

In companies of all sizes, talent management reaches its full potential when accountability and collaboration are aligned to a common set of goals. Business strategies and new business models are created, and the specific number of hires by month and quarter is set. Accountability for results is shared between business and talent management organizations, as is the case at AdRoll Group and DigitalOcean, both of which are making solid contributions to the growth of their businesses. When accountability and collaboration are not aligned, the results are unpredictable and less than optimal.

AI makes it possible to scale personalized responses to specific candidates in a company's candidate community while defining the ideal candidate for each open position. The company's founders call this aspect of their platform personalization at scale. "Our platform takes a holistic approach to talent management by meaningfully connecting the dots between the individual and the business. At Eightfold.ai, we are going far beyond keyword and Boolean searches to help companies and employees alike make more fulfilling decisions about 'what's next,'" commented Ashutosh Garg, CEO and Co-Founder of Eightfold.ai.

Every hiring manager knows what excellence looks like in the positions they're hiring for. Recruiters gather hundreds of resumes and use their best judgment to find close matches to hiring managers' needs. Using AI and machine learning, talent management teams save hundreds of hours screening resumes manually and calibrate job requirements to the available candidates in a company's candidate community. The graphic below shows how the Talent Intelligence Platform (TIP) helps companies calibrate job descriptions. During my test drive, I found that it's as straightforward as pointing to the profile of an ideal candidate and asking TIP to find similar candidates.
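The "find candidates similar to an ideal profile" step can be pictured as nearest-neighbor search over skill representations. The sketch below is a generic, simplified illustration of that idea using scikit-learn; it is not Eightfold.ai's actual model, and the profiles and skills are invented.

```python
# Conceptual sketch: rank candidates by similarity to an "ideal" profile,
# using a bag-of-skills representation and cosine similarity.
# This is a generic illustration, not Eightfold.ai's model; data is invented.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.metrics.pairwise import cosine_similarity

candidates = {
    "cand_1": "python spark machine-learning statistics",
    "cand_2": "java spring microservices sql",
    "cand_3": "python deep-learning nlp statistics",
}
ideal_profile = "python machine-learning statistics experimentation"

vectorizer = CountVectorizer()
matrix = vectorizer.fit_transform(list(candidates.values()) + [ideal_profile])

# Last row is the ideal profile; compare every candidate against it.
scores = cosine_similarity(matrix[:-1], matrix[-1])
ranked = sorted(zip(candidates, scores.ravel()), key=lambda x: x[1], reverse=True)
for name, score in ranked:
    print(f"{name}: similarity {score:.2f}")
```

A production system would of course use far richer features (career history, inferred capabilities, outcomes data), but the calibration workflow described above reduces to this kind of ranking against a reference profile.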

Achieving Greater Equality With A Data-Driven Approach To Diversity

Eightfold.ai can quantify hiring bias and has found it occurs 35% of the time within in-person interviews and 10% during online or virtual interview sessions. They've also analyzed hiring data and found that women are 11% less likely to make it through application reviews, 19% less likely through recruiter screens, 12% less likely through assessments and a shocking 30% less likely through onsite interviews. Conscious and unconscious biases of recruiters and hiring managers often play a more dominant role than a woman's qualifications in many hiring situations. For organizations that are enthusiastically endorsing diversity programs yet struggling to make progress, AI and machine learning are helping accelerate them toward the goals they want to accomplish.

AI and machine learning can't make an impact in this area quickly enough. Imagine the lost brainpower from not having a way to evaluate candidates based on their innate skills and potential to excel in the role, and the need for far greater inclusion across the communities companies operate in. AdRoll Group's CEO is addressing this directly and has made attaining greater diversity and inclusion a top company objective for the year. "We're very deliberate in our efforts to uncover and nurture more diverse talent while also identifying individuals who have engaged with our talent brand to include them," says Daniel Doody, Global Head of Talent at AdRoll Group. He continued, "Eightfold.ai has helped us gain greater precision in our nurturing campaigns designed to bring more diverse talent to AdRoll Group globally."

Kelly O. Kay, Global Managing Partner, Software & Internet Practice at Heidrick & Struggles, agrees. "Eightfold.ai levels the playing field for diversity hiring by using pattern matching based on human behavior, which is fascinating," Mr. Kay said. He added, "I'm 100% supportive of using AI and machine learning to provide everyone equal footing in pursuing and attaining their career goals." In his view, Eightfold.ai's greatest strength is how brilliantly it takes on the challenge of removing unconscious bias from hiring decisions, further ensuring greater diversity in hiring, retention and growth decisions.

Eightfold.ai has a unique approach to presenting potential candidates to recruiters and hiring managers. It can remove any gender-specific identification of a candidate and have them evaluated purely on expertise, experience, merit and skills. The platform can also create gender-neutral job descriptions in seconds. With these advances in AI and machine learning, the long-held biases of tech companies that only want to hire from Cal-Berkeley, Stanford or MIT are being challenged when they see the quality of candidates from equally prestigious Indian, Asian and European universities. Daniel Doody of AdRoll Group says the insights gained from the Eightfold.ai platform "are helping to make managers and recruiters more aware of their own hiring biases while at the same time assisting in nurturing potential candidates via less obvious channels."

How To Close The Talent Gap

Based on conversations with customers, it's apparent that Eightfold.ai's Talent Intelligence Platform (TIP) provides enterprises the ability to accelerate time to hire, reduce the cost to hire and increase the quality of hire. Eightfold.ai customers are also seeing how TIP enables their companies to reduce employee attrition, saving on hiring and training costs and minimizing the impact of lost productivity. Today more CEOs and CFOs than ever are making diversity and talent initiatives their highest priority. Based on conversations with Eightfold.ai customers it's clear their TIP provides the needed insights for C-level executives to reach their goals.

Another aspect of the TIP that customers are just beginning to explore is how to identify employees who are the most likely to leave, and take proactive steps to align their jobs with their aspirations, extending the most valuable employees' tenure at their companies. At the same time, customers already see good results from using TIP to identify top talent that fits open positions who are likely to join them and put campaigns in place to recruit and hire them before they begin an active job search. Every Eightfold.ai customer spoken with attested to the platform's ability to help them in their strategic imperatives around talent.

An Industrial IoT Upstart Rises: Uptake Technologies

Located in an upscale and ultramodern facility in downtown Chicago, in the same Goose Island building that houses the headquarters of Groupon and a number of other up-and-coming digital startups, is a company that even many in the Internet of Things industry still haven't heard much about. The brainchild of Groupon co-founders Brad Keywell and Eric Lefkofsky in 2014, Uptake Technologies is a fast-growing Industrial Internet of Things (IIoT) player that has kept largely off the radar while focusing on building early proof points with a strong initial customer base.

Uptake's primary offering uses the technologies and methods of data science and artificial intelligence, along with existing arrays of sensors that customers already have embedded in their industrial devices, to help enterprises that rely on heavy machines (think trains, planes, mining, and manufacturing lines) get the very most out of them, while managing potential risk and downside ("machines don't have to break" is one of their key tag lines.)

Earlier this month, Uptake held its first-ever analyst summit, and I was invited to participate. Well attended by analysts, including the influential (and my fellow Enterprise Irregulars) Vinnie Mirchandani and Brian Sommer, the session was an end-to-end overview of the startup's history, aspirations, goals and progress so far. While keeping a fairly low industry profile up until now, the company has already achieved coveted unicorn status, with a $2.3 billion valuation as of its last funding round, a Series D raise of $117 million last fall. The company has taken care to build good relationships with relevant players as well. Uptake's list of strategic partners over the last few years reads like a who's who in industry, including Caterpillar, Progress Rail, and Berkshire Hathaway Energy.

Falling somewhere between asset management and asset optimization, Uptake's approach is a ground-up rethinking of using real-time streams of connected device data -- they touted several times in our sessions that they've already captured over 1.2 billion hours of operational machine data that their algorithms can use as an experience base -- by applying the very latest in data science tools and methods to help organizations monitor, manage, and maintain their fleets of highly valuable equipment. Uptake's performance-based approach makes the most sense with higher-value assets whose failure or unavailability would significantly impact an organization. So far they've largely avoided low-cost assets, but they indicated that they will likely expand their asset coverage there as they refine their capabilities and understand the needs of customers at that level. Not to mention that higher-value industrial assets represent a more profitable business model for the company, at least for now.

Uptake CEO and Co-Founder Brad Keywell and President Ganesh Bell at the Uptake Analyst Summit 2018

Our day kicked off with a session from co-founder Brad Keywell, who discussed the overall vision of the company that he incorporated just four short years ago and that now boasts 750 employees and over 50 major industrial customers. For Uptake, "it's about the efficacy of outcome. If we build the right data, we will become the leading source of outcomes," said Brad. Given the tendency of digital ecosystems to confer outsized advantage on those with control over best-in-class data sets, this is a strategic approach that will prepare them well to go up against other leaders in the space. These competitors will also be wielding their own growing historical datasets and algorithms to build out competitive advantage. In my analysis, this means that the company that offers the lowest total cost of positive outcome with the highest accuracy will tend to win over time, but the cost of entry is having enough relevant industry data. Uptake has made heavy subject matter and data capture investments in key industries, including a relatively high staff count compared to other digital startups, to deliver performance management for assets in the industry segments they believe will propel their growth.

Next up was an overview of Uptake's strategy from Ganesh Bell, who arrived at the firm in February from his influential role as Chief Digital Officer of GE Power and is also an industry colleague of mine. In his session, he made the long-term objective of the company very clear: to become the category creator and leader for something he calls "Industrial AI", the dynamic application of data science, machine learning, and sensor-based data to improve outcomes in industrial organizations. While Uptake is starting with largely predictive solutions at the moment, over time, as its cognitive capabilities increase and its historical data sets deepen, the company will be able to offer ever more strategic capabilities that reach into the realms of forecasting, prescriptive analytics, and otherwise automating planning and operations for asset-heavy enterprises.

The word "transformation" was mentioned by Uptake's leadership when it came to describing that they did for customers. For now the transformation is more of the tactial variety such as shifting industrial customers from time-based maintenance (such as every 3,000 miles) to condition-based maintenance (the sensors show that the device now requires routine service.) These in reality are significant shifts for relatively large companies to make, and it's good to see Uptake looking at immediate impact as well as a long-term AI-based roadmap, though it's also clear Brad, Ganesh, and others will need to be clearer on what that roadmap is in the coming year or two if they seek to have a full seat at strategic partner tables of their industrial customers.

Industrial Internet of Things (IIoT) Analytics and Operations Reference Platform

Figure 1: Uptake realizes nearly the entire reference architecture for an advanced IIoT analytics and ops platform

I also asked Uptake's leadership several times about customer concerns regarding the data insights they were learning from their customers' equipment, and whether there were worries expressed about data ownership and control of those insights, which they will arguably sell to other subsequent customers. While this has been a hot topic in other related industries, I was informed it had not been a major issue so far in discussions with customers, nor a headwind on sales. My view is that this will become much more important for Uptake to manage successfully in the near future as companies increasingly understand the great value they give away by not retaining full control of their industrial data.

The rest of the day included overviews from Chief Product Officer Greg Goff, Chief Information Security Officer Nicholas J. Percoco, and VP of Data Science Adam McElhinney, among others. All of them stressed the challenges of creating an advanced industrial analytics capability in remote industrial locations, taking pains to explore the deep thinking they had done to deliver on the performance, security, and product architecture requirements needed to create reliable services that accurately predict industrial events. Edge computing, especially distributed analytics on the edge, was also cited numerous times as a core capability of the Uptake platform, and it was clear that the team has done extensive homework to create an early-maturity Industrial Internet of Things (IIoT) analytics offering.

My conclusions overall on Uptake based on what I learned at the analyst day:

  • A strong founding and leadership team intent on creating customer impact more than building a high profile
  • Clear tactical vision for shifting legacy industrial asset management to an event-driven, real-time model
  • Good execution on the tech with early case studies with ROI sufficient to drive good revenue growth
  • Amount of staff in evidence to support just 50 customers in key industries is a potential growth bottleneck 
  • Strategic vision needs more development to tell a compelling longer-term story on customer journey
  • Ops data history and algorithms a differentiator, but unclear yet if unique enough to keep low cost competitors away
  • Can deliver as a network orchestrator, one of the most valuable digital strategies, as their strategic vision matures

Related Reading

Defining the Business benefit and the knowledge for your IoT/IIoT project

IoT Solution Building; Managing and Using Operational Data to change the game

IoT: Where are the Integrators? Who are the Integrators?


Cloudera Transitions, Doubles Down on Data Science, Analytics and Cloud

Cloudera has restructured amid intensifying cloud competition. Here’s what customers can expect.

Cloudera’s plan is to lead in machine learning, to disrupt in analytics and to capitalize on customer plans to move into the cloud.

It’s a solid plan, for reasons I’ll explain, but that didn’t prevent investors from punishing the company on April 3 when it offered a weaker-than-expected guidance for its next quarter. Despite reporting 50-percent growth for the fiscal year ended January 31, 2018, Cloudera’s stock price subsequently plunged 40 percent.

Cloudera’s narrative, shared at its April 9-10 analyst and influencers conference, is that it has restructured to elevate customer conversations from tech talk with the CIO to a C-suite and line-of-business sell about digital transformation. That shift, they say, could bring slower growth (albeit still double-digit) in the short term, but executives say it’s a critical transition for the long term. Investors seem spooked by the prospect of intensifying cloud competition, but here’s why Cloudera expects to keep and win enterprise-grade customers.

It Starts With the Platform

Cloudera defines itself as an enterprise platform company, and it knows enterprise customers want hybrid and multi-cloud options. Cloudera’s options now range from on-premises on bare metal to private cloud to public cloud on infrastructure as a service to, most recently, Cloudera Altus public cloud services, available on Amazon Web Services (AWS) and Microsoft Azure.

Supporting all these deployment modes is, of course, something that AWS and Google Cloud Platform (GCP) don’t do and that Microsoft, IBM, and Oracle do exclusively in their own clouds. The key differentiator that Cloudera is counting on is its Shared Data Experience. SDX gives customers the ability to define and share data access and security, data governance, data lifecycle management and deployment management and performance controls across any and all deployment modes. It’s the key to efficiently supporting both hybrid and multi-cloud deployments. Underpinning SDX is a shared data/metadata catalog that spans deployment modes and both cloud- and on-premises storage options, whether they are Cloudera HDFS or Kudu clusters or AWS S3 or Azure Data Lake object stores.

As compelling as public cloud services such as AWS Elastic MapReduce may sound, from the standpoint of simplicity, elasticity and cost, Cloudera says enterprise customers are sophisticated enough to know that harnessing their data is never as simple as using a single cloud service. In fact, the variety of services, storage and compute variations that have to be spun up, connected and orchestrated can get quite extensive. And when all those per-hour meters are running the collection of services can also get surprisingly expensive. When workloads are sizeable, steady and predictable, many enterprises have learned that it can be much more cost effective to handle it on-premises. If they like cloud flexibility, perhaps they’ll opt for a virtualized private-cloud approach rather than going back to bare metal.

With more sophisticated and cost-savvy customers in mind, Cloudera trusts that SDX will appeal on at least four counts:

  • Define once, deploy many: IT can define data access and security, data governance, data lifecycle, and performance management and service-level regimes and policies once and apply them across deployment models. All workloads share the same data under management, without having to move data or create copies and silos for separate use cases.
  • Abstract and simplify: Users get self-service access to resources without having to know anything about the underlying complexities of data access, deployment, lifecycle management and so on. Policies and controls enforce who sees what, which workloads run where and how resources are managed and assigned to balance freedom and service-level guarantees.
  • Provide elasticity with choice: With its range of deployment options, SDX gives enterprises more choice and flexibility than a cloud-only provider in terms of how they meet security, performance, governance, scalability and cost requirements.
  • Avoid lock-in: Even if the direction is solidly public cloud, SDX gives enterprises options to move workloads between public clouds and to negotiate better deals knowing they won’t have to rebuild their applications if and when they switch providers.
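How SDX exposes this to administrators is Cloudera’s implementation detail; purely to illustrate the “define once, deploy many” idea from the first bullet, here is a minimal Python sketch in which one access-and-governance policy object is applied unchanged to an on-premises cluster and a public-cloud deployment. The class and method names are invented for this example and are not Cloudera APIs.

```python
from dataclasses import dataclass, field

@dataclass
class DataPolicy:
    """One definition of access, governance and lifecycle rules."""
    name: str
    allowed_roles: set = field(default_factory=set)
    mask_pii_columns: bool = True
    retention_days: int = 365

@dataclass
class Deployment:
    """Anywhere a workload can run: bare metal, private or public cloud."""
    name: str
    storage: str                      # e.g. "hdfs", "kudu", "s3", "adls"
    policies: list = field(default_factory=list)

    def apply(self, policy: DataPolicy) -> None:
        # The same policy object is reused; nothing is redefined per cluster.
        self.policies.append(policy)

finance_policy = DataPolicy(name="finance-pii",
                            allowed_roles={"analyst", "auditor"})

on_prem = Deployment(name="datacenter-cluster", storage="hdfs")
transient = Deployment(name="aws-transient-cluster", storage="s3")

for target in (on_prem, transient):
    target.apply(finance_policy)      # define once, deploy many
```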

MyPOV on SDX

The Shared Data Experience is compelling, though at present it’s three parts reality and one part vision. The shared catalog is Hive and Hadoop centric, so Cloudera is exploring ways to extend the scope of the catalog and the data hub. Altus services are generally available for data engineering, but only recently entered beta (on AWS) for analytics deployments and for persisting and managing SDX in the cloud. General availability of Cloudera Analytics and SDX services on Azure is expected later this year. Altus Data Science is on the roadmap, as are productized ways to deploy Altus services in private clouds. For now, private cloud deployments are entirely up to customers to manage. In short, the all-options-covered rhetoric is a bit ahead of reality, but the direction is clear.

Machine Learning, Analytics and Cloud

Cloudera is counting on these three growth areas, so much so that it last year appointed general managers of each domain and reorganized with dedicated product development, product management, sales and profit-and-loss responsibility. At Cloudera's analyst and influencers conference, attendees heard presentations by each of the new GMs: Fast Forward Labs founder Hilary Mason on ML, Xplain.io co-founder Anupam Singh on analytics, and Oracle and VMware veteran Vikram Makhija on Cloud.

Lead in Machine Learning. The machine learning strategy is to help customers develop and own their ability to harness ML, deep learning and advanced analytical methods. They are “teaching customers how to fish” using all of their data, algorithms of their choice and running workloads in the deployment mode of their choice. (This is exactly the kind of support executives wanted at a global bank based in Denmark, as you can read in my recent “Danske Bank Fights Fraud with Machine Learning and AI” case study report.)

Cloudera last year acquired Mason’s research and consulting firm Fast Forward Labs with an eye toward helping customers to overcome uncertainty on where and how to apply ML methods. The Fast Forward team offers applied research (meaning practical, rather than academic), strategic advice and feasibility studies designed to help enterprises figure out whether they’re pursuing the right problems, setting realistic goals, and gathering the right data.

On the technology side, Cloudera’s ML strategy rests on the combination of SDX and the Cloudera Data Science Workbench (CDSW). SDX addresses the IT concerns from a deployment, security and governance perspective while CDSW helps data scientists access data and manage workloads in self-service fashion, coding in R, Python or Scala and using analytical, ML and DL libraries of their choice.
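As a rough illustration of the workbench pattern described above, rather than of CDSW’s actual interface, the sketch below shows the kind of self-service session a data scientist might run: pull data through Spark SQL, then fit a model with a Python library of their choosing. The table, columns and model choice are all hypothetical.

```python
# Illustrative workbench-style session: Spark for governed data access,
# scikit-learn for modeling. Table name and schema are hypothetical.
from pyspark.sql import SparkSession
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

spark = SparkSession.builder.appName("churn-exploration").getOrCreate()

# Pull a curated feature table down to pandas for local experimentation.
df = spark.sql("""
    SELECT tenure_months, support_tickets, monthly_spend, churned
    FROM customer_features
""").toPandas()

X = df[["tenure_months", "support_tickets", "monthly_spend"]]
y = df["churned"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Holdout accuracy: {model.score(X_test, y_test):.2f}")
```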

MyPOV on Cloudera ML. Here, too, it’s a solid vision with pieces and parts that have yet to be delivered. As mentioned earlier, Altus Data Science is on the roadmap (not even in beta), as are private-cloud and Kubernetes support. Also on the roadmap are model-management and automation capabilities that enterprises need at every stage of the model development and deployment lifecycle as they scale up their modeling work. Here’s where Azure Machine Learning and AWS SageMaker, to name two, are a step ahead.

I do like that Cloudera opens the door to any framework and draws the line at data-scientist coding with CDSW, leaving visual, analyst-level data science work to best-of-breed partners such as Dataiku, DataRobot, H2O and RapidMiner.

Disrupt in Analytics. It was eye-opening to learn that Cloudera gets the lion’s share of its revenue from analytics -- more than $100 million out of the company’s fiscal year 2018 total of $367 million in revenue. One might think of Cloudera as being mostly about big, unstructured data. In fact, it’s heavily about disrupting the data warehousing status quo and enabling new, SQL-centric applications with the combination of the Impala query engine, the Kudu table store (for streaming and low-latency applications) and Hive on Apache Spark.

Cloudera analytics execs say they’re having a field day optimizing data warehouses and consolidating dedicated data marts (on Netezza and other aging platforms) now seen as expensive silos requiring redundant infrastructure and copies of data. With management, security, governance and access controls and policies established once in SDX, Cloudera says IT can support myriad analytical applications without moving or copying data. That data might span AWS S3 buckets, Azure Data Lakes, HDFS, Kudu or all of the above.
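To ground what “SQL-centric applications” look like in practice, here is a minimal example of querying an Impala-served table, which could be backed by Kudu, HDFS or cloud object storage, from Python using the open-source impyla client. Host, port and table names are placeholders for illustration.

```python
# Minimal Impala query via the open-source impyla client (DB-API style).
# Host, port and table names are placeholders, not a real environment.
from impala.dbapi import connect

conn = connect(host="impala-coordinator.example.com", port=21050)
try:
    cur = conn.cursor()
    cur.execute("""
        SELECT region, SUM(order_amount) AS revenue
        FROM sales_facts            -- could live in Kudu, HDFS or S3
        WHERE order_date >= '2018-01-01'
        GROUP BY region
        ORDER BY revenue DESC
    """)
    for region, revenue in cur.fetchall():
        print(f"{region}: {revenue:,.2f}")
finally:
    conn.close()
```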

The big news in analytics is that Cloudera is pushing to give DBA types all the performance-tuning and cost-based analysis options they’re used to having in data warehousing environments. Cloudera already offers its Analytic Workbench (also known as HUE) for SQL query editing. What’s coming, by mid-year, is a consolidated performance-analysis and recommendation environment. Code-named Workload 360 for now, this suite will provide end-to-end guidance on migrating, optimizing and scaling workloads. To be delivered as a cloud service, it combines Navigator Optimizer (tools acquired with Xplain.io) with the workload analytics capabilities introduced with Altus. Think of it as a brain for data warehousing that will help companies streamline migrations, meet SLAs, fix lagging queries and proactively avoid application failures.
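Workload 360 wasn’t demoed in depth, so purely to make “workload analytics” concrete, here is a generic Python sketch of the kind of check such a service automates: scanning query history for statements that breach an SLA or have regressed against their own baseline. The data, thresholds and rules are hypothetical and do not represent Cloudera’s tooling.

```python
# Generic workload-analysis sketch: flag queries that breach an SLA
# or have regressed versus their historical runtime. Data is hypothetical.
from statistics import mean

query_history = [
    # (query_id, runtimes in seconds, oldest to newest)
    ("q_daily_revenue",   [42, 45, 44, 118]),
    ("q_churn_features",  [300, 310, 295, 305]),
    ("q_inventory_delta", [12, 11, 13, 12]),
]

SLA_SECONDS = 120
REGRESSION_FACTOR = 2.0   # latest run at least 2x the historical average

for query_id, runtimes in query_history:
    *history, latest = runtimes
    baseline = mean(history)
    if latest > SLA_SECONDS:
        print(f"{query_id}: latest run {latest}s breaches the {SLA_SECONDS}s SLA")
    if latest >= REGRESSION_FACTOR * baseline:
        print(f"{query_id}: regressed to {latest}s vs. ~{baseline:.0f}s baseline")
```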

MyPOV on Analytics. Workload management tools are a must for heavy-duty data warehousing environments, so this analysis-for-performance push is a good thing. Given the recent push into autonomous database management, notably by Oracle, I would have liked to hear more about plans for workload automation.

Cloudera also didn’t have much to say about the role of Hive and Spark for analytical and streaming workloads, but I suspect they are significant. I’ve also talked to Cloudera customers (read “Ultra Mobile Takes an Affordable Approach to Agile Analytics”) that tap excess relational database capacity to support low-latency querying rather than relying on Impala, Hive or a separate Kudu cluster. Hive, Spark and conventional database services or capacity fall into the category of practical, cost-conscious options that may not drive additional Cloudera analytics revenue, but Cloudera’s open platform gives customers plenty of options.

Capitalize on the Cloud. As noted above, SDX and the growing Altus portfolio are at the heart of Cloudera’s cloud plans. Enough said about the pieces still to come or missing. I see SDX as compelling, and it’s already helping customers efficiently run myriad data engineering and analytics workloads in hybrid scenarios. But as a practical matter, many companies aren’t that sophisticated and are choosing to keep things simple with binary choices: X data and use case on-premises and Y data and use case in the cloud. Indeed, one of Cloudera’s customer panel guests acknowledged the importance of avoiding cloud lock-in; nonetheless, he said his firm is weighing the “simplicity” versus data/application portability tradeoffs of using Google Cloud Platform-native services.

MyPOV on Cloudera Cloud. Binary thinking is not the way to harness the power of all your data, and it can lead to overlaps, redundancies and the need to move and copy data. Nonetheless, handling X on-premises and Y in the cloud may be seen as the simpler and more obvious way to go, particularly if there are natural application, security or organizational boundaries. Cloudera has to execute on its cloud vision, develop a robust automation strategy and demonstrate to enterprises, with plenty of customer examples, that the SDX way is a simpler, more cost-effective way to go and a better driver of innovation than binary thinking.

Related Reading:
Nvidia Accelerates AI, Analytics with an Ecosystem Approach
Danske Bank Fights Fraud With Machine Learning and AI
Ultra Mobile Takes an Affordable Approach to Agile Analytics

Data to Decisions Tech Optimization Chief Customer Officer Chief Information Officer Chief Marketing Officer Chief Digital Officer

Adobe Acquires Sayspring to Bring Voice Interaction to their AI Platform

Adobe has announced the acquisition of Sayspring, makers of a natural language platform for interacting with devices like Amazon Echo and Google Home/Assistant. 

MyPOV: People are becoming accustomed to using their voice to interact with devices like their phones, tablets and ambient speakers (Echo, Home, etc.); soon we will see a similar level of comfort with interacting with our business application software. Using voice commands is a very quick and natural way to find and create content, automate tasks, or look up people and information. It will be interesting to see how Adobe enhances its Sensei platform, which provides AI features across its Document, Creative and Experience Cloud offerings, using Sayspring's existing assets, and even more so how it leverages Sayspring's talented team to build new voice interfaces directly into Adobe software.
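Sayspring's tooling is design-oriented and Adobe hasn't detailed how it will fold into Sensei, but the general pattern behind voice-driven software is intent routing: map a recognized utterance to an application action. The Python sketch below is a generic, hypothetical illustration of that pattern and does not represent Sayspring or Adobe APIs.

```python
# Generic intent-routing sketch for a voice-enabled application.
# Hypothetical example only; not a Sayspring or Adobe Sensei API.
import re
from typing import Callable, Optional

INTENTS = []  # list of (compiled pattern, handler) pairs

def intent(pattern: str):
    """Register a handler for utterances matching `pattern`."""
    def register(handler: Callable[[re.Match], str]):
        INTENTS.append((re.compile(pattern, re.IGNORECASE), handler))
        return handler
    return register

@intent(r"find (?:the )?(?P<doc>.+) document")
def find_document(match: re.Match) -> str:
    return f"Searching documents for '{match.group('doc')}'..."

@intent(r"create a new (?P<asset>.+)")
def create_asset(match: re.Match) -> str:
    return f"Creating a new {match.group('asset')}."

def handle_utterance(text: str) -> Optional[str]:
    """Route a transcribed utterance to the first matching handler."""
    for pattern, handler in INTENTS:
        match = pattern.search(text)
        if match:
            return handler(match)
    return None

print(handle_utterance("Find the Q2 budget document"))
print(handle_utterance("Create a new social media banner"))
```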

 

Future of Work

Event Report - Globoforce Workhuman 2018

 

   
Want to read on? Here you go:
 
Outstanding Speaker Lineup - WorkHuman stands out on the conference circuit with an exceptional speaker lineup. The pre-conference alone could easily have been the full external speaker lineup for any other well-funded vendor conference. And in the main conference, name another event that features Salma Hayek, Amal Clooney and Ashley Judd ('relegated' to panelist on a MeToo panel) within 24 hours. It is by design: Globoforce wants the WorkHuman conference to be not about product, but about thought leadership, inspiration and purpose. 
 
The Globoforce WorkHuman Cloud
 
 
Globoforce launches WorkHuman Cloud - Amid the great speaker lineup, it was a challenge to draw attention to what really mattered to the user community: the launch of a new Globoforce product, not surprisingly called the WorkHuman Cloud. It's a suite of five reward / recognition - or, if you want, performance management - capabilities built on a single platform. All the usual suite benefits apply: single sign-on, UI consistency (some work left), a common foundation and so on. It's an almost overdue move by Globoforce, which has been a multi-module / multi-product vendor for quite some time. Adoption is not much of a concern; in true SaaS vendor style, existing customers are already 'on' the platform with their existing products. 
 
Globoforce WorkHuman Employee Dashboard
 
 
Impressive Customer Stories - WorkHuman also stands out by giving more stage time to customers than the average conference. Nothing is more powerful than having customers share how they implemented a product and how it helped them create benefits and favorable outcomes. Many impressive, educational and sometimes even inspirational success stories were shared at WorkHuman - no surprise, as we know that a well-implemented rewards and recognition system can have a substantial positive impact on the performance of an enterprise.  
 
Globoforce WorkHuman My Life Events
 
 

MyPOV

 
WorkHuman stands out as a remarkably different conference. Customers and prospects of Globoforce clearly enjoy the format and vote with increasing attendance. The success points to a lack of vendor-independent, big-picture (work human!) events that serve the HR community - clearly something that user group conferences should do, but clearly are not doing. It is good to see the product progress by Globoforce, which has improved the user experience and, most importantly, created a suite of products, the next milestone in the maturation of any software vendor. 
 
On the concern side, Globoforce should be a little concerned about how to top this conference in 2019. I would be. And it was remarkable, and somewhat surprising, how hard it was for the audience to pay attention to the product updates during the keynote, which were ... substantial. Too much motivation and inspiration dulls the product message, and at the end of the day, users attend user conferences to learn ... about the product. Inspirational speakers are great, but they are not the argument for implementing, upgrading or purchasing a product - the argument needed to convince the rest of the enterprise to invest further in any software product.
 
But for now, Globoforce has set up one of the best HR "un-conferences" on the circuit, probably the best for a vendor. As with anything, success comes with repercussions... and I can't wait to see how WorkHuman 2019 will shape up. Stay tuned.  
 
 
Also - check out a Twitter Moment of IBM Think 2018 here
 
 
 
 
Future of Work Innovation & Product-led Growth Tech Optimization Data to Decisions Next-Generation Customer Experience New C-Suite Sales Marketing Digital Safety, Privacy & Cybersecurity AI Analytics Automation CX EX Employee Experience HCM Machine Learning ML SaaS PaaS Cloud Digital Transformation Enterprise Software Enterprise IT Leadership HR Chief People Officer Chief Customer Officer Chief Human Resources Officer