Results

Connected Enterprise 2016 - Inside The Future Of Blockchain Technologies



Blockchain's journey to world domination has just begun. At the moment, several security models, permission models, and private/public approaches are competing for dominance. Hear from senior executives about how they see blockchain transforming the future of business and technology and why certain approaches will win.

Moderator: Steve Wilson
Global Head of Blockchain at IBM: Aron Dutta
Recovering Product Executive: Chirag Mehta
CEO at Maxssure: Shawn Wiora
Chief Digital Officer at IMS Health: Richie Etwaru

Video: Visionaries - Inside The Future Of Blockchain Technologies (https://player.vimeo.com/video/192524252)

Connected Enterprise 2016 - Moving Beyond The Hype Of Blockchain



Blockchain technologies were popularized in 2016 due to their role as the ledger technology underlying Bitcoin. However, distributed ledger technologies such as blockchain have many applications aside from Bitcoin, and have the potential to underpin technology platforms in the future. Learn about emerging applications for blockchain tech.

Moderator: Steve Wilson
Chief Digital Officer at IMS Health: Richie Etwaru
Chief Innovation Officer at Cybric: Mike Kail
Independent Blogger: Chirag Mehta

Video: Executive Exchange - Moving Beyond The Hype Of Blockchain (https://player.vimeo.com/video/192524153)

Connected Enterprise 2016 - Lessons Learned From Digital Leaders



Digital leaders face a number of challenges including defining new roles for themselves, evangelizing digital business models, and delivering value. Learn how these seasoned digital leaders have navigated the challenges of the digital economy and succeeded in driving transformation in their organizations.

Moderator: R "Ray" Wang
Executive Vice President at Ecolab: Christophe Beck
Global Head of Digital at Philips: Blake Cahill
Executive Vice President and Chief Technology Officer at Houghton Mifflin Harcourt: Brook Colangelo
Chief Marketing Officer at United Rentals: Chris Hummel

Video: Executive Exchange - Lessons Learned From Digital Leaders (https://player.vimeo.com/video/192523719)

Google enters enterprise software space with Google Jobs API


Hidden in last week's broad update to Google's vast machine learning portfolio (see the blog here) was an unusual offering for a vendor like Google: a Jobs API. It is clearly a foray into the enterprise application space, into HCM in general and recruiting in particular.

 
 
There is one fundamental problem at the heart of every job search and every recruiting drive: the search for jobs and the search for candidates. Google tries to address the first part, with an improvement in the search for jobs. The API has received good validation and is being tested by established recruiting vendors CareerBuilder, Dice and Jibe.
 
It is not surprising that search giant Google picked this challenge, and equally unsurprising that it all starts with ontologies. Google uses two of them: a three-tiered occupation ontology (based on the O*Net Standard Occupational Classification) and a skill ontology of over 50k hard and soft skills (for how important these are, see e.g. here - Workday acquired Identified). The magic then lies in mapping the two to each other, which Google does in a third step.
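To make the two-ontology idea concrete, here is a hypothetical Python sketch (my illustration, not Google's actual data model or API): occupations live in a tiered taxonomy, skills in a flat ontology, and a mapping table joins the two so a set of skills can be matched to occupations.

```python
# Hypothetical sketch of the two-ontology approach. All names and
# data are illustrative only, not Google's actual model or API.

# Occupation ontology: three tiers, leaf occupations at the bottom.
occupation_tree = {
    "Computing": {
        "Software Development": ["Backend Engineer", "Mobile Developer"],
    },
}

# Skill ontology: a flat set of hard and soft skills (50k+ in reality).
skill_ontology = {"Java", "Kotlin", "SQL", "Communication"}

# The third, "magic" step: a mapping of leaf occupations to skills.
occupation_skills = {
    "Backend Engineer": {"Java", "SQL"},
    "Mobile Developer": {"Java", "Kotlin"},
}

def match_occupations(candidate_skills):
    """Rank occupations by skill overlap with a candidate's skills."""
    scores = {
        occ: len(required & candidate_skills)
        for occ, required in occupation_skills.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

print(match_occupations({"Java", "Kotlin"}))  # ['Mobile Developer', 'Backend Engineer']
```

In practice the mapping is of course learned at scale rather than hand-maintained, which is where the machine learning of the next paragraph comes in.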
 
A visualization of the occupation ontology - from Google's web site - here
It is in this third step that machine learning really unfolds its power. Google states that its vector space for job titles has 100k dimensions, based on an analysis of 17M job posts. That is beyond human grasp, and likely also beyond any project-based modeling by a single data scientist or even a team of them.
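The dimensionality point is easiest to grasp with vector similarity: job titles become vectors, and nearby vectors mean related titles. A toy sketch with made-up, three-dimensional vectors (Google's real space has ~100k dimensions):

```python
import math

# Toy job-title vectors. Google's space has ~100k dimensions learned
# from 17M job posts; these 3-dimensional vectors are pure invention.
title_vectors = {
    "software engineer":  [0.90, 0.10, 0.00],
    "software developer": [0.85, 0.15, 0.05],
    "nurse":              [0.00, 0.20, 0.95],
}

def cosine(a, b):
    """Cosine similarity: near 1.0 for similar directions, near 0 for unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Related titles land close together, unrelated ones far apart.
print(cosine(title_vectors["software engineer"], title_vectors["software developer"]))
print(cosine(title_vectors["software engineer"], title_vectors["nurse"]))
```

The same arithmetic works in 100k dimensions; what changes is that learning good vectors at that scale requires the kind of data and infrastructure few vendors besides Google have.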
 
MyPOV
 
Good to see Google applying its expertise and moving it toward enterprise applications. This is one more offering that helps Google move the conversation toward the enterprise, something the vendor desperately wants to foster, grow and ultimately monetize. Good to see uptake and pilots by CareerBuilder, Dice and Jibe. Ultimately it is also another avenue to bring more load onto Google Cloud, something the vendor is equally interested in. Finally, the API can likely be applied to the reverse search as well - finding the right candidate - but for that we will have to wait for Google. In that case some privacy questions will loom...
 
On the concern side, it is one more component that may break in a Talent Acquisition system. But Google can probably scale more and better than the data science teams at relatively smaller vendors, ultimately giving the overall system more stability.
 
Finally, it is an inflection point. Google probably has a lead in overall machine learning, from both an algorithm and a platform perspective (with TensorFlow and GPU architectures), but it had not forayed into enterprise software until now. Congrats on the first step, and we will be watching.

More about Google:
  • Event Report - Google I/O 2016 - Android N soon, Google assistant sooner and VR / AR later - read here
  • First Take - Google Google I/O 2016 - Day #1 Keynote - Enterprise Takeaways - read here
  • Event Preview - Google's Google I/O 2016 - read here
  • Event Report – Google Google Cloud Platform Next – Key Offerings for (some of) the enterprise - read here
  • First Take - Google Cloud Platform - Takeaways Day #1 Keynote - read here
  • News Analysis - Google launches Cloud Dataproc - read here
  • Musings - Google re-organizes - will it be about Alpha or Alphabet Soup? Read here
  • Event Report - Google I/O - Google wants developers to first & foremost build more Android apps - read here
  • First Take - Google I/O Day #1 Keynote - it is all about Android - read here
  • News Analysis - Google does it again (lower prices for Google Cloud Platform), enterprises take notice - read here
  • News Analysis - Google I/O Takeaways Value Propositions for the enterprise - read here 
  • Google gets serious about the cloud and it is different - read here
  • A tale of two clouds - Google and HP - read here
  • Why Google acquired Talaria - efficiency matters - read here
 
Find more coverage on the Constellation Research website here and check out my magazine on Flipboard and my YouTube channel here

What is your Twitter Persona?


A blog I have meant to write for a long time, being a heavy user of Twitter... on how reputation and follower dynamics on Twitter work out, from my vantage point... so let's muse about this...

One of the unique things about Twitter (e.g. compared to Facebook or LinkedIn) is that it shows a user's follower / following numbers prominently and, more importantly, gates how many additional users you can follow based on how many follow you back. If memory does not fool me, Twitter installed these limits to avoid abuse and spam. E.g. there is / was a gate at 2,000 users followed: you could only follow more people once your own follower count passed 2,000. So Twitter strives for balanced users - ultimately.
 
With this as background, when looking at a Twitter user, there are four 'personas' - you can pick them out by the ratio of followers vs. following:
 
  • The Twitter Newbie - characterized by a substantial lead of users followed vs. users following back. I think every Twitter user starts like this... at the end of the day you need to follow other users to get going. The first users someone followed are an interesting insight for anyone who wants to do social analysis of a Twitter account.
     
  • The likely Spammer - characterized by a 10x+ ratio of following vs. followers. These accounts are the dark side of Twitter, often not human... but they play a role, as you will see later.
     
  • The Hoarder - characterized by a 5x+ ratio of followers vs. following. These are the Twitter users who have 'made it' in the classical sense of reputation management, with other users following them but them not following back. At some point - e.g. for news organizations and big celebrities - it is impossible to follow back... and that creates an imbalance in the Twitter equilibrium.
     
  • The Social User - characterized by an almost even count of followers vs. following. The difference comes down to human error, and often to the spammers who aren't followed back.
      
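For what it's worth, the four personas above can be expressed as a tiny classification rule over the follower / following counts. This is a sketch of my rules of thumb; the exact thresholds are of course debatable:

```python
def twitter_persona(followers, following):
    """Classify a Twitter account into one of the four personas,
    using the follower / following ratio. Thresholds are the rules
    of thumb from the text, not an official taxonomy."""
    if following >= 10 * max(followers, 1):
        return "Likely Spammer"   # follows 10x+ more than follow back
    if followers >= 5 * max(following, 1):
        return "Hoarder"          # followed 5x+ more than follows back
    if following > followers:
        return "Newbie"           # still building an audience
    return "Social User"          # roughly balanced

print(twitter_persona(followers=100, following=2500))   # Likely Spammer
print(twitter_persona(followers=50000, following=300))  # Hoarder
print(twitter_persona(followers=40, following=150))     # Newbie
print(twitter_persona(followers=9500, following=9000))  # Social User
```

Note the order of the checks matters: a Spammer also follows more than it is followed, so the 10x test has to fire before the Newbie test.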

MyPOV

Twitter is practically a zero-sum game for all users from a follower / following perspective... Imbalanced follower / following accounts trigger their inverse in one or across multiple accounts, as users try to get over the gated hurdles Twitter has set up... so the Hoarder creates the opportunity for the Spammer... In my view Twitter would do better if it motivated more balanced follower / following ratios. Some of the problems are self-inflicted, as Twitter does not do a good job of helping, assisting and teaching users to manage the Home feed efficiently.
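The zero-sum point can be verified mechanically: every follow edge adds one to someone's following count and one to someone else's follower count, so the network-wide totals always match. A toy check with made-up accounts:

```python
from collections import Counter

# A toy follow graph: (follower, followed) edges. Every edge raises
# one account's "following" count and another's "followers" count,
# so the network-wide totals are always equal: a zero-sum game.
follows = [("alice", "bob"), ("alice", "carol"), ("bob", "carol"),
           ("spambot", "alice"), ("spambot", "bob"), ("spambot", "carol")]

following = Counter(src for src, _ in follows)
followers = Counter(dst for _, dst in follows)

assert sum(following.values()) == sum(followers.values()) == len(follows)

# carol hoards: 3 followers, follows nobody; spambot is the inverse.
print(followers["carol"], following["carol"])    # 3 0
print(followers["spambot"], following["spambot"])  # 0 3
```

In other words, every Hoarder's surplus has to be balanced somewhere else in the network, and the Spammer is often what fills the gap.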
 
As for me, I have stayed in the Social User category... my following count is less than 10% below my follower count, and I routinely go over my followers and follow back based on a few criteria... check my tweets.
 
What is your Twitter persona?
 

 


The Polarization Epidemic


Terrific NYT infographic -- the colored counties are those whose presidential votes were "landslides", i.e. the winner finished 20 percentage points or more above the loser. The maps look like Hot Zone or any of the virus outbreak movies -- polarization engulfs the entire population over time. More data on the site about the characteristics of the red and blue counties' populations.

Counties that voted for the Republican or Democratic presidential candidate by 20 percentage points or more

If you're interested in the causes of this trend (ok, *seriously* interested), this 1971 paper by game theorist Thomas Schelling outlines the many mechanisms that make it almost inevitable. The paper is tough going, but the first couple of pages make clear that even without gerrymandering or malign intent, normal human behavior will lead in this direction unless there's some countervailing force.
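Schelling's core mechanism can be reproduced in a few lines. Below is a minimal one-dimensional sketch (my own toy version, not the paper's exact model): agents of two types are content even in a mixed neighborhood, but relocate if too few nearby agents match them, and that mild preference alone drives clustering.

```python
import random

def happy(line, i, tolerance):
    """An agent is content if at least `tolerance` of its neighbors
    (within distance 2 on the line) are of its own type."""
    nbrs = [line[j] for j in range(max(0, i - 2), min(len(line), i + 3)) if j != i]
    return sum(n == line[i] for n in nbrs) / len(nbrs) >= tolerance

def schelling_1d(line, tolerance=0.34, rounds=500, seed=7):
    """Minimal 1-D Schelling dynamic: each round, one discontented
    agent relocates to a random position where it would be content."""
    rng = random.Random(seed)
    line = list(line)
    for _ in range(rounds):
        unhappy = [i for i in range(len(line)) if not happy(line, i, tolerance)]
        if not unhappy:
            break
        i = rng.choice(unhappy)
        agent = line.pop(i)
        spots = [k for k in range(len(line) + 1)
                 if happy(line[:k] + [agent] + line[k:], k, tolerance)]
        line.insert(rng.choice(spots) if spots else i, agent)
    return line

def clustering(line):
    """Share of adjacent pairs with matching types (~0.5 when mixed)."""
    return sum(a == b for a, b in zip(line, line[1:])) / (len(line) - 1)

rng0 = random.Random(0)
start = [rng0.choice("XO") for _ in range(60)]
end = schelling_1d(start)
print(round(clustering(start), 2), "->", round(clustering(end), 2))
```

Even with a tolerance of only one-third like neighbors (a preference far short of wanting a majority), the relocations tend to sort the line into same-type runs, which is exactly Schelling's "almost inevitable" point.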

What might that be?  If France, which puts enormous resources into teaching children what it means to be French, is supporting Le Pen, it’s hard to imagine what it will take to make the US one country. Your thoughts? --CAM


Adobe Acrobat Joins the Market of Mobile Scanning Tools


We’ve been talking about living in a digital world for decades, but most of us are still surrounded by paper: business cards, sticky notes, to-do lists, whiteboards, photos, magazines and other physical forms of content. To help reduce that clutter by transforming content into secure, searchable and signable PDF documents, Adobe has updated its Acrobat Reader mobile application to include scanning functionality.

Adobe Acrobat mobile scanning a document via tablet


MyPOV:

Adding scanning capabilities to Acrobat Reader makes a lot of sense. PDF is already one of the most used file formats, and Acrobat Reader is installed on hundreds of millions of mobile devices, so it’s a natural fit to empower people to easily create PDFs via their phone or tablet's camera. There are some very useful features, such as reordering pages, rotating, annotating with notes, highlighters, pens and text, and adding your signature... all of which are free. In the future more advanced functionality will be available, such as OCR for searching, but this will most likely require an upgrade to full Adobe Document Cloud capabilities. This strategy means Adobe Acrobat Reader’s new scanning features could help increase interest in the full Document Cloud service.

Adobe faces a great deal of competition in the portable scanning market, from products like Box Capture, Dropbox Scan, Evernote Scannable, Microsoft Office Lens, Google Drive Scan and the new Google PhotoScan. This blog post is not intended to provide a full competitive review of each product, but I can say that while the core capabilities of each are similar and useful in many common use cases, they do vary in areas such as parallax correction (for documents, whiteboards, large screens, etc.), OCR and search, digital signatures, annotations and markup, business card scanning, export options, document assembly and organization, sharing, and more. Customers should choose the tool that fits best into their existing environment and workflows, making mobile scanning as seamless as possible for their employees.

 


Microsoft Connect 2016 - Linux, Google and more


We attended Microsoft’s Connect event in New York, held on November 16th 2016.
 
 

So take a look at my musings on the event here: (if the video doesn’t show up, check here)
 
 

No time to watch – here is the 1-2 slide condensation (if the slide doesn’t show up, check here):
 
 
Want to read on? Here you go: Always tough to pick the takeaways – but here are my Top 3:

Microsoft joins the Linux Foundation – For a long time Linux was the arch enemy of Windows, but the realities of open source innovation and the new Microsoft make possible what was unthinkable a few years ago. One of the first steps toward Azure running non-Microsoft technology came with another ‘surprising’ announcement at the time, in 2013: Azure running the Oracle database (read http://enswmu.blogspot.com/2013/06/how-cloud-can-make-unlikeliest-bed.html). Since then Microsoft has run more and more Azure loads on Linux, and being part of the community at an influential and exposed level was an almost overdue move. For enterprises, it means that if there were any reservations toward Linux in the past, it is time to revisit them.

Google joins the .Net Foundation – Google has been running .Net for some time, so similar to the previous takeaway, it makes sense to influence the community and be a prominent part of it. Again, a new frenemy relationship, as Microsoft and Google have tussled with each other over mobile, social and browsers. Microsoft gets validation of .Net being an independent platform, Google can attract .Net load for Google Cloud, and most importantly enterprises running .Net apps get more choice of where to deploy those apps. So a win / win / win for all participants. Ironically, both Microsoft and Google will remain strong believers that their respective IaaS is the better one. The future will tell.

Coding gets (even) easier with Visual Studio 2017 – Microsoft keeps pushing the productivity limits of the developer experience further out. The IDE gets more productive, Roslyn constantly monitors what the developer types, DevOps gets easier, and so on. Good to see more team capabilities come to Team Foundation Server 2017. Visual Studio Mobile gets all the good Xamarin capabilities for deployment and testing. And Visual Studio now runs – no surprise anymore – natively on a Mac.

Tidbits

  • SQL Server 2016 SP1 – Important housekeeping for Microsoft – as it brings the programming model together across SQL Server editions.
     
  • Azure Data Lake Store – A data lake on Azure, easy to access with e.g. on premise Active Directory and easy to expand to other data sources… it is HDFS compatible – but not native HDFS.
     
  • Azure Data Lake Analytics and Store – Microsoft’s cloud analytics service for high-volume data processing and transactions is now GA. Support for U-SQL, R, Python and .Net is interesting, but it is an Azure / Microsoft specific platform. To be fair, so are all competitor products.
     
  • Microsoft launched Teams recently (see here) – and of course developers can build on it, together with the just-turned-one Microsoft Graph. It looks like the bot development framework sits with Teams for now, too, which makes sense as Microsoft tries to make Teams its chat platform.
     
  • Azure Progress – Some interesting information from Scott Guthrie on Azure progress, e.g. that Microsoft has operationalized more datacenter capacity in the last 9 months than in its entire history before, and that capacity will double again in the next 12 months. Check the Storify for more interesting pieces of information. 
 

    MyPOV

    A lot of progress from Microsoft, almost more than at Build, but product and event cycles often don’t align. It is good to see that developer productivity remains a top priority for Microsoft, and with that the vendor helps enterprises build next generation applications. It is good to see Microsoft also embrace the reality of Linux by becoming a Linux Foundation member; we will be curious what Microsoft may contribute in the future. Moreover, Microsoft makes good on the promise to protect code investments, both from a UWP and a .Net perspective. And it is good news for enterprises that they get another IaaS, Google Cloud, on which to run .Net applications.

    On the concern side, Microsoft still has steps ahead to become an enterprise-level PaaS, one that starts with analysis and design capabilities, does requirements collection, end-to-end test automation, etc. It will be interesting, and potentially very powerful, to see Visual Studio and Teams come together. I may be wrong, but it always seems to me that Microsoft is about the developer (nothing wrong with that) and less about the enterprise that needs to build a next generation application. Those applications run in conjunction with existing standard application packages, so integrating and extending those matters. And there is a lot of emphasis on mobile, but nothing on social network integration, automation, etc.

    But for now, a lot of good progress on all levels, from the partnership level to the product level. Good to see developer lives getting easier, one release of Visual Studio at a time.

    Want to learn more? Check out the Storify collection below (if it doesn’t show up – check here).


    More on Microsoft:
    • First Take - Microsoft discovers teams - launches Microsoft Teams - read here
    • News Analysis - Microsoft announces SAP's choice of Azure to help enterprises transform HR - The SaaS land grab is on  - read here
    • First Take - Microsoft Ignite - AI, Adobe and FPGA [From the Fences] - read here
    • News Analysis - GE and Microsoft partner to bring Predix to Azure - Multi-Cloud becomes tangible for IoT - read here
    • Market Move - Microsoft acquires LinkedIn - Tons of synergies, start with Cortana, maybe too many - read here
    • News Analysis - Microsoft opens Windows Holographic to partners for a new era of mixed reality - read here
    • News Analysis - SAP and Microsoft usher in new era of partnership to accelerate digital transformation in the cloud - read here
    • Musings - Will Microsoft's Hololens transform the Future of Work? Read here
    • Event Report - Microsoft Build 2016 - A platform vision and plenty of tools for next generation applications - read here
    • First Take - Microsoft Build 2016 - Day 1 Keynote Takeaways - read here
    • Event Preview - Microsoft Build 2016 - Top 3 Things to watch for developers, managers and execs...  read here
    • News Analysis - Microsoft - New Hybrid Offerings Deliver Bottomless Capacity for Today's Data Explosion - read here
    • News Analysis - Welcoming the Xamarin team to Microsoft - read here
    • News Analysis - Microsoft announcements at Convergence Barcelona - Office 365, Dynamics CRM and Power Apps 
    • News Analysis - Microsoft expands Azure Data Lake to unleash big data productivity - Good move - time to catch up - read here
    • News Analysis - Microsoft and Salesforce Strengthen Strategic Partnership at Dreamforce 2015 - Good for joint customers - read here
    • News Analysis - NetSuite announced Cloud Alliance with Microsoft - read here
    • Event Report - Microsoft Build - Microsoft really wants to make developers' lives easier - read here
    • First Hand with Microsoft Hololens - read here
    • Event Report - Microsoft TechEd - Top 3 Enterprise takeaways - read here
    • First Take - Microsoft discovers data ambience and delivers an organic approach to in memory database - read here
    • Event Report - Microsoft Build - Azure grows and blossoms - enough for enterprises (yet)? Read here.
    • Event Report - Microsoft Build Day 1 Keynote - Top Enterprise Takeaways - read here.
    • Microsoft gets even more serious about devices - acquire Nokia - read here.
    • Microsoft does not need one new CEO - but six - read here.
    • Microsoft makes the cloud a platform play - Or: Azure and her 7 friends - read here.
    • How the Cloud can make the unlikeliest bedfellows - read here.
    • How hard is multi-channel CRM in 2013? - Read here.
    • How hard is it to install Office 365? Or: The harsh reality of customer support - read here.
     
    Find more coverage on the Constellation Research website here and check out my magazine on Flipboard and my YouTube channel here.

    Event Report - Kronos KronosWorks - Solid progress and big things loom


    We had the opportunity to attend Kronos' yearly user conference KronosWorks, held in Orlando from November 13-16, 2016. 

     
    Here is the 1-2 slide condensation (if the slide doesn’t show up, check here): 

     

    Want to read on? Here you go: Always tough to pick the takeaways – but here are my Top 3:

    Workforce Ready is a success – Kronos has been investing in its SMB product Workforce Ready for some time, and with the recent additions to its functional footprint it has taken off more than one would expect, reaching triple-digit millions in revenue for the first time. Overall customer growth is up 45%, and global customer growth 110%. It shows that Kronos has recognized the need for an integrated HCM suite at companies under 2,500 employees that have a large population of hourly workers. And no surprise, Kronos focuses on these industries. For most other vendors the anchor module is Payroll, but for Kronos it is Workforce Management and Payroll: buyers in these industries don’t only have to worry about one or two payroll runs per month, they usually worry about correct punches every morning and afternoon.

    Workforce Central grows, too – more to come – Workforce Ready's bigger brother for enterprises, Workforce Central, is growing fast, too, the data point being over 2,000 enterprises having gone live or currently implementing the product. Kronos is actively making the product better; the new HTML5-based UI works well, though it is a bit conservative. Good to see Kronos also eliminating ‘sins of the past’: Java on the desktop is gone, and now it’s about addressing the use of Adobe Flash. But the real news is that Kronos is actively working on the next generation of its larger enterprise workforce management product, with late 2017 as a first date for a first release of capabilities. A good sign, as some of the architecture changes required to keep Kronos powering through the 21st century cannot rely on the older, but proven, architecture. From the way Kronos is tackling this, and from the little we know right now, there is little to nothing for customers and prospects to worry about. Kronos is very conservative and hype-free, so this will be an interesting one to watch in a usually hype-loaded industry.

    Cloud works for Kronos customers – The cloud is Kronos’ biggest and still fastest growing business, and has reached 60% of Kronos revenues. In total 4M Kronos users are in the cloud today. Existing customers, not just net new ones, are moving to the cloud, with 1.1M existing users having moved over to Kronos Cloud. On the technology side, it is good to see that Kronos can run, and has run, on AWS in Australia. Like many other enterprise software vendors, Kronos has also found that AWS is not cheaper than an in-house managed cloud, at least at the moment, but it gives Kronos flexibility with regard to data residency and global presence.

    MyPOV

    Kronos is doing two of the hardest things an enterprise software vendor can do: building the next generation of its flagship product and converting existing customers to a new platform, in this case the cloud. Most enterprise software customers don’t want to move off an implemented on-premise version; it is often amortized, fits the needs of the user community, and everyone has gotten used to it, making existing customer conversions often tricky, hard and slow. The pace at which Kronos has managed to do this is impressive. In conversations, customers say they see the value proposition of the cloud and are ready to convert. We will have to see how the trend continues when Kronos reaches more conservative and cloud-skeptical customers. But time is on Kronos’ side from the overall trend perspective.

    On the concern side, Kronos must keep managing the transition and innovating in its product. The current architecture is adequate, but not what you want a workforce management product to be for the next 10 years. The good news is that Kronos has done the heavy lifting of moving to an API architecture; even more interesting, it is being used as an API layer for workforce management capabilities by large enterprises that have opted to build their own Time and Attendance / Absence Management.

    The next year has been important for Kronos for a while now, and I have said that in previous years, too, as Kronos had to lay the foundation for the future. 2017 will be key. Stay tuned.

    More on Kronos
    • Event Report - Kronos KronosWorks - New Versions, new UX, more mobile - faster implementations - read here
    • First Take - KronosWorks - Day 1 Keynote - R&D Investment, Customer Success and Analytics - read here
    • Kronos executes - 2014 will be key - read here
    • Tweeting and feeling good about it - read here

    Want to learn more? Check out the Storify collection below (if it doesn’t show up – check here). And check out the Twitter Moment I created on the first day of the analyst meeting here.

    Find more coverage on the Constellation Research website here and check out my magazine on Flipboard and my YouTube channel here.

    Tableau Sets Stage For Bigger Analytics Deployments


    Tableau announces scalability, governance, data-prep and ‘smart’ features due in 2017, but can the company land more enterprise-wide deals amid tightening competition?

    Tableau Software just replaced its CEO after a few rocky quarters on Wall Street, but that didn’t seem to faze the enthusiastic crowd of 13,000-plus customers gathered at last week’s Tableau Conference in Austin, Texas.

    Customers shared their love for the software, whooping and clapping about cool new features demoed during the popular “Devs on Stage” keynote. They also oohed and aahed their way through the opening keynote, as Tableau previewed a new data-engine and data-governance capabilities as well as compelling natural-language-query and machine-learning-based recommendation features.

    Why the change in leadership given customer satisfaction levels and continued double-digit growth? More on that later, but first a quick synopsis of the coming attractions.

    @Tableau, #Data16

    Hover-over insights, shown inset in blue, will bring instant analysis to Tableau that goes
    beyond showing the details when you mouse over a data point.

    Coming in 2017

    Tableau broke its coming attractions into five categories: Visual Analytics, Data Engine, Data Management, Cloud and Collaboration. Here’s a closer look at what’s in store and a rough idea of what to expect when.

    Visual Analytics: Tableau highlighted a bunch of visual data-analysis upgrades starting with instant, hover-over insights that go beyond just showing a data point when you mouse over a point in a chart. In the carbon dioxide emissions vs. GDP visualization pictured above, for example, the hover-over insight inset in blue shows that most of the countries in a lassoed set of plots selected from the chart are in Africa. Also presented are related insights into Internet and mobile phone usage. The software will generate these drillable insights automatically as the user hovers over a data point.

    Tableau is also beefing up spatial and time-series analysis capabilities, adding the ability to layer multiple data sets against shared dimensions such as location. For example, you might want county, municipal and zip code views of the same geographic areas. Look for these features to show up in the first half of next year.

    Further out on the horizon (in the second half of 2017), Tableau expects to introduce natural language query capabilities. Aided by semantic and syntactic language understanding, this feature will enable users to type questions such as “show me the most expensive houses near Lake Union” (see image below). In this case “expensive” and “near” are relative terms, so the tool will offer a best-guess visualization with slider adjustments for “last sales price” and “within X miles of Lake Union” so users can fine-tune the analysis.
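One way such relative terms could be handled is by mapping fuzzy words to best-guess, user-adjustable thresholds. The sketch below is purely hypothetical, not Tableau's implementation; the listings data, threshold choices and function names are all invented for illustration:

```python
# Hypothetical sketch of relative-term query interpretation
# (not Tableau's actual implementation; data is invented).

listings = [
    {"address": "A", "price": 450_000,   "miles_from_lake_union": 0.8},
    {"address": "B", "price": 2_400_000, "miles_from_lake_union": 1.5},
    {"address": "C", "price": 1_900_000, "miles_from_lake_union": 6.0},
    {"address": "D", "price": 3_100_000, "miles_from_lake_union": 0.4},
]

def interpret(query, data):
    """Map relative words in the query to best-guess numeric filters,
    which a UI would then expose as sliders for fine-tuning."""
    filters = {}
    if "expensive" in query:
        prices = sorted(r["price"] for r in data)
        filters["min_price"] = prices[len(prices) // 2]  # guess: above median
    if "near" in query:
        filters["max_miles"] = 2.0  # default radius, user-adjustable
    return filters

def run(query, data):
    f = interpret(query, data)
    return [r["address"] for r in data
            if r["price"] >= f.get("min_price", 0)
            and r["miles_from_lake_union"] <= f.get("max_miles", float("inf"))]

print(run("show me the most expensive houses near Lake Union", listings))
# ['B', 'D']
```

The interesting design point is that the interpretation does not have to be right, only plausible: the sliders let the user correct the guess, which is much cheaper than building a perfect language model.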

    MyPOV: Tableau’s differentiation from simplistic data-visualization tools is its support for a flow of visual exploration, correlation and analysis. The new hover-over insights and layered details will make Tableau visualizations that much more powerful, enabling developers to create fewer visual reports that can support myriad analyses.

    @Tableau, #Data16

    Natural language query capabilities expected in the second half of 2017 will simplify data exploration for novice and experienced analysts alike.

    Hyper Data Engine: Acquired in March, the Hyper Data Engine promises faster analysis and data loading and higher concurrency, supporting “up to tens of thousands of users” on a single shared server. Data loads that used to run overnight will take seconds with Hyper, says Tableau. Last week the company demonstrated ingestion of 400,000 rows of weather data per second with simultaneous analysis and data refreshes.

    Hyper will replace the existing Tableau Data Engine (TDE), starting with the company’s Tableau Online service by the end of this year. Hyper is expected to become generally available in the software in the second half of 2017. Migration of TDE files will be seamless, and the new engine will run on existing hardware, Tableau reports.

    MyPOV: If Hyper lives up to its billing it will eliminate performance constraints that many Tableau customers endure when dealing with high data volumes, simultaneous loading and analysis, and high numbers of users. The proof will be in the pudding, but Tableau is confident that Hyper’s columnar and in-memory performance will ensure stream-of-thought analysis without query delays. In fact, they expect Hyper to eventually serve as a stand-alone database option as well as a built-in data engine.

    Data Management: As Tableau has grown up from a departmental solution into an enterprise standard, the company has had to address the needs and expectations of IT. To address data governance, for example, it’s introducing (likely in the first half of 2017) a new Data Sources page and capability for data owners/stewards to certify data sources. A green “Certified” symbol (see image below) will show up wherever that data set is used to show that it has been vetted, that security rules are in force and that related joins and calculations are valid. More importantly, when users add their own data or otherwise depart from the certified data, visual cues will show that the calculations are derived from non-certified data.

    @Tableau, #Data16

    To support data governance, Tableau will introduce a data-certification capability that will
    show when measures are and are not based on vetted data and calculations.
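    The certification behavior described above implies a simple propagation rule: a derived calculation can be flagged as certified only if every field it depends on comes from a certified source. Here is a minimal, purely illustrative model of that rule (the class and field names are hypothetical, not Tableau's data model).

```python
from dataclasses import dataclass

@dataclass
class DataField:
    """A field from some data source, flagged by a data steward."""
    name: str
    certified: bool

@dataclass
class Calculation:
    """A derived measure built from one or more input fields."""
    name: str
    inputs: list

    @property
    def certified(self):
        # Certified only when every input field is itself certified.
        return all(f.certified for f in self.inputs)

# Fields from a steward-certified data source:
sales = DataField("sales", certified=True)
quota = DataField("quota", certified=True)
# A user-added upload that has not been vetted:
regional_targets = DataField("regional_targets", certified=False)

attainment = Calculation("attainment", [sales, quota])
blended = Calculation("blended_target", [sales, regional_targets])
```

    In this model, a UI would show the green “Certified” badge on `attainment` but a visual cue on `blended`, because one of its inputs departs from the certified data.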

    Diving deeper into data management, Tableau is working on “Project Maestro,” which will yield a self-service data-prep and data-quality module likely to show up in the second half of next year. This optional, stand-alone module will deliver drag-and-drop-style functionality aimed at the same data owner/steward types who are likely to certify data sets. The idea is to deliver the basics of data-prep and data cleansing required for simple use cases. Tableau customers are likely to continue to rely on software from partners such as Alteryx and Trifacta to handle complex, multi-source and multi-delivery-point data-prep and data-cleansing workflows.

    MyPOV: Tableau has previously supported the concept of certified sources, but this upgrade is supported by collaborative capabilities (see section below) that will enable new calculations and dimensions to be suggested, reviewed and added to a certified set. Governance capabilities must be agile or users will quickly work around trusted-but-stagnant data sources. On the data-prep front, Maestro looks like it will deliver the 20 percent of functionality that gets most of the use. We’ll see whether it can address 80 percent of data-prep needs and how it stacks up pricewise versus third-party tools.

    Cloud: Tableau addresses what it sees as a hybrid future with Tableau Online, its multi-tenant cloud service, coupled with cloud-hosted and on-premises deployments. Likewise, Tableau expects to see a mix of cloud and on-premises data sources. Tableau currently relies on pushing extracts out from on-premises sources, but in the first half of 2017 it expects to introduce a Live Query Agent capability that will securely tunnel through firewalls for direct access to on-premises sources.

    On the cloud side, Tableau has connectors for popular SaaS applications such as Salesforce, but you can soon expect to see additional connectors for Eloqua, Anaplan, Google AdWords and ServiceNow. On the horizon are connectors for cloud drives such as Box and DropBox.

    In a separate development expected in 2017, Tableau will port its Server software to run on multiple distributions of Linux. This move is important for cloud-based deployments because Linux dominates in the cloud and costs as little as half as much as comparable Windows server capacity. Tableau itself will be the first to take advantage by soon porting the Tableau Online service to run on Linux.

    MyPOV: Tableau has a head start on cloud compared to its closest rival, Qlik, and I particularly like its embrace of the tools and capabilities of public cloud providers. For example, Tableau is encouraging the use of Amazon RDS for PostgreSQL, ELB for load balancing, S3 for backups and Amazon CloudWatch for load monitoring. And when natural language querying arrives, Tableau says it’s likely to take advantage of Alexa and Cortana voice-to-text services to support mobile interaction.

    Collaboration: Tableau is adding a built-in collaboration platform to its software to facilitate discussion. The platform will enable users to exchange text messages directly with data stewards and other users to answer questions such as, “is this the right data for my analysis?” Bleeding into the new data-governance capabilities, you’ll also be able to see what data is used where and ask whether a new dimension or calculation can be added to a certified data set.

    Delivering on a longstanding user request, Tableau is adding data-driven alerting to its software. In another upgrade that will personalize the software, Tableau is adding a Metrics feature that will let users save their favorite stats and capsule visualizations so they can review them, say, each morning on their desktop or on mobile devices.
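    Data-driven alerting of the kind described above typically boils down to evaluating a saved metric against a user-defined condition and notifying the user when it trips. The sketch below shows that core check; the function name, operators and thresholds are hypothetical, not Tableau's API.

```python
import operator

# Supported comparison operators for an alert condition (illustrative).
OPS = {
    ">": operator.gt,
    "<": operator.lt,
    ">=": operator.ge,
    "<=": operator.le,
}

def check_alert(metric_value, op, threshold):
    """Return True when the alert condition is met and a
    notification should be sent."""
    return OPS[op](metric_value, threshold)

# Example: alert when daily revenue drops below $10,000.
triggered = check_alert(metric_value=8_500, op="<", threshold=10_000)
```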

    Tableau says its software will get smarter with the introduction of machine-learning-based Recommendations. The ambition is to go beyond the “show me” visualization suggestions currently available to auto-suggest data based on the user’s historical behavior, similar users’ behavior, group membership, data certifications, user permissions, recent item popularity, and the context of a user’s current selections. Don’t expect to see that functionality until the second half of 2017.
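    A common way to combine the signals listed above is a weighted relevance score per candidate data source, with the top-scoring candidates surfaced as recommendations. The sketch below is a generic weighted-scoring illustration of that idea; the signal names, weights and candidates are all hypothetical, not Tableau's design.

```python
# Illustrative signal weights; each signal value is assumed to be in [0, 1].
WEIGHTS = {
    "user_history": 0.30,    # user's own past usage
    "similar_users": 0.25,   # behavior of similar users
    "certified": 0.20,       # data-certification status
    "popularity": 0.15,      # recent item popularity
    "context_match": 0.10,   # fit with current selections
}

def score(signals):
    """Blend available signals into one relevance score (missing -> 0)."""
    return sum(w * signals.get(name, 0.0) for name, w in WEIGHTS.items())

candidates = {
    "certified_sales_extract": {
        "user_history": 0.9, "similar_users": 0.8, "certified": 1.0,
        "popularity": 0.7, "context_match": 0.6,
    },
    "ad_hoc_upload": {"user_history": 0.2, "popularity": 0.3},
}

# Recommend the highest-scoring data source.
best = max(candidates, key=lambda name: score(candidates[name]))
```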

    MyPOV: Some of this stuff seems obvious and overdue. Collaboration, for example, shows up in lots of software, and it’s particularly helpful for discussing data and analyses. Alerting within Tableau has heretofore been addressed by custom coding and third-party add-on products, but it’s a long-overdue, basic capability that should be built into the software. Smart, machine-learning-based recommendations are more cutting edge, but with the likes of IBM (with Watson Analytics) and Salesforce (with BeyondCore) already offering such features, Tableau may be in good company by the time it rolls out its own Recommendations feature.

    Why the Leadership Change?

    Over the last couple of years, Tableau has been stepping up into more big enterprise deals. It’s also facing more competition, including from the likes of cloud giants Microsoft (with PowerBI) and Amazon (which will soon release QuickSight). At the same time Tableau is moving into more cloud deployments and subscription-based selling (whether on-premises or in the cloud). These transitions have contributed to revenue and earnings surprises in recent quarters, and that’s something Wall Street never likes.

    In August, Tableau tapped Adam Selipsky, previously head of marketing, sales and support at Amazon Web Services, to “take the company to the next level,” as former CEO, and now chairman, Christian Chabot put it in a press release. That’s just what Selipsky did at Amazon, helping AWS to evolve from departmental and developer-oriented selling to big corporate deals. It’s Selipsky’s challenge to keep Tableau growing and profitable even as it pushes into bigger deployments and increasingly cloud- and subscription-based deals.

    MyPOV: Now that it’s in enterprise-wide deals, Tableau is facing more competition and deal-delaying offers of “free” software thrown in with big stack deals. Tableau has consistently won the hearts and minds of the data-analyst set, but on this bigger stage it must also address the needs of data consumers who might not be data savvy enough for the company’s usual interface. The Governance, Alerting, Metrics, Natural Language Query and Recommendations announcements — as well as new APIs for embedding into applications — are all moves in the right direction. Nonetheless, I won’t be surprised to see a lighter user interface and lower-cost-per-seat options that will round out Tableau as a platform for enterprise-wide deployments.

    Related Reading:

    Qlik Gets Leaner, Meaner, Cloudier
    Salesforce Einstein: Dream Versus Reality
    Oracle Vs. Salesforce on AI: What to Expect When


