
Introducing Workplace by Facebook



Today Facebook officially launched Workplace by Facebook, its enterprise version of the Facebook social network.


Workplace, formerly named Facebook at Work, has been in private beta for over a year and is being used by more than 1,000 companies, dozens of which have more than 10,000 people using it. Current customers include Coldwell Banker, Club Med, Heineken, Royal Bank of Scotland, Canadian Tire and Telenor. Workplace is now available for any organization to sign up.

Workplace resembles the consumer version of Facebook, with a central news feed of posts, groups for specific topics, and a messenger client named Work Chat that provides group chat and 1:1 video. Key Workplace features include events, polls and live streaming, which allows organizations to broadcast content to employees in real time.
 

It should be noted that Workplace was also the name IBM gave to its next-generation messaging platform in 2003; that product was discontinued in 2007.

As outlined in my research report, “Can Facebook at Work Bring Collaboration to the Business World”, one of the greatest strengths of Workplace is that most employees will immediately be familiar with how to use it. The real test will be how seamlessly Workplace can integrate with the business workflows employees use to get their jobs done.

At the time of launch, Workplace does not have integrations with popular enterprise software such as Office 365, Salesforce, Workday, ZenDesk, etc. Instead, Facebook has focused its initial development efforts on the security and administration aspects of Workplace. For example, customers will be able to use single sign-on via providers like Microsoft Azure AD, PingIdentity, Okta and OneLogin. While SSO is critical for getting started and gaining adoption, Constellation Research recommends that Facebook quickly develop business software integrations or partner with companies that can provide this functionality.

Workplace does allow for multi-company groups, which contain people from other organizations. This is extremely important for many collaborative use cases, but the caveat (at this time) is that each organization must be using Workplace; there is currently no guest access.

Rather than using the common “price per user/per month” licensing model, Facebook is doing something very customer-friendly and only charging for what is used. Pricing is:
$3 USD per user for the first 1–1,000 monthly active users
$2 USD per user for 1,001–10,000 monthly active users
$1 USD per user beyond 10,000 monthly active users
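Read as marginal tiers (an assumption on my part; Facebook's published rates don't spell out whether higher tiers apply to all users or only the incremental ones), a monthly bill works out like this:

```python
def workplace_monthly_cost(active_users: int) -> int:
    """Estimate the monthly cost in USD under the tiered pricing above.

    Assumes marginal tiers: each user is billed at the rate of the tier
    they fall into ($3 for users 1-1,000, $2 for 1,001-10,000, $1 beyond).
    """
    tiers = [(1_000, 3), (10_000, 2), (float("inf"), 1)]
    cost, prev_cap = 0, 0
    for cap, rate in tiers:
        if active_users <= prev_cap:
            break
        billed = min(active_users, cap) - prev_cap  # users in this tier
        cost += billed * rate
        prev_cap = cap
    return cost

print(workplace_monthly_cost(500))     # 1500
print(workplace_monthly_cost(5_000))   # 11000
print(workplace_monthly_cost(15_000))  # 26000
```

Note how the per-user average falls as adoption grows, which is the customer-friendly point of the model.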

Can Workplace Compete With A Suite?


The main collaboration battle over the last few decades has been fought by Microsoft, Google and IBM. The core of these vendors' offerings is the combination of email/calendar + content creation (documents, slides, spreadsheets) + unified communication (chat, web conferencing). Along the way, vendors offering niche services such as file sharing (Box, Dropbox, Egnyte, etc.), task management (Asana, Clarizen, Trello, Workfront, etc.) and group chat (Slack, Glip, HipChat, Ryver, etc.) have each claimed their own spot in the market, but always as a layer in addition to the Microsoft, Google or IBM stacks.

Workplace by Facebook is not a complete collaboration suite. Since it does not provide its own email, task management, file-sharing or content creation tools, customers will still have to purchase those other products. So can Workplace succeed as a corporate social intranet or enterprise social network? Options such as Yammer, Jive, SocialCast, Thoughtfarmer and Igloo have been around for years, yet none have dominated the market the way suites like Microsoft’s Office 365, Google's G Suite (formerly Google Apps for Work) and IBM Connections have. Meanwhile, vendors like Salesforce, Workday, SAP, Oracle, Cisco and Infor have all added communication and collaboration features to their platforms.

For Workplace by Facebook to become a critical business tool, Facebook will need to provide deep integration with email, file sharing, task management and, as mentioned above, business process software such as CRM, ERP, HR and financial systems. Otherwise it risks the same fate as the many social business software platforms that came before it.

So can Workplace provide enough value on its own to warrant being an additional tool for employees to use? If early customer interest is any indication, it would appear it can. Leveraging Facebook’s name recognition, Workplace has a big opportunity to become a leader in enterprise social software. Look at the level of attention newcomer Slack has attracted, and that was starting from ground zero. Slack claims to have 3M daily active users; it will be interesting to see how many Facebook cites in 6, 12 and 24 months.

The strengths of Facebook’s name recognition and massive business partner ecosystem are certainly assets that can help smooth the road to success. Constellation Research has already received a great deal of customer interest in Workplace (when it was Facebook at Work) and expects interest to increase with today’s official release.

Salesforce Einstein: Dream Versus Reality


Salesforce has introduced Einstein as a set of platform services, but for now it seems more like a collection of acquired parts. Here’s a look at what’s real and what’s coming.

“There’s a reason they call it Dreamforce,” quipped one Salesforce partner executive at the company’s big October 4-7 event in San Francisco. “They’re great marketers, but who knows when any of this AI stuff will be real?”

There’s good reason for skepticism, given Salesforce’s habit of announcing capabilities at Dreamforce well ahead of availability, yet several Einstein “artificial intelligence” (AI) services are actually already available. The capabilities available immediately are mostly those that Salesforce picked up through its many AI-related acquisitions over the past 12 to 18 months. Several Einstein capabilities coming soon to the Marketing and Sales Clouds were developed organically, according to Salesforce, relying on machine learning and other technologies that evolved out of the ExactTarget and Heroku acquisitions.

What’s not yet real, in my view, is the Einstein “platform services” layer depicted in the marchitecture diagram that Salesforce flashed up during several keynotes last week (see below). Salesforce insists that it’s not just a vision, but incorporating all those acquisitions will require a bit of integration work before Salesforce can deliver a consistent set of services and APIs. And only then will Salesforce be able to deliver some of the blended Einstein capabilities described at Dreamforce. Based on executive interviews and time spent at the Dreamforce “Einstein Discovery Center,” here’s a closer look at what to expect and when.

[Figure: the Salesforce Einstein “marchitecture” diagram shown during the Dreamforce keynotes]

‘Discover Smarter Insights’ with Analytics Cloud Einstein

It quickly became clear at Dreamforce that BeyondCore, one of Salesforce’s most recent AI acquisitions, is expected to be among the most significant of the dozen or so deals driving Einstein – at least where machine-learning based data-discovery and analysis are concerned. BeyondCore is the engine behind the “discover smarter insights” capability in Analytics Cloud Einstein, and it’s already available.

Purchased in September, BeyondCore is a cloud-based data-discovery and analysis platform that takes in data and statistical summaries of big data and then uses machine learning to automatically spot correlations and patterns in that data. Rather than starting with hypotheses developed by data scientists, BeyondCore is designed to let business analysts select measures to investigate, such as cost, profitability or customer lifetime value. The engine then identifies and explains the drivers of a measure or combination of measures.
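BeyondCore's actual algorithms aren't public, but the core idea of automated correlation spotting can be illustrated with a toy sketch (pure Python, invented column names and data; the real engine handles up to 100 columns, much larger samples and far richer pattern types):

```python
from itertools import combinations
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length numeric samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def spot_correlations(columns, threshold=0.8):
    """Flag strongly correlated column pairs, strongest first."""
    hits = []
    for a, b in combinations(columns, 2):
        r = pearson(columns[a], columns[b])
        if abs(r) >= threshold:
            hits.append((a, b, round(r, 3)))
    return sorted(hits, key=lambda t: -abs(t[2]))

data = {
    "ad_spend": [10, 20, 30, 40, 50],
    "revenue":  [12, 24, 31, 45, 52],
    "tickets":  [5, 3, 9, 2, 7],   # unrelated noise
}
print(spot_correlations(data))  # [('ad_spend', 'revenue', 0.995)]
```

The "stories" layer then turns findings like this into natural-language explanations, which is where most of BeyondCore's value lies.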

BeyondCore analyses and explanations are delivered in the form of text-based “stories” that are generated automatically and that answer four questions: What Happened? Why did it happen? What will happen? And how can I improve? The answers to the last two questions are predictive and prescriptive insights, respectively. BeyondCore stories can be exported as Word documents or PowerPoint presentations and the engine also generates supporting data visualizations.

To gain insights, BeyondCore connects to data sources including popular relational databases and Hadoop. When data sets are too large to load into the cloud, the engine can create and rely upon statistical summaries of larger data sets. Soon after BeyondCore was acquired, a data connector was added for Salesforce CRM data, and a data output was added for Salesforce Wave, so predictive and prescriptive insights can be exported to the Analytics Cloud. BeyondCore was sold based on per-user pricing, and Salesforce executives tell me it will continue to be offered that way as an extra-cost option of Analytics Cloud Einstein.

MyPOV on BeyondCore. This is a powerful engine, and it’s no surprise it will play a prominent role in the Analytics Cloud as well as other Einstein capabilities. As for the limitations of the technology, I’d like to hear more about how quickly the engine can create statistical summaries of big data and how quickly it generates analyses. I’m told the engine can spot complex correlations and patterns across as many as 100 columns of data, but it also requires at least 10,000 rows of data to deliver statistically reliable results. You need a lot of data to make smart, automated decisions, so this capability may not be applicable to small and midsized companies that have fewer than 10,000 records in key categories.

Recommend Products with Commerce Cloud Einstein

Several Einstein capabilities tied to the Commerce Cloud are either already available or will soon be available because they were developed, or were in the works, at Demandware. Acquired in June, Demandware had a machine-learning-based ability to automatically recommend best-fit products for customers based on their individual histories and the histories of like customers. It’s a recommendation engine, and it’s a built-in and immediately available part of Commerce Cloud Einstein.
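The underlying idea, scoring products by the purchase histories of similar customers, can be sketched in a few lines (a toy collaborative filter with invented customer data; Demandware's production engine is of course far more sophisticated):

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two sets of purchased product IDs."""
    if not a or not b:
        return 0.0
    return len(a & b) / (sqrt(len(a)) * sqrt(len(b)))

def recommend(target, histories, top_n=3):
    """Score products the target hasn't bought, weighted by how similar
    each other customer's purchase history is to the target's."""
    scores = {}
    for other, bought in histories.items():
        if other == target:
            continue
        sim = cosine(histories[target], bought)
        if sim == 0.0:
            continue  # ignore customers with nothing in common
        for product in bought - histories[target]:
            scores[product] = scores.get(product, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

histories = {
    "alice": {"boots", "jacket", "scarf"},
    "bob":   {"boots", "jacket", "gloves"},
    "carol": {"sandals", "sunhat"},
}
print(recommend("alice", histories))  # ['gloves']
```

Alice shares two purchases with Bob and none with Carol, so Bob's remaining item wins; scale that intuition to millions of shoppers and you have the shape of a commerce recommender.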

The same recommendation technology also powers an optional (extra-cost) Predictive Email service that can serve up personalized content, offers and product recommendations. Per-month pricing is based on the size of the mailing list, but you can send as many personalized messages as you want. Demandware also had two other capabilities in the works that will soon be part of Commerce Cloud Einstein. Commerce Insights, a market-basket-analysis dashboard for merchandisers, is expected to be a built-in capability available before the end of 2016. Predictive Sort, a personalized product search and sorting capability, will be an optional feature, and it’s expected in the first quarter of 2017.

MyPOV on Commerce Cloud Einstein. This sort of recommendation technology has been available for quite some time. It’s table stakes for e-commerce, but it rightly belongs in the Einstein feature set, as machine learning is used to look at customer behavior data, catalog data and online and offline order histories.

Developer Cloud Einstein

Predictive Vision and Predictive Sentiment services are now available in Developer Cloud Einstein, and they’re based on the technology of MetaMind, which Salesforce acquired in April. Demonstrated at Dreamforce by Dr. Richard Socher, formerly MetaMind’s CEO and now Salesforce chief scientist, the vision engine was shown to be easily trainable by business-user types by dragging and dropping collections of images. The ease of training and implementing MetaMind’s sentiment engine was less apparent, but it’s clearly geared to straightforward natural language interaction and use by business users.

MyPOV on vision and sentiment AI. Usually you see mundane demos of such capabilities in marketing use cases. (And, yes, these and other services will be available in Marketing Cloud Einstein to foster responsive and personalized customer interactions.) I was far more inspired by the example of MetaMind customer vRad (Virtual Radiologic), which is using vision services to save lives by reviewing thousands of brain scans within seconds to spot and help doctors prioritize cases of life-threatening intracranial bleeding. Inspirational!

What’s Coming

The list of Einstein capabilities not yet available is longer, and it includes many of the AI capabilities coming to the Sales, Service, Marketing and Community clouds. There are exceptions, such as the Predictive Lead Scoring and Opportunity Insights capabilities announced at Dreamforce. These organically developed features are currently in pilot and will be generally available as part of Sales Cloud Einstein by February. Similarly, Marketing Cloud Einstein scoring capabilities developed out of the ExactTarget acquisition will be available this December. Finally, certain SalesforceIQ capabilities derived from RelateIQ are now part of Einstein.

MyPOV on the State of Einstein. It’s no surprise that integration work lies ahead before Einstein capabilities will be available across the Salesforce platform. After all, really important components such as BeyondCore have been a part of the company for just one month. Indeed, we’re at the very beginning of the company’s AI push. Salesforce not only has to turn these, and perhaps other, acquisitions into services that are consistent with its existing machine learning, deep learning and natural language processing work, it also has to embed many of these services into its various clouds and introduce feedback loops so that Einstein services can learn as new data is generated.

Will it take six months, eight months or a year or more to deliver all the Einstein services described at Dreamforce 2016? That’s unclear, and aspirations may well shift along the way. I, for one, hope that Salesforce will also introduce plenty of options for human oversight, such as rejecting recommendations or insights that subject-matter-experts deem to be off base so the technology can learn from mistakes. I didn’t see any examples of that in the demos at Dreamforce, but we have to assume that Einstein (and other early examples of AI) will be less than brilliant at launch. If humans are to gain trust in AI services, we have to design apps that place their ultimate faith in the wisdom of humans.

Related Reading:
Inside Oracle Adaptive Intelligent Apps

Oracle Vs. Salesforce on AI: What to Expect When
Strata + Hadoop World Highlights Long-Term Bets on Cloud


Salesforce Dreamforce 2016 - It's about the platform, not Einstein


We had the opportunity to attend Dreamforce 2016, held in San Francisco from October 4th to 7th. As usual, it was a massive affair; Salesforce claims over 160k attendees. Noticeably, our conversations with exhibiting partners were more positive than last year, so Salesforce has done something right.

 
 

Take a look at my musings on the event here: (if the video doesn’t show up, check here)
 
 

No time to watch – here is the 2 slide condensation – overall and for platform (if the slide doesn’t show up, check here):
 
 
 
Want to read on? Here you go: 
 

Overall

Salesforce joins the AI frenzy – No conference in 2016 is complete without AI, and Salesforce was no different; Einstein (like most other announcements) leaked weeks before Dreamforce. The main message is that Einstein works out of the box, requires no data scientist, and of course works on all Salesforce data. But the examples were almost all based on scoring and recommendations – capabilities that Salesforce either already had or just acquired, and which are really the conservative end of AI; scoring alone does not pass muster as AI. But it works, and that’s what matters. Basically, Salesforce has put all of its predictive analytics (or what I call ‘true’ analytics – see here) under the Einstein marketing umbrella, and now needs to figure out what the platform is, how data will be aggregated, where it will all run, what it means for privacy and security, etc. [Salesforce points out that everything starts with trust at Salesforce, and that it is committed to not sharing data across customer boundaries. Did you catch that as an attendee / watcher? I missed it.] A fair start that would have looked better with less marketing hype – but this is Salesforce. (Read below what Salesforce is doing to enable developers to build AI applications; that is interesting.)

Commerce Cloud – Salesforce has a proven track record of helping CRM specialists, the humans. Its e-commerce capabilities were less developed, and the recent acquisition of Demandware has changed that picture (see here for my colleague's take on the acquisition). Salesforce store / e-commerce capabilities are now more beefed up and legitimate than before, so it's time for Salesforce customers to look at the offering. Commerce Cloud is certainly the cloud with the most product in Salesforce’s cloud arsenal.

Another Quip at productivity – Another acquisition – Quip – has been integrated with Salesforce and tries to solve the separation of business applications and productivity applications – a long-standing, practically forever challenge for business users. With a document collaboration model that brings together structured, unstructured and document information, including the usual sidebar for collaboration / notes, it certainly is a plausible approach that Salesforce customers should check out.

Salesforce unshackles itself from the past – Salesforce is one of the SaaS pioneers and has built on a very successful platform – Force – but the platform has been showing its age for some time, and it was not designed and built for the cloud age. For years Salesforce has been dancing a limbo between products built on Force (e.g. Sales Cloud and Service Cloud) and those built on Heroku (e.g. Marketing Cloud) – and Heroku runs on Amazon's AWS, for the less technically inclined readers. Dancing is a great activity, but not the best one for an enterprise software vendor; given the success of the Force platform, though, this was not an easy thing to address and solve. But Salesforce has made a substantial commitment to AWS (see our analysis here) and at Dreamforce said for the first time that Force applications will run on AWS. Likewise, developers building Force applications will be able to move their DEV and TST environments to AWS. Both are key developments that allow the conjecture that at some point the Salesforce platforms may be truly combined again – at the moment it looks like on the AWS cloud. And that is a key takeaway, as life gets better for all players in the Salesforce ecosystem: customers, partners and Salesforce itself.

Platform

Events have arrived – The Force platform, at the object-model level, showed its age in that it could not model modern entities such as events. Now, with a new Events object created as a first-class Salesforce object, this gap from the past gets addressed. And it is key for the future of Einstein – machine learning (and maybe later AI) cannot work on ‘business data’ alone; it needs to model the events that resulted from business activities and know whether an outcome was positive – or not so positive. So this is a key step in the Salesforce entity / object model. Unfortunately, the same does not yet hold across the whole platform – due to the acquisitions that Salesforce has made, there is a large variety of designs for how to model events and outcomes. It would be good to see Salesforce provide a common event model on the Heroku side of the landscape as well – we shall see if that happens. But for now, a good move.

Force to AWS – Mentioned already above, probably the most important takeaways for customers. When Salesforce succeeds in running the force apps (mostly Sales Cloud and Service Cloud) on AWS, it will change its need to invest into, build, operate and maintain data centers. A very important move that should ultimately allow Salesforce to put more R&D on product development, but also help customers with more local instances of Salesforce, which helps both with performance and data residency requirements. Allowing developers to develop in force on AWS is another key step here. We will be watching.

PredictionIO comes to Heroku – This was actually the most important announcement for Einstein going forward: it lets more developers build machine learning applications with the help of PredictionIO's capabilities. Salesforce needs more AI applications, and empowering developers – who in general are keen on this and hear from all tool makers how they can build machine learning / AI apps – is important. I didn’t have a chance to see how easy Salesforce is making this, but it will be key for Einstein's success going forward.

Kafka runs on Heroku – A consequence and productization of last year’s IoT Cloud announcement, but an important step to ingest data from other sources into Heroku, which becomes more and more the future of Salesforce Application Development – both internally and for the developer community. Let’s hope Salesforce did not have to do too many tweaks to Kafka for it to run on Heroku, so that Salesforce developers can take advantage of the high speed of innovation we see on the open source side. Future uptakes of Kafka in Heroku will write that story.

MyPOV

A good Dreamforce for Salesforce. Einstein took center stage, and certainly was a ‘must do’ announcement for Salesforce – given the announcements by the competition in the same area. But the real advancements, with short- and longer-term effect, are on the platform side – the move to Heroku / AWS. Something that could have happened a few years ago finally seems to be taking place. We don’t know what made Salesforce execs wave the start flag, but the realization – seen at customers, too – that conventional data-center-based infrastructures do not scale for next-generation applications may have happened in a Salesforce office as well. So it is good news for Salesforce customers that a potential end of the dual infrastructure reality is in sight; a move to AWS / Heroku can only help customers. More modern development, DevOps tools and best practices like CI / CD become tangible now – very important not only to keep Salesforce an attractive and viable platform, but also to make its SaaS applications more innovative.

On the concern side, time is of the essence for Salesforce. Key competitors in e.g. CRM as well as platform have made that move already and do not have to pay the tax for a dual platform in all phases of the software development lifecycle. The problem is that Salesforce product development is not moving breathtakingly fast; one indicator is that the Sales Cloud has still not fully moved to the new Lightning user experience / platform, which was announced …. at Dreamforce… 2014. Granted, Sales Cloud is a massive product with lots of screens, but 2+ years for a new UI is clearly in negative record territory for any leading SaaS vendor. On the flip side, Salesforce is making the right moves to make developers faster and more efficient – and when that happens for (paying) platform developers and ISVs, it also happens for Salesforce's in-house developers. How much gets built in the next 12 months will give the answer.

Overall a good Dreamforce for Salesforce. Granted, platform is not the topic 160k attendees will get excited about, but platform messages work and are important (see Salesforce ‘cousin’ Workday making platform the one theme of their keynote last week) – even for end-user audiences. I can’t wait for the Salesforce marketing skills and spend to be unleashed on platform – maybe at Dreamforce 2017? We will keep watching.


Find more coverage on the Constellation Research website here and checkout my magazine on Flipboard and my YouTube channel here.

IoT Architecture; Public Open Source IoT Platforms Plus Enterprise Commercial IoT Platforms


In this honeymoon period of pilots and small-scale enterprise IoT deployments it is easy to underestimate the impact of IoT at a ubiquitous global level similar to that of the World Wide Web. Without the use of Open Source software the Web could not exist; licensing individual software in the traditional manner, as with commercially differentiated products, is simply impossible at that scale. The Internet of Things requires the same level of ubiquitous common access in its core functions and, as with the Web, successful use depends on the successful creation of Open Source based shared accessibility.

These initial IoT deployments will be expected, during their service life, to connect to and play their role in the global integration of billions of Devices, Assets, Sensors and Endpoints. Consideration of how to exchange and interpret huge volumes of data split between millions of Services operating at a myriad of locations across the planet has to play a part in current choices.

An IoT Platform acts as an intermediary layer between the IoT Devices, or Endpoints, and the Services that consume the data outputs. In simple deployments each sensor may be directly coupled to the consuming Service, but increasingly each IoT Endpoint is connected to several services, and may not even share the same data with all of them. A heat sensor may at one threshold initiate the heating, and at a different threshold start the air conditioning plant. An IoT Platform offers the sophistication in Endpoint management to be able to control these and other issues.
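The heat-sensor example can be sketched as a minimal fan-out, with invented service names and thresholds (a real platform adds protocol handling, security and scale on top of this routing core):

```python
# Hypothetical subscriptions for the heat-sensor example: the platform
# fans one reading out only to services whose predicate matches, so not
# every service sees every value.
subscriptions = {
    "heating_service": lambda temp_c: temp_c < 17.0,   # assumed threshold
    "cooling_service": lambda temp_c: temp_c > 26.0,   # assumed threshold
    "logging_service": lambda temp_c: True,            # receives everything
}

def fan_out(temp_c):
    """Return the services that should receive this temperature reading."""
    return sorted(name for name, wants in subscriptions.items() if wants(temp_c))

print(fan_out(12.0))  # ['heating_service', 'logging_service']
print(fan_out(30.0))  # ['cooling_service', 'logging_service']
print(fan_out(21.0))  # ['logging_service']
```

The point of the sketch is that the sensor itself knows nothing about its consumers; the platform owns the routing decisions.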

Additionally, the role must also include complex integration of data flows, with protocol and network differences being assimilated before forwarding to the required/selected Service. This part of the role is more in line with the accepted definition of Middleware, whereas an IoT Platform will be more focused on IoT Endpoint management, though the two share the same definitions in other respects.

The technology requirements for Web Presentation are simplistic compared to the functional Integration required for the ubiquitous interactivity of Internet of Things. No Enterprise Middleware product was ever designed for this purpose, more importantly the new generation of commercial Enterprise IoT Platforms are not intended for this task either. There is a crucial difference between ‘Public’ and Enterprise IoT Platforms, which really should be thought of as IoT Middleware.

Just like the Web there is a crucial difference between the Public domain with its  ‘any to any’ ubiquitous capabilities provided by using Open Source and Standards to eliminate differentiation, versus the enterprise side, where competitive differentiation is created. Business use of the Web for competitive success is dependent on the differentiation in the higher-level functions above the presentation and transport layers, and here commercial software wins the Enterprise market.

IoT with its ever increasing range of functions, and capabilities across a multitude of sector deployments goes beyond the level of sophistication Open Source IoT Platform projects are currently aiming to provide for Public domain use. Open Source IoT Platform and Middleware projects can expect years of development to remain in tune with the capabilities of commercial software products that are constantly upgrading their capabilities with new functions.

Successful early-entrant commercial IoT products that achieve widespread adoption often aim to become ubiquitous by offering their core functions as Open Source. A recent example is Z-Wave, a widely adopted IoT Home Automation wireless connectivity product claiming 35 million units shipped since 2005. The speed of development of Z-Wave capabilities, in comparison to the various Home Automation standards groups of leading global home product manufacturers, illustrates the difficulties of consensus standards or open source projects.

The advantages of a commercial implementation also lie in speed of reaction to problems, particularly in Security. The speed with which Z-Wave was able to add new functionality to its core product to overcome security risks from badly implemented developer extensions back in 2013 also draws attention to the difficulties facing Open Source Middleware platforms in managing constant updates.

IoT Platforms have to perform a wide variety of complex functions; these were outlined in the blog ‘What is an IoT Platform? And why does it need an AoT engine?’ In a Public Open Source IoT platform there are some very new core issues to be addressed before any consideration of the advanced, or specialist, requirements that Enterprise IoT Platforms add.

The logical architectural conclusion is that Enterprises will use a commercial IoT Platform chosen for its specific features and capabilities to gain competitive advantage in tandem with their connection to an external Open Source IoT Platform to gain ubiquitous access to the overall market.

IoT Platforms are complicated if they are to offer the functionality required in serious deployments, as an April 2016 survey of IoT startups made clear. Using information on funding released by seed and venture funds, it was possible to identify only 14 startups aiming at the IoT Platform market. The preponderance of US-based funding means this list is incomplete, as it really only covers the US market, but the comparison to the hundreds of IoT startups identified in the same report is revealing.

IoT Platforms are the new ‘Middleware’ without which IoT cannot scale, whether in public domain or smart city deployments. But neither can IoT deployments in Enterprises succeed without their own IoT Platforms – unfortunately with different functional specifications. The low number of players and the lack of any dominance illustrate the complexity of this requirement-definition conundrum.

IoT Platforms are often called the 4th generation of Middleware, following 1) ETL – Extract, Transform and Load; 2) SOA – Service Oriented Architecture and the Enterprise Service Bus; and 3) iPaaS – Integration Platform as a Service. The lack of a defining name and functionality description makes classification of 4th-generation IoT Platform/Middleware products very difficult!

As it is difficult to do much about naming conventions, it is better to concentrate on defining the key functions, using the headings below to define fundamental core capabilities:

  1. IoT Endpoints; connection management at scale is challenging, but the real task is to provide IoT Endpoint Integration. Though this might be considered the most important fundamental requirement, it’s poorly addressed in most IoT Middleware/Platforms as the ‘final mile’ challenge of diversity of devices, protocols, network types, etc. requires wide ranging expertise in development. IoT Endpoint Management adds the requirement for data flow management as well as the usual physical connection management. (see blog iot-and-network-connectivity-management-or-aot-and-data-flow-management-network)
  2. IoT Asset Management; an extension of IoT Endpoint management is the maintenance of an ‘Asset Register’ for all connected IoT devices. The Asset Register provides all the necessary details on the IoT Endpoint, from Security Privileges to definitions of the Data provided, as well as critical contextual details on the Endpoint such as location, functions, etc. required as background data for intelligent processing of ‘events’. (see blog – new one on Assets that also includes connectivity issues)
  3. Security: at the first level, defined by points 1 and 2 above, there should be encryption and spoof detection in the connection. Supporting this in small, cheap, battery-powered sensors is almost impossible, so the IoT Platform/Middleware has to act as a ‘gatekeeper’ for the connected Endpoints. The ‘flat network’ map of connections requires reordering into a series of virtual secure data networks aligned to different groupings: Enterprises, Activities, Services, Apps, etc. Beyond these fundamental tasks it is possible to identify a huge number of desirable capabilities that will be included in time.
  4. IoT Protocols: because IoT Endpoint devices introduce a range of requirements vastly different from the demands of IT devices, new specialized protocols have appeared. These cover all aspects of protocols, from communication to payload, and as multiple endpoints using different protocols may have to be aligned to orchestrate an event, an IoT Platform must be able to freely integrate IoT Endpoints with Apps, Services and traditional Databases. (see blog https://www.constellationr.com/blog-news/final-mile-part-2-complexity-iot-protocols-and-apis-importance-apis-has-grown-recently-iot )
  5. Data Flow Management: a significant part of the value of IoT is to ‘read and react’ with an optimized response in near real time. This requires an IoT Platform/Middleware to quite literally read a continuous data stream to determine forwarding actions, as well as contextual alignments and relationships with IoT Asset data or other forms of stored data. This is distinctly different from most forms of Data Analytics, which perform deep analysis on batches of data. (see blog https://www.constellationr.com/blog-news/challenge-final-mile-asset-digitisation-and-data-flow-management-making-sure-your-graph)
  6. Message Management: recognized in traditional Middleware as a core activity to overcome differences in latency and time stamps, ensuring messages are processed in the correct order because of the transactional nature of the data. IoT message management is less concerned with that rigid structure and more concerned with dynamic flexibility of service, to accommodate the extreme variation in message volumes as event triggers and reactions create sharp swings in traffic.
  7. Real-Time Integration: the dynamics of the constantly changing IoT environment contrast directly with the stable, fixed connections of a traditional Middleware solution. IoT Endpoints may physically, or virtually, connect and disconnect, and even when connected will create dynamic changes in data flows. Add Data Flow Management, deciding where and how the incoming stream will be directed into orchestrated Services integration, to understand why previous generations of Middleware bear little resemblance to IoT Platforms/Middleware. (see blog https://www.constellationr.com/blog-news/iot-where-two-or-even-three-possibly-four-worlds-collide-or-operational-technology-meets )
  8. Other Features: additions to the minimum working capabilities of IoT listed above continue as increasing use of IoT introduces new business solution requirements. Two examples are richer interaction between various forms of AI and the IoT platform to increase the intelligence of ‘read and react’, and the need for dynamic recognition and registration of embedded IoT devices. Any IoT Platform/Middleware will need to be designed and implemented so that it can be upgraded while still working non-stop, as well as support graceful failover.
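To make a few of the eight functions above concrete, here is a minimal sketch, in Python, of an Asset Register (function 2), timestamp-based message reordering (function 6) and a ‘read and react’ rule with a gatekeeper check (functions 5 and 3). All names, fields and thresholds are invented for illustration and do not represent any vendor’s API.

```python
import heapq
from dataclasses import dataclass, field

# Function 2: a minimal Asset Register entry -- security privileges plus
# the contextual details (location, function) needed to interpret events.
@dataclass
class Asset:
    device_id: str
    location: str
    function: str
    allowed_readers: set = field(default_factory=set)

registry = {}

def register(asset: Asset) -> None:
    registry[asset.device_id] = asset

# Function 6: reorder messages by source timestamp despite network latency,
# using a min-heap buffer that always releases the oldest message first.
class ReorderBuffer:
    def __init__(self):
        self._heap = []
    def push(self, timestamp, payload):
        heapq.heappush(self._heap, (timestamp, payload))
    def pop_oldest(self):
        return heapq.heappop(self._heap)

# Functions 5 and 3: 'read and react' -- inspect each reading in the stream
# and decide a forwarding action using context from the Asset Register;
# unregistered devices are rejected (the platform's gatekeeper role).
def react(device_id, reading):
    asset = registry.get(device_id)
    if asset is None:
        return "reject: unregistered device"
    if asset.function == "temperature" and reading > 70:
        return f"alert: overheating at {asset.location}"
    return "forward"

register(Asset("t-001", "pump room", "temperature"))
buf = ReorderBuffer()
buf.push(1005, ("t-001", 72))   # arrived first, but newer
buf.push(1001, ("t-001", 55))   # arrived second, but older
ts, (dev, val) = buf.pop_oldest()   # releases the t=1001 reading first
print(ts, react(dev, val))
```

Even this toy version shows why the Asset Register matters: the ‘react’ decision is meaningless without the contextual location and function data held against the endpoint.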

The above eight functions represent a demanding, but nevertheless fundamental, set of basic requirements for the evaluation of IoT Platforms/Middleware. Amazingly, many Platforms are weak on the core tasks of Endpoint connection, Asset registration and the associated security management. None of these capabilities is currently present as a significant capability in any of the Open Source projects listed below; instead their focus has been on Middleware integration for public-domain deployment. The listed commercial Enterprise Platforms are more attuned to the necessity, in an Enterprise deployment, of handling more sophisticated management and control capabilities.

When considering commercial IoT Platforms it is difficult to analyze the entire market, as there are many small, relatively invisible players, but only nine IoT Platforms seem to claim IoT Endpoint management. Three of these originate from Industrial Automation and three from major IT vendors, which in each case creates a natural focus on working with the vendor’s existing markets. This leaves just three startups aiming to provide a full, independent Enterprise IoT Platform with sophisticated features that will support heterogeneous deployments.

NB: The focus on IoT endpoint connectivity management and integration management of the resulting data flows removes many well-known IoT solutions from contention. Many big-name IoT products/services may provide some ‘connection management’, but only to deliver their prime purpose of a directly valuable business outcome. The point of separating in this manner is to distinguish Platforms/Middleware used as ‘infrastructure’ from Services that provide direct business value and make use of infrastructure functions. Many IoT business solutions do offer some degree of connectivity management, but only to gain data flows for their own processes, not to offer heterogeneous infrastructure for other Services.

The list below is in alphabetical order and does not intend to suggest any ranking; those selected offer both significant scalability and ambition for the role, and have been the subject of Constellation Research client queries. It is not meant to be an exhaustive list, but to draw attention to the features of a selected group of Platform/Middleware products as a means of defining the necessary capabilities. A wider listing, with characteristics of a large number of products that lay claim to being Platforms, can be found at http://www.postscapes.com/internet-of-things-platforms/

Five Open Source IoT Platform Projects

HortonWorks (linked with Apache NiFi)
Kaa
microServicesbus.com
Apache NiFi
OpenRemote

Three IoT Platforms developed from Industrial Automation/Telecoms

Bosch IoT Suite
Ericsson Device Connection Platform
Plat.One IoT & M2M (by SAP)

Three Major IT Vendors

AWS IoT Platform (includes 2lemetry)
IBM Watson IoT Device Cloud
Cisco IoT Cloud Connect
 

Three IoT Platforms Developed for Enterprise Use by Startups

Asset Mapping
Telit
Xively

 

Summary:

The whole business case for IoT technology adoption rests on access to new data, gained by managing connections to an immense new range of devices. Specifically, business value lies in the collection and integration of that data to drive a new level of competitive Services and responses. Both endpoint connectivity and data integration require extensive Middleware functionality, supported by a new generation of ‘IoT Platforms’.

However, there is a distinctive difference between the undifferentiated, ubiquitous Open Source Platform/Middleware that interconnects ‘public’ IoT endpoints, and the requirements of an Enterprise. Commercial IoT Platforms/Middleware must be rich in capabilities that create competitive differentiation in how an Enterprise ‘reads and reacts’ to opportunities through orchestration of its IoT Assets and Smart Services.

Enterprises should follow the Open Source IoT Platform initiatives to gain access to as many sources of data as possible whilst recognizing that their own deployment of a Commercial IoT Platform is essential to gaining the competitive advantages that IoT enables in Digital Business.


Good Behavior is a Business Opportunity



In Standing on the Sun and (more briefly!) the HBR, Julia Kirby and I argued that sensors of all kinds—from Copenhagen Wheels to Kenyans with Ushahidi on their phones to body cameras—would bring information about negative externalities such as pollution, civil violence, and abuse of authority to the attention of consumers/citizens, who would consequently care more about the behaviors that caused them.

(We titled this “Sensors and sensibilities.” Couldn’t help ourselves.) The consequence would be people voting with their wallets and ballots to endorse choices that respected all constituents.

Our point was that data on “intangibles” that have gone unmeasured by GDP and GAAP accounting would enable society to express its collective desires more effectively. The result would be a world better balancing the needs of all stakeholders. (This happened in the early 20th century; after industrialization concentrated power, anti-trust legislation, labor laws and financial regulation redistributed it to consumers, workers and savers.)

But we did not foresee the next stage—as the demand for such information grows, it becomes a business opportunity. 

Thomson Reuters’ ad, above, shows that it can be. And of course it makes sense that a news organization would see the growing mainstream desire for such information as a new market.  

I expect some will react negatively to a for-profit organization taking up a “cause” and express more trust in NGOs to supply this kind of information.  My view is that the embrace of a powerful company that knows how to do the information gathering and presentation (see more of their work here) will accelerate a positive feedback cycle, fueling awareness and concern that leads to bad actors seeing better behavior in their best interests.

Update

A late-breaking example: the October 1st New York Times reports that brands including Verizon, General Mills, and HP Inc. are “[asking] ad agencies for action on diversity hiring.”  

According to the chief creative officer of General Mills “You don’t need to be a mom to make some Cheerios ads, but if we have more moms on the team…maybe we increase the probability we do work that connects with moms. That’s really where our drive for diversity came—it wasn’t some sort of moral high-horse stance about the failing ad industry.”

-- CAM

 


SAP Trenitalia Digital Summit - Event Report


 
We had the opportunity to attend the Trenitalia / SAP Digital Summit, held in Rome and Pietrarsa, Italy, on October 29th 2016. Though not formally labelled as a launch event, it felt like SAP bringing together messaging, executives, products and an early-adopter customer, Trenitalia, in a single event.

 

So take a look at my musings on the event here: (if the video doesn’t show up, check here)



 
 
No time to watch – here is the 1-2 slide condensation (if the slide doesn’t show up, check here):



 


 
Want to read on? Here you go: Always tough to pick the takeaways – but here are my Top 3:

SAP is committed to IoT – As SAP stated in a press release earlier in the week (see here), SAP plans to spend over 2B Euro over the next 5 years to build its position in IoT. The move makes a lot of sense, as SAP has a strong install base in manufacturing and other IoT-relevant industries, and these customers are high-price, high-quality manufacturers who have to come up with an IoT strategy for the customers of their things. Time is of the essence in IoT, as it is the only next-generation application use case where there is worldwide consensus that it has to run in the cloud. Even cloud-skeptical decision makers in Europe are firmly in the cloud camp, given volume, velocity and uncertainty as key characteristics of any IoT implementation.

Trenitalia is an early adopter – The event was held in Italy as Trenitalia is an early customer of SAP’s IoT package ‘Vehicle Management’, with the plan to manage all of the railway’s rolling stock with SAP. During the event we rode a FrecciaRossa 1000 train and looked at the digital exhaust of the ‘train thing’ while riding it. Trenitalia expects to be done implementing SAP IoT by 2018, with all rolling stock being managed with the solution. New best practices in preventative maintenance are becoming tangible for Trenitalia, e.g. scheduling maintenance based on usage, wear and tear vs. the usual hours of operation. Trenitalia technicians will also know, before a train rolls into the maintenance facility, what they will have to service on it. Trenitalia CEO Morgante pointed out that the IoT information will help make the maintenance technician’s work more diverse and interesting, a noteworthy wrinkle in the usually concern-laden discussion of technology’s repercussions for the future of work.

SAP knows it’s a long journey – Despite early adopters like Trenitalia, SAP realizes that IoT is a long game. On the product side it is good to see that SAP has realized that the usual standard-application approach cannot be the solution to make customers successful with IoT; instead a combination of pre-packaged capability together with the HANA Cloud Platform (HCP) is needed. Trenitalia was a showcase for the SAP IoT package Vehicle Management. SAP plans to ship general horizontal, vertical (e.g. Smart Cities) and application-specific packages (e.g. Vehicle Management). But SAP is not going to build all applications in house; it sees the need for acquisitions, as with the acquisitions of PLAT.ONE and Fedem. PLAT.ONE brings SAP good horizontal capabilities that will help build IoT solutions on top of HCP, e.g. lifecycle management for IoT devices, broad device connectivity, as well as important IoT edge capabilities. Fedem solves a ‘secondary’ IoT problem, related to having models that tell operators how a thing will behave in the real world, with more or fewer available physical measuring points. The digital-twin technology that Fedem has developed becomes crucial as IoT adoption rises, especially for new products / things, where industries have little experience of how long they last, what exposure to e.g. the elements they can withstand and what durability they will have overall. A long-term investment that can develop into a crucial differentiator for SAP in a few years.
Equally, SAP realizes that the current Industrie 4.0 conversation (deliberately using the German spelling) is helping the vendor, but needs more amplification. So SAP plans to open IoT labs in Berlin, Johannesburg, Munich, Palo Alto, Sao Leopoldo and Shanghai. Probably a necessary move to get the word out and educate customers on the possibilities and best practices around IoT.
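As an aside, the shift from hours-of-operation to usage-based maintenance scheduling described in the Trenitalia takeaway can be sketched in a few lines. The thresholds and the toy wear model below are invented for illustration; they are not SAP’s or Trenitalia’s actual logic.

```python
# Hypothetical comparison of the two scheduling policies: a fixed
# operating-hours rule vs. a rule driven by actual usage and load.
HOURS_LIMIT = 500          # legacy rule: service every 500 operating hours
WEAR_LIMIT = 1.0           # usage rule: service when accumulated wear >= 1.0

def wear(km: float, avg_load: float) -> float:
    # Toy wear model: distance travelled, amplified by average load.
    return (km / 100_000) * (1 + avg_load)

def maintenance_due(hours: float, km: float, avg_load: float) -> dict:
    return {
        "hours_based": hours >= HOURS_LIMIT,
        "usage_based": wear(km, avg_load) >= WEAR_LIMIT,
    }

# A lightly-used train: many idle hours, little distance or load.
print(maintenance_due(hours=600, km=20_000, avg_load=0.2))
# A heavily-used train: fewer hours, but long distances under heavy load.
print(maintenance_due(hours=300, km=90_000, avg_load=0.9))
```

The two policies disagree in both directions: the hours rule would pull the lightly-used train in unnecessarily while missing the heavily-worn one, which is exactly the inefficiency usage-based scheduling is meant to remove.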


 

MyPOV

A very good event, probably the best I have attended in a long time, for sure the best SAP has done (and I have been to a few / attend a lot, maybe too many events). Watching IoT information live, while being a passenger in the train / thing, is just pretty cool. With introductory presentations in Rome, a dual customer- and SAP-led presentation on the train, interrupted by video messages from CEOs and heads of development / IT, an event in a train museum, a maiden voyage on a train, an agenda led 50% by the customer, both CEOs of vendor and launch customer present, etc., this was a very well executed and impressive event. And SAP’s ambition to invest 2B Euro over the next 5 years in IoT is certainly worth a similar event. SAP is doing many things right in IoT: basing the offerings on its PaaS with HCP, offering packages, acquiring basic, key and differentiating capabilities, and evangelizing the capabilities; these are all the right steps going forward.

On the concern side, SAP still needs to clarify its BigData strategy in general and its Hadoop strategy specifically. Acquiring Altiscale (my take here) was a step in the right direction, as was Vora (here my take at GA). But IoT, and all next-generation applications, need a Hadoop answer that runs on cheap HDD, as well as a supported IaaS platform answer. And while I understand that SAP’s IaaS strategy is evolving (partnerships with IBM (see here), Microsoft (see here) and AWS (for BW/4HANA – see here)), official platform support for Hadoop on HDD is overdue. SAP needs to answer both to become a viable partner for IoT, eliminating all the ‘big’ question marks enterprises have today. ‘Small’ question marks always remain, but SAP needs to eliminate the big ones to really play in IoT, sooner rather than later.

In closing, SAP CEO McDermott, asked (by yours truly) what the audience can do to propel digital transformation, shared that DaaS – Data as a Service – is something to look at, both an opportunity for customers and for SAP. Good to hear SAP / McDermott mention that and start to tackle that 4th ‘aaS’ area.

Back to the event: well done by Trenitalia and SAP, with very few things that could have been done better; congrats on an almost perfect event. As for SAP IoT, there are lots of good moves, but some key decisions still have to be made. The good news is that they are not hard and could come sooner rather than later.

Want to learn more? Checkout the Storify collection below (if it doesn’t show up – check here).



 

And more on SAP:
  • First Take - SAP BW/4HANA - Data Gravity and Cloud win - read here
  • Event Report - SAP SuccessFactors SConnect - Push on all fronts - read here
  • Event Report - SAP Insider Vienna - HCP, BI and SuccessFactors are the takeaways - read here
  • Event Report - SAP Sapphire 2016 - Top 3 Positives & Concerns: SAP changes - probably for the better - read here
  • First Take - SAP Sapphire Day #2 Keynote - read here
  • News Analysis - SAP and Microsoft usher in new era of partnership to accelerate digital transformation in the cloud - read here
  • First Take -  SAP Sapphire Bill McDermott Day #1 Keynote - read here
  • Event Preview - SAP Sapphire 2016 - What to expect and look for - read here
  • News Analysis - Apple & SAP Partner to Revolutionize Work on iPhone & iPad - read here
  • Progress Report - SAP SuccessFactors makes good progress - now needs appeal beyond SAP - read here
  • News Analysis - SAP HANA Vora now available... - A key milestone for SAP - read here
  • Event Report - SAP Ariba Live - Make Procurement Cool Again - read here
  • News Analysis - SAP SuccessFactors innovates in Performance Management with continuous feedback powered by 1 to 1s  - read here
  • Event Report - SAP SuccessFactors SuccessConnect - Good Progress sprinkled with innovative ideas and challenging the status quo - read here
  • News Analysis - WorkForce Software Announces Global Reseller Agreement with SAP - read here
  • First Take - SAP SuccessFactors SuccessConnect - Day #1 Keynote Top 3 Takeaways - read here
  • News Analysis - SAP SuccessFactors introduces Next Generation of HCM software - read here
  • News Analysis - SAP delivers next release of SAP HANA - SPS 10 - Ready for BigData and IoT - read here
  • Event Report - SAP Sapphire - Top 3 Positives and Concerns - read here
  • First Take - Bernd Leukert and Steve Singh Day #2 Keynote - read here
  • News Analysis - SAP and IBM join forces ... read here
  • First Take - SAP Sapphire Bill McDermott Day #1 Keynote - read here
  • In Depth - S/4HANA qualities as presented by Plattner - play for play - read here
  • First Take - SAP Cloud for Planning - the next spreadsheet killer is off to a good start - read here
  • Progress Report - SAP HCM makes progress and consolidates - a lot of moving parts - read here
  • First Take - SAP launches S/4HANA - The good, the challenge and the concern - read here
  • First Take - SAP's IoT strategy becomes clearer - read here
  • SAP appoints a CTO - some musings - read here
  • Event Report - SAP's SAPtd - (Finally) more talk on PaaS, good progress and aligning with IBM and Oracle - read here
  • News Analysis - SAP and IBM partner for cloud success - good news - read here
  • Market Move - SAP strikes again - this time it is Concur and the spend into spend management - read here
  • Event Report - SAP SuccessFactors picks up speed - but there remains work to be done - read here
  • First Take - SAP SuccessFactors SuccessConnect - Top 3 Takeaways Day 1 Keynote - read here.
  • Event Report - Sapphire - SAP finds its (unique) path to cloud - read here
  • What I would like SAP to address this Sapphire - read here
  • News Analysis - SAP becomes more about applications - again - read here
  • Market Move - SAP acquires Fieldglass - off to the contingent workforce - early move or reaction? Read here.
  • SAP's startup program keep rolling – read here.
  • Why SAP acquired KXEN? Getting serious about Analytics – read here.
  • SAP streamlines organization further – the Danes are leaving – read here.
  • Reading between the lines… SAP Q2 Earnings – cloudy with potential structural changes – read here.
  • SAP wants to be a technology company, really – read here
  • Why SAP acquired hybris software – read here.
  • SAP gets serious about the cloud – organizationally – read here.
  • Taking stock – what SAP answered and it didn’t answer this Sapphire [2013] – read here.
  • Act III & Final Day – A tale of two conference – Sapphire & SuiteWorld13 – read here.
  • The middle day – 2 keynotes and press releases – Sapphire & SuiteWorld – read here.
  • A tale of 2 keynotes and press releases – Sapphire & SuiteWorld – read here.
  • What I would like SAP to address this Sapphire – read here.
  • Why 3rd party maintenance is key to SAP’s and Oracle’s success – read here.
  • Why SAP acquired Camillion – read here.
  • Why SAP acquired SmartOps – read here.
  • Next in your mall – SAP and Oracle? Read here
 
And more about SAP technology:
  • Event Preview - SAP TechEd 2015 - read here
  • News Analysis - SAP Unveils New Cloud Platform Services and In-Memory Innovation on Hadoop to Accelerate Digital Transformation – A key milestone for SAP read here
  • HANA Cloud Platform - Revisited - Improvements ahead and turning into a real PaaS - read here
  • News Analysis - SAP commits to CloudFoundry and OpenSource - key steps - but what is the direction? - Read here.
  • News Analysis - SAP moves Ariba Spend Visibility to HANA - Interesting first step in a long journey - read here
  • Launch Report - When BW 7.4 meets HANA it is like 2 + 2 = 5 - but is 5 enough - read here
  • Event Report - BI 2014 and HANA 2014 takeaways - it is all about HANA and Lumira - but is that enough? Read here.
  • News Analysis – SAP slices and dices into more Cloud, and of course more HANA – read here.
  • SAP gets serious about open source and courts developers – about time – read here.
  • My top 3 takeaways from the SAP TechEd keynote – read here.
  • SAP discovers elasticity for HANA – kind of – read here.
  • Can HANA Cloud be elastic? Tough – read here.
  • SAP’s Cloud plans get more cloudy – read here.
  • HANA Enterprise Cloud helps SAP discover the cloud (benefits) – read here
 


Find more coverage on the Constellation Research website here and check out my magazine on Flipboard and my YouTube channel here.

 

What Internet of Things Platforms and Middleware Should You Choose?


How do you decide which IoT suite, Data Lake Management offering or Enterprise Group Messaging app to deploy? Choosing the right tools will determine the success or failure of your digital programs.

I’m excited to announce our new Constellation ShortList™ offering, created to guide businesses to the right technologies for their transformation initiatives – starting with the Internet of Things (IoT).

IoT is more than just a buzzword; it has practical applications in business and is positioned to be a driving force in the next phase of innovation. With everything from cars, phones, appliances, and even dogs becoming more and more connected, forward-thinking businesses are rethinking their business models to take advantage of IoT. This morning, we released the Internet of Things Platforms and Middleware Constellation ShortList, which identifies IoT companies to watch across four categories – Major IT Vendors, Open Source, Specialist Enterprise and Industrial Automation.

Here are the companies that topped the list:

  • AWS IoT Platform
  • Apache NiFi
  • Asset Mapping
  • Bosch IoT Suite
  • Cisco IoT Cloud Connect
  • Ericsson Device Connection Platform
  • Hortonworks
  • IBM Watson IoT Platform
  • Kaa
  • microServiceBus.com
  • OpenRemote
  • Plat.One IoT & M2M (by SAP)
  • Telit
  • Xively

Constellation advises early adopters using disruptive technologies on how to achieve business model transformation. Products and services named to this Constellation ShortList meet the threshold criteria for each category as determined by Constellation Research through client inquiries, partner conversations, customer references, vendor selection projects, market share and internal research.

Additional lists released today include:

Over the next five weeks, we will release nearly 40 Constellation ShortLists authored by our analysts across a range of technologies. If you don’t see a list that applies to your technology needs, check back each Wednesday through early November, or inquire directly by contacting [email protected].

For more information, visit https://www.constellationr.com/shortlist

 

 

 

Applying Design Thinking to Sales


I just published my first solo report as an analyst, Sales by Design, Not by Challenge. The report puts forth the concept of applying design thinking to the sales process. Any good salesperson will say that sales is an art: it requires understanding the customer’s business objectives and organizational nuances, and being clear that the ultimate goal is a mutually agreed-upon outcome. Sales by Design, as explained in the report, helps sales teams understand the customer’s organizational persona in order to anticipate the customer's needs and manage the sales process accordingly. Sales teams are then able to focus on co-innovation and co-creation of solutions with the customer.
 
When talking about sales strategies, it’s hard not to come across the Corporate Executive Board’s (CEB) book, The Challenger Sale, which advocates hiring and developing sales representatives who fit a defined "challenger" persona, the one CEB's surveys found most effective. The book was first published over five years ago, based on survey data gathered after the financial crisis of 2008. The selling environment then versus now has changed significantly with the rise of the digital age. The abundance of information and resources available to buyers and customers in today's social and digitally-driven environment has changed the rules of engagement. We at Constellation have received a healthy number of client inquiries regarding the Challenger methodology. I want to be clear that Sales by Design isn’t about dispelling The Challenger Sale; Sales by Design and Challenger are by no means mutually exclusive. Organizations can utilize Sales by Design alongside or as an alternative to Challenger.
 
In addition to client feedback, I personally tried to implement The Challenger Sale at a prior company. After a recommendation from a former colleague, I read it, found it interesting and bought copies for my global marketing team. As the head of marketing at a mid-size company, the messages around sales and marketing collaboration resonated with me, and we went about rethinking our corporate deck and sales collateral. At the same time, we brought the book to sales and other executive leadership for broader adoption, and it quickly became evident that without top-down support from the CEO, with dedicated budget and resources for consistent training, recruiting, content development, etc., we could not get it off the ground. This doesn’t mean that Challenger can’t work for other organizations, but there needs to be acknowledgement of the commitment in time and resources required to make it work.
 
Executives can quickly spot the sales reps trying too hard to be “Challengers”. They are often overly aggressive, pushing their solutions regardless of fit, and are “unafraid to push and discuss money”, often at the wrong time. Hearing these concerns from both clients and other sales teams, it became clear that customers have started to tune out these Challenger-type sales reps. Maybe these are “Challengers” gone wrong, but the more buyers become aware of the style, the less effective it becomes.
 
As a prior marketing and sales operations executive, I had the privilege of working with some of the best salespeople and seeing them in action. Their success stemmed from their ability to adapt their selling style to the customer’s organizational style. The key to successful selling isn’t the seller’s persona – being someone who “pushes customers” rather than a “relationship builder” – but establishing genuine rapport and a willingness to partner. Great sellers leverage both information and technology to their advantage to make meaningful connections and to stay on top of the customer’s personal and organizational developments. Sales by Design will teach sales teams how to partner with customers to apply design thinking in the sales process, to co-create and co-innovate solutions: a win-win for both.
 
I invite you to join in the dialog and share with me your thoughts. If you are not a Constellation client, you can download the Table of Contents and an excerpt of the report, Sales by Design, Not by Challenge, here.
 
 

Strata + Hadoop World Highlights Long-Term Bets on Cloud


Strata + Hadoop World announcements by Cloudera, IBM, Google and SAP anticipate cloud growth. Here’s why cloud will be so crucial even if data remains on premises.

“Today, 92% of all IT is happening on premises,” said Mike Olson, Cloudera’s chief strategy officer, in his September 28 keynote kickoff at Strata + Hadoop World in New York. “But we see dramatic growth — nearly 35% compound annual growth — over the next 10 years in public cloud workloads.”

Even in the big data arena, where so much volume (in the form of data warehouses) remains on premises, practitioners and vendors alike are looking to the cloud. Why? Because so much of what will drive innovation, differentiation and value is going to happen in the cloud. That vision was underscored at Strata + Hadoop World, where Cloudera set the tone and IBM, Google, SAP and others added to the list of announcements centered on cloud deployment options.


Cloudera eases software deployment and promises portability across the leading public clouds,
but customers also want managed services options.

Cloudera highlighted recent cloud-friendly enhancements including the ability to run the Impala database for Hadoop against cloud-native object stores such as Amazon S3 (with Azure Data Lake likely to follow). It’s also previewing a connector for Microsoft Azure users so they can use PowerBI with Impala. The company has simplified use of its Cloudera Director cloud-management tool by adding templates for deploying on AWS, Azure and Google. This underscores the company’s commitment to portability across the top three public clouds. In July the company also introduced consumption-based subscription terms, with the ability to meter hourly usage of its Hadoop software for temporary, project-based workloads.

IBM announced IBM DataWorks, a cloud-based platform with options for data ingestion, persistence and analysis. Data sources can be on-premises or in the cloud, structured or unstructured, and batch-oriented or streaming. Options to persist the data include relational database services, NoSQL database services and IBM’s BigInsights Hadoop distribution as a service. IBM Watson services and Apache Spark-based machine-learning services support data processing and data discovery. DataWorks offers user interfaces for data engineers (DataWorks Connect), data scientists (with Jupyter notebooks and RStudio as part of Data Science Experience), business analysts (Watson Analytics) and app developers (Bluemix services). It’s largely a packaging of existing cloud services, but what makes it a platform is shared data and metadata access, shared governance, and shared spaces for data modeling and analysis.

Google announcements are always about cloud, of course, but at Strata + Hadoop the company reached out to the enterprise crowd with BigQuery for Enterprise. Aimed at mainstream corporate use, BigQuery for Enterprise adds support for standard SQL (SQL 2011, specifically), including the ability to update, delete and insert rows and columns in BigQuery datasets using SQL. The offering also adds new ODBC drivers for connecting to popular BI tools, and new access and identity management capabilities. Finally, pricing options include monthly flat-rate pricing, aimed at lowering the cost of long-term use, versus short-term pricing aimed at ephemeral projects.
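The DML support is the piece most likely to matter to mainstream corporate users, since it lets BigQuery behave more like a conventional warehouse. A sketch of what that enables, using a hypothetical dataset and table (not from Google's announcement):

```sql
#standardSQL
-- Adjust rows in place -- previously BigQuery tables were append-only.
UPDATE mydataset.inventory
SET quantity = quantity - 10
WHERE product = 'widget';

-- Remove rows that no longer belong in the table.
DELETE FROM mydataset.inventory
WHERE quantity <= 0;
```

For shops accustomed to maintaining slowly changing dimensions or correcting loaded records with SQL, this removes a long-standing reason to keep those workloads on premises.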

SAP confirmed its rumored acquisition of Altiscale last week, and it said the business will continue to offer its high-performance Hadoop and Spark services as a separate business unit. Altiscale execs said they expect to expand their business now that they have SAP’s backing and can take advantage of the company’s data center capacity in Europe and elsewhere. As I explained in this take on the deal (before it was confirmed), Altiscale will enable SAP to offer Hadoop and Spark capacity alongside Hana and Hana Vora on the Hana Cloud Platform. Thus, SAP won’t have to turn to other vendors when customers require cloud-based big data infrastructure as part of their next-generation data-driven applications.

MyPOV On Steps Toward the Cloud

All of these moves are positive, but they’re merely next steps toward supporting the sort of hybrid-cloud deployment scenarios that companies will want and need in the years ahead. It’s great that Cloudera has made it easier to deploy its software on the three leading public clouds, but many companies struggle with the complexities of deploying and running Hadoop. They want fully managed services, so they can click to deploy without having to deal with administering the cluster. That’s particularly true for temporary projects that require infrastructure companies want to spin up quickly and shut down just as fast.

IBM gave DataWorks a hyped-up “AI” spin, throwing in mentions of “cognitive” and Watson to cover all the bases. What it seemed to boil down to was the use of machine learning in data processing and data discovery. What I wanted to hear more about was automated model assessment and deployment options. That’s the last mile of turning data into decisions embedded within applications, whether those apps are deployed in the cloud or on premises.

Google is clearly trying to broaden BigQuery’s appeal with those “for Enterprise” enhancements. But with so much data still on premises, the public cloud players should do more than offer secure, dedicated connections to support hybrid deployment scenarios. Microsoft and Oracle both make it a priority to integrate with existing, on-premises IT investments such as data warehouses. Amazon and Google could do more to highlight click-ready integrations with the most popular on-premises data platforms. I know Google has third-party options, but Amazon and Google should both be more vocal and visible about supporting common hybrid scenarios without making it all about moving everything to the cloud.

SAP really had to have its own options for supporting Hadoop and Spark workloads in the cloud, so the Altiscale buy was a smart, turnkey investment. At the same time SAP still has to play nice and keep its options open with Amazon, Google, Microsoft and IBM. SAP is not a hyper-scale cloud player, so it has done well to partner with all of the above. Given the size and stature of SAP’s customers, these cloud partners would all do well to support hybrid scenarios. That will help build trust and remove cultural barriers that might stand in the way of moving on-premises SAP deployments to the cloud.

As Olson observed, only a fraction of IT activity is currently in the cloud, but that’s where the action is and where the real value is going to be generated. While back-office, transactional data is crucial, it’s generally predictable and well understood. The cloud is where companies are increasingly intersecting with partners and customers and uncovering interesting insights. Whether it’s through mobile apps, social networks, partner portals or third-party data enrichment, Constellation Research believes that 60% of the data that companies consider mission critical will reside outside their four walls by 2020. Making use of that data will be the key to driving breakthrough business models based on data monetization and data-driven services.

Related Reading:
Oracle Vs. Salesforce on AI: What to Expect When
Teradata Amps Up Cloud And Consulting Offerings
SAP Reportedly Buying Altiscale to Power Big Data Services



Event Report - Cloud Foundry Summit Europe - Europe & Cloud - A long path


We had the opportunity to attend Cloud Foundry’s CFSummit conference, held in Frankfurt on September 27th and 28th, 2016. The conference was well attended, with over 700 participants coming from customers, prospects and the ecosystem. 

 
So take a look at my musings on the event here: (if the video doesn’t show up, check here)

 


No time to watch – here is the 1-2 slide condensation (if the slide doesn’t show up, check here):

 

 
Want to read on? Here you go: Always tough to pick the takeaways – but here are my top takeaways:

Cloud Foundry is growing – Not surprisingly, Cloud Foundry is doing well in Europe. Last year’s Cloud Foundry Summit in Berlin saw about 400 attendees; the second edition this year in Frankfurt saw over 700. Interest in Cloud Foundry mirrors North America: enterprises are looking for a PaaS platform that allows them to build next-generation applications, with the option to deploy on premises and in the cloud, to multiple clouds, and to develop software in a modern way that is in tune with 21st-century challenges and best practices. Being in Europe / Germany, there is a European touch to the ecosystem, with local vendors such as SAP and Atos having more prominent roles than at the Cloud Foundry Summit in Santa Clara earlier this year.

Focus on Government – Contrary to the North American edition of the conference, the European edition put an emphasis on showcasing what is happening in government, with a UK government technologist on stage on Tuesday and a Dutch government technologist on Wednesday. Certainly no coincidence – could it be that European governments are endorsing and embracing PaaS projects more than, say, the US? An interesting wrinkle that needs more discussion going forward.

IoT rules all use cases – No surprise: as at other European events, the IoT use case is strong and prominent – so much so that IoT reference customers at North American events usually come from … Europe. And while the average European IT leader is concerned about moving customer and employee data into the public cloud, there is no such concern, not even discussion, when it comes to IoT. The technical requirements are simply too large to even consider running on premises, and the pressure on high-quality, high-price European manufacturers is simply too high not to choose a faster-to-implement cloud strategy.

Community Outreach – On the flip side, the more traditional use cases take more time: PaaS, agile development, CI/CD etc. are all newer trends for Europeans than for their North American colleagues. So Pivotal, Cloud Foundry and partners need to invest in community evangelism efforts, and the creation of the Ambassador program is certainly a good step. We will have to check in a few quarters on progress and whether it has really moved things in the direction of higher awareness, fewer concerns and ultimately better adoption.
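For readers new to the deployment model the takeaways describe, the multi-cloud promise rests on a simple abstraction: an application manifest that stays the same whichever Cloud Foundry installation you push to. A minimal sketch, where the app name, memory setting and buildpack are hypothetical placeholders:

```yaml
# manifest.yml -- hypothetical app, for illustration only
applications:
- name: demo-api
  memory: 512M
  instances: 2
  buildpack: java_buildpack
  env:
    SPRING_PROFILES_ACTIVE: cloud
```

Deploying is then a single `cf push` against whatever Cloud Foundry endpoint the team targets – on premises or on a public cloud – which is precisely the portability argument made to European enterprises.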

MyPOV

A good event for Cloud Foundry and partners. European enterprises have a keen interest in understanding what PaaS in general – and Cloud Foundry specifically – can do for them. But skepticism toward new best practices, tools and trends coming from North America traditionally runs high in Europe – and with good reason: not everything that comes over the Atlantic, hyped as the dernier cri, really lasts and has an impact on enterprise automation. Cloud Foundry now needs to convert that interest, first showcases and trial projects into live customer projects that can be showcased at the third edition of CFSummit Europe in 2017. We will be watching.

Want to learn more? Check out the Storify collection below (if it doesn’t show up – check here).




More on Pivotal / Cloud Foundry
  • Event Report - Pivotal SpringOne Platform - Spring in its 2nd spring - read here
  • Event Report - Cloud Foundry Summit - It's good to be king of PaaS - read here
  • News Analysis - Pivotal makes Cloud Foundry more about multi-cloud - read here
  • News Analysis - Pivotal pivots to OpenSource and Hortonworks - Or: OpenSource keeps winning - read here
  • News Analysis - Pivotal Now Makes It Easier Than Ever to Take Software from Idea to Production - read here

More on Next Generation Applications:
 
  • Event Report - Google I/O 2016 - Android N soon, Google assistant sooner and VR / AR later - read here
  • News Analysis - SAP and Microsoft usher in new era of partnership to accelerate digital transformation in the cloud - read here
  • Event Report - OpenStack Summit 2016 - Austin - OpenStack matures, grows up - read here
  • First Take - Workato’s Workbot cuts business users some slack with Slack integration - read here
  • Progress Report - Cloudera is all in with Hadoop - now off to verticals - read here
  • First Take - SAP Cloud for Planning - The next spreadsheet killer is off to a good start - read here
  • Market Move - Oracle buys Datalogix - moves into DaaS - read here
  • News Analysis - SAP commits to Cloud Foundry and OpenStack - Key Steps - but what is the direction? Read here
  • Event Report - MongoDB is a showcase for the power of Open Source in the enterprise - read here
  • Musings - A manifesto: What are 'true' analytics? Read here
  • Future of Work - One Spreadsheet at the time - Informatica Springbok - read here
  • Musings - The Era of the no-design Database - Read here
  • Mendix - the other path to build software - read here
  • Musings - Time to ditch your datawarehouse .... - Read here

Find more coverage on the Constellation Research website here and check out my magazine on Flipboard and my YouTube channel here.