Results

Advanced Analytics Now Within Reach

Business leaders want to see what’s coming, not just read reports on what happened last week or last quarter. Advanced analytics represents a class of software that helps companies go beyond historical reporting to predict future opportunities and risks. (For more information, read my self-service analytics market overviews of IBM Watson Analytics, SAP Lumira, and SAS Visual Analytics.) Predictive analyses help identify the best customers, develop better up-sell and cross-sell offers, understand financial risks, choose new products, and anticipate equipment failures.

Until recently, advanced analytics were the province of statisticians and data scientists, but that’s changing with the emergence of self-service options for analytics. Vendors are pursuing two broad approaches: 

BI meets prediction: Products focused primarily on self-service BI (e.g., data exploration, data visualization, dashboarding and reporting) are being enhanced with predictive features. Products taking this general approach include Microsoft Power BI, Qlik Sense, SAP Lumira, and Tableau, among others.

Analytics simplified: These products include the basics of BI but focus primarily on advanced analytics, offering simple, drag-and-drop or point-and-click interfaces and built-in automation features. In most cases, these tools are aimed at data-savvy analyst types. Products falling into this category include Alteryx, Alpine Data Labs’ Enterprise Platform, SAP Predictive Analytics, SAS Visual Analytics/Visual Statistics, and TIBCO Spotfire.

Selecting the right advanced analytics solution for your organization will depend on the breadth of your deployment, the skill level of your users, and the nature of the decisions users are trying to support. Use the guide below to evaluate analytics solutions for your organization. 

[Figure: Self-service analytics evaluation criteria]

Resources

IBM Watson Analytics: Self-Service Analytics Market Overview

SAS Visual Analytics: Self-Service Analytics Market Overview

SAP Lumira with Predictive Analytics 2.0: Self-Service Analytics Market Overview


Spark On Fire: Why All The Hype?

Spark Summit event report: IBM unveiled big plans for Apache Spark this week, but it’s not alone in banking on this open source analytics platform. Here’s why this still-green technology is quickly gaining adoption.

Why is Apache Spark, an open-source project that only reached its 1.0 release a little more than a year ago, getting so much attention?

IBM, for one, announced a big commitment to the platform at this week’s Spark Summit in San Francisco. And as IBM execs told analysts at the company’s new Spark Technology Center here, it’s an all-in bet to integrate nearly everything in the analytics portfolio with Spark. Other tech vendors betting on Spark range from Amazon to Zoomdata, even as real-world deployments number in the hundreds and are mostly experiments and proof-of-concept projects.

Spark offers unified access to data, in-memory performance and plentiful processing and analysis options.

Describing Spark as “an operating system for analytics,” IBM execs cited Spark strengths including:

  • Abstracted data management. Spark lets data scientists focus on the analysis, not data movement, as data pipelines can access data where it lies, whether that’s in Hadoop, in a database, or on Spark’s own cluster.
  • Rich data-processing functionality. The Spark Core provides a flexible data-processing platform supporting distributed task management, scheduling, and basic I/O functionality as well as transformations such as map, filter, reduce and join.
  • In-memory performance. The Spark Core delivers up to 100 times faster performance than Hadoop MapReduce. In iterative machine learning and other predictive analyses, you can squeeze in that many more processing cycles against all of the data, not just samples of data.
  • Plentiful analytics options. IBM execs didn’t dwell on this strength, but Spark offers multiple analytic libraries that run on top of the core, including MLlib (machine learning), Spark SQL, GraphX, Spark Streaming and, released last week, SparkR. IBM plans to run its own software on top of the platform, including SPSS, IBM Streams, and (soon to be open sourced) SystemML, all of which are being ported to run on Spark. (A brief code sketch follows this list.)
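
To make these strengths concrete, here is a minimal PySpark sketch of the kind of pipeline described above. It is illustrative only: the file path, field layout and choice of k-means are my own assumptions, not anything IBM or Databricks showed.

    # Minimal PySpark sketch; path and field layout are hypothetical.
    from pyspark import SparkContext
    from pyspark.mllib.clustering import KMeans

    sc = SparkContext(appName="SparkStrengths")

    # Abstracted data access: the same call reads from HDFS, S3 or local disk.
    lines = sc.textFile("hdfs:///data/sensor_readings.csv")

    # Core transformations (map, filter) run distributed across the cluster.
    readings = (lines
                .map(lambda line: line.split(","))
                .filter(lambda cols: cols[0] != "timestamp")  # skip header
                .map(lambda cols: [float(cols[1]), float(cols[2])]))

    # cache() pins the data in memory, which is what makes iterative
    # algorithms like k-means so much faster than MapReduce.
    readings.cache()

    # MLlib runs directly on the cached dataset -- no data movement.
    model = KMeans.train(readings, k=3, maxIterations=20)
    print(model.clusterCenters)

The point is that data access, transformation and machine learning all happen through one API against one in-memory dataset.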

In contrast to IBM, analytics rival SAS views Spark as a competitor. For more than three years, SAS has been working on its own big-data-capable, in-memory analytics platform, the SAS LASR Analytic Server, which runs SAS Visual Analytics (VA) and SAS Visual Statistics (VS) as well as SAS analytics libraries. The SAS LASR Analytic Server can be deployed on a single server, on a dedicated cluster, or on top of Hadoop (see my just-published report, “The Era of Self-Service Analytics Emerges: Inside SAS Visual Analytics and Visual Statistics”).

When I asked SAS about its views on Spark earlier this week, product manager Mike Ames offered the following (lukewarm) statement: “While Spark is currently an immature technology, it shows promise with rapid adoption as a result of its data processing capabilities. SAS and Spark are very capable of coexisting, with products such as SAS Data Loader for Hadoop, which can push transform logic to Spark.”

SAS is right about Spark’s immaturity. I’ve talked to practitioners and integrators who acknowledge that the technology is still green. Like a lot of 1.X open-source software projects, Spark is still buggy and it doesn’t have all the table-stakes systems-management, security and high-availability features that many enterprises would insist upon before running mission-critical workloads.

That’s not to say that Spark is incapable of running production workloads reliably or at scale. Hundreds of companies are doing just that, but they tend to be pioneers with an appetite for innovation and strong engineering teams that are willing to fix bugs and develop best practices where none exist.

Plenty of early adopters presented at Spark Summit, including Airbnb, Baidu, Edmunds.com, NASA, NBC Universal, Netflix, Shopify, Toyota Motor Sales, and Under Armour. Keynoter James Peng, principal architect at Baidu, the Google of China, described that company’s 1,000-plus-node, petabyte-scale Spark deployment, which is delivering 50 times faster performance than the conventional MapReduce processing it previously relied upon. Baidu is also pioneering the use of Spark SQL and the Tachyon caching layer.

MyPOV on Spark

Yes, it’s early days for Spark, but there’s good reason why IBM described it as “potentially the most significant open-source project of the next decade.” SAS acknowledged Spark’s data-processing capabilities, but that’s just the starting point. Even IBM’s characterization of Spark as “an operating system for analytics” seems like a left-handed compliment.

With all those libraries on top of the Spark Core (machine learning, SQL, graph, streaming and R), Databricks and the Spark community are trying to build out an all-purpose analytics platform capable of supporting many forms of analysis as well as blended analyses. By blending machine learning and streaming, for example, you could create a real-time risk-management app (see the sketch below). What’s more, Spark supports development in Scala, Java, Python and R, which is another reason the community is growing so quickly.
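
As a sketch of what such a blended analysis could look like, the hypothetical PySpark snippet below scores a live transaction feed with a model trained offline; the host, port, model path and feature layout are invented for illustration.

    # Hypothetical sketch: blending Spark Streaming with an MLlib model.
    from pyspark import SparkContext
    from pyspark.streaming import StreamingContext
    from pyspark.mllib.classification import LogisticRegressionModel

    sc = SparkContext(appName="RiskScoring")
    ssc = StreamingContext(sc, batchDuration=5)  # 5-second micro-batches

    # A risk model trained offline with MLlib, loaded for live scoring.
    model = LogisticRegressionModel.load(sc, "hdfs:///models/risk_model")

    # Each incoming line is a comma-separated feature vector.
    transactions = ssc.socketTextStream("feed.example.com", 9999)
    features = transactions.map(
        lambda line: [float(x) for x in line.split(",")])

    # Score every micro-batch with the same model: ML meets streaming.
    features.map(lambda v: model.predict(v)).pprint()

    ssc.start()
    ssc.awaitTermination()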

At Spark Summit, Amazon Web Services announced a free Spark service running on Amazon Elastic MapReduce, and IBM announced plans for Spark services on BlueMix (currently in private beta) and SoftLayer. These cloud services will open the floodgates to developers, and IBM’s contributions will surely help to harden the Spark Core for enterprise adoption.

In short, it’s hard to see any open-source project matching Spark on depth and breadth of analysis and development flexibility (despite a prominent tout of Apache Flink at last week’s Hadoop Summit). And that’s why you’re hearing so much about Spark.


ClusterHQ enables persistent data containers with Flocker

Probably the hottest area in building next generation applications these days is microservices running on containers. One key challenge with containers has been that they do not allow persistent data. Enter ClusterHQ, which addresses this with its Flocker offering, which became generally available today.
[Image: Docker containers get persistent with Flocker by ClusterHQ]
Let’s dissect the press release in our customary style (the press release can be found here):

SAN FRANCISCO – June 17, 2015 – ClusterHQ, The Container Data People, today announced the general availability of Flocker 1.0, its container data management software. By enabling stateful Docker containers to be easily moved between servers, Flocker facilitates widespread production deployment of containers for databases, queues and key value stores. Modern applications are being built from both stateless and stateful microservices; Flocker makes it simple and practical for entire applications, including their state, to be containerized to take full advantage of the portability and massive per-server density benefits inherent in containers. This operational freedom allows DevOps organizations to increase the value of their Docker investment and opens the door for containers to be used in a greater variety of mainstream enterprise use cases in production. Flocker 1.0 is available as an Apache-licensed download at clusterhq.com/flocker.

MyPOV – Good to see support not just for databases but also for queues and value stores. Many next generation application use cases go beyond traditional databases and use queues and/or (named) value stores. Rightfully ClusterHQ stresses that more use cases now become possible for containers. And no surprise, Flocker is open source under the Apache license.

“Containers are emerging as one of the most important innovations in modern computing since virtualization. Forward-thinking developers and organizations are latching onto this phenomenon for good reason. Containers are revolutionizing how microservice-based applications are built and operated, by delivering orders of magnitude better density, and unprecedented portability of applications. By making it possible for containers and their data volumes to be moved in production IT environments, innovative vendors are enabling the business benefits of containers to reach even further, removing barriers to Dockerizing everything,” noted Holger Mueller, VP and principal analyst, Constellation Research.
Empowering portability of containers and their data as a single unit, a prerequisite for many production uses, Flocker lets DevOps teams easily containerize their stateful microservices, thus consolidating their entire distributed application into an all-Docker development and operations environment. With everything running in containers, IT operations can be simplified into a unified set of operational processes. Moreover, making it easy to containerize stateful microservices decreases costs so that far more applications can be run on a given set of hardware. Because Flocker enables easy migration of stateful containers, organizations now have the flexibility to accommodate common IT processes such as routine maintenance and load balancing of workloads across servers. The downstream impact is meaningful in today’s real-time economy: businesses can innovate faster and become more responsive to their customers.

MyPOV – In most cases we have seen development organizations shying away from stateful use cases for containers. Those who pushed onwards had to orchestrate complex operations to coordinate code (in containers) and data (in a variety of storage formats) with elaborate DevOps mechanisms. And most were so complex that they hurt what most next generation applications are all about – elasticity of resources. Now there is an option to get both handled in a single product / construct with Flocker.

“Organizations use Docker to accelerate their application development lifecycle and achieve frictionless portability of their distributed applications to anywhere. ClusterHQ identified early on that running stateful services in containers would help drive Dockerized applications more rapidly into production. Their technology helps organizations that want to take advantage of Docker’s benefits for stateful as well as stateless applications,” said Nick Stinemates, head of business development and technical alliances, Docker, Inc.

MyPOV – Good for Flocker to get a quote from the 800-pound gorilla of containers, Docker. It begs the question of what Docker’s plans are here – but for now it looks like a good (informal) partnership, as we often see these days.

Flocker provides an API and CLI for managing containers and data volumes, and works with multiple storage solutions including Amazon EBS, Rackspace Cloud Block Storage, any OpenStack Cinder-compatible device, EMC ScaleIO and EMC XtremIO. The pluggable nature of Flocker is designed to work with any storage system, delivering the ability to integrate dockerized applications with existing storage backends. Any company or community that wants its storage to work with Docker can easily write a driver for Flocker.

MyPOV – Good to see a modularized, extensible architecture for Flocker, with pluggable drivers for different storage systems. And those left out right now – though the initial scope delivered by Flocker is an impressive first release – can build their own drivers. A good showcase for the openness and dynamism of the open source ecosystem.
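
To illustrate the core idea in code (this is purely conceptual Python, not Flocker’s actual API or CLI): a stateful container and its data volume are treated as a single schedulable unit, so moving one implies moving the other.

    # Conceptual sketch only -- not Flocker's API. The point: container
    # and volume move between hosts as one unit, so state follows code.
    from dataclasses import dataclass

    @dataclass
    class Volume:
        name: str
        host: str

    @dataclass
    class StatefulContainer:
        image: str
        volume: Volume
        host: str

        def move_to(self, new_host: str) -> None:
            # Replicate the volume to the target host first...
            self.volume.host = new_host
            # ...then reschedule the container where its data now lives.
            self.host = new_host

    db = StatefulContainer("postgres:9.4", Volume("pgdata", "node-1"), "node-1")
    db.move_to("node-2")  # code and state stay together
    assert db.host == db.volume.host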

“The efficiency and operational freedom of using containers for both compute and storage create a competitive advantage so significant that smart organizations are seeking ways to containerize more of the applications that are strategic to their business. The ClusterHQ mission is to make it as easy to containerize data as it is to containerize compute,” said Mark Davis, CEO of ClusterHQ.

MyPOV – Key quote with the ClusterHQ mission – make it as easy to containerize data as it is to containerize compute.

In addition to this news, ClusterHQ today announced a partnership with EMC, as well as the compelling findings from a recent third party survey regarding current and planned adoption of container technology across organizations of all sizes. To learn more about the EMC partnership visit: clusterhq.com/2015/06/17/emc-partnership/; to access the survey and report visit: clusterhq.com/assets/pdfs/state-of-container-usage-june-2015.pdf […]

MyPOV – Good to see ClusterHQ getting the interest of storage heavyweight EMC. And likewise good to see that EMC is looking at this dynamic ecosystem, which could disrupt how the storage market operates.

Overall MyPOV

Good to see a vibrant microservices ecosystem. Even better to see vendors like ClusterHQ tackling tough and crucial capabilities that expand the use cases that can be automated with container-based next generation applications. From my (more enterprise software informed) experience and network, the majority of use cases live in a persistent world. If you can't make a business process persistent, it may well not have happened. So making containers persistent probably increases by 3-5x the number of use cases that containers can address.

On the concern side, these are early days and this is not a trivial problem to solve, so it will be key to see first use cases and the success of early adopters. And it needs to work, reliably, 24x7. It also raises the ante for containers themselves: users cannot just start another 'container engine' to replace one that has gone bad, because if a stateful container crashes, state is lost. And that is not a good outcome for most use cases.

But overall congrats to ClusterHQ for providing a key capability to Docker with Flocker. A major step; we will be watching the adoption.

 


IoT, Mobility, Supply Chain Intelligence on GT Nexus Roadmap

I spent part of last week in a much warmer and sunnier location – Hollywood, Florida – attending the GT Nexus show, Bridges 2015. The event format was, no surprise, the usual roster of executives and customer stories paraded out on the main stage, as well as more intimate breakout sessions.

On main stage GT Nexus CEO Sean Feeney kicked off the event with an overview of where GT Nexus is and where it is heading. With over $156b worth of goods passing through the GT Nexus systems and over 100k active users, GT Nexus has a clear foundation for continued success in the supply chain world. Mr. Feeney also highlighted the following areas of focus moving forward:

  • GT Nexus transportation management will remain a core offering, one that the company will continue to build on. Good to hear, since this is really their bread and butter and should continue to be their foundation.
  • The Internet of Things (IoT) will become an important play for GT Nexus. As the data surrounding their clients’ products and goods continues to grow, GT Nexus will be working to drive deeper into the insights this information can offer. With the explosion of sensors and the data they throw off, it makes strategic sense for GT Nexus to put together an offering around this space.
  • Growing mobility ties into the added data GT Nexus will be looking to leverage, since all that data does not mean much if the insights cannot be gained and shared wherever and whenever the business dictates.
  • Greater supply chain intelligence. This is GT Nexus’ #1 focus for R&D. Think of this as the platform that will allow clients and partners to move up the ladder to the ultimate goal of becoming more prescriptive rather than simply descriptive.
  • AppXpress, GT Nexus’ PaaS offering that lets customers and partners develop applications on its backbone. Clients such as Patagonia and Adidas have already been leveraging this for a year. Look for more clients and partners to take advantage of this offering.

These directions for GT Nexus are natural continuations of their existing business initiatives, but an interesting undertone for the firm is their continual work around their platform play. An interesting statement from the main stage, and one that I would argue is the most important that came from Sean Feeney, was around the amount of data that GT Nexus has passing through its solutions. The greater the continual focus on and improvement of this data, the greater the results that can be derived from these supply chains.

This notion of becoming the platform that empowers their customers with greater visibility is key to the future growth of GT Nexus. Companies such as Deckers, Patagonia and Caterpillar all provided varied use cases for how they are leveraging the GT Nexus platform: visibility first, and then the use cases that emerged. Greater efficiency with supplier relationships, streamlining global movement of inventory and being more efficient in the ability to track & trace product were highlighted via customer discussions. Use cases that might not have the “wow” factor, but use cases that lead to real results and measurable ROI.

What do all these use cases have in common? GT Nexus is the platform being leveraged to provide the necessary data and insights for these companies to implement new use cases. Can GT Nexus continue to build on this platform? There is no reason to doubt that, with continued focus and commitment, GT Nexus’ platform play will lead to greater opportunities. It could lead to much greener pastures.




Unit4 Selects Microsoft Azure for ‘Self-Driving’ ERP Vision - Cloud, Machine Learning, Office and PaaS are the attractors

Earlier today Unit4 announced that it selected Azure as the platform upon which it will build its next generation ERP applications.
 
Let’s digest the press release in our customary Constellation style:

Utrecht, Netherlands, June 16, 2015 – Unit4, a fast growing leader in enterprise applications for service organizations, today announced a strategic collaboration with Microsoft that will speed to market the creation of self-driving business applications and ERP for people-centric organizations.
MyPOV – The enterprise software ISVs are picking their clouds – last week, e.g., JDA picked the Google Cloud platform; this week it is Unit4 with Microsoft Azure. A good move for both – but more on that later.

Unit4 will use the smart technology in Azure’s PaaS platform components and Microsoft Office solutions including predictive analytics, machine learning, event stream analysis and complex event processing. Combined with Unit4’s People Platform, the collaboration boosts speed of innovation and will see customers benefiting from a new approach to enterprise computing.
MyPOV – Unit4 is using Microsoft where Microsoft is putting more R&D dollars than Unit4 likely has overall – so a smart move to leverage that. With Machine Learning, Office and PaaS capabilities, Microsoft has created an attractive set of functions that should attract more enterprise software vendors beyond Unit4. But Unit4 gets the prize for first mover.

Unit4 with Microsoft – making self-driving ERP a reality
To deliver on the promise of self-driving ERP, Unit4 and Microsoft will establish a development team to ensure that innovations in Azure and Office are used as quickly as possible within the Unit4 People Platform. Contextual information will be derived by combining business data within Unit4 applications with relevant information from Office 365, Office Delve, email, calendar, Yammer and more. This will enable unprecedented business value. Professional services firms will be able to combine an analysis of historical data with predictive analytics to gain valuable insight on which projects to bid for. Public sector organizations will improve fraud detection on bill payments with advanced pattern recognition and machine learning. Not-for Profits will have the ability to match campaigns to donation patterns and target donors more effectively.


MyPOV – Unit4 presented its next generation application platform last week at its North American analyst summit (my take here). It’s a modern and attractive architecture, and one that it is good to have Microsoft as a technology partner for. But know-how transfer is always challenging, and with all software the devil is in the details, so it is good to see both vendors forming a team. The Office 365 and Office Delve integration can be a substantial sales channel for Unit4, assuming it is done right.

“Microsoft and Unit4 share a long history of increasing productivity for enterprises and this collaboration will accelerate the innovation necessary to make self-driving ERP a reality,” says Jose Duarte, Unit4 CEO. “Like a self-driving car, self-driving ERP takes care of the tasks that are better served by technology, leaving people to focus on the exceptions that need human intervention. Unit4 and Microsoft’s combined know-how and technology will set a new industry standard for business applications for people-centric industries.”
MyPOV – Good quote by CEO Duarte, further clarifying the vision of self-driving ERP.

To achieve this vision, ERP systems require access to complete and high-quality data. Traditional ERP provides users with non-intuitive empty forms, asking them to enter all the required data, leading to non-intuitive data entry, errors and frustration. Self-driving ERP dramatically simplifies data collection by utilizing key technologies such as predictive analytics to provide meaningful in-context information to users. Such information enables intuitive data entry based on pre-populated forms and in-context yes-no validation questions. It results in great user experience all the way from desktop to mobile to wearables.
MyPOV – Unit4 shares a little of its magic around self-driving ERP – which has a heavy context component. We know context is very powerful, but ERP solutions have relied on human users to provide that context for the longest time. If Unit4 can capture and insert the context into ERP applications, the result will be very powerful, eye-opening ERP capabilities.
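
As a toy illustration of the pattern (my own sketch, not Unit4’s implementation), pre-populating a form from a user’s history and asking only a yes/no confirmation could look like this:

    # Illustrative sketch, not Unit4's code: predict form defaults from
    # history, then ask only for a yes/no confirmation.
    from collections import Counter

    history = [
        {"project": "P-100", "hours": 8},
        {"project": "P-100", "hours": 8},
        {"project": "P-200", "hours": 4},
    ]

    def predict_entry(entries):
        # Trivial stand-in for predictive analytics: most frequent value.
        project = Counter(e["project"] for e in entries).most_common(1)[0][0]
        hours = Counter(e["hours"] for e in entries).most_common(1)[0][0]
        return {"project": project, "hours": hours}

    proposal = predict_entry(history)
    answer = input("Book %(hours)sh on %(project)s today? [y/n] " % proposal)
    if answer.lower() == "y":
        print("Posted:", proposal)  # humans handle only the exceptions

Real predictive analytics would of course replace the frequency count, but the user-facing contract is the same: pre-filled fields plus a yes/no validation.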

Working together in the cloud
Azure will be Unit4’s preferred public cloud deployment platform globally. Microsoft’s approach to Azure IaaS cloud deployments and Unit4’s Cloud Your Way methodology, which gives customers the flexibility to run their applications in public and private clouds, are key components and the foundation of this collaboration. Both organizations also cater to a hybrid computing environment which will be the reality in the enterprise world for some time to come. Azure enhances Unit4’s offering through flexible security, data privacy and residency. With a deep understanding of the challenges facing the modern enterprise, Unit4 Business Applications with Azure offer highly scalable ramp-up and ramp-down capabilities and provide customers with the flexibility and adaptability they need to run their business without disruption.

MyPOV – More detail on what both vendors plan to deliver. Good to see acknowledgement of the hybrid reality, which is key for ERP sales these days. Data privacy and residency are another key component, so good to see that both are looking at this.

“Unit4’s business applications and People Platform made them a strategic company for us to work with to help realize self-driving ERP,” stated Nicole Herskowitz, Senior Director of Product Management, Microsoft Azure. “We are excited to be working with Unit4 and through this collaboration, we are not only delivering accelerated value but also enabling faster time to innovation in the cloud for our customers.”
MyPOV – Good for Unit4 to get a product development executive for the quote, which always bodes for more commitment than, e.g., a business development executive. Both vendors will need to work closely together to pull this off, and it is good to see that the senior partner, Microsoft, has skin in the game. […]

Overall MyPOV

It’s time for the smaller enterprise software vendors to pick their clouds… Last week it was JDA (with Google); NetSuite also picked Microsoft (though to a lesser extent than Unit4). Even SAP struck a partnership with IBM last fall. It all started with Infor choosing AWS a year ago.

Unit4 has done a very good job of articulating how key Microsoft capabilities in Azure, Machine Learning, Office and PaaS will be leveraged. They are crucial to making Unit4’s vision of ‘self-driving’ ERP a reality. As such, Unit4 probably takes on the largest dependency of all ERP vendors, but it is also up for the biggest upside. The good news is that Microsoft needs to make all these capabilities work in order to attract Azure business. So it is a pretty safe bet for the much smaller Unit4 to make.

On the concern side, Unit4 needs to make a very compelling, but also not trivial, ERP vision happen with ‘self-driving’ ERP. No other vendor has shipped dynamic context functionality with the breadth of an ERP system. No small undertaking, and Unit4 needs to make it work. But it is better to have to execute on a compelling vision than to have no vision at all. We expect Microsoft to support Unit4 as best it can, as it needs to create an enterprise ISV showcase in the overall IaaS and PaaS market.

Overall it is good to see smaller vendors going after a better R&D return by leaning on the technology offerings of larger technology partners. To some extent that has always happened in the past, e.g. for RDBMS. In the 21st century the dependency gets bigger, but with that the applications also become more powerful. Congrats to both vendors on this partnership; we are curious to see what will be developed.

 


SAP Releases IoT, Big Data-Ready HANA - SPS 10

This morning SAP announced SPS 10, the latest release of SAP HANA, its in-memory database.
 

So let’s dissect the press release in our customary style (it can be found here):

NICE, France — June 16, 2015 — SAP SE (NYSE: SAP) today announced the release of service pack 10 (SPS10) for the SAP HANA® platform, helping customers successfully extend all the key functionalities of their core business to the edge of the network where remote business transactions and events actually occur. The latest release delivers new capabilities that help customers connect with the Internet of Things (IoT) at enterprise scale, manage Big Data more effectively, further extend high availability of data across the enterprise and develop new applications. The announcement was made at SAPinsider HANA 2015, being held June 16-18 in Nice, France.
MyPOV – Good to see SAP following up on its plans from Sapphire (my event report here) and delivering on the key new functionality, which is all about support of Hadoop for Big Data and IoT scenarios. On top of that SAP has put in a lot of new functionality beyond good housekeeping – but let’s read on.

“SAP HANA gives customers one integrated platform for transactional and analytic workloads,” said Quentin Clark, chief technology officer, SAP. “The new capabilities of SAP HANA ensure data center-readiness, synchronize data to any remote system, extend high availability and disaster recovery of enterprise data and perform advanced analytics. We are readying our customers for the inevitable digitization of our entire economy.”
MyPOV – Good quote from CTO Clark, correctly focusing on necessary (and some would say overdue) good database capabilities like data center readiness, remote synchronization and HA / DR. We know a number of enterprises have been waiting for these capabilities to start or extend their investments into HANA – so it’s good to see SAP delivering on these.

Connecting to the Internet of Things at Enterprise Scale
The new remote data synchronization feature of SAP HANA enables organizations to synchronize data between the enterprise and remote locations at the edge of the network. Developers can now build IoT and data-intensive mobile apps that leverage SAP HANA remote data synchronization between the enterprise and remote locations via the SAP® SQL Anywhere® suite – a leading enterprise-ready, embeddable database technology with more than 12 million licenses. Now, enterprise data can be securely captured, accessed and fed back into SAP HANA from remote workplaces such as retail stores and restaurants. In addition, customers can collect and analyze IoT data for performing critical tasks at distant locations including predictive maintenance on ships, pumping stations and in mines with low bandwidth or intermittent connections or even while offline.

MyPOV – SAP keeps harping on the IoT use case, which finally, with Hadoop support, becomes a real strategy for HANA customers. And it is good to see that SAP is leveraging assets from the former Sybase, the venerable but proven SQL Anywhere. SAP being able to use the SQL Anywhere offline and remote capabilities, together with the synchronization functions, is a good move. More things than we think don’t have continuous, good, reliable wireless (or even network) access.
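
The underlying store-and-forward pattern is simple to sketch (illustrative Python, not SQL Anywhere’s actual API): writes are always captured locally and replayed upstream once a connection is available.

    # Illustrative sketch, not SQL Anywhere's API: offline-first capture
    # with replay to the central system when connectivity returns.
    class RemoteSync:
        def __init__(self):
            self.outbox = []  # a durable local queue in a real system

        def record(self, event):
            self.outbox.append(event)  # always write locally first

        def sync(self, connected, upload):
            if not connected:
                return 0  # stay queued until the link comes back
            sent = len(self.outbox)
            for event in self.outbox:
                upload(event)  # push to the central database
            self.outbox.clear()
            return sent

    node = RemoteSync()
    node.record({"pump": 7, "vibration": 0.9})
    node.sync(connected=False, upload=print)  # offline: nothing sent
    node.sync(connected=True, upload=print)   # back online: queue drains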


Streamlining Data Access and Management of Big Data
Businesses can continue to harness Big Data using expanded smart data integration capabilities of SAP HANA to the latest Hadoop distributions from Cloudera and Hortonworks. Additional enhancements for SAP HANA include faster data transfer with Spark SQL and a single user interface (UI) for SAP HANA and Hadoop cluster administration using Apache Ambari. IT organizations can also take advantage of new rules-based data movement among multiple storage tiers based on business requirements. For example, organizations can set up rules to keep one year of data in memory and set a rule to move older data to disk storage or Hadoop. Finally, customers can have greater confidence in the data they are collecting with new smart data quality capabilities for SAP HANA to cleanse data and merge duplicates by using an easy-to-use, Web-based development workbench.
MyPOV – A very good (and overdue) move by SAP, allowing the usual hot, less hot and cold separation of data. A year ago, Hadoop was a bad word in SAP circles; it is remarkable and laudable to see SAP turning the corner here and supporting the most prominent enabling database technology for next generation applications. Spark was already mentioned at Sapphire and will be key for SAP to make the future offering fly. More to come.
It is also key for SAP to support both Cloudera and Hortonworks on the distribution side; SAP cannot play favorites here (and should not either). It is always good to see SAP use common open source tools like Ambari. And good to see more data cleansing and quality options, something always good to have and which previously required 3rd party solutions.
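
To show what rules-based tiering means in practice, here is a small illustrative sketch (my own Python, not SAP’s syntax) of the “one year in memory, older data to disk or Hadoop” rule:

    # Illustrative sketch, not SAP HANA syntax: age-based tier selection.
    import datetime

    TIERS = [
        (datetime.timedelta(days=365), "in-memory"),    # hot: last year
        (datetime.timedelta(days=3 * 365), "disk"),     # warm: 1-3 years
    ]  # anything older falls through to Hadoop (cold)

    def tier_for(partition_date, today):
        age = today - partition_date
        for max_age, tier in TIERS:
            if age <= max_age:
                return tier
        return "hadoop"

    today = datetime.date(2015, 6, 16)
    for d in [datetime.date(2015, 1, 1), datetime.date(2013, 6, 1),
              datetime.date(2010, 3, 1)]:
        print(d, "->", tier_for(d, today))
    # 2015-01-01 -> in-memory, 2013-06-01 -> disk, 2010-03-01 -> hadoop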


Extending High Availability and Scalability Across the Enterprise
SAP HANA delivers new high availability and disaster recovery capabilities to help ensure data center readiness and support for always-on, mission-critical applications. Features such as 1-to-n asynchronous replication, auto-host failover for dynamic tiering and incremental backups help reduce system downtime and facilitate true business continuity across the enterprise. In addition, companies can leverage the NUMA (non-uniform memory access)-aware architecture of SAP HANA to support large scale systems with more than 12TB of memory to rapidly process large data sets and improve overall business performance. Enhancements in workload management help further improve mixed workload performance to optimize resources more effectively.

MyPOV – Good to see SAP extending and catching up on HA and DR capabilities – something customers have been focusing on for quite some time. Dynamic tiering is a key capability to optimize cost and performance, and something customers equally welcome. Finally, NUMA support is key to keep HANA on the cutting edge of both processor and in-memory capability, an important move by SAP. Though SAP has not been explicit on this, one can almost read between the lines the learnings of SAP running more and more HANA applications for its customers. With that come important lessons learned and key additions to the product. [...]


Innovating Through Advanced Analytics
With the expanded data processing capabilities in SAP HANA SPS10, businesses can accelerate the development of powerful applications with advanced analytics. SAP HANA text mining now extends to the SQL syntax, making it easier for developers to build next-generation applications. The new spatial processing enhancements of SAP HANA include support for multidimensional objects and spatial expressions in SAP HANA models or SQLScript. As a result, developers can incorporate engaging visualizations in their business applications.
MyPOV – To a certain point SAP HANA comes full circle, as it started with TREX – an in-memory search solution that SAP developed a long time ago. Allowing SQL queries to enable text mining means the text search of TREX has made it fully into the database world, accessible with the universal database language SQL. Moreover, good to see SAP keep investing in spatial support, which is critical to building next generation applications that model the real world. Humans and things have locations, and knowing and understanding them is crucial for a modern database.

Strong Momentum on SAP HANA
The number of customers transforming their business with SAP HANA is dramatically increasing. SAP HANA currently has more than 6,400 customers, almost doubling from only one year ago. SAP HANA Cloud Platform has quickly built momentum with approximately 1,400 customers. SAP Business Suite 4 SAP HANA (SAP S/4HANA) has driven tremendous interest out of the gate with more than 370 customers in 2015 alone. The SAP Business Warehouse application on SAP HANA continues to have strong traction with over 1,900 customers. Adoption of SAP HANA by startups has soared, with more than 1,900 leveraging the SAP HANA platform today. In addition, there are more than 815,000 active users of SAP HANA.
MyPOV – Kudos to SAP for sharing the most complete numbers on SAP HANA adoption. These are impressive numbers, but it looks like progress may have slowed down a bit compared with the very high growth numbers from the early HANA years. But all high growth needs to be converted into sustainable growth based on real use cases, and here SAP has made good progress. Better 100 live customers and references than 500 interested parties that don’t do much (those numbers are completely made up by me, to illustrate the point).
 

Overall MyPOV

At the pre-kindergarten age of 4.5 years, HANA is growing up fast. It is good to see that SAP keeps supporting and extending older ‘growth spurts’ like spatial and not abandoning them. HANA is helping SAP internally to host applications built on HANA, and that daily work has an influence on the product roadmap and releases. Using your own HANA has advantages. (Ok the analogy has to stop somewhere).

Extending the HA / DR capabilities is key housekeeping that to a certain point was even overdue. The exciting new capabilities are the Hadoop support enabling Big Data and IoT use cases.

On the concern side, SAP needs to keep expanding SAP HANA capabilities and drive more customer adoption. It has made the right moves on the product side; now it needs to get the early customer adoption to showcase the new capabilities, but that’s a good and natural problem to have. And as SAP has not abandoned some early development strands (e.g. spatial) but keeps supporting and extending them, it is clear that SAP has the R&D know-how and budget to keep developing and investing in SAP HANA.

Overall congratulations to SAP for a new and rich SAP HANA release with SPS 10 – it’s time for the 4.5-year-old HANA to go to preschool soon (for the non-US-based reader, that is the year before 1st grade in the US).




Salesforce Unveils Next Generation Marketing Cloud; Now Any Journey Is Possible

I’m speaking at Salesforce ExactTarget’s Connections conference, where Salesforce has announced its next generation marketing cloud. For more information follow @salesforce, @marketingcloud and #CNX15 on Twitter. With the next generation Journey Builder, brands are empowered to create journeys that blur the lines of CRM and span the Salesforce Customer Success Platform – connecting journeys across sales, service, marketing and custom apps. Keeping that connectivity between departments is critical to the success of providing great customer experiences.

[Photo: Salesforce ExactTarget Connections (#CNX15)]

With the ubiquity of smartphones, connected products, apps, wearable devices and digital communications, there are trillions of customer interactions every day. The customer’s journey and every customer interaction – whether it’s engaging with a marketing campaign, speaking with a salesperson, getting a customer support case resolved or talking to other customers – is an opportunity to build a relationship and define the brand. Because of this, companies are now competing on customer experience.

In actuality, I think companies were always competing on customer experience. But we now have the evidence via social networking to show the effect of a bad experience or comment. Glad times have changed and the Internet became the voice of the people. Markets are conversations – or so say the guys that wrote The Cluetrain Manifesto. It’s a must read if you have not read it, no matter what business you are in. READ IT. Seriously. It predicts the future of where we are today and where we are going.

Delivering an exceptional customer experience across all channels is not easy. While marketers have ample access to customer data, activating that data and engaging customers with relevant content across every channel is a significant challenge. To address this challenge, marketers are moving from manually executed batch and blast campaigns to event-triggered automation and real-time personalization.

Salesforce’s Next Generation Active Audiences allows marketers to advertise based on a single view of the customer across the digital advertising ecosystem with partners Krux, Facebook, LiveRamp, LiveIntent, Neustar, Twitter and Viant. Leading global brands like FleetCor, Room & Board and the Michael J. Fox Foundation harness the Salesforce Marketing Cloud to connect with customers in a whole new way.

Here are more details on what these new additions include:

  • Next Generation Journey Builder: The next generation of Journey Builder empowers companies to guide customers on 1-to-1 journeys across channels and devices to ensure they always deliver the right message, at the right time, via the right channel.  Now companies can make any journey possible and connect every interaction across every department, from post-service customer satisfaction to product adoption programs, and loyalty programs to employee onboarding. Marketers can empower all teams within the organization to unleash the power of the world’s #1 CRM and connect all interactions along the customer journey to deliver customer success.

  • New Native Journeys With Sales Cloud and Service Cloud: New pre-built Sales Cloud and Service Cloud events and activities in Journey Builder make it easier than ever to manage customer journeys that span marketing, sales, service and other kinds of interactions.
  • New Pre-Built Journey Triggers: Now, for the first time, Salesforce objects like contacts, leads, accounts and cases, as well as custom objects, are available as pre-built triggers in Journey Builder. Marketers can automate inbound event-driven triggers, such as a customer joining a loyalty program or downloading an app, which then send the customer a message on any channel to begin their journey. Triggers can also automatically modify data in the customer contact record or set up wait times and decision splits to adjust the journey in real time based on customer interactions across sales, marketing and service (see the sketch after this list).

  • Next Generation Active Audiences: Today, Salesforce announces the next generation of its ad platform Active Audiences, which syncs ad targeting with CRM, empowering marketers to run more relevant ads across all of the places they execute campaigns. New partnerships will extend its scope to reach audiences across the broad display advertising ecosystem. Active Audiences can now help marketers:
  • Connect with customers and prospects across the display advertising ecosystem: New Marketing Cloud partners announced today — Krux, LiveRamp, LiveIntent, Neustar and Viant — will empower Salesforce users to activate CRM data and run ads across more than 100 digital advertising networks and technologies.

  • Coordinate targeted, relevant advertising to drive sales, service, and marketing journeys: Full integration with Journey Builder empowers marketers to bring targeted, relevant advertising in line with sales, service and marketing journeys. Active Audiences orchestrates digital advertising based on the customer’s entire experience with a brand – the emails and mobile messages they open, their purchase history, their engagement with the customer service team and where they are in a sales cycle.

  • Target people across social networks: Salesforce is a Facebook Marketing Partner for ad technology, content marketing, community management, and audience onboarding; a LinkedIn Certified Marketing Partner across sponsored updates and company pages; and a Twitter Official Partner. Through Active Audiences, marketers can reach their customers and lookalikes on Facebook and Twitter. With Salesforce Social.com, they can create and optimize media at scale across Facebook, Twitter and LinkedIn.
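
As a thumbnail of the trigger / wait / decision-split mechanics described above (purely illustrative Python; none of these names are Salesforce’s API):

    # Illustrative sketch, not Salesforce's API: an event trigger starts a
    # journey, waits, then branches on a decision split.
    def send(contact, message, channel):
        print("->", contact, ":", message, "via", channel)

    def opened_last_message(contact):
        return contact.endswith("a")  # stand-in for real engagement data

    def on_event(event, contact):
        if event == "loyalty_signup":          # pre-built trigger
            send(contact, "welcome", channel="email")
            # ...a configured wait time elapses (e.g. two days)...
            if opened_last_message(contact):   # decision split
                send(contact, "first_reward", channel="push")
            else:
                send(contact, "reminder", channel="sms")

    on_event("loyalty_signup", "maria")
    on_event("loyalty_signup", "john")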

Pricing and Availability

  • Salesforce Marketing Cloud and Journey Builder are generally available today for customers. Journey Builder pricing starts at $3,750 per month.

  • New Journey Builder activities, triggers and events with Sales Cloud and Service Cloud will be available in Q4 2015.

  • Active Audiences is generally available today for customers. New Active Audiences display features are expected to be generally available in Q3 2015. Active Audiences pricing starts at $4,200 per month.

Are you building next generation customer experiences? Make sure to submit your application to the SuperNova Awards for this fall at Constellation’s Connected Enterprise Conference.

@Drnatalie, VP and Principal Analyst, Constellation Research, Covering Marketing, Sales and Customer Service to Deliver Amazing Customer Experiences


Future of Work: Fuze and LiveMinutes Combined


My mantra for quite some time has been: “The key to successful collaboration is neither tools nor culture, it’s purpose.” With that in mind, I’m excited by one of the latest events in the social business market, Fuze’s acquisition of LiveMinutes. Fuze, which describes itself as “visual communication and collaboration,” is one of the popular new video conferencing vendors taking on market veterans like Cisco WebEx and Citrix GoToMeeting.

Over the last year or so I’ve had the opportunity to use Fuze several times and have always been quite impressed. It is easy to use on computers and mobile devices. It has an attractive, modern-looking user experience. But to be blunt… it always felt like just another video conferencing product. It was an accessory, not a place where people would go to do work. It did not fulfil a specific purpose, such as helping close sales deals or creating marketing collateral. Until now.

Enter LiveMinutes with its collaborative workspaces where people can create and share content. Combining Fuze and LiveMinutes results in Fuze Spaces, a hybrid of online real-time content creation, file-sharing, task management, chat and video conferencing all woven into one. While there are several collaboration platforms out there that offer a plethora of collaboration tools, including blogs, wikis, file-sharing, task management, web conferencing, etc., what I like about Fuze Spaces is the focus on the core idea that people get work done around projects. Those projects have a defined purpose. They are not just a “social stream” where people can post updates, nor just a web-conferencing tool for sharing slides. Instead, Fuze Spaces seamlessly combines the features people need, such as document co-authoring, file-sharing, text chat and video.

[Screenshot: Fuze Spaces]

Of course Fuze is not looking to pivot away from their sweet spot of meetings, but rather improve the entire meeting experience, from preparation (before), to participation (during), to follow-ups (after). What the combination of Fuze and LiveMinutes allows them to do is redefine what a meeting itself is. Rather than sticking to the cliché “60-minute scheduled event on everyone’s calendar”, Fuze Spaces helps blur the line between meetings and projects. Some work may be done collaboratively in real time while other parts could be worked on separately by individuals. Everything will be collected and available in the Space, providing all members access to everything related to the project.

One feature I particularly like is their in-context commenting system, where instead of posting feedback at the bottom or side of a page, people can comment right inline on the object being shared.

[Screenshot: Fuze Spaces inline annotations]

If your organization is evaluating new tools for team productivity, I suggest you add Fuze Spaces to your list.

 


Webinar About Best Practices: Customer Experience Management, Technology, Roles and Strategy

Is your brand following these best practices for customer experience management? Find out at this webinar on six steps to superb customer experience management, and here’s the research paper on best practices in customer experience management, technology, roles and the strategy required for success. As brands realize customer experience management is key to their overall strategy and long-term growth, Constellation Research recommends considering the following to deliver an integrated web, mobile, social, email and commerce experience:

Six Approaches Brands Must Adopt to Drive Experience Management

1. Decide Who Will Lead The Experience Management Strategy: A Competitive Advantage

Leaders of experience management must be effective communicators and able to bridge many disciplines and functional areas. They must keep their eye on the internal needs and strategy of the business while taking into consideration the prospect’s experience. This may mean organizations at the very least assign the CEO, CIO or CMO to this charge, though most of these roles are already overwhelmed with their current responsibilities; it is tough to add more and expect them to perform well.

2. Multi-disciplinary Skill Sets Required of Chief Experience Management Officer

Regardless of who takes on the role, leaders of experience management must be effective in communicating what the goals of the experience management team are, how they fit into the rest of the business and why they drive revenue. Experience management needs to focus on what customers are interested in and have concerns about, and on providing the information they need to make purchases.

3. Experience Management Technology and Integration

With strategy and leadership decided and processes mapped from the customer’s viewpoint, technology can be chosen and deployed to deliver on the brand’s promise. Brands should focus on creating meaningful, multichannel interactions that optimize the customer experience, improve conversions, scale business, and increase revenue via an interconnected platform.

4. Consider an integrated, interconnected technology platform: The need to provide a continuously connected and integrated experience is often difficult if the technology wasn’t designed to provide that from the start. Contemplate a comprehensive experience platform that can provide an elegant, integrated solution that connects channels, engagement automation and analytics and commerce, with external tools and databases, to drive exceptional customer experiences for each and every unique customer.

5. Strive for unity among channel connectivity: Customers expect you to recognize them when they engage with your brand, no matter what channel or device they use. And they expect you to remember previous interactions with them and keep the context of the conversation as they move from channel to channel or device to device. You will want your website, as the hub of experience management, to be directly connected to the email experience you provide, as well as to parallel simultaneous branded experiences in social, mobile, commerce and print.

6. Use predictive insights to deliver real-time, optimized responses: To provide an experience where customers can navigate across multiple devices (mobile or desk-bound), brands must deliver engagement and shopping experiences that recognize each device and automatically adjust interactions to deliver seamless experiences. You will want to be able to respond to each customer’s interactions in real time and extend relevant content and offers based on an individual’s real-time activity, when their engagement is at its highest.

Which steps are you following? All six or only a few? Use this as a guide to determine how close your organization is to best practices. Join R “Ray” Wang and me for the webinar to learn more!

@drnatalie, VP and Principal Analyst, Constellation Research, Covering Marketing, Sales and Customer Service to Deliver Amazing Customer Experiences

 


IBM Joins Apache Spark Bandwagon (and Coopetition)

IBM stole the day-one headlines at Spark Summit 2015 in San Francisco with a big endorsement of the open-source, big-data-analysis platform. But it’s sure to be a selective embrace, as IBM, like other commercial vendors, plans to offer its own software and services on top of Spark.

IBM threw its significant weight behind Apache Spark on Monday, calling the open-source, in-memory platform “potentially the most significant open-source project of the next decade.”

Among the moves announced, IBM will offer Spark as a service on its BlueMix cloud, open a Spark development center in San Francisco and redirect more than 3,500 IBM researchers and developers to work on Spark-related projects. It also promised to educate more than 1 million data scientists and data engineers on Spark through community partnerships and support for online courses.

The big news on day one of Spark Summit 2015 was IBM’s announcement that it will throw its weight behind the open source platform.

All of the above is great news for the Spark community. But is Databricks, the Spark development, certification and support firm, in danger of being eclipsed by big companies embracing the platform? Spark is the darling of the conference circuit this year, with Databricks executives often showing up at Informatica World, Alteryx Inspire15 and other events as keynote speakers. Even when official representatives aren’t there, Spark is often mentioned as a “Spark inside” enabler of new big data initiatives, as was the case at the Teradata Influencers’ Summit.

But the embrace of Spark isn’t always wholehearted. That’s because the platform supports multiple modes of analysis, including machine learning, SQL, R, graph and streaming. Hadoop distributor Cloudera, for example, was early to jump on the Spark bandwagon, but it touts the platform’s machine learning capabilities, not Spark SQL, which presents a threat to Cloudera’s Impala SQL-on-Hadoop component. Hortonworks and MapR also support Spark, but they give equal billing to Hive and Drill, their favored SQL-on-Hadoop options, while invariably showing Apache Storm in architectural diagrams as the streaming option instead of (or in addition to) Spark Streaming.

I’m set to hear more about IBM’s specific Spark plans here in San Francisco this week, but at last week’s Hadoop Summit in San Jose, a few IBMers informally told me the company is mostly interested in using the Spark in-memory platform and machine learning options. As for Spark SQL and Spark Streaming? These are two areas where IBM can offer its own technologies. What’s more, IBM is contributing its own SystemML machine learning software to the Spark community, building influence in this core area.

With a Spark service now available on BlueMix and thousands of IBMers now working on Spark-based applications, Databricks will see new competition for its eponymous Databricks platform (formerly called Databricks Cloud), which runs on Amazon Web Services. IBM’s move is also a challenge to analytics leader SAS, which has spent the last three years developing SAS Visual Analytics and Visual Statistics as its choice for in-memory big-data analysis (either on top of Hadoop or on a dedicated distributed cluster).

Even if commercial plans lie behind IBM’s embrace of Spark, Databricks executives weren’t about to throw cold water on any endorsements of the platform. “It’s great to see some of the large vendors in the community throwing their weight behind Spark,” Databricks executive Arsalan Tavakoli-Shiraji told me last week. “SAP is integrating Hana with Spark, IBM is embracing it, and Intel is also making a lot of contributions, so it’s great to see the community growing.”

Stay tuned for more from me this week from IBM, SAS and the Spark Summit as the fast-moving big-data analysis world moves even faster.

