
Google Cloud Platform - Takeaways Day #1 Keynote

We had the opportunity to attend Google's Cloud Platform event this week in San Francisco, a key event for Google in the ongoing 'battle for the public cloud'.



 
 

So take a look at my top 3 takeaways:



 

No time to watch - read on:


Greene Debut - Since Diane Greene joined Google in November last year, there has been a lot of expectation that she will move the Google offerings into a better place with the enterprise. In her short remarks she hit good points in regard to investment (almost $10B in 2015 alone) and TCO savings. As an avid sailor, she used revolutionary sailing foils as a metaphor for what Google wants to do for the enterprise.

GCP grows - Urs Hoelzle then walked us through key advancements of Google Cloud Platform (GCP), after unveiling the pitch line for GCP: better software, faster. What stuck with me is once again scale, the focus on security, machine learning, and the new buzzword 'NoOps' vs. DevOps.

3 layers of GCP - Then it was Brian Stevens' turn to share the three layers of GCP: Infrastructure and Operations, Application Development, and Data and Analytics. This structure formed his part of the keynote, with key announcements in each area, coupled with a major customer win.

 
  • Infrastructure and Operations - This year Google will add data centers in Tokyo and Oregon, and 10 more locations will come by 2017. Locations are key for speed and compliance, and it is good to see Google ramping up GCP locations. The key product announcement was Stackdriver, the new GCP Ops Console, which interestingly not only shows GCP loads and operations, but offers insights into 3rd-party clouds too - today AWS Cloud. The key customer win was Coca-Cola.
      
  • Application Development - The key demo here was around Kubernetes, scaling a load well on GCP - more interestingly, though, also in hybrid mode, which Google demoed with an Intel server on stage. The key customer win was Disney Interactive.
     
  • Data & Analytics - On the product side Google showed Data Studio 360 and then unveiled Cloud Machine Learning, a key step forward in how to build 'true' analytics applications. The key customer win was Spotify, which was demoed impressively.
 

MyPOV

I tweeted my Top 3 questions before the event on what enterprises (and I) are looking for Google to address - here they are:
 

So how did Google do?

Ad 1 - Google was not too explicit here - but being able to monitor loads in AWS Cloud and move them makes clear what the options for enterprises are. And with a strong focus on Machine Learning on top of BigData, Google thinks it can out-feature AWS Cloud and Azure.

Ad 2 - As we know from enterprises already, it's hard for them to figure out how Google and GCP can specifically help them. There is perceived value, but it is not tangible enough. And while Coca-Cola, Disney Interactive and Spotify are great customer wins from a pure-bred cloud showcase, they don't give the average CIO confidence that GCP can power their use cases.

Ad 3 - Google did a very good job here and has probably the most impressive offering in the market. But again, the question is how it relates to the enterprises out there. It was very impressive to see how Spotify uses the Machine Learning and BigData tools - probably a key reason for choosing GCP - but how it relates to the average CIO looking at Google was not addressed.

So overall a good start for Google. It has shown once again what it does well: working with enormous amounts of data, processing it with a lot of compute at a very attractive price - but we knew that before. Good to see focus and progress on security and administration, with a multicloud angle. But it is only Day #1 at the event - stay tuned for more tomorrow.




 
More about Google:
  • News Analysis - Google launches Cloud Dataproc - read here
  • Musings - Google re-organizes - will it be about Alpha or Alphabet Soup? Read here
  • Event Report - Google I/O - Google wants developers to first & foremost build more Android apps - read here
  • First Take - Google I/O Day #1 Keynote - it is all about Android - read here
  • News Analysis - Google does it again (lower prices for Google Cloud Platform), enterprises take notice - read here
  • News Analysis - Google I/O Takeaways - Value Propositions for the enterprise - read here
  • Google gets serious about the cloud and it is different - read here
  • A tale of two clouds - Google and HP - read here
  • Why Google acquired Talaria - efficiency matters - read here
Find more coverage on the Constellation Research website here and check out my magazine on Flipboard and my YouTube channel here

Cloudera Takes to the Cloud, Highlights Industry Use Cases

Cloudera plans public cloud push as applications multiply in financial services, insurance, life sciences, retail and telecommunications. Hadoop may not be easy, but it is gaining mainstream adoption.

Hadoop is going mainstream, it’s increasingly moving into the cloud, and it’s delivering solid business value. These are three key themes that were highlighted this week at Cloudera’s third annual Analyst Day in San Francisco.

Inside Cloudera Analyst Day

Cloudera shared strong evidence of broad adoption and business value through panels that dug into the details of real-world deployments. Here’s a short list of the types of applications seen across five industries:

Financial Services: Cloudera has more than 100 customers in this category, and use cases typically start with governance and security. Big banks, for example, have to retain transactional data for regulatory reasons, and many have embraced Hadoop for high-scale data retention and analyses including anti-money-laundering and stress testing. As the breadth of data in a data lake spreads across lines of business, financial services firms develop 360-degree views of customer preferences and behaviors (customer 360).

Insurance: Insurers use Hadoop for customer 360 and claims-fraud analysis applications. More mature adopters are moving into Internet of Things (IoT) applications such as usage-based pricing. In the automotive arena, for example, pay-as-you-drive and how-you-drive pricing will be ubiquitous within a few years, a Cloudera exec predicted. The platform is making high-scale analysis of telematics data practical and affordable.

Life Sciences: Whether it’s healthcare providers, pharmaceutical companies or crop sciences firms, these organizations are modernizing their data infrastructures to handle data at unprecedented scale. Cloudera customer Cerner, which analyzes electronic medical records on a Hadoop-based platform, has come up with an automated way to predict sepsis infections in hospital patients. The alerts have reportedly saved more than 3,000 lives to date.

Retail: It’s all about getting closer to the customer, differentiating products and services, and optimizing inventory to maximize sales and keep customers happy. That’s a journey that starts with resolving customer identities across channels and then better integrating data from across channels. These first two steps get you to the most valuable stage of understanding customer interactions, behaviors and value across all channels and over time.

Telcos: Telcos are big users of Hadoop, and they start with governance-oriented call-data-record remediation and customer-churn analysis. Operations groups use the platform for network troubleshooting, security and risk analysis. As use of the platform matures, front-end and back-end insights are integrated for proactive network optimization, customer service and anti-churn initiatives.

The four themes that cut across all industries are driving customer insights, improving products and services, reducing risk and modernizing IT infrastructures. On this last point, Cloudera said that only 15% of its 850-plus enterprise customers have deployed its software on public clouds, but that’s where it’s seeing the fastest growth. “Data that’s born in the cloud wants to stay in the cloud,” observed Cloudera Chief Strategy Officer, Mike Olson, and that trend will accelerate as IoT scenarios flourish, he added.

Cloudera plans to ramp up in this area with Cloudera Director, an automated cloud deployment tool and abstraction layer that hides the complexities and differences among various clouds and deployment options, including Amazon Web Services, Google, OpenStack and VMware. With Cloudera Director 2.0, released in January, Cloudera added a cluster-cloning feature and the ability to automatically grow and shrink clusters to save money.
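The grow-and-shrink idea is easy to sketch in a few lines. The following is an illustrative sizing loop, not the actual Cloudera Director API; the function name, thresholds and capacity figures are assumptions made for the example:

```python
# Illustrative sketch of elastic cluster sizing, in the spirit of Cloudera
# Director 2.0's grow/shrink feature. All names and numbers are hypothetical;
# this is not the Director API.

def target_cluster_size(current_nodes, pending_tasks, tasks_per_node,
                        min_nodes=3, max_nodes=50):
    """Return the node count needed to absorb the pending workload,
    clamped to a minimum (availability) and maximum (budget)."""
    needed = -(-pending_tasks // tasks_per_node)  # ceiling division
    return max(min_nodes, min(max_nodes, needed))

# Grow under heavy load, shrink when idle - paying only for what is used.
print(target_cluster_size(10, pending_tasks=900, tasks_per_node=20))
print(target_cluster_size(10, pending_tasks=40, tasks_per_node=20))
```

The money-saving part is the shrink path: when the pending workload drops, the target falls back toward the minimum and the surplus nodes can be released.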

MyPOV on Cloudera Analyst Day

There was a bit of a disconnect between what Cloudera talked about in its market observations and strategy overviews and what it detailed in its product roadmap (which was largely under NDA). For example, there was no signal of new cloud deployment capabilities beyond Director 2.0, other than supporting Microsoft Azure as a deployment option. And despite all the talk of industry-specific use cases, Cloudera executives only vaguely alluded to blueprints, templates, and frameworks — from Cloudera and from partners — that give customers a starting point on proven applications. It’s nice to hear about vertical use cases, but Cloudera has much more work to do on broad platform acceptance before it can go too far down the vertical-industry path.

At one point during the day Cloudera described its technology as being “fast and easy,” but that discussion reminded me of SAP couching its next-generation ERP suite as being “simple.” When I questioned execs about the use of these terms, Chief Strategy Officer Mike Olson qualified that Hadoop is fast and easy as compared to relational database approaches when trying to solve high-scale data challenges. He also pointed to efforts Cloudera has made to simplify deployment with tools like Navigator Optimizer and Cloudera Director, which speed and ease analysis and optimization of SQL workloads and cloud deployment, respectively.

At other points during the day Cloudera execs talked about the time and money the company has to invest to help clients move from proof-of-concept projects to broad and fruitful production use. And it also discussed how it’s now employing extensive automated testing to ensure the quality of its software distribution, which now includes more than 25 open source components.

In short, “fast” and “easy” are not terms I would associate with Hadoop. But “proven,” “value driving” and even “industry standard” work for me and for the many companies that now rely on the platform.



Real Time IoT Sensing requires Real Time Responsive Apps, and only now are these arriving in the market

There is a distinct feeling that technology capabilities to monitor and capture ‘real time’ data using an ever increasing range of low-cost sensors are getting ahead of the availability of Apps and Services that can provide ‘real time’ optimized responses. The Industrial Technology and automation sector has a long history of developing machine-to-machine responsive systems, but for an IT sector based on historic transaction applications it’s a big paradigm (apologies) shift.

Service Engineer management, including Preventative Maintenance, is widely regarded as having excellent potential for substantial improvements in operating efficiency and direct cost saving. Achieving these goals requires more than capturing real time data; it requires an App that uses this data to make real time optimized responses.

Currently, whether or not cloud based and mobile accessed, activities are planned in abstraction from reality on the basis of ‘historic’ data in ‘traditional’ IT applications. The justification for the adoption of IoT is to interact with ‘reality’, using a flow of real-time sensed data to drive a new generation of dynamically optimized ‘read and respond’ Apps.

Using IoT-driven Service Maintenance changes activities from being planned on the basis of history, or responding to equipment failure, into being actively optimized responses to the reality of the present, often with proactivity towards developing situations.

But this cannot be achieved just by adding IoT sensing to the current traditional Service Maintenance Applications, which were never built to include this kind of functionality. Admittedly, better data input added to the overall data available can improve performance, but the answers will always come via historic reports, not ‘real time’ optimizations. (‘Real time’ is a difficult term to define, as in practice latency means nothing is truly real time, but in the context of IoT it means reacting to data flows, not processing historic data.)

IoT-driven, Cloud-based Apps such as Uber, the real time responsive taxi cab service, show how a new generation of Apps can provide real time optimization from IoT data inputs. These near real time read-and-respond Apps are usually dubbed ‘Smart Services’, to distinguish them from the current generation of Apps that may connect via the Internet and use Cloud services, but lack real time optimization to IoT data flows.

The costs and inefficiencies associated with equipment failure are an issue across all Industry sectors so, not surprisingly, Service Management, with Preventative Maintenance, has been an immediate target for applying IoT. It’s not only break-fix notifications for equipment failure, but also the ability to use complex event processing to predict that an imminent failure might occur.

Predictive Maintenance has always been the goal of any Service Management, but to date the only possibility has been using historic failure records for guidance. Due to the time and costs of detailed record keeping and the need for a long-term period of analysis, this has only been possible for selected high-value equipment. IoT sensing now makes it possible to provide accurate ‘real time’ data warnings across many items at low cost.

The benefits of Preventative Maintenance range from the less expensive fix of a simple wearing part, avoiding more widespread damage to adjacent parts if left unattended, up to avoiding catastrophic failure that would require a complete replacement unit. It is also important to be able to choose the time to carry out service work: Retailing, as an example, would prefer service work to be carried out outside trading hours, whereas Manufacturing processes need to choose when to make planned shutdowns.

IoT sensing is one half of the game change in Service Management, but it’s the complex event processing capability to make use of the real time data flows that makes preventative maintenance possible. IoT sensing brings the new and unique capability to use real time data flows to establish relationships, and provide outcomes, that would not previously have been possible. (See previous blog: event hubs or engines add react-capability analytics to read real time IoT data.)

As an example, IoT Complex Event Processing would interpret rising temperature, energy consumption, and vibration readings from individual sensors as advance warning of a potential bearing failure in a rotating part. Reading the real situation will always be more accurate than even the best of historic, time-based operations.
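That kind of multi-signal rule is straightforward to express in code. Below is a minimal sketch of the idea, with made-up sensor names, readings and a tiny window; a production complex event processing engine would of course evaluate such rules continuously over streaming windows at scale:

```python
# Minimal complex-event-processing sketch: flag a potential bearing failure
# when temperature, energy consumption and vibration all trend upward over
# a sliding window. Sensor keys and readings are illustrative assumptions.

def rising(readings):
    """True if every reading in the window is higher than the one before."""
    return all(b > a for a, b in zip(readings, readings[1:]))

def bearing_warning(window):
    """window: list of dicts with 'temp', 'energy' and 'vibration' keys.
    Warn only when all three signals rise together - one noisy sensor
    alone should not trigger a maintenance call-out."""
    return all(rising([r[key] for r in window])
               for key in ("temp", "energy", "vibration"))

window = [
    {"temp": 61.0, "energy": 4.1, "vibration": 0.8},
    {"temp": 63.5, "energy": 4.4, "vibration": 1.1},
    {"temp": 67.2, "energy": 4.9, "vibration": 1.6},
]
print(bearing_warning(window))  # all three signals rising together
```

The point of correlating several signals is exactly the one made above: any single sensor reading is ambiguous, but the combined pattern is a much stronger early warning than a historic, time-based schedule.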

An often-repeated industry story that illustrates the limitations of break-fix actions combined with existing preventative maintenance routines tells of an unexpected breakdown in a heat pump being repaired with new parts and a thorough overhaul service. Seven weeks later the annual time-planned preventative maintenance service fell due, and a different engineer dismantled the heat pump once again, replacing the nominated ‘wearing’ parts in accordance with the instructions for an annual service.

Clearly ‘real time’ Smart Services using IoT data bring obvious benefits, but a Service Management and Preventative Maintenance package should address deeper operational aspects as well. It’s not just the equipment that benefits from real time dynamics; in this complicated working environment, engineering response and activity planning need the same dynamic approach.

If an Engineer is on a site for one task, and another event occurs on the same site, then automatic re-planning of the service engineer’s day should occur. In turn this should lead to wider re-planning of the rest of the field engineering teams’ activities for the day to ensure cohesive coverage.
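The dispatch logic described here can be sketched very simply: prefer an engineer already on the affected site, otherwise pick the nearest available one. The names, sites and flat distance model below are illustrative assumptions, not any vendor’s scheduling engine:

```python
# Sketch of event-driven re-planning: when a new service event arrives,
# prefer an engineer already on that site (no travel time), otherwise
# dispatch the nearest available engineer. Data is purely illustrative.

def assign_engineer(event_site, engineers):
    """engineers: list of dicts with 'name', 'site' (current location)
    and 'distance_to' (travel distance to other sites)."""
    on_site = [e for e in engineers if e["site"] == event_site]
    if on_site:
        return on_site[0]["name"]  # already there: fold the new task in
    return min(engineers, key=lambda e: e["distance_to"][event_site])["name"]

engineers = [
    {"name": "Ana", "site": "plant-A", "distance_to": {"plant-B": 40}},
    {"name": "Ben", "site": "depot",   "distance_to": {"plant-B": 15}},
]
print(assign_engineer("plant-B", engineers))
```

A real system would then re-plan the rest of the day for the whole team around that assignment, folding in repair-time overruns and traffic, which is exactly the integrated real time approach argued for below.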

Unexpectedly long repair times, traffic impacts on travel time, relationships to Customer Service satisfaction all require a fully integrated real time approach in a new generation of Service Management and Preventative Maintenance Services and Apps.

The commercial impact and importance of this market has not gone unnoticed by technology industry vendors, or equipment manufacturers, resulting in a wide range of announcements. See links to vendors teaming IoT sensing with Service Management below. However, whilst all provide the core new capabilities outlined above, there are noticeable differences in the ability to team the real time react events with an equally dynamic, real time reactive Service Management environment.

As the shift to IoT sensor-based Preventative Maintenance takes place and an increasing amount of Service Management activity is driven by real time event responses, the necessity for a similar shift to real time Service Management operations becomes clear. Service Management professionals are facing a transformation to Internet ‘reality’ operations similar to the one marketing professionals faced with the adoption of Social Tools.

Salesforce focused on this aspect in their announcement last week (15th March) of Field Service Lightning, referring to it as 360-degree operations; the extent to which this is available in other vendors’ offerings is less clear.

http://www.salesforce.com/service-cloud/features/field-service-lightning/

http://www.sap.com/pc/tech/internet-of-things/software/predictive-maintenance/index.html

http://www.ibm.com/internet-of-things/asset-management.html

https://blogs.microsoft.com/iot/2015/12/01/azure-iot-suite-predictive-maintenance-now-available/

https://www.bosch-si.com/solutions/manufacturing/predictive-maintenance/increase-machine-uptime.html

http://www.softwareag.com/us/solutions/manufacturing/iot/overview/default.asp


Cloudera Progress Report and Analyst Day

We had the opportunity to attend Cloudera's Analyst Days in San Francisco this week, held at the beautiful Ritz-Carlton hotel. The vendor's third analyst meeting saw record attendance, with almost 60 analysts making the trip.
 
So take a look at this short video for my Top 3 takeaways:

 

No time to watch -  read on:

Company Growth while preserving culture - The vendor is doing well in all regards: significant revenue, paying customer and partner growth. Talent acquisition challenges are being addressed with a new Budapest development location. Good to see the attention to keeping the culture intact - at 1,100 employees not an easy task.

Cloud & Cybersecurity - Always a treat to listen to Mike Olson; his main point was that Hadoop is now here for real, and moving back to where it started - into the cloud. His other strategic outlook was on cybersecurity, where Hadoop clearly has a key technology-enabler role.

Quality & Support - This was an area Cloudera said a year ago it would focus further on and it was good to see that Cloudera came back, reported on progress, and showed some good results. 

 

MyPOV

Good to see the overall progress at Cloudera. The vendor is doing well, it has a plan to take advantage of the 'real' Hadoop adoption wave, and it looks like it is handling product growth well. It's great to have Intel as an investor, but also as a partner sharing future hardware designs, so Cloudera can take advantage of them in its products. Unfortunately most of the product plans were under NDA, likely to be unwrapped in a few months, so I can't comment on these.

On the concern side, Cloudera needs to tie the directions together. The vendor went to great lengths to make clear it understands and cares about the vertical needs of Hadoop adoption, but those needs have to find their way into specific future product capabilities. And I am looking forward to seeing the first 'hard' feature uptake from Intel in Cloudera software.

Overall Cloudera is in a very good position to keep taking advantage of the rise of Hadoop as the critical data engine for enterprises. 2016 will be (another) key year - stay tuned, we will be watching. 

More on BigData / Hadoop / noSQL:

 
  • News Analysis - SAP HANA Vora now available... - A key milestone for SAP - read here
  • Progress Report - Hortonworks wants to become the next generation for the enterprise – a tall ask - read here
  • News Analysis - SAP Unveils New Cloud Platform Services and In-Memory Innovation on Hadoop to Accelerate Digital Transformation – A key milestone for SAP - read here
  • News Analysis - SAP delivers next release of SAP HANA - SPS 10 - Ready for BigData and IoT - read here
  • News Analysis - Salesforce Transforms Big Data Into Customer Success with the Salesforce Analytics Cloud - read here
  • Progress Report - Teradata is alive and kicking and shows some good 'paranoid' practices - read here
  • Event Report – Couchbase Connect – Couchbase’s shows momentum - read here
  • News Analysis - Couchbase unveils N1QL and updates the NoSQL Performance Wars - read here
  • Event Report - MongoDB keeps up the momentum in product and go to market - read here
  • News Analysis - Pivotal pivots to OpenSource and Hortonworks - Or: OpenSource keeps winning - read here
  • Progress Report - Cloudera is all in with Hadoop - now off to verticals - read here
  • Market Move - Oracle buys Datalogix - moves into DaaS - read here
  • Event Report - MongoDB is a showcase for the power of Open Source in the enterprise - read here
  • Musings - A manifesto: What are 'true' analytics? Read here
  • Future of Work - One Spreadsheet at the time - Informatica Springbok - read here
  • Musings - The Era of the no-design Database - Read here
  • Mendix - the other path to build software - Read here
  • Musings - Time to ditch your datawarehouse .... - Read here

Find more coverage on the Constellation Research website here and check out my magazine on Flipboard and my YouTube channel here

 

SAP HANA Vora now available... - A key milestone for SAP


Last week SAP used the largest software and IT fair – CeBIT – to make news in a number of areas; the one of interest here is HANA Vora, announced last September (we attended the launch event, blog post here) and now GA in its first release.

 
 
 
So let’s take apart the press release (it can be found here) in our customary style:
 
HANNOVER — SAP SE (NYSE: SAP) today announced general availability of SAP HANA Vora, an in-memory query engine that brings powerful contextual analytics across all data stored in Hadoop, enterprise systems and other distributed data sources.
MyPOV – Good summary, and the key step forward for SAP – as mentioned many times before, ‘Hadoop’ used to be a ‘bad’ word around SAP, which for the longest time was on the ‘in memory only’ track. We noticed the change at Sapphire 2015, and product certainty was created with HANA Vora in September last year.

To facilitate distributed data processing across enterprise and Hadoop data, SAP has contributed part of the code for SAP HANA Vora to the Apache Spark open source ecosystem.
MyPOV – Good to see SAP working with more open source in general; Vora may be the largest contribution to open source that SAP has made so far. It certainly has the largest impact on SAP, as practically all next generation application use cases that enterprises are looking into comprise BigData stored in Hadoop clusters. Before Vora, SAP could not address this data directly, so Vora is key for SAP to keep building 21st century applications. It is even more important for Hana Cloud Platform (HCP), SAP’s PaaS tool, which otherwise would not have been a competitive offering for standalone projects, a market that SAP wants to be in and is effectively in.
SAP also announced that CenterPoint Energy Houston Electric (CenterPoint Energy) will implement the SAP HANA platform and SAP HANA Vora to bring together its highly distributed enterprise data framework. While Hadoop will allow CenterPoint Energy to reduce information technology costs associated with increasing Big Data storage requirements, SAP HANA Vora will allow for more informed business decisions through powerful data analytics. […]
MyPOV – Always good to see a customer on a press release, using the new announced capabilities – and good to see cost savings associated with Vora (no surprise, as HDD and SSD are cheaper than RAM).
CenterPoint Energy to Innovate for Customers
Delivering power to more than 2.3 million consumers, CenterPoint Energy collects electronic meter data every 15 minutes for energy usage reporting, which leads to substantial data storage costs. Within six weeks, SAP and CenterPoint Energy architected a testing environment that processed over 5 billion records of data with Hadoop, SAP HANA and SAP HANA Vora. As a result of its successful test deployment, CenterPoint Energy will implement and standardize on the SAP HANA platform and SAP HANA Vora.
MyPOV – Great to see the use case, and very clear (as we have blogged and stated many times) that in-memory (so HANA) cannot be the all-encompassing solution for IoT scenarios.
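A quick back-of-the-envelope calculation, assuming one meter reading per consumer every 15 minutes, shows why the release’s figures point away from an in-memory-only approach:

```python
# Rough arithmetic behind the CenterPoint Energy figures in the release:
# 2.3 million meters read every 15 minutes pile up records at a rate no
# in-memory-only platform can economically retain. One-meter-per-consumer
# is an assumption for the estimate.

meters = 2_300_000
readings_per_day = 24 * 60 // 15          # one reading per meter each 15 min
records_per_day = meters * readings_per_day
print(records_per_day)                    # roughly 221 million records a day

test_set = 5_000_000_000                  # records processed in the test
print(round(test_set / records_per_day))  # days of data that represents
```

So the 5 billion-record test corresponds to only a few weeks of meter data, which is exactly why the cold bulk belongs in Hadoop with Vora querying it, and only the hot slice in HANA.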
“Our initial analysis proved that SAP HANA paired with SAP HANA Vora is the right solution for us moving forward operationally, while allowing for innovation around our Internet of Things and predictive analytics initiatives,” said Gary Hayes, CIO and SVP of CenterPoint Energy. “With the help of SAP, we are transforming to a ‘live’ digital enterprise to better serve customers.”
MyPOV – Good quote from the CIO, Hayes, hitting the right points here – the combination of ERP data in HANA with IoT and Analytics capabilities, that otherwise would not have been easily integrated and accessible from SAP, with SAP tools. 
 
Digitizing Businesses with SAP HANA Vora 
[…] “As organizations begin their journey toward becoming smarter digital enterprises, the natural starting point and enabler is their core in-memory technology platform,” said Greg McStravick, general manager and global head of Platform GTM, SAP. “With SAP HANA and SAP HANA Vora, customers can turn massive amounts of Big Data into business context. We are pleased to work with companies like CenterPoint Energy who value the customer service enhancements that a single, end-to-end digital enterprise platform linking corporate data, social sentiment and other data such as weather patterns can provide.”
MyPOV – Good quote from McStravick – but while I understand the perspective, he has it wrong: it is not the Hadoop-based BigData that gives context to the business data – it is the business data that is the context for the (gravitational) BigData. SAP needs to get that perspective right soon, so it can create value for its customers with the right solutions.
SAP HANA Vora leverages and extends the Apache Spark execution framework to provide enriched interactive analytics on Hadoop. The core foundation of SAP HANA is complemented by SAP HANA Vora, which is designed to add insight across large volumes of operational and contextual data taken from enterprise applications, data warehouses, data lakes and edge Internet of Things sensors.
MyPOV – Good description of what HANA Vora does – the ‘devouring’ of massive volumes of data residing in Hadoop that in-memory HANA could never hold. Keeping Vora on Spark keeps the HANA-to-Vora connection in memory and thus on ‘even footing’. A good approach to keep the ‘speed’ argument going, but in most use cases we expect Vora to query data in Hadoop that is not RAM-based.
 
SAP HANA Vora aims to solve key Big Data challenges by providing:
Data correlation for making precise contextual decisions — Enables mashup of operational business data with external unstructured data sources for more powerful analytics
MyPOV – Very powerful and very important – but it’s the business data that is the context – not the unstructured data.
 
Simplified management of Big Data — Allows data to be processed locally on a Hadoop cluster, removing any data ownership and integration challenges
MyPOV – Indeed, much easier to keep e.g. IoT data in Hadoop instead of cycling it into memory via federation tools from Sybase. 
 
Online analytical processing (OLAP) modeling capabilities on Hadoop data— Makes real-time drill-down analysis possible on large volumes of Hadoop data distributed across thousands of nodes
MyPOV – Very powerful indeed, but the ‘drill up’ is equally important – finding the business context for data stored in Hadoop clusters… even at the single-occurrence level.
SAP HANA Vora is targeted at benefiting customers in various industries where highly interactive Big Data analytics in a business process context is paramount, such as financial services, telecommunications, utilities, healthcare and manufacturing. SAP has an established partner ecosystem, including Cloudera, Databricks, Hortonworks and MapR Technologies, that plans to support SAP HANA Vora. Read what SAP partners have to say at “Partner Quotes: SAP HANA Vora Now Available to Bring Contextual Analytics Across All Enterprise and Big Data Systems.”
MyPOV – Good to see this as an ecosystem play and good for SAP to have all key Hadoop and Spark players on board. 

 
Supporting the Apache Spark Community 
SAP has recently open-sourced new features to the Apache Spark ecosystem, one of the most active open source communities. These features include a data hierarchy capability that enables drill-down analysis on Hadoop data, and an extension to Spark’s data source application program interface (API) that improves distributed query efficiency from Spark to SAP HANA. These open source offerings are now available as a GitHub project. SAP plans to strengthen its commitment to the developer community by continuing to make more open source contributions in the future.
MyPOV – Good to see SAP using more open source, but also supporting open source with contributions. It will be interesting to see if any other enterprise software vendors start contributing to Vora, or if this will remain an SAP-only contribution.

Overall MyPOV

It is always good to see software vendors deliver, especially when it is a strategic piece of software that basically allows the vendor to survive for decades to come. Some people may think this is exaggerated, but they should keep in mind that SAP had no Hadoop story 12 months ago. The question now is whether Vora is the right and the full story – but it is the start of a new book for SAP, and we are in the first chapter.

On the concern side, as elaborated above, SAP will need to make sure the perspective does not come from the traditional (HANA-based) ERP application, but from Hadoop-based Big Data. That is where enterprises are building their next-generation applications – in many use cases out of sheer necessity, because all other storage mechanisms and media are not cost effective or, even if cost played no role, not feasible. The sooner SAP understands this, the better.

But for now, it is a good day for SAP customers, as their vendor has taken a major step to future-proof its offerings and to remain a key player in enterprise software going forward. Tuning is always part of an offering, and I am sure SAP will sooner rather than later get the whole story right.




 
More on SAP:
 
  • Event Report - SAP Ariba Live - Make Procurement Cool Again - read here
  • News Analysis - SAP SuccessFactors innovates in Performance Management with continuous feedback powered by 1 to 1s  - read here
  • Event Report - SAP SuccessFactors SuccessConnect - Good Progress sprinkled with innovative ideas and challenging the status quo - read here
  • News Analysis - WorkForce Software Announces Global Reseller Agreement with SAP - read here
  • First Take - SAP SuccessFactors SuccessConnect - Day #1 Keynote Top 3 Takeaways - read here
  • News Analysis - SAP SuccessFactors introduces Next Generation of HCM software - read here
  • News Analysis - SAP delivers next release of SAP HANA - SPS 10 - Ready for BigData and IoT - read here
  • Event Report - SAP Sapphire - Top 3 Positives and Concerns - read here
  • First Take - Bernd Leukert and Steve Singh Day #2 Keynote - read here
  • News Analysis - SAP and IBM join forces ... read here
  • First Take - SAP Sapphire Bill McDermott Day #1 Keynote - read here
  • In Depth - S/4HANA qualities as presented by Plattner - play for play - read here
  • First Take - SAP Cloud for Planning - the next spreadsheet killer is off to a good start - read here
  • Progress Report - SAP HCM makes progress and consolidates - a lot of moving parts - read here
  • First Take - SAP launches S/4HANA - The good, the challenge and the concern - read here
  • First Take - SAP's IoT strategy becomes clearer - read here
  • SAP appoints a CTO - some musings - read here
  • Event Report - SAP's SAPtd - (Finally) more talk on PaaS, good progress and aligning with IBM and Oracle - read here
  • News Analysis - SAP and IBM partner for cloud success - good news - read here
  • Market Move - SAP strikes again - this time it is Concur and the spend into spend management - read here
  • Event Report - SAP SuccessFactors picks up speed - but there remains work to be done - read here
  • First Take - SAP SuccessFactors SuccessConnect - Top 3 Takeaways Day 1 Keynote - read here.
  • Event Report - Sapphire - SAP finds its (unique) path to cloud - read here
  • What I would like SAP to address this Sapphire - read here
  • News Analysis - SAP becomes more about applications - again - read here
  • Market Move - SAP acquires Fieldglass - off to the contingent workforce - early move or reaction? Read here.
  • SAP's startup program keeps rolling – read here.
  • Why SAP acquired KXEN? Getting serious about Analytics – read here.
  • SAP streamlines organization further – the Danes are leaving – read here.
  • Reading between the lines… SAP Q2 Earnings – cloudy with potential structural changes – read here.
  • SAP wants to be a technology company, really – read here
  • Why SAP acquired hybris software – read here.
  • SAP gets serious about the cloud – organizationally – read here.
  • Taking stock – what SAP answered and what it didn’t answer this Sapphire [2013] – read here.
  • Act III & Final Day – A tale of two conferences – Sapphire & SuiteWorld13 – read here.
  • The middle day – 2 keynotes and press releases – Sapphire & SuiteWorld – read here.
  • A tale of 2 keynotes and press releases – Sapphire & SuiteWorld – read here.
  • What I would like SAP to address this Sapphire – read here.
  • Why 3rd party maintenance is key to SAP’s and Oracle’s success – read here.
  • Why SAP acquired Camillion – read here.
  • Why SAP acquired SmartOps – read here.
  • Next in your mall – SAP and Oracle? Read here
 
 
And more about SAP technology:
 
  • Event Preview - SAP TechEd 2015 - read here
  • News Analysis - SAP Unveils New Cloud Platform Services and In-Memory Innovation on Hadoop to Accelerate Digital Transformation – A key milestone for SAP read here
  • HANA Cloud Platform - Revisited - Improvements ahead and turning into a real PaaS - read here
  • News Analysis - SAP commits to CloudFoundry and OpenSource - key steps - but what is the direction? - Read here.
  • News Analysis - SAP moves Ariba Spend Visibility to HANA - Interesting first step in a long journey - read here
  • Launch Report - When BW 7.4 meets HANA it is like 2 + 2 = 5 - but is 5 enough - read here
  • Event Report - BI 2014 and HANA 2014 takeaways - it is all about HANA and Lumira - but is that enough? Read here.
  • News Analysis – SAP slices and dices into more Cloud, and of course more HANA – read here.
  • SAP gets serious about open source and courts developers – about time – read here.
  • My top 3 takeaways from the SAP TechEd keynote – read here.
  • SAP discovers elasticity for HANA – kind of – read here.
  • Can HANA Cloud be elastic? Tough – read here.
  • SAP’s Cloud plans get more cloudy – read here.
  • HANA Enterprise Cloud helps SAP discover the cloud (benefits) – read here.
 

Find more coverage on the Constellation Research website here and check out my magazine on Flipboard and my YouTube channel here

How Do We Do More? CityTalks Inspires Questions


 

Sitting in the audience of the City of Sydney’s #SydCityTalk event featuring human rights advocate and former President of Ireland, Mary Robinson, it was clear that she was preaching to the choir. The message of “people first” deeply resonated with the audience and spread out like a shock wave from the stalls back. It wasn’t that we haven’t heard discussions about the importance of human-centred policy and action before – it’s just apparent that this style of conversation has been missing from our public discourse for some time.

After all, we live in an age where our sense of humanity has taken a backseat on our roadtrip to the future, and we’ve packed off the difficult issues like climate change, asylum seekers and refugees to live with the relatives.

So hearing a discussion of how governments, business and citizens can work together seems strangely foreign and wildly exciting.

Mary Robinson packed plenty into a short presentation – sustainable development goals, global focus, Nelson Mandela, Richard Branson and Bill Gates and global recognition for the programs and actions of the City of Sydney. Be sure to watch her speech in the video below.

Debunking Trickle Down Economics

One of the most interesting talks of the evening was Richard Denniss, Chief Economist from The Australia Institute. Not only was he able to make economics sound interesting and entertaining, he was able to do so in a way that illustrated his main point – that trickle down economics does not work. While we have seen this for ourselves in the widening gap between rich and poor – and the accelerating distance between the poor and the poorest – the raw numbers from the IMF tell an altogether more compelling story.

The research from five IMF economists drew attention to the issue of global inequality, dismissed “trickle-down” economics and urged governments to target policies toward the bottom 20 percent of their citizens.

The problem with inequality is that it actually cripples growth. If we invest in the top 20% of our population, then GDP declines over the medium term, while a 1% increase in the income share of the poorest 20% of the population results in a 0.38% increase in GDP.

Where to from here?

Each of the speakers told a compelling and vital story. But the facts and figures from Richard Denniss’ speech coupled with Mary Robinson’s urgent insistence on change made me wonder. In fact, it made many of us in the audience wonder – where do we go from here? The levers of change are being applied to the UN’s sustainable development goals – and Australia is a willing signatory. But there is a yawning gulf between intention and policy, signature and action. Where do we go from here? How do we take these good intentions and make change happen? And precisely who is this WE?

I would dearly love to hear an update on progress at the next City Talks event.

Perhaps it is too soon to expect change to take place – or maybe – just maybe, we need more impatience in the mix of government, business and citizen policy making.

You can watch the full replay of the event below.


The last thing privacy needs is new laws


World Wide Web inventor Sir Tim Berners-Lee has given a speech in London, re-affirming the importance of privacy, but unfortunately he has muddied the waters by casting aspersions on privacy law. Berners-Lee makes a technologist's error, calling for unworkable new privacy mechanisms where none in fact are warranted.

The Telegraph reports Berners-Lee as saying "Some people say privacy is dead – get over it. I don't agree with that. The idea that privacy is dead is hopeless and sad." He highlighted that peoples' participation in potentially beneficial programs like e-health is hampered by a lack of trust, and a sense that spying online is constant.

Of course he's right about that. Yet he seems to underestimate the data privacy protections we already have. Instead he envisions "a world in which I have control of my data... I can sell it to you and we can negotiate a price, but more importantly I will have legal ownership of all the data about me" he said according to The Telegraph.

It's a classic case of being careful what you ask for, in case you get it. What would control over "all data about you" look like? These days, most of the data about us - that is, personal data aka Personally Identifiable Information or PII - is collected or created behind our backs, by increasingly sophisticated algorithms. On the one hand, I agree wholeheartedly that people deserve to know more about these opaque processes, and we need better notice and consent mechanisms, but on the other hand, I don't see that data ownership can possibly fix the privacy problem.

What could "ownership" of data even mean? If personal information has been gathered by a business process, or created by clever proprietary algorithms, we get into obvious debates over intellectual property. Look at medical records: in Australia and I suspect elsewhere, it is understood that doctors legally own the medical records about a patient, but that patients have rights to access the contents. The interpretation of medical tests is regarded as the intellectual property of the healthcare professional.

The philosophical and legal quandaries are many. With data that is only potentially identifiable, at what point would ownership flip from the creator of the data to the individual to whom it applies? What if data applies to more than one person, as in household electricity records, or, more seriously, DNA?  Who owns that? 

The outcome we probably all seek is less exploitation of people through data about them. Privacy (or, strictly speaking, data protection) is fundamentally about restraint. When an organisation knows you, it should be restrained in what it can do with that knowledge, and not use it against your interests. Organisations should show self-restraint, and where that fails, there should be legal limits to what can be done with personal data. And thus, over 130 countries now have legislation which requires that organisations only collect the personal data they really need for stated purposes, that personal data collected for one reason not be re-purposed for others, that people are made reasonably aware of what's going on with their personal data, and so on.

Berners-Lee alluded to the privacy threats of Big Data, and he's absolutely right. But I point out that existing privacy law can substantially deal with Big Data. It's not necessary to make new and novel laws about data ownership. When an algorithm works out something about you, such as your risk of developing diabetes, without you having to fill out a questionnaire, then that process has collected personal data, albeit indirectly. Technology-neutral privacy laws don't care about the method of collection or creation of personal data. Synthetic personal data, collected as it were algorithmically, is treated by the law in the same way as data gathered overtly. An example of this principle is found in the successful European legal action against Facebook for automatic tag suggestions, in which biometric facial recognition algorithms identify people in photos without consent.

Technologists often under-estimate the powers of existing broadly framed privacy laws, doubtless because technology neutrality is not their regular stance. It is perhaps surprising, yet gratifying, that conventional privacy laws treat new technologies like Big Data and the Internet of Things simply as potential new sources of personal data. If brand new algorithms give businesses the power to read the minds of shoppers or social network users, then those businesses are restrained in law as to what they can do with that information, just as if they had collected it in person. Which is surely what regular people expect of privacy laws. 


The Industrial Internet, Accelerator in a Box and Retail Disruption on #DisrupTV


Each week, Vala Afshar and R “Ray” Wang host the web series DisrupTV, a 30-minute deep dive into the world of digital transformation featuring the people and organizations that are leading that change.

This week’s episode featured GE’s Chief Digital Officer, Ganesh Bell, Constellation Research Principal Analyst, Guy Courtin and myself.

Setting a cracking pace, GE has become the poster child for the world of digital transformation, coining the term “industrial internet”, establishing startups in Silicon Valley and setting a vision to be a top 10 software company by 2020. In the episode, Ganesh talks about the challenges of transformation – of moving from an industrial company to a digital company and what it takes. It’s well worth watching the replay to learn more about the tangible impact of digital transformation that GE is making, not just within its business but well beyond it.

Joining Ray and Vala, about 25 minutes in, I shared some insight into the world of enterprise innovation in Australia:

Guy Courtin joined around 45 minutes in and brought amazing insight into the changing world of retail. From showrooming to the internet of things, he covered a vast terrain of disruption and opportunity, suggesting that bricks and mortar stores still have plenty of advantages over their digital only counterparts, and explaining that to be truly transformative, we need to stop thinking about “e” commerce and connect the dots around the customer’s commercial experience.

While the show ran for just over an hour, it’s jam packed with insight and energy. And DisrupTV is fast becoming an authoritative, must watch series for all those who are serious about the business of disruption and transformation in business. Check out recordings of past episodes here. And watch this week’s episode replay from Blab below.


SAP Ariba Live - Making Procurement Cool Again


We had the chance to attend SAP Ariba's user conference Ariba Live in Las Vegas, together with colleagues Chris Kanaracus, Guy Courtin and Ray Wang. The conference was well attended, drawing over 2,500 people. 
 
Guy and I recorded a short video - take a look:
 
 
No chance to watch? Read on:
 
You can find Guy's Supply Chain and Procurement takeaways here. My next generation Applications takeaways are as follows:
 
Open APIs - As is common these days, SAP Ariba will publish APIs, starting with five areas, with hierarchies and approvals the most prominent ones. Kudos to the vendor for working from a roadmap going forward, so customers and prospects can plan their uptake of these APIs.
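SAP Ariba had not yet published endpoint details at the time of writing, so here is a purely hypothetical sketch of what a client call against such an approvals API could look like. The base URL, path, query parameters and API-key header are all illustrative assumptions, not documented values:

```python
# Hypothetical sketch of building a request against a REST approvals
# API of the kind SAP Ariba announced. All names below (endpoint path,
# "realm"/"status" parameters, "apiKey" header) are assumptions.
from urllib.parse import urlencode
from urllib.request import Request

def build_approvals_request(base_url, realm, api_key, status="pending"):
    """Build an authenticated GET request listing approval tasks."""
    query = urlencode({"realm": realm, "status": status})
    url = f"{base_url}/api/approvals/v1/tasks?{query}"
    return Request(url, headers={"apiKey": api_key,
                                 "Accept": "application/json"})

req = build_approvals_request("https://openapi.example.com",
                              "acme-corp", "SECRET")
print(req.full_url)
# https://openapi.example.com/api/approvals/v1/tasks?realm=acme-corp&status=pending
```

The point of a published roadmap is exactly that customers can build thin client wrappers like this ahead of time and swap in the real endpoints once documented.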
 
End User Enablement - We have been writing about end user enablement for a while, and it is a key strategy for vendors, as it achieves a number of benefits: First, it enables users with reasonable technology savviness to build their own applications. Second, it helps enterprises become more agile and accelerate, critical for their future success. And lastly, it protects vendors from being disrupted by new market entrants that simply use their APIs with a more attractive user interface. Good to see Ariba enabling a lightweight end user PaaS, allowing users to create forms and deploy them not only to the web, but also to tablets and mobile devices. 
 
Platform Innovation - Ariba had one of the earliest internet-scale, some may say cloud, platforms, and as such it shows its age. While not much was happening on the platform side two and more years ago, it's good to see that this has changed. SAP Ariba is actively using Hadoop, exploiting microservices and using popular frameworks like AngularJS. And of course HANA is increasingly making its way through the product. The use case should also be interesting for HANA Vora, which recently went GA (see below for the news analysis from when it was announced). 
 

MyPOV

Good to see traction on the platform side at SAP Ariba. It looks like the division has found new speed and dynamism, which attending customers noticed. Procurement is a huge opportunity for SAP and its customers, and the vendor looks well placed to get into a very good position in the years to come.
 
On the concern side, SAP Ariba needs to execute on the new vision and roadmap. Networked applications of the scale that SAP Ariba needs to build are not trivial, even with today's advances on the cloud side. Operating this on internal data centers is a valid strategy, but it could also become a concern as the capability and TCO of the popular cloud-based IaaS platforms become more and more competitive. 
 
But overall good to see the progress at SAP Ariba - we will be watching, stay tuned. 
 

Marketers as Innovators – Join the #DisrupTV Live Stream


This weekend – at 5am Australian daylight time – I will be joining the hosts of DisrupTV, R “Ray” Wang and Vala Afshar, to talk marketing-led innovation and provide a snapshot of the Australian innovation landscape. This weekly web series is streamed live on Blab.im, focuses on leadership, innovation and disruption in the enterprise, and brings together A-list guests, the latest enterprise news, hot startups, insight from influencers, and much more. And when I say “A-list guests”, I’m not talking about celebrities. I’m talking about business and technology leaders who are changing the way that we do, think about and create value in business.

The show has featured:

The discussion with Alex Osterwalder is eye opening and full of insight for those seeking to change the way businesses organise themselves, create value and operate in the world. It’s well worth tuning in (embedded below).

This week’s interview features GE’s Chief Digital Officer, Ganesh Bell. He leads digital innovation and transformation, and is responsible for the digital solutions business and digital engagement to drive business growth. I will be discussing the nature of corporate innovation, how a market-product fit wins over a product-market fit in the enterprise, and will touch on some of the initiatives arising from the Australian Government’s #IdeasBoom. We’ll also be joined in the “Influencer’s Corner” by Guy Courtin, VP and Principal Analyst at Constellation Research.

Be sure to tune in at 11 a.m. PT/ 2 p.m. ET and remember to tweet your questions using the #DisrupTV hashtag.
