Results

Event Preview - Microsoft Build 2016 - Top 3 Things to watch for developers, managers and execs...


Microsoft's Developer Conference Build is starting tomorrow in San Francisco, and will run through the rest of the week. Well worth a little event preview...

 
 
 
So take a look at what I think are the Top 3 things to watch for the key communities attending: Developers, Managers and Executives:
 
 
No time to watch? Well, here are all the key areas to watch in one slide:
 
Holger Mueller Microsoft Build 2016 Developers, Managers, Executives

Check out my Constellation Research colleague Alan Lepofsky's preview of Build (here), too!


More about Microsoft:
  • News Analysis - Microsoft - New Hybrid Offerings Deliver Bottomless Capacity for Today's Data Explosion - read here
  • News Analysis - Welcoming the Xamarin team to Microsoft - read here
  • News Analysis - Microsoft announcements at Convergence Barcelona - Office 365, Dynamics CRM and Power Apps 
  • News Analysis - Microsoft expands Azure Data Lake to unleash big data productivity - Good move - time to catch up - read here
  • News Analysis - Microsoft and Salesforce Strengthen Strategic Partnership at Dreamforce 2015 - Good for joint customers - read here
  • News Analysis - NetSuite announced Cloud Alliance with Microsoft - read here
  • Event Report - Microsoft Build - Microsoft really wants to make developers' lives easier - read here
  • First Hand with Microsoft Hololens - read here
  • Event Report - Microsoft TechEd - Top 3 Enterprise takeaways - read here
  • First Take - Microsoft discovers data ambience and delivers an organic approach to in memory database - read here
  • Event Report - Microsoft Build - Azure grows and blossoms - enough for enterprises (yet)? Read here.
  • Event Report - Microsoft Build Day 1 Keynote - Top Enterprise Takeaways - read here.
  • Microsoft gets even more serious about devices - acquire Nokia - read here.
  • Microsoft does not need one new CEO - but six - read here.
  • Microsoft makes the cloud a platform play - Or: Azure and her 7 friends - read here.
  • How the Cloud can make the unlikeliest bedfellows - read here.
  • How hard is multi-channel CRM in 2013? - Read here.
  • How hard is it to install Office 365? Or: The harsh reality of customer support - read here.
Find more coverage on the Constellation Research website here and check out my magazine on Flipboard and my YouTube channel here.

SAP Bets On Cloud For Analytics, BPC Optimized for S/4 HANA


SAP Financials 2016 event highlights a comprehensive cloud platform and real-time analysis. Here’s why SAP Cloud for Analytics and BPC Optimized for S/4HANA Finance are getting attention.

Two next-generation products stood out at the March 15-16 SAP Financials 2016 and GRC 2016 event in Las Vegas: SAP Cloud For Analytics and SAP BPC (Business Planning and Consolidation) Optimized for S/4HANA Finance. There are, in fact, several paths forward in SAP’s vast financial planning and analysis (FP&A) portfolio, but these are the options seeing the most aggressive development and, not surprisingly, the most interest. Here’s why.

Introduced last fall and based on what was originally called SAP Cloud For Planning, SAP Cloud For Analytics is a comprehensive product spanning planning, business intelligence (BI), and, coming later this year, predictive analytics. Long-range plans also call for Governance, Risk and Compliance (GRC) functionality, but that part won’t show up in 2016.

SAP Analytics Strategy

SAP has offered many analytics products over the years, including Lumira and BusinessObjects Web Intelligence (Webi), and you can even throw BPC onto the list. But where those products were aimed at specific types of analysis and deployed mostly on-premises, the goal with Cloud for Analytics is to cover all the bases in a cohesive and consistent product that runs on the SAP HANA Cloud Platform.

A key selling point is broad data connectivity and real-time, in-memory analysis. Connection points include on-premises instances of SAP HANA, SAP BW and bi-directionally to BPC. You can also connect to the (private) HANA Enterprise Cloud, Salesforce and Google Enterprise Apps. Soon to be added will be connections to SAP ECC, BusinessObjects Universes, SAP SaaS apps (Ariba, Concur, Hybris, SuccessFactors, etc.) and live connections with write-back capabilities to on-prem instances of SAP BW.

Though it addresses planning, BI and, soon, predictive analytics, Cloud For Analytics is not a monolithic, all-or-nothing proposition. You can subscribe to just planning or just BI and so on. And there are different subscription levels for different types of users, whether they’re basic report or plan consumers or advanced developers or analysts.

Another emerging constituency for Cloud for Analytics is business unit leaders, CXOs and even board members, all of whom are supported through an optional Digital Boardroom application. The idea is to help executives see where the business stands and where it’s headed, and with access to all that business data plus data visualization and planning capabilities in the cloud, it’s an accessible option.

At SAP Financials 2016 (and also at the recent SXSW event in Austin, TX), SAP offered a Digital Boardroom virtual demo that showed how executives can traverse a Value-Driver view and plug in new planning assumptions for what-if scenario analysis. Plug in a new cost figure for key raw materials, for example, and you can see the impact on costs and margins. Or you could plug in an increase in sales staffing in a fast-growing market to gauge the impact on sales and profitability.
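The value-driver logic behind such a what-if demo can be sketched in a few lines. This is a minimal illustration with entirely hypothetical figures (not SAP data or APIs): change one cost driver and observe the effect on margin.

```python
# Minimal driver-based what-if sketch (hypothetical figures, not SAP data):
# change a raw-material cost driver and observe the impact on margin.

def margin_impact(units, price, material_cost_per_unit, other_cost_per_unit):
    """Return (revenue, total_cost, margin) for one planning scenario."""
    revenue = units * price
    total_cost = units * (material_cost_per_unit + other_cost_per_unit)
    return revenue, total_cost, revenue - total_cost

# Baseline plan
base = margin_impact(units=10_000, price=25.0,
                     material_cost_per_unit=8.0, other_cost_per_unit=12.0)

# What-if: raw material cost rises 15%
scenario = margin_impact(units=10_000, price=25.0,
                         material_cost_per_unit=8.0 * 1.15, other_cost_per_unit=12.0)

print(f"baseline margin: {base[2]:,.0f}")    # 50,000
print(f"scenario margin: {scenario[2]:,.0f}")  # 38,000
```

A real value-driver tree chains many such calculations together, so one changed assumption ripples through costs, margins and downstream plans.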

The Digital Boardroom app will become even more powerful as Cloud for Analytics gains predictive capabilities later this year, so executives can do simulations and get predictive recommendations on best actions. You can do this executive-level planning and analysis on a single screen with a mouse and keyboard, but the Digital Boardroom option also supports a slick, three-screen conference room setup using giant touch-screen displays (hardware not included).

MyPOV: It’s early days for Cloud for Analytics, and the roadmap reveals that there’s plenty of functionality that has yet to be added. On the planning front, it’s up against fast-growing, cloud-based performance management rivals such as Anaplan, Host Analytics and Adaptive Insights. These products are also more mature, offering prebuilt apps in areas such as workforce planning as well as financial consolidation, something not yet on the roadmap for Cloud for Analytics.

On the BI front there are myriad competitors, from cloud-native BI systems like Birst, Domo and GoodData to newish cloud offerings from Qlik and Tableau to emerging as-a-service offerings from Amazon Web Services, IBM and Oracle. The predictive analytics capabilities SAP will bring to Cloud for Analytics from its KXEN acquisition could be a real differentiator, but it’s unclear to me how deep this functionality will go and how soon it will be generally available.

For now I’d say that the breadth of SAP Cloud For Analytics is very promising, but the depth in each area of analysis has yet to be realized. It will clearly be most attractive to SAP customers who want real-time access to and analysis of on-premises data in the cloud.

BPC Optimized for S/4HANA Finance

BPC has been a cornerstone FP&A option for SAP customers for nearly a decade. A key differentiator versus the rival Oracle Hyperion portfolio is that BPC combines both planning and consolidation in a single product. BPC has evolved to include several permutations, including BPC NetWeaver Standard, BPC for Microsoft and BPC Embedded (as in, an embedded component of SAP HANA).

BPC Optimized for S/4HANA Finance is seeing growing interest because it gives the many companies that have implemented BPC a path to HANA in-memory performance without having to reimplement on HANA. BPC gets direct, real-time access to ERP data from an S/4HANA Finance instance that sits next to a conventional deployment of the SAP Business Suite.

For now, the real-time advantage of BPC Optimized for S/4HANA Finance is limited to the planning side of the product, but that alone is attractive to many because you get on-the-fly slicing and dicing against up-to-the-minute data. Future plans call for sophisticated, real-time analyses of sales and profitability, investments, liquidity, product costs, tariffs and more.

The appeal of this product will become even more powerful in the third quarter when it gains support for real-time consolidation. This will enable business units and corporate to predict and close end-of-period results that much more quickly, a crucial advantage.

MyPOV: For net-new customers who are implementing from scratch, BPC Embedded is the likely path forward because there’s no legacy to worry about and you develop from the start to gain the real-time analysis advantages of SAP HANA. But for the many existing customers who have content they want to leverage, BPC Optimized for S/4HANA is a more practical path to real-time planning and consolidation. It’s also a path to the S/4HANA Finance Universal Journal, which promises single-table access to all cost and account information, including customer and vendor data. With universal access to real-time data combined with (coming) what-if simulation and recommendations, we’re talking state-of-the-art capabilities that can change the way you see and do business.



News Analysis - Oracle Unveils Suite of Breakthrough Services.. or short: Oracle Cloud Machine


Earlier this week Oracle unveiled the Oracle Cloud Machine, its offering to run on premises what is running in the Oracle Cloud. It’s a key milestone for Oracle and for the overall industry, showing that hybrid cloud is real, and enterprises plan, want and likely will move loads between public cloud and on premises.

 
 

So let’s do our customary news analysis of the press release, which can be found here:
Oracle today launched a new family of offerings designed to enable organizations to easily move to the cloud and remove some of the biggest obstacles to cloud adoption. These first-of-a-kind services provide CIOs with new choices in where they deploy their enterprise software and a natural path to easily move business critical applications from on-premises to the cloud.
MyPOV – Interesting that Oracle approaches it as a strategy / platform to move to the public cloud. Moving load from the public cloud back on premises is the more immediate value proposition that comes to (my) mind. 

While organizations are eager to move their enterprise workloads to the public cloud, many have been constrained by business, legislative and regulatory requirements that have prevented them from being able to adopt the technology. Today, Oracle is making it easier for organizations in every industry to make this transition and finally reap the performance, cost and innovation benefits of Oracle Public Cloud Services and run them wherever they want—in the Oracle Cloud or in their own datacenter.
MyPOV – Oracle touches a key point with restrictions that keep enterprises from moving to the public cloud. Giving customers choice is always good, and that’s what Oracle customers are now getting.
 
"We are committed to helping our customers move to the cloud to help speed their innovation, fuel their business growth, and drive business transformation,” said Thomas Kurian, president, Oracle. “Today’s news is unprecedented. We announced a number of new cloud services and we are now the first public cloud vendor to offer organizations the ultimate in choice on where and how they want to run their Oracle Cloud.
MyPOV – Good quote from Kurian; certainly Oracle is the first provider to run Oracle Cloud loads on the Oracle Cloud Machine on premises. Who else would? Back in January Microsoft made Azure Stack available, which offers similar qualities. Going forward I am sure Microsoft and Oracle will spar with each other over who can move more load(s), and better.
 
Unveiled today, Oracle Cloud at Customer enables organizations to get all of the benefits of Oracle’s cloud services—agility, simplicity, performance, elastic scaling, and subscription pricing—in their datacenter. This is the first offering from a major public cloud vendor that delivers a stack that is 100 percent compatible with the Oracle Cloud but available on-premises. Since the software is seamless with the Oracle Cloud, customers can use it for a number of use cases, including disaster recovery, elastic bursting, dev/test, lift-and-shift workload migration, and a single API and scripting toolkit for DevOps. Additionally, as a fully managed Oracle offering, customers get the same experience and the latest innovations and benefits using it in their datacenter as in the Oracle Cloud.
MyPOV – Good description of the main argument Oracle has been making for many years now: same platform and software on both sides of the firewall, and the benefit of load fluctuation between public cloud and on premises.
 
By extending the Oracle Cloud into their data center, customers can:
Have full control over their data and meet all data sovereignty and data residency requirements that mandate customer data remain within a company’s data center or contained within a geographic location while still taking advantage of the benefits of the cloud
MyPOV – Control over data / privacy / compliance are key arguments for enterprises to keep loads on premises. Given the limbo enterprises are in with the invalidation of the Safe Harbor agreement, this is a very valuable offering for many of them.
 
Enable workload portability between on-premises and cloud using identical environments, toolsets, and APIs 
MyPOV – The main Oracle quality: Oracle showed the move of a database between on premises and the cloud back at Oracle OpenWorld last year. The question is how much dynamic load enterprises will let float.
   
Easily move Oracle and non-Oracle workloads between on-premises and the cloud based on their changing business requirements
MyPOV – The interesting information bit here is ‘non-Oracle workloads’, which is the strategy Oracle unveiled earlier in January this year (read here) and at OpenWorld last year: with the help of a nested hypervisor infrastructure, Oracle wants to attract, run and operate non-Oracle loads as well. Technically, an enterprise should be able to take e.g. an AWS cloud load and run it on premises on the Oracle Cloud Machine.
   
Comply with security and privacy regulations such as PCI-DSS for the global credit and debit card industry, HIPAA for the US healthcare industry, FedRAMP for the US federal government, Germany’s Federal Data Protection Act, the United Kingdom's Data Protection Act, and other industry- and country-specific regulations
MyPOV – Good to see Oracle has done the homework on all the data privacy and residency mandates that enterprises face – and we know enterprises struggle to stay on top of them. They sure are open to solutions that give them flexibility in regard to where the legislative bodies of their respective countries will march.
 
Today, Oracle is announcing the availability of the following Oracle Cloud at Customer services:
Infrastructure: Provides elastic compute, elastic block storage, virtual networking, file storage, messaging, and identity management to enable portability of Oracle and non-Oracle workloads into the cloud. Additional IaaS services that complete the portfolio, including Containers and Elastic Load Balancer will be available soon.
MyPOV – Good to see sizable enough functionality to really run some load on premises. The Load Balancer will be key, and container support will be important to move next-generation applications on premises, as they take advantage of modern software construction and operation technologies like microservices.
 
Data Management: Enables customers to use the number one database to manage data infrastructure in the cloud with the Oracle Database Cloud. The initial set of Database Cloud Service offerings will be followed by Oracle Database Exadata Cloud for extreme performance and a broad set of Big Data Cloud services, including Big Data Discovery, Big Data Preparation, Hadoop, and Big Data SQL.
MyPOV – The list of future offerings is long, reminding us that this is a version 1. But the most common Oracle load, its database, is supported, and with that there is massive market potential for the Oracle Cloud Machine.
 
Application Development: Develop and deploy Java applications in the cloud using Oracle Java Cloud, soon to be followed by other services for polyglot development in Java SE, Node.Js, Ruby, and PHP.
MyPOV – This is likely the most interesting part of the Oracle Cloud Machine from a PaaS perspective. It’s practically impossible to develop on premises with the latest development tools and technologies. This is a first option to take the location of development back in-house, which a number of organizations will welcome.
 
Enterprise Integration: Simplify integration of on-premises applications to cloud applications and cloud application to cloud application integration using the Oracle Integration Cloud Service. Additional capabilities for SOA, API Management, and IoT will be added soon.
MyPOV – While AppDev was interesting, this is the most impactful capability, as Oracle moves the boundary of integration between on-premises and public cloud based systems to inside the firewall. And with that, integration is completely open to control and inspection within the realm of the enterprise – something most enterprise IT shops have done for many decades and in many countries are more comfortable to keep doing, than having the interface happen before / after data passes the firewall.
 
Management: Unifies the experience of managing workloads seamlessly on-premises and in the Oracle Cloud.
MyPOV – Ok – a no-brainer, it needed to be there; more details on how to move load would be interesting. When could load ‘burst’ to the cloud, one of the ‘holy grails’ of cloud computing? Could data be split based on statutory requirements? Many more questions.
 
The Oracle Cloud has shown strong adoption, supporting 70+ million users and more than 34 billion transactions each day. It runs in 19 data centers around the world.
MyPOV – Impressive stats. One wonders how many servers run Oracle on premises, how many users they serve and how many transactions they run. I am sure some smart people in Redwood Shores have assessed this…

 

MyPOV

Not a surprising move by Oracle, as it has been talking about symmetrical setups, products and capabilities between its cloud and on-premises products for many years. That Oracle makes the move only now (and recently Microsoft, too) shows that Oracle thinks its cloud architecture is mature enough that it can ship it out to customers. At the same time it is clear that customers cannot replicate the full extent (and complexity) of Oracle’s cloud platform across multiple servers and system landscapes – so offering the same on premises is the logical consequence. No surprise Oracle will have to offer the management of the Oracle Cloud Machines, something CxOs should definitely consider using as a service.

For enterprises it comes back to: is the ability to run loads that may move to the (Oracle) cloud sometime in the future worth buying the Oracle Cloud Machine and services today – versus using existing setups in machines and people? Every enterprise will have a different answer to the equation. 
 
But with many enterprise boards looking at the Capex vs. Opex ratio, CIOs and CTOs know they need to have more Opex options, so we expect some reasonable interest from the enterprise side. If Oracle can now show TCO superiority (something the vendor usually isn’t shy about) over other cloud-based solutions, the Oracle Cloud Machine may be the on-premises route for Oracle to ‘steal’ load from other IaaS providers (thanks to the nested hypervisor). 
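The Capex vs. Opex trade-off boards are weighing can be illustrated with a toy calculation. All figures below are hypothetical, and the sketch ignores discounting, depreciation, migration cost and tax effects; the point is only the shape of the comparison, not a real TCO model.

```python
# Toy Capex vs. Opex comparison over a planning horizon.
# All numbers are hypothetical, for illustration only.

def total_cost_capex(hardware, annual_ops, years):
    """Upfront purchase plus yearly operating cost (power, staff, support)."""
    return hardware + annual_ops * years

def total_cost_opex(monthly_subscription, years):
    """Pure subscription model: no upfront spend, recurring monthly fee."""
    return monthly_subscription * 12 * years

capex = total_cost_capex(hardware=500_000, annual_ops=80_000, years=5)  # 900,000
opex = total_cost_opex(monthly_subscription=14_000, years=5)            # 840,000

print("capex total:", capex, "opex total:", opex)
```

A real TCO case would also price in hardware refresh cycles, elasticity and the cash-flow value of spreading spend over time, which is exactly where the vendor pitches tend to differ.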

But that is speculation – certainly exciting times ahead. We will be watching, stay tuned.

 
Recent blog posts on Oracle:
  • Progress Report - Oracle Cloud - More ready than ever, now needs adoption - read here
  • Event Report - Oracle Openworld 2015 - Top 3 Takeaways, Top 3 Positives & Concerns - read here
  • News Analysis - Quick Take on all 22 press releases of Oracle OpenWorld Day #1 - #3 - read here
  • First Take - Oracle OpenWorld - Day 1 Keynote - Top 3 Takeaways - read here
  • Event Preview - Oracle Openworld - watch here

Future of Work / HCM / SaaS research:
  • Event Report - Oracle HCM World - Full Steam ahead, a Learning surprise and potential growth challenges - read here
  • First Take - Oracle HCM World Day #1 Keynote - off to a good start - read here
  • Progress Report - Oracle HCM gathers momentum - now it needs to build on that - read here
  • Oracle pushes modern HR - there is more than technology - read here. (Takeaways from the recent HCMWorld conference).
  • Why Applications Unlimited is a good strategy for Oracle customers and Oracle - read here.

Also worth a look for the full picture
  • Event Report - Oracle PaaS Event - 6 PaaS Services become available, many more announced - read here
  • Progress Report - Oracle Cloud makes progress - but key work remains in the cellar - read here
  • News Analysis - Oracle discovers the power of the two socket server - or: A pivot that wasn't one - TCO still rules - read here
  • Market Move - Oracle buys Datalogix - moves more into DaaS - read here
  • Event Report - Oracle Openworld - Oracle's vision and remaining work become clear - they are both big - read here
  • Constellation Research Video Takeaways of Oracle Openworld 2014 - watch here
  • Is it all coming together for Oracle in 2014? Read here
  • From the fences - Oracle AR Meeting takeaways - read here (this was the last analyst meeting in spring 2013)
  • Takeaways from Oracle CloudWorld LA - read here (this was one of the first cloud world events overall, in January 2013)

And if you want to read more of my findings on Oracle technology - I suggest:
  • Progress Report - Good cloud progress at Oracle and a two step program - read here.
  • Oracle integrates products to create its Foundation for Cloud Applications - read here.
  • Java grows up to the enterprise - read here.
  • 1st take - Oracle in memory option for its database - very organic - read here.
  • Oracle 12c makes the database elastic - read here.
  • How the cloud can make the unlikeliest bedfellows - read here.
  • Act I - Oracle and Microsoft partner for the cloud - read here.
  • Act II - The cloud changes everything - Oracle and Salesforce.com - read here.
  • Act III - The cloud changes everything - Oracle and Netsuite with a touch of Deloitte - read here

Finally, find more coverage on the Constellation Research website here and check out my magazine on Flipboard and my YouTube channel here.

Event Report – Google Cloud Platform Next – Key Offerings for (some of) the enterprise


We had the opportunity to attend the inaugural Google Cloud Platform conference, Google Cloud Platform Next, in San Francisco on March 23rd and 24th, 2016. For an inaugural event the conference was well attended and situated at the beautifully transformed Pier 48 on the San Francisco waterfront.

 
 
 
So take a look at my key takeaways, with a focus on the Day #2 keynote (don’t miss my Day #1 takeaways here – if you haven’t seen them, watch them first):
 

No time to watch – read on:

The keynote was led by Greg DeMichillie, who promised that it would be ‘not marketing and all technology’ – and it certainly lived up to that. In detail, the audience learned:


Google Datacenters - Joe Kava led us through the principles of Google data center setup and design; the session was the first time Google shared these principles with a wide public audience. And they are:
  • Availability – Google designs for very high availability, and it needs it for its very own services, achieving it by building its own hardware, networking, etc.
     
  • Security – Google is committed to security; e.g. it was the first major cloud provider to enable HTTPS / TLS by default.
     
  • Performance – Google’s proprietary designs are aided by e.g. high-efficiency cooling and innovative hardware designs, such as the new OCP announcements around the 48V rack specification.
     
  • Culture – Google runs its datacenters with its own employees only. From design to build – it’s all done by Google itself, from ‘chip to chiller’. Kava shared a chart showing that humans make more mistakes than machines, so the Google-built monitoring systems are crucial.
     
  • Sustainability - Google is the only cloud provider running exclusively on renewable energy; it helps that Google is the largest private investor in renewables. Its own R&D in cooling technology is another strength.
     
  • Innovation – Google is using its advances in Machine Learning to harden its data center operations further, alongside its software-supported operations. 
As mentioned yesterday already, Google will open data centers in Tokyo and Oregon this year, with 10 more locations by 2017. And data center locations matter: as we learnt e.g. from Google partner Avere Systems, there is a difference in performance when you render CGI in Iowa versus Oregon when based in Southern California. Moreover, locations matter for data residency and privacy, and Google is likely going to need more EU- and APAC-based instances (no 2017 locations were announced).

Security matters – Next was Niels Provos, walking us through the security framework of Google and then GCP. Too much for a blog post – but compare the two pictures below. Google of course re-uses its own Google App security stack, and then extends it for the needs of GCP customers. No surprise here; good to see the synergies. 
 
 
Google Security Stack
Google Cloud Platform Security Stack
 
 

Configuration matters – Next up was Eric Brewer, updating us that Google has been using containers for over 10 years – first with Borg, then Omega and now Kubernetes. He walked us through the recently announced Kubernetes 1.2 and then tackled the challenge of configuration, a key DevOps problem. Making the config available via a mounted volume gives many advantages, and that’s what Google is introducing with Helm, the package manager for Kubernetes, now on GitHub.
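The mounted-volume configuration pattern Brewer described can be sketched as follows. This is a generic illustration, not Google or Kubernetes code: the directory, file name and keys are hypothetical stand-ins for where a Kubernetes ConfigMap volume would be mounted into the container.

```python
# Sketch: an application reads its configuration from a file mounted into
# the container (e.g. from a Kubernetes ConfigMap volume), so config can
# change without rebuilding the image. Paths and keys are hypothetical.

import json
import os
import tempfile

def load_config(path):
    """Read JSON config from a mounted volume, falling back to defaults."""
    try:
        with open(path) as f:
            return json.load(f)
    except FileNotFoundError:
        return {"log_level": "info"}  # safe default when no volume is mounted

# Simulate the mounted ConfigMap file for this sketch
mount_dir = tempfile.mkdtemp()
config_file = os.path.join(mount_dir, "app-config.json")
with open(config_file, "w") as f:
    json.dump({"log_level": "debug", "feature_flags": ["new-ui"]}, f)

cfg = load_config(config_file)
print(cfg["log_level"])  # debug
```

Because the platform can update the mounted file in place, the application can pick up new configuration by re-reading the path – no image rebuild and no redeploy of environment variables, which is exactly the DevOps advantage at stake here.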
 

Analyst Tidbits

 
  • Customer Panel – We had the chance to talk to Google customers Coca-Cola, Snapchat and Spotify. All have very different use cases: Snapchat has been native to Google from the very start, Spotify just chose GCP – mainly for its Analytics / Machine Learning abilities – and Coca-Cola built a web-scale soccer World Cup picture-sharing application. When asked what they would like GCP to improve, they mentioned IAM (Spotify), better server farm management (Snapchat) and more partner and SI support (Coca-Cola). Good to see happy customers. 
  • Partner Panel – Next was a panel of a very different set of partners: Avere (hybrid storage, burst to cloud), PwC (building new applications on GCP), Bitnami (providing the app stack to ISVs) and xxx (a classic ISV, building a content management system). Good to see GCP being able to serve such a diverse set of partners, and good to see happy partners. 
  • Google pushes Machine Learning further – Google’s machine learning platform TensorFlow became more interesting given the announcements made on assisted learning. Going beyond the data scientist and machine learning specialist is a key step for advancing the overall industry. And Google knows that Machine Learning creates substantial need for storage and compute – something every cloud provider wants. The portability of TensorFlow models makes the product interesting compared to the competition, where more lock-in is typically in play. 
 
     

MyPOV

A good event for Google that has shown some very compelling arguments to use GCP. 

There can be no doubt at this point that Google understands and operates at cloud scale. Allowing GCP customers to use the same infrastructure and architecture is very attractive for enterprises, as long as their needs fit into the Google use cases. In the executive Q&A Diane Greene was very adamant that Google wants all of the enterprise load, and chairman Eric Schmidt made it clear that Google has now understood it needs to come more ‘to you’ (i.e. the enterprises). But there are loads in the enterprise (think e.g. ERP, CRM etc. systems) that Google has no answer for – except for a re-write of the load into new applications. That is not palatable for many enterprises, so Google is banking on a multi-cloud future. And we can agree that the future is multi-cloud; the question for enterprises is how much automation / load gets put into which cloud. Naturally they gravitate towards simplicity and the ‘one butt to kick’ decision making that has worked well for them in the past – and there is no reason why the same principles will not work in the future. 

Google now needs to think hard about how wide it wants to make its product offering to capture enterprise load – e.g. Google can and could go after basic cloud load like storage and DR. Not a peep on that in the keynotes at GCP Next. For now enterprises should look at GCP for their next-gen application needs that are aligned with Google’s core competencies – that by itself is a great synergy between what enterprises need to build and what Google can offer. Whether this will be enough for Google to catch up to the current players ahead of it in the cloud race (AWS and Azure) remains to be seen.

Musings - Implications for CxOs from the DoJ vs Apple tussle


A lot has been written about the US Department of Justice and Apple sparring over unlocking the iPhone of one of the San Bernardino attackers.


    What can CxOs in charge of enterprises / enterprise functions learn from it? Take a look:

    No time to watch - here are the key takeaways:
     
    • Privacy Matters - Privacy is important for consumers as well as enterprises; guarding it is quickly becoming a Top 3 objective for CxOs.
       
    • Legal Frameworks Lag Behind - The legal frameworks have not been able to keep up with technology progress; a device that can / could be locked beyond the government's access has not existed before.
       
    • The Courts Decide - If you venture into the legal gray zone, be ready to defend your position in court.
       
    • Governments (so far) always win - Across all technologies, from the telegraph to the phone to the internet, governments have always won the oversight battle.
       
    • Innovation implications - Apple has been so adamant about defending privacy that it is even ready to close capabilities it needs itself, as seen with Error #53. When vendors are not able to correct their own mistakes quickly, they need to develop capabilities more carefully and test longer - and that will actually result in slower innovation cycles. 
     

    MyPOV

    The gap between legal frameworks and technology capability is likely to widen going forward (see the Safe Harbor debate, FCC regulation of the internet etc.); for CxOs it means being aware of the gap and making conscious decisions. On the societal side the legislative power will have to step up, ultimately driven by voters' views on whether (in their country) there should be unbreakable devices - for the first time. Regulatory certainty around operating businesses will become even more important going forward.
     
    [Update March 22nd] It looks like the DOJ has found a 3rd party to help unlock the contested iPhone. Now Apple is worried about what the 'break in' technology is that endangers its privacy ambition. Lesson learnt: governments find their way, one way or the other, but we knew that before. 


     
    More Musings
    • Musings - Retail is the breeding ground for NextGen Apps - read here
    • Musings – Time to re-invent email – for real! Read here
    • The Dilemma with Cloud Infrastructure updates - read here
    • Are we witnessing the Rise of the Enterprise Cloud? Read here
    • What are true Analytics - a Manifesto. Read here
    • Is TransBoarding the Future of Talent Management? Read here
    • How Technology Innovation fuels Recruiting and disrupts the Laggards - read here
    Find more coverage on the Constellation Research website here and check out my magazine on Flipboard and my YouTube channel here
     

    SAP Ariba Live – making procurement strategic again


    Sorry, it is the crazy season of United States presidential primaries and I had to tap into the Donald Trump tag line. I think I am a little punchy after spending a few days at the Cosmopolitan casino in Las Vegas. Other than the usual craziness that occurs in Las Vegas, I spent my time with the SAP Ariba team learning about their direction for 2016 and beyond. The biggest theme I took away from the event was the re-positioning of procurement: evolving from being primarily a spend management tool to a much more global and strategic function. Taken in the context of how commerce, in both the B2B and B2C worlds, has evolved, the ability to approach spending with greater flexibility and insight is crucial not only to remain competitive but also to capture opportunities. The highlights from the show:

    • Massive volume of commerce passing through SAP Ariba: With $1 trillion worth of commerce volume, 8 documents per second, $40 billion in payments passing via the platform and 113 million annual PO-invoice transactions in 2015, the volume of commerce being handled is impressive.
    • Simple, Global and Innovative – this was the mantra from the main stage right from the get-go. SAP Ariba has been hard at work to make the business platform simple to use – almost like the interfaces and transactional systems we are used to as consumers. From tying in greater amounts of necessary data, to streamlining the process, to the look and feel of the system, the simplicity of the tool was apparent from the main stage and in conversations with the team. The challenge for SAP Ariba and their team will be being global and innovative. These terms are too often thrown around without much thought behind what they truly mean. Being global is necessary to truly meet customers' spend needs, but on the innovative side SAP Ariba will have to work with its clients to demonstrate true innovation and not let it become an empty tag line.
    • Guided buying – bringing the user experience of the B2C eCommerce world to the world of procurement. The guided buying platform brings the interface and ease of use that we expect as consumers to the B2B space. Similar to our experience as consumers, the system applies a layer of intelligence to guide the user towards the best options. When we interact with Amazon and other online commerce sites, we are accustomed to and expect to be provided with suggestions for related products or ancillary goods. With guided buying, SAP Ariba brings this layer of intelligence to the procurement world.
    • Supplier management – the network is only as powerful as the ability to add (and by definition subtract) suppliers. A crucial characteristic of any robust network is the ability for customers to rapidly onboard vital suppliers, whether to find a new source of material, to add suppliers that can support a new product, or to enter a new market. SAP Ariba is adding simplicity and efficiency to the process. This is vital, as speed and flexibility in buying become ever more important as the speed of business continues to accelerate.

    Another interesting undertone is the view of this platform going beyond procurement to taking in added data feeds such as IoT. The example provided revolved around picking up a signal from an IoT beacon for predictive maintenance: the signal would feed the platform, warning of a possible breakdown of a product, and users could then tap right into the SAP Ariba platform to order the necessary part. This was a great example of how the open platform can allow new business models to evolve and keep pace with digital disruptors such as IoT.

    SAP Ariba is working on making “procurement cool again”; I would argue they are making procurement strategic again. Procurement and buying have to be taken into account with all other aspects of the supply chain. As customers continue to drive the ecosystem – regardless of B2B or B2C – a holistic view of the entire network is vital to be capable of meeting demands and uncovering opportunities.


    Tagged: procurement, SAP Ariba, Supply Chain


    Inside Cloudera Analyst Day



    Cloudera plans public cloud push as applications multiply in financial services, insurance, life sciences, retail and telecommunications. Hadoop may not be easy, but it is gaining mainstream adoption.

    Watch “Inside Cloudera Analyst Day” from Constellation Research on Vimeo: https://vimeo.com/160284542

    Google Cloud Platform - Takeaways Day #1 Keynote


    We had the opportunity to attend Google's Google Cloud Platform event this week in San Francisco, a key event for Google in the ongoing 'battle for the public cloud'.


    So take a look for my top 3 takeaways:

    No time to watch - read on:


    Greene Debut - Since Diane Greene joined Google in November last year, there has been a lot of expectation that she will move the Google offerings into a better place with the enterprise. In her short remarks she hit good points in regards to investment (almost $10B in 2015 alone) and TCO savings. As an avid sailor she used the revolutionary foils as a metaphor for what Google wants to do for the enterprise.

    GCP grows - Urs Hoelzle then walked us through key advancements of Google Cloud Platform (GCP), after unveiling the pitch line for GCP: better software, faster. What stuck with me is once again scale, the focus on security, machine learning and the new buzzword 'NoOps' vs DevOps.

    3 layers of GCP - Then it came to Brian Stevens to share the three layers of GCP: Infrastructure and Operations, Application Development, and Data and Analytics. This structure formed his part of the keynote, with key announcements in each area, each coupled with a major customer win.

     
    • Infrastructure and Operations - This year Google will add data centers in Tokyo and Oregon, and 10 more locations will come by 2017. Locations are key for speed and compliance, and it is good to see that Google is ramping up GCP locations. The key product announcement was Stackdriver, the new GCP Ops Console, which interestingly not only shows GCP loads and operations but offers insights into 3rd party clouds, too - today AWS. The key customer win was Coca-Cola.
        
    • Application Development - The key demo here was around Kubernetes, scaling a load well on GCP and, more interestingly, also in hybrid mode, which Google demoed with an Intel server on stage. The key customer win was Disney Interactive.
       
    • Data & Analytics - On the product side Google showed Datastudio 360 and then unveiled Cloud Machine Learning, a key step forward in how to build 'true' analytics applications. The key customer win was Spotify, which was demoed impressively. 
     

    MyPOV

    I tweeted my Top 3 questions before the event on what enterprises (and I) are looking for Google to address - here they are:
     

    So how did Google do?

    Ad 1 - Google was not too explicit here - but being able to monitor loads in AWS and move them makes clear what the options for enterprises are. And with a strong focus on machine learning on top of big data, Google thinks it can out-feature AWS and Azure.

    Ad 2 - As we know from enterprises already, it's hard to figure out how Google and GCP can specifically help them. There is perceived value, but it is not tangible enough. And while Coca-Cola, Disney Interactive and Spotify are great customer wins for a pure-bred cloud showcase, they don't give the average CIO confidence that GCP can power their use cases.

    Ad 3 - Google did a very good job here and has probably the most impressive offering in the market. But again, the question is how it relates to the enterprise out there. It was very impressive to see how Spotify uses the machine learning and big data tools - probably a key reason for choosing GCP - but how that relates to the average CIO out there looking at Google was not addressed.

    So overall a good start for Google; it has shown once again what it does well - working with enormous amounts of data, processing with a lot of compute, at a very attractive price - but we knew that before. Good to see focus and progress on security and administration, with a multi-cloud angle. But it is only Day #1 of the event - stay tuned for more tomorrow.


    More about Google:
    • News Analysis - Google launches Cloud Dataproc - read here
    • Musings - Google re-organizes - will it be about Alpha or Alphabet Soup? Read here
    • Event Report - Google I/O - Google wants developers to first & foremost build more Android apps - read here
    • First Take - Google I/O Day #1 Keynote - it is all about Android - read here
    • News Analysis - Google does it again (lower prices for Google Cloud Platform), enterprises take notice - read here
    • News Analysis - Google I/O Takeaways - Value Propositions for the enterprise - read here 
    • Google gets serious about the cloud and it is different - read here
    • A tale of two clouds - Google and HP - read here
    • Why Google acquired Talaria - efficiency matters - read here
    Find more coverage on the Constellation Research website here and check out my magazine on Flipboard and my YouTube channel here

    Cloudera Takes to the Cloud, Highlights Industry Use Cases


    Cloudera plans public cloud push as applications multiply in financial services, insurance, life sciences, retail and telecommunications. Hadoop may not be easy, but it is gaining mainstream adoption.

    Hadoop is going mainstream, it’s increasingly moving into the cloud, and it’s delivering solid business value. These are three key themes that were highlighted this week at Cloudera’s third annual Analyst Day in San Francisco.

    Inside Cloudera Analyst Day

    Cloudera shared strong evidence of broad adoption and business value through panels that dug into the details of real-world deployments. Here’s a short list of the types of applications seen across five industries:

    Financial Services: Cloudera has more than 100 customers in this category, and use cases typically start with governance and security. Big banks, for example, have to retain transactional data for regulatory reasons, and many have embraced Hadoop for high-scale data retention and analyses including anti-money-laundering and stress testing. As the breadth of data in a data lake spreads across lines of business, financial services firms develop 360-degree views of customer preferences and behaviors (customer 360).

    Insurance: Insurers use Hadoop for customer 360 and claims-fraud analysis applications. More mature adopters are moving into Internet of Things (IoT) applications such as usage-based pricing. In the automotive arena, for example, pay-as-you-drive and how-you-drive pricing will be ubiquitous within a few years, a Cloudera exec predicted. The platform is making high-scale analysis of telematics data practical and affordable.
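    The pay-as-you-drive / pay-how-you-drive idea above can be sketched as a simple premium calculation; every rate and weighting here is a made-up illustration of the concept, not any insurer's actual pricing model:

```python
def monthly_premium(base_rate, km_driven, harsh_brake_events, night_km):
    """Toy usage-based insurance premium (illustrative rates only).

    base_rate          -- fixed monthly component
    km_driven          -- pay-as-you-drive: charge per kilometre
    harsh_brake_events -- pay-how-you-drive: surcharge per risky event
    night_km           -- surcharge for higher-risk night driving
    """
    usage = 0.02 * km_driven                              # per-km component
    behaviour = 0.50 * harsh_brake_events + 0.05 * night_km
    return round(base_rate + usage + behaviour, 2)

# A driver covering 500 km with 4 harsh-braking events and 100 night km:
print(monthly_premium(30.0, 500, 4, 100))  # 47.0
```

The point of the sketch is that telematics turns premium pricing into a function of observed behaviour rather than static actuarial categories.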

    Life Sciences: Whether it’s healthcare providers, pharmaceutical companies or crop sciences firms, these organizations are modernizing their data infrastructures to handle data at unprecedented scale. Cloudera customer Cerner, which analyzes electronic medical records on a Hadoop-based platform, has come up with an automated way to predict sepsis infections in hospital patients. The alerts have reportedly saved more than 3,000 lives to date.

    Retail: It’s all about getting closer to the customer, differentiating products and services, and optimizing inventory to maximize sales and keep customers happy. That’s a journey that starts with resolving customer identities across channels and then better integrating data from across channels. These first two steps get you to the most valuable stage of understanding customer interactions, behaviors and value across all channels and over time.
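    The first step of that retail journey - resolving customer identities across channels - is essentially a record-linkage problem. A minimal union-find sketch (the `channel:identifier` keys are hypothetical, not any vendor's schema):

```python
class IdentityResolver:
    """Toy cross-channel identity resolution: records that share an
    identifier (email, phone, loyalty id) collapse into one customer."""

    def __init__(self):
        self.parent = {}

    def _find(self, x):
        # Union-find with path halving.
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def link(self, a, b):
        """Record that identifiers a and b belong to the same person."""
        ra, rb = self._find(a), self._find(b)
        if ra != rb:
            self.parent[rb] = ra

    def same_customer(self, a, b):
        return self._find(a) == self._find(b)

r = IdentityResolver()
r.link('web:alice', 'email:a@x.com')   # web account shares an email...
r.link('store:12', 'email:a@x.com')    # ...with an in-store loyalty card
print(r.same_customer('web:alice', 'store:12'))  # True
```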

    Telcos: Telcos are big users of Hadoop, and they start with governance-oriented call-data-record remediation and customer-churn analysis. Operations groups use the platform for network troubleshooting, security and risk analysis. As use of the platform matures, front-end and back-end insights are integrated for proactive network optimization, customer service and anti-churn initiatives.

    The four themes that cut across all industries are driving customer insights, improving products and services, reducing risk and modernizing IT infrastructures. On this last point, Cloudera said that only 15% of its 850-plus enterprise customers have deployed its software on public clouds, but that’s where it’s seeing the fastest growth. “Data that’s born in the cloud wants to stay in the cloud,” observed Cloudera Chief Strategy Officer Mike Olson, and that trend will accelerate as IoT scenarios flourish, he added.

    Cloudera plans to ramp up in this area with Cloudera Director, an automated cloud deployment tool and abstraction layer that hides the complexities and differences among various clouds and deployment options including Amazon Web Services, Google, OpenStack and VMware. With Cloudera Director 2.0, released in January, Cloudera added a cluster cloning feature and the ability to automatically grow and shrink clusters to save money.
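    The grow/shrink behaviour can be illustrated with a generic sizing heuristic - to be clear, this is a hypothetical sketch of the idea, not Cloudera Director's actual algorithm or API:

```python
import math

def target_cluster_size(queue_depth, jobs_per_node, min_nodes=3, max_nodes=50):
    """Generic autoscaling heuristic (illustrative only): size the
    cluster so pending work fits, clamped to configured bounds so the
    cluster shrinks when idle and never grows past a cost ceiling."""
    needed = math.ceil(queue_depth / jobs_per_node) if jobs_per_node else min_nodes
    return max(min_nodes, min(max_nodes, needed))

print(target_cluster_size(120, 4))   # 30 nodes for 120 queued jobs
print(target_cluster_size(2, 4))     # 3 (floor) when nearly idle
```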

    MyPOV on Cloudera Analyst Day

    There was a bit of a disconnect between what Cloudera talked about in its market observations and strategy overviews and what it detailed in its product roadmap (which was largely under NDA). For example, there was no signal of new cloud deployment capabilities beyond Director 2.0, other than supporting Microsoft Azure as a deployment option. And despite all the talk of industry specific use cases, Cloudera executives only vaguely alluded to blueprints, templates, and frameworks — from Cloudera and from partners — that give customers a starting point on proven applications. It’s nice to hear about vertical use cases, but Cloudera has much more work to do on broad platform acceptance before it can go too far down the vertical-industry path.

    At one point during the day Cloudera described its technology as being “fast and easy,” but that discussion reminded me of SAP couching its next-generation ERP suite as being “simple.” When I questioned execs about the use of these terms, Chief Strategy Officer Mike Olson qualified that Hadoop is fast and easy as compared to relational database approaches when trying to solve high-scale data challenges. He also pointed to efforts Cloudera has made to simplify deployment with tools like Navigator Optimizer and Cloudera Director, which speed and ease analysis and optimization of SQL workloads and cloud deployment, respectively.

    At other points during the day Cloudera execs talked about the time and money the company has to invest to help clients move from proof-of-concept projects to broad and fruitful production use. And it also discussed how it’s now employing extensive automated testing to ensure the quality of its software distribution, which now includes more than 25 open source components.

    In short, “fast” and “easy” are not terms I would associate with Hadoop. But “proven,” “value driving” and even “industry standard” work for me and for the many companies that now rely on the platform.



    Real Time IoT Sensing requires Real Time Responsive Apps, and only now are these arriving in the market


    There is a distinct feeling that technology capabilities to monitor and capture ‘real time’ data using an ever increasing range of low cost sensors are getting ahead of the availability of Apps and Services that can provide ‘real time’ optimized responses. The Industrial Technology and automation sector has a long history of developing Machine to Machine responsive systems, but for an IT sector based on historic transaction applications it’s a big paradigm (apologies) shift.

    Service Engineer management, including Preventative Maintenance, is widely regarded as having excellent potential for substantial improvements in operating efficiency and direct cost saving. Achieving these goals requires more than capturing real time data, it requires an App that uses this data to make real time optimized responses.

    Currently, whether cloud based and mobile accessed or not, activities are planned in abstraction from reality on the basis of ‘historic’ data in ‘traditional’ IT applications. The justification for the adoption of IoT is to interact with ‘reality’, using a flow of real-time sensed data to drive a new generation of dynamically optimized ‘read and respond’ Apps.

    IoT driven Service Maintenance changes activities from being planned on the basis of history, or responding to equipment failure, into optimized responses to the reality of the present, often with proactivity towards developing situations.

    But this cannot be achieved just by adding IoT sensing to the current traditional Service Maintenance Applications that were never built to include this kind of functionality. Admittedly, better data input added to the overall data available can improve performance, but the answers will always come via historic reports, not ‘real time’ optimizations. (‘Real time’ is a difficult term to define, as in practice latency means nothing is truly real time; in the context of IoT it means reacting to data flows, not processing historic data.)

    IoT driven, cloud based Apps such as Uber, the real time responsive taxi cab service, show how a new generation of Apps can provide real time optimization from IoT data inputs. These near real time read-and-respond Apps are usually dubbed ‘Smart Services’, to distinguish them from the current generation of Apps that may connect via the Internet and use cloud services, but lack real time optimization against IoT data flows.

    The costs and inefficiencies associated with equipment failure are an issue across all industry sectors, so, not surprisingly, Service Management with Preventative Maintenance has been an immediate target for applying IoT. It’s not only break-fix notifications for equipment failure, but also the ability to use complex event processing to predict that an imminent failure might occur.

    Predictive Maintenance has always been the goal of any Service Management, but to date the only possibility has been using historic failure records for guidance. Due to the time and costs of detailed record keeping and the need for a long period of analysis, this has only been possible for selected high-value equipment. IoT sensing now makes it possible to provide accurate ‘real time’ data warnings across many items at low cost.

    The benefits of Preventative Maintenance start with the less expensive fix of a simple wearing part, avoiding wider damage to adjacent parts if left unattended. Ultimately, an unaddressed failure could be of a catastrophic nature, resulting in the need for a complete replacement unit. It is also important to be able to choose the time to carry out service work: Retailing, as an example, would prefer service work to be carried out outside trading hours, whereas Manufacturing processes need to choose when to make planned shutdowns.

    IoT sensing is one half of the game change for Service Management, but it’s the complex event processing capability to make use of the real time data flows that makes preventative maintenance possible. IoT sensing brings the new capability to use real time data flows to establish relationships, and provide outcomes, that would not previously have been possible. (See previous blog: event hubs or engines add react-capability analytics to read real time IoT data.)

    As an example, IoT Complex Event Processing would interpret rising changes in temperature, energy consumption and vibration reported by individual sensors as advance warning of a potential bearing failure in a rotating part. Reading the real situation will always be more accurate than even the best of historic, time based operations.
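    That bearing-failure rule can be sketched as a small complex-event-processing check over a sliding window of readings; the sensor names, window size and strictly-rising condition are illustrative assumptions, not any vendor's product logic:

```python
from collections import deque

class BearingFailureDetector:
    """Toy CEP rule: warn when temperature, energy draw and vibration
    all trend upward across a sliding window of sensor readings."""

    KEYS = ('temp_c', 'energy_kw', 'vibration_mm_s')

    def __init__(self, window=5):
        self.window = window
        self.readings = deque(maxlen=window)  # only the latest readings matter

    def _rising(self, key):
        values = [r[key] for r in self.readings]
        return all(b > a for a, b in zip(values, values[1:]))

    def ingest(self, reading):
        """reading: dict with temp_c, energy_kw, vibration_mm_s.
        Returns a warning string when all three signals rise together."""
        self.readings.append(reading)
        if len(self.readings) < self.window:
            return None  # not enough history yet
        if all(self._rising(k) for k in self.KEYS):
            return "WARN: possible bearing failure - schedule preventative service"
        return None
```

A real CEP engine would add latency handling, per-asset baselines and rate-of-change thresholds, but the shape is the same: react to the data flow as it arrives rather than query history later.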

    An often-repeated industry story that illustrates the limitations of break-fix actions combined with existing preventative maintenance routines tells of an unexpected breakdown in a heat pump being repaired with new parts and a thorough overhaul. Seven weeks later the annual time-planned preventative maintenance service fell due, and a different engineer dismantled the heat pump once again, replacing the nominated ‘wearing’ parts in accordance with the instructions for an annual service.

    Clearly ‘real time’ Smart Services using IoT data bring obvious benefits, but a Service Management and Preventative Maintenance package should provide deeper operational capabilities as well. It’s not just the equipment that benefits from real time dynamics; in this complicated working environment, engineering response and activity planning need the same dynamic approach.

    If an engineer is on a site for one task, and another event occurs on the same site, then automatic re-planning of the service engineer’s day should occur. In turn this should lead to wider re-planning of the rest of the field engineering team’s activities for the day to ensure cohesive coverage.
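    The re-planning rule above can be sketched as a simple assignment heuristic; the engineer names, capacity limit and least-loaded fallback are illustrative assumptions only:

```python
def replan_day(schedule, new_task, max_tasks=6):
    """Toy dynamic dispatch: if an engineer's last task is already on
    the site where the new event occurred, append the task to their day
    (capacity permitting); otherwise hand it to the least-loaded
    engineer, which in a real system would trigger the wider team
    re-plan described above. Returns the chosen engineer."""
    for engineer, tasks in schedule.items():
        if tasks and tasks[-1]['site'] == new_task['site'] and len(tasks) < max_tasks:
            tasks.append(new_task)
            return engineer
    engineer = min(schedule, key=lambda e: len(schedule[e]))
    schedule[engineer].append(new_task)
    return engineer

day = {'ann': [{'site': 'A', 'job': 'pump'}], 'bob': []}
print(replan_day(day, {'site': 'A', 'job': 'valve'}))  # ann - already on site A
print(replan_day(day, {'site': 'C', 'job': 'fan'}))    # bob - least loaded
```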

    Unexpectedly long repair times, traffic impacts on travel time, and relationships to customer service satisfaction all require a fully integrated real time approach in a new generation of Service Management and Preventative Maintenance services and Apps.

    The commercial impact and importance of this market has not gone unnoticed by technology industry vendors or equipment manufacturers, resulting in a wide range of announcements. See the links below to vendors teaming IoT sensing with Service Management. However, whilst all provide the core new capabilities outlined above, there are noticeable differences in the ability to team the real time react events with an equally dynamic, real time reactive Service Management environment.

    As the shift to IoT sensor based Preventative Maintenance takes place and an increasing amount of Service Management activity is driven by real time event responses, the necessity for a similar shift to real time Service Management operations becomes clear. Service Management professionals are facing a transformation to Internet ‘reality’ operations similar to the one marketing professionals faced with the adoption of social tools.

    Salesforce focused on this aspect in their announcement last week (15th March) of Field Service Lightning, referring to it as 360-degree operations; the extent to which this is available in other vendors’ offerings is less clear.

    http://www.salesforce.com/service-cloud/features/field-service-lightning/

    http://www.sap.com/pc/tech/internet-of-things/software/predictive-maintenance/index.html

    http://www.ibm.com/internet-of-things/asset-management.html

    https://blogs.microsoft.com/iot/2015/12/01/azure-iot-suite-predictive-maintenance-now-available/

    https://www.bosch-si.com/solutions/manufacturing/predictive-maintenance/increase-machine-uptime.html

    http://www.softwareag.com/us/solutions/manufacturing/iot/overview/default.asp
