
How to Secure the Best Cloud Software Contract Webinar with R “Ray” Wang


Constellation Research principal analyst and bestselling author R "Ray" Wang will teach you how to structure the most favorable cloud software contract for your organization. He will share his top cloud negotiation tips, drawn from his involvement in more than 1,000 contract negotiations.

Don't get trapped in an unfavorable cloud contract. Learn how to navigate the increasingly complex cloud services market by registering for this webinar today. 

Details:

You will learn:

  • Top cloud negotiation tips from one of enterprise tech's leading analysts
  • Common contract negotiation pitfalls
  • How to ensure you secure the best cloud services deal

Constellation Announces 2015 SuperNova Award Winners


Constellation announces the winners of the 2015 SuperNova Awards at Constellation's Connected Enterprise.

2015 SuperNova Award Winners

Constellation announced nine SuperNova Award winners last night at the SuperNova Awards Gala Dinner. 

The Constellation SuperNova Awards are the first and only awards to celebrate the leaders and teams who have overcome the odds to successfully apply emerging and disruptive technologies for their organizations. The SuperNova Award winners demonstrated great leadership in selecting, implementing, and deriving business value from disruptive technologies. More information about the winners below. 

All applications were evaluated by the SuperNova Award judges, a panel of industry thought leaders, and then put to a public vote. 

2015 SuperNova Award Winners

Consumerization of IT & The New C-Suite - Martin Brodbeck, CTO, Sterling Backcheck

Martin Brodbeck won the SuperNova Award for his implementation of SnapLogic Infrastructure-as-a-Service solutions to streamline Sterling Backcheck’s automated transactions service. He moved the entire system from a complicated, custom-built, open-source system to one streamlined cloud-based transactions system. The project shortened the customer onboarding time to just days. Frictionless transactions in the cloud translate to faster time to revenue for Sterling Backcheck.

Data to Decisions - Alda Mizaku, Lead Business Solutions Analyst, Mercy

Mizaku and her team won the SuperNova Award for leading a big data analytics project to improve the delivery of patient care and bridge the gaps between the clinical and coding worlds. Part of the strategy included automating secondary diagnosis detection to improve the accuracy of provider documentation. This strategy seeks to more accurately reflect care that has already been provided and help bubble up comorbidity factors. The project, which uses a combination of data warehouse tables, ETL logic, and custom reporting, led to a 36% increase in detected diagnoses.

Digital Marketing Transformation - Naveen Gunti, Sr. Director of e-Commerce Technology and Operations, Tumi Holdings, Inc.

Naveen Gunti won the SuperNova Award for his use of Adobe Marketing Cloud to improve Tumi’s online customer experience. The implementation enabled customers to view dynamic images of Tumi products on the Tumi website, and the resulting boost in customer engagement increased on-site time by 40%.

Future of Work, Human Capital Management - Asha Aravindakshan, Operations Director, Global Talent, Ashoka

Asha Aravindakshan won the SuperNova Award for leading the implementation of FinancialForce HCM at Ashoka. Prior to the implementation, Ashoka used Excel spreadsheets to manage their global workforce. Recognizing adoption as an essential element required for the success of the new HCM system, Aravindakshan led a movement to conduct performance reviews on FinancialForce. 97% of Ashoka's employees engaged with FinancialForce in response to this performance review incentive. Thanks to Aravindakshan's leadership, Ashoka can now address the HCM needs of an international workforce.

Future of Work, Social Business - Steve Nava, Sr. Director Field Service, Luminex

Steve Nava won the SuperNova Award for leading the implementation of ServiceMax to improve communication and collaboration of field service engineers at Luminex. The implementation was so successful that it transformed Luminex’s field service department into a solutions-oriented business.  Luminex’s fix rate increased to 98%, and the invoice cycle went from 28 days to 96 hours. 

Matrix Commerce - Jordan Kivelstadt, CEO, Free Flow Wines

Jordan Kivelstadt won the SuperNova Award for making Free Flow Wines one of the first companies to deliver wine in kegs. He used NetSuite to address the specific supply chain issues presented by this model. Free Flow Wines is disrupting the wine industry as more restaurants choose to serve wine in kegs. This year, Free Flow Wines expects to deliver the equivalent of 300,000 cases of wine.

Next Generation Customer Experience - Dan Wallis, Director of KP OnCall, Kaiser Permanente

Dan Wallis won the SuperNova Award for leading the implementation of Oracle Service Cloud to support a system that helps to automatically diagnose health conditions via the web. This project has allowed Kaiser Permanente to better serve their customers by reducing costs and wait times. Now Kaiser patients can accurately self-diagnose conditions without visiting a doctor or calling a nurse call center. 

Technology Optimization & Innovation - Dr. David Bray, Chief Information Officer, Federal Communications Commission

Dr. David Bray won the SuperNova Award for his overhaul of the FCC’s legacy Consumer Help Center. He implemented cloud-based Zendesk to modernize the Commission's Help Center. The Zendesk implementation helped the FCC replace 18 outdated complaint forms, activate 24/7 complaint tracking, and improve transparency. The cloud-based solution selected and implemented under Bray's leadership cost one-sixth as much as a custom-built, in-house solution, a savings of five-sixths.

Technology Optimization & Innovation - Erica Stevens, VP of Supply Chain and IT, Dylan's Candy Bar

Erica Stevens won the SuperNova Award for her implementation of NetSuite at Dylan’s Candy Bar. This implementation transformed Dylan’s into an omnichannel retailer. Dylan's is now able to scale and pivot its retail operations to meet the needs of its customers across many channels, including web, mobile, and brick and mortar.


Congratulations to the winners! Continue to be brave, innovative, and disruptive!


My opening remarks on privacy at Constellation Connected Enterprise 2015


A big part of my research agenda in the Digital Safety theme at Constellation is privacy. And what a vexed topic it is! It's hard to even know how to talk about privacy. For many years, folks have covered privacy in more or less academic terms, drawing on sociology, politics and pop psychology, joining privacy to human rights, and crafting various new legal models.

Meanwhile the data breaches get worse, and most businesses have just bumped along.

When you think about it, it's obvious really: there's no such thing as perfect privacy. The real question is not about 'fundamental human rights' versus business, but rather, how can we optimise a swarm of competing interests around the value of information?

Privacy is emerging as one of the most critical and strategic of our information assets. If we treat privacy as an asset, instead of a burden, businesses can start to cut through this tough topic.

But here's an urgent issue. A recent regulatory development means privacy may just stop a lot of business getting done. It's the European Court of Justice decision to shut down the US-EU Safe Harbor arrangement.

The privacy Safe Harbor was a work-around negotiated by the Federal Trade Commission, allowing companies to send personal data from Europe into the US.

But the Safe Harbor is no more. It's been ruled unlawful. So it's a big, big problem for European operations, many multinationals, and especially US cloud service providers.

At Constellation we've researched cloud geography and previously identified competitive opportunities for service providers to differentiate and compete on privacy. But now this is an urgent issue.

It's time American businesses stopped getting caught out by global privacy rulings. There shouldn't be too many surprises here, if you understand what data protection means internationally. Even the infamous "Right To Be Forgotten" ruling on Google's search engine - which strikes so many technologists as counterintuitive - was a rational and even predictable outcome of decades-old data privacy law.

The leading edge of privacy is all about Big Data. And we ain't seen nothin' yet!

Look at artificial intelligence, Watson Health, intelligent personal assistants, hackable cars, and the Internet of Everything where everything is instrumented, and you see information assets multiplying exponentially. Privacy is actually just one part of this. It's another dimension of information, one that can add value, but not in a neat linear way. The interplay of privacy, utility, usability, efficiency, efficacy, security, scalability and so on is incredibly complex.

The broader issue is Digital Safety: safety for your customers, and safety for your business.


Randstad Sourceright: Good Progress and the beginning of a balancing act


We had the opportunity to attend the Randstad Sourceright analyst summit this week. It took place in Atlanta and was well attended by the analyst community.

 

Take a look at my top 3 takeaways of the event:

 
 
If you don't have a chance to watch - here are the takeaways:
 
  • Talentradar debut - Randstad Sourceright showed the first deliverable on its product roadmap with Talentradar. It brings together recruiting information across the various systems involved, as well as Randstad Sourceright partners such as HireVue, SmashFly, etc. Technologies mentioned were Informatica and R, and we could see Domo being used for visualization. Randstad Sourceright has delivered a solid version one of the product; next we need to understand the roadmap and customer adoption.
     
  • Standardization - The whole outsourcing industry is recovering from a hangover of overly customized deals sold and implemented early in the millennium. The answer is standardization and leveraging global capabilities, and Randstad Sourceright is making good progress on both fronts. Discipline is key though, and it was reassuring to hear the North American sales leaders state that they would walk away from business that does not fall within the parameters of standard delivery.
     
  • RiseSmart - In a surprise move Randstad Sourceright acquired outplacement vendor RiseSmart (see here for vendor and here for press release). RiseSmart CEO Sathe was there and told the vendor's story - bringing software to the outplacement business. In a good move, Randstad Sourceright has decided to let RiseSmart operate independently.



 

    MyPOV

    It is good to see Randstad Sourceright growing and making progress standardizing and globalizing its product offerings. The acquisition of RiseSmart opens new revenue potential and the chance to disrupt the outplacement market. Equally it is good to see the product focus showing first deliverables with Talentradar. And the vendor is keeping tabs on a booming recruiting startup ecosystem with its Randstad Foundation Fund. 
     
    On the concern side, Randstad Sourceright will have to take into account that more resourcing decisions will be made by software, and more will be made by hiring managers directly rather than by recruiters. These changes are disruptive for Randstad Sourceright's customers and therefore for the vendor itself. Preparing and switching over in time will be the key challenge for executive management in the coming years. 
     
    But for now congrats on good progress, we will be keeping tabs, stay tuned. 

    --------------

    More on Recruiting


     
     

    • Musings - How Technology Innovation fuels Recruiting and disrupts the Laggards - read here
    • Musings - What is the future of recruiting? Read here
    • HRTech 2014 takeaways - read here.
    • Why all the attention to recruiting? Read here.
    Find more coverage on the Constellation Research website here and check out my magazine on Flipboard and my YouTube channel here


     

    Pivotal Now Makes It Easier Than Ever to Take Software from Idea to Production


     
    Today Pivotal used its upcoming European Cloud Foundry user conference to release a round-up press release on its overall progress… time to check in on where Cloud Foundry stands today.

     


    So let’s dissect the press release in our customary style:

     
    San Francisco, November 2, 2015 – Pivotal®, the company accelerating digital transformation for enterprises, today announced a new release of Pivotal Cloud Foundry, the comprehensive Cloud Native platform that unifies the software delivery process with an integrated application framework, platform runtime, and infrastructure automation. Pivotal Cloud Foundry now includes expanded support for Spring Cloud Services, Microsoft Azure, .NET applications, Docker images, and application lifecycle management. With these enhancements, Pivotal further enables businesses to rapidly build, deploy, and operate Cloud Native applications on a wide choice of hosted, public, and private clouds.

    MyPOV – A good summary to start the press release – hitting all the key new capabilities, which we will dissect and comment on below. But worth mentioning the ‘cloud native’ positioning here; it will be interesting to see if Pivotal can pull off that association between cloud native and its products.

     
    “The days of monolithic technologies are ending. Today’s modern enterprises practice agile software development with Cloud Native tools, process, and culture that can respond to speed of market and customer demand,” said James Watters, vice president and general manager, Cloud Platform Group, Pivotal. “Pivotal Cloud Foundry delivers a comprehensive Cloud Native application development and operations environment so you can spend time building business value instead of your IT infrastructure.”

    MyPOV – Good quote by Watters, hitting the right value proposition of Cloud Foundry, though the tool itself is also monolithic – in the sense of offering one way to build software.

     
    Integrated Microservices with Spring Cloud Services  
    Based on the popular Spring Cloud OSS, which is used by Netflix to operate its global, on-demand video streaming service, Spring Cloud Services for Pivotal Cloud Foundry goes one step further to provide opinionated provisioning and lifecycle management to these components.
    The first and only secure, enterprise-ready distribution of core Netflix OSS components, Spring Cloud Services enables developers and operators of Cloud Native distributed systems architectures to quickly and easily build microservices by adding a suite of production-ready services to the Pivotal Cloud Foundry marketplace. Spring Cloud Services allows developers to focus on delivering business value and defers the deployment and management of important distributed systems patterns such as application configuration, service discovery, and fault-tolerance to the Pivotal Cloud Foundry platform.

    MyPOV – Cloud Foundry needed a productivity framework to accelerate time to market for solutions built with the product. Nothing is a more natural fit than the venerable Spring framework, souped up with the Netflix OSS components. A good move for the product, but like all productivity tools, it comes at the cost of increased dependency. We expect Pivotal customers not to be too concerned with this dependency, but they should make that tradeoff knowingly.
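    For readers curious what "services in the marketplace" means in practice, here is a minimal cf CLI sketch. The service and plan names (p-service-registry, p-config-server, standard) and the app name are assumptions based on how Spring Cloud Services tiles were typically exposed, and may differ in your environment:

```shell
# List the marketplace to confirm the Spring Cloud Services offerings are present
cf marketplace

# Provision a service registry and a config server (names are illustrative)
cf create-service p-service-registry standard my-registry
cf create-service p-config-server standard my-config

# Bind them to an application so it can use service discovery and central config,
# then restage so the bindings take effect
cf bind-service my-app my-registry
cf bind-service my-app my-config
cf restage my-app
```

    The point of the pattern is that the platform, not the application team, provisions and operates the Eureka-style registry and config server.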

     
    Native Support for .NET Applications 
    Thanks to the next-generation runtime shipping in this latest release, .NET applications can now run on Pivotal Cloud Foundry. With this expanded support for .NET, enterprises can support a heterogeneous environment consisting of both Linux-based and Windows-based applications. .NET applications will run natively on Windows Server 2012 R2 Hyper-V virtual machines, and Pivotal Cloud Foundry can manage applications with the same commands and many of the same consistent Day 2 operational benefits as existing applications.

    MyPOV – This is a key move by Microsoft and Pivotal: it spares developers of .NET applications from having to rebuild those apps as their first order of business. Instead, Microsoft and now Pivotal give developers the opportunity to operate these older .NET applications alongside the next-generation applications they want to build (and the vendors want them to build). Lastly, it is the ultimate proof point of investment protection for .NET applications, a promise Microsoft made over a decade ago and is honoring today.
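    As a rough illustration of "the same commands" claim – assuming the Windows stack and buildpack names from this era of Pivotal Cloud Foundry (windows2012R2, binary_buildpack), which may differ in your environment – deploying a compiled .NET app could look like:

```shell
# Push a compiled .NET application to a Windows cell
# (stack and buildpack names are assumptions, not confirmed by the release notes)
cf push my-dotnet-app -s windows2012R2 -b binary_buildpack

# The app is then managed with the same commands as Linux-hosted apps
cf scale my-dotnet-app -i 3
cf logs my-dotnet-app --recent
```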

     
    Native Support for Docker Images 
    Docker applications can now leverage the built in Pivotal Cloud Foundry platform capabilities, such as scheduling, health management, load balancing, enterprise identity, logging, and multi-cloud support. Now in beta, native Docker image support is made possible by the new elastic runtime and makes Pivotal Cloud Foundry the most advanced container management system on the market today. Customers can deploy applications to Pivotal Cloud Foundry based on Docker images from public, secure registries such as Docker Hub.

    MyPOV – A good move to provide better support for Docker, and for the way enterprises want to build, operate and consume microservices – in a secure, repeatable and reliable way. Registry integration is the capability in demand, and it is good to see Pivotal providing it.
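    A sketch of what deploying from a Docker Hub image could look like with the cf CLI. The image and app names are illustrative, and since the feature was in beta at the time, the exact flags may have varied:

```shell
# An administrator first enables Docker support on the platform (beta at the time)
cf enable-feature-flag diego_docker

# Push an app directly from a public Docker registry image
# instead of from local source code and a buildpack
cf push my-docker-app -o myorg/my-image
```

    Once pushed, the container gets the same scheduling, health management and logging as buildpack-built apps.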

     
    Application Lifecycle Management Toolchain
    Delivering on Pivotal’s vision of comprehensive Cloud Native application lifecycle management, the company is partnering with GitLab, CloudBees, and JFrog to deliver a turnkey continuous integration and continuous delivery (CI/CD) solution.
    Building upon the popular software project management tool, Pivotal Tracker, customers can integrate platform-managed versions of GitLab source code repository, CloudBees Jenkins continuous integration, and JFrog Artifactory binary artifact management. By providing the building blocks of a modern application delivery toolchain, Pivotal Cloud Foundry empowers software organizations to build and deploy microservices and Cloud Native applications with confidence and speed.

    MyPOV – It is good to see Pivotal acknowledging other popular development tools such as the ones mentioned, now integrated with Pivotal Tracker. Next would be a roadmap / sharing of plans for other popular adjacent tools. For now, congrats to the three who made it – GitLab, CloudBees and JFrog.

     
    Early Access Support for Microsoft Azure
    Pivotal Cloud Foundry extends its Cloud Native platform with early access support for Microsoft Azure, adding to the already-supported Amazon Web Services® (AWS), VMware vSphere®, VMware vCloud Air®, and OpenStack®. With Pivotal Cloud Foundry, customers can deploy and manage Cloud Native applications on almost any infrastructure; without the operational cost and complexity of maintaining their own underlying cloud infrastructure. [..]

    MyPOV – Good to see Pivotal extending deployment options, as previously indicated – now adding support to Microsoft Azure. A good move for CloudFoundry users, who get more deployment options for their projects.

     

    Overall MyPOV

    Pivotal is making good progress with Cloud Foundry, creating more value and synergies for customers and prospects. It further solidifies Cloud Foundry's position as the leading enterprise PaaS. With Microsoft Azure support and access to .NET applications, Microsoft further acknowledges the position of Cloud Foundry, bringing core Microsoft .NET assets to the Cloud Foundry platform.

    On the concern side – with success comes responsibility. Pivotal needs to deliver these capabilities, ensure customer success, and remain a reliable partner for its growing ecosystem. There is no indication that Pivotal cannot deliver, but the task ahead is not trivial. Starting to create, communicate and deliver to roadmaps will be the first step.

    But for now it’s good to be a Pivotal customer and prospect. 


    More on Pivotal

     
    • News Analysis - Pivotal makes CloudFoundry more about multi-cloud - read here
    • News Analysis - Pivotal pivots to OpenSource and Hortonworks - Or: OpenSource keeps winning - read here

    More on Next Generation Applications:

     
     
    • Progress Report - Cloudera is all in with Hadoop - now off to verticals - read here
    • First Take - SAP Cloud for Planning - The next spreadsheet killer is off to a good start - read here
    • Market Move - Oracle buys Datalogix - moves into DaaS - read here
    • News Analysis - SAP commits to CloudFoundry and OpenStack - Key Steps - but what is the direction? Read here
    • Event Report - MongoDB is a showcase for the power of Open Source in the enterprise - read here
    • Musings - A manifesto: What are 'true' analytics? Read here
    • Future of Work - One Spreadsheet at the time - Informatica Springbok - read here
    • Musings - The Era of the no-design Database - Read here
    • Mendix - the other path to build software - read here
    • Musings - Time to ditch your datawarehouse .... - Read here
    Find more coverage on the Constellation Research website here and check out my magazine on Flipboard and my YouTube channel here



     

    Oracle Open World 2015: Three Important Cloud Services


    Oracle Open World 2015 announcements included three data-related standouts: Oracle Cloud Platform For Big Data, Oracle Data Visualization Cloud Service, and Oracle Data Cloud. Here’s a deeper dive on what stands out.

    Our cloud is integrated, it’s ready, and it’s bigger than and superior to any rival cloud. That was the big-picture message Oracle offered customers at Oracle Open World 2015. What’s more, the cloud is where most customers will soon be headed, said Oracle CEO Mark Hurd, predicting that “virtually all” enterprise data would be stored in the cloud by 2025.

    It would be impossible to detail all two-dozen-plus announcements made last week, so I’ll narrow things to my Data-to-Decisions (D2D) research domain and focus on three announcements that caught my eye: The Oracle Cloud Platform For Big Data, the Data Visualization Cloud Service, and the Oracle Data Cloud. With all three, Oracle is trying to differentiate itself in the cloud.

    Oracle Cloud Platform For Big Data

    Oracle’s Hadoop-in-the-cloud Big Data Cloud Service was announced at Oracle Open World 2014. At Oracle Open World 2015 the company reannounced a family of supporting services newly packaged as the Oracle Cloud Platform for Big Data. The components include Oracle Big Data Preparation Cloud Service, Oracle GoldenGate Cloud Service, Oracle Big Data Discovery Cloud Service and Oracle NoSQL Database Cloud Service. The idea is to surround the Hadoop service with a breadth of data-prep, data-movement and data-analysis options.


    The Oracle Cloud Platform for Big Data combines platforms (database, Hadoop, NoSQL) and analysis options.

    I say “reannounce” because Oracle talked about all of these services back in June. The new “Cloud Platform for Big Data” is a new brand. The Oracle Big Data Preparation service is aimed at preparing and enriching semi-structured and unstructured data, such as clickstreams and social data. Under the hood it makes use of Apache Spark machine learning, YAGO entity resolution training sets and Apache UIMA natural language processing.

    The GoldenGate Cloud service is based on Oracle’s well-known data-replication software. This service is designed to pump data in near real time into Oracle Database Cloud Service, Exadata Cloud Service, Hive, Hbase, HDFS, Flume and Kafka (in this case, in the cloud). It’s a complement to low-latency, data-streaming applications, such as those in IoT scenarios.

    The Oracle Big Data Discovery Cloud Service is a business-analyst-oriented tool for exploring, transforming and discovering data within Hadoop (again, in this case running in Oracle’s cloud). Data Discovery first samples, profiles and catalogs all available data. Machine-learning algorithms then surface interesting correlations and offer suggested visualizations for exploring attributes. Search and guided navigation features support data exploration.

    Filling out Oracle’s Cloud Platform is the NoSQL Database Cloud service, which is Oracle’s high-scale key-value store database delivered as a service.

    MyPOV: Not to be forgotten is the Oracle Big Data SQL Cloud Service, which does SQL querying across Oracle Database Cloud services, the Big Data Cloud Service and the NoSQL Database Cloud Service. Taken together, it’s a broad (if Oracle-centric) collection. IBM has a broader collection of IBM and open-source-based services on BlueMix, and Amazon Web Services has more customers using its cloud. But Oracle is building out an impressive portfolio, and the company’s dominant database position will surely feed cloud growth.

    Oracle Data Visualization Cloud Service

    Nearly every BI vendor has introduced a data-visualization module in recent years in response to the fast growth of Tableau Software. Oracle has evolved what it offers for the cloud with the Oracle Data Visualization Cloud Service. Set to become generally available in November, this new stand-alone service is based on capabilities first seen in the Oracle Business Intelligence Cloud Service introduced in April.


    The Oracle Data Visualization Cloud Service offers 18 charting options and a palette of colors and shapes for depicting data.

    The Oracle Data Visualization Cloud Service will enable you to link to on-premises and cloud data sources (both from Oracle and third parties) as well as your own spreadsheets. There are 18 different types of visualizations and a palette of colors, shapes and sizes for depicting data points.

    Execs at Oracle Open World made a point of saying “all you need is a browser.” That’s because with Tableau’s cloud service, Tableau Online, users author charts and dashboards with the desktop client and then publish to the cloud for collaboration. Tableau is working on bringing full authoring capabilities to the cloud. And, indeed, Oracle is working on a desktop client for times when you need to work offline.

    MyPOV: Oracle execs made claims about its new service being “more modern than Tableau” at Oracle Open World. That starts with full authoring capabilities in the cloud, but I’m not seeing some of the other differences claimed. The press release says Oracle’s service “eliminates the complexity typically associated with blending and correlating data sets,” but Tableau also automatically finds joins when mashing up data sets. Both products also select best-fit visualizations automatically based on the dimensions of data used in an analysis. This auto-charting capability has been around for a while, and it’s also present in SAS Visual Analytics and IBM Watson Analytics.

    For a real head-to-head comparison with Tableau, I want to investigate Oracle’s performance characteristics and its connection capabilities (once this service is available). Tableau’s strengths include its in-memory engine and its live-data-connection capabilities with multiple databases, apps and cloud services, including multiple connection options with Amazon, Google, IBM, Microsoft, Oracle, Salesforce, SAP and others. Will Oracle match that? I also want to tour the “fit and finish” of the visualizations and “storytelling” capabilities. Some of the charts seen at Oracle Open World looked hard to read, but that may be due to the data-filtering and presentation inexperience of the demonstrators.

    Yet-to-be released pricing details from Oracle will also be key to any comparison, but to me, these visualization capabilities are most attractive when teamed with the Oracle BI Cloud Service. That’s because it’s not only a data-exploration and visualization service; you also get the database and reporting functionality. Here, too, the more Oracle-centric you are in the on-premises world, the more attractive the cloud options will be.

    Oracle Data Cloud

    Several new features of the Oracle Data Cloud were announced at Oracle Open World, but a larger context emerged last week with IBM’s announced intent to acquire The Weather Company. Thus, I was eager to learn more about Oracle as a data provider. Oracle Data Cloud is built on technology, data and analytics expertise picked up in the BlueKai and Datalogix acquisitions. Talking to execs from both companies now leading Oracle Data Cloud, I came away impressed.


    Oracle Data Cloud offers data from more than 1,500 specialty retailers and 30 supermarket loyalty cards.

    Oracle Data Cloud offers data from more than 1,500 CPG and specialty retailers across 110 million US households. With data-enrichment and predictive analytics options on top of this data, Oracle can find likely buyers by product and category.

    MyPOV: Having data and being able to enrich that data and apply predictive analytics is the name of the game in marketing, and these initiatives are moving into the sales and service arenas as well. In the business-to-business arena, Oracle Data Cloud can enrich your data with Dun & Bradstreet information to find look-alikes of your best customers. A next step is bringing service data full circle back into your understanding of customers to drive efforts such as retention campaigns.

    Many tech vendors are introducing libraries of third-party data that are integrated with their offerings. But big guns like IBM and Oracle are stepping up to become primary data providers. Expect to see data from outside of your organization becoming a bigger and bigger part of your future success.



    Event Report - Oracle Openworld 2015 - Top 3 Takeaways, Top 3 Positives & Concerns


    We had the opportunity to attend Oracle OpenWorld in San Francisco this week. The conference was again the usual spread-out affair, with JavaOne at the Hilton, the HR events at the Palace Hotel, and so on. Official attendance was 60k+; it felt about the same as last year.


     
    So take a peek at the video. If you can't watch it, here is the gist:
    Top 3 Takeaways
     
    • IaaS is here - Something was missing from the Oracle Cloud architecture, and that was IaaS. As Ellison candidly shared, Oracle built SaaS, realized it needed PaaS, and in building PaaS realized it needed IaaS. So Oracle found its way to the cloud top down, with the important auto-scaling feature coming in two weeks / later this year. Pricing is attractive: as Ellison put it, Oracle dedicated instances will cost 50% of AWS flexible instances, and storage will be 1/10 the cost of AWS S3. So cost will not be an argument against moving to Oracle's cloud.
       
    • More multitenancy, please - Oracle introduced multitenancy at the database level two years ago with Oracle 12c; this year it showed the transfer of a database container from data center A to data center B while writes continued against the database. And it introduced the largest extension to WebLogic yet, making JVMs multitenant - a key reliability and flexibility addition for running Java applications (needless to say, Oracle announced Docker support, too).
       
    • SCM closes the suite - SCM was the holdout in the Oracle Cloud Suite, and Oracle announced key new additions and products to address this gap. Coupled with the newly announced e-commerce products, this announcement, together with the progress of the rest of the SaaS products, makes the Oracle offering likely the most complete suite in the cloud. Or on premises, as Oracle keeps supporting the duality of deployment.
     
    Top 3 Positives
     
    • The chip-to-click stack becomes more real - With auto-scaling, the Oracle integrated tech stack learns a key trick needed to become a cloud infrastructure stack in terms of operational TCO. This is also good news for on-premises customers, who can run workloads more elastically.
       
    • SaaS suite gets complete - With key SCM functionality and a new e-commerce suite, Oracle addresses both gaps and good housekeeping in its SaaS suite, which is now complete in terms of all major functional areas.
       
    • Differentiation sprinkles - No other large application vendor talks as much about DaaS, or has as clear a DaaS vision, as Oracle. The analytical models are sane, too, as a quick conversation with the Datalogix team confirmed. And Oracle keeps creating value for the citizen developer and citizen integrator, allowing business end users to create mobile and web applications and integrations.

      Top 3 Concerns

      • Will it all work? - Oracle has likely undertaken the largest engineering project ever staffed entirely by developers with the same logo on their paychecks. It all has to work seamlessly together, and Oracle has had a checkered quality record in the past. To be fair, quality issues have made far fewer headlines for the vendor in the recent past.
         
      • Can Oracle sell it? - The challenge now moves from engineering to go-to-market: sales through direct and indirect channels. Oracle needs to onboard thousands of partners in order to maintain the relevance it has today.
         
      • A relationship test - Oracle customers typically hold less affection for their vendor than customers of most other vendors in the market. That relationship needs to improve for Oracle to succeed as a service provider, where renewals are frequent and regular - very different from the perpetual license model.
         

      MyPOV

      Oracle keeps executing on its vision of being the 'IBM of the 21st century' - the single stop for everything an enterprise needs, on premises and in the cloud. Cloud viability has been notched up by significant degrees with the product progress shared at this OpenWorld. That is good for customers, who will get stronger and richer products. It is clear that horizontal integration inside the layers of the technology stack (e.g. a complete SaaS suite, a powerful PaaS platform) is desirable for customers. How many layers of vertical integration customers want is less certain, and that will be the interesting story to watch: we have never had this many layers to deploy hardware and software to, and the once-dominant vertically integrated IBM stack fell apart in the 1970s. Exciting times ahead; we will be watching and analyzing, stay tuned.

      ---------------


       
      I compiled a short presentation discussing the first 22 press releases of this OpenWorld - take a look:

      No time to watch? Check out the presentation below:
       
      Oracle OpenWorld - A quick take on all 22 press releases of Day #1 - #3 from Holger Mueller



      More on Oracle OpenWorld:
      • News Analysis - Quick Take on all 22 press releases of Oracle OpenWorld Day #1 - #3 - read here
      • First Take - Oracle OpenWorld - Day 1 Keynote - Top 3 Takeaways - read here
      • Event Preview - Oracle Openworld - watch here

      Future of Work / HCM / SaaS research:
      • Event Report - Oracle HCM World - Full Steam ahead, a Learning surprise and potential growth challenges - read here
      • First Take - Oracle HCM World Day #1 Keynote - off to a good start - read here
      • Progress Report - Oracle HCM gathers momentum - now it needs to build on that - read here
      • Oracle pushes modern HR - there is more than technology - read here. (Takeaways from the recent HCMWorld conference).
      • Why Applications Unlimited is a good strategy for Oracle customers and Oracle - read here.

      Also worth a look for the full picture
      • Event Report - Oracle PaaS Event - 6 PaaS Services become available, many more announced - read here
      • Progress Report - Oracle Cloud makes progress - but key work remains in the cellar - read here
      • News Analysis - Oracle discovers the power of the two socket server - or: A pivot that wasn't one - TCO still rules - read here
      • Market Move - Oracle buys Datalogix - moves more into DaaS - read here
      • Event Report - Oracle Openworld - Oracle's vision and remaining work become clear - they are both big - read here
      • Constellation Research Video Takeaways of Oracle Openworld 2014 - watch here
      • Is it all coming together for Oracle in 2014? Read here
      • From the fences - Oracle AR Meeting takeaways - read here (this was the last analyst meeting in spring 2013)
      • Takeaways from Oracle CloudWorld LA - read here (this was one of the first cloud world events overall, in January 2013)

      And if you want to read more of my findings on Oracle technology - I suggest:
      • Progress Report - Good cloud progress at Oracle and a two step program - read here.
      • Oracle integrates products to create its Foundation for Cloud Applications - read here.
      • Java grows up to the enterprise - read here.
      • 1st take - Oracle in memory option for its database - very organic - read here.
      • Oracle 12c makes the database elastic - read here.
      • How the cloud can make the unlikeliest bedfellows - read here.
      • Act I - Oracle and Microsoft partner for the cloud - read here.
      • Act II - The cloud changes everything - Oracle and Salesforce.com - read here.
      • Act III - The cloud changes everything - Oracle and Netsuite with a touch of Deloitte - read here

      Finally, find more coverage on the Constellation Research website here, and check out my magazine on Flipboard and my YouTube channel here.

      Who buys Bitcoin for Identity?


      You'll have to forgive the deliberate inaccuracy in the title, but I just couldn't resist the wordplay. The topic of this blog is the use of the blockchain for identity, and not exactly Bitcoin, which I appreciate is not the same thing. By my facetiousness and by my analysis, you'll see I don't yet take the identity use case seriously.

      In 2009, Bitcoin was launched. A paper had been self-published by a person or persons going by the nom de plume Satoshi Nakamoto, called "Bitcoin: A Peer-to-Peer Electronic Cash System", and soon after an open source software base appeared at http://www.bitcoin.org. Bitcoin offered a novel solution to the core problem in electronic cash: how to prevent double spending without reverting to a central authority. Nakamoto's conception is strongly anti-authoritarian, almost anarchic, with an absolute rejection of fiat currency, reserve banks and other central institutions. Bitcoin and its kin aim to change the world, and by loosening the monopolies in traditional finance, they may well do that.

      Separate to that, the core cryptographic technology in Bitcoin is novel, and so surprising, it's almost magical. Add to that spell the promises of "security" and "anonymity", and we have a powerful mix that some people see stretching far beyond mere money, and into identity. So is that a reasonable step?

      Bitcoin’s secret sauce

      A decentralised digital currency scheme requires some sort of community-wide agreement about when someone spends a virtual coin, so she cannot spend it again. Bitcoin’s trick is to register every single transaction on one public tamper-proof ledger called the blockchain, which is refreshed in such a way that the whole community in effect votes on the order in which transactions are added or, equivalently, the time when each coin is spent.
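The double-spend rule that the community-wide vote enforces can be reduced to a toy sketch. This is illustrative only, not the Bitcoin protocol: Bitcoin's real contribution is reaching this agreement *without* any central `Ledger` object, and the class and names below are invented for the example.

```python
# Illustrative sketch only - not the Bitcoin protocol. The rule the
# community's ledger enforces: a coin identifier, once spent, can never
# be spent again.
class Ledger:
    def __init__(self):
        self.spent = set()     # coin ids already consumed
        self.entries = []      # append-only, public transaction history

    def transfer(self, coin_id, sender, receiver):
        if coin_id in self.spent:
            return False       # double spend: rejected by the ledger
        self.spent.add(coin_id)
        self.entries.append((coin_id, sender, receiver))
        return True

ledger = Ledger()
assert ledger.transfer("coin-1", "alice", "bob") is True
assert ledger.transfer("coin-1", "alice", "carol") is False  # second spend refused
```

What the blockchain adds on top of this rule is a decentralised way to agree on the *order* of entries, so no single party controls the ledger.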

      No proof of identity or KYC check is needed to register a Bitcoin account; currency – denominated "BTC" – may be transferred freely to any other account. Hence Bitcoin may be called anonymous, although the unique account identifiers are set in stone, providing an indelible money trail that has been the undoing of many criminal Bitcoin users.

      The continuous arbitration of blockchain entries is done by a peer-to-peer network of servers that race each other to double-check a special hash value for the latest refreshed chain. The particular server that wins each race is rewarded for its effort with some Bitcoin. The ongoing background computation that keeps a network like this honest is referred to technically as "Proof of Work" and since there is a monetary reward for helping run the BTC network, the servers are colloquially called miners.
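The race the miners run is hashcash-style proof-of-work. A vastly simplified sketch follows; the string format and difficulty measure are invented for illustration, whereas real Bitcoin double-SHA-256-hashes a block header and compares the result against a numeric target.

```python
import hashlib

# Toy proof-of-work: search for a nonce so that the hash of the block
# data plus nonce starts with a required number of zero hex digits.
# The "work" is the brute-force search; checking a winner is cheap.
def mine(block_data: str, difficulty: int = 4) -> int:
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}:{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):   # enough leading zeros: race won
            return nonce
        nonce += 1

nonce = mine("example-block", difficulty=4)
digest = hashlib.sha256(f"example-block:{nonce}".encode()).hexdigest()
assert digest.startswith("0000")
```

Raising `difficulty` by one hex digit multiplies the expected search effort by 16, which is how such a network throttles the rate of new blocks.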

      Whether or not Bitcoin lasts as a form of electronic cash, there is a groundswell of enthusiasm for the blockchain as a new type of distributed ledger technology (DLT) for a much broader range of transactions, including “identity”. The shudder quotes are deliberate on my part, reflecting that the blockchain-for-identity speculations have not been clear about what part of the identity puzzle they might solve.

      For identity applications, the reality of Bitcoin mining creates some particular challenges which I will return to. But first let’s look at the positive influence of Bitcoin and then review some of its cryptographic building blocks.

      Bitcoin inspirations

      Bitcoin solves what was thought to be an unsolvable problem - double spending of electronic cash. It's the latest example of a wondrous pattern in applied maths. Unsolvable problems are, in fact, solved quite often, after which frenetic periods of innovation can follow. The first surprise or prototype solution is typically inefficient but it can inspire fresh thinking and lead to more polished methods.

      One of the greatest examples is Merkle’s Puzzles, a theoretical method invented by Ralph Merkle in 1974 for establishing a shared secret number between two parties who need only exchange public pieces of data. This was the holy grail for cryptography, for it meant that a secret key could be set up without having to carry the secret from one correspondent to the other (after all, if you can securely transfer a key across a long distance, you can do the same with your secret message and thus avoid the hassle of encryption altogether). Without going into detail, Merkle’s solution could not be used in the real world, but it solved what was thought to be an unsolvable problem. In quick succession, practical algorithms followed from Diffie & Hellman, and Rivest, Shamir & Adleman (the names behind “RSA”) and thus was born public key cryptography.
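The Diffie-Hellman key agreement that followed Merkle's Puzzles can be shown end to end in a few lines. The parameters below are textbook-sized and far too small for real use; real deployments use large primes or elliptic curves.

```python
import secrets

# Toy Diffie-Hellman: both parties exchange only public values, yet
# arrive at the same shared secret - the holy grail described above.
p, g = 23, 5                       # tiny textbook modulus and generator

a = secrets.randbelow(p - 2) + 1   # Alice's private exponent, never sent
b = secrets.randbelow(p - 2) + 1   # Bob's private exponent, never sent

A = pow(g, a, p)                   # Alice's public value, sent in the clear
B = pow(g, b, p)                   # Bob's public value, sent in the clear

# Each side combines its own private value with the other's public value.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob  # same secret, though no secret was transmitted
```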

      Bitcoin has spurred dozens of new digital currencies, with different approaches to ledgers and arbitration, and different ambitions too (including Ripple, Ethereum, Litecoin, Dogecoin, and Colored Coins). They all promise to break the monopoly that banks have on payments, radically cut costs and settlement delays, and make electronic money more accessible to the unbanked of the world. These are what we might call liquidity advantages of digital currencies. These objectives (plus the more political promises of ending fiat currency and rendering electronic cash transactions anonymous or untraceable) are certainly all important but they are not my concern in this blog.

      Bitcoin's public sauce

      Before looking at identity, let's review some of the security features of the blockchain. We will see that safekeeping of each account holder's private keys is paramount - as it is with all Internet payments systems and PKIs.

      While the blockchain is novel, many elements of Bitcoin come from standard public key cryptography and will be familiar to anyone in security. What's called a Bitcoin "address" (the identifier of someone you will send currency to) is actually a public key. To send any Bitcoin money from your own address, you use the matching private key to sign a data object, which is sent into the network to be processed and ultimately added to the blockchain.

      The only authoritative record of anyone's Bitcoin balance is held on the blockchain. Account holders typically operate a wallet application which shows their balance and lets them spend it, but, counter-intuitively, the wallet holds no money. All it does is control a private key (and provide a user experience of the definitive blockchain). The only way you have to spend your balance (that is, transfer part of it to another account address) is to use your private key. What follows from this is an unforgiving reality of Bitcoin: your private key is everything. If a private key is lost or destroyed, then the balance associated with that key is frozen forever and cannot be spent. And thus there has been a string of notorious mishaps where computers or disk drives holding Bitcoin wallets have been lost, together with millions of dollars of value they controlled. Furthermore, numerous pieces of malware have - predictably - been developed to steal Bitcoin private keys from regular storage devices (and law enforcement agencies have intercepted suspects' private keys in the battle against criminal use of Bitcoin).
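The point that "your private key is everything" can be illustrated with a toy signing example. Note the substitution: Bitcoin actually uses ECDSA over the secp256k1 curve; textbook RSA appears here only because it fits in a few lines of standard-library Python, and the parameters are hopelessly small, for illustration only.

```python
import hashlib

# Toy RSA signing sketch. Only the holder of the private exponent d can
# produce a valid signature over a spend, so losing the key freezes the
# balance and leaking it loses the funds.
p, q = 61, 53
n = p * q                          # public modulus (3233)
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent - "everything"

def sign(message: bytes) -> int:
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)            # requires the private exponent d

def verify(message: bytes, signature: int) -> bool:
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h

tx = b"transfer 0.5 BTC to Bob"
sig = sign(tx)
assert verify(tx, sig)                 # the key holder can spend
assert not verify(tx, (sig + 1) % n)   # a tampered signature fails
```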

      You would expect the importance of Bitcoin private key storage to have been obvious from the start, to ward off malware and destruction, and to allow for reliable backup. But it was surprisingly late in the piece that "hardware wallets" emerged, the best known of which is probably now the Trezor, released in 2013. The use of hardware security modules for private key management in soft wallets or hybrid wallets has been notably ad hoc. It appears crypto currency proponents pay more attention to the algorithms and the theory than to practical cryptographic engineering.

      Identifying with the blockchain

      The enthusiasm for crypto currency innovation has proven infectious, and many commentators have promoted the blockchain in particular as something special for identity management. A number of start-ups are "providing" identity on the blockchain - including OneName, and ShoCard - although on closer inspection what this usually means is nothing more than reserving a unique blockchain identifier with a self-claimed pseudonym.

      Prominent financial services blogger Chris Skinner says "the blockchain will radically alter our futures" and envisages an Internet of Things where your appliances are "recorded [on the blockchain] as being yours using your digital identity token (probably a biometric or something similar)". And the government of Honduras has announced that American Bitcoin technology firm Factom will build a blockchain-based land title registry, which they claim will be "immutable", resistant to insider fraud, and extensible to "more secure mortgages, contracts, and mineral rights".  Interestingly, the Factom-Honduras project stalled for the second half of 2015.  I find it emblematic of the whole blockchain craze that one of the most popular use cases for decentralized ledger technology is little more than a press release.

      While blockchain aficionados have been quick to make a leap to identity, the opposite is not the case. The identerati haven't had much to say about blockchain at all. Ping Identity CTO Patrick Harding mentioned it in his keynote address at the 2015 Cloud Identity Summit, and got a meek response from the audience when he asked who knew what blockchain is (I was there). Harding's suggestions were modest, exploratory and cautious. And only now has blockchain figured prominently in the twice-yearly freeform Internet Identity Workshop unconference in Silicon Valley. I'm afraid it's telling that all the initial enthusiasm for blockchain "solving" identity has come from non identity professionals.

      What identity management problem would be solved by using the blockchain?

      The most prominent challenges in digital identity include the following:

      • account creation including validation of identity or other attributes
      • the cost and inconvenience of multiple account registrations
      • the inconvenience and insecurity of multiple usernames and passwords
      • identity theft and account takeover
      • interoperability of identity data or attributes between services and applications
      • provenance of attributes.

      What does the blockchain have to offer?

      Certainly, pseudonymity is important in some settings, but is rare in economically important personal business, and in any case is not unique to the blockchain. The secure recording of transactions is very important, but that’s well-solved by regular digital signatures (which remain cryptographically verifiable essentially for all time, given the digital certificate chain). Most important identity transactions are pretty private, so recording them all in a single public register instead of separate business-specific databases is not an obvious thing to do.

      The special thing about the blockchain and the proof-of-work is that they prevent double-spending. I’ve yet to see a blockchain-for-identity proposal that explains what the equivalent “double identify” problem really is and how it needs solving. And if there is such a thing, the price to fix it is to record all identity transactions in public forever.

      The central user action in all blockchain applications is to “send” something to another address on the blockchain. This action is precisely a digital (asymmetric cryptographic) signature, essentially the same as any conventional digital signature, created by hashing a data object and encrypting it with one’s private key. The integrity and permanence of the action comes from the signature itself; it is immaterial where the signature is stored.

      What the blockchain does is prevent a user from performing the same action more than once, by using the network to arbitrate the order in which digital signatures are created. In regular identity matters, this objective simply doesn’t arise. The primitive actions in authentication are to leave one’s unique identifying mark (or signature) on a persistent transaction, or to present one’s identity in real time to a service. Apart from peer-to-peer arbitration of order, the blockchain is just a public ledger - and a rather slow one at that. Many accounts of blockchain uses beyond payments simply speak of its inviolability or perpetuity. In truth, any old system of digitally signed database entries is reasonably inviolable. Tamper resistance and integrity come from the digital signatures, not the blockchain. And as mentioned, the blockchain itself doesn't provide any assurance of who really did what - for that we need separate safeguards on users' private keys, plus reliable registration of users and their relevant attributes (which incidentally cannot be done without some authority, unless self-attestation is good enough).
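The claim that "any old system of digitally signed database entries is reasonably inviolable" can be sketched as a hash-chained, authenticated append-only log. An HMAC key stands in for a real asymmetric signing key to keep the sketch stdlib-only; the key, function names and entries are invented for illustration.

```python
import hashlib
import hmac

# Tamper evidence comes from the chained MACs over each entry plus its
# predecessor - no peer-to-peer blockchain required.
SIGNING_KEY = b"demo-signing-key"   # hypothetical key, illustration only

def append(log, entry: bytes):
    prev = log[-1][2] if log else b"genesis"
    mac = hmac.new(SIGNING_KEY, prev + entry, hashlib.sha256).digest()
    log.append((entry, prev, mac))

def verify_log(log) -> bool:
    prev = b"genesis"
    for entry, recorded_prev, mac in log:
        expected = hmac.new(SIGNING_KEY, prev + entry, hashlib.sha256).digest()
        if recorded_prev != prev or not hmac.compare_digest(mac, expected):
            return False
        prev = mac
    return True

log = []
append(log, b"alice registered attribute X")
append(log, b"alice presented credential Y")
assert verify_log(log) is True
log[0] = (b"mallory registered attribute X", log[0][1], log[0][2])
assert verify_log(log) is False   # any rewrite breaks the chain
```

What this sketch cannot do, of course, is arbitrate the order of entries among mutually distrusting parties; that, and only that, is the blockchain's distinctive trick.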

      In addition to not offering much advantage in identity management, there are at least two practical downsides to recording non Bitcoin activity on the blockchain, both related to the proof-of-work. The peer-to-peer resolution of the order of transactions takes time. With Bitcoin, the delay is 10 minutes; that’s the time taken for an agreed new version of the blockchain to be distilled after each transaction. Clearly, in real time access control use cases, when you need to know who someone is right away, such delay is unacceptable. The other issue is cost. Proof-of-work, as the name is meant to imply, consumes real resources, and elicits a real reward.

      So for arbitrary identity transactions, what are the economics of using the blockchain? Who would pay, who would be paid, and what market forces would price identity in this utopia where all accounts are equal?


      News Analysis - Quick Take on all 22 press releases of Oracle OpenWorld Day #1 - #3


      Oracle has unleashed the usual barrage of press releases at OpenWorld, happening right now in San Francisco.
       

      I compiled a short presentation with one slide and MyPOV for each press release - take a look:

      If you don't have time to watch, I posted the presentation to Slideshare here:
       
       
      Oracle OpenWorld - A quick take on all 22 press releases of Day #1 - #3 from Holger Mueller
       
      More on Oracle:
       
      • First Take - Oracle OpenWorld - Day 1 Keynote - Top 3 Takeaways - read here
      • Event Preview - Oracle OpenWorld - watch here


      Future of Work / HCM / SaaS research:
      • Event Report - Oracle HCM World - Full Steam ahead, a Learning surprise and potential growth challenges - read here
      • First Take - Oracle HCM World Day #1 Keynote - off to a good start - read here
      • Progress Report - Oracle HCM gathers momentum - now it needs to build on that - read here
      • Oracle pushes modern HR - there is more than technology - read here. (Takeaways from the recent HCMWorld conference).
      • Why Applications Unlimited is a good strategy for Oracle customers and Oracle - read here.

      Also worth a look for the full picture
      • Event Report - Oracle PaaS Event - 6 PaaS Services become available, many more announced - read here
      • Progress Report - Oracle Cloud makes progress - but key work remains in the cellar - read here
      • News Analysis - Oracle discovers the power of the two socket server - or: A pivot that wasn't one - TCO still rules - read here
      • Market Move - Oracle buys Datalogix - moves more into DaaS - read here
      • Event Report - Oracle Openworld - Oracle's vision and remaining work become clear - they are both big - read here
      • Constellation Research Video Takeaways of Oracle Openworld 2014 - watch here
      • Is it all coming together for Oracle in 2014? Read here
      • From the fences - Oracle AR Meeting takeaways - read here (this was the last analyst meeting in spring 2013)
      • Takeaways from Oracle CloudWorld LA - read here (this was one of the first cloud world events overall, in January 2013)

      And if you want to read more of my findings on Oracle technology - I suggest:
      • Progress Report - Good cloud progress at Oracle and a two step program - read here.
      • Oracle integrates products to create its Foundation for Cloud Applications - read here.
      • Java grows up to the enterprise - read here.
      • 1st take - Oracle in memory option for its database - very organic - read here.
      • Oracle 12c makes the database elastic - read here.
      • How the cloud can make the unlikeliest bedfellows - read here.
      • Act I - Oracle and Microsoft partner for the cloud - read here.
      • Act II - The cloud changes everything - Oracle and Salesforce.com - read here.
      • Act III - The cloud changes everything - Oracle and Netsuite with a touch of Deloitte - read here

      Find more coverage on the Constellation Research website here, and check out my magazine on Flipboard and my YouTube channel here.
       

      IBM Insight 2015 Spotlights Cloud Services, Spark, Watson Analytics Upgrades


      IBM serves up data-analytic cloud services, an Apache Spark service on BlueMix, and new data-discovery capabilities within IBM Watson Analytics. As for that Cognos update? It’s a half step to self-service BI.

      IBM’s announcements at its Insight 2015 event this week in Las Vegas weren’t all about cloud, but the cloud announcements were the ones I found most interesting during an opening-day keynote otherwise focused on highlighting recent accomplishments and “Insight economy” vision statements.

      The three cloud announcements that caught my attention concerned:

      • Insight Cloud Services and Industry Analytics Solutions
      • IBM Analytics on Apache Spark
      • New Q&A and data-discovery capabilities within the IBM Watson Analytics cloud app.

      Here’s a quick rundown on what I liked and why.

      @IBM, #IBMInsight2015

      IBM Insight Cloud Services combine data and contextual analysis aimed at delivering actionable insights.

      Insight Cloud Services

      IBM announced a set of data and analysis services aimed at bringing contextual insight to next-generation web and mobile apps. The data comes from partners Twitter and The Weather Company and more than 150 other sources, including public sources such as the U.S. Census and the Bureau of Labor Statistics. The deep analytics draw on APIs borrowed from IBM Watson and include data-analysis and language-processing capabilities such as entity extraction, which is used to spot people, places, things and events within textual data. [Note: The day after this post, IBM announced that it has entered into a definitive agreement to acquire The Weather Company’s B2B, mobile and cloud-based web properties, including WSI, weather.com, Weather Underground and The Weather Company brand. The Weather Channel (TV business) will not be acquired by IBM, but will license weather forecast data and analytics from IBM.]

      The portfolio will include some 20 services, and IBM has licensed a high-scale, high-speed data-delivery platform from The Weather Channel, which delivers mission-critical weather data to airlines, insurance companies, media outlets and many other industries. Running on SoftLayer, the platform is described as robust and scalable with high availability.

      As for the purpose of these services, the idea is to extract, integrate and markup data and then provide contextual analysis that brings meaning. Joel Cawley, the general manager of Insight Cloud Services, used the example of the simple weather data point of a 50-degree temperature reading. That’s not so remarkable unless you add the context that it’s 50 degrees in Boston in February, where it has been below freezing for the last three weeks straight. Or maybe it’s Miami in August, and temperatures haven’t dropped below 75 since May.
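Cawley's Boston-versus-Miami example boils down to comparing a reading against a seasonal baseline. Here is a deliberately naive sketch of that idea (my own illustration, not an IBM API; the function name, data and threshold are invented):

```python
# The same 50-degree reading is news against one seasonal baseline and
# routine against another - context is what makes raw data meaningful.
def is_remarkable(reading_f: float, recent_temps_f: list[float],
                  threshold_f: float = 15.0) -> bool:
    baseline = sum(recent_temps_f) / len(recent_temps_f)
    return abs(reading_f - baseline) >= threshold_f

boston_february = [28, 25, 30, 27, 29]   # weeks below freezing, abbreviated
miami_august = [88, 90, 87, 91, 89]

assert is_remarkable(50, boston_february) is True    # 50F in Boston in February
assert is_remarkable(85, miami_august) is False      # 85F in Miami in August
```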

      The emphasis with Insight Cloud Services is on delivering actionable information, according to Cawley. Weather data services, for example, could be used by utilities to forecast demand and predict service outages, by local governments to develop emergency plans, or by retailers to optimize inventories and increase sales.

      IBM introduced industry-specific Insight Cloud Services last May, including IBM Demand Insights, used by retailers and others to understand the correlations between sales of specific products with weather, events, news, trends and social commentary. Customers Urban Outfitters and Costco presented here in Vegas. The IBM Market Insights service is used by consumer products and media companies to better understand customers based on likes and interests expressed in social media. This provides customer-segmentation data that can be used to improve targeted marketing efforts.

      IBM also introduced a cloud-based Fan Insights Service for sports and entertainment firms. The Ottawa Senators hockey team is presenting here on how it plans to use the service. The service is aimed at predicting ticket sales and concession and staffing needs throughout the season as well as effective marketing strategies based on fan sentiment, behavioral trends and individual fan preferences.

      MyPOV: These services sound compelling, though I’m a little sketchy on just how contextual data services translate into actionable insights. The temperature example (50 degrees in Boston versus Miami) makes sense, and who doesn’t want actionable insight? But is this a support-heavy approach, much like IBM’s joint mobile iOS apps with Apple, whereby implementation services will be required to deliver actionable insights within applications? I’m hoping these services will be straightforward and easy to use for web and mobile developers.

      IBM Analytics on Apache Spark

      Spark is, of course, the white-hot open source in-memory analytics framework that IBM promised it would back in a big way earlier this year. This managed Spark service on IBM BlueMix has been in the works for a few months. On Monday I sat in on a Spark panel that included early customers Climformatics and consulting firm SmarterData.

      The service is said to include Spark Core as well as the Spark SQL, GraphX, SparkR and MLlib machine-learning components. There’s also a Notebook user interface for accessing, loading and visualizing data with drag-and-drop functionality. The service accesses data from BlueMix cloud services including Cloudant, dashDB, Streams and the DataWorks data-transformation and cleansing service.

      MyPOV: I’m a big fan of Spark for its in-memory performance and analytic versatility, and this new service is early proof that IBM is following through on its commitment to develop Spark for the enterprise. IBM’s new service gives developers a much-needed option (besides Databricks) for learning and deploying Spark-based applications in the cloud. It runs stand-alone on cloud storage in IBM’s cloud, with no requirement to also run Hadoop (which should make things easier). It’s a pay-as-you-go, big data service that’s ripe for the times.

      IBM Watson Analytics and Cognos

      IBM Watson Analytics is IBM’s intuitive, cloud-based app with natural-language question-and-answer capabilities and smart, automated recommendations for visualization and analysis. Cognos is, well, the aging business intelligence suite born in an earlier era, but IBM has announced a significant facelift.

      A lot of the sexy stuff inside IBM Watson Analytics was actually developed by the Cognos and SPSS teams, but IBM decided to add a dollop of Watson and serve it up as an independent, cloud-based product. Watson Analytics even has self-service predictive analytics capabilities, which I detailed in this report.

      The “what’s new” in this IBM Watson Analytics refresh includes a new Expert Storybooks feature for data discovery. Developed in collaboration with nearly a dozen partners, Storybooks help users spot the most relevant facts, patterns and exceptions in data. A Deloitte-developed Storybook, for example, measures the effectiveness of incentive programs, while one from The Weather Company helps users understand how weather impacts revenue trends. IBM also introduced a Secure Gateway for Watson Analytics for accessing on-premises data. And it added connectors for DB2, Informix, Netezza, IBM SQL Database, IBM dashDB and a variety of popular third-party data sources.

      As for the latest upgrade of Cognos, IBM introduced an extensive user-interface refresh aimed at consolidating overlapping functionality and bringing self-service capabilities to report and dashboard consumers and creators. It also introduced “Intent-Driven Modeling,” which interprets what you’re after based on search terms. Unfortunately, the deeper you go, the more of the underlying product’s complexity you see. And IBM has done little to streamline administration and the heavier aspects of data management.

      MyPOV: IBM calls Watson Analytics its tool for citizen data scientists (going after Tableau) while Cognos Analytics is, well, the legacy product. I can see where existing Cognos customers will appreciate all the self-service improvements in this upgrade, so maybe it will stem the rate of attrition. But I wouldn’t expect a flood of new customers.

      Watson Analytics looks like the future for IBM, but it’s up against Tableau, Qlik, and a host of new and revamped cloud options from the likes of Amazon, Microsoft, Oracle and Salesforce. It’s notable that Amazon is addressing the complexity of data analysis starting with the back-end data layer with its recently announced QuickSight service. That service is obviously far from proven, but it just may be that simplicity and ease of use have to start with core data management.

