SuperNova Award Winner Portrait - Asha Aravindakshan of Ashoka with FinancialForce HCM

Last week at Constellation Connected Enterprise we awarded our SuperNova Awards for the most successful transformations we have seen in the last 12 months. The winner in the "Future of Work" category was Asha Aravindakshan of nonprofit Ashoka (see more here).

 
 
We had the chance to speak with Asha at CCE - so take a look:
 
 
In case you didn't have a chance to watch - here are the key takeaways:
 
  • Ashoka is a nonprofit working globally, with 400+ employees in 47 countries; outside the US, every country has 10 or fewer employees.
  • Ashoka went from a spreadsheet-based solution to using FinancialForce HCM.
  • In a DIY approach, Aravindakshan, together with the Salesforce system administrator and a FinancialForce consultant, took Ashoka live in 10 weeks with an effort of less than 6 hours per week.
  • Aravindakshan used the yearly performance review cycle to drive adoption of the system, reaching over 97% of the employee base in the first month.
  • Training took less than one hour, as users were already familiar with the user interface from Salesforce.
 
Find more coverage on the Constellation Research website here and check out my magazine on Flipboard and my YouTube channel here
 

Event Hubs, or Engines, add the ‘React’ capability to Analytics - Turning IoT event triggers and data into high value business outputs.

In recent weeks many of the big names of the technology industry have announced 'Event Engines' in association with other parts of their Cloud based product sets. In addition, a number of startups are offering Event Processing engines, with a variety of different benefits, that will run on an available Cloud service. Most are specifically announced as part of Internet of Things, IoT, solutions, or sometimes IoT suites, but what's different from BPM business process engines?

The better question is: why are these announced specifically for IoT solutions, and how do they fit with the development of IoT architectures covered in the preceding two blogs in this series?

Research report now available: The Foundational Elements for the Internet of Things (IoT)

The sheer volume of devices and resulting data flows makes real-time Analytics essential to 'read' and tag the valuable 'Insights', but the real business value lies in using Event Hubs/Engines coupled with Analytical output to respond, or 'react', with sophisticated, Business-valuable actions.

Event Processing is not new, but Business-valuable IoT deployments demand Complex Event Processing, CEP, to combine data from multiple sources into meaningful events and then orchestrate process elements into optimized Process responses. Add the demand for real time, the unique combination of both 'push' and 'pull' data, and frequent changes plus dynamic utilisation, and a new generation of cloud-based Event Engines is required.

Event Engines sit in a third layer over the Internet of Things, IoT, architecture that is becoming increasingly clearly defined. A base layer of connectivity Infrastructure is built on Fog Computing, or Edge-based Clouds, to localize and speed up the 'interactions' between IoT Devices. Built over this are the store/search capabilities of Graph Databases, with their unique capability to establish relationships between data around connections, similar to the manner in which data is created by IoT Devices.

Event Engines are effectively the third layer of IoT architecture, providing the higher levels of Business Value by managing responses to Event Trigger conditions, either as a complex event process from a single trigger event, or by drawing a conclusion from a complex alignment of the flow of current data with stored data. The association of a Graph Database with IoT and Event Engines operating on Clouds has led some Event Engine providers to call their associated (Graph) Database an 'Event Cloud'.

It's not hard to imagine any number of examples that would be defined as simple event processes, classically the story of the towel dispenser that runs out of towels and calls to be refilled. To some the definition of IoT remains that of a simple sensor on a machine reporting that a certain value has changed, but to many it is now understood to be a wide range of data flowing from many different types of connected Devices. As IoT deployments scale up, both the value and the amount of data to be processed call for automation of 'React' outputs, and possibly integration with existing Enterprise Applications.

Complex Event Processing, CEP, is by definition complex to explain, but in simple terms it is about finding new values from combinations of data and delivering an output that is Business-valuable. As is often the case with new Technology, Event Engines and CEP are best understood through examples of what can be done. The following two examples are edited versions of examples from the Wikipedia explanation of Complex Event Processing.

As an example of Complex Event Processing, consider a car equipped with just three simple IoT sensors that measure Tyre Pressure, Speed, and the presence of a Driver via seat pressure. Individually each is able to offer a data flow and trigger condition. Combining the same data flows from the same three simple sensors using Complex Event Processing produces new data that is wholly different and of much higher value.

CEP Example 1: The speed sensor indicates the car is moving while the tyre pressure sensor data flow indicates the pressure in one tyre is dropping from 45 psi to 41 psi over 60 minutes. As the pressure in the tyre decreases, a series of data events reporting the tyre pressure is generated. In parallel a data flow is being generated indicating the car is being driven (the presence of a driver and the speed of movement). The car's Event Processing Engine combines all three current and stored data flows to define the situation as gentle tyre deflation over a period of time and outputs the display "Loss of Tyre Pressure" to the driver. This output may also be written into the structured database of the car's maintenance log, and in connected cars may even be sent out to seek tyre puncture repair options.

CEP Example 2: Changing just one event reporting parameter in the same situation produces an entirely different output and triggers appropriately different actions. If the tyre pressure drops by the same amount, but in 5 seconds, then the car's Event Processing Engine will conclude the output to be "Tyre Punctured", or "Blow Out". This potentially catastrophic event will bring into play skid management control, hazard lights coming on, and possibly a warning being flashed externally to warn of a potential accident.
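To make the two examples concrete, here is a minimal Python sketch of the combining logic (an illustration only, not any vendor's Event Engine; the names, units and thresholds are assumptions):

```python
from dataclasses import dataclass

@dataclass
class Reading:
    tyre_psi: float      # tyre pressure sensor
    speed_kmh: float     # speed sensor
    driver_seated: bool  # seat pressure sensor
    timestamp: float     # seconds since start

def classify(prev: Reading, curr: Reading) -> str:
    """Combine three simple sensor flows into one higher-value event."""
    moving = curr.speed_kmh > 0 and curr.driver_seated
    dt = curr.timestamp - prev.timestamp
    psi_per_second = (prev.tyre_psi - curr.tyre_psi) / dt if dt > 0 else 0.0
    if not moving or psi_per_second <= 0:
        return "OK"
    # Assumed threshold: losing pressure faster than 0.5 psi/s is a blow-out
    if psi_per_second > 0.5:
        return "TYRE PUNCTURED / BLOW OUT"   # CEP Example 2
    return "LOSS OF TYRE PRESSURE"           # CEP Example 1

# Gentle deflation: 45 -> 41 psi over 60 minutes while driving
print(classify(Reading(45, 80, True, 0), Reading(41, 80, True, 3600)))
# The same 4 psi lost in 5 seconds
print(classify(Reading(45, 80, True, 0), Reading(41, 80, True, 5)))
```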

As with all forms of IoT deployment there are strong business management reasons to understand and define what outputs are required, just as Technology practitioners should know how to use IoT to deliver the requirement. The providers of Event Engines look to offer best-practice drag-and-drop process design tools so that, as with many Cloud-based capabilities, the creation of Event Processes may move to becoming a business user activity.

The above two examples illustrate the principles of the two major forms of CEP, Aggregation and Detection, as well as the common combination of both into a Hybrid solution.

Aggregation-Oriented CEP carries out processes by continuously calculating an 'average value' from multiple data flows to produce and trigger an output. Vibration increasing over a period, taken in combination with speed in revs per minute multiplied by hours run, might indicate bearing wear; whereas rising, or constantly high, engine temperature plus speed and hours running could be used to indicate when an oil change might be required.
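The aggregation style can be sketched as a sliding-window average that is recalculated as events arrive; the sensor names and limits below are invented for illustration:

```python
from collections import deque

class VibrationMonitor:
    """Aggregation-Oriented CEP sketch: rolling average over a data flow."""
    def __init__(self, window: int = 100, limit: float = 7.0):
        self.readings = deque(maxlen=window)  # sliding window of samples
        self.limit = limit

    def on_event(self, vibration_mm_s, rpm, hours_run):
        self.readings.append(vibration_mm_s)
        average = sum(self.readings) / len(self.readings)
        # Assumed rule: high average vibration at speed, after many hours
        # run, indicates possible bearing wear
        if average > self.limit and rpm > 1000 and hours_run > 5000:
            return "POSSIBLE BEARING WEAR - schedule inspection"
        return None

monitor = VibrationMonitor()
for v in [6.8, 7.2, 7.5]:
    alert = monitor.on_event(vibration_mm_s=v, rpm=1800, hours_run=6200)
print(alert)  # the rolling average has crossed the limit
```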

Detection-Oriented CEP seeks to find a required output trigger from a combination of event inputs in which a determined pattern or sequence can be found. Facebook's search capability is an example, using Detection-Oriented CEP to look for alignments and matches between apparently unrelated data held in Graph Databases.
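Detection-Oriented CEP can be sketched as looking for a required ordered pattern inside the event flow; the pattern and events below are invented for illustration:

```python
def detect(pattern, events):
    """Detection-Oriented CEP sketch: fire when the event flow contains
    the pattern as an ordered (not necessarily contiguous) subsequence."""
    stream = iter(events)
    return all(any(p == e for e in stream) for p in pattern)

# Unrelated events in between do not prevent the match
flow = ["door_open", "lights_on", "motion", "tv_on", "safe_opened"]
print(detect(["door_open", "motion", "safe_opened"], flow))  # True
print(detect(["safe_opened", "door_open"], flow))            # False: wrong order
```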

Hybrid CEP is rapidly becoming the norm, as the growing number of IoT sensors and Devices producing data flows increases Event Processing possibilities and informed users ask for a wider selection of output conditions.

It is tempting, and wrong, to draw parallels with event processing in Business Process Management, BPM, and therefore to consider BPM Rule Engines for IoT event processing. BPM Rule Engines are neither meant for continuous dynamic reprogramming, nor can they combine IoT Device 'push' data with 'pull' data from APIs, to name two of the most obvious limitations.

Complex Event Processing, allied to Fog Computing and Graph Databases, makes for true game-changing capabilities of the type that underpin many of the new high-value, Business-disruptive offerings. Simple Event trigger alarms may be enough to justify the first generation of IoT pilot deployments connected through small-scale 'Intranet' deployments, but the real prize that a Global IoT environment brings is far higher levels of direct business value.

The Enterprises in each sector that are the first to learn how to connect, collate and process the new streams of Data in a massively connected IoT Device world will immediately gain the competitive advantage they seek. It's right to see that Data delivers the advantage, but unlocking that advantage requires new understanding of exactly what IoT is and how it works. The leading Enterprises are there already; think of Amazon, Facebook, or Google to understand how they collect and use data to monetize their new business models, and look around in your own Sector.

Resources

The Foundational Elements for the Internet of Things (IoT)

Some further sources for information on Event Hubs/Engines linked to Cloud Suites

Salesforce Products and Platform

http://www.salesforce.com/iot-cloud/

SAP HANA and IoT

http://go.sap.com/uk/product/technology-platform/iot-platform-cloud.html

Microsoft IoT solution products overview 

https://azure.microsoft.com/en-gb/services/stream-analytics/

AWS  IoT product architecture overview

https://aws.amazon.com/iot/how-it-works/

Google IoT cloud products

https://cloud.google.com/solutions/iot/

AWS examined by TechCrunch 

http://techcrunch.com/2015/10/08/amazon-announces-aws-iot-a-platform-for-building-managing-and-analyzing-the-internet-of-things/#.fkbxmq:1f9A

Oracle IoT solution architecture as announced at OOW autumn 2015

https://www.oracle.com/solutions/internet-of-things/index.html 

How to Secure the Best Cloud Software Contract Webinar with R “Ray” Wang

Constellation Research principal analyst and bestselling author R "Ray" Wang will teach you how to structure the most favorable cloud software contract for your organization. He will share his top cloud negotiation tips, derived from his involvement in 1,000+ contract negotiations.

Don't get trapped in an unfavorable cloud contract. Learn how to navigate the increasingly complex cloud services market by registering for this webinar today. 

Details:

You will learn:

  • Top cloud negotiation tips from one of enterprise tech's leading analysts
  • Common contract negotiation pitfalls
  • How to ensure you secure the best cloud services deal

Constellation Announces 2015 SuperNova Award Winners

Constellation announces the winners of the 2015 SuperNova Awards at Constellation's Connected Enterprise.

Constellation announced nine SuperNova Award winners last night at the SuperNova Awards Gala Dinner. 

The Constellation SuperNova Awards are the first and only awards to celebrate the leaders and teams who have overcome the odds to successfully apply emerging and disruptive technologies for their organizations. The SuperNova Award winners demonstrated great leadership in selecting, implementing, and deriving business value from disruptive technologies. More information about the winners below. 

All applications were evaluated by the SuperNova Award judges, a panel of industry thought leaders, and then put to a public vote.

2015 SuperNova Award Winners

Consumerization of IT & The New C-Suite - Martin Brodbeck, CTO, Sterling Backcheck

Martin Brodbeck won the SuperNova Award for his implementation of SnapLogic Infrastructure-as-a-Service solutions to streamline Sterling Backcheck's automated transactions service. He moved from a complicated, custom-built, open-source system to one streamlined, cloud-based transactions system. The project shortened customer onboarding time to just days. Frictionless transactions in the cloud translate to faster time to revenue for Sterling Backcheck.

Data to Decisions - Alda Mizaku, Lead Business Solutions Analyst, Mercy

Mizaku and her team won the SuperNova Award for leading a big data analytics project to improve the delivery of patient care and bridge the gaps between the clinical and coding worlds. Part of the strategy included automation of secondary diagnosis detection targeted to improve the accuracy of provider documentation. This strategy seeks to more accurately reflect care that has already been provided and help bubble up comorbidity factors. This project, which uses a combination of data warehouse tables, ETL logic, and custom reporting, led to a diagnosis increase of 36%.

Digital Marketing Transformation - Naveen Gunti, Sr. Director of e-Commerce Technology and Operations, Tumi Holdings, Inc.

Naveen Gunti won the SuperNova Award for his use of Adobe Marketing Cloud to improve Tumi's online customer experience. The implementation enabled customers to view dynamic images of Tumi products on the Tumi website. The resulting increase in customer engagement on the website lifted on-site time by 40%.

Future of Work, Human Capital Management - Asha Aravindakshan, Operations Director, Global Talent, Ashoka

Asha Aravindakshan won the SuperNova Award for leading the implementation of FinancialForce HCM at Ashoka. Prior to the implementation, Ashoka used Excel spreadsheets to manage their global workforce. Recognizing adoption as an essential element required for the success of the new HCM system, Aravindakshan led a movement to conduct performance reviews on FinancialForce. 97% of Ashoka's employees engaged with FinancialForce in response to this performance review incentive. Thanks to Aravindakshan's leadership, Ashoka can now address the HCM needs of an international workforce.

Future of Work, Social Business - Steve Nava, Sr. Director Field Service, Luminex

Steve Nava won the SuperNova Award for leading the implementation of ServiceMax to improve communication and collaboration of field service engineers at Luminex. The implementation was so successful that it transformed Luminex’s field service department into a solutions-oriented business.  Luminex’s fix rate increased to 98%, and the invoice cycle went from 28 days to 96 hours. 

Matrix Commerce - Jordan Kivelstadt, CEO, Free Flow Wines

Jordan Kivelstadt won the SuperNova Award for making Free Flow Wines one of the first companies to deliver wine in kegs. He used NetSuite to address the specific supply chain issues presented by this model. Free Flow Wines is disrupting the wine industry as more restaurants choose to serve wine in kegs. This year, Free Flow Wines expects to deliver the equivalent of 300,000 cases of wine.

Next Generation Customer Experience - Dan Wallis, Director of KP OnCall, Kaiser Permanente

Dan Wallis won the SuperNova Award for leading the implementation of Oracle Service Cloud to support a system that helps to automatically diagnose health conditions via the web. This project has allowed Kaiser Permanente to better serve their customers by reducing costs and wait times. Now Kaiser patients can accurately self-diagnose conditions without visiting a doctor or calling a nurse call center. 

Technology Optimization & Innovation - Dr. David Bray, Chief Information Officer, Federal Communications Commission

Dr. David Bray won the SuperNova Award for his overhaul of the FCC’s legacy Consumer Help Center. He implemented cloud-based Zendesk to modernize the Commission's Help Center. The Zendesk implementation helped the FCC replace 18 outdated complaint forms, activate 24/7 complaint tracking, and improve transparency. The cloud-based solution selected and implemented under Bray's leadership led to savings of five-sixths of the cost of a custom-built, in-house solution.

Technology Optimization & Innovation - Erica Stevens, VP of Supply Chain and IT, Dylan's Candy Bar

Erica Stevens won the SuperNova Award for her implementation of NetSuite at Dylan’s Candy Bar. This implementation transformed Dylan’s into a ubiquitous-channel retailer. Dylan's is now able to scale and pivot its retail operations to meet the needs of its customers on many channels, including web, mobile, and brick and mortar.

The Rewards

Congratulations to the winners! Continue to be brave, innovative, and disruptive!


My opening remarks on privacy at Constellation Connected Enterprise 2015

A big part of my research agenda in the Digital Safety theme at Constellation is privacy. And what a vexed topic it is! It's hard to even know how to talk about privacy. For many years, folks have covered privacy in more or less academic terms, drawing on sociology, politics and pop psychology, joining privacy to human rights, and crafting various new legal models.

Meanwhile the data breaches get worse, and most businesses have just bumped along.

When you think about it, it's obvious really: there's no such thing as perfect privacy. The real question is not about 'fundamental human rights' versus business, but rather, how can we optimise a swarm of competing interests around the value of information?

Privacy is emerging as one of the most critical and strategic of our information assets. If we treat privacy as an asset, instead of a burden, businesses can start to cut through this tough topic.

But here's an urgent issue. A recent regulatory development means privacy may just stop a lot of business getting done. It's the European Court of Justice decision to shut down the US-EU Safe Harbor arrangement.

The privacy Safe Harbor was a work-around negotiated by the Federal Trade Commission, allowing companies to send personal data from Europe into the US.

But the Safe Harbor is no more. It's been ruled unlawful. So it's a big, big problem for European operations, many multinationals, and especially US cloud service providers.

At Constellation we've researched cloud geography and previously identified competitive opportunities for service providers to differentiate and compete on privacy. But now this is an urgent issue.

It's time American businesses stopped getting caught out by global privacy rulings. There shouldn't be too many surprises here, if you understand what data protection means internationally. Even the infamous "Right To Be Forgotten" ruling on Google's search engine - which strikes so many technologists as counterintuitive - was a rational and even predictable outcome of decades-old data privacy law.

The leading edge of privacy is all about Big Data. And we ain't seen nothin' yet!

Look at artificial intelligence, Watson Health, intelligent personal assistants, hackable cars, and the Internet of Everything where everything is instrumented, and you see information assets multiplying exponentially. Privacy is actually just one part of this. It's another dimension of information, one that can add value, but not in a neat linear way. The interplay of privacy, utility, usability, efficiency, efficacy, security, scalability and so on is incredibly complex.

The broader issue is Digital Safety: safety for your customers, and safety for your business.


Randstad Sourceright: Good Progress and the beginning of a balancing act

We had the opportunity to attend the Randstad Sourceright analyst summit this week; it took place in Atlanta and was well attended by the analyst community:

 

Take a look at my top 3 takeaways of the event:

 
 
If you don't have a chance to watch - here are the takeaways:
 
  • Talentradar debut - Randstad Sourceright showed the first deliverable on its product roadmap with Talentradar. It brings together recruiting information across the various systems involved, as well as Randstad Sourceright partners like HireVue, SmashFly, etc. Technologies mentioned were Informatica and R, and we could see Domo being used for visualization. Randstad Sourceright has delivered a solid version one of the product; now we need to understand the roadmap and customer adoption as next steps.
     
  • Standardization - The whole outsourcing industry is recovering from a hangover of overly customized deals sold and implemented early in the millennium. The answer is standardization and leveraging global capabilities, and Randstad Sourceright is making good progress on both fronts. Discipline is key though, and it was reassuring to hear the North American sales leaders state that they would walk away from business if it did not fall inside the parameters of standard delivery.
     
  • RiseSmart - In a surprise move Randstad Sourceright acquired outplacement vendor RiseSmart (see here for vendor and here for press release). RiseSmart CEO Sathe was there and told the vendor's story - bringing software to the outplacement business. In a good move Randstad Sourceright has decided to let RiseSmart operate independently.



 

    MyPOV

    It is good to see Randstad Sourceright growing and making progress standardizing and globalizing its product offerings. The acquisition of RiseSmart opens new revenue potential and the chance to disrupt the outplacement market. Equally it is good to see the product focus showing first deliverables with Talentradar. And the vendor is keeping tabs on a booming Recruiting startup ecosystem with its Randstad Foundation Fund.
     
    On the concern side, Randstad Sourceright will have to take into account that more resourcing decisions will be made by software, and more will be made by hiring managers directly, no longer by recruiters. These changes are disruptive for Randstad Sourceright customers and therefore for the vendor itself. Preparing and switching over in time will be the key challenge for executive management in the coming years.
     
    But for now congrats on good progress, we will be keeping tabs, stay tuned. 

    --------------

    More on Recruiting


     
     

    • Musings - How Technology Innovation fuels Recruiting and disrupts the Laggards - read here
    • Musings - What is the future of recruiting? Read here
    • HRTech 2014 takeaways - read here.
    • Why all the attention to recruiting? Read here.
    Find more coverage on the Constellation Research website here and check out my magazine on Flipboard and my YouTube channel here


     

    Pivotal Now Makes It Easier Than Ever to Take Software from Idea to Production

     
    Today Pivotal used its upcoming European Cloud Foundry user conference to release a round-up press release on its overall progress… time to check in on where Cloud Foundry stands today.

     


    So let’s dissect the press release in our customary style:

     
    San Francisco, November 2, 2015 – Pivotal®, the company accelerating digital transformation for enterprises, today announced a new release of Pivotal Cloud Foundry, the comprehensive Cloud Native platform that unifies the software delivery process with an integrated application framework, platform runtime, and infrastructure automation. Pivotal Cloud Foundry now includes expanded support for Spring Cloud Services, Microsoft Azure, .NET applications, Docker images, and application lifecycle management. With these enhancements, Pivotal further enables businesses to rapidly build, deploy, and operate Cloud Native applications on a wide choice of hosted, public, and private clouds.

    MyPOV – Good summary to start the press release – hitting all the key new capabilities, which we will dissect and comment on below. But worth mentioning the ‘cloud native’ positioning here; it will be interesting to see if Pivotal can pull off that association between cloud native and its products.

     
    “The days of monolithic technologies are ending. Today’s modern enterprises practice agile software development with Cloud Native tools, process, and culture that can respond to speed of market and customer demand,” said James Watters, vice president and general manager, Cloud Platform Group, Pivotal. “Pivotal Cloud Foundry delivers a comprehensive Cloud Native application development and operations environment so you can spend time building business value instead of your IT infrastructure.”

    MyPOV – Good quote by Watters, hitting the right value proposition of CloudFoundry, though the tool itself is also monolithic – in the sense of offering one way to build software.

     
    Integrated Microservices with Spring Cloud Services  
    Based on the popular Spring Cloud OSS, which is used by Netflix to operate its global, on-demand video streaming service, Spring Cloud Services for Pivotal Cloud Foundry goes one step further to provide opinionated provisioning and lifecycle management to these components.
    The first and only secure, enterprise-ready distribution of core Netflix OSS components, Spring Cloud Services enables developers and operators of Cloud Native distributed systems architectures to quickly and easily build microservices by adding a suite of production-ready services to the Pivotal Cloud Foundry marketplace. Spring Cloud Services allows developers to focus on delivering business value and defers the deployment and management of important distributed systems patterns such as application configuration, service discovery, and fault-tolerance to the Pivotal Cloud Foundry platform.

    MyPOV – Cloud Foundry needed a productivity framework to accelerate time to market for solutions built with the product. Nothing lies closer than using the venerable Spring framework, souped up with the Netflix OSS components. A good move for the product, but like all productivity tools, it comes with increased dependency. We expect Pivotal customers not to be too concerned with this dependency, but they should make the tradeoff consciously.

     
    Native Support for .NET Applications 
    Thanks to the next-generation runtime shipping in this latest release, .NET applications can now run on Pivotal Cloud Foundry. With this expanded support for .NET, enterprises can support a heterogeneous environment consisting of both Linux-based and Windows-based applications. .NET applications will run natively on Windows Server 2012 R2 Hyper-V virtual machines, and Pivotal Cloud Foundry can manage applications with the same commands and many of the same consistent Day 2 operational benefits as existing applications.

    MyPOV – This is a key move by Microsoft and Pivotal to spare developers of .Net applications from having to go back and rebuild these .Net apps as their first order of business. Instead, Microsoft and now Pivotal give developers the opportunity to operate these older .Net applications in conjunction with the next-generation applications they want to build (and the vendors want them to build). Lastly it is the ultimate proof point of investment protection for .Net applications, a promise Microsoft made over a decade ago and is honoring today.

     
    Native Support for Docker Images 
    Docker applications can now leverage the built in Pivotal Cloud Foundry platform capabilities, such as scheduling, health management, load balancing, enterprise identity, logging, and multi-cloud support. Now in beta, native Docker image support is made possible by the new elastic runtime and makes Pivotal Cloud Foundry the most advanced container management system on the market today. Customers can deploy applications to Pivotal Cloud Foundry based on Docker images from public, secure registries such as Docker Hub.

    MyPOV – Good move to provide better support for Docker, and for the way enterprises want to build, operate and consume Microservices – in a secure, repeatable and reliable way. Registry integration is the capability in demand, and it is good to see Pivotal providing it.

     
    Application Lifecycle Management Toolchain
    Delivering on Pivotal’s vision of comprehensive Cloud Native application lifecycle management, the company is partnering with GitLab, CloudBees, and JFrog to deliver a turnkey continuous integration and continuous delivery (CI/CD) solution.
    Building upon the popular software project management tool, Pivotal Tracker, customers can integrate platform-managed versions of GitLab source code repository, CloudBees Jenkins continuous integration, and JFrog Artifactory binary artifact management. By providing the building blocks of a modern application delivery toolchain, Pivotal Cloud Foundry empowers software organizations to build and deploy microservices and Cloud Native applications with confidence and speed.

    MyPOV – It is good to see Pivotal acknowledging other popular development tools such as the ones mentioned, which are now integrated with Pivotal Tracker. Next would be a roadmap / sharing of plans for other popular adjacent tools. For now congrats to the three who made it – GitLab, CloudBees and JFrog.

     
    Early Access Support for Microsoft Azure
    Pivotal Cloud Foundry extends its Cloud Native platform with early access support for Microsoft Azure, adding to the already-supported Amazon Web Services® (AWS), VMware vSphere®, VMware vCloud Air®, and OpenStack®. With Pivotal Cloud Foundry, customers can deploy and manage Cloud Native applications on almost any infrastructure; without the operational cost and complexity of maintaining their own underlying cloud infrastructure. [..]

    MyPOV – Good to see Pivotal extending deployment options as previously indicated – now adding support for Microsoft Azure. A good move for CloudFoundry users, who get more deployment options for their projects.

     

    Overall MyPOV

    Pivotal is making good progress with CloudFoundry, creating more value and synergies for customers and prospects. It further solidifies CloudFoundry‘s position as the leading enterprise PaaS. With Microsoft Azure support and access to .Net applications, Microsoft further acknowledges the position of Cloud Foundry, bringing core Microsoft .Net assets to the CloudFoundry platform.

    On the concern side – with success comes responsibility. Pivotal needs to deliver these capabilities, ensure customer success and become a reliable partner for both its customers and its growing ecosystem. There is no indication that Pivotal cannot deliver this, but the task ahead is not trivial. Starting to create, communicate and deliver to roadmaps will be the first steps.

    But for now it’s good to be a Pivotal customer and prospect. 


    More on Pivotal

     
    • News Analysis - Pivotal makes CloudFoundry more about multi-cloud - read here
    • News Analysis - Pivotal pivots to OpenSource and Hortonworks - Or: OpenSource keeps winning - read here

    More on Next Generation Applications:

     
     
    • Progress Report - Cloudera is all in with Hadoop - now off to verticals - read here
    • First Take - SAP Cloud for Planning - The next spreadsheet killer is off to a good start - read here
    • Market Move - Oracle buys Datalogix - moves into DaaS - read here
    • News Analysis - SAP commits to CloudFoundry and OpenStack - Key Steps - but what is the direction? Read here
    • Event Report - MongoDB is a showcase for the power of Open Source in the enterprise - read here
    • Musings - A manifesto: What are 'true' analytics? Read here
    • Future of Work - One Spreadsheet at the time - Informatica Springbok - read here
    • Musings - The Era of the no-design Database - Read here
    • Mendix - the other path to build software - read here
    • Musings - Time to ditch your datawarehouse .... - Read here
    Find more coverage on the Constellation Research website here and check out my magazine on Flipboard and my YouTube channel here



     

    Oracle Open World 2015: Three Important Cloud Services

    Oracle Open World 2015 announcements included three data-related standouts: Oracle Cloud Platform For Big Data, Oracle Data Visualization Cloud Service, and Oracle Data Cloud. Here’s a deeper dive on what stands out.

    Our cloud is integrated, it’s ready, and it’s bigger than and superior to any rival cloud. That was the big-picture message Oracle offered customers at Oracle Open World 2015. What’s more, the cloud is where most customers will soon be headed, said Oracle CEO Mark Hurd, predicting that “virtually all” enterprise data would be stored in the cloud by 2025.

    It would be impossible to detail all two-dozen-plus announcements made last week, so I’ll narrow things to my Data-to-Decisions (D2D) research domain and focus on three announcements that caught my eye: The Oracle Cloud Platform For Big Data, the Data Visualization Cloud Service, and the Oracle Data Cloud. With all three, Oracle is trying to differentiate itself in the cloud.

    Oracle Cloud Platform For Big Data

    Oracle’s Hadoop-in-the-cloud Big Data Cloud Service was announced at Oracle Open World 2014. At Oracle Open World 2015 the company reannounced a family of supporting services newly packaged as the Oracle Cloud Platform for Big Data. The components include Oracle Big Data Preparation Cloud Service, Oracle GoldenGate Cloud Service, Oracle Big Data Discovery Cloud Service and Oracle NoSQL Database Cloud Service. The idea is to surround the Hadoop service with a breadth of data-prep, data-movement and data-analysis options.

    @Oracle, #OOW15

    The Oracle Cloud Platform for Big Data combines platforms (database, Hadoop, NoSQL) and analysis options.

    I say “reannounce” because Oracle talked about all of these services back in June; the “Cloud Platform for Big Data” is a new brand. The Oracle Big Data Preparation service is aimed at preparing and enriching semi-structured and unstructured data, such as clickstreams and social data. Under the hood it makes use of Apache Spark machine learning, Yago entity resolution training sets and Apache UIMA natural language processing.

    The GoldenGate Cloud service is based on Oracle’s well-known data-replication software. This service is designed to pump data in near real time into Oracle Database Cloud Service, Exadata Cloud Service, Hive, HBase, HDFS, Flume and Kafka (in this case, in the cloud). It’s a complement to low-latency, data-streaming applications, such as those in IoT scenarios.

    The Oracle Big Data Discovery Cloud Service is a business-analyst-oriented tool for exploring, transforming and discovering data within Hadoop (again, in this case running in Oracle’s cloud). Data Discovery first samples, profiles and catalogs all available data. Machine-learning algorithms then surface interesting correlations and offer suggested visualizations for exploring attributes. Search and guided navigation features support data exploration.

    Filling out Oracle’s Cloud Platform is the NoSQL Database Cloud service, which is Oracle’s high-scale key-value store database delivered as a service.

    MyPOV: Not to be forgotten is the Oracle Big Data SQL Cloud Service, which does SQL querying across Oracle Database Cloud services, the Big Data Cloud Service and the NoSQL Database Cloud Service. Taken together, it’s a broad (if Oracle-centric) collection. IBM has a broader collection of IBM and open-source-based services on BlueMix, and Amazon Web Services has more customers using its cloud. But Oracle is building out an impressive portfolio, and the company’s dominant database position will surely feed cloud growth.

    Oracle Data Visualization Cloud Service

    Nearly every BI vendor has introduced a data-visualization module in recent years in response to the fast growth of Tableau Software. Oracle has evolved what it offers for the cloud with the Oracle Data Visualization Cloud Service. Set to become generally available in November, this new stand-alone service is based on capabilities first seen in the Oracle Business Intelligence Cloud Service introduced in April.

    @Oracle, #OOW15

    The Oracle Data Visualization Cloud Service offers 18 charting options and a palette of colors and shapes for depicting data.

    The Oracle Data Visualization Cloud Service will enable you to link to on-premises and cloud data sources (both from Oracle and third parties) as well as your own spreadsheets. There are 18 different types of visualizations and a palette of colors, shapes and sizes for depicting data points.

    Execs at Oracle Open World made a point of saying “all you need is a browser.” That’s because with Tableau’s cloud service, Tableau Online, users author charts and dashboards with the desktop client and then publish to the cloud for collaboration. Tableau is working on bringing full authoring capabilities to the cloud. And, indeed, Oracle is working on a desktop client for times when you need to work offline.

    MyPOV: Oracle execs made claims about its new service being “more modern than Tableau” at Oracle Open World. That starts with full authoring capabilities in the cloud, but I’m not seeing some of the other differences claimed. The press release says Oracle’s service “eliminates the complexity typically associated with blending and correlating data sets,” but Tableau also automatically finds joins when mashing up data sets. Both products also select best-fit visualizations automatically based on the dimensions of data used in an analysis. This auto-charting capability has been around for a while, and it’s also present in SAS Visual Analytics and IBM Watson Analytics.

    For a real head-to-head comparison with Tableau, I want to investigate Oracle’s performance characteristics and its connection capabilities (once this service is available). Tableau’s strengths include its in-memory engine and its live-data-connection capabilities with multiple databases, apps and cloud services, including multiple connection options with Amazon, Google, IBM, Microsoft, Oracle, Salesforce, SAP and others. Will Oracle match that? I also want to tour the “fit and finish” of the visualizations and “storytelling” capabilities. Some of the charts seen at Oracle Open World looked hard to read, but that may be due to the data-filtering and presentation inexperience of the demonstrators.

    Yet-to-be-released pricing details from Oracle will also be key to any comparison, but to me, these visualization capabilities are most attractive when teamed with the Oracle BI Cloud Service. That’s because it’s not only a data-exploration and visualization service; you also get the database and reporting functionality. Here, too, the more Oracle-centric you are in the on-premises world, the more attractive the cloud options will be.

    Oracle Data Cloud

    Several new features of the Oracle Data Cloud were announced at Oracle Open World, but a larger context emerged last week with IBM’s announced intent to acquire The Weather Company. Thus, I was eager to learn more about Oracle as a data provider. Oracle Data Cloud is built on technology, data and analytics expertise picked up in the BlueKai and Datalogix acquisitions. Talking to execs from both companies now leading Oracle Data Cloud, I came away impressed.

    @Oracle, #OOW15

    Oracle Data Cloud offers data from more than 1,500 specialty retailers and 30 supermarket loyalty cards.

    Oracle Data Cloud offers data from more than 1,500 CPG and specialty retailers across 110 million US households. With data-enrichment and predictive analytics options on top of this data, Oracle can find likely buyers by product and category.

    MyPOV: Having data and being able to enrich that data and apply predictive analytics is the name of the game in marketing, and these initiatives are moving into the sales and service arenas as well. In the business-to-business arena, Oracle Data Cloud can enrich your data with Dun & Bradstreet information to find look-alikes of your best customers. A next step is bringing service data full circle back into your understanding of customers to drive efforts such as retention campaigns.

    Many tech vendors are introducing libraries of third-party data that are integrated with their offerings. But big guns like IBM and Oracle are stepping up to become primary data providers. Expect to see data from outside of your organization becoming a bigger and bigger part of your future success.



    Event Report - Oracle OpenWorld 2015 - Top 3 Takeaways, Top 3 Positives & Concerns

    We had the opportunity to attend Oracle OpenWorld in San Francisco this week. The conference was again the usual spread-out affair, with JavaOne happening at the Hilton, the HR events at the Palace Hotel, etc. Official attendance was 60k+ - it felt the same as last year.


     
    So take a peek:


     
     
    If you can't watch - here is the gist:
     
    Top 3 Takeaways
     
    • IaaS is here - There was something missing in the Oracle Cloud architecture, and that was IaaS. As Ellison shared candidly, Oracle built SaaS, realized that it needed PaaS, and building PaaS realized that it needed IaaS. So Oracle found its way to the cloud top down, with the important auto-scaling feature coming in 2 weeks / later this year. Pricing is attractive: as Ellison put it, Oracle dedicated instances will be at 50% of the cost of AWS flexible instances, and Storage will be at 1/10 of AWS S3. So cost will not be an issue / argument against moving to Oracle's cloud.
       
    • More Multitenancy, please - Oracle introduced multitenancy at the database level 2 years ago with Oracle 12c; this year it showed the transfer of a database container from data center A to data center B while writing to the database. And it introduced the largest extension to WebLogic, making JVMs multitenant, a key reliability and flexibility addition to running Java applications (needless to say, Oracle announced Docker support, too).
       
    • SCM closes the suite - SCM was the holdout in the Oracle Cloud Suite, and Oracle announced key new additions and products to address this gap. Coupled with the also-announced e-commerce products and the progress of the rest of the SaaS products, this makes the Oracle offering likely the most complete suite in the cloud. Or on premises, as Oracle keeps supporting the duality of deployment.
     
    Top 3 Positives
     
    • The chip-to-click stack becomes more real - With autoscaling the Oracle integrated tech stack learns a key trick to become a cloud infrastructure stack in regard to operational TCO. But it is also good news for on-premises customers, who can run workloads in a more elastic way.
       
    • SaaS Suite gets complete - With key SCM functionality and a new e-commerce suite Oracle addresses both a gap and good house-keeping in its SaaS suite, which is now complete in terms of all major functional areas.
       
    • Differentiation Sprinkles - No other large application vendor talks as much about and has as clear a DaaS vision as Oracle. The analytical models are sane, too, as a quick conversation with the Datalogix team confirmed. And then Oracle keeps creating value for the citizen developer and citizen integrator, allowing business end users to create mobile and web applications and integrations.

      Top 3 Concerns

      • Will it all work? - Oracle is likely undertaking the largest engineering project with developers having the same logo on their paychecks. It all has to work seamlessly together, and Oracle has had a checkered quality record in the past. To be fair, quality issues have created far fewer headlines for the vendor in the recent past.
         
      • Can Oracle sell it? - The challenge now moves from engineering to go-to-market: sales through direct and indirect channels. Oracle needs to onboard thousands of partners in order to maintain the same relevance in the future that it has today.
         
      • A relationship test - Oracle customers usually have less love for their vendor than customers of most other vendors in the market. That relationship needs to improve for Oracle to become a service provider, where renewals are frequent and regular. Very different from the perpetual license model.
         

      MyPOV

      Oracle keeps executing along the vision of the 'IBM of the 21st century' - the single stop for everything an enterprise needs - on premises and in the cloud. The cloud viability has been notched up by significant degrees with the product progress shared at this OpenWorld. Good for customers, as they will get stronger and richer products. It is clear that horizontal integration inside the layers of the technology stack (e.g. a complete SaaS Suite, a powerful PaaS platform) is desirable for customers. How many layers of vertical integration are desired is less certain and will be the interesting story to watch, as we have never had this many layers to deploy hardware and software to in the past, and the once-upon-a-time model of the IBM stack fell apart in the '70s of the last century. Exciting times ahead, we will be watching and analyzing, stay tuned.

      ---------------


       
      I compiled a short presentation discussing all of the first 22 press releases of this OpenWorld - take a look:
       
       
      No time to watch - check out the presentation below:
       
      Oracle OpenWorld - A quick take on all 22 press releases of Day #1 - #3 from Holger Mueller



      More on Oracle OpenWorld:
      • News Analysis - Quick Take on all 22 press releases of Oracle OpenWorld Day #1 - #3 - read here
      • First Take - Oracle OpenWorld - Day 1 Keynote - Top 3 Takeaways - read here
      • Event Preview - Oracle Openworld - watch here

      Future of Work / HCM / SaaS research:
      • Event Report - Oracle HCM World - Full Steam ahead, a Learning surprise and potential growth challenges - read here
      • First Take - Oracle HCM World Day #1 Keynote - off to a good start - read here
      • Progress Report - Oracle HCM gathers momentum - now it needs to build on that - read here
      • Oracle pushes modern HR - there is more than technology - read here. (Takeaways from the recent HCMWorld conference).
      • Why Applications Unlimited is a good strategy for Oracle customers and Oracle - read here.

      Also worth a look for the full picture
      • Event Report - Oracle PaaS Event - 6 PaaS Services become available, many more announced - read here
      • Progress Report - Oracle Cloud makes progress - but key work remains in the cellar - read here
      • News Analysis - Oracle discovers the power of the two socket server - or: A pivot that wasn't one - TCO still rules - read here
      • Market Move - Oracle buys Datalogix - moves more into DaaS - read here
      • Event Report - Oracle Openworld - Oracle's vision and remaining work become clear - they are both big - read here
      • Constellation Research Video Takeaways of Oracle Openworld 2014 - watch here
      • Is it all coming together for Oracle in 2014? Read here
      • From the fences - Oracle AR Meeting takeaways - read here (this was the last analyst meeting in spring 2013)
      • Takeaways from Oracle CloudWorld LA - read here (this was one of the first cloud world events overall, in January 2013)

      And if you want to read more of my findings on Oracle technology - I suggest:
      • Progress Report - Good cloud progress at Oracle and a two step program - read here.
      • Oracle integrates products to create its Foundation for Cloud Applications - read here.
      • Java grows up to the enterprise - read here.
      • 1st take - Oracle in memory option for its database - very organic - read here.
      • Oracle 12c makes the database elastic - read here.
      • How the cloud can make the unlikeliest bedfellows - read here.
      • Act I - Oracle and Microsoft partner for the cloud - read here.
      • Act II - The cloud changes everything - Oracle and Salesforce.com - read here.
      • Act III - The cloud changes everything - Oracle and Netsuite with a touch of Deloitte - read here

      Finally find more coverage on the Constellation Research website here and check out my magazine on Flipboard and my YouTube channel here.

      Who buys Bitcoin for Identity?

      You'll have to forgive the deliberate inaccuracy in the title, but I just couldn't resist the wordplay. The topic of this blog is the use of the blockchain for identity, and not exactly Bitcoin, which I appreciate is not the same thing. By my facetiousness and by my analysis, you'll see I don't yet take the identity use case seriously.

      Bitcoin was launched in 2009. A paper called "Bitcoin: A Peer-to-Peer Electronic Cash System" had been self-published in late 2008 by a person or persons going by the nom de plume Satoshi Nakamoto, and soon after an open source software base appeared at http://www.bitcoin.org. Bitcoin offered a novel solution to the core problem in electronic cash: how to prevent double spending without reverting to a central authority. Nakamoto's conception is strongly anti-authoritarian, almost anarchic, with an absolute rejection of fiat currency, reserve banks and other central institutions. Bitcoin and its kin aim to change the world, and by loosening the monopolies in traditional finance, they may well do that.

      Separate from that, the core cryptographic technology in Bitcoin is novel, and so surprising it's almost magical. Add to that spell the promises of "security" and "anonymity", and we have a powerful mix that some people see stretching far beyond mere money, and into identity. So is that a reasonable step?

      Bitcoin’s secret sauce

      A decentralised digital currency scheme requires some sort of community-wide agreement about when someone spends a virtual coin, so she cannot spend it again. Bitcoin’s trick is to register every single transaction on one public tamper-proof ledger called the blockchain, which is refreshed in such a way that the whole community in effect votes on the order in which transactions are added or, equivalently, the time when each coin is spent.
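      A toy sketch shows the tamper-evidence that makes such a shared ledger workable (illustrative only; Bitcoin's real block and transaction structures are far richer): each block records the hash of its predecessor, so altering or reordering any past entry breaks every later link.

```python
import hashlib, json

def block_hash(block):
    """Hash a block's canonical JSON form with SHA-256."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

chain, prev = [], "0" * 64  # a made-up genesis predecessor
for tx in ["alice->bob:1.5", "bob->carol:0.7"]:
    block = {"tx": tx, "prev": prev}
    prev = block_hash(block)
    chain.append(block)

# Tampering with the first transaction breaks the link to the second block
chain[0]["tx"] = "alice->mallory:1.5"
print(block_hash(chain[0]) == chain[1]["prev"])  # False: tampering is evident
```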

      No proof of identity or KYC check is needed to register a Bitcoin account; currency – denominated "BTC" – may be transferred freely to any other account. Hence Bitcoin may be called anonymous, although the unique account identifiers are set in stone, providing an indelible money trail that has been the undoing of many criminal Bitcoin users.

      The continuous arbitration of blockchain entries is done by a peer-to-peer network of servers that race each other to double-check a special hash value for the latest refreshed chain. The particular server that wins each race is rewarded for its effort with some Bitcoin. The ongoing background computation that keeps a network like this honest is referred to technically as "Proof of Work" and since there is a monetary reward for helping run the BTC network, the servers are colloquially called miners.
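      The 'race' can be sketched as a toy Proof of Work loop (Bitcoin's real scheme hashes block headers with double SHA-256 against a dynamically adjusted difficulty target, so this shows only the principle): keep trying nonces until the hash meets an arbitrary target.

```python
import hashlib

def mine(block_data, difficulty=4):
    """Toy Proof of Work: find a nonce whose hash has `difficulty` leading zeros."""
    nonce, target = 0, "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest  # the winning 'miner' publishes this proof
        nonce += 1

nonce, digest = mine("ledger refresh: alice->bob:1.5")
print(nonce, digest)  # expensive to find, cheap for everyone else to verify
```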

      Whether or not Bitcoin lasts as a form of electronic cash, there is a groundswell of enthusiasm for the blockchain as a new type of decentralized public ledger, or distributed ledger technology (DLT), for a much broader range of transactions, including “identity”. The shudder quotes are deliberate on my part, reflecting that the blockchain-for-identity speculations have not been clear about what part of the identity puzzle they might solve.

      For identity applications, the reality of Bitcoin mining creates some particular challenges which I will return to. But first let’s look at the positive influence of Bitcoin and then review some of its cryptographic building blocks.

      Bitcoin inspirations

      Bitcoin solves what was thought to be an unsolvable problem - double spending of electronic cash. It's the latest example of a wondrous pattern in applied maths. Unsolvable problems are, in fact, solved quite often, after which frenetic periods of innovation can follow. The first surprise or prototype solution is typically inefficient but it can inspire fresh thinking and lead to more polished methods.

      One of the greatest examples is Merkle’s Puzzles, a theoretical method invented by Ralph Merkle in 1974 for establishing a shared secret number between two parties who need only exchange public pieces of data. This was the holy grail for cryptography, for it meant that a secret key could be set up without having to carry the secret from one correspondent to the other (after all, if you can securely transfer a key across a long distance, you can do the same with your secret message and thus avoid the hassle of encryption altogether). Without going into detail, Merkle’s solution could not be used in the real world, but it solved what was thought to be an unsolvable problem. In quick succession, practical algorithms followed from Diffie & Hellman, and Rivest, Shamir & Adleman (the names behind “RSA”) and thus was born public key cryptography.

      Bitcoin has spurred dozens of new digital currencies, with different approaches to ledgers and arbitration, and different ambitions too (including Ripple, Ethereum, Litecoin, Dogecoin, and Colored Coins). They all promise to break the monopoly that banks have on payments, radically cut costs and settlement delays, and make electronic money more accessible to the unbanked of the world. These are what we might call liquidity advantages of digital currencies. These objectives (plus the more political promises of ending fiat currency and rendering electronic cash transactions anonymous or untraceable) are certainly all important but they are not my concern in this blog.

      Bitcoin's public sauce

      Before looking at identity, let's review some of the security features of the blockchain. We will see that safekeeping of each account holder's private keys is paramount, as it is in all Internet payment systems and PKIs.

      While the blockchain is novel, many elements of Bitcoin come from standard public key cryptography and will be familiar to anyone in security. What's called a Bitcoin "address" (the identifier of someone you send currency to) is in essence a public key (strictly, a hashed and encoded form of one). To send Bitcoin from your own address, you use the matching private key to sign a transaction object, which is sent into the network to be processed and ultimately added to the blockchain, as sketched below.
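      The sign-to-spend pattern looks roughly like this sketch, which uses the third-party Python ecdsa package; the transaction payload and placeholder addresses are invented for illustration, and real Bitcoin transactions follow a precise binary format.

```python
import hashlib
from ecdsa import SigningKey, SECP256k1  # third-party: pip install ecdsa

sk = SigningKey.generate(curve=SECP256k1)  # the wallet's private key
vk = sk.get_verifying_key()                # the public key behind an "address"

tx = b"pay 0.5 BTC from <my-address> to <your-address>"  # illustrative payload
signature = sk.sign(tx, hashfunc=hashlib.sha256)

# Anyone on the network can validate the spend with the public key alone:
assert vk.verify(signature, tx, hashfunc=hashlib.sha256)
```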

      The only authoritative record of anyone's Bitcoin balance is held on the blockchain. Account holders typically operate a wallet application which shows their balance and lets them spend it, but, counter-intuitively, the wallet holds no money. All it does is control a private key (and provide a user experience of the definitive blockchain). The only way to spend your balance - that is, to transfer part of it to another account address - is to use your private key. What follows is an unforgiving reality of Bitcoin: your private key is everything. If a private key is lost or destroyed, then the balance associated with that key is frozen forever and cannot be spent. And thus there has been a string of notorious mishaps in which computers or disk drives holding Bitcoin wallets have been lost, together with the millions of dollars of value they controlled. Furthermore, numerous pieces of malware have - predictably - been developed to steal Bitcoin private keys from regular storage devices (and law enforcement agencies have intercepted suspects' private keys in the battle against criminal use of Bitcoin).

      You would expect the importance of Bitcoin private key storage to have been obvious from the start, to ward off malware and destruction, and to allow for reliable backup. But it was surprisingly late in the piece that "hardware wallets" emerged, the best known of which is probably the Trezor, released in 2013. The use of hardware security modules for private key management in soft wallets or hybrid wallets has been notably ad hoc. It appears cryptocurrency proponents pay more attention to the algorithms and the theory than to practical cryptographic engineering.

      Identifying with the blockchain

      The enthusiasm for cryptocurrency innovation has proven infectious, and many commentators have promoted the blockchain in particular as something special for identity management. A number of start-ups are "providing" identity on the blockchain - including OneName and ShoCard - although on closer inspection this usually means nothing more than reserving a unique blockchain identifier bound to a self-claimed pseudonym.

      Prominent financial services blogger Chris Skinner says "the blockchain will radically alter our futures" and envisages an Internet of Things where your appliances are "recorded [on the blockchain] as being yours using your digital identity token (probably a biometric or something similar)". And the government of Honduras has announced that American Bitcoin technology firm Factom will build a blockchain-based land title registry, which they claim will be "immutable", resistant to insider fraud, and extensible to "more secure mortgages, contracts, and mineral rights". Interestingly, the Factom-Honduras project stalled for the second half of 2015. I find it emblematic of the whole blockchain craze that one of the most popular use cases for decentralized ledger technology is little more than a press release.

      While blockchain aficionados have been quick to make the leap to identity, the opposite is not the case: the identerati haven't had much to say about blockchain at all. Ping Identity CTO Patrick Harding mentioned it in his keynote address at the 2015 Cloud Identity Summit, and got a meek response from the audience when he asked who knew what blockchain is (I was there). Harding's suggestions were modest, exploratory and cautious. And only now has blockchain figured prominently in the twice-yearly freeform Internet Identity Workshop unconference in Silicon Valley. I'm afraid it's telling that all the initial enthusiasm for blockchain "solving" identity has come from non-identity professionals.

      What identity management problem would be solved by using the blockchain?

      The most prominent challenges in digital identity include the following:

      • account creation including validation of identity or other attributes
      • the cost and inconvenience of multiple account registrations
      • the inconvenience and insecurity of multiple usernames and passwords
      • identity theft and account takeover
      • interoperability of identity data or attributes between services and applications
      • provenance of attributes.

      What does the blockchain have to offer?

      Certainly, pseudonymity is important in some settings, but is rare in economically important personal business, and in any case is not unique to the blockchain. The secure recording of transactions is very important, but that’s well-solved by regular digital signatures (which remain cryptographically verifiable essentially for all time, given the digital certificate chain). Most important identity transactions are pretty private, so recording them all in a single public register instead of separate business-specific databases is not an obvious thing to do.

      The special thing about the blockchain and the proof-of-work is that they prevent double-spending. I’ve yet to see a blockchain-for-identity proposal that explains what the equivalent “double identify” problem really is and how it needs solving. And if there is such a thing, the price to fix it is to record all identity transactions in public forever.

      The central user action in all blockchain applications is to “send” something to another address on the blockchain. This action is precisely a digital (asymmetric cryptographic) signature, essentially the same as any conventional digital signature: a data object is hashed, and the hash is signed with one’s private key. The integrity and permanence of the action come from the signature itself; it is immaterial where the signature is stored.

      What the blockchain does is prevent a user from performing the same action more than once, by using the network to arbitrate the order in which digital signatures are created. In regular identity matters, this objective simply doesn’t arise. The primitive actions in authentication are to leave one’s unique identifying mark (or signature) on a persistent transaction, or to present one’s identity in real time to a service. Apart from peer-to-peer arbitration of order, the blockchain is just a public ledger - and a rather slow one at that. Many accounts of blockchain uses beyond payments simply speak of its inviolability or perpetuity. In truth, any old system of digitally signed database entries is reasonably inviolable: tamper resistance and integrity come from the digital signatures, not from the blockchain (see the sketch below). And as mentioned, the blockchain itself doesn't provide any assurance of who really did what - for that we need separate safeguards on users' private keys, plus reliable registration of users and their relevant attributes (which incidentally cannot be done without some authority, unless self-attestation is good enough).
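      To make the point concrete, here is a sketch - again using the third-party Python ecdsa package, with made-up record contents - of a plain list of signed entries: tampering with any stored record is detectable by signature verification alone, wherever the records happen to live.

```python
import hashlib
from ecdsa import SigningKey, SECP256k1, BadSignatureError  # pip install ecdsa

sk = SigningKey.generate(curve=SECP256k1)
vk = sk.get_verifying_key()

# An ordinary signed log - no blockchain in sight. Contents are illustrative.
records = [(entry, sk.sign(entry, hashfunc=hashlib.sha256))
           for entry in (b"alice owns lot 42",
                         b"alice transfers lot 42 to bob")]

# Tamper with the stored copy of the first record...
records[0] = (b"mallory owns lot 42", records[0][1])

# ...and verification fails, regardless of where the records are stored.
for entry, sig in records:
    try:
        vk.verify(sig, entry, hashfunc=hashlib.sha256)
        print("ok:", entry.decode())
    except BadSignatureError:
        print("TAMPERED:", entry.decode())
```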

      In addition to offering little for identity management, the blockchain has at least two practical downsides for recording non-Bitcoin activity, both related to the proof-of-work. The peer-to-peer resolution of the order of transactions takes time: in Bitcoin a new block is settled roughly every ten minutes, so a transaction waits at least that long (and prudent recipients wait for several further blocks) before it is confirmed. Clearly, in real-time access control use cases, when you need to know who someone is right away, such delay is unacceptable. The other issue is cost. Proof-of-work, as the name is meant to imply, consumes real resources, and elicits a real reward.

      So for arbitrary identity transactions, what are the economics of using the blockchain? Who would pay, who would be paid, and what market forces would price identity in this utopia where all accounts are equal?

      Digital Safety, Privacy & Cybersecurity Innovation & Product-led Growth Matrix Commerce Data to Decisions Future of Work New C-Suite Tech Optimization AI Blockchain Security Zero Trust Chief Information Officer Chief Executive Officer Chief Technology Officer Chief AI Officer Chief Data Officer Chief Analytics Officer Chief Information Security Officer Chief Product Officer Chief Privacy Officer