Results

The Robots Are Here!, The Future of HR Tech #CCE2015


The robots are coming! What does the future of work have in store for people in the digital age? Will it be a dystopian existence where humans are the bottleneck to productivity and innovation, or will we enter a world of augmented humanity and the humanization of digital?

Watch the video: Visionaries: The Robots Are Here!, The Future of HR Tech #CCE2015 from Constellation Research on Vimeo: https://vimeo.com/144780869

Software AG – Betting Big on Digital Platform


A few weeks ago I spent the majority of my time in Las Vegas with Software AG at their Innovation World summit. Unlike Vegas, what happened at Innovation World isn’t going to stay at Innovation World…okay, poor attempt at humor. Software AG should be touting what they spoke about during the 3 days at the Aria Hotel and Casino. The main theme of the event was the digital platform Software AG is doubling down on. Smart move? Time will tell…but it is a positive move in light of the digital disruption we are in the midst of, and of the even greater changes we all believe are coming.


Software AG aggressively pushed their digital platform agenda from day one on main stage as their executives took turns highlighting Software AG’s efforts in creating a digital platform to allow their customers to innovate. Karl-Heinz Streibich, Software AG’s CEO, focused on seven drivers that are the catalysts for digital disruption:

  • The shared economy, driven in large part by the Internet.
  • Standardization; think of the smartphones we all carry.
  • Asset-light companies; as digital becomes the main asset, companies are shedding traditional assets.
  • Transparency brought by greater connectivity; this will only accelerate with the rise of the IoT (Internet of Things).
  • Fast sprints of innovation; digital platforms allow for rapid innovation.
  • Lower costs; digitization reduces costs.
  • Unbridled creativity; once digitization has touched “everything,” creativity will only continue to explode.

Such drivers continue to change the way companies and service providers address the market. Of course the undertone from main stage was that the customers know their business needs best, while Software AG knows best how to translate those needs into bits and bytes. This is the nature of the business world we are currently living in – digital has created an acceleration that is unprecedented. Vendors and solution providers cannot predetermine what problems and business use cases their customers will have next quarter, let alone in 6-12 months. The idea is to provide the platform – truly a Lego set – for customers to then go out and create their own solutions. Karl-Heinz was right in saying: you know your business better than anyone, but we know the digital aspects better than you…let us work together.

There is a caveat to this notion – while digital has greatly disrupted business, some truisms will remain. The majority of business use cases have basic aspects and needs that are consistent across industries and companies. In this light it is important that companies such as Software AG lean on their industry teams to accelerate focused business processes that are vertically aligned. Verticals such as retail, finance or manufacturing are already looking at how they can bring their industry insights to accelerate the time to productivity with their solution sets. Can Software AG leverage its vertical teams to start creating standard building blocks that the N+1 customer in a specific vertical can take advantage of?

The challenge for the German software vendor lies in its ability to pivot into verticals and compete with other service providers that have longer histories and track records servicing them. The advantage Software AG brings is that it is not wed to legacy offerings, a fight many other vendors are still waging.


SuperNova Award Winner Portrait - Asha Aravindakshan of Ashoka with FinancialForce HCM


Last week at Constellation Connected Enterprise we awarded our SuperNova awards for the most successful transformations we have seen in the last 12 months. The winner in the "Future of Work" category was Asha Aravindakshan of nonprofit Ashoka (see more here).

 
 
We had the chance to speak with Asha at CCE - so take a look:
 
 
In case you didn't have a chance to watch - here are the key takeaways:
 
  • Ashoka is a nonprofit working globally with 400+ employees in 47 countries; outside the US, every country has 10 or fewer employees.
  • Ashoka went from a spreadsheet-based solution to using FinancialForce HCM.
  • In a DIY approach, Aravindakshan, together with the Salesforce system administrator and a FinancialForce consultant, at an effort of less than 6 hours per week, took Ashoka live in 10 weeks.
  • Aravindakshan used the yearly performance review cycle to drive adoption of the system, reaching over 97% of the employee base in the first month.
  • Training was handled in less than one hour, as users were already familiar with the user interface from using Salesforce.
 
Find more coverage on the Constellation Research website here, and check out my magazine on Flipboard and my YouTube channel here.
 

Event Hubs, or Engines, add the ‘React’ capability to Analytics - Turning IoT event triggers and data into high value business outputs.


In recent weeks many of the big names of the technology industry have announced ‘Event Engines’ in association with other parts of their Cloud-based product sets. In addition, a number of startups are also offering Event Processing engines, with a variety of different benefits, that will run on an available Cloud service. Most are specifically announced as part of Internet of Things, IoT, solutions, or sometimes IoT suites, but what’s different from BPM business process engines?

The better question is why are these announced specifically for IoT solutions, and how do they fit with the development of IoT architectures as covered in the preceding two blogs in this series?

Research report now available: The Foundational Elements for the Internet of Things (IoT)

The sheer volume of devices and resulting data flows makes real time Analytics essential to ‘read’ and tag the valuable ‘Insights’, but real business value lies in using Event Hubs/Engines coupled with Analytical output to respond, or ‘react’, with sophisticated Business valuable actions.

Event Processing is not new, but Business valuable IoT deployments demand Complex Event Processing, CEP, to combine data from multiple sources into meaningful events and then orchestrate process elements into optimized Process responses. Add the demand for real time, the unique combination of both ‘push’ and ‘pull’ data, frequent changes plus dynamic utilisation, and a new generation of cloud-based Event Engines is required.

Event Engines sit in a third layer over the Internet of Things, IoT, architecture that is becoming increasingly clearly defined. A base-layer connectivity Infrastructure is built on Fog Computing, or Edge-based Clouds, to keep ‘interactions’ between IoT Devices local and fast. Built over this are the store/search capabilities of Graph Databases, with their unique capability to establish relationships between data around connections, similar to the manner in which data is created by IoT Devices.

Event Engines are effectively the third layer of IoT architecture providing the higher levels of Business Value by managing responses to Event Trigger conditions, either as a complex event process from a single trigger event, or to draw a conclusion from a complex alignment of the flow of current data with stored data. The association of a Graph Database with IoT and Event Engines operating on Clouds has led to some Event Engine providers calling their associated (Graph)Database an ‘Event Cloud’.

It’s not hard to imagine any number of examples that would be defined as simple event processes; classically, the story of the towel dispenser running out of towels and calling to be refilled. To some the definition of IoT remains that of a simple sensor on a machine reporting a certain value as changed, but to many it is now understood to be a wide range of data flowing from many different types of connected Devices. As IoT deployment scale increases, both the value and the amount of data to be processed call for automation of ‘React’ outputs, and possibly integration with existing Enterprise Applications.

Complex Event Processing, CEP, is by definition complex to explain, but in simple terms it is about finding new values from combinations of data and delivering an output that is Business Valuable. As is often the case with new Technology, Event Engines and CEP are best understood by using examples of what can be done. The following two examples are edited versions of examples from the Wikipedia explanation of Complex Event Processing.

As an example of Complex Event Processing consider a car equipped with just three simple IoT sensors that measure Tyre Pressure, Speed, and the presence of a Driver via seat pressure. Individually each is able to offer a data flow and trigger condition. Combining the data flows from these same three simple sensors using Complex Event Processing produces new data that is wholly different and of much higher value.

CEP Example 1: The speed sensor indicates the car is moving while the tyre pressure sensor data flow indicates the pressure in one tyre is dropping from 45 psi to 41 psi over 60 minutes. As the pressure in the tyre decreases, a series of data events reporting the tyre pressure is generated. In parallel a data flow is generated indicating the car is being driven (the presence of a driver and speed of movement). The car’s Event Processing Engine combines all three current and stored data flows to define the situation as gentle tyre deflation over a period of time and outputs the display "Loss of Tyre Pressure" to the driver. This output may also be written into the structured database of the car’s maintenance log, and in connected cars may even be sent out to seek tyre puncture repair options.

CEP Example 2: Changing just one event-reporting parameter in the same situation produces an entirely different output and triggers appropriately different actions. If the tyre pressure drops the same amount, but in 5 seconds, then the car’s Event Processing Engine will conclude the output to be “Tyre Punctured”, or “Blow Out”. This potentially catastrophic event will bring into play skid management control, hazard lights coming on, and possibly a warning being flashed externally to warn of a potential accident.
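The two examples above can be sketched in a few lines of code. The following is a minimal illustration only, not any vendor's engine; the thresholds, names, and the simple rate calculation are assumptions chosen to mirror the scenario in the text.

```python
from dataclasses import dataclass

# Illustrative thresholds; a real CEP engine would express these as
# configurable rules rather than hard-coded constants.
SLOW_LEAK_PSI_PER_MIN = 0.5   # at or below this drop rate => gradual deflation
DROP_PSI = 4.0                # total drop that triggers evaluation

@dataclass
class PressureEvent:
    t_seconds: float   # timestamp of the reading
    psi: float         # tyre pressure reading

def classify_pressure_drop(events, car_moving: bool):
    """Combine a flow of pressure events with the 'car moving' signal
    (speed + driver-seat sensors) to produce a higher-value output."""
    if not car_moving or len(events) < 2:
        return None
    first, last = events[0], events[-1]
    drop = first.psi - last.psi
    if drop < DROP_PSI:
        return None  # not enough change to matter yet
    minutes = (last.t_seconds - first.t_seconds) / 60.0
    rate = drop / minutes if minutes > 0 else float("inf")
    # Same inputs, different rate of change => entirely different complex event.
    return "Loss of Tyre Pressure" if rate <= SLOW_LEAK_PSI_PER_MIN else "Tyre Punctured"

# 45 -> 41 psi over 60 minutes while driving: gentle deflation
slow = [PressureEvent(0, 45.0), PressureEvent(3600, 41.0)]
# Same drop in 5 seconds: blow-out
fast = [PressureEvent(0, 45.0), PressureEvent(5, 41.0)]
print(classify_pressure_drop(slow, car_moving=True))  # Loss of Tyre Pressure
print(classify_pressure_drop(fast, car_moving=True))  # Tyre Punctured
```

The point of the sketch is that the same three data flows, processed together, yield two entirely different high-value outputs depending only on the rate of change.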

As with all forms of IoT deployment there are strong business management reasons to understand and define what outputs are required, just as Technology practitioners should know how to use IoT to deliver the requirement. The providers of Event Engines look to offer best-practice drag-and-drop process design tools so that, as with many Cloud-based capabilities, the creation of Event Processes may become a business user activity.

The above two examples indicate the principles of the two major forms of CEP, Aggregation and Detection, as well as the common combination of both into a Hybrid solution.

Aggregation Oriented CEP carries out processes by continuously calculating an ‘average value’ from multiple data flows to produce and trigger an output. Vibration increasing over a period, taken in combination with speed in revs per minute multiplied by hours run, might indicate bearing wear; whereas rising, or constantly high, engine temperature plus speed and hours running could be used to indicate when an oil change might be required.
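As a sketch of the aggregation style, the toy engine below continuously averages a vibration data flow over a sliding window and combines it with rpm and hours run before triggering a maintenance output; the class name, thresholds, and trigger logic are invented for the example and do not come from any real engine.

```python
from collections import deque

class AggregationCEP:
    """Toy aggregation-oriented CEP: average a vibration flow over a
    sliding window and combine it with speed (rpm) and hours run to
    trigger a bearing-wear output. All thresholds are illustrative."""

    def __init__(self, window=5, vibration_limit=7.0, wear_hours=1000):
        self.readings = deque(maxlen=window)  # sliding window of vibration values
        self.vibration_limit = vibration_limit
        self.wear_hours = wear_hours

    def on_event(self, vibration, rpm, hours_run):
        self.readings.append(vibration)
        avg = sum(self.readings) / len(self.readings)
        # The trigger fires only on the *combination* of conditions,
        # never on any single sensor value.
        if avg > self.vibration_limit and rpm > 0 and hours_run > self.wear_hours:
            return f"Possible bearing wear (avg vibration {avg:.1f})"
        return None

engine = AggregationCEP()
outputs = [engine.on_event(v, rpm=3000, hours_run=1500)
           for v in (6.0, 6.5, 7.2, 8.0, 8.5)]
# Only the final reading pushes the rolling average over the limit.
```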

Detection Oriented CEP seeks to find a required output trigger from a combination of event inputs in which a determined pattern or sequence can be found. Facebook’s search capability is an example, using Detection Oriented CEP to look for alignments and matches between apparently unrelated data held in Graph Databases.
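A detection-oriented engine can be illustrated as a search for a defined sequence within an event stream; the event names and pattern below are purely hypothetical placeholders.

```python
# Toy detection-oriented CEP: look for a defined *sequence* of event
# types in a stream, tolerating unrelated events in between.
PATTERN = ["badge_scan", "door_open", "door_open"]  # e.g. possible tailgating

def detect_sequence(events, pattern=PATTERN):
    """Return True once the pattern occurs as a subsequence of the stream."""
    idx = 0
    for event_type in events:
        if event_type == pattern[idx]:
            idx += 1
            if idx == len(pattern):
                return True
    return False

# The unrelated "camera_ping" event does not break the match.
stream = ["badge_scan", "camera_ping", "door_open", "door_open"]
print(detect_sequence(stream))  # True
```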

Hybrid CEP is rapidly becoming a norm as the number of IoT sensors and Devices producing data flows increase Event Processing possibilities and informed users ask for a wider selection of output conditions.

It is tempting, and wrong, to draw parallels with event processing in Business Process Management, BPM, and therefore to consider BPM Rule Engines for IoT event processing. BPM Rule Engines are neither meant for continuous dynamic reprogramming, nor can they combine IoT Device ‘push’ data with ‘pull’ data from APIs – two of the most obvious limitations.

Complex Event Processing, allied to Fog Computing and Graph Databases, makes for true game-changing capabilities of the type that underpin many of the new high-value Business disruptive capabilities. Simple Event trigger alarms may be enough to justify the first generation of IoT pilot deployments connected through small-scale ‘Intranet’ deployments, but the real prize of a Global IoT environment is far higher levels of direct business value.

The Enterprises in each sector that are the first to learn how to connect, collate and process the new streams of Data in a massively connected IoT Device world will immediately gain the competitive advantage they seek. It’s right to see that Data delivers the advantage, but unlocking that advantage requires a new understanding of exactly what IoT is and how it works. The leading Enterprises are there already; think of Amazon, Facebook, or Google to understand how they collect and use data to monetize their new business models, and look around in your own Sector.

Resources

The Foundational Elements for the Internet of Things (IoT)

Some further sources for information on Event Hubs/Engines linked to Cloud Suites

Salesforce Products and Platform

http://www.salesforce.com/iot-cloud/

SAP HANA and IoT

http://go.sap.com/uk/product/technology-platform/iot-platform-cloud.html

Microsoft IoT solution products overview 

https://azure.microsoft.com/en-gb/services/stream-analytics/

AWS  IoT product architecture overview

https://aws.amazon.com/iot/how-it-works/

Google IoT cloud products

https://cloud.google.com/solutions/iot/

AWS examined by TechCrunch 

http://techcrunch.com/2015/10/08/amazon-announces-aws-iot-a-platform-for-building-managing-and-analyzing-the-internet-of-things/#.fkbxmq:1f9A

Oracle IoT solution architecture as announced at OOW autumn 2015

https://www.oracle.com/solutions/internet-of-things/index.html 

How to Secure the Best Cloud Software Contract Webinar with R “Ray” Wang


Constellation Research principal analyst and bestselling author R "Ray" Wang will teach you how to structure the most favorable cloud software contract for your organization. Wang will share his top cloud negotiation tips, derived from his involvement in 1,000+ contract negotiations.

Don't get trapped in an unfavorable cloud contract. Learn how to navigate the increasingly complex cloud services market by registering for this webinar today. 

Details:

You will learn:

  • Top cloud negotiation tips from one of enterprise tech's leading analysts
  • Common contract negotiation pitfalls
  • How to ensure you secure the best cloud services deal

Constellation Announces 2015 SuperNova Award Winners


Constellation announces the winners of the 2015 SuperNova Awards at Constellation's Connected Enterprise.

2015 SuperNova Award Winners

Constellation announced nine SuperNova Award winners last night at the SuperNova Awards Gala Dinner. 

The Constellation SuperNova Awards are the first and only awards to celebrate the leaders and teams who have overcome the odds to successfully apply emerging and disruptive technologies for their organizations. The SuperNova Award winners demonstrated great leadership in selecting, implementing, and deriving business value from disruptive technologies. More information about the winners below. 

All applications were evaluated by the SuperNova Award Judges, comprised of industry thought leaders, and then put to a public vote. 


Consumerization of IT & The New C-Suite - Martin Brodbeck, CTO, Sterling Backcheck

Martin Brodbeck won the SuperNova Award for his implementation of SnapLogic Infrastructure-as-a-Service solutions to streamline Sterling Backcheck’s automated transactions service. He moved the entire system from a complicated, custom-built, open-source system to one streamlined cloud-based transactions system. The project shortened the customer onboarding time to just days. Frictionless transactions in the cloud translate to faster time to revenue for Sterling Backcheck.

Data to Decisions - Alda Mizaku, Lead Business Solutions Analyst, Mercy

Mizaku and her team won the SuperNova Award for leading a big data analytics project to improve the delivery of patient care and bridge the gaps between the clinical and coding worlds. Part of the strategy included automation of secondary diagnosis detection targeted to improve the accuracy of provider documentation. This strategy seeks to more accurately reflect care that has already been provided and help bubble up comorbidity factors. This project, which uses a combination of data warehouse tables, ETL logic, and custom reporting led to a diagnosis increase of 36%.  

Digital Marketing Transformation - Naveen Gunti, Sr. Director of e-Commerce Technology and Operations, Tumi Holdings, Inc.

Naveen Gunti won the SuperNova Award for his use of Adobe Marketing Cloud to improve Tumi’s online customer experience. The implementation enabled customers to view dynamic images of Tumi products on the Tumi website. The resulting increased customer engagement on the website led to an increase of on-site time by 40%.

Future of Work, Human Capital Management - Asha Aravindakshan, Operations Director, Global Talent, Ashoka

Asha Aravindakshan won the SuperNova Award for leading the implementation of FinancialForce HCM at Ashoka. Prior to the implementation, Ashoka used Excel spreadsheets to manage their global workforce. Recognizing adoption as an essential element required for the success of the new HCM system, Aravindakshan led a movement to conduct performance reviews on FinancialForce. 97% of Ashoka's employees engaged with FinancialForce in response to this performance review incentive. Thanks to Aravindakshan's leadership, Ashoka can now address the HCM needs of an international workforce.

Future of Work, Social Business - Steve Nava, Sr. Director Field Service, Luminex

Steve Nava won the SuperNova Award for leading the implementation of ServiceMax to improve communication and collaboration of field service engineers at Luminex. The implementation was so successful that it transformed Luminex’s field service department into a solutions-oriented business.  Luminex’s fix rate increased to 98%, and the invoice cycle went from 28 days to 96 hours. 

Matrix Commerce - Jordan Kivelstadt, CEO, Free Flow Wines

Jordan Kivelstadt won the SuperNova Award for leading one of the first companies to deliver wine in kegs. He used Netsuite to address the specific supply chain issues presented by this model. Free Flow Wines is disrupting the wine industry as more restaurants are choosing to serve wine in kegs. This year, Free Flow Wines expects to deliver the equivalent of 300,000 cases of wine.

Next Generation Customer Experience - Dan Wallis, Director of KP OnCall, Kaiser Permanente

Dan Wallis won the SuperNova Award for leading the implementation of Oracle Service Cloud to support a system that helps to automatically diagnose health conditions via the web. This project has allowed Kaiser Permanente to better serve their customers by reducing costs and wait times. Now Kaiser patients can accurately self-diagnose conditions without visiting a doctor or calling a nurse call center. 

Technology Optimization & Innovation - Dr. David Bray, Chief Information Officer, Federal Communications Commission

Dr. David Bray won the SuperNova Award for his overhaul of the FCC’s legacy Consumer Help Center. He implemented cloud-based Zendesk to modernize the Commission's Help Center. The Zendesk implementation helped the FCC replace 18 outdated complaint forms, activate 24/7 complaint tracking, and improve transparency. The cloud-based solution selected and implemented under Bray's leadership cost roughly one-sixth of a custom-built, in-house solution.

Technology Optimization & Innovation - Erica Stevens, VP of Supply Chain and IT, Dylan's Candy Bar

Erica Stevens won the SuperNova Award for her implementation of Netsuite at Dylan’s Candy Bar. This implementation transformed Dylan’s into a ubiquitous channel retailer. Dylan's is now able to scale and pivot its retail operations to meet the needs of its customers on many channels, including web, mobile, and brick and mortar.

The Rewards

Congratulations to the winners! Continue to be brave, innovative, and disruptive!


My opening remarks on privacy at Constellation Connected Enterprise 2015


A big part of my research agenda in the Digital Safety theme at Constellation is privacy. And what a vexed topic it is! It's hard to even know how to talk about privacy. For many years, folks have covered privacy in more or less academic terms, drawing on sociology, politics and pop psychology, joining privacy to human rights, and crafting various new legal models.

Meanwhile the data breaches get worse, and most businesses have just bumped along.

When you think about it, it's obvious really: there's no such thing as perfect privacy. The real question is not about 'fundamental human rights' versus business, but rather, how can we optimise a swarm of competing interests around the value of information?

Privacy is emerging as one of the most critical and strategic of our information assets. If we treat privacy as an asset, instead of a burden, businesses can start to cut through this tough topic.

But here's an urgent issue. A recent regulatory development means privacy may just stop a lot of business getting done. It's the European Court of Justice decision to shut down the US-EU Safe Harbor arrangement.

The privacy Safe Harbor was a work-around negotiated by the Federal Trade Commission, allowing companies to send personal data from Europe into the US.

But the Safe Harbor is no more. It's been ruled unlawful. So it's a big, big problem for European operations, many multinationals, and especially US cloud service providers.

At Constellation we've researched cloud geography and previously identified competitive opportunities for service providers to differentiate and compete on privacy. But now this is an urgent issue.

It's time American businesses stopped getting caught out by global privacy rulings. There shouldn't be too many surprises here, if you understand what data protection means internationally. Even the infamous "Right To Be Forgotten" ruling on Google's search engine - which strikes so many technologists as counter-intuitive - was a rational and even predictable outcome of decades-old data privacy law.

The leading edge of privacy is all about Big Data. And we ain't seen nothin' yet!

Look at artificial intelligence, Watson Health, intelligent personal assistants, hackable cars, and the Internet of Everything where everything is instrumented, and you see information assets multiplying exponentially. Privacy is actually just one part of this. It's another dimension of information, one that can add value, but not in a neat linear way. The interplay of privacy, utility, usability, efficiency, efficacy, security, scalability and so on is incredibly complex.

The broader issue is Digital Safety: safety for your customers, and safety for your business.


Randstad Sourceright: Good Progress and the beginning of a balancing act


We had the opportunity to attend the Randstad Sourceright analyst summit this week. It took place in Atlanta and was well attended by the analyst community.

 

Take a look at my top 3 takeaways of the event:

 
 
If you don't have a chance to watch - here are the takeaways:
 
  • Talentradar debut - Randstad Sourceright showed the first deliverable on its product roadmap with Talentradar. It brings together recruiting information across the various systems involved, as well as Randstad Sourceright partners such as Hirevue, Smashfly, etc. Technologies mentioned were Informatica and R, and we could see Domo being used for visualization. Randstad Sourceright has delivered a solid version one of the product; now we need to understand the roadmap and customer adoption as next steps.
     
  • Standardization - The whole outsourcing industry is recovering from a hangover of overly customized deals sold and implemented early in the millennium. The answer is standardization and leveraging global capabilities, and Randstad Sourceright is making good progress on both fronts. Discipline is key, though, and it was reassuring to hear the North American sales leaders state that they would walk away from business that does not fall inside the parameters of standard delivery.
     
  • RiseSmart - In a surprise move Randstad Sourceright acquired outplacement vendor RiseSmart (see here for the vendor and here for the press release). RiseSmart CEO Sathe was there and told the vendor's story - bringing software to the outplacement business. In a good move Randstad Sourceright has decided to keep RiseSmart operating independently.



 

    MyPOV

    It is good to see Randstad Sourceright growing and making progress standardizing and globalizing its product offerings. The acquisition of RiseSmart opens new revenue potential and the chance to disrupt the outplacement market. Equally it is good to see the product focus showing first deliverables with Talentradar. And the vendor is keeping tabs on a booming Recruiting startup ecosystem with its Randstad Foundation Fund.
     
    On the concern side, Randstad Sourceright will have to take into account that more resourcing decisions will be made by software, and more resourcing decisions will be made by hiring managers directly rather than by recruiters. These changes are disruptive for Randstad Sourceright customers and therefore for the vendor itself. Preparing and switching over in time will be the key challenge for executive management in the coming years.
     
    But for now congrats on good progress, we will be keeping tabs, stay tuned. 

    --------------

    More on Recruiting


     
     

    • Musings - How Technology Innovation fuels Recruiting and disrupts the Laggards - read here
    • Musings - What is the future of recruiting? Read here
    • HRTech 2014 takeaways - read here.
    • Why all the attention to recruiting? Read here.
    Find more coverage on the Constellation Research website here, and check out my magazine on Flipboard and my YouTube channel here.


     

    Pivotal Now Makes It Easier Than Ever to Take Software from Idea to Production


     
    Today Pivotal used its upcoming European Cloud Foundry user conference to release a round-up press release on its overall progress… time to check in on where Cloud Foundry stands today.

     


    So let’s dissect the press release in our customary style:

     
    San Francisco, November 2, 2015 – Pivotal®, the company accelerating digital transformation for enterprises, today announced a new release of Pivotal Cloud Foundry, the comprehensive Cloud Native platform that unifies the software delivery process with an integrated application framework, platform runtime, and infrastructure automation. Pivotal Cloud Foundry now includes expanded support for Spring Cloud Services, Microsoft Azure, .NET applications, Docker images, and application lifecycle management. With these enhancements, Pivotal further enables businesses to rapidly build, deploy, and operate Cloud Native applications on a wide choice of hosted, public, and private clouds.

    MyPOV – Good summary to start the press release – hitting all the key new capabilities, which we will dissect and comment on below. It is worth mentioning the ‘cloud native’ positioning here; it will be interesting to see if Pivotal can pull off that association between cloud native and its products.

     
    “The days of monolithic technologies are ending. Today’s modern enterprises practice agile software development with Cloud Native tools, process, and culture that can respond to speed of market and customer demand,” said James Watters, vice president and general manager, Cloud Platform Group, Pivotal. “Pivotal Cloud Foundry delivers a comprehensive Cloud Native application development and operations environment so you can spend time building business value instead of your IT infrastructure.”

    MyPOV – Good quote by Watters, hitting the right value proposition of Cloud Foundry, though the tool itself is also monolithic – in the sense of offering one way to build software.

     
    Integrated Microservices with Spring Cloud Services  
    Based on the popular Spring Cloud OSS, which is used by Netflix to operate its global, on-demand video streaming service, Spring Cloud Services for Pivotal Cloud Foundry goes one step further to provide opinionated provisioning and lifecycle management to these components.
    The first and only secure, enterprise-ready distribution of core Netflix OSS components, Spring Cloud Services enables developers and operators of Cloud Native distributed systems architectures to quickly and easily build microservices by adding a suite of production-ready services to the Pivotal Cloud Foundry marketplace. Spring Cloud Services allows developers to focus on delivering business value and defers the deployment and management of important distributed systems patterns such as application configuration, service discovery, and fault-tolerance to the Pivotal Cloud Foundry platform.
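    The marketplace-driven provisioning described above can be sketched with the cf CLI. This is an illustrative sequence only – the service names (`p-config-server`, `p-service-registry`), plan name, and app name are assumptions; the actual marketplace entries depend on how the Spring Cloud Services tile is installed in a given foundation.

    ```shell
    # Hypothetical service/app names; actual offerings vary per installation.
    cf marketplace                                          # list available services
    cf create-service p-config-server standard config       # provision a Config Server
    cf create-service p-service-registry standard registry  # provision a Service Registry
    cf bind-service my-spring-app config                    # bind the app to both instances
    cf bind-service my-spring-app registry
    cf restage my-spring-app                                # pick up injected credentials
    ```

    The point of the opinionated provisioning is visible here: the developer requests configuration management and service discovery as marketplace services, and the platform, not the application team, deploys and operates the underlying Netflix OSS components.
    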

    MyPOV – Cloud Foundry needed a productivity framework to accelerate time to market for solutions built with the product. Nothing is a more natural fit than the venerable Spring framework, souped up with the Netflix OSS components. A good move for the product, but like all productivity tools it comes at the cost of increased dependency. We expect Pivotal customers not to be too concerned with this dependency, but they should make the tradeoff consciously.

     
    Native Support for .NET Applications 
    Thanks to the next-generation runtime shipping in this latest release, .NET applications can now run on Pivotal Cloud Foundry. With this expanded support for .NET, enterprises can support a heterogeneous environment consisting of both Linux-based and Windows-based applications. .NET applications will run natively on Windows Server 2012 R2 Hyper-V virtual machines, and Pivotal Cloud Foundry can manage applications with the same commands and many of the same consistent Day 2 operational benefits as existing applications.
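    The “same commands” claim above can be sketched as follows. App name and artifact path are hypothetical; the stack name reflects the Windows Server 2012 R2 cells described in the release, though the exact stack and buildpack identifiers depend on the platform version deployed.

    ```shell
    # Hypothetical app name and path; push a published .NET app to a Windows cell.
    cf push my-dotnet-app -s windows2012R2 -b binary_buildpack -p ./publish
    cf logs my-dotnet-app --recent   # same Day 2 tooling as Linux-hosted apps
    cf scale my-dotnet-app -i 4      # horizontal scaling works identically
    ```
    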

    MyPOV – This is a key move by Microsoft and Pivotal to spare developers of .NET applications from having to go back and rebuild those apps as their first order of business. Instead, Microsoft and now Pivotal give developers the opportunity to operate these older .NET applications alongside the next-generation applications they want to build (and the vendors want them to build). Lastly, it is the ultimate proof point of investment protection for .NET applications – a promise Microsoft made over a decade ago and is honoring today.

     
    Native Support for Docker Images 
    Docker applications can now leverage the built in Pivotal Cloud Foundry platform capabilities, such as scheduling, health management, load balancing, enterprise identity, logging, and multi-cloud support. Now in beta, native Docker image support is made possible by the new elastic runtime and makes Pivotal Cloud Foundry the most advanced container management system on the market today. Customers can deploy applications to Pivotal Cloud Foundry based on Docker images from public, secure registries such as Docker Hub.
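    Deploying from a public registry, as described above, reduces to a single push in cf CLI terms. The image name is hypothetical; Docker image support was in beta at the time, so flag availability depends on the CLI and platform versions in use.

    ```shell
    # Hypothetical image name; push a public Docker Hub image to the elastic runtime.
    cf push my-docker-app --docker-image myorg/my-image:1.2.0
    cf app my-docker-app     # health management, logging and scaling apply as usual
    ```
    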

    MyPOV – A good move to provide better support for Docker, and for the way enterprises want to build, operate and consume microservices – in a secure, repeatable and reliable way. Registry integration is the capability in demand, and it is good to see Pivotal providing it.

     
    Application Lifecycle Management Toolchain
    Delivering on Pivotal’s vision of comprehensive Cloud Native application lifecycle management, the company is partnering with GitLab, CloudBees, and JFrog to deliver a turnkey continuous integration and continuous delivery (CI/CD) solution.
    Building upon the popular software project management tool, Pivotal Tracker, customers can integrate platform-managed versions of GitLab source code repository, CloudBees Jenkins continuous integration, and JFrog Artifactory binary artifact management. By providing the building blocks of a modern application delivery toolchain, Pivotal Cloud Foundry empowers software organizations to build and deploy microservices and Cloud Native applications with confidence and speed.
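    The toolchain wiring above can be sketched end to end. Everything here is illustrative – the remote names, Artifactory URL, repository path and credentials are assumptions, and the Jenkins step is shown only as a comment since it runs server-side.

    ```shell
    # Hypothetical names throughout: commit to GitLab, let CloudBees Jenkins
    # build and test, archive the binary in JFrog Artifactory, then deploy.
    git push origin master                                 # source lands in GitLab
    # ...Jenkins job triggers, runs tests, and produces target/app.jar...
    curl -u "$ART_USER:$ART_PASS" -T target/app.jar \
      "https://artifactory.example.com/libs-release/app/1.0/app.jar"  # store binary
    cf push my-app -p target/app.jar                       # deploy the vetted artifact
    ```
    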

    MyPOV – It is good to see Pivotal acknowledging other popular development tools such as the ones mentioned and integrating them with Pivotal Tracker. Next would be a roadmap / sharing of plans for other popular adjacent tools. For now, congrats to the three who made it – GitLab, CloudBees and JFrog.

     
    Early Access Support for Microsoft Azure
    Pivotal Cloud Foundry extends its Cloud Native platform with early access support for Microsoft Azure, adding to the already-supported Amazon Web Services® (AWS), VMware vSphere®, VMware vCloud Air®, and OpenStack®. With Pivotal Cloud Foundry, customers can deploy and manage Cloud Native applications on almost any infrastructure; without the operational cost and complexity of maintaining their own underlying cloud infrastructure. [..]

    MyPOV – Good to see Pivotal extending deployment options as previously indicated – now adding support for Microsoft Azure. A good move for Cloud Foundry users, who get more deployment options for their projects.

     

    Overall MyPOV

    Pivotal is making good progress with Cloud Foundry, creating more value and synergies for customers and prospects. It further solidifies Cloud Foundry’s position as the leading enterprise PaaS. With Microsoft Azure support and access to .NET applications, Microsoft acknowledges the position of Cloud Foundry further, bringing core Microsoft .NET assets to the Cloud Foundry platform.

    On the concern side – with success comes responsibility. Pivotal needs to deliver these capabilities, ensure customer success and remain a reliable partner for its growing ecosystem. There is no indication that Pivotal cannot deliver, but the task ahead is not trivial. Starting to create, communicate and deliver to roadmaps will be the first step.

    But for now it’s good to be a Pivotal customer and prospect. 


    More on Pivotal

     
    • News Analysis - Pivotal makes CloudFoundry more about multi-cloud - read here
    • News Analysis - Pivotal pivots to OpenSource and Hortonworks - Or: OpenSource keeps winning - read here

    More on Next Generation Applications:

     
     
    • Progress Report - Cloudera is all in with Hadoop - now off to verticals - read here
    • First Take - SAP Cloud for Planning - The next spreadsheet killer is off to a good start - read here
    • Market Move - Oracle buys Datalogix - moves into DaaS - read here
    • News Analysis - SAP commits to CloudFoundry and OpenStack - Key Steps - but what is the direction? Read here
    • Event Report - MongoDB is a showcase for the power of Open Source in the enterprise - read here
    • Musings - A manifesto: What are 'true' analytics? Read here
    • Future of Work - One Spreadsheet at a time - Informatica Springbok - read here
    • Musings - The Era of the no-design Database - Read here
    • Mendix - the other path to build software - read here
    • Musings - Time to ditch your datawarehouse .... - Read here
    Find more coverage on the Constellation Research website here and check out my magazine on Flipboard and my YouTube channel here



     

    Oracle Open World 2015: Three Important Cloud Services

    Oracle Open World 2015 announcements included three data-related standouts: Oracle Cloud Platform For Big Data, Oracle Data Visualization Cloud Service, and Oracle Data Cloud. Here’s a deeper dive on what stands out.

    Our cloud is integrated, it’s ready, and it’s bigger than and superior to any rival cloud. That was the big-picture message Oracle offered customers at Oracle Open World 2015. What’s more, the cloud is where most customers will soon be headed, said Oracle CEO Mark Hurd, predicting that “virtually all” enterprise data would be stored in the cloud by 2025.

    It would be impossible to detail all two-dozen-plus announcements made last week, so I’ll narrow things to my Data-to-Decisions (D2D) research domain and focus on three announcements that caught my eye: The Oracle Cloud Platform For Big Data, the Data Visualization Cloud Service, and the Oracle Data Cloud. With all three, Oracle is trying to differentiate itself in the cloud.

    Oracle Cloud Platform For Big Data

    Oracle’s Hadoop-in-the-cloud Big Data Cloud Service was announced at Oracle Open World 2014. At Oracle Open World 2015 the company reannounced a family of supporting services newly packaged as the Oracle Cloud Platform for Big Data. The components include Oracle Big Data Preparation Cloud Service, Oracle GoldenGate Cloud Service, Oracle Big Data Discovery Cloud Service and Oracle NoSQL Database Cloud Service. The idea is to surround the Hadoop service with a breadth of data-prep, data-movement and data-analysis options.


    The Oracle Cloud Platform for Big Data combines platforms (database, Hadoop, NoSQL) and analysis options.

    I say “reannounce” because Oracle talked about all of these services back in June; the “Cloud Platform for Big Data” branding is what’s new. The Oracle Big Data Preparation service is aimed at preparing and enriching semi-structured and unstructured data, such as clickstreams and social data. Under the hood it makes use of Apache Spark machine learning, YAGO entity-resolution training sets and Apache UIMA natural language processing.

    The GoldenGate Cloud service is based on Oracle’s well-known data-replication software. This service is designed to pump data in near real time into Oracle Database Cloud Service, Exadata Cloud Service, Hive, HBase, HDFS, Flume and Kafka (in this case, in the cloud). It complements low-latency, data-streaming applications, such as those in IoT scenarios.

    The Oracle Big Data Discovery Cloud Service is a business-analyst-oriented tool for exploring, transforming and discovering data within Hadoop (again, in this case running in Oracle’s cloud). Data Discovery first samples, profiles and catalogs all available data. Machine-learning algorithms then surface interesting correlations and offer suggested visualizations for exploring attributes. Search and guided navigation features support data exploration.

    Filling out Oracle’s Cloud Platform is the NoSQL Database Cloud service, which is Oracle’s high-scale key-value store database delivered as a service.

    MyPOV: Not to be forgotten is the Oracle Big Data SQL Cloud Service, which does SQL querying across Oracle Database Cloud services, the Big Data Cloud Service and the NoSQL Database Cloud Service. Taken together, it’s a broad (if Oracle-centric) collection. IBM has a broader collection of IBM and open-source-based services on BlueMix, and Amazon Web Services has more customers using its cloud. But Oracle is building out an impressive portfolio, and the company’s dominant database position will surely feed cloud growth.

    Oracle Data Visualization Cloud Service

    Nearly every BI vendor has introduced a data-visualization module in recent years in response to the fast growth of Tableau Software. Oracle has evolved what it offers for the cloud with the Oracle Data Visualization Cloud Service. Set to become generally available in November, this new stand-alone service is based on capabilities first seen in the Oracle Business Intelligence Cloud Service introduced in April.


    The Oracle Data Visualization Cloud Service offers 18 charting options and a palette of colors and shapes for depicting data.

    The Oracle Data Visualization Cloud Service will enable you to link to on-premises and cloud data sources (both from Oracle and third parties) as well as your own spreadsheets. There are 18 different types of visualizations and a palette of colors, shapes and sizes for depicting data points.

    Execs at Oracle Open World made a point of saying “all you need is a browser.” That’s because with Tableau’s cloud service, Tableau Online, users author charts and dashboards with the desktop client and then publish to the cloud for collaboration. Tableau is working on bringing full authoring capabilities to the cloud. And, indeed, Oracle is working on a desktop client for times when you need to work offline.

    MyPOV: Oracle execs made claims about its new service being “more modern than Tableau” at Oracle Open World. That starts with full authoring capabilities in the cloud, but I’m not seeing some of the other differences claimed. The press release says Oracle’s service “eliminates the complexity typically associated with blending and correlating data sets,” but Tableau also automatically finds joins when mashing up data sets. Both products also select best-fit visualizations automatically based on the dimensions of data used in an analysis. This auto-charting capability has been around for a while, and it’s also present in SAS Visual Analytics and IBM Watson Analytics.

    For a real head-to-head comparison with Tableau, I want to investigate Oracle’s performance characteristics and its connection capabilities (once this service is available). Tableau’s strengths include its in-memory engine and its live-data-connection capabilities with multiple databases, apps and cloud services, including multiple connection options with Amazon, Google, IBM, Microsoft, Oracle, Salesforce, SAP and others. Will Oracle match that? I also want to tour the “fit and finish” of the visualizations and “storytelling” capabilities. Some of the charts seen at Oracle Open World looked hard to read, but that may be due to the data-filtering and presentation inexperience of the demonstrators.

    Yet-to-be released pricing details from Oracle will also be key to any comparison, but to me, these visualization capabilities are most attractive when teamed with the Oracle BI Cloud Service. That’s because it’s not only a data-exploration and visualization service; you also get the database and reporting functionality. Here, too, the more Oracle-centric you are in the on-premises world, the more attractive the cloud options will be.

    Oracle Data Cloud

    Several new features of the Oracle Data Cloud were announced at Oracle Open World, but a larger context emerged last week with IBM’s announced intent to acquire The Weather Company. Thus, I was eager to learn more about Oracle as a data provider. Oracle Data Cloud is built on technology, data and analytics expertise picked up in the BlueKai and Datalogix acquisitions. Talking to execs from both companies now leading Oracle Data Cloud, I came away impressed.


    Oracle Data Cloud offers data from more than 1,500 specialty retailers and 30 supermarket loyalty cards.

    Oracle Data Cloud offers data from more than 1,500 CPG and specialty retailers across 110 million U.S. households. With data-enrichment and predictive analytics options on top of this data, Oracle can find likely buyers by product and category.

    MyPOV: Having data and being able to enrich that data and apply predictive analytics is the name of the game in marketing, and these initiatives are moving into the sales and service arenas as well. In the business-to-business arena, Oracle Data Cloud can enrich your data with Dun & Bradstreet information to find look-alikes of your best customers. A next step is bringing service data full circle back into your understanding of customers to drive efforts such as retention campaigns.

    Many tech vendors are introducing libraries of third-party data that are integrated with their offerings. But big guns like IBM and Oracle are stepping up to become primary data providers. Expect to see data from outside of your organization becoming a bigger and bigger part of your future success.

