
Obama's Cybersecurity Summit


The White House Summit on Cybersecurity and Consumer Protection was hosted at Stanford University on Friday, February 13. I followed the event from Sydney via the live webcast.

It would be naive to have expected the White House Cybersecurity Summit to be any less political. President Obama and his colleagues were in their comfort zone, talking up America's recent economic turnaround, and framing their recent wins squarely within Silicon Valley, where the summit took place. With a few exceptions, the first two hours were more about green energy, jobs and manufacturing than cybersecurity. It was a lot like a lost episode of The West Wing.

The exceptions were important. Some speakers really nailed some security issues. I especially liked the morning contributions from Intel President Renee James and MasterCard CEO Ajay Banga. James highlighted that Intel has worked for 10 years to improve "the baseline of computing security", making her one of the few speakers to get anywhere near the inherent insecurity of our cyber infrastructure. The shocking truth is that cyberspace is built on terrible foundations; the software development practices and operating systems that carry the economy today were not built for the job. For mine, the Summit was too much about military/intelligence-themed information sharing, and not enough about why our systems are such a shambles. I know it's a dry subject, but if they're serious about security, policy makers really have to engage with software quality and reliability, instead of thrilling to kids learning to code. Software development practices are to blame for many of our problems; more on software failures here.

Ajay Banga was one of several speakers to urge the end of passwords. He summed up the authentication problem very nicely: "Stop making us remember things in order to prove who we are". He touched on MasterCard's exploration of continuous authentication bracelets and biometrics (more news of which coincidentally came out today). It's important, however, that policy makers' understanding of digital infrastructure resilience, cybercrime and cyber terrorism isn't skewed by everyone's favourite security topic: customer authentication. Authentication is in need of repair, yet it is not to blame for the vast majority of breaches. Mom and Pop struggle with passwords and they deserve better, but most stolen personal data is lifted by organised criminals en masse from poorly secured back-end databases. Replacing customer passwords or giving everyone biometrics is not going to solve the breach epidemic.

Banga also indicated that the Information Highway should be more like road infrastructure. He highlighted that national routes are regulated, drivers are licensed, there are rules of the road, standardised signs, and enforcement. All these infrastructure arrangements leave plenty of room for innovation in car design, but it's accepted that "all cars have four wheels".

Tim Cook was then the warm-up act before Obama. Many on Twitter unkindly branded Cook's speech as an ad for Apple, paid for by the White House, but I'll accentuate the positives. Cook continues to campaign against business models that monetize personal data. He repeated the promise he made after the Apple Pay launch that Apple will not exploit the data it has on its customers. He put privacy before security in everything he said.

Cook painted a vision where digital wallets hold your passport, driver's licence and other personal documents, under the user's sole control, and without trading security for convenience. I trust that he's got the mobile phone Secure Element in mind; until we can sort out cybersecurity at large, I can't support the counter-trend towards cloud-based wallets. The world's strongest banks still can't guarantee to keep credit card numbers safe, so we're hardly ready to put our entire identities in the cloud.

In his speech, President Obama reiterated his recent legislative agenda for information sharing, uniform breach notification, student digital privacy, and a Consumer Privacy Bill of Rights. He stressed the need for public-private partnership and for cybersecurity responsibility to be shared between government and business. He reaffirmed the new Cyber Threat Intelligence Integration Center. And as flagged just before the summit, the president signed an Executive Order that will establish cyber threat information sharing "hubs" and standards to foster sharing while protecting privacy.

Obama told the audience that cybersecurity "is not an ideological issue". Of course that message was actually for Congress which is deliberating over his cyber legislation. But let's take a moment to think about how ideology really does permeate this arena. Three quasi-religious disputes come to mind immediately:

  • Free speech trumps privacy. The ideals of free speech have been interpreted in the US in such a way that makes broad-based privacy law intractable. The US is one of only two major nations now without a general data protection statute (the other is China). It seems this impasse is rarely questioned anymore by either side of the privacy debate, but perhaps the scope of the First Amendment has been allowed to creep out too far, for now free speech rights are in effect being granted even to computers. Look at the controversy over the "Right to be Forgotten" (RTBF), where Google is being asked to remove certain personal search results if they are irrelevant, old and inaccurate. Jimmy Wales claims this requirement harms "our most fundamental rights of expression and privacy". But we're not talking about speech here, or even historical records, but rather the output of a computer algorithm, and a secret algorithm at that, operated in the service of an advertising business. The vociferous attacks on RTBF are very ideological indeed.
  • "Innovation" trumps privacy. It's become an unexamined mantra that digital businesses require unfettered access to information. I don't dispute that some of the world's richest ever men, and some of the world's most powerful ever corporations have relied upon the raw data that exudes from the Internet. It's just like the riches uncovered by the black gold rush on the 1800s. But it's an ideological jump to extrapolate that all cyber innovation or digital entrepreneurship must continue the same way. Rampant data mining is laying waste to consumer confidence and trust in the Internet. Some reasonable degree of consumer rights regulation seems inevitable, and just, if we are to avert a digital Tragedy of the Commons.
  • National Security trumps privacy. I am a rare privacy advocate who actually agrees that the privacy-security equilibrium needs to be adjusted. I believe the world has changed since some of our foundational values were codified, and civil liberties are just one desirable property of a very complicated social system. However, I call out one dimensional ideology when national security enthusiasts assert that privacy has to take a back seat. There are ways to explore a measured re-calibration of privacy, to maintain proportionality, respect and trust.

President Obama described the modern technological world as a "magnificent cathedral" and he made an appeal to "values embedded in the architecture of the system". We should look critically at whether the values of entrepreneurship, innovation and competitiveness embedded in the way digital business is done in America could be adjusted a little, to help restore the self-control and confidence that consumers keep telling us is evaporating online.

 

See Part 2 of my coverage of the summit here


Progress Report - SAP HCM makes progress and consolidates - a lot of moving parts


SAP held its first annual analyst meeting in San Francisco, and it was a good time to catch up on overall progress almost six months after SuccessConnect in Las Vegas (my takeaways here).
 


Before we embark, here is a great slide on the history of both SAP HCM and SuccessFactors:

 

As is typical for a Progress Report, here are my Top 3 takeaways:

1. A lot of moving parts – Even before the SAP S/4HANA announcement from the other week (see my First Take here), there were already a lot of moving parts in the SAP HCM / SuccessFactors portfolio. While SAP built EmployeeCentral on the MDF Framework, it was clear that more had to happen with the rest of the SuccessFactors application portfolio, as well as in the BI / Analytics space. SAP was also working on making the proven and venerable R/3 payroll engine available for SaaS deployments. Then there is the new database that SuccessFactors is supposed to run on: HANA. And with the new S/4HANA product, a new architecture and product suite has been announced that will need to have some HCM capabilities, too. So let's peel back all these pieces:


  • SAP plans to make cloud applications only available for real cloud deployments. In contrast to S/4HANA, EmployeeCentral and other SuccessFactors applications will not be available to be installed on premise or in a private cloud. Not really a surprise, as the code is probably not designed and engineered for that – but what kind of new ERP suite is S/4HANA without HCM? Of course SAP already has 'side by side' implementations of Business Suite and EmployeeCentral / SuccessFactors available, and is likely to make the same available for on-premise and private cloud S/4HANA installations. But it means that you can have the new SAP HCM products only in the cloud. A gutsy move, considering that there are still large customer groups that think about / want to deploy HCM on premise (or in a private cloud). Of course SAP still has R/3 HCM available – but questions will arise in regard to how in tune that functionality is with the HCM best practices of 2015.
The SuccessFactors V12 Home Page
  • SAP is in the process of moving all payroll-relevant information into EmployeeCentral, making the R/3 payroll engine more and more a 'thin & dumb' engine that can be called easily, and quasi-elastic. SAP showed a demo of an employee taking vacation / time off and how it instantly changed the paycheck. Given the history of the SAP payroll engine, quite a feat. But a necessity, as ADP and Ceridian, for example, showed the same last year already. Additionally, SAP has made progress in the UI it uses on top of Payroll Administration – it looks as good as (or is it?) Fiori [Update Feb. 19th - SAP confirms it is Fiori. Good.]. Maybe it can bring some 'fun' back to running payroll.

  • SAP is in the process of finishing its work around the Learning module. Bring-your-own-content sharing and using 'true' analytics to serve relevant content will be a powerful upgrade to the former Plateau product.

  • At the same time SAP plans (sorry, details under NDA) to move other SuccessFactors Talent Management modules to the MDF platform. That is certainly welcome, as the diversity of the different SuccessFactors architectures represents sizeable technical debt. Good to see SAP tackling this area (at last, some customers will say).

  • And finally, SAP has some interesting plans in the Business Intelligence and Analytics space. SuccessFactors has always been good at benchmarking, with almost all customers participating in the benchmarking process. With the integration of Lumira, SAP HCM users will receive an attractive BI tool with probably good enough capabilities to fight off potential Tableau installs for the average HCM user. Moreover, SAP is making good progress in the 'real' analytics area for some selected analytical questions (for what I mean by 'real' analytics, read here).
 

The SAP HCM Cloud Architecture - OData APIs 
 

2. APIs as the new integration paradigm – Dmitri Krakovsky walked us through a set of new qualities for future SuccessFactors products as well as capabilities of the existing architecture. And as Ettling pointed out earlier, SAP HCM has abandoned the mantra of the one schema / object model. The future is APIs, using OData as the interface for existing and new applications to communicate with each other [Update - SAP correctly reminds me that it is using OData already today]. Not a new approach, but new for SAP overall. Many questions remain, such as whether the APIs will be open to consumption outside of SAP-built applications, what the integration platform (HCP?) will be for more complex integration and transformation processes, and so on. From a strategic perspective, the most important aspect was that Krakovsky said being ready for more acquisitions is an additional benefit of the strategy. And while that was not a statement about any specific acquisition, it certainly is good to be ready for them – at the architecture level.
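To make the OData point concrete, here is a minimal sketch of what querying an HCM entity over OData could look like. The host name, entity set and field names are hypothetical placeholders, not SAP's published API; only the standard OData query options ($select, $filter, $top) and the OData v2 JSON envelope are assumed.

```python
import requests

# Hypothetical OData v2 endpoint and entity set -- illustrative placeholders only.
BASE_URL = "https://api.example-hcm.com/odata/v2"

response = requests.get(
    f"{BASE_URL}/Employee",
    params={
        "$select": "employeeId,firstName,lastName,department",  # project only what we need
        "$filter": "department eq 'Finance'",                   # standard OData filter syntax
        "$top": "10",
        "$format": "json",
    },
    auth=("api_user", "api_password"),  # simple basic auth, purely for illustration
)
response.raise_for_status()

# OData v2 wraps result sets in a {"d": {"results": [...]}} envelope.
for employee in response.json()["d"]["results"]:
    print(employee["employeeId"], employee["lastName"])
```

The point is less the syntax than the contract: any consumer, SAP-built or otherwise (including a newly acquired product), can integrate by speaking the same standard query language against published entities.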
 
SAP Services Lifecycle
3. Services as differentiator!? – SAP spent a good portion of the day talking about various service offerings and capabilities. The topic is near and dear to SAP HCM leader Ettling, so it is good to see SAP investing in the area. When I asked what sets SAP apart in services, Ettling said it is the adoption teams, which grow proportionally to subscription revenue, and the newly revisited support offerings.
 

Tidbits


  • SAP JAM - As is customary at SAP HCM events, we also got an update on the JAM product. JAM has passed 17.5M users and is growing well; it looks like the work-package approach the team has taken is creating value for SAP customers using JAM alongside SAP HCM products.
SAP JAM momentum slide
    • Bye Bye Boomi – SAP SuccessFactors is moving away from Boomi-based integration and towards HANA Cloud Integration – a good step to reduce technical debt and third-party license payments.
    • Bye Bye Oracle – And SuccessFactors is moving off Oracle and onto HANA. New customers will be brought to HANA first, but the overall migration will be completed in 2016. We will be watching. 


    MyPOV

    SAP is making good progress on its HCM portfolio. It is good to see that the vendor is actively embarking on the journey to bring all its HCM products onto the MDF Framework, which really is a platform. With that, SAP has a dual positioning challenge with HANA Cloud Platform (HCP), but the vendor juggled the two well: MDF is for transactional (HCM) applications, HCP is an all-purpose PaaS product. But all of that creates a lot of moving parts, and moving parts in software always bring the risk of quality issues. SAP has a senior enough team not to let that become an issue, but it is certainly an area to watch, also given that key competitors have less of a re-build / re-write load in the coming years. Then again, re-writes and re-platforming have to happen every 10 or so years in most cases, so 2015 simply marks the date when SAP embarks on the effort. Lastly, SAP needs to be more proactive and transparent on the roadmap and milestones related to that effort. A complete roadmap of what is going to happen will certainly be welcomed by prospects, customers and the overall ecosystem. That other events will happen along the way – an S/4HANA launch, for example, or something like the acquisition of Concur – is a possibility, if not a likely reality. And that such events could trigger a new roadmap is nothing shocking anymore in the cloud era. It could even be something customers welcome, as they want to take advantage of new functionality. So a public roadmap of future products would be an important step.

    ----------

     

    And more on overall SAP strategy and products:

     

    • First Take - SAP launches S/4HANA - The good, the challenge and the concern - read here
    • First Take - SAP's IoT strategy becomes clearer - read here
    • SAP appoints a CTO - some musings - read here
    • Event Report - SAP's SAPtd - (Finally) more talk on PaaS, good progress and aligning with IBM and Oracle - read here
    • News Analysis - SAP and IBM partner for cloud success - good news - read here
    • Market Move - SAP strikes again - this time it is Concur and the spend into spend management - read here
    • Event Report - SAP SuccessFactors picks up speed - but there remains work to be done - read here
    • First Take - SAP SuccessFactors SuccessConnect - Top 3 Takeaways Day 1 Keynote - read here.
    • Event Report - Sapphire - SAP finds its (unique) path to cloud - read here
    • What I would like SAP to address this Sapphire - read here
    • News Analysis - SAP becomes more about applications - again - read here
    • Market Move - SAP acquires Fieldglass - off to the contingent workforce - early move or reaction? Read here.
    • SAP's startup program keeps rolling – read here.
    • Why SAP acquired KXEN? Getting serious about Analytics – read here.
    • SAP streamlines organization further – the Danes are leaving – read here.
    • Reading between the lines… SAP Q2 Earnings – cloudy with potential structural changes – read here.
    • SAP wants to be a technology company, really – read here
    • Why SAP acquired hybris software – read here.
    • SAP gets serious about the cloud – organizationally – read here.
    • Taking stock – what SAP answered and it didn’t answer this Sapphire [2013] – read here.
    • Act III & Final Day – A tale of two conference – Sapphire & SuiteWorld13 – read here.
    • The middle day – 2 keynotes and press releases – Sapphire & SuiteWorld – read here.
    • A tale of 2 keynotes and press releases – Sapphire & SuiteWorld – read here.
    • What I would like SAP to address this Sapphire – read here.
    • Why 3rd party maintenance is key to SAP’s and Oracle’s success – read here.
    • Why SAP acquired Camillion – read here.
    • Why SAP acquired SmartOps – read here.
    • Next in your mall – SAP and Oracle? Read here.

     


    And more about SAP technology:
    • HANA Cloud Platform - Revisited - Improvements ahead and turning into a real PaaS - read here
    • News Analysis - SAP commits to CloudFoundry and OpenSource - key steps - but what is the direction? - Read here.
    • News Analysis - SAP moves Ariba Spend Visibility to HANA - Interesting first step in a long journey - read here
    • Launch Report - When BW 7.4 meets HANA it is like 2 + 2 = 5 - but is 5 enough - read here
    • Event Report - BI 2014 and HANA 2014 takeaways - it is all about HANA and Lumira - but is that enough? Read here.
    • News Analysis – SAP slices and dices into more Cloud, and of course more HANA – read here.
    • SAP gets serious about open source and courts developers – about time – read here.
    • My top 3 takeaways from the SAP TechEd keynote – read here.
    • SAP discovers elasticity for HANA – kind of – read here.
    • Can HANA Cloud be elastic? Tough – read here.
    • SAP’s Cloud plans get more cloudy – read here.
    • HANA Enterprise Cloud helps SAP discover the cloud (benefits) – read here.
     
    Find more coverage on the Constellation Research website here.

     


    The Rite To Be Forgotten


    This is a repost of my analysis of the Right to be Forgotten. Original post published September 30, 2014. 

    The European Court of Justice recently ruled on the so-called "Right to be Forgotten" granting members of the public limited rights to request that search engines like Google suppress links to Personal Information under some circumstances.  The decision has been roundly criticised by technologists, by American libertarians, and even by some privacy advocates.  Objections are raised on various grounds; the one I want to answer here is that search engines should not have to censor "facts" retrieved from the "public domain". 

     
    In an address on August 18, the European Union's Justice Commissioner Martine Reicherts made the following points about the Right to be Forgotten (RTBF):
    • "[The European Court of Justice] said that individuals have the right to ask companies operating search engines to remove links with personal information about them -- under certain conditions. This applies when information is inaccurate, for example, or inadequate, irrelevant, outdated or excessive for the purposes of data processing. The Court explicitly ruled that the right to be forgotten is not absolute, but that it will always need to be balanced against other fundamental rights, such as the freedom of expression and the freedom of the media -- which, by the way, are not absolute rights either".

    Writing in The New Yorker, Jeffrey Toobin interviewed Kent Walker, Google's general counsel. Walker said Google likes to think of itself as a "card catalogue": "We don't create the information. We make it accessible. A decision like [the ECJ's], which makes us decide what goes inside the card catalogue, forces us into a role we don't want."

    But there's a great deal more to search than Walker lets on.

    Google certainly does create fresh Personal Information, and in stupendous quantities. Their search engine is the bedrock of a hundred billion dollar business, founded on a mission to "organize the world's information". Google search is an incredible machine, the result of one of the world's biggest ever and ongoing software R&D projects. Few of us now can imagine life without Internet search and instant access to limitless information that would otherwise be utterly invisible. Search really is magic - just as Arthur C. Clarke said any sufficiently advanced technology would be.

    On its face therefore, no search result is a passive reproduction of data from a "public domain". Google makes the public domain public.

    But while search is free, it is hyper profitable, for the whole point of it is to underpin a gigantic advertising business. The search engine might not create the raw facts and figures in response to our queries, but it covertly creates and collects symbiotic metadata, complicating the picture. Google monitors our search histories, interests, reactions and habits, as well as details of the devices we're using, when and where and even how we are using them, all in order to divine our deep predilections. These insights are then provided in various ways to Google's paying customers (advertisers) and are also fed back into the search engine, to continuously tune it. The things we see courtesy of Google are shaped not only by their page ranking metrics but also by the company's knowledge of our preferences (which it forms by watching us across the whole portfolio of search, Gmail, maps, YouTube, and the Google+ social network). When we search for something, Google tries to predict what we really want to know.

    In the modern vernacular, Google hacks the public domain.

    The collection and monetization of personal metadata is inextricably linked to the machinery of search. The information Google serves up to us is shaped and transformed to such an extent, in the service of Google's business objectives, that it should be regarded as synthetic and therefore the responsibility of the company. Their search algorithms are famously secret, putting them beyond peer review; nevertheless, there is a whole body of academic work now on the subtle and untoward influences that Google exerts as it filters and shapes the version of reality it thinks we need to see.

    Some objections to the RTBF ruling see it as censorship, or meddling with the "truth". But what exactly is the state of the truth that Google purportedly serves up? Search results are influenced by many arbitrary factors of Google's choosing; we don't know what those factors are, but they are dictated by Google's business interests. So in principle, why are an individual's interests in having some influence over search results any less worthy than Google's? The "right to be forgotten" is an unfortunate misnomer: it is really more of a 'limited right to have search results filtered differently'.

    When people frame RTBF as "rewriting history" they seem to regard Google's search results as a formal public record. But they're not - they are the means to an end for an advertising business. Search results represent Google's proprietary assessment of what matters. And they are relative. Search results are utterly different from one user to another, and from one month to the next. Why should this customised stream of corporate consciousness not be subject to reasonable editing so as to balance the rights of the people it happens to include?

    If Google's machinery reveals Personal Information that was hitherto impossible to find, then why shouldn't it at least participate in protecting the interests of the people affected? I don't deny that modern technology and hyper-connectivity create new challenges for the law, and that traditional notions of privacy may be shifting. But it's not a step-change, and in the meantime, we need to tread carefully. There are as many unintended consequences and problems in the new technology as there are in the established laws. The powerful owners and benefactors of these technologies should accept some responsibility for the privacy impacts. With its talents and resources, Google could rise to the challenge of better managing privacy, instead of pleading that it's not their problem.


    Supply chains strive to achieve precise demand shaping – a double-edged sword


    Over the past few weeks I have been meeting with a number of supply chain services companies who are talking about, and focusing on, developing solutions that will allow users to be laser-focused with demand sensing and shaping. This was particularly evident during my meetings at NRF in New York. We also have the likes of eCommerce giant Amazon, which has patented technology that it claims can put on the truck the product you have yet to order, because it knows that you will order it! All very interesting and exciting for supply chains – these supply chains strive to eliminate, or at least control, the lumpiness associated with their demand patterns.

    However, this begs a question – is this necessarily good? For example, the situation I hear about often is what takes place at Starbucks. A regular client walks into their local Starbucks, the barista notices them standing in line and knows their preferred order. The customer reaches the cash register and their usual venti, skinny, vanilla latte is already waiting for them. All they have to do is pay and pick up their piping hot coffee. Sounds lovely.


    They know what you want before you order it!

    And for the most part, maybe that customer appreciates the convenience and the feeling of being so well known that they are the "mayor" of that Starbucks. But what if that customer does not want that skinny vanilla latte? What if the customer wants a hot chocolate one day? Do they dare deviate from their usual order, or do they accept the usual order for the convenience?

    The same holds true for grocers such as Stop and Shop or Walmart, who let you order online and pick up in store – and will predict what your basket will look like. So all you need to do is drive to the grocery store and pick up your order. There is no need to think too much. Of course, the positive is that there are tremendous time savings for the customer if they do not want to contemplate a new mix of groceries. But what if the consumer wants to try a new cheese or kitchen cleaner? If their order is already compiled for them, will they get the opportunity to see what else is available? Or do we not want to give them the opportunity? How do we make sure they have the opportunity to browse?

    My point is not that supply chain users and vendors should stop striving to get smarter and more effective when it comes to demand shaping and sensing. However, there must be some balance when it comes to how precise and "effective" the supply chains need and want to be with regard to the customer. Yes, we want to eliminate lumpiness and extract those savings from the supply chain. But retailers and other players in the supply chain still need to balance being very precise in how they shape and predict demand against the opportunity for their customers to deviate from their usual demand. Retailers and other customer-focused industries need to determine how precise they want to be with their demand shaping and how much freedom they want to give their customers to roam and wander through options.

     


    Research Preview: 9 Cloud Trends Every CxO Needs to Know in 2015


    In the digital age, the cloud has transitioned from the responsibility of the CIO to the responsibility of the entire C-suite. Cloud computing has morphed from a simple utility into running a set of applications utilized by all functions within a business. Consequently, all CxOs should be aware of major trends in cloud technology. I’ve identified nine of these cloud trends in my latest piece of research, “Nine Cloud Trends Every CxO Needs to Know in 2015”. This report provides a comprehensive first look at these key trends and allows CxOs to familiarize themselves with them and then drive to first actions and conclusions for their respective enterprise. The success of every department depends on the CxO’s fluency with the apps and underlying cloud technology that both power and are produced by organizations in the digital age.

     
     

    Cloud adoption will fundamentally transform IT organizations. Locations, jobs, positions, skills and the self-understanding of IT’s role in the 21st century are dramatically changing. The next decade will see more changes for IT than the function has seen since its inception in the 1950s. Read my lips: proficiency in the basic elements of the cloud will be a requirement for all CxO roles in the next ten years. Embrace the cloud or face disruption.

    This shift is happening already. Most vendors have transitioned to a platform zone, which has allowed them to create middleware platforms. The consequence is that Platform as a Service (PaaS) will disappear even more as a market category in 2015. CxOs need to be aware of the benefits, but also the lock-ins, that come with using cloud as a middleware platform. At the same time, OpenStack will lose its unity as the participating vendors see more economic benefits for themselves by differentiating beyond the compatibility layer. And beyond OpenStack, there is an overall need for cloud vendors to specialize in order to differentiate from the market leader.

    Despite the success of the public cloud, most enterprises will still have hybrid deployments as the predominant form of cloud adoption. At the same time, growing legal and statutory requirements will make the location of data centers a key criterion for enterprises selecting cloud vendors. But enterprises will also realize that for next-generation application demands, there is no alternative to cloud adoption. The nature and demands of next-generation applications force the adoption of cloud, as cloud qualities are key to the benefits that enterprises want to garner from these applications.
     


    Prepare yourself for a productive conversation about the future of the cloud in your organization. The table of contents and a snapshot of my report are available for download.

    Download Report Snapshot

     


    IoT; Solution Architecture; Network Integration Groups and Fog Computing


    Your Enterprise has had amazing productivity improvements in its manufacturing operations for a low investment by adding Sensing to critical points in the flow process.  The sensors are spread across the entire internal material and process flow from warehouse to inspection in an Intranet of Things. Logically the next move is to use the Internet of Things externally to add the same capabilities to the flow of materials from its suppliers. But how do you scale up for this, and does it affect your existing IoT investment?

    Research report now available: The Foundational Elements for the Internet of Things (IoT)

    If you have been following this series of Blogs On IoT then you will have realized that this is not quite a straightforward issue of doing the same thing on a larger scale. You will also recognize that the underlining of Intranet of Things versus Internet of Things is a very deliberate way of drawing attention to the difference. See blog; ‘From the Intranet of Things to the Internet of Everything – Introducing the required solution architecture’ which describes the Network centric infrastructure of ‘Fog Computing’ required to connect and support a whole world of Internet connected sensor technology.

    The challenge of making connections to the ‘right’ sensors, probably with multiple ‘owners’, that can supply the ‘right’ data, at the ‘right’ time, to the ‘right’ receiving Service, which in turn can evoke the ‘right’ responses to the data, is indeed complex! To solve this, the technology companies are introducing new, radical products with capabilities that build upon the experience of Cloud, Mobility, etc. as well as the experience gained with global-scale Internet-based Web Services.

    The previous blog, defining ‘Intranet v Internet’ networked Devices, showed how Networking is extending to provide new capabilities supporting activities moving to a huge number of small devices connected around the ‘edge’, as opposed to traditional networking where traffic flows move to centralized points such as Data Centers. In this blog the topic is how the ‘flow’ of data is created and managed through Network Integration of groups of sensors and responding services into virtual networks, complete with virtual firewalls. As with many complex architectural solutions, a very simple Use Case will help in clarifying the issues to address and the resulting solution.

    The diagram below illustrates a manufacturing enterprise with a representation of an external ecosystem of suppliers, logistics and the road network. Initially the enterprise applied Sensors to various aspects of its internal manufacturing operations increasing the amount of live real time data available to make optimized dynamic operational decisions in the light of much increased information. The improvements possible for this first phase of IoT deployment are usually considerable, (see blog on Harley Davidson success story using the Intranet of Things internally).

    Naturally such excellent returns from small investments encourage extension externally to add data from the longer gestation periods and operational complications for suppliers and logistics. In other words a move from internal Intranet connected Sensors to external Internet connected Sensors relying on Networking to provide the connectivity and functionality required.

    There are many more factors externally that can affect an optimized plan for a manufacturing operation than the more limited numbers of internal factors more directly under the enterprise control. Ultimately all Digital Business comes down to the ability to sense and respond in an optimal manner to external factors. Successful engagement with, and use of, the data from the Internet of Things is a prerequisite of a successful Digital Business!

    In the diagram, sensors are positioned on every element; some, such as the road traffic reporting sensors, are clearly publicly accessible, whilst others, such as those on an individual truck belonging to the Logistics Company, are private. ‘Network Integration’ is required to identify, and integrate, a group of sensing Devices into a virtual network to communicate the consolidation of data to one, or more, selected authorized receiving point(s). The receiving point(s) are then responsible for the choice, and orchestration, of Services, or transactional Applications, that will provide a Business-valuable response.

    Though all Internet of Things devices share a common physical connection, the network has to provide an advanced set of Services, which identify, and integrate, the required Devices into a series of secure virtual networks. Each virtual network contains the sensors that together provide the complete data set relating to a particular activity. In turn, the Network Integration Group must also include one, or more, designated receiving points.

    Network Integration Groups will be a mixture of predetermined, fixed groupings and dynamic, on-demand groupings responding to an authenticated data request from a designated receiving point. In IoT, the Fog Computing technology model requires Networking to play a substantial and active role, even more so than in the Cloud Computing technology model.

    A simple use case based on the diagram above helps to explain. At the top right is Acme Manufacturing, which is operating a dynamic production plan as part of its flexibility to respond agilely to Digital Business market opportunities. Starting from the bottom right is a flow of suppliers, including Beta Components, and Road Runners, one of several Logistics operators, that together make up the extended value chain that Acme Manufacturing is dependent upon. Clearly the unmonitored external operational conditions exceed the number of internal operational points that IoT is now monitoring so successfully.

    Acme Manufacturing has an outline production plan for the day based on an online call off for certain parts to Beta Components, together with a time for delivery. Road Runners Logistics also gets a copy of the online documentation to use with public road sensors and traffic reporting in making their decision as to the best route / journey time. This enables them to inform Beta Components of their collection time and Acme Manufacturing of the planned delivery fulfillment details.

    This online electronic call off interaction between the three companies provides the details of each company’s private sensors that will be monitoring the entire fulfillment, together with the authentication required for access. Acme Manufacturing can now request the Network Integration Group Services function to integrate the identified sensors into a virtual group that will collate and report all the data relevant to the Call Off to the Acme Manufacturing receiving point.

    Each call off, on each and every Supplier, will result in defining a new and different Network Integration Group requirement to create a unique group of IoT Devices. And of course there will, in time, for any given Enterprise operating as a connected, dynamic Digital Business, be hundreds, even thousands, of these virtual Network Integration Groups live at once, each connected to fulfill a specific business action’s requirements.

    Together the IoT Devices in a given Network Integration Group will be able to supply a cohesive and complete data set, the full information flow required to invoke dynamic Business responses to the changing conditions being monitored. In the Use Case, that means dynamic reporting to Acme Manufacturing of how the delivery of the critical parts needed to match planned production is progressing. At the same time that Acme Manufacturing is managing its call offs and aligned IoT data flows, Beta Components will be doing the same with its suppliers, as will many other Enterprises.
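    As a purely conceptual sketch (the class, sensor and endpoint names below are hypothetical illustrations of the idea, not any vendor's product or API), a Network Integration Group can be thought of as a per-call-off, access-controlled grouping of sensor feeds bound to one or more receiving points:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Sensor:
    sensor_id: str
    owner: str                 # e.g. "Beta Components" or "Road Runners"
    read: Callable[[], Dict]   # returns the latest reading as a dict

@dataclass
class NetworkIntegrationGroup:
    """A virtual, temporary grouping of sensors and receiving points for one call off."""
    group_id: str
    authorized_owners: List[str]
    sensors: List[Sensor] = field(default_factory=list)
    receiving_points: List[Callable[[Dict], None]] = field(default_factory=list)

    def add_sensor(self, sensor: Sensor) -> None:
        # Only sensors whose owners were named in the call off may join the group.
        if sensor.owner not in self.authorized_owners:
            raise PermissionError(f"{sensor.owner} is not authorized for {self.group_id}")
        self.sensors.append(sensor)

    def collect_and_report(self) -> None:
        # Consolidate one reading per sensor and push the snapshot to every receiving point.
        snapshot = {s.sensor_id: s.read() for s in self.sensors}
        for receive in self.receiving_points:
            receive(snapshot)

# Usage: one group per call off, torn down when the delivery is fulfilled.
group = NetworkIntegrationGroup(
    group_id="acme-call-off-0001",
    authorized_owners=["Acme Manufacturing", "Beta Components", "Road Runners"],
)
group.add_sensor(Sensor("truck-17-gps", "Road Runners", lambda: {"lat": 51.5, "lon": -0.1}))
group.receiving_points.append(lambda data: print("Acme receiving point:", data))
group.collect_and_report()
```

    The essential properties the sketch tries to capture are that membership is authorized per call off, that the group is virtual and temporary, and that the consolidated data flows only to the designated receiving points.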

    The delivery of the finished retail product from Acme will be tracked in much the same way by citizen customers, who may want to use other Sensing sources to decide on when and where they might take delivery. This in turn introduces the term the Internet of Everything, as at a mass Smart City level devices such as wearable watches may participate in Network Integration Groups.

    Enterprises are coming to terms with using a mix of Public and Private, even Hybrid, Cloud Computing to support their internal IT transactional systems. Externally supporting massive numbers of shared Devices in the dynamic, interactional Internet of Things through Fog Computing lifts the required understanding and use of capabilities to a new level!

    Fog Computing could be said to be the next stage in the technology industry’s long path towards ‘The Network is the Computer’, a term first expressed by Sun in respect of its TCP/IP-based Unix Workstations in the mid eighties. The term has been resurrected with each new wave of Technology ever since, whether Client-Server enterprise applications, or Internet-based web servers! What is clear is that the network infrastructure, or ‘Fabric’, and Networking as a set of services, are again required to take on a further range of critical tasks. A series of recent announcements from Cisco, both in terms of strategic direction and of product sets, relates to the introduction of such capabilities.

    This blog has outlined the role of ‘Network Integration’ in the Internet of Things with its enabling architecture of Fog Computing. For most adopters of IoT sensing this is not an immediate issue, as excellent financial returns, indeed usually better by a significant factor than the returns on most IT projects today, can be delivered over existing closed enterprise Networks. However, too many promising technology pilots and initial schemes fail to deliver long-term success due to a lack of understanding of the development direction the industry is taking, so it is important to recognize the direction of IoT.

    Understanding exactly what the Internet of Things does makes it clear that the successful use of IoT and IoE are critical success factors for Enterprises in Digital Business models.

    The next blog in this series on the Internet of Things, IoT, will move on to describe data flows, Industrial-to-Information Technology data models and their integration, leading to the invocation of Services Orchestration. It is also recommended to read the Digital Business series of Blogs, published on alternate weeks to the IoT blogs on the Constellation Research web site, to understand core business functions in a true Digital Business together with the technology requirements, deployments and alignments required.

    Research report now available: The Foundational Elements for the Internet of Things (IoT)

    Market Move – Vector Capital takes Saba Software Private


    Today we learned that Saba Software (Saba) is being taken private by San Francisco-headquartered Vector Capital (Vector). The deal is valued at 400M US$ (see here) - setting the Saba stock price at 9 US$, pretty close to where it was trading over the counter (SABA was de-listed in 2013).

    So let’s review the press release in our standard approach to major announcements:

    REDWOOD SHORES, CA and SAN FRANCISCO, CA. – February 10, 2015 –Saba (OTC Pink: SABA), a global leader in cloud-based intelligent talent management solutions, today announced that it has entered into a definitive agreement with affiliates of Vector Capital (“Vector”) under which an affiliate of Vector will acquire all of the outstanding shares of Saba common stock for $9.00 per share in an all cash offer.

    MyPOV – Nice way to stick in the new software category that Saba is trying to shape – ‘intelligent talent management’, with a lot of use of analytics. A good positioning and we took a first look at the new Compensation Management product back at HR Tech in Amsterdam in fall of 2014.

    “Over the course of Saba’s comprehensive review, the Board of Directors and our advisors evaluated a wide range of strategic alternatives, and engaged with a number of parties. We are pleased to have reached this agreement with Vector, which provides significant cash value for our shareholders. Our Board unanimously believes that this is the best outcome for Saba, our shareholders, customers, partners and employees,” said Bill Russell, Saba’s Non-Executive Chairman.

    MyPOV – Let’s hope this is the best outcome for Saba, which has been troubled for (too) long by not being able to release earnings statements. Vector actually helped Saba with a credit line in the past (see here), so this acquisition may be a consequence of that credit action. Whether it is a good or bad next move – only time will tell, and we may never know. Vector has a good track record of getting troubled vendors back on track; I remember the once famously hyped Niku (ultimately sold to CA) as a good example.

    “Over the last 17 years, Saba has delivered a growing set of innovative intelligent talent management solutions, which are in use today by more than 2,200 global market leaders and innovators,” said Shawn Farshchi, President and CEO of Saba. “Vector has been a great partner to Saba since 2013. We are thrilled to continue the relationship, and take advantage of the support and resources of Vector and their partner network to strategically invest in expanding our product portfolio, further our customer success programs, and continue to the next stage of the company’s growth and market leadership.”

    MyPOV – The way Farshchi puts it, it looks like Vector might invest more into Saba. And that investment is certainly needed, as the vendor is building a new product suite and needs to compete with both Talent Management and overall HCM suite vendors with deep pockets.

    “Vector, along with some of the world’s premiere financial institutions and investors, are excited to help Saba move beyond its financial restatement process and put the focus squarely on the Company’s innovative cloud talent management platform and its blue chip customer base,” said David Fishman, Managing Director and Head of the Private Equity Team at Vector Capital.

    Andy Fishman, Managing Director at Vector Capital, said, “We are excited to partner with the management team and the dedicated and talented group of employees at Saba. We look forward to them becoming part of the Vector family.”


    MyPOV – Interesting to find two quotes from Vector in the press release. Goldman Sachs veteran David Fishman is certainly the ‘software guy’ at Vector – with exposure to almost all of its software-related portfolio. Maybe both Vector divisions – Private Equity and Vector Capital – will help to fund the deal. Or Vector Capital gave Saba the credit, and now it is Private Equity’s turn to step up. Though the credit was relatively small (25M US$) compared with the current transaction value.

    The transaction is subject to customary closing conditions and the approval of Saba shareholders, and is expected to close in the coming months. The transaction is not subject to any financing conditions. Saba senior management is expected to remain in Redwood Shores. [...]

    MyPOV – Good to add that senior management will stay at the campus in Redwood Shores – which promises personnel stability.


    Implications, Implications

    So let’s look at the implications of this transaction for the constituents.


    Implications for Saba Customers

    The hope would be that this transaction ends concerns about the viability of Saba. It will be up to Vector and Saba to make sure that roadmaps, and viability from a corporate and financial perspective, are given and communicated. Customers should actively reach out to Saba to get re-assurances on roadmap items, global build-out and continuing operations. Saba and Vector should expect these inquiries and be ready to address them, ideally even proactively.

    Assuming this will all pass well, this transaction is good news for Saba customers. Saba was previously cut off from raising capital; it now has the privacy of private equity ownership to restructure, get its books in order and grow its product and global presence.

    In the long run, customers should not forget that Vector is a private equity firm and will not hold Saba ‘forever’ – so the customary exit for PE firms after 5 or 7 years of holding Saba will be an important aspect of future contract negotiations.


    Implications for Saba

    We can only hope that this will put the vendor on a solid footing, bring good investment, and return Saba to the strength and good reputation it once had in the Learning and Talent Management space. As we wrote earlier, Learning is not winning the Talent Management Wars or the HCM Wars – so it is key for Saba to actively expand its portfolio. Like all Talent Management vendors it will need to decide if it needs to create an HR Core system – or not. The new (true) analytics (more here) angle of the latest product version is promising, but needs customer traction and adoption.


    Implications for Saba Competitors

    It looks like nobody wanted to buy Saba. Its strong Learning offering would have been a fit for Workday and could have complemented both Oracle and SAP. It would have helped Infor. Maybe ADP. It would have added a lot of customers for Ceridian and Ultimate, but probably was too expensive for them. Whether Saba ever explored this, and why it did not happen, we will probably never know. Assuming Vector will invest into Saba, the Talent Management vendors (like Cornerstone, Lumesse, Skillsoft SumTotal, TechnoMedia etc.) will see a more formidable competitor. The targeting of Saba customers may come to an end soon.


    MyPOV

    I hope this turns a less fit vendor into a good market participant and competitor. Talent Management has for a long time overpromised and under-delivered. It is time for Round 2 – using the next-generation capabilities of (true) analytics, BigData and cloud. That analytics matter for talent management, starting in recruiting, is widely accepted not only by the traditional vendors but also by a bustling startup scene. Looking forward to having one stronger vendor in the market – good luck to Saba customers, employees and the vendor overall. Good to see the Saba story has not ended (yet).

    ----------


     
    More Market Move blog posts:
    • Market Move - Oracle buys Datalogix - moves into DaaS - read here
    • Market Move - SAP strikes again - this time it is Concur - and the push into spend management - read here
    • Market Move - Cisco wants to buy Metacloud - getting more into the cloud game - read here
    • Market Move - Skillsoft announces to acquire Sumtotal - creating a(nother) HCM vendor - read here
    • Market Move - Infor runs CloudSuite on AWS - Inflection point or hot air balloon? Read here
    • Market Move - SAP acquires Fieldglass - off to the contingent workforce - early move or reaction? Read here
    • Why NetSuite acquired TribeHR - read here
    • Microsoft gets even more serious about devices - acquires Nokia smartphone business - read here
    • Why Intuit acquires Elastic Intelligence - read here
    • Why SAP acquired hybris software - read here
    • Why IBM acquired SoftLayer - read here
    • Why Oracle is buying Tekelec - read here
    • Why Oracle bought Nimbula - read here
    Find more coverage on the Constellation Research website here.

     


    Data is Eating Marketing: Digital, Social and Mobile in 2015



    Data. It’s out there. And there is plenty of it. We create data with every status update, photo shared or website viewed. Each search we make is being monitored, sorted, indexed and analysed. Every purchase we make is being correlated, cross-matched and fed into supply chain systems. And every phone call we make is being logged, kept, passed on for “security purposes”.

    There are so many kinds of data that it is hard to keep up with it all. There is the data that we know about – the digital items we intentionally create. There are digital items that are published – like books, websites and so on. There is email which creates its own little fiefdom of data.

    There is also the data about data – metadata – which describes the data that we create. Take, for example, a simple Tweet. It is restricted to 140 characters. That is the “data” part. But the metadata attached to EACH and EVERY tweet includes information like:

    • Your location at the time of tweeting (ie latitude and longitude)
    • The device you used to send the tweet (eg phone, PC etc)
    • The time of your tweet
    • The unique ID of the tweet.

    But wait, there’s more. From the Twitter API, you can also find out a whole lot more, including:

    • Link details contained within the tweet
    • Hashtags used
    • Mini-profiles of anyone that you mention in your tweet
    • Direct link information to any photos shared in your tweet

    There will also be information related to:

    • You
    • Your bio / profile
    • Your avatar, banner and Twitter home page
    • Your location
    • Your last tweet.

    There is more. But the point really is not about Twitter. It is the fact that a seemingly innocuous act is generating far more data than you might assume. The same metadata rules apply to other social networks. It could be Facebook. Or LinkedIn. It applies to every website you visit, each transaction you make. Every cake you bake. Every night you stay (you see where I am going, right?)
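    To make the point tangible, here is an illustrative and deliberately simplified slice of the JSON that the Twitter REST API attaches to a single tweet. The field names follow Twitter's documented payload; the values are invented for the example.

```python
import json

# A made-up tweet, using a subset of the real field names Twitter's API returns.
tweet = json.loads("""
{
  "id_str": "559900000000000000",
  "created_at": "Fri Feb 13 20:15:00 +0000 2015",
  "text": "Watching the #cybersecurity summit webcast",
  "source": "Twitter for iPhone",
  "coordinates": {"type": "Point", "coordinates": [151.21, -33.87]},
  "entities": {
    "hashtags": [{"text": "cybersecurity"}],
    "urls": [],
    "user_mentions": []
  },
  "user": {"screen_name": "example", "location": "Sydney", "description": "Coffee. Data. Tweets."}
}
""")

# The 140 characters we think of as "the tweet"...
print(tweet["text"])

# ...and some of the metadata that rides along with it.
print(tweet["created_at"], tweet["source"], tweet["coordinates"])   # when, how, where
print([h["text"] for h in tweet["entities"]["hashtags"]])           # hashtags used
print(tweet["user"]["location"], tweet["user"]["description"])      # the author's mini-profile
```

    One short status update, and already a device, a location, a timestamp and a profile travel with it - which is exactly the gap between what we think we share and what is actually collected.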

    For marketers, this data abundance is brilliant, but also a distraction. We could, quite possibly, spend all our time looking at data and not talking to customers. Would this be a bad thing? I’d like to think so.

    The question we must ask ourselves is “who is eating whom?”.

    In the meantime, for those who must have the latest stats – We Are Social, Singapore’s massive compendium is just what you need. Binge away.


    The State of Customer Success Management in 2015


    This report is about The State of Customer Success Management in 2015. Constellation’s research team delivers its inaugural series on the state of the state. The state-of-the-state research explores the impact of digital transformation, next-generation customer experience and matrix commerce from a systems perspective, considering the political, economic, societal, technological, environmental and legislative points of view. This research report explores a key area – Customer Success Management (CSM).

    It goes into detail about how next-generation customer experience is guiding the success of Customer Success Management. A shift to Customer Success Management is imminent because we live in the world of a continuous, opt-in economy, where the value of a customer is determined by how long they stay a customer and whether they continue to increase their purchase amounts over time. As a result, companies must prepare themselves to deliver great, continuous and consistent customer experiences. Before the opt-in economy, businesses were focused on the initial sale. A great deal of money was spent advertising and marketing to potential prospects, enticing them to convert from a lead to a sale.

    However, little attention was paid to the after-sale experience; ubiquitous poor customer experiences still exist today, despite decades of research showing that after-sales service directly affects the financial stability of a company. It makes absolutely no sense to spend millions, or in some cases billions, of dollars on advertising, marketing and sales only to then drive the customer to the competitor because the after-sales service experience is horrible. Yet this occurs every single day in many, many companies. Customer Success Management is based on the ability to deliver a consistent customer experience process before, during and, in particular, after the sale, which continuously loops into increased customer lifetime value, enhanced revenue, and increased margins and profits.
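    As a toy illustration of why that loop matters (the figures and the simple geometric-retention model below are my own assumptions, not taken from the report), customer lifetime value in an opt-in economy is dominated by how long the customer keeps choosing to stay:

```python
def lifetime_value(monthly_margin: float, monthly_retention: float, months: int = 120) -> float:
    """Expected margin earned while the customer keeps opting in, month after month."""
    value, survival = 0.0, 1.0
    for _ in range(months):
        value += monthly_margin * survival   # margin earned if the customer is still around
        survival *= monthly_retention        # probability they opt in again next month
    return value

# Same $20 monthly margin; only the after-sale experience (retention) differs.
print(round(lifetime_value(20.0, 0.80)))  # ~100  - churn-heavy experience
print(round(lifetime_value(20.0, 0.95)))  # ~399  - loyalty-creating experience
```

    The model is deliberately crude, but it captures the argument above: margin invested in the after-sale experience pays back through retention.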

    The main themes in the paper are:

    • Delivering a brand promise instead of a product or service requires new approaches.

    • Clients who believe in customer experience build CSM organizations.

    • CSM delivers more customers, less churn, and higher margins, really!

    • Predictive analytics identify known and reveal unknown attributes that drive customer success and

    • Expect larger customer experience vendors to incorporate these principles or acquire in this space.

    The reason some companies like Zappos, Nordstrom and Lexus, which are all considered “luxury” brands, can offer excellent service is that their business models are built with enough margin to provide the people, process and technology that can deliver great experiences. Businesses must move away from the thinking that this type of service is limited only to luxury brands, and they themselves must stop cutting corners on CSM. All businesses that expect to make it through the next several years must begin to change their business models immediately so that they have the margin to provide great, loyalty-creating experiences.

    Is your company ready to transform how it treats its customers? For information on this report, download a snapshot. 

    Download Report Snapshot


    CEN Member Chat: Critical Big Data and Privacy Trends


    Learn how companies can strike a balance between privacy and the business value derived from big data. Steve Wilson and R "Ray" Wang discuss major trends at the intersection of big data and privacy.

    [Webcast video: //player.vimeo.com/video/119173818]