Results

IoT Delivers the Perfect Meal

Paleo, local, vegan, extra cheese, gluten free -- consumers are demanding when it comes to food. How are restaurants and food manufacturers to satisfy consumers who want each meal tailored to their dietary preferences? Technology, of course. My latest report, The True Value of Technology for Food Supply Chains, highlights opportunities for food retailers to satisfy customer demands with the help of the internet of things and big data technology.

In 2015, grocery, restaurant and food services accounted for over 20 percent of U.S. retail spending. Food choices have evolved from simply being about nourishment into a statement of character. People define themselves by the restaurants they frequent or by espousing a vegan, gluten-free or paleo diet. Consumers frequently scrutinize their food: Is it GMO-free or organic? Not only are consumers picky, they also feel empowered to make demands of food purveyors. Consumers want food customized to their dietary preferences -- they want their food personalized.

But how should food suppliers deliver mass personalization of food products given the fragility of the food supply chain? The answer lies in disruptive technologies like the internet of things and big data analytics.  

IoT-enabled farming equipment, sensors, smart infrastructure, and data collection are already revolutionizing farming. Sensors and big data analytics improve freshness tracking and monitor food safety during distribution. At the last stage of the food supply chain, big data analytics and IoT sensors help ensure inventory accuracy and predict consumer trends. Moreover, implementing IoT sensors at every stage of the food supply chain gives food suppliers the opportunity to protect their margins and make data-driven decisions about growth.

Supply chains have always been heavily dependent on data, and the food supply chain is no exception. When you consider variables such as perishability, shelf life, recipe mix and commodity price fluctuation, the ability to better use data in the management of food processes takes on greater significance. There is an important opportunity for cloud-based data-handling software to bring the scale and flexibility necessary to keep pace with the changing winds in the food industry. As restaurants and grocers look to keep up with consumer demands around food sourcing, these platforms can quickly absorb new sources and their data. Timely and actionable data is available to the food supply chain; it is up to food businesses to seize the opportunity.
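To make the freshness-tracking opportunity concrete, here is a minimal Python sketch of the kind of calculation a cloud data platform might run over pallet-level temperature readings; the readings, thresholds and penalty values are hypothetical illustrations, not figures from the report:

# Minimal sketch with hypothetical thresholds and readings: estimating the
# remaining shelf life of a perishable shipment from IoT temperature data.
from datetime import datetime

readings = [                              # (timestamp, degrees Celsius) from one pallet sensor
    (datetime(2016, 3, 1, 8, 0), 3.5),
    (datetime(2016, 3, 1, 12, 0), 4.1),
    (datetime(2016, 3, 1, 16, 0), 7.8),   # brief excursion above the safe range
    (datetime(2016, 3, 1, 20, 0), 3.9),
]

NOMINAL_SHELF_LIFE_HOURS = 120            # assumed shelf life under ideal storage
THRESHOLD_C = 5.0                         # assumed safe storage temperature
PENALTY_HOURS_PER_EXCURSION = 12          # assumed loss per out-of-range reading

def remaining_shelf_life(readings):
    """Crude freshness model: subtract a fixed penalty per temperature excursion."""
    excursions = sum(1 for _, temp in readings if temp > THRESHOLD_C)
    return max(NOMINAL_SHELF_LIFE_HOURS - excursions * PENALTY_HOURS_PER_EXCURSION, 0)

print("Estimated remaining shelf life: %d hours" % remaining_shelf_life(readings))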

Click here to download a short excerpt of this report. 



Vindication On My Position: Social Customer Service Sucks


I am as tired of telling you not to embark on Social Customer Service as you are of telling me I am a grouchy old man and I don’t get it.

Fine.

My data and case studies have not convinced you, so let’s try a different approach.  Let’s have someone else show you their data.

NICE and BCG ran a study on the subject (link below, registration required), and their data vindicates my positions: abandonment, slow processing, inability to deliver in complex situations, being dropped from investment, etc.

Don’t take my word for it? No problem – but still… don’t do social customer service.

Excerpt below, and link at the bottom.

The report found that the number of consumers using social media to resolve customer service issues has dropped compared to two years ago. While daily, weekly, and monthly use of social media channels doubled between 2011 and 2013, those same categories declined between 2013 and 2015, while the number of respondents who never use or are not offered social media customer service rose from 58 percent in 2013 to 65 percent in 2015.

Respondents who do not use social media cited a number of reasons why: it takes too long to address issues, said 33 percent; it has limited functionality, reported 32 percent; and it isn’t feasible for complex tasks, according to 30 percent. Social media was the channel with the highest percentage of abandons in both 2013 and 2015, with the number rising from 32 percent to 42 percent over that period.

Source: NICE & BCG 2016 CUSTOMER SURVEY

What do you think? Am I just being “jaded, even more so lately,” as someone commented following one of my recent presentations (where, I might add, I talked about these same problems…)?


IoT Pilots should include basic security functional elements for experience: Mastering IoT security means mastering new security techniques

Security starts with the identification of risks, which in turn defines the actions that are required. IoT devices range from simple sensors to embedded intelligence in sophisticated machines, and their deployments cover the whole spectrum of industries and applications, so there is no single standard answer. It might seem unnecessary to impose IT security practices on a pilot of a handful of simple monitoring sensors in a building, but a pilot should be the opportunity to learn about the technology and its security aspects as well as the business benefits.

Current risk justification often focuses on the obvious difference in security risk profiles. Take a building management IoT deployment as a simple example: downstream data flows from IoT temperature monitoring points are seen as carrying low to minimal risk, compared with upstream command responses that activate power, heating or other building functions.

But this misses the risk to the enterprise from each and every IoT sensor acting as a network access point that could be compromised. Eurecom, the French technology institute, discovered 38 vulnerabilities in the firmware of 123 IoT sensing products. Moving from hundreds to thousands of connected IoT devices multiplies the risk of security breaches to new levels.

Experts believe it likely that many pilots and initial IoT deployments will occur without an adequate understanding of the security risks, and will require expensive retrofitting. A blog cannot provide in-depth coverage of the topic, but it is an excellent format for drawing attention to the issues and providing links to more in-depth papers. For simplicity, and in line with the popularity of IoT for building management, this blog refers to IoT sensor deployment in buildings as an illustrative use case.

Before considering new security capabilities that have been, or are being, developed for the IoT marketplace, it pays to understand the basic architectural model. The so-called Final Mile Architecture, described in some detail in the blog “The importance of using Final Mile Architecture in an IoT Pilot”, stressed the importance of understanding the use of Connection, Asset Data and Mapping, and Data Flow management. However, that blog did not mention the need to consider security aspects, for example the importance of a firewall-protected ‘safe’ location for the IoT Asset Data and Mapping Engine together with the Data Flow Engine.

Whilst network connection management is understood from its role in IT systems, there is very little understanding of the use and role of IoT Gateways, Asset Data and Mapping engines, or Data Flow engines as core building blocks in IoT deployment, let alone how to use each of them to reduce security risk and vulnerability.

Most IoT sensor deployments will make use of one of the specialized physical network types, such as Zigbee, that interconnect low-value sensor points, and will connect to the main ‘Internet’ through an IoT Gateway. IoT Gateways come in all forms, from simple physical interconnection of different network media to those with sophisticated intelligent management that introduces security capabilities. Intel publishes a good guide to IoT Gateways in general, and Cisco offers a useful FAQ on the topic.

The choice of an IoT Gateway product for a simple or pilot deployment tends to focus on the gateway’s primary physical network function, rarely recognizing that a gateway is a key access point to an enterprise or public network and should be secured.

The IoT Gateway, coupled with network connection management, should be considered the first major security point in IoT architecture. Some IoT Gateways add encryption to traffic forwarded across the network as a further security feature. Citrix publishes a useful guide to the security implications of IoT Gateways, and Intel offers a guide to the implementation of security profiles in IoT Gateways. IoT Gateway physical locations are usually dictated by the transmission capabilities of the sensor-side network, but the physical location of the next two functional blocks, the Asset Data and Mapping Engine and the Data Flow Engine, is a critical security consideration.
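As a concrete illustration of the encrypting-gateway idea, here is a hedged Python sketch of a gateway-side process forwarding a sensor reading upstream over TLS. It assumes the paho-mqtt client library and a hypothetical TLS-enabled broker at gateway.example.com; it is not tied to any particular gateway product:

# Illustrative sketch only: forwarding a local sensor reading over an
# encrypted MQTT connection from an IoT gateway. Broker host, credentials
# and topic name are hypothetical.
import json
import ssl
import paho.mqtt.client as mqtt

BROKER_HOST = "gateway.example.com"       # hypothetical upstream broker
BROKER_PORT = 8883                        # standard MQTT-over-TLS port

client = mqtt.Client(client_id="building-gw-01")
client.tls_set(ca_certs="ca.pem", cert_reqs=ssl.CERT_REQUIRED)   # verify the broker
client.username_pw_set("gw01", "secret")                         # placeholder credentials
client.connect(BROKER_HOST, BROKER_PORT)
client.loop_start()                                              # background network loop

# Only an opaque sensor ID and the raw value cross the wide-area network.
reading = {"sensor_id": "a3f9", "value": 21.4, "ts": 1457000000}
info = client.publish("building/events", json.dumps(reading), qos=1)
info.wait_for_publish()                                          # block until delivered

client.loop_stop()
client.disconnect()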

The IoT architectural question of where, and how, processing power relates to network architecture was outlined in the blog “IoT Architecture”. But the arrangement and physical location of the key functions, the Asset Data and Mapping engine and the Data Flow engine, in relation to security will depend on individual deployment factors. Therefore the following statements are general principles applied to the building management example.

As the role and capabilities of an Asset Data and Mapping engine and a Data Flow engine are not well understood, it might be desirable to read a previous blog, “IoT Data Flow Management, the science of getting real value from IoT data”. The white paper “Data Management for IoT” provides more detail on the use of IoT data and its differences from conventional data. However, the best explanation of Asset Data and Mapping, with its function of adding context data and location to simple IoT sensor event data, comes from watching the Asset Mapping explainer video on building management.

It is good security practice to keep sensor event traffic across the network semi-anonymous and not to append the critical contextual data that identifies the sensor, location and complete data file from the Asset Data and Mapping engine until the event is securely within the firewall/data center and ready for processing.
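A minimal sketch of that practice follows, with purely illustrative field names: only an opaque ID and a value travel over the network, and the Asset Data and Mapping step appends the sensitive context once the event is behind the firewall:

# Sketch only: the event that crosses the network is semi-anonymous; the
# contextual asset data is held and appended only inside the firewall.
event = {"sensor_id": "a3f9", "value": 21.4, "ts": 1457000000}   # as received from the gateway

ASSET_REGISTRY = {   # held by the Asset Data and Mapping engine in the data center
    "a3f9": {"type": "temperature", "building": "HQ", "floor": 3, "room": "3.12"},
}

def enrich(event, registry):
    """Append contextual asset data to a raw sensor event, only once it is
    safely behind the firewall and ready for processing."""
    context = registry.get(event["sensor_id"], {})
    return dict(event, **context)

print(enrich(event, ASSET_REGISTRY))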

Just as few pilot installations appreciate the full role of the IoT Gateway beyond physical functionality, few pilots include the means to manage large numbers of IoT sensors beyond a simple recognizable representative number on a dedicated GUI screen. Good practice will use an IoT Gateway with encryption to ensure that all data traversing the network to the Asset Data and Mapping engine has low vulnerability. After the full data set is appended to the sensor event data by the Asset Data and Mapping engine it becomes an important architectural consideration to limit where on the network this data is accessible.

Similar considerations apply to the Data Flow engine, in terms of its location but also of its role and use as part of the IoT security architecture. A Data Flow engine, as its name suggests and with its functionality described in the blogs previously referenced, can ensure that not all data is flooded across the entire network.

Cleverly positioned IoT Data Flow engines can control and manage data, using elements of the data payload to direct it to the required destinations. Avoiding all of the data being available over the entire network is another basic security good practice in IoT architectural design.
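A small sketch of that routing idea follows, with made-up rules and destination names; the point is simply that each event reaches only the destinations that need it rather than being broadcast across the whole network:

# Illustration only: a Data Flow engine deciding destinations from the payload.
def route(event):
    """Return the downstream destinations for an enriched sensor event."""
    destinations = []
    if event.get("type") == "temperature":
        destinations.append("hvac-control")    # building management system only
    if event.get("value", 0) > 30.0:
        destinations.append("alerting")        # threshold breach notification
    else:
        destinations.append("historian")       # routine readings go to the archive only
    return destinations

print(route({"sensor_id": "a3f9", "type": "temperature", "value": 33.2}))
# -> ['hvac-control', 'alerting']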

IoT architecture incorporating basic security elements in its design is a new discipline, and as such it really should be incorporated into proof-of-concept pilots to gain experience in these new functional building blocks before moving to scale deployments.

As IoT gains momentum and increasingly intelligent devices are interconnected, security becomes an increasingly pressing issue; witness the challenges with mobile phones and tablets today. Developing a full understanding of all the elements and vulnerabilities requires an effort to master the topic, and the rest of this blog is devoted to providing the necessary links.

The development of both new security risk and protection methodologies and new technology capabilities is under way and there are several different initiatives driving or coordinating efforts that provide interesting details.  

Two good starting points are: 1) the International IoT Security Foundation, for a general appreciation of the subject broken down into the various elements and issues in a multipart series; and 2) the ambitious OWASP (Open Web Application Security Project) Internet of Things Project, which describes itself as designed to help manufacturers, developers, and consumers better understand the security issues associated with the Internet of Things, and to enable users in any context to make better security decisions when building, deploying, or assessing IoT technologies. The project looks to define a structure for various IoT sub-projects such as Attack Surface Areas, Testing Guides and Top Vulnerabilities.

A more commercial view comes from WindRiver, an Intel company whose products are embedded into Intel processors, and from there into other products, in its white paper on Security in the Internet of Things, with the interesting subtitle ‘Lessons from the past for the connected future’. All these references provide both methods and an architectural appreciation of the challenge, with solutions using current technology. There are, however, two new technology approaches, one aiming to authenticate process interactions and the other to authenticate actual processor functions.

Blockchain has suddenly gained a big following for its potential to ensure that ‘chain’ reactions, or interactions, can be tested and established as secure in their outcomes. Though somewhat infamous for its relationship to the Bitcoin Internet currency, it nevertheless has much wider applicability in the ‘any to any’ environment of IoT. IBM has built a complete blockchain demonstrator, reported by CIO online under the headline “IBM Proof of Concept for Blockchain-powered IoT”.
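The core idea can be shown in a few lines of Python. This toy hash chain is purely illustrative (it is not IBM's demonstrator or a real blockchain protocol), but it shows why tampering with one recorded interaction invalidates everything that follows:

# Toy hash chain: each record embeds the hash of its predecessor.
import hashlib
import json

def add_block(chain, payload):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"payload": payload, "prev_hash": prev_hash}
    body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append(body)

def verify(chain):
    for i, block in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i else "0" * 64
        body = {"payload": block["payload"], "prev_hash": block["prev_hash"]}
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev_hash"] != expected_prev or block["hash"] != recomputed:
            return False
    return True

chain = []
add_block(chain, {"device": "a3f9", "action": "set_temp", "value": 21.0})
add_block(chain, {"device": "a3f9", "action": "set_temp", "value": 19.5})
print(verify(chain))                      # True
chain[0]["payload"]["value"] = 30.0       # tamper with an earlier interaction
print(verify(chain))                      # False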

PUF, standing for Physically Unclonable Function, is a technique that uses the variations introduced during chip production as a unique ‘signature’ for the chip, read as part of establishing its authenticity. This unique signature is used to create a unique encrypted checksum in reply to an identity challenge, enabling several different possible uses. Wikipedia provides a good description of the basic technique and its principal applications.
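The challenge-response flow can be sketched as follows. A real PUF derives its secret from physical silicon variation; this illustrative Python snippet substitutes a fixed HMAC key so the shape of the protocol can be shown:

# Conceptual sketch: an HMAC over a random challenge stands in for the response
# a physically unclonable function would derive from the chip's unique signature.
import hashlib
import hmac
import os

class SimulatedPUFDevice:
    def __init__(self, signature):
        self._signature = signature        # stand-in for the physical chip variation

    def respond(self, challenge):
        return hmac.new(self._signature, challenge, hashlib.sha256).digest()

device = SimulatedPUFDevice(b"unique-manufacturing-variation")
# The verifier holds a reference model of the enrolled device (in practice, a
# table of pre-recorded challenge/response pairs).
reference = SimulatedPUFDevice(b"unique-manufacturing-variation")

challenge = os.urandom(16)
authentic = hmac.compare_digest(device.respond(challenge), reference.respond(challenge))
print(authentic)   # True for the genuine device; a clone without the signature fails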

In conclusion, the following quote is taken from the concluding summary of the Telefonica white paper ‘Scope, Scale and Risk as never before’:

The networks IoT creates will be some of the biggest the World has ever seen. And that makes them enormously valuable to attackers . . . it is apparent that the Internet of Things is growing far faster and with a higher user knowledge base than its predecessor – The Internet itself. And this raises significant concerns.

What is a pilot today, and a closed IoT network tomorrow, will one day be part of the biggest network the world has ever known, so in planning a pilot, or a deployment, it is absolutely necessary to understand the security dimension.


Hortonworks Connected Data Platforms: More Than Sum of Parts

Hortonworks integrates Hortonworks Data Platform (Hadoop) and Hortonworks DataFlow (streaming data) platforms to offer a cohesive approach to analyzing data in motion and data at rest. Here’s how they fit together.

The “Connected Data Platforms” that Hortonworks introduced on March 1 are its well-known Hortonworks Data Platform (HDP) Hadoop distribution and its Hortonworks DataFlow (HDF) platform aimed at collecting, curating and routing real-time data from any source to any destination. HDP and HDF can be used independently, but here’s how they fit together to become a cohesive platform for managing and analyzing streaming and historical data.

Interest in streaming data analysis has been growing steadily in recent years, but the emergence of Internet of Things (IoT) opportunities has interest soaring. The thing is, streaming-data use cases such as connected cars, smart oil fields, smart utilities and precision medicine often require analysis of historical data, which brings context to the real-time insights. That’s why HDF and HDP need to be connected.

Inside Hortonworks Connected Data Platforms

This week Hortonworks introduced HDP’s 2.4 release. Notable upgrades include support for and bundling of Apache Spark 1.6 software as well as improved system management and remote optimization capabilities through Apache Ambari 2.2 and SmartSense 2.2. Ambari, the open source management software, gained an Express Upgrade feature that lets you quickly stop jobs, update software and restart the cluster and running jobs all within one hour, even on large systems. SmartSense is a “phone home” capability that relays system-performance parameters to Hortonworks, which can diagnose problems and offer more than 250 recommendations on optimizing system performance and availability.

The biggest development with HDP 2.4 is a new distribution strategy with two separate release cadences. Core Apache Hadoop components including HDFS, MapReduce and YARN as well as Apache Zookeeper will be updated annually, in line with other members of the ODPi consortium. Hortonworks is expediting other, newer capabilities through new “Extended Services” releases, which will be offered as quickly as they can be made available. One example of an Extended Service is support for Spark 1.6. Other candidates for this release approach will include Hive, HBase, Ambari “and more,” says Hortonworks.

MyPOV on HDP 2.4: I like this two-pronged strategy with the stable, slower moving core complemented throughout the year by extended services. Hortonworks has lagged behind Cloudera in the past in adding certain new capabilities that customers have been anxious to use. This is a good approach to fast tracking capabilities that are in demand (although they presumably can’t require changes to Hadoop core components). The approach also simplifies matters for other distributors of ODPi-based distributions.

Hortonworks DataFlow 1.2

HDF is Hortonworks’ streaming data platform based on Apache NiFi and adapted from last year’s Onyara acquisition. Upgrades with the move to HDF 1.2, which will be available later this month, include the integration of the Apache Kafka and Apache Storm streaming analytics engines. The release also gains support for Kerberos for centralized authentication across applications. On the near-term roadmap is support for Spark Streaming, which should be available by early summer, according to Hortonworks.
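For readers who have not worked with this pattern, here is a hedged Python sketch, using the kafka-python client and a hypothetical broker on localhost:9092 rather than anything HDF-specific, of the kind of Kafka topic a NiFi/HDF dataflow might feed for downstream Storm or Spark processing:

# Illustrative only: publishing sensor events to a Kafka topic for downstream
# stream processing. Broker address, topic and fields are hypothetical.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

event = {"device": "truck-17", "speed_kmh": 84, "ts": 1457000000}
producer.send("vehicle-telemetry", event)
producer.flush()   # make sure the event actually leaves the client buffer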

MyPOV on HDF: There’s much to like in Hortonworks DataFlow, including a drag-and-drop approach for developing the routing, transformation and mediation within dataflows. It also offers built-in data-security and data-provenance capabilities. One exec described it as “a FedEx for streaming data,” providing the digital equivalent of a logistics system for routing streaming data and tracking sources and changes to digital information along the way. The ecosystem seems strong, with support for more than 130 processors for systems including Kafka, Couchbase, Microsoft Azure Event Hub and Splunk.

How HDP and HDF are Connected

Hortonworks wants to be a multi-product company, so it has stressed that HDP and HDF will be sold and can be used independently. HDF can route data to (and draw from) other Hadoop distributions, databases such as Cassandra and cloud-based sources, such as Amazon S3.

When use cases span data in motion and data at rest, HDP and HDF have commonalities that make them easier to use together. For example, HDP and HDF share more than 70 data processors, and both use Ambari for system deployment and management. What’s more, Hortonworks is promising that SmartSense and the Ranger and Atlas security and governance projects will also support both platforms.

MyPOV on Connected Platforms: The need for the combination of streaming and historical data analysis is popping up in many quarters. It was touted as a benefit of Spark Streaming 2.0 at the recent Spark Summit East event, and MapR also has a strategy to address both forms of data in one platform.
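To make the streaming-plus-historical pattern concrete, here is a hedged sketch using the Spark Streaming (DStream) API available in Spark 1.6. The socket source, device IDs and profile fields are hypothetical stand-ins for a Kafka/HDF feed and an HDFS reference table:

# Illustrative pattern only: enriching data in motion with data at rest by
# joining each micro-batch of events against a historical profile RDD.
import json
from pyspark import SparkContext
from pyspark.streaming import StreamingContext

sc = SparkContext(appName="StreamPlusHistory")
ssc = StreamingContext(sc, 5)   # 5-second micro-batches

# Historical context, loaded once from the data-at-rest platform, keyed by device.
history = sc.parallelize([("truck-17", {"model": "T680", "avg_speed": 72})])

# Live JSON events arriving on a socket (a stand-in for a Kafka/HDF feed).
events = (ssc.socketTextStream("localhost", 9999)
             .map(json.loads)
             .map(lambda e: (e["device"], e)))

# Join every micro-batch with the historical profiles to add context.
enriched = events.transform(lambda rdd: rdd.join(history))
enriched.pprint()

ssc.start()
ssc.awaitTermination()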

Hype around streaming data opportunities is nothing new. More than a decade ago, complex event processing systems were touted as “ready to go mainstream.” At long last, I think we’re finally seeing signs that streaming data analysis is emerging. The mobile, social, cloud and big data trends set the stage and maybe, just maybe, the promise of IoT possibilities is pushing it over the top.

PS: Hortonworks also spotlighted two promising Spark related developments this week. First, it’s shipping a preview of Apache Zeppelin with HDP 2.4, providing a coding-free UI for visualization and a notebook-style approach to working on Spark. This is a usability improvement and democratization tool that Spark sorely needs. Second, in a partnership with HP Enterprise Labs, Hortonworks will bring to open source an optimized shuffle engine for Spark that HP Enterprise says will offer 5X to 15X performance improvements as well as optimized use of memory. This tech doesn’t have project status yet, let alone acceptance from the Spark community, but Hortonworks says it will ship the software with HDP later this year.

Related:
Spark Summit East Report: Enterprise Appeal Grows
Strata + Hadoop World Report: Spark, Real-Time In the Spotlight

 

 



Capgemini Joins Forces with Blueprint to Offer Advanced Requirements Management Capabilities for Financial Services

What’s The Combo Up To? Blueprint, an innovator and global leader in accelerating and de-risking large, complex IT projects, and Capgemini, one of the world’s foremost providers of consulting, technology and outsourcing services, today announced a resale agreement aimed at expanding strategic customer growth worldwide.

How Will Capgemini and Blueprint Work Together? Capgemini will continue to leverage Blueprint’s wide array of requirements management capabilities both for its internal product development and for its customer implementations. Blueprint’s leading-edge, enterprise solution allows Capgemini to accelerate delivery of its industry leading solutions and reduce total cost of ownership for its customers. Blueprint’s software assists in aligning business strategy with IT execution, ensuring regulatory compliance, and supporting organizational transformation.

A Note From the Executives: Martin Saipe, Senior Vice President of Corporate Development, Blueprint said, “Blueprint’s growing alliance with Capgemini represents a significant milestone in Blueprint’s channel strategy. Capgemini is a key part of our growth plan through the system integration channel. We are excited about this new resale capability and are expecting significant activity in the near-term.” 

Anand Moorthy, Vice President, Global Testing Practice, Financial Services, Capgemini stated, “Our collaboration with Blueprint provides very robust requirements and documentation tools for customer implementations and internal product development. We can accelerate the requirements process for many of our customers because of pre-built accelerator models that are refined over time, incorporating best practices which we then deploy in selected projects. Rather than creating new custom requirements for each project from scratch, we are able to be more prescriptive in our delivery, helping our customers manage costs and lock in quality.”

MY POV: This combination will help Capgemini’s clients accelerate the requirements process through pre-built accelerator models that are refined over time, incorporate best practices and are deployed in selected projects. As a former systems integrator and management consultant, I can say this is a very distinct advantage.

@DrNatalie Petouhoff, VP and Principal Analyst, Constellation Research

Covering Customer-facing Applications That Drive Better Business Results



Creating New Business Patterns for Social Impact

I have always believed that a sense of purpose would drive change, no matter whether that change was behavioural, economic or cultural. And as such, my work in marketing has always been driven by an interest in psychology, behaviour and action. The reality is that I am curiously interested in people and what makes them tick – not in the things that they tell you when prompted, but in the millions of tiny actions that create our personalities. For example, I love the way that vegans wear leather, or doctors smoke cigarettes. I adore the inconsistencies that defeat algorithms and confound logic.

But I also love the way that these apparent inconsistencies can also create opportunities.

Over the last couple of years, businesses have started to pay closer attention to millennials – that generation born between 1982 and 2004. And while the span is open to debate, it is clear that this generation have a substantially different mindset from those that came before. I notice this in the work that I do with youth entrepreneurship organisation, Vibewire – where I am regularly confronted by behaviours, actions and expectations that, on the surface, appear completely alien. And I notice it in my work with corporations and clients, and in the research I do for various public speaking events. But as this generation begins to reach into management and executive ranks of government and business, it is something that we are all having to come to grips with.

Deloitte’s Millennial Survey is a recent example of the research which serves to reinforce what we have long suspected – that a sense of values and purpose is at the core of the millennial mindset. Thus far we have seen this play out in the consumer landscape, with a significant reduction in leading indicators of personal consumption – consider:

  • The fall in the number of driving licenses issued and the downstream impact on car sales
  • The rise in preference for public transport and the increasing pressure on inner city housing
  • The interest in entrepreneurship opportunities and skills and the downstream disinterest in professional careers and career paths.

The Deloitte report indicated that while millennials are “pro-business”, they are also particularly interested in business’ potential to “do good”:

Millennials continue to express positive views of business, and their opinions regarding businesses’ motivations and ethics showed stark improvement in this survey. However, much skepticism remains, driven by the majority-held belief that businesses have no ambition beyond profit. Almost nine in 10 (87 percent) believe that “the success of a business should be measured in terms of more than just its financial performance.”

 


However, while there is an alignment of values between business and millennials, there is a substantial gap in the alignment of purpose. The report concludes: “Millennials would prioritize the sense of purpose around people rather than growth or profit maximization”.


This, of course, suggests unsettling economic, cultural and social futures while the mis-match is sorted out. But as in most things, the most negative impacts will be felt by those businesses that respond too late or fail to plan strategically.

How to plan ahead for generational change

Whether your business has felt the winds of generational change or not, make no mistake, it is coming. From 2015, the Baby Boomer generation began retiring from the global workforce, taking their years of experience and expertise and substantial spending power with them. This trend will accelerate in the coming years. And as those experienced business leaders trade suits and ties for no ties and sun-filled beaches, enterprises from downtown Chicago to dusky Beijing will be restocked with ambitious, values-focused millennials seeking to make their mark on the world. And this shift will force substantial change to what has been “business as usual”, with values and purpose taking centre stage.

Anecdotally, we are already seeing this play out. Financial services organisations are softening their positioning and message to the market. Utilities and resources companies are speaking of values, and professional services firms proclaim purpose and social impact. It’s out with conspicuous consumption and in with the sharing economy.

But this is just the beginning. Real change must be embedded deep in the hearts and souls of these organisations. It must be lived in the brand experience. And the “old ways” – the “business as usual” approaches must be re-made for this changing age.

Innovating for social impact

Often when we talk of innovation, we focus on something new or novel that is introduced to the public. It could be technology or an experience. It could combine the two. But we will begin to find that our efforts at innovation trip and stumble as they reach the market if we fail to take into account the changing nature of our buyer’s values and purpose. It won’t be good enough to put “lipstick on a pig” and serve it up on a bed of kale. We will need to begin the challenging task of creating shared value outcomes that don’t just serve our markets, stakeholders and management. We will need to address social impact too.

Over the last year or so, I have been working to create powerful business innovation frameworks that help entrepreneurs bring their products and services to market faster. My very first of these was an adaptation of the Lean Canvas used by startups for the purposes of social impact. I called it the Shared Value Canvas. Recently I have turned my attention to workshop and facilitation formats that use the same lean and agile methods employed by the world’s most innovative companies, tweaked to incorporate a social impact or social innovation outcome.


In the coming days, I expect to release a comprehensive handbook that guides facilitators and teams through a Five Day Social Sprint. Designed for not-for-profits and for-purpose organisations, it’s a deep dive into the tools and techniques that rapidly move from idea to product within a week’s worth of effort. It has been inspired by the Google Ventures five-day sprint process, but revised and refocused for social impact.

And while I hope it finds favour with charities and not-for-profit organisations around the world, I also hope it inspires more traditional businesses to find tangible ways to bring purpose and values to life within their organisations, one innovation at a time.


Hortonworks wants to be the next generation database for the enterprise

We had the opportunity to attend the 1st Hortonworks analyst summit held in San Francisco February 29th and March 1st at the beautiful Nikko hotel.

 


 

 
This was the first analyst event held by Hortonworks, and despite some substantial competition there was a good turnout of 20+ analysts in attendance. I was joined by fellow Constellationite Doug Henschen.

So take a look at the video:



 

No time to watch – here are the key takeaways:

Momentum – Hortonworks is growing and doing well, a testament to the transformational power of Hadoop in the enterprise. No surprise, as enterprises for the first time have the opportunity to bring together all their enterprise data in a single platform, at a fraction of the cost of maintaining a single silo of data, and with no need to know in advance what insights and questions may be asked later. Hortonworks execs were bullish about having passed competitor Cloudera; we will see at that vendor’s analyst summit. Anyway – a great proof point of the momentum.

Next generation DB for the enterprise – It was very clear that the Hortonworks ambition is to become the next generation database for enterprises. And while for ‘data at rest’ the vendor has achieved that status (as with all Hadoop-based vendors, Hadoop has become the de facto next-gen database for the enterprise), the verdict is still open on the ‘data in motion’ use cases (think most prominently of IoT). Hortonworks has now integrated last year’s acquisition of Onyara, providing an integrated product with HDF 1.2. Its sibling for data at rest is HDP 2.4, which has Apache Spark 1.6 support. HP Enterprise ‘donated’ a re-write of the MapReduce Shuffle code in C (originally in Java), showing up to 15x performance gains – so more performance to come for ODP at some point.

Two release trains – A sign of the maturation of the Hadoop market and its adoption is Hortonworks now moving to two release trains: one for cutting-edge users (called Extended Services) and one for the more conservative (often live) enterprises with Hadoop Core. A good move that should make both growing constituent groups happy.



 

MyPOV

Always good to be at a first analyst summit, especially when the vendor sees the traction that Hortonworks does. The vendor has a clear ambition to become the next generation database for the enterprise – addressing both data at rest and data in motion. These are traditionally different architectures and players, and bringing both together will be a challenge for anyone – so it's a tall ask (hence the blog post title). But it’s good for vendors to have an ambition, and given the early phase it is good to see Hortonworks is off to an early start in bringing these two database domains together.

On the concern side – more than a database is needed: a complete platform. Given the Springsource DNA in the executive team I am sure Hortonworks leadership is aware of that, but I guess it is one step at a time. And becoming more of the de facto standard Hadoop distribution for an IaaS player (as Hortonworks has achieved with Microsoft Azure) is a short term priority.

For enterprises it is clear that Hortonworks is one of the two (or three if you add MapR) Hadoop distributions to work with, and the ambition of becoming the next generation enterprise database will be attractive to more enterprises than not. We will be watching, stay tuned.



More on BigData


 
  • News Analysis - SAP Unveils New Cloud Platform Services and In-Memory Innovation on Hadoop to Accelerate Digital Transformation – A key milestone for SAP - read here
  • News Analysis - SAP delivers next release of SAP HANA - SPS 10 - Ready for BigData and IoT - read here
  • News Analysis - Salesforce Transforms Big Data Into Customer Success with the Salesforce Analytics Cloud - read here
  • Progress Report - Teradata is alive and kicking and shows some good 'paranoid' practices - read here
  • Event Report – Couchbase Connect – Couchbase’s shows momentum - read here
  • News Analysis - Couchbase unveils N1QL and updates the NoSQL Performance Wars - read here
  • Event Report - MongoDB keeps up the momentum in product and go to market - read here
  • News Analysis - Pivotal pivots to OpenSource and Hortonworks - Or: OpenSource keeps winning - read here
  • Progress Report - Cloudera is all in with Hadoop - now off to verticals - read here
  • Market Move - Oracle buys Datalogix - moves into DaaS - read here
  • Event Report - MongoDB is a showcase for the power of Open Source in the enterprise - read here
  • Musings - A manifesto: What are 'true' analytics? Read here
  • Future of Work - One Spreadsheet at the time - Informatica Springbok - read here
  • Musings - The Era of the no-design Database - Read here
  • Mendix - the other path to build software - Read here
  • Musings - Time to ditch your datawarehouse .... - Read here


Finally, find more coverage on the Constellation Research website here, and check out my magazine on Flipboard and my YouTube channel here.
 
 


Europe and US move on from Safe Harbor privacy - but not far

For many years, American businesses have enjoyed a bit of special treatment under European data privacy laws. The so-called "Safe Harbor" arrangement was negotiated by the US Department of Commerce so that companies could self-declare broad compliance with data security rules. Normally organisations are not permitted to move Personally Identifiable Information (PII) about Europeans beyond the EU unless the destination has equivalent privacy measures in place. The "Safe Harbor" arrangement was a shortcut around full compliance; as such it was widely derided by privacy advocates outside the USA, and for some years it had been questioned by the more activist regulators in Europe. And so it seemed inevitable that the arrangement would eventually be annulled, as it was last October.

With the threat of most personal data flows from Europe into America being halted, US and EU trade officials have worked overtime for five months to strike a new deal. Today (January 29) the US Department of Commerce announced the "EU-US Privacy Shield".

The Privacy Shield is good news for commerce of course. But I hope that in the excitement, American businesses don't lose sight of the broader sweep of privacy law. Even better would be to look beyond compliance and take the opportunity to rethink privacy, because there is more to it than security and regulatory short cuts.

The Privacy Shield and the earlier Safe Harbor arrangement are really only about satisfying one corner of European data protection laws, namely transborder flows. The transborder data flow rules basically say you must not move personal data from an EU state into a jurisdiction where the privacy protections are weaker than in Europe. Many countries actually have the same sort of laws, including Australia. Normally, as a business, you would have to demonstrate to a European data protection authority (DPA) that your information handling complies with EU laws, either by situating your data centre in a similar jurisdiction, or by implementing legally binding measures for safeguarding data to EU standards. This is why so many cloud service providers are now building fresh infrastructure in the EU.

But there is more to privacy than security and data centre location. American businesses must not think that just because there is a new get-out-of-jail clause for transborder flows, their privacy obligations are met. Much more important than raw data security are the bedrocks of privacy: Collection Limitation, Usage Limitation, and Transparency.

Basic data privacy laws the world over require organisations to exercise restraint and openness. That is, Personal Information must not be collected without a real demonstrated need (or without consent); once collected for a primary purpose, Personal Information should not be used for unrelated secondary purposes; and individuals must be given reasonable notice of what personal data is being collected about them, how it is collected, and why. It's worth repeating: general data protection is not unique to Europe; at last count, over 100 countries around the world had passed similar laws; see Prof Graham Greenleaf's Global Tables of Data Privacy Laws and Bills, January 2015.

Over and above Safe Harbor, American businesses have suffered some major privacy missteps. The Privacy Shield isn't going to make overall privacy better by magic.

For instance, Google in 2010 was caught over-collecting personal information through its StreetView cars. It is widely known (and perfectly acceptable) that mapping companies use the positions of unique WiFi routers for their geolocation databases. Google continuously collects WiFi IDs and coordinates via its StreetView cars. The privacy problem here was that some of the StreetView cars were also collecting unencrypted WiFi traffic (for "research purposes") whenever they came across it. In over a dozen countries around the world, Google admitted they had breached local privacy laws, apologised, and deleted the collected WiFi contents. The matter was settled in just a few months in places like Korea, Japan and Australia. But in the US, where there is no general collection limitation privacy rule, Google has been defending this practice. The strongest legislation that seems to apply is wiretap law, but its application to the Internet is complex. And so it's taken years, and the matter is still not resolved.

I don't know why Google doesn't see that a privacy breach in the rest of the world is a privacy breach in the US, and instead of fighting it, concede that the collection of WiFi traffic was unnecessary and wrong.

Other proof that European privacy law is deeper and broader than the Privacy Shield is found in social networking mishaps. Over the years, many of Facebook's business practices, for instance, have been found unlawful in the EU. Recently there was the final ruling against "Find Friends", which uploads the contact details of third parties without their consent. Before that there was the long-running dispute over biometric photo tagging. When Facebook generates tag suggestions, what it is doing is running facial recognition algorithms over photos in its vast store of albums, without the consent of the people in those photos. Identifying otherwise anonymous people without consent (and without restraint as to what might be done next with that new PII) seems to be unlawful under the Collection Limitation and Usage Limitation principles.

In 2012, Facebook was required to shut down their photo tagging in Europe. They have been trying to re-introduce it ever since. Whether they are successful or not will have nothing to do with the "Privacy Shield".

The Privacy Shield comes into a troubled trans-Atlantic privacy environment. Whether or not the new EU-US arrangement fares better than the Safe Harbor remains to be seen. But in any case, since the Privacy Shield really aims to free up business access to data, sadly it's unlikely to do much good for true privacy.

 

 

The examples cited here are special cases of the collision of Big Data with data privacy, which is one of my special interest areas. See for example "Big Privacy" Rises to the Challenges of Big Data.

 

 


Oracle Unveils First Data Protection Solution with Seamless Cloud Tiering

This morning Oracle announced a new storage offering that extends strategic options for enterprises on how, where and when to store data, all at an attractive TCO position. But it’s for data on… mainframes. Why would Oracle go after this market? At first glance it seems unlikely and unusual for the vendor, so it’s worth a blog post.
 

So let’s take apart the press release in our customary style; it can be found here:
Redwood Shores, Calif.– March 1, 2016 – Oracle announces the all new StorageTek Virtual Storage Manager (VSM) 7 System, the most secure and scalable data protection solution for mainframe and heterogeneous systems with the additional capability to provide fully automated tiering directly to the public cloud. Furthermore, Oracle’s StorageTek VSM 7 System has been architected to seamlessly integrate with Oracle Storage Cloud Service – Object Storage and Oracle Storage Cloud Service – Archive Service and provides storage administrators with a built-in cloud strategy. With Oracle, cloud storage is now as accessible as on-premise storage.

MyPOV – Summarizes well what VSM is about – give (IBM) mainframe customers an option to expand storage both on premises and to the cloud. With the integration with Oracle Cloud Service - Archive Service, VSM offers the option to bring this data to the cloud, if desired or required. Giving enterprises the choice between on-premises and cloud storage is a key capability that allows flexible storage deployment scenarios while enterprises formulate their long term systems and application strategy.
 
“In the past, data protection solutions were designed to deal with exponential data growth on-premises, but an entirely different dynamic drove the design of the VSM 7,” said James Cates, senior vice president, Archive Product Development, Oracle. “The core is still there—elevated performance, twice the capacity, a higher degree of scalability, but we saw a gap in the market, so we developed Engineered Cloud Tiering to enable mainframe users to take advantage of cloud economics.”
MyPOV – Good quote by Cates, summing up both the traditional (on premises) and new (cloud) capabilities of VSM version 7.
Organizations can experience these benefits:

Performance & Scalability: Oracle’s StorageTek VSM 7 System is a superior data protection solution for IBM z Systems mainframes with full data interchange across previous generation VSM systems and key features that IBM’s TS7700 virtual tape system lacks. Oracle’s StorageTek VSM 7 System delivers 34x more capacity, significantly higher scalability to 256 StorageTek VSM 7 Systems, data deduplication and native cloud tiering that provides mainframe and heterogeneous storage users the ability to access additional capacity on demand.
MyPOV – It’s always key for vendors to build enough new capability and capacity into a new offering to make it attractive enough for enterprises to consider adopting it. At the same time, a substantial gain in capabilities and capacity is essential to make sure the offering remains attractive given the effects of Moore’s Law on all things hardware. It looks like Oracle has built enough of a capability and capacity progression into VSM 7.
 
Enhanced Security: Powered by Oracle’s breakthrough SPARC M7 processor, Oracle’s StorageTek VSM 7 System delivers wide-key encryption for data at rest and on removable tape media without performance compromise and also uses Silicon Secured Memory for data protection.
MyPOV – Good to see the recent hardware-based security offerings being part of the package. Another example of what Oracle can do with the integrated ‘chip to click’ technology stack – designing higher level offerings in conjunction with lower level system capabilities, here taking advantage of building its own processors with the SPARC M7.
 
Availability: Oracle’s StorageTek VSM 7 System provides data protection solutions from on-premises to Oracle Public Cloud, with a single point of contact for all mainframe storage and heterogeneous storage requirements. Policies can be set to automatically copy or migrate files from external disk storage to low cost, off-site cloud storage. With native cloud tiering, customers benefit from end-to-end visibility and diagnostics from on-premise StorageTek VSM 7 System deployments to the Oracle Storage Cloud Service or Oracle Storage Cloud Archive Service.
MyPOV – As data volumes explode, keeping and protecting data is key for the next generation applications that enterprises are supporting. Being able to manage policy-based – and thus automated – choices between on-premises and cloud storage is very valuable.
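For illustration only (this is not Oracle's API or policy syntax), the Python sketch below shows the kind of policy-driven tiering decision such automation encapsulates, with a made-up 90-day archive threshold:

# Hypothetical tiering policy: files untouched for 90 days move to the cloud
# archive tier; everything else stays on the on-premises tier.
from datetime import datetime, timedelta

POLICY = {
    "archive_after_days": 90,
    "archive_tier": "cloud-archive",
    "default_tier": "on-premises",
}

def choose_tier(last_accessed, policy=POLICY, now=None):
    """Return the storage tier a file should live on under the policy."""
    now = now or datetime.utcnow()
    if now - last_accessed > timedelta(days=policy["archive_after_days"]):
        return policy["archive_tier"]
    return policy["default_tier"]

print(choose_tier(datetime(2015, 11, 1)))   # -> cloud-archive
print(choose_tier(datetime.utcnow()))       # -> on-premises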
 
Disaster Recovery: Oracle’s StorageTek VSM 7 System enables a “lights out” disaster recovery strategy, as a mainframe is no longer required at remote sites, dramatically reducing costs and simplifying deployments. Electronic data sharing across separate complexes, clustering, replication, and DR to the Oracle Public Cloud provides a breadth of simple, flexible disaster recovery options.
MyPOV – Disaster recovery remains a must-have capability for most enterprises to ensure business continuity. Oracle provides a key capability by not requiring the complete system (here a mainframe) to be mirrored on both sides of a DR equation, opening up scenarios for system replacement, cost savings and more agility.
 
Oracle is also addressing the needs of mission critical heterogeneous environments underserved by other solutions. Extending its enterprise-proven architecture to a broader customer base, Oracle is providing robust mainframe data protection capabilities with flexible disaster recovery options.
MyPOV – Good to hear that VSM supports scenarios beyond mainframes, while keeping mainframe quality, to allow support for more heterogeneous system deployments. Enterprises have heterogeneous system landscapes, and having high quality system choices at their disposal is very valuable for decision makers.
 

Overall MyPOV

The cloud has brought new life to many areas of the technology stack that have lived a quieter life in the recent decade. Storage is one of those areas, where a well understood IT offering that was on the path to commoditization has suddenly become strategic (again). All of the seven next generation application use cases Constellation Research has identified, and that enterprises are evaluating, building and looking to procure, show how strategically important storage is. In many cases, storage is on the critical path to enabling the overall use case, from a capability and/or cost perspective. So seeing innovation in the space, giving enterprises more deployment options and reducing the total cost of ownership (TCO), is a positive development for CxOs making strategic next generation application platform decisions.

Moreover, the combination of these offerings with the cloud gives enterprises key new strategic choices. Older on premises architectures that are still necessary can be extended while taking advantage of cloud-based technologies and deployments, while enterprises formulate their systems and platform strategy going forward. Oracle’s symmetrical architecture for both on premises and cloud based products increases deployment choices and flexibility substantially: enterprises know with a higher level of confidence that they can move data from on premises to cloud and vice versa. This confidence in a capability is crucial for deployment flexibility, which has risen to even greater importance given the recent data residency challenges enterprises face with the invalidation of the EU/USA Safe Harbor agreement.

So overall VSM 7 is a further proof point of the Oracle ‘chip to click’ technology stack, deployed either seamlessly on premises or in the Oracle cloud. Enterprises with storage needs should take note and take a look.


 
Recent blog posts on Oracle:
 
  • Market Move - Oracle acquires Ravello Systems - makes good on nested hypervisor roadmap - read here
  • Progress Report - Oracle Cloud - More ready than ever, now needs adoption read here
  • Event Report - Oracle Openworld 2015 - Top 3 Takeaways, Top 3 Positives & Concerns - read here
  • News Analysis - Quick Take on all 22 press releases of Oracle OpenWorld Day #1 - #3 - read here
  • First Take - Oracle OpenWorld - Day 1 Keynote - Top 3 Takeaways - read here
  • Event Preview - Oracle Openworld - watch here


Future of Work / HCM / SaaS research:
  • Event Report - Oracle HCM World - Full Steam ahead, a Learning surprise and potential growth challenges - read here
  • First Take - Oracle HCM World Day #1 Keynote - off to a good start - read here
  • Progress Report - Oracle HCM gathers momentum - now it needs to build on that - read here
  • Oracle pushes modern HR - there is more than technology - read here. (Takeaways from the recent HCMWorld conference).
  • Why Applications Unlimited is a good strategy for Oracle customers and Oracle - read here.

Also worth a look for the full picture
 
  • Event Report - Oracle PaaS Event - 6 PaaS Services become available, many more announced - read here
  • Progress Report - Oracle Cloud makes progress - but key work remains in the cellar - read here
  • News Analysis - Oracle discovers the power of the two socket server - or: A pivot that wasn't one - TCO still rules - read here
  • Market Move - Oracle buys Datalogix - moves more into DaaS - read here
  • Event Report - Oracle Openworld - Oracle's vision and remaining work become clear - they are both big - read here
  • Constellation Research Video Takeaways of Oracle Openworld 2014 - watch here
  • Is it all coming together for Oracle in 2014? Read here
  • From the fences - Oracle AR Meeting takeaways - read here (this was the last analyst meeting in spring 2013)
  • Takeaways from Oracle CloudWorld LA - read here (this was one of the first cloud world events overall, in January 2013)

And if you want to read more of my findings on Oracle technology - I suggest:
  • Progress Report - Good cloud progress at Oracle and a two step program - read here.
  • Oracle integrates products to create its Foundation for Cloud Applications - read here.
  • Java grows up to the enterprise - read here.
  • 1st take - Oracle in memory option for its database - very organic - read here.
  • Oracle 12c makes the database elastic - read here.
  • How the cloud can make the unlikeliest bedfellows - read here.
  • Act I - Oracle and Microsoft partner for the cloud - read here.
  • Act II - The cloud changes everything - Oracle and Salesforce.com - read here.
  • Act III - The cloud changes everything - Oracle and Netsuite with a touch of Deloitte - read here

Finally, find more coverage on the Constellation Research website here, and check out my magazine on Flipboard and my YouTube channel here.

Opportunities for Agencies: Innovation Can Be Learned

One of the most interesting and useful podcasts that I listen to is Drew McLellan’s Build a Better Agency podcast. Each week, Drew serves up fascinating and tangible tips, tricks and proven approaches to help agency owners grow their business. It’s a great combination of tactics and strategy, capability building and new ideas.

A couple of months ago, I had the opportunity to talk to Drew about the ways that agency owners can re-think their businesses and client relationships. With so many of the traditional agency offerings like design, SEO and even copywriting now commoditised and available through crowd-aggregation platforms, many agencies are being challenged to innovate or die. It seems that digital disruption is reaching into even the most creative of disciplines.

Or is it?

I have always seen the great strengths of the agency model working where a trusted relationship is able to be nurtured over a number of years. In these instances, agencies are able to take on more and more strategic work, shifting from a transactional supplier role into something more substantial. A partner. Or advisor. It is in these roles where agency owners have the greatest of opportunities – to recast the relationship again, bringing their teams’ creative problem solving talents into the value equation.

One of the ways of doing this is through the use of the tools and techniques popularised by high tech startups, like the lean canvas. This “business model on a page” approach quickly moves a client discussion to a higher level. It frames a new style of conversation that agency owners can lead.

And the great thing is, you can try it on your own agency first. In the podcast I share some tips for getting started. And remember – innovation isn’t something you are born with. It can be learned.
