Results

Europe and US move on from Safe Harbor privacy - but not far


For many years, American businesses enjoyed a bit of special treatment under European data privacy laws. The so-called "Safe Harbor" arrangement, negotiated by the US Department of Commerce, let companies self-declare broad compliance with data security rules. Normally, organisations are not permitted to move Personally Identifiable Information (PII) about Europeans beyond the EU unless the destination has equivalent privacy measures in place. Safe Harbor was a shortcut around full compliance; as such, it was widely derided by privacy advocates outside the USA, and for some years it had been questioned by the more activist regulators in Europe. And so it seemed inevitable that the arrangement would eventually be annulled, as it was last October.

With the threat of most personal data flows from Europe into America being halted, US and EU trade officials have worked overtime for five months to strike a new deal. Today (January 29) the US Department of Commerce announced the "EU-US Privacy Shield".

The Privacy Shield is good news for commerce, of course. But I hope that in the excitement, American businesses don't lose sight of the broader sweep of privacy law. Even better would be to look beyond compliance and take the opportunity to rethink privacy, because there is more to it than security and regulatory shortcuts.

The Privacy Shield and the earlier Safe Harbor arrangement really only satisfy one corner of European data protection law, namely transborder flows. The transborder data flow rules basically say you must not move personal data from an EU state into a jurisdiction where the privacy protections are weaker than in Europe. Many countries have the same sort of laws, including Australia. Normally, as a business, you would have to demonstrate to a European data protection authority (DPA) that your information handling complies with EU laws, either by situating your data centre in a similar jurisdiction, or by implementing legally binding measures for safeguarding data to EU standards. This is why so many cloud service providers are now building fresh infrastructure in the EU.

But there is more to privacy than security and data centre location. American businesses must not think that just because there is a new get-out-of-jail clause for transborder flows, their privacy obligations are met. Much more important than raw data security are the bedrocks of privacy: Collection Limitation, Usage Limitation, and Transparency.

Basic data privacy laws the world over require organisations to exercise restraint and openness. That is, Personal Information must not be collected without a real demonstrated need (or without consent); once collected for a primary purpose, Personal Information should not be used for unrelated secondary purposes; and individuals must be given reasonable notice of what personal data is being collected about them, how it is collected, and why. It's worth repeating: general data protection is not unique to Europe; at last count, over 100 countries around the world had passed similar laws; see Prof Graham Greenleaf's Global Tables of Data Privacy Laws and Bills, January 2015.
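These three principles can be read almost as a checklist. As a purely illustrative sketch (the field names and rules below are my own shorthand, not drawn from any statute), a compliance check might look like:

```python
from dataclasses import dataclass

@dataclass
class CollectionRecord:
    """One hypothetical personal-data collection event."""
    attribute: str        # e.g. "email_address"
    primary_purpose: str  # the purpose stated to the individual at collection
    notice_given: bool    # Transparency: was the individual told what and why?
    consented: bool       # or a real demonstrated need

def may_collect(rec: CollectionRecord) -> bool:
    # Collection Limitation: no collection without notice and consent/need
    return rec.notice_given and rec.consented

def may_use(rec: CollectionRecord, proposed_purpose: str) -> bool:
    # Use Limitation: no unrelated secondary purposes
    return may_collect(rec) and proposed_purpose == rec.primary_purpose

rec = CollectionRecord("email_address", "order_confirmation", True, True)
print(may_use(rec, "order_confirmation"))    # True
print(may_use(rec, "marketing_newsletter"))  # False: unrelated secondary use
```

The point of the sketch is the ordering: lawful use presupposes lawful collection, which presupposes notice.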

Over and above Safe Harbor, American businesses have suffered some major privacy missteps. The Privacy Shield isn't going to make overall privacy better by magic.

For instance, Google in 2010 was caught over-collecting personal information through its StreetView cars. It is widely known (and perfectly acceptable) that mapping companies use the positions of unique WiFi routers for their geolocation databases, and Google's StreetView cars collect WiFi IDs and coordinates as they drive. The privacy problem here was that some of the StreetView cars were also collecting unencrypted WiFi traffic (for "research purposes") whenever they came across it. In over a dozen countries around the world, Google admitted it had breached local privacy laws, apologised, and deleted the collected WiFi contents. The matter was settled in just a few months in places like Korea, Japan and Australia. But in the US, where there is no general collection limitation privacy rule, Google has been defending the practice. The strongest legislation that seems to apply is wiretap law, but its application to the Internet is complex. And so it's taken years, and the matter is still not resolved.
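To make the distinction concrete, here is a minimal, purely hypothetical sketch of the difference between what a WiFi geolocation database needs and what the cars were also capturing. The field names are illustrative, not Google's actual schema:

```python
# Hypothetical scan record from a mapping car. The geolocation database
# needs only the router's identifier and a position fix; payload frames
# are the over-collection at the centre of the StreetView matter.
ALLOWED_FIELDS = {"bssid", "lat", "lon", "signal_dbm"}

def geolocation_record(scan: dict) -> dict:
    """Keep only the fields a WiFi geolocation database needs."""
    return {k: v for k, v in scan.items() if k in ALLOWED_FIELDS}

scan = {
    "bssid": "aa:bb:cc:dd:ee:ff",   # router ID observed from the street
    "lat": -33.86, "lon": 151.21,   # where the car saw it
    "signal_dbm": -60,
    "payload_frames": b"...",       # unencrypted traffic: never needed for mapping
}
print(geolocation_record(scan))
```

Collection Limitation, applied at the point of capture, is as simple as dropping fields there is no demonstrated need for.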

I don't know why Google doesn't see that a privacy breach in the rest of the world is a privacy breach in the US, and instead of fighting it, concede that the collection of WiFi traffic was unnecessary and wrong.


Other proof that European privacy law is deeper and broader than the Privacy Shield is found in social networking mishaps. Over the years, many of Facebook's business practices, for instance, have been found unlawful in the EU. Recently there was the final ruling against "Find Friends", which uploads the contact details of third parties without their consent. Before that there was the long-running dispute over biometric photo tagging. When Facebook generates tag suggestions, what it is doing is running facial recognition algorithms over photos in its vast store of albums, without the consent of the people in those photos. Identifying otherwise anonymous people without consent (and without restraint as to what might be done next with that new PII) seems to be unlawful under the Collection Limitation and Usage Limitation principles.

In 2012, Facebook was required to shut down their photo tagging in Europe. They have been trying to re-introduce it ever since. Whether they are successful or not will have nothing to do with the "Privacy Shield".

The Privacy Shield comes into a troubled trans-Atlantic privacy environment. Whether or not the new EU-US arrangement fares better than the Safe Harbor remains to be seen. But in any case, since the Privacy Shield really aims to free up business access to data, sadly it's unlikely to do much good for true privacy.

 

 

The examples cited here are special cases of the collision of Big Data with data privacy, which is one of my special interest areas. See for example "Big Privacy" Rises to the Challenges of Big Data.

 

 


Oracle Unveils First Data Protection Solution with Seamless Cloud Tiering


This morning Oracle announced a new storage offering that extends enterprises' strategic options for how, where and when to store data, all at an attractive TCO position. But it's for data on… mainframes. Why would Oracle go after this market, which at first glance seems unlikely and unusual for the vendor? That question is worth a blog post.
 

So let's take apart the press release in our customary style; it can be found here:
Redwood Shores, Calif.– March 1, 2016 – Oracle announces the all new StorageTek Virtual Storage Manager (VSM) 7 System, the most secure and scalable data protection solution for mainframe and heterogeneous systems with the additional capability to provide fully automated tiering directly to the public cloud. Furthermore, Oracle’s StorageTek VSM 7 System has been architected to seamlessly integrate with Oracle Storage Cloud Service – Object Storage and Oracle Storage Cloud Service – Archive Service and provides storage administrators with a built-in cloud strategy. With Oracle, cloud storage is now as accessible as on-premise storage.

MyPOV – Summarizes well what VSM is about: giving (IBM) mainframe customers an option to expand storage both on premises and to the cloud. With the integration with Oracle Cloud Service - Archive Service, VSM offers the option to bring this data to the cloud, if desired or required. Giving enterprises the choice between on-premises and cloud storage is a key capability, allowing flexible storage deployment scenarios while enterprises formulate their long-term systems and application strategy.
 
“In the past, data protection solutions were designed to deal with exponential data growth on-premises, but an entirely different dynamic drove the design of the VSM 7,” said James Cates, senior vice president, Archive Product Development, Oracle. “The core is still there—elevated performance, twice the capacity, a higher degree of scalability, but we saw a gap in the market, so we developed Engineered Cloud Tiering to enable mainframe users to take advantage of cloud economics.”
MyPOV – Good quote by Cates, summing up both the traditional (on premises) and new (cloud) capabilities of VSM version 7.
Organizations can experience these benefits:

Performance & Scalability: Oracle’s StorageTek VSM 7 System is a superior data protection solution for IBM z Systems mainframes with full data interchange across previous generation VSM systems and key features that IBM’s TS7700 virtual tape system lacks. Oracle’s StorageTek VSM 7 System delivers 34x more capacity, significantly higher scalability to 256 StorageTek VSM 7 Systems, data deduplication and native cloud tiering that provides mainframe and heterogeneous storage users the ability to access additional capacity on demand.
MyPOV – It’s always key for vendors to build enough new capability and capacity into a new offering to make it attractive for enterprises to adopt. At the same time, a substantial gain in capability and capacity is essential to keep the offering attractive given the effects of Moore’s Law on all things hardware. It looks like Oracle has built enough of a capability and capacity progression into VSM 7.
 
Enhanced Security: Powered by Oracle’s breakthrough SPARC M7 processor, Oracle’s StorageTek VSM 7 System delivers wide-key encryption for data at rest and on removable tape media without performance compromise and also uses Silicon Secured Memory for data protection.
MyPOV – Good to see the recent hardware-based security offerings being part of the product. Another example of what Oracle can do with its integrated ‘chip to click’ technology stack: designing higher-level offerings in conjunction with lower-level system capabilities, here taking advantage of building its own processors with SPARC M7.
 
Availability: Oracle’s StorageTek VSM 7 System provides data protection solutions from on-premises to Oracle Public Cloud, with a single point of contact for all mainframe storage and heterogeneous storage requirements. Policies can be set to automatically copy or migrate files from external disk storage to low cost, off-site cloud storage. With native cloud tiering, customers benefit from end-to-end visibility and diagnostics from on-premise StorageTek VSM 7 System deployments to the Oracle Storage Cloud Service or Oracle Storage Cloud Archive Service.
MyPOV – As data volumes explode, keeping and protecting data is key for the next-generation applications that enterprises are supporting. Being able to manage policy-based – and thus automated – choices between on-premises and cloud storage is very valuable.
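As a rough illustration of what such a policy-driven tiering rule might look like (this is a toy sketch, not Oracle's actual policy engine or API; the names and thresholds are invented):

```python
from datetime import datetime, timedelta

# Invented policy: migrate files untouched for more than 30 days from
# on-premises disk to the cloud archive tier.
POLICY = {"archive_after_days": 30}

def tier_for(last_access: datetime, now: datetime) -> str:
    """Decide the storage tier for a file based on its access age."""
    age = now - last_access
    if age > timedelta(days=POLICY["archive_after_days"]):
        return "cloud_archive"
    return "on_prem_disk"

now = datetime(2016, 3, 1)
print(tier_for(datetime(2016, 1, 2), now))   # cloud_archive (59 days old)
print(tier_for(datetime(2016, 2, 25), now))  # on_prem_disk (5 days old)
```

The automation value is exactly this: once the threshold is set as policy, no administrator has to decide file by file where data lives.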
 
Disaster Recovery: Oracle’s StorageTek VSM 7 System enables a “lights out” disaster recovery strategy, as a mainframe is no longer required at remote sites, dramatically reducing costs and simplifying deployments. Electronic data sharing across separate complexes, clustering, replication, and DR to the Oracle Public Cloud provides a breadth of simple, flexible disaster recovery options.
MyPOV – Disaster Recovery remains a must-have capability for most enterprises to ensure business continuity. Oracle provides a key capability in not having to mirror complete system capabilities (here, a mainframe) on both sides of the DR equation, opening scenarios with system replacement, cost savings and more agile qualities.
 
Oracle is also addressing the needs of mission critical heterogeneous environments underserved by other solutions. Extending its enterprise-proven architecture to a broader customer base, Oracle is providing robust mainframe data protection capabilities with flexible disaster recovery options.
MyPOV – Good to hear that VSM supports scenarios beyond mainframes while keeping mainframe quality, allowing support for more heterogeneous system deployments. Enterprises have heterogeneous system landscapes, and having high-quality system choices at their disposal is very valuable for decision makers.
 

Overall MyPOV

The cloud has brought new life to many areas of the technology stack that have lived a quieter life in the past decade. Storage is one of these areas: a well-understood IT offering on the path to commoditization has suddenly become strategic (again). All seven of the next-generation application use cases Constellation Research has identified, and that enterprises are evaluating, building and looking to procure, show how strategically important storage is. In many cases, storage is on the critical path to enabling the overall use case, from a capability and/or cost perspective. So seeing innovation in the space, giving enterprises more deployment options and reducing the total cost of ownership (TCO), is a positive development for CxOs making strategic next-generation application platform decisions.

Moreover, the combination of these offerings with the cloud gives enterprises key new strategic choices. Still-necessary older on-premises architectures can be extended while taking advantage of cloud-based technologies and deployments, while enterprises formulate their systems and platform strategy going forward. Oracle’s symmetrical architecture across on-premises and cloud-based products increases deployment choices and flexibility substantially: enterprises know with a higher level of confidence that they can move data from on premises to the cloud and vice versa. This confidence in a capability is crucial for deployment flexibility, which has risen even further in importance given the recent data residency challenges enterprises face with the invalidation of the EU/US Safe Harbor agreement.

So overall, VSM 7 is a further proof point of the Oracle ‘chip to click’ technology stack, deployed seamlessly either on premises or in the Oracle cloud. Enterprises with storage needs should take note / take a look.


 
Recent blog posts on Oracle:
 
  • Market Move - Oracle acquires Ravello Systems - makes good on nested hypervisor roadmap - read here
  • Progress Report - Oracle Cloud - More ready than ever, now needs adoption - read here
  • Event Report - Oracle Openworld 2015 - Top 3 Takeaways, Top 3 Positives & Concerns - read here
  • News Analysis - Quick Take on all 22 press releases of Oracle OpenWorld Day #1 - #3 - read here
  • First Take - Oracle OpenWorld - Day 1 Keynote - Top 3 Takeaways - read here
  • Event Preview - Oracle Openworld - watch here


Future of Work / HCM / SaaS research:
  • Event Report - Oracle HCM World - Full Steam ahead, a Learning surprise and potential growth challenges - read here
  • First Take - Oracle HCM World Day #1 Keynote - off to a good start - read here
  • Progress Report - Oracle HCM gathers momentum - now it needs to build on that - read here
  • Oracle pushes modern HR - there is more than technology - read here. (Takeaways from the recent HCMWorld conference).
  • Why Applications Unlimited is a good strategy for Oracle customers and Oracle - read here.

Also worth a look for the full picture
 
  • Event Report - Oracle PaaS Event - 6 PaaS Services become available, many more announced - read here
  • Progress Report - Oracle Cloud makes progress - but key work remains in the cellar - read here
  • News Analysis - Oracle discovers the power of the two socket server - or: A pivot that wasn't one - TCO still rules - read here
  • Market Move - Oracle buys Datalogix - moves more into DaaS - read here
  • Event Report - Oracle Openworld - Oracle's vision and remaining work become clear - they are both big - read here
  • Constellation Research Video Takeaways of Oracle Openworld 2014 - watch here
  • Is it all coming together for Oracle in 2014? Read here
  • From the fences - Oracle AR Meeting takeaways - read here (this was the last analyst meeting in spring 2013)
  • Takeaways from Oracle CloudWorld LA - read here (this was one of the first cloud world events overall, in January 2013)

And if you want to read more of my findings on Oracle technology - I suggest:
  • Progress Report - Good cloud progress at Oracle and a two step program - read here.
  • Oracle integrates products to create its Foundation for Cloud Applications - read here.
  • Java grows up to the enterprise - read here.
  • 1st take - Oracle in memory option for its database - very organic - read here.
  • Oracle 12c makes the database elastic - read here.
  • How the cloud can make the unlikeliest bedfellows - read here.
  • Act I - Oracle and Microsoft partner for the cloud - read here.
  • Act II - The cloud changes everything - Oracle and Salesforce.com - read here.
  • Act III - The cloud changes everything - Oracle and Netsuite with a touch of Deloitte - read here

Finally find more coverage on the Constellation Research website here and checkout my magazine on Flipboard and my YouTube channel here.

Opportunities for Agencies: Innovation Can Be Learned


One of the most interesting and useful podcasts that I listen to is Drew McLellan’s Build a Better Agency podcast. Each week, Drew serves up fascinating and tangible tips, tricks and proven approaches to help agency owners grow their business. It’s a great combination of tactics and strategy, capability building and new ideas.

A couple of months ago, I had the opportunity to talk to Drew about the ways that agency owners can re-think their businesses and client relationships. With so many of the traditional agency offerings like design, SEO and even copywriting now commoditised and available through crowd-aggregation platforms, many agencies are being challenged to innovate or die. It seems that digital disruption is reaching into even the most creative of disciplines.

Or is it?

I have always seen the great strengths of the agency model working where a trusted relationship is able to be nurtured over a number of years. In these instances, agencies are able to take on more and more strategic work, shifting from a transactional supplier role into something more substantial. A partner. Or advisor. It is in these roles where agency owners have the greatest of opportunities – to recast the relationship again, bringing their teams’ creative problem solving talents into the value equation.

One of the ways of doing this is through the use of the tools and techniques popularised by high tech startups, like the lean canvas. This “business model on a page” approach quickly moves a client discussion to a higher level. It frames a new style of conversation that agency owners can lead.

And the great thing is, you can try it on your own agency first. In the podcast I share some tips for getting started. And remember – innovation isn’t something you are born with. It can be learned.


CEN Member Chat: Assisted Productivity


Constellation Research VP & Principal Analyst, Alan Lepofsky, shares his latest technology research on Assisted Productivity - How Artificial Intelligence Will Help People Get Work Done. Chris Kanaracus, Constellation's Managing Editor, provides 2016 highlights on business and technology trend insights to watch for.


New Report: True Value of Technology in Food Supply Chains


My March issue of Bon Appetit magazine arrived just days ago, and it's not just any issue: it's the culture issue. Why does this matter? It matters because we've been dubbed food obsessed. Notice the smartphone with the pizza photo. That artisan pizza looks good enough to eat.

This scenario provides just one example among many of how consumers value food choices more and more these days. That cultural shift affects food supply chains. 

It's this new trend related to consumer preferences that makes this new report authored by Guy Courtin, Constellation Research VP & Principal Analyst, so relevant now. It validates the need for food supply chains to embrace technology that can deliver mass personalization to satisfy today's savvy consumers.

As Guy's executive summary points out, 

The food supply chain is challenging enough when it comes to the growing, harvesting, processing, transporting and distribution of food so that humans can safely consume it. As grocers, food manufacturers and restaurants strive to keep up with consumers’ needs and preferences, they must have the discipline to continually explore how technologies and processes can buttress their efforts to meet the evolving demands of regulatory compliance and consumer preferences. In addition, they must continuously battle for profit margin and market share that are being threatened from multiple angles.

This report looks at what companies in the food supply chain need to consider to thrive in these challenging times. This report offers insights on four of Constellation’s business research themes: Consumerization of Technology, Matrix Commerce, Data to Decisions, and Technology Optimization & Innovation.

Spoiler alert: Whole Foods Market, one of my personal favorites for grocery stores, is mentioned.

If you want to find out more, you can download a report excerpt here

 



McDonald’s breakfast gamble paying off…customer continues to dominate relationship.


Last fall, McDonald’s announced it would offer its breakfast menu all day. It was widely seen as a bold move, but one that carried a certain level of risk, especially in how the supply chain would support the change. We took a cautiously optimistic view of the move – click here for our post. The gamble seems to be paying off for McDonald’s, and to the detriment of its competitors. Players such as Jack in the Box have basically admitted that a competitor’s all-day breakfast offering has been detrimental to their business:

“Jack in the Box sales in the last part of the quarter were lower than we anticipated as several competitors began promoting aggressive value offers,” Jack in the Box CEO Lenny Comma said. “We also experienced weakness at breakfast and lunch throughout the quarter, which we attribute primarily to our decision to shift the timing of some of our promotional activity around breakfast to the second quarter as compared to the first quarter of last year. In addition, we believe a competitor’s messaging around its launch of all-day breakfast had some impact on our results, particularly in the 10:30 a.m. to noon period.”

Click here for the full post on McDonald’s breakfast results.

The undertone of this shift at McDonald’s, and the positive results it is enjoying, goes back to the rise of the customer. The customer spoke; those that listened are reaping the benefits. The restaurant and food industries are retail sub-segments that are particularly sensitive to customers’ tastes… literally. As we have witnessed customers’ power growing in their relationships with retailers, this is never more apparent than in the food sub-segment. Restaurants and grocers have to be acutely in tune with the changing winds of demand from their customer base, especially as food has become a fashion extension – foodies of all shapes and forms abound.

McDonald’s offered all-day breakfast not on a whim but because of what it perceived as unmet demand from its customer base. The lesson to take from these results is not only that McDonald’s has found success with the venture, but that it had the proper basis for the decision. Perceived customer demand, pent-up market need and impacts on margins all have to be weighed when launching a new product or direction.

Congratulations to McDonald’s on finding success with their all day breakfast. But as we all know, success can be fleeting. The Golden Arches cannot rest on their laurels. Not only will their competitors refocus on how they can retake some market share but the customers’ demand will evolve in manners we have not yet thought of. It is up to these entities to try and stay ahead of this wave. Not an easy task.

For an in-depth look at the food supply chain, click here for our latest research.


Tagged: Food and Beverage, McDonalds, Restaurant, Supply Chain


NBCUniversal Launches One-Stop Shop for Data-Driven Ad Targeting


What is Audience Studio? NBCUniversal wants to make it easier for advertisers to use data to target audiences more precisely across TV, digital and social media. The media company, owned by Comcast Corp., is introducing a new division called Audience Studio, dedicated to helping marketers employ data for ad targeting by tying together four ad buying products NBCU has introduced over the past few years. Audience Studio is led by Denise Colella, NBCU’s senior vice president of data platforms and strategy, who is in the midst of putting together a team of specialists for the new endeavor, including several planned hires.

Does Audience Studio Have a Data Management Platform? In addition to the group of ad targeting experts, at the heart of Audience Studio is a new “data management platform.” It’s basically a set of digital tools that advertisers can use to match their own data with data from NBCU and third-party sources in order to put together ad targeting segments, such as new moms in the market for a family-friendly car. Marketers can then use that information to direct their advertising to those groups on TV and the Web.
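At its core, what such a data management platform does can be sketched in a few lines. This toy example (the identifiers and attribute names are invented, not NBCU's actual schema) joins advertiser first-party data with third-party attributes on a shared identifier, then filters to a segment:

```python
# Advertiser first-party data and media-owner/third-party data, keyed by a
# shared (e.g. hashed-email) identifier. All names here are illustrative.
advertiser_data = {
    "h1": {"recent_purchase": "infant car seat"},
    "h2": {"recent_purchase": "gaming console"},
}
third_party_data = {
    "h1": {"life_stage": "new_parent", "auto_intent": True},
    "h2": {"life_stage": "student", "auto_intent": False},
}

def build_segment(rule):
    """Merge attributes per identifier and keep those matching the rule."""
    segment = []
    for uid, attrs in advertiser_data.items():
        merged = {**attrs, **third_party_data.get(uid, {})}
        if rule(merged):
            segment.append(uid)
    return segment

# Roughly: "new moms in the market for a family-friendly car"
seg = build_segment(lambda a: a.get("life_stage") == "new_parent" and a.get("auto_intent"))
print(seg)  # ['h1']
```

The value of doing this once, centrally, is exactly the complaint the article raises: without it, marketers repeat the matching separately for each ad product.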

What Four Offerings Does Audience Studio Include? Audience Studio will essentially tie together four recently launched NBCU ad offerings: Its Audience Targeting Platform lets advertisers target specific audiences on linear TV, and NBCUx is a similar product for digital media. NBCU+ Powered by Comcast provides marketers some access to Comcast set-top data for ad targeting purposes. And Social Synch helps brands extend the reach of their ad buys across various social networks.

Why Is Audience Studio An Interesting Opportunity for Marketers? Until now, a marketer looking to take advantage of two or more of these tools might have had to input data several different ways and come up with different definitions of potential ad targeting segments. Brands were largely then left to try to match up the segments manually. As more advertisers look to employ data-centric ad strategies across multiple media outlets, the process has the potential to cause major headaches, according to Krishan Bhatia, executive vice president of business operations and strategy at NBCU.

“If before you had this Chinese Wall between groups, with this, you are permeating that,” Mr. Bhatia said. “Going forward, a brand can now align their data inputs and outputs.”

What’s the Most Difficult Part of Targeting the Right Audience Segments? Naturally, advertisers want to eliminate any potential barriers when trying to increase their use of sophisticated targeting. Yet one complaint some buyers have raised recently is that each of the big TV companies will build its own unique systems and processes for data-driven advertising. That level of complexity might hold back the overall market’s potential, these buyers say.

In fact, some have advocated for a single technological solution that all the big TV players could employ for data-driven ad buys.

With Audience Studio, will NBCU be accused of going its own way? Mr. Bhatia said that he’d be more than willing to listen if such a broader effort were under way. “But we don’t have the luxury to wait for that solution to emerge,” he said. “We’d be the first people to think through how we might use that to help our clients. But right now we want to establish a leadership position and let our ad clients use any sort of data you could possibly imagine for advertising.”

MY POV: With yet another possible target marketing option in the marketplace, marketers need to take a really close look at the technology they currently have. It may mean running a “bake off” to determine, by comparing actual results, which technologies and platforms will really serve their needs, not only to target the right audience but also to increase lead conversion rates. The good thing is marketers have more choices than ever. The bad thing is, marketers have more choices than ever.

@Drnatalie, VP and Principal Analyst, Constellation Research

Covering Cloud, IOT, Marketing, Sales and Service to Create Awesome Customer Experiences



How the Car Park exposes Digital Failures that Kill Customer Experience



The humble car park. It is difficult to believe that this is the frontline in identifying digital failures and the need for Digital Devil’s Advocates – see http://wp.me/p15cZf-ek. This humble piece of real estate has become a clear example of how digital will fail when new technology or initiatives are bolted onto legacy processes. This was clearly seen in the Westfield example – http://wp.me/p15cZf-dy. Note that the media finally caught up to capioIT with their breathless “exclusive” on the parking issue, which came out only three months after we first identified it.

Of course, Westfield is not the only parking issue. The rise of GoGet, ZipCar, Car2Go et al. highlights how inflexible current frameworks can be in a digital world, and how a new regime of thought, process and investment is essential.


As the photo highlights, many car parks now allow only one free-parking period per day. With photographic identification of license plates, it is possible to know whether a car enters and remains in the parking station. In the pre-digital era, if you had two hours of free parking, you could remain for 1 hr 59 min, exit, and then return to maximize free parking time.

Now the parking manager is able to charge after the first two hours regardless of re-entry. I know this has caught me out. They do this because the model works for their limited thinking: car-parking revenue is increased, and car parks turn over. So far so good.
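The charging model described above can be sketched as a small routine: plate-recognition cameras log entries and exits, the operator sums total time on site per plate per day, and everything beyond the free allowance is billed regardless of re-entry. The function name, free period and rate here are hypothetical illustrations, not any operator’s actual system.

```python
from collections import defaultdict

FREE_MINUTES = 120       # hypothetical two-hour free allowance
RATE_PER_MINUTE = 0.10   # hypothetical charge beyond the free period

def charge_for_day(events):
    """events: list of (plate, entry_minute, exit_minute) tuples for one day.
    Returns plate -> amount owed, summing ALL visits per plate, so splitting
    a stay across re-entries no longer avoids the charge."""
    minutes_parked = defaultdict(int)
    for plate, entry, exit_ in events:
        minutes_parked[plate] += exit_ - entry
    return {
        plate: max(0, total - FREE_MINUTES) * RATE_PER_MINUTE
        for plate, total in minutes_parked.items()
    }

# A shopper who leaves at 119 minutes and returns for another 119 minutes
# is now billed for the time beyond the single free allowance.
bill = charge_for_day([("ABC123", 0, 119), ("ABC123", 150, 269)])
```

A single short visit under the allowance still costs nothing; it is the aggregation per plate per day that closes the re-entry loophole, and exactly that aggregation is what trips up share cars used by several drivers.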

The rise of the shared car rental is anathema to this inflexible thinking and old-school technology. An individual may take a share car for two hours to go shopping, then return it. Someone else may take the same car and also do their shopping in the centre. They may find they are charged for car park usage the moment they enter, getting massive bill shock if they choose the wrong car park. Car sharing services tend to be local in nature, and often in high-density areas, which only amplifies the issue.

The irony is that the lowest-technology car park, on-street parking usually provided by local government, has found a cost-effective and low-tech solution that keeps the customer experience in mind: a tin of spray paint.


Simply put, car park managers have clearly not caught up with car sharing technology and consumer behavior. Unsurprisingly, users are being unnecessarily punished as a result.

Yet again this example highlights a point that capioIT regularly evangelizes, and is increasingly consulting on for clients:

 Digital is more than a technology play.

  • The customer experience has to be front and centre of the outcome.
  • When the customer isn’t central, digital fails, whether it is online insurance, car parking or attending a sporting event.
  • There is no compromise or wriggle room: digital investments have to be more than technology.

 

Capture Point

Digital is hard. Digital failure is easy. Disruption and Digital are clichéd, but for a valid reason. This is especially true when Digital becomes just a technology solution leveraging legacy. If you, your organization, or your stakeholders rely on old technology and processes to become digital, then you are significantly increasing the likelihood of failure. Get the Digital Devil’s Advocate in to ensure that you have the correct durable platform, not an investment that is on track to fail.

 



Event Report - IBM Interconnect - IBM innovates and partners into the hybrid cloud era


We had the opportunity to attend IBM’s InterConnect conference happening in Las Vegas this week. The conference is a combination of the former Interconnect, the developer conference Pulse and the former Rational user conference. With over 25k attendees the conference was in high demand, spanning the Mandalay Bay and MGM properties.

 
 

For the top takeaways, check out my video:
 

If you have no chance to watch – read on:

The ‘Connect’ Offering group – IBM, like many other vendors recently (see Microsoft and of course Oracle), wants its customers to leverage existing on-premises investment in products, tools and know-how and connect (pun intended) it with the cloud offerings. IBM has been re-using a number of existing capabilities (e.g. Dataworks) and building new ones to enable this strategic option. When one considers, as I learned when I asked Robert LeBlanc, that there are over 200M WebSphere instances out there, there is a lot of investment to protect and tap into.

The VMware partnership – Not a surprising partnership; LeBlanc and Eschenbach laughed when I asked them what took them so long (likely VMware had to make some progress on its SDDC offerings). But this is a win / win / win for customers and both vendors. Customers (finally) get a global, single SLA and a single bill if they want to move VMware loads to the cloud, VMware gets an outlet beyond its fragmented partner system, and IBM gets load (more below).

New programming model with Whisk – Next-generation applications require new constructs in how software is built, the key qualities being lightweight, declarative, portable, and charged not for being put in place but only when used. Whisk is now IBM’s answer to similar programming models like AWS’ Lambda, though it deploys differently across products. But the qualities are the same: provide lightweight rules to cloud services at no cost, and pay only when they are utilized. A key capability.
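The programming model Whisk targets can be illustrated with a minimal action: a stateless function that receives an event’s parameters as a dictionary and returns a dictionary, and that incurs charges only when an event actually triggers it. This sketch follows the OpenWhisk Python action convention (a `main` entry point); the greeting logic itself is purely illustrative.

```python
def main(params):
    """A minimal OpenWhisk-style action: stateless and declarative,
    invoked (and billed) only when an event triggers it."""
    name = params.get("name", "world")
    return {"greeting": f"Hello, {name}!"}

# Locally, the platform's event dispatch can be simulated with a plain call:
result = main({"name": "InterConnect"})
```

The appeal of the model is precisely that there is nothing else: no server to provision, no idle process to pay for, just a function bound to a trigger.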
 

Analyst Tidbits

A lot of additional interesting areas of innovation:

Blockchain – IBM has been doing work here (as featured also at the Open Technology Summit on Sunday – see a Storify here). IBM is early among the technology vendors, but given its exposure to the financial sector, not really a surprise. IBM will also allow on-premises WebSphere instances to utilize Blockchain implementations on the web, an interesting capability that makes Blockchain ‘hybrid’.

BlueMix and Github – Making it easier to publish, find and reuse code artefacts is a key capability for any PaaS platform; the partnership with Github now gives BlueMix users better access to Github. A good move.

Watson gets ‘human’ – Watson has received new APIs that allow the cognitive platform to understand visual expressions, sense emotions and interpret the tone of a human voice. These are important advances for making Watson interact with humans, something IBM has always wanted and has now finally made available (earlier implementations elsewhere include e.g. Wipro Holmes).

Swift on the server / in the cloud – A lot of room was given to Swift, Apple’s new mobile development language, in the Monday keynote. IBM is extending Swift to run on the server side, which was also called the ‘cloud’ side. And IBM is not totally naïve on this, as it is building its 100+ mobile applications with / for Apple. Enterprise applications usually need a server side, and being able to use one programming language is of value, though we need to see how the overall adoption of Swift goes. It makes sense for IBM given the Apple partnership, but in general I think the days of mobile-OS-centric development platforms are numbered.

Object Storage – This form of storage is crucial for next-generation applications, and IBM has acquired Cleversafe for that capability. Cleversafe now runs on the IBM Cloud (aka SoftLayer) with a number of useful enhancements.

Siemens chooses Watson IoT – The large customer announcement on Monday was that Siemens Building Technologies has chosen IBM to build its IoT platform. Cognitive capabilities were key, according to CEO Matthias Rebellius. Two mini takeaways: it once again confirms that Europe leads in IoT; high-price, high-quality manufacturers there need an IoT strategy and are choosing North American technology vendors for it (who is left in Europe in that category?). And it is key for IBM to win the same and similar decisions with customers in order to create more load for the IBM cloud.
 

MyPOV

Overall a good conference for IBM customers, who can clearly see how IBM is helping them leverage their on-premises assets during their journey to more cloud. As we know, cloud is no longer an ‘if’ but a ‘when’ question for enterprises, so it’s good to see IBM is there to assist. If executed right, the Connect offerings will be of huge value in helping enterprises move to the cloud. The VMware partnership is synergistic and helps customers with VMware loads. And then there are the many, many other innovations a vendor like IBM churns out, all of which move IBM’s capabilities more in the direction of a partner to be chosen for the business model change (mostly digital transformation) lying ahead of enterprises in the next years.

On the concern side, IBM is moving in so many areas that it has to make sure they all work together. The times when IBM could get away with operating like five independent companies (as it did 10-15 and more years ago) are over. Products and offerings need to fit seamlessly and well together. It is good to see IBM aggressively putting both organization and talent in the right places: all database offerings (including DB2 (!)) are under a single CTO, and outside-in perspectives are not only valued but put in charge (e.g. the new head of Watson is from weather.com). Now the move to become a software company needs speedy execution, as IBM operates under the market scrutiny and pressure of a record number of consecutive quarters of revenue decline.

But overall a good Interconnect event for IBM, showing progress on all fronts. We will be watching through the year if it is enough. Stay tuned.

 

Imagining the IoT Stadium of the Future


Earlier this week I had the chance to attend the IBM InterConnect conference to learn about what IBM is doing in the sports and technology space and to share some thoughts in that area via their blog and social media. The event itself has a large emphasis on the technology infrastructure needed to support the volume of data and systems integration that all industries now require, and within that, the Internet of Things (known shorthand as IoT) kept popping up as a topic.

Now IoT isn’t a huge topic in traditional sports business, but it is quite relevant in the health and fitness industry (wearables, smart watches, sensors, equipment) and continues to expand in so many ways, from home appliances to vehicles and more. So what I want to do is have a bit of fun and brainstorm all the ways that IoT could live within the context of a “smart stadium.” Here are a few things that came to mind, some of which exist in various forms today and others that may never exist, but it’s fun to imagine.

1. Beacons: Beacons and other geo-location technologies are actively being deployed in sports venues and used to collect and deliver all sorts of information related to the gameday experience. We are learning how fans engage with different parts of the building, discovering what types of location-specific messaging can drive behavior, and gathering insights on fans who may otherwise be anonymous to the team.

2. Smart Seats: One of the most fundamental tenets of sports is putting butts in seats, so why don’t we make those seats smarter? We do get a lot of data from access control systems telling us when people enter/exit the venue and what their seat location is, but we don’t know when that person is or isn’t sitting in that seat. Many modern venues with club spaces struggle with getting people out of the club and back into those lower-level seats we see on TV. We also have challenges around logistics for food stands, bathrooms and other non-seating spaces. In a way, I think the data that “smart seats” could generate would be quite interesting.

3. Hawkers: Another classic element of sports is the vendors who navigate the building selling everything from peanuts to pennants to fans in the comfort of their seats. While in-seat ordering has changed the value proposition of hawkers, many venues still use them to drive per-caps. Smart technology tracking where hawkers are and the volume of goods sold vs. available could really optimize this process.

4. Security: Loyalty cards are becoming the new norm for ticket management and access control, but one thought that came up in an earlier Sports & Technology session was that as these cards and related systems get smarter, they become a valuable tool for security. We’ve started to see this with things like Clear in use at some ballparks and many new security procedures becoming the norm. Anything that can minimize risk while streamlining the stadium ingress process would be quite valuable.

5. Alcohol: Ah, a big gameday money-maker, but one that also has an equally big liability issue. Right now, staff training is the primary method for teams to avoid overserving, but why can’t we be smarter? Let’s take those same ticket cards (which often have loaded value) and track the level of alcohol purchases on each card. And yes, there are ways around this, so maybe we can use the technology in a product like Breathometer (first saw them on Shark Tank) and see if there’s a logical way to scale this for wider use in a venue.
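The idea above, using stored-value ticket cards to cap alcohol purchases, amounts to a simple per-card counter checked at the point of sale. Everything in this sketch (the class name, the limit, the card IDs) is a hypothetical illustration, not any venue’s actual system.

```python
from collections import defaultdict

MAX_DRINKS_PER_GAME = 4  # hypothetical per-card limit for one event

class AlcoholTracker:
    """Tracks alcohol purchases per ticket card for a single event."""

    def __init__(self, limit=MAX_DRINKS_PER_GAME):
        self.limit = limit
        self.purchases = defaultdict(int)

    def try_purchase(self, card_id, drinks=1):
        """Approve the sale only if it keeps the card at or under its limit."""
        if self.purchases[card_id] + drinks > self.limit:
            return False  # decline: would exceed the per-card cap
        self.purchases[card_id] += drinks
        return True

tracker = AlcoholTracker()
tracker.try_purchase("card-42", 2)   # approved
tracker.try_purchase("card-42", 3)   # declined: would exceed the cap
```

The cap is enforced at sale time rather than reconstructed afterwards, which is the whole point: the decline happens before the overserving does.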

6. Personal Trackers: We already have team apps available on the Apple Watch, so maybe we can take these one step further and incentivize fans to opt in to sharing the additional data these gadgets generate. Instead of a decibel-level tracker on the big screen, can we show the collective heart rate of our fans or the opponent’s fans? Maybe we discover unusual fan movement patterns that can be used to improve facility operations.

7. Parking Lots: A combination of in-ground/above-ground sensors and cameras could be just the ticket to streamlining the ingress and egress issues that many stadiums are constantly dealing with. Combine that data with real-time analysis and a feed to the team’s mobile app and now your fans can know what route to use regardless of accidents or traffic changes.

8. Environment: With the ever-growing importance of developing environmentally conscious facilities, there would be a great opportunity to use sensors and other IoT devices to actively monitor everything from air and noise pollution to waste disposal and more.

9. Lighting: We’ve seen things like Nest and other “home automation” systems for thermostats, lightbulbs, audio systems and more; could we have smart stadium lighting in the future? This might be hard for the massive lighting structures used during night games, but there is so much other lighting throughout a venue that could be automated based on data collection and analysis.

10. Robots: Ok, I’m getting crazy here, but in 50 years, who knows! No one predicted eSports twenty years ago.

I’m sure there are a lot more IoT possibilities that I missed, but it’s fun to try to picture how this type of technology will become part of the stadium experience, especially considering how pervasive it is everywhere else. It’s this type of technology evolution that puts more and more emphasis on the infrastructure within the venue. As I mentioned in a tweet earlier in the week, the pipes matter. Teams need to do whatever they can to plan for the future instead of today.