Results

The government cannot simply opt-out of opt-in

The Australian government is to revamp the troubled Personally Controlled Electronic Health Record (PCEHR). In line with the Royle Review from Dec 2013, it is reported that patient participation is to change from the current Opt-In model to Opt-Out; see "Govt to make e-health records opt-out" by Paris Cowan, IT News.

That is to say, patient data from hospitals, general practice, pathology and pharmacy will be added by default to a central longitudinal health record, unless patients take steps (yet to be specified) to disable sharing.

The main reason for switching the consent model is simply to increase the take-up rate. But it's a much bigger change than many seem to realise.

The government is asking the community to trust it to hold essentially all medical records. Are the PCEHR's security and privacy safeguards up to scratch to take on this grave responsibility? I argue the answer is no, on two grounds.

Firstly there is the practical matter of PCEHR's security performance to date. It's not good, based on publicly available information. On multiple occasions, prescription details have been uploaded from community pharmacy to the wrong patient's records. There have been a few excuses made for this error, with blame sheeted home to the pharmacy. But from a system's perspective -- and health care is all about the systems -- you cannot pass the buck like that. Pharmacists are using a PCEHR system that was purportedly designed for them. And it was subject to system-wide threat & risk assessments that informed the architecture and design of not just the electronic records system but also the patient and healthcare provider identification modules. How can it be that the PCEHR allows such basic errors to occur?

Secondly and really fundamentally, you simply cannot invert the consent model as if it's a switch in the software. The privacy approach is deep in the DNA of the system. Not only must PCEHR security be demonstrably better than experience suggests, but it must be properly built in, not retrofitted.

Let me explain how the consent approach crops up deep in the architecture of something like PCEHR. During analysis and design, threat & risk assessments (TRAs) and privacy impact assessments (PIAs) are undertaken, to identify things that can go wrong, and to specify security and privacy controls. These controls generally comprise a mix of technology, policy and process mechanisms. For example, if there is a risk of patient data being sent to the wrong person or system, that risk can be mitigated a number of ways, including authentication, user interface design, encryption, contracts (that obligate receivers to act responsibly), and provider and patient information. The latter are important because, as we all should know, there is no such thing as perfect security. Mistakes are bound to happen.

One of the most fundamental privacy controls is participation. Individuals usually have the ultimate option of staying away from an information system if they (or their advocates) are not satisfied with the security and privacy arrangements. Now, these are complex matters to evaluate, and it's always best to assume that patients do not in fact have a complete understanding of the intricacies, the pros and cons, and the net risks. People need time and resources to come to grips with e-health records, so a default opt-in affords them that breathing space. And it errs on the side of caution, by requiring a conscious decision to participate. In stark contrast, a default opt-out policy embodies a position that the scheme operator believes it knows best, and is prepared to make the decision to participate on behalf of all individuals.

Such a position strikes many as beyond the pale, just on principle. But if opt-out is the adopted policy position, then clearly it has to be based on a risk assessment in which the pros indisputably outweigh the cons. And this is where making a late switch to opt-out is unconscionable.

You see, in an opt-in system, during analysis and design, whenever a risk is identified that cannot be managed down to negligible levels through technology and process, the ultimate safety net is that people don't need to use the PCEHR. Falling back on the opt-in policy is a formal risk management move, part of the risk manager's toolkit. In an opt-in system, patients sign an agreement in which they accept some risk. And the whole security design is predicated on that.

Look at the most recent PIA done on the PCEHR; section 9.1.6 "Proposed solutions - legislation" makes it clear that opt-in participation is core to the existing architecture. The PIA makes a "critical legislative recommendation" including "a number of measures to confirm and support the 'opt in' nature of the PCEHR for consumers (Recommendations 4.1 to 4.3) [and] preventing any extension of the scope of the system, or any change to the 'opt in' nature of the PCEHR".

The fact is that if the government changes the PCEHR from opt-in to opt-out, it will invalidate the security and privacy assessments done to date. The PIAs and TRAs will have to be repeated, and the project must be prepared for major redesign.

The Royle Review report (PDF) did in fact recommend "a technical assessment and change management plan for an opt-out model ..." (Recommendation 14) but I am not aware that such a review has taken place.

To look at the seriousness of this another way, think about "Privacy by Design", the philosophy that's being steadily adopted across government. In 2014 NEHTA wrote in a submission (PDF) to the Australian Privacy Commissioner:

  • The principle that entities should employ "privacy by design" by building privacy into their processes, systems, products and initiatives at the design stage is strongly supported by NEHTA. The early consideration of privacy in any endeavour ensures that the end product is not only compliant but meets the expectations of stakeholders.

One of the tenets of Privacy by Design is that you cannot bolt on privacy after a design is done. Privacy must be designed into the fabric of any system from the outset.

If the government were to ignore this core element of its own Privacy by Design credo, and not revisit the architecture of the PCEHR, which was never designed for opt-out, it would be an egregious breach of the public's trust in the healthcare system.

Resources:

  • Getting Started Guide: Privacy Engineering
  • The State of Identity Management in 2015
  • The State of Digital Privacy in 2015

Alteryx Pioneers Self-Service Data-Prep and Analytics

Alteryx Inspire15 Event Report: There’s a huge gap between spreadsheets on the one hand and statistician- and data-scientist-oriented workbenches on the other. Alteryx is doing its best to fill the void. “Excel needs a BFF!” quipped Alteryx president George Mathew at the company’s May 18-21 Inspire15 event in Boston.

That line drew a big laugh from the more than 800 Inspire attendees because many Alteryx customers use its software to replace broken spreadsheet-based analyses. At Home Depot, for example, Charles Coleman, senior analyst, special projects, used to spend two weeks pulling together point-of-sale, marketing and merchandising data in Excel in order to study store clustering and performance. Presenting at Inspire15, Coleman explained how he now uses Alteryx to blend and analyze that data in less than an hour.

Alteryx President and COO George Mathew highlights coming in-database and big-data platform connections at Inspire15.


There are millions of analysts out there who are ready for something more powerful than Excel, yet most aren’t ready for (or just aren’t interested in) the coding and complexity associated with traditional data-mining and statistical-analysis tools. Alteryx appeals to these would-be customers with a desktop designer tool that combines self-service data blending and coding-free predictive and spatial analysis. Desktops can be combined with the Alteryx server for companywide sharing of analytic applications and scheduled reports. There’s also a cloud-based Alteryx Analytic Gallery where you can quickly provision browser and mobile-device access to apps, reports and visualizations.

Alteryx data-prep capabilities include everything from data-extraction and cleansing to blending and enrichment. In many cases Alteryx is used exclusively for these functions in conjunction with analysis and data-visualization options like Tableau Software, and, more recently, QlikView and Qlik Sense. At Inspire, Alteryx president and COO George Mathew noted that the company has more than 300 joint customers with Tableau, including recent wins at Audi, EasyJet, EMC, and Johnson & Johnson.

I sat in on a session presented by Levi’s exec Michelle Londeree, who detailed how she used Alteryx and Tableau to quickly create drillable dashboards. The dashboards replaced conventional PDF reports that were simply taking too long to maintain and modify using the company’s conventional, IT-centric BI tools.

Many Qlik and Tableau users don’t even realize that the people prepping their data are doing so with Alteryx behind the scenes. But with interest growing in forward-looking predictive analytics, both Qlik and Tableau are actively promoting Alteryx as a partner that can help their customers embed predictive and spatial analyses. I witnessed this partner love first-hand at the recent Qlik Connections conference in Dallas, where the vendor had both keynote mentions and how-to sessions on bringing Alteryx predictions into Qlik analyses.

Once the data blending is done, the Alteryx Designer workflow continues with an extensive menu of tools for predictive and spatial data analysis that you drag and drop into place and then configure without coding. Workflows conclude with report and visualization options, and you bundle everything up into an application that can be shared with business users through the Alteryx Server or cloud-based Gallery environment.

Alteryx’ Next Release: In-Database and Spark

The next major release of Alteryx, due this fall, will bring upgrades for novices and experts alike. The newbies will get more tutorials and data samples at the desktop level to help them get started quickly. There will be charting upgrades and new analyses including social (graph) exploration. Server-level upgrades in the next release will include enhanced collaboration and version control, scheduling features, better auditing for data governance, and new scaling and redundancy features.

The next release will also bring enhanced data connections, with better REST-based ties to Salesforce, Marketo, MongoDB, SharePoint, and Redshift. New sources for Alteryx will include Qlik QVX files, JSON reading and writing for Cassandra, and SAP Hana. In the big-data-analysis vein, Alteryx is developing in-database analytics ties with Teradata and Amazon Redshift and push-down analysis capabilities with Hadoop and Spark. These in-database and push-down approaches bring the analysis to the data rather than moving the data to the analysis -- a proven time and labor saver when dealing with data at scale.

Apache Spark is the darling of the conference circuit this year, frequently mentioned and often the subject of keynote appearances. Earlier this month at Informatica World, the guest was Professor Michael Franklin, director of the UC Berkeley AMPLab where Spark was invented.  At Inspire15, Ion Stoica, CEO of Databricks, the commercial developer of Spark, joined George Mathew onstage. Both speakers touted Spark’s in-memory performance and its broad array of analysis approaches, including SQL, R, machine learning, graph and streaming.

Spark integration will extend Alteryx support for R-language-based analyses to large data sets through the Spark R engine. Mathew also touted Spark as a better, faster alternative to MapReduce on top of Hadoop, and he promised Alteryx will become a “first-class citizen” within Databricks Cloud, that vendor’s Spark-based big-data analysis environment running on Amazon Web Services.

MyPOV: Welcome to the Big Leagues

Alteryx is growing fast. Founded in 2010, Alteryx had only about 200 customers 18 months ago, but executives say that count will surpass 1,000 by this summer. Between its self-service data-blending and analytics capabilities and its Qlik and Tableau partnerships, the company is also winning bigger, broader deals with higher-profile customers.

With the coming in-database and big-data platform ties, Alteryx is going to draw even more attention, and, inevitably, more direct competition with some very large vendors. George Mathew made it sound like the gap between Excel and SAS/SPSS is a “greenfield” that Alteryx has all to itself, but giants including IBM, Microsoft, SAS, and SAP are paying close attention to this space.

The biggest competitor is surely SAS, which recently integrated its BI-oriented Visual Analytics and analytics-oriented Visual Statistics products. Both were introduced within the last two years to support coding-free, drag-and-drop analysis. SAP is converging its traditional data-mining workbench and its business-user-oriented KXEN InfiniteInsight acquisition in SAP Predictive Analytics. IBM is aiming for a broad user base with Watson Analytics. And Microsoft, too, is gearing up in this arena, building on predictive cloud services like Azure Machine Learning, acquiring Revolution Analytics, and getting more aggressive with its recent revamp of Power BI.

Make no mistake, self-service data-prep and self-service analytics are following in the footsteps of self-service business intelligence. So the gap between Excel and advanced analytics tools won’t be Alteryx’s alone to exploit. But Alteryx is doing a very good job of pioneering this space with an end-to-end, self-service workflow, plentiful data-access options including cloud sources, and a growing story around big-data analysis. Customers at the company’s event seemed truly inspired by the capabilities that are there today and by what’s coming.


Book Summary: Lesson 4 From Disrupting Digital Business: Data Is The Foundation

Get All 10 Lessons Learned From Disrupting Digital Business

As with the beginning of every revolution, those in the midst of it can feel it, sense it, and realize that something big is happening. Yet the shift is hard to quantify: the data isn’t clear, it’s hard to measure, the pace of change is accelerating, and the old rules seem not to apply.

Sometimes when you are in the thick of it, it’s hard to describe what’s happening. Digital business models have progressed over the past 20 years, and non-traditional competitors have each exploited a few patterns with massive success. As the models evolved, the winners realized there are more than a handful of patterns.

Lesson 1 – Transform Business Models And Engagement

Lesson 2 – Keep The Brand Promise

Lesson 3 – Sell The Smallest Unit You Can

Lesson 4 – Data Is The Foundation Of Digital Business

In fact, the impact is significant and now quantifiable: 52% of the Fortune 500 of 2000 are gone, and the average age of an S&P 500 company is down from 60 years in 1960 to a projected 12 years by 2020. That five-fold compression has changed the market landscape forever in almost every industry.

Over the course of the next 10 weeks, I’ll be sharing one lesson per week.  For traditional businesses to succeed, they will have to apply all 10 lessons from Disrupting Digital Business in order to not only survive, but also relearn how to thrive.

Data Is The Foundation Of Digital Business

Lesson 4 Data Is The Foundation

We’ve all heard about the importance of data. From big data to small data, the digital world relies on every interaction. Digital enables every touch point, every click, every piece of digital exhaust to be transformed into right-time, relevant, and contextual insight. This data powers a data-to-decisions framework in which data is transformed into information based on business process and context. From campaign to lead, lead to opportunity, order to cash, incident to resolution, hire to retire, procure to pay, and concept to product, context enables relevancy of process, which transforms data into information. Context driven by role, relationship, business process, product, geo-spatial attributes, time, sentiment, and even intent enables the right relevancy.

We then take this information and analyze it for patterns. The bigger the data set, the more opportunities for algorithms to find patterns of insight. The goal is both to ask questions of the data and to act on the patterns that surface. From these patterns, systems gain the ability to guide decisions and actions based on the data. This is the data-to-decisions framework that guides digital business.

From data to decisions

Just having the data is part one. The next step is to build big-data business models. My December 6th, 2012 post on Harvard Business Review is a great place to get started.

Homework

So where do we begin?  It's best to start by identifying the questions you seek to answer.  We often have clients start by asking: what traits make up my most valuable products, employees, customers, and suppliers?  This question helps drive the questions around what context matters.  From there you can figure out the information flows and business processes that drive that context. With this information in hand, understanding the people and devices touched will provide the next level of design. Finally, understanding the source and channel of your data will provide the reverse engineering required to succeed with the data-to-decisions framework.

Armed with this information, you can then start thinking about how to create big-data business models powered by insight.  Start by identifying what new brand promises can be delivered from insight.  Then identify what insight can be aggregated, benchmarked, or remixed for sale.  At the highest level, determine what marketplaces can emerge for that insight.

The Complete 10 Lessons Learned From Disrupting Digital Business

For those attending the full keynotes and book tours, you’ll get the complete session and in many cases a copy of a signed book.  For those following virtually, I’ve provided the slimmed-down SlideShare deck for your use.

You now have the 10 lessons learned to disrupt digital business in your hands. You can take this information and change the world in front of you or choose to sit on the knowledge as the world passes you by and digital darwinism consumes your organization.

I trust you will do the right thing. And when you want some company, come join us as a client at Constellation Research where we’re not afraid of the future and the art of the possible.

Get The Book Now Before Digital Darwinism Impacts You

Purchase on Amazon
Bulk Orders: contact [email protected]
About Disrupting Digital Business

Join the Digital Disruption Tour. Events in San Francisco, Atlanta, London, and Amsterdam!

Your POV.

Are you ready to disrupt digital business?  Have you ordered the book?

Add your comments to the blog or reach me via email: R (at) ConstellationR (dot) com or R (at) SoftwareInsider (dot) org.

Please let us know if you need help with your Digital Business transformation efforts. Here’s how we can assist:

  • Developing your digital business strategy
  • Connecting with other pioneers
  • Sharing best practices
  • Vendor selection
  • Implementation partner selection
  • Providing contract negotiations and software licensing support
  • Demystifying software licensing

Resources

Reprints

Reprints can be purchased through Constellation Research, Inc. To request official reprints in PDF format, please contact Sales.

Disclosure

Although we work closely with many mega software vendors, we want you to trust us. For the full disclosure policy, stay tuned for the full client list on the Constellation Research website.

* Not responsible for any factual errors or omissions.  However, happy to correct any errors upon email receipt.

Copyright © 2001-2015 R Wang and Insider Associates, LLC. All rights reserved.
Contact the Sales team to purchase this report on an à la carte basis or join the Constellation Customer Experience

The post Book Summary: Lesson 4 From Disrupting Digital Business – Data Is The Foundation appeared first on A Software Insider's Point of View.


CEN Member Chat: Five Tenets of Digital Boards

Andy Mulholland, Constellation Research VP & Principal Analyst, explains five strategies boards must implement in order to successfully guide digital transformation projects.

Resources: Boards Prepare Executives for Digital Business and Digital Leadership


Forget Co-Eds and Grumpy Cat, Facebook Is Now a Political Hub


Facebook has come a long way from the day when a handful of horny college boys used it to list and rate female students. Many will argue that despite the network’s continued popularity and evolving advertising and targeting algorithms, it’s done little more to benefit humanity than its original program.

Facebook’s evolution continues; toilet-flushing cat lovers have found Pinterest and the selfie-addicted seem to have moved on to Instagram (albeit not as fast as many of us wish). So what’s next for Facebook? What’s the next big trend among the next generation, among Millennials?

“Like” Button is the New Political Button 

Back in September 2013 Pew Research released a report that suggested most people are not comfortable with political conversations on Facebook. Fast forward almost two years and things have changed dramatically.

Last week, eMarketer reported on a study based on research by GfK for Harvard University Institute of Politics that shows the next generation has adopted Facebook as its preferred social network for sharing their political ideologies.

“When the study asked US millennial internet users about their digital political activities, respondents were most likely to have signed an online petition. However, the next four most popular responses all related to Facebook actions, as those with an account reported “liking” a political issue (30%) or candidate (24%) on Facebook, as well as using the social network to advocate for a political position.”

The report went on to list the massive popularity of Facebook as a political platform among the rest of the population as well, citing a report by ShareThis indicating that 71% of US internet users chose Facebook as their preferred place to share or comment on the 2014 midterm elections.

What’s even more interesting is the fact that the popularity of politics on social media seems to be focused on Facebook specifically; neither Millennials nor political pundits listed Twitter as a preferred network for political campaigning.

Politics 2.0: Ad Targeting on Facebook

Indeed, Facebook’s “Like” button is the next political support button. According to the online publication, The Record, delegates at a recent Conservative political convention in Canada found one quarter of the educational sessions provided by the party were focused on digital engagement with voters.

Columnist Stephanie Levits suggests that while data science has always been important for politicians, Facebook has taken it to another level. “Facebook can match voter lists gathered by parties to Facebook users and then target just those users with particular ads, and go even further by taking those profiles and looking for users who are similar, and help politicians blast them with ads as well.”

There doesn’t seem to be an end to the growth of political conversations on social media.  Google research data indicate that a typical voter will make 14.7 online attempts to gather information about political issues before deciding how to vote or for whom to vote.

My mother always said that religion and politics are not polite conversation at social events, yet somehow Facebook has embraced politics and religion as popular fodder for our collective digital/social conversations.

“Like” It or Hate It

There are certainly two camps in this discussion: more and more people are turning to Facebook to support their political affiliations, yet there’s growing frustration with politics in Facebook conversations.

I’m going to suggest a change to the idiom, “When your parents join Facebook, it’s time to quit Facebook.” Maybe it should read, “When Facebook friends become fixated on politics, it’s time to quit Facebook.”

Sensei Debates

  1. Is Facebook a better or worse social network thanks to its political content and opinion?
  2. Do you turn to Facebook for political discourse?

Sam Fiorella
Feed Your Community, Not Your Ego

The post Forget Co-Eds and Grumpy Cat, Facebook Is Now a Political Hub appeared first on Sensei Marketing.


Regulation Needed to Curb Inappropriate Re-Identification

In 2014, the New York Taxi and Limousine Commission (TLC) released a large "de-identified" dataset containing 173 million taxi rides taken in 2013. Soon after, computer scientist Anthony Tockar managed to undo the hashed taxi registration numbers. Tockar went on to combine public photos of celebrities getting in or out of cabs to recreate their trips, including, it was alleged, trips that had started at strip clubs. See Anna Johnston's analysis here.
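
The attack worked because the identifiers were reportedly obscured with a plain MD5 hash, and the space of valid taxi numbers is tiny, so every possible hash can be precomputed and matched. A minimal sketch of the idea, using an invented four-character ID format and made-up trip data rather than the real TLC schema:

```python
import hashlib
import string

def md5_hex(s):
    """Hash an ID the way the release (reportedly) did."""
    return hashlib.md5(s.encode()).hexdigest()

# Toy "de-identified" release: records keyed by the hash of the taxi ID.
released = {md5_hex("5X55"): {"pickup": "W 27th St", "fare": 12.50}}

# Assume (for illustration) IDs are digit + letter + two digits:
# only 10 * 26 * 100 = 26,000 candidates, so enumerating every
# possible hash takes a fraction of a second.
rainbow = {
    md5_hex(f"{d}{l}{n:02d}"): f"{d}{l}{n:02d}"
    for d in string.digits
    for l in string.ascii_uppercase
    for n in range(100)
}

for h, record in released.items():
    print(rainbow.get(h), record)  # recovers "5X55"
```

The lesson: hashing is not anonymisation when the input space is small enough to enumerate exhaustively.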

This re-identification demonstration has been used by some to bolster a general claim that anonymity is increasingly impossible.

On the other hand, medical research advocates like Columbia University epidemiologist Daniel Barth-Jones argue that the practice of de-identification can be robust and should not be so easily dismissed as impractical. The identifiability of celebrities in these sorts of datasets is, Barth-Jones reasons, a statistical anomaly, and should not be used to frighten regular people out of participating in medical research on anonymised data. He wrote, in a law journal article, that:

"[Examining] a minuscule proportion of cases from a population of 173 million rides couldn't possibly form any meaningful basis of evidence for broad assertions about the risks that taxi-riders might face from such a data release." (emphasis added by me).

In his position, Barth-Jones is understandably worried that re-identification of small proportions of special cases is being used to exaggerate the risks to ordinary people. But Barth-Jones belittles the risk of re-identification with exaggerations of his own. The assertion "couldn't possibly form any meaningful basis" over-states his case quite dramatically. The fact that any people at all were re-identified plainly does create a basis for concern for everyone.

Barth-Jones objects to any conclusion that "it's virtually impossible to anonymise large data sets" but in an absolute sense, that claim is surely true. If any proportion of people in a dataset may be identified, then that data set is plainly not "anonymous". Moreover, as statistics and mathematical techniques (like facial recognition) improve, and as more ancillary datasets (like social media photos) become accessible, the proportion of individuals who may be re-identified will keep going up.

[Readers who wish to pursue these matters further should look at the recent Harvard Law School online symposium on "Re-identification Demonstrations", hosted by Michelle Meyer, in which Daniel Barth-Jones and I participated, among many others.]

Both sides of this vexed debate need more nuance. Privacy advocates have no wish to quell medical research per se, nor do they call for absolute privacy guarantees, but we do seek full disclosure of the risks, so that the cost-benefit equation is understood by all. One of the obvious lessons in all this is that "anonymous" or "de-identified" are useless descriptions. We need tools that meaningfully describe the probability of re-identification.
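
One candidate for such a tool is k-anonymity: group the records by their quasi-identifiers and report the smallest group size. Any record in a group of one is unique on those attributes and a prime target for linkage. A minimal sketch (the records and field names here are invented):

```python
from collections import Counter

records = [
    {"postcode": "2000", "year_of_birth": 1980, "sex": "F"},
    {"postcode": "2000", "year_of_birth": 1980, "sex": "F"},
    {"postcode": "2026", "year_of_birth": 1955, "sex": "M"},  # unique, so k = 1
]

quasi = ("postcode", "year_of_birth", "sex")
groups = Counter(tuple(r[q] for q in quasi) for r in records)
k = min(groups.values())                                  # worst-case group size
unique = sum(1 for size in groups.values() if size == 1)  # records unique on quasi-IDs
print(f"dataset is {k}-anonymous; {unique} unique record(s)")
```

A k of 1 says at least one record is re-identifiable by linkage, which is a far more honest label than "de-identified".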

And we need policy and regulatory mechanisms to curb inappropriate re-identification.

I argue that the act of re-identification ought to be treated as an act of Algorithmic Collection of PII, and regulated as just another type of collection, albeit an indirect one. If a statistical process results in a person's name being added to a hitherto anonymous record in a database, it is as if the data custodian went to a third party and asked them "do you know the name of the person this record is about?". The fact that the data custodian was clever enough to avoid having to ask anyone about the identity of people in the re-identified dataset does not alter the privacy responsibilities arising. If the effect of an action is to convert anonymous data into personally identifiable information (PII), then that action collects PII. And in most places around the world, any collection of PII automatically falls under privacy regulations.

It looks like we will never guarantee anonymity, but the good news is that for privacy, we don't need to. Privacy is the protection you need when your affairs are not anonymous, for privacy is a regulated state in which organisations that have knowledge about you are restrained in what they do with it. Equally, the ability to de-anonymise should be restricted in accordance with orthodox privacy regulations. If a party chooses to re-identify people in an ostensibly de-identified dataset, without a good reason and without consent, then that party may be in breach of data privacy laws, just as they would be if they collected the same PII by conventional means like questionnaires or surveillance.

Surely we can all agree that re-identification demonstrations serve to cast light on the claims made by governments, for instance, that certain citizen datasets can be anonymised. In Australia, the government is now implementing telecommunications metadata retention laws in the interests of national security; the data, we are told, is de-identified and "secure". In the UK, the National Health Service plans to make de-identified patient data available to researchers. Whatever the merits of data mining in diverse fields like law enforcement and medical research, my point is that any government's claims of anonymisation must be treated critically (if not skeptically), and subjected to strenuous and ongoing privacy impact assessment.

Resources:

  • Getting Started Guide: Privacy Engineering
  • The State of Identity Management in 2015
  • The State of Digital Privacy in 2015



Why Digital Marketing Transformation is Important


I recently spent time with IBM travelling as part of their IBM Connect conference series in Auckland, Sydney and Melbourne. At each location, I hosted a panel discussion that centred on the “voice of the customer” – drawing out the experience and knowledge of panels that included ADMA’s CEO, Jodie Sangster, CIO of Tennis Australia, Samir Mahir, City of Melbourne’s Executive Manager, Commercial and Marketing, Lucan Creamer, Think Global Research’s Mark Tyler, and Twitter’s Head of Data Sales, Fred Funke.

I spent a few minutes with the IBM team to share my thoughts on why digital marketing transformation is important – and how you can use the “Marketing PANDA” to focus your efforts around customer centricity.


Getting the Swing of Microsoft Sway


Below is an embedded Microsoft Sway. To view the entire thing, place your mouse over the Sway and then scroll down. (It's a bit confusing at first.)


Event Report - Alteryx Inspire - Positive Growth on Product, Go to Market and Customers


We had the opportunity to attend the Alteryx user conference this week in Boston, held at the Westin hotel (which did not win a prize for its elevator service). The conference was well attended, with 850+ attendees from all over the world. The vendor is growing fast (e.g. live customers grew by 50%), and all the positive energy of growth was on display at the conference – customers, prospects, partners and employees were all energized.

 
So here are my top 3 takeaways from the event:

Good Housekeeping – A year ago, when Alteryx held their user conference in San Diego (it will be back there in 2016), I had the time to sit down to learn and use the product, something analysts seldom find the time to do. While the product had very powerful capabilities, the user experience felt a little dated and hastily assembled, with some inconsistencies in the UI. The user community was happy back then (as now) with the product; more importantly, though, Alteryx has listened, and there are improvements in the 9.5 release and more coming in 10 (shipping later this year). The UI has been redone, naming improved, tools harmonized, menus made consistent and performance improved – all key qualities of the upcoming versions. Customers will benefit from them, especially given the Alteryx business model (see below).
 
CEO Stoecker on Analytics Independence
 
Analytic Independence – a blessing or a misnomer? – The conference ran under the motto of analytical independence, a theme that suited Boston as a location, with its Tea Party and other history, and one both CEO Stoecker and COO Mathews used in their keynotes. But what Alteryx really does well, and is used for, is data prep and data automation. Getting data from a variety of sources collected, exported, transformed, enriched and fed to target systems is the forte of Alteryx. As such, the vendor has done well partnering with the leading visualization front-end vendors Tableau and Qlik. But none of this has a 'true' analytics aspect (see my view here); the closest Alteryx comes is modelling analytical insights with R, the most popular analytical language these days. Alteryx does well here with its R support, working closely with Spark and announcing SparkR as a new capability at the conference. What Alteryx enables is the freedom to get data fed to models, and to build those models if you like – so it is more about data modelling and prep freedom than analytics. Regardless, this is something all enterprises need, and Alteryx makes a big difference for customers in these vital steps. All customers we spoke to, on and off the record, see huge productivity gains in their BI operations, something key for enterprises today. As business accelerates, insights need to accelerate, and Alteryx is a vital tool for enterprises to achieve that.
 
 
 
COO Mathews on Predictive Visualizations

A unique go to market – Alteryx follows the 'land fast – then expand' model: downloading the product is free and easy, and in the following weeks Alteryx Sales catches up with the customer to turn the 'free' trial into revenue. It was great to learn that Alteryx has improved that model even further, realizing how vital the first 5 minutes after the download are for further commercial success. The 'housekeeping' improvements mentioned above will help significantly there, and having documentation, how-tos and training widely available is crucial, too. It looks like Alteryx has gotten the model right, as the number of customers has grown by 50% in the last 12 months. It is probably the only path for a smaller vendor to gain market share quickly, and we applaud Alteryx for the guts to build, improve and operate the model. Viral, self-service selling is the future of enterprise software sales, avoiding lengthy RfP cycles, throwing 'bodies in planes' and so on. The future user / decision maker wants to see, touch and use the prospective software quickly and make the buying decision on their own.

 
New Visualizations coming in Alteryx 10
 

MyPOV

Good progress by Alteryx on product capabilities; the upcoming release 10 has key improvements. It is also good to see that the vendor has perfected its lightweight, even viral, go-to-market approach.

On the concern side, it is clear that Alteryx banks on the 'keep data in place' approach (vs. moving it all) – it is too early to call which approach will win in the marketplace in the long run.

A neutral area right now is that Alteryx has 'lost' the desktop at approximately 50% of its customers to partners Tableau and Qlik. If Alteryx can keep the mix there and succeed in strengthening its front-end tools (which is in the plans), all is good. If its desktop market share falls further, it risks being relegated to a powerful 'backend tool'.

But for now it's good news for Alteryx customers, as the vendor is pushing the envelope further in data prep and 'true' analytics.

Find more coverage on the Constellation Research website here.

And some key tweets from Day #1 of Inspire15:

And a Meerkat capture of the Day #1 and #2 keynotes by CEO Stoecker and COO Mathews – check out my colleague Doug Henschen's and my takeaways at the end of the Day #2 keynote capture:

IBM isn’t about dark suits and starched shirts anymore: focused on providing tailored customer experience


At the beginning of last week I was in San Diego for IBM's Amplify customer and partner conference. While I have been fortunate to have traveled to some interesting places in our world, this was the first time I had been to San Diego – one of those oddities of my travel. It certainly did not disappoint, though unfortunately I was not able to visit any of the beaches, nor were the Padres in town. But I digress.

IBM had over 3,700 attendees and a large number of partners in attendance at Amplify 2015 – not surprising for a company of IBM's stature. The show kicked off with Alex Banayan, author and venture associate with Alsop Louie Partners, on stage, followed by Deepak Advani, General Manager for IBM Commerce. Nothing of great note, other than the fact that this is IBM and you had a millennial hosting and IBMers presenting in denim rather than pinstripe blue suits and starched white shirts! But this contrast is not lost on what IBM is looking to do with retail and eCommerce. From the main stage the message was clear: we are here to empower our clients to provide their customers with an experience that is second to none. IBM is gambling that its wide breadth of offerings, from software to services, all tied back to powerful analytics powered by… you guessed it… Watson, is a winning formula. A timely message for a market that is ripe for new business models for addressing customer demand and desire. Hardly a gamble, but rather a sound strategy to gain market share within retail.

So how is IBM ready to provide the framework necessary to meet this growing need? Deepak outlined six pillars that IBM is building its solution around, split between customer engagement and partners & suppliers. Around customer engagement they are:

  • Marketing
  • eCommerce merchandising
  • Customer analytics

On the partners & suppliers side, IBM is focused on the following three areas:

  • Procurement
  • Payment
  • Integration B2B

 


Deepak and IBM are betting on the belief that these areas of focus will allow them to provide retailers, and really anyone in the commerce supply chain, the framework to address many of their business issues as well as allow for new business models. These pillars allow IBM to leverage their large portfolio of products and more importantly – services – in a well-articulated offering.

The one area that is lacking is retail execution and fulfillment – an aspect of the commerce supply chain that, during my numerous hallway and formal conversations, was not lost on IBM executives or customers. It will be something to watch, as IBM has partnerships with the likes of JDA and Pitney Bowes that should address this gap in its offering. In the long term it will be interesting to see how these partnerships evolve. Does IBM stand to lose some control of its relationships with certain opportunities if it cannot also offer these execution and fulfillment capabilities from its own suite? How could this impact both its sales force and those of its partners, i.e. who will lead in certain deals? And if an implementation of the solution does not live up to expectations, which side will shoulder the responsibility?

Clearly IBM is not shy about partnering to augment its offerings or fill in gaps. During the event it announced a partnership with Facebook, which, coupled with the partnerships with Apple and Twitter, demonstrates that IBM is not averse to pulling a wide variety of players into its ecosystem. These partnerships are cleaner since, for the most part, they provide IBM with powerful and rich data sources to feed its growing analytics machine. When it comes to filling out the pillars Deepak mentioned, in the long run will IBM want the remaining pillars to be IBM blue, or can partnerships suffice?


