
Long Way to Go to Tackle Identity Fraud

Steve Wilson, Constellation VP & Principal Analyst, shares his POV in 10.5 minutes.

"Identity fraud is very simple, it is necessary to work in authentication because it is now very easy to replace, falsifying and stealing the digital identity. The identification industry has worked on increasing security and privacy, reduce costs and barriers of authentication, but needs to go much further."

Watch the video: https://www.youtube.com/embed/It0s2xPL-iU

News Analysis - Apple & SAP Partner to Revolutionize Work on iPhone & iPad

News Analysis - Apple & SAP Partner to Revolutionize Work on iPhone & iPad

It looks like the SAP “pre-Sapphire leak announcement” tradition that was broken 2 years ago is alive and well – today Apple announced a partnership with SAP. It’s not clear what may have motivated Apple to push the gas pedal on the timeline, apart from the known slowing of iPad sales and, more recently, iPhone sales. With 13 days to Sapphire, there are a number of selling days left in the quarter...

 
So let’s pick apart the press release in our customary style – it can be found here:
 
CUPERTINO, California and WALLDORF, Germany — May 5, 2016 — Apple® and SAP today announced a partnership to revolutionize the mobile work experience for enterprise customers of all sizes, combining powerful native apps for iPhone® and iPad® with the cutting-edge capabilities of the SAP HANA platform. This joint effort will also deliver a new iOS software development kit (SDK) and training academy so that developers, partners and customers can easily build native iOS apps tailored to their business needs.
MyPOV – Great introductory paragraph that summarizes the scope well – native apps for iPhone and iPad, a new iOS SDK to run on SAP HANA platform (why not SAP HANA Cloud Platform – HCP – as mentioned later?), and good reference to training (that is often forgotten in partnerships like these).
 
“This partnership will transform how iPhone and iPad are used in enterprise by bringing together the innovation and security of iOS with SAP’s deep expertise in business software,” said Tim Cook, Apple’s CEO. “As the leader in enterprise software and with 76% of business transactions touching an SAP system, SAP is the ideal partner to help us truly transform how businesses around the world are run on iPhone and iPad. Through the new SDK, we’re empowering SAP’s more than 2.5 million developers to build powerful native apps that fully leverage SAP HANA Cloud Platform and tap into the incredible capabilities that only iOS devices can deliver.”
MyPOV – Good quote from Cook, of course the partnership makes sense, but it is not clear what makes this partnership special. SAP could have (and has) built on iOS (natively) before. Would be good to learn what the ‘incredible capabilities that only iOS can deliver’ are. But let’s read on.
 
“We’re proud to take this special partnership between Apple and SAP to a groundbreaking new place,” said Bill McDermott, CEO of SAP. “In giving people an agile and intuitive business experience, we empower them to know more, care more and do more. By combining the powerful capabilities of SAP HANA Cloud Platform and SAP S/4HANA, together with iOS, the leading and most secure mobile platform for enterprise, we will help deliver live data to people wherever and whenever they choose to work. Apple and SAP share a commitment to shaping the future, helping the world run better and improving people’s lives.”
MyPOV – Good quote from McDermott – though it is not clear what SAP will deliver: new capabilities on SAP HANA Cloud Platform – as mentioned before – and/or native apps for S/4HANA. The question is of course – what about the existing applications etc. And interesting that McDermott gives iOS the ‘most secure mobile platform for the enterprise’ badge – without justification. But I guess both CEOs / press teams agreed on the CEOs complimenting each other.
 
The companies plan to deliver a new SAP HANA Cloud Platform SDK exclusively for iOS that will provide businesses, designers and developers the tools to quickly and efficiently build their own iOS apps for iPhone and iPad, based on SAP HANA Cloud Platform, SAP’s open platform as a service. These native apps will provide access to core data and business processes on SAP S/4HANA, while taking full advantage of iPhone and iPad features like Touch ID®, Location Services and Notifications.
MyPOV – Always good to see SAP talk HCP, a product that in my view has not gotten the attention, keynote time, marketing spend etc. that it deserves, as it is vital for both customers and SAP to e.g. create partnerships like this and build innovative next generation applications. Without a good competitive platform SAP won’t be able to do well in enterprise software. So it is good to see HCP mentioned by McDermott twice (!) in one paragraph. And now we learn that S/4HANA processes (APIs?) will be exposed in the SDK, good to know / understand. The question is of course – what happens with all the other SAP applications.
 
A new SAP Fiori for iOS design language will take the award-winning SAP Fiori user experience to the next level by combining it with a consumer-grade iOS experience to deliver on the robust user needs in the enterprise and enable developers to build next-generation apps. To help SAP’s 2.5-million member global developer community take full advantage of the new SDK and Apple’s innovative hardware and software, a new SAP Academy for iOS will offer tools and training. The new SDK, design language and academy will begin rolling out before the end of the year.
MyPOV – Good to see Fiori in the mix, as it should guarantee a high level of UI consistency for SAP users. As much as we live in the ‘mobile first’ world – users are still using browsers (or an iPad) with a different form factor and deserve an ‘as consistent as possible’ user experience. Why it requires a ‘new’ Fiori design language is something we need to understand better… new is good – but more languages also add complexity. Maybe not so close to the 'simple' SAP likes to stress. And as developers could build iOS apps today, why is a new SDK needed? And good to see the know-how dissemination efforts.
 
As a part of the partnership, SAP will develop native iOS apps for critical business operations. These apps for iPhone and iPad will be built with Swift™, Apple’s modern, secure and interactive programming language, and will offer a familiar user experience with the SAP Fiori for iOS design language. Workers across industries will be empowered to access the critical enterprise data, processes and user experience they need to make decisions and take action right from their iPhone or iPad through apps designed to enable a field maintenance worker to order parts or schedule service, or a doctor to share the latest patient data with other healthcare professionals.
MyPOV – The Swift endorsement is a key win for Apple, but does not bode too well for developer productivity in the likely scenario of building cross mobile OS applications. [Update May 9th 2016 - Apple points out correctly that it has open sourced Swift and there are initiatives by 3rd parties under way to address cross platform support of Swift. And certainly Swift is the most efficient platform for iOS apps today.] How will a developer build an Android, Windows 10 etc. application working against HCP? Maybe the new Fiori SDK language will address this, though there is no data to support this in this press release (and I lack the technical ingenuity at this point to figure out if this would work). But in the age of e.g. Google and Microsoft enabling mobile developers to create cross platform applications (see my event reports from Google Cloud Platform here and of Microsoft Build here) this is a step back for SAP developers, and a win for the Apple proprietary, ‘walled garden’ approach to building ecosystems. [Update May 9th 2016 - Apple correctly points out that if the 3rd party initiatives come to fruition, they would definitively make the 'walled garden' metaphor invalid. Agreed.]
 
As market leader in enterprise application software, SAP helps companies of all sizes and industries run better. From back office to boardroom, warehouse to storefront, desktop to mobile device – SAP empowers people and organizations to work together more efficiently and use business insight more effectively to stay ahead of the competition. SAP applications and services enable approximately 310,000 business and public sector customers to operate profitably, adapt continuously, and grow sustainably. For more information, visit www.sap.com. 
Apple revolutionized personal technology with the introduction of the Macintosh in 1984. Today, Apple leads the world in innovation with iPhone, iPad, Mac, Apple Watch and Apple TV. Apple’s four software platforms — iOS, OS X, watchOS and tvOS — provide seamless experiences across all Apple devices and empower people with breakthrough services including the App Store, Apple Music, Apple Pay and iCloud. Apple’s 100,000 employees are dedicated to making the best products on earth, and to leaving the world better than we found it.
MyPOV – No need to comment on the boilerplate closing paragraphs of both SAP and Apple.
 

Overall MyPOV

A good move by Apple and SAP to partner – the question really is what took both sides so long, almost two years longer than the Apple and IBM partnership. But then Swift was not around – so coming to it late may not be too bad for SAP. With SAP’s market share it makes sense for Apple to partner with the leading enterprise application vendor, but both will have to work hard to get a level of differentiation that justifies the premium prices that Apple hardware commands. And that’s a good hurdle, as premium hardware deserves premium software. The new Fiori SDK may well point in that direction – we will see how well the new joint applications will do and how many unique Apple ecosystem features they will embody ([May 9th 2016 - Factually corrected to official Apple product names;] Apple 3D Touch, Apple Touch ID, Apple Pay etc. come to mind).

Surprisingly, the announcement is void of details on the go to market side. The press release quotes no revenue share between the two vendors. Compared to the similar press release Apple did in July 2014 with IBM (see here), we don’t hear / see / read anything on the services / support side. No mention of marketing / sales either. Where will joint customers get their applications from? The Apple Store? Their own branded Apple Store? Their own branded SAP store? From the app developer? So there are some questions that Apple and SAP will have to answer soon. Also of note, Apple partnered as well with Cisco (see here) in late summer 2015, but we have not seen or heard much about progress on this partnership. That announcement was also weaker on go to market than the original ‘Apple comes to the enterprise’ announcement with IBM, so we will have to see how the partnership pans out going forward.

On the concern side - it looks like Apple and SAP may have missed what Facebook, Google and Microsoft recently announced in regard to chat and conversational bots coming to your smartphone. So building 'another' 100 (why is it always 100?) mobile apps may miss the boat on where mobile usage is going. An Apple / SAP partnership bringing S/4HANA (and other systems) capabilities to iMessage would have been in sync with the announcement wave of spring 2016. But only what has not happened can still happen, and Apple / SAP may still have an arrow in the quiver, with Sapphire looming. And fair enough, the whole conversational / chat bot space - Conversation as a Platform, as Microsoft calls it - is in its infancy... but then - if it works - it would disrupt the whole apps ecosystem. Something that Apple surely does not want, something that SAP is more open to, as it needs to build mobile endpoints that are popular and expected by its customers.

What is good to see is the support of HCP, which becomes more and more strategic for SAP with every quarter. Understanding what value SAP can bring to the existing, pre S/4HANA applications will be important for customers as well as for the success of the partnership, as for now – despite ambitious plans for S/4HANA – the bulk of SAP users are and will remain for the foreseeable future on the pre S/4HANA SAP applications. And that’s where the partnership needs to play for the next years to yield the device sales that Apple is hoping to get from this partnership. I expect a lot of SAP customers sitting on 2 more device refresh cycles (assuming a 2 year mobile device refresh cycle) before they will move to S/4HANA en masse. We will be watching – we will likely learn more at Sapphire in Orlando in a few weeks.



 
More on SAP:
  • Progress Report - SAP SuccessFactors makes good progress - now needs appeal beyond SAP - read here
  • News Analysis - SAP HANA Vora now available... - A key milestone for SAP - read here
  • Event Report - SAP Ariba Live - Make Procurement Cool Again - read here
  • News Analysis - SAP SuccessFactors innovates in Performance Management with continuous feedback powered by 1 to 1s  - read here
  • Event Report - SAP SuccessFactors SuccessConnect - Good Progress sprinkled with innovative ideas and challenging the status quo - read here
  • News Analysis - WorkForce Software Announces Global Reseller Agreement with SAP - read here
  • First Take - SAP SuccessFactors SuccessConnect - Day #1 Keynote Top 3 Takeaways - read here
  • News Analysis - SAP SuccessFactors introduces Next Generation of HCM software - read here
  • News Analysis - SAP delivers next release of SAP HANA - SPS 10 - Ready for BigData and IoT - read here
  • Event Report - SAP Sapphire - Top 3 Positives and Concerns - read here
  • First Take - Bernd Leukert and Steve Singh Day #2 Keynote - read here
  • News Analysis - SAP and IBM join forces ... read here
  • First Take - SAP Sapphire Bill McDermott Day #1 Keynote - read here
  • In Depth - S/4HANA qualities as presented by Plattner - play for play - read here
  • First Take - SAP Cloud for Planning - the next spreadsheet killer is off to a good start - read here
  • Progress Report - SAP HCM makes progress and consolidates - a lot of moving parts - read here
  • First Take - SAP launches S/4HANA - The good, the challenge and the concern - read here
  • First Take - SAP's IoT strategy becomes clearer - read here
  • SAP appoints a CTO - some musings - read here
  • Event Report - SAP's SAPtd - (Finally) more talk on PaaS, good progress and aligning with IBM and Oracle - read here
  • News Analysis - SAP and IBM partner for cloud success - good news - read here
  • Market Move - SAP strikes again - this time it is Concur and the spend into spend management - read here
  • Event Report - SAP SuccessFactors picks up speed - but there remains work to be done - read here
  • First Take - SAP SuccessFactors SuccessConnect - Top 3 Takeaways Day 1 Keynote - read here.
  • Event Report - Sapphire - SAP finds its (unique) path to cloud - read here
  • What I would like SAP to address this Sapphire - read here
  • News Analysis - SAP becomes more about applications - again - read here
  • Market Move - SAP acquires Fieldglass - off to the contingent workforce - early move or reaction? Read here.
  • SAP's startup program keeps rolling – read here.
  • Why SAP acquired KXEN? Getting serious about Analytics – read here.
  • SAP streamlines organization further – the Danes are leaving – read here.
  • Reading between the lines… SAP Q2 Earnings – cloudy with potential structural changes – read here.
  • SAP wants to be a technology company, really – read here
  • Why SAP acquired hybris software – read here.
  • SAP gets serious about the cloud – organizationally – read here.
  • Taking stock – what SAP answered and what it didn’t answer this Sapphire [2013] – read here.
  • Act III & Final Day – A tale of two conference – Sapphire & SuiteWorld13 – read here.
  • The middle day – 2 keynotes and press releases – Sapphire & SuiteWorld – read here.
  • A tale of 2 keynotes and press releases – Sapphire & SuiteWorld – read here.
  • What I would like SAP to address this Sapphire – read here.
  • Why 3rd party maintenance is key to SAP’s and Oracle’s success – read here.
  • Why SAP acquired Camillion – read here.
  • Why SAP acquired SmartOps – read here.
  • Next in your mall – SAP and Oracle? Read here
 
And more about SAP technology:
  • Event Preview - SAP TechEd 2015 - read here
  • News Analysis - SAP Unveils New Cloud Platform Services and In-Memory Innovation on Hadoop to Accelerate Digital Transformation – A key milestone for SAP read here
  • HANA Cloud Platform - Revisited - Improvements ahead and turning into a real PaaS - read here
  • News Analysis - SAP commits to CloudFoundry and OpenSource - key steps - but what is the direction? - Read here.
  • News Analysis - SAP moves Ariba Spend Visibility to HANA - Interesting first step in a long journey - read here
  • Launch Report - When BW 7.4 meets HANA it is like 2 + 2 = 5 - but is 5 enough - read here
  • Event Report - BI 2014 and HANA 2014 takeaways - it is all about HANA and Lumira - but is that enough? Read here.
  • News Analysis – SAP slices and dices into more Cloud, and of course more HANA – read here.
  • SAP gets serious about open source and courts developers – about time – read here.
  • My top 3 takeaways from the SAP TechEd keynote – read here.
  • SAP discovers elasticity for HANA – kind of – read here.
  • Can HANA Cloud be elastic? Tough – read here.
  • SAP’s Cloud plans get more cloudy – read here.
  • HANA Enterprise Cloud helps SAP discover the cloud (benefits) – read here.
 
Find more coverage on the Constellation Research website here and check out my magazine on Flipboard and my YouTube channel here.
Innovation & Product-led Growth Next-Generation Customer Experience Tech Optimization New C-Suite Data to Decisions Digital Safety, Privacy & Cybersecurity Future of Work android SAP apple Google IBM SaaS PaaS IaaS Cloud Digital Transformation Disruptive Technology Enterprise IT Enterprise Acceleration Enterprise Software Next Gen Apps IoT Blockchain CRM ERP CCaaS UCaaS Collaboration Enterprise Service Chief Information Officer Chief Technology Officer Chief Information Security Officer Chief Data Officer

Qlik Extends Its Platform As Cloud Disruption Looms

Qlik Sense Enterprise 3.0 and the platform strategy dominated Qonnections 2016. Change lies ahead as analysis moves into the cloud.

Qlik Sense 3.0 is coming in June, Qlik Sense Cloud is ramping up for wider use, and the Qlik DataMarket is gaining more powerful data-connection and data-enrichment capabilities. These were among the notable announcements at Qlik’s May 2-4 Qonnections event in Orlando, Fla.

Qlik has good reason for confidence coming out of Qonnections. Customers I spoke to now understand and accept the company’s platform approach, seeing both QlikView and Qlik Sense as here-to-stay components of a larger ecosystem. Instead of facing a forced migration to Qlik Sense – the company’s newer, more visual and self-service-oriented product – customers now trust that they will continue to have options and plenty of time to evolve their deployments as they see fit. At the same time they seem to accept that the bulk of Qlik’s investment is in Qlik Sense, which will see three updates per year versus one update per year for QlikView.

Inside Qlik Qonnections 2016

Beyond the platform, Qlik made announcements about its investments in analytics, data and cloud:

Analytics: Qlik Sense Enterprise 3.0, coming in June, will deliver visual search and exploration capabilities designed to enable business users to find new insights without requiring analysts to edit and modify visualizations and data. The update adds smart features, including automatic detection and charting of temporal (time-related) and geographic data. Developers will get new integration and visualization APIs as well as a plugin for Visual Studio.

Data: Qlik announced the acquisition of partner Industrial CodeBox, a move that will turn that company’s QVSource product into a built-in tool for integrating cloud-based data sources such as SaaS applications and social networks with Qlik Sense and QlikView. QVSource has more than 40 pre-built connectors for popular Web-based sources including Twitter, Facebook, Microsoft Dynamics CRM, and SugarCRM. Qlik also announced Qlik DataMarket Financial Data Packages, due in June, offering licensable data on stocks, indices and corporate financials drawn from 35 exchanges around the globe.

Cloud: Qlik introduced Qlik Sense Cloud last year with a basic freemium service for personal use. Qlik Sense Cloud Plus, added in January and priced at $20 per user, per month, offers secure sharing of analyses among small workgroups. Qlik Sense Cloud Business, announced at Qonnections and due out in the second half of this year, will support secure sharing of analyses with internal and external groups. It’s aimed at departments of larger businesses and small and midsized enterprises and is offered through monthly and annual contracts.

MyPOV On Qlik’s Progress

I’m most impressed by Qlik’s data investments. A keynote demo of a new Visual Data Preparation tool due out with the June 3.0 release brought rousing applause from customers. The tool lets you upload and explore data sets, which are shown as visual bubbles on a palette. Related data sets automatically show up in close proximity, and joins are as simple as inspecting the data and pulling the bubbles together.

The new data-prep interface targets spreadsheet-savvy business users. It stops short of the munging, mashup and transformation capabilities offered by Qlik partners such as Alteryx, Informatica, Paxata, and Trifacta, Hjalmar Gislason, Qlik’s VP of data, told me. But I think Qlik is delivering the basics that are most in demand.

I also liked the Industrial CodeBox acquisition, a tuck-in deal that enhances the power of the budding Qlik DataMarket. CodeBox provides data-connection and data-enrichment options that will only see more use as data is increasingly born in the cloud and as digital businesses move toward blending and monetizing information through cloud data services.

What I’m less impressed with is Qlik’s methodical, bottom-up move to the cloud. Qlik was late to the game when it launched Qlik Sense Cloud last year, and the strategy is to move up from personal and workgroup use to departments and SMBs later this year. Qlik Sense Cloud Enterprise edition won’t show up until sometime in 2017. Qlik Sense Cloud runs on Amazon Web Services, but we’ve seen no detail on how it might access AWS data sources and cloud services.

Yes, we agree with Qlik’s argument that it’s going to be a hybrid world for a long time to come. But in our research and advisory work, Constellation Research sees demand for cloud-based analytical capabilities from companies large and small. In fact, we think digital disruption is bringing an era in which the majority of information deemed critical will be accessed externally rather than owned and managed on premises.

Amazon Web Services, Google and Microsoft are building out extensive portfolios of analytical capabilities complementing their massive public clouds. They know that getting the data into the cloud is just the first step. The next step is taking advantage of data scale and massive compute power to harness automation, machine learning and artificial intelligence capabilities that will gradually take some (though certainly not all) of the labor and complexity out of data management and data analysis. In the future (how near isn't quite clear), these sorts of features promise to transform our ideas of ease of use and change the BI and analytics battle from self-service to smart cloud services. With a cloudier approach and more partnering, Qlik could be leading the way.

Related Reading:
SAS Goes Cloud, But Will Customers Follow?
SAP Bets On Cloud For Analytics, BPC Optimized for S/4 HANA
Oracle Data Cloud: The Data-as-a-Service Differentiator
Qlik Unveils QlikView 12, Qlik Sense Cloud Roadmap

 


JDA hosts a great event…but what does the future hold?


I just came back from Nashville – well, actually Las Vegas, but I was in Nashville to start the week. The JDA Focus 2016 event was being held in the Music City. It brought together a large gathering of some of the top supply chain professionals from around the globe. Per usual, JDA put on a good show, at least on the first day, since that was what I was able to attend! But even being in Nashville for less than 24 hours, I took away some observations from the event and JDA:

  • Talking a good game – The main stage presentations by CEO Bal Dail and Chief Revenue Officer Razat Gaurav were in stark contrast to those of former administrations. How? Much more focused on the disruptors facing the market and with a keen eye on the future. Bal focused on the company embarking on a “big pivot” centered on how customers are impacting our businesses, while Razat hit on the major disruptors that face supply chains and our industries. More on both later. What was refreshing was a message from main stage that called out and hit on many of the trends and drivers that we are all facing. Both Bal and Razat also started giving the audience a glimpse into how JDA will address these shifts, whether it is the new retail.me offering, greater emphasis on JDA labs or creating a digital hub, all promising efforts to address their customers’ needs. Their willingness to address new disruptors head on, coupled with solutions that are poised to take on these changes, was refreshing to hear from this leadership team – not always what comes from main stage.
  • Facing disruptors and making the pivot – One of the big threads that we at Constellation Research have been working on with our customers was reflected on main stage in Nashville (as much as I would like to take credit for those ideas…alas I cannot). Razat hit on 5 big themes of disruption: mobile, IoT, social, cloud and big data. We speak at length about these disruptors; feel free to read our research, but the biggest underlying driver is the rise of the consumer. Many of these disruptors have empowered the consumer, giving the consumer a growing voice in the ecosystem. When it comes to the supply chain, whether you are B2B or B2C, the consumer has become the driver – your business must make this the center of its strategy. The same goes for the technology providers that are servicing these businesses. Bal and his team have a great challenge ahead as they look to pivot themselves to help their customers better address the consumers and the disruptors that have made chaos the new norm.
  • So where does JDA go from here? – JDA is painting a picture of awareness and willingness to pivot to meet its customers’ needs. Good. But what does the future hold for JDA? Over the past decade the company has absorbed Manugistics, i2 Technologies and RedPrairie. All were best of breed supply chain solution providers. JDA became, on paper, a supply chain powerhouse able to address a wide array of industry needs, ranging from process and discrete manufacturing to retail and logistics. Impressive. But the question remains – what does New Mountain Capital have in mind long term for this asset? Other supply chain players have been focusing their efforts on specific industries – players like Plex focused on manufacturing, or Aptos spun off from Epicor to focus on retail while Epicor concentrates on ERP. Can JDA continue to find success competing on all fronts? Or does it need to consider following a similar strategy as Epicor and break up the parts? Maybe the pieces competing on their own are more powerful than the whole? I do not believe this is the only direction JDA can take, but at some point New Mountain Capital will want to reap the rewards from their investment. How that happens will be interesting to observe.

JDA remains a major player in the field of supply chain. The leadership and culture have an aggressive level of expectations of themselves and the business – it is now up to the solutions and software to catch up. They are clearly aware and in tune with the disruptors that are impacting all businesses. The next few months will be crucial for the JDA leadership team to implement their pivot strategy and find success.

Disclosure – I worked at i2 Technologies from 2004 to 2009, i2 Technologies was acquired by JDA in 2009. 


IoT and Network Connectivity Management, or AoT and Data Flow Management on the Network?

Networking has come a long way, starting in the early 80s with Ethernet and running through to today’s sophisticated and ubiquitous Internet offerings; wired to wireless, fixed to mobile, with all the tools and methods that a vast pool of experience in ‘networking’ has built up. Against this background it seems strange to focus on the networking challenge, but are IoT devices the same as IT devices from a networking perspective?

IT was, and is, about connecting computers into computer systems around significant data exchanges; or, in the case of the Internet, usually human-driven Web navigation. Whilst the popular expectation is that the devices of the ‘Internet of Things’ will connect and interact in a similar manner on a network, that may not be true.

To make the point, consider the example of Sensity, an IoT company that uses LED lights and lighting as IoT sensors. Quote: ‘…has been designed to instantly convert any lighting manufacturer’s LED fixtures into an IP-enabled sensory node in a light sensory network that provides both the lighting control and cloud-based IoT services via a standard NEMA socket.’ Examples of Sensity deployments range from monitoring car parking bays to counting people moving around retail stores and much more.

It’s an example of IoT; it’s network connected and has a link to IP, but it’s not networked in a manner that any IT networking professional will find easy to relate to (take this link for details of NEMA connectivity). Sensity is an impressively innovative IoT solution with huge business value in any number of ways, but it’s also an example of the all too common IoT problem of ‘new’ networking solutions.

Just at the time when IT networks and protocols have become relatively standardized to support ubiquitous network connectivity, the arrival of IoT networked devices disrupts this with a whole range of ‘differences’. The innovative IoT solutions are usually complete packages from sensors to graphical user displays, including the network topology, to bypass these differences. However there is generally a requirement to use an Enterprise IT network as a backbone connection to the Cloud where the service element will be hosted. (N.B. using conventional TCP/IP protocols for IoT data can often mean the packet header overhead is bigger than the data being transported.)
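To make that N.B. concrete, here is a minimal illustration (my own sketch; the 4-byte reading is an assumed example, and the header sizes are the usual protocol minimums without options):

# Minimal TCP/IP overhead illustration for a tiny IoT payload.
ETHERNET_FRAMING = 14 + 4    # Ethernet header + frame check sequence, bytes
IPV4_HEADER      = 20        # minimum IPv4 header, no options
TCP_HEADER       = 20        # minimum TCP header, no options
SENSOR_READING   = 4         # assumed example: one 32-bit sensor value

overhead = ETHERNET_FRAMING + IPV4_HEADER + TCP_HEADER
frame    = overhead + SENSOR_READING

print(f"Headers: {overhead} bytes, payload: {SENSOR_READING} bytes")
print(f"Share of the frame that is actual sensor data: {SENSOR_READING / frame:.1%}")
# Headers: 58 bytes, payload: 4 bytes
# Share of the frame that is actual sensor data: 6.5%

In other words, well over 90% of every frame carrying a small reading can be protocol plumbing rather than data.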

The implication inherent in the name, and the claims of ubiquitous connectivity, is that today’s IT networks providing Internet connectivity will support IoT; in reality, for many deployments that’s only likely to be true of the backbone element.

In the telecom industry the ‘final mile’ challenge, made up of any number of different connection formats, was at least limited in the diversity of its network content to voice and bell signaling. Currently the IoT final mile diversity exists in just about any, and every, facet of ‘networking’, from the physical media layer upward.

In small scale and pilot sensor networks this level of diversity may not be noticeable for the impact its traffic will have on the Enterprise IT networks used for backbone connectivity; some pilots may even take place in totally closed, special-purpose IoT sensor networks. With time and scale the convenience of a separate, closed local IoT network will vanish, and IoT network traffic levels will be felt on the Enterprise IT network.

At the MIT Technology Review Digital Summit, Todd Greene, the CEO of PubNub, made the observation that a new type of network for connecting IoT embedded devices is required. His argument was based on both the scale and latency implications resulting from the complex infrastructure of the real ‘internet’. Quote: ‘Unfortunately the Internet isn’t just one network, and considerations include heterogeneous networks, including cell towers, slow connectivity, fast connectivity, proxy servers, and firewalls; all things that can disrupt connectivity.’

Todd Greene could be expected to make this point as his company, PubNub, has, since 2009, been actively promoting their alternative interconnection network for exactly the kind of low latency, small data packet traffic that makes up IoT sensing. PubNub can certainly argue they have something right in their assessment given the size and number of messages they are now carrying for a collection of well-known companies. So is it really the answer to deploy a new parallel network infrastructure, as his message in respect of IoT and traditional IT networks would suggest?

For the majority of enterprises, integrating with and using IoT will need to be part of their existing Enterprise IT network as, unless they are in a specialized, sensing-process-based industry such as oil refining, there is unlikely to be an economic argument for shifting to a specialized alternative IoT sensing network.

The obvious consideration for IoT traffic impacts on the Enterprise IT network will be the sheer amount of traffic, but it’s the size of network packets and the frequency of device transmissions that introduce some basic issues that have to be addressed. The two starting points for any Network Professional are traffic impacts of volume, timing and latency, and security management. Just how these concerns are addressed will depend on choosing whether device connection management, or data stream/flow management, is the better primary choice.

Summarizing the number of variables to be considered into just these two headings may seem at odds with the huge amount written on the issues of IoT networks. The concern for those facing the reality of supporting IoT sensor deployments on Enterprise IT networks is to find the approach that addresses their particular requirement, and not become lost in a sea of individual issues.

Are the issues unimportant? Of course not, but as with everything to do with IoT it’s all about outcomes! In this case that means choosing tools and techniques to manage device connectivity, or considering the alternative of managing consolidated data streams. Is it possible to have such a neat separation? In the long term no, the two are equally important and required, but in the short term when tactical success matters it helps to understand what the dominant issue is as a priority.

As in all early-stage markets, much of the information available comes from product vendors, so the presentation of the ‘facts’ will by necessity be concerned with the product. ‘Issues based selling’ is a well-known technique, so it pays to establish an overall approach for considering products within an objective context.

Googling the term ‘IoT Connection Management’ will provide papers from Cisco, Huawei and others, which define how to control and manage the huge number of different types and ways that IoT devices are connected. Connection management is a necessity when faced with the diversity of large numbers and types of devices that an enterprise might have in use, across both cellular and traditional enterprise IP-addressed connections.

Naturally, when first starting to consider IoT pilots and small-scale deployments, extending traditional IT network connection management to include IoT devices is seen as the starting point. At this stage the impact of managing the service level availability of new connections exceeds the potential impact of data stream management.

However, as IoT sensing moves into production systems, the number of IoT sensors and their concentration rise dramatically. Deloitte’s new headquarters building in Amsterdam has 22,000 sensor points to support its abilities as a ‘Smart’ Building. Consider the network impact these thousands of simple IoT sensing connections make in communicating only a few bytes of data each, but doing so frequently. In aggregate that’s a lot of traffic, but the individual IoT sensor connections may have little capability to be managed beyond checking for their presence (think of the example of Sensity). Counting a sensor’s data transmissions can check its presence just as well.

Deloitte’s Smart Building ‘The Edge’ is expected to produce more than 3 petabytes of data a year from its 22,000 IoT sensors; at that level of mature IoT deployment the challenge has to move to Data Streaming management. This is a specific new functionality arising from the technologies associated with IoT and a new challenge for IT Network Managers.
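A quick back-of-the-envelope calculation, using only the figures quoted above and a 365-day year (my own illustrative arithmetic, not from Deloitte), shows why this turns into a streaming problem: individually tiny flows add up to a substantial sustained load on the backbone.

# Rough traffic arithmetic from the figures quoted above: 3 PB/year across 22,000 sensor points.
PETABYTE         = 10 ** 15                      # bytes
DATA_PER_YEAR    = 3 * PETABYTE                  # quoted annual data volume for 'The Edge'
SENSOR_POINTS    = 22_000                        # quoted sensor count
SECONDS_PER_YEAR = 365 * 24 * 3600

aggregate_rate  = DATA_PER_YEAR / SECONDS_PER_YEAR      # bytes/second for the whole building
per_sensor_rate = aggregate_rate / SENSOR_POINTS        # bytes/second for a single sensor point

print(f"Aggregate : {aggregate_rate / 1e6:.0f} MB/s (~{aggregate_rate * 8 / 1e9:.2f} Gbit/s sustained)")
print(f"Per sensor: {per_sensor_rate:.0f} bytes/s")
# Aggregate : 95 MB/s (~0.76 Gbit/s sustained)
# Per sensor: 4324 bytes/s

Each sensor point averages only a few kilobytes per second, yet the building as a whole presents the backbone with the better part of a gigabit per second, around the clock.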

The example of The Edge Smart Building graphically makes the point that the immediate connectivity onboarding management swiftly becomes a massive traffic management challenge. IoT devices introduce a volume of connections x minuscule amounts of data x frequency of sending that, taken together, impose a very different traffic profile for management purposes. Even if the answer is to segment IoT sensors onto a different network, there is still likely to be a Data Streaming management challenge.

The business value from IoT sensors is in either ‘real-time’ Smart Services or in Analysis of Things, AoT, and for both that means interconnection with the Enterprise IT network to access cloud based resources. At this connection point, even if at no other, IoT Data Streaming management will be a necessity.

Googling Data Streaming or Data Flow management will produce a lot of results; as usual most are written around products rather than in the context of the issues to be considered. As data is what provides the business value, the whole question of the creation of data by sensors, through to the consolidation of data in a form suitable for consumption by Smart Services and Analytics processing, does need to be addressed. But that’s a further topic in its own right, and here consideration rests purely on the network impacts.

In the course of little more than a year, the growing experience gained from deploying IoT based sensor systems has shifted the focus from the IoT sensors themselves to the aspects of creating, managing and using the IoT data. The whole topic of Data Streaming and Flow management, together with the new forms of Analysis of Things, AoT, has expanded to make Data Architecture a pressing consideration, somewhat overtaking network connection architecture.

At the beginning of this blog Sensity, with their clever LED lighting IoT solution, was used to illustrate high levels of IoT connections, where the value lies in the aggregation of data rather than the management of each individual connection.  Should, or even could, this be managed via individual Network Connections, or is it one Gateway connection with management of the resulting Data Stream?

However, network connections come back into the picture when considering wireless-connected deployments outside the Enterprise IT network. Connecting cars, goods in transit, high value large items in storage yards etc. requires IoT deployments that rely on 3/4G, SigFox, LTE, even sometimes WiFi. Here connectivity and service management become the priority.

These forms of public wireless services are, for the foreseeable future, going to be subject to ongoing change, with new specifications and capabilities, and even service providers’ business models, changing. Recently the LoRa Alliance claimed to be the fastest growing standards alliance in IoT, and of course there is the arrival of 5G on the horizon, all of which require connection level changes to be managed.

In public wireless networks the service provider usually provides the IoT network connection management together with business / commercial management as an important competitive differentiator. That means the device connection point itself is often overlooked, with focus only on the Enterprise IT connection point. Flexibility in IoT device connections is key to avoiding service provider lock-in that prevents changes to better commercial deals.

The barrier to change on the IoT devices themselves is not inconsiderable: if a device was originally intended for direct WiFi connectivity, there will be an inbuilt full TCP/IP network stack with protocol overhead exceeding the data payloads. This level of message size is likely to exceed the capacity of specialist IoT networks such as SigFox. There are similar problems associated with each network type, i.e. 3/4G, LTE, etc., that render changing from one to another somewhere between an expensive redevelopment of the communication stack and effectively impossible.
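To put rough numbers on the SigFox point (my own sketch; it assumes SigFox’s commonly cited 12-byte uplink payload limit and minimum IPv4/UDP header sizes, with a 4-byte reading as an example):

# Why an IP-framed reading cannot simply be pushed over a constrained IoT network.
SIGFOX_UPLINK_LIMIT = 12     # commonly cited maximum uplink payload, bytes
IPV4_HEADER         = 20     # minimum IPv4 header, no options
UDP_HEADER          = 8
SENSOR_READING      = 4      # assumed example value

ip_framed_message = IPV4_HEADER + UDP_HEADER + SENSOR_READING
print(f"IP-framed reading: {ip_framed_message} bytes vs SigFox limit: {SIGFOX_UPLINK_LIMIT} bytes")
print("Fits in one SigFox message?", ip_framed_message <= SIGFOX_UPLINK_LIMIT)
# IP-framed reading: 32 bytes vs SigFox limit: 12 bytes
# Fits in one SigFox message? False

The headers alone exceed the frame, which is why moving such a device to a network like SigFox means replacing the communication stack rather than reusing it.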

This is a not inconsiderable issue to face up to before making a choice of network connectivity types for an IoT deployment using public wireless networking services. For low cost, simple sensors the answer will be to ‘rip and replace’, but investment in complex ‘Smart Sensors’ needs to address future proofing. As with most areas where IoT stretches the capabilities of existing technologies, start-ups are providing new answers.

An interesting example is Wivity, which claims to eliminate the complexity of public wireless connectivity with a “Build Once, Connect Everywhere” approach based on a hardware modem incorporated in the Smart Device design. The interchangeable Wivity modems accept the same HTTP calls from the IoT device no matter what network connection is being deployed, so providing an ongoing path to new, upgraded network types. Together with lightweight protocols and other edge based techniques, network connectivity is made simpler and more flexible. Wivity call for some rethinking on the telecoms market and IoT in their blog https://wivity.com/blog/IoT-is-a-Different-Animal

To summarize: deploying IoT pilots means considering and testing more than simple sensor connectivity to a GUI or analytics. IoT is a generational change in the type of technology and its business role, and understanding network and data connectivity needs careful investigation. IoT pilots make low enough demands on Enterprise IT networks that the impact of full-scale rollouts is easy to miss.

 

 

This is my last post for five weeks as I will be taking a sabbatical break, though continuing to follow the technology market as usual.


Blockchain: Almost Everything You Read Is Wrong

Almost everything you read about the blockchain is wrong. No new technology since the Internet itself has excited so many pundits, but blockchain just doesn’t do what most people seem to think it does. We’re all used to hype, and we can forgive genuine enthusiasm for shiny new technologies, but many of the claims being made for blockchain are just beyond the pale. It's not going to stamp out corruption in Africa; it's not going to crowdsource policing of the financial system; it's not going to give firefighters unlimited communication channels. So just what is it about blockchain?

The blockchain only does one thing (and it doesn’t even do that very well). It provides a way to verify the order in which entries are made to a ledger, without any centralized authority. In so doing, blockchain solves what security experts thought was an unsolvable problem – preventing the double spend of electronic cash without a central monetary authority. It’s an extraordinary solution, and it comes at an extraordinary price. A large proportion of the entire world’s computing resource has been put to work contributing to the consensus algorithm that continuously watches the state of the ledger. And it has to be so, in order to ward off brute force criminal attack.
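To make that ‘one thing’ concrete, here is a toy sketch in Python (my own illustration, not Bitcoin’s actual data structures): entries are hash-chained so their order is tamper-evident, and a brute-force nonce search stands in for the costly proof-of-work consensus.

# Toy hash-chained ledger with a trivial proof-of-work (illustrative only).
import hashlib
import json

DIFFICULTY = 4  # leading hex zeros required; real Bitcoin difficulty is vastly higher

def block_hash(block):
    # Hash the block's canonical JSON form (excluding any stored hash field).
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def mine(entries, prev_hash):
    # Brute-force a nonce until the hash meets the target; this wasted work is
    # what makes rewriting an agreed history expensive.
    nonce = 0
    while True:
        candidate = {"prev": prev_hash, "entries": entries, "nonce": nonce}
        digest = block_hash(candidate)
        if digest.startswith("0" * DIFFICULTY):
            candidate["hash"] = digest
            return candidate
        nonce += 1

# Each block commits to its predecessor, fixing the order of ledger entries.
chain = [mine(["genesis"], "0" * 64)]
chain.append(mine(["alice pays bob 1"], chain[-1]["hash"]))
chain.append(mine(["bob pays carol 1"], chain[-1]["hash"]))

# Tampering with an old entry breaks every later link unless all of it is re-mined.
chain[1]["entries"] = ["alice pays mallory 1"]
recomputed = block_hash({k: v for k, v in chain[1].items() if k != "hash"})
print("Chain still consistent?", chain[2]["prev"] == recomputed)   # False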

How did an extravagant and very technical solution to a very specific problem capture the imagination of so many? Perhaps it’s been so long since the early noughties’ tech wreck that we’ve lost our herd immunity to the viral idea that technology can beget trust. Perhaps, as Arthur C. Clarke said, any sufficiently advanced technology looks like magic. Perhaps because the crypto currency Bitcoin really does have characteristics that could disrupt banking (and all the world hates the banks) blockchain by extension is taken to be universally disruptive. Or perhaps blockchain has simply (but simplistically) legitimized the utopian dream of decentralized computing.

Blockchain is antiauthoritarian and ruthlessly “trust-free”. The blockchain algorithm is rooted in politics; it was expressly designed to work without needing to trust any entity or coalition. Anyone at all can join the blockchain community and be part of the revolution.

The point of the blockchain is to track every single Bitcoin movement, detecting and rejecting double spends. Yet the blockchain APIs also allow other auxiliary data to be written into Bitcoin transactions, and thus tracked. So the suggested applications for blockchain extend far beyond payments, to the management of almost any asset imaginable, from land titles and intellectual property, to precious stones and medical records.

From a design perspective, the most troubling aspect of most non-payments proposals for the blockchain is the failure to explain why it’s better than a regular database. Blockchain does offer enormous redundancy and tamper resistance, thanks to a copy of the ledger staying up-to-date on thousands of computers all around the world, but why is that so much better than a digitally signed database with a good backup?

Remember what blockchain was specifically designed to do: resolve the order of entries in the ledger, in a peer-to-peer mode, without an administrator. When it comes to all-round security, blockchain falls short. It’s neither necessary nor sufficient for any enterprise security application I’ve yet seen. For instance, there is no native encryption for confidentiality; neither is there any access control for reading transactions, or writing new ones. The security qualities of confidentiality, authentication and, above all, authorization, all need to be layered on top of the basic architecture. ‘So what’ you might think; aren’t all security systems layered? Well yes, but the important missing layers undo some of the core assumptions blockchain is founded on, and that’s bad for the security architecture. In particular, as mentioned, blockchain needs massive scale, but access control, “permissioned” chains, and the hybrid private chains and side chains (put forward to meld the freedom of blockchain to the structures of business) all compromise the system’s integrity and fraud resistance.

And then there’s the slippery notion of trust. By “trust”, cryptographers mean “out of band” or manual mechanisms, over and above the pure math and software, that deliver a security promise. Blockchain needs none of that - so long as you confine yourself to Bitcoin. Many carefree commentators like to say blockchain and Bitcoin are different things, yet the connection runs deeper than they know. Bitcoins are the only things that are actually “on” the blockchain. When people refer to putting land titles or diamonds “on the blockchain”, they’re using a short hand that belies blockchain’s limitations. To represent any physical thing in the ledger requires firstly a schema – a formal agreement about which symbols in the data structure correspond to what property in the real world – and secondly a process to bind the owner of that property to the special private key (known in the trade as a Bitcoin wallet) used to sign each ledger entry. Who does that binding? How exactly do diamond traders, land dealers, doctors and lawyers get their blockchain keys in the first place? How does the world know who’s who? These questions bring us back to the sorts of hierarchical authorities that blockchain was supposed to get rid of.
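A small sketch makes the binding gap visible (my own illustration; it uses the third-party python-ecdsa package as the signer, and the record layout and names are invented for the example):

# The math binds a signature to a key; nothing in the ledger binds the key to a person.
import json
from ecdsa import SigningKey, SECP256k1   # third-party package: pip install ecdsa

# A schema is just an out-of-band convention about what the fields mean.
title_record = {"asset": "land_title", "parcel_id": "LOT-42", "owner": "Alice"}

# The 'wallet' is a random private key with no intrinsic link to the real Alice.
alice_key = SigningKey.generate(curve=SECP256k1)
message = json.dumps(title_record, sort_keys=True).encode()
signature = alice_key.sign(message)

# Anyone can verify the signature against the public key...
assert alice_key.verifying_key.verify(signature, message)

# ...but trusting that this key belongs to Alice, and that 'LOT-42' denotes her
# parcel, requires exactly the kind of registration authority the design avoids.
print(alice_key.verifying_key.to_string().hex()[:16], "signed", title_record["parcel_id"])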

There is no utopia in blockchain. The truth is that when we fold real world management, permissions, authorities and trust, back on top of the blockchain, we undo the decentralization at the heart of the design. If we can’t get away from administrators then the idealistic peer-to-peer consensus algorithm of blockchain is academic, and simply too much to bear.

I’ve been studying blockchain for two years now. My latest in-depth report was recently published by Constellation Research.

 
 
Digital Safety, Privacy & Cybersecurity Chief Information Officer

Consumers Are The Biggest Disruptor in the Supply Chain

We are in the middle of event season, which means lots of airplanes, hotel rooms and restaurant dinners (some better than others). During the past few weeks I have flown to all the event hot spots – Las Vegas, San Francisco, Detroit, Nashville, Chicago, New York, San Jose, Miami, Washington DC to name a few. I have also attended a wide range of events, from the likes of Infosys, JDA, Plex, Demandware, SAP, Oracle, Epicor etc. There has been one thread that is common – the rise of the consumer. Now this is nothing new to us here at Constellation Research. We have been touting the rise of the consumer in the commercial ecosystem (B2B and B2C) as the biggest disruptor to date. It is good to hear that the solution providers are recognizing this shift as well.

So why is the consumer gaining in strength?

They have the voice because of social. A growing number of retailers are making sure they do better social listening to gauge how their customers view them and what kinds of features or services they might be interested in. Companies like Newell Rubbermaid or Best Buy have done a lot of work to keep an ear to the social sounding boards.

Consumers have the reach via mobile. As Demandware pointed out at their show, mobile is king. True mobile – meaning our phones, not our tablets – is what sits at the top of the food chain when it comes to customer interactions and touch points. We all know the statistics about how often we check our phones and the fact they are with us almost the entirety of our waking hours. Who could have imagined that what Marty Cooper did in 1983 would give us such reach when it comes to the relationship consumers have with the commerce ecosystem.

The internet provides consumers with virtually unlimited choices. Before we saw the rise of the world wide web and subsequently eCommerce, our choices were often tethered to the stores that were within a reasonable physical range or to whatever inventory could be displayed in a catalog. All this changed with the rise of the internet. Suddenly, if you were a purveyor of fine wine in the Rhone valley, you could attract buyers from Hong Kong to Pittsburgh. Regardless of your size, through a few clicks of a mouse your products were accessible by anyone with a browser and a dial up modem! Consumers now had access to a treasure trove of products.

Finally the consumers’ expectations have been set by the likes of Amazon. Not only can consumers access a wide swath of products through the eCommerce giant, but they also expect perks such as free shipping, returns, access to long tail products, to name a few.

All this taken together is why consumers are becoming, if not already, the biggest disruptors within the commerce supply chain. Based on what I am hearing this event season, the vendors and service providers agree with this assertion. The challenge will be how to best offer the solutions and technologies that can allow participants in the commerce supply chain to meet their consumers’ needs. These solution providers must keep in mind their customers’ customers as they design and offer new solutions. How can they empower their customers to better meet the growing expectations and needs of the end customer? No small challenge indeed. As this crazy event season has demonstrated, at least most if not all of the vendors are reading off the same music sheet.


#CMTV Speaks to R "Ray" Wang at CloudExpo


R "Ray" Wang discusses Dominating Digital Disruption at CloudExpo. 

Watch the video: https://www.youtube.com/embed/qgl34x8fq-M

Infosys Confluence 2016 - The Future is Software + People

We had the opportunity to attend Infosys’ Confluence event in San Francisco, held from April 27th to 29th at the Hilton Union Square. The conference was well attended, with over 1500 participants coming from customers, prospects and the ecosystem, a surge of 50+% over last year.
 
Have a look at my top three takeaways of the event here:

 
 
 
No time to watch – here is the 1-2 slide condensation:
 

Want to read on? Here you go: 

Always tough to pick the takeaways – but here are my Top 3:

ZeroDistance – Bringing people closer to technology, and positioning Infosys as the enabler for zero distance between people is a worthy goal and was well received. The other two dimensions to ZeroDistance are the end users and the code of the solutions that are to be built. Amongst the common drivers (technology, education), ‘extreme disintermediation’ was the one that stuck out for me. I think Infosys articulated that well, but did not mention the elephant in the room when it comes to disintermediation, DLT aka Blockchain.

Mana – Infosys launched a new offering, Mana, for now squarely focused on improving its internal processes, while drinking its own champagne. The champagne brand is IIP – aka Infosys Information Platform – (and with that AiKiDo) that the provider uses to improve efficiency (doing things right) and effectiveness (doing the right thing) for its largest employee population group – L3 support consultants. Infosys sees Mana as a knowledge based AI platform; for now it’s a great internal showcase that looks at the digital exhaust of IT support work and then comes up with faster, better, more automated resolutions.

Product Progress – The ‘Platform’ family (IIP, IAP, and IKP) is making good progress, with currently 220 engagements and 17 live customers. The ATP statistics work is being done with IIP and it was showcased widely and proudly at Confluence. On the Edge family side (TradeEdge, CreditEdge, ProcureEdge and AssistEdge) Infosys is doing well, too – with now over 60 customers live and doubled revenue.


Also please take a look at the video that colleagues Alan Lepofsky, Doug Henschen and I recorded after Day #1 of Confluence - here.

MyPOV

A good event for Infosys, which is showing how it works both at regaining momentum and at changing the service provider landscape by building more IP and products, all the way to Sikka’s point that the future of services is the combination of software and people. The good news is that Infosys now has the chops to provide it and has created a great internal showcase with Mana. It is clear the executive team has thought this through, as this move lowers the cost of service overall, makes people more productive and enables them to serve more accounts.

On the concern side, Infosys is in the middle of a transformation that it needs to get through and come out of stronger at the end. To become more agile the provider is flattening the organization, a rather unique re-organizational process for a service provider. Changing a hierarchical organization to a flatter one is never trivial. But to be fair – there is no alternative to this transformation given the course Sikka has set for Infosys.

Overall a good Confluence for Infosys, which is changing, slowly but determinedly, from a ‘people only’ to a ‘software and people’ model. We will be watching.

Want to learn more? Check out the Storify collection below.

More on Infosys:
 
  • Progress Report - Infosys Analyst Meeting - Can you transform customers while you transform yourself? Looks like it - read here

Find more coverage on the Constellation Research website here and check out my magazine on Flipboard and my YouTube channel here.

 

IBM Design - It's More Than Just Drawing Pretty Pixels

On April 13th I traveled to the headquarters of IBM Design in Austin, Texas. My goal for the day was to learn how this new (well, two year old) division of the company is impacting product design and customer satisfaction. Below is an approximately 10 minute long video where I recap the key things I learned. If you don't have time to watch, here's the main thing you need to know:

IBM Design is about a lot more than just making products look nice. The (1000+ person) team's mission is to apply "design thinking" (the discipline of using creative problem solving to find solutions) to almost everything IBM does. It's not just about products (like Verse, Watson Analytics, BlueMix) but also about changing the processes IBM follows for everything from feature requests to on-boarding new employees and performance reviews. Following this formula: People + Practices + Places = Outcomes, IBM Design is changing the culture of IBM as much as it is changing the products.

As proof of IBM's long term commitment to design, they have recently announced a new title, Distinguished Designer. Those of you familiar with careers at IBM will know that Distinguished Engineer is a huge honour for developers. This new design-centric honour is intended to carry similar significance in the industry. The first three people to receive this new distinction are Charlie Hill, Adam Cutler and Doug Powell. I worked with Charlie at Lotus for many years, and have had several meetings with both Adam and Doug. Doug was actually my instructor when I went through IBM's Design Thinking workshop last year. Congratulations to all three on this well deserved reward.

 
