Results

Qlik Extends Its Platform As Cloud Disruption Looms

Qlik Sense Enterprise 3.0 and the platform strategy dominated Qonnections 2016. Change lies ahead as analysis moves into the cloud.

Qlik Sense 3.0 is coming in June, Qlik Sense Cloud is ramping up for wider use, and the Qlik DataMarket is gaining more powerful data-connection and data-enrichment capabilities. These were among the notable announcements at Qlik’s May 2-4 Qonnections event in Orlando, Fla.

Qlik has good reason for confidence coming out of Qonnections. Customers I spoke to now understand and accept the company’s platform approach, seeing both QlikView and Qlik Sense as here-to-stay components of a larger ecosystem. Instead of facing a forced migration to Qlik Sense – the company’s newer, more visual and self-service-oriented product – customers now trust that they will continue to have options and plenty of time to evolve their deployments as they see fit. At the same time, they seem to accept that the bulk of Qlik’s investment is in Qlik Sense, which will see three updates per year versus one per year for QlikView.

Inside Qlik Qonnections 2016

Beyond the platform, Qlik made announcements about its investments in analytics, data and cloud:

Analytics: Qlik Sense Enterprise 3.0, coming in June, will deliver visual search and exploration capabilities designed to enable business users to find new insights without requiring analysts to edit and modify visualizations and data. The update adds smart features, including automatic detection and charting of temporal (time-related) and geographic data. Developers will get new integration and visualization APIs as well as a plugin for Visual Studio.

Data: Qlik announced the acquisition of partner Industrial CodeBox, a move that will turn that company’s QVSource product into a built-in tool for integrating cloud-based data sources such as SaaS applications and social networks with Qlik Sense and QlikView. QVSource has more than 40 pre-built connectors for popular Web-based sources including Twitter, Facebook, Microsoft Dynamics CRM, and SugarCRM. Qlik also announced Qlik DataMarket Financial Data Packages, due in June, offering licensable data on stocks, indices and corporate financials drawn from 35 exchanges around the globe.

Cloud: Qlik introduced Qlik Sense Cloud last year with a basic freemium service for personal use. Qlik Sense Cloud Plus, added in January and priced at $20 per user, per month, offers secure sharing of analyses among small workgroups. Qlik Sense Cloud Business, announced at Qonnections and due out in the second half of this year, will support secure sharing of analyses with internal and external groups. It’s aimed at departments of larger businesses and small and midsized enterprises and is offered through monthly and annual contracts.

MyPOV On Qlik’s Progress

I’m most impressed by Qlik’s data investments. A keynote demo of a new Visual Data Preparation tool due out with the June 3.0 release brought rousing applause from customers. The tool lets you upload and explore data sets, which are shown as visual bubbles on a palette. Related data sets automatically show up in close proximity, and joins are as simple as inspecting the data and pulling the bubbles together.

The new data-prep interface targets spreadsheet-savvy business users. It stops short of the munging, mashup and transformation capabilities offered by Qlik partners such as Alteryx, Informatica, Paxata, and Trifacta, Hjalmar Gislason, Qlik’s VP of data, told me. But I think Qlik is delivering the basics that are most in demand.

I also liked the Industrial CodeBox acquisition, a tuck-in deal that enhances the power of the budding Qlik DataMarket. CodeBox provides data-connection and data-enrichment options that will only see more use as data is increasingly born in the cloud and as digital businesses move toward blending and monetizing information through cloud data services.

What I’m less impressed with is Qlik’s methodical, bottom-up move to the cloud. Qlik was late to the game when it launched Qlik Sense Cloud last year, and the strategy is to move up from personal and workgroup use to departments and SMBs later this year. Qlik Sense Cloud Enterprise edition won’t show up until sometime in 2017. Qlik Sense Cloud runs on Amazon Web Services, but we’ve seen no detail on how it might access AWS data sources and cloud services.

Yes, we agree with Qlik’s argument that it’s going to be a hybrid world for a long time to come. But in our research and advisory work, Constellation Research sees demand for cloud-based analytical capabilities from companies large and small. In fact, we think digital disruption is bringing an era in which the majority of information deemed critical will be accessed externally rather than owned and managed on premises.

Amazon Web Services, Google and Microsoft are building out extensive portfolios of analytical capabilities complementing their massive public clouds. They know that getting the data into the cloud is just the first step. The next step is taking advantage of data scale and massive compute power to harness automation, machine learning and artificial intelligence capabilities that will gradually take some (though certainly not all) of the labor and complexity out of data management and data analysis. In the future (how near isn't quite clear), these sorts of features promise to transform our ideas of ease of use and change the BI and analytics battle from self-service to smart cloud services. With a cloudier approach and more partnering, Qlik could be leading the way.

Related Reading:
SAS Goes Cloud, But Will Customers Follow?
SAP Bets On Cloud For Analytics, BPC Optimized for S/4 HANA
Oracle Data Cloud: The Data-as-a-Service Differentiator
Qlik Unveils QlikView 12, Qlik Sense Cloud Roadmap

 


JDA hosts a great event…but what does the future hold?

I just came back from Nashville – well, actually Las Vegas, but I was in Nashville to start the week. The JDA Focus 2016 event was held in the Music City, bringing together a large gathering of some of the top supply chain professionals from around the globe. Per usual, JDA put on a good show, at least on the first day, which was all I was able to attend! But even being in Nashville for less than 24 hours, I took away some observations from the event and JDA:

  • Talking a good game – The main-stage presentations by CEO Bal Dail and Chief Revenue Officer Razat Gaurav were in stark contrast to those of the former administration. How? They were much more focused on the disruptors facing the market, with a keen eye on the future. Bal focused on the company embarking on a “big pivot” around how customers are impacting our businesses, while Razat hit on the major disruptors that face supply chains and our industries. More on both later. What was refreshing was a message from the main stage that called out and hit on many of the trends and drivers we are all facing. Both Bal and Razat also started giving the audience a glimpse into how JDA will address these shifts, whether through the new retail.me offering, greater emphasis on JDA Labs or the creation of a digital hub – all promising efforts to address their customers’ needs. Their willingness to address new disruptors head on, coupled with solutions poised to take on these changes, was refreshing to hear from this leadership team; it is not always what comes from the main stage.
  • Facing disruptors and making the pivot – One of the big threads that we at Constellation Research have been working on with our customers was reflected on the main stage in Nashville (as much as I would like to take credit for those ideas… alas, I cannot). Razat hit on five big themes of disruption: mobile, IoT, social, cloud and big data. We speak at length about these disruptors – feel free to read our research – but the biggest underlying driver is the rise of the consumer. Many of these disruptors have empowered the consumer, giving the consumer a growing voice in the ecosystem. When it comes to the supply chain, whether you are B2B or B2C, the consumer has become the driver – your business must make this the center of its strategy. The same goes for the technology providers servicing these businesses. Bal and his team have a great challenge ahead as they look to pivot to help their customers better address the consumers and the disruptors that have made chaos the new norm.
  • So where does JDA go from here? – JDA is painting a picture of awareness and willingness to pivot to meet its customers’ needs. Good. But what does the future hold for JDA? Over the past decade the company has absorbed Manugistics, i2 Technologies and RedPrairie, all best-of-breed supply chain solution providers. JDA became, on paper, a supply chain powerhouse able to address a wide array of industry needs, ranging from process and discrete manufacturing to retail and logistics. Impressive. But the question remains – what does New Mountain Capital have in mind long term for this asset? Other supply chain players have been focusing their efforts on specific industries: Plex on manufacturing, and Aptos spun off from Epicor to focus on retail while Epicor concentrates on ERP. Can JDA continue to find success competing on all fronts? Or does it need to consider following a similar strategy to Epicor and break up the parts? Maybe the pieces competing on their own are more powerful than the whole? I do not believe this is the only direction JDA can take, but at some point New Mountain Capital will want to reap the rewards of its investment. How that happens will be interesting to observe.

JDA remains a major player in the field of supply chain. The leadership and culture have an aggressive level of expectations of themselves and the business – it is now up to the solutions and software to catch up. They are clearly aware and in tune with the disruptors that are impacting all businesses. The next few months will be crucial for the JDA leadership team to implement their pivot strategy and find success.

Disclosure – I worked at i2 Technologies from 2004 to 2009, i2 Technologies was acquired by JDA in 2009. 


IoT and Network Connectivity Management, or AoT and Data Flow Management on the Network?

Networking stretches from Ethernet in the early 80s through to today’s sophisticated and ubiquitous Internet offerings – wired to wireless, fixed to mobile – with all the tools and methods that a vast pool of experience in ‘networking’ has built up. Against this background it seems strange to focus on the networking challenge, but are IoT devices the same as IT devices from a networking perspective?

IT was, and is, about connecting computers into computer systems around significant data exchanges – or, in the case of the Internet, usually human-driven Web navigation. Whilst the popular expectation is that the devices of the ‘Internet of Things’ will connect and interact in a similar manner on a network, that may not be true.

To make the point, consider the example of Sensity, an IoT company that uses LED lights and lighting as IoT sensors. To quote: the technology ‘…has been designed to instantly convert any lighting manufacturer’s LED fixtures into IP-enabled sensory nodes in a light sensory network that provides both the lighting control and cloud-based IoT services via a standard NEMA socket’. Examples of Sensity deployments range from monitoring car parking bays to counting people moving around retail stores and much more.

It’s an example of IoT: it’s network connected and has a link to IP, but it’s not networked in a manner that any IT networking professional will find easy to relate to. Sensity is an impressively innovative IoT solution with huge business value in any number of ways, but it’s also an example of the all-too-common IoT problem of ‘new’ networking solutions.

Just at the time when IT networks and protocols have become relatively standardized to support ubiquitous network connectivity, the arrival of IoT networked devices disrupts this with a whole range of ‘differences’. The innovative IoT solutions are usually complete packages, from sensors to graphical user displays, including the network topology, to bypass these differences. However, there is generally a requirement to use an Enterprise IT network as a backbone connection to the Cloud, where the service element will be hosted. (N.B. using conventional TCP/IP protocols for IoT data can often mean the packet headers are bigger than the data payload being transported.)
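
That header arithmetic is easy to sketch. The figures below are textbook minimums (no TCP options, no TLS), and real traffic adds handshakes, ACKs and retransmissions on top, so treat this as a rough illustration rather than a measurement:

```python
# Textbook minimum header sizes, in bytes (no options, no TLS); real traffic
# adds connection setup, ACKs and retransmissions on top of this.
ETHERNET = 14 + 4   # Ethernet II header + frame check sequence
IPV4 = 20           # IPv4 header without options
TCP = 20            # TCP header without options
payload = 8         # a small sensor reading: ID + timestamp + value

overhead = ETHERNET + IPV4 + TCP
print(f"{overhead} bytes of protocol overhead to move {payload} bytes of data "
      f"({overhead / payload:.1f}x the payload)")
```

Even under these generous assumptions, the protocol machinery outweighs the sensor data several times over – exactly the mismatch specialist low-power IoT networks are designed to avoid.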

The implication inherent in the name, and the claims of ubiquitous connectivity, is that today’s IT networks providing Internet connectivity will support IoT; in reality, for many deployments that’s only likely to be true of the backbone element.

In the Telecom industry the ‘Final Mile’ challenge was made up of any number of different connection formats, but its diversity of network content was at least limited to voice and bell signaling. The IoT Final Mile’s diversity currently exists in just about any, and every, facet of ‘networking’, from the physical media layer upward.

In small-scale and pilot sensor networks this level of diversity may not be noticeable, nor will the impact of its traffic on the Enterprise IT networks used for backbone connectivity; some pilots may even take place in totally closed, special-purpose IoT sensor networks. With time and scale the convenient separate local closed IoT network will soon vanish, and IoT network traffic levels will be felt on the Enterprise IT network.

At the MIT Technology Review Digital Summit, Todd Greene, the CEO of PubNub, made the observation that a new type of network for connecting IoT embedded devices is required. His argument was based on both the scale and latency implications resulting from the complex infrastructure of the real ‘Internet’. To quote: ‘Unfortunately the Internet isn’t just one network, and considerations include heterogeneous networks, including cell towers, slow connectivity, fast connectivity, proxy servers, and firewalls; all things that can disrupt connectivity’.

Todd Greene could be expected to make this point, as his company, PubNub, has since 2009 been actively promoting its alternative interconnection network for exactly the kind of low-latency, small-data-packet traffic that makes up IoT sensing. PubNub can certainly argue it has got something right in its assessment, given the size and number of messages it is now carrying for a collection of well-known companies. But is the answer really to deploy a new parallel network infrastructure, as his message in respect of IoT and traditional IT networks would suggest?

For the majority of enterprises, integrating with and using IoT will need to be part of their existing Enterprise IT network: unless they are in a specialized, sensing-process-based interconnected industry such as oil refining, there is unlikely to be an economic argument for shifting to a specialized alternative IoT sensing network.

The obvious consideration of IoT traffic impact on the Enterprise IT network is the sheer amount of traffic, but it’s the size of network packets and the frequency of device transmissions that introduce some basic issues that have to be addressed. The two starting points for any network professional are traffic impacts – volume, timing and latency – and security management. Just how these concerns are addressed will depend on choosing whether device connection management or data stream/flow management is the better primary choice.

Summarizing the number of variables to be considered into just these two headings may seem at odds with the huge amount written on the issues of IoT networks. The concern for those facing the reality of supporting IoT sensor deployments on Enterprise IT networks is to find the approach that addresses their particular requirements, and not to become lost in a sea of individual issues.

Are the issues unimportant? Of course not, but as with everything to do with IoT, it’s all about outcomes! In this case that means choosing tools and techniques to manage device connectivity, or considering the alternative of managing consolidated data streams. Is it possible to have such a neat separation? In the long term, no – the two are equally important and both required. But in the short term, when tactical success matters, it helps to understand which is the dominant issue and prioritize it.

As in all markets in their early stages, much of the information available comes from a vendor of a product, so the presentation of the ‘facts’ will of necessity be centered on the product. ‘Issues-based selling’ is a well-known technique, so it pays to establish an overall approach for considering products within an objective context.

Googling the term ‘IoT Connection Management’ will turn up papers from Cisco, Huawei and others that define how to control and manage the huge number of different types and ways that IoT devices are connected. Connection management is a necessity when faced with the diversity of large numbers and types of devices that an enterprise might have in use, across both cellular and traditional enterprise IP-addressed devices.

Naturally, when first starting to consider IoT pilots and small-scale deployments, extending traditional IT network connection management to include IoT devices is seen as the starting point. At this stage the impact of managing the service-level availability of new connections exceeds the potential impact of data stream management.

However, as IoT sensing moves into production systems, the number and concentration of IoT sensors rise dramatically. Deloitte’s new headquarters building in Amsterdam has 22,000 sensor points supporting its abilities as a ‘Smart’ building. Consider the network impact these thousands of simple IoT sensing connections make in communicating only a few bytes of data each, but doing so frequently. In aggregate that’s a lot of traffic, yet the individual IoT sensor connections may have little capability to be managed beyond checking for their presence (think of the example of Sensity) – and counting a sensor’s data transmissions can check its presence just as well.

Deloitte’s Smart Building, ‘The Edge’, is expected to produce more than 3 petabytes of data a year from its 22,000 IoT sensors. At that level of mature IoT deployment the challenge has to move to data streaming management – a specific new functionality arising from the technologies associated with IoT, and a new challenge for IT network managers.
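
A back-of-the-envelope calculation on those published figures – assuming decimal petabytes and a perfectly uniform transmission rate, neither of which a real building will exhibit – shows why this is a streaming problem rather than a connection problem:

```python
# Rough arithmetic only: decimal petabytes and a uniform rate are assumed;
# real sensor traffic is bursty and unevenly distributed.
total_bytes_per_year = 3e15           # ~3 PB/year quoted for The Edge
sensors = 22_000
seconds_per_year = 365 * 24 * 3600

per_sensor_per_second = total_bytes_per_year / sensors / seconds_per_year
print(f"~{per_sensor_per_second:,.0f} bytes/second per sensor, sustained")
```

Several kilobytes per second, per sensor, sustained around the clock: individually trivial, but in aggregate a continuous multi-gigabit stream that has to be managed as a flow, not as 22,000 separate connections.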

The example of The Edge graphically makes the point that the immediate connectivity onboarding management swiftly becomes a massive traffic management challenge. IoT devices introduce a volume of connections × minuscule amounts of data × frequency of sending that, taken together, impose a very different traffic profile for management purposes. Even if the answer is to segment IoT sensors onto a different network, there is still likely to be a data streaming management challenge.

The business value from IoT sensors lies either in ‘real-time’ Smart Services or in Analysis of Things (AoT), and both mean interconnection with the Enterprise IT network to access Cloud-based resources. At this connection point, even if at no other, IoT data streaming management will be a necessity.

Googling Data Streaming or Data Flow management will produce a lot of results; as usual, most are written around products rather than in the context of the issues to be considered. As data is what provides the business value, the whole question of the creation of data by sensors, through to the consolidation of data into a form suitable for consumption by Smart Services and analytics processing, does need to be addressed. But that’s a further topic in its own right, and here consideration rests purely on the network impacts.

In the course of little more than a year, the growing experience gained from deploying IoT-based sensor systems has shifted the focus from the IoT sensors themselves to the creation, management and use of the IoT data. The whole topic of data streaming and flow management, together with the new forms of Analysis of Things (AoT), has expanded to make data architecture a pressing consideration, somewhat overtaking network connection architecture.

At the beginning of this blog, Sensity, with its clever LED lighting IoT solution, was used to illustrate high levels of IoT connections where the value lies in the aggregation of data rather than in the management of each individual connection. Should, or even could, this be managed via individual network connections, or is it one gateway connection with management of the resulting data stream?
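
A minimal sketch of the gateway option makes the trade-off concrete. The class and field names here are hypothetical, not Sensity’s or any vendor’s API; the point is simply that consolidation turns many device connections into one manageable upstream data stream:

```python
import json
import time

# Hypothetical gateway sketch (illustrative names, not any vendor's API):
# many tiny sensor readings are consolidated into one upstream batch, so the
# backbone manages a single data stream instead of thousands of connections.
class SensorGateway:
    def __init__(self, batch_size=100):
        self.batch_size = batch_size
        self.buffer = []        # readings waiting to be consolidated
        self.sent_batches = []  # stand-in for messages sent upstream

    def on_reading(self, sensor_id, value):
        self.buffer.append({"id": sensor_id, "v": value, "t": time.time()})
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.buffer:
            # One upstream message carries many readings, amortizing headers.
            self.sent_batches.append(json.dumps(self.buffer))
            self.buffer = []

gw = SensorGateway(batch_size=3)
for i in range(7):
    gw.on_reading(f"light-{i % 2}", i)  # e.g. LED-fixture sensor readings
gw.flush()  # push the partial batch at the end
print(len(gw.sent_batches), "upstream messages for 7 readings")
```

The individual sensors are only ‘managed’ implicitly, by their readings appearing in the batches – which is exactly the presence-by-transmission point made above.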

However, network connections come back into the picture when considering wireless deployments beyond the Enterprise IT network. Connected cars, goods in transit, high-value large items in storage yards and the like require IoT deployments that rely on 3/4G, SigFox, LTE, even sometimes Wi-Fi. Here connectivity and service management become the priority.

These forms of public wireless services are, for the foreseeable future, going to be subject to ongoing change as new specifications and capabilities – and even Service Providers’ business models – evolve. Recently the LoRa Alliance claimed to be the fastest-growing standards alliance in IoT, and of course there is the arrival of 5G on the horizon, all of which require connection-level changes to be managed.

In public wireless networks the service provider usually offers IoT network connection management, together with business/commercial management, as an important competitive differentiator. That means the device connection point itself is often overlooked, with focus only on the Enterprise IT connection point. Flexibility in IoT device connections is key to avoiding service-provider lock-in that prevents changing to better commercial deals.

The barrier to change on the IoT devices themselves is not inconsiderable: if originally intended for direct Wi-Fi connectivity, they will have an inbuilt full TCP/IP network stack, with protocol payloads exceeding data payloads. This level of message size is likely to exceed the capacity of specialist IoT networks such as SigFox. There are similar problems associated with each network type, i.e. 3/4G, LTE, etc., that render changing from one to another somewhere between an expensive redevelopment of communication stacks and effectively impossible.

This is a not inconsiderable issue to face before choosing network connectivity types for an IoT deployment using public wireless networking services. For low-cost simple sensors the answer will be to ‘rip and replace’, but investment in complex ‘Smart Sensors’ needs to address future-proofing. As in most areas where IoT stretches the capabilities of existing technologies, start-ups are providing new answers.

An interesting example is Wivity, which claims to eliminate the complexity of public wireless connectivity with a ‘Build Once, Connect Everywhere’ approach based on a hardware modem incorporated in the Smart Device design. The interchangeable Wivity modems accept the same HTTP calls from the IoT device no matter what network connection is deployed, providing an ongoing path to new, upgraded network types. Together with lightweight protocols and other edge-based techniques, network connectivity is made simpler and more flexible. Wivity calls for some rethinking of the telecoms market and IoT in its blog: https://wivity.com/blog/IoT-is-a-Different-Animal

To summarize: deploying IoT pilots means considering and testing more than simple sensor connectivity to a GUI or analytics. IoT is a generational change in the type of technology and its business role, so understanding its network and data connectivity needs careful investigation. IoT pilots make low enough demands on Enterprise IT networks that the impact of full-scale rollouts is easy to miss.

 

 

This is my last post for five weeks, as I will be taking a sabbatical break, though I will continue to follow the technology market as usual.


Blockchain: Almost Everything You Read Is Wrong

Almost everything you read about the blockchain is wrong. No new technology since the Internet itself has excited so many pundits, but blockchain just doesn’t do what most people seem to think it does. We’re all used to hype, and we can forgive genuine enthusiasm for shiny new technologies, but many of the claims being made for blockchain are just beyond the pale. It's not going to stamp out corruption in Africa; it's not going to crowdsource policing of the financial system; it's not going to give firefighters unlimited communication channels. So just what is it about blockchain?

The blockchain only does one thing (and it doesn’t even do that very well). It provides a way to verify the order in which entries are made to a ledger, without any centralized authority. In so doing, blockchain solves what security experts thought was an unsolvable problem – preventing the double spend of electronic cash without a central monetary authority. It’s an extraordinary solution, and it comes at an extraordinary price. A large proportion of the entire world’s computing resource has been put to work contributing to the consensus algorithm that continuously watches the state of the ledger. And it has to be so, in order to ward off brute force criminal attack.
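
The ledger-ordering idea itself is simple to illustrate. The toy hash chain below is a sketch only – it omits the distributed consensus and proof-of-work that make the real blockchain so expensive – but it shows how each entry commits to its predecessor, so reordering or rewriting history is detectable:

```python
import hashlib
import json

# Toy hash chain: each block's hash covers its entry plus the previous
# block's hash, fixing the order of ledger entries. Real blockchains add
# distributed consensus (proof of work) on top; this sketch omits that.
def make_block(entry, prev_hash):
    block = {"entry": entry, "prev": prev_hash}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def verify(chain):
    for i, block in enumerate(chain):
        expected = hashlib.sha256(json.dumps(
            {"entry": block["entry"], "prev": block["prev"]},
            sort_keys=True).encode()).hexdigest()
        if block["hash"] != expected:
            return False                      # entry was rewritten
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False                      # order was tampered with
    return True

chain = []
prev = "0" * 64
for entry in ["alice pays bob 1", "bob pays carol 1"]:
    chain.append(make_block(entry, prev))
    prev = chain[-1]["hash"]

print(verify(chain))                          # True
chain[0]["entry"] = "alice pays bob 100"      # rewrite history
print(verify(chain))                          # False
```

Note what the sketch does not solve: with no consensus mechanism, whoever holds the chain can simply recompute every hash after tampering. That is the gap the blockchain’s enormously costly mining process exists to close.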

How did an extravagant and very technical solution to a very specific problem capture the imagination of so many? Perhaps it’s been so long since the early noughties’ tech wreck that we’ve lost our herd immunity to the viral idea that technology can beget trust. Perhaps, as Arthur C. Clarke said, any sufficiently advanced technology looks like magic. Perhaps because the crypto currency Bitcoin really does have characteristics that could disrupt banking (and all the world hates the banks) blockchain by extension is taken to be universally disruptive. Or perhaps blockchain has simply (but simplistically) legitimized the utopian dream of decentralized computing.

Blockchain is antiauthoritarian and ruthlessly “trust-free”. The blockchain algorithm is rooted in politics; it was expressly designed to work without needing to trust any entity or coalition. Anyone at all can join the blockchain community and be part of the revolution.

The point of the blockchain is to track every single Bitcoin movement, detecting and rejecting double spends. Yet the blockchain APIs also allow other auxiliary data to be written into Bitcoin transactions, and thus tracked. So the suggested applications for blockchain extend far beyond payments, to the management of almost any asset imaginable, from land titles and intellectual property, to precious stones and medical records.

From a design perspective, the most troubling aspect of most non-payments proposals for the blockchain is the failure to explain why it’s better than a regular database. Blockchain does offer enormous redundancy and tamper resistance, thanks to a copy of the ledger staying up-to-date on thousands of computers all around the world, but why is that so much better than a digitally signed database with a good backup?

Remember what blockchain was specifically designed to do: resolve the order of entries in the ledger, in a peer-to-peer mode, without an administrator. When it comes to all-round security, blockchain falls short. It’s neither necessary nor sufficient for any enterprise security application I’ve yet seen. For instance, there is no native encryption for confidentiality; neither is there any access control for reading transactions, or writing new ones. The security qualities of confidentiality, authentication and, above all, authorization, all need to be layered on top of the basic architecture. ‘So what’ you might think; aren’t all security systems layered? Well yes, but the important missing layers undo some of the core assumptions blockchain is founded on, and that’s bad for the security architecture. In particular, as mentioned, blockchain needs massive scale, but access control, “permissioned” chains, and the hybrid private chains and side chains (put forward to meld the freedom of blockchain to the structures of business) all compromise the system’s integrity and fraud resistance.

And then there’s the slippery notion of trust. By “trust”, cryptographers mean “out of band” or manual mechanisms, over and above the pure math and software, that deliver a security promise. Blockchain needs none of that – so long as you confine yourself to Bitcoin. Many carefree commentators like to say blockchain and Bitcoin are different things, yet the connection runs deeper than they know. Bitcoins are the only things that are actually “on” the blockchain. When people refer to putting land titles or diamonds “on the blockchain”, they’re using a shorthand that belies blockchain’s limitations. To represent any physical thing in the ledger requires firstly a schema – a formal agreement about which symbols in the data structure correspond to what property in the real world – and secondly a process to bind the owner of that property to the special private key (known in the trade as a Bitcoin wallet) used to sign each ledger entry. Who does that binding? How exactly do diamond traders, land dealers, doctors and lawyers get their blockchain keys in the first place? How does the world know who’s who? These questions bring us back to the sorts of hierarchical authorities that blockchain was supposed to get rid of.

There is no utopia in blockchain. The truth is that when we fold real world management, permissions, authorities and trust, back on top of the blockchain, we undo the decentralization at the heart of the design. If we can’t get away from administrators then the idealistic peer-to-peer consensus algorithm of blockchain is academic, and simply too much to bear.

I’ve been studying blockchain for two years now. My latest in-depth report was recently published by Constellation Research.

 
 

Consumers Are The Biggest Disruptor in the Supply Chain

We are in the middle of event season, which means lots of airplanes, hotel rooms and restaurant dinners (some better than others). During the past few weeks I have flown to all the event hot spots – Las Vegas, San Francisco, Detroit, Nashville, Chicago, New York, San Jose, Miami and Washington DC, to name a few. I have also attended a wide range of events, from the likes of Infosys, JDA, Plex, Demandware, SAP, Oracle and Epicor. There has been one common thread – the rise of the consumer. Now, this is nothing new to us here at Constellation Research. We have been touting the rise of the consumer in the commercial ecosystem (B2B and B2C) as the biggest disruptor to date. It is good to hear that the solution providers are recognizing this shift as well.

So why is the consumer gaining in strength?

They have the voice because of social. A growing number of retailers are making sure they do better social listening to gauge how their customers view them and what kinds of features or services they might be interested in. Companies like Newell Rubbermaid and Best Buy have done a lot of work to keep an ear to the social sounding boards.

Consumers have the reach via mobile. As Demandware pointed out at their show, mobile is king - and true mobile, meaning our phones rather than our tablets, sits at the top of the food chain when it comes to customer interactions and touch points. We all know the statistics about how often we check our phones, and the fact that they are with us for almost the entirety of our waking hours. Who could have imagined that what Marty Cooper did in 1983 would give consumers such reach in their relationship with the commerce ecosystem?

The internet provides consumers with virtually unlimited choices. Before the rise of the world wide web and subsequently eCommerce, our choices were often tethered to the stores within a reasonable physical range, or to whatever inventory could be displayed in a catalog. All this changed with the rise of the internet. Suddenly, if you were a purveyor of fine wine in the Rhone valley, you could attract buyers from Hong Kong to Pittsburgh. Regardless of your size, through a few clicks of a mouse your products were accessible to anyone with a browser and a dial-up modem! Consumers now had access to a treasure trove of products.

Finally, consumers’ expectations have been set by the likes of Amazon. Not only can consumers access a wide swath of products through the eCommerce giant, but they also expect perks such as free shipping, easy returns and access to long-tail products, to name a few.

All this taken together is why consumers are becoming, if not already, the biggest disruptors within the commerce supply chain. Based on what I am hearing this event season, vendors and service providers agree with this assertion. The challenge will be how best to offer the solutions and technologies that allow participants in the commerce supply chain to meet their consumers’ needs. These solution providers must keep their customers’ customers in mind as they design and offer new solutions. How can they empower their customers to better meet the growing expectations and needs of the end customer? No small challenge indeed. As this crazy event season has demonstrated, most if not all of the vendors are at least reading from the same sheet of music.

Matrix Commerce Chief Customer Officer

#CMTV Speaks to R "Ray" Wang at CloudExpo

#CMTV Speaks to R "Ray" Wang at CloudExpo


R "Ray" Wang discusses Dominating Digital Disruption at CloudExpo. 

Tech Optimization Chief Information Officer

<iframe width="560" height="315" src="https://www.youtube.com/embed/qgl34x8fq-M" frameborder="0" allowfullscreen></iframe>

Infosys Confluence 2016 - The Future is Software + People 

Infosys Confluence 2016 - The Future is Software + People 

We had the opportunity to attend Infosys’ Confluence event in San Francisco, held from April 27th to 29th at the Hilton Union Square. The conference was well attended, with over 1,500 participants from customers, prospects and the ecosystem - a surge of more than 50% over last year.
 
Have a look at my top three takeaways of the event here:

 
 
 
No time to watch – here is the 1-2 slide condensation:
 

Want to read on? Here you go: 

Always tough to pick the takeaways – but here are my Top 3:

ZeroDistance – Bringing people closer to technology, and positioning Infosys as the enabler of zero distance between people, is a worthy goal and was well received. The other two dimensions of ZeroDistance are the end users and the code of the solutions to be built. Amongst the common drivers (technology, education), ‘extreme disintermediation’ was the one that stuck out for me. I think Infosys articulated it well, but did not mention the elephant in the room when it comes to disintermediation: DLT, aka blockchain.

Mana – Infosys launched a new offering, Mana, for now squarely focused on improving its internal processes while drinking its own champagne. The champagne brand is IIP - aka the Infosys Information Platform (and with that AiKiDo) - which the provider uses to improve efficiency (doing things right) and effectiveness (doing the right thing) for its largest employee population group: L3 support consultants. Infosys sees Mana as a knowledge-based AI platform; for now it is a great internal showcase that looks at the digital exhaust of IT support work and then comes up with faster, better, more automated resolutions.

Product Progress – The ‘Platform’ family (IIP, IAP, and IKP) is making good progress, with currently 220 engagements and 17 live customers. The ATP statistics work is being done with IIP, and it was showcased widely and proudly at Confluence. On the Edge family side (TradeEdge, CreditEdge, ProcureEdge and AssistEdge) Infosys is doing well, too - now with over 60 live customers and double the revenue.


Also please take a look at the video my colleagues Alan Lepofsky, Doug Henschen and I recorded after Day #1 of Confluence - here.

MyPOV

A good event for Infosys, which is working both to regain momentum and to change the service-provider landscape by building more IP and products - all in line with Sikka’s point that the future of services is the combination of software and people. The good news is that Infosys now has the chops to provide it, and has created a great internal showcase with Mana. It is clear the executive team has thought this through, as the move lowers the overall cost of service, makes people more productive and enables them to serve more accounts.

On the concern side, Infosys is in the middle of a transformation that it needs to see through and come out of stronger at the end. To become more agile the provider is flattening the organization, a rather radical re-organization that is unique among service providers. Changing a hierarchical organization to a flatter one is never trivial. But to be fair - there is no alternative to this transformation given the course Sikka has set for Infosys.

Overall a good Confluence for Infosys, which is changing, slowly but with determination, from a ‘people only’ to a ‘software and people’ model. We will be watching.

Want to learn more? Check out the Storify collection below.

More on Infosys:
 
  • Progress Report - Infosys Analyst Meeting - Can you transform customers while you transform yourself? Looks like it - read here

Find more coverage on the Constellation Research website here and check out my magazine on Flipboard and my YouTube channel here.

 
Future of Work Data to Decisions Digital Safety, Privacy & Cybersecurity Innovation & Product-led Growth Tech Optimization User Conference infosys SaaS PaaS IaaS Cloud Digital Transformation Disruptive Technology Enterprise IT Enterprise Acceleration Enterprise Software Next Gen Apps IoT Blockchain CRM ERP CCaaS UCaaS Collaboration Enterprise Service ML Machine Learning LLMs Agentic AI Generative AI AI Analytics Automation business Marketing finance Healthcare Customer Service Content Management Chief Information Officer Chief Technology Officer Chief Information Security Officer Chief Data Officer Chief Executive Officer

IBM Design - It's More Than Just Drawing Pretty Pixels

IBM Design - It's More Than Just Drawing Pretty Pixels

On April 13th I traveled to the headquarters of IBM Design in Austin, Texas. My goal for the day was to learn how this new (well, two-year-old) division of the company is impacting product design and customer satisfaction. Below is an approximately 10-minute video where I recap the key things I learned. If you don't have time to watch, here's the main thing you need to know:

IBM Design is about a lot more than just making products look nice. The (1,000+ person) team's mission is to apply "design thinking" (the discipline of using creative problem solving to find solutions) to almost everything IBM does. It's not just about products (like Verse, Watson Analytics, Bluemix) but also about changing the processes IBM follows for everything from feature requests to on-boarding new employees and performance reviews. Following the formula People + Practices + Places = Outcomes, IBM Design is changing the culture of IBM as much as it is changing the products.

As proof of IBM's long-term commitment to design, the company recently announced a new title, Distinguished Designer. Those of you familiar with careers at IBM will know that Distinguished Engineer is a huge honour for developers. This new design-centric honour is intended to carry similar significance in the industry. The first three people to receive this new distinction are Charlie Hill, Adam Cutler and Doug Powell. I worked with Charlie at Lotus for many years, and have had several meetings with both Adam and Doug. Doug was actually my instructor when I went through IBM's Design Thinking workshop last year. Congratulations to all three on this well-deserved honour.

 

Future of Work Data to Decisions Innovation & Product-led Growth New C-Suite Tech Optimization IBM Chief Experience Officer

Event Report - OpenStack Summit 2016

Event Report - OpenStack Summit 2016

We had the opportunity to attend OpenStack Summit in Austin this week, our first visit to an OpenStack Summit. It is always good to see first-hand and in person how well the community, vendors and ecosystem are doing. In short - OpenStack is doing well, growing up and maturing (there are pros and cons to that, more below).

 

So take a look at this video for my overall event report (and see my Day #1 blog post here):

 

No time to watch? Check out the 2 slide summary:

 
More time - read on:
 
Tough to pick the Top 3 takeaways - but here you go:
 
  • OpenStack grows up - I spoke to many OpenStack veterans, including 4 'original' attendees of the very first summit in Austin... and they all see more of an enterprise attendance, more 'suits' and more interest from enterprises. That's a welcome development.
     
  • Great Story for ISVs and Telcos - but the rest? - OpenStack has become the de facto standard for network and device virtualization - with all the benefits of open source (one major one being... 'it's free') - as well as for ISVs sitting on complex architectures and looking for ways to move their data centers to a standard, IT-accepted offering (e.g. SAP and Workday were presenting). The question is: what about the rest of the enterprise spectrum? We heard encouraging statements from Walmart and Wells Fargo - but they were less flamboyant than the 'all in' messages we hear at the public cloud events. An area to watch.
     
  • Right Themes for Mitaka - but where is the sizzle? - Keeping the focus on manageability and usability of course makes a lot of sense for OpenStack, but the community needs to be careful not to spend too much time looking in the rear-view mirror. Key innovative cloud areas like Microservice Management, Serverless Architecture, Machine Learning, Bots, In-Memory advances, Big Data security and even security in general are not landmark items going forward (for now). This is a challenge inherent in the nature of open source: enterprises and people need to spend time and money to make things work - and sometimes making things work takes a long time. In the meantime the next innovation pops up somewhere else. All of these innovative areas offer a differentiation strategy to OpenStack players - but that can also lead to more fragmentation, with the risk of losing the interoperability and vendor-diversity advantages of OpenStack (see IBM's attempt to stop that in general here).
 

MyPOV

Good progress by OpenStack. The community has weathered the significant reduction in commitment of a large member (HP) well, and it looks like the roadmap and projects have not taken (too much of) a hit. Good to see more attendees; over 50% first-time attendees shows an interest beyond the early adopters. And projects are maturing, which on the one side is good to see - but OpenStack may have a 'pipeline' problem. 2013 efforts were balanced across Proof of Concept (POC) / Test and Production - now Production has doubled (good) but the percentage of members in POC mode has gone down to 50% (see below). Are enough new members still looking at OpenStack?
 
 
OpenStack will have to work hard to keep up the value proposition and create growth vis-à-vis the prominent public cloud vendors. 2016 and the future will be different, as many vendors (e.g. IBM, Microsoft and Oracle) now offer their cloud stacks on premises, too - which reduces the value of one of the key OpenStack arguments. We will be watching...
 
Still not enough - check out my Storify collections of Day #1 (here), Day #2 (below) and the Analyst Summit (here). 
 
Find more coverage on the Constellation Research website here and check out my magazine on Flipboard and my YouTube channel here
 
Tech Optimization Data to Decisions Digital Safety, Privacy & Cybersecurity Innovation & Product-led Growth Future of Work Next-Generation Customer Experience User Conference SaaS PaaS IaaS Cloud Digital Transformation Disruptive Technology Enterprise IT Enterprise Acceleration Enterprise Software Next Gen Apps IoT Blockchain CRM ERP CCaaS UCaaS Collaboration Enterprise Service Chief Information Officer Chief Technology Officer Chief Information Security Officer Chief Data Officer Chief Executive Officer

From the Internet of Things, to the Analytics of Things focusing on new forms of Analytics coupled to Smart Services.

From the Internet of Things, to the Analytics of Things focusing on new forms of Analytics coupled to Smart Services.

The Internet of Things has been experiencing a full-on hype phase, with a ceaseless barrage of statistics about the number of things that will be connected. Fortunately this is fading, to be replaced with meaningful experiences of where, and how, the business value is to be found. The Analytics of Things, or AoT, is a new and welcome part of this shift, delivered by some well-respected data analytics technology companies.

The introduction of the term AoT draws attention to the new data, and its value, being generated both by IoT devices and by Smart Services. As with all data there is a requirement for analysis, but in this case it will introduce new types of analytics. AoT also refers to the use of analytics in making connected devices smart and able to take intelligent action. For more details see Deloitte's AoT – a short take on Analytics.

Previous blogs in this series have dwelt on the new and innovative ability to use ‘real-time’ data to react in an optimized manner not previously possible. The introduction of Smart Services as the mechanism for this, using Complex Event Processing, is a significant change enabled by IoT-sourced data. The most recent blog illustrated the application of these capabilities in a Smart City technology and business conceptual architecture, and included the need for the advanced analytics of AoT.

IoT is still at an early stage of understanding for many people, so adding the even less understood topic of AoT seems almost counterproductive. If anything seems familiar it will be the analytics part, though this is wrongly assumed to be part of Big Data analytics. AoT is concerned with wholly different data, which in turn has the capability to reveal very different outcomes from existing reports.

In the blog on Smart City architecture, a simple Smart Service use case, ‘Find me Parking’, demonstrated interactions between Smart Services by adding a further Smart Service providing route planning with traffic-jam avoidance. Obviously these services are aimed at citizens, but there are substantial benefits to the City Administration from analyzing the radically different data that Smart Services produce. Three simple examples of the data available are:

  1. Query inputs from citizens providing their starting geo-location as well as their intended destination geo-location
  2. The true total demand for Parking including requests that were unfulfilled, as well as the amount of time a car was on the streets waiting for a parking bay to become available.
  3. The routes used and the changes introduced by this traffic, as well as journeys abandoned due to traffic or non availability of a parking bay.

It would be difficult, and expensive, to obtain any of this data by other means such as personal surveys or checkpoints. But the real point is that each of these three data outputs is generated within the context of an event, or citizen activity, and shows both failed and successful activity event chains and outcomes.
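The kind of analysis the first two data outputs enable can be sketched in a few lines. The event records and field names below are invented for illustration; the point is that every request is captured, so true total demand - including unfulfilled requests and time spent circling - falls out directly.

```python
# Hypothetical 'Find me Parking' request events; the schema is illustrative.
requests = [
    {"origin": (51.50, -0.12), "destination": "city-centre", "fulfilled": True,  "wait_min": 4},
    {"origin": (51.51, -0.10), "destination": "city-centre", "fulfilled": False, "wait_min": 12},
    {"origin": (51.49, -0.14), "destination": "riverside",   "fulfilled": True,  "wait_min": 2},
    {"origin": (51.52, -0.11), "destination": "city-centre", "fulfilled": False, "wait_min": 9},
]

# True total demand: every request counts, fulfilled or not.
total_demand = len(requests)
unfulfilled = sum(1 for r in requests if not r["fulfilled"])

# Minutes cars spent on the streets waiting for a bay to become available.
total_wait = sum(r["wait_min"] for r in requests)

print(total_demand, unfulfilled, total_wait)  # → 4 2 27
```

A survey or checkpoint count would only ever see the cars that parked; the event stream also captures the two failed requests and the 27 minutes of circling.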

AoT explores new sources of data from Smart Service requests and outcomes, adding entirely new and original data sets that have not been available from traditional sources. IoT-enabled Smart Services bring the capability to apply entirely new, high-value analytics.

There is also data available from direct-coupled IoT sensors, many of which are new sources. Examples include sensing buses passing route points to check whether they are on time, various traffic-flow sensing schemes, traffic-light maintenance, and other directly sensed conditions. All produce data inputs to AoT, but often the individual sensor readings are of too low a value for advanced AoT analytics. While valuable, these inputs remain better suited to validating existing planning and scheduling applications, and to reporting analytics.

AoT is not yet a fully developed set of technologies, products and methods, but the possibilities are becoming increasingly recognized as IoT-enabled Smart Services become more widespread. Interesting articles have started to appear outlining the importance of the topic, with the crucial association to ‘Smart’; the Wall Street Journal blog The Analytics of Things and Deloitte's Short Take on Analytics are examples.

Some companies that are well respected in the ‘data’ sector are active proponents of AoT; including IBM, and Teradata, each having developed specific approaches and products for this wholly new branch of Analytics. Other technology companies together with Startups are active, and there is a strong link between AoT and AI, or Artificial Intelligence.

The IBM ‘Watson’ initiative is an example of this broader approach, defined as a cognitive reckoning engine and thus fusing the broader implications of AI with AoT. The IBM approach certainly does contain the mix of people, unstructured data and IoT-based smart inputs that are the basis for AoT. IBM’s acquisition of the Weather Channel is an interesting example of simple sensor data being turned into a Smart Service of weather forecasting. The forecast data is then consumed in further analysis in combination with other Smart Services.

IBM, in common with other providers of AoT capabilities, sees this new generation of unique outcomes being used to construct and refine the rules for Smart Services’ Complex Event Processing engines. This continuous refinement, made possible by the interactive relationship between AoT, IoT and Smart Services, is claimed to lead in the direction of cognitive reckoning and AI.

The earlier example of the Smart Parking and Smart Routing services feeding new data to AoT illustrated just how this provides a very different view of parking and traffic activities. Using these results to adjust city planning and traffic models will lead to new insights being made available to both services, further refining the optimization of their outcomes. AoT adds significant new value, but will also need significant new skills.

Teradata has recently introduced a new business unit specializing in AoT deployments. This builds on the release, at the beginning of 2014, of Teradata Aster-GR, which added graph capabilities to the flagship Teradata Aster Discovery Platform 6. Shifting to managing and exploring data by means of graphs is an important principle underpinning much of the use of IoT data, and therefore AoT; see the blog From Relational Databases to Graph.
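The graph principle itself is independent of any one product. As a minimal sketch - with all location names and journey records invented for illustration, and no claim about how Aster-GR implements this - journeys from the parking and routing services can be modeled as edges and then traversed with queries that are awkward to express as flat reports:

```python
from collections import defaultdict

# Hypothetical journey records (origin, destination) from the
# parking/routing Smart Services.
journeys = [
    ("home-zone-A", "car-park-1"),
    ("home-zone-A", "car-park-2"),
    ("home-zone-B", "car-park-1"),
    ("car-park-1", "office-district"),
]

# Build a directed graph as adjacency lists keyed by origin.
graph = defaultdict(list)
for origin, dest in journeys:
    graph[origin].append(dest)

# A graph-style query: where can traffic starting in home-zone-A
# end up within two hops?
reachable = set()
frontier = ["home-zone-A"]
for _ in range(2):
    frontier = [n for node in frontier for n in graph[node]]
    reachable.update(frontier)

print(sorted(reachable))  # → ['car-park-1', 'car-park-2', 'office-district']
```

Multi-hop reachability like this is exactly the class of question that motivates graph engines over purely relational layouts when the data is a web of events and locations.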

Teradata used its online magazine to publish a description of the Aster-GR capability and how it integrates with other data in different formats, in an article entitled ‘Hands-On’. Teradata also has a more technically detailed paper entitled SAS/Access – interface to Teradata, produced in conjunction with SAS (a well-respected name in advanced analytics), on how to integrate solutions.

If you work in analytics and figure that Big Data and Data Lakes are as far as the technology has advanced, then it's time to take a fresh look at the topic. Google AoT, or Analytics of Things, to find a range of content on the fresh new capabilities it introduces. As Smart Services based on IoT become established as the real definition of the digital service economy, AoT moves to the forefront of delivering business and competitive value.

An important part of the research is to carefully check out how AoT will function in association with an Enterprise Data Warehouse; the good news is that this may be a whole lot simpler than expected. As an example Teradata accepts direct IoT data inputs for processing and aggregation into the existing data. 

New C-Suite