Results

4 Factors Drive Great Customer Experience And Commerce


We just finished some new research that shows the correlation between customer experience and commerce revenue. Below are some of the highlights from the research.

Continuity of Customer Experiences Drives the Future of Commerce


The Four Factors that Drive Superior Customer Experience Engagement

1. Know who the most profitable customers are online

  • Stat: Only 52 percent of U.S. and 53 percent of U.K. brands were very effective in knowing who their most profitable customers are.
  • This means that nearly half of their time is spent with the wrong customers (low value) or, if they are spending time with the right customers, the customer experience does not provide a high conversion rate.
  • Thus, brands require a new approach.

2. Know where (which channels) their high value customers are coming from.

  • Stat: Only 58 percent of U.S. and 48 percent of U.K. brands were very effective in knowing where (which channels) their high value customers are coming from.
  • In this context, channels include social media, websites, search engines, etc. Basically, brands are guessing at where to spend their marketing and ad dollars to provide the highest return on investment.

3. Know which high value products customers are interested in.

  • Stat: Only 59 percent of U.S. and 56 percent of U.K. brands were very effective in knowing which products customers are most interested in.
  • Consequently, brands may not be displaying the best products to the highest value customers, thus losing revenue and profits.

4. Know which high value products customers have viewed.

  • Stat: Only 60 percent of U.S. and 56 percent of U.K. brands were very effective in knowing which products customers have actually viewed.
  • When brands understand the customer’s journey, they can direct their spending and messaging to the moments of maximum influence.
  • When they do this, they will have a much greater chance of reaching customers in the right place at the right time with the right message, producing higher conversion rates.

How does your company stack up against these companies? What have you done to improve your customer experience so that it drives better revenue?

@drnatalie, VP and Principal Analyst, Covering Marketing, Sales, and Customer Service to Deliver Amazing Customer Experiences

References:
Harris Poll Research and Primary Constellation Research


Musings - Speed matters for HR - how to accelerate - Part II


Last week we looked at the need for enterprises to accelerate (find the blog post here), taking more of a technology view, as all good ‘inner values’ are based on a decent architecture of the right technology. More specifically, these are the enablement of BigData and the provision of ‘true’ analytics (my definition here).
 
 
 
But even the right technology and architecture can fail if the design principles aren’t right – and it’s key to keep in mind that business user centricity is paramount for successful, next generation HR systems that help enterprises get faster. No surprise, as layers of organization, a Tayloresque organization model, competency splits and more have slowed down organizations since… forever (or at least since the invention of these principles, which ironically were designed to make organizations more efficient and faster).

So let’s look at the next set of three criteria for a successful next generation HR product:
 

‘Lean’ Recruiting

Recruiter is one of the most challenging positions in the enterprise. CEOs (usually) have 2-3 years before they are removed for bad performance, sales reps get 2-3 quarters, recruiters 2-3 weeks. That’s still better than the 2-3 hours a waiter or a retail sales clerk gets, but it is probably one of the most scrutinized positions in the enterprise. If a recruiter cannot bring on board the talent an enterprise is looking for after a few weeks, they will be looking for a new job soon. If you keep in mind unfavorable workforce dynamics, the need for acceleration etc., that job will not get easier.

So let’s keep in mind what the recruiter role was originally put in place for: to have a professional who does nothing but recruiting and is therefore better at it than the occasional recruitment by a manager in the line of business, and to save time for that manager, so they don’t have to sift through many resumes, initial interviews etc. But at the same time the creation of the recruiter established a disconnect between the hiring manager and the candidate that has to be overcome for every vacancy. The fact that up to a third of positions in North America are not being filled due to perceived ‘friction’ between recruiters and line of business managers is telling. It looks like in too many cases managers prefer to settle for hope (that the employee will get better), have their team work more (‘so hard to find good people’), or kick the can down the road (the next manager can fix this) – rather than recruit new talent.

The good news is that with the advances in BigData and cheap, elastic compute capacity from the cloud that allows for ‘true’ analytics, the recruiter can be largely bypassed and a filtered, fitted list of candidates can be served to the manager. The manager has to make the final call anyway, so instead of a recruiter working through many resumes to whittle down the list of good applicants for the final interview, managers can today get the list of final candidates with the help of next generation recruiting software. We are at the cusp of video analysis even adding early conversations to the data trove from which managers can find the best candidates, after software has created the shortlist. And some vendors have even changed the recruiting model. Thanks to the advances of technology, headhunting techniques can be applied. Formerly reserved for six-figure salaried positions, the searching of social networks (interestingly Facebook beats out the ‘network of liars’, LinkedIn) for best fit candidates empowers managers to contact the best candidates directly, while they are still in their current jobs. This follows the old adage that the best people are not looking for jobs (but are happily employed somewhere).
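The shortlisting step described above can be sketched as a simple scoring filter: rank applicants against the requirements and hand the manager only the top of the list. This is a minimal, hypothetical illustration; the field names, weights and scoring formula are my assumptions, not any vendor's actual model.

```python
# Hypothetical sketch of automated candidate shortlisting.
# All field names, weights and caps below are assumptions for illustration.

def score_candidate(candidate, required_skills):
    """Score a candidate by skill overlap (70%) and capped experience (30%)."""
    overlap = len(set(candidate["skills"]) & required_skills) / len(required_skills)
    experience = min(candidate["years_experience"] / 10, 1.0)  # cap at 10 years
    return 0.7 * overlap + 0.3 * experience

def shortlist(candidates, required_skills, top_n=3):
    """Return the top-N candidates, so the manager only sees the final list."""
    ranked = sorted(candidates,
                    key=lambda c: score_candidate(c, required_skills),
                    reverse=True)
    return ranked[:top_n]

candidates = [
    {"name": "Ada",  "skills": ["sql", "python", "statistics"], "years_experience": 8},
    {"name": "Ben",  "skills": ["java"],                        "years_experience": 2},
    {"name": "Cara", "skills": ["python", "statistics"],        "years_experience": 12},
]
required = {"python", "sql", "statistics"}
print([c["name"] for c in shortlist(candidates, required, top_n=2)])  # ['Ada', 'Cara']
```

A real product would of course feed this from resumes, social profiles and, eventually, video analysis rather than hand-entered dictionaries, but the manager-facing result is the same: a pre-ranked shortlist.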

So empowerment of the business user, plus the usage of Analytics and BigData, enables ‘lean’ recruiting. With no recruiters involved (they will become coaches for managers and applicants, product managers at software companies etc.) – certainly an aggressive vision, but ask any manager out there if they would like it. They are more likely to say Yes than No… which means it will happen, sooner rather than later.

Talent ‘Depth Chart’

Sports teams, security and military teams have depth charts. A coach needs to know who can play the same position, should she decide to substitute the player on the field, or should the player be injured, booked etc. Likewise there can’t be an ‘emergency’ meeting when the heavy machine gunner is incapacitated… so why has the Talent ‘Depth Chart’ not been enabled for positions in an enterprise?

So let’s look at what it means first: A manager should see at any given time how well the current holder of a position is performing; who else in the enterprise can do that job, or do a better job than the incumbent; what talent (using ‘lean’ recruiting above) could be hired from outside the enterprise; or, if the position may not be around much longer and the local labor laws make it challenging to shut down the position, what candidates from contingent worker sources could be hired. A contractor may even be the best fit for the position, regardless of the employment status.

Again the good news is that vendors are actively thinking about this. It is relatively easy to see how well an employee is performing; the manager will have an opinion on this anyway. But it is hard to do something about it. So finding coaching, learning and mentoring tools and options is the first step. Finding good fits in the enterprise is the next step. Then easing the conversation with the manager of a professional whom another manager would like to ‘poach’, with one-click automation; giving the phone numbers of the best 2-3 external candidate fits for a position; same for the best contractors. Show how well they can do the job – where would they rank on the ‘Talent Depth’ chart for that position? All these functions can be built today, enabling the manager to assess incumbent and potential talent for a position.
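The ‘Talent Depth Chart’ described above boils down to one ranked list per position that pools the incumbent with internal, external and contingent talent. A minimal sketch, assuming a hypothetical data model (the fit scores would come from whatever analytics the enterprise trusts):

```python
# Hypothetical sketch of a per-position talent depth chart, pooling the
# incumbent with internal, external and contingent candidates in one ranking.

from dataclasses import dataclass

@dataclass
class TalentEntry:
    name: str
    source: str       # "incumbent", "internal", "external" or "contingent"
    fit_score: float  # 0..1, however the enterprise chooses to compute fit

def depth_chart(entries):
    """Rank all available talent for a position, best fit first."""
    return sorted(entries, key=lambda e: e.fit_score, reverse=True)

entries = [
    TalentEntry("Dana (incumbent)", "incumbent", 0.72),
    TalentEntry("Eli", "internal", 0.81),
    TalentEntry("Fay", "external", 0.88),
    TalentEntry("Gus (contractor)", "contingent", 0.64),
]
for rank, entry in enumerate(depth_chart(entries), start=1):
    print(rank, entry.name, entry.source)
```

The point of the sketch is the single ranking across employment statuses: the incumbent is just one more entry, which is exactly what lets a manager see at a glance whether a contractor or an external hire would outrank them.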

Transboarding

I wrote about Transboarding the first time in summer of 2014 (more here) – it’s a word constructed by merging transfer, off- and onboarding. A lot of effort, time and resources are spent on Recruiting and Onboarding, and they are key HR functions, but the most common HR event, the transfer, is barely automated and supported.

Consider the manager using a ‘Talent Depth’ chart as described above. She identifies a suitable individual in another department. Let’s go for the easy scenario – the individual wants to transfer, his manager is supportive and, even better, needs the current incumbent in the current position – basically a talent swap. Even such a smooth and simple case is substantial work with HR Core, Training and Talent Management systems. If we talk hourly workers, add the complexity of Workforce Management. What if the manager could do an ‘electronic handshake’, set the date and the rest would be automated? The APIs for these functions are available. We are only waiting for vendors to ‘glue’ them together into very powerful automation in a flexible way that empowers the business user.
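The ‘electronic handshake’ above amounts to orchestrating a handful of API calls across the HR Core, Talent and Training systems once both managers confirm. A purely hypothetical sketch of that glue layer; the system names, actions and call shapes are invented for illustration and do not reflect any real product's API:

```python
# Hypothetical orchestration of a transfer ("transboarding") after the
# electronic handshake. The stub class stands in for real HR Core / Talent /
# Training system APIs; every name here is invented for illustration.

class HRSystemStub:
    """Stand-in for one HR system API; records the calls it receives."""
    def __init__(self, name):
        self.name, self.calls = name, []
    def call(self, action, **kwargs):
        self.calls.append((action, kwargs))
        return True  # a real API would return success/failure per call

def transboard(employee_id, from_dept, to_dept, effective_date, systems):
    """Run the transfer steps against every system; fail loudly if any rejects."""
    steps = [
        ("close_assignment", {"employee": employee_id, "dept": from_dept}),
        ("open_assignment",  {"employee": employee_id, "dept": to_dept,
                              "date": effective_date}),
        ("assign_training",  {"employee": employee_id, "dept": to_dept}),
    ]
    for action, args in steps:
        for system in systems:
            if not system.call(action, **args):
                raise RuntimeError(f"{system.name} rejected {action}")
    return "transfer scheduled for " + effective_date

systems = [HRSystemStub("hr_core"), HRSystemStub("talent"), HRSystemStub("training")]
print(transboard("E42", "sales", "marketing", "2015-08-01", systems))
```

A production version would add Workforce Management for hourly workers, rollback on partial failure and an approval step, but the shape stays the same: one handshake, one date, automated fan-out to every system of record.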

Being able to Transboard people efficiently will be key for enterprises going forward. We already know enterprises need to become faster; if they can allocate people faster as needed, they will get faster, too. Not to mention the higher satisfaction of employees who can rotate through positions faster, without haggling managers, without fear of upsetting the current manager and so on.

Stay tuned for Part III.

2015 Roundup Of Analytics, Big Data & Business Intelligence Forecasts And Market Estimates


  • Salesforce (NYSE:CRM) estimates adding analytics and Business Intelligence (BI) applications will increase their Total Addressable Market (TAM) by $13B in FY2014.
  • 89% of business leaders believe Big Data will revolutionize business operations in the same way the Internet did.
  • 83% have pursued Big Data projects in order to seize a competitive edge.

Despite the varying methodologies used in the studies mentioned in this roundup, many share a common set of conclusions. The high priority in gaining greater insights into customers and their unmet needs, more precise information on how to best manage and simplify sales cycles, and how to streamline service are common themes.

The most successful Big Data use cases revolve around enterprises’ need to get beyond the constraints that hold them back from being more attentive and responsive to customers.

Presented below is a roundup of recent forecasts and estimates:

  • Wikibon projects the Big Data market will top $84B in 2026, attaining a 17% Compound Annual Growth Rate (CAGR) for the forecast period 2011 to 2026. The Big Data market reached $27.36B in 2014, up from $19.6B in 2013. These and other insights are from Wikibon’s excellent research of Big Data market adoption and growth. The graphic below provides an overview of their Big Data Market Forecast.  Source: Executive Summary: Big Data Vendor Revenue and Market Forecast, 2011-2026.

Wikibon big data forecast
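As a quick sanity check on the Wikibon figures above: the 17% CAGR spans the full 2011-2026 forecast period, so the implied rate from the 2014 base to the 2026 projection is lower, while the 2013-to-2014 jump was well above it. The arithmetic:

```python
# Sanity check on the Wikibon Big Data market figures quoted above ($B).
market_2013, market_2014, market_2026 = 19.6, 27.36, 84.0

# Year-over-year growth 2013 -> 2014
yoy = market_2014 / market_2013 - 1

# Implied compound annual growth rate from 2014 to the 2026 forecast
cagr_2014_2026 = (market_2026 / market_2014) ** (1 / 12) - 1

print(f"2013 -> 2014 growth: {yoy:.1%}")               # roughly 39.6%
print(f"implied 2014 -> 2026 CAGR: {cagr_2014_2026:.1%}")  # roughly 9.8%
```

In other words, the forecast assumes the torrid early-market growth cools considerably as the market matures.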

  • IBM and SAS are the leaders of the Big Data predictive analytics market according to the latest Forrester Wave™: Big Data Predictive Analytics Solutions, Q2 2015. The latest Forrester Wave is based on an analysis of 13 different big data predictive analytics providers including Alpine Data Labs, Alteryx, Angoss Software, Dell, FICO, IBM, KNIME.com, Microsoft, Oracle, Predixion Software, RapidMiner, SAP, and SAS. Forrester specifically called out Microsoft Azure Machine Learning as an impressive new entrant that shows the potential for Microsoft to be a significant player in this market. Gregory Piatetsky (@KDNuggets) has done an excellent analysis of the Forrester Wave Big Data Predictive Analytics Solutions Q2 2015 report here. Source: Courtesy of Predixion Software: The Forrester Wave™: Big Data Predictive Analytics Solutions, Q2 2015 (free, no opt-in).

Forrester Wave Big Data Predictive Analytics

  • IBM, KNIME, RapidMiner and SAS are leading the advanced analytics platform market according to Gartner’s latest Magic Quadrant. Gartner’s latest Magic Quadrant for advanced analytics evaluated 16 leading providers of advanced analytics platforms that are used to build solutions from scratch. The following vendors were included in Gartner’s analysis: Alpine Data Labs, Alteryx, Angoss, Dell, FICO, IBM, KNIME, Microsoft, Predixion, Prognoz, RapidMiner, Revolution Analytics, Salford Systems, SAP, SAS and Tibco Software. Gregory Piatetsky (@KDNuggets) provides excellent insights into shifts in the Magic Quadrant for Advanced Analytics Platforms rankings here. Source: Courtesy of RapidMiner: Magic Quadrant for Advanced Analytics Platforms, published 19 February 2015, analysts Gareth Herschel, Alexander Linden, Lisa Kart (reprint; free, no opt-in).

Magic Quadrant for Advanced Analytics Platforms

  • Salesforce estimates adding analytics and Business Intelligence (BI) applications will increase their Total Addressable Market (TAM) by $13B in FY2014. Adding new apps in analytics is projected to increase their TAM to $82B for calendar year (CY) 2018, fueling an 11% CAGR in their total addressable market from CY 2013 to 2018. Source: Building on Fifteen Years of Customer Success Salesforce Analyst Day 2014 Presentation (free, no opt in).

Salesforce Graphic
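Working backwards from the Salesforce figures above: an $82B TAM in CY2018 growing at an 11% CAGR from CY2013 implies a CY2013 base of roughly $49B. This is my back-of-the-envelope check, not a figure from the presentation:

```python
# Back-of-the-envelope check on the Salesforce TAM figures quoted above.
tam_2018, cagr, years = 82.0, 0.11, 5  # $82B TAM in CY2018, 11% CAGR over 5 years

# Discount the CY2018 TAM back to the implied CY2013 base
implied_tam_2013 = tam_2018 / (1 + cagr) ** years
print(f"implied CY2013 TAM: ${implied_tam_2013:.1f}B")  # roughly $48.7B
```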

  • 89% of business leaders believe big data will revolutionize business operations in the same way the Internet did. 85% believe that big data will dramatically change the way they do business. 79% agree that ‘companies that do not embrace Big Data will lose their competitive position and may even face extinction.’ 83% have pursued big data projects in order to seize a competitive edge. The top three areas where big data will make an impact in their operations include: impacting customer relationships (37%); redefining product development (26%); and changing the way operations is organized (15%). The following graphic compares the top six areas where big data is projected to have the greatest impact in organizations over the next five years. Source: Accenture, Big Success with Big Data: Executive Summary (free, no opt in).

Big Data Big Success Graphic


  • Customer analytics (48%), operational analytics (21%), and fraud & compliance (21%) are the top three use cases for Big Data. Datameer’s analysis of the market also found that the global Hadoop market will grow from $1.5B in 2012 to $50.2B in 2020, and financial services, technology and telecommunications are the leading industries using big data solutions today. Source: Big Data: A Competitive Weapon for the Enterprise.

Big Data Use Cases in Business

  • 37% of Asia Pacific manufacturers are using Big Data and analytics technologies to improve production quality management. IDC found manufacturers in this region are relying on these technologies to reduce costs, increase productivity, and attract new customers. Source: Big Data and Analytics Core to Next-Gen Manufacturing.

big data in manufacturing

  • Supply chain visibility (56%), geo-location and mapping data (47%) and product traceability data (42%) are the top three potential areas of Big Data opportunity for supply chain management. Transport management, supply chain planning, & network modeling and optimization are the three most popular applications of Big Data in supply chain initiatives. Source: Supply Chain Report, February 2015.

Big data use in supply chains

  • Finding correlations across multiple disparate data sources (48%), predicting customer behavior (46%) and predicting product or services sales (40%) are the three factors driving interest in Big Data analytics. These and other fascinating findings from InformationWeek’s 2015 Analytics & BI Survey provide a glimpse into how enterprises are selecting analytics applications and platforms. Source: Information Week 2015 Analytics & BI Survey.

factors driving interest in big data analysis

 


Progress Report - Teradata is alive and kicking and shows some good 'paranoid' practices


We had the opportunity to attend the Teradata Influencer Summit held at the beautiful L’Auberge Del Mar in northern San Diego. When mentioning to other influencers earlier in the week that I would be en route to that meeting, I mostly gathered incredulous stares and comments like ‘are they still around and interesting?’. I missed the first half day, but the next one and a half days gave a good insight into where Teradata is and where they want to be in the next years.

 
 
 

So here are my top 3 takeaways from the event:

A compelling vision – In recent times Teradata had been blamed for a lack of vision and / or a lack of sense of realism. I followed the Summit two years ago from the sidelines, and back then an executive had an elaborate presentation to show why Hadoop would be a failure – or only a temporary fashion that would disappear as fast as it showed up. But quickly afterwards (and some colleagues say the Summit helped) Teradata reversed course, moving from ‘struggle’ to ‘snuggle’ vis-à-vis Hadoop, embracing more of Hadoop and partnering with e.g. MongoDB already a year ago (and again being a key sponsor of this year’s MongoDB World – also this week, event report here). All slides at the summit had Teradata, Aster and Hadoop based offerings. As a matter of fact the Teradata UDA includes all three database paradigms.
 
The vision of the 'Sentient' Enterprise
And along these capabilities Teradata is using its by now proven QueryGrid capabilities, following the ‘co-existence’ route with Hadoop – the ‘federated query’ strategy that practically all vendors older than five years pursue. As part of that, Teradata sees the world separated into three categories of systems – tightly coupled (Teradata), loosely coupled (Aster) and non-coupled (Hadoop). An interesting approach to the system landscape, which Ratzesberger used to explain the vision of the ‘sentient enterprise’. There are pros and cons with federation, see the MyPOV section below.
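Conceptually, a federated query layer routes the same predicate to whichever system holds each slice of the data and merges the results. A toy illustration of that routing idea; the three in-memory ‘systems’ merely stand in for the tightly, loosely and non-coupled tiers described above, and nothing here reflects the actual QueryGrid API:

```python
# Toy illustration of federated query routing across three storage tiers,
# standing in for the tightly / loosely / non-coupled systems described above.

class DataSystem:
    """Stand-in for one storage tier holding a slice of the enterprise's data."""
    def __init__(self, name, rows):
        self.name, self.rows = name, rows
    def query(self, predicate):
        # In a real federation the predicate is pushed down to the remote engine.
        return [row for row in self.rows if predicate(row)]

def federated_query(systems, predicate):
    """Push the predicate to every system and merge the partial results."""
    results = []
    for system in systems:
        results.extend(system.query(predicate))
    return results

systems = [
    DataSystem("tightly_coupled", [{"customer": "A", "spend": 900}]),
    DataSystem("loosely_coupled", [{"customer": "B", "spend": 300}]),
    DataSystem("non_coupled",     [{"customer": "C", "spend": 1200}]),
]
high_value = federated_query(systems, lambda row: row["spend"] > 500)
print([row["customer"] for row in high_value])  # ['A', 'C']
```

The concern raised in the MyPOV section maps directly onto this sketch: any insight that requires joining rows *across* the tiers, rather than filtering within each, is exactly what a naive federation can miss.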


It’s coming together – Not sure when Teradata came up with the vision, but obviously work needs to be done here. Teradata needs to write the endpoint to execute queries on the Hadoop side, and it currently supports Hive for that. But Hive has known benefits and challenges, and Teradata will address the challenges soon – stay tuned on that topic. The other critical piece is a tool that can transport data back and forth, extract from a variety of data sources as desired, transform information, apply rules ‘in flight’ etc. Teradata calls it Listener, but it is much more. In the old times one would call it a bona fide ETL tool, only that today it scales to 21st century requirements with streaming data, humongous data volumes, BAM and CEP like support etc. – a key product / tool to make the Teradata UDA fly.
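Applying rules ‘in flight’, as described above, is in essence a transform pipeline sitting between sources and targets: each record passes through a chain of rules while the data is moving, with some rules transforming and others dropping records. A minimal generator-based sketch, purely illustrative and not the Teradata Listener API:

```python
# Minimal sketch of in-flight rule application on a stream of records,
# the kind of job a Listener-style ETL/streaming tool performs.
# Purely illustrative; not any vendor's actual API.

def apply_in_flight(stream, rules):
    """Apply each rule (record -> record, or None to drop) as data flows through."""
    for record in stream:
        for rule in rules:
            record = rule(record)
            if record is None:   # a rule may drop the record entirely
                break
        if record is not None:
            yield record

# Two example rules: filter out test traffic, normalize the amount field.
drop_test_events = lambda r: None if r.get("env") == "test" else r
normalize_amount = lambda r: {**r, "amount": round(float(r["amount"]), 2)}

incoming = [
    {"env": "prod", "amount": "19.999"},
    {"env": "test", "amount": "1.0"},
    {"env": "prod", "amount": "5"},
]
print(list(apply_in_flight(incoming, [drop_test_events, normalize_amount])))
```

Because `apply_in_flight` is a generator, nothing is buffered: records are transformed one at a time as they stream through, which is the property that lets this pattern scale to the data volumes mentioned above.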
 
 
 
The Analytical Application Platform

Self-disruption built in – All successful technology companies struggle with disrupting themselves. It’s the price of success: harvesting the benefits of maturing technology creates the potential for technology disruption. Co-President Wimmer remarked that all successful database companies, the 30+ year olds, are set up for disruption today. Usually technology players react to disruption risks with acquisitions, trying to insert new product capabilities and employee talent, and these approaches are usually less successful than intended for a variety of reasons.

Teradata has taken a unique approach here, allowing the acquired ThinkBigA to keep operating independently, and even going a step further by funding its global expansion. Unique, because ThinkBigA does pure best-of-breed BigData projects, with complete independence in choosing the technologies customers desire. As such ThinkBigA cannibalizes Teradata customers to a certain extent, and remarkably Teradata does nothing to stop that.
 
The Teradata free ThinkBigA architecture
So kudos to the Teradata management, which, coming off a close-to-death experience in regards to the Hadoop phenomenon, now has a built-in disruptor with ThinkBigA. Quite a gutsy move, coming close to e.g. an enterprise ERP vendor allowing a subsidiary services company to sell services on competitor products to its customers. What? Granted, the ThinkBigA case is not that drastic, as the provider ‘only’ uses open source, but the disruption is at hand. The benefits for Teradata are twofold: the vendor learns about real customer demands and the quality of open source BigData products, which it can use for its own R&D strategy and effort. Secondly, Teradata has a way to remain relevant with (and extract revenues from) customers it otherwise would have lost to BigData / open source players anyway.
 
ThinkBigA BigData Project recommendations, notice bullet #5 - emphasis added
 

MyPOV

Teradata is alive and well, with solid product plans for the immediate future. The three tiered vision of federated queries and data is common for vendors with an existing, substantial business. It resonates well with enterprises that are more conservative both in regard to technology adoption and security needs.

On the concern side, the jury is still out on whether enterprises operating with a federated storage and query approach will end up missing insights, or only getting served suboptimal insights. Conceptually one can make the case that some insights can only be caught with single source storage, one place to be queried etc., but whether the difference between the two approaches is substantial for the success of a business remains to be seen.

So it is a key second half of the year for Teradata on the product side, with a number of substantial announcements to come and key capabilities in the pipeline. Teradata may have been down, but it’s back on its feet and throwing punches. Only the paranoid survive, and Teradata has gotten a little more paranoid - in a positive way. Good to see for prospects, customers and the overall ecosystem, as nothing fuels innovation better than competition. We will be watching.

Collaboration Around Files - A Case Study On Working With Clients


I recently had the opportunity to speak with one of the world's largest advertising agencies about how they collaborate with their clients. Details of their solution are covered in the Constellation Research case study "Powering Global Client Collaboration with Secure File Sharing, How a Leading Ad Agency Used Egnyte Adaptive Enterprise File Services to Work with Clients". Constellation clients can download the full report here.

Here are a few highlights:

  • One of the main tasks of a digital agency is working with clients to create the multimedia assets used in their sales and marketing campaigns. The process is very collaborative, starting with gathering requirements, then brainstorming, multiple rounds of drafts and edits, followed by review, approval and production. Before Egnyte, it was difficult to share files and discuss changes via email. 
  • The CIO along with other department leads started a multi-year technology refresh that focused on platforms with strong cloud-based and mobile features. The project began by replacing their legacy email system with Google’s productivity suite, Google Apps for Work. Google Apps was used not only for email and calendar, but also word processing, spreadsheets, file-sharing and internal web-conferencing all via a single integrated offering. 
  • By switching to Fuze for real-time collaboration with clients, the Agency is able to easily share files from Egnyte into a Fuze video conference, where anyone can participate from his/her computer, phone or tablet. In these meetings, designs are reviewed and improved onscreen, eliminating the typical back and forth exchanges that lead to confusion and costly mistakes. 

Constellation clients can download the full report Powering Global Client Collaboration with Secure File Sharing to learn more about the challenges, solutions, benefits, takeaways and recommendations.

 


Teradata Influencer Summit Highlights


Herman Wimmer, Co-President, kicks off the event. Herman says that the guiding priorities of Teradata are:

Key Principle #1: Analytic ecosystem: Teradata, DB, UDA, Real-time, Fabric Architecture

Key Principle #2: Big Data Technologies: Aster, Hadoop, Big Data Apps, Apps Center, Open Source Contribution and leverage

Key Principle #3: Cloud for Analytics

Key Principle #4: Enterprise Class Production Analytics, Hybrid Implementations (Public / Private), Broader Market Penetration

Key Principle #5: Consulting, Big Data Consulting, Analytics Consulting, Managed Services

Key Principle #6: Future Markets: Healthcare, Government, Innovation, People, Passion….

In a digital world, where business models are changing very fast (note: not everyone agrees with this or sees it), businesses will need real-time data to make better decisions and to make the customer experience the best it can be. Companies that used to compete on selling cars are really competing on the experience, or what it feels like to drive / own the car. Engineers, marketers and customer service professionals can guess what is making the customer happy and driving that customer experience, or they can use the technologies available to drive better business. It is not easy, but Teradata is definitely simplifying it. And it takes investment of people, process, time, technology and passion.

My POV: Where is your company with respect to going beyond conversations of “big data” for “big data’s” sake and truly embracing the data in the business, where it can make a difference? Who should lead this? CEOs that “get the value of data” are looking to someone in their organization to do it – a CIO, CTO, CMO or customer service professional to lead data revelations. And if one of those “titles” doesn’t step up and lead or co-lead with other executives, CEOs will find and hire Chief Digital Officers or Chief Customer Officers to make it happen. It is an opportunity, but it is not without risk. Done well, it brings huge financial rewards to the companies that master it, and will most likely help that person’s career. If you are in a position that isn’t data-centric, then it’s up to you to turn it into one.

@drnatalie, VP and Principal Analyst, Covering Marketing, Sales, and Customer Service using Big Data Analytics to Deliver Amazing Customer Experiences

 


Cisco announces intent to acquire Piston Cloud


Continuing the trend toward managed cloud services, Cisco announced its intent to acquire Piston Cloud. This announcement comes after the announcements that EMC acquired Virtustream (my take here) and IBM acquired Blue Box (my take here).
 
Let’s dissect the Cisco blog (it can be found here) about the announcement:

Cloud computing has fundamentally altered the IT landscape: dramatically boosting IT agility, while lowering costs. To realize the business advantages of cloud, organizations are shifting to a hybrid IT model—blending private cloud, public cloud, and on-premise applications.

MyPOV – It looks like the demand for hybrid cloud has gotten much stronger, considering this announcement.

To help customers maintain control and compliance in this hyper-connected, hyper-distributed IT environment, Cisco and its partners are building the Intercloud—a globally connected network of clouds. Today, Cisco is taking another important step towards realizing our ambitious Intercloud vision.

MyPOV – Good for Cisco to connect the intended acquisition with Intercloud – my analysis of the original announcement is here.

We are pleased to announce our intent to acquire Piston Cloud Computing, which will help accelerate the product, delivery, and operational capabilities of Cisco Intercloud Services.

Paired with our recent acquisition of Metacloud, Piston’s distributed systems engineering and OpenStack talent will further enhance our capabilities around cloud automation, availability, and scale. 


MyPOV – Piston Cloud’s product delivers a ‘cloud OS’ that can ‘plug in’ a number of cloud services. As such Piston Cloud enables the use of a large number of different technologies, something that enterprises care about, as they do not know for sure which technologies they want to use for their next generation application projects. Being able to run these from the same platform is a significant value proposition that Piston Cloud has delivered. Given the heterogeneous nature of the environments traditional Cisco customers operate, certainly a good move. My analysis of the mentioned Metacloud acquisition is here.

The acquisition of Piston will complement our Intercloud strategy by bringing additional operational experience on the underlying infrastructure that powers Cisco OpenStack Private Cloud.

MyPOV – So Piston Cloud will become part of the Cisco OpenStack offering – no surprise here. Good to have the clarity.

Additionally, Piston’s deep knowledge of distributed systems and automated deployment will help further enhance our delivery capabilities for customers and partners.

MyPOV – Fair for Cisco to acknowledge that Piston Cloud has some serious chops, bringing together diverse technologies such as Docker and Spark.

To bring the world of standalone clouds together, Cisco and our partners are building the Intercloud. The Intercloud is designed to deliver secure cloud services everywhere in the world. Our enterprise-class portfolio of technology and cloud services gives customers the choice to build their own private clouds or consume cloud services from a trusted Intercloud Provider. The Intercloud provides choice of services, all with compliance and control. In a nutshell: we’re delivering cloud the way our customers need it.

MyPOV – The marketing line for Cisco Intercloud. But the intended acquisition of Piston Cloud certainly lends more credibility and, more crucially, more capability in this direction. One can only speculate why the two have come together only now.

Piston will join our Cloud Services team under the leadership of Faiyaz Shahpurwala, senior vice president, Cloud Infrastructure and Managed Services Organization.

MyPOV – Always good to hear where acquisitions are going to be anchored organizationally. I have shared my concerns about services leadership building products, but that is a general concern and I am happy to give Shahpurwala and the team the benefit of the doubt. The future will tell.

Overall POV

It can’t be a coincidence that EMC, IBM and Cisco are all completing acquisitions in the managed cloud / hybrid cloud space. It makes sense for all three vendors to push the hybrid agenda, as all three of them have existing sales channels into local data centers. To a certain point this movement indicates a failure of the ‘public only’ cloud vendors (e.g. AWS and Google).
 
Not that this can be held against Cisco, which tries to differentiate itself by ‘playing nice’ with the demands, needs (and maybe angst) of CIOs in regard to the public cloud. The interesting development is that CIOs seem comfortable letting 3rd parties manage their private cloud infrastructure, but still want to see their data centers being utilized. Write-down time frames may play a role here, but overall it is an interesting development that I will spend more time analyzing going forward.

On the concern side, it is another acquisition, and Cisco will have to hold on to the Piston Cloud team’s talent. But Cisco is an experienced acquirer and knows how to make sure that the key people stay around.

So overall a good move by Cisco; the flexibility of Piston Cloud has the potential to be a differentiator for Cisco Intercloud. Congratulations.

 


Market Move - IBM gets into private cloud (services) with Blue Box acquisition

The need to manage private clouds seems to be really heating up on the demand side – last week EMC announced plans to acquire Virtustream (see analysis here), and today both IBM and Cisco announced their respective purchases / intents to purchase of Blue Box and Piston Cloud.
 
Let’s dissect the IBM press release (it can be found here):

ARMONK, N.Y - 03 Jun 2015: IBM (NYSE: IBM) today announced it has acquired Blue Box Group, Inc., a managed private cloud provider built on OpenStack.
Blue Box is a privately held company based in Seattle that provides businesses with a simple, private cloud as a service platform, based on OpenStack. Customers benefit from the ability to more easily deploy workloads across hybrid cloud environments. Financial details were not disclosed.
MyPOV – Another acquisition in Seattle. Blue Box originally started as a website-hosting provider and has evolved into an OpenStack managed private cloud provider.

Enterprises are seeking ways to embrace all types of cloud to address a wide range of workloads. Today’s announcement reinforces IBM’s commitment to deliver flexible cloud computing models that make it easier for customers to move to data and applications across clouds and meets their needs across public, private and hybrid cloud environments. With Gartner forecasting that 72 percent of enterprises will be pursuing a hybrid cloud strategy this year [1], it is increasingly important for companies to leverage multiple models while maintaining consistent management across their cloud platforms.
MyPOV – For the longest time the race was about getting public cloud capabilities going. It looks like by mid-2015 enterprises are ready to move to the cloud, albeit in a more conservative fashion, with more private cloud aspects than ever before. With the bare metal of SoftLayer, IBM is able to give customers higher confidence levels than most competitors, but it seems that this was not enough – hence the Blue Box acquisition.

Through Blue Box, IBM will help businesses rapidly integrate their cloud-based applications and on-premises systems into OpenStack-based managed cloud. Blue Box also strengthens IBM Cloud’s existing OpenStack portfolio, with the introduction of a remotely managed OpenStack offering to provide clients with a local cloud and increased visibility, control and security.
MyPOV – This paragraph unveils the ‘crown jewels’ – the capability to manage an OpenStack system deployed remotely. Enterprises may still want to utilize their data centers and see their servers, but are more open to having them managed remotely.

This move further accelerates IBM’s commitment to open technologies and OpenStack. IBM has 500 developers dedicated to working on open cloud projects to bring new cloud innovations to market. With Forrester Research recently finding that more than twice as many firms use or plan to use IBM Cloud as their primary hosted private cloud platform than the next closest vendor [2], Blue Box is a strategic fit into the IBM Cloud portfolio.
MyPOV – No surprise – OpenStack compatibility is important and makes Blue Box a good fit.

Blue Box can enhance and complement developer productivity by:

· Speeding delivery of applications and data through simplified and consistent access to public, dedicated and local cloud infrastructures

· Supporting managed infrastructure services across hybrid cloud environments and IBM’s digital innovation platform, Bluemix

· Offering a single management tool for OpenStack-based private clouds regardless of location

MyPOV – Good summary of the Blue Box capabilities. Blue Box partnered with the small PaaS vendor Mendix; the mention of Bluemix here may make those deployments uncertain – but we are not aware of a statement in this regard. In our view it may well be worth IBM’s while to look at the Mendix capabilities.

This acquisition will enable IBM to deliver a public cloud-like experience within the client’s own data center, relieving organizations of the burden of traditional private cloud deployments.
MyPOV – Underlines the value proposition – the customer keeps the data center, while the management goes to IBM. The acquisition reminds me of Cisco’s recent acquisition of Metacloud (see here), which also had a significant service aspect.

“IBM is dedicated to helping our clients migrate to the cloud in an open, secure, data rich environment that meet their current and future business needs,” said IBM General Manager of Cloud Services Jim Comfort. “The acquisition of Blue Box accelerates IBM’s open cloud strategy making it easier for our clients to move to data and applications across clouds and adopt hybrid cloud environments."

“No brand is more respected in IT than IBM,” said Blue Box Founder and CTO Jesse Proudman. “Blue Box is building a similarly respected brand in OpenStack. Together, we will deliver the technology and products businesses need to give their application developers an agile, responsive infrastructure across public and private clouds. This acquisition signals the beginning of new OpenStack options delivered by IBM. Now is the time to arm customers with more efficient development, delivery and lower cost solutions than they've seen thus far in the market.”

MyPOV – The usual quotes – no comment needed.

IBM currently plans to continue to support Blue Box clients and enhance their technologies while allowing these organizations to take advantage of the broader IBM portfolio. […]
MyPOV – Key statement. Blue Box customers should contact IBM as soon as possible to make sure they can keep the services that matter to them.



 

Overall MyPOV

A good move for IBM that opens new service offerings in the IBM Cloud portfolio. Given the recent acquisitions at EMC and Cisco, it looks like the private cloud is alive and well. To a certain point that is a failure of ‘pure’ public cloud players like AWS and Google; it looks like they have not been able to convince CIOs to move all their load to a public cloud setup. It will be interesting to see how the mix of private cloud – e.g. the ones managed with Blue Box – vs. public cloud ends up in a few years. And IBM listens to customers and wants more cloud revenue; if it can get that revenue by running private clouds, IBM will of course do that.

 

 

 


Couchbase unveils N1QL and updates the NoSQL Performance Wars

A number of press releases by Couchbase crossed the wires, and they may have a deep impact on the NoSQL / big data market – so let’s look at them quickly.
The press releases can be found here.
 
A new programming language – N1QL – I have been critical of attempts to create new programming languages in the past (see e.g. my take on SAP River here). In general the world does not need more programming languages, unless they can clearly produce benefits for programmers and application architecture. In the case of N1QL (pronounced ‘nickel’), Couchbase tries to bring SQL to document / JSON formats. That is new and addresses a pain point / tradeoff that had to be made in the past when choosing a database for a next-generation application. As N1QL is SQL compatible, Couchbase wants to tap into the large SQL developer ecosystem. The question will be how much understanding of document formats the SQL developer will need to acquire to build performing applications, and how much of that can be automated by the Couchbase framework for the programming language. From the endorsements that Couchbase has garnered in the press release it is clear that there is certainly value being created by N1QL.
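To make the idea concrete: N1QL’s promise is that a familiar declarative, SQL-style query can run over schemaless JSON documents. A rough stand-in in plain Python – this is not N1QL itself, just an illustration of the SQL-over-JSON concept using the standard library; the documents and the query are made up:

```python
import json

# Schemaless JSON documents, as stored in a document database.
docs = [
    json.loads('{"type": "beer", "name": "IPA", "abv": 7.2}'),
    json.loads('{"type": "beer", "name": "Lager", "abv": 4.8}'),
    json.loads('{"type": "brewery", "name": "ACME"}'),
]

# The kind of N1QL query this emulates (illustrative only):
#   SELECT name FROM bucket WHERE type = "beer" AND abv > 5
def select(docs, field, where):
    """Project one field from all documents matching a predicate."""
    return [d[field] for d in docs if where(d)]

strong_beers = select(
    docs, "name",
    lambda d: d.get("type") == "beer" and d.get("abv", 0) > 5)
print(strong_beers)  # ['IPA']
```

Note how the predicate must tolerate missing fields (`d.get(...)`) – that tolerance for absent attributes is exactly what a SQL developer has to internalize when moving from rigid schemas to documents.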
 
I should learn more later today in San Jose about how Couchbase can show efficiency gains with N1QL, and will keep you posted.
 
Performance Wars – Round 2 – Earlier this week MongoDB published a performance report done by a 3rd party that showed superior performance of MongoDB over Cassandra and Couchbase (see Event Report here). No surprise, Couchbase has shot back, claiming that the performance report is not fully transparent on all aspects of the test and, more importantly, that its own tests show superior performance and TCO for Couchbase. We will be at Couchbase’s user conference later today and will certainly learn more on the topic.

From these spats we can learn that performance really matters for NoSQL databases, which again shows that enterprises are moving their next-generation applications into more mission- and performance-critical use cases than before. And that’s a good sign for enterprises and the industry.

And finally: as useful and important as performance tests are, they always struggle on the comparison side. And every customer situation is different, so we recommend taking them as one measurement point, but strongly recommend that customers run their own benchmarks for their critical performance pieces.
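Running your own benchmark does not have to be elaborate – timing your own critical operations against the candidate store is a solid starting point. A minimal, database-agnostic sketch (the in-memory dict is just a stand-in for whichever client you are actually evaluating):

```python
import time
import statistics

def benchmark(op, iterations=10_000):
    """Time one operation repeatedly; return the median latency in microseconds.
    Median is preferred over mean because it is robust against GC pauses
    and other one-off outliers."""
    samples = []
    for i in range(iterations):
        start = time.perf_counter()
        op(i)
        samples.append((time.perf_counter() - start) * 1e6)
    return statistics.median(samples)

store = {}  # stand-in for the NoSQL client under test
write_us = benchmark(lambda i: store.__setitem__(f"key-{i}", {"n": i}))
read_us = benchmark(lambda i: store.get(f"key-{i}"))
print(f"median write: {write_us:.2f} us, median read: {read_us:.2f} us")
```

The key point: swap the dict for your real client and your real document shapes and access patterns, since vendor benchmarks rarely match either.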

MyPOV

A good start for Couchbase Connect. Couchbase also unveiled the preview of its 4.0 server – so there will be lots of things to talk about at the 49ers stadium in Santa Clara for the next few days. Stay tuned.


Updates to HP Helion Portfolio - a commentary

HP is having its yearly HP Discover conference right now in Las Vegas, and (finally - we have been waiting for some time) has clarified its cloud plans.
 
 
The press release can be found here - let's dissect the press release in our usual commentary style:

LAS VEGAS – June 2, 2015 – Today at HP Discover 2015, HP announced updates to the HP Helion portfolio that will help enterprises realize the benefits of transitioning to a hybrid infrastructure. HP introduced HP Helion CloudSystem 9.0, the next release of its flagship integrated enterprise cloud solution, and enhancements to HP Helion Managed Cloud Services to manage enterprise workloads in a secure hosted cloud environment.

MyPOV – It has been quiet around HP Helion for a while – so it is good to see more activity, right in time for HP Discover 2015. The wording about ‘helping to transition to a hybrid infrastructure’ is interesting, as most enterprises already live and breathe in one today. But certainly there are a number of (HP customer?) enterprises left that are only now looking at this.

Enterprises today spend the majority of IT budgets – by some estimates as high as 90 percent – on maintaining existing systems. HP estimates enterprises can reduce IT maintenance costs by approximately 40 percent by migrating existing systems to a cloud-based architecture. This can free-up the capital enterprise IT departments need to modernize application development and pursue new revenue generating opportunities.

MyPOV – A very good pitch for cloud in general; let’s read on to see if HP can make it tangible and unique why customers should use Helion.

“Enterprise customers have a range of needs in moving to the cloud—some need to cloud-enable traditional workloads, while others seek to build next generation ‘cloud native’ apps using modern technologies like OpenStack®, Cloud Foundry® and Docker,” said Bill Hilf, senior vice president, HP Helion Product and Service Management. “The expanded support for multiple hypervisors and cloud environments in HP Helion CloudSystem 9.0 gives enterprises and service providers added flexibility to gain cloud benefits for their existing and new applications.”

MyPOV – Hilf mentions what is making this announcement unique – support for multiple hypervisors and cloud environments (aka IaaS providers).

A single, flexible cloud solution for diverse cloud requirements
HP Helion CloudSystem is an integrated, end-to-end, private cloud solution, built to help customers realize the benefits of a hybrid infrastructure—designed for traditional and cloud native workloads, enabling automation, orchestration and control across multiple heterogeneous clouds, workloads and technologies.


MyPOV – This is the first time it comes across to me that HP Helion CloudSystem is a private cloud solution (only). But then HP says it also allows cross-cloud orchestration. So it is actually a hybrid cloud management system.

HP Helion CloudSystem 9.0 expands support for multiple hypervisors and multiple clouds to provide enterprises and service providers with maximum flexibility. Additionally,

HP Helion CloudSystem 9.0 integrates HP Helion OpenStack and the HP Helion Development Platform to provide customers an enterprise grade open source platform for cloud native application development and infrastructure. […]


MyPOV – So CloudSystem 9.0 will orchestrate loads across IaaS providers and hypervisors, can use OpenStack, and finally integrates with the HP PaaS. I hope I got that right.

HP Helion CloudSystem 9.0 features/benefits include:

· Simultaneous support for multiple cloud environments, including Amazon Web Services (AWS), Microsoft Azure, HP Helion Public Cloud, OpenStack technology and VMware, with the ability to fully control where workloads reside


MyPOV – When HP delivers this, it will be very attractive to enterprises.
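The promise of ‘fully controlling where workloads reside’ boils down to a placement policy: for each workload, pick the cloud that satisfies its constraints. A hypothetical sketch of such a policy – provider names, attributes, and costs are made up for illustration and are not HP’s actual model:

```python
# Hypothetical multi-cloud placement: choose the cheapest provider
# that satisfies a workload's residency and capability constraints.

providers = [
    {"name": "aws",            "region": "us", "openstack": False, "cost": 1.0},
    {"name": "azure",          "region": "eu", "openstack": False, "cost": 1.1},
    {"name": "helion-private", "region": "eu", "openstack": True,  "cost": 1.4},
]

def place(workload, providers):
    """Return the cheapest provider meeting the workload's constraints."""
    candidates = [p for p in providers
                  if p["region"] == workload["required_region"]
                  and (not workload["needs_openstack"] or p["openstack"])]
    if not candidates:
        raise ValueError("no provider satisfies the constraints")
    return min(candidates, key=lambda p: p["cost"])["name"]

# An EU-resident workload that needs OpenStack must land on the private cloud.
print(place({"required_region": "eu", "needs_openstack": True}, providers))
# An unconstrained EU workload simply goes to the cheapest EU option.
print(place({"required_region": "eu", "needs_openstack": False}, providers))
```

This is why multi-cloud support matters: the more backends the orchestrator can target, the more often a cheaper or better-fitting candidate exists for any given constraint set.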

· The latest release of HP Helion OpenStack, exposing OpenStack software APIs to simplify and speed development and integration with other clouds and offering developer-friendly add-ons with the HP Helion Development Platform based on Cloud Foundry

MyPOV – So we have support for the OpenStack environment and for the HP PaaS, which uses Cloud Foundry (note that Cloud Foundry itself is also adding cross-cloud capabilities; just last week it announced support for Microsoft Azure).

· Support for multiple hypervisors, now including Microsoft Hyper-V, Red Hat KVM, VMware vSphere, as well as bare metal deployments, offering customers additional choice and avoiding vendor lock-in

MyPOV – Equally a good move – supporting a number of leading hypervisors. For now we notice the absence of the Xen hypervisor that AWS uses.

· Support for AWS-compatible private clouds through integration with HP Helion Eucalyptus, giving customers the flexibility to deploy existing AWS workloads onto clouds they control

MyPOV – That should take care of the AWS Xen support.

· Support for unstructured data through the Swift OpenStack Object Storage project

MyPOV – Good to integrate another OpenStack initiative. Underlines HP’s commitment to OpenStack.

· The latest version of HP Cloud Service Automation, providing the management capabilities to control hybrid cloud environments and a built-in path to support distributed compute, efficient object storage and rapid cloud native application development

MyPOV – Good to see an automation offering to tie this all together, this is an offering with substantial complexity and enterprises will need at least a good automation tool to pull this hybrid cloud offering off.

· An intuitive setup model delivered as a virtual appliance, allowing for installation in hours

MyPOV – Good move. Let’s not forget HP sells hardware and customers buy hardware from HP – so appliances make adoption and setup significantly easier.

HP Helion CloudSystem 9.0 is available as standalone software supporting a multiple vendor hardware environment or as a fully-integrated blade-based or hyper-converged infrastructure with HP ConvergedSystem. Availability is planned for later this year. […]

MyPOV – Interesting – it would be good for HP to name the ‘other’ hardware environments and what kind of support customers can expect, compared with running CloudSystem on HP hardware. This is the ‘seam’ for hardware-vendor-based OpenStack offerings, and it will be key to see how far HP goes here.

HP Financial Services’ IT investment and consumption offerings are available to help enterprises acquire HP Helion CloudSystem 9.0, in line with their hybrid IT transformation strategy.

MyPOV – Always a good move and a substantial part of HP hardware sales.

“The digital age is disrupting the entertainment industry. Consumers want entertainment on demand, where they want it, on any device. To address this challenge, IT must take the lead on the digital journey,” said John Herbert, executive vice president and CIO, 20th Century Fox. “The Fox Media Cloud is built on HP Helion CloudSystem, supporting the distribution of broadcast quality TV episodes and full length feature films. HP Helion CloudSystem has helped our transition to an internal service provider model, enabling the delivery of hybrid cloud services while maintaining control, to provide new services faster, while ensuring high reliability and performance. This has provided us the ability to support the tremendous growth of our digital businesses while saving millions.”

MyPOV – A nice and good quote from a blue-chip customer, but (sorry, 20th Century Fox) not the A+ player in the streaming media space. It is interesting how media companies and IaaS / cloud providers search for and find each other. The media companies need to rebuild their IT infrastructure for the new streaming and on-demand business model, and cloud elasticity is a great solution for this. A great challenge, too – so it will be good to get some live stats from 20th Century Fox and HP in a few quarters.

Customers increasingly rely on experienced channel partners to help guide them on their cloud implementations. With 56 percent of enterprise and mid-market customers working through a partner to build their private clouds, HP CloudSystem 9.0 presents a significant opportunity for HP partners.

“Comport is automating many operations required to support hospital IT infrastructure,” said Jack Margossian, president and CEO, Comport Healthcare Solutions. “It’s the beginning of an important transition in the data center, from a cost center to an efficient service center. IT can spin up and manage applications and data that support patient care in minutes instead of months, using toolsets they know. Using HP Helion CloudSystem, we are able to free up IT resources and reduce costs, enabling us to build new services while still providing for public cloud or SaaS resources. We look forward to introducing HP Helion CloudSystem 9.0 to our customers, with its additional flexibility and manageability.”


MyPOV – It is clear that HP needs (and wants) partners to support Helion. The business of many of the traditional hardware partners is in turmoil, and most of them are getting into some kind of cloud business. As such they are looking for cloud operating systems that are widely connected and enable them to run on many different IaaS platforms. That’s where Helion certainly is an attractive value proposition. But it does not allow HP by itself to compete with e.g. AWS, Azure or Google Cloud Platform, as it will be a partner offering, and levels of service and economies of scale on the purchasing side are challenging in this go-to-market approach.

An enhanced enterprise-grade virtual cloud environment
HP Helion Managed Cloud Services provides enterprise security and high availability capabilities needed to run mission critical business applications, while meeting customers’ data sovereignty, regulatory and compliance requirements, and backed by enterprise grade service level agreements.


MyPOV – Data residency is a key aspect and consideration; HP needs to show which locations and jurisdictions it will support, and when. IBM has thrown down the challenge to all IaaS providers here with its goal to be present in 48 locations by the end of 2015.

To broaden this offering and deliver greater value and flexibility to customers, HP Helion Managed Cloud Services will launch into beta HP Helion OpenStack Managed Private Cloud and HP Helion Eucalyptus Managed Private Cloud, both of which will be consumable as a service via an easy access portal.

In addition to these new beta offerings, HP Helion Managed Cloud Services will support the development of cloud native applications within a managed cloud service via the HP Helion Development Platform and automation of select virtual private cloud services.


MyPOV – This part of the press release seems repetitive – and HP will have to explain when to use which Managed Private Cloud offering, especially when Eucalyptus capabilities are also desired, and how to combine the two offerings. I am not sure why HP has not combined them already.

HP Helion Managed Cloud Services features/benefits include:

· New automated provisioning capabilities through a self-service portal based on HP Cloud Service Automation, enabling clouds to be deployed more quickly

· Support for multiple platforms to enable hybrid cloud proof-of-concepts using HP Helion OpenStack and HP Helion Eucalyptus

· Cloud native application development capabilities through integration with the HP Helion Development Platform, allowing enterprises to rapidly develop, deploy and deliver cloud native apps

HP Helion Managed Cloud enhancements are planned for availability later this year.


MyPOV – Another good recap of what can be done across HP Helion OpenStack and / or Eucalyptus and what you can build on HP’s PaaS.

Global enterprise support when it counts
HP offers Global Datacenter Care and Support in over 130 countries. Customers can call 24x7x365 and receive support in more than 30 languages, for any hardware platform, OS or hypervisor.


MyPOV – This is an important service that enterprises clearly value. It is not clear whether it extends to partner data centers, or which countries are covered by HP-owned data centers. That is what HP competes on, and it will be key to clarify sooner rather than later.
 

Overall MyPOV

A very ambitious plan with broad functional reach: HP wants to make Helion a ‘Switzerland’ for cloud services – from support for competitors’ IaaS and different data center deployment options all the way down to the hypervisor. But it is an announcement, and we look forward to HP delivering all of this, which certainly can’t be easy, but will make HP an attractive player in cloud. Enterprises that have been slow to adopt cloud will certainly take notice, as the offering seems low risk because it keeps many deployment doors open.

On the concern side – HP will need to deliver and reach economies of scale similar to its competitors’. By being open to 3rd-party IaaS, HP creates value for customers, but loses load, which is critical for reaching economies of scale in the cloud market. The partner-based approach is key for HP, too – but it has its challenges when it comes to consistent levels of service and delivery.

On the bright side, it looks like HP has found its running shoes and is racing towards an attractive goal. But it needs to get there, get partners on board, and get customers to adopt and build next-generation applications on its cloud. It’s a long way, but good for HP to be on it.

 
