
News Analysis - Teradata Launches First Enterprise Support for Presto


We attended the Teradata Influencer Summit last week in Del Mar, north of San Diego (Progress Report here), and fresh off the heels of this event, Teradata announced support for Presto.
 

Let’s dissect the press release (find it here) in our customary style:

SAN DIEGO – June 8, 2015 – To make it easier for more users to extract insights from data lakes, Teradata (NYSE: TDC), the big data analytics and marketing applications company, today announced a multi-year commitment to contribute to Presto’s open source development and provide the industry’s first commercial support. Based on a three-phase roadmap, Teradata’s contributions will be 100 percent open source under the Apache® license and will advance Presto’s modern code base, proven scalability, iterative querying, and the ability to query multiple data repositories.

MyPOV – An interesting and good move by Teradata: partnering with a promising open source initiative that has good DNA (originally from Facebook) and proven practical usage (e.g. Airbnb, Facebook and more). As part of the Teradata UDA, the vendor used to execute its federated queries on the Hadoop side via Hive; with Presto it gets more generic and more powerful support for these queries. It is also good to see that Teradata will be a ‘good citizen’ of open source, becoming a (substantial) contributor to Presto. Equally good to see that it is a multi-year commitment, and the roadmap (we saw it under NDA) is rich but realistic.

Developed and used by Facebook, Presto is a powerful, next-generation, open source SQL query engine which supports big data analytics. There is a growing interest in Presto, as these corporations have adopted it: Airbnb, DropBox, Gree, Groupon, and Netflix.

MyPOV – And here is the value: Presto is SQL-based, a language known to millions of business users and analysts. Bringing ‘SQL back to NoSQL’ is a giant quest, and Presto is one of the more successful initiatives on that strategic path. It also has great user DNA.

Presto complements the Teradata® QueryGridTM and fits within the Teradata® Unified Data Architecture™ vision. Presto integrates with the Teradata® Unified Data Architecture™ by providing users the ability to originate queries directly from their Hadoop platform, while Teradata QueryGrid allows queries to be initiated from the Teradata Database and the Teradata Aster Database all through a common SQL protocol.

MyPOV – Teradata lays out the strategy here, which is good for transparency. QueryGrid will be used on the Teradata and Aster side, throwing off queries to Presto as needed – no surprise. But Teradata will also make its Presto offering open for direct Hadoop queries, a good move.

Presto is agnostic and runs on multiple Hadoop distributions. In addition, Presto can reach out from a Hadoop platform to query Cassandra, relational databases, or proprietary data stores. This flexibility allows Presto to combine data from multiple sources, allowing for analytics across the entire organization through a single query. This cross-platform analytic capability allows Presto users to extract the maximum business value from data lakes of any size, from gigabytes to petabytes.

MyPOV – The paragraph describes the value Presto brings pretty well: from Presto, a user can query pretty much anything. So Presto again gives Teradata flexible data access, not from the Teradata level but from the Presto / open source level. A very new approach for Teradata, and a good sign, as it shows Teradata is moving with the times, which clearly point toward open source.
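To make the cross-source idea concrete, here is a minimal, purely illustrative sketch of what a federated query conceptually does: combine rows that live in different stores into one result. All names and data below are invented stand-ins; a real Presto deployment would express this as SQL across catalogs rather than Python.

```python
# Conceptual sketch of a federated query: one logical query joining rows
# from two different "data stores". The tables and values are invented
# stand-ins, not real Presto connectors.

# Stand-in for a clickstream fact table living in Hadoop/Hive
hive_clicks = [
    {"user_id": 1, "page": "/pricing"},
    {"user_id": 2, "page": "/docs"},
    {"user_id": 1, "page": "/signup"},
]

# Stand-in for a user profile table living in Cassandra
cassandra_users = {
    1: {"name": "Ada", "tier": "enterprise"},
    2: {"name": "Grace", "tier": "free"},
}

def federated_join(clicks, users):
    """Join click events against user profiles, as one cross-store query would."""
    return [
        {"name": users[c["user_id"]]["name"], "page": c["page"]}
        for c in clicks
        if c["user_id"] in users
    ]

result = federated_join(hive_clicks, cassandra_users)
print(result)
```

The point of the sketch is only the shape of the operation: the caller writes one query, and the engine resolves which store each side of the join lives in.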

Teradata’s three-phase contribution to 100-percent open source code will advance Presto’s enterprise capabilities, which benefit customers.

Phase 1 - Enhance essential features that simplify the adoption of Presto, including installation, support documentation, and basic monitoring. The Phase 1 capabilities are available today for download at Teradata.com/Presto or on GitHub.


MyPOV – Kudos for laying out a roadmap, always something appreciated by the ecosystem. Apparently the installation of Presto was not trivial, so Teradata logically focused on that in Phase 1.

Phase 2 - Integrate Presto with other key parts of the big data ecosystem, such as standard Hadoop distribution management tools, interoperability with YARN, and connectors that extend Presto’s capabilities beyond the Hadoop distributed file system (HDFS). These features will be available at the end of 2015.

MyPOV – This will be the key release for the Teradata / Presto offering.

Phase 3 – Enable ODBC (Open Database Connectivity) and JDBC (Java Database Connectivity API) to expand adoption within organizations and enhance integration with business intelligence tools. Enhance security by providing access based on job roles. These enhancements will be completed and available in 2016.

MyPOV – And this will be to make the Teradata / Presto release very, very attractive to SQL savvy business users (and developers).
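For context on why Phase 3 matters: ODBC/JDBC drivers expose a database to BI tools through a standard connect/execute/fetch pattern, and Python's DB-API follows the same shape. A hedged sketch, using the standard-library sqlite3 module purely as a stand-in for a database driver (no Presto server is assumed; the table and data are invented):

```python
import sqlite3

# ODBC/JDBC-style connectivity means any compliant tool can open a
# connection, run SQL, and fetch rows. sqlite3 stands in here for a real
# driver purely for illustration; the "sales" table is invented.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales (region TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("west", 120.0), ("east", 80.0), ("west", 40.0)],
)

# A BI tool would issue exactly this kind of aggregate query over the driver
cur.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
)
rows = cur.fetchall()
print(rows)
conn.close()
```

Once a Presto driver speaks this standard protocol, every SQL-savvy BI tool gets Presto access for free, which is exactly the adoption lever the press release describes.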

In addition to its open source contributions, Teradata commercial support is now available from Think Big consulting. Think Big will offer its proven expertise in three areas to enable users to feel confident about putting Presto into production with assistance:

  • Presto Jumpstart – In the cloud or onsite, Think Big will assist with piloting new functionality
  • Presto Development – In the cloud or onsite, Think Big consultants will help customers design, build, and deploy a Presto solution
  • Think Big Academy – Two-day workshops will help customers understand the best uses and criteria for architectural decisions.


MyPOV – No surprise: Teradata will offer services here, and Think Big is the place where Teradata offers them. A good move.

Overall MyPOV

Open source is on the rise. In the last 12 months we have seen more and more open source uptake from Oracle, IBM and even outspoken past open source skeptics like Microsoft and SAP. Major ‘gifts’ have been made to open source – think of Pivotal’s recent move (see here). All this means that even skeptical enterprises have no choice but to implement, run and operate open source. The good news is that vendors see their opportunities on the services side, which will have to be paid for, but overall it looks (for now) as if open source is a significant relief to IT budgets. The less-shared secret is that it saves time and reduces R&D budgets and efforts at ISVs, too.

Closer to Teradata – a very smart move. Take a promising open source offering, free from competitor influence, and own the place. Contribute generously and lavishly to the roadmap, be a good open source citizen, and own it even more – all good moves.
 
On the strategic side Presto is a huge hedge for Teradata – in the worst-case scenario (Teradata ‘classic’ business slowly winding down), this is the first step of re-inventing Teradata on Hadoop. We will see if it comes to that, but a hedge is a hedge, even if not needed. The cross-database capabilities of Presto are very attractive.
 
Teradata has made a good first move here; it will be interesting to see how the competition responds (find other open source initiatives – or even join Presto?). We will be watching.
 

Product Review: Two Apple Watch Enterprise Apps Launched With IMS Health Life Sciences Wearable Platform


IMSHealthWear 1.00 Gives IMS Health First Mover Advantage In Wearable Form Factors And Apps For Healthcare Customer Experience

The rise and success of wearables such as the Apple Watch may not come as a surprise. Constellation estimates that Apple has sold 7.25 million units to date, with the potential to deliver up to 43.50 million units by the end of 2015. While the consumer side of the Apple Watch launch generated a lot of buzz, demand, and innovation, the disruption on the enterprise side has been limited to a few players such as SAP, IBM, Oracle, Salesforce.com and Zoho. With other wearables moving beyond the watch, customers seek new solutions and platforms that take advantage of these new form factors.

The June 8th, 2015, launch of IMSHealthWear 1.00 signifies an early breakthrough for wearables in enterprise healthcare. The two new apps for the Apple Watch take advantage of the wearable form factor and introduce a platform to serve a broader set of wearables in the future (see Figure 1).

  • IMS MGRWear focuses on teams, groups, and partners. Key features include smart and immediate tracking, access to aggregated and integrated sales data, immediate approvals and task routing, platform-based security and performance, and a 360-degree view of territory performance. Managers can view rep performance, drill in on key performance indicators, and even approve expense reports.
  • IMS REPWear empowers sales reps. Key features include smart and immediate routing, access to aggregated and integrated sales data, secure document distribution, platform-based security and performance, and complete access to customer data. Reps can easily access their calendar, make calls from the phone, respond to short messages, receive alerts on their accounts, and quickly retrieve customer information.

The road map calls for five apps in total, with planned general availability in Q1 2016.

Figure 1. The IMSHealthWear Life Sciences Wearable Platform Line Up

IMS HealthWear 1.00 Lineup

Source: IMS Health

Form Follows Function In Wearables

With a screen size of 272 x 340 pixels for the 38mm version and 312 x 390 pixels for the 42mm version of the Apple Watch, the initial set of solutions provides another form factor for sales reps and managers. The design of the existing apps takes advantage of ambient notifications, the ability to access quick bites of information, and the ability to respond to quick tasks. With the wearable apps running on the same platform as the core IMS solution, users do not have to worry about a separate technology stack to support the product. Expect future wearable solutions to focus on three main themes: simplifying complex tasks, reducing process time, and capturing attention when needed (see Figure 2).

Figure 2.  The IMSHealthWear 1.00 Launch Includes 5 Apple Watch Apps And A Platform


Source: IMS Health

 The Bottom Line: New Wearables Platform Shows Innovation Is Alive And Well Post-Merger

The acquisition of Cegedim by IMS Health brought together the strategic assets required for a network economy. As one of the key rules of disrupting digital business states, successful organizations must bring content, network, and technology together to create successful business models in the digital era. IMS Health brought significant content through its insights business and a strong network of providers, payors, and other health care professionals. Cegedim brought the technology to the table.

As with all acquisitions, clients often fear the stifling of incremental and transformational innovation post-merger. With the deal closed in April 2015, the arrival of a wearables platform two months post-merger is a pleasant surprise to customers and prospects. More importantly, IMS Health is inviting customers to pilot the new apps and join the consortium of life sciences customers and wearable manufacturers to build the next set of apps.

The new apps, when delivered, should help every individual inside an organization quickly access insight and, more importantly, speed up decision making. As often said in the digital world, success equates to saving time or capturing one’s attention. The new IMSHealthWear apps, when launched as promised, should take a strong step in this direction.

Your POV

Are you in the midst of digital transformation? Have you looked at how the Apple Watch and other wearables can play a role in reducing time or capturing attention? Add your comments to the blog or schedule a meeting with me.

Please let us know if you need help with your Digital Business transformation efforts. Here’s how we can assist:

  • Developing your digital business strategy
  • Connecting with other pioneers
  • Sharing best practices
  • Vendor selection
  • Implementation partner selection
  • Providing contract negotiations and software licensing support
  • Demystifying software licensing

 


4 Factors Drive Great Customer Experience And Commerce


We just finished some new research that shows the correlation between customer experience and commerce / revenue. Below are some of the highlights from the research.

Continuity of Customer Experiences Drives the Future of Commerce


The Four Factors that Drive Superior Customer Experience Engagement

1. Know who the most profitable customers are online

  • Stat: Only 52 percent of U.S. and 53 percent of U.K. brands were very effective in knowing who their most profitable customers are.
  • This means that nearly half of their time is spent with the wrong (low-value) customers or, if they are spending time with the right customers, the customer experience does not deliver a high conversion rate.
  • Thus, brands require a new approach.

2. Know where (which channels) their high value customers are coming from.

  • Stat: Only 58 percent of U.S. and 48 percent of U.K. brands were very effective in knowing where (which channels) their high value customers are coming from.
  • In this context, channels include social media, websites, search engines, etc. Basically, brands are guessing at where to spend their marketing and ad dollars to provide the highest return on investment.

3. Know which high value products customers are interested in.

  • Stat: Only 59 percent of U.S. and 56 percent of U.K. brands were very effective in knowing which products customers are most interested in.
  • Consequently, brands may not be displaying the best products to the highest value customers, thus losing revenue and profits.

4. Know which high value products customers have viewed.

  • Stat: Only 60 percent of U.S. and 56 percent of U.K. brands were very effective in knowing which products customers have actually viewed.
  • When brands understand the customer’s journey, they can direct their spending and messaging to the moments of maximum influence.
  • When they do this, they will have a much greater chance of reaching customers in the right place at the right time with the right message, producing higher conversion rates.

How does your company stack up compared to these companies? What have you done to improve your customer experience so that it results in better revenue and customer relationships?

@drnatalie, VP and Principal Analyst, covering marketing, sales, and customer service to create amazing customer experiences

References:
Harris Poll Research and Primary Constellation Research


Musings - Speed matters for HR - how to accelerate - Part II


So we looked last week at the need for enterprises to accelerate (find the blog post here), with more of a technology view, as all good ‘inner values’ are based on a decent architecture of the right technology. More specifically, these are the enablement of BigData and the provision of ‘true’ analytics (my definition here).
 
 
 
But even the right technology and architecture can fail if the design principles aren’t right – and it’s key to keep in mind that business user centricity is paramount for successful, next generation HR systems that help enterprises get faster. No surprise, as layers of organization, a Tayloresque organization model, competency splits and more have slowed down organizations since… well, the invention of these principles, which ironically were designed to make organizations more efficient and faster.

So let’s look at the next set of three criteria for a successful next generation HR product:
 

‘Lean’ Recruiting

Recruiter is one of the most challenging positions in the enterprise. CEOs (usually) have 2-3 years before they are removed for bad performance, sales reps get 2-3 quarters, recruiters 2-3 weeks. That’s still better than the 2-3 hours a waiter or retail sales clerk gets, but it makes recruiting probably one of the most scrutinized positions in the enterprise. If a recruiter cannot bring on board the talent an enterprise is looking for after a few weeks, they will be looking for a new job soon. Keeping in mind unfavorable workforce dynamics, the need for acceleration etc., that job will not get easier.

So let’s keep in mind what the recruiter role was originally put in place for: have a professional who does nothing but recruiting and is therefore better at it than the occasional recruitment by a manager in the line of business, and save time for that manager, who doesn’t have to sift through many resumes, initial interviews etc. But at the same time the creation of the recruiter established a disconnect between the hiring manager and the candidate that has to be overcome for every vacancy. The fact that up to a third of positions in North America are not being filled due to perceived ‘friction’ between recruiters and line-of-business managers is telling. It looks like in too many cases managers prefer to settle for hope (that employee will get better), have their team work more (‘so hard to find good people’), or kick the can down the road (the next manager can fix this) rather than recruit new talent.

The good news is that with the advances in BigData and cheap, elastic compute capacity from the cloud that allows for ‘true’ analytics, the recruiter can be largely bypassed and a filtered, fitted list of candidates can be served to the manager. The manager has to make the final call anyway, so instead of a recruiter working through many resumes to whittle down the list of good applicants for the final interview, managers can today get the list of final candidates with the help of next generation recruiting software. We are at the cusp of video analysis even adding early conversations to the data trough from which managers can find the best candidates, after software has created the shortlist. And some vendors have even changed the recruiting model: thanks to the advances of technology, headhunting techniques can now be applied broadly. Formerly reserved for six-figure-salary positions, the searching of social networks (interestingly, Facebook beats out the ‘network of liars’, LinkedIn) for best-fit candidates empowers managers to contact the best candidates directly, while they are still in their current jobs. This follows the old adage that the best people are not looking for jobs (but are happily employed somewhere).
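The shortlisting idea above can be sketched in a few lines. This is a deliberately simplified, hypothetical illustration (real recruiting software would score far richer signals than a skills set); every name, skill, and weight below is invented:

```python
# Hypothetical sketch of 'lean' recruiting: software scores a candidate
# pool against a job profile and serves the manager a shortlist directly.
# Candidates, skills, and the scoring rule are invented for illustration.

def score(candidate, required_skills):
    """Fraction of the required skills a candidate covers."""
    return len(required_skills & candidate["skills"]) / len(required_skills)

def shortlist(candidates, required_skills, top_n=2):
    """Rank the pool by fit and return the manager-ready shortlist."""
    ranked = sorted(candidates,
                    key=lambda c: score(c, required_skills),
                    reverse=True)
    return [c["name"] for c in ranked[:top_n]]

pool = [
    {"name": "Avery", "skills": {"sql", "python", "statistics"}},
    {"name": "Blake", "skills": {"java"}},
    {"name": "Casey", "skills": {"sql", "statistics"}},
]

final_candidates = shortlist(pool, required_skills={"sql", "python", "statistics"})
print(final_candidates)
```

The design point is the handoff: the software produces the ranked shortlist, and the manager only enters at the final-interview stage.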

So: empowerment of the business user, plus usage of Analytics and BigData, enables ‘lean’ recruiting. With no recruiters involved (they will become coaches for managers and applicants, product managers at software companies etc.) – certainly an aggressive vision, but ask any manager out there if they would like it. They are more likely to say yes than no… which means it will happen, sooner rather than later.

Talent ‘Depth Chart’

Sports teams, security and military teams have depth charts. A coach needs to know who can play the same position, should she decide to substitute the player on the field, should the player be injured, booked etc. Likewise there can’t be an ‘emergency’ meeting when the heavy machine gunner is incapacitated… so why has the Talent ‘Depth Chart’ not been enabled for positions in an enterprise?

So let’s look at what it means first: A manager should see at any given time how well the current holder of a position is performing; who else in the enterprise can do that job, and do it better than the incumbent; what talent (using ‘lean’ recruiting as above) could be hired from outside the enterprise; or, if the position may not be around much longer and local labor laws make it challenging to shut down the position, what candidates from contingent worker sources could be hired. A contractor may even be the best fit for the position, regardless of employment status.

Again the good news is that vendors are actively thinking about this. It is relatively easy to see how well an employee is performing; the manager will have an opinion on this anyway. But it is hard to do something about it. So finding coaching, learning and mentoring tools and options is the first step. Finding good fits in the enterprise is the next step: easing the conversation with the manager of a professional one would like to ‘poach’ with one-click automation, giving the phone numbers of the best 2-3 external candidate fits for a position, and the same for the best contractors. Show how well they can do the job – where would they rank on the ‘Talent Depth’ chart for that position? All these functions can be built today, enabling the manager to assess incumbent and potential talent for a position.
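A talent ‘depth chart’ can be pictured as one ranked view across all talent sources. A minimal sketch under invented assumptions (the data model, fit scores, and names are all hypothetical; a real system would compute fit from performance and skills data):

```python
# Hypothetical 'Talent Depth Chart': rank everyone who could hold a
# position -- incumbent, internal transfers, external candidates, and
# contractors alike -- by a single fit score. All data is invented.

def depth_chart(position, people):
    """Return (name, source, fit) tuples for a position, best fit first."""
    eligible = [p for p in people if position in p["fit"]]
    eligible.sort(key=lambda p: p["fit"][position], reverse=True)
    return [(p["name"], p["source"], p["fit"][position]) for p in eligible]

people = [
    {"name": "Incumbent I.",  "source": "incumbent",  "fit": {"analyst": 0.70}},
    {"name": "Internal J.",   "source": "internal",   "fit": {"analyst": 0.85}},
    {"name": "External K.",   "source": "external",   "fit": {"analyst": 0.80}},
    {"name": "Contractor L.", "source": "contingent", "fit": {"analyst": 0.90}},
]

chart = depth_chart("analyst", people)
print(chart)
```

Note how employment status is just a label on each row: the manager sees one ordered bench for the position, which is exactly the sports-team analogy.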

Transboarding

I wrote about Transboarding for the first time in the summer of 2014 (more here) – it’s a word constructed by merging transfer, offboarding and onboarding. A lot of effort, time and resources are spent on recruiting and onboarding, and they are key HR functions, but the most common HR event, the transfer, is barely automated and supported.

Consider the manager using a ‘Talent Depth’ chart as described above. She identifies a suitable individual in another department. Let’s go for the easy scenario: the individual wants to transfer, his manager is supportive and, even better, needs the current incumbent in the current position. Basically a talent swap. Even such a smooth and simple case is substantial work across HR Core, Training and Talent Management systems. If we talk hourly workers, add the complexity of Workforce Management. What if the manager could do an ‘electronic handshake’, set the date, and the rest would be automated? The APIs for these functions are available. We are only waiting for vendors to ‘glue’ them together for very powerful automation in a flexible, business-user-empowering way.
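The ‘electronic handshake’ for the talent-swap scenario can be sketched as a single call. This is a toy illustration with an invented data model; a real implementation would orchestrate the HR Core, Talent Management, and Workforce Management APIs behind this one function:

```python
# Sketch of the 'electronic handshake' transboarding idea: once both
# managers agree, one call swaps two employees between departments on a
# set date. The org dict and function are invented for illustration.

def transboard_swap(org, emp_a, emp_b, effective_date):
    """Swap the departments of two employees after a mutual handshake."""
    dept_a, dept_b = org[emp_a]["dept"], org[emp_b]["dept"]
    org[emp_a].update(dept=dept_b, effective=effective_date)
    org[emp_b].update(dept=dept_a, effective=effective_date)
    return org

org = {
    "maria": {"dept": "analytics"},
    "tom":   {"dept": "finance"},
}

transboard_swap(org, "maria", "tom", "2015-07-01")
print(org)
```

The value is in what the manager does not see: behind one handshake call, payroll, training plans, and schedules would all be updated automatically.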

Being able to transboard people efficiently will be key for enterprises going forward. We already know enterprises need to become faster; if they can allocate people faster as needed, they will get faster, too. Not to mention the higher satisfaction of employees who can rotate through positions faster, without haggling between managers, without fear of upsetting the current manager and so on.

Stay tuned for Part III.

2015 Roundup Of Analytics, Big Data & Business Intelligence Forecasts And Market Estimates


  • Salesforce (NYSE:CRM) estimates adding analytics and Business Intelligence (BI) applications will increase their Total Addressable Market (TAM) by $13B in FY2014.
  • 89% of business leaders believe Big Data will revolutionize business operations in the same way the Internet did.
  • 83% have pursued Big Data projects in order to seize a competitive edge.

Despite the varying methodologies used in the studies mentioned in this roundup, many share a common set of conclusions. The high priority in gaining greater insights into customers and their unmet needs, more precise information on how to best manage and simplify sales cycles, and how to streamline service are common themes.

The most successful Big Data use cases revolve around enterprises’ need to get beyond the constraints that hold them back from being more attentive and responsive to customers.

Presented below is a roundup of recent forecasts and estimates:

  • Wikibon projects the Big Data market will top $84B in 2026, attaining a 17% Compound Annual Growth Rate (CAGR) for the forecast period 2011 to 2026. The Big Data market reached $27.36B in 2014, up from $19.6B in 2013. These and other insights are from Wikibon’s excellent research of Big Data market adoption and growth. The graphic below provides an overview of their Big Data Market Forecast.  Source: Executive Summary: Big Data Vendor Revenue and Market Forecast, 2011-2026.

Wikibon big data forecast

  • IBM and SAS are the leaders of the Big Data predictive analytics market according to the latest Forrester Wave™: Big Data Predictive Analytics Solutions, Q2 2015. The latest Forrester Wave is based on an analysis of 13 different big data predictive analytics providers including Alpine Data Labs, Alteryx, Angoss Software, Dell, FICO, IBM, KNIME.com, Microsoft, Oracle, Predixion Software, RapidMiner, SAP, and SAS. Forrester specifically called out Microsoft Azure Machine Learning as an impressive new entrant that shows the potential for Microsoft to be a significant player in this market. Gregory Piatetsky (@KDNuggets) has done an excellent analysis of the Forrester Wave Big Data Predictive Analytics Solutions Q2 2015 report here. Source: Courtesy of Predixion Software: The Forrester Wave™: Big Data Predictive Analytics Solutions, Q2 2015 (free, no opt-in).

Forrester Wave Big Data Predictive Analytics

  • IBM, KNIME, RapidMiner and SAS are leading the advanced analytics platform market according to Gartner’s latest Magic Quadrant. Gartner’s latest Magic Quadrant for advanced analytics evaluated 16 leading providers of advanced analytics platforms that are used to build solutions from scratch. The following vendors were included in Gartner’s analysis: Alpine Data Labs, Alteryx, Angoss, Dell, FICO, IBM, KNIME, Microsoft, Predixion, Prognoz, RapidMiner, Revolution Analytics, Salford Systems, SAP, SAS and Tibco Software. Gregory Piatetsky (@KDNuggets) provides excellent insights into shifts in the Magic Quadrant for Advanced Analytics Platforms rankings here. Source: Courtesy of RapidMiner: Magic Quadrant for Advanced Analytics Platforms, published 19 February 2015, analysts Gareth Herschel, Alexander Linden, Lisa Kart (reprint; free, no opt-in).

Magic Quadrant for Advanced Analytics Platforms

  • Salesforce estimates adding analytics and Business Intelligence (BI) applications will increase their Total Addressable Market (TAM) by $13B in FY2014. Adding new apps in analytics is projected to increase their TAM to $82B for calendar year (CY) 2018, fueling an 11% CAGR in their total addressable market from CY 2013 to 2018. Source: Building on Fifteen Years of Customer Success Salesforce Analyst Day 2014 Presentation (free, no opt in).
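As a quick sanity check on the compounding in the Salesforce bullet above, CAGR arithmetic works as follows. The implied base figure below is our own back-of-envelope derivation, not a number from the Salesforce presentation:

```python
# Back-of-envelope CAGR arithmetic for the TAM claim above: if the TAM is
# $82B in CY2018 after growing at an 11% CAGR over the 5 years from
# CY2013, the implied CY2013 base is 82 / 1.11**5. The resulting figure
# is our own derivation, not one from the Salesforce deck.

def implied_base(future_value, cagr, years):
    """Starting value consistent with a future value and a CAGR."""
    return future_value / (1 + cagr) ** years

base_2013 = implied_base(82.0, 0.11, 5)
print(f"Implied CY2013 TAM: ${base_2013:.1f}B")
```

So an 11% CAGR to an $82B CY2018 TAM implies a CY2013 base somewhere near the high-$40B range, which gives a feel for the scale of the expansion the analytics apps are meant to drive.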

Salesforce Graphic

  • 89% of business leaders believe big data will revolutionize business operations in the same way the Internet did. 85% believe that big data will dramatically change the way they do business. 79% agree that ‘companies that do not embrace Big Data will lose their competitive position and may even face extinction.’ 83% have pursued big data projects in order to seize a competitive edge. The top three areas where big data will make an impact in their operations include: impacting customer relationships (37%); redefining product development (26%); and changing the way operations is organized (15%). The following graphic compares the top six areas where big data is projected to have the greatest impact in organizations over the next five years. Source: Accenture, Big Success with Big Data: Executive Summary (free, no opt in).

Big Data Big Success Graphic


  • Customer analytics (48%), operational analytics (21%), and fraud & compliance (21%) are the top three use cases for Big Data. Datameer’s analysis of the market also found that the global Hadoop market will grow from $1.5B in 2012 to $50.2B in 2020, and financial services, technology and telecommunications are the leading industries using big data solutions today. Source: Big Data: A Competitive Weapon for the Enterprise.

Big Data Use Cases in Business

  • 37% of Asia Pacific manufacturers are using Big Data and analytics technologies to improve production quality management. IDC found manufacturers in this region are relying on these technologies to reduce costs, increase productivity, and attract new customers. Source: Big Data and Analytics Core to Next-Gen Manufacturing.

big data in manufacturing

  • Supply chain visibility (56%), geo-location and mapping data (47%) and product traceability data (42%) are the top three potential areas of Big Data opportunity for supply chain management. Transport management, supply chain planning, & network modeling and optimization are the three most popular applications of Big Data in supply chain initiatives. Source: Supply Chain Report, February 2015.

Big data use in supply chains

  • Finding correlations across multiple disparate data sources (48%), predicting customer behavior (46%) and predicting product or services sales (40%) are the three factors driving interest in Big Data analytics. These and other fascinating findings from InformationWeek’s 2015 Analytics & BI Survey provide a glimpse into how enterprises are selecting analytics applications and platforms. Source: Information Week 2015 Analytics & BI Survey.

factors driving interest in big data analysis

 


Progress Report - Teradata is alive and kicking and shows some good 'paranoid' practices


We had the opportunity to attend the Teradata Influencer Summit, held at the beautiful L’Auberge Del Mar, north of San Diego. When mentioning to other influencers earlier in the week that I would be en route to that meeting, I mostly gathered incredulous stares and comments like ‘are they still around and interesting?’ I missed the first half day, but the next one and a half days gave good insight into where Teradata is and where it wants to be in the next years.

 
 
 

So here are my top three takeaways from the event:

A compelling vision – In recent times Teradata had been blamed for a lack of vision and/or a lack of realism. I followed the Summit two years ago from the sidelines, and back then an executive gave an elaborate presentation on why Hadoop would be a failure – or only a temporary fashion that would disappear as fast as it had shown up. But quickly afterwards (and some colleagues say the Summit helped) Teradata reversed course, switching from a ‘struggle’ to a ‘snuggle’ approach vis-à-vis Hadoop, embracing more of it and partnering e.g. with MongoDB already a year ago (it again was a key sponsor of this year’s MongoDB World – also this week; event report here). All slides at the Summit featured Teradata, Aster and Hadoop based offerings. As a matter of fact, the Teradata UDA includes all three database paradigms.
 
The vision of the 'Sentient' Enterprise
Alongside these capabilities Teradata is using its by now proven QueryGrid technology, following the ‘co-existence’ route with Hadoop – the ‘federated query’ strategy that virtually all vendors older than five years pursue. As part of that, Teradata sees the world separated into three categories of systems: tightly coupled (Teradata), loosely coupled (Aster) and non-coupled (Hadoop). An interesting approach to the system landscape, which Ratzesberger used to explain the vision of the ‘sentient enterprise’. There are pros and cons to federation; see the MyPOV section below.
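The three-tier federation idea can be illustrated with a toy router in plain Python – purely a conceptual sketch, not Teradata’s QueryGrid API; all class names, source names and sample rows here are invented:

```python
# Conceptual sketch of a federated query: one predicate fanned out to
# several independent stores, results merged with their origin tagged.

class DataSource:
    """Stand-in for one tier of the UDA (e.g. Teradata, Aster, or Hadoop)."""
    def __init__(self, name, rows):
        self.name = name
        self.rows = rows  # list of dicts, pretending to be table rows

    def scan(self, predicate):
        # Each source evaluates the predicate over its own data only
        # ("pushdown"); no data leaves the source until it matches.
        return [r for r in self.rows if predicate(r)]

class FederatedQuery:
    """Fan a predicate out to every registered source and merge results."""
    def __init__(self, sources):
        self.sources = sources

    def run(self, predicate):
        results = []
        for src in self.sources:
            for row in src.scan(predicate):
                results.append({**row, "_source": src.name})
        return results

# Invented sample data, one store per tier:
teradata = DataSource("teradata", [{"cust": 1, "revenue": 500}])
aster = DataSource("aster", [{"cust": 1, "path": "web>store"}])
hadoop = DataSource("hadoop", [{"cust": 1, "clicks": 42},
                               {"cust": 2, "clicks": 7}])

fq = FederatedQuery([teradata, aster, hadoop])
hits = fq.run(lambda r: r.get("cust") == 1)
```

The point of the sketch is the shape of the problem: one logical question, answered by three independently stored data sets, merged at query time rather than at load time.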


It’s coming together – Not sure when Teradata came up with the vision, but obviously work still needs to be done here. Teradata needs to write the endpoint to execute queries on the Hadoop side, and it currently supports Hive for that. But Hive has known benefits and challenges, and Teradata will address the challenges soon – stay tuned on that topic. The other critical piece is a tool that can transport data back and forth, extract from a variety of data sources as needed, transform information, apply rules ‘in flight’ etc. Teradata calls it the Listener, but it is much more. In the old days one would have called it a bona fide ETL tool, only that today it scales to 21st-century requirements with streaming data, humongous data volumes, BAM- and CEP-like support etc. – a key product / tool to make the Teradata UDA fly.
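The ‘in flight’ idea can be sketched as a simple generator pipeline – a minimal illustration of streaming transformation, not the actual Listener product; the function names, the events and the conversion rule are all invented:

```python
# Minimal streaming-ETL sketch: events flow through the pipeline one at a
# time, and a rule is applied while the data is "in flight".

def source(events):
    """Pretend event stream (in reality: sensors, logs, message queues)."""
    for e in events:
        yield e

def transform(stream):
    """Apply a rule in flight: normalize the device name and convert
    Fahrenheit to Celsius before the event ever lands anywhere."""
    for e in stream:
        yield {
            "device": e["device"].lower(),
            "temp_c": round((e["temp_f"] - 32) * 5 / 9, 1),
        }

def sink(stream):
    """Collect results (in reality: land them in Teradata, Aster, Hadoop)."""
    return list(stream)

events = [
    {"device": "SENSOR-A", "temp_f": 212.0},
    {"device": "Sensor-B", "temp_f": 32.0},
]
loaded = sink(transform(source(events)))
```

Because generators are lazy, each event is transformed as it passes through rather than after a full batch has been staged – the core difference between this style of pipeline and classic batch ETL.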
 
 
 
The Analytical Application Platform

Self-disruption built in – All successful technology companies struggle with disrupting themselves. It’s the price of success: harvesting the benefits of maturing technology creates the potential for technology disruption. Co-President Wimmer remarked that all successful database companies – the 30+ year olds – are set up for disruption today. Usually technology players react to disruption risks with acquisitions, trying to insert new product capabilities and employee talent, and these approaches are usually less successful than intended, for a variety of reasons.

Teradata has taken a unique approach here, allowing the acquired ThinkBigA to keep operating independently, and even going a step further by funding its global expansion. Unique, because ThinkBigA does pure best-of-breed Big Data projects, with complete independence in choosing the technologies customers desire. As such ThinkBigA cannibalizes Teradata customers to a certain extent, and remarkably Teradata does nothing to stop that.
 
The Teradata free ThinkBigA architecture
So kudos to the Teradata management: coming off a close-to-death experience with the Hadoop phenomenon, it now has a built-in disruptor with ThinkBigA. Quite a gutsy move, comparable to an enterprise ERP vendor allowing a subsidiary services company to sell services on competitor products to its customers. What? Granted, the ThinkBigA case is not that drastic, as the provider ‘only’ uses open source, but the disruption is at hand. The benefits for Teradata are twofold: The vendor learns about real customer demands and the quality of open source Big Data products, which it can use for its own R&D strategy and efforts. Secondly, Teradata has a way to remain relevant with (and extract revenue from) customers it otherwise would have lost to Big Data / open source players anyway.
 
ThinkBigA BigData Project recommendations, notice bullet #5 - emphasis added
 

MyPOV

Teradata is alive and well, with solid product plans for the immediate future. The three-tiered vision of federated queries and data is common for vendors with an existing, substantial business. It resonates well with enterprises that are more conservative in regards to both technology adoption and security needs.

On the concern side, the jury is out on whether enterprises operating with a federated storage and query approach will end up missing insights, or only being served sub-optimal insights. Conceptually one can make the case that some insights can only be caught with single-source storage – one place to be queried etc. – but whether the difference between the two approaches is substantial for the success of a business remains to be seen.
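The single-store argument can be made concrete with a toy example – all data here is invented – where a pattern is invisible inside each silo and only appears once the two sources are joined:

```python
# Two data sets that, in a federated world, live in separate systems:
purchases = {"alice": 3, "bob": 0, "carol": 5}        # e.g. the warehouse
support_tickets = {"alice": 0, "bob": 4, "carol": 1}  # e.g. a CRM system

# Looked at alone, neither source flags "bob": one just shows a customer
# with no purchases, the other a customer with some tickets. Joined, the
# churn-risk pattern (no purchases AND many tickets) is obvious:
at_risk = [
    user for user in purchases
    if purchases[user] == 0 and support_tickets.get(user, 0) > 2
]
```

A federated query can of course still perform this join – but only if someone thinks to ask the cross-source question; a single store makes such questions routine, which is the essence of the concern above.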

So it is a key second half of the year for Teradata on the product side, with a number of substantial announcements to come and key capabilities in the pipeline. Teradata may have been down, but it’s back on its feet and throwing punches. Only the paranoid survive, and Teradata has gotten a little more paranoid – in a positive way. Good to see for prospects, customers and the overall ecosystem, as nothing fuels innovation better than competition. We will be watching.

Collaboration Around Files - A Case Study On Working With Clients


I recently had the opportunity to speak with one of the world's largest advertising agencies about how it collaborates with its clients. Details of the solution are covered in the Constellation Research case study "Powering Global Client Collaboration with Secure File Sharing: How a Leading Ad Agency Used Egnyte Adaptive Enterprise File Services to Work with Clients". Constellation clients can download the full report here.

Here are a few highlights:

  • One of the main tasks of a digital agency is working with clients to create the multimedia assets used in their sales and marketing campaigns. The process is very collaborative, starting with gathering requirements, then brainstorming, multiple rounds of drafts and edits, followed by review, approval and production. Before Egnyte, it was difficult to share files and discuss changes via email. 
  • The CIO along with other department leads started a multi-year technology refresh that focused on platforms with strong cloud-based and mobile features. The project began by replacing their legacy email system with Google’s productivity suite, Google Apps for Work. Google Apps was used not only for email and calendar, but also word processing, spreadsheets, file-sharing and internal web-conferencing all via a single integrated offering. 
  • By switching to Fuze for real-time collaboration with clients, the Agency is able to easily share files from Egnyte into a Fuze video conference, where anyone can participate from his/her computer, phone or tablet. In these meetings, designs are reviewed and improved onscreen, eliminating the typical back and forth exchanges that lead to confusion and costly mistakes. 

Constellation clients can download the full report Powering Global Client Collaboration with Secure File Sharing to learn more about the challenges, solutions, benefits, takeaways and recommendations.

 


Teradata Influencer Summit Highlights


Herman Wimmer, Co-President, kicks off the event. Herman says that the guiding priorities of Teradata are:

Key Principle #1: Analytic ecosystem: Teradata, DB, UDA, Real-time, Fabric Architecture

Key Principle #2: Big Data Technologies: Aster, Hadoop, Big Data Apps, Apps Center, Open Source Contribution and leverage

Key Principle #3: Cloud for Analytics

Key Principle #4: Enterprise Class Production Analytics, Hybrid Implementations (Public/Private), Broader Market Penetration

Key Principle #5: Consulting, Big Data Consulting, Analytics Consulting, Managed Services

Key Principle #6: Future Markets: Healthcare, Government, Innovation, People, Passion….

In a digital world, where business models are changing very fast (note: not everyone agrees with this or sees it), businesses will need real-time data to make better decisions and to make the customer experience the best it can be. Companies that used to compete on selling cars are really competing on the experience – what it feels like to drive and own the car. Engineers, marketers and customer service professionals can guess what is making the customer happy and driving that customer experience, or they can use the technologies available to drive better business. It is not easy, but Teradata is definitely simplifying it. And it takes an investment of people, process, time, technology – and passion.

My POV: Where is your company with respect to going beyond conversations about “big data” for big data’s sake, and truly embracing data in the business, where it can make a difference? Who should lead this? CEOs that “get the value of data” are looking for someone in their organization to do it – a CIO, CTO, CMO or customer service professional to lead the data revolution. And if one of those “titles” doesn’t step up and lead, or co-lead with other executives, CEOs will find and hire Chief Digital Officers or Chief Customer Officers to make it happen. It is an opportunity, but it is not without risk. Done well, it brings huge financial rewards to the companies that master it – and most likely helps that person’s career. If you are in a position that isn’t data-centric, then it’s up to you to turn it into one.

@drnatalie, VP and Principal Analyst, covering Marketing, Sales and Customer Service using Big Data Analytics to Deliver Amazing Customer Experiences

 


Cisco announces intent to acquire Piston Cloud


Continuing the trend toward managed cloud services, Cisco announced its intent to acquire Piston Cloud. This announcement comes after the announcements that EMC acquired Virtustream (my take here) and IBM acquired Blue Box (my take here).
 
Let’s dissect the Cisco blog (it can be found here) about the announcement:

Cloud computing has fundamentally altered the IT landscape: dramatically boosting IT agility, while lowering costs. To realize the business advantages of cloud, organizations are shifting to a hybrid IT model—blending private cloud, public cloud, and on-premise applications.

MyPOV – It looks like the demand for hybrid cloud has gotten much stronger, considering this announcement.

To help customers maintain control and compliance in this hyper-connected, hyper-distributed IT environment, Cisco and its partners are building the Intercloud—a globally connected network of clouds. Today, Cisco is taking another important step towards realizing our ambitious Intercloud vision.

MyPOV – Good for Cisco to connect the intended acquisition with Intercloud – my analysis of the original announcement is here.

We are pleased to announce our intent to acquire Piston Cloud Computing, which will help accelerate the product, delivery, and operational capabilities of Cisco Intercloud Services.

Paired with our recent acquisition of Metacloud, Piston’s distributed systems engineering and OpenStack talent will further enhance our capabilities around cloud automation, availability, and scale. 


MyPOV – Piston Cloud’s product delivers a ‘cloud OS’ that can ‘plug in’ a number of cloud services. As such Piston Cloud gives enterprises the choice of using a large number of different technologies, something enterprises care about as they do not know for sure which technologies they want to use for their next-generation application projects. Being able to run these from the same platform is a significant value proposition that Piston Cloud has delivered. Given the heterogeneous nature of the environments traditional Cisco customers operate, certainly a good move. My analysis of the mentioned Metacloud acquisition is here.

The acquisition of Piston will complement our Intercloud strategy by bringing additional operational experience on the underlying infrastructure that powers Cisco OpenStack Private Cloud.

MyPOV – So Piston Cloud will become part of the Cisco OpenStack offering – no surprise here. Good to have the clarity.

Additionally, Piston’s deep knowledge of distributed systems and automated deployment will help further enhance our delivery capabilities for customers and partners.

MyPOV – Fair for Cisco to admit that Piston Cloud has some serious chops bringing together such diverse technologies as e.g. Docker or Spark.

To bring the world of standalone clouds together, Cisco and our partners are building the Intercloud. The Intercloud is designed to deliver secure cloud services everywhere in the world. Our enterprise-class portfolio of technology and cloud services gives customers the choice to build their own private clouds or consume cloud services from a trusted Intercloud Provider. The Intercloud provides choice of services, all with compliance and control. In a nutshell: we’re delivering cloud the way our customers need it.

MyPOV – The marketing line for Cisco Intercloud. But the intended acquisition of Piston Cloud certainly lends more credibility and, more crucially, more capability in this direction. One can only speculate why these two have come together only now.

Piston will join our Cloud Services team under the leadership of Faiyaz Shahpurwala, senior vice president, Cloud Infrastructure and Managed Services Organization.

MyPOV – Always good to hear where acquisitions will be anchored organizationally. I have shared my concerns regarding services leadership building products, but that is a general concern, and I am happy to give Shahpurwala and the team the benefit of the doubt. The future will tell.

Overall POV

It can’t be coincidence that EMC, IBM and Cisco are all completing acquisitions in the managed cloud / hybrid cloud space. It makes sense for all three vendors to push the hybrid agenda, as all three have existing sales channels into local data centers. To a certain point this movement indicates a failure of the ‘public only' cloud vendors (e.g. AWS and Google).
 
Nothing to hold against Cisco, which tries to differentiate itself by ‘playing nice’ to the demands, needs (and maybe angst) of CIOs in regards to the public cloud. The interesting development is that CIOs seem to be comfortable letting third parties manage their private cloud infrastructure, but still want to see their data centers utilized. Write-down time frames may play a role here, but overall it is an interesting development that I will spend more time analyzing going forward.

On the concern side, it is another acquisition, and Cisco will have to hold on to the Piston Cloud team’s talent. But Cisco is an experienced acquirer and knows how to make sure that key people stay around.

So overall a good move by Cisco; the flexibility of Piston Cloud has the potential to be a differentiator for Cisco Intercloud. Congratulations.

 


Market Move - IBM gets into private cloud (services) with Blue Box acquisition


The need to manage private clouds seems to be really heating up on the demand side – last week EMC announced plans to acquire Virtustream (see analysis here), and today both IBM and Cisco announced their respective purchases of / intents to purchase Blue Box and Piston Cloud.
 
Let’s dissect the IBM press release (it can be found here):

ARMONK, N.Y - 03 Jun 2015: IBM (NYSE: IBM) today announced it has acquired Blue Box Group, Inc., a managed private cloud provider built on OpenStack.
Blue Box is a privately held company based in Seattle that provides businesses with a simple, private cloud as a service platform, based on OpenStack. Customers benefit from the ability to more easily deploy workloads across hybrid cloud environments. Financial details were not disclosed.
MyPOV – Another acquisition in Seattle. Blue Box originally started as a website hosting provider and has evolved into an OpenStack managed private cloud provider.

Enterprises are seeking ways to embrace all types of cloud to address a wide range of workloads. Today’s announcement reinforces IBM’s commitment to deliver flexible cloud computing models that make it easier for customers to move to data and applications across clouds and meets their needs across public, private and hybrid cloud environments. With Gartner forecasting that 72 percent of enterprises will be pursuing a hybrid cloud strategy this year [1], it is increasingly important for companies to leverage multiple models while maintaining consistent management across their cloud platforms.
MyPOV – For the longest time the race was about getting public cloud capabilities going. It looks like by mid-2015 enterprises are ready to move to the cloud, albeit in a more conservative fashion, with more private cloud aspects than ever before. With the bare metal of SoftLayer, IBM is able to give customers higher confidence levels than most competitors, but it seems that this was not enough – hence the Blue Box acquisition.

Through Blue Box, IBM will help businesses rapidly integrate their cloud-based applications and on-premises systems into OpenStack-based managed cloud. Blue Box also strengthens IBM Cloud’s existing OpenStack portfolio, with the introduction of a remotely managed OpenStack offering to provide clients with a local cloud and increased visibility, control and security.
MyPOV – This paragraph unveils the ‘crown jewels’ – the capability to manage an OpenStack system deployed remotely. Enterprises may still want to utilize their data centers and see their servers, but are more open to having them managed remotely.

This move further accelerates IBM’s commitment to open technologies and OpenStack. IBM has 500 developers dedicated to working on open cloud projects to bring new cloud innovations to market. With Forrester Research recently finding that more than twice as many firms use or plan to use IBM Cloud as their primary hosted private cloud platform than the next closest vendor [2], Blue Box is a strategic fit into the IBM Cloud portfolio.
MyPOV – No surprise – OpenStack compatibility is important and makes Blue Box a good fit.

Blue Box can enhance and complement developer productivity by:

  • Speeding delivery of applications and data through simplified and consistent access to public, dedicated and local cloud infrastructures

  • Supporting managed infrastructure services across hybrid cloud environments and IBM’s digital innovation platform, Bluemix

  • Offering a single management tool for OpenStack-based private clouds regardless of location

MyPOV – Good summary of the Blue Box capabilities. Blue Box partnered with the small PaaS vendor Mendix; the mention of Bluemix here may make those deployments uncertain – but we are not aware of a statement in this regard. In our view it may well be worth it for IBM to look at the Mendix capabilities.

This acquisition will enable IBM to deliver a public cloud-like experience within the client’s own data center, relieving organizations of the burden of traditional private cloud deployments.
MyPOV – Underlines the value prop – the customer keeps the data center while the management goes to IBM. The acquisition reminds me of Cisco’s recent acquisition of Metacloud (see here), which also had a significant services aspect.

“IBM is dedicated to helping our clients migrate to the cloud in an open, secure, data rich environment that meet their current and future business needs,” said IBM General Manager of Cloud Services Jim Comfort. “The acquisition of Blue Box accelerates IBM’s open cloud strategy making it easier for our clients to move to data and applications across clouds and adopt hybrid cloud environments."

“No brand is more respected in IT than IBM,” said Blue Box Founder and CTO Jesse Proudman. “Blue Box is building a similarly respected brand in OpenStack. Together, we will deliver the technology and products businesses need to give their application developers an agile, responsive infrastructure across public and private clouds. This acquisition signals the beginning of new OpenStack options delivered by IBM. Now is the time to arm customers with more efficient development, delivery and lower cost solutions than they've seen thus far in the market.”

MyPOV – The usual quotes – no comment needed.

IBM currently plans to continue to support Blue Box clients and enhance their technologies while allowing these organizations to take advantage of the broader IBM portfolio. […]
MyPOV – Key statement. Blue Box customers should contact IBM asap to make sure they can keep the services that matter to them.



 

Overall MyPOV

A good move for IBM that opens new service offerings in the IBM Cloud portfolio. Given the recent acquisitions at EMC and Cisco, it looks like the private cloud is alive and well. To a certain point that is a failure of ‘pure’ public cloud players like AWS and Google; it looks like they have not been able to convince CIOs to move all their load to a public cloud setup. It will be interesting to see how the mix of private cloud – e.g. the ones managed with Blue Box – vs. public cloud will end up in a few years. And IBM listens to customers and wants more cloud revenue, so as long as running private clouds brings that revenue, IBM will of course do it…

 

 

 
