
Spoken and Unspoken Rules of Social Media


When I first started blogging I voraciously read Darren Rowse’s Problogger website. It seemed like every conceivable issue I was facing had already been tackled and fixed by Darren. Similarly, I followed Yaro Starak’s advice, thinking I’d tread the entrepreneurial path. And when it came to marketing, I’d look to Olivier Blanchard’s insightful Brand Builder blog.

But I wasn’t really looking for a “how to guide” – I was seeking to learn the ropes. To understand the ways of this new, digital world.

What I realised pretty quickly was that this brave new world was not so unlike the scared old world that I was leaving behind with every tap on my keyboard. The lifeblood of social was relationships and the currency of that relationship was trust. And, really, the only way to learn the ropes was to participate – voyeurism can be fine for a while but is ultimately unsatisfying.

The deep water of social media, however, can be managed effectively with a few simple rules:

  1. Don’t swim with sharks: We have an inbuilt radar for detecting danger and threat. In the real world (IRL), the hair stands up on the back of our necks, a little voice whispers in our ears and we cross the street to avoid an unpleasant person or situation. In the digital world the same approaches apply – yet we seem to turn off our threat detection system the moment we turn on our computer. Be sure to keep an eye and ear out for scammers. Trust your friends – the ones you know IRL. Don’t click random links in email or send money to people you have never met. Don’t believe strangers when they tell you how much better they can make your website.
  2. It’s not rude to ignore people: Following on from the previous point – if you don’t know someone IRL, it’s fine to ignore them. You don’t have to “friend” or “follow” someone who follows you on social networks. You don’t have to answer a random email. Develop a healthy sense of scepticism and you’ll be fine.
  3. Don’t publish anything you wouldn’t show your Nan: Yes, I did say “publish”. It’s important to realise that everything you put online is a form of publishing. That means it’s trackable, findable and traceable. Google will find it eventually. So before you go and have that argument with a stranger; before you flame your boss (when you think she’s not looking); or before you start sharing those photos of your ex that you really should delete, think again. If you wouldn’t say to or show your grandmother what you are about to publish online, then your best bet is to save it for home.

But if these three rules are not enough for you, you’ll love Jeremy Waite’s 80 Rules of Social Media.

[Infographic: Rules of Social Media]

Via BitRebels.

 


Telling a Data-Driven Story

During the last election, I was constantly amazed by the way that politicians of all persuasions bored us to death with FACTS. It was as if they were following a mantra which was to wheel out fact after fact as though they would eventually convince us through the weight of their overburdened arguments alone.

We would hear about HOW many jobs had been created. Or HOW much debt had been accumulated. But hardly, if ever, would anyone dive below the facts to discover anything deeper. Once upon a time, journalists would have done the hard work of contextualising the facts – connecting the dots, explaining the WHYs and WHEREFOREs – and otherwise telling the story that the facts alone never reveal.

But in a world where journalism has been cut to the bone, telling the story or investigating the underlying realities is a luxury that media proprietors cannot afford. And worse, the public has been lulled into accepting the shrill, scant messages that flash across our Twitter streams as though it’s some form of dyslexic gospel. Hashtag #auspol. Hashtag #outrage.

But there is another way – and it requires a more comprehensive strategy than we have seen from our politicians. It’s also far more comprehensive than we have seen from the majority of the businesses vying for our attention and our wallets. It’s a strategy that puts a little joy back into the communications and the storytelling that we share. It reminds us that for all our grievances, aspirations and needs, we remain, resolutely and wonderfully human.

Inspired by another great Leslie Bradshaw presentation:

The data is useful, but only when it tells a story. What ever you do this week, don’t get lost in the digits of digital.

Image: Fingerprints by Kevin Dooley (Creative Commons License), via Compfight.

 


Webinar Asks Where Do You Stand with Siebel?

 

Constellation Research will hold a webinar on Where You Stand with Siebel on September 18th at 9:30 a.m. US Pacific time. The webinar will cover the different directions Oracle Siebel customers, partners, and consultants can take their Siebel implementations and their careers.

Siebel technology is 20 years old this year and it shows signs of both robustness and age. With Oracle sending mixed messages about the product’s future, this webinar helps those people who make a living working with Siebel determine exactly where they stand.

“Clearly Siebel technology is not dead and plays an important role in the IT infrastructure of many of our clients,” noted R “Ray” Wang, CEO of Constellation Research. “Figuring out exactly where Siebel fits into the mix is challenging. We are pleased to be helping Siebel customers meet this challenge.”

The webinar is based on the research paper “The State of Siebel in the 2013 Market: Different Strategies for Moving Siebel Implementations Forward and Methods to Assess Career Risk”, which will be published that week. This research helps Oracle Siebel customers understand the real position Siebel technology holds in the market, why the conventional wisdom about the product is often wrong, and what trends are driving the misconceptions in the market. The report also offers pragmatic advice for taking different Siebel implementations in different directions and explains how those directions will impact different careers. The report will be available on the Constellation website.

Webinar Information

When: Wednesday, September 18, 2013 at:

  • 9:30 a.m. US Pacific time
  • 10:30 a.m. US Mountain time
  • 11:30 a.m. US Central time
  • 12:30 p.m. US Eastern time
  • 17:30 UK time
  • 18:30 Central European time

Webinar Information: To register for this complimentary webinar, go to: https://www3.gotomeeting.com/register/578393734


What's really happening to privacy?


The cover of Newsweek magazine on 27 July 1970 featured a cartoon couple cowered by computer and communications technology, and the urgent all-caps headline “IS PRIVACY DEAD?”


Four decades on, Newsweek is dead, but we’re still asking the same question.

Every generation or so, our notions of privacy are challenged by a new technology. In the 1880s (when Warren and Brandeis developed the first privacy jurisprudence) it was photography and telegraphy; in the 1970s it was computing and consumer electronics. And now it’s the Internet, a revolution that has virtually everyone connected to everyone else (and soon everything) everywhere, and all of the time. Some of the world’s biggest corporations now operate with just one asset – information – and a vigorous “publicness” movement rallies around the purported liberation of shedding what are said by writers like Jeff Jarvis (in his 2011 book “Public Parts”) to be old fashioned inhibitions. Online Social Networking, e-health, crowd sourcing and new digital economies appear to have shifted some of our societal fundamentals.

However, the past decade has seen a dramatic expansion in the number of countries legislating data protection laws, in response to citizens’ insistence that their privacy is as precious as ever. And consumerized cryptography promises absolute secrecy. Privacy has long stood in opposition to the march of invasive technology: it is the classical immovable object met by an irresistible force.

So how robust is privacy? And will the latest technological revolution finally change privacy forever?

Soaking in information

We live in a connected world. Young people today may have grown tired of hearing what a difference the Internet has made, but a crucial question is whether relatively new networking technologies and sheer connectedness are exerting novel stresses to which social structures have yet to adapt. If “knowledge is power” then the availability of information probably makes individuals today more powerful than at any time in history. Search, maps, Wikipedia, Online Social Networks and 3G are taken for granted. Unlimited deep technical knowledge is available in chat rooms; universities are providing a full gamut of free training via Massive Open Online Courses (MOOCs). The Internet empowers many to organise in ways that are unprecedented, for political, social or business ends. Entirely new business models have emerged in the past decade, and there are indications that political models are changing too.

Most mainstream observers still tend to talk about the “digital” economy but many think the time has come to drop the qualifier. Important services and products are, of course, becoming inherently digital and whole business categories such as travel, newspapers, music, photography and video have been massively disrupted. In general, information is the lifeblood of most businesses. There are countless technology billionaires whose fortunes have been made in industries that did not exist twenty or thirty years ago. Moreover, some of these businesses only have one asset: information.

Banks and payments systems are getting in on the action, innovating at a hectic pace to keep up with financial services development. There is a bewildering array of new alternative currencies like Linden dollars, Facebook Credits and Bitcoins – all of which can be traded for “real” (reserve bank-backed) money in a number of exchanges of varying reputation. At one time it was possible for Entropia Universe gamers to withdraw dollars at ATMs against their virtual bank balances.

New ways to access finance have arisen, such as peer-to-peer lending and crowd funding. Several so-called direct banks in Australia exist without any branch infrastructure. Financial institutions worldwide are desperate to keep up, launching amongst other things virtual branches and services inside Online Social Networks (OSNs) and even virtual worlds. Banks are of course keen to not have too many sales conducted outside the traditional payments system where they make their fees. Even more strategically, banks want to control not just the money but the way the money flows, because it has dawned on them that information about how people spend might be even more valuable than what they spend.

Privacy in an open world

For many of us, on a personal level, real life is a dynamic blend of online and physical experiences. The distinction between digital relationships and flesh-and-blood ones seems increasingly arbitrary; in fact we probably need new words to describe online and offline interactions more subtly, without implying a dichotomy.

Today’s privacy challenges are about more than digital technology: they really stem from the way the world has opened up. The enthusiasm of many for such openness – especially in Online Social Networking – has been taken by some commentators as a sign of deep changes in privacy attitudes. Facebook's Mark Zuckerberg for instance said in 2010 that “People have really gotten comfortable not only sharing more information and different kinds, but more openly and with more people - and that social norm is just something that has evolved over time”. And yet serious academic investigation of the Internet’s impact on society is (inevitably) still in its infancy. Social norms are constantly evolving but it’s too early to tell if they have reached a new and more permissive steady state. The views of information magnates in this regard should be discounted given their vested interest in their users' promiscuity.

At some level, privacy is about being closed. And curiously for a fundamental human right, the desire to close off parts of our lives is relatively fresh. Arguably it’s even something of a “first world problem”. Formalised privacy appears to be an urban phenomenon, unknown as such to people in villages where everyone knew everyone – and their business. It was only when large numbers of people congregated in cities that they became concerned with privacy. For then they felt the need to structure the way they related to large numbers of people – family, friends, work mates, merchants, professionals and strangers – in multi-layered relationships. So privacy was born of the first industrial revolution. It has taken prosperity and active public interest to create the elaborate mechanisms that protect our personal privacy from day to day and which we take for granted today: the postal services, direct dial telephones, telecommunications regulations, individual bedrooms in large houses, cars in which we can escape for a while, and now of course the mobile handset.

In control

Privacy is about respect and control. Simply put, if someone knows me, then they should respect what they know; they should exercise restraint in how they use that knowledge, and be guided by my wishes. Generally, privacy is not about anonymity or secrecy. Of course, if we live life underground then unqualified privacy can be achieved, yet most of us exist in diverse communities where we actually want others to know a great deal about us. We want merchants to know our shipping address and payment details, healthcare providers to know our intimate details, hotels to know our travel plans and so on. Practical privacy means that personal information is not shared arbitrarily, and that individuals retain control over the tracks of their lives.

Big Data: Big Future

Big Data tools are being applied everywhere, from sifting telephone call records to spot crimes in the planning, to DNA and medical research. Every day, retailers use sophisticated data analytics to mine customer data, ostensibly to better uncover true buyer sentiments and continuously improve their offerings. Some department stores are interested in predicting such major life changing events as moving house or falling pregnant, because then they can target whole categories of products to their loyal customers.

Real time Big Data will become embedded in our daily lives, through several synchronous developments. Firstly computing power, storage capacity and high speed Internet connectivity all continue to improve at exponential rates. Secondly, there are more and more “signals” for data miners to choose from. No longer do you have to consciously tell your OSN what you like or what you’re doing, because new augmented reality devices are automatically collecting audio, video and locational data, and trading it around a complex web of digital service providers. And miniaturisation is leading to a whole range of smart appliances, smart cars and even smart clothes with built-in or ubiquitous computing.

The privacy risks are obvious, and yet the benefits are huge. So how should we think about the balance in order to optimise the outcome? Let’s remember that information powers the new digital economy, and the business models of many major new brands like Facebook, Twitter, Four Square and Google incorporate a bargain for Personal Information. We obtain fantastic services from these businesses “for free” but in reality they are enabled by all that information we give out as we search, browse, like, friend, tag, tweet and buy.

The more innovation we see ahead, the more certain it seems that data will be the core asset of cyber enterprises. To retain and even improve our privacy in the unfolding digital world, we must be able to visualise the data flows that we’re engaged in, evaluate what we get in return for our information, and determine a reasonable trade of costs and benefits.

Is Privacy Dead? If the same rhetorical question needs to be asked over and over for decades, then it’s likely the answer is no.


Workday powers on - adds more to its plate


The Workday Rising conference has concluded and - fair to say - it was a very good event for Workday. There is something naturally exhilarating for customers, partners and the vendor when there is significant growth - everything gets bigger and better, including a user conference like Rising. And customers and partners are invigorated to see more companies on board, doing the same as they are.
 


Let's go through the announcements and the findings from the keynote and the Technology Summit, coupled with the takeaways.


Recruiting sees light of day

Workday showed the new recruiting application, which fills a key functional gap on the talent management side - a gap that became more sensitive for Workday with Oracle's acquisition of Taleo, the go-to partner for Workday before that acquisition. And Workday deserves credit for not only building the recruiting software but delivering it mobile first, using the known mobile platforms Workday has used - native iOS and Android apps and HTML5 for the rest of the devices. Potentially we may see HTML5 only here - but more on that later.

The new mobile recruiting screens look polished and it is evident that Workday spent a lot of time getting the usability right. Despite all the lip service paid to mobile first in the enterprise software industry, Workday may be the first vendor delivering mobile first for a major piece of automation. Behind the scenes, recruitment, due to its circular and parallel nature, required enhancements to Workday's process management capabilities, which used to be strictly linear - an enhancement that will benefit Workday functionality beyond recruiting.






But it's a first release of a major piece of business automation - so not everything can be done and addressed - and it will be important to understand how the roadmap of recruitment functionality pans out beyond this first release. Major functional components, e.g. sourcing and the very start of recruiting in connection with talent pools, need to be defined and clarified further.




BigData Analytics - a new flavor

One of the key announcements was around BigData Analytics, in which Workday allows users to query, import and transfer data from Hadoop clusters (the BigData component) through a business-user-friendly interface (with friendly support from Datameer) into the Workday object model. The analytics are then the visual representation of this imported content in combination with the data already residing in Workday on the HCM and Financials side. And Workday builds templates that help both with the aggregation and extraction of the data and with its representation in a new dashboard.
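To make the flow concrete, here is a minimal sketch of the extract-map-aggregate pattern described above. The data, field names and functions are hypothetical illustrations - not Workday's or Datameer's actual APIs:

```python
# Illustrative sketch only: mimic the described flow of pulling rows from an
# external (Hadoop-style) source, mapping them onto an internal object model,
# and merging them with in-app data for a dashboard. All names are made up.

def map_external_rows(rows, field_map):
    """Rename external fields to the internal object model's field names."""
    return [{field_map[k]: v for k, v in row.items() if k in field_map}
            for row in rows]

def merge_for_dashboard(internal, external, key):
    """Join internal records with mapped external rows on a shared key."""
    ext_by_key = {row[key]: row for row in external}
    merged = []
    for rec in internal:
        combined = dict(rec)                           # keep internal data
        combined.update(ext_by_key.get(rec[key], {}))  # enrich with external
        merged.append(combined)
    return merged

# Internal HCM data plus benchmark rows pulled from an external cluster.
hcm = [{"role": "analyst", "salary": 70000}]
benchmarks = [{"job_title": "analyst", "market_median": 75000}]
mapped = map_external_rows(
    benchmarks, {"job_title": "role", "market_median": "market_median"})
dashboard = merge_for_dashboard(hcm, mapped, "role")
# dashboard rows now combine internal and external fields side by side
```

The point of the sketch is that the heavy lifting is a field mapping plus a join - which is also why, as discussed below, one can argue about whether this constitutes analytics or reporting.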






The approach makes sense for Workday in order to be able to offer the best content aggregated - or queried - from an unstructured NoSQL system like a Hadoop cluster. And the example around the compensation planning scenario, with the selection of compensation data coming from Deloitte and IBM / Kenexa, is a great showcase. It will be interesting to see if customers could aggregate and mix and match across both (and more) sources - which would be a key value add.

Workday is one of the first enterprise vendors to take up Hadoop capabilities and deserves credit for pioneering the space - but like the classic data warehouse vendors, Workday wants the Hadoop content in its own transactional - in this case object - space. That, however, defeats the idea of a non-structured database like Hadoop - unless you could show a dynamic, on-the-fly morphing of that object model into anything the result set may require, and then drive to decisions or recommended actions powered by (real) analytics - the kind that perform, or at least suggest, an action.

But we didn't see anything like this - so, not surprisingly, one of the Workday design partners stated that what Workday fancily and buzzword-heavily calls BigData Analytics is internally simply called reporting. And maybe you should call it Workday's new dashboarding capability with an ETL that can merge unstructured content into its dashboards.

So Workday deserves credit for a first start - but a lot needs to be covered to make this a BigData solution (e.g. consider putting the Workday data into the Hadoop clusters for some higher insight potential) and to add real analytical capabilities. Workday will also need to address the flexible representation of the result set - in a much more visually appealing way than in this release. The best dashboarding content can look tired and old when coupled with pedestrian visualization capabilities. Workday should think about upgrading its visualizations or risk being marginalized by the likes of Tableau and Qliktech for HCM dashboard needs.




Bye bye wheel - welcome new UI

The famous Workday wheel seems to have found its way to the scrapyard - and though odd, it was a strong, unique identifier for the Workday UI. Gone with the wheel is also Adobe's Flex technology - a good move by Workday to get rid of some technical debt, i.e. in its user interface layer.





The new UI looks clean and takes advantage of all the nice HTML5 capabilities like re-sizing, zooming and moving objects. The wheel gets replaced by a more common menu, and we saw the usual search options and drop-down menus.

Like all other recent HTML5 UIs shown (e.g. SAP's Fiori and Infor come to mind), information density seems to be a victim of the new technology. The jury is still out on what this means for usability - but it certainly makes the life of the vendors easier. We also did not see any detailed transactional screens, and that's where the rubber hits the road as far as new user interfaces and usability are concerned.




Welcome Workday Student

The other major announcement was the creation of Workday's first vertical application: a higher education solution. If anyone doubted how long Duffield would still be active in the business, re-watch the keynote video - there is no doubt that announcing new software is like a fountain of youth. And a ton of fun. And with Workday Student not expected to ship before 2014 / 2015, we will see Duffield around for a significant time to come.





And the move to HigherEd is a common path Duffield companies have taken - to the point that some customers are now leapfrogging from his second-to-last company, past the last one (Peoplesoft), to the new Workday Student system.

One would be naive to think that the release of the new recruiting system and the announcement of a HigherEd vertical are pure coincidence - there are nice synergies between the two, starting with the recruitment pipeline, the identification of skills and potential, etc. On the flip side, Workday will most likely face a set of not-so-usual competitors on campus, as we should not be surprised to see large internet properties like Facebook and LinkedIn pursuing the student market. This is not the usual competition, and Workday will have to show that it can provide a consumer-grade UI for the Millennial users that form today's student body while competing with this new set of rivals.

On the architecture side it remains to be seen how Workday will isolate and abstract the new vertical functionality. This is particularly interesting since Workday's business logic resides in an object model, and given the poor track record of the overall enterprise software industry in creating and maintaining vertical functionality on top of moving horizontal function pools.




Little noticed... OpenStack

At the Tech Summit it was a nice surprise to see OpenStack / Grizzly on the slides. Workday had to build its own proprietary cloud tech stack when the company started out, as there were no standards or alternatives. But now these have emerged, and it is key that Workday finds a way to tap into the larger (and cheaper) compute and other IT resources offered by the cloud infrastructure vendors... And while Workday has done this with AWS for development and test systems, it's still an open question for production systems. Equally, we know that HP has announced that Workday will run in the HP OpenCloud - so OpenStack is a very good path for Workday to be able to run on HP's (and other) public clouds.

More importantly it gives Workday the technical capability - yes, horribile dictu - to deploy on premise. And while it may disappoint some cloud purists, this is a smart move given that we live in the age of the PRISM / NSA scandal fallout, and vendors are wise to be able to deploy to both the public and the private cloud. Customers may well ask and demand that capability in the next quarters.

And as we speak about Workday and public clouds, it's worth mentioning that the BigData Analytics offering runs on top of the AWS Hadoop service. It is better for Workday to use AWS than to build up its own infrastructure. It will be interesting to see if customers care - but so far nothing critical in this regard has come up.




Less can be more

But apart from the new functionality seen, the major change affecting Workday customers is that the company is switching from three to two releases per year. This is a good move, as the previous pace put some onus on customers: even though it's SaaS and these upgrades / updates are supposedly easy, they still take a toll on the users in the client organizations. It also equates to significant cost savings, since instead of integration and regression testing the scope three times a year, Workday customers will now do this only twice a year. Certainly a welcome change. And in my experience a six-month cycle is much more manageable for customers and is a good compromise between productive application usage and making innovation in automation available.






Moreover, Workday will put a preview environment into the release cycle, before production and after the sandbox - a good move to allow customers more time to familiarize themselves with and test a new update.






Behind the scenes the company has moved to a single code line - an application development feat seldom achieved. Workday seems to have found a way to simply flag the code and objects that need to go into a release or version. This gives Workday the ability to release functionality at will, and also - shocking, too - the ability to move off the one-release lockstep mechanism for customers.
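A minimal sketch of how such per-release flagging could work - the feature names and release labels here are hypothetical, not Workday's actual mechanism:

```python
# Hypothetical sketch of per-release feature flagging on a single code line:
# all code ships together, but each feature is tagged with the releases it
# belongs to, so different customers can run different release levels of the
# same code base. Feature names and release labels are invented.

FEATURE_RELEASES = {
    "recruiting": {"2013R2", "2014R1"},  # available from 2013R2 onward
    "new_ui": {"2014R1"},                # only in the newest release
}

def is_enabled(feature, customer_release):
    """A feature is active only if the customer's release includes it."""
    return customer_release in FEATURE_RELEASES.get(feature, set())
```

A customer still on "2013R2" would see recruiting but not the new UI, while a "2014R1" customer sees both - which is how one code line can serve customers off the release lockstep.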

Putting my old product developer hat on, this is a key change for Workday and its customers, and it may well have been influenced by larger functional deliveries coming down the road, e.g. the new user interface, recruitment etc. It is a huge flexibility win for customers and Workday. That said, it was also good to see the constant tuning and improvement mentality shared in detail at the Technology Summit - certainly a best practice, and good to see it lived so well by Workday product executives.




Financials

The Financials side of the product seems to be doing well with Workday adding capabilities and scalability to qualify and play in the BigFin market space. The requirements of the Financial product are a good alternate stress test of the Workday platform compared to the HCM load.

My major takeaway is that the Financials side is also a defensive move for Workday. Previously I blogged about Peoplesoft being challenged at the CxO level by SAP and Oracle, even on closed sales - so the former Peoplesoft DNA does not want to repeat that phenomenon, and being able to expand and own Financials as a footprint is one strategic move to keep Peoplesoft history from repeating for Workday.

But on that front Workday missed the chance to pitch to the predominantly HCM-centric user base present in the keynote audience, as the value proposition of an integrated HCM and Finance system was not clearly articulated. It will not make these HCM users go and see their Finance counterparts as soon as they are back in their offices from Workday Rising. But what hasn't happened has the charm of still being able to happen in the future.




The road ahead

With recruiting shipping, the major piece that Workday still misses on the HCM side is Learning - and the company for now is partnering in this area. But in a Q&A Bhusri was pretty clear that down the road Workday will address this functional gap - and not with an acquisition. That would be true to the Workday tradition of build, not buy.






On the payroll side the company plans to ship the UK and France payrolls in 2015 and 2016 respectively. The overall thinking of the management team is that with 7-8 native Workday payrolls the company can cover enough ground in terms of critical user base. The rest is planned to be addressed with partners, with plans to extend and expand the payroll interface for those partners. This will still pose some headaches for international customers, who will have to look for partners and run interfaces that could break - every and any pay cycle. And what we heard more than once from customers on stage was that the key value for them was ... integration.




The non product challenge

Workday will have to keep growing revenues and customers and expand its global presence at a fast and steady pace to keep investors happy. Japan was announced at the conference as a new country Workday will operate in. The good news for Workday is that it can source from its former Peoplesoft bench - but the competitive landscape is changing, and the pressure to deliver numbers has not eased; it has only increased.




MyPOV

If you were - like yours truly - very concerned that Workday already has a lot on its plate, then the company has certainly added more to it. But executives seem confident that they can deliver - and to their credit, they have so far delivered. Shipping a thought-leading recruiting product will be a key milestone. The technical challenges seem to be under control and some weaknesses are turning for the better - so 2014 will be a key year for Workday to continue to deliver on product and to master the non-technical growth challenges as well.

---------------------------------------------------------------
If you missed it - check out the Storify collection here.
Wondering about my pre-Rising questions? I will come back to them in a later post.


Why SAP acquired KXEN? Getting serious on Analytics


With this morning's announcement of SAP's intention to acquire San Francisco-based analytics vendor KXEN, we may be witnessing the beginning of the fall season for acquisitions by the usual suspects.

 

Actually, that prize may go to IBM, which finalized its acquisition of Trusteer the other week - but more on that later.
 

Real vs faux analytics

To clarify this post for the novice reader: we only refer here to real analytics - the kind that, according to the definition, either recommends an action or even performs an automated action. That's what the Greek -tics suffix is all about... Unfortunately some marketeers found that analytics is a nice new word open for abuse as a buzzword and re-purposed it for the dinosaurish-sounding reporting, the ancient BI and the more recent dashboarding. None of these are analytics as we mean it here... and as KXEN provides it. Look for Jonathan Becher and team to sort this out, clean it up and land on the true analytics side soon. (More on this here.)
 

SAP and Analytics

This is a long story that I won't put down in this blog - but it is a story that was finally coming together. SAP went from having no story (before the Business Objects acquisition), to a wrong story (when it was partnering with SPSS), to a confusing story that needed to be explained - but one that was improving.
 

 

Again - I won't dive into the details - but a complex story it is. What it was lacking was the tooling to build analytical applications - a tool that gives, ideally, the business user a real chance to solve a complex business question and take the actions to steer towards the desirable outcome.

So far no vendor in the analytics field has been able to put these tools in the hands of business users - at best the power users - and I have blogged about the quest for the holy grail of analytics before...
 

The case for KXEN

KXEN has built a suite of analytics products mainly around scoring algorithms and data mining - and the good news is that these are easy for business users to understand... almost anyone who has been through higher statistics or business classes has made decisions with weighting and scoring, and has at least heard of data mining.
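To illustrate the kind of weighted scoring these tools make approachable, here is a minimal, purely hypothetical sketch - the attribute names, weights and records are invented for illustration and are not from KXEN's products:

```python
# Hypothetical weighted-scoring example: rank leads by a weighted sum
# of normalized attributes - the kind of decision logic business users
# already know from statistics or business classes.

def score(record, weights):
    """Weighted sum of attribute values; higher means more promising."""
    return sum(weights[k] * record.get(k, 0) for k in weights)

# Assumed attribute weights, chosen for illustration only.
weights = {"recency": 0.5, "frequency": 0.3, "monetary": 0.2}

leads = [
    {"name": "A", "recency": 0.9, "frequency": 0.4, "monetary": 0.7},
    {"name": "B", "recency": 0.2, "frequency": 0.8, "monetary": 0.9},
]

# Highest score first - the "recommended action" is to work lead A first.
ranked = sorted(leads, key=lambda r: score(r, weights), reverse=True)
print([r["name"] for r in ranked])
```

The point is not the arithmetic but the transparency: a business user can read the weights and argue about them, which is exactly what makes scoring approachable.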

So with KXEN, SAP gets tools that allow it to drive to analytical decisions not only on classic on-premise data but also on unstructured data - both of which SAP could not easily do before, and in most practical circumstances would have had to resort to a model built in R. And while R is a good choice for SAP overall - it only caters to the geeky data modeller and statistician, not to end users...

Additionally, KXEN's forays into sentiment analysis and recommendation, as well as the Genius product tailored to marketers, provide interesting products for SAP to leverage. Most interesting will be the packaged apps that KXEN has built... on top of salesforce.com.
 

An SAP internal note

It's surprising that the quote in the press release comes from unusually low in the organization - Michael Reh - which may point to a less board-centric communication strategy, but potentially also a more collegiate and decentralized acquisition strategy. The hope is that this is not a signal that analytics matters less to SAP.
 

The competitive angle

One cannot think of SAP acquiring KXEN without thinking of the IBM acquisition of Trusteer. We blogged earlier that SAP wants to be a technology company - and then you need to react if one of your key competitors - and IBM has become that for HANA in the last 3-6 months - makes a strategic acquisition that propels them forward. And KXEN is a good reaction - one that even one-ups IBM, given KXEN's stronger end-user focus compared with Trusteer.

And then - ahh - the irony - it makes SAP a salesforce partner. A large group of KXEN's executive team comes from salesforce.com (BI / reporting tools and Service Cloud), and KXEN had (rightfully) decided to put a number of their models in the cloud on top of salesforce.com. Let's expect grown-up reactions from both companies as another co-opetition relationship forms. The fun fact: it will be SAP's KXEN running on the salesforce cloud infrastructure using data in an Oracle database.
 

Implications for Customers

There is no reason for KXEN customers to fear the acquisition - assuming SAP diligently secures the management and expertise - and, if concerned, they should secure support and commitments now for as long as can be negotiated. But understanding what SAP plans to do with KXEN on top of HANA will be interesting and probably valuable for KXEN customers - so they should at least wait for these plans to crystallize, which should happen soon.

For SAP customers looking elsewhere, confused and potentially disillusioned by the current state of analytics - this is exciting news, and they should press hard for the road map of integrated offerings.
 

Implications for Partners

While many services companies in the IT field may be looking at how successfully IBM acquires analytics companies and makes a services play out of the business, it's too early to tell if SAP can create a similar services ecosystem around analytics. In general it's worth watching and looking for value-added services that can be built or productized on top of HANA.
 

Implications for SAP

SAP gains a very good analytics company and now needs to maximize the return for mutual benefit of customers and ecosystem. It has taken a long time to create a good analytics story - now, one can only hope it becomes a great one.
 

MyPOV

A good and fitting acquisition for SAP that will make HANA a better competitor in the (true) analytics space, with SAP gaining an end-user-friendly tool, some interesting packaged apps, and a thorough data mining and scoring engine with the expertise behind it. If you are a fan of (true) analytics like yours truly - then this is a great move.


SaaS Adoption Trends and Customer Experience Report Published by Constellation Research


SaaS Adoption Gains Ground, Based on Outstanding Economic Characteristics

Irvine, CA – September 10, 2013. Constellation Research, Inc., the research and advisory firm focused on how disruptive technologies transform business models, announced today the publication of "SaaS Adoption Trends and Customer Experience" by Constellation Vice President and Principal Analyst Frank Scavo. Based on survey results from Computer Economics, this report documents the increasing adoption levels and investment rates for SaaS applications across all categories and recommends best practices for buyers in evaluating and contracting with SaaS providers.

This new report reveals:

  • The percentage of organizations investing in 10 categories of SaaS applications, along with an extensive list of vendors in each category.
  • Analysis of 11 key benefits of SaaS along with 10 concerns, ranked by importance in the minds of buyers.
  • Customer preference for multi-tenant versus single-tenant applications.
  • A summary of SaaS adoption and investment rates by organization size and region of the world along with the growing popularity of SaaS over the past two years.
  • A summary of the ROI and TCO experiences of organizations that have implemented SaaS.
  • Seven best practices for SaaS buyers, in light of the differences between SaaS and on-premises systems.

“Software-as-a-Service has the strongest economic characteristics of any technology we surveyed,” said Frank Scavo, the report’s author. “But that doesn’t relieve buyers of their responsibility to make intelligent decisions. We wrote this report to document the current state of SaaS adoption and to give buyers some guidance on how to evaluate and contract with SaaS providers.”

This report fits into Constellation’s business-focused research themes of Technology Optimization and Innovation and the Consumerization of IT.

Download the report snapshot

ABOUT FRANK SCAVO
Frank Scavo is Vice President and Principal Analyst covering topics in IT strategy, IT management metrics, and enterprise applications. He is also the President of Computer Economics, an IT metrics research firm.

COORDINATES
Profile:
http://www.constellationr.com/users/fscavo
Twitter: @fscavo
Linkedin: www.linkedin.com/in/frankscavo
Geo: Irvine, CA

THE REPORT
More information about "SaaS Adoption Trends and Customer Experience" may be found here: http://constellationr.com/research/saas-adoption-trends-and-customer-experience

Press Contacts:
Contact the Media and Influencers relations team at [email protected] for interviews with analysts.

Sales Contacts:
Contact our sales team at [email protected].


News Analysis: Clarabridge Raises $80M in Funding For Expansion


General Catalyst, Summit, and Yuchun Lee To Take Clarabridge To Next Phase Of Growth

Rapidly growing Reston, VA-based Clarabridge announced on September 10, 2013 an $80 million round of capital. Founded in 2006, Clarabridge is a leading provider of customer experience solutions. The funding announcement is significant as Clarabridge:

  • Invests in global expansion and product innovation. General Catalyst Partners, Summit Partners, and Yuchun Lee invested in the latest round. Clarabridge intends to apply the investment towards global expansion, accelerated product innovation, and strategic transactions. Key customers include B/E Aerospace, Best Buy, Charming Shoppes, Inc., Choice Hotels, Dell, Expedia, E.ON, Fidelity, Gaylord Hotels, Government of British Columbia, Intuit, J.D. Power, L’Oréal USA, Marriott International, PetSmart, QVC Inc., Sage North America, United Airlines, Walmart, Walgreens, and Wendy’s International.

    Point of View (POV): With revenue growth of over 150% over the past 3 years, Clarabridge plans to expand beyond its latest entry into San Francisco and London.  The CEM vendor has the opportunity to build out new geographic markets while expanding industry reach in auto, CPG, finance, healthcare, hospitality, insurance, manufacturing, pharma, restaurants, retail, technology, telecommunications and travel.  Moreover, as the CEM space continues to evolve, Clarabridge now has a war chest to acquire new technologies or engineering talent as the market continues to expand and large legacy vendors acquire to consolidate.
  • Brings on experienced investors and board-level expertise. Previous board members included David Blundin of Link Ventures, Don Rainey of Grotech Ventures, John Glushik of Intersouth Partners, Jonathan Perl of Boulder Ventures, and Sanju Bansal, COO of MicroStrategy.  Larry Bohn of General Catalyst Partners and Tom Jennings of Summit Partners will join the board.  Meanwhile, Yuchun Lee will serve as Chairman of the Board.

    Point of View (POV): While the previous board and investors provided the initial catalyst for Clarabridge’s success, the company needed new energy and direction to take it to the next level.  David Blundin and Sanju Bansal remain on the board from the previous set of investors.  With Yuchun Lee as chairman, expect innovative approaches to partnerships, OEM relationships, and the positioning of Clarabridge in a broader customer experience context.

The Bottom Line: Clarabridge Poised For Growth

At the 2013 C3 Customer event, Constellation spent time speaking with over 50 of the 400 clients in attendance.  The range of customers covered key brands across a variety of industries.  Customers chose Clarabridge for a few reasons:

  • Intuitive user experience
  • Ability to handle multiple sources of engagement
  • Global language support
  • Self learning systems

Customer centricity is now a strategic imperative and requires first-rate orchestration, company-wide commitment, and leadership.  This new round of funding - and, more importantly, the board-level expertise - will help Clarabridge build, acquire, and partner for the capabilities that customers will need to succeed.

Your POV.

What’s your plan to achieve customer centricity? Are you embarking on a digital business transformation?  Let us know how it’s going!  Add your comments to the blog or reach me via email: R (at) ConstellationR (dot) com or R (at) SoftwareInsider (dot) com.

Please let us know if you need help with your Matrix Commerce and Digital Business transformation efforts.  Here’s how we can assist:

  • Assessing matrix commerce readiness
  • Developing your digital business strategy
  • Vendor selection
  • Implementation partner selection
  • Connecting with other pioneers
  • Sharing best practices
  • Designing a next gen apps strategy
  • Providing contract negotiations and software licensing support
  • Demystifying software licensing

Related Resources

Reprints

Reprints can be purchased through Constellation Research, Inc. To request official reprints in PDF format, please contact Sales.

Disclosure

Although we work closely with many mega software vendors, we want you to trust us. For the full disclosure policy, stay tuned for the full client list on the Constellation Research website.

* Not responsible for any factual errors or omissions.  However, happy to correct any errors upon email receipt.

Copyright © 2001 – 2013 R Wang and Insider Associates, LLC All rights reserved.
Contact the Sales team to purchase this report on an à la carte basis or join the Constellation Customer Experience!

 

David Blundin
Board Member

David Blundin is the founder of Link Ventures. He is also the CEO of Vestmark, Inc. Previously, David was Chief Technologist at Vignette Corporation which has over 1,000 enterprise customers including seven of the top ten firms in the Financial Services sector. Prior to Vignette, David was the founder, CEO and Chairman of DataSage, Inc. DataSage was acquired by Vignette in a half-billion dollar transaction in January of 2000. DataSage’s software enabled businesses to centrally capture and analyze all of their electronic interactions – often involving terabytes of information – and use that information to personalize interactions with customers. In 1998 and 1999 DataSage was selected to the DataWarehouse 100. In 1999 DataSage received a DM Review World Class Solution Award and was selected to Computerworld’s 100 Emerging Companies to Watch. Prior to 1997, David was the President and CEO of Cirrus Recognition Systems, a data mining software company. Previously, he was among the first ten employees at MicroStrategy and, subsequently, served on Microstrategy’s board of directors in 2003 and 2004. David received a BS in Computer Science degree from MIT where he researched neural network technology at the AI lab. He serves on the boards of several high-tech start-ups and is also a co-founding board member of a hedge fund that employs data mining techniques.

Don Rainey
Board Member

Don Rainey joined Grotech Ventures in September, 2007 as a General Partner and focuses his investment activities on software and e-commerce technologies. Until recently, Don was a General Partner in Intersouth Partners, a venture capital firm based in Durham, North Carolina.

Formerly, Don was an entrepreneur with a track record of a number of successful businesses. He was President of Attitude Network, one of the first entertainment networks online, which was sold to TheGlobe.com. He also served as a founding member of the Board of Directors for Accipiter through its acquisition by CMGI and was COO of DaVinci Systems, which was sold to ON Technology, enabling a combined entity IPO four months later. Don has also held senior executive positions with Novell and the IBM Corporation. Currently, Don is an Adjunct Professor at the University of Maryland, where he teaches graduate courses on new venture creation and finance. Don is also among a select group of venture capitalists chosen to serve as a consultant to the Chief Information Officer of the US Department of Defense, through the DeVenCI program, advising on emerging technologies.

Don is a member of the boards of ARPU, Zenoss, Mid-Atlantic Venture Association (MAVA), March of Dimes – Capital Area, Virginia Tech Intellectual Properties Foundation and Mindshare, an organization that helps CEOs from the most promising start-ups in the Greater Washington Metropolitan region build long-term sustainable companies. Previously, Don has served on the boards of Covega, Defywire, Maxcyte, Artifact Software, Flatburger, Enterprise Investment Advisory Committee for the State of Maryland and Investment Advisory Board of the State of Virginia’s Growth Acceleration Program.

Don is a James Madison University Graduate, BBA and holds a MS in Bioscience Management from George Mason University.

John Glushik
Board Member

John Glushik is a general partner with Intersouth Partners, one of the most active and experienced early-stage funds in the Southeast with more than 80 investments in private companies over the last 20 years. Intersouth manages more than $750 million in seven venture capital limited partnerships, the most recent of which was established in May 2006 and totals $275 million. This fund is the largest venture capital fund in North Carolina and one of the largest early stage funds in the Southeast. John has been actively involved with most of the companies in the Intersouth information technology portfolio, serving as a board member and a board observer. His work covers all aspects of venture investment and portfolio management. He has led multiple venture financings and he has managed a number of successful liquidity events. John is an active member of the Council for Entrepreneurial Development (CED), serving on CED’s board of directors and executive committee. He serves on the board of the Florida Venture Forum and he has served as co-chair of the AeA Venture Forum. He also serves on the advisory boards of the Entrepreneurs Foundation of the Southeast and Southeast TechInventures. He teaches as an Adjunct Professor at the Kenan-Flagler Business School at the University of North Carolina. He speaks frequently at North Carolina State University and the Fuqua School of Business at Duke University where he is an Entrepreneur Affiliate. John serves on the Engineering Alumni Council and Devil Fund Board at the School of Engineering at Duke. Prior to Intersouth, John worked as an engineer and consultant in the information technology and aerospace industries. His previous experience includes software development, telecommunications engineering, data communications research and strategic market consulting. He holds a B.S. in mechanical engineering and materials science from Duke, an M.S. 
in aeronautics and astronautics from the Massachusetts Institute of Technology and an M.B.A. from the J.L. Kellogg Graduate School of Management at Northwestern University.

Jonathan Perl
Board Member

Mr. Perl is a General Partner at Boulder Ventures and has more than 12 years of investing experience, focused primarily on early-stage information technology companies. He led Boulder Ventures’ successful investments in iLumin Software (acquired by Computer Associates) and Era (acquired by SRA International), and is on the Boards of Millennium Pharmacy Systems, Zenoss, and Metron Aviation. Mr. Perl is also a Board Member of The Mid-Atlantic Venture Association (MAVA) and a National Child Research Center (NCRC) Trustee and Chair of its Finance Committee. He is Co-Chair of MAVA’s Capital Connection for both 2009 and 2010.

Mr. Perl holds an MBA from the Amos Tuck School of Business at Dartmouth College and a BA (magna cum laude) in classical history from Tufts University.

Sanju Bansal
Board Member

Sanju K. Bansal has served as Executive Vice President and Chief Operating Officer since 1993 and was previously Vice President, Consulting since joining MicroStrategy in 1990. He has been a member of the Board of Directors of MicroStrategy since September 1997 and has served as Vice Chairman of the Board of Directors since November 2000. Prior to joining MicroStrategy, Mr. Bansal was a consultant at Booz Allen & Hamilton, a worldwide technical and management consulting firm, from 1987 to 1990. Mr. Bansal received an S.B. in Electrical Engineering from the Massachusetts Institute of Technology and an M.S. in Computer Science from The Johns Hopkins University.

 


SNA Finalists Data-to-Decisions


Data-to-Decisions [D2D] is one of the broadest business research themes that Constellation Research covers. As such, the SuperNova Award entries cover a wide range of data collection, management and analysis for every aspect of organizational and personal decision making. The judges had quite a decision of their own to make, selecting the most innovative and disruptive uses of data to impact decision making from among 28 applicants representing just as many varied use cases. The submissions apply some of the most interesting data sets to the most advanced decision challenges facing enterprises, governments, communities and individuals today. I want to congratulate the 11 finalists in D2D and introduce you to them. For it is you - the reader of this blog, those interested in disruptive, innovative decision making, the public who find this page - who will decide the winner from these finalists. Voting is open to the public from September 9 through October 9, 2013 on the Constellation site. So, here are the innovators and organizations found to be the most innovative of the SuperNova Award D2D applicants.

Vote for the SuperNova Awards

Ashish Braganza, Senior Manager of Global Business Intelligence, Lenovo

Lenovo, the world’s largest PC vendor, is a US$34 billion personal technology company serving customers in more than 160 countries. The challenge facing Lenovo is to maintain that top spot. To do so, Lenovo created the Global Business Intelligence [GBI] team and tasked it with improving website conversion rates and providing an eight-fold return on the investment in the GBI team. Through predictive analytics, optimization and data, using Adobe Systems' Adobe Marketing Cloud [Analytics, Social, Media Optimizer, Target and Experience Manager solutions], the GBI team was able to directly impact the corporate financial stream, creating a five-fold increase in click-through and boosting revenue per customer by 26 percent.

Brad Donovan, Manager, Agile Analytics and Innovation, GlaxoSmithKline

GSK employees serve US communities by discovering, developing, and delivering new medicines, vaccines, and other healthcare products to help people do more, feel better, and live longer. Unfortunately, a long-established commitment to advanced statistics - and what is now called data science - had led to a situation wherein the IT and analytics communities could not respond when Marketing and Sales needed new insight. Teradata, IBM and SAS came on as an advisory council to verify or refute IT's objections to the new Agile Analytics & Innovation group using data directly from the EDW by way of Teradata Data Labs. This is a move away from an Analytics team seen as a bunch of PhD statisticians using SAS and pushing out one-time models. Using the Teradata EDW, Data Labs and analytics partners, the Agile Analytics & Innovation group took data aggregation from 30 hours to 3 minutes, model execution from 40 hours to 1 hour, and QC from 40 hours to 5 hours - but most importantly they took the predictive models of the Analytics team from a "bright shiny object" to an actionable solution in production, accessible by thousands of business analysts throughout the company. In many ways this is a reverse of the current trend towards creating data science teams, and it shows how companies that are just moving forward with data science can leverage and productionalize the improved inferences and predictions stemming from computational statistics, data mining and machine learning.

Bruce Yen, Director of Business Intelligence, Guess?

Established in 1981, Guess?, Inc. began as a denim company and has since successfully grown into a global lifestyle brand that directly operates 511 retail stores in the United States and Canada and 328 retail stores in Europe, Asia and Latin America. To be immediately responsive to customer and business needs, Guess? sought to harness the power of big data in near real time and thus compress the business decision cycle. Guess? Inc. extended the power of the HP Vertica Analytics Platform to power all of its BI and analytics, but the real disruption came with a cutting-edge analytics iPad application, “G-Mobile,” designed for non-traditional Business Intelligence (BI) users. Via G-Mobile, Guess? extends analytics to the business front line - including designers, buyers, planners, sales executives, and allocators - so they can better manage the business with the right data at the right time. This reduced the load window by 50 to 62.5 percent, from 3-4 hours on the legacy platform to 90 minutes at most. It also accelerated speed-to-insight: a merchandise report that might have taken 15 to 20 minutes to generate on the legacy system now takes just five to 20 seconds on the iPad accessing data remotely. And it improved merchandise allocation and distribution of inventory across retail locations, thanks to complex queries - such as sales for all best-sellers - performed 60 to 80 times faster.

Dirk Zeller, Head of IT Consulting at Mercedes-AMG GmbH, Mercedes-AMG GmbH

The image of AMG as the successful performance brand of Mercedes-Benz is reflected in its impressive successes in the world of motorsport and its unique vehicles. Today it is a one hundred percent subsidiary of Daimler AG and is the group's technological spearhead in the high-performance car segment. The AMG brand promise of "Driving Performance" stands for state-of-the-art technology and pure driving excitement. The engine is the key component of every Mercedes-AMG vehicle, allowing the company to deliver on that promise. AMG realized that a key function like engineering could benefit tremendously from real-time analytics to innovate and accelerate all engine-testing processes, which are usually time-consuming (e.g. up to 50 minutes of engine dyno time are wasted in the case of a non-successful engine test run) while the resources, especially dynamometers, are limited. Comparing current engine-testing data with previous test-bench data to evaluate the performance of the engine, for example, was extremely complex and in some cases not even possible. Using SAP HANA, mobile, and predictive products, Mercedes-AMG - in collaboration with SAP AG partner MHP - has built a highly innovative real-time quality assurance platform for the optimization of end-to-end testing processes in development and manufacturing. The solution delivers real-time analytics to engineers on any device, giving them a 360° view of the performance of the engine during all testing phases - also leveraging high volumes of polytechnical data coming from sensors connected directly to the engines. This has reduced the run time of non-successful test runs (before: 50 minutes; after: an immediate halt if parameters are out of range), saved between one and several days of engine-testing capacity - which allows AMG to test more engines in the same period and/or allocate this time to other value-added tasks - and decreased capital expenditures.
The result is a highly scalable platform for the future. The same approach can be applied to other use cases: test vehicles on the test track, test vehicles on long test runs, crash tests, interior testing, and even getting real-time information from the vehicle while it is being used by the customer, allowing for a future of predictive maintenance.
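The "immediate halt if parameters are out of range" idea can be sketched in a few lines. This is a hypothetical illustration, not AMG's or SAP's actual implementation; the parameter names and bounds are invented:

```python
# Sketch of a bounds-check monitor: each incoming sensor reading is
# checked against allowed limits, and the test run stops on the first
# violation instead of wasting the remaining dyno time.

LIMITS = {"oil_temp_c": (60, 140), "rpm": (0, 7200)}  # assumed bounds

def check_reading(reading):
    """Return the first out-of-range parameter name, or None if all are OK."""
    for name, (lo, hi) in LIMITS.items():
        value = reading.get(name)
        if value is not None and not lo <= value <= hi:
            return name
    return None

def run_test(stream):
    """Consume readings until a violation halts the run."""
    for i, reading in enumerate(stream):
        bad = check_reading(reading)
        if bad is not None:
            return f"halted at sample {i}: {bad} out of range"
    return "test completed"

stream = [{"oil_temp_c": 95, "rpm": 3000}, {"oil_temp_c": 150, "rpm": 3100}]
print(run_test(stream))
```

The real platform of course correlates far richer sensor streams against historical test-bench data; the sketch only shows why an early halt saves the bulk of a 50-minute run.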

Karen Simmons, Senior Director, Enterprise Data Warehouse, Kelley Blue Book Co., Inc.

Founded in 1926, Kelley Blue Book delivers the most market-reflective values in the industry on its top-rated website www.KBB.com, including the famous Blue Book® Trade-In and Suggested Retail Values, and Fair Purchase Price. Kelley Blue Book undertook a significant initiative to create a single 360-degree view of online activity and behavior across disparate data sources throughout and from outside the enterprise, Web360. Kelley Blue Book leveraged technology from Informatica, IBM Netezza, MicroStrategy and SAS to launch Web360, thereby replacing a fragmented assortment of data integration tools, as well as proprietary data integration frameworks, with a unified platform approach to integration. The Web360 initiative enables the company to profile, cleanse, integrate and analyze large data sets with complex relationships. This resulted in improved analytics and intelligence, providing an enhanced and more compelling consumer experience, creating increased performance for advertisers who benefit from new and faster data-driven intelligence.

Oswaldo Mestre, Director, Division of Citizen Services, Office of the Mayor, City of Buffalo 311 Call and Resolution Center

The City of Buffalo 311 Call and Resolution Center, in conjunction with the ancillary programs within the Division of Citizen Services, is not just a call center; 311 increases the City's effectiveness in responding to public inquiries, providing insight into the needs and concerns of residents, and promoting accountability by ensuring that services are being delivered in a consistent and timely manner citywide. Using data mostly from 311, the Division of Citizen Services' Operation Clean Sweep identifies areas of the city to send the Clean Sweep team - city, state, county and federal police and health and human service providers - to offer educational outreach, along with beautification crews to address physical issues in the area. Using the KANA LAGAN CRM system, 311 allows the City to track issues, capture locations, and categorize and store each department's issues in one system, thereby enabling each department to prioritize and respond to issues accordingly.

Lance Henderson, CEO, Zamzee

Zamzee uses sensor technology and a community web site to influence families toward healthier lifestyles. Using Bunchball's Nitro gamification platform and avatar engine, Zamzee participated in research studies by HopeLab showing that activity levels and associated health measures were all positively influenced through the gamification process.

Roman Coba, Chief Information Officer, McCain Foods Limited

McCain Foods Limited is an international corporation in the frozen food industry, known for frozen potato specialties and also producing frozen pizza, appetizers, oven meals, juice and desserts. Using a Teradata enterprise data warehouse with MicroStrategy and IBM InfoSphere, McCain Foods measures Optimal Equipment Efficiency (OEE) in real time and projects it to all production employees in an easy-to-understand format, creating a cultural shift through the ability to access and analyze not just data, but data transformed into intelligent, actionable information. One unintended result has been the creation of an extremely competitive workforce: production employees can now see in real time how they compare to other plants, which makes them compete to outdo them.
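For context, OEE is conventionally computed as the product of availability, performance and quality; whether McCain's metric follows exactly this formula is an assumption, and the shift numbers below are invented for illustration:

```python
# Conventional OEE calculation: availability x performance x quality.

def oee(run_time, planned_time, actual_output, target_output,
        good_units, total_units):
    availability = run_time / planned_time      # how much of the shift ran
    performance = actual_output / target_output  # speed vs. ideal rate
    quality = good_units / total_units           # defect-free share
    return availability * performance * quality

# Illustrative shift: 420 of 480 planned minutes run, 3800 of 4200
# target units produced, 3700 of them good.
print(round(oee(420, 480, 3800, 4200, 3700, 3800), 3))
```

Projecting a single number like this to the plant floor is what makes the metric legible to production employees, not just analysts.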

Ronald Baden, VP of Services, Host Analytics

Host Analytics provides cloud-based financial applications for planning, close management, reporting and analytics. Using FinancialForce PSA, Host Analytics moved from Microsoft Word documents and manual processes to the PSA application's detailed reports and dashboards, which allowed the company to better allocate resources based on project needs. This led to Host Analytics customers giving the company the highest combined overall ratings for reported measures in each of the Vendor, Product and Implementation experience categories, as well as other customer service awards.

Russ Turner, Site Reliability Engineering - Manager, Domino’s Pizza

The Domino's Pizza restaurant chain launched a web ordering service in 2007, which created a big data problem: unwieldy amounts of machine log data. Domino's deployed Splunk Enterprise to deal with this data. The learning experience from the initial uses of Splunk led to this machine analytics solution being deployed across the enterprise, beyond the original IT and networking areas, to improve business decisions by visualizing sales trends such as orders per minute, transactions per store, the types of pizza customers are ordering and the coupons they're using to do so. Machine-to-machine (M2M) data analytics allows Domino's to determine the types of devices placing orders, such as iPhones, Androids or Xboxes, and to assess promotions in real time. All of this made Domino's IT team a legitimate source of business insight across all areas of the organization.
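A metric like orders per minute is simply an aggregation over timestamped log events. The sketch below shows the idea in plain Python; the log format and field names are invented for illustration, and in Splunk the equivalent would be a search ending in something like a one-minute `timechart` over the order events.

```python
import re
from collections import Counter
from datetime import datetime

# Toy log format: "2013-06-01 12:00:01 ORDER device=iphone".
# Real web-ordering logs are richer; only the timestamp matters here.
LOG_LINE = re.compile(r"^(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) ORDER device=(?P<device>\S+)")

def orders_per_minute(lines):
    """Count order events per minute from raw log lines."""
    counts = Counter()
    for line in lines:
        match = LOG_LINE.match(line)
        if not match:
            continue  # skip non-order lines, just as a search filter would
        ts = datetime.strptime(match.group("ts"), "%Y-%m-%d %H:%M:%S")
        counts[ts.replace(second=0)] += 1  # bucket into one-minute bins
    return counts
```

The same pass over the logs could count by the `device` field instead of the timestamp, which is essentially how device-mix and coupon-usage views fall out of the same machine data.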

Tony Candeloro, Vice President Product Development, ARI

ARI provides global vehicle fleet management that drives the best results for each of its clients' unique and complex needs. In the US, more than 450,000 fleet vehicles are covered by ARI's maintenance management programs. ARI facilitates maintenance and repairs for these vehicles via a nationwide open vendor network. Currently, the network consists of more than 66,000 vendors with a controlled spend for parts and labor of almost one billion dollars. ARI's commitment to its customers is to match them with the vendor best suited for their vehicle and repair type, to negotiate the best price on the parts and labor costs that get passed through to the customer, and to keep vehicle downtime to a minimum. By leveraging SAP HANA in-memory technology, SAP Xcelsius dashboards and InfoSol's InfoBurst data caching, ARI created a neural data network that relates all vendors within a given radius throughout the US. This data drives dashboards that analyze regional vehicle operating parameters, regional vehicle spend and regional vendor operations to identify opportunities to leverage and target clients' total spend. Correlating this information with geospatial dashboards gives ARI a clear picture to better negotiate discounts on behalf of its clients and to communicate with its vendors about the opportunities that ARI provides to their immediate markets.
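The core geospatial step, relating vendors to a vehicle within a given radius, comes down to a great-circle distance calculation and a filter. Here is a minimal sketch using the standard haversine formula; the vendor records and field names are hypothetical, and ARI's actual HANA implementation would run this kind of query in the database rather than in application code.

```python
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_MILES = 3958.8  # mean Earth radius

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

def vendors_within(vehicle_lat, vehicle_lon, vendors, radius_miles):
    """Return (vendor, distance) pairs inside the radius, nearest first."""
    with_distance = [
        (v, haversine_miles(vehicle_lat, vehicle_lon, v["lat"], v["lon"]))
        for v in vendors
    ]
    return sorted(((v, d) for v, d in with_distance if d <= radius_miles),
                  key=lambda pair: pair[1])
```

Ranking the nearby vendors by distance is only the first cut; the matching described above would then weigh repair type, negotiated rates and expected downtime against that candidate list.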

Bottom Line

The Constellation Research Super Nova Award finalists are market leaders and fast followers, using technology from entrepreneurial firms such as Splunk and Concurrent, from established vendors such as HP, Informatica, SAP and Teradata, and from innovative vendors in between. For ideas on how to solve similar problems, follow the link for each finalist in the summaries above to get the entire story. Every organization, and every individual, can use internal and external data to create inferences and predictions that yield better insight and increased performance. Learn from your peers by reading these Super Nova Award applications. You can also learn the theory behind data-to-decisions.
