
Digital Ad Spend Grows But What About the Investment?


When I look at infographics, I am looking not just at the facts and figures (boring) – I am looking at the underlying story. I want to understand what is taking place behind the numbers. I seek insight and connection between the sources of information, the behaviours of the industry and opportunities for the future.

So this infographic from Invesp fired up my neurons.

Summarising the state of play for the digital advertising industry globally, it shows just how dominant Google remains in the face of challenges from social networks. A staggering 42.6% of ad spending finds its way into the search giant’s coffers, while Facebook, Yahoo! and Microsoft duke it out for less than half of that combined.

From an industry point of view, growth in digital advertising indicates a certain level of health. It shows that digital has firmly moved out of the experimental mode and is now a core part of a marketer’s arsenal. But it also raises significant questions – after all, if spending is increasing, are we also seeing a rise in investment? And by investment I mean:

  • Evaluating and implementing marketing platforms and technologies: Pumping more budget into digital is also going to shift the focus towards digital engagement. After all, a digital call to action can result in a click, a download, a sale and so on … and if that is the case, what investments are marketers making in terms of marketing platforms and systems of engagement? Which platforms are you evaluating for marketing automation or social media management? How are you tracking conversion, monitoring the velocity of online conversation and improving rates of conversion? CMOs should evaluate their marketing processes and look for automation opportunities.
  • Building the capacity and experience of your teams: The digital marketing skills gap continues to widen. For decades, marketers have been forced to do more with less – and now, as the demand for digital skills accelerates, many CMOs find themselves responsible for teams who have transitioned into “digital” from more “traditional” marketing fields. This has resulted in teams with limited or poor digital experience, basic skills and little time to build capacity. CMOs should carry out a Digital Skills Audit as a matter of priority.
  • Investing in customer engagement strategy: Much of our marketing strategy is built around maximising the value of channels. It’s time to stop this nonsense. We need to map customer journeys and then invest in engagement that adds value to the customer experience at key “moments of truth”. This means stepping away from the channel. Even if that channel is “digital first”. 

Have your say

What have I missed? What have I mis-read? What else needs to be improved?



Analysis of Intuit's Acquisition of Elastic Intelligence



Late last week the news came out that Intuit had acquired Elastic Intelligence, maker of the Connection Cloud product - one of the few cross-platform, cloud-enabled BI solutions on the market. Intuit will use Connection Cloud to complement the capabilities of their QuickBase solution.

 

We look at this event from the Future of Work, Data to Decisions and Consumerization of IT perspectives - through the lenses of our analysts Holger Mueller and Alan Lepofsky.

 

The Collaboration Take

 

One of the basic tenets of enterprise collaboration software is that it allows people to work together to achieve a common goal. Example goals include planning an event, creating marketing material, closing a sales deal, or any one of a thousand other use-cases where people work together to get their jobs done. Without a consistent structure for entering information, the data in these collaboration platforms becomes difficult to search, filter and report on. Products that use form-based entry solve this issue by having people enter information into specific fields rather than into a blank wiki page, blog entry or community forum.
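The point about form-based entry can be made concrete with a toy sketch. This is not QuickBase code - all data and field names here are invented for illustration:

```python
# Toy contrast (invented data) between free-form and form-based entry:
# structured fields make filtering and reporting trivial; free text does not.

wiki_pages = [  # free-form: the status is buried in prose
    "Met with Acme; deal looks promising, maybe closing in Q3.",
    "Globex event planning notes... budget still unclear.",
]

form_records = [  # form-based: each fact is captured in a named field
    {"account": "Acme", "type": "sales", "status": "open", "close_quarter": "Q3"},
    {"account": "Globex", "type": "event", "status": "planning", "close_quarter": None},
]

# Report "all open sales deals closing in Q3" - a one-liner over form records.
# Answering the same question from the wiki pages would need fragile text parsing.
open_q3 = [r for r in form_records
           if r["type"] == "sales" and r["status"] == "open"
           and r["close_quarter"] == "Q3"]
print(open_q3)
```

The design choice is the same one form-based collaboration products make: constrain input at entry time so that search, filter and reporting stay cheap later.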

 

Intuit QuickBase has been around for more than a decade, allowing people to create applications without having to be an application developer. The acquisition of Connection Cloud and its future integration with QuickBase should allow people to integrate data from other enterprise systems into the QuickBase applications they create.

 

For example, an organization may be able to create an application for the Sales team that pulls in data from both their CRM and their ERP system, allowing them to get an account overview that is not available in either of those systems on its own.
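The cross-system account overview described above amounts to a join on a shared account key. A minimal sketch follows; the record shapes and field names are invented, since neither QuickBase's nor Connection Cloud's actual APIs are described in this piece:

```python
# Hypothetical sketch: merge account data from a CRM source and an ERP source
# into one overview row per account. All names here are illustrative only.

crm_accounts = [  # e.g. pulled from the CRM system
    {"account_id": "A1", "name": "Acme Corp", "open_opportunities": 3},
    {"account_id": "A2", "name": "Globex", "open_opportunities": 1},
]
erp_balances = [  # e.g. pulled from the ERP system
    {"account_id": "A1", "outstanding_invoices": 2, "balance": 12500.0},
    {"account_id": "A2", "outstanding_invoices": 0, "balance": 0.0},
]

def account_overview(crm, erp):
    """Join CRM and ERP records on account_id into one overview row each."""
    erp_by_id = {row["account_id"]: row for row in erp}
    overview = []
    for acct in crm:
        erp_row = erp_by_id.get(acct["account_id"], {})
        overview.append({
            "name": acct["name"],
            "open_opportunities": acct["open_opportunities"],
            "balance": erp_row.get("balance", 0.0),
        })
    return overview

for row in account_overview(crm_accounts, erp_balances):
    print(row)
```

The value proposition is exactly that neither source system holds both columns; only the joined view shows opportunities and balance side by side.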

 

The SaaS Take

 

Software as a Service (SaaS) is the growth engine for enterprise applications in general. The unique nature of QuickBase is its capability to get end users to build and maintain surprisingly elaborate business applications. The extensive library of partners and building blocks gives QuickBase users a powerful but end-user-manageable arsenal of functionality.

 

SaaS vendors need to continuously expand their capabilities and the addition of business intelligence functionality is a key value add for QuickBase.

 

The BigData Take

 

One of the biggest challenges for enterprises today is how to create value from big data projects. Though Connection Cloud is not necessarily a big data play, using the product can likely result in one. With the capability of using many of the leading SaaS OLTP products as a data source, Connection Cloud is one of the few products to provide out-of-the-box cross-SaaS business intelligence... and with the combination of multiple OLTP sources, data volumes could quickly move to (lower-end) big data volumes.

 

It will be interesting to see if Intuit can capitalize on the big data trend - especially in the light of maintaining end user ease of use.

 

The Enterprise Take

 

End-user programming has been one of the most interesting developments in enterprise applications for quite some time now. No vendor has really tackled the challenge with a workable solution - but Intuit is one of the closer vendors to successfully address the topic with QuickBase.

 

Through the combination of the existing applications and the capabilities of Connection Cloud, which enable more business intelligence content, Intuit’s lead in this area will be solidified. And it makes a whole new set of applications possible. While previously all OLTP vendors had a lock on BI and reporting solutions that run on their own application and product framework - it may now be possible to build QuickBase applications on top of that. This gives QuickBase a new value proposition for building applications.

 

Another possibility is that Intuit will not use Connection Cloud solely for the pedestrian reporting and BI needs - but to extract more data from the SaaS OLTP applications. This would make the integration of QuickBase with SaaS OLTP easier and again open new dimensions of QuickBase application scope.

 

However, as before - Intuit will have to address the write-back problem. Right now QuickBase makes it easy to build one-way applications, in the sense that you take data from another system, import it and work on it in QuickBase.

 

Likewise it supports island applications with self-contained data storage in QuickBase. What Intuit needs to address are circle applications that allow users to start in 3rd party applications, provide value through a QuickBase application and then return the data back to that (or another) 3rd party application. Or better for QuickBase - start there, hand over data to a 3rd party, process something there, and then return to QuickBase. It matters in enterprise applications where business processes get started and lead to final outcomes.

 

 

Advice For Customers

 

This is good news for QuickBase customers, who get key capabilities added to the product. It’s time to re-evaluate the scope of your existing QuickBase applications and see how the additional capabilities will add value to these solutions. Likewise, with the expanded capabilities it’s time to see which new applications you may decide to build with QuickBase.

On the flipside - the formerly amicable relationship between Elastic Intelligence and SaaS OLTP vendors may now change, given QuickBase’s competitive status in the overall enterprise applications landscape. So monitor how many connections the Connection Cloud product will have when run by Intuit.

 

Advice For Partners

 

This is exciting news, and as customers revisit their application portfolio, you should review your product roadmaps and service offerings. How can the future additional capabilities of QuickBase make your offerings stronger and more attractive in the market? How can these capabilities help address new automation areas? These and similar questions should be addressed quickly.

 

Advice For Competitors

 

You should not be surprised, as Intuit will keep investing into QuickBase. You will need a strategy to address the power of end-user programming and the disruptive nature of that trend to the conventional enterprise applications space - be they SaaS or classic on-premise applications. Intuit (and others) have not fully figured this one out yet - but someone will come along sooner rather than later and you need to be ready. While Elastic Intelligence enhances Intuit’s integration capabilities, they are still weak on collaboration / enterprise social networking features. In today’s “social business era”, vendors that provide a strong set of collaboration features such as Liking, Commenting, Rating and Sharing should utilize this as a competitive advantage.

 

Advice for Intuit

 

This is a great move, but you will have to make sure you integrate the more complex BI capabilities into QuickBase in a user-friendly way - and make them configurable, usable and extendable by a skilled business end user. Likewise you need to address the circle-natured applications we mentioned earlier. Intuit’s next acquisition should focus on improving their enterprise social networking capabilities, enabling people to create, collaborate on and share information in QuickBase applications.

 

Our POV

 

A good acquisition by Intuit that helps further differentiate QuickBase. Adding business intelligence capabilities to its current application scope makes QuickBase more attractive to both customers and partners. Keeping QuickBase easy and intuitive to use is the emerging challenge. We look forward to hearing more details on roadmap, pricing and availability.

 


Ten Things CIOs Need to Know about WebRTC Webinar


Ten Things webinar with E. Brent Kelly. Aired May 23, 2013.

<iframe src="https://player.vimeo.com/video/66834454" width="500" height="313" frameborder="0" webkitAllowFullScreen mozallowfullscreen allowFullScreen></iframe>

How the Cloud can make the unlikeliest bedfellows


When revisiting the Oracle earnings call of this week, it is pretty obvious that Oracle is trying to position Oracle 12c pretty much everywhere as the cloud database of choice. And not only position and try to sell – but make it an integral part of the cloud tech stack of well-known partners (NetSuite), lesser known partners (salesforce) and even competitors (Microsoft).

So here is what Ellison said (courtesy of SeekingAlpha, emphasis added) in the Q4 earnings call last Thursday:
 
Next week, we will be announcing technology partnerships with the most important – the largest and most important SaaS companies and infrastructure companies in the cloud. And they will be using our technology, committing to our technology for years to come. That’s how important we think 12c is. We think 12c will be the foundation of a modern cloud where you get multi-tenant applications with a high degree of security and a high degree of efficiency, without having to sacrifice one for the other.
Again, I would call them a startling series of announcements with companies like Salesforce.com, NetSuite, Microsoft – all that happens next week will give you the details. These partnerships in the cloud I think will reshape the cloud and reshape the perception of Oracle Technology in the cloud. 12c in other words is the most important technology we’ve ever developed for this new generation of cloud security.
 
So let’s dissect and interpret this: Ellison makes it very clear that the aforementioned SaaS and IaaS companies will be using 12c for years to come. The design point of separating user data from metadata is the key architectural change of Oracle 12c from previous versions of the database. And he clearly mentions long term partners NetSuite and Salesforce, but also usual foe Microsoft. So what is going on?
 

The Cloud market matures

As Constellation Research has shown last week with our post The cloud is growing up – 3 signs from the news (see here) – the cloud market has entered a 2nd phase in which more vendors compete for less demand and at the same time need to accelerate their offerings – through acquisition (e.g. IBM buys SoftLayer), through bundling (HP announced Cloud OS) or partnering (e.g. Google and RedHat). And we have the most unlikely combination of partners now most likely working on a blended cloud technology stack.
 

Oracle’s ISV business

Let’s not forget that Oracle’s ISV business is an integral part of the Oracle revenue. And that for most of the last quarter century the largest Oracle ISV has been... SAP. So Oracle knows how to make partners successful on its database. And contrary to public perception, we are sure when the call was placed to 500 Oracle Parkway from One Microsoft Way in Redmond, Oracle was listening.
 
The other remarkable aspect is that now, in a span of 20 years – back then it was Hasso Plattner with his decision to run R/3 development on Oracle’s database, and now Steve Ballmer (and maybe even Bill Gates) – industry leaders chose Oracle as their strategic partner. Very, very few technology companies can pass that test over a 20-year time range.
 

Microsoft’s problem

The root cause for the expected Oracle Microsoft partnership lies deep in the history of Microsoft technical decisions. When it was clear that Microsoft needed a SQL database and it then partnered with Sybase – it made the decision to run SQL Server on the Windows technology stack – and only there. And that limited the number of cores that were supported and allowed the database team to – let’s be polite – not address scalability issues in the best way.
 
All this was hidden while the world was running applications on premise. And it was also hidden as long as the load on the database server side was manageable. Ever wondered why the Microsoft enterprise applications only had a SMB focus? And why Microsoft ran internally on SAP?
 
So this will be a key case study in how platform decisions and technical debt can creep up on even one of the largest and most successful technology vendors. But kudos go to the Microsoft executives who, as it looks, really jumped over their shadows and addressed the technical issues through a partnership with Oracle.
 

Virtualization layer complications

So where will be the line in the sand between Oracle and Microsoft IP and products? Next up from the database in the technology stack – you will hit the virtualization layer – and here Oracle and Microsoft have their respective own offerings with Oracle VM and Hyper-V. We expect this is where Microsoft will draw a line and Oracle 12c will have to find a way to support Hyper-V.
 
At the end of the day this is a reasonable architectural fault line – as it allows the Microsoft application code to remain virtualization-layer agnostic – while it requires Oracle’s database to become compatible with different virtual machines. And this makes sense for Oracle as it comes back to its DNA as a partner for ISVs – with the virtualization layer becoming something similar to the ODBC of the cloud age.
 
At the same time it gives Oracle the chance to optimize a little better with its very own Oracle VM – which will be key to pitch for the overall Oracle tech stack to the many ISVs, who do not own a virtualization offering themselves.
 
So this would be a reasonable compromise which ultimately is a win-win for both sides, though short term it will put some architects in Redwood Shores into high gear.
 

Did Microsoft have options?

The only other real option that Microsoft could have looked at would have been IBM. And IBM would in general have been a more compatible partner than Oracle – at least from the general outside perception. And though this is speculation, Constellation is sure that Microsoft will have done some due diligence on Armonk’s DB2.
 
And then Microsoft could have gone more radical by e.g. looking at using Hadoop as a conventional data store (see here) – but that would have most likely pushed the limits a little too aggressively… but for a second – think of storing all the information that Microsoft applications use and create in one single and consistent data store. Not a solution for 2013 – but for 2014+. Obviously Microsoft’s need was much more immediate – like to run the Dynamics applications and get SaaS market share.
 

So why Oracle?

We don’t have the details, but the savings that the Oracle 12c database achieves by de-coupling metadata from user data must be so impressive that they even convinced Microsoft to partner. We cannot think of many other and better benchmarks for Oracle 12c.
 
And while the hint of NetSuite adopting 12c is not surprising – the adoption by Salesforce is another proof point of the achievements Andy Mendelsohn and team have put in place with 12c.
 
Both Microsoft and Salesforce know the SAP story – and how that 20+ year old decision of Hasso Plattner to build R/3 on Oracle has shaped the Oracle, SAP and RDBMS markets and ecosystems. We are certain Microsoft did not make this decision lightheartedly. And surely Salesforce may have wanted to rid itself of the Oracle dependency. But ultimately the cloud business is all about cost of ownership – and if someone has a silver bullet – you need to have it, too – or your days may be numbered as a competitive cloud vendor.
 
Oracle deserves credit that – again contrary to widely held public perception – 12c is available for partners, even widely perceived competitors – alongside their internal development of Fusion Applications. All rightful concerns about Oracle withholding the platform for its own advantage need to take a pause.
 
And we are really curious where Oracle and SAP are on bringing the SAP products to 12c.
 

Advice for customers

This is good news for Oracle and Microsoft customers. Microsoft customers get a scalable database under the Microsoft SaaS applications, Oracle RDBMS customers get more usage of 12c and another way to build applications for 12c. And at the same time Oracle’s tech stack and applications teams now have an external benchmark: they need to be better at building on top of 12c than their respective competitors. So as a customer – wait, see and validate the expected benefits.
 

Advice for partners

For a Microsoft partner – this makes your business more viable in areas where before the sizing teams would have cringed and where the hardware cost could have been prohibitive. For the Oracle database partners this expands the addressable market. And for ISVs in general this is great news – as you may now have the choice to develop in Java or C# – with the latter no longer being limited by database capacity. You still may take a dependency on the cloud technology stack you will be using, but when Hyper-V is supported by Oracle 12c – it may be an option to run your C# applications in an Oracle data center.
 

MyPOV

We congratulate both companies on the partnership and see this as a net positive – Oracle is true to its technology partner foundation and Microsoft has solved a long-term tech stack weakness that was exposed by the nature of the cloud. It’s now execution time for the technical teams and we look forward to learning soon about the first product and customer proof points – maybe as soon as the Build Conference this coming week.
 

 

The only negative: We are sad that one of the best April Fools headlines is gone forever …

Is Oracle 12c the end of multi-tenancy as we knew it?


Oracle released its Q4 numbers, and the usual strong but entertaining statements on achievements and the competition were heard on the earnings call. Andrew Nusca over at ZDNet has done a great job extracting the 25 striking things from that call; you can find it here. You can find the webcast here and a transcript here.

During the call Larry Ellison also gave a preview of events scheduled for next week in regard to the next Oracle database release, Oracle 12c. Oracle 12c was announced back at OpenWorld in 2012 - as the first pluggable database that would separate user data from metadata and allow multiple tenants in the same database.

Multi-tenancy Confusion

There is now some confusion around the term multi-tenancy. Though in general there is agreement that multi-tenancy means the co-existence of multiple tenants on shared resources - there are now two interpretations of the term.

The classic multi-tenancy term was related to the database sharing data elements (or records) across tenants. That design was critical for the first and early SaaS vendors - as they needed to share precious database resources. Often this is referred to as a tenant-striped database.

The Oracle view on multi-tenancy is that the user data becomes the tenant - and as you can run multiple user data stores (or containers, as Oracle calls them) in the same database - you have a multi-tenant database. Oracle complements this by separating the metadata from the user data and can point multiple user data stores to a common set of metadata, thus achieving better hardware utilization and with that better elasticity of the database. Or in other words - you can run more databases on the same server with 12c.
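The difference between the two interpretations can be illustrated with a toy sketch - plain Python, not Oracle code, with all table and field names invented for illustration:

```python
# Classic tenant-striped model: one shared table, every row carries a tenant_id.
striped_orders = [
    {"tenant_id": "t1", "order_id": 1, "amount": 100},
    {"tenant_id": "t2", "order_id": 1, "amount": 250},
]

def striped_query(rows, tenant_id):
    # Every query MUST filter by tenant_id, or it leaks other tenants' data.
    return [r for r in rows if r["tenant_id"] == tenant_id]

# 12c-style model: one shared copy of the metadata (the schema definition),
# with separate per-tenant user data containers pointing at it.
shared_metadata = {"orders": ["order_id", "amount"]}
containers = {
    "t1": {"orders": [{"order_id": 1, "amount": 100}]},
    "t2": {"orders": [{"order_id": 1, "amount": 250}]},
}

def container_query(tenant_id, table):
    # A tool pointed at one container sees only that tenant - no filter needed.
    return containers[tenant_id][table]

print(striped_query(striped_orders, "t1"))
print(container_query("t2", "orders"))
```

Note how in the container model an off-the-shelf tool can query a tenant's data without knowing anything about multi-tenancy, while in the striped model a single forgotten filter exposes every tenant's rows - which foreshadows the BI-tool advantage discussed below.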

The key advantages of 12c

As already mentioned above, the separation of the user data from the metadata allows 12c to use fewer resources than its predecessors and with that achieve better hardware utilization. Better hardware utilization - putting more user data on the same machines - means better elasticity of the offering, which is key for anything cloud these days.

But there are more key advantages to this tenant concept. First of all, the standard tools you may want to run on the database are still available - with no changes to the security model. A BI tool like e.g. SAP's BusinessObjects can just run on the 12c database - with no modification. To the tool it just looks like a database with no multi-tenancy. In the striped multi-tenancy case the unmodified tool would give access to all tenants' data - something clearly not desirable if you want to stay in business as a SaaS vendor.

Moreover you can move and copy the user data more easily. And you can change the schema on the metadata and the user data separately - and then just point upgraded versions to the right pairs of metadata and user data. A big advantage for upgrades and high availability.

And finally most enterprise software needs some way to customize it. With the striped multi-tenancy model this was very limited - as all tenants were on the same schema. With the new multi-tenancy architecture more can be done to the individual schema of a tenant. Theoretically anything, independent of the other tenants - but of course at a price when upgrading, no discussion needed.

Is this new?

Not really - most SaaS and PaaS vendors today will store one tenant's data in a separate database, often even on separate servers. The advantages are the ones mentioned beforehand - and often database scalability even drives that design (more about that next week). But these vendors pay the price with a higher operating cost - all their databases run with the overhead of a one-to-one relationship of user data and metadata - which results in a larger footprint and with that higher cost to operate and less elasticity.

Oracle's innovation is to separate metadata and user data from each other, achieving better elasticity for the offering. Whether Oracle will be able to make the upgrade to 12c transparent - in the sense that a pre-12c Oracle database user may take advantage of 12c easily and e.g. be able to unstripe the older usage - remains to be seen. It would make adoption of 12c much easier.

Big expectations

Larry Ellison mentioned events next week to provide more details on 12c - and the endorsement of NetSuite, salesforce and... Microsoft. And while NetSuite was hardly a surprise - salesforce was more surprising. But Ellison had almost kind words for Marc Benioff. Now what Microsoft may do here - will be very interesting - expect lots of speculation till the event.

MyPOV

Oracle was not shy to tout 12c back at OpenWorld and now in the Q4 earnings call. With missed earnings - maybe a diversion strategy - but when 12c ships, it will change multi-tenancy as we knew it. And we can't wait for the partnership announcements Oracle said the company would make next week. Heightened expectations.

My latest take on Oracle overall can be found here - takeaways from the Oracle analyst summit. 


How hard is multi channel CRM in 2013?


I have previously shared my not-so-optimal experience installing Office365 from Microsoft, and raised my concerns on the status of customer service in this post. But the OfficeSaga Part 1 kept giving through the last week - so it compelled me to write a little more on the state of multi-channel CRM in 2013, which seems to be pretty sad. Here is the new Storify collection.

During the heyday of CRM in the late 90ies of the last century, it was all about the chase to treat customers consistently across interaction channels and across organizational functions. The demo of the informed sales rep who, aware of a customer service / customer support situation, steers to a sensitive customer interaction was seen at every CRM show / demo. Likewise the mirror scenario - where customer service professionals are made aware of impending sales and treat the customer accordingly. Seen too often to ever forget.

In 2013 the situation has gotten a little more complex - as customers can not only be interacted with face to face and through the phone, but they may also show up at your web store and in social media. But still the promise of multi-channel - or, often as a buzzword now, omni-channel - CRM is that the customer will be treated consistently across the interaction channels and all actors on the enterprise side are informed across organizational boundaries.

 

The test: Install Office365

 

As mentioned I chronicled already my close to 24 hour challenge to install Microsoft Office. But the case was closed with my 3rd install - with the help of Microsoft 3rd party support. But in the week after the surprises started.

 

In-Function Disconnects

 

I was surprised when the first 3rd level support consultant was following up with me on the successful install of Office365. That would be great customer service if indeed I had not been successful installing Office365 - but I was successful. And I referenced the case number in the interaction with the second 3rd party engineer.

But ok - giving the benefit of the doubt I replied with thanks and good news. After all my lesson learnt from this is - get to 3rd level support asap - good to have a relationship with two 3rd level support professionals at Microsoft.

And then witness my surprise when the same engineer followed up again - a day later with the same question - if all was good with my Office365 installation... At this point I decided to no longer reply.

What should have happened in perfect CRM? The agents should have looped back with me once to confirm that my Office365 was running well. They also could have called me - as they have all my phone numbers - even more personal, but ok.

 

Surveys are great - they need to work

The OfficeSaga Part 2 got even more lively - when I started to receive links to feedback surveys. 

Great practice - only the first two links didn't work. Using Chrome first, I was suspecting a potential Microsoft issue and tried IE - also no luck. Back to Twitter to tell @Office - and what do I get on Twitter - the next non-working link. Then nothing.

The next day I get my 3rd request to fill out a feedback survey - which then worked - both in Chrome and IE. 

And it sounds plausible to test links to sites / surveys before you send them to customers. And if you have a Twitter conversation - granted it's a challenged one due to the 140-character limitation - don't end it, drive it to closure.

 

More In-Function Disconnects

The highlight - and by now the last one, I hope - was getting a call from a sales rep for the Small Business Division, asking if I wanted to buy the Office365 version for small business, as my free trial had expired.

 

 

Looks like a very good sales practice - only I had already purchased a one-year subscription when I switched laptops - pondering the eventuality that I was not allowed to switch machines during the free trial period...

And similar to the service rep who should have known that the case was closed - the sales rep should have been spared the call to me, as I had purchased Office365 already.

 


Advice to CRM users

If you sell to and service customers across channels - do so consistently. If you haven't recently - test your systems from the outside - and hopefully you do not run into negative surprises.

 

Advice to CRM vendors

Check if your multi-channel story is complete and working. Do all users interacting with a customer have consistent information and access to the customer's past interactions? Check your timed actions - are they still in sync with business reality? Can users quickly validate the latest status of a customer before interacting with them?

 

MyPOV

It looks to me like the state of multi-channel CRM is more dire than I thought. If an enterprise like Microsoft, which is also a vendor of CRM systems, does not have a best-in-class implementation of CRM - how bad may it be with regular end users?
And yes - consistent multi-channel CRM is hard - but it is what customers expect and deserve in 2013. Time to make it real.


Bridging the Social Chasm

Bridging the Social Chasm

1

When IBM’s Center for Business Value released its 2011 report into the relationship between social media, marketing and brands, it revealed a “perception gap”. On the one hand, marketers had an understanding that their connected consumers “wanted” or even “expected” a certain style of interaction through social media. And on the other hand, there was the hard reality of what those customers actually wanted. The gap between the two was the distance between two competing realities.

But is anyone listening?

In reality, we are not really dealing with a gap. It could be better described as a “mismatch” – after all, a “gap” would indicate some alignment. But the problem for brands is that the distance between the two sets of expectations is growing. We are now dealing with a widening chasm in the world of customer experience.

[Image: the customer experience gap]

Two years after IBM’s original report, even a casual investigation of most branded social media would indicate that the chasm is becoming more pronounced as brands continue to shift their marketing spend and resources into digital and social media (Gartner’s US Digital Marketing Spending Report indicates that 25% of the marketing budget is now devoted to digital).

But when it comes to business effectiveness, more budget is not necessarily always the answer (though there would be few marketers who would refuse an increase, I am sure). To bridge the social chasm, businesses must begin to re-think, re-calibrate and re-action their organisational approach to social:

  • Re-think: Start with what you know. Create a new social baseline and audit all your activity for assessment. Real-time analytics and dashboards such as those from Anametrix can provide the kind of decision-ready data that is essential to informed decision-making.

  • Re-calibrate: If you have started a social business program in the last two years, it’s now worthwhile assessing its impact. Have you achieved the original milestones? Has the program had the kind of impact that you expected? Take a look at R “Ray” Wang’s 50 use cases that help demystify social business and think through the business processes and workflows that are business critical. Are your social programs impacting business results? If not, it may be time to recalibrate.

  • Re-action: This is no time for social business fatigue. No one ever said that change was easy. And equally, no business achieved competitive advantage by being complacent. It’s time to re-action the business programs that are core to your strategy.

What’s your experience?

Interestingly, this recent workplace research study by Microsoft revealed that there is also a chasm between business management and the workforce. Teams don't just expect or demand more collaboration – about 17% of people are actively ignoring IT policy and installing social tools on their own. This is delivering some value to the business – with 60 percent of participants in the Microsoft study indicating that their use of social tools has increased productivity – but this is a far cry from the billions of locked-in value that McKinsey Global Institute’s 2012 study revealed.

If businesses can’t work to unlock the value in the low hanging opportunities within their own business, how long will customers have to wait?

It seems like there are whole industries on the brink of disruption. Social may not be the driving force, but it could be the trigger.

Microsoft Social Tools in the Workplace Research Study by Mark Fidelman

 


Getting Glass With Google Glass

Getting Glass With Google Glass

Today I went with a friend of mine to the Google campus (known as the GooglePlex) to pick up his Google Glass.


MyPOV

It's certainly an interesting experience. It's really impressive how clear the display is and how good the camera is. The audio is a bit quiet. I would not call them "comfortable", but they are not invasive. I imagine after a day or two you probably forget you're wearing them.

Google Glass does not do anything my (Google) phone can't do (pics, video, directions, search, weather, stocks, etc), it just does it hands-free. It's important to note, this is NOT augmented reality yet, meaning the data is not yet being overlaid on top of the real world, just on a little display at the side of the glasses. I am sure there will be some amazing apps when these are really released. I'd love to see a golf GPS app!

My overall take is that the ability to have information easily accessible (or recordable) is clearly important. Having it displayed contextually (without the need to search) based on what you're seeing will be really powerful. Is Google Glass the final answer? Of course not. At some point we'll view information displayed on our own glasses or contact lenses, via holograms in the air in front of us, or eventually embedded directly into our eyes/brains. We'll look back at Google Glass much as we now view the original mobile phone (Motorola DynaTAC) or tablets (Apple Newton).


Is WebRTC Secure...Enough?

Is WebRTC Secure...Enough?

Every website you visit may download a different collaboration app!

When I think of voice, video, or data on my PC, tablet, or smartphone, I think of applications over which I have some control. For example, I can choose if I want to download the Lync, Skype, GoToMeeting, WebEx, Vidyo, or Google Hangouts clients. Choice is preserved across all of my devices: I choose what I want on my device. I can also remove any of these apps any time I want.

But WebRTC changes all that.

In thinking about how WebRTC will be used, it is possible that many of the sites I visit may have their own unique real-time communications and collaboration application, and this application will be downloaded automatically in the JavaScript my browser fetches from the web server... without any permission required on my part.

Given the variety in each of these communications applications, it would be very easy to inadvertently click on something that gave camera or microphone control to someone I don't know and don't care to know. How does WebRTC provide security for this brave new world of ubiquitous browser-based voice, video, and data?

I recently had the privilege of speaking with Eric Rescorla of RTFM, Inc. about this topic. Eric is the author of two IETF RTC-Web working group documents focusing on WebRTC security--one discusses WebRTC Security Considerations and the other proposes a WebRTC Security Architecture that satisfies these security considerations.

According to these documents, "RTC-Web communications are directly controlled by some Web server,...[and] a Web browser might expose a JavaScript API which allows a server to place a video call [unknowingly by the user]. Unrestricted access to such an API would allow any site which a user visited to "bug" a user's computer, capturing any activity which passed in front of their camera."

In this post, I will discuss WebRTC security and how it has been specifically formulated to protect users from unauthorized persons or sites creating malicious scripts that could take over control of the user's camera and microphone. It is important to recognize that WebRTC security is still under development in the working groups; hence, there may be some variation in the final specification from what is discussed below.

Because the user has no control over the Web servers visited during a browsing session, a key to making WebRTC secure is to make the browser the only trusted base from which security decisions can be made, and to assume that any Web site could have malicious JavaScript embedded therein. Furthermore, identity is at the heart of any decision to allow a Web-based application to have camera and microphone control.

WebRTC Calling Scenarios

Eric's security considerations document identifies several different WebRTC calling scenarios and what the user expectations are from those scenarios.

1. Using A Dedicated Calling Service: A user may establish a relationship with a Website that provides a calling service. This could be a site that effectively provides a "rendezvous" capability or directory for calling other people using WebRTC, or it could be a service that interconnects WebRTC with the PSTN or enterprise infrastructure. In this case, because there is an established trust relationship with the website, the user may want to give this service the ability to automatically access the camera and microphone. Social networking sites or gaming sites may be examples of a dedicated calling service.

However, by giving the site long-term authorization, the user is effectively also automatically giving the site the ability to "bug" the computer and make calls on the user's behalf. User expectation is that the site is not listening in on the calls and that the user can be sure the call is actually made to the intended person or entity.

2. Calling The Site Itself: Suppose a user looking for information goes to a support website or an e-commerce site or a vendor site and wishes to contact someone associated with the organization that owns the site. An easy way to do this with WebRTC would be for the site owner to put a button on the site with verbiage to this effect: "Click here to talk to a representative". The user assumes that he is actually calling the site he is visiting, and the expectation is that this site will be able to access the camera and microphone with the user's permission one time only, and only for the duration of the call.

3. Redirection and Calling: We are bombarded with advertisements on many of the sites we visit. Often these ads are served up by parties not even affiliated with the site we are visiting. If we click on an ad, we will often be redirected to a site we may not know and with which we have no relationship at all. The original site we visited may not even know if we were redirected by clicking on an ad. Users would expect that the site to which we were redirected would not be able to take over camera and microphone control without permission.

The above scenarios deal with the Web browser being pointed at a particular site, and the user at least having some control over allowing that site to make calls based on the site origin and the relationship. WebRTC must make sure that "origin-based" attacks can be avoided.

However, another kind of attack is also possible: network attacks, often known as man-in-the-middle attacks. These kinds of attacks can be made when we use an unsecured network such as a hotspot or home Wi-Fi network. In this scenario, we use HTTP rather than HTTPS (secure HTTP). While on the unsecured network, we point our browser to a particular site unaffiliated with a calling service we may have authorized to make calls on our behalf. The attack proceeds as follows (per Eric's document):

1. I connect to http://anything.example.org/. Note that this site is unaffiliated with the calling service.
2. The attacker modifies my HTTP connection to inject an IFRAME (or a redirect) to http://calling-service.example.com.
3. The attacker forges my credentials at the calling service site, making my browser assume it is pointing at http://calling-service.example.com/ while the attacker injects JavaScript to initiate a call to himself.

 

Attacks can also be made while connected to secure HTTPS sites if that site fetches JavaScript from an unsecured (HTTP) site.

The Security Mechanisms Within WebRTC

This kind of scenario is pretty scary to think about. It reminds me of the rather sophisticated attack at the Iranian nuclear facilities where the microphones and video cameras were hacked by a third party who could see and hear what was going on within these facilities. How can WebRTC's security mechanisms prevent unauthorized parties from taking over our devices?

Security is based around trust, and in WebRTC, any security or trust property that the user needs enforced has to be guaranteed by the browser. Realistically, however, in a working system the browser must rely on other trusted sources. For example, if I log into a website that provides WebRTC rendezvous services (a directory), then I trust that website to assure that the other users I may wish to call are also authenticated. The website itself becomes the trusted identity provider. There are a number of other third-party identity providers such as BrowserID, Federated Google Login, Facebook Connect, LinkedIn, OAuth, OpenID, and WebFinger. WebRTC can also use these trusted third parties to verify a user's identity.

Here's how it works (see figure below).

Figure 1. The WebRTC security architecture.

User A and User B are both connected to the same secure website via HTTPS. They have also authenticated their identity using their credentials with either an external identity provider or with the website itself.

User A decides to call user B. This will likely be done by clicking on some type of a "call" button next to B's name. When A clicks on the call button, the Web server sends a message to the JavaScript running in A's browser that creates two peer connections: one for audio and one for video (assuming this is an audio and video call--a peer connection is needed for both media types). At this point, no security has been invoked and any website can proceed with WebRTC to this point.

Next, the calling application needs to actually get the audio and video from the microphone and camera. User A is presented with a "door hanger", which is a pop-up window that asks if the website can access user A's camera and microphone.

Figure 2. An example of a WebRTC "door hanger" asking the user for permission to access the microphone and camera (source: a real call using Uberconference.com).

This door hanger has two key elements:

1. It identifies which website is asking for use of the camera and microphone.
2. It gives you the option to allow or deny camera and microphone access.
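The caller-side flow described above — requesting the camera and microphone (which triggers the permission prompt), creating the peer connection, and sending the offer — can be sketched with the WebRTC JavaScript API. This is a minimal sketch, not the article's exact flow: it uses the modern unprefixed API, carries both tracks over a single peer connection rather than the two described above, and assumes a hypothetical `signalingSend` helper that relays messages to the far end via the web server.

```javascript
// Hypothetical sketch of the caller's side of a WebRTC call.
// `signalingSend` is an assumed helper that forwards messages
// (offer, ICE candidates) to the callee via the web server.
async function startCall(signalingSend) {
  // getUserMedia() is what triggers the browser's "door hanger"
  // permission prompt; the promise rejects if the user denies access.
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });

  // Create the peer connection; DTLS-SRTP keying is negotiated by the
  // browser itself, not by the page's JavaScript.
  const pc = new RTCPeerConnection({ iceServers: [{ urls: 'stun:stun.example.org' }] });
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));

  // ICE candidates (the firewall/NAT traversal addresses) are trickled
  // to the far end over the signaling channel.
  pc.onicecandidate = (e) => { if (e.candidate) signalingSend({ candidate: e.candidate }); };

  // Create and send the "communications offer" (an SDP blob, which also
  // carries the certificate fingerprint used for identity binding).
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  signalingSend({ sdp: pc.localDescription });
  return pc;
}
```

Note that nothing in this script touches the media keys: the page only shuttles SDP and candidates, while the browser enforces permissions and encryption.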

There are differences in how the browsers currently implement camera and microphone access permissions. Chrome implements persistent permissions only, meaning that if a site is given permission once, it will always have permission. Firefox, on the other hand, has implemented one-time permissions, meaning that the user must always approve camera and microphone access regardless of how many times the same site is browsed to so as to invoke a call.

In addition, the browser window will always display an indicator showing the user that they are in a WebRTC call. If the indicator cannot be displayed, then the standard requires the call to be terminated.

Figure 3. Indicator showing that this site is using the camera or microphone in a WebRTC call (source: a real call using Uberconference.com).

Once user A gives permission to use the camera and microphone, the peer connection script contacts the identity server to get a token that binds user A's identity to his "fingerprint" (digital information uniquely identifying the user). Next, the peer connection looks up possible IP addresses through which the media can flow in order to traverse firewall or NAT devices (these IP addresses are actually ICE candidates--we won't go into the details of ICE, TURN, and STUN in this article, but they can and will be used with WebRTC to securely traverse network boundaries).

At this point, A's browser sends a "communications offer" to the Web server, which in turn routes the offer to user B's Web browser. The JavaScript in B's browser processes the offer, and the very first thing it does is contact A's identity server to verify that the identity of A in the offer is the same as the identity of A provided by the identity server. As mentioned above, the identity server may be external to the website or it could be the website itself. Once the identity is verified, the "trusted element" icon is shown in the browser URL address pane.

After verifying the identity of user A, user B's browser pops up a message indicating that there is an incoming call from A. If B accepts this invitation, B's browser sets up the peer connection, asks for permission to use the camera and microphone, contacts B's identity server, and returns a message to A containing B's security information, the media information, and the IP addresses needed to traverse B's firewall/NAT.

A's browser receives this message, and contacts B's identity server to verify B's identity. Once B's identity is verified, the two browsers can set up the audio and video exchange on the two media channels that each browser previously established.

The browsers exchange a Datagram Transport Layer Security (DTLS) handshake on every media channel (two channels in this case because there is both voice and video). Once these DTLS handshakes are completed, the media is encrypted and begins flowing between the browsers using Secure Real-time Protocol (SRTP).

Security in WebRTC is still a work in progress. Although this identity model has not yet been implemented in the browsers supporting WebRTC, it is under active development. It is also important to note that the identity server portion of the WebRTC security model will be optional and application specific so that people can make anonymous calls when needed or appropriate, such as when connecting to an ecommerce or support site.

It is also important to note that at any time during the call, if the user points the browser to a different website, the call is terminated because the JavaScript is torn down.

DTLS Versus SDES

There is consensus that DTLS will be mandatory for WebRTC to support. There is active debate over whether to allow SDES at all (see Laurent Philonenko's post on NoJitter.com). SDES, which is used more in the SIP world, would help interface more easily to existing SIP-based infrastructure. Mozilla Firefox currently does not support SDES while Google Chrome does.

The implication here is that if DTLS-SRTP is used, then there will need to be a border element between the WebRTC world and the SIP world. Hence, the border controller providers likely have an excellent future in front of them as secure WebRTC calling and communications applications become widely deployed.

Conclusion

WebRTC uses IETF communications protocols to assure that media and data flowing between browsers is secure. The level of security in a given call will depend on several factors as well as on the context of the communications application.

If HTTPS is not used or an HTTPS site pulls in JavaScript from an HTTP-only site, then there will be a lower level of security (the browser will also alert you that the page has both secure and non-secure data, so you can intelligently decide whether to continue). Furthermore, if there is no identity server involved, which will often be the case when a user goes to a simple calling service and logs on, then the level of security will be good but not as good as if there were an independent identity server.

Finally, anonymous calls can be made, but the security standard under development does suggest that the browser should allow one-time only camera and microphone access permissions.

WebRTC requires the communications application to ask the user if it can access the camera and microphone. There is some variation as to the persistence of the approval: persistent (Chrome) or one-time (Firefox). DTLS is required in WebRTC; there is active discussion about SDES being another security option.

Readers interested in trying WebRTC for themselves can go to www.uberconference.com and sign up for a free basic account (this is voice only), or alternatively, they may point their Chrome or Firefox 22 Beta browser to https://apprtc.appspot.com to try voice and video.

Portions of this article are excerpted from Dr. Kelly's recently published report titled, "Ten Things CIOs Should Know About WebRTC."


The Cloud is growing up - three Signs from the News

The Cloud is growing up - three Signs from the News

Every technology market goes through different growth phases, and at this point I think we are witnessing the beginning of the second phase for the cloud market, in which the number of players increases, mainly through new market entrants. At the same time competition intensifies, as the combined forecasts of the market players exceed the overall market growth – so there is significant price competition in the market.

Nothing to worry about in general - this is a normal phase for every technology market, as the market potential attracts more players than the market can bear long term. But it's all for the better, as this process challenges the existing leaders, creates new players and sets up the market for stage three – continuous and sustained growth.

Exhibit 1 – IBM wins GAO complaint

The CIA was already looking to put some processes in the cloud last year, drew complaints at the time from Microsoft and AT&T, but ended up selecting Amazon's AWS in January.

Ironically, the findings report triggered by the complaint found a cost advantage for the 2nd-best bid – IBM's – but the CIA felt it wanted to go with proven technology.

Not surprisingly, IBM complained to the GAO – and recently the complaint was upheld… so the CIA needs to re-tender… and we will stay tuned to see what happens.

What it shows, though, is that the cloud market is maturing, as this is the first time we have seen a GAO complaint in the cloud market – something that happens routinely in other government procurement situations… Anyone remember the Boeing / Airbus tanker selection skirmish?

And simply put – IBM could not let this one slide: too big an opportunity, with all the benefits of being one of the first cloud providers to the federal government, follow-up business etc. – which is a sign that the overall potential of the cloud market is not yet big enough for IBM to simply walk away and pursue other large cloud opportunities.

Exhibit 2 – HP and Red Hat bundle away

Another sign of entering the 2nd phase of a technology market is that players partner and / or create bundles to differentiate their services from the other market players. So it happened last week when HP announced their CloudOS at their HP Converge conference and Red Hat announced the Red Hat Cloud Infrastructure at Red Hat Summit.

In both cases, congrats need to go to the respective marketing teams under Marty Homlish and Jacky Yeaney for associating generic terms like OS and infrastructure with their offerings - a dream for any marketer. Associating a generic term with your brand is the Holy Grail of associative thinking: think "we picked HP because of their Cloud operating system" – and guess what, no one else has a CloudOS. Same story for infrastructure.

And it does not matter that behind the scenes there is nothing really new that HP and Red Hat have created – they just bundled existing offerings together. But there is value for customers in bundling, as the expectation is that the bundling vendor will enable out-of-the-box integration of these services. And the bundling vendors of course want to bundle with higher-ground offerings that make their product unique and easier to differentiate and sell. So Red Hat of course uses Red Hat Enterprise Linux, and HP will soon enable their Moonshot servers as the hardware platform for HP Cloud OS.

Exhibit 3 – Partnerships game heats up

In the last keynote of the HP Converge conference – normally these closing keynotes are boring, wrap-up-the-message affairs – HP's COO Bill Veghte unleashed a zinger for the cloud market, mentioning that Workday was moving to the HP Converged Cloud. A huge move – even in cloud terms – but Veghte peppered it even more by disclosing that Workday would be leaving their existing partner – and HP competitor – AWS.

Only one journalist (@StevenJBurke - kudos!) picked up on this – I guess the rest were gone – and the replay of the keynote has not been made available by HP yet. So the news only slowly cooked up – and prompted a re-commitment by Workday to AWS.

The sign of maturation in the cloud market to look for here is that there are fewer prized possessions left for the infrastructure players to claim in order to make their otherwise boring offerings more attractive. Expect more tug of war between the infrastructure vendors trying to get more of the prized and recognized SaaS vendors to adopt their cloud offerings.

Advice for cloud consumers

This is a great phase for the market to be in – and for you to make your first steps into the cloud, or to double your investments if you have already started. The vendors will vie for your business and offer it to you at the most attractive terms, as they try to fulfill their ambitious sales quotas. The risk is that you may pick a partner who will no longer be in the game in the next phase of market maturation.

Advice for cloud market players - infrastructure

You need to have a sound partnership strategy in place up the cloud stack; a pure acquisition strategy will not be enough in the longer term, unless you are really, really deep-pocketed.

Advice for cloud market players - platform

Time to cast your strategy – will you be an open vendor that tries to partner with many of the infrastructure vendors, or do you pick a single one, or better, only a few of the infrastructure players? Equally, you need to look up the stack to make sure you are not being shut out of opportunities positioned at higher levels of the cloud stack.

Advice for cloud market players – SaaS

If you have your own infrastructure – keep evaluating it from a cost perspective. After cloud consumers you are the most attractive group of prospects in the cloud market. If you partner – re-assess your partners in terms of cost effectiveness and next market phase survivability.

Advice for cloud market players – Services

Your services are the glue keeping the market together. Try to move up the cloud stack, where the more lucrative opportunities are – and where the engagements sit that determine the utilization direction down the stack.

MyPOV

Great phase for the cloud market – it is graduating from an early-interest phase to players competing for customers. Challenges exist for consumers of cloud offerings to bet on the winning horses, and for cloud vendors to become and stay a horse that is in contention. Exciting times.

 
