
SAP’s Mobility Strategy: One Million Developers Blooming


In its press conference last week, SAP provided a major update on its database and mobility strategy. In my first post, I outlined my view of SAP’s database strategy. Now, in this second post, I provide my perspective on SAP’s mobility strategy.

SAP recognizes mobility as a critical element of its product strategy going forward, along with social business, cloud, and in-memory computing. But as some of my associates have been hammering home over the past two years, success in mobility requires SAP to enable thousands of small development firms and individual developers to build applications for SAP, not just a few large system integrators or ISVs. SAP needs the enterprise equivalent of Apple’s App Store ecosystem.

SAP’s thinking on this front has been evolving. After its Sybase acquisition, it put the Sybase Unwired Platform (SUP, since renamed the SAP Mobility Platform) at the center of its mobility strategy. You want to build mobile apps for SAP? Wonderful—buy, borrow, or otherwise get access to the SAP products you want to integrate with, plus an instance of SUP, and have at it. The problem, however, was that this approach limited the pool of developers to the following categories:


  • Large or midsize ISV-partners of SAP, who were willing to make the investment in an SAP development environment, to develop mobility apps for sale to current and future SAP customers. This would be a small number.
  • System integration partners of SAP, who had live project opportunities that included mobility apps as deliverables. The SI could use the client's SAP development environment. The resulting apps would meet the needs of a specific client, although the SI might reuse the code in future projects. But this approach would produce few apps for a wider audience.
  • Individual developers or small SAP partners who understand SAP’s middleware and development architecture well enough to forgo use of SAP’s platform and can write mobility apps directly against SAP’s back-end databases. This is where many "app store" type apps could be produced. But here is where small developers come up against a brick wall: SAP does not make it easy to gain access to trial or development versions of many of the SAP products that a mobility programmer would need.

For some reason, Oracle, Microsoft, and IBM are able to make life easier for small developers. But for a hint of what it’s like for a small developer to work with SAP, take a look at this blog post (on SAP’s own SCN community site!): Why Does SAP Make This So Complicated?

Two Good Announcements, But More Work Needed

SAP, to its credit, appears to understand the problem. In its announcements, two were especially noteworthy in terms of addressing the needs of the developer masses and in terms of filling out SAP’s own mobile apps portfolio.

  • First, to enable those thousands of developers, SAP announced partnerships with three leading mobility development tools providers: PhoneGap (recently acquired by Adobe), Appcelerator, and Sencha. These will allow developers, working with tools they are already familiar with, to build apps using a new OData connector to integrate with SAP’s back-end systems. In addition, these tools will allow many simple apps to be built without having to rely upon SAP’s Mobility Platform.
  • The second big announcement was that SAP is acquiring Syclo, which has its own mobility platform as well as a suite of well-regarded field service and asset management applications. SAP said that what it is really after here is not the platform (which overlaps the functionality of SAP’s Mobility Platform) but the field service apps. This is another step in SAP’s effort to build out a portfolio of its own out-of-the-box mobile apps. Based on my own work with clients, I know that field service and asset management are, in fact, top use cases for mobility in enterprise systems. If SAP can continue to buy or build collections of key mobility apps like Syclo’s, it will begin to fill out the major white spaces in its mobility portfolio, while still leaving much room for third-party developers to fill in the rest.

These are welcome announcements, but will they be enough? I don't think so. These announcements give the developers new tools, but they don't address the problem of getting access to an SAP development system for testing. Some of my associates, such as Vijay Vijayasankar and Dennis Howlett, echo this concern:

Dennis writes,

Needless to say there is a major hitch: developers who want to build apps with SAP data need access to a NetWeaver instance to test and model. Customers would have that, but small developer shops without an SAP license would not have that access without pricey, hair-pulling hurdles, which Sikka acknowledged during the press conference was a “19th century” approach. When pressed on this issue, SAP’s Fawad Zakariya, VP of Mobility and a key player in mobility ecosystem development reporting directly to Poonen, asserted that good news on this front was coming.

In a similar vein, Vijay writes,

I cannot stress enough on the licensing and monetization model to be figured out upfront – without that, access to software is practically meaningless. Developers have a lot of choice today, including many OSS choices. SAP needs a compelling story for them to use SAP technology….

… we are not sure how SAP handles the licensing/pricing in this scenario. And without that clarity coming real quick – I doubt if scores of developers will jump in and start developing cool apps. Sanjay Poonen responded on twitter few days ago that SAP will get it right quickly, and I totally trust him to do so – hopefully by SAPPHIRE in Orlando.

So here we have it. SAP is making significant progress in currying favor with small developers, but it still doesn't have a complete solution for giving them access to test versions or sandbox instances of SAP back-end systems.

Listen to the Developers

Some of my associates are still concerned that SAP has not found the right “pricing model” for mobility apps, but I think that is last year’s debate. Although not part of the formal announcements last week, it appears SAP is working on a pricing scheme that differentiates between major functional mobility apps, casual apps, and even “free” apps. Add in occasional one-off “enterprise pricing” for very large corporate deals, and I don’t think pricing needs to be an obstacle. Everyone can make money and customers can pay appropriately.

But the licensing problems are more systemic within SAP and most likely face legal or organizational resistance based on “how we’ve always done business.”

I am not a mobile apps developer. Therefore, I have no experience on which to judge when SAP will have all the pieces in place to encourage, in the words of SAP, one million developers to bloom. I can only look to those small developers already within SAP’s ecosystem for their reaction—when they are happy, then I’ll know SAP is on the right path. And what I’m hearing from them so far is that they’re still concerned about the licensing issues.

With SAP as the largest enterprise application company in the world, the mobility announcements are welcome news, but we still don’t have a total solution to enable thousands, let alone millions, of developers. I look forward to hearing about progress reported out of the SAPPHIRE conference in a few weeks.

You can watch a video of the entire press conference.
 

Photo credit: Flickr/docentjoyce

Related Posts

SAP’s Database Strategy Faces an Uphill Battle
SAP in Transition on Mobile, Cloud, and In-memory Computing
SAP Innovating with Cloud, Mobile and In-memory Computing


SAP’s Database Strategy Faces an Uphill Battle


Last week I attended a half-day SAP press conference in San Francisco on the subject of SAP’s strategy for database technology and mobility. Both are hot topics in enterprise software, and there were plenty of announcements. In fact, it’s quite easy to get lost in the weeds. So, in this first of two posts, I’ll try to summarize what I see as the big picture for SAP’s database strategy.


SAP Positioning Itself as a Database Company

When SAP acquired Sybase in 2010, it said it was doing the deal primarily for Sybase’s mobility platform. But Sybase also has its traditional relational database products, led by its Adaptive Server Enterprise (ASE) database. At the same time, SAP itself has been rolling out its HANA in-memory database (IMDB) technology. Until now, these two database product lines were managed separately, but no longer: SAP is consolidating all its database offerings—HANA and Sybase’s—along with middleware and tools, under one management unit.

There were many tactical announcements. SAP announced general availability of its BW business intelligence product on HANA, and its plan to make HANA available later this year as the database of choice for small-business customers of its Business One ERP product. In addition, later this month customers will have the flagship Sybase ASE database available as a deployment option for SAP’s Business Suite and All-in-One products.

SAP rolled out all of these announcements under the banner of its plan to become known as a database company.

Database Migrations Difficult to Justify

After the press conference, one SAP executive sensed my ambivalence about this plan. With Oracle taking an ever-increasing adversarial position toward SAP, I can understand SAP’s discomfort with having a large percentage of its best customers running on Oracle’s database. At the same time, the other two major providers of relational databases (IBM and Microsoft) are SAP-friendly. IBM is SAP’s largest system integration partner, while SAP and Microsoft often find their technology interests aligned. So, how do you threaten Oracle while not also threatening IBM and Microsoft?

Furthermore, does SAP honestly believe that existing SAP customers are going to migrate in droves from Oracle, IBM’s DB2, or Microsoft SQL Server to HANA or ASE? In the case of business analytics, there may be some movement toward HANA, yes, as the value of in-memory performance for analytic applications is somewhat easy to envision. But what about SAP’s business applications, such as Business One, All-in-One, and the Business Suite? With all the challenges and demands placed on CIOs these days, it’s difficult to imagine an installed SAP customer undergoing a database migration, simply to eliminate some Oracle, or DB2, or Microsoft SQL Server licenses. SAP insists there is business value for HANA in some transaction processing—and I can see that, say, in supply chain management. But is that enough to justify a database migration? Even less so, why would a customer swap out Oracle, DB2, or Microsoft for ASE, which is essentially a like-for-like product? I just don’t see it.

In side-bar discussions, SAP executives basically agree. Alright then, so the target is net-new application customers? But here the challenge is essentially the same. In most cases, business apps prospects already have skills and experience with Oracle, DB2, or MS SQL Server. Are they really going to want to invest in learning Sybase ASE, or HANA? Unless they can completely eliminate those other database platforms from their environments, going with Sybase or HANA is adding to their complexity, not simplifying things.

I think that selling databases is going to be a harder row to hoe than SAP is making it out to be.

Subsidizing HANA May Meet Complications

Perhaps recognizing the challenge, SAP realizes it is going to have to sweeten the pot, especially for HANA. So, at the press conference, SAP announced that it is putting up some serious money, through two funds:

  • For new and existing customers: a $337 million fund to subsidize services for SAP customers to convert to HANA. I assume the initial target for these funds will be migrating business analytics customers to HANA.
     
  • For technology start-ups: a $155 million venture capital fund through SAP Ventures for start-ups to build new apps on HANA.

It’s encouraging that SAP is putting its money where its mouth is. For customers, making a database change may not be cost-justified without some help from SAP. Moreover, tech start-ups may need some financial encouragement to build new products on HANA.

However, I see complications with each of these funding efforts.

  • With the customer fund, there may be issues with SAP’s partners. By funding SAP’s own services to assist with HANA, SAP is taking work away from partners, who typically play a key role in SAP implementations and migrations. In response to my question on this, SAP executives said that it will bring partners into this work at some point in the future.

    Nevertheless, I have to believe that, at first, partners will view SAP as increasing its share of services at the partners’ expense. This is especially true under current economic conditions where customers can only absorb a certain amount of change at once. Moreover, by delivering HANA services directly, SAP delays giving partners the HANA experience they will need for the future. SAP can solve this problem, of course, by ponying up the money but letting customers choose whether to use SAP’s professional services group or partners to deliver the services, or by co-delivering services with partners.
     
  • For start-ups, HANA may not be as attractive as SAP thinks. Looking back at SaaS and other tech start-ups over the past decade, most of them chose to build on open-source database technologies, such as MySQL or PostgreSQL. The reason, of course, is that open-source infrastructure minimizes their own costs as they grow. It also leaves more customer budget available to invest in the application, instead of the required infrastructure.

    I once asked a start-up executive why his firm was building on MySQL instead of Oracle. He replied, “Oracle scales technically, but it doesn’t scale economically.” I have to wonder if HANA will face the same resistance, even with funding from SAP Ventures. A quick check with associates indicated that there are already open-source in-memory databases (IMDBs), including CSQL and VoltDB. I have no knowledge of the capabilities of these products or how they compare with HANA. It is likely that HANA is head and shoulders above open source alternatives. But Oracle’s flagship database was and still is head and shoulders above open source capabilities, and that didn’t stop cloud start-ups from using MySQL and PostgreSQL.

Ultimately, very few organizations want to buy databases—or middleware. They want business applications, and those apps require databases and middleware as part of the technology stack. So, when SAP talks about becoming a database company, it’s hard for me to become excited.

Perhaps SAP already knows that it's going to be difficult. Earlier this year, it began floating the idea of making its goal, "to become the No. 2 database provider by the year 2015." But by the time of the press conference, the goal had been watered down to “becoming the fastest growing database provider.”

When you are starting from such a small market share, becoming the “fastest-growing” is not a very high bar.

Update, Apr. 16: Some deeper questions on Oracle's database strategy from Jonathan Wilson. And, a good post from Vitaliy Rudnytskiy, pointing out that HANA is more than an "in-memory database."

Related posts

SAP in Transition on Mobile, Cloud, and In-memory Computing
SAP Innovating with Cloud, Mobile and In-memory Computing


Research Summary: Why the Move from Transaction to Experience Requires Better Analytics


Foreword and Commentary

This trends report examines how changing expectations among business leaders and the consumerization of IT will shape the future of insights and decision-making. As organizations make the move from transactions to engagement to experience, a new type of analytics will be required.

A. Introduction

Business leaders seek better insights for smarter decision-making. Unfortunately, today’s traditional intelligence tools were designed for two-dimensional transactional systems. As data from consumer trends such as mobile, social, cloud, big data, and video make their way into the enterprise, organizations seek new tools to discern insight from these new engagement and experiential systems.

The shift from transaction to engagement to experience depends on better business analytics. Success requires that new business analytical tools support the information supply chain as data moves from a cacophony of upstream data sources to new and innovative downstream modes of consumption.

B. Research Findings – Why the Move from Transaction to Experience Requires Better Analytics

Leaders seek more than just reporting and dashboards; they expect to make real decisions. A recent Constellation Research survey identified key expectations of business analytics, including: supporting business strategy and planning; optimizing costs across the value chain; identifying hidden patterns and relationships in big data; providing context for relevant engagement; and predicting demand in networks.  Along with these key trends, the report discusses:

  1. How Five Consumer Forces Influence the Future of Analytics
  2. How Business Leaders Move Beyond Simple Reporting and Dashboards in Their Expectations of Business Analytics
  3. Why Organizations Seek Insight to Make Better Decisions in the Shift from Transaction to Experience
  4. How Big Data Provides the Key Element in Moving from Real-Time to Right-Time

Figure 1. Moving From Transaction To Experience

 


The Bottom Line: Business Analytics Must Support Decision-Making Across the Information Supply Chain

The shift from transaction to engagement to experience depends on better business analytics. Success for business leaders requires that new business analytical tools support the information supply chain as data moves from a cacophony of upstream data sources to new and innovative downstream modes of consumption (see Figure 7).

  • Classify. Incoming information must be tagged and associated with relevant metadata and context.
  • Transform. Information must be converted to standards and conventions that all downstream systems can consume.
  • Augment. Related information must be attached to the data.
  • Secure. Access to, securing of, and masking of information must be applied.
  • Deliver. Information and insights should be delivered to the relevant input nodes.
  • Refresh. Periodic updates to information must be performed to keep data relevant.
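As a thought experiment, the six stages above can be read as a simple sequential pipeline, where each stage enriches a record before passing it on. A toy sketch in Python (the stage logic, function names, and dict-shaped records are my own illustrative assumptions, not part of the report):

```python
# Each stage takes a record (a dict) and returns an enriched copy.
def classify(record):
    # Tag incoming information with relevant metadata and context.
    return {**record, "tags": ["customer-feedback"]}

def transform(record):
    # Normalize to shared standards and conventions.
    return {**record, "text": record["text"].strip().lower()}

def augment(record):
    # Attach related information to the data.
    return {**record, "related": []}

def secure(record):
    # Mask sensitive fields before downstream delivery.
    return {**record, "email": "***"}

def deliver(record):
    # Hand the record to the relevant consumption node.
    return record

def refresh(record):
    # Mark the record for periodic re-processing to keep it relevant.
    return {**record, "refreshed": True}

PIPELINE = [classify, transform, augment, secure, deliver, refresh]

def run(record):
    for stage in PIPELINE:
        record = stage(record)
    return record

result = run({"text": "  Great SERVICE  ", "email": "a@b.com"})
print(result["text"])  # "great service"
```

Real analytics platforms implement each stage with far richer machinery, of course; the sketch only shows why ordering matters, e.g. securing data before delivering it.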

C. Report Links

See how analytics play a major role in the shift from transaction to experience.  Buy the full research report on the Constellation Research website.

Your POV.

How are you using analytics to improve engagement and experience? Let us know your experiences.  Add your comments to the blog or reach me via email: R (at) ConstellationRG (dot) com or R (at) SoftwareInsider (dot) com.

Reprints

Reprints can be purchased through Constellation Research, Inc. To request official reprints in PDF format, please contact sales (at) ConstellationRG (dot) com.

Disclosure

Although we work closely with many mega software vendors, we want you to trust us. For the full disclosure policy, stay tuned for the full client list on the Constellation Research website.

Copyright © 2001 – 2012 R Wang and Insider Associates, LLC All rights reserved.

 


Vendor Event: Why Does Everyone Need Analytics?


 

Wednesday, April 25, 2012
10:00 am Pacific
1:00 pm Eastern

Presenters
Ray Wang
Principal Analyst and Chief Executive Officer
Constellation Research, Inc.

Patrick Morrissey
Vice President
Tidemark

Jennifer Maddox
Director of Product Marketing
Tidemark

Why the Move from Transaction to Experience Requires Better Analytics

Business leaders seek better insights for smarter decision-making. Unfortunately, today’s traditional intelligence tools were designed for two-dimensional transactional systems. As data from consumer trends such as mobile, social, cloud, big data, and video make their way into the enterprise, organizations seek new tools to discern insight from these new engagement and experiential systems.

In fact, leaders seek more than just reporting and dashboards, they expect to make real decisions. A recent Constellation Research survey identified key expectations from business analytics to include: supporting business strategy and planning; optimizing costs across the value chain; identifying hidden patterns and relationships in big data; providing context for relevant engagement; and predicting demand in networks.

The shift from transaction to engagement to experience depends on better business analytics.  Success requires that new business analytical tools support the information supply chain as data moves from a cacophony of upstream data sources to new and innovative downstream modes of consumption.

Why Does Everyone Need Analytics?  Because work—as we know it—soon won’t be as we know it.

Think real time. Connect finance and operations together with plans, forecasts, budgets, and a global view of the business so that everyone can see what they need, the way they need it.

Think mobile. Manage your business anywhere with a swipe of your finger from an iPad or mobile device by updating plans, doing real-time analysis, and collaboration.

Excel who? Think analytics for all.

Come join us for an interactive discussion and webinar! Register today!

 


Constellation Research Kicks Off 2012 SuperNova Awards for Innovators in Technology


All-Star Judges Select Semi-Finalists for the Second Annual Awards Event That Recognizes the Explorers, Pioneers, and Unsung Heroes That Successfully Put Technology to Work

Finalists to win $100K in prizes from Constellation Research

Semifinalists invited to speak at Constellation’s Connected Enterprise Innovation Summit

SAN FRANCISCO, CA, April 10, 2012—Constellation Research, Inc., announces the kickoff of the 2012 SuperNova Awards, the only awards to recognize trailblazers that have overcome the odds to successfully implement new technologies within their organizations. The 2012 SuperNova Awards will recognize technology leaders in the following categories:

  • Big Data
  • Future of Work
  • Consumerization of IT and the new C-suite
  • Technology Optimization
  • Next Generation Customer Experience
  • Matrix Commerce

 

Esteemed judges, selected for their accomplishments in the industry, will evaluate applicants against a rigorous set of criteria focused on real-world, pragmatic applicability. The judges will select semifinalists who embody the human spirit to innovate, overcome adversity, and successfully deliver market-changing approaches. Semifinalists will be announced and invited to Connected Enterprise, Constellation’s Executive Innovation Summit, on June 29, 2012. Semifinalists will receive VIP access and admission to the event, and many will be selected to speak.

SuperNova Award finalists and winners will be announced at the SuperNova Awards Gala on November 9, 2012 at Constellation’s Connected Enterprise Innovation Summit, planned for The St. Regis Monarch Beach, Newport Beach, California. 

“The SuperNova award honors the true industry heroes: innovators who are solving business problems by putting technology to work in new ways that are changing the outcomes of business,” said R "Ray" Wang, founder and president, Constellation Research. “The SuperNova Awards program is fueled with the momentum of hundreds of entries, and we’ll be honoring the most impressive this June at our Innovation Summit.”

Constellation Research encourages all tech evangelists to submit for a SuperNova Award. More information about the awards can be found here: http://www.constellationrg.com/supernova-awards-2012

Event Report: Clarabridge Customer Connections 2012 #cbc312


Clarabridge “Turns Up The Heat” On Delivering Context For Customer Experience

CEO Sid Banerjee opened Clarabridge‘s 4th annual user conference for 350 customers at the Doral Golf & Spa in Miami, FL, on March 5, 2012.  Clarabridge, a sentiment and text analytics software provider, helps companies discern insight from their text-based customer feedback and the growing plethora of social and mobile data points.  The goal: aggregation of insights from qualitative analytics that transform key organizational processes in customer experience, new product development, and employee satisfaction.

Clarabridge has shown success with a Global 1000 customer list that spans key verticals in technology/telco, retail/CPG, manufacturing, travel/hospitality, and financial services/insurance.  Major clients include Bank of America, Best Buy, Cisco, Dell, Disney, Fidelity, General Mills, Hilton, IHG Hotels, Kaiser Permanente, Marriott, Siemens, Sony, T-Mobile, United Airlines, Verizon, Visa, Walgreen’s, Walmart, and Zynga.

Some highlights from the event include:

  • Keynote from customer experience transformist Bruce Temkin. Bruce’s keynote discussed how organizations apply Voice of the Customer (VoC) programs to augment customer experience.  Temkin highlighted his VoC Maturity assessment methodology that drills in on six key areas – detection, dissemination, diagnosis, discussion, designing, and deploying.  The key quote from Bruce was “Customer feedback is cheap, actionable insight may be valuable, but taking action on insight is precious. VoC programs are useless unless you act on what you find.”
  • Best practices discussions from Global 1000 companies. Leading brands such as Acer, Best Buy, B/E Aerospace, Charming Shoppes, Choice Hotels International, Inc., Dell Inc., Expedia, Estée Lauder, Fidelity Investments, GE Appliances, United Airlines, Sage, Verizon, Vodafone, Wendy’s, Walmart, and Zynga shared best practices.  Experiences from Wynn Parrish, VP Product Support of B/E Aerospace, showed how customer management and warranty liability could be minimized.  Michael Silverman of Silverman Research highlighted how Unilever uses VoC for internal employee programs.  One of the highlights was Jared Anderson (Best Buy) and Jonathan Sunberg’s (Confirmit) panel on voice of the customer at the leading edge.
  • Official details on the Clarabridge 5.0 launch. The launch of Clarabridge 5.0 provides the foundation for a customer insight data analytics hub (See Figure 1).  As part of the launch, Clarabridge Collaborate adds integrated notifications and alerts.  A new satisfaction scoring and sentiment transparency capability brings customer satisfaction scores into the equation to determine customer loyalty and retention programs.  Many attendees expressed interest in the new theme and event detection capabilities which provide custom categorization models to quickly surface new trends.  Last but not least, the natural language processing engine now supports Italian, Dutch, and Japanese.

Figure 1. Transforming Feedback Into Insight

Figure 2. Scenes From Clarabridge C3


Source: R Wang and Insider Associates, LLC. All rights reserved.
The Bottom Line: Voice of Customer Programs Key to Improving Customer Engagement

VoC programs are a time-honored tradition for capturing a customer’s expectations, preferences, and aversions.  As the first step in any social business program, listening is critical to success.  Without understanding the customer or competitive landscape, organizations lack the insight required to address the root causes behind customer satisfaction and experience.  Customer service professionals can expect these techniques to extend beyond text-based systems.  Early adopters of social business know they must build competencies in social media monitoring, text analytics, and social analytics.  In fact, a digital divide will grow in customer experience between organizations that extend VoC techniques beyond social and those that fail to fast-follow the early adopters.  Why? Because customer experience is the only defensible position in today’s market.

Your POV

What strategies and tactics are you using to drive engagement?  How do you measure success?  Add your comments to the blog or send us a comment at R (at) SoftwareInsider (dot) org or R (at) ConstellationRG (dot) com.

Please let us know if you need help with your Social CRM/ Social Business efforts.  Here’s how we can assist:

  • Assessing social business/social CRM readiness
  • Developing your social business/ social CRM  strategy
  • Vendor selection
  • Implementation partner selection
  • Connecting with other pioneers
  • Sharing best practices
  • Designing a next gen apps strategy
  • Providing contract negotiations and software licensing support
  • Demystifying software licensing



* Not responsible for any factual errors or omissions.  However, happy to correct any errors upon email receipt.

Contact the Sales team to purchase this report on an à la carte basis, or join the Constellation Customer Experience!

 


Dolby Audio Conferencing Revisited


In January 2012, I wrote about Dolby Labs entering the audio conferencing market with wide-band, spatially oriented audio technology. At that time, I stated that the company needed to do a bit more to prove to me that it really had something differentiating. In the initial demo, there was nothing to compare Dolby’s audio with, so I challenged the Dolby team to provide a way for people to hear and immediately compare a narrow-band PSTN audio conference, a regular wide-band audio conference, and a Dolby audio conference.
 
The company was up to the challenge at Enterprise Connect 2012, creating an environment in which the three types of audio conferences could be immediately and simultaneously tested. Dolby set up an Asterisk PBX that supported G.711 narrow-band audio and G.722 wideband audio.
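For readers curious how such a test bed keeps the two codec paths separate, Asterisk selects codecs per SIP peer in its configuration. A hypothetical sip.conf fragment (the peer names are made up; only the disallow/allow directives are standard Asterisk syntax):

```ini
; Wideband peers: offer only G.722 (16 kHz) audio
[cisco-wideband]
type=friend
disallow=all
allow=g722

; Narrow-band peers: offer only G.711 mu-law (8 kHz) audio
[grandstream-narrowband]
type=friend
disallow=all
allow=ulaw
```

Pinning each phone to a single codec this way ensures that every participant in a given conference hears the same bandwidth, which is what makes a side-by-side comparison meaningful.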
 
Three different conferences were active simultaneously:
 
  1. A wideband audio conference anchored in the Asterisk PBX. All participants used a Cisco phone along with a mono headset.
  2. A narrow-band audio conference anchored in the Asterisk PBX. All participants used a Grandstream phone with a mono headset.
  3. A wideband audio conference anchored in Dolby’s media server. All participants used Mac personal computers with USB-based stereo headphones, and spatial speaker orientation was enabled.

Both wideband conferences were clearly superior to the narrow-band audio conference. The G.722 wideband audio was actually quite good. It was not stereo, but it did have good sound quality. The Dolby sound quality over the stereo headset and with the spatial orientation was excellent.
 
I asked if I could get a stereo headset for the G.722 wideband system, but we were unable to secure one at the moment.
 
To further test the different audio solutions, I asked the other participants to all speak at the same time.
 
In the narrow-band conference and in the G.722 wide-band conference, the audio was a garbled mess. In the Dolby conference, the difference in clarity was amazing. Even though multiple people were speaking over one another, the spatial audio through the stereo headphones made it possible to hear every speaker quite clearly. The topic of conversation was what each speaker had eaten for breakfast, and we dubbed this remarkably different-sounding conference the “breakfast conversation”. Dolby continued to use the breakfast conversation throughout the three days of demonstrations to show just how different its solution is.
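Spatial separation is what made the overlapping speakers intelligible: each participant is rendered at a distinct position in the stereo field, so the brain can use the level differences between the ears to isolate individual voices (the classic "cocktail party" effect). Dolby has not published its algorithm, but the basic idea can be sketched with simple constant-power panning; the function names and mixing scheme here are illustrative, not Dolby's implementation:

```python
import math

def pan_gains(azimuth_deg):
    # Map an azimuth in [-45, +45] degrees onto the pan angle [0, pi/2],
    # then apply the constant-power law: left = cos(theta), right = sin(theta).
    theta = (azimuth_deg + 45.0) / 90.0 * (math.pi / 2.0)
    return math.cos(theta), math.sin(theta)

def mix_conference(speakers):
    # speakers: list of (mono_samples, azimuth_deg) pairs.
    # Each mono stream is panned to its own position, then summed into
    # a single stereo (left, right) mix for the listener.
    n = max(len(samples) for samples, _ in speakers)
    left, right = [0.0] * n, [0.0] * n
    for samples, azimuth in speakers:
        gl, gr = pan_gains(azimuth)
        for i, x in enumerate(samples):
            left[i] += gl * x
            right[i] += gr * x
    return left, right

# Three simultaneous speakers spread across the stereo field
left, right = mix_conference([
    ([0.5, 0.5], -45),  # hard left
    ([0.5, 0.5], 0),    # center
    ([0.5, 0.5], 45),   # hard right
])
```

In a mono mix all three voices collapse onto one channel and mask each other; with panning, each voice arrives with a distinct left/right balance, which is what made every speaker in the "breakfast conversation" audible.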
 
Here are my key takeaways:
  • Dolby is on to something with its media server and client audio technology. The audio experience is clearly superior.
  • When multiple people speak, Dolby’s audio is second to none. This is important because in an audio-only conference, there are no visual cues to help people avoid speaking over one another. It happens rather frequently, and the ability to distinguish individual words and phrases is very helpful. Usually, once people realize someone else is speaking, one of them stops, but the ability to distinguish words in these situations may relieve some of the cueing problems found in regular audio conferences.
  • The Dolby solution does require stereo speakers or a stereo headset. This is an issue because most Bluetooth headsets, and many high-end call center and executive Bluetooth, DECT, and wired headsets, have only one ear bud or speaker. Dolby indicated that it is speaking with Bluetooth chipset manufacturers, encouraging them to develop stereo capabilities. (Plantronics was, interestingly enough, just across the hall from Dolby at this event, so the stereo headset message clearly got to that manufacturer.) At issue is whether people will change their behavior and broadly adopt stereo headsets. From a vanity perspective, stereo headsets with a wire or a band connecting the ear buds or speakers will mess up a coiffed hairdo. Worrying about hair may sound trivial, but it is a real adoption factor, particularly in an office setting or where people regularly switch between audio-only and video conferences.
  • Dolby needs to address how the solution will work in a conference room. People will not come into a conference room and put on a headset.  
Overall, the technology is very promising. The real questions are whether people will pay a little more for a high-quality audio conference and whether they will wear a headset. Some audio conferencing solutions, delivered as premises-based systems or through conferencing service providers, may choose to offer Dolby’s audio conferencing capability as a key differentiator.

The Problem with Presence


At Enterprise Connect 2012, one of the session speakers made the following statement: “Presence is not about being productive, it is a way to interrupt.” 
 
The speaker went on to say that the presence icons one sees on a buddy list are of no value because they are not context aware (I can only assume he was referring to his own company's offering as well as the presence indicators from competitors). He then introduced a concept called “awareness”, likening it to a personal assistant that prepares everything for a meeting in advance so that when the meeting starts, all of the supporting content is ready to go.
 
Discussions about context awareness are not new, and I specifically remember having them with Microsoft and Cisco some years ago. Clearly, context is an important element as one considers the overall communications environment in which people work.
 
I do take exception, however, to the idea that presence has no value unless there is rich context surrounding it. Clearly, the more context that can be built into an IM/presence engine, the better, assuming the context is accurate. The better presence and IM solutions do aggregate presence in the following ways:
  1. Computer presence (logged in, typed on the keyboard recently)
  2. Telephone presence (in a call, not in a call)
  3. Calendar presence (in a meeting, not in a meeting)
  4. Mobile phone presence (not all IM/presence engines can do this, but some can)
  5. Location (some try to determine location automatically; for others, you must set it)
  6. “What’s happening” status (a short message one can type into the buddy list indicating what one is presently working on)
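These sources can be combined with a simple precedence rule: the "busiest" signal wins, so an active phone call outranks an idle calendar. A minimal sketch of that aggregation (the source names, states, and priority order are illustrative, not any particular vendor's scheme):

```python
# Busiest-wins precedence: earlier entries outrank later ones.
PRIORITY = ["do_not_disturb", "in_call", "in_meeting", "away", "available", "offline"]

def aggregate(signals):
    # signals: dict mapping a presence source to the state it reports,
    # e.g. {"telephony": "in_call", "calendar": "in_meeting"}.
    reported = set(signals.values())
    for state in PRIORITY:
        if state in reported:
            return state
    return "offline"  # no source reporting at all

status = aggregate({
    "computer": "available",    # logged in, keyboard recently active
    "telephony": "in_call",     # off-hook on the desk phone
    "calendar": "in_meeting",   # a meeting is on the calendar
})
print(status)  # -> in_call: the phone call outranks the calendar entry
```

The precedence list is where "discipline" lives: a system that resolves conflicting signals automatically keeps the published status trustworthy, which, as argued below, is the precondition for people relying on presence at all.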
I would agree with the Enterprise Connect speaker that *undisciplined* use of presence can indeed be a serious interrupter and time waster. I recently wrote a Quark that points to disturbing research that shows that social media and unified communications will cause employee distraction.
 
However, disciplined use of presence, even if it is just computer presence, can be a godsend. Simple computer presence can almost entirely eliminate internal voice mail. It can also eliminate repeated attempts to contact people through different communications channels. While it seems laughable given the tools available today, there are many people who receive an email, a desk phone voice mail, and a mobile phone voice mail all about the same topic. If speech-to-text is enabled on either the desk phone or the mobile voice mail, the user gets yet another email message about the same topic.
 
Stop already! Presence and IM can eliminate most of this communications overload and free people from email and voice mail jail.
 
Disciplined use of the presence and IM system can be extremely helpful, *if* organizations will establish a few rules of etiquette:
 
  1. Update your presence status (or better yet, get a system that does most of the updating automatically). If the presence status is wrong, people won’t trust it and will use other methods to contact one another.
  2. Don’t sleuth. It is annoying when someone sits on the network watching the presence status of others while not revealing their own. If you don’t want people to bother you, simply log off, and they will know to send you an email – or better yet, change your status to DND (do not disturb), and the person will send you an email.
  3. Ask before you call. It is a courtesy that is both respectful and time saving. Most often, the person you are trying to contact will respond that they can take the call or tell you when they can take it. It just saves leaving a voice message or completely interrupting someone.
  4. Keep IM to short transactions. If a conversation goes beyond just a few back and forth text exchanges, it is often faster and more productive to escalate the conversation to a phone call.
 
These simple courtesies and practices can make presence and IM become capabilities one won’t want to live without. Organizations that faithfully use these guidelines will see significant productivity gains, even from public presence and IM systems, like Yahoo!, AOL, Skype, and Google. They get even more value from more sophisticated enterprise IM/presence solutions like Microsoft Lync, IBM Sametime, Cisco Jabber, Avaya one-X/Flare, and similar systems from other vendors that add context to presence and IM.
 
As the speaker said, awareness and context are clearly useful, but to state that presence/IM does not enable productivity is clearly in error and ignores the fact that many, many enterprise and consumer users find it highly valuable. It is undisciplined presence/IM that we need to watch out for and correct, augmenting IM/presence with context where possible.

Microsoft Dynamics ERP on Azure: What Are the Benefits?


Last week I attended Microsoft’s annual Convergence conference, for users and partners of its Dynamics line of enterprise applications. The back-to-back briefings were a great opportunity to get an update on where Microsoft is going with enterprise applications.

But the big news from my perspective is that by the end of 2012, two of Microsoft's ERP products, GP and NAV, will be available on Microsoft's Azure cloud.

Click on the video interview at the right for my initial thoughts, which I am expanding upon in this post.


Azure Complements Existing Hosted Offerings

Microsoft customers have always been able to deploy NAV (formerly, Navision) and GP (formerly, Great Plains) on-premises. In addition, some customers have chosen in the past to have Microsoft partners host their systems in partner data centers. MyGPCloud is one of the largest such partners, hosting GP for thousands of small business customers. Likewise, Tribridge offers similar hosting services for all Dynamics ERP products.

Now, Microsoft is offering customers the option to deploy their GP or NAV systems on Microsoft's Azure cloud, which runs in Microsoft data centers. This offering will not replace partner hosting but simply will be another deployment option for customers.

Through back channels, I've heard some partners express uncertainty about this new development. Is Microsoft attempting to go direct with customers? How will the partners make money? During the session, Microsoft executives made clear that, under Azure deployment, partners will still maintain the customer relationship and deliver the services for implementation and ongoing support. The only difference is that with the Azure deployment option partners will be relieved from the need to maintain data center infrastructure.

What Are the Benefits?

Over the past two years, I've been one of those encouraging the Dynamics team to go faster in moving to Azure, as cloud ERP is already available from competitors. But now that Microsoft is on the verge of actually doing it, I wanted to know, what are the benefits? Specifically, if customers can already have these systems hosted by a Microsoft partner--and if Microsoft will still work through partners in selling and supporting systems deployed on Azure--what are the added benefits of Azure?

I asked this question a year ago at Convergence and, frankly, the answers were not that clear. After asking this same question in several briefings this year, and adding my own analysis, I think the benefits picture is now emerging.

  • Azure deployment is cheaper than hosting. Azure is a true elastic cloud platform, with data center economies of scale that traditional hosting cannot come close to matching. This should allow Microsoft to price these services at a lower cost than what partners can offer.
  • Azure deployment scales beyond partner hosting. As a true cloud platform, Azure deployments can scale instantly beyond what partner hosting can offer. Hosted ERP relies upon dedicated resources, which must be planned and expanded manually to meet changing customer requirements. With Azure, customers are far less likely to exhaust the available resources.
  • Azure supports worldwide deployments better than partner hosting does. Microsoft runs Azure data centers worldwide and can move customer systems and data between them as needed. Hosting partners do not have this capability, unless they are utilizing a true cloud IaaS, such as Amazon's EC2. The move to Azure is therefore a better choice for organizations that are running separate instances in different parts of the world.
  • Azure deployment provides easier version upgrades. With partner hosting, upgrades and maintenance are handled more or less as they are with on-premises software: each customer is treated separately (though I suspect some partners are more organized about this than others). With Azure deployment, Microsoft will have a more disciplined approach to application management: rolling out new versions, upgrades, and patches to its customers, similar to what it does today with Microsoft CRM (even though, as I point out in the interview, CRM is not yet an Azure service).
  • Azure deployment is provided directly by Microsoft. Most new prospects will have a higher level of comfort with cloud services provided directly by Microsoft and backed by the Microsoft brand and service level guarantees. Hosting is often delivered by service providers who are relatively unknown. The direct Microsoft relationship is also simpler and easier to explain. The software comes from Microsoft and the cloud services are delivered directly by Microsoft.

It is also important to point out at least one advantage of Azure deployment over partner hosting that Microsoft is not claiming--that is, that Azure deployment provides the ability to inter-operate with other Azure services, such as Office 365 or other future Azure data services (some of which I was briefed on). Microsoft has made a big deal about its vision of the so-called "hybrid cloud," meaning that customers will be able to move selected "workloads" to Azure while keeping other workloads on-premises or in partner-hosted data centers. Therefore, if I want to inter-operate Microsoft's Office 365 with my NAV system, it should not make any difference if my NAV instance is on-premises, in a partner data center, or on the Azure cloud.

Optimizing Azure as a Cloud Platform

I am struck by the fact that I've had to piece together this value proposition for Azure ERP myself, lobbing softball questions to Microsoft executives, parsing their answers, and adding my own analysis. If Microsoft itself is not prepared to articulate the value proposition of Azure ERP, how can it expect that its customers or its partners will perceive it?

Therefore, I do not envision customers and prospects staging a mad rush to Azure. As I said in the interview linked above, what if Dynamics throws a party and no one shows up?

Nevertheless, from a strategic perspective, I do believe that moving to Azure is the right thing for Dynamics. Mike Ehrenberg, one of only a handful of Microsoft Technical Fellows, told us an interesting story. He said that when they first spoke with CEO Steve Ballmer about moving Dynamics ERP to the cloud they told him that they could do it in one of two ways:

  1. The quick way: hosting it in Microsoft data centers in a highly virtualized environment, as they had done with Microsoft CRM, or
  2. The strategic way: working with the Azure team to optimize the Azure capabilities needed to support true scalable enterprise business applications, such as SQL Azure, until it could support Dynamics ERP.

Mike reported that Ballmer thought for about two seconds before choosing the second option. He likened it to Microsoft Windows and Microsoft Office, years ago. It took the requirements of Office as a set of user applications to make Windows "become better" as a PC platform. Likewise, it would take Dynamics as a set of enterprise applications to make Azure become better as a cloud platform.
The problem, of course, is that it's taking much longer to develop Azure as an enterprise-class platform. In the meantime, competitors such as NetSuite, Workday, SAP, Plex, and others have already established themselves as cloud ERP providers and gained share in this emerging market. Nevertheless, Microsoft's entry into this market later this year is a welcome development that will mean an increasing number of choices for buyers.

Postscript: watch for Part 1 of my market overview of cloud ERP over the next few weeks.


ALU Opentouch Conversation Raises the Bar for UC Tablets


Alcatel-Lucent (ALU) unveiled its new Opentouch Conversation Tablet at Enterprise Connect, delivering a truly unified solution.  Unlike many business tablets that provide unified communications (UC) features, the ALU Opentouch software runs on the iPad as a single, highly sophisticated architecture that supports multi-party, multi-device, and multi-modal conversations.  What could have been complex and cumbersome is instead a sleek, single application that delivers this robust functionality.

The ALU Opentouch Conversation has several advantages that provide competitive differentiation.  It supports multiple devices (desk phone, smartphone, tablet, video endpoint, and PC/Mac) from a single interface and requires only one user license for all devices. One server can support up to 1,500 end users, greatly reducing the hardware footprint required by other vendors’ offerings.  The user can continue a conversation without interruption when moving from one device to another.  It also offers presence capabilities, so a user can view another’s availability prior to making contact.

The initial release supports the iPad, which is currently the favorite tablet among business professionals; future releases will also support other tablet devices.   The ALU solution stands out because it truly offers a unified experience for the user, a promise of UC that in reality has required the integration of many components.   As the tablet becomes the device of choice for many business professionals, ALU has engineered a solution that I believe is worth serious consideration.
