What Does IBM’s Cognitive Computing Era Mean For Collaboration?

One of the most common complaints from employees is that they can't keep up with the vast amounts of information available to them. Many of you have heard me talk about the issue not being email overload, nor even information overload, but rather input overload. We have too many places in which to look for, or respond to, people, content and conversations.

But what if computers could do that for you? What if they could eliminate having to know where content was, which was the most relevant, what needs your attention now or what you can avoid working on? Even better, what if computers could go beyond helping you prioritize and filter, and start to actually take actions on your behalf?

With ideas like these in mind, IBM has just announced their new strategic initiative, Cognitive Business.

You may have heard of IBM Watson, the computer that beat two Jeopardy grand champions. Well, that system was designed to do a lot more than just play trivia games! Watson does not just know what it's programmed to know; it learns and adapts, similar to the way humans do. This is called cognitive computing.

While IBM's marketing is currently focused on large industry initiatives like healthcare and finance, cognitive computing could also one day help individuals and teams collaborate more effectively. IBM's collaboration platform, IBM Connections, and its new email client, IBM Verse, could one day leverage the cognitive capabilities of IBM Watson to help people make sense of the vast amounts of information they are currently flooded with. In addition to helping people filter and prioritize information, one day the tools could recommend actions and even proactively respond.

Let’s call this next generation of personal productivity and teamwork, Cognitive Collaboration.

What Customers Should Consider

IBM is not the only company applying artificial intelligence to teamwork. While IBM will argue they are the only company with true "cognitive computing" (Watson), to the average person the technology behind the scenes does not matter as much as the end result. Other vendors are also adding digital assistant capabilities to their tools, such as Microsoft with Cortana and Google with Google Now. Salesforce also recently released SalesforceIQ, which proactively helps sales professionals organize their pipelines. Customers should speak with these vendors to get a list of current capabilities as well as future roadmaps for how their platforms will help employees get work done. It's important to note that most of these cognitive capabilities only work in cloud-based deployments, as the computing power for these systems is not offered on-premises.

What Partners Should Consider

The success of any platform is determined by the strength of its ecosystem. For collaboration software, it's critical that tools integrate with other enterprise applications and expose their features so that third-party developers can create add-ons and additional features. IBM Watson has a very strong focus on developers, and has been rapidly expanding the set of cognitive capabilities that are available for use in other applications. I look forward to seeing if the IBM partner ecosystem comes up with interesting ways to use Watson to help people collaborate.
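
To make the developer angle concrete, below is a minimal sketch of how a third-party collaboration add-on might call a cognitive service exposed as a REST API. The endpoint URL, credentials, payload and response shape are placeholders for illustration only, not the documented API of any specific Watson service.

```python
# Illustrative sketch only: the general pattern of calling a cognitive service
# exposed as a REST API from a collaboration add-on. The endpoint URL,
# credentials and response fields are placeholders, not a documented Watson API.
import requests

SERVICE_URL = "https://example-cognitive-service/api/v1/analyze"  # placeholder URL
CREDENTIALS = ("service-username", "service-password")            # placeholder credentials

def analyze_message(text):
    """Send a message to the (hypothetical) service and return its analysis."""
    response = requests.post(SERVICE_URL, auth=CREDENTIALS, json={"text": text})
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    print(analyze_message("Can someone review the Q3 budget draft before Friday?"))
```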

Moving in the Right Direction

I really like the new "Outthink" messaging. It bridges IBM's long-standing message of "Think" with this new cognitive era. I've not been a fan of the previous company-wide mantra "Be Essential", but I think trying to get every employee to rally around cognitive computing is a good move. I look forward to seeing how IBM ties together its collaboration tools and the Watson platform.


 


Teradata Expands Market Opportunity for Industry-Leading Data Warehouse on Amazon Web Services

In what could have been taken as an April Fool's joke had the date been April 1st, Teradata today announced support for its data warehouse on Amazon's AWS Cloud.

So let's dissect the press release (it can be found here) in our custom style:
SAN DIEGO – October 7, 2015 – Teradata Corp. (NYSE: TDC), the big data analytics and marketing applications company, announced today it is making its Teradata Database, the market’s leading data warehousing and analytic solution, available for cloud deployment on AWS to support production workloads. The initial version of Teradata Database on AWS will be offered on a variety of individual multi-terabyte virtual servers--known as Amazon Elastic Cloud Compute (EC2) instances--in supported AWS regions via a listing in the AWS Marketplace. […] 
MyPOV – So that sums it up well: the proven veteran Teradata Database will be running on Amazon AWS EC2 instances, with support for production workloads. For the longest time customers had to buy hardware from Teradata to run the Teradata Data Warehouse – but those times are over now. Definitely a good move and another proof point of a new approach at Teradata (which we already noticed at the analyst summit – see here): Teradata is not afraid to challenge its own revenue streams and traditions.

 
[…] AWS is the leading cloud service provider and has a global footprint with more than one million active customers in 190 countries. With Teradata Database on AWS, supported use cases include test and development, quality assurance, data marts, disaster recovery, and production analytics. The primary benefits to customers include:
• Wider accessibility to the market’s leading data warehouse and analytic solution
• Closer proximity of the database to data sources and partner software in the cloud
• Easier scalability with self-service provisioning and hourly pay-as-you-go convenience
MyPOV – Teradata is reaping all the benefits from running on AWS, supporting all possible deployments and use cases. Now Teradata is available to anyone with access to the web (and a credit card – we still need to see what Teradata will charge for using its software).
 
“This is a significant announcement for Teradata because it illustrates a fundamentally new deployment option for what has long been the industry’s most respected engine for production analytics,” said Chris Twogood, Vice President of Product and Services Marketing at Teradata. “In terms of convenience, security, performance, and market adoption, cloud computing has proven its value. By incorporating AWS as the first public cloud offering for deploying a production Teradata Database, we will make it easier for companies of all sizes to become data-driven with best-in-class data warehousing and analytics.”
MyPOV – Good quote from Twogood – note the emphasis on ‘first’ – so there may be more and different cloud providers coming down the road.

 
Twogood said that a growing number of existing and prospective customers want a hybrid mixture of deployment options – with some resources maintained physically on-premises and other services delivered virtually via the cloud. Representing Teradata’s initial version of Teradata Database for public cloud deployment, Teradata Database on AWS will expand the range of choices for companies to extract the greatest analytical insights for their organizations:
• On-premises integrated data warehouse: Teradata Platforms and Appliances
• Purpose-built managed environment: Teradata Cloud
• Self-service public cloud: Teradata Database on AWS
• Hybrid approach using a combination of the above
MyPOV – This adds a third deployment option for Teradata. Typically, public cloud deployments (we still need to learn about pricing as we read along in the press release) are cheaper for customers than on-premises / private cloud deployments, due to lower hardware utilization on premises. We will also have to see if Teradata will support bursting to the cloud, e.g. when on-premises capacity is maxed out. This is tricky given the nature of data-centric applications like Teradata's, but it can be achieved.
 
Additionally, Teradata Production and Advisory Services, delivered by a deep bench of industry and analytic experts within Teradata Professional Services, are available to assist new and existing customers with provisioning, integration, management, and fine tuning of Teradata Database across all deployment options.
MyPOV – Good to see Teradata is not missing the opportunity to sell services. Customers will initially need help to deploy these new capabilities. Over the longer run, though, these tasks should be automated through software, as software scales better than humans.
 
More information about Teradata Database on AWS is available from company representatives exhibiting at the AWS re:Invent annual user conference from October 6-9 in Las Vegas and at the Gartner Symposium/ITxpo 2015 event from October 4-8 in Orlando. Additional insight will also be shared during the upcoming Teradata 2015 PARTNERS Conference and Expo, the premier global data analytics conference taking place October 18-22 in Anaheim. Attendees are encouraged to visit Teradata’s Analytics in the Cloud station in the Expo Hall and participate in the cloud-focused PARTNERS breakout session titled, “The Rise of the Purpose-Built Analytics Cloud” on Tuesday, October 20 from 9:00 to 10:15 am.
MyPOV – Good 'bang for buck', hitting its own event and two other major events. Currently attending AWS re:Invent: the event has grown to 15k+ attendees, so there is a lot of attention on new announcements.
 
Availability
The initial version of Teradata Database on AWS will be available in Q1 2016 for global deployment. It will be offered on a variety of individual multi-terabyte Amazon EC2 instance types via a listing in the AWS Marketplace. EC2 is a web-based service that allows business subscribers to run application programs in the Amazon computing environment and pay only for capacity that is actually used. Customers will have the ability to deploy standalone on AWS or can complement both on-premises and Teradata Cloud environments.
MyPOV – Good to see a near-term availability date, and even more important to see that 'pay as you use' principles will be followed. Key also to see that an AWS-only deployment is supported, so the AWS deployment is a first-class citizen next to the Teradata on-premises and Teradata Cloud deployment options. But we are still missing what it will cost customers to use Teradata on AWS.
 

Overall MyPOV

A good move by Teradata, which is not stopping at challenging its past: its engineered systems business, optimized to run the Teradata Database. It echoes Microsoft's post-Ballmer strategy – analogous to 'Office everywhere', there seems to be a 'Teradata everywhere' strategy in place. Customers should understand the full cost of running the Teradata Data Warehouse on AWS, including license / usage payments to Teradata.

On the concern side we see uncertainty on pricing. The cost of moving to the cloud is a key planning and decision criterion for enterprises. We are certain that Teradata will not make this move without making the AWS-based Teradata Database reasonably attractive. But it would also not be good business acumen to make it too cheap – unless we see a departure by Teradata from building its engineered systems in the future. That would be a serious blow to the overall engineered (or converged) system approach, but maybe data warehouses can be run more cheaply on Linux-based 'plain vanilla' systems. We will see.

Overall a good move by Teradata, which really only begs the question: what took so long to get there?



More on Teradata
  • News Analysis - Teradata Launches First Enterprise Support for Presto read here
  • Progress Report - Teradata is alive and kicking and shows some good 'paranoid' practices read here
  • Check out my colleague Doug Henschen's view on Presto and the recent analyst event - read here

Spare us the outrage over Safe Harbor changes

For 35 years now, a body of data protection jurisprudence has been built on top of the original OECD Privacy Principles. The most elaborate and energetically enforced privacy regulations are in Europe (although well over 100 countries have privacy laws now). By and large, the European privacy regime is welcomed by the roughly 700 million citizens whose interests it protects.

Over the years, this legal machinery has produced results that occasionally surprise the rest of the world. Among these was the "Right To Be Forgotten", a ruling of the European Court of Justice (ECJ) which requires web search operators in some cases to block material that is inaccurate, irrelevant or excessive. And this week, the ECJ determined that the U.S. "Safe Harbor" arrangement (a set of pragmatic work-arounds that have permitted the import of personal information from Europe by American companies) is invalid.

These strike me as entirely logical outcomes of established technology-neutral privacy law. The Right To Be Forgotten simply treats search results as synthetic personal information, collected algorithmically, and applies regular privacy principles: if a business collects personal information, then lawful limits apply no matter how it's collected. And the self-regulated Safe Harbor was found to not provide the strength of safeguards that Europeans have come to expect. Its inadequacies are old news; action by the court has been a long time coming.

In parallel with steadily developing privacy law, an online business ecosystem has evolved, centred on the U.S. and based on the limitless resource that is information. Fabulous products, services and unprecedented economic success have flowed. But the digital rush (like gold and oil rushes before it) has brought calamity. A shaken American populace, subject to daily breaches, spying and exploitation, is left wondering who and what will ever keep them safe in cyberspace.

So it's honestly a mystery to me why every European privacy advance is met with such reflexive condemnation in America.

The OECD Privacy Principles safeguard individuals by controlling the flow of information about them. In the decades since the principles were framed, digital technologies and business models have radically expanded how information is created and how it moves. Personal information is now produced as if by magic (by wizards who make billions by their tricks). But the basic privacy principles are steadfastly the same, and are manifestly more important than ever. You know, that's what good laws are like.

A huge proportion of the American public would cheer for better data protection. We all know they deserve it. If American institutions had a better track record of respecting and protecting the data commons, then they'd be entitled to bluster about European privacy. But as things stand in Silicon Valley and Washington, moral outrage should be directed at the businesses and governments who sit on their hands over data breaches and surveillance, instead of those who do something about it.


Part 3: Things are changing, but what is making everything change? A summary of change factors in technology and business in 2015

Part 1 and Part 2 outlined which core innovative technologies are creating business disruption, and how, and provided some examples of new business models that illustrate these core principles. Part 3 endeavours to offer insights into common principles for the successful re-invention of business models in the emerging de-centralized Digital Economy, starting from the following closing statement of Part 2.

There are many revenue-rich products, and markets, being created currently. What they have in common is that they are real innovations with a comprehensive and cohesive approach to an emerging, recognisable issue; what is being offered, to whom, and how they operate are all innovatively addressed. None of them are merely offering an existing product from an existing marketplace with availability online – the old definition of multi-channel operations in a Digital Business environment!

So what, if anything, can be pinpointed as underpinning successful markets and competitive products?

There are many complex, multi-part answers to this question, but if you want the big picture as to what has enabled many of the new business capabilities, the answer lies in data. This is not the data types, uses, or tools familiar to current Enterprise IT systems and business reporting; rather, it's the ability to tap into, use and respond quicker than competitors to the flow of data across everything that could be relevant, contextual and contemporary. New markets such as AirBnB exist because data can be aggregated and stored on a previously undreamt-of scale at very low cost. But cheap storage and data aggregation is not the only possible answer to the competitive use of data.

A more complex example of how a successful business model uses data in a different manner is the rapid, disruptive growth of an alternative business model for second-hand car buying and selling. In the UK, WeBuyAnyCar.com has fundamentally altered the entire process of selling a used car in some two years.

The prospective seller enters the registration number of their car, which is instantly checked against the UK Government registration database to establish make, model, etc., to which the seller adds further details on mileage, condition and so on when prompted. Instantly a cash purchase price is offered, valid for one week, with the proviso that on delivery to a WeBuyAnyCar.com physical collection point the car is as declared. WeBuyAnyCar.com calculates the price to offer by real-time, continuous analysis of the auction prices realized by cars of the same make and model.

This example combines the government-required documentation to register a sale with both reference data to establish the details of the car for sale and real-time data to establish its current realisable value hour by hour in the marketplace. Unsurprisingly, even existing trade buyers now use the values produced by WeBuyAnyCar.com in place of subscribing to a monthly paper trade guide; as such, WeBuyAnyCar.com could be said to have created, and gained pricing control over, the new market for car trading.
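
As a purely illustrative sketch of that pricing mechanism, the snippet below derives a cash offer from recent auction results for the same make and model. The adjustment factors and margin are invented for illustration; they are not WeBuyAnyCar.com's actual algorithm.

```python
# Illustrative sketch of the pricing idea described above: derive a cash offer
# for a car from recent auction results for the same make and model.
# The adjustment factors and margin are invented, not the real algorithm.
from statistics import median

def offer_price(auction_prices, mileage, condition_factor, margin=0.10):
    """Return a cash offer based on recent auction prices for comparable cars.

    auction_prices   -- recent realized auction prices for the same make/model
    mileage          -- declared mileage, used for a simple linear adjustment
    condition_factor -- 1.0 for average condition, lower for poorer condition
    margin           -- the buyer's margin below the expected resale value
    """
    base = median(auction_prices)                            # robust market value
    mileage_adjustment = max(0.7, 1.0 - mileage / 500_000)   # crude mileage discount
    expected_resale = base * mileage_adjustment * condition_factor
    return round(expected_resale * (1.0 - margin), 2)

# Example: comparable cars recently fetched 6,800-7,400 at auction
print(offer_price([6800, 7100, 7250, 7400], mileage=62_000, condition_factor=0.95))
```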

Today the Web and apps have created increasing equality between the knowledge of the buyer and the seller, so for competitive advantage the Enterprise has to equip its Knowledge Workers with not just more data, but better, real-time analytics unavailable to the buyer or to competitors, to win the battle of Knowledge Buyer versus Knowledge Worker!

In summary: decentralization is both the creation of technology enablers and the manifestation of the desire of individuals to organize outcomes from smaller elements that suit their preferences. This directly attacks current market leaders' existing combinations of products and services, delivered from an enterprise constructed to optimize their delivery. The resulting business drivers from this disruption remove the inbuilt advantages of established market leaders with their resource ownership. However, the disruption is proving to offer increasingly large scope for new markets and products using innovative business models.

Research report now available: The Foundational Elements for the Internet of Things (IoT)

Footnotes

  1. The next blog in this series will examine the development within the Technology Industry of Fog, or Cloud Edge, Computing as the basic architectural model for decentralized technology devices and services as opposed to the role of Cloud Computing for centralized functions.
  2. In judging the awards for the 2015 Constellation Research Connected Enterprise annual Super Nova competition it was extremely noticeable that new hardly known startups dominated the entries submitted by their satisfied customers. In each case the change achieved could truly be described as both innovative and disruptive, whilst the outcomes for the enterprise and its business were remarkable and substantial. See Constellation Connected Enterprise for more details on awards.

IBM launches Industry's First Consulting Practice Dedicated to Cognitive Business

Earlier today IBM unveiled its plans to create a dedicated consulting practice, staffed by 2,000 consultants and focused on nothing but cognitive business.

The press release can be found here; I am going with the medium of video to comment:

If you don't have a chance to watch, here are the key takeaways:
  • It is early days for cognitive business. Good to see IBM adding dedicated services professionals alongside its products (Watson and Bluemix to build cognitive apps).
  • The concern is that very few enterprises have publicly stated that they are using IBM for this (none are mentioned in the press release). Granted, they may not want to go public with their showcases, as these may be a strategic advantage that lets them disrupt their industries.
  • The dedicated 'Outthink' website does a good job showing the breadth and depth of IBM's cognitive portfolio, but it also lacks specific examples.
 

MyPOV

A good move by IBM: after many years of investment in cognitive products (Watson), it is time to get dedicated professionals to leverage the opportunity. It can be taken as a sign that Watson is now ready for business, and for tying professional services careers to it.
 
On the concern side, we know Watson is still a complex system to operate and to get results from (e.g. corpus training), and IBM will have to showcase some specific 'lighthouse' customers.
 
But overall a good move by IBM; it is early days for cognitive computing as a next-gen application category. We will be watching.
 
 

What to make of the Court of Justice of the EU invalidating the Safe Harbor Agreement

Today the Court of Justice of the EU invalidated the Safe Harbor agreement between the EU and the US. Take a look at the press release of the court here.

So take a look at what it means for next-generation applications (and also for other applications that store European consumer / business data):

If you can't watch, here are my takeaways:

 
  • A win for data privacy for Europeans; the question will be at what cost. That cheap electronic gadget or sports apparel is likely no longer just a click away.
  • Not-so-good news for small and medium-size businesses (SMBs) that do business with European entities and consumers, as they now face additional challenges. The challenges will be huge if an SMB runs its own data center in North America, or works with a data center provider that does not have an EU presence.
  • It will largely benefit the larger vendors, particularly those who have an EU presence or so-called model agreements for the respective locations of their data centers.
 
What is your take? 

Couchbase Intros 4.0 Release, Highlights Enterprise Wins

Couchbase Live New York event puts new features, NoSQL adoption in the spotlight. Watch for basics and the best fit for NoSQL in the enterprise.

This week’s Couchbase Live New York marked the general availability of Couchbase Server 4.0 and a coming out party of sorts for enterprise customers including Marriott, GE, Cox Automotive and Gannett.

In beta since June, Couchbase Server 4.0 saw 40% higher download activity than the vendor's previous beta release, largely due to two important new features. Multidimensional Scaling lets operations provision infrastructure for data, query and indexing services independently, saving money where resources are sufficient while letting you deploy faster storage, more memory or more processing power where needed.

The SQL-like N1QL (pronounced "nickel") query language introduced in 4.0 will enable Couchbase developers to eliminate complicated query code and copies of data previously required within applications. Instead, N1QL will help to streamline app development and upkeep by supporting routine ad hoc querying at the data tier. It will also make it easier to use third-party SQL-based integration and BI tools.

Couchbase Server 4.0 introduces N1QL, a "SQL-like" query language added to simplify app development.
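
For illustration, here is a minimal sketch of an ad hoc N1QL query issued from application code, assuming the 2.x-era Couchbase Python SDK. The bucket name, fields and filter are invented for the example, and a suitable index is assumed to already exist on the bucket.

```python
# Minimal sketch: an ad hoc N1QL query issued from the application tier,
# assuming the 2.x-era Couchbase Python SDK. Bucket name and fields are
# illustrative only.
from couchbase.bucket import Bucket
from couchbase.n1ql import N1QLQuery

bucket = Bucket('couchbase://localhost/listings')  # illustrative bucket name

# Parameterized, SQL-like query instead of hand-written lookup code in the app
query = N1QLQuery(
    'SELECT make, model, price FROM `listings` WHERE status = $status LIMIT 10',
    status='active',
)

for row in bucket.n1ql_query(query):
    print(row)
```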

 
A highlight at Couchbase Live NY was a presentation by Marriott solutions architect Thomas Vidnovick, who explained how and why the hotel chain is retiring a mainframe-based reservations app and replacing it with a new app running on Couchbase on distributed commodity hardware. Vidnovick said Marriott went with open source for its low software and support cost, NoSQL for its JSON-based, object-oriented development, and Couchbase specifically for its high transaction throughput and ability to quickly add nodes to scale out.  

 

MyPOV On Couchbase

Couchbase impressed in New York with presentations from several large, household-name customers. But several presenters from these companies also reminded attendees that NoSQL databases are still playing catchup on some basics. For example, Vidnovick said he’s looking forward to using new Couchbase 4.0 features including LDAP support for ID management and auditing support – surprisingly basic capabilities to see just now being added to enterprise software. 

It’s also clear that NoSQL is complementing, rather than entirely replacing conventional relational databases. Cox Automotive executive Tony Selke, for example, said that his firm continues to use Microsoft SQL Server for its transactional applications. It uses Couchbase to handle the fast-changing, ephemeral data behind some 6 million automotive sales listings that are added, deleted or updated within its systems each day. 

Nearly all NoSQL database vendors are still working on basics including security and administrative features, but as Couchbase demonstrated in New York, large companies are increasingly embracing NoSQL for high-scale, next-generation applications.  


#SocBiz #FutureOfWork News Week Ending Oct 2, 2015

Here is a recap of some of the key news of the last week in the Social Business / Employee Collaboration / Future of Work world.

Did I miss something big? Please post a link in the comments.

 

Reference Links:

BoxWorks - watch the keynotes

My review: BoxWorks 2015 - Solid Platform Direction But Not Enough On Core Usage

Why Leading Apps Prioritize Interoperability (and Work with Box!)

Transforming work in the cloud: IBM at BoxWorks 2015

Today at BoxWorks: Unlocking Dynamic New Content Experiences

Introducing Box Capture: Connecting Your Phone's Camera to Business Processes

The browser experience in Microsoft OneDrive for Business gets a makeover

Make Your Meetings More Personal with Microsoft Cortana + LinkedIn

Microsoft Office Delve adds Praise, Favorites and enhances content creation

Meet the new Asana

Asana Picks Up Google Exec Chris Farinacci To Run Its Business Ops

Collaboration Without Complication: Jive Unveils A Simplified, Mobile, Interactive Intranet

IBM Notes and Domino V9.0.1 adds IBM Client Application Access and Macintosh 64-bit processor support for IBM Notes

IBM Launches New Cognitive Business Consulting Practice

IBM is taking a big step toward fleshing out the business potential of its Watson technology, announcing a new Cognitive Business Solutions unit on Tuesday. The practice will include more than 2,000 consultants skilled in machine learning, data science and analytics. 

It's a big bet for IBM but the company is convinced it's a safe one, judging from the announcement:

"Cognitive computing is the path to the next great set of possibilities for business," said Bridget van Kralingen, senior vice president, IBM Global Business Services. "Clients know they are collecting and analyzing more data than ever before, but 80 percent of all the available data -- images, voice, literature, chemical formulas, social expressions -- remains out of reach for traditional computing systems."

Going Vertical

To start, IBM is homing in on industries such as banking, retail and insurance. An upcoming IBM research survey of 5,000 C-suite executives found that nearly all respondents in those verticals plan to invest in cognitive capabilities. However, executives also widely cited the lack of available skills for cognitive technology as a hurdle.

"Their whole point is really taking the next generation of computing to all their different areas," such as talent management and big data, says Constellation Research founder Ray Wang. "They're trying to get these systems to find patterns of insights and make decisions. This is talking about going from transactions to systems that think."

The Bottom Line

 All business leaders should start thinking about what their company's cognitive strategy should look like, with a particular focus on how systems can learn from the business's own employees, Wang adds. 

IBM and Watson are not the only options in the market for cognitive capabilities, given the likes of Wipro's Holmes, and it remains early days overall for the market. "In all wars and battles for technology dominance it's not necessarily the best technology, it's the best ecosystem," he says. "We're just at the beginning of this."


Couchbase: Time to Take A Seat?

There are enough NoSQL databases out there to make quite the kettle of alphabet soup, but Couchbase is making some moves that may help it float to the top of the pot.

The company will announce the general availability of Couchbase 4.0 at an event this week in New York. Important new features include N1QL, a SQL-like query language for running analytics, says Constellation Research vice president and principal analyst Doug Henschen. While there are plenty of these types of languages available in the NoSQL domain, "they've got a good start here," he adds.

"They're also focusing on multi-dimensional scaling, which is about being able to independently scale the resources you need for data, query and indexing workloads," Henschen says. "You're hearing about that a lot on big data platforms. That's key because it's about balancing scale and speed requirements without breaking the bank. If everything has to go together, you might needlessly scale up network, storage or compute capacity when only one or two dimensions might do the trick."

Couchbase has also added JSON document handling in the last year. "They're trying to give you the best of both worlds: Document data handling and huge scalability," Henschen says.

No Magic Bullet

Couchbase has managed to snare some extremely high-profile, data-rich customers, including PayPal and LinkedIn. Still, it's a crowded market with many alternatives to Couchbase.

"There's a world of products out there," such as MongoDB, which is popular with developers but has been addressing scalability issues, Henschen says. "To appeal to developers is one thing," he adds. "IT has to worry about the long term."

Couchbase and MongoDB have been engaging in a tit-for-tat PR battle over the question of scalability, firing claims and counter-claims. That said, Couchbase's sweet spot seems to be in between NoSQL platforms such as Riak and Cassandra, which are geared toward extremely high-scale, global deployments, and MongoDB, which is often deployed with one to three nodes, Henschen says.

The Bottom Line

Ultimately, there's no single correct NoSQL platform for all use cases, he adds.

"There's a lot of nuance in that world of choices," Henschen says. "Reads versus writes, for example, are two different things. And there are application use cases where MongoDB is just fine at high scale."
