
News from one of the oldest software companies - and it's getting exciting


One of the oldest software companies in Europe - if not in the world - founded purely for the purpose of software as a business is Software AG in Germany. Unlike some older companies, Software AG never dabbled in hardware, focusing only on software for its 40+ years in existence.





Software AG was founded in Darmstadt, Germany by six members of a local consulting firm, not even 30 miles north of the location of the other, more prominent German software company, SAP (AG) in Walldorf. But it was too early for business software, so the Software AG founders focused on building the tools that were needed at the time, zeroing in on a database that was supposed to be adaptable. In 1971 the company launched Adabas (you guessed it: adaptable database system). It proved to be quite successful in banking and insurance, and along came the need to establish a programming language, which the founders wanted to make easier to use and learn than the other programming languages on the market. In 1979 the company delivered Natural as a 4GL application development environment - an easy-to-learn programming language that supported both procedural and event-driven programming.

In the '90s the company extended its product range and most notably had partnerships with SAP (yes - a cheaper database option was a topic of interest even then) and Microsoft (porting DCOM to platforms other than Windows).

Later the company developed a fondness for all things XML, which, despite very high marks for the usability and lowered cost of ownership of its products, never really took off and put the company in the doldrums to a certain extent.




Reshaping Strategy

In 2007 Software AG acquired webMethods - and with that shifted its focus more to the integration aspects of software than its creation, which in hindsight - and from the 2013 perspective - proved to be pretty pivotal. Equally, the acquisition of IDS Scheer AG with its ARIS modelling tool was a key addition to its products and services portfolio. When Software AG was looking for a way to accelerate slow-running business processes, it acquired Terracotta - a leader in in-memory caching technology. All that formed a very good base for growth at Software AG.




The latest 

While the company still provides Adabas and Natural, the main focus has been on integration and the creation of high-value, real-time and highly complex processes. Software AG made more acquisitions in 2013 to complement this strategy: acquiring the CEP platform Apama from Progress, acquiring JackBe after realizing the need for better visualization, and improving the agility of integration for webMethods with LongJump. And finally there is the ambition to play in the IT transformation market with the acquisition of alfabet AG. It will be interesting to see what the company will unveil at its upcoming user conference in early October in San Francisco.




MyPOV

It's remarkable to have a 40+ year run in the software industry. Inevitably there will be successes and failures, but with the right degree of innovation and reinvention, Software AG has become one of the key players in the upcoming cloud integration game.

But the game has not changed - and we do not see it changing with the cloud: integration vendors have to create value and not simply be a point-to-point connection protocol. Here the cloud is an opportunity for all of them - not just Software AG - to more easily create value-added services on top of the pure integration data streams.


Is it Personal Information or not ? Embrace the uncertainty.


The US General Services Administration defines Personally Identifiable Information as information that can be used to distinguish or trace an individual’s identity, either alone or when combined with other personal or identifying information that is linked or linkable to a specific individual (underline added). This means that items of data can constitute PII if other data can be combined to identify the person concerned. The fragments are regarded as PII even if it is the whole that does the identifying. And the definition means that a piece of data may be classed as PII before it is identified, rather than after.  This is only prudent. 

I am frequently asked - semi-rhetorically - by IT and security professionals if the definition means that even an IP or MAC address nowadays could count as PII? And I've said that in short, yes, it appears so!
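Spotting candidate fields is mechanical even if the classification itself is a judgment call. A minimal Python sketch (the record layout and field names are hypothetical) that flags values shaped like IPv4 or MAC addresses:

```python
import re

# Patterns for IPv4 and MAC addresses - values that may count as PII
# once they can be linked back to an individual.
IPV4 = re.compile(r"^(\d{1,3}\.){3}\d{1,3}$")
MAC = re.compile(r"^([0-9A-Fa-f]{2}[:-]){5}[0-9A-Fa-f]{2}$")

def flag_potential_pii(record: dict) -> set:
    """Return the keys of any fields holding an IP or MAC address."""
    flagged = set()
    for key, value in record.items():
        if isinstance(value, str) and (IPV4.match(value) or MAC.match(value)):
            flagged.add(key)
    return flagged
```

A scan like this does not decide whether a field is PII; it merely surfaces the fields whose identifiability needs to be assessed.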

Some security people are uncomfortable with this, but why? When it comes down to it, what is so worrying about having to take care of Personal Information? In Australia and in 100-odd other jurisdictions with OECD based data protection laws, it means that data custodians are required to handle their information assets in accordance with certain sets of Privacy Principles. This is not trivial but neither is it necessarily onerous. If the obligations in the Privacy Principles are examined in a timely manner, alongside compliance, security and information management, then they can be accommodated as just another facet of organisational hygiene.

So for instance, consider a large database of 'anonymous' records indexed by MAC address. This is just the sort of data that's being collected by retailers with in-store cell phone tracking systems, and used to study how customers move through the facility and interact with stores and merchandise. Strictly speaking, if the records are not identifiable then they are not PII and data protection laws do not apply. But the new definition of Personal Information in Australia means IT designers need to consider the prospect of the records becoming identifiable in the event that another data set comes into play. And why not? If anonymous data becomes identified then the data custodian will suddenly find themselves in scope for privacy laws, so it's prudent to plan for that scenario now. Depending on the custodian's risk appetite, any large potentially identifiable data set should be managed with regard to Privacy Principles. These would dictate that the collection of records should be limited to what's required for clear business purposes; that records collected for one purpose not be casually used for other unrelated purposes; and that the organisation be open about what data it collects and why. These sorts of measures are really pretty sensible.

Security practitioners I've spoken with about PII and identifiability are also upset about the ambiguity in the definition of Personally Identifiable Information. They complain that the identifiability of a piece of data is relative and fluid, and they don't want to have to interpret the legal definition. But I'm struck here by an inconsistency, because security management is all about uncertainty.

Yes, identifiability changes over time and in response to organisational developments. But security professionals and ICT architects should treat the future identification of a piece of unnamed data as just another sort of threat. The probability that data becomes identifiable depends on a range of variables that are a lot like other factors (like the emergence of other data, changes of circumstance, or developments in data analysis) that are routinely evaluated during risk assessment.

To deal with identifiability and the classification of data as PII or not, you should look at the following:

  • consider the broad context of your data assets, how they are used, and how they are linked to other data sets
  • think about how your data assets might grow and evolve over time
  • look at business pressures and plans to expand the detail and value of data, and the resulting potential for new linkages
  • make assumptions, and document them, as you do with any business analysis, and
  • plan to review periodically.
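These steps can be captured in an Information Assets Inventory entry. A sketch under stated assumptions - the field names are illustrative, not drawn from any standard:

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class DataAsset:
    """One inventory entry: the PII call, its rationale, and a review date."""
    name: str
    linkable_to_individual: bool      # could it be joined to identifying data?
    assumptions: list = field(default_factory=list)
    classified_on: date = field(default_factory=date.today)
    review_after_days: int = 365      # plan to revisit the classification

    @property
    def classification(self) -> str:
        # Treat data as PII if it is, or could become, linkable to a person.
        return "PII" if self.linkable_to_individual else "non-PII"

    def review_due(self, today: date) -> bool:
        """Periodic review: has the classification gone stale?"""
        return today >= self.classified_on + timedelta(days=self.review_after_days)
```

The documented assumptions and the review date are the point: when circumstances change, the entry tells you what reasoning to revisit.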

Many organisations maintain a formal Information Assets Inventory and/or an Information Classification regime, and these happen to be ideal management mechanisms in which to classify data as PII or not PII. That decision should be made against the backdrop of the organisation's risk appetite. How conservative or adventurous are you in respect of other risks? If you happen to mis-classify Personal Information, what could be the consequences, and how would the organisation respond? Do some scenario planning, and involve legal, risk and compliance. While you're at it, take the chance to raise awareness outside IT of how information is managed. Be prepared to review and change your classifications from non-PII to PII over time. Remember that security managers should always be prepared for change. Embrace the uncertainty in Personal Information!

Truly, privacy can be tackled by IT professionals in much the same way as security. There are no certainties in security and it's the same in privacy. We will never have perfect privacy; rather, privacy management is really about putting reasonable arrangements in place for controlling the flow of Personal Information.

So, if something that's anonymous today might be identified later, you're going to have to deal with that eventually. Why not start the planning now, treat identifiability as just another threat, and roll your privacy and security management together?

 

See also the excellent survey of identifiability by William B. Baker and Anthony Matyjaszewski The changing meaning of 'personal data' (2010). The essay looks specifically at the question of whether IP addresses can be PII, and highlights a trend in the US towards conceding that IP addresses combined with other data can identify, and may therefore count as PII:

Privacy regulators in the European Union regard dynamic IP addresses as personal information. Even though dynamic IP addresses change over time, and cannot be directly used to identify an individual, the EU Article 29 Working Party believes that a copyright holder using "reasonable means" can obtain a user's identity from an IP address when pursuing abusers of intellectual property rights. More recently, other European privacy regulators have voiced similar views regarding permanent IP addresses, noting that they can be used to track and, eventually, identify individuals.
This contrasts sharply to the approach taken in the United States under laws such as COPPA where, a decade ago, the FTC considered whether to classify even static IP addresses as personal information but ultimately rejected the idea out of concern that it would unnecessarily increase the scope of the law. In the past few years, however, the FTC has begun to suggest that IP addresses should be considered PII for much the same reasons as their European counterparts. Indeed, in a recent consent decree, the FTC included within the definition of "nonpublic, individually-identifiable information" an “IP address (or other "persistent identifier")." And the HIPAA Privacy Rule treats IP addresses as a form of "protected health information" by listing them as a type of data that must be removed from PHI for deidentification purposes.


Richard Napier Creates Siebel 20 Year Anniversary Quiz


Leading Siebel blogger Richard Napier has created a quiz to celebrate the 20th Anniversary of the product. His On Demand Education blog is one of the most popular dedicated to Oracle Siebel technology.

The quiz has some technical questions, some functional questions, and some Siebel trivia. It is free, and even has a leaderboard to capture and share results through Twitter, Facebook and other social networks.

Richard challenges his readers to "see who is the Siebel Quiz Master". The quiz can be found here at http://ondemand-education.com/corp/index.php/quizzes/siebel-20-year-anniversary-quiz/


The end of an era – Blackberry goes private


Okay okay, Blackberry will not disappear from the face of consumer electronics, but it is facing its Waterloo and being sent to Saint Helena (for you Napoleon history buffs). Blackberry has to hope that going private and being tucked away on a symbolic island away from prying eyes will allow the once mighty company to rediscover itself.

But can this move help the smartphone manufacturer? Blackberry's story is well documented. The pioneer of mobile email, it got caught flat-footed by the rise of the iPhone and then Android devices. As smartphones moved away from simple email into the world of apps, texting and web browsing, just having a secure email service was no longer the "killer app." The iPhone and, more importantly, the App Store opened up a world of mobile tools that was ignored by Blackberry. As a former Blackberry user I remember trying to download apps on my device and being frustrated by the lack of choice, the amount of memory they ate up on my phone and the overall lack of user-friendliness. Meanwhile some of my colleagues were zipping around on their user-friendly touch-screen iPhones.

So now Blackberry should have the ability to make some difficult strategic decisions away from the pressures of the public market. What should they do? I still think what I published a few posts ago is the direction they should head in; click here. Being private should make this process smoother.

It is always a little sad to see a once mighty company wither away, much like watching other tech giants like Digital, Wang and Compaq fall away. However, it is a good lesson for the likes of Google, Apple and Facebook: the top of the mountain is not a right but something that was earned and has to be constantly worked at to maintain (Apple probably knows this better than the others).

Good luck Blackberry…it was a great ride.


ADMA Young Marketer of the Year Finalists Announced



The finalists for the ADMA Young Marketer and Young Creative of the Year have been announced, celebrating and showcasing the work of Australians under the age of 30. It’s hotly contested, with winners flying to New York City in 2014 on an all-expenses-paid trip to meet with leading marketers, creatives and agencies including Google Creative Lab, OgilvyOne Worldwide and Anomaly.

In the running for Young Marketer of the Year are:

  • Leigh Allen, Marketing and International Marketing Solutions Manager, ESPN Australia/NZ
  • Anna Guerchenzon, Marketing Team Manager, Telstra
  • Jasmine Hildebrand, BTL Manager-Marketing, AAMI Insurance
  • Chris Howatson, Managing Director, CHE Proximity, Melbourne

Young Creatives have entered their work for judging – and you can check it out yourself at the ADMA site.

  • Jardin Anderson, art director, Rapp DBB, for Get ahead of yourself.com.au which calls on young creatives and marketers to ditch an old award in favour of one from ADMA.
  • Michael Gagliardi, creative/art director for KWP! Advertising, for #YoungPeopleGetIt – getting young marketers and creatives to enter YMYC by speaking to them in the language only they understand.
  • James Nguyen, art director at OBM, for Follow the Follower –  for a fresh twist in getting the leading lights in the marketing and advertising industries to follow the young person on Twitter.
  • Tony Simmons, art director at The Brand Agency –  for See Where It Can Take You – for showing junior marketers and creatives where their career can take them with YMYC using their own Facebook timeline as the narrative.

The overall winners will be announced on Friday 1 November at the ADMA Awards at The Star, Pyrmont.

Is there something you see that you love? Leave me a comment below.

Best Practices for SaaS Upgrades as Seen in Workday's Approach


If you're involved with enterprise software, you need to pay attention to what Workday is doing--even if you're not interested in HR or financial systems. Because Workday is one of the best examples of how enterprise applications can and should be delivered in the cloud.

This was one point I took away from Workday's annual user conference in San Francisco and from a day-long series of briefings for industry analysts earlier this month. 

The differences between Workday's practices and the approach of traditional enterprise software vendors are striking. There are several points of contrast, but in this post I'd like to focus on how Workday delivers software upgrades and some new twists in how it does this.

Traditional Approach to Software Upgrades

In the traditional enterprise software model, vendors develop new versions and provide them to their customers that are under maintenance agreements. The customer takes delivery of the new version, installs it on a test copy of the system, migrates data from the existing production version, retrofits any customizations or interfaces with other systems, revises its user procedures, performs system testing,  and migrates all of its users to the new version. In the process, if there is any time left in the schedule, the customer also may investigate how it would like to use any new functionality offered in the new version.

The bottom line is that in the traditional model, software upgrades are both a technical exercise as well as a business exercise. The technical challenges of data migration, retrofitting of customizations, and reworking system interfaces can be significant and can encourage customers to stay on older versions of a vendor's system for many years. When such a customer finally wants to get current on the latest version, the upgrade process can rival the time and expense of the original implementation. The technical aspect can be so much work that companies often retain outside service providers to manage or assist in the effort. The business aspects--accommodating changes to business processes or embracing new functionality--are often jettisoned for the sake of simply getting the new version installed from a technical perspective. As a result, customers often do not realize the benefits of the new functionality that the vendor offers.

The Workday Approach

Workday's approach to upgrades, from the beginning, is simple: it takes responsibility for all technical aspects of the software upgrade, allowing the customer to focus solely on the business aspects. There are at least three reasons that Workday can do this:

  • Workday's object model allows most customizations to be brought forward to new versions of the system with little or no retrofitting.
  • Likewise, Workday's Integration Cloud, based on technology it obtained through its acquisition of Cape Clear, allows most custom integrations to continue to work with new versions of its system.
  • Since Workday operates the system on behalf of the customer, Workday takes all responsibility for migrating the customer's data to the new version. 

The impact of this last point should not be underestimated. Last year, Workday's CTO, Stan Swete, wrote about how important it is for the SaaS provider to take full responsibility for migrating customer data to new versions: 

[The] Software-as-a-Service (SaaS) model improves service delivery quality by letting the provider own the end-to-end process of development, conversion, and deployment. In the on-premise software world the vendor controls development (and associated QA), but there is a hand off for conversion and deployment. At Workday, the update process is not done until every customer is on the new version. The same team that project manages our development also project manages conversion and deployment.

When it comes to version upgrades, not all SaaS providers are created equal. Some are little more than single-tenant hosting providers. Others are multi-tenant SaaS providers, but they deploy new versions as separate instances of the system and allow customers to stay on older versions for long periods of time. This makes version upgrades considerably more difficult if and when customers do decide to upgrade. Workday, as discussed, is at the other end of the spectrum, keeping all customers current on the latest version. Salesforce.com, NetSuite, and Plex are similar to Workday in this regard, though they may differ in the details of how they do it.

Further Improvements in Workday's Approach

This year, Workday has further refined its approach to version upgrades in four ways:

  1. Single production instance for all versions. Previously, Workday would deploy a new version of Workday as a system instance that was separate from the previous version, and Workday would migrate customers in waves from the old version to the new version over a three-week period. Workday's new approach is for the current version and the new version to exist simultaneously on the same system instance. Workday will now move customers to the new version by means of a set of "switches" that dictate which features of the system the customer will see. This new approach is possible because of Workday's object orientation discussed earlier.
  2. Continuous development and deployment of new functionality. Instead of holding all functionality enhancements for its periodic version upgrade, Workday is now introducing smaller changes on a weekly basis. This is especially important for small but high-priority changes or for tax and regulatory updates. Contrast this to the traditional vendors, who required many months or years between the time customers request changes and the time they actually see them in updated versions.
  3. Continuous conversion of customer data. As Workday develops new features that require changes to its data model, the single production instance now allows Workday to convert customer data in the background in advance of actually migrating customers to the new version. This reduces the amount of downtime required when the customer is moved to the new version.
  4. Preview instance. Now that there is a single production instance and continuous conversion of customer data, Workday is now able to offer customers a preview instance of the new version, giving customers a longer time-frame in which to evaluate and plan for the new version. Under the traditional model, customers only get a hands-on look at the new version when they take delivery of the upgrade, install it, and convert their data to it in a prototype environment. Workday's approach gives customers much more time and encourages them to make use of the new functionality.
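Workday has not published how its switches are implemented, but the general per-tenant feature-switch pattern can be sketched as follows (class and method names are hypothetical, not Workday's):

```python
class FeatureSwitches:
    """Both versions' code paths live in one production instance;
    per-tenant switches decide which features each customer sees."""

    def __init__(self):
        self._enabled = {}  # tenant id -> set of enabled feature names

    def enable(self, tenant: str, feature: str) -> None:
        self._enabled.setdefault(tenant, set()).add(feature)

    def is_on(self, tenant: str, feature: str) -> bool:
        return feature in self._enabled.get(tenant, set())

    def migrate(self, tenant: str, new_features: list) -> None:
        # "Moving the customer to the new version" is just flipping
        # this tenant's switches - no separate system instance needed.
        for f in new_features:
            self.enable(tenant, f)
```

The appeal of the pattern is that a migration becomes a metadata change rather than a redeployment, which is what makes background data conversion and preview access practical.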

Swete summarized these changes in a blog post during the user conference:

Probably the best example of embracing continuous change is happening on the service delivery side of our business. Workday has moved to continuous deployment of new features to a single code line. This move, along with the continuous background conversion of data for new features, enables us to complete updates for our production customers with less scheduled downtime. Application of changes to a single code line reduces the expense of maintaining multiple code lines around each update we do. Moving to continuous deployment also gives us the flexibility to continue to respond to our customers’ requirements when it comes to the number of updates we do each year.

As Swete indicates, the single production instance, continual development approach, and continuous conversion of customer data allow Workday to scale back from three new major versions a year to just two. The conference audience applauded when co-CEO Aneel Bhusri made this announcement, perhaps indicating that many companies have difficulty absorbing three major upgrades a year. At first blush, the reduction in the number of new versions a year would imply that Workday is slowing down the number of new features per year. But in sidebar conversations with Workday executives the next day, it became clear that these most recent improvements actually mean that Workday will be introducing more new features each year. The difference is that the smaller changes will be trickled in on a weekly basis, while major new features will be held for the twice-yearly updates. As indicated earlier, this approach also allows Workday to accommodate regulatory or tax-law changes on short notice, which have become more common in recent years.

Workday's core strategy of reducing or even eliminating the technical burden of version upgrades is a best practice for SaaS providers, allowing customers to focus exclusively on business improvement and maximizing the value of their system investment. More SaaS providers should follow this example.

Postscript: Over at Diginomica, Phil Wainwright has two good posts covering some of these same points.

Note: Workday covered my travel expenses for attending its user conference.


Related Posts

The Simplicity and Agility of Zero-Upgrades in Cloud ERP


1st take - Oracle in memory option for its database - very organic


Oracle's massive OpenWorld conference kicked off the other day - opening with the traditional Larry Ellison keynote - and a partner keynote - this time Fujitsu talking about the merits of its M10 system.




But the real news was Ellison presenting the new in-memory option for the Oracle database. Interestingly enough, the presentation was void of version numbers, and Oracle also did not post a press release about the new capabilities - which raises the question of why. More to come during the week - or a desire to keep things general for revenue recognition purposes - we will learn later.



A long time coming

In-memory technology has been important for Oracle for quite some time, starting with the TimesTen acquisition in 2005. But it was always an option to solve a limited set of performance problems - not running the overall database in memory. Credit for shipping a complete in-memory database and evangelizing the market goes to SAP with HANA - something Ellison doubted that SAP could deliver. Well, SAP did deliver, and did well, so for about six months we have heard Ellison hinting at the next version of the Oracle database beyond 12c (12c R1?) addressing in-memory technology, most recently on the Q1 earnings call the other week.



Interesting similarities

As an observer it's interesting to see how much both industry veterans - Ellison and Plattner - care about solving a performance problem that traditional databases could not address: the flexible crunching of large amounts of data, often referred to today under the buzzword of analytical applications. And both gentlemen get a tad professorial talking about it - Plattner with blackboard sessions, Ellison with talks on the fundamental challenges of database architecture. Both are passionate about the topic, and Ellison was evidently in best spirits - winning two races at the America's Cup certainly helped, too.



An organic approach

The path Oracle has chosen to address in-memory is more organic - allowing customers to turn the in-memory feature on or off with what Ellison referred to as just a switch. If you turn the switch on, a DBA just has to walk through three steps and the database will take advantage of the in-memory option... and what happens behind the scenes is that tables will be transported to memory, stored there in columnar format, and all future transactions of the application running on the database will be saved both to the new in-memory column store and the traditional row store, which most likely will reside on disk.





The key benefit for customers will be that they do not have to change a line of code to get the benefits of in-memory; the system will just get faster as more data gets moved to memory, and once it's there the system will be significantly faster, as a demo showed.
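As an illustration of the dual-format idea Ellison described - a toy sketch, not Oracle's implementation - a store that performs the dual write and serves analytic scans from the column side might look like this:

```python
class DualFormatStore:
    """Every write goes to a row store and an in-memory column store
    simultaneously; analytic scans read the contiguous column and
    need neither the full rows nor an index."""

    def __init__(self, columns):
        self.columns = columns
        self.rows = []                        # traditional row store
        self.cols = {c: [] for c in columns}  # in-memory columnar copy

    def insert(self, row: dict) -> None:
        # Dual write: the row format and the column format stay in sync.
        self.rows.append(row)
        for c in self.columns:
            self.cols[c].append(row[c])

    def column_sum(self, column: str) -> float:
        # Analytic query served entirely from the column store.
        return sum(self.cols[column])
```

The sketch shows why no application code needs to change: inserts and point lookups keep using the row side, while scans transparently benefit from the columnar copy.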



Key Benefits

Oracle is choosing a systemic approach to the in-memory problem, which is possible because Oracle owns the underlying infrastructure of the row database. Oracle knows which CRUD operations are being performed on its database and can route them to in-memory as parametrized.





Ironically, Ellison claimed that this will even accelerate the database - despite the dual writes. This is largely aided by being able to drop expensive index files that no longer need to be maintained, as the database will automatically direct the queries for these tables to in-memory, where thanks to RAM speed no index files are required.

So this will allow customers to play with in-memory - by upgrading the memory on their database servers and seeing what benefits they can achieve with a partial move of data to in-memory.



The integrated play

And it would not be the Oracle of 2013 if Oracle did not ship hardware that empowers the latest software move - and indeed Oracle has available a number of Exa-servers that are ideally tuned to operate large in-memory databases. Ironically, the hardware is available today - the software, no mention (yet).





Questions remain

A lot of details remain to be clarified - and my hope is that, at the latest, the database summit on Wednesday will address them. As with all powerful software, the question is going to be the price of the switch, and for sure Oracle will not make it cheap. But Oracle will price it right to make it easy for customers to stay on Oracle and not move to alternative products.



The ISV angle

As we all know, the largest SaaS vendor - Salesforce.com - struck a deal with Oracle, continuing to rely on Oracle 12c going forward. There was a lot of hoopla around this back in June - but Benioff supported the decision with tweets during the keynote, which almost had a feel of vindication. Now finally Salesforce.com could show why it decided to stay on the Oracle database.



     

And they will not be alone in that decision - it's the first time a Microsoft executive is presenting at Oracle OpenWorld ...



MyPOV

There is a parallel between the America's Cup and in-memory databases right now. Oracle is playing catch-up - and we all need to wonder why Team Oracle USA did not sail as fast from the start, and why Oracle let others (SAP) get a lead in the in-memory database game. But unlike sailing, where the puffs in the San Francisco Bay may decide the outcome, in the in-memory database game customer adoption makes the difference - and there Oracle has made it technically easy for customers to follow. Let's see how easy Oracle will make it commercially... if Oracle gets this right, there may soon be more SAP customers running the Oracle in-memory option database-wise than running HANA.

    You can also find the tweetstream of the keynote in this Storify here


    How Bart Simpson might defend TouchID


    A week and a bit after Apple released the iPhone 5S with its much vaunted "TouchID" biometric, the fingerprint detector has been subverted by the Chaos Computer Club (CCC). So what are we to make of this?

    Security is about economics. The CCC attack is not a trivial exercise. It entailed a high-resolution photograph, high-res printing, and a fair bit of faffing about with glue and plastics. Plus, of course, the attacker needs to have taken possession of the victim's phone, because one good thing about Apple's biometric implementation is that the match is done on the device. So one question is: does the effort required to beat the system outweigh the gains to be made by a successful attacker? For a smartphone with a smart user (who takes care not to load up their device with real valuables), the answer is probably no.

    But security is also about transparency and verification, and TouchID is the latest example of the biometrics industry falling short of security norms. Apple has released its new "security" feature with no security specs. No stated figures on False Accept Rate, False Reject Rate or Failure to Enroll Rate, and no independent test results. All we have is anecdotes that the False Reject Rate is very very low (in keeping with legendary Apple human factors engineering), and odd claims that a dead finger won't activate the Authentec technology. It's held out to be a security measure but the manufacturer feels no need to predict how well the device will withstand criminal attack.
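The error rates the article says Apple never published are simple ratios. As a minimal sketch (all counts below are purely illustrative, since no TouchID figures exist), the three metrics named above can be computed like this:

```python
# Standard biometric error metrics; every number here is hypothetical,
# because Apple published no test figures for TouchID.

def false_accept_rate(impostor_accepts, impostor_attempts):
    """FAR: fraction of impostor attempts wrongly accepted."""
    return impostor_accepts / impostor_attempts

def false_reject_rate(genuine_rejects, genuine_attempts):
    """FRR: fraction of genuine attempts wrongly rejected."""
    return genuine_rejects / genuine_attempts

def failure_to_enroll_rate(failed_enrollments, enrollment_attempts):
    """FTER: fraction of users who could not be enrolled at all."""
    return failed_enrollments / enrollment_attempts

# Illustrative trial: 10,000 impostor attempts with 5 accepted,
# 10,000 genuine attempts with 120 rejected, 3 of 500 users not enrolled.
print(f"FAR:  {false_accept_rate(5, 10_000):.2%}")    # -> FAR:  0.05%
print(f"FRR:  {false_reject_rate(120, 10_000):.2%}")  # -> FRR:  1.20%
print(f"FTER: {failure_to_enroll_rate(3, 500):.2%}")  # -> FTER: 0.60%
```

The point of the paragraph above is precisely that for TouchID nobody outside Apple can fill in these numerators and denominators from an agreed, independent test.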

    There is no shortage of people lining up to say the CCC attack is not a practical threat. Which only begs the question, ok, just how "secure" do we want biometrics to be? Crucially, that's actually impossible to answer, because there are still no agreed real life test protocols for any biometric, and no liveness detection standards. Vendors can make any marketing claim they like for a biometric solution without being held to account. Contrast this Wild West situation with the rigor applied to any other branch of security like cryptographic algorithms, key lengths, Trusted Platform Modules, smartcards and Secure Elements.

    You can imagine Bart Simpson defending the iPhone 5S fingerprint scanner:

    "It won't be spoofed!"
    "I never said it couldn't be spoofed!"
    "It doesn't really matter if it is spoofed!!!"

    Demonstrations of biometric failings need to be taken more seriously. They expose a systemic naivety in the industry and a willful disregard for security professionalism. The truth is that consumer biometrics are all about convenience, not security. And that would be ok, if only manufacturers were honest about it.


    Event Report: Day 1 At Oracle Open World 2013: The Quest For Innovation #oow13


    Past Oracle Open Worlds Have Disappointed Customers and Partners

    Let’s be frank. The past five years at Oracle Open World have disappointed even the faithful. The overemphasis on hardware marketing and revisionist history on cloud adoption bored audiences. The $1M paid advertorial keynotes had people walking out on the presenters 15 minutes into the speech. Larry Ellison’s insistence on re-educating the crowd on his points subsumed the announcements on Fusion apps. Even the cab drivers found the audience tired, the show even more tiring.

    Oracle went from hot, innovative, must-attend event to has-been, while most industry watchers, analysts, and media identified shows such as Box’s BoxWorks, Salesforce.com’s Dreamforce, and ExactTarget’s Connections as the innovation conferences in the enterprise. These events, like Constellation’s Connected Enterprise, capture not only the spirit of innovation but also provide customers a vision to work towards. Hence, most believe Open World could use much-needed rejuvenation and a shot of innovation juju (see Figure 1).

    Figure 1. Oracle Open World Lights Up San Francisco From September 22nd to September 27th

    “Next Slide Please”: Oracle Enters A Period Of Reinvention At #OOW13

    Walking through the event on Saturday (Day 0) and today (Day 1), one will notice a slight change in the spirit of the event. While half the base is die-hard Oracle Red Stack customers (i.e. those who grew up from database to middleware to apps), the good news is that the other half - Oracle customers who came in through acquisition (or, some say, by accident) - are present in larger numbers. These customers by acquisition sought best of breed, took more risks, and in some cases fought not to be on the Oracle Red Stack.

    For Oracle to win the innovation battle, the company must win over the mind share of these customers by acquisition. In fact, they are the early adopters - the market leaders and fast followers - while the core Oracle Red Stack base consists of more cautious adopters and laggards (see Figure 2). Market leaders and fast followers have the key building blocks required for successful corporate IT and often have line-of-business leaders who push the envelope. Oracle must tap into that spirit in order to move its base forward towards innovation.

    Figure 2. Organizational DNA Determines Pace And Appetite For Disruptive Tech Adoption

    Open World 2013 Attempts To Change The Tenor Of Oracle’s Outward Conversation

    In the spirit of innovation, attendees can expect six distinct mega themes to emerge from this uber event catering to 60,000 physical attendees and potentially 100,000 online.

    1. Customer experience. The term CRM is loosely used to define many things, but leaders realize that CRM is the technology, customer experience is the business process and journey maps, and customer centricity is a state of mind required of management and leaders. While customer experience is the term du jour, all three elements (i.e. technology, business process, and people leadership) are required for success. Front office is more descriptive than Tom Siebel’s legacy term of CRM. Expect Oracle to showcase its RightNow and Eloqua acquisitions and make the case for why existing Siebel users and new customers in the CMO office should consider Oracle. Those attempting to understand the State of Siebel (i.e. SOS) should read the latest from Constellation’s Bruce Daley. As customer experience moves to the cloud, Elizabeth Herrell’s recent research should provide a good primer on why cloud enables channels in customer service.
    2. Internet of things. Edward Screven, Oracle’s chief corporate architect, will deliver the inaugural push on how Oracle plays a growing role in machine-to-machine communications - what GE describes as the industrial internet and Cisco as the Internet of Things. Constellation’s Joseph A. di Paolantonio has a good primer on what IoT is and a lot more on sensor and analytic ecosystems.
    3. Big data. Big data is about making decisions for the future, not rehashing the past. Oracle’s co-president Mark Hurd will deliver the keynote with Thomson Reuters and NYSE Euronext. Given Oracle’s arsenal of solutions for analytics and real-time decisions, attendees may want to think about how to build big data business models. The path from data to decisions requires a good foundation of business and IT orchestration. Marketers looking to tell a data-driven story should explore the benefits of big data, as Constellation’s Gavin Heaton points out.
    4. Fusion Apps with an emphasis on HCM. Oracle has not fared well in upgrading PeopleSoft customers or convincing customers to adopt Fusion HCM. In about 50 end-to-end HCM deals in the past year, Constellation has seen Workday win about 40, SAP SuccessFactors about 7, and Oracle about 3. Oracle must get customers on board and provide prospects with an end-to-end customer reference in order to show traction. Expect Oracle to highlight those successes at the Sheraton Palace, which is HCM/HR central this year, and attempt to convince the industry experts that this shift is happening. Cloud buyers should also realize that there is more than software required, as Constellation’s Frank Scavo notes.
    5. Flagship Oracle Database 12c. Larry kicks off the event on Sunday highlighting Oracle Database 12c. Attendees can expect an attack on fake clouds and, really, confirmation of the end of multi-tenancy as we know it. Oracle’s innovations in pluggable databases have significant implications for the future of cloud delivery and the ability to address many requirements of highly regulated industries that have been hesitant to move to the cloud. Expect Constellation’s Holger Mueller, who covers IaaS and PaaS, to separate the hype from the facts in the messaging.
    6. Oracle Social. The big tents in Union Square celebrating Oracle Social return. Expect Oracle’s Group VP of Cloud Social, Meg Bear, to highlight how one can convert conversations to currency. The pavilion is the edgiest of the main tents at Oracle Open World, and expect Oracle to highlight where Vitrue and other Oracle CX products tie back to enabling social for humans and even in the M2M world. This shift to purposeful collaboration, as Alan Lepofsky talks about, touches not only on Future of Work but also on Customer Experience.

    While at the event, attendees should also test-drive the new Oracle user experience. The new UI/UX is a significant refresh of the legacy Oracle Swan UI/UX. Expect a mobile-first orientation and a platform for developers to take advantage of. Oracle’s new investments in mobile and on the platform will soon pay off for customers and developers.

    The Bottom Line: Oracle’s Attempting To Amp Up Its Mindshare And Time Will Tell

    Attendees seek a vision from one of the great masters of Sun Tzu’s Art of War. Larry Ellison has effectively maneuvered Oracle through 30+ years of technology changes and mastered the mergers and acquisitions game. Oracle has accomplished much by bringing down the cost of ownership for a stack of technologies for customers and serving as a stable and competent technology partner. However, the market has changed: the business models have evolved, the buying power has shifted, and the innovation leaders now come from the startups. Oracle can no longer rely on maintenance revenues and acquisitions alone for growth. For Oracle to remain relevant as an innovation thought leader, customers and prospects need to know what Oracle’s vision is for the future and how their businesses can benefit. If Oracle can successfully tell this story over the next 12 to 18 months and execute over the next 3 to 5 years, the company has a shot at remaining relevant in this emerging convergence of enterprise and consumer technology. Should Oracle fail, it will go the way of Computer Associates - relevant among cautious adopters and laggards, but without the spark and innovation Oracle was once known for. Look forward to seeing you at the event.

    Your POV.

    What’s your plan to invest with Oracle? Do you see Oracle being innovative or more of a laggard? What’s the future of startups and cloud in your overall technology strategy? Is Oracle still relevant? Add your comments to the blog or reach me via email: R (at) ConstellationR (dot) com or R (at) SoftwareInsider (dot) com.

    Join my colleagues Bruce Daley, Holger Mueller, and Frank Scavo at this year’s Oracle Open World. Just ping us and let’s catch up!

    Related Research:

    Reprints

    Reprints can be purchased through Constellation Research, Inc. To request official reprints in PDF format, please contact Sales.

    Disclosure

    Although we work closely with many mega software vendors, we want you to trust us. For the full disclosure policy, stay tuned for the full client list on the Constellation Research website.

    * Not responsible for any factual errors or omissions.  However, happy to correct any errors upon email receipt.

    Copyright © 2001 – 2013 R Wang and Insider Associates, LLC All rights reserved.
    Contact the Sales team to purchase this report on an à la carte basis or join the Constellation Customer Experience!

     

     

     


    The Day Customer Engagement Changed Forever


    I had the opportunity to attend Twiliocon this week, and what I saw there convinced me that the way organizations reach out and communicate with their customers or constituents has changed forever. Twilio CEO, Jeff Lawson, was nearing the end of his keynote address, which focused on why software-based outreach and contact solutions were so much better than contact center systems offered by the traditional communications vendors.

    The convincing event occurred when Lawson stopped talking and began doing. He went over to his laptop and typed in about 10 lines of PHP code. (PHP is a common web scripting language.) The code did something extremely simple, yet powerful. It first opened an Excel file and extracted a name, mobile phone number, and shirt size. It then called a QR code generator and encoded the name, phone number, and shirt size in a QR code. Finally, it sent a text message to the mobile phone number with the QR code image and some text indicating that the person who owned the mobile device had won a t-shirt. The code looped through each row of the spreadsheet, sending out approximately 1,800 real text messages to those in the Twiliocon audience in a matter of seconds.
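A rough sketch of what Lawson's loop did, in Python rather than PHP (the CSV layout, the QR endpoint, and the function names are all hypothetical; the real demo called Twilio's API to actually send the messages):

```python
# Sketch of the keynote demo: read rows, encode each winner's details
# in a QR image URL, and build one outbound message payload per row.
import csv
import io
import urllib.parse

def qr_url(name, phone, size):
    """Encode the winner's details for a QR generator (hypothetical endpoint)."""
    data = urllib.parse.quote(f"{name}|{phone}|{size}")
    return f"https://qr.example.com/generate?data={data}"

def build_messages(csv_text):
    """Turn each spreadsheet row into an MMS-style payload dict."""
    messages = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        messages.append({
            "to": row["phone"],
            "body": f"Congrats {row['name']}, you won a size {row['size']} t-shirt!",
            "media_url": qr_url(row["name"], row["phone"], row["size"]),
        })
    return messages

# Two illustrative attendees standing in for the ~1,800 real rows:
attendees = "name,phone,size\nAda,+15550101,M\nGrace,+15550102,S\n"
for msg in build_messages(attendees):
    # In the real demo, each payload would be handed to Twilio's
    # messaging API, which handles the carrier plumbing.
    print(msg["to"], msg["body"])
```

The interesting part is everything this sketch does *not* contain: there is no carrier negotiation, no SMSC handling, no tariff logic - which is exactly the messiness Twilio's API hides.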

    Why is this remarkable? Because Lawson did not have to think about how his code would actually send the messages, how to contract with carriers to deliver them, or what the tariffs would be. His company, Twilio, has put a very simple API (application programming interface) between web developers and all of the telecommunications messiness that occurs behind the scenes, making it extremely simple for any web developer to create a real-time or near real-time communications solution that can interface with the public telephone system and with mobile carriers.

    Twilio has APIs that enable voice, text messaging, and picture messaging applications to be created just as simply as Lawson created his small app. This allows organizations to build their own customer engagement solutions rather than buying a contact center solution, with its accompanying price tag for the hardware, software, and professional services required to customize it to the organization’s needs. Twilio also has relationships with carriers across the world, so that calls or messages can be routed to people in nearly any geography and local inbound dialing numbers can be obtained in these geographies. The pricing in the U.S. is simple: 1¢ per minute for voice, 0.75¢ per text message, and 2¢ to send and 1¢ to receive a picture message.
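At those per-unit rates, the economics of the keynote stunt are easy to work out. A quick back-of-the-envelope check (the rate constants mirror the published U.S. prices quoted above; the message counts are just the demo figures):

```python
# Twilio's stated U.S. rates, in cents per unit.
SMS_SEND_CENTS = 0.75    # per outbound text message
MMS_SEND_CENTS = 2.0     # per outbound picture message
VOICE_MINUTE_CENTS = 1.0 # per voice minute

def cost_dollars(units, cents_per_unit):
    """Total cost in dollars for a number of messages or minutes."""
    return units * cents_per_unit / 100

# The ~1,800 picture messages Lawson blasted out from the keynote:
print(cost_dollars(1800, MMS_SEND_CENTS))  # -> 36.0
# The same blast as plain text messages:
print(cost_dollars(1800, SMS_SEND_CENTS))  # -> 13.5
```

Roughly $36 to text-message an entire conference audience is the kind of number that makes the contrast with a traditional contact center price tag vivid.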

    Clearly there are a number of capabilities a full-fledged call center provides that Twilio does not, such as an IVR system, an auto dialer, dashboards, etc. But the ability for an organization to so easily create customer engagement applications is going to change how companies, governments, and non-profit organizations reach out to their respective customers or constituents. For organizations with existing PBX or call center infrastructure, Twilio now supports SIP, which will enable these organizations to continue using their existing equipment while beginning to leverage the capabilities Twilio has to offer. It also supports WebRTC, which on its own has the potential to completely disrupt the communications market.
    Twilio is on a roll. The company garnered $70 million in new funding in recent weeks, and Twilio users are generating over four million voice calls per day, leading the company to project 100% revenue growth for 2013. Twilio’s executives believe the company has tremendous growth potential, pointing out that many customers have used Twilio in limited scenarios to verify that it works and are now beginning to build out very significant apps. In fact, Lawson claimed that 96% of Americans have already interfaced with an application based on Twilio. Twilio, and a few others like it, are completely changing how organizations reach out to and engage with their customers.

     

     