Results

Digital Transformation Digest: MongoDB Files to Go Public, Microsoft and Facebook's Massive Undersea Fiber Line Completed, and More


Constellation Insights

MongoDB goes public: NoSQL database vendor MongoDB has submitted its long-anticipated filing to go public with the U.S. Securities and Exchange Commission. The S-1 form includes a wealth of information about the company's current business standing and future plans. Here's a look at the highlights.

  • Long popular with developers, MongoDB's filing supplies a figure that bears it out: there have been more than 30 million downloads of its Community Server freemium edition. MongoDB is also popular with corporations, with more than 4,300 customers overall, including more than half of the Fortune 500.
  • MongoDB says its architecture "combines the best of both relational and non-relational databases," and this is helping it poach workloads. During its fiscal 2017, about 30 percent of new business "resulted from the migration of applications from relational databases."
  • The company was formed in 2007 as 10gen but changed its name to MongoDB in 2013. While based in the U.S., 35 percent of its revenue now comes from outside the country, according to the filing.
  • MongoDB rolled out Atlas, its DBaaS (database as a service) cloud offering, in June 2016. As of July 31, it accounted for 5 percent of overall revenue. While describing MongoDB Atlas as a key component of its growth strategy, the company's filing also acknowledges it is still figuring out how best to market the service.
  • The company now has about 820 employees, with 211 in research and development compared to 324 in sales and marketing, as of July 31.
  • MongoDB has been tabbed a tech "unicorn" with respect to its high valuation, which is around $1.6 billion. But it has incurred net losses in every quarter since its inception, and was in the red $45.8 million for the six months ended July 31. The company expects its operating expenses to "increase significantly" after it goes public.

Microsoft, Facebook complete new undersea US-to-EU cable: A milestone for next-generation networking was reached this week with the completion of Marea, an undersea telecommunications cable that runs between Virginia and Spain. Marea's capacity is enormous, as described by Microsoft, which partnered with Facebook and Telefonica on its construction:

At more than 4,000 miles (6,600 kilometers) long and almost 10.25 million pounds (4.65 million kilograms) — or about the weight of 34 blue whales — the Marea cable is a feat of engineering, collaboration and innovation. The cable can transmit up to 160 terabits of data per second. That’s more than 16 million times faster than the average home internet connection, making it capable of streaming 71 million high-definition videos simultaneously.
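Those figures are easy to sanity-check; here is a quick sketch in Python (the constants come from the quote above, the derived rates are simple arithmetic):

```python
# Back-of-the-envelope check of Microsoft's Marea figures.
# Constants come from the quote above; the derived rates are simple arithmetic.
CABLE_TBPS = 160                 # stated capacity: 160 terabits per second
SPEEDUP_VS_HOME = 16_000_000     # "16 million times faster" than an average home line
HD_STREAMS = 71_000_000          # "71 million high-definition videos simultaneously"

capacity_bps = CABLE_TBPS * 1e12
implied_home_mbps = capacity_bps / SPEEDUP_VS_HOME / 1e6   # implied average home line
implied_stream_mbps = capacity_bps / HD_STREAMS / 1e6      # implied per-stream bitrate

print(f"implied average home connection: {implied_home_mbps:.0f} Mbps")
print(f"implied per-HD-stream bitrate: {implied_stream_mbps:.2f} Mbps")
```

The numbers are internally consistent: they imply a roughly 10 Mbps average home connection and about 2.25 Mbps per HD stream, both plausible figures for 2017.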

POV: Google has set the pace for this type of investment among major Internet companies, building out an extensive undersea cable network over the past few years. But Marea is said to be the largest-capacity undersea cable built to date.

Moreover, most existing transatlantic cables emanate from the New York area; by landing Marea further south in Virginia, Facebook and Microsoft gain latency benefits, since both companies have extensive data center operations in the region.

Finally Marea was built with an "open" design that will allow it to more easily take advantage of future innovations in networking, Microsoft says.

Legacy watch: How not to do procurement: In 2011, the Canadian government hired IBM to install PeopleSoft at a number of departments and agencies. The first stage of the project was contracted for $5.7 million, but through a series of contract amendments the total has ballooned to $185 million, the Canadian Broadcasting Corporation (CBC) reports.

As it stands today, the project, dubbed Phoenix, will cover more than 100 departments. Since going live in 2016, it has been plagued by performance issues, and some 1,000 identified bugs remain to be fixed, the CBC report says.

Sources quoted in the story suggest that scope creep, that familiar IT project management bugbear, was an issue. But it appears the Canadian government made a serious misstep when it agreed to one particular contract term, as former Treasury Board analyst Roman Klimowicz told the CBC:

The request for proposal details the government's right to extend the terms of the Phoenix maintenance and support contract "for a period of up to approximately 20 years."

Klimowicz wonders if it was a good idea to give IBM so much control over defining the project, implementing and operating it – and now attempting to fix it. "There appears to be a conflict, potentially," said Klimowicz, who was never involved in the Phoenix contract. "The statement of requirement could leave loopholes, could leave escape avenues in it … then IBM basically has an open bag of money to help themselves to."

Zenefits Shift 2017 - Another pivot and even more software


We had the opportunity to attend Zenefits’ Shift 2017 event in San Francisco, held at the Metreon on September 21st, 2017. The conference was well attended, though with a smaller in-person audience than the Z2 launch 12 months ago. But the main objective was online viewership, and Arianna Huffington delivered: during her keynote / interview, online viewership spiked past 60,000.
 
A video says more than a thousand words, so if you prefer to watch: (if the video doesn’t show up, check here)
 
 
No time to watch – here is the 1 slide condensation (if the slide doesn’t show up, check here):
 

Always tough to pick the takeaways – but here are my Top 3:

Zenefits does another pivot: from disrupting brokers to powering them. The original idea of Zenefits was to disrupt both software and brokerage: use the brokerage commissions and make the software free. Last year, at Z2, Zenefits pivoted to more software, starting to charge its SMB clientele for software and services. At Shift 2017 the vendor communicated an even stronger move to software, reducing its own brokerage function and adding capabilities that help brokers run more efficiently. Likely Zenefits is productizing internal assets (my speculation), a strategy that would make sense. Oh, and Zenefits also unveiled a new logo – but never talked about it… what a difference to Z2.
 
Fulcher welcomes attendees to Shift17
Meet employees where they are: new mobile and voice (Alexa) demos. Like many other vendors, Zenefits follows the credo of people centricity, meaning that consumption of HR software needs to be made as easy and frictionless as possible… HCM software needs to be consumed where users do their work: most prominently in chat software, and ideally voice accessible. Zenefits showed a demo with Alexa, walking through a time-off request.
 
Carr talks Services
New Connect offerings – Ben Connect and Pay Connect. Zenefits has a strong platform and partner focus, but needs to make the consumption of partner services easy… meaning that it needs to do the integration work for these partners to allow easy uptake by its SMB customer base. Zenefits does this through the Connect product family, and at Shift 2017 it unveiled the Ben Connect and Pay Connect products. Ben Connect provides integration to benefits providers, Pay Connect to 3rd-party payroll providers (ADP, Gusto, Paychex and, I believe, one more provider).
 
Reeves talks Products
Native Zenefits Payroll now at 30 states. A year ago Zenefits unveiled its payroll offering, supporting a handful of states; the vendor has now arrived at support for 30 states… and remains committed to supporting all 50. I missed the roadmap presentation on this, but it supposedly exists, and it is key for SMBs to plan their rollouts and/or make software selection decisions.
 

MyPOV

A good user conference for Zenefits, which is moving further in the direction of becoming an enterprise software vendor. New CEO Fulcher has brought in a new management team, with a new COO, CMO and head of product – all industry veterans, so we can expect good things to come for Zenefits customers. The new mobile product looks good, though Zenefits makes the common (SFO / Valley) mistake of favoring iOS over Android. With 50% of US smartphones being Android phones, that is not really people centric – but, as mentioned, a common and repeated flaw among HCM vendors.

On the concern side, Zenefits showed very little software. And while it is great to listen to Arianna Huffington, Patty McCord, Ben Horowitz and Shawn Achor, showing one live mobile demo and an Alexa demo does not strike the balance between fuzzy feel-good content and tangible product. We saw the same recently at SuccessFactors' conference (Oprah, the Cake Master etc.). Maybe both vendors have new management teams and need to cover a pause in product development. But vendors should not forget that the main reason to attend their user conference is their product and its roadmap. All the appreciation, vision and thought leadership on general practices comes later. Showing core software capabilities as a slide show is not what users (and influencers) expect.

But for now, good progress by Zenefits, with more software DNA in its executive team and likely in its product future. Stay tuned.
 
 
Want to learn more? Check out the Storify collection below (if it doesn’t show up – check here).

Find more coverage on the Constellation Research website here, check out my magazine on Flipboard and my YouTube channel here.
 

Digital Transformation Digest: Google Adds 'Zero-Touch' Enterprise Deployments for Android Devices, Red Hat's New Patent Promise, and IBM's Latest Open Source Moves


Constellation Insights

Google introduces 'zero-touch' enterprise deployment for Android devices: It's going to be easier and more secure to roll out enterprise Android devices, with Google's introduction of "zero-touch enrollment" capabilities.

Under the program, companies that purchase Android devices can use EMM (enterprise mobility management) software to automatically apply configurations and policies the first time a user turns the device on. Supported EMMs include VMware AirWatch, BlackBerry, MobileIron, IBM and G Suite.
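Conceptually, the enrollment flow works something like the sketch below; the class, lookup table and return values are my own modeling assumptions, not Google's actual zero-touch or EMM APIs:

```python
# Illustrative sketch of the zero-touch enrollment flow described above.
# The class, lookup table and return values are modeling assumptions,
# not Google's actual zero-touch or EMM APIs.
from dataclasses import dataclass

@dataclass
class DevicePolicy:
    emm_server: str
    wifi_ssid: str
    disallow_factory_reset: bool = True

# A reseller pre-registers corporate devices (e.g. by IMEI) against the
# buyer's EMM configuration before the hardware ever ships.
ENROLLMENT_DB = {
    "356938035643809": DevicePolicy(emm_server="https://emm.example.com", wifi_ssid="corp"),
}

def first_boot(imei: str) -> str:
    """On first boot the device looks up its enrollment record; if one exists,
    it provisions itself as a managed device before the user can skip setup."""
    policy = ENROLLMENT_DB.get(imei)
    if policy is None:
        return "consumer setup"  # unmanaged path for retail devices
    # Managed path: the device is under EMM control from the very first boot,
    # which is why it never runs in an unmanaged state.
    return f"managed by {policy.emm_server}"

print(first_boot("356938035643809"))
print(first_boot("000000000000000"))
```

The key design point is that the enrollment mapping lives server-side with the reseller, so a factory reset or out-of-box setup always lands the device back under management.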

Initially, the feature is available only on Google's Pixel phone when purchased through Verizon. Google is working with Samsung, Huawei, Sony, LG and other device makers to add zero-touch; Sony's Xperia XZ1 and XZ1 Compact will be among the first additional devices to get it. Google is also working with a variety of other carriers besides Verizon.

POV: This is certainly a desirable feature given how much it can cut down on device management and end-user support tasks. It also has security benefits, since the devices won't ever be used in an un-managed state.

However, this is also an instance where Google is playing catch-up for a change, as Apple has offered similar capabilities through its Device Enrollment Program, as has Samsung with its Knox Mobile Enrollment service. It's nonetheless a welcome addition to Google's mobile enterprise capabilities with real benefits for customers.

Red Hat updates, expands its 'patent promise': In 2002, Red Hat issued a decree saying it would not enforce its patents against free and open-source software. Fifteen years later, the company has released a new version of the Patent Promise, one it says substantially extends the original's scope. Here's how Red Hat explains the decision, from an FAQ:

We issued the first Patent Promise 15 years ago. Since then, both Red Hat and open source have changed considerably, and some aspects of the Promise became outdated. Open source is what Red Hat does, and open innovation plays an increasingly important role in technology and beyond. Our expanded Patent Promise recognizes and is designed to protect open innovation.

The new Promise is substantially clearer and broader than its predecessor. While the old Promise covered approximately 35 percent of open source software, the new version will cover more than 99 percent. It applies to all software meeting the free software or open source definitions of the Free Software Foundation or the Open Source Initiative and listed by the FSF or OSI.
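In practice, checking whether a given package falls under the expanded Promise amounts to a license test. Here is a minimal sketch, assuming a tiny subset of FSF/OSI-approved license identifiers; the real lists, and Red Hat's actual legal criteria, are far longer:

```python
# Minimal sketch: the expanded Promise applies to software under licenses that
# meet the FSF or OSI definitions. This set is a tiny illustrative subset of
# identifiers, not Red Hat's legal criteria or the full FSF/OSI lists.
FSF_OR_OSI_APPROVED = {
    "GPL-2.0", "GPL-3.0", "LGPL-2.1", "Apache-2.0", "MIT", "BSD-3-Clause", "EPL-1.0",
}

def covered_by_promise(license_id: str) -> bool:
    """Return True if a package under this license would plausibly fall
    under the expanded Patent Promise."""
    return license_id in FSF_OR_OSI_APPROVED

print(covered_by_promise("Apache-2.0"))    # open source: covered
print(covered_by_promise("Proprietary"))   # not covered
```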

Both the original and new promise covered the entirety of Red Hat's patents. But the company today has more than 2,000 patents, compared to just a handful at the time of the first promise.


POV: As a company based on open source software, Red Hat's pledge seems like a natural step. It doesn't appear that it provides any protection to companies from patent lawsuits brought by non-practicing entities—otherwise known as patent trolls—but clearly puts a flag in the ground stating that Red Hat is a trusted partner to companies looking to innovate with open-source software.

IBM open-sources WebSphere Liberty for agile app development: Big Blue has open-sourced the code for WebSphere Liberty, the lighter-weight version of its flagship Java application server. IBM's Ian Robinson explains why in a blog post:

We created Liberty five years ago to enable developers to easily and quickly create applications using agile and dev/ops principles. It has been an incredibly successful and popular transformation for WebSphere and now is the time to take it to the next level by moving the essential Liberty code base into the open.

This week IBM launched the Open Liberty project and moved our Liberty development effort to it. The code is available in GitHub under the Eclipse Public License V1, and our ongoing development for WebSphere Liberty will be based on this project. Open Liberty is focused on creating a runtime to support Java microservices that can be frequently updated and easily moved between different cloud environments.

At any time, developers can move up to the commercial versions of WebSphere Liberty, adding dedicated technical support and more advanced capabilities. Because Open Liberty and WebSphere Liberty are built on the same codebase this transition is seamless, so there’s no need to modify your applications.

POV: IBM is also open-sourcing its IBM J9 virtual machine implementation, which along with Open Liberty provides a full, IBM-approved Java stack.

While Open Liberty is easier to set up and manage, gets more frequent updates, and offers more deployment options than full-blown WebSphere, it is not as feature-rich. There are also many existing applications that would be difficult or impossible to move to Open Liberty due to feature gaps.

IBM is betting that moving Open Liberty to an open-source model will attract community support and subsequently more development resources and market traction around the code base. It's far from an unprecedented move, but still stands as another example of where open source wins, notes Constellation VP and principal analyst Holger Mueller.

"While IBM knows how to partner and work with open source, customers have to keep a watchful eye on vendors not just punting the code over," Mueller says. "I'm not saying that is the case here, but with all the struggles at IBM it's a potential risk to consider."
 


Progress Report - Kronos grows and grows


We had the opportunity to attend Kronos’ very first analyst day, held at the vendor’s new headquarters outside of Boston. It was the first day at the office for many of the executives and employees; the venue is a very nicely remodeled older office building (the former Wang HQ) with many interesting and useful features. Interestingly, there are no rooms, corner offices etc. facing the windows; the idea is to let as much daylight as possible flood the building. Not even CEO Ain has an office with a window.
 

So, look at my musings on the event here: (if the video doesn’t show up, check here)
 
 
No time to watch – here is the 1-2 slide condensation (if the slide doesn’t show up, check here):
 

Want to read on? 

Here you go: Always tough to pick the takeaways – but here are my Top 3:

 
Ain opens 1st Kronos Analyst Day
Kronos keeps growing in all dimensions. Kronos keeps doing well, having passed the US$1 billion milestone a few years ago, delivering US$1.3 billion in the last financial year and now aiming at US$1.4 billion. The growth shows in headcount, too – Kronos hired 1,000+ people in the last 12 months, one more reason to open the new headquarters. And Kronos has left itself room to grow – there is still space in the new office building.

 
Workforce Ready Key Directions
Workforce Ready. Kronos’ SMB product is doing well. Adding and cross-selling HCM capabilities is something Kronos has mastered. The vendor claims it is the fastest-growing HCM product, having already passed US$100 million. The functional highlights were improved reporting with new dashboards, an updated user interface and new mobile capabilities.

 
Workforce Central Directions

Workforce Central. Kronos’ larger enterprise product is doing well, too. Often used in conjunction with other HCM products, it has undergone, and will see more, API work; RESTful integration is the writing on the wall. And it would not be Kronos if the vendor did not spend R&D on more advanced scheduling capabilities as well as new schedulers. Improved integration into 3rd-party HCM products is another key investment theme.
 

MyPOV

Kronos is investing in product and growing on all fronts. It has carved out a working niche in SMB and has become the ‘Switzerland’ that can partner with the likes of Oracle, SAP and Workday for workforce management capabilities. It faces competition in the field and is building out its capabilities accordingly. Good news for customers: Kronos has overcome technical challenges from the past and now offers the stable, reliable workforce management system that clients need and deserve.
 
On the concern side, Kronos remains a conservatively run company. As it competes with vendors with more enterprise software DNA, it can find itself on the back foot when it comes to visionary and sometimes flashy announcements. Customers should ask and dig deeper on Kronos’ innovation pipeline and plans, and likely will not be disappointed. Kronos has a large development team and is working on many new workforce management capabilities.

Stay tuned for KronosWorks later this year in Las Vegas; I am certain we will learn a few more things about what’s next for Kronos customers.
 
    Want to learn more? Check out the Storify collection below (if it doesn’t show up – check here).


    When AI and Personal Information Collide: The Privacy Implications


    Constellation Insights

    Stanford University professor Michal Kosinski made waves recently for research he conducted that suggested AI can determine a person's sexual orientation based on pictures of their face. Now Kosinski is going further, saying that AI could also pinpoint someone's political leanings, level of intelligence and other personal data points based on photographs, as the Guardian reports:

    Faces contain a significant amount of information, and using large datasets of photos, sophisticated computer programs can uncover trends and learn how to distinguish key traits with a high rate of accuracy. With Kosinski’s “gaydar” AI, an algorithm used online dating photos to create a program that could correctly identify sexual orientation 91% of the time with men and 83% with women, just by reviewing a handful of photos.

    Kosinski’s research is highly controversial, and faced a huge backlash from LGBT rights groups, which argued that the AI was flawed and that anti-LGBT governments could use this type of software to out gay people and persecute them. Kosinski and other researchers, however, have argued that powerful governments and corporations already possess these technological capabilities and that it is vital to expose possible dangers in an effort to push for privacy protections and regulatory safeguards, which have not kept pace with AI.

    Kosinski is also known for his controversial work on psychometric profiling, including using Facebook data to draw inferences about personality. The data firm Cambridge Analytica has used similar tools to target voters in support of Donald Trump’s campaign, sparking debate about the use of personal voter information in campaigns.

    There is much more to the Guardian's full report, which is well worth a read.

    Analysis: AI advancements test, but don't rise above privacy laws

    Privacy regulators increasingly recognize that the creation of personal information through computer algorithms is a form of collection, says Constellation VP and principal analyst Steve Wilson, who leads the firm's coverage of digital security and privacy issues: "If a computer program sets a flag in a database saying 'this person is right wing,' or 'this person is LGBT,' then that represents an act of collection of personal information."

    This practice can be termed algorithmic collection, or synthetic PII, and is treated exactly the same under privacy laws as collecting the information by getting subjects to fill out a questionnaire. In some jurisdictions, such as Australia, PII related to sexual preference, health, political beliefs and biometrics is classified as sensitive and given extra protections, Wilson says.
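To make "algorithmic collection" concrete, here is a minimal sketch of a consent gate on inferred attributes; the category names and consent model are illustrative assumptions on my part, not the text of any specific statute:

```python
# Sketch of treating algorithmic inference as "collection": before a derived
# attribute is written to a profile, sensitive categories require explicit
# consent. Category names and the consent model are illustrative assumptions,
# not the text of any specific statute.
SENSITIVE_CATEGORIES = {"sexual_orientation", "health", "political_beliefs", "biometrics"}

def record_inference(profile: dict, attribute: str, value, consented: set) -> dict:
    """Writing an inferred attribute counts as collecting it, so sensitive
    attributes are blocked unless the subject consented to that category."""
    if attribute in SENSITIVE_CATEGORIES and attribute not in consented:
        raise PermissionError(f"collection of '{attribute}' requires consent")
    updated = dict(profile)
    updated[attribute] = value
    return updated

profile = record_inference({}, "favorite_color", "blue", consented=set())
try:
    record_inference(profile, "political_beliefs", "right wing", consented=set())
except PermissionError as err:
    print(err)
```

The point of the sketch is Wilson's argument above: the inference itself, not just a questionnaire, is the regulated act of collection.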

    "Sensitive PII cannot be collected without consent, so automated algorithmic collection of a person's sexuality or politics is a huge problem, even if it's done for research purposes," he says. "This is another nice example of how technology does not outpace the law. If most people are intuitively uneasy about computers working out their sexuality or other deep, even unconscious traits, then they can get some comfort from the fact that existing laws put restraints on this type of action."

    The fundamental point is that there are limits to what personal information should be collected about people, and conditions on how it is collected, Wilson adds. "While new technology can create new ways to break the law, the fact is that privacy laws themselves remain as relevant as ever."


    Today's Tip: Know When To Automate With Artificial Intelligence



    Six Factors For Powering AI Driven Smart Services

    Recent client conversations indicate a desire to design new AI-driven smart services. The rush to incorporate artificial intelligence into processes often requires a deeper examination of which services should be AI enabled. Constellation’s latest framework for augmenting humanity encompasses six factors (see Figure 1):

    1. Repetitiveness. The more often a process is repeated, the more likely it should be AI powered. One-offs and custom processes with minimal repetition are lower-priority candidates for AI.
    2. Volume. When the volume of transactions and interactions exceeds human capacity, the smart service should be AI powered. Volumes within human capacity can remain human powered.
    3. Complexity. Good candidates for AI include processes whose complexity is beyond human comprehension, as well as simple tasks that can be optimized by AI.
    4. Physical presence. Processes that require a heavy physical presence will most likely remain human powered. However, processes that put lives in jeopardy are great candidates for AI and automation. In general, low physical-presence requirements play well to AI powered approaches.
    5. Time to complete. Tight time-to-complete requirements favor AI powered approaches. Processes with relaxed completion timelines can remain human powered.
    6. Nodes of interaction. Simple nodes of interaction lean human powered; AI serves complex and high-volume nodes of interaction best.
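As a rough illustration, the six factors above could be turned into a simple scoring sketch; the 0-5 scales, equal weighting and example processes are my assumptions, not part of Constellation's published framework:

```python
# Rough illustration of scoring processes against the six factors above.
# The 0-5 scales, equal weighting and example processes are assumptions,
# not part of Constellation's published framework.
FACTORS = ("repetitiveness", "volume", "complexity",
           "low_physical_presence", "time_pressure", "interaction_complexity")

def ai_suitability(scores: dict) -> float:
    """Average the six factor scores (each 0-5); a higher value marks a
    stronger candidate for an AI powered smart service."""
    return sum(scores[f] for f in FACTORS) / len(FACTORS)

invoice_matching = dict(repetitiveness=5, volume=5, complexity=3,
                        low_physical_presence=5, time_pressure=4,
                        interaction_complexity=3)
executive_offsite = dict(repetitiveness=1, volume=1, complexity=2,
                         low_physical_presence=1, time_pressure=1,
                         interaction_complexity=1)

print(round(ai_suitability(invoice_matching), 2))   # high score: AI powered candidate
print(round(ai_suitability(executive_offsite), 2))  # low score: stays human powered
```

In practice an organization would weight the factors differently per the hierarchy of needs discussed below, but even an equal-weight average separates obvious AI candidates from human-powered work.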

    Figure 1. Constellation’s AI Powered Framework

    The Bottom Line.  Apply The AI Powered Framework To Smart Service Prioritization

    Six factors play a significant role in identifying which AI-driven smart services deliver the greatest opportunities. Early adopters have prioritized business processes using the Constellation business hierarchy of needs: align candidates to the five categories of regulatory compliance, operational efficiency, revenue growth, strategic differentiation, and brand. Keep in mind that AI enablement requires a strong data strategy, deep data governance, mature business process optimization, and a data-driven design point.

    Your POV.

    So what will you automate first with AI?  Do you have a digital transformation strategy?  

    Please let us know if you need help with your Digital Business transformation efforts. Here’s how we can assist:

    • Developing your digital business strategy
    • Connecting with other pioneers
    • Sharing best practices
    • Vendor selection
    • Implementation partner selection
    • Providing contract negotiations and software licensing support
    • Demystifying software licensing

     


    News Analysis - Oracle Unveils New Programs that Transform how Customers Buy and Consume Cloud – Gloves Off


    Even though we are only about two weeks away from Oracle’s OpenWorld conference in San Francisco, Oracle is pushing the gas pedal. Yesterday Oracle announced its SPARC M8 chip / architecture – today it was Chairman and CTO Larry Ellison’s turn: new pricing.

     
     

    The press release can be found here. So, let’s dissect in usual fashion:
    At a live event today, Oracle Executive Chairman of the Board and CTO Larry Ellison announced new programs that lower costs by delivering increased automation and flexibility, and enable customers to get more value from their existing Oracle software investments. The new Oracle Cloud programs include Bring Your Own License to PaaS and Universal Credits.
    MyPOV – Nice summary. Pricing simplification has been on Ellison’s mind for a long time. In his view, easier pricing accelerates sales processes, something he really wants. This move reminded me of Ellison pushing through a single global price list, published on the Oracle website and connected with the Oracle Store, back in the dot-com era. It was revolutionary then – and it looks visionary today.
    “We are completely transforming the way all companies buy and use cloud by providing flexibility and choice,” said Ellison. “Today, we combined the lowest prices with the highest performance and more automation to deliver a lower total cost of ownership for our customers.”
    MyPOV – Rationale is key for Oracle and Ellison – and his presentation was mostly about how automation of Oracle database processes allows Oracle to lower prices and compete. He even said at one point that Oracle would make cost an SLA item…
    While organizations are eager to move to the cloud, many have not due to obstacles that have forced them to choose between flexibility and lower costs. They have been challenged by the complexity of the cloud and the inability to rebalance spend across different services. Organizations have also been constrained by limited visibility and control over cloud spend. Until now, they have been unable to fully leverage their on-premises software investments in the cloud, having been limited to IaaS services or sacrificing key database features at the PaaS layer. Oracle’s new cloud programs address customers’ cloud adoption challenges by improving and simplifying the way they purchase and consume cloud services.
    MyPOV – Always good to see simplification. In a more complex and accelerated environment, enterprises need all the simplification they can get. And while pricing complexity is one problem, product fit and evaluation is an even bigger one. But Oracle is right – once enterprises have established the services they want / need, budget risk is one of the main preoccupations of CxOs. Though the ‘horror’ stories of the early cloud days – enterprises burning through a month’s budget in a few days because someone didn’t turn something off – are over, it remains a concern today. But price is only half the equation – product / service selection remains a challenge.
    Bring Your Own License to Oracle Database PaaS: Delivering Increased Value Through License Mobility
    Currently, customers can bring their on-premises licenses to Oracle IaaS. Today, Oracle is expanding the offering by enabling customers to reuse their existing software licenses for Oracle PaaS, including Oracle Database, Oracle Middleware, Oracle Analytics, and others. Customers with existing on-premises licenses can leverage that investment to use Oracle Database Cloud at a fraction of the old PaaS price. Running Oracle Database on Oracle IaaS is faster and offers more features than Amazon, delivering the industry’s lowest total cost of ownership. Additionally, customers can further reduce management and operational costs required for on-premises maintenance by taking advantage of this PaaS automation.

    MyPOV – Good to see Oracle expanding its BYOL program. That lowers the hurdle for customers to move to the Oracle cloud – and helps against the competition, which in Oracle’s eyes is most prominently AWS, so Amazon is mentioned here… Oracle has gone to great lengths to argue that its IaaS Gen2 is cost effective – now it is getting aggressive on the pricing side. Always good to see vendors living up to their announcements.
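To make the BYOL economics concrete, here is a hedged sketch; all hourly rates below are invented for illustration and are not Oracle's actual prices:

```python
# Hypothetical BYOL arithmetic: reusing an existing on-premises license
# against the PaaS subscription. All rates are invented for illustration;
# Oracle's actual prices differ.
FULL_PAAS_PER_HOUR = 4.0   # license-included PaaS rate (assumed)
BYOL_PAAS_PER_HOUR = 1.0   # rate when the customer brings a license (assumed)

def monthly_cost(rate_per_hour: float, hours: int = 730) -> float:
    """Cost of running one always-on instance for a ~730-hour month."""
    return rate_per_hour * hours

full = monthly_cost(FULL_PAAS_PER_HOUR)
byol = monthly_cost(BYOL_PAAS_PER_HOUR)
print(f"BYOL saves {100 * (1 - byol / full):.0f}% on the PaaS line item")
```

With these assumed rates the saving is 75 percent; whatever the real ratio, the mechanism is the same: the sunk license cost is credited against the cloud subscription.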
    Universal Credits: Flexible Buying and Consumption Choices for Oracle’s PaaS and IaaS Services
    Oracle is introducing Universal Credits, the industry’s most flexible buying and consumption model for cloud services. With Universal Credits, customers have one simple contract that provides unlimited access to all current and future Oracle PaaS and IaaS services, spanning Oracle Cloud and Oracle Cloud at Customer. Customers gain on-demand access to all services plus the benefit of the lower cost of pre-paid services. Additionally, they have the flexibility to upgrade, expand or move services across datacenters based on their requirements. With Universal Credits, customers gain the ability to switch the PaaS or IaaS services they are using without having to notify Oracle. Customers also benefit from using new services with their existing set of cloud credits when made available.
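    The mechanics Oracle describes – one contract, one pre-paid pool, any mix of PaaS and IaaS services – can be sketched as a simple shared drawdown model. This is a hypothetical illustration only; the class name, service names, and credit figures below are mine, not Oracle's:

    ```python
    # Illustrative sketch (not Oracle's actual billing logic): a single
    # pre-paid credit pool drawn down by any mix of services, so customers
    # can shift spend between services without renegotiating the contract.

    class UniversalCreditPool:
        def __init__(self, prepaid_credits):
            self.balance = prepaid_credits
            self.usage = {}  # per-service drawdown, for visibility into spend

        def consume(self, service, credits):
            """Draw credits for any service against the one shared pool."""
            if credits > self.balance:
                raise ValueError("pool exhausted - time to top up")
            self.balance -= credits
            self.usage[service] = self.usage.get(service, 0) + credits
            return self.balance

    pool = UniversalCreditPool(100_000)
    pool.consume("database_paas", 40_000)  # start with Database Cloud
    pool.consume("compute_iaas", 25_000)   # shift spend to IaaS mid-term
    print(pool.balance)                    # 35000
    ```

    The point of the model is the single balance: switching from PaaS to IaaS is just a different key in the drawdown ledger, not a new contract.
    
    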

    MyPOV – The real simplification comes here. And while pre-pay programs have existed for IaaS for a while, they are usually tied to dedicated products and services. Having portability across products and services is important for customers, especially in the early phases of their cloud journey. Over time their footprint solidifies – but given where most Oracle customers are now, this move is a significant reduction in complexity. The option to use the credits for Oracle Cloud at Customer is a differentiator versus pure public-cloud IaaS vendors. It caters well to customers who are still concerned about cloud, who are in geographies where Oracle’s IaaS Gen 2 may not be available yet, or where data residency requires in-country operation of IT. 
     
    Behind the scenes it means Oracle must have a lot of capacity. Dedicated service / product pre-pay helps IaaS vendors plan and build out their capacity. When customers can switch with little notice, an IaaS vendor must be confident it has enough capacity. Theoretically, too much demand can create wait lists – but that would be bad for business… an area to watch going forward.

     

    Overall MyPOV

    Always good to see simplification. Customers who are uncertain about which products and services they will consume can now start budgeting, for instance for trials and proofs of concept. BYOL is powerful for customers ready to move to the cloud. So overall a good move by Oracle.

    Looking behind the scenes – we can assume a few things: Oracle is confident IaaS Gen 2 works. It just poured another $2B of CAPEX into it in Q1 2018 – the first $2B+ CAPEX quarter for Oracle in recent years (or at all – I did not go back to check). All these investments need load to be justified. If pricing held customers up – it has been simplified. And the move gives the Oracle sales force an instant chance to have a cloud conversation with customers – given the BYOL option and the public cloud and Cloud at Customer options. The executives at 500 Oracle Parkway are making it as easy as possible for the sales force and customers to move to the cloud and generate cloud revenues. That Oracle is under pressure to show cloud growth (like all major enterprise vendors) is no surprise, and the last earnings call has not lowered that pressure, despite good results.

    For the industry, it’s key to watch how well Oracle can move customers to the cloud (or Cloud at Customer). For Oracle competitors, massive potential could disappear. Constellation estimates that 30-40% of on-premises systems are either Oracle systems or systems so closely connected to an Oracle system that they would move with it – so this is a key development to watch to determine the overall move of on-premises load to the public cloud (the other one is the VMware / AWS partnership). If Oracle succeeds with this – we may see a fast move of almost all enterprise load to the cloud. The reason: when a year of server refresh goes missing, most on-premises IT becomes very expensive. So expensive that boards and CEOs start asking how fast the rest can move… Stay tuned. It’s going to be a crucial fall.


    For more on the event - check out the Storify embedded below. 
     

    Digital Transformation Digest: Kohl's Cozies Up to Amazon, Hitachi Forms Vantara for Digital Business, Oracle Points SPARC to Its Cloud


    Constellation Insights

    Kohl’s cozies up closer to Amazon: Just weeks after announcing it would welcome Amazon smart home experience centers at a number of its department stores, Kohl's is adding Amazon item return services in 82 locations starting next month. It's a continuation of Amazon's brick-and-mortar strategy, which took a big leap forward with the acquisition of Whole Foods, and could conceivably lead to a purchase of Kohl's. 

    Here are some key details from the announcement:

    “We are thrilled to launch this unprecedented and innovative concept, allowing customers to bring in their unpackaged Amazon returns to Kohl’s and we will pack them, ship them, and return them to Amazon for free,” said Richard Schepp, Chief Administrative Officer. “This is a great example of how Kohl’s and Amazon are leveraging each other's strengths – the power of Kohl’s store portfolio and omnichannel capabilities combined with the power of Amazon’s reach and loyal customer base.”

    POV: There are a number of caveats to consider. First, the 82 Kohl's stores that will feature Amazon returns are all in Chicago and Los Angeles. One would expect, however, that the initial rollout is a test run for eventually adding the service to most or all of Kohl's stores. Also, the announcement notes that "eligible" Amazon items will be accepted as returns; it's not clear what the limitations are, but for free, who can complain? 

    For large retailers like Kohl's, increasing foot traffic is crucial even as they build out online revenue streams. Amazon return centers certainly could drive that foot traffic and result in more in-store sales.

    Kohl's has had more success than other department chains in adjusting to omnichannel realities. It also has much larger stores than Whole Foods, raising possibilities for Amazon that a typical Whole Foods store footprint cannot. Acquiring Kohl's, which has a market capitalization of about $7 billion, would be practically trivial for Amazon. While not a lock, an eventual deal looks like a strong possibility.

    Hitachi creates Vantara unit for digital business: There is a new—in a sense—player in big data and digital transformation consulting services, with Hitachi's launch of Vantara, a new unit that combines Hitachi Data Systems, Hitachi Insight Group and Pentaho. Here's how Hitachi describes the opportunity for Vantara and customers:

    The market opportunity for mission-critical data solutions has never been greater. Data has become a business's greatest asset—if they can extract actionable insights from it. Data holds the key to new revenue streams, better customer experiences, improved market insights and lower costs of doing business. However, a comprehensive offering has yet to emerge that combines both OT and IT expertise to uncover its true potential—until now.

    Hitachi Vantara will continue to provide superior infrastructure and analytics technologies that enterprises rely on for their mission-critical data in their data centers, in the cloud and at the edge of new innovations. The new company is targeting the emerging IoT market opportunity, in which there is no clear winner yet.

    Hitachi has developed its own IoT platform, Lumada, which will be part of Vantara. The new entity is going after high-end business, focusing on the global Fortune 1000.

    POV: Hitachi may have big ambitions for Vantara but the likes of IBM and Dell EMC are competing for the same business. Where Hitachi says it has an advantage is with its operational technology background, which Vantara engagements will couple with IT know-how. By any measure, Vantara is a big move by a big player, and one that bears watching.

    Oracle delivers SPARC M8 systems, clarifies Solaris's future: A couple weeks in advance of OpenWorld, Oracle has announced a new series of servers based on the SPARC M8 microprocessor. It also said it plans to support the Solaris OS until at least 2034.

    SPARC M8 chips include advancements for software-on-silicon based security measures; 2x faster encryption than x86 systems and SPARC M7; and superior performance for Oracle database and Java workloads compared to x86 and M7, according to the announcement:

    "Oracle has long been a pioneer in engineering software and hardware together to secure high-performance infrastructure for any workload of any size," said Edward Screven, chief corporate architect, Oracle. "SPARC was already the fastest, most secure processor in the world for running Oracle Database and Java. SPARC M8 extends that lead even further."

    POV: Oracle's hardware revenue fell 5 percent year-over-year in its first quarter to $943 million. But it's doubtful Oracle truly views on-premises hardware sales as a growth story. Rather, it is betting that innovation in the SPARC platform can give its cloud services a performance and efficiency edge.

    As for Solaris, the lengthy support commitment should please customers with legacy Solaris workloads, but it's not clear how many resources Oracle will pour into the OS going forward. Sharp eyes at the Register noted that a number of OpenWorld sessions focus on moving Solaris workloads to the cloud—presumably, its own.

    AWS adds per-second billing: The cloud pricing wars just got a new wrinkle, with Amazon Web Services' introduction of per-second billing. Here's how AWS chief evangelist Jeff Barr describes the value proposition in a blog post:

    Some of our more sophisticated customers have built systems to get the most value from EC2 by strategically choosing the most advantageous target instances when managing their gaming, ad tech, or 3D rendering fleets. Per-second billing obviates the need for this extra layer of instance management, and brings the costs savings to all customers and all workloads.

    While this will result in a price reduction for many workloads (and you know we love price reductions), I don’t think that’s the most important aspect of this change. I believe that this change will inspire you to innovate and to think about your compute-bound problems in new ways.

    Per-second billing goes into effect in all AWS regions on October 2, for Linux instances "that are newly launched or already running," Barr wrote. Amazon is also requiring a one-minute minimum charge per instance.
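    The savings come straight out of the arithmetic. A quick sketch comparing the old whole-hour model with per-second billing under the one-minute minimum (the hourly rate and runtimes below are made-up illustrations, not actual EC2 prices):

    ```python
    # Hedged sketch of per-second billing with a 60-second minimum charge,
    # versus the old bill-every-started-hour model. Rates are hypothetical.
    import math

    HOURLY_RATE = 0.10          # $/hour for a hypothetical instance type
    MIN_BILLED_SECONDS = 60     # the one-minute minimum charge per instance

    def per_hour_cost(runtime_seconds):
        """Old model: every started hour is billed in full."""
        return math.ceil(runtime_seconds / 3600) * HOURLY_RATE

    def per_second_cost(runtime_seconds):
        """New model: bill actual seconds, subject to the 60 s minimum."""
        billed = max(runtime_seconds, MIN_BILLED_SECONDS)
        return billed * (HOURLY_RATE / 3600)

    # A 10-minute batch job (600 s):
    print(round(per_hour_cost(600), 4))    # 0.1   (a full hour billed)
    print(round(per_second_cost(600), 4))  # 0.0167 (only 10 minutes billed)
    ```

    For short-lived instances – the gaming, ad tech, and rendering fleets Barr mentions – the gap between the two models is what previously made elaborate instance-packing strategies worthwhile.
    
    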


    POV: The move is both good for AWS customers as well as AWS, if it can recycle instances faster and loan them out more often per minute, says Constellation Research VP and principal analyst Holger Mueller. "It's like the Frankfurt airport Sheraton, which usually has 120 to 130 percent utilization, because people check in and check out of the same rooms multiple times in 24 hours," he says.

    Per-second pricing isn't currently available on other major clouds, but you can expect AWS's competitors to follow suit soon.


    Rethinking IT Service Management in the Era of Cloud, Customer Experience, and Design Thinking


    Most practitioners would agree that there's been a steady shift in IT service management over the last decade. The stagnation of ITIL combined with the rise of agile methods, devops, public cloud, and even Shadow IT has had a growing and inexorable impact on how we manage our IT services today.  The customer shift in expectations for service management is clear too: Be more responsive, be easier to consume, move faster, and lead the business from the front when it comes to technology services. As a result, it's clear now that the evolution of the practice has reached a significant inflection point.

    Thus, in an age where crafting easy-to-use and engaging customer experiences using techniques such as empathy-driven design thinking has become best practice for service design, the old and decidedly staid world of ITSM is getting a reboot. Certainly, leading vendors like ServiceNow and BMC have helped make this shift possible with increasingly consumerized experiences, customer communities, and self-service capabilities, but this transformation is about much more than the tools. There's a new sense that service management has to grow up to lead the business itself in how it adopts and consumes technology, while becoming a prime customer of the development process itself.

    To be clear, other groups in the business -- including the enterprise architecture team, the Chief Digital Officer, and even the marketing technology groups -- are also busy doing the same thing. However, they are not positioned in the center of service delivery itself and don't have the infrastructure, mandate, or experience in service design, deployment, and management. But service management groups must think big and seize the initiative or risk being relegated to the margins of shared services.

    The Next Generation of Service Management: Beyond ITIL with Customer Experience, Design Thinking, and Shadow IT

    Service Management Now Informed by New Developer Methods and High Quality Consumer Experiences

    Pushed forward by the advent of new ideas from the development side of IT, service management is becoming much more iterative, proactive, and customer-focused. This is helped along by the aforementioned new solutions that take the friction out of service management by enabling high degrees of ease-of-use and self-service. Meanwhile, the development side of the house integrates much more closely with ITSM, then uses fast feedback cycles to rapidly iterate services using agile and devops methods until the right solution is refined out of the initial proofs of concept or prototypes.

    IT service management pundits such as Dennis Drogseth have dubbed this shift away from a reactive service desk and towards a more integrated, adaptive, and forward-looking service model "Service Management 2.0." While the trend of adding the "2.0" suffix is now outdated, the point is a correct one: traditional service management has to evolve to become more effective in meeting business requirements using emerging new methods. The practice of ITSM as codified in ITIL v3 is not only too heavyweight today, but it also fails to reflect the countless lessons learned in usability and customer journey design over the decade since it was last updated. Design thinking, an increasingly popular way of creating customer-centric technology services, wasn't even on IT's radar when the most recent version of ITIL was developed.

    Rethinking Service Management Using Today's Digital Processes and Lessons Learned

    For my own part, I recently had an opportunity to widely survey the state of the art in ITSM last month when I gave an opening keynote at the 20th annual Service Management conference in Melbourne, Australia. I also gave two deep-dive next-generation ITSM workshops to nearly 40 top service management professionals, which were highly informative. In the process, I encountered a pleasantly surprising number of practitioners who were hungry for a new model of service delivery beyond, or complementary to, the traditional ITIL model. Most are looking at incorporating agile into service management, and some were closely evaluating devops as part of the process. But one thing was clear: the practice is not evolving as fast as the marketplace or our stakeholder expectations, despite an urgent need to move faster across the industry.

    The ITSM workshops I facilitated last month were particularly revealing, as we jointly developed a new model for ITSM that I believe will a) resonate with practitioners, b) look familiar enough to be readily understandable, and c) still deeply incorporate the most vital new trends mentioned above. For lack of a better term, I'll call this new view Service Management 2020, both for the target date at the end of the decade for ITSM groups to overhaul their function, and for the 20:20 hindsight we now have, with several decades of ITSM experience to see what worked and what needs to change. 

    A Vision for Service Management 2020

    • The customer experience is paramount. Where legacy ITSM is process-centric, the new view is that measuring and managing the resulting impact and quality of the service management customer's journey is what matters most of all. While design thinking is not necessarily a mandatory new process in service design and 'service devops', ITSM must adopt some effective method to map out the customer experience, ensure stakeholder needs are being met, and use data from the field to confirm it's the right journey through service management (and, the hard part, keep it updated).
    • Service management is just as important a service customer as the end user. This is the signature lesson of devops and continuous delivery: Operations and development must collaborate closely together to iterate towards the right solution that is optimized for a) the customer experience and b) operations and service management. Both are vital and essential stakeholders to please with service management.
    • Agile and devops must be incorporated into service management. Older legacy service development processes are slow and wasteful, don't course-correct quickly enough, and won't lead to an adequate user experience or sufficiently meet business requirements. A key point: this new generation of processes uses end-to-end visibility and collaboration to get information from the customer as quickly as possible, feeding feedback from rapidly iterating builds back to the development groups so the right solution can be created. ITSM tends to be siloed from these processes, so it must be removed from this silo as soon as possible and assume a larger role in the IT value chain.
    • IT service management must evolve into business service management. The end game is not so much about IT as it is about how digital impacts the very way the business operates and thinks. IT is now a key component of almost all business services, and thus shared services functions can and should, in many organizations, focus on business digitization as our organizations become technology companies.
    • Move towards 90% automation as soon as possible. The reality is that ITSM budgets tend to be tight and don't grow quickly, even as responsibilities mount and a new generation of service management arrives that must be dealt with effectively. As AI-powered support, chatbot-based ITSM services, community-powered self-help, and other automated aids arrive, service management professionals should use them to free up time and resources to focus on the strategic transformation activities represented by this list, which will take at least the next three years to address properly.

    There is little doubt that we are entering one of the most exciting times in the field of service management, yet there is much work still to do to pathfind the way. The vision of Service Management 2020 is one that I believe will resonate with most practitioners as they attempt to modernize one of the most vital technology capabilities within our organizations. The process of shifting the model of service management itself in the way described above will also make ITSM more strategic. Practitioners should be ready to communicate and educate upwards to ensure they gain C-Suite support for their efforts to evolve into a proactive digital business service management function.

    Continuing the Discussion

    Please add your comments on the future of ITSM below. You can also reach me via email: dion (at) ConstellationR (dot) com or @dhinchcliffe on Twitter.

    Also, please let us know if you need assistance with your service management transformation efforts. Here’s how we can help:

    • Developing your ITSM strategy and transformation plans
    • Connecting with other service management peers and leaders
    • Accessing the latest service management best practices
    • Understanding the service management vendor space
    • Identifying options for implementation partners
    • Developing and validating digital transformation roadmaps and playbooks
    • Providing advisory and education to IT executives, CXOs, and boards

    Additional Reading

    Digital (Service) Transformation and the Leadership Quandary

    Rethinking Field Service Management in Digital Business

    Systems of Engagement and Enterprise Business Architecture

    The New CIO Mindset


    Digital Transformation Digest: Chambers Ending An Era at Cisco, Vertica 9 Unveiled, IBM's New Cloud Data Migration Service


    Constellation Insights

    John Chambers' legacy at Cisco: After more than 20 years as either executive chairman or CEO of Cisco, John Chambers is stepping back from his duties. Chambers will not stand for reelection this December to Cisco's board, which intends to name CEO Chuck Robbins chairman as his successor.

    At the time Chambers was appointed CEO in 1995, the networking giant had $1.2 billion in revenue. It now generates nearly $50 billion annually, driven by an eye-popping 180 acquisitions during Chambers' tenure as CEO. That growth strategy has continued under the leadership of Robbins, who took the CEO job in 2015.

    Not every acquisition has been a success for Cisco. Critics often point to Chambers' decision to abruptly kill Flip, the consumer-oriented camcorder Cisco bought for $590 million, as an example of a misfire. Significant deals made under Robbins' watch include the $1.4 billion purchase of Jasper, maker of an IoT platform, and the $3.7 billion Cisco plunked down for application performance monitoring vendor AppDynamics.

    Revenue has fallen for the last seven quarters, but Cisco has been beating analyst estimates for earnings per share.

    POV: It's not as if Chambers' departure comes as any surprise, given it's been two years since he stepped down as CEO. (In the meantime, Robbins has overseen a retrenchment of Cisco's strategy with a focus on next-generation networking and multi-cloud management.) But it still marks the end of an era.

    "John Chambers wrote the playbook for massive growth by acquisition in high tech," says Constellation founder and CEO R "Ray" Wang. "His leadership over the years at Cisco was unparalleled in driving scale, improving margins, and leading the market in financial engineering. His legacy will be known as one of the legendary Silicon Valley leaders during the golden age of networking."

    The only downside will be the highly competitive, Game of Thrones-like environment Chambers is leaving behind. "That will need some healing under Chuck Robbins to reinvigorate the culture," Wang says.

    Vertica 9 unveiled post-Micro Focus acquisition: Earlier this month, HPE completed the $8.8 billion spinoff of its software assets to Micro Focus. Now the latter has taken the wraps off Vertica 9, the latest version of the analytics database platform. Here are the key details from its announcement:

    Vertica provides organizations with a single, unified analytical database that supports all major cloud platforms, all popular data formats, enhanced integrations with Spark and Kafka and an analyze-in-place, unified architecture that enables businesses to monetize their data assets with cloud elasticity – regardless of data location.

    The new release triples load performance, dramatically increases query performance with Flattened Tables, and extends concurrency by up to 60 percent. In addition, Vertica 9 natively integrates with key ecosystem technologies and open source innovation, including Microsoft PowerBI, Cloudera Manager and Apache Spark 2.1.

    Vertica has also added support for Google Cloud Platform in this release, and is rolling out a beta version of its Eon Mode. This separates compute and storage, allowing for just-in-time provisioning on analytics jobs, which can save customers money.

    In addition, Vertica 9 features a new set of machine learning algorithms, additional data-prep tools and a new writer tool for Parquet, the columnar storage format associated with Hadoop File System.

    POV: Vertica 9 will be generally available in October. That's roughly a year after the release of Vertica 8, timing that suggests the Micro Focus spinoff didn't cause excessive distractions at the product engineering level.

    The beta release of Eon Mode represents where Vertica is playing catch-up to others in the market, says Constellation VP and principal analyst Doug Henschen. Snowflake Computing was among the pioneers of separating compute and storage when it was founded in 2012, and it has since been followed by Teradata with its IntelliFlex architecture, Henschen adds.

    This separation will ease flexible cloud deployment, but Vertica 9 also makes it easier to deploy on the AWS, Azure and Google clouds by way of cloud-native marketplaces/launchers with bring-your-own-license (BYOL) approaches, Henschen says.

    However, Vertica still doesn’t offer its own Database as a Service (DBaaS) offerings. Constellation sees DBaaS as increasingly popular, as these options tend to be highly automated and save customers from having to deal with routine and repetitive database admin, patching and software-update tasks.

    Vertica remains a popular choice for its massive scalability and advanced analytical capabilities, often showing up as the embedded data platform behind third-party SaaS offerings, such as Datorama, Domo and GoodData, Henschen adds. The EON architecture and streamlined BYOL options are positive moves, but Henschen notes in his Constellation ShortList for Hybrid and Cloud-Friendly RDBMS, getting into the thick of the hybrid cloud competition demands multi-cloud database services, preferably managed by the database provider.

    The machine learning advancements and other new features extend Vertica's capabilities for cloud and IoT use cases, but they were put in place under HPE's ownership, Henschen notes.

    While Vertica has synergies with the Autonomy search and machine learning platform Micro Focus also acquired from HPE, the rest of the portfolio focuses on DevOps, hybrid IT, security and risk management. Though Micro Focus officials have characterized Vertica as a growth engine for the company, it's possible Vertica could be spun out yet again, he adds: "I’m looking forward to seeing what the new Micro Focus does with this valuable asset."

    IBM rolls out physical cloud migration offering: Bandwidth remains an obstacle when it comes to moving large data sets to the cloud. To get around the problem, vendors including Amazon Web Services and Google have been pushing physical data migration options—in AWS's case, it's a tractor trailer called Snowmobile, albeit one aimed at petabyte-scale data sets.

    Now IBM is getting in on the trend with Mass Data Migration, a service built around a $395 portable storage device with up to 120TB of capacity. The devices include 256-bit encryption and ship via UPS next-day air service. It's possible for the devices to be sent out, their data migrated to IBM's cloud, and returned to the customer within a week, according to a statement.

    IBM claims that it is offering more storage per dollar compared to competing products. The devices are available in the U.S. now and in the European Union soon.
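    The week-long shipping turnaround makes sense once you run the bandwidth arithmetic. A back-of-the-envelope sketch (the link speeds and utilization figures below are my illustrative assumptions, not IBM's numbers):

    ```python
    # Back-of-the-envelope check of why shipping a 120 TB device can beat
    # the network. Link speed and utilization are illustrative assumptions.

    def transfer_days(terabytes, link_gbps, utilization=1.0):
        """Days to push `terabytes` over a `link_gbps` link at a given
        effective utilization (real WAN links rarely sustain 100%)."""
        bits = terabytes * 1e12 * 8          # decimal TB -> bits
        seconds = bits / (link_gbps * 1e9 * utilization)
        return seconds / 86400

    print(round(transfer_days(120, 1.0), 1))       # 11.1 days on a full 1 Gbps link
    print(round(transfer_days(120, 1.0, 0.3), 1))  # 37.0 days at 30% utilization
    ```

    Even on a dedicated, fully saturated 1 Gbps link, 120 TB takes over a week and a half to upload; at more realistic utilization the gap widens to a month or more, which is the opening these physical migration appliances exploit.
    
    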

    POV: Network speeds are too slow to move customer data to the cloud, so IaaS providers have to create these rugged temporary storage appliances to help with the process, says Constellation VP and principal analyst Holger Mueller. "The interesting question going forward will be whether these are going to be 'dumb' storage devices, or easy-to-deploy, rugged servers that can capture data—e.g. at the IoT edge—and offer lightweight processing on site," he adds. "The good news for customers is they are getting more choices, and easier and faster ways to move to the public cloud."
