
News Analysis - Microsoft to deliver Microsoft Cloud from datacenters in Africa


[South Africa has 11 official languages – and as I start blogs on new IaaS locations with the local language ‘being learnt’ by the provider, I took the alphabetically first and last language.]

Microsoft recently announced plans to bring Azure to South Africa. Data center locations matter from a compliance and data residency perspective – as well as from a performance perspective.

So let’s take apart Scott Guthrie’s blog, which can be found here:
May 18, 2017 – Johannesburg, South Africa – Today Microsoft revealed plans to deliver the complete, intelligent Microsoft Cloud for the first time from datacenters located in Africa. This new investment is a major milestone in the company’s mission to empower every person and every organization on the planet to achieve more, and a recognition of the enormous opportunity for digital transformation in Africa.
MyPOV – Good to see the intent, in line with the general Microsoft Azure (or is it now the Microsoft Cloud?) value pitch of being the intelligent cloud.
Expanding on existing investments, Microsoft will deliver cloud services, including Microsoft Azure, Office 365, and Dynamics 365, from datacenters located in Johannesburg and Cape Town, South Africa with initial availability anticipated in 2018. The new cloud regions will offer enterprise-grade reliability and performance combined with data residency to help enable the tremendous opportunity for economic growth, and increase access to cloud and internet services for organizations and people across the African continent.
MyPOV – So 2018 will see the go-live. No surprise there is an Office 365 and Dynamics 365 angle – two products that have to comply with data privacy and data residency legislation.
“We’re excited by the growing demand for cloud services in Africa and their ability to be a catalyst for new economic opportunities,” said Scott Guthrie, Executive Vice President, Cloud and Enterprise, Microsoft Corp. “With cloud services ranging from intelligent collaboration to predictive analytics, the Microsoft Cloud delivered from Africa will enable developers to build new and innovative apps, customers to transform their businesses, and governments to better serve the needs of their citizens.”
MyPOV – Good quote from Guthrie, which also hints at the next-generation application angle for developers and the government angle. Governments usually require data residency.
Expanding Access & Opportunity: Currently many companies in Africa rely on cloud services delivered from outside of the continent. Microsoft’s new investment will provide highly available, scalable, and secure cloud services across Africa with the option of data residency in South Africa. With the introduction of these new cloud regions, Microsoft has now announced 40 regions around the world – more than any major cloud provider. The combination of Microsoft’s global cloud infrastructure with the new regions in Africa will connect businesses with opportunity across the globe, help accelerate new investments, and improve access to cloud and internet services for people and organizations from Cairo to Cape Town.
MyPOV – South Africa is an island from an IaaS perspective: a large economy, but far away from a connectivity perspective. It is in a similar situation to Australia, only that the IaaS vendors made it to Australia much earlier. With 40 regions Microsoft currently leads fellow competitors AWS and Google, though it does not share how many data centers make up a region. To deliver reliable services in a geography, there need to be at least two data centers – and Microsoft will have two regions, one each in Cape Town and Johannesburg. That should be a good answer for any HA (High Availability) concerns, though Oracle has quickly moved the 'standard' to three data centers per location / region.
“We greatly value Microsoft’s commitment to invest in cloud services delivered from Africa. Standard Bank already relies on cloud technology to provide our customers with a seamless experience,” says Brenda Niehaus, Group CIO at Standard Bank. “To achieve success as a business, we need to keep pace with market developments as well as customer needs, and Office 365 empowers us to make a culture shift towards becoming a more dynamic organization, whilst Azure enables us to deliver our apps and services to our customers in Africa. We’re looking forward to achieving even more with the cloud services available here on the continent.”
MyPOV – Always good to have launch customers, and good to have them provide a quote in a press release announcing products / services that are yet to be used.
Investing in African Innovation: This announcement expands on ongoing investments in Africa, where organizations are using currently available cloud and mobile services as a platform for innovation in health care, agriculture, education, and entrepreneurship. Microsoft has been working to support local start-ups and NGOs, unleashing innovation that has the potential to solve some of the biggest problems facing humanity, such as the scarcity of water and food, and economic and environmental sustainability. One start-up, M-KOPA Solar, provides affordable pay-as-you-go solar energy to over 500,000 homes using mobile and cloud technology. AGIN has built an app connecting 140,000 smallholder farmers to key services, enabling them to share data and facilitating $1.3 million per month in finance, insurance and other services.
MyPOV – Always good to show the potential and upside – and Africa has a lot of both. It’s not clear what M-KOPA and AGIN are using, or will be using, from Microsoft.
 
Across Africa, Microsoft has brought 728,000 small and mid-size enterprises (SMEs) online to help them transform and modernize their businesses, and over 500,000 are now utilizing Microsoft cloud services, with 17,000 using the 4Afrika hub to promote and grow their businesses. The Microsoft Cloud is also helping Africans build job skills, with 775,000 trained on subjects ranging from digital literacy to software development. We anticipate the Microsoft Cloud from Africa will fuel extensive new opportunities for our 17,000 regional partners and customers alike.
MyPOV – Impressive numbers; the consumer and educational aspects of the Microsoft product and services portfolio have a lot of potential in Africa. On the other hand, this also requires Microsoft to invest in infrastructure in Africa – and this is a first step.
“This development broadens the options available to us in our modernization journey of Government ICT infrastructure and services. It allows us to take advantage of new opportunities to develop innovative government solutions at manageable costs, as well as drive overall improvements in operations management, while improving transparency and accountability,” says Dr. Setumo Mohapi, CEO at SITA.
MyPOV – Again – good to see a current / future customer quote – covering the government aspect and potential.
The Microsoft Trusted Cloud: Microsoft has deep expertise protecting data, championing privacy, and empowering customers around the globe to meet extensive security and privacy requirements. With Microsoft’s Trusted Cloud principles of security, privacy, compliance, transparency, and the broadest set of compliance certifications and attestations in the industry, Microsoft’s cloud infrastructure supports over a billion customers and 20 million businesses around the globe. […]
MyPOV – Good to see Microsoft stressing the security aspect. As in every new geographic region where the cloud arrives, there is a large group of skeptical CxOs, and security concerns are at the top of their list of reasons why they cannot move to the cloud. These concerns need to be addressed. There is no reason though why they cannot be addressed in South Africa just as they have been in the rest of the world… with a broadly favorable outcome for the cloud.

 

Overall MyPOV

Always good to see IaaS vendors adding locations to their global clouds. South Africa is a key prize given the combination of its GDP and its remoteness from network backbones. Australia has similar characteristics, but with a roughly 4x larger GDP, so no surprise the global monopoly game between the IaaS vendors saw the respective IaaS flags planted in Australia earlier than in South Africa. But now it is South Africa’s – and with that Africa’s – turn. And Africa (after Asia) is the world’s second largest continent from a population perspective. As such Africa is key for Microsoft across all its offerings, as the press release outlines: for Office usage, for Dynamics usage and for getting strongly locally rooted customers (like governments) onto the Microsoft cloud.

On the concern side, there is little to address. Microsoft is large enough to make the CAPEX happen; the only question is which geographies got trumped by South Africa – but that is what we will learn from the next data center location announcement. South African data centers will likely not be efficient for serving economies north of the equator; the African Mediterranean rim, for example, is likely served better from Europe. And then there is the prize of the first Middle Eastern data center. That Microsoft does not shy away from network investments can be seen from the recent MAREA cable announcement (with Facebook, see here), which will land in Europe closer to Africa than any other transatlantic cable, in Bilbao (Spain).

But for now, congrats to Microsoft, which among the three large providers (AWS and Google being the others) is the first with an announcement for a South African location. Even going further down the provider list to e.g. IBM, Oracle and SAP (though SAP may not push the IaaS build-out now), Microsoft has made the first announcement / move towards South Africa and Africa. So, congrats are in order.

 

 


News Analysis - Informatica Reinvents iPaaS with Next-Generation Cloud Services


Informatica had its user conference INFA17 this week in San Francisco, from May 15 to May 18, 2017. We had the chance to be there on Monday, and to actually speak about MDM trends (see the Storify here).

 
As usual, user conferences come with a flurry of press releases, and this one was no different – let me dissect the key one in the customary style; it can be found here:
  • Expands key capabilities of iPaaS to include end-to-end data management
  • Advances productivity with CLAIRE – metadata-driven Artificial Intelligence
  • Scales enterprise-wide to manage complex hybrid data environments
  • Adheres to the broadest security certifications
MyPOV – A fair summary. You know these press releases are a bit too long when the PR folks realize they need a bullet list of content items at the beginning.
Informatica World, SAN FRANCISCO, Calif., May 16, 2017 – Informatica, the Enterprise Cloud Data Management leader accelerating data-driven digital transformation, today announced Informatica Intelligent Cloud Services, the most advanced Integration Platform as a Service (iPaaS) solution available for end-to-end enterprise cloud data management. Informatica Intelligent Cloud Services will feature a next-generation user experience based on a modern API-based microservices architecture, powered by Informatica’s innovative enterprise unified metadata intelligence - known as CLAIRE Engine.
MyPOV – Great summary; it describes what is happening and bridges from the well-known iPaaS to microservices and AI, introducing the CLAIRE engine.
 
Informatica Intelligent Cloud Services expands Informatica’s leading application and data integration iPaaS capabilities to now include critical end-to-end cloud data management. This includes industry leading and enterprise-class data integration, API management, application integration, data quality and governance, master data management and data security, all re-imagined for the cloud. Informatica Intelligent Cloud Services is built on the Informatica Intelligent Data Platform™, and has reimagined the front- and back-end experience for the modern cloud environment, enabling organizations to efficiently unleash the power of all their data, wherever it resides, to fuel successful digital transformation initiatives.
MyPOV – Ok, more detail, and a reminder that all is on one single platform – the Informatica Intelligent Data Platform.
“As the industry’s number one iPaaS leader, we are driving innovation in this market,” said Amit Walia, executive vice president and chief product officer, Informatica. “With this launch, we are completely re-inventing iPaaS. We are delivering the industry’s broadest end-to-end data management solution for the cloud, with a next-generation user experience running on an API-based microservices architecture. Powered by metadata and Artificial Intelligence, we help enterprises accelerate their cloud-powered digital transformations.”
MyPOV – Good quote from Walia. Everybody wants to re-invent things these days, but when putting AI into action, it truly has the potential of doing things differently. My concern is that Informatica uses ‘old’ terms like metadata. No need to talk about metadata in the AI era. More below.
 
Data Management Reimagined for Cloud
Informatica Intelligent Cloud Services moves past the traditional definition of iPaaS to include cloud data integration, cloud application and process integration, API management and connectivity. Informatica Intelligent Cloud Services delivers the industry’s first, and only, family of clouds that provides industry-leading data management capabilities, powered by CLAIRE from Informatica.
MyPOV – Ok – this is the third – and different – collection of what the iPaaS does, this time with a data management angle.
 
The family of clouds available in Informatica Intelligent Cloud Services include:
  • Informatica Integration Cloud – Modern digital strategies require a variety of integration approaches and patterns. Integration Cloud greatly expands the traditional definition of iPaaS to include advanced, unique functionality, such as Integration Hub and B2B. This also includes application integration, data integration and API management. For example, Cloud Integration Hub provides pub-sub-hub integration capabilities for hybrid data management.
  • Informatica Data Quality & Governance Cloud – Modern digital strategies require trusted data. Data Quality & Governance Cloud includes functionality that delivers the data quality and governance foundation for all cloud projects and initiatives. For example, Cloud Data Quality Radar provides the ability to assess and fix data quality issues within cloud applications, such as Salesforce and Marketo.
  • Informatica Master Data Management Cloud – Modern digital strategies require authoritative data. Master Data Management Cloud provides single, complete and accurate views across all forms of master data, in a single source of truth. For example, Cloud Customer 360 for Salesforce provides cloud MDM capabilities that scale to the most demanding enterprise requirements with a laser focus on business self-service and self-management of master data.
  • Future Clouds – Additional, modular data management clouds, products and solutions will be seamlessly added to Informatica Intelligent Cloud Services over time.
MyPOV – So what used to be products are now ‘clouds’. The separation between integration, data management and then data quality and governance makes sense, as these are the organic organization and breaking points of how data handling organizations are set up. Good to see Informatica leaving the door open for future clouds aka products.
Adopting cloud and using data to drive disruption requires excellence in data management. The innovative combination of a modern user experience and API-based microservices architecture, built on the industry’s only Intelligent Data Platform powered by CLAIRE, enables Informatica Intelligent Cloud Services to deliver increased productivity and address new use cases, at scale.
MyPOV – Another collection of capabilities; it would be good to have an example or customer proof point.
 
Next-Generation Experience and Architecture: Innovative Approach for Maximum Productivity
All the clouds that comprise Informatica Intelligent Cloud Services share a consistent, next-generation user experience across the entire spectrum of data management capabilities. The API-based microservices architecture delivers common services (e.g., user authentication, workflow creation, asset management, search, tagging, and more) that not only look the same, but also operate exactly the same wherever they are invoked across the cloud. This user experience dramatically reduces the learning curve for new tools and drives self-service across the environment.
MyPOV – Always good to mention suite-level benefits – and this time making them tangible with examples. Consistency and synergies are what enterprises want to see when they buy multiple, suite-integrated products from the same vendor.
 
The reimagined next-generation user experience includes a single, personalized home page with tiles for items such as personal tasks and connections, plus tiles that are dashboards for the projects that person has underway in each of the data management clouds. This home page gives them visibility and access to all the data management projects they may have underway across all the clouds of Informatica Intelligent Cloud Services.
MyPOV - It's never enough to offer great capability and functionality behind the scenes; it has to be seen and experienced by the user. Good to see the UX progress – frankly an area where Informatica has been challenged in the past, particularly in terms of consistency. Good to see the tile approach for UX, which has worked well across the industry to bring information together consistently for multiple user roles.

 
Additionally, the new microservices are based on open REST APIs. This will enable a continued rapid pace of innovation for Informatica, allowing the company to bring new services and advance existing services at a rapid pace. It will also enable quick integration with customer and partner reference architectures.
MyPOV - An overdue move, but good nonetheless. REST has won, and it's time for vendors to adopt it.
 
Powered by CLAIRE: Intelligence Throughout the Cloud
CLAIRE—with clairvoyance in mind and AI in the center—is the industry’s most advanced metadata-driven Artificial Intelligence (AI) technology and is embedded in the Informatica Intelligent Data Platform. CLAIRE delivers intelligence to the entire portfolio of Informatica data management solutions that includes data integration, master data management, data quality and governance, data security, cloud data management, and big data management capabilities. CLAIRE delivers AI by applying machine learning to technical, business, operational and usage metadata across the entire enterprise. This scale and scope of metadata is transformational and allows CLAIRE to help data and integration developers by partially or fully automating many tasks, while business users find it easier to locate and prepare the data they are looking for from anywhere in the enterprise. Meanwhile, data scientists gain a faster understanding of data and data stewards find it easier to visualize data relationships.
MyPOV – Good intro and explanation of CLAIRE, though vague on detail in regard to machine learning algorithms, platform, pricing and so on – but it is early days.

Enterprise-wide Management for a Hybrid World
Informatica Intelligent Cloud Services is built to enable enterprises to run complex hybrid environments. It provides operational insights delivered through a single dashboard for monitoring and managing all data management clouds and products and their data. Customers benefit from easy connectivity to all cloud, on-premise and big data sources across the enterprise using pre-built Informatica connectors.
MyPOV – We live in a world of multi-cloud and hybrid systems – integrations in general and MDM in particular need to span across them – so it is good to see Informatica providing a single pane of glass and tools on a common platform.

Cloud with Industry-leading Security and Trust
Informatica Intelligent Cloud Services is built for the enterprise with security as a core design principle. It has the following certifications and standards for industry leading security:
AICPA SOC 2 Type 2 and SOC 3 attestations.
Externally audited HIPAA compliance.
ISO 27000-aligned Information Security Management System, and EU-US Privacy Shield and compliance security program.
Member of Cloud Security Alliance and Salesforce AppExchange certified.
MyPOV – Security remains the top – or one of the top 3 – concerns of cloud users, so it is good that Informatica seeks security certifications that help address these concerns.
 

Overall MyPOV

Informatica is in the transition from on-premises to the cloud; what started gingerly a year ago at INFA16 is now in full swing and available at INFA17. And a cloud platform brings new capabilities, e.g. in the area of Machine Learning – and as we are in the ‘buzzword AI’ age, the AI assistants are coming. Good to see CLAIRE being launched; we have to understand a little better what she does, where she lives, how she is educated, and what other needs she has (thanks for staying with the metaphor).

On the concern side, Informatica uses some older vocabulary, and vocabulary is also an indicator of thinking. Metadata was the gold in integration mechanisms of the past. Take a look – to keep it non-contentious – at Google Photos: users do not flag or tag pictures anymore; Machine Learning does. That there is a tagging (or metadata) repository underneath it – of course – but users never see it. It isn’t even exposed. Whether integration vendors will go that far remains to be seen – but with the rise of AI, CLAIRE will take care of the metadata; Informatica users should not have to know, see or talk about it – just use it the same way people use Google Photos. A high ask, but with a big prize.

Overall a good INFA17: Informatica is firing on all cylinders, the PE investment does not seem to hinder execution, and the vendor is moving fast – which is great news for customers and prospects, and for enterprises in general, as the integration problems only get bigger for the foreseeable future.
 

 
 

A hidden message from Ed Snowden



The KNOW Identity Conference in Washington DC last week opened with a keynote fireside chat between tech writer Manoush Zomorodi and Edward Snowden. 

Once again, the exiled security analyst gave us a balanced and nuanced view of the state of security, privacy, surveillance, government policy, and power.  I have always found him to be a rock-solid voice of reason. Like most security policy analysts, Snowden sees security and privacy as symbiotic: they can be eroded together, and they must be bolstered together. When asked (inevitably) about the “security-privacy balance”, Snowden rejects the premise of the question, as many of us do, but he has an interesting take, arguing that governments tend to surveil rather than secure.  

The interview was timely, for it gave Snowden the opportunity to comment on the “WannaCry” ransomware episode which affected so many e-health systems recently.  He highlighted the tragedy that cyber weapons developed by governments keep leaking and falling into the hands of criminals.

For decades, there has been an argument that cryptography is a type of “Dual-Use Technology”; like radio-isotopes, plastic explosives and supercomputers, it can be used in warfare, and thus the NSA and other security agencies try to include encryption in the “Wassenaar Arrangement” of export restrictions.  The so-called “Crypto Wars” policy debate is usually seen as governments seeking to stop terrorists from encrypting their communications.  Even if crypto export control worked, it doesn’t address security agencies’ carelessness with their own cyber weapons.

But identity was the business of the conference. What did Snowden have to say about that?

  • Identifiers and identity are not the same thing.  Identifiers are for computers but “identity is about the self”, to differentiate yourself from others.
  • Individuals need names, tokens and cryptographic keys, to be able to express themselves online, to trade, to exchange value.
  • “Vendors don’t need your true identity”; notwithstanding legislated KYC rules for some sectors, unique identification is rarely needed in routine business. 
  • Historically, identity has not been a component of many commercial transactions.
  • The original Web of Trust, for establishing a level of confidence in people through mutual attestation, was “crude and could not scale”. But new “programmatic, frictionless, decentralised” techniques are possible.
  • He thought a “cloud of verifiers” in a social fabric could be more reliable, to avoid single points of failure in identity.

When pressed, Snowden said actually he was not thinking of blockchain (and that he saw blockchain as being specifically good for showing that “a certain event happened at a certain time”).  

Now, what are identity professionals to make of Ed Snowden’s take on all this? 

For anyone who has worked in identity for years, he said nothing new, and the identerati might be tempted to skip Snowden. On the other hand, in saying nothing new, perhaps Snowden has shown that the identity problem space is fully defined. 

There is a vital meta-message here.

In my view, identity professionals still spend too much time in analysis.  We’re still writing new glossaries and standards.  We’re still modelling. We’re still working on new “trust frameworks”.  And all for what?  Let’s reflect on the very ordinariness of Snowden’s account of digital identity.  He’s one of the sharpest minds in security and privacy, and yet he doesn’t find anything new to say about identity. That’s surely a sign of maturity, and that it’s time to move on.  We know what the problem is: What facts do we need about each other in order to deal digitally, and how do we make those facts available?

Snowden seems to think it’s not a complicated question, and I would agree with him.

 

 


Roundup Of Cloud Computing Forecasts, 2017


  • Cloud computing is projected to increase from $67B in 2015 to $162B in 2020 attaining a compound annual growth rate (CAGR) of 19%.
  • Gartner predicts the worldwide public cloud services market will grow 18% in 2017 to $246.8B, up from $209.2B in 2016.
  • 74% of Tech Chief Financial Officers (CFOs) say cloud computing will have the most measurable impact on their business in 2017.
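
The growth figures in these bullets are simple compound-growth claims and can be sanity-checked in a few lines (a quick sketch; the `cagr` helper is my own naming, not from any of the cited reports):

```python
def cagr(start: float, end: float, years: float) -> float:
    """Compound annual growth rate as a fraction (0.19 == 19%)."""
    return (end / start) ** (1 / years) - 1

# $67B (2015) to $162B (2020) spans five growth years:
print(f"{cagr(67, 162, 2020 - 2015):.1%}")  # 19.3% - consistent with the cited 19% CAGR

# Gartner: $209.2B (2016) to $246.8B (2017) is one year of growth:
print(f"{cagr(209.2, 246.8, 1):.1%}")       # 18.0% - consistent with the cited 18%
```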

Cloud platforms are enabling new, complex business models and orchestrating more globally-based integration networks in 2017 than many analyst and advisory firms predicted. Combined with Cloud Services adoption increasing in the mid-tier and small & medium businesses (SMB), leading researchers including Forrester are adjusting their forecasts upward. The best check of any forecast is revenue.  Amazon’s latest quarterly results released two days ago show Amazon Web Services (AWS) attained 43% year-over-year growth, contributing 10% of consolidated revenue and 89% of consolidated operating income.

Additional key takeaways from the roundup include the following:

  • Wikibon is predicting enterprise cloud spending will grow at a 16% compound annual growth rate (CAGR) between 2016 and 2026. The research firm also predicts that by 2022, Amazon Web Services (AWS) will reach $43B in revenue, and be 8.2% of all cloud spending. Source: Wikibon report preview: How big can Amazon Web Services get?
Wikibon Worldwide Enterprise IT Projection By Vendor Revenue


Rapid Growth of Cloud Computing, 2015–2020


Worldwide Public Cloud Services Forecast (Millions of Dollars)


  • By the end of 2018, spending on IT-as-a-Service for data centers, software and services will be $547B. Deloitte Global predicts that procurement of IT technologies will accelerate in the next 2.5 years from $361B to $547B. At this pace, IT-as-a-Service will represent more than half of IT spending by the 2021/2022 timeframe. Source: Deloitte Technology, Media and Telecommunications Predictions, 2017 (PDF, 80 pp., no opt-in).
Deloitte IT-as-a-Service Forecast


  • Total spending on IT infrastructure products (server, enterprise storage, and Ethernet switches) for deployment in cloud environments will increase 15.3% year over year in 2017 to $41.7B. IDC predicts that public cloud data centers will account for the majority of this spending (60.5%) while off-premises private cloud environments will represent 14.9% of spending. On-premises private clouds will account for 62.3% of spending on private cloud IT infrastructure and will grow 13.1% year over year in 2017. Source: Spending on IT Infrastructure for Public Cloud Deployments Will Return to Double-Digit Growth in 2017, According to IDC.
Worldwide Cloud IT Infrastructure Market Forecast


  • Platform-as-a-Service (PaaS) adoption is predicted to be the fastest-growing sector of cloud platforms according to KPMG, growing from 32% in 2017 to 56% adoption in 2020. Results from the 2016 Harvey Nash / KPMG CIO Survey indicate that cloud adoption is now mainstream and accelerating as enterprises shift data-intensive operations to the cloud.  Source: Journey to the Cloud, The Creative CIO Agenda, KPMG (PDF, no opt-in, 14 pp.)
Cloud investment by type today and in three years


AWS Segment Financial Comparison


  • In Q1, 2017 AWS generated 10% of consolidated revenue and 89% of consolidated operating income. Net sales increased 23% to $35.7 billion in the first quarter, compared with $29.1 billion in first quarter 2016. Source: Cloud Business Drives Amazon’s Profits.
Comparing AWS' Revenue and Income Contributions


  • RightScale’s 2017 survey found that Microsoft Azure adoption surged from 26% to 43%, with AWS adoption increasing from 56% to 59%. Overall Azure adoption grew from 20% to 34% of respondents, reducing the AWS lead, with Azure now reaching 60% of the market penetration of AWS. Google also increased adoption from 10% to 15%. AWS continues to lead in public cloud adoption (57% of respondents currently run applications in AWS), a number that has stayed flat since 2015. Source: RightScale 2017 State of the Cloud Report (PDF, 38 pp., no opt-in)
Public Cloud Adoption, 2017 versus 2016

  • Global Cloud IT market revenue is predicted to increase from $180B in 2015 to $390B in 2020, attaining a Compound Annual Growth Rate (CAGR) of 17%. In the same period, SaaS-based apps are predicted to grow at an 18% CAGR, and IaaS/PaaS is predicted to increase at a 27% CAGR. Source: Bain & Company research brief The Changing Faces of the Cloud (PDF, no opt-in).
60% of IT Market Growth Is Being Driven By The Cloud

  • 74% of tech Chief Financial Officers (CFOs) say cloud computing will have the most measurable impact on their business in 2017. Additional technologies expected to have a significant financial impact in 2017 include the Internet of Things, artificial intelligence (16%), and 3D printing and virtual reality (14% each). Source: 2017 BDO Technology Outlook Survey (PDF, no opt-in).
CFOs say cloud investments deliver the greatest measurable impact

Cloud investments are fueling new jobs throughout Canada

  • APIs are enabling persona-based user experiences in a diverse base of cloud enterprise apps. As of today there are 17,422 APIs listed on the Programmable Web, with many enterprise cloud apps concentrating on subscription, distributed order management, and pricing workflows. Sources: Bessemer Venture Partners State of the Cloud 2017 and 2017 Is Quickly Becoming The Year Of The API Economy. The following graphic from the latest Bessemer Venture Partners report illustrates how APIs are now the backbone of enterprise software.
APIs are fueling a revolution in cloud enterprise apps


Key Takeaways from Salesforce's Q1

Constellation Insights

While Salesforce dipped back into the red in its first quarter, after posting modest profits in the previous few, the company remains on a serious growth tear with revenue up 25 percent year-over-year. The full numbers are available here, but as usual, we will go through the earnings conference call and pull out the most relevant pieces of information and perspective (H/T to Seeking Alpha for the call transcript).

Einstein, the Straight Shooter: Salesforce's take on AI in the enterprise is Einstein, built from a collection of homegrown and acquired technologies. It's been pushing out Einstein capabilities across its various clouds. But for now, at least, one Einstein component, Guidance, remains in-house. Salesforce CEO Marc Benioff gave a preview of it on the call:

[W]e then have a piece of Einstein now that we've not yet rolled out to our customers called Einstein Guidance. So this is a capability that I use with my staff meeting, when I do my forecast and when I do my analysis of the quarter, which happens every Monday at my staff meeting like a lot of CEOs do, it's a very typical process, of course, we have our top 20 or 30 executives around the table. We talk about different regions, different products, different opportunities. And then I ask one other executive their opinion and that executive is Einstein. And I will literally turn to Einstein in the meeting and say, "Okay, Einstein, you've heard all of this. Now what do you think?"

And Einstein will give me the over and under on the quarter and show me where we're strong and where we're weak and sometimes will even point out a specific executive, which has done in the last three quarters and said that this executive is somebody who needs specific attention during the quarter. ... I think for a CEO, typically the way it works is, of course, you have various people, mostly politicians and bureaucrats, in your staff meeting who are telling you what they want to tell you to kind of get you to believe what they want you to believe. Einstein comes without bias.

Industries Going All In: Salesforce rolled out a vertical strategy several years ago, and president Keith Block gave a progress update on the call. Ten of the 15 largest telcos, eight of the United States' 10 largest retailers, and nine of the top 10 wealth management firms "rely" on Salesforce, he said. In the public sector, Block reported expanded relationships with the U.S. Army and Air Force, as well as a new deal with the state of Florida centered on tourism.

Salesforce As Digital Transformation Driver: Benioff noted on the call that acquisitions such as SteelBrick for CPQ (configure, price, quote) capabilities are helping drive new Sales Cloud deals, particularly into existing customers. Overall, Salesforce is capturing a bigger piece of IT spending now and this is positioning it as a key player for digital transformation projects, Block said:

[W]e're able to say, "Okay, now the walls between sales, service and marketing are coming down." So now we have an opportunity to provide a 360-degree view of the customer with service and with marketing. And take that now one step further, as we've moved from systems of record to systems of engagement to now systems of intelligence. ... So it's an expansion of our capabilities and our opportunity to drive transformation with these customers.

 

Amazon Web Services Is Salesforce's 'Best Friend': Salesforce is moving some of its workloads to Amazon Web Services, a move that will help it shave costs and expand its global footprint. But the alignment seems much closer than a financial transaction, judging by this remark from Benioff:

I think at Salesforce, we really strongly believe that the enemy of my enemy is my friend, and I think that makes Amazon Web Services our best friend. 

As for that enemy? Benioff didn't actually say the word "Oracle," but he may as well have.


Google I/O: The Key Enterprise Takeaways

Constellation Insights

Google's I/O developer conference kicked off this week and as in past years, it generated a lot of news spanning both consumer and enterprise-oriented scenarios (and of course, in some cases that line is a bit blurry). Here's a look at the top takeaways from the event's announcements for CXOs to consider.

Not Mobile First, AI First

Artificial intelligence has been the hottest trend in tech for some time now, and fittingly was the dominant focus of I/O. Put simply, Google wants to dominate the AI discussion and is making major moves to succeed in doing so.

It introduced Google.ai, which ties together all of its AI efforts in one place. It's aimed at private companies, individual developers, and academics, and will focus on Google's AI research, tools and applied AI.

Google CEO Sundar Pichai unveiled a new project called AutoML, a neural network capable of designing neural networks. This notion has been a holy grail of sorts in the AI field, and Pichai says Google will make major strides on the relatively near horizon, writing in a blog post:

We hope AutoML will take an ability that a few PhDs have today and will make it possible in three to five years for hundreds of thousands of developers to design new neural nets for their particular needs. 

AI is informing how Google evolves its products in a fundamental way, and the shift applies to the tech industry as a whole, Pichai added.

We are now witnessing a new shift in computing: the move from a mobile-first to an AI-first world. ... Think about Google Search: it was built on our ability to understand text in webpages. But now, thanks to advances in deep learning, we’re able to make images, photos and videos useful to people in a way they simply haven’t been before. Your camera can “see”; you can speak to your phone and get answers back—speech and vision are becoming as important to computing as the keyboard or multi-touch screens.  

There is still a long way to go before we are truly an AI-first world, but the more we can work to democratize access to the technology—both in terms of the tools people can use and the way we apply it—the sooner everyone will benefit.

Google has its own commercial considerations for AI, of course, going beyond its core products. It is hoping to make Google Cloud Platform the go-to place for developing bespoke AI applications. Google's secret sauce for accomplishing that is its TPUs (Tensor Processing Units), specialized chips designed for machine learning workloads.

The first generation of TPUs was introduced last year but focused on running Google's existing machine learning models more efficiently. Pichai announced that the second generation of the chips, dubbed Cloud TPUs, will be offered through Google Compute Engine later this year. Cloud TPUs not only run existing models but can train new ones.

Google is also clustering the TPUs into what it calls "pods," which provide huge performance gains over past approaches. A new large-scale translation model once required a full day of training on 32 high-end GPUs but now accomplishes the same thing "in an afternoon using just one eighth of a TPU pod," according to Google. TPUs will work in conjunction with TensorFlow, the machine learning framework Google open-sourced in 2015 to considerable success.

The company is running an alpha program for the TPUs and is also introducing the TensorFlow Research Cloud, which will make 1,000 TPUs available to researchers from both private industry and academia if they're willing to give back contributions to the open-source community.
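The workloads TPUs accelerate boil down to dense linear algebra. As a purely illustrative sketch (not Google's implementation, and requiring no special hardware), here is the naive matrix multiply that a single neural-network layer performs, and that frameworks like TensorFlow dispatch to chips like TPUs:

```python
# Illustrative only: the dense matrix multiply at the heart of neural-network
# workloads. Real frameworks dispatch this operation to specialized hardware
# such as GPUs and TPUs; this naive pure-Python version just shows the math.

def matmul(a, b):
    """Multiply matrix a (m x n) by matrix b (n x p), both lists of lists."""
    n = len(b)       # inner dimension shared by a's rows and b's columns
    p = len(b[0])
    return [[sum(row[k] * b[k][j] for k in range(n)) for j in range(p)]
            for row in a]

# A single dense layer is essentially: activations = inputs x weights
inputs = [[1.0, 2.0]]          # one example with two features
weights = [[0.5, -1.0, 0.25],  # hypothetical 2 x 3 weight matrix
           [1.5, 0.5, -0.5]]
print(matmul(inputs, weights))  # -> [[3.5, 0.0, -0.75]]
```

Training repeats this operation billions of times, which is why dedicated silicon and TPU "pods" translate directly into the speedups Google cites.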

Instant Apps Go GA

First announced at last year's I/O conference, Android Instant Apps are now out of preview and available to all developers. Instead of making users download and install an app, Instant Apps actually stream to devices from Google Play. Later on, users can decide to install them permanently.

Naturally, Instant Apps aren't as powerful as installed ones, which have deeper access to the device, but they do include useful capabilities such as payments and location.

Instant Apps provide a middle ground between websites and full-featured apps. That's a useful tool for enterprises to have in the toolbox, whether for internal users or outreach to customers. Constellation Research VP and principal analyst Holger Mueller noted earlier this year that Instant Apps have security advantages, since installed apps would involve MDM (mobile device management) issues.

The question now is how much momentum Google can build for Instant Apps out of the gate. Instant Apps capabilities will ship with Android O, the next version of the mobile OS, but will also be compatible with previous versions—a must, given the rampant fragmentation in the Android ecosystem. 

Enterprises should take a look at how Instant Apps can fit into their overall mobility, marketing and internal IT strategies. Beyond the potential use cases, Instant Apps give IT leaders a new way to balance development resources; currently supported full-blown apps could be replaced with lighter-touch Instant Apps requiring less overhead for IT.

Google Steps Toward HR with Google for Jobs

There has been much speculation about which directions Google will head in the enterprise application market since the arrival of former VMware head Diane Greene as SVP of cloud. While it's not clear that Greene's fingerprints are on it, a new Google service called Google for Jobs brings the company into the orbit of HR and HCM software. Pichai described the new service in a blog post:

[A]lmost half of U.S. employers say they still have issues filling open positions. Meanwhile, job seekers often don’t know there’s a job opening just around the corner from them, because the nature of job posts—high turnover, low traffic, inconsistency in job titles—have made them hard for search engines to classify. Through a new initiative, Google for Jobs, we hope to connect companies with potential employees, and help job seekers find new opportunities.

As part of this effort, we will be launching a new feature in Search in the coming weeks that helps people look for jobs across experience and wage levels—including jobs that have traditionally been much harder to search for and classify, like service and retail jobs. 

Google has already worked with companies such as LinkedIn and Glassdoor to integrate them with Google for Jobs. What will be interesting to watch for are potential partnerships down the road with enterprise HR and HCM vendors. 

 


SAP Leonardo and Digital Transformation: The Strategic Implications

Billed as a "digital innovation system," SAP Leonardo was the central focus at the company's massive annual confab in Orlando this week, which speaks volumes about its perceived importance to the firm's overall vision. SAP is clearly betting Leonardo is the right mix of technologies, tools, patterns, and services to help large companies deliver quickly on the latest high-impact digital capabilities, beyond the fundamentals in the increasingly commoditized public cloud industry. The specific capabilities targeted by Leonardo include technologies such as the Internet of Things, machine learning, analytics, blockchain, and big data, as well as key enabling approaches like design thinking and tapping into strategic pools of shared intelligence.

You can get a clearer sense of the specifics of the Leonardo announcement from my industry colleagues Chris Kanaracus and Andy Mulholland, as well as Holger Mueller.

Most organizations today badly need a leg up to close their digital gap, and Leonardo is very much aimed at this target. As I've noted on ZDNet recently, the data leaves no doubt that most organizations are under severe pressure this year to digitally adapt much faster than they have been. The prime reason is that technology change continues to speed up, and the distance between what enterprises need to do to be digitally competitive and what they're actually doing is growing. This has turned into a vital existential issue: the number of companies that have encountered end-of-life events in recent years has fueled the dire predictions about the curtailed longevity of the large enterprise cited in so many conferences and articles these days. SAP Leonardo has the potential to help with these strategic challenges across the board and become an engine for digital transformation.

SAP Leonardo: Digital Business Enabler

The Strategic Motivation for Systems like Leonardo

Yet organizations need much more depth for their digital transformation efforts than the high-level blueprints or non-enterprise-class cloud offerings that most vendors are currently promoting. At issue -- and where Leonardo offers a specific solution geared for true enterprise use -- is the ever-more arduous task of successfully navigating, then executing on, the fast-growing number of modern digital capabilities expected today. Staying on the technology treadmill has grown very difficult for the typical organization to manage directly. The reasons abound: with IT budgets relatively flat over the last few years, the incoming technology portfolio is proving much too large and costly for all but the very largest players to absorb with home-grown skills, infrastructure, platforms, and applications, given the available talent, scope, and resources.

Worse yet, unless organizations have already been investing at scale to develop native expertise in the complex and sophisticated capabilities of emerging enterprise tech (public cloud stacks, cognitive technologies, connected-device hardware/software, open ledger stacks a la blockchain, to name just a few), they're already years behind, never mind figuring out how best to combine these technologies into a cohesive and mature set of digital business and customer experience offerings. They need a way to catch up with and sustain the exponential tech growth curve.

Instead, for many organizations, what's really needed are ready-to-use platforms that combine all these digital services into proven capabilities refined and matured across many customers. This is exactly what SAP, Salesforce, and many others did back in the day with ERP and CRM, functions enterprises were largely happy to delegate in exchange for best practices and economies of scale. It appears to be the same route that most will now have to take to deliver on the promise of today's next generation of enterprise technologies.

All of this is why few CIOs or CDOs will be rolling their own target platforms for digital transformation. Instead, they will curate a small but top-flight set of next-generation digital business platforms, of which Leonardo is a prime example, from market leaders, and then realize their specific digital business strategies upon them. I've previously included the SAP Cloud Platform on my informal short list of digital transformation target platforms, and Leonardo is an important extension of this platform aimed at a) enabling contemporary digital business -- which will ultimately represent the majority of revenue generation for companies within 10 years -- and b) representing SAP's product suite for the latest strategic enterprise technologies.

Digital Business Enablement: Leonardo Goes Up the Stack

Leonardo clearly shows how what I call digital transformation target platforms are dividing into two halves as a category: a) enterprise-class Platform-as-a-Service (PaaS) offerings that provide foundational cloud capabilities like compute, data, and storage, as well as the standard applications related to the ERP suite, and b) new higher-order digital capabilities that, yes, improve those same standard applications, but more significantly allow them to be crafted into transformative new digital business services and business models that will sustain enterprises into the future.

Examples of the higher order business functions that Leonardo can enable include helping enterprises convert unconnected market offerings into data-driven connected products (IoT), applying blockchain in specific industries (financial services, healthcare, real estate, supply chain, etc.), and using machine learning/artificial intelligence to generate timely strategic insights to significantly improve the customer experience. Thus, Leonardo is decidedly in the latter category of target platform, which SAP Chief Digital Officer Jonathan Becher told me at SAPPHIRE NOW this week is based upon and entirely operates within the SAP Cloud Platform, which itself fits squarely in the first category. In short, Leonardo is a higher order digital business platform, while SAP Cloud Platform provides the core foundation.

Key Leadership Takeaways for SAP Leonardo

What then are the strategic implications of SAP Leonardo that business and IT leaders should know? Here are the key takeaways:

  • Leonardo can shorten the time needed to execute on a more ambitious digital business vision. How much remains to be seen, but SAP is betting that by having Leonardo's capabilities already fit together and proven by initial early adopter customers (currently Trenitalia, Nissan, and Stara), it can help enterprise customers simultaneously take on a more expansive digital vision while adopting the new technologies Leonardo offers much faster. Just as significantly, SAP is touting industry-specific accelerators for Leonardo. The company already plays in 25 major industries today, as it reported at SAPPHIRE NOW this week, but is focusing its first Leonardo accelerators on the retail, consumer products, manufacturing, sports, and entertainment industries. Other industry accelerators are coming. SAP claims these accelerators, which have much of the domain knowledge for an industry already baked into a reference model, can boost the execution speed of Leonardo-based solutions by up to a substantial 50%. Leadership takeaway: If these expectations are borne out, Leonardo can potentially contribute to a vital first-mover advantage in digital's increasingly winner-takes-all world.
  • Leonardo goes beyond technology, applying its system in key ways that are more likely to create competitive advantage. Unusually among digital platforms (for lack of a better word, as SAP calls it a 'system'), Leonardo's promotional materials prominently list design thinking and something SAP calls data intelligence among its central capabilities. Design thinking is an increasingly popular -- some would say trendy -- way of developing more people-centric solutions by employing empathy for the problem space. Hard data on design thinking's effectiveness is still in short supply, but most reasonable people believe the practice is more likely to produce better digital experiences than older methods. As digital customer experience becomes the market differentiator (and a major force for value creation), SAP anticipates that featuring design thinking in the application of Leonardo to enterprises' digital business needs will ensure its technologies are more aptly and effectively applied. Data intelligence is touted by SAP as a way to "extract insights from a large network of anonymized data." Done right, this allows Leonardo customers to use private pools of otherwise hard-to-obtain strategic data to create a distinctly unique digital business offering. As controlling or having access to vital, hard-to-replicate sources of data has steadily become a sustainable competitive advantage over the last decade, this lets Leonardo offer a hard-to-copy value proposition that goes beyond technology. Leadership takeaway: SAP is hoping these capabilities will offer significant advantage well beyond basic technology modernization, as Leonardo provides strategic design and data services that can impart real, differentiated business advantage, at least against those not already using the system.
  • There is a risk of vendor lock-in, but the platform underlying Leonardo is multi-cloud capable. Along with the advantages of Leonardo (speed to market, vital emerging tech, competitive differentiation in design and data), there is the issue of vendor lock-in and one-size-fits-all offerings. The CIOs I speak with regularly these days are concerned about the tendency of the largest vendors to offer an exhaustive technology and business applications stack, making customers beholden to vendors like never before (by putting most or all of their eggs in one vendor basket) while also making it hard to differentiate in the market without very extensive and time-consuming customizations. Certainly, there are advantages as well: a vendor being a one-stop shop often means the product pieces fit together better, and support and management are easier. Leonardo's strategically significant nature certainly extends the company's value chain more deeply into the enterprise than ever and will make SAP customers more reliant on them. Yet I spoke at SAPPHIRE NOW with several customers running alternative clouds like Microsoft Azure, with SAP running on top, and reporting very good results. As long as SAP maintains this kind of friendliness toward hybrid vendor configurations, letting enterprises mix and match their cloud infrastructure with important new higher-order platforms and systems, the risk is lower. Leadership takeaway: Require long-term multi-cloud flexibility from SAP (and all vendors) in your contracts, and verify Leonardo supports your planned strategic vendor mix before adopting.

There was a good amount of discussion this week at SAPPHIRE NOW about the composition of Leonardo and whether it hangs together as a workable product. Overall, I believe the vision makes sense, even if the messaging, which puts technologies and design/data approaches at the same level of abstraction, is a little unclear; I believe this will be sorted out. The bottom line is that Leonardo offers a big enough umbrella that it's clearly intended for enterprises that want to step into their overall digital future, whatever that may be. In practical terms, Leonardo offers a go-to-market vehicle and approach for companies that want to meaningfully adopt Internet of Things, machine learning, and blockchain technologies in an overall approach that makes sense.

Longer term, the main question will be whether the elements of Leonardo truly fit together so advantageously as described, if SAP continues to incorporate important emerging technologies into it, and if some of the more interesting components, like industry accelerators, get fully fleshed out to critical mass. For now, I advise enterprises to consider Leonardo earnestly (taking care not to overcommit prematurely) for use in their digital transformation plans and experiments, especially if they are already SAP Cloud Platform customers.

Related Links

SAP Leonardo

SAP Cloud Platform

Sapphire 2017: Keynote By SAP CEO Bill McDermott and his Guests Announcing SAP Leonardo

Introducing the New SAP Leonardo: Empowering Companies to Digitally Transform at Scale


How to avoid the blockchain weeds

After spending two years researching blockchain and the evolution of advanced ledger technologies, I still find a great spread of understanding across my clients and business at large about blockchain.  While ledger superpowers like Hyperledger, IBM, Microsoft and R3 are emerging, there remains a long tail of startups trying to innovate on the first generation public blockchains. Most of the best-selling blockchain books confine themselves to Bitcoin, and extrapolate its apparent magic into a dizzying array of imagined use cases.  And I’m continuously surprised to find people who are only just hearing about blockchain now. 

For all sorts of specialists, it can seem that everyone is talking about blockchain and ledger technologies, but the truth is most people are not yet up to speed.  No one should be shy to ask what blockchain is really all about. 

Many blockchain primers and infographics dive into the cryptography, trying to explain to lay people how “consensus algorithms”, “hash functions” and digital signatures all work.  In their enthusiasm, they can speed past the fundamental question of what blockchain was really designed to do.  I’ve long been worried about a lack of critical thinking around blockchain and the activity it’s inspired. If you want to be able to pick the wheat from the chaff in this area, you really only need to know what blockchain does, and not how it does it.
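To make "what blockchain does" concrete without any cryptographic detail, here is a toy sketch of my own (not any production ledger protocol): each record's hash covers its predecessor, so any tampering with history is detectable.

```python
# Toy illustration of the core blockchain idea: each block's hash covers the
# previous block's hash, so altering any past record invalidates every block
# after it. A teaching sketch only, not a real consensus or ledger system.
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 hash of a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, data):
    """Append a record whose prev_hash field commits to the chain so far."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"data": data, "prev_hash": prev})

def verify(chain):
    """Check every block's prev_hash matches the actual hash of its predecessor."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
append_block(chain, "Alice pays Bob 5")
append_block(chain, "Bob pays Carol 2")
print(verify(chain))                     # True
chain[0]["data"] = "Alice pays Bob 500"  # tamper with history
print(verify(chain))                     # False
```

Everything else in real systems (consensus algorithms, digital signatures, mining) exists to decide who may append and to keep many copies of such a chain in agreement; the tamper-evidence itself is this simple.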

So I've tried to write a fresh and uncomplicated explanation of what blockchain can do and what it cannot do. You can download the report Blockchain Explained in Plain English here.

 


Teradata Transition to Cloud and Consulting Continues

Teradata simplifies pricing, executes on business consulting and hybrid cloud strategy. A look at next steps in the company’s ongoing transition.

“Business outcome led, technology enabled.” This was the theme at the May 8-10 Teradata Third-Party Influencers Summit in San Diego, and it reflected a two-to-one ratio of consulting-oriented presentations to technology updates.

Teradata has been expanding already robust consulting and implementation offerings in part because mass migrations to cloud computing and open-source big data platforms like Hadoop have reduced demand for Teradata’s on-premises racks and appliances for data warehousing. Even as data volumes have continued to grow exponentially, Teradata’s revenues have declined in recent years from a high of $2.7 billion in 2014 to $2.3 billion in 2016.

Teradata compared its old (at left) and new (at right) pricing scheme and cloud managed services options at its May Influencers Summit.

Last year’s Summit was held shortly after the company replaced its CEO, announced plans to sell off its Aprimo marketing business unit, and introduced a more aggressive path to cloud and consulting services. At this year’s Summit we learned that Teradata has not only executed on that strategy, it has gone further to transform itself by pursuing simplicity, flexibility and control in four areas:

Pricing: Responding to feedback that its licensing approach was too complex, with too many licensing models and too many a la carte options, Teradata has devised a consistent, subscription-based licensing approach that will apply on-premises or in private or public clouds. The model is based on two dimensions: T Cores and Tiers. T Cores measure compute cores and disk I/O, but there are discounts if you’re using less than maximum input/output capacity.

The four tiers reflect how capacity is being used, ranging from the free Developer tier to the progressively more feature-rich (and more costly) Base (simple production), Advanced (production with mixed workloads), and Enterprise (mission-critical enterprise workloads) tiers. The pricing is designed to be simple, predictable and consistent, with no penalties for choosing or moving between on-premises, private cloud or public cloud deployment. What's more, pricing is more aggressive, with the Base tier taking on cloud rivals like Amazon Redshift.
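Teradata has not published a pricing formula, so the following is a purely hypothetical sketch of how a two-dimensional scheme (T Cores times tier, with a discount for running below maximum I/O capacity) could be computed. All names and numbers here are invented for illustration only.

```python
# Hypothetical illustration only: Teradata has not disclosed its actual formula.
# This sketches how a two-dimensional subscription scheme (capacity units x
# feature tier, discounted for throttled I/O) could work. Numbers are invented.

TIER_MULTIPLIER = {       # invented multipliers for the four tiers
    "developer": 0.0,     # free tier
    "base": 1.0,          # simple production
    "advanced": 1.5,      # mixed workloads
    "enterprise": 2.0,    # mission-critical workloads
}

def subscription_price(t_cores, tier, io_fraction=1.0, unit_price=100.0):
    """Price = T Cores x tier multiplier x unit price, discounted when the
    system runs at less than maximum I/O capacity (io_fraction in (0, 1])."""
    if not 0.0 < io_fraction <= 1.0:
        raise ValueError("io_fraction must be in (0, 1]")
    return t_cores * TIER_MULTIPLIER[tier] * unit_price * io_fraction

print(subscription_price(10, "base"))                         # 1000.0
print(subscription_price(10, "enterprise", io_fraction=0.5))  # 1000.0
```

The point of such a model is that the same formula applies on-premises or in any cloud, which is exactly the deployment neutrality Teradata is promising.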

Portfolio: Where Teradata previously offered as many as nine systems in its portfolio, it now offers just two. IntelliFlex, the company's new flagship, separates storage and compute decisions to support multiple workloads within a single rack. Customers can add different types of storage and compute nodes, ranging from archival retrieval to the ultimate in in-memory query performance. Customers can also add capacity in smaller increments than previously available, and they can quickly reconfigure as needs change.

IntelliBase is Teradata's entry-level appliance, costing approximately 15% more than commodity hardware. It is designed for more balanced data warehouse workloads and is not as flexible as IntelliFlex, which can be reconfigured to address high-I/O or high-CPU requirements.

Cloud: Teradata has made over its managed cloud services and recast them as IntelliCloud. The offering combines the new T-Core- and Tier-based pricing scheme with three flexible infrastructure options behind the cloud services. Teradata previously offered only appliance-grade (2800 series) capacity behind its services, but you can now choose IntelliFlex or IntelliBase as the platform for managed services in the Teradata Cloud, which has data centers in Las Vegas and Frankfurt. The third option is Teradata-managed services running on carefully selected infrastructure services in the Amazon cloud (and, later this year, Microsoft's Azure cloud). Consumption options are more elastic with the public cloud options, but they won't be as performant as IntelliFlex-based capacity, and service-level agreements aren't available because Teradata has no control over the infrastructure. The intent is to give customers choice, with a fourth option being bring-your-own-license, managing Teradata Database on AWS or Azure yourself.

Consulting: Teradata has consolidated its growing consulting offerings under the Teradata Global Services umbrella, and it has formalized three service lines to avoid overlaps and confusion. Think Big Analytics, the big data consulting business Teradata acquired in 2014, continues as the business-outcome-focused unit, offering industry-focused expertise in data science, data visualization and big data solutions. Enterprise Data Consulting focuses on technology, offering expertise in architecture, data management, data governance, security and services. Customer Services helps customers get the most out of their systems and people, applying proactive and reactive expertise in systems and software management and change management.

MyPOV on Teradata’s Ongoing Transformation

Disruptive market forces have dealt Teradata a tough hand to play. There's clearly disillusionment with complex open source platforms like Hadoop these days, but that doesn't mean we're going back to Teradata's heyday of enterprise data warehousing. Companies are still pursuing high-scale data lake approaches on low-cost, distributed platforms, whatever flavor prevails (whether that's HDFS, object stores like S3 and Azure Data Lake, or the next open source fad). Companies will also continue to rationalize their comparatively high-cost data warehousing infrastructure expenditures.

Teradata acknowledged last year that we’re in a “post-relational world,” but this year’s Summit shows signs that it’s truly adapting to a changed market. The company has not only delivered far more flexible hardware, it has gone further with the simplified, hybrid subscription-based pricing and more flexible cloud-deployment options.

Teradata is becoming more of a software and services company and less of a hardware vendor. That shift should eventually improve profitability, even if revenues continue to slide as deployments shift to the cloud.

Will customers trust Teradata to provide impartial, "business outcome led" consulting services? Gerhard Kress, Director of Data Services at Siemens, said he chose Teradata in 2013 in large part because "the company understands that the world is a lot bigger than Teradata." Kress presented at the Summit on the train manufacturer's global IoT deployment, and he noted that other vendors (mostly big platform vendors) asserted that they could address all challenges within their stacks. Teradata, meanwhile, suggested a heterogeneous approach reflecting technologies already in place at Siemens.

This "Blended Architecture" slide, from a Teradata Ecosystem Architecture Services presentation, captures the vendor's realistic sense of its place within enterprise environments.

Teradata has also become more realistic about its cloud ambitions. Two years ago Teradata talked about pursuing midsize businesses with the Teradata Database on AWS service. At this year’s Summit Teradata said it’s no longer pursuing that idea. Instead, executives said the company is focused on the needs of the 500 highest-scale and most sophisticated customers. That’s where Teradata’s technology really shines.

Teradata thrived in the past when it focused on delivering data-driven business outcomes at the top of the market. It appears that focus is back.


Examining Leonardo: SAP's 'Digital Innovation System'

Constellation Insights

In January, SAP moved to bundle its offerings for IoT under the brand name Leonardo. Jump forward just a few months to this week's Sapphire Now conference, and the vision has already grown much broader.

What you might call 'Leonardo 2.0' now encompasses IoT, machine learning, analytics, big data, design thinking services and blockchain, all built atop SAP's cloud platform. Customers who adopt SAP Cloud Platform won't be tied to SAP data centers for the underlying infrastructure, as the company has inked partnerships with Amazon Web Services, Microsoft Azure and Google Cloud Platform as well.

The new Leonardo is SAP's "digital innovation system," and the basis for the company's drive into digital transformation projects. Digital transformation was the central theme of this year's Sapphire, and where SAP sees ample opportunity for growth as customers not only migrate older ERP environments to the new S/4HANA, but look to adopt the leading-edge technologies targeted by Leonardo.

SAP is shipping a number of industry accelerators—fixed-price products composed of services and subscription licenses—for Leonardo, aimed initially at retail, sports organizations, consumer products and discrete manufacturing.

For sure, Leonardo is an apt title for what the product set stands to offer customers, with its evocation of renaissance man Leonardo da Vinci. It also raises the bar high for SAP to deliver customer success, of course, given how much Leonardo projects will depend not only on technology but on professional services from SAP and partners.

Some may question whether quickly broadening Leonardo's remit so far beyond IoT could create market confusion, but Sapphire provides the biggest platform of the year for SAP to educate customers on its product strategy. In any event, it's far from clear how much penetration the original Leonardo vision had achieved among the installed base.

SAP is making the right move with the new incarnation of Leonardo because it sketches out a clearer picture of what a digital business ought to be, says Constellation Research VP and principal analyst Andy Mulholland.

"There will be a lot of CIOs who will be grateful for the clarity this new version of SAP Leonardo provides toward a potential proof-of-concept deployment of IoT and AI in the enterprise," he says. "Describing SAP Leonardo as adding systems of intelligence to the traditional SAP systems of record will help clear up the general market confusion as to exactly how to view the role of IoT and AI in enterprise digital business, as well as positioning Leonardo as an architecture and a platform for ongoing development."

"The number and importance of the partners who have expressed commitment to supporting SAP Leonardo suggest that they too feel this is an important and radical move in the marketplace," Mulholland adds. "Constellation Research advises SAP customers to take a detailed look, and suggests non-SAP customers may also find the platform interesting."
