
Deloitte, Ernst & Young Add to Hyperledger Momentum


Constellation Insights

The Hyperledger blockchain consortium is continuing to gain momentum, adding eight new members including Deloitte and Ernst & Young. Hyperledger now has 142 members, up substantially from the initial 30 who joined the project at launch in February 2016. Here are the key details from Hyperledger's announcement:

“These new members have joined Hyperledger bringing a diverse set of skills at a crucial time,” said Brian Behlendorf, Executive Director, Hyperledger. “Consensus is a great platform for our members to set the stage and speak to what’s happening in our community, as production blockchain deployments increase."

Hosted at the Linux Foundation, Hyperledger encompasses a set of open-source projects around various aspects of blockchain technology. The arrival of Deloitte and Ernst & Young should help further its mission of finding vertical applications for blockchain. Other new members announced this week include Alphapoint, Change Healthcare, CITIC, Clause Inc, FZG360 Network Co. Ltd and Schroder Investment Management Limited.

Hyperledger also announced that two projects, Sawtooth and Iroha, have graduated from incubation to "active" status at the Linux Foundation. Sawtooth focuses on scalability and security issues, while Iroha seeks to create a componentized blockchain framework whose parts can be reused in other implementations.

In March, Hyperledger announced that its Fabric project was the first to achieve active status. While the ranking doesn't denote a production-ready version 1.0, among other criteria it means the project has achieved enough diversity of support to survive any one company dropping out. 

Overall, this week's news is good for the Hyperledger effort, as well as the future of business, says Constellation Research VP and principal analyst Andy Mulholland.

"In order for digital business ecosystems and markets to function in a frictionless manner, they must have the commercial capability provided by a distributed ledger to allow any to any business transactions to be recorded," he says. "The quality of the membership of the HyperLedger project says a great deal about the business value well-known enterprises are placing upon its success."




Qlik Plots Course to Big Data, Cloud and 'AI' Innovation


Qlik highlights upgrades and the roadmap to high-scale, hybrid cloud and ‘augmented intelligence.’ Here's my take on the long-range plans.

Big data scalability, hybrid cloud flexibility and smart “augmented” intelligence. These are the three priorities that business intelligence and analytics vendor Qlik officially put on its roadmap at the May 15-18 Qonnections conference in Orlando, Florida.

Qlik also highlighted six important upgrades coming in the Qlik Sense June 2017 release – one of five annual updates now planned for the company’s flagship product (reflecting cloud-first pacing, though on-premises customers can choose whether and when to make the move). The June upgrade highlights include:

  • Self-service data-prep capabilities
  • New data visualizations and color-selection flexibility
  • Qlik GeoAnalytics geospatial analyses added through the vendor’s January acquisition of Idevio
  • An improved Qlik Sense Mobile app that supports offline analysis
  • Support for advanced analytics capabilities based on R and Python
  • Easier conversion of QlikView apps to Qlik Sense.

Qlik is promising "augmented intelligence" said to combine the best of machine intelligence with human interaction and decisions.

Most of these upgrades earned hearty applause from the more than 3,200 attendees at the Qonnections opening general session, but the sexiest and most visionary announcements were the ones on the roadmap. Here’s a rundown of what to expect, along with my take on what’s coming.

Building Toward Big Data Analysis

Qlik’s key differentiator is its associative QIX data-analysis engine, which is at the heart of the company’s platform and is shared by its Qlik Sense and QlikView applications. QIX keeps the entire data set and rich detail visible even as you focus in on selected dimensions of data. If you select customers who are buying product X, for example, you’ll also see which customers are not buying that product. It’s an advantage over drill-down analysis, where you filter out information as you explore.
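The associative behavior can be illustrated with a toy sketch (plain Python, not Qlik's actual engine or API): a selection partitions the data into selected and excluded sets, rather than discarding the non-matching rows the way a drill-down filter would. The data and names here are invented for illustration.

```python
# Conceptual sketch of associative selection: a filter partitions the data
# but keeps the excluded rows visible, unlike drill-down filtering which
# discards them. Not Qlik's implementation -- a hypothetical illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class Sale:
    customer: str
    product: str

sales = [
    Sale("Acme", "X"), Sale("Beta", "Y"),
    Sale("Acme", "Y"), Sale("Cora", "X"),
    Sale("Dyna", "Z"),
]

def associative_select(rows, predicate):
    """Return (selected, excluded) customer sets instead of dropping the
    non-matching side, mirroring the associative view of all the data."""
    selected = {r.customer for r in rows if predicate(r)}
    excluded = {r.customer for r in rows} - selected
    return selected, excluded

buying_x, not_buying_x = associative_select(sales, lambda r: r.product == "X")
print(sorted(buying_x))      # customers who bought product X
print(sorted(not_buying_x))  # customers who did not -- still visible
```

The point of the sketch is only the return shape: both sides of the selection stay available, so "who is not buying X" is always one step away.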

There have been limits, however, in how much data you can analyze within the 64-bit, in-memory QIX engine. Qlik has a workaround whereby you start with aggregated views of large data sets. Using an On-Demand App Generation capability you can then drill down to the detailed data in areas of interest. But the drawback of this approach is that you lose the powerful associative view of non-selected data.

The Associative Big Data Index approach announced at Qonnections will create index summaries of large data sets, drawn from sources such as Hadoop or high-scale distributed databases. A distributed version of the QIX engine will then enable users to explore the fine-grained detail within slices of the data without losing sight of the summary-level index of the entire data set.

MyPOV on Qlik big data capabilities: What I like about the Associative Big Data Index is that it will leave data in place, whether that’s in the cloud or in an on-premises big data source. It brings the query power to the data, eliminating time-consuming and costly data movement. The distributed architecture also promises performance. In a demo, Qlik showed nearly instantaneous querying of a 4.5-terabyte data set. Granted, it was a controlled, prototype test, so we’ll have to wait and see about real-world performance.

Speaking of waiting, on big data, as on the hybrid cloud and augmented intelligence fronts, Qlik senior vice president and CTO Anthony Deighton set conservative expectations, telling customers they would see progress by next year’s Qonnections event. He didn’t rule out the possibility of an earlier release, but nor did he promise that any of the new capabilities would be generally available by next year’s event. As has been Qlik’s habit in recent years, it’s responding slowly to demands in emerging areas like big data and cloud.

Preparing for Hybrid Cloud

The business intelligence market has forced a binary, either-or choice between on-premises and cloud-based deployment, said Deighton. He vowed that Qlik will change it to an “and” choice by fostering hybrid flexibility with the aid of microservices, APIs and containerized deployment. The approach will also require sophisticated, federated identity management, which the vendor has developed to support the European GDPR data security and privacy compliance requirements set to go into effect next year.

In a prototype preview at Qonnections, Qlik demonstrated workloads being spawned and assigned automatically across Qlik nodes running on Amazon, in the Qlik Cloud and on-premises. The idea is to flexibly send workloads to the most appropriate resources. That could mean spawning public cloud instances on the fly when scale is required. Or it could mean keeping analyses on-premises when regulated data is involved. Qlik is working with big banks and hospitals, among other customers, to master microservices orchestration across on-premises, private-cloud and public-cloud instances.

MyPOV On Qlik’s cloud plans: As noted above, Qlik made no promises as to when it will deliver on this flexible, cloud-friendly microservices vision, other than to say that we’ll hear more at Qonnections 2018. Qlik’s cloud offerings need these workload-management features, particularly where Qlik Sense Enterprise in the cloud is concerned. Customers want better performance as well as the granular services and APIs they’re used to from leading SaaS vendors. I believe it’s more important for Qlik to deliver quickly on this front than on any other, so let’s hope it’s something introduced before Qonnections 2018.

Augmenting Intelligence

There have been many announcements about “smart” capabilities this year. A few of these capabilities have actually launched (like those detailed in my reports on Salesforce Einstein and Oracle Adaptive Intelligent Apps), but most are works in progress. Some are conservatively described as automated predictive analytics or machine learning, while others are billed as “artificial intelligence.”

Over the past year, Deighton and other Qlik executives have charged that competitive AI and cognitive offerings tend to remove humans from decision making. In keeping with this theme, the company announced that it’s working on “augmented intelligence” that will “combine the best” of what machines can do with human input and interaction. The approach will eschew automation in favor of machine-human interaction that will bring context to data and promote better-informed machine learning, said Deighton.

The general idea is for humans to interact with concise lists of computer-generated suggestions. This will happen through computer-augmented interfaces at various stages in the data-analysis lifecycle. When users bring data together, for example, data-analysis algorithms will be applied to suggest how the data might be correlated. In the analysis stage, algorithms will suggest the best analytical approaches. And once results are generated, data-visualization algorithms will be applied to suggest best-fit visualizations. Humans will interact with the suggestions and make the final selections at every stage. Deighton promised something that will neither dump too many possibilities on users, at one extreme, nor create “trust gaps” by automating decisions and removing human input, at the other.
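The suggest-then-choose pattern described above can be sketched in a few lines. This is a hypothetical illustration, not Qlik's API: the function names, the field descriptions, and the simple fitness heuristic are all invented to show the shape of the workflow (machine proposes a short ranked list, human disposes).

```python
# Illustrative sketch of human-in-the-loop suggestion: the machine
# generates a concise ranked list; the human makes the final pick.
# All names and the heuristic are hypothetical, not Qlik's.
def suggest_visualizations(fields):
    """Rank chart types with a simple fitness heuristic for the field mix."""
    n_dims = sum(1 for f in fields if f["type"] == "dimension")
    n_meas = sum(1 for f in fields if f["type"] == "measure")
    if n_dims == 1 and n_meas == 1:
        candidates = ["bar chart", "pie chart"]
    elif n_dims == 0 and n_meas == 2:
        candidates = ["scatter plot", "line chart"]
    else:
        candidates = ["table"]
    return candidates[:3]  # a concise list, not every possibility

def human_pick(suggestions, choice=0):
    """The human stays in the loop: suggestions inform, never decide."""
    return suggestions[choice]

fields = [{"name": "region", "type": "dimension"},
          {"name": "sales", "type": "measure"}]
chosen = human_pick(suggest_visualizations(fields))
print(chosen)  # "bar chart"
```

The same pattern would repeat at each lifecycle stage (data combination, analysis, visualization), with a different suggestion function feeding the same final human choice.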

MyPOV on Qlik Augmented Intelligence: Based on conversations with Qlik executives, I’d say we’re in the early stages of Qlik’s augmented intelligence initiative. It all sounds good, but the details were sketchy. I heard a bit about analytic libraries and potential partnerships on the machine learning and neural net front. But executives weren't ready to name partners or predict availability. In short, we may see the beginnings of Qlik’s augmented intelligence capabilities at Qonnections 2018, but Qlik execs were up front in describing the initiative as something that may take a few years to mature.

Qlik’s most direct competitors, including Tableau, Microsoft, SAP and IBM, are all working on smart data exploration, basic prediction and “smart” recommendation features of one stripe or another. IBM is already on the second generation of its cloud-based IBM Watson Analytics service. Yet we’re still in the very earliest phases of bringing advanced analytics, machine learning and artificial intelligence to the broad business intelligence market. I think 2017 may mark the end of the beginning. By 2018 and beyond, we’ll start to see vendor selections based on smart features rather than on the maturing trend toward self-service capabilities.

RELATED READING:
Qlik Gets Leaner, Meaner, Cloudier
Inside Salesforce Einstein Artificial Intelligence
Tableau Sets Stage For Bigger Analytics Deployments


Java 'Father' James Gosling Joins Amazon Web Services


Constellation Insights

It's not often that a single technical hire at a vendor merits news coverage, but when it involves a tech luminary of the likes of Java "father" James Gosling, proper attention should be paid. Gosling announced this week on his personal Facebook page that he has taken a position with Amazon Web Services, after a stint as chief software architect of Liquid Robotics.

Gosling co-created Java along with Mike Sheridan and Patrick Naughton while the three worked at Sun Microsystems in the early 1990s. After Oracle bought Sun in 2010, taking control of Java, Gosling left the company, later citing issues with his salary, job responsibilities and micromanagement.

Liquid Robotics has developed an autonomous robot that traverses oceans, collecting and transmitting information about water temperature, wind speeds and other metrics. The startup's technology is used by energy companies and military organizations, among others. 

It's not clear whether Gosling will be doing anything of the sort for AWS, although his background at Liquid Robotics would seem to have some synergies with IoT (Internet of Things) projects in general.

Gosling's title at AWS is distinguished engineer, a distinction handed out to relatively few people overall at the company. Distinguished engineer positions are not sinecures and AWS will undoubtedly have him working on something highly strategic. 

Gosling declined to reveal what he'll be doing at AWS, but explained why in a Facebook post:

I've been getting a lot of questions on what I'll be working on at Amazon. Sadly, I can't say: it's Amazon's policy to be quiet. As much as secrecy can be annoying, it usually makes sense. ...Years ago I worked at IBM for a while and had to go through "confidential information" training. When I came back grumpy my manager smiled and said "IBMs biggest secret is that it has nothing worth keeping secret". Doesn't apply at Amazon. It looks like it'll be a fun ride.

AWS supports and relies upon Java for many of its services, particularly on the back end. Meanwhile, a great many on-premises enterprise workloads—ones AWS is keen to migrate—are Java-based. It's conceivable that Gosling could help lead efforts to improve Java services and tooling for AWS, and his name on the marquee won't hurt in convincing conservative enterprises to place bets with the company.

Gosling could also serve as an evangelist to the coding community. "AWS needs luminaries for the developers to look up to and he definitely is one," says Constellation Research VP and principal analyst Holger Mueller. "So this is a good move."



Personal Observations to the IT department on Digital Business, IoT & AI


I rarely write a blog based on personal opinion, as my engineering background favors researched, factual reporting. Opinion pieces are by necessity based on subjective views; to have value, they must be insightful. Having prepared and published research presenting the ‘Big Picture’ view of Digital Enterprises functioning in the Digital Economy, it became all too clear that while individual technologies might be understood by IT, their deployment into Digital Business was not.

In fact, the alignment between IT and Business management in the deployment of IoT as a core element in a Digital Enterprise was somewhere between absent and poor. Industrial companies with Operational Technology departments had moved swiftly forward, usually with limited reference to the Information Technology operation. This blog provides a simple summary of the main points that, in my opinion, IT Professionals have to understand to see the ‘Big Picture’ of Digital Business and how it will transform their Enterprise.

As the above is controversial, though easy to support with evidence, I am going to start with an explanation of why researching a Big Picture report gives a different view from reporting on particular elements in a specialized manner, and of why and how the work can produce insightful views in addition to the factual report itself.

An Analyst interacts with many different people, with a wide range of roles, experiences and employers, to build the ‘big picture’ view across an entire market spanning products, companies and deployments. Each discussion revolves around an in-depth microcosm that is one part of the whole picture; in a mature market, integration into the whole is part of the discussion.

In an immature market, where both the Business deployment and the Technology products are still rapidly emerging, the degree of consensus in the views is a critical issue. The concepts and practices of ‘Digital’ make for not just a Big Picture, but for the entire redesign of Business, and of the enabling Technology Frameworks, into a new, inclusive Digital Economy over the coming years.

A great deal of research, with the active cooperation of vendors and users, was required to produce the final report, which can be found here as an abbreviated set of Blogs under the title Distributed Business and Technology models. (The title aims to be ‘neutral and inclusive’, avoiding terms with strong individual definitions.) The resulting report defines the aspirational target both for users planning their business/technology adoption strategy, and for vendors planning their product/service feature development.


Most recognized markets, whether in business or technology, usually have a reasonable match between Business user expectations and Technology vendors’ product capabilities. Sadly, in Digital Business this is not so; there are serious mismatches between Business Managers, those deploying the technology, and its Vendors. The goal of this blog is to identify what I believe are some of the biggest gaps, or issues, that need to be addressed.

Currently both Business management and Technology Management are simultaneous drivers of deployment, but for completely different benefits and reasons. The vision of Digital Business is all too often remote from that of the IT department. In manufacturing, or in certain industry sectors like Buildings or Medical, the presence of Operational Technology departments has helped to overcome the ‘gap’ and has resulted in these sectors accelerating their rates of adopting the new practices.

Cloud is an excellent example of this: the Information Technology department will be focused on reducing cost through using centralized, large-scale data centers, whilst Business Management and Operational Technology see Cloud as the technology that supports the distributed, low-latency edge computing requirements of IoT in a Digital Business.

Sadly, many of the following comments seem to target the IT department and its need to come to terms with a Business and Technology transformation of a type last experienced in the early 1990s, which led to the creation of the Enterprise IT department. The following comments assume that IT professionals and departments are keen to identify their new, or additional, role in the Digital Enterprise.

After twenty-five years of Client-Server based Enterprise IT driven by the Close Coupled, Stateful architecture model, a technology change to an altogether different Business model using a Loose Coupled, Stateless architecture is radical. After all, the current architecture has proved able to be adjusted to accommodate the inclusion of the Internet, Web, Mobility and Clouds, so why not IoT and AI as well?

The following, gathered from a wide range of sources, are, in my opinion, the key points where misunderstandings around the Digital Business model and its enablement with Technology are most common. And it starts with the fundamental question of what Digital Business is.

Digital Business, the Digital Economy, and similar terms are widely used, often interchangeably, without their real definitions being understood, and are seen as part of the current Internet/Web economy. The prevailing view is that Digital Business is an extension of the current model, with new technology capabilities added to support increased volumes of business. There is widespread failure to grasp that Digital in this context refers to a new generation of Commerce that changes Business models and Enterprise organizational structures.

The enabling technologies for the Digital Enterprise that make up CAAST (Clouds, Apps, AI, Services & Things), even when bearing a familiar name such as Cloud, are deployed in a different way to enable Digital Business. Digital Business deploys these technologies to continuously optimize changing opportunities to do business, gaining increased revenues and margin improvements. Digital Business is more than a Front Office activity; it runs throughout the Enterprise Business and Operating model.

The contrast with the current role of IT, as a predominantly Back Office activity providing the administrative functions necessary to record transactions through stable processes, could not be greater. Information Technology needs to grasp the difference from its current role, and to learn from Operational Technology, a discipline that has already shown leadership in operating ‘real-time’ event optimization. The role of IT remains necessary for compliance, and will continue, but is likely to become subordinate, even increasingly outsourced, as attention turns to investing in Digital Business.

IoT is all too often seen either as a consumer wave, or as an interesting way to add more data to current IT processes. The digitized representation of the Physical World through IoT, allowing assets and events to be ‘read’ and a dynamically optimized ‘react’ to be created, is at the heart of Digital Business. IoT should be understood as a group of technologies that create the crucial digital data to extend the use of computers into new areas of the Enterprise Business operating model.

Business Managers have come to realize that continuous innovation and dynamic optimization require the decentralization of Enterprise operations that is a core feature of Digital Business models. After years of using IT to support ever more centralized business activities and processes, this is counterintuitive to many IT professionals, who fear for the ‘Stateful’ synchronization of Enterprise data.

The IT role is to manage the integration of Systems in known relationships, Close Coupled, to maintain the single-version-of-the-truth, Stateful, data model. Digital Business requires a new role: to ensure any Device can communicate whenever it needs, Loose Coupled, and to align data flows to activities in a Stateless model.
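The Loose Coupled, Stateless pattern can be made concrete with a minimal sketch. This is a hypothetical illustration, not any vendor's product: each device message is self-describing and handled independently, so no pre-registered integration or shared session state is needed, and handlers can scale out freely. The message fields and the temperature threshold are invented for the example.

```python
# Minimal sketch of loose-coupled, stateless message handling: every
# reading carries its own context, so any device can send whenever it
# needs and each message is processed in isolation. Names are hypothetical.
import json

def handle_reading(message: str) -> dict:
    """Process one self-contained IoT reading. No session or shared state
    is consulted, so any handler instance can process any message."""
    event = json.loads(message)
    # The message itself supplies device, metric, value and timestamp.
    alert = event["metric"] == "temperature" and event["value"] > 80.0
    return {"device": event["device"], "alert": alert}

msg = json.dumps({"device": "pump-7", "metric": "temperature",
                  "value": 92.5, "ts": "2017-05-20T10:00:00Z"})
print(handle_reading(msg))  # {'device': 'pump-7', 'alert': True}
```

The close-coupled, stateful alternative would instead look the device up in a central registry and update a shared model before responding, which is exactly the synchronization burden the stateless approach avoids.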

The Loose Coupled, Stateless nature of Digital Business, coupled with the translation of the physical World into Digital Models, massively increases the volume of data to be ‘read’. Simultaneously, the time available to ‘respond’ in order to influence the outcome decreases, making the introduction of automation necessary.

Deploying the Technology of Digital Business correctly creates the environment in which to deploy AI (here standing for Augmented Intelligence). As with Digital Business versus current online Business, AI is not Analytics and BI taken to the next level; it is a new approach, requiring an investment in time to understand how to deliver the ‘augmentation’ of human capacity needed to manage the vast increases in data and decisions that Digital Business brings.

In summary, IT departments cannot muddle through and expect that, as before, they will be able to assimilate and adopt this latest round of technology change. Technology professionals will, as ever, be in great demand to deliver what the Business requires. IT professionals in IT departments need to conduct serious strategic knowledge-building exercises now.

Equipped with this knowledge the IT department becomes able to fulfill a wider role, perhaps in conjunction with the existing Operation Technology department, and certainly to bring Technology skills to the Enterprise Business strategy.


Interesting Links

1) ASUG, (Association of SAP User Groups) has had a long running educational program in the form of Webinars given by a range of Experts with Case Studies. The views are not limited to, or constrained by SAP and its products, and offer distinctly practical advice. The current series can be found at https://www.asug.com/news/saps-journey-to-the-cloud-and-you-part-i

2) The Internet of Things World – Europe event provides an illustration of the scale of IoT to Digital Business adoption in the numbers of speakers from many World-class enterprises, but also demonstrates the almost total absence of IT departments and professionals from the program. See speaker lists and the program details here


News Analysis - Microsoft to deliver Microsoft Cloud from datacenters in Africa


[South Africa has 11 official languages – and as I start blogs on new IaaS locations with the local language ‘being learnt’ by the provider, I took the alphabetically first and last languages.]

Microsoft recently announced plans to bring Azure to South Africa. Data center locations matter from a constitutional and data residency perspective, as well as from a performance perspective.


So let’s take apart Scott Guthrie’s blog, which can be found here:
May 18, 2017 – Johannesburg, South Africa – Today Microsoft revealed plans to deliver the complete, intelligent Microsoft Cloud for the first time from datacenters located in Africa. This new investment is a major milestone in the company’s mission to empower every person and every organization on the planet to achieve more, and a recognition of the enormous opportunity for digital transformation in Africa.
MyPOV – Good to see the intent, in line with the general Microsoft Azure (or is it now the Microsoft Cloud?) value pitch of being the intelligent cloud.
Expanding on existing investments, Microsoft will deliver cloud services, including Microsoft Azure, Office 365, and Dynamics 365, from datacenters located in Johannesburg and Cape Town, South Africa with initial availability anticipated in 2018. The new cloud regions will offer enterprise-grade reliability and performance combined with data residency to help enable the tremendous opportunity for economic growth, and increase access to cloud and internet services for organizations and people across the African continent.
MyPOV – So 2018 will see the go live. No surprise there is an Office 365 and Dynamics 365 angle, two products that have to comply with data privacy and data residency legislation.
“We’re excited by the growing demand for cloud services in Africa and their ability to be a catalyst for new economic opportunities,” said Scott Guthrie, Executive Vice President, Cloud and Enterprise, Microsoft Corp. “With cloud services ranging from intelligent collaboration to predictive analytics, the Microsoft Cloud delivered from Africa will enable developers to build new and innovative apps, customers to transform their businesses, and governments to better serve the needs of their citizens.”
MyPOV – Good quote from Guthrie, which also hints at the next-generation application angle for developers and at the government angle. Governments usually require data residency.
Expanding Access & Opportunity: Currently many companies in Africa rely on cloud services delivered from outside of the continent. Microsoft’s new investment will provide highly available, scalable, and secure cloud services across Africa with the option of data residency in South Africa. With the introduction of these new cloud regions, Microsoft has now announced 40 regions around the world – more than any major cloud provider. The combination of Microsoft’s global cloud infrastructure with the new regions in Africa will connect businesses with opportunity across the globe, help accelerate new investments, and improve access to cloud and internet services for people and organizations from Cairo to Cape Town.
MyPOV – South Africa is an island from an IaaS perspective. A large economy, but far away from a connectivity perspective, it’s in a similar situation to Australia, only the IaaS vendors made it to Australia much earlier. With 40 regions, Microsoft currently leads competitors AWS and Google, but it does not clarify how many data centers are in one location. And to deliver reliable services in a geography, there need to be at least two data centers. Microsoft does not share how many data centers are in a region, but it will have two regions, one each in Cape Town and Johannesburg. That should be a good answer for any HA (High Availability) concerns, though Oracle has quickly moved the 'standard' to three data centers per location / region.
“We greatly value Microsoft’s commitment to invest in cloud services delivered from Africa. Standard Bank already relies on cloud technology to provide our customers with a seamless experience,” says Brenda Niehaus, Group CIO at Standard Bank. “To achieve success as a business, we need to keep pace with market developments as well as customer needs, and Office 365 empowers us to make a culture shift towards becoming a more dynamic organization, whilst Azure enables us to deliver our apps and services to our customers in Africa. We’re looking forward to achieving even more with the cloud services available here on the continent.”
MyPOV – Always good to have launch customers, and good to have them provide a quote in a press release announcing products and services to be used in the future.
Investing in African Innovation: This announcement expands on ongoing investments in Africa, where organizations are using currently available cloud and mobile services as a platform for innovation in health care, agriculture, education, and entrepreneurship. Microsoft has been working to support local start-ups and NGOs, unleashing innovation that has the potential to solve some of the biggest problems facing humanity, such as the scarcity of water and food, and economic and environmental sustainability. One start-up, M-KOPA Solar, provides affordable pay-as-you-go solar energy to over 500,000 homes using mobile and cloud technology. AGIN has built an app connecting 140,000 smallholder farmers to key services, enabling them to share data and facilitating $1.3 million per month in finance, insurance and other services.
MyPOV – Always good to show the potential and upside – and Africa has a lot of both. It’s not clear what M-KOPA and AGIN are using, or will be using, from Microsoft.
Across Africa, Microsoft has brought 728,000 small and mid-size enterprises (SMEs) online to help them transform and modernize their businesses, and over 500,000 are now utilizing Microsoft cloud services, with 17,000 using the 4Afrika hub to promote and grow their businesses. The Microsoft Cloud is also helping Africans build job skills, with 775,000 trained on subjects ranging from digital literacy to software development. We anticipate the Microsoft Cloud from Africa will fuel extensive new opportunities for our 17,000 regional partners and customers alike.
MyPOV – Impressive numbers; the consumer and educational aspects of the Microsoft product and services portfolio have a lot of potential in Africa. On the other side, it also requires Microsoft to invest in infrastructure in Africa, and this is a first step.
“This development broadens the options available to us in our modernization journey of Government ICT infrastructure and services. It allows us to take advantage of new opportunities to develop innovative government solutions at manageable costs, as well as drive overall improvements in operations management, while improving transparency and accountability,” says Dr. Setumo Mohapi, CEO at SITA.
MyPOV – Again, good to see a current/future customer quote, this one covering the government aspect and potential.
The Microsoft Trusted Cloud: Microsoft has deep expertise protecting data, championing privacy, and empowering customers around the globe to meet extensive security and privacy requirements. With Microsoft’s Trusted Cloud principles of security, privacy, compliance, transparency, and the broadest set of compliance certifications and attestations in the industry, Microsoft’s cloud infrastructure supports over a billion customers and 20 million businesses around the globe. […]
MyPOV – Good to see Microsoft stressing the security aspect. As in every new geographic region the cloud arrives in, there is a large group of skeptical CxOs, and security concerns are at the top of their list of reasons why they cannot move to the cloud. These concerns need to be addressed. There is no reason, though, why they cannot be addressed in South Africa just as they have been in the rest of the world, with a broadly favorable outcome for the cloud.

 

Overall MyPOV

Always good to see IaaS vendors adding locations to their global clouds. South Africa is a key location given the combination of its GDP and its remoteness from the network backbones. Australia has similar characteristics but a 4x larger GDP, so it is no surprise that in the global monopoly game between the IaaS vendors, Australia saw the respective IaaS flags planted earlier than South Africa. But now it is South Africa's, and with that Africa's, turn. And Africa is, after Asia, the world's second-largest continent from a population perspective. As such, Africa is key for Microsoft across all its offerings, as the press release outlines: for Office usage, for Dynamics usage, and for getting strongly locally rooted customers (like governments) onto the Microsoft cloud.

On the concern side, there is little to address. Microsoft is large enough to make the CAPEX happen; the only question is which geographies were passed over in favor of South Africa, and we will likely learn that from the next data center location announcement. South African data centers will also likely not be efficient for serving economies north of the equator; the African Mediterranean rim, for example, is probably better served from Europe. And then there is the prize of the first Middle Eastern data center. That Microsoft does not shy away from network investments can be seen in the recent MAREA cable announcement (with Facebook, see here), which will land in Europe closer to Africa than any other transatlantic cable, at Bilbao, Spain.

But for now, congrats to Microsoft, which among the three large providers (the others being AWS and Google) is the first to announce a South African location. Even going further down the provider list to e.g. IBM, Oracle and SAP (though SAP may not be pushing an IaaS build-out right now), Microsoft has made the first announcement and move toward South Africa and Africa. So, congrats are in order.

 

 


News Analysis - Informatica Reinvents iPaaS with Next-Generation Cloud Services


Informatica had its user conference INFA17 this week in San Francisco, from May 15 to May 18, 2017. We had the chance to be there on Monday, and to actually speak about MDM trends (see the Storify here).

 
As usual, user conferences come with a flurry of press releases, and this one was no different. Let me dissect the key one in the customary style; it can be found here:
  • Expands key capabilities of iPaaS to include end-to-end data management
  • Advances productivity with CLAIRE – metadata-driven Artificial Intelligence
  • Scales enterprise-wide to manage complex hybrid data environments
  • Adheres to the broadest security certifications
MyPOV – A fair summary. You know these press releases are getting a bit too long when the PR folks realize they need a bullet list of content items at the beginning.
Informatica World, SAN FRANCISCO, Calif., May 16, 2017 – Informatica, the Enterprise Cloud Data Management leader accelerating data-driven digital transformation, today announced Informatica Intelligent Cloud Services, the most advanced Integration Platform as a Service (iPaaS) solution available for end-to-end enterprise cloud data management. Informatica Intelligent Cloud Services will feature a next-generation user experience based on a modern API-based microservices architecture, powered by Informatica’s innovative enterprise unified metadata intelligence - known as CLAIRE Engine.
MyPOV – Great summary; it describes what is happening and bridges from the well-known iPaaS to microservices and AI, introducing the CLAIRE engine.
 
Informatica Intelligent Cloud Services expands Informatica’s leading application and data integration iPaaS capabilities to now include critical end-to-end cloud data management. This includes industry leading and enterprise-class data integration, API management, application integration, data quality and governance, master data management and data security, all re-imagined for the cloud. Informatica Intelligent Cloud Services is built on the Informatica Intelligent Data Platform™, and has reimagined the front- and back-end experience for the modern cloud environment, enabling organizations to efficiently unleash the power of all their data, wherever it resides, to fuel successful digital transformation initiatives.
MyPOV – OK, more detail, and a reminder that all of it runs on one single platform, the Informatica Intelligent Data Platform.
“As the industry’s number one iPaaS leader, we are driving innovation in this market,” said Amit Walia, executive vice president and chief product officer, Informatica. “With this launch, we are completely re-inventing iPaaS. We are delivering the industry’s broadest end-to-end data management solution for the cloud, with a next-generation user experience running on an API-based microservices architecture. Powered by metadata and Artificial Intelligence, we help enterprises accelerate their cloud-powered digital transformations.”
MyPOV – Good quote from Walia. Everybody wants to re-invent things these days, but when putting AI into action, it truly has the potential of doing things differently. My concern is that Informatica uses 'old' terms, e.g. metadata. There is no need to talk about metadata in the AI era. More below.
 
Data Management Reimagined for Cloud
Informatica Intelligent Cloud Services moves past the traditional definition of iPaaS to include cloud data integration, cloud application and process integration, API management and connectivity. Informatica Intelligent Cloud Services delivers the industry’s first, and only, family of clouds that provides industry-leading data management capabilities, powered by CLAIRE from Informatica.
MyPOV – OK, this is the third, and different, collection of what the iPaaS does, this time with a data management angle.
 
The family of clouds available in Informatica Intelligent Cloud Services include:
  • Informatica Integration Cloud – Modern digital strategies require a variety of integration approaches and patterns. Integration Cloud greatly expands the traditional definition of iPaaS to include advanced, unique functionality, such as Integration Hub and B2B. This also includes application integration, data integration and API management. For example, Cloud Integration Hub provides pub-sub-hub integration capabilities for hybrid data management.
  • Informatica Data Quality & Governance Cloud – Modern digital strategies require trusted data. Data Quality & Governance Cloud includes functionality that delivers the data quality and governance foundation for all cloud projects and initiatives. For example, Cloud Data Quality Radar provides the ability to assess and fix data quality issues within cloud applications, such as Salesforce and Marketo.
  • Informatica Master Data Management Cloud – Modern digital strategies require authoritative data. Master Data Management Cloud provides single, complete and accurate views across all forms of master data, in a single source of truth. For example, Cloud Customer 360 for Salesforce provides cloud MDM capabilities that scale to the most demanding enterprise requirements with a laser focus on business self-service and self-management of master data.
  • Future Clouds – Additional, modular data management clouds, products and solutions will be seamlessly added to Informatica Intelligent Cloud Services over time.
MyPOV – So what once used to be products are now 'clouds'. The separation between integration, data quality and governance, and master data management makes sense, as these are the natural organizational breaking points for how data-handling organizations are set up. Good to see Informatica leaving the door open for future clouds, aka products.
Adopting cloud and using data to drive disruption requires excellence in data management. The innovative combination of a modern user experience and API-based microservices architecture, built on the industry’s only Intelligent Data Platform powered by CLAIRE, enables Informatica Intelligent Cloud Services to deliver increased productivity and address new use cases, at scale.
MyPOV – Another collection of capabilities; it would be good to have an example or customer proof point here.
 
Next-Generation Experience and Architecture: Innovative Approach for Maximum Productivity
All the clouds that comprise Informatica Intelligent Cloud Services share a consistent, next-generation user experience across the entire spectrum of data management capabilities. The API-based microservices architecture delivers common services (e.g., user authentication, workflow creation, asset management, search, tagging, and more) that not only look the same, but also operate exactly the same wherever they are invoked across the cloud. This user experience dramatically reduces the learning curve for new tools and drives self-service across the environment.
MyPOV – Always good to mention suite-level benefits, and this time making them tangible with examples. Consistency and synergies are what enterprises want to see when they buy multiple, suite-integrated products from the same vendor.
 
The reimagined next-generation user experience includes a single, personalized home page with tiles for items such as personal tasks and connections, plus dashboard tiles for the projects a user has underway in each of the data management clouds. This home page gives users visibility into, and access to, all the data management projects they may have underway across all the clouds of Informatica Intelligent Cloud Services.
MyPOV - It's never enough to offer great capability and functionality behind the scenes; it has to be seen and experienced by the user. Good to see the UX progress, which frankly was an area where Informatica was challenged in the past, particularly in terms of consistency. Good to see the tile approach for the UX, which has worked well across the industry to bring information together consistently for multiple user roles.

 
Additionally, the new microservices are based on open REST APIs. This will enable a continued rapid pace of innovation for Informatica, allowing the company to introduce new services and advance existing ones quickly. It will also enable quick integration with customer and partner reference architectures.
MyPOV - An overdue move, but good nonetheless. REST has won, and it's time for vendors to adopt it.
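To make the benefit concrete, here is a minimal, self-contained sketch of why open REST APIs lower the integration barrier: any HTTP-capable client can consume a service with no vendor-specific SDK. The endpoint and payload below are my own hypothetical illustration, not Informatica's actual API.

```python
# Toy REST service and client, stdlib only. The /v1/connectors resource
# is hypothetical; the point is that plain HTTP + JSON is all a client needs.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class ConnectorCatalog(BaseHTTPRequestHandler):
    """Toy REST resource listing available connectors."""
    def do_GET(self):
        if self.path == "/v1/connectors":
            body = json.dumps({"connectors": ["salesforce", "marketo"]}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # keep the example quiet
        pass

server = HTTPServer(("127.0.0.1", 0), ConnectorCatalog)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# Any generic HTTP client works -- no proprietary SDK required.
url = f"http://127.0.0.1:{server.server_port}/v1/connectors"
with urllib.request.urlopen(url) as resp:
    payload = json.load(resp)
server.shutdown()
print(payload["connectors"])  # ['salesforce', 'marketo']
```

In a real reference architecture the client could equally be curl, a partner's Java service, or an iPaaS connector; that interchangeability is the point of standardizing on open REST APIs.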
 
Powered by CLAIRE: Intelligence Throughout the Cloud
CLAIRE—with clairvoyance in mind and AI in the center—is the industry’s most advanced metadata-driven Artificial Intelligence (AI) technology and is embedded in the Informatica Intelligent Data Platform. CLAIRE delivers intelligence to the entire portfolio of Informatica data management solutions that includes data integration, master data management, data quality and governance, data security, cloud data management, and big data management capabilities. CLAIRE delivers AI by applying machine learning to technical, business, operational and usage metadata across the entire enterprise. This scale and scope of metadata is transformational and allows CLAIRE to help data and integration developers by partially or fully automating many tasks, while business users find it easier to locate and prepare the data they are looking for from anywhere in the enterprise. Meanwhile, data scientists gain a faster understanding of data and data stewards find it easier to visualize data relationships.
MyPOV – Good intro and explanation of CLAIRE, though vague on details regarding machine learning algorithms, platform, pricing and so on. But it is early days.
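To illustrate the idea in miniature, here is a toy sketch (my own illustration, in no way Informatica's implementation) of metadata-driven automation: ranking columns as likely key candidates purely from metadata, i.e. column names and profiling statistics. This is the kind of signal an engine like CLAIRE would learn from, at far larger scale and with real machine learning rather than fixed weights.

```python
# Toy heuristic: score columns as primary-key candidates using only
# metadata (names and simple profiling stats), never the data itself.

def key_candidate_score(name, distinct_ratio, null_ratio):
    """Heuristic score in [0, 1]; higher means more likely a key column."""
    score = 0.0
    if any(tok in name.lower() for tok in ("id", "key", "code")):
        score += 0.4                      # name metadata suggests a key
    score += 0.4 * distinct_ratio         # mostly unique values
    score += 0.2 * (1.0 - null_ratio)     # keys should not be null
    return round(score, 2)

# (column name, share of distinct values, share of nulls) -- made-up profile
columns = [
    ("customer_id", 1.00, 0.00),
    ("first_name",  0.20, 0.05),
    ("postal_code", 0.10, 0.10),
]
ranked = sorted(columns, key=lambda c: -key_candidate_score(*c))
print(ranked[0][0])  # customer_id
```

A real engine would learn the weights from technical, operational and usage metadata across the enterprise instead of hard-coding them, but the shape of the automation is the same: metadata in, a recommendation out.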

Enterprise-wide Management for a Hybrid World
Informatica Intelligent Cloud Services is built to enable enterprises to run complex hybrid environments. It provides operational insights delivered through a single dashboard for monitoring and managing all data management clouds and products and their data. Customers benefit from easy connectivity to all cloud, on-premise and big data sources across the enterprise using pre-built Informatica connectors.
MyPOV – We live in a world of multi-cloud and hybrid systems; integrations in general, and MDM specifically, need to span across them. So it is good to see Informatica providing a single pane of glass and tools on a common platform.

Cloud with Industry-leading Security and Trust
Informatica Intelligent Cloud Services is built for the enterprise with security as a core design principle. It has the following certifications and standards for industry leading security:
  • AICPA SOC 2 Type 2 and SOC 3 attestations.
  • Externally audited HIPAA compliance.
  • ISO 27000-aligned Information Security Management System, plus EU-US Privacy Shield compliance and security program.
  • Member of the Cloud Security Alliance and Salesforce AppExchange certified.
MyPOV – Security remains the top concern, or one of the top three concerns, of cloud users, so it is good that Informatica seeks security certifications that help address them.
 

Overall MyPOV

Informatica is in transition from on-premises to the cloud; what started gingerly a year ago at INFA16 is now in full swing and available at INFA17. A cloud platform brings new capabilities, e.g. in the area of machine learning, and as we are in the 'buzzword AI' age, the AI assistants are coming. Good to see CLAIRE being launched; we have to understand a little better what she does, where she lives, how she is educated, and what other needs she has (thanks for staying with the metaphor).

On the concern side, Informatica uses some older vocabulary, and vocabulary is also an indicator of thinking. Metadata was the gold of the integration mechanisms of the past. Take a look, to keep it uncontentious, at Google Photos: users do not flag or tag pictures anymore; machine learning does. Of course there is a tagging (or metadata) repository underneath, but users never see it. It is not even exposed. Whether integration vendors will go that far remains to be seen, but with the rise of AI, CLAIRE should take care of the metadata; Informatica users should not have to know, see or talk about it, but just use it the same way people use Google Photos. A high ask, but with a big prize.

Overall a good INFA17. Informatica is firing on all cylinders, the PE investment does not seem to hinder execution, and the vendor is moving fast, which is great news for customers and prospects, and for enterprises in general, as the integration problems only get bigger for the foreseeable future.
 

 
 

A hidden message from Ed Snowden



The KNOW Identity Conference in Washington DC last week opened with a keynote fireside chat between tech writer Manoush Zomorodi and Edward Snowden. 

Once again, the exiled security analyst gave us a balanced and nuanced view of the state of security, privacy, surveillance, government policy, and power.  I have always found him to be a rock-solid voice of reason. Like most security policy analysts, Snowden sees security and privacy as symbiotic: they can be eroded together, and they must be bolstered together. When asked (inevitably) about the “security-privacy balance”, Snowden rejects the premise of the question, as many of us do, but he has an interesting take, arguing that governments tend to surveil rather than secure.  

The interview was timely for it gave Snowden the opportunity to comment on the “Wannacry” ransomware episode which affected so many e-health systems recently.  He highlighted the tragedy that cyber weapons developed by governments keep leaking and falling into the hands of criminals. 

For decades, there has been an argument that cryptography is a type of "Dual-Use Technology"; like radio-isotopes, plastic explosives and supercomputers, it can be used in warfare, and thus the NSA and other security agencies try to include encryption in the "Wassenaar Arrangement" of export restrictions. The so-called "Crypto Wars" policy debate is usually seen as governments seeking to stop terrorists from encrypting their communications. But even if crypto export control worked, it would not address security agencies' carelessness with their own cyber weapons.

But identity was the business of the conference. What did Snowden have to say about that?

  • Identifiers and identity are not the same thing.  Identifiers are for computers but “identity is about the self”, to differentiate yourself from others.
  • Individuals need names, tokens and cryptographic keys, to be able to express themselves online, to trade, to exchange value.
  • “Vendors don’t need your true identity”; notwithstanding legislated KYC rules for some sectors, unique identification is rarely needed in routine business. 
  • Historically, identity has not been a component of many commercial transactions.
  • The original Web of Trust, for establishing a level of confidence in people through mutual attestation, was "crude and could not scale". But new "programmatic, frictionless, decentralised" techniques are possible.
  • He thought a “cloud of verifiers” in a social fabric could be more reliable, to avoid single points of failure in identity.
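The point that individuals need cryptographic keys to express themselves online can be made concrete with a toy sketch. This is my own illustration, not anything Snowden proposed, and HMAC stands in for the public-key signatures a real system would use: a holder proves control of a key by signing a message, and a verifier checks the tag without ever learning a "true identity".

```python
# Toy key-based attestation: demonstrate control of a key, not who you are.
import hashlib
import hmac

def sign(secret: bytes, message: bytes) -> str:
    """Produce an authentication tag over the message with the secret key."""
    return hmac.new(secret, message, hashlib.sha256).hexdigest()

def verify(secret: bytes, message: bytes, tag: str) -> bool:
    """Check the tag in constant time; True only for the matching key."""
    return hmac.compare_digest(sign(secret, message), tag)

key = b"holder-secret-key"            # stands in for a real private key
msg = b"I am the holder of this key"
tag = sign(key, msg)

print(verify(key, msg, tag))          # True: the holder demonstrated control
print(verify(b"other-key", msg, tag)) # False: a different key fails
```

With real public-key cryptography the verifier would need only the public key, so the secret never leaves the holder, which fits the "vendors don't need your true identity" point above.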

When pressed, Snowden said actually he was not thinking of blockchain (and that he saw blockchain as being specifically good for showing that “a certain event happened at a certain time”).  

Now, what are identity professionals to make of Ed Snowden’s take on all this? 

For anyone who has worked in identity for years, he said nothing new, and the identerati might be tempted to skip Snowden. On the other hand, in saying nothing new, perhaps Snowden has shown that the identity problem space is fully defined. 

There is a vital meta-message here.

In my view, identity professionals still spend too much time in analysis.  We’re still writing new glossaries and standards.  We’re still modelling. We’re still working on new “trust frameworks”.  And all for what?  Let’s reflect on the very ordinariness of Snowden’s account of digital identity.  He’s one of the sharpest minds in security and privacy, and yet he doesn’t find anything new to say about identity. That’s surely a sign of maturity, and that it’s time to move on.  We know what the problem is: What facts do we need about each other in order to deal digitally, and how do we make those facts available?

Snowden seems to think it’s not a complicated question, and I would agree with him.

 

 


Roundup Of Cloud Computing Forecasts, 2017


  • Cloud computing is projected to increase from $67B in 2015 to $162B in 2020 attaining a compound annual growth rate (CAGR) of 19%.
  • Gartner predicts the worldwide public cloud services market will grow 18% in 2017 to $246.8B, up from $209.2B in 2016.
  • 74% of Tech Chief Financial Officers (CFOs) say cloud computing will have the most measurable impact on their business in 2017.
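The headline numbers above are easy to sanity-check: a compound annual growth rate is just the annualized ratio of end value to start value. A quick check of the first figure (the helper function is my own, not from any of the cited reports):

```python
# Sanity-check the quoted CAGR: $67B (2015) -> $162B (2020).
def cagr(start, end, years):
    """Compound annual growth rate as a fraction: (end/start)^(1/years) - 1."""
    return (end / start) ** (1 / years) - 1

# Five compounding periods between 2015 and 2020.
print(f"{cagr(67, 162, 5):.1%}")  # prints 19.3%, consistent with the ~19% forecast
```

The same one-liner reproduces the other growth figures in this roundup to within rounding.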

Cloud platforms are enabling new, complex business models and orchestrating more globally-based integration networks in 2017 than many analyst and advisory firms predicted. Combined with Cloud Services adoption increasing in the mid-tier and small & medium businesses (SMB), leading researchers including Forrester are adjusting their forecasts upward. The best check of any forecast is revenue.  Amazon’s latest quarterly results released two days ago show Amazon Web Services (AWS) attained 43% year-over-year growth, contributing 10% of consolidated revenue and 89% of consolidated operating income.

Additional key takeaways from the roundup include the following:

  • Wikibon is predicting enterprise cloud spending will grow at a 16% compound annual growth rate (CAGR) between 2016 and 2026. The research firm also predicts that by 2022, Amazon Web Services (AWS) will reach $43B in revenue and be 8.2% of all cloud spending. Source: Wikibon report preview: How big can Amazon Web Services get?
Wikibon Worldwide Enterprise IT Projection By Vendor Revenue


Rapid Growth of Cloud Computing, 2015–2020


Worldwide Public Cloud Services Forecast (Millions of Dollars)


  • By the end of 2018, spending on IT-as-a-Service for data centers, software and services will be $547B. Deloitte Global predicts that procurement of IT technologies will accelerate in the next 2.5 years from $361B to $547B. At this pace, IT-as-a-Service will represent more than half of IT spending by the 2021/2022 timeframe. Source: Deloitte Technology, Media and Telecommunications Predictions, 2017 (PDF, 80 pp., no opt-in).
Deloitte IT-as-a-Service Forecast


  • Total spending on IT infrastructure products (server, enterprise storage, and Ethernet switches) for deployment in cloud environments will increase 15.3% year over year in 2017 to $41.7B. IDC predicts that public cloud data centers will account for the majority of this spending (60.5%), while off-premises private cloud environments will represent 14.9%. On-premises private clouds will account for 62.3% of spending on private cloud IT infrastructure and will grow 13.1% year over year in 2017. Source: Spending on IT Infrastructure for Public Cloud Deployments Will Return to Double-Digit Growth in 2017, According to IDC.
Worldwide Cloud IT Infrastructure Market Forecast


  • Platform-as-a-Service (PaaS) adoption is predicted to be the fastest-growing sector of cloud platforms according to KPMG, growing from 32% in 2017 to 56% adoption in 2020. Results from the 2016 Harvey Nash / KPMG CIO Survey indicate that cloud adoption is now mainstream and accelerating as enterprises shift data-intensive operations to the cloud.  Source: Journey to the Cloud, The Creative CIO Agenda, KPMG (PDF, no opt-in, 14 pp.)
Cloud investment by type today and in three years


AWS Segment Financial Comparison


  • In Q1, 2017 AWS generated 10% of consolidated revenue and 89% of consolidated operating income. Net sales increased 23% to $35.7 billion in the first quarter, compared with $29.1 billion in first quarter 2016. Source: Cloud Business Drives Amazon’s Profits.
Comparing AWS' Revenue and Income Contributions


  • RightScale’s 2017 survey found that Microsoft Azure adoption among enterprises surged from 26% to 43%, with AWS adoption increasing from 56% to 59%. Across all respondents, Azure adoption grew from 20% to 34%, reducing the AWS lead; Azure has now reached 60% of AWS’ market penetration. Google also increased adoption, from 10% to 15%. AWS continues to lead in public cloud adoption (57% of respondents currently run applications in AWS), a number that has stayed flat since 2015. Source: RightScale 2017 State of the Cloud Report (PDF, 38 pp., no opt-in)
Public Cloud Adoption, 2017 versus 2016


  • Global Cloud IT market revenue is predicted to increase from $180B in 2015 to $390B in 2020, attaining a Compound Annual Growth Rate (CAGR) of 17%. In the same period, SaaS-based apps are predicted to grow at an 18% CAGR, and IaaS/PaaS is predicted to increase at a 27% CAGR. Source: Bain & Company research brief The Changing Faces of the Cloud (PDF, no opt-in).
60% of IT Market Growth Is Being Driven By The Cloud


  • 74% of Tech Chief Financial Officers (CFOs) say cloud computing will have the most measurable impact on their business in 2017. Additional technologies that will have a significant financial impact in 2017 include the Internet of Things, Artificial Intelligence (AI) (16%) and 3D printing and virtual reality (14% each). Source: 2017 BDO Technology Outlook Survey (PDF), no opt-in).
CFOs say cloud investments deliver the greatest measurable impact


Cloud investments are fueling new jobs throughout Canada


  • APIs are enabling persona-based user experiences in a diverse base of cloud enterprise apps. As of today there are 17,422 APIs listed on the Programmable Web, with many enterprise cloud apps concentrating on subscription, distributed order management, and pricing workflows. Sources: Bessemer Venture Partners State of the Cloud 2017 and 2017 Is Quickly Becoming The Year Of The API Economy. The following graphic from the latest Bessemer Venture Partners report illustrates how APIs are now the backbone of enterprise software.
APIs are fueling a revolution in cloud enterprise apps



Key Takeaways from Salesforce's Q1


Constellation Insights

While Salesforce dipped back into the red in its first quarter, after posting modest profits in the previous few, the company remains on a serious growth tear with revenue up 25 percent year-over-year. The full numbers are available here, but as usual, we will go through the earnings conference call and pull out the most relevant pieces of information and perspective (H/T to Seeking Alpha for the call transcript).

Einstein, the Straight Shooter: Salesforce's take on AI in the enterprise is Einstein, built from a collection of homegrown and acquired technologies. It's been pushing out Einstein capabilities across its various clouds. But for now, at least, one Einstein component, Guidance, remains in-house. Salesforce CEO Marc Benioff gave a preview of it on the call:

[W]e then have a piece of Einstein now that we've not yet rolled out to our customers called Einstein Guidance. So this is a capability that I use with my staff meeting, when I do my forecast and when I do my analysis of the quarter, which happens every Monday at my staff meeting like a lot of CEOs do, it's a very typical process, of course, we have our top 20 or 30 executives around the table. We talk about different regions, different products, different opportunities. And then I ask one other executive their opinion and that executive is Einstein. And I will literally turn to Einstein in the meeting and say, "Okay, Einstein, you've heard all of this. Now what do you think?"

And Einstein will give me the over and under on the quarter and show me where we're strong and where we're weak and sometimes will even point out a specific executive, which has done in the last three quarters and said that this executive is somebody who needs specific attention during the quarter. ... I think for a CEO, typically the way it works is, of course, you have various people, mostly politicians and bureaucrats, in your staff meeting who are telling you what they want to tell you to kind of get you to believe what they want you to believe. Einstein comes without bias.

Industries Going All In: Salesforce rolled out a vertical strategy several years ago, and president Keith Block gave a progress update on the call. Ten of the 15 largest telcos, 8 of the United States' 10 largest retailers, and nine of the top 10 wealth management firms "rely" on Salesforce, he said. For public sector, Block reported expanded relationships with the U.S. Army and Air Force, as well as a new deal with the state of Florida centered on tourism. 

Salesforce As Digital Transformation Driver: Benioff noted on the call that acquisitions such as SteelBrick for CPQ (configure, price, quote) capabilities are helping drive new Sales Cloud deals, particularly into existing customers. Overall, Salesforce is capturing a bigger piece of IT spending now and this is positioning it as a key player for digital transformation projects, Block said:

[W]e're able to say, "Okay, now the walls between sales, service and marketing are coming down." So now we have an opportunity to provide a 360-degree view of the customer with service and with marketing. And take that now one step further, as we've moved from systems of record to systems of engagement to now systems of intelligence. ... So it's an expansion of our capabilities and our opportunity to drive transformation with these customers.

 

Amazon Web Services Is Salesforce's 'Best Friend': Salesforce is moving some of its workloads to Amazon Web Services, a move that will help it shave costs and expand its global footprint. But the alignment seems much closer than a financial transaction, judging by this remark from Benioff:

I think at Salesforce, we really strongly believe that the enemy of my enemy is my friend, and I think that makes Amazon Web Services our best friend. 

As for that enemy? Benioff didn't actually say the word "Oracle," but he may as well have.


Google I/O: The Key Enterprise Takeaways


Constellation Insights

Google's I/O developer conference kicked off this week and, as in past years, generated a lot of news spanning both consumer and enterprise-oriented scenarios (and of course, in some cases that line is a bit blurry). Here's a look at the top takeaways from the event's announcements for CXOs to consider.

Not Mobile First, AI First

Artificial intelligence has been the hottest trend in tech for some time now, and fittingly was the dominant focus of I/O. Put simply, Google wants to dominate the AI discussion and is making major moves to succeed in doing so.

It introduced Google.ai, which ties together all of its AI efforts in one place. Aimed at private companies, individual developers and academics alike, it will focus on Google's AI research, tools and applied AI.

Google CEO Sundar Pichai unveiled a new project called AutoML, a neural network capable of designing neural networks. This notion has been a holy grail of sorts in the AI field, and Pichai says Google will make major strides on the relatively near horizon, writing in a blog post:

We hope AutoML will take an ability that a few PhDs have today and will make it possible in three to five years for hundreds of thousands of developers to design new neural nets for their particular needs. 

AI is informing how Google evolves its products in a fundamental way, and the shift applies to the tech industry as a whole, Pichai added.

We are now witnessing a new shift in computing: the move from a mobile-first to an AI-first world. ... Think about Google Search: it was built on our ability to understand text in webpages. But now, thanks to advances in deep learning, we’re able to make images, photos and videos useful to people in a way they simply haven’t been before. Your camera can “see”; you can speak to your phone and get answers back—speech and vision are becoming as important to computing as the keyboard or multi-touch screens.  

There is still a long way to go before we are truly an AI-first world, but the more we can work to democratize access to the technology—both in terms of the tools people can use and the way we apply it—the sooner everyone will benefit.

Google has its own commercial considerations for AI, of course, going beyond its core products. It is hoping to make Google Cloud Platform the go-to place for developing bespoke AI applications. Google's secret sauce for accomplishing that is its TPUs (Tensor Processing Units), specialized chips designed for machine-learning workloads.

The first generation of TPUs was introduced last year, but those chips focused on running Google's existing machine learning models more efficiently. Pichai announced that the second generation, dubbed Cloud TPUs, will be offered through Google Compute Engine later this year. Cloud TPUs not only run existing models but can also train new ones.

Google is also clustering the TPUs into what it calls "pods," which provide huge performance gains over past approaches. A new large-scale translation model once required a full day of training on 32 high-end GPUs; now it accomplishes the same thing "in an afternoon using just one eighth of a TPU pod," according to Google. TPUs will work in conjunction with TensorFlow, the machine learning framework Google open-sourced in 2015 to considerable success.
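The speedup Google cites reflects the nature of the workload: training a model boils down to repeating huge numbers of multiply-accumulate operations over tensors, which is exactly what TPUs accelerate in hardware. As a rough illustration of what "training" means at its core (a toy pure-Python sketch of gradient descent on a single linear neuron, not TensorFlow or TPU code), consider:

```python
# Toy illustration of the workload TPUs accelerate: model training
# is repeated multiply-accumulate over data. This sketch fits a
# single linear "neuron" y = w*x + b via gradient descent.
# (Illustrative only -- real TensorFlow code would build a graph
# of such operations at vastly larger scale.)

def train(data, lr=0.05, epochs=200):
    w, b = 0.0, 0.0
    n = len(data)
    for _ in range(epochs):
        grad_w = grad_b = 0.0
        for x, y in data:
            err = (w * x + b) - y   # prediction error
            grad_w += err * x / n   # mean gradient w.r.t. w
            grad_b += err / n       # mean gradient w.r.t. b
        w -= lr * grad_w            # descend along the gradient
        b -= lr * grad_b
    return w, b

# Fit y = 2x + 1 from four sample points; w and b approach 2.0 and 1.0.
samples = [(0, 1), (1, 3), (2, 5), (3, 7)]
w, b = train(samples)
print(round(w, 1), round(b, 1))
```

Every pass over the data here is a chain of multiplies and adds; scaling that loop to billions of parameters is what makes specialized hardware like TPU pods pay off.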

The company is running an alpha program for the TPUs and is also introducing the TensorFlow Research Cloud, which will make 1,000 TPUs available to researchers from both private industry and academia, provided they contribute back to the open-source community.

Instant Apps Go GA

First announced at last year's I/O conference, Android Instant Apps are now out of preview and available to all developers. Instead of making users download and install an app, Instant Apps actually stream to devices from Google Play. Later on, users can decide to install them permanently.

Naturally, Instant Apps aren't as powerful as installed ones, which have deeper access to the device, but they do include useful capabilities such as payments and location.

Instant Apps provide a middle ground between websites and full-featured apps. That's a useful tool for enterprises to have in the toolbox, whether for internal users or outreach to customers. Constellation Research VP and principal analyst Holger Mueller noted earlier this year that Instant Apps have security advantages, since installed apps would involve MDM (mobile device management) issues.

The question now is how much momentum Google can build for Instant Apps out of the gate. Instant Apps capabilities will ship with Android O, the next version of the mobile OS, but will also be compatible with previous versions—a must, given the rampant fragmentation in the Android ecosystem. 

Enterprises should take a look at how Instant Apps can fit into their overall mobility, marketing and internal IT strategies. Beyond the potential use cases, Instant Apps give IT leaders a new way to balance development resources; currently supported, full-blown apps could be replaced with lighter-touch Instant Apps requiring less overhead for IT.

Google Steps Toward HR with Google for Jobs

There has been much speculation about which direction Google will head in the enterprise application market since the arrival of former VMware head Diane Greene as SVP of cloud. While it's not clear that Greene's fingerprints are on it, a new Google service called Google for Jobs brings the company into the orbit of HR and HCM software. Pichai described the new service in a blog post:

[A]lmost half of U.S. employers say they still have issues filling open positions. Meanwhile, job seekers often don’t know there’s a job opening just around the corner from them, because the nature of job posts—high turnover, low traffic, inconsistency in job titles—have made them hard for search engines to classify. Through a new initiative, Google for Jobs, we hope to connect companies with potential employees, and help job seekers find new opportunities.

As part of this effort, we will be launching a new feature in Search in the coming weeks that helps people look for jobs across experience and wage levels—including jobs that have traditionally been much harder to search for and classify, like service and retail jobs. 

Google has already worked with companies such as LinkedIn and Glassdoor to integrate them with Google for Jobs. What will be interesting to watch are potential partnerships down the road with enterprise HR and HCM vendors.

 
