When AI and Personal Information Collide: The Privacy Implications

Constellation Insights

Stanford University professor Michal Kosinski made waves recently for research he conducted that suggested AI can determine a person's sexual orientation based on pictures of their face. Now Kosinski is going further, saying that AI could also pinpoint someone's political leanings, level of intelligence and other personal data points based on photographs, as the Guardian reports:

Faces contain a significant amount of information, and using large datasets of photos, sophisticated computer programs can uncover trends and learn how to distinguish key traits with a high rate of accuracy. With Kosinski’s “gaydar” AI, an algorithm used online dating photos to create a program that could correctly identify sexual orientation 91% of the time with men and 83% with women, just by reviewing a handful of photos.

Kosinski’s research is highly controversial, and faced a huge backlash from LGBT rights groups, which argued that the AI was flawed and that anti-LGBT governments could use this type of software to out gay people and persecute them. Kosinski and other researchers, however, have argued that powerful governments and corporations already possess these technological capabilities and that it is vital to expose possible dangers in an effort to push for privacy protections and regulatory safeguards, which have not kept pace with AI.

Kosinski is also known for his controversial work on psychometric profiling, including using Facebook data to draw inferences about personality. The data firm Cambridge Analytica has used similar tools to target voters in support of Donald Trump’s campaign, sparking debate about the use of personal voter information in campaigns.

There is much more to the Guardian's full report, which is well worth a read.

Analysis: AI advancements test, but don't rise above privacy laws

Privacy regulators increasingly recognize that the creation of personal information through computer algorithms is a form of collection, says Constellation VP and principal analyst Steve Wilson, who leads the firm's coverage of digital security and privacy issues: "If a computer program sets a flag in a database saying 'this person is right wing,' or 'this person is LGBT,' then that represents an act of collection of personal information."

This practice can be termed algorithmic collection, or synthetic PII, and is treated exactly the same under privacy laws as collecting the information by getting subjects to fill out a questionnaire. In some jurisdictions, such as Australia, PII related to sexual preference, health, political beliefs and biometrics are classified as sensitive and given extra protections, Wilson says.

"Sensitive PII cannot be collected without consent, so automated algorithmic collection of a person's sexuality or politics is a huge problem, even if it's done for research purposes," he says. "This is another nice example of how technology does not outpace the law. If most people are intuitively uneasy about computers working out their sexuality or other deep, even unconscious traits, then they can get some comfort from the fact that existing laws put restraints on this type of action."

The fundamental point is that there are limits to what personal information should be collected about people, and conditions on how it's collected, Wilson adds. "While new technology can create new ways to break the law, the fact is that privacy laws themselves remain as relevant as ever."


Today's Tip: Know When To Automate With Artificial Intelligence

Six Factors For Powering AI Driven Smart Services

Recent client conversations indicate a desire to design new AI Driven Smart Services. The rush to incorporate artificial intelligence into processes often requires a deeper examination of which services should be AI enabled. Constellation’s latest framework for augmenting humanity encompasses six factors (see Figure 1):

  1. Repetitiveness. The more often a process is repeated, the more likely it should be AI powered. One-off and custom processes with minimal repetition are lower-priority candidates for AI.
  2. Volume. When the volume of transactions and interactions exceeds human capacity, the smart service should be AI powered. Volumes within human capacity will remain human powered.
  3. Complexity. Good candidates for AI include both processes whose complexity exceeds human comprehension and simple tasks that AI can optimize.
  4. Physical presence. Processes that require a heavy physical presence will most likely remain human powered. However, processes that put lives in jeopardy are great candidates for AI and automation. In general, low physical-presence requirements play well to AI-powered approaches.
  5. Time to complete. Demanding completion-time requirements favor AI-powered approaches; processes with timelines comfortably within human capability will remain human powered.
  6. Nodes of interaction. Simple interaction nodes lean human powered; AI serves complex, high-volume nodes of interaction best.
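The six factors above can be turned into a simple screening score for candidate processes. This is a hypothetical sketch, not part of Constellation's published framework: the 0–5 rating scale, the equal weighting, and the function and variable names are all illustrative assumptions.

```python
# Hypothetical scoring sketch for the six-factor framework described above.
# The factor names come from the article; the 0-5 scale and equal weights
# are illustrative assumptions, not part of the framework itself.

FACTORS = [
    "repetitiveness",     # higher = more repeated = stronger AI candidate
    "volume",             # higher = further beyond human capacity
    "complexity",         # rate high for "beyond comprehension" or "trivially optimizable"
    "physical_presence",  # higher = more physical = weaker AI candidate (inverted below)
    "time_pressure",      # higher = tighter completion-time requirements
    "interaction_nodes",  # higher = more complex, higher-volume interaction nodes
]

def ai_candidate_score(ratings: dict) -> float:
    """Average the six factor ratings (each 0-5), inverting physical presence."""
    total = 0.0
    for factor in FACTORS:
        value = ratings[factor]
        if factor == "physical_presence":
            value = 5 - value  # heavy physical presence lowers the score
        total += value
    return total / len(FACTORS)

# Example: a highly repetitive, high-volume back-office process.
invoice_matching = {
    "repetitiveness": 5, "volume": 5, "complexity": 2,
    "physical_presence": 0, "time_pressure": 4, "interaction_nodes": 3,
}
print(ai_candidate_score(invoice_matching))  # prints 4.0; higher = stronger candidate
```

Scoring every candidate process this way gives a rough ranking to start the prioritization conversation; the real framework calls for judgment on each factor rather than a single number.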

Figure 1. Constellation’s AI Powered Framework

The Bottom Line.  Apply The AI Powered Framework To Smart Service Prioritization

Six factors play a significant role in identifying which AI-driven smart services deliver the greatest opportunities. Early adopters have prioritized business processes using the Constellation business hierarchy of needs: align candidates to the five categories of regulatory compliance, operational efficiency, revenue growth, strategic differentiation, and brand. Keep in mind that AI enablement requires a strong data strategy, deep data governance, mature business process optimization, and a data-driven design point.

Your POV.

So what will you automate first with AI?  Do you have a digital transformation strategy?  

Please let us know if you need help with your Digital Business transformation efforts. Here’s how we can assist:

  • Developing your digital business strategy
  • Connecting with other pioneers
  • Sharing best practices
  • Vendor selection
  • Implementation partner selection
  • Providing contract negotiations and software licensing support
  • Demystifying software licensing


News Analysis - Oracle Unveils New Programs that Transform how Customers Buy and Consume Cloud – Gloves Off

Even though we are only about two weeks away from Oracle’s OpenWorld conference in San Francisco, Oracle is pushing the gas pedal. Yesterday Oracle announced its SPARC M8 chip and architecture – today it was Chairman and CTO Larry Ellison’s turn: new pricing.

The press release can be found here. So, let’s dissect in the usual fashion:
At a live event today, Oracle Executive Chairman of the Board and CTO Larry Ellison announced new programs that lower costs by delivering increased automation and flexibility, and enable customers to get more value from their existing Oracle software investments. The new Oracle Cloud programs include Bring Your Own License to PaaS and Universal Credits.
MyPOV – Nice summary. Pricing simplification has been on Ellison’s mind for a long time. In his view, easier pricing accelerates sales processes, something he really wants. This move reminded me of Ellison pushing through a single global price list, published on Oracle's website and connected with the Oracle Store, in the dot-com era. It was revolutionary then and, in hindsight, visionary.
“We are completely transforming the way all companies buy and use cloud by providing flexibility and choice,” said Ellison. “Today, we combined the lowest prices with the highest performance and more automation to deliver a lower total cost of ownership for our customers.”
MyPOV – Rationale is key to Oracle and Ellison – and his presentation was more about how automation of Oracle database processes allows Oracle to lower prices and compete. He even said at one point that Oracle would make cost an SLA item…
While organizations are eager to move to the cloud, many have not due to obstacles that have forced them to choose between flexibility and lower costs. They have been challenged by the complexity of the cloud and the inability to rebalance spend across different services. Organizations have also been constrained by limited visibility and control over cloud spend. Until now, they have been unable to fully leverage their on-premises software investments in the cloud, having been limited to IaaS services or sacrificing key database features at the PaaS layer. Oracle’s new cloud programs address customers’ cloud adoption challenges by improving and simplifying the way they purchase and consume cloud services.
MyPOV – Always good to see simplification. In a more complex and accelerated environment, enterprises need all the simplification they can get. And while pricing complexity is one problem, product fit and evaluation is an even bigger one. But Oracle is right – once enterprises have established the services they want and need, budget risk is one of the main preoccupations of CxOs. Though the ‘horror’ stories of the early cloud days – enterprises burning through a month’s budget in a few days because someone didn’t turn something off – are over, it remains a concern to this day. But price is only half the equation – product/service selection remains a challenge.
Bring Your Own License to Oracle Database PaaS: Delivering Increased Value Through License Mobility
Currently, customers can bring their on-premises licenses to Oracle IaaS. Today, Oracle is expanding the offering by enabling customers to reuse their existing software licenses for Oracle PaaS, including Oracle Database, Oracle Middleware, Oracle Analytics, and others. Customers with existing on-premises licenses can leverage that investment to use Oracle Database Cloud at a fraction of the old PaaS price. Running Oracle Database on Oracle IaaS is faster and offers more features than Amazon, delivering the industry’s lowest total cost of ownership. Additionally, customers can further reduce management and operational costs required for on-premises maintenance by taking advantage of this PaaS automation.

MyPOV – Good to see Oracle expanding its BYOL program. That lowers the hurdle for customers to move to the Oracle cloud – and helps against the competition. That is, in Oracle’s eyes, most prominently AWS – so they are mentioned here… Oracle has gone to great lengths to show that its IaaS Gen2 is cost effective – now it is getting aggressive on the pricing side. Always good to see vendors living up to their announcements.
Universal Credits: Flexible Buying and Consumption Choices for Oracle’s PaaS and IaaS Services
Oracle is introducing Universal Credits, the industry’s most flexible buying and consumption model for cloud services. With Universal Credits, customers have one simple contract that provides unlimited access to all current and future Oracle PaaS and IaaS services, spanning Oracle Cloud and Oracle Cloud at Customer. Customers gain on-demand access to all services plus the benefit of the lower cost of pre-paid services. Additionally, they have the flexibility to upgrade, expand or move services across datacenters based on their requirements. With Universal Credits, customers gain the ability to switch the PaaS or IaaS services they are using without having to notify Oracle. Customers also benefit from using new services with their existing set of cloud credits when made available.

MyPOV – The real simplification comes here. And while pre-pay programs have existed for IaaS for a while, they are usually tied to dedicated products and services. Having portability across products and services is important for customers, especially in the early phases of their cloud journey. Over time their footprint solidifies – but given where most Oracle customers are now, this move is a significant reduction in complexity. The option to use the credits for Oracle Cloud at Customer is a differentiator versus pure public-cloud IaaS vendors. It caters well to customers who are still cloud-concerned, who are in geographies Oracle’s IaaS Gen 2 may not have reached yet, or where data residency requires in-country operation of IT.
 
Behind the scenes it means Oracle must have a lot of capacity. Dedicated service/product pre-pay helps IaaS vendors plan and build out their capacity. When customers can switch with little notice, an IaaS vendor must be confident it has enough capacity. Theoretically, too much demand could create wait lists – but that would be bad for business… an area to watch going forward.
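The mechanics of a universal-credit model as described above can be sketched as a single pre-paid pool drawn down by any service, with switching requiring no new contract. This is an illustrative sketch only: the `CreditPool` class, method names, services, and rates are hypothetical, and Oracle's actual metering and contract terms differ.

```python
# Illustrative sketch of a universal-credit consumption model: one pre-paid
# pool of credits, consumable by any PaaS/IaaS service, with per-service
# usage tracked for visibility. All names and rates here are hypothetical.

class CreditPool:
    def __init__(self, credits: float):
        self.balance = credits
        self.usage = {}  # per-service consumption log

    def consume(self, service: str, hours: float, rate: float) -> None:
        """Deduct hours * rate credits for any service drawing on the pool."""
        cost = hours * rate
        if cost > self.balance:
            raise RuntimeError("credit pool exhausted")
        self.balance -= cost
        self.usage[service] = self.usage.get(service, 0.0) + cost

pool = CreditPool(10_000)
pool.consume("database_paas", hours=100, rate=8.0)  # 800 credits
pool.consume("compute_iaas", hours=500, rate=2.0)   # 1,000 credits; no new contract needed
print(pool.balance)  # prints 8200.0
```

The contrast with per-service pre-pay is that the pool, not the service, is the contractual unit, which is exactly what makes capacity planning harder for the vendor.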

 

Overall MyPOV

Always good to see simplification. Customers who are uncertain on which products and services they can consume can now start budgeting for instance for trials and proof of concept. BYOL is powerful for customers ready to move to cloud. So overall a good move by Oracle.

Looking behind the scenes, we can assume a few things: Oracle is confident IaaS Gen 2 works. It just poured another $2B of CAPEX in in Q1 2018 – the first time Oracle has spent $2B+ on CAPEX in a quarter in recent years (or ever – I did not go back). All these investments need load to justify them. If pricing was holding customers up, it has now been simplified. And the move gives the Oracle sales force an instant chance to have a cloud conversation with customers – given the BYOL option and the public cloud and Cloud at Customer options. The executives at 500 Oracle Parkway are making it as easy as possible for the sales force and customers to move to cloud and generate cloud revenues. That Oracle is under pressure to show cloud growth (like all major enterprise vendors) is no surprise, and the last earnings call did not lower that pressure, despite good results.

For the industry, it’s key to watch how well Oracle can move customers to cloud (or Cloud at Customer). For Oracle competitors, massive potential could disappear. As Constellation estimates that 30-40% of on-premises systems are either Oracle systems or systems closely connected to an Oracle system, this is a key development to watch in determining the overall move of on-premises load to the public cloud (the other one being the VMware / AWS partnership). If Oracle succeeds with this, we may see a fast move of almost all enterprise load to the cloud. The reason: when a year of server refresh goes missing, most on-premises IT becomes very expensive – so expensive that boards and CEOs ask how fast the rest can move… Stay tuned. It’s going to be a crucial fall.


For more on the event - check out the Storify embedded below. 
 

Digital Transformation Digest: Kohl's Cozies Up to Amazon, Hitachi Forms Vantara for Digital Business, Oracle Points SPARC to Its Cloud

Constellation Insights

Kohl’s cozies up closer to Amazon: Just weeks after announcing it would welcome Amazon smart home experience centers at a number of its department stores, Kohl's is adding Amazon item return services in 82 locations starting next month. It's a continuation of Amazon's brick-and-mortar strategy, which took a big leap forward with the acquisition of Whole Foods, and could conceivably lead to a purchase of Kohl's.

Here are some key details from the announcement:

“We are thrilled to launch this unprecedented and innovative concept, allowing customers to bring in their unpackaged Amazon returns to Kohl’s and we will pack them, ship them, and return them to Amazon for free,” said Richard Schepp, Chief Administrative Officer. “This is a great example of how Kohl’s and Amazon are leveraging each other's strengths – the power of Kohl’s store portfolio and omnichannel capabilities combined with the power of Amazon’s reach and loyal customer base.”

POV: There are a number of caveats to consider. First, the 82 Kohl's stores that will feature Amazon returns are all in Chicago and Los Angeles. One would expect, however, that the initial rollout is a test run for eventually adding the service to most or all of Kohl's stores. Also, the announcement notes that "eligible" Amazon items will be accepted as returns; it's not clear what the limitations are, but for free, who can complain?

For large retailers like Kohl's, increasing foot traffic is crucial even as they build out online revenue streams. Amazon return centers certainly could drive that foot traffic and result in more in-store sales.

Kohl's has had more success than other department chains in adjusting to omnichannel realities. It also has much larger stores than Whole Foods, raising possibilities for Amazon that a typical Whole Foods store footprint cannot. Acquiring Kohl's, which has a market capitalization of about $7 billion, would be practically trivial for Amazon. While not a lock, an eventual deal looks like a strong possibility.

Hitachi creates Vantara unit for digital business: There is a new—in a sense—player in big data and digital transformation consulting services, with Hitachi's launch of Vantara, a new unit that combines Hitachi Data Systems, Hitachi Insight Group and Pentaho. Here's how Hitachi describes the opportunity for Vantara and customers:

The market opportunity for mission-critical data solutions has never been greater. Data has become a business's greatest asset—if they can extract actionable insights from it. Data holds the key to new revenue streams, better customer experiences, improved market insights and lower costs of doing business. However, a comprehensive offering has yet to emerge that combines both OT and IT expertise to uncover its true potential—until now.

Hitachi Vantara will continue to provide superior infrastructure and analytics technologies that enterprises rely on for their mission-critical data in their data centers, in the cloud and at the edge of new innovations. The new company is targeting the emerging IoT market opportunity, in which there is no clear winner yet.

Hitachi has developed its own IoT platform, Lumada, which will be part of Vantara. The new entity is going after high-end business, focusing on the global Fortune 1000.

POV: Hitachi may have big ambitions for Vantara but the likes of IBM and Dell EMC are competing for the same business. Where Hitachi says it has an advantage is with its operational technology background, which Vantara engagements will couple with IT know-how. By any measure, Vantara is a big move by a big player, and one that bears watching.

Oracle delivers SPARC M8 systems, clarifies Solaris's future: A couple weeks in advance of OpenWorld, Oracle has announced a new series of servers based on the SPARC M8 microprocessor. It also said it plans to support the Solaris OS until at least 2034.

SPARC M8 chips include advancements for software-on-silicon based security measures; 2x faster encryption than x86 systems and SPARC M7; and superior performance for Oracle database and Java workloads compared to x86 and M7, according to the announcement:

"Oracle has long been a pioneer in engineering software and hardware together to secure high-performance infrastructure for any workload of any size," said Edward Screven, chief corporate architect, Oracle. "SPARC was already the fastest, most secure processor in the world for running Oracle Database and Java. SPARC M8 extends that lead even further."

POV: Oracle's hardware revenue fell 5 percent year-over-year in its first quarter to $943 million. But it's doubtful Oracle has true hopes for on-premises hardware sales as a growth story. Rather, it is betting that innovation in the SPARC platform can give its cloud services a performance and efficiency edge.

As for Solaris, the lengthy support commitment should please customers with legacy Solaris workloads, but it's not clear how many resources Oracle will pour into the OS going forward. Sharp eyes at the Register noted that a number of OpenWorld sessions focus on moving Solaris workloads to the cloud—presumably, its own.

AWS adds per-second billing: The cloud pricing wars just got a new wrinkle, with Amazon Web Services' introduction of per-second billing. Here's how AWS chief evangelist Jeff Barr describes the value proposition in a blog post:

Some of our more sophisticated customers have built systems to get the most value from EC2 by strategically choosing the most advantageous target instances when managing their gaming, ad tech, or 3D rendering fleets. Per-second billing obviates the need for this extra layer of instance management, and brings the costs savings to all customers and all workloads.

While this will result in a price reduction for many workloads (and you know we love price reductions), I don’t think that’s the most important aspect of this change. I believe that this change will inspire you to innovate and to think about your compute-bound problems in new ways.

Per-second billing goes into effect in all AWS regions on October 2, for Linux instances "that are newly launched or already running," Barr wrote. Amazon is also requiring a one-minute minimum charge per instance.
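The savings from per-second billing, with the one-minute minimum Barr describes, are easy to see with a short-lived instance. A minimal sketch: the $0.10/hour rate below is an illustrative figure, not a quoted AWS price, and the function names are this example's own.

```python
# Rough comparison of per-hour vs. per-second EC2 billing for a short-lived
# instance, applying the one-minute minimum charge mentioned above.
# The hourly rate is an assumed illustrative figure, not an AWS price.

HOURLY_RATE = 0.10  # USD per instance-hour (assumption)

def per_hour_cost(runtime_seconds: int) -> float:
    """Legacy model: every started hour is billed in full."""
    hours_billed = -(-runtime_seconds // 3600)  # ceiling division
    return hours_billed * HOURLY_RATE

def per_second_cost(runtime_seconds: int) -> float:
    """New model: bill by the second, with a 60-second minimum charge."""
    billed_seconds = max(runtime_seconds, 60)
    return billed_seconds * (HOURLY_RATE / 3600)

runtime = 5 * 60  # a five-minute batch job
print(per_hour_cost(runtime))               # a full started hour is billed
print(round(per_second_cost(runtime), 4))   # prints 0.0083: only 300 seconds billed
```

For workloads that spin instances up and down frequently (the gaming, ad-tech, and rendering fleets Barr mentions), the per-hour rounding disappears entirely, which is where most of the savings come from.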


POV: The move is both good for AWS customers as well as AWS, if it can recycle instances faster and loan them out more often per minute, says Constellation Research VP and principal analyst Holger Mueller. "It's like the Frankfurt airport Sheraton, which usually has 120 to 130 percent utilization, because people check in and check out of the same rooms multiple times in 24 hours," he says.

Per-second pricing isn't currently available on other major clouds, but you can expect AWS's competitors to follow suit soon.


Rethinking IT Service Management in the Era of Cloud, Customer Experience, and Design Thinking

Most practitioners would agree that there's been a steady shift in IT service management over the last decade. The stagnation of ITIL combined with the rise of agile methods, devops, public cloud, and even Shadow IT has had a growing and inexorable impact on how we manage our IT services today.  The customer shift in expectations for service management is clear too: Be more responsive, be easier to consume, move faster, and lead the business from the front when it comes to technology services. As a result, it's clear now that the evolution of the practice has reached a significant inflection point.

Thus, in an age where crafting easy-to-use, engaging customer experiences with techniques such as empathy-driven design thinking has become best practice for service design, the old and decidedly staid world of ITSM is getting a reboot. Certainly, leading vendors like ServiceNow and BMC have helped make this shift possible with increasingly consumerized interfaces, customer communities, and self-service capabilities, but this transformation is about much more than just the tools. There's a new sense that service management has to grow up to lead the business itself in how it adopts and consumes technology, while becoming a prime customer of the development process itself.

To be clear, other groups in the business -- including the enterprise architecture team, the Chief Digital Officer, and even the marketing technology groups -- are also busy doing the same thing. However, they are not positioned in the center of service delivery itself and don't have the infrastructure, mandate, or experience in service design, deployment, and management. But service management groups must think big and seize the initiative or risk being relegated to the margins of shared services.

The Next Generation of Service Management: Beyond ITIL with Customer Experience, Design Thinking, and Shadow IT

Service Management Now Informed by New Developer Methods and High Quality Consumer Experiences

Pushed forward by new ideas from the development side of IT, service management is becoming much more iterative, proactive, and customer-focused. This is helped along by the aforementioned new solutions that take the friction out of service management by enabling high degrees of ease-of-use and self-service, while the development side of the house integrates much more closely with ITSM, then uses fast feedback cycles to rapidly iterate services with agile and devops methods until the right solution is refined out of the initial proofs of concept or prototypes.

IT service management pundits such as Dennis Drogseth have dubbed this shift away from a reactive service desk and towards a more integrated, adaptive, and forward-looking service model "Service Management 2.0." While the trend of adding the "2.0" suffix is now outdated, the point is a correct one: traditional service management has to evolve to meet business requirements more effectively using emerging new methods. The practice of ITSM as codified in ITIL 3.0 is not only too heavyweight today; it also fails to reflect the countless lessons learned in usability and customer journey design in the decade since it was last updated. Design thinking, an increasingly popular way of creating customer-centric technology services, wasn't even on IT's radar when the most recent version of ITIL was developed.

Rethinking Service Management Using Today's Digital Processes and Lessons Learned

For my own part, I recently had an opportunity to widely survey the state of the art in ITSM when I gave an opening keynote last month at the 20th annual Service Management conference in Melbourne, Australia. I also gave two highly informative deep-dive next-generation ITSM workshops to nearly 40 top service management professionals. In the process, I encountered a pleasantly surprising number of practitioners who were hungry for a new model of service delivery beyond, or complementary to, the traditional ITIL model. Most are looking at incorporating agile into service management, and some were closely evaluating devops as part of the process. But one thing was clear: the practice is not evolving as fast as the marketplace or our stakeholders' expectations, despite an urgent need across the industry to move faster.

The ITSM workshops I facilitated last month were particularly revealing, as we jointly developed a new model for ITSM that I believe will a) resonate with practitioners, b) look familiar enough to be readily understandable, and c) deeply incorporate the most vital new trends mentioned above. For lack of a better term, I'll call this new view Service Management 2020 — both for the target date, the end of the decade, by which ITSM groups should overhaul their function, and for the 20:20 hindsight we now have, with several decades of ITSM experience to see what worked and what needs to change.

A Vision for Service Management 2020

  • The customer experience is paramount. Where legacy ITSM is process-centric, the new view is that measuring and managing the resulting impact and quality of the service management customer's journey is what matters most of all. While design thinking is not necessarily mandatory in service design and 'service devops', ITSM must adopt some effective method to map out the customer experience, ensure stakeholder needs are being met, and use data from the field to verify it's the right journey through service management (and, the hard part, keep it updated).
  • Service management is just as important a service customer as the end user. This is the signature lesson of devops and continuous delivery: operations and development must collaborate closely to iterate towards the right solution, one optimized for a) the customer experience and b) operations and service management. Both are vital and essential stakeholders that service management must satisfy.
  • Agile and devops must be incorporated into service management. Older legacy service development processes are slow and wasteful, don't course-correct quickly enough, and won't deliver an adequate user experience or sufficiently meet business requirements. A key point: this new generation of processes uses end-to-end visibility and collaboration to get information from the customer back to the development groups as quickly as possible from rapidly iterating builds, so the right solution can be created. ITSM tends to be siloed from these processes, so it must leave that silo as soon as possible and assume a larger role in the IT value chain.
  • IT service management must evolve into business service management. The end game is not so much about IT as about how digital affects the very way the business operates and thinks. IT is now a key component of almost all business services, and thus shared services functions can and should, in many organizations, focus on business digitization as our organizations become technology companies.
  • Move towards 90% automation as soon as possible. The reality is that ITSM budgets tend to be tight and don't grow quickly, even as responsibilities mount and a new generation of service management arrives that must be dealt with effectively. As AI-powered support, chatbot-based ITSM services, community-powered self-help, and other automated aids arrive, service management professionals should use them to free up time and resources for the strategic transformation activities on this list, which will take at least the next three years to address properly.

There is little doubt that we are entering one of the most exciting times in the field of service management, yet there is much work still to do to pathfind the way. The vision of Service Management 2020 is one that I believe will resonate with most practitioners as they attempt to modernize one of the most vital technology capabilities within our organizations. Shifting the model of service management itself in the way described above will also make ITSM more strategic. Practitioners should be ready to communicate and educate upwards to ensure they gain C-suite support for their efforts to evolve into a proactive digital business service management function.

Continuing the Discussion

Please add your comments on the future of ITSM below. You can also reach me via email: dion (at) ConstellationR (dot) com or @dhinchcliffe on Twitter.

Also, please let us know if you need assistance with your service management transformation efforts. Here’s how we can help:

  • Developing your ITSM strategy and transformation plans
  • Connecting with other service management peers and leaders
  • Accessing the latest service management best practices
  • Understanding the service management vendor space
  • Identifying options for implementation partners
  • Developing and validating digital transformation roadmaps and playbooks
  • Providing advisory and education to IT executives, CXOs, and boards

Additional Reading

Digital (Service) Transformation and the Leadership Quandary

Rethinking Field Service Management in Digital Business

Systems of Engagement and Enterprise Business Architecture

The New CIO Mindset


Digital Transformation Digest: Chambers Ending An Era at Cisco, Vertica 9 Unveiled, IBM's New Cloud Data Migration Service



John Chambers' legacy at Cisco: After more than 20 years as either executive chairman or CEO of Cisco, John Chambers is stepping back from his duties. Chambers will not stand for reelection this December to Cisco's board, which intends to name CEO Chuck Robbins chairman as his successor.

At the time Chambers was appointed CEO in 1995, the networking giant had $1.2 billion in revenue. It now generates nearly $50 billion annually, driven by an eye-popping 180 acquisitions during Chambers' tenure as CEO. That growth strategy has continued under the leadership of Robbins, who took the CEO job in 2015.

Not every acquisition has been a success for Cisco. Critics often point to Chambers' decision to abruptly kill Flip, the consumer-oriented camcorder Cisco bought for $590 million, as an example of a misfire. Significant deals made under Robbins' watch include the $1.4 billion purchase of Jasper, maker of an IoT platform, and the $3.7 billion Cisco plunked down for application performance monitoring vendor AppDynamics.

Revenue has fallen for the last seven quarters, but Cisco has been beating analyst estimates for earnings per share.

POV: It's not as if Chambers' departure comes as any surprise, given it's been two years since he stepped down as CEO. (In the meantime, Robbins has overseen a retrenchment of Cisco's strategy with a focus on next-generation networking and multi-cloud management.) But it still marks the end of an era.

"John Chambers wrote the playbook for massive growth by acquisition in high tech," says Constellation founder and CEO R "Ray" Wang. "His leadership over the years at Cisco was unparalleled in driving scale, improving margins, and leading the market in financial engineering. His legacy will be known as one of the legendary Silicon Valley leaders during the golden age of networking."

The only downside will be the highly competitive, Game of Thrones-like environment Chambers is leaving behind. "That will need some healing under Chuck Robbins to reinvigorate the culture," Wang says.

Vertica 9 unveiled post-Micro Focus acquisition: Earlier this month, HPE completed the $8.8 billion spinoff of its software assets to Micro Focus. Now the latter has taken the wraps off Vertica 9, the latest version of the analytics database platform. Here are the key details from its announcement:

Vertica provides organizations with a single, unified analytical database that supports all major cloud platforms, all popular data formats, enhanced integrations with Spark and Kafka and an analyze-in-place, unified architecture that enables businesses to monetize their data assets with cloud elasticity – regardless of data location.

The new release triples load performance, dramatically increases query performance with Flattened Tables, and extends concurrency by up to 60 percent. In addition, Vertica 9 natively integrates with key ecosystem technologies and open source innovation, including Microsoft PowerBI, Cloudera Manager and Apache Spark 2.1.

Vertica has also added support for Google Cloud Platform in this release, and is rolling out a beta version of its Eon Mode. This separates compute and storage, allowing for just-in-time provisioning on analytics jobs, which can save customers money.

In addition, Vertica 9 features a new set of machine learning algorithms, additional data-prep tools and a new writer tool for Parquet, the columnar storage format associated with the Hadoop Distributed File System.

POV: Vertica 9 will be generally available in October. That's roughly a year after the release of Vertica 8, timing that suggests the Micro Focus spinoff didn't cause excessive distractions at the product engineering level.

The beta release of Eon Mode represents an area where Vertica is playing catch-up to others in the market, says Constellation VP and principal analyst Doug Henschen. Snowflake Computing was among the pioneers of separating compute and storage when it was founded in 2012, and it has since been followed by Teradata with its IntelliFlex architecture, Henschen adds.

This separation will ease flexible cloud deployment, but Vertica 9 also makes it easier to deploy on the AWS, Azure and Google clouds, by way of cloud-native marketplaces/launchers in bring-your-own-license (BYOL) approaches, Henschen says.

However, Vertica still doesn’t offer its own Database as a Service (DBaaS) offerings. Constellation sees DBaaS as increasingly popular, as these options tend to be highly automated and save customers from having to deal with routine and repetitive database admin, patching and software-update tasks.

Vertica remains a popular choice for its massive scalability and advanced analytical capabilities, often showing up as the embedded data platform behind third-party SaaS offerings, such as Datorama, Domo and GoodData, Henschen adds. The Eon architecture and streamlined BYOL options are positive moves, but as Henschen notes in his Constellation ShortList for Hybrid and Cloud-Friendly RDBMS, getting into the thick of the hybrid cloud competition demands multi-cloud database services, preferably managed by the database provider.

The machine learning advancements and other new features extend Vertica's capabilities for cloud and IoT use cases, but they were put in place under HPE's ownership, Henschen notes.

While Vertica has synergies with the Autonomy search and machine learning platform Micro Focus also acquired from HPE, the rest of the portfolio focuses on DevOps, hybrid IT, security and risk management. Although Micro Focus officials have characterized Vertica as a growth engine for the company, it's possible Vertica could be spun out yet again, he adds: "I’m looking forward to seeing what the new Micro Focus does with this valuable asset."

IBM rolls out physical cloud migration offering: Bandwidth remains an obstacle when it comes to moving large data sets to the cloud. To get around the problem, vendors including Amazon Web Services and Google have been pushing physical data migration options—in AWS's case, it's a tractor trailer called Snowmobile, albeit one aimed at petabyte-scale data sets.

Now IBM is getting in on the trend with Mass Data Migration, a service that uses a $395 portable storage device with up to 120TB of capacity. The devices include 256-bit encryption and are shipped via UPS next-day air. It's possible for the devices to be sent out, their data migrated to IBM's cloud, and returned to the customer within a week, according to a statement.

IBM claims that it is offering more storage per dollar compared to competing products. The devices are available in the U.S. now and in the European Union soon.
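To see why vendors bother with physical shipping at all, a rough back-of-envelope calculation helps; the 1 Gbps sustained link speed below is an assumption for illustration, not a figure from IBM's announcement.

```python
# Time to move a full 120 TB device over the network vs. shipping it.
capacity_bits = 120e12 * 8   # 120 TB expressed in bits
link_bps = 1e9               # assumed sustained 1 Gbps connection
transfer_seconds = capacity_bits / link_bps
transfer_days = transfer_seconds / 86400
print(round(transfer_days, 1))  # roughly 11.1 days
```

Against that, a round trip of ship, migrate, and return within a week looks competitive, and faster links are often shared, metered, or simply unavailable at the data's location.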

POV: Network speeds are too slow to move customer data to the cloud, so IaaS providers have to create these rugged temporary storage appliances to help with the process, says Constellation VP and principal analyst Holger Mueller. "The interesting question going forward will be whether these are going to be 'dumb' storage devices, or easy-to-deploy, rugged servers that can capture data—e.g. at the IoT edge—and offer lightweight processing on site," he adds. "The good news for customers is they are getting more choices, and easier and faster ways to move to the public cloud."


Event Report - SAP SuccessFactors SuccessConnect - New Leadership, Old Challenges


We had the opportunity to attend SuccessConnect, the yearly SAP SuccessFactors user conference, held August 29th to 31st, 2017, at the Cosmopolitan in Las Vegas. The conference was well attended; SAP claimed over 3,500 attendees, better than the 2016 edition.
 
 
So take a look at my musings on the event here: (if the video doesn’t show up, check here)
 
 
 
No time to watch – here is the 1-2 slide condensation (if the slide doesn’t show up, check here):

 
 
Want to read on? Here you go: 

Always tough to pick the takeaways – but here are my Top 3:

Executive Changes. New leadership is in place; now it is time to pick up speed. As documented, SuccessFactors has new leadership at the board, at the helm and at the product level. Accordingly, Rob Enslin, Greg Tomb and newly hired product, engineering and delivery leaders Amy Wilson and James Harvey were on hand for the keynote. Executive transitions are never easy, and while the four executives did an OK job during the keynote, they could have done better. At the end of the day, users attend user conferences to learn about the product, and that part was thinner than at previous SuccessConnect editions and certainly thinner than at competitor events. Over time I am sure areas of communication will be delineated and, most importantly, all executives will have set the course and roadmap for SuccessFactors going forward. Well-intended attempts at making the keynote more personal and approachable came across as scripted, and while I know it was all genuine, having known most of the executives for many years, the team failed to pick up where the last SuccessConnect in the US (from a regional perspective) and in Europe / London (from a temporal perspective) left off.
 
Traditional broad push, more focus on Talent Management with Onboarding. A lot is happening at SuccessFactors: the vendor claimed 2,500 product developers working on the product, a number that seemed high based on my napkin-quality calculation. The new highlight was an Onboarding module, to be built on the SuccessFactors MDF framework. It is good to see MDF leveraged, though Onboarding is not a critical piece of automation for Talent Management. We saw the new iOS mobile application, heard multiple times how it was designed with Apple, and it is a good implementation of a mobile HR application. Last year's pledge not to let the Android version fall behind iOS again was missed; it probably got lost in the transition. And SAP SuccessFactors is (finally) moving to HANA, a move that has been expected for a while, though the scope is not fully clear; we heard everything from 'fully' to a move of only the analytics layer to HANA. Harvey showed some of the benefits of moving to HANA in early product demos, a good start that focuses more on dashboards and reporting than on more advanced scenarios.
 
Long-term platform and positioning questions loom. Unfortunately, SuccessConnect shed no light on the inherent platform issues that SuccessFactors has. To start, the original SuccessFactors operated on three and a half platforms. Then SuccessFactors added the MDF framework, and some modules run on it (EmployeeCentral and soon Onboarding, for instance). In the meantime, SAP overall moved to SAP Cloud Platform, based on Cloud Foundry, as its PaaS. And SuccessFactors still runs on competitor Oracle's database. None of this is news, but these are major questions to tackle, and we felt SuccessFactors was coming close to answering them – yet there is no update on them so far. One could argue: why do these matter, it's all SaaS? But the agility of a vendor is determined by its productivity and the number of its platforms. Obviously more platforms mean more work, support and maintenance – and therefore matter for customers as well.
 

Analyst Tidbits

Progress on Bias. SAP has been pushing the elimination or reduction of bias via software since Sapphire 2016. We were updated on the progress: masking of bias-related attributes and pictures is now possible. Most efforts, not surprisingly, are on the reporting side. A good area to focus on and a possible differentiator.

SAP and Diversity. SAP recently met its goal of 25% women in leadership positions (first-level manager and upwards) early, and has now established new goals of 28% by 2020 and 30% by 2022 (more here). Newly minted SAP CMO Alicia Tilman and I had a great conversation on the topic – watch it here.
 

MyPOV

A good SuccessFactors conference, with new leadership in place, finding its way 'around the furniture'. Under 100 days in, it's not fair to expect Wilson and Harvey to have a plan; sometimes conference schedules don't align with plan readiness, and it is better not to share something hastily. At the board level, Enslin needs to come up with a vision for the SAP SaaS businesses he now oversees, especially how they relate to S/4HANA, platforms and future direction. But that is likely also too early to address, so again, better to wait than to communicate prematurely. And while those high-level questions get sorted out, there is a lot of progress and work happening at the product level. Attendees were generally happy with the progress. Partners showed up in force and see more traction. But strategic direction, messages and vision matter to customers and prospects – so it is time for SAP SuccessFactors to deliver them.
 
On the concern side, SuccessFactors still needs to find its way inside, versus, or with the SAP mainstream. The acquisition will soon be more than six years in the past, and while SuccessFactors has delivered a compelling HR core product with EmployeeCentral, it needs to address looming payroll questions (no mention in the keynote) and overall platform and suite-level questions. The sooner, the better. And as a general observation, HR conferences increasingly seem to be about celebrities and good vibes. All good, but the product message can't be overshadowed too much; SuccessConnect in my view reached a critical level here.
So overall, especially at the product level, there is good progress. Let's hope for SuccessFactors clients and prospects that this productivity level does not slow down, that the high-level questions get addressed soon, and that SuccessFactors then gets to execute toward that vision. There is practically no HCM vendor that is not working on its next-generation architecture at the moment, or at least on major new investments (e.g. Payroll) – SuccessFactors will have to tackle the same challenges. Customers want and deserve answers. Stay tuned.
 
 
Want to learn more? Check out the Storify collection below (if it doesn’t show up – check here).


 

Ellison Reveals Oracle's Plan for 'Self-Driving' Cloud Database Service



Larry Ellison teases Oracle's 'self-driving' database: Oracle chairman and CTO Larry Ellison has spilled the beans on the company's big news announcement for OpenWorld, and it's all about the database. Here's what Ellison told analysts this week during Oracle's first-quarter earnings call:

On October 1 at Oracle OpenWorld, we'll announce the next generation of the Oracle database. When we deliver it by the end of this calendar year, Oracle will become the world's first fully autonomous database. Based on machine learning, this new version of Oracle is a totally automated, self-driving system that does not require a human being either to manage the database or tune the database.

Using artificial intelligence to eliminate most sources of human error enables Oracle to deliver unprecedented reliability in the Cloud. We will be offering public Cloud SLAs, service level agreements for the Oracle database that guarantee 99.995% systems availability time. 99.995% availability means less than 30 minutes of planned or unplanned downtime per year.

A self-driving database eliminates the labor cost of tuning, managing, and upgrading the database, plus avoiding all of the costly downtime associated with human error. ... Running Oracle's autonomous database is much, much cheaper than running traditional human-driven databases like Amazon's Redshift.

Ellison pledged that customers who move from Amazon Web Services' Redshift database service can cut their costs by at least half. Moreover, Oracle will provide SLAs guaranteeing those cost savings to customers who make the switch, he said.
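The availability pledge in the quote above can be sanity-checked arithmetically; this quick sketch just converts the 99.995% SLA figure into annual downtime minutes.

```python
# Convert a 99.995% availability SLA into allowed downtime per year.
availability = 0.99995
minutes_per_year = 365 * 24 * 60          # 525,600 minutes in a non-leap year
downtime = (1 - availability) * minutes_per_year
print(round(downtime, 1))  # about 26.3 minutes
```

That comes to roughly 26 minutes a year, consistent with Ellison's "less than 30 minutes of planned or unplanned downtime per year" framing.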

POV: It was a characteristically aggressive announcement and set of pledges from Ellison, who has taken to name-checking AWS regularly as Oracle tries to catch up in IaaS (infrastructure as a service) and PaaS (platform as a service).

While Oracle may never accomplish those goals, it is telling that Ellison framed the upcoming database service as a migration story. There are a great many on-premises Oracle database workloads that the company would like to see running on its cloud, not Amazon's or others, even as it offers support for doing so. 

Ellison left a number of crucial questions unanswered, in particular with regard to pricing. It's unclear how Oracle will undercut AWS so dramatically on cost while introducing a new feature set that one might presume is a premium add-on. It could be that Oracle will include the new automation capabilities as part of its baseline database service; answers should arise at OpenWorld.

Then there is the matter of what Oracle means by database automation, as Oracle has had automation features for tasks such as patching and testing for some time. "Perhaps they're bringing it to the next level, but I'm not sure how they get off calling it 'AI' or 'self-driving,'" says Constellation VP and principal analyst Doug Henschen. "I'll have to wait to see exactly what they're talking about. I'd be wary of AI-washing of what is really machine learning and automation trained on the very confined domain of database administration."

The cloud plays into the upcoming service's extremely high SLA as well. "It's not hard for Larry to promise something better than what most companies can achieve on-premises with Oracle's database or Exadata," Henschen says. "When you're doing a DBaaS for thousands or tens of thousands of customers, the administrative and workload patterns become really clear and can be automated. No one customer can amass that sort of data and metadata."


Digital Transformation Digest: Oracle's Cloud Sea Change, Microsoft Introduces Azure Confidential Computing, and a Look at Looker 5



Oracle's sea-change moment: This week, Oracle reported that for the first time, it sold more cloud software subscriptions than new on-premises licenses in its fiscal 2018 first quarter. This should be seen as a genuine milestone from both the vendor strategy and customer buying perspectives.

Total cloud revenue for the quarter was $1.5 billion, comprising SaaS, PaaS and IaaS, with SaaS taking up $1.1 billion of that sum. Meanwhile, new license sales fell 11 percent to $966 million. Revenue was $7.4 billion overall for the quarter.

To be sure, Oracle has spared little expense and effort in transitioning its business model toward the cloud, through both a series of acquisitions and an overhaul of its sales force and compensation strategy. The single largest move Oracle has made in the cloud was the $9.3 billion acquisition of NetSuite, which gave it a nearly $1 billion annual cloud revenue boost.

POV: The numbers need to be placed into context. In its Q4 2017, Oracle reported $2.6 billion in new license revenue and $1.36 billion in cloud sales. But Oracle's Q4 is historically its largest and is when many on-premises deals are finalized.

Still, Q1's results represent "an inflection point for Oracle and the industry overall," says Constellation VP and principal analyst Holger Mueller.

Oracle says that within SaaS, ERP is the largest segment, with 5,000 Fusion ERP customers and 12,000 NetSuite customers. Now the question is how Oracle can accelerate growth in the PaaS and IaaS segments, where it lags the likes of Microsoft Azure and Amazon Web Services by a great deal.

The low-hanging fruit for Oracle lies in migrating on-premises Oracle database workloads to the cloud. In a statement, chairman and CTO Larry Ellison revealed one way Oracle will convince on-premises customers to make the switch:

"In a couple of weeks, we will announce the world's first fully autonomous database cloud service," said Oracle Chairman and CTO, Larry Ellison. "Based on machine learning, the latest version of Oracle is a totally automated "self-driving" system that does not require human beings to manage or tune the database. Using AI to eliminate most sources of human error enables Oracle to offer database SLA's that guarantee 99.995% reliability while charging much less than AWS."

A related number—Oracle's capital expenditures—is also important to watch. In its fiscal 2017, Oracle spent roughly $2 billion on capex, largely to build out its global data center footprint. The total was much higher than in previous years. That pace of spending continued in Q1, with $473 million in capex.

The first iteration of Oracle's IaaS was not a major success; a next-generation version rolled out this year may account for the jump in capex.

Microsoft introduces Azure confidential computing: In the wake of mega-hacks such as the Equifax incident that exposed the personal information of up to 143 million people, security is an even bigger differentiator for cloud vendors. Microsoft this week took a significant step forward in security for its Azure platform, with the rollout of Azure confidential computing. Here are the key details from a blog post by Azure CTO Mark Russinovich:

Put simply, confidential computing offers a protection that to date has been missing from public clouds, encryption of data while in use. This means that data can be processed in the cloud with the assurance that it is always under customer control.

Data breaches are virtually daily news events, with attackers gaining access to personally identifiable information (PII), financial data, and corporate intellectual property. While many breaches are the result of poorly configured access control, most can be traced to data that is accessed while in use, either through administrative accounts, or by leveraging compromised keys to access encrypted data.

Russinovich says Azure, Microsoft Research, Windows team members and Intel have been working on confidential computing for more than four years.

The software and hardware involved use Trusted Execution Environments (TEEs) to shield data. Developers won't have to change their code to use the TEEs, which initially include the software-based Virtual Secure Mode, leveraging Hyper-V and Windows Server 2016, and the hardware-based Intel SGX TEE.

POV: The announcement comes a few weeks after Microsoft unveiled the CoCo framework, for creating large-scale, confidential blockchain networks. An early adopter program for Azure confidential computing is open now, and Russinovich is set to demonstrate the technology at Microsoft's upcoming Ignite conference.

Cloud vendors including Microsoft have already provided encryption for data at rest, or in storage, and data in transit. Adding encryption to data that's in use completes the picture. It's worth noting that Intel's SGX isn't exclusive to Microsoft, meaning it could show up in servers owned by cloud rivals sooner than later.

However, Russinovich didn't discuss pricing in his post; one would expect the confidential computing capabilities to come at a premium, but one that companies may be more than willing to pay for their most sensitive data and applications.

Taking a look at Looker 5: BI and analytics vendor Looker has experienced rapid growth, coming up on the 1,000-customer mark. This week it released Looker 5, the latest version of its data platform. Here are some of the key details from its announcement:

Looker 5 delivers dozens of ... features that transform the way users experience and interact with their data, including: simplifying daily workflows by letting anyone interact with Zendesk, Salesforce, Adwords and more with the Action Hub; empowering departments with purpose-built Applications by Looker; and dramatically improving the user experience with new Viz Blocks for nearly limitless visualizations and Data Blocks to add valuable public data to any analysis.

Finally, Looker’s flagship business intelligence tool gains powerful features like Data Merge to easily combine data from different databases, 57 new statistical functions, a new SQL runner with visualizations, and many more.

Looker 5 is set for release next month.

POV: Looker has made significant progress in breaking down longstanding barriers between back-end databases and warehouses and front-end analysis capabilities, says Constellation VP and principal analyst Doug Henschen. With its human-friendly LookML language, which abstracts SQL, and its reusable Block approach, Looker is designed to make it easier for analysts and power users to integrate, transform and model data, and to quickly deal with new data from multiple sources without moving it – steps that used to require heavy lifting by data management professionals, he adds.

"As the names suggest, Action Hub and Applications are good examples of going beyond BI to put data-driven insights to work," Henschen says. "I like the cloud-services and SaaS-based focus of Action Hub and expect the list of data-driven Applications to fill out beyond the initial list of three over time. People forget that delivering BI and analytics shouldn’t be an end goal; they should be the means to making smarter, better-informed decisions."


Event Review: Slack Frontiers


Today Slack held its first user conference, Slack Frontiers. Since jumping into the collaboration market just a few short years ago, Slack has been one of the primary catalysts in kick-starting an industry that was in a bit of a rut. Slack's rapid rise to social business stardom has resulted in a slew of competitive products, from small startups to big names like Microsoft (Teams), Google (Hangouts), Facebook (Workplace), Cisco (Spark), Atlassian (Stride) and IBM (Watson Workspace). Similarly, Slack has motivated thousands of developers to build integrations and add-ons for the platform. However, fame often brings with it challenges, such as high expectations that are often hard to live up to. More on that below.

If you don't have time to read this post and watch the videos, here's a summary:

Let's start with some impressive numbers:

  • 9 million weekly active users
  • 6 million daily active users
  • 2 million paid users
  • Usage in 43% of the Fortune 100 
  • and most significant, $200M in annual recurring revenue (ARR)
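The figures above also imply a revenue-per-seat number worth noting; this is my own back-of-envelope math on the reported totals, not a figure Slack disclosed.

```python
# Implied annual and monthly revenue per paid user from reported figures.
arr = 200e6            # $200M annual recurring revenue
paid_users = 2e6       # 2 million paid users
per_user_year = arr / paid_users
per_user_month = per_user_year / 12
print(per_user_year, round(per_user_month, 2))  # 100.0 and about 8.33
```

Roughly $100 per paid user per year, or about $8.33 a month, which is broadly in line with Slack's published per-seat pricing at the time.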

It's good to see Slack continuing to grow in this highly competitive market. Now on to the product announcements.

  • Internationalization: Today 55% of Slack's usage is outside of the US, with the UK, Japan, Germany and Canada being its next largest markets. Until now, however, Slack was only available in English. Now Slack will be available in French, German and Spanish, with Japanese scheduled for release before the end of the year. Slack has done a thorough job here: it is not just releasing the product in new languages, it is also updating documentation, customer support and purchasing currencies. While this should help Slack win more international customers, it's still far behind products like Microsoft Teams, which is available in 25 languages.
  • Shared Channels: This allows two organizations that are using Slack to create a shared channel between them. It will help Slack customers connect with people outside of their organization, such as partners, suppliers and customers. This is different from the current guest access feature, as shared channels provide more accountability and administration features. The new feature is available in beta, and currently only in Slack for Teams, not Slack Enterprise Grid. Shared Channels is an important architectural milestone that should pave the way to more powerful features in the future, but it's critical that it be available in Enterprise Grid sooner rather than later.

I've been asked several times what the difference between Shared Channels and Guest Access is, so I made this video which I hope properly explains it:

Image: Slack Shared Channels enable two companies to work together in a shared stream

MyPOV

While the new features announced are important, especially for growth in the enterprise market, they were not as innovative as I would like to have seen from a company with so much promise and momentum. Moving people away from email is nothing new; enterprise social networks and communities have been preaching that story for a decade. Slack and its competitors have great potential to change the way people work by bringing together multiple styles of communication, diverse forms of content, improvements to business processes via workflow and automation, and, more importantly, by inventing new versions of each of those things. I appreciate how Slack has raised the awareness of team collaboration and breathed life into the social business market, but overall I didn't feel they moved the needle at Frontiers as much as I would have liked, and I spoke to several customers and partners who felt the same way.

That said, Frontiers was a very well done event for their first conference. They brought together several great customers and partners which helps build a community. There were very useful sessions on best practices, roadmaps and customer stories. I look forward to seeing how Slack executes on product, marketing and vision as they round out the year and the journey towards Frontiers 2018.

 

 
