FCC Chairman Ajit Pai's Proposal to Overturn Net Neutrality, Annotated

Constellation Insights

Federal Communications Commission Chairman Ajit Pai has unveiled how he intends to roll back so-called "net neutrality" regulations passed in 2015. While Pai has the benefit of a Republican-majority FCC board, he could still be in for a fight as advocates and the public push back. 

Net neutrality forbids ISPs from blocking or slowing down Internet traffic that points to legal content, and from favoring traffic based on payments or other considerations. Critics like Pai say the rules, which placed ISPs into the "common carrier" category under Title II of the Communications Act, go too far and are anticompetitive.

Consumer advocacy groups and Internet companies such as Netflix and Facebook have been the most prominent net neutrality advocates, with ISPs seeking further control taking the other position. But in today's environment, trends such as the IoT (Internet of Things), the app economy and the general move toward cloud computing make the way Internet traffic is regulated in the U.S. a core concern for enterprises.

Hence, the details of Pai's proposals are worth a close look. Here are some of the key highlights from a speech he gave this week in Washington outlining his plan, accompanied by my annotations and context. 

Pai: "Whether I am in Red America, Purple America, or Blue America, whether I am above the Arctic Circle or in the bayous of Louisiana, people tell me that they want fast, affordable, and reliable Internet access. They say that they want the benefits that come from competition. And they tell me that they want to access the content and use the applications, services, and devices of their choice." 

  • The FCC received an unprecedented 3.7 million comments from the public regarding net neutrality before the rules were passed. The Sunlight Foundation, a nonprofit that advocates for open government, performed an analysis using 800,000 of those comments and found that only 1 percent were in opposition to net neutrality. However, other polls have shown that only a slim majority of Americans have a clear understanding of what net neutrality is.

Pai: "[Under] the Telecommunications Act of 1996 ... America’s Internet economy produced the world’s most successful online companies: Google, Facebook, and Netflix, just to name a few. ... And under this framework, consumers benefited from unparalleled innovation. But two years ago, the federal government’s approach suddenly changed. The FCC, on a partyline vote, decided to impose a set of heavy-handed regulations upon the Internet. It decided to slap an old regulatory framework called “Title II”—originally designed in the 1930s for the Ma Bell telephone monopoly—upon thousands of Internet service providers, big and small. It decided to put the federal government at the center of the Internet. Why? Unfortunately, the answer has nothing to do with the law or the facts. Nothing about the Internet was broken in 2015. Nothing about the law had changed. And there wasn’t a rash of Internet service providers blocking customers from accessing the content, applications, or services of their choice.

  • Net neutrality proponent Free Press this week published a list of incidents it says violated the spirit of net neutrality, both before the rules were passed in 2015 and afterward. 

Pai: So what happened after the Commission adopted Title II? Sure enough, infrastructure investment declined. Among our nation’s 12 largest Internet service providers, domestic broadband capital expenditures decreased by 5.6 percent, or $3.6 billion, between 2014 and 2016, the first two years of the Title II era. This decline is extremely unusual. It is the first time that such investment has declined outside of a recession in the Internet era. And the impact hasn’t been limited to big ISPs. Smaller, competitive providers have also been hit. ... Our nation’s smallest providers simply do not have the means or the margins to withstand the Title II regulatory onslaught. And remember—these are the kinds of small companies who are critical to meeting consumers’ hope for a more competitive broadband marketplace and closing the digital divide.

  • Opinions vary on Title II/net neutrality's impact on ISP infrastructure investment. The Internet Association, a lobbying group that represents companies such as Google and Facebook, noted that major ISPs Comcast, Verizon and AT&T have made aggressive investments in fiber during the past couple of years. Free Press, meanwhile, put out a flyer listing capital expenditures from about two dozen publicly traded ISPs. About three-fourths had expanded their investments from 2015 to 2016. Comcast's capital expenditures grew 26.6 percent during that period, while Cincinnati Bell's rose 50.3 percent.

Pai: When businesses cut back on capital expenditures, the areas that provide the most marginal returns on investment are the first to go. And in the case of broadband, that means low-income rural and urban neighborhoods. As a result, Title II has kept countless consumers from getting better Internet access or getting access, period. It is widening the digital divide in our country and accentuating the practice of digital redlining—of fencing off lower-income neighborhoods on the map and saying, “It’s not worth the time and money to deploy there.”

  • ISPs have accepted billions in subsidies over the years in the name of supporting rural broadband, but modern fiber networks remain largely nonexistent in less populated areas. Moreover, ISPs are loath to continue servicing their aging and increasingly obsolete copper networks, which support DSL, the main broadband option in the U.S. countryside, and have taken steps to phase out the service without an acceptable broadband alternative in place.

Pai: Now, some have called on the FCC to reverse Title II immediately, through what is known as a Declaratory Ruling. But I don’t believe that is the right path forward. This decision should be made through an open and transparent process in which every American can share his or her views. So what are the basic elements of this Notice of Proposed Rulemaking? First, we are proposing to return the classification of broadband service from a Title II telecommunications service to a Title I information service—that is, light-touch regulation drawn from the Clinton Administration.

  • The next major step in the debate comes May 18, when the FCC votes on a Notice of Proposed Rulemaking regarding Pai's proposal. If it passes—which is practically guaranteed—then a public comment period will open. Net neutrality has simmered in the U.S. for going on 20 years and has always been politically charged. It will be no different, and perhaps more controversial than ever before, as the year unfolds. 


Infosys Launches Nia, Its Next-Gen AI Platform

Constellation Insights

Since becoming CEO of Infosys roughly three years ago, Vishal Sikka has taken the massive Indian systems integrator down new paths, chief among them a focus on creating intellectual property rather than relying on technologies from partners. One of Infosys's key creations under Sikka's leadership has been Mana, an AI (artificial intelligence) platform, and the company has now launched a new version of it called Nia.

The platform combines Mana's big data, machine learning and cognitive capabilities with AssistEdge, Infosys's robotic process automation tool. Nia also gets a boost from the recent acquisition of Skytree, a machine learning startup. Infosys laid out the value proposition of Nia in a statement:

Infosys's first-generation AI platform was about IT, simplification, efficiency and cost. Capabilities included socialization of organizational knowledge, deep analytics, service automation, automated incident root cause analysis and others. ... Infosys Nia tackles breakthrough business problems such as forecasting revenues, forecasting what products need to be built, understanding customer behavior, deeply understanding the content of contracts and legal documents, understanding compliance, and fraud.

 
Much of the conversation around AI of late has centered on the automation of jobs and what effect that could have on the workforce. In a video message, Sikka alluded to this but offered a measured perspective: "AI can do much more than automate the work that we used to do; it can amplify the work that we will do."
 
 

Skytree emerged as a commercial company out of Georgia Tech in 2012, and raised $20 million in capital through 2013. However, it hadn't logged any other funding rounds since then, suggesting the Infosys deal may have been a fire sale, notes Constellation Research VP and principal analyst Doug Henschen. "Skytree was up against formidable competition, given the rise of open source machine learning options," he adds. 

Infosys may have sought out Skytree chiefly for its talent. The company didn't issue a press release on the deal, but in a press interview earlier this month, Sikka stressed the people picked up in the acquisition, eight PhDs in machine learning, rather than the IP.

Mana has gained about 50 customers since its launch a year ago, and Infosys is betting on software-driven services like it for growth.

The name change to Nia—which Sikka described in the video as a "beautiful word" that is both the last three letters of California and a Swahili word for "purpose"—will hopefully stick, Henschen says.

"If Infosys hopes to build up a brand associated with AI, it has to settle on a name," he says. "The move to consolidate the many piece parts into a unified platform is a welcome step."


SAS Takes Next Steps to Cloud Analytics

SAS Viya is now available as the cloud-friendly platform for SAS Visual apps and, soon, SAS 9. Next up should be more cloud-based services options.

SAS, like many well-established tech vendors, has to keep one eye on the future and one eye on the past. At the April 2-5 SAS Global Forum in Orlando, FL, the company did its best to reassure the 5,500-plus attendees that it can take them into the future without obsoleting past investments in SAS technologies and training.

To the tens of thousands of companies running SAS 9 (there are some 77,000 site licenses for the software), the message was “SAS 9 is here to stay.” And to the thousands of customers running SAS’s newer Visual products (there are more than 6,000 site licenses for SAS Visual Analytics and north of 1,800 for SAS Visual Statistics), the message was “everything in the portfolio can now run on SAS Viya.”

SAS Viya is the linchpin of the company’s future. Introduced at SAS Global Forum 2016, Viya is the company’s virtualization- and container-ready, Hadoop-compatible, next-generation back-end architecture. It supports in-memory, distributed processing at scale as well as lifecycle management for data and models. Microservices and REST APIs support services-oriented embedding of analytic services. Viya also extends language support beyond SAS to Python and Lua, with R, Java and Scala to come.
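That openness is easiest to see from a client script. Here is a minimal sketch of driving Viya's CAS engine from Python using SAS's open-source python-swat package; the host, port, credentials and file name below are placeholders for illustration, not a working configuration.

```python
# Minimal sketch: calling SAS Viya's CAS engine from Python with the
# open-source python-swat package. Host, port, credentials and the CSV
# file name are hypothetical placeholders.
import swat

# Connect to a CAS server; in practice these values come from your Viya admin.
conn = swat.CAS('viya.example.com', 5570, 'username', 'password')

# Load a local CSV into an in-memory, distributed CAS table.
tbl = conn.upload_file('sensor_readings.csv', casout={'name': 'readings'})

# Run a distributed summary action on the table and print the statistics.
print(tbl.summary())

conn.close()
```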

At SAS Global Forum the company announced that the entire Visual suite -- Visual Analytics, Visual Statistics, Visual Data Mining & Machine Learning, Visual Investigator, Visual Forecasting, Optimization and Econometrics – can now run on Viya. Heretofore these products ran on the SAS LASR Server, which will continue to be available and supported. But if you want the combination of scalability, virtualization and multi-cloud, container-based portability and flexibility, you’ll want Viya.

As for SAS 9, connectivity to Viya will be introduced in the third quarter, opening up big-data and machine-learning capabilities. These jobs will be sent to Viya while more routine workloads and procedures will continue to run on SAS 9.

“If you’re processing all your jobs in less than one or two minutes, you’re fine and you don’t need to move [to Viya],” said SAS co-founder and CEO Jim Goodnight during a keynote discussion. If you need big-data or modern machine learning capabilities, “…think of Viya as an extension of SAS 9.”

The plan is to add the more routine analytical capabilities to Viya over time, but some analyses and workloads, including mainframe (zOS) and AIX workloads, will remain tied to SAS 9 (currently on release 9.4).

The SAS Visual portfolio can now run on the Viya architecture. I'm hoping to see software-as-a-service options that would appeal to new customers.

Other important announcements at SAS Global Forum included:

  • SAS Visual Investigator 10.2: This analytic application was released in 2016, but the 10.2 update offers improved search and discovery, scorecarding, alerting, entity analytics, workflow and administrative capabilities. SAS is also developing and delivering pre-built content for more than 14 investigative use cases, including child welfare, anti-money-laundering, power and energy monitoring, insider threat detection, and prescription drug abuse.
  • Current Expected Credit Loss (CECL): This SAS content helps banks deal with what’s formally known as ASU 2016-13, a new U.S. GAAP standard for credit loss accounting. CECL replaces today’s “incurred loss” approach and will become effective for SEC filers in 2020. The rules establish a lifetime loss estimate for every loan, require point-in-time loss estimates and increase public disclosure requirements.
  • SAS-Cisco IoT Partnership: The companies have been working together for 18 months to create the Cisco SAS Edge-to-Enterprise IoT Analytics Platform. The platform includes SAS Event Stream Processing, now certified to run on Cisco UCS servers, and was launched with proof-of-concept content for the energy and mining industries.
  • SAS Results-as-a-Service: A combination of strategy and consulting services wherein SAS professionals deliver analytical solutions within weeks or months. Once a solution is approved, SAS can deploy it in the cloud or on-premises, with supporting managed services or, if desired, training and handoffs to customer teams. The service is aimed at companies that don’t have the staff or infrastructure available to tackle new analytic challenges.

My Take on SAS Global Forum Announcements

As I commented after last year’s Global Forum, Viya is SAS’s modern architecture as well as its answer to multiple open-source and cloud threats. The biggest threat by far is Apache Spark, which is gaining adoption quickly and is now widely available as a service on multiple public clouds. Spark software is also distributed and supported for on-premises deployment by multiple vendors, including IBM and the big-three Hadoop vendors, Cloudera, Hortonworks and MapR.

Many SAS customers are, indeed, very conservative and more concerned about continued SAS 9 support than big-data analysis or cloud-deployment options. But with an eye to the future, SAS Viya is crucial. It can’t get here soon enough, in my book, because Apache Spark, as a scalable, in-memory platform and as an elastic, pay-for-what-you-use cloud service, has been steadily gathering steam.

In a one-on-one conversation, SAS Executive VP and CTO Oliver Schabenberger asserted that open-source alternatives lack the data-governance and model-management capabilities that many SAS customers, particularly regulated companies, insist upon. That may be, but dark warnings about governance failed to keep many BI practitioners from embracing self-service data-discovery and data-visualization products -- even before those products gained governance capabilities.

On the topic of open source machine learning and algorithms, Schabenberger said “SAS won’t take a back seat to anybody on analytics.” But there’s no denying that the cost advantages and ecosystem strengths of the open-source model are driving huge adoption. That’s precisely why SAS opened up Viya to Python. SAS has had a lot to say, in recent years, about making free SAS software available to colleges and universities. But time and again, customers in the sessions I attended at the Forum said their new hires tend to use open-source languages such as Python.

If SAS governance and lifecycle-management capabilities and its analytic depth and breadth are truly superior, they’ll stand up to open-source competition. It’s clearly a topic of debate within SAS. Some executives pointed out that it’s long been possible to invoke open-source algorithms from within SAS, though they’d like to see more explicit support for Spark and other emerging options. And now that Viya is here, several execs hinted that SAS will be much more aggressive about offering SAS analytics and software as ready-to-run cloud services. These would be welcome, future-minded steps that might attract a next generation of SAS customers, but they’re not on the official roadmap for now. I’m hoping to see more openness and more cloud services options at next year’s SAS Global Forum.


SAP IoT Partnerships Focus on Next-Generation Factories

Constellation Insights

SAP has long had a substantial presence in manufacturing with its ERP software, but in recent years has brought newer technologies to bear on manufacturing operations. It made two more significant moves in that direction this week at the Hannover Messe industrial trade show in Germany, announcing partnerships with Mitsubishi Electric and robotics manufacturer KUKA.

"SAP is playing to its established strengths by adding the integration of capabilities and data between operational technology and automation to enterprise information technology systems, plus adding the Intelligence of HANA and the GUI interfaces of Fiori to create new levels of integrated functionality," says Constellation Research VP and principal analyst Andy Mulholland. 

Under the first deal, Mitsubishi will use data generated from its factory automation capabilities with SAP's IoT (Internet of Things) technology to create new services for remote device management, production monitoring and predictive maintenance.

Mitsubishi sells products across the full spectrum of factory automation, such as programmable controllers, human-machine interfaces, power distribution products and industrial robots. It's already done some work to connect with SAP software at a deeper level, such as a connector it introduced last year that pushes shop floor data directly into SAP ERP. 

Meanwhile, KUKA plans to integrate its robots with SAP Cloud Platform, again for scenarios such as predictive maintenance and factory floor monitoring. It will also develop robot applications based on SAP's IoT platform, but details on what functions those will focus on weren't disclosed. Finally, KUKA will incorporate some elements of SAP technology into its homegrown Industrie 4.0 platform, Connyun. 

Overall, the goal is to connect "the top floor with the shop floor," as one SAP executive said in a statement. 

Hannover Messe was the proper place for SAP to introduce the new partnerships, given what a prominent event it is for the world's major industrial companies. 

The deals represent not only SAP's desire to expand further into manufacturing, but also its ongoing transition to a cloud business model. While on-premises software license sales actually rose 13 percent year over year in the first quarter, for SAP and other large software vendors the rush is on to migrate customers to the cloud, both for operational efficiencies and, over the long term, more money. Manufacturing systems generate massive amounts of data, and by partnering with the likes of Mitsubishi, SAP hopes to push more workloads to its cloud services.


IBM, ABB Tie Up for Industrial AI

Constellation Insights

IBM is teaming up with power systems and robotics giant ABB in a bid to apply AI (artificial intelligence) to the shop floor, smart grids and other industrial scenarios. The partnership will incorporate ABB's Ability platform with IBM's Watson IoT (Internet of Things) technology:

The solutions enable current connected systems that simply gather data to become cognitive industrial machines that use data to understand, sense, reason and take actions to support industrial workers.

ABB CEO Ulrich Spiesshofer said: “This powerful combination marks truly the next level of industrial technology, moving beyond current connected systems that simply gather data, to industrial operations and machines that use data to sense, analyze, optimize and take actions that drive greater uptime, speed and yield for industrial customers.”

“This important collaboration with ABB will take Watson even deeper into industrial applications — from manufacturing, to utilities, to transportation and more,” said Ginni Rometty, IBM Chairman, president and CEO. 

Analysis: ABB-IBM Partnership speaks to broader trend

The pairing of ABB's and IBM's platforms represents a continued shift away from IT and toward operational technology, says Constellation Research VP and principal analyst Andy Mulholland. "IoT instrumentation creating massive new streams of data for AI to process is fast becoming the biggest driving force for the creation of digital business," he says. "The puzzlement of the IT community about the role and use of IBM Watson in IT should be rapidly being replaced by the desire to gain more understanding of this changing focus for the deployment of technology in enterprises."

First up, IBM and ABB will focus on the factory floor and smart grids. 

On the factory floor, real-time production images collected by ABB systems will be analyzed through Watson IoT for Manufacturing, according to a statement. Watson will scan the images for defects, eliminating the previous manual inspection process. The result will be higher throughput for production lines along with better quality, the companies say.

Meanwhile, Watson will also be used to predict demand for electric power based on historical and weather data, allowing utilities to fine-tune the upkeep and operation of smart grids. 

ABB and IBM made the announcement in conjunction with Hannover Messe, the large industrial trade fair held annually in Germany.


Infor Acquires Birst for Cloud Business Intelligence: What It Means

Constellation Insights

Infor has spent billions in recent years in the course of creating a next-generation cloud ERP suite, with most of the results coming from organic development. To strengthen its hand in analytics, however, it is going outside, with plans to acquire Birst, a fairly small but mature player in cloud-based BI (business intelligence). Terms of the deal weren't disclosed.

In a statement, Infor CEO Charles Phillips provided the rationale for buying Birst:

"This is much of the same team that built Siebel Systems BI, which is now Oracle's BI stack. They put the band back together, pivoted to the cloud and built a modern BI platform with an understanding of future needs, experience with a wide variety of use cases, and commitment the cloud."

Birst was founded in 2004 and has raised about $139 million in venture capital to date. Customers include American Express, Kellogg's, Schneider Electric and Citrix. Its capabilities include an ETL (extract, transform and load) engine, reports and dashboards, visualization, smart discovery and data blending.
 
While small, Birst has attracted customers with tactics such as concurrent-user pricing, a model that can save a lot of money, compared with named-user pricing, for companies that want to give many users occasional access to BI.
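To see why that pricing model matters, consider a back-of-the-envelope comparison; the seat prices and usage figures below are hypothetical, not Birst's actual rates.

```python
# Hypothetical cost comparison: named-user vs. concurrent-user BI licensing.
# All prices and usage figures are made up for illustration.

total_users = 500          # employees who need occasional BI access
peak_concurrent = 60       # most users ever logged in at the same time

named_user_price = 600     # hypothetical annual price per named user
concurrent_price = 2500    # hypothetical annual price per concurrent seat

named_cost = total_users * named_user_price
concurrent_cost = peak_concurrent * concurrent_price

print(f"Named-user licensing:      ${named_cost:,}")       # $300,000
print(f"Concurrent-user licensing: ${concurrent_cost:,}")  # $150,000
```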


Meanwhile, Infor now has a "critical mass of cloud subscribers and petabytes of mission-critical data in the cloud," making Birst an ideal fit for deriving value from it, according to a statement.

It should be noted that Infor already has a BI offering, which includes an in-memory calculation engine, integration with Office and an application studio geared to the abilities of business users. Birst's feature set is more robust and the company has what Infor needs to meet customer demands. Infor and Birst also have complementary domain expertise:

Customers running multiple ERP systems have asked Infor to build the enterprise analytic layer across the reality of a federated environment. ERP application companies rarely have the expertise or interest to build this aggregation layer. BI companies provide the analytics platform but don't understand industry processes and potential insights.

"This is a natural move as enterprise applications customers move into the cloud," says Constellation Research VP and principal analyst Doug Henschen. SAP and Oracle have had cloud BI offerings for a number of years, Birst gives Infor a mature, customer-ready cloud business intelligence platform, he adds. 

In turn, Infor's courtship represents a safe exit for Birst, which entered the cloud BI arena well before its time, Henschen notes. Infor's acquisition comes as the competition is getting formidable, with major vendors including AWS, Microsoft, Google, Oracle, SAP and IBM all pursuing cloud-based BI and analytic capabilities.

Meanwhile, "Tableau, Qlik and other heretofore on-premises-focused BI and analytics vendors have also been moving into the cloud, and it has all added up to increasing competition for Birst," Henschen says.

"Infor will have to integrate with Birst and create a migration path from its existing Infor XI BI capabilities, but it gives them a mature cloud platform and a better shot at retaining customers that might have otherwise chose third-party, cloud-based BI and analytics options," Henschen says.


Microsoft to Launch IoT As A SaaS Service

Constellation Insights

While Microsoft has offered an IoT PaaS (platform as a service) for some time through the Azure cloud, it's betting that some customers are willing to trade customization for faster time-to-market. Microsoft IoT Central is a fully managed SaaS (software as a service) offering that "enables powerful IoT scenarios without requiring cloud solution expertise," as Redmond says in its announcement:

Built on the Azure cloud, Microsoft IoT Central simplifies the development process and makes it easy and fast for customers to get started, making digital transformation more accessible to everyone.

Microsoft IoT Central will be available along with our existing platform-as-a-service (PaaS) solution, Azure IoT Suite, which enables deep customization and full control. This new IoT SaaS offering has the potential to dramatically increase the speed at which manufacturers can innovate and bring new products to market.

Further details on IoT Central weren't disclosed, but it will become available over the next few months. While the initial version will apparently focus on manufacturing scenarios, expect packages for other cases, such as logistics and retail, to emerge over time.

Microsoft made a number of other IoT announcements, including Connected Factory, a specialized version of Azure IoT Suite.

Microsoft Azure IoT Suite Connected Factory ... helps accelerate a customer’s journey to Industrie 4.0 and makes it easy to connect on-premises OPC UA and OPC Classic devices to the Microsoft cloud and get insights to help drive operational efficiencies. In addition, it enables customers to securely browse and configure factory devices from the cloud.

Meanwhile, Microsoft is introducing a new service called Azure Time Series Insights. It's supposed to automate the process of analyzing event data from IoT endpoints, which can easily consist of billions of signals:

It helps organizations discover hidden trends, spot anomalies, and conduct root-cause analysis in near real time, all without writing a single line of code through its simple and intuitive user experience. In addition, it provides rich APIs to enable companies to integrate its powerful capabilities into their existing workflows and applications.
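Microsoft hasn't published API details yet, so as a stand-in for the kind of analysis the service describes, here is a generic rolling z-score anomaly check over a stream of IoT readings in plain Python; it assumes no Azure dependencies and is not the Time Series Insights API.

```python
# Generic illustration of spotting anomalies in IoT event data with a
# rolling z-score; this is the kind of analysis the service is described
# as automating, not Azure's actual API.
from collections import deque
import math

def detect_anomalies(readings, window=20, threshold=3.0):
    """Yield (index, value) for readings that deviate strongly from the recent window."""
    recent = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(recent) == window:
            mean = sum(recent) / window
            std = math.sqrt(sum((x - mean) ** 2 for x in recent) / window)
            deviation = abs(value - mean)
            # Flag clear outliers; a zero-variance window makes any change an outlier.
            if (std == 0 and deviation > 0) or (std > 0 and deviation / std > threshold):
                yield i, value
        recent.append(value)

# Example: a mostly flat sensor signal with a single spike at index 50.
signal = [10.0] * 50 + [42.0] + [10.0] * 50
print(list(detect_anomalies(signal)))   # -> [(50, 42.0)]
```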

Yet another IoT announcement concerns security. Azure IoT will now support the hardware security standards Device Identity Composition Engine (DICE) and Hardware Security Module (HSM), according to a statement. Microsoft will discuss all of its IoT announcements during the Hannover Messe industrial conference in Germany this week.

Redmond's IoT strategy bears watching amid a crowded and highly competitive market. 

"Microsoft is really turning up its focus on IoT over the last few months, but it's been taking an interestingly different direction to most of the other technology vendors," says Constellation Research VP and principal analyst Andy Mulholland. "Their recent focus has been more toward adding intelligence to outcomes, and ignoring the task of the so called final-mile connectivity with the management of the sensors and devices."

"In practice there are no outcomes without good quality inputs from the IoT estate, added to which controlling and managing the data inputs was seen a year or two ago as a shrewd move to control marketplaces," he adds. "In practice, the diversity of sensors, devices and networks that are required to be integrated made this a difficult area for big technology vendors to productize. However perhaps Microsoft has the key to wining marketplace control by using its long experience in working with developers."


The Linux Foundation Hones In On IoT with EdgeX Foundry

Constellation Insights

Some 50 companies have joined a new open source project focused on IoT (Internet of Things) edge computing at the Linux Foundation. The effort could foster interoperability and faster maturation of enterprise and industrial IoT. Here are the key details from the Foundation's announcement:

IoT is delivering significant business value by improving efficiencies and increasing revenue through automation and analytics, but widespread fragmentation and the lack of a common IoT solution framework are hindering broad adoption and stalling market growth. ... EdgeX solves this by making it easy to quickly create IoT edge solutions that have the flexibility to adapt to changing business needs.

Designed to run on any hardware or operating system and with any combination of application environments, EdgeX can quickly and easily deliver interoperability between connected devices, applications, and services, across a wide range of use cases. Interoperability between community-developed software will be maintained through a certification program.

A key player in EdgeX is Dell. In October, Dell revealed Project FUSE, an IoT stack developed with dozens of partners that it intended to open-source. Those plans have apparently come to fruition through EdgeX Foundry. Dell is contributing the FUSE source code to the Linux Foundation project under the Apache 2.0 open source license, which is considered one of the most permissive of its kind:

The contribution consists of more than a dozen microservices and over 125,000 lines of code and was architected with feedback from hundreds of technology providers and end users to facilitate interoperability between existing connectivity standards and commercial value-add such as edge analytics, security, system management and services.

EdgeX Foundry members include AMD, ForgeRock, VMware and dozens of other companies that play at different levels of the IoT hardware and software spectrum. The project is founded on the belief that edge computing—wherein sensors and devices send data to distributed gateways rather than a centralized data center, thereby speeding performance and mitigating network congestion—will drive the future of IoT.
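The pattern itself is simple to sketch: a gateway aggregates raw sensor readings locally and forwards only compact summaries upstream. The illustration below is generic Python showing that pattern, not EdgeX Foundry's actual microservice APIs; the device read and cloud publish functions are stand-ins.

```python
# Generic sketch of the edge-computing pattern EdgeX targets: aggregate
# raw readings at a gateway and forward only summaries to the data center.
# Illustration only; not EdgeX Foundry's API.
import random
import statistics
import time

def read_sensor():
    """Stand-in for a real device driver (e.g. a Modbus or BLE read)."""
    return 20.0 + random.gauss(0, 0.5)

def forward_to_cloud(summary):
    """Stand-in for an MQTT/HTTP publish to a central service."""
    print("forwarding summary:", summary)

def gateway_loop(samples_per_batch=60):
    batch = []
    while True:
        batch.append(read_sensor())
        if len(batch) >= samples_per_batch:
            # Only the compact summary crosses the network, not 60 raw points.
            forward_to_cloud({
                "count": len(batch),
                "mean": round(statistics.mean(batch), 2),
                "max": round(max(batch), 2),
            })
            batch.clear()
        time.sleep(1)  # one reading per second

# gateway_loop()  # uncomment to run indefinitely
```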

IoT's potential depends on getting the right sources of data connected in the right way at the sensor and device level, says Constellation Research VP and principal analyst Andy Mulholland. "As the numbers of devices and sensors have started to proliferate, so has the realisation that there are new complexities to master at the edge of the IoT network," he adds. "The sheer variation of activities means this can't be a market where one or two products will emerge as the winners. Instead, adopting standards and open source at the edge is to every enterprise's and technology vendor's benefit. It's good to see the Linux Foundation stepping up to the challenge with EdgeX Foundry, and to see that significant support is already in place to make this move work."


Digital Business Distributed Business and Technology Models Part 4; Augmented Intelligence and Machine Learning

Business has constantly pushed for better 'intelligence' to support improved decision-making in the continued drive towards increasing competitiveness. The impact of Cloud Services in reducing cost and improving availability of capacity, together with the rise of Big Data from Web and Social activities, has taken analytics and Business Intelligence to new highs. But are these the capabilities to support Digital Business, with its massively increased data loads, constant new perspectives, and collapsed time frames required to achieve 'immediate' dynamic optimizations? Part 4 explores the new requirement for 'Intelligence' in a Digital Business as defined in Part 1 of this series.

At the heart of Digital Business is the use of IoT sensing technology to convert physical objects and events into digital representations, and of Augmented Intelligence and Machine Learning to turn those representations into Business benefit. The result is data entering the Enterprise in amounts and types that, together with constantly changing market dynamics, simply defy traditional BI reporting methods. The challenge is both to analyze these data flows and to make optimized decisions within the limited time frames required for an optimized response.

AI and IoT are the two new core technologies at the heart of the CAAST technology model of Clouds, Apps, AI, Services and Things that, when used in new integrated frameworks, creates the capabilities of Digital Business.

The first blog in this series, Part 1, outlines the business model architecture of a Digital Enterprise, and it is recommended reading before continuing here. A notable factor of the Business activities of a Digital Enterprise is a constant series of dynamic and innovative adjustments in response to the conditions of its Digital Ecosystem of partners. This market-led optimization is a startling reversal of the current, traditional Business models, which are built on the optimization of Enterprise assets through stability in the operating model as a key factor.

This reversal in the Business model driver unsurprisingly also affects the current, traditional approach to implementing a Business-driven requirement. IT Enterprise architecture methods start with the business requirement, which is assumed to be an Enterprise Application, and proceed down the technology stack. At each layer the selection of a technology, or product, is made based on the requirements of the Enterprise Application. As many Enterprise Applications were written for particular operating systems and the like, the result is a custom implementation.

Increased awareness of standards, and a trend towards standardization, including the benefits of hypervisors with Cloud technology, have improved commonality over recent years. Fortunately, the complication and cost of maintaining custom, individual technology deployments under Enterprise Applications have been low in the past, as the Enterprise Business model relied on stability and thus expected a Business application to have a life of many years.

Digital Business models are built on dynamic responsiveness to markets and opportunities, and are delivered through quick-build Apps rather than monolithic applications; accordingly, deployment relies on being able to deploy over a common set of enabling technologies. These capabilities span both internal and external infrastructure and are described in more detail in Parts 2, 3a and 3b, titled Dynamic Infrastructure, Distributed Services Technology Management and Distributed Services Business Management respectively, which relate to the two common, shared infrastructure layers.

It is in the final two layers, covered here in this part on Augmented Intelligence and Machine Learning and in the concluding part on Business Apps and Services, that a Digital Business competitively differentiates itself. Aligned to the fast-moving, lightweight nature of rapidly deployed Apps is an Enterprise organizational model that reflects the same shift away from centralized monolithic processes and departments. The dynamic, innovative Digital Enterprise has become a fast-moving, decentralized structure able to take decisions and act swiftly.

The structured, centralized Enterprise IT data environment that uses historic analysis and reporting to deliver Business Intelligence, or BI, is not present in the activities and environment of Digital Business.

Enterprise IT incorporating BI reporting remains vitally important for those processes that support key commercial functions, including compliance, where stability and ongoing comparisons remain key. However, where Digital Business models are implemented, the transformation in both Business and Technology models not surprisingly calls for an equal transformation in the approach to 'Intelligence'.

The following diagram illustrates the two layers of Intelligence and of Business Apps and Services, with the diversity of the Apps and Services layer producing a constantly changing demand for intelligent responses from the Intelligence layer below. Note: the term App is used to indicate a deployed business capability whose functionality is fully controlled by a single Enterprise. The term Service is used to indicate an orchestration of functional elements from different Enterprises, either created dynamically in response to an event or built as a Business offer to the market, and therefore not totally controlled by a single Enterprise.

One of the key traits that defines Digital Business is the ability to 'Read and React' to the data flow arising from the events and circumstances instrumented by IoT. The data volumes to be analyzed, and the time frames in which to do so, are one part of the challenge. The other is to use human experience and machine learning to automate the 'react' decision-making based on that analysis.
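As a concrete illustration of 'Read and React' (a simplified sketch, not any vendor's implementation), the loop below reads instrumented events, scores them with a stand-in for a model trained offline, and triggers a reaction only when the score crosses a threshold; the event fields, weights and actuator are invented for illustration.

```python
# Minimal 'Read and React' sketch: read IoT events, score them with a
# placeholder for a machine-learning model, and react within the same cycle.

WEIGHTS = {"temperature": 0.01, "vibration": 0.6}  # stand-in for a trained model

def score_event(event):
    """Return a risk score in [0, 1] for one instrumented event."""
    raw = sum(WEIGHTS.get(k, 0.0) * v for k, v in event.items())
    return min(raw, 1.0)

def react(event, score):
    """Stand-in for the 'react' App, e.g. throttling a machine or raising a work order."""
    print(f"ACTION: slow line, score={score:.2f}, event={event}")

def read_and_react(events, threshold=0.8):
    for event in events:
        score = score_event(event)
        if score >= threshold:
            react(event, score)

# Example event stream from two machines.
read_and_react([
    {"temperature": 21.0, "vibration": 0.1},   # normal, no reaction
    {"temperature": 19.5, "vibration": 1.4},   # high vibration -> reaction
])
```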

AI is used as a convenient term to cover the huge range of technologies and methods being developed to address these challenges. In the immediate future there is common agreement across the major Technology vendors' approaches that Augmented Intelligence is the key: the goal is to augment Human Intelligence to work in this challenging environment, rather than to replace human experience with entirely computer-generated responses. Information Week published an excellent article defining this topic in response to US Government concerns.

Given time and the right information, an experienced human mind can successfully work out a reasonable solution to the requirements for 'read and react' responses, but not at the frequency and volumes that Digital Business requires. It is comparable with the drivers that created Industrial Automation, where ever-increasing production volumes pushed speeds beyond human operators' capabilities. Some, perhaps many, repetitive Office-based roles face similar pressures, and though improvements can be made to human interactions with computers, ultimately the answer is increasingly likely to be Office Automation.

Even focusing only on Augmented Intelligence and Machine Learning technology introduces a big and complex subject that is beyond the scope of this blog. Here the focus remains on exploring the use of the technologies of CAAST (Clouds, Apps, AI, Services and Things) in building solutions for Digital Business. To learn more about AI and Augmented Intelligence, the following links provide good primers on the topic, starting with The Verge's explanation of common terms, followed by Wired explaining Machine Learning in more depth, including this useful passage:

AI is a branch of computer science attempting to build machines capable of intelligent behavior, while Stanford University defines machine learning as "the science of getting computers to act without being explicitly programmed." You need AI researchers to build the smart machines, but you need machine-learning experts to make them truly intelligent. (Quote from Wired; see the link above.)
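The Stanford definition becomes concrete with a tiny worked example: the model below learns to separate two classes from labelled points rather than from hand-written rules. This is a minimal sketch assuming scikit-learn is installed; the data is invented.

```python
# "Getting computers to act without being explicitly programmed":
# the decision boundary is learned from labelled examples, not coded by hand.
# Minimal sketch; requires scikit-learn (pip install scikit-learn).
from sklearn.linear_model import LogisticRegression

# Toy training data: machine vibration level vs. whether it later failed.
X = [[0.1], [0.3], [0.4], [1.2], [1.5], [1.9]]   # feature: vibration level
y = [0, 0, 0, 1, 1, 1]                            # label: 0 = healthy, 1 = failed

model = LogisticRegression()
model.fit(X, y)                                   # the "learning" step

print(model.predict([[0.2], [1.7]]))              # expected: [0 1]
```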

Information Week published 'Why AI should stand for Augmented Intelligence', drawing on an interview with IBM about its approach. PC Mag considered how Salesforce approaches AI, noting how, and where, it fits in relation to the Apps above and to the data flow handling and infrastructure below. The importance and role of Machine Learning come up in these articles, with ZDNet discussing SAP's views on Machine Learning as a closing piece of technology background reading.

The interviews all point to the major technology vendors sharing, and working on, similar definitions and capabilities for the addition of intelligence and automated processing. But it will be quite a while before enough maturity develops to allow interworking. Because Augmented Intelligence and Machine Learning call for in-depth experience, knowledge and focus on a particular element, an Enterprise will be faced with using different vendors for different business deployments.

The Digital Enterprise is made up of a series of high-business-value activity 'pools' operating in a semi-autonomous manner to make rapid, innovative competitive moves in response to changing events and operational activities. This is a complete reversal of current Business Models, which rely on centralized conformity around a set of optimized processes to reduce costs.

See the diagram below, which appears in Part 1, Understanding the Digital Business Operating Model, and illustrates this point. The independent enterprise activity 'pools' are shown, with the red lines illustrating orchestrations between the activity pools in response to an external event in a customer building. This same reversal of the Business Model applies equally to the Technology Model, shifting the architecture from a close-coupled state aligned with fixed enterprise processes to loose-coupled, stateless orchestrations in response to the events of Digital Business.

Augmented Intelligence starts with deployments to improve the operations of a particular activity; this allows a technology vendor to be selected based either on its specialist knowledge of the activity or on maximizing the impact of Augmented Intelligence/Machine Learning on the existing technology installation. In time, and with increasing maturity, a second phase will see Augmented Intelligence/Machine Learning move to Enterprise-level optimization of the interconnections between the activity pools.

The ability to start business-beneficial deployments around specific activities, rather than waiting for full enterprise-wide maturity, should do much to reduce risks and difficulties for early adopters.

A commercial decision on where, and why, to initially deploy Augmented Intelligence/Machine Learning should rest on three factors: 1) the importance of the activity pool and the scope for its operational improvement; 2) whether IoT sensing can provide the necessary quality of digital data to operate Augmented Intelligence/Machine Learning; and 3) whether the output can be channeled into 'react' Apps that deliver the Business benefit as a 'real time' optimized operational improvement.

Summary: The Digital Enterprise is, by definition, a business that has created a full digital representation of its principal Business assets and activities in order to use computational facilities to optimize business operations. Though it may seem initially that IoT sensing is at the core, Augmented Intelligence/Machine Learning represents the other half of the transformation.

 

Links to Information on Augmented Intelligence/ Machine Learning

The following is not intended to be an exhaustive listing and is presented alphabetically. It is provided for informational purposes, with selection based on client and press interest. Inclusion in, or absence from, the listing does not imply any significance.

Amazon AWS - https://aws.amazon.com/machine-learning/?tag=vglnk-c312-20

Google - https://research.google.com/pubs/MachineIntelligence.html

IBM Watson – https://www.ibm.com/watson/

Microsoft - https://news.microsoft.com/features/microsofts-ai-vision-rooted-in-research-conversations/#6lVIzKOAeOhXwa57.97

Salesforce - https://www.salesforce.com/uk/products/einstein/overview/

SAP - https://www.sap.com/uk/solution/machine-learning.html

 

Summary: Background to this series

This is the fourth part in a series on Digital Business and the Technology required to support the ability of an Enterprise to do Digital Business. An explanation is provided for adopting the simple definition shown in the diagram below to classify the technology requirements, rather than attempting any form of conventional detailed Architecture, together with a fuller explanation of the Business requirements.

Part One - Digital Business Distributed Business and Technology Models;

Understanding the Business Operating Model

Part Two - Digital Business Distributed Business and Technology Models;

The Dynamic Infrastructure

Part Three – Digital Business Distributed Business and Technology Models

  1. Distributed Services Technology Management
  2.  Distributed Services Commercial Management

'Supercard' Payments Startup Plastc Goes Bankrupt, But So Was Its Core Concept

Constellation Insights
 

This week, payment card startup Plastc abruptly went belly-up, telling more than 80,000 customers who had preordered the devices that it is filing for bankruptcy and will not ship a single item.  

“We are disappointed and emotionally distraught, and while we know this is extremely disappointing for you, we want our backers to know that we did everything we could to make Plastc Card a reality,” the company told disappointed would-be customers, who apparently won't be reimbursed.  

The programmable card would have allowed users to store up to 20 cards on the unit itself, with access to an unlimited number of cards through Plastc's app.  

Plastc says it had landed $3.5 million in venture capital as recently as February, but the investors subsequently withdrew the funding. That money would have been enough to ship working cards. A second investor came forward with a $6.75 million offer but also withdrew at the eleventh hour, according to Plastc: "The round was a signature away from closing and we were extremely caught off guard when they notified us yesterday they were backing out." 

Plastc's fate isn't quite as dire as that which befell Coin, another payment card startup. Last year, Fitbit acquired Coin for its payments platform technology, but production and sale of its devices were immediately halted. Fitbit is expected to incorporate payment capabilities into its devices as early as this year.

Stratos, yet another card startup, went out of business abruptly in 2015 after just six months, but its assets were recently acquired by the Danish company CardLab, which is planning to revive the product. 

Plastc and other failed "super cards" supported magstripe and claimed to be upgradeable to chip-based card technology. Given that magstripe is being phased out, this was a crucial promise. But there was never any point in trying to squeeze more life out of obsolete magstripes, says Constellation Research VP and principal analyst Steve Wilson.

"You had Coin and Plastc, and also Loop Pay, which was acquired by Samsung, that basically simulates a mag stripe card electromagnetically by blasting EM waves at a POS machine to trick it into thinking a real card has been run over the read head," Wilson says. "All these gizmos sought to keep the old card technology alive, while the US payments industry was slowly catching up with the rest of the world going to chip. Why would you try to keep mag stripe alive, when it was actually the cause of so much fraud?" 

Moreover, programmable third party cards play in a legally murky realm, Wilson notes. "Their operation was in violation of the payment scheme rules, which forbid cloning cards and actually forbid merchants accepting a payment card that is not properly branded," he says. "Coin and Plastc would have put merchants in a difficult position, of enticing them to accept a non-standard pseudocard, just so some customers could enjoy yet another gimmicky way of paying." 

The idea of a programmable magstripe card seemed clever, but it ignored far more pressing problems, Wilson says. The most important innovation needed in the payments space is chip-grade security for online payments through various channels. "Card present" payments—those made in person with chip cards—have a robust approach to security that is woefully missing online.

Each card-present payment instruction made with a chip card is signed with a unique cardholder key, which uniquely stamps each payment so it's tied to the cardholder, cannot be tampered with, and cannot be replayed, Wilson says: "You cannot clone a chip card because the key is held inside the chip and only ever activated on a transaction-by-transaction basis. No attacker can skim chip cards and then clone the cards."
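A simplified sketch of that idea, using HMAC in plain Python rather than the real EMV protocol, shows why a signed per-transaction cryptogram can't be forged or replayed without the key that never leaves the chip; the key handling, message format and replay check below are illustrative assumptions.

```python
# Simplified illustration of per-transaction signing (not the real EMV
# protocol): a key held only by the "chip" signs each transaction, and the
# issuer rejects any message that is altered or replayed.
import hmac, hashlib, os

CARD_KEY = os.urandom(32)          # in reality, locked inside the chip
seen_nonces = set()                # issuer-side replay check

def sign_transaction(amount_cents, merchant_id):
    nonce = os.urandom(8).hex()    # unique per transaction
    message = f"{amount_cents}|{merchant_id}|{nonce}".encode()
    mac = hmac.new(CARD_KEY, message, hashlib.sha256).hexdigest()
    return message, mac

def issuer_verify(message, mac):
    amount, merchant, nonce = message.decode().split("|")
    if nonce in seen_nonces:                       # replayed transaction
        return False
    expected = hmac.new(CARD_KEY, message, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(mac, expected):     # tampered or forged
        return False
    seen_nonces.add(nonce)
    return True

msg, mac = sign_transaction(2599, "MERCHANT42")
print(issuer_verify(msg, mac))    # True  - first presentation accepted
print(issuer_verify(msg, mac))    # False - replay is rejected
```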

It’s for that reason Wilson said from the outset that Coin and Plastc could not promise an upgrade to chip cards. “It’s just not possible to take your chip cards and copy them into one super card, as Plastc and Coin did with mag stripe cards.” 

However, thieves can still steal card holder details and use them online, because "card not present" payments still use unsigned personal data, Wilson adds. 

While the payments industry has pushed the 3-D Secure protocol, which provides an additional authentication layer for online purchases, it has a poor user experience and isn't all that secure, Wilson says. 

"The industry has been avoiding the inevitable—transaction signing in the online environment just as we do offline," he says. The barrier has been figuring out how to get chip cards interfaced through commodity computers, but there's another way—replicating cardholder data within mobile phones. 

This is how Apple Pay and Samsung Pay essentially work, given that they store cardholder details inside special security chips, known as secure elements, in the phones, Wilson says. The problem is that these solutions are proprietary.

"How about we just replicate chip card functionality in phones, in an open manner," Wilson says. "Let banks issue virtual cards by writing their card data and keys into the secure elements in an open, standards-based and non discriminatory way. 

"Banks and innovators need access to the secure elements, but that's controlled by handset manufacturers and carriers, because they are wedded to a rent-based business model where the precious secure element storage is levied," he adds. "The tech innovation is pretty easy. The business models need changing."
