
Digital Transformation Digest: Microsoft IoT Central Eases App Development, UPS's Holiday Crunch Could Spark Drone Debate, and More


Constellation Insights

Microsoft IoT Central enters public preview: If your enterprise wants to empower line-of-business users to build IoT applications quickly, Microsoft says it has the answer in the form of IoT Central, which is now in public preview. 

IoT Central is a counterpart to Azure IoT Suite, which offers deeper customization capabilities and access to underlying services, with the tradeoff being a need for skilled developers. Both services provide quickstart application templates. IoT Central's browser-based Application Builder environment employs a wizard-like approach for creating models of IoT devices, setting application logic and parameters, and testing via simulation before live deployment.
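Microsoft doesn't publish the internal schema behind those device models, but the concept the Application Builder wizard walks through (a device template declaring measurements plus rules, simulated telemetry, rule evaluation) can be sketched roughly. All field names below are illustrative assumptions, not Microsoft's actual format:

```python
import random

# Hypothetical sketch of what an IoT Central-style device model captures:
# the measurements a device reports and rules evaluated against them.
device_template = {
    "name": "ConnectedChiller",
    "measurements": {
        "temperature_c": {"min": -10.0, "max": 50.0},
        "humidity_pct": {"min": 0.0, "max": 100.0},
    },
    "rules": [
        # Alert when temperature exceeds a threshold.
        {"measurement": "temperature_c", "operator": ">", "threshold": 8.0,
         "action": "email_alert"},
    ],
}

def simulate_reading(template, rng):
    """Generate one simulated telemetry message within the declared ranges."""
    return {name: rng.uniform(spec["min"], spec["max"])
            for name, spec in template["measurements"].items()}

def fired_actions(template, reading):
    """Evaluate the template's rules against a telemetry reading."""
    actions = []
    for rule in template["rules"]:
        value = reading[rule["measurement"]]
        if rule["operator"] == ">" and value > rule["threshold"]:
            actions.append(rule["action"])
    return actions

rng = random.Random(42)
reading = simulate_reading(device_template, rng)
print(reading, fired_actions(device_template, reading))
```

The point of the wizard is that a line-of-business user fills in the template and thresholds; the platform supplies the simulation and evaluation machinery.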

IoT Central taps into a rising tide of interest in low-code platforms aimed at empowering citizen developers. (Microsoft has some experience in this area already with PowerApps.)

While the focus is on simpler scenarios, IoT Central applications can scale out to millions of devices, Microsoft says. It leverages multiple components of Azure IoT Suite, such as Azure IoT Hub for device connectivity, but runs as a fully managed service. (For a deeper dive into IoT Central's technical details, go here.) Microsoft is offering a 30-day free trial for up to 10 devices; preview pricing for larger deployments is set at $0.50 per device per month.

POV: The speed and simplicity Microsoft promises IoT Central will deliver could be compelling for many enterprises, particularly those that have struggled to get IoT applications out quickly with other solutions (including Azure IoT Suite). But buying decisions will require a thorough side-by-side assessment of the feature sets of IoT Suite and IoT Central with regard to customization capabilities and pricing. For its part, Microsoft says IoT Central offers a "medium" degree of customizability.

Still to come are important features for IoT Central such as pre-built integrations with enterprise apps. Microsoft says those are coming for Dynamics, Salesforce and other products.

"It's always good to see vendors making things simpler," says Constellation Research VP and principal analyst Holger Mueller. At Build 2016, Microsoft held a workshop for Mueller and other analysts in which they connected a Raspberry Pi to Azure IoT. The task was ultimately manageable but required a lot of steps and had some stressful moments. "It's good to see it simplified even further."

UPS faces holiday crunch—again: This year's holiday shopping season saw an uptick in online sales so strong that UPS had to delay some deliveries by one or two days, the Wall Street Journal reports. In an attempt to keep up with demand, the company has mandated that its delivery drivers work up to 70 hours over an eight-day period. While that's allowed under federal law, the Teamsters union, which represents drivers, is planning demonstrations and potential legal actions if the mandate isn't reversed.

UPS has had these holiday congestion issues before, despite hiring thousands of additional seasonal workers as driver helpers and distribution center staff. It expects to deliver 750 million packages between the U.S. Thanksgiving holiday and December 31, a rise of 5 percent over last year, according to industry publication Freightwaves.

POV: As the world's largest package delivery company, UPS is a crucial bellwether in the rapidly changing world of logistics. It is currently in contract negotiations with the Teamsters, who are working under a five-year pact that expires July 31. The new deal will most likely be for a similar term.

While the bulk of negotiations will concern working hours, wages and benefits, talks could also broach the role of self-driving trucks and other autonomous forms of delivery in UPS's operations. The company has been testing drones for years, including one earlier this year that launches from a truck's roof, delivers a package to a location adjacent to the driver's route, then returns to its docking station.

The drones have huge potential for UPS's bottom line, as they could save many millions in fuel otherwise burned by trucks driving those routes, and they would also provide obvious operational efficiencies. They could, however, have a negative effect on the UPS driver fleet's livelihood. The Teamsters have said they are "closely monitoring" the development of drones at UPS and, based on their statement, won't be avid proponents of them.

Overall, the next few years will be interesting times at UPS as it juggles the challenges of meeting consumer demand, negotiating with its workforce and managing the rollout of new technologies.

Researchers predict most software will be written by computers in 2040: Scientists at the U.S. Department of Energy's Oak Ridge National Laboratory argue in a newly released paper that by 2040, the bulk of software code will be written by machines, with humans playing a highly diminished role:

The combination of machine learning, artificial intelligence, natural language processing, and code generation technologies will improve in such a way that machines, instead of humans, will write most of their own code by 2040. This poses a number of interesting challenges for scientific research, especially as the hardware on which this Machine Generated Code will run becomes extremely heterogeneous. Indeed, extreme heterogeneity may drive the creation of this technology because it will allow humans to cope with the difficulty of programming different devices efficiently and easily.

The authors cite a variety of projects and technologies that already exist for machine-generated code, including the Defense Advanced Research Projects Agency's (DARPA) Probabilistic Programming for Advancing Machine Learning program and Microsoft's DeepCoder. In the future, "if a human does need to write some code, they may find that they spend more time using autocomplete and code recommendation features than writing new lines on their own," they write.
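DeepCoder pairs a learned model with an enumerative search over a small domain-specific language. As a rough illustration of the search half only (no learned guidance, and a toy DSL of my own invention rather than DeepCoder's), a few lines of Python can already synthesize a program from input/output examples:

```python
from itertools import product

# A toy enumerative synthesizer in the spirit of DeepCoder, minus the learned
# guidance: search compositions of a tiny DSL for a pipeline that matches the
# given input/output examples.
DSL = {
    "reverse": lambda xs: list(reversed(xs)),
    "sort":    lambda xs: sorted(xs),
    "double":  lambda xs: [2 * x for x in xs],
    "tail":    lambda xs: xs[1:],
}

def synthesize(examples, max_depth=3):
    """Return the shortest DSL pipeline consistent with all examples."""
    for depth in range(1, max_depth + 1):
        for ops in product(DSL, repeat=depth):
            def run(xs, ops=ops):
                for op in ops:
                    xs = DSL[op](xs)
                return xs
            if all(run(inp) == out for inp, out in examples):
                return list(ops)
    return None  # nothing in the DSL explains the examples

examples = [([3, 1, 2], [2, 4, 6]), ([5, 4], [8, 10])]
print(synthesize(examples))  # → ['sort', 'double']
```

The learned model's job in systems like DeepCoder is to prune this exponential search, predicting which DSL operations are likely present before enumerating.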

POV: The paper is worth a read (h/t to the Register) but is framed as speculative, and perhaps rightfully so. Its authors give no specific reason for the 2040 prediction, and barely get into important software development topics such as requirements gathering, testing and security. But the paper's focus on the very real problem of increasing hardware heterogeneity, and how machine-generated code could help mitigate it, is a provocative one.


Event Report - Pivotal SpringOne 2017 - It’s all about PCF – and some Spring


We had the opportunity to attend Pivotal’s yearly user conference for the Spring developer community, SpringOne, held at Moscone West in San Francisco December 4th through 7th, 2017. With about 3000 attendees it is the best-attended Pivotal conference ever, a proof point for the popularity of the Pivotal products.

[I am writing this blog post after attending the analyst summit on Monday, and attending Tuesday at the conference – more news is coming out and I may revise my judgement at that point.]

 

Prefer to watch? Here is the video summary (if it doesn’t show up – find it on my YouTube channel here).
 


Here is the 1 slide condensation (if the slide doesn’t show up, check here):
 


Want to read on? Here you go: 
 
Pivotal Cloud Foundry (PCF) 2.0 is here – Pivotal felt that so much substantial work has happened on Cloud Foundry recently that it was worth revving the release number, making it PCF 2.0. And certainly, the addition of Kubernetes support with PKS (announced in August, now GA), the announcement of serverless capabilities (Pivotal Function Service – PFS) and the integration of the VMware NSX-T stack for networking and security all add a lot of new functionality. Support for Microsoft Azure Stack, more support for Windows containers (like auto-scaling) and access to Google Cloud Platform services round out the additional capabilities… and partners are flocking to PCF, the most prominent being IBM, which is adding Open Liberty as an embedded server option for Spring Boot, commercial support for the IBM WebSphere Application Server Liberty Buildpack in PCF and better integration with a whole plethora of IBM products and services.

 
Mee opens SpringOne


Pivotal moves into Serverless – No surprise, Pivotal announced its serverless plans, which supposedly will materialize across 2018. Serverless is powerful for enterprises building and operating their next-generation applications, and in order to keep enterprises and developers happy, Pivotal had to come up with its own serverless alternative. It looks well architected, with the usual Pivotal suite integration (RabbitMQ) but also Apache Kafka support, to manage the events that wake up the serverless functions. But it is early days - so stay tuned for more on this in the coming months.
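PFS details were not yet public at the time of writing, but the event-driven pattern described, a broker (RabbitMQ or Kafka in Pivotal's case) delivering messages that wake a stateless function, can be sketched with an in-memory stand-in. All names here are illustrative:

```python
from queue import Queue

# A minimal sketch of the event-driven pattern behind serverless platforms:
# a broker delivers events, and the platform invokes a stateless function
# once per event. Here an in-memory Queue stands in for RabbitMQ/Kafka.

def handle_order(event):
    """The 'function': stateless, invoked only when an event arrives."""
    return {"order_id": event["order_id"], "status": "processed"}

def run_worker(broker, handler):
    """Drain the broker, invoking the handler per event (the platform's job)."""
    results = []
    while not broker.empty():
        results.append(handler(broker.get()))
    return results

broker = Queue()
broker.put({"order_id": 1})
broker.put({"order_id": 2})
print(run_worker(broker, handle_order))
# → [{'order_id': 1, 'status': 'processed'}, {'order_id': 2, 'status': 'processed'}]
```

The appeal for operators is that everything around the handler function, scaling, retries, the broker plumbing, is the platform's problem rather than the application team's.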

 
If Onsie Then Emojis


Partners, partners… did I mention partners? – Enterprise software ecosystems have a fine nose for identifying a vendor that has momentum, and vendors that can partner tend to flock to it. Pivotal is no exception. Accenture launched a joint business unit with Pivotal, signaling the engagement of the large system integrators. The IaaS side is well represented with Google and Microsoft. Developer tools integration is happening with IBM, Microsoft and many more. Tech stack integration is happening with IBM and others. The Dell EMC keiretsu was there with Dell EMC (an on-premise version to run PCF), Virtustream (running PCF for you) and VMware (NSX most prominently). And many startups, e.g. Datadog, Solace etc. Year over year – since SpringOne in Las Vegas last year – I’d say the ecosystem has doubled in presence and efforts.

 
And yes - a new Spring Banner is unveiled
 

MyPOV

Pivotal is on a roll when it comes to Cloud Foundry and Spring. Enterprises want (and need) to build next-generation applications fast, and they naturally look for frameworks (Spring) to help them on a platform (Cloud Foundry). This created the “2nd spring for Spring,” as I stated a year ago (see here), for what otherwise was a developer community slowly fading away… not anymore, and good to see the revival. Serverless is an important innovation for Pivotal to keep new types of workloads in the fold… we now have to see how it materializes a few quarters from now. Almost ironically, there were more Cloud Foundry announcements than Spring announcements – at least on the first day, when I attended. The audience did not seem to mind; enterprises and developers know that at the end of the day the platform comes first.

On the concern side, while it was almost refreshing not to hear a mention of Machine Learning / AI at a conference in 2017, it still means that Pivotal will have to give its customers and users a solution in this important space. But fair enough, serverless first. Equally, the BigData / Hadoop relationship with the leading vendors is not mended, given Pivotal’s database history… but better to fix this sooner than later… and the only IaaS holdouts yet to come to the same level as Google, IBM and Microsoft are AWS and Oracle… but it is likely that Pivotal is out to get those players in 2018. The ecosystem and success that Pivotal has been able to create around Cloud Foundry is too much of an attraction not to be part of the party.

Overall great momentum for Pivotal, good innovation and announcements for both Cloud Foundry and Spring. Excited and eager customers, increased partner interest are all good signs that all is well in Pivotal land. Stay tuned.

Want to learn more? Check out the Storify collection below (if it doesn’t show up – check here). And a Storify collection of the analyst summit can be found here.
 
Find more coverage on the Constellation Research website here and check out my magazine on Flipboard and my YouTube channel here.

Digital Business Transformation; Technology outside of IT: Applying Digital Technology to practical, everyday business activities


Not a title designed to upset the IT department, just a reflection on the manner in which Technology has pervaded most aspects of current life to become a practical tool. Think of a modern car: a highly sophisticated technology environment of interconnected devices interacting with, and responding to, a wide range of events within a largely self-contained ‘enterprise’ system. It’s certainly not part of the IT role, nor are the Technologies necessarily the same, and there is no alignment with the Business functionality IT supports.

As the article ‘your car may be the most powerful computer you own’ makes clear, the functionality is focused on using Technology to ‘Read and Respond’, completely different in almost every way from the role of IT in providing internal administrative processes and transaction records. Using the increasingly popular terminology: Systems of Record equate to the role and technologies of IT, whereas Systems of Engagement feeding data to Systems of Intelligence in Digital Business are something entirely different!

Computational power got cheap, programming got easier, and more engineers understood how to make use of these changes. An Apple Watch has more processing power than an iPhone 4, which in turn has more power than a Cray supercomputer of the ’80s. (http://pages.experts-exchange.com/processing-power-compared/ ). Add the boost in connectivity brought by the Internet, with acceptance in use by the population at large, and the resulting transformation in functionality equates to the arrival of ‘Digital’ in everyday life.

Amazing changes that have resulted in us as individuals no longer seeing Technology as contained within the IT department. In fact, we have all got rather good at ‘innovating’: every time you load and start to use an App, it’s a personal innovation of some aspect of your life. Digital Technology is merely a convenient name for a group of new technologies, CAAST (Clouds, Apps, AI, Services and Things), that in combination enable an activity to use technology beneficially.

Hardly surprising that many of the innovations are straightforward, practical applications of these capabilities to everyday issues, like Triax Spot-r in the Construction industry. Big Enterprises might get headlines around Business Transformation, but transformation of industry-sector working practices is much more widespread. These changes in turn get adopted to evolve current Enterprise activities, and so the spread of the ‘Digital Revolution’ continues as a series of quiet success stories.

The success of Triax Spot-r is an excellent example: less a Technology story and much more a practical capability addressing a series of issues confronting ‘on the job’ management. In fact, deployment success is coming from construction workers and their management realizing how to get even greater value than the original business proposition.

“Our networked devices worn by every worker on the site provide real-time location visibility and keep you informed of safety incidents as they occur.” – Triax

Technically, Triax's Spot-r system integrates a number of the new Technologies in an imaginative manner: a proprietary wireless mesh network combined with wearable devices (see photo of a belt clip) that include an accelerometer, gyroscope and altimeter to give previously unavailable ‘real-time’ data. Spot-r's original proposition was to automatically check workers in and out of the job site and to notify of potential and actual site safety incidents picked up by its sensors (slipping, tripping, jumping and falls), to aid site supervisors in fulfilling their legal obligations to manage a safe site.

Spot-r also provides the geolocation of potentially or actually injured workers to improve response times for providing aid. The inclusion of a self-alert button allows workers to report unsafe conditions, site hazards or other potential injuries in real time directly from their work area. All good personal safety features that encourage workers to want to wear a Spot-r device.
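Triax does not disclose its detection logic, but a common textbook approach to fall detection with exactly these sensors is to look for a near-free-fall dip in acceleration magnitude followed by an impact spike. The sketch below, with illustrative thresholds, shows the idea:

```python
import math

# Hedged sketch of accelerometer-based fall detection (not Triax's actual
# algorithm): flag a fall when total acceleration drops near free-fall
# (near-weightlessness) and then spikes on impact shortly after.
FREE_FALL_G = 0.4   # magnitude below this suggests free-fall
IMPACT_G = 2.5      # magnitude above this suggests an impact
WINDOW = 10         # samples within which impact must follow free-fall

def magnitude(sample):
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def detect_fall(samples):
    """Return True if a free-fall dip is followed by an impact spike."""
    mags = [magnitude(s) for s in samples]
    for i, m in enumerate(mags):
        if m < FREE_FALL_G:
            if any(m2 > IMPACT_G for m2 in mags[i + 1:i + 1 + WINDOW]):
                return True
    return False

standing = [(0.0, 0.0, 1.0)] * 20  # ~1 g while upright
fall = standing[:5] + [(0.0, 0.0, 0.1)] * 3 + [(0.5, 0.5, 3.0)] + standing[:5]
print(detect_fall(standing), detect_fall(fall))  # → False True
```

Real systems add the gyroscope and altimeter readings to cut false positives (e.g. distinguishing a dropped clip from a fallen worker), which is presumably where the proprietary value lies.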

Business value is delivered to site supervisors and managers by a cellular-connected mobile dashboard (see photo) backed by the Triax cloud service. In addition to the direct real-time safety information, all data is aggregated into each worker’s time & attendance, location, subcontractor activity, and any incident types.
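The kind of roll-up such a dashboard implies can be sketched as a simple aggregation; the event shape and names below are hypothetical, not Triax's data model:

```python
from collections import defaultdict

# Hypothetical raw events from wearables: check-in/out hours plus any
# incident picked up by the sensors, tagged with the worker's subcontractor.
events = [
    {"worker": "w1", "sub": "Acme Electric", "hours": 8.0, "incident": None},
    {"worker": "w2", "sub": "Acme Electric", "hours": 7.5, "incident": "slip"},
    {"worker": "w3", "sub": "BuildCo",       "hours": 8.0, "incident": None},
]

def daily_report(events):
    """Roll raw events up into per-subcontractor man-hours and incident counts."""
    report = defaultdict(lambda: {"man_hours": 0.0, "incidents": 0})
    for e in events:
        row = report[e["sub"]]
        row["man_hours"] += e["hours"]
        row["incidents"] += 1 if e["incident"] else 0
    return dict(report)

print(daily_report(events))
# → {'Acme Electric': {'man_hours': 15.5, 'incidents': 1},
#    'BuildCo': {'man_hours': 8.0, 'incidents': 0}}
```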

An Open API allows developers to add further functionality and integration, such as with Procore Technologies' project management platform. This integration automatically sends Spot-r worksite data, including man-hours and safety incidents, to Procore for accident, timecard, manpower and daily construction reports. Not surprisingly, insurance companies appreciate Triax Spot-r as a means of cutting the cost of injuries on construction sites, which offers site operators a further incentive in the form of lower insurance premiums.

Spot-r offers a classic win-win proposition for site operators, supervisors, managers, even the workers, and above all is a valuable aid to fulfilling safety legislation compliance. It is made possible by a clever combination of the new Digital Technologies, it’s not part of Enterprise IT, and neither does it demand a business transformation. It’s a great example of the large number of adopt-and-deploy moves across many different industry sectors, driven entirely by practical business cases from ‘hands-on’ business managers: the quiet and little-publicized take-up of Digital technology to provide tangible, immediate improvement to existing operations.

A conversation with an early-adopter site manager on a large New York site with 400 construction workers produced an enthusiastic endorsement, reeling off a multiplicity of ways that Spot-r was significantly improving operations. The outcome is that Spot-r is scheduled to be rolled out on all the other construction sites that his employer operates. The benefits of a tangible increase in site and personal safety have even overcome the initial Union and worker resistance to change common to the construction industry sector.

Not surprisingly, Spot-r has attracted a lot of attention, with Business Insurance, an industry-sector publication, selecting the product for an innovation award. The benefits are simple to grasp, deployment is straightforward and, importantly with this type of low-cost solution, the message is easily spread by social media, including a YouTube clip. A wide range of Building, Facilities Management and Construction publications have also been quick to cover the product and its contribution to safety and improved site management.

All in all, Triax Spot-r makes an excellent example of the kind of practical implementation that is the reality of ‘Digital Technology’: straightforward, direct improvements to current activities providing an evolutionary, non-threatening way to gain direct benefit, unlike the big strategic reports with their focus on large-scale, high-risk business model transformations. As an example, Construction Global featured in its July 2017 edition an in-depth study of the future of the construction industry by Balfour Beatty, a leading global construction company, entitled Innovation 2050. Clearly aimed at Board-level strategic thinking, it presented ten well-thought-through ‘big’ conclusions that would ‘transform’ Building and Construction.

No doubt the conclusions are a necessary ‘wake up call’ to Boards and senior management, but is it right to take the report as recommending Boards make immediate ‘revolutionary change’ through enterprise-wide ‘Transformation’? That is a high-risk path, and noticeably few, if any, of these reports offer direct recommendations for ‘practical’ deployment activities featuring particular products. It seems much more likely that the Building and Construction sector (and other industry sectors) will evolve along a graceful path of ever-increasing ‘evolutionary’ deployments identified by operational Business managers spotting opportunities.

That’s not to say there is no value in reports such as Innovation 2050; there is a need to make sure senior management is encouraging adoption, rather than hindering it, with an eye to the future. In much the same way, the role of IT lies in making sure that, at an Enterprise level, Technology works for the entire Enterprise and doesn’t support individual business deployments at the expense of the whole. The big picture of the longer-term impact of Digital Technology transforming markets into Digital Business is important, but equally so is keeping attention focused on encouraging ‘hands-on’ managers and supervisors to identify and deploy practical Digital upgrades.

 

Addendum;

https://www.constellationr.com/blog-news/increasing-digital-competitiveness-your-current-business-model-lessons-ge-and-industrie-40


Digital Transformation Digest: CVS-Aetna Merger Is All About the Analytics, IBM Rolls Out Power9 Systems for AI Workloads, and More


Constellation Insights


The CVS-Aetna merger is all about the analytics: CVS Health has made its bid for Aetna official after months of speculation, saying it will pay $69 billion for the insurer in a deal that on the surface has as much risk as reward. The plan is to leverage CVS's nearly 10,000 retail pharmacy locations—an increasing number of which also contain MinuteClinic care centers—along with Aetna's substantial subscriber base, many of whom are already CVS customers, adding new services while driving down costs along the way. CVS is using about $4 billion of its own cash, with the rest coming from new debt and equity.

In a statement, CVS and Aetna characterized the merger as "a natural evolution for both companies as they seek to put the consumer at the center of healthcare delivery." The companies plan to drive the merger's success through their respective analytics and data platforms. Each brings somewhat differing strengths to the table. Aetna has for years been building out a deep data science and predictive analytics bench (and has had great success using it to combat fraud), while CVS has invested in an analytics platform from Epic, the large EHR (electronic health records) software vendor, among many other ventures.

The potential combinations of CVS's retail purchase and pharmacy data—both of which are tracked through its ExtraCare rewards program—along with Aetna's rich pools of provider-side data are all about creating a personalized health care experience that leads to reduced costs and improved patient health, the companies say. Here's one example provided by CVS and Aetna:

Twenty percent of Medicare patients are readmitted to the hospital soon after being discharged at significant annual costs, much of which is avoidable. Readmission rates can be cut in half if patients have a complete review of their medications after discharge from the hospital to help them manage their care at home.
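The arithmetic behind that claim scales directly with discharge volume. With an assumed cohort of 100,000 discharges and an illustrative per-readmission cost (neither figure is from CVS or Aetna):

```python
# Worked example of the quoted readmission claim: 20% of discharged patients
# are readmitted, and a post-discharge medication review cuts that in half.
# The cohort size and dollar figure are assumptions for illustration only.
discharges = 100_000
readmission_rate = 0.20
reduction = 0.5                 # "readmission rates can be cut in half"
cost_per_readmission = 15_000   # illustrative cost per readmission

avoided = discharges * readmission_rate * reduction
savings = avoided * cost_per_readmission
print(f"{avoided:.0f} readmissions avoided, ${savings:,.0f} saved")
# → 10000 readmissions avoided, $150,000,000 saved
```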

POV: CVS and Aetna's merger is unprecedented from an industry standpoint, making the cultural adjustments found with any corporate merger perhaps more difficult than most. (One wonders if the belief that Amazon will enter the retail pharmacy business helped make CVS pull the trigger on such an expensive deal sooner rather than later—particularly as competitors such as Walgreens are stumbling financially.) It must also pass antitrust muster with U.S. authorities.

CVS and Aetna are giving themselves an ample timeline for clearing that hurdle, saying they expect the deal to close in the second half of next year. The integration work that follows will no doubt play out over a number of years, but CVS and Aetna will have to make serious decisions about their analytics strategy going forward much sooner. The good news is that both companies have strong technical leadership in the persons of CVS CIO Stephen Gold and Aetna EVP Meg McCarthy.

IBM launches new Power Systems geared for AI: While IBM has lagged behind Amazon, Google and Microsoft in the cloud platform market, it's hoping to be the leader in AI workloads with the introduction of new Power Systems Servers that incorporate the Power9 microprocessor.

Built specifically for compute-intensive AI workloads, the new POWER9 systems are capable of improving the training times of deep learning frameworks by nearly 4x, allowing enterprises to build more accurate AI applications, faster.

The system was designed to drive demonstrable performance improvements across popular AI frameworks such as Chainer, TensorFlow and Caffe, as well as accelerated databases such as Kinetica.

As a result, data scientists can build applications faster, ranging from deep learning insights in scientific research, real-time fraud detection and credit risk analysis.

IBM has the U.S. Department of Energy as an initial Power9 customer; the agency will use the chips in its Summit and Sierra supercomputers. Google also plans to use POWER9 in its data centers.

POV: IBM has been working on Power9 for four years, and the fact that the likes of Google, which has developed its own AI-oriented chips, is using them speaks to the progress Big Blue has made. The Power9 systems also incorporate GPUs from NVIDIA.

Power9 supports up to 5.6 times more I/O and twice the threads of "its x86 contemporaries," IBM said, in an allusion to Intel's x86-based servers that hold a 90-plus percent market share in data centers. IBM is hoping to capture 20 percent of the market through Power9 by 2020, Network World reports. That may be a lofty goal but IBM will certainly try its best. One way it will surely seek attention for Power9 is through deep learning benchmarks and its PowerAI software.

Earlier this year, it announced the results of tests using a 64-server Power system with 256 NVIDIA GPUs on the Caffe deep learning framework, saying it had bested a team from Facebook's AI research arm. The question is whether IBM can succeed in getting customers to move deep learning projects to its cloud—which will surely offer Power9-based instances soon—or make the investment in them for on-premises systems.

New York AG, senators demand net neutrality vote delay: As with all major policy decisions, the Federal Communications Commission held a public comment period concerning its upcoming vote to overhaul so-called net neutrality rules. Underscoring the topic's interest among the public, the period drew more than 22 million comments. The problem, say New York attorney general Eric Schneiderman and a group of 28 senators, is that more than a million comments supporting the overturning of net neutrality may have been fake ones posted via bots.

The FCC board has a Republican majority, led by chairman and prominent net neutrality critic Ajit Pai, and it seems a foregone conclusion that the rules will be voted down. Net neutrality, which was passed in 2015 after years of debate, prohibits ISPs from favoring legal Internet traffic based on payments or other considerations. But opponents characterize the rules as an overreach that's been bad for competition and ultimately, consumers.

"A transparent and open process is vitally important to how the FCC functions," the officials said in a letter to Pai. "The FCC must invest its time and resources into obtaining a more accurate picture of the record as understanding that record is essential to reaching a defensible resolution."

POV: Thousands of the fake comments used the actual names and addresses of New York residents, which amounts to identity theft, Schneiderman wrote in a post on Medium. Schneiderman's office asked for relevant records in the course of its investigation nine times, but the FCC provided "no substantive response," he wrote.

This week, the FCC's inspector general agreed to cooperate in Schneiderman's investigation, but it's not clear whether anything will stop the Dec. 14 vote from occurring. For one thing, the public comment period has been over for some time, making it procedurally difficult to call for a do-over. Still, net neutrality proponents have other options; federal law states that decisions made by agencies such as the FCC can be overturned if they're deemed to have been made in an "arbitrary" or "capricious" manner.


Progress Report - Ultimate Software Analyst Day 2017 - Keeping the momentum


We had the opportunity to attend Ultimate Software’s first analyst summit, held at the vendor’s headquarters in Weston, Florida, on November 16th. Always good to be at an inaugural analyst meeting, and there was good attendance by the usual HCM analysts.

 
 
Here is the 1 slide condensation (if the slide doesn’t show up, check here):
 

Want to read on? Here you go:

Deeper Insights into the Ultimate People Culture – We had the chance to get another glimpse into the Ultimate Software culture, with Chief People Officer Maza joining us for dinner. A truly remarkable story, in which more aspects get uncovered each time… what stood out this time was that founder Scherr preferred to go into debt over laying off employees. And certainly, a vendor’s culture is strong when… the waiter at the restaurant knows all about it. On the analyst day we had CEO Scherr share more on this, including the trust card and objective coin. All good principles of people leadership, built on the conviction that happy employees work better, resulting in happier and more satisfied customers.
 
Scott Scherr takes us through Ultimate Software History

Workforce Management on Track – One of the bigger functionality items that Ultimate announced at Ultimate Connections this spring was Workforce Management. The vendor confirmed that the new capability is on track for a January 2018 general availability. It will be interesting to monitor adoption and further roadmap – and what it means for Ultimate’s partnership with Infor. And overall the yardstick in workforce management has been moved with Kronos’ recent launch of Workforce Dimensions (see here). It certainly is a key area in HCM going forward – payroll relevance, gig economy potential and compliance aspects all make workforce management a must have capability for HCM vendors. 
 
 
Ultimate Software Constellation Research Holger Mueller
Software & Services happy together (Dodd & Rogers)


(New) Mobile Application Adoption Good – Ultimate has had an uneven mobile user experience in the past and announced plans to rectify this at Ultimate Connections. The mobile application (for both iOS and Android) has shipped and has seen good adoption. Mobile remains the preferred access point for most HCM interactions, so it is important for Ultimate customers that the vendor gets this right – and it certainly looks like it is. Now Ultimate has to keep it that way.
 
Ultimate Software Constellation Research Holger Mueller
2 confident sales leaders - Swick & Phenicie


Marketplace Progress Good – Marketplaces are key for modern enterprise software sales, as the days of ‘minibus load visits’ are coming to an end. HCM buyers want to see, try and buy software fast, and that means they have to look at marketplaces. The Ultimate Marketplace, launched just recently, has made good progress in both partner adoption and market impact. It is early days, and differentiation in the field is getting hard, but this is a must-have area for all vendors, including Ultimate – so it is good to see the progress.
 
Ultimate Software Constellation Research Holger Mueller
Hartshorne plays a game
 

MyPOV

After announcing its most ambitious product roadmap agenda at Ultimate Connections in Las Vegas this spring, Ultimate has been making good on the delivery of that roadmap. And that matters, as the vendor was slowly but steadily falling behind – not so anymore, if the current development speed and output can be maintained and met with customer adoption and revenue in the coming quarters and years. Ultimate has made progress with its AI platform and assistant Xander and now needs to keep the momentum, to build on its leadership as the first HCM vendor with a working assistant.

On the concern side, Ultimate must increase the speed of its UX renovation and innovation. There are too many screens of at best mediocre UX quality left in the system, especially at the manager and HR professional level. Maybe Ultimate is banking on replacing a large part of these with a chat- and voice-based assistant, trying to leapfrog this UI innovation cycle. Ultimate Connections 2018 will give us the answer to this question.

But overall, this is a new and re-energized Ultimate Software: committed to people, customers and R&D, and on track for US$1B in revenue in 2018. Stay tuned.



 

Amazon Web Services Adds Yet More Data and ML Services, But When is Enough Enough?


Amazon Web Services CEO Andy Jassy invoked the Lauryn Hill song “Everything Is Everything” at this week’s re:Invent event in Las Vegas to underscore his assertion that AWS has more than twice the number of services of any other public cloud. The question is, will the services catalog ever become – or, indeed, is it already – so extensive that it is unwieldy from a customer development, deployment, and cost-management perspective?

This year’s re:Invent followed the more-more-more pattern of past events, with more attendees, more exhibitors, more floor space and, you guessed it, yet more services and capabilities announced. That was certainly the case in the data and analytics arenas, with announcements across database, big-data management, analytics, machine learning (ML) and artificial intelligence (AI). Sometimes less is more, however, a point I’ll get back to in my conclusion, but let’s start with a recap of what I see as the most important data-to-decisions related announcements.

Aurora Upgrades Promise Easier Deployment, Cost Saving and Compatibility

Aurora is Amazon’s flagship database service, aimed at winning converts from the likes of Oracle Database, Microsoft SQL Server and IBM Db2 with what AWS says is comparable or better durability, availability and performance at as little as one tenth the cost. Aurora is a decidedly commercial offering, too (only available on AWS), but it’s compatible with both MySQL and, as of October, PostgreSQL, the two most popular open-source relational databases. Aurora launched with MySQL compatibility, but PostgreSQL offers closer compatibility with enterprise-focused capabilities supported in Oracle and Microsoft SQL Server.

At re:Invent, AWS announced two Aurora upgrades. Aurora Serverless (now in preview) will deliver automatic, on-demand scaling (both up and down), which will simplify deployment, ease ongoing management and align cost with usage. Aurora Multi-Master, another preview capability, scales both reads (already available) and writes (the part that’s new). Sometime next year, this Multi-Master capability will be extendable across multiple regions. In short, Multi-Master promises performance, consistency and high availability at scale, and with multi-region support these traits will span even global deployments.
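To make the Serverless model concrete, here is a minimal sketch of the kind of request an Aurora Serverless cluster takes, written as the parameter dictionary one would pass to boto3's RDS `create_db_cluster` call. The cluster identifier and capacity values are hypothetical illustrations, not AWS defaults.

```python
# Sketch: parameters for provisioning an Aurora Serverless cluster via
# boto3's rds.create_db_cluster. Identifier and capacity values are made up.
serverless_cluster = {
    "DBClusterIdentifier": "demo-aurora-serverless",  # hypothetical name
    "Engine": "aurora",                               # MySQL-compatible Aurora
    "EngineMode": "serverless",                       # the new on-demand mode
    "ScalingConfiguration": {
        "MinCapacity": 2,               # scale down to 2 capacity units when idle
        "MaxCapacity": 16,              # scale up automatically under load
        "AutoPause": True,              # pause entirely after inactivity
        "SecondsUntilAutoPause": 300,   # ... of five minutes
    },
}

# With real AWS credentials this dict would be passed as keyword arguments:
# boto3.client("rds").create_db_cluster(**serverless_cluster)
```

The point of the sketch is the `EngineMode` and `ScalingConfiguration` pair: capacity bounds and auto-pause replace the instance sizing decisions a conventional Aurora deployment requires.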

MyPOV on Aurora upgrades: The Serverless move was a no-brainer; it was only a matter of time. Multi-Master may be a response to customer demand, as AWS claimed, but it also answers Google’s introduction of Spanner, that vendor’s globally scalable relational database. (Similarly, the DynamoDB Global Tables announcement at re:Invent answers Microsoft's introduction of the globally distributed Azure Cosmos DB.) Global deployment is far from a mainstream demand, however, so this is more of a battle for bragging rights. The real mainstream market changer is the generally available PostgreSQL compatibility, which will make it easier to migrate workloads running on Oracle or SQL Server into AWS on Aurora without extensively rewriting queries and database functionality. It’s a less flashy announcement, but it’s the most significant in terms of winning new customers.

Neptune Graph Database Service

AWS entered a whole new database category with the introduction of Neptune, a graph database service now in “limited preview.” Graph analysis is about exploring network relationships, as in people in a social network; customers and influencers in a retail or telco environment; employees, candidates, organizations and job openings in an HR context; and network nodes and assets in a national-intelligence, security or IT-network analysis context.

Most companies do this work with graph-analysis features that have been grafted onto relational databases. But graph databases designed for the task do a better, more flexible job when exploring millions or even billions of relationships. To give customers a choice, AWS has designed Neptune to support both the Property Graph model and W3C's Resource Description Framework (RDF) model, along with their respective query languages, Apache TinkerPop Gremlin and SPARQL.
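To illustrate the difference between the two models Neptune supports, here is a toy, plain-Python rendering of the same facts in property-graph form (nodes and edges carrying key-value properties, the shape Gremlin queries) and in RDF form (bare subject-predicate-object triples, the shape SPARQL queries). The sample data and helper function are made up for illustration and have nothing to do with Neptune's actual storage.

```python
# Property-graph style: edges are records with a label plus arbitrary properties.
edges = [
    {"from": "alice", "label": "follows", "to": "bob", "since": 2016},
    {"from": "bob", "label": "follows", "to": "carol", "since": 2017},
]

# RDF style: the same facts flattened to (subject, predicate, object) triples;
# the "since" property would itself become additional triples via reification.
triples = [
    ("alice", "follows", "bob"),
    ("bob", "follows", "carol"),
]

def followers_of(person, triples):
    """Who follows `person`? Roughly what a Gremlin in('follows') step or a
    SPARQL '?s follows <person>' pattern asks of a graph store."""
    return [s for (s, p, o) in triples if p == "follows" and o == person]

print(followers_of("bob", triples))  # -> ['alice']
```

A real graph database answers this kind of relationship question in near-constant time per hop, which is where it pulls away from join-heavy relational emulations as the network grows.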

MyPOV on Neptune: Amazon made some pretty sweeping disparaging statements about the scalability, durability and performance of existing open-source and commercial graph database options. Neo4j, for example – an open-source database with commercial enterprise edition and managed service options – supports both clustering and, more recently, multi-data-center deployment. Neptune is in limited preview, so we can only take Amazon’s word that it will deliver better and more reliable performance. From a technology perspective, Neptune will compete most directly with Neo4j and Titan/JanusGraph. But the real competition and biggest market opportunity is making a dent in the use of less-adept graph-analysis features of more expensive databases, including Oracle and Microsoft SQL Server. IBM has released Compose for JanusGraph, so it, too, is betting on a graph database service.

SageMaker Introduces Yet Another Model-Management Environment

How many times this year have I heard vendors talking about making data science easier -- particularly ML? Let’s see, there’s Cloudera’s Data Science Workbench, IBM’s Data Science Experience, Microsoft’s next-generation Azure ML, Databricks (on AWS, and soon to be released on Azure)… I’m sure there are more. At re:Invent, AWS announced that Amazon SageMaker will join the crowd.

The idea with SageMaker (like the others) is to make it easier for data scientists, developers and data-savvy collaborators to build, train and run models at scale. A lot of these model-management environments rely on open-source notebooks, and that’s the case with SageMaker, too, as it uses Jupyter notebooks. Options for model building include a built-in set of 10 popular algorithms that AWS says have been optimized to run on its infrastructure, thereby improving performance and saving money on compute requirements. You can also use TensorFlow, MXNet and, AWS promises, other frameworks soon, giving customers choice.

That’s a good start, but a key differentiator for SageMaker comes in lifecycle stages including training, where there’s a “one-click” option for serverless, autoscaling training. That’s one area where there’s typically a lot of manual work. Other time and labor savers include automatic hyperparameter tuning (in preview) and one-click deployment, also with serverless autoscaling.
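To show what hyperparameter tuning automates, here is a toy exhaustive search in plain Python. SageMaker's tuner uses smarter search strategies than a grid and launches a real training job per candidate; this sketch only shows the concept, with a made-up quadratic standing in for validation error.

```python
# Toy hyperparameter search: pick the learning rate that minimizes a
# validation-error function. A real tuner trains a model per candidate;
# here a quadratic with its minimum at 0.1 stands in for that expensive step.

def validation_error(learning_rate):
    # Stand-in for "train the model, then measure error on held-out data".
    return (learning_rate - 0.1) ** 2

candidates = [0.001, 0.01, 0.1, 0.5, 1.0]  # the search space
best = min(candidates, key=validation_error)
print(best)  # -> 0.1
```

Multiply this loop by many hyperparameters, long training runs and cluster provisioning, and it is clear why automating it (and doing so with guided rather than exhaustive search) saves real time and money.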

MyPOV on SageMaker: The training and deployment automation features sound very promising, but you’ll have to forgive me for taking a wait-and-see attitude after so many announcements this year. The other model-management environment I was impressed by this year was Microsoft’s next-generation Azure ML, which is currently in preview. Microsoft’s environment promises data-lineage and model-change auditing throughout the development and deployment lifecycle. SageMaker doesn’t offer these capabilities currently, but an executive told me AWS expects to add them.

Data-lineage, auditability and transparency are crucial not just for regulated banks and insurance companies. Constellation sees transparency and ML/AI explainability as something that organizations and industries will demand as we embrace predictive and prescriptive systems that recommend and automate decisions. There have been plenty of examples where biases have been discovered in decision systems that impact people’s lives.   

MyPOV on Reinvent Overall

Once again, re:Invent was impressive, and the sheer number of announcements was stunning. I could cite at least a dozen other notable data-to-decisions-related announcements, from AWS IoT Analytics to Amazon Translate (real-time translation) to Rekognition Video (object/activity/face detection) to Amazon Transcribe (real-time, multi-language transcription). To Jassy’s point, having everything one could possibly need probably is everything to a developer. But when is enough enough?

My point is not to eliminate services and take away capabilities. But AWS CTO Werner Vogels pointed out in his keynote that the company has released a whopping 3,951 new services and capabilities since the first re:Invent event in 2012. The sheer number has sometimes been “confusing and hard to deal with,” Vogels admitted. He went on to talk about the administrative tools and services that Amazon has come out with to ease cloud architecture, development, deployment, operational management and cost/performance optimization. This includes everything from CloudFormation, CloudWatch, Config, and CloudTrail to Config Rules, Cost Explorer, Inspector and Trusted Advisor.

So, yes, AWS is doing a lot to make working on the platform simpler, easier and more cost-effective, but I’ll close with three proposals to shift the emphasis and communications agenda a bit at re:Invent 2018.

Put the emphasis on improving existing services. Wherever possible, build on existing services rather than introducing yet another one. DynamoDB Global Tables and on-demand Backup and Restore, for example, are new features added to one of AWS’s oldest services. Werner Vogels noted that AWS likes to get new services out there even if it knows that certain features are wanting; that way it can get customer feedback on how to improve the service. I would submit that AWS is now so large that it would do well to add value to existing services first and take more time to polish new services before introducing them. I’d also make a point of highlighting upgrades to existing services at re:Invent so customers recognize the growing value of the services already in use.

Put management and administrative services in the spotlight. This year there were a whopping 61 new product announcements overall at re:Invent, yet only two in the “Management” category: AWS Systems Manager and a new logging feature added to AWS CloudTrail. Systems management may not be as sexy as a new AI or ML service, but AWS should make a point of using re:Invent to announce and highlight new capabilities that will help companies spend less, simplify, save employee time and get more bang for the buck. It may be that AWS Systems Manager didn’t get much limelight at re:Invent because it seems to be a makeover of Amazon EC2 Systems Manager, introduced at re:Invent 2016. According to a blog post on the new AWS Systems Manager, it “defines a new experience around grouping, visualizing, and reacting to problems using features from products like Amazon EC2 Systems Manager.” As the scale of AWS grows and companies use more and more services, management tools and services should keep pace and take advantage of the most advanced technologies AWS is applying in other areas.

Bring more automation and AI to building and management capabilities. Following up on the last point, I was really intrigued by Werner Vogels’ discussion of the AWS Well-Architected Framework and Well-Architected Principles, but everything under this category seems to be about white papers, best-practice documents, and case studies. That’s all great, but I sense an opportunity to turn this content into helpful services or, better still, new advisory features embedded into existing services. Point the sexy stuff, like machine learning and AI, at how customers use AWS, and surface recommendations at every stage of development, deployment and operations. That seems to be the focus of some of the automation tools mentioned above, but let’s see more. Maybe even embed some of these capabilities directly within tools and services so it’s not up to administrators and managers to fix bad practices. These are areas where AWS should excel. If you help customers use AWS well and cost-effectively, they will be even happier and more loyal than they are today.

Related Reading:
Salesforce Dreamforce 2017: 4 Next Steps for Einstein
Oracle Open World 2017: 9 Announcements to Follow From Autonomous to AI
Microsoft Stresses Choice, From SQL Server 2017 to Azure Machine Learning

 


Monday's Musings: Want AI Ethics? Learn From These Four Movies/TV Shows!


 

The Convergence of Technology, Society, And AI Ethics Is Already Among Us

Over the past year, almost every conversation on artificial intelligence involved a discussion on ethics, humanity, and policy.  Sometimes a picture is worth a thousand words.  In this case, one can look to Hollywood for a few good suggestions to ponder.  Four AI geek classics bring out unique points to consider in any AI ethics conversation:

1. The Matrix (1999) Shows Why Humans Are Different

The Matrix raises many philosophical points about perceptions of reality.  For instance, has a general artificial intelligence fully manipulated humanity?  Does morality exist in a dreamed state?  As with Descartes’ meditations, what ethics apply when reality is blurred?  Understanding that humanity has the capability to break the rules and create new rules is a key consideration in the design of general artificial intelligence.

2. Assassin’s Creed (2016) Highlights The Heart Of Humanity – Free Will


Callum Lynch gains the memories of his ancestor Aguilar de Nerha and trains as a Master Assassin before taking on the secret Templar society.  The Assassins uphold the primacy of free will, while the Templars represent determinism.  The Templars advocate a utopian world where control and the perfection of humanity are kept in a state of order and discipline at the expense of free will.  While the real world must balance free will and determinism, if society enables a general purpose artificial intelligence to go too far and take away free will, the essence of humanity will vanish.

3. Ghost In The Shell Addresses Security Required For Augmented Humanity, Cybernetics, And AI Convergence


Ghost in the Shell shows a world where robotics, computer technology, and biomedical engineering have enabled humanity to control augmented prostheses and cyberbrains.  Those with cyberbrains face the risk that cyberhackers can control their unwilling victims.  The movie highlights the societal implications of digitization and unchecked control as human augmentation goes mainstream.

4. Person of Interest Explores The Implications Of Trading Privacy For Security, Convenience, And Order With AI


The blockbuster five-season series explores the creation of a general purpose AI designed to protect the post-9/11 world.  The series explores the trading of privacy for convenience, security, and order, and what happens when an AI is unleashed on a world in which humans are irrelevant.

The Bottom Line: There Will Be No Universal Ethics For AI But Time Is Of The Essence

Ethics reflects the values humanity has placed on itself as rules of engagement.  Applying a universal set of ethics will be a fool's errand.  However, the discussion of what contextual attributes, values, biases, and controls apply must be held in a public forum in order to find the right balance for humanity and machine to co-exist.

Time is of the essence.  Our smartest minds must work on this soon and establish AI ethics the same way society approached biomedical ethics.

 

Buy The Report Now

The rush to exponential technologies in new business models has placed artificial intelligence (AI) at the forefront of boardroom priorities for 2018. As leaders move beyond the AI hype, the journey toward AI requires both a business mindset and the institutional fortitude to invest in the building blocks for success.

This report provides a framework on how to realize the path to full AI, design for infinite ambient orchestration, build any AI-driven smart service and identify when to automate with AI.

Purchase the report now or login to the Constellation website for access.

Your POV.

Ready to roll out your plans for AI?  Do you understand the business model implications?  Who will you partner with for AI?  What are the ethical considerations that must be addressed?  What’s your favorite show or movie that addresses AI ethics? Add your comments to the blog or reach me via email: R (at) ConstellationR (dot) com or R (at) SoftwareInsider (dot) org.

Please let us know if you need help with your Digital Business transformation efforts. Here’s how we can assist:

  • Developing your digital business strategy
  • Connecting with other pioneers
  • Sharing best practices
  • Vendor selection
  • Implementation partner selection
  • Providing contract negotiations and software licensing support
  • Demystifying software licensing

Reprints can be purchased through Constellation Research, Inc. To request official reprints in PDF format, please contact Sales.


Event Report - Kronos KronosWorks - Kronos unleashes the Falcon - launches Workforce Dimensions


We had the opportunity to attend Kronos’ yearly user conference, held at the Aria in Las Vegas from November 12th to 15th, 2017. The conference was well attended, similar to last year, and we sensed increased partner interest and exhibition demand compared to 2016.

 
Here is the one-slide condensation (if the slide doesn’t show up, check here):



 


Want to read on? Here you go:

Kronos unleashes Falcon – launches Workforce Dimensions. Completely new products / product suites are seldom launched by surprise. But Kronos managed to do this – keeping the cloak of confidentiality on its new product, Workforce Dimensions, for over 3 years. The approach that Kronos took was remarkable: It formed a separate team with the objective to disrupt Kronos. The team was physically separated from the existing R&D team and laid the foundation for Falcon, the codename for what is today Workforce Dimensions. Moreover, Kronos took the time to listen and had the stamina, guts and discipline to build Workforce Dimensions out to the point of functional parity with its two other key products – Workforce Central and Workforce Ready. Truly remarkable, and a departure from the ‘announce first, deliver later’ approach typically seen in enterprise software.



 
Holger Mueller Constellation Research Kronos Kronosworks 2017
Workforce Dimensions


Workforce Dimensions – a modern WFM system. In Workforce Dimensions, Kronos has done many things right, but let’s start with the most crucial ones: Kronos has re-started with a brand-new domain model. Often overlooked, but if gotten right, this is a key capability for supporting 21st century best practices that were often hard if not impossible to achieve on older systems. Moreover, Kronos takes advantage of machine learning to power interesting capabilities like Workforce Advisor and to assure compliance. And lastly, Workforce Dimensions is built on a new platform, D5… and D5 runs on Google Cloud Platform (GCP). These last decisions are likely to be key, as all SaaS vendors are moving to IaaS platforms (instead of running their own data centers).


 
Holger Mueller Constellation Research Kronos Kronosworks 2017
Ain makes commitment to existing customers


Commitment to Workforce Central and Workforce Ready. When new products are announced, customers with experience in the enterprise software market are immediately concerned about the R&D commitment to, and support / maintenance of, existing products. The only way for vendors to address these concerns is to issue roadmaps and deliver on the R&D, support and maintenance commitments. At the launch of a new product, a vendor cannot do more than make the commitments, and Kronos has done well making them. For instance, the vendor committed US$100M in R&D for both Workforce Central and Workforce Ready in 2018. Now Kronos has to follow up and deliver to keep customers comfortable. The last thing vendors with new products want to see happening is an RFP.


 
Holger Mueller Constellation Research Kronos Kronosworks 2017
Kronos Workforce Dimensions beta customers


Google Cloud Platform is the IaaS for Kronos Workforce Dimensions. SaaS vendors are picking their IaaS partners, as doing so turns capital expenditure (CAPEX) into OPEX… and if done right, it should yield more budget for R&D of the SaaS product. Kronos’ choice of GCP is not the typical first choice, so it is quite a coup for Google. But the machine learning and performance capabilities of GCP are something all SaaS vendors have noted. A good choice by Kronos, from all we can tell at the end of 2017.


 

MyPOV

It’s seldom that new enterprise products are announced in the era of SaaS. But with the DNA of most products still reaching into the 20th century, they all need considerable re-thinking in regard to their capabilities vis-à-vis 21st century best practice needs. Kronos’ decision to start from scratch is certainly a good one, given the legacy of Workforce Central and Workforce Ready. More importantly, Kronos has also entered the HCM market, as it announced that it will scale up the Workforce Ready HCM capabilities to enterprises with 10k FTEs. What this means for the good partner relations that Kronos has with all the larger HCM and ERP vendors remains to be seen, but Kronos is certainly now in a position of strength. The remarkable ability to announce with a ready product is also a sign of weak competitive pressure in this market… most vendors in enterprise software don’t have the luxury of building their next-generation product for 3+ years – and still growing. Likely Kronos will be able to exert some competitive pressure on the other players in workforce management.

On the concern side, Workforce Dimensions is a new product. Though Kronos has done a good job working with launch customers, presenting them at KronosWorks etc., the proof will be in the pudding when a few dozen customers are live. There are a lot of moving parts when creating a new platform and a new SaaS product that runs on an IaaS platform, and they all have to be in line. There is nothing prompting a concern at the moment, but Kronos still has to master this. Kronos customers now face the task of determining their upgrade decision and path. The older platforms have enough pain points (just mention Java on the desktop, Flash, UI, reporting) to force the hand of decision makers to look at Workforce Dimensions soon. And lastly, Kronos needs to get the ecosystem ready; there will be a lot of services and support work needed. But this is a good problem to have.

Overall, a great KronosWorks for Kronos and its customers. Kronos has created an attractive, modern workforce management system that will give all its customers reason to think about upgrade plans. Not only has Kronos managed to address challenges of the past, it has also shown vision in regard to user experience and best practices, making the evaluation of Workforce Dimensions almost a no-brainer for the install base. A great start for a brand-new product. Stay tuned.


 

Want to learn more? Check out the Storify collection below (if it doesn’t show up – check here).

Find more coverage on the Constellation Research website here, and check out my magazine on Flipboard and my YouTube channel here.

Digital Transformation Digest: AWS re:Invent—Innovation Versus Overload, Major IoT Data Marketplace Taking Shape


Constellation Insights

AWS re:Invent—Innovation versus overload?: This week saw the Las Vegas Strip taken over by Amazon Web Services' re:Invent conference, which drew more than 40,000 attendees for several days of keynotes and sessions – and, in particular, a slew of news announcements.

The company issued nearly two dozen press releases related to re:Invent, some of which covered multiple new services. And nearly all of AWS's new services for machine learning, container orchestration, databases and many other areas are available in preview, at a minimum. AWS may have its critics, but it would be hard for them to say it is selling vaporware.

AWS is delivering new features at a staggering speed in a bid to maintain its lead over Microsoft, Google, IBM and Oracle in the cloud. One question re:Invent 2017 seemed to raise is how much customers can comprehend, let alone consume.

Then there's the overall focus of the event's content. The past couple of re:Invent conferences heavily featured marquee customers such as General Electric discussing their journey to the AWS cloud. This year's edition featured some of that as well, but the emphasis seemed more on messaging to developers than painting a broad vision for enterprise IT leaders, notes Constellation VP and principal analyst Holger Mueller.

Still, on other fronts AWS used re:Invent as a launch pad for efforts aimed at getting a bigger piece of the enterprise pie. It previewed Enterprise Contract for AWS Marketplace, which is described as follows:

Enterprise Contract for AWS Marketplace is an agreed upon standardized contract template between enterprise software buyers and sellers that resolves challenging terms including liability, dispute resolution, IP protection, warranty, and more across multiple vendors. Participating customers using Enterprise Contract for AWS Marketplace are able to eliminate lengthy procurement negotiations that can delay projects for months.

The contract will be generally available in the first quarter of next year. Companies that are participating in the preview include AppDynamics, CA Technologies, NetApp and Trend Micro, among many others.

While AWS's partner program is growing quickly, and customers are certainly using ample amounts of third-party products on its platform, re:Invent's main focus was on AWS's own services. The key for AWS is to not just continue its torrid pace of innovation, but to make sure its new services work holistically and smoothly, lest customers who came to the cloud in search of simplicity and lower cost find themselves tangled in a kettle of "services spaghetti," as Constellation VP and principal analyst Doug Henschen puts it.

IOTA launches IoT data marketplace: There are hundreds of cryptocurrencies in existence today, but IOTA's entry is one of the most popular, with a market capitalization of about $3 billion. That may help explain why 20 prominent industrial and technology companies have joined up with blockchain startup IOTA on its new marketplace, which is aimed at monetizing data generated by Internet of Things devices. IOTA founder David Sønstebø paints a dramatic vision for the project:

While every filament of our digital zeitgeist is unequivocally telling us that data is the fuel of the future, there is an important distinction: unlike oil, which is finite and whose properties are well known in terms of what it can produce (and pollute), data is for all practical purposes limitless.

On the one hand, data wants to be free in the sense that its storage and transmission costs less and less over time; on the other hand, large quantities of data are extremely valuable and are not free to generate. These diametrically opposed conditions cause a gridlock that needs to be broken in order for Big Data to become truly big. A major cause of this is the fact that, while data sharing is becoming cheaper from a technological perspective, it is prohibitively expensive to sell fine, granular data in real-time due to intermediary fees — not to mention all the red tape one has to cut through in order to complete a single data purchase. These conditions make real-time data trade all but impossible.

IOTA's cryptocurrency technology, Tangle, is similar in concept to a blockchain distributed ledger but takes a different implementation approach, recording transactions in a directed acyclic graph rather than a chain of blocks. The project is banking that IOTA can support a marketplace where IoT data can be sold securely, quickly and cheaply. Microsoft, Fujitsu, Accenture, Bosch, Orange and Schneider Electric are among the companies that have signed up for a pilot program that will run over the next two months.

Privacy would seem to be a major stumbling block for any large-scale IoT data marketplace. IOTA's announcement only briefly touches on this aspect:

The final result of the data marketplace will be a public report including several case studies that go into detail on the potential and the barriers we will face when rolling out a marketplace in full-scale production. Major emphasis will be dedicated to the impact of the EU’s General Data Protection Regulation (GDPR) on the planned future live data marketplace.

GDPR is a sweeping consumer data protection measure. It has serious teeth and will surely put IOTA's plans to the test. Watch this space.


Digital Transformation Digest: Amazon Web Services Launches Container and Security Services, New Blockchain Interop Group Forms, and More


Constellation Insights

AWS unveils new services for containers, security: Amazon Web Services is making a flood of announcements this week during its re:Invent conference in Las Vegas, and one of the most promising is Fargate, a service that lets customers deploy application containers at scale without managing the underlying infrastructure. Container orchestration technologies such as Amazon ECS and Kubernetes handle deployment and management, but they still leave customers running the underlying server instances; removing that burden is Fargate's selling point. Here's how AWS evangelist Randall Hunt describes it in a blog post:

To put it simply, Fargate is like EC2 but instead of giving you a virtual machine you get a container. It’s a technology that allows you to use containers as a fundamental compute primitive without having to manage the underlying instances. All you need to do is build your container image, specify the CPU and memory requirements, define your networking and IAM policies, and launch. With Fargate, you have flexible configuration options to closely match your application needs and you’re billed with per-second granularity.
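Hunt's "build your container image, specify the CPU and memory requirements ... and launch" steps correspond to an ECS task definition. As a rough illustrative sketch (the family, image and role names here are invented, not taken from AWS's announcement), a minimal Fargate task definition might look like:

```json
{
  "family": "web-app",
  "requiresCompatibilities": ["FARGATE"],
  "networkMode": "awsvpc",
  "cpu": "256",
  "memory": "512",
  "executionRoleArn": "arn:aws:iam::123456789012:role/ecsTaskExecutionRole",
  "containerDefinitions": [
    {
      "name": "web",
      "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/web-app:latest",
      "portMappings": [{ "containerPort": 80 }],
      "essential": true
    }
  ]
}
```

Once a definition along these lines is registered, the task runs without the customer provisioning or patching any EC2 instances, and, as Hunt notes, billing is metered with per-second granularity.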

AWS is also adding managed Kubernetes support with the new Amazon Elastic Container Service for Kubernetes (EKS), in a long-awaited move. Kubernetes is an open-source container orchestration project that originated at Google and has quickly grown in popularity.

Another significant AWS announcement concerns GuardDuty, a new managed threat detection service that can be turned on easily through AWS's management console. GuardDuty runs separately from customers' instances, so there's no performance hit or local agents required.

GuardDuty uses machine learning to spot anomalous events among API calls and network activities. It incorporates homegrown AWS technology and also integrates with third-party products. In AWS's view, GuardDuty is a must as customers scale up their cloud usage:

Identifying and assessing anomalous behavior across multiple accounts, networks, and instances at this scale can be like trying to find a needle in a haystack. ... Customers also have to collect API access and network flow logs and correlate them with threat intelligence sources, applying algorithms to identify anomalies based on known threats. And, often, as soon as the algorithms are well-tuned, the threats evolve and the algorithm requires rework. ... Amazon GuardDuty generates anomaly alerts that are tailored to each customer’s AWS use, and AWS continuously updates the threat intelligence sources Amazon GuardDuty employs.
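Beyond the console switch mentioned above, GuardDuty can also be enabled from the AWS CLI. A minimal sketch (the detector ID is a placeholder returned by the first call, shown here for illustration):

```shell
# Enable GuardDuty in the current region; each account gets one detector per region
aws guardduty create-detector --enable

# The call above returns a detector ID; use it to list current findings
aws guardduty list-findings --detector-id <detector-id>
```

Because detection runs inside AWS rather than on customer instances, enabling it is essentially this one call per region, which is consistent with GE's report below of an hours-long rollout.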

General Electric, one of AWS's marquee customers, has activated GuardDuty across the thousands of applications it has running on AWS. It took "a matter of hours" to deploy GuardDuty across GE's AWS landscape, GE global chief information and security officer Nasrin Rezai said in a statement.

POV: GuardDuty received supportive comments from top security officials at the Financial Industry Regulatory Authority and Netflix as well. This week, AWS also announced customer wins with the National Football League, the Walt Disney Company, Expedia and Turner. While re:Invent is packed wall-to-wall with technical content and product announcements, for a provider like AWS, high-profile customer references send clear signals of validation to the broader market and decision-makers higher up the chain.

Constellation analysts Doug Henschen and Holger Mueller are in attendance at re:Invent. You can follow their coverage here on Constellation's website, as well as on Twitter at @DHenschen and @holgermu.

Blockchain startups launch Interoperability Alliance: A trio of startups is hoping to drum up support for the Blockchain Interoperability Alliance, a new group aimed at developing standards that foster easier and broader adoption of blockchains.

The group, which consists of Wanchain, Aion and ICON, isn't hoping to establish consensus on a single blockchain protocol, which would be impossible given the proliferation of types, particularly for cryptocurrencies. Rather, the goal is to figure out ways for different protocols to communicate seamlessly.

POV: While it appears to have a fairly modest beginning, the Alliance's goals are on target. "Though the fundamental reason for blockchain lies in the authentication of the decentralization of business processes to support the any-to-any transactions that lie at the heart of ubiquitous digital business, one of the principal barriers lies in gaining some ‘centralized agreement’ from participants," says Constellation VP and principal analyst Andy Mulholland. "Currently, multiple alliances and companies are working to provide commercial solutions, but though all use similar core technology, and go under the title of blockchain, there are substantial differences. Any move that brings these various alliances together and starts a move towards some consolidation is to be welcomed."

Legacy Watch

Doctor denied license over her refusal to use computer: Electronic medical records are a booming business, but one 84-year-old doctor in New Hampshire, Anna Konopka, who keeps handwritten patient records, is bucking the tide, to the point that she no longer has a license to practice medicine. CNN has the details:

Why? "Because electronic medicine is for the system, not for the patients," said the 84-year-old, who is originally from Poland. "The system is destroying human relations between the doctor and the patient."

Konopka's refusal to keep electronic records, though, has played a part in a judge denying her request to regain her license to practice, which she voluntarily surrendered in October after allegations of misconduct were brought against her, according to the judge's ruling.

The allegations against Konopka started in October 2014 when a complaint brought to the New Hampshire Board of Medicine accused her of "improper prescribing practices" regarding a child patient, according to the state. After an investigation into the allegation, the board reprimanded Konopka in May.

Konopka, who denies misconduct, signed a voluntary surrender of license in September, in which she agreed to give up her license effective October 13, allowing her time to "provide scheduled and emergency treatment," according to the surrender.

POV: Konopka filed dozens of affidavits from patients speaking in her support with the court, but to no avail so far. Elsewhere in the CNN story, she describes herself as an "enemy of the system" that the establishment is trying to "destroy." Konopka had been seeing about 20 patients per week at her solo office, charging them $50. One told CNN that her other doctors "had their heads shoved into their computers" while Konopka gave her full attention.

But there are other, important wrinkles to Konopka's story. Due to her lack of computer literacy, she is unable to access the state's online system for monitoring the prescription of opioid-based drugs. Given the United States' longstanding opioid abuse epidemic, Konopka's inability, or unwillingness, to comply is obviously problematic.

While it's not clear how Konopka's situation will play out, it raises interesting questions in an era when technology is continually transforming the way we work and receive services. Is she a Luddite refusing to modernize at her patients' expense (and in violation of the law)? Or does her insistence that computers have depersonalized medicine, even as they introduce efficiency, have merit?