
Beyond the Seat: Future of the Ticketing Marketplace

I was really excited to participate on a panel at this year’s Sloan Sports Analytics Conference discussing the future of the ticket marketplace. The other panelists included:

  • John Abbamondi: Executive Vice President – Ticketing, Suites & Hospitality at Madison Square Garden
  • Jamie Brandt: Vice President, Sales and Service at the San Francisco 49ers
  • Jodi Mulkey: Chief Technology Officer at Ticketmaster
  • Zach Long: Chief Marketing Officer at PrimeSport
  • Patrick Rishe (moderator): Director, Sports Business Program at Washington University in St. Louis

We covered a wide range of topics over the course of the hour, including ticket pricing, product mix, creating personalized experiences, retention modeling, leveraging data for sales, and much more.

The video requires a paid subscription to content from 42 Analytics, but there is also a 14-day free trial option, so I hope you'll check out the panel and the many other videos posted from this year's event.

Digital Business Distributed Business and Technology Models Part Three; Distributed Services Business Management

Digital Business takes place within distributed network-based ecosystems using interactions in the form of various ‘Services’ to create Business value and revenues. The ability to dynamically orchestrate a set of Services involving various partners within the Business Ecosystem brings the requirement for a new kind of ‘Middleware’. Distributed Service Management ensures that partners in orchestrated Business Service activities have their individual transactions recorded and settled as necessary.

Previous parts of this series started with an examination of the Business activities and actions of an Enterprise, followed by Part 2, which defined the need for a new kind of dynamic infrastructure. Part 3a explored the need for Distributed Services Technology Management, introducing this part, 3b, which adds Distributed Services Business Management. (It is recommended to read the parts sequentially from Part 1, though a brief overview is included in the summary at the end of this part.)

Blockchain Technology is seen as the most likely solution for providing Distributed Service Management, hence the recent wave of interest in the topic. However, knowledge of what Blockchain Technology is, and how it can be applied to Digital Business, is generally poor. Bitcoin tends to be equated with Blockchain, whereas it is only one specific implementation using aspects of Blockchain technology.

Considerable development effort is being devoted to Blockchain Technology to create Distributed Ledger implementations that provide transactional synchronization across a Distributed System, or Business Ecosystem, without requiring any single, centralized point of control. There are significant issues still to be addressed, leading some experts to express concern at the level of enthusiasm for Blockchain as the basis for Distributed Ledgers, particularly for IoT.

Consider a Smart City, where many businesses and consumers interact through many Smart Services, creating a complex web of commercial settlements. The intention is that new Distributed Ledger developments will ensure all financial settlements, and possibly other forms of transactions, are recorded across the community without the need for a central brokering Service operated by the City.

There are serious issues to be addressed in developing Blockchain-based Distributed Ledgers for widespread use. As well as the usual constraints of scale and operational speed, there is a complicated core issue around the issuing and management of the secure 'keys' that lie at the heart of the technology.

Owners of any part of the value chain covering Sensors, Services, Resources, Assets and general Infrastructure all require a reliable, trusted, shared, decentralized service to record transactions. 'Smart Services' has been widely used as a term to describe the Business value offered in these systems, but the underlying technology is correctly called Distributed Apps (DApps), in recognition of the distributed, or shared, nature of its operating environment.

Airbnb currently uses the familiar technologies of the Internet and Web, but has a stated ambition to develop its Service as a fully functional DApp to allow global levels of scale and interaction. Other popular Smart Services are likely to migrate to DApps over time to extend their business value propositions and scale.

The role of a Distributed Ledger (which is itself a DApp) is to provide a decentralized trust model that prevents fraud, or assumption of control, by allowing all participants to 'check' every transaction. (Note that an expert on the detail of Blockchain technology would wish this term to be expanded into much greater detail.) Here 'every transaction' means that all members of the specific business ecosystem record every transaction, even when they are not direct participants in it.

A Distributed Ledger would treat each IoT Device, or set of Devices corresponding to a chargeable business Service element, as a Node with its own version of the common Distributed Ledger. Whenever a pair, or group, of such IoT Devices/Nodes carries out an interaction ending in a transaction, all registered Nodes are provided with a copy. A distributed consensus algorithm leads to shared agreement on the legitimacy of the latest transactions and the overall state of the network, at which point all Nodes update their copy of the Distributed Ledger to include the new transaction.
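As a rough illustration of that flow, the toy sketch below (plain Python; a simple majority vote stands in for a real distributed consensus algorithm, and a trivial validity rule stands in for signature and balance checks) broadcasts a transaction to every registered node and appends it to every copy of the ledger only when consensus is reached:

```python
class Node:
    """One IoT device/node holding its own copy of the shared ledger."""
    def __init__(self, name):
        self.name = name
        self.ledger = []          # every node keeps a full copy

    def validate(self, tx):
        # Stand-in for real validation (signatures, balances, etc.):
        # here we only require a positive amount.
        return tx["amount"] > 0

    def append(self, tx):
        self.ledger.append(tx)

def broadcast(nodes, tx):
    """Send a transaction to every registered node; commit it only if a
    majority of nodes independently judge it legitimate."""
    votes = sum(node.validate(tx) for node in nodes)
    if votes > len(nodes) // 2:       # simple majority consensus
        for node in nodes:
            node.append(tx)
        return True
    return False

nodes = [Node(f"device-{i}") for i in range(5)]
ok = broadcast(nodes, {"from": "meter-1", "to": "grid", "amount": 3})
bad = broadcast(nodes, {"from": "meter-1", "to": "grid", "amount": -3})
print(ok, bad, len(nodes[0].ledger))  # True False 1
```

Real Distributed Ledgers replace the majority vote with far more robust consensus protocols (proof of work, PBFT variants and so on), but the shape is the same: validate, agree, then update every copy.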

As so many versions of the ledger are maintained and compared separately, fraudulent interactions are, depending on the quality of the Distributed Ledger implementation, somewhere between very difficult and impossible. However, there are several practical difficulties around scale: firstly, the number of nodes and the size of the ledger; secondly, the number of transactions, which equates to the frequency of updates. As a comparison, the Bitcoin Blockchain deployment provides ledger updates roughly every ten minutes, introducing significant latency for real-time applications, and volume is limited to no more than about five entries per second.
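The tamper resistance described above comes from chaining each ledger entry to the hash of the previous one, so a fraudulent edit to any single copy immediately fails verification against the rest of the network. A minimal sketch, using SHA-256 from Python's standard library (the helper names here are invented for the example):

```python
import hashlib
import json

def block_hash(tx, prev_hash):
    """Hash a transaction together with the previous entry's hash."""
    payload = json.dumps(tx, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def build_ledger(transactions):
    """Chain each entry to its predecessor, starting from a zero hash."""
    ledger, prev = [], "0" * 64
    for tx in transactions:
        h = block_hash(tx, prev)
        ledger.append({"tx": tx, "prev": prev, "hash": h})
        prev = h
    return ledger

def verify(ledger):
    """Recompute every hash; any edited entry breaks the chain."""
    prev = "0" * 64
    for entry in ledger:
        if entry["prev"] != prev or entry["hash"] != block_hash(entry["tx"], prev):
            return False
        prev = entry["hash"]
    return True

ledger = build_ledger([{"amount": 3}, {"amount": 7}])
assert verify(ledger)
ledger[0]["tx"]["amount"] = 999   # fraudulent edit to one local copy
assert not verify(ledger)          # the edit is immediately detectable
```

Because every node holds and re-verifies its own copy, a forger would have to rewrite the chain on a majority of nodes simultaneously, which is the practical source of the "very difficult to impossible" claim.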

The scaling issue currently sets the requirements for brand-new DLT developments. The expectation is that acceptable operating parameters can be achieved for initial IoT deployments; the Business Services ecosystem covering a Smart City would be expected to operate with a manageable number of nodes and transactions in its early years.

The development of Smart Cities, other IoT business networks and DApps is driving the requirement for suitable Distributed Ledger solutions. By their very nature, Open Source solutions would seem an attractive proposition. Two Open Source projects, Ethereum and Hyperledger, have attracted sizable community support, though their focuses are different.

Ethereum, founded in 2014, supports an eclectic mix of activities, often crowd-funded, focused on DApps that require its decentralized transaction architecture. The Ethereum name often comes up in conversations about Blockchain or DApps, but the community is not aimed at supporting large-scale IoT requirements for Distributed Ledger solutions. A sample of Ethereum projects can be found here.

The focus of Hyperledger is distinctly different, aiming to support the broader needs of IoT Distributed Systems that can be served by Blockchain-based technology.

Hyperledger has significant support from well-known industry names, Business users as well as Technology Vendors. Premier Members include Accenture, Airbus, Deutsche Börse, Fujitsu, IBM, Intel and JP Morgan. (A full membership list can be found here.)

There are four Hyperledger projects underway that contribute in different ways to making Distributed Ledgers a reality for commercial operations: Blockchain Explorer, Fabric, Iroha, and Sawtooth Lake. Summaries of each project and its aims can be found here. Hyperledger Fabric seems destined to become the most significant in 2017, with its stated aim:

Fabric is an implementation of block chain technology that is intended as a foundation for developing block chain applications or solutions. It offers a modular architecture allowing components, such as consensus and membership services, to be plug-and-play. It leverages container technology to host smart contracts called “chaincode” that comprise the application logic of the system.
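Actual Fabric chaincode is written against Fabric's own APIs (typically in Go), but the underlying idea, application logic that reads and updates a shared key-value "world state" under agreed rules, can be sketched in a few lines of illustrative Python. The names below are invented for the example and are not part of any Fabric API:

```python
class WorldState(dict):
    """Stand-in for the ledger's key-value state that chaincode reads/writes."""

def transfer_chaincode(state, frm, to, amount):
    """Illustrative 'chaincode': application logic that moves value between
    two accounts in the shared state, with basic validity checks."""
    if amount <= 0 or state.get(frm, 0) < amount:
        raise ValueError("invalid transfer")
    state[frm] -= amount
    state[to] = state.get(to, 0) + amount
    return state

state = WorldState(a=100, b=20)
transfer_chaincode(state, "a", "b", 30)
print(state)  # {'a': 70, 'b': 50}
```

In Fabric the equivalent logic runs inside a container on endorsing peers, and the resulting state changes are only committed once the network's consensus and membership services approve the transaction.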

IBM aims in 2017 to accelerate the Hyperledger Fabric project by recruiting and supporting an Ecosystem of Developers, who will be offered various IBM-provided Facilities. This ambitious plan highlights the value of understanding the details, which can be found in a Constellation post, Why IBM's Blockchain Plans Make Sense.

Clearly, effective decentralized transaction management using Blockchain Distributed Ledger technology is a highly important part of developing Services-based 'Trading' Business Networks. However, thanks to the coverage the topic has received, it has become the most visible element in the overall challenge of building, operating and maintaining a new generation of decentralized systems.
Summary; Background

This is the third part in a series on Digital Business and the Technology required to support the ability of an Enterprise to do Digital Business. An explanation is provided for the adoption of the simple definition shown in the diagram below to classify the technology requirements, rather than attempting any form of conventional detailed Architecture, together with a fuller explanation of the Business requirements.

(Diagram: a simple classification of the Digital Business technology requirements)

Part 1 - Digital Business Distributed Business and Technology Models;

Understanding the Business Operating Model

Part 2 - Digital Business Distributed Business and Technology Models;

The Dynamic Infrastructure

Part 3a – Digital Business Distributed Business and Technology Models;

Distributed Services Technology Management

 

 


Betterment Tosses Desk Phones and Moves to Cloud Communications

Imagine this: You are an IT manager newly hired into one of the fastest-growing companies disrupting an entire industry. Your workforce is highly mobile, but the phone system is desk-based, with complicated codes required to perform simple tasks like transferring calls. User frustration with dropped calls is high, and you need to fix it, fast. This is the story of Betterment and the topic of my new case study report, Toss the Deskphone: Betterment Uses Dialpad to Modernize Communications.
 
Betterment is a company at the forefront of the FinTech disruption as an early pioneer of Robo-advisory investing. The company grew rapidly from an eight-employee startup to a mid-sized company with over 200 employees and more than $8 billion in assets under management (AUM). The problem was that its communications system was a PBX-based solution that was expensive and a mismatch for a highly mobile workforce not always stationed at their desks. Betterment's Technology Operations Manager, Mike Bongardino, went through an extensive evaluation process and decided that moving to a native cloud communications solution could support their mobile workforce, scale alongside the company as it continued its meteoric growth, and require minimal administration support. Betterment chose Dialpad, a cloud-based unified communications provider offering a flexible, scalable, and multi-device-friendly solution that freed the team from complicated desk phones and delivered savings of over 60 percent.
 
Image source: Dialpad on multiple devices
 
Native cloud communications provide mobility, scalability, and cost effectiveness for companies, and we at Constellation see more companies making the switch over the next several years. The ability to integrate call activity with CRM solutions for improved customer tracking is another key benefit. For CIOs and IT professionals considering moving their traditional phone systems to cloud-based solutions, the case study report contains five key factors to consider, lessons learned from Bongardino, pitfalls to avoid, and overall recommendations on how to plan for the move.
 
Constellation clients can download the full report, or an excerpt can be accessed here. Drop a comment below and let me know if you're planning to move your organization's communications platform to the cloud.

'Vigilante' Malware Brickerbot: Not the Answer for IoT Security

Constellation Insights

Last year, the IoT botnet Mirai was used in a number of high-profile DDoS (distributed denial of service) attacks, taking advantage of hundreds of thousands of insecure Linux-based IoT devices. In recent weeks, a new strain of IoT malware has emerged, with a different and potentially dangerous new purpose. 

Dubbed Brickerbot, the malware seeks out vulnerable Linux devices, accesses them and then executes a series of instructions that corrupt storage, disrupt Internet connectivity and inhibit kernel operations, rendering the device useless, or "bricked." Brickerbot works much like Mirai, scanning for devices that still have their default usernames and passwords.

As Mirai's effectiveness underscores, far too many end users simply forget, don't know how, or don't care to take the simple security step of changing their devices' defaults. 
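The scan itself is simple precisely because the weakness is simple: attackers try a short list of factory credential pairs against exposed services. A defensive audit of devices you administer can be sketched the same way. The credential list below is a small illustrative subset, not Mirai's actual dictionary, and the inventory entries are invented for the example:

```python
# A few well-known factory default credential pairs of the kind abused
# by Mirai-style scanners (illustrative subset; real lists are longer).
DEFAULT_CREDENTIALS = {
    ("admin", "admin"),
    ("root", "root"),
    ("admin", "1234"),
    ("root", "default"),
}

def is_using_default(username, password):
    """Flag a device that still answers to a known factory default pair."""
    return (username, password) in DEFAULT_CREDENTIALS

# Audit an inventory of devices you administer:
inventory = [
    ("camera-1", "admin", "admin"),
    ("router-1", "admin", "S7rong!pass"),
]
at_risk = [name for name, user, pwd in inventory if is_using_default(user, pwd)]
print(at_risk)  # ['camera-1']
```

The point of the sketch is how low the bar is: any device flagged by a check this trivial is a candidate for conscription by Mirai, or destruction by Brickerbot.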

Security firm Radware discovered Brickerbot in March, when the malware targeted devices in a honeypot Radware maintains for security research purposes.

Since then, speculation has centered on the motivations of Brickerbot's author or authors. One popular hypothesis holds that Brickerbot is a form of vigilantism: by proactively destroying unsafe devices, the likes of Mirai will be inhibited.

Brickerbot's creator(s) haven't publicly stated their motivations, and it's possible their intents are solely malicious. But even if vigilantism is the goal, that's inexcusable, says Constellation Research VP and principal analyst Steve Wilson.

"Some hackers have god complexes," he says. "The very words 'white hat' and 'black hat' hacking betray a blurred morality where one way or another people take the law into their own hands."

"Who makes the judgement that a device is insecure? It's not black and white," Wilson adds. "Where is the risk assessment that a vulnerable device that might malfunction is worse than a bricked device that will actually malfunction?"

Indeed, as a post on Network World notes, applied unchecked, malware like Brickerbot could have fatal consequences:

Imagine driving down the road and having your car’s computer bricked. ... At some point, lives will be lost and people maimed. An uncontrolled botnet seeking to protect us all from badly designed devices will brick the wrong one—or dozens of them.

IoT device manufacturers, eager to get their products to market as quickly and cheaply as possible, bear much of the responsibility for the current threat landscape.

"I am outraged by the parlous state of IoT security," Wilson says. "It is appalling that devices which never were computers are foisted on consumers with unapprehended complexities and inadequate computer security. We need to see action on the part of consumers, to demand proper security.  Consumer affairs regulators need to act to ensure device quality is fit for purpose. Manufacturers need to be held accountable for damages caused by faulty products."

"The last thing we need is vigilantism," Wilson says. "By all means expose, name and shame culpable product companies, but don't take the law into your own hands."


Microsoft Makes Bet on Kubernetes with Deis Acquisition

Constellation Insights

Microsoft is building out its capabilities for the popular open-source container orchestration platform Kubernetes with the acquisition of Deis. Here are the details from a blog post by Microsoft EVP Scott Guthrie:

Container technologies let organizations more easily build, deploy and move applications to and from the cloud. With this increase in agility and portability, containers are helping to make applications the new currency in the cloud. At Microsoft, we’ve seen explosive growth in both interest and deployment of containerized workloads on Azure, and we’re committed to ensuring Azure is the best place to run them.

To support this vision, we’re pleased to announce that Microsoft has signed an agreement to acquire Deis – a company that has been at the center of the container transformation. Deis gives developers the means to vastly improve application agility, efficiency and reliability through their Kubernetes container management technologies.

We expect Deis’ technology to make it even easier for customers to work with our existing container portfolio including Linux and Windows Server Containers, Hyper-V Containers and Azure Container Service, no matter what tools they choose to use.

The Deis team will join Microsoft and its technology will remain open source, Deis CTO Gabriel Monroy said in a separate post:

Over the years, we have worked hard to be open, reliable, and dependable open source maintainers. From our new home at Microsoft you should expect nothing less. We will continue our contributions to Workflow, Helm, and Steward and look forward to maintaining our deep engagement with the Kubernetes community. 

Kubernetes originated from Google, which donated the project to the Cloud Native Computing Foundation in 2016. It has thrived as an open-source project, moving toward becoming the industry standard for container orchestration. Microsoft's move to acquire Deis comes just a couple of months after Kubernetes became generally available on Azure Container Service (which also supports the Docker Swarm and DC/OS orchestration tools).

Buying Deis is a good move by Microsoft, says Constellation Research VP and principal analyst Holger Mueller. "It's no longer enough to just support Kubernetes. It's also important to provide the best Kubernetes experience. Deis is bringing those tools and the experience to develop more. 

"I'd expect Deis to remain open source, but Microsoft will have to walk the balance between differentiation and open source contribution," Mueller adds.

Deis had been acquired by PaaS vendor Engine Yard in 2015. That connection wasn't mentioned in Microsoft's announcement, nor was the purchase price revealed.


Learnings from the Adobe Summit - It's all about the "Experience"

I attended my first Adobe Summit, held in Las Vegas a few weeks back, along with 12,000 other attendees. The stunning opening videos were a visual feast (I expected no less, considering the company's creative talent), and the main theme was the launch of the new Adobe Experience Cloud. Adobe's primary message centered on the customer's experience and the idea that everyone is a brand ambassador for their organization. The new umbrella Experience Cloud reorganizes Adobe's various solutions into three primary Clouds:
 
  • Marketing Cloud - Campaign, Experience Manager, Target, Social, and Primetime
  • Analytics Cloud - Analytics and Audience Manager
  • Advertising Cloud - Media Optimizer (Demand Side Platform, Search, Dynamic Creative Optimization) and TubeMogul
Image Source: Adobe

In my opinion, this is a stronger message and go-to-market strategy for Adobe, as the Marketing solutions previously appeared detached from one another. Customers expressed some confusion about which solutions they needed to accomplish their objectives. The re-alignment of the solutions by cloud provides cohesion and better packaging of the offerings.
 
Impactful Customer Stories on Digital Transformation - Adobe shines with big B2C brand customers sharing their digital transformation stories. Standouts included T-Mobile and the NBA. T-Mobile's SVP of Digital, Nick Drake, gave an energetic keynote on the journey of transforming their website with Adobe Experience Manager (AEM). Drake explained that they kept the focus on customer experience and reduced the time and clicks needed to order products (previously up to 81 clicks), find assistance, and more. Their new website, powered by AEM, saw a 485% conversion improvement. One key takeaway from Nick's keynote for Marketers: instead of forcing their existing processes into the website design, T-Mobile put design and experience first, then adapted their processes accordingly. It seems like a small step, but the number of companies I talk to that try to force-feed their existing processes and end up with less-than-desirable results is higher than you might think.
 
I also had some one-on-one time with Adobe customer Franke, a global kitchenware company with over $2 billion a year in revenue. Franke worked with its agency, One-Inside, to build an immersive virtual experience to showcase its products using AEM. Franke CMO Renato Di Rubbo is helping the company transition from traditional marketing tactics such as catalogs and showrooms to digital experiences that broaden its reach. He called Adobe a "great technology partner" enabling their marketing transformation.
 
Product Enhancements - Sensei and Integrations of Marketing + Creative Cloud - From a product perspective, I was pleased to see more cross-cloud integration between Adobe's Creative and Marketing Clouds. Marketers can now launch Dreamweaver within Adobe Campaign to create and edit emails, landing pages, and other web assets. Adobe Experience Manager (AEM) DAM now integrates with Creative Cloud to auto-identify and sync content. Sensei, Adobe's Artificial Intelligence (AI) solution launched at the MAX conference last year, had a prominent showing at Summit as well. With Sensei, images from AEM can be analyzed and tags automatically applied for improved asset organization and management. AI solutions have the potential to lighten the marketer's workload through intelligent segmentation, automated next-best offers, marketing campaign attribution tracking, and marketing budget optimization, which I believe is the direction Sensei is heading. However, most of the customers I talked to at the conference were still confused about what is available in Sensei today beyond the demonstrated image-processing capabilities. There is an opportunity for Adobe to help customers better understand the technology by providing additional marketing use cases.
 
Cross-Channel Marketing - In my Mobile Marketing Best Practices for CMOs report, I included the statistic that nearly 60% of US adult mobile device users own more than one device, and 28 percent own three devices or more. In turn, the urgency for marketers to deliver a cross-channel customer experience has increased. One of Adobe's solutions to the cross-channel marketing problem is its Cross-Device Co-op, an initiative launched in 2016 that pools the device data of customers who opt in to the program. The Co-op allows member companies to leverage their collective non-personally-identifiable data for improved digital advertising campaigns. The program has been growing steadily over the past year, according to the Co-op product team, with about 30 customers currently participating in the free program.
 
Deepening of the Microsoft partnership - AEM is now offered on the Microsoft Azure cloud hosting platform, and integrations between Campaign and Microsoft Dynamics 365 CRM connect the marketing and customer lifecycle data. CMOs with an eye on marketing campaign ROI can benefit from the Adobe Analytics and Microsoft Power BI integration. I asked Adobe SVP Suresh Vittal if we might see a Sensei + Cortana AI partnership a la Einstein + Watson. Apparently, those discussions have begun.
 
Final thoughts - The newly branded Adobe Experience Cloud provides improved cohesion in Adobe's go-to-market strategy. Customers are accustomed to a suite-based approach, and the organization of the formerly individualized products makes it easier for customers to understand and buy by the cloud. AEM is one of the top Web Content Management platforms, and the Microsoft Azure announcement will help customers speed up their go-lives for faster time-to-value. I particularly enjoyed the "Sneaks" session, with the company's technical talent showcasing their ideas for future marketing features. The Virtual Reality ad replacement and AI journey demos were two standouts, offering a peek into what might be included in Adobe's future product roadmap.
 
Lastly, as a marketer who has attended several events at the Venetian over the years, I'd say Adobe takes the prize for creativity and for incorporating the "fun" of marketing into the event. The visuals, customer keynotes, and educational sessions showcased Adobe's influence with CMOs and marketers and made for an enjoyable conference.
 
For more, view a Storify collection of my tweets from the #AdobeSummit below:
 
 

Twitter Courts Developers with Unified APIs, Better Transparency

Constellation Insights

It's safe to say that Twitter has had a complex and at times contentious relationship with the developers who want to integrate with and build on the social messaging service. In October 2015, CEO Jack Dorsey acknowledged the problem and said Twitter wanted to "reset" its developer relationships, as VentureBeat notes:

“Somewhere along the line, our relationship with developers got confusing, unpredictable,” he acknowledged. “We want to come to you today and apologize for the confusion. We want to reset our relationship and make sure that we’re learning, listening, and that we are rebooting."

Twitter conducted fairly extensive developer outreach during 2016, but the picture blurred once again with the departure of several prominent developer advocates from the company, and Twitter's decision in January to sell its Fabric mobile development toolkit to Google. 

Now Twitter is stepping up its developer outreach, announcing a unified API platform and increased transparency through a public development roadmap. Developer advocate Andy Piper laid out the landscape in a blog post:

We’re excited to announce that we’ll be unifying our API platform to make it easier for developers to build new applications that can smoothly scale as they grow. We’re also launching new APIs and endpoints that enable developers to build on the unique attributes of Twitter to create better experiences for businesses. Developers can see where we’re focusing and what we’re building with our newly-published API platform roadmap.

Twitter values developers because they can help serve new use cases and spark innovation, Piper wrote. He cited examples such as the U.S. Geological Survey's use of Twitter data for earthquake tracking, and LikeFolio's consumer service for stock investors.

Of course, the broader intent—as with any developer outreach effort—is to help grow Twitter's center of gravity, user base and ultimately revenue.

What Twitter has lacked, but wants to remedy now, is an API set and strategy that's clear, stable and relevant to developers at all ends of the spectrum, from startups to large enterprises. Piper walked through some examples of the new normal:

Since 2006, we’ve had a set of broadly available REST and real-time (streaming) APIs that provide access to a range of features and functions. In 2014, we acquired Gnip, a partner who built a suite of enterprise-grade APIs for the world’s largest and most demanding software companies to create solutions with Twitter data. The Gnip APIs provide deeper access to public data from the Firehose and greater functionality than the standard REST and streaming APIs, but have a price point that is often out of range for developers just starting to scale their businesses. As we’ve met and listened to developers at events around the world and in our developer discussion forums, we’ve heard that this can be a source of frustration.

This year, Twitter will roll out a new developer experience that combines its REST and streaming APIs "with the enterprise-grade power and reliability of Gnip," Piper wrote. "The goal is to create an integrated Twitter API platform that serves everyone, from an individual developer testing a new idea to Twitter’s largest enterprise partners."

Developers will enjoy a streamlined API experience; rather than having to shift among multiple APIs as their projects scale up, there will be a single tool for a given task, such as filtering data from Twitter's Firehose. There will be tiered access, from free at the low end to paid self-service and enterprise grade, Piper wrote.

Despite his earlier mention of developer frustration over Gnip pricing levels, Piper gave no indication Twitter plans to cut those costs. Rather, he emphasized that Twitter will "clearly define the features and costs at each tier" so developers can make the plans best suiting their needs. 

Twitter also has some new products in the works that target data analytics and customer engagement scenarios. There's more detail in Piper's full post, which is worth a read.

Piper characterized Twitter's efforts as a "massive new engineering and product investment" for its platform and developer ecosystem. That may be the case, but it comes after years of missteps and fractious relations with developers, not to mention amid stagnant revenue growth and diminished buzz around the service.

"It's good to see Twitter putting order into its API strategy," says Constellation Research VP and principal analyst Holger Mueller. "It needs to regain the trust it lost a few years ago when changing and reducing API access. And as always with Twitter, monetization is the question. This could open up new alleys."

The bottom line here? Twitter needs developers—many more of them—to start regaining traction. On the face of it, Twitter's unified API plans are welcome and long overdue. The question is how well it can sustain focus on this path.


Executive Profile: Public Sector Chief Information Officer


Our Constellation Research ecosystem of business leaders extends around the world, which we love. We find that our global footprint enriches the diversity of issues and challenges that need to be tackled and enables us to partner closely with executives driving digital transformation in organizations of all sizes. Disruptive technologies and developments like artificial intelligence, virtual reality, blockchain, the Internet of Things, and cybersecurity will drive groundbreaking change across multiple industries in the coming decades. 
 
David L. Stevens is a CIO who impresses us, as does David Chou, Healthcare VP, CIO and CDO. Both are Constellation Executive Network members. As a public sector leader with a large constituency, Stevens is focusing the next 18 months on the voice of the customer, customer experience, and service quality.
 
If you know of a forward-thinking leader who we should consider profiling and who buys enterprise technology, we look forward to hearing from you at CEN.

David L. Stevens

Chief Information Officer
Maricopa County, Arizona – Fourth Largest and Fastest Growing County in the US 

Industry - COUNTY GOVERNMENT
LinkedIn
Twitter - @MaricopaCIO & @ShadowAtNoon

David L. Stevens, Public Sector CIO

Q: Tell us about your role.
 
A:
I serve as the Chief Information Officer for Maricopa County, AZ – the fourth largest (4.2 million citizens and 9,200 square miles in size) and fastest growing county in the nation; it is larger than 23 states by population, with an overall budget of $2.2 billion and a technology budget of $200 million.  My focus is strategic technology investments that deliver value, keep the customer first, and create a winning culture.

Q: What’s your typical day like?
 
A:
My typical day starts before the office, when I wake up to do a quick check of email, texts, and news.  In the office, my day starts by huddling with my executive assistant to make any needed adjustments to the schedule, handle items needing my immediate attention, and direct any action to my team.  Then I usually spend time with customers, business partners, and stakeholders.  The “end” of my work day is reviewing performance metrics, dashboards/reports, and financials.  I go to the gym late each night and, before bed, read new research, a book, or other relevant information.
 
Q:  What are your biggest initiatives or challenges for the next 6 - 18 months?

A: 
We are making our next 18 months all about voice of customer, customer experience, and service quality – striving to deliver scalable, secure, and contextually relevant services for our customers and citizens.  We spent the last 3 years building the foundation (network, security, telecommunications, metrics, ERP, culture, and financials) so we could build a reliable technology stack that will deliver first-class customer service.  Our new strategic plan can be viewed on our website.
 
Q: What do you see as the biggest enterprise disruptive technology trends?

A: My sense is that Artificial Intelligence, Machine Learning, IoT, and Data will continue to be major disruptors – the intersection of these Big Four will generate new growth and discoveries, and will impact business and society in profound ways.
 
Q: If you could have a different job, what would it be? 


A: I think I would either want to be an exotic vacation tour guide or a fighter pilot.

Digital Adoption Platform WalkMe acquires Jaco


WalkMe has a simple goal: to help companies make sure their employees and customers have the best possible experience with their applications and websites. If your sales team is struggling with your CRM system, WalkMe will help guide them through the trouble spots. If your customers are not finding what they need on your website, WalkMe can show them the way. Think of it as the modernization of the Help file.

This week WalkMe announced the acquisition of Jaco, whose product records the clicks people perform as they navigate a website or web-based application. This data can then be played back, allowing administrators and designers to get insight into the trouble spots people are experiencing, then fix those problems to provide an improved experience.

Jaco’s technology will be integrated into WalkMe’s Digital Adoption Platform, providing a media-player-like experience in which people can replay, click by click, the interactions users take.
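Conceptually, this kind of session capture boils down to logging each interaction with enough context to replay it in order. The sketch below illustrates the idea in miniature; the class and field names are my own inventions, not WalkMe's or Jaco's actual product internals.

```python
# Toy sketch of click-session recording and ordered replay.
# Class names and event fields are hypothetical, for illustration only.
from dataclasses import dataclass, field

@dataclass
class ClickEvent:
    timestamp: float   # seconds since session start
    selector: str      # CSS-style path to the clicked element
    page: str          # URL path where the click happened

@dataclass
class SessionRecorder:
    events: list = field(default_factory=list)

    def record(self, timestamp, selector, page):
        self.events.append(ClickEvent(timestamp, selector, page))

    def replay(self):
        """Yield events in chronological order, as a player would step through them."""
        return iter(sorted(self.events, key=lambda e: e.timestamp))

rec = SessionRecorder()
rec.record(2.4, "#checkout-button", "/cart")
rec.record(0.8, "nav > a.pricing", "/")
steps = [e.selector for e in rec.replay()]
print(steps)  # ['nav > a.pricing', '#checkout-button']
```

A real implementation would capture events in the browser (e.g. via DOM event listeners) and ship them to a backend, but the record-then-replay-in-order structure is the core of the player experience described above.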

This is WalkMe’s second acquisition of the year, the first being ABBI.io, a tool that helps optimize the experience of mobile applications.

Helping People Get Work Done

One of the biggest challenges with applications and websites is that everyone has a different style of using them. Digital Adoption Platforms such as WalkMe provide a way to monitor, study, and ultimately fix the roadblocks people encounter. This leads to improved workflows, fewer support calls, and happier users. WalkMe is aggressively improving its platform, constantly seeking new ways to help organizations improve adoption of the tools they provide their employees and customers.


Anaplan Steps Up Investment In Connected Planning


Anaplan says increased R&D investment will keep it ahead in the cloud-based performance management arena. Here’s what connected planning is all about.

When Frank Calderoni, Anaplan’s new CEO, took the stage at the March 27-29 Anaplan Hub 17 event, he cited positive company statistics including signing 250 new customers in the last year (bringing the total count above 660) and more than 75 percent year-over-year subscription revenue growth, to a $120-million annual run rate. Just two months on the job, Calderoni also promised a “significant increase” in R&D investment, saying the company intends to stay ahead of the competition.

At Anaplan Hub 17, Frank Calderoni, who joined the company as CEO in January, promised increased R&D investment. He was previously CFO at Red Hat and, before that, at Cisco.

Indeed, competition is tightening in the corporate performance management arena, in which the lion’s share of growth is going to cloud-based offerings. Incumbents including Oracle (Hyperion) and SAP (BPC) have introduced their own software-as-a-service options while SaaS rivals Adaptive Insights and Host Analytics have been courting larger and larger customers. Even partners are getting in on the act, with Workday adding Workday Planning to its cloud-based performance management application portfolio in 2016.

Anaplan stands apart among the cloud options in that it focuses on a broader range of business planning challenges than any of its competitors, most of which concentrate on the needs of finance departments. Financial planning and sales planning are Anaplan’s most mature use cases, but the company also supports supply chain, workforce, marketing, IT and other planning needs with more than 195 starting-point applications on its Anaplan App Hub. Roughly two thirds of these domain- and industry-specific apps are offered by partners, such as Accenture, Deloitte and Workforce Insight, who help customers build out and customize apps to their specific needs.

Customers typically start with one planning challenge, such as financial planning and analysis or sales planning, but Anaplan educates customers on the need for “connected planning” as part of its land-and-expand strategy. By connecting plans, companies can understand and account for interdependencies, cascading changes in plans and what-if scenarios across interconnected data, people and processes.

Why do companies need connected planning? Anaplan founder and CTO Michael Gould cited the example of U.K.-based companies that are now in the cross hairs of Brexit. Adequate preparation and forecasting demands more than a single plan; these companies need interconnected plans around possible changes in exchange rates, border tariffs, trade deals, supply costs, pricing and resulting demand. In the U.S., it’s easy to imagine the complex, interconnected planning healthcare organizations will need in order to come to grips with potential changes in the Affordable Care Act.
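The value of connecting plans is easiest to see in miniature. In the toy model below, a change to one shared assumption (the exchange rate) automatically cascades through dependent plan lines — supply cost and margin — rather than each plan being updated by hand. All figures and formulas are invented for illustration; this is not Anaplan's modeling language.

```python
# Toy connected-planning model: plan lines computed from shared assumptions,
# recomputed together when any assumption changes.
# All numbers and formulas are invented for illustration.

assumptions = {
    "gbp_usd": 1.30,        # exchange rate: USD per GBP
    "unit_price_gbp": 10.0, # selling price per unit, in GBP
    "base_demand": 1000,    # forecast units sold
}

def recompute(a):
    """Recompute dependent plan lines from the current assumptions."""
    # Parts are invoiced at $5 per unit; cost in GBP depends on the rate.
    supply_cost = round(5.0 / a["gbp_usd"] * a["base_demand"], 2)
    revenue = round(a["unit_price_gbp"] * a["base_demand"], 2)
    margin = round(revenue - supply_cost, 2)
    return {"supply_cost": supply_cost, "revenue": revenue, "margin": margin}

before = recompute(assumptions)
assumptions["gbp_usd"] = 1.15   # what-if: sterling weakens after Brexit
after = recompute(assumptions)
print(before["margin"], after["margin"])  # 6153.85 5652.17
```

In a disconnected world, the finance plan, the supply plan and the sales forecast would each hold their own copy of the exchange-rate assumption, and a what-if scenario would mean reconciling three spreadsheets; connecting them makes the interdependency explicit and the cascade automatic.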

Of course, planning and forecasting challenges are more often triggered by routine business dynamics than by legislative or geopolitical sea changes. Events such as mergers, acquisitions, digital disruption and emerging market opportunities all trigger complex planning and forecasting challenges. Anaplan customers are typically large companies and fast-growing companies – mostly in retail, banking, technology, healthcare and consumer packaged goods – that face complexity and constant change.

MyPOV on Anaplan’s Progress

There weren’t a lot of high-profile announcements at Anaplan Hub 17 in part because the company is still in the process of delivering capabilities promised at Anaplan Hub 16 (thus, Calderoni’s promise to step up R&D investment). For example, a Business Map feature announced at Hub 16 is still a few weeks away. The Business Map will support connected planning by giving customers a holistic view of all business planning activities, with tagging, searching and filtering by use case, business process and geography.

Also still in the works is a promised expansion of existing predictive capabilities to better support workforce optimization, supply planning, transportation assignment, product marketing, and risk modeling, among other forward-looking analyses. Anaplan did release a module in a limited private beta, but executives say they’re reworking the module to support mathematical optimization without requiring coding.

Late last year Anaplan did deliver on Application Lifecycle Management (ALM) capabilities promised at Hub 16. The new ALM capabilities brought an important productivity advance, enabling customers to split large models and synchronize model versions so they don’t have to replicate changes across development, testing and production instances.

At Hub 17, Anaplan did announce a new data-integration option called Anaplan HyperConnect, which is a licensed version of Informatica Cloud that Anaplan will sell and support under its own brand. It also announced reporting integrations with Tableau, and the company is days away from releasing a DocuSign integration that will take paperwork and, thus, time out of approval processes.

Anaplan didn’t play up the announcement, but in a roadmap session it unveiled plans for two important coming platform capabilities that will unlock yet more growth. A Bring-Your-Own-Key encryption option, requested by security-conscious banks and financial institutions, is due out later this year. And a lightweight workflow capability will improve planning throughput, governance and collaboration by routing tasks, approvals and alerts. Release dates weren’t disclosed, and one Anaplan executive quipped that the company wants to live down recent product delays by under-promising and over-delivering.

Anaplan has had its share of executive changes over the last year, as is common in any CEO regime change. But as Calderoni settles in and the company pours more of its considerable venture funding into development, I expect innovation to accelerate and the connected planning story to get stronger.

Related Reading:
Anaplan Scales Platform, Prepares for Prediction
SAP Feels Your Pain, ‘Storms Ahead’ on New Apps, Consumer Insights
Cloud-Based Performance Management: Why the Digital Era Demands Agile Planning
