
Arizona State, OpenAI to collaborate on ChatGPT education use cases


OpenAI has inked its first higher education partnership, with Arizona State University, in an effort to bring ChatGPT Enterprise to courses, tutoring and research.

The partnership is notable since ASU is planning to build a personalized AI tutor for students. Generative AI in education has been a hot topic: some universities have moved to ban its use, while other institutions have embraced it. Meanwhile, students are already using tools like ChatGPT, and educational services such as Chegg and Khan Academy have partnered with OpenAI, which just launched its GPT Store.

Previously: Why Chegg is using Scale AI to develop proprietary LLMs | Education gets schooled in generative AI | Coursera: Generative AI will lead to reskilling, upskilling boom

In addition, education technology vendors are looking to embed generative AI throughout the stack.

According to ASU, the plan is to begin use of ChatGPT Enterprise with faculty and staff. ASU said it is focusing on enhancing student success, finding new research avenues and streamlining processes. ASU created an AI accelerator within its enterprise technology department last year. 

On the privacy front, ASU said it will safeguard user data. ASU CIO Lev Gonick said in a statement:

"The goal is to leverage our knowledge core here at ASU to develop AI-driven projects aimed at revolutionizing educational techniques, aiding scholarly research and boosting administrative efficiency."


Samsung's Galaxy S24 launch becomes showcase for Google Cloud AI


Smartphones are increasingly about foundational models, generative AI features and the ability to leverage AI locally. The latest example is Samsung's Galaxy S24 launch, which also served as a showcase for Google's Gemini Pro and Imagen 2 on Vertex AI.

The consumer electronics giant unveiled the Galaxy S24 Ultra, Galaxy S24+ and Galaxy S24 and touted Galaxy AI experiences. Features included Interpreter, which can translate live conversations; Chat Assist, which helps ensure communication comes off well; Note Assist, which will feature AI-generated summaries; and other features baked into the camera.

With the Samsung launch, two of the primary Android flagship lines will come equipped with generative AI experiences. If you've been following recent hardware launches, the next battle is on-device model processing. The Google Pixel 8 Pro is designed to show off Google's models and AI processing. Amazon's Alexa event also had a heavy LLM spin, Apple touched on running AI and machine learning models locally, and PC makers are betting (more like praying) that there will be an upgrade cycle due to model training. Microsoft's Surface event was really about a barrage of Microsoft 365 Copilot launches. Samsung said it will be the first Google Cloud partner to deploy Gemini Pro and Imagen 2 on Vertex AI via the cloud to smartphone devices.

According to Samsung, the Galaxy S24 Ultra will be equipped with the Snapdragon 8 Gen 3 Mobile Platform for Galaxy, a chipset optimized for AI processing. Pricing starts at $799.99 for the Galaxy S24, $999.99 for the Galaxy S24+ and $1,299.99 for the Galaxy S24 Ultra. All devices have AI features.

Janghyun Yoon, Corporate EVP and Head of Software Office of Mobile Experience Business at Samsung Electronics, said Google Cloud and Samsung teams worked together on the Galaxy S24 launch and conducted "months of rigorous testing and competitive evaluation."

While Samsung and Google touted consumer features on the Galaxy S24, the long-term takeaway for enterprises is that they'll eventually be able to leverage the processing power in smartphones for generative AI applications. Local AI processing is more secure, efficient and cost effective.

Bottom line: Smartphones are going to compete on generative AI. Smart enterprises will figure out ways to use local processing for personalized individual use cases.

Constellation Research's take

Constellation Research analyst Holger Mueller said:

"The interesting thing on the Galaxy S24 is how many of the AI features, including Circle to Search, are Google's and how seamless they were. Google's ability to push capabilities on Android finally seems to be working beyond Pixel devices to Samsung flagship smartphones. Samsung committing to seven years of support with Google on the backend is definitely an inflection point."

Andy Thurai at Constellation Research added:

"The Samsung-Google and Google Cloud partnership is a win-win for both companies. Google's partnership with Samsung allows it to take Apple head-on. It will be interesting to see what Microsoft will do. Samsung will also deploy Gemini Nano, an LLM that is purpose-built for mobile devices. Because Samsung uses Android as its OS, this partnership and technology alliance was fairly easy.

While Microsoft/Azure is trying to capture the search market that Google owns with its AI advancements, Google is trying to go after the mobile market. Gemini Pro and Imagen 2 on Samsung Galaxy S24 will certainly challenge iPhone.

I wouldn't be surprised if Apple and Microsoft explore an alliance on mobile as they both need each other. While Apple has done some AI-related things such as facial ID unlock with facial recognition, A15 bionic AI chip, and some basic Siri auto-correct and photo editing, etc., the walled garden of Apple's ecosystem hasn't done much on the AI front."


Quantinuum raises $300 million, valued at $5 billion


Quantinuum raised $300 million in equity investment, putting the quantum computing company's valuation at $5 billion.

Honeywell merged its quantum unit with Cambridge Quantum Computing in 2021 and launched Quantinuum as a stand-alone company. Honeywell remains Quantinuum's largest shareholder.

In a statement, Honeywell said that the funding round was led by JPMorgan Chase with participation from Mitsui & Co., Amgen and Honeywell. Mitsui said it will help expand Quantinuum's reach in Asia.

Quantinuum, which is focusing on quantum use cases such as cybersecurity, computational chemistry and simulation, has raised $625 million since inception. 

According to Honeywell, Quantinuum will use the funds to "accelerate the path towards achieving the world's first universal fault-tolerant quantum computers, while also extending Quantinuum's software offering to enhance commercial applicability." Quantinuum is also working to develop Quantum Natural Language Processing, an effort to bridge quantum computing and generative AI.

Quantinuum counts JPMorgan Chase as one of its customers using Quantinuum's H-Series quantum processors and the company's software development kit, TKET. Other Quantinuum customers include Airbus, BMW Group, Honeywell, HSBC, Mitsui and Thales.
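For readers unfamiliar with the programming model, SDKs like TKET let developers compose quantum circuits gate by gate before targeting hardware such as the H-Series processors. As a rough illustration only (plain NumPy matrix algebra, not Quantinuum's actual SDK), here is the canonical two-qubit entangling circuit, a Bell state, simulated classically:

```python
import numpy as np

# Gate matrices: Hadamard on one qubit, CNOT across two qubits.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I2 = np.eye(2)
CX = np.array([[1, 0, 0, 0],
               [0, 1, 0, 0],
               [0, 0, 0, 1],
               [0, 0, 1, 0]])

state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>
state = np.kron(H, I2) @ state                 # Hadamard on qubit 0
state = CX @ state                             # entangle the pair

probs = np.abs(state) ** 2  # probabilities for |00>, |01>, |10>, |11>
print(np.round(probs, 3))   # half |00>, half |11>: a Bell state
```

Real devices run circuits like this on physical qubits, where error rates, not matrix multiplies, are the hard part; that gap is what "fault-tolerant" roadmaps aim to close.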

Here's a look at Quantinuum's development so far and roadmap ahead. 


Hitachi Vantara names NetApp alum Tanase as chief product officer


Hitachi Vantara has named Octavian Tanase as chief product officer, effective immediately. Tanase will report to Hitachi Vantara CEO Sheila Rohra.

Tanase was most recently senior vice president of hybrid cloud engineering at NetApp where he integrated the company’s software portfolio with offerings from AWS, Microsoft Azure and Google Cloud.

At Hitachi Vantara, a Constellation Insights underwriter, Tanase will oversee the storage, infrastructure and hybrid cloud management company's product vision, strategy, development and execution.

In a statement, Rohra said the addition of Tanase will help Hitachi Vantara position for generative AI and the "explosive growth of processing required for data." Tanase said his goal "is to help the company expand its leadership position by harnessing the power of generative AI and other emerging technologies to drive even greater innovation in its portfolio."

The addition of Tanase comes just a few weeks after Hitachi Vantara, a subsidiary of Hitachi, appointed Tony Gonnella as CFO. Gonnella had been CFO of the Cortex and Unit 42 businesses at Palo Alto Networks.

On the product front, Hitachi Vantara recently launched Hitachi Unified Compute Platform (UCP) for GKE Enterprise to manage hybrid cloud operations. UCP is delivered via Google Distributed Cloud Virtual and can distribute data workloads between Google Cloud Anthos and in-house infrastructure securely.


Here's why generative AI disillusionment is brewing


When it comes to artificial intelligence and generative AI, enterprises are still weighing options, trying to scale pilots and balancing short-term returns and efficiency with long-term business transformation. These businesses are also wrestling with generative AI hype vs. reality.

In 2023, generative AI was top of mind and vendors raced to build out offerings. Now the question is how quickly enterprises will scale up generative AI.

These generative AI disconnects are appearing already in a bevy of surveys, conversations and research. Earnings calls are likely to add a few more generative AI themes to watch.

Here's a look at why the generative AI disillusionment is showing up in multiple places.

Big money is being spent on AI and returns need to follow. You can almost feel the pressure on CXOs when it comes to AI, generative AI and transformation spending. Boards want returns yesterday. According to Boston Consulting Group, 85% of more than 1,400 C-suite executives said they plan to increase spending on AI and generative AI, a top three priority, in 2024.

And expectations may be running ahead of reality. According to Constellation Research's second half 2023 CxO Business Confidence Survey: "Buy-side CxOs are balancing the pressure to invest in the AI space with the need for certainty about the reliability of these new tools. In turn, enterprise tech vendors recognize and predict strong revenue potential in the generative AI space but currently are in the waiting phase of tangible selling and the client's desire to see tangible return on investment (ROI)."

Constellation ShortList™ Artificial Intelligence and Machine Learning Cloud Platforms | Get ready for a parade of domain specific LLMs | Trust In The Age of AI

BCG found that 90% of CEOs are either waiting for generative AI to move past the hype or remain in the pilot phase. A report from Deloitte on the state of AI found that 79% of CXOs expect generative AI to drive organizational transformation in less than three years, but the majority are focusing on tactical returns like cost savings over growth and innovation.

Data is still a big problem and there may not be enough of it. Yes, there's all of the data strategy and architecture work that needs to be done before generative AI pays off. But there is a larger question: Do enterprises have enough data?

Constellation Research CEO Ray Wang said on DisrupTV Episode 348:

"This year, everybody has budget so that they can actually prove that maybe this is the year we actually get some benefit out of AI. But when we trend it out even further, we realize that next year is the year. Companies realize that no one will have enough data to get to a level of precision their stakeholders will trust."

Whether companies go with large language models or smaller models, there's a data issue. Wang said:

"The first 80% of data is hard, but that next 90% is just as hard. And that next 95% of data is even harder. You might get to this point where you're only going to get 99% accuracy. Is that good enough? For contact center? Probably. For procurement? No. For health care? Never."
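Wang's point, that the same model accuracy clears the bar in one domain and fails it in another, can be sketched as a toy gating check. The threshold numbers below are illustrative assumptions, not figures from the source:

```python
# Hypothetical minimum-accuracy bars per use case, echoing Wang's ranking:
# contact center is forgiving, procurement less so, healthcare least of all.
REQUIRED_ACCURACY = {
    "contact_center": 0.95,
    "procurement": 0.995,
    "healthcare": 0.99999,
}

def good_enough(use_case: str, model_accuracy: float) -> bool:
    """Return True if the model clears the accuracy bar for this use case."""
    return model_accuracy >= REQUIRED_ACCURACY[use_case]

# A model stuck at 99% accuracy, per Wang's scenario:
verdicts = {uc: good_enough(uc, 0.99) for uc in REQUIRED_ACCURACY}
print(verdicts)  # only contact_center passes
```

The sketch makes the disillusionment argument concrete: closing the last fraction of a percent of accuracy is where the remaining data work, and cost, lives.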

What we learned from customers in 2023 and predictions for 2024

Orchestration is challenging. It's one thing to find a commoditized large language model. It's another thing to tune it and secure your data. And it's another challenge to deliver that last-mile experience. Frank Schneider, vice president and AI evangelist at Verint, noted on DisrupTV Episode 348:

"A lot of this is use case driven. Is AI getting the use case accomplished?" said Schneider, who added that there's accuracy, performance and trust, and each use case will have a different mix of those three core items.

"It's really about orchestrating experiences, technologies, language models and getting things in the puzzle to fit together," said Schneider. "Folks with scar tissue can help answer how that equilibrium is going to work because they've tried multiple things over the years of business transformation, digital transformation and whatever new technology has come out."

"The elegant brilliance is in the last mile. That's where the winners are going to be."

Efficiency is dominating the AI conversation, but real transformation is about solving big challenges. Mark Minevich, chief digital AI strategist, UN advisor, private investor, and author and columnist, said one of his biggest issues with AI is that the topic has been "swallowed by corporate players."

"Corporate players ferociously focus on optimization and efficiencies," said Minevich. "I think you need to repurpose and reposition the mission of AI to focus on solving the greatest challenges and problems. I think it's time for AI to save the world. I'm not here to replace human beings."

However, 65% of CFOs agree that they will deploy digital technologies to automate certain jobs previously performed by humans, according to Deloitte.

AI's role in transformation projects is a work in progress. Citigroup has had a transformation underway for years, and the latest installment includes a reorganization to flatten the company along with 20,000 layoffs.

Citigroup CFO Mark Mason didn't talk AI on the bank's fourth quarter earnings conference call but did note a lot of spending on IT and transformation.

“Over the past three years, we have invested significantly in our infrastructure, platforms, applications, processes and data.

Roughly 30% of our transformation investments over the last three years were in technology, with the remainder related to non-tech employees and consultants. In 2023, we've seen a shift from consulting expenses to technology and compensation as we've gotten deeper into the execution of our transformation. And you should expect to see this trend continue.

In total, we invested over $12 billion in technology in 2023. Beyond transformation, our technology investments are also focused on digital innovation, new product development, client experience enhancements and areas that support our infrastructure like cloud and cyber."

Wang said in a research report that transformation projects need to have a longer-term view and consider the likelihood of success as well as qualitative benefits.

There are numerous hurdles blocking AI adoption. In a recent survey, IBM noted multiple barriers to AI adoption.

  • 33% said their companies had limited AI skills and expertise.
  • 25% said there was too much data complexity.
  • 23% had ethical concerns.
  • 22% said projects were too difficult to integrate and scale.

Themes from the healthcare data, AI, disruption front lines


This post first appeared in the Constellation Insight newsletter, which features bespoke content weekly and is brought to you by Hitachi Vantara.

The convergence of conferences and healthcare-related themes provided a good overview of the state of healthcare data, AI and disruption.

With CES 2024 featuring a good dose of healthcare and wellness news and JPMorgan's 42nd Annual Healthcare Conference underway, there were plenty of items to ponder. Here are vignettes from the healthcare disruption front lines.

Healthcare disruption and reassembly ahead

Neil Batra, Deloitte's Global Future of Health Leader, said during a CES 2024 panel that the current health system has been in place for the 70 years since World War II. He said the health system comprises multiple players trying to maximize returns, while the consumer sits on the fringes of the overall ecosystem.

"The consumer is the secondary part of the story. What we've observed from other transformations in other industries, is that transformations occur when you have pressure from the outside coming in and incumbent structures have to respond. And that's exactly where we think we are today," he said.

This fragmented market features virtual health challenging brick-and-mortar, novel approaches that threaten existing systems and consumers more in charge of their data, said Batra. These fragmented players then reassemble in areas such as retail health and consumer health.

Batra said:

"We think reassembly is where the magic happens and where the value is going to be created. Incumbents are going to gobble up some of these new movers and create a fundamental transformation of the power structure to sectors that are intertwined, interrelated and integrated with all the great innovation that's occurred in the fragmentation moment. After reassembly we get to this notion of an age of biology on a personalized level. The journey is from being about the rule of thumb to the health of N plus one. We think it's a 20-year journey and we think we're roughly midway through."

Will consumers really leverage their health data?

Batra's vision of healthcare nirvana is one that revolves around the consumer being the CEO of her healthcare. He said:

"The consumer is going to elevate to the CEO of their own health. Armed with information spinning off wearables and other devices, and data being translated through AI applications, laymen may be able to understand really complex dynamics. And that moves the healthcare professional from a central figure to one that is now maybe a copilot or a coach."

He said this reformatting of the healthcare professional's role won't happen overnight, but consumers will take charge of wellness, mental health and health overall. Healthcare won't be about sick calls only. Generative AI will result in technology that makes healthcare more consumer-oriented, said Batra.

Dorothy Kilroy, Chief Commercial Officer at Oura, said data quality and ease of use will drive how quickly consumers take charge of their own healthcare. "A lot of people still don't know how to just interpret the data. And so, we're going to have to make sure that it's really user friendly in a way that they can actually action on it," said Kilroy.

Cristian Liu, Director of Partnerships and Go-to-Market Strategy at Google Health, said consumers have wearables and data, so the tools for healthcare reinvention are there. "Do we have the tools to make sense of this information? I think it's a really exciting time because of generative AI and because of artificial intelligence," said Liu.

Much of this health reinvention will depend on enterprise data infrastructure, trust and regulatory issues, said Tom Swanson, Head of Healthcare Strategy and Marketing at Adobe. He added:

"The healthcare industry as a whole has more data than any other industry. The question is, are you using the data in an appropriate way that is actually a value to your consumers? Can you use that data to provide value and build trust to enable your consumers to be a proactive participant in their own wellness? The data is there. Right. But I think the biggest problem that we have as an industry is not using it in a timely manner or being afraid to use it because of legal and regulatory constraints."

Kilroy said it's likely that consumers will push the healthcare industry to transform and clear regulatory hurdles. "I see consumers pushing and demanding for more here," she said.

Dr. Generative AI

Liu said the most interesting part of healthcare reinvention is the interpretation of data and results via AI and large language models. "A year ago, we couldn't necessarily throw in all of this data and ask what it means. Today you can throw it all into a large language model and it can break the pieces down in really layperson's language and explain to you what's going on. That's so exciting for consumer applications," said Liu.

Kilroy said consumers will look to generative AI as a healthcare partner in phases. "There is certainly a more health-conscious consumer that is being more proactive about their health. I don't think that's everybody yet," said Kilroy. "I do think that is changing, but we still have a long way to go in more literacy of health to the consumer."

Data and generative AI can bridge the gap between what a person is feeling and knowing what's going on inside of the body. "The best data is your own data compared to your own personal average, and then seeing how those little micro experimentations can actually change your life," said Kilroy.

The ability to combine personal data with generative AI interpretation may ultimately depend on data interoperability, said Liu. Consumers have health data. Healthcare systems have data locked up. Sharing is difficult. Incentives to share data, in the form of lower healthcare costs, may knock down barriers.
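At its simplest, the interoperability problem Liu describes is a join: consumer and clinical records live in separate systems and must be combined on a shared patient identity before any AI can interpret them together. A minimal sketch, with all identifiers and field names hypothetical:

```python
# Hypothetical silos: wearable readings and clinical records keyed by patient id.
wearable = {"p1": {"resting_hr": 52, "sleep_hours": 7.5}}
clinical = {"p1": {"ldl_mg_dl": 110}, "p2": {"ldl_mg_dl": 95}}

def merge_records(consumer: dict, provider: dict) -> dict:
    """Combine per-patient data from both silos; a patient missing from one
    silo keeps a partial record rather than being dropped."""
    merged = {}
    for pid in consumer.keys() | provider.keys():
        merged[pid] = {**consumer.get(pid, {}), **provider.get(pid, {})}
    return merged

combined = merge_records(wearable, clinical)
print(combined["p1"])  # wearable and clinical fields side by side
```

In practice the hard parts are exactly what the toy version hides: reconciling patient identity across systems, consent, and regulatory constraints on sharing.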

Data sharing between patients and physicians would also be a boon to the customer experience, argued Kilroy. "I think giving physicians continuous data gives them a bigger superpower. I am more hopeful that working with them than against them is actually going to be what's valuable here," she said.

Nvidia only tech company at JPMorgan's 42nd Annual Healthcare conference

Nvidia's healthcare reach played out on two fronts. First, the company held a special address during CES 2024 and outlined plans to use its Nvidia BioNeMo platform to meld generative AI models, cloud and drug discovery. BioNeMo is a generative AI platform that provides services to develop, customize and deploy foundation models for drug discovery.

In addition, Nvidia said Amgen will build AI models trained on human datasets on Nvidia infrastructure. The system is based on Nvidia DGX SuperPOD. Amgen will install the system at its deCODE genetics headquarters in Reykjavik, Iceland.

With the CES 2024 news out of the way, Kimberly Powell, Vice President of Healthcare at Nvidia, spoke at the JPMorgan 42nd Annual Healthcare conference.

Powell said accelerated computing and AI are combining to usher in an era of digital biology. Nvidia systems are being used for cell imaging and high dimensional analysis. She added that spatial genomics was another promising area.

"There's another phenomenon that is happening, not only the digitization of biology but also with generative AI, the ability to represent the two things that describe drugs, biology and chemistry in a computer. We can use generative AI to represent it," she said.

Powell likened the shift in biology to computer-aided design and electronic design automation in the chip industry. It's early, but Powell argued that drug discovery will be a huge market, just like chip design and semiconductors.

However, there's still work to do. Powell said:

"Biology and chemistry generative AI models are still quite small. We're still in the very, very early innings compared to other fields like natural language processing and what you're seeing with GPT-3, 4, 5, but we're growing in size and complexity. And so, we still have a lot of progress to be had building larger and more capable models from digital biology data that already exists today, and continuously enhancing these models with the data that's continuously being generated in the labs. So BioNeMo provides the biopharma ecosystem with large-scale model training to effortlessly train and scale AI training to thousands of GPUs, and you can train billion-parameter models in days rather than the months it was taking."

Medtronic eyes AI for growth

Medtronic has formed an AI center of excellence as the company aims to advance AI-enabled healthcare based on data from its medical devices.

Speaking at the JPMorgan 42nd Annual Healthcare conference, Medtronic CEO Geoffrey Martha outlined the company's plans in AI. Martha said the company's move to create an AI center of excellence is aimed at centralizing key data assets including millions of patient datasets, regulatory experience, analytics knowhow, and medical device expertise.

In a nutshell, many of Medtronic's devices today include algorithms and models. The Medtronic AI-enhanced portfolio includes GI Genius, which uses AI for endoscopy, Touch Surgery Enterprise, AiBLE for neurosurgery, MiniMed 780G System for diabetes management, and LINQ, an insertable cardiac monitor.

"We have the data and analytics expertise, and we're continuing to build on that. And this is across multiple disease areas. And we've been working very closely with the regulators on this. We spend a lot of time with regulators around the world, especially the FDA on how to think about AI and health care," said Martha, who added the company is planning to leverage common platforms to scale.

Martha added that it's early, but Medtronic is already seeing the promise in training models with its data.

"This isn't about ChatGPT. I mean we have to train the models ourselves with a lot of high-quality data, but the impact is amazing here. And I think as we move forward, you're going to hear more and more about this from us," said Martha.

From our underwriter

Each year over 22,685 people are admitted to St Mary’s Hospital — a National Health Service (NHS) hospital run by the Isle of Wight NHS Trust. Many of these patients need specialist scans, biopsies and other clinical tests as part of their treatment. The Isle of Wight NHS Trust set out to build a state-of-the-art imaging suite for its pathology department and chose the Hitachi Virtual Storage Platform (VSP) E590 platform as its foundation. Get the full story.


Matt Abrahams on DisrupTV: Why small talk is a big deal and other takeaways


Matt Abrahams, a Stanford University Graduate School of Business professor and author of Think Faster, Talk Smarter, argues that everyone can become better at spontaneous conversation, master small talk and learn the art of paraphrasing and apologizing.

Speaking on DisrupTV Episode 347, Abrahams laid out some tips from his book to ponder. Here's a look at the takeaways.

Everyone can get better at spontaneous conversation. "We can get better at a relatively fast clip," said Abrahams. "There are things you can do that almost immediately help you feel more comfortable and confident communicating in the moment. For example, learning a simple structure for how to package your information can help, and a few techniques to manage anxiety can help a lot right away."

DisrupTV Special Edition Episode - Top 15 Books of 2023

Are there differences between generations when it comes to communication? Abrahams said yes and no. This is Abrahams' first year as a professor in which all of the students at the Stanford Business School are Gen Z.

He said:

"I've been teaching a long time. I'm old and I've seen lots of shifts and certainly technology like generative AI is changing the way we communicate, and our students are much in many cases more well versed in them. But the fundamental struggles of communicating effectively, confidently, and concisely persist and transcend generations. And it is important that all of us take the time to hone and develop those skills." 

You can prepare to be spontaneous. Abrahams said his approach to communicating includes a lot of counterintuitive notions. The biggest one is that you can prepare to be spontaneous. It's like practicing sports: you do drills and prepare so you can play the game, be spontaneous and react.

"The vast majority of our communication in our personal and professional lives is spontaneous. It's not the planned presentation, the pitch or the meeting with an agenda. It's the giving feedback, it's the fixing our mistakes, it's the apologizing, it's the small talk, the answering questions. That's what most of our communication is," said Abrahams. "I've developed a methodology. It has six steps to help us get through it, and the steps divide into two categories, mindset and messaging. With practice and with pushing yourself to get better, all of us can improve in our spontaneous speaking."

Structure matters. Abrahams said:

"Structure is critical to effective communication when we are in the moment and we are having to figure out what to say. Many of us take our audiences on the journey of our discovery of what it is we want to say. In other words, we just list out information, but our brains are not wired to receive or process lists. Structure is a logical connection of ideas: a beginning, a middle and an end."

Small talk is a big deal. Abrahams said his book has two parts. The first focuses on the methodology and the other homes in on situations that require spontaneous speaking. He said:

"I was surprised to find that small talk is what seems to resonate more than the other parts. I thought it would be Q&A and feedback. Many of us struggle with small talk and I am on a personal mission to help rebrand small talk. Small talk is a big deal. Big things happen in small talk. Think about some of your closest friends that you have. How did you get to know them? And how did you get closer? Chances are it was through small talk. Think about some of the most important deals that you've made or learnings that you've had. It happens through small talk. So, we often write it off as a frivolous necessary evil when in fact, big things happen. So, the question becomes, how do we do it better?

"It's about being interested, not interesting. We lead with curiosity, lead with questions and lead with observations. That's how you get things started. Once you get started, most people feel more comfortable."

The art of the paraphrase. Abrahams said plenty of people talk more than they should. Sometimes it's malice; sometimes they're just discovering what they're saying as they go and get lost. In that situation, use the paraphrase.

"One of the top three communication tools everybody should develop is paraphrasing, where you highlight or summarize some key point somebody has said. It is critical to shutting somebody down. If somebody is pontificating and going on and on, simply jump in by highlighting some crucial element, comment on it and then move on. Paraphrasing is a delightful skill that helps you do that. But it doesn't happen by itself. It is always partnered with good listening. You paraphrase and then there has to be a link, a bridge, to something else."

Listening well. Abrahams said you can learn to listen better by listening for the bottom line instead of the top line. Pay attention to context and how something is said. "What's the person really saying? When we listen intently, we actually hear better. If you want to be a better listener, we have to slow things down. We have to slow the pace. Life comes at us fast and furious, and we have to slow down," said Abrahams. "We have to go to a space we can listen in and allocate space where we can really focus and then give ourselves a little bit of grace to listen intently to what is said and how it is said. Not only to what is said. By giving ourselves a little pace, space and grace, we can all listen better."

He added that better listening also requires eyes, ears and paying attention to the environment.

The art of apologizing. Abrahams said most people struggle with apologies and do it inappropriately. "We don't really apologize. We say we're sorry for how we make people feel vs. what we actually did. We really have to take the time to apologize," he said.

Abrahams added:

"When it comes to apologizing the structure that I teach is AAA just like roadside service here in the United States. AAA will help you. It's three steps and this is the way you can structure a good solid apology. First, you have to acknowledge the incident. What is it that you did? Second, you have to appreciate the consequences for the person and third, you make amends."

There are times when you have to apologize immediately, but others where you can think it through.


5 takeaways from Infosys’ Q3

Infosys' third quarter earnings highlight enterprise interest in generative AI, large deals and clients that are navigating an uncertain economic picture.

The services provider delivered $4.66 billion in revenue in the third quarter and announced large deal wins of $3.2 billion. North America accounted for 59% of revenue in the quarter. Infosys added that demand was strong for its Topaz generative AI platform and Cobalt, its cloud services portfolio. For the quarter ended Dec. 31, Infosys reported a net profit of $733 million.
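As a quick sanity check on the reported figures, the quarter's implied net margin and North America revenue can be derived directly (these derived numbers are our arithmetic, not figures from the earnings call):

```python
# Reported Q3 figures from the article (USD)
revenue = 4.66e9       # quarterly revenue
net_profit = 733e6     # reported net profit
large_deals = 3.2e9    # announced large deal wins
na_share = 0.59        # share of revenue from North America

net_margin = net_profit / revenue
na_revenue = revenue * na_share

print(f"Implied net margin: {net_margin:.1%}")            # roughly 15.7%
print(f"North America revenue: ${na_revenue / 1e9:.2f}B") # roughly $2.75B
```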

Here are some of the takeaways from Infosys CEO Salil Parekh and CFO Nilanjan Roy on the company's earnings conference call.

Generative AI is top of mind for enterprises. Parekh said:

"Almost every discussion with clients involves some element of generative AI. We're working across a large number of clients at different scales, where there are some which are more pilots and some which are programs."

Industry demand is uneven. "We have seen impact in Financial Services, Telco and Hi-Tech segments. We see strength in Manufacturing, Energy, Utilities and Life Sciences segment. We are seeing strong traction for generative AI programs leveraging our Topaz capability. We've integrated our generative AI components into our service line portfolio, creating impact for our clients," said Parekh.

Use cases for generative AI emerge across Infosys clients. "We have developed a range of use cases and benefit scenarios across different industries for our clients. Some of these areas are related to client analytics, process optimization, sales, marketing, knowledge analysis, software development, self-service and personalization," said Parekh.

He added that Infosys is working with a large bank on a risk analysis program by developing a large language model for them. A food supplier is personalizing food experiences and making operations efficient.

Inflation and an uncertain economic picture are prolonging decisions. "Inflation, uncertain macro and delay in decision-making continues to impact the financial services sector with increasing cost pressures, clients remain cautious on spending and are reprioritizing their programs to deliver maximum business value," said Roy. Telecom was similar.

"Clients are looking at conserving cash, which is visible in delayed decision-making and project deferrals. Our focus on large and mega deals resulted in healthy pipeline and deal wins. Energy, utilities, resources and services clients remain cautiously optimistic about the demand environment with caps on short-term spend," said Roy.

Why digital, business transformation projects need new approaches to returns

Retail eyes generative AI and predictive analytics. "In the Retail segment, cost takeouts and consolidation remain the primary focus for clients. While discretionary spends remain under pressure, there are pockets of opportunity leveraging generative AI in predictive analysis, real-time insights and decision support areas. The deal pipeline is strong, though decision cycles remain long," said Roy.


Big Idea: Return on Transformation Investments

Many organizations have performed classical cost-benefit analyses to determine the impact of business technology projects. Although these approaches account for quantifiable metrics, they often fail to capture key attributes such as the probability of success and the level of difficulty of the project type.

Today’s artificial intelligence (AI) and business transformation projects require a much more holistic approach when evaluating the value created for an entire organization. Consequently, boards and their executives seek richer attributes to augment their traditional decision-making techniques.

Constellation's Return on Transformation Investment (RTI) methodology provides decision support for business and technology leaders in their investment analyses. The methodology accounts for four elements that must be considered for any digital transformation project: cost, benefit, probability and project type.
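To make the four elements concrete, here is a hypothetical sketch of how they could combine into a single risk-adjusted score. Constellation's actual RTI formula is not spelled out in this article; the difficulty multipliers, project-type names and expected-value calculation below are illustrative assumptions, not the published methodology:

```python
from dataclasses import dataclass

# Assumed difficulty multipliers by project type (illustrative only)
DIFFICULTY = {
    "incremental": 1.0,
    "transformational": 1.5,
    "moonshot": 2.0,
}

@dataclass
class Project:
    cost: float          # total investment required
    benefit: float       # projected benefit if the project succeeds
    probability: float   # estimated probability of success, 0 to 1
    project_type: str    # one of DIFFICULTY's keys

def rti_score(p: Project) -> float:
    """Expected benefit net of difficulty-weighted cost, as a ratio of that cost."""
    expected_benefit = p.benefit * p.probability
    weighted_cost = p.cost * DIFFICULTY[p.project_type]
    return (expected_benefit - weighted_cost) / weighted_cost

# A $2M transformational project with a $6M upside at 60% odds of success
pilot = Project(cost=2e6, benefit=6e6, probability=0.6,
                project_type="transformational")
print(f"RTI-style score: {rti_score(pilot):.2f}")  # 0.20
```

The point of a sketch like this is that the same cost-benefit pair can produce very different scores once probability and project difficulty enter the calculation, which is exactly the gap classical cost-benefit analysis leaves open.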


VIEW FULL REPORT: https://www.constellationr.com/research/return-transformation-investments-rti

On Insights <iframe width="560" height="315" src="https://www.youtube.com/embed/yMy_nSzJeq4?si=KWb9preaOyK7B9w2" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>

Google Cloud offers free data transfers to customers migrating to other clouds

Google Cloud said it will offer free network data transfer to customers that move to another cloud provider or migrate back on-premises.

In a blog post, Google Cloud outlined the changes. Data transfer fees have been under scrutiny by customers as well as regulators. Google Cloud's move also puts pressure on Microsoft Azure and AWS to do the same.

AWS, Microsoft Azure, Google Cloud battle about to get chippy

Amit Zavery, VP and head of platform at Google Cloud, said eliminating data transfer fees is an important move, but he also took aim at software licensing. He said:

Eliminating data transfer fees for switching cloud providers will make it easier for customers to change their cloud provider; however, it does not solve the fundamental issue that prevents many customers from working with their preferred cloud provider in the first place: restrictive and unfair licensing practices.

Certain legacy providers leverage their on-premises software monopolies to create cloud monopolies, using restrictive licensing practices that lock in customers and warp competition.

Google Cloud's free data transfer broaches a key topic in hybrid and multi-cloud approaches. Where your data resides equates to lock-in in many cases. Interoperability would offer more choices. The company has an FAQ outlining the details of free data transfers.
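The stakes of the change are easy to quantify in rough terms. The per-GB rate below is a hypothetical illustrative figure, not any provider's published price; the point is how quickly one-time exit costs scale with data volume, and what waiving them is worth:

```python
# Back-of-the-envelope egress cost for a one-time cloud exit migration.
# $0.08/GB is a hypothetical illustrative rate, not a published price.

def egress_cost(data_tb: float, rate_per_gb: float) -> float:
    """One-time transfer cost in USD for data_tb terabytes at rate_per_gb."""
    return data_tb * 1024 * rate_per_gb

data_tb = 500  # hypothetical dataset to move off the platform
print(f"At $0.08/GB: ${egress_cost(data_tb, 0.08):,.0f}")   # $40,960
print(f"With fees waived: ${egress_cost(data_tb, 0.0):,.0f}")  # $0
```

Even at modest rates, exit costs grow linearly with data volume, which is why egress pricing functions as a lock-in mechanism and why waiving it changes the switching calculus.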
