Results

Futures Recovery Healthcare Transforming Mental Health with AI Technology

Hear from SuperNova finalist Dr. Tammy Malloy about how Futures Recovery Healthcare is using AI technology to transform its approach to mental health care.

Vote for Dr. Malloy or other SuperNova finalists here: https://www.constellationr.com/events/supernova/2023

On Insights. Watch the video: https://www.youtube.com/embed/AYBbAs_HFkY

Generative AI Trends, ShortLists, Tech News | ConstellationTV Episode 63

On ConstellationTV episode 63, co-hosts and analysts Dion Hinchcliffe and Doug Henschen talk #tech news trends, then CR Insights Editor-in-Chief Larry Dignan shares his top 5 takeaways from Q2 #earnings, Andy Thurai discusses hype around #GenerativeAI, and the episode concludes with previews of the new Q3 ShortLists.

00:00 - Introduction
01:17 - Tech News Updates - cloud spending, AI trends and more
13:15 - Generative AI: Hype or Reality?
20:15 - 5 Lessons from Q2 Earnings
35:30 - 2023 Q3 ShortLists preview
35:55 - Bloopers

ConstellationTV is a bi-weekly Web series hosted by Constellation analysts. The show airs live at 9:00 a.m. PT/ 12:00 p.m. ET every other Wednesday. Subscribe to our YouTube Channel: youtube.com/@constellationresearch

On ConstellationTV. Watch the video: https://www.youtube.com/embed/OxfzAhR37mg

5 Takeaways from Q2 Tech Earnings | Constellation Insights

5 takeaways from technology Q2 earnings with Larry Dignan, Constellation Insights Editor in Chief.

On Insights. Watch the video: https://www.youtube.com/embed/P155C2LNxG8

F5's transformation required new CX approaches

F5 has been on a digital transformation journey that has changed its product portfolio, revenue mix and focus from hardware to software and services. The company's customer experience had to evolve with the transformation.

Speaking on F5's third quarter earnings conference call, CEO Francois Locoh-Donou noted that demand is stabilizing, and its transformation is paying off. Locoh-Donou said:

"Over the last several years, we have invested both organically and inorganically to build a portfolio of SaaS and managed services called F5 Distributed Cloud Services. Since launching distributed cloud in February of '22, we have been expanding our offerings and building momentum for multiple security use cases."

Connecting Experiences From Employees to Customers

Today, F5 has a bevy of product lines including F5 BIG-IP, F5 NGINX and F5 Distributed Cloud Services that have expanded beyond hardware to optimizing and securing applications and APIs. For the fiscal third quarter, F5 reported revenue of $703 million, up 4% from a year ago. Net income in the third quarter was $89 million, or $1.48 a share. Non-GAAP earnings were $3.21 a share.

For the fourth fiscal quarter, F5 expects to deliver revenue in the range of $690 million to $710 million, with non-GAAP earnings of $3.15 to $3.27 a share.

Indeed, F5's results highlight a company that has become more diversified. In an interview on DisrupTV Episode 331, Mika Yamamoto, Executive Vice President and Chief Customer Engagement and Marketing Officer at F5, and Kara Sprague, Executive Vice President and Chief Product Officer at F5, outlined how customer experience at F5 had to evolve as the company transformed.

Here's a look at the key lessons from F5's CX transformation.

Connect the data dots across silos. Yamamoto said for F5 to connect its digital and customer experience it had to connect its data on the back end. "As I dove in, I realized on our website there was this contact me form, and people would fill out their information and it would go into the ether," she said. "We needed to connect our data on the back end and realize that if customers were actually trying to engage with us, we had to take that data and hand it to somebody to do something with it."

F5 created a data office, took more than 70 databases and started cleaning the information. "When we put all the information together, we realized we had five Microsofts. Turns out there's only one," she said.

Digital transformation requires data at the foundation and automating as many processes as possible. "If our processes aren't clean, if our data isn't clean, then we're not listening to the voice of the customer and taking out friction points across the company," said Yamamoto.
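To illustrate the kind of back-end work involved, here is a minimal sketch of a normalization and deduplication pass that collapses "five Microsofts" into one canonical account. It is illustrative only; the rules and field names are assumptions, not F5's actual pipeline, which would add fuzzy matching, domain and address signals, and human review.

```python
# Sketch: normalize company names pulled from multiple source systems and collapse
# duplicates into one canonical account. Illustrative only, not F5's actual pipeline.
import re
from collections import defaultdict

SUFFIXES = {"inc", "corp", "corporation", "co", "ltd", "llc"}

def normalize(name: str) -> str:
    tokens = re.sub(r"[^a-z0-9 ]", " ", name.lower()).split()
    return " ".join(t for t in tokens if t not in SUFFIXES)

def dedupe(records):
    accounts = defaultdict(list)
    for rec in records:
        accounts[normalize(rec["account_name"])].append(rec)
    return accounts

rows = [
    {"account_name": "Microsoft Corp.", "source": "crm"},
    {"account_name": "MICROSOFT", "source": "support"},
    {"account_name": "Microsoft Corporation", "source": "billing"},
]
merged = dedupe(rows)
print(len(merged))          # 1 canonical account instead of "five Microsofts"
print(merged["microsoft"])  # all source records now linked to that account
```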

 

Focus on application capital. "What we're pushing at F5 is that we are now in the era of application capital. Applications are the drivers of value not just for the companies that are application-centric but for all companies, all geographies and all industries," said Sprague.

F5's business used to revolve around perpetual hardware licenses. Today F5 is more oriented around software as a service and managed services. It's a land-and-expand model with a customer success function that's engaged. "I'm working more on the product side and making sure our portfolio can do what we want it to do, which is secure, deliver and optimize any app, any API, anywhere," said Sprague.

"An important strength of the F5 model is that we now deploy in hardware, in software and software-as-a-service. And we're seeing that customers really value the flexibility that they have in the F5 model," said Locoh-Donou.

How does application capital apply to CX?

"If you think about where we were before we could rely on a discrete product that was deployable. It was a tangible asset. It was hardware. Then we could have software. There were human beings and digital to both sell, implement and support," explained Yamamoto. "Today, it's an expectation of customers that they can engage with us digitally and so the website becomes really important as a place where people can gather information, get access to online self-serve support through articles or chat bots."

Listen to your customers--and the data they provide. Yamamoto said F5 is also capturing the voice of the customer to get insights for the product and support teams. There are the obvious listening channels such as social channels. "You've got this complexity of engagement with the customer," said Yamamoto. "We're monitoring web traffic and who's engaging with us, and we want to make sure we connect the dots when we're engaging."

And then there's listening to what customers are doing.

"A lot of it is taking what we get to learn about our customer and creating some accountability," said Yamamoto, adding that signals from telemetry provide feedback on what features are being used in products. "We actually listen to those signals for a customer to tell us through their actions either positively or negatively. We've got to be able to connect those dots to be able to be proactive and anticipate what our customers need," said Yamamoto.

Confidence in computing

I recently wrote about the inaugural Confidential Computing Summit, a milestone in the development of this new field.
In this piece I provide more context on the Confidential Computing movement and reflect on its potential for all computing.
Acknowledgement and Declaration: I was helped in preparing this article by Manu Fontaine, founder of new CCC member Hushmesh, who did attend the summit. I am a strategic adviser to Hushmesh.

Down to security basics; all the way down

Confidential Computing is essentially about embedding encryption and physical security at the lowest levels of computing machinery, in order to better protect the integrity of information processing. The CC movement is a logical evolution and consolidation of numerous well understood hardware-based security techniques.
Generally speaking, information security measures should be implemented as far down the technology stack as possible; as they say, near to the “silicon” or the “bare metal”. By carrying out cryptographic operations (such as key generation, hashing, encryption and decryption) in firmware or in wired logic, we enjoy faster execution, better tamper resistance and above all, a smaller attack surface.
The basics are not new. Many attempts have been made over decades to standardize hardware and firmware-based security, and make these measures ubiquitous to software processes and general computing. Smartcards led the way.
Smartcards emerged in Europe in the 1990s; the whole point of a smartcard was to provide a stand-alone, compact, well-controlled computer, separated from the network and regular computers, where critical functions could be carried out safely. Cryptography was of special concern; smartcards were capable enough to offer a complete array of signing, encryption and key management tools, critical for retail payments, telephony, government ID and so on.

The smarts

So the discipline of smartcards led to clearer thinking about security for microprocessors in general, and spawned a number of special purpose processors.
  • From the early 1990s, the digital Global System for Mobile communications (GSM) cell phone system featured SIM cards — subscriber identification modules — essentially cryptographic smartcards holding each individual’s master account number in a digital certificate signed by her provider. The start and end of each new phone call is digitally signed in the SIM, thus providing secure metadata to support billing (and thus the SIM by the way is probably the world’s first cryptographically verifiable credential).
  • In 2003, Bill Gates committed Microsoft to smartcard authentication, writing to thousands of customers in an executive e-Mail that “over time we expect most businesses will go to smart card ID”.
  • ARM started working on the TrustZone security partition for its microprocessor architecture sometime before 2004.
  • Trusted Platform Modules (TPMs) were conceived as security co-processors for PCs and all manner of computers, to uplift cyber safety across the board (if only adoption were as widespread as anticipated).
  • NFC (near field communications) chip sets enable smartphones to emulate smartcards and thus function as payment cards. Security is paramount or else banks wouldn’t countenance virtual clones of their card products. But security was weaponized in the first round of “wallet wars” around 2010, with access to the precious NFC secure elements throttled, and Google forced to engineer a compromise “cloud wallet”.
Now, security wasn’t meant to be easy, and hardware security especially so!
Standardization of smartcards, trusted platform modules and the like has been tough going, for all sorts of reasons which need not concern us right now.
Strict hardware-based security is also unforgiving. The FIDO Alliance originally adopted a strenuous key management policy where private authentication keys were never to leave the safety of approved chips. But the impact on users when their personal devices need to be changed out is harsh, and so FIDO has pivoted — very carefully mind you — to “synchronized” private keys in the cloud, a solution branded Passkeys.

TEE time!

The Confidential Computing Consortium (CCC) is a relatively new association comprising hardware vendors, cloud providers and software developers aiming to “accelerate the adoption of Trusted Execution Environment (TEE) technologies and standards”.
The CCC is certainly not the only game in town, with the long running Trusted Computing Group (TCG, est. 2003) continuing to develop standards for the important Trusted Platform Module (TPM) architecture. Membership of these groups overlaps. I do not mean to compare or rank security industry groups; I merely take this opportunity to report on the newest thinking and developments.
So TEEs sit at the centre of Confidential Computing.
The CCC offers the following definition:
Confidential Computing protects data in use by performing computation in a hardware-based, attested Trusted Execution Environment. These secure and isolated environments prevent unauthorized access or modification of applications and data while in use, thereby increasing the security assurances for organizations that manage sensitive and regulated data.
So Confidential Computing crucially goes beyond conventional encryption of data at rest and in transit, to protect data in use.
Attestation of the computing machinery is a central idea. This is the means by which any user or stakeholder can tell that a processing module is operating correctly, within its specifications, with up-to-date parameters and code. The CCC updated its definition of confidential computing, not long before the CCC Summit, to make attestation essential.
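To make the attestation idea concrete, here is a minimal, hypothetical sketch of the relying-party logic; the evidence fields and the verify_signature helper are illustrative assumptions, not any vendor's actual attestation format. The point is that a secret is released to a workload only after its TEE evidence is signed by trusted hardware, matches the expected code measurement and is fresh.

```python
# Hypothetical sketch of a relying party's attestation check before releasing a secret.
# Field names and verify_signature() are illustrative, not a specific TEE vendor's format.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Evidence:
    measurement: str    # hash of the code/config actually loaded into the TEE
    nonce: str          # freshness value the relying party supplied earlier
    signer: str         # identity of the hardware attestation key
    signature: bytes    # signature over the evidence by that key

def verify_signature(evidence: Evidence, trusted_signers: set) -> bool:
    # Placeholder: a real verifier checks the hardware vendor's certificate chain
    # and the cryptographic signature over the evidence.
    return evidence.signer in trusted_signers

def release_secret_if_attested(evidence: Evidence, expected_measurement: str,
                               expected_nonce: str, trusted_signers: set,
                               secret: bytes) -> Optional[bytes]:
    if not verify_signature(evidence, trusted_signers):
        return None   # evidence not signed by hardware we trust
    if evidence.measurement != expected_measurement:
        return None   # workload is not the exact code we expect
    if evidence.nonce != expected_nonce:
        return None   # stale or replayed evidence
    return secret     # only now is the secret provisioned into the TEE
```

In other words, trust in the workload rests on verifiable evidence about the machinery, not on assumptions about the host.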

There’s more to CC than meets the eye

Confidential Computing as a field has yet to register with most IT professionals. I find that if people know anything at all about CC, they tend to see it in terms of secure storage, data vaults, and “hardened” or “locked down” computers.
But there is so much more to it.
At Constellation Research we have always taken a broad view of digital safety, beyond data privacy and cybersecurity. Safety must also mean confidence, even certainty for practical purposes. We believe stakeholders must have evidence for believing that a system is safe. Safety is about both rules and tools.
Security and privacy are always context dependent. Safety is judged relative to benchmarks, so we need to know the specifics behind calling a system fit for purpose.
What are the conditions in which a system is safe? What has it been designed for? What standards apply and who determined they are being followed? And do we know the detailed history of a system, from its boot-up through to the present minute?
This type of thinking leads to the need for finer grained signals to help users be confident that a system is safe and that given information is reliable. Data today has a life of its own, created from complex algorithms, training sets and analytics, typically with multiple contributions over time. We often need to know the story behind the data.
This is where CC comes in, with its explicit focus on traceability, accountability and evidence (see the CCC’s April 2023 blog Why is Attestation Required for Confidential Computing?).
With Confidential Computing we should be able to account for the entire life story of all important devices and all important data, and make those details machine readable and verifiable.

Recapping the CC Summit

As reported, the #CCSummit on June 29 featured a breadth of topics and perspectives.
  • The provenance of machine learning training data, algorithmic transparency and the pedigree of generative AI products are all excellent CC use cases.
  • Intel Chief Privacy Officer Xochitl Monteon argued for protecting data through its entire lifecycle in a CC ecosystem.
  • Google’s Head of Product for Computing and Encryption Nelly Porter explained how CC strengthens digital sovereignty in emerging economies.
  • Opaque Systems founder Raluca Ada Popa advocated for “Privacy-preserving Generative AI” including secure enclaves to protect machine learning models in operation.

Reflections: Can all computing be Confidential Computing?

Well, perhaps not all, but Confidential Computing should be the norm for most computing in future.
However, in my opinion the label “confidential” is limiting. Of course, some things need to be kept secret but the real deal with CC is certainty about the cryptographic state of our IT. Admittedly that’s a bit of a mouthful but let’s be clear about the requirement.
Cryptography is now so critical in digital infrastructure, it has to be a given. Cryptography is ubiquitous, and it's not just encryption to keep things secret that matters; encryption for authentication is actually far more pervasive. Digital signatures, website authentication, code signing, device pedigree, version numbering and content watermarking are all part of the digital fabric. These techniques all rest on cryptographic processors operating properly without interference, and cryptographic keys being generated faithfully and distributed to the proper holders.
Yet as Hushmesh founder Manu Fontaine observes, “Cryptography is unforgiving but people are unreliable”.
That is, cryptography can’t be taken for granted – not yet.
If cryptography is to be a given, we must automate as much of it as possible, especially the attestation of the state of the machinery, to put certainty beyond the reach of tampering and human error.
Hushmesh has one of the most innovative applications for Confidential Computing. They have re-thought the way cryptographic relationships (usually referred to as bindings) are formed between users, devices and data, and turned to CC to automate the formation of those relationships, so that users and data are fundamentally united instead of arbitrarily linked.

No room for error

Botnet attacks show us that the most mundane devices (all devices these days are computers) can become the locus of gravely serious vulnerabilities.
The scale of the IoT and the ubiquity of microcontrollers (MCUs) and field-upgradable software mean that even light bulbs actually need what we used to call "military grade" security and reliability.
The military comparisons are obsolete. We really need to shift expectations beyond consumer-grade security and make serious encryption the norm everywhere.
The state of all end-points in cyberspace needs to be standardized, measurable, locked down, and verifiable. So many end-points now generate data and send messages back into the network. As this data spreads, we need to know where it’s really come from and what it means, not only to protect against harm but to maximize the value and benefits data can bring.

Privacy and data control

Remember that privacy is more to do with controlling personal data flows than confidentiality. A rich contemporary digital life requires data sharing, not data hiding.
A cornerstone of data privacy is disclosure minimization. A huge amount of extraneous information today is disclosed as circumstantial evidence collected in a vain attempt to lift confidence in business transactions, to support possible forensic activities, to try and deter criminals. Think about checking into a hotel: in many cases the clerk takes a copy of your driver licence just in case you turn out to be a fraudster.
If data flows such as payments by credit card were inherently more reliable, merchants wouldn’t need superfluous details like the card verification value (CVV).
Better reliability of core data would help stem the superfluous flow of personal information. Reliability here boils down to data signing, to mark its origin and provenance.
MyPOV: the most important primitive for security and privacy is turning out to be data signing. All important data should be signed at the origin and signed again at every important step as it flows through transaction chains, to enable us to know the pedigree of all information and all things.
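As a rough illustration of that point, here is a minimal sketch of signing data at the origin and counter-signing it at each later step so a consumer can verify the whole chain. It uses the Ed25519 primitives from the Python cryptography package; the record layout and party names are hypothetical.

```python
# Sketch: sign data at the origin, then counter-sign at each step of a transaction chain
# so provenance can be verified end to end. The record layout is illustrative only.
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def sign_step(prev_record, payload, key, signer):
    body = {"signer": signer, "payload": payload,
            "prev_signature": prev_record["signature"] if prev_record else None}
    message = json.dumps(body, sort_keys=True).encode()
    return {**body, "signature": key.sign(message).hex()}

def verify_chain(chain, public_keys):
    prev_sig = None
    for record in chain:
        body = {k: record[k] for k in ("signer", "payload", "prev_signature")}
        message = json.dumps(body, sort_keys=True).encode()
        try:
            public_keys[record["signer"]].verify(bytes.fromhex(record["signature"]), message)
        except Exception:
            return False              # signature does not check out
        if record["prev_signature"] != prev_sig:
            return False              # chain was reordered or a link was replaced
        prev_sig = record["signature"]
    return True

# Usage: the origin signs the data, an intermediary counter-signs as it forwards it.
origin_key, broker_key = Ed25519PrivateKey.generate(), Ed25519PrivateKey.generate()
keys = {"origin": origin_key.public_key(), "broker": broker_key.public_key()}
r1 = sign_step(None, {"reading": 21.5, "unit": "C"}, origin_key, "origin")
r2 = sign_step(r1, {"forwarded": True}, broker_key, "broker")
print(verify_chain([r1, r2], keys))   # True; any tampering breaks the chain
```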
 
Why you need to connect your hiring to data, outcomes pronto

Mike Fitzsimmons, Cofounder & CEO of Crosschq, has started four companies and his biggest headache across industries was the same: Making good hires.

Speaking on DisrupTV Episode 331, Fitzsimmons said:

"This is not my first rodeo in starting a tech company, but it is my first one in the HR tech space," said Fitzsimmons. "I started the company with my co-founder out of pure frustration on just how damn hard it is to hire people. You can't hide from the math."

And the math: "45% of the hires at companies never get ROI positive for the companies that made the hire," said Fitzsimmons. "It's insane. It's terrible for talent. Terrible for companies. And it's terrible for everybody."

Crosschq's mission is to link the hiring process to outcomes. HR is one of the few corporate functions not linked to outcomes. The Crosschq platform is aimed at increasing the quality of hire, boosting recruiter efficiency and improving hiring intelligence.

Fitzsimmons said that since hiring decisions haven't been tied to outcomes, enterprises never get smarter. "We have failed our talent acquisition leaders because we have given them KPIs and goals to put butts in seats quickly," he explained. "We haven't created a machine that enables us to make sure we're putting the right person in the right place every single time."

Indeed, Crosschq, founded in 2018, has struck a nerve. It has more than 400 customers and counts GGV Capital, Bessemer Venture Partners, Slack and SAP among its investors. The company also has integrations with Workday, SAP SuccessFactors, Teamable, Greenhouse, SmartRecruiters, iCIMS and Jobvite, to name a few.

Related: Connecting Experiences From Employees to Customers | 7 future of work themes to know now | Coursera: Generative AI will lead to reskilling, upskilling boom | The Lost Art of Being a Supervisor

To improve the hiring process, you need data from every step of the hiring process including:

  • Everything known when a hire was made.
  • How long did the person last?
  • Performance.
  • Impact on culture.
  • Engagement.
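To make that concrete, here is a minimal, hypothetical sketch of a hire-outcome record rolled up into a quality-of-hire score; the fields and weights are illustrative assumptions, not Crosschq's actual model.

```python
# Hypothetical sketch: combine post-hire signals into a quality-of-hire score so sourcing
# channels and interview practices can be compared against outcomes. Weights are illustrative.
from dataclasses import dataclass

@dataclass
class HireOutcome:
    source: str             # e.g. agency, referral, job board
    interview_score: float  # 0..1, what was known when the hire was made
    tenure_months: int      # how long the person lasted
    performance: float      # 0..1 rating from performance reviews
    engagement: float       # 0..1 from engagement surveys

def quality_of_hire(h: HireOutcome) -> float:
    retention = min(h.tenure_months / 24, 1.0)   # cap retention credit at two years
    return round(0.4 * h.performance + 0.3 * retention + 0.3 * h.engagement, 2)

hires = [
    HireOutcome("agency", 0.9, 10, 0.55, 0.5),
    HireOutcome("referral", 0.7, 30, 0.85, 0.8),
]
for h in hires:
    print(h.source, quality_of_hire(h))   # compare outcomes by sourcing channel
```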

"Breaking all that down has been historically difficult because there's a big wall between talent acquisition and the rest of the organization," said Fitzsimmons. "You have core HCM and then all this stuff scattered around. It's an integration nightmare."

The challenge is to close that talent hiring gap in software and processes. Fitzsimmons said the importance of processes can't be overstated. For instance, two years ago companies were hiring at a rapid clip and now they're cutting back.

"It's all about data and driving impact. The magic opens up once you start to connect these dots and realize you weren't doing it right all along," he said. HR has been different because it hasn't been performance based. "You can't just spend $20 million on Indeed and not have an idea what that led to in terms of the impact on the company."

The cultural change is creating the connection between ROI and hiring talent. An ROI mindset to hiring talent yields some interesting items, according to Fitzsimmons. Consider the following tips that are sprinkled around Crosschq's blog and reports on hiring talent and quality of hire:

  • Understand the ROI of the places where you source talent. For instance, agencies are the most volatile and it's where companies spend the most money. Recruiting agencies will send B-level talent that interviews well because they know they can place them again in 18 months.
  • Companies that rely on internal referrals often get lower quality hires. If you remove the financial incentives for internal referrals, the quality of hire goes up.
  • There's only a 9% correlation between an interview score and quality of hire. The move is to understand which interviewers are not good predictors of success in the org.
  • Only three of the eight standard third party assessments were correlated with success.
  • Pay attention to LinkedIn embellishment if not outright fraud. There's a correlation between hires that aren't totally truthful and success.
  • Think in terms of progress instead of perfection. It's a journey and connecting the data flows between hiring and outcome is a start. From there, you can build a foundation to improve program optimization, skills and competency and talent selection.

Confidential but in the limelight

One of the most consequential fields in digital technology saw its first major public event recently, with the inaugural Confidential Computing Summit held in San Francisco on June 29. I was not able to attend but I have been following closely the emergence of this vital industry and the Confidential Computing Consortium (CCC). Here I offer some observations and reflections on what should become foundational to the digital economy.

In a companion piece to follow, I will go into more detail on the history of hardware-based security industry initiatives, and the reasons why Confidential Computing is critical in ways that go well beyond confidentiality.

Acknowledgement and Declaration: I was helped in preparing this article by Manu Fontaine, founder of new CCC member Hushmesh, who attended the summit. I am a strategic adviser to Hushmesh.

The Confidential Computing mission

Confidential Computing is essentially about embedding encryption and physical security throughout computing for better data protection and integrity of information processing.

The Confidential Computing Consortium (CCC) is a relatively new association comprising hardware vendors, cloud providers and software developers aiming to “accelerate the adoption of Trusted Execution Environment (TEE) technologies and standards”.

Confidential Computing protects data in use by performing computation in a hardware-based, attested Trusted Execution Environment. These secure and isolated environments prevent unauthorized access or modification of applications and data while in use, thereby increasing the security assurances for organizations that manage sensitive and regulated data. Reference: CCC.

So Confidential Computing crucially goes beyond conventional encryption of data at rest and in transit, to protect data in use.

If you are at all aware of Confidential Computing, you might have the impression that it’s all about secure cloud and data clean rooms. These are important applications for sure but there’s so much more, as the CC Summit proved.

The #CCSummit

About 250 people attended the one-day #CCSummit at the San Francisco Marriott Marquis. I am told the atmosphere was intense! Sponsorships and attendance were both double the organisers’ expectations.

I was impressed by the breadth of the agenda and the speakers’ perspectives.

  • As with any tech conference at the moment, there was lots of AI. And rightly so, as the provenance of machine learning is one of the hottest topics in tech today and the potential for CC to improve accountability for digital artefacts is obvious.
  • Yet privacy was, by design, the bigger concern for the event, as it is a prime driver for Confidential Computing. It was good to see so many facets of privacy being fleshed out, not just the confidentiality concerns of CC.
  • Intel Chief Privacy Officer Xochitl Monteon provided a valuable privacy tutorial within her keynote Confidential Computing as a Cornerstone for Cybersecurity Strategies and Compliance, stressing how legislated data privacy now protects over 70% of the world’s population. Monteon argued for protecting data through its entire lifecycle in a CC ecosystem, because otherwise businesses are being crushed by formal data flow impact assessments. Contrary to popular belief, privacy regimes do not ban data flows — they restrain them.
  • Localisation of data processing to particular jurisdictions is a recurring issue in data protection. Location is another one of those signals which we increasingly rely on in data processing, and with its deep hardware connections, CC is going to be beneficial here. Nelly Porter, Google’s Head of Product for Computing and Encryption, was eloquent on the merits of digital sovereignty for emerging economies.
  • Academic and entrepreneur Raluca Ada Popa from UC Berkeley advocated for “Privacy-preserving Generative AI” using CC to protect queries with end-to-end encryption, and further, to protect commercially sensitive machine learning models by running them in secure enclaves.
  • Rolfe Schmidt from Signal Messenger described innovative use of attested TEEs to execute end-to-end encryption on behalf of end users, in cases where the ideal of keeping all sensitive data on the user’s device is not practical.
  • And there was plenty of discussion of Confidential Computing’s safe place, data clean rooms.

Privacy and data control

To appreciate the full potential for Confidential Computing in privacy and data protection, let’s think beyond confidentiality. Privacy is more to do with controlling personal data flows than confidentiality.

The Confidential Computing summit has helped to set the scene for a richer approach to privacy enhancing technologies (PETs). As Associate Professor Raluca Ada Popa explained in her keynote, CC takes PETs well beyond Differential Privacy (which compromises data quality) and Homomorphic Encryption (which protects data in use for many applications but with major performance trade-offs).

At Constellation Research we have always taken a broad view of digital safety, beyond data privacy and cybersecurity. What draws me to Confidential Computing is the possibility of safeguarding entire data supply chains, protecting the properties that make data valuable: clear permissions, authorisations, originality, demonstrated regulatory compliance, peer review and so on. Confidential Computing can provide the story behind the data.

 

Why Chegg is using Scale AI to develop proprietary LLMs

Chegg is betting that a partnership with Scale AI can provide a new student experience over the next two semesters and develop proprietary large language models (LLMs) that can create personalized study tools. The goal: Develop generative AI tools that leverage Chegg's differentiated data and get to market fast.

The partnership was announced as Chegg reported second quarter earnings. The two companies have been piloting the new AI experience for students.

Generative AI has been a key topic for education technology providers. In the first quarter, Chegg shares took a beating over generative AI concerns, but the company did launch its CheggMate generative AI service and a partnership with OpenAI. Generative AI is being built into the education technology stack, with some efforts available in the fall.

Chegg's new experience will start rolling out this fall. Chegg CEO Dan Rosensweig said that the Scale AI partnership is accelerating the company's generative AI deployment. He said:

"The new Chegg will combine the best of generative AI, with Chegg's proprietary high-quality solutions and demonstrated ability to improve student outcomes. They can expect to see a much simpler conversational user interface, personalized learning pathways, more in-depth content and the ability to transform it automatically into innovative study tools such as practice tests, study guides and flash cards."

In addition, Chegg is building its own LLMs with training data provided by its proprietary data sets and more than 150,000 subject matter experts, said Rosensweig. Chegg has a learning taxonomy and a history of data from schools, classes and professors.

Andrew Brown, Chegg CFO, said the company’s decision to develop its own LLMs revolves around differentiating its service and creating "a truly differentiated and better experience with students at a lower cost." Brown added that completely relying on third parties for generative AI technology would have been too expensive.

For Rosensweig, the role of proprietary LLMs is to improve accuracy and engagement. Rosensweig, who noted that Chegg will still use ChatGPT, said:

"One of the really cool things that we'll be able to do differently than anybody else would be able to do is take the 100 million-plus questions that we have and all the data we've been able to collect and create completely personalized learning experiences on a per user basis based on knowing not only the history of that particular student, but others that have gone to that school, that class and with that professor. So that is not something that any generalist AI can do or frankly, anybody else in the education space could do because we have the largest direct-to-consumer list."

The plan for Chegg and Scale AI is to deploy a rolling launch to cover all 26 categories.

Chegg reported second quarter earnings of $24.6 million on revenue of $182.9 million, down 6% from a year ago. Non-GAAP earnings were 28 cents a share, a penny a share lower than estimates.

The company ended the quarter with 4.8 million subscribers, down 9% from a year ago. For the seasonally slow third quarter, Chegg projected revenue in the range of $151 million to $153 million.

 

 

Nvidia fleshes out generative AI vision from PC, workstation to cloud

Nvidia at Siggraph outlined an AI vision where developers will create, test and optimize generative AI models and large language models (LLMs) on a PC and workstation and then scale them via data centers or the cloud.

Not surprisingly, this vision includes a heavy dose of Nvidia GPUs. PC makers already highlighted that systems were on deck for generative AI training and workloads.

The two headliners during CEO Jensen Huang's keynote were Nvidia RTX workstations as well as Nvidia AI Workbench. Nvidia AI Workbench is a toolkit to enable developers to create, test and customize models on a PC or workstation and then move them to deploy in data centers, public clouds or Nvidia DGX Cloud.

AI Workbench includes a simplified interface with models housed at Hugging Face, GitHub and Nvidia NGC that can be combined with custom data and shared. AI Workbench will be included in systems from Dell Technologies, Hewlett Packard Enterprise, HP Inc., Lambda, Lenovo and Supermicro.

To go along with AI Workbench, Nvidia launched Nvidia AI Enterprise 4.0, its enterprise software platform for production deployments. AI Enterprise 4.0 includes Nvidia NeMo, Triton Management Service, Base Command Manager Essentials as well as integration with public cloud marketplaces from Google Cloud, Microsoft Azure and Oracle Cloud.

As for the Nvidia RTX workstations, the systems will include Nvidia's RTX 6000 Ada Generation GPUs along with AI Enterprise and Omniverse Enterprise software. Configurations will include up to four RTX 6000 Ada Generation GPUs, each with 48GB of memory, delivering up to 5,828 TFLOPS of AI performance and 192GB of total GPU memory. These systems will be announced by OEMs in the fall.

Among other Nvidia items from Siggraph:

  • The company announced Nvidia OVX servers with the new Nvidia L40S GPU, which is designed for AI training and inference, 3D designs, visualization and video processing. The Nvidia L40S will be available starting in the fall. ASUS, Dell Technologies, GIGABYTE, HPE, Lenovo, QCT and Supermicro will offer OVX systems with L40S GPUs.
  • Nvidia launched a new release of Nvidia Omniverse for developers and enterprises using 3D tools and applications. Omniverse uses the OpenUSD framework and adds generative AI features. Additions include modular app building, new templates, better efficiency and native RTX spatial integration. Nvidia also launched new Omniverse Cloud APIs.
  • The company also rolled out frameworks, resources and services to speed up adoption of OpenUSD (Universal Scene Description). OpenUSD is a 3D framework that connects software tools, data types and APIs for building virtual worlds. The APIs include ChatUSD, an LLM copilot that lets developers ask questions and generate code; RunUSD, which translates OpenUSD files to create rendered images; DeepSearch, an LLM for semantic search through untagged assets; and USD-GDN Publisher, which publishes OpenUSD experiences to Omniverse Cloud in a click.

Constellation Research’s take

Constellation Research analyst Andy Thurai said:

“Nvidia NeMo is an end-to-end framework for building foundational models that can be a pain to build. Nvidia AI Workbench will allow for the cloning of AI projects and give developers a workspace to build LLMs.

The new service and the interface will allow users to train or retrain models from Hugging Face in the Nvidia DGX Cloud, whether on a public cloud platform such as GCP or Azure or on an Nvidia private cloud. Notably missing was AWS.

Nvidia claims the compute power deployed today is built for older technologies and workloads. According to Nvidia, modern workloads must be run on newer chips such as Nvidia GPUs and Grace Hopper superchips. To that end, the Dual GH200 (a combination of the Grace CPU and Hopper GPU into a Grace Hopper) is one of the most powerful processors ever. With that combination, Nvidia aims to deliver lower capital expense for the processing power required (lower capex) and lower operational costs through reduced energy consumption and much faster inference, thereby reducing opex. If this dream were to come true, Nvidia could kill the mighty Intel's x86 business by demonstrating that Grace Hopper can process AI technologies, particularly AI training workloads, with 20x less power and 12x less cost than comparable CPU-based processing technologies.

In short, Nvidia has claimed that AI workloads, both training and inferencing, must be run on Nvidia-based chips to be more efficient than rivals Intel and AI chip companies like SambaNova Systems."

RingCentral launches RingCX, names new CEO

RingCentral launched RingCX, a native contact center platform currently in beta, in a strategy shift that minimized partner NICE. RingCX is a native platform that will combine RingCentral's unified communications tools with contact center capabilities as well as generative AI.

In a move that's aimed at moving customers across touchpoints and expanding its market, RingCentral is launching RingCX even as it has a partnership with NICE. RingCentral has RingCentral MVP and RingCentral Contact Center, which is powered by NICE.

Vlad Shmunis, CEO of RingCentral, said the company will continue to invest in the NICE partnership, but needed a more native approach. "In listening to our customers, we’ve recognized an additional need for a native intelligent contact center solution that would be better suited towards addressing simpler use cases," he said.

RingCX will launch with more than 1,000 features including integrated communications and messaging across customer service use cases. Features include:

  • Skills-based routing.
  • Dashboards with analytics, real-time data and pre-built reports.
  • Integration with Salesforce and Zendesk at launch, with HubSpot, Microsoft Dynamics and ServiceNow coming soon.
  • Virtual agents powered by Google Dialogflow.
  • Real-time AI transcription and post call summaries.
  • AI-driven assistance, quality management and conversation analytics.

The launch of RingCX comes amid a busy news day for RingCentral.

  • RingCentral named Tarek Robbiati, former CFO at Hewlett Packard Enterprise, as CEO succeeding Shmunis effective Aug. 28.
  • RingCentral reported second quarter revenue of $539 million, up 11% from a year ago with a net loss of 23 cents a share. Non-GAAP earnings were 83 cents a share.
  • For the third quarter, RingCentral said revenue will be between $552 million and $556 million, up 8% to 9%, with non-GAAP earnings of 75 cents to 78 cents a share.
  • The company projected 2023 revenue between $2.19 billion and $2.2 billion, up 10% to 11%, with non-GAAP earnings of $3.11 to $3.25 a share.
  • Last week, RingCentral acquired assets from Kopin.

Constellation Research's take

Here's what Constellation Research analyst Liz Miller had to say about recent RingCentral developments:

"So it’s official…now EVERYONE is a contact center player. It always feels like RingCentral is at the center of a good number of CCaaS rumors…at least once a quarter someone unleashes a rumor that they are going to buy/takeover/merge with 8x8. But, in the wake of every rumor someone reminds the crowd that RingCentral’s partnership with NICE has been rock solid. That is until now. While RingCentral has made it clear that their partnership with NICE is still both strategic and important to their vision and growth…it is also clear that RingCentral customers want more options and integrations to create a single pane of collaboration across UCaaS and CCaaS tools and communications.

All of these moves to a “more unified, unified-communications strategy” point to a trend I’ve been tracking…this convergence in communications isn’t just about wiring, clouds and where calls start…but is a bigger shift to consolidate around collaboration be it between employees and teams or brands and their customers.

Now add conferences and events (Hopin), video and webinars, voice calls, chats, bots, contact centers, collaboration….make no mistake…this is all about collaboration in every channel with every constituent from employees, to partners, to customers."

 
