
Anthropic adds more collaboration features to Claude for Pro, Team customers

Anthropic's secret sauce could be collaboration rather than simply churning out the latest and greatest large language model (LLM).

The vision developing at Anthropic is interesting because LLMs are going to need something more than chat, content generation and personality to drive revenue growth and profits.

Shortly after the launch of Claude 3.5 Sonnet, a high-performing LLM, Anthropic said Claude.ai Pro and Team users can organize chats into projects to bring activity together in one place for collaboration. The feature is called Projects.

In a blog post, Anthropic noted that Claude's collaboration tools will make it part of the mix to generate ideas, provide results with context and be more strategic. With Claude's collaboration features, Pro and Team customers can add documents, code and ideas in Claude 3.5 Sonnet's 200K context window.

Key items about Claude's collaboration features being added:

  • Claude can be grounded with internal documents, transcripts, codebases and past work. This grounding gives Claude the background it needs to hit the ground running on projects.
  • Customers can give custom instructions for each Project based on tone, industry and roles.
  • Artifacts will appear on the side as previously launched. Artifacts are handy for coding and live previews.
  • Claude Team users can share information on project activity feeds, so teammates are continuously updated.
  • Anthropic said it will continue to add collaboration features and integrations with various applications and tools.
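
For developers who want Projects-style grounding outside the Claude.ai interface, the same idea can be approximated by folding project documents and custom instructions into the system prompt of an ordinary Messages API request. The sketch below only assembles the request payload (no network call, no API key); the function and document names are illustrative, and Projects itself is a Claude.ai feature rather than an API parameter:

```python
# Hypothetical sketch, not Anthropic's Projects API: Projects is a Claude.ai
# feature, but the grounding idea can be approximated by folding project
# documents and custom instructions into the system prompt of a Messages API
# request. Names like build_project_request and q2-notes.md are illustrative.

def build_project_request(instructions: str, documents: dict[str, str],
                          question: str,
                          model: str = "claude-3-5-sonnet-20240620") -> dict:
    """Assemble a Messages API-style payload that carries project knowledge."""
    # Wrap each document so the model can tell sources apart.
    doc_sections = "\n\n".join(
        f'<document name="{name}">\n{text}\n</document>'
        for name, text in documents.items()
    )
    system = f"{instructions}\n\nProject knowledge:\n{doc_sections}"
    return {
        "model": model,
        "max_tokens": 1024,
        "system": system,  # shared grounding for every chat in the "project"
        "messages": [{"role": "user", "content": question}],
    }

payload = build_project_request(
    instructions="You are a retail analyst. Keep answers concise.",
    documents={"q2-notes.md": "Sales rose 8% in Q2 on strong online demand."},
    question="Summarize last quarter.",
)
```

The resulting dict roughly matches the keyword arguments the Anthropic Python SDK's `client.messages.create()` expects; only the prompt assembly is shown here, since that is where the "project" lives.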

My take: Anthropic is on to something with its LLM-plus-collaboration spin. Yes, these collaboration features exist everywhere, but for enterprise and business users deciding which generative AI services to subscribe to, Anthropic's approach can win converts vs. OpenAI and others. Anthropic has another advantage: it can build LLMs and collaboration together natively, while existing collaboration apps will have to take a more bolt-on approach.


Cyberattack Cripples Car Dealerships: A Wake-Up Call for Post-Breach Resilience

More than 15,000 car dealerships across North America are facing frustrated customers and lost sales as a major software provider, CDK Global, is grappling with a cyberattack that has crippled their systems for days. Dealerships have resorted to using pen and paper to create sales contracts, and are unable to register vehicles with government agencies like the DMV. This has led to frustrated customers and a backlog at government offices. The breach has also resulted in lost sales for dealerships, with vehicles sitting idle on lots. CDK's brand has suffered irreparable damage, and the financial losses are significant.

While security breaches are inevitable, it's crucial for organizations to prioritize post-breach resilience. This requires a different approach than proactive security measures. Post-breach resilience is about rapidly isolating systems to contain risk, restoring them quickly to ensure business continuity, and communicating effectively with internal and external stakeholders. It requires a programmatic approach to security, with contingency planning, routine drills, and equal focus on prevention and mitigation.

During my conversations with CISOs and CIOs, I always advocate for a comprehensive cybersecurity approach that combines proactive info security and post-breach resilience, considering systems, tools, and people perspectives. The CDK Global incident serves as a stark reminder: in today's digital landscape, it's not just about preventing breaches—it's about being prepared to respond swiftly and effectively when they occur.


Oracle says TikTok ban could hit cloud revenue

Oracle said its cloud revenue could take a hit if TikTok is banned in the US. The company disclosed the risk factor in its annual report filed with the SEC.

In April, President Biden signed a law that would make it illegal to provide cloud services to TikTok unless its parent ByteDance could separate its operations from the Chinese government. The bill demands that ByteDance sell TikTok in nine months, or one year if an extension is approved.

TikTok runs on Oracle cloud services under an effort called Project Texas to keep US user data secure.

Oracle said that if it can't provide those services to TikTok its "revenues and profits would be adversely impacted." However, Oracle also is seeing strong demand for Oracle Cloud Infrastructure (OCI) and it's possible that capacity could be redeployed. OCI revenue for fiscal 2024 was $6.9 billion.

Oracle also noted that TikTok's compliance with US laws may increase its expenses. Constellation Research analyst Holger Mueller said:

"Losing TikTok workloads will dent Oracle Cloud utilization, but should hamper the overall growth of Oracle Cloud. Being the only cloud with available Nvidia capacity will keep Oracle Cloud revenue growing - with TikTok revenue - or not. Also given that we are in an election year and TikTok's popularity, it is unlikely neither the old or any new administration will ban TikTok completely in 2024."

While the TikTok note will garner attention, there were multiple other tidbits that stood out in Oracle's annual report. Here's a look.

  • As of May 31, Oracle owned 29% of Ampere, which makes Arm-based processors for cloud workloads including AI inference. Convertible debt investments in Ampere mature in June 2026 and convert to equity. Oracle invested $600 million in convertible debt issued by Ampere in fiscal 2024 and has options to buy more equity from co-investors through January 2027. The upshot: Oracle could gain control of Ampere if options are exercised by the company or co-investors.
  • Oracle said it has managed through supply chain shortages in part by "committing to higher purchases and balances of hardware products that we market and sell to our customers and that we use as a part of our cloud infrastructure to deliver our cloud offerings, relative to our historical positions." However, that move to secure manufacturing capacity increases inventory and obsolescence risks.
  • Multi-cloud partnerships were cited as a risk factor. Oracle said: "Use of our competitors’ technologies can influence a customer’s purchasing decision or create an environment that makes it less efficient to utilize or migrate to Oracle products and services. For example, we offer our customers multicloud services whereby our customers can combine cloud services from multiple clouds with the goal of optimizing cost, functionality and performance. OCI’s multicloud services work with a number of our competitors’ products, including Microsoft Azure, Amazon Web Services and Google Cloud Platform. This multicloud strategy could lead our customers to migrate away from our cloud offerings to our competitors’ products or limit their purchases of additional Oracle products, either of which could adversely affect our revenues and profitability." Obviously, multicloud partnerships are expected to drive more revenue than risk.
  • Oracle spent $8.9 billion on research and development in fiscal 2024, up from $8.6 billion in 2023 and $7.2 billion in 2022. Oracle ended fiscal 2024 with 47,000 employees in R&D.
  • Cloud services accounted for 37% of Oracle's applications revenue in fiscal 2024, up from 32% in 2023.
  • Oracle ended the fiscal year with 159,000 full-time employees with an average tenure of eight years. About 58,000 of those employees were in the US.

 


Nvidia's growth is going to continue for at least the next 18-24 months | CNBC Interview

R "Ray" Wang, Constellation Research founder, chairman, and principal analyst, joins 'Squawk Box' to discuss Nvidia's stock performance, why he has a $200 price target on the stock, and more.

On ConstellationTV: https://www.youtube.com/embed/jYU9PB26FkA?si=9CO4RQZkTC8oO2jo

Shopify unveils Target partnership, new AI features

Shopify outlined a partnership with Target that could scale distribution for its premier merchants as well as AI-enhanced features across its unified commerce platform.

The partnership with Target highlights how Shopify is increasingly becoming an enterprise commerce platform. In recent quarters, the company has noted that it is increasingly moving upstream.

Under the Target deal, curated Shopify merchants will expand Target Plus, the company's third-party marketplace. True Classic and Caden Lane were cited as Shopify merchants that would be available on Target Plus.

In addition, Target will be the first mass retailer to bring select Shopify merchants' products into physical stores. Harley Finkelstein, president of Shopify, said the Target deal will help its high-growth merchants expand.

The Target deal is part of Shopify's strategy to scale into larger retail accounts, B2B and other commerce markets. Finkelstein said:

"The past years show that we can cater to both start-ups and large companies. And we continue to invest in both to expand our merchant base. Our business model focuses on accelerating the success of our merchants and driving long-term value rather than short-term gains. We are a product-led company, and we will invest in those products and strategies that ultimately offer greater value for our merchants and thereby for Shopify."

Shopify eyes offline, B2B markets for commerce growth | Constellation ShortList™ Campaign to Commerce: All-In-One-Commerce Clouds

To that end, Shopify also launched its Summer '24 Edition, which includes more than 150 updates. Among the key updates:

  • Shopify Markets has been revamped as a central command center that can customize buyer experiences for selling internationally, B2B expansion and tailoring in-person offers via Shopify POS (point-of-sale). With the revamp, Markets will combine views for multiple stores and workflows.
  • AI commerce tools, including image editing in the Shopify mobile app and editing tools across Online Store Editor and Email Editor. Shopify also outlined how its Magic feature suggests recommendations across product categories.
  • Sidekick, an AI commerce assistant that provides context and guidance for products, orders and customers.
  • One-tap digital receipts for in-person shopping and automatic detection of whether a product can be returned.
  • A unified analytics experience across stores and categories.

14 takeaways from genAI initiatives midway through 2024

Generative AI projects in the enterprise have moved beyond the pilot stage with many use cases going into production. Scaling has been a bit of a challenge, but the maturation of how CxOs are approaching genAI is underway.

There's no census of generative AI projects in the enterprise, but directionally you can follow the trend. Datadog in a recent report found that GPU instances (a good proxy for genAI) now account for 14% of cloud computing costs, up from 10% a year ago. Datadog's report was based on AWS instances, so that percentage may be higher once Google Cloud, Microsoft Azure and, increasingly, Oracle Cloud are considered.

In recent weeks, I've made the rounds and heard from various enterprise customers talking about genAI. Here's a look at what's happening at the midway point of 2024 with genAI projects and what'll hopefully be a few best practices to ponder.

The big decisions are being made now by the business with technology close behind as a consideration. Use cases will proliferate, but they'll be scrutinized based on cost savings and revenue growth.

"It's not understanding AI. It's understanding how it works," said Jamie Dimon, CEO of JPMorgan Chase. By the end of the year, Dimon estimated that JPMorgan Chase will have about 800 AI use cases across the company as management teams become better at deploying AI. He added:

"We use it for prospect, marketing, offers, travel, notetaking, idea generation, hedging, equity hedging, and the equity trading floors, anticipating when people call in what they're calling it for, answering customer, just on the wholesale side, but answering customer requests. And then we have – and we're going to be building agents that not just answers the question, it takes action sometimes. And this is just going to blow people's mind. It will affect every job, every application, every database and it will make people highly more efficient."

Build a library of use cases. Vikram Nafde, CIO of Webster Bank, said at a recent AWS analyst meeting that it makes sense to build a library of genAI use cases. This library is effectively a playbook that can scale use cases across an enterprise. He said:

"Almost everyone has 10 use cases to try. I have hundreds of use cases. I've created this library of use cases. How do I prioritize? How do I engage my inner business? How do I engage more? Which are the ones that are worth an experiment? There are other costs and not only in terms of money or resource, but like people process and so forth."

Nafde said genAI use cases are viewed through a broader lens. Sometimes, a genAI use case is just a phase of a broader project. Use cases also must play well with multiple datasets and AI technologies and are vetted by an enterprise-wide AI council.

Related: Target launches Store Companion, genAI app for employees

GenAI is a tool to solve problems, nothing more. One change in the last year is that generative AI is increasingly seen as a tool for solving business problems rather than a magical technology. Unum CTO Gautam Roy said at a recent AWS analyst event:

"We don't think about AI or genAI as something separate. We are going to use it to solve problems. The first question we ask is what problem are we solving? Not everything is changed by AI. Sometimes it's a process change. Maybe it's an education or training issue. It may not be a technology issue. We ask what the problem is we are solving and then use innovation and technology to solve it."

This post first appeared in the Constellation Insight newsletter, which features bespoke content weekly and is brought to you by Hitachi Vantara.

Data platforms are converging as data, AI, business intelligence converge. Databricks is working to show it has the data warehousing chops to complement its AI and data platform. Snowflake is leveraging its data warehouse prowess to get into AI.

Speaking at Databricks Summit, Brian Ames, senior manager of production AI and data products at General Motors, said the company has stood up its data factory and plans to layer in generative AI capabilities in the next year. "GM has a ton of data. That's not the problem. We had a beautiful on prem infrastructure. Why change? Well, two reasons. Number one was data efficiency. More importantly, the world changed. And GM understood that if we didn't have AI and ML in our arsenal, we could find ourselves at a competitive disadvantage," he said.

Smaller enterprises will be using AI everywhere via platforms. Dimon noted that smaller banks will be using AI too through AWS, Fiserv and FIS. You can assume the same thing for other smaller enterprises across verticals.

Generative AI projects require a lot of human labor that's often overlooked, said Lori Walters, Vice President, Claims and Operations Data Science at The Hartford. "We spend a lot of time talking about the cost to build, about the training costs and the inference cost. But what we're seeing is the human capital associated with genAI is significant. Do not underestimate it," she said.

AI is just part of the management team now. Multiple enterprises have created centralized roles to oversee AI. Often, the executive in charge of AI is also in charge of data and analytics. Again, JPMorgan Chase's approach to AI is instructive. There's a central organization, but each business in the banking giant has AI initiatives.

Daniel Pinto, Chief Operating Officer and President at JPMorgan Chase, said the company has moved to transform its data so it's usable for AI and analytics. There will also be a central platform to leverage that data across business units. "AI and, particularly large language models, will be transformational here," said Pinto.

Compute is moving beyond just the cloud. Judging from the results from hardware vendors, on-premise AI optimized servers are selling well. Enterprises are becoming much more sophisticated about how they leverage various compute instances to optimize price/performance. This trend has created some interesting partnerships, notably Oracle and Google Cloud.

Optionality is the word. Along with the on-premise, private cloud and cloud options, enterprises will mix multiple models and processors. In a recent briefing, AWS Vice President of Product Matt Wood said: "Optionality is disproportionately important for generative AI. This is true today, because there's so much change in so much flux."

GenAI projects will have to be self-funded. At a recent Cognizant analyst meeting, there were a bevy of customers talking about transformation, technical debt and returns. Cognizant CEO Ravi Kumar said the technology services firm's customers are preparing for generative AI, but need to do work in quantifying productivity gains to justify costs. Kumar's take was echoed repeatedly by the firm's customers. "Discretionary spending in tech over the last 25 years has happened in the low interest rate regime where there was no cost of capital. Today you need a business use case for new projects," said Kumar.

I don't doubt that business case argument for genAI. Enterprise software companies have been talking about the delay in genAI profit euphoria in their most recent quarters. No enterprise is going to pay multiple vendors copilot taxes unless there are returns attached.

The transformation journey to the cloud isn't done and may need to be accelerated to reap genAI rewards. Pinto said JPMorgan Chase is revamping its technology stack.

"It's been a big, long journey, and it's a journey of modernizing our technology stacks from the layers that interact with our clients to all the deeper layers for processing. And we have made quite a lot of progress in both by moving some applications to the cloud, by moving some applications to the new data centers, and creating a tool for our developers that is a better experience. And we are making progress. The productivity of this organization today is by far higher than it was several years ago, and still a long way to go. We have optimized the infrastructure that we use, the cost per unit of processing and storage."

LLMOps is emerging and will converge with MLOps in the future. Hien Luu, Senior Engineering Manager at DoorDash and responsible for building out scalable genAI at the company, gave a talk at Databricks' Summit on LLMOps. Luu said LLMOps is becoming critical due to costs because working with LLMs and GPUs isn't cheap. He expects that MLOps and LLMOps platforms will converge.

Focus on long-lasting use cases and business value instead of infrastructure. Luu's big advice is: "Things are going to evolve rapidly so keep that in mind. Identify your goals and needs based on your company's specific environments and use cases. For now, focus less on infrastructure and more on long-lasting value components."

GenAI can speed up your analytics and data insights. Volume, surfacing insights, inflexibility and time are all analytics challenges, said Danielle Heymann, Senior Data Scientist at the National Institutes of Health. Speaking at Databricks Summit, Heymann said genAI is being explored to streamline data handling, uncover patterns, adapt and evolve, accelerate processing time and handle quality assurance functions. The NIH's National Institute of Child Health and Human Development is using genAI to process grant applications and streamline review processes, and to classify data and conduct QA with a bot.


GenAI projects may be sucked up into the transformation, digital core vortex

Generative AI projects are gaining steam in the enterprise, but there's a big hurry up and wait vibe to them. Why? Enterprises operate on a continuum and don't have their ERP, cloud and data transformations complete.

This genAI project progression was outlined by Accenture CEO Julie Sweet on the company's third-quarter earnings call. Accenture is seeing genAI momentum and has $2 billion in generative AI bookings over the last nine months, but customers without a strong "digital core" are still on the tarmac.

Sweet said:

"It is important to remember that while there is a near universal recognition now of the importance of AI, which is at the heart of reinvention, the ability to use genAI at scale varies widely with clients on a continuum.

With those which have strong digital cores genuinely seeking to move more quickly, while most clients are coming to the realization of the investments needed to truly implement AI across the enterprise, starting with a strong digital core from migrating applications and data to the cloud, building a new cognitive layer, implementing modern ERP and applications across the enterprise to a strong security layer."

That take isn't news, but does make me wonder if genAI projects may not really hit production and scale for a few years. After all, SAP trumpeted that Bain moved completely to S/4HANA Public Cloud in a project that took four years.

Sweet continued:

"Nearly all clients are finding it difficult to scale genAI projects because the AI technology is a small part of what is needed. To reinvent using technology, data, and AI, you must also change your processes and ways of working, rescale and upscale your people, and build new capabilities around responsible AI, all with a deep understanding of industry, function, and technology to unlock the value. And many clients need to first find more efficiencies to enable scaled investment in their digital cores and all these capabilities, particularly in data foundations.

In short, genAI is acting as a catalyst for companies to more aggressively go after cost, build the digital core, and truly change the ways they work."

Sweet cited a bevy of customers including Macy's, which is migrating mainframes to the cloud; Central Bank of the United Arab Emirates, which is modernizing its enterprise data management; and Independence Health Group, which is moving to a digital first platform.

Now companies that have already made those transformational moves are set up for genAI. "Once clients have a strong foundation, they can explore new opportunities to drive growth and efficiencies with genAI," said Sweet.

Here's the catch: There are a lot more enterprises that need to do the hard work before genAI projects can scale than ones with strong digital cores.

Overall, these prerequisites for genAI impact mean more work for Accenture, but they also explain why enterprise software vendors are seeing little uptick from genAI.


Anthropic launches Claude 3.5 Sonnet, Artifacts as a way to collaborate

Anthropic launched Claude 3.5 Sonnet, its latest large language model (LLM), with availability on Anthropic API, Amazon Bedrock and Google Cloud Vertex AI.

According to Anthropic, Claude 3.5 Sonnet outperforms OpenAI's GPT-4o on multiple metrics with improved price/performance ratios.

Claude 3.5 Sonnet will cost $3 per million input tokens and $15 per million output tokens with a 200K token context window. Anthropic said it will also be updating Claude Opus. Here's a look at the benchmarks.
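
To give a rough sense of what that pricing means per request, here's a back-of-envelope cost sketch at the published list prices (real bills will vary with usage patterns):

```python
# Back-of-envelope request costs at Claude 3.5 Sonnet's published list prices:
# $3 per million input tokens, $15 per million output tokens.

INPUT_USD_PER_M = 3.00    # USD per 1M input tokens
OUTPUT_USD_PER_M = 15.00  # USD per 1M output tokens

def sonnet_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost of a single request at list price."""
    return (input_tokens / 1_000_000) * INPUT_USD_PER_M \
         + (output_tokens / 1_000_000) * OUTPUT_USD_PER_M

# Filling the full 200K context window and getting a 1K-token answer:
# 0.2M * $3 + 0.001M * $15 = $0.60 + $0.015 = about $0.615
full_window = sonnet_cost(200_000, 1_000)
```

The asymmetry matters for the collaboration pitch: stuffing the 200K window with project documents is input-heavy, so grounding a chat costs far less than generating long outputs at the same token count.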

In addition, Anthropic launched Artifacts on Claude.ai. Artifacts is a feature that creates a workspace on the side so a user can collaborate more with the model. Anthropic said:

"This preview feature marks Claude’s evolution from a conversational AI to a collaborative work environment. It’s just the beginning of a broader vision for Claude.ai, which will soon expand to support team collaboration. In the near future, teams—and eventually entire organizations—will be able to securely centralize their knowledge, documents, and ongoing work in one shared space, with Claude serving as an on-demand teammate."

Anthropic's Artifacts preview is a spin on a future of work where AI-based teammates work side-by-side with humans.


Target launches Store Companion, genAI app for employees

Target said it will launch Store Companion, a generative AI chatbot designed to help employees boost customer experiences, across its 2,000 stores by August.

The retailer didn't reveal the vendors involved with Store Companion other than to say it designed the chatbot itself. Target was a reference customer at Google Cloud Next. Store Companion was trained on frequently asked questions and process documents from store teams.

Store Companion is built to answer questions workers get on the job, offer coaching and support management. Target said it will continue to test and launch more genAI applications throughout 2024.

In a statement, Target CIO Brett Craig said, "genAI is helping us accelerate the rate of innovation across our operations."

Walmart, Target highlight intersection of supply chain, customer experience 

Store Companion will be available as an app on employees' handheld devices to answer questions about processes and procedures. Employees can ask about anything from credit card applications to restarting cash registers to procedures during a power outage. Target added that Store Companion will also enable seasonal workers to ramp up quickly.

Other details about Store Companion include:

  • Target is piloting Store Companion in about 400 stores.
  • The rollout took six months.
  • The model was tweaked based on employee feedback and experiences.
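
Target hasn't published Store Companion's architecture, so as a purely hypothetical sketch, here's the retrieval-then-answer pattern an FAQ-grounded assistant like this typically follows, with naive keyword overlap standing in for real embedding-based search (all document names and text are made up for illustration):

```python
import re

# Purely hypothetical sketch, not Target's implementation: an FAQ-grounded
# assistant first retrieves the procedure document closest to an employee's
# question, then hands that text to an LLM as context. Naive keyword overlap
# stands in for embedding-based search here.

def retrieve_procedure(question: str, docs: dict[str, str]) -> str:
    """Return the name of the doc sharing the most words with the question."""
    q_words = set(re.findall(r"[a-z]+", question.lower()))

    def overlap(item: tuple[str, str]) -> int:
        _, text = item
        return len(q_words & set(re.findall(r"[a-z]+", text.lower())))

    return max(docs.items(), key=overlap)[0]

procedures = {  # illustrative stand-ins for store process documents
    "register-restart": "how to restart a cash register after an error",
    "power-outage": "store procedures during a power outage",
    "credit-card": "walking a guest through a credit card application",
}

best = retrieve_procedure("How do I restart the cash register?", procedures)
# → "register-restart"; its text would then be passed to the model as context
```

Grounding answers in retrieved store documents, rather than asking the model to answer from its training data alone, is what keeps responses tied to a retailer's actual procedures.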

Target said on its first quarter earnings call that it was using generative AI to drive digital experiences and enable personalization.

Speaking on the earnings call, Christina Hennington, Chief Growth Officer at Target, outlined the genAI efforts, which include guided search, product display enhancements and personalization. She said:

"Our team's work to remodel our digital platforms is paying multiple dividends. Newly developed generative AI and personalization capabilities are expanding scope and reach of what we can offer our guests in terms of product recommendations, search results and more.

We recently engaged in a pilot with one of our biggest vendors to test our latest personalization capabilities with guests shopping our personal care categories. We're very encouraged by early test results, which showed a nearly three times lift in conversion rates from personalized promotions versus mass offers, including higher sales lift across the rest of the category as well.

In addition to driving more personalization, we're also focused on growing relevance, particularly where there may be opportunities in our current online assortment."


Safe Superintelligence Inc. launches: Here's what it means

Three well-known generative AI pioneers have formed Safe Superintelligence Inc., a startup that will focus on safe superintelligence (SSI).

In a post, former OpenAI leaders Ilya Sutskever and Daniel Levy and Daniel Gross, a former Y Combinator partner, announced the company's role and mission. Sutskever was OpenAI's chief scientist and Levy was an OpenAI engineer.

Here's the Safe Superintelligence Inc. mission in a nutshell. The three founders wrote:

"SSI is our mission, our name, and our entire product roadmap, because it is our sole focus. Our team, investors, and business model are all aligned to achieve SSI.

We approach safety and capabilities in tandem, as technical problems to be solved through revolutionary engineering and scientific breakthroughs. We plan to advance capabilities as fast as possible while making sure our safety always remains ahead.

This way, we can scale in peace.

Our singular focus means no distraction by management overhead or product cycles, and our business model means safety, security, and progress are all insulated from short-term commercial pressures."

Constellation Research analyst Chirag Mehta broke down what the Safe Superintelligence launch means and the open questions.

  • What does safe superintelligence mean exactly? "We at least know what AGI means, but no one can meaningfully describe what 'Safe Superintelligence' actually means," said Mehta.
  • The company may actually distract from safe AI. "This launch might likely have the opposite effect--a distraction from focusing on making AI systems safe today, before we cross the AGI or superintelligence Rubicon," said Mehta.
  • Researchers unite! "This effort will likely attract many researchers and technologists who have been passionate about advancing the domain but are frustrated with limitations and changing strategies of current AI companies," said Mehta.
  • Future direction of Safe Superintelligence. "It is unclear in which direction the company goes. I will be surprised if they themselves are clear about their milestones," said Mehta. "It would be worth watching who they hire, who they raise money from, and who they might work with as their design partners. That would reveal more details beyond a lofty mission statement."
  • Lines are drawn. "This will likely drive a deeper wedge into the OpenAI-Sam Altman and Stability AI networks as many of them considered this to be the original mission of OpenAI. As M.G. cleverly put it, 'I'm reminded of Coca-Cola Classic. Safe Superintelligence sounds a lot like OpenAI Original,'" said Mehta.
  • Now hiring for the movement. "The larger enterprise software community will largely ignore this launch, but for serious AI aficionados it would be a dream to be part of a movement in Palo Alto or Tel Aviv—two magnificent cities that have largely defined the next generation landscape and are on their way to define the next one," said Mehta.