Amazon and Anthropic have forged a strategic collaboration to advance generative AI. The move adds another component to the generative AI arms race among technology and cloud giants Microsoft, Google and Amazon.
Central to Amazon’s plan is Amazon Bedrock, the fully managed service AWS first announced in April. Amazon Bedrock provides secure access, via an API, to the industry’s top foundation models, including those from Amazon and from leading AI startups.
The Anthropic relationship adds a long-term commitment from Anthropic to provide AWS customers with access to future generations of its foundation models via Amazon Bedrock. In addition, Anthropic will provide AWS customers with early access to unique features for model customization and fine-tuning capabilities.
Anthropic’s model is known as Claude, and the company announced in August that Claude 2 would be available on Amazon Bedrock. When Amazon Bedrock launched in April, Claude 1.3 and Claude Instant were among the first foundation models available to AWS customers, Anthropic said.
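To make the API access concrete, here is a minimal, hypothetical sketch of calling Claude 2 through Amazon Bedrock with the boto3 SDK. The model ID and request format follow Anthropic's documented conventions on Bedrock at the time of writing, but verify them against current AWS documentation before relying on them; the live call also requires AWS credentials and Bedrock model access, so it is shown commented out.

```python
# Hypothetical sketch: invoking Claude 2 on Amazon Bedrock via boto3.
import json


def build_claude_request(user_prompt: str, max_tokens: int = 300) -> str:
    """Build the JSON body Bedrock expects for Anthropic's Claude models."""
    return json.dumps({
        "prompt": f"\n\nHuman: {user_prompt}\n\nAssistant:",
        "max_tokens_to_sample": max_tokens,
    })


# The actual call requires AWS credentials and Bedrock model access:
# import boto3
# client = boto3.client("bedrock-runtime")
# response = client.invoke_model(
#     modelId="anthropic.claude-v2",
#     body=build_claude_request("Summarize the Amazon/Anthropic deal."),
# )
# print(json.loads(response["body"].read())["completion"])
```

The separation between building the request body and sending it reflects how Bedrock works in practice: each model family on Bedrock defines its own JSON schema, while `invoke_model` itself is uniform across models.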
In addition to collaboration on the AWS Amazon Bedrock managed service, there are several other components to the Amazon/Anthropic deal. They include the following:
- Amazon will invest up to $4 billion in Anthropic and have a minority ownership position in the company.
- Amazon developers and engineers will be able to build with Anthropic models via Amazon Bedrock, incorporating generative AI capabilities into their work, enhancing existing applications, and creating net-new customer experiences across Amazon businesses.
- AWS will become Anthropic’s primary cloud provider for mission-critical workloads, including safety research and future foundation model development. Anthropic plans to run the majority of its workloads on AWS, the companies said.
- Anthropic will use AWS Trainium and Inferentia chips to build, train and deploy its future foundation models. The companies will also collaborate in the development of future Trainium and Inferentia technology.
A Common Use Case for Commercial AI: Chatbots
One common use case for generative AI large language models is chatbots. Amazon helped pioneer this technology with its Amazon Lex chatbot, a pre-trained bot that commercial users can use as a building block to quickly create their own bots for internal or external use. Amazon explains how Amazon Lex and other components can be used together in this blog post.
Large language models and other AI represent a huge leap forward from the simple, rules-based question-and-answer chatbots that many organizations stood up during the pandemic.
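To illustrate the gap, here is a hypothetical sketch of the kind of rules-based chatbot described above: exact keyword matching against canned answers, with no grasp of paraphrase or context. This is an invented example for illustration, not any specific product.

```python
# Illustrative sketch of a simple rules-based chatbot: it matches keywords
# against canned answers and falls back when nothing matches.
RULES = {
    "hours": "We are open 9am-5pm, Monday through Friday.",
    "refund": "Refunds are processed within 5-7 business days.",
    "shipping": "Standard shipping takes 3-5 business days.",
}

FALLBACK = "Sorry, I don't understand. Please call support."


def rules_bot(message: str) -> str:
    """Return the first canned answer whose keyword appears in the message."""
    text = message.lower()
    for keyword, answer in RULES.items():
        if keyword in text:
            return answer
    return FALLBACK
```

Ask this bot "Can I get my money back?" and it hits the fallback, because the word "refund" never appears. An LLM-backed bot handles exactly that kind of paraphrase, which is why the newer tools are such a leap.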
How Managed Service Providers are Using AI Today
A handful of companies have introduced AI-driven tools for MSP service desks in recent months. Octavia by Nine Minds is the most recent of several AI and automation service desk tools purpose-built for MSPs that ChannelE2E has covered since the beginning of 2023; others include Pia, out of Australia, and NeoAgent. ConnectWise and other MSP platform companies have also been working on AI-powered augmentations to their platforms this year. ConnectWise, in particular, has been using AI to create PowerShell scripts, and in April added AI-powered scripting to its Asio platform.
In addition, plenty of tech giants have made AI part of their roadmaps for 2023 and beyond, most recently VMware. Analyst firm Canalys also recently provided a perspective on the opportunities in the channel for generative AI.
ChannelE2E will continue to track AI and generative AI developments and opportunities for MSPs, MSSPs and the rest of the IT channel. Have you heard of any others? Let me know. You can reach me at [email protected]. I’m also interested if your MSP is using any of these or other AI tools yet. How are MSPs using ChatGPT? Send me an email.