Amazon launches generative AI play with AWS Bedrock

2023-04-17 20:15:01

Amazon is the latest hyperscaler to take on the world of foundation AI including generative and large language models. It has launched a new platform called AWS Bedrock that includes access to in-house tools such as the Titan family of foundation models, and pre-trained models from start-ups like AI21 Labs, Anthropic and Stability AI. The company says the focus is on providing a range of models for use in “enterprise-scale” AI tools. One expert said Amazon has “a long way to go” to catch up with other players in the field.

Amazon says Bedrock will be for enterprise-scale AI tools and allow access to multiple models within a project.

Opening AWS up as a marketplace for multiple AI models mirrors Google’s move to offer third-party models in Google Cloud alongside its own PaLM, including models from Midjourney and AI21 Labs. Microsoft has gone “all in” with OpenAI through its Azure cloud, offering GPT-4, ChatGPT and other models to customers.

Amazon says it will allow companies to train chatbots and AI tools on their own proprietary data without having to invest in costly data centres and expensive AI chips. AWS will use a combination of its own custom AI chips and those from Nvidia. “We’re able to land hundreds of thousands of these chips, as we need them,” explained Dave Brown, VP of Elastic Compute Cloud at AWS.

The launch of Bedrock has been in the works for the past few months, with AWS signing partnership agreements with Stability AI and other start-ups, as well as investing more in generative AI apps and its underlying technology. Hugging Face has also worked to bring its library of text-generating models onto AWS and Amazon has launched an AI accelerator for startups.

AWS is the largest hyperscaler in the world but is facing increasing competition from Google Cloud, Microsoft Azure and others, largely off the back of their AI offerings. Both companies have invested heavily in generative AI tools, including chatbots such as ChatGPT and Google Bard.

Amazon hasn’t unveiled pricing for its AI offerings yet and full details aren’t clear, but users will be able to tap into the various foundation models via an API. Bedrock is aimed at “enterprise-scale” apps rather than individual tools.
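As a rough illustration of what tapping a hosted foundation model through an API typically looks like, the Python sketch below uses boto3; the client name, model identifier and request fields are assumptions made for illustration rather than details confirmed by Amazon.

```python
import json

import boto3

# Hypothetical sketch: call a Bedrock-hosted foundation model through a
# runtime API. The model ID and request/response fields are assumptions.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.invoke_model(
    modelId="amazon.titan-text-express-v1",  # placeholder model identifier
    contentType="application/json",
    accept="application/json",
    body=json.dumps({
        "inputText": "Write a two-sentence product description for a reusable water bottle.",
        "textGenerationConfig": {"maxTokenCount": 200, "temperature": 0.5},
    }),
)

result = json.loads(response["body"].read())
print(result["results"][0]["outputText"])
```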

Multiple AI models

AI21 Labs’ Jurassic-2 family of foundation models is particularly suited to generating multilingual text, while Anthropic’s Claude is good for text processing and conversational tools. Stability AI brings text-to-image tools to Bedrock, including Stable Diffusion, which can be used for images, art, logos and graphic design. The most recent version of Stable Diffusion has improved text accuracy and clarity. Using Bedrock, developers will be able to create tools that combine models.

Amazon’s own Titan family includes a text model and an embeddings model. The text model handles generation tasks such as writing a blog post or a sales pitch, while the embeddings model translates text into numerical representations that capture its semantic meaning.
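As a hedged sketch of what an embeddings model is used for, the snippet below maps two pieces of text to vectors and compares them with cosine similarity; the model identifier and request format are assumptions for illustration, not details from Amazon.

```python
import json

import boto3

# Hypothetical sketch: compare the semantic similarity of two texts using an
# embeddings model. The client name and model ID are assumptions.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def embed(text: str) -> list[float]:
    """Return a numerical vector representing the meaning of the text."""
    response = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v1",  # placeholder model identifier
        contentType="application/json",
        accept="application/json",
        body=json.dumps({"inputText": text}),
    )
    return json.loads(response["body"].read())["embedding"]

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

# Texts with similar meanings should score closer to 1 than unrelated texts.
print(cosine_similarity(embed("cheap flights to Paris"),
                        embed("low-cost airfare to France")))
```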

Any of the models can then be further trained on labelled datasets stored in S3, Amazon’s cloud storage service. According to Amazon, as few as 20 well-labelled examples are needed to adapt a model to a company’s proprietary information, and none of that data will be used to train the underlying models.
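A minimal sketch of how such a small labelled dataset might be staged in S3 is shown below; the record format, bucket name and object key are illustrative assumptions rather than a documented Bedrock workflow.

```python
import json

import boto3

# Hypothetical sketch: upload a small labelled dataset to S3 for customising a
# foundation model. Field names and the bucket/key are assumptions.
examples = [
    {"prompt": "Summarise this support ticket: ...",
     "completion": "Customer cannot reset their password."},
    {"prompt": "Summarise this support ticket: ...",
     "completion": "Refund requested for a duplicate charge."},
    # ... roughly 20 well-labelled examples in total, per Amazon's claim
]

body = "\n".join(json.dumps(record) for record in examples)

s3 = boto3.client("s3")
s3.put_object(
    Bucket="example-company-training-data",      # placeholder bucket
    Key="bedrock/customisation/examples.jsonl",  # placeholder key
    Body=body.encode("utf-8"),
)
```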


“At Amazon, we believe AI and ML are among the most transformational technologies of our time, capable of tackling some of humanity’s most challenging problems. That is why, for the last 25 years, Amazon has invested heavily in the development of AI and ML, infusing these capabilities into every business unit,” the company said in a statement.

In the same statement, Amazon highlighted the use of its own chips to bring down the cost of running generative AI workloads, explaining that ultra-large models require massive compute power to run in production, and that AWS Inferentia chips can make inference more efficient and reduce costs at enterprise scale.

AWS Bedrock has ‘a lot of catching up to do’

The company is also opening up its answer to Microsoft’s GitHub Copilot, a tool widely used by developers to help write code. Amazon is making CodeWhisperer, an AI-powered coding companion that offers code suggestions based on previously written code or comments, available for free to individual developers. There are no usage limits for the free version, but a paid tier for professional use also includes enterprise security and admin capabilities.
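As an illustration of the comment-driven workflow such a coding companion supports, a developer might write a comment describing intent and accept a suggested completion along these lines; the function below is invented for illustration and is not actual CodeWhisperer output.

```python
from datetime import date

# Developer-written comment describing intent:
# Return the day of the week for an ISO 8601 date string.

# A completion a coding assistant might suggest (hypothetical example):
def day_of_week(date_string: str) -> str:
    return date.fromisoformat(date_string).strftime("%A")

print(day_of_week("2023-04-17"))  # -> "Monday"
```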

Daniel Stodolsky, former Google Cloud VP and current SVP of Cloud at SambaNova, said the old cloud argument of bringing compute to your data doesn’t stack up in the new world of generative AI. “Whereas other cloud services such as predictive analytics rely on huge volumes of real-time data, Amazon says the process of customising its pre-trained LLM can be completed with as few as 20 labelled data examples,” he said.

“The trend for generative AI will be towards open best-of-breed approaches rather than vendor lock-in and closed models. It’s much better to own a large language model that’s built and fine-tuned for your use-case rather than relying on an off-the-shelf model with minimal customisation.

“The other consideration is getting value from generative AI quickly. Amazon’s Bedrock service is only in limited preview right now and anyone looking at AWS Service Terms will find that Service Level Agreements don’t apply – in other words, it’s not production ready and won’t be for some time. Generative AI is a race that’s already well underway, and Amazon clearly has a lot of catching up to do with other production-ready platforms.”

Homepage image by Dennis Diatel/Shutterstock

Read more: Google AI: PaLM model open to devs and added to Workspace

Topics in this article: AI, AWS, Cloud
