Amazon updates enterprise AI solutions with guardrails at AWS Summit

Radhika Rajkumar/ZDNET

Amazon held its annual AWS (Amazon Web Services) Summit on Wednesday at the Javits Center in New York City. The event focused on the cloud computing giant’s latest work in generative artificial intelligence (AI) and included partner exhibits, workshops, talks and a keynote address.

In all of its announcements, AWS emphasized boosting developer productivity and making scalable generative AI available to more organizations. The company also stressed safety and responsibility and highlighted its partnership with Anthropic.

In his keynote address at the Summit, Dr. Matt Wood, Vice President of AI Products at AWS, pointed out that regulated industry and public sector customers are growing fastest on AWS because their compliance efforts provide a solid foundation for generative AI. They are well prepared to meet security requirements at the start of their AI build. Wood reiterated that security is built into AWS’ AI applications “from day one” and that it is the company’s highest priority.

Here are the biggest highlights from AWS Summit 2024.

AWS App Studio

Radhika Rajkumar/ZDNET

One of the most notable announcements of the summit was App Studio, which is currently in public preview. The new AI-powered platform lets technical professionals build rich apps from natural language descriptions and prompts. Users specify what the app should do and which data sources it should pull from, and App Studio creates an app in minutes that “would have taken a professional developer several days to create,” the press release states.

In a demo at AWS Summit, Amazon showed ZDNET how App Studio can take a request for an invoice tracking app, for example, and offer suggestions on how it should work. Once the user approves the outline and App Studio creates the app, the user can edit it with easy-to-use drag-and-drop features before deploying it.

App Studio also integrates with third-party services and AWS through connectors. Adam Seligman, vice president of developer experience at AWS, told ZDNET at the summit that the company expects App Studio to evolve based on customer feedback and add more integrations.

Amazon Q Updates

Radhika Rajkumar/ZDNET

Amazon announced several updates to Q, the company’s enterprise AI assistant, with an emphasis on support for developers. Amazon Q Developer, which became generally available in April, is now also available in SageMaker, Amazon’s developer environment for machine learning (ML). Previously, Q was only available in the AWS console and in developer environments such as IntelliJ IDEA, Visual Studio, and VS Code. Amazon Q provides product guidance, generates code, and can help developers troubleshoot problems.

During his keynote, Wood called the integration a “step change in usability” for companies to accelerate their ML workloads.

“In SageMaker Studio, data scientists and ML engineers get all of Q’s existing capabilities such as code generation, debugging, and consulting, as well as specialized support for tasks such as data preparation, model training, and model deployment,” explained Swami Sivasubramanian, VP of AI & Data, in a statement.

Developers can ask Q how to optimize their LLM, and Q will return a set of instructions, complete with sample code. Q can also advise users on which approach to take when preparing data, based on use case, code or no-code preference, and data format.

Another piece of news was that Amazon Q Apps, a feature of Amazon Q Business, is now generally available. The feature lets employees build apps on top of their company data by sending Q a descriptive, natural language prompt. Employees can also turn a conversation with the assistant into a reusable app for tasks like “summarizing feedback, creating onboarding plans, writing copy, drafting memos, and more,” Sivasubramanian continued in the press release.

The release follows the broader trend of rolling out AI assistants across industries and skill levels to take routine work off employees’ plates.

Updates for Amazon Bedrock

The company also announced updates to Bedrock, its enterprise platform for building and scaling generative AI applications. Bedrock offers a wide range of models for every use case, allowing companies to work with one or more models as needed.

As of Wednesday, users can fine-tune Anthropic’s Claude 3 Haiku model in Bedrock. Fine-tuning lets organizations adapt models to their own needs and use cases, making customization easier. The feature is currently in preview and marks the first time Claude 3 models have been available for fine-tuning.
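For readers curious what this looks like in practice, a fine-tuning job in Bedrock is typically started through the model-customization API. The sketch below is a minimal illustration using boto3; the job name, S3 URIs, IAM role, and hyperparameters are placeholders, not values from the announcement.

```python
import boto3

# Control-plane client for Amazon Bedrock (region is an assumption)
bedrock = boto3.client("bedrock", region_name="us-west-2")

# Minimal sketch of a fine-tuning (model customization) job.
# The base-model identifier, S3 URIs, role ARN, and hyperparameters
# are illustrative placeholders.
response = bedrock.create_model_customization_job(
    jobName="haiku-finetune-demo",
    customModelName="claude-3-haiku-invoice-assistant",
    roleArn="arn:aws:iam::123456789012:role/BedrockCustomizationRole",
    baseModelIdentifier="anthropic.claude-3-haiku-20240307-v1:0",
    customizationType="FINE_TUNING",
    trainingDataConfig={"s3Uri": "s3://my-bucket/train.jsonl"},
    outputDataConfig={"s3Uri": "s3://my-bucket/output/"},
    hyperParameters={"epochCount": "2"},
)
print(response["jobArn"])  # track the job's progress with this ARN
```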

As we know, better data means better generative AI results. Amazon is adding new data sources to Knowledge Bases for Amazon Bedrock, including connectors for Confluence, SharePoint, and Salesforce, as well as custom web sources and improved accuracy for CSV and PDF data. This will enable companies to further customize their models with more business data. Knowledge Bases already connects to private sources like Amazon Aurora, MongoDB, Pinecone, and more.
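At runtime, applications typically query a knowledge base through the bedrock-agent-runtime client. The snippet below is a minimal sketch of that retrieval-augmented pattern; the knowledge base ID, model ARN, and query text are illustrative placeholders.

```python
import boto3

# Runtime client for Knowledge Bases for Amazon Bedrock (region is an assumption)
agent_runtime = boto3.client("bedrock-agent-runtime", region_name="us-west-2")

# Retrieval-augmented generation against an existing knowledge base.
# The knowledgeBaseId and modelArn are placeholders for illustration.
response = agent_runtime.retrieve_and_generate(
    input={"text": "Summarize our latest customer feedback from Salesforce."},
    retrieveAndGenerateConfiguration={
        "type": "KNOWLEDGE_BASE",
        "knowledgeBaseConfiguration": {
            "knowledgeBaseId": "KB1234567890",
            "modelArn": "arn:aws:bedrock:us-west-2::foundation-model/anthropic.claude-3-haiku-20240307-v1:0",
        },
    },
)
print(response["output"]["text"])  # grounded answer built from the connected sources
```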

New features for agents

Amazon announced two new features for agents in Amazon Bedrock: memory retention and code interpretation, both of which improve customization.

With memory retention, agents can now remember where a user’s last interaction left off; previously, they were limited to the information available within a single session. An agent booking a flight, for example, can now reference details from previous interactions, such as your last trip.

Wood noted in the keynote that AWS customers are interested in having agents perform complex analysis beyond simple automated tasks. To achieve this, AWS leveraged agents’ ability to write code; they can now generate and execute code in a sandbox environment. This enables agents to analyze data and create charts “to address complex data-driven use cases such as data analysis, data visualization, text processing, equation solving, and optimization problems,” Sivasubramanian said in the release. This feature could, for example, enable the analysis of real estate price data to facilitate investment decisions.

Code interpretation is limited to a sandbox environment to avoid potential chaos caused by agents building and executing unchecked code. Amazon also noted that users can upload documents directly, making it easier to instruct agents.
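As a rough sketch of how these two capabilities can be switched on when defining an agent, the example below uses boto3’s bedrock-agent client. The memory settings and the built-in code-interpreter action group signature reflect our understanding of the agent API, not details from the announcement, and every name and ARN is a placeholder.

```python
import boto3

# Control-plane client for Agents for Amazon Bedrock (region is an assumption)
bedrock_agent = boto3.client("bedrock-agent", region_name="us-west-2")

# Create an agent with session-summary memory enabled.
# Instruction text, role ARN, and retention period are placeholders.
agent = bedrock_agent.create_agent(
    agentName="travel-booking-agent",
    foundationModel="anthropic.claude-3-haiku-20240307-v1:0",
    agentResourceRoleArn="arn:aws:iam::123456789012:role/BedrockAgentRole",
    instruction="Help customers book flights and recall their previous trips.",
    memoryConfiguration={
        "enabledMemoryTypes": ["SESSION_SUMMARY"],  # assumed memory type name
        "storageDays": 30,
    },
)

# Enable the managed code-interpreter action group so the agent can
# generate and run code in a sandbox (signature value is an assumption).
bedrock_agent.create_agent_action_group(
    agentId=agent["agent"]["agentId"],
    agentVersion="DRAFT",
    actionGroupName="code-interpreter",
    parentActionGroupSignature="AMAZON.CodeInterpreter",
    actionGroupState="ENABLED",
)
```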

Guardrail updates

Radhika Rajkumar/ZDNET

To address customer concerns about hallucinations in generative AI, Amazon announced contextual grounding checks within Guardrails for Amazon Bedrock, the company’s existing set of generative AI safeguards for reducing harmful outputs. Contextual grounding checks detect and block hallucinations in model responses for customers using retrieval-augmented generation (RAG) and summarization. The feature also ensures that a model’s response can be traced back to the correct company source data and is relevant to the user’s original query.

Bedrock already offers filters for words, topics, harmful content, and personal data – this update builds on that blocking by addressing hallucinations themselves. AWS reports that Guardrails filters up to 75% of hallucinations. AWS also announced a progress report on its responsible AI initiatives, which include visual watermarking measures, policy guidelines, and training resources.

Guardrails is built into Bedrock, but to expand the reach of responsible AI, AWS also announced a standalone API version of the feature that will let customers apply guardrails to foundation models that are not hosted on Bedrock.
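That standalone capability is exposed through the bedrock-runtime client. The sketch below shows, under that assumption, roughly how output from any model, whether or not it is hosted on Bedrock, could be checked against an existing guardrail; the guardrail ID and version are placeholders, and the exact request shape should be confirmed against the API documentation.

```python
import boto3

# Runtime client that exposes the standalone guardrails API (region is an assumption)
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-west-2")

# Text produced by any model, including one hosted outside Bedrock (illustrative).
model_output = "Our refund policy allows returns within 90 days."

# Apply an existing guardrail to the model output.
# The guardrail identifier and version are placeholders.
result = bedrock_runtime.apply_guardrail(
    guardrailIdentifier="gr-example123",
    guardrailVersion="1",
    source="OUTPUT",  # evaluate a model response rather than user input
    content=[{"text": {"text": model_output}}],
)

# "GUARDRAIL_INTERVENED" means the guardrail blocked or rewrote the content.
print(result["action"])
```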

More AI announcements

As part of AI Ready, Amazon’s free cloud computing skills training initiative, the company also announced AWS SimuLearn, an interactive learning platform that “combines generative AI-powered simulations and hands-on training to help people learn how to turn business problems into technical solutions,” Sivasubramanian said in the release. Amazon also announced that it had surpassed its goal of training 29 million people worldwide by 2025, having already trained 31 million people as of July 2024.
