Azure Content Safety

Nov 15 2024 · Python 3.12, Microsoft Azure, JupyterLabs

Lesson 03: Image Moderation Using Azure Content Safety

Exploring Image Moderation in Content Safety Studio


In this segment, you’ll get hands-on experience with image moderation in Content Safety Studio. You’ll also learn how to customize the image moderation response in the studio.

Explore Image Moderation in Content Safety Studio

Start by opening Content Safety Studio in your browser. Then, click the Moderate image content card to open the image moderation page.

Testing a Single Image for Moderation

Using the Run tests on a single image feature, you can test whether a single image is safe or unsafe. In the “Select a sample or upload your own” section, choose any unsafe image you’d like to run through the moderation API. The selected image appears in the preview.
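Behind the Studio UI, a single-image test is a call to the Content Safety image-analysis REST endpoint. The sketch below shows one way to make that call with only the Python standard library; the endpoint URL, API version, key placeholder, and function names are assumptions for illustration, not values from this lesson.

```python
# Sketch: calling the Content Safety image:analyze REST endpoint.
# Assumptions: api-version "2023-10-01" and the request/response JSON
# shapes documented for the service; endpoint and key are placeholders.
import base64
import json
import urllib.request

API_VERSION = "2023-10-01"  # assumed GA API version

def build_request(image_bytes):
    """Build the JSON body: the image is sent as base64-encoded content."""
    return {"image": {"content": base64.b64encode(image_bytes).decode("ascii")}}

def parse_response(payload):
    """Flatten the categoriesAnalysis list into a {category: severity} dict."""
    return {item["category"]: item["severity"]
            for item in payload.get("categoriesAnalysis", [])}

def analyze_image(endpoint, key, image_bytes):
    """POST one image to the moderation endpoint and return its severities."""
    url = f"{endpoint}/contentsafety/image:analyze?api-version={API_VERSION}"
    req = urllib.request.Request(
        url,
        data=json.dumps(build_request(image_bytes)).encode("utf-8"),
        headers={
            "Ocp-Apim-Subscription-Key": key,  # your resource key
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return parse_response(json.load(resp))
```

With a real resource you'd call `analyze_image("https://<your-resource>.cognitiveservices.azure.com", "<your-key>", open("sample.jpg", "rb").read())` and get back severities per harm category (Hate, SelfHarm, Sexual, Violence), mirroring what the Studio preview displays.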

Running a Bulk Test for Moderation

Next, you’ll use the Run a bulk test feature, located next to the Run a simple test tab. This feature is particularly useful when you want to test the moderation API on multiple images in a single request. It lets you assess how the API performs across a variety of images, which is helpful when you’re dealing with a large dataset or want to compare the performance of different filters and configurations.
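A bulk test amounts to running each image through the same single-image analysis and sorting the results by a severity threshold. This is a minimal sketch of that idea; `bulk_moderate` and its `analyze` parameter are hypothetical names, where `analyze` stands in for whichever single-image call you use (Studio, SDK, or REST) and must return a `{category: severity}` dict.

```python
# Sketch: a bulk moderation pass over several images.
# `analyze` is any callable mapping an image path to {category: severity};
# an image is rejected if any category meets the severity threshold.
def bulk_moderate(paths, analyze, threshold=2):
    """Return (allowed, rejected) path lists, like Studio's bulk verdicts."""
    allowed, rejected = [], []
    for path in paths:
        severities = analyze(path)
        if any(sev >= threshold for sev in severities.values()):
            rejected.append(path)
        else:
            allowed.append(path)
    return allowed, rejected
```

Raising or lowering `threshold` is the programmatic analog of tightening or relaxing the filter configuration you compare in the Studio's bulk-test view.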
