Azure Content Safety

Nov 15 2024 · Python 3.12, Microsoft Azure, JupyterLabs

Lesson 02: Text Moderation With Azure Content Safety

Exploring Text Moderation in Content Safety Studio


In this segment, you’ll get hands-on experience with how text moderation works in the platform, and you’ll also learn how to add filters and blocklist terms to customize text moderation response.

Create an Azure Account

Before using the Azure Content Safety platform, you’ll need to create an Azure account if you don’t already have one. You can do that by heading to the Azure Account Creation page and clicking “Try Azure for Free” on the page that opens.

Create a Content Safety Resource

Upon reaching the Azure Dashboard, you’ll next create a Content Safety resource to get your key and endpoint.
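Once the resource is deployed, Azure gives you a key and an endpoint for it. This lesson works entirely in Content Safety Studio, but as a minimal sketch of how those values are typically used from code, assuming the azure-ai-contentsafety Python package and environment variable names you’ve chosen yourself, you could create a client like this:

```python
# A minimal sketch, assuming the azure-ai-contentsafety package is installed
# (pip install azure-ai-contentsafety) and that the key and endpoint from your
# Content Safety resource are stored in environment variables.
import os

from azure.ai.contentsafety import ContentSafetyClient
from azure.core.credentials import AzureKeyCredential

# These environment variable names are placeholders -- use whatever you configured.
key = os.environ["CONTENT_SAFETY_KEY"]
endpoint = os.environ["CONTENT_SAFETY_ENDPOINT"]

# Create a client; the later sketches in this lesson reuse it for text analysis calls.
client = ContentSafetyClient(endpoint, AzureKeyCredential(key))
```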

Allocate Permission to Content Safety Studio

Next, you’ll need to grant Content Safety Studio permission to make API requests to the resource you just created. You can do that by clicking Access control (IAM) in the left pane.

Attach Created Content Safety Resource to Studio

You’re now ready to use Content Safety Studio. Click the Overview button in the left pane, then click the Content Safety Studio link inside the Get Started tab. Content Safety Studio opens in a new browser tab.

Explore Text Moderation in Content Safety Studio

Let’s see text moderation in action. Click the Moderate text content card. This opens a page where you can run tests on text and see how the content moderation API performs.

Testing Single Text for Moderation

From the “Select a sample or type your own” section, select any text you’d like to run through the text moderation API. This copies the text into the text field below it.
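Under the hood, the Studio runs this check through the same text analysis API you can call yourself. As a minimal sketch, assuming the client created in the earlier snippet and a made-up sample string, a single-text analysis with the Python SDK looks roughly like this:

```python
from azure.ai.contentsafety.models import AnalyzeTextOptions

# Analyze one piece of text across the built-in harm categories
# (Hate, SelfHarm, Sexual, Violence). The sample text is just an illustration.
request = AnalyzeTextOptions(text="I want to hurt someone.")
response = client.analyze_text(request)

# Each entry reports a category and its severity score, mirroring the
# per-category results the Studio displays.
for item in response.categories_analysis:
    print(f"{item.category}: severity {item.severity}")
```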

Testing Bulk Text for Moderation

Next, you’ll look at moderating text in bulk. This lets you test the moderation API against an entire dataset and gauge its efficacy across many records, not just a single piece of text.
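The Studio handles the dataset upload for you; a programmatic equivalent would simply loop over your records and analyze each one. Here’s a minimal sketch, assuming the same client as before and a hypothetical bulk_dataset.csv file with a text column:

```python
import csv

from azure.ai.contentsafety.models import AnalyzeTextOptions

# Hypothetical CSV with one record per row in a "text" column.
with open("bulk_dataset.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        result = client.analyze_text(AnalyzeTextOptions(text=row["text"]))
        # Summarize each record by its highest severity across categories.
        worst = max((item.severity or 0) for item in result.categories_analysis)
        print(f"{row['text'][:40]!r} -> max severity {worst}")
```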
