# install the packages
%pip install azure-ai-contentsafety
%pip install python-dotenv
You might remember installing these packages in the last part! To quickly summarize them once again:
azure-ai-contentsafety: This will install the Azure AI Content Safety package that you’ll use to create moderation requests.
python-dotenv: You’ll use this package to load your resource endpoint and API-key values from the .env file.
Run the cell by pressing the Shift + Enter keys and wait for the installation to finish.
Next, import the packages that you’ll use in this notebook. Replace # TODO import packages with the following:
# 1
import os
# 2
from dotenv import load_dotenv
# 3
from azure.ai.contentsafety import ContentSafetyClient
from azure.ai.contentsafety.models import ImageCategory
from azure.core.credentials import AzureKeyCredential
from azure.core.exceptions import HttpResponseError
from azure.ai.contentsafety.models import AnalyzeImageOptions, ImageData
Here’s the explanation of the above code:
1. You imported the os module. You’ll use this module to read the environment variables that hold the content safety key and endpoint.
2. You imported the load_dotenv function from the dotenv package, which loads the key-value pairs in your .env file into the environment.
3. You imported the Content Safety classes you’ll need: ContentSafetyClient to call the service, AzureKeyCredential to authenticate, HttpResponseError to handle failures, and the ImageCategory, AnalyzeImageOptions, and ImageData models to build and interpret image-analysis requests.
Next, you’ll create a content safety client, which will be used to send API requests to Azure Content Safety resources. Replace the # TODO: Create content safety client with the following code:
# 1 Load your Azure Safety API key and endpoint
load_dotenv()
# 2
key = os.environ["CONTENT_SAFETY_KEY"]
endpoint = os.environ["CONTENT_SAFETY_ENDPOINT"]
# 3 Create a Content Safety client
client = ContentSafetyClient(endpoint, AzureKeyCredential(key))
In the above code:
1. You’re using the load_dotenv function to load the contents of your .env file into the environment variables of your application.
2. Next, you’re accessing the values of CONTENT_SAFETY_KEY and CONTENT_SAFETY_ENDPOINT from the environment variables using os.environ. These are assigned to the variables key and endpoint, which will be used to authenticate the API requests.
3. Finally, you’re creating a ContentSafetyClient using endpoint and key, and assigning it to the client variable.
Make sure to copy the .env file that you created in Lesson 1 into the project directory. Then, run the cell to create the content safety client.
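If you don’t have that file handy, a minimal .env file contains just these two entries. The values below are placeholders; use your own resource’s key and endpoint from the Azure portal:
CONTENT_SAFETY_KEY=<your-content-safety-key>
CONTENT_SAFETY_ENDPOINT=https://<your-resource-name>.cognitiveservices.azure.com/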
Creating Moderate Image Function
Next, create the moderate_image function. This will be used to send the image for analysis, and finally, the response will be processed to identify whether the image can be allowed or rejected for posting. Replace # TODO: Implement moderate image function with the following code:
# 1
def moderate_image(image_data):
    # 2 Construct a request
    request = AnalyzeImageOptions(image=ImageData(content=image_data))

    # 3 Analyze image
    try:
        response = client.analyze_image(request)
    except HttpResponseError as e:
        print("Analyze image failed.")
        if e.error:
            print(f"Error code: {e.error.code}")
            print(f"Error message: {e.error.message}")
            raise
        print(e)
        raise

    ## TODO: Process moderation response to determine if the image
    # is approved or rejected

    # 4 If content is appropriate
    return "Post successful"
Here’s what the above code does:
1. It creates the function moderate_image, which takes image_data as an argument and returns whether the shared image is approved or rejected for posting.
2. Then, it constructs the request using AnalyzeImageOptions. You’ve provided the base64-encoded image to ImageData, which is passed to AnalyzeImageOptions using the image argument. Also, by default, output_type is set to FourSeverityLevels, where you only get 0, 2, 4 or 6 as the severity level output.
3. Next, it sends the request with client.analyze_image inside a try block; if the call fails, the HttpResponseError details are printed and the exception is re-raised.
4. Finally, if the content is appropriate, you return the string "Post successful".
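If you’re wondering where image_data comes from, one common approach is to read the image file’s raw bytes, as in the short sketch below. The file name is just a placeholder, and the SDK takes care of base64-encoding the content when the request is serialized:
# Hypothetical example: read an image's raw bytes to use as image_data
with open("sample_image.jpg", "rb") as file:
    image_data = file.read()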
Next, it’s time to implement the logic to determine if the image is safe, and whether the request should be approved or rejected if it crosses the severity thresholds. Replace ## TODO: Process moderation response to determine if the image is approved or rejected with the following code:
# 1 Extract results
categories = {
    ImageCategory.HATE: None,
    ImageCategory.SELF_HARM: None,
    ImageCategory.SEXUAL: None,
    ImageCategory.VIOLENCE: None
}
# 2
for item in response.categories_analysis:
    if item.category in categories:
        categories[item.category] = item
# 3
hate_result = categories[ImageCategory.HATE]
self_harm_result = categories[ImageCategory.SELF_HARM]
sexual_result = categories[ImageCategory.SEXUAL]
violence_result = categories[ImageCategory.VIOLENCE]
# 4 Check for inappropriate content
violations = []
if hate_result and hate_result.severity > 2:
    violations.append("hate speech")
if self_harm_result and self_harm_result.severity > 3:
    violations.append("self-harm references")
if sexual_result and sexual_result.severity > 0:
    violations.append("sexual references")
if violence_result and violence_result.severity > 2:
    violations.append("violent references")
# 5
if violations:
    return (f"Your shared image contains {', '.join(violations)} that violate "
            "our community guidelines. Please modify your image to adhere to "
            "community guidelines.")
Take care to fix the indentation of the code by selecting the above code and pressing the Tab key if it’s not indented correctly with respect to the enclosing function.
Here’s the explanation for the code:
1. You created a dictionary of ImageCategory values, which will be iterated over to extract the moderation results for the respective categories.
2. You then extract each category’s result into a separate variable from the category dictionary.
3. This is the crucial part of the processing logic. Here, you’re determining if the image that was requested for moderation is found inappropriate for any of the following categories - hate, self-harm, violence, and sexual - based on the severity thresholds defined. If the output severity level is found to be more than the respective category’s threshold value, then you append the category name to the violations list.
4. Finally, if any violation exists in the violations list, you inform the user by returning details regarding the categories that were violated by the image and ask them to change the image so that it adheres to the community guidelines. If no violation is found, the user is notified that the post was successful.
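To see how the thresholds behave, here’s a small, self-contained sketch of the same rule using made-up severity values (recall that FourSeverityLevels only yields 0, 2, 4 or 6):
# Standalone sketch: same threshold rule, hypothetical severities
thresholds = {
    "hate speech": 2,
    "self-harm references": 3,
    "sexual references": 0,
    "violent references": 2,
}
severities = {
    "hate speech": 0,
    "self-harm references": 0,
    "sexual references": 0,
    "violent references": 4,  # made-up value for illustration
}
violations = [name for name, level in severities.items()
              if level > thresholds[name]]
print(violations)  # ['violent references'] because 4 > 2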
Bak, zuk rno jazk ye imzuhu jwo foyitel lehgneab az acgij-nnii.
Exploring Image Moderation Function
Here comes the fun part! You can now pass images to moderate_image to find out whether they’re approved or rejected.
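For example, a quick test might look like the following; the file name is a placeholder for any image you’d like to check:
# Hypothetical usage: moderate a local image file
with open("test_image.jpg", "rb") as file:
    result = moderate_image(file.read())

print(result)
# Prints "Post successful" if the image passes every severity check,
# otherwise the community-guidelines message listing the violations.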