A Vision Request is a class object that enables you to ask the Vision Framework to perform a particular analysis of an image. Many things in Swift and SwiftUI are structures, but when working with Vision, you'll encounter a lot more classes. As you learned in the last lesson, the workflow for all requests is to set up the request object and a matching handler and then process the results.
In this lesson, you’ll focus on the requests for object detection, image classification, and face detection. Recognizing text has some quirks, so it’s covered in the next lesson.
Choosing a Request Type
Apple provides a number of request types. All the request types inherit from the VNRequest class. VNRequest is an abstract class, meaning you never use it directly, but it's where the base initializer and some properties common to all request types are declared. Each of the request type subclasses lets your code ask the Vision Framework to process an image in a different way. Once you've decided what kind of questions you want to ask about an image, you need to see whether Apple has provided a matching request type or whether you'll need to find or make a CoreML model.
Doji’d u yopqq xarm uq alc zpo vatoexx end exsemfiziiv hzqar qop veb-kukax abx qoj-ranw pecoehgn wtauqem ls zequore quxjiob:
iOS 11
VNDetectBarcodesRequest and VNBarcodeObservation: Detects barcodes in an image.
VNDetectRectanglesRequest and VNRectangleObservation: Detects rectangular shapes such as business cards or book covers.
VNDetectFaceRectanglesRequest and VNFaceObservation: Detects faces in an image.
VNCoreMLRequest and VNCoreMLFeatureValueObservation: Uses a Core ML model to perform image analysis tasks.
VNDetectHorizonRequest and VNHorizonObservation: Detects the horizon in an image.
VNTrackObjectRequest and VNDetectedObjectObservation: Tracks a specified object across multiple frames.
VNTrackRectangleRequest and VNRectangleObservation: Tracks a specified rectangle across multiple frames.
iOS 13
VNGenerateImageFeaturePrintRequest and VNFeaturePrintObservation: Generates a compact representation of an image's visual content.
VNClassifyImageRequest and VNClassificationObservation: Classifies the overall content of an image.
VNGenerateAttentionBasedSaliencyImageRequest and VNSaliencyImageObservation: Generates a saliency map indicating areas of an image likely to draw human attention.
VNGenerateObjectnessBasedSaliencyImageRequest and VNSaliencyImageObservation: Generates a saliency map highlighting objects in an image.
VNRecognizeAnimalsRequest and VNRecognizedObjectObservation: Detects and recognizes animals in an image.
iOS 14
VNDetectContoursRequest and VNContoursObservation: Detects contours in an image.
VNDetectTrajectoriesRequest and VNTrajectoryObservation: Detects and tracks the movement trajectories of objects.
iOS 15
VNGeneratePersonSegmentationRequest and VNPixelBufferObservation: Generates a mask image for the people in an image.
As you can see, there are lots of options. The result observations for each request are matched to that request, so you always need to refer to Apple's documentation. Some request observations contain a CGRect, a String, or a CGPoint. Some provide pixel data or text in a class that's specific to the kind of data. It's important to use the right observation type for the request.
An eagle-eyed reader might notice that iOS 12 doesn't appear in the list. No new request types were added with that version, but a number of the existing request types were made more efficient.
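One practical consequence of this version-by-version rollout is that newer request types need availability gating when your app still supports older systems. Here's a minimal sketch using VNDetectContoursRequest, which requires iOS 14; the function name is just for illustration.
import Vision

func makeContourRequest() -> VNRequest? {
  // VNDetectContoursRequest is only available from iOS 14,
  // so gate it when your deployment target is earlier.
  if #available(iOS 14.0, *) {
    return VNDetectContoursRequest { request, error in
      guard let observations = request.results as? [VNContoursObservation] else {
        return
      }
      print("Found \(observations.count) contour observations")
    }
  }
  return nil
}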
Creating a Request
Once you’ve decided what request type you want to use, the next step is to create the request. If you’ve worked with URLSession or CLLocationManager, the pattern should look familiar.
import Vision
let request = VNDetectFaceRectanglesRequest { (request, error) in
  guard let observations = request.results as? [VNFaceObservation] else {
    print("No faces detected")
    return
  }
  // Process the observations
}
After importing the Vision Framework into your project, create the request and give it a completion handler. The completion handler is a closure that receives the finished request along with an optional error. Before it's executed, the request.results array is nil. If the request is successful, the .results array is populated with an array of the associated observation type. Some request types have customization settings and some don't. A request type might give access to different versions of its model or to how you perform tracking. Generally, the code creates the request and then sets any options after it's created rather than trying to set the options during request initialization. It makes for easier-to-read code.
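For example, one of those settings is the request's revision, which selects the model version Vision uses. A small sketch, assuming revision 2 of the face-rectangles model is available in your SDK:
let faceRequest = VNDetectFaceRectanglesRequest { request, error in
  // Handle results here.
}
// Set options after creating the request rather than during
// initialization. Here, pin the request to a specific model revision.
faceRequest.revision = VNDetectFaceRectanglesRequestRevision2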
The error object is a VNErrorCode type, so it has information specific to what might go wrong during request processing. These errors range from the image not loading, to requested hardware resources being unavailable, to requesting an option that doesn't exist for that request type. As long as you don't receive the dreaded catch-all .internalError, you should be able to troubleshoot things quickly.
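Inside the completion handler, you might inspect that error before touching the results. A sketch, assuming Swift's bridged VNError type; the cases shown are just two of the possible codes:
let request = VNDetectFaceRectanglesRequest { request, error in
  if let error = error as? VNError {
    switch error.code {
    case .internalError:
      print("Internal error: not much to go on")
    case .outOfMemory:
      print("Out of memory")
    default:
      print("Vision error: \(error.localizedDescription)")
    }
    return
  }
  // Safe to read request.results here.
}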
Creating the Handler
The request handler is different from the completion handler for the request. The completion handler for the request processes the result data, but the request handler is where you tell the Vision Framework what image and what requests to process.
Whereas the request type passes an error object into the completion handler, a handler type is a throwing type. Therefore, to catch errors, wrap it in a do...catch block when you execute it. Among the errors that the handler throws are incorrect image data passed in, mismatched request types for the handler type, and the ever-frustrating .internalError. Below is an example of handler creation.
let handler = VNImageRequestHandler(cgImage: image, options: [:])
do {
  try handler.perform([request])
} catch {
  print("Failed to perform request: \(error)")
}
The handler takes as its input the image to process as a CGImage and some options. An image request handler actually has a number of different initializers for different use cases. For instance, if you're working with video frames, that might be a CMSampleBuffer or a CVPixelBuffer, and there are initializers to accept those types. Also, you might have your image as raw Data, and there are ways to initialize with that directly. You might even be working with remote images and can pass in a URL that points at the image. Initializing a request handler with a CGImage or CIImage is the usual way to go, though.
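As a sketch, here are a few of those initializers side by side; the function and parameter names are placeholders so the snippet compiles on its own:
import CoreGraphics
import CoreVideo
import Foundation
import Vision

func makeHandlers(cgImage: CGImage, pixelBuffer: CVPixelBuffer,
                  imageData: Data, imageURL: URL) {
  // From a CGImage, as in the example above.
  let cgHandler = VNImageRequestHandler(cgImage: cgImage, options: [:])
  // From a CVPixelBuffer, common when processing video frames.
  let pixelHandler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
  // From raw image data, such as bytes you downloaded.
  let dataHandler = VNImageRequestHandler(data: imageData, options: [:])
  // From a URL that points at an image file.
  let urlHandler = VNImageRequestHandler(url: imageURL, options: [:])
  _ = (cgHandler, pixelHandler, dataHandler, urlHandler)
}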
Generally, you can pass in an empty dictionary for the options. Some cases where you might want to pass in options are when you've done some preprocessing on the image using a CIContext; you can pass in the context so the handler doesn't need to make a new one. You can also pass in some "camera intrinsics", which are things like focal length and the distance between the center of the camera lens and the center of the image. You'd find these when working with 3D models and augmented reality (AR) apps.
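A small sketch of passing a shared CIContext through the options dictionary; the function name is hypothetical:
import CoreImage
import Vision

func makeHandler(for cgImage: CGImage, sharing context: CIContext) -> VNImageRequestHandler {
  // Reuse an existing CIContext instead of letting the handler
  // create its own. VNImageOption.cameraIntrinsics works the same way.
  VNImageRequestHandler(
    cgImage: cgImage,
    options: [VNImageOption.ciContext: context]
  )
}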
You also might notice that the request is passed in as an array. A handler is linked to a single image. So, if you want to make multiple requests about that image, you can pass them all in at one time. The completion handlers of each request execute when the handler has processed that request.
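For example, you might run face detection and classification against the same image with a single perform call. A sketch, assuming image is a CGImage you already have:
let faceRequest = VNDetectFaceRectanglesRequest()
let classifyRequest = VNClassifyImageRequest()
let handler = VNImageRequestHandler(cgImage: image, options: [:])
do {
  // One handler, one image, multiple requests in a single call.
  try handler.perform([faceRequest, classifyRequest])
} catch {
  print("Failed to perform requests: \(error)")
}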
Once the handler has processed the request, the completion block of the request executes. The request.results property will always be an array of the proper observation type for the request type. Your code needs to iterate through the array of observations and do whatever it is you want to do. Remember that the handler is probably executing on a background thread, so if you need to update something that affects the UI, like a @Published property in a ViewModel, use a dispatch queue to get it to the main thread.
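A sketch of that hand-off, assuming a hypothetical ViewModel that publishes a face count:
import Combine
import Foundation
import Vision

final class FacesViewModel: ObservableObject {
  @Published var faceCount = 0

  func makeRequest() -> VNDetectFaceRectanglesRequest {
    VNDetectFaceRectanglesRequest { [weak self] request, _ in
      let observations = request.results as? [VNFaceObservation] ?? []
      // The handler likely ran on a background thread, so hop to
      // the main thread before updating anything the UI observes.
      DispatchQueue.main.async {
        self?.faceCount = observations.count
      }
    }
  }
}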
The type of observation determines what data you're going to interpret. Detection observations tend to provide a bounding box of the detected thing or the center point of an item. Classification observations return a string label of the classified object and a confidence score.
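In code, the difference looks something like this; a sketch that reuses the faceRequest and classifyRequest from the earlier example, after the handler has performed them:
// Detection: each face observation carries a normalized bounding box.
if let faces = faceRequest.results as? [VNFaceObservation] {
  for face in faces {
    print("Face at \(face.boundingBox)") // CGRect with 0...1 coordinates
  }
}
// Classification: each observation carries a label and a confidence.
if let labels = classifyRequest.results as? [VNClassificationObservation] {
  for label in labels where label.confidence > 0.8 {
    print("\(label.identifier): \(label.confidence)")
  }
}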
Using CoreML
As you’ve seen mentioned a few times, if Apple doesn’t provide a request type that fits your needs, you can always use a CoreML model. Working with a CoreML model requires only a few changes to your code. Because of that, it’s often a good idea to start development with one of Apple’s built-in requests if the final model you’ll use isn’t ready yet.
To use a CoreML model, drag it into your Xcode project just as you would with other video or image files. Then, you instantiate the model and use it to instantiate a request. Don't forget to import CoreML. The code below creates a model using Resnet50, which is a commonly used image-classification model.
import CoreML
guard let model = try? VNCoreMLModel(for: Resnet50().model) else {
  fatalError("Failed to load ResNet50 model.")
}
The next step is to create a request using the model and then process the results in a completion handler as before.
let request = VNCoreMLRequest(model: model) { request, error in
  DispatchQueue.main.async {
    if let results = request.results as? [VNClassificationObservation] {
      // Sort and filter results by confidence
    }
  }
}
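To fill in the comment from that completion handler, the sorting and filtering might look something like this; the 0.1 cutoff and the top-three limit are arbitrary choices for illustration:
if let results = request.results as? [VNClassificationObservation] {
  let top = results
    .filter { $0.confidence > 0.1 }            // Drop low-confidence labels.
    .sorted { $0.confidence > $1.confidence }  // Highest confidence first.
    .prefix(3)                                 // Keep the top few.
  for observation in top {
    print("\(observation.identifier): \(observation.confidence)")
  }
}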
The handler code is the same as with built-in requests. Also, remember that the request handler takes requests as an array, so you could certainly have a mixture of Apple-provided requests and your CoreML requests.
The documentation for a CoreML model tells you what kind of observations it returns. Because different models answer different questions about the image, the observation types are different.
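Per Apple's documentation, a VNCoreMLRequest produces one of four observation types, depending on the model's output:
VNClassificationObservation: For classifier models that predict a label and a confidence.
VNCoreMLFeatureValueObservation: For models that output general MLFeatureValue data, such as multi-arrays.
VNPixelBufferObservation: For image-to-image models, such as style transfer.
VNRecognizedObjectObservation: For object-detection models that return labels with bounding boxes.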