This chapter continues from the point where the previous one left off. Thanks to SwiftUI, the AR Funny Face app now sports a very basic UI. In this chapter, you’ll continue to focus on that facial recognition component by using face anchors in RealityKit.
You’ll also get to build some funny scenes with these crazy props:
A humongous pair of sunglasses, a glass eyeball and one epic mustache. With these cool props at your disposal, the AR Funny Face app is off to a great start.
Note: Feel free to continue with your own final project from the previous chapter. If you skipped a few things, load the starter project from starter/ARFunnyFace/ARFunnyFace.xcodeproj before you continue.
What are face anchors?
Hanging an anchor from one’s face sounds extremely painful. Luckily, the type of anchor we’re referring to here is an AR Face Anchor — undeniably one of the coolest types of anchors available. Thanks to Reality Composer, creating face anchors is super easy.
By using the TrueDepth front-facing camera, face anchors provide information about the user’s facial position, orientation, topology and facial expression.
Unfortunately, you can only use face tracking if you have a device equipped with a TrueDepth front-facing camera. If the device is Face ID capable, you're good to go.
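If you'd rather check for support in code than memorize the device list, ARKit exposes a support flag. Here's a minimal sketch; it isn't part of the chapter's project, and the helper name is purely illustrative:
import ARKit

// Hypothetical helper: returns true only on devices with a TrueDepth camera,
// which is exactly what face tracking requires.
func deviceSupportsFaceTracking() -> Bool {
  ARFaceTrackingConfiguration.isSupported
}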
When the camera detects a face, an anchor gets added slightly behind the nose in the middle of the head.
Here, the cute monkey head demonstrates how a face anchor is created after a face is detected.
It’s also important to know that the anchor uses a right-handed coordinate system measured in meters.
Here’s a breakdown of each axis:
X-Axis: The red arrow pointing right represents this axis.
Y-Axis: The green arrow pointing up represents this axis.
Z-Axis: The blue arrow pointing forward represents this axis.
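To make those axes and units concrete, here's a small, hypothetical RealityKit sketch that places a sphere a short distance in front of a face anchor entirely in code. It isn't part of the chapter's Reality Composer workflow; the function name and values are just for illustration:
import RealityKit
import UIKit

// Hypothetical example: attach a small sphere to a face anchor in code,
// 10 cm along the blue +Z arrow. Distances are in meters, matching the
// anchor's right-handed coordinate system.
func addNoseMarker(to arView: ARView) {
  let faceAnchor = AnchorEntity(.face)

  let sphere = ModelEntity(
    mesh: .generateSphere(radius: 0.02),
    materials: [SimpleMaterial(color: .red, isMetallic: false)])

  // +X is the red arrow (right), +Y the green arrow (up), +Z the blue arrow.
  sphere.position = [0, 0, 0.1]

  faceAnchor.addChild(sphere)
  arView.scene.anchors.append(faceAnchor)
}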
Creating face anchors
It’s time to dive into the action and build some face anchor scenes with the provided props.
Once you set the scene's Anchor Type to Face, you'll see a white face mask in the center of the scene. The mask represents a detected face in space. It serves as a visual guide to show where you can place objects in relation to the detected face.
Add an object to the scene, but select the Import option to import a custom object. Find and select starter/resources/Eyeball.usdz, then select Import to complete the process.
To add more props, you have to create multiple scenes within the Reality Composer project. Each scene will house a single facial prop. To switch between the different props, you simply need to switch the different scenes.
That sounds like a great plan, so get to it!
With the Scenes panel open, add a new scene to the project by selecting the + button at the top-left of the panel.
Each scene can only have one anchor, so when you create a new scene, you'll need to select an anchor type for it. When asked, choose Face as the anchor type and make sure to uncheck Use template content so the scene is created with no content.
Rename the new scene to Glasses.
Add an object to the scene, then import starter/resources/Glasses.usdz.
Rotate the scene until you can see the glasses behind the face mask, then select them and open the Properties panel. Under the Transform section, adjust the Position so the glasses sit centered on the face mask. To make the glasses nice and big, increase the Scale until they comfortably cover the face.
All right, you're almost done: there's one more prop left to add to the project.
Following the same process as before, make sure the Scenes panel is open, then add another scene to the project by pressing the + button at the top-left of the panel.
When asked, set the Anchor Type to Face. Don't forget to make sure you've unchecked Use template content.
Rename the scene to Mustache and add an object to the scene, then import starter/resources/Mustache.usdz.
To wrap things up, select the mustache object, then adjust its Position under the Transform section so it sits just below the nose of the face mask, and increase its Scale until it looks suitably epic.
Save your changes, close Reality Composer and return to Xcode.
You'll now see your three scenes within Xcode.
Code generation
Reality Composer is tightly integrated into Xcode. When you build the project, Xcode will inspect all the associated Reality files within the project and generate Swift code.
The generated code provides strongly-typed access to all the content within the Reality file. It also provides direct access to invoke triggers and custom actions from your code.
In this case, Xcode generates an Experience.swift file with strongly-typed access to the three scenes you created within the Reality file.
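As a rough idea of what that strongly-typed access looks like from the calling side, here's a short sketch that uses one of the generated loaders; the helper function is illustrative, but the loader names follow directly from the Eyes, Glasses and Mustache scenes you just created:
import RealityKit

// Each Reality Composer scene becomes a loader on the generated Experience
// type: loadEyes(), loadGlasses() and loadMustache(). The loaders throw, so
// you handle or force the error; try! mirrors the book's style.
func showGlasses(in arView: ARView) {
  let glassesAnchor = try! Experience.loadGlasses()
  arView.scene.anchors.append(glassesAnchor)
}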
Fixing the project
You’ll look at the coding side of the project next. When you recompile your project, it generates an error.
The problem lies in the code that initialized arView. In an ideal world, you could just get rid of the code that's causing the compiler error.
Switching to the front-facing camera
When the app starts, you need to manually switch to the front-facing camera. To do that, you’ll need a little help from ARKit.
ARKit is the technology behind RealityKit; you'll learn more about it later in the book.
Add the following import to the top of ContentView.swift:
import ARKit
Great, now you have low-level access to some additional functionality.
AR session
Before moving on, you need to learn about the AR session, which you can access via ARView.session.
The AR session object is the key technology responsible for motion tracking and image processing. It's session-based, so you have to create an AR session instance, then you have to run that session to start the AR tracking process.
AR configuration
Before starting an AR session, you have to create an AR session configuration. You use this configuration to establish the connection between the real world, where your device is, and the virtual 3D world, where your virtual content is.
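Here's a minimal sketch of how that setup can look inside ARViewContainer's makeUIView(context:), assuming the import ARKit you added earlier; treat it as one possible shape of the code rather than the chapter's exact listing:
func makeUIView(context: Context) -> ARView {
  // A local constant keeps the sketch self-contained; your project may keep
  // a reference to this view elsewhere.
  let arView = ARView(frame: .zero)

  // Create a face-tracking configuration; it tells the session to use the
  // TrueDepth front-facing camera and start tracking faces.
  let arConfiguration = ARFaceTrackingConfiguration()

  // Run the view's AR session with the new configuration, resetting any
  // previous tracking state and removing leftover anchors.
  arView.session.run(arConfiguration,
    options: [.resetTracking, .removeExistingAnchors])

  return arView
}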
Take a closer look at what you're doing with this code:
First, you create a new instance of ARFaceTrackingConfiguration called arConfiguration. The configuration contains the necessary information to let the AR session know that you want to start tracking faces.
Next, session.run(_:options:) starts the AR session with the newly-created configuration, along with a few additional options. resetTracking indicates that you want to restart the AR tracking process. removeExistingAnchors removes any existing anchors, if there are any.
Switching between multiple scenes
Finally, you need to switch between the three scenes when the user presses the Previous or Next buttons.
Add the following block of code to the bottom of updateUIView(_:context:):
switch(propId) {
case 0: // Eyes
  let arAnchor = try! Experience.loadEyes()
  uiView.scene.anchors.append(arAnchor)
  break
case 1: // Glasses
  let arAnchor = try! Experience.loadGlasses()
  uiView.scene.anchors.append(arAnchor)
  break
case 2: // Mustache
  let arAnchor = try! Experience.loadMustache()
  uiView.scene.anchors.append(arAnchor)
  break
default:
  break
}
Every time the user presses Next or Previous, the value of propId increases or decreases by 1, which invalidates the view's state and makes a call to updateUIView(_:context:). The switch statement then inspects the value of propId to switch to the appropriate scene.
First, you initialize arAnchor by loading the corresponding anchor scene from Experience. Once loaded, you append the anchor to uiView.scene.anchors, which presents that particular scene in the view.
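For context, here's a rough sketch of the SwiftUI side from the previous chapter that drives propId. The exact layout, styling and button labels in your project will differ, so treat this as illustrative rather than a drop-in replacement:
import SwiftUI

struct ContentView: View {
  @State var propId: Int = 0

  var body: some View {
    ZStack(alignment: .bottom) {
      // The container receives propId as a binding; changing it invalidates
      // the view and triggers updateUIView(_:context:).
      ARViewContainer(propId: $propId).edgesIgnoringSafeArea(.all)
      HStack {
        // The bounds (0 through 2) match the three scenes in this chapter.
        Button("Previous") { propId = max(propId - 1, 0) }
        Button("Next") { propId = min(propId + 1, 2) }
      }
    }
  }
}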
Testing the app
Finally, you’re ready to do your very first build and run. Before you do, connect your physical device to your machine and select it in Xcode.
Note: You will experience compiler errors if you don't connect your physical device to Xcode and select it as the build destination.
Ready to build and run? Go for it!
Excellent, the app starts and you can use the Next and Previous buttons to select the different props.
Although it looks like everything works, there's a small problem: the props all stack and never go away.
Manually removing anchors
Every time you switch from one scene to another, you load an anchor and append it to arView.scene.anchors. If you continue adding multiple anchors, you'll end up with multiple props stacked on top of each other.
To solve this problem, you need to manually remove the previously-attached anchors before attaching a new one.
Add the following line of code to the top of updateUIView(_:context:):
arView.scene.anchors.removeAll()
This removes all the available anchors within arView.scene.anchors.
Remember that every time the user presses the Next or Previous button, it invalidates the state and makes a call to updateUIView(_:context:). So every time the user switches to a new prop, the app removes the previous prop (face anchor) before attaching the new one.
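Putting both pieces together, updateUIView(_:context:) ends up looking something like the sketch below, assuming the propId property from the previous chapter. For simplicity, the sketch calls removeAll() on the uiView parameter, which refers to the same view as arView:
func updateUIView(_ uiView: ARView, context: Context) {
  // Remove whatever prop is currently attached; otherwise each switch
  // would stack another prop on top of the previous one.
  uiView.scene.anchors.removeAll()

  // Load and attach the Reality Composer scene that matches the selection.
  switch(propId) {
  case 0: // Eyes
    let arAnchor = try! Experience.loadEyes()
    uiView.scene.anchors.append(arAnchor)
  case 1: // Glasses
    let arAnchor = try! Experience.loadGlasses()
    uiView.scene.anchors.append(arAnchor)
  case 2: // Mustache
    let arAnchor = try! Experience.loadMustache()
    uiView.scene.anchors.append(arAnchor)
  default:
    break
  }
}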
Note: You can find the final project in final/ARFunnyFace/ARFunnyFace.xcodeproj.
Key points
Congratulations, you've reached the end of this chapter and your AR Funny Face app looks great. Take some selfies of yourself and your friends trying out the funny props.
Here are some of the key points you covered in this chapter:
Face Anchors: You now have a good idea of what face anchors are and how to use them within your scenes.
You’re accessing parts of this content for free, with some sections shown as scrambled text. Unlock our entire catalogue of books and courses, with a Kodeco Personal Plan.