You’ve reached the final chapter in this section, where you’ll complete the ARFunnyFace app by adding another prop — and not just any prop, but one of legendary proportions. Prepare to come face-to-face with the mighty Green Robot Head!
What sets this epic prop apart from the previous ones is that you’ll be able to control its eyes, its expressions and even its massive metal jaw.
Like a true puppeteer, you’ll be able to animate this robot head using your own facial movements and expressions, thanks to facial blend shapes. Awesome!
What are facial blend shapes?
ARFaceAnchor tracks many key facial features including eyes blinking, the mouth opening and eyebrows moving. These key tracking points are known as blend shapes.
You can easily use these blend shapes to animate 2D or 3D characters to mimic the user’s facial expressions.
Here’s an example of a 2D smiley character that animates by tracking the user’s eyes and mouth.
Each key tracking feature is represented by a floating point number that indicates the current position of the corresponding facial feature.
These blend shape values range from 0.0, indicating a neutral position, to 1.0, indicating the maximum position. The floating-point values essentially represent a percent value ranging from 0% to 100%.
As the user blinks both eyes, the blend shape values start at 100% open, then gradually reduce to 0% open.
The mouth works the same way, starting at 100% open then reducing to 0% open.
You use the percentage values to animate the corresponding facial features from a 100% open position to a 0% open position — aka, closed.
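In code, that percentage mapping is a one-liner. Here’s a tiny sketch (the variable names are illustrative, not from the app):

// A blend shape coefficient of 0.75 means the feature is at 75%
// of its maximum position.
let eyeBlink: Float = 0.75
let percentClosed = eyeBlink * 100       // 75% closed
let percentOpen = (1 - eyeBlink) * 100   // 25% open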
You can even prerecord the blend shape data, which you can play back at a later time to animate your game characters, for example. Sweet!
Building the robot
Next, it’s time to build the Mighty Green Robot head. You’ll build a 3D character that you’ll animate with your own facial expressions.
Open starter/ARFunnyFace/ARFunnyFace.xcodeproj in Xcode, then select Experience.rcproject and open it in Reality Composer.
Open the Scenes panel and create a new scene that uses a Face Anchor. With the Properties panel open, rename the scene to Robot.
Now, you’ll add a Basic ▸ Capsule to the Robot scene.
Under the Transform section, set the capsule’s Position and Rotation, and leave the Scale at 100%.
Finally, go to the Look section, choose Matte Paint for the Material and set the Material Color to Green. Then set the Capsule Diameter and the Height to size the shape.
Great, now the app can select the additional prop, making it a total of four available props. Now, you just need to cater for the new prop in the update.
case 3: // Robot
  // 1
  let arAnchor = try! Experience.loadRobot()
  // 2
  uiView.scene.anchors.append(arAnchor)
  // 3
  robot = arAnchor
  break
Here’s how it breaks down:
This loads the Robot scene from Experience.rcproject and stores it in arAnchor.
It then appends arAnchor to the scene anchors.
Finally, it stores arAnchor in robot so other parts of the code can get notifications from the robot when it updates. It also provides quick access to all the elements of the robot head.
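Note that robot itself isn’t declared in the snippet above. Assuming it’s a stored property on ARViewContainer, its declaration would look something like this:

// Keeps a reference to the loaded Robot scene so the rest of the
// code can reach its entities and notifications.
var robot: Experience.Robot!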
Now would be a great time to do a quick check to make sure everything still works as expected. Do a quick build and run.
Excellent! When you select the new prop, the Mighty Green Robot head comes into view.
Everything is still very static… you’ll address that next.
Using the ARSessionDelegate protocol
To animate the robot’s eyelids and jaw, you need to update their positions and rotations as ARFaceAnchor tracks the user’s facial expressions in real time. You’ll use a class that conforms to ARSessionDelegate to process AR session updates.
Updated Frame Data: Provides a newly captured camera image, along with AR information, to the delegate as an ARFrame.
Added Anchors: Informs the delegate that one or more anchors have been added to the session.
Removed Anchors: Informs the delegate that one or more anchors have been removed from the session.
Updated Anchors: Informs the delegate that the session has adjusted the properties of one or more anchors. This is where you can monitor any changes to the blend shapes you’re tracking. Adjusting a blend shape will trigger a session update.
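For reference, these four callbacks correspond to the ARSessionDelegate methods sketched below; the class name here is illustrative:

import ARKit

class SessionMonitor: NSObject, ARSessionDelegate {
  // Updated Frame Data
  func session(_ session: ARSession, didUpdate frame: ARFrame) { }
  // Added Anchors
  func session(_ session: ARSession, didAdd anchors: [ARAnchor]) { }
  // Removed Anchors
  func session(_ session: ARSession, didRemove anchors: [ARAnchor]) { }
  // Updated Anchors
  func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) { }
}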
Adding ARDelegateHandler
For your next step, you’ll create a new class that conforms to this protocol so you can track changes to the facial blend shapes.
Add the following class to ARViewContainer:
// 1
class ARDelegateHandler: NSObject, ARSessionDelegate {
  // 2
  var arViewContainer: ARViewContainer

  // 3
  init(_ control: ARViewContainer) {
    arViewContainer = control
    super.init()
  }
}
Here’s a closer look at what this does:
This defines a new class called ARDelegateHandler that adopts ARSessionDelegate.
When the class instantiates, it requires ARViewContainer and stores it in arViewContainer.
This defines makeCoordinator and indicates that it will provide an instance of ARDelegateHandler. It then creates an actual instance of ARDelegateHandler, providing self as the ARViewContainer.
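The makeCoordinator implementation itself isn’t visible in this excerpt; based on the description above, it presumably looks close to this:

// SwiftUI calls this to create the coordinator, so hand it an
// ARDelegateHandler that holds this container.
func makeCoordinator() -> ARDelegateHandler {
  return ARDelegateHandler(self)
}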
Now that everything’s in place, you can set the session delegate for the view. Add the following line of code to makeUIView(context:), just after initializing arView:
arView.session.delegate = context.coordinator
Here, you set the view’s session delegate to the current coordinator, which will now receive updates whenever the session detects any changes.
Handling ARSession updates
With the delegate class in place, you can now start tracking updates to any of the facial blend shapes.
Add the following function to ARDelegateHandler:
// 1
func session(_ session: ARSession,
  didUpdate anchors: [ARAnchor]) {
  // 2
  guard robot != nil else { return }
  // 3
  var faceAnchor: ARFaceAnchor?
  for anchor in anchors {
    if let a = anchor as? ARFaceAnchor {
      faceAnchor = a
    }
  }
}
Here’s what’s happening above:
This defines session(_:didUpdate:), which triggers every time there’s an update available on the anchor.
You’re only interested in anchor updates while the robot scene is active. When robot is nil, you simply skip all updates.
This extracts the first available anchor from the provided anchors that conforms to an ARFaceAnchor, then stores it in faceAnchor. You’ll extract all the updated blend shape information from here.
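As a side note, the loop ends up keeping the last matching anchor, which amounts to the same thing when only one face anchor exists. A more compact equivalent (not the book’s code) would be:

let faceAnchor = anchors.compactMap { $0 as? ARFaceAnchor }.last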
Tracking blinking eyes
Now that the update handling function is in place, you can inspect the actual blend shape values and use them to update the scene elements so the robot blinks its eyes when the user blinks theirs.
Start by adding the following block of code to the bottom of session(_:didUpdate:):
let blendShapes = faceAnchor?.blendShapes
let eyeBlinkLeft = blendShapes?[.eyeBlinkLeft]?.floatValue
let eyeBlinkRight = blendShapes?[.eyeBlinkRight]?.floatValue
Here, you access the blendShapes through the updated faceAnchor. You then extract the specific blend shape for eyeBlinkLeft to get its current value, which is provided as a floatValue.
Then you use the same approach to get the current value for eyeBlinkRight.
Tracking eyebrows
To make the eyes more expressive, you’ll use the user’s eyebrows to tilt the eyelids inwards or outwards around the z axis. This makes the robot look angry or sad, depending on the user’s expression.
This time, you’ll use three blend shapes to track the user’s eyebrow movements:
browInnerUp: Tracks the inner, upward movement of both eyebrows.
browDownLeft: Tracks the downward movement of the left eyebrow.
browDownRight: Tracks the downward movement of the right eyebrow.
To put this into place, add the following to the bottom of session(_:didUpdate:):
let browInnerUp = blendShapes?[.browInnerUp]?.floatValue
let browLeft = blendShapes?[.browDownLeft]?.floatValue
let browRight = blendShapes?[.browDownRight]?.floatValue
Great, now you’re tracking the eyebrows. The only thing left to do is to adjust the orientation of the eyelids with those blend shape values. To do it, though, you’ll also need to track what the user is doing with their jaw.
Tracking the jaw
Now, you’ll track the user’s jaw and use it to update the orientation of the robot’s jaw. You’ll use the jawOpen blend shape to track the user’s jaw movement.
Add the following code to the bottom of session(_:didUpdate:):
let jawOpen = blendShapes?[.jawOpen]?.floatValue
Now, you’re ready to use special vectors to tweak both the eyelids and the jaw.
Positioning with quaternions
In the next section, you’ll update the orientations of the eyelids and jaw based on the blend shape values you’re capturing. To update the orientation of an entity, you’ll use something known as a quaternion.
A quaternion is a four-element vector used to encode any possible rotation in a 3D coordinate system. A quaternion represents two components: a rotation axis and the amount of rotation around the rotation axis.
Three vector components, x, y and z, represent the axis, while a w component represents the rotation amount.
Quaternions are difficult to use. Luckily, there are a few handy functions that make working with them a breeze.
Here are the important quaternion functions you’ll use in this chapter:
simd_quatf(angle:axis:): Allows you to specify a simple rotation by means of an angle along with the axis the rotation will revolve around.
simd_mul(p:q:): Lets you multiply two quaternions together to form a single quaternion. Use this function when you want to apply more than one rotation to an object.
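To see how the two combine, here’s a small sketch; the angles, axes and entity are illustrative only:

import RealityKit

// A 45° pitch around the x-axis...
let pitch = simd_quatf(angle: .pi / 4, axis: [1, 0, 0])
// ...and a 10° tilt around the z-axis...
let tilt = simd_quatf(angle: .pi / 18, axis: [0, 0, 1])
// ...multiplied into a single rotation. Order matters: this
// applies pitch first, then tilt.
let entity = Entity()
entity.orientation = simd_mul(tilt, pitch)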
You have to specify angles in radians. To make life a little easier, you’ll use a simple helper function that converts degrees into radians.
Uwc fso pevjewith tolfit gatmzouz jo IGPedofewaSadrtus:
The robot is mostly done, but there’s always room for improvement. Wouldn’t it be cool if it could shoot lasers from its eyes when it gets extra angry?
Your next goal is to really bring those lasers to life. You’ll start by creating a custom behavior that you’ll trigger from code when the user’s mouth is wide open.
While the lasers are firing, you have to wait for them to finish before you can fire another laser. To achieve this, you’ll send a notification to your code to indicate that the laser has finished.
The first thing you need to do is to hide the lasers when the scene starts. Open the Behaviors panel, then add a Start Hidden behavior.
Rename the behavior, then add the two lasers as the affected objects for the Hide action.
With the two lasers selected in the scene, you’ll add a Custom behavior next.
Rename the behavior to Lasers, then set the Trigger as a Notification.
Under the Notification trigger, change the Identifier to ShowLasers.
You can now trigger this behavior from code by using the identifier.
Next, you want the lasers to become visible and make a noise, then disappear again.
To make that happen, add a grouped Show action with a Play Sound action sequence. For the Show action, set a short Duration. For the Play Sound action, pick a suitable laser-like Audio Clip from the built-in FX sounds.
Now, to make them disappear, add a grouped Hide action with a Play Sound action sequence. For the Hide action, set a short Duration, and again choose a fitting Audio Clip for the Play Sound action.
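As for telling your code that the laser has finished, Reality Composer’s Notify action surfaces as an action handler in the generated code. Another hedged sketch, assuming an action identifier like LasersDone:

// Runs when the scene’s Notify action with identifier "LasersDone" fires.
robot.actions.lasersDone.onAction = { _ in
  // It’s now safe to fire another laser.
}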
Congratulations! Build and run now to test the laser effects.
The robot can blink, look sad and angry, and open and close its jaw. Now, best of all, it can shoot lasers from its eyes when the user opens their mouth wide. Amazing!
Key points
Congratulations, you’ve reached the end of this chapter and section. Before grabbing a delicious cup of coffee, quickly take a look at some key points you’ve learned in this chapter.
To recap:
Facial blend shapes: You’ve learned about facial blend shapes and how they’re used to track a face’s key points.
ARSessionDelegate: You learned how to handle scene updates via the ARSessionDelegate. Every time a blend shape changes, it triggers a session update, allowing you to update the entities within the scene.
Using blend shapes: You’ve learned how to track blend shapes and use the data to update entity orientations.
Quaternions: You know what quaternions are and how to use helper functions to construct them, making rotations a breeze to work with.