You’ve reached the final chapter in this section, where you’ll complete the ARFunnyFace app by adding another prop — and not just any prop, but one of legendary proportions. Prepare to come face-to-face with the mighty Green Robot Head!
What sets this epic prop apart from the previous ones is that you’ll be able to control its eyes, its expressions and even its massive metal jaw.
Like a true puppeteer, you’ll be able to animate this robot head using your own facial movements and expressions, thanks to facial blend shapes. Awesome!
What are facial blend shapes?
ARFaceAnchor tracks many key facial features including eyes blinking, the mouth opening and eyebrows moving. These key tracking points are known as blend shapes.
You can easily use these blend shapes to animate 2D or 3D characters to mimic the user’s facial expressions.
Here’s an example of a 2D smiley character that animates by tracking the user’s eyes and mouth.
Each key tracking feature is represented by a floating point number that indicates the current position of the corresponding facial feature.
These blend shape values range from 0.0, indicating a neutral position, to 1.0, indicating the maximum position. The floating-point values essentially represent a percentage ranging from 0% to 100%.
As the user blinks both eyes, the blend shape values start at 100% open, then gradually reduce to 0% open.
The mouth works the same way, starting at 100% open, then reducing to 0% open.
You use the percentage values to animate the corresponding facial features from a 100% open position to a 0% open position — aka, closed.
You can even prerecord the blend shape data, which you can play back at a later time to animate your game characters, for example. Sweet!
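To make this concrete, here’s a minimal sketch of reading a blink blend shape as an openness percentage. It assumes you already have an ARFaceAnchor, here called faceAnchor, from a session update; the logBlink function is purely illustrative:

import ARKit

// Each blend shape value is an NSNumber between 0.0 and 1.0.
func logBlink(for faceAnchor: ARFaceAnchor) {
  let blink = faceAnchor.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
  // eyeBlinkLeft reads 0.0 when the eye is fully open and 1.0 when
  // it's fully closed, so openness is the inverse percentage.
  let opennessPercent = Int((1 - blink) * 100)
  print("Left eye is \(opennessPercent)% open")
}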
Building the robot
Next, it’s time to build the Mighty Green Robot head. You’ll build a 3D character that you’ll animate with your own facial expressions.
Open starter/ARFunnyFace/ARFunnyFace.xcodeproj in Xcode, then select Experience.rcproject and open it in Reality Composer.
Open the Scenes panel and create a new scene that uses a Face Anchor. With the Properties panel open, rename the scene to Robot.
Now, you’ll add a Basic ▸ Capsule to the Robot scene.
Under the Transform section, set the capsule’s Position and Rotation, and leave its Scale at 100%.
Finally, go to the Look section, choose Matte Paint for the Material and set the Material Color to Black, then size the capsule with the Surface Diameter and Height properties.
Both the back part of the scene and the user’s face should now be fully obscured.
Now you’ll create the rest of the robot head, which consists of four robot parts: a RobotEye, a RobotEyelid, a RobotJaw and a RobotSkull.
Great, now the app can select the additional prop, making it a total of four available props. Now, you just need to cater for this new prop in the code.
case 3: // Robot
// 1
let arAnchor = try! Experience.loadRobot()
// 2
uiView.scene.anchors.append(arAnchor)
// 3
robot = arAnchor
break
Here’s how it breaks down:
This loads the Robot scene from Experience.rcproject and stores it in arAnchor.
It then appends arAnchor to the scene anchors.
Finally, it stores arAnchor in robot so other parts of the code can use it to get notifications when the robot prop is active. It also provides quick access to all the elements of the robot head.
Now would be a great time to do a quick check to make sure everything still works as expected. Do a quick build and run.
Excellent! You can select the new prop and, suddenly, the Mighty Green Robot head appears.
Everything is still very static… you’ll address that next.
Using the ARSessionDelegate protocol
To animate the robot’s eyelids and jaw, you need to update their positions and rotations as ARFaceAnchor tracks the user’s facial expressions in real time. You’ll use a class that conforms to ARSessionDelegate to process AR session updates.
From a SwiftUI perspective, you may need to create a custom instance to communicate changes from the view controller to the other parts of the SwiftUI interface. You’ll use makeCoordinator to create that custom instance.
To do so, add the following function to ARViewContainer:
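The listing itself isn’t part of this excerpt; based on the description that follows, it presumably looks like this sketch, which assumes ARDelegateHandler is the ARSessionDelegate-conforming class from the starter project:

// The custom session delegate. It keeps a reference back to the
// ARViewContainer so it can communicate changes to SwiftUI.
class ARDelegateHandler: NSObject, ARSessionDelegate {
  var arViewContainer: ARViewContainer

  init(_ control: ARViewContainer) {
    arViewContainer = control
    super.init()
  }
}

// SwiftUI calls this factory to create the coordinator instance.
func makeCoordinator() -> ARDelegateHandler {
  ARDelegateHandler(self)
}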
This defines makeCoordinator and indicates that it will provide an instance of ARDelegateHandler. It then creates an actual instance of ARDelegateHandler, providing self as the ARViewContainer.
Now that everything’s in place, you can set the session delegate for the view. Add the following line of code to makeUIView(context:), just after initializing arView:
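That line isn’t shown in this excerpt; conventionally, wiring the coordinator up as the session delegate is a one-liner like this:

arView.session.delegate = context.coordinator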
Inside ARDelegateHandler, you’ll implement session(_:didUpdate:) to receive those updates. You’re only interested in anchor updates while the robot scene is active: when robot is nil, you simply skip all updates.
Next, you extract the first available anchor that conforms to an ARFaceAnchor from the received anchors and store it in faceAnchor. You’ll extract all the desired blend shape information from it shortly; the sketch below shows how these pieces fit together.
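Here’s a sketch of session(_:didUpdate:) under those assumptions, with robot being the optional property that holds the loaded Robot anchor and is visible to the delegate:

func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
  // Skip all updates unless the Robot prop is currently active.
  guard robot != nil else { return }

  // Grab the first ARFaceAnchor among the updated anchors.
  var faceAnchor: ARFaceAnchor?
  for anchor in anchors {
    if let anchor = anchor as? ARFaceAnchor {
      faceAnchor = anchor
      break
    }
  }

  // The blend shape handling added below goes here.
}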
Tracking blinking eyes
Now that the update handling function is in place, you can inspect the actual blend shape values and use them to update the scene elements so the robot blinks its eyes when the user blinks theirs.
Start by adding the following block of code to the bottom of session(_:didUpdate:):
let blendShapes = faceAnchor?.blendShapes
let eyeBlinkLeft = blendShapes?[.eyeBlinkLeft]?.floatValue
let eyeBlinkRight = blendShapes?[.eyeBlinkRight]?.floatValue
Here, you access the blendShapes through the updated faceAnchor. You then extract the specific blend shape for eyeBlinkLeft to get its current value, which is provided as a floatValue.
Then, you use the same approach to get the current value for eyeBlinkRight.
Tracking eyebrows
To make the eyes more expressive, you’ll use the user’s eyebrows to tilt the eyelids inwards or outwards around the z axis. This makes the robot look angry or sad, depending on the user’s expression.
To put this into place, add the following to the bottom of session(_:didUpdate:):
let browInnerUp = blendShapes?[.browInnerUp]?.floatValue
let browLeft = blendShapes?[.browDownLeft]?.floatValue
let browRight = blendShapes?[.browDownRight]?.floatValue
Great, now you’re tracking the eyebrows. The only thing left to do is to align the orientations of the eyelids with these blend shape values. To do so, though, you’ll also need to track what the user is doing with their jaw.
Tracking the jaw
Now, you’ll track the user’s jaw and use it to update the robot jaw’s orientation. You’ll use the jawOpen blend shape to track the user’s jaw movement.
Add the following code to the bottom of session(_:didUpdate:):
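The listing isn’t included in this excerpt; given the jawOpen blend shape named above, it’s presumably the same one-line pattern used for the eyes and brows:

let jawOpen = blendShapes?[.jawOpen]?.floatValue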
In the next section, you’ll update the orientations of the eyelids and jaw based on the blend shape values you’re capturing. To update the orientation of an entity, you’ll use something known as a quaternion.
A quaternion is a four-element vector used to encode any possible rotation in a 3D coordinate system. A quaternion represents two components: a rotation axis and the amount of rotation around the rotation axis.
Three vector components, x, y and z, represent the axis, while a w component represents the rotation amount.
Quaternions are difficult to use. Luckily, there are a few handy functions that make working with them a breeze.
Here are the important quaternion functions you’ll use in this chapter:
simd_quatf(angle:axis:): Allows you to specify a single rotation by providing an angle along with the axis the rotation will revolve around.
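As a quick, self-contained example of that function, here’s a sketch that builds a 45° rotation around the x-axis; the degrees-to-radians conversion is the standard formula:

import simd

// simd_quatf(angle:axis:) expects radians, so convert first.
let angle: Float = 45 * .pi / 180
let tilt = simd_quatf(angle: angle, axis: SIMD3<Float>(1, 0, 0))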
At the same time, you’ll apply a rotation around the z-axis to make the eye appear angry or sad. You’ll use the same approach with the captured brow blend shapes.
Here’s what all that looks like in code. Add the following block of code to the bottom of session(_:didUpdate:):
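The original listing isn’t reproduced in this excerpt. Here’s a sketch of its likely shape: the entity accessors (eyeLidL, eyeLidR and jaw) are assumed from the names in the Reality Composer scene, deg2Rad is an assumed degrees-to-radians helper, and the angle constants are illustrative rather than the book’s exact values:

// Assumed helper: convert degrees to radians for simd_quatf.
func deg2Rad(_ value: Float) -> Float {
  value * .pi / 180
}

// Unwrap the robot scene and every captured blend shape value.
guard let robot = robot,
  let eyeBlinkLeft = eyeBlinkLeft,
  let eyeBlinkRight = eyeBlinkRight,
  let browInnerUp = browInnerUp,
  let browLeft = browLeft,
  let browRight = browRight,
  let jawOpen = jawOpen else { return }

// First rotation (x-axis): raise the eyelid from its resting angle
// as the blink value changes. Second rotation (z-axis): tilt the
// eyelid using the brow blend shapes. simd_mul combines the two.
robot.eyeLidL?.orientation = simd_mul(
  simd_quatf(angle: deg2Rad(-120 + (90 * eyeBlinkLeft)),
    axis: [1, 0, 0]),
  simd_quatf(angle: deg2Rad((90 * browLeft) - (30 * browInnerUp)),
    axis: [0, 0, 1]))
robot.eyeLidR?.orientation = simd_mul(
  simd_quatf(angle: deg2Rad(-120 + (90 * eyeBlinkRight)),
    axis: [1, 0, 0]),
  simd_quatf(angle: deg2Rad((-90 * browRight) + (30 * browInnerUp)),
    axis: [0, 0, 1]))

// Swing the jaw open in proportion to the jawOpen blend shape.
robot.jaw?.orientation = simd_quatf(
  angle: deg2Rad(-100 + (60 * jawOpen)),
  axis: [1, 0, 0])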
To start, you check that the robot is currently the active prop, which also gives you access to elements like the left eyelid you named earlier. You apply the rotations to the orientation of the left eyelid, using quaternion multiplication to combine the quaternions.
The first rotation is around the x-axis. The left eyelid rests at a fixed negative angle, so you use that as the base rotation. You then multiply the left blink blend shape by a constant angle, which sets how much influence the blend shape has over the eyelid rotation.
The second rotation is around the z-axis. The left brow blend shape gets the larger influence over the eyelid orientation, while the inner brow movement gets only a smaller influence.
The robot is mostly done, but there’s always room for improvement. Wouldn’t it be cool if it could shoot lasers from its eyes when it gets extra angry?
Here’s how you add the lasers. Open Experience.rcproject in Reality Composer, then select the Robot scene.
Add a Basic ▸ Cylinder object to the scene.
Select the newly-added cylinder and rename it Laser_R. Under the Transform section, set the Position so the beam lines up with the robot’s right eye.
Under the Look section, leave the Material as Glossy Paint, but change the Material Color to a bright, laser-like color. Set the Diameter, the Height and the Bevel Radius to stretch the cylinder into a long, thin beam.
Your next goal is to really bring those lasers to life. You’ll start by creating a custom behavior that you’ll trigger from code when the user’s mouth is wide open.
While the lasers are firing, you have to wait for them to finish before you can fire another laser. To achieve this, you’ll send a notification to your code to indicate when the laser has finished.
The first thing you need to do is to hide the lasers when the scene starts. Open the Behaviors panel, then add a Start Hidden behavior.
Rename the behavior to Start and add the two lasers as the affected objects for the Hide action.
Now, when the robot scene starts, both lasers will be hidden.
With the two lasers now in the scene, you’ll add a Custom behavior next.
Rename the behavior to Lasers, then add the Trigger as a Notification.
Under the Notification trigger, change the Identifier to ShowLasers.
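This excerpt doesn’t show the Swift side of the notification. For reference, a Reality Composer Notification trigger is fired from code through the generated notifications property; here’s a sketch, assuming the ShowLasers identifier above and a hypothetical lasersDone Notify action that reports back when the animation finishes:

// Fires the behavior whose Notification trigger Identifier is
// ShowLasers; code generation exposes it in camelCase.
robot.notifications.showLasers.post()

// Hypothetical Notify-action handler: runs when the behavior
// reports that the laser sequence has finished.
robot.actions.lasersDone.onAction = { _ in
  // Allow the next laser blast here.
}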
Congratulations, you’ve reached the end of this chapter and section. Before grabbing a delicious cup of coffee, quickly take a look at some key points you’ve learned in this chapter.
To recap:
Facial blend shapes: You’ve learned about facial blend shapes and how they’re used to track a face’s key points.
ARSessionDelegate: You learned how to handle scene updates via the ARSessionDelegate. Every time a blend shape updates, it triggers a session update, allowing you to update the entities within the scene.
Using blend shapes: You’ve learned how to track blend shapes and use the data to update entity orientations.
Quaternions: You know what quaternions are and how to use helper functions to construct them, making rotations a breeze to work with.