AR stands out as a focus area for Apple as it continues to build its AR platform of the future. Thanks to AR Quick Look, AR has become extremely accessible and is now deeply integrated into iOS, macOS and tvOS.
Creating immersive AR experiences has historically been difficult, requiring a vast amount of skill and knowledge. To deliver top-rate AR experiences, developers need to master rendering technologies, physics simulation, animation, interactivity and much more.
Thankfully, that all changed with the introduction of RealityKit.
With RealityKit in your toolbox, creating AR experiences has never been easier.
In this section, you’ll learn all about RealityKit and face tracking. You’ll create a Snapchat-like face filter app with SwiftUI called AR Funny Face, where you get to mock up your face with funny props. You’ll also create an animated mask that you can control with your eyes, brows and mouth.
What is RealityKit?
RealityKit is a new Swift framework that Apple introduced at WWDC 2019. Apple designed it from the ground up with AR development in mind. Its main purpose is to help you build AR apps and experiences more easily. Thanks to the awesome power of Swift, RealityKit delivers a high-quality framework with a super simple API.
RealityKit is a high-quality rendering technology capable of delivering hyper-realistic, physically-based graphics with precise physics simulation and collisions against the real-world environment. It does all of the heavy lifting for you, right out of the box. It makes your content look as good as possible while fitting seamlessly into the real world. Its impressive feature list includes skeletal animations, realistic shadows, lights, reflections and post-processing effects.
Are you ready to give it a try? Open RealityKit and take a look at what’s inside.
At its core, you’ll find many of Apple’s other frameworks, but the ones doing most of the work are ARKit and Metal.
Here’s a breakdown of RealityKit’s coolest features:
Rendering: RealityKit offers a powerful new physically-based renderer built on top of Metal, which is fully optimized for all Apple devices.
Animation: It has built-in support for skeletal animation and transform-based animation. So, if you want, you can animate a zombie, or you can move, scale and rotate objects with various easing functions.
Physics: With a powerful physics engine, RealityKit lets you throw anything at it — pun intended! You can adjust real-world physics properties like mass, drag and restitution, allowing you to fine-tune collisions.
Audio: Spatial audio understanding and automatic listener configuration let you attach sound effects to 3D objects. You can then track those sounds, making them sound realistic based on their position in the real world.
ECS: From a coding perspective, RealityKit enforces the Entity Component System design pattern to build objects within the world; you’ll see a quick sketch of the pattern right after this list.
Synchronization: The framework has built-in support for networking, designed for collaborative experiences. It even offers automatic synchronization of entities between multiple clients.
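To make the ECS item above concrete, here’s a minimal sketch built around a simple box; the mesh size, color and physics settings are arbitrary placeholder values:

import RealityKit
import UIKit

// An entity is just a container; components define what it is and what it can do.
let box = ModelEntity(
  mesh: .generateBox(size: 0.2),                              // ModelComponent: how it looks
  materials: [SimpleMaterial(color: .red, isMetallic: true)]
)

// CollisionComponent: lets the box take part in collisions.
box.generateCollisionShapes(recursive: true)

// PhysicsBodyComponent: gives the box mass, drag and restitution behavior.
box.components[PhysicsBodyComponent.self] = PhysicsBodyComponent(
  massProperties: .default,
  material: nil,
  mode: .dynamic
)

Looks, collisions and physics are all just components attached to the same entity; that’s the pattern in a nutshell.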
Enough talk, it’s time to dive into some code!
Creating a RealityKit project
Now that you have some understanding of RealityKit’s features, you’ll create your first RealityKit project. Launch Xcode and get ready to create a new Augmented Reality App project from scratch.
Note: If you’d rather skip creating the project from scratch and use the starter project instead — which also includes the app icons — you can look in starter/ARFunnyFace.xcodeproj. Feel free to skip to the next section.
Before doing anything else, build and run the project to take it for a quick spin before taking a look at what Xcode generated.
Wow, out of the box, this app is already doing quite a lot. It’s requesting access to the camera, it’s detecting horizontal surfaces and it now senses environmental lighting and reflections. And where did that chrome-looking cube come from? You didn’t have to write one single line of code to achieve all that. Nice!
Next, you’ll be even more impressed when you peek at what’s inside the project.
Reviewing the project
At first glance within the project, you’ll notice the usual suspects — but there are a few new things, too:
AppDelegate.swift: This is the app’s starting point.
ContentView.swift: Since this is a SwiftUI-based app, the user interface is defined here. From the preview, you can see that the UI is currently a blank slate. Importantly, the ContentView constructs an ARView that loads and presents the scene located within the Experience.rcproject Reality Composer project. It’s also worth pointing out that this is the file where the ARView itself gets created.
Experience.rcproject: This is a Reality Composer project, which is essentially a 3D scene that contains the box and the box anchor you saw in the previous demo.
Info.plist: Contains the app’s basic configuration settings. Note that there’s already a Camera Usage Description property; you just need to change it to something more appropriate for your app. This allows the app to request access to the camera from the user, which you need to deliver the AR experience through the camera feed.
RealityKit API components
Now, take a look at a few of the main components that form part of the RealityKit API.
Here’s an example of a typical structure containing all of the important elements:
ARView: The ARView sits at the core of any RealityKit experience, taking responsibility for all of the heavy lifting. It comes with easy gesture support, allowing you to attach gestures to entities. It also handles the post-processing camera effects, which is very similar to the effects you get in AR Quick Look.
Scene: Think of this as the container for all of your entities.
Entity: You can consider each element of the virtual content in a scene as an entity — the basic building block of your experience. You can construct a tree-like hierarchical structure by parenting entities to other entities.
Components: Entities consist of different types of components. These components give the entities specific features and functionality, like how they look, how they respond to collisions and how they react to physics.
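Here’s a rough sketch of how those pieces stack up in code; the horizontal plane anchor and the sphere are illustrative stand-ins, not objects from this chapter’s project:

import RealityKit

// ARView owns the scene and does the heavy lifting.
let arView = ARView(frame: .zero)

// An anchor entity pins virtual content to something in the real world.
let anchor = AnchorEntity(plane: .horizontal)

// A model entity carries a ModelComponent so it has something to render.
let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1))

// Build the tree: ARView -> Scene -> AnchorEntity -> Entity.
anchor.addChild(sphere)
arView.scene.addAnchor(anchor)

Every node under the scene is an entity, and everything an entity can do comes from the components attached to it.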
Building the UI with SwiftUI
When you created the app, you selected SwiftUI for the user interface. Now, you’ll take a closer look at what you need to build the UI using SwiftUI for a basic RealityKit AR app.
The UI is very simple, and it requires only three basic buttons: Next, Previous and Shutter.
Your AR experience is going to contain multiple scenes with various props to make your pictures funnier. When the user taps the Next or Previous buttons, the app will switch from one prop to another. You’ll implement that functionality now.
Open ContentView.swift and define a variable at the top of ContentView to keep track of the active prop.
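Here’s a rough sketch of that variable, along with the body it eventually drives; the button image names and the prop count of three are illustrative assumptions, not the project’s exact values:

import SwiftUI
import RealityKit

struct ContentView: View {
  // Tracks which prop scene is currently active.
  @State var propId: Int = 0

  var body: some View {
    ZStack(alignment: .bottom) {
      // The AR view fills the screen; $propId binds it to the active prop.
      ARViewContainer(propId: $propId).edgesIgnoringSafeArea(.all)

      // The three buttons sit in a horizontal row along the bottom.
      HStack {
        Spacer()
        Button(action: {
          // Previous: step back through the props, stopping at the first one.
          self.propId = max(self.propId - 1, 0)
        }) {
          Image("PreviousButton").clipShape(Circle())
        }
        Spacer()
        Button(action: {
          // Shutter: snapshot-taking isn't implemented yet.
        }) {
          Image("ShutterButton").clipShape(Circle())
        }
        Spacer()
        Button(action: {
          // Next: step forward through the props, stopping at the last one.
          self.propId = min(self.propId + 1, 2)
        }) {
          Image("NextButton").clipShape(Circle())
        }
        Spacer()
      }
    }
  }
}

Here’s what’s going on in that layout: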
To overlay the UI buttons on the AR view, you place the elements into a ZStack.
You provide the $propId as a parameter for ARViewContainer(), completing the binding process. So when the value of propId changes, it invalidates the ARView.
Finally, you place the buttons horizontally within an HStack.
Great! You’ve now created a variable to keep track of the active prop. You’ll update that variable when the user presses the Next and Previous buttons to step between the various scenes within the Reality Composer experience.
Note: This will temporarily cause a compiler error. Ignore that for now; you’ll fix the problem in the next section.
Adding buttons
The buttons all use images. So next, you’ll add the required images to the project by dragging and dropping all the image files from starter/resources/images into Assets.xcassets.
Under the Attributes panel, be sure to set Image Set ▸ Render As to Original Image. Otherwise, the images will render with a blue tint.
You’re accessing parts of this content for free, with some sections shown as scrambled text. Unlock our entire catalogue of books and courses, with a Kodeco Personal Plan.