Now that you’ve seen ARKit in action alongside Apple’s general-purpose 2D graphics framework, SpriteKit, it’s time to unlock the third dimension with SceneKit. In this chapter, you’ll continue to learn more about ARKit, but this time with a focus on using SceneKit as the rendering technology.
You’ll start by creating a new SceneKit-based AR project with Xcode. The project will grow into an interactive 3D augmented reality experience: a small AR Airport with basic animations.
The AR experience will borrow a few enterprise concepts, incorporating ideas like the Internet of Things (IoT) and the Digital Twin.
Don’t worry, the data component for this project is non-existent. Your main focus is to create an awesome AR experience that will serve as a fun little frontend.
What is SceneKit?
SceneKit is a high-performance rendering engine and 3D graphics framework. It’s built on top of Metal, which delivers the highest performance possible, and it leverages the power of Swift to offer a simple yet extremely powerful and expressive API. With it, you can easily import, manipulate and render 3D content, and its built-in physics simulation and animation capabilities make creating rich 3D experiences easier than ever.
Best of all, Apple’s platforms all support SceneKit and it integrates extremely well with other frameworks, like GameplayKit and SpriteKit.
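If you’ve never touched SceneKit before, here’s a small, self-contained taste of the API. This isn’t part of the chapter’s project; it simply builds a scene graph in code, with a spinning box and a light attached to the scene’s root node:

import UIKit
import SceneKit

// A tiny SceneKit taste: build a scene graph in code with a spinning
// box and an omnidirectional light. (Illustration only; the chapter's
// project loads its scenes from a SceneKit Asset Catalog instead.)
let scene = SCNScene()

let box = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                   length: 0.1, chamferRadius: 0.01))
box.geometry?.firstMaterial?.diffuse.contents = UIColor.systemBlue
box.runAction(.repeatForever(.rotateBy(x: 0, y: .pi, z: 0, duration: 2)))
scene.rootNode.addChildNode(box)

let lightNode = SCNNode()
lightNode.light = SCNLight()
lightNode.light?.type = .omni
lightNode.position = SCNVector3(0, 1, 1)
scene.rootNode.addChildNode(lightNode)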
Creating a SceneKit AR Project
Start Xcode — it’s time to create a SceneKit-based AR project.
Create a new project in Xcode and, when asked to select your template, choose iOS ▸ Application ▸ Augmented Reality App, then click Next to continue.
Change the Product Name to ARPort and choose SceneKit for the Content Technology. You’ll use a Storyboard UI, so leave the Interface as-is and leave the Language as Swift.
Finally, turn off Include Tests and click Next to continue.
Before doing anything else, take the app for a quick spin. Build and run the project to see what the boilerplate app gives you out of the box.
When the app starts, it uses the phone’s current position in real-world space as the world origin point. It then spawns a fighter jet at that exact position, so you’ll need to take a step back to see it. :]
Now, take a moment to explore the contents of the project.
Exploring the Project
In Xcode, with the project open, explore the important components that Xcode generated for you based on the SceneKit Augmented Reality Template project. Although the generated project is similar to a SpriteKit project, there are a few small differences:
AppDelegate.swift
This is the standard starting point of your app.
LaunchScreen.storyboard
The launch screen is a standard part of every app. It’s the first thing the user sees when the app launches. Here, you’ll represent your app with a beautiful splash image.
Main.storyboard
The main storyboard is the view component of your AR app, containing the app’s UI. This is a good place to put buttons and heads-up displays, for example.
Take particular note of the ARSCNView scene view class, which lets you overlay an AR scene over a live background image feed from the camera. It provides bridging information between ARKit and SceneKit. Also, note that the view is connected to an @IBOutlet defined in ViewController.swift.
ViewController.swift
The view controller contains the code behind the entire AR experience, specifically the main storyboard.
The ViewController inherits directly from the standard UIViewController, which provides the infrastructure for managing the views of a basic UIKit-based app.
It also adopts the ARSCNViewDelegate protocol from ARKit, which contains methods you can implement to synchronize your SceneKit content with your AR session.
Take special note of the @IBOutlet. It connects to the ARSCNView defined in Main.storyboard.
Look at viewDidLoad(), where the app loads and presents the default ship scene as an SCNScene.
viewWillAppear(_:) is where you create an ARWorldTrackingConfiguration instance. This configuration is provided to the view’s ARSession when the user starts the app.
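For reference, the generated view controller looks roughly like this. It’s a close approximation of the template’s code, so the exact details may vary slightly between Xcode versions:

import UIKit
import SceneKit
import ARKit

class ViewController: UIViewController, ARSCNViewDelegate {

  @IBOutlet var sceneView: ARSCNView!   // connected in Main.storyboard

  override func viewDidLoad() {
    super.viewDidLoad()
    sceneView.delegate = self
    sceneView.showsStatistics = true
    // Load the ship scene from the SceneKit Asset Catalog.
    let scene = SCNScene(named: "art.scnassets/ship.scn")!
    sceneView.scene = scene
  }

  override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    // Start world tracking when the view appears.
    let configuration = ARWorldTrackingConfiguration()
    sceneView.session.run(configuration)
  }

  override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    sceneView.session.pause()
  }
}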
art.scnassets
art.scnassets is a standard folder that was simply renamed by adding the .scnassets extension to it. This type of folder is known as a SceneKit Asset Catalog. Its purpose is to help you manage your game assets separately from the code.
Xcode will copy the contents of this folder to your app bundle at build time. Xcode will also preserve the folder hierarchy, giving you full control over the folder structure.
ship.scn
Within the SceneKit Asset Catalog, you’ll find a ship.scn file. This defines the SceneKit scene containing a model of the ship that you see when the AR experience starts.
Assets.xcassets
Here, you find your standard app assets like your app icon, for example.
Info.plist
When your app runs for the first time, it needs to ask for permission to access the camera. ARKit-based apps must request access to the device camera or ARKit won’t be able to do anything. The message shown in that permission prompt comes from the value of Privacy — Camera Usage Description in Info.plist.
Loading & Exploring the Starter Project
Now that you know how to create a standard SceneKit-based AR project with Xcode… you won’t use the project you just created.
To speed things up a bit, you’ll use a specially prepared project instead, which already has a few basic housekeeping things done for you. That way, you can focus on the important ARKit- and SceneKit-related parts.
Get started by opening ARPort.xcodeproj in the starter folder and checking out a few important components you need to know.
ViewController.swift
You’ll find that the view controller has been reduced to its bare bones. Don’t worry, you’ll fill in all the missing code later.
You’ll also find a few important marks within the code. These act as sections, separating the code into clean, manageable chunks that deal with specific parts of your app. They’ll also make it easier to locate the correct places to enter code as you work through the sections.
With all the logistics out of the way, you’ll start with some basic app management. The app requires a state machine to manage its lifecycle through various states. During each individual state, the app will focus on doing one particular thing.
DetectSurface: During this state, the AR session actively scans for nearby horizontal surfaces.
PointAtSurface: The AR session has successfully detected nearby horizontal surfaces. Now, the user has to point towards a detected surface to select a focal point for the action.
TapToStart: The user is pointing towards a detected surface and the focus node is visible. The focus node acts as a visual indicator, showing the user where the 3D content will appear when the AR experience starts.
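The starter project tracks these states in an appState property. A minimal sketch of the kind of enum behind it might look like this; the actual declaration in the starter may differ slightly:

// A minimal sketch of an app-state enum matching the states described
// above, plus the .Started state used once the AR experience is running.
enum AppState {
  case DetectSurface   // scan the real world for flat surfaces
  case PointAtSurface  // ask the user to point at a detected surface
  case TapToStart      // focus node visible; waiting for a tap
  case Started         // the AR experience is up and running
}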
Feedback helps the user know what the app is doing and what steps they need to take next. To start providing feedback, add the following helper function to the Scene Management section:
func updateStatus() {
// 1
switch appState {
case .DetectSurface:
statusMessage = "Scan available flat surfaces..."
case .PointAtSurface:
statusMessage = "Point at designated surface first!"
case .TapToStart:
statusMessage = "Tap to start."
case .Started:
statusMessage = "Tap objects for more info."
}
// 2
self.statusLabel.text = trackingStatus != "" ?
"\(trackingStatus)" : "\(statusMessage)"
}
This helper function keeps the user informed by:
Setting a statusMessage based on the current app state.
Displaying trackingStatus in the status label when it’s not empty; otherwise, falling back to statusMessage.
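For context, updateStatus() leans on a few properties that the starter project already declares. They look roughly like this; the names come from the code above, while the types and default values are assumptions:

var appState: AppState = .DetectSurface   // current lifecycle state
var trackingStatus: String = ""           // latest AR tracking message, if any
var statusMessage: String = ""            // current onboarding instruction
@IBOutlet var statusLabel: UILabel!       // status label from Main.storyboard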
Now that you’ve created the scene and ensured that the user will be kept informed of what the app’s doing, move on to the AR component.
Main.storyboard contains ARSCNView, which is basically a SceneKit view. It includes ARSession, which is responsible for motion tracking and image processing in ARKit. It’s session-based, which means you have to create an AR session instance, then you have to run that session to start the AR tracking process.
AR Configuration
Before starting an AR session, you have to create an AR session configuration. You use this configuration to establish the connection between the real world, where your device is, and the virtual 3D world, where your virtual content is.
isSupported checks if the device supports the required AR configuration. This is a good time to tell the user to upgrade their iPhone, if necessary! :]
Creates an ARWorldTrackingConfiguration instance assigned to config. This gives your app six degrees of freedom (6DOF) tracking, as well as tracking people, known images and objects.
Then sets a few configuration requirements:
a) worldAlignment: Setting it to gravity keeps the coordinate system’s y-axis parallel to gravity, with the origin set to the initial position of the camera.
b) providesAudioData: This disables capturing audio during the AR session. You don’t need to handle any audio.
c) planeDetection: You set it to horizontal, which specifies that the AR session should automatically detect horizontal flat surfaces. More on this in just a moment.
d) isLightEstimationEnabled: By setting this to true, you give the current AR session responsibility for providing scene lighting information.
e) environmentTexturing: Setting this to automatic lets the AR session automatically determine when and where to generate environment textures.
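Putting properties a) through e) together, the session setup being described looks roughly like the following sketch. It assumes the ARSCNView outlet is named sceneView and that the function is the initARSession() you’ll call from viewDidLoad() shortly; the starter’s exact listing may differ:

func initARSession() {
  // 1. Bail out on devices that can't do world tracking at all.
  guard ARWorldTrackingConfiguration.isSupported else {
    print("*** ARConfig: AR World Tracking Not Supported")
    return
  }

  // 2. Create the configuration and set the requirements from a) to e).
  let config = ARWorldTrackingConfiguration()
  config.worldAlignment = .gravity          // y-axis parallel to gravity
  config.providesAudioData = false          // no audio capture needed
  config.planeDetection = .horizontal       // look for flat, horizontal surfaces
  config.isLightEstimationEnabled = true    // estimate scene lighting
  config.environmentTexturing = .automatic  // generate environment textures

  // 3. Start the session with this configuration.
  sceneView.session.run(config)
}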
You can gain access to the existing AR configuration through the AR session’s configuration property. This casts the existing configuration back into an ARWorldTrackingConfiguration.
This ensures that planeDetection is still set to horizontal so the AR session will continue to automatically detect horizontal flat surfaces once it resets.
Finally, this runs the AR session with the following options:
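A typical way to start over with a clean slate is to re-run the configuration with the .resetTracking and .removeExistingAnchors options, roughly like this sketch; the starter’s exact listing may differ:

func resetARSession() {
  // Reuse the session's existing world-tracking configuration...
  guard let config = sceneView.session.configuration
    as? ARWorldTrackingConfiguration else { return }
  config.planeDetection = .horizontal

  // ...and restart tracking, discarding any previously detected anchors.
  sceneView.session.run(
    config,
    options: [.resetTracking, .removeExistingAnchors])
}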
Now that you can start and reset the AR session, you need to keep the user informed any time the AR session state changes. You have everything in place already; you just need to keep trackingStatus up-to-date with the latest information.
Add the following function override to the AR Session Management section:
func session(_ session: ARSession,
cameraDidChangeTrackingState camera: ARCamera) {
switch camera.trackingState {
case .notAvailable: self.trackingStatus =
"Tracking: Not available!"
case .normal: self.trackingStatus = ""
case .limited(let reason):
switch reason {
case .excessiveMotion: self.trackingStatus =
"Tracking: Limited due to excessive motion!"
case .insufficientFeatures: self.trackingStatus =
"Tracking: Limited due to insufficient features!"
case .relocalizing: self.trackingStatus =
"Tracking: Relocalizing..."
case .initializing: self.trackingStatus =
"Tracking: Initializing..."
@unknown default: self.trackingStatus =
"Tracking: Unknown..."
}
}
}
This interrogates the camera’s current tracking state and populates trackingStatus with an appropriate message to show the user.
Handling AR Session Issues
Finally, you need to keep the user informed when any issues occur. Again, you’ll use trackingStatus for this purpose.
Add the following functions adjacent to the AR Session Management section:
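A typical implementation of these ARSessionObserver callbacks looks something like the following sketch; the exact wording of the status messages is a placeholder:

func session(_ session: ARSession, didFailWithError error: Error) {
  // The session stopped because of an unrecoverable error.
  self.trackingStatus = "AR Session Failure: \(error.localizedDescription)"
}

func sessionWasInterrupted(_ session: ARSession) {
  // For example, the app moved to the background.
  self.trackingStatus = "AR Session Was Interrupted!"
}

func sessionInterruptionEnded(_ session: ARSession) {
  // Tracking resumes; consider resetting the session here.
  self.trackingStatus = "AR Session Interruption Ended"
}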
Once any of these session errors occur, trackingStatus is populated with an appropriate message that will be displayed to the user.
Now, to make sure the AR session actually initializes when the app starts, add a call to it at the bottom of viewDidLoad():
self.initARSession()
Also, connect the Reset button by adding the following line of code to resetButtonPressed(_:):
self.resetARSession()
Finally, uncomment the call to resetARSession(_:) inside resetApp().
Now, do another quick test. Build and run to see what the app looks like this time around.
The black screen has been replaced by the camera view. That’s because the AR session has been initialized and is now actively tracking for horizontal surfaces. Pressing the Reset button will also work, restarting the tracking when you press it.
Excellent, you’re making great progress!
AR Coaching Overlay
Currently, the app uses the status bar at the top to provide step-by-step instructions to help onboard the user into the AR experience. However, your approach to this onboarding process might differ entirely from another developer’s. This causes massive fragmentation in AR experiences as the user switches from one experience to another.
Apple is solving this fragmentation with the AR Coaching Overlay View.
What is an AR Coaching Overlay View?
Apple now provides a special overlay view known as the ARCoachingOverlayView. You can easily integrate it into your existing AR experiences to provide the user with a standardized AR onboarding process.
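Hooking it up only takes a few lines. Here’s a minimal sketch of the typical setup, assuming the same sceneView outlet as before and a hypothetical initCoachingOverlayView() helper; the project’s actual integration may differ:

func initCoachingOverlayView() {
  let coachingOverlay = ARCoachingOverlayView()
  coachingOverlay.session = sceneView.session     // drive it from the AR session
  coachingOverlay.activatesAutomatically = true   // appear whenever tracking needs help
  coachingOverlay.goal = .horizontalPlane         // coach the user to find a flat surface
  coachingOverlay.frame = sceneView.bounds
  coachingOverlay.autoresizingMask = [.flexibleWidth, .flexibleHeight]
  sceneView.addSubview(coachingOverlay)
}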
Congratulations, you’ve reached the end of this chapter — and your app is shaping up nicely.
Take a look at some key points you’ve covered so far:
SceneKit: It’s easy to create a new SceneKit-based AR experience by using Xcode’s available AR app templates.
Key App Components: You looked under the hood and learned how key components play their roles in the Xcode project.
App State Management: You learned how to implement basic app state management for a typical AR experience.
Scene Creation: You created a basic SceneKit scene.
AR Session Management: Creating, running and resetting an AR session is fairly simple.
AR Coaching Overlay View: Making your AR experience conform to Apple’s standard onboarding process is as simple as adding an AR Coaching Overlay View to your app.
Now, you’ve gotten all the groundwork out of the way. In the next chapter, you’ll focus on creating the actual AR experience. See you there!