Up to this point, you’ve been running projects and playgrounds that only had one render pass. In other words, you used a single command encoder to submit all of your draw calls to the GPU.
For more complex apps, you may need multiple render passes before presenting the texture to the screen, letting you use the result of one pass in the next one. You may even need to render content offscreen for later use.
With multiple render passes, you can render a scene with multiple lights and shadows, like this:
Take note, because in this chapter, you’ll be creating that scene. Along the way, you’ll learn a few key concepts, such as:
Shadow maps.
Multipass rendering.
Deferred rendering with a G-buffer.
The blit command encoder.
You’ll start with shadows first.
Shadow maps
A shadow represents the absence of light on a surface. A shadow is present on an object when another surface or object obscures it from the light. Having shadows in a project makes your scene look more realistic and provides a feeling of depth.
Shadow maps are nothing more than textures containing shadow information about the scene. When a light shines on an object, anything that is behind that object gets a shadow cast on it.
Typically, you render the scene from the location of your camera. To build a shadow map, however, you need to render your scene from the location of the light source, which in this case is the sun.
The image on the left shows a render from the position of the camera with the directional light pointing down. The image on the right shows a render from the position of the directional light.
The eye shows where the camera was positioned in the first image.
You’ll do two render passes:
First pass: Using a separate view matrix holding the sun’s position, you’ll render from the point of view of the light. Because you’re not interested in color at this stage, only the depth of objects that the sun can see, you’ll only render a depth texture in this pass. This is a grayscale texture, with the gray value indicating depth. Black is close to the light and white is far away.
Second pass: You’ll render using the camera as usual, but you’ll compare each fragment against the depth map. If the fragment’s depth, measured from the light, is greater (lighter in color) than the value stored in the depth map at that position, the fragment is in shadow. The light can “see” the blue x in the above image, so it is not in shadow.
Shadows and deferred rendering are complex subjects, so there’s a starter project available for this chapter. Open it in Xcode and take a look around.
The code is similar to what’s available at the end of Chapter 5, “Lighting Fundamentals”.
For simplicity, you’ll be working on the diffuse color only; specularity and ambient lighting are not included with this project.
Build and run the project, and you’ll see a train and a tree model, both on top of a plane:
Add these properties in Renderer.swift, at the top of Renderer:
var shadowTexture: MTLTexture!
let shadowRenderPassDescriptor = MTLRenderPassDescriptor()
Later, when you create the render command encoder for drawing the shadow, you’ll use this render pass descriptor. Each render pass descriptor can have up to eight color textures attached to it, plus a depth texture and a stencil texture. The shadowRenderPassDescriptor points to shadowTexture as a depth attachment.
You’ll need several textures through the course of the chapter, so create a helper method for building them.
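The helper itself isn’t reproduced in this excerpt. Here’s a minimal sketch of what it might look like; the method name buildTexture(pixelFormat:size:label:) and the static Renderer.device property are assumptions about the starter project:

// Hypothetical helper: creates a render-target texture of the given size.
func buildTexture(pixelFormat: MTLPixelFormat, size: CGSize,
                  label: String) -> MTLTexture {
  let descriptor = MTLTextureDescriptor.texture2DDescriptor(
    pixelFormat: pixelFormat,
    width: Int(size.width), height: Int(size.height),
    mipmapped: false)
  // Attachments of a render pass descriptor must be render targets.
  descriptor.usage = [.shaderRead, .renderTarget]
  // Private storage: only the GPU can touch this memory.
  descriptor.storageMode = .private
  guard let texture = Renderer.device.makeTexture(descriptor: descriptor) else {
    fatalError("Failed to create \(label) texture")
  }
  texture.label = "\(label) texture"
  return texture
}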
In this method, you configure a texture descriptor and create a texture using that descriptor. Textures used by render pass descriptors have to be configured as render targets. Render targets are memory buffers or textures that allow offscreen rendering for cases where the rendered pixels don’t need to end up in the framebuffer. The storage mode is private, meaning the texture is stored in memory in a place that only the GPU can access.
Next, add the following to the bottom of the file:
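The code block is not included in this excerpt; a sketch of what the extension could look like, with setUpDepthAttachment(texture:) as an assumed method name:

extension MTLRenderPassDescriptor {
  // Assumed helper: attach the given texture as this pass's depth attachment.
  func setUpDepthAttachment(texture: MTLTexture) {
    depthAttachment.texture = texture
    // Clear the texture when the pass starts...
    depthAttachment.loadAction = .clear
    // ...and keep its contents when the pass ends, so a later pass can read it.
    depthAttachment.storeAction = .store
    depthAttachment.clearDepth = 1
  }
}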
This creates a new extension on MTLRenderPassDescriptor with a new method that sets up the depth attachment of a render pass descriptor and configures it to store the provided texture. This is where you’ll attach shadowTexture to shadowRenderPassDescriptor.
You’re creating a separate method because you’ll have other render pass descriptors later in the chapter. The load and store actions describe what action the attachment should take at the start and end of the render pass. In this case, you clear the texture at the beginning of the pass and store the texture at the end of the pass.
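The buildShadowTexture(size:) method referenced next isn’t shown either; based on the description, it combines the two helpers along these lines (the .depth32Float pixel format is an assumption):

func buildShadowTexture(size: CGSize) {
  // Build the depth texture and attach it to the shadow render pass descriptor.
  shadowTexture = buildTexture(pixelFormat: .depth32Float, size: size,
                               label: "Shadow")
  shadowRenderPassDescriptor.setUpDepthAttachment(texture: shadowTexture)
}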
This builds the depth texture by calling the two helper methods you just created. Next, call this method at the end of init(metalView:):
buildShadowTexture(size: metalView.drawableSize)
Also, call it at the end of mtkView(_:drawableSizeWillChange:) so that when the user resizes the window, you can rebuild the textures with the correct size:
buildShadowTexture(size: size)
Build and run the project to make sure everything works. You won’t see any visual changes yet; you’re just verifying things are error-free before moving on to the next task.
Multipass rendering
A render pass consists of sending commands to a command encoder. The pass ends when you end encoding on that command encoder. Multipass rendering uses multiple command encoders and facilitates rendering content in one render pass and using the output of this pass as the input of the next render pass.
Why would you need the second pass? Because in that pass, you’ll render the shadow from the light’s position, not from the camera’s position. You’ll then save the output to a shadow texture and pass it on to the next render pass, which combines the shadow with the rest of the scene to make up a final image.
Since you already have code that renders the scene, you can easily refactor that code into a new function that you can call for shadows as well.
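As a rough sketch of that structure, a multipass draw(in:) hands the same command buffer to one encoder after another; renderShadowPass and renderMainPass are placeholder names here, and Renderer.commandQueue is an assumption about the starter project:

guard let commandBuffer = Renderer.commandQueue.makeCommandBuffer(),
      let viewDescriptor = view.currentRenderPassDescriptor else { return }

// Pass 1: render depth from the light's point of view into shadowTexture.
guard let shadowEncoder = commandBuffer.makeRenderCommandEncoder(
  descriptor: shadowRenderPassDescriptor) else { return }
renderShadowPass(renderEncoder: shadowEncoder)   // placeholder
shadowEncoder.endEncoding()

// Pass 2: render the scene as usual, reading shadowTexture in the fragment shader.
guard let mainEncoder = commandBuffer.makeRenderCommandEncoder(
  descriptor: viewDescriptor) else { return }
renderMainPass(renderEncoder: mainEncoder)       // placeholder
mainEncoder.endEncoding()

guard let drawable = view.currentDrawable else { return }
commandBuffer.present(drawable)
commandBuffer.commit()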
The shadow pass
During the shadow pass, you’ll be rendering from the point of view of the sun, so you’ll need a new view matrix.
In Common.h, add this to the Uniforms struct:
matrix_float4x4 shadowMatrix;
You’ll also need a new pipeline state to hold the different vertex function you’ll be running.
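The pipeline-building code isn’t shown here. A minimal sketch of what it might look like; the property name shadowPipelineState, the vertex_depth shader function and the Renderer.library property are all assumptions:

func buildShadowPipelineState(vertexDescriptor: MTLVertexDescriptor) {
  // Depth-only pipeline: a vertex function, but no fragment function
  // and no color attachment.
  let descriptor = MTLRenderPipelineDescriptor()
  descriptor.vertexFunction = Renderer.library.makeFunction(name: "vertex_depth")
  descriptor.fragmentFunction = nil
  descriptor.colorAttachments[0].pixelFormat = .invalid
  descriptor.depthAttachmentPixelFormat = .depth32Float
  descriptor.vertexDescriptor = vertexDescriptor
  do {
    shadowPipelineState = try Renderer.device.makeRenderPipelineState(
      descriptor: descriptor)
  } catch {
    fatalError(error.localizedDescription)
  }
}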
This creates a pipeline state without a color attachment or fragment function. Remember, at this point, you’re only interested in depth information, not color information.
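The shadow pass encoding itself isn’t reproduced in this excerpt. It would look roughly like this; shadowPipelineState, depthStencilState, uniforms, models, buildSunViewMatrix() and draw(renderEncoder:model:) are assumptions standing in for whatever the starter project actually provides:

func renderShadowPass(renderEncoder: MTLRenderCommandEncoder) {
  renderEncoder.pushDebugGroup("Shadow pass")
  renderEncoder.label = "Shadow encoder"
  renderEncoder.setRenderPipelineState(shadowPipelineState)
  renderEncoder.setDepthStencilState(depthStencilState)
  // Look at the scene from the sun's position, and remember the combined
  // matrix in uniforms.shadowMatrix so the main pass can use it.
  // (A directional light would typically use an orthographic projection here.)
  uniforms.viewMatrix = buildSunViewMatrix()          // hypothetical helper
  uniforms.shadowMatrix = uniforms.projectionMatrix * uniforms.viewMatrix
  for model in models {
    draw(renderEncoder: renderEncoder, model: model)  // the refactored scene draw
  }
  renderEncoder.popDebugGroup()
  renderEncoder.endEncoding()
}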
Click the Capture GPU Frame button on the debug bar (circled in red below):
In the Debug navigator, click on the Shadow pass group:
Excellent, the app window shows the shadow map. Fun! :]
This is the scene rendered from the light’s position. You used the shadow pipeline state, which you configured not to have a fragment shader, so the color information is not presented here at all; it’s merely depth. Farther pixels are lighter gray, and nearer pixels are darker.
The main pass
Now that you have the shadow map saved to a texture, all you need to do is send it to the next pass — the main pass — so you can use the texture in lighting calculations in the fragment function.
In draw(in:), before the for loop in the main pass, add this code:
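The code itself isn’t included in this excerpt. Given the description that follows, it presumably makes the shadow map available to the fragment function, along these lines (the texture index is an assumption):

// Bind the shadow map so the main pass's fragment function can sample it.
renderEncoder.setFragmentTexture(shadowTexture, index: 0)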
Determine a coordinate pair from the shadow position that will serve as a screen space pixel located on the shadow texture. Then, you translate the coordinates from [-1, 1] to [0, 1]. Finally, you reverse the Y coordinate since it’s upside down.
Create a sampler to use with the shadow texture, and sample the texture at the coordinates you just created. Get the depth value for the currently processed pixel.
Until now, you’ve only been using forward rendering. But assume you have a hundred models (or instances) and a hundred lights in the scene; picture a metropolitan downtown, where the buildings and street lights could easily add up to those numbers.
With forward rendering, you render all of the models and process all of the lights in the fragment shader, for every single fragment, even though the model itself may obscure a particular fragment. This can easily become a quadratic runtime problem that seriously decreases the performance of your app.
Deferred rendering, on the other hand, does two things:
It collects information such as position, normals and materials from the models and stores them in a special buffer, traditionally named the G-buffer (G is for Geometry), for later processing in the fragment shader; by that time, the GPU only keeps visible fragments, so unnecessary computation does not occur.
It processes all of the lights in a fragment shader, but only on the final visible fragments.
This approach turns the quadratic runtime into linear runtime, since the lights’ processing loop is only performed once, and not for each model.
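The chapter’s G-buffer code isn’t included in this excerpt. As a rough sketch of the idea only, reusing the assumed buildTexture helper sketched earlier, with the texture names and pixel formats as assumptions:

let gBufferRenderPassDescriptor = MTLRenderPassDescriptor()

func buildGBufferTextures(size: CGSize) {
  // One render pass, three color render targets plus depth.
  albedoTexture = buildTexture(pixelFormat: .bgra8Unorm, size: size,
                               label: "Albedo")
  normalTexture = buildTexture(pixelFormat: .rgba16Float, size: size,
                               label: "Normal")
  positionTexture = buildTexture(pixelFormat: .rgba16Float, size: size,
                                 label: "Position")
  depthTexture = buildTexture(pixelFormat: .depth32Float, size: size,
                              label: "Depth")
  let textures: [MTLTexture] = [albedoTexture, normalTexture, positionTexture]
  for (index, texture) in textures.enumerated() {
    gBufferRenderPassDescriptor.colorAttachments[index].texture = texture
    gBufferRenderPassDescriptor.colorAttachments[index].loadAction = .clear
    gBufferRenderPassDescriptor.colorAttachments[index].storeAction = .store
  }
  gBufferRenderPassDescriptor.setUpDepthAttachment(texture: depthTexture)
}

The matching pipeline state would then declare the same three color pixel formats, so the fragment function can write to all three render targets in a single pass.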
If you don’t see the textures, click the Navigate to Related Items icon at the top left of the editor pane and choose Automatic ▸ Attachments.
You can render these textures directly to the app window using a type of encoder called the blit command encoder.
The blit command encoder
To blit means to copy from one part of memory to another. You use a blit command encoder on resources such as textures and buffers. It’s generally used for image processing, but you can (and will) also use it to copy image data that is rendered offscreen.
In Renderer.swift, add this code in draw(in:) after // blit:
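The block isn’t reproduced in this excerpt; it would look something like the following, assuming the G-buffer’s color texture is stored in albedoTexture:

// Copy the offscreen albedo texture into the drawable's texture.
guard let blitEncoder = commandBuffer.makeBlitCommandEncoder(),
      let drawable = view.currentDrawable else { return }
let origin = MTLOrigin(x: 0, y: 0, z: 0)
let size = MTLSize(width: Int(view.drawableSize.width),
                   height: Int(view.drawableSize.height),
                   depth: 1)
blitEncoder.copy(from: albedoTexture,
                 sourceSlice: 0, sourceLevel: 0,
                 sourceOrigin: origin, sourceSize: size,
                 to: drawable.texture,
                 destinationSlice: 0, destinationLevel: 0,
                 destinationOrigin: origin)
blitEncoder.endEncoding()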
Here, you create a blit command encoder and copy from the albedo texture to the view’s current drawable.
In init(metalView:), add this code to allow the view to render blitted textures:
metalView.framebufferOnly = false
Build and run the project, and you should see the albedo texture rendered to the window this time:
You can see how fast the blit is; it’s rendering every frame without a glitch. However, the best trait of deferred rendering is having multiple lights in the scene, so make sure to fill it with lights!
The lighting pass
Up to this point, you’ve rendered the scene color attachments to multiple render targets, saving them for later use in the fragment shader. This ensured that only the visible fragments get processed, reducing the amount of calculation you would otherwise have done for all of the geometry in the scene.
Computing lighting is expensive, so it’s best done in a single composition pass that only runs once for each visible fragment, preserving performance.
By rendering a full-screen quad, you’ll render to every fragment on the screen. This allows you to process each fragment from your three textures and calculate lighting for each fragment. The results of this computation will end up in the view’s drawable.
At the top of Renderer, declare a new render pipeline state, the quad buffers, and create the arrays that hold the quad vertices and texture coordinates:
var compositionPipelineState: MTLRenderPipelineState!
var quadVerticesBuffer: MTLBuffer!
var quadTexCoordsBuffer: MTLBuffer!
let quadVertices: [Float] = [
-1.0, 1.0,
1.0, -1.0,
-1.0, -1.0,
-1.0, 1.0,
1.0, 1.0,
1.0, -1.0
]
let quadTexCoords: [Float] = [
0.0, 0.0,
1.0, 1.0,
0.0, 1.0,
0.0, 0.0,
1.0, 0.0,
1.0, 1.0
]
At the end of init(metalView:), create the two quad buffers you declared above:
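Not shown in this excerpt; assuming the device is reachable as Renderer.device, the two buffers would be created roughly like this:

quadVerticesBuffer = Renderer.device.makeBuffer(
  bytes: quadVertices,
  length: MemoryLayout<Float>.stride * quadVertices.count,
  options: [])
quadVerticesBuffer.label = "Quad vertices"

quadTexCoordsBuffer = Renderer.device.makeBuffer(
  bytes: quadTexCoords,
  length: MemoryLayout<Float>.stride * quadTexCoords.count,
  options: [])
quadTexCoordsBuffer.label = "Quad texCoords"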
There’s a pre-defined method in one of the starter project’s extension files for creating random point lights, so you’ll start by adding twenty point lights to the scene.
In Renderer.swift, in init(metalView:), after this line:
lights.append(sunlight)
Add the following code to create twenty lights in the scene, randomly spread around the plane:
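The loop isn’t shown in this excerpt. A sketch of the idea, with createPointLight(position:color:) standing in for whatever the project’s pre-defined light helper is really called, and float3 for the project’s SIMD3<Float> alias:

// Scatter twenty point lights with random positions and colors around the plane.
for _ in 0..<20 {
  let position = float3(Float.random(in: -5...5), 0.3, Float.random(in: -5...5))
  let color = float3(Float.random(in: 0...1),
                     Float.random(in: 0...1),
                     Float.random(in: 0...1))
  lights.append(createPointLight(position: position, color: color))
}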
This creates lots of extra light in the scene, so you can reduce the intensity of the sun. Find where you set up the sunlight property, and change light.intensity to a much lower value.
Build and run, and you’ll see twenty point lights in your scene rendering, without a glitch!
So far so good, but what if you wanted to render hundreds of lights instead of just a few dozen?
To render more lights, you might need to replace setFragmentBytes(_:length:index:) with setFragmentBuffer(_:offset:index:) in renderCompositionPass(renderEncoder:), because you’re only allowed to send up to 4KB of data with the former.
Add a new buffer for lights at the top of Renderer:
var lightsBuffer: MTLBuffer!
At the end of init(metalView:), initialize the buffer:
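Not shown in this excerpt; a sketch, assuming the Light struct already defined in Common.h and an arbitrary buffer index of 2:

// Copy the lights array into a GPU buffer once, instead of sending raw bytes.
lightsBuffer = Renderer.device.makeBuffer(
  bytes: lights,
  length: MemoryLayout<Light>.stride * lights.count,
  options: [])

// Later, in renderCompositionPass(renderEncoder:), bind the buffer
// in place of the setFragmentBytes(_:length:index:) call:
renderEncoder.setFragmentBuffer(lightsBuffer, offset: 0, index: 2)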
If you’re trying to improve your app’s performance, you can try a few approaches. One is to render the lights as light volumes and use stencil tests to select only the lights that affect each fragment, rendering just those lights instead of all of them.
In the next chapter, you’re in for some more advanced GPU topics!