Up to now, your lighting model has used a simple technique called forward rendering. With traditional forward rendering, you draw each model in turn. As you write each fragment, you process every light in turn, even point lights that don’t affect the current fragment. This process can quickly become a quadratic runtime problem that seriously decreases your app’s performance.
Assume you have a hundred models and a hundred lights in the scene. In a metropolitan downtown, for example, the number of buildings and street lights could quickly reach those figures. At that point, you'd be looking for an alternative rendering technique.
Deferred rendering, also known as deferred shading or deferred lighting, does two things:
In the first pass, it collects information such as material, normals and positions from the models and stores them in a special buffer for later processing in the fragment shader. Unnecessary calculations don’t occur in this first pass. The special buffer is named the G-buffer, where G is for Geometry.
In the second pass, it processes all lights in a fragment shader, but only where the light affects the fragment.
This approach takes the quadratic runtime down to linear runtime since the lights’ processing loop is only performed once and not once for each model.
Look at the forward rendering algorithm:
// single pass
for each model {
  for each fragment {
    for each light {
      if directional { accumulate lighting }
      if point { accumulate lighting }
      if spot { accumulate lighting }
    }
  }
}
You implemented this algorithm in Chapter 10, “Lighting Fundamentals”.
In forward rendering, you process both lights for the magnified fragments in the image above even though the blue light on the right won’t affect them.
Now, compare it to the deferred rendering algorithm:
// pass 1 - g-buffer capture
for each model {
  for each fragment {
    capture color, position, normal and shadow
  }
}

// pass 2 - light accumulation
render a quad
for each fragment { accumulate directional light }
render geometry for point light volumes
for each fragment { accumulate point light }
render geometry for spot light volumes
for each fragment { accumulate spot light }
While you have more render passes with deferred rendering, you process fewer lights. All fragments process the directional light, which shades the albedo along with adding the shadow from the directional light. But for the point light, you render special geometry that only covers the area the point light affects. The GPU will process only the affected fragments.
Here are the steps you’ll take throughout this chapter:
The first pass renders the shadow map. You’ve already done this.
The second pass constructs G-buffer textures containing these values: material color (or albedo) with shadow information, world space normals and positions.
Using a full-screen quad, the third and final pass processes the directional light. The same pass then renders point light volumes and accumulates point light information. If you have spotlights, you would repeat this process.
Note: Apple GPUs can combine the second and third passes. Chapter 15, “Tile-Based Deferred Rendering”, will revise this chapter’s project to take advantage of this feature.
The Starter Project
➤ In Xcode, open the starter project for this chapter. The project is almost the same as the end of the previous chapter, with some refactoring and reorganization. There’s new lighting, with extra point lights. The camera and light debugging features from the previous chapter are gone.
Take note of the following additions:
In the Game group, in SceneLighting.swift, createPointLights(count:min:max:) creates multiple point lights.
Since you’ll deal with many lights, the light buffer is greater than 4k. This means that you won’t be able to use setFragmentBytes(_:length:index:). Instead, scene lighting is now split out into three light buffers: one for sunlight, one for point lights and one that contains both sun and point lights, so that forward rendering still works as it did before. Spotlighting isn’t implemented here.
In the Render Passes group, GBufferRenderPass.swift is a copy of ForwardRenderPass.swift and is already set up in Renderer. You’ll work on this render pass and change it to suit deferred rendering.
In the app, a radio button below the Metal view gives you the option to switch between render pass types. There won't be any difference in the render at this point.
For simplicity, the renderer returns to Phong shading rather than processing textures for PBR.
In the Shaders group, in Lighting.metal, phongLighting’s conditional code is refactored into separate functions, one for each lighting method.
icosphere.obj is a new model you’ll use later in the chapter.
➤ Build and run the app, and ensure that you know how all of the code fits together.
The twenty point lights are random, so your render may look slightly different.
Note: To visualize where the point lights are, uncomment the DebugLights draw at the end of ForwardRenderPass.swift. You’ll see the point light positions when you choose the Forward option in the app.
The G-buffer Pass
All right, time to build up that G-buffer!
➤ In the Render Passes group, open GBufferRenderPass.swift, and add four new texture properties to GBufferRenderPass:
var albedoTexture: MTLTexture?
var normalTexture: MTLTexture?
var positionTexture: MTLTexture?
var depthTexture: MTLTexture?
➤ In the Shaders group, create a new Metal File named Deferred.metal. Add this code to the new file:
#import "Vertex.h"
#import "Lighting.h"
fragment float4 fragment_gBuffer(
  VertexOut in [[stage_in]],
  depth2d<float> shadowTexture [[texture(ShadowTexture)]],
  constant Material &material [[buffer(MaterialBuffer)]])
{
  return float4(material.baseColor, 1);
}
Here, you take in the results of the vertex function, the shadow texture from the shadow render pass, and the object's material. You return the base color of the material so that you'll be able to see something in the render.
➤ Build and run the app.
Currently, you aren't drawing anything to the view's drawable, only to the G-buffer render pass's descriptor textures. So you'll get something random in your app window. Mine comes out a bright shade of magenta.
➤ Zubfena kje QDI zorgquog, uyz hwudv hqi Cowyoll Moqmep zo loo ynew’n jixdebicq jholi.
Ruo’xn pjojewrx zen ej azyek rgohosx qdiq uk yaw’j karcasm e leteikfe, tep eydefo knil six vye viwavk.
Dweh nkaz papocd, zao xek qae yboh giu kucfudkgurhq cgesef mba zbatem xocbuya nfim nru rqoguk geqg ix kett os mfu wnque fiyib axp zeldf nilhayed kbuc luop W-cuqrun guyr, hxuiful xe jaeg hgv fvoe hewan.
➤ Ezev Zagifwem.zenuz, owj adl a dus cngojhoga jonele npaqtimy_fDefleh:
struct GBufferOut {
  float4 albedo [[color(RenderTargetAlbedo)]];
  float4 normal [[color(RenderTargetNormal)]];
  float4 position [[color(RenderTargetPosition)]];
};
➤ Build and run the app, and capture the GPU workload.
fragment_gBuffer now writes to your three color textures.
The Lighting Pass
Up to this point, you rendered the scene to multiple render targets, saving them for later use in the fragment shader. By rendering a full-screen quad, you can cover every pixel on the screen. This lets you process each fragment from your three textures and calculate lighting for each fragment. The results of this composition pass will end up in the view’s drawable.
➤ Create a new Swift file named LightingRenderPass in the Render Passes group. Replace the contents with:
import MetalKit
struct LightingRenderPass: RenderPass {
  let label = "Lighting Render Pass"
  var descriptor: MTLRenderPassDescriptor?
  var sunLightPSO: MTLRenderPipelineState
  let depthStencilState: MTLDepthStencilState?
  weak var albedoTexture: MTLTexture?
  weak var normalTexture: MTLTexture?
  weak var positionTexture: MTLTexture?

  func resize(view: MTKView, size: CGSize) {}

  func draw(
    commandBuffer: MTLCommandBuffer,
    scene: GameScene,
    uniforms: Uniforms,
    params: Params
  ) {
  }
}
With this code, you add the necessary conformance to RenderPass and the texture properties you need for this composition pass.
You'll accumulate output from all three light types when rendering the lighting pass. Each type of light needs a different fragment function, so you'll need multiple pipeline states. First, you'll create a pipeline state object for rendering the sun's directional light and, a little later, you'll add a point light pipeline state object.
➤ Open Pipelines.swift, and copy createForwardPSO(colorPixelFormat:) to a new method named createSunLightPSO(colorPixelFormat:).
Instead of rendering models, you'll render a quad for the lighting pass. You can remove the vertex descriptor from the PSO and create a simple vertex function.
➤ In createSunLightPSO(colorPixelFormat:), replace "vertex_main" with:
"vertex_quad"
This vertex function is responsible for positioning the quad vertices.
Here, you send the textures to the lighting pass and set the render pass descriptor. You then process the lighting render pass.
You've set up everything on the CPU side. Now it's time to move to the GPU.
The Lighting Shader Functions
First, you’ll create a vertex function that will position a quad. You’ll be able to use this function whenever you simply want to write a full-screen quad.
➤ Open Deferred.metal, and add an array of six vertices for the quad:
Inside your starter project is a model named icosphere.obj. You'll render one of these for each point light.
The icosphere is a low-resolution sphere with only forty vertices. Compare it to a UV sphere. The icosphere's faces are more regular and all have a similar area, whereas the UV sphere's faces are smaller at the two top and bottom poles.
This app assumes that all point lights have the same radius attenuation, which will define the icosphere. If a point light has a larger radius, the icosphere's straight edges might cut it off. You could also add more vertices to the icosphere, making it rounder, but that would make the rendering less efficient.
➤ Open LightingRenderPass.swift, and add a new property to LightingRenderPass:
var icosphere = Model(name: "icosphere.obj")
You initialize the sphere for later use.
Now you need a new pipeline state object with new shader functions to render the sphere.
➤ Open Pipelines.swift, and copy createForwardPSO(colorPixelFormat:) to a new method named createPointLightPSO(colorPixelFormat:).
➤ Change the vertex function's name to "vertex_pointLight" and the fragment function's name to "fragment_pointLight".
Later, you'll need to add blending to the light accumulation. The pipeline state is how you tell the GPU that you require blending, so shortly, you'll add this to the pipeline state object.
guard let mesh = icosphere.meshes.first,
  let submesh = mesh.submeshes.first else { return }
for (index, vertexBuffer) in mesh.vertexBuffers.enumerated() {
  renderEncoder.setVertexBuffer(
    vertexBuffer,
    offset: 0,
    index: index)
}
You set up the vertex buffers with the icosphere's mesh attributes.
Instancing
If you had one thousand point lights, a draw call for each light volume would bring your system to a crawl. Instancing is a great way to tell the GPU to draw the same geometry a specific number of times. The GPU informs the vertex function which instance it’s currently drawing so that you can extract information from arrays containing instance information.
In SceneLighting, you have an array of point lights with the position and color. Each of these point lights is an instance. You'll draw the icosphere mesh for each point light.
Here, you specify that you require blending. You shouldn't have this on by default because blending is an expensive operation. The other properties determine how to combine the source and destination fragments.
All these blending properties are at their defaults, except for destinationRGBBlendFactor. They're written out in full so that you know what you can change. The important change is destinationRGBBlendFactor from zero to one, so blending will occur.
➤ Continue gradually increasing count in the previous code until your rendering frame rate decreases below 60 FPS. Make a note of the number of lights for comparison.
➤ Increase count and check how many point lights your deferred renderer can manage before the frame rate drops below 60 FPS. Point lights are so bright that you may have to tone down the light color in createPointLights(count:min:max:) by scaling light.color down.
On my M1 Mac mini, performance in a small window starts degrading on the forward renderer at a few hundred lights, whereas the deferred renderer copes with more than ten thousand. With the window maximized, forward rendering starts degrading at only a few dozen lights, whereas the deferred renderer manages several hundred. On an iPhone 12 Pro, the forward renderer degrades at a few hundred lights, whereas the deferred renderer can manage a few thousand.
This chapter has opened your eyes to the different rendering techniques: forward and deferred. In the same renderer, you get to choose your rendering method. Forward and deferred rendering are just two: Several other techniques can help you get the most out of your frame time.
There are also many ways of optimizing your forward and deferred render passes. references.markdown in the resources folder for this chapter has a few links for further research.
In the next chapter, you'll learn how to make your deferred render pass even faster by taking advantage of Apple's own silicon.
Key Points
Forward rendering processes all lights for all fragments.
Deferred rendering captures albedo, position and normals for later light calculation. For point lights, only the necessary fragments are rendered.
The G-buffer, or Geometry Buffer, is a conventional term for the albedo, position and normal textures, plus any other information you capture during a first pass.
An icosphere model provides a volume for rendering the shape of a point light.
Using instancing, the GPU can efficiently render the same geometry many times.
The pipeline state object specifies whether the result from the fragment function should be blended with the currently attached texture.
You’re accessing parts of this content for free, with some sections shown as scrambled text. Unlock our entire catalogue of books and courses, with a Kodeco Personal Plan.