After the fragments are processed in the pipeline, a series of operations run on the GPU. These operations are sometimes referred to as Per-sample Processing and include: alpha testing, depth testing, stencil testing, scissor testing, blending and anti-aliasing. You’ve already encountered a few of these operations in earlier chapters, such as depth testing and stencil testing. Now it’s time to revisit those concepts while also learning about the others.
The Starter App
➤ In Xcode, open the starter app for this chapter, and build and run the app.
The standard forward renderer renders the scene containing a ground plane and a tree. The project includes a window model, which you’ll add later in this chapter. You can use the options at the top-left of the screen to toggle the post-processing effects. Those effects aren’t active yet, but they will be soon!
Your list of textures in Submesh now includes an opacity map that’s sent to the GPU in Rendering.swift. Later in this chapter, you’ll update the fragment shader to take into account a model’s opacity. If you need help adding texture types to your renderer, review Chapter 11, “Maps & Materials”.
Using Booleans in a C Header File
In Renderer.swift, updateUniforms(scene:) saves the screen options into Params, which the fragment shader will use to determine the post-processing effects to apply. While the Metal Shading Language includes a Boolean type (bool), this type is not available in C header files by default. The Shaders group included with this starter project contains stdbool.h. This file defines bool, which Common.h imports and then uses to define the Boolean parameters in Params.
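The idea can be sketched in plain C. This is a minimal header-style example, not the book's actual Common.h: the struct name and helper are illustrative, though the chapter does use flags named scissorTesting, alphaBlending and fog in its Params.

```c
// Minimal sketch: stdbool.h makes the bool type available in a C
// header that both Swift and the Metal Shading Language can import.
#include <stdbool.h>

// Illustrative stand-in for the chapter's Params struct.
typedef struct {
  bool alphaBlending;
  bool scissorTesting;
  bool fog;
} ExampleParams;

// Returns true when any post-processing flag is switched on.
static inline bool anyEffectEnabled(ExampleParams params) {
  return params.alphaBlending || params.scissorTesting || params.fog;
}
```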
Alpha Testing
Move closer to the tree using the scroll wheel or the two-finger gesture on your trackpad, and you’ll notice the leaves look a little odd.
The leaf color texture in tree.usdz looks like this:
The area of the texture surrounding the leaf is transparent, but it renders as either white or black, depending on the device.
To make the leaves look more natural, you'll render the transparent part of the texture as transparent in the scene. However, before making this change, it's important to understand the difference between transparent, translucent, and opaque objects.
A transparent object allows light to directly pass through it. A translucent object scatters light as it passes through it. An opaque object does not allow any light to pass through it. Most objects in nature are opaque. Objects like water, glass and plastic are translucent.
Colored pixels are formed using a combination of the three primary colors: red, green and blue — hence the color scheme RGB. However, there's a fourth component you can add to the color definition: alpha. Alpha ranges from 0 (fully transparent) to 1 (fully opaque). A common practice of determining transparency is to check the alpha property and ignore values below a certain threshold. This technique is known as alpha testing.
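As a rough sketch (in C, since MSL is C++-based), alpha testing reduces to a threshold comparison. The 0.5 cutoff in the test below is an illustrative value, not one the chapter prescribes:

```c
#include <stdbool.h>

// A fragment passes the alpha test when its alpha is at or above the
// chosen threshold; otherwise it is discarded
// (discard_fragment() in MSL).
static inline bool passesAlphaTest(float alpha, float threshold) {
  return alpha >= threshold;
}
```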
Depth testing compares the depth value of the current fragment to one stored in the framebuffer. If a fragment is farther away than the current depth value, this fragment fails the depth test and is discarded since it’s occluded by another fragment. You learned about depth testing in Chapter 7, “The Fragment Function”.
Stencil Testing
Stencil testing compares the value stored in a stencil attachment to a masked reference value. If a fragment makes it through the mask it’s kept, otherwise it’s discarded. You learned about stencil testing in Chapter 15, “Tile-Based Deferred Rendering”.
Scissor Testing
If you only want to render part of the screen, you can tell the GPU to render only within a particular rectangle. This is much more efficient than rendering the entire screen. The scissor test checks whether a fragment is inside a defined 2D area known as the scissor rectangle. If the fragment falls outside of this rectangle, it’s discarded.
➤ Open ForwardRenderPass.swift, which is where you set up your render command encoder to draw the models.
➤ In draw(commandBuffer:scene:uniforms:params:), before for model in scene.models, add this:
if params.scissorTesting {
  let marginWidth = Int(params.width) / 4
  let marginHeight = Int(params.height) / 4
  let width = Int(params.width) / 2
  let height = Int(params.height) / 2
  let rect = MTLScissorRect(
    x: marginWidth, y: marginHeight, width: width, height: height)
  renderEncoder.setScissorRect(rect)
}
Here, you set the scissor rectangle to half the width and height of the current Metal view.
Keep in mind that any objects rendered before you set the scissor rectangle are not affected. This means that you can choose to render within a scissor rectangle only for selected models.
Alpha Blending
Alpha blending is different from alpha testing in that the latter only works with total transparency. In that case, all you have to do is discard fragments. For translucent or partially transparent objects, discarding fragments is not the best solution because you want the fragment color to contribute, to a certain extent, to the existing framebuffer color. You don't just want to replace it. You had a taste of blending in Chapter 14, "Deferred Rendering", when you blended the result of your point lights.
The formula for alpha blending is as follows:
Going over this formula:
Cs: Source color. The current color you want added to the scene.
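The standard source-over blend can be sketched per channel. In this sketch, Cs is the source color, As is the source alpha, and Cd is the destination (framebuffer) color; this matches the result Metal produces with the .sourceAlpha and .oneMinusSourceAlpha blend factors:

```c
// Per-channel source-over blend:
//   result = Cs * As + Cd * (1 - As)
// Cs: source color, As: source alpha,
// Cd: destination (framebuffer) color.
static inline float blendChannel(float cs, float as, float cd) {
  return cs * as + cd * (1.0f - as);
}
```

With As = 0 the framebuffer color survives untouched; with As = 1 the source color fully replaces it.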
To define transparency in models, you either create a grayscale texture known as an opacity map, or you define opacity in the submesh’s material. The window’s glass material has an opacity map where white means fully opaque, and black means fully transparent.
Blending
To implement blending, you need a second pipeline state in your render pass. You’ll still use the same shader functions, but you’ll turn on blending in the GPU.
➤ Open Pipelines.swift, and copy createForwardPSO() to a new method.
➤ Rename the new method to createForwardTransparentPSO().
Take the first color attachment from the render pipeline descriptor. The color attachment is a color render target that specifies the color configuration and color operations associated with a render pipeline. The render target holds the drawable texture where the rendering output goes.
Enable blending on the attachment.
Specify the blending type of operation used for color. Blend operations determine how a source fragment is combined with a destination value in a color attachment to determine the color value to be written.
Specify the blend factor used by the source color. A blend factor is how much the color will contribute to the final blended color. If not specified, this value is always 1 (.one) by default.
Specify the blend factor used by the destination color. If not specified, this value is always 0 (.zero) by default.
Note: There are quite a few blend factors available to use other than sourceAlpha and oneMinusSourceAlpha. For a complete list of options, consult Apple's official page for Blend Factors.
➤ Open ForwardRenderPass.swift, and add a new property to ForwardRenderPass:
The opacity is working, and as you zoom in, you can see the variations in the art glass. This is achieved by varying the opacity grayscale values of the texture.
Note: Remember, if you want to examine textures, you can use GPU Frame Capture to see what the GPU is processing.
Transparent Mesh Rendering Order
The blending order is important. Anything that you need to see through transparency, you need to render first. However, it may not always be convenient to work out exactly which models require blending. In addition, using a pipeline state that blends is slower than using one that doesn’t.
➤ Undo the previous change so that you render the window first again.
First, set up your models to indicate whether any of the submeshes aren't opaque.
➤ Open Submesh.swift, and add a computed property to Submesh:
transparency is true if the submesh textures or material indicate transparency.
➤ Open Model.swift, and add a new property:
var hasTransparency = false
To initialize this property, you'll process all of the model's submeshes, and if any of them have transparency set to true, then the model is not fully opaque.
// transparent mesh
renderEncoder.pushDebugGroup("Transparency")
let models = scene.models.filter {
  $0.hasTransparency
}
params.transparency = true
if params.alphaBlending {
  renderEncoder.setRenderPipelineState(transparentPSO)
}
for model in models {
  model.render(
    encoder: renderEncoder,
    uniforms: uniforms,
    params: params)
}
renderEncoder.popDebugGroup()
Here, you filter the scene models again to find only those models that have a transparent submesh. You then change the pipeline state to use alpha blending, and render the filtered models.
➤ Build and run the app.
You can now see through your window.
Note: If you have several transparent meshes overlapping each other, you'll need to sort them to ensure that you render them in order from back to front.
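That back-to-front ordering can be sketched with a simple sort (in C here, with plain distance values standing in for per-mesh camera distances; the chapter doesn't prescribe this exact implementation):

```c
#include <stdlib.h>

// qsort comparator: larger camera distances sort first, so the
// farthest transparent meshes are drawn before nearer ones.
static int farthestFirst(const void *a, const void *b) {
  float da = *(const float *)a;
  float db = *(const float *)b;
  return (da < db) - (da > db);  // descending order
}
```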
➤ In the app, turn off Alpha Blending.
At the end of the render loop, the pipeline state doesn't change to the blending one, so the window becomes opaque again.
Antialiasing
Often, rendered models show slightly jagged edges that are visible when you zoom in. This is known as aliasing, and it's caused by the rasterizer when generating the fragments.
If you look at the edges of a triangle — or any straight line with a slope — you'll notice the line doesn't extend perfectly through the center of a pixel. Some pixels are colored above the line and some below it. The solution to remove aliasing is to use antialiasing. Antialiasing applies techniques to render smoother edges.
By default, the pipeline uses a single point (sample) for each pixel that is close to the edge to determine whether it passes. However, it's possible to use four or more points for increased accuracy of intersection determination. This is known as Multisample Antialiasing (MSAA), and it's more expensive to compute.
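The intuition can be sketched with a one-dimensional edge test: with a single centered sample, a pixel is either fully in or fully out, while averaging four samples yields fractional coverage and therefore softer edges. The sample offsets below are illustrative, not Metal's actual sample positions:

```c
// Fraction of a unit-wide pixel (starting at pixelX) that lies to the
// left of a vertical edge at edgeX, estimated from four sample points.
static inline float coverage4(float edgeX, float pixelX) {
  const float offsets[4] = {0.125f, 0.375f, 0.625f, 0.875f};
  int covered = 0;
  for (int i = 0; i < 4; i++) {
    if (pixelX + offsets[i] < edgeX) {
      covered++;
    }
  }
  return covered / 4.0f;  // 0.0, 0.25, 0.5, 0.75 or 1.0
}
```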
➤ Build and run the app. On modern Retina devices, this effect may be quite difficult to see. But, if you zoom in to a straight line on a slope — such as the tree trunk — and toggle Antialiasing, you may notice the difference.
Fog
Let’s have a bit more fun and add some fog to the scene!
Fog is quite useful in rendering. First, it serves as a far delimiter for rendered content. The renderer can ignore objects that are deep in the fog since they're not visible anymore. Second, fog helps you avoid the popping-up effect that may happen when objects that are farther away from the camera "pop" into the scene as the camera moves closer. With fog, you can make their appearance into the scene more gradual.
Note: Fog isn't a post-processing effect; it's added in the fragment shader.
➤ Open Fragment.metal, and add a new function before fragment_main:
float4 color =
  float4(diffuseColor + specularColor, material.opacity);
if (params.fog) {
  color = fog(in.position, color);
}
return color;
Here, you include the fog value in the final color.
➤ Build and run the app.
Perfect, the entire scene is foggy. The closer you get to the tree, the less dense the fog. The same applies to the ground. Like with real fog, the closer you get to an object, the easier it is to see it. Check it out: get closer to the tree, and you'll see it a lot better.
Because this effect is applied in the fragment shader, the sky is not affected by fog. The sky color is coming from the MTKView instead of being rendered. In the next chapter, you'll create a rendered sky that you can affect with fog.
Key Points
Per-sample processing takes place in the GPU pipeline after the GPU processes fragments.
Using discard_fragment() in the fragment function halts further processing on the fragment.
To render only part of the texture, you can define a 2D scissor rectangle. The GPU discards any fragments outside of this rectangle.
You set up the pipeline state object with blending when you require transparency. You can then set the alpha value of the fragment in the fragment function. Without blending in the pipeline state object, all fragments are fully opaque, no matter their alpha value.
Multisample antialiasing improves render quality. You set up MSAA with the sampleCount in the pipeline state descriptor.
You can add fog with some clever distance shading in the fragment function.
Where to Go From Here?
Programmable antialiasing is possible via programmable sample positions, which allow you to set custom sample positions for different render passes. This is different from fixed-function antialiasing, where the same sample positions apply to all render passes. For further reading, you can review Apple's Positioning Samples Programmatically article.