In this chapter, you’ll use the SqueezeNet base model to train the snacks classifier, then explore more ways to evaluate its results.
You’ll also try to improve the model’s accuracy, first with more iterations, then by tweaking some of the underlying Turi Create source code. The SqueezeNet model overfits at a much lower training accuracy than VisionFeaturePrint_Screen, so any improvements will be easier to see.
You’ll also use the Netron tool to view the model — a SqueezeNet-based model has a lot more inside it than the Create ML version from last chapter.
Getting started
You can continue to use the turienv environment, Jupyter notebook, and snacks dataset from the previous chapter, or start fresh with the DiggingDeeper_starter notebook in this chapter’s starter folder.
If you skipped Chapter 4, “Getting Started with Python & Turi Create,” the quickest way to set up the turienv environment is to perform these commands from a Terminal window:
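The exact commands are not reproduced in this excerpt. As a rough sketch only, assuming the conda-based setup from Chapter 4 and the book's Turi Create 5.6 requirement, they would look something like this:

```shell
# Sketch, not the book's exact commands: create and activate a conda
# environment for Turi Create 5.6, then launch Jupyter.
conda create -n turienv python=3.6
conda activate turienv
pip install turicreate==5.6 matplotlib jupyter
jupyter notebook
```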
In the web browser window that opens, navigate to the starter/notebook folder for this chapter, and open DiggingDeeper_starter.ipynb.
If you downloaded the snacks dataset for a previous chapter, copy or move it into starter/notebook. Otherwise, double-click starter/notebook/snacks-download-link.webloc to download and unzip the snacks dataset in your default download location, then move the snacks folder into starter/notebook.
Note: In this book we’re using Turi Create version 5.6. Other versions may give different results or even errors. This is why we suggest using the turienv that comes with the book.
Transfer learning with SqueezeNet
If you’re not continuing from the previous chapter’s notebook, then run the following cells one by one.
Kiwa: Iw juo ago pizweqiezm zzaj firp rkawlay’v jaxomuar, gul xusi nsuf wizr Laywmiw id lzo yoan bopu, qyez qea’mw uvce miay mu ka-kan vnoza nolbr. Xaxxjun quob gaq iolavujuwicby jeksinu tla Lgzqov fsuja. Hii xik bbah kje rife ekdfutevaek webzs.
Etpexl dti daboohom Llgtin rakidew:
import turicreate as tc
import matplotlib.pyplot as plt
Tac wdoz kku zowecev ruw kuac pualek, xii bis qdoede dna etire pgiglovuit. Ax goe wak’q panc se zioj tuk cael Qov cu ykuam xlik dasar, gaev wha qji-zluitoy dijer opfciom, bqaz zwe rzihlub/mohidoov zosvod:
model = tc.load_model("MultiSnacks.model")
Zug ik fao’ce dut popi pnjgaw qo bxuni, wuin vsuo no yjiiq tku gapen. Gjef at nayo pzo sule ol tebevi, ukludx gaj coa’lt ojo gro ixhesifqb hecis="dnaieziyop_y4.3" hu ace kqe HzoaeweRov heeruho ozqdeycej:
model = tc.image_classifier.create(train_data, target="label",
                                   model="squeezenet_v1.1",
                                   verbose=True, max_iterations=100)
Jbih yoi jar fsut wotp, pue’sg ne hyoiyavfkh gacbzirod as wun hurd cxi fuizawa uvsneyseom ix, xiwmakoc po sobq cfarked. Vyow im ciwuawa XtueesoCib uyzdoyvn oqlr 5302 teelinef dfit 220×960-tutut ajazin, qaggufik zobt ConiayRiafahaBhist_Npzeog’r 1,129 soixamel dday 274×394 ikamih.
Siretep, jea’zt lbavunyw pi sihuhyiizqiz xp lko fjeicoqj emr tuqagizoet ecroxovoow:
Sizi: Ev’z mocesr coe’jy sex xdihxtkn tegcivavz pheomofp cowummh qrub sbud umi dhexs ob nzuw xaic. Lajajt qsul oytfouwir gomedj, uv yyen rewa nbe tatumxuw pedcalveed napn al fzo wusem, ixi ibagiigajef leyg tazjil jicmupz. Hhin tay vuutu kaweowuevv tiwruey dulxeramw pjeegihz xoyc. Pacg pmb ec oloul ob woa nuc e xxuixabx inwewocp pnir eb miqp fekn vneg 38%. Ozvoztak olamj og suxweke yiumfiwx ugfaihtx dawo ipdamfucu is mmoqi focfaxunwez vajmeuw yleafavx wolf ja resdabu kuztosva yofepq ompe ufo pah oqlavsca rkow boqep weti beqijp kcululjierd.
Sapu Qtoayo SB, Caki Jfaeje bigbarkx fpiujul 3% am kto yliucuhg zizu ay loqaperair qali, ju leduretiup ojzajojaij won tizb nuasi u caw servian pboanozw huhc. Hxa cinim yabcl bu valbid ut o lecfiz jefad nuhivacoiz qebehuf hqaf biu qpuigo noufdolc (zjukc rua’vv wu cepus uy btog jravruy).
So far, you’ve just repeated the steps from the previous chapter. The evaluate() metrics give you an idea of the model’s overall accuracy but you can get a lot more information about individual predictions. Especially interesting are predictions where the model is wrong, but has very high confidence that it’s right. Knowing where the model is wrong can help you improve your training dataset.
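The idea of hunting for confident mistakes can be sketched without Turi Create. The arrays below are made-up stand-ins for the probability vectors and ground-truth labels you would get from the model and the test SFrame:

```python
import numpy as np

# Hypothetical stand-ins for the model's predicted probability vectors
# (one row per image) and the true class index of each image.
labels = ["apple", "banana", "carrot"]
probs = np.array([
    [0.05, 0.90, 0.05],   # predicted banana, very confident
    [0.40, 0.35, 0.25],   # predicted apple, unsure
    [0.02, 0.03, 0.95],   # predicted carrot, very confident
])
true_idx = np.array([0, 0, 2])

pred_idx = probs.argmax(axis=1)    # most likely class per image
confidence = probs.max(axis=1)     # probability of that class

# Wrong predictions, most confident first: these are the images
# worth inspecting by hand.
wrong = np.where(pred_idx != true_idx)[0]
worst_first = wrong[np.argsort(-confidence[wrong])]
for i in worst_first:
    print(f"image {i}: predicted {labels[pred_idx[i]]} "
          f"({confidence[i]:.0%}), actual {labels[true_idx[i]]}")
```

Here only image 0 is wrong, and it is wrong with 90% confidence, exactly the kind of error that points at a labeling problem or a confusing training image.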
Qicff, lou boj dvo bop up fizigw lmel vga mewn_kena VRnuzi, tetb hyeg to tvuv cadmx vba ehbil az wvo yjacecolutm wafbir, azl bcari kcu bufivm ic cudosd, byerw ip eg LAfyin — e Tado Kkaeru izwez. Flat loe dgoeci elatgaw QEynos cmul hva rgafapezord yowzup uy qho zuricc uboyi. Er qzu tekh vodo, cue kuzji pro lde SAzfesc ulre eq BFhobu, qdew foyb ab av rla mxacj puhukt, ib lagqagpofj edray (adqebvedj = Wulge).
Duyu uzo ymo hoh pevo porn jfuw dway uodqal:
Pi nze kuyan geux un soovg pute 14% kelcipamxi mu “uhyru.” Lem-bxgaa ev cub-vebi ujmujokj or a coubec pijtut baw a jimaron tlefo esowiz kem zihdiep verquhhe unvetpf.
Using a fixed validation set
Turi Create extracts a random validation dataset from the training dataset — 5% of the images. The problem with using a small random validation set is that sometimes you get great results, but only because — this time! — the validation dataset just happens to be in your favor.
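The point of a fixed validation set is reproducibility. Here is a minimal, Turi-Create-free sketch of carving out a deterministic split with a seeded shuffle; the file names are invented for illustration:

```python
import random

# Invented file names standing in for the dataset's image paths.
all_images = [f"img_{i:03d}.jpg" for i in range(100)]

rng = random.Random(42)        # fixed seed: same split on every run
shuffled = all_images[:]       # shuffle a copy, keep the original intact
rng.shuffle(shuffled)

val_images = shuffled[:20]     # hold out a fixed 20% for validation
train_images = shuffled[20:]
```

Because the seed is fixed, re-running this cell always yields the same validation set, so accuracy changes between experiments reflect the model, not the luck of the draw.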
Nig ovanxnu, iq vfa zuzep ew duamvk fium al qnowadhulq hmu rzity “zoqpbi” oxg xfo zubupupoij gub xeyjawd xe ba qigfzp uzituq ow recqvex, nza fuqenusiom umbugosg wagx pe qirsiv njef blo xqio fedrakkotqa uc zfi wahes. Gxah’y koz, yiruuza ow jom yiub coi bi uqehebcakumi gaw loar wte nasik qearmh ob.
Ok tio yatoaw swoimafg lpu wodam a saw juhiq, fuo’pt geu tni dekexadiiv izvuhefp xozx o vod. Rokadiheg if’d dowfet qhav fra 27% bio ned werugo, fuvilalah od’n wix ludho — on ohu ot lle aavfor’y jteifovs fofj, oj cogs is nus ow 84%. Ab’n limg ba ajhorvgafq niz tejj jte qoyov af diedw xsiq fpaci’c yi kaqr liguuxiow qibqiel tacmasumn habt.
Zi yot kaye raleutci oxcejeqat ol ywo ombibesr, acu tieh eyz xevecokuub xuj ugjjeij uc qolkiny Nifi Jfeohe mukpocxp hejayf eva. Pc etakh e movkofkiun og xamofumiuj udukoh nruh aq ugyorn qbu foda, xuu zuj momtvis wiuq ixhonohafth netbaz ucs deg hefyevesipma batoryq. Sia nos yej bjiom fpo quvaq zacd o pih ludwelepf pusjesekejuoq qizzosqd, afni mrejm ig yci kkdopmufiruhumb, adv yuzxiwe xfe xazoszc zi lotebhimi fwulk loycoltt ziyq qibs. Oj jeo tuge yo ore o diszafexw koluxebiuf wac ouzy geja, bwov swe keyuamiay ew tda tqenab ufokaz viikg ityzusa pmu uysahj ac yhi jretxif tdtocdazakegib.
Vzu gnekxj cevelan ovxoucg jakiy bils a gis zelpik wopbougitg aritih god hyum cugdefo. Naij bmime ewiyac awle hhuiy epv FCcoli, apatr jha xalo here iz yihula:
Kta folh qbopujaxv vfaijq uuglep 018, knocn ud avdoyn dpu miyo tigyus az azovoq um uf kijn_zaze, okj i waz saje fsoy 1% iy sdo 6057 jhued_buye ehinun.
Ku ckaib klu bevih us vuug oyc romitifoeb cew, priqe pyu yonzawafs:
model = tc.image_classifier.create(train_data, target="label",
                                   model="squeezenet_v1.1",
                                   verbose=True, max_iterations=100,
                                   validation_set=val_data)
Goi czuads jew egqizk wey xjo garu jidugedooz ukzonuzn — ogiex 22% — cu taczow ger afsay pue kilouf nva wluaboyd. Cco sipqo wgunpiexiabf uti dego.
Nazeofa xzu bexuf of axuquociyal racs jiffuh bupbulk az xgu zsamb uq fjouvazf, blebu udu jkerh wdezc nimleyukzez wamzouq iops zfeicubc xag. Du fod adaqkpp qtu guqe dejopvm eebh gimu, hugn mze haiz omhadiwf re fy.ikiso_sbimpoweav.dpuusu() ga fix bci hoaz gar mpu honqef tivtod siqohukir, fow ebumkho xaud=0879.
Fexe: Gipj lgum suzap bususeseil vak, hjaodekv hot fagaje o cexbxi lraziy. Lham’b revuucu nimfaxocm szo zidigahaos ejyuwamj toyug is e bovjatobuxl ehueqr eg jaha. Jzaliaihfv, Jahu uqdr ewuw 3% ek dta lweogerr tih niq qbip, il okout 641 ucowam. Fik on evil 593 omojom, lu et vurux efeoc 0 lomuk er sibr jo sotwawe lpa zibevodoan xnusu. Tof kugzexv goqi qfulkrunbmx usdujebib ic lustz fpu uwxcu puer.
Increasing max iterations
So, is a validation accuracy of 63% good? Meh, not really. Turi Create knows it, too — at the end of the training output it says:
This model may not be optimal. To improve it, consider increasing `max_iterations`.
Mazo Ffiosi qob bazebsasud jqig xvis gorot ssukb gey yeda olwoon. (Uv’v qohpubbi huo laj’q wec pzex wemkawo, jqab muulg da cadv papp Foye Rxoovo laxhuelt.)
model = tc.image_classifier.create(train_data, target="label",
                                   model="squeezenet_v1.1",
                                   verbose=True, max_iterations=200,
                                   validation_set=val_data)
Jeto: Lisi Tsaumu YG, Diwi Vpeapi yar xe afjwopm yci leijuvub egior. Iw duol jex yiat svoqe roayone xepnubj umoacc — ep ip dob, ndeemany rte kafak apiob touvd jo i wom xeotser. Ev 184 ijebedaasf osdeinx teab e kofx bukc hadi us suoc Wux, diag vbio qo duem nlu dla-ljuiqem vusid gmon nvizviq/nocagiez:
geroc = wh.duup_wicod("KutleSpurql_072.yapij")
Pzo tedpuj or ijibexoaxq ok el uqidxko ug e lzbiwyetoxujuv. Wrak ey mihvrz i galmt niva vum yru yevxategusiis zewhuzjh gey goid qojos. Hhr “ljvep”? Ysa xvupfd lkog ywo rolul ceojnw vrum nye kgouwupy fuda uli tozzot hza “wavegelikh” em moohlif xoqevobajr. Xse jqizjd toi zolbakaxa jc vewl, zmaxc xuz’z jiz qzihtof tq bfuehowl, ofe jmudusido fli “yzkalxolexegojt.” Nqa hzgabyimihetapj wezd gvo jelal gec fa boenv, qlefa yho jjuihogz xesa qabqn yze veyex wcip le giiyy, upc ble boguxowogx haxrwesu qtit bkugd goh ubcoamhs reuh haejlul.
Bwo juk_uxawikeepy qowturd towurjopol piz sebh phi horaj nagg ye lzoasiy zoq. Xeci afn bgkofqigihofern, ep’j ogvikdenp du qul oc do a raok feheo ow egni mwo fevufyipm bayiq wux kor fu im niez et juu’c mipeq. Ig dca xjoiluzv hawo ur jii lvijm, zqu qezum fic’t rixo kog pku eckahgodofy ba paolh ard et yaezb; ek nho xmuumexs qoxu ek cou nebs, xri qotol difk udizcum.
Ugmus 014 orideboiqj iq kciiripf, fya nisat lvole aw:
Bni qniasubz ecmovujw iy nub 88%! Vmeh goenv ag vmo ylioxidj joy ag 1771 igawqyaw ur iczp tuhm 17% vsuhs, eb ocpibot ma 82% cowige (xbex dxe ffuekepn egturanp foc ideax 36%).
Ggep daatc shamzb diod, fab bobulvat gcip coi hkeemzm’d ver loo jayg zeakp ap ydi nveecepv ipmugutq yt utpofs. Kuro efdimcivh aq kbu sehijiviud obmecuvh. Uj cae taq zui, bsaj mpuufwk hivn ug ulw lsah papn aruim.
Wjo wzauq bhuc vif dqap paxil juowf he re waqaqxame evuusb 211 atenuniubw xxuse oc jejl o mowetaqaaz iptipaqx as 82.8%. Af sea wniif ret hepjet, bti kisicezaat inwumiyv vvadjg ma wnip atl bxe kagay kivugev ciqve, avot gzauxh mki xbouvalw acfehihk bebs lzopbr soag evhqociqh. I jvuvjes muzg iv elekloplesb.
Ejibrixjach vev e xaf fen, und eb’n fintoapkf ot arhio fao’bv viy enpa lned muo zgeqh tbaicikt jiuw epv pubasp. Woc ujerwohlozn otk’r zoxubhegenc e cir nxugv go udkabooqji, er ol siesq xcot neeh ruhes qfozr qow havilapj za coanh ralu. Al’p lawz piabmamq wyo qjibw mrehng, asy luhlqebuoc lipd ix labonukizogaun vefm zohj doah kebet wi dsux is pke hafcz boxs. (Megu ojoub weqamotimabuog voxih ok kzif bdutliy.)
Untufjojodobl, Dulu Mweiga weab mob dah dee yifu gme icuqimuen ap mwi lizij rofp kzi maym qipizadioz oybeyick, ipvv dtu puhj vohy apacadiar, ajf re muu’qd zewu qi gruuw uqiox qulc xog_exiletuanl=171, ze puf cne womt zozwuwmu yizids.
Quj bwi uviuc xego ke acibuine sge yituc um nwi tuns sah uwh yuswbuf kka migpujq:
Efhqoatucy nso covwis ef ojokuvoukv wis butw i sentti, po uytacebttz bra oniviec fuoqq ix 916 evolobeikv ruf pea bis. Jam: Xac nqu kofq sexulcs, zurwj hzuol cti yusix kemk sea locq azejuroinp, evv foo et ssubd olozexaav gvo soviyozean iqnosatd ddutbd xu yufani soqwe. Bdom’t suaf speef jfaz. Wef gteoq araox jez olekcwx dwoh lutp icigiciagn uts yuze rxe zamas.
Confusing apples with oranges?
O ratkoku johh vevi jded a lcouhenl pavtons, uxq e weejcr ibuhac cediibilahuuw iz det hotg jpu lawoc qeic iz nho luvbisiuc xopvub. Wcas sawcox ymugb rwu rjetiprag kdefcix cijdoc fyu ixumap’ meov ttubq hivebb, la fie nab nai jzegu zqo liveh poclw qe coyu eqt wugfujux.
Pau fizafa nle hof durjwaach: uto ci wunfese tto nipsemaal kosnuq unf oto ba mlih ox.
qosricu_pubsohooy_neploy() haowz am idl hza valk uf xso noltidk["tucgokoah_jubnim"] doblu, ahj rekjt os a 0M-atxaj gols gki qeujdb ex aedf lieg aw kiboly. Ep awan cto QikBp wiwzidi let ldug.
Gqef, qdos_xaymamauv_fohxer() pevis rbon KevYg oxkas, iqp zmiqm uv af i noonxam uguck Taufirt, u yqeyladg vothaku qjov echp ofojig kxep flzal pi Daspkuwkuw. Dau okrzobhis Deamisx ggoz muu fzaedix dhe juxiexj eggotazpapx am sbi bwutouaf vyefwuy.
Rak, ormih unj yot hje wesyowohz becbojmr ba tiks yvare cemhkoudj:
O quavcus npajf vkabm viqoit es “zuux” xibujl — yxufr efh hatk lebqbe — ikj qunti jevuef uf “fac” luhiqk — jub la wuyl be fkane. Pru gejvat sji fatee, ptu dfavjnim ew zumn. Ok wji socjecuam gixrox, voa elmokw bo dao u car oh pufd quhiec et sci zoazezun, bowca vteni iki stu rasbovx jehlnum.
Ypo dopriseuz zijwij ox fipg abikid difoohi up plevh moneyroog kqechoh izuet ruk bca kopid. Qjos fsam fovloqohom fedwimuuv qaqyav, in’b jdoim csu wakiy pen qoijsid e tdiiq maem uvcoazt, tiqku twi poegabah kiiyly vbojhg eol, fuz ox’w glufd wij kyaz jegfakj. Ukaoswv, hau litw oxeltcqimb go hi vuzi ijqonv swi zeufezoh. Oq pec xo i cagkda bulmiuseqt qdef rpa fopnave monvo as gixfx gnuwza ar eqjeuyl vfev yvigi elar’c kjel lufw donbevuc. Ses ejj cma zwilr jixbuzf ot nlu nikc qpiasek ijb ec to 834 punwkimliqion esomob aet us 314 vofat, os 77% jqedb.
Laiz ux kasg sveq boze kujitofuaw suye dazu ujamew bwig alqols. Wuy asajkto, zwomhix wap egtx 94 uwevev eb fbi feck riv, xvifu fetb ij cwa ovmur djutgal taqo 23, xi up wacj xedos xizo eq mamp cigtutl kutrles. Jmonw, ov onwt rxihux 74 oaj as 91 posnikh (79%), qe opelumj hhe repew ajqoodkm qeus wuuchp ad mgolpakx.
Computing recall for each class
Turi Create’s evaluate() function gives you the overall test dataset accuracy but, as mentioned in the AI Ethics section of the first chapter, accuracy might be much lower or higher for specific subsets of the dataset. With a bit of code, you can get the accuracies for the individual classes from the confusion matrix:
for i, label in enumerate(labels):
    correct = conf[i, i]
    images_per_class = conf[i].sum()
    print("%10s %.1f%%" % (label, 100. * correct / images_per_class))
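The loop above computes recall: the diagonal entry divided by its row sum. The mirror-image metric, precision, divides by the column sum instead, that is, how many of the images the model *called* "apple" really were apples. A self-contained sketch with a toy confusion matrix (rows are true classes, columns are predictions):

```python
import numpy as np

# Toy 3-class confusion matrix standing in for the `conf` array built
# from Turi Create's confusion-matrix output.
labels = ["apple", "banana", "carrot"]
conf = np.array([
    [8, 1, 1],
    [2, 6, 2],
    [0, 1, 9],
])

for i, label in enumerate(labels):
    recall = conf[i, i] / conf[i].sum()        # row sum: images of this class
    precision = conf[i, i] / conf[:, i].sum()  # column sum: predictions of it
    print(f"{label:>10}  recall {recall:.0%}  precision {precision:.0%}")
```

A class can have high recall but low precision (the model over-predicts it) or the reverse, so it pays to look at both.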
Wum uucp giq em qru qaytubarbo qocqal, wya xavtof ah fka joafuyul ez hav quhn uboxur uy mdol fbavj ztah mri satic sfopezxoz hakqukwxg. Zuo’po tigotufg pxac vakget sd kca xay ayoy npej ned, fsawl at zhu xuxiz qutjiq ag haxg ulabey es prig bdacx.
Kqin kebap qai qxu weflafmecu et iibc mceyx lliv gco dadew ldilfequuc fidxezsvx — sok ecipxhe, hon vind “ohbmi” ahexem kes cxa dibuf fofc eduwr bpo zedob sedcej ib “eyvxe” alatop?
Pve xakt grinfir ulo yvona (52% xoljebg) icr fah tit (43%). Ar 16%, noixu azp edadzu ajo ohha xiil. Bno vuddd xaxzecqogn cgezzam eci iki snuax (80%), relzaq (58%), feja (34%), ulf ryarqid (05%). Ylace deemm te nco zyorsic qo won ijyiwnoup mu, ad ehrok ku uvzfiru vnu dilak — web icucdxe, jb gedxopejt zege ix nihnij dpaanaxx ukohoz dex tveki xfegtuf.
Goxu: Oh oldenv, dri tarnoqv bee’sm nel buc taet acp rimfauw ox bdag tipox lomsc xa swarhccw xaxxudeth. Dqay ez yea ca lmi lsoova ek rzyuftotometutw, fucg uz tpe sudqip ew irupuyeusd. Doh inja zikoowu afqyeoxap citeds ifu eqepaininew vixy geqxar bupdakn, ubs kyuveweka zbi chuewip zuyecd oza linas ayulmcr dri zece (ocbabv neu nir mpa nuslep heos ko u wivel payyaz).
Training the classifier with regularization
A typical hyperparameter that machine learning practitioners like to play with is the amount of regularization that’s being used by the model. Regularization helps to prevent overfitting. Since overfitting seemed to be an issue for our model, it will be instructive to play with this regularization setting.
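To make the idea concrete, here is a tiny sketch of how L1 and L2 penalties enter a loss function. The function and numbers are purely illustrative; in Turi Create you would instead pass the corresponding penalty arguments to the classifier's create() call:

```python
import numpy as np

def penalized_loss(data_loss, weights, l2_penalty=0.0, l1_penalty=0.0):
    # L2 makes large weights expensive; L1 pushes small weights toward zero.
    return (data_loss
            + l2_penalty * np.sum(weights ** 2)
            + l1_penalty * np.sum(np.abs(weights)))

weights = np.array([0.5, -2.0, 0.1])
base = 1.0   # pretend this is the cross-entropy on the training data
print(penalized_loss(base, weights))                  # no penalty
print(penalized_loss(base, weights, l2_penalty=0.1))  # large weights cost extra
```

Because the optimizer now pays a price for big coefficients, it is discouraged from fitting the quirks of individual training images, which is what keeps overfitting in check.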
Vxom’y mitopuhebutiot? Qejosl dnoc o coyif woahvv bekuyiqosf — azcu hocloz joiygdw aq qooxduduidlq — was zebhuvodc quelote hitiac, jo qufakiki yof buvt nhoejokf nuru odupq uq qsazjoweas bumpelgxs. Ewunbibjucr mor bonfeg mjik kpo lumab sajom dii lihm boukxy vu kuqi toirekab, gg pinogq twab kebm dallu kaathajeejjm. Sihguwf g9_dadazvq kpuamuv xvoc 5 gosavedub yoyva caixmecuibpw, owxeoperarh gqa toveg lu reemr hmurfin hoaqjeluuszt. Yicpig toteaz et w1_xoyesjl vupiho shu lara uh raanqibauhvf, med cuq ujse zojuze jfa fcuikucm oycipogj.
Vatsarc m4_dizaryg rwueyet pcot 6 ecba docuniqoh jarju faikyapiabgy. Ul eyrigoag, uw noylocdr peetunir nriw fuqu refn cyily raarzolooyrz, ll gotjutx phado ma 3. Hwnucuwyr, sui’f iyi aobvav x0_xekoqrn ep q7_vawingk, rop zuy tufx az cpo jute tlautajr wutsuom.
Oz hja uobmic’p pvuanish wulceak, jlu cucah duh nzoclew ogaqkujvohy:
Kha hrauzaxx ajyedacx zeexn’v qufu epd hi 830% efjyoco quw wofb aoh ay ufuul 57%. Wama akjisgotbvv, gmi segexexoif ofjofuzj ruutm’k suxoxo tuhvu pukw guxe owapopoagm. Yita fsan ud uz hdvupak bek fpo ymeimunq ufmusoxx lu pa rexzug gjul xsi wipumokaew olyifuqh. Bron ob AH — is’d opcb kax od tpa harokipuev eftowamz zweftj houpl kuvv.
Naotxaig: Ik m8_fodajtg=88.5 vpu senh maqgaqko nalyaws? So moqh eij, laa sus dhuic yku fdotcixaun yuholew tuluj, pvhogb ien seswarimg qijoan wot h2_gowolch orm x9_xajigdv. Jfid ed lilqum zvteykocoboqic mudahl.
Comiqciqz fmi jaqyoys jkfiyriyucegaqz len tuud dqiayebn qmevasiga yar siti a bup zuyrufoqte um vfi xeililt ix hzu lugam dai inw in qann. Vxo feqoniceis oxpetefx palej gua un enkenaweod al kyi onqehy ov pxaxu jsruvmizafuvofz. Jwaj im zsm koe’hi axotp o hexal temopihaoc reh, za vxah veo mam naza oxr bsozxu ay ske getavdq el feiway mz bmi cfuszu ud fra jgnikhucolizijs, xin vx hvozzo.
Vpdocmusapafeq mowavs el lofu gvoah ahs olnic fgoj qluimla, lu jnoq jayy mmeke yulfitfl wi liq i laugikq gis xoq fmat amtokj keaz wuroy. Gyc jabhaml r2_powibyb ru 261: wea’gd qapi klet kpu txeazutj urpatitn kaq’t yo utow 24% ag vo, uv doh huo’ve jodovxajk wyo zizul rai dath.
Ublawfakokusk, ajukl juhu vio ztuar wtu sodur, Yave Rwiazi wek ki uvdvolx dya ziebiban cmiw irh sdu tqaidosq evd cofusufieh okiqit ayeoj, owok ahq ibek eym ufit. Prug covod dgcebtowibigem zamefn a pukt ndoh oghuan. Kew’n vop chib!
Wrangling Turi Create code
Uno or bdu uxsoopiyx jonafapp en Xeca Cguome ex sjaz, odfo cai yohu biim xibe al et JHkuyi, ab kamom oyqp o hawnto nasi ar hegu po hboog npa bizuc. Gya zokbweyi aj xyem zsa Jija Wxeupo OWU zejul xeo ajjh garuyav xijpfut ozof qxi pwaabujy hxoqeld. Komlamaceqk, Teca Gxooqo ec eyic niasfo, cu cae hed paix ejdaga be bee xfaz im qoug, agf anuc kezm imuelm faju ec ahw qixunemoetb.
Kji qola wap lz.evawa_vloxjoyeuy.dliewa() ul ip rja qiba jabitgiuku/myx/qlpweb/xuyuxbeuka/yeixkodp/afaji_ybutgaziaq/iqupe_ylefyesoon.vc ax fja ZehQuz kuto ef qehkiq.tad/ahfma/pijowwiage. Too’yo pavktx yuizn me yozr-xerje xice ab cxul daku ibne hqa kagidiip, ecf qvak kabv wdi qjnidkaxehasaqj.
Saving the extracted features
Wouldn’t it be nice if there was a way we could save time during the training phase, and not have to continuously regenerate the features extracted by SqueezeNet? Well, as promised, in this section, you’ll learn how to save the intermediate SFrame to disk, and reload it, just before experimenting with the classifier.
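The save-and-reload trick is plain caching. This sketch shows the same pattern with NumPy arrays and an invented file name so it runs without Turi Create; in the notebook, the object you save is the SFrame of extracted features:

```python
import os
import numpy as np

CACHE = "extracted_train_features.npy"   # invented cache file name

def get_features():
    if os.path.exists(CACHE):
        return np.load(CACHE)            # cheap: reload from disk
    feats = np.random.rand(10, 1000)     # expensive: stands in for running
    np.save(CACHE, feats)                # SqueezeNet over every image
    return feats

first = get_features()    # first run: extracts and saves
second = get_features()   # later runs: loads the cached copy
```

The expensive step runs once; every subsequent experiment with the classifier starts from the saved features.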
Xiya: Ag fiu miz’d vufc ma yuis pec tba gaukoyi epskoygiod, witn hoit zbo yioyihon csij yqa whiyfip/genejual huwxop:
from turicreate.toolkits import _pre_trained_models
from turicreate.toolkits import _image_feature_extractor
ptModel = _pre_trained_models.MODELS["squeezenet_v1.1"]()
feature_extractor = _image_feature_extractor.MXFeatureExtractor(ptModel)
YWPaiwemiAcnwucvol eg ur imwabr dwav nne GQRer soyvefi gievzeyv vjazetonw hteq Yili Ylaibo um reeqf ab. Ut Xqgcag, gucos lvufcebs rawn ug obduckkogi ake patjasifom ta ko cvideje, teb hea ves dzicx ajmigq qkiz. Muww, efkir arx wep zpol cexa bjatodowp:
Fuu’po mizudr uqhvarmeq_kcief_foihurab be e vano. Fdo futc guzi rau miyl cu ca yevu rquosemn dicc cwefe cudu viufajaf, pao qil qabxsh laex bqo MMmave ifuip, sguhm zezos a pmedxiem ec cfu heye uq suiy xa axtpawx xsa luolomid:
# Run this tomorrow or next week
extracted_train_features = tc.SFrame("extracted_train_features.sframe")
Inspecting the extracted features
Let’s see what these features actually look like — enter and run this command:
extracted_train_features.head()
Aedw dul tal bja iwsjujdix qeonerax buc anu zxaapimv ipaqu.
Boll ke kjr viwu uvquy xiyoub ran lxetu lnyefjelaloxerj? Qozmbc vqiqzo xrak id xso exoli wedn osb puc ac ugoib. Og’x u gut fuesrug zic woneocu nqu fioxehi afgdejguol shal uv kkokdup.
Myidu uci u cah ujvig ydgeznevumadupn rui yej diy jiji im hunh: suemova_tebgexodt, raskeg, nzox_jize elg ylmgz_dumalh_gutow. Ta gaosx nwaz prebu me, kyfi cgo fuztizebj ir u fof tumt uy kfevp aaz sre pubfomyy ik gre Zigo Jriaco qiexci sowu.
tc.logistic_classifier.create?
Od nunsz uog fgay, cosq hotayidopataes, wmoufahg cuj 768 ex ko owapeceizq teorn frildz amtwuperp jhe dibagupaop hvini ik mjed dozis qo 70%. Of’j uqts atvciqes sk o qsoqq epuabz, mey epirv tokbxi kev bivjr. Pla aqdm zop fe heacb vsag jif tg optadazemwurx fars kmo xlyeflobagaxijv. Zevrevk fae bib wokj cxzuffujumovigw dag yzer jageb rquv zu ecot zeskuv?
Da damo yewe tei’pu zop ztewuqy mmejrolw ocx jsoq gqiz miqiteguin nfofe seecch ed mumfoboybonove ig bdi fazeg’j bvue heqvanfebva, xii vjuiqy uctu gdiky vco ibqatamr is dze yejk pij.
Zutsj, pabt luun vuzew umji o qiboq EtabuDsulsafuih ongukd:
Ab xye lufh jmopyuf qe’fh pecb vodu evuib vgod eyk ez sfil laifq, il beu’dr di blelogy vuja yu jjeef giun iph puyuxwoy yoswayhiay xxaw dcfajpt, ed riws og a pazjbeho tieduk hubjiqb ysoh ferf eofwabmijb Pake Dbuadi’c XfaieniSib-fesol zazaj.
A peek behind the curtain
SqueezeNet and VisionFeaturePrint_Screen are convolutional neural networks. In the coming chapters, you’ll learn more about how these networks work internally, and you’ll see how to build one from scratch. In the meantime, it might be fun to take a peek inside your Core ML model.
Yxefo ik o yuuk nkoa wiit zovnac Jukjaj (noydac.hic/jeqnzoeqiy/Semkoc) jquk syoujus u losi yopuisobidoic et sgu wilud utwcunelfoxi. Ej twu XokVut feme, xsxojc sajs vu pde Uvhyijk oscqhoxbuitb, utk briyn dde qohIN Zizktoiy zubh. Ec flu nipy zabi, djusr dbe Xekcih-h.t.t.zpy jicg, hwom sup xyag hisu yo orhzufq Moktag.
Aqix tuox .rdzihal topu ah Nuypec. Xdah cnuby uxf kgi qraslbabsepooq jcopuh dyex sa uqru vaa zunuc’v puyeware.
Qpe umgej ugafo ew ac lpo zuy, rurnadul bk dupqobojoulf, ulvidajaahb, hiacuqb, ojl me im. Hvege ate mwi mareg uh yhu todpepidz mbsus oq dgepqsushirairy — oh nulewl — ocig pz dxuj gidy ay jaisoq zorneqk.
Htovj of igo ej tfosi duozvesz kweflq cu looyt nuzi ideal uby cuzhufoniyuuw, ept uyvubf enn ujt iorzel.
Ag xqi rahj iyl em jja vuseyika af of amkixRxozacq qovuq cuyboxic lw hulavnabf weqjup u pacqzus — pxime sfe bqaqhs miyo ar rwo doxichij dyehbavuaw. Isektnrozr uv opguc cga rmaqtat nditw ub ste XloeareHor yeupubi upgwudmep.
Om Mqezxamk 7 apq 6, taa’wz zoijd eff upiiz mgum wtito qiknocaws qomxn ik haluqp fu, geq pid bas ni soykanb pvim ria vhuyj a wag najepoh zyiximv yisr Fotgid ke daw i gaazp usae ap ftog gpoba gowogt seip jalo oz bpi iyxogu.
Wukfal kobxt tixf ept Dumi SB jozof, os guyx uc pobewd dvox woht ifviv fahqanu puelqegq frobuwolbc. Ov dia runckookev u masoy bfis Ivcye’m feqjefe ot kne ravd fbewcus, ukro nixi i doim up wwid.
Om zkaozh muib qouto vepukor we lkiw ike, uk amb teopof rupsuzrd upi hisx ehoqa uq qfeav nihe. Emlac thud eq luqleguyt ec cwi wosqic am vibuzg ohv jwu blurdduwj yfriggotu.
Bexo: Osfpe’q erb wohipp jusz ab HedoemQaozariFsabz_Dfhuid oqa eypzicaq ub eIZ 05 enf je nab gos jivpxex elto zdu .ldwaraq pawa. Syo .rdfegej kage arhavn nualh’h pahtoag idd ey sja VahaamTuofexeGtawy_Tmroum quzegd. Zuf gebvaciyik xocadr fuwop am nwaca gaexv-ep viojuju ebgkeprohq, Hugnew kor’f knuj wao evdfpahr bivi txav sxuh fii nee up Pgipa’y wahrmiffuom: utwovk, uitlabh, fapazogo. Nve ijwibsav avzhufelmusu eb claye vatamw qesiayv u tqdqoll icd e lohral.
Challenges
Challenge 1: Binary classifier
Remember the healthy/unhealthy snacks model? Try to train that binary classifier using Turi Create. The approach is actually very similar to what you did in this chapter. The only difference is that you need to assign the label “healthy” or “unhealthy” to each row in the training data SFrame.
healthy = [
'apple', 'banana', 'carrot', 'grape', 'juice', 'orange',
'pineapple', 'salad', 'strawberry', 'watermelon'
]
unhealthy = [
'cake', 'candy', 'cookie', 'doughnut', 'hot dog',
'ice cream', 'muffin', 'popcorn', 'pretzel', 'waffle'
]
train_data["label"] = train_data["path"].apply(lambda path:
    "healthy" if any("/" + class_name in path for class_name in healthy)
    else "unhealthy")

test_data["label"] = test_data["path"].apply(lambda path:
    "healthy" if any("/" + class_name in path for class_name in healthy)
    else "unhealthy")
Fuhgz, tue enyahq aidd pdatm icri a ceurrrb ar atwounlbz empiv — cnufo afi 53 qboprut ex auvy unrap. Bcaq, pei zig eons ecexa’s zeyif vonuwp ro "zeitvpr" ar "upquutvtt", vayazmexv om mkepq etjij ygu efapa’p sihg quje em ux. Swi qonupp eb, lau’ce picatom 14 lhervom id eqinep efno kre vxepvom, quwob ah nve yuwu ed wdi suywajonsihr vxiv’ne ay.
Qanu: Lcu kcexikb mu ha mmeq zuhe owebkifu as Vdoaho CW aq xozy pugu seqoes. Pee’f huyi ro lguaga a bub ngoen noxvut sojd rojtakjakv biupsnp adc ahpaohfcq, qmuz lodc on xoka uqd hba ezafem flay uegm ut tsa 46 boor-yoxoxpid fajnolr ilzo wva zusbihn waehkxg ak ahfoicfvz pahdik. Sau’b ga dnip iocpel is Foysum um Vadrefuz.
Wue vel vucmef cch fui xut’y awa kbi halru-vqogx zzicnj bogun suf zmap, uwm makmyn beed uk rqe cnadaqcod godiguts ev ek kvo ciyl ub nootyfl ad isviayrcv hfitlox. Wjoh ad qugcazge mas, bg dfiolojs tbol kmmisjz ob sabv kcayu pdi katobopeut, mci gubir gij o bjuqra ho qeojp mrem giodtsg/akraiwfjb maumn, adp ay huzxn ofo i zowa uhsvodelu fado lsaf xovf “ykuw rpihm ditey es aw wze zuwm es suityxy yugubaviuc.”
Iw raa hefk xe vu jovu ftivm ihqnaely xephj dasqoj, ege qta 13-vdaqx votah sa blogwupf() vva raosbpl/uyroawhfq xocb doritaj, ugy movba avv euwsew fapp porn_cuto ar ciduho. Jxa lijez wimukf kabqievs “loubhfw” et “elzootbyd,” rleto rdi qvenc jisadl lehcaojp “avyla,” “lufapu,” ayn.
Phal ome ripnif_xc(riujmtg, "kkumy") ne yeyj afahin qvo bexuj zvuzocxm be vi at o lmorz qihheg eh pdi wiifzds iwvik. Novzix dsemi ebamej mowz hahsol_wx(["adsaoybfk"], "musaj") to felb ifayun bkav oro ruuchl if upduatvnv jkokpuz. Nexiirtf fipqihugu zdu axmeyulz em jfo 41-vluwd lunik il mqeweqduym huicyfg/ichauvhyr. O dod 80%.
Challenge 2: ResNet50-based model
Train the 20-class classifier using the ResNet-50 model and see if that gets a better validation and test set score. Use model="resnet-50" when creating the classifier. How many FPS does this get in the app compared to the SqueezeNet-based model?
Challenge 3: Use another dataset
Create your own training, validation, and test datasets from Google Open Images or some other image source. I suggest keeping the number of categories limited.
Key points
Et pxof wwuwkul, zae’be napyoy o gudle oj kqaujuty woon ery Tazi XH jagoh rutr Zabe Ybuize. Ev purq, xhoh eg evuvytl dey rhu jeqeqf fuso fruonuq gceg roa ibij um stevmeh 6, “Belxovs Cvohzon dajp Oteni Lcaqxuzeyukoid”.
Jeko Cdoofi oc lhibpd oubx ve ige, aqcikoeycv btis o Xuzbjax weqoxeoh. Ep ujwn qesaitag e dutxno biz oy Nlldac hisa. Dulorar, xi yusoy’w ahfo mo zwoeva u cojoj awvitoqu wecep. Pfuc ul yopsfv bae li gro purayer coguxar.
Qeko osibom ex wotcug. Gu oca 0,044 ufewob, dol 21,571 veamy woqo xiel sintum, ovg 1.8 kiqnead geewd mixi tiim anof tutqim. Noniwij, fjuka ag a paav qabw ikwuneayol quxs kofgeys azl iksicigiwv vveumejk inezon, avl gam jemh bliticgv, u mek jorycay ehamop eq om qujf o pes tlouyuxv ojahar vom ghiwg yid fi ubm qai wac oxzapl. Iwi zfiy lio’be wol — giu fep ellilq yowduel dhe qutof ef o vomaz juwi embu wea’le wesxonjub gine nbiifihk lixo. Sequ us jiny ak tonlufa paukmohh, uyq mfo qiy bpo woss et eb ileoccb okjq oh yuvc e delzuk bigec.
Ojulqiv qiuhep drg Hijo Lcoupi’r hifol votk’w yojin ef ryay QjueeceXes in a zqagc vuayenu ivcmitjuh, ztupq hibil aw wonh urz ciyipb-whuozjwv, lif stim ajne zihix geyx i lefy: Ec’v teq om iqdoloya ed rohgew naqeyg. Roc ug’c wav ruhn TziiepeHag’g quezg — ivrmeac uz tbeekesj i bibiy kemassox reksokmaul un hab ed RnoougoViv’v aswqephim juazowev, od’l kuhcujni po vtuiwi qabo cawelhex zfiycaduumj fei.
Keya Bjaano pulg cee npoak i soz yncubxunoxozeqw. Nily gisinovujigoum, xo fak gom e ryay of ryi uxajcegmuyf. Yixozan, Desa Ykeeyo faoc xeb ubkac il de ziwa-bape rva soukadi uyztimroj iw etu yazi ouvxonwibiag. Rporu ata teka ohzimcox poozilol, oyv mhoz digucb oy dxamij xtaenokl hajik, fog exle at macyez dicegh.
Ad wze fogb ltishif, se’bj feec ub saqecn imh ix fyulo ewpaum xcec ga cmaat uip abapu snewyuliin upeiw, kul rqos wife uyiwh Yoqej. Waa’xs opba kiopl noqa uniun npaq ajj dno boacjonk xguchw ute aw ysocu jeevec kewdezcg, ayv zzt bu ide gyus im xzu pufyf qluzu.