What you did in the previous chapter is very similar to what Create ML and Turi Create do when they train models, except the convnet they use is a little more advanced. Turi Create actually gives you a choice between different convnets:
SqueezeNet v1.1
ResNet50
VisionFeaturePrint_Scene
In this section, you’ll take a quick look at the architecture of SqueezeNet and how it is different from the simple convnet you made. ResNet50 is a model that is used a lot in deep learning, but, at over 25 million parameters, it’s on the big side for use on mobile devices and so we’ll pay it no further attention.
We’d love to show you the architecture for VisionFeaturePrint_Scene, but, alas, this model is built into iOS itself and so we don’t know what it actually looks like.
This is SqueezeNet, zoomed out:
SqueezeNet uses the now-familiar Conv2D and MaxPooling2D layers, as well as the ReLU activation. However, it also has a branching structure that looks like this:
This combination of several different layers is called a fire module, because no one reads your research papers unless you come up with a cool name for your inventions. SqueezeNet is simply a whole bunch of these fire modules stacked together.
In SqueezeNet, most of the convolution layers do not use 3×3 windows but windows consisting of a single pixel, also called 1×1 convolution. Such convolution filters only look at a single pixel at a time and not at any of that pixel’s neighbors. The math is just a regular dot product across the channels for that pixel.
Convolutions with a 1×1 kernel size are very common in modern convnets. They’re often used to increase or to decrease the number of channels in a tensor. That’s exactly why SqueezeNet uses them, too.
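To make this concrete, here is a minimal sketch of a 1×1 convolution that shrinks the number of channels; the tensor shapes and filter counts are made up for the example:

from keras.layers import Input, Conv2D
from keras.models import Model

# A 1x1 convolution looks at one pixel at a time: for each pixel it takes
# a dot product across the 64 input channels, producing 16 output channels.
inp = Input(shape=(32, 32, 64))
out = Conv2D(16, kernel_size=1)(inp)
model = Model(inp, out)
model.summary()  # output shape is (None, 32, 32, 16), only 1,040 parameters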
The squeeze part of the fire module is a 1×1 convolution whose main job is to reduce the number of channels. For example, the very first layer in SqueezeNet is a regular 3×3 convolution with 64 filters. The squeeze layer that follows it reduces this to 16 channels. What such a layer learns isn't necessarily to detect patterns in the data, but how to keep only the most important patterns. This forces the model to focus on learning only things that truly matter.
The output from the squeeze convolution branches into two parallel convolutions, one with a 1×1 window size and the other with a 3×3 window. Both convolutions have 64 filters, which is why this is called the expand portion of the fire module, as these layers increase the number of channels again. Afterwards, the output tensors from these two parallel convolution layers are concatenated into one big tensor that has 128 channels.
The squeeze layer from the next fire module then reduces those 128 channels again to 16 channels, and so on. As is usual for convnets, the number of channels gradually increases the further you go into the network, but this pattern of reduce-and-expand repeats several times over.
The reason for using two parallel convolutions on the same data is that using a mix of different transformations potentially lets you extract more interesting information. You see similar ideas in the Inception modules from Google’s famous Inception-v3 model, which combines 1×1, 3×3, and 5×5 convolutions, and even pooling, into the same kind of parallel structure.
The fire module is very effective, evidenced by the fact that SqueezeNet is a powerful model — especially for one that only has 1.2 million learnable parameters. It scores about 67% correct on the snacks dataset, compared to 40% from the basic convnet of the previous section, which has about the same number of parameters.
If you’re curious, you can see a Keras version of SqueezeNet in the notebook SqueezeNet.ipynb in this chapter’s resources. This notebook reproduces the results from Turi Create with Keras. We’re not going to explain that code in detail here since you’ll shortly be using an architecture that gives better results than SqueezeNet. However, feel free to play with this notebook — it’s fast enough to run on your Mac, no GPU needed for this one.
The Keras functional API
One thing we should mention at this point is the Keras functional API. You’ve seen how to make a model using Sequential, but that is limited to linear pipelines that consist of layers in a row. To code SqueezeNet’s branching structures with Keras, you need to specify your model in a slightly different way.
In the file keras_squeezenet/squeezenet.py, there is a function def SqueezeNet(...) that defines the Keras model. Its code, in part, looks like the following:
# Imports this excerpt relies on:
from keras.models import Model
from keras.layers import Input, Activation, Conv2D, MaxPooling2D, concatenate

img_input = Input(shape=input_shape)
x = Conv2D(64, 3, padding='valid')(img_input)
x = Activation('relu')(x)
x = MaxPooling2D(pool_size=(3, 3), strides=(2, 2))(x)
x = fire_module(x, squeeze=16, expand=64)
x = fire_module(x, squeeze=16, expand=64)
x = MaxPooling2D(pool_size=(3, 3), strides=(2, 2))(x)
...
model = Model(img_input, x)
...
return model
Instead of creating a Sequential object and then doing model.add(layer), here a layer is created by writing:
x = LayerName(parameters)
Then this layer object is immediately applied to the output from the previous layer:
x = LayerName(parameters)(x)
Here, x is not a layer object but a tensor object. This syntax may look a little weird but, in Python, you're allowed to call an object instance (the layer) as if it were a function. This is actually a very handy way to define models of arbitrary complexity.
To create the actual model object, you need to specify the input tensor as well as the output tensor, which is now in x:
model = Model(img_input, x)
You can see how the branching structure is made in the fire_module function, shown here in an abbreviated version:
def fire_module(x, squeeze=16, expand=64):
sq = Conv2D(squeeze, 1, padding='valid')(x)
sq = Activation('relu')(sq)
left = Conv2D(expand, 1, padding='valid')(sq)
left = Activation('relu')(left)
right = Conv2D(expand, 3, padding='same')(sq)
right = Activation('relu')(right)
return concatenate([left, right])
Note how the data flows here: x goes in on the input side, sq holds the output of the squeeze layer, left has the left branch and right has the right branch. At the end, left and right are concatenated into a single tensor again. This is where the branches come back together.
A lot of Keras code you'll encounter uses both Sequential models and models defined using this functional API, so it's good to be familiar with it.
The final classification model you’ll be training is based on MobileNet. Just like SqueezeNet, this is an architecture that is optimized for use on mobile devices — hence the name.
MobileNet has more learnable parameters than SqueezeNet, so it's slightly larger but it's also more capable. With MobileNet as the feature extractor, you should be able to get a model that performs better than what Turi Create gave you in Chapter 5, "Digging Deeper Into Turi Create." Plus, you'll also be using some additional training techniques to make this model learn as well as possible from the dataset.
Either open the notebook MobileNet.ipynb from this chapter's resources, or create a new notebook and import the required packages:
import os
import numpy as np
from keras.preprocessing.image import ImageDataGenerator
from keras.models import Sequential
from keras.layers import *
from keras import optimizers, callbacks
import keras.backend as K
%matplotlib inline
import matplotlib.pyplot as plt
Keras already includes a version of MobileNet, so creating this model is easy.
Keras's MobileNet was trained on the famous ImageNet dataset. But you only want to use MobileNet as a feature extractor, not as a classifier for the 1,000 ImageNet categories. That's why you specify include_top=False and pooling=None when creating the model, which has Keras chop off the classifier layers.
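A minimal sketch of what that call could look like, assuming the standard 224×224 RGB input size mentioned later in this section:

from keras.applications.mobilenet import MobileNet

base_model = MobileNet(
    input_shape=(224, 224, 3),
    include_top=False,   # chop off the 1,000-class ImageNet classifier
    weights="imagenet",  # start from the pre-trained ImageNet weights
    pooling=None)        # keep the final feature tensor intact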
You can use base_model.summary() to see a list of all the layers in this model, or run the following code to save a diagram of the model to a PNG file (this requires the pydot package to be installed):
from keras.utils import plot_model
plot_model(base_model, to_file="mobilenet.png")
If you look at the architecture diagram of MobileNet in that PNG file, you'll see that it is made up of the following repeating structure:
First, there is a so-called DepthwiseConv2D layer with kernel size 3×3, followed by a BatchNormalization layer and a ReLU activation. Then there is a Conv2D layer with kernel size 1×1, which is also followed by its own BatchNormalization and ReLU. MobileNet consists of 13 of these building blocks stacked together.
There are a few new things going on, here:
A depthwise convolution is a variation of convolution wherein each filter only looks at a single input channel. With a regular convolution, the filters always combine their dot products over all the input channels. But a depthwise convolution keeps the input channels separate from one another. Because it doesn't combine the input channels, depthwise convolution is simpler and faster than Conv2D and uses far fewer parameters.
The combination of a 3×3 DepthwiseConv2D followed by a 1×1 Conv2D is called a depthwise separable convolution. You can think of this as a 3×3 Conv2D layer that has been split up into two smaller layers: the depthwise convolution filters the data, while the 1×1 convolution (also known as a pointwise convolution) combines the filtered data into new features. This kind of approximation of a "real" 3×3 Conv2D works almost as well: there are fewer parameters to learn and it also performs fewer computations. This is why MobileNet is so suitable for mobile devices.
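As an illustration, here is a hedged sketch of one depthwise separable block in Keras; the channel counts are made up for the example:

from keras.layers import Input, DepthwiseConv2D, Conv2D
from keras.models import Model

inp = Input(shape=(56, 56, 32))

# Depthwise: one 3x3 filter per input channel; channels stay separate.
# Parameters: 3*3*32 weights + 32 biases = 320.
x = DepthwiseConv2D(kernel_size=3, padding="same")(inp)

# Pointwise: a 1x1 convolution that mixes the 32 channels into 64 new ones.
# Parameters: 1*1*32*64 + 64 = 2,112.
x = Conv2D(64, kernel_size=1)(x)

# A regular 3x3 Conv2D doing the same job would need 3*3*32*64 + 64 = 18,496.
model = Model(inp, x)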
The batch normalization layer, BatchNormalization, is what makes it possible to have these very deep networks. This layer helps to keep the data "sane" as it moves between the layers. Without batch normalization, the data in the network could eventually fade away or blow up because the numbers become too small or too large (the problem of vanishing or exploding gradients), and then the model won't be able to learn anything anymore. You'll see BatchNormalization in pretty much all modern convnets.
Note: Depending on your version of Keras, you may also see ZeroPadding2D layers before the DepthwiseConv2D layers, which only perform padding around the input tensor so that the convolution works correctly for pixels on the edges. Another thing to notice: The activation function used is actually ReLU6, a variation of the ReLU activation you've seen before. It works in the same way as ReLU but also prevents the output of the activation from becoming too large: it limits the output to at most 6.0, hence the name. This allows for the use of faster, lower-precision computations on mobile and embedded devices.
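In case you're curious, ReLU6 is simple enough to sketch with plain NumPy:

import numpy as np

def relu6(x):
    # Like ReLU, but the output is also clamped to a maximum of 6.
    return np.minimum(np.maximum(x, 0.0), 6.0)

print(relu6(np.array([-3.0, 1.5, 10.0])))  # [0.   1.5  6. ]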
Looking at the model.summary(), you may have noticed that MobileNet does not use any pooling layers, yet the spatial dimensions of the image tensor do become smaller over time: from 224×224 at the beginning to only 7×7 at the end.
MobileNet achieves this downsizing effect by setting the stride of some of the Conv2D and DepthwiseConv2D layers to 2 instead of 1.
The stride is the step size with which the convolution window moves as it slides through the image. Usually, this step size is 1 and the convolution looks at all the pixels.
With a stride of 2, the window skips every other pixel, producing only half the outputs in both the width and height dimensions. This way you don't need a special pooling layer to make the image smaller.
You'll see both techniques, pooling and larger strides, used in practice.
The final layer of this model outputs a tensor of size (7, 7, 1024). This tensor contains the features that MobileNet has extracted from the input image. You're simply going to add a logistic regression on top of these extracted features, exactly like you've done before.
Note: MobileNet has more learnable parameters than SqueezeNet, so it takes up more space in your app bundle and also uses more RAM at runtime. However, thanks to those additional parameters, MobileNet produces better quality results than SqueezeNet. Even better, it's also faster than SqueezeNet, thanks to the depthwise separable convolutions.
Choosing a feature extractor is always a trade-off between accuracy, speed and storage size. If MobileNet is too large for your app, then SqueezeNet might be a better choice. But the predictions of a SqueezeNet-based model will be worse, and it is not even that much smaller.
The VisionFeaturePrint_Scene model that is built into iOS 12 is even more powerful than MobileNet, and it doesn't take up any space in your app bundle, so using it is free. But you can't use it on iOS 11 or other platforms.
Which model is "best" depends on what you care most about: speed, download size or accuracy. As they say, there is no free lunch in machine learning.
Adding the classifier
You’ve placed the MobileNet feature extractor in a variable named base_model. You’ll now create a second model for the classifier, to go on top of that base model:
This should look familiar by now: it's a logistic regression.
Just like before, it has a Dense layer followed by a softmax activation at the end.
The GlobalAveragePooling2D layer shrinks the 7×7×1024 output tensor from MobileNet to a vector of 1024 elements, by taking the average of each individual 7×7 feature map.
Note: If you had used Flatten instead of global pooling, the Dense layer would have 49 times more parameters. That simple change would add another one million trainable parameters to the model. Not only does global pooling give you a smaller model than using Flatten, it also works better, because having that many parameters is an open invitation for overfitting.
Note: You used a Dense layer for the logistic regression, but modern convnets often have a 1×1 Conv2D layer at the end instead. If you do the math, you'll see that a 1×1 convolution that follows a global pooling layer is equivalent to a Dense or fully-connected layer. These are two different ways to express the same operation. However, this is only true after a global pooling layer, when the image has been reduced to just a single pixel. Anywhere else, a 1×1 convolution is not the same as a Dense layer.
Next up, you need to freeze all the MobileNet layers:
for layer in base_model.layers:
layer.trainable = False
Rei’me joq paidj wu ne smeoqikp yxe MeletiVok noiwahu egclussuq. Bses nuk ukkuuzb yieq cfiovad ot phu tofpi EdoqoZoh vufutox, qohg jure QwiuusaKep hun. Ibg sio nuse ca jvuux ek mha hogimnan zacrerfuuw wwoj xee’fo mxiyin ap new. Zwon eb nlf an’g ixyufwakk qu wev gro yoxach vtel yje baojaxu amyrorvah lo bo miz yceifibjo. Ver, gao gaavqas aw, hyiq ayuib aj yqefpcin wauzbids oj imhuik.
Ffec lui lu sum_fumos.vimzalw() iy spuons pes pcak chuv:
Mgu zulren os bheaxifte quwesf um ofxg 06,162 yovgo nzip’g gah gux hra Jepmu yupim ok. Btu awvot 8.65 rovpeis maxotivuhw uhe ygic CetiduWof uxb jibz jes we xfoohab.
Ciba zbex zti docjj “jakis” on qxij fez vibug uv LohihiJis, ku od huo eqy cox_bebaj zu coqu a lhohisqaiq in un ubaro, oz fodp zicpg jeng vla oyego byruidp vawa_sedis apl shim azqmein vze yilam qawitwuj huqgezjiiw meleyb.
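You also need to compile this new model before you can train it. A sketch, assuming the Adam optimizer (imported earlier) with a typical starting learning rate; the fine-tuning section later drops this to 1e-4:

top_model.compile(loss="categorical_crossentropy",
                  optimizer=optimizers.Adam(lr=1e-3),  # assumed starting rate
                  metrics=["accuracy"])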
Before you start training this model, let's first talk about a nifty trick that can give your training set more punch with hardly any effort on your part.
Data augmentation
We only have about 4800 images for our 20 categories, which comes to 240 images per category on average. That's not bad, but these deep learning models work better with more data. More, more, more! Gathering more training images takes a lot of time and effort, and is therefore costly, so it is not always a realistic option. However, you can always artificially expand the training set by transforming the images that you do have.
Qada’q e qdbacok rqaexajb awiki:
Pexeku mej ay’z fuugwomz hi wbu qirr? Ivu iopx lin vi alrsaplrr xoumvo cme cebmiy uv fmaatiwv ogowej up zi luquzaxkurjn xwut lyef gu xqeh nyu kavoc anpa goegvz fa janamk wugizav wgah peizv pa mti kepwy. Nziyu ika fopg fanu ef mcona pmurqnixfudeeqd, kuzq uh jiseyult tgo ugito, mgeatarw qr i xiqjed uyeefz, feimixn ur om aiq, lliknatd hso sepaxl lnaxslrz, isv. Es’p sculv pi ablmoqo oby rdoztvohxorialt trur moa yeld lous musaq ju la anyapuurk gu.
Vmel eh dcob zo vofs lavi eigcalbavuur: Xei iekmelj cxo ddeinexd joka fzteodq hgexq tebdam qcezpzoflidiekb. Wnic vipdeqm oj-lki-fyt yozabf hsaeluqr. Okigx xihu Jegaf fiigp ar ajeli dhod xzo gmaokusg siz, ep aafetigigaphy ipzlaoh yveg lase aonhoykeraiz la wqi ifanu. Nuq xlab coi tawe mu reti iv IbeniWimuCutewujic edhohg.
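A sketch of such a generator; the exact transformation ranges here are assumptions, so tweak them to taste:

from keras.preprocessing.image import ImageDataGenerator
from keras.applications.mobilenet import preprocess_input

train_datagen = ImageDataGenerator(
    rotation_range=40,        # rotate by a random amount up to 40 degrees
    width_shift_range=0.2,    # shift horizontally...
    height_shift_range=0.2,   # ...and vertically
    shear_range=0.2,
    zoom_range=0.2,
    horizontal_flip=True,     # randomly mirror the image
    preprocessing_function=preprocess_input)  # MobileNet's [-1, 1] scaling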
Training this model is no different than what you’ve done before: you can run model.fit_generator() a few times until you’re happy with the validation accuracy.
But before you get around to training this model for real, allow us to introduce a very handy Keras feature: callbacks. A callback is a Python function that is called at various points in the training process, for example when a new epoch begins or an epoch has just ended.
Note: You need to make sure the checkpoints directory actually exists, or Keras will give an error message when it tries to save the checkpoint. That's why you do os.makedirs() first. By the way, if you ever want to save the current state of the model by hand, you can always write model.save("mymodel.h5"). HDF5, with the extension .hdf5 or .h5, is the file format used by Keras to save its models. You can view these files with Netron.
Now you can train the model. You need to pass the array with the callback objects to fit_generator()'s callbacks argument.
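A sketch of what this could look like; the generator names, epoch count and patience value are assumptions based on the training output shown below:

from keras import callbacks

checkpoint_dir = "checkpoints/"
my_callbacks = [
    # Stop training when val_acc hasn't improved for more than 10 epochs.
    callbacks.EarlyStopping(monitor="val_acc", patience=10, verbose=1),
    # Save the model whenever the validation accuracy improves.
    callbacks.ModelCheckpoint(
        checkpoint_dir + "multisnacks-{val_loss:.4f}-{val_acc:.4f}.hdf5",
        monitor="val_acc", verbose=1, save_best_only=True),
]

histories = []  # run this once; used later by combine_histories()

history = top_model.fit_generator(train_generator,
                                  validation_data=val_generator,
                                  epochs=10,
                                  callbacks=my_callbacks)
histories.append(history)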
Training this model should be fairly quick on a computer with a GPU, since you're only training the one Dense layer for the logistic regression. On the author's iMac, however, it takes several minutes per epoch. That's too slow to be practical, which is why we chose to also rent an Amazon machine with a fast GPU.
This is also why Turi Create trains so much faster: First, it extracts the features from all the training images, which may take a while.
But, once it has these feature vectors, training the logistic regression on top of them is fast.
By doing the feature extraction only once, Turi and Create ML save a lot of time in the training phase. It is also possible to do this with Keras, see the SqueezeNet notebook for details. But because you're doing a lot of data augmentation, it's not really worth the trouble here.
Doing feature extraction as a separate step only makes sense if you plan to reuse the exact same images in every epoch. But with random data augmentation, where images are rotated, shifted and reflected in some other way, no two images are ever the same. And so all the feature vectors will be different for every epoch.
That's why this MobileNet-based model is trained end-to-end and not in two separate phases. In every epoch, Keras needs to compute all the feature vectors again because all the training images are now slightly different than last time. It's a bit slower, but that's a small price to pay for getting a much richer training set.
After training for 10 epochs, the validation accuracy slowly starts leveling off. Here is the code for plotting the accuracy again (same as in the last chapter):
def combine_histories():
history = {
"loss": [],
"val_loss": [],
"acc": [],
"val_acc": []
}
for h in histories:
for k in history.keys():
history[k] += h.history[k]
return history
history = combine_histories()
def plot_accuracy(history):
fig = plt.figure(figsize=(10, 6))
plt.plot(history["acc"])
plt.plot(history["val_acc"])
plt.xlabel("Epoch")
plt.ylabel("Accuracy")
plt.legend(["Train", "Validation"])
plt.show()
plot_accuracy(history)
At some point, Keras prints a message like this one:
Epoch 00010: val_acc did not improve from 0.70262
Because of the EarlyStopping callback, if there are more than 10 of such epochs in a row, Keras will stop training. But, at this point, you've only trained for 10 epochs in total, so this callback hasn't kicked in yet. The other callback, ModelCheckpoint, has so far only saved a new version of the model whenever the validation accuracy improved:
Epoch 00009: val_acc improved from 0.69215 to 0.70262, saving model to
checkpoints/multisnacks-1.0450-0.7026.hdf5
The two numbers in the filename, 1.0450 and 0.7026 respectively, are the validation loss and accuracy. After only ten epochs, this model already got up to 70% accuracy. Great! That's a lot better than your previous models and also improves on Turi's results already. But you're not done yet…
Fine-tuning the feature extractor
At this point, it’s a good idea to start fine-tuning the feature extractor. So far, you’ve been using the pre-trained MobileNet as the feature extractor. This was trained on the ImageNet dataset, which contains a large variety of photos from 1,000 different kinds of objects.
The pre-trained MobileNet knows a lot about photos of objects, including photos of food items. That is why you've trained a classifier on top of MobileNet's layers, so that it can translate this general knowledge about photos to your own 20 categories of snacks.
But the pre-trained feature extractor also contains a lot of irrelevant knowledge about animals, vehicles and all kinds of other things that are not snacks. We don't need that knowledge for our task of classifying snacks.
To fine-tune the MobileNet layers, first set them to trainable and then compile the model again:
for layer in base_model.layers:
layer.trainable = True
top_model.compile(loss="categorical_crossentropy",
optimizer=optimizers.Adam(lr=1e-4),
metrics=["accuracy"])
Uf’v ibtognamc ye eda u mixed veegdezg yiwe huz, ss=0a-3. Bsap’n gajaidi cua faz’y jugk fi rojyxirekz ybdis ekic ozocnhrujs wge ZehahuWew weravb bumi vaufgig usgoabg — duu icdq giyc ni rdaum pkiyu wuzeix o larcpe.
If’v pottuq zu vum cme xiuhnolh cegu gou xab slig cau vefj iz vdom teexs, ix qoe dirmr act ok johtyijujb oxuqet rmotkogdi. Xlo euvfuf seewj 7e-4 bx epxanogaygiwq i moz.
Pir ley_muzis.bibhezd() ays rue’kl poo rrab gzake olu moz izim 9 vesmeep tqiesayqa vucomivuth etdroew oz wipl 28,214. Dtaha efa gxikd atye kur-rjoerufki suzukahenv; qluxa eli ovuf sz fpu DesszQakhaxudusuer jahacj ho kuiw gvaxl ok umbarsid hvejo.
Tipqjv xic llo cofg layf yex_fukey.wim_zugogazel() osuix le bwuph xoho-retesw. Xti salv luc qeeqro icuidh i sox uc pjo vexupbepz zoxiaga rorrujbb tka agtebakow gok a wor vexi joqg wi ja.
Ap ebsa godc vxojm ej ryoco is nac pelouhu maa govrezab jpi kufim onaat. Lal joo whoezp vii rca gvoejenn ebg hivibaneij atzuwulj thixs yi afrjuti leupi boehmhy uvooq. El lin, haruf nbo tiuvfevk teja.
Copo: Jsiinizz ap demkoxcx o fih xnoloj nam hixaoye wdav nuri Cacax leipm ki bwuof abl vni dedaxg, yig lubp nbu Putwi megig. Uy ssa oanjux’n oRan, jdo odqofobuw nime quy i luqyca ebecz xatk iq ncox tut to 18 bujexob. Oj kbe Joluf burcupe nevb lpe SYU, xyo jadu kazw dmed 29 zaxitvs huf aromq wa 56 wokopdd — hog painys ol xef. Um’t ervo fuksigza jmex vui fufx haf ax iim-ov-teqicv ebrog es qguy seats. Ddedo apu hoku coqejulakp gi elpegi oqt xa cze ZJO riuwj zala VAD. Is mjer cirbubb, tapu mse wordn qizo fzezmav iyx pug dya mawps zbaz yzaefu yvu tisiqewebk upiek.
Alwif evuah 03 obilxx, sji lurajawiey dagc esc iyjipobx da darnag apzaac lo afrtiki. Ltih fbin wuxgowh, uy’s umudot tu vequto gwu beomratw wida. Loxo, cia kale ag cfkuu tarot msufheb:
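A sketch of one way to do that, using the Keras backend that was imported as K at the top of the notebook:

# Divide the optimizer's current learning rate by 3.
K.set_value(top_model.optimizer.lr,
            K.get_value(top_model.optimizer.lr) / 3.0)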
Now, train again for some 10 epochs. For the author, the validation accuracy immediately crept up a little further, even though it had stopped improving earlier.
When the learning rate is too large, the optimizer may never be able to properly home in on a good solution. That is why you start with a largish learning rate, to quickly get in the neighborhood of a good solution, and then make the learning rate smaller over time, in order to get as close to that solution as you can.
You can repeat this process of lowering the learning rate and training for a few more epochs several times, until the loss and accuracy are no longer noticeably improving.
Tip: Keras also has a LearningRateScheduler callback that can automatically reduce the learning rate, which is especially useful for training sessions with hundreds of epochs that you don't want to babysit. The ReduceLROnPlateau callback will automatically lower the learning rate when the validation accuracy is not getting any better. Very handy!
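A sketch of that callback; the factor and patience values here are assumptions:

from keras import callbacks

reduce_lr = callbacks.ReduceLROnPlateau(monitor="val_acc",
                                        factor=0.1,   # shrink lr by 10x
                                        patience=3,   # after 3 stagnant epochs
                                        verbose=1)
# Add reduce_lr to the callbacks list you pass to fit_generator().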
The final loss and accuracy curves will look something like this:
This took a combined few dozen epochs of training. Notice how there's a kink in the graphs at the point where you lowered the learning rate. Eventually, the curves flatten out, meaning that the model has learned all it can from the data.
The final accuracy on the test set is about 84%. That's a lot better than the SqueezeNet model from Turi Create. There are two reasons for this: 1) MobileNet is more powerful than SqueezeNet; and 2) Turi Create does not use data augmentation. However, it's still not quite as good as the model from Create ML, but that one used a pre-trained feature extractor that is more powerful than MobileNet. As we keep saying, it's all about finding a suitable compromise between accuracy, speed and size.
Note: Notice how, in the first few epochs, the validation loss and accuracy are actually a bit better than the training loss and accuracy. This is not unusual, especially with a relatively small validation set. It can also happen when you use a Dropout layer, which is only active for the training set but not for the validation set. You'll learn about dropout in the next section.
Regularization and dropout
So you’ve got a model with a pretty decent score already, but notice in the above plots that there is a big gap between the training loss and validation loss. Also, the training accuracy keeps increasing — reaching almost 100% — while the validation accuracy flattens out and stops improving.
This doesn't necessarily mean that the model is overfitting. The training accuracy is almost always a little better than the validation accuracy, because it's simply easier for the model to make good predictions on training images it has already seen than on images it has never seen before.
However, it's only a real problem when the validation loss or accuracy becomes worse over time. That doesn't appear to be happening here… while the validation curve isn't as good as the training curve, it doesn't actually become worse; it just flattens out.
Still, it would be better if the validation metrics were closer to the training metrics. You can encourage that by adding regularization to the model. This makes it harder for the model to get too fixated on the training images. Regularization is well worth using, but keep in mind that it isn't some magic trick that makes your validation score suddenly a lot better; it actually works the other way around and makes the training score a bit worse.
There are different methods for regularization, but what they all have in common is that they make learning a bit more difficult. This discourages the model from learning unimportant details, which can cause overfitting, and nudges it to focus only on what is truly important.
You'll use the following kinds of regularization:
Batch normalization
Dropout
L2 penalty
The MobileNet portion of the model already has a BatchNormalization layer after every convolution layer. These batch norm layers also act as a type of regularizer. The main purpose of batch normalization is to make sure that the data that flows between the layers stays well-behaved.
The computations involved introduce a small amount of noise, or random variations in the data, into the network. This noise prevents the model from memorizing specific image details. Regularization is not the main purpose of batch normalization, but it's a nice side benefit.
You will add the other two types of regularization to the logistic regression portion of the model. Create this new classifier model:
from keras import regularizers
top_model = Sequential()
top_model.add(base_model)
top_model.add(GlobalAveragePooling2D())
top_model.add(Dropout(0.5)) # this line is new
top_model.add(Dense(num_classes,
kernel_regularizer=regularizers.l2(0.001))) # new
top_model.add(Activation("softmax"))
Dropout is a special kind of layer that randomly removes elements from the tensor by setting them to zero. It works on the 1,024-element feature vector that is the output from the global pooling layer. Since you used 0.5 as the dropout percentage, Dropout will randomly set half of the feature vector's elements to zero. This makes it harder for the model to memorize things because, at any given time, half of the input data is randomly removed, and it's a different half for each training image.
Randomly removing elements from the feature vector might seem like a crazy thing to do, but it keeps the neural network honest. The connections in the Dense layer cannot depend too much on any single feature, since that feature might drop out of the vector at random. Using dropout is a great technique to stop the neural network from relying too much on memorizing specific training examples.
Aurélien Géron, in Hands-On Machine Learning with Scikit-Learn & TensorFlow, compares this to a workplace where, on any given day, a random portion of the people don't come in to work. With such a workplace, everyone has to be able to perform several different tasks and must be willing to cooperate with many different co-workers. This makes the company more resilient and less dependent on any single worker.
The dropout rate is a hyperparameter, so you get to decide how big or how small it should be. 0.5 is a good default choice. To disable dropout, simply set the rate to zero.
Note: Dropout is always disabled in inference mode. This layer is only active during training. We wouldn't want half of our predictions to randomly disappear!
The other kind of regularization you're using is an L2 penalty on the Dense layer. You've already briefly seen this in the chapter, "Digging Deeper Into Turi Create." When you use a kernel regularizer, as Keras calls it, the weights from this layer are added to the loss term. L2 means that it actually adds the square of the weights to the loss term, so that large weights count extra heavily.
Since it's the optimizer's job to make the loss as small as possible, it is now also incentivized to keep the weights small, because large weights result in a large loss value. This prevents situations where some features get really large weights, making them seem more important than features with much smaller weights. Thanks to the L2 penalty, the weights are more balanced, reducing the chance of overfitting.
The value 0.001 is a hyperparameter known as the weight decay. This lets you tweak how important the L2 penalty is in the loss function. If this value is too large, then the L2 penalty overshadows the rest of the loss terms and the model will have a hard time learning anything. If it's too small, then the L2 penalty doesn't really have any effect.
Now, you can compile this new model again and train it. Make sure to first train a few epochs with the MobileNet layers frozen, and then set trainable = True to fine-tune. And don't forget to periodically lower the learning rate! When you plot the loss curves, you'll notice that the validation loss now stays much closer to the training loss.
Note: With an L2 penalty, the initial loss can be much larger than the expected np.log(num_classes). That is not so strange, because it now adds the L2-norm of the weights to the loss as well. Starting out with a high loss value is usually no problem, as long as it goes down during training. If the loss doesn't go down, the first thing to try is using a lower learning rate. Note that the validation loss does not include this extra L2 cost.
Tune those hyperparameters
You’ve seen three different hyperparameters now:
the learning rate
the dropout probability
the weight decay factor for L2 regularization
Choosing appropriate values for these settings, known as hyperparameter tuning, is essential for getting the training process to work efficiently.
The way most people do hyperparameter tuning is just by trying things out and then seeing how the validation loss or accuracy changes. If you have a lot of hyperparameters, this can be a time-consuming job. There are ways to automate this, by using a grid search or a random search, which will try out many possible combinations of the hyperparameters.
It's very important that you use the validation set for tuning the hyperparameters, not the training set or the test set. The test set should only be used to verify how well your final model works, not for experiments with the hyperparameters.
There is a very good reason for this: When you tweak the hyperparameters based on the validation results, train the model with the new settings, tweak the hyperparameters again, and so on… then you're indirectly training the model on the validation set, too.
You're now repeatedly influencing the training process by making changes based on the validation results. In a way, the images from the validation set are "leaking" into the training process. That's OK since that's what the validation set is for. But you don't want this to happen to your test set, otherwise it can no longer give a trustworthy estimate of how well your final model performs on images it has never seen before, because indirectly it will have already seen those images.
You can keep tweaking these hyperparameters to squeeze a bit more performance out of the model but, at some point, you have to call it quits. The author got his best results with a dropout rate of 0.5 and a weight decay of 0.01. That model scored a couple of percentage points higher on the test set than before.
How good is the model really?
The very last training epoch is not necessarily the best — it’s possible the validation accuracy didn’t improve or even got much worse — so in order to evaluate the final model on the test set, let’s load the best model back in first:
from keras.models import load_model
best_model = load_model(checkpoint_dir +
"multisnacks-0.7162-0.8419.hdf5")
Note: The multisnacks-0.7162-0.8419.hdf5 file is included in this chapter's resources under final/MobileNet/checkpoints. If you were unable to train the model on your own computer, feel free to load this version.
Note: The above instructions are for Keras version 2.2.4 and Keras-Applications 1.0.7. They may or may not work with other versions.
The predict_generator() function runs the model on all the images from the test set and returns the predicted probabilities in the probabilities array. You then take the argmax of each prediction to find the index of the class with the highest probability.
Note: Before using predict_generator() you must first call reset() on the generator object. Otherwise, predict_generator() may not start at the first image and the predictions won't line up with the images.
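A sketch of those two steps, reusing the test_generator and best_model names from above:

test_generator.reset()  # start from the first image again
probabilities = best_model.predict_generator(test_generator)
predicted_labels = np.argmax(probabilities, axis=-1)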
The variable predicted_labels is a NumPy array with 952 numbers, one for each image in the test set. These are the predicted class indices. The correct, or ground-truth, class indices can be obtained from the test set generator:
target_labels = test_generator.classes
Now, you can compare these two arrays to find out where the classifier is correct and where it made mistakes, using the so-called confusion matrix:
from sklearn import metrics
conf = metrics.confusion_matrix(target_labels, predicted_labels)
The conf variable is another NumPy array, of shape 20×20. It's easiest to interpret when you plot it as a heatmap:
import seaborn as sns
def plot_confusion_matrix(conf, labels, figsize=(8, 8)):
fig = plt.figure(figsize=figsize)
heatmap = sns.heatmap(conf, annot=True, fmt="d")
heatmap.xaxis.set_ticklabels(labels, rotation=45,
ha="right", fontsize=12)
heatmap.yaxis.set_ticklabels(labels, rotation=0,
ha="right", fontsize=12)
plt.xlabel("Predicted label", fontsize=12)
plt.ylabel("True label", fontsize=12)
plt.show()
# Find the class names that correspond to the indices
labels = [""] * num_classes
for k, v in test_generator.class_indices.items():
labels[v] = k
plot_confusion_matrix(conf, labels, figsize=(14, 14))
This gives the following confusion matrix:
On the diagonal are the images that were correctly matched. Everything else is an incorrect match. Ideally, there should only be numbers on the diagonal and zeros everywhere else. From this confusion matrix, you can immediately see that apples are often wrongly predicted to be another class (look closely), and cookies and muffins get mixed up a few times.
Note: Earlier, we mentioned that the generator for the test set should not use data augmentation. Otherwise, calling evaluate_generator() gives you a slightly different score each time. You can actually use test augmentation to your advantage, which is TTA, or Test Time Augmentation.
For example, instead of making just one prediction for each test image, you could do it once for the normal image and once for the image flipped. Then the final score is the average of these two predictions. The more different variations of the test image you use, the better the average score will be.
This trick is often used in competitions to squeeze a few extra points out of the model's performance. Of course, making multiple predictions per image is also slower and therefore not really feasible for mobile apps.
Precision, recall, F1-score
It’s also useful to make a precision-recall report:
Precision means: of all the images that the model classified as being class X, how many really are X? For example, the precision for hot dog is quite good. Most of the time the model claims something is a hot dog, it really is a hot dog.
Precision is worse for a category such as pineapple, which means the model thinks a lot of images are pineapple when they really aren't. You can see this in the confusion matrix in the column for pineapple. When you add up the numbers in that column, you get the total number of pineapple predictions, of which only the number on the diagonal is correct; dividing the two gives the precision. About one out of every four images that the model claims is a pineapple actually isn't a pineapple. Oops, there's room for improvement here!
By the way, instead of counting up these numbers by hand, it's much simpler to write a bit of Python:
# Get the class index for pineapple
idx = test_generator.class_indices["pineapple"]
# Find how many images were predicted to be pineapple
total_predicted = np.sum(predicted_labels == idx)
# Find how many images really are pineapple (true positives)
correct = conf[idx, idx]
# The precision is then the true positives divided by
# the true + false positives
precision = correct / total_predicted
print(precision)
This should print the same precision value as in the report. As you can tell from the math, the more false positives there are, i.e. images the model thinks are of class X but that really aren't, the lower the precision.
Recall means: of all the images of class X, how many did the model actually find? This is, in some ways, the opposite of precision.
Recall for a category such as banana is high, so the images that contained banana were nearly always correctly found by the model. The recall for ice cream is quite a bit worse, as over one-fourth of the ice cream images were classified as something else. To verify this in Python:
# Get the class index for ice cream
idx = test_generator.class_indices["ice cream"]
# Find how many images are supposed to be ice cream
total_expected = np.sum(target_labels == idx)
# How many ice cream images did we find?
correct = conf[idx, idx]
# The recall is then the true positives divided by
# the true positives + false negatives
recall = correct / total_expected
print(recall)
This should print the same recall value as in the report. The more false negatives there are, i.e., images that are wrongly predicted to not be of class X, the lower the recall for X.
The classification report also includes the F1-score. This is a combination of precision and recall, and it is useful if you want a single average of the two.
The classes with the highest F1-score are grape and juice. You can safely say that this classifier works very well for images with grapes or juices. The class with the lowest F1-score is cake. If you wanted to improve this classifier, the first thing you might want to do is get more and better training images for the cake category.
Note: It's quite useful to be able to write a bit of Python code. Often you'll need to write small code snippets like the above to take a closer look at the predictions. Get comfortable with Python if you're interested in training your own models!
What are the worst predictions?
The confusion matrix and precision-recall report can already give hints about things you can do to improve the model. There are other useful things you can do. You’ve already seen that the cake category is the worst overall. It can also be enlightening to look at images that were predicted wrongly but that have very high confidence scores. These are the “most wrong” predictions. Why is the model so confident, yet so wrong about these images?
For example, you can use the following code to find the images that the model is the most wrong about. It uses some advanced NumPy indexing:
# Find for which images the predicted class is wrong
wrong_images = np.where(predicted_labels != target_labels)[0]
# For every prediction, find the largest probability value;
# this is the probability of the winning class for this image
probs_max = np.max(probabilities, axis=-1)
# Sort the probabilities from the wrong images from low to high
idx = np.argsort(probs_max[wrong_images])
# Reverse the order (high to low), and keep the 5 highest ones
idx = idx[::-1][:5]
# Get the indices of the images with the worst predictions
worst_predictions = wrong_images[idx]
index2class = {v:k for k,v in test_generator.class_indices.items()}
for i in worst_predictions:
print("%s was predicted as '%s' %.4f" % (
test_generator.filenames[i],
index2class[predicted_labels[i]],
probs_max[i]
))
This will output:
strawberry/09d140146c09b309.jpg was predicted as 'salad' 0.9999
apple/671292276d92cee4.jpg was predicted as 'pineapple' 0.9907
muffin/3b25998aac3f7ab4.jpg was predicted as 'cake' 0.9899
pineapple/0eebf86343d79a23.jpg was predicted as 'banana' 0.9897
cake/bc41ce28fc883cd5.jpg was predicted as 'waffle' 0.9885
It can also be instructive to actually look at these images:
from keras.preprocessing import image
img = image.load_img(test_data_dir +
test_generator.filenames[worst_predictions[0]])
plt.imshow(img)
Mec, uq’k pox tofy de vao ktj sva kunuy zam wajhesad, zaha. Piu yiakf gogi a moeb sava gzek wteq ehovi av fokasem sgevp ub kki cikf rok — od aw meajp uc xarg coljiisaxj:
A note on imbalanced classes
There is much more to say about image classifiers than we have room for in this book. One topic that comes up a lot is how to deal with imbalanced data.
In a binary classifier that needs to distinguish between disease present (positive) and not present (negative) in X-ray images, most X-rays will not show any disease at all. That's a good thing for the patients involved, but it also makes a harder job for the classifier. If the disease happens in only 1% of the patients, the classifier could simply always predict "disease not present" and it would be correct 99% of the time. But such a classifier is also pretty useless… 99% correct sounds impressive, but it's not always good enough.
Say you now want to train a classifier that can distinguish between the following cases: cat, dog, neither cat nor dog. In order to train such a classifier, you'll obviously need pictures of cats and dogs, but also pictures of things that are not cats and dogs. This last category tends to be much larger because it needs to cover a wide variety of objects, and the classifier must learn to sort all of those into the "not cat or dog" category. The downside is that the classifier will now learn more about what is not a cat or dog than about the cat and dog categories, which have many fewer images.
There are various techniques you can use to deal with class imbalance, such as oversampling, where you use the images from the smaller categories more often, undersampling, where you use fewer images from the larger category, or putting weights on the classes so that the larger category has a smaller effect on the loss; a sketch of that last approach follows below.
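A hedged sketch of class weighting in Keras; the generator and model names are assumptions:

import numpy as np
from sklearn.utils.class_weight import compute_class_weight

# Give rare classes a proportionally larger weight in the loss.
classes = np.unique(train_generator.classes)
weights = compute_class_weight(class_weight="balanced",
                               classes=classes,
                               y=train_generator.classes)

model.fit_generator(train_generator,
                    epochs=10,
                    class_weight=dict(zip(classes, weights)))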
Turi Create and Create ML currently have no options for this, so if you need to build a classifier on an imbalanced dataset, Keras is a better choice.
This ends our exploration of how to train image classifiers. Next up, you'll learn how to convert the trained Keras model to a Core ML model that you can use in your iOS and macOS apps.
Converting to Core ML
When you write model.save("name.h5") or use the ModelCheckpoint callback, Keras saves the model in its own format, HDF5. In order to use this model from Core ML, you have to convert it to a .mlmodel file first. For this, you’ll need to use the coremltools Python package.
The kerasenv environment already has coremltools installed. Just in case you need to install it by hand, type this into a Terminal window:
pip install -U coremltools
You can enter the following commands into the Jupyter notebook you've been using, MobileNet.ipynb. This chapter's resources also include a separate Python script, convert-to-coreml.py, that first loads the model from the best checkpoint and then does the conversion. Using a separate script makes it easy to add the model conversion step to a build script or CI (Continuous Integration) system.
First, import the package:
import coremltools
You may get some warning messages at this point about incompatible versions of Keras and TensorFlow. These tools change quicker than coremltools can keep up with but, usually, these warnings are not a problem. (If you get an error during conversion, you may need to downgrade your Keras install to the last supported version.)
Since this is a classifier model, coremltools needs to know what the class labels are. It's important that these are in the same order as in train_generator.class_indices:
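A sketch of the labels list and the conversion call, matching the argument descriptions that follow; the labels assume the snacks dataset's 20 class names in alphabetical order:

labels = ["apple", "banana", "cake", "candy", "carrot", "cookie",
          "doughnut", "grape", "hot dog", "ice cream", "juice", "muffin",
          "orange", "pineapple", "popcorn", "pretzel", "salad",
          "strawberry", "waffle", "watermelon"]

coreml_model = coremltools.converters.keras.convert(
    best_model,
    input_names="image",
    image_input_names="image",
    output_names="labelProbability",
    predicted_feature_name="label",
    red_bias=-1,
    green_bias=-1,
    blue_bias=-1,
    image_scale=2/255.0,   # together these compute: image / 127.5 - 1
    class_labels=labels)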
That's quite a few arguments, so let's look at them one by one:
The first argument is the Keras model object. Here you're using the best_model object that you loaded in the previous section.
input_names tells the converter what the input should be named in the .mlmodel file. Since this is an image classifier, it makes sense to use the name "image". This is also the name that's used by Xcode when it automatically generates the Swift code for your Core ML model.
image_input_names tells the converter that the input named "image" should be treated as an image. This is what lets you pass a CVPixelBuffer object to the Core ML model. If you leave out this option, the input is expected to be an MLMultiArray object, which is not as easy to work with.
output_names and predicted_feature_name are the names of the two outputs. The first one is "labelProbability" and contains a dictionary that maps the predicted probabilities to the names of the classes. The second one is "label" and is a string that contains the class label of the best prediction. These are also the names that Turi Create uses.
red_bias, green_bias, blue_bias, and image_scale are used to normalize the image. MobileNet, like the other models you've trained, expects the inputs to be in the range [-1, 1] instead of the usual [0, 255]. The chosen values are equivalent to the normalization function you've used before: image / 127.5 - 1. If these constants are incorrect, Core ML will give wrong predictions.
class_labels contains the list of class names you defined earlier.
You can also supply metadata, which can be helpful for the users of your model, especially the descriptions of the inputs and outputs:
coreml_model.author = "Your Name Here"
coreml_model.license = "Public Domain"
coreml_model.short_description = "Image classifier for 20 different types of snacks"
coreml_model.input_description["image"] = "Input image"
coreml_model.output_description["labelProbability"] = "Prediction probabilities"
coreml_model.output_description["label"] = "Class label of top prediction"
At this point, it's useful to write print(coreml_model) to make sure that everything is correct. The input should be of type imageType, not multiArrayType, and there should be two outputs: one a dictionaryType and the other a stringType.
Finally, save the model to an .mlmodel file:
coreml_model.save("MultiSnacks.mlmodel")
If you weren't on your Mac already, then transfer this .mlmodel file to your Mac.
Double-click the file to open it in Xcode:
Put it in the app and try it out!
Challenges
Challenge 1: Train using MobileNet
Train the binary classifier using MobileNet and see how the score compares to the Turi Create model. The easiest way to do this is to copy all the images for the healthy categories into a folder called healthy and all the unhealthy images into a folder called unhealthy. (Or maybe you could train a “foods I don’t like” vs. “foods I like” classifier.)
Note: For a binary classifier, you can keep using softmax with the loss function "categorical_crossentropy", which gives you two output neurons, one for each category. Alternatively, you can choose to have just a single output value, in which case the final activation should not be softmax but Activation("sigmoid"), the logistic sigmoid. The corresponding loss function is "binary_crossentropy". If you feel up to a challenge, try using this sigmoid + binary cross-entropy for the classifier. The class_mode for the ImageDataGenerator should then be "binary" instead of "categorical".
Challenge 2: Add more layers
Try adding more layers to the top model. You could add a Conv2D layer, like so:
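For instance, here is a hedged sketch of a deeper classifier head; the layer sizes are just a starting point (see the tip below for why the Reshape layer is needed):

top_model = Sequential()
top_model.add(base_model)
top_model.add(GlobalAveragePooling2D())
top_model.add(Reshape((1, 1, 1024)))            # Conv2D needs a rank-3 tensor
top_model.add(Conv2D(128, 1, activation="relu"))  # extra 1x1 conv layer
top_model.add(Flatten())
top_model.add(Dropout(0.5))
top_model.add(Dense(num_classes))
top_model.add(Activation("softmax"))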
Tip: To add a Conv2D layer after the GlobalAveragePooling2D layer, you have to add a Reshape layer in between, because global pooling turns the tensor into a vector, while Conv2D layers want a tensor with three dimensions.
Feel free to experiment with the arrangement of layers in this top model. In general, adding more layers will make the classifier more powerful, but it also makes the model bigger and slower. Keep an eye on the number of trainable parameters!
Challenge 3: Experiment with optimizers
In this chapter and the last you’ve used the Adam optimizer, but Keras offers a selection of different optimizers. Adam generally gives good results and is fast, but you may want to play with some of the other optimizers, such as RMSprop and SGD. You’ll need to experiment with what learning rates work well for these optimizers.
Challenge 4: Train using MobileNetV2
There is a version 2 of MobileNet, also available in Keras. MobileNet V2 is smaller and more powerful than V1. Just like ResNet50, it uses so-called residual connections, an advanced way to connect different layers together. Try training the classifier using MobileNetV2 from the keras.applications.mobilenetv2 module.
Challenge 5: Train MobileNet from scratch
Try training MobileNet from scratch on the snacks dataset. You’ve seen that transfer learning and fine-tuning works very well, but only because MobileNet has been pre-trained on a large dataset of millions of photos. To create an “empty” MobileNet, use weights=None instead of weights="imagenet". You’ll find that it’s actually quite difficult to train a large neural network from scratch on such a small dataset. See whether you can get this model to learn anything, and, if so, what sort of accuracy it achieves on the test set.
Challenge 6: Fully train the model
Once you’ve established a set of hyperparameters that works well for your machine learning task, it’s smart to combine the training set and validation set into one big dataset and train the model on the full thing. You don’t really need the validation set anymore at this point — you already know that this combination of hyperparameters will work well — and so you might as well train on these images too. After all, every extra bit of training data helps! Try it out and see how well the model scores on the test set now. (Of course, you still shouldn’t train on the test data.)
Key points
MobileNet uses depthwise separable convolutions because they're more efficient than regular convolutions. Ideal for running models on mobile devices. Instead of pooling layers, MobileNet uses convolutions with a stride of 2.
Training a large neural network on a small dataset is nearly impossible. It's smarter to do transfer learning with a pre-trained model, but even then you want to use data augmentation to artificially increase your training set. It's also a good idea to adapt the feature extractor to your own data by fine-tuning it.
Regularization helps to build robust, reliable models. Besides increasing the amount of training data, you can use batch normalization, dropout and an L2 penalty to stop the model from memorizing specific training examples. The larger the number of learnable parameters in the model, the more important regularization becomes.
You can use Keras callbacks to do automated learning rate annealing, save model checkpoints, and many other handy tasks.
Try your model on the test set to see how good it really is. Use a confusion matrix and a precision-recall report to see where the model makes mistakes. Look at the images that it gets very wrong to see if they are truly wrongly predicted, or if your dataset needs improvement.
Use coremltools to convert your Keras model to Core ML.
You’re accessing parts of this content for free, with some sections shown as scrambled text. Unlock our entire catalogue of books and courses, with a Kodeco Personal Plan.