《中国图象图形学报》论文投稿模板.docx (Journal of Image and Graphics paper submission template)
CLC number: (to be found in the Chinese Library Classification)  Document code: A  Article No.: 1006-8961(year)  Citation format:

Saliency detection via fusion of deep model and traditional model (abbreviations are not recommended in the title; centered)

Author 1, Author 2 (size-4 font; separate authors with commas; centered)

1. Affiliation, City, Province, Postal code; 2. Affiliation, City, Province, Postal code (size-6 font; affiliation names must be given in full)

Abstract: Objective Saliency detection is a fundamental problem in the fields of image processing and computer vision. Traditional models preserve the boundaries of salient objects well, but their confidence in the salient objects is not high enough and their recall rates are low; deep learning models, in contrast, assign high confidence to salient objects, but their results have coarse boundaries and lower precision. Considering the respective strengths and weaknesses of these two kinds of models, we propose a saliency model that exploits the advantages of both while suppressing their shortcomings. Method We first modify an existing dense convolutional network (DenseNet) and train a fully convolutional network (FCN) saliency model based on it; at the same time, we select an existing superpixel-based saliency regression model. After obtaining the saliency maps of the two models, we propose a fusion algorithm that combines the results of the two methods to obtain the final, refined result. Specifically, the FCN result is fused with the result of the traditional model through the Hadamard product of the saliency maps and a non-linear mapping of the pixel-wise saliency values. Result Experiments on 4 datasets compare the proposed model with 10 state-of-the-art methods. On the HKU-IS dataset, the F-measure is 2.6% higher than that of the second-best model; on the MSRA dataset, the F-measure is 2.2% higher and the MAE is 5.6% lower than those of the second-best model; on the DUT-OMRON dataset, the F-measure is 5.6% higher and the MAE is 17.4% lower than those of the second-best model. A comparative experiment on the MSRA dataset further verifies the effectiveness of the fusion algorithm. Conclusion The proposed saliency model combines the advantages of traditional models and deep learning models and makes the saliency detection results more accurate. (small size-5 Songti; write the abstract in the four parts Objective, Method, Result, and Conclusion; review papers need not follow this structure)

Keywords: saliency detection; deep convolutional neural network; fully convolutional network; fusion algorithm; Hadamard product (small size-5 Songti; separate keywords with semicolons; considering the future retrieval and dissemination of the paper, select keywords covering aspects such as theory, methods, field, research object, and focal search terms)

First-page footnote: Received date: ; Revised date: ; Foundation item: full Chinese name of the funding program (grant No.: ""); separate multiple items with ";". Supported by: full English name of the funding program (the Chinese and English names of funding programs can be checked in the download center of the journal website)

Saliency detection via fusion of deep model and traditional model (capitalize only the first letter of the first word; the rest in lowercase; centered)

Author 1,2, Author 2 (size-4 font; separate authors with commas; centered)

1. Affiliation, City, Province, Postal code, Country; 2. Affiliation, City, Province, Postal code, Country (size-6 italic font)

Abstract: Objective Saliency detection is a fundamental problem in computer vision and image processing, which aims to identify the most conspicuous objects or regions in an image. Saliency detection has been widely used in several visual applications, including object retargeting, scene classification, visual tracking, image retrieval, and scene segmentation. In most traditional approaches, salient objects are derived from features extracted from pixels or regions, and the final saliency maps consist of these regions with their saliency scores. The performance of these methods relies on the segmentation method and the selection of features. These approaches cannot produce satisfactory results when images with multiple salient objects or low-contrast content are encountered. Traditional approaches preserve the boundaries well but with insufficient confidence in salient objects, which yields low recall rates. Convolutional neural networks (CNNs) have been introduced into pixel-wise prediction problems, such as saliency detection, because of their outstanding performance in image classification tasks. CNNs redefine the saliency problem as a labeling problem in which the feature selection between salient and non-salient objects is performed automatically through gradient descent. A CNN cannot be used directly to train a saliency model; instead, a CNN can be utilized in saliency detection by extracting multiple patches around each pixel and using the patches to predict the center pixel's class. Patches are frequently obtained from different resolutions of the input image to capture global information. Another method is the addition of up-sampling layers to the CNN. Such a modified CNN is called a fully convolutional network (FCN), which was first proposed for semantic segmentation. Most saliency detection CNN models use the FCN to capture considerable global and local information. The FCN is a popular model that modifies the CNN to fit dense prediction problems; it replaces the softmax and fully connected layers in the CNN with convolution and deconvolution layers. Compared with traditional methods, FCNs can accurately locate salient objects and assign them high confidence. However, the boundaries of salient objects are coarse, and the precision rates are lower than those of the traditional approaches because of the down-sampling structure in FCNs. To deal with the limitations of these two kinds of saliency models, we propose a composite saliency model that combines their advantages and restrains their drawbacks. Method In this study, a new FCN based on the dense convolutional network (DenseNet) is built. For saliency detection, we replace the fully connected layer and the final pooling layer with a 1×1 kernel size convolution layer and a deconvolution layer. A sigmoid layer is applied to obtain the saliency maps.
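The following is a minimal, hedged sketch of the network modification just described, written in PyTorch with torchvision rather than the authors' Caffe implementation; the DenseNet-121 variant, the ×32 deconvolution, and all layer names are illustrative assumptions.

import torch
import torch.nn as nn
from torchvision import models

class DenseFCNSaliency(nn.Module):
    """FCN-style saliency head on a pre-trained DenseNet (illustrative sketch)."""
    def __init__(self):
        super().__init__()
        backbone = models.densenet121(weights="IMAGENET1K_V1")  # variant is an assumption
        self.features = backbone.features            # dense blocks, output stride 32
        in_ch = backbone.classifier.in_features      # 1024 channels for DenseNet-121
        # Replace the fully connected classifier and final pooling with a 1x1
        # convolution followed by a deconvolution (transposed convolution).
        self.score = nn.Conv2d(in_ch, 1, kernel_size=1)
        self.upsample = nn.ConvTranspose2d(1, 1, kernel_size=64, stride=32,
                                           padding=16, bias=False)  # x32 upsampling

    def forward(self, x):
        # Assumes input sides divisible by 32; otherwise resize the output to match.
        f = self.features(x)                 # (N, 1024, H/32, W/32)
        s = self.upsample(self.score(f))     # back to input resolution
        return torch.sigmoid(s)              # saliency map with values in [0, 1]

# Saliency regression with a squared Euclidean loss, as described below:
# criterion = nn.MSELoss()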
In the training process, the saliency network is trained with a squared Euclidean loss layer for saliency regression. We fine-tune the pre-trained DenseNet to train our saliency model. Our training set consists of 3900 images that are randomly selected from 5 public saliency datasets, namely, ECSSD, SOD, HKU-IS, MSRA, and ICOSEG. Our saliency network is implemented in the Caffe toolbox. The input images and ground-truth maps are resized to 500×500 for training, the momentum parameter is set to 0.99, the learning rate is set to 10^-10, and the weight decay is 0.0005. The SGD training procedure is executed on an NVIDIA GTX TITAN X GPU device and takes approximately one day for 200,000 iterations. Then, we use a traditional saliency model. The selected model adopts multi-level segmentation to produce several segmentations of an image, where each superpixel is represented by a feature vector that contains different kinds of image features. A random forest is trained on these feature vectors to derive the saliency maps. On the basis of these 2 models, we propose a fusion algorithm that combines the advantages of traditional approaches and deep learning methods. For an image, 15 segmentations are produced, and the saliency maps of all segmentations are derived by the random forest. Then, we use the FCN to produce another type of saliency map of the image. The fusion algorithm applies the Hadamard product to the 2 types of saliency maps, together with a non-linear mapping of the pixel-wise saliency values, to fuse the FCN result with the result of the traditional model.
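As a usage-level illustration of the fusion step, the sketch below applies the Hadamard (element-wise) product to the two saliency maps; the non-linear mapping of the pixel-wise saliency values is represented here by a simple square-root stretch, which is an assumption for illustration and not the authors' exact function.

import numpy as np

def fuse_saliency(fcn_map: np.ndarray, trad_map: np.ndarray) -> np.ndarray:
    """Fuse an FCN saliency map with a traditional-model saliency map.

    Both inputs are float arrays of the same shape with values in [0, 1].
    """
    fused = fcn_map * trad_map   # Hadamard product: keep pixels both models agree on
    fused = np.sqrt(fused)       # illustrative non-linear re-mapping (assumption)
    rng = fused.max() - fused.min()
    return (fused - fused.min()) / rng if rng > 0 else fused  # renormalize to [0, 1]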