
Article

Classification of Amanita Species Based on Bilinear Networks with Attention Mechanism

Peng Wang, Jiang Liu, Lijia Xu, Peng Huang, Xiong Luo, Yan Hu and Zhiliang Kang *

College of Mechanical and Electrical Engineering, Sichuan Agricultural University, Ya’an 625000, China; [email protected] (P.W.); [email protected] (J.L.); [email protected] (L.X.); [email protected] (P.H.); [email protected] (X.L.); [email protected] (Y.H.) * Correspondence: [email protected]; Tel.: +86-186-0835-1703

Abstract: The accurate classification of Amanita is helpful for research on its biological control and medicinal value, and it can also prevent poisoning incidents. In this paper, we constructed a bilinear convolutional neural network (B-CNN) with an attention mechanism, based on transfer learning, to classify Amanita. During training, the model is initialized with weights pre-trained on ImageNet, and the Adam optimizer is used to update the network parameters. During testing, images of Amanita at different growth stages were used to further test the generalization ability of the model. After comparing our model with other models, the results show that our model greatly reduces the number of parameters while achieving high accuracy (95.2%) and has good generalization ability. It is an efficient classification model, which provides a new option for mushroom classification in areas with limited computing resources.

Keywords: deep learning; bilinear convolutional neural networks; attention mechanism; transfer learning

1. Introduction

Amanita is a large genus of fungi and an important part of natural medicine resources. At present, using the characteristics of its toxins to control and treat tumors is a promising method [1–3]. Amanita muscaria is a famous hallucinogenic mushroom, which can be used to develop special drugs for anesthesia and sedation [4]. In terms of biological control, the toxins contained in Amanita albicans and Amanita muscaria have certain trapping and killing effects on insects and agricultural pests [3,5]. At present, there is no artificially cultivated Amanita, so the amatoxins needed in scientific research can only be extracted from fruit bodies collected in the field [6–8]. Moreover, due to a lack of knowledge and ability to identify poisonous mushrooms, there are a number of cases of poisoning death from eating wild mushrooms every year [9–13]; 95% of these deaths are caused by poisonous Amanita [14,15]. Therefore, it is necessary to accurately classify and identify Amanita, both in terms of use value and poisoning prevention.

Many researchers have contributed to the classification of mushrooms. For example, Ismail [16] studied the characteristics of mushrooms, such as the shape, surface and color of the cap, roots and stems, and used the principal component analysis (PCA) algorithm to select the best features for a classification experiment with the decision tree (DT) algorithm. Pranjal Maurya [17] used a support vector machine (SVM) classifier to distinguish edible and poisonous mushrooms, with an accuracy of 76.6%. Xiao [18] used the ShuffleNetV2 model to quickly identify the toxicity of wild fungi; the accuracy of the model is 55.18% for Top-1 and 93.55% for Top-5. Chen [19] used the Keras platform to build a convolutional neural network (CNN) for end-to-end model training and migrated it to Android to realize mushroom recognition on mobile devices, but the recognition effect of the model was poor. Preechasuk J [20] proposed a new model for classifying 45 types of mushrooms, including edible and poisonous ones, using a CNN,


which gives results of 0.78, 0.73 and 0.74 for precision, recall and F1 score, respectively. Dong J [21] proposed a CNN model that can detect qualified enoki mushroom caps and three common types of substandard ones, achieving 98.35% accuracy. However, this model is suitable for only a few types of mushrooms, and the cap must be detected.

Since methods based on the visual attention mechanism [22,23] can locate the distinguishable areas in an image without additional annotation information, they have been widely used in fine-grained image classification in recent years. At present, mainstream attention mechanisms can be divided into the following three types: channel attention, spatial attention and self-attention.

Channel attention aims to model the correlation between different channels. In a convolutional neural network, each image is initially represented by three channels (R, G, B) [24]. After different convolution kernels are applied, each channel generates a new channel with different information. Channel attention automatically learns the importance of each feature channel through the network and finally assigns a different weight coefficient to each channel, strengthening important channels and suppressing unimportant ones (a minimal sketch is given at the end of this section).

Spatial attention is designed to enhance the spatial expression of critical areas. DeepMind designed a Spatial Transformer Layer (STL) to realize spatial invariance [25,26]. Its principle is to transform the spatial information in the original picture into another space through the STL while retaining the key information, then generate a weight mask for each position and weight the output. This method can enhance the specific target area of interest while weakening irrelevant background areas, so as to extract the key information.

Self-attention reduces the dependence on external information and uses, as much as possible, the information inherent in the feature itself for attention interaction [27,28]. However, the disadvantage of this method is that every point must capture global contextual information, which incurs high computational complexity and memory consumption, and the information on the channel is not considered.

In this paper, a CNN combined with an attention mechanism is proposed to solve the difficult classification and identification of Amanita. The specific contributions and innovations are as follows:
(1) A self-built Amanita dataset of 3219 Amanita images obtained from the Internet and divided by species.
(2) A bilinear convolutional neural network model was built and fine-tuned to make it more suitable for the dataset.
(3) The bilinear convolutional neural network model was combined with the attention mechanism to improve the model. This method can quickly obtain the most effective information.
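As a concrete illustration of the channel-attention idea described above (not the exact module used in this paper), a minimal squeeze-and-excitation-style block in Keras might look like the following sketch; the reduction ratio of 16 is an assumed default:

```python
import tensorflow as tf
from tensorflow.keras import layers

def channel_attention(inputs, reduction=16):
    """Minimal channel-attention sketch (squeeze-and-excitation style)."""
    channels = inputs.shape[-1]
    # Squeeze: global average pooling summarizes each channel as one scalar.
    s = layers.GlobalAveragePooling2D()(inputs)
    # Excitation: a small bottleneck MLP learns per-channel importance weights.
    s = layers.Dense(channels // reduction, activation="relu")(s)
    s = layers.Dense(channels, activation="sigmoid")(s)
    # Reweight: scale each input channel by its learned weight.
    s = layers.Reshape((1, 1, channels))(s)
    return layers.Multiply()([inputs, s])
```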

2. Materials and Methods

2.1. Image Dataset

In this paper, the original dataset comes from two sources. The first is a mushroom dataset downloaded from the Kaggle platform. The data on Kaggle are public and have a certain degree of authority. Special thanks are owed to the Nordic Society of Mycologists, who provided the most common mushroom sources in the region on Kaggle and checked the data and labels. We selected the Amanita images based on the labels of this mushroom dataset. The second set of mushroom images was collected from http://www.mushroom.world (accessed on 24 September 2020). We searched the mushroom database on this website by mushroom name. Then, we recorded the color and structure of the cap of each Amanita (such as egg-shaped, unfolded cap, umbrella-shaped, spherical) according to [29] to confirm the species of Amanita again. Finally, the Amanita dataset was obtained, as shown in Table 1.

Table 1. Seven species of Amanita and the number of samples of each species.

Varieties                 Sample Quantity (pieces)
Amanita bisporigera       725
Amanita vaginata          303
Amanita caesarea          289
Amanita echinocephala     463
Amanita muscaria          731
Amanita phalloides        400
Amanita pantherina        299

In this paper, in order to make the model more applicable to the wild environment, most of the data pictures show Amanita growing in the wild, but some pictures of Amanita that were picked by hand are also included.

2.2. Data Augmentation

There are seven kinds of Amanita in the original dataset, a total of 3219 pictures. All samples are randomly divided into training and test datasets according to a ratio of 8:2. Training convolutional neural networks requires a lot of image data to prevent overfitting. Therefore, in this paper, the input image data are augmented using the built-in ImageDataGenerator [30,31] interface of TensorFlow 2.0. The number of images is increased by combining random rotation, translation, cutting and other operations, as shown in Figure 1b–e. This method roughly increases the number of images by 6 times.
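A minimal sketch of this augmentation setup is shown below; the specific transformation ranges and the directory layout are illustrative assumptions, not the paper's reported settings:

```python
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Illustrative augmentation pipeline; rotation/shift/shear ranges are assumed values.
datagen = ImageDataGenerator(
    rotation_range=30,       # random rotation
    width_shift_range=0.1,   # random horizontal translation
    height_shift_range=0.1,  # random vertical translation
    shear_range=0.1,         # random shearing ("cutting")
    horizontal_flip=True,
    rescale=1.0 / 255,
)

# Hypothetical directory layout: one subfolder per Amanita species.
train_flow = datagen.flow_from_directory(
    "amanita/train",
    target_size=(224, 224),
    batch_size=32,
    class_mode="categorical",
)
```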


Figure 1. Images after data augmentation. (a) Original image; (b–e) effect images after data augmentation.

2.3. Method

2.3.1. The EfficientNet Model

EfficientNet [32] was proposed by the Google team in 2019. Through comprehensive optimization of network width, depth and input image resolution, it achieves higher accuracy with fewer model parameters. EfficientNet-B4 was selected after comprehensively considering the parameters and accuracy of the eight models (EfficientNet-B0 to B7) reported in the literature [33,34]. The network structure of EfficientNet-B4 is shown in Figure 2.
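Loading the EfficientNet-B4 backbone with ImageNet weights might look like the sketch below. This assumes a recent TensorFlow release that ships EfficientNet in keras.applications; the Keras/TensorFlow versions reported later in this paper may instead require a standalone EfficientNet package:

```python
from tensorflow.keras.applications import EfficientNetB4

# Backbone with ImageNet weights; the original classification head is dropped
# so that a custom head for the seven Amanita species can be attached.
backbone = EfficientNetB4(
    weights="imagenet",
    include_top=False,
    input_shape=(224, 224, 3),
)
backbone.trainable = False  # frozen during the initial transfer-learning phase
```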


Figure 2. The network structure of EfficientNet-B4.

2.3.2. Bilinear Convolutional Neural Networks

The bilinear convolutional neural network (B-CNN) [35,36], proposed by Lin in 2015, is a representative weakly supervised fine-grained classification model. Its network structure is shown in Figure 3. It uses two convolutional neural networks, A and B, to extract two features at each position of the image, multiplies them with the outer product, and finally classifies through the classification layer.
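A minimal sketch of the bilinear (outer-product) pooling step, assuming two feature maps of shape (batch, H, W, C) from streams A and B; the signed square-root and L2 normalization follow the original B-CNN formulation:

```python
import tensorflow as tf

def bilinear_pooling(feat_a, feat_b):
    """Outer-product pooling of two feature maps, as in B-CNN.

    feat_a: (batch, H, W, C_a), feat_b: (batch, H, W, C_b).
    Returns a (batch, C_a * C_b) bilinear descriptor.
    """
    batch = tf.shape(feat_a)[0]
    ca, cb = feat_a.shape[-1], feat_b.shape[-1]
    # Flatten the spatial positions: (batch, H*W, C).
    a = tf.reshape(feat_a, (batch, -1, ca))
    b = tf.reshape(feat_b, (batch, -1, cb))
    # Sum of outer products over all positions: (batch, C_a, C_b).
    phi = tf.matmul(a, b, transpose_a=True)
    phi = tf.reshape(phi, (batch, ca * cb))
    # Signed square-root and L2 normalization, standard in B-CNN.
    phi = tf.sign(phi) * tf.sqrt(tf.abs(phi) + 1e-12)
    return tf.math.l2_normalize(phi, axis=-1)
```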

Figure 3. Image classification using a B-CNN.

The two streams are coordinated through the CNN A and CNN B networks. The function of CNN A is to locate the feature parts of the image, and CNN B extracts the features of the regions detected by CNN A [37]. In this way, the local detection and feature extraction tasks of the fine-grained image classification process can be completed.

2.3.3. Visual Attention Mechanism

In this paper, we chose to use mixed attention, which combines multiple attention mechanisms and can bring better performance to a certain extent. An excellent work in this area is the Convolutional Block Attention Module (CBAM) [38], which is based on


the channel attention module (CAM) connected with a spatial attention module (SAM). CAM and SAM are shown in Figure 4.

Figure 4. Diagram of each attention sub-module.

CAM passes the max-pooling output and the average-pooling output through a shared network [39]. SAM generates two feature maps representing different information by performing global average pooling and global max pooling. After merging the two feature maps, feature fusion is performed through a 7 × 7 convolution with a larger receptive field; finally, the weight map is generated by the sigmoid operation and superimposed back onto the original input feature map [40]. This achieves the purpose of enhancing the target area.

2.3.4. Our Model

In this paper, a bilinear EfficientNet-B4 model is built and combined with the convolutional block attention module (CBAM), shown in Figure 5, which is a combination of the spatial and channel modules. The structure of the bilinear model with the attention mechanism added is shown in Figure 6.

The overall process of using the model is:
(1) Use the EfficientNet-B4 network architecture to extract the feature layer from the augmented input image.
(2) Combine the output of the convolutional layer with CBAM: it first passes through a channel attention module to obtain a weighted result, then through a spatial attention module, and finally the extracted result is obtained by weighting (a sketch of this module follows the list).
(3) Multiply the extraction results obtained by CBAM with the feature layer one by one.
(4) Join the fully connected layer for classification to obtain the final classification result.
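The following sketch shows how CAM and SAM could be implemented as described above, assuming Keras functional-style tensors; the channel-reduction ratio of 16 follows the CBAM paper's default rather than a value reported here:

```python
import tensorflow as tf
from tensorflow.keras import layers

def cbam_block(inputs, reduction=16):
    """Illustrative CBAM sketch: channel attention (CAM) then spatial attention (SAM)."""
    channels = inputs.shape[-1]

    # CAM: pass avg- and max-pooled descriptors through a shared two-layer MLP.
    dense1 = layers.Dense(channels // reduction, activation="relu")
    dense2 = layers.Dense(channels)
    avg = dense2(dense1(layers.GlobalAveragePooling2D()(inputs)))
    mx = dense2(dense1(layers.GlobalMaxPooling2D()(inputs)))
    cam = tf.sigmoid(avg + mx)
    x = inputs * layers.Reshape((1, 1, channels))(cam)

    # SAM: concatenate channel-wise average and max maps, fuse them with a
    # 7 x 7 convolution, and generate the spatial weight map with a sigmoid.
    avg_map = tf.reduce_mean(x, axis=-1, keepdims=True)
    max_map = tf.reduce_max(x, axis=-1, keepdims=True)
    sam = layers.Conv2D(1, kernel_size=7, padding="same", activation="sigmoid")(
        tf.concat([avg_map, max_map], axis=-1)
    )
    return x * sam
```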

Figure 5. Structure of CBAM.



Figure 6. B-CNN with the attention mechanism added.
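Putting the pieces together, the architecture in Figure 6 might be assembled as in the sketch below, reusing the bilinear_pooling and cbam_block helpers sketched earlier. Two assumptions are made here: the two bilinear streams share one backbone (the ENB4 + ENB4 parameter counts in Table 3 are close to a single EfficientNet-B4, which suggests shared weights), and a 1 × 1 convolution reduces the channel count so that the bilinear descriptor and the Fc 1024 head stay tractable:

```python
from tensorflow.keras import layers, Model
from tensorflow.keras.applications import EfficientNetB4

NUM_CLASSES = 7  # seven Amanita species

inputs = layers.Input(shape=(224, 224, 3))

# Shared EfficientNet-B4 backbone feeding both bilinear streams (assumption).
backbone = EfficientNetB4(weights="imagenet", include_top=False)
feat = cbam_block(backbone(inputs))             # CBAM-refined feature layer

# Assumed 1 x 1 reduction to keep the outer-product descriptor small.
feat = layers.Conv2D(128, kernel_size=1, activation="relu")(feat)

phi = bilinear_pooling(feat, feat)              # symmetric outer-product pooling
x = layers.Dense(1024, activation="relu")(phi)  # Fc 1024 head, as in Table 3
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)
model = Model(inputs, outputs)
```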

2.4. Parameters and Index

In the choice of optimizer, we compared stochastic gradient descent (SGD) [41] and adaptive moment estimation (Adam) [42] and found that the performance of the model decreased by about 1% when the SGD optimizer with a learning rate of 0.001 and momentum of 0.95 was used. We therefore chose Adam, with hyperparameters beta_1 = 0.9, beta_2 = 0.999, epsilon = 1 × 10−8 and decay = 0.0, as the optimizer for all models. In order to compare the accuracy and efficiency of different models, we used unified hyperparameters to train the different network models (Table 2).

Table 2. Training parameters of models.

Item     Optimization Method   Initial Learning Rate   Loss                       Batch Size   Training Epochs   Metrics
Value    Adam                  0.0001                  Categorical Crossentropy   32           20                Accuracy
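Compiling the model assembled above with these settings might look like the following sketch (decay = 0.0 is the Keras default, so it is omitted):

```python
import tensorflow as tf

# Optimizer and training configuration from Section 2.4 and Table 2.
optimizer = tf.keras.optimizers.Adam(
    learning_rate=1e-4,  # initial learning rate
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-8,
)

model.compile(
    optimizer=optimizer,
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)
```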

The simplest and most commonly used metric for evaluating classification models is accuracy, but precision and recall are also needed to evaluate the quality of the model. Precision [43] can be understood as the number of correctly predicted Amanita species images divided by the number of images the classifier predicted as that species. Recall [44] is the percentage of correctly predicted Amanita species images among all images actually belonging to that category. F1-score is the harmonic mean of precision and recall. Accuracy, Precision, Recall and F1-score are defined as follows:

Accuracy = (TP + TN) / (TP + FP + TN + FN)    (1)

Precision = TP / (TP + FP)    (2)

Recall = TP / (TP + FN)    (3)

F1-score = 2 · (Precision · Recall) / (Precision + Recall)    (4)

where TP and TN represent Amanita images that are correctly classified, and FP and FN indicate Amanita images that are misclassified.
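These metrics can be computed, for example, with scikit-learn. Since this is a seven-class problem, per-class precision and recall must be averaged; the macro average below is an assumption, as the paper does not state which averaging it used:

```python
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

# y_true, y_pred: hypothetical integer species labels for the test images.
accuracy = accuracy_score(y_true, y_pred)
precision = precision_score(y_true, y_pred, average="macro")
recall = recall_score(y_true, y_pred, average="macro")
f1 = f1_score(y_true, y_pred, average="macro")
print(f"Accuracy={accuracy:.4f}  Precision={precision:.4f}  "
      f"Recall={recall:.4f}  F1={f1:.4f}")
```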


3. Experimental Results and Discussion

3.1. Model Training

In this paper, the experimental environment is the Google Colaboratory platform, which provides Tesla K80 GPU resources. The programming environment is Python 3, and the framework is Keras 2.1.6 with TensorFlow 1.6.

The specific model training steps are as follows:
• Data loading: a batch of Amanita pictures (32 pictures) is randomly loaded from the training dataset for subsequent data processing.
• Image preprocessing: resize each image to 224 × 224 × 3, then pass it through the TensorFlow 2.0 built-in ImageDataGenerator for data augmentation.
• Define the model structure and load the pre-trained weights: load the model (such as EfficientNet-B4) and fine-tune it. Change the fully connected layer to a custom layer and modify the Softmax layer to seven outputs according to the number of classes required. In this paper, different layers are frozen depending on the model, and transfer learning [45] is used to perform pre-training with ImageNet weights.
• Start training: before training the model, it is necessary to set the hyperparameters and optimizer related to the network structure. Pass the training dataset pictures (as shown in Figure 7) to the neural network for training; the feature map of the first convolutional layer is shown in Figure 8. After one round of model training, the loss and accuracy on the training dataset are obtained.
• Stop training: to avoid overtraining the network, an early stopping strategy is set, and the validation loss is monitored during the training process.
When the validation loss does not improve within five rounds or the model training reaches the preset number of epochs, the model stops training.
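This early stopping strategy corresponds to a standard Keras callback, sketched below; val_flow is a hypothetical validation generator alongside the train_flow from Section 2.2:

```python
from tensorflow.keras.callbacks import EarlyStopping

# Stop when the validation loss has not improved for five consecutive epochs.
early_stop = EarlyStopping(monitor="val_loss", patience=5, restore_best_weights=True)

history = model.fit(
    train_flow,                # augmented training generator (Section 2.2)
    validation_data=val_flow,  # hypothetical validation generator
    epochs=20,                 # preset maximum from Table 2
    callbacks=[early_stop],
)
```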

Figure 7. Input image diagram.


Figure 8. Feature map of the first convolutional layer.

3.2. Comparison of Modeling Methods

In order to verify the performance of the model, we compared the proposed model with other CNN models on the dataset. The structure and parameters of these models are shown in Table 3.

Table 3. Some parameters of these models.

Model                  Classification Layer   Total Parameters (M)   Trainable Parameters (M)   Size (MB)
VGG-16                 2 × (Fc 2048)          70.3                   68.6                       268
ResNet-50              Fc 2048                229.1                  228.9                      874
ENB4 *                 Fc 2048                21.4                   21.1                       82
VGG-16 + VGG-16        Fc 1024                283.2                  275.6                      1080
VGG-16 + ResNet-50     Fc 1024                305.7                  277.8                      1167
ENB4 + ENB4            Fc 1024                19.6                   19.4                       130
ENB4 + CBAM            Fc 1024                19.6                   19.4                       75
ENB4 + ENB4 + CBAM     Fc 1024                19.7                   19.4                       130

* ENB4 = EfficientNet-B4.

In this paper, the steps of the comparison experiment are: (1) Use the VGGNet model with 16 layers (VGG-16) [46] and the Residual Network with 50 layers (ResNet-50) [47], and compare these two network models with EfficientNet-B4. (2) Combine the bilinear structure to build a bilinear EfficientNet-B4, and compare it with the bilinear VGG-16 model (B-CNN(VGG-16, VGG-16)) and the B-CNN(VGG-16, ResNet-50) model. (3) Add an attention mechanism to the models, conduct a comparative experiment, and discuss the effect of adding the attention mechanism.

Figure 9 shows the changes in loss and accuracy during training and testing. Table 4 compares the results of the different methods. The following conclusions can be drawn from the combined chart and table:
(1) EfficientNet-B4 is superior to VGG-16 and ResNet-50 in terms of accuracy, model parameters and model size.
(2) On this basis, the bilinear structure was studied and used, and it was found that B-CNN(VGG-16, ResNet-50) has good accuracy. However, it has the largest number of parameters among the models used, and the model itself is also very large. In contrast, the bilinear EfficientNet-B4 performs well in accuracy, model size and number of parameters.
(3) For EfficientNet-B4 (accuracy of 92.76%), adding the attention mechanism raises the accuracy to 93.53%, an improvement of 0.77%; combining the bilinear structure and the attention mechanism raises the accuracy to 95.2%, an increase of 1.77%. In general, adding an attention mechanism to the model increases the accuracy by about 1% and can reduce the time by 0.5 s.

Figure 9. The accuracy of the models during training and testing.

In addition, by comparing the first five rounds of training and testing in Figure 9, it can be found that the test accuracy is slightly higher than the training accuracy. The main reason is that the network is initialized with pre-trained weights; the model therefore already has good feature extraction capabilities in the first few rounds of testing.


Table 4. Comparison of 8 models' Amanita classification results.

Model                  Train Accuracy (%)   Test Accuracy (%)   Precision (%)   Recall (%)   F1-score   Time (s/Frame)
VGG-16                 93.89                89.66               90.03           88.01        89.01      2.41
ResNet-50              94.19                89.98               90.1            90.24        90.17      2.63
ENB4 *                 95.7                 92.76               92.01           92.13        92.07      5.06
VGG-16 + VGG-16        94.77                92.94               91.09           91.76        91.42      6.89
VGG-16 + ResNet-50     96.94                93.27               92.19           93.6         92.89      7.21
ENB4 + ENB4            97.57                94.35               93.61           93.9         93.76      5.14
ENB4 + CBAM            97.13                93.53               92.83           93.67        93.25      4.47
ENB4 + ENB4 + CBAM     99.4                 95.2                94.5            94.6         94.6       4.56

* ENB4 = EfficientNet-B4.

3.3. Model Test Results

Not all objects are equally difficult to classify, so it is necessary to observe the accuracy of each category and the confusion between categories. The test dataset contains 638 pictures, and the bilinear EfficientNet-B4 with the attention model is used to obtain the confusion matrix shown in Figure 10.
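A confusion matrix like the one in Figure 10 can be obtained, for example, with scikit-learn; test_flow is a hypothetical non-shuffled generator over the 638 test images:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Predict on the test set; the generator must not shuffle so that
# test_flow.classes lines up with the prediction order.
probs = model.predict(test_flow)
y_pred = np.argmax(probs, axis=1)
cm = confusion_matrix(test_flow.classes, y_pred)
print(cm)  # rows: true species, columns: predicted species
```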

Figure 10. Confusion matrix of Amanita dataset. Figure 10. Confusion matrix of Amanita dataset. It can be seen from Figure 10 that it is difficult to classify among the three types of , and , resulting in the lowest accuracy and recall rate of Amanita vaginata. Observing the dataset found that there are three main reasons:

(1) Amanita vaginata and the pure white Amanita bisporigera are similar in shape and features, except for the difference in color on the surface of the cap. Moreover, the shapes of Amanita vaginata, Amanita bisporigera and Amanita phalloides are very similar in their juvenile period.


ItItIt It cancanItcan Itcan can can bebebe be be seenseenbeseen seen seen seen fromfromfrom from from from FigureFigureFigure Figure Figure Figure 10 1010 10 10 thatthat10that that that that ititit it isisitis isit difficultis difficult isdifficult difficult difficult toto to classifyto classify toclassify classify classify among among among among among thethe the the the threethree three three three typestypes types types types ofof of of of AmanitaAmanitaAmanitaAmanitaItItAmanitaItIt canIt can ItcancanIt can can can vaginata bevaginata be vaginatabebe vaginata be bevaginataseen beseen seenseen seen seenseen, , , fromAmanita from,fromAmanita from Amanita, from ,fromAmanita fromAmanita Figure Figure FigureFigure Figure Figure bisporigeraFigure bisporigera bisporigera bisporigera bisporigera10 10 1010 10 10that that10 thatthat that that that and and itand it anditit isand itis itandis is it Amanita difficultisAmanita difficultisAmanitadifficultisdifficult Amanita difficult difficultAmanitadifficult phalloides, tophalloides, to phalloides,toto phalloides, toclassify tophalloides,classify toclassifyclassify classify classifyclassify resulting resulting resulting among resultingamong among amongresulting amongresulting amongamong the thein theinthein inthethe the inthe threethethree inthreethethree thethree threethelowest threelowestlowest lowest typeslowesttypes typeslowesttypes types typestypes accu- accu-accu- ofaccu- of ofofaccu- accu-of of of AmanitaAmanitaAmanitaAmanitaracyracyAmanitaracyAmanitaracyAmanitaracyracy and andand and vaginata vaginata andvaginata vaginataand vaginatarecall recallvaginatarecallvaginata recall recall recall, , rate rate, Amanita,rate Amanita Amanitarate,Amanita ,, rateAmanita Amanitarate Amanita of ofof of Amanita of Amanita of Amanita bisporigera bisporigera Amanita bisporigerabisporigeraAmanita bisporigera bisporigerabisporigera vaginata vaginata vaginata vaginata vaginata and and andand and andand .. Amanita. AmanitaObserving Observing.AmanitaAmanitaObserving Observing. Amanita Amanita.ObservingAmanita Observing phalloides, phalloides, phalloides,phalloides, phalloides, phalloides,the thephalloides,the the the thedataset datasetdataset dataset dataset resultingdataset resulting resultingresulting resulting resultingresulting found foundfound found found found in in in inthat that that in the in thatthe inthe thethat thethat the thethere therelowest therelowest lowesttherelowest lowestthere lowesttherelowest are areare are accu- accu- accu-areaccu- arethree three accu-three accu-accu-three three three racyracyracyracymainracymainracymainracy main and andmain andand andreasons: andreasons:and reasons: recall recall reasons:recall recall reasons: recall recallrecall rate rate raterate rate raterate of of ofof ofAmanita ofAmanita ofAmanitaAmanita Amanita AmanitaAmanita vaginata vaginata vaginatavaginata vaginata vaginatavaginata.. . Observing. Observing Observing.Observing. . 
Observing ObservingObserving the the thethe the the the dataset dataset datasetdataset dataset datasetdataset found found foundfound found foundfound that that thatthat that thatthat there there therethere there therethere are are areare are are are three three threethree three threethree mainmainmain reasons: reasons: reasons: mainmain(1)(1)(1)main(1)main (1) (1) reasons:reasons:Amanita Amanita Amanita reasons:reasons:AmanitaAmanita vaginata vaginata vaginata vaginata vaginata and andand and and and pure purepure pure pure pure white whitewhite white white white Amanita Amanita Amanita Amanita Amanita bisporigera bisporigera bisporigera bisporigera bisporigera are areare are are aresimilar similarsimilar similar similar similar in inin in shape shapein shape inshape shape shape and andand and and and feature, feature,feature, feature, feature, feature, Agriculture 2021, 11, 393 (1)(1)(1)(1)(1) (1) (1) Amanita Amanita AmanitaAmanita exceptexceptAmanitaexceptAmanitaexceptAmanitaexceptexcept vaginataforvaginata for vaginataforvaginata for vaginata vaginatafor vaginatathe forthethe the the thedifference difference difference andand difference and anddifference and difference andand purepure purepure pure purepure inwhiteinwhite in whitewhitein white colorcolorin whitecolor whiteincolor color color AmanitaAmanita AmanitaAmanita on onAmanita on AmanitaonAmanita on thetheonthe the the thesubisporigera subisporigera su bisporigerabisporigerasu rfacebisporigerarfacesu bisporigerarfacebisporigerasurfacerfacerface of ofof of the theofare theare ofaretheare are the are thefungusare fungussimilarfungussimilar similarfungussimilar similar fungus similarfungussimilar cap. cap.inin cap. inincap. in shapecap.inshape cap.inshapeshape The Theshape The shapeTheshape The The shapes andshapesand shapes andandshapes and shapes andshapesand feature,feature, feature,feature, feature, of feature,offeature,of 11of Am- of ofAm- ofAm- 13Am- Am- exceptexceptexceptexceptanitaanitaexceptanitaexceptanitaexceptanitaanita forvaginata forvaginata vaginatafor forvaginata for forvaginata theforvaginata the thethe the the the difference difference difference,,difference , Amanita difference,Amanitadifference differenceAmanita, ,Amanita Amanita inbisporigera in inbisporigerain bisporigerain color incolorbisporigera incolorcolorbisporigera color colorcolor on on onon on onand theandon theand thetheand theand the theandsu suAmanita susuAmanita Amanitarface surface surfacesuAmanitarface Amanitarfacerfacerface of of of ofphalloides ofphalloidesthe ofthephalloides ofthethe phalloides thephalloides the the fungus fungus fungusfungus fungus fungusfungus are areare are cap.arecap. cap.cap.arevery very very cap. verycap.cap. very Thevery The TheThe similar similar Thesimilar TheThesimilar shapessimilarshapes shapesshapessimilar shapes shapesshapes in inin in of oftheir intheir ofoftheir in their ofAm- ofAm- theirof Am-Am-their Am- Am-ju- Am-ju-ju- ju- ju- ju- anitaanitaanitaanitavenileanitavenileanitavenileanitavenile vaginatavenile vaginata vaginatavaginata vaginata vaginataperiod. vaginataperiod. period. period. 
period.,, , ,Amanita Amanita Amanita,Amanita , , Amanita AmanitaAmanita bisporigera bisporigera bisporigerabisporigera bisporigera bisporigerabisporigera and and andand and andand Amanita Amanita AmanitaAmanita Amanita AmanitaAmanita phalloides phalloides phalloidesphalloides phalloides phalloidesphalloides are are areare are are are very very veryvery very veryvery similar similar similarsimilar similar similarsimilar in in inin in their in their intheirtheir their theirtheir ju- ju- ju-ju- ju- ju-ju- venilevenilevenilevenilevenilevenilevenile period. period. period.period. period. period.period. (2)(2)(2)(2)(2) (2) (2) SomeSome SomeSome SomeSomeSome picturespictures pictures pictures pictures pictures of of of AmanitaAmanita of Amanita ofAmanitaAmanita Amanita Amanita vaginata vaginata vaginatavaginata vaginata vaginata areare are are areare overexposedareoverexposed overexposed overexposed overexposed overexposed andand and and and and thethe the the the the thepicturespictures pictures pictures pictures pictures areare are are are white.are white.white. white. white. white. white. AtAt At At Atthis thisAtthis this At this this (2)(2)(2)(2)(2) (2) (2) Some Some SomeSome SomeSomeSome picturespictures picturespictures pictures pictures pictures ofof ofof of AmanitaAmanitaof AmanitaofAmanita Amanita Amanita Amanita vaginatavaginata vaginatavaginata vaginata vaginata vaginata areare areare are are are overexposedoverexposed overexposedoverexposed overexposed overexposed overexposed andand andand and and and thethe thethe the the the picturespictures picturespictures pictures pictures pictures areare areare are are are white.white. white.white. white. white. white. AtAt AtAt At At thisthisAt thisthis this this this time,time,time,thistime,time,time, time, thethe thethe the the characteristicscharacteristics characteristics thecharacteristics characteristicscharacteristics characteristics areare areare are are areveryvery veryvery very very similarsimilar similarsimilar similar similarsimilar toto toto toAmanita AmanitatoAmanita Amanita AmanitaAmanita bisporigera bisporigerabisporigera bisporigera bisporigerabisporigera,, , so,so, so,so , so partpartso part partpart partpart ofof of ofof ofAmanitaAmanita AmanitaofAmanita AmanitaAmanita time, the characteristics are very similar to Amanita bisporigera, so part of Amanita time,time,time,time,vaginatavaginatavaginatatime,vaginatatime,vaginata vaginatathe thethethe the the characteristics characteristicsis ischaracteristicsischaracteristics isis classified classifiedcharacteristicsisclassifiedcharacteristics classifiedisclassified classified classified as asas asas areAmanita asAmanitaare Amanitaare asareAmanita Amanitaare are Amanitavery veryveryvery veryvery bisporigera bisporigerasimilarbisporigera similarbisporigerasimilarsimilar bisporigera similarbisporigerasimilar to tototo.. . .Amanitato .Amanitato AmanitaAmanita. . 
Amanita Amanita bisporigera bisporigerabisporigerabisporigera bisporigerabisporigera,, , ,so sososo,, sopart sopartpartpart partpart of ofofof Amanitaof AmanitaofAmanitaAmanita AmanitaAmanita (3)(3)(3)(3)(3) (3) vaginata vaginata(3)vaginata vaginata The Thevaginata TheThevaginata Thevaginata The The base basebase basebase is isbase isisbase classified is classifiedof isofclassifiedclassifiedisof of of classified classifiedthis thisofclassified this thisofthis this this category categorycategory categorycategory ascategory as ascategoryas as Amanita as Amanita asAmanitaAmanita Amanita AmanitaAmanita in inin in the thein the in the bisporigera bisporigera thebisporigera bisporigera thetest test bisporigeratest bisporigeratestbisporigera test test dataset datasetdataset dataset dataset dataset...... is. is is is not notisnot isnot not notlarge. large.large. large. large. large. (3)(3)(3)(3)(3) (3) (3) The The The The The The The base base basebase base basebase of of ofof ofthis ofthis ofthisthis this thisthis category category categorycategory category categorycategory in in inin inthe inthe inthethe the the the test test testtest test testtest dataset dataset datasetdataset dataset datasetdataset is is isis not is not isnotnotis not not not large. large. large.large. large. large.large. InInInIn Inaddition,addition, addition, Inaddition, addition, addition, thethe thethe the the accuracyaccuracy accuracy accuracy accuracy accuracy andand andand and and recallrecall recallrecall recall recall raterate rate rate rate rate ofof of of Amanitaof Amanita ofAmanita Amanita Amanita muscaria muscariamuscaria muscaria muscaria reachedreachedreached reached reached reached 1.01.01.0 1.0 1.0 and1.0andand and and and 0.959,0.959, 0.959, 0.959, 0.959, 0.959, indicatingindicatingindicatingindicatingindicatingInInindicatingInIn In addition,In addition, Inaddition,addition, addition, addition,addition, that that thatthat that that that this thisthe thisthisthe thethisthe thethis the this the accuracymodel modelaccuracy modelmodelaccuracy accuracymodel accuracy modelaccuracy accuracymodel is is isis is most mostand mostismostand andismostand andmost andmostand recall recall suitable suitablerecall suitablerecallsuitable suitable recall recallsuitablerecall suitable rate rate raterate rate forrate for forratefor forof of offor of identifying foridentifying identifyingof identifying Amanitaof Amanita identifyingofAmanitaAmanita identifying Amanitaidentifying AmanitaAmanita muscaria muscaria muscariamuscaria this this thismuscariathis muscariamuscariathis this this type type type type typereached reached typereachedreached of ofreached ofreached reachedof Amanita ofAmanita Amanita ofAmanita Amanita Amanita 1.0 1.0 1.01.0 1.0 1.0 1.0and and ..and and. . and and.and . 
In order to test the robustness of the classifier, we used other types of mushroom pictures and non-mushroom pictures for classification. Since no unknown class was added to the dataset at the beginning of training, when the classifier is given an unknown category, the input is forced into one of the classes in the dataset. Therefore, the robustness of the model can be judged from the probability it assigns to such unknown inputs.

The test results for non-mushrooms are shown in Table 5. Combining the classification results with the dataset pictures, it can be found that when the shape differs from that of Amanita, the image is classified according to the color of the object. For example, when a white cat image is used for the classification test, the result is Amanita bisporigera, because the color of Amanita bisporigera is white; however, the predicted value is only 53%. Across the seven kinds of non-mushroom pictures, the predicted values are relatively low (<55%).

Common edible fungi were also used for classification prediction, and the test results are shown in Table 6. When other mushroom images are used for classification, their predicted values are generally higher than those in Table 5, because these mushrooms have a higher degree of similarity to Amanita in appearance. In general, however, when pictures of the Amanita species in the dataset are used for classification prediction, the prediction probability is usually more than 97%. Therefore, when the prediction probability is less than a certain value, we can treat the prediction result as an unknown class.
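This rejection rule is easy to add on top of any softmax classifier. The snippet below is a minimal sketch under assumptions, not the authors' implementation: `model` is assumed to be a trained seven-class Amanita classifier with a Keras-style `predict`, `class_names` lists its species, and the 0.55 cut-off is merely one plausible value sitting between the non-mushroom scores in Table 5 (<55%) and the typical in-distribution scores (>97%).

```python
import numpy as np

THRESHOLD = 0.55  # assumed cut-off between out-of-set (<0.55) and in-set (>0.97) scores

def classify_with_rejection(model, image, class_names, threshold=THRESHOLD):
    """Return (label, confidence); low-confidence predictions become 'unknown'."""
    probs = model.predict(image[np.newaxis, ...])[0]  # softmax over the seven classes
    top = int(np.argmax(probs))
    if probs[top] < threshold:
        return "unknown", float(probs[top])           # forced guess -> reject it
    return class_names[top], float(probs[top])
```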

Table 5. Non-mushroom classification.

Result            Probability
A. bisporigera    0.53
A. caesarea       0.513
A. vaginata       0.466
A. muscaria       0.42
A. echinocephala  0.469
A. bisporigera    0.457
A. muscaria       0.395

(The Varieties row of the original table shows the seven non-mushroom test images.)

Table 6. Classification of other mushrooms.

Result            Probability
A. pantherina     0.826
A. vaginata       0.812
A. vaginata       0.786
A. bisporigera    0.702089
A. bisporigera    0.417
A. phalloides     0.663
A. caesarea       0.857

(The Varieties row of the original table shows the seven edible-mushroom test images.)

4. Conclusions

In this paper, eight different convolutional neural networks were used to classify seven different Amanita species. In order to select a model suitable for classifying Amanita growing in the wild environment, the speed and accuracy of the eight classification models were compared. The results show that classifiers based on deep learning are well suited to Amanita classification.

We first used simple models (VGG, ResNet, EfficientNet) for classification and found that their accuracy was not particularly good. Therefore, the Bilinear Networks model was introduced. After building the B-CNN, we found that, although the accuracy of the model improved, the model became larger and the training time longer. We therefore added an attention mechanism to the model to improve the speed and accuracy of training. After a comprehensive comparison, the best model was found to be B-CNN (EfficientNet-B4, EfficientNet-B4) with CBAM added. After training, the accuracy on the training set is 99.3% and the accuracy on the test set is 95.2%, which solves, to a certain extent, the problem of classifying Amanita images taken in complex wild environments. The model can provide a basis for the future classification and identification of mushrooms with high similarity; its size is 130 MB and it processes a picture in 4.56 s, which facilitates its application on mobile devices.
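Although the full architecture is large, the bilinear step itself is compact. The following is a minimal TensorFlow sketch of the outer-product pooling described by Lin et al. [35], including the signed square-root and l2 normalisation used in that paper; the function name and the assumption that the two inputs come from the two EfficientNet-B4 streams after their CBAM blocks are ours, not the authors' released code.

```python
import tensorflow as tf

def bilinear_pool(fa: tf.Tensor, fb: tf.Tensor) -> tf.Tensor:
    """Outer-product pooling of two backbone feature maps (Lin et al. [35]).

    fa, fb: (batch, H, W, C) feature maps, assumed here to come from the
    two EfficientNet-B4 streams. Returns an l2-normalised bilinear vector
    of length C*C per image.
    """
    # Sum the outer product of the two C-dim descriptors over all H*W locations.
    phi = tf.einsum('bhwc,bhwd->bcd', fa, fb)
    phi = tf.reshape(phi, [tf.shape(phi)[0], -1])
    # Signed square-root followed by l2 normalisation, as in the original B-CNN.
    phi = tf.sign(phi) * tf.sqrt(tf.abs(phi) + 1e-12)
    return tf.math.l2_normalize(phi, axis=-1)
```

A dense softmax layer over the seven Amanita classes would then sit on top of this pooled vector.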

Author Contributions: Conceptualization, P.W. and J.L.; methodology, Z.K.; software, P.W.; validation, P.W.; formal analysis, P.H.; investigation, P.W.; resources, L.X.; data curation, L.X.; writing—original draft preparation, P.W. and X.L.; writing—review and editing, P.W. and Z.K.; visualization, Y.H.; supervision, Z.K.; project administration, P.H.; funding acquisition, Z.K. All authors have read and agreed to the published version of the manuscript.

Funding: This work was supported by the Subject Double Support Program of Sichuan Agricultural University (Grant No. 035-1921993093).

Institutional Review Board Statement: Not applicable.

Informed Consent Statement: Not applicable.

Data Availability Statement: Publicly available datasets were analyzed in this study. The data can be found here: https://www.kaggle.com/maysee/mushrooms-classification-common-genuss-images (accessed on 20 September 2020); http://www.mushroom.world (accessed on 24 September 2020).

Conflicts of Interest: The authors declare no conflict of interest.

References
1. Deng, W.; Xiao, Z.; Li, P.; Li, T. The in vitro anti-tumor effect of lethal Amanita toxins. J. Edible Fungi 2012, 19, 71–78.
2. Bas, C. Morphology and subdivision of Amanita and a monograph of its section Lepidella. Pers. Mol. Phylogeny Evol. Fungi 1969, 5, 285–573.
3. Michelot, D.; Melendez-Howell, L.M. Amanita muscaria: Chemistry, biology, toxicology, and ethnomycology. Mycol. Res. 2003, 107, 131–146. [CrossRef]
4. Dong, G.R.; Zhang, Z.G.; Chen, Z.H. Amanita toxic and its theory. J. Biol. 2000, 17, 1–3.
5. Chilton, W.S.; Ott, J. Toxic metabolites of Amanita pantherina, A. cothurnata, A. muscaria and other Amanita species. Lloydia 1976, 39, 150–157. [PubMed]
6. Drewnowska, M.; Falandysz, J.; Chudzińska, M.; Hańc, A.; Saba, M.; Barałkiewicz, D. Leaching of arsenic and sixteen metallic elements from mushrooms after food processing. LWT 2017, 84, 861–866. [CrossRef]
7. Wu, F.; Zhou, L.-W.; Yang, Z.-L.; Bau, T.; Li, T.-H.; Dai, Y.-C. Resource diversity of Chinese macrofungi: Edible, medicinal and poisonous species. Fungal Divers. 2019, 98, 1–76. [CrossRef]
8. Wang, Y.; Bao, H.; Xu, L.; Bau, T. Determination of main peptide toxins from Amanita pallidorosea with HPLC and their anti-fungal action on Blastomyces albicans. Acta Microbiol. Sin. 2011, 51, 1205–1211.
9. Klein, A.S.; Hart, J.; Brems, J.J.; Goldstein, L.; Lewin, K.; Busuttil, R.W. Amanita poisoning: Treatment and the role of transplantation. Am. J. Med. 1989, 86, 187–193. [CrossRef]
10. Faulstich, H. New aspects of Amanita poisoning. J. Mol. Med. 1979, 57, 1143–1152. [CrossRef]
11. Wieland, T. Peptides of Poisonous Amanita Mushrooms; Springer: Berlin/Heidelberg, Germany, 2012.
12. Garcia, J.C.; Costa, V.M.; Carvalho, A.; Baptista, P.; De Pinho, P.G.; Bastos, M.D.L.; Carvalho, F. Amanita phalloides poisoning: Mechanisms of toxicity and treatment. Food Chem. Toxicol. 2015, 86, 41–55. [CrossRef]
13. Aji, D.Y.; Çalışkan, S.; Nayir, A.; Mat, A.; Can, B.; Yaşar, Z.; Özşahin, H.; Cullu, F.; Sever, L. Haemoperfusion in Amanita phalloides poisoning. J. Trop. Pediatr. 1995, 41, 371–374. [CrossRef]
14. Wang, Y. The of Amanita from Jilin and Shandong Provinces and Detection of Peptide Toxins. Jilin Agricultural University, 2011. Available online: https://kns.cnki.net/kcms/detail/detail.aspx?FileName=1011150549.nh&DbName=CMFD2011 (accessed on 24 September 2020).
15. Wu, F. Research on the identification and prevention of poisonous mushroom poisoning. Sci. Technol. Innov. 2018, 107, 61–62.
16. Ismail, S.; Zainal, A.R.; Mustapha, A. Behavioural features for mushroom classification. In Proceedings of the 2018 IEEE Symposium on Computer Applications & Industrial Electronics (ISCAIE), Penang, Malaysia, 28–29 April 2018; pp. 412–415.

17. Maurya, P.; Singh, N.P. Mushroom Classification Using Feature-Based Machine Learning Approach. In Proceedings of the 3rd International Conference on Computer Vision and Image Processing, Jaipur, India, 27–29 September 2019; pp. 197–206.
18. Xiao, J.; Zhao, C.; Li, X. Research on mushroom image classification based on deep learning. Softw. Eng. 2020, 23, 21–26.
19. Chen, Q. Design of Mushroom Recognition APP Based on Deep Learning under Android Platform; South-Central University for Nationalities: Wuhan, China, 2019. Available online: https://kns.cnki.net/kcms/detail/detail.aspx?FileName=1019857927.nh&DbName=CMFD2020 (accessed on 24 September 2020).
20. Preechasuk, J.; Chaowalit, O.; Pensiri, F.; Visutsak, P. Image Analysis of Mushroom Types Classification by Convolutional Neural Networks. In Proceedings of the 2019 2nd Artificial Intelligence and Cloud Computing Conference, New York, NY, USA, 21–23 December 2019; pp. 82–88.
21. Dong, J.; Zheng, L. Quality classification of Enoki mushroom caps based on CNN. In Proceedings of the 2019 IEEE 4th International Conference on Image, Vision and Computing (ICIVC), Xiamen, China, 5–7 July 2019; pp. 450–454.
22. Chikkerur, S.; Serre, T.; Tan, C.; Poggio, T. What and where: A Bayesian inference theory of attention. Vis. Res. 2010, 50, 2233–2247. [CrossRef] [PubMed]
23. Xu, H.; Saenko, K. Ask, Attend and Answer: Exploring Question-Guided Spatial Attention for Visual Question Answering. In Proceedings of the European Conference on Computer Vision, Amsterdam, The Netherlands, 8–16 October 2016.
24. Yang, X. An Overview of the Attention Mechanisms in Computer Vision. J. Phys. Conf. Ser. 2020, 1693, 012173. [CrossRef]
25. Jaderberg, M.; Simonyan, K.; Zisserman, A.; Kavukcuoglu, K. Spatial transformer networks. arXiv 2015, arXiv:1506.02025.
26. Sønderby, S.K.; Sønderby, C.K.; Maaløe, L.; Winther, O. Recurrent spatial transformer networks. arXiv 2015, arXiv:1509.05329.
27. Humphreys, G.W.; Sui, J. Attentional control and the self: The Self-Attention Network (SAN). Cogn. Neurosci. 2016, 7, 5–17. [CrossRef]
28. Shen, T.; Zhou, T.; Long, G.; Jiang, J.; Wang, S.; Zhang, C. Reinforced Self-Attention Network: A Hybrid of Hard and Soft Attention for Sequence Modeling. In Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence, Stockholm, Sweden, 13–19 July 2018; pp. 4345–4352.
29. Yang, Z.L. Flora Fungorum Sinicorum—Amanitaceae; Science Press: Beijing, China, 2005; Volume 27, pp. 1–257. (In Chinese)
30. Chollet, F. Building Powerful Image Classification Models Using Very Little Data. Keras Blog. 2016. Available online: https://blog.keras.io/building-powerful-image-classification-models-using-very-little-data.html (accessed on 24 September 2020).
31. Kawakura, S.; Shibasaki, R. Distinction of Edible and Inedible Harvests Using a Fine-Tuning-Based Deep Learning System. J. Adv. Agric. Technol. 2019, 6, 236–240. [CrossRef]
32. Tan, M.; Le, Q.V. EfficientNet: Rethinking model scaling for convolutional neural networks. arXiv 2019, arXiv:1905.11946.
33. Duong, L.T.; Nguyen, P.; Di Sipio, C.; Di Ruscio, D. Automated fruit recognition using EfficientNet and MixNet. Comput. Electron. Agric. 2020, 171, 105326. [CrossRef]
34. Zhang, P.; Yang, L.; Li, D. EfficientNet-B4-Ranger: A novel method for greenhouse cucumber disease recognition under natural complex environment. Comput. Electron. Agric. 2020, 176, 105652. [CrossRef]
35. Lin, T.-Y.; RoyChowdhury, A.; Maji, S. Bilinear CNN models for fine-grained visual recognition. In Proceedings of the IEEE International Conference on Computer Vision, Santiago, Chile, 7–13 December 2015; pp. 1449–1457.
36. Chowdhury, A.R.; Lin, T.Y.; Maji, S.; Learned-Miller, E. One-to-many face recognition with bilinear CNNs. In Proceedings of the 2016 IEEE Winter Conference on Applications of Computer Vision (WACV), Lake Placid, NY, USA, 7–10 March 2016; pp. 1–9.
37. Zhu, Y.; Sun, W.; Cao, X.; Wang, C.; Wu, D.; Yang, Y.; Ye, N. TA-CNN: Two-way attention models in deep convolutional neural network for plant recognition. Neurocomputing 2019, 365, 191–200. [CrossRef]
38. Woo, S.; Park, J.; Lee, J.-Y.; Kweon, I.S. CBAM: Convolutional Block Attention Module. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018; pp. 3–19.
39. Lee, H.; Park, J.; Hwang, J.Y. Channel Attention Module with Multi-scale Grid Average Pooling for Breast Cancer Segmentation in an Ultrasound Image. IEEE Trans. Ultrason. Ferroelectr. Freq. Control 2020, 67, 1344–1353. [CrossRef] [PubMed]
40. Fu, X.; Bi, L.; Kumar, A.; Fulham, M.; Kim, J. Multimodal Spatial Attention Module for Targeting Multimodal PET-CT Lung Tumor Segmentation. IEEE J. Biomed. Health Inform. 2021. [CrossRef]
41. Zhang, J.; Karimireddy, S.P.; Veit, A.; Kim, S.; Reddi, S.J.; Kumar, S.; Sra, S. Why ADAM beats SGD for attention models. arXiv 2019, arXiv:1912.03194.
42. Kingma, D.P.; Ba, J. Adam: A method for stochastic optimization. In Proceedings of the International Conference on Learning Representations (ICLR), San Diego, CA, USA, 5–8 May 2015.
43. Sun, J.; He, X.; Ge, X.; Wu, X.; Shen, J.; Song, Y. Detection of Key Organs in Tomato Based on Deep Migration Learning in a Complex Background. Agriculture 2018, 8, 196. [CrossRef]
44. Hong, S.J.; Kim, S.Y.; Kim, E.; Lee, C.-H.; Lee, J.-S.; Lee, D.-S.; Bang, J.; Kim, G. Moth detection from pheromone trap images using deep learning object detectors. Agriculture 2020, 10, 170. [CrossRef]
45. Pan, S.J.; Yang, Q. A Survey on Transfer Learning. IEEE Trans. Knowl. Data Eng. 2009, 22, 1345–1359. [CrossRef]
46. Wang, L.; Wang, P.; Wu, L.; Xu, L.; Huang, P.; Kang, Z. Computer Vision Based Automatic Recognition of Pointer Instruments: Data Set Optimization and Reading. Entropy 2021, 23, 272. [CrossRef] [PubMed]
47. He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778.