CN111652285A - Tea cake category identification method, equipment and medium - Google Patents

Tea cake category identification method, equipment and medium

Info

Publication number
CN111652285A
CN111652285A (application CN202010387385.7A)
Authority
CN
China
Prior art keywords
tea cake
category identification
picture
sample set
tea
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010387385.7A
Other languages
Chinese (zh)
Inventor
冯落落
李锐
金长新
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jinan Inspur Hi Tech Investment and Development Co Ltd
Original Assignee
Jinan Inspur Hi Tech Investment and Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jinan Inspur Hi Tech Investment and Development Co Ltd filed Critical Jinan Inspur Hi Tech Investment and Development Co Ltd
Priority to CN202010387385.7A priority Critical patent/CN111652285A/en
Publication of CN111652285A publication Critical patent/CN111652285A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F 16/51 Indexing; Data structures therefor; Storage structures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses a tea cake category identification method, which comprises the following steps: inputting a tea cake picture into a pre-trained tea cake category identification model, wherein the tea cake category identification model is trained through an optimization algorithm; determining a feature vector corresponding to the tea cake picture according to the tea cake category identification model; and screening out, from pre-stored feature vectors, a feature vector meeting the requirements according to the determined feature vector, and identifying the tea cake category corresponding to the tea cake picture according to the feature vector meeting the requirements. By inputting the tea cake picture into the pre-trained tea cake category identification model, the tea cake category corresponding to the tea cake picture can be determined more accurately.

Description

Tea cake category identification method, equipment and medium
Technical Field
The application relates to the technical field of computers, in particular to a tea cake category identification method, equipment and medium.
Background
Chinese tea is classified essentially according to its degree of fermentation. The six major categories are green tea, white tea, yellow tea, oolong tea, black tea and dark tea, each corresponding to a different degree of fermentation; the different manufacturing processes and fermentation degrees also give the leaves different colors. In addition, the shape of each kind of tea leaf differs, which is another important factor in identification. A tea cake is tea pressed into the shape of a cake, which makes it convenient to carry and store.
In the prior art, the category of each tea cake is identified manually from experience, but novices with insufficient experience are prone to identification errors.
Disclosure of Invention
In view of this, embodiments of the present application provide a tea cake category identification method, device and medium, which are used to solve the problem in the prior art that errors easily occur when tea cake categories are identified manually.
The embodiment of the application adopts the following technical scheme:
the embodiment of the application provides a tea cake category identification method, which comprises the following steps:
inputting a tea cake picture into a pre-trained tea cake category identification model, wherein the tea cake category identification model is trained through an optimization algorithm;
determining a feature vector corresponding to the tea cake picture according to the tea cake category identification model;
and screening out, from pre-stored feature vectors, a feature vector meeting the requirements according to the determined feature vector, and identifying the tea cake category corresponding to the tea cake picture according to the feature vector meeting the requirements.
Further, before the tea cake picture is input into the pre-trained tea cake category identification model, the method further comprises:
constructing an initial tea cake category identification model;
constructing a data set, wherein the data set comprises a plurality of classes of tea cake pictures, and each class of tea cake picture comprises a plurality of tea cake pictures;
and training the initial tea cake category identification model according to the data set and the optimization algorithm to obtain a tea cake category identification model meeting the requirements.
Further, the data set comprises a plurality of sample sets, wherein each sample set comprises a preset number of tea cake pictures, and each sample set comprises two tea cake pictures in the same category.
Further, training the initial tea cake category identification model according to the data set and the optimization algorithm to obtain a tea cake category identification model meeting the requirements specifically includes:
determining a feature vector corresponding to each tea cake picture according to the tea cake pictures of each sample set;
determining a preset number of distances between the tea cake pictures in each sample set according to the feature vector corresponding to each tea cake picture;
determining the loss value of each sample set according to a preset formula and the preset number of distances between the tea cake pictures in each sample set;
and training the initial tea cake category identification model according to the loss value of each sample set and the optimization algorithm to obtain a tea cake category identification model meeting the requirements.
Further, each sample set comprises three tea cake pictures;
the determining a preset number of distances between the tea cake pictures in each sample set according to the feature vector corresponding to each tea cake picture, and the determining the loss value of each sample set according to a preset formula and the preset number of distances between the tea cake pictures in each sample set, specifically include:
determining two preset distances between the three tea cake pictures in each sample set according to the feature vector corresponding to each tea cake picture;
and determining the loss value of each sample set according to a preset formula and the two preset distances between the tea cake pictures in each sample set.
Further, the determining two preset distances between the three tea cake pictures in each sample set according to the feature vector corresponding to each tea cake picture, and the determining the loss value of each sample set according to a preset formula and the two preset distances between the tea cake pictures in each sample set, specifically include:
recording the feature vectors corresponding to the three tea cake pictures in each sample set as f1, f2 and f3, and determining the distance d1 between the vector f1 and the vector f2 and the distance d2 between the vector f1 and the vector f3, wherein the tea cake picture corresponding to the vector f1 and the tea cake picture corresponding to the vector f2 belong to the same category, and the tea cake picture corresponding to the vector f1 and the tea cake picture corresponding to the vector f3 do not belong to the same category;
determining the loss value of each sample set according to a preset formula Loss = Relu(d1 - d2 + const) and the distances d1 and d2 of each sample set, wherein Loss is the loss value of each sample set, Relu is an activation function, and const is a hyper-parameter used for keeping (d1 - d2 + const) positive.
Further, the tea cake category identification model builds its network architecture on Vgg16, with the last three layers of Vgg16 deleted and replaced by a 256-dimensional fully connected layer and a 128-dimensional fully connected layer.
Further, the optimization algorithm is a stochastic gradient descent algorithm.
The embodiment of the present application further provides a tea cake category identification device, the device includes:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to:
inputting a tea cake picture into a pre-trained tea cake category identification model, wherein the tea cake category identification model is trained through an optimization algorithm;
determining a feature vector corresponding to the tea cake picture according to the tea cake category identification model;
and screening out, from pre-stored feature vectors, a feature vector meeting the requirements according to the determined feature vector, and identifying the tea cake category corresponding to the tea cake picture according to the feature vector meeting the requirements.
The embodiment of the application also provides a tea cake category identification medium storing computer-executable instructions, wherein the computer-executable instructions are configured to:
inputting a tea cake picture into a pre-trained tea cake category identification model, wherein the tea cake category identification model is trained through an optimization algorithm;
determining a feature vector corresponding to the tea cake picture according to the tea cake category identification model;
and screening out, from pre-stored feature vectors, a feature vector meeting the requirements according to the determined feature vector, and identifying the tea cake category corresponding to the tea cake picture according to the feature vector meeting the requirements.
The embodiment of the application adopts at least one technical scheme that can achieve the following beneficial effect: by inputting the tea cake picture into the pre-trained tea cake category identification model, the tea cake category corresponding to the tea cake picture can be determined more accurately.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic flow chart of a tea cake category identification method provided in the first embodiment of the present specification;
fig. 2 is a schematic flow chart of a tea cake category identification method provided in the second embodiment of the present specification;
fig. 3 is a schematic diagram of the Vgg16 network architecture provided in the second embodiment of the present specification;
fig. 4 is a graph of the activation function provided in the second embodiment of the present specification.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic flow chart of a tea cake category identification method provided in an embodiment of the present specification, where the embodiment of the present specification may be implemented by a tea cake category identification system, and specifically includes:
step S101, the tea cake type recognition system inputs a tea cake picture into a pre-trained tea cake type recognition model, wherein the tea cake type recognition model carries out model training through an optimization algorithm.
And S102, determining a feature vector corresponding to the tea cake picture by the tea cake category identification system according to the tea cake category identification model.
And S103, the tea cake category identification system screens out, from pre-stored feature vectors, a feature vector meeting the requirements according to the determined feature vector, and identifies the tea cake category corresponding to the tea cake picture according to the feature vector meeting the requirements.
By inputting the tea cake picture into the pre-trained tea cake category identification model, the tea cake category corresponding to the tea cake picture can be determined more accurately.
Corresponding to the first embodiment, fig. 2 is a schematic flow chart of a tea cake category identification method provided in the second embodiment of the present specification; this embodiment may also be implemented by a tea cake category identification system, and specifically includes:
in step S201, the tea cake category identification system constructs an initial tea cake category identification model.
In step S201 of the embodiment of the present specification, the tea cake category identification model may build its network architecture on Vgg16, deleting the last three layers of Vgg16 and replacing them with a 256-dimensional fully connected layer and a 128-dimensional fully connected layer. Referring to the schematic diagram of the Vgg16 network architecture in fig. 3, the embodiment of the present specification may delete the last three fc fully connected layers and replace them with one 256-dimensional fully connected layer and one 128-dimensional fully connected layer.
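As an illustration only, the following is a minimal sketch of one way such a backbone could be assembled. It assumes PyTorch and torchvision, which the embodiment does not specify; the ReLU placed between the two new layers is likewise an assumption, while the 256- and 128-dimensional layer sizes follow the description above.

```python
import torch
import torch.nn as nn
from torchvision import models

def build_tea_cake_encoder():
    # Start from a standard Vgg16 (use pretrained weights if desired;
    # older torchvision versions take pretrained=False instead of weights=None).
    vgg = models.vgg16(weights=None)
    # Vgg16's classifier holds the last three fully connected layers
    # (4096 -> 4096 -> 1000); drop them as described in the embodiment.
    in_dim = vgg.classifier[0].in_features  # 25088 after flattening
    # Replace them with a 256-dimensional and a 128-dimensional fully
    # connected layer, so the model outputs a 128-d feature vector.
    vgg.classifier = nn.Sequential(
        nn.Linear(in_dim, 256),
        nn.ReLU(inplace=True),   # assumed activation between the new layers
        nn.Linear(256, 128),
    )
    return vgg

encoder = build_tea_cake_encoder()
x = torch.randn(1, 3, 224, 224)   # one tea cake picture, resized to 224x224
feature = encoder(x)              # shape: (1, 128)
```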
In step S202, the tea cake category identification system constructs a data set, wherein the data set includes a plurality of categories of tea cake pictures, and each category of tea cake picture includes a plurality of tea cake pictures.
And S203, the tea cake category identification system trains the initial tea cake category identification model according to the data set and the optimization algorithm to obtain a tea cake category identification model meeting the requirements.
In step S203 of this specification, the data set may include a plurality of sample sets, where each sample set includes a preset number of tea cake pictures, and each sample set includes two tea cake pictures of the same category.
It should be noted that the training strategy of the embodiment of the present specification may use a TripletNet, with TripletLoss used to construct the loss function of the embodiment.
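The following is a minimal sketch of how sample sets (triplets) could be drawn for such a training strategy: two pictures from the same category (anchor and positive) and one from a different category (negative). The dictionary layout mapping category names to picture paths, and the category and file names, are hypothetical.

```python
import random

def sample_triplet(pictures_by_category):
    """Build one sample set of three tea cake pictures: two from the same
    category (anchor, positive) and one from a different category (negative)."""
    categories = list(pictures_by_category.keys())
    anchor_cat = random.choice(categories)
    negative_cat = random.choice([c for c in categories if c != anchor_cat])
    anchor, positive = random.sample(pictures_by_category[anchor_cat], 2)
    negative = random.choice(pictures_by_category[negative_cat])
    return anchor, positive, negative

# Hypothetical layout: category name -> list of tea cake picture paths.
dataset = {
    "pu_erh_raw": ["raw_001.jpg", "raw_002.jpg", "raw_003.jpg"],
    "pu_erh_ripe": ["ripe_001.jpg", "ripe_002.jpg"],
    "white_tea_cake": ["white_001.jpg", "white_002.jpg"],
}
print(sample_triplet(dataset))
```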
In step S203 in the embodiment of this specification, this step may specifically include:
determining a feature vector corresponding to each tea cake picture according to the tea cake pictures of each sample set, wherein the feature vector corresponding to each tea cake picture can be extracted through a vector extractor;
determining a preset number of distances between the tea cake pictures in each sample set according to the feature vector corresponding to each tea cake picture;
determining the loss value of each sample set according to a preset formula and the preset number of distances between the tea cake pictures in each sample set;
and training the initial tea cake category identification model according to the loss value of each sample set and the optimization algorithm to obtain a tea cake category identification model meeting the requirements.
Further, each sample set may include three tea cake pictures;
the determining a preset number of distances between the tea cake pictures in each sample set according to the feature vector corresponding to each tea cake picture, and the determining the loss value of each sample set according to a preset formula and the preset number of distances between the tea cake pictures in each sample set, may specifically include:
determining two preset distances between the three tea cake pictures in each sample set according to the feature vector corresponding to each tea cake picture, wherein each sample set comprises two tea cake pictures of the same category;
and determining the loss value of each sample set according to a preset formula and the two preset distances between the tea cake pictures in each sample set.
The determining two preset distances between the three tea cake pictures in each sample set according to the feature vector corresponding to each tea cake picture, and the determining the loss value of each sample set according to a preset formula and the two preset distances between the tea cake pictures in each sample set, may specifically include:
recording the feature vectors corresponding to the three tea cake pictures in each sample set as f1, f2 and f3, and determining the distance d1 between the vector f1 and the vector f2 and the distance d2 between the vector f1 and the vector f3, wherein the tea cake picture corresponding to the vector f1 and the tea cake picture corresponding to the vector f2 belong to the same category, and the tea cake picture corresponding to the vector f1 and the tea cake picture corresponding to the vector f3 do not belong to the same category;
determining the loss value of each sample set according to a preset formula Loss = Relu(d1 - d2 + const) and the distances d1 and d2 of each sample set, wherein Loss is the loss value of each sample set, Relu is an activation function, and const is a hyper-parameter used for keeping (d1 - d2 + const) positive.
Referring to fig. 4, which shows the graph of the activation function: when d1 - d2 + const is less than 0, the function value is 0, which is why const is used to keep (d1 - d2 + const) positive. When the Loss is not zero, the optimization algorithm in the optimizer minimizes the Loss so that d1 becomes smaller, i.e., the distance between samples of the same category becomes small. Conversely, when d2 is large enough, no loss is produced; as can be seen from the formula, d2 must exceed d1 by at least const, so const is also called the margin. const is a hyper-parameter that can be set to 5, for example, indicating that the Euclidean distance between two pictures of different categories should be at least 5.
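A minimal sketch of this loss, assuming PyTorch and Euclidean distances between the 128-dimensional feature vectors (the embodiment does not name a framework), might look as follows.

```python
import torch
import torch.nn.functional as F

def triplet_loss(f1, f2, f3, const=5.0):
    """Loss = Relu(d1 - d2 + const), where d1 is the distance between the
    anchor (f1) and the same-category picture (f2), and d2 is the distance
    between the anchor and the different-category picture (f3)."""
    d1 = F.pairwise_distance(f1, f2)   # Euclidean distance, same category
    d2 = F.pairwise_distance(f1, f3)   # Euclidean distance, different category
    return F.relu(d1 - d2 + const).mean()
```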
And S204, the tea cake category identification system inputs the tea cake picture into the pre-trained tea cake category identification model, wherein the tea cake category identification model is trained through the optimization algorithm.
In step S204 of the embodiment of the present specification, the optimization algorithm may be a stochastic gradient descent (SGD) algorithm.
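Tying together the encoder, the triplet loss above and the SGD optimizer mentioned here, one training step on a single sample set might look like the sketch below. It reuses the build_tea_cake_encoder and triplet_loss helpers sketched earlier, and the learning rate and momentum values are illustrative assumptions, not values given in the embodiment.

```python
import torch

encoder = build_tea_cake_encoder()   # sketched earlier
optimizer = torch.optim.SGD(encoder.parameters(), lr=0.01, momentum=0.9)

def training_step(anchor_img, positive_img, negative_img):
    """One SGD step on a single sample set (triplet).
    Each *_img is a (1, 3, 224, 224) tensor of a tea cake picture."""
    optimizer.zero_grad()
    f1 = encoder(anchor_img)     # same category as f2
    f2 = encoder(positive_img)
    f3 = encoder(negative_img)   # different category
    loss = triplet_loss(f1, f2, f3, const=5.0)
    loss.backward()
    optimizer.step()
    return loss.item()
```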
And S205, determining a feature vector corresponding to the tea cake picture by the tea cake category identification system according to the tea cake category identification model.
And S206, the tea cake category identification system screens out, from the pre-stored feature vectors, a feature vector meeting the requirements according to the determined feature vector, and identifies the tea cake category corresponding to the tea cake picture according to the feature vector meeting the requirements.
It should be noted that, when testing the tea cake category identification model, tea cake pictures of all categories can be fed into the trained tea cake category identification model to obtain a 128-dimensional feature vector for each tea cake, and these feature vectors are then stored in the database.
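A minimal sketch of this registration-and-lookup step, with an in-memory dictionary standing in for the database and the encoder sketched earlier, might look as follows; it illustrates the nearest-neighbour matching described above, not the embodiment's exact implementation.

```python
import torch

feature_db = {}   # category label -> stored 128-d feature vector

@torch.no_grad()
def register(label, img):
    """Store the 128-d feature vector of a known tea cake picture."""
    feature_db[label] = encoder(img).squeeze(0)

@torch.no_grad()
def identify(img):
    """Return the stored category whose feature vector is closest
    (smallest Euclidean distance) to the query tea cake picture."""
    query = encoder(img).squeeze(0)
    return min(feature_db,
               key=lambda label: torch.dist(query, feature_db[label]).item())
```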
By inputting the tea cake picture into the pre-trained tea cake category identification model, the tea cake category corresponding to the tea cake picture can be determined more accurately.
The embodiment of the present application further provides a tea cake category identification device, the device includes:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to:
inputting a tea cake picture into a pre-trained tea cake category identification model, wherein the tea cake category identification model is trained through an optimization algorithm;
determining a feature vector corresponding to the tea cake picture according to the tea cake category identification model;
and screening out, from pre-stored feature vectors, a feature vector meeting the requirements according to the determined feature vector, and identifying the tea cake category corresponding to the tea cake picture according to the feature vector meeting the requirements.
The embodiment of the application also provides a tea cake category identification medium storing computer-executable instructions, wherein the computer-executable instructions are configured to:
inputting a tea cake picture into a pre-trained tea cake category identification model, wherein the tea cake category identification model is trained through an optimization algorithm;
determining a feature vector corresponding to the tea cake picture according to the tea cake category identification model;
and screening out, from pre-stored feature vectors, a feature vector meeting the requirements according to the determined feature vector, and identifying the tea cake category corresponding to the tea cake picture according to the feature vector meeting the requirements.
In the 1990s, improvements in a technology could clearly be distinguished as improvements in hardware (e.g., improvements in circuit structures such as diodes, transistors, switches, etc.) or improvements in software (improvements in method flows). However, as technology advances, many of today's method-flow improvements can be regarded as direct improvements in hardware circuit structures. Designers almost always obtain the corresponding hardware circuit structure by programming an improved method flow into a hardware circuit. Thus, it cannot be said that an improvement in a method flow cannot be realized by hardware entity modules. For example, a Programmable Logic Device (PLD), such as a Field Programmable Gate Array (FPGA), is an integrated circuit whose logic functions are determined by the user's programming of the device. A designer "integrates" a digital system onto a single PLD by programming, without asking a chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, nowadays such programming is mostly implemented with "logic compiler" software instead of manually making integrated circuit chips; this software is similar to the compiler used in program development, and the source code to be compiled must be written in a specific programming language called a Hardware Description Language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM and RHDL (Ruby Hardware Description Language), among which VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used. It will also be apparent to those skilled in the art that a hardware circuit implementing a logical method flow can easily be obtained by merely programming the method flow slightly in one of the above hardware description languages into an integrated circuit.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an Application Specific Integrated Circuit (ASIC), a programmable logic controller, or an embedded microcontroller; examples of such controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20 and Silicon Labs C8051F320. A memory controller may also be implemented as part of the control logic of the memory. Those skilled in the art will also appreciate that, in addition to implementing the controller purely as computer-readable program code, the method steps can be logically programmed so that the controller achieves the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers and the like. Such a controller may therefore be regarded as a hardware component, and the means included therein for implementing various functions may also be regarded as structures within the hardware component. Or even the means for implementing various functions may be regarded both as software modules for implementing the method and as structures within the hardware component.
The systems, devices, modules or units illustrated in the above embodiments may be implemented by a computer chip or an entity, or by a product with certain functions. One typical implementation device is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smartphone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being divided into various units by function, and are described separately. Of course, the functionality of the units may be implemented in one or more software and/or hardware when implementing the present application.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a(n) ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (10)

1. A tea cake category identification method is characterized by comprising the following steps:
inputting a tea cake picture into a pre-trained tea cake category identification model, wherein the tea cake category identification model is trained through an optimization algorithm;
determining a feature vector corresponding to the tea cake picture according to the tea cake category identification model;
and screening out, from pre-stored feature vectors, a feature vector meeting the requirements according to the determined feature vector, and identifying the tea cake category corresponding to the tea cake picture according to the feature vector meeting the requirements.
2. The tea cake category identification method according to claim 1, wherein before inputting the tea cake picture to the pre-trained tea cake category identification model, the method further comprises:
constructing an initial tea cake category identification model;
constructing a data set, wherein the data set comprises a plurality of classes of tea cake pictures, and each class of tea cake picture comprises a plurality of tea cake pictures;
and training the initial tea cake category identification model according to the data set and the optimization algorithm to obtain a tea cake category identification model meeting the requirements.
3. The method for identifying the category of the tea cake according to claim 2, wherein the data set comprises a plurality of sample sets, wherein each sample set comprises a preset number of tea cake pictures, and each sample set comprises two tea cake pictures in the same category.
4. The tea cake category identification method according to claim 3, wherein the training of the initial tea cake category identification model according to the data set and the optimization algorithm to obtain a tea cake category identification model meeting requirements specifically comprises:
determining a feature vector corresponding to each tea cake picture according to the tea cake pictures of each sample set;
determining a preset number of distances between the tea cake pictures in each sample set according to the feature vector corresponding to each tea cake picture;
determining the loss value of each sample set according to a preset formula and the preset number of distances between the tea cake pictures in each sample set;
and training an initial tea cake category identification model according to the loss value of each sample set and the optimization algorithm to obtain a tea cake category identification model meeting the requirements.
5. The tea cake category identification method according to claim 4, wherein each of said sample sets comprises three tea cake pictures;
and wherein the determining a preset number of distances between the tea cake pictures in each sample set according to the feature vector corresponding to each tea cake picture, and the determining the loss value of each sample set according to a preset formula and the preset number of distances between the tea cake pictures in each sample set, specifically comprise:
determining two preset distances between three tea cake pictures in each sample set according to the corresponding feature vector of each tea cake picture;
and determining the loss value of each sample set according to a preset formula and the two preset distances between the tea cake pictures in each sample set.
6. The tea cake category identification method according to claim 5, wherein the determining two preset distances between the three tea cake pictures in each sample set according to the feature vector corresponding to each tea cake picture, and the determining the loss value of each sample set according to a preset formula and the two preset distances between the tea cake pictures in each sample set, specifically comprise:
recording the feature vectors corresponding to the three tea cake pictures in each sample set as f1, f2 and f3, and determining the distance d1 between the vector f1 and the vector f2 and the distance d2 between the vector f1 and the vector f3, wherein the tea cake picture corresponding to the vector f1 and the tea cake picture corresponding to the vector f2 belong to the same category, and the tea cake picture corresponding to the vector f1 and the tea cake picture corresponding to the vector f3 do not belong to the same category;
determining the loss value of each sample set according to a preset formula Loss = Relu(d1 - d2 + const) and the distances d1 and d2 of each sample set, wherein Loss is the loss value of each sample set, Relu is an activation function, and const is a hyper-parameter used for keeping (d1 - d2 + const) positive.
7. The tea cake category identification method according to claim 1, wherein the tea cake category identification model builds its network architecture on Vgg16, with the last three layers of Vgg16 deleted and replaced by a 256-dimensional fully connected layer and a 128-dimensional fully connected layer.
8. The tea cake category identification method according to claim 1, wherein the optimization algorithm is a stochastic gradient descent algorithm.
9. A tea cake category identification device, characterized in that the device comprises:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to:
inputting a tea cake picture into a pre-trained tea cake category identification model, wherein the tea cake category identification model is trained through an optimization algorithm;
determining a feature vector corresponding to the tea cake picture according to the tea cake category identification model;
and screening out, from pre-stored feature vectors, a feature vector meeting the requirements according to the determined feature vector, and identifying the tea cake category corresponding to the tea cake picture according to the feature vector meeting the requirements.
10. A tea cake category identification medium having stored thereon computer-executable instructions configured to:
inputting a tea cake picture into a pre-trained tea cake category identification model, wherein the tea cake category identification model is trained through an optimization algorithm;
determining a feature vector corresponding to the tea cake picture according to the tea cake category identification model;
and screening out, from pre-stored feature vectors, a feature vector meeting the requirements according to the determined feature vector, and identifying the tea cake category corresponding to the tea cake picture according to the feature vector meeting the requirements.
CN202010387385.7A 2020-05-09 2020-05-09 Tea cake category identification method, equipment and medium Pending CN111652285A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010387385.7A CN111652285A (en) 2020-05-09 2020-05-09 Tea cake category identification method, equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010387385.7A CN111652285A (en) 2020-05-09 2020-05-09 Tea cake category identification method, equipment and medium

Publications (1)

Publication Number Publication Date
CN111652285A true CN111652285A (en) 2020-09-11

Family

ID=72346742

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010387385.7A Pending CN111652285A (en) 2020-05-09 2020-05-09 Tea cake category identification method, equipment and medium

Country Status (1)

Country Link
CN (1) CN111652285A (en)

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106250812A (en) * 2016-07-15 2016-12-21 汤平 A kind of model recognizing method based on quick R CNN deep neural network
CN106934346A (en) * 2017-01-24 2017-07-07 北京大学 A kind of method of target detection performance optimization
US20200134375A1 (en) * 2017-08-01 2020-04-30 Beijing Sensetime Technology Development Co., Ltd. Semantic segmentation model training methods and apparatuses, electronic devices, and storage media
CN107679078A (en) * 2017-08-29 2018-02-09 银江股份有限公司 A kind of bayonet socket image vehicle method for quickly retrieving and system based on deep learning
CN107967484A (en) * 2017-11-14 2018-04-27 中国计量大学 A kind of image classification method based on multiresolution
CN108564092A (en) * 2018-04-12 2018-09-21 内蒙古工业大学 Sunflower disease recognition method based on SIFT feature extraction algorithm
CN109635643A (en) * 2018-11-01 2019-04-16 暨南大学 A kind of fast human face recognition based on deep learning
CN109829736A (en) * 2019-02-11 2019-05-31 上海元唯壹网络科技有限责任公司 A kind of application system based on the image recognition of AI in tea cake
CN109993236A (en) * 2019-04-10 2019-07-09 大连民族大学 Few sample language of the Manchus matching process based on one-shot Siamese convolutional neural networks
CN110188641A (en) * 2019-05-20 2019-08-30 北京迈格威科技有限公司 Image recognition and the training method of neural network model, device and system
CN110263659A (en) * 2019-05-27 2019-09-20 南京航空航天大学 A kind of finger vein identification method and system based on triple loss and lightweight network
CN110427832A (en) * 2019-07-09 2019-11-08 华南理工大学 A kind of small data set finger vein identification method neural network based
CN111062338A (en) * 2019-12-19 2020-04-24 厦门商集网络科技有限责任公司 Certificate portrait consistency comparison method and system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Haiyun Guo, Kuan Zhu, Ming Tang, et al., "Two-Level Attention Network With Multi-Grain Ranking Loss for Vehicle Re-Identification", IEEE Transactions on Image Processing *
刘虎, 周野, 袁家斌, "基于多尺度双线性卷积神经网络的多角度下车型精细识别" [Fine-grained vehicle model recognition from multiple views based on multi-scale bilinear convolutional neural networks], 《计算机应用》 (Journal of Computer Applications) *

Similar Documents

Publication Publication Date Title
CN109034183B (en) Target detection method, device and equipment
CN109214193B (en) Data encryption and machine learning model training method and device and electronic equipment
CN112308113A (en) Target identification method, device and medium based on semi-supervision
CN112036236A (en) GhostNet-based detection model training method, device and medium
CN114332873A (en) Training method and device for recognition model
CN115828162B (en) Classification model training method and device, storage medium and electronic equipment
CN114861665B (en) Method and device for training reinforcement learning model and determining data relation
CN108921190A (en) A kind of image classification method, device and electronic equipment
CN111191090B (en) Method, device, equipment and storage medium for determining service data presentation graph type
CN115130621B (en) Model training method and device, storage medium and electronic equipment
CN108170663A (en) Term vector processing method, device and equipment based on cluster
CN111652053A (en) Employee attendance checking method, device and medium
CN111652285A (en) Tea cake category identification method, equipment and medium
CN110502551A (en) Data read-write method, system and infrastructure component
CN115221523A (en) Data processing method, device and equipment
CN109389157B (en) User group identification method and device and object group identification method and device
CN115017905A (en) Model training and information recommendation method and device
CN109325127B (en) Risk identification method and device
CN111898615A (en) Feature extraction method, device, equipment and medium of object detection model
CN114511376A (en) Credit data processing method and device based on multiple models
CN118193797B (en) Method and device for executing service, storage medium and electronic equipment
CN111539962A (en) Target image classification method, device and medium
CN111539520A (en) Method and device for enhancing robustness of deep learning model
CN112115952B (en) Image classification method, device and medium based on full convolution neural network
CN111596946A (en) Recommendation method, device and medium for intelligent contracts of block chains

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200911)