CN110796149A - Food tracing image comparison method and related device - Google Patents

Food tracing image comparison method and related device

Info

Publication number
CN110796149A
Authority
CN
China
Prior art keywords
quality degree
food
quality
target
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910955597.8A
Other languages
Chinese (zh)
Other versions
CN110796149B (en)
Inventor
陈浩能
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN201910955597.8A (granted as CN110796149B)
Publication of CN110796149A
Application granted
Publication of CN110796149B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/56Extraction of image or video features relating to colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/253Fusion techniques of extracted features
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Abstract

An embodiment of the application provides an image comparison method for food tracing and a related device. The method includes: acquiring a reference image, the reference image including an image block of a food container; if the image block of the food container includes a target identifier, extracting a target image block from a target area according to the target identifier, the target image block including a reference image block of the food, and the target area being a preset area in the food container; and determining the quality degree of the food according to the reference image block of the food. The accuracy of food quality degree detection can thereby be improved.

Description

Food tracing image comparison method and related device
Technical Field
The application relates to the technical field of data processing, and in particular to an image comparison method for food tracing and a related device.
Background
With the improvement of people's living standards, attention to the quality of food has also increased. In conventional methods of determining food quality, for example the quality degree (including freshness, degree of rotting, etc.), the quality degree is generally determined manually, or a portion of the food is randomly sampled for quality detection and the quality degree of the food is determined based on the detection result.
Disclosure of Invention
The embodiment of the application provides an image comparison method and a related device for food tracing, which can improve the accuracy of food quality detection.
A first aspect of the embodiments of the present application provides an image comparison method for food tracing, where the method includes:
acquiring a reference image, wherein the reference image comprises image blocks of the food container;
if the image block of the food container comprises a target identifier, extracting a target image block from a target area according to the indication of the target identifier, wherein the target image block comprises a reference image block of the food, and the target area is a preset area in the food container;
and determining the quality degree of the food according to the reference image block of the food.
With reference to the first aspect, in a possible embodiment of the first aspect, the determining the quality degree of the food product according to the reference image block of the food product includes:
acquiring characteristic parameters of a color reference area from the reference image according to the indication of the target identifier;
determining a target image block of the food according to the characteristic parameters and the reference image block of the food;
and determining the quality degree of the food according to the target image block.
With reference to the first aspect, in a possible embodiment of the first aspect, determining a quality degree of the food product according to the target image block includes:
extracting the features of the target image block to obtain feature data;
determining first color data of the food according to the characteristic data;
acquiring second color data of a quality degree identification area of the food from the reference image;
and determining the quality degree of the food according to the first color data and the second color data.
With reference to the first aspect, in a possible embodiment of the first aspect, the determining the quality degree of the food product according to the first color data and the second color data includes:
determining the quality degree of the corresponding sub-region according to each sub-first color data in the N sub-first color data to obtain N reference quality degrees;
classifying the N reference quality degrees to obtain a first quality degree set and a second quality degree set, wherein the reference quality degree in the first quality degree set is smaller than the reference quality degree in the second quality degree set;
if the reference quality degree smaller than a preset quality degree threshold value exists in the first quality degree set, determining that the quality degree of the food is a first preset quality degree;
if the reference quality degrees in the first quality degree set are all larger than the preset quality degree threshold, acquiring a first numerical value of reference quality degrees in the first quality degree set falling within a first quality degree interval, a second numerical value of reference quality degrees in the first quality degree set falling within a second quality degree interval, and a third numerical value of reference quality degrees in the first quality degree set falling within a third quality degree interval, wherein the first quality degree interval, the second quality degree interval and the third quality degree interval do not overlap one another;
determining a first sub-target quality degree according to the first quality degree interval, the second quality degree interval, the third quality degree interval, the first numerical value, the second numerical value and the third numerical value;
determining a second sub-target quality degree according to the second quality degree set;
and determining the quality degree of the food according to the first sub-target quality degree and the second sub-target quality degree.
With reference to the first aspect, in a possible embodiment of the first aspect, the determining a second sub-target quality degree according to the second quality degree set includes:
obtaining the deviation degree between the reference quality degree in the second quality degree set and a target quality degree threshold value;
acquiring the number M of reference quality degrees with positive deviation degrees and the number S of reference quality degrees with negative deviation degrees;
and determining the second sub-target quality degree according to the number M and the number S.
A second aspect of embodiments of the present application provides an image comparison apparatus for food tracing, the apparatus including an obtaining unit, an extracting unit, and a determining unit, wherein,
the acquisition unit is used for acquiring a reference image, and the reference image comprises image blocks of the food container;
the extraction unit is used for extracting a target image block from a target area according to the indication of a target identifier if the image block of the food container comprises the target identifier, wherein the target image block comprises a reference image block of food, and the target area is a preset area in the food container;
the determining unit is used for determining the quality degree of the food according to the reference image block of the food.
With reference to the second aspect, in a possible embodiment of the second aspect, in the determining the quality degree of the food product according to the reference image block of the food product, the determining unit is configured to:
acquiring characteristic parameters of a color reference area from the reference image according to the indication of the target identifier;
determining a target image block of the food according to the characteristic parameters and the reference image block of the food;
and determining the quality degree of the food according to the target image block.
With reference to the second aspect, in a possible embodiment of the second aspect, in determining the quality degree of the food product according to the target image block, the determining unit is configured to:
extracting the features of the target image block to obtain feature data;
determining first color data of the food according to the characteristic data;
acquiring second color data of a quality degree identification area of the food from the reference image;
and determining the quality degree of the food according to the first color data and the second color data.
With reference to the second aspect, in one possible embodiment of the second aspect, the first color data includes N sub-first color data, the N sub-first color data correspond to N sub-regions of the food product, and in terms of determining the quality degree of the food product according to the first color data and the second color data, the determining unit is configured to:
determining the quality degree of the corresponding sub-region according to each sub-first color data in the N sub-first color data to obtain N reference quality degrees;
classifying the N reference quality degrees to obtain a first quality degree set and a second quality degree set, wherein the reference quality degree in the first quality degree set is smaller than the reference quality degree in the second quality degree set;
if the reference quality degree smaller than a preset quality degree threshold value exists in the first quality degree set, determining that the quality degree of the food is a first preset quality degree;
if the reference quality degrees in the first quality degree set are all larger than the preset quality degree threshold, acquiring a first numerical value of reference quality degrees in the first quality degree set falling within a first quality degree interval, a second numerical value of reference quality degrees in the first quality degree set falling within a second quality degree interval, and a third numerical value of reference quality degrees in the first quality degree set falling within a third quality degree interval, wherein the first quality degree interval, the second quality degree interval and the third quality degree interval do not overlap one another;
determining a first sub-target quality degree according to the first quality degree interval, the second quality degree interval, the third quality degree interval, the first numerical value, the second numerical value and the third numerical value;
determining a second sub-target quality degree according to the second quality degree set;
and determining the quality degree of the food according to the first sub-target quality degree and the second sub-target quality degree.
With reference to the second aspect, in a possible embodiment of the second aspect, in the determining a second sub-target quality degree according to the second quality degree set, the determining unit is configured to:
obtaining the deviation degree between the reference quality degree in the second quality degree set and a target quality degree threshold value;
acquiring the number M of reference quality degrees with positive deviation degrees and the number S of reference quality degrees with negative deviation degrees;
and determining the second sub-target quality degree according to the number M and the number S.
A third aspect of the embodiments of the present application provides a terminal, including a processor, an input device, an output device, and a memory, where the processor, the input device, the output device, and the memory are connected to each other, where the memory is used to store a computer program, and the computer program includes program instructions, and the processor is configured to call the program instructions to execute the step instructions in the first aspect of the embodiments of the present application.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program makes a computer perform part or all of the steps as described in the first aspect of embodiments of the present application.
A fifth aspect of embodiments of the present application provides a computer program product, wherein the computer program product comprises a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps as described in the first aspect of embodiments of the present application. The computer program product may be a software installation package.
The embodiment of the application has at least the following beneficial effects:
the method comprises the steps of obtaining a reference image, wherein the reference image comprises an image block of a food container, if the image block of the food container comprises a target identifier, extracting a target image block from a target area according to the indication of the target identifier, wherein the target image block comprises a reference image block of food, the target area is a preset area in the food container, and determining the quality degree of the food according to the reference image block of the food.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic diagram of an image comparison system for food tracing according to an embodiment of the present application;
fig. 2A is a schematic flowchart of an image comparison method for food tracing according to an embodiment of the present application;
FIG. 2B is a schematic view of a food container according to an embodiment of the present disclosure;
fig. 2C is a schematic diagram of a method for fusing reference images according to an embodiment of the present disclosure;
fig. 3 is a schematic flowchart of another image comparison method for food tracing according to an embodiment of the present application;
fig. 4 is a schematic flowchart of another image comparison method for food tracing according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a terminal according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of an image comparison device for food tracing according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is obvious that the described embodiments are only a part of the embodiments of the present application, not all of them. All other embodiments that can be derived by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The electronic device according to the embodiments of the present application may include various handheld devices, vehicle-mounted devices, wearable devices, computing devices or other processing devices connected to a wireless modem, and various forms of User Equipment (UE), Mobile Stations (MS), terminal equipment (terminal), and so on. For convenience of description, the above-mentioned apparatuses are collectively referred to as electronic devices.
To aid understanding of the food tracing image comparison method provided by the embodiments of the present application, the image comparison system that uses the method is first described briefly. Referring to fig. 1, fig. 1 is a schematic diagram of an image comparison system for food tracing according to an embodiment of the present disclosure. As shown in fig. 1, the system includes a food container 101, an electronic device 102, and a server 103. Food is placed in the food container 101. After the food is transported to a destination, the electronic device 102 acquires a reference image that includes an image block of the food container, i.e., an image block of an area containing the food container 101, and sends the reference image to the server 103. If the server 103 determines that the image block of the food container includes a target identifier, it extracts a target image block from a target area according to the target identifier; the target image block includes a reference image block of the food, and the target area is a preset area in the food container, which may be an area for storing the food. The server 103 then determines the quality degree of the food according to the reference image block of the food; the quality degree is a parameter indicating the quality of the food, for example the degree of rotting. Compared with existing manual or random-sampling schemes, this approach determines the quality degree from the reference image block of the food in the acquired reference image, which can improve the accuracy of obtaining the quality degree of the food to a certain extent.
Referring to fig. 2A, fig. 2A is a schematic flowchart of an image comparison method for food tracing according to an embodiment of the present application. As shown in fig. 2A, the image comparison method for food tracing includes steps 201 to 203, as follows:
201. A reference image is acquired, the reference image including an image block of the food container.
The electronic device can acquire the reference image through its camera; when acquiring the reference image, parameters such as the angle and the exposure can be adjusted automatically.
202. If the image block of the food container comprises the target identification, extracting the target image block from the target area according to the indication of the target identification, wherein the target image block comprises a reference image block of the food, and the target area is a preset area in the food container.
The target identifier may be a specific pattern or code that can be recognized by computer vision techniques and is used for locating regions within the reference image; the corresponding recognition program is preset with recognition logic for the target identifier. After the target identifier is recognized, the target area can be determined directly from it. The target area may be the storage area of the food, so the image does not need to be analyzed again: the target image block can be obtained quickly from the target identifier alone, which improves the efficiency of obtaining the target image block.
Extracting the target image block from the target area according to the indication of the target identifier may specifically be: extracting the image directly from the target area indicated by the target identifier to obtain the target image block.
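This direct extraction can be sketched as follows, assuming (hypothetically) that the recognized target identifier has already been mapped to the pixel bounds of the preset storage area; the function name and coordinate convention are illustrative, not from the patent:

```python
import numpy as np

# Hypothetical sketch: assume the recognized target identifier maps to the
# pixel bounds (top, bottom, left, right) of the preset food-storage area.
def extract_target_block(reference_image: np.ndarray,
                         target_area: tuple) -> np.ndarray:
    """Crop the target image block directly from the indicated target area."""
    top, bottom, left, right = target_area
    return reference_image[top:bottom, left:right]

reference_image = np.zeros((100, 100, 3), dtype=np.uint8)  # placeholder image
block = extract_target_block(reference_image, (10, 50, 20, 80))
```

Because the crop is a plain array slice, no further recognition pass over the image is needed once the identifier has been located.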
203. The quality degree of the food is determined according to the reference image block of the food.
The reference image can be processed according to the color reference area in the reference image block, and the quality degree of the food can then be determined from the processed image. The color reference region includes a plurality of colors, for example the three primary colors (red, green, and blue), and may include other colors; this is merely an example and is not limiting.
In the embodiment of the application, a reference image is obtained that includes an image block of the food container; if the image block of the food container includes a target identifier, a target image block is extracted from the target area according to the target identifier, the target image block including a reference image block of the food and the target area being a preset area in the food container; and the quality degree of the food is determined according to the reference image block of the food. The quality degree is thus determined from the image of the food itself, which can improve the accuracy of food quality degree detection.
In one possible embodiment, the present application provides a schematic diagram of a food container. As shown in fig. 2B, the food container includes a target identifier, a reference image block, a color reference area, and the like; the reference image block includes an image of the food, and the color reference area is used for performing image restoration processing and the like on the reference image block.
Optionally, the food container may further include a two-dimensional code identifying the food. During transportation, the two-dimensional code can be used for identity recognition or identity confirmation of the transported food; for example, scanning the two-dimensional code can reveal whether the food in the food container is the food sent from the starting location.
In a possible embodiment, the exposure parameters of the camera may also be adjusted so that images under different exposure parameters are obtained, and these images are then fused to obtain the reference image. Specifically, referring to the fusion step shown in fig. 2C, the images corresponding to exposure parameter 1, exposure parameter 2 and exposure parameter 3 are fused to obtain the reference image. One image fusion method is to extract the RGB values of the images under the different exposure parameters (each RGB channel value lies between 0 and 255), weight the RGB values according to weights determined by the exposure coefficients to obtain weighted RGB values, and finally determine the reference image from these weighted RGB values.
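The weighted RGB fusion described above can be sketched as follows; the text says the weights are derived from the exposure coefficients, so the weight values passed in here are illustrative assumptions:

```python
import numpy as np

# Hypothetical sketch of the weighted fusion: RGB values of images captured
# under different exposure parameters are combined with weights assumed to
# come from the exposure coefficients; the result is the reference image.
def fuse_exposures(images, weights):
    """Weighted average of same-sized RGB images, clipped to the 0-255 range."""
    weights = np.asarray(weights, dtype=np.float64)
    weights = weights / weights.sum()                 # normalize the weights
    stack = np.stack([img.astype(np.float64) for img in images])
    fused = np.tensordot(weights, stack, axes=1)      # weighted sum over images
    return np.clip(fused, 0, 255).astype(np.uint8)

a = np.full((2, 2, 3), 60, dtype=np.uint8)   # under-exposed frame
b = np.full((2, 2, 3), 120, dtype=np.uint8)  # mid-exposure frame
c = np.full((2, 2, 3), 180, dtype=np.uint8)  # over-exposed frame
ref = fuse_exposures([a, b, c], [1.0, 1.0, 1.0])  # equal weights: plain mean
```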
In one possible embodiment, a possible method for determining the quality degree of the food according to the reference image block of the food includes steps A1-A3, as follows:
A1, acquiring characteristic parameters of the color reference area from the reference image according to the indication of the target identifier;
A2, determining a target image block of the food according to the characteristic parameters and the reference image block of the food;
and A3, determining the quality degree of the food according to the target image block.
The target identifier may indicate the position of the color reference region, so that the characteristic parameters of the color reference region can be obtained according to the position indicated by the target identifier; the characteristic parameters include color difference, brightness, resolution, the RGB values of each pixel, and the like. The color of the color reference area in the food container is stored in advance, and the image of the color reference area is used for restoring the image: the color of the color reference area in the image collected when the food leaves the departure place is the preset characteristic parameter referred to below.
Optionally, the offset of the characteristic parameters of the color reference region is determined by comparing the characteristic parameters of the color reference region with the preset characteristic parameters; specifically, the difference between a characteristic parameter and the corresponding preset characteristic parameter may be taken as the characteristic information offset. It can be understood that when the offset is positive, the value of the characteristic parameter is higher than that of the preset characteristic parameter, and when the offset is negative, it is lower.
Optionally, the characteristic information offset is added to the characteristic parameters of the reference image block of the food to obtain target characteristic information, and the characteristic information of the reference image block of the food is adjusted to the target characteristic information to obtain the target image block of the food.
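A minimal sketch of this offset computation and adjustment, following the text literally (offset = observed characteristic parameter minus preset parameter, then added to the block's values); all concrete values are illustrative:

```python
import numpy as np

# Hypothetical sketch: the characteristic information offset is the difference
# between the observed color-reference features and the preset (departure-time)
# features, and it is added to the food's reference image block, as described.
def characteristic_offset(observed_ref, preset_ref):
    """Offset = characteristic parameter - preset characteristic parameter."""
    return observed_ref.astype(np.int16) - preset_ref.astype(np.int16)

def adjust_block(reference_block, offset):
    """Add the characteristic information offset to the block's channel
    values, clipping the result to the valid 0-255 range."""
    shifted = reference_block.astype(np.int16) + offset
    return np.clip(shifted, 0, 255).astype(np.uint8)

observed = np.array([110, 100, 90], dtype=np.uint8)   # measured reference colors
preset = np.array([100, 100, 100], dtype=np.uint8)    # stored departure colors
off = characteristic_offset(observed, preset)
target = adjust_block(np.full((2, 2, 3), 50, dtype=np.uint8), off)
```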
The color data of the food can be determined from the target image block, and the quality degree of the food can then be determined together with the quality degree identification area of the food. The quality degree identification area is an identification area in the food container in which a quality degree identification device for the food is arranged. The quality degree identification device may specifically be color-changing test paper, which shows different colors when the food is at different quality degrees. Of course, other quality degree identification devices are also possible; this is merely illustrative and not limiting.
In this example, the reference image block is processed by means of the color reference area to obtain the target image block, so the color of the image can be restored, the influence of the shooting environment on the color of the food can be reduced, and the accuracy of obtaining the quality degree of the food is improved.
In one possible embodiment, a possible method for determining the quality degree of the food according to the target image block includes steps B1-B4, as follows:
B1, performing feature extraction on the target image block to obtain feature data;
B2, determining first color data of the food according to the feature data;
B3, acquiring second color data of the quality degree identification area of the food from the reference image;
and B4, determining the quality degree of the food according to the first color data and the second color data.
The feature data include RGB values. The first color data of the food can be determined according to the RGB values; the first color data may be a specific color, namely the skin (surface) color of the food.
Optionally, feature extraction may be performed directly on the quality degree identification area, and the second color data obtained by the same method as used for the first color data.
The quality degree of the food may be determined according to N sub-first color data of the first color data and the second color data. The N sub-first color data are the color data corresponding to N sub-regions of the food, the N sub-regions being obtained by dividing the food region, for example into equal areas.
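The division into N sub-regions and the per-region color data can be sketched as follows, assuming (hypothetically) equal-area horizontal strips and the mean RGB of each strip as its sub-first color data; the patent does not fix either choice:

```python
import numpy as np

# Hypothetical sketch: split the food's image block into N equal-area
# horizontal strips and take each strip's mean RGB as its sub-first color data.
def sub_first_color_data(food_block: np.ndarray, n: int):
    strips = np.array_split(food_block, n, axis=0)          # N sub-regions
    return [strip.reshape(-1, 3).mean(axis=0) for strip in strips]

food_block = np.zeros((8, 4, 3), dtype=np.uint8)
food_block[:4] = 200            # top half of the food appears bright
data = sub_first_color_data(food_block, 2)
```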
In this example, the quality degree of the food is determined by the first color data of the food and the second color data of the quality degree identification area, and the accuracy of image matching for food tracing can be improved.
In one possible embodiment, the first color data includes N sub-first color data corresponding to N sub-regions of the food, and a possible method of determining the quality degree of the food according to the first color data and the second color data includes steps C1-C7, as follows:
c1, determining the quality degree of the corresponding sub-region according to each sub-first color data in the N sub-first color data to obtain N reference quality degrees;
c2, classifying the N reference quality degrees to obtain a first quality degree set and a second quality degree set, wherein the reference quality degree in the first quality degree set is smaller than the reference quality degree in the second quality degree set;
c3, if the reference quality degree smaller than the preset quality degree threshold value exists in the first quality degree set, determining the quality degree of the food to be a first preset quality degree;
c4, if the reference quality degrees in the first quality degree set are all larger than the preset quality degree threshold, acquiring a first numerical value, which is the number of reference quality degrees in the first quality degree set falling in a first quality degree interval, a second numerical value, which is the number falling in a second quality degree interval, and a third numerical value, which is the number falling in a third quality degree interval, wherein the first quality degree interval, the second quality degree interval, and the third quality degree interval do not overlap;
c5, determining a first sub-target quality degree according to the first quality degree interval, the second quality degree interval, the third quality degree interval, the first numerical value, the second numerical value and the third numerical value;
c6, determining a second sub-target quality degree according to the second quality degree set;
and C7, determining the quality degree of the food according to the first sub-target quality degree and the second sub-target quality degree.
The reference quality degree corresponding to each sub-first color data can be determined according to a mapping relation between color data and quality degree. For the same category of food, different colors correspond to different quality degrees; for example, for a strawberry, the brighter the red, the fresher the fruit, and the darker the red, the less fresh it is.
The preset quality degree threshold is set through empirical values or historical data. When a quality degree is lower than the preset quality degree threshold, the food is determined to be stale, which may include being rotten or about to rot.
The first quality degree interval, the second quality degree interval and the third quality degree interval are not overlapped, and the first quality degree interval, the second quality degree interval and the third quality degree interval are sequentially connected, namely, the right endpoint of the first quality degree interval is the left endpoint of the second quality degree interval, and the right endpoint of the second quality degree interval is the left endpoint of the third quality degree interval.
A first weight corresponding to the first quality degree interval is determined according to the first numerical value, a second weight corresponding to the second quality degree interval is determined according to the second numerical value, and a third weight corresponding to the third quality degree interval is determined according to the first weight and the second weight. The method for determining the first weight according to the first numerical value may be: dividing the first numerical value by a numerical value A to obtain a numerical value B, and taking the positive square root of the numerical value B as the first weight, where the numerical value A is set through empirical values or historical data; the second weight is determined in the same way and is not repeated here. Since the sum of the first weight, the second weight, and the third weight is 1, the third weight can be determined from the first weight and the second weight.
The mean value of the reference quality degrees in the first quality degree interval is multiplied by the first weight, the mean value of the reference quality degrees in the second quality degree interval is multiplied by the second weight, the mean value of the reference quality degrees in the third quality degree interval is multiplied by the third weight, and the three products are added to obtain the first sub-target quality degree.
Optionally, according to the second quality degree set, determining the second sub-target quality degree may be: and determining a second sub-target quality degree according to the deviation degree between the reference quality degree in the second quality degree set and the target quality degree threshold.
Optionally, the average of the first sub-target quality degree and the second sub-target quality degree is used as the quality degree of the food.
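The aggregation in steps C1-C7 can be sketched as follows (a sketch under stated assumptions: the value A used for the weights, the interval bounds, the threshold, and the externally supplied second sub-target quality degree are illustrative placeholders, not values fixed by the patent):

```python
# Sketch of steps C3-C7: threshold check, interval counting, sqrt-based
# weights, weighted means, and averaging of the two sub-target qualities.
import math

def first_sub_target(refs, intervals, a=16.0):
    """refs: reference quality degrees of the lower (first) set.
    intervals: three non-overlapping, adjacent (lo, hi) quality intervals."""
    means, weights = [], []
    for lo, hi in intervals:
        in_iv = [r for r in refs if lo <= r < hi]
        means.append(sum(in_iv) / len(in_iv) if in_iv else 0.0)
        weights.append(math.sqrt(len(in_iv) / a))   # weight = sqrt(count / A)
    weights[2] = 1.0 - weights[0] - weights[1]       # the three weights sum to 1
    return sum(w * m for w, m in zip(weights, means))

def food_quality(first_set, second_sub_target, intervals, threshold=0.2):
    # C3: any reference below the preset threshold -> first preset quality (0)
    if any(r < threshold for r in first_set):
        return 0.0
    f1 = first_sub_target(first_set, intervals)
    return (f1 + second_sub_target) / 2              # C7: combine sub-targets

ivs = [(0.2, 0.4), (0.4, 0.6), (0.6, 0.8)]
print(food_quality([0.3, 0.45, 0.7, 0.75], 0.9, ivs))   # -> 0.725
```

The sqrt weighting gives intervals with more samples a larger, but sub-linear, influence on the first sub-target quality degree.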
In this example, the quality degree of the food is determined according to the reference quality degrees of different sub-regions of the food by processing and analyzing the N sub-first color data and the second color data, so that the accuracy of image comparison for food tracing can be improved to a certain extent.
In one possible embodiment, a possible method for determining a second sub-target quality level according to the second quality level set includes steps D1-D3, as follows:
d1, acquiring the deviation degree between the reference quality degree in the second quality degree set and the target quality degree threshold value;
d2, acquiring the number M of reference quality degrees with positive deviation degrees and the number S of reference quality degrees with negative deviation degrees;
d3, determining the second sub-target quality degree according to the number M and the number S.
The deviation degree may be the difference between the reference quality degree and the target quality degree threshold. The deviation degree can be positive, zero, or negative: when it is positive, the reference quality degree is greater than the target quality degree threshold; when it is negative, the reference quality degree is less than the target quality degree threshold.
Optionally, if M is greater than S, determining to use a preset first quality degree as a quality degree corresponding to a reference quality degree whose deviation is a positive number, and to use a preset second quality degree as a quality degree corresponding to a reference quality degree whose deviation is a negative number, where the preset first quality degree is smaller than the preset second quality degree; if M is less than S, determining a preset second quality degree as a quality degree corresponding to a reference quality degree with positive deviation degree, and determining a preset first quality degree as a quality degree corresponding to a reference quality degree with negative deviation degree; and performing weight calculation on the preset first quality degree and the preset second quality degree to obtain a second sub-target quality degree, wherein the weight during weight calculation is related to M and S.
Performing weight calculation on a preset first quality degree and a preset second quality degree to obtain a second sub-target quality degree, wherein when M is greater than S, the weight corresponding to the preset first quality degree is S/(M + S), and the weight corresponding to the preset second quality degree is M/(M + S); when M is smaller than S, the weights are opposite.
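Steps D1-D3 with the M/S weighting above can be sketched as follows (the preset first and second quality degrees `q1 < q2` and the equal-count fallback are illustrative assumptions, not values given by the patent):

```python
# Sketch of steps D1-D3: count positive/negative deviations from the target
# quality degree threshold and weight the two preset quality degrees by
# S/(M+S) and M/(M+S) as described above.
def second_sub_target(refs, target_threshold, q1=0.3, q2=0.8):
    devs = [r - target_threshold for r in refs]       # D1: deviation degrees
    m = sum(1 for d in devs if d > 0)                 # D2: positive deviations
    s = sum(1 for d in devs if d < 0)                 # D2: negative deviations
    if m == s or m + s == 0:
        return (q1 + q2) / 2                          # assumed tie-break
    if m > s:   # weight q1 by S/(M+S) and q2 by M/(M+S)
        return q1 * s / (m + s) + q2 * m / (m + s)
    else:       # M < S: the weights are swapped
        return q2 * s / (m + s) + q1 * m / (m + s)

print(second_sub_target([0.9, 0.85, 0.6], 0.7))   # M=2, S=1
```

With a majority of references above the threshold, the larger preset quality degree dominates the weighted result.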
In this example, the second sub-target quality degree is determined by the positive-negative relationship of the deviation between the reference quality degree in the second quality degree set and the target quality degree threshold, so that the accuracy in acquiring the second sub-target quality degree can be improved.
In one possible embodiment, the present application provides another method for determining the quality level of a food product, which is as follows:
e1, acquiring a first image of the food by the electronic device when the food is transported from a place of departure;
e2, the electronic device sends the first image to a server, and the server stores the image;
e3, the electronic device acquires a second image of the food after the food reaches the destination, and sends the second image to the server;
e4, the server determines the quality degree of the food product according to the first image and the second image.
The server stores a preset color scale and a preset texture scale: the color scale can be understood as the colors of the food at different quality degrees, and the texture scale as the textures of the food at different quality degrees. The server can determine the quality degree of the food according to the color scale and the texture scale, specifically: determining the color and the texture of the given food according to the first image and the second image, comparing them with the color scale and the texture scale to obtain the corresponding quality degrees, and taking the mean value of those quality degrees as the quality degree of the food.
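The scale lookup can be sketched as follows (a sketch; representing the color scale as a list of (RGB, quality) pairs, the nearest-entry matching, and the example scale values are all assumptions, since the patent does not fix the comparison model):

```python
# Sketch: match an observed food color to the closest color-scale entry,
# then average the color-based and texture-based quality degrees.
COLOR_SCALE = [((220, 30, 30), 1.0),   # bright red -> fresh
               ((150, 40, 40), 0.6),
               ((90, 50, 40), 0.2)]    # dark red -> near spoilage

def nearest_quality(rgb, scale):
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(scale, key=lambda e: dist2(rgb, e[0]))[1]

def quality_from_scales(color_rgb, texture_quality):
    cq = nearest_quality(color_rgb, COLOR_SCALE)
    return (cq + texture_quality) / 2     # mean of the two scale readings

print(quality_from_scales((210, 35, 35), 0.8))   # -> 0.9
```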
The place of departure can be understood as the starting point of the food transportation and the destination as its end point; in particular, they can be understood as the start and end points of a single transport node.
In a possible embodiment, when the reference image is acquired, the quality of the acquired image may be poor due to environmental factors. A possible method for enhancing the reference image is as follows, where the region to be processed is the ultrasonic image in the image to be processed, and the reference image is the image to be processed:
A gray image of the ultrasonic image in the region to be processed is acquired, and high-frequency direction decomposition is performed on each pixel point in the gray image to obtain, for each pixel point, a first component in the horizontal direction, a second component in the vertical direction, and a third component in the diagonal direction. The first components of all pixel points form a first component image, the second components form a second component image, and the third components form a third component image. The first component image, the second component image, and the third component image are sharpened to obtain sharpened first, second, and third component images. Finally, the sharpened first, second, and third component images are superposed pixel by pixel onto the ultrasonic image in the region to be processed to obtain an enhanced image.
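A toy sketch of this enhancement on a gray image held as a 2-D list (assumptions: simple forward differences stand in for the horizontal/vertical/diagonal high-frequency decomposition, and a single gain factor stands in for sharpening; a real implementation would use a wavelet decomposition):

```python
# Sketch: per-pixel high-frequency components in three directions, amplified
# by a gain and superposed back onto the original pixel. The last row and
# column are left unchanged (no forward neighbour).
def enhance(gray, gain=0.5):
    h, w = len(gray), len(gray[0])
    out = [row[:] for row in gray]
    for y in range(h - 1):
        for x in range(w - 1):
            hor = gray[y][x + 1] - gray[y][x]          # horizontal component
            ver = gray[y + 1][x] - gray[y][x]          # vertical component
            dia = gray[y + 1][x + 1] - gray[y][x]      # diagonal component
            # "sharpen" each component by a gain, then superpose on the pixel
            out[y][x] = gray[y][x] + gain * (hor + ver + dia)
    return out

flat = [[10, 10], [10, 10]]
print(enhance(flat))    # no high-frequency detail, so values are unchanged
```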
In a possible example, the ultrasonic image in the region to be processed may be enhanced by constructing a Hessian matrix, specifically as follows. The Hessian matrix corresponding to each pixel point in the ultrasonic image in the region to be processed is calculated, and the position of each pixel point is recorded. According to these positions, the horizontal component, the vertical component, and the diagonal component in the Hessian matrix of each pixel point are assembled into a horizontal component image, a vertical component image, and a diagonal component image of the ultrasonic image. The horizontal, vertical, and diagonal component images are then sharpened, and finally the sharpened component images are superposed, by gray value, onto the gray image of the ultrasonic image in the region to be processed to obtain an enhanced ultrasonic image.
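The Hessian variant can be sketched in the same toy style (assumptions: discrete second differences stand in for the Hessian entries, boundary pixels are left untouched, and the gain is an illustrative sharpening stand-in):

```python
# Sketch: per-pixel Hessian components via second differences (horizontal
# Ixx, vertical Iyy, mixed/diagonal Ixy), superposed back with a gain.
def hessian_enhance(gray, gain=0.25):
    h, w = len(gray), len(gray[0])
    out = [row[:] for row in gray]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            ixx = gray[y][x - 1] - 2 * gray[y][x] + gray[y][x + 1]   # horizontal
            iyy = gray[y - 1][x] - 2 * gray[y][x] + gray[y + 1][x]   # vertical
            ixy = (gray[y + 1][x + 1] - gray[y + 1][x - 1]
                   - gray[y - 1][x + 1] + gray[y - 1][x - 1]) / 4.0  # diagonal
            out[y][x] = gray[y][x] + gain * (ixx + iyy + ixy)  # superpose
    return out

ramp = [[x for x in range(3)] for _ in range(3)]   # linear ramp: no curvature
print(hessian_enhance(ramp))    # second differences vanish, image unchanged
```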
By the method, the quality of the reference image can be improved by enhancing the reference image, so that the accuracy of processing the reference image is improved.
Optionally, in a possible embodiment, before the server communicates with the client, the client may be understood as an electronic device, and a secure communication channel may also be established, specifically: a possible method for establishing a secure communication channel relates to a server, a client and a proxy device, wherein the proxy device is a trusted third-party device, and specifically comprises the following steps:
s1, initialization: the initialization stage mainly completes the registration of the server and the client on the proxy equipment, the subscription of the theme and the generation of system parameters. The server and the client register to the agent device, and can participate in the publishing and subscribing of the theme only through the registered server and the registered client, and the client subscribes the related theme to the agent device. The proxy device generates a system public Parameter (PK) and a master key (MSK), and transmits the PK to the registered server and the client.
S2, encryption and publication: in this stage the server encrypts the payload corresponding to the topic to be published and sends it to the proxy device. The server first encrypts the payload with a symmetric encryption algorithm to generate a ciphertext (CT), then formulates an access structure A, encrypts the symmetric key according to the PK and the access structure A, and finally sends the encrypted key and the encrypted payload to the proxy device. After receiving the encrypted key and the encrypted CT from the server, the proxy device filters them and forwards them to the client.
Optionally, the access structure A is an access tree structure. Each non-leaf node of the access tree is a threshold, denoted K_x, with 0 < K_x <= num(x), where num(x) is the number of child nodes of node x. When K_x = num(x), the non-leaf node represents an AND gate; when K_x = 1, it represents an OR gate. Each leaf node of the access tree represents an attribute. Whether an attribute set satisfies an access tree structure can be defined as follows: let T be an access tree with root node r, and let T_x be the subtree of T rooted at node x. T_x(S) = 1 indicates that the attribute set S satisfies the access structure T_x. If node x is a leaf node, then T_x(S) = 1 if and only if the attribute att(x) associated with leaf node x is an element of the attribute set S. If node x is a non-leaf node, then T_x(S) = 1 when at least K_x child nodes z satisfy T_z(S) = 1.
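The recursive check T_x(S) can be sketched as follows (the dict-based node representation and field names are illustrative assumptions, not from the patent):

```python
# Sketch of the access-tree satisfaction check: leaves test attribute
# membership, internal nodes are threshold gates with parameter k_x.
def satisfies(node, attrs):
    if 'attr' in node:                       # leaf: attribute membership
        return node['attr'] in attrs
    ok = sum(1 for child in node['children'] if satisfies(child, attrs))
    return ok >= node['k']                   # threshold gate K_x

# (role=admin AND dept=food) OR cert=auditor: a 1-of-2 root over a 2-of-2 gate
tree = {'k': 1, 'children': [
    {'k': 2, 'children': [{'attr': 'role=admin'}, {'attr': 'dept=food'}]},
    {'attr': 'cert=auditor'}]}
print(satisfies(tree, {'role=admin', 'dept=food'}))   # True
print(satisfies(tree, {'role=admin'}))                # False
```

K_x = num(x) makes a node an AND gate and K_x = 1 an OR gate, exactly as in the definition above.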
S3, private key generation: in this stage the proxy device generates a corresponding key for the client to decrypt the CT it later receives. The client provides its attribute set A_i to the proxy device (the attributes can be information such as the features and roles of the subscriber); the proxy device generates a private key SK according to the PK, the attribute set A_i, and the master key MSK, and then sends the generated private key to the client.
Optionally, the attribute set A_i is a subset of a global set U = {A_1, A_2, ..., A_n}. The attribute set A_i indicates the attribute information of client i (the i-th client), which can be the client's features, roles, and the like, and the global set U represents the set of all client attribute information, serving as the default attributes of the clients.
S4, decryption: the decryption stage is mainly a process of decrypting the encrypted load by the client to extract the civilization. After receiving the encrypted key and the CT sent by the proxy equipment, the client decrypts the encrypted key according to the PK and the SK to obtain a symmetric key. If its attribute set AiAccess structure satisfying ciphertext
Figure BDA0002227172940000144
The ciphertext can be successfully decrypted, so that the safety of the communication process is guaranteed.
By constructing the secure communication channel, the security of communication between the electronic device and the server can be improved to a certain extent, the possibility that an illegal user steals data transmitted between a legitimate electronic device and the server is reduced, and the risk that the electronic device is controlled by an illegal user who intrudes into and tampers with the system is also reduced.
Referring to fig. 3, fig. 3 is a schematic flowchart of another method for comparing images for food tracing according to an embodiment of the present application. As shown in fig. 3, the image comparison method for food tracing comprises steps 301-305, which are as follows:
301. acquiring a reference image, wherein the reference image comprises image blocks of the food container;
302. if the image block of the food container comprises the target identification, extracting the target image block from the target area according to the indication of the target identification, wherein the target image block comprises a reference image block of the food, and the target area is a preset area in the food container;
303. acquiring characteristic parameters of a color reference area from a reference image according to the indication of the target identifier;
304. determining a target image block of the food according to the characteristic parameters and the reference image block of the food;
305. and determining the quality degree of the food according to the target image block.
In the example, the reference image block is processed through the color reference area to obtain the target image block, so that the color of the image can be restored, the influence of the shooting environment on the color of the food can be reduced, and the accuracy of obtaining the quality degree of the food is improved.
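The color restoration of step 304 can be sketched as follows (a sketch under stated assumptions: the color reference area is modelled as a patch of known true color and a simple per-channel scaling is used; the patent does not fix the correction model, and all names and values here are illustrative):

```python
# Sketch of step 304: use the color reference area to undo the color cast
# of the shooting environment on the food's reference image block.
def correct_color(block, observed_ref, true_ref):
    """block: list of (R, G, B) pixels. Scale each channel so the reference
    patch as observed in the image maps back to its known true color."""
    scale = [t / o for t, o in zip(true_ref, observed_ref)]
    return [tuple(min(255.0, c * s) for c, s in zip(px, scale)) for px in block]

# A warm cast has halved the blue channel; the white reference recovers it.
observed_white = (250, 250, 125)           # reference patch seen in the image
corrected = correct_color([(200, 60, 30)], observed_white, (250, 250, 250))
print(corrected)    # [(200.0, 60.0, 60.0)]
```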
Referring to fig. 4, fig. 4 is a schematic flow chart of another food tracing image comparison method according to an embodiment of the present application. As shown in fig. 4, the image comparison method for food tracing includes steps 401-408, which are as follows:
401. acquiring a reference image, wherein the reference image comprises image blocks of the food container;
402. if the image block of the food container comprises the target identification, extracting the target image block from the target area according to the indication of the target identification, wherein the target image block comprises a reference image block of the food, and the target area is a preset area in the food container;
403. acquiring characteristic parameters of a color reference area from a reference image according to the indication of the target identifier;
404. determining a target image block of the food according to the characteristic parameters and the reference image block of the food;
405. carrying out feature extraction on the target image block to obtain feature data;
406. determining first color data of the food according to the characteristic data;
407. acquiring second color data of the quality degree identification area of the food from the reference image;
408. and determining the quality degree of the food according to the first color data and the second color data.
In this example, the quality degree of the food is determined by the first color data of the food and the second color data of the quality degree identification area, and the accuracy of image matching for food tracing can be improved.
In accordance with the foregoing embodiments, please refer to fig. 5, fig. 5 is a schematic structural diagram of a terminal according to an embodiment of the present application, and as shown in the drawing, the terminal includes a processor, an input device, an output device, and a memory, where the processor, the input device, the output device, and the memory are connected to each other, where the memory is used to store a computer program, the computer program includes program instructions, and the processor is configured to call the program instructions, and the program includes instructions for performing the following steps;
acquiring a reference image, wherein the reference image comprises image blocks of the food container;
if the image block of the food container comprises the target identification, extracting the target image block from the target area according to the indication of the target identification, wherein the target image block comprises a reference image block of the food, and the target area is a preset area in the food container;
and determining the quality degree of the food according to the reference image block of the food.
The above description has introduced the solution of the embodiments of the present application mainly from the perspective of the method-side implementation process. It is understood that, in order to implement the above functions, the terminal includes corresponding hardware structures and/or software modules for performing the respective functions. Those skilled in the art will readily appreciate that the units and algorithm steps of the examples described in connection with the embodiments provided herein can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the terminal may be divided into the functional units according to the above method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
In accordance with the above, please refer to fig. 6, fig. 6 is a schematic structural diagram of an image comparison apparatus for food tracing according to an embodiment of the present application. The apparatus comprises an acquisition unit 601, an extraction unit 602, and a determination unit 603, wherein,
an acquiring unit 601 for acquiring a reference image, the reference image comprising image blocks of the food container;
an extracting unit 602, configured to extract a target image block from a target area according to an indication of a target identifier if the image block of the food container includes the target identifier, where the target image block includes a reference image block of the food, and the target area is a preset area in the food container;
the determining unit 603 is configured to determine a quality level of the food product based on the reference image block of the food product.
Optionally, in terms of determining the quality degree of the food product according to the reference image block of the food product, the determining unit 603 is configured to:
acquiring characteristic parameters of a color reference area from a reference image according to the indication of the target identifier;
determining a target image block of the food according to the characteristic parameters and the reference image block of the food;
and determining the quality degree of the food according to the target image block.
Optionally, in terms of determining the quality degree of the food product according to the target image block, the determining unit 603 is configured to:
carrying out feature extraction on the target image block to obtain feature data;
determining first color data of the food according to the characteristic data;
acquiring second color data of the quality degree identification area of the food from the reference image;
and determining the quality degree of the food according to the first color data and the second color data.
Optionally, the first color data includes N sub-first color data, the N sub-first color data correspond to N sub-regions of the food, and in terms of determining the quality degree of the food according to the first color data and the second color data, the determining unit 603 is configured to:
determining the quality degree of the corresponding sub-area according to each sub-first color data in the N sub-first color data to obtain N reference quality degrees;
classifying the N reference quality degrees to obtain a first quality degree set and a second quality degree set, wherein the reference quality degree in the first quality degree set is smaller than the reference quality degree in the second quality degree set;
if the reference quality degree smaller than the preset quality degree threshold value exists in the first quality degree set, determining the quality degree of the food to be a first preset quality degree;
if the reference quality degrees in the first quality degree set are all larger than the preset quality degree threshold, acquiring a first numerical value, which is the number of reference quality degrees in the first quality degree set falling in a first quality degree interval, a second numerical value, which is the number falling in a second quality degree interval, and a third numerical value, which is the number falling in a third quality degree interval, wherein the first quality degree interval, the second quality degree interval, and the third quality degree interval do not overlap;
determining a first sub-target quality degree according to the first quality degree interval, the second quality degree interval, the third quality degree interval, the first numerical value, the second numerical value and the third numerical value;
determining a second sub-target quality degree according to the second quality degree set;
and determining the quality degree of the food according to the first sub-target quality degree and the second sub-target quality degree.
Optionally, in terms of determining a second sub-target quality degree according to the second quality degree set, the determining unit 603 is configured to:
acquiring the deviation degree between the reference quality degree in the second quality degree set and a target quality degree threshold value;
acquiring the number M of reference quality degrees with positive deviation degrees and the number S of reference quality degrees with negative deviation degrees;
and determining the quality degree of the second sub-target according to the number M and the number S.
The present application also provides a computer storage medium, wherein the computer storage medium stores a computer program for electronic data exchange, and the computer program enables a computer to execute part or all of the steps of any one of the food tracing image comparison methods described in the above method embodiments.
Embodiments of the present application further provide a computer program product, where the computer program product includes a non-transitory computer-readable storage medium storing a computer program, and the computer program causes a computer to execute some or all of the steps of any one of the image comparison methods for food product traceability described in the above method embodiments.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one type of division of logical functions, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of some interfaces, devices or units, and may be an electric or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may be implemented in the form of a software program module.
The integrated units, if implemented in the form of software program modules and sold or used as stand-alone products, may be stored in a computer-readable memory. Based on such understanding, the technical solution of the present application, in essence, or the part thereof contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned memory includes various media capable of storing program code, such as a USB flash disk, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable memory, which may include: flash memory disks, read-only memory, random access memory, magnetic or optical disks, and the like.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and implementations of the present application, and the above description of the embodiments is only provided to help understand the method and the core concept of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (10)

1. An image comparison method for food tracing, characterized by comprising the following steps:
acquiring a reference image, wherein the reference image comprises an image block of a food container;
if the image block of the food container comprises a target identifier, extracting a target image block from a target area according to the indication of the target identifier, wherein the target image block comprises a reference image block of the food, and the target area is a preset area in the food container;
and determining the quality degree of the food according to the reference image block of the food.
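As an illustration of the extraction step of claim 1, the marker-guided crop can be sketched as follows. This is a minimal sketch only: the detection of the target identifier itself, the fixed offset of the preset target area relative to the identifier, and the block size are all assumptions made for illustration and are not fixed by the claim.

```python
def extract_target_block(image, marker_pos, offset=(2, 2), size=(3, 3)):
    """Crop the preset target area from a reference image.

    `image` is a row-major grid (list of rows) of pixel values, and
    `marker_pos` is the (row, col) of the detected target identifier.
    The preset area is assumed to sit at a fixed offset from the
    identifier inside the food container (an illustrative assumption).
    """
    r0, c0 = marker_pos[0] + offset[0], marker_pos[1] + offset[1]
    h, w = size
    # Slicing clips at the image border, mirroring a bounds-safe crop.
    return [row[c0:c0 + w] for row in image[r0:r0 + h]]

# Example: an 8x8 "image" whose pixels record their own coordinates.
image = [[(i, j) for j in range(8)] for i in range(8)]
target_block = extract_target_block(image, (1, 1), offset=(2, 2), size=(2, 2))
```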
2. The method of claim 1, wherein determining the quality degree of the food according to the reference image block of the food comprises:
acquiring characteristic parameters of a color reference area from the reference image according to the indication of the target identifier;
determining a target image block of the food according to the characteristic parameters and the reference image block of the food;
and determining the quality degree of the food according to the target image block.
3. The method of claim 2, wherein determining the quality degree of the food according to the target image block comprises:
extracting the features of the target image block to obtain feature data;
determining first color data of the food according to the characteristic data;
acquiring second color data of a quality degree identification area of the food from the reference image;
and determining the quality degree of the food according to the first color data and the second color data.
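The color-comparison pipeline of claims 2 and 3 can be sketched as follows. The mean-RGB feature, the Euclidean color distance, and the linear mapping to a quality degree are illustrative assumptions; the claims do not fix a particular feature, color space, or metric.

```python
import math

def mean_color(block):
    """First color data: the mean RGB of an image block.

    `block` is a list of (r, g, b) pixel tuples -- a stand-in for the
    feature data extracted from the target image block in claim 3.
    """
    n = len(block)
    return tuple(sum(p[i] for p in block) / n for i in range(3))

def quality_degree(first_color, second_color, max_dist=441.673):
    """Map the distance between the food's color (first color data) and
    the quality-identification-area color (second color data) to a
    degree in [0, 1]: 1.0 means an exact color match, 0.0 means the
    colors are maximally far apart. The linear mapping and the RGB
    diagonal `max_dist` (sqrt(3 * 255^2)) are assumptions.
    """
    dist = math.dist(first_color, second_color)
    return max(0.0, 1.0 - dist / max_dist)

# Example: a food region's pixels vs. its reference color.
block = [(200, 180, 120), (198, 178, 118), (202, 182, 122)]
first = mean_color(block)   # first color data from the target image block
second = (210, 190, 130)    # second color data from the identification area
degree = quality_degree(first, second)
```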
4. The method of claim 3, wherein the first color data comprises N sub-first color data corresponding to N sub-regions of the food, and wherein determining the quality degree of the food according to the first color data and the second color data comprises:
determining, according to each sub-first color data of the N sub-first color data, the quality degree of the corresponding sub-region, so as to obtain N reference quality degrees;
classifying the N reference quality degrees to obtain a first quality degree set and a second quality degree set, wherein each reference quality degree in the first quality degree set is smaller than each reference quality degree in the second quality degree set;
if a reference quality degree smaller than a preset quality degree threshold exists in the first quality degree set, determining that the quality degree of the food is a first preset quality degree;
if the reference quality degrees in the first quality degree set are all larger than the preset quality degree threshold, acquiring a first number of reference quality degrees of the first quality degree set falling within a first quality degree interval, a second number of reference quality degrees of the first quality degree set falling within a second quality degree interval, and a third number of reference quality degrees of the first quality degree set falling within a third quality degree interval, wherein the first quality degree interval, the second quality degree interval and the third quality degree interval do not overlap one another;
determining a first sub-target quality degree according to the first quality degree interval, the second quality degree interval, the third quality degree interval, the first number, the second number and the third number;
determining a second sub-target quality degree according to the second quality degree set;
and determining the quality degree of the food according to the first sub-target quality degree and the second sub-target quality degree.
5. The method of claim 4, wherein determining the second sub-target quality degree according to the second quality degree set comprises:
obtaining a deviation degree of each reference quality degree in the second quality degree set from a target quality degree threshold;
acquiring the number M of reference quality degrees whose deviation degree is positive and the number S of reference quality degrees whose deviation degree is negative;
and determining the second sub-target quality degree according to the number M and the number S.
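The aggregation described in claims 4 and 5 can be sketched as follows. Every concrete number here -- the preset threshold, the split point between the two sets, the three disjoint intervals, the target threshold, and the rules for combining the sub-target degrees -- is an illustrative assumption, since the claims leave these values unspecified.

```python
def food_quality(ref_degrees, preset_thr=0.3, split=0.6,
                 intervals=((0.3, 0.4), (0.4, 0.5), (0.5, 0.6)),
                 target_thr=0.8):
    """Aggregate N per-subregion reference quality degrees (claims 4-5)."""
    # Classify into a first (lower) and second (higher) quality degree set.
    first_set = [d for d in ref_degrees if d < split]
    second_set = [d for d in ref_degrees if d >= split]

    # Claim 4: any degree below the preset threshold short-circuits to a
    # first preset ("poor") quality degree, modeled here as 0.0.
    if any(d < preset_thr for d in first_set):
        return 0.0

    # Count first-set degrees falling in each of three disjoint intervals.
    counts = [sum(1 for d in first_set if lo <= d < hi) for lo, hi in intervals]
    n1 = sum(counts)
    mids = [(lo + hi) / 2 for lo, hi in intervals]
    # First sub-target degree: counts weighted by interval midpoints
    # (one possible reading of "according to the intervals and numbers").
    first_sub = (sum(c * m for c, m in zip(counts, mids)) / n1) if n1 else split

    # Claim 5: deviations of second-set degrees from a target threshold.
    m = sum(1 for d in second_set if d - target_thr > 0)  # positive deviations
    s = sum(1 for d in second_set if d - target_thr < 0)  # negative deviations
    second_sub = target_thr + 0.1 * (m - s) / max(m + s, 1)

    # Combine the two sub-target degrees (simple average, an assumption).
    return (first_sub + second_sub) / 2
```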
6. An image comparison device for food tracing, characterized by comprising an acquisition unit, an extraction unit and a determination unit, wherein,
the acquisition unit is used for acquiring a reference image, wherein the reference image comprises an image block of a food container;
the extraction unit is used for extracting a target image block from a target area according to the indication of a target identifier if the image block of the food container comprises the target identifier, wherein the target image block comprises a reference image block of food, and the target area is a preset area in the food container;
the determining unit is used for determining the quality degree of the food according to the reference image block of the food.
7. The apparatus according to claim 6, wherein, in determining the quality degree of the food according to the reference image block of the food, the determining unit is configured to:
acquire characteristic parameters of a color reference area from the reference image according to the indication of the target identifier;
determining a target image block of the food according to the characteristic parameters and the reference image block of the food;
and determining the quality degree of the food according to the target image block.
8. The apparatus according to claim 7, wherein, in determining the quality degree of the food according to the target image block, the determining unit is configured to:
extracting the features of the target image block to obtain feature data;
determining first color data of the food according to the characteristic data;
acquiring second color data of a quality degree identification area of the food from the reference image;
and determining the quality degree of the food according to the first color data and the second color data.
9. A terminal, comprising a processor, an input device, an output device, and a memory, the processor, the input device, the output device, and the memory being interconnected, wherein the memory is configured to store a computer program comprising program instructions, the processor being configured to invoke the program instructions to perform the method of any of claims 1-5.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program comprising program instructions that, when executed by a processor, cause the processor to perform the method according to any one of claims 1-5.
CN201910955597.8A 2019-10-09 2019-10-09 Image comparison method for food tracing and related device Active CN110796149B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910955597.8A CN110796149B (en) 2019-10-09 2019-10-09 Image comparison method for food tracing and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910955597.8A CN110796149B (en) 2019-10-09 2019-10-09 Image comparison method for food tracing and related device

Publications (2)

Publication Number Publication Date
CN110796149A true CN110796149A (en) 2020-02-14
CN110796149B CN110796149B (en) 2023-10-27

Family

ID=69440017

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910955597.8A Active CN110796149B (en) 2019-10-09 2019-10-09 Image comparison method for food tracing and related device

Country Status (1)

Country Link
CN (1) CN110796149B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040218838A1 (en) * 2003-04-30 2004-11-04 Canon Kabushiki Kaisha Image processing apparatus and method therefor
JP2008252779A (en) * 2007-03-30 2008-10-16 Yamaha Corp Image processor, image processing method, program and camera
JP2010071951A (en) * 2008-09-22 2010-04-02 Omron Corp Visual inspection device and visual inspection method
WO2016123977A1 (en) * 2015-02-05 2016-08-11 努比亚技术有限公司 Image colour identification method and device, terminal and storage medium
US20170011276A1 (en) * 2015-07-08 2017-01-12 Intelleflex Corporation Photo analytics calibration
CN106709492A (en) * 2016-12-15 2017-05-24 网易(杭州)网络有限公司 Examination paper image processing method and device, and computer readable storage medium
CN108171721A (en) * 2017-12-04 2018-06-15 北京农业智能装备技术研究中心 The target object image extraction method and device of a kind of large scale image
US20180284091A1 (en) * 2017-03-29 2018-10-04 Ido LEVANON Apparatus and method for monitoring preparation of a food product
WO2018195797A1 (en) * 2017-04-26 2018-11-01 深圳配天智能技术研究院有限公司 Visual detection method, detection device, and robot
CN109587268A (en) * 2018-12-25 2019-04-05 Oppo广东移动通信有限公司 Electronic equipment, information-pushing method and Related product
CN109639888A (en) * 2018-11-27 2019-04-16 Oppo广东移动通信有限公司 Electronic device, information-pushing method and Related product


Also Published As

Publication number Publication date
CN110796149B (en) 2023-10-27

Similar Documents

Publication Publication Date Title
TWI675308B (en) Method and apparatus for verifying the availability of biometric images
US8406424B2 (en) Visual universal decryption apparatus and methods
US20120148089A1 (en) Method and system for efficient watermarking of video content
CN112884859B (en) Anti-fake image generation and identification method and device and computer storage medium
CN111738238A (en) Face recognition method and device
CN106709963A (en) Method and apparatus for verifying authenticity of image
WO2017041494A1 (en) Information processing method and terminal, and a computer storage medium
CN105022946A (en) Face decryption method and device
CN109816543B (en) Image searching method and device
US11374933B2 (en) Securing digital data transmission in a communication network
CN110473136B (en) Image processing method and device based on SURF-DCT (speeded Up robust features-discrete cosine transform) mixing
CN112381000A (en) Face recognition method, device, equipment and storage medium based on federal learning
CN115527101A (en) Image tampering detection method and processor
CN108921080A (en) Image-recognizing method, device and electronic equipment
CN109856979B (en) Environment adjusting method, system, terminal and medium
CN108206961A (en) A kind of method and relevant device for calculating live streaming platform popularity
CN110796149A (en) Food tracing image comparison method and related device
WO2020057389A1 (en) Signature verification method and apparatus, electronic device and readable storage medium
CN111680284A (en) Slider verification method and device and readable storage medium
CN111382296B (en) Data processing method, device, terminal and storage medium
CN110796136B (en) Mark and image processing method and related device
CN113689321A (en) Image information transmission method and device based on stereoscopic projection encryption
CN113393358A (en) Image processing method and system, storage medium, and computing device
CN111062048B (en) Secure transmission method and related device
CN110505285B (en) Park session method and related device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 29a-2a, dongle garden, 1023 Buxin Road, Luohu District, Shenzhen, Guangdong 518021

Applicant after: Chen Haoneng

Address before: 518000 513, building 11, Shenzhen Bay science and technology ecological park, No.16, Keji South Road, high tech community, Yuehai street, Nanshan District, Shenzhen City, Guangdong Province

Applicant before: Chen Haoneng

GR01 Patent grant