CN111582109A - Recognition method, recognition device, computer-readable storage medium and electronic equipment - Google Patents

Recognition method, recognition device, computer-readable storage medium and electronic equipment

Info

Publication number
CN111582109A
Authority
CN
China
Prior art keywords: carrier, matrix, target object, attribute, identified
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010352258.3A
Other languages
Chinese (zh)
Other versions
CN111582109B (en)
Inventor
苏睿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Haiyi Tongzhan Information Technology Co Ltd
Original Assignee
Beijing Haiyi Tongzhan Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Haiyi Tongzhan Information Technology Co Ltd filed Critical Beijing Haiyi Tongzhan Information Technology Co Ltd
Priority to CN202010352258.3A priority Critical patent/CN111582109B/en
Publication of CN111582109A publication Critical patent/CN111582109A/en
Application granted granted Critical
Publication of CN111582109B publication Critical patent/CN111582109B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/10: Terrestrial scenes
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30: Computing systems specially adapted for manufacturing

Abstract

The present disclosure relates to the field of image processing technologies, and in particular to an identification method, an identification apparatus, a computer-readable storage medium, and an electronic device. The method includes: acquiring a carrier top view image corresponding to a carrier to be identified, and performing image recognition on the carrier top view image to generate a distribution matrix of the target objects in the carrier to be identified; and performing feature recognition on each target object in the carrier to be identified based on the distribution matrix, to generate an object feature matrix corresponding to the carrier to be identified. According to the technical solution of the embodiments of the present disclosure, on the one hand, because the carrier top view image is recognized first and feature recognition is then performed on the target objects, the target objects placed in the carrier to be identified can be recognized automatically, which increases the degree of automation of the identification; on the other hand, the method obtains the recognition results of all the target objects in a carrier to be identified at one time, so the recognition efficiency is high.

Description

Recognition method, recognition device, computer-readable storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an identification method, an identification apparatus, a computer-readable storage medium, and an electronic device.
Background
During transportation, transported objects often need to be placed in specific carriers to reduce damage in transit, especially when a large number of fragile objects are transported together. For example, on modern farms, eggs, such as duck eggs, are often transported in special trays to reduce breakage during transport.
In the related art, sorting a plurality of transported objects placed in a carrier generally requires performing a preliminary identification of the objects and then sorting them according to the identification result. At present, this preliminary identification is usually performed manually. Manual preliminary identification is inefficient, however, and its results are easily affected by the subjectivity of the person performing it.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure is directed to an identification method, an identification apparatus, a computer-readable storage medium, and an electronic device which, by providing an automatic identification method, can improve the efficiency of identifying a plurality of target objects in a carrier and avoid identification results being affected by the subjectivity of the person performing the identification.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, there is provided an identification method comprising: acquiring a carrier top view image corresponding to a carrier to be identified, and performing image recognition on the carrier top view image to generate a distribution matrix of the target object in the carrier to be identified; and performing feature recognition on each target object in the carrier to be identified based on the distribution matrix to generate an object feature matrix corresponding to the carrier to be identified.
Optionally, based on the foregoing scheme, performing image recognition on the carrier top view image to generate a distribution matrix of the target object in the carrier to be recognized includes: performing image recognition on the carrier top view image to determine at least one placement position in the carrier to be recognized and an image area corresponding to each placement position; recognizing each image area, determining whether a target object is placed in each placement position, and generating a corresponding placement identifier; and generating a distribution matrix corresponding to the target object in the carrier to be recognized based on the placement positions and the corresponding placement identifiers.
Optionally, based on the foregoing scheme, generating a distribution matrix corresponding to the target object in the carrier to be recognized based on the placement positions and the corresponding placement identifiers includes: generating a zero matrix according to the arrangement of the at least one placement position in the carrier to be identified; and assigning a value to each element of the zero matrix according to the correspondence between the placement positions and the elements of the zero matrix and according to the placement identifiers, to obtain the distribution matrix.
Optionally, based on the foregoing scheme, the feature identification includes first attribute identification, and the object feature matrix includes a first attribute matrix; performing feature recognition on each target object in the carrier to be identified based on the distribution matrix to generate an object feature matrix corresponding to the carrier to be identified includes: respectively acquiring an object top view image corresponding to each target object based on the distribution matrix, and extracting first attribute data of the target object from the object top view image; calculating a first attribute threshold corresponding to the carrier to be identified according to the first attribute data of each target object; and replacing matrix values of the distribution matrix according to the first attribute data of each target object and the first attribute threshold, to generate a first attribute matrix corresponding to the carrier to be identified.
Optionally, based on the foregoing scheme, replacing matrix values of the distribution matrix according to the first attribute data of each target object and the first attribute threshold to generate a first attribute matrix corresponding to the carrier to be identified includes: when the first attribute data of a target object belongs to the first attribute threshold, replacing the matrix value corresponding to the target object in the distribution matrix with a first identifier; when the first attribute data of a target object is greater than the first attribute threshold, replacing the matrix value corresponding to the target object in the distribution matrix with a second identifier; and when the first attribute data of a target object is smaller than the first attribute threshold, replacing the matrix value corresponding to the target object in the distribution matrix with a third identifier.
Optionally, based on the foregoing scheme, the feature identification includes second attribute identification, and the object feature matrix includes a second attribute matrix; performing feature recognition on each target object in the carrier to be identified based on the distribution matrix to generate an object feature matrix corresponding to the carrier to be identified includes: acquiring an object image set corresponding to each target object based on the distribution matrix; performing second attribute identification on the object image set to acquire a second attribute parameter corresponding to the target object; and replacing matrix values in the distribution matrix according to the magnitude relation between a preset second attribute threshold and the second attribute parameters, to generate a second attribute matrix corresponding to the carrier to be identified.
Optionally, based on the foregoing scheme, the method further includes: sorting the target objects in the carrier to be identified according to the object feature matrix.
According to a second aspect of the present disclosure, there is provided an identification apparatus comprising: the image recognition module is used for acquiring a carrier top view image corresponding to the carrier to be recognized and carrying out image recognition on the carrier top view image so as to generate a distribution matrix of the target object in the carrier to be recognized; and the characteristic identification module is used for identifying the characteristics of each target object in the carrier to be identified based on the distribution matrix so as to generate an object characteristic matrix corresponding to the carrier to be identified.
According to a third aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the identification method as any one of the above.
According to a fourth aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including:
a processor; and
a storage device for storing one or more programs which, when executed by one or more processors, cause the one or more processors to implement the identification method as in any one of the above.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
In the identification method provided by an embodiment of the present disclosure, a carrier top view image corresponding to a carrier to be identified is acquired and a distribution matrix of the target objects in the carrier to be identified is generated, from which it can be determined whether a target object is present at each placement position in the carrier to be identified; feature recognition is then performed, according to the distribution matrix, on the target objects present in the carrier to be identified, to generate an object feature matrix corresponding to the carrier to be identified. On the one hand, because the carrier top view image is recognized first and feature recognition is then performed on the target objects, the target objects placed in the carrier to be identified can be recognized automatically, which increases the degree of automation of the identification; on the other hand, the method obtains the recognition results of all the target objects in a carrier to be identified at one time, so the recognition efficiency is high.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
FIG. 1 schematically illustrates a flow chart of an identification method in an exemplary embodiment of the disclosure;
fig. 2 schematically illustrates a schematic view of a carrier to be identified and a top view direction in an exemplary embodiment of the disclosure;
fig. 3 schematically illustrates a flow chart of a method of generating a distribution matrix of target objects in a carrier to be identified in an exemplary embodiment of the present disclosure;
fig. 4 schematically illustrates a flow chart of another method of generating a distribution matrix of target objects in a carrier to be identified in an exemplary embodiment of the present disclosure;
fig. 5 schematically illustrates a flowchart of a method for generating an object feature matrix corresponding to a carrier to be identified in an exemplary embodiment of the present disclosure;
fig. 6 schematically illustrates a flowchart of a method for generating a first attribute matrix corresponding to a carrier to be identified in an exemplary embodiment of the present disclosure;
fig. 7 schematically illustrates a flowchart of another method for generating an object feature matrix corresponding to a carrier to be identified in an exemplary embodiment of the present disclosure;
fig. 8 schematically illustrates a flow chart of an identification method targeting an avian egg in an exemplary embodiment of the present disclosure;
FIG. 9 schematically illustrates a component diagram of an identification device in an exemplary embodiment of the disclosure;
fig. 10 schematically illustrates a structural diagram of a computer system suitable for use in an electronic device to implement an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
In the exemplary embodiments, an identification method is first provided. The method may be applied to a terminal device having an image processing function, such as a mobile phone, a computer, or a digital camera, and may also be applied to a processing module of an identification device. For example, before eggs such as hen eggs and duck eggs are sorted, a tray top view image of a tray holding a plurality of eggs may be collected by the identification device, and the tray top view image is then recognized and processed by the processing module of the identification device.
Referring to fig. 1, the above-described recognition method may include the following steps S110 and S120:
in step S110, a carrier top view image corresponding to the carrier to be recognized is obtained, and image recognition is performed on the carrier top view image to generate a distribution matrix of the target object in the carrier to be recognized.
In the present exemplary embodiment, the carrier to be recognized may be a carrier for holding target objects, on which at least one placement position is provided; the carrier top view image corresponding to the carrier to be identified is an image obtained by photographing the surface of the carrier on which the placement positions are located. As shown in fig. 2, the top view direction of the carrier 210 to be recognized is indicated by the arrow. Since a carrier holding target objects is usually placed with the surface having the placement positions facing upward, the carrier top view image can generally be obtained by photographing the carrier to be recognized from above; in the special case where the surface having the placement positions does not face upward, the position of the camera needs to be adjusted so that the captured image of the surface having the placement positions serves as the carrier top view image.
In the present exemplary embodiment, the image recognition is performed on the top view image of the carrier to generate the distribution matrix of the target objects in the carrier to be recognized, and as shown in fig. 3, the following steps S310 to S330 may be included:
in step S310, image recognition is performed on the top view image of the carrier to determine at least one placement position in the carrier to be recognized and an image area corresponding to each placement position.
In the exemplary embodiment, after the carrier top view image of the carrier to be identified is obtained, the carrier top view image may be identified according to an image identification technology, so as to obtain at least one placement position in the carrier top view image and an image area corresponding to the placement position.
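As an illustration of step S310, the sketch below locates placement positions in the carrier top view image under the assumption that they appear as roughly circular depressions (as in an egg tray); the use of OpenCV and the Hough-transform parameters are illustrative assumptions, not part of the patent.

```python
# Hypothetical placement-position detection for a carrier top-view image.
# Assumes circular placement positions; all parameter values are examples only.
import cv2
import numpy as np

def detect_placement_regions(top_view_path):
    img = cv2.imread(top_view_path)
    if img is None:
        raise FileNotFoundError(top_view_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)
    circles = cv2.HoughCircles(
        gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=60,
        param1=100, param2=30, minRadius=25, maxRadius=60)
    regions = []
    if circles is not None:
        for x, y, r in np.round(circles[0]).astype(int):
            # Crop a square image area around each detected placement position.
            regions.append(img[max(y - r, 0):y + r, max(x - r, 0):x + r])
    return regions
```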
In step S320, each image area is recognized, whether a target object is placed in each placement position is determined, and a corresponding placement identifier is generated.
In the present exemplary embodiment, after the placement position in the carrier to be recognized and the image area corresponding to the placement position are determined from the carrier top view image, each image area may be recognized to determine whether the target object is placed in each placement position. For example, whether eggs are placed in the egg tray positions can be determined according to image information such as color and brightness of the egg tray positions.
The placement identifier is an identifier indicating whether a target object is present at a placement position, and it can be customized. For example, when a target object is placed at a placement position, the placement identifier may be set to 1; when no target object is placed at the placement position, the placement identifier may be set to 0.
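A minimal, hypothetical way to turn one image area into a placement identifier is to use the mean brightness mentioned above as the occupancy cue; the brightness threshold of 90 is an assumed example, not a value given in the patent.

```python
# Sketch only: decide whether a placement position holds a target object
# from the mean brightness of its image area.
import cv2
import numpy as np

def placement_identifier(region_bgr, brightness_threshold=90.0):
    gray = cv2.cvtColor(region_bgr, cv2.COLOR_BGR2GRAY)
    # Assumption: an egg reflects more light than an empty, shadowed tray position.
    return 1 if float(np.mean(gray)) > brightness_threshold else 0
```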
In step S330, a distribution matrix corresponding to the target object in the carrier to be recognized is generated based on the placement position and the corresponding placement identifier.
In the present exemplary embodiment, the generating of the distribution matrix corresponding to the target object in the to-be-recognized carrier based on the placement position and the corresponding placement identifier, as shown in fig. 4, may include the following steps S410 and S420:
in step S410, a zero matrix is generated according to the arrangement of at least one placement position in the carrier to be identified.
In the present exemplary embodiment, the corresponding zero matrix may be generated according to the arrangement of the placement positions in the carrier to be identified. For example, when 4 × 5 placement positions are included in the carrier to be identified, a zero matrix of 4 × 5 may be correspondingly generated, so that each 0 element in the zero matrix corresponds to each placement position in the carrier to be identified.
In step S420, assigning values to each element of the zero matrix according to the corresponding relationship between the placement position and each element in the zero matrix and the placement identifier to obtain a distribution matrix.
In the exemplary embodiment, each 0 element in the zero matrix may be reassigned according to the correspondence between the placement positions and the 0 elements in the zero matrix and according to the placement identifier of each placement position, to obtain the distribution matrix. For example, in the 4 × 5 zero matrix, if the placement identifier of the 1st placement position, corresponding to the 1st 0 element in the 1st row, indicates that a target object is present at that position, the 1st 0 element may be assigned the value 1; if the placement identifier of the 2nd placement position, corresponding to the 2nd 0 element in the 1st row, indicates that no target object is present at that position, the 2nd 0 element may be assigned the value 2.
It should be noted that these assigned values are only used to indicate, in the matrix, whether a target object is placed at each placement position; the specific values therefore only need to be distinguishable and can be customized, which is not particularly limited in this disclosure. For example, the value corresponding to a placement identifier indicating that a target object is present may be set to a, and the value corresponding to a placement identifier indicating that no target object is present may be set to 2.
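The sketch below summarizes steps S410 and S420 for a 4 × 5 carrier, using 1 for occupied positions and 0 for empty ones; as noted above, the concrete values are customizable, and the helper name and input format are assumptions for illustration.

```python
# Sketch of steps S410/S420: generate a zero matrix matching the placement
# layout, then assign each element from its placement identifier.
import numpy as np

def build_distribution_matrix(placement_identifiers, rows=4, cols=5):
    """placement_identifiers: dict mapping (row, col) -> 0 or 1."""
    distribution = np.zeros((rows, cols), dtype=int)    # step S410
    for (r, c), flag in placement_identifiers.items():  # step S420
        distribution[r, c] = flag
    return distribution

# Example: only the position in row 0, column 1 is empty.
flags = {(r, c): 1 for r in range(4) for c in range(5)}
flags[(0, 1)] = 0
print(build_distribution_matrix(flags))
```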
In step S120, feature recognition is performed on each target object in the to-be-recognized carrier based on the distribution matrix, so as to generate an object feature matrix corresponding to the to-be-recognized carrier.
In the present exemplary embodiment, since the distribution matrix can indicate whether a target object exists in each placement position on the to-be-identified carrier, the placement positions where the target object exists may be further feature-identified according to the distribution matrix to generate an object feature matrix corresponding to the to-be-identified carrier.
For example, the feature identification may include first attribute identification, where the object feature matrix includes the first attribute matrix, and as shown in fig. 5, the feature identification is performed on each target object in the carrier to be identified based on the distribution matrix to generate the object feature matrix corresponding to the carrier to be identified, which may include the following steps S510 to S530:
in step S510, object top view images corresponding to the target objects are respectively obtained based on the distribution matrix, and first attribute data of the target objects in the object top view images is extracted.
Wherein the target object comprises an object placed in a placement position of the carrier to be identified; the object overhead view image may be an image obtained by shooting the placement position of the target object in the same shooting direction as the shooting carrier overhead view image; the first attribute data may be a parameter for expressing an attribute such as a size, a dimension, or the like of the target object, and may be, for example, an area enclosed by the outline, an overall length of the outline, or the like, which is not particularly limited in the present disclosure.
In this exemplary embodiment, the top view image of each placement position area in which a target object is placed, that is, the object top view image, may be obtained according to the distribution matrix. After the object top view image is acquired, the first attribute data of the target object can be extracted according to the color difference or other parameters of the object top view image.
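For instance, when the first attribute data is the area enclosed by the object's outline, it might be extracted along the lines of the sketch below; the binarization threshold and the choice of the largest contour are illustrative assumptions.

```python
# Sketch: extract first attribute data (contour-enclosed area) from one
# object top-view image. Values and preprocessing are examples only.
import cv2

def contour_area(object_top_view_bgr):
    gray = cv2.cvtColor(object_top_view_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 127, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0.0
    # Take the largest contour as the outline of the target object.
    largest = max(contours, key=cv2.contourArea)
    return float(cv2.contourArea(largest))
```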
In step S520, a first attribute threshold corresponding to the carrier to be identified is calculated according to the first attribute data of each target object.
In the exemplary embodiment, the average value and the standard deviation of the first attribute data may be calculated from the first attribute data of all the target objects in the carrier to be identified, and the first attribute threshold corresponding to the carrier to be identified may then be determined as a range centered on the average value, with the standard deviation as the upper and lower float.
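A minimal sketch of this per-carrier threshold, assuming the range is the mean of the first attribute data plus or minus one standard deviation:

```python
# Sketch of step S520: derive the first attribute threshold range from the
# objects in the same carrier (mean as center, standard deviation as float).
import numpy as np

def first_attribute_threshold(first_attribute_data):
    values = np.asarray(first_attribute_data, dtype=float)
    mean, std = values.mean(), values.std()
    return mean - std, mean + std   # (lower bound, upper bound)

# Example call with illustrative contour areas for eight objects.
print(first_attribute_threshold([4.3, 5.1, 4.9, 5.0, 5.2, 4.8, 5.7, 5.0]))
```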
In addition, since the first attribute threshold is calculated from the first attribute data of all the target objects in the carrier to be recognized, whether a target object meets the attribute standard is judged relative to the attributes of the other target objects in the same carrier, rather than against a fixed preset first attribute threshold. Target objects whose attributes differ greatly from those of the other target objects in the carrier to be recognized can therefore be identified quickly, and the reduced recognition accuracy that an inaccurately set first attribute threshold would cause is avoided.
In step S530, the matrix values of the distribution matrix are replaced according to the first attribute data and the first attribute threshold of each target object, so as to generate a first attribute matrix corresponding to the carrier to be identified.
In the exemplary embodiment, whether the first attribute data of the target object meets the criterion or not can be determined through the first attribute threshold, and then the first attribute identifier of the target object is determined. The first attribute identifier may be an identifier such as a character, a number, a letter, an image, and the like, which is not particularly limited in this disclosure.
For example, when the first attribute identifier includes a numerical value, the first attribute matrix corresponding to the carrier to be identified may be generated in a manner of directly replacing the matrix value in the distribution matrix by the first identifier, the second identifier, and the third identifier. Referring to fig. 6, replacing the matrix value of the distribution matrix according to the first attribute data and the first attribute threshold of each target object may include:
step S610, when the first attribute data of the target object belongs to the first attribute threshold, replacing the matrix value corresponding to the target object in the distribution matrix with a first identifier;
step S620, when the first attribute data of the target object is larger than the first attribute threshold, replacing the matrix value corresponding to the target object in the distribution matrix with a second identifier;
in step S630, when the first attribute data of the target object is smaller than the first attribute threshold, the matrix value corresponding to the target object in the distribution matrix is replaced with the third identifier.
In the present exemplary embodiment, when the target object is identified based on the first attribute data and the first attribute threshold, different categories of first attribute data may be represented by different matrix values in the distribution matrix. Specifically, when the first attribute data of the target object falls within the first attribute threshold range, the element corresponding to the target object in the distribution matrix may be replaced with the first identifier; when the first attribute data of the target object is greater than the first attribute threshold range, the element corresponding to the target object in the distribution matrix is replaced with the second identifier; and when the first attribute data of the target object is smaller than the first attribute threshold range, the element corresponding to the target object in the distribution matrix may be replaced with the third identifier.
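Steps S610 to S630 might be implemented as in the sketch below; the identifier values 2, 4 and 3 mirror the egg example later in the text and are otherwise arbitrary, and the dictionary input format is an assumption.

```python
# Sketch of steps S610-S630: rewrite each occupied element of the distribution
# matrix with a first/second/third identifier according to the threshold range.
import numpy as np

def first_attribute_matrix(distribution, attribute_data, threshold,
                           first_id=2, second_id=4, third_id=3):
    """attribute_data: dict mapping (row, col) -> first attribute value."""
    lower, upper = threshold
    result = distribution.copy()
    for (r, c), value in attribute_data.items():   # only occupied positions
        if lower <= value <= upper:
            result[r, c] = first_id                 # within the threshold (S610)
        elif value > upper:
            result[r, c] = second_id                # above the threshold (S620)
        else:
            result[r, c] = third_id                 # below the threshold (S630)
    return result
```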
It should be noted that, because the elements of a matrix can normally only be identifiers such as letters and numbers and cannot be filled with identifiers such as symbols or images, when the first attribute identifier is an identifier that cannot serve as a matrix element (an image, for example), a filling identifier corresponding to the first attribute identifier may be defined, and the elements in the distribution matrix are then replaced according to the correspondence between the first attribute identifier and the filling identifier.
In addition, the feature recognition may further include second attribute recognition, where the object feature matrix includes a second attribute matrix, and as shown in fig. 7, the feature recognition is performed on each target object in the carrier to be recognized based on the distribution matrix to generate an object feature matrix corresponding to the carrier to be recognized, which may include the following steps S710 to S730:
in step S710, an object image set corresponding to each target object is acquired based on the distribution matrix.
In the present exemplary embodiment, some attribute parameters of the target object, such as whether the target object is intact, whether it is soiled, or what kind of object it is, may require not only the top view image of the target object but also images taken from other directions. Therefore, it is possible to determine, from the distribution matrix, which placement positions hold a target object, and then photograph those target objects to acquire the corresponding object image sets.
In step S720, performing second attribute identification on the object image set to obtain a second attribute parameter corresponding to the target object.
In the present exemplary embodiment, after the object image set is obtained, second attribute identification may be performed on each image in the object image set to obtain the second attribute parameter corresponding to the target object. The second attribute parameter may, for example, describe whether the target object is intact or whether it is soiled.
In step S730, a matrix value in the distribution matrix is replaced according to a size relationship between a preset second attribute threshold and a second attribute parameter, so as to generate a second attribute matrix corresponding to the carrier to be identified.
In the exemplary embodiment, when the second attribute parameter of a target object is identified, the result cannot be obtained by comparison with the other target objects in the same carrier to be identified; a preset second attribute threshold therefore needs to be defined, and the second attribute parameter of the target object is then evaluated against this preset second attribute threshold.
Specifically, the matrix values in the distribution matrix may be replaced according to the magnitude relation between the preset second attribute threshold and the second attribute parameters, to obtain a second attribute matrix representing the second attribute parameters of all the target objects in the carrier to be identified. It should be noted that, when the matrix values are replaced, the replacement may be done in the same manner as when the first attribute matrix is generated, or in another manner, as long as each element of the second attribute matrix represents the second attribute parameter of the corresponding target object in the carrier to be identified; the specific replacement manner is not particularly limited in this disclosure.
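A sketch of step S730 under the assumption that the second attribute parameter is a single number compared against the preset threshold; the identifiers 5 and 6 follow the later egg example, and the comparison rule is an illustrative assumption.

```python
# Sketch of step S730: compare each object's second attribute parameter with a
# preset threshold and write the result into the matrix.
import numpy as np

def second_attribute_matrix(distribution, second_attribute_params,
                            preset_threshold, ok_id=5, defect_id=6):
    """second_attribute_params: dict mapping (row, col) -> parameter value."""
    result = distribution.copy()
    for (r, c), param in second_attribute_params.items():
        result[r, c] = ok_id if param >= preset_threshold else defect_id
    return result
```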
It should be noted that, since the preliminary identification may involve attributes of several different aspects, there may be a plurality of object feature matrices, that is, one object feature matrix for each attribute. For example, if both the size and the integrity of the eggs need to be identified, 2 object feature matrices can be generated.
In addition, after the object feature matrix corresponding to the carrier to be identified is obtained, the method may further include: sorting the target objects in the carrier to be identified according to the object feature matrix.
In this exemplary embodiment, after the object feature matrix corresponding to the to-be-identified carrier is obtained, all the target objects on the to-be-identified carrier may be sorted at one time according to the object feature matrix. Because the object feature matrix can represent the feature attributes of all target objects on the whole carrier to be identified, all the target objects can be classified according to the object feature matrix, and then one-time sorting is carried out through the sorting equipment.
It should be noted that, when a carrier to be identified is sorted, the sorting devices on the sorting head of the sorting equipment generally correspond one to one to the placement positions on the carrier. However, because carriers to be identified may be placed in different orientations, the sorting devices may fail to correspond one to one to the placement positions of the carrier being sorted. For example, in the current orientation the placement positions may be distributed as 4 × 5 while the sorting devices on the sorting equipment are distributed as 5 × 4. In this case, the orientation of the carrier to be recognized should be determined from the distribution matrix or the object feature matrix and transmitted to the sorting equipment, for example by means of an identifier, and the sorting equipment can then bring the sorting devices into one-to-one correspondence with the placement positions in the carrier to be recognized by operations such as rotating the sorting head.
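One way to express this orientation fix-up, assuming the adjustment is done by rotating the object feature matrix rather than the sorting head (either side could be rotated; the equipment-level choice is not fixed by the patent):

```python
# Sketch: rotate the feature matrix so its layout (e.g. 4 x 5) matches the
# sorting head's layout (e.g. 5 x 4). Which rotation direction is correct in
# practice depends on the detected orientation of the carrier.
import numpy as np

def align_to_sorting_head(feature_matrix, head_shape):
    if feature_matrix.shape == head_shape:
        return feature_matrix
    rotated = np.rot90(feature_matrix)   # a 90-degree rotation swaps the axes
    if rotated.shape == head_shape:
        return rotated
    raise ValueError("carrier layout cannot be mapped onto the sorting head")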
The following takes the identification of eggs as an example and explains the implementation details of the technical solution of the embodiments of the present disclosure with reference to fig. 8:
step S810, the egg tray to be identified enters an identification area, and an egg tray overlook image of the egg tray to be identified is shot.
Step S820, based on image recognition of the egg tray top view image, a distribution matrix is generated that has the same layout as the placement positions in the egg tray and indicates whether an egg is present at each position. For example, if the placement positions are arranged in a 1 × 9 layout, eggs are placed at the 1st to 7th and the 9th placement positions, "1" indicates a position holding an egg, and "0" indicates an empty position, then the corresponding distribution matrix is [1,1,1,1,1,1,1,0,1].
In step S830, it can be determined from the distribution matrix that the egg tray is placed in a transverse orientation.
Step S840, attribute identification is performed on the eggs in the egg tray according to the distribution matrix, and the egg feature matrix corresponding to the egg tray is determined.
Step S850, the sorting equipment rotates the sorting head according to the orientation of the egg tray so that the sorting devices on the sorting head correspond one to one to the placement positions in the egg tray, and the eggs in the egg tray are sorted according to the egg feature matrix.
Step S840 is detailed below by two embodiments:
example 1:
In the above example, it is assumed that the attribute to be identified is a size feature. In this case, the egg images corresponding to the 1st to 7th and the 9th placement positions in the egg tray top view image may be recognized, the area enclosed by each egg contour extracted, and the average area and the standard deviation of the areas of the 8 eggs calculated. For example, if the average area and the standard deviation of the areas of the 8 eggs are 5 and 0.5, respectively, the area threshold is determined from the average area and the standard deviation to be 4.5 to 5.5.
Then, the area enclosed by each of the 8 egg contours is compared with the area threshold, and the matrix values in the distribution matrix [1,1,1,1,1,1,1,0,1] are replaced accordingly. For example, the areas of the 1st and 7th egg contours are 4.3 and 5.7, respectively, and the areas enclosed by the remaining egg contours fall within the area threshold. Assuming that "2" indicates that the contour area falls within the area threshold, "3" that it is smaller than the area threshold, and "4" that it is larger than the area threshold, the egg feature matrix corresponding to the size feature is [3,2,2,2,2,2,4,0,2].
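Example 1 can be reproduced with a few lines of code; the numbers below are taken from the example above, while the variable names are illustrative.

```python
# Worked version of Example 1: area threshold 4.5-5.5, eggs 1 and 7 outside it.
import numpy as np

distribution = np.array([[1, 1, 1, 1, 1, 1, 1, 0, 1]])
areas = {(0, 0): 4.3, (0, 1): 5.0, (0, 2): 5.0, (0, 3): 5.0,
         (0, 4): 5.0, (0, 5): 5.0, (0, 6): 5.7, (0, 8): 5.0}

size_matrix = distribution.copy()
for (r, c), area in areas.items():
    if 4.5 <= area <= 5.5:
        size_matrix[r, c] = 2      # contour area within the threshold
    elif area > 5.5:
        size_matrix[r, c] = 4      # larger than the threshold
    else:
        size_matrix[r, c] = 3      # smaller than the threshold

print(size_matrix)                 # [[3 2 2 2 2 2 4 0 2]]
```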
Example 2:
In the above example, it is assumed that the attribute to be identified is an integrity feature. In this case, the egg image sets corresponding to the 1st to 7th and the 9th placement positions in the egg tray are acquired and recognized to obtain parameters such as the egg contours and surface colors, and these parameters are then compared with a preset integrity threshold to determine whether each egg is intact. The preset integrity threshold may be an empirical value obtained by recognizing parameters such as the contour lines and surface colors of a number of intact eggs.
The matrix values in the distribution matrix [1,1,1,1,1,1,1,0,1] are then replaced according to the comparison results. For example, if parameters such as the contour lines and surface colors of the 3rd and 9th eggs do not satisfy the preset integrity threshold, the 3rd and 9th eggs can be determined to be incomplete. Assuming that "5" indicates an intact egg and "6" an incomplete egg, the egg feature matrix corresponding to the integrity feature is [5,5,6,5,5,5,5,0,6].
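Example 2 can be written in the same style; the per-egg integrity results are taken from the example above, and how they are derived from contour and surface-color parameters is left abstract here.

```python
# Worked version of Example 2: eggs 3 and 9 fail the integrity check.
import numpy as np

distribution = np.array([[1, 1, 1, 1, 1, 1, 1, 0, 1]])
is_intact = {(0, 0): True, (0, 1): True, (0, 2): False, (0, 3): True,
             (0, 4): True, (0, 5): True, (0, 6): True, (0, 8): False}

integrity_matrix = distribution.copy()
for pos, intact in is_intact.items():
    integrity_matrix[pos] = 5 if intact else 6

print(integrity_matrix)            # [[5 5 6 5 5 5 5 0 6]]
```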
It is noted that the above-mentioned figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Embodiments of the apparatus of the present disclosure are described below, which may be used to perform the above-described identification methods of the present disclosure. Referring to fig. 9, the recognition apparatus 900 includes: an image recognition module 910 and a feature recognition module 920.
The image recognition module 910 may be configured to obtain a carrier top view image corresponding to the carrier to be recognized, and perform image recognition on the carrier top view image to generate a distribution matrix of the target object in the carrier to be recognized.
The feature identification module 920 may be configured to perform feature identification on each target object in the to-be-identified carrier based on the distribution matrix, so as to generate an object feature matrix corresponding to the to-be-identified carrier.
In an exemplary embodiment of the present disclosure, based on the foregoing solution, the image recognition module 910 may be configured to perform image recognition on a top view image of a carrier, so as to determine at least one placement position in the carrier to be recognized and an image area corresponding to each placement position; identifying each image area, determining whether a target object is placed in each placing position, and generating a corresponding placing identifier; and generating a distribution matrix corresponding to the target object in the carrier to be recognized based on the placement position and the corresponding placement identifier.
In an exemplary embodiment of the present disclosure, based on the foregoing solution, the image recognition module 910 may be configured to generate a zero matrix according to an arrangement manner of at least one placement position in the carrier to be recognized; and assigning values to each element of the zero matrix according to the corresponding relation between the placement position and each element in the zero matrix and the placement identifier to obtain the distribution matrix.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the feature identification module 920 may be configured to obtain object top view images corresponding to the target objects respectively based on the distribution matrix, and extract first attribute data of the target objects in the object top view images; calculating a first attribute threshold corresponding to the carrier to be identified according to the first attribute data of each target object; and replacing the matrix value of the distribution matrix according to the first attribute data and the first attribute threshold value of each target object to generate a first attribute matrix corresponding to the carrier to be identified.
In an exemplary embodiment of the present disclosure, based on the foregoing solution, the feature identification module 920 may be configured to replace a matrix value corresponding to the target object in the distribution matrix with the first identifier when the first attribute data of the target object belongs to the first attribute threshold; when the first attribute data of the target object is larger than a first attribute threshold value, replacing a matrix value corresponding to the target object in the distribution matrix with a second identifier; and replacing the matrix value corresponding to the target object in the distribution matrix with a third identifier when the first attribute data of the target object is smaller than the first attribute threshold.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the feature identification module 920 may be configured to obtain an object image set corresponding to each target object based on a distribution matrix; performing second attribute identification on the object image set to acquire a second attribute parameter corresponding to the target object; and replacing the matrix value in the distribution matrix according to the size relation between the preset second attribute threshold and the second attribute parameter so as to generate a second attribute matrix corresponding to the carrier to be identified.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the identifying apparatus 900 further includes: and an object sorting module 930, configured to sort the target objects in the to-be-identified carrier according to the object feature matrix.
Since the functional modules of the identification apparatus of the exemplary embodiments of the present disclosure correspond to the steps of the exemplary embodiments of the identification method described above, for details not disclosed in the apparatus embodiments, please refer to the embodiments of the identification method of the present disclosure.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functionality of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
In addition, in an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above identification method is also provided.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method, or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module," or "system."
An electronic device 1000 according to such an embodiment of the present disclosure is described below with reference to fig. 10. The electronic device 1000 shown in fig. 10 is only an example and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 10, the electronic device 1000 is embodied in the form of a general purpose computing device. The components of the electronic device 1000 may include, but are not limited to: the at least one processing unit 1010, the at least one memory unit 1020, a bus 1030 connecting different system components (including the memory unit 1020 and the processing unit 1010), and a display unit 1040.
Where the storage unit stores program code that may be executed by the processing unit 1010 to cause the processing unit 1010 to perform the steps according to various exemplary embodiments of the present disclosure described in the "exemplary methods" section above in this specification. For example, the processing unit 1010 may perform step S110 as shown in fig. 1: acquiring a carrier top view image corresponding to a carrier to be identified, and carrying out image identification on the carrier top view image to generate a distribution matrix of a target object in the carrier to be identified; s120: and performing characteristic identification on each target object in the carrier to be identified based on the distribution matrix to generate an object characteristic matrix corresponding to the carrier to be identified.
As another example, the electronic device may implement the steps shown in fig. 3-8.
The memory unit 1020 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM) 1021 and/or a cache memory unit 1022, and may further include a read-only memory unit (ROM) 1023.
Storage unit 1020 may also include a program/utility 1024 having a set (at least one) of program modules 1025, such program modules 1025 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 1030 may be any one or more of several types of bus structures including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, and a local bus using any of a variety of bus architectures.
The electronic device 1000 may also communicate with one or more external devices 1070 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 1000, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 1000 to communicate with one or more other computing devices. Such communication may occur through input/output (I/O) interfaces 1050. Also, the electronic device 1000 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the internet) via the network adapter 1060. As shown, the network adapter 1060 communicates with the other modules of the electronic device 1000 over the bus 1030. It should be appreciated that although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 1000, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, various aspects of the disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to perform the steps according to various exemplary embodiments of the disclosure described in the above-mentioned "exemplary methods" section of this specification, when the program product is run on the terminal device.
Furthermore, an exemplary embodiment of the present disclosure provides a program product for implementing the above method, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java or C++ and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the internet using an internet service provider).
Furthermore, the above-described figures are merely schematic illustrations of processes included in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (10)

1. An identification method, comprising:
acquiring a carrier top view image corresponding to a carrier to be identified, and carrying out image identification on the carrier top view image to generate a distribution matrix of a target object in the carrier to be identified;
and performing feature recognition on each target object in the carrier to be recognized based on the distribution matrix to generate an object feature matrix corresponding to the carrier to be recognized.
2. The method according to claim 1, wherein performing image recognition on the carrier top view image to generate the distribution matrix of target objects in the carrier to be recognized comprises:
performing image recognition on the carrier top view image to determine at least one placement position in the carrier to be recognized and an image area corresponding to each placement position;
recognizing each image area to determine whether a target object is placed in each placement position, and generating a corresponding placement identifier;
and generating the distribution matrix corresponding to the target objects in the carrier to be recognized based on the placement positions and the corresponding placement identifiers.
3. The method of claim 2, wherein generating the distribution matrix corresponding to the target objects in the carrier to be recognized based on the placement positions and the corresponding placement identifiers comprises:
generating a zero matrix according to the arrangement of the at least one placement position in the carrier to be recognized;
and assigning values to the elements of the zero matrix according to the correspondence between the placement positions and the elements of the zero matrix, and according to the placement identifiers, to obtain the distribution matrix.
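By way of a non-limiting illustration of claims 2 and 3, the sketch below builds a distribution matrix from detected placement positions. The function name, the NumPy representation, and the use of 1/0 as placement identifiers are assumptions made only for this sketch and are not part of the claimed method.

```python
import numpy as np

def build_distribution_matrix(placements, rows, cols):
    """placements: iterable of (row, col, occupied) tuples, where `occupied`
    is the placement identifier indicating whether a target object was
    recognized in that placement position."""
    # Zero matrix generated according to the arrangement of the placement
    # positions in the carrier to be recognized (claim 3).
    distribution = np.zeros((rows, cols), dtype=int)
    # Assign values via the correspondence between placement positions and
    # matrix elements, using the placement identifiers.
    for r, c, occupied in placements:
        distribution[r, c] = 1 if occupied else 0
    return distribution

# Example: a 2 x 3 carrier with one empty placement position.
placements = [(0, 0, True), (0, 1, True), (0, 2, False),
              (1, 0, True), (1, 1, True), (1, 2, True)]
print(build_distribution_matrix(placements, rows=2, cols=3))
# [[1 1 0]
#  [1 1 1]]
```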
4. The method of claim 1, wherein the feature recognition comprises first attribute recognition, and the object feature matrix comprises a first attribute matrix;
and performing feature recognition on each target object in the carrier to be recognized based on the distribution matrix to generate the object feature matrix corresponding to the carrier to be recognized comprises:
acquiring, based on the distribution matrix, an object top view image corresponding to each target object, and extracting first attribute data of the target object from the object top view image;
calculating a first attribute threshold corresponding to the carrier to be recognized according to the first attribute data of each target object;
and replacing matrix values of the distribution matrix according to the first attribute data of each target object and the first attribute threshold, to generate the first attribute matrix corresponding to the carrier to be recognized.
5. The method according to claim 4, wherein replacing the matrix values of the distribution matrix according to the first attribute data of each target object and the first attribute threshold to generate the first attribute matrix corresponding to the carrier to be recognized comprises:
replacing the matrix value corresponding to the target object in the distribution matrix with a first identifier when the first attribute data of the target object falls within the first attribute threshold;
replacing the matrix value corresponding to the target object in the distribution matrix with a second identifier when the first attribute data of the target object is greater than the first attribute threshold;
and replacing the matrix value corresponding to the target object in the distribution matrix with a third identifier when the first attribute data of the target object is less than the first attribute threshold.
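As a minimal, hypothetical sketch of claims 4 and 5, the code below assumes the first attribute is a scalar (for example an estimated size) and treats the first attribute threshold as an interval around the mean; the identifier values 1/2/3 and the tolerance parameter are illustrative assumptions only.

```python
import numpy as np

FIRST_ID, SECOND_ID, THIRD_ID = 1, 2, 3  # hypothetical identifiers

def build_first_attribute_matrix(distribution, attribute_data, tolerance=0.1):
    """attribute_data maps (row, col) -> first attribute value for every
    occupied placement position."""
    values = np.array(list(attribute_data.values()), dtype=float)
    # First attribute threshold computed from the first attribute data of
    # all target objects (here: an interval around the mean, as an example).
    mean = values.mean()
    low, high = mean * (1 - tolerance), mean * (1 + tolerance)

    attribute_matrix = distribution.copy()
    for (r, c), value in attribute_data.items():
        if value > high:
            attribute_matrix[r, c] = SECOND_ID  # greater than the threshold
        elif value < low:
            attribute_matrix[r, c] = THIRD_ID   # less than the threshold
        else:
            attribute_matrix[r, c] = FIRST_ID   # within the threshold
    return attribute_matrix

# Example: a 1 x 3 carrier with first attribute data per position.
dist = np.array([[1, 1, 1]])
print(build_first_attribute_matrix(dist, {(0, 0): 50.0, (0, 1): 62.0, (0, 2): 41.0}))
# mean = 51.0, interval [45.9, 56.1] -> [[1 2 3]]
```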
6. The method of claim 1, wherein the feature recognition comprises second attribute recognition, and the object feature matrix comprises a second attribute matrix;
and performing feature recognition on each target object in the carrier to be recognized based on the distribution matrix to generate the object feature matrix corresponding to the carrier to be recognized comprises:
acquiring an object image set corresponding to each target object based on the distribution matrix;
performing second attribute recognition on the object image set to acquire a second attribute parameter corresponding to the target object;
and replacing matrix values in the distribution matrix according to the magnitude relationship between a preset second attribute threshold and the second attribute parameter, to generate the second attribute matrix corresponding to the carrier to be recognized.
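A similarly hypothetical sketch of claim 6, in which the second attribute parameter is assumed to be a score returned by some recognizer over the object image set; the `score_fn` callable, the preset threshold value, and the identifier values are placeholders rather than anything defined by the claim.

```python
import numpy as np

def build_second_attribute_matrix(distribution, image_sets, score_fn,
                                  preset_threshold=0.5, pass_id=1, fail_id=2):
    """image_sets maps (row, col) -> a list of images of the target object at
    that placement position; score_fn performs the second attribute
    recognition and returns the second attribute parameter for one image set."""
    attribute_matrix = distribution.copy()
    for (r, c), images in image_sets.items():
        parameter = score_fn(images)
        # Replace the matrix value according to the magnitude relationship
        # between the preset second attribute threshold and the parameter.
        attribute_matrix[r, c] = pass_id if parameter >= preset_threshold else fail_id
    return attribute_matrix
```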
7. The method of claim 1, further comprising:
sorting the target objects in the carrier to be recognized according to the object feature matrix.
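For claim 7, one straightforward and purely illustrative way to sort the target objects is to group placement positions by their value in the object feature matrix, so that downstream handling can process each category in turn.

```python
import numpy as np
from collections import defaultdict

def group_positions_by_feature(feature_matrix):
    """Group occupied placement positions by their feature-matrix value."""
    groups = defaultdict(list)
    for r, c in zip(*np.nonzero(feature_matrix)):
        groups[int(feature_matrix[r, c])].append((int(r), int(c)))
    return dict(groups)

feature_matrix = np.array([[1, 2, 0],
                           [3, 1, 1]])
print(group_positions_by_feature(feature_matrix))
# {1: [(0, 0), (1, 1), (1, 2)], 2: [(0, 1)], 3: [(1, 0)]}
```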
8. A recognition device, comprising:
an image recognition module configured to acquire a carrier top view image corresponding to a carrier to be recognized and perform image recognition on the carrier top view image to generate a distribution matrix of target objects in the carrier to be recognized;
and a feature recognition module configured to perform feature recognition on each target object in the carrier to be recognized based on the distribution matrix to generate an object feature matrix corresponding to the carrier to be recognized.
9. A computer-readable storage medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the recognition method according to any one of claims 1 to 7.
10. An electronic device, comprising:
a processor; and
a memory for storing one or more programs which, when executed by the processor, cause the processor to implement the recognition method according to any one of claims 1 to 7.
CN202010352258.3A 2020-04-28 2020-04-28 Identification method, identification device, computer-readable storage medium, and electronic apparatus Active CN111582109B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010352258.3A CN111582109B (en) 2020-04-28 2020-04-28 Identification method, identification device, computer-readable storage medium, and electronic apparatus

Publications (2)

Publication Number Publication Date
CN111582109A true CN111582109A (en) 2020-08-25
CN111582109B (en) 2023-09-05

Family

ID=72126201

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010352258.3A Active CN111582109B (en) 2020-04-28 2020-04-28 Identification method, identification device, computer-readable storage medium, and electronic apparatus

Country Status (1)

Country Link
CN (1) CN111582109B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030072035A1 (en) * 2001-09-27 2003-04-17 Brother Kogyo Kabushiki Kaisha Image processing device
CN103530598A (en) * 2013-03-08 2014-01-22 Tcl集团股份有限公司 Station logo identification method and system
CN103521464A (en) * 2013-10-25 2014-01-22 华中农业大学 Method and device for identification and separation of yolk-dispersed eggs based on machine vision
WO2020011001A1 (en) * 2018-07-11 2020-01-16 腾讯科技(深圳)有限公司 Image processing method and device, storage medium and computer device
CN110288037A (en) * 2019-06-28 2019-09-27 北京字节跳动网络技术有限公司 Image processing method, device and electronic equipment
CN110927167A (en) * 2019-10-31 2020-03-27 北京海益同展信息科技有限公司 Egg detection method and device, electronic equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Chen Qian, "Research on a deep learning-based feature recognition algorithm for tunnel water leakage" *

Also Published As

Publication number Publication date
CN111582109B (en) 2023-09-05

Similar Documents

Publication Publication Date Title
WO2021051885A1 (en) Target labeling method and apparatus
CN110225366B (en) Video data processing and advertisement space determining method, device, medium and electronic equipment
CN109377508B (en) Image processing method and device
CN108805180B (en) Target object detection method and device
CN111292327B (en) Machine room inspection method, device, equipment and storage medium
CN116168351B (en) Inspection method and device for power equipment
CN109815405B (en) Gray level shunting method and system
CN115205883A (en) Data auditing method, device, equipment and storage medium based on OCR (optical character recognition) and NLP (non-line language)
CN108595178B (en) Hook-based data acquisition method, device and equipment
CN110119459B (en) Image data search method and image data search device
CN111582109B (en) Identification method, identification device, computer-readable storage medium, and electronic apparatus
CN111666884A (en) Living body detection method, living body detection device, computer-readable medium, and electronic apparatus
CN111078317A (en) Scene data processing method and device, computer equipment and storage medium
CN111062374A (en) Identification method, device, system, equipment and readable medium of identity card information
CN110706185A (en) Image processing method and device, equipment and storage medium
CN113610801B (en) Defect classification method, device, equipment and storage medium based on minimum unit
CN116091481A (en) Spike counting method, device, equipment and storage medium
CN108447107B (en) Method and apparatus for generating video
CN110310341B (en) Method, device, equipment and storage medium for generating default parameters in color algorithm
CN111683296B (en) Video segmentation method and device, electronic equipment and storage medium
CN110827261B (en) Image quality detection method and device, storage medium and electronic equipment
CN114399670A (en) Control method for extracting characters in pictures in 5G messages in real time
CN109040774B (en) Program information extraction method, terminal equipment, server and storage medium
CN112801016A (en) Vote data statistical method, device, equipment and medium
CN111061625A (en) Automatic testing method and device applied to out-of-order password keyboard

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 601, 6/F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant after: Jingdong Technology Information Technology Co.,Ltd.

Address before: 601, 6/F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant before: Jingdong Shuke Haiyi Information Technology Co.,Ltd.

Address after: 601, 6/F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant after: Jingdong Shuke Haiyi Information Technology Co.,Ltd.

Address before: 601, 6/F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant before: BEIJING HAIYI TONGZHAN INFORMATION TECHNOLOGY Co.,Ltd.

GR01 Patent grant