CN111582109B - Identification method, identification device, computer-readable storage medium, and electronic apparatus - Google Patents

Identification method, identification device, computer-readable storage medium, and electronic apparatus

Info

Publication number
CN111582109B
CN111582109B (application CN202010352258.3A)
Authority
CN
China
Prior art keywords
carrier
matrix
attribute
target object
identified
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010352258.3A
Other languages
Chinese (zh)
Other versions
CN111582109A (en)
Inventor
苏睿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jingdong Technology Information Technology Co Ltd
Original Assignee
Jingdong Technology Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jingdong Technology Information Technology Co Ltd
Priority to CN202010352258.3A
Publication of CN111582109A
Application granted
Publication of CN111582109B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/10: Terrestrial scenes
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/70: Determining position or orientation of objects or cameras
    • G06T7/73: Determining position or orientation of objects or cameras using feature-based methods
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30: Computing systems specially adapted for manufacturing

Abstract

The disclosure relates to the technical field of image processing, and in particular to an identification method, an identification device, a computer-readable storage medium, and an electronic apparatus. The method includes: acquiring a carrier top view image corresponding to a carrier to be identified, and performing image recognition on the carrier top view image to generate a distribution matrix of the target objects in the carrier to be identified; and performing feature recognition on each target object in the carrier to be identified based on the distribution matrix to generate an object feature matrix corresponding to the carrier to be identified. With this technical solution, on the one hand, recognizing the carrier top view image and then recognizing the target objects allows the target objects placed in the carrier to be identified automatically, which improves the degree of automation of the identification; on the other hand, the identification results for all target objects in the carrier to be identified can be obtained in a single pass, so the identification efficiency is higher.

Description

Identification method, identification device, computer-readable storage medium, and electronic apparatus
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an identification method, an identification device, a computer-readable storage medium, and an electronic apparatus.
Background
In order to reduce damage to the transported objects during transportation, it is often necessary to place the transported objects in specific carriers, especially when transporting a plurality of fragile transported objects. For example, in modern farms, in order to be able to transport eggs, such as chicken eggs, duck eggs, etc., it is often necessary to place the eggs in special trays to reduce the occurrence of breakage during transport.
In the related art, in order to sort a plurality of transported objects placed in a carrier, it is generally necessary to first perform a preliminary identification of the transported objects and then sort them according to the identification result. Currently, this preliminary identification is usually performed manually. However, manual preliminary identification is inefficient, and the identification result is easily affected by the subjective judgment of the operator.
It should be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the present disclosure and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure is directed to an identification method, an identification device, a computer-readable storage medium, and an electronic apparatus. By providing an automated identification method, the disclosure can improve the efficiency of identifying a plurality of target objects in a carrier and avoid the problem of the identification result being subjectively affected by the person performing the identification.
Other features and advantages of the present disclosure will be apparent from the following detailed description, or may be learned in part by the practice of the disclosure.
According to a first aspect of the present disclosure, there is provided an identification method comprising: acquiring a carrier overlook image corresponding to a carrier to be identified, and carrying out image identification on the carrier overlook image to generate a distribution matrix of a target object in the carrier to be identified; and carrying out feature recognition on each target object in the carrier to be recognized based on the distribution matrix so as to generate an object feature matrix corresponding to the carrier to be recognized.
Optionally, based on the foregoing solution, performing image recognition on the carrier top view image to generate a distribution matrix of the target object in the carrier to be recognized, including: carrying out image recognition on the carrier overlook image to determine at least one placement position in the carrier to be recognized and an image area corresponding to each placement position; identifying each image area, determining whether a target object is placed in each placement position, and generating a corresponding placement identifier; and generating a distribution matrix corresponding to the target object in the carrier to be identified based on the placement position and the corresponding placement identifier.
Optionally, based on the foregoing solution, generating a distribution matrix corresponding to the target object in the carrier to be identified based on the placement location and the corresponding placement identifier includes: generating a zero matrix according to the arrangement mode of at least one placement position in the carrier to be identified; and assigning values to each element of the zero matrix according to the corresponding relation between the placement position and each element in the zero matrix and the placement mark to obtain a distribution matrix.
Optionally, based on the foregoing scheme, the feature recognition includes a first attribute recognition, and the object feature matrix includes a first attribute matrix; performing feature recognition on each target object in the carrier to be recognized based on the distribution matrix to generate an object feature matrix corresponding to the carrier to be recognized, including: respectively acquiring object overlooking images corresponding to all target objects based on the distribution matrix, and extracting first attribute data of the target objects in the object overlooking images; calculating a first attribute threshold corresponding to the carrier to be identified according to the first attribute data of each target object; and replacing matrix values of the distribution matrix according to the first attribute data and the first attribute threshold value of each target object to generate a first attribute matrix corresponding to the carrier to be identified.
Optionally, based on the foregoing solution, replacing matrix values of the distribution matrix according to the first attribute data and the first attribute threshold of each target object to generate a first attribute matrix corresponding to the carrier to be identified, including: when the first attribute data of the target object belong to a first attribute threshold value, replacing a matrix value corresponding to the target object in the distribution matrix with a first identifier; when the first attribute data of the target object is larger than a first attribute threshold value, replacing a matrix value corresponding to the target object in the distribution matrix with a second identifier; and when the first attribute data of the target object is smaller than the first attribute threshold value, replacing matrix values corresponding to the target object in the distribution matrix with third identifications.
Optionally, based on the foregoing scheme, the feature recognition includes a second attribute recognition, and the object feature matrix includes a second attribute matrix; performing feature recognition on each target object in the carrier to be recognized based on the distribution matrix to generate an object feature matrix corresponding to the carrier to be recognized, including: acquiring an object image set corresponding to each target object based on the distribution matrix; performing second attribute identification on the object image set to obtain second attribute parameters corresponding to the target object; and replacing matrix values in the distribution matrix according to the size relation between the preset second attribute threshold and the second attribute parameter to generate a second attribute matrix corresponding to the carrier to be identified.
Optionally, based on the foregoing solution, the method further includes: and sorting the target objects in the carrier to be identified according to the object feature matrix.
According to a second aspect of the present disclosure, there is provided an identification device comprising: the image recognition module is used for acquiring a carrier overlook image corresponding to the carrier to be recognized, and carrying out image recognition on the carrier overlook image so as to generate a distribution matrix of the target object in the carrier to be recognized; and the feature recognition module is used for carrying out feature recognition on each target object in the carrier to be recognized based on the distribution matrix so as to generate an object feature matrix corresponding to the carrier to be recognized.
According to a third aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a method of identifying as any one of the above.
According to a fourth aspect of embodiments of the present disclosure, there is provided an electronic device, comprising:
a processor; and
a storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the identification method as described in any of the embodiments above.
The technical scheme provided by the embodiment of the disclosure can comprise the following beneficial effects:
in the identification method provided by the embodiments of the disclosure, a carrier top view image corresponding to a carrier to be identified is obtained and a distribution matrix of the target objects in the carrier to be identified is generated, so that whether a target object exists at each placement position in the carrier to be identified can be determined from the distribution matrix; feature recognition is then performed on the target objects present in the carrier to be identified according to the distribution matrix, and an object feature matrix corresponding to the carrier to be identified is generated. On the one hand, recognizing the carrier top view image and then recognizing the target objects allows the target objects placed in the carrier to be identified automatically, improving the degree of automation of the identification; on the other hand, the identification results of all target objects in the carrier to be identified can be obtained in a single pass, so the identification efficiency is higher.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort. In the drawings:
FIG. 1 schematically illustrates a flow chart of an identification method in an exemplary embodiment of the present disclosure;
fig. 2 schematically illustrates a schematic view of a carrier to be identified and a top view in an exemplary embodiment of the present disclosure;
FIG. 3 schematically illustrates a flow chart of a method of generating a distribution matrix of target objects in a carrier to be identified in an exemplary embodiment of the disclosure;
FIG. 4 schematically illustrates a flowchart of another method of generating a distribution matrix of target objects in a carrier to be identified in an exemplary embodiment of the present disclosure;
FIG. 5 schematically illustrates a flowchart of a method of generating an object feature matrix corresponding to a carrier to be identified in an exemplary embodiment of the present disclosure;
Fig. 6 schematically illustrates a flowchart of a method for generating a first attribute matrix corresponding to a carrier to be identified in an exemplary embodiment of the present disclosure;
FIG. 7 schematically illustrates a flowchart of another method for generating an object feature matrix corresponding to a carrier to be identified in an exemplary embodiment of the present disclosure;
FIG. 8 schematically illustrates a flowchart of a method of identifying eggs as target objects in an exemplary embodiment of the present disclosure;
FIG. 9 schematically illustrates a composition diagram of an identification device in an exemplary embodiment of the present disclosure;
fig. 10 schematically illustrates a structural schematic diagram of a computer system suitable for use in implementing the electronic device of the exemplary embodiments of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
In the present exemplary embodiment, an identification method is first provided. It may be applied to a terminal device having an image processing function, for example a mobile phone, a computer, or a digital camera, or to a processing module corresponding to an identification device. For example, before sorting eggs such as chicken eggs or duck eggs, a tray top view image corresponding to a tray in which a plurality of eggs are placed may be collected by the identification device, and the tray top view image is then processed for identification by the processing module corresponding to the identification device.
Referring to fig. 1, the above-described identification method may include the following steps S110 and S120:
in step S110, a carrier top view image corresponding to the carrier to be identified is obtained, and image identification is performed on the carrier top view image, so as to generate a distribution matrix of the target object in the carrier to be identified.
In the present exemplary embodiment, the carrier to be identified may be a carrier for holding the target objects, on which at least one placement position is provided; the carrier top view image corresponding to the carrier to be identified is an image obtained by photographing the surface of the carrier to be identified on which the placement positions are located. As shown in fig. 2, the top view direction of the carrier 210 to be identified is indicated by the arrow. Since the carrier is usually placed with the surface carrying the placement positions facing upward, the carrier top view image can generally be obtained by photographing the top of the carrier to be identified; in special cases, however, this surface may not face upward, and the position of the camera then needs to be adjusted so that the image of the surface on which the placement positions are located is captured and used as the carrier top view image.
In the present exemplary embodiment, image recognition is performed on a carrier top view image to generate a distribution matrix of target objects in a carrier to be recognized, as shown with reference to fig. 3, may include the following steps S310 to S330:
in step S310, the carrier top view image is subjected to image recognition to determine at least one placement position in the carrier to be recognized, and an image area corresponding to each placement position.
In this exemplary embodiment, after the carrier top view image of the carrier to be identified is obtained, the carrier top view image may be identified according to an image identification technology, so as to obtain at least one placement position in the carrier top view image and an image area corresponding to the placement position.
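Purely as an illustrative sketch and not part of the original disclosure, step S310 could be realized as follows, assuming the carrier fills the top view image and its placement positions form a known regular grid; the function name split_into_cells and the grid assumption are hypothetical, and a real system might instead locate each position by contour or template detection:

```python
import numpy as np

def split_into_cells(top_view: np.ndarray, rows: int, cols: int):
    """Split a carrier top view image into one image area per placement position.

    Assumes the carrier fills the image and its placement positions form a
    regular rows x cols grid (an assumption made only for this sketch).
    """
    h, w = top_view.shape[:2]
    cell_h, cell_w = h // rows, w // cols
    cells = {}
    for r in range(rows):
        for c in range(cols):
            cells[(r, c)] = top_view[r * cell_h:(r + 1) * cell_h,
                                     c * cell_w:(c + 1) * cell_w]
    return cells  # maps each placement position (r, c) to its image area
```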
In step S320, each image area is identified, whether or not a target object is placed in each placement position is determined, and a corresponding placement identifier is generated.
In the present exemplary embodiment, after the placement positions in the carrier to be identified and their corresponding image areas are determined from the carrier top view image, each image area may be identified to determine whether a target object is placed in the corresponding placement position. For example, for an egg tray, it may be determined whether an egg is placed in each placement position based on image information such as color and brightness.
The placement identifier is an identifier for indicating whether the placement position has a target object or not, and can be customized. For example, when the target object is placed at the placement position, a placement flag may be set to 1; when the target object is not placed at the placement position, a placement flag may be set to 0.
In step S330, a distribution matrix corresponding to the target object in the carrier to be identified is generated based on the placement positions and the corresponding placement identifiers.
In this exemplary embodiment, the generation of the distribution matrix corresponding to the target object in the carrier to be identified based on the placement position and the corresponding placement identifier, as shown with reference to fig. 4, may include the following steps S410 and S420:
in step S410, a zero matrix is generated according to the arrangement of at least one placement position in the carrier to be identified.
In this exemplary embodiment, the corresponding zero matrix may be generated according to the arrangement of the placement positions in the carrier to be identified. For example, when the carrier to be identified contains 4×5 placement positions, a 4×5 zero matrix may be generated correspondingly, so that each 0 element in the zero matrix corresponds one-to-one to a placement position in the carrier to be identified.
In step S420, the assignment is performed on each element of the zero matrix according to the corresponding relationship between the placement position and each element in the zero matrix and the placement identifier, so as to obtain a distribution matrix.
In this exemplary embodiment, each 0 element in the zero matrix may be reassigned according to the correspondence between the placement positions and the 0 elements of the zero matrix and according to the placement identifier of each placement position, so as to obtain the distribution matrix. For example, in the 4×5 zero matrix, if the placement identifier of the 1st placement position, corresponding to the 1st 0 element of the 1st row, indicates that a target object is placed there, the 1st 0 element may be assigned a value of 1; if the placement identifier of the 2nd placement position, corresponding to the 2nd 0 element of the 1st row, indicates that no target object is placed there, the 2nd 0 element may be assigned a value of 2.
It should be noted that the above assignment of elements is only used to indicate in the matrix whether a target object is placed at a placement position; as long as the assigned values are distinct from one another, the specific values may be customized, and this disclosure is not particularly limited in this respect. For example, the value corresponding to a placement identifier indicating that a target object is present may be set to a, and the value corresponding to a placement identifier indicating that no target object is present may be set to 2.
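A minimal sketch of steps S410 and S420, in which a simple brightness check stands in for the image recognition of step S320; the brightness threshold and the default values (1 for an occupied position, 0 for an empty one, as in the egg example later in the text) are assumptions of this sketch only, and any pair of distinct values could be used instead:

```python
import numpy as np

def build_distribution_matrix(cells, rows, cols, occupied_value=1, empty_value=0):
    """Generate a zero matrix (step S410) and assign each of its elements
    according to the placement identifier of its placement position (step S420)."""
    def is_occupied(cell_image) -> bool:
        # Stand-in for step S320: a real system would use colour, brightness or
        # other image information; this mean-brightness threshold is assumed.
        return float(cell_image.mean()) > 80.0

    matrix = np.zeros((rows, cols), dtype=int)
    for (r, c), cell in cells.items():
        matrix[r, c] = occupied_value if is_occupied(cell) else empty_value
    return matrix
```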
In step S120, feature recognition is performed on each target object in the carrier to be recognized based on the distribution matrix, so as to generate an object feature matrix corresponding to the carrier to be recognized.
In this exemplary embodiment, since the distribution matrix can represent whether there is a target object in each placement position on the carrier to be identified, further feature identification can be performed on the placement position where there is a target object according to the distribution matrix, so as to generate an object feature matrix corresponding to the carrier to be identified.
For example, the feature recognition may include a first attribute recognition, where the object feature matrix corresponds to the first attribute matrix, referring to fig. 5, and the feature recognition is performed on each target object in the carrier to be recognized based on the distribution matrix to generate the object feature matrix corresponding to the carrier to be recognized, which may include the following steps S510 to S530:
in step S510, object top view images corresponding to the respective target objects are acquired based on the distribution matrices, and first attribute data of the target objects in the object top view images are extracted.
Wherein the target object comprises an object placed in a placement position of the carrier to be identified; the object overhead view image may be an image obtained by photographing the placement position of the target object in the same photographing direction as the photographing carrier overhead view image; the first attribute data may be parameters for expressing attributes of the size, the dimension, etc. of the target object, for example, an area surrounded by the outline, an overall length of the outline, etc., to which the present disclosure is not limited in particular.
In the present exemplary embodiment, a top view image of each placement position area in which a target object is placed, that is, an object top view image, may be acquired according to the distribution matrix. After the object top view image is acquired, the first attribute data of the target object can be extracted according to the color difference or other parameters of the object top view image.
In step S520, a first attribute threshold corresponding to the carrier to be identified is calculated according to the first attribute data of each target object.
In this exemplary embodiment, the mean and standard deviation of the first attribute data may be calculated from the first attribute data of all target objects in the carrier to be identified, and the first attribute threshold corresponding to the carrier to be identified may then be determined as a range centered on the mean of the first attribute data, with the standard deviation defining how far the range floats above and below the mean.
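For instance, the per-carrier threshold could be computed as sketched below; taking exactly one standard deviation as the floating range is an assumption of this sketch, consistent with the egg example later in the text (mean 5, standard deviation 0.5, threshold 4.5-5.5):

```python
import numpy as np

def first_attribute_threshold(first_attribute_data):
    """Return (low, high) as mean +/- one standard deviation of the first
    attribute data of all target objects in the carrier to be identified."""
    values = np.asarray(first_attribute_data, dtype=float)
    mean, std = values.mean(), values.std()
    return mean - std, mean + std

# e.g. eight contour areas with mean 5 and standard deviation 0.5
# give the threshold range (4.5, 5.5)
```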
In addition, since the first attribute threshold is calculated from the first attribute data of all target objects in the carrier to be identified, whether a target object meets the attribute standard is judged relative to the attributes of the other target objects in the same carrier, rather than against a fixed, preset first attribute threshold. In this way, a target object whose attribute differs significantly from those of the other target objects in the carrier to be identified can be quickly identified, and the problem of low identification accuracy caused by setting an inaccurate first attribute threshold is avoided.
In step S530, the matrix values of the distribution matrix are replaced according to the first attribute data and the first attribute threshold value of each target object, so as to generate a first attribute matrix corresponding to the carrier to be identified.
In this exemplary embodiment, whether the first attribute data of a target object meets the standard may be determined through the first attribute threshold, so as to determine the first attribute identifier of the target object. The first attribute identifier may be an identifier such as a letter, a number, a character, or an image, which is not particularly limited in this disclosure.
For example, when the first attribute identifier includes a numerical value, the first attribute matrix corresponding to the carrier to be identified may be generated directly by replacing the matrix value in the distribution matrix with the first identifier, the second identifier, and the third identifier. Referring to fig. 6, replacing matrix values of the distribution matrix according to the first attribute data and the first attribute threshold value of each target object may include:
step S610, when the first attribute data of the target object belongs to the first attribute threshold value, replacing the matrix value corresponding to the target object in the distribution matrix with the first identifier;
step S620, when the first attribute data of the target object is greater than the first attribute threshold, replacing the matrix value corresponding to the target object in the distribution matrix with the second identifier;
In step S630, when the first attribute data of the target object is smaller than the first attribute threshold, the matrix value corresponding to the target object in the distribution matrix is replaced by the third identifier.
In the present exemplary embodiment, when the target object is identified based on the first attribute data and the first attribute threshold, different classes of first attribute data may be represented by different matrix values in the distribution matrix. Specifically, when the first attribute data of the target object falls within the first attribute threshold range, the element corresponding to the target object in the distribution matrix may be replaced with the first identifier; when the first attribute data of the target object is greater than the first attribute threshold range, the corresponding element in the distribution matrix is replaced with the second identifier; and when the first attribute data of the target object is smaller than the first attribute threshold range, the corresponding element in the distribution matrix is replaced with the third identifier.
It should be noted that, because the elements of a matrix can normally only be identifiers such as letters and numbers, and identifiers such as text strings and images cannot be filled in directly, when the first attribute identifier is an identifier such as an image that cannot be used as a matrix element, a filling identifier corresponding to the first attribute identifier may also be defined, and the elements of the distribution matrix may then be replaced according to the correspondence between the first attribute identifier and the filling identifier.
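The replacement of steps S610 to S630 could be sketched as follows; the numeric identifiers 2, 4 and 3 mirror the egg size example later in the text and are only one possible choice, and the mapping name first_attribute_data is hypothetical:

```python
import numpy as np

def first_attribute_matrix(distribution, first_attribute_data, low, high,
                           first_id=2, second_id=4, third_id=3):
    """Replace the matrix value of each occupied placement position with the
    identifier of its first attribute class.

    `first_attribute_data` maps (row, col) -> first attribute data, e.g. the
    area enclosed by the target object's outline.
    """
    result = np.array(distribution)
    for (r, c), value in first_attribute_data.items():
        if low <= value <= high:
            result[r, c] = first_id    # within the first attribute threshold (S610)
        elif value > high:
            result[r, c] = second_id   # greater than the threshold (S620)
        else:
            result[r, c] = third_id    # smaller than the threshold (S630)
    return result
```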
In addition, the feature recognition may further include a second attribute recognition, where the object feature matrix corresponds to the second attribute matrix, referring to fig. 7, and the feature recognition is performed on each target object in the carrier to be recognized based on the distribution matrix to generate an object feature matrix corresponding to the carrier to be recognized, which may include the following steps S710 to S730:
in step S710, a set of object images corresponding to each target object is acquired based on the distribution matrix.
In the present exemplary embodiment, some attribute parameters of the target object, such as whether the target object is complete, whether it is fouled, and its type, may require not only a top view image of the target object but also images taken from other directions. It is therefore possible to determine, based on the distribution matrix, which placement positions have a target object placed in them, and then photograph those target objects to obtain the corresponding object image sets.
In step S720, second attribute identification is performed on the object image set to obtain second attribute parameters corresponding to the target object.
In this exemplary embodiment, after the object image set is obtained, second attribute identification may be performed on each image in the object image set, and a second attribute parameter corresponding to the target object may be obtained. The second attribute parameters may include, for example, whether the target object is complete, whether it is fouled, and the like.
In step S730, the matrix values in the distribution matrix are replaced according to the magnitude relation between the preset second attribute threshold and the second attribute parameter, so as to generate a second attribute matrix corresponding to the carrier to be identified.
In this exemplary embodiment, when identifying the second attribute parameter of a target object, the identification result cannot be obtained by comparing the second attribute parameter with those of the other target objects in the same carrier to be identified; therefore, a preset second attribute threshold needs to be defined in advance, and the second attribute parameter of the target object is evaluated against this preset second attribute threshold.
Specifically, the matrix values in the distribution matrix may be replaced according to the magnitude relation between the preset second attribute threshold and the second attribute parameter, so as to obtain a second attribute matrix capable of representing the second attribute parameters of all the target objects in the carrier to be identified. When the matrix value is replaced, the matrix value may be replaced by adopting a mode when the first attribute matrix is generated, or may be replaced by adopting other modes, so that each element in the second attribute matrix can represent the second attribute parameter of each target object in the carrier to be identified, and the specific replacing mode is not particularly limited in the disclosure.
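A corresponding sketch for the second attribute matrix, assuming the second attribute parameter can be compared numerically with the preset threshold; the identifiers 5 and 6 follow the egg integrity example later in the text, and the parameter names are illustrative:

```python
import numpy as np

def second_attribute_matrix(distribution, second_attribute_params, preset_threshold,
                            meets_id=5, fails_id=6):
    """Replace the matrix value of each occupied placement position according to
    whether its second attribute parameter meets the preset second attribute
    threshold (for example, an integrity score)."""
    result = np.array(distribution)
    for (r, c), param in second_attribute_params.items():
        result[r, c] = meets_id if param >= preset_threshold else fails_id
    return result
```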
It should be noted that, since the preliminary recognition may be performed according to several different attributes, there may be a plurality of object feature matrices, that is, each attribute corresponds to one object feature matrix. For example, if both the size and the integrity of the eggs need to be identified, two object feature matrices may be generated.
In addition, after the object feature matrix corresponding to the carrier to be identified is obtained, the method may further include: sorting the target objects in the carrier to be identified according to the object feature matrix.
In this exemplary embodiment, after the object feature matrix corresponding to the carrier to be identified is obtained, all target objects on the carrier to be identified may be sorted in a single pass according to the object feature matrix. Because the object feature matrix represents the feature attributes of all target objects on the whole carrier to be identified, all target objects can be classified according to the object feature matrix and then sorted at once by the sorting equipment.
It should be noted that, when sorting a carrier to be identified, the sorting devices on the sorting head of the sorting equipment generally correspond one-to-one to the placement positions on the carrier. However, carriers may arrive in different orientations, which can cause the sorting devices to fail to correspond one-to-one to the placement positions of the carrier being sorted. For example, the placement positions of the carrier to be identified may be arranged as 4×5 in its current orientation while the sorting devices on the sorting equipment are arranged as 5×4. In this case, the orientation of the carrier to be identified should be determined from the distribution matrix or the object feature matrix and communicated to the sorting equipment, and the sorting equipment can then make the sorting devices correspond one-to-one to the placement positions in the carrier, for example by rotating the sorting device.
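As an illustrative sketch of the alignment idea only (the description above contemplates rotating the sorting device itself; rotating the feature matrix instead is an alternative shown here purely for illustration):

```python
import numpy as np

def align_to_sorter(object_feature_matrix: np.ndarray, sorter_shape: tuple):
    """Rotate the object feature matrix so that its layout matches the sorting
    devices on the sorting head, e.g. a 4x5 carrier facing a 5x4 head."""
    if object_feature_matrix.shape == sorter_shape:
        return object_feature_matrix
    rotated = np.rot90(object_feature_matrix)
    if rotated.shape == sorter_shape:
        return rotated
    raise ValueError("carrier layout cannot be aligned with the sorting head")
```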
The implementation details of the technical solution of the embodiments of the present disclosure are described below with reference to fig. 8, taking the identification of eggs as an example:
step S810, the egg tray to be identified enters the identification area, and an egg tray overlook image of the egg tray to be identified is shot.
Step S820: based on image recognition of the overhead view image of the egg tray, generate a distribution matrix that indicates, with the same layout as the placement positions in the egg tray, whether an egg is present at each placement position. For example, if the egg tray is 1×9 and eggs are placed in the 1st to 7th and the 9th placement positions, with "1" indicating that an egg is placed and "0" indicating that no egg is placed, the corresponding distribution matrix is [1,1,1,1,1,1,1,0,1].
In step S830, the placement direction of the egg tray may be determined to be transverse by the distribution matrix of the egg tray.
Step S840, carrying out attribute identification on eggs in the egg trays according to the distribution matrix, and determining an egg feature matrix corresponding to the egg trays.
Step S850, the sorting equipment rotates the sorting head according to the placement direction of the egg tray, so that the sorting device on the sorting head can be in one-to-one correspondence with the placement positions in the egg tray, and the eggs in the current egg tray are sorted according to the egg feature matrix.
Step S840 is described in detail below by way of two embodiments:
example 1:
In the above example, assume the attribute being recognized is a size feature. In this case, the egg images corresponding to the 1st to 7th and the 9th placement positions may be recognized from the egg tray top view, the area enclosed by each egg outline is extracted, and the mean area and the area standard deviation of the 8 eggs are calculated. For example, if the mean area and the area standard deviation of the 8 eggs are 5 and 0.5 respectively, the area threshold is determined to be 4.5-5.5 from the mean area and the area standard deviation.
The relation between the area enclosed by each of the 8 egg outlines and the area threshold is then determined, and the matrix values in the distribution matrix [1,1,1,1,1,1,1,0,1] are replaced accordingly. For example, suppose the 1st and 7th eggs have contour areas of 4.3 and 5.7 respectively, and the contour areas of the remaining eggs are within the area threshold. If "2" indicates that the contour area is within the area threshold, "3" indicates that it is smaller than the area threshold, and "4" indicates that it is larger than the area threshold, the egg feature matrix corresponding to the size feature is [3,2,2,2,2,2,4,0,2].
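The numbers in example 1 can be reproduced with a short sketch; the contour areas of the eggs other than the 1st and 7th are assumed values lying inside the threshold range:

```python
distribution = [1, 1, 1, 1, 1, 1, 1, 0, 1]                      # from step S820
areas        = [4.3, 5.0, 4.8, 5.2, 5.1, 4.9, 5.7, None, 5.0]   # None = empty position
low, high    = 4.5, 5.5                                          # mean 5 +/- std 0.5

feature = []
for placed, area in zip(distribution, areas):
    if placed == 0:
        feature.append(0)      # keep 0 for the empty placement position
    elif low <= area <= high:
        feature.append(2)      # within the area threshold
    elif area > high:
        feature.append(4)      # larger than the area threshold
    else:
        feature.append(3)      # smaller than the area threshold

print(feature)                 # [3, 2, 2, 2, 2, 2, 4, 0, 2]
```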
Example 2:
In the above example, assume the attribute being recognized is an integrity feature. In this case, the egg image sets corresponding to the 1st to 7th and the 9th placement positions on the egg tray may be obtained, the egg image sets are recognized, parameters such as the egg contour lines and surface colors are obtained, and these parameters are then compared with a preset integrity threshold to determine whether each egg is complete. The preset integrity threshold may be an empirical value obtained by identifying parameters such as the contour lines and surface colors of a number of complete eggs.
The matrix values in the distribution matrix [1,1,1,1,1,1,1,0,1] are then replaced according to the comparison result. For example, if the parameters such as the contour lines and surface colors of the 3rd and 9th eggs do not meet the preset integrity threshold, it may be determined that the 3rd and 9th eggs are incomplete. Assuming that "5" indicates a complete egg and "6" indicates an incomplete egg, the egg feature matrix corresponding to the integrity feature is [5,5,6,5,5,5,5,0,6].
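Example 2 can likewise be reproduced in a short sketch; the per-egg completeness judgements are assumed here, with the 3rd and 9th eggs failing the preset integrity threshold as stated above:

```python
distribution = [1, 1, 1, 1, 1, 1, 1, 0, 1]   # from step S820
is_complete  = [True, True, False, True, True, True, True, None, False]  # assumed

feature = [0 if placed == 0 else (5 if complete else 6)
           for placed, complete in zip(distribution, is_complete)]
print(feature)                 # [5, 5, 6, 5, 5, 5, 5, 0, 6]
```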
It is noted that the above-described figures are merely schematic illustrations of processes involved in a method according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
The following describes embodiments of the apparatus of the present disclosure that may be used to perform the identification methods described above of the present disclosure. Referring to fig. 9, the identifying device 900 includes: an image recognition module 910 and a feature recognition module 920.
The image recognition module 910 may be configured to obtain a carrier top view image corresponding to a carrier to be recognized, and perform image recognition on the carrier top view image to generate a distribution matrix of the target object in the carrier to be recognized.
The feature recognition module 920 may be configured to perform feature recognition on each target object in the carrier to be recognized based on the distribution matrix, so as to generate an object feature matrix corresponding to the carrier to be recognized.
In an exemplary embodiment of the present disclosure, based on the foregoing solution, the image recognition module 910 may be configured to perform image recognition on a carrier top view image to determine at least one placement location in a carrier to be recognized, and an image area corresponding to each placement location; identifying each image area, determining whether a target object is placed in each placement position, and generating a corresponding placement identifier; and generating a distribution matrix corresponding to the target object in the carrier to be identified based on the placement position and the corresponding placement identifier.
In an exemplary embodiment of the present disclosure, based on the foregoing solution, the image recognition module 910 may be configured to generate a zero matrix according to an arrangement manner of at least one placement position in the carrier to be recognized; and assigning values to each element of the zero matrix according to the corresponding relation between the placement position and each element in the zero matrix and the placement mark to obtain a distribution matrix.
In an exemplary embodiment of the present disclosure, based on the foregoing solution, the feature recognition module 920 may be configured to obtain object top view images corresponding to respective target objects based on the distribution matrix, and extract first attribute data of the target objects in the object top view images; calculating a first attribute threshold corresponding to the carrier to be identified according to the first attribute data of each target object; and replacing matrix values of the distribution matrix according to the first attribute data and the first attribute threshold value of each target object to generate a first attribute matrix corresponding to the carrier to be identified.
In an exemplary embodiment of the present disclosure, based on the foregoing solution, the feature recognition module 920 may be configured to replace a matrix value corresponding to the target object in the distribution matrix with the first identifier when the first attribute data of the target object belongs to the first attribute threshold value; when the first attribute data of the target object is larger than a first attribute threshold value, replacing a matrix value corresponding to the target object in the distribution matrix with a second identifier; and when the first attribute data of the target object is smaller than the first attribute threshold value, replacing matrix values corresponding to the target object in the distribution matrix with third identifications.
In an exemplary embodiment of the present disclosure, based on the foregoing solution, the feature recognition module 920 may be configured to obtain, based on the distribution matrix, an object image set corresponding to each target object; performing second attribute identification on the object image set to obtain second attribute parameters corresponding to the target object; and replacing matrix values in the distribution matrix according to the size relation between the preset second attribute threshold and the second attribute parameter to generate a second attribute matrix corresponding to the carrier to be identified.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the identifying apparatus 900 further includes: the object sorting module 930 is configured to sort the target objects in the carrier to be identified according to the object feature matrix.
Since each functional module of the identification device according to the exemplary embodiment of the present disclosure corresponds to a step of the foregoing exemplary embodiment of the identification method, for details not disclosed in the embodiment of the device according to the present disclosure, please refer to the foregoing embodiment of the identification method according to the present disclosure.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
In addition, in an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above identification method is also provided.
Those skilled in the art will appreciate that the various aspects of the present disclosure may be implemented as a system, method, or program product. Accordingly, the various aspects of the disclosure may be embodied in the following forms: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit," "module," or "system."
An electronic device 1000 according to such an embodiment of the present disclosure is described below with reference to fig. 10. The electronic device 1000 shown in fig. 10 is merely an example and should not be construed as limiting the functionality and scope of use of the disclosed embodiments.
As shown in fig. 10, the electronic device 1000 is embodied in the form of a general purpose computing device. Components of electronic device 1000 may include, but are not limited to: the at least one processing unit 1010, the at least one memory unit 1020, a bus 1030 connecting the various system components (including the memory unit 1020 and the processing unit 1010), and a display unit 1040.
Wherein the storage unit stores program code that is executable by the processing unit 1010 such that the processing unit 1010 performs steps according to various exemplary embodiments of the present disclosure described in the above section of the present specification. For example, the processing unit 1010 may perform step S110 as shown in fig. 1: acquiring a carrier overlook image corresponding to a carrier to be identified, and carrying out image identification on the carrier overlook image to generate a distribution matrix of a target object in the carrier to be identified; s120: and carrying out feature recognition on each target object in the carrier to be recognized based on the distribution matrix so as to generate an object feature matrix corresponding to the carrier to be recognized.
As another example, the electronic device may implement the various steps shown in fig. 3-8.
The memory unit 1020 may include readable media in the form of volatile memory units such as Random Access Memory (RAM) 1021 and/or cache memory unit 1022, and may further include Read Only Memory (ROM) 1023.
Storage unit 1020 may also include a program/utility 1024 having a set (at least one) of program modules 1025, such program modules 1025 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
Bus 1030 may represent one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 1000 can also communicate with one or more external devices 1070 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 1000, and/or with any device (e.g., router, modem, etc.) that enables the electronic device 1000 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 1050. Also, electronic device 1000 can communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet, through network adapter 1060. As shown, the network adapter 1060 communicates with other modules of the electronic device 1000 over the bus 1030. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with the electronic device 1000, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a U-disk, a mobile hard disk, etc.) or on a network, and includes several instructions to cause a computing device (may be a personal computer, a server, a terminal device, or a network device, etc.) to perform the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, a computer-readable storage medium having stored thereon a program product capable of implementing the method described above in the present specification is also provided. In some possible embodiments, the various aspects of the present disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the disclosure as described in the "exemplary methods" section of this specification, when the program product is run on the terminal device.
Further, in an exemplary embodiment of the present disclosure, a program product for implementing the above method is provided, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, random Access Memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
Furthermore, the above-described figures are only schematic illustrations of processes included in the method according to the exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any adaptations, uses, or adaptations of the disclosure following the general principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (9)

1. A method of identification, comprising:
acquiring a carrier overlook image corresponding to a carrier to be identified, and performing image recognition on the carrier overlook image to generate a distribution matrix of a target object in the carrier to be identified;
performing feature recognition on each target object in the carrier to be identified based on the distribution matrix to generate an object feature matrix corresponding to the carrier to be identified, wherein the feature recognition comprises second attribute recognition, and the object feature matrix comprises a second attribute matrix;
wherein the performing feature recognition on each target object in the carrier to be identified based on the distribution matrix to generate the object feature matrix corresponding to the carrier to be identified comprises:
acquiring an object image set corresponding to each target object based on the distribution matrix;
performing second attribute recognition on the object image set to obtain a second attribute parameter corresponding to the target object; and
replacing matrix values in the distribution matrix according to the magnitude relationship between a preset second attribute threshold and the second attribute parameter, to generate the second attribute matrix corresponding to the carrier to be identified.
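
For illustration only, the second-attribute replacement step recited in claim 1 could be sketched in Python as below. The recognizer callable, the integer labels and the single scalar threshold are assumptions made for the example, not features recited in the claim.

import numpy as np

def build_second_attribute_matrix(distribution_matrix, object_images,
                                  recognize_second_attribute, second_attribute_threshold):
    # Copy the distribution matrix so the occupied/empty information is preserved.
    second_attribute_matrix = np.array(distribution_matrix, dtype=int).copy()
    rows, cols = np.nonzero(distribution_matrix)
    for r, c in zip(rows, cols):
        # Hypothetical recognizer: returns a scalar second-attribute parameter
        # for the image set of the object at this placement position.
        parameter = recognize_second_attribute(object_images[(r, c)])
        # Replace the matrix value according to the magnitude relation
        # between the preset threshold and the recognized parameter.
        second_attribute_matrix[r, c] = 2 if parameter >= second_attribute_threshold else 1
    return second_attribute_matrix
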
2. The method of claim 1, wherein the performing image recognition on the carrier overlook image to generate the distribution matrix of the target object in the carrier to be identified comprises:
performing image recognition on the carrier overlook image to determine at least one placement position in the carrier to be identified and an image area corresponding to each placement position;
identifying each image area, determining whether the target object is placed in each placement position, and generating a corresponding placement identifier; and
generating the distribution matrix corresponding to the target object in the carrier to be identified based on the placement positions and the corresponding placement identifiers.
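
A minimal sketch of the per-position occupancy check described in claim 2, assuming the carrier is a regular grid with known row and column counts, the overlook image is a NumPy array, and is_object_present is some hypothetical occupied/empty classifier:

def detect_placement_identifiers(top_view_image, grid_rows, grid_cols, is_object_present):
    # Split the carrier overlook image into one region per placement position
    # and classify each region as occupied (1) or empty (0).
    height, width = top_view_image.shape[:2]
    cell_h, cell_w = height // grid_rows, width // grid_cols
    identifiers = {}
    for r in range(grid_rows):
        for c in range(grid_cols):
            region = top_view_image[r * cell_h:(r + 1) * cell_h,
                                    c * cell_w:(c + 1) * cell_w]
            identifiers[(r, c)] = 1 if is_object_present(region) else 0
    return identifiers
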
3. The method according to claim 2, wherein the generating the distribution matrix corresponding to the target object in the carrier to be identified based on the placement positions and the corresponding placement identifiers comprises:
generating a zero matrix according to the arrangement of the at least one placement position in the carrier to be identified; and
assigning values to the elements of the zero matrix according to the correspondence between the placement positions and the elements of the zero matrix and according to the placement identifiers, to obtain the distribution matrix.
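
The zero-matrix construction and assignment of claim 3 could then be sketched as follows, again assuming a rectangular arrangement of placement positions and the identifier dictionary produced above:

import numpy as np

def build_distribution_matrix(grid_rows, grid_cols, identifiers):
    # Zero matrix whose shape mirrors the arrangement of placement positions.
    distribution_matrix = np.zeros((grid_rows, grid_cols), dtype=int)
    # Assign each element the placement identifier of its corresponding position.
    for (r, c), flag in identifiers.items():
        distribution_matrix[r, c] = flag
    return distribution_matrix
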
4. The method of claim 1, wherein the feature recognition comprises first attribute recognition and the object feature matrix comprises a first attribute matrix;
the performing feature recognition on each target object in the carrier to be identified based on the distribution matrix to generate the object feature matrix corresponding to the carrier to be identified comprises:
respectively acquiring object overlook images corresponding to the target objects based on the distribution matrix, and extracting first attribute data of the target object in each object overlook image;
calculating a first attribute threshold corresponding to the carrier to be identified according to the first attribute data of each target object; and
replacing matrix values of the distribution matrix according to the first attribute data of each target object and the first attribute threshold, to generate the first attribute matrix corresponding to the carrier to be identified.
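
One possible reading of the threshold calculation in claim 4, assuming the first attribute is a scalar per object (for example, its projected area in the overlook image) and that the carrier-specific threshold is an interval around the mean value; both choices are assumptions for illustration:

import numpy as np

def compute_first_attribute_threshold(first_attribute_data, tolerance=0.1):
    # first_attribute_data: dict mapping (row, col) -> first-attribute value of the object there.
    values = np.asarray(list(first_attribute_data.values()), dtype=float)
    mean = values.mean()
    # Threshold expressed as an interval around the mean; the +/-10% rule is an assumption.
    return mean * (1.0 - tolerance), mean * (1.0 + tolerance)
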
5. The method according to claim 4, wherein the replacing matrix values of the distribution matrix according to the first attribute data of each target object and the first attribute threshold to generate the first attribute matrix corresponding to the carrier to be identified comprises:
when the first attribute data of the target object belongs to the first attribute threshold, replacing the matrix value corresponding to the target object in the distribution matrix with a first identifier;
when the first attribute data of the target object is larger than the first attribute threshold, replacing the matrix value corresponding to the target object in the distribution matrix with a second identifier; and
when the first attribute data of the target object is smaller than the first attribute threshold, replacing the matrix value corresponding to the target object in the distribution matrix with a third identifier.
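
Under the same assumptions, the three-way replacement of claim 5 might be sketched as below, where the distribution matrix is the NumPy array built earlier and the first, second and third identifiers are arbitrary placeholder integers:

def build_first_attribute_matrix(distribution_matrix, first_attribute_data, threshold_interval,
                                 first_id=1, second_id=2, third_id=3):
    lower, upper = threshold_interval
    first_attribute_matrix = distribution_matrix.copy()
    for (r, c), value in first_attribute_data.items():
        if value > upper:          # larger than the threshold
            first_attribute_matrix[r, c] = second_id
        elif value < lower:        # smaller than the threshold
            first_attribute_matrix[r, c] = third_id
        else:                      # belongs to (falls within) the threshold
            first_attribute_matrix[r, c] = first_id
    return first_attribute_matrix
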
6. The method according to claim 1, further comprising:
sorting the target objects in the carrier to be identified according to the object feature matrix.
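
Claim 6 leaves the sorting rule open; one simple interpretation is to group placement positions by their feature label so that objects sharing a label can be sorted together, as in this sketch:

import numpy as np

def group_positions_by_label(object_feature_matrix):
    # Collect the placement positions of every non-empty feature label so that a
    # downstream sorting step can handle objects with the same label together.
    groups = {}
    for (r, c), label in np.ndenumerate(object_feature_matrix):
        if label != 0:
            groups.setdefault(int(label), []).append((r, c))
    return groups
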
7. An identification device, comprising:
an image recognition module configured to acquire a carrier overlook image corresponding to a carrier to be identified, and to perform image recognition on the carrier overlook image to generate a distribution matrix of a target object in the carrier to be identified; and
a feature recognition module configured to perform feature recognition on each target object in the carrier to be identified based on the distribution matrix to generate an object feature matrix corresponding to the carrier to be identified, wherein the feature recognition comprises second attribute recognition, and the object feature matrix comprises a second attribute matrix;
wherein the feature recognition module is configured to acquire an object image set corresponding to each target object based on the distribution matrix, perform second attribute recognition on the object image set to obtain a second attribute parameter corresponding to the target object, and replace matrix values in the distribution matrix according to the magnitude relationship between a preset second attribute threshold and the second attribute parameter, to generate the second attribute matrix corresponding to the carrier to be identified.
8. A computer-readable storage medium having a computer program stored thereon, characterized in that the program, when executed by a processor, implements the identification method according to any one of claims 1 to 6.
9. An electronic device, comprising:
a processor; and
a memory for storing one or more programs that, when executed by the processor, cause the processor to implement the identification method according to any one of claims 1 to 6.
CN202010352258.3A 2020-04-28 2020-04-28 Identification method, identification device, computer-readable storage medium, and electronic apparatus Active CN111582109B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010352258.3A CN111582109B (en) 2020-04-28 2020-04-28 Identification method, identification device, computer-readable storage medium, and electronic apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010352258.3A CN111582109B (en) 2020-04-28 2020-04-28 Identification method, identification device, computer-readable storage medium, and electronic apparatus

Publications (2)

Publication Number Publication Date
CN111582109A CN111582109A (en) 2020-08-25
CN111582109B true CN111582109B (en) 2023-09-05

Family

ID=72126201

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010352258.3A Active CN111582109B (en) 2020-04-28 2020-04-28 Identification method, identification device, computer-readable storage medium, and electronic apparatus

Country Status (1)

Country Link
CN (1) CN111582109B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103530598A (en) * 2013-03-08 2014-01-22 Tcl集团股份有限公司 Station logo identification method and system
CN103521464A (en) * 2013-10-25 2014-01-22 华中农业大学 Method and device for identification and separation of yolk-dispersed eggs based on machine vision
CN110288037A (en) * 2019-06-28 2019-09-27 北京字节跳动网络技术有限公司 Image processing method, device and electronic equipment
WO2020011001A1 (en) * 2018-07-11 2020-01-16 腾讯科技(深圳)有限公司 Image processing method and device, storage medium and computer device
CN110927167A (en) * 2019-10-31 2020-03-27 北京海益同展信息科技有限公司 Egg detection method and device, electronic equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4797308B2 (en) * 2001-09-27 2011-10-19 ブラザー工業株式会社 Image processing apparatus and image processing program

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103530598A (en) * 2013-03-08 2014-01-22 Tcl集团股份有限公司 Station logo identification method and system
CN103521464A (en) * 2013-10-25 2014-01-22 华中农业大学 Method and device for identification and separation of yolk-dispersed eggs based on machine vision
WO2020011001A1 (en) * 2018-07-11 2020-01-16 腾讯科技(深圳)有限公司 Image processing method and device, storage medium and computer device
CN110288037A (en) * 2019-06-28 2019-09-27 北京字节跳动网络技术有限公司 Image processing method, device and electronic equipment
CN110927167A (en) * 2019-10-31 2020-03-27 北京海益同展信息科技有限公司 Egg detection method and device, electronic equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Chen Qian. "Research on Feature Recognition Algorithms for Tunnel Water Leakage Based on Deep Learning." CNKI Database of Excellent Master's Theses (Engineering Science and Technology II), 2020, No. 3, C034-573. *

Also Published As

Publication number Publication date
CN111582109A (en) 2020-08-25

Similar Documents

Publication Publication Date Title
CN110705405B (en) Target labeling method and device
CN108830780B (en) Image processing method and device, electronic device and storage medium
CN116168351B (en) Inspection method and device for power equipment
CN109543683A (en) Image labeling modification method, device, equipment and medium
CN115311469A (en) Image labeling method, training method, image processing method and electronic equipment
CN108595178B (en) Hook-based data acquisition method, device and equipment
CN111582109B (en) Identification method, identification device, computer-readable storage medium, and electronic apparatus
CN111967449A (en) Text detection method, electronic device and computer readable medium
CN111985471A (en) License plate positioning method and device and storage medium
CN111598832A (en) Slot defect marking method and device and storage medium
CN111062374A (en) Identification method, device, system, equipment and readable medium of identity card information
CN113888635B (en) Visual positioning method and related device
CN113610801B (en) Defect classification method, device, equipment and storage medium based on minimum unit
CN110580185A (en) Data preprocessing method, device and storage medium
CN110310341B (en) Method, device, equipment and storage medium for generating default parameters in color algorithm
CN108447107B (en) Method and apparatus for generating video
CN115658525A (en) User interface checking method and device, storage medium and computer equipment
CN113140042B (en) Three-dimensional scanning splicing method and device, electronic device and computer equipment
CN115374517A (en) Testing method and device for wiring software, electronic equipment and storage medium
CN113378958A (en) Automatic labeling method, device, equipment, storage medium and computer program product
CN114170373A (en) Target object labeling method, processor, device and mixing station
CN112258541A (en) Video boundary detection method, system, device and storage medium
CN111683296A (en) Video segmentation method and device, electronic equipment and storage medium
CN111028313A (en) Table distribution image generation method and device
CN111061625A (en) Automatic testing method and device applied to out-of-order password keyboard

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 601, 6 / F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant after: Jingdong Technology Information Technology Co.,Ltd.

Address before: 601, 6 / F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant before: Jingdong Shuke Haiyi Information Technology Co.,Ltd.

Address after: 601, 6 / F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant after: Jingdong Shuke Haiyi Information Technology Co.,Ltd.

Address before: 601, 6 / F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant before: BEIJING HAIYI TONGZHAN INFORMATION TECHNOLOGY Co.,Ltd.

GR01 Patent grant
GR01 Patent grant