CN111144421B - Object color recognition method and device and throwing equipment - Google Patents

Object color recognition method and device and throwing equipment

Info

Publication number
CN111144421B
CN111144421B
Authority
CN
China
Prior art keywords
image
color
gray level
preset
preset position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911257419.4A
Other languages
Chinese (zh)
Other versions
CN111144421A (en)
Inventor
邝嘉隆 (Kuang Jialong)
熊友军 (Xiong Youjun)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ubtech Robotics Corp
Original Assignee
Ubtech Robotics Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ubtech Robotics Corp filed Critical Ubtech Robotics Corp
Priority to CN201911257419.4A priority Critical patent/CN111144421B/en
Publication of CN111144421A publication Critical patent/CN111144421A/en
Application granted granted Critical
Publication of CN111144421B publication Critical patent/CN111144421B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/56 Extraction of image or video features relating to colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

An object color recognition method includes: detecting a position of an object in the throwing device; when detecting that an object reaches a preset position, collecting a gray-scale image and a color image of the object; correcting the color image according to the gray-scale image to obtain a corrected image; and identifying the color of the object according to the corrected image. Because the distance between the object and the image acquisition device is a fixed value, the acquired image does not exhibit different color values as the distance changes, which improves the accuracy of color recognition. The acquired images include a gray-scale image and a color image; correcting the color image with the gray-scale image to obtain a corrected image further reduces the influence of ambient light and improves the accuracy of object color detection.

Description

Object color recognition method and device and throwing equipment
Technical Field
The application belongs to the field of educational products, and particularly relates to an object color recognition method, an object color recognition device and throwing equipment.
Background
Pitching scoring is a common event in STEM (Science, Technology, Engineering, and Mathematics) education and robot competitions. To enrich the content of scoring events, balls of different colors are usually recognized by color, so that different scores are recorded for different colors.
To identify the color of a thrown ball, a color sensor is usually added to the throwing device to perform photoelectric conversion on light from the object to be detected, and color recognition is performed on the converted electrical signal. Because of changes in light intensity, balls of the same color can exhibit different color values, so the throwing device's accuracy in recognizing ball color is low.
Disclosure of Invention
In view of this, the embodiments of the present application provide an object color recognition method, apparatus, and device, to solve the problem in prior-art throwing devices that objects of the same color may exhibit different color values due to changes in light intensity, so that the accuracy of object color recognition is low.
A first aspect of an embodiment of the present application provides an object color identification method, including:
detecting a position of an object in the throwing device;
when detecting that an object reaches a preset position, collecting a gray image and a color image of the object;
correcting the color image according to the gray level image to obtain a corrected image;
and identifying the color of the object according to the corrected image.
With reference to the first aspect, in a first possible implementation manner of the first aspect, the step of detecting a position of the object includes:
acquiring a distance signal of a preset azimuth detected by a distance sensor;
and determining whether the acquired distance signal corresponds to the object at the preset position according to the corresponding relation between the preset distance signal and the object at the preset position.
With reference to the first possible implementation manner of the first aspect, in a second possible implementation manner of the first aspect, the throwing device is provided with a gate valve, and the preset position is a steady-state position of an input object in the throwing device when the gate valve is closed.
With reference to the second possible implementation manner of the first aspect, in a third possible implementation manner of the first aspect, the gate valve is a rotating plate that is disposed obliquely, and after the capturing of the gray-scale image and the color image of the object, the method further includes:
and controlling the rotating plate to a preset angle, and controlling the rotating plate to restore to a preset gate closing angle after an object falls from the gate valve.
With reference to the first aspect, in a fourth possible implementation manner of the first aspect, the step of correcting the color image according to the gray scale image to obtain a corrected image includes:
comparing the background image of the acquired gray level image with a gray level standard image, and determining the gray level difference between the background image of the gray level image and the gray level standard image;
searching a correction coefficient corresponding to the determined gray level difference according to the preset corresponding relation between the gray level difference and the correction coefficient;
and correcting the color image according to the searched correction coefficient.
With reference to the first aspect, in a fifth possible implementation manner of the first aspect, the step of identifying the object color according to the corrected image includes:
converting the color values of the corrected image into HSV (hue, saturation, value) values;
and determining the color corresponding to the converted HSV value according to the corresponding relation between the preset hue range, saturation range and brightness range and the color.
With reference to the first aspect, in a sixth possible implementation manner of the first aspect, the step of acquiring a gray image and a color image of the object when the object is detected to reach the preset position includes:
when detecting that an object reaches a preset position, controlling the light-emitting device to generate light aimed at the object;
and collecting gray level images and color images of the object according to the light rays generated by the light-emitting device.
A second aspect of embodiments of the present application provides an object color recognition apparatus, including:
a position detection unit for detecting a position of an object in the throwing apparatus;
the image acquisition unit is used for acquiring gray images and color images of the object when the object is detected to reach a preset position;
the image correction unit is used for correcting the color image according to the gray level image to obtain a corrected image;
and the image recognition unit is used for recognizing the color of the object according to the corrected image.
A third aspect of the embodiments of the present application provides a throwing apparatus, the throwing apparatus includes a main body, an image sensor, and a controller, where an object guiding channel is provided on the main body, the throwing apparatus further includes a gray sensor and a position detector for detecting a position of an object, the gray sensor and the image sensor are disposed on the guiding channel and aligned to a preset position of the guiding channel, the controller is connected to the image sensor and the position detector, and the controller is configured to implement the steps of the object color recognition method according to any one of the first aspect.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the object color recognition method according to any one of the first aspects.
Compared with the prior art, the embodiments of the present application have the following beneficial effects: an image of the object is acquired when the object is detected at a preset position, so the distance between the object and the image acquisition device is a fixed value and the acquired image does not exhibit different color values as the distance changes, improving the accuracy of color recognition. The acquired images include a gray-scale image and a color image; correcting the color image with the gray-scale image to obtain a corrected image further reduces the influence of ambient light and improves the accuracy of object color detection.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required for the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic structural view of a throwing apparatus provided in an embodiment of the present application;
fig. 2 is a schematic implementation flow chart of an object color recognition method according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of an implementation of a method for correcting a color image according to an embodiment of the present application;
fig. 4 is a schematic diagram of an object color recognition device according to an embodiment of the present application;
fig. 5 is a system schematic diagram of a throwing apparatus provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system configurations, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to illustrate the technical solutions described in the present application, the following description is made by specific examples.
Fig. 1 is a schematic structural view of a throwing apparatus according to an embodiment of the present application. As shown in fig. 1, the throwing device comprises a main body 1, a controller, an image sensor 2, a gate valve 3, and a position detector 4. An object inlet 11, an object outlet 12, and an object guiding channel 13 are formed in the main body, with the object guiding channel 13 connecting the object inlet 11 and the object outlet 12. The gate valve 3 and the image sensor 2 are arranged in the object guiding channel 13, and the image collection direction of the image sensor 2 is aligned with the stable object position at the gate valve 3. The controller receives a position detection signal from the position detector 4 indicating that the object is at the stable position, controls the image sensor 2 to collect an object image, and opens the gate valve 3.
Wherein the object may be a pellet, or other shaped object.
The position detector 4 may be a push switch provided at a preset position, for example, at an object stable position of the object in the object guide passage, and when the object falls into the preset position from the object inlet 11 via the object guide passage 13, the on-off state of the push switch is changed due to the gravity of the object, so that it can be detected that the object reaches the preset position.
Alternatively, the position detector 4 may be a distance sensor, installed in alignment with the preset position to detect the distance between the sensor and the preset position. When an object rolls down to the preset position, this distance changes. The distance sensor sends the detected change to the controller, so that the controller can control the image sensor 2 to collect images, or open the gate valve so that the object whose image has been collected falls from the object outlet under gravity. The distance sensor may be an infrared ranging sensor or an ultrasonic ranging sensor.
In one embodiment, the image sensor 2 may include a gray scale image sensor and a color image sensor. The color image sensor may be an RGB image sensor or may also be a CMYK image sensor. The gray image sensor and the color image sensor are connected with the controller, and the controller can analyze and compare colors according to the acquired images so as to analyze and identify the colors of the object.
In one embodiment, the gate valve 3 may include a rotating plate and a steering gear, and the steering gear output shaft drives the rotating plate to rotate according to a preset angle, so as to switch between an open state and a closed state of the object guiding channel.
In one embodiment, when the object guide passage is in a closed state, the rotating plate is inclined at a predetermined angle. In this way, when an object enters the object guide passage through the object inlet, it falls onto the rotating plate under gravity and, because the plate is inclined at the preset angle, slides to one side of the plate, i.e., to the preset position on the rotating plate.
The pressing switch can be arranged at the preset position and is pressed by the weight of the object, so that the object is detected at the preset position. Alternatively, a distance sensor can detect the distance to the preset position: when the object is located there, the detected distance signal changes, and the change can be sent to the controller. The image sensor is aligned with the preset position, and because the object rests stably there, its distance to the image sensor does not change; this yields an image with relatively stable color, which facilitates more accurate subsequent recognition of the color in the image.
The rotating plate may be controlled to rotate to a predetermined angle so that the object guide passage is in an open state and an object placed in the passage rolls out through the object outlet. An open time interval may be preset so that the gate valve closes immediately after the object passes through. Alternatively, the gate valve may be closed based on the distance signal detected by the distance sensor, i.e., the rotating plate is returned to its original inclination angle.
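The open-close sequence described here can be sketched as follows. This is an illustrative sketch only: the angle values, the delay, and the `set_servo_angle` callback are assumptions, not details from the patent.

```python
import time

# Illustrative constants; the patent does not specify these values.
GATE_CLOSED_DEG = 20   # resting tilt that holds the object at the preset position
GATE_OPEN_DEG = 70     # tilt at which the object rolls out through the outlet
OPEN_INTERVAL_S = 0.2  # preset open interval; a distance signal could be used instead

def release_object(set_servo_angle) -> None:
    """Open the gate, wait for the object to pass, then restore the closing angle."""
    set_servo_angle(GATE_OPEN_DEG)    # open: object rolls toward the object outlet
    time.sleep(OPEN_INTERVAL_S)       # preset interval for the object to pass the gate
    set_servo_angle(GATE_CLOSED_DEG)  # restore the preset gate-closing angle
```

In a real device `set_servo_angle` would drive the steering gear that rotates the plate; a timer is the simplest closing trigger, while the distance-sensor variant would wait for the signal to confirm the object has left.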
The object guiding channel 13 may be a single cavity formed in the main body, or a circular channel laid out according to the positions of the object inlet and the object outlet.
In one implementation of the embodiments of the present application, an object guiding frame may be disposed at the object inlet to guide objects thrown within a predetermined spatial range into the object inlet. In one embodiment, the object guiding frame may be a basket.
In one implementation of the embodiment of the present application, the light emitting device is further installed in the object guiding channel, and the light emitting device may be an LED lamp or the like. The light emitting direction of the light emitting device is aligned to the preset position, the light emitting device can be triggered to emit light when an object is detected to reach the preset position, and the light emitting device is turned off when the object leaves the preset position.
In one implementation of the embodiments of the present application, the device further comprises a display device 6. The display device may be a digital display tube or a display screen, for example a liquid crystal display screen or an LED display screen. The display device is connected with the controller and may be used to display the throwing score calculated from the color of the thrown object. Displaying the score of the current throw in real time improves the user's enthusiasm for operating the throwing equipment and its interactivity.
Fig. 2 is a schematic implementation flow chart of an object color identification method according to an embodiment of the present application, where the object color identification method may be based on the throwing device shown in fig. 1, and the object color identification method includes:
in step S201, detecting the position of an object in the throwing apparatus;
specifically, when a user inputs into the object guide passage via the object inlet, in order to detect a stable image of an object, the present application may determine a preset position of the object in the throwing apparatus by a position detector for detecting whether the object reaches the preset position. By means of the position detector it is possible to detect whether the object is in the preset position.
The preset position corresponds to the steady-state position of the object after it is put into the object guide channel. The steady state is understood to mean that, before the gate valve of the throwing device opens, the object's position at the preset position no longer changes under gravity. The preset position is therefore usually located at the gate valve. To determine the preset position more reliably, the gate valve may be an inclined rotating plate or rotating rod: when an object rolls to the gate valve, it slides under gravity to the lower side of the rotating rod or plate, so the preset position can be determined more reliably.
In addition, to reduce how much the object bounces at the preset position, a buffer layer, such as a sponge, can be arranged there; when the object falls onto the preset position, the buffer layer absorbs its downward impact.
The position detector for detecting the position of the object in the throwing device may be a push switch provided at the preset position, and when the object falls to the preset position, the push switch changes a switch state by gravity of the object, thereby detecting that the object has reached the preset position.
Alternatively, the position detector that detects the position of the object in the throwing apparatus may be a distance sensor installed in the object guide passage, aligned with the preset position. When the object guide passage is a cavity, the distance sensor may be installed on the lower surface of the upper cover. It detects the distance to the preset position, and when the object arrives there, the change in the detected distance signal indicates that the object has reached the preset position.
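The distance-signal check described above reduces to a simple threshold test. In this sketch every constant (sensor mounting distance, object size, tolerance) is an illustrative assumption, not a value from the patent.

```python
# Illustrative constants; real values depend on the device geometry.
DISTANCE_EMPTY_MM = 120.0  # sensor-to-preset-position distance with no object present
BALL_DIAMETER_MM = 40.0    # expected object size (assumption)
TOLERANCE_MM = 5.0         # allowed measurement noise

def object_at_preset_position(distance_mm: float) -> bool:
    """Return True when the measured distance matches an object resting at the
    preset position (the object shortens the sensor's reading by its size)."""
    expected = DISTANCE_EMPTY_MM - BALL_DIAMETER_MM
    return abs(distance_mm - expected) <= TOLERANCE_MM
```

The controller would poll this predicate (or react to an interrupt) and, once it returns True, trigger image acquisition as in step S202.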
In step S202, when it is detected that an object reaches a preset position, a gray image and a color image of the object are acquired;
when the position detector detects that the object reaches a preset position, an image acquisition instruction can be sent to the image sensor through the controller to acquire a color image and a gray image of the object. Or, the light emitting device can be further controlled to irradiate the preset position of the object when the object image is acquired, so that the ambient light of the object is determined by the light emitting device when the object image is acquired, and the color stability of the acquired object image is further improved.
In step S203, the color image is corrected according to the gray level image, so as to obtain a corrected image;
after the distance between the collected images is determined, the collected images may have a difference in light intensity, so that in order to further improve the color stability of the collected images, the color images may be corrected by using the gray scale images to obtain corrected images, which may specifically be as shown in fig. 3, including:
in step S301, comparing the acquired background image of the gray scale image with the gray scale standard image, and determining the gray scale difference between the background image of the gray scale image and the gray scale standard image;
specifically, the gradation standard image at the preset position may be preset. If the ambient light collected each time is the same, the gray scale image of the collected environment is substantially identical to the gray scale standard image. When the ambient light changes, a difference exists between a background image and a gray standard image in the acquired gray images, and the difference can be extracted, and the difference can be a difference value of gray values.
In step S302, according to the preset correspondence between the gray-scale difference and the correction coefficient, the correction coefficient corresponding to the determined gray-scale difference is searched;
the difference between the background gray level image and the gray standard image and the color difference between the color of the corresponding color image and the standard image can be obtained by statistics in advance under the environment light with different intensities. According to the counted correspondence, the difference between the color image and the standard image can be found according to the difference currently determined by the gray image.
In step S303, the color image is corrected according to the searched correction coefficient.
After the difference of the influence of the current environment on the color image is found, the color image can be adjusted according to the difference, so that the influence of the ambient light change on the color image is reduced, and the accurate color recognition of objects in the color image is facilitated.
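Steps S301 to S303 can be sketched together as follows. The standard gray level and the difference-to-coefficient lookup table are illustrative assumptions, since the text only says such a correspondence is measured in advance.

```python
import numpy as np

# Assumed reference: mean background gray level under the standard lighting.
GRAY_STANDARD = 128.0

# Hypothetical pre-measured mapping: background gray difference -> RGB gain.
CORRECTION_TABLE = {
    -40: 1.25,
    -20: 1.10,
      0: 1.00,
     20: 0.92,
     40: 0.85,
}

def correct_color_image(gray_bg: np.ndarray, color_img: np.ndarray) -> np.ndarray:
    # S301: gray difference between the captured background and the standard image
    diff = float(gray_bg.mean()) - GRAY_STANDARD
    # S302: look up the nearest pre-measured correction coefficient
    key = min(CORRECTION_TABLE, key=lambda k: abs(k - diff))
    coeff = CORRECTION_TABLE[key]
    # S303: scale the color image and clip back to the valid 8-bit range
    return np.clip(color_img.astype(np.float32) * coeff, 0, 255).astype(np.uint8)
```

A real device would likely compare full background regions rather than means, and interpolate between table entries, but the three-step structure is the same.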
In step S204, the object color is identified from the corrected image.
When identifying the object color from the corrected image, the color values of the acquired color image can be converted into HSV (hue, saturation, value) values according to a preset conversion method; for example, RGB color values can be converted into HSV values, or CMYK values can be converted into HSV values.
For example, when converting RGB color values to HSV values, let max be the largest of the R, G, B values and min the smallest; the conversion can then be performed with the following formulas:
V = max(R, G, B);
S = (max - min)/max (if max ≠ 0);
S = 0 (if max = 0);
if R = max and G ≥ B, H = (G - B)/(max - min) × 60;
if R = max and G < B, H = (G - B)/(max - min) × 60 + 360;
if G = max, H = 120 + (B - R)/(max - min) × 60;
if B = max, H = 240 + (R - G)/(max - min) × 60.
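A minimal implementation of these conversion formulas, with S and V normalised to [0, 1] rather than left on the raw 0-255 scale:

```python
def rgb_to_hsv(r: float, g: float, b: float):
    """Convert RGB (0-255) to HSV following the formulas above:
    H in degrees [0, 360), S and V normalised to [0, 1]."""
    mx, mn = max(r, g, b), min(r, g, b)
    v = mx / 255.0
    s = 0.0 if mx == 0 else (mx - mn) / mx
    if mx == mn:
        h = 0.0                         # achromatic: hue undefined, use 0
    elif mx == r:
        h = (g - b) / (mx - mn) * 60    # R = max branch
        if g < b:
            h += 360                    # wrap negative hues into [0, 360)
    elif mx == g:
        h = 120 + (b - r) / (mx - mn) * 60
    else:                               # B = max
        h = 240 + (r - g) / (mx - mn) * 60
    return h, s, v
```

For example, pure red (255, 0, 0) maps to H = 0, S = 1, V = 1, and pure blue (0, 0, 255) to H = 240.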
after the color value of the corrected image is converted into the HVS value, 9 different colors of red, orange, yellow, green, blue, violet, black and white can be distinguished according to the positions of the H value, the V value and the S value in a preset color range, so that the identification and classification of the different colors are realized.
For example, the color range to which the H value belongs may be looked up according to the H value of the color-converted corrected image;
the color range to which the S value belongs may be looked up according to the S value of the color-converted corrected image;
the color range to which the V value belongs may be looked up according to the V value of the color-converted corrected image.
for example, the color classification table may be as follows:
where Hmin is the minimum of the H range and Hmax is the maximum of the H range, otherwise similar to this definition. And after searching the range corresponding to the HVS value converted by the corrected image, the color of the object can be identified, so that the score of the object is calculated, and the accumulated score of the object can be displayed by a display device.
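Since the classification table itself is not reproduced in this text, a classification step of this kind can only be sketched with assumed values; every threshold and hue range below is an illustrative placeholder, not the patent's table.

```python
# Hypothetical (Hmin, Hmax) hue ranges in degrees; S and V are in [0, 1].
COLOR_RANGES = [
    ("red",    (0, 20)),
    ("orange", (20, 45)),
    ("yellow", (45, 65)),
    ("green",  (65, 170)),
    ("blue",   (170, 260)),
    ("violet", (260, 330)),
    ("red",    (330, 360)),   # hue wraps around back to red
]

def classify_hsv(h: float, s: float, v: float) -> str:
    """Map an HSV triple to a color name using range lookups."""
    if v < 0.2:               # very dark regardless of hue -> black
        return "black"
    if s < 0.15:              # bright but unsaturated -> white
        return "white"
    for name, (hmin, hmax) in COLOR_RANGES:
        if hmin <= h < hmax:
            return name
    return "unknown"
```

A per-color score table could then map the returned name to points, with the running total sent to the display device.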
After the object image is acquired, the gate valve can be controlled to be opened, so that the object slides to the object outlet of the throwing device under the action of gravity, and the gate valve can be closed after the object slides downwards, thereby facilitating the color recognition of the next object.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application.
Fig. 4 is a schematic diagram of an object color recognition device according to an embodiment of the present application, which is described in detail below:
the object color recognition device includes:
a position detection unit 401 for detecting a position of an object in the throwing apparatus;
an image acquisition unit 402, configured to acquire a gray image and a color image of an object when the object is detected to reach a preset position;
an image correction unit 403, configured to correct the color image according to the gray level image, so as to obtain a corrected image;
an image recognition unit 404, configured to recognize the object color according to the corrected image.
The object color recognition device corresponds to the object color recognition method described in fig. 2.
Fig. 5 is a schematic view of a throwing apparatus provided in an embodiment of the present application. As shown in fig. 5, the throwing apparatus 5 of this embodiment includes: a processor 50, a memory 51 and a computer program 52, such as an object color recognition program, stored in said memory 51 and executable on said processor 50. The processor 50, when executing the computer program 52, implements the steps of the respective object color identification method embodiments described above. Alternatively, the processor 50, when executing the computer program 52, performs the functions of the modules/units of the apparatus embodiments described above.
By way of example, the computer program 52 may be partitioned into one or more modules/units that are stored in the memory 51 and executed by the processor 50 to complete the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions describing the execution of the computer program 52 in the throwing device 5. For example, the computer program 52 may be partitioned into:
a position detection unit for detecting a position of an object in the throwing apparatus;
the image acquisition unit is used for acquiring gray images and color images of the object when the object is detected to reach a preset position;
the image correction unit is used for correcting the color image according to the gray level image to obtain a corrected image;
and the image recognition unit is used for recognizing the color of the object according to the corrected image.
The throwing device may include, but is not limited to, the processor 50 and the memory 51. Those skilled in the art will appreciate that fig. 5 is merely an example of the throwing device 5 and does not constitute a limitation on it; the device may include more or fewer components than shown, combine certain components, or use different components. For example, the throwing device may also include input and output devices, network access devices, buses, and the like.
The processor 50 may be a central processing unit (Central Processing Unit, CPU), other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), off-the-shelf programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 51 may be an internal storage unit of the throwing device 5, such as a hard disk or a memory of the throwing device 5. The memory 51 may also be an external storage device of the throwing device 5, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) or the like, which are provided on the throwing device 5. Further, the memory 51 may also include both an internal storage unit and an external storage device of the throwing apparatus 5. The memory 51 is used to store the computer program and other programs and data required by the throwing apparatus. The memory 51 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
The description of each embodiment has its own emphasis; for parts not described or illustrated in detail in a particular embodiment, reference may be made to the related descriptions of the other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other manners. For example, the apparatus/terminal device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on this understanding, the present application may implement all or part of the flow of the methods of the above embodiments by instructing the relevant hardware through a computer program, which may be stored in a computer readable storage medium; when executed by a processor, the computer program implements the steps of each method embodiment described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so on. It should be noted that the content contained in the computer readable medium may be increased or decreased as appropriate according to the requirements of legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, the computer readable medium does not include electrical carrier signals and telecommunication signals.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (9)

1. An object color recognition method, characterized in that the object color recognition method comprises:
detecting a position of an object in the throwing device; the throwing device comprises a main body, an image sensor and a gate valve, wherein an object guide channel is formed in the main body, the gate valve and the image sensor are arranged in the object guide channel, and the image acquisition direction of the image sensor is aligned with the position that an object at the gate valve occupies in a stable state;
when it is detected that an object reaches a preset position, controlling a light-emitting device to illuminate the preset position where the object is located while the object image is acquired, and acquiring a gray level image and a color image of the object; the preset position is the position corresponding to the stable state of the object after it is thrown into the object guide channel, the stable state being that, once the object reaches the preset position, its position is no longer changed by gravity before the gate valve of the throwing device is opened;
correcting the color image according to the gray level image to obtain a corrected image;
identifying the object color according to the corrected image;
the step of correcting the color image according to the gray image to obtain a corrected image comprises the following steps:
comparing the background image of the acquired gray level image with a gray level standard image, and determining the gray level difference between the background image of the gray level image and the gray level standard image;
searching for the correction coefficient corresponding to the determined gray level difference according to a preset correspondence between gray level differences and correction coefficients;
and correcting the color image according to the searched correction coefficient.
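The correction recited in claim 1 can be sketched as follows. The claims only state that a preset correspondence between gray level differences and correction coefficients exists; the table values, the use of a mean gray level for the comparison, and the nearest-entry lookup below are illustrative assumptions, not the patent's actual mapping:

```python
# Hypothetical correction table mapping gray-level difference -> coefficient.
# The claims only state that such a preset correspondence exists; these
# values are illustrative.
CORRECTION_TABLE = {-20: 1.10, -10: 1.05, 0: 1.00, 10: 0.95, 20: 0.90}

def mean_gray(image):
    """Mean gray level of an image given as a list of pixel rows."""
    pixels = [p for row in image for p in row]
    return sum(pixels) / len(pixels)

def correct_color_image(gray_background, gray_standard, color_image):
    """Correct a color image using the gray-level difference between the
    captured background and a standard (reference) gray image."""
    # Step 1: gray-level difference between background and standard image.
    diff = mean_gray(gray_background) - mean_gray(gray_standard)
    # Step 2: look up the coefficient for the nearest tabulated difference.
    key = min(CORRECTION_TABLE, key=lambda k: abs(k - diff))
    coeff = CORRECTION_TABLE[key]
    # Step 3: scale every channel of every pixel, clipped to the 8-bit range.
    return [[tuple(min(255, round(c * coeff)) for c in px) for px in row]
            for row in color_image]
```

A brighter-than-standard background (positive difference) thus yields a coefficient below 1, dimming the color image to compensate for the extra ambient light before color identification.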
2. The method of claim 1, wherein the step of detecting the position of the object in the throwing apparatus comprises:
acquiring a distance signal, detected by a distance sensor, for a preset direction;
and determining, according to a preset correspondence between distance signals and the object being at the preset position, whether the acquired distance signal indicates that the object is at the preset position.
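A minimal sketch of the distance-signal check in claim 2, assuming a hypothetical calibrated distance and tolerance band (the claims do not specify the form of the preset correspondence, so both constants are illustrative):

```python
# Hypothetical calibration: distance (mm) from the sensor to the preset
# position, with a tolerance band. Neither value comes from the patent.
PRESET_DISTANCE_MM = 50.0
TOLERANCE_MM = 5.0

def object_at_preset_position(distance_signal_mm):
    """Return True if the measured distance falls within the band that
    corresponds to an object resting at the preset position."""
    return abs(distance_signal_mm - PRESET_DISTANCE_MM) <= TOLERANCE_MM
```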
3. The object color recognition method according to claim 2, wherein the throwing apparatus is provided with a gate valve, and the preset position is a steady-state position of the thrown object in the throwing apparatus when the gate valve is closed.
4. The object color recognition method according to claim 3, wherein the gate valve is a rotating plate disposed obliquely, and after collecting the gray-scale image and the color image of the object, the method further comprises:
and controlling the rotating plate to rotate to a preset angle and, after the object falls past the gate valve, controlling the rotating plate to return to a preset gate-closing angle.
5. The object color recognition method according to claim 1, wherein the step of recognizing the object color from the corrected image includes:
converting the color values of the corrected image into HSV (hue, saturation, value) values;
and determining the color corresponding to the converted HSV values according to a preset correspondence between hue, saturation and brightness ranges and colors.
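The HSV-based identification of claim 5 can be sketched as below. The hue, saturation, and brightness ranges are illustrative assumptions, since the claims only state that preset ranges are associated with colors; Python's standard `colorsys` module supplies the RGB-to-HSV conversion (hue normalized to the 0-1 range):

```python
import colorsys

# Hypothetical hue ranges per color (hue in [0, 1]); red wraps around 1.0.
# These bounds are illustrative, not taken from the patent.
COLOR_RANGES = {
    "red":   [(0.0, 0.05), (0.95, 1.0)],
    "green": [(0.25, 0.45)],
    "blue":  [(0.55, 0.75)],
}
MIN_SATURATION = 0.3   # below this the pixel is too gray to name a hue
MIN_VALUE = 0.2        # below this the pixel is too dark to classify

def classify_color(r, g, b):
    """Convert an 8-bit RGB value to HSV and map it to a named color."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    if s < MIN_SATURATION or v < MIN_VALUE:
        return "unknown"
    for name, hue_ranges in COLOR_RANGES.items():
        if any(lo <= h <= hi for lo, hi in hue_ranges):
            return name
    return "unknown"
```

The saturation and brightness floors implement the claim's saturation-range and brightness-range conditions: a washed-out or dark pixel is rejected before its hue is consulted.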
6. The method according to claim 1, wherein the step of acquiring a gray image and a color image of the object when it is detected that the object reaches a preset position comprises:
when detecting that an object reaches a preset position, controlling the light-emitting device to generate light aimed at the object;
and collecting gray level images and color images of the object according to the light rays generated by the light-emitting device.
7. An object color recognition device, characterized in that the object color recognition device comprises:
a position detection unit for detecting a position of an object in the throwing device; the throwing device comprises a main body, an image sensor and a gate valve, wherein an object guide channel is formed in the main body, the gate valve and the image sensor are arranged in the object guide channel, and the image acquisition direction of the image sensor is aligned with the position that an object at the gate valve occupies in a stable state;
an image acquisition unit for, when it is detected that an object reaches a preset position, controlling the light-emitting device to illuminate the preset position where the object is located while the object image is acquired, and acquiring the gray level image and the color image of the object; the preset position is the position corresponding to the stable state of the object after it is thrown into the object guide channel, the stable state being that, once the object reaches the preset position, its position is no longer changed by gravity before the gate valve of the throwing device is opened;
the image correction unit is used for correcting the color image according to the gray level image to obtain a corrected image;
an image recognition unit for recognizing the object color according to the corrected image;
the step of correcting the color image according to the gray level image to obtain a corrected image comprises the following steps:
comparing the background image of the acquired gray level image with a gray level standard image, and determining the gray level difference between the background image of the gray level image and the gray level standard image;
searching for the correction coefficient corresponding to the determined gray level difference according to a preset correspondence between gray level differences and correction coefficients;
and correcting the color image according to the searched correction coefficient.
8. A throwing device, characterized in that the throwing device comprises a main body, an image sensor and a controller, wherein an object guiding channel is formed in the main body, the throwing device further comprises a gray sensor and a position detector for detecting the position of an object, the gray sensor and the image sensor are arranged in the guiding channel and aligned with the preset position of the guiding channel, the controller is connected with the image sensor and the position detector, and the controller is used for realizing the steps of the object color recognition method according to any one of claims 1 to 6.
9. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the object color recognition method according to any one of claims 1 to 6.
CN201911257419.4A 2019-12-10 2019-12-10 Object color recognition method and device and throwing equipment Active CN111144421B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911257419.4A CN111144421B (en) 2019-12-10 2019-12-10 Object color recognition method and device and throwing equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911257419.4A CN111144421B (en) 2019-12-10 2019-12-10 Object color recognition method and device and throwing equipment

Publications (2)

Publication Number Publication Date
CN111144421A CN111144421A (en) 2020-05-12
CN111144421B true CN111144421B (en) 2024-02-13

Family

ID=70517916

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911257419.4A Active CN111144421B (en) 2019-12-10 2019-12-10 Object color recognition method and device and throwing equipment

Country Status (1)

Country Link
CN (1) CN111144421B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000222673A (en) * 1999-01-29 2000-08-11 Mitsubishi Electric Corp Vehicle color discriminating device
JP2002292094A (en) * 2001-04-02 2002-10-08 Sanyo Electric Co Ltd Ball sending device and its use device
CN105719318A (en) * 2016-01-26 2016-06-29 上海葡萄纬度科技有限公司 Educational toy set and HSV based color identification method for Rubik's cube
CN106503719A (en) * 2016-09-27 2017-03-15 深圳增强现实技术有限公司 A kind of object color is extracted and detection method and device
CN107106901A (en) * 2015-01-15 2017-08-29 赵炳九 Exerciser for ball sports
CN108053422A (en) * 2017-11-07 2018-05-18 镇江市高等专科学校 Mobile target monitoring method
CN108553879A (en) * 2018-04-27 2018-09-21 北京云点联动科技发展有限公司 A kind of remote control based on NB-IoT internet of things grabs doll machine
JP2018189420A (en) * 2017-04-28 2018-11-29 株式会社イマジオム Color pattern discrimination probe, and color patten discrimination device
CN110505459A (en) * 2019-08-16 2019-11-26 域鑫科技(惠州)有限公司 Suitable for the image color correction method of endoscope, device and storage medium
CN211585159U (en) * 2019-12-10 2020-09-29 深圳市优必选科技股份有限公司 Throwing equipment


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
An obstacle recognition method based on color image segmentation; Hou Zhixu et al.; Journal of Chongqing University of Technology (Natural Science); Vol. 30, No. 03; pp. 94-98 *

Also Published As

Publication number Publication date
CN111144421A (en) 2020-05-12

Similar Documents

Publication Publication Date Title
Aquino et al. vitisBerry: An Android-smartphone application to early evaluate the number of grapevine berries by means of image analysis
US20240005726A1 (en) System for counting quantity of game tokens
CN101882034B (en) Device and method for discriminating color of touch pen of touch device
US20170193735A1 (en) Bet sensing apparatuses and methods
US9070192B1 (en) Implementing rich color transition curve tracking for applications
CN103808669B (en) A kind of apple small holes caused by worms fast non-destructive detection method based on high light spectrum image-forming technology
US8331628B2 (en) Vision assistance using mobile telephone
CN108375341B (en) Standing long jump distance measuring method based on image recognition
EP2856409B1 (en) Article authentication apparatus having a built-in light emitting device and camera
CN1123940A (en) Produce recognition system
MX2011003977A (en) Method and system for item identification.
Wang et al. License plate detection using gradient information and cascade detectors
EP3087736A1 (en) Measuring apparatus, system, and program
CN107490967A (en) Adapt to the picking robot Target self-determination identifying system and its method of illuminance mutation
CN112200230B (en) Training board identification method and device and robot
CN108268839A (en) A kind of live body verification method and its system
CN211585159U (en) Throwing equipment
CN108830908A (en) A kind of magic square color identification method based on artificial neural network
US8526717B2 (en) Rich color transition curve tracking method
CN201247528Y (en) Apparatus for obtaining and processing image
CN111144421B (en) Object color recognition method and device and throwing equipment
CN102024264B (en) Dimensional-histogram-statistic-based touch pen color recognition method
CN106910167A (en) Automatic identification system
WO2012020381A1 (en) Method and apparatus for recognizing an interesting object
CN106530774A (en) Led signal lamp

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant