Disclosure of Invention
The invention provides a 3D recognition device and method, and aims to solve the technical problem that an existing 3D recognition device requires a relatively large number of components to realize strong-light protection, which makes the device costly.
In a first aspect, the present invention provides a 3D recognition apparatus, comprising an infrared sensor, an RGB sensor, and a processor;
the processor is connected with the infrared sensor and the RGB sensor respectively; the infrared sensor is configured to capture a first image of a recognition object, the RGB sensor is configured to capture a color image of the recognition object, and the processor is configured to obtain a first distance from the recognition object to the 3D recognition apparatus according to the first image and the color image, and to determine, according to the first distance, whether to perform 3D recognition processing on the recognition object.
Optionally, the apparatus further comprises a fill light; the fill light is connected with the infrared sensor and is configured to project infrared light onto the recognition object so that the infrared sensor obtains the first image.
Optionally, the first image is a 2D image.
Optionally, the processor processes the first image and the color image using the principle of parallax to obtain the first distance.
Optionally, the apparatus further comprises a structured light projector; the structured light projector is connected with the processor and is configured to project structured light onto the recognition object so that the infrared sensor obtains a second image, and the second image is used for performing 3D recognition processing on the recognition object.
Optionally, the processor is configured to determine whether to activate the structured light projector based on the first distance.
Optionally, the processor is configured to perform a 3D recognition process on the recognition object based on the first image and the second image.
Optionally, the luminous power of the fill light is less than that of the structured light projector.
In a second aspect, the present invention provides a 3D recognition method applied to a 3D recognition apparatus, including:
acquiring a color image of a recognition object and a first image of the recognition object under infrared light projection;
determining a first distance from the recognition object to the 3D recognition apparatus according to the color image and the first image; and
if the first distance is within a preset range, performing 3D recognition processing on the recognition object.
Optionally, if the first distance is within the preset range, performing 3D recognition processing on the recognition object specifically includes:
if the first distance is within the preset range, acquiring a second image of the recognition object under structured light projection; and
performing 3D recognition processing on the recognition object according to the first image and the second image.
Optionally, determining a first distance from the recognition object to the 3D recognition apparatus according to the color image and the first image specifically includes: the first image and the color image are processed using the principle of parallax to obtain a first distance.
Optionally, the first image is a 2D image.
The invention provides a 3D recognition device and method. The 3D recognition device comprises an infrared sensor, an RGB sensor, and a processor. The infrared sensor captures a first image of the recognition object, the RGB sensor captures a color image of the recognition object, and the processor obtains a first distance from the recognition object to the 3D recognition device according to the first image and the color image and determines, according to the first distance, whether to perform 3D recognition processing on the recognition object. Compared with an existing 3D recognition device that obtains the distance from the recognition object with a proximity sensor, the 3D recognition device provided by the invention uses fewer components, and when it performs 3D recognition processing on the recognition object, injury to the recognition object caused by the strong light projected by the device when the distance is too short can be avoided.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims of the present application and in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Fig. 2 is a schematic structural diagram of a 3D recognition apparatus according to an exemplary embodiment of the present invention. As shown in fig. 2, the 3D recognition apparatus 100 provided in the present embodiment includes: an infrared sensor 101, an RGB sensor 102, and a processor 103.
In the 3D recognition apparatus 100, the processor 103 is connected to the infrared sensor 101 and the RGB sensor 102, respectively. The infrared sensor 101 is configured to capture a first image under infrared light projection, and the RGB sensor 102 is configured to capture a color image. The processor 103 is configured to obtain a first distance from the recognition object to the 3D recognition apparatus according to the first image and the color image, and to determine whether the first distance is within a preset range, where the preset range is determined according to the intensity of the structured light provided by the 3D recognition apparatus; for example, the preset range may be set to be greater than 40 cm. When the first distance from the recognition object to the 3D recognition apparatus is within the preset range, 3D recognition processing is performed on the recognition object to obtain a recognition result; if the first distance is not within the preset range, the 3D recognition processing of the recognition object is stopped. If the preset range is set to be greater than 40 cm, 3D recognition processing is performed on the recognition object when the first distance between the recognition object and the 3D recognition apparatus is greater than 40 cm, and is stopped when the first distance is less than or equal to 40 cm.
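For illustration only, the distance gate described above can be sketched as follows. The estimate_distance_cm helper is a hypothetical stand-in for the parallax calculation described later, and the 40 cm threshold simply reuses the example preset range given above.

```python
# Minimal sketch of the distance gate (assumed helper name, example threshold).
MIN_SAFE_DISTANCE_CM = 40.0  # example preset range: greater than 40 cm

def should_run_3d_recognition(ir_image, rgb_image, estimate_distance_cm) -> bool:
    """Return True only if the recognition object is far enough for structured light."""
    first_distance = estimate_distance_cm(ir_image, rgb_image)  # parallax-based estimate
    return first_distance > MIN_SAFE_DISTANCE_CM
```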
In the 3D recognition apparatus provided in this embodiment, the distance from the recognition object to the apparatus is obtained from the first image acquired by the infrared sensor and the color image acquired by the RGB sensor, so the apparatus does not need a proximity sensor to obtain that distance. This reduces the number of components and the cost of the 3D recognition apparatus, and when the apparatus performs 3D recognition, injury to the recognition object caused by the strong light it projects when the distance is too short can be avoided.
Fig. 3 is a schematic structural diagram of a 3D recognition apparatus according to another exemplary embodiment of the present invention. As shown in Fig. 3, the 3D recognition apparatus 200 provided in this embodiment includes: an infrared sensor 201, an RGB sensor 202, a processor 203, a memory 204, a fill light 205, a driving unit 206, and a structured light projector 207.
In the 3D recognition apparatus 200, the processor 203 is connected to the infrared sensor 201, the driving unit 206, the RGB sensor 202, and the memory 204, the fill light 205 is connected to the infrared sensor 201, and the structured light projector 207 is connected to the driving unit 206.
Fig. 4 is a layout diagram of the 3D recognition apparatus based on the embodiment shown in Fig. 3. As shown in Fig. 4, the fill light 205, the infrared sensor 201, the RGB sensor 202, the driving unit 206, and the structured light projector 207 are assembled in an integrated module. The processor 203 and the memory 204 may be located inside the module or on a separate motherboard. When the processor 203 and the memory 204 are disposed on a separate motherboard, the processor 203 is connected to the integrated module through a communication line. The fill light 205 and the RGB sensor 202 are located around the infrared sensor 201, and there is a certain distance between the infrared sensor 201 and the structured light projector 207. It should be noted that this embodiment does not limit the layout of the parts of the 3D recognition apparatus, which may be adjusted according to the specific application scenario.
In the 3D recognition apparatus 200, the infrared sensor 201 is used to capture an image under infrared projection or structured light projection, the driving unit 206 is used to drive the structured light projector 207 to generate structured light, and the fill light 205 is used to provide supplementary infrared light. The processor controls the infrared sensor 201, the structured light projector 207, and the driving unit 206 to work synchronously; it also controls the infrared sensor 201 to generate a driving signal that drives the fill light so that the infrared sensor obtains the first image, and it controls the RGB sensor. In addition, the processor processes the images captured by the infrared sensor and the RGB sensor.
When the 3D recognition apparatus 200 performs 3D recognition processing, the fill light 205 projects infrared light, the infrared sensor 201 captures a first image of the recognition object under the infrared light projection, where the first image is a 2D image, and the first image is transmitted to the processor 203. The RGB sensor 202 captures a color image of the recognition object and transmits the color image to the processor 203.
The processor 203 processes the first image and the color image using the parallax principle to obtain a first distance between the recognition object and the 3D recognition apparatus, and determines whether the first distance is within a preset range. If the first distance is within the preset range, the processor controls the driving unit, through a communication interface such as a Serial Peripheral Interface (SPI) or an Inter-Integrated Circuit (I2C) bus, to provide a driving signal for the structured light projector, where the driving signal includes a configuration signal and a synchronization signal. The configuration signal is used to configure the drive current, frequency, duty cycle, and so on for the structured light projector. The synchronization signal is used to synchronize the structured light projector with the exposure control of the infrared sensor. Under the control of the driving signal, the structured light projector projects structured light onto the scene where the face is located; meanwhile, the processor controls the infrared sensor to work, the infrared sensor obtains a second image under the structured light projection and transmits it to the processor, and the processor calculates a 3D image of the recognition object according to the first image and the second image.
If the first distance is not within the preset range, the processor controls the driving unit, through the communication interface, to stop providing the driving signal to the structured light projector; the structured light projector is not started, and the 3D recognition processing of the recognition object is stopped.
In this 3D recognition apparatus, the luminous power of the fill light is less than that of the structured light projector. The fill light, with its lower luminous power, is turned on so that the infrared sensor captures the first image, and the processor obtains the first distance from the first image and the color image captured by the RGB sensor. If the first distance is not within the preset range, the structured light projector is not started and the 3D recognition processing of the recognition object is stopped, so that the recognition object is protected from damage by the structured light projected by the structured light projector.
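A rough sketch of the driving signal described above is given below. It is purely illustrative: the byte layout, field names, and the write_fn transport (which might be SPI or I2C) are assumptions for illustration, not an actual driver-IC register map from the embodiment.

```python
# Illustrative only: configuration signal followed by a synchronization/enable signal.
from dataclasses import dataclass
from typing import Callable, Sequence

@dataclass
class ProjectorConfig:
    drive_current_ma: int  # drive current for the structured light projector
    frequency_hz: int      # pulse frequency
    duty_cycle_pct: int    # duty cycle, 0-100

def send_driving_signal(cfg: ProjectorConfig,
                        write_fn: Callable[[Sequence[int]], None],
                        enable: bool) -> None:
    """Send the configuration signal, then the synchronization/enable signal."""
    if not enable:
        write_fn([0x00])  # stop providing the driving signal; the projector stays off
        return
    write_fn([cfg.drive_current_ma & 0xFF,          # configuration signal
              (cfg.frequency_hz >> 8) & 0xFF,
              cfg.frequency_hz & 0xFF,
              cfg.duty_cycle_pct & 0xFF])
    write_fn([0x01])                                # synchronization/enable signal
```

In practice the synchronization part would be tied to the infrared sensor's exposure control, as described above, so that the second image is captured while the pattern is being projected.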
In addition, the processor also retrieves information stored in the memory, such as calibration, template, feature, and system information, and generates various instructions to control the infrared sensor, the RGB sensor, and the structured light projector.
The structured light projector described above can provide point structured light, line structured light, multi-line structured light, area structured light, and other forms of structured light. The fill light may be a floodlight that provides uniform infrared light.
Fig. 5 is a schematic diagram based on the principle of parallax in the embodiment shown in fig. 3. As shown in fig. 5, a point P is a subject, a picture is taken of a subject point P by the infrared sensor 201 and the RGB sensor 202, a left optical axis is an optical axis of the infrared sensor 201, a right optical axis is an optical axis of the RGB sensor 202, a left image is an image taken by the infrared sensor 201, a right image is an image taken by the RGB sensor 202, P1 is an imaging point of the subject point P on the left image, and P2 is an imaging point of the subject point P on the right image. o1 is the intersection of the left optical axis and the line connecting the imaging point P1 and the subject. o2 is the intersection of the right optical axis and the line connecting the imaging point P2 and the photographic subject.
The optical axis of the infrared sensor 201 is taken as the Z axis of the coordinate system, the origin of the coordinate system is o1, and the coordinates of the target point P are P(x_c, y_c, z_c). The distance between the optical axis of the infrared sensor 201 and the optical axis of the RGB sensor 202 (the baseline) is B.
The distance from the coordinate origin o1 to the left image plane (the focal length) is f, and the depth of the point P to be recognized, i.e. its coordinate along the Z axis, is z_c. Assuming that the image captured by the infrared sensor 201 and the image captured by the RGB sensor 202 lie in the same plane, the ordinate Y of the imaging point P1 on the left image is the same as the ordinate of the imaging point P2 on the right image.
It can be understood that Δo1P1m1 and Δo1Pm are similar triangles, so the abscissa X_left of the imaging point P1 on the left image is:
X_left = f * x_c / z_c
and the ordinate Y of the imaging point P1 on the left image is:
Y = f * y_c / z_c
where X_left and Y take the intersection o1 as the origin of coordinates, so the coordinates of the imaging point P1 on the left image are:
P1 = (f * x_c / z_c, f * y_c / z_c)
Similarly, Δo2P2m2 and Δo2Pm are similar triangles, so the abscissa X_right of the imaging point P2 on the right image is:
X_right = f * (x_c - B) / z_c
and the ordinate Y of the imaging point P2 on the right image is:
Y = f * y_c / z_c
where X_right and Y take the intersection o2 as the origin of coordinates, so the coordinates of the imaging point P2 on the right image are:
P2 = (f * (x_c - B) / z_c, f * y_c / z_c)
The parallax between the image captured by the infrared sensor 201 and the image captured by the RGB sensor 202 is:
disparity = X_left - X_right = f * B / z_c
Therefore, the coordinates of the shooting object P in the coordinate system of the infrared sensor 201 can be calculated as:
x_c = B * X_left / disparity,  y_c = B * Y / disparity,  z_c = B * f / disparity
and the distance from the recognition object to the 3D recognition apparatus is:
z_c = B * f / disparity
in the 3D identification device provided in this embodiment, a light supplement lamp with a small luminous power is turned on first, so that the infrared sensor captures a first image, the first image and a color image captured by the RGB sensor are calculated by using a parallax principle, a first distance from an identification object to the 3D identification device is obtained, the first distance can be obtained without a proximity sensor, and components of the 3D identification device are reduced.
Fig. 6 is a flowchart illustrating a recognition method according to an exemplary embodiment of the present invention. As shown in Fig. 6, the recognition method provided by this embodiment is applied to a 3D recognition apparatus, and includes the following steps:
S301, acquiring a color image of the recognition object and a first image under infrared light projection.
More specifically, acquiring the first image of the recognition object under infrared light projection includes: capturing a first image of the recognition object with an infrared sensor. When the recognition object is outdoors in good light, the first image under infrared light projection can be captured directly, without the fill light providing infrared light. When the recognition object is in an environment with poor light, the fill light is required to provide infrared light when the first image is captured with the infrared sensor. The first image is a 2D image captured by the infrared sensor.
Acquiring the color image of the recognition object specifically includes: capturing a color image of the recognition object using the RGB sensor.
S302, determining a first distance from the recognition object to the 3D recognition device according to the color image and the first image.
More specifically, when the same recognition object is photographed by two cameras, there is a parallax between the two photographed pictures, and the distance between the recognition object and the 3D recognition device can be obtained by using the parallax. In the present embodiment, the first image captured by the infrared sensor and the color image captured by the RGB sensor are processed using the principle of parallax to obtain the first distance.
S303, it is determined whether the first distance is within the preset range; if so, the process proceeds to S304, otherwise to S305.
More specifically, the preset range is determined according to the intensity of the structured light provided by the 3D recognition apparatus. For example, the preset range may be set to be greater than 40 cm.
S304, 3D recognition processing is performed on the recognition object, and the process proceeds to S307.
S305, the cycle count is updated.
The cycle count is updated once each time the first distance is determined not to be within the preset range.
S306, it is determined whether the cycle count is within a preset threshold; if so, the process returns to S301, otherwise it proceeds to S307.
More specifically, the preset threshold may be set according to user requirements, for example, 3 times. If the cycle count exceeds the preset threshold, recognition of the recognition object is stopped.
S307, the flow ends.
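As a minimal sketch, the flow S301 to S307 could be organized as below; the capture and recognition callables are hypothetical placeholders rather than the actual implementation, and the thresholds reuse the example values given above.

```python
# Sketch of the flow S301-S307 (helper callables are hypothetical placeholders).
MAX_CYCLES = 3           # example preset threshold for the cycle count
MIN_DISTANCE_CM = 40.0   # example preset range: greater than 40 cm

def recognition_flow(capture_ir_image, capture_rgb_image,
                     estimate_distance_cm, run_3d_recognition):
    cycles = 0
    while True:
        ir_img = capture_ir_image()                       # S301: first image (infrared)
        rgb_img = capture_rgb_image()                     # S301: color image
        distance = estimate_distance_cm(ir_img, rgb_img)  # S302: parallax distance
        if distance > MIN_DISTANCE_CM:                    # S303: within preset range?
            return run_3d_recognition(ir_img)             # S304: 3D recognition, then S307
        cycles += 1                                       # S305: update cycle count
        if cycles >= MAX_CYCLES:                          # S306: threshold exceeded?
            return None                                   # S307: end the flow
```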
In the recognition method provided by this embodiment, the color image of the recognition object and the first image under infrared light projection are acquired, and the first distance from the recognition object to the 3D recognition apparatus is obtained from the first image and the color image. The first distance can thus be obtained without a proximity sensor, which reduces the number of components of the 3D recognition apparatus and the recognition cost, and also avoids injuring the recognition object with the strong light projected during 3D recognition when the distance is too short.
Fig. 7 is a flowchart illustrating a recognition method according to another exemplary embodiment of the present invention. As shown in Fig. 7, the recognition method provided by this embodiment is applied to a 3D recognition apparatus, and includes the following steps:
S401, acquiring a color image of the recognition object and a first image under infrared light projection.
S402, determining a first distance from the recognition object to the 3D recognition device according to the color image and the first image.
S403, it is determined whether the first distance is within the preset range; if so, the process proceeds to S404, otherwise to S406.
S404, acquiring a second image of the recognition object under structured light projection.
More specifically, acquiring the second image under structured light projection includes: the structured light projector and the infrared sensor work synchronously, the structured light projector projects structured light, and the infrared sensor captures a second image of the recognition object under the structured light projection. The power of the infrared light is less than the power of the structured light. The structured light can be point structured light, line structured light, multi-line structured light, area structured light, or other forms of structured light.
S405, 3D recognition processing is performed on the recognition object according to the first image and the second image, and the process proceeds to S408.
More specifically, after the second image under structured light projection is obtained, 3D reconstruction is performed using the first image and the second image to obtain a 3D image, so as to realize 3D recognition of the recognition object.
S406, the cycle count is updated.
S407, it is determined whether the cycle count is within the preset threshold; if so, the process returns to S401, otherwise it proceeds to S408.
S408, the flow ends.
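The structured-light branch S404 to S405 could be sketched as follows, extending the loop sketched after the previous embodiment; the projector object and the reconstruct_3d helper are hypothetical placeholders, not the actual reconstruction algorithm.

```python
# Sketch of steps S404-S405 (projector and reconstruction helpers are hypothetical).
def recognize_with_structured_light(projector, capture_ir_image,
                                    reconstruct_3d, first_image):
    projector.enable()                      # driving unit starts the structured light projector
    try:
        second_image = capture_ir_image()   # S404: second image under structured light
    finally:
        projector.disable()                 # turn the projector off again
    return reconstruct_3d(first_image, second_image)  # S405: 3D recognition / reconstruction
```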
In the recognition method provided by this embodiment, the color image of the recognition object and the first image under infrared light projection are acquired first, so that the first distance between the recognition object and the 3D recognition apparatus is obtained from the first image and the color image. Only when the first distance is within the preset range is the second image under structured light projection acquired, and since the power of the infrared light is smaller than that of the structured light, injury to the eyes caused by overly strong structured light at too short a distance can be avoided. Moreover, the method does not use a proximity sensor to obtain the distance, which reduces the components and the cost of the 3D recognition apparatus.
Those of ordinary skill in the art will understand that: all or a portion of the steps of implementing the above-described method embodiments may be performed by hardware associated with program instructions. The program may be stored in a computer-readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.