CN109495694A - A kind of environment perception method and device based on RGB-D - Google Patents
- Publication number
- CN109495694A (Application CN201811307133.8A)
- Authority
- CN
- China
- Prior art keywords
- exposure
- acquisition module
- data acquisition
- module
- laser radar
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/741—Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Electromagnetism (AREA)
- General Physics & Mathematics (AREA)
- Optical Radar Systems And Details Thereof (AREA)
- Traffic Control Systems (AREA)
Abstract
The invention discloses an RGB-D-based environment perception method and device. The method exposes a laser radar (lidar) data acquisition module and an image data acquisition module synchronously through a hardware synchronization mechanism; the image data acquisition module acquires one color image of the environment through a single exposure, while the lidar data acquisition module acquires several groups of intensity information of the same environment through several exposures and generates N grayscale images, where N ≥ 6; the color image and the N grayscale images are then fused at the raw-data level to obtain an environment picture with improved brightness. By exposing the lidar data acquisition module and the image data acquisition module synchronously, acquiring grayscale and color images at different acquisition frequencies, and fusing the two kinds of images at the raw-data level, the invention obtains excellent environmental information under a wide range of weather and natural conditions.
Description
Technical field
The present invention relates to the technical field of intelligent driving, and in particular to an RGB-D-based environment perception method and device.
Background technique
Against the background of continuous development and change in technologies such as computing, artificial intelligence, and machine vision, the value of images obtained by cameras keeps growing, and image-based applications multiply accordingly. A typical application mounts a camera on a robot to continuously capture images of the surroundings, which computer-vision algorithms then process, giving the robot eyes with which to observe its environment, as a person does. On the same basis, cameras can be applied to automobiles to perceive the environment around the vehicle.
An ordinary color camera passively receives the light coming from objects and forms an image on its image plane. Images obtained under this principle are easily affected by external factors such as illumination, shadow, and shooting angle, and current computer-vision algorithms are not robust when processing such images, so their applications are correspondingly restricted. A depth camera, by contrast, emits its own light and can actively perceive the range information of objects, and its imaging is less affected by illumination, shadow, and shooting angle.
With the continued development of the ADAS and AD industries, the demand for environment perception grows ever stronger, but it is constrained by factors such as cost and the current state of industrial technology. Existing monocular cameras in the industry have two primary limitations: 1) they provide no range information; 2) they cannot be used in application scenarios with insufficient light or complicated weather.
Therefore, it is necessary to provide an RGB-D-based environment perception method and device aimed at solving the bottleneck faced by current monocular cameras in application scenarios with complex lighting.
Summary of the invention
To solve the technical problem that cameras in the prior art cannot obtain high-brightness pictures in application scenarios with insufficient light, the present invention provides an RGB-D-based environment perception method and device. The invention aims to break through the present industry bottleneck and provide a better environment perception method and equipment for intelligent automobiles.
To solve the above technical problem, in a first aspect the present invention provides an RGB-D-based environment perception method comprising the following steps:
exposing a lidar data acquisition module and an image data acquisition module synchronously through a hardware synchronization mechanism;
the image data acquisition module acquiring one color image of the environment through a single exposure, while the lidar data acquisition module acquires several groups of intensity information of the environment through several exposures and generates N grayscale images, where N ≥ 6, the single-exposure time of the lidar data acquisition module being less than 1 ms;
fusing the color image and the N grayscale images at the raw-data level to obtain an environment picture with improved brightness.
Further, the lidar data acquisition module acquiring several groups of intensity information of the environment through several exposures and generating N grayscale images specifically includes: converting the intensity information into gray-level information and then generating the grayscale images.
Further, the intensity information is converted into gray-level information using a maximum-value and mean-gradient method.
Further, fusing the color image and the N grayscale images at the raw-data level to obtain an environment picture with improved brightness specifically includes:
converting the color gamut of the color image to YcCbcCrc;
obtaining the low-brightness region of the converted color image, the low-brightness region being the region whose brightness is below a preset threshold;
adding the luminance information of the part of the grayscale image corresponding to the low-brightness region into the low-brightness region to obtain a fused picture;
obtaining attribute information of the fused picture, the attribute information including gradient magnitude, gradient direction, gradient loss, shadow, and noise;
performing adaptive smoothing on the data corresponding to the attribute information and eliminating motion blur in the fused picture based on the grayscale images;
performing convergence processing on the data corresponding to the attribute information.
Further, exposing the lidar data acquisition module and the image data acquisition module synchronously through the hardware synchronization mechanism includes:
the lidar data acquisition module issuing an exposure signal and exposing on that signal, while sending the exposure signal to the image data acquisition module, which receives the exposure signal and exposes on it;
or,
the image data acquisition module issuing an exposure signal and exposing on that signal, while sending the exposure signal to the lidar data acquisition module, which receives the exposure signal and exposes on it.
Further, the image data acquisition module acquiring one color image of the environment through a single exposure while the lidar data acquisition module acquires several groups of intensity information of the environment through several exposures specifically includes:
acquiring the synchronously exposed laser point-cloud data and image data; obtaining the intensity information from the laser point-cloud data and obtaining the color image of the environment from the image data;
processing the laser point-cloud data and fusing it into the image data to obtain the range information of the environment.
Further, the lidar data acquisition module is a Flash Lidar; during synchronous exposure, the ratio of the acquisition frequency of the lidar data acquisition module to that of the image data acquisition module is M:1, where M ≥ 100; the image data acquisition module is a camera module, and the color-filter pattern used by the camera module is RGGB, RCCB, or RCCC.
Preferably, the single-exposure time of the lidar data acquisition module is 50 ns-1 ms.
Preferably, the single-exposure time of the lidar data acquisition module is 1-50 ns.
Preferably, the single-exposure time of the lidar data acquisition module is less than 1 ns.
In a second aspect, the present invention provides an RGB-D-based environment perception device comprising an image data acquisition module, a lidar data acquisition module, a synchronous exposure module, and an environment picture fusion module;
the synchronous exposure module is used to expose the lidar data acquisition module and the image data acquisition module synchronously through a hardware synchronization mechanism;
the image data acquisition module is used to acquire one color image of the environment through a single exposure;
the lidar data acquisition module is used to acquire several groups of intensity information of the environment through several exposures and to generate N grayscale images, where N ≥ 6, its single-exposure time being less than 1 ms;
the environment picture fusion module is used to fuse the color image and the N grayscale images at the raw-data level to obtain an environment picture with improved brightness.
Further, the lidar data acquisition module further includes an information conversion unit for converting the intensity information into gray-level information.
Further, the information conversion unit includes a first information conversion unit for converting the intensity information into gray-level information using the maximum-value and mean-gradient method.
Further, the information conversion unit further includes a second information conversion unit for converting the intensity information into gray-level information using an ROI method.
Further, the environment picture fusion module includes a color-gamut conversion unit, a picture fusion unit, an attribute-information acquisition unit, a motion-blur elimination unit, and a convergence unit;
the color-gamut conversion unit is used to convert the color gamut of the color image to YcCbcCrc;
the picture fusion unit is used to obtain the low-brightness region of the converted color image, the low-brightness region being the region whose brightness is below a preset threshold, and to add the luminance information of the part of the grayscale image corresponding to the low-brightness region into the low-brightness region to obtain a fused picture;
the attribute-information acquisition unit is used to obtain the attribute information of the fused picture, the attribute information including gradient magnitude, gradient direction, gradient loss, shadow, and noise;
the motion-blur elimination unit is used to perform adaptive smoothing on the data corresponding to the attribute information and to eliminate motion blur in the fused picture based on the grayscale images;
the convergence unit is used to perform convergence processing on the data corresponding to the attribute information.
Further, the synchronous exposure module includes a first synchronous exposure module and a second synchronous exposure module, which are respectively arranged on the lidar data acquisition module or the image data acquisition module. The first synchronous exposure module includes a trigger generation unit, a first exposure control unit, and a signal transmission unit; the second synchronous exposure module includes a signal receiving unit and a second exposure control unit;
the trigger generation unit is used to issue an exposure signal;
the first exposure control unit is used to receive the exposure signal sent by the trigger generation unit and expose on it;
the signal transmission unit is used to receive the exposure signal sent by the trigger generation unit and send it to the signal receiving unit;
the signal receiving unit is used to receive the exposure signal sent by the signal transmission unit;
the second exposure control unit is used to receive the exposure signal sent by the signal receiving unit and expose on it.
Further, the lidar data acquisition module includes a first data acquisition unit and an intensity-information acquisition unit;
the first data acquisition unit is used to acquire laser point-cloud data exposed synchronously with the second data acquisition unit;
the intensity-information acquisition unit is used to obtain the intensity information from the laser point-cloud data;
the image data acquisition module includes a second data acquisition unit and a color-image acquisition unit;
the second data acquisition unit is used to acquire image data exposed synchronously with the first data acquisition unit;
the color-image acquisition unit is used to obtain the color image of the environment from the image data.
Further, the device further includes a range-information acquisition module which, after the laser point-cloud data has been processed, fuses it into the image data to obtain the range information of the environment.
Further, the lidar data acquisition module is a Flash Lidar; during synchronous exposure, the ratio of the acquisition frequency of the lidar data acquisition module to that of the image data acquisition module is M:1, where M ≥ 100; the image data acquisition module is a camera module, and the color-filter pattern used by the camera module is RGGB, RCCB, or RCCC.
Preferably, the single-exposure time of the lidar data acquisition module is 50 ns-1 ms.
Preferably, the single-exposure time of the lidar data acquisition module is 1-50 ns.
Preferably, the single-exposure time of the lidar data acquisition module is less than 1 ns.
The RGB-D-based environment perception method and device provided by the invention have the following beneficial effects:
(1) The RGB-D-based environment perception method of the invention exposes the lidar data acquisition module and the image data acquisition module synchronously, acquires grayscale and color images at different acquisition frequencies, and fuses the two kinds of images at the raw-data level to obtain an environment picture with improved brightness; it thereby obtains excellent environmental information, with good robustness, under a wide range of weather and natural conditions.
(2) In the invention, the lidar data acquisition module can be exposed synchronously with the image data acquisition module through the hardware synchronization mechanism, so the laser point-cloud data is little affected by motion distortion and an environment picture with higher fusion accuracy can be obtained.
(3) In the invention, the lidar data acquisition module has a faster acquisition frequency than the camera; in the case of high vehicle speed, more real-time data can be obtained, improving the accuracy of the data.
Detailed description of the invention
To explain the embodiments of the invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the invention; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a flow diagram of an RGB-D-based environment perception method provided by an embodiment of the invention;
Fig. 2 is a flow diagram of exposing the lidar data acquisition module and the image data acquisition module synchronously through a hardware synchronization mechanism, provided by an embodiment of the invention;
Fig. 3 is another flow diagram of exposing the lidar data acquisition module and the image data acquisition module synchronously through a hardware synchronization mechanism, provided by an embodiment of the invention;
Fig. 4 is a flow diagram of fusing the color image and the grayscale images at the raw-data level, provided by an embodiment of the invention;
Fig. 5 is another flow diagram of an RGB-D-based environment perception method provided by an embodiment of the invention;
Fig. 6 is a schematic diagram of the synchronous exposure signals of the Flash Lidar and the RGB camera provided by an embodiment of the invention;
Fig. 7 is a block diagram of an RGB-D-based environment perception device provided by an embodiment of the invention;
Fig. 8 is a block diagram of the synchronous exposure module provided by an embodiment of the invention;
Fig. 9 is a block diagram of the first synchronous exposure module provided by an embodiment of the invention;
Fig. 10 is a block diagram of the second synchronous exposure module provided by an embodiment of the invention;
Fig. 11 is a block diagram of exposing the lidar data acquisition module and the image data acquisition module synchronously through a hardware synchronization mechanism, provided by an embodiment of the invention;
Fig. 12 is another block diagram of exposing the lidar data acquisition module and the image data acquisition module synchronously through a hardware synchronization mechanism, provided by an embodiment of the invention;
Fig. 13 is a block diagram of the lidar data acquisition module provided by an embodiment of the invention;
Fig. 14 is a block diagram of the image data acquisition module provided by an embodiment of the invention;
Fig. 15 is a block diagram of the environment picture fusion module provided by an embodiment of the invention;
Fig. 16 is another block diagram of an RGB-D-based environment perception device provided by an embodiment of the invention;
Fig. 17 is a schematic diagram of the installation structure of the lidar data acquisition module and the image data acquisition module provided by an embodiment of the invention.
Specific embodiment
The technical solutions in the embodiments of the invention are described clearly and completely below in combination with the drawings in the embodiments of the invention. Obviously, the described embodiments are only a part of the embodiments of the invention, not all of them. Based on the embodiments of the invention, all other embodiments obtained by those of ordinary skill in the art without creative work shall fall within the protection scope of the invention.
In the several embodiments provided in this application, the described device embodiments are only schematic. For example, the division of the modules is only a division of logical functions; in actual implementation there may be other division manners, for example multiple modules or components can be combined or integrated into another system, or some features can be ignored or not executed. Furthermore, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, modules, or units, and may be electrical or in other forms.
The modules described as separate members may or may not be physically separated, and the components shown as modules may or may not be physical modules; they may be located in one place or distributed over multiple network modules. Some or all of the units may be selected according to actual needs to achieve the purpose of the scheme of the embodiment.
In addition, each functional unit in each embodiment of the invention may be integrated into one processing module, each unit may exist alone physically, or two or more units may be integrated into one module. The integrated units may be implemented in the form of hardware or in the form of software functional units.
The present invention is applied in scenes in the field of intelligent driving; its main purpose is to perceive the environment around a vehicle during ADAS and AD.
Embodiment 1
As shown in Figure 1, an embodiment of the invention provides an RGB-D-based environment perception method, the method comprising:
S110. Exposing the lidar data acquisition module and the image data acquisition module synchronously through a hardware synchronization mechanism.
At the hardware level, synchronous exposure of the lidar data acquisition module and the image data acquisition module is realized through hardware communication. Compared with the separate exposure of each device in the prior art (for example, the radar has its own exposure frequency and the camera has its own; after each outputs its exposure data, an additional downstream processing system fuses the data output by the radar with the video data of the camera), synchronous exposure through the hardware mechanism makes the acquisition times of the laser point-cloud data and the image data coincide, so the laser point-cloud data produces no motion distortion.
Fig. 2 shows a flow diagram, in an embodiment of the invention, of exposing the lidar data acquisition module and the image data acquisition module synchronously through the hardware synchronization mechanism:
S210. The trigger generation unit in the lidar data acquisition module issues an exposure signal and exposes on it, while sending the exposure signal to the image data acquisition module through a channel such as MIPI, LVDS, or a parallel port; the signal unit in the image data acquisition module receives the exposure signal and exposes on it.
Fig. 3 shows another flow diagram, in an embodiment of the invention, of exposing the lidar data acquisition module and the image data acquisition module synchronously through the hardware synchronization mechanism:
S310. The trigger generation unit in the image data acquisition module issues an exposure signal and exposes on it, while sending the exposure signal to the lidar data acquisition module through a channel such as MIPI, LVDS, or a parallel port; the lidar data acquisition module receives the exposure signal and exposes on it.
In the embodiment of the invention, a main control chip can be provided, whose output signal triggers the trigger generation unit so that the lidar data acquisition module and the image data acquisition module expose synchronously. Alternatively, a specific interface can be provided, such that entering the interface triggers the trigger generation unit to expose the two modules synchronously; or a hardware-synchronous exposure mode can be set, such that entering that mode triggers the trigger generation unit; or a time period can be set, such that when the time period is reached the trigger generation unit is triggered to expose the lidar data acquisition module and the image data acquisition module synchronously.
S120. At the same time, the image data acquisition module acquires one color image of the environment through a single exposure, while the lidar data acquisition module acquires several groups of intensity information of the environment through several exposures and generates N grayscale images, where N ≥ 6; the single-exposure time of the lidar data acquisition module is 50 ns-1 ms.
In the embodiment of the invention, the laser point-cloud data exposed synchronously with the image data acquisition module is acquired by the lidar data acquisition module, and the image data exposed synchronously with the lidar data acquisition module is acquired by the image data acquisition module.
The intensity information is obtained from the laser point-cloud data and converted into gray-level information using the maximum-value and mean-gradient method. The specific calculation formula is:
y = kx + b,
where b = 0 and k = Max(A)/Max(B), A being the gray-scale maximum value 255 and B being the maximum value of the environment reflectance intensity.
Other parameters: x is the intensity information corresponding to a point in the actual environment, and y is the grayscale information, i.e. the gray-level information. For example, by adjusting the density of the data, the data acquired by the radar (1-10000) is mapped into the grayscale image range (0-255).
The color image of the environment is obtained from the image data. After the laser point-cloud data is processed, it is fused into the image data to obtain the range information of the environment.
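The linear intensity-to-gray mapping above (y = kx + b with b = 0 and k = Max(A)/Max(B)) can be sketched as follows. The function name is ours, and the 1-10000 intensity range is taken from the example in the text:

```python
def intensity_to_gray(intensities, max_intensity):
    """Map lidar reflectance intensities onto the 0-255 gray scale using the
    linear rule y = k*x + b from the description, with b = 0 and
    k = Max(A)/Max(B) = 255 / max_intensity."""
    k = 255.0 / max_intensity
    # Clamp at 255 in case a sample exceeds the assumed maximum.
    return [min(255, round(k * x)) for x in intensities]

# Example from the text: radar intensities in 1..10000 mapped into 0..255.
grays = intensity_to_gray([1, 5000, 10000], max_intensity=10000)
# the strongest return maps to full white (255)
```

In practice one such mapping would be applied per group of intensity returns, producing each of the N grayscale images to be fused in step S130.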
S130. Fusing the color image and the N grayscale images at the raw-data level to obtain an environment picture with improved brightness. As shown in Fig. 4, this specifically comprises the following steps:
S410. Converting the color gamut of the color image to YcCbcCrc.
S420. Obtaining the low-brightness region of the converted color image, the low-brightness region being the region whose brightness is below a preset threshold. The low-brightness region is defined relative to normal brightness; the preset threshold is below the normal brightness value and can be configured according to the actual application scenario.
S430. Adding the luminance information of the part of the grayscale image corresponding to the low-brightness region into the low-brightness region to obtain a fused picture.
S440. Obtaining attribute information of the fused picture, the attribute information including gradient magnitude, gradient direction, gradient loss, shadow, and noise.
S450. Performing adaptive smoothing on the data corresponding to the attribute information and eliminating motion blur in the fused picture based on the grayscale images.
The adaptive smoothing can be realized by convolution, convolving a filter w(x, y) of size m × n with an image f(x, y) according to:
w(x, y) * f(x, y) = Σ_{s=-a}^{a} Σ_{t=-b}^{b} w(s, t) f(x - s, y - t),
where a = (m-1)/2 and b = (n-1)/2; the minus signs on the right side of the equation indicate flipping f, i.e. rotating it by 180°.
S460. Performing convergence processing on the data corresponding to the attribute information to obtain the environment picture with improved brightness; specifically, the data can be converged using the IRLS and CG methods. Here, improved brightness means that the luminance of the obtained environment picture is raised compared with the color image of step S120.
IRLS refers to the iteratively reweighted least-squares method;
CG refers to the conjugate-gradient method.
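As a rough illustration of steps S410-S430 (color-space conversion, low-brightness masking, and luminance fill-in), a minimal sketch follows. The function name, the threshold value, and the array shapes are ours, and BT.601 luma weights are used as a stand-in for the YcCbcCrc gamut named in the description:

```python
import numpy as np

def fuse_low_luminance(rgb, gray, threshold=60):
    """Sketch of steps S410-S430: derive a luma channel from the color image,
    find pixels whose luma is below a preset threshold, and replace their
    luma with the lidar-derived grayscale value."""
    rgb = rgb.astype(np.float64)
    # S410 (stand-in): luma component of a YCbCr-style conversion (BT.601).
    y = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    # S420: low-brightness region = luma below the preset threshold.
    low = y < threshold
    # S430: fill the low-brightness region with the grayscale luminance.
    fused = y.copy()
    fused[low] = gray[low]
    return fused, low

rgb = np.zeros((2, 2, 3))          # a fully dark color image
gray = np.full((2, 2), 200.0)      # bright lidar-intensity grayscale
fused, mask = fuse_low_luminance(rgb, gray)
# every pixel was dark, so the fused luma comes from the grayscale image
```

Steps S440-S460 (attribute extraction, adaptive smoothing, IRLS/CG convergence) would then operate on `fused`; they are omitted here since the description leaves their exact formulations open.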
The image data acquisition module in the embodiment of the invention includes, but is not limited to, a vehicle-mounted camera or webcam; the lidar data acquisition module is a Flash Lidar. A Flash Lidar is a non-scanning lidar: by emitting and receiving an area array of laser light, it outputs point-cloud data arranged in the form of a two-dimensional image, containing information such as distance, reflected intensity, and speed. A non-scanning lidar is free of the dependence on a scanning device, with higher detection accuracy and reliability and relatively low cost. The IR wavelength emitted by the Flash Lidar can be configured according to the actual situation; for example, for application scenarios with harsher environments, the Flash Lidar supports SWIR.
As can be seen from the technical solution disclosed in the above embodiments of this specification, the embodiments provide an RGB-D-based environment perception method in which a hardware synchronization mechanism makes the laser radar data acquisition module and the image data acquisition module expose synchronously: the image data acquisition module acquires one color image of the environment through a single exposure, while the laser radar data acquisition module acquires several groups of intensity information of the environment through several exposures and generates N grayscale images, where N >= 6 and the single exposure time of the laser radar data acquisition module is less than 1 ms; the color image and the N grayscale images are then fused at the raw-data level to obtain an environment picture with improved brightness. The method has the following advantages: (1) the laser radar data acquisition module and the image data acquisition module expose synchronously and acquire grayscale and color images at different acquisition frequencies, and the two kinds of pictures are fused at the raw-data level, so an environment picture with improved brightness is obtained without adding further image data acquisition modules; (2) because the laser radar data acquisition module can expose synchronously with the image data acquisition module through the hardware synchronization mechanism, the laser point cloud data does not suffer motion distortion, so an environment picture with higher fusion accuracy can be obtained; (3) the acquisition frequency of the laser radar data acquisition module is faster than that of the camera, so that when the vehicle travels at high speed, data with better real-time performance can be obtained, improving the accuracy of the data.
Embodiment 2
Fig. 5 shows another flow diagram of the RGB-D-based environment perception method provided by an embodiment of the present invention. The method specifically includes:
S510. performing joint calibration on the laser radar data acquisition module and the image data acquisition module so that the spatial positions of the laser point cloud data and the image data are aligned;
Specifically, the joint calibration of the laser radar data acquisition module and the image data acquisition module in the embodiments of the present invention may use existing calibration methods, which are not described in detail here.
In the embodiments of the present invention, before the joint calibration of the laser radar data acquisition module and the image data acquisition module, there is a further step of placing the two modules in the same position; this step ensures that the spatial positions of the subsequently acquired laser point cloud data and image data are aligned. To guarantee the precision, accuracy and effective coverage of the acquired data, the fusion field of view of the laser radar data acquisition module may be set larger than that of the image data acquisition module: as shown in Figure 17, A is the image data acquisition module with fusion field of view α, and B is the laser radar data acquisition module with fusion field of view β, where β is greater than α.
It should be noted that in the embodiments of the present invention the image data acquisition module and the laser radar data acquisition module are complementary; this solution only requires the overlapping field of view (i.e., the intersection region) of the two acquisition modules, so the size relationship between α and β is not limited, and β ≤ α may also be set.
In this embodiment, the laser radar data acquisition module and the image data acquisition module may be mounted at the top, side or front of the vehicle; the embodiments of the present invention do not limit the installation positions of the laser radar data acquisition module and the image data acquisition module.
S520. making the laser radar data acquisition module and the image data acquisition module expose synchronously through a hardware synchronization mechanism, so that the laser point cloud data and the image data are synchronized in time;
S530. the image data acquisition module acquiring one color image of the environment through a single exposure, while the laser radar data acquisition module acquires several groups of intensity information of the environment through several exposures and generates N grayscale images, where N >= 6; the single exposure time of the laser radar data acquisition module is 1-50 ns, set according to actual needs, and may also be set to less than 1 ns. The details are as follows:
The spatially aligned and synchronously exposed laser point cloud data and image data are acquired, the laser point cloud data including the distance and reflection intensity information of the point cloud. The intensity information is converted into grayscale information; this method uses an ROI approach that divides the mapping between intensity information and grayscale information into n segments and then computes the grayscale information for each segment separately.
The specific calculation formula is: y = kx + b, where b is a constant and k = Max(A)/Max(B), A being the gray values and B being the ambient reflection intensity values; x is the intensity information of a given point in the actual environment, and y is the grayscale information.
At this stage, the segmentation parameters and their effect are adjusted in combination with field tests, in order to differentiate the intensity information of characteristic objects and the corresponding mapped grayscale information;
for example, by adjusting the data density, the radar-acquired data (1-10,000) is mapped into the grayscale image (0-255).
The color image of the environment is obtained from the image data; after the laser point cloud data is processed, it is fused into the image data to obtain the distance information of the environment.
S540. fusing the color image and the N grayscale images at the raw-data level to obtain an environment picture with improved brightness; for the specific operations, refer to steps S410-S460 above, which are not repeated here.
The laser radar data acquisition module in this embodiment is a Flash Lidar. During synchronous exposure, the ratio of the acquisition frequency of the laser radar data acquisition module to that of the image data acquisition module is M:1, where M >= 100; a schematic diagram of the synchronous exposure is shown in Fig. 6, in which the camera module is an RGB camera with a single exposure time >= 5 μs. The color filter pattern used by the camera module is RGGB; it should be noted that the pattern is not limited to RGGB and may also be RCCB, RCCC, etc. The image data acquisition module is a camera module, which may be a monocular camera or a binocular camera, but is not limited thereto.
Embodiment 3
An embodiment of the present invention provides an RGB-D-based environment perception device. As shown in Figure 7, the device includes: a synchronous exposure module 710, an image data acquisition module 720, a laser radar data acquisition module 730, an environment picture fusion module 740 and a distance information acquisition module 750. The synchronous exposure module 710 is configured to make the laser radar data acquisition module and the image data acquisition module expose synchronously through a hardware synchronization mechanism. Specifically, as shown in Figure 8, the synchronous exposure module 710 includes a first synchronous exposure module 7101 and a second synchronous exposure module 7102;
Specifically, as shown in Figures 9 and 10, the first synchronous exposure module 7101 includes: a trigger generation unit 71011, a first exposure control unit 71012 and a signal transmitting unit 71013; the second synchronous exposure module 7102 includes: a signal receiving unit 71021 and a second exposure control unit 71022;
The trigger generation unit 71011 is configured to issue an exposure signal;
The first exposure control unit 71012 is configured to receive the exposure signal sent by the trigger generation unit and perform exposure according to the exposure signal;
The signal transmitting unit 71013 is configured to receive the exposure signal sent by the trigger generation unit and send the exposure signal to the signal receiving unit through a channel such as MIPI, LVDS or a parallel port;
The signal receiving unit 71021 is configured to receive the exposure signal sent by the signal transmitting unit;
The second exposure control unit 71022 is configured to receive the exposure signal sent by the signal receiving unit and perform exposure according to the exposure signal.
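The signal path just described (trigger generation unit → first exposure control unit and signal transmitting unit → signal receiving unit → second exposure control unit) can be modeled schematically. The class below is a software illustration only: the MIPI/LVDS/parallel link is reduced to a method call, and all class and method names are hypothetical.

```python
class SyncExposure:
    """Schematic model of the hardware sync chain: one trigger event is
    acted on by both the local (e.g. lidar-side) and the remote (e.g.
    camera-side) exposure controllers, so both devices expose on the
    same tick. Names are illustrative, not from the patent."""
    def __init__(self):
        self.log = []
    def trigger(self, tick):           # trigger generation unit issues a signal
        self.local_expose(tick)        # first exposure control unit
        self.transmit(tick)            # signal transmitting unit
    def local_expose(self, tick):
        self.log.append(("lidar", tick))
    def transmit(self, tick):          # link modeled as a plain call
        self.remote_expose(tick)       # signal receiving -> second control unit
    def remote_expose(self, tick):
        self.log.append(("camera", tick))

sync = SyncExposure()
sync.trigger(0)   # both modules record an exposure at the same tick
```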
Figure 11 shows one block diagram, provided by an embodiment of the present invention, of making the laser radar data acquisition module and the image data acquisition module expose synchronously through a hardware synchronization mechanism, in which the first synchronous exposure module 7101 is provided in the laser radar data acquisition module 730 and the second synchronous exposure module 7102 is provided in the image data acquisition module 720.
Figure 12 shows another such block diagram, in which the first synchronous exposure module 7101 is provided in the image data acquisition module 720 and the second synchronous exposure module 7102 is provided in the laser radar data acquisition module 730.
The image data acquisition module 720 is configured to acquire one color image of the environment through a single exposure;
The laser radar data acquisition module 730 is configured to acquire several groups of intensity information of the environment through several exposures and generate N grayscale images, where N >= 6; the single exposure time of the laser radar data acquisition module is 50 ns-1 ms; the ratio of the acquisition frequency of the laser radar data acquisition module to that of the image data acquisition module is M:1, where M >= 100; the image data acquisition module is a camera module, and the color filter pattern used by the camera module is RGGB, RCCB or RCCC.
As shown in Figure 13, the laser radar data acquisition module 730 includes a first data acquisition unit 7301, an intensity information acquisition unit 7302 and an information conversion unit 7303. The first data acquisition unit 7301 is configured to acquire laser point cloud data exposed synchronously with the second data acquisition unit; the intensity information acquisition unit 7302 is configured to obtain the intensity information from the laser point cloud data; the information conversion unit 7303 includes a first information conversion unit, which is configured to convert the intensity information into grayscale information using the maximum and mean gradient method; for the specific calculation method, refer to the description in Embodiment 1.
As shown in Figure 14, the image data acquisition module 720 includes a second data acquisition unit 7201 and a color image acquisition unit 7202. The second data acquisition unit 7201 is configured to acquire image data exposed synchronously with the first data acquisition unit;
The color image acquisition unit 7202 is configured to obtain the color image of the environment from the image data;
The distance information acquisition module 750 is configured to process the laser point cloud data and then fuse it into the image data to obtain the distance information of the environment.
As shown in Figure 15, the environment picture fusion module 740 includes a color gamut conversion unit 7401, a picture fusion unit 7402, an attribute information acquisition unit 7403, a motion blur elimination unit 7404 and a convergence unit 7405;
The color gamut conversion unit 7401 is configured to convert the color gamut of the color image to YcCbcCrc;
The picture fusion unit 7402 is configured to obtain the low-brightness region in the converted color image, the low-brightness region being a region whose brightness is below a preset threshold, and to add the luminance information of the corresponding part of the grayscale image into the low-brightness region to obtain a fused picture;
The attribute information acquisition unit 7403 is configured to obtain the attribute information of the fused picture, the attribute information including gradient magnitude, gradient direction, gradient loss, shadow and noise;
The motion blur elimination unit 7404 is configured to perform adaptive smoothing on the data corresponding to the attribute information and eliminate motion blur in the fused picture based on the grayscale image;
The convergence unit 7405 is configured to perform convergence processing on the data corresponding to the attribute information, specifically using the IRLS and CG methods.
Embodiment 4
Figure 16 shows a block diagram of another RGB-D-based environment perception device provided by an embodiment of the present invention. The device includes: a calibration module 700, a synchronous exposure module 710, an image data acquisition module 720, a laser radar data acquisition module 730, an environment picture fusion module 740 and a distance information acquisition module 750. The calibration module 700 is configured to perform joint calibration on the laser radar data acquisition module and the image data acquisition module so that the spatial positions of the laser point cloud data and the image data are aligned;
The synchronous exposure module 710 is configured to make the laser radar data acquisition module and the image data acquisition module expose synchronously through a hardware synchronization mechanism;
The image data acquisition module 720 is configured to acquire one color image of the environment through a single exposure;
The laser radar data acquisition module 730 is configured to acquire several groups of intensity information of the environment through several exposures and generate N grayscale images, where N >= 6; the single exposure time of the laser radar data acquisition module is 1-50 ns, set according to actual needs, and may also be set to less than 1 ns; the ratio of the acquisition frequency of the laser radar data acquisition module to that of the image data acquisition module is M:1, where M >= 100; the image data acquisition module is a camera module, and the color filter pattern used by the camera module is RGGB, RCCB or RCCC.
The distance information acquisition module 750 is configured to process the laser point cloud data and then fuse it into the image data to obtain the distance information of the environment.
For the synchronous exposure module 710, image data acquisition module 720, laser radar data acquisition module 730 and environment picture fusion module 740, refer to the description in Embodiment 3, which is not repeated here. Unlike Embodiment 3, the information conversion unit 7303 in the laser radar data acquisition module 730 includes a second information conversion unit, which is configured to convert the intensity information into grayscale information using the ROI method; for the specific calculation method, refer to the description in Embodiment 2.
It should be noted that the device embodiments and the method embodiments in this specification are based on the same inventive concept.
The RGB-D-based environment perception method and device provided by the present invention make the laser radar data acquisition module and the image data acquisition module expose synchronously, acquire grayscale and color images at different acquisition frequencies, and fuse the two kinds of pictures at the raw-data level to obtain an environment picture with improved brightness. This yields excellent environmental information under various weather and natural conditions, with good robustness. Because the laser radar data acquisition module can expose synchronously with the image data acquisition module through the hardware synchronization mechanism, the laser point cloud data is little affected by motion distortion, so an environment picture with higher fusion accuracy can be obtained. The acquisition frequency of the laser radar data acquisition module is faster than that of the camera, so that when the vehicle travels at high speed, data with better real-time performance can be obtained, improving the accuracy of the data.
It should be understood that the ordering of the embodiments of the present invention is for description only and does not represent the relative merits of the embodiments. Specific embodiments of this specification have been described above; other embodiments fall within the scope of the appended claims. In some cases, the actions or steps recited in the claims can be performed in an order different from that in the embodiments and still achieve the desired results. In addition, the processes depicted in the drawings do not necessarily require the particular order shown, or a sequential order, to achieve the desired results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
The embodiments in this specification are described in a progressive manner; the same or similar parts of the embodiments may be referred to one another, and each embodiment focuses on its differences from the others. In particular, since the device embodiments are substantially similar to the method embodiments, their description is relatively simple, and relevant parts may refer to the description of the method embodiments.
The above are merely preferred embodiments of the present invention and are not intended to limit the invention. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present invention shall be included in the protection scope of the present invention.
Claims (20)
1. An RGB-D-based environment perception method, characterized by comprising the following steps:
making a laser radar data acquisition module and an image data acquisition module expose synchronously through a hardware synchronization mechanism;
the image data acquisition module acquiring one color image of an environment through a single exposure, while the laser radar data acquisition module acquires several groups of intensity information of the environment through several exposures and generates N grayscale images, where N >= 6, the single exposure time of the laser radar data acquisition module being less than 1 ms;
fusing the color image and the N grayscale images at the raw-data level to obtain an environment picture with improved brightness.
2. The RGB-D-based environment perception method according to claim 1, characterized in that the laser radar data acquisition module acquiring several groups of intensity information of the environment through several exposures and generating N grayscale images specifically comprises:
converting the intensity information into grayscale information and then generating the grayscale images.
3. The RGB-D-based environment perception method according to claim 2, characterized in that the intensity information is converted into grayscale information using the maximum and mean gradient method.
4. The RGB-D-based environment perception method according to claim 1, characterized in that fusing the color image and the N grayscale images at the raw-data level to obtain an environment picture with improved brightness specifically comprises:
converting the color gamut of the color image to YcCbcCrc;
obtaining the low-brightness region in the converted color image, the low-brightness region being a region whose brightness is below a preset threshold;
adding the luminance information of the corresponding part of the grayscale image into the low-brightness region to obtain a fused picture;
obtaining attribute information of the fused picture, the attribute information including gradient magnitude, gradient direction, gradient loss, shadow and noise;
performing adaptive smoothing on the data corresponding to the attribute information and eliminating motion blur in the fused picture based on the grayscale image;
performing convergence processing on the data corresponding to the attribute information.
5. The RGB-D-based environment perception method according to claim 1, characterized in that making the laser radar data acquisition module and the image data acquisition module expose synchronously through a hardware synchronization mechanism comprises:
the laser radar data acquisition module issuing an exposure signal, performing exposure according to the exposure signal, and simultaneously sending the exposure signal to the image data acquisition module, the image data acquisition module receiving the exposure signal and performing exposure according to it;
or,
the image data acquisition module issuing an exposure signal, performing exposure according to the exposure signal, and simultaneously sending the exposure signal to the laser radar data acquisition module, the laser radar data acquisition module receiving the exposure signal and performing exposure according to it.
6. The RGB-D-based environment perception method according to claim 1, characterized in that the image data acquisition module acquiring one color image of the environment through a single exposure while the laser radar data acquisition module acquires several groups of intensity information of the environment through several exposures specifically comprises:
acquiring synchronously exposed laser point cloud data and image data; obtaining the intensity information from the laser point cloud data and obtaining the color image of the environment from the image data;
after processing the laser point cloud data, fusing it into the image data to obtain the distance information of the environment.
7. The RGB-D-based environment perception method according to claim 1, characterized in that the laser radar data acquisition module is a Flash Lidar; during synchronous exposure, the ratio of the acquisition frequency of the laser radar data acquisition module to that of the image data acquisition module is M:1, where M >= 100; the image data acquisition module is a camera module, and the color filter pattern used by the camera module is RGGB, RCCB or RCCC.
8. The RGB-D-based environment perception method according to claim 1, characterized in that the single exposure time of the laser radar data acquisition module is 50 ns-1 ms.
9. The RGB-D-based environment perception method according to claim 1, characterized in that the single exposure time of the laser radar data acquisition module is 1-50 ns.
10. The RGB-D-based environment perception method according to claim 1, characterized in that the single exposure time of the laser radar data acquisition module is less than 1 ns.
11. An RGB-D-based environment perception device, characterized by comprising: an image data acquisition module, a laser radar data acquisition module, a synchronous exposure module and an environment picture fusion module;
the synchronous exposure module being configured to make the laser radar data acquisition module and the image data acquisition module expose synchronously through a hardware synchronization mechanism;
the image data acquisition module being configured to acquire one color image of an environment through a single exposure;
the laser radar data acquisition module being configured to acquire several groups of intensity information of the environment through several exposures and generate N grayscale images, where N >= 6, the single exposure time of the laser radar data acquisition module being less than 1 ms;
the environment picture fusion module being configured to fuse the color image and the N grayscale images at the raw-data level to obtain an environment picture with improved brightness.
12. The RGB-D-based environment perception device according to claim 11, characterized in that the laser radar data acquisition module further includes an information conversion unit configured to convert the intensity information into grayscale information.
13. The RGB-D-based environment perception device according to claim 12, characterized in that the information conversion unit includes a first information conversion unit configured to convert the intensity information into grayscale information using the maximum and mean gradient method.
14. The RGB-D-based environment perception device according to claim 11, characterized in that the environment picture fusion module includes: a color gamut conversion unit, a picture fusion unit, an attribute information acquisition unit, a motion blur elimination unit and a convergence unit;
the color gamut conversion unit being configured to convert the color gamut of the color image to YcCbcCrc;
the picture fusion unit being configured to obtain the low-brightness region in the converted color image, the low-brightness region being a region whose brightness is below a preset threshold, and to add the luminance information of the corresponding part of the grayscale image into the low-brightness region to obtain a fused picture;
the attribute information acquisition unit being configured to obtain the attribute information of the fused picture, the attribute information including gradient magnitude, gradient direction, gradient loss, shadow and noise;
the motion blur elimination unit being configured to perform adaptive smoothing on the data corresponding to the attribute information and eliminate motion blur in the fused picture based on the grayscale image;
the convergence unit being configured to perform convergence processing on the data corresponding to the attribute information.
15. The RGB-D-based environment perception device according to claim 11, characterized in that the synchronous exposure module includes a first synchronous exposure module and a second synchronous exposure module, the first and second synchronous exposure modules being respectively provided in the laser radar data acquisition module or the image data acquisition module; the first synchronous exposure module includes a trigger generation unit, a first exposure control unit and a signal transmitting unit, and the second synchronous exposure module includes a signal receiving unit and a second exposure control unit;
the trigger generation unit being configured to issue an exposure signal;
the first exposure control unit being configured to receive the exposure signal sent by the trigger generation unit and perform exposure according to the exposure signal;
the signal transmitting unit being configured to receive the exposure signal sent by the trigger generation unit and send the exposure signal to the signal receiving unit;
the signal receiving unit being configured to receive the exposure signal sent by the signal transmitting unit;
the second exposure control unit being configured to receive the exposure signal sent by the signal receiving unit and perform exposure according to the exposure signal.
16. The RGB-D-based environment perception device according to claim 11, characterized in that the device further includes a distance information acquisition module; the laser radar data acquisition module includes a first data acquisition unit and an intensity information acquisition unit, and the image data acquisition module includes a second data acquisition unit and a color image acquisition unit;
the first data acquisition unit being configured to acquire laser point cloud data exposed synchronously with the second data acquisition unit;
the intensity information acquisition unit being configured to obtain the intensity information from the laser point cloud data;
the second data acquisition unit being configured to acquire image data exposed synchronously with the first data acquisition unit;
the color image acquisition unit being configured to obtain the color image of the environment from the image data;
the distance information acquisition module being configured to process the laser point cloud data and then fuse it into the image data to obtain the distance information of the environment.
17. The RGB-D-based environment perception device according to claim 11, characterized in that the laser radar data acquisition module is a Flash Lidar; during synchronous exposure, the ratio of the acquisition frequency of the laser radar data acquisition module to that of the image data acquisition module is M:1, where M >= 100; the image data acquisition module is a camera module, and the color filter pattern used by the camera module is RGGB, RCCB or RCCC.
18. The RGB-D-based environment perception device according to claim 11, characterized in that the single exposure time of the laser radar data acquisition module is 50 ns-1 ms.
19. The RGB-D-based environment perception device according to claim 11, characterized in that the single exposure time of the laser radar data acquisition module is 1-50 ns.
20. The RGB-D-based environment perception device according to claim 11, characterized in that the single exposure time of the laser radar data acquisition module is less than 1 ns.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811307133.8A CN109495694B (en) | 2018-11-05 | 2018-11-05 | RGB-D-based environment sensing method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109495694A true CN109495694A (en) | 2019-03-19 |
CN109495694B CN109495694B (en) | 2021-03-05 |
Family
ID=65693802
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811307133.8A Active CN109495694B (en) | 2018-11-05 | 2018-11-05 | RGB-D-based environment sensing method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109495694B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113688900A (en) * | 2021-08-23 | 2021-11-23 | 阿波罗智联(北京)科技有限公司 | Radar and visual data fusion processing method, road side equipment and intelligent traffic system |
CN115499637A (en) * | 2021-06-18 | 2022-12-20 | 黄初镇 | Camera device with radar function |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106043169A (en) * | 2016-07-01 | 2016-10-26 | 百度在线网络技术(北京)有限公司 | Environment perception device and information acquisition method applicable to environment perception device |
US20170069071A1 (en) * | 2015-09-04 | 2017-03-09 | Electronics And Telecommunications Research Institute | Apparatus and method for extracting person region based on red/green/blue-depth image |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115499637A (en) * | 2021-06-18 | 2022-12-20 | 黄初镇 | Camera device with radar function |
CN115499637B (en) * | 2021-06-18 | 2024-02-27 | 黄初镇 | Camera device with radar function |
CN113688900A (en) * | 2021-08-23 | 2021-11-23 | 阿波罗智联(北京)科技有限公司 | Radar and visual data fusion processing method, road side equipment and intelligent traffic system |
Also Published As
Publication number | Publication date |
---|---|
CN109495694B (en) | 2021-03-05 |
Similar Documents
Publication | Title |
---|---|
US20210217212A1 | Method and system for automatically colorizing night-vision images |
CN110378838B | Variable-view-angle image generation method and device, storage medium and electronic equipment |
CN112165573B | Shooting processing method and device, equipment and storage medium |
CN103731583B | Intelligent photo synthesis and print processing method |
WO2019109805A1 | Method and device for processing image |
CN105141841B | Image pickup apparatus and method thereof |
CN108055452A | Image processing method, device and equipment |
CN107800965B | Image processing method, device, computer-readable storage medium and computer equipment |
CN101662694B | Method and device for presenting, sending and receiving video, and communication system |
CN104683685A | Automatic focusing method, automatic focusing device and image extracting device thereof |
US20120154541A1 | Apparatus and method for producing 3D images |
CN108154514A | Image processing method, device and equipment |
JP2024504027A | Pose estimation method and related device |
CN109690628A | Image generation method and device |
CN104853080B | Image processing apparatus |
CN108683902A | System and method for obtaining target image |
CN109889799B | Monocular structured-light depth perception method and device based on RGBIR camera |
JP7461504B2 | Model generation method, image perspective determination method, apparatus, device and medium |
CN113545030B | Method, user equipment and system for automatically generating full-focus image through mobile camera |
CN109495694A | Environment perception method and device based on RGB-D |
KR20140074201A | Tracking device |
JP2005065051A | Imaging apparatus |
CN108279421A | Time-of-flight camera with high-resolution colour picture |
CN110933290A | Virtual photographing integrated system and method based on human-computer interaction |
KR20190090980A | Apparatus for generating 3D model using filter-equipped lighting and drone |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||