WO2020088165A1 - Camera and lidar fusion system (一种摄像头与激光雷达融合系统) - Google Patents

Camera and lidar fusion system (一种摄像头与激光雷达融合系统)

Info

Publication number
WO2020088165A1
WO2020088165A1 · PCT/CN2019/108119 · CN2019108119W
Authority
WO
WIPO (PCT)
Prior art keywords
camera
lidar
mounting
fusion system
base
Prior art date
Application number
PCT/CN2019/108119
Other languages
English (en)
French (fr)
Inventor
田尚
向少卿
Original Assignee
上海禾赛光电科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 上海禾赛光电科技有限公司 filed Critical 上海禾赛光电科技有限公司
Publication of WO2020088165A1 publication Critical patent/WO2020088165A1/zh

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00: Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/02: Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G01C 11/04: Interpretation of pictures
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88: Lidar systems specially adapted for specific applications
    • G01S 17/93: Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S 17/931: Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles

Definitions

  • The invention relates to the technical field of environment perception systems for driverless cars, and in particular to a camera and lidar fusion system.
  • Obstacle detection in the driving environment is one of the key technologies of the environment perception system of a self-driving car.
  • At present, sensors installed on a driverless car are usually used to collect and analyze two-dimensional images and three-dimensional distance data around the car in real time while driving, and to transmit the data to the control system of the driverless car, which uses them to control the vehicle.
  • The vision camera and the lidar are the two most commonly used ranging sensors.
  • The vision camera monitors objects around the car in real time and, combined with a calibration algorithm, calculates the distance between each object and the vehicle, enabling lane departure warning, forward collision avoidance, pedestrian detection and other functions; the lidar measures the time interval between the emitted light and the reflected light to obtain an accurate distance measurement of an obstacle (a brief time-of-flight sketch is given at the end of this section).
  • The advantage of the camera lies in detailed classification and recognition of objects, especially their content, such as license plates and road signs.
  • The advantage of the lidar lies in the acquisition of distance and depth information.
  • The camera passively collects ambient light; in dim light, shadow or strong backlighting it cannot collect enough useful information, or fails outright.
  • The lidar actively emits detection light and can resist the influence of ambient light; its detection performance is, if anything, better in dark environments.
  • Driverless cars travel in unknown and complex environments.
  • The detection information obtained from a video camera or a lidar alone is relatively limited and makes accurate localization difficult; multi-sensor fusion is an inevitable development trend for obstacle detection in driverless cars.
  • The purpose of the present invention is to provide a camera and lidar fusion system that addresses the defects of the prior art.
  • To this end, the present invention provides a camera and lidar fusion system, including a lidar and a camera module, the lidar being connected to the camera module;
  • the camera module includes at least one first camera and at least one second camera, the first camera being used to achieve 360° surround-view image acquisition and the second camera being used to achieve forward-view image acquisition;
  • the lidar includes a rotor and at least one lens group disposed on the rotor, and the lens group can rotate with the rotor relative to the camera module;
  • when the lens group rotates to a preset position, the camera module is triggered to take a picture.
  • the camera module further includes a camera bracket
  • the lidar includes a base, and the lidar is connected to the camera bracket through the base.
  • The camera module includes four first cameras and one second camera, and the first cameras and the second camera are all disposed on the outer wall of the camera bracket;
  • the four first cameras are evenly distributed along the circumference of the outer wall of the camera bracket and are used to collect images in the four directions of front view, rear view, left view and right view, respectively;
  • the second camera is a color camera located directly above or directly below the one of the first cameras that collects the forward-view image, and is used to achieve forward-view color image acquisition.
  • Alternatively, the second camera is a color camera that replaces the one of the first cameras used to collect the forward-view image, and is used to achieve forward-view color image acquisition.
  • The camera bracket includes a first mounting barrel and a second mounting barrel, the second mounting barrel and the first mounting barrel being arranged one above the other;
  • the first mounting barrel is provided with a plurality of first mounting portions for mounting the first cameras, the plurality of first mounting portions being evenly distributed on the first mounting barrel in the circumferential direction; each first mounting portion is a first hollow structure, and the first hollow structure includes a first through hole matched with the first camera;
  • the second mounting barrel is provided with a second mounting portion for mounting the second camera, the second mounting portion is a second hollow structure, and the second hollow structure includes a second through hole matched with the second camera.
  • a first chamfered structure and a second chamfered structure are provided on the first mounting portion and the second mounting portion, respectively.
  • One end of the camera bracket is provided with a first connection portion, and the base is provided with a second connection portion that cooperates with the first connection portion; the first connection portion and the second connection portion are connected by a connecting piece.
  • a socket is provided on the outer side of the base, and the socket is a ring structure.
  • the system further includes a base, and the base is connected to the camera bracket.
  • The system further includes a first circuit board and a second circuit board, the first circuit board and the second circuit board are both disposed inside the base, and the second circuit board is located below the first circuit board.
  • A first outer cover is provided on the outside of the lidar, and a second outer cover is provided on the outside of the camera module; one end of the socket is connected to the first outer cover, and the other end of the socket is connected to the second outer cover.
  • A first seal is provided between the first outer cover and the socket, and a second seal is provided between the second outer cover and the socket.
  • Compared with the prior art, the present invention has the following beneficial effects:
  • When the lens group of the present invention rotates to a position where the field of view of the lidar wholly or partly coincides with the field of view of the first camera, the first camera and the second camera are triggered to take pictures, so that the lidar, the first camera and the second camera acquire their information synchronously. Subsequent parameter calibration then fuses the image data collected by the first camera and the second camera with the point cloud data collected by the lidar to obtain a precise coordinate mapping between pixels and point cloud, which realizes spatial matching, improves detection accuracy and enriches the information about detected objects;
  • the present invention makes the structure more compact by distributing the four first cameras evenly on the outer wall of the camera bracket, and can effectively reduce the blind area between the four first cameras;
  • The present invention limits the relative movement between the base of the lidar and the camera bracket in the radial direction through a combination of main positioning and auxiliary positioning, and connects the base to the camera bracket through the connecting piece to limit the relative movement between the base and the camera bracket in the axial direction; in this way the lidar and the camera module maintain independent integrity during installation, which is conducive to modularization and convenient for disassembly, assembly and later maintenance;
  • by providing the first chamfered structure and the second chamfered structure on the first mounting portion and the second mounting portion of the camera bracket respectively, the present invention reduces the occlusion of the cameras' fields of view compared with a camera bracket without chamfered structures;
  • the present invention arranges the first outer cover and the second outer cover to ensure that all components are hidden inside the first outer cover and the second outer cover, so that the system has better integrity;
  • the first outer cover and the second outer cover of the present invention are arranged symmetrically with respect to the socket, ensuring a good appearance effect, and at the same time, ensuring that the first outer cover and the second outer cover can share a pair of molds during manufacturing, which is beneficial to reduce manufacturing cost;
  • The socket of the present invention is screwed to the first outer cover and the second outer cover respectively, and the first sealing member and the second sealing member are compressed by screwing down the threads, which provides a suitable pressing force and helps to ensure the IP rating;
  • the present invention splits one large circuit board into two circuit boards arranged one above the other; compared with arranging a single large circuit board horizontally, this saves installation space in the base.
  • FIG. 1 is a schematic diagram of a camera and lidar fusion system provided by an embodiment of the present invention
  • FIG. 2 is a schematic diagram of the connection between the camera module and the lidar provided by the embodiment of the present invention
  • FIG. 3 is a schematic perspective view of a camera module provided by an embodiment of the present invention.
  • FIG. 4 is a front view of a camera module provided by an embodiment of the present invention.
  • FIG. 5 is a front view of a first camera provided by an embodiment of the present invention.
  • FIG. 6 is a top view of a first camera provided by an embodiment of the present invention.
  • FIG. 7 is a horizontal field of view distribution diagram of a first camera provided by an embodiment of the present invention.
  • FIG. 8 is a working principle diagram of a lens group provided by an embodiment of the present invention rotating to a first preset position
  • FIG. 9 is a working principle diagram of a lens group provided by an embodiment of the present invention rotating to a second preset position;
  • FIG. 10 is a schematic diagram of the assembly of the camera bracket and the base provided by an embodiment of the present invention.
  • FIG. 11 is an enlarged schematic view of A in FIG. 10;
  • FIG. 12 is a schematic diagram of the assembly of the second outer cover provided by an embodiment of the present invention;
  • FIG. 13 is an enlarged schematic view of B in FIG. 12;
  • FIG. 14 is an enlarged schematic view of C in FIG. 12;
  • FIG. 15 is a schematic structural diagram of a first circuit board provided by an embodiment of the present invention.
  • In the figures: 1-camera module, 11-first camera, 111-horizontal field of view, 112-blind zone, 12-second camera, 13-camera bracket, 131-first mounting barrel, 1311-first mounting portion, 1312-first chamfered structure, 132-second mounting barrel, 1321-second mounting portion, 1322-second chamfered structure, 133-first connection portion, 1331-first positioning hole, 2-lidar, 21-rotor, 22-lens group, 23-base, 231-second connection portion, 3-connecting piece, 4-socket, 5-first outer cover, 6-second outer cover, 7-first seal, 8-second seal, 9-base, 91-second positioning member, 92-plug connector, 10-first circuit board, 101-third positioning hole, 11-second circuit board.
  • an embodiment of the present invention provides a camera and lidar fusion system.
  • the system includes a camera module 1 and a lidar 2 disposed above the camera module 1.
  • The camera module 1 includes at least one first camera 11, at least one second camera 12 and a camera bracket 13; the first camera 11 is used to achieve 360° surround-view image acquisition, and the second camera 12 is used to achieve forward-view image acquisition.
  • Referring to FIGS. 3, 5, 6, 8 and 9, in some embodiments the camera module 1 may include four first cameras 11 and one second camera 12, the four first cameras 11 and the one second camera 12 all being mounted on the outer wall of the camera bracket 13, and the four first cameras 11 being evenly distributed on the outer wall of the camera bracket 13 so as to respectively acquire images in the four directions of front view, rear view, left view and right view.
  • The second camera 12 is a color camera located directly below the one of the first cameras 11 used for forward-view image acquisition, and is used to achieve forward-view color image acquisition.
  • the second camera 12 may also be disposed directly above the first camera 11.
  • the second camera 12 is a color camera, which replaces the camera used in the first camera 11 for acquiring a forward-looking orientation image, and is used to realize color image acquisition in the forward-looking orientation.
  • In other embodiments, there may be a single first camera 11, which achieves 360° surround-view image acquisition by rotating.
  • In other embodiments, the camera module 1 may include only the four first cameras 11, without the second camera 12; in that case all four first cameras 11 are color cameras and are used to collect color images in the four directions of front view, rear view, left view and right view.
  • In an optional example, the horizontal field of view of the first camera 11 is 129° and its vertical field of view is 81.8°; the horizontal field of view of the second camera 12 is 52° and its vertical field of view is 28.6°.
  • Referring to FIG. 7, by evenly distributing the four first cameras 11 on the camera bracket 13, the horizontal fields of view 111 of the four first cameras 11 are evenly distributed, the structure is more compact, and the blind zones 112 between the four first cameras 11 are effectively reduced (a back-of-envelope coverage check is sketched at the end of this section).
  • In some embodiments, the camera bracket 13 may be a cylindrical structure. With reference to FIGS. 3, 4 and 10, in some embodiments the camera bracket 13 may be a cylindrical structure with a circular cross section.
  • the camera bracket 13 includes a first mounting barrel 131 and a second mounting barrel 132 connected to the first mounting barrel 131, and the second mounting barrel 132 is disposed below the first mounting barrel 131.
  • the second mounting cylinder 132 may also be disposed above the first mounting cylinder 131.
  • In some embodiments, the camera bracket 13 may be integrally formed; in other embodiments, the camera bracket 13 may also be formed by fixedly connecting the first mounting barrel 131 and the second mounting barrel 132. Further, the first mounting barrel 131 is provided with a plurality of first mounting portions 1311 for mounting the first cameras 11, and the plurality of first mounting portions 1311 are evenly distributed on the first mounting barrel 131 in the circumferential direction.
  • In an optional example, the first mounting barrel 131 is provided with four first mounting portions 1311 for mounting the first cameras 11, the four first mounting portions 1311 being evenly distributed on the first mounting barrel 131 in the circumferential direction.
  • a first chamfered structure 1312 is provided on the first mounting portion 1311, and the arrangement of the first chamfered structure 1312 can reduce the occlusion of the field of view of the first camera 11 by the frame.
  • the first mounting portion 1311 may be a first hollow structure
  • the first hollow structure may include a first through hole that cooperates with the first camera 11, and the first chamfered structure 1312 is disposed at the edge of the first through hole.
  • the first mounting portion 1311 may be a first groove structure, and the first groove structure may include a first groove that cooperates with the first camera 11, the first The chamfered structure 1312 is disposed at the edge of the first groove.
  • Further, the second mounting barrel 132 is provided with a plurality of second mounting portions 1321 for mounting the second camera 12, and the plurality of second mounting portions 1321 are evenly distributed on the second mounting barrel 132 in the circumferential direction.
  • In an optional example, the second mounting barrel 132 is provided with four second mounting portions 1321 for mounting the second camera 12, the four second mounting portions 1321 being evenly distributed on the second mounting barrel 132 in the circumferential direction.
  • a second chamfer structure 1322 is provided on the second mounting portion 1321, and the arrangement of the second chamfer structure 1322 can reduce the occlusion of the field of view of the second camera 12 by the frame.
  • the second mounting portion 1321 may be a second hollow structure, and the second hollow structure may include a second through hole that cooperates with the second camera 12, and the second chamfered structure 1322 is provided at the edge of the second through hole.
  • the second mounting portion 1321 may be a second groove structure, and the second groove structure may include a second groove that cooperates with the second camera 12, the second The chamfered structure 1322 is disposed at the edge of the second groove.
  • Although the above embodiments only describe the camera bracket 13 as a cylindrical structure with a circular cross section, the scope of protection of the present invention is not limited thereto; based on the gist of the invention and according to specific needs, those skilled in the art may design the camera bracket 13 as a cylindrical structure of another form, for example a cylindrical structure with a regular polygonal cross section.
  • Referring to FIGS. 2, 8 and 9, the lidar 2 includes a rotor 21 and at least one lens group 22 disposed on the rotor 21; the lens group 22 can rotate with the rotor 21 relative to the camera module 1, and when the lens group 22 rotates to a preset position, the camera module 1 is triggered to take a picture.
  • The preset position refers to a position where the field of view of the lidar 2 and the field of view of the first camera 11 wholly or partly coincide.
  • Specifically, when the lens group 22 rotates to a position where the field of view of the lidar 2 wholly or partly coincides with the field of view of the first camera 11, the first camera 11 and the second camera 12 are triggered to take a picture.
  • For example, referring to FIG. 8, when the lens group 22 rotates to a first preset position, the first camera 11 and the second camera 12 may be triggered to take a picture; referring to FIG. 9, when the lens group 22 rotates to a second preset position, the first camera 11 and the second camera 12 may likewise be triggered.
  • By triggering the first camera 11 and the second camera 12 in this way, the information of the lidar 2, the first camera 11 and the second camera 12 is acquired synchronously (an illustrative trigger sketch is given at the end of this section).
  • Once the synchronous acquisition is completed, post-hoc parameter calibration can be used to fuse the image data collected by the first camera 11 and the second camera 12 with the point cloud data collected by the lidar 2, obtaining a precise coordinate mapping between pixels and point cloud (a minimal projection sketch is given at the end of this section); this realizes spatial matching, improves detection accuracy, enriches the information about the detected object, and can give the detected object additional attributes, including posture and motion state, facilitating prediction of its likely behavior at the next moment.
  • the lens group 22 may include a transmitting lens group and a receiving lens group, the transmitting lens group is used to emit probe light, and the receiving lens group is used to receive probe light.
  • the transmitting lens group may include one transmitting lens
  • the receiving lens group may include one receiving lens.
  • the number of the transmitting lens and the receiving lens is not limited to one, but may also be two, three, or four.
  • the lidar 2 further includes a base 23 through which the lidar 2 is connected to the camera bracket 13.
  • Referring to FIGS. 3, 10 and 11, the top end of the camera bracket 13 is provided with a first connection portion 133, and the base 23 is correspondingly provided with a second connection portion 231; the first connection portion 133 and the second connection portion 231 are connected in a cooperating manner, for example by a plug-in connection, so as to realize the main positioning between the base 23 and the camera bracket 13.
  • The top end of the first connection portion 133 is provided with at least one first positioning hole 1331, the bottom end of the second connection portion 231 is provided with a second positioning hole corresponding to the first positioning hole 1331, and a first positioning member (for example a pin) is inserted through the first positioning hole 1331 and the second positioning hole, so as to realize the auxiliary positioning between the base 23 and the camera bracket 13.
  • Further, the first connection portion 133 and the second connection portion 231 are connected by a connecting piece 3 (for example a screw). It should be noted that the connecting piece 3 is mounted along the radial direction of the camera bracket 13.
  • The embodiment of the present invention thus combines the above main positioning and auxiliary positioning to limit the relative movement between the base 23 and the camera bracket 13 in the radial direction, and uses the connecting piece 3 to connect the base 23 to the camera bracket 13 so as to limit the relative movement between the base 23 and the camera bracket 13 in the axial direction.
  • In this way the lidar 2 and the camera module 1 maintain independent integrity during installation, which facilitates modularization, disassembly, assembly and later maintenance.
  • the system further includes a socket 4, and the socket 4 is disposed outside the base 23.
  • In one embodiment, the socket 4 may adopt a ring structure.
  • In other embodiments, the socket 4 may also adopt a structure of another shape, as long as the shape of the socket 4 is adapted to the shape of the base 23.
  • the socket 4 can be made of metal.
  • the socket 4 can be made of plastic or other materials.
  • Referring to FIGS. 1, 2, 12 and 13, in some embodiments the system further includes a first outer cover 5 and a second outer cover 6; the first outer cover 5 is disposed outside the lidar 2, and the second outer cover 6 is disposed outside the camera module 1.
  • One end of the socket 4 is connected to the first outer cover 5, and the other end of the socket 4 is connected to the second outer cover 6.
  • In some embodiments, the first outer cover 5 and the second outer cover 6 are both internally hollow truncated-cone-shaped structures, and the first outer cover 5 and the second outer cover 6 are arranged symmetrically with respect to the socket 4, which ensures a good appearance and at the same time ensures that the first outer cover 5 and the second outer cover 6 can share a pair of molds during manufacturing, which is beneficial for reducing manufacturing cost.
  • In some embodiments, the first outer cover 5, the second outer cover 6 and the socket 4 may be integrally formed; in other embodiments, the first outer cover 5, the second outer cover 6 and the socket 4 may be formed as separate pieces.
  • In some embodiments, the top end and the bottom end of the socket 4 are respectively provided with a first sealing groove and a second sealing groove, in which the first sealing member 7 and the second sealing member 8 are respectively disposed.
  • In some embodiments, the socket 4 is screwed to the first outer cover 5 and the second outer cover 6 respectively, and the first sealing member 7 and the second sealing member 8 are compressed by screwing down the threads, which provides a suitable pressing force and helps to ensure the IP rating.
  • In some embodiments, the first sealing member 7 and the second sealing member 8 may be sealing rings; depending on specific needs, the sealing ring may be an O-ring, a Y-ring, a V-ring or a U-shaped sealing ring.
  • Referring to FIGS. 2 and 14, in some embodiments the system further includes a base 9 that is disposed below the second outer cover 6 and is connected to the bottom end of the camera bracket 13.
  • Referring to FIG. 12, in some embodiments the system further includes a first circuit board 10 and a second circuit board 11; the first circuit board 10 and the second circuit board 11 are both disposed inside the base 9, and the second circuit board 11 is located below the first circuit board 10.
  • both the first circuit board 10 and the second circuit board 11 are ring-shaped structures.
  • The embodiment of the present invention splits one large circuit board into two circuit boards arranged one above the other; compared with arranging a single large circuit board horizontally, this saves installation space in the base 9.
  • Referring to FIG. 15, in some embodiments the first circuit board 10 is provided with at least one third positioning hole 101, and the second circuit board 11 is provided with a fourth positioning hole corresponding to the third positioning hole 101.
  • The base 9 is provided with a second positioning member 91 (for example a positioning column) that passes through the third positioning hole 101 and the fourth positioning hole, so as to realize the positioning between the first circuit board 10 and the second circuit board 11 and to facilitate their installation and removal.
  • At least one plug connector 92 is also provided in the base 9, and the first circuit board 10 and the second circuit board 11 are respectively connected to the plug connector 92, so as to supply power to the lidar 2 and the camera module 1 and to perform data transmission and processing.
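As a minimal illustration of the lidar ranging principle described in the background above, the distance to an obstacle follows directly from the measured round-trip time of the emitted and reflected light. The sketch below assumes ideal timestamps; the constant and function names are our own and do not appear in the patent.

```python
# Minimal sketch of lidar time-of-flight ranging (illustrative only, not from the patent).
C = 299_792_458.0  # speed of light in m/s

def tof_distance(t_emit_s: float, t_return_s: float) -> float:
    """Distance to the reflecting obstacle from emit/return timestamps (seconds)."""
    round_trip = t_return_s - t_emit_s
    return C * round_trip / 2.0  # the light travels out and back, so halve the path

# Example: a 2 microsecond round trip corresponds to roughly 300 m.
print(tof_distance(0.0, 2e-6))  # ~299.79 m
```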
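The stated 129° horizontal field of view of each first camera, with the four optical axes spaced 90° apart, implies a 39° overlap between neighboring cameras and therefore full 360° angular coverage, with the blind zones 112 confined to the immediate vicinity of the bracket. The back-of-envelope check below assumes idealized pinhole cameras and is our own illustration, not part of the patent.

```python
# Back-of-envelope coverage check for the four evenly spaced first cameras
# (idealized pinhole cameras, optical axes 90 degrees apart, values taken from the text).
N_CAMERAS = 4
HFOV_DEG = 129.0

spacing_deg = 360.0 / N_CAMERAS          # 90 deg between adjacent optical axes
overlap_deg = HFOV_DEG - spacing_deg     # overlap of neighboring horizontal fields
unique_coverage_deg = N_CAMERAS * (HFOV_DEG - overlap_deg)

print(f"adjacent overlap : {overlap_deg:.1f} deg")          # 39.0 deg
print(f"unique coverage  : {unique_coverage_deg:.1f} deg")  # 360.0 deg, full look-around
```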
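The triggering behavior, in which the camera module is fired when the lens group reaches a preset position where the lidar and camera fields of view coincide, can be pictured with a small sketch. This is only an illustration, not the patent's control logic; the camera center angles and the once-per-window firing policy are assumptions.

```python
from typing import Callable, Dict

# Illustrative sketch of "trigger at a preset position": a camera is fired once per
# revolution when the rotating lens group's azimuth enters that camera's horizontal
# field of view, so the lidar scan and the exposure cover the same sector.
CAMERA_CENTERS_DEG = {"front": 0.0, "left": 90.0, "rear": 180.0, "right": 270.0}
HFOV_DEG = 129.0  # stated horizontal field of view of each first camera

def angular_offset(a_deg: float, b_deg: float) -> float:
    """Smallest absolute difference between two azimuths, in degrees."""
    d = abs(a_deg - b_deg) % 360.0
    return min(d, 360.0 - d)

def on_rotor_azimuth(azimuth_deg: float,
                     capture: Callable[[str], None],
                     fired: Dict[str, bool]) -> None:
    """Fire each camera once whenever the lens group enters its angular window."""
    for name, center_deg in CAMERA_CENTERS_DEG.items():
        inside = angular_offset(azimuth_deg, center_deg) <= HFOV_DEG / 2.0
        if inside and not fired[name]:
            capture(name)        # the color (second) camera could be triggered here too
            fired[name] = True
        elif not inside:
            fired[name] = False  # re-arm once the lens group has left the window

# Example usage with a dummy capture callback:
fired_state = {name: False for name in CAMERA_CENTERS_DEG}
for az in range(0, 360, 10):
    on_rotor_azimuth(float(az),
                     lambda cam: print(f"capture {cam} camera at {az} deg"),
                     fired_state)
```

Re-arming a camera only after the lens group leaves its window keeps each camera to one exposure per lidar revolution, which matches the synchronous acquisition the description relies on.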
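The "precise coordinate mapping between pixels and point cloud" obtained by parameter calibration is, in the usual formulation, a pinhole projection of lidar points into the camera image using calibrated extrinsics and intrinsics. The sketch below assumes such a model; the matrices R, t and K stand in for the calibration results, which the patent does not specify.

```python
import numpy as np

# Minimal sketch of mapping lidar points into a camera image after calibration.
# R (3x3) and t (3,) are the assumed lidar-to-camera extrinsics; K (3x3) the intrinsics.
def project_points(points_lidar: np.ndarray, R: np.ndarray, t: np.ndarray,
                   K: np.ndarray) -> np.ndarray:
    """Return (u, v, depth) for each lidar point that lies in front of the camera."""
    pts_cam = points_lidar @ R.T + t          # lidar frame -> camera frame
    in_front = pts_cam[:, 2] > 0.0            # keep points in front of the image plane
    pts_cam = pts_cam[in_front]
    uvw = pts_cam @ K.T                       # pinhole projection (homogeneous pixels)
    uv = uvw[:, :2] / uvw[:, 2:3]             # perspective divide
    return np.hstack([uv, pts_cam[:, 2:3]])   # pixel coordinates plus depth

# Example with a toy calibration: identity rotation, small translation, 1000 px focal length.
K = np.array([[1000.0, 0.0, 640.0], [0.0, 1000.0, 360.0], [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.array([0.0, 0.0, 0.1])
cloud = np.array([[0.5, 0.0, 5.0], [-1.0, 0.2, 8.0]])
print(project_points(cloud, R, t, K))
```

Fused in this way, each projected point carries both a pixel color and a depth, which is what enables the enriched object attributes described above.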

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Studio Devices (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

A camera and lidar fusion system includes a lidar (2) and a camera module (1), the lidar (2) being connected to the camera module (1). The camera module (1) includes at least one first camera (11) and at least one second camera (12); the first camera (11) is used to achieve 360° surround-view image acquisition, and the second camera (12) is used to achieve forward-view image acquisition. The lidar (2) includes a rotor (21) and at least one lens group (22) disposed on the rotor (21); the lens group (22) can rotate with the rotor (21) relative to the camera module (1), and when the lens group (22) rotates to a preset position, the camera module (1) is triggered to take a picture, thereby completing the synchronous information acquisition of the lidar (2), the first camera (11) and the second camera (12). Through subsequent parameter calibration, the image data collected by the first camera and the second camera are fused with the point cloud data collected by the lidar to obtain a precise coordinate mapping between pixels and point cloud, realizing spatial matching, improving detection accuracy and enriching the information about detected objects.

Description

一种摄像头与激光雷达融合系统 技术领域
本发明涉及无人驾驶汽车的环境感知系统技术领域,特别涉及一种摄像头与激光雷达融合系统。
背景技术
无人驾驶汽车行驶环境中的障碍物探测是无人驾驶汽车环境感知系统的关键技术之一。现阶段通常采用安装在无人驾驶汽车上的传感器,在行驶过程中实时采集并分析汽车周围的二维图像和三维距离数据,将该数据传输至无人驾驶汽车的控制系统,来对无人驾驶汽车进行控制。
视觉摄像头和激光雷达是目前最常使用的两种测距传感元件。视觉摄像头通过对汽车周围物体进行实时监测,同时配以标定算法计算出物体与车辆的距离,从而实现车道偏离警告、前车防撞、行人探测等功能;激光雷达则通过测量发射光线与反射光线之间的时间间隔,来对障碍物进行精确的距离测量。
摄像头的优势在于物体的细致分类识别,尤其是在内容上,比如车牌,指示牌等,而激光雷达的优势在于距离,深度信息的采集。摄像头是对环境光的被动采集,在遇到暗光,阴影,强逆光环境,摄像头无法采集到足够有效信息或者直接失效。激光雷达是主动发出光线探测,可以抵御环境光的影响,尤其是暗光环境探测效果反而更佳。无人驾驶汽车在未知的复杂环境中行驶,仅仅通过视频摄像头或激光雷达获取的探测信息较为单一,难以进行准确的定位,多传感器的融合是无人驾驶汽车障碍物探测必然的发展趋势。
发明内容
本发明的目的在于针对现有技术的缺陷,提供一种摄像头与激光雷达 融合系统。
为了达到上述目的,本发明提供一种摄像头与激光雷达融合系统,包括激光雷达和摄像头模组,所述激光雷达与所述摄像头模组相连;
所述摄像头模组包括至少一个第一摄像头和至少一个第二摄像头,所述第一摄像头用于实现360°环视图像采集,所述第二摄像头用于实现前视图像采集;
所述激光雷达包括转子和设置于所述转子上的至少一个透镜组,所述透镜组可随所述转子相对于所述摄像头模组旋转;
当所述透镜组旋转到预设位置时,触发所述摄像头模组进行拍照。
可选地,所述摄像头模组还包括摄像头支架;
所述激光雷达包括基座,所述激光雷达通过所述基座与所述摄像头支架相连。
可选地,所述摄像头模组包括四个所述第一摄像头和一个所述第二摄像头,所述第一摄像头和所述第二摄像头均设置于所述摄像头支架的外壁上;
四个所述第一摄像头沿所述摄像头支架的外壁的周向均匀分布,用于分别采集前视、后视、左视、右视四个方位的图像;
所述第二摄像头为彩色摄像头,其位于所述第一摄像头中用于采集前视方位图像的摄像头的正上方或正下方,用于实现前视方位的彩色图像采集。
可选地,所述第二摄像头为彩色摄像头,其取代所述第一摄像头中的用于采集前视方位图像的摄像头,用于实现前视方位的彩色图像采集。
可选地,所述摄像头支架包括第一安装筒和第二安装筒,所述第二安装筒与所述第一安装筒上下布置;
所述第一安装筒上设置有多个用于安装所述第一摄像头的第一安装部,多个所述第一安装部环向均匀分布于所述第一安装筒上,所述第一安装部为第一镂空结构,所述第一镂空结构包括与所述第一摄像头相配合的第一通孔;
所述第二安装筒上设置有用于安装所述第二摄像头的第二安装部,所 述第二安装部为第二镂空结构,所述第二镂空结构包括与所述第二摄像头相配合的第二通孔。
可选地,所述第一安装部和所述第二安装部上分别设置有第一切角结构和第二切角结构。
可选地,所述摄像头支架的一端设置有第一连接部,所述基座上设置有与所述第一连接部相配合的第二连接部,所述第一连接部和所述第二连接部之间通过连接件相连。
可选地,所述基座的外侧设置有套接件,所述套接件为环状结构。
可选地,所述系统还包括底座,所述底座与所述摄像头支架相连。
可选地,所述系统还包括第一电路板和第二电路板,所述第一电路板和所述第二电路板均设置于所述底座的内部,且所述第二电路板位于所述第一电路板下方。
可选地,所述激光雷达的外侧设置有第一外罩,所述摄像头模组的外侧设置有第二外罩,所述套接件的一端与所述第一外罩相连,所述套接件的另一端与所述第二外罩相连。
可选地,所述第一外罩与所述套接件之间设置有第一密封件,所述第二外罩与所述套接件之间设置有第二密封件。
与现有技术相比,本发明具有如下有益效果:
1、本发明的透镜组旋转到激光雷达的视场与第一摄像头的视场全部或部分重合的位置时,会触发第一摄像头和第二摄像头进行拍照,能够完成激光雷达与第一摄像头和第二摄像头的信息同步采集,当信息同步采集完成后,通过后期参数标定,使第一摄像头和第二摄像头采集的图像数据与激光雷达采集的点云数据融合,得到像素和点云之间的精确的坐标映射关系,实现了空间匹配,提高了探测准确性,丰富了被探测物体的信息;
2、本发明通过将四个第一摄像头均匀分布于摄像头支架的外壁上的布置方式,使得结构更加紧凑,能够有效降低四个第一摄像头之间的盲区;
3、本发明通过主定位和副定位相结合的方式,以限制激光雷达的基座和摄像头支架之间沿径向方向的相对运动,并通过连接件使基座与摄像头支架相连,以限制基座和摄像头支架之间沿轴向方向的相对运动,采用这 样的方式可以使激光雷达与摄像头模组在安装时保持独立的完整性,有利于实现模块化,方便拆装及后期维护;
4、本发明通过在摄像头支架的第一安装部和第二安装部上分别设置第一切角结构和第二切角结构,相对于没有切角结构的摄像头支架而言,减少了对摄像头视场的遮挡;
5、本发明通过布置第一外罩和第二外罩,保证所有元件均隐藏在第一外罩和第二外罩内部,使系统具有更好的完整性;
6、本发明的第一外罩和第二外罩相对于套接件对称设置,保证具有良好的外观效果,同时,保证第一外罩和第二外罩在制造时可以共用一副模具,有利于降低制造成本;
7、本发明的套接件分别与第一外罩和第二外罩螺纹连接,通过螺纹旋压的方式来压紧第一密封件和第二密封件,能够提供合适的压紧力,有利于保证IP等级;
8、本发明采用将一整块大的电路板拆分成上下布置的两块电路板形式,相比于采用一整块大的电路板水平布置的形式,节约了底座的安装空间。
本发明的附加方面和优点将在下面的描述中部分给出,部分将从下面的描述中变得明显,或通过本发明的实践了解到。
附图说明
为了更清楚地说明本发明实施例或现有技术中的技术方案,下面将对实施例或现有技术描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本发明的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其它附图。
图1是本发明实施例提供的摄像头与激光雷达融合系统的示意图;
图2是本发明实施例提供的摄像头模组与激光雷达的连接示意图;
图3是本发明实施例提供的摄像头模组的立体结构示意图;
图4是本发明实施例提供的摄像头模组的主视图;
图5是本发明实施例提供的第一摄像头的主视图;
图6是本发明实施例提供的第一摄像头的俯视图;
图7是本发明实施例提供的第一摄像头的水平视场分布图;
图8是本发明实施例提供的透镜组旋转至第一预设位置的工作原理图;
图9是本发明实施例提供的透镜组旋转至第二预设位置的工作原理图;
图10是本发明实施例提供的摄像头支架与基座的装配示意图;
图11是图10中A的放大示意图;
图12是本发明实施例提供的第二外罩的装配示意图;
图13是图12中B的放大示意图;
图14是图12中C的放大示意图;
图15是本发明实施例提供的第一电路板的结构示意图;
图中:1-摄像头模组,11-第一摄像头,111-水平视场,112-盲区,12-第二摄像头,13-摄像头支架,131-第一安装筒,1311-第一安装部,1312-第一切角结构,132-第二安装筒,1321-第二安装部,1322-第二切角结构,133-第一连接部,1331-第一定位孔,2-激光雷达,21-转子,22-透镜组,23-基座,231-第二连接部,3-连接件,4-套接件,5-第一外罩,6-第二外罩,7-第一密封件,8-第二密封件,9-底座,91-第二定位件,92-插接件,10-第一电路板,101-第三定位孔,11-第二电路板。
具体实施方式
为使本发明实施例的目的、技术方案和优点更加清楚,下面将结合本发明实施例中的附图,对本发明实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例是本发明一部分实施例,而不是全部的实施例。通常在此处附图中描述和示出的本发明实施例的组件可以以各种不同的配置来布置和设计。
因此,以下对在附图中提供的本发明的实施例的详细描述并非旨在限制要求保护的本发明的范围,而是仅仅表示本发明的选定实施例。基于本发明中的实施例,本领域普通技术人员在没有作出创造性劳动前提下所获得的所有其他实施例,都属于本发明保护的范围。
应注意到:相似的标号和字母在下面的附图中表示类似项,因此,一旦某一项在一个附图中被定义,则在随后的附图中不需要对其进行进一步 定义和解释。
需要说明的是,在不冲突的情况下,本发明中的实施例及实施例中的特征可以相互组合。
实施例
结合参考图2和图3,本发明实施例提供一种摄像头与激光雷达融合系统,所述系统包括摄像头模组1和设置于所述摄像头模组1上方的激光雷达2。其中,所述摄像头模组1包括至少一个第一摄像头11、至少一个第二摄像头12和摄像头支架13,所述第一摄像头11用于实现360°环视图像采集,所述第二摄像头12用于实现前视图像采集。
结合参考图3、图5、图6、图8和图9,在一些实施例中,所述摄像头模组1可以包括四个所述第一摄像头11和一个所述第二摄像头12,四个所述第一摄像头11和一个所述第二摄像头12均安装于所述摄像头支架13的外壁上,且四个所述第一摄像头11均匀分布于所述摄像头支架13的外壁上,用于分别采集前视、后视、左视、右视四个方位的图像,所述第二摄像头12为彩色摄像头,其设置于所述第一摄像头11中用于前视方位图像采集的摄像头的正下方,用于实现前视方位的彩色图像采集。此外,根据实际需要,所述第二摄像头12也可以设置于所述第一摄像头11的正上方。
在另一些实施例中,所述第二摄像头12为彩色摄像头,其取代所述第一摄像头11中的用于采集前视方位图像的摄像头,用于实现前视方位的彩色图像采集。
在另一些实施例中,所述第一摄像头11还可以为一个,其通过旋转实现360°环视图像采集。
在另一些实施例中,所述摄像头模组1也可以仅包括四个所述第一摄像头11,而不使用所述第二摄像头12,其中四个所述第一摄像头11均为彩色摄像头,用于分别采集前视、后视、左视、右视四个方位的彩色图像。
在一个可选的示例中,所述第一摄像头11的水平视场角为129°,垂直视场角为81.8°,所述第二摄像头12的水平视场角为52°,垂直视场角为28.6°。
参考图7,本发明实施例通过将四个所述第一摄像头11均匀分布于所述摄像头支架13上的布置方式,使得四个所述第一摄像头11的水平视场111均匀分布,结构更加紧凑,能够有效降低四个所述第一摄像头11之间的盲区112。
在一些实施例中,所述摄像头支架13可以为筒状结构。结合参考图3、图4和图10,在一些实施例中,所述摄像头支架13可以为横截面呈圆形的筒状结构。
具体地,所述摄像头支架13包括第一安装筒131和与所述第一安装筒131相连的第二安装筒132,所述第二安装筒132设置于所述第一安装筒131的下方。此外,根据实际需要,所述第二安装筒132也可以设置于所述第一安装筒131的上方。
在一些实施例中,所述摄像头支架13可以为一体成型,在另一些实施例中,所述摄像头支架13也可由第一安装筒131与第二安装筒132通过固定连接形成。进一步地,所述第一安装筒131上设置有多个用于安装所述第一摄像头11的第一安装部1311,且多个所述第一安装部1311环向均匀分布于所述第一安装筒131上。
在一个可选的示例中,所述第一安装筒131上设置有四个用于安装所述第一摄像头11的第一安装部1311,四个所述第一安装部1311环向均匀分布于所述第一安装筒131上。
在一些实施例中,所述第一安装部1311上设置有第一切角结构1312,所述第一切角结构1312的设置可以减小边框对所述第一摄像头11的视场的遮挡。
在一些实施例中,所述第一安装部1311可以为第一镂空结构,所述第一镂空结构可以包括与所述第一摄像头11相配合的第一通孔,所述第一切角结构1312设置在所述第一通孔的边缘。
在另一些实施例中,所述第一安装部1311可以为第一凹槽结构,所述第一凹槽结构可以包括与所述第一摄像头11相配合的第一凹槽,所述第一切角结构1312设置在所述第一凹槽的边缘。
进一步地,所述第二安装筒132上设置有多个用于安装所述第二摄像 头12的第二安装部1321,且多个所述第二安装部1321环向均匀分布于所述第二安装筒132上。
在一个可选的示例中,所述第二安装筒132上设置有四个用于安装所述第二摄像头12的第二安装部1321,四个所述第二安装部1321环向均匀分布于所述第二安装筒132上。
在一些实施例中,所述第二安装部1321上设置有第二切角结构1322,所述第二切角结构1322的设置可以减少边框对所述第二摄像头12的视场的遮挡。
在一些实施例中,所述第二安装部1321可以为第二镂空结构,所述第二镂空结构可以包括与所述第二摄像头12相配合的第二通孔,所述第二切角结构1322设置在所述第二通孔的边缘。
在另一些实施例中,所述第二安装部1321可以为第二凹槽结构,所述第二凹槽结构可以包括与所述第二摄像头12相配合的第二凹槽,所述第二切角结构1322设置在所述第二凹槽的边缘。
虽然以上实施例仅列举了所述摄像头支架13为横截面呈圆形的筒状结构的情形,但本发明的保护范围不仅限于此,根据具体需要,本领域技术人员基于本发明的主旨,可以将所述摄像头支架13设计成其他形式的筒状结构,例如横截面呈正多边形的筒状结构。
结合参考图2、图8和图9,所述激光雷达2包括转子21和设置于所述转子21上的至少一个透镜组22,所述透镜组22可随所述转子21相对于所述摄像头模组1旋转,当所述透镜组22旋转到预设位置时,触发所述摄像头模组1进行拍照。其中,所述预设位置是指所述激光雷达2的视场与所述第一摄像头11的视场全部或部分重合的位置。
具体地,当所述透镜组22旋转到所述激光雷达2的视场与所述第一摄像头11的视场全部或部分重合的位置时,触发所述第一摄像头11和所述第二摄像头12进行拍照。例如,参考图8,当所述透镜组22旋转到第一预设位置时,可以触发所述第一摄像头11和所述第二摄像头12进行拍照,参考图9,当所述透镜组22旋转到第二预设位置时,也可以触发所述第一摄像头11和所述第二摄像头12进行拍照。通过触发所述第一摄像头11和 所述第二摄像头12进行拍照,完成所述激光雷达2与所述第一摄像头11和所述第二摄像头12的信息同步采集。
当信息同步采集完成后,可以通过后期参数标定,使所述第一摄像头11和所述第二摄像头12采集的图像数据与所述激光雷达2采集的点云数据融合,得到像素和点云之间的精确的坐标映射关系,从而实现空间匹配,提高了探测准确性,丰富了被探测物体的信息,可以赋予被探测物体更多的属性,包括姿态,运动状态,便于提供下一时刻可能的行为预测。
在一些实施例中,所述透镜组22可以包括发射透镜组和接收透镜组,所述发射透镜组用于出射探测光,所述接收透镜组用于接收探测光。在一个实施例中,所述发射透镜组可以包括一个发射透镜,所述接收透镜组可以包括一个接收透镜。在其它实施例中,所述发射透镜和所述接收透镜的数目不限于是一个,也可以是两个、三个或四个等。
参考图2,在一些实施例中,所述激光雷达2还包括基座23,所述激光雷达2通过所述基座23与所述摄像头支架13相连。
结合参考图3、图10和图11,所述摄像头支架13的顶端设置有第一连接部133,所述基座23上对应设置有第二连接部231,所述第一连接部133与所述第二连接部231配合连接,例如插接连接,以实现所述基座23与所述摄像头支架13之间的主定位。
所述第一连接部133的顶端设置有至少一个第一定位孔1331,所述第二连接部231的底端设置有与所述第一定位孔1331相对应的第二定位孔,所述第一定位孔1331和所述第二定位孔内穿设置有第一定位件(例如销钉),以实现所述基座23与所述摄像头支架13之间的副定位。
进一步地,所述第一连接部133与所述第二连接部231之间通过连接件3(例如紧盯螺钉)相连。需要说明的是,所述连接件3的安装方向为沿所述摄像头支架13的径向方向。
本发明实施例通过上述主定位和副定位相结合的方式,以限制所述基座23和所述摄像头支架13之间沿径向方向的相对运动,并通过所述连接件3使所述基座23与所述摄像头支架13相连,以限制所述基座23和所述摄像头支架13之间沿轴向方向的相对运动。采用这样的方式可以使所述激 光雷达2与所述摄像头模组1在安装时保持独立的完整性,有利于实现模块化,方便拆装及后期维护。
结合参考图12和图13,在一些实施例中,所述系统还包括套接件4,所述套接件4设置于所述基座23的外侧。在一个实施例中,所述套接件4可以采用环状结构,在其它实施例中,所述套接件4也可以采用其他形状的结构,只要保证所述套接件4的形状与所述基座23的形状相适配即可。在一个实施例中,所述套接件4可以采用金属材质制成。在其它实施例中,所述套接件4可以采用塑料或其他材质制成。
结合参考图1、图2、图12和图13,在一些实施例中,所述系统还包括第一外罩5和第二外罩6,所述第一外罩5设置在所述激光雷达2的外侧,所述第二外罩6设置在所述摄像头模组1的外侧。所述套接件4的一端与所述第一外罩5相连,所述套接件4的另一端与所述第二外罩6相连。
在一些实施例中,所述第一外罩5和所述第二外罩6均为内部中空的圆台状结构,且所述第一外罩5和所述第二外罩6相对于所述套接件4对称设置,保证具有良好的外观效果;同时,保证所述第一外罩5和所述第二外罩6在制造时可以共用一副模具,有利于降低制造成本。
在一些实施例中,所述第一外罩5、所述第二外罩6和所述套接件4可以为一体成型。在另一些实施例中,所述第一外罩5、所述第二外罩6和所述套接件4可以为非一体成型。
本发明实施例通过布置所述第一外罩5和所述第二外罩6,保证所有元件均隐藏在所述第一外罩5和所述第二外罩6内部,使所述系统具有更好的完整性。
在一些实施例中,所述套接件4的顶端和底端分别设置有第一密封槽和第二密封槽,所述第一密封槽和所述第二密封槽中分别设置有第一密封件7和第二密封件8。
在一些实施例中,所述套接件4分别与所述第一外罩5和所述第二外罩6螺纹连接,通过螺纹旋压的方式来压紧第一密封件7和第二密封件8,能够提供合适的压紧力,有利于保证IP等级。
在一些实施例中,所述第一密封件7和所述第二密封件8可以为密封 圈,根据具体情况的需要,所述密封圈可以选用O形密封圈、Y形密封圈、V形密封圈或者U形密封圈。
结合参考图2和图14,在一些实施例中,所述系统还包括底座9,所述底座9设置于所述第二外罩6的下方,且与所述摄像头支架13的底端相连。
结合参考图12,在一些实施例中,所述系统还包括第一电路板10和第二电路板11,所述第一电路板10和所述第二电路板11均设置于所述底座9的内部,且所述第二电路板11位于所述第一电路板10下方。
在一些实施例中,所述第一电路板10和所述第二电路板11均为环状结构。
本发明实施例采用将一整块大的电路板拆分成上下布置的两块电路板形式,相比于采用一整块大的电路板水平布置的形式,节约了所述底座9的安装空间。
参考图15,在一些实施例中,所述第一电路板10上设置有至少一个第三定位孔101,所述第二电路板11上设置有与所述第三定位孔101相对应的第四定位孔,所述底座9上设置有第二定位件91(例如定位柱),所述第二定位件91穿设于所述第三定位孔101和所述第四定位孔内,以实现所述第一电路板10与所述第二电路板11之间的定位,方便所述第一电路板10与所述第二电路板11之间的安装与拆卸。
在一些实施例中,所述底座9内还设置有至少一个插接件92,所述第一电路板10和所述第二电路板11分别与所述插接件92相连,以实现为所述激光雷达2和所述摄像头模组1供电以及进行数据传输及处理。
应当理解的是,本发明的上述具体实施例仅仅用于示例性说明或解释本发明的原理,而不构成对本发明的限制。因此,在不偏离本发明的精神和范围的情况下所做的任何修改、等同替换、改进等,均应包含在本发明的保护范围之内。

Claims (12)

  1. 一种摄像头与激光雷达融合系统,其特征在于,包括激光雷达(2)和摄像头模组(1),所述激光雷达(2)与所述摄像头模组(1)相连;
    所述摄像头模组(1)包括至少一个第一摄像头(11)和至少一个第二摄像头(12),所述第一摄像头(11)用于实现360°环视图像采集,所述第二摄像头(12)用于实现前视图像采集;
    所述激光雷达(2)包括转子(21)和设置于所述转子(21)上的至少一个透镜组(22),所述透镜组(22)可随所述转子(21)相对于所述摄像头模组(1)旋转;
    当所述透镜组(22)旋转到预设位置时,触发所述摄像头模组(1)进行拍照。
  2. 根据权利要求1所述的摄像头与激光雷达融合系统,其特征在于,所述摄像头模组(1)还包括摄像头支架(13);
    所述激光雷达(2)包括基座(23),所述激光雷达(2)通过所述基座(23)与所述摄像头支架(13)相连。
  3. 根据权利要求2所述的摄像头与激光雷达融合系统,其特征在于,所述摄像头模组(1)包括四个所述第一摄像头(11)和一个所述第二摄像头(12),所述第一摄像头(11)和所述第二摄像头(12)均设置于所述摄像头支架(13)的外壁上;
    四个所述第一摄像头(11)沿所述摄像头支架(13)的外壁的周向均匀分布,用于分别采集前视、后视、左视、右视四个方位的图像;
    所述第二摄像头(12)为彩色摄像头,其位于所述第一摄像头(11)中用于采集前视方位图像的摄像头的正上方或正下方,用于实现前视方位的彩色图像采集。
  4. 根据权利要求3所述的摄像头与激光雷达融合系统,其特征在于,所述第二摄像头(12)为彩色摄像头,其取代所述第一摄像头(11)中的用于采集前视方位图像的摄像头,用于实现前视方位的彩色图像采集。
  5. 根据权利要求3所述的摄像头与激光雷达融合系统,其特征在于, 所述摄像头支架(13)包括第一安装筒(131)和第二安装筒(132),所述第二安装筒(132)与所述第一安装筒(131)上下布置;
    所述第一安装筒(131)上设置有多个用于安装所述第一摄像头(11)的第一安装部(1311),多个所述第一安装部(1311)环向均匀分布于所述第一安装筒(131)上,所述第一安装部(1311)为第一镂空结构,所述第一镂空结构包括与所述第一摄像头(11)相配合的第一通孔;
    所述第二安装筒(132)上设置有用于安装所述第二摄像头(12)的第二安装部(1321),所述第二安装部(1321)为第二镂空结构,所述第二镂空结构包括与所述第二摄像头(12)相配合的第二通孔。
  6. 根据权利要求5所述的摄像头与激光雷达融合系统,其特征在于,所述第一安装部(1311)和所述第二安装部(1321)上分别设置有第一切角结构(1312)和第二切角结构(1322)。
  7. 根据权利要求2所述的摄像头与激光雷达融合系统,其特征在于,所述摄像头支架(13)的一端设置有第一连接部(133),所述基座(23)上设置有与所述第一连接部(133)相配合的第二连接部(231),所述第一连接部(133)和所述第二连接部(231)之间通过连接件(3)相连。
  8. 根据权利要求7所述的摄像头与激光雷达融合系统,其特征在于,所述基座(23)的外侧设置有套接件(4),所述套接件(4)为环状结构。
  9. 根据权利要求8所述的摄像头与激光雷达融合系统,其特征在于,还包括底座(9),所述底座(9)与所述摄像头支架(13)相连。
  10. 根据权利要求9所述的摄像头与激光雷达融合系统,其特征在于,还包括第一电路板(10)和第二电路板(11),所述第一电路板(10)和所述第二电路板(11)均设置于所述底座(9)的内部,且所述第二电路板(11)位于所述第一电路板(10)下方。
  11. 根据权利要求8所述的摄像头与激光雷达融合系统,其特征在于,所述激光雷达(2)的外侧设置有第一外罩(5),所述摄像头模组(1)的外侧设置有第二外罩(6),所述套接件(4)的一端与所述第一外罩(5)相连,所述套接件(4)的另一端与所述第二外罩(6)相连。
  12. 根据权利要求11所述的摄像头与激光雷达融合系统,其特征在于, 所述第一外罩(5)与所述套接件(4)之间设置有第一密封件(7),所述第二外罩(6)与所述套接件(4)之间设置有第二密封件(8)。
PCT/CN2019/108119 2018-10-30 2019-09-26 一种摄像头与激光雷达融合系统 WO2020088165A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811276492.1A CN109253720B (zh) 2018-10-30 2018-10-30 一种摄像头与激光雷达融合系统
CN201811276492.1 2018-10-30

Publications (1)

Publication Number Publication Date
WO2020088165A1 true WO2020088165A1 (zh) 2020-05-07

Family

ID=65043117

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/108119 WO2020088165A1 (zh) 2018-10-30 2019-09-26 一种摄像头与激光雷达融合系统

Country Status (2)

Country Link
CN (1) CN109253720B (zh)
WO (1) WO2020088165A1 (zh)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111522020A (zh) * 2020-06-23 2020-08-11 山东亦贝数据技术有限公司 一种园区活动要素混合定位系统及方法
CN112797990A (zh) * 2020-12-24 2021-05-14 深圳市优必选科技股份有限公司 一种存储介质、机器人及其导航位图生成方法及装置
CN113674422A (zh) * 2021-08-27 2021-11-19 中汽创智科技有限公司 一种数据同步采集方法、控制模块、系统及存储介质
CN116687386A (zh) * 2023-08-07 2023-09-05 青岛市畜牧工作站(青岛市畜牧兽医研究所) 牛体形数据综合校准的雷达探测系统及方法
CN117706942A (zh) * 2024-02-05 2024-03-15 四川大学 一种环境感知与自适应驾驶辅助电子控制方法及系统

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109253720B (zh) * 2018-10-30 2020-04-24 上海禾赛光电科技有限公司 一种摄像头与激光雷达融合系统
CN110082739B (zh) * 2019-03-20 2022-04-12 深圳市速腾聚创科技有限公司 数据同步方法和设备
CN109884617A (zh) * 2019-03-29 2019-06-14 广州文远知行科技有限公司 安全监控设备及包含其的运载工具
CN110261869A (zh) * 2019-05-15 2019-09-20 深圳市速腾聚创科技有限公司 目标探测系统和数据融合方法
CN110376598A (zh) * 2019-08-14 2019-10-25 深圳市镭神智能系统有限公司 一种视觉系统与激光雷达融合装置及融合系统
CN110488318A (zh) * 2019-08-16 2019-11-22 长沙行深智能科技有限公司 环视相机与雷达同步的曝光控制方法、装置、介质及设备
CN110471085B (zh) * 2019-09-04 2023-07-04 深圳市镭神智能系统有限公司 一种轨道检测系统
CN110596729A (zh) * 2019-09-12 2019-12-20 北京京东乾石科技有限公司 激光扫描仪及自动驾驶汽车
CN110753167B (zh) * 2019-11-13 2022-04-08 广州文远知行科技有限公司 时间同步方法、装置、终端设备及存储介质
CN111435162B (zh) * 2020-03-03 2021-10-08 深圳市镭神智能系统有限公司 激光雷达与相机同步方法、装置、设备和存储介质
CN112394347B (zh) * 2020-11-18 2022-12-23 杭州海康威视数字技术股份有限公司 一种目标检测方法、装置及设备
CN117616309A (zh) * 2021-09-23 2024-02-27 华为技术有限公司 信号处理方法、信号传输方法及装置

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150177007A1 (en) * 2013-12-23 2015-06-25 Automotive Research & Testing Center Autonomous driver assistance system and autonomous driving method thereof
CN206848481U (zh) * 2017-07-03 2018-01-05 百度在线网络技术(北京)有限公司 车载信息采集系统
CN107797117A (zh) * 2016-09-06 2018-03-13 夏普株式会社 自主行驶装置
CN107990879A (zh) * 2017-11-28 2018-05-04 佛山市安尔康姆航空科技有限公司 无人机云台的控制方法
CN108332716A (zh) * 2018-02-07 2018-07-27 徐州艾特卡电子科技有限公司 一种自动驾驶汽车环境感知系统
CN108628301A (zh) * 2017-03-20 2018-10-09 通用汽车环球科技运作有限责任公司 用于操作自动驾驶车辆的时间数据关联
CN109253720A (zh) * 2018-10-30 2019-01-22 上海禾赛光电科技有限公司 一种摄像头与激光雷达融合系统
CN110261869A (zh) * 2019-05-15 2019-09-20 深圳市速腾聚创科技有限公司 目标探测系统和数据融合方法

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170056932A1 (en) * 2015-08-26 2017-03-02 Denso Corporation Detection device cleaning apparatus having fan
CN107219533B (zh) * 2017-08-04 2019-02-05 清华大学 激光雷达点云与图像融合式探测系统

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150177007A1 (en) * 2013-12-23 2015-06-25 Automotive Research & Testing Center Autonomous driver assistance system and autonomous driving method thereof
CN107797117A (zh) * 2016-09-06 2018-03-13 夏普株式会社 自主行驶装置
CN108628301A (zh) * 2017-03-20 2018-10-09 通用汽车环球科技运作有限责任公司 用于操作自动驾驶车辆的时间数据关联
CN206848481U (zh) * 2017-07-03 2018-01-05 百度在线网络技术(北京)有限公司 车载信息采集系统
CN107990879A (zh) * 2017-11-28 2018-05-04 佛山市安尔康姆航空科技有限公司 无人机云台的控制方法
CN108332716A (zh) * 2018-02-07 2018-07-27 徐州艾特卡电子科技有限公司 一种自动驾驶汽车环境感知系统
CN109253720A (zh) * 2018-10-30 2019-01-22 上海禾赛光电科技有限公司 一种摄像头与激光雷达融合系统
CN110261869A (zh) * 2019-05-15 2019-09-20 深圳市速腾聚创科技有限公司 目标探测系统和数据融合方法

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111522020A (zh) * 2020-06-23 2020-08-11 山东亦贝数据技术有限公司 一种园区活动要素混合定位系统及方法
CN112797990A (zh) * 2020-12-24 2021-05-14 深圳市优必选科技股份有限公司 一种存储介质、机器人及其导航位图生成方法及装置
WO2022134937A1 (zh) * 2020-12-24 2022-06-30 深圳市优必选科技股份有限公司 一种存储介质、机器人及其导航位图生成方法及装置
CN113674422A (zh) * 2021-08-27 2021-11-19 中汽创智科技有限公司 一种数据同步采集方法、控制模块、系统及存储介质
CN116687386A (zh) * 2023-08-07 2023-09-05 青岛市畜牧工作站(青岛市畜牧兽医研究所) 牛体形数据综合校准的雷达探测系统及方法
CN116687386B (zh) * 2023-08-07 2023-10-31 青岛市畜牧工作站(青岛市畜牧兽医研究所) 牛体形数据综合校准的雷达探测系统及方法
CN117706942A (zh) * 2024-02-05 2024-03-15 四川大学 一种环境感知与自适应驾驶辅助电子控制方法及系统
CN117706942B (zh) * 2024-02-05 2024-04-26 四川大学 一种环境感知与自适应驾驶辅助电子控制方法及系统

Also Published As

Publication number Publication date
CN109253720A (zh) 2019-01-22
CN109253720B (zh) 2020-04-24

Similar Documents

Publication Publication Date Title
WO2020088165A1 (zh) 一种摄像头与激光雷达融合系统
US10491883B2 (en) Image capturing device
CA2902430C (en) Methods, systems, and apparatus for multi-sensory stereo vision for robotics
CN108765496A (zh) 一种多视点汽车环视辅助驾驶系统及方法
US20130038732A1 (en) Field of view matching video display system
TW201533612A (zh) 感測器與投射器之校準技術
US20200348127A1 (en) 3-d environment sensing by means of projector and camera modules
EP3351899A1 (en) Method and device for inpainting of colourised three-dimensional point clouds
US20180274917A1 (en) Distance measurement system, mobile object, and component
KR101203816B1 (ko) 로봇 물고기 위치 인식 시스템 및 로봇 물고기 위치 인식 방법
CN110750153A (zh) 一种无人驾驶车辆的动态虚拟化装置
WO2016040229A1 (en) Method for optically measuring three-dimensional coordinates and calibration of a three-dimensional measuring device
CN210998737U (zh) 移动机器人
TW201605247A (zh) 影像處理系統及方法
CN207475756U (zh) 机器人红外立体视觉系统
WO2020129398A1 (ja) 観測装置
US20220198202A1 (en) Camera for vehicle and parking assistance apparatus having the same
JP2015032936A (ja) 撮像装置
US20150172606A1 (en) Monitoring camera apparatus with depth information determination
CN215373873U (zh) 三维检测相机
DE102014206677A1 (de) Kamerasystem und Verfahren zum Erfassen eines Umfeldes eines Fahrzeuges
Li et al. Extrinsic calibration between a stereoscopic system and a LIDAR with sensor noise models
TW201739648A (zh) 影像疊合之方法
CN212905464U (zh) 激光雷达三维扫描相机
CN113218327A (zh) 三维检测相机

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19880544; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 19880544; Country of ref document: EP; Kind code of ref document: A1)