WO2019237782A1 - 3D information detection device - Google Patents

3D information detection device

Info

Publication number
WO2019237782A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
dlp
dlp projector
detection device
information detection
Prior art date
Application number
PCT/CN2019/078168
Other languages
English (en)
French (fr)
Inventor
张华林
何品将
刘娟
刘静
Original Assignee
杭州海康机器人技术有限公司
Priority date
Filing date
Publication date
Application filed by 杭州海康机器人技术有限公司 filed Critical 杭州海康机器人技术有限公司
Priority to US17/252,178 priority Critical patent/US20210262789A1/en
Priority to EP19819633.9A priority patent/EP3809095A4/en
Publication of WO2019237782A1 publication Critical patent/WO2019237782A1/zh

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2513Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with several lines being projected in more than one direction, e.g. grids, patterns
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2545Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with one projection direction and several detection directions, e.g. stereo

Definitions

  • the present application relates to the technical field of 3D information acquisition of objects, and in particular, to a 3D information detection device.
  • as technology advances, the visual performance of machines keeps improving; the image acquisition device of a machine is its core component and determines the machine's positioning accuracy.
  • in the field of machine vision, ordinary image acquisition devices can only obtain two-dimensional information of objects and cannot obtain their 3D (three-dimensional) information, which obviously cannot meet practical requirements.
  • an image acquisition device currently used for acquiring 3D information of an object includes a projector and a camera.
  • a structured light detection system is constructed by the projector and the camera, thereby realizing the acquisition of 3D information of the object.
  • the projectors and cameras included in the current image acquisition devices work independently of each other, and can only acquire two-dimensional images of objects.
  • the acquired two-dimensional images need to be processed further on a personal computer (PC) before the 3D information of the object can be obtained.
  • in this approach, the 3D information of the object can only be obtained with the cooperation of the PC, resulting in the problem that the 3D information of the object cannot be obtained directly.
  • This application provides a 3D information detection device to solve the problem that the 3D information of an object cannot be directly obtained in the current method for acquiring 3D information of an object.
  • a 3D information detection device includes a DLP projector, a camera, a controller, and an image processing module.
  • the DLP projector is used to project structured light onto an object
  • the camera is used to obtain an image of the object on which the structured light is projected.
  • the image processing module is connected to the camera and is configured to obtain 3D information of the object by processing the image.
  • the controller is connected to both the DLP projector and the camera and controls the operation of both.
  • the DLP projector (100) includes a DLP driving device (110) and a projection light engine (120); the DLP driving device (110) is connected to the projection light engine (120) and drives the projection light engine (120) to project the encoded structured light onto the object, and the DLP driving device (110) is connected to the controller (300).
  • controller (300) is integrated on the DLP driving device (110).
  • the DLP projector (100) further includes a housing (600); the controller (300), the DLP driving device (110), the projection light engine (120), and the image processing module (400) are all arranged in the housing (600).
  • the 3D information detection device further includes a mounting base (500); the camera (200) and the DLP projector (100) are both disposed on the mounting base (500), and the camera (200) is located on one side of the DLP projector (100).
  • the camera (200) is movably disposed on the mounting base (500) and can move toward or away from the DLP projector (100).
  • the camera (200) includes a camera body and a camera lens, the camera body is movably matched with the mounting base (500), and the camera lens is rotatably matched with the camera body.
  • the included angle between the optical axis of the camera lens of the camera (200) and the optical axis of the projection lens of the DLP projector (100) is 5° to 45°.
  • the number of the cameras (200) is two, and the two cameras (200) are symmetrically arranged on both sides of the DLP projector (100).
  • the projection lens of the DLP projector (100) is a zero-offset (non-off-axis) lens; and/or the image processing module (400) is a GPU.
  • the controller can control the operation of the DLP projector and the camera, so that the two form a jointly triggered whole: the camera can capture the structured light projected by the DLP projector in time, and the captured images can be processed promptly by the image processing module.
  • during detection, the image processing module performs 3D analysis directly on the image acquired by the camera, which contains the pattern projected by the DLP projector, to obtain the 3D information of the object; therefore, the 3D information of the object can be obtained directly without the cooperation of a PC.
  • FIG. 1 is a schematic structural diagram of a 3D information detection device disclosed in an embodiment of the present application.
  • FIG. 2 is a schematic diagram of an off-axis lens disclosed in an embodiment of the present application.
  • FIG. 3 is a schematic diagram of a zero-offset (non-off-axis) lens disclosed in an embodiment of the present application.
  • 100 - DLP projector, 110 - DLP driving device, 120 - projection light engine, 200 - camera, 300 - controller, 400 - image processing module, 500 - mounting base, 600 - housing.
  • an embodiment of the present application discloses a 3D information detection device.
  • the 3D information detection device of the present application includes a DLP projector 100, a camera 200, a controller 300, and an image processing module 400.
  • the DLP projector 100 is a projection device based on DLP (Digital Light Processing) technology.
  • the DLP technology can digitally process an image signal and then project the light.
  • the DLP projector 100 has a codeable characteristic.
  • the DLP projector 100 can be used to project structured light onto an object (object under test), where the structured light can be encoded structured light. In a specific work process, the DLP projector 100 projects a series of coded patterns, which are formed by structured light.
  • structured-light techniques are then used to calculate the 3D information of the object, which can improve the detection accuracy.
  • structured light can be analyzed by structured-light techniques, which are a form of active triangulation.
  • the basic principle is that a light projection device projects controllable light spots, light stripes, or light planes onto the surface of the object to form feature points.
  • after the structured light is projected onto the surface of the object, it is modulated by the object's height; the modulated structured light is captured as an image and transmitted to the analysis device for calculation, from which the 3D information of the object, that is, its three-dimensional data, can be obtained.
  • the camera 200 is used to acquire an image of an object on which structured light is projected.
  • the image processing module 400 is connected to the camera 200 and is used to obtain 3D information of the object through image processing. As described in the previous paragraph, the specific processing process and calculation of the 3D information are well-known technologies, and will not be repeated here.
  • the image processing module 400 may be based on the x86 system architecture and use a GPU to obtain images from the camera 200 and perform surface structured-light processing on them.
  • the image processing module 400 may be a GPU (Graphics Processing Unit), which is not limited to this.
  • the controller 300 is connected to both the DLP projector 100 and the camera 200 and controls the operations of both.
  • the controller 300 may control the DLP projector 100 to work first, and then the camera 200 to work after a preset time.
  • the preset time may be 1 second, 2 seconds, etc.
  • the embodiment does not specifically limit the size of the preset time.
  • each time the DLP projector 100 projects a pattern, the camera 200 takes a picture after the preset time, and the image processing module 400 can process the picture taken by the camera 200 to calculate the 3D information of the object.
  • the controller 300 may control the DLP projector 100 and the camera 200 to work synchronously. In this case, each time the DLP projector 100 projects a picture, at the same time, the camera 200 takes a picture, and the image processing module 400 can process the picture taken by the camera 200 in real time, and then calculate the 3D information of the object.
  • the controller 300 may be equivalent to a linkage control device capable of controlling synchronization between the DLP projector 100 and the camera 200.
  • the control performed by the controller 300 may be implemented in software or in a hardware circuit. There are various control schemes in the art for making two devices work simultaneously, which are not listed here one by one.
  • the controller 300 may be an FPGA (Field Programmable Gate Array) control chip.
  • in the 3D information detection device disclosed in the embodiments of the present application, the controller 300 can control the DLP projector 100 and the camera 200 to work synchronously, so that the two form a jointly triggered whole: the camera 200 can capture the structured light projected by the DLP projector 100 in time, and the captured images can be processed promptly by the image processing module 400.
  • during detection, the image processing module performs 3D analysis directly on the image acquired by the camera, which contains the pattern projected by the DLP projector, to obtain the 3D information of the object; therefore, the 3D information of the object can be obtained directly without the cooperation of a PC.
  • at the same time, the embodiment of the present application integrates the DLP projector 100, the camera 200, and the image processing module 400 into a single unit that can provide the user with the 3D information of the object directly through the 3D algorithm, which greatly facilitates use and requires no further processing by the user.
  • in the detection device disclosed in the embodiments of the present application, the DLP projector 100 uses DLP projection technology to project a series of encoded patterns (formed by structured light) onto the object; the camera 200 then captures an image of the patterned object surface; finally, the image processing module 400 decodes the image captured by the camera 200, which can accurately restore the depth information of the object surface (an illustrative sketch of this overall flow follows at the end of this section).
  • in addition, the camera 200 itself provides the two-dimensional information of the captured image.
  • finally, the structured-light technique is used to measure the 3D information of the object.
  • the 3D information detection device disclosed in the embodiments of the present application can be widely used in the fields of robot positioning, 3D scanning, 3D measurement, and the like.
  • the use of DLP projection technology can flexibly implement the encoding of different patterns, and then can project a higher-precision surface structured light.
  • the DLP projector 100 may encode the structured light using a Gray code encoding format or a sine code encoding format.
  • a person skilled in the art can know a specific implementation manner of encoding structured light by using a Gray code encoding format or a sine code encoding format, and details are not described herein again.
  • the DLP projector 100 may include a DLP driving device 110 and a projection light engine 120.
  • the DLP driving device 110 is connected to the projection light engine 120 and drives the projection light engine 120 to project the encoded structured light onto the object.
  • the DLP driving device 110 is connected to the controller 300, and the controller 300 can be integrated on the DLP driving device. In this way, space can be saved and the controller can control the camera 200 and the DLP driving device 110 simultaneously.
  • the controller 300 can control the DLP driving device 110, thereby realizing projection by the projection light engine 120.
  • the projection light engine 120 includes a projection lens.
  • the projection lens can be a 12 mm or 8 mm lens; specifically, the projection lens can focus at working distances such as 500 mm and 1000 mm, although the focusing distance is of course not limited to these.
  • the projection light engine 120 may use a DMD (Digital Micromirror Device) chip from TI (Texas Instruments) to implement DLP projection.
  • the projection light engine 120 may include red, green, and blue LED light sources, so that the DLP light source can project structured light of different colors.
  • the DLP projector 100 can provide patterns of different colors according to different scenes.
  • the DLP driving device 110 may include an FPGA module.
  • the FPGA module controls the generation of the Gray-code patterns, the generated coded patterns are stored in a memory, and they are then projected by the projection light engine 120.
  • the camera 200 includes a camera body and a camera lens.
  • the camera 200 may use a 1.3-megapixel, 3-megapixel, or other-resolution image sensor.
  • the image sensor may be a high-speed area-array CCD (Charge-Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor, although it is of course not limited to these.
  • the camera lens can be a standard FA (Factory Automation) lens, and the focal length of the standard FA lens can be 8 mm or 12 mm, although it is of course not limited to these.
  • the DLP projector 100 may include a housing 600, and the controller 300, the DLP driving device 110, the projection light engine 120, and the image processing module 400 may all be disposed in the housing 600, thereby facilitating on-site installation of these components.
  • the housing 600 may be a metal shell, which can play a better role in heat dissipation, so as to timely dissipate the heat generated by the components placed inside it.
  • the 3D information detection device disclosed in the embodiment of the present application may further include a mounting base 500.
  • Both the camera 200 and the DLP projector 100 may be disposed on the mounting base 500 and the camera 200 is located on one side of the DLP projector 100, that is, the camera 200 may be located on either side of the DLP projector 100.
  • the mounting base 500 can be fixed at the inspection site first, and can then provide mounting positions for the DLP projector 100 and the camera 200.
  • the DLP projector 100 and the camera 200 can also be installed on the mounting base 500 first, and finally the formed whole can be installed on site.
  • the camera 200 is movably disposed on the mounting base 500 and can move toward or away from the DLP projector 100, so that the position of the camera 200 can be adjusted to adjust the shooting position.
  • the camera 200 may include a camera body and a camera lens, the camera body may be connected to the mounting base 500, and the camera body and the mounting base 500 may be movably matched.
  • the camera lens is rotatably fitted to the camera body, so that the shooting angle of the camera 200 can be adjusted more flexibly.
  • the number of cameras 200 may be two, and the two cameras 200 may be symmetrically arranged on both sides of the DLP projector 100.
  • the use of two cameras 200 can better compensate for the problem of the blind area of the field of vision of one camera 200, and can further improve the detection accuracy.
  • of course, when there is only one camera 200, the 3D information of an object can also be detected.
  • the distance between the two cameras 200 can be referred to as the baseline distance.
  • according to the triangulation principle, the larger the baseline distance, the higher the depth resolution during shooting.
  • the camera 200 is movably disposed on the mounting base 500, so that the baseline distance between the two cameras 200 can be adjusted more flexibly to achieve the effect of flexibly adjusting the depth resolution.
  • the user can flexibly adjust the baseline distance between the two cameras 200 according to the use environment.
  • the end face of the projection lens of the DLP projector 100 and the end faces of the camera lenses of the two cameras 200 may lie on the same straight line, with the projection lens located midway between the camera lenses of the two cameras 200; that is, the camera lenses of the two cameras 200 are symmetrically arranged on both sides of the projection lens.
  • an included angle between the optical axis of the camera lens of the camera 200 and the optical axis of the projection lens of the DLP projector 100 may be 5° to 45°.
  • the camera 200 with the above specific structure can adjust the shooting direction of the camera lens, and can flexibly adjust the angle between the optical axis of the camera lens of the camera 200 and the optical axis of the projection lens of the DLP projector 100.
  • since the camera lens of the camera 200 can be rotated, the 3D information detection device can perform 3D scanning over a wide range, thereby extending the detection range.
  • the projection lens of the DLP projector 100 can be an off-axis lens. As shown in FIG. 2, the projection surface formed by an off-axis lens lies on one side of its optical axis; an off-axis lens can make the edge position of the projected image compatible with the installation position of the DLP projector 100, but because the projected image is not on the main axis of the projection lens, the distortion and imaging quality of the projection lens deteriorate. Based on this, in an optional solution, the projection lens of the DLP projector 100 may be a zero-offset (non-off-axis) lens, as shown in FIG. 3; in this case, the projection surface formed by the zero-offset lens is symmetric about its optical axis.
  • a zero-offset lens can improve the quality of the projected pattern, ultimately improving the detection accuracy of the detection device and yielding more accurate 3D information of the object.
  • the above embodiments of the present application mainly describe the differences between the embodiments; as long as the optimization features of different embodiments are not contradictory, they can be combined to form better embodiments, which, for brevity, are not described here again.
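
The overall detection flow described in the points above (the controller triggers the DLP projector, the camera captures each projected pattern, and the image processing module decodes the captured images into 3D information on the device itself) can be summarized with the following minimal sketch. It is an illustration only; the object names and methods (`projector.project`, `camera.grab`, `decoder.decode`) are hypothetical placeholders and are not part of the patent.

```python
# Illustrative sketch of the detection flow described above: project a sequence of
# encoded patterns, capture one image per pattern, and decode the images into
# 3D information on the device. All names here are hypothetical placeholders.

def acquire_3d(projector, camera, decoder, num_patterns):
    """Project `num_patterns` encoded patterns, capture each one, and decode depth."""
    images = []
    for idx in range(num_patterns):
        projector.project(idx)        # controller triggers the DLP projector
        images.append(camera.grab())  # camera is triggered in sync or after a preset delay
    return decoder.decode(images)     # image processing module restores the 3D data directly
```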

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A 3D information detection device includes a DLP projector (100), a camera (200), a controller (300), and an image processing module (400). The DLP projector (100) is configured to project encoded structured light onto an object; the camera (200) is configured to acquire an image of the object onto which the structured light is projected; the image processing module (400) is connected to the camera (200) and is configured to obtain 3D information of the object by processing the image; and the controller (300) is connected to both the DLP projector (100) and the camera (200) and controls the operation of both. The device solves the problem that the 3D information of an object currently cannot be obtained directly.

Description

3D信息检测设备
本申请要求于2018年6月15日提交中国专利局、申请号为201820937584.9发明名称为“3D信息检测设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及物体的3D信息获取技术领域,尤其涉及一种3D信息检测设备。
背景技术
随着技术的进步,机器的视觉性能越来越好,机器的图像获取装置是机器的核心部件,决定着机器的定位精度。在机器视觉领域中,普通的图像获取装置只能获取物体的二维信息,无法获取物体的3D(3Dimensions,三维)信息,很显然无法满足实际要求。
而为了获取物体的3D信息,目前用于获取物体3D信息的图像获取装置包括投影仪和相机,通过投影仪和相机来搭建结构光检测系统,进而实现对物体3D信息的获取。但是,目前的图像获取装置中包括的投影仪和相机相互独立工作,而且只能获取物体的二维图像,获取的二维图像需要后续在PC(personal computer,个人计算机)上被继续处理,进而才能获取物体的3D信息。此种方式中,基于PC的协作才能获取物体3D信息,导致存在无法直接获取物体3D信息的问题。
Summary
The present application provides a 3D information detection device to solve the problem that the current way of acquiring the 3D information of an object cannot obtain that information directly.
To solve the above problem, the present application adopts the following technical solution:
A 3D information detection device includes a DLP projector, a camera, a controller, and an image processing module. The DLP projector is configured to project structured light onto an object; the camera is configured to acquire an image of the object onto which the structured light is projected; the image processing module is connected to the camera and is configured to obtain 3D information of the object by processing the image; and the controller is connected to both the DLP projector and the camera and controls the operation of both.
Optionally, the DLP projector (100) includes a DLP driving device (110) and a projection light engine (120); the DLP driving device (110) is connected to the projection light engine (120) and drives the projection light engine (120) to project the encoded structured light onto the object, and the DLP driving device (110) is connected to the controller (300).
Optionally, the controller (300) is integrated on the DLP driving device (110).
Optionally, the DLP projector (100) further includes a housing (600), and the controller (300), the DLP driving device (110), the projection light engine (120), and the image processing module (400) are all arranged in the housing (600).
Optionally, the device further includes a mounting base (500); the camera (200) and the DLP projector (100) are both arranged on the mounting base (500), and the camera (200) is located on one side of the DLP projector (100).
Optionally, the camera (200) is movably arranged on the mounting base (500) and can move toward or away from the DLP projector (100).
Optionally, the camera (200) includes a camera body and a camera lens; the camera body is movably fitted to the mounting base (500), and the camera lens is rotatably fitted to the camera body.
Optionally, the included angle between the optical axis of the camera lens of the camera (200) and the optical axis of the projection lens of the DLP projector (100) is 5° to 45°.
Optionally, the number of cameras (200) is two, and the two cameras (200) are symmetrically arranged on both sides of the DLP projector (100).
Optionally, the projection lens of the DLP projector (100) is a zero-offset (non-off-axis) lens; and/or the image processing module (400) is a GPU.
The technical solution adopted in the present application can achieve the following beneficial effects:
In the 3D information detection device disclosed in the present application, the controller can control the operation of the DLP projector and the camera, so that the two form a jointly triggered whole: the camera can capture the structured light projected by the DLP projector in time, and the captured images can be processed promptly by the image processing module. During detection, the image processing module performs 3D analysis directly on the image acquired by the camera, which contains the pattern projected by the DLP projector, to obtain the 3D information of the object; therefore, the 3D information of the object can be obtained directly without the cooperation of a PC.
Brief Description of the Drawings
The drawings described here are provided for a further understanding of the present application and constitute a part of the present application; the illustrative embodiments of the present application and their description are used to explain the present application and do not constitute an improper limitation of it. In the drawings:
FIG. 1 is a schematic structural diagram of a 3D information detection device disclosed in an embodiment of the present application;
FIG. 2 is a schematic diagram of an off-axis lens disclosed in an embodiment of the present application;
FIG. 3 is a schematic diagram of a zero-offset (non-off-axis) lens disclosed in an embodiment of the present application.
Description of reference numerals:
100 - DLP projector, 110 - DLP driving device, 120 - projection light engine, 200 - camera, 300 - controller, 400 - image processing module, 500 - mounting base, 600 - housing.
Detailed Description
To make the objectives, technical solutions, and advantages of the present application clearer, the technical solution of the present application is described clearly and completely below with reference to specific embodiments of the present application and the corresponding drawings. Obviously, the described embodiments are only some of the embodiments of the present application, rather than all of them. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the protection scope of the present application.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the drawings.
Referring to FIG. 1, an embodiment of the present application discloses a 3D information detection device. The 3D information detection device of the present application includes a DLP projector 100, a camera 200, a controller 300, and an image processing module 400.
The DLP projector 100 is a projection device based on DLP (Digital Light Processing) technology, which digitally processes the image signal before projecting the light, so the DLP projector 100 is programmable (encodable). The DLP projector 100 can be used to project structured light onto an object (the object under test), where the structured light can be encoded structured light. In operation, the DLP projector 100 projects a series of encoded patterns formed by the structured light; structured-light techniques are then used to calculate the 3D information of the object, which improves detection accuracy.
Structured light can be analyzed using structured-light techniques, which are a form of active triangulation. The basic principle is that a light projection device projects controllable light spots, light stripes, or light planes onto the surface of the object to form feature points; after the structured light is projected onto the surface of the object, it is modulated by the object's height; the modulated structured light is captured as an image and transmitted to the analysis device for calculation, from which the 3D information of the object, that is, its three-dimensional data, can be obtained.
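As an illustration of the active triangulation principle just described, the sketch below intersects the viewing ray of a camera pixel with a calibrated light plane (the plane swept by a projected stripe) to recover one 3D surface point. The intrinsic matrix and plane parameters are placeholder values chosen for the example and do not come from the patent.
```python
import numpy as np

# Active triangulation sketch: a pixel that sees the projected stripe is back-projected
# to a viewing ray, and the ray is intersected with the known light plane n.X = d.
# The intrinsics K and the plane parameters below are illustrative placeholders.

K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])   # camera intrinsic matrix
plane_n = np.array([0.5, 0.0, 0.866])   # unit normal of the light plane (camera frame)
plane_d = 0.45                          # plane offset in metres

def triangulate_pixel(u, v):
    """Return the 3D point (camera frame) where the pixel ray meets the light plane."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])  # back-project the pixel to a ray direction
    t = plane_d / (plane_n @ ray)                   # ray parameter at the intersection
    return t * ray                                  # 3D point X = t * ray

print(triangulate_pixel(400, 260))  # e.g. the pixel where the stripe is observed
```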
The camera 200 is used to acquire an image of the object onto which the structured light is projected. The image processing module 400 is connected to the camera 200 and is used to obtain the 3D information of the object by processing the image; as noted in the preceding paragraph, the specific processing and calculation of the 3D information are well-known techniques and are not repeated here. Specifically, the image processing module 400 may be based on the x86 system architecture and use a GPU to obtain images from the camera 200 and to perform surface structured-light processing on them. In a specific implementation, the image processing module 400 may be a GPU (Graphics Processing Unit), although it is of course not limited to this.
The controller 300 is connected to both the DLP projector 100 and the camera 200 and controls the operation of both.
In one implementation, the controller 300 may control the DLP projector 100 to operate first and the camera 200 to operate after a preset time. It can be understood that the preset time may be 1 second, 2 seconds, and so on; the embodiments of the present application do not specifically limit the length of the preset time. In this case, each time the DLP projector 100 projects a pattern, the camera 200 takes a picture after the preset time, and the image processing module 400 can then process the picture taken by the camera 200 to calculate the 3D information of the object.
In another implementation, the controller 300 may control the DLP projector 100 and the camera 200 to operate synchronously. In this case, each time the DLP projector 100 projects a pattern, the camera 200 simultaneously takes a picture, and the image processing module 400 can process the pictures taken by the camera 200 in real time to calculate the 3D information of the object. In an optional solution, the controller 300 may act as a linkage control device capable of synchronizing the DLP projector 100 and the camera 200. The control may be implemented in software or in a hardware circuit; the art offers many control schemes for making two devices operate simultaneously, which are not listed one by one here. Specifically, the controller 300 may be an FPGA (Field Programmable Gate Array) control chip.
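The two triggering modes described in the preceding paragraphs (projector first, camera after a preset delay; or projector and camera triggered together) can be sketched as follows. This is only an outline of the timing logic with hypothetical `projector`/`camera` interfaces; in the device described here the triggering would be handled by the controller (for example an FPGA control chip) rather than in software like this.
```python
import time

# Timing-logic sketch for the two control modes described above.
# `projector.project(i)` and `camera.capture()` are hypothetical interfaces.

def run_delayed(projector, camera, num_patterns, preset_delay_s=1.0):
    """Mode 1: the projector fires first; the camera captures after a preset delay."""
    frames = []
    for i in range(num_patterns):
        projector.project(i)
        time.sleep(preset_delay_s)   # e.g. 1 s or 2 s; the value is not fixed by the application
        frames.append(camera.capture())
    return frames

def run_synchronous(projector, camera, num_patterns):
    """Mode 2: the projector and the camera are triggered together for each pattern."""
    frames = []
    for i in range(num_patterns):
        projector.project(i)         # in hardware, both triggers would be issued on the same edge
        frames.append(camera.capture())
    return frames
```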
In the 3D information detection device disclosed in the embodiments of the present application, the controller 300 can control the DLP projector 100 and the camera 200 to operate synchronously, so that the two form a jointly triggered whole: the camera 200 can capture the structured light projected by the DLP projector 100 in time, and the captured images can be processed promptly by the image processing module 400. During detection, the image processing module performs 3D analysis directly on the image acquired by the camera, which contains the pattern projected by the DLP projector, to obtain the 3D information of the object; therefore, the 3D information of the object can be obtained directly without the cooperation of a PC.
At the same time, the embodiments of the present application integrate the DLP projector 100, the camera 200, and the image processing module 400 into a single unit that can provide the user with the 3D information of the object directly through the 3D algorithm, which greatly facilitates use and requires no further processing by the user.
In the detection device disclosed in the embodiments of the present application, the DLP projector 100 uses DLP projection technology to project a series of encoded patterns (formed by structured light) onto the object; the camera 200 then captures images of the patterned object surface; finally, the image processing module 400 runs a decoding algorithm on the images captured by the camera 200 and can thus accurately restore the depth information of the object surface. Combined with the two-dimensional information of the image provided by the camera 200 itself, the structured-light technique ultimately measures the 3D information of the object. The 3D information detection device disclosed in the embodiments of the present application can be widely used in fields such as robot positioning, 3D scanning, and 3D measurement.
DLP projection technology allows different patterns to be encoded flexibly, so that higher-precision surface structured light can be projected. The DLP projector 100 may encode the structured light using a Gray-code encoding format or a sinusoidal (sine-code) encoding format. A person skilled in the art knows specific implementations of encoding structured light with a Gray-code or sine-code format, which are not described in detail here.
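As a minimal sketch of the Gray-code encoding mentioned above, the code below generates vertical Gray-code stripe patterns for a projector and converts a per-pixel bit sequence read from the captured images back into the projector column that illuminated the pixel. It illustrates the standard binary-reflected Gray-code technique under generic assumptions, not the specific implementation of this application.
```python
import numpy as np

# Gray-code structured-light sketch: generate stripe patterns that encode the projector
# column, and decode a most-significant-bit-first bit sequence back to a column index.

def gray_code_patterns(width, height, num_bits):
    """Return `num_bits` binary stripe images (0/255) encoding each projector column."""
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)                       # binary-reflected Gray code of each column
    patterns = []
    for bit in range(num_bits - 1, -1, -1):         # most significant bit first
        stripe = ((gray >> bit) & 1).astype(np.uint8) * 255
        patterns.append(np.tile(stripe, (height, 1)))
    return patterns

def decode_gray(bits):
    """Convert a most-significant-bit-first Gray-code bit sequence to a column index."""
    gray = 0
    for b in bits:
        gray = (gray << 1) | b
    value, mask = gray, gray >> 1                   # Gray -> binary conversion
    while mask:
        value ^= mask
        mask >>= 1
    return value

patterns = gray_code_patterns(width=1024, height=768, num_bits=10)
print(len(patterns))                                # 10 stripe images
print(decode_gray([0, 0, 0, 0, 0, 0, 0, 0, 0, 1]))  # -> column 1
```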
The DLP projector 100 may include a DLP driving device 110 and a projection light engine 120; the DLP driving device 110 is connected to the projection light engine 120 and drives it to project the encoded structured light onto the object. The DLP driving device 110 is connected to the controller 300, and the controller 300 may be integrated on the DLP driving device, which saves space and makes it easier for the controller to control the camera 200 and the DLP driving device 110 synchronously. The controller 300 can control the DLP driving device 110 and thereby realize projection by the projection light engine 120. The projection light engine 120 includes a projection lens, which may be a 12 mm or 8 mm lens; specifically, the projection lens can focus at working distances such as 500 mm and 1000 mm, although the focusing distance is of course not limited to these. The projection light engine 120 may use a DMD (Digital Micromirror Device) chip from TI (Texas Instruments) to implement DLP projection.
The projection light engine 120 may include red, green, and blue LED light sources, so that the DLP light source can project structured light of different colors; in this case, the DLP projector 100 can provide patterns of different colors for different scenes.
In a specific implementation, the DLP driving device 110 may include an FPGA module; the FPGA module controls the generation of the Gray-code patterns, the generated coded patterns are stored in a memory, and they are then projected by the projection light engine 120.
In the camera 200 disclosed in the embodiments of the present application, the camera 200 includes a camera body and a camera lens. The camera 200 may use a 1.3-megapixel, 3-megapixel, or other-resolution image sensor, and the image sensor may be a high-speed area-array CCD (Charge-Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor, although it is of course not limited to these. In addition, the camera lens may be a standard FA (Factory Automation) lens whose focal length may be 8 mm or 12 mm, although it is of course not limited to these.
The DLP projector 100 may include a housing 600, and the controller 300, the DLP driving device 110, the projection light engine 120, and the image processing module 400 may all be arranged in the housing 600, which facilitates on-site installation of these components. In an optional solution, the housing 600 may be a metal shell, which dissipates heat well and can promptly remove the heat generated by the components placed inside it.
The 3D information detection device disclosed in the embodiments of the present application may further include a mounting base 500. Both the camera 200 and the DLP projector 100 may be arranged on the mounting base 500, with the camera 200 located on one side of the DLP projector 100; that is, the camera 200 may be located on either side of the DLP projector 100. During installation, the mounting base 500 may first be fixed at the inspection site and then provide mounting positions for the DLP projector 100 and the camera 200. Of course, the DLP projector 100 and the camera 200 may also be mounted on the mounting base 500 first, and the resulting assembly then installed on site.
In an optional solution, the camera 200 is movably arranged on the mounting base 500 and can move toward or away from the DLP projector 100, so that the position of the camera 200 can be adjusted to adjust the shooting position.
As described above, the camera 200 may include a camera body and a camera lens; the camera body may be connected to the mounting base 500 and movably fitted to it. The camera lens is rotatably fitted to the camera body, so that the shooting angle of the camera 200 can be adjusted more flexibly.
In the 3D information detection device disclosed in the embodiments of the present application, the number of cameras 200 may be two, and the two cameras 200 may be symmetrically arranged on both sides of the DLP projector 100. Using two cameras 200 better compensates for the blind spots in the field of view of a single camera 200 and can further improve detection accuracy. Of course, the 3D information of an object can also be detected with a single camera 200.
When there are two cameras 200, the distance between them may be called the baseline distance. According to the triangulation principle, the larger the baseline distance, the higher the depth resolution during shooting. Since the cameras 200 are movably arranged on the mounting base 500, the baseline distance between the two cameras 200 can be adjusted flexibly, achieving the effect of flexibly adjusting the depth resolution. The user can flexibly adjust the baseline distance between the two cameras 200 according to the usage environment.
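The qualitative statement above (a larger baseline gives a higher depth resolution) can be quantified with the standard two-camera triangulation relation Z = f·B/d, where Z is the depth, f the focal length in pixels, B the baseline, and d the disparity; a disparity uncertainty δd then corresponds to a depth uncertainty of roughly Z²·δd/(f·B). The sketch below uses placeholder numbers purely for illustration; none of these values come from the patent.
```python
# Depth-resolution sketch for a two-camera setup: with the usual triangulation
# relation Z = f * B / d, a disparity uncertainty `dd` (pixels) maps to a depth
# uncertainty of roughly Z**2 * dd / (f * B). All numbers are placeholders.

def depth_uncertainty(z_m, baseline_m, focal_px, disparity_err_px=0.5):
    """Approximate depth uncertainty at distance z_m for a given baseline."""
    return (z_m ** 2) * disparity_err_px / (focal_px * baseline_m)

for baseline in (0.05, 0.10, 0.20):                           # widening the baseline...
    print(baseline, depth_uncertainty(1.0, baseline, 800.0))  # ...shrinks the depth error
```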
To improve the detection effect, in an optional solution the end face of the projection lens of the DLP projector 100 and the end faces of the camera lenses of the two cameras 200 may lie on the same straight line, with the projection lens located midway between the camera lenses of the two cameras 200; that is, the camera lenses of the two cameras 200 are symmetrically arranged on both sides of the projection lens.
In a specific implementation, the included angle between the optical axis of the camera lens of the camera 200 and the optical axis of the projection lens of the DLP projector 100 may be 5° to 45°. Of course, a camera 200 with the structure described above allows the shooting direction of the camera lens to be adjusted, so the angle between the optical axis of the camera lens of the camera 200 and the optical axis of the projection lens of the DLP projector 100 can be adjusted flexibly.
Since the camera lens of the camera 200 can be rotated, the 3D information detection device can perform 3D scanning over a wide range, thereby extending the detection range.
In the embodiments of the present application, the projection lens of the DLP projector 100 may be an off-axis lens. As shown in FIG. 2, the projection surface formed by an off-axis lens lies on one side of its optical axis; an off-axis lens can make the edge position of the projected image compatible with the installation position of the DLP projector 100, but because the projected image is not on the main axis of the projection lens, the distortion and imaging quality of the projection lens deteriorate. Based on this, in an optional solution, the projection lens of the DLP projector 100 may be a zero-offset (non-off-axis) lens, as shown in FIG. 3; in this case, the projection surface formed by the zero-offset lens is symmetric about its optical axis. A zero-offset lens can improve the quality of the projected pattern, ultimately improving the detection accuracy of the detection device and yielding more accurate 3D information of the object. The above embodiments of the present application mainly describe the differences between the embodiments; as long as the optimization features of different embodiments are not contradictory, they can be combined to form better embodiments, which, for brevity, are not described here again.
The above are only embodiments of the present application and are not intended to limit the present application. Various modifications and changes may be made to the present application by those skilled in the art. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present application shall fall within the scope of the claims of the present application.

Claims (10)

  1. A 3D information detection device, comprising a digital light processing (DLP) projector (100), a camera (200), a controller (300), and an image processing module (400), wherein the DLP projector (100) is configured to project structured light onto an object; the camera (200) is configured to acquire an image of the object onto which the structured light is projected; the image processing module (400) is connected to the camera (200) and is configured to obtain 3D information of the object by processing the image; and the controller (300) is connected to both the DLP projector (100) and the camera (200) and controls the operation of both.
  2. The 3D information detection device according to claim 1, wherein the DLP projector (100) comprises a DLP driving device (110) and a projection light engine (120), the DLP driving device (110) is connected to the projection light engine (120) and drives the projection light engine (120) to project the encoded structured light onto the object, and the DLP driving device (110) is connected to the controller (300).
  3. The 3D information detection device according to claim 2, wherein the controller (300) is integrated on the DLP driving device (110).
  4. The 3D information detection device according to claim 1, wherein the DLP projector (100) further comprises a housing (600), and the controller (300), the DLP driving device (110), the projection light engine (120), and the image processing module (400) are all arranged in the housing (600).
  5. The 3D information detection device according to claim 1, further comprising a mounting base (500), wherein the camera (200) and the DLP projector (100) are both arranged on the mounting base (500), and the camera (200) is located on one side of the DLP projector (100).
  6. The 3D information detection device according to claim 5, wherein the camera (200) is movably arranged on the mounting base (500) and can move toward or away from the DLP projector (100).
  7. The 3D information detection device according to claim 6, wherein the camera (200) comprises a camera body and a camera lens, the camera body is movably fitted to the mounting base (500), and the camera lens is rotatably fitted to the camera body.
  8. The 3D information detection device according to claim 7, wherein the included angle between the optical axis of the camera lens of the camera (200) and the optical axis of the projection lens of the DLP projector (100) is 5° to 45°.
  9. The 3D information detection device according to any one of claims 5 to 8, wherein the number of cameras (200) is two, and the two cameras (200) are symmetrically arranged on both sides of the DLP projector (100).
  10. The 3D information detection device according to claim 1, wherein the projection lens of the DLP projector (100) is a zero-offset (non-off-axis) lens; and/or the image processing module (400) is a graphics processing unit (GPU).
PCT/CN2019/078168 2018-06-15 2019-03-14 3d信息检测设备 WO2019237782A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/252,178 US20210262789A1 (en) 2018-06-15 2019-03-14 3d information detection device
EP19819633.9A EP3809095A4 (en) 2018-06-15 2019-03-14 3D INFORMATION DETECTION DEVICE

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201820937584.9U CN208579746U (zh) 2018-06-15 2018-06-15 3d信息检测设备
CN201820937584.9 2018-06-15

Publications (1)

Publication Number Publication Date
WO2019237782A1 true WO2019237782A1 (zh) 2019-12-19

Family

ID=65508005

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/078168 WO2019237782A1 (zh) 2018-06-15 2019-03-14 3d信息检测设备

Country Status (4)

Country Link
US (1) US20210262789A1 (zh)
EP (1) EP3809095A4 (zh)
CN (1) CN208579746U (zh)
WO (1) WO2019237782A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI792128B (zh) * 2019-12-27 2023-02-11 美商豪威科技股份有限公司 利用極化和相位檢測光電二極體獲得三維形狀資訊之裝置及方法

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN208579746U (zh) * 2018-06-15 2019-03-05 杭州海康机器人技术有限公司 3d信息检测设备
CN110411573A (zh) * 2019-07-15 2019-11-05 泉州极简机器人科技有限公司 一种监测床及其使用方法
CN111272070B (zh) * 2020-03-05 2021-10-19 南京华捷艾米软件科技有限公司 一种结构光参考图采集装置和方法
CN113340228B (zh) * 2021-05-26 2024-05-07 深圳市二郎神视觉科技有限公司 一种手持式轮胎花纹深度测量装置及测量方法

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101825445A (zh) * 2010-05-10 2010-09-08 华中科技大学 一种动态物体的三维测量系统
CN205448962U (zh) * 2015-12-31 2016-08-10 深圳先进技术研究院 一种具有三维扫描功能的移动终端
CN207301612U (zh) * 2017-10-16 2018-05-01 深圳奥比中光科技有限公司 一体化大视角3d视觉系统
CN208579746U (zh) * 2018-06-15 2019-03-05 杭州海康机器人技术有限公司 3d信息检测设备

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5889719B2 (ja) * 2012-05-31 2016-03-22 カシオ計算機株式会社 撮像装置、撮像方法及びプログラム
CN102829736B (zh) * 2012-09-12 2015-05-06 河北工业大学 一种三维指纹传感系统
CN203259133U (zh) * 2013-04-26 2013-10-30 华中科技大学 一种动态三维测量时序同步系统

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101825445A (zh) * 2010-05-10 2010-09-08 华中科技大学 一种动态物体的三维测量系统
CN205448962U (zh) * 2015-12-31 2016-08-10 深圳先进技术研究院 一种具有三维扫描功能的移动终端
CN207301612U (zh) * 2017-10-16 2018-05-01 深圳奥比中光科技有限公司 一体化大视角3d视觉系统
CN208579746U (zh) * 2018-06-15 2019-03-05 杭州海康机器人技术有限公司 3d信息检测设备

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3809095A4

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI792128B (zh) * 2019-12-27 2023-02-11 美商豪威科技股份有限公司 利用極化和相位檢測光電二極體獲得三維形狀資訊之裝置及方法

Also Published As

Publication number Publication date
EP3809095A1 (en) 2021-04-21
EP3809095A4 (en) 2021-06-23
CN208579746U (zh) 2019-03-05
US20210262789A1 (en) 2021-08-26

Similar Documents

Publication Publication Date Title
WO2019237782A1 (zh) 3d信息检测设备
JP4495041B2 (ja) ピンホール投影により表示面上のレーザポイントと関連付けられるプロジェクタ画素を求める方法
US9338447B1 (en) Calibrating devices by selecting images having a target having fiducial features
US11025888B2 (en) System and method for capturing omni-stereo videos using multi-sensors
JP2016017961A (ja) 投影光源を有する撮像方法及びその撮像装置
KR102482062B1 (ko) 컬러 패턴을 이용한 치과용 3차원 스캐너
JP4709571B2 (ja) 視覚情報処理システム及びその視覚情報処理方法
JP2011097377A (ja) 撮像装置
CN110463191A (zh) 投影仪及投影仪的控制方法
JP6809477B2 (ja) 投射型表示装置および画像補正方法
Kitajima et al. Simultaneous projection and positioning of laser projector pixels
JP2004228824A (ja) スタックプロジェクション装置及びその調整方法
JP2008294545A (ja) 投射型映像表示装置及び投射型映像表示システム
CN106840034A (zh) 具有散斑投射器的三维扫描系统及其应用
JP2016149618A (ja) 画像投影システム、プロジェクタ、およびプログラム
GB2525000A (en) Structured light generation and processing on a mobile device
US11019314B2 (en) Projector and method for controlling projector
CN111126145B (zh) 一种避免光源像影响的虹膜3d信息获取系统
CN114339179B (zh) 投影校正方法、装置、存储介质以及投影设备
KR102080506B1 (ko) 광학 3d 스캐너
CN210513043U (zh) 3d信息检测设备
KR101816781B1 (ko) 사진계측 방식의 3d스캐너 및 3d모델링의 고품질 입력 데이터를 위한 사진계측 방식의 촬영방법
TWI688274B (zh) 影像校正方法及影像校正系統
CN115103169B (zh) 投影画面校正方法、装置、存储介质以及投影设备
TWI477880B (zh) 投影機光線調整系統及方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19819633

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2019819633

Country of ref document: EP