WO2019137347A1 - Accuracy measurement method for a somatosensory camera and accuracy measurement device for a somatosensory camera - Google Patents

Accuracy measurement method for a somatosensory camera and accuracy measurement device for a somatosensory camera Download PDF

Info

Publication number
WO2019137347A1
WO2019137347A1 (PCT application PCT/CN2019/070759, CN2019070759W)
Authority
WO
WIPO (PCT)
Prior art keywords
led lighting
unit
led light
led
infrared image
Prior art date
Application number
PCT/CN2019/070759
Other languages
English (en)
French (fr)
Inventor
周晓军
王行
盛赞
李骊
李朔
Original Assignee
南京华捷艾米软件科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 南京华捷艾米软件科技有限公司 filed Critical 南京华捷艾米软件科技有限公司
Publication of WO2019137347A1 publication Critical patent/WO2019137347A1/zh

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00Diagnosis, testing or measuring for television systems or their details
    • H04N17/002Diagnosis, testing or measuring for television systems or their details for television cameras

Definitions

  • the invention relates to the field of production testing, and in particular to a method for measuring the accuracy of a somatosensory camera and an accuracy measuring device for a somatosensory camera.
  • Chinese Patent No. 201610366199.9 discloses an assembly method, an augmented reality system and a computer program product for measurement and/or manufacture.
  • the method includes providing an augmented reality system including a receiver, a processor, and an output device.
  • the method also includes arranging the measurement device in the WS such that the measurement device has a particular positional relationship with a reference point in the WS.
  • the method also includes setting a target object in the WS.
  • the method also includes measuring, by the measuring device, a distance measurement from the measuring device to the target object.
  • the method also includes transmitting, by the measuring device, a distance measurement to the augmented reality system.
  • the method also includes determining, by the augmented reality system, whether the distance measurement corresponds to the target distance.
  • Chinese Patent No. 201280033997.X discloses an augmented reality system and an assembly method for assembling a first fitting component to a second fitting component with the aid of the augmented reality system.
  • the augmented reality system is capable of capturing a first portion of the indicia associated with each component and a second portion of the indicia.
  • the augmented reality system is capable of capturing variable tags associated with one of the components.
  • the augmented reality system is capable of capturing first and second indicia associated with various components.
  • the augmented reality system is capable of identifying the location and/or state of the indicia and thereby determining whether the connection is properly established between the first and second fitting components.
  • This patent only describes the production inspection scheme of the augmented reality device, and does not give a detailed design and description of the specific distance accuracy detection.
  • the present invention aims to at least solve one of the technical problems existing in the prior art, and proposes an accuracy measuring method for a somatosensory camera and an accuracy measuring device for a somatosensory camera.
  • a first aspect of the present invention provides a method for measuring accuracy of a somatosensory camera, the method comprising:
  • an LED lighting unit is provided, the LED lighting unit including a plurality of LED light-emitting devices spaced apart in the height direction, wherein the LED lighting unit is at a preset distance from the somatosensory camera;
  • step S140: Compare the number of LED light-emitting devices in the infrared image with the actual number of LED light-emitting devices in the LED lighting unit, and when the number of LED light-emitting devices in the infrared image is consistent with the actual number of LED light-emitting devices in the LED lighting unit, go to step S150;
  • S160: Compare the depth value of the LED lighting unit with the preset distance, and output an accuracy measurement success signal when the depth value of the LED lighting unit is consistent with the preset distance.
  • the step S110 includes: providing two sets of the LED lighting units, the two sets of LED lighting units being at different preset distances from the somatosensory camera;
  • the steps S120 to S160 are performed on the two sets of the LED lighting units in turn.
  • an accuracy measuring device for a somatosensory camera comprising an LED lighting unit, a control unit, an image acquiring unit and an analyzing unit;
  • the LED lighting unit includes a plurality of LED lighting devices spaced apart in a height direction, wherein the LED lighting unit and the somatosensory camera have a preset distance;
  • the control unit is configured to illuminate all of the LED lighting devices in the LED lighting unit
  • the image acquisition unit is configured to acquire an infrared image of the LED light emitting unit, and calculate a position of the LED light emitting device in the infrared image and a quantity of the LED light emitting device according to the infrared image;
  • the analyzing unit is configured to compare the number of LED light-emitting devices in the infrared image with the actual number of LED light-emitting devices in the LED lighting unit, and, when the number of LED light-emitting devices in the infrared image is consistent with the actual number of LED light-emitting devices in the LED lighting unit, to send a depth image acquisition signal to the image acquiring unit;
  • the image acquiring unit is further configured to acquire a depth image of the LED lighting unit according to the depth image acquisition signal, and to locate, according to the positions of the LED light-emitting devices in the infrared image, the depth positions of the LED light-emitting devices in the depth image, so as to obtain a depth value of the LED lighting unit;
  • the analyzing unit is further configured to compare the depth value of the LED lighting unit with the preset distance, and to output an accuracy measurement success signal when the depth value of the LED lighting unit is consistent with the preset distance.
  • the LED lighting unit comprises two sets of the LED lighting units, and the two sets of the LED lighting units and the somatosensory camera have different preset distances;
  • the control unit is configured to independently control the working states of the two sets of the LED lighting units.
  • the device further includes a lamp post corresponding to the LED lighting unit, and each set of the LED lighting unit is disposed in the lamp post corresponding thereto.
  • the apparatus further comprises a test darkroom, the lamp post being disposed within the test darkroom.
  • the control unit comprises a single-chip microcomputer.
  • the single-chip microcomputer is further configured to be communicatively connected to a terminal to receive a test request sent by the terminal;
  • the single-chip microcomputer is configured to control the working state of the LED lighting unit according to the test request sent by the terminal.
  • in the method for measuring the accuracy of the somatosensory camera of the present invention, an LED lighting unit is first provided; all the LED light-emitting devices in the LED lighting unit are then lit, an infrared image is acquired, and the number and positions of the LED light-emitting devices in the infrared image are calculated from the infrared image. When that number is consistent with the actual number of LED light-emitting devices, a depth image is acquired, and the depth value of the LED light-emitting devices is calculated from the depth image and the positions of the LED light-emitting devices in the infrared image. When the depth value is consistent with the preset distance, the accuracy measurement is completed and an accuracy measurement success signal is output.
  • the method for measuring the accuracy of the somatosensory camera of the present invention is therefore a fully automated test mode, eliminating the time consumed by manual judgment and the measurement error that manual judgment introduces.
  • moreover, this accuracy measurement method is low in cost, simple to implement, and easy to maintain and deploy at scale.
  • in the accuracy measuring device of the somatosensory camera of the present invention, when accuracy measurement of the somatosensory camera is required, the control unit lights all the LED light-emitting devices in the LED lighting unit; the image acquiring unit then acquires an infrared image of the LED lighting unit and calculates the number and positions of the LED light-emitting devices in the infrared image from the infrared image.
  • when that number is consistent with the actual number of LED light-emitting devices, the image acquiring unit further acquires a depth image, and the depth value of the LED light-emitting devices is calculated from the depth image and the positions of the LED light-emitting devices in the infrared image.
  • the accuracy measuring device of the somatosensory camera of the present invention is therefore a fully automated test mode, eliminating the time consumed by manual judgment and the measurement error that manual judgment introduces.
  • moreover, this accuracy measuring device is low in cost, simple to implement, and easy to maintain and deploy at scale.
  • FIG. 1 is a flowchart of a method for measuring accuracy of a somatosensory camera according to a first embodiment of the present invention
  • FIG. 2 is a schematic structural view of an accuracy measuring device for a somatosensory camera according to a second embodiment of the present invention
  • FIG. 3 is a schematic structural view of a test darkroom and a lamp post in an accuracy measuring device for a somatosensory camera according to a third embodiment of the present invention.
  • 100: accuracy measuring device for a somatosensory camera; 110: LED lighting unit; 111: LED light-emitting device; 120: control unit; 130: image acquisition unit; 140: analysis unit; 150: lamp post; 160: test darkroom; 200: somatosensory camera
  • AR Augmented Reality
  • AR is a technology that calculates the position and angle of camera images in real time and overlays corresponding images, videos, and 3D models on them.
  • the goal of this technology is to overlay the virtual world on the real world on a screen and allow the two to interact.
  • a first aspect of the present invention relates to a method S100 for measuring the accuracy of a somatosensory camera, the method S100 comprising:
  • S110: an LED lighting unit is provided, the LED lighting unit including a plurality of LED light-emitting devices spaced apart in the height direction, wherein the LED lighting unit is at a preset distance from the somatosensory camera.
  • step S140: Compare the number of LED light-emitting devices in the infrared image with the actual number of LED light-emitting devices in the LED lighting unit, and when the number of LED light-emitting devices in the infrared image is consistent with the actual number of LED light-emitting devices in the LED lighting unit, the flow proceeds to step S150.
  • S150 Obtain a depth image of the LED lighting unit, and locate a depth position of the LED lighting device in the depth image according to a position of the LED lighting device in the infrared image to obtain a depth of the LED lighting unit. Value.
  • S160: Compare the depth value of the LED lighting unit with the preset distance, and output an accuracy measurement success signal when the depth value of the LED lighting unit is consistent with the preset distance.
  • in the accuracy measuring method S100 of the somatosensory camera of this embodiment, an LED lighting unit is first provided; all the LED light-emitting devices in the LED lighting unit are then lit, an infrared image is acquired, and the number and positions of the LED light-emitting devices in the infrared image are calculated from the infrared image.
  • when that number is consistent with the actual number of LED light-emitting devices, a depth image is acquired, and the depth value of the LED light-emitting devices is calculated from the depth image and the positions of the LED light-emitting devices in the infrared image. When the depth value is consistent with the preset distance, the accuracy measurement is completed and an accuracy measurement success signal is output.
  • the accuracy measuring method S100 of the somatosensory camera of the present embodiment is therefore a fully automated test mode, eliminating the time consumed by manual judgment and the measurement error that manual judgment introduces.
  • moreover, this accuracy measurement method is low in cost, simple to implement, and easy to maintain and deploy at scale.
  • the step S110 includes: providing two sets of the LED lighting units, the two sets of LED lighting units being at different preset distances from the somatosensory camera;
  • the steps S120 to S160 are performed on the two sets of the LED lighting units in turn.
  • the LED lighting unit that is relatively close to the somatosensory camera may be lit first, and steps S120 to S160 are then performed on that LED lighting unit so as to carry out the accuracy test of the somatosensory camera with that set of LED lighting units. After the testing of that set of LED lighting units is completed, the other set of LED lighting units may be lit, and steps S120 to S160 are repeated to complete the accuracy test of the somatosensory camera.
  • the LED lighting unit provided in this embodiment is not limited to two groups, and may further include multiple groups of LED lighting units, and the plurality of groups of LED lighting units may have different preset distances from the somatosensory camera.
  • an accuracy measuring apparatus 100 for a somatosensory camera comprising an LED lighting unit 110, a control unit 120, an image acquisition unit 130, and an analysis unit 140.
  • the LED lighting unit 110 includes a plurality of LED lighting devices 111 spaced apart in the height direction, wherein the LED lighting unit 110 and the somatosensory camera 200 have a preset distance.
  • the control unit 120 is configured to illuminate all of the LED light emitting devices 111 in the LED lighting unit 110.
  • the image acquiring unit 130 is configured to acquire an infrared image of the LED lighting unit 110, and calculate a position of the LED lighting device 111 in the infrared image and a quantity of the LED lighting device 111 according to the infrared image.
  • the analyzing unit 140 is configured to compare the number of LED light-emitting devices 111 in the infrared image with the actual number of LED light-emitting devices 111 in the LED lighting unit 110, and, when the number of LED light-emitting devices 111 in the infrared image is consistent with the actual number of LED light-emitting devices 111 in the LED lighting unit 110, to send a depth image acquisition signal to the image acquisition unit 130.
  • the image acquiring unit 130 is further configured to acquire a depth image of the LED lighting unit 110 according to the depth image acquisition signal, and to locate, according to the positions of the LED light-emitting devices 111 in the infrared image, the depth positions of the LED light-emitting devices 111 in the depth image, so as to obtain a depth value of the LED lighting unit 110.
  • the analyzing unit 140 is further configured to compare the depth value of the LED lighting unit 110 with the preset distance, and to output an accuracy measurement success signal when the depth value of the LED lighting unit 110 is consistent with the preset distance.
  • when accuracy measurement of the somatosensory camera is required, the control unit 120 lights all of the LED light-emitting devices 111 in the LED lighting unit 110, and the image acquiring unit 130 then acquires an infrared image of the LED lighting unit 110 and calculates the number and positions of the LED light-emitting devices 111 in the infrared image from the infrared image.
  • when that number is consistent with the actual number of LED light-emitting devices 111, the image acquiring unit 130 is further configured to acquire a depth image, and the depth value of the LED light-emitting devices 111 is calculated from the depth image and the positions of the LED light-emitting devices 111 in the infrared image.
  • the accuracy measuring device 100 of the somatosensory camera of the present embodiment is therefore a fully automated test mode, eliminating the time consumed by manual judgment and the measurement error that manual judgment introduces.
  • the accuracy measuring device is low in cost, simple to implement, and easy to maintain and deploy at scale.
  • the LED lighting unit 110 includes two sets of the LED lighting units 110, and the two sets of the LED lighting units 110 and the somatosensory camera 200 have different preset distances.
  • the control unit 120 is configured to independently control the working states of the two sets of the LED lighting units 110.
  • the control unit 120 may first light the LED lighting unit 110 that is relatively close to the somatosensory camera 200, so as to carry out the accuracy test of the somatosensory camera 200 with that set of LED lighting units 110. After that test is completed, the control unit 120 may light the other set of LED lighting units 110 to perform the accuracy test of the somatosensory camera 200 using that set of LED lighting units 110.
  • the LED lighting units 110 provided in this embodiment are not limited to two groups; multiple groups of LED lighting units 110 may also be included, and the multiple groups may be at different preset distances from the somatosensory camera 200.
  • the device further includes lamp posts 150 in one-to-one correspondence with the LED lighting units 110, and each set of the LED lighting units 110 is disposed in the lamp post 150 corresponding to it.
  • in this way, the LED light-emitting devices 111 of each group of LED lighting units 110 can be spaced apart within the lamp post 150, so that each LED light-emitting device 111 is at a different preset height above the ground. The LED lighting unit 110 can thus be used to perform a height accuracy test of the somatosensory camera 200.
  • the apparatus further includes a test darkroom 160, and the lamp post 150 is disposed within the test darkroom 160.
  • the accuracy measuring device 100 of the somatosensory camera of the present embodiment includes a test darkroom 160, that is, the test darkroom 160 is a light-tight enclosure. In this way, interference from natural light, or from ambient light such as incandescent lamps in the production environment, is eliminated, thereby improving the test accuracy of the somatosensory camera.
  • the control unit 120 comprises a single-chip microcomputer.
  • the single-chip microcomputer is further configured to be communicatively connected to a terminal to receive a test request sent by the terminal;
  • the single-chip microcomputer is configured to control the working state of the LED lighting unit 110 according to the test request sent by the terminal.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses an accuracy measurement method and an accuracy measurement device for a somatosensory camera. The method comprises: providing an LED lighting unit at a preset distance from the somatosensory camera; lighting all of the LED light-emitting devices in the LED lighting unit; acquiring an infrared image of the LED lighting unit, calculating from the infrared image the positions and the number of the LED light-emitting devices in the infrared image, and comparing that number with the actual number of LED light-emitting devices; when the two are consistent, acquiring a depth image of the LED lighting unit and, based on the positions of the LED light-emitting devices in the infrared image, locating their depth positions in the depth image to obtain a depth value of the LED lighting unit; and comparing the depth value of the LED lighting unit with the preset distance and, when the two are consistent, outputting an accuracy measurement success signal. The accuracy measurement method of the invention enables fully automated testing, eliminates the time consumed by manual judgment and the measurement error that manual judgment introduces, and is low in cost and simple to implement.

Description

Accuracy measurement method for a somatosensory camera and accuracy measurement device for a somatosensory camera

Technical Field
The present invention relates to the field of production testing, and in particular to an accuracy measurement method for a somatosensory camera and an accuracy measurement device for a somatosensory camera.
Background Art
At present, AR has gradually become widespread, and various tests of the practical performance of hardware devices equipped with augmented reality technology have developed into relatively standard cases on the market; however, there is still no mature solution for quantitative testing of the metrics of depth measurement devices.
Chinese patent application 201610366199.9 discloses an assembly method, an augmented reality system and a computer program product for measurement and/or manufacturing. The method includes providing an augmented reality system comprising a receiver, a processor and an output device. The method also includes arranging a measurement device in the WS such that the measurement device has a specific positional relationship with a reference point in the WS. The method also includes placing a target object in the WS. The method also includes measuring, by the measurement device, a distance from the measurement device to the target object. The method also includes transmitting the distance measurement from the measurement device to the augmented reality system. The method also includes determining, by the augmented reality system, whether the distance measurement corresponds to a target distance.
Chinese patent application 201280033997.X discloses an augmented reality system and an assembly method for assembling a first fitting component to a second fitting component with the aid of the augmented reality system. The augmented reality system is able to capture a first portion and a second portion of a marker associated with each component. The augmented reality system is able to capture a variable marker associated with one of the components. The augmented reality system is able to capture first and second markers associated with the respective components. The augmented reality system is able to identify the position and/or state of the markers and thereby determine whether the connection between the first and second fitting components has been established correctly. This patent only describes a production inspection scheme for an augmented reality device and gives no detailed design or description of specific distance accuracy detection.
Therefore, how to design an accuracy measurement method for a somatosensory camera has become a technical problem that urgently needs to be solved in this field.
Summary of the Invention
The present invention aims to solve at least one of the technical problems existing in the prior art, and proposes an accuracy measurement method for a somatosensory camera and an accuracy measurement device for a somatosensory camera.
To achieve the above object, a first aspect of the present invention provides an accuracy measurement method for a somatosensory camera, the method comprising:
S110: providing an LED lighting unit, the LED lighting unit comprising a plurality of LED light-emitting devices spaced apart in the height direction, wherein the LED lighting unit is at a preset distance from the somatosensory camera;
S120: lighting all of the LED light-emitting devices in the LED lighting unit;
S130: acquiring an infrared image of the LED lighting unit, and calculating, from the infrared image, the positions of the LED light-emitting devices in the infrared image and the number of LED light-emitting devices;
S140: comparing the number of LED light-emitting devices in the infrared image with the actual number of LED light-emitting devices in the LED lighting unit and, when the number of LED light-emitting devices in the infrared image is consistent with the actual number of LED light-emitting devices in the LED lighting unit, proceeding to step S150;
S150: acquiring a depth image of the LED lighting unit and, based on the positions of the LED light-emitting devices in the infrared image, locating the depth positions of the LED light-emitting devices in the depth image to obtain a depth value of the LED lighting unit;
S160: comparing the depth value of the LED lighting unit with the preset distance, and outputting an accuracy measurement success signal when the depth value of the LED lighting unit is consistent with the preset distance.
Preferably, step S110 comprises:
providing two sets of the LED lighting units, the two sets of LED lighting units being at different preset distances from the somatosensory camera;
performing steps S120 to S160 on the two sets of LED lighting units in turn.
A second aspect of the present invention provides an accuracy measurement device for a somatosensory camera, the device comprising an LED lighting unit, a control unit, an image acquisition unit and an analysis unit;
the LED lighting unit comprises a plurality of LED light-emitting devices spaced apart in the height direction, wherein the LED lighting unit is at a preset distance from the somatosensory camera;
the control unit is configured to light all of the LED light-emitting devices in the LED lighting unit;
the image acquisition unit is configured to acquire an infrared image of the LED lighting unit and to calculate, from the infrared image, the positions of the LED light-emitting devices in the infrared image and the number of LED light-emitting devices;
the analysis unit is configured to compare the number of LED light-emitting devices in the infrared image with the actual number of LED light-emitting devices in the LED lighting unit and, when the number of LED light-emitting devices in the infrared image is consistent with the actual number of LED light-emitting devices in the LED lighting unit, to send a depth image acquisition signal to the image acquisition unit;
the image acquisition unit is further configured to acquire a depth image of the LED lighting unit in response to the depth image acquisition signal and, based on the positions of the LED light-emitting devices in the infrared image, to locate the depth positions of the LED light-emitting devices in the depth image so as to obtain a depth value of the LED lighting unit;
the analysis unit is further configured to compare the depth value of the LED lighting unit with the preset distance and to output an accuracy measurement success signal when the depth value of the LED lighting unit is consistent with the preset distance.
Preferably, the LED lighting unit comprises two sets of LED lighting units, the two sets of LED lighting units being at different preset distances from the somatosensory camera;
the control unit is configured to independently control the working states of the two sets of LED lighting units.
Preferably, the device further comprises lamp posts in one-to-one correspondence with the LED lighting units, each set of LED lighting units being disposed in the lamp post corresponding to it.
Preferably, the device further comprises a test darkroom, the lamp posts being disposed within the test darkroom.
Preferably, the control unit comprises a single-chip microcomputer.
Preferably, the single-chip microcomputer is further configured to be communicatively connected to a terminal so as to receive a test request sent by the terminal;
the single-chip microcomputer is configured to control the working state of the LED lighting unit according to the test request sent by the terminal.
In the accuracy measurement method for a somatosensory camera of the present invention, an LED lighting unit is first provided; all of the LED light-emitting devices in the LED lighting unit are then lit, an infrared image is acquired, and the number and positions of the LED light-emitting devices in the infrared image are calculated from the infrared image. When that number is consistent with the actual number of LED light-emitting devices, a depth image is acquired, and the depth value of the LED light-emitting devices is calculated from the depth image and the positions of the LED light-emitting devices in the infrared image. When the depth value is consistent with the preset distance, the accuracy measurement is completed and an accuracy measurement success signal is output. The accuracy measurement method for a somatosensory camera of the present invention is therefore a fully automated test mode, eliminating the time consumed by manual judgment and the measurement error that manual judgment introduces. Furthermore, this accuracy measurement method is low in cost, simple to implement, and easy to maintain and deploy at scale.
In the accuracy measurement device for a somatosensory camera of the present invention, when accuracy measurement of the somatosensory camera is required, the control unit lights all of the LED light-emitting devices in the LED lighting unit; the image acquisition unit then acquires an infrared image of the LED lighting unit and calculates the number and positions of the LED light-emitting devices in the infrared image from the infrared image. When that number is consistent with the actual number of LED light-emitting devices, the image acquisition unit further acquires a depth image, and the depth value of the LED light-emitting devices is calculated from the depth image and the positions of the LED light-emitting devices in the infrared image. When the depth value is consistent with the preset distance, the accuracy measurement is completed and an accuracy measurement success signal is output. The accuracy measurement device for a somatosensory camera of the present invention is therefore a fully automated test mode, eliminating the time consumed by manual judgment and the measurement error that manual judgment introduces. Furthermore, this accuracy measurement device is low in cost, simple to implement, and easy to maintain and deploy at scale.
Brief Description of the Drawings
The accompanying drawings are provided for a further understanding of the present invention and form a part of the specification; together with the following detailed description they serve to explain the present invention, but they do not limit the present invention. In the drawings:
FIG. 1 is a flowchart of an accuracy measurement method for a somatosensory camera according to a first embodiment of the present invention;
FIG. 2 is a schematic structural diagram of an accuracy measurement device for a somatosensory camera according to a second embodiment of the present invention;
FIG. 3 is a schematic structural diagram of the test darkroom and the lamp posts in an accuracy measurement device for a somatosensory camera according to a third embodiment of the present invention.
Description of Reference Numerals
100: accuracy measurement device for a somatosensory camera;
110: LED lighting unit;
111: LED light-emitting device;
120: control unit;
130: image acquisition unit;
140: analysis unit;
150: lamp post;
160: test darkroom;
200: somatosensory camera.
Detailed Description of the Embodiments
Specific embodiments of the present invention are described in detail below with reference to the accompanying drawings. It should be understood that the specific embodiments described here are only intended to illustrate and explain the present invention, and are not intended to limit the present invention.
Explanation of some terms used in the present invention:
AR, augmented reality (Augmented Reality, AR for short), is a technology that calculates the position and angle of camera images in real time and overlays corresponding images, videos and 3D models on them; the goal of this technology is to overlay the virtual world on the real world on a screen and allow the two to interact.
Referring to FIG. 1, a first aspect of the present invention relates to an accuracy measurement method S100 for a somatosensory camera, the method S100 comprising:
S110: providing an LED lighting unit, the LED lighting unit comprising a plurality of LED light-emitting devices spaced apart in the height direction, wherein the LED lighting unit is at a preset distance from the somatosensory camera.
S120: lighting all of the LED light-emitting devices in the LED lighting unit.
S130: acquiring an infrared image of the LED lighting unit, and calculating, from the infrared image, the positions of the LED light-emitting devices in the infrared image and the number of LED light-emitting devices.
S140: comparing the number of LED light-emitting devices in the infrared image with the actual number of LED light-emitting devices in the LED lighting unit and, when the number of LED light-emitting devices in the infrared image is consistent with the actual number of LED light-emitting devices in the LED lighting unit, proceeding to step S150.
S150: acquiring a depth image of the LED lighting unit and, based on the positions of the LED light-emitting devices in the infrared image, locating the depth positions of the LED light-emitting devices in the depth image to obtain a depth value of the LED lighting unit.
S160: comparing the depth value of the LED lighting unit with the preset distance, and outputting an accuracy measurement success signal when the depth value of the LED lighting unit is consistent with the preset distance.
In the accuracy measurement method S100 for a somatosensory camera of this embodiment, an LED lighting unit is first provided; all of the LED light-emitting devices in the LED lighting unit are then lit, an infrared image is acquired, and the number and positions of the LED light-emitting devices in the infrared image are calculated from the infrared image. When that number is consistent with the actual number of LED light-emitting devices, a depth image is acquired, and the depth value of the LED light-emitting devices is calculated from the depth image and the positions of the LED light-emitting devices in the infrared image. When the depth value is consistent with the preset distance, the accuracy measurement is completed and an accuracy measurement success signal is output. The accuracy measurement method S100 of this embodiment is therefore a fully automated test mode, eliminating the time consumed by manual judgment and the measurement error that manual judgment introduces. Furthermore, this accuracy measurement method is low in cost, simple to implement, and easy to maintain and deploy at scale.
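By way of illustration only, the following is a minimal sketch of how steps S120 to S160 might be carried out for a single LED lighting unit. It assumes a hypothetical camera SDK that exposes get_ir_frame() and get_depth_frame(), and it uses OpenCV connected-component analysis to find the LED spots; the brightness threshold, the depth tolerance and the use of the median depth reading are illustrative assumptions rather than requirements of the invention.

```python
import cv2
import numpy as np


def detect_led_spots(ir_frame, brightness_threshold=200):
    """Return (row, col) centroids of the bright LED spots in an IR frame."""
    _, mask = cv2.threshold(ir_frame, brightness_threshold, 255, cv2.THRESH_BINARY)
    # Connected-component analysis; component 0 is the background.
    _, _, _, centroids = cv2.connectedComponentsWithStats(mask.astype(np.uint8))
    return [(int(c[1]), int(c[0])) for c in centroids[1:]]


def measure_unit(camera, actual_led_count, preset_distance_mm, tolerance_mm=10):
    """Steps S120-S160 for one LED lighting unit (the unit is already lit)."""
    ir = camera.get_ir_frame()                       # S130: infrared image
    spots = detect_led_spots(ir)                     # positions and count of LEDs
    if len(spots) != actual_led_count:               # S140: count consistency check
        return False
    depth = camera.get_depth_frame()                 # S150: depth image
    readings = [float(depth[r, c]) for r, c in spots]
    unit_depth_mm = float(np.median(readings))       # depth value of the unit
    # S160: compare with the preset distance (tolerance is an assumption).
    return abs(unit_depth_mm - preset_distance_mm) <= tolerance_mm
```

The invention itself only requires that a depth value of the LED lighting unit be obtained from the depth positions and compared with the preset distance; how the spots are detected and how the individual readings are aggregated is left open and is sketched here only as one possibility.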
Preferably, step S110 comprises:
providing two sets of the LED lighting units, the two sets of LED lighting units being at different preset distances from the somatosensory camera;
performing steps S120 to S160 on the two sets of LED lighting units in turn.
That is, of the two sets of LED lighting units, one set is relatively close to the somatosensory camera and the other set is relatively far from it. In this case, the LED lighting unit closer to the somatosensory camera may be lit first, and steps S120 to S160 are then performed on that LED lighting unit so as to carry out the accuracy test of the somatosensory camera with that set of LED lighting units. After the test of that set of LED lighting units is completed, the other set of LED lighting units may be lit and steps S120 to S160 repeated to complete the accuracy test of the somatosensory camera.
Of course, the LED lighting units provided in this embodiment are not limited to two sets; there may also be multiple sets of LED lighting units, and the multiple sets of LED lighting units may be at different preset distances from the somatosensory camera.
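Building on the sketch above, sequencing the test over two or more LED lighting units at different preset distances could look roughly as follows. The control-unit interface (light_unit and extinguish_unit) and the unit descriptions are hypothetical; measure_unit is the helper from the previous sketch, and the nearest-first ordering follows the procedure described above.

```python
def run_accuracy_test(camera, control, units):
    """Perform S120-S160 on each LED lighting unit in turn, nearest first.

    `units` is a list of dicts such as
    {"id": 0, "led_count": 8, "preset_distance_mm": 800}; the control-unit
    interface (light_unit / extinguish_unit) is hypothetical.
    """
    for unit in sorted(units, key=lambda u: u["preset_distance_mm"]):
        control.light_unit(unit["id"])        # S120: light all LEDs of this unit
        ok = measure_unit(camera, unit["led_count"], unit["preset_distance_mm"])
        control.extinguish_unit(unit["id"])   # only one unit lit at a time
        if not ok:
            return False                      # test failed at this preset distance
    return True                               # accuracy measurement success
```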
In a second aspect of the present invention, as shown in FIG. 2 and FIG. 3, an accuracy measurement device 100 for a somatosensory camera is provided, the device comprising an LED lighting unit 110, a control unit 120, an image acquisition unit 130 and an analysis unit 140.
The LED lighting unit 110 comprises a plurality of LED light-emitting devices 111 spaced apart in the height direction, wherein the LED lighting unit 110 is at a preset distance from the somatosensory camera 200.
The control unit 120 is configured to light all of the LED light-emitting devices 111 in the LED lighting unit 110.
The image acquisition unit 130 is configured to acquire an infrared image of the LED lighting unit 110 and to calculate, from the infrared image, the positions of the LED light-emitting devices 111 in the infrared image and the number of LED light-emitting devices 111.
The analysis unit 140 is configured to compare the number of LED light-emitting devices 111 in the infrared image with the actual number of LED light-emitting devices 111 in the LED lighting unit 110 and, when the number of LED light-emitting devices 111 in the infrared image is consistent with the actual number of LED light-emitting devices 111 in the LED lighting unit 110, to send a depth image acquisition signal to the image acquisition unit 130.
The image acquisition unit 130 is further configured to acquire a depth image of the LED lighting unit 110 in response to the depth image acquisition signal and, based on the positions of the LED light-emitting devices 111 in the infrared image, to locate the depth positions of the LED light-emitting devices 111 in the depth image so as to obtain a depth value of the LED lighting unit 110.
The analysis unit 140 is further configured to compare the depth value of the LED lighting unit 110 with the preset distance and to output an accuracy measurement success signal when the depth value of the LED lighting unit 110 is consistent with the preset distance.
In the accuracy measurement device 100 for a somatosensory camera of this embodiment, when accuracy measurement of the somatosensory camera is required, the control unit 120 lights all of the LED light-emitting devices 111 in the LED lighting unit 110; the image acquisition unit 130 then acquires an infrared image of the LED lighting unit 110 and calculates the number and positions of the LED light-emitting devices 111 in the infrared image from the infrared image. When that number is consistent with the actual number of LED light-emitting devices 111, the image acquisition unit 130 further acquires a depth image, and the depth value of the LED light-emitting devices 111 is calculated from the depth image and the positions of the LED light-emitting devices 111 in the infrared image. When the depth value is consistent with the preset distance, the accuracy measurement is completed and an accuracy measurement success signal is output. The accuracy measurement device 100 of this embodiment is therefore a fully automated test mode, eliminating the time consumed by manual judgment and the measurement error that manual judgment introduces. Furthermore, this accuracy measurement device is low in cost, simple to implement, and easy to maintain and deploy at scale.
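The division of work among the units described above can also be pictured structurally. The sketch below maps the image acquisition unit 130 and the analysis unit 140 onto two small classes purely for illustration; the depth image acquisition signal is modeled as the boolean returned by the count check, detect_led_spots is the helper from the earlier sketch, and all class and method names are assumptions rather than terms used by the invention.

```python
class ImageAcquisitionUnit:
    """Role of the image acquisition unit 130 (illustrative only)."""

    def __init__(self, camera):
        self.camera = camera

    def acquire_ir_positions(self):
        # Infrared image -> positions (and hence number) of the lit LEDs.
        return detect_led_spots(self.camera.get_ir_frame())

    def acquire_depth_value(self, spots):
        # Depth image sampled at the IR positions -> depth value of the unit.
        depth = self.camera.get_depth_frame()
        readings = [float(depth[r, c]) for r, c in spots]
        return sum(readings) / len(readings)


class AnalysisUnit:
    """Role of the analysis unit 140 (illustrative only)."""

    def __init__(self, actual_led_count, preset_distance_mm, tolerance_mm=10):
        self.actual_led_count = actual_led_count
        self.preset_distance_mm = preset_distance_mm
        self.tolerance_mm = tolerance_mm

    def count_is_consistent(self, spots):
        # Returning True stands in for the "depth image acquisition signal".
        return len(spots) == self.actual_led_count

    def depth_is_consistent(self, depth_value_mm):
        return abs(depth_value_mm - self.preset_distance_mm) <= self.tolerance_mm
```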
Preferably, the LED lighting unit 110 comprises two sets of LED lighting units 110, the two sets of LED lighting units 110 being at different preset distances from the somatosensory camera 200.
The control unit 120 is configured to independently control the working states of the two sets of LED lighting units 110.
Specifically, of the two sets of LED lighting units 110, one set is relatively close to the somatosensory camera 200 and the other set is relatively far from it. When accuracy measurement of the somatosensory camera 200 is required, the control unit 120 may first light the LED lighting unit 110 closer to the somatosensory camera 200 so as to carry out the accuracy test of the somatosensory camera 200 with that set of LED lighting units 110. After that test is completed, the control unit 120 may light the other set of LED lighting units 110 so as to carry out the accuracy test of the somatosensory camera 200 with that set of LED lighting units 110.
Of course, the LED lighting units 110 provided in this embodiment are not limited to two sets; there may also be multiple sets of LED lighting units 110, and the multiple sets of LED lighting units 110 may be at different preset distances from the somatosensory camera 200.
Preferably, as shown in FIG. 2 and FIG. 3, the device further comprises lamp posts 150 in one-to-one correspondence with the LED lighting units 110, each set of LED lighting units 110 being disposed in the lamp post 150 corresponding to it.
In this way, all of the LED light-emitting devices 111 in each set of LED lighting units 110 can be arranged at intervals in the lamp post 150, so that the LED light-emitting devices 111 are at different preset heights above the ground. The LED lighting unit 110 can thus be used to perform a height accuracy test of the somatosensory camera 200.
Preferably, as shown in FIG. 2 and FIG. 3, the device further comprises a test darkroom 160, the lamp posts 150 being disposed inside the test darkroom 160.
The accuracy measurement device 100 for a somatosensory camera of this embodiment comprises a test darkroom 160, that is, the test darkroom 160 is a light-tight test darkroom. In this way, interference from natural light, or from ambient light such as incandescent lamps in the production environment, is eliminated, thereby improving the test accuracy of the somatosensory camera.
Preferably, the control unit 120 comprises a single-chip microcomputer.
Preferably, the single-chip microcomputer is further configured to be communicatively connected to a terminal so as to receive a test request sent by the terminal;
the single-chip microcomputer is configured to control the working state of the LED lighting unit 110 according to the test request sent by the terminal.
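As a rough illustration of such a test request from the terminal side, the snippet below uses the pyserial library to send a simple ASCII command to the single-chip microcomputer over a serial link; the port name, baud rate and command format are assumptions, since the invention does not specify the communication protocol between the terminal and the microcomputer.

```python
import serial  # pyserial


def send_test_request(unit_id, port="/dev/ttyUSB0", baudrate=115200, timeout=1.0):
    """Ask the microcomputer to light the given LED lighting unit for a test."""
    with serial.Serial(port, baudrate, timeout=timeout) as link:
        link.write(f"TEST UNIT {unit_id} ON\n".encode("ascii"))
        reply = link.readline().decode("ascii", errors="ignore").strip()
        return reply == "OK"   # acknowledgement format is an assumption
```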
It should be understood that the above embodiments are merely exemplary embodiments adopted to illustrate the principles of the present invention, and the present invention is not limited thereto. Those of ordinary skill in the art can make various modifications and improvements without departing from the spirit and substance of the present invention, and such modifications and improvements are also regarded as falling within the protection scope of the present invention.

Claims (8)

  1. An accuracy measurement method for a somatosensory camera, characterized in that the method comprises:
    S110: providing an LED lighting unit, the LED lighting unit comprising a plurality of LED light-emitting devices spaced apart in the height direction, wherein the LED lighting unit is at a preset distance from the somatosensory camera;
    S120: lighting all of the LED light-emitting devices in the LED lighting unit;
    S130: acquiring an infrared image of the LED lighting unit, and calculating, from the infrared image, the positions of the LED light-emitting devices in the infrared image and the number of LED light-emitting devices;
    S140: comparing the number of LED light-emitting devices in the infrared image with the actual number of LED light-emitting devices in the LED lighting unit and, when the number of LED light-emitting devices in the infrared image is consistent with the actual number of LED light-emitting devices in the LED lighting unit, proceeding to step S150;
    S150: acquiring a depth image of the LED lighting unit and, based on the positions of the LED light-emitting devices in the infrared image, locating the depth positions of the LED light-emitting devices in the depth image to obtain a depth value of the LED lighting unit;
    S160: comparing the depth value of the LED lighting unit with the preset distance, and outputting an accuracy measurement success signal when the depth value of the LED lighting unit is consistent with the preset distance.
  2. The accuracy measurement method for a somatosensory camera according to claim 1, characterized in that step S110 comprises:
    providing two sets of the LED lighting units, the two sets of LED lighting units being at different preset distances from the somatosensory camera;
    performing steps S120 to S160 on the two sets of LED lighting units in turn.
  3. An accuracy measurement device for a somatosensory camera, characterized in that the device comprises an LED lighting unit, a control unit, an image acquisition unit and an analysis unit;
    the LED lighting unit comprises a plurality of LED light-emitting devices spaced apart in the height direction, wherein the LED lighting unit is at a preset distance from the somatosensory camera;
    the control unit is configured to light all of the LED light-emitting devices in the LED lighting unit;
    the image acquisition unit is configured to acquire an infrared image of the LED lighting unit and to calculate, from the infrared image, the positions of the LED light-emitting devices in the infrared image and the number of LED light-emitting devices;
    the analysis unit is configured to compare the number of LED light-emitting devices in the infrared image with the actual number of LED light-emitting devices in the LED lighting unit and, when the number of LED light-emitting devices in the infrared image is consistent with the actual number of LED light-emitting devices in the LED lighting unit, to send a depth image acquisition signal to the image acquisition unit;
    the image acquisition unit is further configured to acquire a depth image of the LED lighting unit in response to the depth image acquisition signal and, based on the positions of the LED light-emitting devices in the infrared image, to locate the depth positions of the LED light-emitting devices in the depth image so as to obtain a depth value of the LED lighting unit;
    the analysis unit is further configured to compare the depth value of the LED lighting unit with the preset distance and to output an accuracy measurement success signal when the depth value of the LED lighting unit is consistent with the preset distance.
  4. The accuracy measurement device for a somatosensory camera according to claim 3, characterized in that the LED lighting unit comprises two sets of LED lighting units, the two sets of LED lighting units being at different preset distances from the somatosensory camera;
    the control unit is configured to independently control the working states of the two sets of LED lighting units.
  5. The accuracy measurement device for a somatosensory camera according to claim 4, characterized in that the device further comprises lamp posts in one-to-one correspondence with the LED lighting units, each set of LED lighting units being disposed in the lamp post corresponding to it.
  6. The accuracy measurement device for a somatosensory camera according to claim 5, characterized in that the device further comprises a test darkroom, the lamp posts being disposed within the test darkroom.
  7. The accuracy measurement device for a somatosensory camera according to any one of claims 3 to 6, characterized in that the control unit comprises a single-chip microcomputer.
  8. The accuracy measurement device for a somatosensory camera according to claim 7, characterized in that the single-chip microcomputer is further configured to be communicatively connected to a terminal so as to receive a test request sent by the terminal;
    the single-chip microcomputer is configured to control the working state of the LED lighting unit according to the test request sent by the terminal.
PCT/CN2019/070759 2018-01-09 2019-01-08 Accuracy measurement method for a somatosensory camera and accuracy measurement device for a somatosensory camera WO2019137347A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810020810.1 2018-01-09
CN201810020810.1A CN108234998A (zh) 2018-01-09 2018-01-09 Accuracy measurement method for a somatosensory camera and accuracy measurement device for a somatosensory camera

Publications (1)

Publication Number Publication Date
WO2019137347A1 true WO2019137347A1 (zh) 2019-07-18

Family

ID=62641646

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/070759 WO2019137347A1 (zh) 2018-01-09 2019-01-08 Accuracy measurement method for a somatosensory camera and accuracy measurement device for a somatosensory camera

Country Status (2)

Country Link
CN (1) CN108234998A (zh)
WO (1) WO2019137347A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108234998A (zh) * 2018-01-09 2018-06-29 南京华捷艾米软件科技有限公司 Accuracy measurement method for a somatosensory camera and accuracy measurement device for a somatosensory camera
CN110225336B (zh) 2019-06-21 2022-08-26 京东方科技集团股份有限公司 Method and device for evaluating image acquisition accuracy, electronic device, and readable medium
CN111601101A (zh) * 2020-04-30 2020-08-28 欧菲微电子技术有限公司 Camera accuracy evaluation method, device, system and electronic device


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106767933B (zh) * 2017-02-10 2024-01-12 奥比中光科技集团股份有限公司 Measurement system, measurement method, evaluation method and compensation method for depth camera errors

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102572505A (zh) * 2010-11-03 2012-07-11 微软公司 In-home depth camera calibration
CN105741346A (zh) * 2014-12-29 2016-07-06 达索系统公司 Method for calibrating a depth camera
CN104506857A (zh) * 2015-01-15 2015-04-08 苏州阔地网络科技有限公司 Camera position deviation detection method and device
US20170019663A1 (en) * 2015-07-14 2017-01-19 Microsoft Technology Licensing, Llc Depth-spatial frequency-response assessment
CN108234998A (zh) * 2018-01-09 2018-06-29 南京华捷艾米软件科技有限公司 Accuracy measurement method for a somatosensory camera and accuracy measurement device for a somatosensory camera

Also Published As

Publication number Publication date
CN108234998A (zh) 2018-06-29


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19738767

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19738767

Country of ref document: EP

Kind code of ref document: A1


32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 10.03.2021)
