WO2024055662A1 - Near-eye display module detection method and detection system - Google Patents

Near-eye display module detection method and detection system

Info

Publication number
WO2024055662A1
Authority
WO
WIPO (PCT)
Prior art keywords
display module
eye display
screen
lens
focus
Prior art date
Application number
PCT/CN2023/101287
Other languages
English (en)
French (fr)
Inventor
王雷
冯奇
毛涌
欧昌东
郑增强
Original Assignee
武汉精测电子集团股份有限公司
武汉精立电子技术有限公司
苏州精濑光电有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 武汉精测电子集团股份有限公司, 武汉精立电子技术有限公司, 苏州精濑光电有限公司 filed Critical 武汉精测电子集团股份有限公司
Publication of WO2024055662A1 publication Critical patent/WO2024055662A1/zh

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01M TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M 11/00 Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
    • G01M 11/02 Testing optical properties
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 Systems specially adapted for particular applications
    • G01N 21/88 Investigating the presence of flaws or contamination
    • G01N 21/95 Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01 Head-up displays
    • G02B 27/0101 Head-up displays characterised by optical features
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/10 Image acquisition
    • G06V 10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V 10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V 10/147 Details of sensors, e.g. sensor lenses

Definitions

  • This application relates to the field of optical detection technology, and in particular to a near-eye display module detection method and detection system.
  • With the improvement of quality of life, VR (Virtual Reality) technology is increasingly applied to people's entertainment and daily life. VR glasses, as the carrier through which VR technology is used, are purchased and used by more and more people.
  • The module quality of VR glasses directly affects the user's viewing experience, and as people keep pursuing better display quality in VR glasses, the quality requirements for VR modules become increasingly demanding.
  • Embodiments of this application provide a near-eye display module detection method and detection system to solve the problems of low manual detection efficiency and large errors in related technologies.
  • In a first aspect, a near-eye display module detection method is provided, which includes:
  • determining, within a focus space, several focus positions along the direction perpendicular to the screen of the near-eye display module, wherein the focus space includes the space between the screen and the plane that is parallel to the screen and passes through the point of the lens of the near-eye display module farthest from the screen;
  • lighting the lens, and focusing at each of the focus positions to capture images; acquiring the coordinates and sharpness values of all defects on each image;
  • determining, based on the coordinates and sharpness values of the defects, the positions of the defects on the near-eye display module.
  • In some embodiments, determining the position of the defect on the near-eye display module based on the coordinates and sharpness values of the defect includes the following steps: acquiring, from each image, the sharpness value of the defect at the same coordinates and the corresponding focus position at which that image was captured; and finding the focus position at which the sharpness value is maximum, the defect then lying on a target surface.
  • The target surface is: the surface of the near-eye display module that intersects the plane parallel to the screen where that focus position is located.
  • In some embodiments, the method further includes: determining the position of the defect on the target surface according to the coordinates of the defect.
  • In some embodiments, when the surface of the lens containing the point farthest from the screen is not etched, lighting the lens includes:
  • lighting up the screen of the near-eye display module for illumination; or, when focusing and shooting with the point of the lens farthest from the screen as the focus position, turning on a side light source for illumination, and when focusing and shooting at the other focus positions, lighting up the screen of the near-eye display module for illumination.
  • In some embodiments, when it is detected that the surface of the lens containing the point farthest from the screen has been etched, lighting the lens includes: when focusing and shooting with the point of the lens farthest from the screen as the focus position, turning on the side light source for illumination; and when focusing and shooting at the other focus positions, lighting up the screen of the near-eye display module for illumination.
  • the focus positions are equally spaced in a direction perpendicular to the screen of the near-eye display module.
  • the position of the farthest point of the lens from the screen is used as the first focus position.
  • the method further includes: adjusting the image acquisition device so that the image acquisition device faces the lens of the near-eye display module.
  • In a second aspect, a near-eye display module detection system is provided, which includes:
  • a stage, which is used to arrange the lens and the screen of the near-eye display module, with the lens spaced apart from the display surface of the screen;
  • an image acquisition device, which has a shooting angle;
  • when at the shooting angle, the image acquisition device faces the lens of the near-eye display module, and the space between the screen and the plane that is parallel to the screen and passes through the point of the lens of the near-eye display module farthest from the screen lies within the zoom range of the detection lens of the image acquisition device;
  • a computing device, used to calculate the coordinates and sharpness values of all defects on the images captured by the image acquisition device, and to determine, based on the coordinates and sharpness values of the defects, the positions of the defects on the near-eye display module.
  • In some embodiments, the system further includes a side light source; the side light source has a lighting area, and the lighting area at least partially covers the stage, so that the near-eye display module enters the lighting area.
  • The near-eye display module detection method provided by the embodiments of the present application focuses and shoots at different positions and analyzes the coordinates and sharpness values to determine the positions of defects on the near-eye display module, thereby completing the inspection of the near-eye display module. This method can effectively avoid the subjective judgment errors of manual inspection and improve detection accuracy.
  • Meanwhile, compared with manual one-by-one inspection, this application processes the images to obtain the coordinates and sharpness values of all defects on each image, so multiple defects can be detected at once, which improves detection efficiency.
  • Figure 1 is a flow chart of a near-eye display module detection method provided by an embodiment of the present application.
  • Figure 2 is a schematic diagram of a near-eye display module detection system provided by an embodiment of the present application.
  • Figure 3 is a schematic diagram of a near-eye display module provided by an embodiment of the present application.
  • Figure 4 is a focus position-sharpness value curve provided by an embodiment of the present application.
  • the detection method includes the following steps:
  • Within the focus space, several focus positions are determined along the direction perpendicular to the screen 3 of the near-eye display module; the focus space includes the space between the screen 3 and the plane that is parallel to the screen 3 and passes through the point of the lens 2 of the near-eye display module farthest from the screen 3.
  • The display surface of the screen 3 of the near-eye display module is spaced apart from the lens 2, and the lens 2 is located between the image acquisition device 1 and the screen 3.
  • The image acquisition device 1 faces the lens 2; before capturing images, if it does not face the lens directly, the image acquisition device 1 can be adjusted so that it faces the lens 2 of the near-eye display module.
  • A defect may appear on the surface of the lens 2 closest to the image acquisition device 1, that is, on the surface of the lens 2 farthest from the screen 3; it may also appear on the surface of an intermediate lens element; and it may also appear on the display surface of the screen 3. Therefore, in order to locate the defects, the above focus space at least includes the space between the screen 3 and the plane parallel to the screen 3 that passes through the point of the lens 2 farthest from the screen 3.
  • Within the focus space, N focus positions are determined along the direction perpendicular to the screen 3 of the near-eye display module, with N ≥ 1.
  • Each focus position is used to focus and capture images.
  • The point of the lens 2 farthest from the screen 3 can be used as one focus position.
  • To ensure that defects on every surface of the near-eye display module can be detected while keeping the focus space minimal and reasonable, the first boundary of the focus space can be set to the plane that is parallel to the screen 3 and passes through the point of the lens 2 farthest from the screen 3, and the second boundary of the focus space can be set to the plane where the display surface of the screen 3 lies.
  • The size of the above focus space can be determined according to actual detection needs. For example, each surface of the lens 2 also produces a virtual image surface in the optical system of the lens: as shown in Figure 3, surface B is imaged in surface A, and surface C is imaged in surface A and surface B. Since the virtual image surface often covers a larger range, the virtual image surface can be photographed for defect detection. In that case, as shown in Figure 2, the second boundary can be moved downward so that the screen 3 fully enters the focus space, and the Nth focus position moves downward accordingly; in other words, the focus space is expanded to enlarge the shooting range.
  • The advantage of detecting the virtual image surface in this way is that defects are magnified during imaging and are easier to identify.
  • For more convenient detection, within the focus space, the focus positions are equally spaced along the direction perpendicular to the screen 3 of the near-eye display module; that is, the distance between the first boundary and the second boundary is divided into equal parts.
  • the above-mentioned near-eye display module should be understood in a broad sense, that is, a display module including a screen and a lens.
  • For example, the above-mentioned near-eye display module can be a VR module; as another example, it can be an AR projector.
  • each image can be processed using a computing device to obtain the coordinates and sharpness values of all defects on each image.
  • When one image is processed, several sharpness values are output; each sharpness value represents one defect, and the number of sharpness values indicates how many defects appear on the image.
  • The near-eye display module detection method provided by the embodiment of the present application focuses and shoots at different positions and analyzes the coordinates and sharpness values to determine the positions of defects on the near-eye display module, thereby completing the detection of the near-eye display module.
  • this method can effectively avoid subjective judgment errors in manual detection and improve detection accuracy.
  • this application processes the image to obtain the coordinates and sharpness values of all defects on the image, so that multiple defects can be detected at one time, so the detection efficiency can be improved.
  • The above-mentioned defects may be dirt, foreign matter, scratches and the like, where dirt refers to things such as oil stains and fingerprints, and foreign matter refers to things such as fallen dust, debris and hair.
  • the center lines of the above-mentioned screen 3, the detection lens 10 of the image acquisition device 1 and the lens 2 coincide with each other.
  • Further, step 104 of determining the position of the defect on the near-eye display module based on the coordinates and sharpness values of the defect includes the following steps: acquiring, from each image, the sharpness value of the defect at the same coordinates and the corresponding focus position at which that image was captured; and finding the focus position at which the sharpness value is maximum, the defect then lying on the target surface.
  • The target surface is: the surface of the near-eye display module that intersects the plane parallel to the screen 3 where that focus position is located.
  • The defects on the near-eye display module include defects on each lens element of the lens 2 and defects on the screen 3, and their total number is fixed.
  • The detection lens 10 of the image acquisition device 1 faces the lens 2, and the image acquisition device 1 does not move while capturing images; that is, the position of the detection lens 10 of the image acquisition device 1 relative to the lens 2 does not change.
  • Only the focus position changes during shooting, so all the defects appear on the captured images; each defect appears in the images captured at all focus positions, and the coordinates of a given defect are the same in the different images.
  • For example, if there are 10 focus positions, 10 images are captured; a defect with coordinates (100, 100) in the first image has the same coordinates in the other 9 images, and only its sharpness values differ. For example, the sharpness values from the first image to the tenth image are a1, a2, a3, a4, a5, a6, a7, a8, a9 and a10, and the focus positions corresponding to the first to tenth images are b1, b2, b3, b4, b5, b6, b7, b8, b9 and b10.
  • The sharpness values can then be associated with the focus positions to obtain a focus position-sharpness value curve, as shown in Figure 4. The focus position at which the sharpness value is maximum identifies the surface on which the defect lies; this surface is the surface of the near-eye display module that intersects the plane parallel to the screen 3 where that focus position is located.
  • However, possibly due to shooting conditions, the same defect may appear in some images but not in others, so its sharpness value cannot be calculated on those images. In that case, a sharpness value can be assigned to the defect on those images.
  • The specific value can be determined according to actual detection needs; for example, it can be 0.
  • The above mainly determines on which surface of the near-eye display module the defect appears, which is a relatively coarse localization. To meet precise localization needs, the detection method provided by the embodiment of the present application further includes: determining the position of the defect on the target surface according to the coordinates of the defect.
  • The outer surface of the lens 2, that is, the surface containing the point of the lens 2 farthest from the screen 3, may be etched to present a desired pattern, such as an irregular shape; it is also possible that no etching is performed. Different lighting approaches can be used for these two cases.
  • When the surface of the lens 2 farthest from the screen 3 is not etched, lighting the lens 2 includes: lighting up the screen 3 of the near-eye display module for illumination; or, turning on the side light source 4 for illumination when focusing and shooting with the point of the lens 2 farthest from the screen 3 as the focus position, and lighting up the screen 3 of the near-eye display module for illumination when focusing and shooting at the other focus positions.
  • When it is detected that the surface of the lens 2 farthest from the screen 3 has been etched, lighting the lens 2 includes: turning on the side light source 4 for illumination when focusing and shooting with the point of the lens 2 farthest from the screen 3 as the focus position, and lighting up the screen 3 of the near-eye display module for illumination when focusing and shooting at the other focus positions.
  • the number of side light sources 4 can be determined according to actual needs.
  • the embodiment of the present application also provides a near-eye display module detection system.
  • the detection system can execute the detection method provided in the above embodiment.
  • the detection system includes a stage, an image acquisition device 1 and a computing device.
  • The stage is used to arrange the lens 2 and the screen 3 of the near-eye display module, and the lens 2 is spaced apart from the display surface of the screen 3.
  • the image acquisition device 1 includes an area array camera 11 and a detection lens 10 that are connected to each other.
  • The image acquisition device 1 has a shooting angle; when at the shooting angle, the image acquisition device 1 faces the lens 2 of the near-eye display module.
  • The space between the screen 3 and the plane that is parallel to the screen 3 and passes through the point of the lens 2 of the near-eye display module farthest from the screen 3 lies within the zoom range of the detection lens 10 of the image acquisition device 1.
  • Because this enclosed space lies within the zoom range of the detection lens 10, the detection lens 10 can select different focus positions within it, along the direction perpendicular to the screen 3 of the near-eye display module, to perform focused shooting.
  • The computing device receives the images transmitted by the image acquisition device 1, calculates the coordinates and sharpness values of all defects on the images captured by the image acquisition device 1, and determines, based on the coordinates and sharpness values of the defects, the positions of the defects on the near-eye display module.
  • The detection system provided by this embodiment focuses and shoots at different positions and analyzes the coordinates and sharpness values to determine the positions of the defects on the near-eye display module, thereby completing the inspection of the near-eye display module. Using this detection system can effectively avoid the subjective judgment errors of manual inspection and improve detection accuracy.
  • this application processes the image to obtain the coordinates and sharpness values of all defects on the image, so that multiple defects can be detected at one time, so the detection efficiency can be improved.
  • the image acquisition device 1 can be adjusted in angle.
  • the detection lens 10 of the image acquisition device 1 can have the characteristics of automatic focus, front diaphragm, and large viewing angle.
  • the viewing angle can be up to 150°.
  • The detection system is also provided with a side light source 4, which has a lighting area; the lighting area at least partially covers the stage, so that the near-eye display module enters the lighting area when it is placed on the stage.
  • the angle of the side light source 4 can be adjusted.
  • Unless otherwise explicitly specified and limited, the terms "installed", "connected" and "coupled" should be understood broadly: a connection can be a fixed connection, a detachable connection or an integral connection; it can be a mechanical connection or an electrical connection; it can be a direct connection, an indirect connection through an intermediate medium, or internal communication between two components.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Analytical Chemistry (AREA)
  • Chemical & Material Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Vascular Medicine (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biochemistry (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Testing Of Optical Devices Or Fibers (AREA)

Abstract

This application relates to a near-eye display module detection method and detection system. The method includes: determining, within a focus space, several focus positions along the direction perpendicular to the screen of the near-eye display module, where the focus space includes the space between the screen and the plane that is parallel to the screen and passes through the point of the lens of the near-eye display module farthest from the screen; lighting the lens and focusing at each focus position to capture images; acquiring the coordinates and sharpness values of all defects on each image; and determining, based on the coordinates and sharpness values of the defects, the positions of the defects on the near-eye display module. This application focuses and shoots at different positions and analyzes the coordinates and sharpness values to determine where the defects are located, so it can effectively avoid errors from subjective manual judgment and improve detection accuracy. Compared with manual one-by-one inspection, this application processes the images to obtain the coordinates and sharpness values of all defects on an image and can therefore detect multiple defects at once, which improves detection efficiency.

Description

Near-eye display module detection method and detection system

Technical Field
 This application relates to the field of optical detection technology, and in particular to a near-eye display module detection method and detection system.
Background
 With the improvement of quality of life, VR (Virtual Reality) technology is increasingly applied to people's entertainment and daily life. VR glasses, as the carrier through which VR technology is used, are purchased and used by more and more people. The module quality of VR glasses directly affects the user's viewing experience, and as people keep pursuing better display quality in VR glasses, the quality requirements for VR modules become higher and higher.
 Existing VR module inspection is still mainly performed manually, which is inefficient; fatigue from long inspection sessions by the same person, or differences in inspection standards between different people, both affect the VR module inspection results and lead to large inspection errors.
Summary
 Embodiments of this application provide a near-eye display module detection method and detection system to solve the problems of low efficiency and large errors of manual inspection in the related art.
 In a first aspect, a near-eye display module detection method is provided, which includes:
determining, within a focus space, several focus positions along the direction perpendicular to the screen of the near-eye display module, wherein the focus space includes the space between the screen and the plane that is parallel to the screen and passes through the point of the lens of the near-eye display module farthest from the screen;
lighting the lens, and focusing at each of the focus positions to capture images;
acquiring the coordinates and sharpness values of all defects on each image;
determining, based on the coordinates and sharpness values of the defects, the positions of the defects on the near-eye display module.
 In some embodiments, determining the position of a defect on the near-eye display module based on the coordinates and sharpness values of the defect includes the following steps:
acquiring, from each image, the sharpness value of the defect at the same coordinates and the corresponding focus position at which that image was captured;
finding the focus position at which the sharpness value is maximum; the defect then lies on a target surface, where the target surface is the surface of the near-eye display module that intersects the plane parallel to the screen where that focus position is located.
 In some embodiments, the method further includes: determining the position of the defect on the target surface according to the coordinates of the defect.
 In some embodiments, when the surface of the lens containing the point farthest from the screen is not etched, lighting the lens includes:
lighting up the screen of the near-eye display module for illumination;
or, when focusing and shooting with the point of the lens farthest from the screen as the focus position, turning on a side light source for illumination, and when focusing and shooting at the remaining focus positions, lighting up the screen of the near-eye display module for illumination.
 In some embodiments, when it is detected that the surface of the lens containing the point farthest from the screen has been etched, lighting the lens includes: when focusing and shooting with the point of the lens farthest from the screen as the focus position, turning on the side light source for illumination; and when focusing and shooting at the remaining focus positions, lighting up the screen of the near-eye display module for illumination.
 In some embodiments, within the focus space, the focus positions are equally spaced along the direction perpendicular to the screen of the near-eye display module.
 In some embodiments, the point of the lens farthest from the screen is used as the first focus position.
 In some embodiments, the method further includes: adjusting the image acquisition device so that the image acquisition device faces the lens of the near-eye display module.
 In a second aspect, a near-eye display module detection system is provided, which includes:
a stage, used to arrange the lens and the screen of the near-eye display module, with the lens spaced apart from the display surface of the screen;
an image acquisition device, which has a shooting angle; when at the shooting angle, the image acquisition device faces the lens of the near-eye display module, and the space between the screen and the plane that is parallel to the screen and passes through the point of the lens of the near-eye display module farthest from the screen lies within the zoom range of the detection lens of the image acquisition device;
a computing device, used to calculate the coordinates and sharpness values of all defects on the images captured by the image acquisition device, and to determine, based on the coordinates and sharpness values of the defects, the positions of the defects on the near-eye display module.
 In some embodiments, the system further includes a side light source; the side light source has a lighting area, and the lighting area at least partially covers the stage, so that the near-eye display module enters the lighting area.
 The beneficial effects brought by the technical solutions provided by this application include:
The near-eye display module detection method provided by the embodiments of this application focuses and shoots at different positions and analyzes the coordinates and sharpness values to determine the positions of the defects on the near-eye display module, thereby completing the inspection of the near-eye display module. This method can effectively avoid the subjective judgment errors of manual inspection and improve detection accuracy.
 Meanwhile, compared with manual one-by-one inspection, this application processes the images to obtain the coordinates and sharpness values of all defects on each image, so multiple defects can be detected at once, which improves detection efficiency.
Brief Description of the Drawings
 To explain the technical solutions in the embodiments of this application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of this application; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
 Figure 1 is a flow chart of the near-eye display module detection method provided by an embodiment of this application;
Figure 2 is a schematic diagram of the near-eye display module detection system provided by an embodiment of this application;
Figure 3 is a schematic diagram of the near-eye display module provided by an embodiment of this application;
Figure 4 is a focus position-sharpness value curve provided by an embodiment of this application.
 In the figures: 1, image acquisition device; 10, detection lens; 11, area array camera; 2, lens; 3, screen; 4, side light source.
Detailed Description
 To make the purposes, technical solutions and advantages of the embodiments of this application clearer, the technical solutions in the embodiments of this application are described clearly and completely below with reference to the drawings in the embodiments of this application. Obviously, the described embodiments are only a part of the embodiments of this application, not all of them. Based on the embodiments in this application, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of this application.
 Referring to Figures 1 and 2, an embodiment of this application provides a near-eye display module detection method, which includes the following steps:
101: Within a focus space, determine several focus positions along the direction perpendicular to the screen 3 of the near-eye display module, where the focus space includes the space between the screen 3 and the plane that is parallel to the screen 3 and passes through the point of the lens 2 of the near-eye display module farthest from the screen 3.
 As shown in Figure 2, the display surface of the screen 3 of the near-eye display module is spaced apart from the lens 2, and the lens 2 is located between the image acquisition device 1 and the screen 3. The image acquisition device 1 faces the lens 2; before capturing images, if it is not facing the lens directly, the image acquisition device 1 can be adjusted so that it faces the lens 2 of the near-eye display module.
 Since this application detects defects of the near-eye display module, and the lens 2 contains one or more lens elements, a defect may appear on the surface of the lens 2 closest to the image acquisition device 1 (that is, the surface of the lens 2 farthest from the screen 3), on the surface of an intermediate lens element, or on the display surface of the screen 3. Therefore, to locate the defects, the focus space at least includes the space between the screen 3 and the plane that is parallel to the screen 3 and passes through the point of the lens 2 farthest from the screen 3.
 Within the focus space, N focus positions are determined along the direction perpendicular to the screen 3 of the near-eye display module, with N ≥ 1, and each focus position is used for focusing and capturing an image. Obviously, the point of the lens 2 farthest from the screen 3 can be used as one focus position. Then, to ensure that defects on every surface of the near-eye display module can be detected while keeping the focus space minimal and reasonable, the first boundary of the focus space can be set to the plane that is parallel to the screen 3 and passes through the point of the lens 2 farthest from the screen 3, and the second boundary of the focus space can be set to the plane where the display surface of the screen 3 lies.
 Obviously, the size of the focus space can be determined according to actual detection needs. For example, since each surface of the lens 2 also produces a virtual image surface in the optical system of the lens (as shown in Figure 3, surface B is imaged in surface A, and surface C is imaged in surface A and surface B), and the virtual image surface often covers a larger range, the virtual image surface can be photographed for defect detection. In that case, as shown in Figure 2, the second boundary can be moved downward so that the screen 3 fully enters the focus space, and the Nth focus position moves downward accordingly; in other words, the focus space is expanded to enlarge the shooting range. The advantage of detecting the virtual image surface in this way is that the defects are magnified during imaging and are easier to identify.
 For more convenient detection, within the focus space, the focus positions are equally spaced along the direction perpendicular to the screen 3 of the near-eye display module; that is, the distance between the first boundary and the second boundary is divided into equal parts.
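 A minimal illustrative sketch of this step, assuming the two boundaries are expressed as distances along the optical axis (the 45 mm value in the example is purely hypothetical and not taken from the text):

```python
import numpy as np

def focus_positions(first_boundary_mm: float, second_boundary_mm: float, n: int) -> np.ndarray:
    """Return n equally spaced focus positions between the two focus-space boundaries.

    first_boundary_mm: plane through the point of the lens farthest from the screen.
    second_boundary_mm: plane of the screen's display surface (or lower, if the
                        focus space is expanded to cover virtual image surfaces).
    """
    if n < 1:
        raise ValueError("at least one focus position is required")
    # linspace includes both boundaries, so the first position coincides with the
    # point of the lens farthest from the screen, as in the embodiment.
    return np.linspace(first_boundary_mm, second_boundary_mm, n)

# Example: 10 focus positions between a first boundary at 0 mm and a second boundary at 45 mm.
positions = focus_positions(0.0, 45.0, 10)
```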
 Obviously, the above near-eye display module should be understood in a broad sense, that is, as a display module containing a screen and a lens. For example, the near-eye display module can be a VR module; as another example, it can be an AR projector.
 102: Light the lens 2, and focus at each focus position to capture images.
 When performing focused shooting, the shooting can proceed from near to far: focus and shoot at the first focus position, then at the second focus position, and so on, until focused shooting at all focus positions is completed.
 Shooting can also proceed from far to near: focus and shoot at the Nth focus position, then at the (N-1)th focus position, and so on, until focused shooting at all focus positions is completed.
 103: Acquire the coordinates and sharpness values of all defects on each image.
 In step 103, a computing device can be used to process each image to obtain the coordinates and sharpness values of all defects on each image.
 When one image is processed, several sharpness values are output; each sharpness value represents one defect, and the number of sharpness values indicates how many defects appear on the image.
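 How each defect region is detected and how its sharpness value is computed are not specified in the text; as one hedged example, the sketch below assumes a variance-of-Laplacian score over a small window around a defect's pixel coordinates (a separate defect detector, not shown, is assumed to supply those coordinates):

```python
import numpy as np

def laplacian_sharpness(gray: np.ndarray, cx: int, cy: int, half: int = 15) -> float:
    """Variance-of-Laplacian sharpness of a (2*half+1)^2 window centred on (cx, cy).

    gray: 2-D grayscale image as a float array; (cx, cy) are pixel coordinates.
    A higher value means the defect is closer to being in focus in this image.
    """
    h, w = gray.shape
    x0, x1 = max(cx - half, 1), min(cx + half, w - 2)
    y0, y1 = max(cy - half, 1), min(cy + half, h - 2)
    win = gray[y0:y1, x0:x1]
    # 5-point finite-difference Laplacian of the window interior.
    lap = (win[1:-1, 2:] + win[1:-1, :-2] + win[2:, 1:-1] + win[:-2, 1:-1]
           - 4.0 * win[1:-1, 1:-1])
    return float(lap.var())
```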
 104: Based on the coordinates and sharpness values of the defects, determine the positions of the defects on the near-eye display module, that is, determine on which surface of which lens element of the near-eye display module each defect lies, or whether it lies on the display surface of the screen 3.
 It can be seen that the near-eye display module detection method provided by the embodiments of this application focuses and shoots at different positions and analyzes the coordinates and sharpness values to determine the positions of the defects on the near-eye display module, thereby completing the inspection of the near-eye display module. This method can effectively avoid the subjective judgment errors of manual inspection and improve detection accuracy.
 Meanwhile, compared with manual one-by-one inspection, this application processes the images to obtain the coordinates and sharpness values of all defects on each image, so multiple defects can be detected at once, which improves detection efficiency.
 It should be noted that the above defects may be dirt, foreign matter, scratches and the like, where dirt refers to things such as oil stains and fingerprints, and foreign matter refers to things such as fallen dust, debris and hair.
 If necessary, the centre lines of the screen 3, the detection lens 10 of the image acquisition device 1 and the lens 2 coincide.
 Further, the above step 104 of determining the positions of the defects on the near-eye display module based on their coordinates and sharpness values includes the following steps:
201: Acquire, from each image, the sharpness value of the defect at the same coordinates and the corresponding focus position at which that image was captured.
 202: Find the focus position at which the sharpness value is maximum; the defect then lies on the target surface, where the target surface is the surface of the near-eye display module that intersects the plane parallel to the screen 3 where that focus position is located.
 As shown in Figure 2, the defects on the near-eye display module include the defects on each lens element of the lens 2 and the defects on the screen 3, and their total number is fixed. The detection lens 10 of the image acquisition device 1 faces the lens 2, and the image acquisition device 1 does not move while capturing images; that is, the position of the detection lens 10 of the image acquisition device 1 relative to the lens 2 does not change, and only the focus position changes during shooting. Therefore, all the defects appear on the captured images, each defect appears in the images captured at all focus positions, and the coordinates of a given defect are the same in the different images.
 For example, if there are 10 focus positions, 10 images are captured. For a defect a whose coordinates in the first image are (100, 100), its coordinates in the other 9 images are also (100, 100); only the sharpness values differ. For example, the sharpness values from the first image to the tenth image are a1, a2, a3, a4, a5, a6, a7, a8, a9 and a10, and the focus positions corresponding to the first to tenth images are b1, b2, b3, b4, b5, b6, b7, b8, b9 and b10.
 The sharpness values can then be associated with the focus positions to obtain a focus position-sharpness value curve, as shown in Figure 4. By finding the focus position at which the sharpness value is maximum, the surface on which the defect lies can be determined; this surface is the surface of the near-eye display module that intersects the plane parallel to the screen 3 where that focus position is located.
 However, possibly due to shooting conditions, the same defect may appear in some images but not in others, so its sharpness value cannot be computed on those images. In this case, a sharpness value can be assigned to the defect on those images; the specific value can be determined according to actual detection needs and, as an example, can be 0.
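 A minimal sketch of this localization step, assuming each captured image has already been reduced to a list of (x, y, sharpness) defect records and that the caller supplies a mapping from focus position to module surface (the surface names below are placeholders):

```python
from collections import defaultdict
from typing import Dict, List, Tuple

Defect = Tuple[int, int, float]  # (x, y, sharpness)

def locate_defects(per_image: Dict[float, List[Defect]],
                   surface_of: Dict[float, str],
                   missing_sharpness: float = 0.0) -> Dict[Tuple[int, int], str]:
    """Assign each defect (keyed by its pixel coordinates) to a module surface.

    per_image: focus position -> defects detected in the image shot at that position.
    surface_of: focus position -> name of the surface intersecting that focus plane
                (e.g. "lens outer surface", "screen"), assumed known from the setup.
    missing_sharpness: value assigned when a defect is absent from an image (0 here).
    """
    focus_positions = sorted(per_image)
    # Build the focus position -> sharpness curve for every defect coordinate.
    curves: Dict[Tuple[int, int], Dict[float, float]] = defaultdict(dict)
    for pos, defects in per_image.items():
        for x, y, sharp in defects:
            curves[(x, y)][pos] = sharp
    located = {}
    for coord, curve in curves.items():
        full = [curve.get(pos, missing_sharpness) for pos in focus_positions]
        best_index = max(range(len(full)), key=lambda i: full[i])
        located[coord] = surface_of[focus_positions[best_index]]
    return located
```

 In practice the coordinates of the same physical defect may jitter by a pixel or two between images, so a small matching tolerance (not shown) would likely be needed when grouping records by coordinate.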
 The above mainly determines on which surface of the near-eye display module a defect appears, which is a relatively coarse localization. To meet some precise localization needs, the detection method provided by the embodiments of this application further includes: determining the position of the defect on the target surface according to the coordinates of the defect.
 Since it is known on which specific surface of the near-eye display module the defect lies, and the coordinates of the defect are also known, the more precise position of the defect can be determined from the coordinates.
 The outer surface of the lens 2, that is, the surface of the lens 2 containing the point farthest from the screen 3 (or, equivalently, the surface containing the point closest to the detection lens 10 of the image acquisition device 1), may be etched to present a desired pattern, such as an irregular shape, or may not be etched.
 In view of this, different lighting approaches can be adopted for these two situations.
 When the surface of the lens 2 containing the point farthest from the screen 3 is not etched, lighting the lens 2 includes:
lighting up the screen 3 of the near-eye display module for illumination;
or, when focusing and shooting with the point of the lens 2 farthest from the screen 3 as the focus position, turning on the side light source 4 for illumination, and when focusing and shooting at the remaining focus positions, lighting up the screen 3 of the near-eye display module for illumination.
 It can be seen that when the outer surface of the lens 2 has not been etched, there are two lighting options: the screen 3 can be used directly for illumination when focusing and shooting at all focus positions; or the side light source 4 can be used for illumination when focusing and shooting at the first focus position, with the screen 3 used for illumination when focusing and shooting at the second to Nth focus positions.
 When it is detected that the surface of the lens 2 containing the point farthest from the screen 3 has been etched, lighting the lens 2 includes: when focusing and shooting with the point of the lens 2 farthest from the screen 3 as the focus position, turning on the side light source 4 for illumination; and when focusing and shooting at the remaining focus positions, lighting up the screen 3 of the near-eye display module for illumination.
 It can be seen that when the outer surface of the lens 2 has been etched, the side light source 4 must be used for illumination when focusing and shooting at the first focus position, to avoid false detections caused by backlighting and improve detection accuracy.
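 A hedged sketch of this lighting decision, expressed as a selection that downstream hardware-control code (not shown) would act on; the boolean flag and the preference parameter are assumptions for illustration:

```python
def select_lighting(focus_index: int, outer_surface_etched: bool,
                    prefer_side_light_first: bool = True) -> str:
    """Return which light source to use for the shot at the given focus position.

    focus_index: 0 for the focus position at the point of the lens farthest from
                 the screen, 1..N-1 for the remaining positions.
    outer_surface_etched: True if the outer lens surface carries an etched pattern.
    prefer_side_light_first: for the non-etched case, whether to use the optional
                             side-light variant described above for the first shot.
    """
    if focus_index == 0 and (outer_surface_etched or prefer_side_light_first):
        return "side_light"       # avoid backlight-induced false detections
    return "screen_backlight"     # light up the module's own screen
```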
 The number of side light sources 4 can be determined according to actual needs.
 As shown in Figure 2, an embodiment of this application also provides a near-eye display module detection system, which can execute the detection method provided in the above embodiments. The detection system includes a stage, an image acquisition device 1 and a computing device, where the stage is used to arrange the lens 2 and the screen 3 of the near-eye display module, with the lens 2 spaced apart from the display surface of the screen 3.
 The image acquisition device 1 includes an area array camera 11 and a detection lens 10 that are connected to each other. The image acquisition device 1 has a shooting angle; when at the shooting angle, the image acquisition device 1 faces the lens 2 of the near-eye display module, and the space between the screen 3 and the plane that is parallel to the screen 3 and passes through the point of the lens 2 of the near-eye display module farthest from the screen 3 lies within the zoom range of the detection lens 10 of the image acquisition device 1. Because this enclosed space lies within the zoom range of the detection lens 10, the detection lens 10 can select different focus positions within it, along the direction perpendicular to the screen 3 of the near-eye display module, to perform focused shooting.
 The computing device receives the images transmitted by the image acquisition device 1, calculates the coordinates and sharpness values of all defects on the images captured by the image acquisition device 1, and determines, based on the coordinates and sharpness values of the defects, the positions of the defects on the near-eye display module.
 It can be seen that the detection system provided by this embodiment focuses and shoots at different positions and analyzes the coordinates and sharpness values to determine the positions of the defects on the near-eye display module, thereby completing the inspection of the near-eye display module. Using this detection system can effectively avoid the subjective judgment errors of manual inspection and improve detection accuracy.
 Meanwhile, compared with manual one-by-one inspection, this application processes the images to obtain the coordinates and sharpness values of all defects on each image, so multiple defects can be detected at once, which improves detection efficiency.
 If necessary, the angle of the image acquisition device 1 can be adjusted.
 If necessary, the detection lens 10 of the image acquisition device 1 can have autofocus, a front-positioned aperture stop and a large angle of view, for example an angle of view of up to 150°.
 Further, the detection system is also provided with a side light source 4, which has a lighting area; the lighting area at least partially covers the stage, so that the near-eye display module enters the lighting area when it is placed on the stage.
 If necessary, the angle of the side light source 4 can be adjusted.
 In the description of this application, it should be noted that orientation or positional terms such as "upper" and "lower" indicate orientations or positional relationships based on the drawings; they are only intended to facilitate describing this application and to simplify the description, and do not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation, and therefore cannot be understood as limiting this application. Unless otherwise explicitly specified and limited, the terms "installed", "connected" and "coupled" should be understood broadly: for example, a connection can be a fixed connection, a detachable connection or an integral connection; it can be a mechanical connection or an electrical connection; it can be a direct connection, an indirect connection through an intermediate medium, or internal communication between two elements. For those of ordinary skill in the art, the specific meanings of the above terms in this application can be understood according to the specific circumstances.
 It should be noted that, in this application, relational terms such as "first" and "second" are only used to distinguish one entity or operation from another and do not necessarily require or imply any actual relationship or order between these entities or operations. Moreover, the terms "comprising", "including" or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article or device that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article or device that includes that element.
 The above are only specific implementations of this application, enabling those skilled in the art to understand or implement this application. Various modifications to these embodiments will be obvious to those skilled in the art, and the general principles defined herein can be implemented in other embodiments without departing from the spirit or scope of this application. Therefore, this application is not limited to the embodiments shown herein, but shall conform to the widest scope consistent with the principles and novel features disclosed herein.

Claims (9)

  1.  A near-eye display module detection method, characterized in that it includes:
    determining, within a focus space, several focus positions along the direction perpendicular to the screen (3) of the near-eye display module, wherein the focus space includes the space between the screen (3) and the plane that is parallel to the screen (3) and passes through the point of the lens (2) of the near-eye display module farthest from the screen (3);
    lighting the lens (2), keeping the position of the detection lens (10) of the image acquisition device (1) fixed relative to the lens (2), and focusing at each of the focus positions through the detection lens (10) to capture images;
    acquiring the coordinates and sharpness values of all defects on each image;
    determining, based on the coordinates and sharpness values of the defects, the positions of the defects on the near-eye display module;
    wherein determining the positions of the defects on the near-eye display module based on the coordinates and sharpness values of the defects includes the following steps:
    acquiring, from each image, the sharpness value of the defect at the same coordinates and the corresponding focus position at which that image was captured, and associating the sharpness values of the defect at the same coordinates in the images with the focus positions;
    finding the focus position at which the sharpness value is maximum, the defect then lying on a target surface, wherein the target surface is the surface of the near-eye display module that intersects the plane parallel to the screen (3) where that focus position is located.
  2.  The near-eye display module detection method according to claim 1, characterized in that:
    the method further includes: determining the position of the defect on the target surface according to the coordinates of the defect.
  3.  The near-eye display module detection method according to claim 1, characterized in that:
    when the surface of the lens (2) containing the point farthest from the screen (3) is not etched, lighting the lens (2) includes:
    lighting up the screen (3) of the near-eye display module for illumination;
    or, when focusing and shooting with the point of the lens (2) farthest from the screen (3) as the focus position, turning on a side light source (4) for illumination, and when focusing and shooting at the remaining focus positions, lighting up the screen (3) of the near-eye display module for illumination.
  4.  The near-eye display module detection method according to claim 1, characterized in that:
    when it is detected that the surface of the lens (2) containing the point farthest from the screen (3) has been etched, lighting the lens (2) includes: when focusing and shooting with the point of the lens (2) farthest from the screen (3) as the focus position, turning on the side light source (4) for illumination; and when focusing and shooting at the remaining focus positions, lighting up the screen (3) of the near-eye display module for illumination.
  5.  The near-eye display module detection method according to claim 1, characterized in that:
    within the focus space, the focus positions are equally spaced along the direction perpendicular to the screen (3) of the near-eye display module.
  6.  The near-eye display module detection method according to claim 5, characterized in that:
    the point of the lens (2) farthest from the screen (3) is used as the first focus position.
  7.  The near-eye display module detection method according to claim 1, characterized in that:
    the method further includes: adjusting the image acquisition device (1) so that the image acquisition device (1) faces the lens (2) of the near-eye display module.
  8.  A near-eye display module detection system, characterized in that it includes:
    a stage, used to arrange the lens (2) and the screen (3) of the near-eye display module, with the lens (2) spaced apart from the display surface of the screen (3);
    an image acquisition device (1), which has a shooting angle; when at the shooting angle, the image acquisition device (1) faces the lens (2) of the near-eye display module, and the space between the screen (3) and the plane that is parallel to the screen (3) and passes through the point of the lens (2) of the near-eye display module farthest from the screen (3) lies within the zoom range of the detection lens (10) of the image acquisition device (1); when capturing images, the position of the detection lens (10) of the image acquisition device (1) relative to the lens (2) does not change;
    a computing device, used to calculate the coordinates and sharpness values of all defects on the images captured by the image acquisition device (1), and to determine, based on the coordinates and sharpness values of the defects, the positions of the defects on the near-eye display module;
    wherein determining the positions of the defects on the near-eye display module based on the coordinates and sharpness values of the defects includes:
    acquiring, from each image, the sharpness value of the defect at the same coordinates and the corresponding focus position at which that image was captured, and associating the sharpness values of the defect at the same coordinates in the images with the focus positions;
    finding the focus position at which the sharpness value is maximum, the defect then lying on a target surface, wherein the target surface is the surface of the near-eye display module that intersects the plane parallel to the screen (3) where that focus position is located.
  9.  The near-eye display module detection system according to claim 8, characterized in that it further includes a side light source (4), the side light source (4) has a lighting area, and the lighting area at least partially covers the stage, so that the near-eye display module enters the lighting area.
PCT/CN2023/101287 2022-09-13 2023-06-20 一种近眼显示模组检测方法及检测系统 WO2024055662A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211110547.8A CN115183989B (zh) 2022-09-13 2022-09-13 一种近眼显示模组检测方法及检测系统
CN202211110547.8 2022-09-13

Publications (1)

Publication Number Publication Date
WO2024055662A1 true WO2024055662A1 (zh) 2024-03-21

Family

ID=83524551

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/101287 WO2024055662A1 (zh) 2022-09-13 2023-06-20 一种近眼显示模组检测方法及检测系统

Country Status (2)

Country Link
CN (1) CN115183989B (zh)
WO (1) WO2024055662A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115183989B (zh) * 2022-09-13 2023-01-10 武汉精立电子技术有限公司 一种近眼显示模组检测方法及检测系统

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004077261A (ja) * 2002-08-16 2004-03-11 Horiba Ltd 液晶パネルの異物検査装置および異物検査方法
WO2013077282A1 (ja) * 2011-11-25 2013-05-30 シャープ株式会社 液晶表示パネルの検査方法
CN111292228A (zh) * 2020-01-16 2020-06-16 宁波舜宇仪器有限公司 镜头缺陷检测方法
CN111784684A (zh) * 2020-07-13 2020-10-16 合肥市商巨智能装备有限公司 基于激光辅助的透明产品内部缺陷定深检测方法及装置
CN112255239A (zh) * 2020-10-22 2021-01-22 青岛歌尔声学科技有限公司 污染位置检测方法、装置、设备及计算机可读存储介质
CN112595496A (zh) * 2020-12-31 2021-04-02 深圳惠牛科技有限公司 近眼显示设备的不良检测方法、装置、设备及存储介质
CN113538431A (zh) * 2021-09-16 2021-10-22 深圳市鑫信腾科技股份有限公司 显示屏瑕疵定位方法、装置、终端设备及系统
CN115049643A (zh) * 2022-08-11 2022-09-13 武汉精立电子技术有限公司 近眼显示模组夹层异物检测方法、装置、设备及存储介质
CN115183989A (zh) * 2022-09-13 2022-10-14 武汉精立电子技术有限公司 一种近眼显示模组检测方法及检测系统

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108267299B (zh) * 2017-12-22 2019-12-20 歌尔股份有限公司 Ar眼镜瞳距测试方法及装置
CN109186959B (zh) * 2018-09-28 2020-02-07 歌尔股份有限公司 Vr光学模组的场曲的检测方法、装置及设备
CN112326206B (zh) * 2020-11-06 2023-06-13 歌尔光学科技有限公司 Ar模组双目融合检测装置及检测方法
CN112595726A (zh) * 2020-12-11 2021-04-02 深圳市智联汇网络系统企业(有限合伙) 一种oled微型显示器件的像素缺陷检测方法
CN114993614A (zh) * 2022-04-24 2022-09-02 北京闪亮视觉智能科技有限公司 Ar头戴设备测试设备及其测试方法
CN114813061B (zh) * 2022-06-23 2022-09-20 武汉精立电子技术有限公司 一种近眼成像设备的光学参数检测方法及系统

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004077261A (ja) * 2002-08-16 2004-03-11 Horiba Ltd 液晶パネルの異物検査装置および異物検査方法
WO2013077282A1 (ja) * 2011-11-25 2013-05-30 シャープ株式会社 液晶表示パネルの検査方法
CN111292228A (zh) * 2020-01-16 2020-06-16 宁波舜宇仪器有限公司 镜头缺陷检测方法
CN111784684A (zh) * 2020-07-13 2020-10-16 合肥市商巨智能装备有限公司 基于激光辅助的透明产品内部缺陷定深检测方法及装置
CN112255239A (zh) * 2020-10-22 2021-01-22 青岛歌尔声学科技有限公司 污染位置检测方法、装置、设备及计算机可读存储介质
CN112595496A (zh) * 2020-12-31 2021-04-02 深圳惠牛科技有限公司 近眼显示设备的不良检测方法、装置、设备及存储介质
CN113538431A (zh) * 2021-09-16 2021-10-22 深圳市鑫信腾科技股份有限公司 显示屏瑕疵定位方法、装置、终端设备及系统
CN115049643A (zh) * 2022-08-11 2022-09-13 武汉精立电子技术有限公司 近眼显示模组夹层异物检测方法、装置、设备及存储介质
CN115183989A (zh) * 2022-09-13 2022-10-14 武汉精立电子技术有限公司 一种近眼显示模组检测方法及检测系统

Also Published As

Publication number Publication date
CN115183989B (zh) 2023-01-10
CN115183989A (zh) 2022-10-14

Similar Documents

Publication Publication Date Title
US7986875B2 (en) Sound-based focus system and focus method thereof
WO2017088469A1 (zh) 一种基于机械手臂的高精度自动光学检测系统和方法
WO2024055662A1 (zh) 一种近眼显示模组检测方法及检测系统
JP6368766B2 (ja) 対象物の自動アライメント方法及びその自動アライメント検出装置
US20090067701A1 (en) System and method for detecting blemishes on surface of object
JPH05264221A (ja) 半導体露光装置用マーク位置検出装置及びこれを用いた半導体露光装置用位置合わせ装置
WO2022027896A1 (zh) 振镜的参数调节方法、装置、设备及可读存储介质
CN110261069B (zh) 一种用于光学镜头的检测方法
US6760096B2 (en) Lens-evaluating method and lens-evaluating apparatus
KR20180015139A (ko) 투명 기판의 내부 결함을 검출하기 위한 광학 디바이스 및 이를 위한 방법
TW201741723A (zh) 影像景深測量方法以及應用該方法的影像擷取裝置
JP2010243212A (ja) 傾斜検出方法、および傾斜検出装置
KR101341632B1 (ko) 줌 카메라의 광축 오차 보상 시스템, 그 보상 방법
CN114152413A (zh) 一种激光显示中动态散斑的测试方法及其测试装置
CN115685576A (zh) 一种镜头调芯方法和设备
TWI459063B (zh) 多重表面對焦系統及方法
CN114746716B (zh) 形状复原方法和图像测量装置
JP2005024618A (ja) 傾斜角度測定装置を有するプロジェクタ。
JP2938126B2 (ja) カラーフィルタの表面検査装置
JP6196148B2 (ja) デフォーカス制御装置およびデフォーカス制御方法
KR101050711B1 (ko) 다초점 이미지의 광학식 자동 초점 조절 방법
JP2018196068A (ja) ゴースト検出装置及びそれを有する撮像装置
KR100505219B1 (ko) 광학계를 이용한 검사장치
JP4893938B2 (ja) 欠陥検査装置
JP2003344298A (ja) 撮像手段及びそれを用いたワークの傷検査装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23864401

Country of ref document: EP

Kind code of ref document: A1