WO2020228512A1 - Floating display imaging device and floating display touch method - Google Patents

Floating display imaging device and floating display touch method

Info

Publication number
WO2020228512A1
WO2020228512A1 (PCT/CN2020/086720)
Authority
WO
WIPO (PCT)
Prior art keywords
image
floating
finger
area
display
Prior art date
Application number
PCT/CN2020/086720
Other languages
English (en)
French (fr)
Inventor
石炳川 (Shi Bingchuan)
Original Assignee
BOE Technology Group Co., Ltd. (京东方科技集团股份有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BOE Technology Group Co., Ltd.
Publication of WO2020228512A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09F DISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F9/00 Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements

Definitions

  • the present disclosure relates to the technical field of manufacturing floating display products, and in particular to a floating display imaging device and a floating display touch method.
  • although floating display technology detaches the displayed image from the physical screen, it is essentially still a two-dimensional display, and users still perceive and interact with it using two-dimensional logic.
  • imaging-based interaction technology needs to capture the user's hand operations and movements accurately in three-dimensional space, which remains technically difficult at present; in particular, when the motion range is small, recognition failure easily occurs.
  • a floating display imaging device including a display panel, and an air imaging panel arranged at an angle to the display panel, wherein the light beam emitted by the display panel forms a floating image through the air imaging panel,
  • and the mirror image of the display panel relative to the air imaging panel is a floating image plane, and further includes:
  • Infrared illumination structure used to provide infrared light to the floating imaging area
  • a reflective infrared filter which is arranged between the display panel and the air imaging board, and is used to reflect the infrared light reflected by the finger when the finger is in the floating image area;
  • the infrared sensor panel is located on the side of the reflective infrared filter away from the display panel, and is used to receive the infrared light reflected by the reflective infrared filter and to emit an image signal when a light spot with a preset brightness distribution is detected;
  • the image processing structure is used to obtain a finger touch position according to the image signal, and trigger a corresponding touch operation according to the touch information including the touch position.
  • the infrared sensor panel and the display panel are arranged in mirror images with respect to the reflective infrared filter.
  • the area of the infrared sensor panel is greater than or equal to the area of the display area of the display panel.
  • the area of the infrared sensor panel is not less than the area of the display area of the display panel.
  • the finger touch position includes: the finger does not reach the floating image plane, the finger reaches the floating image plane, and the finger passes through the floating image plane.
  • the infrared sensor panel includes a substrate, and a photodetector array located on the substrate; each of the photodetectors includes a thin film transistor and a photodiode, and the photodiode is used to convert the infrared light incident on the substrate into a photocurrent.
  • the included angle between the air imaging panel and the display panel is 40-50 degrees.
  • the image processing structure includes a touch position acquisition unit, and the touch position acquisition unit includes:
  • the first acquisition mode subunit is used to acquire, according to the image signal, the image information of the light spots distributed on the infrared sensor panel when the finger is in the floating image area, and to determine the touch position according to the position of the high-brightness area in the image information; and/or,
  • the second acquisition mode subunit is used to acquire, according to the image signal, the image information of the light spots distributed on the infrared sensor panel when the finger is in the floating image area, and to analyze the high-brightness area image information in the image information to determine whether the fingertip is at the in-focus position.
  • the second acquisition mode subunit includes:
  • the first processing part is used to obtain the image gradient distribution of the image information of the high-brightness area
  • the second processing part is used for judging whether the finger end of the finger is in the in-focus position according to the image gradient distribution and using a convolutional neural network, and determines the touch position when the finger end of the finger is in the in-focus position.
  • the focus position is a position on the floating image plane that focuses the light spots distributed on the infrared sensor panel.
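The gradient-based focus check in the second acquisition mode can be sketched as follows. This is a minimal illustration, not the patent's implementation: it computes a finite-difference gradient map of the high-brightness patch and uses mean gradient energy as a sharpness score, whereas the device would feed the gradient map to a trained convolutional neural network; all function names are assumptions.

```python
def gradient_map(patch):
    """Squared finite-difference gradient magnitude of a 2D grayscale patch."""
    h, w = len(patch), len(patch[0])
    grad = [[0.0] * w for _ in range(h)]
    for y in range(h - 1):
        for x in range(w - 1):
            gx = patch[y][x + 1] - patch[y][x]   # horizontal difference
            gy = patch[y + 1][x] - patch[y][x]   # vertical difference
            grad[y][x] = gx * gx + gy * gy
    return grad

def sharpness(patch):
    """Mean gradient energy: high for an in-focus (sharp-edged) spot,
    low for a defocused (diffuse) one."""
    g = gradient_map(patch)
    return sum(sum(row) for row in g) / (len(g) * len(g[0]))

# A sharp edge concentrates the intensity step; a defocused edge spreads it out.
sharp_patch = [[0, 0, 255, 255]] * 4      # abrupt edge: in focus
blurred_patch = [[0, 85, 170, 255]] * 4   # gradual ramp: defocused
assert sharpness(sharp_patch) > sharpness(blurred_patch)
```

Squared gradients (rather than absolute differences) are used so that one steep edge scores higher than several shallow ones of the same total height.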
  • the present disclosure also provides a floating display touch method, which is applied to the above floating display imaging device, and includes the following steps:
  • Collect the image information of the light spots distributed on the infrared sensor panel when a finger is in the floating image area
  • Determine whether the processing mode for obtaining the touch position from the image information is the first acquisition mode or the second acquisition mode
  • Trigger a trigger operation corresponding to the touch position
  • Obtaining the touch position in the first acquisition mode includes: acquiring, according to the image signal, the image information of the light spots distributed on the infrared sensor panel when the finger is in the floating image area, and determining the touch position according to the position of the high-brightness area in the image information;
  • Obtaining the touch position in the second acquisition mode includes: acquiring, according to the image signal, the image information of the light spots distributed on the infrared sensor panel when the finger is in the floating image area, and analyzing the high-brightness area image information to determine whether the fingertip is at the in-focus position.
  • acquiring the touch position in the second acquisition mode specifically includes: obtaining the high-brightness area image information in the image information; obtaining the image gradient distribution of the high-brightness area image information; analyzing the image gradient distribution with a convolutional neural network to judge whether the fingertip is at the in-focus position;
  • when the fingertip is at the in-focus position, the touch position is determined.
  • before the step of analyzing the image gradient distribution with a convolutional neural network to judge whether the fingertip is at the in-focus position, the method further includes: converting the pixel resolution of the high-brightness area image information into a pixel resolution that matches the convolutional neural network.
  • the beneficial effects of the present disclosure are: reducing the difficulty of floating display imaging interaction, improving the recognition sensitivity of touch position, and avoiding the problem of recognition failure when the motion range is small.
  • FIG. 1 shows a schematic diagram of the structure of a floating display imaging device in an embodiment of the present disclosure
  • Figure 2 shows a schematic diagram of a part of the infrared sensor panel in an embodiment of the present disclosure
  • FIG. 3 shows a schematic diagram of an image when a finger is hovering and touching in the first state in an embodiment of the present disclosure
  • FIG. 4 shows a schematic diagram of an image when a finger is hovering and touching in a second state in an embodiment of the present disclosure
  • FIG. 5 shows a schematic diagram of an image when a finger is hovering and touching in a third state in an embodiment of the present disclosure
  • FIG. 6 shows a schematic flow chart of the floating display touch method according to an embodiment of the present disclosure
  • FIG. 7 shows a schematic diagram of a process of acquiring a touch position in the second acquisition mode in an embodiment of the present disclosure.
  • in the related art, interaction with a floating image captures the user's hand operations and movements through three-dimensional modeling, and a small motion range easily leads to recognition failure.
  • this embodiment provides a floating display imaging device, which includes a display panel and an air imaging panel arranged at an angle to the display panel.
  • the light beam emitted by the display panel forms a floating image through the air imaging panel.
  • the mirror image of the display panel relative to the air imaging plate is a floating image plane, and further includes:
  • Infrared lighting structure used to provide infrared light to the floating image area
  • a reflective infrared filter which is arranged between the display panel and the air imaging board, and is used to reflect the infrared light reflected by the finger when the finger is in the floating image area;
  • the infrared sensor panel is located on the side of the reflective infrared filter away from the display panel, and is used to receive the infrared light reflected by the reflective infrared filter and to emit an image signal when a light spot with a preset brightness distribution is detected;
  • the image processing structure is used to obtain the finger touch position according to the image signal, and trigger a corresponding touch operation according to the touch information including the touch position.
  • the air imaging panel is placed in front of the display panel.
  • the light beams emitted by the pixels of the display panel are deflected by the air imaging panel and then converge again to form a floating image at the mirror position of the display panel relative to the air imaging panel.
  • the infrared illumination structure provides infrared light illumination to the floating imaging area.
  • the infrared light beam reflected by the finger enters the air imaging panel and is reflected by the reflective infrared filter to image on the infrared sensor panel.
  • the infrared sensor panel can detect the bright spot and send the image signal to the image processing structure for processing, so the touch position can be acquired accurately and the recognition sensitivity of the touch position is improved.
  • the reflective infrared filter completely transmits the visible imaging beam emitted by the display panel without affecting the display effect, and completely reflects the infrared beam returned from the fingertip, ensuring that the optical touch signal can reach the infrared sensor panel; the area of the reflective infrared filter is larger than that of the display panel and is sufficient to cover the aperture of the imaging beam, ensuring that the infrared light reflected by the finger is completely reflected by the reflective infrared filter.
  • the preset brightness distribution can be set according to actual needs.
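Since the preset brightness distribution is left to the implementer, one simple realization is to require a minimum number of pixels at or above a brightness threshold before reporting a spot. The criterion below (threshold plus pixel count, returning the spot centroid) is a hypothetical stand-in for whatever distribution a product actually presets:

```python
def detect_spot(frame, min_brightness=200, min_pixels=3):
    """Return the centroid (x, y) of the bright spot if the frame contains
    at least `min_pixels` pixels at or above `min_brightness`, else None.
    Both parameters are illustrative placeholders for the preset distribution."""
    hits = [(x, y) for y, row in enumerate(frame)
            for x, v in enumerate(row) if v >= min_brightness]
    if len(hits) < min_pixels:
        return None        # below the preset distribution: no image signal
    cx = sum(x for x, _ in hits) / len(hits)
    cy = sum(y for _, y in hits) / len(hits)
    return (cx, cy)

dark = [[10, 20], [30, 40]]
spot = [[0, 0, 0, 0],
        [0, 220, 230, 0],
        [0, 240, 250, 0],
        [0, 0, 0, 0]]
assert detect_spot(dark) is None
assert detect_spot(spot) == (1.5, 1.5)
```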
  • the display panel and the floating image area are mirror images with respect to the air imaging panel; since the floating imaging area is coplanar with the touch plane, the display panel and the touch plane are also mirror images, and in this embodiment the infrared sensor panel and the display panel are arranged as mirror images with respect to the reflective infrared filter.
  • the area of the infrared sensor panel is greater than or equal to the area of the display area of the display panel to ensure that the display area of the display panel is fully covered.
  • the area of the reflective infrared filter is larger than the area of the display area of the display panel.
  • the area of the infrared filter is larger than the display panel and is sufficient to cover the aperture of the imaging beam.
  • the infrared sensor panel includes a substrate and a photodetector array on the substrate, and each photodetector includes a thin film transistor and a photodiode that converts the infrared light incident on the substrate into a photocurrent, as shown in FIG. 2.
  • the photodiode is responsible for converting the collected infrared light into a photocurrent
  • the thin film transistor is a switch for the control of the photocurrent.
  • the drive circuit controls the turn-on and turn-off of the thin film transistor through the gate line.
  • the photocurrent generated by the photodiode connected in series with it is transmitted along the data line to the drive circuit and, after processing, converted into the grayscale information of the corresponding pixel.
  • under the timing control of the drive circuit, the image data in the infrared sensor panel can be collected within a certain collection period.
  • the angle between the air imaging panel and the display panel is 40-50 degrees.
  • the display panel is arranged parallel to the horizontal plane, and the air imaging panel is located in front of the display panel.
  • the floating imaging formed by the air imaging panel of the display panel is in a mirror image relationship with the display panel.
  • the angle between the air imaging panel and the display panel is 40-50 degrees, and the floating imaging is located just in front of the observer, which is convenient for the observer to watch.
  • the infrared illuminating structure is located below the air imaging board (refer to the direction shown in the figure), and the infrared illuminating structure is located at the lower left of the floating imaging area, which effectively provides illumination to the floating imaging area.
  • the image processing structure includes a touch position acquisition unit, and the touch position acquisition unit includes:
  • the first acquisition mode subunit is used to acquire the image information of the light spots distributed on the infrared sensor panel when the finger is in the floating image area according to the image signal, and determine the image information according to the position of the high-brightness area in the image information The touch position; and/or,
  • the second acquisition mode subunit is used to acquire the image information of the light spots distributed on the infrared sensor panel when the finger is in the floating image area according to the image signal, and analyze the image information of the high-brightness area in the image information To determine whether the finger tip is in focus.
  • Touch operation can be set to two types according to application scenarios: normal mode and fine mode.
  • the normal mode is performed by the first acquisition mode subunit
  • the fine mode is performed by the second acquisition mode subunit.
  • in the normal mode, the image processing structure determines the touch position according to the maximum brightness of the captured image, that is, according to the position of the high-brightness area in the image information. Since the floating touch state has no physical constraints and may involve visual deviation, it is not easy to position the finger accurately on the floating image surface; the fingertip may be somewhat defocused while the finger pad happens to be at the in-focus position, as shown in Figure 5. The image brightness at the finger pad is then the greatest, so the image processing structure would take the finger pad position as the user's touch position, causing a false touch.
  • to improve touch accuracy, the fine mode further processes the captured image to determine whether the touch point is the fingertip, that is, it analyzes the high-brightness area image information in the image information to determine whether the fingertip is at the in-focus position.
  • in practical applications, depending on the application scenario, the touch position acquisition unit may include only the first acquisition mode subunit, only the second acquisition mode subunit, or both, with automatic switching between the first acquisition mode subunit and the second acquisition mode subunit realized by a switching unit.
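The two modes can be contrasted in a small dispatch sketch. In normal mode the brightest pixel position is reported directly; in fine mode the same position is accepted only when the fingertip region is judged in focus (the boolean `in_focus` stands in for the convolutional network's verdict). Names and signatures are illustrative, not from the patent:

```python
def locate_touch(image, mode="normal", in_focus=False):
    """Normal mode: the touch position is the brightest pixel.
    Fine mode: the same position, accepted only when the fingertip
    is judged to be at the in-focus position."""
    brightness, x, y = max((v, x, y) for y, row in enumerate(image)
                           for x, v in enumerate(row))
    if mode == "fine" and not in_focus:
        return None                      # defocused fingertip: no touch event
    return (x, y)

image = [[0, 10], [90, 5]]               # brightest pixel at column 0, row 1
assert locate_touch(image, mode="normal") == (0, 1)
assert locate_touch(image, mode="fine", in_focus=False) is None
assert locate_touch(image, mode="fine", in_focus=True) == (0, 1)
```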
  • the focus position is a position on the floating image plane that focuses the light spots distributed on the infrared sensor panel.
  • the second acquisition mode subunit includes:
  • the first processing part is used to obtain the image gradient distribution of the image information of the high-brightness area
  • the second processing part is used for judging whether the finger tip of the finger is in the in-focus position according to the image gradient distribution and using a convolutional neural network, and determining the touch position when the finger tip is in the in-focus position.
  • in the first state, the finger has not reached the floating image plane (floating imaging area); the finger is completely out of focus and the image spot is diffuse, as shown in Figure 3;
  • in the second state, the finger is exactly on the floating image plane; the fingertip is in focus and sharply imaged, while the finger pad behind it gradually defocuses and its spot is diffuse, as shown in Figure 4;
  • in the third state, the finger passes through the floating image plane; only the middle part of the finger pad is in focus, while the fingertip and the finger heel are defocused, and the overall light spot is diffuse, as shown in Figure 5.
  • the defocus and focus of the image directly determine the gradient distribution of the image. Therefore, by solving the gradient of the image, it can be judged whether the finger is in the focus position.
  • to enhance the robustness of the system, this embodiment uses a convolutional neural network trained on the gradient distribution maps of the images.
  • to reduce the data processing load of training and inference, only the high-brightness region of the image is cropped for subsequent processing; that is, the object analyzed by the convolutional neural network is the high-brightness area image information, not the complete image information received by the infrared sensor panel.
  • in the system calibration stage, a large number of images are acquired at the in-focus and out-of-focus finger positions through extensive touch tests, and whether each image is in focus is manually annotated, forming the training data for the convolutional neural network.
  • the neural network model obtained by the aforementioned training is used to determine the focus and defocus state of the finger.
  • while the display system is operating, the image processing structure recognizes each captured frame, and the corresponding operation is triggered only when the fingertip is at the in-focus position.
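Per-frame recognition can be sketched as a loop that fires a touch event only for frames the classifier accepts; `classify` stands in for the trained network, and the brightness-threshold classifier used in the example is purely illustrative:

```python
def process_frames(frames, classify):
    """Run recognition on every captured frame; emit a (frame_index, x, y)
    touch event only when `classify` judges the fingertip in focus."""
    events = []
    for t, frame in enumerate(frames):
        if classify(frame):              # stand-in for the CNN verdict
            _, x, y = max((v, x, y) for y, row in enumerate(frame)
                          for x, v in enumerate(row))
            events.append((t, x, y))
    return events

frames = [[[0, 90], [10, 5]],            # dim spot: treated as out of focus
          [[0, 0], [0, 200]]]            # bright spot: treated as in focus
in_focus = lambda f: max(max(row) for row in f) >= 150   # toy classifier
assert process_frames(frames, in_focus) == [(1, 1, 1)]
```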
  • This embodiment also provides a floating display touch method, which is applied to the above floating display imaging device, as shown in FIG. 6, including the following steps:
  • Collect the image information of the light spots distributed on the infrared sensor panel when a finger is in the floating image area
  • Determine whether the processing mode for obtaining the touch position from the image information is the first acquisition mode or the second acquisition mode
  • Trigger a trigger operation corresponding to the touch position
  • Obtaining the touch position in the first acquisition mode includes: acquiring, according to the image signal, the image information of the light spots distributed on the infrared sensor panel when the finger is in the floating image area, and determining the touch position according to the position of the high-brightness area in the image information;
  • Obtaining the touch position in the second acquisition mode includes: acquiring, according to the image signal, the image information of the light spots distributed on the infrared sensor panel when the finger is in the floating image area, and analyzing the high-brightness area image information to determine whether the fingertip is at the in-focus position.
  • the infrared illumination structure provides infrared light illumination to the floating imaging area.
  • the infrared light beam reflected by the finger enters the air imaging panel and is reflected by the reflective infrared filter to image on the infrared sensor panel.
  • the infrared sensor panel can detect the bright spot and send the image signal to the image processing structure for processing. It can accurately acquire the touch position and improve the sensitivity of the touch position recognition.
  • as shown in FIG. 7, acquiring the touch position in the second acquisition mode specifically includes: obtaining the high-brightness area image information in the image information; obtaining the image gradient distribution of the high-brightness area image information; analyzing the image gradient distribution with a convolutional neural network to judge whether the fingertip is at the in-focus position;
  • when the fingertip is at the in-focus position, the touch position is determined.
  • before the step of analyzing the image gradient distribution with a convolutional neural network to judge whether the fingertip is at the in-focus position, the method further includes: converting the pixel resolution of the high-brightness area image information into a pixel resolution that matches the convolutional neural network.
  • analyzing the image gradient distribution with a convolutional neural network can improve the accuracy of the touch position determination.
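The resolution-matching step before the network can be done with a simple nearest-neighbor resample; the target resolution is whatever the network's input layer expects, and both the method and the sizes below are assumptions for illustration:

```python
def resize_nearest(patch, out_h, out_w):
    """Nearest-neighbor resample of a 2D patch to (out_h, out_w), the fixed
    pixel resolution assumed to match the convolutional neural network."""
    in_h, in_w = len(patch), len(patch[0])
    return [[patch[y * in_h // out_h][x * in_w // out_w]
             for x in range(out_w)]
            for y in range(out_h)]

crop = [[1, 2],
        [3, 4]]
assert resize_nearest(crop, 4, 4) == [[1, 1, 2, 2],
                                      [1, 1, 2, 2],
                                      [3, 3, 4, 4],
                                      [3, 3, 4, 4]]
```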

Abstract

The present disclosure relates to a floating display imaging device, including a display panel and an air imaging panel arranged at an angle to the display panel, where the light beam emitted by the display panel forms a floating image through the air imaging panel, and the mirror image of the display panel relative to the air imaging panel is the floating image plane. The device further includes: an infrared illumination structure, used to provide infrared light to the floating image area; a reflective infrared filter, arranged between the display panel and the air imaging panel, used to reflect the infrared light reflected by a finger when the finger is in the floating image area; an infrared sensor panel, located on the side of the reflective infrared filter away from the display panel, used to receive the infrared light reflected by the reflective infrared filter and to emit an image signal when a light spot with a preset brightness distribution is detected; and an image processing structure, used to obtain a finger touch position according to the image signal and to trigger a corresponding touch operation according to touch information including the touch position.

Description

Floating display imaging device and floating display touch method
Cross-reference to related applications
This application claims priority to Chinese Patent Application No. 201910406980.8, filed in China on May 15, 2019, the entire contents of which are incorporated herein by reference.
Technical field
The present disclosure relates to the technical field of manufacturing floating display products, and in particular to a floating display imaging device and a floating display touch method.
Background
Since the display image of a floating display is detached from the physical display device, human-computer interaction also takes place in mid-air. Because the finger cannot contact a physical sensor, conventional capacitive or resistive touch technologies are no longer effective. Solutions in the related art all adopt imaging-based interaction technologies, such as gesture recognition based on depth images.
Although floating display technology detaches the displayed image from the physical screen, it is essentially still a two-dimensional display, and users still perceive and interact with it using two-dimensional logic. Imaging-based interaction technology needs to capture the user's hand operations and movements accurately in three-dimensional space, which remains technically difficult at present; in particular, small motions easily lead to recognition failure.
Summary
To achieve the above object, the technical solution adopted by the present disclosure is a floating display imaging device, including a display panel and an air imaging panel arranged at an angle to the display panel, where the light beam emitted by the display panel forms a floating image through the air imaging panel, and the mirror image of the display panel relative to the air imaging panel is the floating image plane, further including:
an infrared illumination structure, used to provide infrared light to the floating imaging area;
a reflective infrared filter, arranged between the display panel and the air imaging panel, used to reflect the infrared light reflected by a finger when the finger is in the floating image area;
an infrared sensor panel, located on the side of the reflective infrared filter away from the display panel, used to receive the infrared light reflected by the reflective infrared filter and to emit an image signal when a light spot with a preset brightness distribution is detected;
an image processing structure, used to obtain a finger touch position according to the image signal and to trigger a corresponding touch operation according to touch information including the touch position.
Optionally, the infrared sensor panel and the display panel are arranged as mirror images with respect to the reflective infrared filter.
Optionally, the area of the infrared sensor panel is greater than or equal to the area of the display area of the display panel.
Optionally, the area of the infrared sensor panel is not less than the area of the display area of the display panel.
Optionally, the finger touch position includes: the finger has not reached the floating image plane, the finger has reached the floating image plane, and the finger has passed through the floating image plane.
Optionally, the infrared sensor panel includes a substrate and a photodetector array on the substrate; each photodetector includes a thin film transistor and a photodiode, and the photodiode is used to convert the infrared light incident on the substrate into a photocurrent.
Optionally, the included angle between the air imaging panel and the display panel is 40-50 degrees.
Optionally, the image processing structure includes a touch position acquisition unit, and the touch position acquisition unit includes:
a first acquisition mode subunit, used to acquire, according to the image signal, the image information of the light spots distributed on the infrared sensor panel when the finger is in the floating image area, and to determine the touch position according to the position of the high-brightness area in the image information; and/or,
a second acquisition mode subunit, used to acquire, according to the image signal, the image information of the light spots distributed on the infrared sensor panel when the finger is in the floating image area, and to analyze the high-brightness area image information in the image information to determine whether the fingertip is at the in-focus position.
Optionally, the second acquisition mode subunit includes:
a first processing part, used to obtain the image gradient distribution of the high-brightness area image information;
a second processing part, used to judge, according to the image gradient distribution and using a convolutional neural network, whether the fingertip is at the in-focus position, and to determine the touch position when the fingertip is at the in-focus position.
Optionally, the in-focus position is a position on the floating image plane at which the light spots distributed on the infrared sensor panel are focused.
The present disclosure also provides a floating display touch method, applied to the above floating display imaging device, including the following steps:
collecting the image information of the light spots distributed on the infrared sensor panel when a finger is in the floating image area;
determining whether the processing mode for obtaining the touch position according to the image information is a first acquisition mode or a second acquisition mode;
triggering a trigger operation corresponding to the touch position;
where,
obtaining the touch position in the first acquisition mode includes: acquiring, according to the image signal, the image information of the light spots distributed on the infrared sensor panel when the finger is in the floating image area, and determining the touch position according to the position of the high-brightness area in the image information;
obtaining the touch position in the second acquisition mode includes: acquiring, according to the image signal, the image information of the light spots distributed on the infrared sensor panel when the finger is in the floating image area, and analyzing the high-brightness area image information in the image information to determine whether the fingertip is at the in-focus position.
Optionally, obtaining the touch position in the second acquisition mode specifically includes:
obtaining the high-brightness area image information in the image information;
obtaining the image gradient distribution of the high-brightness area image information;
analyzing the image gradient distribution with a convolutional neural network to judge whether the fingertip is at the in-focus position;
determining the touch position when the fingertip is at the in-focus position.
Optionally, before the step of analyzing the image gradient distribution with a convolutional neural network to judge whether the fingertip is at the in-focus position, the method further includes:
converting the pixel resolution of the high-brightness area image information into a pixel resolution that matches the convolutional neural network.
The beneficial effects of the present disclosure are: reducing the difficulty of floating display imaging interaction, improving the recognition sensitivity of the touch position, and avoiding recognition failure when the motion range is small.
Brief description of the drawings
FIG. 1 is a schematic structural diagram of a floating display imaging device in an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of part of the structure of the infrared sensor panel in an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of the image in the first state of floating finger touch in an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of the image in the second state of floating finger touch in an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of the image in the third state of floating finger touch in an embodiment of the present disclosure;
FIG. 6 is a schematic flowchart of the floating display touch method according to an embodiment of the present disclosure;
FIG. 7 is a schematic flowchart of acquiring the touch position in the second acquisition mode in an embodiment of the present disclosure.
Detailed description
To make the objects, technical solutions and advantages of the embodiments of the present disclosure clearer, the technical solutions of the embodiments of the present disclosure are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present disclosure. All other embodiments obtained by those of ordinary skill in the art based on the described embodiments of the present disclosure fall within the protection scope of the present disclosure.
In the related art, interaction with a floating image captures the user's hand operations and movements through three-dimensional modeling, and small motions easily lead to recognition failure.
In view of the above technical problem, this embodiment provides a floating display imaging device, including a display panel and an air imaging panel arranged at an angle to the display panel, where the light beam emitted by the display panel forms a floating image through the air imaging panel, and the mirror image of the display panel relative to the air imaging panel is the floating image plane, further including:
an infrared illumination structure, used to provide infrared light to the floating image area;
a reflective infrared filter, arranged between the display panel and the air imaging panel, used to reflect the infrared light reflected by a finger when the finger is in the floating image area;
an infrared sensor panel, located on the side of the reflective infrared filter away from the display panel, used to receive the infrared light reflected by the reflective infrared filter and to emit an image signal when a light spot with a preset brightness distribution is detected;
an image processing structure, used to obtain a finger touch position according to the image signal and to trigger a corresponding touch operation according to touch information including the touch position.
As shown in FIG. 1, the air imaging panel is placed in front of the display panel. The light beams emitted by the pixels of the display panel are deflected by the air imaging panel and then converge again, forming a floating image at the mirror position of the display panel relative to the air imaging panel. The infrared illumination structure provides infrared illumination to the floating imaging area. When a finger enters the floating image area, the infrared beam reflected by the finger enters the air imaging panel, is reflected by the reflective infrared filter, and is imaged on the infrared sensor panel. The infrared sensor panel detects the bright spot and sends the image signal to the image processing structure for processing, so the touch position can be acquired accurately and the recognition sensitivity of the touch position is improved.
It should be noted that in this embodiment the reflective infrared filter completely transmits the visible imaging beam emitted by the display panel without affecting the display effect, and completely reflects the infrared beam returned from the fingertip, ensuring that the optical touch signal can reach the infrared sensor panel. The area of the reflective infrared filter is larger than that of the display panel and is sufficient to cover the aperture of the imaging beam, ensuring that the infrared light reflected by the finger is completely reflected by the reflective infrared filter.
It should be noted that the preset brightness distribution can be set according to actual needs.
According to the imaging characteristics of the air imaging panel, the display panel and the floating image area are mirror images with respect to the air imaging panel; since the floating imaging area is coplanar with the touch plane, the display panel and the touch plane are also mirror images, and in this embodiment the infrared sensor panel and the display panel are arranged as mirror images with respect to the reflective infrared filter.
In this embodiment, the area of the infrared sensor panel is greater than or equal to the area of the display area of the display panel, to ensure that the display area of the display panel is fully covered.
In this embodiment, the area of the reflective infrared filter is larger than the area of the display area of the display panel; the infrared filter is larger than the display panel and is sufficient to cover the aperture of the imaging beam.
The infrared sensor panel can take various specific structural forms. In this embodiment, the infrared sensor panel includes a substrate and a photodetector array on the substrate; each photodetector includes a thin film transistor and a photodiode that converts the infrared light incident on the substrate into a photocurrent, as shown in FIG. 2. The photodiode converts the collected infrared light into a photocurrent, and the thin film transistor serves as a switch controlling the photocurrent. The drive circuit controls the turn-on and turn-off of the thin film transistor through the gate line; when the thin film transistor is on, the photocurrent generated by the photodiode connected in series with it is transmitted along the data line to the drive circuit and, after processing, converted into the grayscale information of the corresponding pixel. Under the timing control of the drive circuit, the image data in the infrared sensor panel can be collected within a certain collection period.
In this embodiment, the included angle between the air imaging panel and the display panel is 40-50 degrees. As shown in FIG. 1, the display panel is arranged parallel to the horizontal plane and the air imaging panel is located in front of the display panel. According to the characteristics of the air imaging panel, the floating image formed by the display panel through the air imaging panel is a mirror image of the display panel; with the included angle between the air imaging panel and the display panel at 40-50 degrees, the floating image is located right in front of the observer, which is convenient to view.
In this embodiment, the infrared illumination structure is located below the air imaging panel (with reference to the orientation shown in the figure), at the lower left of the floating imaging area, effectively illuminating the floating imaging area.
In this embodiment, the image processing structure includes a touch position acquisition unit, and the touch position acquisition unit includes:
a first acquisition mode subunit, used to acquire, according to the image signal, the image information of the light spots distributed on the infrared sensor panel when the finger is in the floating image area, and to determine the touch position according to the position of the high-brightness area in the image information; and/or,
a second acquisition mode subunit, used to acquire, according to the image signal, the image information of the light spots distributed on the infrared sensor panel when the finger is in the floating image area, and to analyze the high-brightness area image information in the image information to determine whether the fingertip is at the in-focus position.
Touch operation can be set to two types according to the application scenario: a normal mode and a fine mode. The normal mode is performed by the first acquisition mode subunit and the fine mode by the second acquisition mode subunit. In the normal mode, the image processing structure determines the touch position according to the maximum brightness of the captured image, that is, according to the position of the high-brightness area in the image information. Since the floating touch state has no physical constraints and may involve visual deviation, it is not easy to position the finger accurately on the floating image surface; the fingertip may be somewhat defocused while the finger pad happens to be at the in-focus position, as shown in FIG. 5. The image brightness at the finger pad is then the greatest, so the image processing structure would take the finger pad position as the user's touch position, causing a false touch. To improve touch accuracy, the fine mode further processes the captured image to determine whether the touch point is the fingertip, that is, it analyzes the high-brightness area image information in the image information to determine whether the fingertip is at the in-focus position.
It should be noted that in practical applications, depending on the application scenario, the touch position acquisition unit may include only the first acquisition mode subunit, only the second acquisition mode subunit, or both, with automatic switching between the first acquisition mode subunit and the second acquisition mode subunit realized by a switching unit.
It should be noted that the in-focus position is a position on the floating image plane at which the light spots distributed on the infrared sensor panel are focused.
In this embodiment, the second acquisition mode subunit includes:
a first processing part, used to obtain the image gradient distribution of the high-brightness area image information;
a second processing part, used to judge, according to the image gradient distribution and using a convolutional neural network, whether the fingertip is at the in-focus position, and to determine the touch position when the fingertip is at the in-focus position.
When a finger touches the floating image there are three states, and the images reaching the surface of the infrared sensor panel differ accordingly.
First state: the finger has not reached the floating image plane (floating imaging area); the finger is completely out of focus and the image spot is diffuse, as shown in FIG. 3;
Second state: the finger is exactly on the floating image plane; the fingertip is in focus and sharply imaged, while the finger pad behind it gradually defocuses and its spot is diffuse, as shown in FIG. 4;
Third state: the finger passes through the floating image plane; only the middle part of the finger pad is in focus, while the fingertip and the finger heel are defocused, and the overall light spot is diffuse, as shown in FIG. 5.
Defocus and focus directly determine the gradient distribution of the image; therefore, by solving for the gradient of the image, it can be judged whether the finger is at the in-focus position.
To enhance the robustness of the system, this embodiment uses a convolutional neural network trained on the gradient distribution maps of the images. To reduce the data processing load of training and inference, only the high-brightness region of the image is cropped for subsequent processing; that is, the object analyzed and judged by the convolutional neural network is the high-brightness area image information, not the complete image information received by the infrared sensor panel. In the system calibration stage, a large number of images are acquired at the in-focus and out-of-focus finger positions through extensive touch tests, and whether each image is in focus is manually annotated, forming the training data for the convolutional neural network. In practical applications, the neural network model obtained by the above training is used to judge the in-focus and out-of-focus states of the finger. While the display system is operating, the image processing structure recognizes each captured frame, and the corresponding operation is triggered only when the fingertip is at the in-focus position.
This embodiment also provides a floating display touch method, applied to the above floating display imaging device, as shown in FIG. 6, including the following steps:
collecting the image information of the light spots distributed on the infrared sensor panel when a finger is in the floating image area;
determining whether the processing mode for obtaining the touch position according to the image information is the first acquisition mode or the second acquisition mode;
triggering a trigger operation corresponding to the touch position;
where,
obtaining the touch position in the first acquisition mode includes: acquiring, according to the image signal, the image information of the light spots distributed on the infrared sensor panel when the finger is in the floating image area, and determining the touch position according to the position of the high-brightness area in the image information;
obtaining the touch position in the second acquisition mode includes: acquiring, according to the image signal, the image information of the light spots distributed on the infrared sensor panel when the finger is in the floating image area, and analyzing the high-brightness area image information in the image information to determine whether the fingertip is at the in-focus position.
The infrared illumination structure provides infrared illumination to the floating imaging area. When a finger enters the floating image area, the infrared beam reflected by the finger enters the air imaging panel, is reflected by the reflective infrared filter, and is imaged on the infrared sensor panel. The infrared sensor panel detects the bright spot and sends the image signal to the image processing structure for processing, so the touch position can be acquired accurately and the recognition sensitivity of the touch position is improved.
In this embodiment, as shown in FIG. 7, obtaining the touch position in the second acquisition mode specifically includes:
obtaining the high-brightness area image information in the image information;
obtaining the image gradient distribution of the high-brightness area image information;
analyzing the image gradient distribution with a convolutional neural network to judge whether the fingertip is at the in-focus position;
determining the touch position when the fingertip is at the in-focus position.
In this embodiment, before the step of analyzing the image gradient distribution with a convolutional neural network to judge whether the fingertip is at the in-focus position, the method further includes:
converting the pixel resolution of the high-brightness area image information into a pixel resolution that matches the convolutional neural network.
Analyzing the image gradient distribution with a convolutional neural network can improve the accuracy of the touch position determination.
The above are preferred embodiments of the present disclosure. It should be noted that those of ordinary skill in the art can make several improvements and refinements without departing from the principles described in the present disclosure, and these improvements and refinements shall also fall within the protection scope of the present disclosure.

Claims (13)

  1. A floating display imaging device, comprising a display panel and an air imaging panel arranged at an angle to the display panel, wherein a light beam emitted by the display panel forms a floating image through the air imaging panel, and a mirror image of the display panel relative to the air imaging panel is a floating image plane, further comprising:
    an infrared illumination structure, configured to provide infrared light to the floating image area;
    a reflective infrared filter, arranged between the display panel and the air imaging panel, configured to reflect infrared light reflected by a finger when the finger is in the floating image area;
    an infrared sensor panel, located on a side of the reflective infrared filter away from the display panel, configured to receive the infrared light reflected by the reflective infrared filter and to emit an image signal when a light spot with a preset brightness distribution is detected;
    an image processing structure, configured to obtain a finger touch position according to the image signal and to trigger a corresponding touch operation according to touch information including the touch position.
  2. The floating display imaging device according to claim 1, wherein the infrared sensor panel and the display panel are arranged as mirror images with respect to the reflective infrared filter.
  3. The floating display imaging device according to claim 1, wherein an area of the infrared sensor panel is greater than or equal to an area of a display area of the display panel.
  4. The floating display imaging device according to claim 1, wherein an area of the reflective infrared filter is larger than the area of the display area of the display panel.
  5. The floating display imaging device according to claim 1, wherein the finger touch position includes: the finger has not reached the floating image plane, the finger has reached the floating image plane, and the finger has passed through the floating image plane.
  6. The floating display imaging device according to claim 1, wherein the infrared sensor panel includes a substrate and a photodetector array on the substrate, each photodetector includes a thin film transistor and a photodiode, and the photodiode is configured to convert infrared light incident on the substrate into a photocurrent.
  7. The floating display imaging device according to claim 1, wherein an included angle between the air imaging panel and the display panel is 40-50 degrees.
  8. The floating display imaging device according to claim 1, wherein the image processing structure includes a touch position acquisition unit, and the touch position acquisition unit includes:
    a first acquisition mode subunit, configured to acquire, according to the image signal, image information of light spots distributed on the infrared sensor panel when the finger is in the floating image area, and to determine the touch position according to a position of a high-brightness area in the image information; and/or,
    a second acquisition mode subunit, configured to acquire, according to the image signal, image information of light spots distributed on the infrared sensor panel when the finger is in the floating image area, and to analyze the high-brightness area image information in the image information to determine whether a fingertip is at an in-focus position.
  9. The floating display imaging device according to claim 8, wherein the second acquisition mode subunit includes:
    a first processing part, configured to obtain an image gradient distribution of the high-brightness area image information;
    a second processing part, configured to judge, according to the image gradient distribution and using a convolutional neural network, whether the fingertip is at the in-focus position, and to determine the touch position when the fingertip is at the in-focus position.
  10. The floating display imaging device according to claim 9, wherein the in-focus position is a position on the floating image plane at which the light spots distributed on the infrared sensor panel are focused.
  11. A floating display touch method, applied to the floating display imaging device according to any one of claims 1-10, comprising the following steps:
    collecting image information of light spots distributed on the infrared sensor panel when a finger is in the floating image area;
    determining whether a processing mode for obtaining the touch position according to the image information is a first acquisition mode or a second acquisition mode;
    triggering a trigger operation corresponding to the touch position;
    wherein,
    obtaining the touch position in the first acquisition mode includes: acquiring, according to the image signal, the image information of the light spots distributed on the infrared sensor panel when the finger is in the floating image area, and determining the touch position according to the position of the high-brightness area in the image information;
    obtaining the touch position in the second acquisition mode includes: acquiring, according to the image signal, the image information of the light spots distributed on the infrared sensor panel when the finger is in the floating image area, and analyzing the high-brightness area image information in the image information to determine whether the fingertip is at the in-focus position.
  12. The floating display touch method according to claim 11, wherein obtaining the touch position in the second acquisition mode specifically includes:
    obtaining the high-brightness area image information in the image information;
    obtaining the image gradient distribution of the high-brightness area image information;
    analyzing the image gradient distribution with a convolutional neural network to judge whether the fingertip is at the in-focus position;
    determining the touch position when the fingertip is at the in-focus position.
  13. The floating display touch method according to claim 12, wherein before the step of analyzing the image gradient distribution with a convolutional neural network to judge whether the fingertip is at the in-focus position, the method further includes:
    converting the pixel resolution of the high-brightness area image information into a pixel resolution that matches the convolutional neural network.
PCT/CN2020/086720 2019-05-15 2020-04-24 Floating display imaging device and floating display touch method WO2020228512A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910406980.8A CN110119208B (zh) 2019-05-15 2019-05-15 Floating display imaging device and floating display touch method
CN201910406980.8 2019-05-15

Publications (1)

Publication Number Publication Date
WO2020228512A1 true WO2020228512A1 (zh) 2020-11-19

Family

ID=67522571

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/086720 WO2020228512A1 (zh) 2019-05-15 2020-04-24 Floating display imaging device and floating display touch method

Country Status (2)

Country Link
CN (1) CN110119208B (zh)
WO (1) WO2020228512A1 (zh)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110119208B (zh) * 2019-05-15 2021-04-30 BOE Technology Group Co., Ltd. Floating display imaging device and floating display touch method
CN112925444A (zh) * 2021-03-04 2021-06-08 业成科技(成都)有限公司 Touch display
CN114690976A (zh) * 2021-04-22 2022-07-01 广州创知科技有限公司 Interactive operation method and device for a system home interface based on elastic waves
CN114199887A (zh) * 2021-12-13 2022-03-18 苏州华星光电技术有限公司 Curvature appearance inspection device for a display panel
WO2023206351A1 (zh) * 2022-04-29 2023-11-02 深圳盈天下视觉科技有限公司 Underwater imaging device

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150277610A1 (en) * 2014-03-27 2015-10-01 Industry-Academic Cooperation Foundation, Yonsei University Apparatus and method for providing three-dimensional air-touch feedback
CN107300867A (zh) * 2017-06-30 2017-10-27 广东美的制冷设备有限公司 Projection touch control device, household appliance, and control method of household appliance
CN207780717U (zh) * 2018-01-30 2018-08-28 上海永微信息科技有限公司 Air imaging interactive device
CN108664173A (zh) * 2013-11-19 2018-10-16 麦克赛尔株式会社 Projection-type image display device
CN108762660A (zh) * 2018-05-29 2018-11-06 京东方科技集团股份有限公司 Floating display device and method for indicating a touch position on a floating display device
CN208156638U (zh) * 2018-05-29 2018-11-27 衍视电子科技(上海)有限公司 Vehicle-mounted holographic central control and entertainment display device
CN109947302A (zh) * 2019-03-29 2019-06-28 京东方科技集团股份有限公司 Aerial display device and control method thereof
CN110119208A (zh) * 2019-05-15 2019-08-13 京东方科技集团股份有限公司 Floating display imaging device and floating display touch method

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9505664D0 (en) * 1995-03-21 1995-05-10 Central Research Lab Ltd An interactive display and input device
FR2976093B1 (fr) * 2011-06-01 2013-08-16 Thales Sa Touch system with optical emitters and receivers
CN109271029B (zh) * 2011-08-04 2022-08-26 视力移动技术有限公司 Touch-free gesture recognition system, touch-free gesture recognition method, and medium
CN103116422A (zh) * 2013-03-02 2013-05-22 刘昱 Air projection keyboard
CN106324848B (zh) * 2016-10-31 2019-08-23 昆山国显光电有限公司 Display panel and method for realizing floating touch and naked-eye 3D therewith


Also Published As

Publication number Publication date
CN110119208B (zh) 2021-04-30
CN110119208A (zh) 2019-08-13

Similar Documents

Publication Publication Date Title
WO2020228512A1 (zh) Floating display imaging device and floating display touch method
US8165422B2 (en) Method and system for reducing effects of undesired signals in an infrared imaging system
TWI450159B (zh) Optical touch device, passive touch system and its input detection method
US8941620B2 (en) System and method for a virtual multi-touch mouse and stylus apparatus
US7313255B2 (en) System and method for optically detecting a click event
US6791531B1 (en) Device and method for cursor motion control calibration and object selection
US8971565B2 (en) Human interface electronic device
CN101971128B (zh) 屏幕与指针对象之间交互用的交互装置
WO2015192712A1 (zh) Contact image acquisition device, touch screen, fingerprint acquisition device and electronic apparatus
WO2012124730A1 (ja) Detection device, input device, projector, and electronic apparatus
TWI571769B (zh) Non-contact input device and method
TWI394072B (zh) Position detection device for flat panel display and method thereof
CN101950221A (zh) Multi-touch device based on spherical display and multi-touch method
CN102193687A (zh) LabVIEW-based multi-touch screen interaction system
TW201337649A (zh) Optical input device and input detection method thereof
KR101385263B1 (ko) System and method for a virtual keyboard
KR20130136313A (ko) Touch screen system using a touch pen and touch recognition method thereof
KR20010016506A (ko) Optical mouse device
KR101586665B1 (ko) Fluid-type tactile sensor capable of measuring a contact surface
CN105278760B (zh) Optical touch system
JP2017227972A (ja) Projection imaging system and projection imaging method
CN201812278U (zh) Multi-touch device based on spherical display
KR101197284B1 (ko) Touch system and touch recognition method thereof
TWI782534B (zh) Image acquisition device and electronic equipment having the image acquisition device
KR20130136314A (ko) Touch panel device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20805479

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20805479

Country of ref document: EP

Kind code of ref document: A1