WO2018171041A1 - Dynamic intelligent projection system and method thereof - Google Patents
Dynamic intelligent projection system and method thereof
- Publication number
- WO2018171041A1 (PCT/CN2017/085727)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- projection
- unit
- user
- image
- correction
- Prior art date
- 2017-03-23
Links
Images
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
- H04N9/3182—Colour adjustment, e.g. white balance, shading or gamut
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
- H04N9/3185—Geometric adjustment, e.g. keystone or convergence
Definitions
- the present invention relates to the field of digital projection display technology, and more particularly to a dynamic intelligent projection system and method thereof.
- the functions required of a projection device vary with the usage scenario. For example, in a home or on a large stage, when a projection device needs to perform multi-directional projection display according to the user's intended projection direction, it must recognize and track the direction of the user's intended projection position, change the projection direction of the projection lens accordingly, and perform high-quality intelligent projection. Therefore, an intelligent projection system and method that tracks the user's intended projection direction, changes the direction of the projection lens in real time, and performs projection picture correction to realize a high-quality projection display has become one of the research focuses in the projection field.
- an object of the present invention is to provide a dynamic intelligent projection system, and a method thereof, that can change the projection direction in real time according to the user's intended projection direction and perform image correction processing, realizing high-quality dynamic projection in a variety of scenes.
- the user of the present invention can be a human or a robot or other animal.
- the present invention provides a dynamic intelligent projection display system comprising:
- a projection unit configured to project projection information to the projection screen
- An environment recognition unit configured to acquire a complete projection picture image of the projection unit and a change of the environment thereof;
- a target direction identifying unit configured to track and determine the user's intended projection direction according to the user's directional information;
- a first direction control unit that simultaneously changes a projection direction of the projection unit and an acquisition direction of the environment recognition unit by rotating;
- a central processing unit connected to the projection unit, the environment recognition unit, the target direction identification unit, and the first direction control unit;
- the central processing unit includes: a target direction processing unit, configured to calculate the rotation angle of the first direction control unit according to the user's directional information; and an image correction unit, configured to analyze and compare the projection picture image and the environment change information acquired by the environment recognition unit and to perform real-time projection image correction.
- the target direction recognizing unit is an eye tracking unit, and the intended projection direction is determined according to the direction of the user's eye movement.
- the eye tracking unit may be an independently arranged head-mounted eye tracker, including: a micro sensor that acquires data in real time, a wireless communication unit, and a miniature camera; the eye tracking unit may instead include: an eye movement environment recognition unit, a second direction control unit, and an eye movement analysis unit; the eye movement environment recognition unit is configured to collect the user's eye image in real time, and the second direction control unit changes the collection direction of the eye movement environment recognition unit by rotating;
- the eye movement analysis unit is configured to compare the center position of the iris pupil in the eye image with a reference position, and to calculate and analyze the user's eye movement direction; the eye movement environment recognition unit is preferably an infrared acquisition device.
- the target direction recognizing unit may instead be a limb target direction recognizing unit;
- the limb may include the user's head and gestures, and the user's intended projection direction is determined according to the swing direction of the user's head or gesture.
- the limb target direction recognizing unit may be a head-mounted tracker that includes a gravity sensor.
- the dynamic intelligent projection display system further comprises: a storage unit and a wireless communication unit; the projection information may come from the storage unit or may be real-time projection information from the wireless communication unit; the environment recognition unit may be an infrared three-dimensional scanner or a camera.
- the projection image correction comprises: projection autofocus, projection image direction correction, and correction of the projection background color, brightness, and degree of surface unevenness;
- the projection image direction correction includes: projection image distortion correction and orientation correction, so that the corrected projection picture is rectangular and the user's eyes can face the projected image directly.
- the first direction control unit and/or the second direction control unit is a pan-tilt head or a multi-dimensional motion stage capable of spherical rotation; the user may be a human, a robot, or another animal.
- the invention also provides a dynamic intelligent projection display method, comprising the following steps:
- (S1) acquiring the user's intended projection direction according to the user's directional information;
- (S2) calculating the rotation angle of the first direction control unit according to the user's intended projection direction;
- (S3) according to the rotation angle of the first direction control unit, orienting both the projection direction of the projection unit and the acquisition direction of the environment recognition unit toward the user's intended projection direction;
- (S4) the projection unit performing an initial projection display along the user's intended projection direction according to the user's directional information;
- (S5) acquiring the projection picture image and environment change information through the environment recognition unit;
- (S6) analyzing and comparing the projection picture image and environment change information acquired by the environment recognition unit and performing real-time projection image correction;
- (S7) the projection unit performing corrected, clear projection.
- the user's directional information in step (S1) is the eye movement direction, and the intended projection direction is determined according to the direction of the user's eye movement; the eye movement direction can be acquired by an independently arranged head-mounted eye tracker, which includes: a micro sensor that acquires data in real time, a wireless communication unit, and a miniature camera.
- the method for acquiring the user's intended projection direction may also include: an eye movement environment recognition step, a second direction control step, and an eye movement analysis step; the eye movement environment recognition step is used to collect the user's eye image in real time;
- the second direction control step changes the acquisition direction of the eye movement environment recognition step by rotating; the eye movement analysis step is used to compare the center position of the iris pupil in the eye image with a reference position, and to calculate and analyze the user's eye movement direction.
- the user's directional information in step (S1) is the swing direction of the user's head or gesture, from which the user's intended projection direction is determined.
- the swing direction of the user's head or gesture can be acquired by a head-mounted tracker, such as one based on a gravity sensor.
- the projection image correction in step (S6) comprises: projection autofocus, projection image direction correction, and correction of the projection background color, brightness, and surface unevenness;
- the projection image direction correction includes: projection image distortion correction and orientation correction, so that the corrected projection picture is rectangular and the user's eyes can face the projected image directly.
- the dynamic intelligent projection display system comprises: a projection unit; an environment recognition unit; a target direction recognition unit; a first direction control unit; and a central processing unit,
- which includes: a target direction processing unit and an image correction unit; therefore, the dynamic intelligent projection display system and method of the invention can track the user's intended projection direction in real time, change the direction of the projection lens in real time, and perform projection picture correction, realizing a high-quality projection display.
- this system can be applied on a large stage to realize scene applications combining virtual and real elements.
- FIG. 1 is a schematic structural view of a preferred embodiment of the dynamic intelligent projection display system of the present invention;
- FIG. 2 is a flow chart of a preferred embodiment of the dynamic intelligent projection display method of the present invention.
- the dynamic intelligent projection display system includes: a projection unit 10, for projecting projection information onto a projection picture;
- an environment recognition unit 20, configured to acquire the complete projection picture image of the projection unit and changes in its environment;
- a target direction recognition unit 30, configured to track and determine the user's intended projection direction according to the user's directional information;
- a first direction control unit 40, which, by rotating, simultaneously changes the projection direction of the projection unit 10 and the acquisition direction of the environment recognition unit 20; and a central processing unit 50, connected to the projection unit 10, the environment recognition unit 20, the target direction recognition unit 30, and the first direction control unit 40.
- the central processing unit 50 includes: a target direction processing unit 51, for calculating the rotation angle of the first direction control unit 40 according to the user's directional information; and an image correction unit 52, for analyzing and comparing the projection picture image and environment change information collected by the environment recognition unit 20 and performing real-time projection image correction.
- the target direction recognizing unit 30 is an eye tracking unit, and the intended projection direction is determined according to the direction of the user's eye movement;
- the eye tracking unit may be an independently arranged head-mounted eye tracker,
- which comprises: a micro sensor for acquiring data in real time, a wireless communication unit, and a miniature camera;
- the head-mounted eye tracker is an independently arranged device, which can be a head-worn device that accurately tracks the direction of the user's eye movement at close range and then, through the wireless
- communication unit, transmits the eye movement direction data to the central processing unit 50.
- the eye tracking unit may further include: an eye movement environment recognition unit, a second direction control unit, and an eye movement analysis unit;
- the eye movement environment recognition unit is configured to collect a human eye image of the user in real time;
- the second direction control unit changes the acquisition direction of the eye movement environment recognition unit by rotating;
- the eye movement analysis unit is configured to compare the center position of the iris pupil in the human eye image with the reference position, and calculate and analyze the eye movement direction of the user;
- the eye movement environment recognition unit is preferably an infrared collection device.
- the target direction identifying unit 30 may also be a limb target direction recognition unit.
- the limb may include the user's head and gestures; the user's intended projection direction is determined according to the swing direction of the user's head or gesture; and the limb target direction recognition unit may be a head-mounted tracker including a gravity sensor.
- the dynamic intelligent projection display system further includes: a storage unit 60 and a wireless communication unit 70; the projection information of the projection unit 10 may come from the storage unit 60 or may be projection information received in real time from the wireless communication unit 70; the environment recognition unit 20 may be an infrared three-dimensional scanner or a camera.
- the first direction control unit and/or the second direction control unit is a pan/tilt or a multi-dimensional motion stage, which can realize spherical rotation.
- the image correcting unit 52 corrects the projected image using the projection picture image acquired in real time by the environment recognizing unit 20;
- the projection image correction includes: projection autofocus, projection image direction correction, and correction of the projection background color, brightness, and degree of surface unevenness;
- the projection image direction correction includes projection image distortion correction and orientation correction, so that the corrected projection picture is rectangular and the user's eyes can face the projected image directly.
- for example, when the background color of the projection surface in the user's eye movement direction has too little contrast with the content to be displayed, the image correction unit 52 performs correction processing according to the acquired projection picture image, changing the background color or font color of the projected information to achieve intelligent, clear projection.
- likewise, when the projection surface in the user's eye movement direction is uneven and the projected picture becomes discontinuous, the image correction unit 52 performs correction processing according to the acquired projection picture image to achieve intelligent, clear projection.
- the invention also provides a dynamic intelligent projection display method, as shown in FIG. 2, comprising the following steps:
- (S1) acquiring the user's intended projection direction according to the user's directional information;
- (S2) calculating the rotation angle of the first direction control unit according to the user's intended projection direction;
- (S3) according to the rotation angle of the first direction control unit, orienting both the projection direction of the projection unit and the acquisition direction of the environment recognition unit toward the user's intended projection direction;
- (S4) the projection unit performing an initial projection display along the user's intended projection direction according to the user's directional information;
- (S5) acquiring the projection picture image and environment change information through the environment recognition unit;
- (S6) analyzing and comparing the projection picture image and environment change information acquired by the environment recognition unit and performing real-time projection image correction;
- (S7) the projection unit performing corrected, clear projection.
- the user's directional information in step (S1) is the eye movement direction, and the intended projection direction can be determined according to the direction of the user's eye movement; the eye movement direction can be acquired by an independently arranged head-mounted eye tracker, which includes: a micro sensor that acquires data in real time, a wireless communication unit, and a miniature camera.
- the method for acquiring the user's intended projection direction may also include: an eye movement environment recognition step, a second direction control step, and an eye movement analysis step; the eye movement environment recognition step is configured to collect the user's eye image in real time; the second direction control step changes the acquisition direction of the eye movement environment recognition step by rotating; the eye movement analysis step is used to compare the center position of the iris pupil in the eye image with a reference position, and to calculate and analyze the user's eye movement direction.
- the user's directional information in step (S1) may instead be the swing direction of a limb;
- the limb may include the user's head and gestures; the user's intended projection direction is determined according to the swing direction of the user's head or gesture;
- the swing direction of the limb can be acquired by a head-mounted tracker, such as one based on a gravity sensor.
- the projection image correction in step (S6) includes: projection autofocus, projection image direction correction, and correction of the projection background color, brightness, and degree of surface unevenness;
- the projection image direction correction includes: projection image distortion correction and orientation correction, so that the corrected projection picture is rectangular and the user's eyes can face the projected image directly.
- the user may be a human, a robot, or another animal, and the dynamic intelligent projection display system and method of the present invention can be applied on many occasions.
- for example, in a large-scale performance, the user controls the projection direction of the projection unit in real time by means of the target direction recognition unit; according to the needs of the stage performance, relevant content is projected and displayed at different positions on the stage background and projection correction is performed, realizing a dazzling stage effect that combines virtual and real elements; the user may be a performer on the stage or a behind-the-scenes stage effects operator, and so on.
- or, in a home, the owner or a pet controls the projection direction of the projection unit in real time by means of the target direction recognition unit and can perform intelligent projection at different times and positions, realizing an interactive entertainment experience;
- or, in a large restaurant, the store owner or a robot waiter can control the projection direction of the projection unit in real time by means of the target direction recognition unit and project menu introductions or other entertainment content according to different customers' needs, enriching the dining experience.
- the dynamic intelligent projection display system and method can track the user's intended projection direction in real time by means of the target direction recognition unit, change the direction of the projection lens in real time, and perform projection picture correction to achieve a high-quality intelligent projection display.
- it can be applied on a variety of occasions, giving users different visual enjoyment and entertainment effects, and has very promising market prospects.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- General Physics & Mathematics (AREA)
- Geometry (AREA)
- Projection Apparatus (AREA)
Abstract
A dynamic intelligent projection display system and method. The projection display system comprises: a projection unit (10), an environment recognition unit (20), a target direction recognition unit (30), a first direction control unit (40), and a central processing unit (50). The environment recognition unit (20) recognizes the complete projection picture image of the projection unit (10) and changes in the environment. The target direction recognition unit (30) tracks and determines the user's intended projection direction according to the user's directional information. The first direction control unit (40) simultaneously changes the projection direction of the projection unit (10) and the acquisition direction of the environment recognition unit (20). The central processing unit (50) comprises a target direction processing unit (51) and an image correction unit (52). The target direction processing unit (51) calculates the rotation angle of the first direction control unit (40) according to the user's directional information. The image correction unit (52) analyzes and compares the projection picture image and the environment change information collected by the environment recognition unit (20) and performs real-time projection image correction. By means of the target direction recognition unit (30), the dynamic intelligent projection display system can track the user's intended projection direction in real time, change the direction of the projection lens in real time, and correct the projection picture, realizing a high-quality intelligent projection display.
Description
The present invention relates to the field of digital projection display technology, and more particularly to a dynamic intelligent projection system and method thereof.
With the development of science and technology, and in particular the advance of semiconductor technology, portable electronic devices are continually being designed and manufactured. As portable electronic devices improve, users increasingly demand human-machine interface display components that are compact while offering large screens and high resolution. Driven by strong user demand, projector technology has developed rapidly in recent years, and DLP, LCOS, and other portable projectors with high performance, small size, and light weight have been launched.
In the projection field, the functions required of a projection device vary with the usage scenario. For example, in a home or on a large stage, when a projection device needs to perform multi-directional projection display according to the user's intended projection direction, the device must recognize and track the direction of the user's intended projection position, change the projection direction of the projection lens accordingly, and perform high-quality intelligent projection. Therefore, an intelligent projection system and method that can track the user's intended projection direction, change the direction of the projection lens in real time, and perform projection picture correction to realize a high-quality projection display has become one of the research focuses in the projection field.
Summary of the Invention
In view of the above technical problems, the object of the present invention is to provide a dynamic intelligent projection system, and a method thereof, that can change the projection direction in real time according to the user's intended projection direction and perform image correction processing, realizing high-quality dynamic projection in a variety of scenes. The user of the present invention may be a human, a robot, or another animal.
To achieve the above object, the present invention provides a dynamic intelligent projection display system, comprising:
a projection unit, configured to project projection information onto a projection picture;
an environment recognition unit, configured to acquire the complete projection picture image of the projection unit and changes in its environment;
a target direction recognition unit, configured to track and determine the user's intended projection direction according to the user's directional information;
a first direction control unit, which, by rotating, simultaneously changes the projection direction of the projection unit and the acquisition direction of the environment recognition unit; and
a central processing unit, connected to the projection unit, the environment recognition unit, the target direction recognition unit, and the first direction control unit; the central processing unit comprises: a target direction processing unit, configured to calculate the rotation angle of the first direction control unit according to the user's directional information; and an image correction unit, configured to analyze and compare the projection picture image and environment change information acquired by the environment recognition unit and to perform real-time projection image correction.
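For illustration, the rotation-angle calculation of the target direction processing unit could take a form like the following Python sketch, assuming the intended projection direction is supplied as a unit vector in the coordinate frame of the first direction control unit; the patent does not fix a particular formula, so the axis convention below is an assumption.

```python
import math

def gaze_to_pan_tilt(direction):
    """Convert an intended-direction unit vector (x, y, z) into pan and tilt
    angles in degrees for a two-axis direction control unit.

    Assumed convention: x points right, y points up, z points forward along
    the pan-tilt head's neutral axis.
    """
    x, y, z = direction
    pan = math.degrees(math.atan2(x, z))                  # horizontal rotation
    tilt = math.degrees(math.atan2(y, math.hypot(x, z)))  # vertical rotation
    return pan, tilt

# Example: the user looks about 30 degrees to the right and slightly upward.
print(gaze_to_pan_tilt((0.50, 0.10, 0.86)))   # ~ (30.2, 5.7)
```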
Preferably, the target direction recognition unit is an eye tracking unit, and the user's intended projection direction is determined according to the direction of the user's eye movement. The eye tracking unit may be an independently arranged head-mounted eye tracker comprising a micro sensor for acquiring data in real time, a wireless communication unit, and a miniature camera; alternatively, the eye tracking unit may comprise an eye movement environment recognition unit, a second direction control unit, and an eye movement analysis unit. The eye movement environment recognition unit is configured to collect the user's eye image in real time; the second direction control unit changes the acquisition direction of the eye movement environment recognition unit by rotating; the eye movement analysis unit is configured to compare the center position of the iris pupil in the eye image with a reference position, and to calculate and analyze the user's eye movement direction. The eye movement environment recognition unit is preferably an infrared acquisition device.
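As a minimal sketch of the comparison between the iris-pupil center and a reference position, one possible (assumed) realization maps the pixel offset to a coarse eye movement direction; the dead-zone value and the discrete labels are illustrative choices, not part of the disclosure.

```python
def eye_movement_direction(pupil_center, reference, dead_zone=3.0):
    """Classify the eye movement direction from the offset (in pixels) between
    the detected iris/pupil center and a calibrated reference position.

    Image coordinates are assumed to grow rightward in x and downward in y;
    the dead zone suppresses small jitter.
    """
    dx = pupil_center[0] - reference[0]
    dy = pupil_center[1] - reference[1]
    if max(abs(dx), abs(dy)) < dead_zone:
        return "center"
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

print(eye_movement_direction((332, 240), (320, 238)))   # -> 'right'
```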
Preferably, the target direction recognition unit is a limb target direction recognition unit. The limb may include the user's head and gestures; the user's intended projection direction is determined according to the swing direction of the user's head or gesture. The limb target direction recognition unit may be a head-mounted tracker comprising a gravity sensor.
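A small illustrative sketch, under an assumed axis convention, of how a head-mounted tracker with a gravity sensor could report the head's swing (pitch and roll) from a 3-axis acceleration reading:

```python
import math

def head_swing_from_gravity(ax, ay, az):
    """Estimate head pitch and roll (in degrees) from a 3-axis gravity /
    accelerometer reading of a head-mounted tracker.

    Assumed axes when the head is level: x forward, y to the left, z up,
    with (ax, ay, az) normalized so that gravity has magnitude 1.
    """
    pitch = math.degrees(math.atan2(ax, math.hypot(ay, az)))  # nod forward/back
    roll = math.degrees(math.atan2(ay, az))                   # tilt left/right
    return pitch, roll

# Head nodded forward by roughly 20 degrees:
print(head_swing_from_gravity(0.34, 0.0, 0.94))   # ~ (19.9, 0.0)
```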
Preferably, the dynamic intelligent projection display system further comprises a storage unit and a wireless communication unit; the projection information may come from the storage unit or may be real-time projection information from the wireless communication unit; the environment recognition unit may be an infrared three-dimensional scanner or a camera.
Preferably, the projection image correction comprises: projection autofocus, projection image direction correction, and correction of the projection background color, brightness, and degree of surface unevenness; the projection image direction correction comprises projection image distortion correction and orientation correction, so that the corrected projection picture is rectangular and the user's eyes can face the projected image directly.
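As one common way to realize the distortion (keystone) part of this correction, a generic OpenCV homography sketch is shown below; the corner detection and the exact correction pipeline of the patent are not specified here, and the output resolution is an arbitrary assumption.

```python
import cv2
import numpy as np

def keystone_correct(frame, detected_corners, width=1280, height=720):
    """Warp a keystone-distorted projection picture, as seen by the environment
    recognition camera, back to a rectangle.

    `detected_corners` are the four corners of the projected picture in the
    camera image, ordered top-left, top-right, bottom-right, bottom-left.
    """
    src = np.float32(detected_corners)
    dst = np.float32([[0, 0], [width, 0], [width, height], [0, height]])
    homography = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(frame, homography, (width, height))
```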
Preferably, the first direction control unit and/or the second direction control unit is a pan-tilt head or a multi-dimensional motion stage capable of spherical rotation; the user may be a human, a robot, or another animal.
The present invention also provides a dynamic intelligent projection display method, comprising the following steps:
(S1) acquiring the user's intended projection direction according to the user's directional information;
(S2) calculating the rotation angle of the first direction control unit according to the user's intended projection direction;
(S3) according to the rotation angle of the first direction control unit, orienting both the projection direction of the projection unit and the acquisition direction of the environment recognition unit toward the user's intended projection direction;
(S4) the projection unit performing an initial projection display along the user's intended projection direction according to the user's directional information;
(S5) acquiring the projection picture image and environment change information through the environment recognition unit;
(S6) analyzing and comparing the projection picture image and environment change information acquired by the environment recognition unit and performing real-time projection image correction;
(S7) the projection unit performing corrected, clear projection.
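An illustrative control loop tying steps (S1) through (S7) together might look as follows; every component interface in this sketch (read_intended_direction, rotate_to, and so on) is a hypothetical placeholder rather than an API defined by this disclosure.

```python
def dynamic_projection_cycle(tracker, direction_unit, projector, env_camera, corrector):
    """One illustrative control cycle covering steps (S1) through (S7)."""
    intent = tracker.read_intended_direction()        # (S1) user's directional information
    pan, tilt = corrector.compute_rotation(intent)    # (S2) rotation angle of the control unit
    direction_unit.rotate_to(pan, tilt)               # (S3) aim projector and camera together
    projector.project_initial()                       # (S4) initial projection display
    frame, env_change = env_camera.capture()          # (S5) projection picture + environment info
    params = corrector.analyze(frame, env_change)     # (S6) analyze, compare, compute correction
    projector.apply_correction(params)                # (S7) corrected, clear projection
```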
Preferably, the user's directional information in step (S1) is the eye movement direction, and the intended projection direction is determined according to the direction of the user's eye movement; the eye movement direction may be acquired by an independently arranged head-mounted eye tracker, which includes: a micro sensor for acquiring data in real time, a wireless communication unit, and a miniature camera. The method for acquiring the user's intended projection direction may also comprise: an eye movement environment recognition step, a second direction control step, and an eye movement analysis step; the eye movement environment recognition step is used to collect the user's eye image in real time; the second direction control step changes the acquisition direction of the eye movement environment recognition step by rotating; and the eye movement analysis step is used to compare the center position of the iris pupil in the eye image with a reference position and to calculate and analyze the user's eye movement direction.
Preferably, the user's directional information in step (S1) is the swing direction of the user's head or gesture, from which the user's intended projection direction is determined. The swing direction of the user's head or gesture may be acquired by a head-mounted tracker, such as one based on a gravity sensor.
Preferably, the projection image correction in step (S6) comprises: projection autofocus, projection image direction correction, and correction of the projection background color, brightness, and degree of surface unevenness; the projection image direction correction comprises projection image distortion correction and orientation correction, so that the corrected projection picture is rectangular and the user's eyes can face the projected image directly.
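For the projection autofocus portion of step (S6), one plausible closed-loop sketch uses a Laplacian-variance sharpness measure on the environment recognition unit's view; `projector.set_focus` and `env_camera.capture_gray` are hypothetical interfaces used only to illustrate the idea.

```python
import cv2

def sharpness(gray_frame):
    """Variance of the Laplacian: a simple focus measure (higher = sharper)."""
    return cv2.Laplacian(gray_frame, cv2.CV_64F).var()

def autofocus(projector, env_camera, positions=range(0, 256, 8)):
    """Coarse projection autofocus: sweep the lens, keep the sharpest setting."""
    best_pos, best_score = None, -1.0
    for pos in positions:
        projector.set_focus(pos)
        score = sharpness(env_camera.capture_gray())
        if score > best_score:
            best_pos, best_score = pos, score
    projector.set_focus(best_pos)
    return best_pos
```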
Compared with the prior art, the present invention has the following beneficial effects: since the dynamic intelligent projection display system of the present invention comprises a projection unit, an environment recognition unit, a target direction recognition unit, a first direction control unit, and a central processing unit that includes a target direction processing unit and an image correction unit, the dynamic intelligent projection display system and method of the present invention can track the user's intended projection direction in real time, change the direction of the projection lens in real time, and perform projection picture correction, realizing a high-quality projection display; the system can be applied on large stages to realize scene applications that combine virtual and real elements.
FIG. 1 is a schematic structural view of a preferred embodiment of the dynamic intelligent projection display system of the present invention;
FIG. 2 is a flow chart of a preferred embodiment of the dynamic intelligent projection display method of the present invention.
The specific embodiments of the present invention are described in detail below with reference to the accompanying drawings, but it should be understood that the scope of protection of the present invention is not limited by these specific embodiments.
Unless explicitly stated otherwise, throughout the specification and claims the term "comprise" and its variants such as "comprising" or "including" will be understood to include the stated elements or components without excluding other elements or components.
FIG. 1 is a schematic structural view of Embodiment 1 of the dynamic intelligent projection display system of the present invention. As shown in FIG. 1, the dynamic intelligent projection display system comprises: a projection unit 10, configured to project projection information onto a projection picture; an environment recognition unit 20, configured to acquire the complete projection picture image of the projection unit and changes in its environment; a target direction recognition unit 30, configured to track and determine the user's intended projection direction according to the user's directional information; a first direction control unit 40, which, by rotating, simultaneously changes the projection direction of the projection unit 10 and the acquisition direction of the environment recognition unit 20; and a central processing unit 50, connected to the projection unit 10, the environment recognition unit 20, the target direction recognition unit 30, and the first direction control unit 40. The central processing unit 50 comprises: a target direction processing unit 51, configured to calculate the rotation angle of the first direction control unit 40 according to the user's directional information; and an image correction unit 52, configured to analyze and compare the projection picture image and environment change information collected by the environment recognition unit 20 and to perform real-time projection image correction.
In this embodiment, the target direction recognition unit 30 is an eye tracking unit, and the intended projection direction is determined according to the direction of the user's eye movement. The eye tracking unit may be an independently arranged head-mounted eye tracker comprising a micro sensor for acquiring data in real time, a wireless communication unit, and a miniature camera; the head-mounted eye tracker is an independently arranged device, which may be a head-worn device that accurately tracks the direction of the user's eye movement at close range and then transmits the eye movement direction data to the central processing unit 50 through the wireless communication unit. Of course, the eye tracking unit may instead comprise an eye movement environment recognition unit, a second direction control unit, and an eye movement analysis unit; the eye movement environment recognition unit is configured to collect the user's eye image in real time; the second direction control unit changes the acquisition direction of the eye movement environment recognition unit by rotating; the eye movement analysis unit is configured to compare the center position of the iris pupil in the eye image with a reference position, and to calculate and analyze the user's eye movement direction; the eye movement environment recognition unit is preferably an infrared acquisition device.
In addition, in this embodiment, the target direction recognition unit 30 may also be a limb target direction recognition unit. The limb may include the user's head and gestures; the user's intended projection direction is determined according to the swing direction of the user's head or gesture; and the limb target direction recognition unit may be a head-mounted tracker comprising a gravity sensor.
In this embodiment, the dynamic intelligent projection display system further comprises: a storage unit 60 and a wireless communication unit 70; the projection information of the projection unit 10 may come from the storage unit 60 or may be projection information received in real time from the wireless communication unit 70; the environment recognition unit 20 may be an infrared three-dimensional scanner or a camera. The first direction control unit and/or the second direction control unit is a pan-tilt head or a multi-dimensional motion stage capable of spherical rotation.
In this embodiment, the image correction unit 52 corrects the projected image using the projection picture image collected in real time by the environment recognition unit 20; the projection image correction includes: projection autofocus, projection image direction correction, and correction of the projection background color, brightness, and degree of surface unevenness; the projection image direction correction includes projection image distortion correction and orientation correction, so that the corrected projection picture is rectangular and the user's eyes can face the projected image directly. For example, when the background color of the projection surface in the user's eye movement direction has insufficient contrast with the content to be displayed and the content cannot be shown clearly, the image correction unit 52 performs correction processing according to the collected projection picture image, changing the background color or font color of the projected information to achieve intelligent, clear projection; or, when the projection surface in the user's eye movement direction is uneven and the projected picture is discontinuous, the image correction unit 52 performs correction processing according to the collected projection picture image to achieve intelligent, clear projection.
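As an illustration of the low-contrast case described above, a simple sketch that picks a readable font color against the measured surface color could look like this; the 4.5:1 threshold is borrowed from the WCAG guideline and is an assumption, not something this disclosure prescribes.

```python
def relative_luminance(rgb):
    """Approximate relative luminance of an sRGB color (0-255 per channel)."""
    def linear(c):
        c /= 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def pick_font_color(surface_rgb, min_contrast=4.5):
    """Choose white or black text so the contrast ratio against the measured
    color of the projection surface stays readable."""
    ls = relative_luminance(surface_rgb)
    contrast_with_white = (1.0 + 0.05) / (ls + 0.05)
    return (255, 255, 255) if contrast_with_white >= min_contrast else (0, 0, 0)

print(pick_font_color((40, 90, 60)))   # dark greenish wall -> white text
```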
The present invention also provides a dynamic intelligent projection display method, as shown in FIG. 2, comprising the following steps:
(S1) acquiring the user's intended projection direction according to the user's directional information;
(S2) calculating the rotation angle of the first direction control unit according to the user's intended projection direction;
(S3) according to the rotation angle of the first direction control unit, orienting both the projection direction of the projection unit and the acquisition direction of the environment recognition unit toward the user's intended projection direction;
(S4) the projection unit performing an initial projection display along the user's intended projection direction according to the user's directional information;
(S5) acquiring the projection picture image and environment change information through the environment recognition unit;
(S6) analyzing and comparing the projection picture image and environment change information acquired by the environment recognition unit and performing real-time projection image correction;
(S7) the projection unit performing corrected, clear projection.
In this embodiment, the user's directional information in step (S1) is the eye movement direction, and the intended projection direction can be determined according to the direction of the user's eye movement; the eye movement direction may be acquired by an independently arranged head-mounted eye tracker comprising a micro sensor for acquiring data in real time, a wireless communication unit, and a miniature camera. In addition, the method for acquiring the user's intended projection direction may also comprise: an eye movement environment recognition step, a second direction control step, and an eye movement analysis step; the eye movement environment recognition step is used to collect the user's eye image in real time; the second direction control step changes the acquisition direction of the eye movement environment recognition step by rotating; and the eye movement analysis step is used to compare the center position of the iris pupil in the eye image with a reference position and to calculate and analyze the user's eye movement direction.
In this embodiment, the user's directional information in step (S1) is the swing direction of a limb; the limb may include the user's head and gestures; the user's intended projection direction is determined according to the swing direction of the user's head or gesture; and the swing direction of the limb may be acquired by a head-mounted tracker, such as one based on a gravity sensor.
In this embodiment, the projection image correction in step (S6) comprises: projection autofocus, projection image direction correction, and correction of the projection background color, brightness, and degree of surface unevenness; the projection image direction correction comprises projection image distortion correction and orientation correction, so that the corrected projection picture is rectangular and the user's eyes can face the projected image directly.
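For the surface-unevenness part of this correction, one assumed approach is radiometric pre-compensation based on the camera's view of a full-white projection; the sketch below illustrates the idea only and omits the camera-to-projector registration step.

```python
import cv2
import numpy as np

def flatten_projection(content, white_capture, max_gain=3.0):
    """Radiometric pre-compensation sketch for an uneven or tinted surface:
    boost the content where the camera's view of a full-white projection is
    dark, and leave it unchanged where the surface already reflects well.

    `white_capture` is assumed to be registered to the projector's pixel grid.
    """
    white = cv2.GaussianBlur(white_capture.astype(np.float32), (31, 31), 0)
    gain = white.max() / np.clip(white, 1.0, None)            # per-pixel boost factor
    compensated = content.astype(np.float32) * np.clip(gain, 1.0, max_gain)
    return np.clip(compensated, 0, 255).astype(np.uint8)
```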
It is worth noting that, in the embodiments of the invention, the user may be a human, a robot, or another animal, and the dynamic intelligent projection display system and method of the present invention can be applied on many occasions. For example, in a large-scale performance, the user controls the projection direction of the projection unit in real time by means of the target direction recognition unit; according to the needs of the stage performance, relevant content is projected and displayed at different positions on the stage background and projection correction is performed, realizing a dazzling stage effect that combines virtual and real elements; the user may be a performer on the stage or a behind-the-scenes stage effects operator, and so on. Or, in a home, the owner or a pet controls the projection direction of the projection unit in real time by means of the target direction recognition unit and can perform intelligent projection at different times and positions, realizing an interactive entertainment experience. Or, in a large restaurant, the store owner or a robot waiter controls the projection direction of the projection unit in real time by means of the target direction recognition unit and can project menu introductions or other entertainment content according to different customers' needs, enriching the dining experience.
In summary, by means of the target direction recognition unit, the dynamic intelligent projection display system and method of the present invention can track the user's intended projection direction in real time, change the direction of the projection lens in real time, and perform projection picture correction, realizing a high-quality intelligent projection display; they can be applied on a variety of occasions, giving users different visual enjoyment and entertainment effects, and have very promising market prospects.
The foregoing description of specific exemplary embodiments of the present invention is for the purposes of illustration and exemplification. It is not intended to limit the invention to the precise forms disclosed, and obviously many changes and variations are possible in light of the above teachings. The exemplary embodiments were chosen and described in order to explain the specific principles of the invention and its practical application, thereby enabling those skilled in the art to implement and utilize various exemplary embodiments of the invention as well as various alternatives and modifications thereof. The scope of the invention is intended to be defined by the claims and their equivalents.
Claims (12)
- A dynamic intelligent projection display system, characterized by comprising: a projection unit, configured to project projection information onto a projection picture; an environment recognition unit, configured to recognize the complete projection picture image of the projection unit and changes in its environment; a target direction recognition unit, configured to track and determine the user's intended projection direction according to the user's directional information; a first direction control unit, which, by rotating, simultaneously changes the projection direction of the projection unit and the acquisition direction of the environment recognition unit; and a central processing unit, connected to the projection unit, the environment recognition unit, the target direction recognition unit, and the first direction control unit; the central processing unit comprising: a target direction processing unit, configured to calculate the rotation angle of the first direction control unit according to the user's directional information, and an image correction unit, configured to analyze and compare the projection picture image and environment change information acquired by the environment recognition unit and perform real-time projection image correction.
- The dynamic intelligent projection display system according to claim 1, characterized in that the target direction recognition unit is an eye tracking unit, and the intended projection direction is determined according to the direction of the user's eye movement.
- The dynamic intelligent projection display system according to claim 2, characterized in that the eye tracking unit comprises: an eye movement environment recognition unit, a second direction control unit, and an eye movement analysis unit; the eye movement environment recognition unit is configured to collect the user's eye image in real time; the second direction control unit changes the acquisition direction of the eye movement environment recognition unit by rotating; and the eye movement analysis unit is configured to compare the center position of the iris pupil in the eye image with a reference position and to calculate and analyze the user's eye movement direction.
- The dynamic intelligent projection display system according to claim 1, characterized in that the target direction recognition unit is a limb target direction recognition unit, and the user's intended projection direction is determined according to the swing direction of the user's head or gesture.
- The dynamic intelligent projection display system according to any one of claims 1 to 4, characterized in that it further comprises: a storage unit and a wireless communication unit; the projection information comes from the storage unit or is real-time projection information from the wireless communication unit; and the environment recognition unit is an infrared three-dimensional scanner or a camera.
- The dynamic intelligent projection display system according to any one of claims 1 to 4, characterized in that the projection image correction comprises: projection autofocus, projection image direction correction, and correction of the projection background color, brightness, and degree of surface unevenness; the projection image direction correction comprises projection image distortion correction and orientation correction, so that the corrected projection picture is rectangular and the user's eyes can face the projected image directly.
- The dynamic intelligent projection display system according to claim 1 or 3, characterized in that the first direction control unit and/or the second direction control unit is a pan-tilt head or a multi-dimensional motion stage capable of spherical rotation; and the user is a human, a robot, or another animal.
- A dynamic intelligent projection display method, characterized by comprising the following steps: (S1) acquiring the user's intended projection direction according to the user's directional information; (S2) calculating the rotation angle of the first direction control unit according to the user's intended direction; (S3) according to the rotation angle of the first direction control unit, orienting both the projection direction of the projection unit and the acquisition direction of the environment recognition unit toward the user's intended projection direction; (S4) the projection unit performing an initial projection display along the user's intended projection direction according to the user's directional information; (S5) acquiring the projection picture image and environment change information through the environment recognition unit; (S6) analyzing and comparing the projection picture image and environment change information acquired by the environment recognition unit and performing real-time projection image correction; and (S7) the projection unit performing corrected, clear projection.
- The dynamic intelligent projection display method according to claim 8, characterized in that the user's directional information in step (S1) is the eye movement direction, and the intended projection direction is determined according to the direction of the user's eye movement.
- The dynamic intelligent projection display method according to claim 8, characterized in that the method of acquiring the user's intended projection direction in step (S1) comprises: an eye movement environment recognition step, a second direction control step, and an eye movement analysis step; the eye movement environment recognition step is used to collect the user's eye image in real time; the second direction control step changes the acquisition direction of the eye movement environment recognition step by rotating; and the eye movement analysis step is used to compare the center position of the iris pupil in the eye image with a reference position and to calculate and analyze the user's eye movement direction.
- The dynamic intelligent projection display method according to claim 8, characterized in that the user's directional information in step (S1) is the swing direction of the user's head or gesture, from which the user's intended projection direction is determined.
- The dynamic intelligent projection display method according to claim 8, characterized in that the projection image correction in step (S6) comprises: projection autofocus, projection image direction correction, and correction of the projection background color, brightness, and degree of surface unevenness; the projection image direction correction comprises projection image distortion correction and orientation correction, so that the corrected projection picture is rectangular and the user's eyes can face the projected image directly.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710178577.5A CN107027014A (zh) | 2017-03-23 | 2017-03-23 | Dynamic intelligent projection system and method thereof |
CN201710178577.5 | 2017-03-23 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018171041A1 (zh) | 2018-09-27 |
Family
ID=59525803
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2017/085727 WO2018171041A1 (zh) | 2017-03-23 | 2017-05-24 | Dynamic intelligent projection system and method thereof |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN107027014A (zh) |
WO (1) | WO2018171041A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11402900B2 (en) | 2019-01-07 | 2022-08-02 | Beijing Boe Optoelectronics Technology Co., Ltd. | Augmented reality system comprising an aircraft and control method therefor |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- CN107704076A (zh) * | 2017-09-01 | 2018-02-16 | 广景视睿科技(深圳)有限公司 | Dynamic projection object display system and method thereof |
- CN107977082A (zh) * | 2017-12-19 | 2018-05-01 | 亮风台(上海)信息科技有限公司 | Method and system for presenting AR information |
- CN109993835A (zh) * | 2017-12-31 | 2019-07-09 | 广景视睿科技(深圳)有限公司 | Stage interactive projection method, apparatus and system |
- CN109996051B (zh) * | 2017-12-31 | 2021-01-05 | 广景视睿科技(深圳)有限公司 | Projection-area-adaptive dynamic projection method, apparatus and system |
- CN110543230A (zh) * | 2018-05-28 | 2019-12-06 | 广州彩熠灯光有限公司 | Virtual-reality-based design method and design system for stage lighting elements |
- CN108632594A (zh) * | 2018-07-17 | 2018-10-09 | 王锐 | Intelligent commodity information display system and method |
- CN109191939B (zh) * | 2018-08-31 | 2021-06-01 | 广东小天才科技有限公司 | Smart-device-based three-dimensional projection interaction method and smart device |
- CN109442254A (zh) * | 2018-09-27 | 2019-03-08 | 广东小天才科技有限公司 | Smart-desk-lamp-based learning assistance method and smart desk lamp |
- CN109815939A (zh) * | 2019-03-01 | 2019-05-28 | 北京当红齐天国际文化发展集团有限公司 | Projection display system and method based on human eye tracking |
- CN110719451A (zh) * | 2019-09-30 | 2020-01-21 | 深圳市火乐科技发展有限公司 | Projection adjustment method and related product |
- CN111031298B (zh) | 2019-11-12 | 2021-12-10 | 广景视睿科技(深圳)有限公司 | Method and apparatus for controlling projection of a projection module, and projection system |
- CN113052101B (zh) * | 2021-03-31 | 2023-04-07 | 乐融致新电子科技(天津)有限公司 | Posture-recognition-based sports assisted teaching method and apparatus |
- CN113382220A (zh) * | 2021-05-25 | 2021-09-10 | 中国联合网络通信集团有限公司 | Smart-lamp-pole-based projection method and smart lamp pole |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090243963A1 (en) * | 2008-03-28 | 2009-10-01 | Kabushiki Kaisha Toshiba | Image display apparatus and method for displaying an image |
- CN106125848A (zh) * | 2016-08-02 | 2016-11-16 | 宁波智仁进出口有限公司 | Intelligent wearable device |
- CN205720872U (zh) * | 2016-03-10 | 2016-11-23 | 上海科斗电子科技有限公司 | Virtual reality glasses with pupil tracking function |
- KR20160144245A (ko) * | 2015-06-08 | 2016-12-16 | 성균관대학교산학협력단 | Projector device |
- CN206575538U (zh) * | 2017-03-23 | 2017-10-20 | 广景视睿科技(深圳)有限公司 | Dynamic intelligent projection display system |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP4009850B2 (ja) * | 2002-05-20 | 2007-11-21 | セイコーエプソン株式会社 | Projection-type image display system, projector, program, information storage medium, and image projection method |
- JP4006601B2 (ja) * | 2004-03-29 | 2007-11-14 | セイコーエプソン株式会社 | Image processing system, projector, program, information storage medium, and image processing method |
- CN102221887B (zh) * | 2011-06-23 | 2016-05-04 | 康佳集团股份有限公司 | Interactive projection system and method |
- CN103150013A (zh) * | 2012-12-20 | 2013-06-12 | 天津三星光电子有限公司 | Mobile terminal |
- CN103136519A (zh) * | 2013-03-22 | 2013-06-05 | 中国移动通信集团江苏有限公司南京分公司 | Gaze tracking and positioning method based on iris recognition |
- CN104656257A (zh) * | 2015-01-23 | 2015-05-27 | 联想(北京)有限公司 | Information processing method and electronic device |
- JP6701621B2 (ja) * | 2015-03-24 | 2020-05-27 | セイコーエプソン株式会社 | Projector and projector control method |
- CN104866100B (zh) * | 2015-05-27 | 2018-11-23 | 京东方科技集团股份有限公司 | Eye control device, eye control method and eye control system |
- CN106325510B (zh) * | 2016-08-19 | 2019-09-24 | 联想(北京)有限公司 | Information processing method and electronic device |
- CN106445104A (zh) * | 2016-08-25 | 2017-02-22 | 蔚来汽车有限公司 | Vehicle HUD display system and method |
-
2017
- 2017-03-23 CN CN201710178577.5A patent/CN107027014A/zh active Pending
- 2017-05-24 WO PCT/CN2017/085727 patent/WO2018171041A1/zh active Application Filing
Also Published As
Publication number | Publication date |
---|---|
CN107027014A (zh) | 2017-08-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
- WO2018171041A1 (zh) | Dynamic intelligent projection system and method thereof | |
- WO2018223469A1 (zh) | Dynamic projection device and operating method thereof | |
- CN109960401B (zh) | Dynamic projection method, apparatus and system based on face tracking | |
- CN106843460B (zh) | Multi-camera-based multi-target position capture and positioning system and method | |
- WO2017000457A1 (zh) | Handheld interaction device and projection interaction method thereof | |
US10691934B2 (en) | Real-time visual feedback for user positioning with respect to a camera and a display | |
- WO2018196070A1 (zh) | Augmented-reality-based 3D dynamic projection system and projection method for the system | |
US9996979B2 (en) | Augmented reality technology-based handheld viewing device and method thereof | |
- CN206575538U (zh) | Dynamic intelligent projection display system | |
US9498720B2 (en) | Sharing games using personal audio/visual apparatus | |
US8963956B2 (en) | Location based skins for mixed reality displays | |
- CN109982054B (zh) | Projection method and apparatus based on positioning and tracking, projector, and projection system | |
- TW201915831A (zh) | Object recognition method | |
US20150271449A1 (en) | Integrated Interactive Space | |
- CN104881114B (zh) | Real-time angle-rotation matching method for virtual try-on of 3D glasses | |
- CN102221887A (zh) | Interactive projection system and method | |
- CN104935848A (zh) | Projector with camera function | |
- WO2020063000A1 (zh) | Neural network training and gaze detection method and apparatus, and electronic device | |
- WO2019205283A1 (zh) | Infrared-based AR imaging method and system, and electronic device | |
- WO2019128086A1 (zh) | Stage interactive projection method, apparatus and system | |
CA3096312A1 (en) | System for tracking a user during a videotelephony session and method ofuse thereof | |
- CN106237588B (zh) | Multifunctional fitness system based on quadric-surface projection technology | |
US11422618B2 (en) | Smart strap and method for defining human posture | |
- WO2022246608A1 (zh) | Method and apparatus for generating panoramic video, and movable platform | |
Seewald et al. | Magic Mirror: I and My Avatar-A Versatile Augmented Reality Installation Controlled by Hand Gestures. |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17902265 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 21/01/2020) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 17902265 Country of ref document: EP Kind code of ref document: A1 |