WO2019128109A1 - Motion projection method and apparatus based on face tracking, and electronic device - Google Patents

Motion projection method and apparatus based on face tracking, and electronic device Download PDF

Info

Publication number
WO2019128109A1
Authority
WO
WIPO (PCT)
Prior art keywords
angle
projection
face
image
image capturing
Prior art date
Application number
PCT/CN2018/089628
Other languages
English (en)
French (fr)
Inventor
杨伟樑
高志强
纪园
林清云
Original Assignee
广景视睿科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 广景视睿科技(深圳)有限公司
Publication of WO2019128109A1 publication Critical patent/WO2019128109A1/zh

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • G06V40/166Detection; Localisation; Normalisation using acquisition arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/012Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Definitions

  • the embodiments of the present application relate to the field of projection display technologies, and in particular, to a motion projection method, apparatus, and electronic device based on face tracking.
  • the inventor found that the existing interactive projection technology lacks flexibility and cannot project well according to the user's movement, so the user experience is poor.
  • the main purpose of the present application is to provide a motion projection method and apparatus based on face tracking, and an electronic device, to solve the prior-art problem that the projection device cannot move and project along with the movement of a human face.
  • an embodiment of the present application provides a motion projection method based on face tracking, the method comprising:
  • the collecting the face image and determining the position of the face comprises: collecting the face image by the image acquisition device, and using a face detection algorithm to detect the face and determine the position of the face in the current frame relative to the image capturing device.
  • the calculating the angle of the image capturing direction includes: calculating a horizontal angle and a vertical angle;
  • the vertical angle is the angle of the face position relative to the image capturing device in the vertical direction;
  • the horizontal angle is the angle of the face position relative to the image capturing device in the horizontal direction.
  • the calculating the projection angle comprises:
  • when the image capturing device moves horizontally by any angle, the projection device also moves horizontally by the same angle.
  • when the image capturing device moves vertically by any angle, the angle by which the projection device needs to move vertically is determined as follows:
  • the distance between the face and the image acquisition device in the vertical direction is calculated; the projection device is rotated while the actual distance between the projection device and the projection plane is calculated in real time; when the rotation angle makes the actual distance between the projection device and the projection plane equal to the distance from the projection device to the projection plane directly opposite the face, that rotation angle is the direction in which the projection device needs to project.
  • an embodiment of the present application provides a motion projection device based on face tracking, the device comprising:
  • An image acquisition unit configured to collect a face image and determine a location of the face
  • a first calculation and analysis unit configured to calculate an angle of an image collection direction according to a position of the face
  • a first direction control unit configured to adjust an angle of the image capturing direction according to the calculated angle of the image capturing direction
  • a second calculation and analysis unit configured to calculate a projection angle that the projection device needs to change according to adjusting an angle of the image acquisition direction
  • a second direction control unit configured to adjust the projection angle according to the calculated projection angle of the projection device
  • the projection display unit is configured to perform projection according to the adjusted projection angle, so that the projection content is displayed directly opposite the human face.
  • the device further includes: a trigger activation unit, wherein the trigger activation unit is configured to activate a motion projection device in a standby state.
  • the trigger activation unit may be:
  • a remote control module, configured to activate the motion projection device in the standby state when the user operates a miniature button device carried on the person; and/or
  • a voice module, configured to activate the motion projection device in the standby state according to a corresponding voice command; and/or
  • the motion recognition module is configured to receive a body language trigger such as a gesture acquired by the image acquisition unit, and activate a moving projection device in a standby state.
  • the device further includes: a photographing correction unit, configured to capture an image covering the projection picture area and perform projection picture correction.
  • an electronic device including:
  • at least one processor; and
  • a memory communicably connected to the at least one processor; wherein the memory stores instructions executable by the at least one processor, the instructions being executed by the at least one processor to enable the at least one processor to perform the method described above.
  • embodiments of the present application provide a non-transitory computer-readable storage medium storing computer-executable instructions for causing an electronic device to perform the method described above.
  • an embodiment of the present application provides a computer program product comprising a computer program stored on a computer-readable storage medium, the computer program comprising program instructions that, when executed by an electronic device, cause the electronic device to perform the method described above.
  • the present application discloses a motion projection method, device, and electronic device based on face tracking; by tracking changes in the face position of a moving user, the projection angle can be adjusted in real time so that the tracked user always has a head-on view of the projected picture, which enhances the user's viewing experience.
  • FIG. 1 is a schematic flowchart of a motion projection method based on face tracking according to an embodiment of the present application;
  • FIG. 2 is a schematic plan view showing an angle of calculating an image collection direction according to an embodiment of the present application
  • FIG. 3 is a perspective view showing an angle of calculating an image collection direction according to an embodiment of the present application.
  • FIG. 4 is a schematic diagram of a method for calculating a projection angle that a projection device needs to change in a first case provided by an embodiment of the present application;
  • FIG. 5 is a schematic diagram of a method for calculating a projection angle that a projection device needs to change in a second case provided by an embodiment of the present application;
  • FIG. 6 is a schematic diagram of a method for calculating a projection angle that a projection device needs to change in a third case provided by an embodiment of the present application;
  • FIG. 7 is a plan view showing a center position acquired by an image capture device according to an embodiment of the present application.
  • FIG. 8 is a schematic structural diagram of a motion tracking device based on face tracking according to an embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of a trigger activation unit according to an embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of hardware of an electronic device according to an embodiment of the present application.
  • FIG. 1 is a schematic flowchart diagram of a motion projection method according to an embodiment of the present application.
  • a motion projection method of an embodiment of the present application can be performed by the motion projection device 10 shown in FIG.
  • the method includes:
  • when collecting the face image, the image acquisition device collects the face image, and a face detection algorithm is used to detect the face and determine the position of the face in the current frame relative to the image capturing device.
  • the image acquisition device supports both conventional two-dimensional face image capture and three-dimensional depth ranging.
  • the image acquisition device may be a binocular camera or an infrared camera with an optional infrared lens.
  • the calculating the angle of the image capturing direction includes: calculating a horizontal offset angle and a vertical offset angle of the image capturing direction, wherein the vertical offset angle is the angle by which the face position is offset relative to the image acquisition device in the vertical direction, and the horizontal offset angle is the angle by which the face position is offset relative to the image acquisition device in the horizontal direction.
  • the angle of the image capturing direction is adjusted, so that the target face is always located at the center position of the subsequently acquired image.
  • the distance of the human face from the image capturing device is acquired by a depth camera installed in the image capturing device, and a fixed distance between the projection device and the image capturing device is measured using a measuring device.
  • when the image capturing device moves by a certain angle in the horizontal direction, the angle at which the projection device needs to move horizontally is the same as the angle at which the image capturing device moves; when the image capturing device moves by a certain angle in the vertical direction, the projection device needs to move by a different angle in the vertical direction according to the positional relationship between the face and the image capturing device.
  • the specific method for changing the angle of the projection device along with the image capturing direction is as follows: the horizontal angle between the initial positions of the projection device and the image capturing device is 180 degrees, and according to the calculated projection angle of the projection device, the angle of the projection device is adjusted so that the finally adjusted positions of the two always maintain a relative position of 180 degrees.
  • the projection is performed according to the adjusted projection angle, so that the projection content is displayed on the opposite side of the human face.
  • the angle is adjusted so that the human face is at the center of the image.
  • the present application discloses a motion projection method based on face tracking; by tracking changes in the face position of a moving user, the projection angle can be adjusted in real time so that the tracked user always has a head-on view of the projected picture, which enhances the user's viewing experience.
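To make the six steps above concrete, here is a minimal Python sketch of how the camera's offset angles and the mirrored projector pan might be computed from the face's pixel position. The function name, the linear field-of-view model, and all parameters are illustrative assumptions, not the patent's exact formulas.

```python
def track_and_project(face_px, img_w, img_h, hfov_deg, vfov_deg):
    """Sketch: derive the camera's horizontal/vertical offset angles from
    the face's pixel position, then mirror the horizontal angle onto the
    projector (per the method, the projector pans by the same angle)."""
    fx, fy = face_px
    cx, cy = img_w / 2.0, img_h / 2.0
    # Assumed linear FOV model: pixel offset fraction times field of view.
    x_angle_c = (fx - cx) / img_w * hfov_deg  # horizontal offset angle
    y_angle_c = (fy - cy) / img_h * vfov_deg  # vertical offset angle
    x_pr = x_angle_c  # projector mirrors the camera's horizontal rotation
    return x_angle_c, y_angle_c, x_pr
```

A face at the image centre yields zero offsets; a face a quarter-frame to the right yields a pan of a quarter of the horizontal field of view.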
  • FIG. 2 is a schematic plan view of calculating the angle of the image capturing direction according to an embodiment of the present application;
  • FIG. 3 is a schematic perspective view showing an angle of calculating an image capturing direction according to an embodiment of the present application.
  • a face detection algorithm is first used to detect a face.
  • the face detection algorithm includes: after the image collection device receives a face image, performing facial feature positioning on the face region in the image to determine a face.
  • point C is the position in the image of the face in the current frame
  • point A is the center point of the captured image
  • O is the position where the image capturing device camera is located
  • Calculating the angle of the image capturing direction includes: calculating an angle formed by the OC and the OA, as shown in FIG.
  • the angle formed by OC and OA includes a horizontal angle x_angle_c and a vertical angle y_angle_c, wherein L2 is the perpendicular distance from the face to the horizontal plane, and d is the perpendicular distance from the image capturing device to the horizontal plane.
  • dx is the distance between the projection of the image capturing device's camera onto the horizontal plane and point B, the projection of the face onto the horizontal plane; from L2 and dx, the angles x_angle_o and x_angle_c can be calculated, and the deflection angle y_angle_c of the image capturing device in the vertical direction can be calculated.
  • a positional relationship model as shown in FIG. 2 is established, and the horizontal angle x_angle_c and the vertical angle y_angle_c that the image capturing device needs to change are calculated to move the face to the center of the image; the face is tracked in real time using a tracking algorithm, and the image acquisition device is rotated according to the calculated angles so that the target face is always at the center of the image, making it easy to judge the user's intention from the face and change the angle of the projection device accordingly.
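As a hedged illustration of this geometry (the patent's figures may construct the angle somewhat differently), the vertical deflection follows from elementary trigonometry; the function name is an assumption:

```python
import math

def vertical_deflection(L2, d, dx):
    """Sketch: L2 is the face's height above the horizontal plane, d the
    camera's height above the same plane, and dx the ground-plane distance
    between the camera's projection and the face's projection (point B).
    The camera's vertical deflection angle y_angle_c then comes from atan."""
    return math.degrees(math.atan2(L2 - d, dx))
```

A face at the same height as the camera gives zero deflection; a face one unit higher at one unit of ground distance gives 45 degrees.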
  • FIG. 4, FIG. 5 and FIG. 6 are respectively schematic diagrams showing a method for calculating a projection angle that a projection device needs to change in three different situations according to an embodiment of the present application.
  • the specific method for changing the angle of the projection device along with the image capturing direction is as follows: the horizontal angle between the initial positions of the projection device and the image capturing device is 180 degrees; according to the calculated projection angle of the projection device, the angle of the projection device is adjusted so that the finally adjusted positions of the two always maintain a relative position of 180 degrees, and the projector's angle is adjusted so that the projector projects directly opposite the face. It should be noted that the angle may be adjusted upward or downward, or to the left or to the right.
  • the method for calculating a projection angle includes:
  • the distance from the face to the image acquisition device is L; the fixed distance between the projection device and the image acquisition device is h; the angle at which the image capturing device moves horizontally is x_ca; the angle at which the projection device needs to move horizontally is x_pr; the angle at which the image capturing device moves vertically is y_ca; the angle at which the projection device needs to move vertically is y_pr; the distance between the face and the image capturing device in the vertical direction is h1; the actual distance between the projection device and the projection plane is L_pr; and the distance from the projection device to the projection plane directly opposite the face is L_pr_measure;
  • when the angle at which the image capturing device moves horizontally is x_ca, the angle at which the projection device needs to move horizontally is also x_ca; therefore, x_pr is equal to x_ca.
  • the projection device is rotated while the value of L_pr is calculated in real time, where L_pr = (h + h1) / sin(y_pr); when L_pr is equal to L_pr_measure, the rotation angle y_pr is the direction in which the projector needs to project.
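The rotate-and-compare search for the vertical projection angle described above might be sketched as follows. The sweep step is an assumption; a closed form y_pr = asin((h + h1) / L_pr_measure) also follows from the formula, but the patent describes rotating and comparing in real time.

```python
import math

def vertical_projection_angle(h, h1, L_pr_measure, step_deg=0.01):
    """Sweep the projector's tilt y_pr and keep the angle at which the
    computed throw L_pr = (h + h1) / sin(y_pr) best matches L_pr_measure,
    the measured distance to the projection plane opposite the face."""
    best_y, best_err = None, float("inf")
    for i in range(1, int(90.0 / step_deg)):
        y = i * step_deg
        L_pr = (h + h1) / math.sin(math.radians(y))
        err = abs(L_pr - L_pr_measure)
        if err < best_err:
            best_y, best_err = y, err
    return best_y
```

With h + h1 = 1.0 and a measured throw of 2.0, the search settles near 30 degrees, since sin(30°) = 0.5.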
  • x1 and x2 represent the coordinates of the two centers on the same horizontal line.
  • the center positions in the image acquired by the image acquisition device are obtained as follows:
  • the image capturing device is disposed on the same horizontal line as the central axis of the projection device, and the projection device projects two circles onto the same horizontal line.
  • x1 and x2 are the coordinate positions of the two centers on the same horizontal line; f is a fixed coefficient; d0 is the actual distance between the two cameras; w is the width of the image; d is the actual distance between the two detected centers; abs represents the absolute value;
  • (x1 + x2) / 2 represents the center position of the two centers on the x-axis;
  • w / 2 is the center position of the image in the x-axis direction;
  • (x1 + x2) / 2 - w / 2 is the offset distance in the x-axis direction in the image;
  • (x1 - x2) represents the actual offset distance in the x-axis direction.
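One plausible reading of the quantities above, hedged because the patent's exact use of the fixed coefficient f and the distance d0 is not reproduced here: the two projected circle centres have a known real separation, so their detected pixel separation gives a pixels-to-length scale, and the midpoint's pixel offset from the image centre w/2 converts to a real-world offset by that scale.

```python
def midpoint_real_offset(x1, x2, w, d_real):
    """Hypothetical sketch: x1, x2 are the detected pixel x-coordinates of
    the two projected circle centres, w the image width, and d_real the
    known real separation of the circles. Returns the real-world offset
    of their midpoint from the image centre along the x-axis."""
    pixel_offset = (x1 + x2) / 2.0 - w / 2.0  # (x1+x2)/2 - w/2 from the text
    scale = d_real / abs(x1 - x2)             # real units per pixel
    return pixel_offset * scale
```

Two centres symmetric about the image centre give zero offset; shifting both 10 px right at a scale of 0.0025 units/px gives an offset of 0.025 units.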
  • FIG. 8 is a schematic diagram of a face tracking based motion projection apparatus according to an embodiment of the present application.
  • the moving projection device 80 includes:
  • the trigger activation unit 801 is configured to activate a motion projection device in a standby state.
  • the trigger activation unit 801 includes:
  • a remote control module 901 configured to activate a moving projection device in a standby state by controlling a micro button device carried by the user;
  • a voice module 902 configured to activate a moving projection device in a standby state according to a corresponding voice command
  • the motion recognition module 903 is configured to receive a body language trigger such as a gesture collected by the image acquisition unit, and activate a motion projection device in a standby state.
  • the user can activate the moving projection device 80 in the standby state by controlling the micro button device, the corresponding voice command or the body language, and the moving projection device is provided with a central processing module.
  • the central processing module connects and controls each unit in the moving projection device, and the central processing module can be activated by the trigger activation unit 801 to control the activation of the moving projection device 80.
  • the image collecting unit 802 is configured to collect a face image and determine a location of the face
  • the image collecting unit 802 collects a face image and uses a face detection algorithm to detect a face and determine the position of the face in the current frame relative to the image capturing unit 802.
  • a first calculation and analysis unit 803 configured to calculate an angle of an image collection direction according to a position of the face
  • after the image collecting unit 802 collects the face image and determines the position of the face in the current frame relative to the image capturing unit 802, the first calculation and analysis unit 803 analyzes and calculates the angles of the face relative to the image acquisition unit 802 in the vertical direction and in the horizontal direction.
  • the first direction control unit 804 is configured to adjust an angle of the image capturing direction according to the calculated angle of the image capturing direction;
  • according to the angles of the face relative to the image acquisition unit 802 in the vertical and horizontal directions calculated by the first calculation and analysis unit 803, the first direction control unit 804 adjusts the image acquisition unit 802 so that the face is always located at the center of subsequently acquired images.
  • a second calculation and analysis unit 805, configured to calculate a projection angle that the projection device needs to change according to adjusting the image acquisition direction angle
  • the distance of the face from the image acquisition unit 802 is acquired by the depth camera installed in the image acquisition unit 802, and the fixed distance between the projection device and the image acquisition unit 802 is measured with a measuring device.
  • when the image capturing unit 802 moves by a certain angle in the horizontal direction, the angle at which the projection device needs to move horizontally is the same as the angle at which the image capturing unit 802 moves; when the image capturing unit 802 moves by a certain angle in the vertical direction, the projection device needs to move by a different angle in the vertical direction according to the positional relationship between the face and the image acquisition unit 802.
  • a second direction control unit 806, configured to adjust the projection angle according to the calculated projection angle of the projection device
  • the distance of the face from the image acquisition unit 802 is acquired by the depth camera installed in the image acquisition unit 802, and the fixed distance between the projection device and the image acquisition unit 802 is measured with the measuring device; when the image capturing unit 802 moves by a certain angle in the horizontal direction, the angle at which the projection device needs to move horizontally is the same as that of the image capturing unit 802; when the image capturing unit 802 moves by a certain angle in the vertical direction, the projection device needs to move by a different angle in the vertical direction according to the positional relationship between the face and the image acquisition unit 802.
  • the projection display unit 807 is configured to display the projected content.
  • the projection display unit 807 projects the projection content to the opposite side of the human face according to the projection angle adjusted by the second direction control unit 806.
  • the photographing correction unit 808 is configured to capture an image covering the projection picture area and perform projection picture correction.
  • the photographing correction unit 808 captures an image covering the projection picture area and performs projection picture correction while the projection display unit 807 is projecting; the photographing correction unit 808 includes a trapezoidal projection correction module (not shown) and a projection brightness correction module (not shown), which correct the shape and the brightness of the projected content of the projection device, respectively.
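Trapezoidal (keystone) correction as named above is commonly realised by warping the image with a projective transform; the patent does not give the algorithm, so the following is a generic four-point direct-linear-transform sketch (stdlib only; all names are illustrative):

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for the 8x8 DLT system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def homography(src, dst):
    """Solve for the 3x3 projective transform mapping the four detected
    corners of the skewed projection (src) onto the intended rectangle
    (dst), with h33 normalised to 1."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = solve(A, b) + [1.0]
    return [h[0:3], h[3:6], h[6:9]]
```

Feeding the four detected corners of the skewed picture as src and the intended rectangle as dst yields the matrix with which the projected content can be pre-warped.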
  • in operation, the trigger activation unit 801 starts the image acquisition unit 802, and the image acquisition unit 802 collects the face image and determines the face information;
  • the first calculation and analysis unit 803 calculates the angle of the image acquisition direction from the face information collected by the image acquisition unit 802;
  • the first direction control unit 804 adjusts the angle of the image acquisition direction according to the result calculated by the first calculation and analysis unit 803;
  • the second calculation and analysis unit 805 calculates the projection angle that the projection device needs to change according to the angle of the image acquisition direction;
  • the second direction control unit 806 adjusts the projection angle according to the projection angle calculated by the second calculation and analysis unit 805, so that the projection content projected by the projection display unit 807 is always located directly opposite the human face; while the projection display unit 807 is projecting, the photographing correction unit 808 captures an image covering the projection picture area and performs projection picture correction.
  • the motion projection device 80 can perform the motion projection method provided by the embodiments of the present application, and has the functional modules and beneficial effects corresponding to the method; for technical details not described in detail in this embodiment, refer to the motion projection method provided by the embodiments of the present application.
  • the present application discloses a motion projection device based on face tracking; by tracking changes in the face position of a moving user, the projection angle can be adjusted in real time so that the tracked user always has a head-on view of the projected picture, which enhances the user's viewing experience.
  • FIG. 10 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present disclosure.
  • the electronic device may be a projector or a cloud platform equipped with a projector.
  • the electronic device 100 includes:
  • one or more processors 1001 and a memory 1002; one processor 1001 is taken as an example in FIG. 10.
  • the processor 1001 and the memory 1002 may be connected by a bus or other means; a bus connection is taken as an example in FIG. 10.
  • the memory 1002, as a non-volatile computer-readable storage medium, can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/units corresponding to the motion projection method in the embodiments of the present application (for example, the trigger activation unit 801, the image acquisition unit 802, the first calculation and analysis unit 803, the first direction control unit 804, the second calculation and analysis unit 805, the second direction control unit 806, the projection display unit 807, and the photographing correction unit 808 shown in FIG. 8).
  • the processor 1001 performs various functional applications and data processing of the electronic device 100 by running the non-volatile software programs, instructions, and units stored in the memory 1002, that is, implementing the motion projection method of the method embodiments.
  • the memory 1002 may include a program storage area and a data storage area, wherein the program storage area may store an operating system and an application required by at least one function, and the data storage area may store data created according to the use of the motion projection device 80, and the like. Further, the memory 1002 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some embodiments, the memory 1002 may optionally include memory located remotely relative to the processor 1001, which may be connected to the electronic device 100 over a network. Examples of such networks include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
  • the one or more units are stored in the memory 1002 and, when executed by the one or more processors 1001, perform the motion projection method described above, for example, performing steps 101 through 106 of the method in FIG. 1 described above and implementing the functions of units 801-808 in FIG. 8.
  • the electronic device can perform the motion projection method provided by the present application, and has a corresponding functional module and a beneficial effect of the execution method.
  • embodiments of the present application provide a non-transitory computer-readable storage medium storing computer-executable instructions that, when executed by one or more processors, perform method steps 101 to 106 of FIG. 1 described above and implement the functions of units 801-808 in FIG. 8.
  • the device embodiments described above are merely illustrative, wherein the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical. Units can be located in one place or distributed to multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • the embodiments can be implemented by means of software plus a general hardware platform, and of course, by hardware.
  • one of ordinary skill in the art can understand that all or part of the processes of the embodiment methods can be completed by a computer program instructing related hardware; the program can be stored in a computer-readable storage medium, and when executed, may include the flows of the embodiments of the methods described above.
  • the storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or a random access memory (RAM).

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Multimedia (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

Embodiments of the present application relate to the field of projection display technologies, and in particular to a motion projection method and apparatus based on face tracking, and an electronic device. The method comprises: collecting a face image and determining the position of the face; calculating the angle of the image acquisition direction according to the position of the face; adjusting the angle of the image acquisition direction according to the calculated angle of the image acquisition direction; calculating the projection angle that the projection device needs to change according to the adjusted image acquisition direction angle; adjusting the projection angle according to the calculated projection angle of the projection device; and projecting according to the adjusted projection angle so that the projection content is displayed directly opposite the face. Through the above method, by tracking changes in the face position of a moving user, the projection angle can be adjusted in real time so that the tracked user always has a head-on view of the projected picture, which enhances the user's viewing experience.

Description

Motion projection method and apparatus based on face tracking, and electronic device
Cross-Reference to Related Applications
This application claims priority to Chinese Patent Application No. 201711435378.4, filed with the Chinese Patent Office on December 26, 2017 and entitled "Motion projection method, apparatus and system based on face tracking", the entire contents of which are incorporated herein by reference.
Technical Field
Embodiments of the present application relate to the field of projection display technologies, and in particular to a motion projection method and apparatus based on face tracking, and an electronic device.
Background
With the development of science and technology and the continuous improvement of living standards, people's requirements for visual experience are ever higher. On the one hand, display devices for human-machine interfaces are developing toward miniaturization, large screens, and high resolution; on the other hand, in terms of display effect, people tend to pursue augmented-reality, immersive visual enjoyment. At present, interactive projection technology is gradually entering people's daily lives and is developing toward tracking user intentions.
In implementing the present application, the inventors found that existing interactive projection technology lacks flexibility and cannot project well according to the user's movement, so the user experience is poor.
Summary
In view of this, the main purpose of the present application is to provide a dynamic projection method and apparatus based on face tracking, and an electronic device, so as to solve the problem in the prior art that a projection device cannot follow the movement of a face and project accordingly.
To solve the above technical problem, embodiments of the present application disclose the following technical solutions:
In a first aspect, an embodiment of the present application provides a dynamic projection method based on face tracking, the method comprising:
capturing a face image and determining the position of the face;
calculating an angle of the image capturing direction according to the position of the face;
adjusting the angle of the image capturing direction according to the calculated angle of the image capturing direction;
calculating the projection angle by which a projection device needs to change according to the adjusted angle of the image capturing direction;
adjusting the projection angle according to the calculated projection angle of the projection device;
projecting according to the adjusted projection angle so that the projected content is displayed directly facing the face.
Optionally, capturing a face image and determining the position of the face comprises: capturing a face image by an image capturing device, and using a face detection algorithm to detect the face and determine the position of the face in the current frame relative to the image capturing device.
Optionally, calculating the angle of the image capturing direction comprises:
calculating a horizontal angle and a vertical angle;
the vertical angle being the angle of the face position relative to the image capturing device in the vertical direction;
the horizontal angle being the angle of the face position relative to the image capturing device in the horizontal direction.
Optionally, calculating the projection angle comprises:
when the image capturing device has moved horizontally by any angle, the projection device also moves horizontally by the same angle.
Optionally, when the image capturing device has moved vertically by any angle, the angle by which the projection device needs to move vertically is determined as follows:
calculating the vertical distance between the face and the image capturing device, and rotating the projection device while calculating in real time the actual distance between the projection device and the projection plane; when the rotation angle makes the actual distance between the projection device and the projection plane equal to the distance from the projection device to the projection plane directly facing the face, that rotation angle is the direction in which the projection device needs to project.
In a second aspect, an embodiment of the present application provides a dynamic projection apparatus based on face tracking, the apparatus comprising:
an image capturing unit, configured to capture a face image and determine the position of the face;
a first calculation and analysis unit, configured to calculate an angle of the image capturing direction according to the position of the face;
a first direction control unit, configured to adjust the angle of the image capturing direction according to the calculated angle of the image capturing direction;
a second calculation and analysis unit, configured to calculate the projection angle by which a projection device needs to change according to the adjusted angle of the image capturing direction;
a second direction control unit, configured to adjust the projection angle according to the calculated projection angle of the projection device;
a projection display unit, configured to project according to the adjusted projection angle so that the projected content is displayed directly facing the face.
Optionally, the apparatus further comprises a trigger-and-start unit, configured to start the dynamic projection apparatus from a standby state.
Optionally, the trigger-and-start unit may be:
a remote-control module, configured to start the dynamic projection apparatus from the standby state when the user operates a miniature button device carried on his or her person; and/or
a voice module, configured to start the dynamic projection apparatus from the standby state according to a corresponding voice instruction; and/or
a motion recognition module, configured to start the dynamic projection apparatus from the standby state upon being triggered by body language, such as gestures, captured by the image capturing unit.
Optionally, the apparatus further comprises a photographic correction unit, configured to capture an image covering the projected picture area and perform projection picture correction.
In a third aspect, an embodiment of the present application provides an electronic device, comprising:
at least one processor; and
a memory communicatively connected to the at least one processor; wherein
the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to perform the method described above.
In a fourth aspect, an embodiment of the present application provides a non-volatile computer-readable storage medium storing computer-executable instructions for enabling an electronic device to perform the method described above.
In a fifth aspect, an embodiment of the present application provides a computer program product comprising a computer program stored on a computer-readable storage medium, the computer program comprising program instructions which, when executed by an electronic device, cause the electronic device to perform the method described above.
The beneficial effects of the embodiments of the present application are as follows: in contrast to the prior art, the present application discloses a dynamic projection method and apparatus based on face tracking, and an electronic device. According to the method, apparatus, and electronic device, by tracking changes in the face position of a moving user, the projection angle can be adjusted in real time so that the tracked user always views the projected picture head-on, enhancing the user's viewing experience.
Brief Description of the Drawings
One or more embodiments are exemplarily illustrated by the figures in the corresponding drawings. These exemplary illustrations do not constitute a limitation on the embodiments; elements with the same reference numerals in the drawings denote similar elements, and unless otherwise stated, the figures in the drawings are not drawn to scale.
FIG. 1 is a schematic flowchart of a dynamic projection method based on face tracking according to an embodiment of the present application;
FIG. 2 is a schematic plan view of calculating the angle of the image capturing direction according to an embodiment of the present application;
FIG. 3 is a schematic perspective view of calculating the angle of the image capturing direction according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a method for calculating the projection angle by which the projection device needs to change in a first case according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a method for calculating the projection angle by which the projection device needs to change in a second case according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a method for calculating the projection angle by which the projection device needs to change in a third case according to an embodiment of the present application;
FIG. 7 is a plan view of the circle-center positions acquired by an image capturing device according to an embodiment of the present application;
FIG. 8 is a schematic structural diagram of a dynamic projection apparatus based on face tracking according to an embodiment of the present application;
FIG. 9 is a schematic structural diagram of a trigger-and-start unit based on face tracking according to an embodiment of the present application;
FIG. 10 is a schematic diagram of the hardware structure of an electronic device according to an embodiment of the present application.
Detailed Description
To make the objectives, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some rather than all of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the scope of protection of the present application.
In addition, the technical features involved in the various embodiments of the present application described below may be combined with one another as long as they do not conflict.
This embodiment of the present application is an embodiment of a dynamic projection method provided by the present application. FIG. 1 is a schematic flowchart of a dynamic projection method according to an embodiment of the present application. The dynamic projection method of this embodiment may be performed by the dynamic projection apparatus 80 shown in FIG. 8.
Referring to FIG. 1, the method comprises:
101: capturing a face image and determining the position of the face;
In this embodiment of the present application, when capturing a face image with an image capturing device, a face detection algorithm is used to detect the face and determine the position of the face in the current frame relative to the image capturing device. The image capturing device supports conventional two-dimensional face image capture as well as three-dimensional depth ranging. The image capturing device may be a binocular camera, or an infrared camera provided with an optional infrared lens.
102: calculating an angle of the image capturing direction according to the position of the face;
In this embodiment of the present application, calculating the angle of the image capturing direction comprises: calculating a horizontal offset angle and a vertical offset angle of the image capturing direction, wherein the vertical offset angle is the angle by which the face position is offset relative to the image capturing device in the vertical direction, and the horizontal offset angle is the angle by which the face position is offset relative to the image capturing device in the horizontal direction.
103: adjusting the angle of the image capturing direction according to the calculated angle of the image capturing direction;
In this embodiment of the present application, the angle of the image capturing direction is adjusted according to the calculation result so that the target face is always located at the center of subsequently captured images.
104: calculating the projection angle by which the projection device needs to change according to the adjusted angle of the image capturing direction;
In this embodiment of the present application, the distance from the face to the image capturing device is acquired by a depth camera mounted on the image capturing device, and the fixed distance between the projection device and the image capturing device is measured with a measuring instrument. When the image capturing device has moved by a certain angle in the horizontal direction, the projection device needs to move horizontally by the same angle as the image capturing device; when the image capturing device has moved by a certain angle in the vertical direction, the projection device needs to move in the vertical direction by a different angle depending on the positional relationship between the face and the image capturing device.
105: adjusting the projection angle according to the calculated projection angle of the projection device;
In this embodiment of the present application, the specific way in which the projection device follows changes in the image capturing direction is as follows: the horizontal angle between the initial positions of the projection device and the image capturing device is 180 degrees; according to the calculated projection angle of the projection device, the angle of the projection device is adjusted so that the two devices always maintain a relative position of 180 degrees after adjustment.
106: projecting according to the adjusted projection angle so that the projected content is displayed directly facing the face.
In this embodiment of the present application, projection is performed according to the adjusted projection angle so that the projected content is displayed directly facing the face; whenever the camera detects a face, the angle is adjusted so that the face lies at the center of the image.
The beneficial effects of this embodiment of the present application are as follows: in contrast to the prior art, the present application discloses a dynamic projection method based on face tracking. According to the method, by tracking changes in the face position of a moving user, the projection angle can be adjusted in real time so that the tracked user always views the projected picture head-on, enhancing the user's viewing comfort.
FIG. 2 is a schematic plan view of calculating the angle of the image capturing direction according to an embodiment of the present application; FIG. 3 is a schematic perspective view of calculating the angle of the image capturing direction according to an embodiment of the present application.
In this embodiment of the present application, a face detection algorithm is first used to detect the face. The face detection algorithm comprises: after the image capturing device receives the face image, locating the facial features in the face region of the image and determining the position of the face in the current frame. As shown in FIG. 2 and FIG. 3, point C is the position of the face in the image of the current frame, point A is the center point of the captured image, and O is the position of the image capturing device (camera). Calculating the angle of the image capturing direction comprises calculating the angle formed by OC and OA. As shown in FIG. 2, the angle formed by OC and OA comprises a horizontal angle x_angle_c and a vertical angle y_angle_c, where L2 is the vertical distance from the face to the horizontal plane and d is the vertical distance from the image capturing device to the horizontal plane. Referring to FIG. 3, dx is the distance between point A, the projection of the image capturing device (camera) onto the horizontal plane, and point B, the projection of the face onto the horizontal plane. Therefore, with L2 and dx known, the angles x_angle_o and x_angle_c can be calculated, and likewise the vertical deflection angle y_angle_c of the image capturing device can be calculated. Thus, after analyzing the captured face image, the image capturing device (camera) establishes the positional relationship model shown in FIG. 2, calculates the horizontal angle x_angle_c and vertical angle y_angle_c by which the image capturing device needs to change, and moves the face to the center of the image. A tracking algorithm is then used to track the face in real time, and the image capturing device is rotated according to the calculated angles so that the target face always stays at the center of the image, which facilitates capturing the face, judging the user's intention, and thereby changing the angle of the projection device.
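As a concrete illustration, the horizontal angle x_angle_c and vertical angle y_angle_c between the ray OC and the optical axis OA can be recovered from the face's pixel position. This is a sketch under assumptions the patent does not spell out (a pinhole-camera model with a focal length known in pixels); the function name and parameters are illustrative, not part of the original disclosure.

```python
import math

def capture_direction_angles(face_px, image_size, focal_px):
    """Angles (degrees) between the ray OC to the detected face center C
    and the optical axis OA through the image center A.

    face_px:    (u, v) pixel coordinates of the face center in the frame
    image_size: (w, h) frame size in pixels
    focal_px:   focal length in pixels (assumed known from calibration)
    """
    u, v = face_px
    w, h = image_size
    # Horizontal angle x_angle_c: offset of C from the image center along x.
    x_angle_c = math.degrees(math.atan2(u - w / 2, focal_px))
    # Vertical angle y_angle_c: offset of C from the image center along y.
    y_angle_c = math.degrees(math.atan2(v - h / 2, focal_px))
    return x_angle_c, y_angle_c
```

Rotating the camera by (x_angle_c, y_angle_c) moves C onto A, i.e. re-centers the face as steps 102-103 require.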
FIG. 4, FIG. 5, and FIG. 6 are schematic diagrams of methods for calculating the projection angle by which the projection device needs to change in three different cases according to embodiments of the present application.
In this embodiment of the present application, the specific way in which the projection device follows changes in the image capturing direction is as follows: the horizontal angle between the initial positions of the projection device and the image capturing device is 180 degrees; according to the calculated projection angle of the projection device, the angle of the projection device is adjusted so that the two devices always maintain a relative position of 180 degrees after adjustment, while the projector's angle is adjusted so that it projects directly facing the face. It should be noted that, to keep the line of sight of the face from being blocked by the projection device, the angle is adjusted appropriately upward, downward, leftward, or rightward.
In this embodiment of the present application, the method for calculating the projection angle comprises the following.
As shown in FIG. 4, FIG. 5, and FIG. 6: the distance from the face to the image capturing device is L; the fixed distance between the projection device and the image capturing device is h; the angle by which the image capturing device moves horizontally is x_ca; the angle by which the projection device needs to move horizontally is x_pr; the angle by which the image capturing device moves vertically is y_ca; the angle by which the projection device needs to move vertically is y_pr; the vertical distance between the face and the image capturing device is h1; the actual distance between the projection device and the projection plane is L_pr; the distance from the projection device to the projection plane directly facing the face is L_pr_measure.
L is acquired by the depth camera, and h is measured.
When the image capturing device moves horizontally by an angle x_ca, the projection device also needs to move horizontally by x_ca; therefore x_pr equals x_ca.
When the image capturing device moves vertically by an angle y_ca, the calculation of y_pr falls into the following three cases:
As shown in FIG. 4, when the face is below the image capturing device, h1 is obtained from h1 = L*sin(y_ca). The projection device is rotated while the value of L_pr is calculated in real time for each rotation angle y_pr, where L_pr = (h + h1)/sin(y_pr). When L_pr equals L_pr_measure, the rotation angle y_pr is the direction in which the projector needs to project.
As shown in FIG. 5, when the face is above the image capturing device, h1 is obtained from h1 = L*cos(y_ca) - h. The projection device is rotated while the value of L_pr is calculated in real time for each rotation angle y_pr, where L_pr = h1/sin(y_pr). When L_pr equals L_pr_measure, the rotation angle y_pr is the direction in which the projector needs to project.
As shown in FIG. 6, when the face is between the image capturing device and the projection device, h1 is obtained from h1 = h - L*cos(y_ca). The projection device is rotated while the value of L_pr is calculated in real time for each rotation angle y_pr, where L_pr = h1/sin(y_pr). When L_pr equals L_pr_measure, the rotation angle y_pr is the direction in which the projection device needs to project.
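The three cases can be condensed into a closed-form sketch: instead of rotating the projector and comparing distances in real time, y_pr can be solved directly from the condition L_pr = L_pr_measure. This is an illustrative reformulation of the formulas above, not the literal servo procedure the embodiment describes; the function name and case labels are assumptions.

```python
import math

def vertical_projection_angle(case, L, h, y_ca_deg, L_pr_measure):
    """Solve the vertical projector angle y_pr (degrees) such that the
    projector-to-plane distance L_pr equals the measured distance
    L_pr_measure to the projection plane directly facing the face.

    case: 'below'   -- face below the camera (FIG. 4): h1 = L*sin(y_ca)
          'above'   -- face above the camera (FIG. 5): h1 = L*cos(y_ca) - h
          'between' -- face between camera and projector (FIG. 6):
                       h1 = h - L*cos(y_ca)
    """
    y_ca = math.radians(y_ca_deg)
    if case == 'below':
        h1 = L * math.sin(y_ca)
        # L_pr = (h + h1) / sin(y_pr)  =>  sin(y_pr) = (h + h1) / L_pr_measure
        return math.degrees(math.asin((h + h1) / L_pr_measure))
    if case == 'above':
        h1 = L * math.cos(y_ca) - h
    else:  # 'between'
        h1 = h - L * math.cos(y_ca)
    # L_pr = h1 / sin(y_pr)  =>  sin(y_pr) = h1 / L_pr_measure
    return math.degrees(math.asin(h1 / L_pr_measure))
```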
L_pr_measure is measured as follows:
As shown in FIG. 7, assume the central axes of the image capturing device and the projection device lie on the same horizontal line, and use the projection device to project two circles on the same horizontal line,
where x1 and x2 are the coordinate positions of the two circle centers on the same horizontal line; f is a fixed coefficient; d0 is the actual distance between the projector and the camera; w is the width of the image; d denotes the actual distance between the two detected circle centers; and abs denotes the absolute value.
From abs((x1+x2)/2 - w/2)/d0 = abs(x1-x2)/d, d can be calculated;
from L_pr_measure/f = d/abs(x1-x2), L_pr_measure can be calculated.
It should be noted that (x1+x2)/2 represents the center position of the line connecting the two circle centers along the x-axis, w/2 is the center position of the image in the X-axis direction, (x1+x2)/2 - w/2 is the offset distance in the X-axis direction in the image, and (x1-x2) represents the offset distance between the two circle centers in the x-axis direction, to which the actual distance d corresponds.
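Numerically, the two relations above chain together as follows. The sketch assumes the pixel coordinates x1 and x2 of the projected circle centers, the image width w, the fixed projector-camera distance d0, and the coefficient f are all known; the function name is illustrative only.

```python
def measure_projection_distance(x1, x2, w, d0, f):
    """Recover L_pr_measure from the two projected circles of FIG. 7.

    Step 1: abs((x1+x2)/2 - w/2) / d0 = abs(x1-x2) / d   =>  solve for d,
            the actual distance between the two circle centers.
    Step 2: L_pr_measure / f = d / abs(x1-x2)            =>  solve for
            L_pr_measure, the projector-to-plane distance.
    """
    pixel_offset = abs((x1 + x2) / 2 - w / 2)  # midpoint offset in the image
    pixel_gap = abs(x1 - x2)                   # circle-center gap in pixels
    d = pixel_gap * d0 / pixel_offset          # actual circle-center gap
    return f * d / pixel_gap                   # L_pr_measure
```

Note that the formulas break down when the circle pair's midpoint coincides with the image center (pixel_offset = 0); the embodiment's geometry, with the projector offset from the camera axis by d0, avoids this case.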
This embodiment of the present application is an embodiment of a dynamic projection apparatus based on face tracking provided by the present application. FIG. 8 is a schematic diagram of a dynamic projection apparatus based on face tracking according to an embodiment of the present application.
Referring to FIG. 8, the dynamic projection apparatus 80 comprises:
a trigger-and-start unit 801, configured to start the dynamic projection apparatus from a standby state.
Referring to FIG. 9, the trigger-and-start unit 801 comprises:
a remote-control module 901, configured to start the dynamic projection apparatus from the standby state when the user operates a miniature button device carried on his or her person; and/or
a voice module 902, configured to start the dynamic projection apparatus from the standby state according to a corresponding voice instruction; and/or
a motion recognition module 903, configured to start the dynamic projection apparatus from the standby state upon being triggered by body language, such as gestures, captured by the image capturing unit.
In this embodiment of the present application, the user can start the dynamic projection apparatus 80 from the standby state by operating a miniature button device carried on his or her person, by a corresponding voice instruction, or by body language. The dynamic projection apparatus is equipped with a central processing module that is connected to and controls each unit of the dynamic projection apparatus; through the trigger-and-start unit 801, the central processing module can be activated to control the start-up of the dynamic projection apparatus 80.
an image capturing unit 802, configured to capture a face image and determine the position of the face;
In this embodiment of the present application, the image capturing unit 802 captures a face image and uses a face detection algorithm to detect the face and determine the position of the face in the current frame relative to the image capturing unit 802.
a first calculation and analysis unit 803, configured to calculate an angle of the image capturing direction according to the position of the face;
In this embodiment of the present application, after the image capturing unit 802 captures the face image and determines the position of the face in the current frame relative to the image capturing unit 802, the first calculation and analysis unit 803 analyzes and calculates the angles of the face relative to the image capturing unit 802 in the vertical and horizontal directions.
a first direction control unit 804, configured to adjust the angle of the image capturing direction according to the calculated angle of the image capturing direction;
In this embodiment of the present application, the first direction control unit 804 adjusts the image capturing unit 802 according to the angles of the face relative to the image capturing unit 802 in the vertical and horizontal directions calculated by the first calculation and analysis unit 803, so that the face is always located at the center of subsequently captured images.
a second calculation and analysis unit 805, configured to calculate the projection angle by which the projection device needs to change according to the adjusted angle of the image capturing direction;
In this embodiment of the present application, the distance from the face to the image capturing unit 802 is acquired by a depth camera mounted on the image capturing unit 802, and the fixed distance between the projection device and the image capturing unit 802 is measured. When the image capturing unit 802 has moved by a certain angle in the horizontal direction, the projection device needs to move horizontally by the same angle as the image capturing unit 802; when the image capturing unit 802 has moved by a certain angle in the vertical direction, the projection device needs to move in the vertical direction by a different angle depending on the positional relationship between the face and the image capturing unit 802.
a second direction control unit 806, configured to adjust the projection angle according to the calculated projection angle of the projection device;
In this embodiment of the present application, by means of the projection device, the distance from the face to the image capturing unit 802 is acquired according to the depth camera mounted on the image capturing unit 802, and the fixed distance between the projection device and the image capturing unit 802 is measured with a measuring instrument. When the image capturing unit 802 has moved by a certain angle in the horizontal direction, the projection device needs to move horizontally by the same angle as the image capturing unit 802; when the image capturing unit 802 has moved by a certain angle in the vertical direction, the projection device needs to move in the vertical direction by a different angle depending on the positional relationship between the face and the image capturing unit 802.
a projection display unit 807, configured to display the projected content.
In this embodiment of the present application, the projection display unit 807 projects the projected content directly facing the face according to the projection angle adjusted by the second direction control unit 806.
a photographic correction unit 808, configured to capture an image covering the projected picture area and perform projection picture correction.
In this embodiment of the present application, while the projection display unit 807 is projecting, the photographic correction unit 808 captures an image covering the projected picture area and performs projection picture correction. The photographic correction unit 808 comprises keystone correction (not shown) and projection brightness correction (not shown), which are used to correct the shape and brightness, respectively, of the content projected by the projection device.
In this embodiment of the present application, the trigger-and-start unit 801 starts the image capturing unit 802; the image capturing unit 802 captures the face image and determines the face information; the first calculation and analysis unit 803 calculates the angle of the image capturing direction according to the face information captured by the image capturing unit 802; the first direction control unit 804 adjusts the angle of the image capturing direction according to the result calculated by the first calculation and analysis unit 803; the second calculation and analysis unit 805 calculates the projection angle by which the projection device needs to change according to the angle of the image capturing direction; and the second direction control unit 806 adjusts the projection angle according to the projection angle calculated by the second calculation and analysis unit 805, so that the projected content projected by the projection display unit 807 is always located directly facing the face. While the projection display unit 807 is projecting, the photographic correction unit 808 captures an image covering the projected picture area and performs projection picture correction.
It should be noted that, in this embodiment of the present application, the dynamic projection apparatus 80 can perform the dynamic projection method provided by the embodiments of the present application and has the corresponding functional modules and beneficial effects for performing the method. For technical details not exhaustively described in the embodiment of the dynamic projection apparatus 80, reference may be made to the dynamic projection method provided by the embodiments of the present application.
The beneficial effects of the embodiments of the present application are as follows: in contrast to the prior art, the present application discloses a dynamic projection apparatus based on face tracking. According to the apparatus, by tracking changes in the face position of a moving user, the projection angle can be adjusted in real time so that the tracked user always views the projected picture head-on, enhancing the user's viewing comfort.
FIG. 10 is a schematic diagram of the hardware structure of an electronic device according to an embodiment of the present application; the electronic device may specifically be a projector or a gimbal carrying a projector.
As shown in FIG. 10, the electronic device 100 comprises:
one or more processors 1001 and a memory 1002; one processor 1001 is taken as an example in FIG. 10.
The processor 1001 and the memory 1002 may be connected by a bus or in other ways; connection by a bus is taken as an example in FIG. 10.
As a non-volatile computer-readable storage medium, the memory 1002 can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/units corresponding to the dynamic projection method in the embodiments of the present application (for example, the trigger-and-start unit 801, image capturing unit 802, first calculation and analysis unit 803, first direction control unit 804, second calculation and analysis unit 805, second direction control unit 806, projection display unit 807, and photographic correction unit 808 shown in FIG. 8). By running the non-volatile software programs, instructions, and units stored in the memory 1002, the processor 1001 executes the various functional applications and data processing of the electronic device 100, that is, implements the dynamic projection method of the method embodiments.
The memory 1002 may comprise a program storage area and a data storage area, wherein the program storage area may store an operating system and an application required by at least one function, and the data storage area may store data created according to the use of the dynamic projection apparatus 80, and the like. In addition, the memory 1002 may comprise a high-speed random access memory, and may further comprise a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some embodiments, the memory 1002 optionally comprises memories disposed remotely relative to the processor 1001, and these remote memories may be connected to the electronic device 100 via a network. Examples of the network include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
The one or more units are stored in the memory 1002 and, when executed by the one or more processors 1001, perform the dynamic projection method described above, for example, performing the method steps 101 to 106 of FIG. 1 described above and implementing the functions of the units 801-808 in FIG. 8.
The electronic device can perform the dynamic projection method provided by the present application and has the corresponding functional modules and beneficial effects for performing the method. For technical details not exhaustively described in the electronic device embodiment, reference may be made to the dynamic projection method provided by the present application.
An embodiment of the present application provides a non-volatile computer-readable storage medium storing computer-executable instructions which, when executed by one or more processors, can perform the method steps 101 to 106 of FIG. 1 described above and implement the functions of the units 801-808 in FIG. 8.
It should be noted that the device embodiments described above are merely illustrative: the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
From the description of the above embodiments, a person of ordinary skill in the art can clearly understand that the embodiments can be implemented by means of software plus a general-purpose hardware platform, or, of course, by hardware alone. A person of ordinary skill in the art can understand that all or part of the processes in the methods of the embodiments can be completed by a computer program instructing related hardware; the program can be stored in a computer-readable storage medium and, when executed, may include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Under the concept of the present application, the technical features in the above embodiments or in different embodiments may also be combined, the steps may be implemented in any order, and many other variations of the different aspects of the present application as described above exist, which are not provided in detail for the sake of brevity. Although the present application has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that they can still modify the technical solutions recorded in the foregoing embodiments or make equivalent replacements for some of the technical features therein, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.

Claims (12)

  1. A dynamic projection method based on face tracking, characterized in that the method comprises:
    capturing a face image and determining the position of the face;
    calculating an angle of the image capturing direction according to the position of the face;
    adjusting the angle of the image capturing direction according to the calculated angle of the image capturing direction;
    calculating the projection angle by which a projection device needs to change according to the adjusted angle of the image capturing direction;
    adjusting the projection angle according to the calculated projection angle of the projection device;
    projecting according to the adjusted projection angle so that the projected content is displayed directly facing the face.
  2. The dynamic projection method according to claim 1, characterized in that capturing a face image and determining the position of the face comprises: capturing a face image by an image capturing device, and using a face detection algorithm to detect the face and determine the position of the face in the current frame relative to the image capturing device.
  3. The dynamic projection method according to claim 1, characterized in that calculating the angle of the image capturing direction comprises:
    calculating a horizontal angle and a vertical angle;
    the vertical angle being the angle of the face position relative to the image capturing device in the vertical direction;
    the horizontal angle being the angle of the face position relative to the image capturing device in the horizontal direction.
  4. The dynamic projection method according to claim 1, characterized in that calculating the projection angle comprises:
    when the image capturing device has moved horizontally by any angle, the projection device also moves horizontally by the same angle.
  5. The dynamic projection method according to claim 4, characterized by further comprising:
    when the image capturing device has moved vertically by any angle, the angle by which the projection device needs to move vertically is determined as follows:
    calculating the vertical distance between the face and the image capturing device, and rotating the projection device while calculating in real time the actual distance between the projection device and the projection plane; when the rotation angle makes the actual distance between the projection device and the projection plane equal to the distance from the projection device to the projection plane directly facing the face, that rotation angle is the angle at which the projection device needs to project.
  6. A dynamic projection apparatus based on face tracking, characterized in that the apparatus comprises:
    an image capturing unit, configured to capture a face image and determine the position of the face;
    a first calculation and analysis unit, configured to calculate an angle of the image capturing direction according to the position of the face;
    a first direction control unit, configured to adjust the angle of the image capturing direction according to the calculated angle of the image capturing direction;
    a second calculation and analysis unit, configured to calculate the projection angle by which a projection device needs to change according to the adjusted angle of the image capturing direction;
    a second direction control unit, configured to adjust the projection angle according to the calculated projection angle of the projection device;
    a projection display unit, configured to project according to the adjusted projection angle so that the projected content is displayed directly facing the face.
  7. The dynamic projection apparatus according to claim 6, characterized in that the apparatus further comprises a trigger-and-start unit, configured to start the dynamic projection apparatus from a standby state.
  8. The dynamic projection apparatus according to claim 7, characterized in that the trigger-and-start unit comprises:
    a remote-control module, configured to start the dynamic projection apparatus from the standby state when the user operates a miniature button device carried on his or her person; and/or
    a voice module, configured to start the dynamic projection apparatus from the standby state according to a corresponding voice instruction; and/or
    a motion recognition module, configured to start the dynamic projection apparatus from the standby state upon being triggered by body language, such as gestures, captured by the image capturing unit.
  9. The dynamic projection apparatus according to claim 6, characterized in that the apparatus further comprises a photographic correction unit, configured to capture an image covering the projected picture area and perform projection picture correction.
  10. An electronic device, characterized by comprising:
    at least one processor; and
    a memory communicatively connected to the at least one processor; wherein
    the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to perform the method according to any one of claims 1-5.
  11. A non-volatile computer-readable storage medium, characterized in that the non-volatile computer-readable storage medium stores computer-executable instructions for enabling an electronic device to perform the method according to any one of claims 1-5.
  12. A computer program product, characterized in that the computer program product comprises a computer program stored on a computer-readable storage medium, the computer program comprising program instructions which, when executed by an electronic device, cause the electronic device to perform the method according to any one of claims 1-5.
PCT/CN2018/089628 2017-12-26 2018-06-01 一种基于人脸追踪的动向投影方法、装置及电子设备 WO2019128109A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201711435378.4 2017-12-26
CN201711435378.4A CN109960401B (zh) 2017-12-26 2017-12-26 一种基于人脸追踪的动向投影方法、装置及其系统

Publications (1)

Publication Number Publication Date
WO2019128109A1 true WO2019128109A1 (zh) 2019-07-04

Family

ID=67022499

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/089628 WO2019128109A1 (zh) 2017-12-26 2018-06-01 一种基于人脸追踪的动向投影方法、装置及电子设备

Country Status (2)

Country Link
CN (1) CN109960401B (zh)
WO (1) WO2019128109A1 (zh)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110458617A (zh) * 2019-08-07 2019-11-15 卓尔智联(武汉)研究院有限公司 广告投放方法、计算机装置及可读存储介质
CN111046729A (zh) * 2019-11-05 2020-04-21 安徽爱学堂教育科技有限公司 一种基于人脸追踪的双面屏升降调整系统
CN111144327A (zh) * 2019-12-28 2020-05-12 神思电子技术股份有限公司 一种提高自助设备人脸识别摄像头识别效率的方法
CN111345584A (zh) * 2020-03-09 2020-06-30 北京文香信息技术有限公司 一种智慧讲桌及跟踪方法
CN114554031A (zh) * 2022-03-07 2022-05-27 云知声智能科技股份有限公司 一种提词的方法、装置、终端及存储介质

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111031298B (zh) * 2019-11-12 2021-12-10 广景视睿科技(深圳)有限公司 控制投影模块投影的方法、装置和投影系统
CN111179694B (zh) * 2019-12-02 2022-09-23 广东小天才科技有限公司 一种舞蹈教学互动方法及智能音箱、存储介质
CN110897604A (zh) * 2019-12-26 2020-03-24 深圳市博盛医疗科技有限公司 一种减少3d视觉中三维畸变的腹腔镜系统及使用方法
CN111491146B (zh) * 2020-04-08 2021-11-26 上海松鼠课堂人工智能科技有限公司 用于智能教学的互动投影系统
CN111489594B (zh) * 2020-05-09 2022-02-18 兰州石化职业技术学院 一种人机互动平台
CN115113553B (zh) * 2021-09-03 2024-08-06 博泰车联网科技(上海)股份有限公司 车载影音播放的控制方法、控制系统及控制装置
CN114727077A (zh) * 2022-03-04 2022-07-08 乐融致新电子科技(天津)有限公司 投影方法、装置、设备和存储介质
CN114782901B (zh) * 2022-06-21 2022-09-09 深圳市禾讯数字创意有限公司 基于视觉变动分析的沙盘投影方法、装置、设备及介质
CN115494961B (zh) * 2022-11-17 2023-03-24 南京熊大巨幕智能科技有限公司 基于人脸识别的新型交互式环绕智能显示设备

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102307288A (zh) * 2011-07-27 2012-01-04 中国计量学院 基于人脸识别的随第一人称视线移动的投影系统
CN103996215A (zh) * 2013-11-05 2014-08-20 深圳市云立方信息科技有限公司 一种实现虚拟视图转立体视图的方法及装置
CN107065409A (zh) * 2017-06-08 2017-08-18 广景视睿科技(深圳)有限公司 动向投影装置及其工作方法
CN107195277A (zh) * 2017-05-15 2017-09-22 盐城华星光电技术有限公司 一种液晶显示模块(lcm)及其图像显示方法

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106650665B (zh) * 2016-12-26 2021-02-12 北京旷视科技有限公司 人脸跟踪方法及装置

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102307288A (zh) * 2011-07-27 2012-01-04 中国计量学院 基于人脸识别的随第一人称视线移动的投影系统
CN103996215A (zh) * 2013-11-05 2014-08-20 深圳市云立方信息科技有限公司 一种实现虚拟视图转立体视图的方法及装置
CN107195277A (zh) * 2017-05-15 2017-09-22 盐城华星光电技术有限公司 一种液晶显示模块(lcm)及其图像显示方法
CN107065409A (zh) * 2017-06-08 2017-08-18 广景视睿科技(深圳)有限公司 动向投影装置及其工作方法

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110458617A (zh) * 2019-08-07 2019-11-15 卓尔智联(武汉)研究院有限公司 广告投放方法、计算机装置及可读存储介质
CN110458617B (zh) * 2019-08-07 2022-03-18 卓尔智联(武汉)研究院有限公司 广告投放方法、计算机装置及可读存储介质
CN111046729A (zh) * 2019-11-05 2020-04-21 安徽爱学堂教育科技有限公司 一种基于人脸追踪的双面屏升降调整系统
CN111144327A (zh) * 2019-12-28 2020-05-12 神思电子技术股份有限公司 一种提高自助设备人脸识别摄像头识别效率的方法
CN111144327B (zh) * 2019-12-28 2023-04-07 神思电子技术股份有限公司 一种提高自助设备人脸识别摄像头识别效率的方法
CN111345584A (zh) * 2020-03-09 2020-06-30 北京文香信息技术有限公司 一种智慧讲桌及跟踪方法
CN114554031A (zh) * 2022-03-07 2022-05-27 云知声智能科技股份有限公司 一种提词的方法、装置、终端及存储介质

Also Published As

Publication number Publication date
CN109960401A (zh) 2019-07-02
CN109960401B (zh) 2020-10-23

Similar Documents

Publication Publication Date Title
WO2019128109A1 (zh) 一种基于人脸追踪的动向投影方法、装置及电子设备
JP7283506B2 (ja) 情報処理装置、情報処理方法、及び情報処理プログラム
US9696859B1 (en) Detecting tap-based user input on a mobile device based on motion sensor data
JP4278979B2 (ja) ジェスチャーに基づいた入力及びターゲット指示のための単一カメラシステム
WO2018171041A1 (zh) 一种动向智能投影系统及其方法
CN111432115B (zh) 基于声音辅助定位的人脸追踪方法、终端及存储装置
CN106131413B (zh) 一种拍摄设备的控制方法及拍摄设备
WO2018223469A1 (zh) 动向投影装置及其工作方法
US20170316582A1 (en) Robust Head Pose Estimation with a Depth Camera
WO2017126172A1 (ja) 情報処理装置、情報処理方法、及び記録媒体
CN109982054B (zh) 一种基于定位追踪的投影方法、装置、投影仪及投影系统
TWI631506B (zh) 螢幕旋轉控制方法及系統
WO2019062056A1 (zh) 一种智能投影方法、系统及智能终端
WO2021035891A1 (zh) 基于增强现实技术的投影方法及投影设备
EP2882180A1 (en) Control method, control apparatus and control device
US11151804B2 (en) Information processing device, information processing method, and program
JP7103354B2 (ja) 情報処理装置、情報処理方法、及びプログラム
US20190369807A1 (en) Information processing device, information processing method, and program
JP2016197192A (ja) プロジェクションシステムと映像投影方法
US20240013439A1 (en) Automated calibration method of a system comprising an external eye-tracking device and a computing device
JP2023047026A5 (zh)
TWI566596B (zh) 鏡頭拍攝範圍確定方法及系統
US20150063631A1 (en) Dynamic image analyzing system and operating method thereof
TWI524213B (zh) 控制方法及電子裝置
US20160011675A1 (en) Absolute Position 3D Pointing using Light Tracking and Relative Position Detection

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18894437

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 12/11/2020)

122 Ep: pct application non-entry in european phase

Ref document number: 18894437

Country of ref document: EP

Kind code of ref document: A1