CN109960401B - Dynamic projection method, device and system based on face tracking - Google Patents


Info

Publication number
CN109960401B
CN109960401B (application CN201711435378.4A)
Authority
CN
China
Prior art keywords
projection
angle
image acquisition
face
projection device
Prior art date
Legal status
Active
Application number
CN201711435378.4A
Other languages
Chinese (zh)
Other versions
CN109960401A (en)
Inventor
杨伟樑
高志强
纪园
林清云
Current Assignee
Iview Displays Shenzhen Co Ltd
Original Assignee
Iview Displays Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Iview Displays Shenzhen Co Ltd filed Critical Iview Displays Shenzhen Co Ltd
Priority to CN201711435378.4A
Priority to PCT/CN2018/089628 (WO2019128109A1)
Publication of CN109960401A
Application granted
Publication of CN109960401B

Classifications

    • G06F18/00 Pattern recognition
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06V40/166 Detection; Localisation; Normalisation using acquisition arrangements
    • G06F2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Abstract

Embodiments of the invention relate to the technical field of projection display, and in particular to a dynamic projection method, device and system based on face tracking. The method comprises the following steps: collecting a face image and determining the position of the face; calculating the angle of the image acquisition direction according to the face position; adjusting the image acquisition direction to the calculated angle; calculating the projection angle the projection device needs to change to, based on the adjustment of the image acquisition direction; adjusting the projection device to the calculated projection angle; and projecting at the adjusted angle so that the projection content is displayed directly facing the face. With this method, the projection angle can be adjusted in real time as the position of the tracked user's face changes, so that the tracked user always enjoys a front-on projected picture, enhancing the viewing experience.

Description

Dynamic projection method, device and system based on face tracking
Technical Field
Embodiments of the invention relate to the technical field of projection display, and in particular to a dynamic projection method, device and system based on face tracking.
Background
With the development of science and technology and the continuous rise in living standards, people's expectations of visual perception keep growing. On one hand, display devices for human-computer interfaces are developing toward miniaturization, large screens and high resolution; on the other hand, people increasingly pursue augmented-reality and immersive visual enjoyment. Interactive projection technology is gradually entering daily life and is evolving toward tracking the user's intention.
In the course of implementing the invention, the applicant found that existing interactive projection technology lacks flexibility: projection cannot follow the direction of the user's movement well, so the user experience suffers.
Disclosure of Invention
In view of the above, the present invention provides a dynamic projection method and system based on face tracking, to solve the problem that prior-art projection apparatus cannot move and project along with the movement of the face.
In order to solve the technical problem, the embodiment of the invention discloses the following technical scheme:
in a first aspect, an embodiment of the present invention provides a dynamic projection method based on face tracking, where the method includes:
collecting a face image and determining the position of a face;
calculating the angle of the image acquisition direction according to the position of the face;
adjusting the angle of the image acquisition direction according to the calculated angle of the image acquisition direction;
calculating a projection angle required to be changed by the projection device according to the adjustment of the image acquisition direction angle;
adjusting the projection angle according to the calculated projection angle of the projection device;
and projecting according to the adjusted projection angle to display the projection content directly facing the face.
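As a minimal sketch, the six steps above can be expressed as one control iteration. Every name below is illustrative rather than taken from the patent, and the hardware-dependent vertical-angle rule is passed in as a callable:

```python
def projection_step(face_angles, vertical_map):
    """One iteration of the claimed method, as a pure function (sketch).

    face_angles:  (x_angle, y_angle) of the face relative to the image
                  acquisition device, computed from the collected image.
    vertical_map: callable implementing the vertical case analysis of the
                  patent; the horizontal angle transfers unchanged.
    Returns the (x_pr, y_pr) angles the projection device should adopt.
    """
    x_angle, y_angle = face_angles
    x_pr = x_angle                 # horizontal: projector mirrors the camera
    y_pr = vertical_map(y_angle)   # vertical: depends on face/camera geometry
    return x_pr, y_pr
```

With the returned angles the projection device would be rotated and the content projected directly facing the face.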
Optionally, the collecting a face image and determining the position of the face includes: collecting the face image through an image acquisition device, detecting the face with a face detection algorithm, and determining the position of the face in the current frame relative to the image acquisition device.
Optionally, the calculating the angle of the image capturing direction includes:
calculating a horizontal angle and a vertical angle;
the vertical angle is an included angle of the face position relative to the image acquisition device in the vertical direction;
and the horizontal angle is an included angle of the face position relative to the image acquisition device in the horizontal direction.
Optionally, the calculating the projection angle includes:
when the image acquisition device moves horizontally by any angle, the projection device also moves horizontally by the same angle.
Optionally, when the image capturing device moves vertically by any angle, the angle the projection device needs to move vertically is determined as follows:
calculate the vertical distance between the face and the image acquisition device, rotate the projection device while computing in real time the actual distance from the projection device to the projection plane, and when this actual distance equals the distance from the projection device to the projection plane directly facing the face, the rotated angle is the direction in which the projection device needs to project.
In a second aspect, an embodiment of the present invention provides a dynamic projection apparatus based on face tracking, where the apparatus includes:
the image acquisition unit is used for acquiring a face image and determining the position of a face;
the first calculation analysis unit is used for calculating the angle of the image acquisition direction according to the position of the face;
the first direction control unit is used for adjusting the angle of the image acquisition direction according to the calculated angle of the image acquisition direction;
the second calculation and analysis unit is used for calculating the projection angle of the projection device which needs to be changed according to the adjustment of the image acquisition direction angle;
the second direction control unit is used for adjusting the projection angle according to the calculated projection angle of the projection device;
and the projection display unit is used for projecting according to the adjusted projection angle so as to display the projection content directly facing the face.
Optionally, the apparatus further comprises: and the trigger starting unit is used for starting the dynamic projection device in the standby state.
Optionally, the trigger starting unit may be:
the remote control module is used for starting the dynamic projection device in a standby state by controlling the portable micro button device through a user; and/or
The voice module is used for starting the dynamic projection device in a standby state according to the corresponding voice instruction; and/or
And the motion recognition module is used for receiving body-language triggers, such as gestures, collected by the image acquisition unit and starting the dynamic projection device in a standby state.
Optionally, the apparatus further comprises: and the shooting correction unit is used for covering and acquiring the image of the projection picture area and correcting the projection picture.
In a third aspect, an embodiment of the present invention provides a dynamic projection system, including:
the dynamic projection device as described above,
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method as described above.
In a fourth aspect, embodiments of the present invention provide a non-transitory computer-readable storage medium storing computer-executable instructions for enabling a dynamic projection apparatus to perform the method described above.
The beneficial effects of the embodiments of the invention are as follows: unlike the prior art, the disclosed dynamic projection method, device and system based on face tracking can adjust the projection angle in real time by tracking the changing position of a moving user's face, so that the tracked user always enjoys a front-on projected picture, enhancing the viewing experience.
Drawings
One or more embodiments are illustrated by way of example in the accompanying drawings, in which like reference numerals denote similar elements; the figures are not drawn to scale unless otherwise specified.
Fig. 1 is a schematic flow chart of a dynamic projection method based on face tracking according to an embodiment of the present invention;
FIG. 2 is a schematic plan view of an embodiment of the present invention for calculating an angle of an image capturing direction;
FIG. 3 is a schematic perspective view of an embodiment of the present invention for calculating an angle of an image capturing direction;
fig. 4, fig. 5, and fig. 6 are schematic diagrams of a method for calculating a projection angle of a projection apparatus to be changed under three different conditions according to an embodiment of the present invention;
fig. 7 is a plan view of a circle center position acquired by an image acquisition apparatus according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of a dynamic projection apparatus based on face tracking according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of a trigger starting unit based on face tracking according to an embodiment of the present invention;
fig. 10 is a schematic structural diagram of a dynamic projection system according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
The embodiment of the invention provides a dynamic projection method. Fig. 1 is a schematic flow chart of a dynamic projection method according to an embodiment of the present invention. The method can be performed by the dynamic projection apparatus 10 shown in fig. 8.
Referring to fig. 1, the method includes:
101: collecting a face image and determining the position of a face;
in the embodiment of the invention, when the image acquisition device is used for acquiring the face image, a face detection algorithm is used for detecting the face and determining the position of the face in the current frame relative to the image acquisition device. Wherein, the image acquisition device includes: the method comprises the functions of image acquisition of a conventional two-dimensional face and three-position depth ranging. The image acquisition device can be a binocular camera or an infrared camera provided with optional infrared lenses.
102: calculating the angle of the image acquisition direction according to the position of the face;
in an embodiment of the present invention, the calculating an angle of an image capturing direction includes: calculating a vertical offset angle and a vertical offset angle of an image acquisition direction; the vertical offset angle is an included angle of the face position offset relative to the image acquisition device in the vertical direction; the horizontal deviation angle is an included angle of the human face position relative to the image acquisition device in a deviation mode in the horizontal direction.
103: adjusting the angle of the image acquisition direction according to the calculated angle of the image acquisition direction;
in the embodiment of the invention, the angle of the image acquisition direction is adjusted according to the calculation result of the angle of the image acquisition direction, so that the target face is always positioned at the central position of the subsequent acquired image.
104: calculating a projection angle required to be changed by the projection device according to the adjustment of the image acquisition direction angle;
in the embodiment of the invention, the distance between the human face and the image acquisition device is acquired through the depth camera arranged on the image acquisition device, and the fixed distance between the projection device and the image acquisition device is measured by using the measuring equipment. When the image acquisition device moves for a certain angle along the horizontal direction, the angle of the horizontal movement of the projection device is the same as the angle of the horizontal movement of the image acquisition device; when the image acquisition device moves by a certain angle along the vertical direction, the projection device needs to move by different angles along the vertical direction according to the position relationship between the human face and the image acquisition device.
105: adjusting the projection angle according to the calculated projection angle of the projection device;
in the embodiment of the present invention, a specific method for changing the projection apparatus with the angle of the image collecting direction is as follows: the horizontal included angle between the initial position of the projection device and the initial position of the image acquisition device is 180 degrees, and the final adjusted positions of the projection device and the image acquisition device are always kept at the relative position of 180 degrees by adjusting the angle of the projection device according to the calculated projection angle of the projection device.
106: and projecting according to the adjusted projection angle to display the projection content directly facing the face.
In the embodiment of the invention, projection is carried out at the adjusted projection angle so that the projection content is displayed directly facing the face; whenever the camera detects the face, the angle is adjusted so that the face stays at the center of the image.
The beneficial effects of this embodiment are as follows: unlike the prior art, the disclosed dynamic projection method based on face tracking adjusts the projection angle in real time by tracking the changing position of a moving user's face, so that the tracked user always enjoys a front-on projected picture, enhancing viewing comfort.
FIG. 2 is a schematic plan view of an embodiment of the present invention for calculating an angle of an image capturing direction; fig. 3 is a schematic perspective view of calculating an angle of an image capturing direction according to an embodiment of the present invention.
In the embodiment of the present invention, a face detection algorithm is first used to detect the face: after the image acquisition device receives the face image, the facial region is located from the facial features and the position of the face in the current frame is determined. As shown in fig. 2 and fig. 3, point C is the position of the face in the current frame, point A is the center point of the acquired image, and point O is the position of the image acquisition device's camera. Calculating the angle of the image acquisition direction means calculating the angle formed by OC and OA which, as shown in fig. 2, comprises the horizontal angle x_angle_c and the vertical angle y_angle_c. L2 is the distance between the face and the horizontal plane, d is the distance between the image acquisition device and the horizontal plane, and, referring to fig. 3, dx is the distance between the projection of point A onto the horizontal plane and point B, the projection of the face onto the horizontal plane. Knowing L2 and dx, the angles x_angle_o and x_angle_c can be calculated; similarly, the vertical deflection angle y_angle_c of the image acquisition device can be calculated. After the camera analyses the acquired face image, a positional model as shown in fig. 2 is established, the horizontal angle x_angle_c and vertical angle y_angle_c by which the image acquisition device needs to change are calculated, and the face is moved to the center of the image. A tracking algorithm then tracks the face in real time and the image acquisition device is rotated by the calculated angles, so that the target face always stays at the center of the image; this makes it easy to capture the face, judge the user's intention, and change the angle of the projection device.
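Under the symbols of figs. 2 and 3, a plausible form of this calculation is the sketch below; the exact trigonometry is an assumption, since the text only states that the angles follow from L2 and the plane offsets:

```python
import math

def capture_angles(dx, dy, L2):
    """Horizontal and vertical capture angles (degrees) from plane offsets.

    dx, dy: horizontal/vertical offsets of the face from the projection of
            the image centre onto the plane (dx as in fig. 3);
    L2:     distance between the face and the plane (fig. 2).
    The atan2 form is an assumed reconstruction of the patent's geometry.
    """
    x_angle_c = math.degrees(math.atan2(dx, L2))
    y_angle_c = math.degrees(math.atan2(dy, L2))
    return x_angle_c, y_angle_c
```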
Fig. 4, fig. 5, and fig. 6 are schematic diagrams of a method for calculating a projection angle of a projection apparatus that needs to be changed under three different conditions according to an embodiment of the present invention.
In the embodiment of the present invention, a specific method for changing the angle of the projection device with the image acquisition direction is as follows: the initial positions of the projection device and the image acquisition device form a horizontal included angle of 180 degrees; by adjusting the projection device to the calculated projection angle, the two devices always remain in this 180-degree relative position, while the projector's angle is adjusted so that it projects directly facing the face.
In an embodiment of the present invention, the method for calculating the projection angle includes:
as shown in fig. 4, 5 and 6, the distance from the face to the image acquisition device is L; the fixed distance between the projection device and the image acquisition device is h; the horizontal movement angle of the image acquisition device is x _ ca; the angle of the horizontal movement of the projection device is x _ pr; the vertical movement angle of the image acquisition device is y _ ca; the angle of the projection device required to move vertically is y _ pr; the distance between the human face and the image acquisition device in the vertical direction is h 1; the actual distance between the projection device and the projection plane is L _ pr; the distance from the projection device to the projection plane opposite to the face is L _ pr _ measure;
L is obtained through the depth camera, and h is measured directly;
when the image acquisition device moves horizontally by x_ca, the projection device also moves horizontally by the same angle; thus x_pr = x_ca.
When the image acquisition device moves vertically by y_ca, y_pr is calculated in one of the following three cases:
as shown in fig. 4, when the face position is located below the image capturing device, the distance of h1 is obtained by calculating h1 ═ L × sin (y _ ca), the projection device is rotated, and the value of L _ pr is calculated in real time at an angle y _ pr of each rotation, where L _ pr is (h + h1)/sin (y _ pr), and when L _ pr is equal to L _ pr _ measure, the angle y _ pr of rotation is the direction in which the projector needs to project.
As shown in fig. 5, when the face is above the image acquisition device, h1 = L × cos(y_ca) − h. The projection device is rotated and, at each rotation angle y_pr, the value L_pr = h1/sin(y_pr) is calculated in real time; when L_pr equals L_pr_measure, the rotation angle y_pr is the direction in which the projector needs to project.
As shown in fig. 6, when the face is between the image acquisition device and the projection device, h1 = h − L × cos(y_ca). The projection device is rotated and, at each rotation angle y_pr, the value L_pr = h1/sin(y_pr) is calculated in real time; when L_pr equals L_pr_measure, the rotation angle y_pr is the direction in which the projection device needs to project. L_pr_measure is measured as described below.
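The three cases above can also be solved in closed form instead of rotating until L_pr equals L_pr_measure; the sketch below makes that simplifying assumption (names are illustrative, angles in radians):

```python
import math

def vertical_projection_angle(case, L, h, y_ca, L_pr_measure):
    """Solve y_pr for the three face/camera configurations of figs. 4-6.

    Inverts each case's formula directly — an assumption, since the patent
    describes an iterative physical search for L_pr == L_pr_measure.
    case: 'below', 'above', or 'between' (face relative to the camera).
    """
    if case == 'below':    # fig. 4: h1 = L*sin(y_ca), L_pr = (h+h1)/sin(y_pr)
        h1 = L * math.sin(y_ca)
        return math.asin((h + h1) / L_pr_measure)
    if case == 'above':    # fig. 5: h1 = L*cos(y_ca) - h, L_pr = h1/sin(y_pr)
        h1 = L * math.cos(y_ca) - h
        return math.asin(h1 / L_pr_measure)
    if case == 'between':  # fig. 6: h1 = h - L*cos(y_ca), L_pr = h1/sin(y_pr)
        h1 = h - L * math.cos(y_ca)
        return math.asin(h1 / L_pr_measure)
    raise ValueError(case)
```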
As shown in fig. 7, assume the central axes of the camera and the projection device lie on the same horizontal line. The projection device projects two circles on that horizontal line, and the camera acquires an image of their centers, where x1 and x2 are the image coordinates of the two circle centers; f is a fixed coefficient; d0 is the actual distance between the camera and the projection device; w is the width of the image; d represents the actual distance between the two detected circle centers; and abs denotes the absolute value.
From abs((x1 + x2)/2 − w/2)/d0 = abs(x1 − x2)/d, the value d can be calculated;
then, from L_pr_measure/f = d/abs(x1 − x2), L_pr_measure can be calculated.
It should be noted that (x1 + x2)/2 is the center of the two circle centers along the x-axis, w/2 is the center of the image in the x-axis direction, (x1 + x2)/2 − w/2 is the offset in the image along the x-axis, and (x1 − x2) corresponds to the actual offset distance in the x-axis direction.
The embodiment of the invention provides a dynamic projection system based on face tracking. Fig. 8 is a schematic diagram of a dynamic projection system based on face tracking according to an embodiment of the present invention.
Referring to fig. 8, the dynamic projection system 80 includes:
the trigger starting unit 801 is used for starting the dynamic projection system in the standby state.
Referring to fig. 9, the trigger activation unit 801 includes:
a remote control module 901, configured to start the dynamic projection system in a standby state by controlling a portable micro button device by a user; and/or
The voice module 902 is configured to start the dynamic projection system in the standby state according to a corresponding voice instruction; and/or
And the motion recognition module 903 is used for receiving body-language triggers, such as gestures, collected by the image acquisition unit and starting the dynamic projection system in a standby state.
In the embodiment of the present invention, a user can start the dynamic projection system 80 from standby with a portable micro-button device, a corresponding voice command, or body language. The dynamic projection system is equipped with a central processing module that is connected to and controls each unit in the system; the central processing module can be started through the trigger starting unit 801 to control the starting of the dynamic projection system 80.
An image acquisition unit 802, configured to acquire a face image and determine a position of a face;
in the embodiment of the present invention, the image acquisition unit 802 acquires a face image and uses a face detection algorithm to detect a face and determine the position of the face in the current frame relative to the image acquisition unit 802.
A first calculation and analysis unit 803, configured to calculate an angle of an image acquisition direction according to a position of a human face;
in the embodiment of the present invention, after the image acquisition unit 802 acquires a face image and determines the position of a face in a current frame relative to the image acquisition unit 802, the first calculation and analysis unit 803 analyzes and calculates the included angle of the face in the vertical direction and the horizontal direction relative to the image acquisition unit 802.
The first direction control unit 804 is configured to adjust an angle of the image acquisition direction according to the calculated angle of the image acquisition direction;
in this embodiment of the present invention, the first direction control unit 804 adjusts the image capturing unit 802 through the included angle of the face calculated by the first calculation and analysis unit 803 in the vertical direction and the horizontal direction with respect to the image capturing unit 802, and makes the face always located at the center of the subsequent captured image.
The second calculation and analysis unit 805 is configured to calculate a projection angle that needs to be changed by the projection apparatus according to the adjustment of the image acquisition direction angle;
in the embodiment of the present invention, the distance from the face to the image acquisition unit 802 is obtained by a depth camera installed on the image acquisition unit 802, and the distance between the projection apparatus and the image acquisition unit 802 is fixed. When the image acquisition unit 802 moves by a certain angle along the horizontal direction, the angle that the projection apparatus needs to move horizontally is the same as the angle that the image acquisition unit 802 moves in the horizontal direction; when the image capturing unit 802 moves by a certain angle along the vertical direction, the projection apparatus needs to move by different angles along the vertical direction according to the position relationship between the human face and the image capturing unit 802.
A second direction control unit 806, configured to adjust the projection angle according to the calculated projection angle of the projection apparatus.
In the embodiment of the present invention, the second direction control unit 806 rotates the projection apparatus to the projection angle calculated by the second calculation and analysis unit 805: horizontally by the same angle as the image capturing unit 802, and vertically by an angle that depends on the positional relationship between the face and the image capturing unit 802.
A projection display unit 807 for displaying the projection content.
In this embodiment of the present invention, the projection display unit 807 projects the projection content directly opposite the human face according to the projection angle adjusted by the second direction control unit 806.
And the photography correction unit 808 is configured to capture images covering the projection picture area and correct the projection picture.
In the embodiment of the present invention, while the projection display unit 807 is projecting, the photography correction unit 808 captures images covering the projection picture area and performs projection picture correction; the photography correction unit 808 includes a trapezoidal (keystone) correction module (not shown) and a projection brightness correction module (not shown), which correct the shape and the brightness of the projection content of the projection apparatus, respectively.
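As one way to illustrate the trapezoidal (keystone) correction mentioned above: the shape distortion between the captured projection picture and the desired rectangle can be modeled as a planar homography estimated from the four picture corners with a direct linear transform. This is a generic sketch, not the patent's specific implementation; the function name and the use of NumPy are assumptions.

```python
import numpy as np

def keystone_homography(src, dst):
    """Estimate the 3x3 homography mapping the four observed corners of the
    projected picture (src) to the four desired rectangle corners (dst).

    src, dst: sequences of four (x, y) pairs. Warping the projector's input
    image by the resulting H pre-distorts it so that the picture on the wall
    appears rectangular (the essence of trapezoidal correction).
    """
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two rows of the DLT system A h = 0.
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    A = np.asarray(A, dtype=float)
    # h is the right singular vector for the smallest singular value of A.
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so that H[2, 2] == 1
```

Brightness correction would then adjust per-pixel gain over the same corrected region; that step is omitted here.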
In this embodiment of the present invention, the trigger starting unit 801 starts the image acquisition unit 802, and the image acquisition unit 802 acquires a face image and determines the face information. The first calculation and analysis unit 803 calculates the angle of the image acquisition direction according to the face information acquired by the image acquisition unit 802, and the first direction control unit 804 adjusts the angle of the image acquisition direction according to the result calculated by the first calculation and analysis unit 803. The second calculation and analysis unit 805 then calculates the projection angle that the projection apparatus needs to change according to the angle of the image acquisition direction, and the second direction control unit 806 adjusts the projection angle according to the projection angle calculated by the second calculation and analysis unit 805, so that the projection content projected by the projection display unit 807 is always located directly opposite the face. While the projection display unit 807 is projecting, the photography correction unit 808 captures images covering the projection picture area and performs projection correction.
In the embodiment of the present invention, the dynamic projection apparatus 80 can execute the dynamic projection method provided in the embodiment of the present invention, and has functional modules and beneficial effects corresponding to the execution method. For technical details that are not described in detail in the embodiment of the dynamic projection apparatus 80, reference may be made to the dynamic projection method provided by the embodiment of the present invention.
The beneficial effects of the embodiment of the invention are as follows: different from the prior art, the invention discloses a dynamic projection system based on face tracking. With this system, the projection angle can be adjusted in real time by tracking changes in the position of a moving user's face, so that the tracked user always enjoys a front-on view of the projection picture, enhancing the user's viewing comfort.
Fig. 10 is a schematic structural diagram of a dynamic projection system according to an embodiment of the present invention, and as shown in fig. 10, the dynamic projection system 100 includes:
one or more processors 1001 and a memory 1002; one processor 1001 is taken as an example in fig. 10.
The processor 1001 and the memory 1002 may be connected by a bus or other means; the bus connection is taken as an example in fig. 10.
The memory 1002, as a non-volatile computer-readable storage medium, may be used to store a non-volatile software program, a non-volatile computer-executable program, and modules, such as the program instructions/units corresponding to the dynamic projection method in the embodiment of the present invention (for example, the trigger starting unit 801, the image acquisition unit 802, the first calculation and analysis unit 803, the first direction control unit 804, the second calculation and analysis unit 805, the second direction control unit 806, the projection display unit 807, and the photography correction unit 808 shown in fig. 8). The processor 1001 executes various functional applications and data processing of the dynamic projection system, i.e., implements the dynamic projection method of the method embodiments, by running the non-volatile software programs, instructions, and units stored in the memory 1002.
The memory 1002 may include a storage program area and a storage data area, wherein the storage program area may store an operating system and an application program required for at least one function, and the storage data area may store data created from the use of the dynamic projection system, and the like. Further, the memory 1002 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some embodiments, the memory 1002 may optionally include memory located remotely from the processor 1001, which may be connected to the dynamic projection system via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The one or more units are stored in the memory 1002 and, when executed by the one or more processors 1001, perform the dynamic projection method described above, for example, performing the above-described method steps 101 to 106 in fig. 1 and implementing the functions of units 801 to 808 in fig. 8.
The dynamic projection system can execute the dynamic projection method provided by the invention and has corresponding functional modules and beneficial effects of the execution method. For technical details that are not described in detail in the embodiment of the dynamic projection system, reference may be made to the dynamic projection method provided by the present invention.
Embodiments of the present invention provide a non-transitory computer-readable storage medium storing computer-executable instructions for execution by one or more processors, for example, to perform the above-described method steps 101-106 in fig. 1, and to implement the functions of the units 801-808 in fig. 8.
It should be noted that the above-described device embodiments are merely illustrative, where the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
Through the above description of the embodiments, those skilled in the art will clearly understand that the embodiments may be implemented by software plus a general hardware platform, and may also be implemented by hardware. It will also be understood by those skilled in the art that all or part of the processes of the methods of the embodiments may be implemented by a computer program instructing the relevant hardware; the program may be stored in a computer-readable storage medium and, when executed, may include the processes of the method embodiments as described. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; within the idea of the invention, also technical features in the above embodiments or in different embodiments may be combined, steps may be implemented in any order, and there are many other variations of the different aspects of the invention as described above, which are not provided in detail for the sake of brevity; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. A dynamic projection method based on face tracking is characterized by comprising the following steps:
collecting a face image and determining the position of a face;
calculating the angle of the image acquisition direction according to the position of the face, wherein the angle comprises a horizontal angle and a vertical angle;
adjusting the angle of the image acquisition direction according to the calculated angle of the image acquisition direction;
calculating a projection angle that needs to be changed by a projection device according to the adjustment of the angle of the image acquisition direction, wherein, when the image acquisition device moves vertically by any angle, determining the angle by which the projection device needs to move vertically comprises: calculating the distance between the human face and the image acquisition device in the vertical direction, rotating the projection device while calculating the actual distance between the projection device and a projection plane in real time, and, when the rotation angle makes the actual distance between the projection device and the projection plane equal to the distance between the projection device and the point on the projection plane directly opposite the human face, taking the rotation angle as the angle by which the projection device needs to rotate for projection;
adjusting the projection angle according to the calculated projection angle of the projection device, wherein the initial positions of the projection device and the image acquisition device form a horizontal included angle of 180 degrees, and, by adjusting the angle of the projection device according to the calculated projection angle of the projection device, the final adjusted positions of the projection device and the image acquisition device always maintain the 180-degree relative position;
and projecting according to the adjusted projection angle, so that the projection content is displayed directly opposite the human face.
2. The dynamic projection method of claim 1, wherein the acquiring a face image and determining the position of the face comprises: acquiring a face image through an image acquisition device, detecting a face by using a face detection algorithm, and determining the position of the face in the current frame relative to the image acquisition device.
3. The dynamic projection method of claim 1, wherein the calculating of the angle of the image acquisition direction comprises calculating a horizontal angle and a vertical angle, wherein:
the vertical angle is an included angle of the face position relative to the image acquisition device in the vertical direction;
and the horizontal angle is an included angle of the face position relative to the image acquisition device in the horizontal direction.
4. The dynamic projection method of claim 1, wherein the calculating of the projection angle comprises:
when the image acquisition device moves horizontally by any angle, the projection device also moves horizontally by the same angle.
5. A dynamic projection device based on face tracking, the device comprising:
the image acquisition unit is used for acquiring a face image and determining the position of a face;
the first calculation and analysis unit is used for calculating the angle of the image acquisition direction according to the position of the face, including calculating the horizontal angle and the vertical angle;
the first direction control unit is used for adjusting the angle of the image acquisition direction according to the calculated angle of the image acquisition direction;
the second calculation and analysis unit is configured to calculate a projection angle that needs to be changed by the projection device according to the adjustment of the angle of the image acquisition direction, wherein, when the image acquisition device moves vertically by any angle, determining the angle by which the projection device needs to move vertically comprises: calculating the distance between the human face and the image acquisition device in the vertical direction, rotating the projection device while calculating the actual distance between the projection device and a projection plane in real time, and, when the rotation angle makes the actual distance between the projection device and the projection plane equal to the distance between the projection device and the point on the projection plane directly opposite the human face, taking the rotation angle as the angle by which the projection device needs to rotate for projection;
the second direction control unit is configured to adjust the projection angle according to the calculated projection angle of the projection device, wherein the initial positions of the projection device and the image acquisition device form a horizontal included angle of 180 degrees, and, by adjusting the angle of the projection device according to the calculated projection angle of the projection device, the final adjusted positions of the projection device and the image acquisition device always maintain the 180-degree relative position;
and the projection display unit is configured to project according to the adjusted projection angle, so that the projection content is displayed directly opposite the human face.
6. The dynamic projection device according to claim 5, wherein the device further comprises: a trigger starting unit, configured to start the dynamic projection device in a standby state.
7. The dynamic projection device of claim 6, wherein the trigger starting unit comprises:
the remote control module is used for starting the dynamic projection device in a standby state by controlling the portable micro button device through a user; and/or
The voice module is used for starting the dynamic projection device in a standby state according to the corresponding voice instruction; and/or
And the motion recognition module is used for receiving the body language trigger such as the gesture collected by the image collection unit and starting the motion projection device in a standby state.
8. The dynamic projection device according to claim 5, wherein the device further comprises: a photography correction unit, configured to capture images covering the projection picture area and correct the projection picture.
9. A dynamic projection system, comprising:
a kinetic projection device according to claims 5-8,
at least one processor; and
a memory communicatively connected to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-4.
10. A non-transitory computer-readable storage medium having stored thereon computer-executable instructions for enabling a dynamic projection device to perform the method of any one of claims 1-4.
CN201711435378.4A 2017-12-26 2017-12-26 Dynamic projection method, device and system based on face tracking Active CN109960401B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201711435378.4A CN109960401B (en) 2017-12-26 2017-12-26 Dynamic projection method, device and system based on face tracking
PCT/CN2018/089628 WO2019128109A1 (en) 2017-12-26 2018-06-01 Face tracking based dynamic projection method, device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711435378.4A CN109960401B (en) 2017-12-26 2017-12-26 Dynamic projection method, device and system based on face tracking

Publications (2)

Publication Number Publication Date
CN109960401A CN109960401A (en) 2019-07-02
CN109960401B true CN109960401B (en) 2020-10-23

Family

ID=67022499

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711435378.4A Active CN109960401B (en) 2017-12-26 2017-12-26 Dynamic projection method, device and system based on face tracking

Country Status (2)

Country Link
CN (1) CN109960401B (en)
WO (1) WO2019128109A1 (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110458617B (en) * 2019-08-07 2022-03-18 卓尔智联(武汉)研究院有限公司 Advertisement putting method, computer device and readable storage medium
CN111046729A (en) * 2019-11-05 2020-04-21 安徽爱学堂教育科技有限公司 Double-sided screen lifting adjustment system based on face tracking
CN111031298B (en) * 2019-11-12 2021-12-10 广景视睿科技(深圳)有限公司 Method and device for controlling projection of projection module and projection system
CN111179694B (en) * 2019-12-02 2022-09-23 广东小天才科技有限公司 Dance teaching interaction method, intelligent sound box and storage medium
CN110897604A (en) * 2019-12-26 2020-03-24 深圳市博盛医疗科技有限公司 Laparoscope system for reducing three-dimensional distortion in 3D vision and use method
CN111144327B (en) * 2019-12-28 2023-04-07 神思电子技术股份有限公司 Method for improving recognition efficiency of face recognition camera of self-service equipment
CN111491146B (en) * 2020-04-08 2021-11-26 上海松鼠课堂人工智能科技有限公司 Interactive projection system for intelligent teaching
CN111489594B (en) * 2020-05-09 2022-02-18 兰州石化职业技术学院 Man-machine interaction platform
CN115113553A (en) * 2021-09-03 2022-09-27 博泰车联网科技(上海)股份有限公司 Control method, control system and control device for vehicle-mounted video and audio playing
CN114554031A (en) * 2022-03-07 2022-05-27 云知声智能科技股份有限公司 Method, device, terminal and storage medium for prompting
CN114782901B (en) * 2022-06-21 2022-09-09 深圳市禾讯数字创意有限公司 Sand table projection method, device, equipment and medium based on visual change analysis
CN115494961B (en) * 2022-11-17 2023-03-24 南京熊大巨幕智能科技有限公司 Novel interactive surrounding intelligent display equipment based on face recognition

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102307288A (en) * 2011-07-27 2012-01-04 中国计量学院 Projection system moving along with sightline of first person based on human face recognition
CN106650665A (en) * 2016-12-26 2017-05-10 北京旷视科技有限公司 Human face tracing method and device
CN107065409A (en) * 2017-06-08 2017-08-18 广景视睿科技(深圳)有限公司 Trend projection arrangement and its method of work

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103996215A (en) * 2013-11-05 2014-08-20 深圳市云立方信息科技有限公司 Method and apparatus for realizing conversion from virtual view to three-dimensional view
CN107195277A (en) * 2017-05-15 2017-09-22 盐城华星光电技术有限公司 A kind of LCD MODULE(LCM)And its method for displaying image

Also Published As

Publication number Publication date
WO2019128109A1 (en) 2019-07-04
CN109960401A (en) 2019-07-02

Similar Documents

Publication Publication Date Title
CN109960401B (en) Dynamic projection method, device and system based on face tracking
JP7283506B2 (en) Information processing device, information processing method, and information processing program
US10157477B2 (en) Robust head pose estimation with a depth camera
US9723226B2 (en) System and method for acquiring virtual and augmented reality scenes by a user
US10755438B2 (en) Robust head pose estimation with a depth camera
US9154739B1 (en) Physical training assistant system
WO2018171041A1 (en) Moving intelligent projection system and method therefor
WO2019062056A1 (en) Smart projection method and system, and smart terminal
WO2018223469A1 (en) Dynamic projection device and operation method thereof
US20170316582A1 (en) Robust Head Pose Estimation with a Depth Camera
US11879750B2 (en) Distance estimation using multi-camera device
CN109982054B (en) Projection method and device based on positioning tracking, projector and projection system
CN107562189B (en) Space positioning method based on binocular camera and service equipment
US10447926B1 (en) Motion estimation based video compression and encoding
KR102228663B1 (en) Method of providing photographing guide and system therefor
US10165186B1 (en) Motion estimation based video stabilization for panoramic video from multi-camera capture device
CN112702587A (en) Intelligent tracking projection method and system
CN113870213A (en) Image display method, image display device, storage medium, and electronic apparatus
EP3882846B1 (en) Method and device for collecting images of a scene for generating virtual reality data
KR101741149B1 (en) Method and device for controlling a virtual camera's orientation
US20150063631A1 (en) Dynamic image analyzing system and operating method thereof
US20160011675A1 (en) Absolute Position 3D Pointing using Light Tracking and Relative Position Detection
CN115348438B (en) Control method and related device for three-dimensional display equipment
US20230070721A1 (en) Method, processing device, and display system for information display
EP4233004A1 (en) Automated calibration method of a system comprising an external eye tracking device and a computing device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
PP01 Preservation of patent right

Effective date of registration: 20231226

Granted publication date: 20201023
