CN116489326A - Automatic following projection method and device and electronic equipment - Google Patents

Automatic following projection method and device and electronic equipment

Info

Publication number
CN116489326A
CN116489326A (application CN202310394619.4A)
Authority
CN
China
Prior art keywords
projection
point
sight
projector
line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310394619.4A
Other languages
Chinese (zh)
Inventor
杨杰
李秀
甘露
方堃
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Zhenhuo Technology Co ltd
Original Assignee
Shenzhen Zhenhuo Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Zhenhuo Technology Co ltd filed Critical Shenzhen Zhenhuo Technology Co ltd
Priority to CN202310394619.4A priority Critical patent/CN116489326A/en
Publication of CN116489326A publication Critical patent/CN116489326A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3185Geometric adjustment, e.g. keystone or convergence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The application provides an automatic following projection method and device and an electronic device. The method is applied to a server and comprises: acquiring a line-of-sight falling point of a user in real time, the falling point being the point at which the line of sight intersects the projection wall surface when the user views the projection wall surface; acquiring first position information of the falling point; acquiring second position information of a projection point of the projector in real time, the projection point being the central point of the projection picture when the projector projects the picture onto the projection wall surface; calculating a first distance between the falling point and the projection point based on the first position information and the second position information; determining the magnitude relation between the first distance and a preset first threshold, and if the first distance is less than or equal to the first threshold, setting the current position of the projector as the projection position; and sending a position adjustment signal to the projector to move the projector to the projection position. The projection position of the projector can thus be adjusted in real time according to the viewing angle of the user.

Description

Automatic following projection method and device and electronic equipment
Technical Field
The application relates to the technical field of intelligent projection, in particular to an automatic following projection method, an automatic following projection device and electronic equipment.
Background
A projector is a very common audio-visual device that can project images or videos onto a curtain or wall surface, and has a larger display area and a more comfortable viewing experience than conventional televisions. Projectors are widely used in education, business, entertainment, etc., and are one of the indispensable devices in modern life.
Currently, when viewing with a common projector, the user must sit at a specific position and angle to achieve the best viewing effect. If the user moves, the position of the projector has to be adjusted manually to change the projection angle, which greatly degrades the viewing experience. Therefore, a method is needed that can adjust the projection position of the projector in real time according to the user's viewing angle.
Disclosure of Invention
The application provides an automatic following projection method, an automatic following projection device and electronic equipment, which have the effect of adjusting the projection position of a projector in real time according to the viewing angle of a user.
In a first aspect of the present application, there is provided an automatic following projection method, the method being applied to a server, the method comprising:
acquiring a sight line falling point of a user in real time, wherein the sight line falling point is a point at which the sight line intersects with a projection wall surface when the user views the projection wall surface;
acquiring first position information of the sight falling point;
acquiring second position information of a projection point of a projector in real time, wherein the projection point is a central point of a projection picture when the projector projects the projection picture to the projection wall surface;
calculating a first distance between the line-of-sight landing point and the projection point based on the first position information and the second position information;
determining the magnitude relation between the first distance and a preset first threshold value, and if the first distance is smaller than or equal to the first threshold value, setting the current position of the projector as a projection position;
and sending a position adjustment signal to the projector so as to enable the projector to move to the projection position.
By adopting the above technical solution, when the user watches the projection picture on the projection wall surface and the user's position changes, the user's line-of-sight falling point on the wall changes accordingly. The server calculates the position to which the projector needs to project the picture according to the falling point, and then adjusts the position of the projector, so that the user can still view the projection picture normally after moving, achieving the effect of adjusting the projection position of the projector in real time according to the user's viewing angle.
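As an illustrative sketch only (not part of the claimed solution), the cycle of acquiring the falling point, computing the first distance and comparing it to the first threshold could be expressed as a single server-side control step; the function and callback names and the threshold value below are hypothetical:

```python
import math

FIRST_THRESHOLD = 0.15  # metres; illustrative value, the patent leaves it open


def follow_projection_step(gaze_xy, proj_xy, send_adjust):
    """One iteration of the auto-follow loop: compare the line-of-sight
    falling point with the projection centre and move the projector only
    when they have drifted apart by more than the first threshold."""
    # First distance: Euclidean distance between the two wall-plane points
    d = math.dist(gaze_xy, proj_xy)
    if d <= FIRST_THRESHOLD:
        # Already close enough: current position becomes the projection position
        return proj_xy
    # Otherwise send a position adjustment signal toward the falling point
    send_adjust(gaze_xy)
    return gaze_xy
```

A caller would invoke this repeatedly with fresh gaze and projection coordinates, passing a callback that relays the adjustment signal to the projector.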
Optionally, the acquiring the first position information of the line-of-sight landing point specifically includes:
establishing a first coordinate system, wherein the first coordinate system takes the central point of the projection wall surface as the origin, the horizontal direction of the projection wall surface as the X axis and the vertical direction of the projection wall surface as the Y axis;
and acquiring a first coordinate of the line-of-sight falling point in the first coordinate system, wherein the first coordinate is the first position information.
By adopting the technical scheme, the server establishes the first coordinate system based on the projection wall surface, and the position of the sight falling point can be accurately represented in a coordinate mode.
Optionally, the acquiring the second position information of the projection point of the projector specifically includes:
and acquiring a second coordinate of the projection point in the first coordinate system, wherein the second coordinate is the second position information.
By adopting the technical scheme, the server expresses the position of the projection point in the first coordinate system in a coordinate mode, so that the distance required to be moved by the projection picture can be calculated by combining the position of the line-of-sight falling point.
Optionally, the calculating the first distance between the line of sight falling point and the projection point based on the first position information and the second position information specifically includes:
the first distance is calculated based on the first coordinate and the second coordinate.
By adopting the technical scheme, the server calculates the first distance based on the first coordinate and the second coordinate, and the first distance is the deviation distance between the center point of the projection picture and the line of sight falling point observed by the user. The server obtains the first distance so as to facilitate subsequent adjustment of the projection picture according to the first distance.
Optionally, the acquiring, in real time, the line of sight landing point of the user specifically includes:
acquiring an eye image of a user in real time;
identifying the eye image to obtain pupil movement information;
obtaining an observation line of sight of the user based on the pupil movement information;
and extending the observation sight until the observation sight is intersected with the projection wall surface to obtain an intersection point, and setting the intersection point as the sight falling point.
By adopting the above technical solution, the server performs pupil recognition on the user, acquires the user's observation line of sight, and derives the user's line-of-sight falling point from it, which facilitates subsequent adjustment of the projector's projection position according to the point on the projection wall surface that the user is watching.
Optionally, after the determining that the first distance is less than or equal to the first threshold, the method further includes:
acquiring the projection angle, and determining the magnitude relation between the projection angle and a preset second threshold value, wherein the projection angle is the included angle between the user's line of sight and the projection line;
and if the projection angle is smaller than or equal to the second threshold value, setting the current position of the projector as a projection position.
By adopting the above technical solution, after the server has brought the projection point close to the line-of-sight falling point through calculation, it further controls the projector to adjust the projection angle, reducing the probability that the projection picture is severely distorted because the projection angle is larger than the second threshold value.
Optionally, after the obtaining the first position information of the line-of-sight landing point, the method further includes:
acquiring the time length of the line-of-sight falling point in a first area, wherein the first area is a circular area taking a first position as a circle center and taking a preset length as a radius;
judging whether the duration is greater than a preset third threshold value, and if the duration is greater than the third threshold value, determining that the first position is the position of the sight falling point.
By adopting the technical scheme, when the duration for which the user gazes at the first area exceeds the third threshold value, the user is considered to have been watching that area of the projection wall surface for a sustained time, and the position of the projection picture needs to be adjusted. Setting the third threshold reduces the probability of an excessively high picture-movement frequency caused by brief shifts of the user's line of sight.
Optionally, after adjusting the position of the projector based on the relative distance between the first coordinate and the second coordinate to obtain the optimal projection position, the method further includes:
obtaining a projection picture of a projector on the projection wall surface;
identifying the shape of the projection picture, and judging whether the shape of the projection picture is rectangular;
and correcting the shape of the projection picture to be rectangular if the shape of the projection picture is not rectangular.
By adopting the technical scheme, the server corrects the projection picture based on the shape of the projection picture, so that the ornamental effect of the user can be improved.
In a second aspect of the present application, an automatic following projection apparatus is provided, where the apparatus is a server, and includes an acquisition module, a processing module, and an output module, where:
the acquisition module is used for acquiring a sight line falling point of a user in real time, wherein the sight line falling point is the point at which the sight line intersects the projection wall surface when the user views the projection wall surface; and
acquiring first position information of the sight falling point; and
acquiring second position information of a projection point of a projector in real time, wherein the projection point is the central point of a projection picture when the projector projects the projection picture onto the projection wall surface;
the processing module is used for calculating a first distance between the sight falling point and the projection point based on the first position information and the second position information; and
determining the magnitude relation between the first distance and a preset first threshold value, and if the first distance is smaller than or equal to the first threshold value, setting the current position of the projector as a projection position;
and the output module is used for sending a position adjustment signal to the projector so as to enable the projector to move to the projection position.
In a third aspect of the present application, there is provided an electronic device comprising a processor, a memory, a user interface and a network interface. The memory is used for storing instructions, the user interface and the network interface are both used for communicating with other devices, and the processor is used for executing the instructions stored in the memory, so that the electronic device performs the method of any one of the above aspects.
In summary, one or more technical solutions provided in the embodiments of the present application at least have the following technical effects or advantages:
when a user watches the projection picture on a projection wall surface and the user's position moves, the user's line-of-sight falling point on the wall changes. The server calculates the position to which the projector needs to project the picture according to that falling point and then adjusts the position of the projector, so that the user can still view the projection picture normally after moving, achieving the effect of adjusting the projection position of the projector in real time according to the user's viewing angle.
Drawings
Fig. 1 is a schematic flow chart of an automatic following projection method disclosed in an embodiment of the present application.
Fig. 2 is an application scene diagram of an automatic following projection method disclosed in an embodiment of the present application.
Fig. 3 is a schematic structural diagram of an automatic following projection device according to an embodiment of the present disclosure.
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Reference numerals illustrate: 301. an acquisition module; 302. a processing module; 303. an output module; 401. a processor; 402. a communication bus; 403. a user interface; 404. a network interface; 405. a memory.
Detailed Description
In order to make those skilled in the art better understand the technical solutions in the present specification, the technical solutions in the embodiments of the present specification will be clearly and completely described below with reference to the drawings in the embodiments of the present specification, and it is obvious that the described embodiments are only some embodiments of the present application, but not all embodiments.
In the description of embodiments of the present application, words such as "for example" or "such as" are used to indicate examples, illustrations or descriptions. Any embodiment or design described herein as "for example" or "such as" should not be construed as preferred or more advantageous than other embodiments or designs. Rather, such words are intended to present related concepts in a concrete fashion.
In the description of the embodiments of the present application, the term "plurality" means two or more. For example, a plurality of systems means two or more systems, and a plurality of screen terminals means two or more screen terminals. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
The embodiment discloses an automatic following projection method, which is applied to a server, and referring to fig. 1, and comprises the following steps:
s110, acquiring a sight line falling point of a user in real time, wherein the sight line falling point is a point at which the sight line intersects with the projection wall surface when the user views the projection wall surface.
Specifically, in the present embodiment, the line of sight is defined as follows: when the user observes a specific point, the pupils of the user's two eyes are joined to obtain a binocular connecting line, and the specific point is then connected to the midpoint of the binocular connecting line; the resulting line is the user's line of sight. The projection wall surface is defined as a blank wall surface of the room in which the user is located, with no object blocking the projected picture on it.
In a possible implementation manner, acquiring the line-of-sight falling point of the user in real time specifically includes: acquiring an eye image of the user in real time; identifying the eye image to obtain pupil movement information; acquiring the user's observation line of sight based on the pupil movement information; and extending the observation line of sight until it intersects the projection wall surface, and setting the intersection point as the line-of-sight falling point.
Specifically, a real-time image of the user's eyes is captured by a camera or similar sensor and sent to the server, which processes and analyzes it with computer vision techniques to obtain pupil movement information, for example by a pupil tracking algorithm that detects the position and size of the pupil and updates them as the pupil moves. The server determines the user's gaze direction and gaze point by analyzing the movement pattern of the pupil. Finally, the server extends the observation line of sight until it intersects the projection wall surface and sets the intersection point as the line-of-sight falling point.
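Geometrically, extending the observation line of sight to the wall is a ray–plane intersection. A minimal sketch, under the assumption (not stated in the patent) that the wall is the plane z = 0 in the world frame and the gaze origin is the midpoint between the eyes:

```python
def gaze_landing_point(eye_mid, gaze_dir):
    """Extend the observation line of sight from the midpoint between the
    eyes along the gaze direction until it meets the wall plane z = 0.
    Returns (x, y) on the wall, or None if the gaze never reaches it."""
    ox, oy, oz = eye_mid
    dx, dy, dz = gaze_dir
    if dz == 0:
        return None          # sight line is parallel to the wall
    t = -oz / dz             # ray parameter where it crosses z = 0
    if t <= 0:
        return None          # wall is behind the viewer
    return (ox + t * dx, oy + t * dy)
```

For example, a viewer 2 m from the wall looking straight ahead lands at the point directly in front of the eyes.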
S120, acquiring first position information of a line-of-sight falling point.
In one possible implementation manner, obtaining the first position information of the line-of-sight falling point specifically includes: establishing a first coordinate system that takes the central point of the projection wall surface as the origin, the horizontal direction of the wall surface as the X axis and the vertical direction of the wall surface as the Y axis; and acquiring a first coordinate of the line-of-sight falling point in the first coordinate system, wherein the first coordinate is the first position information.
Specifically: the calculation and simulation of the sight line of the user and the calculation and simulation of the falling point of the sight line are performed in a world coordinate system, namely, a coordinate system established based on the real world where the user is located by the server. After the line of sight falling point in the world coordinate system is acquired, coordinate conversion is needed, and the coordinate of the line of sight falling point in the world coordinate system is converted into the coordinate in the first coordinate system, so that the first coordinate is obtained, and the first coordinate is the first position information. The transformation of the coordinates is a conventional technical means in the related art, and will not be further described herein.
S130, acquiring second position information of a projection point of the projector in real time, wherein the projection point is a central point of a projection picture when the projector projects the projection picture onto a projection wall surface.
In one possible implementation manner, the method for obtaining the second position information of the projection point of the projector specifically includes: and acquiring a second coordinate of the projection point in the first coordinate system, wherein the second coordinate is second position information.
Specifically, a UWB positioning module is arranged in the projector, which uses ultra-wideband pulse signals to achieve centimetre-level high-precision positioning and thus obtain the exact position of the projector lens. The distance between the projector and the projection picture is then measured by laser ranging, and combined with the known projection angle the exact position of the projection point is obtained. Finally, the coordinates of the projection point in the world coordinate system are converted into the first coordinate system to obtain the second coordinate, i.e. the second position information of the projection point.
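One way to combine the UWB lens position with the laser-ranged distance (an assumed formulation; the patent does not fix the math) is to walk from the lens along the projection direction by the measured distance:

```python
import math


def projection_point(lens_xyz, proj_dir, distance):
    """Locate the projection point: start at the lens position obtained
    from UWB positioning and move along the projection direction by the
    laser-measured distance to the wall. The direction is normalised so
    it need not be a unit vector."""
    norm = math.sqrt(sum(c * c for c in proj_dir))
    return tuple(l + distance * c / norm for l, c in zip(lens_xyz, proj_dir))
```

The resulting world-frame point would then be passed through the same world-to-first-coordinate-system conversion as the falling point.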
S140, calculating a first distance between the sight line falling point and the projection point based on the first position information and the second position information.
In one possible implementation manner, calculating the first distance between the line-of-sight falling point and the projection point based on the first position information and the second position information specifically includes: calculating the first distance based on the first coordinate and the second coordinate.
Specifically, after the server obtains the first coordinate of the line-of-sight falling point and the second coordinate of the projection point, the first distance can be calculated by a distance formula between the two points because the first coordinate and the second coordinate are two-dimensional coordinates in the first coordinate system.
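The two-point distance formula mentioned above, written out for the two planar coordinates:

```python
import math


def first_distance(first_coord, second_coord):
    """Plane distance between the line-of-sight falling point (first
    coordinate) and the projection point (second coordinate), both
    expressed in the first coordinate system."""
    (x1, y1), (x2, y2) = first_coord, second_coord
    return math.hypot(x2 - x1, y2 - y1)
```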
And S150, judging the magnitude relation between the first distance and a preset first threshold value, and if the first distance is smaller than or equal to the first threshold value, setting the current position of the projector as a projection position.
Specifically, when the distance between the projection point and the line-of-sight falling point is less than or equal to the first threshold value, the projector is already projecting the picture roughly in front of the user, so the distortion of the projected picture is small and the influence on the projection effect is small. The specific value of the first threshold is not limited in this embodiment; other embodiments may adjust it according to the actual situation.
In one possible embodiment, after determining that the first distance is less than or equal to the first threshold, the method further comprises: acquiring the projection angle and determining the magnitude relation between the projection angle and a preset second threshold value, the projection angle being the included angle between the user's line of sight and the projection line; and if the projection angle is less than or equal to the second threshold value, setting the current position of the projector as the projection position.
Specifically, referring to fig. 2, in the present embodiment the projection line L1 is defined as the connecting line between the centre point of the projector lens and the projection point a. The included angle between the projection line L1 and the sight line L2 is the projection angle alpha. When the projection angle alpha is less than or equal to the second threshold value and the first distance between the projection point a and the sight falling point b is less than or equal to the first threshold value, the distortion of the projection picture is small and the influence on the projection effect is small. The specific value of the second threshold is not limited in this embodiment; other embodiments may adjust it according to the actual situation.
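Given direction vectors for the sight line L2 and the projection line L1, the projection angle alpha follows from the standard dot-product formula (a sketch; how the embodiment actually measures the angle is not detailed):

```python
import math


def projection_angle(sight_vec, proj_vec):
    """Angle alpha in degrees between the user's sight line L2 and the
    projection line L1, from their direction vectors. The cosine is
    clamped to [-1, 1] to guard against floating-point drift."""
    dot = sum(a * b for a, b in zip(sight_vec, proj_vec))
    n1 = math.sqrt(sum(a * a for a in sight_vec))
    n2 = math.sqrt(sum(b * b for b in proj_vec))
    cosang = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cosang))
```

The result would be compared against the second threshold before confirming the projection position.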
In one possible implementation manner, after the first position information of the line-of-sight landing point is acquired, the method further includes: and acquiring the time length of the line-of-sight falling point in a first area, wherein the first area is a circular area taking the first position as a circle center and taking the preset length as a radius. Judging whether the duration is greater than a preset third threshold value, and if so, determining that the first position is the position of the line-of-sight falling point.
Specifically, because the server monitors the user's line-of-sight falling point in real time, adjusting the projection point immediately whenever the falling point moves would make the picture move too frequently and easily cause visual dizziness. Therefore, after acquiring the first position information of the falling point in real time, the server measures how long the falling point stays within the first area; when that duration exceeds the preset third threshold value, it judges that the user has settled after adjusting position or viewing posture and the falling point has genuinely changed, and sets the first position as the position of the line-of-sight falling point. The third threshold may be 2 seconds, 10 seconds or 30 seconds; in this embodiment it is preferably 10 seconds, and other embodiments may adjust it according to the actual situation.
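This dwell-time check can be sketched as a small stateful filter (an illustrative sketch; the class and parameter names are not from the patent, which only specifies the circular first area and the third threshold):

```python
import math


class GazeDwellFilter:
    """Confirm a new line-of-sight falling point only after it has stayed
    inside a circle of `radius` around its first position for more than
    `third_threshold` seconds (10 s is the embodiment's preferred value)."""

    def __init__(self, radius=0.2, third_threshold=10.0):
        self.radius = radius
        self.threshold = third_threshold
        self.anchor = None       # first position (centre of the first area)
        self.anchor_t = None

    def update(self, point, t):
        if self.anchor is None or math.dist(point, self.anchor) > self.radius:
            # Gaze left the first area: restart the dwell timer here
            self.anchor, self.anchor_t = point, t
            return None
        if t - self.anchor_t > self.threshold:
            return self.anchor   # dwelled long enough: confirmed falling point
        return None
```

Brief glances reset the timer, so only sustained gazes trigger a picture move.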
In one possible embodiment, the direction and position of the user's gaze is determined by analyzing the movement of the pupil and the pose of the head.
Specifically, the area at which the user's eyes are gazing is first determined from the change in pupil position and size together with the head posture. For example, when the pupil stays in an area for longer than a certain threshold, that area may be regarded as a candidate area for the line-of-sight falling point. The specific point of the user's gaze within the candidate area is then determined by analyzing the motion trail and time-series data of the pupil; for example, when the pupil gazes at a point for longer than a certain threshold, that point may be determined as the line-of-sight falling point.
In one possible implementation manner, after adjusting the position of the projector based on the relative distance between the first coordinate and the second coordinate to obtain the optimal projection position, the method specifically further includes: and obtaining a projection picture of the projector on the projection wall surface. Identifying the shape of the projection picture, and judging whether the shape of the projection picture is rectangular; if the shape of the projection screen is not rectangular, the shape of the projection screen is corrected to be rectangular.
Specifically, when the distance between the centre point of the projector lens and the projection point in the horizontal and/or vertical direction is greater than the second threshold value, the projection line deviates noticeably from the normal of the wall surface, which distorts the picture, so the distorted picture needs to be corrected. The server first acquires the projection picture of the projector on the projection wall surface and identifies its shape. If the shape is rectangular, the picture is not distorted; if it is not rectangular, distortion has occurred. The server then corrects the picture into a rectangle according to its shape and sends the corrected projection picture to the projector. The techniques involved in correcting the projection picture are conventional means in the related art and are not described further here.
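The "is it rectangular" test can be sketched from the four detected corners of the picture, using the equivalence that a quadrilateral is a rectangle exactly when its diagonals bisect each other and are equal in length (a sketch of the shape check only; the actual correction would use a conventional keystone/homography warp):

```python
import math


def is_rectangle(corners, tol=1e-6):
    """Check whether four corners, given in order around the quadrilateral,
    form a rectangle: the diagonals must share a midpoint and have equal
    length."""
    (ax, ay), (bx, by), (cx, cy), (dx, dy) = corners
    mid_ac = ((ax + cx) / 2, (ay + cy) / 2)       # midpoint of diagonal AC
    mid_bd = ((bx + dx) / 2, (by + dy) / 2)       # midpoint of diagonal BD
    same_mid = math.dist(mid_ac, mid_bd) < tol
    same_len = abs(math.dist((ax, ay), (cx, cy))
                   - math.dist((bx, by), (dx, dy))) < tol
    return same_mid and same_len
```

A keystoned picture shows up as a trapezoid, which fails the equal-diagonal test.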
The embodiment also discloses an automatic following projection device, which is a server, and includes an acquisition module 301, a processing module 302 and an output module 303, wherein:
the acquisition module 301 is configured to acquire, in real time, a line-of-sight falling point of a user, where the line-of-sight falling point is the point where the line of sight intersects the projection wall surface when the user views the projection wall surface; and
acquire first position information of the sight falling point; and
acquire second position information of a projection point of the projector in real time, wherein the projection point is the central point of the projection picture when the projector projects the picture onto the projection wall surface;
a processing module 302, configured to calculate a first distance between the line-of-sight falling point and the projection point based on the first position information and the second position information; and
determine the magnitude relation between the first distance and a preset first threshold value, and if the first distance is smaller than or equal to the first threshold value, set the current position of the projector as a projection position;
and an output module 303 for sending a position adjustment signal to the projector to move the projector to the projection position.
In one possible implementation manner, the server is configured to establish a first coordinate system with a center point of the projected wall surface as an origin, a horizontal direction of the projected wall surface as an X axis, and a vertical direction of the projected wall surface as a Y axis;
and placing the line-of-sight falling point into a first coordinate system, acquiring a first coordinate of the line-of-sight falling point in the first coordinate system, and setting the first coordinate as first position information.
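As a minimal sketch of the first coordinate system described above (the function name, and the assumption that wall-plane measurements arrive as (x, y) pairs in a shared unit, are illustrative):

```python
def to_wall_coords(point, wall_center):
    """Express a point on the projection wall in the first coordinate
    system: origin at the wall center, X axis horizontal, Y axis vertical.

    Both arguments are (x, y) pairs measured in the same wall-plane
    units; the result is the first (or second) coordinate of the patent.
    """
    return (point[0] - wall_center[0], point[1] - wall_center[1])
```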
In one possible implementation manner, the server is configured to put the projection point into the first coordinate system, obtain a second coordinate of the projection point in the first coordinate system, and set the second coordinate as the second position information.
In a possible embodiment, the server is configured to calculate the first distance based on the first coordinate and the second coordinate.
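With both points expressed in the first coordinate system, the first distance reduces to the Euclidean distance between the two coordinates. A sketch (function names are illustrative; the threshold comparison mirrors the judgment in the embodiment):

```python
import math

def first_distance(first_coord, second_coord):
    """Euclidean distance between the line-of-sight falling point and
    the projection point, both given in the first coordinate system."""
    return math.dist(first_coord, second_coord)

def at_projection_position(first_coord, second_coord, first_threshold):
    """The projector's current position is accepted as the projection
    position when the first distance does not exceed the first threshold."""
    return first_distance(first_coord, second_coord) <= first_threshold
```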
In one possible implementation, the server is configured to acquire an eye image of the user in real time;
identifying the eye image and acquiring movement information of the pupil;
obtaining the observation line of sight of the user based on the pupil movement information;
and extending the observation sight until the observation sight is intersected with the projection wall surface to obtain an intersection point, and setting the intersection point as a sight falling point.
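Extending the observation line of sight to the wall is a ray–plane intersection. A sketch under the simplifying assumption that the projection wall is the plane z = 0 and the eye sits on its positive-z side (this modelling choice and the function name are not from the patent):

```python
def gaze_landing_point(eye_pos, gaze_dir):
    """Extend the observation line of sight until it meets the wall.

    The projection wall is modelled as the plane z = 0; eye_pos is an
    (x, y, z) position with z > 0 and gaze_dir is a 3-D direction
    vector.  Returns the (x, y) intersection on the wall, or None if
    the gaze does not point toward the wall.
    """
    ex, ey, ez = eye_pos
    dx, dy, dz = gaze_dir
    if dz >= 0:
        return None  # looking away from (or parallel to) the wall
    t = -ez / dz  # ray parameter at which the sight line crosses z = 0
    return (ex + t * dx, ey + t * dy)
```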
In one possible implementation manner, the server is configured to acquire the angle of the projection angle and determine the magnitude relation between the angle of the projection angle and a preset second threshold, where the projection angle is the included angle between the user's line of sight and the projection line;
and if the angle of the projection angle is smaller than or equal to the second threshold value, setting the current position of the projector as the projection position.
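The projection-angle check can be sketched as the angle between two direction vectors (representing the line of sight and the projection line as 3-D vectors is an assumption; the patent does not fix how the angle is measured):

```python
import math

def projection_angle_deg(sight_vec, projection_vec):
    """Angle in degrees between the user's line of sight and the
    projector's projection line, both given as 3-D direction vectors."""
    dot = sum(a * b for a, b in zip(sight_vec, projection_vec))
    norm = math.hypot(*sight_vec) * math.hypot(*projection_vec)
    # Clamp to [-1, 1] to guard acos against floating-point drift.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
```

The current position would then be kept as the projection position whenever this angle is less than or equal to the second threshold.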
In one possible implementation manner, the server is configured to obtain a duration that the line-of-sight falling point is located in a first area, where the first area is a circular area with a first position as a center and a preset length as a radius;
judging whether the duration is greater than a preset third threshold value, and if so, determining that the first position is the position of the line-of-sight falling point.
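The dwell-time judgment above can be sketched over a stream of time-stamped gaze samples. The sampling representation and the reset-on-exit behaviour are illustrative assumptions:

```python
import math

def gaze_is_stable(samples, first_position, radius, third_threshold):
    """Decide whether the line-of-sight falling point has stayed inside
    the first area (a circle of the given radius around first_position)
    for longer than the third threshold.

    samples: time-ordered list of (timestamp, (x, y)) gaze observations.
    """
    dwell_start = None
    last_t = None
    for t, point in samples:
        if math.dist(point, first_position) <= radius:
            if dwell_start is None:
                dwell_start = t  # just entered the first area
            last_t = t
        else:
            dwell_start = None  # leaving the circle resets the timer
    if dwell_start is None:
        return False
    return (last_t - dwell_start) > third_threshold
```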
In one possible implementation manner, the server is used for acquiring a projection picture of the projector on a projection wall surface;
identifying the shape of the projection picture, and judging whether the shape of the projection picture is rectangular;
if the shape of the projection screen is not rectangular, the shape of the projection screen is corrected to be rectangular.
The embodiment also discloses an electronic device, referring to fig. 3, the electronic device may include: at least one processor 401, at least one communication bus 402, a user interface 403, a network interface 404, at least one memory 405.
The communication bus 402 is used to enable communication connections among these components.
The user interface 403 may include a display screen (Display) and a camera (Camera); optionally, the user interface 403 may further include a standard wired interface and a standard wireless interface.
The network interface 404 may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface).
The processor 401 may include one or more processing cores. The processor 401 connects the various parts of the entire server using various interfaces and lines, and performs the various functions of the server and processes data by running or executing the instructions, programs, code sets or instruction sets stored in the memory 405 and invoking the data stored in the memory 405. Optionally, the processor 401 may be implemented in at least one hardware form of digital signal processing (DSP), field-programmable gate array (FPGA) and programmable logic array (PLA). The processor 401 may integrate one or a combination of a central processing unit (CPU), a graphics processing unit (GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface and application programs; the GPU is responsible for rendering and drawing the content to be displayed by the display screen; and the modem handles wireless communication. It can be understood that the modem may also not be integrated into the processor 401 and may instead be implemented by a separate chip.
The memory 405 may include a random access memory (RAM) or a read-only memory (ROM). Optionally, the memory 405 includes a non-transitory computer-readable storage medium. The memory 405 may be used to store instructions, programs, code sets or instruction sets. The memory 405 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function and an image playing function), instructions for implementing the various method embodiments described above, and the like; the data storage area may store the data involved in the above method embodiments. Optionally, the memory 405 may also be at least one storage device located remotely from the processor 401. As shown, the memory 405, as a computer storage medium, may include an operating system, a network communication module, a user interface 403 module and an application program of the automatic following projection method.
In the electronic device shown in fig. 3, the user interface 403 is mainly used to provide an input interface for the user and acquire the data input by the user, and the processor 401 may be used to invoke the application program of the automatic following projection method stored in the memory 405; when executed by the one or more processors 401, the application program causes the electronic device to perform the method of one or more of the embodiments described above.
It should be noted that, for simplicity of description, the foregoing method embodiments are all expressed as a series of action combinations, but it should be understood by those skilled in the art that the present application is not limited by the order of actions described, as some steps may be performed in other order or simultaneously in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily required in the present application.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to related descriptions of other embodiments.
In the several embodiments provided herein, it should be understood that the disclosed apparatus may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division of units is merely a division by logical function, and there may be other divisions in actual implementation, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling, direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some service interfaces, devices or units, and may be electrical or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable memory. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned memory includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a magnetic disk or an optical disk.
The foregoing is merely exemplary embodiments of the present disclosure and is not intended to limit the scope of the present disclosure. That is, equivalent changes and modifications are contemplated by the teachings of this disclosure, which fall within the scope of the present disclosure. Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure. This application is intended to cover any adaptations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a scope and spirit of the disclosure being indicated by the claims.

Claims (10)

1. An automatic following projection method, wherein the method is applied to a server, the method comprising:
acquiring a sight line falling point of a user in real time, wherein the sight line falling point is a point at which the sight line intersects with a projection wall surface when the user views the projection wall surface;
acquiring first position information of the sight falling point;
acquiring second position information of a projection point of a projector in real time, wherein the projection point is a central point of a projection picture when the projector projects the projection picture to the projection wall surface;
calculating a first distance between the line-of-sight landing point and the projection point based on the first position information and the second position information;
determining the magnitude relation between the first distance and a preset first threshold, and if the first distance is less than or equal to the first threshold, setting the current position of the projector as a projection position;
and sending a position adjustment signal to the projector so as to enable the projector to move to the projection position.
2. The method of claim 1, wherein the obtaining the first position information of the line-of-sight landing point specifically includes:
establishing a first coordinate system, wherein the first coordinate system takes the central point of the projection wall surface as an original point, takes the horizontal direction of the projection wall surface as an X axis and takes the vertical direction of the projection wall surface as a Y axis;
and acquiring a first coordinate of the line-of-sight falling point in the first coordinate system, wherein the first coordinate is the first position information.
3. The method according to claim 2, wherein the obtaining the second position information of the projection point of the projector specifically includes:
and acquiring a second coordinate of the projection point in the first coordinate system, wherein the second coordinate is the second position information.
4. An automatic following projection method according to claim 3, wherein the calculating the first distance between the line-of-sight landing point and the projection point based on the first position information and the second position information specifically includes:
the first distance is calculated based on the first coordinate and the second coordinate.
5. The automatic following projection method according to claim 1, wherein the acquiring, in real time, the line of sight landing point of the user specifically includes:
acquiring an eye image of a user in real time;
identifying the eye images to obtain movement information of pupils;
obtaining an observation line of sight of the user based on the pupil movement information;
and extending the observation sight until the observation sight is intersected with the projection wall surface to obtain an intersection point, and setting the intersection point as the sight falling point.
6. The automatic follow-up projection method of claim 2, wherein after the determining that the first distance is less than or equal to the first threshold, the method further comprises:
acquiring the angle of a projection angle, and determining the magnitude relation between the angle of the projection angle and a preset second threshold, wherein the projection angle is the included angle between the line of sight of the user and a projection line;
and if the angle of the projection angle is smaller than or equal to the second threshold value, setting the current position of the projector as a projection position.
7. The automatic follow-up projection method according to claim 1, wherein after the first position information of the line-of-sight landing point is obtained, the method further comprises:
acquiring the time length of the line-of-sight falling point in a first area, wherein the first area is a circular area taking a first position as a circle center and taking a preset length as a radius;
determining whether the duration is greater than a preset third threshold, and if the duration is greater than the third threshold, determining that the first position is the position of the line-of-sight landing point.
8. The automatic following projection method according to claim 1, wherein after the position where the projector is currently located is set as a projection position, the method further comprises:
obtaining a projection picture of the projector on the projection wall surface;
identifying the shape of the projection picture, and judging whether the shape of the projection picture is rectangular;
and correcting the shape of the projection picture to be rectangular if the shape of the projection picture is not rectangular.
9. An automatic following projection device, characterized in that the device is a server comprising an acquisition module (301), a processing module (302) and an output module (303), wherein:
the acquisition module (301) is configured to acquire, in real time, a line-of-sight landing point of a user, where the line-of-sight landing point is the point at which the line of sight intersects a projection wall surface when the user views the projection wall surface; and,
acquire first position information of the line-of-sight landing point; and,
acquire second position information of a projection point of a projector in real time, wherein the projection point is a central point of a projection picture when the projector projects the projection picture onto the projection wall surface;
the processing module (302) is configured to calculate a first distance between the line-of-sight landing point and the projection point based on the first position information and the second position information; and,
determine the magnitude relation between the first distance and a preset first threshold, and if the first distance is less than or equal to the first threshold, set the current position of the projector as a projection position;
and the output module (303) is configured to send a position adjustment signal to the projector to move the projector to the projection position.
10. An electronic device, comprising a processor (401), a memory (405), a user interface (403) and a network interface (404), wherein the memory (405) is configured to store instructions, the user interface (403) and the network interface (404) are configured to communicate with other devices, and the processor (401) is configured to execute the instructions stored in the memory (405) to cause the electronic device to perform the method of any one of claims 1-8.
CN202310394619.4A 2023-04-07 2023-04-07 Automatic following projection method and device and electronic equipment Pending CN116489326A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310394619.4A CN116489326A (en) 2023-04-07 2023-04-07 Automatic following projection method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN116489326A true CN116489326A (en) 2023-07-25

Family

ID=87214867

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310394619.4A Pending CN116489326A (en) 2023-04-07 2023-04-07 Automatic following projection method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN116489326A (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10161802A (en) * 1996-11-29 1998-06-19 Canon Inc Method and device for image display
JP2010153983A (en) * 2008-12-24 2010-07-08 Panasonic Electric Works Co Ltd Projection type video image display apparatus, and method therein
KR20110096372A (en) * 2010-02-22 2011-08-30 에스케이텔레콤 주식회사 Method for providing user interface of terminal with projecting function
US20150195479A1 (en) * 2014-01-06 2015-07-09 Kabushiki Kaisha Toshiba Image processor, image processing method, and image projector
CN110764342A (en) * 2019-11-19 2020-02-07 四川长虹电器股份有限公司 Intelligent projection display device and adjustment method of projection picture thereof
CN112540676A (en) * 2020-12-15 2021-03-23 广州舒勇五金制品有限公司 Projection system-based variable information display device
CN114286066A (en) * 2021-12-23 2022-04-05 深圳市火乐科技发展有限公司 Projection correction method, projection correction device, storage medium and projection equipment
CN114727077A (en) * 2022-03-04 2022-07-08 乐融致新电子科技(天津)有限公司 Projection method, apparatus, device and storage medium
CN115633159A (en) * 2022-07-29 2023-01-20 深圳市当智科技有限公司 Projection method, projection system, and storage medium
WO2023005800A1 (en) * 2021-07-30 2023-02-02 歌尔科技有限公司 Display calibration method and apparatus for head-mounted device, head-mounted device, and storage medium
WO2023029277A1 (en) * 2021-09-01 2023-03-09 广景视睿科技(深圳)有限公司 Projection method, apparatus and device, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination