CN113973195A - Projection equipment and correction method thereof - Google Patents

Projection equipment and correction method thereof

Info

Publication number
CN113973195A
CN113973195A (application number CN202111275147.8A)
Authority
CN
China
Prior art keywords
angle offset
data
initial
actual
attitude
Prior art date
Legal status
Pending
Application number
CN202111275147.8A
Other languages
Chinese (zh)
Inventor
刘鹏鹏
陈许
姜大鹏
Current Assignee
Qingdao Hisense Laser Display Co Ltd
Original Assignee
Qingdao Hisense Laser Display Co Ltd
Priority date
Filing date
Publication date
Application filed by Qingdao Hisense Laser Display Co Ltd filed Critical Qingdao Hisense Laser Display Co Ltd
Priority to CN202111275147.8A priority Critical patent/CN113973195A/en
Publication of CN113973195A publication Critical patent/CN113973195A/en
Priority to PCT/CN2022/101830 priority patent/WO2023071256A1/en
Pending legal-status Critical Current


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141 Constructional details thereof
    • H04N9/315 Modulator illumination systems
    • H04N9/3161 Modulator illumination systems using laser light sources
    • H04N9/3179 Video signal processing therefor
    • H04N9/3185 Geometric adjustment, e.g. keystone or convergence

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Geometry (AREA)
  • Optics & Photonics (AREA)
  • Projection Apparatus (AREA)

Abstract

The application provides a projection device and a correction method thereof, relating to the field of projection display and used for automatically correcting an image to be projected when the projection device moves. The projection device includes: an attitude detection sensor for detecting actual attitude data of the projection device, the actual attitude data reflecting the current attitude of the projection device; and a controller connected to the attitude detection sensor, the controller being configured to: acquire the actual attitude data; in the case that the actual attitude data does not match initial attitude data (the initial attitude data being the data detected by the attitude detection sensor when the projection device is in an initial attitude), determine the roll angle offset, the pitch angle offset and the yaw angle offset of the projection device switched from the initial attitude to the current attitude according to the actual attitude data and the initial attitude data; and correct the image to be projected according to the roll angle offset, the pitch angle offset and the yaw angle offset.

Description

Projection equipment and correction method thereof
Technical Field
The present disclosure relates to the field of projection display, and more particularly, to a projection apparatus and a correction method thereof.
Background
With the rapid development of display technology, projection equipment is used ever more widely, bringing great convenience to people's life, study and work. At present, laser televisions have appeared on the television market. A laser television consists of a laser projector adopting a reflective ultra-short-throw projection technology and a projection screen, and can present a good television picture even in a bright environment.
However, whether a laser television or any other projection device, position movement is inevitable in daily use. When the projection device moves, the projected image changes so that it is no longer rectangular, which affects the viewing effect. The existing way to adjust the projected image is generally manual adjustment, whose steps are complicated and require a certain expertise, so adjustment is difficult for the user and leads to a poor experience.
Disclosure of Invention
The embodiment of the application provides projection equipment and a correction method thereof, which are used for automatically correcting a to-be-projected image after the projection equipment moves.
In a first aspect, an embodiment of the present application provides a projection device, including: an attitude detection sensor for detecting actual attitude data of the projection device, the actual attitude data reflecting the current attitude of the projection device; and a controller connected to the attitude detection sensor, the controller being configured to: acquire the actual attitude data; in the case that the actual attitude data does not match initial attitude data, determine a roll angle offset, a pitch angle offset and a yaw angle offset of the projection device switched from the initial attitude to the current attitude according to the actual attitude data and the initial attitude data, the initial attitude data being the data detected by the attitude detection sensor when the projection device is in an initial attitude; the roll angle offset is the angle by which the projection device is deflected relative to the initial attitude about a first coordinate axis, the first coordinate axis being the projection direction, in the horizontal plane, of the projection device in the initial attitude; the pitch angle offset is the angle by which the projection device is deflected relative to the initial attitude about a second coordinate axis, the second coordinate axis being the direction in the horizontal plane perpendicular to the first coordinate axis; the yaw angle offset is the angle by which the projection device is deflected relative to the initial attitude about a third coordinate axis, the third coordinate axis being the vertical direction; and correct the image to be projected according to the roll angle offset, the pitch angle offset and the yaw angle offset.
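The controller logic of the first aspect can be sketched in a few lines of Python. This is an illustrative toy, not the patent's implementation: attitudes are represented directly as (roll, pitch, yaw) tuples in degrees, whereas the device derives them from sensor data, and the matching tolerance is an assumed value.

```python
def angle_offsets(actual, initial):
    """Difference of two (roll, pitch, yaw) attitudes, in degrees."""
    return tuple(a - i for a, i in zip(actual, initial))

def correct_if_moved(actual, initial, tol=0.1):
    """Return the (roll, pitch, yaw) offsets to correct for, or None if
    the actual attitude still matches the initial attitude."""
    offsets = angle_offsets(actual, initial)
    if all(abs(v) < tol for v in offsets):
        return None  # attitude unchanged: no correction needed
    return offsets   # these drive the image-correction step

# Device rolled 2 deg and yawed 5 deg away from its initial attitude.
print(correct_if_moved((2.0, 0.0, 5.0), (0.0, 0.0, 0.0)))  # (2.0, 0.0, 5.0)
```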
The projection device provided by the embodiment of the application is provided with an attitude detection sensor; when the actual attitude data detected by the attitude detection sensor does not match the initial attitude data, the projection device has moved. In this case, the controller of the projection device determines the attitude change of the projection device (i.e., the roll angle offset, the pitch angle offset and the yaw angle offset) from the actual attitude data and the initial attitude data, and then automatically corrects the image to be projected according to those offsets. In this way, the projection device can automatically correct the image to be projected according to its attitude change after moving, without requiring manual correction by the user, which preserves a good viewing experience while reducing user operations.
In a second aspect, an embodiment of the present application provides a correction method for a projection device, the method including: acquiring actual attitude data; in the case that the actual attitude data does not match initial attitude data, determining a roll angle offset, a pitch angle offset and a yaw angle offset of the projection device switched from the initial attitude to the current attitude according to the actual attitude data and the initial attitude data, the initial attitude data being the data detected by the attitude detection sensor when the projection device is in an initial attitude; the roll angle offset is the angle by which the projection device is deflected relative to the initial attitude about a first coordinate axis, the first coordinate axis being the projection direction, in the horizontal plane, of the projection device in the initial attitude; the pitch angle offset is the angle by which the projection device is deflected relative to the initial attitude about a second coordinate axis, the second coordinate axis being the direction in the horizontal plane perpendicular to the first coordinate axis; the yaw angle offset is the angle by which the projection device is deflected relative to the initial attitude about a third coordinate axis, the third coordinate axis being the vertical direction; and correcting the image to be projected according to the roll angle offset, the pitch angle offset and the yaw angle offset.
In a third aspect, a projection system is provided, which includes: a projection screen and a projection device as provided in the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium comprising computer-executable instructions that, when executed on a computer, cause the computer to perform the correction method according to the second aspect.
In a fifth aspect, the present application provides a computer program product comprising instructions which, when run on correction means of a projection device, cause the correction means of the projection device to perform the correction method according to the second aspect.
The beneficial effects described in the second aspect to the fifth aspect in the present application may refer to the beneficial effect analysis of the first aspect, and are not described herein again.
Drawings
Fig. 1(a) is a schematic structural diagram of a projection apparatus according to an embodiment of the present disclosure;
fig. 1(b) is a schematic composition diagram of a projection apparatus provided in an embodiment of the present application;
FIG. 2 is a schematic diagram of a coordinate system provided in an embodiment of the present application;
FIG. 3 is a schematic diagram of a roll angle offset of a projection apparatus according to an embodiment of the present disclosure;
fig. 4 is a schematic diagram of a pitch angle offset of a projection apparatus according to an embodiment of the present disclosure;
FIG. 5 is a schematic view of a yaw angle offset of a projection apparatus according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of another projection apparatus provided in an embodiment of the present application;
fig. 7 is a schematic structural diagram of another projection apparatus provided in an embodiment of the present application;
fig. 8 is a schematic view of a projection screen according to an embodiment of the present disclosure;
fig. 9 is a schematic view of another projection screen provided in the embodiment of the present application;
FIG. 10 is a schematic view of another projection screen provided in the embodiment of the present application;
fig. 11 is a schematic structural diagram of another projection apparatus provided in an embodiment of the present application;
fig. 12 is a schematic flowchart of a correction method of a projection apparatus according to an embodiment of the present disclosure;
fig. 13 is a schematic flowchart of another correction method of a projection apparatus according to an embodiment of the present disclosure;
fig. 14 is a schematic flowchart of yet another correction method of a projection apparatus according to an embodiment of the present disclosure;
fig. 15 is a schematic view of an application scenario in which a projection system interacts with a control device and a server according to an embodiment of the present application;
FIG. 16 is a schematic diagram of a control device according to an embodiment of the present disclosure;
fig. 17 is a schematic application layer diagram of a projection device according to an embodiment of the present disclosure.
Detailed Description
To facilitate understanding by those skilled in the art, terms referred to in the embodiments of the present application are described first.
An acceleration sensor: an applied force deforms a sensitive element inside the sensor; the deformation is measured and converted by an associated circuit into a voltage output, yielding the corresponding acceleration signal. In a static state, the acceleration sensor always senses gravity, so the reading on one axis is 1 g (i.e., 9.8 m/s²). When the object moves, the readings on the X, Y and Z axes all change; using the static relation √(X² + Y² + Z²) = 9.8 m/s², the motion components of the object on the three axes can be calculated, so that the motion of the object can be determined.
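The static-state relation above can be illustrated with a short check. This is a sketch under the stated assumption of a 9.8 m/s² gravity magnitude and an arbitrary tolerance; it is not sensor firmware:

```python
import math

G = 9.8  # gravitational acceleration, m/s^2

def is_static(ax, ay, az, tol=0.05):
    """A device is (approximately) at rest when the magnitude of its
    three-axis acceleration vector equals gravitational acceleration."""
    return abs(math.sqrt(ax**2 + ay**2 + az**2) - G) < tol

print(is_static(0.0, 0.0, 9.8))   # True: resting flat, one axis carries all of g
print(is_static(1.5, 2.4, 5.9))   # False: magnitude differs from 9.8, device is moving
```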
A geomagnetic sensor: works like a compass; by sensing changes in the distribution of the geomagnetic field as the measured object moves within it, it indicates information such as the object's attitude and motion angle.
A gyro sensor: a device for measuring the rotation angle or angular velocity of an object relative to inertial space; the angle through which the object has moved can be determined by integrating the measured angular velocity values.
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only some of the embodiments of the present application, not all of them. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments given herein without creative effort shall fall within the protection scope of the present application.
It should be noted that in the embodiments of the present application, words such as "exemplary" or "for example" are used to indicate examples, illustrations or explanations. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
For clarity in describing the technical solutions of the embodiments of the present application, words such as "first" and "second" are used to distinguish between identical or similar items having substantially the same function and effect. Those skilled in the art will understand that "first", "second" and the like do not limit quantity or execution order.
As described in the background, whether a laser television or any other projection device, position movement is inevitable in daily use. When the projection device moves, the projected image changes so that it is no longer rectangular, which affects the viewing effect. The existing way to adjust the projected image is generally manual adjustment, whose steps are complicated and require a certain expertise, so adjustment is difficult for the user and leads to a poor experience.
In view of the above technical problem, an embodiment of the present application provides a projection device having an attitude detection sensor; when the actual attitude data detected by the attitude detection sensor does not match the initial attitude data, the projection device has moved. In this case, the controller of the projection device determines the attitude change of the projection device (i.e., the roll angle offset, the pitch angle offset and the yaw angle offset) from the actual attitude data and the initial attitude data, and then automatically corrects the image to be projected according to those offsets. In this way, the projection device can automatically correct the image to be projected according to its attitude change after moving, without requiring manual correction by the user, which preserves a good viewing experience while reducing user operations.
The projection device in the embodiment of the present application may refer to a device having a projection function.
Optionally, the projection device may be a laser projection device, and may also be a Light Emitting Diode (LED) projection device.
Optionally, the projection device may be a rectangular parallelepiped, a prismatic shape, a spherical shape, a desk lamp shape, or the like, and the embodiment of the present application is not limited as long as the projection device has a projection function.
Fig. 1(a) is a schematic structural diagram of a projection apparatus 100 according to an embodiment of the present disclosure. As shown in fig. 1(a), the projection apparatus 100 includes: an attitude detection sensor 110 and a controller 120 (not shown in fig. 1 (a)).
The pose detection sensor 110 is configured to detect actual pose data of the projection device, where the actual pose data is configured to reflect a current pose of the projection device.
Alternatively, as shown in fig. 1(b), the controller 120 is connected to the gesture detection sensor 110, and is configured to: acquiring actual attitude data; under the condition that the actual attitude data is not matched with the initial attitude data, determining a roll angle offset, a pitch angle offset and a yaw angle offset of the projection equipment switched from the initial attitude to the current attitude according to the actual attitude data and the initial attitude data; and correcting the image to be projected according to the roll angle offset, the pitch angle offset and the yaw angle offset.
The initial attitude data is data detected by the attitude detection sensor when the projection equipment is in an initial attitude.
Optionally, the initial posture of the projection apparatus may be a posture after the user or a professional manually corrects the projection apparatus according to the projection environment and the actual projection image. When the projection equipment is in the initial posture, the projection direction of the projection equipment is aligned with the projection screen, so that the projection picture of the projection equipment is an ideal rectangular projection picture.
It should be appreciated that the projection device has different initial poses under different projection environments. In the same projection environment, a user only needs to perform manual correction once after the projection device is started for the first time to enable the projection device to be located at the initial posture, and then the initial posture of the projection device is unchanged no matter how many times the projection device is started or stopped. When the projection environment of the projection device changes, the initial posture of the projection device also changes.
For example, if a user projects with the projection device in room A, then after starting the device for the first time the user performs one manual correction so that the device is in attitude 1, which is the initial attitude of the device in room A; after the device is shut down and started again in room A, the user does not need to perform manual correction again to readjust the initial attitude. However, if the user moves the projection device to room B for projection, a manual correction is still required after the device is first started there, so that the device is in attitude 2, which is the initial attitude of the projection device in room B.
Optionally, after the projection device is subjected to the initial correction and is in the initial posture, the three-dimensional coordinate system is established with the initial posture of the projection device as a reference. Illustratively, as shown in fig. 2, the first coordinate axis is a projection direction of the projection device in the initial posture in a horizontal plane, the second coordinate axis is a direction perpendicular to the first coordinate axis in the horizontal plane, and the third coordinate axis is a vertical direction. For convenience of description, the first coordinate axis is hereinafter referred to as an X-axis, the second coordinate axis is hereinafter referred to as a Y-axis, and the third coordinate axis is hereinafter referred to as a Z-axis.
The controller 120 determines the angular offset of the projection device in each case as described in detail below.
First, roll angle offset
The roll angle offset is the angle by which the projection device is deflected relative to the initial attitude about the first coordinate axis. As shown in fig. 3, the coordinate system formed by the X-axis, Y-axis and Z-axis is the coordinate system established when the projection device is in the initial attitude. When the projection device rolls, it rotates about the X-axis; that is, the YZ plane rotates about the X-axis into the Y1Z1 plane. The included angle α between the Y-axis and the Y1-axis (or between the Z-axis and the Z1-axis) is the roll angle offset of the projection device.
Second, pitch angle offset
The pitch angle offset is the angle by which the projection device is deflected relative to the initial attitude about the second coordinate axis. As shown in fig. 4, the coordinate system formed by the X-axis, Y-axis and Z-axis is the coordinate system established when the projection device is in the initial attitude. When the projection device pitches, it rotates about the Y-axis; that is, the XZ plane rotates about the Y-axis into the X1Z1 plane. The included angle β between the X-axis and the X1-axis (or between the Z-axis and the Z1-axis) is the pitch angle offset of the projection device.
Third, yaw angle offset
The yaw angle offset is the angle by which the projection device is deflected relative to the initial attitude about the third coordinate axis. As shown in fig. 5, the coordinate system formed by the X-axis, Y-axis and Z-axis is the coordinate system established when the projection device is in the initial attitude. When the projection device yaws, it rotates about the Z-axis; that is, the XY plane rotates about the Z-axis into the X1Y1 plane. The included angle γ between the X-axis and the X1-axis (or between the Y-axis and the Y1-axis) is the yaw angle offset of the projection device.
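The three deflection angles defined above can be checked numerically: rotating an initial coordinate axis about the corresponding rotation axis and measuring the angle between the original axis and its rotated image recovers the offset. A minimal Python sketch (illustrative only) for the roll case:

```python
import math

def rotate_about_x(v, angle):
    """Rotate vector v = (x, y, z) about the X-axis by `angle` radians."""
    x, y, z = v
    c, s = math.cos(angle), math.sin(angle)
    return (x, y * c - z * s, y * s + z * c)

def angle_between(u, v):
    """Angle in radians between two nonzero 3-D vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    return math.acos(dot / (nu * nv))

# Rolling the device by 0.3 rad about X turns the Y-axis into the Y1-axis;
# the angle between Y and Y1 recovers the 0.3 rad roll angle offset.
y1 = rotate_about_x((0.0, 1.0, 0.0), 0.3)
print(angle_between((0.0, 1.0, 0.0), y1))
```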
As one possible implementation of the projection apparatus, as shown in fig. 6, the posture detection sensor 110 may include an acceleration sensor 111 and a geomagnetic sensor 112.
The acceleration sensor 111 is a gravitational acceleration sensor; using the relation √(X² + Y² + Z²) = 9.8 m/s², it can detect the components of gravitational acceleration of the projection device on the three coordinate axes. The geomagnetic sensor 112 is used to detect the yaw angle of the projection device.
Optionally, the actual attitude data includes actual acceleration data detected by the acceleration sensor 111 and actual geomagnetic data detected by the geomagnetic sensor 112, and the initial attitude data includes initial acceleration data detected by the acceleration sensor 111 and initial geomagnetic data detected by the geomagnetic sensor 112.
Further, when the controller 120 receives the actual acceleration data and the actual geomagnetic data transmitted by the acceleration sensor 111 and the geomagnetic sensor 112, the actual acceleration data is compared with the initial acceleration data, and the actual geomagnetic data is compared with the initial geomagnetic data. If the actual acceleration data is different from the initial acceleration data and/or the actual geomagnetic data is different from the initial geomagnetic data, it may be determined that the actual attitude data is not matched with the initial attitude data, so as to determine that the current attitude of the projection apparatus is changed compared with the initial attitude.
In general, the initial acceleration data of the projection device in the initial attitude is AX = 0 m/s², AY = 0 m/s², AZ = 9.8 m/s², and the initial geomagnetic data is determined according to the actual installation situation.
For example, suppose the initial geomagnetic data of the projection device is 15° (east of north). When the actual acceleration data received by the controller 120 is AX = 1.5 m/s², AY = 2.4 m/s², AZ = 5.9 m/s², and the actual geomagnetic data is 25° (east of north), the actual attitude data of the projection device does not match the initial attitude data, and the projection picture of the projection device has shifted.
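The comparison the controller performs here can be sketched as follows; the tolerance value and the function name are illustrative assumptions, not specified in the patent:

```python
def attitude_changed(actual_accel, initial_accel, actual_mag, initial_mag, tol=0.1):
    """The attitude is considered changed when either the acceleration
    components or the geomagnetic heading differ from the stored initial
    values (the tolerance is illustrative; the patent does not give one)."""
    accel_moved = any(abs(a - i) > tol for a, i in zip(actual_accel, initial_accel))
    heading_moved = abs(actual_mag - initial_mag) > tol
    return accel_moved or heading_moved

# Initial attitude: device flat (0, 0, 9.8) m/s^2, heading 15 deg east of north.
print(attitude_changed((1.5, 2.4, 5.9), (0.0, 0.0, 9.8), 25.0, 15.0))  # True
```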
In some embodiments, in the case that the actual acceleration data is different from the initial acceleration data, the controller 120 is specifically configured to: and determining the roll angle offset and the pitch angle offset according to the actual acceleration data and the initial acceleration data.
Illustratively, the roll angle offset of the projection device is calculated according to

α = arctan(aY / aZ)

where aY is the acceleration on the Y-axis measured by the acceleration sensor and aZ is the acceleration on the Z-axis measured by the acceleration sensor. The pitch angle offset of the projection device is calculated according to

β = arctan(aX / aZ)

where aX is the acceleration on the X-axis measured by the acceleration sensor and aZ is the acceleration on the Z-axis measured by the acceleration sensor.
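The two arctangent relations can be exercised directly. The sketch below uses atan2 rather than a plain arctangent so that the signs of the axis readings select the correct quadrant, an implementation choice the patent does not address:

```python
import math

def roll_pitch_from_accel(ax, ay, az):
    """Tilt angles, in degrees, from static accelerometer readings:
    roll from (aY, aZ), pitch from (aX, aZ)."""
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(ax, az))
    return roll, pitch

print(roll_pitch_from_accel(0.0, 0.0, 9.8))  # level device: (0.0, 0.0)
```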
In some embodiments, in the case that the actual geomagnetic data is different from the initial geomagnetic data, the controller 120 is specifically configured to: and determining the yaw angle offset according to the actual geomagnetic data and the initial geomagnetic data.
Alternatively, since the geomagnetic sensor 112 detects the yaw angle of the projection device relative to the north pole of the earth, the yaw angle offset is the sum or the difference of the actual geomagnetic data and the initial geomagnetic data, depending on whether the two headings lie on the same side of north.
For example, if the initial geomagnetic data is 20° west of north and the actual geomagnetic data is 15° east of north, the yaw angle offset of the projection device is 35°.
For another example, if the initial geomagnetic data is 20° east of north and the actual geomagnetic data is 30° east of north, the yaw angle offset of the projection device is 10°.
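The two worked examples can be reproduced if headings are encoded as signed degrees relative to north (east positive, west negative, a sign convention assumed here, not stated in the patent):

```python
def yaw_offset(initial_heading, actual_heading):
    """Yaw angle offset as the absolute change between two headings,
    each given in signed degrees relative to north (east > 0, west < 0)."""
    return abs(actual_heading - initial_heading)

print(yaw_offset(-20.0, 15.0))  # 20 deg west -> 15 deg east: 35.0
print(yaw_offset(20.0, 30.0))   # 20 deg east -> 30 deg east: 10.0
```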
As another possible implementation manner of the projection device, as shown in fig. 7, the gesture detection sensor 110 may further include: an acceleration sensor 111, a geomagnetic sensor 112, and a gyro sensor 113.
Among them, the gyro sensor 113 is used to detect the angular velocities of the projection device about the three axes X, Y and Z.
Based on the structure shown in fig. 7, a specific process of determining the angular offset by the projection apparatus will be described in detail below with reference to different scenarios.
In a first scene, the posture of the projection equipment is changed in a starting state.
At this time, the actual attitude data includes actual acceleration data detected by the acceleration sensor 111, actual geomagnetic data detected by the geomagnetic sensor 112, and actual angular velocity data detected by the gyro sensor 113; the initial attitude data includes initial acceleration data detected by the acceleration sensor 111 and initial geomagnetic data detected by the geomagnetic sensor 112.
Alternatively, the controller 120 may compare the actual acceleration data with the initial acceleration data, and compare the actual geomagnetic data with the initial geomagnetic data. If the actual acceleration data is different from the initial acceleration data and/or the actual geomagnetic data is different from the initial geomagnetic data, it may be determined that the actual attitude data is not matched with the initial attitude data, so as to determine that the current attitude of the projection apparatus is changed compared with the initial attitude.
In general, the initial acceleration data of the projection device in the initial attitude is AX = 0 m/s², AY = 0 m/s², AZ = 9.8 m/s²; the initial geomagnetic data is determined according to the actual situation, and the initial angular velocity data is 0.
For example, suppose the initial geomagnetic data of the projection device is 15° (east of north). When the actual acceleration data received by the controller 120 is AX = 1.5 m/s², AY = 2.4 m/s², AZ = 5.9 m/s², the actual geomagnetic data is 25° (east of north), and the actual angular velocity data is ωX = 2 rad/s, ωY = 1.5 rad/s with a nonzero ωZ, the controller 120 detects that the actual attitude data of the projection device does not match the initial attitude data, which shows that the projection picture of the projection device has shifted.
In the event that a change in the pose of the projection device is determined, the controller 120 is specifically configured to: and determining a roll angle offset, a pitch angle offset and a yaw angle offset of the projection equipment switched from the initial attitude to the current attitude according to the actual acceleration data, the actual geomagnetic data, the actual angular velocity data, the initial acceleration data and the initial geomagnetic data.
Specifically, the controller 120 may first determine a first roll angle offset and a first pitch angle offset based on the actual acceleration data and the initial acceleration data. And determining a first yaw angle offset according to the actual geomagnetic data and the initial geomagnetic data. For the specific calculation process, reference may be made to the above description, which is not repeated herein.
The controller 120 also determines a second roll angle offset, a second pitch angle offset, and a second yaw angle offset based on the actual angular velocity data.
In particular, the controller determines the second roll angle offset according to α2 = ωX·tX, where ωX represents the angular velocity of the projection device's rotation about the X-axis as detected by the gyro sensor, and tX represents the duration of that rotation about the X-axis. The second pitch angle offset is determined according to β2 = ωY·tY, where ωY represents the angular velocity of the rotation about the Y-axis as detected by the gyro sensor, and tY represents the duration of that rotation about the Y-axis. The second yaw angle offset is determined according to λ2 = ωZ·tZ, where ωZ represents the angular velocity of the rotation about the Z-axis as detected by the gyro sensor, and tZ represents the duration of that rotation about the Z-axis.
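Because each of these offsets is a constant detected angular velocity multiplied by the rotation time, the gyro-derived second offsets can be sketched directly (function and variable names are illustrative):

```python
def second_offsets(wx, wy, wz, tx, ty, tz):
    """Second roll/pitch/yaw offsets from gyro data: offset = omega * t.

    wx, wy, wz: angular velocities about X/Y/Z (rad/s);
    tx, ty, tz: rotation durations about X/Y/Z (s).
    """
    alpha2 = wx * tx    # second roll angle offset (rad)
    beta2 = wy * ty     # second pitch angle offset (rad)
    lambda2 = wz * tz   # second yaw angle offset (rad)
    return alpha2, beta2, lambda2

print(second_offsets(2.0, 1.5, 0.5, 0.1, 0.2, 0.4))  # (0.2, 0.3, 0.2)
```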
As such, the controller 120 may determine a pitch angle offset based on the first pitch angle offset and the second pitch angle offset; determining a roll angle offset according to the first roll angle offset and the second roll angle offset; and determining the yaw angle offset according to the first yaw angle offset and the second yaw angle offset.
In some embodiments, the roll angle offset is a weighted average of the first roll angle offset and the second roll angle offset. For example, the roll angle offset of the projection device is determined according to the formula α3 = w1·α1 + w2·α2, where the weights satisfy w1 + w2 = 1, α3 represents the roll angle offset of the projection device, α1 represents the first roll angle offset, and α2 represents the second roll angle offset.
In some embodiments, the pitch angle offset is a weighted average of the first pitch angle offset and the second pitch angle offset. It is determined according to the formula β3 = w1·β1 + w2·β2, where the weights satisfy w1 + w2 = 1, β3 represents the pitch angle offset of the projection device, β1 represents the first pitch angle offset, and β2 represents the second pitch angle offset.
In some embodiments, the yaw angle offset is a weighted average of the first yaw angle offset and the second yaw angle offset. It is determined according to the formula λ3 = w1·λ1 + w2·λ2, where the weights satisfy w1 + w2 = 1, λ3 represents the yaw angle offset of the projection device, λ1 represents the first yaw angle offset, and λ2 represents the second yaw angle offset.
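The weighted-average fusion of the first (accelerometer/magnetometer-derived) and second (gyro-derived) offsets can be sketched as follows; the equal weights are an illustrative assumption, since the patent does not specify the weight values:

```python
def fuse(first, second, w1=0.5, w2=0.5):
    """Weighted average of the two offset estimates; w1 + w2 should be 1."""
    return w1 * first + w2 * second

# Example offsets in radians (illustrative values only).
roll = fuse(0.20, 0.24)    # alpha3 from alpha1 and alpha2
pitch = fuse(0.10, 0.14)   # beta3 from beta1 and beta2
yaw = fuse(0.30, 0.26)     # lambda3 from lambda1 and lambda2
print(roll, pitch, yaw)
```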
Scene two: the attitude of the projection device changes while it is powered off.
It should be understood that when the attitude of the projection device changes in the powered-off state, the device does not need to correct the image to be projected at that moment, so it does not need to calculate the angle offsets from the actual attitude data at the time the attitude changes. After the projection device is turned back on, the gyro sensor 113 detects no valid data, because the projection device is no longer moving.
Therefore, if the posture of the projection apparatus is changed in the off state, after the projection apparatus is turned back on, the actual posture data that can be used for calculating the angle offset includes the actual acceleration data detected by the acceleration sensor 111 and the actual geomagnetic data detected by the geomagnetic sensor 112.
Accordingly, after the projection device is turned back on, the controller 120 compares the actual acceleration data with the initial acceleration data, and compares the actual geomagnetic data with the initial geomagnetic data. If the actual acceleration data differs from the initial acceleration data and/or the actual geomagnetic data differs from the initial geomagnetic data, it may be determined that the actual attitude data does not match the initial attitude data, and thus that the current attitude of the projection apparatus has changed compared with the initial attitude.
Further, upon determining that the pose of the projection device has changed, the controller 120 is specifically configured to: determine, according to the actual acceleration data, the actual geomagnetic data, the initial acceleration data, and the initial geomagnetic data, the roll angle offset, the pitch angle offset, and the yaw angle offset of the projection device switching from the initial attitude to the current attitude.
Specifically, the controller 120 determines the roll angle offset and the pitch angle offset according to the actual acceleration data and the initial acceleration data, and determines the yaw angle offset according to the actual geomagnetic data and the initial geomagnetic data. For the specific calculation process, reference may be made to the above description, which is not repeated herein.
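The patent refers back to earlier passages for these calculations. The standard accelerometer-tilt and heading-difference formulas below are one common way to compute roll/pitch from the gravity direction and yaw from the geomagnetic heading change; this is a sketch under that assumption, not necessarily the patent's exact formulas:

```python
import math

def offsets_from_accel_and_mag(ax, ay, az, heading, heading0):
    """Roll/pitch (rad) from the measured gravity vector; yaw (rad) from
    the change in geomagnetic heading (degrees east of north)."""
    roll = math.atan2(ay, az)                     # rotation about the X-axis
    pitch = math.atan2(-ax, math.hypot(ay, az))   # rotation about the Y-axis
    yaw = math.radians(heading - heading0)        # rotation about the Z-axis
    return roll, pitch, yaw

# Example: tilted device, heading drifted from 15 to 25 degrees east of north.
print(offsets_from_accel_and_mag(1.5, 2.4, 5.9, 25.0, 15.0))
```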
Optionally, after calculating the roll angle offset of the projection device, the controller determines the first picture tilt angle α of the projection picture according to the roll angle offset of the projection device.
As shown in fig. 8, the first screen inclination angle α is a rotation angle between the rectangular projection screen of the projection apparatus after the change of the roll and the rectangular projection screen of the projection apparatus at the initial posture.
Specifically, the controller pre-stores a first corresponding relationship between the roll angle offset and the first screen inclination angle α, and the controller obtains the first screen inclination angle α by searching the first corresponding relationship according to the roll angle offset.
Optionally, after calculating the pitch angle offset of the projection device, the controller determines the second screen tilt angle β of the projection screen according to the pitch angle offset of the projection device.
As shown in fig. 9, after the pitch of the projection apparatus changes, the projection picture becomes an isosceles trapezoid, and the second picture inclination angle β is the angle between a leg of the isosceles trapezoid and the vertical edge of the rectangular projection picture in the initial attitude of the projection apparatus.
Specifically, the controller prestores a second corresponding relationship between the pitch angle offset and the second screen inclination angle β, and the controller searches the second corresponding relationship according to the pitch angle offset to obtain the second screen inclination angle β.
Optionally, after calculating the yaw angle offset of the projection device, the controller determines a third picture tilt angle λ of the projection picture according to the yaw angle offset of the projection device.
As shown in fig. 10, after the yaw of the projection apparatus changes, the projection picture becomes a right trapezoid, and the third picture inclination angle λ is the angle between the slanted leg of the right trapezoid and the horizontal edge of the rectangular projection picture in the initial attitude of the projection apparatus.
Specifically, a third corresponding relationship between the yaw angle offset and the third picture inclination angle λ is pre-stored in the controller, and the controller obtains the third picture inclination angle λ by searching the third corresponding relationship according to the yaw angle offset.
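Each pre-stored correspondence can be represented as a lookup table. The sketch below uses hypothetical table values and adds linear interpolation between entries, which the patent does not specify:

```python
import bisect

# Hypothetical correspondence table: angle offset (deg) -> picture tilt (deg).
OFFSETS = [0.0, 5.0, 10.0, 15.0]
TILTS = [0.0, 4.6, 9.1, 13.5]

def lookup_tilt(offset, xs=OFFSETS, ys=TILTS):
    """Return the picture tilt angle for a given angle offset, clamping at
    the table ends and interpolating linearly between entries."""
    if offset <= xs[0]:
        return ys[0]
    if offset >= xs[-1]:
        return ys[-1]
    i = bisect.bisect_right(xs, offset)
    frac = (offset - xs[i - 1]) / (xs[i] - xs[i - 1])
    return ys[i - 1] + frac * (ys[i] - ys[i - 1])

print(lookup_tilt(7.5))  # halfway between 4.6 and 9.1 -> 6.85
```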
Further, from the first picture inclination angle α, the second picture inclination angle β, and the third picture inclination angle λ, the controller can obtain the projection picture of the projection device after the roll, pitch, and yaw changes, correct that picture back to a rectangle, and project the corrected rectangular projection picture onto the projection screen.
Optionally, as shown in fig. 11, the projection apparatus 100 may further include: a wireless fidelity (Wi-Fi) unit 130, an interface unit 140, a power supply unit 150, a light emitting unit 160, a display unit 170, a projection lens 180, a memory 190, and other units. These components may communicate over one or more communication buses or signal lines (not shown in fig. 11). Those skilled in the art will appreciate that the hardware configuration shown in fig. 11 does not constitute a limitation of the projection device, which may include more or fewer components than those shown, may combine some components, or may arrange the components differently.
Wi-Fi is a short-range wireless transmission technology. Through the Wi-Fi unit 130, the projection apparatus 100 can help a user send and receive e-mails, browse web pages, access streaming media, and so on, providing the user with wireless broadband Internet access.
The interface unit 140 provides various interfaces for external input/output devices (e.g., a keyboard, a mouse, an external display, an external memory, a SIM card). For example, the projection apparatus may connect to a mouse or a display through a Universal Serial Bus (USB) interface, connect to a subscriber identity module (SIM) card provided by a telecommunications carrier through the metal contacts of the SIM card slot, and communicate with other terminals through the interfaces of the Wi-Fi unit 130, a Near Field Communication (NFC) device, a Bluetooth module, and the like.
The power supply unit 150 is configured to supply power to various components of the projection device 100, such as a battery and a power management chip, where the battery may be logically connected to the controller 120 through the power management chip, so as to implement functions of managing charging, discharging, and power consumption through the power supply unit 150.
The light emitting unit 160 provides illumination that meets the display requirements of the display unit in this embodiment. Specifically, the light emitting unit 160 may be an organic light-emitting diode (OLED) pixel; an OLED is self-emissive. The display unit 170 displays the image received by the projection apparatus, and the content displayed on the display unit 170 is projected onto the projection screen through the projection lens 180.
Memory 190 may be used to store software programs and data. The controller 120 performs the various functions of the projection apparatus 100 and processes data by running the software programs and data stored in the memory 190. The memory 190 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. The memory 190 stores an operating system that enables the projection device 100 to run, and may also store various application programs as well as code for executing the correction method of the projection device provided in the embodiments of the present application.
Alternatively, the projection device may interact with other devices, for example receiving an image to be projected sent by another terminal or a base station. The projection device 100 may further include an RF circuit 1100, which may be used to connect the projection device with other terminals or base stations to receive and transmit signals; received data may be passed to the controller 120 for processing, and uplink data may be transmitted to the base station. Typically, the RF circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like.
Optionally, the projection device 100 may further include an audio circuit, a speaker, a microphone, bluetooth, a Near Field Communication (NFC) device, and the like, which are not described herein again.
The projection device provided by the embodiments of the present application is equipped with an attitude detection sensor; when the actual attitude data detected by the attitude detection sensor does not match the initial attitude data, the projection device has moved. In this case, the controller of the projection device determines the attitude change of the projection device (i.e., the roll angle offset, the pitch angle offset, and the yaw angle offset) based on the actual attitude data and the initial attitude data, and then automatically corrects the image to be projected according to these offsets. The projection device can thus correct the image to be projected automatically after it moves, without manual correction by the user, which preserves a good viewing experience while reducing user operations.
Based on the projection apparatus, an embodiment of the present application provides a method for correcting a projection apparatus, where the method is applied to the projection apparatus 100 in the above embodiment, and as shown in fig. 12, the method may include:
S101, the controller acquires the actual attitude data.
S102, under the condition that the actual attitude data is not matched with the initial attitude data, the controller determines the roll angle offset, the pitch angle offset and the yaw angle offset of the projection equipment switched from the initial attitude to the current attitude according to the actual attitude data and the initial attitude data.
S103, the controller corrects the image to be projected according to the roll angle offset, the pitch angle offset and the yaw angle offset.
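Steps S101–S103 can be sketched as a simple control flow. This is illustrative structure only; the offset computation and the correction step are stubbed with hypothetical callables:

```python
def correction_method(get_actual, initial, compute_offsets, correct):
    """S101: acquire actual data; S102: compute offsets on mismatch;
    S103: correct the image to be projected. Returns None if no change."""
    actual = get_actual()                        # S101
    if actual == initial:
        return None                              # attitude unchanged
    offsets = compute_offsets(actual, initial)   # S102: roll/pitch/yaw
    return correct(offsets)                      # S103

result = correction_method(
    get_actual=lambda: (1.5, 2.4, 5.9),
    initial=(0.0, 0.0, 9.8),
    compute_offsets=lambda a, i: tuple(x - y for x, y in zip(a, i)),
    correct=lambda offs: "corrected with offsets %r" % (offs,),
)
print(result)
```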
According to the correction method of the projection device provided by the embodiments of the present application, when the actual attitude data acquired by the controller does not match the initial attitude data, the projection device has moved. In this case, the controller determines the attitude change of the projection device (i.e., the roll angle offset, the pitch angle offset, and the yaw angle offset) based on the actual attitude data and the initial attitude data, and then automatically corrects the image to be projected according to these offsets. The method can thus correct the image to be projected automatically after the device moves, without manual correction by the user, preserving a good viewing experience while reducing user operations.
Based on the embodiment shown in fig. 12, an embodiment of the present application further provides a method for correcting a projection device, which is suitable for a scene where a posture of the projection device changes when the projection device is in a power-on state, as shown in fig. 13, the method includes:
After the projection device is started for the first time, it is manually corrected by the user or a professional according to the projection environment and the actual projection picture; the device is then in its initial attitude, and the initial attitude data is acquired through the attitude detection sensor.
Thereafter, when the attitude of the projection device changes, the data detected by the attitude detection sensor changes accordingly. Based on this, the controller acquires the actual attitude data detected by the attitude detection sensor and compares it with the initial attitude data to judge whether the attitude of the projection device has changed. If the two differ, the controller determines that the attitude has changed and calculates the angle offsets from the actual attitude data and the initial attitude data.
As one possible implementation, when the attitude detection sensor includes an acceleration sensor and a geomagnetic sensor, the controller acquires actual acceleration data and initial acceleration data detected by the acceleration sensor and actual geomagnetic data and initial geomagnetic data detected by the geomagnetic sensor.
Further, the roll angle offset and the pitch angle offset of the projection device are calculated according to the actual acceleration data and the initial acceleration data, and the yaw angle offset of the projection device is calculated according to the actual geomagnetic data and the initial geomagnetic data.
As another possible implementation, when the attitude detection sensor includes an acceleration sensor, a geomagnetic sensor, and a gyro sensor, the controller acquires the actual acceleration data and initial acceleration data detected by the acceleration sensor, the actual geomagnetic data and initial geomagnetic data detected by the geomagnetic sensor, and the actual angular velocity data and initial angular velocity data detected by the gyro sensor. The first roll, pitch, and yaw angle offsets are then calculated from the acceleration and geomagnetic data, the second offsets from the angular velocity data, and the two sets are combined (for example, by weighted averaging) as described above to obtain the roll angle offset, the pitch angle offset, and the yaw angle offset.
Finally, the controller corrects the image to be projected according to the roll angle offset, the pitch angle offset, and the yaw angle offset. Of course, if there is no difference between the actual attitude data and the initial attitude data, no correction needs to be performed.
Based on the embodiment shown in fig. 13, an embodiment of the present application further provides a method for correcting a projection device, which is suitable for a scene where a posture of the projection device changes when the projection device is in a power-off state, as shown in fig. 14, the method includes:
When the attitude of the projection device changes in the powered-off state, the gyro sensor cannot detect valid data; only the acceleration sensor and the geomagnetic sensor can detect the actual attitude data of the projection device when the attitude changes in the powered-off state.
Therefore, when the projection device is powered on again after shutdown, the controller obtains the actual acceleration data and actual geomagnetic data recorded by the acceleration sensor and the geomagnetic sensor while the attitude changed in the powered-off state. It compares the actual acceleration data with the initial acceleration data; if they differ, it calculates the roll angle offset and the pitch angle offset from them, to judge whether a roll change and/or a pitch change has occurred. It also compares the actual geomagnetic data with the initial geomagnetic data; if they differ, it calculates the yaw angle offset from them.
Finally, the image to be projected is corrected according to the roll angle offset and/or the pitch angle offset and/or the yaw angle offset. Of course, if there is no difference between the actual attitude data and the initial attitude data, no correction is performed.
As shown in fig. 15, a schematic diagram of an application scenario in which the projection apparatus 100 interacts with the control device 200 and the server 300 in the embodiment of the present application is shown.
The control device 200 may be a remote controller 200A, which can communicate with the projection apparatus 100 through infrared protocol communication, Bluetooth protocol communication, ZigBee protocol communication, or other short-range communication to control the projection apparatus 100 wirelessly or by other wired means. The user may input user commands through keys on the remote controller 200A, voice input, control panel input, and so on, to control the projection apparatus 100. For example, the user may input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, and power on/off key on the remote controller 200A to implement the functions of the projection apparatus 100.
The control device 200 may also be an intelligent device, such as a mobile terminal 200B, a tablet computer, or a notebook computer, which may communicate with the projection apparatus 100 through a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), or another network, and control the projection apparatus 100 through a corresponding application program. For example, the projection device 100 may be controlled using an application running on the smart device. The application may provide various controls to the user through an intuitive user interface (UI) on the screen of the smart device.
For example, the mobile terminal 200B and the projection device 100 may each have a software application installed, so that connection and communication between the two can be realized through a network communication protocol, achieving one-to-one control operation and data communication. For instance, a control instruction protocol can be established between the mobile terminal 200B and the projection device 100, a remote-control keyboard can be synchronized to the mobile terminal 200B, and the function of controlling the projection device 100 can be realized by controlling the user interface on the mobile terminal 200B; the audio and video content displayed on the mobile terminal 200B may also be transmitted to the projection device 100 to realize a synchronous display function.
The server 300 may be a video server, an Electronic Program Guide (EPG) server, a cloud server, or the like.
Projection device 100 may be in data communication with server 300 via a variety of communication means. In various embodiments of the present application, the projection device 100 may be connected to the server 300 by wired or wireless communication via a local area network, a wireless local area network, or another network. Server 300 may provide various content and interactions to projection device 100.
Illustratively, projection device 100 receives software program updates, or accesses a remotely stored digital media library, by sending and receiving information and performing EPG interactions. The servers 300 may be one group or multiple groups, and of one or more types. The server 300 provides other network service content such as video on demand and advertisement services.
Fig. 16 is a block diagram schematically showing the configuration of the control device 200 according to the exemplary embodiment. As shown in fig. 16, the control device 200 includes a controller 210, a communicator 230, a user input/output interface 240, a memory 290, and a power supply 280.
The control device 200 is configured to control the projection apparatus 100: it receives input operation instructions from the user and converts them into instructions that the projection apparatus 100 can recognize and respond to, mediating the interaction between the user and the projection apparatus 100. For example, when the user operates the channel up/down keys on the control device 200, the projection apparatus 100 responds with the channel up/down operation.
In some embodiments, the control device 200 may be a smart device. For example, the control device 200 may install various applications for controlling the projection apparatus 100 according to user requirements.
In some embodiments, the mobile terminal 200B or another intelligent electronic device may perform a function similar to the control device 200 after an application for operating the projection device 100 is installed. For example, by installing the application, the user may use various function keys or virtual buttons of the graphical user interface available on the mobile terminal 200B or other intelligent electronic device to implement the functions of the physical keys of the control device 200.
The controller 210 includes a processor 212, a RAM 213 and a ROM 214, a communication interface, and a communication bus. The controller 210 is used to control the operation of the control device 200, as well as the internal components for communication and coordination and external and internal data processing functions.
The communicator 230 enables communication of control signals and data signals with the projection apparatus 100 under the control of the controller 210. Such as: the received user input signal is transmitted to the projection device 100. The communicator 230 may include at least one of a WIFI module 231, a bluetooth module 232, a Near Field Communication (NFC) module 233, and the like.
A user input/output interface 240, wherein the input interface includes at least one of a microphone 241, a touch pad 242, a sensor 243, keys 244, a camera 245, and the like. Such as: the user may implement a user instruction input function through voice, touch, gesture, pressing, and the like, and the input interface converts the received analog signal into a digital signal and converts the digital signal into a corresponding instruction signal, and sends the instruction signal to the projection apparatus 100.
The output interface includes an interface that transmits the received user instruction to the projection device 100. In some embodiments, it may be an infrared interface or a radio frequency interface. For example, with an infrared signal interface, the user input instruction needs to be converted into an infrared control signal according to the infrared control protocol and sent to the projection device 100 through the infrared sending module. As another example, with a radio frequency signal interface, the user input instruction needs to be converted into a digital signal, modulated according to the radio-frequency control signal modulation protocol, and then transmitted to the projection device 100 through the radio-frequency sending terminal.
In some embodiments, the control device 200 includes at least one of a communicator 230 and an output interface. The control device 200 is provided with the communicator 230, such as Wi-Fi, Bluetooth, and NFC modules, which may encode and transmit the user input command to the projection device 100 through the Wi-Fi protocol, the Bluetooth protocol, or the NFC protocol.
The memory 290 stores various operation programs, data, and applications for driving and controlling the control apparatus 200 under the control of the controller 210. The memory 290 may store various control signal commands input by a user.
The power supply 280 provides operating power support for each electrical component of the control device 200 under the control of the controller 210. The power supply 280 may comprise a battery and associated control circuitry.
Further, as shown in FIG. 17, the application layer of the projection device may include various applications that may be executed on projection device 100.
The live television application program can provide live television through different signal sources. For example, a live television application may provide television signals using input from cable television, radio broadcasts, satellite services, or other types of live television services. And, the live television application may display video of the live television signal on projection device 100.
A video-on-demand application may provide video from different storage sources. Unlike live television applications, video on demand provides a video display from a storage source; for example, the video on demand may come from the server side of cloud storage or from a local hard disk that stores video programs.
The media center application program can provide various applications for playing multimedia content. For example, a media center may provide services other than live television or video on demand, allowing a user to access various images or audio through the media center application.
The application program center can provide and store various application programs. The application may be a game, an application, or some other application associated with a computer system or other device that may be run on a display device. The application center may obtain these applications from different sources, store them in local storage, and then be operational on projection device 100.
Embodiments of the present invention further provide a computer-readable storage medium, where the computer-readable storage medium includes computer-executable instructions, and when the computer-executable instructions are executed on a computer, the computer is enabled to execute the correction method provided in the foregoing embodiments.
The embodiment of the present invention further provides a computer program product that can be directly loaded into memory and contains software code; after being loaded and executed by a computer, the computer program product can implement the correction method provided in the foregoing embodiments.
In the above embodiments, the implementation may be realized in whole or in part by software, hardware, firmware, or any combination thereof. When implemented using a software program, the embodiments may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer-executable instructions. When the computer-executable instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer-executable instructions may be stored on a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that a computer can access, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., a solid-state drive (SSD)), among others.
While the present application has been described in connection with various embodiments, other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed application, from a review of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the word "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
Although the present application has been described in conjunction with specific features and embodiments thereof, it will be evident that various modifications and combinations can be made thereto without departing from the spirit and scope of the application. Accordingly, the specification and figures are merely exemplary of the present application as defined in the appended claims and are intended to cover any and all modifications, variations, combinations, or equivalents within the scope of the present application.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A projection device, comprising:
an attitude detection sensor, configured to detect actual attitude data of the projection device, wherein the actual attitude data reflects a current attitude of the projection device; and
a controller connected to the attitude detection sensor, the controller configured to:
acquire the actual attitude data;
in a case where the actual attitude data does not match initial attitude data, determine, according to the actual attitude data and the initial attitude data, a roll angle offset, a pitch angle offset, and a yaw angle offset by which the projection device switches from an initial attitude to the current attitude; wherein the initial attitude data is data detected by the attitude detection sensor when the projection device is in the initial attitude; the roll angle offset is an angle by which the projection device deflects relative to the initial attitude about a first coordinate axis serving as a rotation axis, the first coordinate axis being the projection direction, in a horizontal plane, of the projection device in the initial attitude; the pitch angle offset is an angle by which the projection device deflects relative to the initial attitude about a second coordinate axis serving as a rotation axis, the second coordinate axis being a direction in the horizontal plane perpendicular to the first coordinate axis; and the yaw angle offset is an angle by which the projection device deflects relative to the initial attitude about a third coordinate axis serving as a rotation axis, the third coordinate axis being in a vertical direction; and
correct an image to be projected according to the roll angle offset, the pitch angle offset, and the yaw angle offset.
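The final step of claim 1 turns the three angle offsets into a geometric correction of the image. As a minimal sketch (an illustration, not the patented implementation), the three offsets can be composed into a single rotation matrix under the claim's axis convention (x = projection direction in the horizontal plane, y = horizontal axis perpendicular to x, z = vertical); the Rz·Ry·Rx composition order is an assumption, since the claim does not fix one:

```python
import math

import numpy as np


def rotation_from_offsets(roll_deg: float, pitch_deg: float, yaw_deg: float) -> np.ndarray:
    """Compose roll (about x), pitch (about y), and yaw (about z) offsets
    into a 3x3 rotation matrix. Axis convention follows claim 1; the
    Rz @ Ry @ Rx order is an illustrative assumption."""
    r, p, y = (math.radians(a) for a in (roll_deg, pitch_deg, yaw_deg))
    rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, math.cos(r), -math.sin(r)],
                   [0.0, math.sin(r), math.cos(r)]])
    ry = np.array([[math.cos(p), 0.0, math.sin(p)],
                   [0.0, 1.0, 0.0],
                   [-math.sin(p), 0.0, math.cos(p)]])
    rz = np.array([[math.cos(y), -math.sin(y), 0.0],
                   [math.sin(y), math.cos(y), 0.0],
                   [0.0, 0.0, 1.0]])
    return rz @ ry @ rx
```

Applying the inverse of this rotation to the image corners through the projector's pinhole model would yield the pre-distorted (keystone-corrected) quadrilateral; that projection step depends on the specific optics and is omitted here.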
2. The device according to claim 1, wherein the attitude detection sensor comprises an acceleration sensor and a geomagnetic sensor, the actual attitude data comprises actual acceleration data detected by the acceleration sensor and actual geomagnetic data detected by the geomagnetic sensor, and the initial attitude data comprises initial acceleration data detected by the acceleration sensor and initial geomagnetic data detected by the geomagnetic sensor;
the controller is specifically configured to:
determine the roll angle offset and the pitch angle offset according to the actual acceleration data and the initial acceleration data; and
determine the yaw angle offset according to the actual geomagnetic data and the initial geomagnetic data.
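The two determinations in claim 2 follow standard IMU geometry: with the device at rest, the accelerometer measures gravity, whose direction fixes roll and pitch, and the tilt-compensated magnetic field fixes yaw. A minimal sketch under the claims' axis convention (the exact formulas are standard-textbook assumptions; the patent does not give them):

```python
import math


def roll_pitch_from_accel(ax: float, ay: float, az: float) -> tuple:
    """Roll and pitch (radians) from a static accelerometer reading.
    Axis convention (x = projection direction, z = vertical) follows the
    claims; the formulas are conventional IMU geometry, not taken from
    the patent."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch


def yaw_from_mag(mx: float, my: float, mz: float, roll: float, pitch: float) -> float:
    """Tilt-compensated yaw (heading, radians): rotate the magnetic
    vector back into the horizontal plane using the roll/pitch already
    found, then take its horizontal direction."""
    xh = mx * math.cos(pitch) + mz * math.sin(pitch)
    yh = (mx * math.sin(roll) * math.sin(pitch)
          + my * math.cos(roll)
          - mz * math.sin(roll) * math.cos(pitch))
    return math.atan2(-yh, xh)


def angle_offset(actual: float, initial: float) -> float:
    """Offset of the current angle from the initial one, wrapped to (-pi, pi]."""
    d = actual - initial
    return math.atan2(math.sin(d), math.cos(d))
```

Running each estimator on the actual and the initial sensor data and differencing the results with `angle_offset` gives the per-axis offsets that the claim describes.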
3. The device according to claim 1, wherein the attitude detection sensor comprises an acceleration sensor, a geomagnetic sensor, and a gyroscope sensor, the actual attitude data comprises actual acceleration data detected by the acceleration sensor, actual geomagnetic data detected by the geomagnetic sensor, and actual angular velocity data detected by the gyroscope sensor, and the initial attitude data comprises initial acceleration data detected by the acceleration sensor and initial geomagnetic data detected by the geomagnetic sensor;
the controller is specifically configured to:
determine a first roll angle offset and a first pitch angle offset according to the actual acceleration data and the initial acceleration data;
determine a first yaw angle offset according to the actual geomagnetic data and the initial geomagnetic data;
determine a second roll angle offset, a second pitch angle offset, and a second yaw angle offset according to the actual angular velocity data;
determine the roll angle offset according to the first roll angle offset and the second roll angle offset;
determine the pitch angle offset according to the first pitch angle offset and the second pitch angle offset; and
determine the yaw angle offset according to the first yaw angle offset and the second yaw angle offset.
4. The device according to claim 3, wherein
the roll angle offset is equal to a weighted average of the first roll angle offset and the second roll angle offset;
the pitch angle offset is equal to a weighted average of the first pitch angle offset and the second pitch angle offset;
the yaw angle offset is equal to a weighted average of the first yaw angle offset and the second yaw angle offset.
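Claims 3 and 4 describe a classic complementary-filter arrangement: one attitude estimate from the accelerometer and magnetometer, a second from integrating the gyroscope's angular velocity, combined per axis by weighted average. A minimal sketch (the 0.98 weight and the per-sample integration scheme are illustrative assumptions; the patent does not specify them):

```python
def integrate_gyro(prev_angle: float, angular_velocity: float, dt: float) -> float:
    """Second estimate (claim 3): integrate the gyroscope's angular
    velocity over one sampling interval dt (simple Euler integration)."""
    return prev_angle + angular_velocity * dt


def fuse(angle_accel_mag: float, angle_gyro: float, gyro_weight: float = 0.98) -> float:
    """Weighted average of the two independent estimates (claim 4).
    Weighting the drift-free but noisy accelerometer/magnetometer
    estimate lightly and the smooth but drifting gyroscope estimate
    heavily is the usual complementary-filter choice; the value 0.98
    is an assumption."""
    return gyro_weight * angle_gyro + (1.0 - gyro_weight) * angle_accel_mag
```

Each of the roll, pitch, and yaw offsets would be fused the same way, with its own pair of first and second estimates.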
5. A method for calibrating a projection device, the method comprising:
acquiring actual attitude data;
in a case where the actual attitude data does not match initial attitude data, determining, according to the actual attitude data and the initial attitude data, a roll angle offset, a pitch angle offset, and a yaw angle offset by which the projection device switches from an initial attitude to a current attitude; wherein the initial attitude data is data detected by an attitude detection sensor when the projection device is in the initial attitude; the roll angle offset is an angle by which the projection device deflects relative to the initial attitude about a first coordinate axis serving as a rotation axis, the first coordinate axis being the projection direction, in a horizontal plane, of the projection device in the initial attitude; the pitch angle offset is an angle by which the projection device deflects relative to the initial attitude about a second coordinate axis serving as a rotation axis, the second coordinate axis being a direction in the horizontal plane perpendicular to the first coordinate axis; and the yaw angle offset is an angle by which the projection device deflects relative to the initial attitude about a third coordinate axis serving as a rotation axis, the third coordinate axis being in a vertical direction; and
and correcting the image to be projected according to the roll angle offset, the pitch angle offset and the yaw angle offset.
6. The method according to claim 5, wherein the actual attitude data comprises actual acceleration data and actual geomagnetic data, and the initial attitude data comprises initial acceleration data and initial geomagnetic data;
wherein determining, according to the actual attitude data and the initial attitude data, the roll angle offset, the pitch angle offset, and the yaw angle offset by which the projection device switches from the initial attitude to the current attitude specifically comprises:
determining the roll angle offset and the pitch angle offset according to the actual acceleration data and the initial acceleration data;
and determining the yaw angle offset according to the actual geomagnetic data and the initial geomagnetic data.
7. The method according to claim 5, wherein the actual attitude data comprises actual acceleration data, actual geomagnetic data, and actual angular velocity data, and the initial attitude data comprises initial acceleration data and initial geomagnetic data;
wherein determining, according to the actual attitude data and the initial attitude data, the roll angle offset, the pitch angle offset, and the yaw angle offset by which the projection device switches from the initial attitude to the current attitude specifically comprises:
determining a first roll angle offset and a first pitch angle offset according to the actual acceleration data and the initial acceleration data;
determining a first yaw angle offset according to the actual geomagnetic data and the initial geomagnetic data;
determining a second roll angle offset, a second pitch angle offset and a second yaw angle offset according to the actual angular velocity data;
determining the roll angle offset according to the first roll angle offset and the second roll angle offset;
determining the pitch angle offset according to the first pitch angle offset and the second pitch angle offset;
and determining the yaw angle offset according to the first yaw angle offset and the second yaw angle offset.
8. The method according to claim 7, wherein
the roll angle offset is equal to a weighted average of the first roll angle offset and the second roll angle offset;
the pitch angle offset is equal to a weighted average of the first pitch angle offset and the second pitch angle offset;
the yaw angle offset is equal to a weighted average of the first yaw angle offset and the second yaw angle offset.
9. A projection system, comprising:
a projection screen and a projection device as claimed in any one of claims 1 to 4.
10. A computer-readable storage medium, comprising computer program instructions which, when read and executed by a computer, cause the computer to perform the method according to any one of claims 5 to 8.
CN202111275147.8A 2021-10-29 2021-10-29 Projection equipment and correction method thereof Pending CN113973195A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111275147.8A CN113973195A (en) 2021-10-29 2021-10-29 Projection equipment and correction method thereof
PCT/CN2022/101830 WO2023071256A1 (en) 2021-10-29 2022-06-28 Laser projection device, and correction method for projected image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111275147.8A CN113973195A (en) 2021-10-29 2021-10-29 Projection equipment and correction method thereof

Publications (1)

Publication Number Publication Date
CN113973195A (en) 2022-01-25

Family

ID=79589027

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111275147.8A Pending CN113973195A (en) 2021-10-29 2021-10-29 Projection equipment and correction method thereof

Country Status (1)

Country Link
CN (1) CN113973195A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023071256A1 (en) * 2021-10-29 2023-05-04 青岛海信激光显示股份有限公司 Laser projection device, and correction method for projected image
CN114485622A (en) * 2022-02-09 2022-05-13 国科星图(深圳)数字技术产业研发中心有限公司 Visual safety monitoring method for dam reservoir
CN116033131A (en) * 2022-12-29 2023-04-28 深圳创维数字技术有限公司 Image correction method, device, electronic equipment and readable storage medium
CN116033131B (en) * 2022-12-29 2024-05-17 深圳创维数字技术有限公司 Image correction method, device, electronic equipment and readable storage medium

Similar Documents

Publication Publication Date Title
US11044514B2 (en) Method for displaying bullet comment information, method for providing bullet comment information, and device
CN113973195A (en) Projection equipment and correction method thereof
US11343556B2 (en) Method of controlling a horizontal screen or vertical screen of television, device, and storage medium
EP3605314B1 (en) Display method and apparatus
US9819870B2 (en) Photographing method and electronic device
KR102302437B1 (en) Method for motion sensing and an user device thereof
US8089513B2 (en) Information processing apparatus, information processing method, program, and information processing system
US11262903B2 (en) IoT device control system and method using virtual reality and augmented reality
US20210195525A1 (en) Display device and control method thereof
KR102226520B1 (en) Method and apparatus for updating advertising information
US20240223904A1 (en) Omnidirectional camera system with improved point of interest selection
KR101227331B1 (en) Method for transmitting and receiving data and display apparatus thereof
CN111510757A (en) Method, device and system for sharing media data stream
CN110178111B (en) Image processing method and device for terminal
KR102194787B1 (en) Apparatus and method for user based sensor data acquiring
KR102508148B1 (en) digital device, system and method for controlling color using the same
CN111787364A (en) Media data acquisition method, smart television and mobile terminal
CN110095792B (en) Method and device for positioning terminal
US10394336B2 (en) Ocular focus sharing for digital content
KR20220113332A (en) Method of provding contents and terminal device
CN114339174A (en) Projection equipment and control method thereof
CN113965740A (en) Projection equipment and control method thereof
KR20180045359A (en) Electronic apparatus and control method thereof
KR20220102942A (en) Image displaying apparatus and method thereof
KR102206817B1 (en) An electronic device and a method for managing a base station list thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination