CN116743973A - Automatic correction method for noninductive projection image - Google Patents

Automatic correction method for noninductive projection image

Info

Publication number
CN116743973A
Authority
CN
China
Prior art keywords
projection
projector
image
tan
projection plane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310710687.7A
Other languages
Chinese (zh)
Inventor
高岚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sichuan Changhong Electric Co Ltd
Original Assignee
Sichuan Changhong Electric Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sichuan Changhong Electric Co Ltd filed Critical Sichuan Changhong Electric Co Ltd
Priority to CN202310710687.7A priority Critical patent/CN116743973A/en
Publication of CN116743973A publication Critical patent/CN116743973A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T 7/75 Determining position or orientation of objects or cameras using feature-based methods involving models
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C 21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C 21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C 21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C 21/1652 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06 Systems determining position data of a target
    • G01S 17/08 Systems determining position data of a target for measuring distance only
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/12 Picture reproducers
    • H04N 9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3179 Video signal processing therefor
    • H04N 9/3185 Geometric adjustment, e.g. keystone or convergence

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Geometry (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Automation & Control Theory (AREA)
  • Transforming Electric Information Into Light Information (AREA)

Abstract

The application relates to the field of projection image correction, and provides an automatic correction method for a non-inductive projection image, which aims to improve the user experience during correction. A TOF sensor and an IMU inertial sensor obtain the attitude angle between the projection device and the projection plane; the relative coordinate state of the two is calculated from the attitude angle and the fixed ray vectors of the projection corner points, and the projection vertices are then adjusted according to the image projection perspective transformation principle. The whole correction process does not need to project characteristic images, and compared with the traditional process of capturing and decoding images with a camera, the processing speed can be roughly doubled, improving the user experience.

Description

Automatic correction method for noninductive projection image
Technical Field
The application relates to the field of projection image correction, in particular to an automatic correction method for a non-inductive (user-imperceptible) projection image.
Background
Existing LED micro-projection products are frequently moved because of how they are used. After the whole machine is moved, the image projected on a screen or wall is deformed; if the projected image is to remain rectangular and correctly proportioned, the vertex coordinates of the projected image must be adjusted. The common practice is to use an automatic keystone (trapezoidal) correction function to adjust the projected picture automatically.
The automatic keystone correction function requires the projector to project a picture containing a characteristic image, capture the projected image with the projector's camera, and use an image processing algorithm to adjust the non-rectangular image into a true rectangle. Projecting the characteristic image interrupts the user's normal use, and the processing takes a long time, usually 5-6 seconds, which makes for an unfriendly user experience.
Disclosure of Invention
In order to improve the user experience during correction, the application provides an automatic correction method for a non-inductive projection image.
The application solves the problems by adopting the following technical scheme:
the automatic correction method of the noninductive projection image comprises the following steps:
step 1, acquiring attitude angle information of a projector and a projection plane;
step 2, obtaining three-dimensional coordinates of four vertexes of the current projection image;
step 3, respectively carrying out vector decomposition on three-dimensional coordinates of the four vertexes according to the attitude angle information so as to obtain two-dimensional imaging vertex coordinates of the projection image on the projection plane in the current state;
step 4, obtaining the maximum inscribed rectangle corresponding to the four two-dimensional imaging vertex coordinates, wherein the aspect ratio of the maximum inscribed rectangle is the projection display ratio;
step 5, obtaining the projection two-dimensional vertex coordinates in the projection device from the vertex coordinates of the inscribed rectangle according to the image projection perspective transformation principle;
and 6, projecting according to the projection two-dimensional vertex coordinates.
Further, the attitude angle information in step 1 includes: depth information and angle information between the projector and the projection plane acquired by a TOF sensor, and angle information of the deviation between the projector and the horizontal plane acquired by an IMU inertial sensor.
Further, the step 1 specifically includes:
step 11, obtaining the linear relation between the IMU inertial sensor angle and the value of each axis;
step 12, respectively obtaining the relative angles of the projector and the projection plane in the Y-axis direction and the Z-axis direction according to the measured value of the inertial sensor of the IMU and the linear relation;
step 13, acquiring depth information of light spots P1 and P2 with fixed angles on a projection plane through a TOF sensor;
and 14, calculating the relative angle between the projector and the projection plane in the X-axis direction according to the fixed angle and the depth information.
Further, the fixed angle in step 13 is 30°.
Further, the step 2 specifically includes:
defining the preset distance between the projector and the projection plane as L, w as the width of the projected image at the preset distance, and h as the height of the projected image at the preset distance; if the actual distance between the projection plane and the micro-projector is L', the three-dimensional imaging vertex coordinates of the four vertices A, B, C, D of the projection image are respectively: a (-L '×tan θ, L' ×tan γ, L '), B (L' ×tan θ, L '×tan γ, L'), C (-L '×tan θ,0, L'), D (L '×tan θ,0, L'), wherein tan θ=w/(2×l), tan γ=h/L.
Further, in step 4, the projection display ratio is 16:9.
Compared with the prior art, the application has the following beneficial effects: the application uses a TOF (time of flight) sensor and an IMU (inertial measurement unit) inertial sensor to obtain the attitude angle between the projection device and the projection plane, calculates the relative coordinate state of the projection device and the projection plane from the attitude angle and the fixed ray vectors of the projection corner points projected by the projection device, and then adjusts the positions of the projection vertices according to the image projection perspective transformation principle. The whole correction process does not need to project characteristic images, and compared with the traditional process of capturing and decoding images with a camera, the processing speed can be roughly doubled, improving the user experience.
Drawings
FIG. 1 is a flow chart of the automatic correction method for a non-inductive projection image;
FIG. 2 is a schematic diagram of acquiring the relative angle between the projector and the projection plane in the X direction;
FIG. 3 is a schematic diagram of the principle of calculating three-dimensional imaging vertex coordinates of a standard image;
FIG. 4 is a schematic view of a projected rectangle adjustment;
fig. 5 is a schematic diagram showing the effect of 4-point correction in the case of Z-axis angular offset.
Detailed Description
The present application will be described in further detail with reference to the following examples in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
As shown in fig. 1, the method for automatically correcting the noninductive projection image comprises the following steps:
step 1, acquiring attitude angle information of a projector and a projection plane;
step 2, obtaining three-dimensional coordinates of four vertexes of the current projection image;
step 3, respectively carrying out vector decomposition on three-dimensional coordinates of the four vertexes according to the attitude angle information so as to obtain two-dimensional imaging vertex coordinates of the projection image on the projection plane in the current state;
step 4, obtaining the maximum inscribed rectangle corresponding to the four two-dimensional imaging vertex coordinates, wherein the aspect ratio of the maximum inscribed rectangle is the projection display ratio;
step 5, obtaining the projection two-dimensional vertex coordinates in the projection device from the vertex coordinates of the inscribed rectangle according to the image projection perspective transformation principle;
and 6, projecting according to the projection two-dimensional vertex coordinates.
Specifically, in step 1, a TOF sensor is used to obtain depth information between the projector and the projection plane, and an IMU inertial sensor is used to obtain angle information of the deviation between the projector and the horizontal plane.
First, the IMU inertial sensor data are read: the X, Y and Z axis values are read in real time, and the linear relation between the angle and each axis value is obtained by recording the change of each axis at different angles. In actual testing, the linear relationship is effective only in the projector's Y and Z directions, and the X direction cannot be judged from it. Therefore, the X direction is analysed using data acquired by the TOF range finder, while the angles in the Y and Z directions are obtained directly from the IMU inertial sensor. Measuring means other than IMU inertial sensors and TOF sensors, such as spirit levels or laser ranging, can also be adopted.
The TOF sensor of the projector is controlled to measure the projection plane, obtaining depth information of a plurality of light spots cast on the projection plane by the TOF sensor.
The relative angle between the projector and the projection plane in the X direction can be determined from the depth information of the two fixed-angle light spots P1 and P2:
as shown in fig. 2, the depth information of the light spots P1 and P2 is L1 and L2, respectively, the fixed angle is α, and the relative angle β between the X-direction projector and the projection plane can be obtained according to the formula (1):
the relative angles of the projector and the projection plane in the direction X, Y, Z can be obtained through the steps.
Next, the three-dimensional coordinates of the projection image vertices are calculated from the vectors of the lines connecting the projection image vertices and the projector's optical centre. The calculation process is as follows:
As shown in fig. 3, the standard image has four vertices, which are point A, point B, point C and point D, respectively.
Defining the preset distance as L, w as the width of the projected image at the preset distance, and h as the height of the projected image at the preset distance. According to the projector parameters and the triangular relation, assuming that the distance between the projection plane and the projector is L′, the three-dimensional imaging vertex coordinates of point A are given by formula (2):

A(-L′×tan θ, L′×tan γ, L′), where tan θ=w/(2×L) and tan γ=h/L  (2)
similarly, three-dimensional imaging vertex coordinate vectors for vertex B, vertex C and vertex D can be calculated as: b (L '. Times.tan. Theta., L '. Times.tan. Gamma., L '), C (-L '. Times.tan. Theta., 0, L '), D (L '. Times.tan. Theta., 0, L ').
The three-dimensional vertex coordinate vectors of the projection image obtained above are vector-decomposed using the relative angular position information of the projector and the projection plane, so as to obtain the two-dimensional imaging vertex coordinates of the projection image on the projection plane in the current state. The specific vector decomposition process is not described in detail herein; an illustrative sketch is given below.
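One standard reading of the decomposition, offered as an assumption rather than the application's exact procedure, is to tilt the plane by the measured attitude angles, intersect each optical-centre ray with the tilted plane, and express the intersection points in the plane's own two-dimensional basis. In the sketch below, the mapping of the measured X/Y-direction angles onto yaw (about Y) and pitch (about X) rotations, and the function names, are illustrative assumptions:

```python
import numpy as np

def rot_x(a: float) -> np.ndarray:
    """Rotation matrix about the X axis; angle a in radians."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a: float) -> np.ndarray:
    """Rotation matrix about the Y axis; angle a in radians."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def plane_vertices(verts_3d, pitch_deg: float, yaw_deg: float, dist: float):
    """Intersect optical-centre rays through the 3D vertices with a plane
    tilted by the measured angles, returning 2D coordinates in the plane's
    own basis. A sketch, not the patent's exact decomposition."""
    R = rot_y(np.radians(yaw_deg)) @ rot_x(np.radians(pitch_deg))
    n = R @ np.array([0.0, 0.0, 1.0])        # tilted plane normal
    u = R @ np.array([1.0, 0.0, 0.0])        # in-plane x basis vector
    v = R @ np.array([0.0, 1.0, 0.0])        # in-plane y basis vector
    p0 = dist * np.array([0.0, 0.0, 1.0])    # plane anchor point on the axis
    out = []
    for d in np.asarray(verts_3d, float):    # ray: t * d from the optical centre
        t = np.dot(n, p0) / np.dot(n, d)
        hit = t * d
        out.append([np.dot(hit - p0, u), np.dot(hit - p0, v)])
    return np.array(out)
```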
After the two-dimensional imaging vertex coordinates on the projection plane are obtained, the image is not a true rectangle but a trapezoidal quadrilateral. Within this quadrilateral, the maximum inscribed rectangle at the display scale is calculated, as shown in fig. 4; an illustrative search routine follows.
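The application does not fix an algorithm for this step; one simple realization is a brute-force scan over candidate centres with a binary search on the rectangle size, assuming a convex quadrilateral whose vertices are given in counter-clockwise order (for the layout of fig. 4, e.g. D, C, B, A). All names below are illustrative:

```python
import numpy as np

def inside_convex(poly: np.ndarray, p: np.ndarray) -> bool:
    """True if p lies inside the convex polygon poly (counter-clockwise)."""
    for i in range(len(poly)):
        a, b = poly[i], poly[(i + 1) % len(poly)]
        if (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0]) < 0:
            return False
    return True

def max_inscribed_rect(quad, aspect: float = 16 / 9, steps: int = 60):
    """Largest axis-aligned rectangle of the given aspect ratio inside a
    convex quadrilateral; returns (x_min, y_min, x_max, y_max)."""
    quad = np.asarray(quad, float)
    xs, ys = quad[:, 0], quad[:, 1]
    best = (0.0, 0.0, 0.0)
    for cx in np.linspace(xs.min(), xs.max(), steps):
        for cy in np.linspace(ys.min(), ys.max(), steps):
            lo, hi = 0.0, ys.max() - ys.min()    # binary search on height
            for _ in range(40):
                h = (lo + hi) / 2.0
                w = h * aspect
                corners = [np.array([cx + sx * w / 2, cy + sy * h / 2])
                           for sx in (-1, 1) for sy in (-1, 1)]
                if all(inside_convex(quad, c) for c in corners):
                    lo = h
                else:
                    hi = h
            if lo > best[0]:
                best = (lo, cx, cy)
    h, cx, cy = best
    w = h * aspect
    return cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2
```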
As shown in fig. 5, if the Z direction has a rotation angle Φ, the x and y coordinate values of the four vertex coordinates of the maximum inscribed rectangle further need to be rotated according to the Z-direction angle information.
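The rotation itself is the standard two-dimensional form; in the sketch below the rotation centre is an assumption, since the application does not state which point the corners are rotated about:

```python
import numpy as np

def rotate_points(pts, phi_deg: float, centre=(0.0, 0.0)) -> np.ndarray:
    """Rotate 2D points by phi degrees about the given centre."""
    phi = np.radians(phi_deg)
    R = np.array([[np.cos(phi), -np.sin(phi)],
                  [np.sin(phi),  np.cos(phi)]])
    c = np.asarray(centre, float)
    return (np.asarray(pts, float) - c) @ R.T + c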
Based on the four finally obtained vertex coordinate values, the projection two-dimensional vertex coordinates of the corrected image in the projection device can be obtained using the image projection perspective transformation principle. The projector projects using the corrected two-dimensional vertex coordinates, and the image presented in the user's field of view is rectangular.
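The perspective transformation of step 5 is the standard four-point homography solve. The sketch below computes the 3×3 matrix H from four point correspondences by direct linear transformation; the correspondence convention (plane-space imaging vertices mapped to the projector panel's native corners, with the inscribed-rectangle corners then pushed through the same H) is an assumption consistent with the embodiment below, not a convention stated by the application:

```python
import numpy as np

def homography(src, dst) -> np.ndarray:
    """Solve the 3x3 homography H with dst ~ H @ src from 4 point pairs."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    return Vt[-1].reshape(3, 3)       # null vector of A, reshaped to 3x3

def apply_h(H: np.ndarray, pts):
    """Apply homography H to an iterable of 2D points."""
    out = []
    for x, y in pts:
        w = H @ np.array([x, y, 1.0])
        out.append((w[0] / w[2], w[1] / w[2]))
    return out
```

Under this convention, H is solved once from the four imaging vertices and the panel's four native vertices, and apply_h(H, rect_corners) yields the corrected panel coordinates.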
Examples
The IMU inertial sensor is arranged in the projector, and the TOF sensor is arranged beside the projection optical engine. The linear relation between the IMU sensor angle and each axis value is acquired as: y_angle=0.876×ty+2.862 and z_angle=1.796×tz-56.596, where ty is the measurement data of the Y axis and tz is the measurement data of the Z axis.
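With the fitted coefficients above, converting raw axis readings to angles is a pair of linear evaluations (a minimal sketch; the function name is illustrative):

```python
def imu_angles(ty: float, tz: float) -> tuple[float, float]:
    """Y/Z attitude angles (degrees) from the linear fits given above."""
    y_angle = 0.876 * ty + 2.862
    z_angle = 1.796 * tz - 56.596
    return y_angle, z_angle

# Embodiment readings ty = 8.15, tz = 31.5 give approx (10.0, 0.0) degrees
print(imu_angles(8.15, 31.5))
```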
After the automatic correction function is triggered, the Y and Z axis values of the IMU sensor are read. According to the actually measured data, ty=8.15 and tz=31.5, at which point the relative angles between the projector and the projection plane in the Y and Z directions are 10° and 0°, respectively.
As shown in fig. 2, the fixed angle of the light spots P1 and P2 is 30°. Reading the TOF sensor data gives L1=1.2 m and L2=1 m, and according to formula (1) the relative angle between the projector and the projection plane in the X direction is -10.3°.
The three-dimensional coordinate vectors of the projection image vertices are obtained from the vectors of the lines connecting the projection image vertices and the projector's optical centre. When the horizontal distance between the projector and the projection plane is 1 m, the width and height of the projected image are 86.4 cm and 48.6 cm respectively; according to formula (2), the coordinate vector of point A is (-0.432, 0.486, 1), point B is (0.432, 0.486, 1), point C is (-0.432, 0, 1), and point D is (0.432, 0, 1).
According to the coordinate vectors of the four vertices, combined with the relative angles of the projector and the projection plane, vector decomposition is carried out on the three-dimensional vertex coordinate vectors to obtain the two-dimensional imaging vertex coordinates of the projection image in the current state. In this embodiment, for convenience of calculation, the decomposition is performed with point D as the origin: point A (-76.47, 726.41), point B (1122.19, 607.28), point C (1102.51, 2.38), point D (0, 0).
After the two-dimensional imaging vertex coordinates on the plane are obtained, the image is a trapezoidal quadrilateral. As shown in fig. 4, the maximum 16:9 inscribed rectangle within the quadrilateral is calculated, obtaining point A (23.51, 609.29), point B (1102.51, 609.29), point C (1102.51, 2.38) and point D (23.51, 2.38). The display scale can be set to other values according to actual needs.
According to the four vertex coordinate values of the maximum inscribed rectangle solved above, the two-dimensional vertex coordinates of the corrected original image are calculated using the projection perspective transformation principle: point A (81.98, 858.61), point B (1690.33, 1065.25), point C (1764.54, 86.21), point D (39.81, 0).
If the Z direction had a rotation angle of 5°, the x and y coordinate values of the four vertex coordinates of the maximum inscribed rectangle would be rotated according to the Z-direction angle information, giving rotated vertex coordinates A' (91.348, 609.29), B' (1083.49, 522.49), C' (1034.672, 2.38) and D' (42.52, 89.178). In this embodiment, the Z-direction angle is 0°, so this step can be skipped.
The projector projects using the corrected two-dimensional vertex coordinates, point A (81.98, 858.61), point B (1690.33, 1065.25), point C (1764.54, 86.21) and point D (39.81, 0), obtaining the desired rectangular display.

Claims (6)

1. The automatic correction method for the noninductive projection image is characterized by comprising the following steps:
step 1, acquiring attitude angle information of a projector and a projection plane;
step 2, obtaining three-dimensional coordinates of four vertexes of the current projection image;
step 3, respectively carrying out vector decomposition on three-dimensional coordinates of the four vertexes according to the attitude angle information so as to obtain two-dimensional imaging vertex coordinates of the projection image on the projection plane in the current state;
step 4, obtaining the maximum inscribed rectangle corresponding to the four two-dimensional imaging vertex coordinates, wherein the aspect ratio of the maximum inscribed rectangle is the projection display ratio;
step 5, obtaining the projection two-dimensional vertex coordinates in the projection device from the vertex coordinates of the inscribed rectangle according to the image projection perspective transformation principle;
and 6, projecting according to the projection two-dimensional vertex coordinates.
2. The method according to claim 1, wherein the attitude angle information in step 1 includes: depth information and angle information between the projector and the projection plane acquired by a TOF sensor, and angle information of the deviation between the projector and the horizontal plane acquired by an IMU inertial sensor.
3. The method for automatically correcting a non-inductive projection image according to claim 2, wherein step 1 specifically comprises:
step 11, obtaining the linear relation between the IMU inertial sensor angle and the value of each axis;
step 12, respectively obtaining the relative angles of the projector and the projection plane in the Y-axis direction and the Z-axis direction according to the measured value of the inertial sensor of the IMU and the linear relation;
step 13, acquiring depth information of light spots P1 and P2 with fixed angles on a projection plane through a TOF sensor;
and 14, calculating the relative angle between the projector and the projection plane in the X-axis direction according to the fixed angle and the depth information.
4. The method for automatically correcting a non-inductive projection image according to claim 3, wherein the fixed angle in step 13 is 30°.
5. The method for automatically correcting a non-inductive projection image according to claim 2, wherein step 2 is specifically:
defining the preset distance between the projector and the projection plane as L, w as the width of the projected image at the preset distance, and h as the height of the projected image at the preset distance; if the actual distance between the projection plane and the micro-projector is L′, the three-dimensional imaging vertex coordinates of the four vertices A, B, C, D of the projection image are respectively: A(-L′×tan θ, L′×tan γ, L′), B(L′×tan θ, L′×tan γ, L′), C(-L′×tan θ, 0, L′), D(L′×tan θ, 0, L′), wherein tan θ=w/(2×L) and tan γ=h/L.
6. The method according to any one of claims 1 to 5, wherein the projection display ratio in step 4 is 16:9.
CN202310710687.7A 2023-06-15 2023-06-15 Automatic correction method for noninductive projection image Pending CN116743973A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310710687.7A CN116743973A (en) 2023-06-15 2023-06-15 Automatic correction method for noninductive projection image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310710687.7A CN116743973A (en) 2023-06-15 2023-06-15 Automatic correction method for noninductive projection image

Publications (1)

Publication Number Publication Date
CN116743973A true CN116743973A (en) 2023-09-12

Family

ID=87910995

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310710687.7A Pending CN116743973A (en) 2023-06-15 2023-06-15 Automatic correction method for noninductive projection image

Country Status (1)

Country Link
CN (1) CN116743973A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117570853A (en) * 2024-01-16 2024-02-20 深圳新智联软件有限公司 Method, device, equipment and storage medium for calculating four-point coordinates in projection interface
CN117570853B (en) * 2024-01-16 2024-04-09 深圳新智联软件有限公司 Method, device, equipment and storage medium for calculating four-point coordinates in projection interface

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination