CN114079756A - Projection system and method and projection display system - Google Patents

Projection system and method and projection display system

Info

Publication number
CN114079756A
Authority
CN
China
Prior art keywords
projection
image
unit
projection screen
processing unit
Legal status
Pending
Application number
CN202010799226.8A
Other languages
Chinese (zh)
Inventor
张宁
白晓锌
郑旗军
谢燕俊
何宽鑫
Current Assignee
TPK Glass Solutions Xiamen Inc
Original Assignee
TPK Touch Systems Xiamen Inc
Application filed by TPK Touch Systems Xiamen Inc
Priority to CN202010799226.8A
Publication of CN114079756A

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141Constructional details thereof
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3185Geometric adjustment, e.g. keystone or convergence

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Geometry (AREA)
  • Projection Apparatus (AREA)

Abstract

A projection system suitable for a projection screen includes a projection device, a signal output unit, and a processing unit. The projection device includes a projection unit and a shooting unit, and the signal output unit is adapted to be disposed on the projection screen. When the projection unit projects an image, the processing unit generates image position data indicating the position of the image according to a shooting result generated by the shooting unit, generates projection screen position data indicating the position of the projection screen according to a plurality of wireless signals output by the signal output unit, generates correction data according to a comparison between the image position data and the projection screen position data, and adjusts at least one setting parameter related to the projection unit according to the correction data, so as to reduce the difference in size or position between the image and the projection screen.

Description

Projection system and method and projection display system
Technical Field
The present invention relates to a projection system, and more particularly, to a projection system for correcting a projected image using a wireless signal. The invention also relates to a projection method implemented by the projection system and a projection display system comprising the projection system.
Background
Projection devices are widely used in teaching, speeches, meetings, presentations, and the like. By projecting various information onto a projection screen, a projection device effectively helps the speaker convey a message and makes it easier for the audience to follow the presented content.
When the projection apparatus is used, it is necessary to ensure that the image projected by the projection apparatus matches the projection screen, and more specifically, the position and size of the projected image need to match the projection screen as much as possible, so as to obtain the best projection effect.
Disclosure of Invention
One of the objects of the present invention is to provide a projection system that alleviates the inconvenience of the prior art.
The projection system is suitable for a projection screen, and is characterized in that: the projection system comprises a projection device, a signal output unit and a processing unit. The projection device comprises a projection unit, a shooting unit and a signal receiving unit. The signal output unit is suitable for being arranged on the projection screen. The processing unit is electrically connected with the projection unit, the shooting unit and the signal receiving unit. When the projection unit projects an image to the projection screen, the processing unit obtains a shooting result generated by the shooting unit and presenting the image from the shooting unit, and generates image position data indicating a position of the image according to the shooting result. The signal receiving unit is used for receiving a plurality of wireless signals output by the signal output unit, and the processing unit generates projection screen position data indicating the position of the projection screen according to the wireless signals. The processing unit generates correction data according to a comparison result between the image position data and the projection screen position data, the correction data indicates a difference of at least one of a size and a position between the image and the projection screen, and the processing unit adjusts at least one setting parameter related to the projection unit according to the correction data so as to reduce the difference of at least one of the size and the position between the image and the projection screen.
In some embodiments of the projection system of the present invention, the signal output unit includes a plurality of signal emitters, and the signal emitters are adapted to be respectively disposed at a plurality of corner regions of the projection screen. The image position data comprises a plurality of image position coordinates, and the image position coordinates respectively correspond to a plurality of vertexes of the image. The wireless signals are respectively output by the signal transmitters, the projection screen position data comprise a plurality of projection screen position coordinates respectively corresponding to the signal transmitters, and each projection screen position coordinate is calculated by the processing unit according to the intensity and the direction of the wireless signals output by the corresponding signal transmitters.
In some embodiments of the projection system of the present invention, the projection screen is a touch-sensitive projection screen, and after the processing unit adjusts at least one of the setting parameters according to the correction data and when the projection unit projects the image, the processing unit executes a specific processing procedure when receiving touch operation data from the projection screen and determining that the touch operation data indicates the specific processing procedure related to the image.
In some embodiments of the projection system of the present invention, before the processing unit generates the image position data and the projection screen position data, the processing unit further generates the image and controls the projection unit to project the image. After the processing unit adjusts at least one setting parameter according to the correction data, the image presents a user operation interface, and the processing unit determines whether the touch operation data indicates the specific processing program related to the user operation interface according to a touch operation position and a touch operation type included in the touch operation data, and presents at least one of an execution process and an execution result of the specific processing program in the image to be projected by the projection unit when the specific processing program is executed.
Another objective of the present invention is to provide a projection method implemented by the projection system.
The projection method is implemented by a projection system, the projection system comprises a projection device, a signal output unit and a processing unit, the signal output unit is suitable for being arranged on a projection screen, the projection device comprises a projection unit, a shooting unit and a signal receiving unit, and the projection unit, the shooting unit and the signal receiving unit are electrically connected with the processing unit, and the projection system is characterized in that: the projection method comprises the following steps: an image positioning step in which, when the projection unit projects an image toward the projection screen, the processing unit obtains, from the shooting unit, a shooting result generated by the shooting unit and presenting the image, and generates image position data indicating a position of the image according to the shooting result; a projection screen positioning step, wherein the signal receiving unit is used for receiving a plurality of wireless signals output by the signal output unit, and the processing unit is used for generating projection screen position data indicating the position of the projection screen according to the wireless signals; and an image correction step, wherein the processing unit generates correction data according to a comparison result between the image position data and the projection screen position data, the correction data indicates a difference of at least one of the size and the position between the image and the projection screen, and the processing unit adjusts at least one setting parameter related to the projection unit according to the correction data so as to reduce the difference of at least one of the size and the position between the image and the projection screen.
In some embodiments of the projection method of the present invention, the signal output unit includes a plurality of signal emitters, and the signal emitters are adapted to be respectively disposed at a plurality of corner region positions of the projection screen. In the image positioning step, the image position data includes a plurality of image position coordinates, and the image position coordinates respectively correspond to a plurality of vertices of the image. In the projection screen positioning step, the wireless signals are respectively output by the signal transmitters, the projection screen position data include a plurality of projection screen position coordinates respectively corresponding to the signal transmitters, and each projection screen position coordinate is calculated by the processing unit according to the intensity and direction of the wireless signal output by the corresponding signal transmitter.
In some embodiments of the projection method of the present invention, the projection screen is a touch-sensitive projection screen, and the projection method further includes a touch operation processing step after the image correction step, wherein when the projection unit projects the image, the processing unit executes a specific processing procedure when receiving touch operation data from the projection screen and determining that the touch operation data indicates the specific processing procedure related to the image.
In some embodiments of the projection method of the present invention, the projection method further includes a projection step before the image positioning step and the projection screen positioning step, and the processing unit generates the image and controls the projection unit to project the image. In the touch operation processing step, the image presents a user operation interface, and the processing unit determines whether the touch operation data indicates the specific processing program related to the user operation interface according to a touch operation position and a touch operation type included in the touch operation data, and when the specific processing program is executed, the processing unit further presents at least one of an execution process and an execution result of the specific processing program in the image to be projected by the projection unit.
It is another object of the present invention to provide a projection display system including the projection system.
The projection display system of the present invention is characterized in that: the projection display system comprises a projection device, a projection screen, a signal output unit and a processing unit. The projection device comprises a projection unit, a shooting unit and a signal receiving unit. The signal output unit is used for being arranged on the projection screen. The processing unit is electrically connected with the projection unit, the shooting unit and the signal receiving unit. When the projection unit projects an image to the projection screen, the processing unit obtains a shooting result generated by the shooting unit and presenting the image from the shooting unit, and generates image position data indicating a position of the image according to the shooting result. The signal receiving unit is used for receiving a plurality of wireless signals output by the signal output unit, and the processing unit generates projection screen position data indicating the position of the projection screen according to the wireless signals. The processing unit generates correction data according to a comparison result between the image position data and the projection screen position data, the correction data indicates a difference of at least one of a size and a position between the image and the projection screen, and the processing unit adjusts at least one setting parameter related to the projection unit according to the correction data so as to reduce the difference of at least one of the size and the position between the image and the projection screen.
In some embodiments of the projection display system of the present invention, the projection screen is a touch-sensitive projection screen, and the projection screen is operable to switch between a transparent state allowing light to pass through and an opaque state reflecting light.
In some embodiments of the projection display system of the present invention, the signal output unit includes a plurality of signal emitters, and the signal emitters are respectively disposed at a plurality of corner regions of the projection screen. The image position data comprises a plurality of image position coordinates, and the image position coordinates respectively correspond to a plurality of vertexes of the image. The wireless signals are respectively output by the signal transmitters, the projection screen position data comprise a plurality of projection screen position coordinates respectively corresponding to the signal transmitters, and each projection screen position coordinate is calculated by the processing unit according to the intensity and the direction of the wireless signals output by the corresponding signal transmitters.
In some embodiments of the projection display system of the present invention, the projection screen is a touch-sensitive projection screen, and after the processing unit adjusts at least one of the setting parameters according to the correction data and projects the image, the processing unit executes a specific processing procedure when receiving touch operation data from the projection screen and determining that the touch operation data indicates the specific processing procedure related to the image.
In some embodiments of the projection display system of the present invention, before the processing unit generates the image position data and the projection screen position data, the processing unit further generates the image and controls the projection unit to project the image. After the processing unit adjusts at least one setting parameter according to the correction data, the image presents a user operation interface, and the processing unit determines whether the touch operation data indicates the specific processing program related to the user operation interface according to a touch operation position and a touch operation type included in the touch operation data, and presents at least one of an execution process and an execution result of the specific processing program in the image to be projected by the projection unit when the specific processing program is executed.
The invention has the beneficial effects that: the processing unit of the projection system can generate the correction data according to the image position data and the projection screen position data, and automatically adjust the setting parameters of the projection device according to the correction data to reduce the difference of at least one of the size and the position between the projected image and the projection screen.
Drawings
Other features and effects of the present invention will become apparent from the following detailed description of the embodiments with reference to the accompanying drawings, in which:
FIG. 1 is a block diagram of a projection display system according to a first embodiment of the present invention;
FIG. 2 is a flow chart exemplarily illustrating how the first embodiment implements a projection method; and
FIG. 3 is a block diagram of a projection display system according to a second embodiment of the present invention.
Detailed Description
Before the present invention is described in detail, it should be noted that in the following description, like elements are represented by like reference numerals. In addition, the term "electrical connection" used in this specification covers both a wired electrical connection, in which a plurality of electronic devices/apparatuses/elements are connected through a conductive material, and a wireless electrical connection, in which wireless signals are transmitted through a wireless communication technology. Furthermore, the term "electrical connection" also covers both a "direct electrical connection", formed by directly connecting two electronic devices/apparatuses/elements, and an "indirect electrical connection", formed by connecting two electronic devices/apparatuses/elements through other electronic devices/apparatuses/elements.
Referring to fig. 1, the projection display system 1 of the first embodiment of the present invention includes a projection device 11, a projection screen 12, a signal output unit 13, and a processing unit 14, and the projection device 11 includes a projection unit 111, a shooting unit 112, a signal receiving unit 113, and a storage unit 114.
In the present embodiment, the projection unit 111 is implemented as a projection lens, the shooting unit 112 is implemented as a camera lens, the signal receiving unit 113 is implemented as a wireless signal receiver, and the storage unit 114 is implemented as, for example, an internal memory of the projection apparatus 11.
In more detail, the signal receiving unit 113 may be implemented as an infrared receiver in the present embodiment; however, in other embodiments, the signal receiving unit 113 may also be implemented as a receiver of Bluetooth, Wi-Fi, ZigBee, or other kinds of wireless signals, and is not limited to the present embodiment. In addition, the projection unit 111, the shooting unit 112, and the signal receiving unit 113 are disposed on the same side (e.g. the front side) of a main body (not shown) of the projection apparatus 11 and all face the same direction. More specifically, if the direction in which the projection unit 111 faces is taken as a projection direction, the shooting unit 112 and the signal receiving unit 113 are, for example, disposed on the main body so as to face the projection direction, so that the projection unit 111 can project images in the projection direction, the shooting unit 112 can capture images in the projection direction, and the signal receiving unit 113 can receive wireless signals in the projection direction.
The storage unit 114 is, for example, installed with an Operating System (Operating System), and stores a plurality of setting parameters related to the projection unit 111. More specifically, the operating system may be, for example, an open operating system (e.g., but not limited to, the Android system) using a Graphical User Interface (Graphical User Interface). On the other hand, the setting parameters are, for example, used for setting the projection mode of the projection unit 111, and can be used for adjusting the position, angle, size, shape, etc. of the image projected by the projection unit 111, and more specifically, the setting parameters may include, for example, an image horizontal offset setting parameter, an image vertical offset setting parameter, an image rotation angle setting parameter, a projection focal length setting parameter, an image size setting parameter, a trapezoidal image correction setting parameter, a barrel image correction setting parameter, and a grid image correction setting parameter, but not limited thereto.
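By way of illustration only, the setting parameters listed above could be organized in software roughly as in the following sketch; the field names, types, and default values are assumptions made for the example and are not defined by this embodiment.

```python
from dataclasses import dataclass

@dataclass
class ProjectionSettings:
    """Illustrative container for the setting parameters named above.

    Field names, types, and defaults are assumptions for this sketch; the
    embodiment does not define a concrete data structure.
    """
    horizontal_offset: int = 0   # image horizontal offset (in unit distances)
    vertical_offset: int = 0     # image vertical offset (in unit distances)
    rotation_angle: int = 0      # image rotation angle (in unit angles)
    focal_length: float = 1.0    # projection focal length
    image_scale: float = 1.0     # image size (scaling factor)
    keystone: float = 0.0        # trapezoidal (keystone) image correction
    barrel: float = 0.0          # barrel image correction
    grid: float = 0.0            # grid image correction
```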
In the present embodiment, the projection screen 12 is implemented as a rectangular touch screen realized by smart glass (Smart Glass). More specifically, the projection screen 12 of the present embodiment includes a touch panel 121 and a touch signal processor 122 electrically connected to the touch panel 121, and the touch panel 121 can be implemented by, for example, but not limited to, capacitive, resistive, infrared, or acoustic touch technology. Furthermore, in the present embodiment, the projection screen 12 further includes a dimming film, which may also be referred to as a polymer dispersed liquid crystal (PDLC) layer and is made of, for example, a polymer liquid crystal material. With the dimming film, the projection screen 12 can be switched by a user between a transparent state that allows light to pass through and an opaque state that reflects light (i.e., does not allow light to pass through). More specifically, when the dimming film is powered on, its polymer liquid crystals are orderly arranged in a specific direction to allow light to pass through, so that the projection screen 12 is in the transparent state; when the dimming film is not powered on, its polymer liquid crystals are disorderly arranged and block light, so that the projection screen 12 is in the opaque state and is suitable for being projected on. In a further embodiment, the projection screen 12 can also be placed in a semi-transparent state between the transparent state and the opaque state by adjusting the power level of the dimming film, but is not limited thereto.
It should be noted that, in a simplified embodiment, the projection screen 12 may be implemented as a combination of a projection screen and a plurality of sensing elements (e.g., electrodes, not shown); more specifically, the sensing elements are disposed on the projection screen and used for sensing touch operations, so that the screen and the sensing elements are integrated to provide a touch sensing function. In addition, in some embodiments, the projection screen 12 can also be implemented as a general projection screen, and does not necessarily include the touch panel 121 and the dimming film of the above embodiments. In other words, the projection screen 12 does not have to have a touch sensing function or the ability to switch between the transparent state and the opaque state. In short, the implementation of the projection screen 12 is not limited to this embodiment.
The signal output unit 13 includes a plurality of signal transmitters 131, and the signal transmitters 131 are, for example, respectively disposed at a plurality of corner regions of the projection screen 12 and are used for respectively outputting a plurality of wireless signals that can be received by the signal receiving unit 113. More specifically, in the present embodiment, the number of the signal transmitters 131 is, for example, four, and the four signal transmitters 131 are, for example, respectively disposed at the four corner regions of the projection screen 12, that is, the four vertex positions of the projection screen 12. On the other hand, to match the implementation of the signal receiving unit 113 in the present embodiment (an infrared receiver), the four signal transmitters 131 are, for example, correspondingly implemented as four infrared transmitters. However, it should be understood that, in other embodiments, the signal transmitters 131 can also be implemented as transmitters of other kinds of wireless signals (such as the aforementioned Bluetooth, Wi-Fi, or ZigBee) in cooperation with the signal receiving unit 113, and are not limited to the present embodiment. On the other hand, in a similar implementation, the number of the signal transmitters 131 may also be two, with the two signal transmitters 131 respectively disposed at two corner regions of the projection screen 12 that are diagonal to each other; in other similar implementations, the number of the signal transmitters 131 may also be three, five, or more. In general, the signal output unit 13 can be implemented as long as two signal transmitters 131 are respectively disposed at two mutually diagonal corner regions of the projection screen 12, so the specific implementation of the signal output unit 13 is not limited to this embodiment.
The processing unit 14 is electrically connected to the projection unit 111, the shooting unit 112, the signal receiving unit 113, the storage unit 114, the signal transmitters 131 of the signal output unit 13, and the touch signal processor 122 of the projection screen 12, but is not limited thereto. More specifically, in the present embodiment, the processing unit 14 is, for example, implemented as a central processing unit disposed inside the main body of the projection apparatus 11, and the processing unit 14 is capable of running the operating system installed in the storage unit 114 and controlling the projection unit 111 to perform projection according to the setting parameters. In other words, in the present embodiment, the processing unit 14 is, for example, included in the projection apparatus 11 as shown in fig. 1, but is not limited thereto.
It should be added that the projection device 11, the signal output unit 13, and the processing unit 14 together serve as a projection system 10 included in the projection display system 1 of the present embodiment, and the processing unit 14 is, for example, electrically connected to the projection unit 111, the shooting unit 112, the signal receiving unit 113, and the storage unit 114 in a wired manner, and electrically connected to the signal transmitters 131 and the touch signal processor 122 in a wireless manner, but is not limited thereto.
Referring to fig. 1 and fig. 2, how the projection system 10 of the present embodiment implements a projection method is described in detail below. It should be understood that the projection method is implemented with the projection device 11 and the projection screen 12 disposed opposite to each other with a gap therebetween, that is, the projection unit 111, the shooting unit 112, and the signal receiving unit 113 of the projection device 11 all face the projection screen 12. In addition, during the implementation of the projection method of the present embodiment, the projection screen 12 is adapted to be switched to the opaque state.
First, in step S1, the processing unit 14 generates an image and controls the projection unit 111 to project the image. Since the projection unit 111 projects the image toward the projection screen 12, at least a portion of the image is projected onto the projection screen 12; however, for convenience in describing the subsequent steps of the projection method, it is assumed here that the image is offset from the projection screen 12. It should be noted that step S1 serves as a projection step of the present embodiment and may, for example, be executed automatically by the processing unit 14 when the projection apparatus 11 is turned on, in which case the image may present a start-up picture in step S1, but is not limited thereto. Next, the flow advances to step S2.
In step S2, when the projection unit 111 projects the image onto the projection screen 12, the processing unit 14 controls the shooting unit 112 to shoot the image and obtains a shooting result from the shooting unit 112. Specifically, the shooting result is generated by the shooting unit 112 and presents the image projected by the projection unit 111. Next, the flow advances to step S3.
In step S3, the processing unit 14 generates image position data indicating the position of the image according to the shooting result. In this embodiment, the image position data includes, for example, four image position coordinates, which respectively correspond to the four vertices of the projected image, and each image position coordinate indicates the position of the vertex to which it corresponds.
More specifically, the processing unit 14 generates the image position data by, for example, performing image recognition on the shooting result: the edges of the image in the shooting result are recognized according to changes in color and contrast, the four vertices of the image are then identified, and the four image position coordinates corresponding to the four vertices are calculated. In addition, in the present embodiment, each image position coordinate may be, for example, a two-dimensional coordinate; however, in other embodiments, each image position coordinate may also be implemented as a three-dimensional coordinate, and the present invention is not limited thereto.
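By way of illustration only, the edge-and-vertex recognition described above could be sketched as follows using OpenCV; the thresholding and polygon approximation shown here are assumptions for the example, since this embodiment does not prescribe a particular image recognition algorithm.

```python
import cv2
import numpy as np

def find_image_vertices(capture_bgr: np.ndarray) -> np.ndarray:
    """Locate the four vertices of the projected image in a camera frame.

    Sketch only: the projected picture is assumed to be noticeably brighter
    than its surroundings, so a global threshold isolates it.  Returns a
    (4, 2) array of pixel coordinates.
    """
    gray = cv2.cvtColor(capture_bgr, cv2.COLOR_BGR2GRAY)
    # Otsu's method picks a threshold that separates the bright projection
    # from the darker background.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    outline = max(contours, key=cv2.contourArea)          # largest bright region
    # Approximate the outline with a polygon; a clean projection gives 4 corners.
    approx = cv2.approxPolyDP(outline, 0.02 * cv2.arcLength(outline, True), True)
    if len(approx) != 4:
        raise ValueError("projected image was not recognized as a quadrilateral")
    return approx.reshape(4, 2).astype(np.float32)
```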
It should be noted that the image position coordinates are calculated by the processing unit 14 with a predetermined reference point as the origin of a coordinate system, and the reference point may, for example, represent the position of the shooting unit 112 or the position of the signal receiving unit 113, but is not limited thereto. In addition, in a similar implementation, the image position data may also include only two image position coordinates that respectively correspond to two mutually diagonal vertices of the image, while in other similar implementations the number of image position coordinates may also be three or another number. In general, the image position data can be implemented as long as it includes at least two image position coordinates respectively corresponding to two mutually diagonal vertices of the image, so the specific implementation of the image position data is not limited to this embodiment. In addition, steps S2 and S3 together serve as an image positioning step of the present embodiment.
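Regarding the remark above that two mutually diagonal vertices can suffice, the small sketch below recovers the remaining corners under the simplifying assumption that the projected image is an axis-aligned rectangle; it is an illustration only, and detecting rotation would require additional vertices.

```python
def corners_from_diagonal(p1, p2):
    """Recover the four corners of an axis-aligned rectangle from two
    diagonally opposite image position coordinates (x, y).

    Illustrative only: it assumes the projected image is not rotated; if
    rotation must be detected, more than two vertices are needed.
    """
    (x1, y1), (x2, y2) = p1, p2
    return [(x1, y1), (x2, y1), (x2, y2), (x1, y2)]
```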
After the processing unit 14 generates the image position data, the flow proceeds to step S4.
In step S4, the processing unit 14 controls the signal transmitters 131 of the signal output unit 13 to respectively output a plurality of wireless signals, the signal receiving unit 113 receives the wireless signals respectively output by the signal transmitters 131, and the processing unit 14 then generates projection screen position data indicating the position of the projection screen 12 according to the received wireless signals.
In this embodiment, the projection screen position data includes, for example, four projection screen position coordinates respectively corresponding to the four signal transmitters 131. Each projection screen position coordinate may be, for example, a three-dimensional coordinate, and is calculated by the processing unit 14 according to, for example, the intensity and direction of the wireless signal output by the corresponding signal transmitter 131. Specifically, the projection screen position coordinates are calculated by the processing unit 14 using the same preset reference point as the origin of the coordinate system; in other words, the projection screen position coordinates and the image position coordinates share the same reference point (i.e., the same origin of the coordinate system).
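By way of illustration only, one plausible way to turn signal intensity and direction into a three-dimensional coordinate is sketched below, using a log-distance path-loss model for the distance and azimuth/elevation angles for the direction; the model and every parameter value are assumptions for the example, as this embodiment does not specify how the calculation is performed.

```python
import math

def emitter_position(rssi_dbm, azimuth_deg, elevation_deg,
                     tx_power_dbm=-40.0, path_loss_exp=2.0):
    """Estimate one signal transmitter's 3-D coordinate relative to the receiver.

    Sketch only: distance is derived from received signal strength with a
    log-distance path-loss model (better suited to the Bluetooth/Wi-Fi/ZigBee
    alternatives than to infrared), and direction is given as azimuth and
    elevation angles measured at the receiver.  All parameter values are
    illustrative.
    """
    # Log-distance path loss: rssi = tx_power_at_1m - 10 * n * log10(d)
    distance = 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    # Spherical -> Cartesian, with the Z axis along the projection direction.
    x = distance * math.cos(el) * math.sin(az)
    y = distance * math.sin(el)
    z = distance * math.cos(el) * math.cos(az)
    return (x, y, z)
```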
It should be noted that, in a similar implementation, the number of the projection screen position coordinates is not limited to four, but is consistent with the number of the signal transmitters 131; therefore, the specific implementation of the projection screen position data is not limited to this embodiment. In other embodiments, each projection screen position coordinate may also be a two-dimensional coordinate. Further, the signal output unit 13 is not necessarily controlled by the processing unit 14 and may also output the wireless signals on its own; more specifically, the signal output unit 13 may, for example, output the wireless signals upon a user operation, or output them automatically when power is supplied, without being limited to the embodiment. In addition, step S4 serves, for example, as a projection screen positioning step of the present embodiment.
After the processing unit 14 generates the projection screen position data, the flow proceeds to step S5.
In step S5, the processing unit 14 compares the image position data with the projection screen position data, and generates correction data according to the comparison result between the image position data and the projection screen position data. In the present embodiment, the correction data indicates, for example, both the size difference and the position difference between the image projected by the projection unit 111 and the projection screen 12; however, in other embodiments, the correction data may also indicate only one of the size difference and the position difference between the image and the projection screen 12, and is not limited to the present embodiment.
More specifically, in step S5, the processing unit 14 compares the image position coordinates of the image position data with the projection screen position coordinates, respectively. Specifically, since the image is projected on the projection screen 12, the distance between the projection device 11 and the image and the distance between the projection device 11 and the projection screen 12 can be considered to be the same as each other, in other words, the depth of the image and the depth of the projection screen 12 are the same for the projection device 11. Therefore, if the projection direction of the projection unit 111 is taken as the Z-axis direction in the three-dimensional space, the processing unit 14 compares the position difference between the image position coordinates and the projection screen position coordinates in the X-axis direction and the Y-axis direction in this step, in other words, the processing unit 14 compares the position difference, the angle difference and the area difference between the projected image and the projection screen 12 in an X-Y plane defined by the X-axis and the Y-axis. Therefore, the processing unit 14 can calculate a horizontal offset distance, a vertical offset distance, a rotational offset angle, and an area difference ratio between the image and the projection screen 12, and the correction data includes, for example, a horizontal correction amount corresponding to the horizontal offset distance, a vertical correction amount corresponding to the vertical offset distance, a rotational correction angle corresponding to the rotational offset angle, and a zoom correction magnification corresponding to the area difference ratio in the embodiment, but is not limited thereto.
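By way of illustration only, the comparison described above could be carried out roughly as in the following sketch, which derives a horizontal correction amount, a vertical correction amount, a rotation correction angle, and a zoom correction magnification from two sets of corner coordinates in the X-Y plane; the particular formulas (centroid difference, top-edge angle difference, square root of the area ratio) are assumptions for the example and are not prescribed by this embodiment.

```python
import math

def compute_correction(image_pts, screen_pts):
    """Compare image and screen corners in the X-Y plane and derive the
    correction amounts described above.

    Both arguments are lists of four (x, y) corners in the same order
    (e.g. clockwise from the top-left).  The formulas below are one
    plausible realization, not the embodiment's exact computation.
    """
    def centroid(pts):
        return (sum(p[0] for p in pts) / len(pts),
                sum(p[1] for p in pts) / len(pts))

    def polygon_area(pts):
        # Shoelace formula.
        s = 0.0
        for (x1, y1), (x2, y2) in zip(pts, pts[1:] + pts[:1]):
            s += x1 * y2 - x2 * y1
        return abs(s) / 2.0

    def top_edge_angle(pts):
        (x1, y1), (x2, y2) = pts[0], pts[1]
        return math.atan2(y2 - y1, x2 - x1)

    ic, sc = centroid(image_pts), centroid(screen_pts)
    return {
        # How far the image centroid must move to reach the screen centroid.
        "horizontal_correction": sc[0] - ic[0],
        "vertical_correction": sc[1] - ic[1],
        # Angle between corresponding top edges.
        "rotation_correction": math.degrees(
            top_edge_angle(screen_pts) - top_edge_angle(image_pts)),
        # Length scale factor; e.g. 1.08 means "enlarge by 8 %".
        "zoom_correction": math.sqrt(
            polygon_area(screen_pts) / polygon_area(image_pts)),
    }
```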
After the processing unit 14 generates the correction data, the flow proceeds to step S6.
In step S6, the processing unit 14 adjusts at least one of the setting parameters according to the correction data to reduce at least one of the size difference and the position difference between the image and the projection screen 12. For example, if the horizontal correction amount of the correction data indicates a right shift by 1 unit distance, the processing unit 14 may adjust the setting parameters by, for example, adding one to the current value of the image horizontal offset setting parameter. For another example, if the rotation correction angle of the correction data indicates a counterclockwise rotation of 2 unit angles, the processing unit 14 may, for example, subtract two from the current value of the image rotation angle setting parameter. For yet another example, if the zoom correction magnification of the correction data indicates 108%, the processing unit 14 may, for example, multiply the current value of the image size setting parameter by 1.08, but the adjustment is not limited thereto.
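Continuing the examples above, the sketch below applies such correction data to the current values of the setting parameters; the dictionary keys and sign conventions are illustrative assumptions only.

```python
def apply_correction(settings: dict, correction: dict) -> dict:
    """Apply correction data to the projection setting parameters.

    `settings` holds current parameter values; `correction` holds the amounts
    computed in the previous sketch.  All key names are illustrative.
    """
    adjusted = dict(settings)
    # A right shift of N unit distances adds N to the horizontal offset parameter.
    adjusted["horizontal_offset"] += correction["horizontal_correction"]
    adjusted["vertical_offset"] += correction["vertical_correction"]
    # A counter-clockwise rotation of N unit angles subtracts N, following the
    # sign convention of the example in the text.
    adjusted["rotation_angle"] -= correction["rotation_correction"]
    # A zoom magnification of 1.08 multiplies the image size parameter by 1.08.
    adjusted["image_scale"] *= correction["zoom_correction"]
    return adjusted

# Example with hypothetical values:
# apply_correction({"horizontal_offset": 0, "vertical_offset": 0,
#                   "rotation_angle": 0, "image_scale": 1.0},
#                  {"horizontal_correction": 1, "vertical_correction": 0,
#                   "rotation_correction": 2, "zoom_correction": 1.08})
# -> {"horizontal_offset": 1, "vertical_offset": 0,
#     "rotation_angle": -2, "image_scale": 1.08}
```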
It should be noted that, when the aspect ratio of the image is the same as that of the projection screen 12 (for example, both are 16:9), the processing unit 14 adjusts the setting parameters so that the size and the position of the image completely match the projection screen 12. However, if the aspect ratio of the image is different from that of the projection screen 12 (e.g. the image is 4:3 while the projection screen 12 is 16:9), the processing unit 14 may, for example, enlarge the image on the projection screen 12 as much as possible without changing the aspect ratio of the image and without letting the image exceed the range of the projection screen 12; in other words, one of the length and the width of the image is made to match the projection screen 12 as closely as possible while the other does not exceed it. Alternatively, the processing unit 14 may first adjust the aspect ratio of the image to be consistent with the projection screen 12, and then adjust the size and position of the image to completely match the projection screen 12, but the handling is not limited thereto.
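By way of illustration only, the strategy of enlarging the image as much as possible without exceeding the projection screen corresponds to a standard aspect-ratio fit, sketched below.

```python
def fit_within_screen(image_w, image_h, screen_w, screen_h):
    """Scale the image as large as possible inside the projection screen
    without changing its aspect ratio (the first strategy above).

    Returns the scaled width and height.
    """
    scale = min(screen_w / image_w, screen_h / image_h)
    return image_w * scale, image_h * scale

# Example: a 4:3 image on a 16:9 screen keeps the full height and leaves
# unused bands on the left and right.
# fit_within_screen(4, 3, 16, 9)  ->  (12.0, 9.0)
```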
In addition, steps S5 and S6 together serve as an image correction step of the present embodiment.
After the processing unit 14 completes the adjustment of the setting parameters according to the correction data, the flow proceeds to step S7.
In step S7, while the projection unit 111 continues to project the image, when the processing unit 14 receives touch operation data from the projection screen 12, the processing unit 14 determines whether the touch operation data indicates a specific processing procedure related to the image; if the determination result is yes, the processing unit 14 executes the specific processing procedure and presents at least one of the execution process and the execution result of the specific processing procedure in the image projected by the projection unit 111.
More specifically, in the present embodiment, the touch operation data is generated by the touch signal processor 122 of the projection screen 12 according to the touch operation received by the touch panel 121, for example, and the touch signal processor 122 provides the touch operation data to the processing unit 14 by way of wireless communication, for example. The touch operation data includes, for example, a touch operation position and a touch operation type, wherein the touch operation position includes, for example, one or more touch coordinates, and the touch operation type indicates, for example, but not limited to, one of a single-click touch operation, a double-click touch operation, a long-press touch operation, a sliding touch operation, a dragging touch operation, and a zooming touch operation.
On the other hand, in step S7, the image presents a user operation interface included in the operating system, and the specific processing procedure corresponds to the user operation interface presented in the image. When the processing unit 14 receives the touch operation data, the processing unit 14 determines whether the touch operation data indicates the specific processing procedure corresponding to the user operation interface according to the touch operation position and the touch operation type. More specifically, the processing unit 14, for example, first determines whether a virtual object (e.g., a key) exists at the position indicated by the touch operation position in the user operation interface, and if so, further determines whether the touch operation type matches an operation mode supported by that virtual object, but the determination is not limited thereto.
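By way of illustration only, the two-step check described above (is there a virtual object at the touched position, and does the touch operation type match an operation it supports) could be sketched as follows; the data fields, object names, and the idea of returning a handler name are assumptions for the example, not the actual implementation of the operating system.

```python
from dataclasses import dataclass, field

@dataclass
class VirtualObject:
    """A user-interface element (e.g. a key) with its region and gestures."""
    name: str
    x: float
    y: float
    width: float
    height: float
    supported_types: set = field(default_factory=set)

    def contains(self, px, py):
        return (self.x <= px <= self.x + self.width and
                self.y <= py <= self.y + self.height)

def dispatch_touch(ui_objects, touch_position, touch_type):
    """Decide whether touch operation data indicates a specific processing
    procedure, mirroring the two-step check described above.
    """
    px, py = touch_position
    for obj in ui_objects:
        # Step 1: is there a virtual object at the touched position?
        # Step 2: does the touch type match an operation the object supports?
        if obj.contains(px, py) and touch_type in obj.supported_types:
            return obj.name     # e.g. "close_window" -> hide that window
    return None                 # no specific processing procedure indicated
```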
For example, if the processing unit 14 determines that the touch operation data indicates that a key in the user operation interface (e.g., a key with a "close window" function) is pressed, the specific processing procedure executed by the processing unit 14 is, for example, to hide the window corresponding to that key in the image. For another example, if the processing unit 14 determines that the touch operation data indicates that a picture in the user operation interface is slid to the left, the specific processing procedure executed by the processing unit 14 is, for example, to display the next picture in the image. It should be understood that the mechanism by which the processing unit 14 executes the corresponding specific processing procedure according to the touch operation data is similar to the operation mechanism of a conventional touch-sensitive electronic device, and therefore the details are not repeated here.
In addition, step S7 serves, for example, as a touch operation processing step of the present embodiment. By means of the touch operation processing step, the user can directly perform touch operations on the projection screen 12, thereby interacting with the projection screen 12. It should be noted that the touch panel 121 has a plurality of sensing elements (e.g., electrodes, not shown) whose arrangement (e.g., the length, width, number, and spacing of the sensing elements) corresponds to the size specification of the projection screen 12. Moreover, the coordinate system adopted by the touch signal processor 122 when calculating the touch operation position of the touch operation data is correlated with the coordinate system adopted by the processing unit 14 when calculating the projection screen position coordinates of the projection screen position data, so that the two can be mapped onto each other. In this way, after the image correction step (i.e., steps S5 and S6) is performed, not only do the size and position of the image projected by the projection apparatus 11 match the projection screen 12, but the touch operation position calculated by the touch signal processor 122 also correctly matches the size specification of the projection screen 12, so that the touch operation data can be correctly determined and processed by the processing unit 14 and an accurate touch operation function can be realized; the user can thus smoothly perform touch operations on the projected image.
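By way of illustration only, the mapping between the coordinate system of the touch signal processor 122 and the projected image can be sketched as follows; it assumes the corrected image covers the touch panel exactly, which is the situation after the image correction step, and all names are illustrative.

```python
def touch_to_image_coordinates(touch_xy, panel_size, image_resolution):
    """Map a touch position reported by the touch signal processor onto the
    pixel grid of the projected image.

    Assumes the corrected image covers the panel exactly, which is the
    situation after the image correction step; all names are illustrative.
    """
    (tx, ty), (pw, ph), (iw, ih) = touch_xy, panel_size, image_resolution
    # Normalize to 0..1 in panel space, then scale to image pixels.
    return (tx / pw * iw, ty / ph * ih)
```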
It should be noted that, when the user uses the projected image for a presentation or a speech, the user can directly control the content presented by the image by touching the projection screen 12, without operating another device (such as a computer) dedicated to controlling the image, so the flow of the presentation or speech is smoother. On the other hand, compared with a prior-art touch-sensitive liquid crystal display, whose color liquid crystal display panel is costly to manufacture, the present embodiment can effectively save cost when a large-size picture is required.
The above is an example description of the projection method according to the first embodiment of the present invention. It should be noted that, although the projection method of the present embodiment performs the image positioning step (i.e., steps S2 and S3) before the projection screen positioning step (i.e., step S4), in an actual implementation the order of the two steps may be reversed, or the two steps may be performed simultaneously (e.g., by multitasking), and the order is not limited to the present embodiment. On the other hand, in a simplified embodiment in which the projection screen 12 is implemented as a general projection screen (i.e., without a touch sensing function), the projection method does not include the touch operation processing step (i.e., does not include step S7).
The present invention also provides a second embodiment of the projection display system 1, and the main difference between the second embodiment and the first embodiment lies in the implementation of the processing unit 14. Referring to fig. 3, in the second embodiment, the processing unit 14 is implemented as a central processing unit included in an electronic device 15, and can communicate with a central processing unit 115 of the projection device 11 via a wireless network, for example. More specifically, in the second embodiment, the electronic device 15 may be, for example, a notebook computer, a desktop computer, a tablet computer, or a smart phone, and an application program for remotely controlling the projection device 11 is installed in the electronic device 15, so that a user can operate the electronic device 15 to make the projection system 10 implement the projection method.
Moreover, in an implementation similar to the second embodiment, the processing unit 14 may also include both the central processing unit of the electronic device 15 and the central processing unit 115 of the projection apparatus 11; that is, steps S1 to S7 of the projection method may be shared between the electronic device 15 and the projection apparatus 11. For example, steps S1 to S6 may be executed by the central processing unit 115 of the projection apparatus 11, and step S7 may be executed by the central processing unit of the electronic device 15; however, the division of tasks between the electronic device 15 and the projection apparatus 11 can be freely changed and is not limited thereto.
In summary, the processing unit 14 of the projection system 10 can generate the correction data according to the image position data and the projection screen position data, and automatically adjust the setting parameter of the projection apparatus 11 according to the correction data to reduce the difference of at least one of the size and the position between the projected image and the projection screen 12, so that even if the relative position relationship between the projection apparatus 11 and the projection screen 12 changes, the projection system 10 can automatically correct the projected image without manual adjustment by the user, thereby achieving the object of the present invention.
The above description covers only examples of the present invention and should not be taken to limit its scope; simple equivalent changes and modifications made according to the claims and the contents of the specification still fall within the scope of the present invention.

Claims (13)

1. A projection system is suitable for a projection screen, and is characterized in that: the projection system includes:
the projection device comprises a projection unit, a shooting unit and a signal receiving unit;
the signal output unit is suitable for being arranged on the projection screen; and
the processing unit is electrically connected with the projection unit, the shooting unit and the signal receiving unit; wherein:
when the projection unit projects an image to the projection screen, the processing unit obtains a shooting result which is generated by the shooting unit and presents the image from the shooting unit, and generates image position data indicating the position of the image according to the shooting result;
receiving a plurality of wireless signals output by the signal output unit by the signal receiving unit, and generating projection screen position data indicating the position of the projection screen by the processing unit according to the wireless signals;
the processing unit generates correction data according to a comparison result between the image position data and the projection screen position data, the correction data indicates a difference of at least one of a size and a position between the image and the projection screen, and the processing unit adjusts at least one setting parameter related to the projection unit according to the correction data so as to reduce the difference of at least one of the size and the position between the image and the projection screen.
2. The projection system of claim 1, wherein:
the signal output unit comprises a plurality of signal emitters, and the signal emitters are suitable for being respectively arranged at a plurality of corner area positions of the projection screen;
the image position data comprises a plurality of image position coordinates, and the image position coordinates respectively correspond to a plurality of vertexes of the image; and
the wireless signals are respectively output by the signal transmitters, the projection screen position data comprise a plurality of projection screen position coordinates respectively corresponding to the signal transmitters, and each projection screen position coordinate is calculated by the processing unit according to the intensity and the direction of the wireless signals output by the corresponding signal transmitters.
3. The projection system of claim 1, wherein: the projection screen is a touch-control projection screen, and after the processing unit adjusts at least one setting parameter according to the correction data and under the condition that the projection unit projects the image, the processing unit executes a specific processing program when receiving touch operation data from the projection screen and judging that the touch operation data indicates the specific processing program related to the image.
4. The projection system of claim 3, wherein:
before the processing unit generates the image position data and the projection screen position data, the processing unit also generates the image and controls the projection unit to project the image;
after the processing unit adjusts at least one setting parameter according to the correction data, the image presents a user operation interface, and the processing unit determines whether the touch operation data indicates the specific processing program related to the user operation interface according to a touch operation position and a touch operation type included in the touch operation data, and presents at least one of an execution process and an execution result of the specific processing program in the image to be projected by the projection unit when the specific processing program is executed.
5. A projection method is implemented by a projection system, the projection system comprises a projection device, a signal output unit and a processing unit, the signal output unit is suitable for being arranged on a projection screen, the projection device comprises a projection unit, a shooting unit and a signal receiving unit, the projection unit, the shooting unit and the signal receiving unit are electrically connected with the processing unit, and the projection system is characterized in that: the projection method comprises the following steps:
an image positioning step in which, when the projection unit projects an image toward the projection screen, the processing unit obtains, from the shooting unit, a shooting result generated by the shooting unit and presenting the image, and generates image position data indicating a position of the image according to the shooting result;
a projection screen positioning step, wherein the signal receiving unit is used for receiving a plurality of wireless signals output by the signal output unit, and the processing unit is used for generating projection screen position data indicating the position of the projection screen according to the wireless signals; and
and an image correction step, wherein the processing unit generates correction data according to a comparison result between the image position data and the projection screen position data, the correction data indicates a difference of at least one of the size and the position between the image and the projection screen, and the processing unit adjusts at least one setting parameter related to the projection unit according to the correction data so as to reduce the difference of at least one of the size and the position between the image and the projection screen.
6. The projection method of claim 5, wherein:
the signal output unit comprises a plurality of signal emitters, and the signal emitters are suitable for being respectively arranged at a plurality of corner area positions of the projection screen;
in the image positioning step, the image position data comprises a plurality of image position coordinates, and the image position coordinates respectively correspond to a plurality of vertexes of the image;
in the projection screen positioning step, the wireless signals are respectively output by the signal transmitters, the projection screen position data include a plurality of projection screen position coordinates respectively corresponding to the signal transmitters, and each projection screen position coordinate is calculated by the processing unit according to the intensity and direction of the wireless signal output by the corresponding signal transmitter.
7. The projection method of claim 5, wherein: the projection screen is a touch-control projection screen, and the projection method further comprises a touch operation processing step after the image correction step, wherein when the projection unit projects the image, the processing unit receives touch operation data from the projection screen and executes a specific processing program when judging that the touch operation data indicates the specific processing program related to the image.
8. The projection method of claim 7, wherein:
the projection method further comprises a projection step before the image positioning step and the projection screen positioning step, wherein the processing unit generates the image and controls the projection unit to project the image; and
in the touch operation processing step, the image presents a user operation interface, and the processing unit determines whether the touch operation data indicates the specific processing program related to the user operation interface according to a touch operation position and a touch operation type included in the touch operation data, and when the specific processing program is executed, the processing unit further presents at least one of an execution process and an execution result of the specific processing program in the image to be projected by the projection unit.
9. A projection display system, characterized by: the projection display system includes:
the projection device comprises a projection unit, a shooting unit and a signal receiving unit;
a projection screen;
the signal output unit is used for being arranged on the projection screen; and
the processing unit is electrically connected with the projection unit, the shooting unit and the signal receiving unit; wherein:
when the projection unit projects an image to the projection screen, the processing unit obtains, from the shooting unit, a shooting result that is generated by the shooting unit and presents the image, and generates image position data indicating the position of the image according to the shooting result;
the signal receiving unit receives a plurality of wireless signals output by the signal output unit, and the processing unit generates projection screen position data indicating the position of the projection screen according to the wireless signals; and
the processing unit generates correction data according to a comparison result between the image position data and the projection screen position data, the correction data indicating a difference in at least one of size and position between the image and the projection screen, and the processing unit adjusts at least one setting parameter related to the projection unit according to the correction data so as to reduce the difference in at least one of size and position between the image and the projection screen.
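Putting the units of claim 9 together, the correction flow could be sketched as the loop below. The unit interfaces (camera.capture, receiver.read_signals, projector.apply) and the convergence thresholds are assumptions made for illustration; compute_correction can be, for example, the bounding-box comparison sketched earlier.

# End-to-end sketch of the correction flow; interfaces are assumed, not disclosed.
def calibrate(projector, camera, receiver, locate_image, locate_screen,
              compute_correction, tolerance=2.0, max_rounds=5):
    for _ in range(max_rounds):
        frame = camera.capture()                  # shooting result presenting the image
        image_vertices = locate_image(frame)      # image position data
        signals = receiver.read_signals()         # wireless signals from the emitters
        screen_corners = locate_screen(signals)   # projection screen position data
        correction = compute_correction(image_vertices, screen_corners)
        if (max(abs(correction.offset_x), abs(correction.offset_y)) < tolerance
                and abs(correction.scale_x - 1.0) < 0.01
                and abs(correction.scale_y - 1.0) < 0.01):
            break                                  # image and screen already match
        projector.apply(zoom_x=correction.scale_x,  # adjust the setting parameters
                        zoom_y=correction.scale_y,
                        shift_x=correction.offset_x,
                        shift_y=correction.offset_y)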
10. A projection display system according to claim 9, wherein: the projection screen is a touch-control projection screen, and in an operation mode the projection screen is switchable between a transparent state that allows light to pass through and an opaque state that reflects light.
11. A projection display system according to claim 9, wherein:
the signal output unit comprises a plurality of signal emitters, and the signal emitters are respectively arranged at a plurality of corner regions of the projection screen;
the image position data comprises a plurality of image position coordinates, and the image position coordinates respectively correspond to a plurality of vertices of the image; and
the wireless signals are respectively output by the signal emitters, the projection screen position data comprise a plurality of projection screen position coordinates respectively corresponding to the signal emitters, and each projection screen position coordinate is calculated by the processing unit according to the intensity and direction of the wireless signal output by the corresponding signal emitter.
12. A projection display system according to claim 9, wherein: the projection screen is a touch-control projection screen, and, after the processing unit adjusts the at least one setting parameter according to the correction data and while the projection unit projects the image, the processing unit, upon receiving touch operation data from the projection screen and determining that the touch operation data indicates a specific processing program related to the image, executes the specific processing program.
13. A projection display system according to claim 12, wherein:
before the processing unit generates the image position data and the projection screen position data, the processing unit also generates the image and controls the projection unit to project the image;
after the processing unit adjusts the at least one setting parameter according to the correction data, the image presents a user operation interface, and the processing unit determines, according to a touch operation position and a touch operation type included in the touch operation data, whether the touch operation data indicates the specific processing program related to the user operation interface; when the specific processing program is executed, the processing unit presents at least one of an execution process and an execution result of the specific processing program in the image to be projected by the projection unit.
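How the execution process or result might be composited into the next projected frame (claims 8 and 13) can be sketched with an off-the-shelf image library; Pillow, the overlay layout, and the program_state structure are assumptions here, not part of the disclosure.

# Illustrative overlay of progress/result onto the frame handed to the projection unit.
from PIL import Image, ImageDraw

def compose_frame(ui_frame: Image.Image, program_state: dict) -> Image.Image:
    """Overlay progress or result text onto the user-interface frame that the
    projection unit will project next."""
    frame = ui_frame.copy()
    draw = ImageDraw.Draw(frame)
    if program_state.get("running"):
        progress = program_state.get("progress", 0.0)
        draw.text((20, 20), f"Running... {progress:.0%}", fill="white")
        draw.rectangle((20, 50, 20 + int(200 * progress), 66), fill="white")  # progress bar
    else:
        draw.text((20, 20), f"Result: {program_state.get('result')}", fill="white")
    return frame  # handed to the projection unit as the next projected image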
CN202010799226.8A 2020-08-11 2020-08-11 Projection system and method and projection display system Pending CN114079756A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010799226.8A CN114079756A (en) 2020-08-11 2020-08-11 Projection system and method and projection display system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010799226.8A CN114079756A (en) 2020-08-11 2020-08-11 Projection system and method and projection display system

Publications (1)

Publication Number Publication Date
CN114079756A (en) 2022-02-22

Family

ID=80280047

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010799226.8A Pending CN114079756A (en) 2020-08-11 2020-08-11 Projection system and method and projection display system

Country Status (1)

Country Link
CN (1) CN114079756A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20240306

Address after: 361101 No. 996 Min'an Avenue, Torch High tech Zone (Xiang'an) Industrial Zone, Xiamen City, Fujian Province

Applicant after: TPK GLASS SOLUTIONS (XIAMEN) Inc.

Country or region after: China

Address before: 361021 floors 3, 4, 5 and 8, No. 190, Jimei Avenue, Jimei District, Xiamen City, Fujian Province

Applicant before: TPK Touch Systems (Xiamen) Inc.

Country or region before: China
