WO2023115857A1 - Laser projection device and method for correcting a projected image

Laser projection device and method for correcting a projected image

Info

Publication number
WO2023115857A1
WO2023115857A1 (PCT/CN2022/100357)
Authority
WO
WIPO (PCT)
Prior art keywords
image
projection
projected
correction
frame
Prior art date
Application number
PCT/CN2022/100357
Other languages
English (en)
French (fr)
Inventor
陈星
郭大勃
肖纪臣
梁倩
赵一石
Original Assignee
青岛海信激光显示股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN202111562487.9A
Priority claimed from CN202111559005.4A
Priority claimed from CN202111566402.4A
Application filed by 青岛海信激光显示股份有限公司
Publication of WO2023115857A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00: Details of colour television systems
    • H04N 9/12: Picture reproducers
    • H04N 9/31: Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]

Definitions

  • the present disclosure relates to the technical field of projection display, and in particular, to a laser projection device and a correction method for a projected image.
  • Laser display products include projection screens and laser projection equipment.
  • Laser projection equipment can project images on projection screens to achieve video playback and other functions.
  • the laser projection equipment includes a light source assembly, an optical engine and a lens.
  • the light source assembly is used to provide a high-intensity laser illumination beam to the optical engine; the optical engine is used to modulate the illumination beam with an image signal to form a projection beam.
  • the formed projection beam enters the lens; the lens is used to project the projection beam onto the projection screen.
  • a laser projection device includes a light source assembly, an optical engine, a lens, a photographing device, and a circuit system architecture.
  • the light source assembly is configured to provide an illumination beam.
  • the optical engine is configured to modulate the illumination beam with the image signal to obtain the projection beam.
  • the lens is configured to project the projection beam into an image.
  • the photographing device is configured to, in response to the photographing instruction, photograph a first projected image projected by the lens on the projection screen to obtain a photographed image; the first projected image includes a correction mark.
  • the circuit system architecture is configured to control the operation of the light source assembly and the optical engine.
  • the circuit system architecture includes a main control circuit and a display control circuit.
  • the main control circuit is coupled to the photographing device and is configured to acquire a photographed image, obtain correction data based on the position of the correction mark in the photographed image, and send the correction data to the display control circuit.
  • the display control circuit is coupled to the main control circuit and is configured to receive the correction data, perform correction processing on the image to be projected based on the correction data, and transmit the corrected image signal of the image to be projected to the optical engine, so that the optical engine uses the corrected image signal to modulate the illumination beam and obtain the projection beam.
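As a rough illustration of this division of labor, the sketch below derives translation-only correction data from the offsets between expected and detected mark positions. A real device would typically fit a full perspective (homography) correction; the function names and coordinate values here are hypothetical, not from the disclosure.

```python
# Hypothetical sketch: derive translation-only correction data from the
# positions of correction marks found in the captured image. A real device
# would typically solve a full homography from the four corner marks.

def correction_data(expected, detected):
    """Mean (dx, dy) offset of detected marks from their expected positions."""
    n = len(expected)
    dx = sum(d[0] - e[0] for e, d in zip(expected, detected)) / n
    dy = sum(d[1] - e[1] for e, d in zip(expected, detected)) / n
    return (dx, dy)

def apply_correction(points, data):
    """Display-control side: shift coordinates to cancel the measured offset."""
    dx, dy = data
    return [(x - dx, y - dy) for x, y in points]

# Expected vertex marks of the first projected image vs. positions detected
# in the captured image (values are made up for illustration).
expected = [(0, 0), (1920, 0), (1920, 1080), (0, 1080)]
detected = [(12, 8), (1932, 8), (1932, 1088), (12, 1088)]
data = correction_data(expected, detected)  # (12.0, 8.0)
```

Applying `apply_correction` to the detected positions brings them back onto the expected positions, which is the role the display control circuit plays for the image to be projected.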
  • some embodiments of the present disclosure provide a method for correcting a projected image, which is applied to a laser projection device.
  • the correction method includes: first, acquiring an image frame sequence to be projected, where the sequence includes a plurality of frames of images to be projected. Second, in response to the image processing start instruction and the image processing end instruction, adding a correction mark to an image frame subsequence to obtain a processed image frame subsequence.
  • the image frame subsequence is the part of the image frame sequence that falls within the time range corresponding to the image processing start instruction and the image processing end instruction, and the processed image frame subsequence includes the first projected image.
  • next, a captured image is acquired, i.e. an image captured while the first projected image is projected on the projection screen. Then, correction data is acquired based on the position of the correction mark in the captured image. Finally, the image to be projected is corrected based on the correction data.
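The five steps can be sketched as a pipeline. Everything below is a stand-in: the camera and projector are simulated by passing a known offset through the "optical" path, and none of the names are part of the disclosed method.

```python
# Illustrative pipeline for the five steps; hardware is simulated, and every
# name below is a stand-in rather than part of the disclosed method.

def acquire_frames():                        # step 1: sequence to be projected
    return [f"frame{i}" for i in range(4)]

def add_correction_marks(frames, lo, hi):    # step 2: mark the subsequence
    return [f + "+mark" if lo <= i < hi else f for i, f in enumerate(frames)]

def capture(offset):                         # step 3: simulated captured image
    # Simulate the camera seeing the marks shifted by a projection offset.
    return [(x + offset, y + offset) for x, y in [(0, 0), (100, 0)]]

def correction_from(marks, expected):        # step 4: correction data
    return (marks[0][0] - expected[0][0], marks[0][1] - expected[0][1])

def correct(frames, data):                   # step 5: apply to frames
    return [(f, data) for f in frames]

frames = add_correction_marks(acquire_frames(), 1, 3)
data = correction_from(capture(offset=7), [(0, 0), (100, 0)])  # (7, 7)
corrected = correct(frames, data)
```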
  • FIG. 1 is a structural diagram of a laser projection device according to some embodiments of the present disclosure
  • FIG. 2 is a schematic diagram of a light source assembly, an optical engine and a lens in a laser projection device according to some embodiments of the present disclosure
  • FIG. 3 is a structural diagram of an optical path in a laser projection device according to some embodiments of the present disclosure
  • FIG. 4 is a schematic diagram of the optical path principle of a light source assembly in a laser projection device according to some embodiments of the present disclosure
  • FIG. 5 is an arrangement structure diagram of tiny mirrors in a digital micromirror device according to some embodiments of the present disclosure
  • Fig. 6 is a schematic diagram of the operation of a tiny mirror according to some embodiments of the present disclosure.
  • Fig. 7 is a schematic diagram of the swing positions of a tiny mirror in the digital micromirror device shown in Fig. 5;
  • FIG. 8 is a schematic diagram of the positions of a laser projection device and a projection screen according to some embodiments of the present disclosure
  • FIG. 9 is a structural diagram of another laser projection device according to some embodiments of the present disclosure.
  • Fig. 10 is a sequence diagram of generating an image processing start instruction and an image processing end instruction by a main control circuit according to some embodiments of the present disclosure
  • Fig. 11 is a schematic diagram of an image frame subsequence according to some embodiments of the present disclosure.
  • Fig. 12 is a schematic diagram of a processed image frame subsequence according to some embodiments of the present disclosure.
  • Fig. 13 is a schematic diagram of another processed image frame subsequence according to some embodiments of the present disclosure.
  • Fig. 14 is a schematic diagram of another image frame subsequence according to some embodiments of the present disclosure.
  • Fig. 15 is a schematic diagram of yet another processed image frame subsequence according to some embodiments of the present disclosure.
  • Fig. 16 is a schematic diagram of yet another processed image frame subsequence according to some embodiments of the present disclosure.
  • Fig. 17 is a schematic diagram of matching a projection area of a first projection image with a projection screen according to some embodiments of the present disclosure
  • Fig. 18 is a schematic diagram showing that the projection area of the first projection image does not match the projection screen according to some embodiments of the present disclosure
  • Fig. 19 is a schematic diagram of another mismatch between the projection area of the first projection image and the projection screen according to some embodiments of the present disclosure.
  • Fig. 20 is a flowchart of a method for correcting a projected image according to some embodiments of the present disclosure
  • Fig. 21 is a flowchart of another method for correcting a projected image according to some embodiments of the present disclosure.
  • Fig. 22 is a flow chart of another method for correcting a projected image according to some embodiments of the present disclosure.
  • the terms “first” and “second” are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly specifying the quantity of the indicated technical features. Thus, a feature defined as “first” or “second” may explicitly or implicitly include one or more of such features. In the description of the embodiments of the present disclosure, unless otherwise specified, “plurality” means two or more.
  • the expressions “coupled” and “connected” and their derivatives may be used.
  • the term “connected” may be used in describing some embodiments to indicate that two or more elements are in direct physical or electrical contact with each other.
  • the term “coupled” may be used when describing some embodiments to indicate that two or more elements are in direct physical or electrical contact.
  • the terms “coupled” or “communicatively coupled” may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other.
  • the embodiments disclosed herein are not necessarily limited by the context herein.
  • the term “if” is optionally interpreted to mean “when” or “at” or “in response to determining” or “in response to detecting,” depending on the context.
  • the phrases “if it is determined that ...” or “if [the stated condition or event] is detected” are optionally construed to mean “upon determining ...” or “in response to determining ...” or “upon detection of [the stated condition or event]” or “in response to detection of [the stated condition or event]”, depending on the context.
  • Exemplary embodiments are described herein with reference to cross-sectional and/or plan views that are idealized exemplary drawings.
  • the thickness of layers and regions are exaggerated for clarity. Accordingly, variations in shape from the drawings as a result, for example, of manufacturing techniques and/or tolerances are contemplated.
  • example embodiments should not be construed as limited to the shapes of regions illustrated herein but are to include deviations in shapes that result, for example, from manufacturing. For example, an etched region illustrated as a rectangle will, typically, have curved features.
  • the regions illustrated in the figures are schematic in nature and their shapes are not intended to illustrate the actual shape of a region of a device and are not intended to limit the scope of example embodiments.
  • FIG. 1 is a structural diagram of a laser projection device according to some embodiments of the present disclosure.
  • the laser projection device includes a host 10, and the host 10 includes a whole-machine housing 101 (only part of the housing is shown in the figure), and the light source assembly 100, the optical engine 200 and the lens 300 assembled in the housing 101.
  • the light source assembly 100 is configured to provide an illumination beam (laser beam).
  • the optical machine 200 is configured to use an image signal to modulate the illumination beam provided by the light source assembly 100 to obtain a projection beam.
  • the lens 300 is configured to project the projection beam onto a projection screen or a wall for imaging.
  • the light source assembly 100 , the light engine 200 and the lens 300 are sequentially connected along the beam propagation direction, and each is wrapped by a corresponding housing.
  • the housings of the light source assembly 100 , the optical engine 200 and the lens 300 support the optical components and make the optical components meet certain sealing or airtight requirements.
  • the light source assembly 100 is hermetically sealed by its corresponding housing, which helps mitigate the light decay of the light source assembly 100.
  • One end of the light engine 200 is connected to the lens 300 and arranged along the first direction X of the whole machine, for example, the first direction X may be the width direction of the whole machine.
  • the light source assembly 100 is connected to the other end of the optical machine 200 .
  • the connection direction between the light source assembly 100 and the optical machine 200 is perpendicular to the connection direction between the optical machine 200 and the lens 300.
  • on the one hand, this connection structure can adapt to the optical path characteristics of the reflective light valve in the optical engine 200; on the other hand, it shortens the optical path in one dimension, which benefits the structural arrangement of the whole machine.
  • if the light source assembly 100, the optical engine 200 and the lens 300 were instead arranged in a straight line, the optical path in that direction would be very long, which is not conducive to the structural arrangement of the whole machine.
  • light source assembly 100 may include three laser arrays.
  • 2 is a schematic diagram of a light source assembly, an optical machine, and a lens in a laser projection device according to some embodiments of the present disclosure.
  • the light source assembly 100 is illustrated as a three-color laser light source, and the three laser arrays can be a red laser array 130, a green laser array 120 and a blue laser array 110; but not limited thereto.
  • the three laser arrays may also all be blue laser arrays 110 , or two laser arrays may be blue laser arrays 110 , and one laser array may be red laser arrays 130 .
  • the light source assembly 100 can generate an illumination beam containing light of the three primary colors, so there is no need to provide a fluorescent wheel in the light source assembly 100 (when the laser arrays included in the light source assembly can only generate laser light of one or two colors, the existing laser light must excite a fluorescent wheel to generate fluorescent light of the remaining colors, so that the laser light and the fluorescent light together form white light), thereby simplifying the structure of the light source assembly 100 and reducing its volume.
  • the light source assembly 100 may also include two laser arrays.
  • the light source assembly 100 is a two-color laser light source as an example.
  • the two laser arrays can be a blue laser array 110 and a red laser array 130;
  • the light source assembly 100 may also include a single laser array, that is, the light source assembly 100 is a monochromatic laser light source including only the blue laser array 110; alternatively, it may include only the blue laser array 110 and the red laser array 130.
  • FIG. 4 is a schematic diagram of the optical path principle of a light source assembly in a laser projection device according to some embodiments of the present disclosure.
  • after the blue laser array 110 emits blue laser light, a part of the blue light irradiates the fluorescent wheel 140 to generate red fluorescent light (not needed when the light source assembly 100 includes the red laser array 130) and green fluorescent light.
  • the blue laser light, the red fluorescent light (or red laser light) and the green fluorescent light sequentially pass through the light-combining mirror 160 and then through the color filter wheel 150 for color filtering, so that the three primary colors are output sequentially.
  • the human eye cannot distinguish the color of light at a certain moment, and what it perceives is still mixed white light.
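This temporal mixing can be illustrated numerically: the perceived color is the time average of the sequential primaries, weighted by the fraction of the frame period each occupies. The segment fractions below are hypothetical.

```python
# Simplified model of field-sequential color: the eye integrates the output
# over a frame, so the perceived color is the duty-weighted average of the
# sequential primaries. Equal thirds of red, green and blue average to a
# neutral (white) stimulus.

def perceived_color(segments):
    """segments: list of ((r, g, b), fraction_of_frame) pairs."""
    return tuple(
        round(sum(color[i] * frac for color, frac in segments), 3)
        for i in range(3)
    )

wheel = [((1, 0, 0), 1 / 3), ((0, 1, 0), 1 / 3), ((0, 0, 1), 1 / 3)]
```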
  • as shown in FIG. 2 and FIG. 3, the optical engine 200 may include a light pipe 210, a lens assembly 220, a mirror 230, a digital micromirror device (Digital Micromirror Device, DMD) 240 and a prism assembly 250.
  • the light pipe 210 receives the illumination beam provided by the light source assembly 100 and homogenizes it.
  • the lens assembly 220 can amplify the illumination light beam first, then converge it and output it to the reflector 230 .
  • the mirror 230 can reflect the illumination beam to the prism assembly 250 .
  • the prism assembly 250 reflects the illumination beam to the DMD 240, and the DMD 240 modulates the illumination beam, and reflects the modulated projection beam to the lens 300.
  • the DMD 240 is the core component; its function is to modulate, with the image signal, the illumination beam provided by the light source assembly 100, that is, to control the illumination beam to display different colors and brightness for different pixels of the image to be displayed, so as to finally form an optical image. The DMD 240 is therefore also known as a light modulation device or a light valve.
  • the light modulation device can be divided into a transmissive light modulation device (or light valve) or a reflective light modulation device (or light valve).
  • as shown in FIG. 2 and FIG. 3, the DMD 240 reflects the illumination beam and is therefore a reflective light modulation device.
  • the liquid crystal light valve transmits the illumination beam, so it is a transmissive light modulation device.
  • according to the number of light valves used, optical engines can be divided into single-chip, two-chip or three-chip systems.
  • as shown in FIG. 2 and FIG. 3, only one DMD 240 is used in the optical engine 200, so the optical engine 200 may be referred to as a single-chip system.
  • when three light valves are used, the optical engine 200 can be called a three-chip system.
  • the DMD 240 is applied in a digital light processing (Digital Light Processing, DLP) projection architecture.
  • as shown in FIG. 2 and FIG. 3, the optical engine 200 uses a DLP projection architecture.
  • FIG. 5 is an arrangement structure diagram of the tiny mirrors in a digital micromirror device according to some embodiments of the present disclosure. As shown in FIG. 5, the DMD 240 includes thousands of tiny mirrors 2401 that can be driven to rotate individually. These tiny mirrors 2401 are arranged in an array, and each tiny mirror 2401 corresponds to one pixel of the image to be displayed.
  • each tiny mirror 2401 is equivalent to a digital switch and can swing within plus or minus 12 degrees (±12°) or plus or minus 17 degrees (±17°) under the action of an external electric field, so that the light it reflects along the optical axis can be imaged on the screen through the lens 300 to form a bright pixel.
  • FIG. 6 is a schematic diagram of the operation of a tiny mirror according to some embodiments of the present disclosure.
  • the light reflected by a tiny mirror 2401 at a negative deflection angle is called OFF light; the OFF light is invalid light that strikes the whole-machine housing 101, the housing of the optical engine 200 or a light-absorbing unit, where it is absorbed.
  • the light reflected by a tiny mirror 2401 at a positive deflection angle is called ON light.
  • the ON light is the effective beam: the illumination beam received by the tiny mirrors 2401 on the surface of the DMD 240 and reflected into the lens 300 at a positive deflection angle.
  • the on state of a tiny mirror 2401 is the state that the mirror assumes and maintains when the illumination beam emitted by the light source assembly 100 is reflected by it into the lens 300, that is, the positive-deflection-angle state.
  • the off state of a tiny mirror 2401 is the state that the mirror assumes and maintains when the illumination beam emitted by the light source assembly 100 is reflected by it without entering the lens 300, that is, the negative-deflection-angle state.
  • FIG. 7 is a schematic diagram of the swing positions of a tiny mirror in the digital micromirror device shown in FIG. 5. As shown in FIG. 7, the state at +12° is the on state and the state at -12° is the off state; although the deflection angle may pass between -12° and +12°, the actual working states of the tiny mirror 2401 are only the on state and the off state.
  • likewise, for ±17° mirrors, the state at +17° is the on state and the state at -17° is the off state.
  • the image signal is converted into digital codes such as 0 and 1, and these codes drive the tiny mirrors 2401 to swing.
  • within the display period of one frame of image, some or all of the tiny mirrors 2401 switch between the on state and the off state, and the gray scale of each pixel in the frame is realized by the durations for which the corresponding mirror stays in the on state and the off state.
  • for example, when a pixel has 256 gray scales from 0 to 255, the tiny mirror corresponding to gray scale 0 is in the off state for the entire display period of one frame, the tiny mirror corresponding to gray scale 255 is in the on state for the entire display period, and the tiny mirror corresponding to gray scale 127 is in the on state for half of the display period and in the off state for the other half.
  • therefore, by controlling, through the image signal, the state of each tiny mirror 2401 and the time each state is maintained within the display period of each frame, the brightness (gray scale) of the pixel corresponding to that mirror is controlled, thereby modulating the illumination beam projected onto the DMD 240.
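The 8-bit grayscale scheme amounts to pulse-width modulation of each mirror: the on-time within a frame is proportional to the gray level. A minimal sketch, with a hypothetical frame period:

```python
# Sketch of binary PWM grayscale: a mirror's on-time within one frame is
# proportional to the pixel's 8-bit gray level. FRAME_MS is hypothetical.

FRAME_MS = 16.6  # display period of one frame, in milliseconds

def on_time_ms(gray, frame_ms=FRAME_MS):
    """Time the mirror spends in the on state for gray levels 0..255."""
    return frame_ms * gray / 255

# Gray 0 stays off all frame, gray 255 stays on all frame, and gray 127
# is on for roughly half of the frame, matching the example above.
```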
  • the light pipe 210, the lens assembly 220 and the mirror 230 at the front end of the DMD 240 form an illumination light path, and the illumination beam emitted by the light source assembly 100 passes through this illumination light path to form a beam whose size and incident angle meet the requirements of the DMD 240.
  • the lens 300 includes a combination of multiple lenses, which are usually divided into groups, such as the front group, the middle group and the rear group.
  • the front group is the lens group near the light output side of the projection device (left side shown in FIG. 2 )
  • the rear group is the lens group near the light output side of the light engine 200 (right side shown in FIG. 2 ).
  • the lens 300 may be a zoom lens, a fixed-focal-length lens with adjustable focus, or a fixed-focus lens.
  • the laser projection device is an ultra-short-focus projection device
  • the lens 300 is an ultra-short-focus lens
  • the throw ratio of the lens 300 is usually less than 0.3, such as 0.24.
  • the throw ratio refers to the ratio of the projection distance to the width of the projected picture; the smaller the ratio, the wider the projected picture at the same projection distance.
  • an ultra-short-focus lens with a relatively small throw ratio can therefore adapt to a narrow space while ensuring the projection effect.
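The throw-ratio relationship can be checked with a couple of lines; the 0.24 ratio is the example cited above, while the distances are hypothetical.

```python
# Throw ratio = projection distance / projected picture width. With the 0.24
# ratio cited above, a short distance already yields a wide picture; the
# distances here are illustrative only.

def picture_width(distance_m, throw_ratio):
    """Width of the projected picture at a given projection distance."""
    return distance_m / throw_ratio

def projection_distance(width_m, throw_ratio):
    """Distance needed to obtain a picture of the given width."""
    return width_m * throw_ratio

# 0.5 m from the wall gives a picture roughly 2.08 m wide at ratio 0.24,
# while a 2 m wide picture needs only about 0.48 m of distance.
```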
  • FIG. 8 is a schematic diagram of the positions of a laser projection device and a projection screen according to some embodiments of the present disclosure.
  • the host 10 of the laser projection device is set apart from the projection screen, generally with a certain distance between them.
  • when the host 10 of the laser projection device moves, that is, when the whole-machine housing 101 moves, the projected image projected by the lens 300 onto the projection screen 30 also shifts, which may cause the projected image to exceed the display range of the projection screen 30 and affect the projection display effect.
  • as shown in FIG. 8 and FIG. 9, the laser projection device includes a host 10 and a photographing device 20.
  • the photographing device 20 is a device capable of photographing the projection screen 30 .
  • the photographing device 20 may be a camera.
  • the photographing device 20 may be arranged on the whole-machine housing 101 of the host 10, or at a position outside the housing; the present disclosure does not limit the location of the photographing device 20.
  • as shown in FIG. 9, the host 10 of the laser projection device further includes a circuit system architecture 400.
  • the circuit system architecture 400 may be a printed circuit board assembly (PCBA).
  • the circuit system architecture 400 is configured to control the operation of the light source assembly 100 and the optical engine 200.
  • as shown in FIG. 1, the present disclosure does not limit the position at which the circuit system architecture 400 is arranged in the whole machine.
  • as shown in FIG. 9, the circuit system architecture 400 includes a main control circuit 401 and a display control circuit 402.
  • the main control circuit 401 is coupled to the photographing device 20 and the display control circuit 402.
  • the display control circuit 402 may be a DLP chip.
  • the main control circuit 401 is configured to acquire an image frame sequence to be projected, which includes multiple frames of images to be projected; an image to be projected may be a video picture to be played or a user interface (UI) image.
  • the image to be projected may also be an image used only for image correction. The following embodiments are described taking the image to be projected as a video picture to be played or a UI image as an example.
  • the main control circuit 401 is configured to, in response to the image processing start instruction and the image processing end instruction, select from the image frame sequence the image frame subsequence within the time range corresponding to the two instructions, and add the correction mark to the image frame subsequence to obtain a processed image frame subsequence.
  • the processed image frame subsequence includes a first projected image that contains a correction mark. That is, the first projected image is an image to which a correction mark has been added; the first projected image may also be called a correction chart.
  • the first projection image includes one or more correction marks, and when the first projection image includes multiple correction marks, the positions of the multiple correction marks in the first projection image are different.
  • a plurality of correction marks may be distributed at the vertex positions of the first projection image, or at the midpoints of its edges.
  • the following embodiments take correction marks distributed at the vertex positions of the first projection image as an illustrative example.
  • the shape of the correction marks may include rhombus, star or cross.
  • the present disclosure does not limit the shape and quantity of the correction marks included in the first projected image.
  • the following embodiments are described taking multiple correction marks of the same shape and the same size as an example.
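As an illustration of such a correction chart, the sketch below places identical cross-shaped marks at the four vertices of a small raster. The resolution, mark size and cross shape are arbitrary choices for the example, not values taken from the disclosure.

```python
# Hypothetical correction chart: a black raster with identical cross-shaped
# correction marks at the four vertices. Resolution and mark size are
# arbitrary; the disclosure does not fix them.

def make_chart(w, h, arm=5):
    """w x h grid of 0s with crosses of 1s (arm length `arm`) at each corner."""
    img = [[0] * w for _ in range(h)]
    for cx, cy in [(0, 0), (w - 1, 0), (0, h - 1), (w - 1, h - 1)]:
        for d in range(-arm, arm + 1):
            if 0 <= cx + d < w:
                img[cy][cx + d] = 1  # horizontal arm of the cross
            if 0 <= cy + d < h:
                img[cy + d][cx] = 1  # vertical arm of the cross
    return img

chart = make_chart(64, 36)
```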
  • the image processing start instruction is used to instruct the main control circuit 401 to start adding correction marks in the image frame subsequence.
  • the image processing end instruction is used to instruct the main control circuit 401 to stop adding correction marks in the image frame subsequence. It should be noted that the image processing start instruction and the image processing end instruction may be set in the system, or may be triggered by a user.
  • the main control circuit 401 is further configured to periodically generate an image processing start instruction and an image processing end instruction.
  • FIG. 10 is a timing diagram of a main control circuit generating an image processing start instruction and an image processing end instruction according to some embodiments of the present disclosure.
  • the main control circuit 401 generates an image processing start command Cmd1 and an image processing end command Cmd2 with the first preset time T1 as the cycle.
  • the interval between the generation time of the image processing start command Cmd1 and the generation time of the image processing end command Cmd2 may be a second preset time T2.
  • the image frame subsequence is the image frame sequence corresponding to the image to be projected within the second preset time T2 starting from the generation time of the image processing start command Cmd1.
  • the present disclosure does not limit the specific values of the first preset time T1 and the second preset time T2, which can be set according to actual conditions.
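The timing of FIG. 10 can be sketched as follows: a start command Cmd1 is issued every T1 seconds, and the end command Cmd2 follows T2 seconds later. The concrete values of T1 and T2 below are hypothetical, since the disclosure leaves them to be set according to actual conditions.

```python
# Sketch of the FIG. 10 timing: Cmd1 every T1 seconds, Cmd2 following T2
# seconds later (T2 < T1). The values of T1 and T2 are hypothetical.

T1, T2 = 10.0, 2.0  # cycle and marking window, in seconds

def command_schedule(cycles):
    """(time, command) pairs for the first `cycles` periods."""
    events = []
    for k in range(cycles):
        events.append((k * T1, "Cmd1"))       # start adding correction marks
        events.append((k * T1 + T2, "Cmd2"))  # stop adding correction marks
    return events

def marking_active(t):
    """True while frames at time t fall inside the marking window."""
    return (t % T1) < T2
```

Frames whose timestamps satisfy `marking_active` form the image frame subsequence that receives the correction mark.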
  • a timed projection-state self-check switch may be provided on the video playback interface or the user interaction interface. If the user turns this switch on, the main control circuit 401 is triggered to periodically generate the image processing start command Cmd1 and the image processing end command Cmd2.
  • a correction switch may also be provided on the video playback interface or the user interaction interface, allowing the user to trigger a single correction of the projected image.
  • the main control circuit 401 may generate the image processing start command Cmd1 or the image processing end command Cmd2 based on the user's operation of the correction switch.
  • the main control circuit 401 may add a correction mark to the image frame subsequence by means of image replacement or image insertion to obtain a processed image frame subsequence.
  • the main control circuit 401 is configured to: replace the first image in the image frame subsequence with the first projected image to obtain a processed image frame subsequence.
  • the main control circuit 401 is configured to: insert the first projected image before or after the second image in the image frame subsequence to obtain the processed image frame subsequence. Inserting the first projected image before the second image may be inserting the first projected image between the second image and the previous frame image of the second image; inserting the first projected image after the second image may be inserting the first projected image between the second image and the next frame image of the second image.
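The two processing modes described above, image replacement and image insertion, reduce to simple list operations on the frame sequence. The Python helpers below are an illustrative sketch (the function names and list representation are assumptions; the disclosure does not prescribe an implementation):

```python
def replace_first_image(frames, index, projected):
    # Image replacement: the frame at `index` (the "first image") is
    # swapped for the first projected image carrying the correction mark.
    out = list(frames)
    out[index] = projected
    return out

def insert_projected(frames, index, projected, before=False):
    # Image insertion: the first projected image is placed immediately
    # before or after the frame at `index` (the "second image").
    out = list(frames)
    out.insert(index if before else index + 1, projected)
    return out
```

Using the document's naming (P0 flanked by Pm1 and Pm2, marked image P1), replacement yields [Pm1, P1, Pm2], while insertion keeps P0 and adds P1 next to it.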
  • when adding the correction mark to the image frame subsequence by means of image replacement or image insertion, an image in the image frame subsequence can be selected as the background image of the correction mark in the first projected image, or another image can serve as the background image of the first projected image, where the other image is an image other than the images in the image frame subsequence.
  • the following embodiments take an example of selecting an image in the image frame subsequence as the background image of the calibration mark in the first projected image for illustration.
  • the main control circuit 401 is configured to: add a correction mark to the first image to obtain a first projection image, and then use the first projection image to replace the first image.
  • the first image is an image in the image frame subsequence, so adding the correction mark to the first image to obtain the first projected image is equivalent to using the first image, that is, an image in the image frame subsequence, as the background image of the first projected image.
  • the main control circuit 401 is configured to: add a correction mark to the second image to obtain the first projection image; insert the first projection image before or after the second image.
  • the second image is an image in the image frame subsequence, so adding the correction mark to the second image to obtain the first projected image is equivalent to using the second image, that is, an image in the image frame subsequence, as the background image of the first projected image.
  • the first image or the second image in the image frame subsequence is used as the background image of the first projection image, that is, the video image to be played or the user interface image is used as the background image of the first projection image.
  • when the user watches the video or the user interface, it is difficult for the user to perceive the first projected image or the correction mark in the first projected image, so the projected image can be corrected without the user being aware of it.
  • the main control circuit 401 periodically selects the first image or the second image in the image frame subsequence.
  • As shown in FIG. 10, within the second preset time T2, the first image or the second image is selected every third preset time T3.
  • Each time the first image or the second image is selected, it may include one frame of the video picture or user interface picture to be played, or multiple frames of video pictures or user interface pictures to be played.
  • the number of image frames included in the first image or the second image selected each time is related to the number of correction marks added to the first image or the second image.
  • the present disclosure does not limit the shape of the video picture or user interface picture to be played, and the following embodiments are described by taking the video picture or user interface picture being played as a rectangle as an example.
  • every third preset time T3, one frame of the video picture to be played or of the user interface picture may be selected as the first image or the second image, and a plurality of correction marks may be added at different positions in the first image or the second image; thus, every third preset time T3, one frame of the first projected image can be obtained, and this frame of the first projected image includes a plurality of correction marks.
  • FIG. 10 is a timing diagram of a main control circuit generating an image processing start instruction and an image processing end instruction according to some embodiments of the present disclosure
  • FIG. 11 is a schematic diagram of an image frame subsequence according to some embodiments of the present disclosure.
  • FIG. 12 is a schematic diagram of a processed image frame subsequence according to some embodiments of the present disclosure
  • FIG. 13 is a schematic diagram of another processed image frame subsequence according to some embodiments of the present disclosure.
  • As shown in FIGS. 10 to 13, every third preset time T3, one frame of the video picture to be played or of the user interface picture is selected from the image frame subsequence as the first image or the second image; this selected frame is the image P0, the previous frame image of the image P0 is the image Pm1, and the next frame image of the image P0 is the image Pm2.
  • Four correction marks a are added to the image P0 to obtain the first projected image P1.
  • the first projected image P1 includes the frame of the to-be-played video picture or the picture of the user interface to which the four correction marks a are added.
  • As shown in FIG. 12, the four correction marks a in the first projected image P1 are rhombuses with the same shape and size, and the left vertex a1 of the rhombus can be used as the marking point of the correction mark to determine the position of the correction mark in the captured image. The four correction marks a are respectively located at the four vertices of the first projected image P1.
  • As shown in FIG. 11 and FIG. 12, replacing the first image with the first projected image means using the first projected image P1 to replace the image P0 in the image frame subsequence shown in FIG. 11, obtaining the processed image frame subsequence shown in FIG. 12.
  • Alternatively, the first projected image is inserted after the second image, that is, the first projected image P1 is inserted between the image P0 and the image Pm2 in the image frame subsequence shown in FIG. 11, obtaining the processed image frame subsequence shown in FIG. 13.
  • the first projected image P1 may also be inserted between the image P0 and the image Pm1 in the image frame subsequence shown in FIG. 11 to obtain a processed image frame subsequence.
  • every third preset time T3, multiple frames of video pictures to be played or user interface pictures can also be selected as the first image or the second image, and a correction mark is added to each frame of the video picture to be played or user interface picture to obtain multiple frames of the first projected image.
  • Each frame of the first projected image among the multiple frames of first projected images includes one correction mark, and the position of each correction mark in its corresponding first projected image is different.
  • the multi-frame video images to be played or the user interface images may be continuous multi-frame images or discontinuous multi-frame images.
  • whether the multiple frames of video pictures to be played or user interface pictures included in the first image or the second image are continuous pictures is not limited. The following embodiments are described by taking the multiple frames of video pictures to be played or user interface pictures included in the first image or the second image as continuous pictures as an example.
  • FIG. 10 is a timing diagram of a main control circuit generating an image processing start instruction and an image processing end instruction according to some embodiments of the present disclosure
  • FIG. 14 is a schematic diagram of another image frame subsequence according to some embodiments of the present disclosure
  • Fig. 15 is a schematic diagram of another processed image frame subsequence according to some embodiments of the present disclosure
  • Fig. 16 is a schematic diagram of another processed image frame subsequence according to some embodiments of the present disclosure.
  • As shown in FIG. 15, a correction mark a is added to the upper left corner of the image P11 to obtain the first projected image P21.
  • Similarly, a correction mark a is added to the image P12 to obtain the first projected image P22, to the image P13 to obtain the first projected image P23, and to the image P14 to obtain the first projected image P24, with each correction mark at a different position.
  • As shown in FIGS. 14 and 15, replacing the first image with the first projected image means replacing the image P11 with the first projected image P21, the image P12 with the first projected image P22, the image P13 with the first projected image P23, and the image P14 with the first projected image P24, obtaining the processed image frame subsequence shown in FIG. 15.
  • As shown in FIGS. 14 and 16, inserting the first projected image after the second image means, based on the image frame subsequence shown in FIG. 14, inserting the first projected images P21 to P24 after the corresponding images P11 to P14, obtaining the processed image frame subsequence shown in FIG. 16.
  • Alternatively, the first projected image may be inserted before the second image, that is, based on the image frame subsequence shown in FIG. 14, the first projected images P21 to P24 are inserted before the corresponding images P11 to P14 to obtain a processed image frame subsequence.
  • when adding a correction mark to the first image or the second image, the color of the correction mark can be set to a color that differs significantly from the background color of the first image or the second image; for example, if the background color is black, the color of the correction mark is white, so that the captured image can later be processed to better extract the position of the correction mark in the captured image. For example, the background color of the first image or the second image can first be obtained; then, the background color is inverted to obtain a contrasting bright color, for example, by subtracting each color value of the background color from 255; finally, the color of the correction mark is set to this bright color.
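The 255-minus inversion described above can be sketched per color channel as follows (the function name and RGB-tuple representation are illustrative assumptions):

```python
def mark_color_for(background_rgb):
    # Invert each channel (255 minus the value) so the correction mark
    # stands out against the background: a black background (0, 0, 0)
    # yields a white mark (255, 255, 255).
    return tuple(255 - c for c in background_rgb)
```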
  • the main control circuit 401 is further configured to periodically send a control command and the first projected image to the display control circuit 402, the control command being used to control the display control circuit 402 to project the first projected image onto the projection screen 30.
  • the main control circuit 401 generates the image processing start command Cmd1 and the image processing end command Cmd2 with the first preset time T1 as a cycle, and selects the first image and the second image with the third preset time T3 as a cycle, that is, generates the first projected image with the third preset time T3 as a cycle.
  • the main control circuit 401 may send the control command and the first projected image to the display control circuit 402 at a period of the third preset time T3.
  • the display control circuit 402 is configured to receive the first projected image from the main control circuit 401 and, in response to a control instruction, transmit an image signal of the first projected image to the optical machine 200, so that the optical machine 200 uses the image signal of the first projected image to modulate the illumination beam provided by the light source assembly 100 to obtain a projection beam, and the first projected image is projected onto the projection screen 30 through the lens 300.
  • the main control circuit 401 is further configured to periodically send a photographing instruction to the photographing device 20, and the photographing device 20 is configured to respond to the photographing instruction and photograph the first projected image projected by the lens 300 on the projection screen to obtain the captured image.
  • As shown in FIG. 10, the main control circuit 401 generates the shooting instruction, the image processing start command Cmd1 and the image processing end command Cmd2, and selects the first image and the second image with the third preset time T3 as a cycle, that is, generates the first projected image with the third preset time T3 as a cycle.
  • the photographing device 20 photographs the projection image on the projection screen 30 within a second preset time T2, and can photograph the first projection image projected by the lens 300 on the projection screen to obtain a photographed image.
  • the main control circuit 401 generates the image processing start command Cmd1 and the image processing end command Cmd2 with the first preset time T1 as the cycle, and selects the first image and the second image with the third preset time T3 as the cycle, that is, generates the first projected image with the third preset time T3 as the cycle.
  • the main control circuit 401 generates a shooting instruction with a period of the third preset time T3.
  • After the first projected image is projected onto the projection screen 30, the shooting device 20 responds to the shooting instruction and photographs the first projected image projected by the lens 300 on the projection screen to obtain the captured image.
  • If the period with which the main control circuit 401 periodically sends shooting instructions to the shooting device 20 is small enough, the shooting device can be considered to be shooting the projection screen 30 in real time.
  • the specific period with which the main control circuit 401 periodically sends shooting instructions to the shooting device 20 is not limited.
  • the parameters of the photographing device 20 can be set according to the application scenario; for example, the photographing device 20 can continuously capture multiple frames of images.
  • the main control circuit 401 is further configured to: determine whether the projection area of the first projection image matches the range of the projection screen 30 based on the captured image.
  • the range of the projection screen 30 may be the range where the display area is set, or the range of the frame of the projection screen 30 .
  • Fig. 17 is a schematic diagram showing that the projection area of the first projected image matches the projection screen according to some embodiments of the present disclosure
  • Fig. 18 is a schematic diagram showing that the projection area of the first projected image does not match the projection screen according to some embodiments of the present disclosure
  • FIG. 19 is another schematic diagram of a mismatch between the projection area of the first projected image and the projection screen according to some embodiments of the present disclosure. As shown in FIGS. 17 to 19, the range of the projection screen 30 is a range Z1, and the projection area of the first projected image is a range Z2.
  • That the projection area of the first projected image matches the range of the projection screen 30 means, as shown in FIG. 17, that the projection area Z2 of the first projected image and the range Z1 of the projection screen 30 almost overlap or completely overlap, where almost overlapping may mean overlapping as far as the human eye can recognize.
  • the area inside the frame of the projection screen 30 displays the projected image of the first projected image, so the brightness value of the area inside the frame of the projection screen 30 is higher.
  • the luminance value of the area outside the frame of the projection screen 30 should be equivalent to the luminance value of the use environment of the laser projection device. Therefore, by comparing the luminance value of the area within the frame of the projection screen 30 with the luminance value of the area outside the frame of the projection screen 30, it can be determined whether the projection area Z2 of the first projected image overlaps with the area Z1 of the projection screen 30.
  • the main control circuit 401 is further configured to identify the frame of the projection screen 30 in the captured image.
  • a reference chart including the projection screen 30 may be pre-stored in the main control circuit 401. After the main control circuit 401 obtains the captured image, it performs image processing on the captured image and identifies the frame of the projection screen in the captured image by comparing the captured image with the pre-stored reference chart including the frame of the projection screen 30. The reference chart can then be overlaid at the position of the projection screen 30 in the captured image, so that information outside the projection screen 30 in the captured image can be obtained.
  • the frame coordinates of the projection screen 30 in the captured image may also be pre-stored in the main control circuit 401; after the main control circuit 401 acquires the captured image, it recognizes the frame according to the pre-stored frame coordinates of the projection screen 30.
  • the present disclosure does not limit the specific method for the main control circuit 401 to identify the frame of the projection screen 30 in the captured image.
  • the main control circuit 401 acquires the first brightness value of the area inside the frame and the second brightness value of the area outside the frame in the captured image. If the first brightness value is less than or equal to the first threshold, or the second brightness value is greater than or equal to the second threshold, it is determined that the projection area of the first projection image does not match the range of the projection screen.
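The threshold comparison above can be sketched as follows. The function and parameter names are illustrative, and the brightness values are assumed to be scalar averages over the respective regions:

```python
def mismatch_by_brightness(first_brightness, second_brightness,
                           first_threshold, second_threshold):
    # The area inside the screen frame should be bright (it shows the
    # projected content); the area outside should stay near ambient
    # level. Either condition failing indicates a mismatch.
    return (first_brightness <= first_threshold
            or second_brightness >= second_threshold)
```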
  • the present disclosure does not limit the set value of the first threshold or the second threshold, which can be set according to actual applications.
  • As shown in FIG. 18, if the first brightness value of the area inside the frame in the captured image is less than or equal to the first threshold, it can be considered that part of the area inside the frame of the projection screen 30 is not irradiated by the projection beam, that is, the projection area Z2 of the first projected image does not match the range Z1 of the projection screen 30.
  • FIG. 19 is another schematic diagram showing that the projection area of the first projected image does not match the projection screen according to some embodiments of the present disclosure.
  • the second threshold may be set according to the brightness value of the use environment of the laser projection device; the brightness value of the use environment can be obtained by photographing the projection screen, or can be pre-stored in the main control circuit 401. If the second brightness value of the area outside the frame in the captured image is greater than or equal to the second threshold, it can be considered that part of the area outside the frame of the projection screen 30 is irradiated by the projection beam projected by the lens 300, causing the second brightness value to be greater than or equal to the second threshold, that is, the projection area Z2 of the first projected image does not match the range Z1 of the projection screen 30.
  • after the main control circuit 401 recognizes the frame of the projection screen 30 in the captured image, it acquires the third brightness value of the area outside the frame in the captured image, and obtains the amount of change between the third brightness value and the fourth brightness value of the area outside the frame in the captured image acquired in the previous period. If the amount of change is greater than the third threshold, it is determined that the projection area of the first projected image does not match the range of the projection screen.
  • the present disclosure does not limit the setting value of the third threshold, which can be set according to actual applications.
  • if the third threshold is 0, since the shooting instruction is periodically issued by the main control circuit 401, when the brightness information of the area outside the frame in the captured images acquired in two adjacent periods is inconsistent, it can be considered that some areas are irradiated by the projection beam projected by the lens 300, resulting in a change amount greater than 0, that is, the projection area Z2 of the first projected image does not match the range Z1 of the projection screen 30.
  • the third threshold may also be set to a value greater than 0.
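The change-based check of the preceding paragraphs can be sketched similarly. Treating the "amount of change" as an absolute difference is an assumption; the disclosure only states that the change is compared with the third threshold:

```python
def mismatch_by_change(third_brightness, fourth_brightness,
                       third_threshold=0):
    # Compare the outside-frame brightness of the current captured image
    # with that of the captured image from the previous period; a change
    # beyond the threshold suggests the beam strays off the screen.
    return abs(third_brightness - fourth_brightness) > third_threshold
```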
  • the main control circuit 401 is further configured to: if it is determined that the projection area Z2 of the first projected image does not match the range Z1 of the projection screen, determine whether the correction mark has an offset based on the position of the correction mark in the captured image and the position of the correction mark in the target image.
  • the target image is a captured image obtained when the projection area Z2 of the first projected image matches the range Z1 of the projection screen; the target image may be captured and stored in the main control circuit 401 when the projection area Z2 of the first projected image matches the range Z1 of the projection screen.
  • since the correction marks in the first projected image have the same shape and size, the position of a correction mark in the captured image and its position in the target image can be determined based on the position of the same marking point of the correction mark.
  • the position of the marker point a1 in the captured image is the position of the calibration marker in the captured image
  • the position of the marker point a1 in the target image is the position of the calibration marker in the target image.
  • the main control circuit 401 can determine the offset of the marking point a1 by comparing the position of the marking point a1 in the captured image with the position of the marking point a1 in the target image.
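Comparing the marking point a1 between the captured image and the target image reduces to a coordinate difference (the tuple representation and function name are illustrative assumptions):

```python
def marker_offset(captured_a1, target_a1):
    # Offset of marking point a1: its position in the captured image
    # minus its position recorded in the target image; (0, 0) means
    # the mark has not shifted.
    dx = captured_a1[0] - target_a1[0]
    dy = captured_a1[1] - target_a1[1]
    return (dx, dy)
```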
  • the frame of the projection screen 30 is a rectangle, and a two-dimensional coordinate system can be established with the lower left vertex of the rectangle as the origin, the left frame of the rectangle as the x-axis, and the lower frame of the rectangle as the y-axis.
  • the main control circuit 401 is further configured to: if there is an offset, determine the correction data according to the offset.
  • if none of the correction marks has an offset, the first projected image has not shifted after being projected, and there is no need to correct the subsequent video pictures or user interface pictures to be played. If at least one of the multiple correction marks has an offset, it can be considered that the projected image has shifted after the first projected image is projected, and the video pictures or user interface pictures to be played subsequent to the first projected image need to be corrected.
  • the main control circuit 401 is configured to acquire geometric correction data according to the transformation relationship between the position coordinates of the correction mark in the captured image and the position coordinates of the correction mark in the target image. After the position coordinates of the correction mark in the captured image and in the target image are determined, the degree of geometric distortion caused by the tilt of the optical axis of the laser projection device can be known, and the anti-distortion data, that is, the geometric correction data, can then be determined.
  • the frame of the projection screen 30 is a rectangle, and a two-dimensional coordinate system can be established with the lower left vertex of the rectangle as the origin, the left frame of the rectangle as the x-axis, and the lower frame of the rectangle as the y-axis.
  • the position coordinates of the four calibration marks a in the first projected image P1 can be determined by the coordinates of the corresponding marker points a1 .
  • the position coordinates of the four correction marks a in the captured image are (x1, y1), (x2, y2), (x3, y3), (x4, y4), and the position coordinates of the correction marks in the target image are (x1', y1'), (x2', y2'), (x3', y3'), (x4', y4'), respectively.
  • the position coordinates of the correction mark in the captured image are transformed into the position coordinates of the correction mark in the target image through geometric correction, so as to realize anti-distortion.
  • the geometric correction data may be determined by formula (1), where i is an integer in [1, 4] and a, b, c, d, e, f, u, v are the geometric correction data.
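Formula (1) itself is not reproduced in this text. A common 8-parameter projective (anti-distortion) model consistent with the coefficients a, b, c, d, e, f, u, v and the four point pairs is xi' = (a·xi + b·yi + c)/(u·xi + v·yi + 1) and yi' = (d·xi + e·yi + f)/(u·xi + v·yi + 1); the sketch below, offered only under that assumption, solves the resulting 8×8 linear system from the four correction-mark correspondences and applies the transform to a point:

```python
def solve_correction_params(captured_pts, target_pts):
    # Build the 8x8 linear system relating the four marking-point
    # coordinates (xi, yi) in the captured image to (xi', yi') in the
    # target image under the assumed projective transform:
    #   xi' = (a*xi + b*yi + c) / (u*xi + v*yi + 1)
    #   yi' = (d*xi + e*yi + f) / (u*xi + v*yi + 1)
    rows, rhs = [], []
    for (x, y), (xp, yp) in zip(captured_pts, target_pts):
        rows.append([x, y, 1, 0, 0, 0, -xp * x, -xp * y]); rhs.append(xp)
        rows.append([0, 0, 0, x, y, 1, -yp * x, -yp * y]); rhs.append(yp)
    # Gaussian elimination with partial pivoting.
    n = 8
    m = [row + [b] for row, b in zip(rows, rhs)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            factor = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= factor * m[col][c]
    sol = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = m[r][n] - sum(m[r][c] * sol[c] for c in range(r + 1, n))
        sol[r] = s / m[r][r]
    return sol  # [a, b, c, d, e, f, u, v]

def correct_point(params, x, y):
    # Map a pixel coordinate through the fitted transform (formula (2)
    # would apply the same mapping to every pixel of the image).
    a, b, c, d, e, f, u, v = params
    w = u * x + v * y + 1
    return (a * x + b * y + c) / w, (d * x + e * y + f) / w
```

With four non-degenerate point pairs (no three collinear), the system has a unique solution; a pure translation of the marks, for instance, recovers a = e = 1 and u = v = 0.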
  • the display control circuit 402 is further configured to receive the correction data, perform correction processing on the image to be projected based on the correction data, and transmit the corrected image signal of the image to be projected to the optical machine 200, so that the optical machine 200 uses the image signal of the corrected image to be projected to modulate the illumination beam, and the corrected image to be projected is projected onto the projection screen 30.
  • performing correction processing on the image to be projected based on the correction data may be correcting the video pictures or user interface pictures to be played subsequent to the first projected image based on the correction data, so as to ensure that the video pictures or user interface pictures to be played subsequent to the first projected image do not shift, thereby ensuring the user's viewing effect.
  • the display control circuit 402 may perform geometric correction on the video picture or user interface picture to be played following the first projected image based on the correction data.
  • for any pixel in the video picture or user interface picture to be played subsequent to the first projected image, the display control circuit 402 substitutes the position coordinates (x, y) of the pixel into formula (2) to obtain the position coordinates (x', y') of the pixel in the corrected image, thereby realizing the geometric correction of the video pictures or user interface pictures to be played subsequent to the first projected image.
  • since the video picture or user interface picture to be played subsequent to the first projected image may lose some pixel features after geometric correction, an interpolation operation may be performed on the image obtained after the geometric correction to obtain the final corrected first projected image and the subsequent video pictures or user interface pictures to be played.
  • through the interpolation operation, the missing pixels caused by image scaling can be supplemented, the possible mosaic phenomenon can be eliminated, and the content of the display picture can transition smoothly without affecting the viewing effect.
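The disclosure does not name a specific interpolation scheme; as one concrete (assumed) form, bilinear interpolation blends the four nearest source pixels for each fractional coordinate produced by the geometric correction:

```python
def bilinear_sample(image, x, y):
    # image: row-major grid of grayscale values; (x, y) may land between
    # pixel centers after geometric correction, so blend the four
    # nearest neighbors to avoid holes and mosaic artifacts.
    h, w = len(image), len(image[0])
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = image[y0][x0] * (1 - fx) + image[y0][x1] * fx
    bottom = image[y1][x0] * (1 - fx) + image[y1][x1] * fx
    return top * (1 - fy) + bottom * fy
```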
  • the projection screen before correction is shown in FIG. 19
  • the projection screen after correction is shown in FIG. 17 .
  • the display control circuit 402 generates the corrected image signal, so that the optical machine 200 uses the corrected image signal of the image to be projected to modulate the illumination beam, and projects the corrected image to be projected onto the projection screen 30 .
  • the main control circuit 401 acquires a sequence of image frames to be projected, and then periodically generates a first projected image based on the sequence of image frames to be projected.
  • the method of generating the first projected image may be as follows: the main control circuit 401 periodically generates the image processing start command Cmd1 and the image processing end command Cmd2, periodically selects the first image and the second image in the image frame subsequence, and adds a correction mark to the first image and the second image to generate the first projected image.
  • the main control circuit 401 periodically generates shooting instructions and control instructions.
  • In response to the control instruction, the display control circuit 402 periodically projects the first projected image onto the projection screen 30. In response to a photographing instruction, the photographing device 20 periodically acquires captured images. Then, the main control circuit 401 determines, based on the captured image, whether the projection area Z2 of the first projected image matches the range Z1 of the projection screen 30. If the projection area Z2 of the first projected image does not match the range Z1 of the projection screen 30 and at least one of the correction marks has an offset, the correction data is determined based on the offsets of the multiple correction marks. Finally, the display control circuit 402 corrects the video pictures or user interface pictures to be played subsequent to the first projected image based on the correction data, and the corrected video pictures or user interface pictures to be played are projected onto the projection screen by the optical machine 200 and the lens 300.
  • the laser projection device can automatically determine whether the projected image needs to be corrected, and when the projected image needs to be corrected, it does not need manual correction by the user.
  • the image to be projected is automatically corrected without interrupting the normal playback of the video picture or the user interface picture, and the correction process is difficult for the user to perceive, so that the user experience can be improved.
  • FIG. 20 is a flow chart of a method for correcting a projected image according to some embodiments of the present disclosure. As shown in FIG. 20, the correction method includes the following steps:
  • Step 2001: Acquire a sequence of image frames to be projected, where the sequence of image frames to be projected includes a plurality of images to be projected.
  • Step 2002 in response to the image processing start instruction and the image processing end instruction, adding a correction mark to the image frame subsequence to obtain a processed image frame subsequence.
  • the image frame subsequence is the image frame sequence within the time range corresponding to the image processing start instruction and the image processing end instruction in the image frame sequence to be projected, and the processed image frame subsequence includes the first projected image.
  • Step 2003 acquire the photographed image.
  • the captured image is an image captured when the first projection image is projected onto the projection screen.
  • Step 2004 Obtain correction data based on the position of the correction mark in the captured image.
  • Step 2005 correcting the image to be projected based on the correction data.
  • In some embodiments, adding the correction mark to the image frame subsequence to obtain the processed image frame subsequence includes: replacing a first image in the image frame subsequence with the first projection image to obtain the processed image frame subsequence.
  • Alternatively, the first projection image is inserted before or after a second image in the image frame subsequence to obtain the processed image frame subsequence.
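The two ways of obtaining the processed image frame subsequence (replacing a frame, or inserting the marked frame before or after it) amount to simple sequence edits. A minimal sketch, using strings in place of real image frames (the labels follow the P0/P1/Pm1/Pm2 naming used in the figures):

```python
def replace_frame(frames, index, marked_frame):
    """Replace the frame at `index` with the marked first projection image."""
    out = list(frames)
    out[index] = marked_frame
    return out

def insert_frame(frames, index, marked_frame, after=True):
    """Insert the marked frame after (or before) the frame at `index`."""
    out = list(frames)
    out.insert(index + 1 if after else index, marked_frame)
    return out
```

With frames `["Pm1", "P0", "Pm2"]`, replacing index 1 with `"P1"` yields `["Pm1", "P1", "Pm2"]` (FIG. 12), while inserting `"P1"` after index 1 yields `["Pm1", "P0", "P1", "Pm2"]` (FIG. 13).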
  • In some embodiments, obtaining the correction data based on the position of the correction mark in the captured image includes: determining, based on the position of the correction mark in the captured image and the position of the correction mark in a target image, whether the correction mark has an offset; and if so, determining the correction data according to the offset of the correction mark.
  • In some embodiments, the correction method further includes: determining, based on the captured image, whether the projection area of the first projection image matches the range of the projection screen; and if they do not match, obtaining the correction data based on the position of the correction mark in the captured image.
  • FIG. 21 is a flow chart of another projection image correction method according to some embodiments of the present disclosure. As shown in FIG. 21, determining, based on the captured image, whether the projection area of the first projection image matches the range of the projection screen includes the following steps:
  • Step 2101: identify the frame of the projection screen in the captured image.
  • Step 2102: acquire a first luminance value of the area inside the frame and a second luminance value of the area outside the frame in the captured image.
  • Step 2103: if the first luminance value is less than or equal to a first threshold, or the second luminance value is greater than or equal to a second threshold, determine that the projection area of the first projection image does not match the range of the projection screen.
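Steps 2101 to 2103 reduce to two threshold comparisons on the mean luminance inside and outside the identified screen frame. A minimal sketch (the function names and threshold values are illustrative, not from the patent):

```python
def mean(values):
    return sum(values) / len(values)

def projection_matches(inside_lums, outside_lums, thr1, thr2):
    """False if the area inside the screen frame is too dark (part of the
    projection is missing there) or the area outside the frame is too bright
    (the projection spills past the frame)."""
    first = mean(inside_lums)    # first luminance value (inside the frame)
    second = mean(outside_lums)  # second luminance value (outside the frame)
    return not (first <= thr1 or second >= thr2)
```

For example, a bright interior with a dark exterior matches; a dim interior or a bright exterior signals a mismatch that triggers correction.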
  • FIG. 22 is a flowchart of yet another projection image correction method according to some embodiments of the present disclosure. As shown in FIG. 22, determining, based on the captured image, whether the projection area of the first projection image matches the range of the projection screen may also include the following steps:
  • Step 2201: identify the frame of the projection screen in the captured image.
  • Step 2202: acquire a third luminance value of the area outside the frame in the captured image, and acquire the amount of change between the third luminance value and a fourth luminance value of the area outside the frame in the captured image acquired in the previous cycle.
  • Step 2203: if the amount of change is greater than a third threshold, determine that the projection area of the first projection image does not match the range of the projection screen.
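Steps 2201 to 2203 instead compare the outside-frame luminance against the value captured in the previous cycle. A sketch, under the assumption that the "amount of change" is the absolute difference:

```python
def projection_matches_by_change(third_lum, fourth_lum, thr3=0.0):
    """False (mismatch) if the outside-frame luminance changed by more than
    `thr3` relative to the previous cycle's captured image."""
    return abs(third_lum - fourth_lum) <= thr3
```

With a third threshold of 0, any change in the luminance outside the frame between two consecutive cycles is treated as the projection beam newly illuminating (or leaving) that area.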
  • Through the above method, while the user is watching the projected image on the projection screen, the laser projection device can automatically determine whether the projected image needs to be corrected. When correction is needed, the image to be projected is corrected automatically, without manual correction by the user, without interrupting normal playback of the video picture or user interface picture, and in a way the user is unlikely to notice, so the user experience is improved.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Transforming Electric Information Into Light Information (AREA)

Abstract

本公开一些实施例提出一种激光投影设备,包括光源组件、光机、镜头、拍摄装置和电路系统架构。光源组件被配置为提供照明光束。光机被配置为利用图像信号对照明光束进行调制,以获得投影光束。镜头被配置为将投影光束投射成像。拍摄装置被配置为拍摄镜头在投影屏幕上投射的第一投影图像,得到拍摄图像;第一投影图像包括校正标识。电路系统架构被配置为控制光源和光机运行。电路系统架构包括主控电路和显示控制电路。主控电路耦接至拍摄装置,被配置为获取拍摄图像,并基于校正标识在拍摄图像中的位置,得到校正数据;向显示控制电路发送校正数据。显示控制电路耦接至主控电路,且被配置为接收校正数据,基于校正数据对待投影图像进行校正处理。

Description

激光投影设备及投影图像的校正方法
本申请要求于2021年12月20日提交的、申请号为202111566402.4的中国专利申请,于2021年12月20日提交的、申请号为202111562487.9的中国专利申请,以及于2021年12月20日提交的、申请号为202111559005.4的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本公开涉及投影显示技术领域,尤其涉及一种激光投影设备及投影图像的校正方法。
背景技术
随着激光显示产品的普及,激光显示产品开始作为替代电视的大屏幕产品走进了千家万户。激光显示产品包括投影屏幕和激光投影设备,激光投影设备能够在投影屏幕上投射画面,以实现视频播放等功能。
激光投影设备包括光源组件、光机和镜头,该光源组件用于向光机提供高强度的激光照明光束;该光机用于对激光照明光束进行图像信号调制形成投影光束,经光机调制后形成的投影光束进入镜头;该镜头用于将投影光束投射至投影屏幕上。
发明内容
一方面,本公开一些实施例提供一种激光投影设备,包括光源组件、光机、镜头、拍摄装置和电路系统架构。光源组件被配置为提供照明光束。光机被配置为利用图像信号对照明光束进行调制,以获得投影光束。镜头被配置为将投影光束投射成像。拍摄装置被配置为响应于拍摄指令,拍摄镜头在投影屏幕上投射的第一投影图像,得到拍摄图像;第一投影图像包括校正标识。电路系统架构被配置为控制光源和光机运行。电路系统架构包括主控电路和显示控制电路。主控电路耦接至拍摄装置,被配置为获取拍摄图像,并基于校正标识在拍摄图像中的位置,得到校正数据;向显示控制电路发送校正数据。显示控制电路耦接至主控电路,且被配置为接收校正数据,基于校正数据对待投影图像进行校正处理,并向光机传输校正处理后的待投影图像的图像信号,以使光机利用校正处理后的待投影图像的图像信号对照明光束进行调制,以获得投影光束。
另一方面,本公开一些实施例提供一种投影图像的校正方法,应用于激光投影设备,该校正方法包括:首先,获取待投影的图像帧序列,待投影的图像帧序列包括多个待投影的图像。其次,响应于图像处理开始指令和图像处理结束指令,在图像帧子序列中添加校正标识,得到处理后的图像帧子序列。图像帧子序列为待投影的图像帧序列中图像处理开始指令和图像处理结束指令对应的时间范围内的图像帧序列,处理后的图像帧子序列包括第一投影图像。再次,获取拍摄图像,拍摄图像为第一投影图像投射至投影屏幕上时拍摄的图像。然后,基于校正标识在拍摄图像中的位置,获取校正数据。最后,基于校正数据对待投影图像进行校正。
附图说明
图1为根据本公开一些实施例的一种激光投影设备的结构图;
图2为根据本公开一些实施例的激光投影设备中光源组件、光机和镜头的示意图;
图3为根据本公开一些实施例的激光投影设备中的光路架构图;
图4为根据本公开一些实施例的激光投影设备中光源组件的光路原理示意图;
图5为根据本公开一些实施例的数字微镜器件中的微小反射镜片的排列结构图;
图6为根据本公开一些实施例的微小反射镜片的工作示意图;
图7为图5所示数字微镜器件中一个微小反射镜片摆动的位置示意图;
图8为根据本公开一些实施例的一种激光投影设备与投影屏幕的位置示意图;
图9为根据本公开一些实施例的另一种激光投影设备的结构图;
图10为根据本公开一些实施例的一种主控电路生成图像处理开始指令和图像处理结束指令的时序图;
图11为根据本公开一些实施例的一种图像帧子序列示意图;
图12为根据本公开一些实施例的一种处理后的图像帧子序列示意图;
图13为根据本公开一些实施例的另一种处理后的图像帧子序列示意图;
图14为根据本公开一些实施例的另一种图像帧子序列示意图;
图15为根据本公开一些实施例的又一种处理后的图像帧子序列示意图;
图16为根据本公开一些实施例的又一种处理后的图像帧子序列示意图;
图17为根据本公开一些实施例的一种第一投影图像的投影区域与投影屏幕匹配的示意图;
图18为根据本公开一些实施例的一种第一投影图像的投影区域与投影屏幕不匹配的示意图;
图19为根据本公开一些实施例的另一种第一投影图像的投影区域与投影屏幕不匹配的示意图;
图20为根据本公开一些实施例的一种投影图像的校正方法的流程图;
图21为根据本公开一些实施例的另一种投影图像的校正方法的流程图;
图22为根据本公开一些实施例的又一种投影图像的校正方法的流程图。
具体实施方式
下面将结合附图,对本公开一些实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅仅是本公开一部分实施例,而不是全部的实施例。基于本公开所提供的实施例,本领域普通技术人员所获得的所有其他实施例,都属于本公开保护的范围。
除非上下文另有要求,否则,在整个说明书和权利要求书中,术语“包括(comprise)”及其其他形式例如第三人称单数形式“包括(comprises)”和现在分词形式“包括(comprising)”被解释为开放、包含的意思,即为“包含,但不限于”。在说明书的描述中,术语“一个实施例(one embodiment)”、“一些实施例(some embodiments)”、“示例性实施例(exemplary embodiments)”、“示例(example)”、“特定示例(specific example)”或“一些示例(some examples)”等旨在表明与该实施例或示例相关的特定特征、结构、材料或特性包括在本公开的至少一个实施例或示例中。上述术语的示意性表示不一定是指同一实施例或示例。此外,所描述的特定特征、结构、材料或特点可以以任何适当方式包括在任何一个或多个实施例或示例中。
以下,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征。在本公开实施例的描述中,除非另有说明,“多个”的含义是两个或两个以上。
在描述一些实施例时,可能使用了“耦接”和“连接”及其衍伸的表达。例如,描述一些实施例时可能使用了术语“连接”以表明两个或两个以上部件彼此间有直接物理接触或电接触。又如,描述一些实施例时可能使用了术语“耦接”以表明两个或两个以上部件有直接物理接触或电接触。然而,术语“耦接”或“通信耦合(communicatively coupled)”也可能指两个或两个以上部件彼此间并无直接接触,但仍彼此协作或相互作用。这里所公开的实施例并不必然限制于本文内容。
如本文中所使用,根据上下文,术语“如果”任选地被解释为意思是“当……时”或“在……时”或“响应于确定”或“响应于检测到”。类似地,根据上下文,短语“如果确定……”或“如果检测到[所陈述的条件或事件]”任选地被解释为是指“在确定……时”或“响应于确定……”或“在检测到[所陈述的条件或事件]时”或“响应于检测到[所陈述的条件或事件]”。
本文中“适用于”或“被配置为”的使用意味着开放和包容性的语言,其不排除适用于或被配置为执行额外任务或步骤的设备。
另外,“基于”的使用意味着开放和包容性,因为“基于”一个或多个条件或值的过程、步骤、计算或其他动作在实践中可以基于额外条件或超出的值。
如本文所使用的那样,“约”或“近似”包括所阐述的值以及处于特定值的可接受偏差范围内的平均值,其中可接受偏差范围如由本领域普通技术人员考虑到正在讨论的测量以及与特定量的测量相关的误差(即,测量系统的局限性)所确定。
本文参照作为理想化示例性附图的剖视图和/或平面图描述了示例性实施方式。在附图中,为了清楚,放大了层和区域的厚度。因此,可设想到由于例如制造技术和/或公差引起的相对于附图的形状的变动。因此,示例性实施方式不应解释为局限于本文示出的区域的形状,而是包括因例如制造而引起的形状偏差。例如,示为矩形的蚀刻区域通常将具有弯曲的特征。因此,附图中所示的区域本质上是示意性的,且它们的形状并非旨在示出设备的区域的实际形状,并且并非旨在限制示例性实施方式的范围。
图1为根据本公开一些实施例的一种激光投影设备的结构图,如图1所示,激光投影设备包括主机10,主机10包括整机壳体101(图中仅示出部分壳体),装配于整机壳体101中的光源组件100、光机200,以及镜头300。该光源组件100被配置为提供照明光束(激光束)。该光机200被配置为利用图像信号对光源组件100提供的照明光束进行调制以获得投影光束。该镜头300被配置为将投影光束投射在投影屏幕或墙壁上成像。光源组件100、光机200和镜头300沿着光束传播方向依次连接,各自由对应的壳体进行包裹。光源组件100、光机200和镜头300各自的壳体对各光学部件进行支撑并使得各光学部件达到一定的密封或气密要求。比如,光源组件100通过其对应的外壳实现气密性密封,可以较好地改善光源组件100的光衰问题。
光机200的一端和镜头300连接且沿着整机第一方向X设置,比如第一方向X可以为整机的宽度方向。在光机200的另一端连接有光源组件100。在本示例中,光源组件100与光机200的连接方向,垂直于光机200与镜头300的连接方向,这种连接结构一方面可以适应光机200中反射式光阀的光路特点,另一方面,还有利于缩短一个维度方向上光路的长度,利于整机的结构排布。例如,当将光源组件100、光机200和镜头300设置在一个维度方向(例如第二方向,第二方向与第一方向X垂直的方向)上时,该方向上光路的长度就会很长,从而不利于整机的结构排布。
在一些实施例中,光源组件100可以包括三个激光器阵列。图2为根据本公开一些实施例的激光投影设备中光源组件、光机和镜头的示意图,如图2所示,以光源组件100为三色激光光源为例,该三个激光器阵列可分别为红色激光器阵列130、绿色激光器阵列120和蓝色激光器阵列110;但并不局限于此。该三个激光器阵列也可以均为蓝色激光器阵列110,或者两个激光器阵列为蓝色激光器阵列110、一个激光器阵列为红色激光器阵列130。当光源组件100包括的多个激光器可以产生三基色时,光源组件100可以产生包含三基色光的照明光束,因此光源组件100内不需要设置荧光轮(当光源组件所包括的一个或多个激光器阵列仅能产生一种或两种颜色的激光时,需要使用已有颜色的激光激发荧光轮来产生其他颜色的荧光,从而使激光和荧光一起形成白光),进而能够简化光源组件100的结构,减小光源组件100的体积。
在一些实施例中,光源组件100还可以包括两个激光器阵列。以光源组件100为双色激光光源为例,该两个激光器阵列可以为蓝色激光器阵列110和红色激光器阵列130;该两个激光器阵列也可以均为蓝色激光器阵列110,此时光源组件100实际为单色激光光源。
在另一些实施例中,光源组件100还可以仅包括一个激光器阵列,即光源组件100为单色激光光源,例如光源组件100仅包括蓝色激光器阵列110。
图4为根据本公开一些实施例的激光投影设备中光源组件的光路原理示意图,如图4所示,激光器阵列可以为蓝色激光器阵列110,该光源组件100还可以包括:荧光轮140和滤色轮150。该蓝色激光器阵列110发射蓝光后,一部分蓝光照射到荧光轮140上以产生红光荧光(当光源组件100包括红色激光器阵列130时,则不需要再产生红色荧光)和绿光荧光;该蓝色激光、红光荧光(或红色激光)以及绿光荧光依次通过合光镜160后再通过滤色轮150进行滤色,并时序性地输出三基色光。根据人眼的视觉暂留现象,人眼分辨不出某一时刻光的颜色,感知到的仍然是混合的白光。
光源组件100发出的照明光束进入光机200。图2为根据本公开一些实施例的激光投影设备中光源组件、光机和镜头的示意图,图3为根据本公开一些实施例的激光投影设备中的光路架构图,如图2和图3所示,光机200可以包括:光导管210,透镜组件220,反射镜230,数字微镜器件(Digital Micromirror Device,DMD)240以及棱镜组件250。该光导管210可以接收光源组件100提供的照明光束,并对该照明光束进行匀化。透镜组件220可以对照明光束先进行放大后进行会聚并出射至反射镜230。反射镜230可以将照明光束反射至棱镜组件250。棱镜组件250将照明光束反射至DMD 240,DMD 240对照明光束进行调制,并将调制后得到的投影光束反射至镜头300中。
光机200中,DMD 240是核心部件,其作用是利用图像信号对光源组件100提供的照明光束进行调制,即:控制照明光束针对待显示图像的不同像素显示不同的颜色和亮度,以最终形成光学图像,因此DMD 240也被称为光调制器件或光阀。根据光调制器件(或光阀)对照明光束进行透射还是进行反射,可以将光调制器件(或光阀)分为透射式光调制器件(或光阀)或反射式光调制器件(或光阀)。例如,如图2和图3所示,DMD 240对照明光束进行反射,即为一种反射式光调制器件;而液晶光阀对照明光束进行透射,因此是一种透射式光调制器件。此外,根据光机中使用的光调制器件(或光阀)的数量,可以将光机分为单片系统、双片系统或三片系统。例如,如图2和图3所示的光机200中仅使用了一片DMD 240,因此光机200可被称为单片系统。当使用三片数字微镜器件时,光机200则可以被称为三片系统。
DMD 240应用于数字光处理(Digital Light Processing,DLP)投影架构中,如图2和图3所示,光机200使用了DLP投影架构。图5为根据本公开一些实施例的数字微镜器件中的微小反射镜片的排列结构图,如图5所示,DMD 240包含成千上万个可被单独驱动以旋转的微小反射镜片2401,这些微小反射镜片2401呈阵列排布,每个微小反射镜片2401对应待显示图像中的一个像素。在DLP投影架构中,每个微小反射镜片2401相当于一个数字开关,在外加电场作用下可以在正负12度(±12°)或者正负17度(±17°)的范围内摆动,以使得被反射的光能够沿光轴方向通过镜头300成像在屏上,形成一个亮的像素。
图6为根据本公开一些实施例的微小反射镜片的工作示意图,如图6所示,微小反射镜片2401在负的偏转角度反射出的光,称之为OFF光,OFF光为无效光,通常打到整机壳体101上、光机200的壳体上或者光吸收单元上吸收掉。微小反射镜片2401在正的偏转角度反射出的光,称之为ON光,ON光是DMD 240表面的微小反射镜片2401接收照明光束照射,并通过正的偏转角度射入镜头300的有效光束,用于投影成像。微小反射镜片2401的开状态为光源组件100发出的照明光束经微小反射镜片2401反射后可以进入镜头300时,微小反射镜片2401所处且可以保持的状态,即微小反射镜片2401处于正的偏转角度的状态。微小反射镜片2401的关状态为光源组件100发出的照明光束经微小反射镜片2401反射后未进入镜头300时,微小反射镜片2401所处且可以保持的状态,即微小反射镜片2401处于负的偏转角度的状态。
示例性地,图7为图5所示数字微镜器件中一个微小反射镜片摆动的位置示意图,如图7所示,对于偏转角度为±12°的微小反射镜片2401,位于+12°的状态即为开状态,位于-12°的状态即为关状态;-12°和+12°之间的偏转角度仅为摆动过程中的过渡位置,微小反射镜片2401的实际工作状态仅有开状态和关状态。
示例性地,对于偏转角度为±17°的微小反射镜片2401,位于+17°的状态即为开状态,位于-17°的状态即为关状态。图像信号通过处理后被转换成0、1这样的数字代码,这些数字代码可以驱动微小反射镜片2401摆动。
在一帧图像的显示周期内,部分或全部微小反射镜片2401会在开状态和关状态之间切换一次,从而根据微小反射镜片2401在开状态和关状态分别持续的时间来实现一帧图像中的各个像素的灰阶。例如,当像素具有0~255这256个灰阶时,与灰阶0对应的微小反射镜片在一帧图像的整个显示周期内均处于关状态,与灰阶255对应的微小反射镜片在一帧图像的整个显示周期内均处于开状态,而与灰阶127对应的微小反射镜片在一帧图像的显示周期内一半时间处于开状态、另一半时间处于关状态。因此通过图像信号控制DMD240中每个微小反射镜片在一帧图像的显示周期内所处的状态以及各状态的维持时间,可以控制该微小反射镜片2401对应像素的亮度(灰阶),实现对投射至DMD 240的照明光束进行调制的目的。
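上述灰阶控制本质上是按灰阶值为微小反射镜片分配一帧显示周期内的开状态时长。下面给出一个简化示意(假设灰阶与开状态时长线性对应,60 Hz 帧率与函数名仅为示例,并非专利原文参数):

```python
FRAME_PERIOD_MS = 1000.0 / 60.0  # 示例:60 Hz 帧率下一帧图像的显示周期(毫秒)

def on_time_ms(gray, frame_period_ms=FRAME_PERIOD_MS):
    """按 0~255 灰阶线性分配微小反射镜片处于开状态的时长(毫秒)。"""
    return gray / 255.0 * frame_period_ms
```

例如灰阶 255 对应整个显示周期均处于开状态,灰阶 0 对应全程关状态,灰阶 127 对应约一半的显示周期(127/255)处于开状态。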
DMD 240前端的光导管210,透镜组件220和反射镜230形成照明光路,光源组件100发出的照明光束经过照明光路后形成符合DMD 240所要求的光束尺寸和入射角度。
如图2所示,本公开一些实施例的激光投影设备中光源组件、光机和镜头的示意图中,镜头300包括多片透镜组合,通常按照群组进行划分,分为前群、中群和后群三段式,或者前群和后群两段式。前群是靠近投影设备出光侧(图2所示的左侧)的镜片群组,后群是靠近光机200出光侧(图2所示的右侧)的镜片群组。根据上述多种镜片组组合,镜头300也可以是变焦镜头,或者为定焦可调焦镜头,或者为定焦镜头。在一些实施例中,激光投影设备为超短焦投影设备,镜头300为超短焦镜头,镜头300的投射比通常小于0.3,比如0.24。投射比是指投影距离与画面宽度之比,比值越小,说明相同投影距离,投射画面的宽度越大。投射比较小的超短焦镜头保证投射效果的同时,能够适应较狭窄的空间。
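上文提到的投射比即投影距离与画面宽度之比。一个简单的计算示意(函数名与数值仅为示例):

```python
def throw_ratio(distance_m, width_m):
    """投射比 = 投影距离 / 画面宽度;比值越小,相同距离投出的画面越宽。"""
    return distance_m / width_m

def throw_distance(ratio, width_m):
    """给定投射比与画面宽度,反推所需的投影距离。"""
    return ratio * width_m
```

例如投射比为 0.24、画面宽度为 2.0 米时,所需投影距离约为 0.48 米,因此超短焦镜头能够适应较狭窄的空间。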
图8为根据本公开一些实施例的一种激光投影设备与投影屏幕的位置示意图,如图8所示,激光投影设备的主机10与投影屏幕分开设置,两者之间一般相距一段距离。当激光投影设备的主机10发生移动时,也就是整机壳体101发生移动时,镜头300投射到投影屏幕30上的投影图像也会产生移位,因此可能会造成投影图像超出投影屏幕30显示范围的情况,影响投影显示效果。
为此,本公开的一些实施例提供一种激光投影设备,图8为根据本公开一些实施例的一种激光投影设备与投影屏幕的位置示意图,图9为根据本公开一些实施例的另一种激光投影设备的结构图,如图8和图9所示,激光投影设备包括主机10和拍摄装置20。该拍摄装置20为能够对投影屏幕30进行拍摄的设备。例如,该拍摄装置20可以为摄像头。
示例性地,拍摄装置20可以设置于主机10的整机壳体101上,或者,拍摄装置20也可以设置于主机10的整机壳体101之外的位置,本公开对于拍摄装置20的设置位置并不限定。
在一些实施例中,图9为根据本公开一些实施例的另一种激光投影设备的结构图,如图9所示,激光投影设备的主机10还包括电路系统架构(circuit system architecture)400,该电路系统架构400可以为印刷电路板组件(Printed Circuit Board Assembly,PCBA)。该电路系统架构400被配置为控制光源组件100和光机200运行。示例性地,如图1所示,电路系统架构400可以设置于整机壳体101内,本公开对于电路系统架构400的设置位置并不限定。
在一些实施例中,图9为根据本公开一些实施例的另一种激光投影设备的结构图,如图9所示,电路系统架构400包括主控电路401和显示控制电路402。主控电路401耦接至拍摄装置20与显示控制电路402,示例性地,显示控制电路402可以为DLP芯片。
在一些实施例中,主控电路401被配置为获取待投影图像的图像帧序列,待投影的图像帧序列包括多帧待投影的图像,待投影的图像可以是待播放的视频画面或者用户界面的画面,待投影的图像也可以是仅用于图像校正的图像,下述实施例以待投影的图像是待播放的视频画面或者用户界面(User Interface,UI)的画面为例进行说明。
主控电路401被配置为响应于图像处理开始指令和图像处理结束指令,在图像帧序列中图像处理开始指令和图像处理结束指令对应的时间范围内选取图像帧子序列,并在图像帧子序列中添加校正标识,得到处理后的图像帧子序列。处理后的图像帧子序列包括第一投影图像,该第一投影图像包括校正标识。即,第一投影图像为添加校正标识的图像,该第一投影图像也可以称为校正图卡。
在一些实施例中,第一投影图像包括的校正标识为一个或多个,当第一投影图像包括多个校正标识时,该多个校正标识在第一投影图像中的位置不同。多个校正标识可以分布于第一投影图像的顶点位置,也可以分布于第一投影图像的边的中点位置,下述实施例以校正标识分布于第一投影图像的顶点位置为例进行示例性说明。校正标识的形状可以包括菱形、星形或十字形,本公开对于第一投影图像包括的校正标识的形状和数量并不限定,下述实施例以多个校正图案的形状相同且尺寸相同为例进行说明。
图像处理开始指令用于指示主控电路401开始在图像帧子序列中添加校正标识。图像处理结束指令用于指示主控电路401停止在图像帧子序列中添加校正标识。需要说明的是,图像处理开始指令和图像处理结束指令可以是在系统中设置的,也可以是用户触发的。
在一些实施例中,主控电路401还被配置为周期性地生成图像处理开始指令和图像处理结束指令。示例性地,图10为根据本公开一些实施例的一种主控电路生成图像处理开始指令和图像处理结束指令的时序图,如图10所示,以第一预设时间T1为周期,生成图像处理开始指令Cmd1和图像处理结束指令Cmd2。在以图像处理开始指令Cmd1生成时间为起点的第一预设时间T1内,图像处理开始指令Cmd1的生成时间和图像处理结束指令Cmd2的生成时间之间的间隔可以为第二预设时间T2。图像帧子序列即为对应于以图像处理开始指令Cmd1生成时间为起点的第二预设时间T2内的待投影图像的图像帧序列。本公开对于第一预设时间T1和第二预设时间T2的具体值并不限定,可根据实际情况进行设置。
在一些实施例中,可以在视频播放界面或者用户交互界面设置定时投影状态自检开关,若用户打开该定时投影状态自检开关,则触发主控电路401周期性地生成图像处理开始指令Cmd1和图像处理结束指令Cmd2。
在一些实施例中,也可以在视频播放界面或者用户交互界面设置校正开关,由用户触发单次投影图像的校正。示例性地,主控电路401可以基于用户对校正开关的操作生成图像处理开始指令Cmd1或图像处理结束指令Cmd2。
在一些实施例中,主控电路401可以通过图像替换或图像插入的方式在图像帧子序列中添加校正标识,得到处理后的图像帧子序列。下面分别对这两种方式进行介绍。
若通过图像替换的方式在图像帧子序列中添加校正标识,主控电路401被配置为:将图像帧子序列中的第一图像替换为第一投影图像,得到处理后的图像帧子序列。
若通过图像插入的方式在图像帧子序列中添加校正标识,主控电路401被配置为:在图像帧子序列中第二图像之前或之后插入第一投影图像,得到处理后的图像帧子序列。在第二图像之前插入第一投影图像,可以为在第二图像与第二图像的前一帧图像之间插入第一投影图像;在第二图像之后插入第一投影图像,可以为在第二图像与第二图像的后一帧图像之间插入第一投影图像。
在一些实施例中,通过图像替换或图像插入的方式在图像帧子序列中添加校正标识时,可以选取图像帧子序列中的图像作为第一投影图像中校正标识的背景图像,也可以选取其他图像作为第一投影图像的背景图像,其他图像为除图像帧子序列中的图像之外的图像。下述实施例以选取图像帧子序列中的图像作为第一投影图像中校正标识的背景图像为例进行示例性说明。
在一些实施例中,主控电路401被配置为:在第一图像中添加校正标识,得到第一投影图像,然后采用该第一投影图像替换该第一图像。第一图像为图像帧子序列中的图像,因此在第一图像中添加校正标识,得到第一投影图像。相当于将第一图像,也就是图像帧子序列中的图像,作为第一投影图像的背景图像。
在一些实施例中,主控电路401被配置为:在第二图像中添加校正标识,得到第一投影图像;在第二图像之前或之后插入第一投影图像。第二图像为图像帧子序列中的图像,因此在第二图像中添加校正标识,得到第一投影图像。相当于将第二图像,也就是图像帧子序列中的图像,作为第一投影图像的背景图像。
将图像帧子序列中的第一图像或第二图像作为第一投影图像的背景图像,也就是将待播放的视频画面或者用户界面的画面作为第一投影图像的背景图像。这样一来,用户在观看视频或者用户界面时,不易感知到第一投影图像或第一投影图像中的校正标识,因此,可以在用户无感知的情况下对投影图像进行校正。
在一些实施例中,主控电路401周期性地在图像帧子序列中选取第一图像或第二图像。比如,如图10所示,以第三预设时间T3为周期,选取第一图像或第二图像,也就是说在第二预设时间T2内,每隔第三预设时间T3,选取一次第一图像或第二图像。每一次选取的第一图像或第二图像,可以包括一帧待播放的视频画面或者用户界面的画面,也可以包括多帧待播放的视频画面或者用户界面的画面。并且,每一次选取的第一图像或第二图像包括的图像帧的数量与在第一图像或第二图像中添加校正标识的数量有关。本公开对于待播放的视频画面或者用户界面的画面的形状并不限定,下述实施例以待播放的视频画面或者用户界面的画面为长方形为例进行说明。
为了保证校正效果,示例性地,在第二预设时间T2内,每隔第三预设时间T3,可以选取一帧待播放的视频画面或者用户界面的画面作为第一图像或第二图像,并在第一图像或第二图像中的不同位置上添加多个校正标识,因此每隔第三预设时间T3,可以得到一帧第一投影图像,该一帧第一投影图像包括多个校正标识。
比如,图10为根据本公开一些实施例的一种主控电路生成图像处理开始指令和图像处理结束指令的时序图,图11为根据本公开一些实施例的一种图像帧子序列示意图,图12为根据本公开一些实施例的一种处理后的图像帧子序列示意图,图13为根据本公开一些实施例的另一种处理后的图像帧子序列示意图。如图10至图13所示,在第二预设时间T2内,每隔第三预设时间T3,在图像帧子序列中,选取一帧待播放的视频画面或者用户界面的画面,该选取的一帧待播放的视频画面或者用户界面的画面为图像P0,将图像P0作为第一图像或第二图像,图像P0的前一帧图像为图像Pm1,图像P0的后一帧图像为图像Pm2。在图像P0中,添加4个校正标识a,得到第一投影图像P1。第一投影图像P1包括添加了该4个校正标识a的该一帧待播放的视频画面或者用户界面的画面。
图12为根据本公开一些实施例的一种处理后的图像帧子序列示意图,如图12所示,第一投影图像P1中的4个校正标识a为形状与尺寸均相同的菱形,并且可将菱形的左顶点a1作为校正标识的标记点,以确定校正标识的在拍摄图像中的位置。该4个校正标识a分别位于第一投影图像P1的四个顶点位置。
图11为根据本公开一些实施例的一种图像帧子序列示意图,图12为根据本公开一些实施例的一种处理后的图像帧子序列示意图,如图11和图12所示,采用第一投影图像替换第一图像,即采用第一投影图像P1替换如图11所示的图像帧子序列中的图像P0,得到 如图12所示的处理后的图像帧子序列。或者,如图11和图13所示,在第二图像之后插入第一投影图像,即在如图11所示的图像帧子序列中的图像P0与Pm2之间插入第一投影图像P1,得到如图13所示的处理后的图像帧子序列。或者,也可以在如图11所示的图像帧子序列中的图像P0与图像Pm1之间插入第一投影图像P1,得到处理后的图像帧子序列。
示例性地,每隔第三预设时间T3,也可以选取多帧待播放的视频画面或者用户界面的画面作为第一图像或第二图像,并在每一帧待播放的视频画面或者用户界面的画面中添加一个校正标识,得到多帧第一投影图像。该多帧第一投影图像中的每一帧第一投影图像包括一个校正标识,且每一个校正标识在该校正标识对应的第一投影图像中的位置不同。该多帧待播放的视频画面或者用户界面的画面可以为连续的多帧画面,也可以为不连续的多帧画面,本公开对于第一图像或第二图像包括的多帧待播放的视频画面或者用户界面的画面是否为连续画面并不限定,下述实施例以第一图像或第二图像包括的多帧待播放的视频画面或者用户界面的画面为连续画面为例进行说明。
比如,如图10以及图14至图16所示,在第二预设时间T2内,每隔第三预设时间T3,在图像帧子序列中,选取四帧连续的待播放的视频画面或者用户界面的画面(比如图14所示的按照时序连续排列的图像P11、图像P12、图像P13和图像P14)作为第一图像或第二图像,图像P11的前一帧图像为图像Pn1,图像P14的后一帧图像为图像Pn2。如图15所示,在图像P11的左上角,添加一个校正标识a,得到第一投影图像P21。在图像P12的左下角,添加一个校正标识a,得到第一投影图像P22。在图像P13的右上角,添加一个校正标识a,得到第一投影图像P23。在图像P14的右下角,添加一个校正标识a,得到第一投影图像P24。显然,校正标识a在第一投影图像P21、第一投影图像P22、第一投影图像P23和第一投影图像P24中的位置不同。
如图14和图15所示,采用第一投影图像替换第一图像,即采用第一投影图像P21替换图像P11,采用第一投影图像P22替换图像P12,采用第一投影图像P23替换图像P13,采用第一投影图像P24替换图像P14,得到如图15所示的处理后的图像帧子序列。
或者,如图14和图16所示,在第二图像之后插入第一投影图像,即,基于如图14所示的图像帧子序列,在图像P11与图像P12之间插入第一投影图像P21,在图像P12与图像P13之间插入第一投影图像P22,在图像P13与图像P14之间插入第一投影图像P23,在图像P14与图像Pn2之间插入第一投影图像P24,得到如图16所示的处理后的图像帧子序列。同样,也可以在第二图像之前插入第一投影图像,即在图像Pn1与图像P11之间插入第一投影图像P21,在图像P11与图像P12之间插入第一投影图像P22,在图像P12与图像P13之间插入第一投影图像P23,在图像P13与图像P14之间插入第一投影图像P24,得到处理后的图像帧子序列。
示例性地,在第一图像或第二图像中添加校正标识时,可以将校正标识的颜色设置为与第一图像或第二图像的背景颜色显著差异的颜色,例如背景颜色为黑色,校正标识颜色为白色,以便后续对拍摄图像进行处理时,更好地提取校正标识在拍摄图像中的位置。比如,可以首先获取第一图像或第二图像的背景颜色;然后,将背景颜色进行反色处理得到标识明色,比如使用255减去背景颜色的色彩值,即可得到标识明色的色彩值;最后,将校正标识的颜色设置为标识明色。
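上述反色处理可以按“255 减去背景色彩值”逐通道计算。一个最小示意(函数名仅为说明而设):

```python
def invert_color(rgb):
    """对背景颜色逐通道反色(255 减去色彩值),得到对比明显的标识颜色。"""
    return tuple(255 - c for c in rgb)
```

例如背景颜色为黑色 (0, 0, 0) 时,得到的标识明色即为白色 (255, 255, 255)。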
在一些实施例中,主控电路401还被配置为周期性地向显示控制电路402发送控制指令和第一投影图像,控制指令用于控制显示控制电路402将第一投影图像投射至投影屏幕30上。示例性地,如图10所示,主控电路401以第一预设时间T1为周期生成图像处理开始指令Cmd1和图像处理结束指令Cmd2,并以第三预设时间T3为周期选取第一图像与第二图像,即以第三预设时间T3为周期生成第一投影图像。主控电路401可以以第三预设时间T3为周期向显示控制电路402发送控制指令和第一投影图像。显示控制电路402被配置为接收来自主控电路401的第一投影图像,并响应于控制指令,向光机200传输第一投影图像的图像信号,以使得光机200利用该第一投影图像的图像信号对光源组件100提供的照明光束进行调制,获得投影光束,并通过镜头300将第一投影图像投射至投影屏幕30上。
在一些实施例中,主控电路401还被配置为周期性地向拍摄装置20发送拍摄指令,拍摄装置20被配置为响应于拍摄指令,拍摄镜头300在投影屏幕上投射的第一投影图像,得到拍摄图像。示例性地,如图10所示,本公开一些实施例的一种主控电路生成图像处理开始指令和图像处理结束指令的时序图中,以第一预设时间T1为周期生成拍摄指令、图像处理开始指令Cmd1和图像处理结束指令Cmd2,并以第三预设时间T3为周期选取第一图像与第二图像,即以第三预设时间T3为周期生成第一投影图像。拍摄装置20响应于该拍摄指令,在第二预设时间T2内对投影屏幕30上的投影图像进行拍摄,可拍摄镜头300在投影屏幕上投射的第一投影图像,得到拍摄图像。或者,主控电路401以第一预设时间T1为周期生成图像处理开始指令Cmd1和图像处理结束指令Cmd2,并以第三预设时间T3为周期选取第一图像与第二图像,即以第三预设时间T3为周期生成第一投影图像。主控电路401以第三预设时间T3为周期生成拍摄指令,在第一投影图像投射至投影屏幕30后,拍摄装置20响应于该拍摄指令,拍摄镜头300在投影屏幕上投射的第一投影图像,得到拍摄图像。当主控电路401周期性地向拍摄装置20发送拍摄指令的周期足够小时,可以认为拍摄装置实时地对投影屏幕30进行拍摄,本公开对主控电路401周期性地向拍摄装置20发送拍摄指令的具体周期并不限定。且拍摄装置20的参数可以根据应用场景进行设定,比如,拍摄装置20可以拍摄连续很多帧的图像。
在一些实施例中,主控电路401还被配置为:基于拍摄图像,确定第一投影图像的投影区域与投影屏幕30的范围是否匹配。投影屏幕30的范围可以是设定了显示区域的范围,也可以是投影屏幕30的边框范围,下述实施例以投影屏幕30的范围为投影屏幕30的边框范围为例进行说明。图17为根据本公开一些实施例的一种第一投影图像的投影区域与投影屏幕匹配的示意图,图18为根据本公开一些实施例的一种第一投影图像的投影区域与投影屏幕不匹配的示意图,图19为根据本公开一些实施例的另一种第一投影图像的投影区域与投影屏幕不匹配的示意图。如图17至图19所示,投影屏幕30的范围为范围Z1,第一投影图像的投影区域为区域Z2。第一投影图像的投影区域与投影屏幕30的范围匹配,指的是如图17所示,第一投影图像的投影区域Z2与投影屏幕30的范围Z1几乎重合或完全重合,几乎重合可以为人眼可识别出的重合。
当第一投影图像的投影区域与投影屏幕30的范围匹配时,投影屏幕30的边框以内区域显示的是第一投影图像的投影图像,因此投影屏幕30的边框以内区域的亮度值较高。投影屏幕30的边框以外区域的亮度值应当与激光投影设备的使用环境的亮度值相当。因此通过对比投影屏幕30的边框以内区域的亮度值和投影屏幕30的边框以外区域的亮度值,可以判断第一投影图像的投影区域Z2与投影屏幕30的范围Z1是否重合。
在一些实施例中,主控电路401还被配置为识别拍摄图像中投影屏幕30的边框。示例性地,可以将包括投影屏幕30的参考图卡预先存储于主控电路401中,主控电路401获取拍摄图像后,对拍摄图像和参考图卡进行图像处理,并与预先存储的包括投影屏幕30边框的参考图卡进行对比,可以识别出拍摄的图像中投影屏幕的边框。比如,可以将参考图卡覆盖到拍摄图像中投影屏幕30的位置,并得到拍摄图像中投影屏幕30之外的信息。示例性地,若在安装激光投影设备时,固定了激光投影设备的位置,并且固定了投影屏幕30的安装位置,还可以将拍摄图像中投影屏幕30的边框坐标预存于主控电路401中,主控电路401获取拍摄图像后,通过该预存的投影屏幕30的边框坐标识别边框。本公开对主控电路401识别拍摄图像中投影屏幕30的边框的具体方法并不限定。
在一些实施例中,主控电路401识别出拍摄图像中投影屏幕30的边框后,获取拍摄图像中位于边框以内区域的第一亮度值,和位于边框以外区域的第二亮度值。若第一亮度值小于或等于第一阈值,或者第二亮度值大于或等于第二阈值,确定第一投影图像的投影区域与投影屏幕的范围不匹配。本公开对第一阈值或第二阈值的设定值并不限定,可以根据实际应用进行设定。
示例性地,如图18所示,假设第一阈值为第一投影图像被投射至投影屏幕30上的投影图像的最低亮度值,若获取的拍摄图像中位于边框以内区域的第一亮度值小于或等于第一阈值,则可以认为投影屏幕30的边框以内的部分区域未显示第一投影图像,也就是说,第一投影图像的投影区域Z2与投影屏幕30的范围Z1不匹配。
示例性地,图19为根据本公开一些实施例的另一种第一投影图像的投影区域与投影屏幕不匹配的示意图,如图19所示,假设第二阈值为激光投影设备的使用环境的亮度值,该使用环境的亮度值可以通过对投影屏幕进行拍摄获取,也可以预存于主控电路401中。若获取的拍摄图像中位于边框以外区域的第二亮度值大于或等于第二阈值,可以认为投影屏幕30的边框以外的部分区域被镜头300投射的投影光束照射到,导致第二亮度值大于或等于第二阈值,也就是说,第一投影图像的投影区域Z2与投影屏幕30的范围Z1不匹配。
在一些实施例中,主控电路401识别出拍摄图像中投影屏幕30的边框后,获取拍摄图像中位于边框以外的区域的第三亮度值,并获取第三亮度值与上一个周期获取的拍摄图像中位于边框以外的区域的第四亮度值的变化量。若变化量大于第三阈值,确定第一投影图像的投影区域与投影屏幕的范围不匹配。本公开对第三阈值的设定值并不限定,可以根据实际应用进行设定。示例性地,假设第三阈值为0,由于拍摄指令由主控电路401周期性发出,当两个相邻周期获取的拍摄图像中位于边框以外的区域的亮度信息不一致,可以认为部分区域被镜头300投射的投影光束照射到,导致变化值大于0,也就是说,第一投影图像的投影区域Z2与投影屏幕30的范围Z1不匹配。示例性地,也可以将第三阈值设置为大于0的数值。
在一些实施例中,主控电路401还被配置为:若确定第一投影图像的投影区域Z2与投影屏幕的范围Z1不匹配,基于校正标识在拍摄图像中的位置和校正标识在目标图像中的位置,确定是否存在校正标识的偏移量。目标图像为第一投影图像的投影区域Z2与投影屏幕的范围Z1匹配时的拍摄图像,目标图像可以是在第一投影图像的投影区域Z2与投影屏幕的范围Z1匹配时拍摄并存储于主控电路401中的。
示例性地,第一投影图像中的校正标识形状与尺寸均相同,校正标识在拍摄图像中的位置和在目标图像中的位置可以基于该校正标识的同一标记点的位置确定。比如,如图12所示,可以认为标记点a1在拍摄图像中的位置为该校正标识在拍摄图像中的位置,标记点a1在目标图像中的位置为该校正标识在目标图像中的位置。
主控电路401通过对比标记点a1在拍摄图像中的位置和标记点a1在目标图像中的位置,可确定标记点a1的偏移量。示例性地,投影屏幕30的边框为长方形,可以以该长方形的左下顶点为原点,以该长方形的左边框为x轴,以该长方形的下边框为y轴建立二维坐标系。获取标记点a1在拍摄图像中的位置在该二维坐标系中的坐标,并获取标记点a1在目标图像中的位置在该二维坐标系中的坐标,即可获取标记点a1的偏移量,也就是获取校正标识的偏移量。
在一些实施例中,主控电路401还被配置为:若存在偏移量,根据偏移量确定校正数据。
若多个校正标识均不存在偏移量,则可以认为第一投影图像被投射出去后并未发生偏移,不需要对后续待播放的视频画面或用户界面画面进行校正。若多个校正标识中至少有一个校正标识存在偏移量,则该校正标识发生了偏移,可以认为第一投影图像被投射出去后,投影图像也发生了偏移,需要对后续待播放的视频画面或用户界面画面进行校正。
示例性地,主控电路401被配置为根据校正标识在拍摄图像中的位置坐标和校正标识在目标图像中的位置坐标的变换关系,获取几何校正数据。确定了校正标识在拍摄图像中的位置坐标和校正标识在目标图像中的位置坐标,即可了解到激光投影设备光轴因倾斜而造成的几何畸变的程度,进而确定出反畸变数据,即几何校正数据。
比如,投影屏幕30的边框为长方形,可以以该长方形的左下顶点为原点,以该长方形的左边框为x轴,以该长方形的下边框为y轴建立二维坐标系。如图12所示,第一投影图像P1中四个校正标识a的位置坐标,可以通过相应的标记点a1的坐标确定。假设该四个校正标识a在拍摄图像中的位置坐标分别为(x1,y1)、(x2,y2)、(x3,y3)、(x4,y4),且该校正标识在目标图像中的位置坐标分别为(x1′,y1′)、(x2′,y2′)、(x3′,y3′)、(x4′,y4′)。将该校正标识在拍摄图像中的位置坐标通过几何校正变换到该校正标识在目标图像中的位置坐标,即可实现反畸变。示例性地,根据多个校正标识在拍摄图像中的位置坐标和该多个校正标识在目标图像中的位置坐标,可以通过公式(1)确定几何校正数据。
$$x_i' = \frac{a x_i + b y_i + c}{u x_i + v y_i + 1},\qquad y_i' = \frac{d x_i + e y_i + f}{u x_i + v y_i + 1} \tag{1}$$
其中,i为[1,4]之间的整数,a、b、c、d、e、f、u、v为几何校正数据。
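公式(1)与公式(2)所描述的几何校正本质上是一个含 8 个参数(a、b、c、d、e、f、u、v)的投影变换:4 组对应点各提供 2 个方程,解一个 8×8 线性方程组即可求得全部参数。下面给出一个纯 Python 的简化示意实现(假设该变换为标准的透视变换形式,函数名仅为说明而设,并非专利原文内容):

```python
def solve_linear(A, b):
    # 带部分选主元的高斯消元,求解 A x = b(A 为 n×n 嵌套列表)
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(M[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (M[r][n] - s) / M[r][r]
    return x

def fit_correction(src_pts, dst_pts):
    # 由 4 组对应点求几何校正数据 a,b,c,d,e,f,u,v(每组对应点提供两个方程)
    A, b = [], []
    for (x, y), (xp, yp) in zip(src_pts, dst_pts):
        A.append([x, y, 1, 0, 0, 0, -x * xp, -y * xp]); b.append(xp)
        A.append([0, 0, 0, x, y, 1, -x * yp, -y * yp]); b.append(yp)
    return solve_linear(A, b)

def apply_correction(params, x, y):
    # 公式(2):将像素坐标 (x, y) 变换为校正后的坐标 (x', y')
    a, b, c, d, e, f, u, v = params
    w = u * x + v * y + 1
    return (a * x + b * y + c) / w, (d * x + e * y + f) / w
```

例如,将四个校正标识的拍摄位置与目标位置代入 `fit_correction`,再用 `apply_correction` 逐像素变换待投影图像,即为反畸变的基本过程。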
在一些实施例中,显示控制电路402还被配置为接收校正数据,基于校正数据对待投影图像进行校正处理,并向光机200传输校正处理后的待投影图像的图像信号,以使光机200利用该校正处理后的待投影图像的图像信号对照明光束进行调制,将校正处理后的待投影图像投射至投影屏幕30上。基于校正数据对待投影图像进行校正处理,可以为基于校正数据对第一投影图像后续的待播放的视频画面或用户界面画面进行校正,以保证第一投影图像后续的待播放的视频画面或用户界面画面不偏移,以保证用户的观看效果。
示例性地,显示控制电路402可以基于校正数据对第一投影图像后续的待播放的视频画面或用户界面画面进行几何校正。主控电路401确定几何校正数据a、b、c、d、e、f、u、v后,显示控制电路402可以将第一投影图像后续待播放的视频画面或用户界面画面中的任一像素的位置坐标(x,y)代入到公式(2)中,得到该像素在校正后的图像中的位置坐标值(x′,y′),进而实现对第一投影图像后续的待播放的视频画面或用户界面画面进行几何校正。
$$x' = \frac{a x + b y + c}{u x + v y + 1},\qquad y' = \frac{d x + e y + f}{u x + v y + 1} \tag{2}$$
示例性地,由于几何校正后,可能会使第一投影图像后续待播放的视频画面或用户界面画面损失某些像素的特征,因此,可以在对第一投影图像后续待播放的视频画面或用户界面画面作几何校正后所得到的图像中进行插值运算,得到最终的校正后的第一投影图像后续待播放的视频画面或用户界面画面。通过插值运算,可补充因图像缩放而缺失的像素点,消除可能的马赛克现象,使显示画面的内容平滑过渡,不影响观看效果。比如,校正前的投影画面如图19所示,校正后的投影画面如图17所示。
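几何校正后,待取样的像素坐标通常不再落在整数网格上,上述插值运算常用双线性插值实现。下面给出一个最小示意(以二维灰度值列表代表图像,函数名仅为说明而设):

```python
def bilinear(img, x, y):
    """在二维灰度图 img 上对非整数坐标 (x, y) 作双线性插值取样。"""
    x0, y0 = int(x), int(y)
    x1 = min(x0 + 1, len(img[0]) - 1)
    y1 = min(y0 + 1, len(img) - 1)
    dx, dy = x - x0, y - y0
    top = img[y0][x0] * (1 - dx) + img[y0][x1] * dx  # 上边两点水平插值
    bot = img[y1][x0] * (1 - dx) + img[y1][x1] * dx  # 下边两点水平插值
    return top * (1 - dy) + bot * dy                 # 再作垂直插值
```

插值可补充因图像缩放而缺失的像素点,使校正后的显示画面平滑过渡。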
显示控制电路402生成校正处理后的图像信号,以使光机200利用该校正处理后的待投影图像的图像信号对照明光束进行调制,将校正处理后的待投影图像投射至投影屏幕30上。
综合上述实施例,对本公开一些实施例的激光投影设备工作时的其中一种工作过程进行示例性说明。首先,主控电路401获取待投影的图像帧序列,然后基于该待投影的图像帧序列周期性地生成第一投影图像。第一投影图像的生成方式可以为:主控电路401周期性地生成图像处理开始指令Cmd1和图像处理结束指令Cmd2,并在图像帧子序列中周期性地选取第一图像或第二图像,在第一图像或第二图像中添加校正标识,生成第一投影图像。同时,主控电路401周期性地生成拍摄指令和控制指令。响应于控制指令,显示控制电路402周期性地将第一投影图像投射至投影屏幕30。响应于拍摄指令,拍摄装置20周期性地获取拍摄图像。然后,主控电路401基于拍摄图像确定第一投影图像的投影区域Z2与投影屏幕30的范围Z1是否匹配。若第一投影图像的投影区域Z2与投影屏幕30的范围Z1不匹配,且多个校正标识中至少有一个校正标识存在偏移量,则基于多个校正标识的偏移量确定校正数据。最后,显示控制电路402基于校正数据对第一投影图像后续的待播放的视频画面或用户界面画面进行校正,并由光机200和镜头300将校正后的待播放的视频画面或用户界面画面投射至投影屏幕上。
用户在观看投影屏幕30上的投影图像的过程中,通过激光投影设备的上述工作过程,激光投影设备可以自动判断出投影图像是否需要校正,并在投影图像需要校正时,无需用户手动校正,不中断视频画面或用户界面画面的正常播放,并且在用户不易感受到校正过程的情况下,自动对待投影图像进行校正,因此可以使得用户体验更好。
本公开的一些实施例提供一种投影图像的校正方法,应用于激光投影设备,图20为根据本公开一些实施例的一种投影图像的校正方法的流程图,如图20所示,校正方法包括以下步骤:
步骤2001,获取待投影的图像帧序列,待投影的图像帧序列包括多个待投影的图像。
步骤2002,响应于图像处理开始指令和图像处理结束指令,在图像帧子序列中添加校正标识,得到处理后的图像帧子序列。
图像帧子序列为待投影的图像帧序列中图像处理开始指令和图像处理结束指令对应的时间范围内的图像帧序列,处理后的图像帧子序列包括第一投影图像。
步骤2003,获取拍摄图像。
拍摄图像为第一投影图像投射至投影屏幕上时拍摄的图像。
步骤2004,基于校正标识在拍摄图像中的位置,获取校正数据。
步骤2005,基于校正数据对待投影图像进行校正。
在一些实施例中,上述步骤2002中,在图像帧子序列中添加校正标识,得到处理后的图像帧子序列,包括:将图像帧子序列中的第一图像替换为第一投影图像,得到处理后的图像帧子序列。或者,在图像帧子序列中的第二图像之前或之后插入第一投影图像,得到处理后的图像帧子序列。
在一些实施例中,上述步骤2004中,基于校正标识在拍摄图像中的位置,得到校正数据包括:基于校正标识在拍摄图像中的位置和校正标识在目标图像中的位置,确定是否存在校正标识的偏移量;若存在校正标识的偏移量,根据校正标识的偏移量确定校正数据。
在一些实施例中,校正方法还包括:基于拍摄图像,确定第一投影图像的投影区域与投影屏幕的范围是否匹配。若确定第一投影图像的投影区域与投影屏幕的范围不匹配,基于校正标识在拍摄图像中的位置,得到校正数据。
在一些实施例中,图21为根据本公开一些实施例的另一种投影图像的校正方法的流程图,如图21所示,基于拍摄图像,确定第一投影图像的投影区域与投影屏幕的范围是否匹配包括以下步骤:
步骤2101,识别拍摄图像中投影屏幕的边框。
步骤2102,获取拍摄图像中位于边框以内区域的第一亮度值,和位于边框以外区域的第二亮度值。
步骤2103,若第一亮度值小于或等于第一阈值,或,第二亮度值大于或等于第二阈值,确定第一投影图像的投影区域与投影屏幕的范围不匹配。
在一些实施例中,图22为根据本公开一些实施例的又一种投影图像的校正方法的流程图,如图22所示,基于拍摄图像,确定第一投影图像的投影区域与投影屏幕的范围是否匹配也可以包括以下步骤:
步骤2201,识别拍摄图像中投影屏幕的边框。
步骤2202,获取拍摄图像中位于边框以外的区域的第三亮度值,以及该第三亮度值与上一个周期获取的拍摄图像中位于边框以外的区域的第四亮度值的变化量。
步骤2203,若变化量大于第三阈值,确定第一投影图像的投影区域与投影屏幕的范围不匹配。
通过上述方法,在用户观看投影屏幕30上的投影图像的过程中,激光投影设备可以自动判断出投影图像是否需要校正,并在投影图像需要校正时,无需用户手动校正,不中断视频画面或用户界面画面的正常播放,并且在用户不易感受到校正过程的情况下,自动对待投影图像进行校正,因此可以使得用户体验更好。
以上,仅为本公开的具体实施方式,但本公开的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本公开揭露的技术范围内,想到变化或替换,都应涵盖在本公开的保护范围之内。因此,本公开的保护范围应以权利要求的保护范围为准。

Claims (18)

  1. 一种激光投影设备,包括:
    光源组件,被配置为提供照明光束;
    光机,被配置为利用图像信号对所述照明光束进行调制,以获得投影光束;
    镜头,被配置为将所述投影光束投射成像;
    拍摄装置,被配置为响应于拍摄指令,拍摄所述镜头在投影屏幕上投射的第一投影图像,得到拍摄图像;所述第一投影图像包括校正标识;和
    电路系统架构,被配置为控制所述光源组件和所述光机运行;其中,所述电路系统架构包括:
    主控电路,耦接至所述拍摄装置,被配置为获取所述拍摄图像,并基于所述校正标识在所述拍摄图像中的位置,得到校正数据;向显示控制电路发送所述校正数据;
    显示控制电路,耦接至所述主控电路,且被配置为接收所述校正数据,基于所述校正数据对待投影图像进行校正处理,并向所述光机传输校正处理后的所述待投影图像的图像信号,以使所述光机利用校正处理后的所述待投影图像的图像信号对所述照明光束进行调制,以获得投影光束。
  2. 根据权利要求1所述的激光投影设备,其中,所述主控电路还被配置为:
    获取待投影的图像帧序列,所述待投影的图像帧序列包括多个待投影的图像;
    响应于图像处理开始指令和图像处理结束指令,在图像帧子序列中添加所述校正标识,得到处理后的图像帧子序列;所述图像帧子序列为所述待投影的图像帧序列中所述图像处理开始指令和所述图像处理结束指令对应的时间范围内的图像帧序列,所述处理后的图像帧子序列包括所述第一投影图像。
  3. 根据权利要求2所述的激光投影设备,其中,所述主控电路被配置为:
    在第一图像中添加所述校正标识,得到所述第一投影图像;
    采用所述第一投影图像替换所述第一图像。
  4. 根据权利要求2所述的激光投影设备,其中,所述主控电路被配置为:
    在第二图像中添加所述校正标识,得到所述第一投影图像;
    在所述第二图像之前或之后插入所述第一投影图像。
  5. 根据权利要求3或4所述的激光投影设备,
    所述主控电路还被配置为:
    周期性地在所述图像帧子序列中选取第一图像或第二图像。
  6. 根据权利要求1-5中任一项所述的激光投影设备,其中,
    所述主控电路还被配置为:
    周期性地向所述拍摄装置发送所述拍摄指令,周期性地向所述显示控制电路发送控制指令和所述第一投影图像;
    所述显示控制电路,还被配置为响应于所述控制指令,将所述第一投影图像投射至所述投影屏幕上。
  7. 根据权利要求1-6中任一项所述的激光投影设备,所述校正标识为一个或多个,多个所述校正标识在所述第一投影图像中的位置不同。
  8. 根据权利要求1-7中任一项所述的激光投影设备,其中,所述主控电路还被配置为:基于所述拍摄图像,确定所述第一投影图像的投影区域与所述投影屏幕的范围是否匹配。
  9. 根据权利要求8所述的激光投影设备,其中,
    所述主控电路还被配置为:
    识别所述拍摄图像中所述投影屏幕的边框;
    获取所述拍摄图像中位于所述边框以内区域的第一亮度值,和位于所述边框以外区域的第二亮度值;
    若所述第一亮度值小于或等于第一阈值,或,所述第二亮度值大于或等于第二阈值,确定所述第一投影图像的投影区域与所述投影屏幕的范围不匹配。
  10. 根据权利要求8或9所述的激光投影设备,其中,
    所述主控电路还被配置为:
    识别所述拍摄图像中所述投影屏幕的边框;
    获取所述拍摄图像中位于所述边框以外的区域的第三亮度值;获取所述第三亮度值与上一个周期获取的所述拍摄图像中位于所述边框以外的区域的第四亮度值的变化量;
    若所述变化量大于第三阈值,确定所述第一投影图像的投影区域与所述投影屏幕的范围不匹配。
  11. 根据权利要求8-10中任一项所述的激光投影设备,其中,所述主控电路还被配置为:若确定所述第一投影图像的投影区域与所述投影屏幕的范围不匹配,基于所述校正标识在所述拍摄图像中的位置,和所述校正标识在目标图像中的位置,得到校正数据;所述目标图像为所述第一投影图像的投影区域与所述投影屏幕的范围匹配时的所述拍摄图像。
  12. 根据权利要求11所述的激光投影设备,其中,
    所述主控电路被配置为:
    基于所述校正标识在所述拍摄图像中的位置和所述校正标识在目标图像中的位置,确定是否存在所述校正标识的偏移量;
    若存在所述校正标识的所述偏移量,根据所述校正标识的所述偏移量确定所述校正数据。
  13. 一种投影图像的校正方法,应用于激光投影设备,所述校正方法包括:
    获取待投影的图像帧序列,所述待投影的图像帧序列包括多个待投影的图像;
    响应于图像处理开始指令和图像处理结束指令,在图像帧子序列中添加校正标识,得到处理后的图像帧子序列;所述图像帧子序列为所述待投影的图像帧序列中所述图像处理开始指令和所述图像处理结束指令对应的时间范围内的图像帧序列,所述处理后的图像帧子序列包括第一投影图像;
    获取拍摄图像,所述拍摄图像为第一投影图像投射至投影屏幕上时拍摄的图像;
    基于所述校正标识在所述拍摄图像中的位置,获取校正数据;
    基于所述校正数据对待投影图像进行校正。
  14. 根据权利要求13所述的校正方法,所述在图像帧子序列中添加所述校正标识,得到处理后的图像帧子序列,包括:
    将所述图像帧子序列中的第一图像替换为所述第一投影图像,得到所述处理后的图像帧子序列;或者,
    在所述图像帧子序列中的第二图像之前或之后插入所述第一投影图像,得到所述处理后的图像帧子序列。
  15. 根据权利要求13或14所述的校正方法,还包括:
    基于所述拍摄图像,确定所述第一投影图像的投影区域与所述投影屏幕的范围是否匹配;
    若确定所述第一投影图像的投影区域与所述投影屏幕的范围不匹配,基于所述校正标识在所述拍摄图像中的位置,得到校正数据。
  16. 根据权利要求15所述的校正方法,其中,所述基于所述拍摄图像,确定所述第一投影图像的投影区域与所述投影屏幕的范围是否匹配包括:
    识别所述拍摄图像中所述投影屏幕的边框;
    获取所述拍摄图像中位于所述边框以内区域的第一亮度值,和位于所述边框以外区域的第二亮度值;
    若所述第一亮度值小于或等于第一阈值,或,所述第二亮度值大于或等于第二阈值,确定所述第一投影图像的投影区域与所述投影屏幕的范围不匹配。
  17. 根据权利要求15或16所述的校正方法,其中,所述基于所述拍摄图像,确定所述第一投影图像的投影区域与所述投影屏幕的范围是否匹配包括:
    识别所述拍摄图像中所述投影屏幕的边框;
    获取所述拍摄图像中位于所述边框以外的区域的第三亮度值;获取所述第三亮度值与上一个周期获取的所述拍摄图像中位于所述边框以外的区域的第四亮度值的变化量;
    若所述变化量大于第三阈值,确定所述第一投影图像的投影区域与所述投影屏幕的范围不匹配。
  18. 根据权利要求15-17中任一项所述的校正方法,其中,所述基于所述校正标识在所述拍摄图像中的位置,得到校正数据包括:
    基于所述校正标识在所述拍摄图像中的位置和所述校正标识在目标图像中的位置,确定是否存在所述校正标识的偏移量;
    若存在所述校正标识的所述偏移量,根据所述校正标识的所述偏移量确定所述校正数据。
PCT/CN2022/100357 2021-12-20 2022-06-22 激光投影设备及投影图像的校正方法 WO2023115857A1 (zh)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
CN202111562487.9A CN114222099A (zh) 2021-12-20 2021-12-20 投影图像的校正方法及激光投影设备
CN202111559005.4 2021-12-20
CN202111559005.4A CN114339173A (zh) 2021-12-20 2021-12-20 投影图像校正方法、激光投影系统及可读性存储介质
CN202111562487.9 2021-12-20
CN202111566402.4A CN114245089B (zh) 2021-12-20 2021-12-20 几何校正方法、装置、激光投影设备
CN202111566402.4 2021-12-20

Publications (1)

Publication Number Publication Date
WO2023115857A1 true WO2023115857A1 (zh) 2023-06-29

Family

ID=86901180

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/100357 WO2023115857A1 (zh) 2021-12-20 2022-06-22 激光投影设备及投影图像的校正方法

Country Status (1)

Country Link
WO (1) WO2023115857A1 (zh)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070230815A1 (en) * 2006-04-04 2007-10-04 Samsung Electronics Co., Ltd. Method and apparatus for unobtrusively correcting projected image
CN112399156A (zh) * 2019-08-13 2021-02-23 青岛海尔多媒体有限公司 用于校正投影成像的方法及投影成像设备
CN113055663A (zh) * 2021-03-31 2021-06-29 青岛海信激光显示股份有限公司 投影图像的校正方法及激光投影设备
CN113271447A (zh) * 2021-05-25 2021-08-17 青岛海信激光显示股份有限公司 激光投影设备及图像校正系统
CN114222099A (zh) * 2021-12-20 2022-03-22 青岛海信激光显示股份有限公司 投影图像的校正方法及激光投影设备
CN114245089A (zh) * 2021-12-20 2022-03-25 青岛海信激光显示股份有限公司 几何校正方法、装置、激光投影设备
CN114339173A (zh) * 2021-12-20 2022-04-12 青岛海信激光显示股份有限公司 投影图像校正方法、激光投影系统及可读性存储介质

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070230815A1 (en) * 2006-04-04 2007-10-04 Samsung Electronics Co., Ltd. Method and apparatus for unobtrusively correcting projected image
CN112399156A (zh) * 2019-08-13 2021-02-23 青岛海尔多媒体有限公司 用于校正投影成像的方法及投影成像设备
CN113055663A (zh) * 2021-03-31 2021-06-29 青岛海信激光显示股份有限公司 投影图像的校正方法及激光投影设备
CN113271447A (zh) * 2021-05-25 2021-08-17 青岛海信激光显示股份有限公司 激光投影设备及图像校正系统
CN114222099A (zh) * 2021-12-20 2022-03-22 青岛海信激光显示股份有限公司 投影图像的校正方法及激光投影设备
CN114245089A (zh) * 2021-12-20 2022-03-25 青岛海信激光显示股份有限公司 几何校正方法、装置、激光投影设备
CN114339173A (zh) * 2021-12-20 2022-04-12 青岛海信激光显示股份有限公司 投影图像校正方法、激光投影系统及可读性存储介质

Similar Documents

Publication Publication Date Title
WO2023088329A1 (zh) 投影设备及投影图像校正方法
EP1531357A1 (en) Projection type display device
JP2006018293A (ja) ピンホール投影により表示面上のレーザポイントと関連付けられるプロジェクタ画素を求める方法
KR101767853B1 (ko) 정보 처리 장치, 화상 투영 시스템 및 컴퓨터 프로그램
JP2005124131A (ja) 画像処理システム、プロジェクタ、プログラム、情報記憶媒体および画像処理方法
WO2012046575A1 (ja) 投写型映像表示装置
JP2012170007A (ja) 投写型映像表示装置及び画像調整方法
WO2022253336A1 (zh) 激光投影设备及投影图像的校正方法
CN113259644B (zh) 激光投影系统及图像校正方法
JP2012118289A (ja) 投写型映像表示装置
JP4428371B2 (ja) 投写型画像表示装置及び平面被投写体
JP3879560B2 (ja) 投写型画像表示装置
JP2012078490A (ja) 投写型映像表示装置及び画像調整方法
JP2012018214A (ja) 投写型映像表示装置
JP2012181264A (ja) 投影装置、投影方法及びプログラム
CN113259637B (zh) 投影图像的校正方法及激光投影系统
WO2023115857A1 (zh) 激光投影设备及投影图像的校正方法
JP4572066B2 (ja) プロジェクタ
CN114979600B (zh) 激光投影设备及投影图像的校正方法
JP2012053227A (ja) 投写型映像表示装置
JP2012220709A (ja) 投写型映像表示装置およびその制御方法
JP2011199717A (ja) 投写型表示装置および画像表示方法
JP2011138019A (ja) 投写型映像表示装置及び画像調整方法
WO2023071256A1 (zh) 激光投影设备及投影图像的校正方法
WO2023071698A1 (zh) 激光投影设备及投影图像的色散校正方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22909189

Country of ref document: EP

Kind code of ref document: A1