WO2023000937A1 - Projection device and method for correcting a projected image - Google Patents

Projection device and method for correcting a projected image

Info

Publication number: WO2023000937A1
Authority: WO (WIPO, PCT)
Prior art keywords: black, controller, image, frame, video
Application number: PCT/CN2022/102067
Other languages: English (en), French (fr)
Inventors: 肖纪臣, 梁倩, 郑晴晴, 吴超, 唐甜甜
Original Assignee: 青岛海信激光显示股份有限公司
Priority claimed from CN202110825818.7A (CN115695740A)
Priority claimed from CN202110825816.8A (CN115691365A)
Priority claimed from CN202111567322.0A (CN114339174B)
Application filed by 青岛海信激光显示股份有限公司
Publication of WO2023000937A1

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436 Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363 Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]

Definitions

  • the present disclosure relates to the field of projection technology, and in particular, to a projection device and a correction method for a projected image.
  • projection technology is an optical display technology in which light from a point light source is projected onto a surface for display.
  • objective factors such as the position of the projection host, the position of the screen, the flatness of the screen and optical distortion make it difficult for the actual projected picture to directly and completely match the screen.
  • some embodiments of the present disclosure provide a projection device, including: a light source assembly, an optical engine, a lens, and a circuit system architecture.
  • a light source assembly configured to provide an illumination beam.
  • the light engine is configured to modulate the illumination beam with the image signal to obtain the projection beam.
  • the lens is configured to project the projection beam into an image.
  • the circuit system architecture is configured to control the operation of the light source assembly and the optical machine; the circuit system architecture includes a first controller and a second controller, the first controller being coupled to the second controller. The first controller is configured to: receive a video switching instruction; and, in response to the video switching instruction, send a black field video to the second controller, the black field video including a plurality of black frames and a plurality of positioning frames, the positioning frames being used to assist the projection device in image correction.
  • the second controller is configured to receive the black field video, and to control the light source assembly and the optical machine to play the black field video.
  • some embodiments of the present disclosure provide a projection device, including: a light source assembly, an optical engine, a lens, a circuit system architecture, and a detection device.
  • the light source assembly is configured to provide an illumination beam.
  • the light machine is configured to modulate the illumination beam with the image signal to obtain the projection beam.
  • the lens is configured to project the projection beam into an image.
  • the detection device is configured to detect whether there is a target object within the target range, and generate a detection signal based on the detection result.
  • a circuit system architecture configured to control light source components and optomechanical operations. Wherein, the circuit system architecture includes: a power supply circuit, coupled with the light valve driving circuit and the light source driving circuit, and configured to supply power to the light valve driving circuit and the light source driving circuit.
  • the first controller, coupled with the second controller, is configured to send a standby command to the second controller in response to a standby operation, and to control the power supply circuit to keep supplying power to the second controller.
  • the second controller is configured to, in response to the standby instruction, control the power supply circuit to keep supplying power to the light valve drive circuit and to stop supplying power to the light source drive circuit; and, if it is determined based on the detection signal that there is a target object within the target range, to control the power supply circuit to supply power to the light source driving circuit and to send the image signal of the standby image to the light valve driving circuit.
  • the light source driving circuit is configured to drive the light source to provide the illumination light beam in response to the image signal of the standby image.
  • the light valve drive circuit is configured to respond to the image signal of the standby image, and drive the light engine to modulate the illumination beam by using the image signal of the standby image to obtain the projection beam.
  • some embodiments of the present disclosure provide a projection system, including a projection screen and the above-mentioned projection device.
  • some embodiments of the present disclosure provide a method for correcting a projected image, including: receiving a video switching instruction.
  • playing a black field video, where the black field video includes a plurality of black frames and a plurality of positioning frames, and the positioning frames are used to assist the projection device in performing image correction.
  • acquiring a captured image, where the captured image is an image captured while a positioning frame is projected onto the projection screen; determining correction parameters based on the captured image; and correcting the image to be projected according to the correction parameters.
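  • one plausible way to realize the last two steps, determining correction parameters from the captured feature points and pre-warping the image to be projected, is a perspective (homography) transform. The disclosure does not prescribe this particular math or the use of OpenCV, so the sketch below is only an illustration with made-up coordinates.

```python
import cv2
import numpy as np

# Feature-point positions K1..K4 in the positioning frame (known in advance).
frame_pts = np.array([[80, 80], [1840, 80], [80, 1000], [1840, 1000]], dtype=np.float32)
# The same feature points as detected in the captured image of the projected picture.
captured_pts = np.array([[95, 70], [1820, 110], [70, 1010], [1790, 980]], dtype=np.float32)

# Correction parameters: a homography mapping detected positions back to the ideal ones.
H, _ = cv2.findHomography(captured_pts, frame_pts)

def correct(image_to_project: np.ndarray) -> np.ndarray:
    """Pre-warp the image to be projected so the picture on the screen appears undistorted."""
    h, w = image_to_project.shape[:2]
    return cv2.warpPerspective(image_to_project, H, (w, h))

corrected = correct(np.zeros((1080, 1920, 3), dtype=np.uint8))
print(H.shape, corrected.shape)  # (3, 3) (1080, 1920, 3)
```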
  • FIG. 1A is a schematic diagram of a projection system according to some embodiments of the present disclosure.
  • FIG. 1B is a structural diagram of a projection device according to some embodiments of the present disclosure.
  • FIG. 2 is a schematic diagram of a light source assembly, an optical engine and a lens in a projection device according to some embodiments of the present disclosure.
  • FIG. 3 is a structural diagram of an optical path in a projection device according to some embodiments of the present disclosure.
  • Fig. 4 is a schematic diagram of the principle of an optical path of a light source assembly in a projection device according to some embodiments of the present disclosure.
  • FIG. 5 is an arrangement structure diagram of tiny reflective mirrors in a digital micromirror device according to some embodiments of the present disclosure.
  • Fig. 6 is a schematic diagram of the operation of a tiny reflective mirror according to some embodiments of the present disclosure.
  • Fig. 7 is a schematic diagram of the swing positions of a tiny reflective mirror in the digital micromirror device shown in Fig. 5.
  • FIG. 8 is a structural diagram of another projection device according to some embodiments of the present disclosure.
  • FIG. 9 is a schematic diagram of the composition of a projection device according to some embodiments of the present disclosure.
  • FIG. 10 is a structural diagram of another projection device according to some embodiments of the present disclosure.
  • FIG. 11 is a structural diagram of another projection device according to some embodiments of the present disclosure.
  • Fig. 12 is a schematic diagram of a black frame and a positioning frame according to some embodiments of the present disclosure.
  • Fig. 13 is a schematic diagram of a positioning frame according to some embodiments of the present disclosure.
  • Fig. 14 is a schematic diagram of another positioning frame according to some embodiments of the present disclosure.
  • Fig. 15 is a schematic diagram of another positioning frame according to some embodiments of the present disclosure.
  • Fig. 16 is a schematic composition diagram of a black field video according to some embodiments of the present disclosure.
  • Fig. 17 is a schematic composition diagram of another black field video according to some embodiments of the present disclosure.
  • Fig. 18 is a flowchart of a method for correcting a projected image according to some embodiments of the present disclosure.
  • Fig. 19 is a flow chart of another method for correcting a projected image according to some embodiments of the present disclosure.
  • the terms "first" and "second" are used for descriptive purposes only, and shall not be understood as indicating or implying relative importance or implicitly specifying the quantity of the indicated technical features. Thus, a feature defined with "first" or "second" may explicitly or implicitly include one or more of such features. In the description of the present disclosure, unless otherwise specified, "plurality" means two or more.
  • words such as "exemplary" or "for example" are used to mean serving as an example, instance or illustration. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present disclosure shall not be construed as being preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "for example" is intended to present related concepts in a concrete manner.
  • the projected image projected by the projection device on the projection screen may not match the projection screen (for example, the projected image exceeds the border of the projection screen), and the projected image needs to be manually corrected, or the projected image needs to be corrected using a geometric correction function.
  • these correction methods require adjustment point by point or feature point by feature point by hand, which is not only time-consuming and laborious, but also gives users an extremely poor viewing experience.
  • FIG. 1A is a schematic diagram of a projection system according to some embodiments of the present disclosure.
  • the projection system includes: a projection device 10 , a projection screen 20 , a control device 30 and a server 40 .
  • the user can control the projection device 10 to project on the projection screen 20 through the control device 30 , and the server 40 can provide various contents and interactions to the projection device 10 .
  • the control device 30 can be a remote controller 30A, which can communicate with the projection device 10 through infrared protocol communication, Bluetooth protocol communication, ZigBee protocol communication or other short-range communication methods, so as to control the projection device 10 wirelessly or through other wired means.
  • the user can control the projection device 10 by inputting user instructions through buttons on the remote control 30A, voice input, control panel input, etc.
  • the user can input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input keys, menu keys, power on/off keys, etc. on the remote controller 30A to realize the functions of the projection device 10.
  • the control device 30 can also be an intelligent device, such as a mobile terminal 30B, a tablet computer, a computer or a notebook computer, which can communicate with the multimedia controller through a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN) or other networks, and control the projection device 10 through an application program corresponding to the multimedia controller.
  • the application can provide users with various controls through an intuitive user interface (UI, User Interface) on the screen associated with the smart device.
  • both the mobile terminal 30B and the projection device 10 can be installed with software applications, so that the connection and communication between the two can be realized through the network communication protocol, and then the purpose of one-to-one control operation and data communication can be realized.
  • the mobile terminal 30B can establish a control command protocol with the projection device 10
  • the remote control keyboard can be synchronized to the mobile terminal 30B, and the function of controlling the multimedia controller can be realized by controlling the user interface on the mobile terminal 30B
  • the audio and video content displayed on the screen is transmitted to the projection device 10 to realize the synchronous display function.
  • the server 40 can be a video server, an electronic program guide (EPG, Electronic Program Guide) server, a cloud server, and the like.
  • the projection device 10 can perform data communication with the server 40 through various communication methods.
  • the projection device 10 may be allowed to perform a wired communication connection or a wireless communication connection with the server 40 through a local area network, a wireless local area network or other networks.
  • projection device 10 interacts with the EPG by sending and receiving messages, receiving software program updates, or accessing a remotely stored digital media library.
  • the servers 40 may be one group or multiple groups, and may be one or more types of servers. Other network service contents such as video on demand and advertisement service are provided through the server 40 .
  • the light source assembly 100 is configured to provide an illumination beam (laser beam).
  • the optical machine 200 is configured to use an image signal to modulate the illumination beam provided by the light source assembly 100 to obtain a projection beam.
  • the lens 300 is configured to project the projection beam onto a projection screen or a wall for imaging.
  • the light source assembly 100 , the light engine 200 and the lens 300 are sequentially connected along the beam propagation direction, and each is wrapped by a corresponding housing.
  • the housings of the light source assembly 100 , the optical engine 200 and the lens 300 support the optical components and make the optical components meet certain sealing or airtight requirements.
  • the light source assembly 100 is airtightly sealed through its corresponding housing, which can better improve the problem of light decay of the light source assembly 100 .
  • One end of the optical engine 200 is coupled to the lens 300 and arranged along a first direction X of the whole machine, for example, the first direction X may be the width direction of the whole machine.
  • the other end of the optical machine 200 is coupled with the light source assembly 100 .
  • the connection direction between the light source assembly 100 and the optical machine 200 is perpendicular to the connection direction between the optical machine 200 and the lens 300.
  • on the one hand, this connection structure can adapt to the optical path characteristics of the reflective light valve in the optical machine 200; on the other hand, it also helps shorten the length of the optical path in one dimension, which is beneficial to the structural arrangement of the whole machine.
  • if the three were instead arranged along a single direction, the length of the optical path in that direction would be very long, which is not conducive to the structural arrangement of the whole machine.
  • light source assembly 100 may include three laser arrays.
  • Fig. 2 is a schematic diagram of a light source assembly, an optical machine, and a lens in a projection device according to some embodiments of the present disclosure.
  • the light source assembly 100 is taken as an example of a three-color laser light source, and the three laser arrays may respectively be a red laser array 130, a green laser array 120 and a blue laser array 110; but it is not limited thereto.
  • the three laser arrays may also all be blue laser arrays 110, or two laser arrays may be blue laser arrays 110 and one laser array may be a red laser array 130.
  • the light source assembly 100 can generate an illumination beam containing light of the three primary colors, so there is no need to provide a fluorescent wheel in the light source assembly 100 (when the one or more laser arrays included in the light source assembly can only generate laser light of one or two colors, the existing laser light must be used to excite a fluorescent wheel to generate fluorescent light of the other colors, so that the laser light and the fluorescent light together form white light), and the volume of the light source assembly 100 is therefore small.
  • the light source assembly 100 may also include two laser arrays.
  • the light source assembly 100 is a two-color laser light source as an example.
  • the two laser arrays can be a blue laser array 110 and a red laser array 130;
  • the light source assembly 100 may also include a single laser array, that is, the light source assembly 100 is a monochromatic laser light source; for example, the light source assembly 100 includes only the blue laser array 110, or includes only the blue laser array 110 and the red laser array 130.
  • FIG. 4 is a schematic diagram of the optical path principle of the light source assembly in the projection device according to some embodiments of the present disclosure.
  • as shown in FIG. 4, the light source assembly 100 may further include a fluorescent wheel 140, a color filter wheel 150 and a light combining mirror 160. After the blue laser array 110 emits blue light, a part of the blue light irradiates the fluorescent wheel 140 to generate red fluorescent light (when the light source assembly 100 includes the red laser array 130, it is not necessary to generate red fluorescent light) and green fluorescent light; the blue laser light, the red fluorescent light (or red laser light) and the green fluorescent light sequentially pass through the light combining mirror 160 and then through the color filter wheel 150 for color filtering, and the three primary colors are output sequentially.
  • the human eye cannot distinguish the color of light at a certain moment, and what it perceives is still mixed white light.
  • FIG. 3 is a schematic diagram of an optical path structure in a projection device according to some embodiments of the present disclosure.
  • as shown in FIG. 3, the optical machine 200 includes a light pipe 210, a lens assembly 220, a reflector 230, a digital micromirror device (DMD) 240 and a prism assembly 250.
  • the light pipe 210 can receive the illumination beam of the light source assembly 100 and homogenize the illumination beam.
  • the lens assembly 220 can amplify the illumination light beam first, then converge it and output it to the reflector 230 .
  • the mirror 230 can reflect the illumination beam to the prism assembly 250 .
  • the prism assembly 250 reflects the illumination beam to the DMD 240, and the DMD 240 modulates the illumination beam, and reflects the modulated projection beam to the lens 300.
  • the DMD 240 is the core component, and its function is to use the image signal to modulate the illumination beam of the light source assembly 100, that is, to control the illumination beam to display different colors and brightness for different pixels of the image to be displayed, so as to finally form an optical image, so the DMD 240 is also known as a light modulation device or light valve.
  • the light modulation device can be divided into a transmissive light modulation device (or light valve) or a reflective light modulation device (or light valve).
  • the DMD 240 reflects the illumination beam, which is a reflective light modulation device.
  • the liquid crystal light valve transmits the illumination beam, so it is a transmissive light modulation device.
  • depending on the number of light valves used, optical machines can be divided into single-chip systems, two-chip systems or three-chip systems.
  • when one DMD 240 is used, the optical machine 200 can be called a single-chip system.
  • when three DMDs 240 are used, the optical machine 200 can be called a three-chip system.
  • the DMD 240 is applied in the digital light processing (Digital Light Processing, DLP) projection architecture, as shown in Figure 2 and Figure 3, the optical machine 200 uses the DLP projection architecture.
  • FIG. 5 is an arrangement structure diagram of tiny reflective mirrors in a digital micromirror device according to some embodiments of the present disclosure. As shown in FIG. 5, the DMD 240 includes thousands of tiny reflective mirrors 2401 that can be individually driven to rotate. These tiny mirrors 2401 are arranged in an array, and each tiny mirror 2401 corresponds to a pixel in the image to be displayed.
  • each tiny reflective mirror 2401 is equivalent to a digital switch, which can swing within a range of plus or minus 12 degrees (±12°) or plus or minus 17 degrees (±17°) under the action of an external electric field, so that the reflected light can pass through the lens 300 along the optical axis and be imaged on the screen to form a bright pixel.
  • FIG. 6 is a schematic diagram of the operation of a tiny reflective mirror according to some embodiments of the present disclosure.
  • the light reflected by the micro-reflector 2401 at a negative deflection angle is called OFF light; the OFF light is invalid light, and it is absorbed after hitting the housing 101 of the whole machine, the housing of the optical machine 200 or a light absorbing unit.
  • the light reflected by the tiny reflective lens 2401 at a positive deflection angle is called ON light.
  • the ON light is the effective beam that the tiny reflective mirrors 2401 on the surface of the DMD 240, after receiving the illumination beam, reflect at a positive deflection angle into the lens 300.
  • the open state of the micro-reflector 2401 is the state in which the micro-reflector 2401 is, and can be maintained, when the illumination beam emitted by the light source assembly 100 is reflected by the micro-reflector 2401 and can enter the lens 300, that is, the state in which the micro-reflector 2401 is at a positive deflection angle.
  • the closed state of the tiny reflective mirror 2401 is the state in which the tiny reflective mirror 2401 is, and can be maintained, when the illumination beam emitted by the light source assembly 100 is reflected by the tiny reflective mirror 2401 and does not enter the lens 300, that is, the state in which the tiny reflective mirror 2401 is at a negative deflection angle.
  • FIG. 7 is a schematic diagram of the swing positions of a tiny reflective mirror in the digital micromirror device shown in FIG. 5. As shown in FIG. 7, the state at +12° is the on state and the state at -12° is the off state; although the deflection angle can take values between -12° and +12°, the actual working states of the tiny mirror 2401 are only the on state and the off state.
  • similarly, for a mirror that swings within ±17°, the state at +17° is the on state and the state at -17° is the off state.
  • the image signal is converted into digital codes such as 0 and 1, and these digital codes drive the tiny mirrors 2401 to swing.
  • within the display period of one frame of image, some or all of the tiny mirrors 2401 are switched between the on state and the off state, and the gray scale of each pixel in the frame is realized according to the durations for which the corresponding tiny mirror 2401 stays in the on state and in the off state.
  • for example, when a pixel has 256 gray scales from 0 to 255, the tiny mirror corresponding to gray scale 0 is in the off state during the entire display period of one frame of image, the tiny mirror corresponding to gray scale 255 is in the on state during the entire display period of one frame of image, and the tiny mirror corresponding to gray scale 127 is in the on state for half of the display period of one frame of image and in the off state for the other half. Therefore, by using the image signal to control the state of each tiny reflective mirror in the DMD 240 and the duration of each state within the display period of one frame of image, the brightness (gray scale) of the pixel corresponding to each tiny reflective mirror 2401 can be controlled, thereby achieving the modulation of the illumination beam projected onto the DMD 240.
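  • the gray-scale control described above amounts to pulse-width modulation of each micromirror within one frame period; the following Python sketch illustrates that mapping, assuming a 60 Hz frame rate and a linear relationship (both assumptions are illustrative and not taken from the disclosure).

```python
FRAME_PERIOD_MS = 1000 / 60   # display period of one frame at an assumed 60 Hz
MAX_GRAY = 255                # 256 gray scales, 0..255

def on_time_ms(gray: int) -> float:
    """Time the micromirror spends in the on state within one frame."""
    if not 0 <= gray <= MAX_GRAY:
        raise ValueError("gray scale out of range")
    return FRAME_PERIOD_MS * gray / MAX_GRAY

# gray 0   -> 0.0 ms   (off for the whole frame)
# gray 127 -> ~8.3 ms  (on for roughly half of the frame)
# gray 255 -> ~16.7 ms (on for the whole frame)
for g in (0, 127, 255):
    print(g, round(on_time_ms(g), 2))
```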
  • the light pipe 210, the lens assembly 220 and the reflector 230 at the front end of the DMD 240 form an illumination light path, and the illumination beam emitted by the light source assembly 100 passes through the illumination light path to form a light beam whose size and incident angle meet the requirements of the DMD 240.
  • the lens 300 includes a combination of multiple lenses, which are generally divided into groups, such as three-stage front group, middle group and rear group, or two-stage front group and rear group.
  • the front group is the lens group near the light output side of the projection device (left side shown in FIG. 2 )
  • the rear group is the lens group near the light output side of the light engine 200 (right side shown in FIG. 2 ).
  • the lens 300 may also be a zoom lens, a lens with a fixed focal length and adjustable focus, or a fixed-focus lens.
  • the laser projection device is an ultra-short-focus projection device
  • the lens 300 is an ultra-short-focus lens
  • the throw ratio of the lens 300 is usually less than 0.3, such as 0.24.
  • the throw ratio refers to the ratio of the projection distance to the screen width. The smaller the ratio, the larger the projection screen width at the same projection distance.
  • the ultra-short-focus lens with a relatively small throw ratio can adapt to a narrow space while ensuring the projection effect.
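  • as a quick numerical illustration of the throw-ratio relationship (throw ratio = projection distance / picture width), the snippet below uses the 0.24 value mentioned above together with an assumed 0.5 m projection distance.

```python
def projected_width(throw_ratio: float, distance_m: float) -> float:
    """Width of the projected picture for a given throw ratio and projection distance."""
    return distance_m / throw_ratio

# An ultra-short-focus lens with a throw ratio of 0.24 placed 0.5 m from the
# screen already produces a picture about 2.08 m wide.
print(round(projected_width(0.24, 0.5), 2))
```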
  • the projection device 10 further includes a circuit system architecture 400 and a detection device 600.
  • the circuit system architecture 400 includes a first controller 401, a second controller 402, a power supply circuit 403, a light source driving circuit 404 and a light valve driving circuit 405.
  • the power circuit 403 is coupled with the light valve driving circuit 405 , the light source driving circuit 404 and the second controller 402 , and is used for supplying power to the light valve driving circuit 405 , the light source driving circuit 404 and the second controller 402 .
  • the first controller 401 is configured to send a standby instruction to the second controller 402 in response to the standby operation, and control the power supply circuit 403 to keep the state of supplying power to the second controller 402 unchanged.
  • the second controller 402 is configured to control the power supply circuit 403 to maintain the state of supplying power to the light valve driving circuit 405 in response to the standby command, and control the power supply circuit 403 to stop supplying power to the light source driving circuit 404 .
  • the standby operation may be a click operation on a standby button in a remote controller used to control the projection device 10 , or a click operation on a standby button on a housing of the projection device 10 .
  • the first controller 401 may be a system-on-chip (System on Chip, SoC), and the second controller 402 may be a display control chip, such as a DLP chip.
  • the first controller 401 and the second controller 402 may also be integrated into one chip.
  • when the first controller 401 and the second controller 402 are two separate chips, the first controller 401 is coupled to the second controller 402.
  • the detection device 600 is configured to detect whether there is a target object within the target range, and generate a detection signal based on the detection result.
  • FIG. 9 is a schematic diagram of the composition of a projection device according to some embodiments of the present disclosure.
  • the detection device 600 is located outside the housing of the projection device 10 .
  • the detection device 600 may be located on the side of the casing of the projection device 10 , and the plane of the side intersects the plane of the projection screen 20 .
  • the detection device 600 may be located on the side of the housing of the projection device 10 away from the projection screen 20 .
  • the detection device 600 may include at least one of a millimeter wave sensor, a pyroelectric infrared sensor, and a camera, and the embodiment of the present disclosure does not limit the type and location of the detection device 600 .
  • the detection device 600 may periodically or in real time detect whether there is a target object within the target range, and generate a detection signal based on the detection result.
  • the detection signal is used to indicate whether there is a target object in the target range, and the target object may be a person within the detection range of the detection device 600 .
  • the target range may be the detection range of the detection device 600 , or the target range may be a fixed range pre-stored in the detection device 600 , and the target range is within the detection range of the detection device 600 .
  • the detection device 600 may include a millimeter wave sensor, and the detection device 600 may emit a millimeter wave signal, and determine whether there is a target object within the target range according to the millimeter wave signal reflected by the target object.
  • the detection device 600 can determine the position of the target object based on the millimeter wave signal reflected by the target object, and detect whether the position of the target object is within the target range.
  • the position of the target object may include a target distance between the target object and the detection device 600 and an azimuth of the target object.
  • the detection device 600 can determine a difference signal according to the received millimeter wave signal, and determine the target distance between the target object and the detection device 600 according to the peak frequency of the difference signal. Moreover, the detection device 600 can determine the azimuth angle of the target object according to the difference between the phase angles of two adjacent difference signals.
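  • as a hedged sketch of how such a frequency-modulated continuous-wave (FMCW) millimeter-wave front end could turn the difference (beat) signal into a target distance and azimuth, the snippet below uses an illustrative chirp bandwidth, chirp duration and antenna spacing; none of these values come from the disclosure.

```python
import math

C = 3.0e8  # speed of light, m/s

def target_distance(beat_freq_hz: float, bandwidth_hz: float, chirp_s: float) -> float:
    """Range from the peak frequency of the difference signal (standard FMCW relation)."""
    return C * beat_freq_hz * chirp_s / (2.0 * bandwidth_hz)

def target_azimuth(phase_diff_rad: float, wavelength_m: float, spacing_m: float) -> float:
    """Azimuth (radians) from the phase difference between two adjacent receive channels."""
    return math.asin(wavelength_m * phase_diff_rad / (2.0 * math.pi * spacing_m))

# Example: a 4 GHz chirp swept over 40 us with a beat peak at 2 MHz gives ~3.0 m;
# half-wavelength antenna spacing with a 0.5 rad phase difference gives ~9.2 degrees.
dist = target_distance(2e6, 4e9, 40e-6)
az = math.degrees(target_azimuth(0.5, 3.9e-3, 3.9e-3 / 2))
print(round(dist, 2), round(az, 1))
```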
  • the detection device 600 may include a pyroelectric infrared sensor, which can detect an infrared signal radiated by a target object, and determine whether there is a target object within the target range according to the detected infrared signal radiated by the target object .
  • the detection device 600 can amplify the detected infrared signal, convert the amplified infrared signal into an electrical signal, and detect whether the amplitude of the electrical signal is greater than the amplitude value threshold. If the detection result is that the amplitude of the electrical signal is greater than or equal to the amplitude threshold, it is determined that there is a target object within the target range. If the detection result is that the amplitude of the electrical signal is smaller than the amplitude threshold, the detection device 600 determines that there is no target object within the target range.
  • the amplitude threshold may be a fixed value pre-stored in the detection device 600 .
  • the detection device 600 may include a camera, and if the target range is within the detection range of the detection device 600 , the detection device 600 may detect whether there is a target object in the image captured by it.
  • the detection device 600 can detect the ratio of the area occupied by the target object in the captured image to the area of the captured image. If the detection result is that the ratio is greater than or equal to the ratio threshold, it is determined that the target object exists within the target range. If the ratio is smaller than the ratio threshold, the detection device 600 determines that there is no target object within the target range.
  • the ratio threshold may be a fixed value pre-stored in the detection device 600 .
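  • the two decision rules above, the amplitude check for the pyroelectric infrared signal and the area-ratio check for the camera image, reduce to simple threshold comparisons; the sketch below illustrates them with assumed threshold values.

```python
AMPLITUDE_THRESHOLD = 0.8   # volts; illustrative pre-stored value
RATIO_THRESHOLD = 0.05      # fraction of the captured image; illustrative pre-stored value

def target_present_infrared(signal_amplitude: float) -> bool:
    """Target exists if the amplified electrical signal reaches the amplitude threshold."""
    return signal_amplitude >= AMPLITUDE_THRESHOLD

def target_present_camera(object_area_px: int, image_area_px: int) -> bool:
    """Target exists if the object covers at least the ratio threshold of the captured image."""
    return (object_area_px / image_area_px) >= RATIO_THRESHOLD

print(target_present_infrared(1.2))              # True: amplitude above threshold
print(target_present_camera(3000, 1920 * 1080))  # False: object occupies too little of the frame
```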
  • if there is a target object within the target range, the detection device 600 generates a detection signal for indicating that the target object exists within the target range. If there is no target object within the target range, the detection device 600 generates a detection signal for indicating that there is no target object within the target range.
  • the detection device 600 may be coupled to the first controller 401 and may also be coupled to the second controller 402 .
  • when the detection device 600 is coupled with the first controller 401, the first controller 401 sends a signal acquisition instruction to the detection device 600 in response to the standby operation, and after receiving the signal acquisition instruction, the detection device 600 sends the generated detection signal to the first controller 401 in response to that instruction.
  • the first controller 401 sends the detection signal to the second controller 402 .
  • when the detection device 600 is coupled with the second controller 402, the second controller 402 sends a signal acquisition instruction to the detection device 600 in response to the standby instruction, and after receiving the signal acquisition instruction, the detection device 600 sends the generated detection signal to the second controller 402 in response to that instruction.
  • if the second controller 402 determines, based on the detection signal, that there is a target object within the target range, it controls the power supply circuit 403 to supply power to the light source driving circuit 404 and sends an image signal of the standby image to the light valve driving circuit 405.
  • An embodiment of the present disclosure provides a projection device 10.
  • if the second controller 402 determines, based on the detection signal, that there is a target object within the target range, it controls the power supply circuit 403 to supply power to the light source drive circuit 404 and sends a projection signal to the light valve driving circuit 405, so that the light valve driving circuit 405 projects and displays the standby picture on the projection screen 20. That is, when the projection device 10 is in the standby state, it can project and display the standby picture on the projection screen 20 upon detecting that there is a target object within the target range, thereby effectively enriching the functions of the projection device 10.
  • the second controller 402 is further configured to send a current driving signal to the light source driving circuit 404; the light source driving circuit 404 is configured to send a driving current to the light source assembly 100 in response to the current driving signal, and the light source assembly 100 emits the illumination beam under the driving of the driving current.
  • the standby image is an image pre-stored in the light valve driving circuit 405 .
  • the second controller 402 is further configured to determine that the target object exists within the target range based on the detection signal, send an image request signal to the first controller 401 , and send a standby image to the light valve driving circuit 405 .
  • the image request signal is used to instruct the first controller 401 to send the pre-stored standby image to the second controller 402 .
  • the first controller 401 is further configured to send a pre-stored standby image to the second controller 402 in response to the image request signal.
  • the first controller 401 is configured to: receive a video switching instruction; and send a black field video to the second controller 402 in response to the video switching instruction, where the black field video includes a plurality of black frames and a plurality of positioning frames, and the positioning frames are used to assist the projection device 10 in image correction.
  • the second controller 402 is coupled to the light source assembly 100 and the optical machine 200 and configured to receive the black field video and control the light source assembly 100 and the optical machine 200 to play the black field video.
  • the standby image may be an image frame in the black field video, for example, a black frame and/or a positioning frame.
  • FIG. 10 is a structural diagram of another projection device according to some embodiments of the present disclosure.
  • the first controller 401 includes a wake-up circuit 4011 and a slave control circuit 4012 .
  • the wake-up circuit 4011 is coupled to the slave control circuit 4012 and the second controller 402 .
  • the wake-up circuit 4011 is configured to send a standby instruction to the second controller 402 in response to the standby operation, and control the power supply circuit 403 to stop supplying power to the slave control circuit 4012; wherein, the slave control circuit 4012 is configured to send a command to the second controller 402 Control instruction.
  • the second controller 402 is configured to control the power supply circuit 403 to maintain the state of supplying power to the light valve driving circuit 405 in response to the standby command, and control the power supply circuit 403 to stop supplying power to the light source driving circuit 404 .
  • the wake-up circuit 4011 and the slave control circuit 4012 are integrated on the first controller 401, and when the projection device is in a standby state, the power supply circuit 403 keeps powering the second controller 402 and the wake-up circuit 4011 unchanged. That is, when the projection device 10 is in the standby state, except the wake-up circuit 4011 in the first controller 401 is in the working state, other circuits in the first controller 401 (such as the slave control circuit 4012 ) are all in the non-working state.
  • the slave control circuit 4012 is configured to send a control instruction to the second controller 402 when it is in a working state.
  • the second controller 402 may respond to the control instruction and control the power supply circuit 403 to supply power to the light source driving circuit 404 and the light valve driving circuit 405 .
  • the control instruction may carry a projected image to be displayed, and the second controller 402 may send the projected image to be displayed to the light valve drive circuit 405 in response to the control instruction, so that the light valve drive circuit 405 projects and displays the projected image on the projection screen 20.
  • the power circuit 403 includes a power board 4031 and a first switch circuit 4032 , and the first switch circuit 4032 is coupled to the power board 4031 , the second controller 402 and the wake-up circuit 4011 respectively.
  • the wake-up circuit 4011 is further configured to control the first switch circuit 4032 to maintain a conductive state in response to the standby operation, so that the power board 4031 continues to supply power to the second controller 402 .
  • the power supply circuit 403 keeps the state of supplying power to the second controller 402 unchanged.
  • the control terminal of the first switch circuit 4032 is coupled to the wake-up circuit 4011 , the input terminal of the first switch circuit 4032 is coupled to the power board 4031 , and the output terminal of the first switch circuit 4032 is coupled to the second controller 402 .
  • the first switch circuit 4032 can be a switch, and the wake-up circuit 4011 is also used to continuously send an enable signal at an active level to the control terminal of the first switch circuit 4032 in response to the standby operation, thereby keeping the first switch circuit 4032 turned on.
  • the wake-up circuit 4011 may be provided with multiple general-purpose input/output (GPIO) ports.
  • the first switch circuit 4032 may be coupled to the first GPIO port among the multiple GPIO ports provided on the wake-up circuit 4011 .
  • the wake-up circuit 4011 may also be coupled to the second controller 402 through a second GPIO port among the plurality of GPIO ports, and send a standby instruction to the second controller 402 through the second GPIO port.
  • FIG. 11 is a structural diagram of another projection device according to some embodiments of the present disclosure.
  • the second controller 402 is directly coupled to the power board 4031 , and the power board 4031 can continuously supply power to the second controller 402 when the projection device is in a standby state.
  • the number of GPIO ports that can be provided on the wake-up circuit 4011 is limited, and the wake-up circuit 4011 also needs to respond to the standby operation by controlling, through other GPIO ports, the power supply circuit 403 to stop supplying power to the slave control circuit 4012, the remote control module in the projection device, and the like.
  • by directly coupling the second controller 402 to the power board 4031, it is possible to effectively avoid the situation in which the wake-up circuit 4011 has limited GPIO ports and no spare GPIO port on the wake-up circuit 4011 can be coupled to the first switch circuit 4032. Therefore, it is effectively ensured that when the projection device is in a standby state, the power supply circuit 403 remains in a state in which it can supply power to the second controller 402.
  • the power supply circuit 403 may further include a second switch circuit 4033 , a third switch circuit 4034 and a fourth switch circuit 4035 .
  • the second switch circuit 4033 , the third switch circuit 4034 and the fourth switch circuit 4035 can all be switches.
  • the second switch circuit 4033 is respectively coupled to the power board 4031, the wake-up circuit 4011 and the slave control circuit 4012.
  • the wake-up circuit 4011 is used to control the second switch circuit 4033 to disconnect in response to the standby operation, so that the power board 4031 stops supplying power to the slave control circuit 4012.
  • the control terminal of the second switch circuit 4033 is coupled to the wake-up circuit 4011 , the input terminal of the second switch circuit 4033 is coupled to the power board 4031 , and the output terminal of the second switch circuit 4033 is coupled to the slave control circuit 4012 .
  • the wake-up circuit 4011 is configured to send an enable signal at an inactive level to the control terminal of the second switch circuit 4033 in response to the standby operation, thereby turning off the second switch circuit 4033 .
  • the wake-up circuit 4011 may be coupled to the control terminal of the second switch circuit 4033 through the third GPIO port among the plurality of GPIO ports configured by the wake-up circuit 4011 .
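  • putting the wake-up circuit's standby handling together: holding the first switch circuit enabled keeps the second controller powered, opening the second switch circuit cuts the slave control circuit, and the standby instruction goes out over another GPIO port. The sketch below models this as plain dictionary state; the port names and the GPIO abstraction are illustrative, not a real driver API.

```python
HIGH, LOW = 1, 0

class WakeUpCircuit:
    """Toy model of the wake-up circuit 4011 reacting to a standby operation."""

    def __init__(self) -> None:
        # Assumed port names: first_switch -> first switch circuit 4032,
        # second_switch -> second switch circuit 4033, standby_cmd -> second controller 402.
        self.gpio = {"first_switch": HIGH, "second_switch": HIGH, "standby_cmd": LOW}

    def on_standby_operation(self) -> None:
        self.gpio["first_switch"] = HIGH   # keep the power board supplying the second controller
        self.gpio["second_switch"] = LOW   # stop supplying the slave control circuit
        self.gpio["standby_cmd"] = HIGH    # send the standby instruction to the second controller


wake = WakeUpCircuit()
wake.on_standby_operation()
print(wake.gpio)  # {'first_switch': 1, 'second_switch': 0, 'standby_cmd': 1}
```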
  • the third switch circuit 4034 is respectively coupled to the power board 4031, the second controller 402 and the light valve drive circuit 405, and the second controller 402 is used to control the third switch circuit 4034 to maintain a conduction state in response to the standby command, so that the power board 4031 continuously supplies power to the light valve driving circuit 405 .
  • the control terminal of the third switch circuit 4034 is coupled to the second controller 402, the input terminal of the third switch circuit 4034 is coupled to the power board 4031, and the output terminal of the third switch circuit 4034 is coupled to the light valve driving circuit 405. catch.
  • the second controller 402 is configured to continuously send an enable signal with an active level to the control terminal of the third switch circuit 4034 in response to the standby instruction, thereby keeping the third switch circuit 4034 in a conducting state.
  • the fourth switch circuit 4035 is respectively coupled to the power board 4031, the second controller 402 and the light source driving circuit 404. The second controller 402 is used to control the fourth switch circuit 4035 to disconnect in response to the standby command, so that the power board 4031 stops supplying power to the light source driving circuit 404; and, if it is determined based on the detection signal that there is a target object within the target range, the fourth switch circuit 4035 can be controlled to turn on, so that the power supply board 4031 supplies power to the light source driving circuit 404.
  • the control terminal of the fourth switch circuit 4035 is coupled to the second controller 402, the input terminal of the fourth switch circuit 4035 is coupled to the power board 4031, and the output terminal of the fourth switch circuit 4035 is coupled to the light source driving circuit 404 .
  • the second controller 402 is configured to send an enable signal at an inactive level to the control terminal of the fourth switch circuit 4035 in response to the standby instruction, thereby controlling the fourth switch circuit 4035 to turn off. If the second controller 402 determines, based on the detection signal, that there is a target object within the target range, it can send an enable signal at an active level to the control terminal of the fourth switch circuit 4035, thereby controlling the fourth switch circuit 4035 to turn on.
  • the projection device 10 further includes a heat dissipation assembly 1000 coupled to the second controller 402 .
  • the second controller 402 is further configured to control the power supply circuit 403 to stop supplying power to the cooling assembly 1000 in response to the standby instruction. If it is determined based on the detection signal that there is a target object within the target range, the power supply circuit 403 is controlled to supply power to the cooling assembly 1000 .
  • the power circuit 403 may further include a fifth switch circuit 4036 , and the fifth switch circuit 4036 is respectively coupled to the power board 4031 , the second controller 402 and the cooling assembly 1000 .
  • the second controller 402 is also used to control the fifth switch circuit 4036 to turn off in response to the standby command, so that the power board 4031 stops supplying power to the cooling assembly 1000; and, if it is determined based on the detection signal that there is a target object within the target range, to control the fifth switch circuit 4036 to turn on, so that the power board 4031 supplies power to the cooling assembly 1000.
  • the control terminal of the fifth switch circuit 4036 is coupled to the second controller 402 , the input terminal of the fifth switch circuit 4036 is coupled to the power board 4031 , and the output terminal of the fifth switch circuit 4036 is coupled to the cooling assembly 1000 .
  • the second controller 402 is configured to send an enable signal at an inactive level to the control terminal of the fifth switch circuit 4036 in response to the standby command, thereby turning off the fifth switch circuit 4036. If the second controller 402 determines, based on the detection signal, that there is a target object within the target range, it can send an enable signal at an active level to the control terminal of the fifth switch circuit 4036, thereby controlling the fifth switch circuit 4036 to turn on.
  • the second controller 402 may also send a driving signal to the heat dissipation assembly 1000, thereby making the heat dissipation assembly 1000 work under the drive of the driving signal.
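  • the second controller's standby behaviour described above can be summarized as a small state machine: on the standby instruction it keeps the light valve drive circuit powered and cuts the light source drive circuit and the heat dissipation assembly; when the detection signal reports a target object, it restores both supplies and pushes the standby image to the light valve drive circuit. The sketch below is only an illustration of that logic; the class interface is an assumption.

```python
ON, OFF = True, False

class SecondController:
    """Toy model of the second controller 402 during standby."""

    def __init__(self) -> None:
        self.third_switch = ON    # power board -> light valve drive circuit
        self.fourth_switch = ON   # power board -> light source drive circuit
        self.fifth_switch = ON    # power board -> heat dissipation assembly
        self.sent_to_light_valve = None

    def on_standby_instruction(self) -> None:
        self.third_switch = ON    # keep the light valve drive circuit powered
        self.fourth_switch = OFF  # stop powering the light source drive circuit
        self.fifth_switch = OFF   # stop powering the heat dissipation assembly

    def on_detection_signal(self, target_in_range: bool, standby_image: str) -> None:
        if target_in_range:
            self.fourth_switch = ON                   # power the light source drive circuit again
            self.fifth_switch = ON                    # power the heat dissipation assembly again
            self.sent_to_light_valve = standby_image  # image signal of the standby image


ctrl = SecondController()
ctrl.on_standby_instruction()
ctrl.on_detection_signal(True, "standby_image_frame")
print(ctrl.fourth_switch, ctrl.sent_to_light_valve)  # True standby_image_frame
```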
  • the wake-up circuit 4011 receives a video switching instruction, controls the power supply circuit 403 to supply power to the slave control circuit 4012, and sends the video switching instruction to the slave control circuit 4012.
  • the slave control circuit 4012 receives and responds to the video switching instruction, and sends the black scene video to the second controller 402 .
  • the video switching instruction is used to instruct the picture projected on the projection screen 20 to switch from the current projection picture to a preset projection picture.
  • the video switching instruction may be a button instruction issued by the user by pressing a button on the remote control, or a voice instruction, or an instruction issued by the user through a terminal device coupled to the projection device 10 . Embodiments of the present disclosure do not limit this.
  • the video switching instruction may be an instruction for switching signal channels, switching image modes, opening a USB flash drive, switching video channels, or another instruction that requires switching the currently projected picture.
  • the current signal channel of the projection device 10 is HDMI1
  • the video switching instruction may be used to instruct to switch the signal channel from HDMI1 to HDMI2.
  • the black field video refers to a video whose picture is completely black.
  • the black field video includes at least one black frame and at least one positioning frame, and the embodiments of the present disclosure make no limitation on the number of black frames and the number of positioning frames included in the black field video.
  • a black frame represents a completely black, low-brightness image frame. That is, the color of all pixels on the black frame is black.
  • a positioning frame includes one or more feature points, where a feature point may be formed by multiple adjacent pixel points on the positioning frame. That is, a feature point may be a correction mark, and the correction mark may include multiple adjacent pixel points.
  • the color of the first pixel points on the positioning frame is not black (for example, it may be white or another color), and the color of the second pixel points is black.
  • the first pixel point is a pixel point used to form a feature point on the positioning frame
  • the second pixel point is other pixel points on the positioning frame except the first pixel point.
  • FIG. 12 is a schematic diagram of a black frame and a positioning frame according to some embodiments of the present disclosure.
  • the video frame shown in (a) in FIG. 12 is a black frame.
  • the video frame shown in (b) in Fig. 12 is a positioning frame; this positioning frame includes 4 circular feature points K1-K4, the color of the first pixel points included in each feature point of the positioning frame is white, and the color of the second pixel points other than the feature points is black.
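  • for illustration, the sketch below builds a positioning frame like the one in Fig. 12(b): a black image with four white circular feature points near its corners. The resolution, radius and corner margin are assumed values, not taken from the disclosure.

```python
import numpy as np

def make_black_frame(width: int = 1920, height: int = 1080) -> np.ndarray:
    """A completely black video frame: every pixel is black."""
    return np.zeros((height, width), dtype=np.uint8)

def make_positioning_frame(width: int = 1920, height: int = 1080,
                           radius: int = 20, margin: int = 80) -> np.ndarray:
    """A black frame plus four white circular feature points K1..K4 near the corners."""
    frame = make_black_frame(width, height)                     # second pixels: black
    centers = [(margin, margin), (width - margin, margin),      # K1 upper left, K2 upper right
               (margin, height - margin), (width - margin, height - margin)]  # K3, K4
    yy, xx = np.mgrid[0:height, 0:width]
    for cx, cy in centers:
        mask = (xx - cx) ** 2 + (yy - cy) ** 2 <= radius ** 2
        frame[mask] = 255                                       # first pixels: white
    return frame

positioning = make_positioning_frame()
print(positioning.shape, int(positioning.max()))  # (1080, 1920) 255
```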
  • Embodiments of the present disclosure do not limit the specific color of the black frame, for example, it may also be gray.
  • the embodiment of the present disclosure does not limit the specific color of the second pixel on the positioning frame, and it only needs to keep the color of the second pixel consistent with the color of the black frame.
  • the positioning frame includes one or more feature points, and the number of feature points included in different positioning frames may be the same or different.
  • the positions of the multiple feature points in the positioning frame are different.
  • take the case in which image correction requires K feature points as an example, where K is an integer greater than 1.
  • the K feature points may be located on one positioning frame, or may be respectively located on multiple positioning frames. This disclosure is not limited to this.
  • each positioning frame in the multiple positioning frames includes a part of the K feature points, and the positions of the feature points in different positioning frames can be the same or different.
  • each of the M positioning frames may include multiple feature points, where M is an integer greater than 1.
  • FIG. 13 is a schematic diagram of a positioning frame according to some embodiments of the present disclosure. As shown in FIG. 13, each positioning frame in a black field video has a feature point K1 in the upper left corner, a feature point K2 in the upper right corner, a feature point K3 in the lower left corner and a feature point K4 in the lower right corner; that is, each positioning frame includes 4 feature points at different positions.
  • FIG. 14 is a schematic diagram of another positioning frame according to some embodiments of the present disclosure.
  • the M positioning frames at least include a first positioning frame, a second positioning frame, a third positioning frame and a fourth positioning frame.
  • the multiple feature points may also be respectively located on multiple different positioning frames, and each positioning frame includes two feature points.
  • FIG. 15 is a schematic diagram of another positioning frame according to some embodiments of the present disclosure.
  • the M positioning frames include at least a fifth positioning frame and a sixth positioning frame.
  • the fifth positioning frame includes the feature point K1 in the upper left corner and the feature point K2 in the lower left corner
  • the sixth positioning frame includes the feature point K3 in the upper right corner and the feature point K4 in the lower right corner
  • the positions of the feature points in the fifth positioning frame are different from the positions of the feature points in the sixth positioning frame. That is, the feature points included in the at least two positioning frames are respectively located at different positions in the at least two positioning frames.
  • the embodiment of the present disclosure does not limit the shape of the feature points and the setting manner of the feature points in the positioning frame.
  • by distributing the feature points over multiple positioning frames, the number of feature points that need to be carried on each positioning frame can be reduced.
  • in this way, the difference between a positioning frame and a black frame is reduced, thereby effectively reducing the possibility that the user perceives the positioning frames when watching the black field video.
  • the video frames adjacent to a positioning frame are black frames.
  • when playing the black field video, the second controller 402 controls the light source driving circuit 404 to drive the light source assembly 100 to provide an illumination beam, and controls the light valve driving circuit 405 to drive the optical machine 200 to modulate the illumination beam with the image signal of the black field video to obtain a projection beam; the lens 300 then projects the projection beam into an image.
  • when the projection device 10 is playing the black field video, it may alternately project positioning frames and black frames onto the projection screen 20.
  • for example, the second controller 402 first projects at least one black frame onto the projection screen 20, then projects one positioning frame, then projects at least one black frame again, then another positioning frame, and so on. That is, the second controller 402 can project multiple positioning frames onto the projection screen 20 with at least one black frame between two adjacent positioning frames, so as to project positioning frames and black frames alternately.
  • the number of black frames in the black field video is greater than or equal to the number of positioning frames. For example, taking a black field video with 30 frames per second as an example, 15 video frames are black frames and 15 are positioning frames. For another example, in a black field video with 60 frames per second, among 15 consecutive video frames, 14 are black frames and 1 is a positioning frame. In this way, when the black field video is played, most of the video frames the user sees are black frames and only a small portion are positioning frames, which reduces the possibility of the user perceiving the positioning frames. Moreover, by projecting positioning frames and black frames alternately, a positioning frame is prevented from persisting in the user's vision for too long, further reducing the possibility that the user perceives the positioning frames when watching the black field video.
  • the black field video in the embodiments of the present disclosure may be pre-generated before the projection device receives the video switching instruction, may be generated in real time by the projection device after receiving the video switching instruction, or may be pre-configured in the projection device at the factory.
  • the black field video in the embodiments of the present disclosure is obtained by modifying a video to be processed.
  • the video to be processed includes multiple black frames, and any black frame in the multiple black frames may be the first black frame.
  • the video to be processed only includes black frames and does not include positioning frames. The manner in which the first controller 401 obtains the black field video is introduced below.
  • the first controller 401 is further configured to: add feature points to the first black frame included in the video to be processed to obtain a positioning frame, and replace the first black frame with the positioning frame to obtain a black field video.
  • the video to be processed includes one or more first black frames, and the number of first black frames included in the video to be processed is the same as the number of positioning frames included in the black video.
  • FIG. 16 is a schematic diagram of the composition of a black field video according to some embodiments of the present disclosure. As shown in FIG. 16, taking a video to be processed that includes five black frames P1 to P5 as an example, the third black frame P3 may be selected as the first black frame; four feature points are added to the upper left, lower left, upper right and lower right corners of this first black frame (black frame P3) to obtain the positioning frame L1, and the positioning frame L1 replaces the first black frame P3 to obtain the black field video (a minimal generation sketch is given after this list).
  • if multiple first black frames are selected, the multiple first black frames in the video to be processed may each be replaced with a positioning frame, so that the resulting black field video includes multiple positioning frames.
  • the first controller 401 is further configured to: add feature points to the first black frame included in the video to be processed to obtain a positioning frame, and insert the positioning frame before or after the first black frame to obtain the black field video.
  • FIG. 17 is a schematic composition diagram of another black field video according to some embodiments of the present disclosure.
  • taking a video to be processed that includes three first black frames P1 to P3 as an example, a positioning frame is inserted after each first black frame, that is, positioning frame L1 is inserted after black frame P1, positioning frame L2 after black frame P2, and positioning frame L3 after black frame P3, resulting in a black field video that includes three positioning frames L1-L3 and three black frames P1-P3.
  • inserting positioning frames into the video to be processed can also be understood as frame-expansion processing of the video to be processed, in which a positioning frame is inserted after every preset number of black frames.
  • assuming the frame rate of the video to be processed is 30 Hz, that is, the video to be processed has 30 black frames per second, inserting one positioning frame after every black frame yields a black field video with a frame rate of 60 Hz (see the generation sketch after this list).
  • the projection device 10 may include an image acquisition interface; the first controller 401 is coupled to the image acquisition interface and is further configured to: obtain a captured image through the image acquisition interface, determine the correction parameters according to the captured image, and send the correction parameters to the second controller 402.
  • the captured image is an image captured when the positioning frame in the black field video is projected onto the projection screen 20 .
  • the image acquisition interface is used to connect a photographing device; the photographing device may be arranged on the whole-machine casing 101 of the projection device 10, or at a position outside the whole-machine casing 101 of the projection device 10. The present disclosure does not limit the installation position of the photographing device, as long as it can photograph the projection screen 20.
  • the first controller 401 can acquire images and videos through the image acquisition interface.
  • the first controller 401 sends a photographing instruction to the photographing device through the image acquisition interface; in response to the photographing instruction, the photographing device photographs the positioning frame projected by the lens 300 onto the projection screen 20 to obtain a captured image, and sends the captured image to the first controller 401 through the image acquisition interface.
  • while the projection device 10 projects black frames and positioning frames onto the projection screen 20, the photographing device may photograph the projection screen 20, so it can capture either the black frames or the positioning frames projected on the projection screen 20.
  • the captured images in the implementations of the present disclosure all refer to images, taken by the photographing device, of positioning frames projected on the projection screen 20.
  • since the number and positions of the feature points included in the positioning frames may differ, the number of captured images used for image correction will also differ. Therefore, the specific implementation of the captured images is introduced below according to the different positioning-frame configurations.
  • if any single positioning frame in the black field video includes all the feature points required for image correction, the captured image may be an image captured when any one positioning frame is projected onto the projection screen. In this way, the positions of all the feature points on the projection screen can be determined from one captured image.
  • if a group of positioning frames in the black field video together includes all the feature points required for image correction, while each positioning frame in the group includes only some of them, the captured images include at least M images, where M is the number of positioning frames in that group and M is an integer greater than 1. It should be understood that the M images are in one-to-one correspondence with the M positioning frames in the group.
  • each of these images is obtained by the photographing device photographing the projection screen while it displays the positioning frame corresponding to that image. In this way, the positions of all the feature points on the projection screen can be determined from the M images.
  • for example, assuming image correction requires 4 feature points that are respectively set on four different positioning frames, as shown in FIG. 14, the black field video includes at least the first positioning frame, the second positioning frame, the third positioning frame and the fourth positioning frame.
  • the captured images at least include a first image, a second image, a third image and a fourth image.
  • the first image is the image taken when the first positioning frame is projected onto the projection screen
  • the second image is the image taken when the second positioning frame is projected onto the projection screen
  • the third image is the image taken when the third positioning frame is projected onto the projection screen
  • the fourth image is the image taken when the fourth positioning frame is projected onto the projection screen.
  • for example, assuming image correction requires 4 feature points that are respectively set on two different positioning frames, as shown in FIG. 15, the black field video includes at least the fifth positioning frame and the sixth positioning frame.
  • the captured images include at least the fifth image and the sixth image.
  • the fifth image is an image captured when the fifth positioning frame is projected onto the projection screen
  • the sixth image is an image captured when the sixth positioning frame is projected onto the projection screen.
  • the first controller 401 determines the position information of the feature points by identifying the feature points in the captured image, and determines the correction parameters according to the position information of the feature points (a detection sketch is given after this list). Other implementations may also be used to determine the correction parameters; the embodiments of the present disclosure are not limited thereto.
  • the correction parameters refer to the offset direction and offset amount between a feature point's position on the positioning frame and its position on the projection screen 20 (see the correction sketch after this list).
  • a feature point on the positioning frame corresponds to a pixel area of the projection screen 20; the projection offset of that pixel area on the projection screen 20, relative to the pixel area's initial projection position, is equal to the offset of the feature point.
  • the position information of the feature points is used to reflect the positions of the feature points on the projection screen 20 .
  • the projection screen 20 in the captured image is identified to determine the position of the border of the projection screen 20 and the positions of its corner points, and a two-dimensional plane coordinate system is established based on these border and corner positions (one possible mapping is sketched after this list).
  • for example, a two-dimensional plane coordinate system may be pre-established with the left vertex of the projection screen 20 as the origin, its upper edge as the X axis and its left edge as the Y axis.
  • after the feature points are identified in the captured image, the position information of the feature points can be determined according to the relative positions between the feature points and the border of the projection screen 20, and between the feature points and the corner points of the projection screen 20.
  • the position information of a feature point is its coordinates in the above two-dimensional plane coordinate system.
  • if one captured image includes all the feature points, all the feature points in that captured image can be identified and their position information determined. If the captured images include M images, each containing only some of the feature points, feature point recognition needs to be performed on each of the M images to determine the position information of the feature points in each image.
  • the second controller 402 is further configured to: correct the image to be projected according to the correction parameters, and transmit the image signal of the corrected image to be projected to the optical machine 200, so that the optical machine 200 modulates the illumination beam with the image signal of the corrected image to obtain the projection beam.
  • the second controller 402 receives the correction parameters from the first controller 401, and corrects the image to be projected according to the correction parameters.
  • by adding positioning frames that assist the projection device 10 with image correction to the black field video, on the one hand, since the black field video represents a video with a completely black picture, playing the black field video during video switching prevents the user from noticing the image flicker caused by the switch.
  • on the other hand, the correction parameters can be obtained automatically from the captured image containing the positioning frame, and the image to be corrected is corrected according to these parameters; while the black field video is playing, the user only sees a black picture and cannot recognize the positioning frames present in it, so that the correction is imperceptible to the user during video switching.
  • when the projection device 10 is in the standby state, the second controller 402 can, in response to the standby instruction, control the power supply circuit 403 to keep supplying power to the light valve driving circuit 405, and the power supply circuit 403 keeps supplying power to the second controller 402 and the wake-up circuit 4011 (a control-flow sketch is given after this list).
  • that is, in the standby state both the second controller 402 and the light valve driving circuit 405 are working, so when the second controller 402 determines that a target object is present within the target range, it can quickly control the power supply circuit 403 to supply power to the light source driving circuit 404 and send a projection signal to the light valve driving circuit 405, so as to rapidly project and display the standby image on the projection screen 20.
  • moreover, when the projection device 10 is in the standby state, the power supply circuit 403 keeps supplying power to the wake-up circuit 4011 in the first controller 401 and stops supplying power to the slave control circuit 4012 in the first controller 401. Since the power supply circuit 403 does not need to power the slave control circuit 4012 in the standby state, the standby power consumption of the projection device 10 is reduced.
  • Some embodiments of the present disclosure also provide a projection system, which includes the above-mentioned projection device 10 and a projection screen 20 .
  • the light outlet of the projection device 10 faces the projection screen 20 and emits the light beam toward it; the projection screen 20 reflects the light beam to display the picture.
  • the projection device 10 may be a laser projection device, or may be a light emitting diode (Light Emitting Diode, LED) projection device. Projection device 10 may also have other names, such as projection host and so on.
  • the projection device 10 may be in the shape of a cuboid, a prism, a sphere, or a desk lamp, which is not limited in this embodiment of the present disclosure, as long as it has a projection function.
  • FIG. 18 is a flow chart of a method for correcting a projected image according to some embodiments of the present disclosure. As shown in FIG. 18, the method includes steps S181 to S185: S181, receive a video switching instruction; S182, in response to the video switching instruction, play a black field video that includes multiple black frames and multiple positioning frames, the positioning frames being used to assist the projection device in image correction; S183, obtain a captured image, the captured image being an image captured when a positioning frame is projected onto the projection screen; S184, determine correction parameters according to the captured image; S185, correct the image to be projected according to the correction parameters (an end-to-end sketch of this flow is given after this list).
  • FIG. 19 is a flowchart of another projection image correction method according to some embodiments of the present disclosure. As shown in FIG. 19, the method further includes S186 and S187 between S181 and S182.
  • S186 Add feature points to the first black frame included in the video to be processed to obtain a positioning frame.
  • the video to be processed includes multiple black frames, and the first black frame is any black frame among the multiple black frames.
  • S187: Replace the first black frames with the positioning frames respectively to obtain the black field video.
  • in some embodiments, as shown in FIG. 19, the method further includes S186 and S188 between S181 and S182.
  • S186 Add feature points to the first black frame included in the video to be processed to obtain a positioning frame.
  • the video to be processed includes multiple black frames, and the first black frame is any black frame among the multiple black frames.
  • S188: Insert the positioning frames before or after the multiple first black frames to obtain the black field video.
  • S187 and S188 are two methods of obtaining the black field video; either of the two may be used, and the two methods may also be used together.
  • An embodiment of the present invention further provides a computer-readable storage medium.
  • the computer-readable storage medium includes computer-executable instructions, and when the computer-executable instructions are run on the computer, the computer is made to execute the projection image correction method according to the above-mentioned embodiments.
  • An embodiment of the present invention further provides a computer program product, which can be directly loaded into a memory and contains software code; after being loaded and executed by a computer, the computer program product can implement the projection image correction method of the above-mentioned embodiments.
  • in the above embodiments, the implementation may be realized in whole or in part by software, hardware, firmware or any combination thereof.
  • when implemented using a software program, it may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer-executable instructions.
  • when the computer-executable instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present disclosure are produced in whole or in part.
  • a computer can be a general purpose computer, special purpose computer, computer network, or other programmable device.
  • computer-executable instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, computer-executable instructions may be transmitted from one website, computer, server or data center to another website, computer, server or data center by wired means (such as coaxial cable, optical fiber or digital subscriber line (Digital Subscriber Line, DSL)) or wireless means (such as infrared, radio or microwave).
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center that integrates one or more available media.
  • the usable medium may be a magnetic medium (such as a floppy disk, a hard disk, or a magnetic tape), an optical medium (such as a DVD), or a semiconductor medium (such as a solid state disk (Solid State Disk, SSD)), etc.
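
The bullets above describe a positioning frame as a black frame that differs from a plain black frame only at a few feature points, for example four white circular markers near the corners (FIG. 12 and FIG. 13). The following is a minimal sketch of how such frames could be rendered with numpy and OpenCV; the resolution, marker radius and corner margin are illustrative assumptions, not values taken from the disclosure.

```python
import numpy as np
import cv2  # OpenCV is used here only for drawing and saving the frames

def make_black_frame(width: int = 1920, height: int = 1080) -> np.ndarray:
    """A fully black frame: every pixel is (0, 0, 0)."""
    return np.zeros((height, width, 3), dtype=np.uint8)

def make_positioning_frame(width: int = 1920, height: int = 1080,
                           radius: int = 12, margin: int = 60) -> np.ndarray:
    """A black frame carrying four white circular feature points K1..K4
    near the upper-left, upper-right, lower-left and lower-right corners;
    all other pixels stay black, matching the black frames around it."""
    frame = make_black_frame(width, height)
    corners = [
        (margin, margin),                    # K1: upper left
        (width - margin, margin),            # K2: upper right
        (margin, height - margin),           # K3: lower left
        (width - margin, height - margin),   # K4: lower right
    ]
    for cx, cy in corners:
        cv2.circle(frame, (cx, cy), radius, (255, 255, 255), thickness=-1)
    return frame

if __name__ == "__main__":
    cv2.imwrite("black_frame.png", make_black_frame())
    cv2.imwrite("positioning_frame.png", make_positioning_frame())
```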
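
Two ways of building the black field video from a video to be processed are described above: replacing selected first black frames with positioning frames (FIG. 16), and inserting a positioning frame after black frames, which turns a 30 Hz source into a 60 Hz black field video (FIG. 17). The sketch below shows both variants on plain Python lists of frames; it assumes the frames are numpy image arrays (for example 1080x1920x3 uint8), and which black frames are selected as first black frames is an illustrative assumption.

```python
import numpy as np
from typing import List

def replace_with_positioning_frames(black_frames: List[np.ndarray],
                                    positioning_frame: np.ndarray,
                                    indices: List[int]) -> List[np.ndarray]:
    """Variant of FIG. 16: replace the black frames at `indices`
    (the selected first black frames) with the positioning frame."""
    out = list(black_frames)
    for i in indices:
        out[i] = positioning_frame
    return out

def insert_positioning_frames(black_frames: List[np.ndarray],
                              positioning_frame: np.ndarray,
                              every_n: int = 1) -> List[np.ndarray]:
    """Variant of FIG. 17: insert one positioning frame after every
    `every_n` black frames, so each positioning frame is surrounded by
    black frames. With every_n=1, a 30 fps source becomes 60 fps."""
    out: List[np.ndarray] = []
    for i, frame in enumerate(black_frames, start=1):
        out.append(frame)
        if i % every_n == 0:
            out.append(positioning_frame)
    return out

# Example: a 30-frame (1 second at 30 Hz) all-black source.
black = [np.zeros((1080, 1920, 3), dtype=np.uint8) for _ in range(30)]
loc = black[0].copy()  # stand-in for a real positioning frame
video_fig16 = replace_with_positioning_frames(black, loc, indices=[2])  # P3 replaced by L1
video_fig17 = insert_positioning_frames(black, loc, every_n=1)          # 60 frames total
assert len(video_fig17) == 60
```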
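
Feature-point identification in the captured image is described above only functionally. One straightforward way to locate bright, roughly circular markers in a photograph of an otherwise dark projection is thresholding followed by contour centroids, sketched below. The threshold and minimum-area values are illustrative assumptions, and a real implementation in the first controller may use a different detector.

```python
import cv2
import numpy as np
from typing import List, Tuple

def find_feature_points(captured_bgr: np.ndarray,
                        thresh: int = 128,
                        min_area: float = 20.0) -> List[Tuple[float, float]]:
    """Return the (x, y) centroids, in captured-image pixels, of bright
    blobs in a photograph of a projected positioning frame."""
    gray = cv2.cvtColor(captured_bgr, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    # OpenCV >= 4 returns (contours, hierarchy).
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    points: List[Tuple[float, float]] = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] >= min_area:  # skip specks of sensor noise
            points.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return points

def find_feature_points_in_group(images: List[np.ndarray]) -> List[Tuple[float, float]]:
    """When a group of M positioning frames is used (FIG. 14 / FIG. 15),
    run the detector on each of the M captured images and merge the results."""
    merged: List[Tuple[float, float]] = []
    for img in images:
        merged.extend(find_feature_points(img))
    return merged
```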
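
The description establishes a two-dimensional plane coordinate system from the detected border and corner points of the projection screen 20 and then expresses each feature point in that system. The disclosure does not name a specific method, so the perspective (homography) mapping below is an assumption on our part, and the corner pixel values and screen size in the usage lines are made-up example numbers.

```python
import cv2
import numpy as np
from typing import Callable, List, Tuple

def screen_coordinate_mapper(corner_pixels: np.ndarray,
                             screen_w: float,
                             screen_h: float) -> Callable[[List[Tuple[float, float]]], np.ndarray]:
    """corner_pixels: 4x2 array with the screen corners as seen in the
    captured image, ordered upper-left, upper-right, lower-right,
    lower-left. Returns a function mapping captured-image pixels into the
    screen coordinate system whose origin is the left (upper-left) vertex,
    whose X axis is the upper edge and whose Y axis is the left edge."""
    dst = np.array([[0, 0], [screen_w, 0],
                    [screen_w, screen_h], [0, screen_h]], dtype=np.float32)
    H = cv2.getPerspectiveTransform(corner_pixels.astype(np.float32), dst)

    def to_screen(points: List[Tuple[float, float]]) -> np.ndarray:
        pts = np.array(points, dtype=np.float32).reshape(-1, 1, 2)
        return cv2.perspectiveTransform(pts, H).reshape(-1, 2)

    return to_screen

# Usage (made-up numbers): corners come from border/corner detection,
# feature points come from a blob detector run on the captured image.
corners = np.array([[102, 88], [1815, 95], [1808, 1002], [96, 990]], dtype=np.float32)
to_screen = screen_coordinate_mapper(corners, screen_w=1920, screen_h=1080)
feature_positions = to_screen([(150.0, 140.0), (1760.0, 150.0)])
```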
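
The correction parameters are described as the offset direction and offset amount between where a feature point (and its pixel area) should land on the projection screen 20 and where it actually lands. The sketch below computes those offsets and applies a first-order pre-distortion to the image to be projected; using a perspective warp here is our assumption (the disclosure only states that the image is corrected according to the correction parameters), and all numeric values are illustrative.

```python
import cv2
import numpy as np

def correction_offsets(expected: np.ndarray, measured: np.ndarray) -> np.ndarray:
    """Offset (dx, dy) per feature point: where the corresponding pixel
    area actually landed on the screen minus where it should have landed."""
    return measured.astype(np.float32) - expected.astype(np.float32)

def correct_image(image: np.ndarray,
                  expected: np.ndarray,
                  measured: np.ndarray) -> np.ndarray:
    """First-order pre-distortion of the image to be projected: the content
    at each expected corner is shifted by minus its measured offset, so that
    after the projection distortion it lands back where it belongs.
    `expected` and `measured` are 4x2 arrays ordered UL, UR, LR, LL, in
    screen coordinates scaled to the image size."""
    h, w = image.shape[:2]
    src = expected.astype(np.float32)
    dst = (expected - correction_offsets(expected, measured)).astype(np.float32)
    M = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(image, M, (w, h))

# Made-up example: the lower-right corner landed 12 px right and 8 px low,
# so the image content is pre-shifted the other way at that corner.
expected = np.array([[0, 0], [1920, 0], [1920, 1080], [0, 1080]], dtype=np.float32)
measured = np.array([[0, 0], [1920, 0], [1932, 1088], [0, 1080]], dtype=np.float32)
corrected = correct_image(np.zeros((1080, 1920, 3), np.uint8), expected, measured)
```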
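
The flow of FIG. 18 (S181 to S185) can be summarized as the orchestration below. The playback, capture, detection and correction callables are stubs standing in for the image acquisition interface, the photographing device and detection/correction routines; their names and signatures are ours, not interfaces defined in the disclosure.

```python
from typing import Callable, List, Tuple
import numpy as np

def run_correction(play_black_field_video: Callable[[], None],
                   capture_positioning_frame: Callable[[], np.ndarray],
                   find_feature_points: Callable[[np.ndarray], List[Tuple[float, float]]],
                   expected_points: np.ndarray,
                   apply_correction: Callable[[np.ndarray], None]) -> None:
    """S182: play the black field video; S183: capture an image while a
    positioning frame is on the screen; S184: derive the correction
    parameters; S185: hand them to the correction step.
    Assumes the detector returns the feature points in the same order as
    expected_points (for example sorted by position)."""
    play_black_field_video()                                  # S182
    captured = capture_positioning_frame()                    # S183
    measured = np.array(find_feature_points(captured), dtype=np.float32)
    offsets = measured - expected_points.astype(np.float32)   # S184: correction parameters
    apply_correction(offsets)                                 # S185: correct the image to be projected

# S181 (receiving the video switching instruction) would trigger this call
# from the first controller's event handling, for example:
# on_video_switch = lambda: run_correction(play, capture, detect, expected, correct)
```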
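
The standby behaviour described above (keep the second controller, the wake-up circuit and the light valve driving circuit powered; cut power to the light source driving circuit and the slave control circuit; restore the light source when a target object is detected) can be pictured with the small state model below. It only illustrates the described power gating; the attribute and function names are ours and do not correspond to signals defined in the disclosure.

```python
from dataclasses import dataclass

@dataclass
class PowerRails:
    """True means the power supply circuit is currently powering that block."""
    second_controller: bool = True
    wake_up_circuit: bool = True
    slave_control_circuit: bool = True
    light_valve_driver: bool = True
    light_source_driver: bool = True

def enter_standby(rails: PowerRails) -> None:
    """Standby instruction: keep the second controller, the wake-up circuit
    and the light valve driver powered; stop powering the light source
    driver and the slave control circuit."""
    rails.light_source_driver = False
    rails.slave_control_circuit = False

def on_detection(rails: PowerRails, target_present: bool) -> bool:
    """Detection handling while in standby: if a target object is within the
    target range, re-enable the light source driver so the standby image can
    be projected quickly (the light valve driver never lost power)."""
    if target_present:
        rails.light_source_driver = True
    return rails.light_source_driver and rails.light_valve_driver

rails = PowerRails()
enter_standby(rails)
assert rails.light_valve_driver and not rails.light_source_driver
ready_to_project_standby_image = on_detection(rails, target_present=True)
```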

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Transforming Electric Information Into Light Information (AREA)

Abstract

Some embodiments of the present disclosure provide a projection device, including a light source assembly, an optical machine, a lens and a circuit system architecture. The light source assembly is configured to provide an illumination beam. The optical machine is configured to modulate the illumination beam with an image signal to obtain a projection beam. The lens is configured to project the projection beam into an image. The circuit system architecture is configured to control the operation of the light source assembly and the optical machine, and includes: a first controller, coupled to a second controller and configured to receive a video switching instruction and, in response to the video switching instruction, send a black field video to the second controller, the black field video including multiple black frames and multiple positioning frames, the positioning frames being used to assist the projection device in image correction; and the second controller, coupled to the light source assembly and the optical machine and configured to receive the black field video and control the light source assembly and the optical machine to play the black field video.

Description

投影设备及投影图像的校正方法
本申请要求于2021年12月20日提交的、申请号为202111567322.0的中国专利申请,于2021年7月21日提交的、申请号为202110825818.7的中国专利申请,以及于2021年7月21日提交的、申请号为202110825816.8的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本公开涉及投影技术领域,尤其涉及一种投影设备及投影图像的校正方法。
背景技术
随着显示技术的发展,投影技术应用市场越来越广,目前已广泛应用于家庭,商业,体育文艺,视频会议,交通监控,生产监控等领域,具有广阔的前景。投影技术为点光源到面显示的光学投影技术,对应的投影主机位置,屏幕位置、屏幕平整度、光学畸变等客观问题决定了实际投影画面很难直接与屏幕完全匹配。
发明内容
一方面,本公开一些实施例提供一种投影设备,包括:光源组件、光机、镜头和电路系统架构。光源组件,被配置为提供照明光束。光机,被配置为利用图像信号对照明光束进行调制,以获得投影光束。镜头,被配置为将投影光束投射成像。电路系统架构,被配置为控制光源组件和光机运行;其中,电路系统架构包括:第一控制器和第二控制器,第一控制器与第二控制器耦接,第一控制器被配置为:接收视频切换指令;响应于视频切换指令,向第二控制器发送黑场视频,该黑场视频包括多个黑帧以及多个定位帧,定位帧用于辅助投影设备进行图像校正。第二控制器被配置为接收黑场视频,并控制光源组件和光机播放黑场视频。
再一方面,本公开一些实施例提供一种投影设备,包括:光源组件、光机、镜头、电路系统架构和检测器件。光源组件被配置为提供照明光束。光机被配置为利用图像信号对照明光束进行调制,以获得投影光束。镜头被配置为将投影光束投射成像。检测器件,被配置为检测目标范围内是否存在目标对象,并基于检测结果生成检测信号。电路系统架构,被配置为控制光源组件和光机运行。其中,电路系统架构包括:电源电路,与光阀驱动电路和光源驱动电路耦接,且被配置为为光阀驱动电路和光源驱动电路供电。第一控制器,与第二控制器耦接,且被配置为响应于待机操作,向第二控制器发送待机指令,并控制电源电路保持为第二控制器供电的状态不变。第二控制器,被配置为响应于待机指令,控制电源电路保持为光阀驱动电路供电的状态不变,并控制电源电路停止为光源驱动电路供电;若基于检测信号确定目标范围内存在目标对象,则控制电源电路为光源驱动电路供电,并向光阀驱动电路发送待机图像的图像信号。光源驱动电路,被配置为响应于待机图像的图像信号,驱动光源提供照明光束。光阀驱动电路,被配置为响应于待机图像的图像信号,驱动光机利用待机图像的图像信号对照明光束进行调制,以获得投影光束。
另一方面,本公开一些实施例提供一种投影系统,包括投影屏幕以及上述投影设备。
又一方面,本公开一些实施例提供一种投影图像的校正方法,包括:接收视频切换指令。响应于视频切换指令,播放黑场视频,所述黑场视频包括多个黑帧以及多个定位帧,所述定位帧用于辅助所述投影设备进行图像校正。获取拍摄图像,拍摄图像为定位帧投射至投影屏幕上时拍摄的图像。根据拍摄图像,确定校正参数。根据校正参数,对待投影图像进行校正。
附图说明
图1A为根据本公开一些实施例的投影系统的示意图;
图1B为根据本公开一些实施例的一种投影设备的结构图;
图2为根据本公开一些实施例的投影设备中光源组件、光机和镜头的示意图;
图3为根据本公开一些实施例的投影设备中的光路架构图;
图4为根据本公开一些实施例的投影设备中光源组件的光路原理示意图;
图5为根据本公开一些实施例的数字微镜器件中的微小反射镜片的排列结构图;
图6为根据本公开一些实施例的微小反射镜片的工作示意图;
图7是图5所示的数字微镜器件中一个微小反射镜片摆动的位置示意图;
图8为根据本公开一些实施例的又一种投影设备的结构图;
图9为根据本公开一些实施例的一种投影设备的组成示意图;
图10为根据本公开一些实施例的又一种投影设备的结构图;
图11为根据本公开一些实施例的又一种投影设备的结构图;
图12为根据本公开一些实施例的一种黑帧和定位帧的示意图;
图13为根据本公开一些实施例的一种定位帧的示意图;
图14为根据本公开一些实施例的另一种定位帧的示意图;
图15为根据本公开一些实施例的又一种定位帧的示意图;
图16为根据本公开一些实施例的一种黑场视频的组成示意图;
图17为根据本公开一些实施例的另一种黑场视频的组成示意图;
图18为根据本公开一些实施例的一种投影图像的校正方法的流程图;
图19为根据本公开一些实施例的另一种投影图像的校正方法的流程图。
具体实施方式
下面将结合本公开实施例中的附图,对本公开实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅仅是本公开一部分实施例,而不是全部的实施例。基于本公开中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本公开保护的范围。
需要说明,本发明实施例中所有方向性指示(诸如上、下、左、右、前、后……)仅用于解释在某一特定姿态(如附图所示)下各部件之间的相对位置关系、运动情况等,如果该特定姿态发生改变时,则该方向性指示也相应地随之改变。
术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征。在本公开的描述中,除非另有说明,“多个”的含义是两个或两个以上。
在本公开的描述中,需要说明的是,除非另有明确的规定和限定,术语“相连”、“连接”应做广义理解,例如,可以是固定连接,也可以是可拆卸连接,或一体地连接。对于本领域的普通技术人员而言,可以具体情况理解上述术语在本公开中的具体含义。另外,在对管线进行描述时,本公开中所用“相连”、“连接”则具有进行导通的意义。具体意义需结合上下文进行理解。
在本公开实施例中,“示例性的”或者“例如”等词用于表示作例子、例证或说明。本公开实施例中被描述为“示例性的”或者“例如”的任何实施例或设计方案不应被解释为比其它实施例或设计方案更优选或更具优势。确切而言,使用“示例性的”或者“例如”等词旨在以具体方式呈现相关概念。
通常,投影设备在投影屏幕上投射的投影画面可能与投影屏幕不匹配(例如,投影画面超出投影屏幕的边框),需要人工对投影图像进行校正,或者,使用几何校正功能对投影图像进行校正。但是,这些校正方式都要通过人工调整的方式逐点或逐特征点进行调整,不仅费时费力,还会给用户带来极差的观影感受。
为此,本公开一些实施例提供一种投影系统,图1A为根据本公开一些实施例的投影系统的示意图。如图1A所示,投影系统包括:投影设备10、投影屏幕20、控制设备30和服务器40。用户可以通过控制装置30控制投影设备10在投影屏幕20上进行投影,服务器40可以向投影设备10提供各种内容和互动。
在一些实施例中,控制装置30可以是遥控器30A,其可与投影设备10之间通过红外协议通信、蓝牙协议通信、紫蜂(ZigBee)协议通信或其他短距离通信方式进行通信,用于通过无线或其他有线方式来控制投影设备10。用户可以通过遥控器30A上 按键、语音输入、控制面板输入等输入用户指令,来控制投影设备10。如:用户可以通过遥控器30A上音量加减键、频道控制键、上/下/左/右的移动按键、语音输入按键、菜单键、开关机按键等输入相应控制指令,来实现投影设备10的功能。
控制装置30也可以是智能设备,如移动终端30B、平板电脑、计算机、笔记本电脑等,其可以通过本地网(LAN,Local Area Network)、广域网(WAN,Wide Area Network)、无线局域网(WLAN,Wireless Local Area Network)或其他网络与多媒体控制器之间通信,并通过与多媒体控制器相应的应用程序实现对投影设备10的控制。例如,使用在智能设备上运行的应用程序控制投影设备10。该应用程序可以在与智能设备关联的屏幕上通过直观的用户界面(UI,User Interface)为用户提供各种控制。
示例的,移动终端30B与投影设备10均可安装软件应用,从而可通过网络通信协议实现二者之间的连接通信,进而实现一对一控制操作和数据通信的目的。如:可以使移动终端30B与投影设备10建立控制指令协议,将遥控控制键盘同步到移动终端30B上,通过控制移动终端30B上用户界面,实现控制多媒体控制器的功能;也可以将移动终端30B上显示的音视频内容传输到投影设备10上,实现同步显示功能。
服务器40可以是视频服务器,电子节目指南(EPG,Electronic Program Guide)服务器,云端服务器等。
投影设备10可与服务器40通过多种通信方式进行数据通信。在本公开各个实施例中,可允许投影设备10通过局域网、无线局域网或其他网络与服务器40进行有线通信连接或无线通信连接。
示例的,投影设备10通过发送和接收信息,以及EPG互动,接收软件程序更新,或访问远程储存的数字媒体库。服务器40可以是一组,也可以是多组,可以是一类或多类服务器。通过服务器40提供视频点播和广告服务等其他网络服务内容。
图1B为根据本公开一些实施例的一种投影设备的结构图,如图1B所示,投影设备10包括整机壳体101(图中仅示出部分壳体),装配于整机壳体101中的光源组件100、光机200,以及镜头300。该光源组件100被配置为提供照明光束(激光束)。该光机200被配置为利用图像信号对光源组件100提供的照明光束进行调制以获得投影光束。该镜头300被配置为将投影光束投射在投影屏幕或墙壁上成像。光源组件100、光机200和镜头300沿着光束传播方向依次连接,各自由对应的壳体进行包裹。光源组件100、光机200和镜头300各自的壳体对各光学部件进行支撑并使得各光学部件达到一定的密封或气密要求。比如,光源组件100通过其对应的外壳实现气密性密封,可以较好地改善光源组件100的光衰问题。
光机200的一端和镜头300耦接且沿着整机第一方向X设置,比如第一方向X可以为整机的宽度方向。在光机200的另一端与光源组件100耦接。在本示例中,光源组件100与光机200的连接方向,垂直于光机200与镜头300的连接方向,这种连接结构一方面可以适应光机200中反射式光阀的光路特点,另一方面,还有利于缩短一个维度方向上光路的长度,利于整机的结构排布。例如,当将光源组件100、光机200和镜头300设置在一个维度方向(例如第二方向Y,第二方向Y与第一方向X垂直的方向)上时,该方向上光路的长度就会很长,从而不利于整机的结构排布。
在一些实施例中,光源组件100可以包括三个激光器阵列。图2为根据本公开一些实施例的投影设备中光源组件、光机和镜头的示意图,如图2所示,光源组件100为三色激光光源为例,该三个激光器阵列可分别为红色激光器阵列130、绿色激光器阵列120和蓝色激光器阵列110;但并不局限于此。该三个激光器阵列也可以均为蓝色激光器阵列110,或者两个激光器阵列为蓝色激光器阵列110、一个激光器阵列为红色激光器阵列130。当光源组件100包括的多个激光器可以产生三基色,则光源组件100可以产生包含三基色光的照明光束,因此光源组件100内不需要设置荧光轮(当光源组件所包括的一个或多个激光器阵列仅能产生一种或两种颜色的激光时,需要使用已有颜色的激光激发荧光轮来产生其他颜色的荧光,从而使激光和荧光一起形成白光),能够简化光源组件100的结构,减小光源组件100的体积。
在一些实施例中,光源组件100还可以包括两个激光器阵列。光源组件100为双色激光光源为例,该两个激光器阵列可以为蓝色激光器阵列110和红色激光器阵列130;也可以均为蓝色激光器阵列110,即光源组件100为单色激光光源。
在另一些实施例中,光源组件100还可以包括一个激光器阵列,即光源组件100为单色激光光源,即光源组件100仅包括蓝色激光器阵列110,或者仅包括蓝色激光器阵列110和红色激光器阵列130时。
图4为根据本公开一些实施例的投影设备中光源组件的光路原理示意图,如图4所示,激光器阵列可以为蓝色激光器阵列110,该光源组件100还可以包括:荧光轮140和滤色轮150。该蓝色激光器110发射蓝光后,一部分蓝光照射到荧光轮140上以产生红光荧光(当光源组件100包括红色激光器阵列130时,则不需要再产生红色荧光)和绿光荧光;该蓝光激光、红光荧光(或红色激光)以及绿光荧光依次通过合光镜160后再通过滤色轮150进行滤色,并时序性地输出三基色光。根据人眼的视觉暂留现象,人眼分辨不出某一时刻光的颜色,感知到的仍然是混合的白光。
光源组件100发出的照明光束进入光机200。图3为根据本公开一些实施例的投影设备中的光路架构图,如图2和图3所示,光机200可以包括:光导管210,透镜组件220,反射镜230,数字微镜器件(Digital Micromirror Device,DMD)240以及棱镜组件250。该光导管210可以接收光源组件100的照明光束,并对该照明光束进行匀化。透镜组件220可以对照明光束先进行放大后进行会聚并出射至反射镜230。反射镜230可以将照明光束反射至棱镜组件250。棱镜组件250将照明光束反射至DMD 240,DMD 240对照明光束进行调制,并将调制后得到的投影光束反射至镜头300中。
光机200中,DMD 240是核心部件,其作用是利用图像信号对光源组件100的照明光束进行调制,即:控制照明光束针对待显示图像的不同像素显示不同的颜色和亮度,以最终形成光学图像,因此DMD 240也被称为光调制器件或光阀。根据光调制器件(或光阀)对照明光束进行透射还是进行反射,可以将光调制器件(或光阀)分为透射式光调制器件(或光阀)或反射式光调制器件(或光阀)。例如,如图2和图3所示,DMD 240对照明光束进行反射,即为一种反射式光调制器件。而液晶光阀对照明光束进行透射,因此是一种透射式光调制器件。此外,根据光机中使用的光调制器件(或光阀)的数量,可以将光机分为单片系统、双片系统或三片系统。例如,图2和图3所示的光机200中仅使用了一片DMD 240,因此光机200可被称为单片系统。当使用三片数字微镜器件时,则光机200可以被称为三片系统。
DMD 240应用于数字光处理(Digital Light Processing,DLP)投影架构中,如图2和图3所示,光机200使用了DLP投影架构。图5为根据本公开一些实施例的数字微镜器件中的微小反射镜片的排列结构图,如图5所示,DMD 240包含成千上万个可被单独驱动以旋转的微小反射镜片2401,这些微小反射镜片2401呈阵列排布,每个微小反射镜片2401对应待显示图像中的一个像素。在DLP投影架构中,每个微小反射镜片2401相当于一个数字开关,在外加电场作用下可以在正负12度(±12°)或者正负17度(±17°)的范围内摆动,以使得被反射的光能够沿光轴方向通过镜头300成像在屏上,形成一个亮的像素。
图6为根据本公开一些实施例的微小反射镜片的工作示意图,如图6所示,微小反射镜片2401在负的偏转角度反射出的光,称之为OFF光,OFF光为无效光,通常打到整机壳体101上、光机200的壳体上或者光吸收单元上吸收掉。微小反射镜片2401在正的偏转角度反射出的光,称之为ON光,ON光是DMD 240表面的微小反射镜片2401接收照明光束照射,并通过正的偏转角度射入镜头300的有效光束,用于投影成像。微小反射镜片2401的开状态为光源组件100发出的照明光束经微小反射镜片2401反射后可以进入镜头300时,微小反射镜片2401所处且可以保持的状态,即微小反射镜片2401处于正的偏转角度的状态。微小反射镜片2401的关状态为光源组件100发出的照明光束经微小反射镜片2401反射后未进入镜头300时,微小反射镜片2401所处且可以保持的状态,即微小反射镜片2401处于负的偏转角度的状态。
示例性地,图7为图5所示数字微镜器件中一个微小反射镜片摆动的位置示意图,如图7所示,对于偏转角度为±12°的微小反射镜片2401,位于+12°的状态即为开状态,位于-12°的状态即为关状态,而对于-12°和+12°之间的偏转角度,微小反射镜片2401的实际工作状态仅开状态和关状态。
示例性地,对于偏转角度为±17°的微小反射镜片2401,位于+17°的状态即为开状态,位于-17°的状态即为关状态。图像信号通过处理后被转换成0、1这样的数字代码,这些数字代码可以驱动微小反射镜片2401摆动。
在一帧图像的显示周期内,部分或全部微小反射镜片2401会在开状态和关状态之间切换一次,从而根据微小反射镜片2401在开状态和关状态分别持续的时间来实现一帧图像中的各个像素的灰阶。例如,当像素具有0~255这256个灰阶时,与灰阶0对应的微小反射镜片在一帧图像的整个显示周期内均处于关状态,与灰阶255对应的微小反射镜片在一帧图像的整个显示周期内均处于开状态,而与灰阶127对应的微小反射镜片在一帧图像的显示周期内一半时间处于开状态、另一半时间处于关状态。因此通过图像信号控制DMD 240中每个微小反射镜片在一帧图像的显示周期内所处的状态以及各状态的维持时间,可以控制该微小反射镜片2401对应像素的亮度(灰阶),实现对投射至DMD 240的照明光束进行调制的目的。
DMD 240前端的光导管210,透镜组件220和反射镜230形成照明光路,光源组件100发出的照明光束经过照明光路后形成符合DMD 240所要求的光束尺寸和入射角度。
如图2所示,镜头300包括多片透镜组合,通常按照群组进行划分,分为前群、中群和后群三段式,或者前群和后群两段式。前群是靠近投影设备出光侧(图2所示的左侧)的镜片群组,后群是靠近光机200出光侧(图2所示的右侧)的镜片群组。根据上述多种镜片组组合,镜头300也可以是变焦镜头,或者为定焦可调焦镜头,或者为定焦镜头。在一些实施例中,激光投影设备为超短焦投影设备,镜头300为超短焦镜头,镜头300的投射比通常小于0.3,比如0.24。投射比是指投影距离与画面宽度之比,比值越小,说明相同投影距离,投射画面的宽度越大。投射比较小的超短焦镜头保证投射效果的同时,能够适应较狭窄的空间。
在一些实施例中,如图8所示,投影设备10还包括电路系统架构400和检测器件600,电路系统架构400包括第一控制器401、第二控制器402、电源电路403、光源驱动电路404和光阀驱动电路405。电源电路403与光阀驱动电路405、光源驱动电路404和第二控制器402耦接,用于为光阀驱动电路405、光源驱动电路404和第二控制器402供电。
第一控制器401,被配置为响应于待机操作,向第二控制器402发送待机指令,并控制电源电路403保持为第二控制器402供电的状态不变。第二控制器402,被配置为响应于待机指令,控制电源电路403保持为光阀驱动电路405供电的状态不变,并控制电源电路403停止为光源驱动电路404供电。
在一些实施例中,待机操作可以是针对用于控制投影设备10的遥控器中的待机按钮的点击操作,或者是针对用于投影设备10的壳体上的待机按钮的点击操作。
在一些实施例中,第一控制器401可以为系统级芯片(System on Chip,SoC),第二控制器402可以为显示控制芯片,比如DLP芯片。本公开实施例对于第一控制器401和第二控制器402的类型并不限定,该第一控制器401和第二控制器402也可以集成在一颗芯片中。当第一控制器401和第二控制器402分别为两颗芯片时,第一控制器401与第二控制器402耦接。
在一些实施例中,检测器件600,被配置为检测目标范围内是否存在目标对象,并基于检测结果生成检测信号。
图9为根据本公开一些实施例的又一种投影设备的结构图,如图9所示,检测器件600位于投影设备10的壳体外侧。该检测器件600可以为位于投影设备10的壳体的侧面,该侧面所在平面与投影屏幕20所在平面相交。或者,参考图9,该检测器件 600可以位于投影设备10的壳体远离投影屏幕20的一侧。检测器件600可以包括毫米波传感器、热释电红外传感器和摄像头中的至少一种,本公开实施例对于检测器件600的类型和设置位置不作限定。
在一些实施例中,检测器件600可以周期性或者实时检测目标范围内是否存在目标对象,并基于该检测结果生成检测信号。该检测信号用于指示目标范围是否存在目标对象,该目标对象可以为位于检测器件600的检测范围内的人。该目标范围可以为检测器件600的检测范围,或者,该目标范围可以为检测器件600中预先存储的固定范围,且该目标范围位于该检测器件600的检测范围之内。
示例性的,检测器件600可以包括毫米波传感器,该检测器件600可以发射毫米波信号,并根据被目标对象反射的毫米波信号,确定目标范围内是否存在目标对象。
若目标范围为检测器件600中预先存储的固定范围,则检测器件600可以基于被目标对象反射的毫米波信号确定目标对象的位置,并检测该目标对象的位置是否处于目标范围内。
该目标对象的位置可以包括目标对象与检测器件600之间的目标距离和目标对象的方位角。检测器件600可以根据接收到的毫米波信号确定差值信号,并根据该差值信号的峰值频率确定目标对象与检测器件600之间的目标距离。并且,检测器件600可以根据相邻两个差值信号的相位角的差值,确定在目标对象的方位角。
示例性的,检测器件600可以包括热释电红外传感器,该热释电红外传感器可以检测到目标对象辐射的红外信号,根据检测到的目标对象辐射的红外信号,确定目标范围内是否存在目标对象。
若目标范围为检测器件600中预先存储的固定范围,则检测器件600可以将检测到的红外信号放大,并将放大后的红外信号转化为电信号,以及检测该电信号的幅值是否大于幅值阈值。若检测结果为电信号的幅值大于或等于幅值阈值,确定目标范围内存在目标对象。若检测结果为电信号的幅值小于幅值阈值,则检测器件600确定目标范围内不存在目标对象。其中,该幅值阈值可以为检测器件600中预先存储的固定数值。
示例性的,检测器件600可以包括摄像头,若目标范围为检测器件600的检测范围,则检测器件600可以检测其拍摄得到的拍摄图像中是否存在目标对象。
若目标范围为检测器件600中预先存储的固定范围,则检测器件600可以检测拍摄图像中目标对象所占的面积与拍摄图像的面积的比值。若检测结果为该比值大于或等于比值阈值,确定目标范围内存在目标对象。若该比值小于比值阈值,则检测器件600确定目标范围内不存在目标对象。其中,该比值阈值可以为检测器件600中预先存储的固定数值。
若目标范围内存在目标对象,则检测器件600生成用于指示目标范围内存在目标对象的检测信号。若目标范围内不存在目标对象,则检测器件600生成用于指示目标范围内不存在目标对象的检测信号。
在一些实施例中,检测器件600可以与第一控制器401耦接,也可以与第二控制器402耦接。
示例性地,当检测器件600与第一控制器401耦接时,第一控制器401响应于待机操作,向检测器件600发送信号获取指令,该检测器件600在接收到该信号获取指令后,响应于该信号获取指令,向第一控制器401发送生成的检测信号。第一控制器401向第二控制器402发送该检测信号。
示例性地,当检测器件600与第二控制器402耦接时,第二控制器402响应于待机指令,向检测器件600发送信号获取指令,该检测器件600在接收到该信号获取指令后,响应于该信号获取指令,向第二控制器402发送生成的检测信号。
第二控制器402若基于检测信号确定目标范围内存在目标对象,则控制电源电路403为光源驱动电路404供电,并向光阀驱动电路405发送待机图像的图像信号。
本公开实施例提供了一种投影设备10,该投影设备10在待机状态下时,第二控 制器402若基于检测信号确定目标范围内存在目标对象,则控制电源电路403为光源驱动电路404供电,并向光阀驱动电路405发送投影信号,以使得光阀驱动电路405将待机画面投影显示至投影屏幕20。即,该投影设备10在处于待机状态时,通过检测目标范围内存在目标对象时,在投影屏幕20中投影显示待机画面,有效丰富了投影设备10的功能。
在一些实施例中,第二控制器402还用于向光源驱动电路404发送电流驱动信号,该光源驱动电路404用于响应于该电流驱动信号向光源组件100发送驱动电流,该光源组件100用于在该驱动电流的驱动下出射光束。在一些实施例中,待机图像为光阀驱动电路405中预先存储的图像。或者,第二控制器402还被配置为基于检测信号确定目标范围内存在目标对象,向第一控制器401发送图像请求信号,且向光阀驱动电路405发送待机图像。图像请求信号用于指示第一控制器401向第二控制器402发送预先存储的待机图像。第一控制器401还被配置为响应于图像请求信号,向第二控制器402发送预先存储的待机图像。
在一些实施例中,第一控制器401被配置为:接收视频切换指令;响应于视频切换指令,向第二控制器402发送黑场视频,黑场视频包括多个黑帧以及多个定位帧,定位帧用于辅助投影设备10进行图像校正。第二控制器402,与光源组件100和光机200耦接,且被配置为接收黑场视频,并控制光源组件100和光机200播放黑场视频。
示例性的,待机图像可以为黑场视频中的图像帧,例如,黑帧和/或定位帧。
在一些实施例中,图10为根据本公开一些实施例的又一种投影设备的结构图。如图10所示,第一控制器401包括唤醒电路4011和从控电路4012。唤醒电路4011与从控电路4012和第二控制器402耦接。
唤醒电路4011被配置为响应于待机操作,向第二控制器402发送待机指令,并控制电源电路403停止为从控电路4012供电;其中,从控电路4012被配置为向第二控制器402发送控制指令。第二控制器402,被配置为响应于待机指令,控制电源电路403保持为光阀驱动电路405供电的状态不变,并控制电源电路403停止为光源驱动电路404供电。示例性的,唤醒电路4011和从控电路4012集成在第一控制器401上,且在投影设备处于待机状态时,电源电路403保持为第二控制器402和唤醒电路4011供电的状态不变。即在投影设备10处于待机状态时,第一控制器401中除唤醒电路4011处于工作状态之外,第一控制器401中的其他电路(如从控电路4012)均处于不工作状态。
示例性的,从控电路4012在处于工作状态时,用于向第二控制器402发送控制指令。第二控制器402可以响应于该控制指令,控制电源电路403为光源驱动电路404和光阀驱动电路405供电。或者,该控制指令中可以携带有待显示的投影图像,第二控制器402可以响应于该控制指令,将该待显示的投影图像发送至光阀驱动电路405,以使得光阀驱动电路405将该待显示的投影图像投影显示至投影屏幕20。
在一些实施例中,参考图10,该电源电路403包括电源板4031和第一开关电路4032,该第一开关电路4032分别与电源板4031、第二控制器402和唤醒电路4011耦接。唤醒电路4011还被配置为响应于待机操作,控制第一开关电路4032保持导通状态,使得电源板4031持续为第二控制器402供电。由此使得在投影设备处于待机状态时,电源电路403保持为第二控制器402供电的状态不变。
该第一开关电路4032的控制端与唤醒电路4011耦接,该第一开关电路4032的输入端与电源板4031耦接,该第一开关电路4032的输出端与第二控制器402耦接。该第一开关电路4032可以为开关(switch),该唤醒电路4011还用于响应于待机操作,持续向第一开关电路4032的控制端发送电平为有效电平的使能信号,由此使得第一开关电路4032保持导通状态。
唤醒电路4011上可以设置有多个通用输入/输出(General Purpose Input/Output,GPIO)GPIO端口。该第一开关电路4032可以与唤醒电路4011上设置的多个GPIO端口中的第一GPIO端口耦接。
此外,唤醒电路4011还可以通过多个GPIO端口中的第二GPIO端口与第二控制器402耦接,并通过该第二GPIO端口向第二控制器402发送待机指令。
在一些实施例中,图11为根据本公开一些实施例的又一种投影设备的结构图。参考图11,该第二控制器402与电源板4031直接耦接,则在投影设备处于待机状态时,该电源板4031可以持续为第二控制器402供电。
在一些实施例中,唤醒电路4011上能够设置的GPIO端口有限,且唤醒电路4011需要响应于待机操作,通过其他GPIO端口控制电源电路403停止为从控电路4012、投影设备中的遥控模块等供电。通过将第二控制器402与电源板4031直接耦接,可以有效避免唤醒电路4011上设置的GPIO端口有限,而导致唤醒电路4011上没有多余的GPIO端口与第一开关电路4032耦接的情况。从而有效确保在投影设备处于待机状态时,电源电路403保持可以为第二控制器402供电的状态不变。
参考图10和图11,该电源电路403还可以包括第二开关电路4033、第三开关电路4034和第四开关电路4035。该第二开关电路4033、第三开关电路4034和第四开关电路4035均可以为开关。
该第二开关电路4033分别与电源板4031、唤醒电路4011和从控电路4012耦接,唤醒电路4011用于响应于待机操作,控制第二开关电路4033断开,使得电源板4031停止为从控电路4012供电。
该第二开关电路4033的控制端与唤醒电路4011耦接,该第二开关电路4033的输入端与电源板4031耦接,该第二开关电路4033的输出端与从控电路4012耦接。该唤醒电路4011用于响应于待机操作,向第二开关电路4033的控制端发送电平为无效电平的使能信号,由此使得第二开关电路4033断开。
在本公开实施例中,唤醒电路4011可以通过其设置的多个GPIO端口中的第三GPIO端口与第二开关电路4033的控制端耦接。
第三开关电路4034分别与电源板4031、第二控制器402和光阀驱动电路405耦接,第二控制器402用于响应于待机指令,控制第三开关电路4034保持导通状态,使得电源板4031持续为光阀驱动电路405供电。
该第三开关电路4034的控制端与第二控制器402耦接,该第三开关电路4034的输入端与电源板4031耦接,该第三开关电路4034的输出端与光阀驱动电路405耦接。该第二控制器402用于响应于待机指令,持续向第三开关电路4034的控制端发送电平为有效电平的使能信号,由此使得第三开关电路4034保持导通状态。
第四开关电路4035分别与电源板4031、第二控制器402和光源驱动电路404耦接,第二控制器402用于响应于待机指令,控制第四开关电路4035断开,使得电源板4031停止为光源驱动电路404供电,以及若基于检测信号确定目标范围内存在目标对象,则可以控制第四开关电路4035导通,使得电源板4031为光源驱动电路404供电。
该第四开关电路4035的控制端与第二控制器402耦接,该第四开关电路4035的输入端与电源板4031耦接,该第四开关电路4035的输出端与光源驱动电路404耦接。
该第二控制器402用于响应于待机指令,向第四开关电路4035的控制端发送电平为无效电平的使能信号,由此控制第四开关电路4035断开。该第二控制器402若基于检测信号确定目标范围内存在目标对象,则可以向第四开关电路4035的控制端发送电平为有效电平的使能信号,由此控制第四开关电路4035导通。
在一些实施例中。投影设备10还包括与第二控制器402耦接的散热组件1000。第二控制器402,还被配置为响应于待机指令,控制电源电路403停止为散热组件1000供电。若基于检测信号确定目标范围内存在目标对象,则控制电源电路403为散热组件1000供电。
在一些实施例中,图11为根据本公开一些实施例的又一种投影设备的结构图。参考图11,该第二控制器402与电源板4031直接耦接,则在投影设备处于待机状态时,该电源板4031可以持续为第二控制器402供电。
示例性的,参考图10和图11,该电源电路403还可以包括第五开关电路4036, 该第五开关电路4036分别与电源板4031、第二控制器402和散热组件1000耦接。该第二控制器402还用于响应于待机指令,控制第五开关电路4036断开,使得电源板4031停止为散热组件1000供电,以及若基于检测信号确定目标范围内存在目标对象,则控制第五开关电路4036导通,使得电源板4031为散热组件1000供电。
该第五开关电路4036的控制端与第二控制器402耦接,该第五开关电路4036的输入端与电源板4031耦接,该第五开关电路4036的输出端与散热组件1000耦接。
该第二控制器402用于响应于待机指令,向第五开关电路4036的控制端发送电平为无效电平的使能信号,由此使得第五开关电路4036断开。该第二控制器402若基于检测信号确定目标范围内存在目标对象,则可以向第五开关电路4036的控制端发送电平为有效电平的使能信号,由此控制第五开关电路4036导通。
在本公开实施例中,第二控制器402还可以向散热组件1000发送驱动信号,由此使得散热组件1000在该驱动信号的驱动下工作。
示例性的,如图8所示,当第一控制器401包括唤醒电路4011和从控电路4012时,唤醒电路4011接收视频切换指令,控制电源电路403为从控电路4012供电,并向从控电路4012发送该视频切换指令。从控电路4012接收并响应于该视频切换指令,向第二控制器402发送黑场视频。
在一些实施例中,视频切换指令用于指示投影屏幕20上的投影画面从当前投影画面切换到预设投影画面。视频切换指令可以是用户通过触发遥控器上的按键发出的按键指令,或者是语音指令,又或者是用户通过与投影设备10耦接的终端设备发出的指令。本公开实施例对此不做限定。
示例性的,视频切换指令可以为切换信号通道、切换图像模式、打开U盘、切换视频频道以及其他需要切换当前投影画面的指令。例如,投影设备10的当前信号通道为HDMI1,视频切换指令可以用于指示将信号通道从HDMI1切换到HDMI2。
在一些实施例中,黑场视频指的是画面全黑的视频。在视频切换的时候,由于信号转换会带来图像花闪的现象,通过在视频切换的间隙播放黑场视频可以避免用户发现图像花闪的现象,从而避免给用户带来不好的体验。黑场视频包括至少一个黑帧以及至少一个定位帧,本公开实施例对于黑场视频包括的黑帧数量和定位帧数量不作限定。
黑帧表示全黑、低亮度的图像帧。即,黑帧上所有像素点的颜色为黑色。定位帧包括一个或多个特征点。其中,一个特征点可以由定位帧上相邻的多个像素点来构成。即一个特征点可以为一个校正标识,该校正标识可以包括多个相邻的像素点。
在一些实施例中,在黑帧上所有像素点的颜色为黑色的情况下,为了避免用户在观看黑场视频时感知到定位帧,定位帧上第一像素点的颜色不为黑色(例如可以是白色或者其他颜色),第二像素点的颜色为黑色。其中,第一像素点为定位帧上用于构成特征点的像素点,第二像素点为定位帧上除了第一像素点之外的其他像素点。
示例性的,图12为根据本公开一些实施例的一种黑帧和定位帧的示意图。图12中的(a)所示的视频帧为黑帧。图12中的(b)所示的视频帧为定位帧,该定位帧包括4个圆形的特征点K1-K4,定位帧中每个特征点包括的第一像素点的颜色为白色,特征点以外的第二像素点的颜色为黑色。
本公开实施例不限制黑帧的具体颜色,例如也可以是灰色等。相应的,本公开实施例也不限制定位帧上第二像素点的具体颜色,第二像素点的颜色与黑帧的颜色保持一致即可。
在一些实施例中,定位帧包括一个或多个特征点,不同定位帧包括的特征点的数量可以相同,也可以不同。当一个定位帧包括多个特征点时,该多个特征点在该定位帧中的位置不同。
在一些实施例中,以图像校正需要K个特征点,K为大于1的整数为例,该K个特征点可以位于一个定位帧上,也可以分别位于多个定位帧上。本公开对此并不限定。当K个特征点分别位于多个定位帧上时,该多个定位帧中的每一个定位帧包括K个特 征点中的一部分特征点,不同定位帧中特征点的位置可以相同,也可以不同。
示例性的,M个定位帧中的每一个定位帧均可以包括多个特征点,M为大于1的整数。例如,图13为根据本公开一些实施例的一种定位帧的示意图,如图13所示,黑场视频中的每个定位帧的左上角存在特征点K1、右上角存在特征点K2、左下角存在特征点K3、右下角存在特征点K4,即每个定位帧包括位于不同位置的4个特征点。
示例性的,多个特征点也可以分别位于多个不同的定位帧上,每个定位帧包括一个特征点。例如,图14为根据本公开一些实施例的另一种定位帧的示意图,如图14所示,M个定位帧至少包括第一定位帧、第二定位帧、第三定位帧和第四定位帧。第一定位帧的左上角存在特征点K1,第二定位帧的右上角存在特征点K2,第三定位帧的左下角存在特征点K3,第四定位帧的右下角存在特征点K4。
示例性的,多个特征点也可以分别位于多个不同的定位帧上,每个定位帧包括两个特征点。例如,图15为根据本公开一些实施例的另一种定位帧的示意图,如图15所示,M个定位帧至少包括第五定位帧和第六定位帧。第五定位帧包括位于左上角的特征点K1和左下角的特征点K2,第六定位帧包括位于右上角的特征点K3和右下角的特征点K4,第五定位帧中特征点的位置与第六定位帧中特征点的位置不同。即,至少两个定位帧包括的特征点分别位于该至少两个定位帧中的不同位置。
本公开实施例对于特征点的形状以及特征点在定位帧中的设置方式并不限定。通过将多个特征点分别设置在不同的定位帧上,能够减少一个定位帧上需要携带的特征点的数目。这样可以减小定位帧与黑帧之间的差异,从而有效降低用户在观看黑场视频时感知到定位帧的可能性。
在一些实施例中,在黑场视频中,与定位帧相邻的视频帧为黑帧。例如,如图8所示,用户触发视频切换指令后,第二控制器402控制光源驱动电路404驱动光源组件100提供照明光束,第二控制器402控制光阀驱动电路405驱动光机200利用黑场视频中的图像信号对照明光束进行调制,以获得投影光束,镜头300将该投影光束投射成像。投影设备10在播放黑场视频时,可以将定位帧与黑帧交替投射在投影屏幕20上。比如,第二控制器402先向投影屏幕20投射至少一帧黑帧,然后向投影屏幕20投射一帧定位帧,之后向投影屏幕20投射至少一帧黑帧,再向投影屏幕20投射一帧定位帧,以此类推。即,第二控制器402可以向投影屏幕20投射多帧定位帧,且相邻的两帧定位帧之间具有至少一帧黑帧,从而实现定位帧与黑帧的交替投射。
黑场视频中的黑帧的数量大于或等于定位帧的数量。例如,以黑场视频的每秒的帧数为30为例,其中15个视频帧为黑帧,15帧为定位帧。又例如,黑场视频的每秒的帧数为60,黑场视频中连续的15个视频帧中,14个视频帧为黑帧,1个视频帧为定位帧。如此一来,在黑场视频播放时,用户看到的大部分视频帧均为黑帧,小部分视频帧为定位帧,能够降低用户感知到定位帧的可能性。而且通过将定位帧与黑帧交替投射,可以避免定位帧在用户的视觉暂留时间过长,降低用户在观看黑场视频时感知到定位帧的可能性。
在一些实施例中,本公开实施例的黑场视频可以是在投影设备接收到视频切换指令之前预先生成的,也可以是投影设备在接收视频切换指令之后实时生成的,也可以是投影设备出厂时预先配置的。
本公开实施例的黑场视频是在待处理视频的基础上改进生成的。待处理视频包括多个黑帧,多个黑帧中的任一黑帧可以为第一黑帧。待处理视频仅包括黑帧,不包括定位帧。下面对第一控制器401获取黑场视频的方式进行介绍。
在一些实施例中,第一控制器401还被配置为:在待处理视频包括的第一黑帧中添加特征点,得到定位帧,采用定位帧替换第一黑帧,得到黑场视频。
示例性的,待处理视频包括一个或多个第一黑帧,待处理视频包括的第一黑帧数量与黑场视频包括的定位帧数量相同。图16为根据本公开一些实施例的一种黑场视频的组成示意图,如图16所示,以待处理视频包括黑帧P1~黑帧P5共5个黑帧为例,可以选择待处理视频中第3个黑帧P3作为第一黑帧,在该第一黑帧即黑帧P3的左上 角、左下角、右上角和右下角中添加四个特征点得到定位帧L1,采用该定位帧L1替换第一黑帧P3,得到黑场视频。若黑场视频包括多个第一黑帧,可以将待处理视频中的多个第一黑帧替换为定位帧,得到包括多个定位帧的黑场视频。
在一些实施例中,第一控制器401还被配置为:在待处理视频包括的第一黑帧中添加特征点,得到定位帧,在第一黑帧之前或之后插入定位帧,得到黑场视频。
示例性的,图17为根据本公开一些实施例的另一种黑场视频的组成示意图,如图17所示,以待处理视频包括黑帧P1~黑帧P3共三个第一黑帧为例,在每个第一黑帧之后插入一个定位帧,即在黑帧P1后插入定位帧L1,在黑帧P2后插入定位帧L2,在黑帧P3后插入定位帧L3,得到包括3个定位帧L1~L3和3个黑帧P1~P3的黑场视频。
示例性的,在待处理视频中插入定位帧也可以理解为对待处理视频进行扩帧处理,每隔预设数量的黑帧插入一个定位帧。假设待处理视频帧率为30Hz,即待处理视频每秒钟有30个黑帧,在待处理视频中每隔一个黑帧插入一个定位帧,得到的黑场视频的帧率为60Hz。
在一些实施例中,投影设备10可以包括图像采集接口,第一控制器401与图像采集接口耦接,第一控制器401还被配置为:通过图像采集接口获取拍摄图像,根据拍摄图像,确定校正参数,向第二控制器402发送校正参数。该拍摄图像为黑场视频中的定位帧投射至投影屏幕20上时拍摄的图像。
示例性地,图像采集接口用于连接拍摄装置,该拍摄装置可以设置于投影设备10的整机壳体101上,或者,也可以设置于投影设备10的整机壳体101之外的位置,本公开对于拍摄装置的设置位置不作限定,只要能够拍摄投影屏幕20即可。第一控制器401通过图像采集接口可以获取图像,也可以获取视频。
第一控制器401通过图像采集接口向拍摄装置发送拍摄指令,拍摄装置响应于拍摄指令,拍摄镜头300在投影屏幕20上投射的定位帧,得到拍摄图像,并通过图像采集接口向第一控制器401发送拍摄图像。
示例性地,投影设备10在向投影屏幕20投射黑帧和定位帧时,拍摄装置可以对投影屏幕20进行拍摄。故拍摄装置可以拍摄投影屏20幕上投射的黑帧,也可以拍摄投影屏幕20上投射的定位帧。本公开实施中的拍摄图像均指拍摄装置拍摄的投影屏幕20上投射的定位帧的图像。
由于定位帧所包括的特征点的数量和位置可能不同,用于图像校正的拍摄图像的数量也会不同。因此,以下根据定位帧的不同情形,对拍摄图像的具体实现进行介绍。
示例性的,在黑场视频的任意一个定位帧均包括图像校正所需的全部特征点的情况下,拍摄图像可以是任意一个定位帧投射至投影屏幕上时拍摄的图像。这样一来,基于一个拍摄图像即可以确定全部特征点在投影屏幕上的位置。
示例性的,若黑场视频中一组定位帧包括图像校正所需的全部特征点,而一组定位帧中的各个定位帧仅包括图像校正所需的部分特征点,则拍摄图像至少包括M个图像,M为上述一组定位帧中定位帧的数目,M为大于1的整数。应理解,M个图像与一组定位帧中的M个定位帧一一对应。对于一个图像来说,该图像即为拍摄装置对显示该图像对应的定位帧的投影屏幕进行拍摄而得到的。这样,基于M个图像可以确定全部特征点在投影屏幕上的位置。
例如,假设图像校正需要4个特征点,这4个特征点分别设置于四个不同的定位帧上,如图14所示,黑场视频中至少包括第一定位帧、第二定位帧、第三定位帧和第四定位帧。那么,拍摄图像至少包括第一图像、第二图像、第三图像以及第四图像。其中,第一图像为第一定位帧投射至投影屏幕上时拍摄的图像,第二图像为第二定位帧投射至投影屏幕上时拍摄的图像,第三图像为第三定位帧投射至投影屏幕上时拍摄的图像,第四图像为第四定位帧投射至投影屏幕上时拍摄的图像。
示例性的,假设图像校正需要4个特征点,这4个特征点分别设置于两个不同的定位帧上,如图15所示,黑场视频中至少包括第五定位帧和第六定位帧。那么,拍摄 图像至少包括第五图像和第六图像。其中,第五图像为第五定位帧投射至投影屏幕上时拍摄的图像,第六图像为第六定位帧投射至投影屏幕上时拍摄的图像。
在一些实施例中,第一控制器401通过在拍摄图像中进行特征点识别,确定特征点的位置信息。并根据特征点的位置信息确定校正参数。还可以采用其他实现方式确定校正参数,本公开实施例不限于此。其中,校正参数指的是特征点在定位帧上与特征点在投影屏幕20上的偏移方向和偏移量。定位帧上的一个特征点与投影屏幕20中的一个像素区域相对应,投影屏幕20中与特征点对应的像素区域,在投影屏幕20上相对于该像素区域的初始投影位置的投影偏移量等于该特征点的偏移量。特征点的位置信息用于反映特征点在投影屏幕20上的位置。
示例性的,识别拍摄图像中的投影屏幕20,以确定投影屏幕20边框的位置,以及投影屏幕20角点的位置。并基于投影屏幕20边框的位置以及投影屏幕20角点的位置,建立二维平面坐标系。
例如,可以以投影屏幕20的左顶点为原点,上边为X轴,左边为Y轴预先建立二维平面坐标系。在拍摄图像中识别出特征点之后,可以根据特征点与投影屏幕20边框之间的相对位置,以及根据特征点与投影屏幕20角点之间的相对位置,确定特征点的位置信息。特征点的位置信息即为特征点在上述二维平面坐标系中的坐标。
在一些实施例中,若一个拍摄图像包括所有特征点,可以对该拍摄图像中的所有特征点进行识别,进而确定所有特征点的位置信息。若拍摄图像包括M个图像,M个图像中每一个图像包括一部分特征点,则需要对M个图像中的每个图像分别进行特征点识别,以确定每个图像中的特征点的位置信息。
第二控制器402,还被配置为:根据校正参数,对待投影图像进行校正,并向光机200传输校正处理后的待投影图像的图像信号,以使光机200利用校正处理后的待投影图像的图像信号对照明光束进行调制,以获得投影光束。
示例性的,第二控制器402接收来自第一控制器401的校正参数,根据校正参数对待投影图像进行校正。
通过在黑场视频中添加用于辅助投影设备10进行图像校正的定位帧,一方面,由于黑场视频代表画面全黑的视频,因此在视频切换时播放黑场视频可以避免用户发现视频切换所带来的图像花闪的现象。另一方面,根据包括定位帧的拍摄图像可以自动获取校正参数,根据校正参数对待校正图像进行校正,并且在播放黑场视频时,用户只能看到黑色的画面,而无法识别到黑场视频中存在的定位帧,从而在视频切换的过程中实现用户的无感校正。
投影设备10在待机状态下时,第二控制器402可以响应于待机指令,控制电源电路403保持为光阀驱动电路405供电的状态不变,且电源电路403可以保持为第二控制器402和唤醒电路4011供电的状态不变。即,投影设备10在处于待机状态时,第二控制器402和光阀驱动电路405均处于工作状态,因此第二控制器402在确定目标范围内存在目标对象时,可以快速控制电源电路403为光源驱动电路404供电,并向光阀驱动电路405发送投影信号,进而快速将待机画面投影显示至投影屏幕20。
并且,投影设备10在待机状态下时,电源电路403保持为第一控制器401中的唤醒电路4011供电的状态不变,且电源电路403停止为第一控制器401中从控电路4012供电。由于投影设备10在待机状态下时,电源电路403无需为从控电路4012供电,因此降低了投影设备10在待机状态下的功耗。
本公开一些实施例还提供一种投影系统,该投影系统包括上述投影设备10和投影屏幕20。投影设备10的出光口朝向投影屏幕,并发射光束至投影屏幕20,投影屏幕20用于反射该光束以实现画面的显示。
在一些实施例中,投影设备10可以为激光投影设备,也可以为发光二极管(Light Emitting Diode,LED)投影设备。投影设备10也可以有其他名称,例如投影主机等。
示例性的,投影设备10可以为长方体、棱柱状、球形、台灯状等,本公开实施例不做限定,只要具备投影功能即可。
本公开实施例提供一种投影图像的校正方法,应用于上述投影设备10,图18为根据本公开一些实施例的一种投影图像的校正方法的流程图,如图18所示,该方法包括步骤S181~S185。
S181、接收视频切换指令。
S182、响应于视频切换指令,播放黑场视频,所述黑场视频包括多个黑帧以及多个定位帧,所述定位帧用于辅助所述投影设备进行图像校正。
S183、获取拍摄图像。拍摄图像为定位帧投射至投影屏幕上时拍摄的图像。
S184、根据拍摄图像,确定校正参数。
S185、根据校正参数,对待投影图像进行校正。
在一些实施例中,图19为根据本公开一些实施例的另一种投影图像的校正方法的流程图,如图19所示,S181与S182之间还包括S186和S187。
S186:在待处理视频包括的第一黑帧中添加特征点,得到定位帧。待处理视频包括多个黑帧,第一黑帧为多个黑帧中的任一黑帧。
S187:采用定位帧分别替换第一黑帧,得到黑场视频。
在一些实施例中,如图19所示,S181于S182之间还包括S186和S188。
S186:在待处理视频包括的第一黑帧中添加特征点,得到定位帧。待处理视频包括多个黑帧,第一黑帧为多个黑帧中的任一黑帧。
S188:在多个第一黑帧之前或之后插入定位帧,得到黑场视频。
S187和S188为两种得到黑场视频的方法,可以使用S187和S188中的任一个方法,也可以同时使用两种方法。
本发明实施例还一种计算机可读存储介质,计算机可读存储介质包括计算机执行指令,当计算机执行指令在计算机上运行时,使得计算机执行如上述实施例的投影图像的校正方法。
本发明实施例还一种计算机程序产品,该计算机程序产品可直接加载到存储器中,并含有软件代码,该计算机程序产品经由计算机载入并执行后能够实现上述实施例的投影图像的校正方法。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件程序实现时,可以全部或部分地以计算机程序产品的形式来实现。该计算机程序产品包括一个或多个计算机执行指令。在计算机上加载和执行计算机执行指令时,全部或部分地产生按照本公开实施例所述的流程或功能。计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。计算机执行指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,计算机执行指令可以从一个网站站点、计算机、服务器或者数据中心通过有线(例如同轴电缆、光纤、数字用户线(Digital Subscriber Line,DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可以用介质集成的服务器、数据中心等数据存储设备。可用介质可以是磁性介质(例如,软盘、硬盘、磁带),光介质(例如,DVD)、或者半导体介质(例如固态硬盘(Solid State Disk,SSD))等。
尽管在此结合各实施例对本公开进行了描述,然而,在实施所要求保护的本公开过程中,本领域技术人员通过查看附图、公开内容、以及所附权利要求书,可理解并实现公开实施例的其他变化。在权利要求中,“包括”(comprising)一词不排除其他组成部分或步骤,“一”或“一个”不排除多个的情况。单个处理器或其他单元可以实现权利要求中列举的若干项功能。相互不同的从属权利要求中记载了某些措施,但这并不表示这些措施不能组合起来产生良好的效果。
尽管结合具体特征及其实施例对本公开进行了描述,显而易见的,在不脱离本公开的精神和范围的情况下,可对其进行各种修改和组合。相应地,本说明书和附图仅 仅是所附权利要求所界定的本公开的示例性说明,且视为已覆盖本公开范围内的任意和所有修改、变化、组合或等同物。
以上所述,仅为本公开的具体实施方式,但本公开的保护范围并不局限于此,任何在本公开揭露的技术范围内的变化或替换,都应涵盖在本公开的保护范围之内。因此,本公开的保护范围应该以权利要求的保护范围为准。

Claims (20)

  1. 一种投影设备,包括:
    光源组件,被配置为提供照明光束;
    光机,被配置为利用图像信号对所述照明光束进行调制,以获得投影光束;
    镜头,被配置为将所述投影光束投射成像;
    电路系统架构,被配置为控制所述光源组件和所述光机运行;其中,所述电路系统架构包括:
    第一控制器,与第二控制器耦接,且被配置为:接收视频切换指令;响应于所述视频切换指令,向所述第二控制器发送黑场视频,所述黑场视频包括多个黑帧以及多个定位帧,所述定位帧用于辅助所述投影设备进行图像校正;
    第二控制器,与所述光源组件和所述光机耦接,且被配置为接收所述黑场视频,并控制所述光源组件和所述光机播放所述黑场视频。
  2. 根据权利要求1所述的投影设备,其中,
    所述第一控制器,与图像采集接口耦接,且还被配置为:通过所述图像采集接口获取拍摄图像,根据所述拍摄图像,确定校正参数,向所述第二控制器发送所述校正参数;所述拍摄图像为所述定位帧投射至投影屏幕上时拍摄的图像;
    所述第二控制器,还被配置为:根据所述校正参数,对待投影图像进行校正,并向所述光机传输校正处理后的所述待投影图像的图像信号,以使所述光机利用校正处理后的所述待投影图像的图像信号对所述照明光束进行调制,以获得投影光束。
  3. 根据权利要求2所述的投影设备,其中,所述定位帧包括一个或多个特征点,当所述定位帧包括所述多个特征点时,所述多个特征点在所述定位帧中的位置不同。
  4. 根据权利要求3所述的投影设备,其中,至少两个所述定位帧包括的所述特征点分别位于至少两个所述定位帧中的不同位置。
  5. 根据权利要求1-4中任一项所述的投影设备,其中,在所述黑场视频中,与所述定位帧相邻的视频帧为所述黑帧。
  6. 根据权利要求1-5中任一项所述的投影设备,其中,所述第一控制器还被配置为:
    在待处理视频包括的第一黑帧中添加所述特征点,得到所述定位帧,所述待处理视频包括多个黑帧,所述第一黑帧为所述多个黑帧中的任一黑帧;
    采用所述定位帧替换所述第一黑帧,得到所述黑场视频。
  7. 根据权利要求1-5中任一项所述的投影设备,其中,所述第一控制器还被配置为:
    在待处理视频包括的第一黑帧中添加所述特征点,得到所述定位帧,所述待处理视频包括多个黑帧,所述第一黑帧为所述多个黑帧中的任一黑帧;
    在所述第一黑帧之前或之后插入所述定位帧,得到所述黑场视频。
  8. 一种投影设备,包括:
    光源组件,被配置为提供照明光束;
    光机,被配置为利用图像信号对所述照明光束进行调制,以获得投影光束;
    镜头,被配置为将所述投影光束投射成像;
    检测器件,被配置为检测目标范围内是否存在目标对象,并基于检测结果生成检测信号;
    电路系统架构,被配置为控制所述光源组件和所述光机运行;其中,所述电路系统架构包括:
    电源电路,与光阀驱动电路和光源驱动电路耦接,且被配置为为所述光阀驱动电路和所述光源驱动电路供电;
    第一控制器,与第二控制器耦接,且被配置为响应于待机操作,向第二控制器发送待机指令,并控制所述电源电路保持为所述第二控制器供电的状态不变;
    所述第二控制器,被配置为响应于所述待机指令,控制所述电源电路保持为所述光阀驱动电路供电的状态不变,并控制所述电源电路停止为所述光源驱动电 路供电;若基于所述检测信号确定所述目标范围内存在所述目标对象,则控制所述电源电路为所述光源驱动电路供电,并向所述光阀驱动电路发送待机图像的图像信号;
    所述光源驱动电路,被配置为响应于所述待机图像的图像信号,驱动所述光源提供照明光束;
    所述光阀驱动电路,被配置为响应于所述待机图像的图像信号,驱动所述光机利用所述待机图像的图像信号对所述照明光束进行调制,以获得投影光束。
  9. 根据权利要求8所述的投影设备,其中,所述第一控制器与所述检测器件耦接;
    所述第一控制器,还被配置为响应于所述待机操作,获取来自所述检测器件的所述检测信号,并向所述第二控制器发送所述检测信号。
  10. 根据权利要求8所述的投影设备,其中,所述第二控制器与所述检测器件耦接;
    所述第二控制器,还被配置为响应于所述待机指令,获取来自所述检测器件的所述检测信号。
  11. 根据权利要求8-10中任一项所述的投影设备,其中,所述第一控制器包括唤醒电路和从控电路;
    所述唤醒电路,与所述从控电路和所述第二控制器耦接,且被配置为响应于所述待机操作,向所述第二控制器发送所述待机指令,并控制所述电源电路停止为所述从控电路供电;其中,所述从控电路被配置为向所述第二控制器发送控制指令。
  12. 根据权利要求8-11中任一所述的投影设备,其中,所述待机图像为所述光阀驱动电路中预先存储的图像;
    或者,
    所述第二控制器还被配置为基于所述检测信号确定所述目标范围内存在所述目标对象,向所述第一控制器发送图像请求信号,且向所述光阀驱动电路发送所述待机图像;所述图像请求信号用于指示所述第一控制器向所述第二控制器发送预先存储的所述待机图像;
    所述第一控制器还被配置为响应于所述图像请求信号,向所述第二控制器发送预先存储的所述待机图像。
  13. 根据权利要求8-12中任一所述的投影设备,还包括散热组件,所述第二控制器,还被配置为响应于所述待机指令,控制所述电源电路停止为所述散热组件供电;若基于所述检测信号确定所述目标范围内存在所述目标对象,则控制所述电源电路为所述散热组件供电。
  14. 一种投影系统,包括投影屏幕,以及权利要求1-13中任一项所述的投影设备。
  15. 一种投影图像的校正方法,包括:
    接收视频切换指令;
    响应于视频切换指令,播放黑场视频,所述黑场视频包括多个黑帧以及多个定位帧,所述定位帧用于辅助所述投影设备进行图像校正;
    获取拍摄图像,所述拍摄图像为所述定位帧投射至投影屏幕上时拍摄的图像;
    根据所述拍摄图像,确定校正参数;
    根据所述校正参数,对待投影图像进行校正。
  16. 根据权利要求15所述的方法,其中,所述定位帧包括一个或多个特征点,当所述定位帧包括所述多个特征点时,所述多个特征点在所述定位帧中的位置不同。
  17. 根据权利要求16所述的方法,其中,至少两个所述定位帧包括的所述特征点分别位于至少两个所述定位帧中的不同位置。
  18. 根据权利要求15-17中任一项所述的方法,其中,在所述黑场视频中,与所述定位帧相邻的视频帧为所述黑帧。
  19. 根据权利要求15-18中任一项所述的方法,还包括:
    在待处理视频包括的第一黑帧中添加所述特征点,得到所述定位帧,所述待处理 视频包括多个黑帧,所述第一黑帧为所述多个黑帧中的任一黑帧;
    采用所述定位帧分别替换所述第一黑帧,得到所述黑场视频。
  20. 根据权利要求15-18中任一项所述的方法,还包括:
    在待处理视频包括的第一黑帧中添加所述特征点,得到所述定位帧,所述待处理视频包括多个黑帧,所述第一黑帧为所述多个黑帧中的任一黑帧;
    在多个第一黑帧之前或之后插入所述定位帧,得到所述黑场视频。
PCT/CN2022/102067 2021-07-21 2022-06-28 投影设备及投影图像的校正方法 WO2023000937A1 (zh)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
CN202110825816.8 2021-07-21
CN202110825818.7 2021-07-21
CN202110825818.7A CN115695740A (zh) 2021-07-21 2021-07-21 激光投影设备及投影显示方法
CN202110825816.8A CN115691365A (zh) 2021-07-21 2021-07-21 激光投影设备及投影显示方法
CN202111567322.0A CN114339174B (zh) 2021-12-20 一种投影设备及其控制方法
CN202111567322.0 2021-12-20

Publications (1)

Publication Number Publication Date
WO2023000937A1 true WO2023000937A1 (zh) 2023-01-26

Family

ID=84978943

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/102067 WO2023000937A1 (zh) 2021-07-21 2022-06-28 投影设备及投影图像的校正方法

Country Status (1)

Country Link
WO (1) WO2023000937A1 (zh)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011082798A (ja) * 2009-10-07 2011-04-21 Sanyo Electric Co Ltd 投写型映像表示装置
CN110300294A (zh) * 2018-03-22 2019-10-01 卡西欧计算机株式会社 投影控制装置、投影控制方法以及存储介质
CN112165644A (zh) * 2020-09-27 2021-01-01 海信视像科技股份有限公司 一种显示设备及竖屏状态下视频播放方法
CN114339174A (zh) * 2021-12-20 2022-04-12 青岛海信激光显示股份有限公司 一种投影设备及其控制方法

Similar Documents

Publication Publication Date Title
US8550635B2 (en) Projection system
US8702244B2 (en) Multimedia player displaying projection image
US8585213B2 (en) Projection-type display and control thereof
US9470412B2 (en) Lighting device
US20110292080A1 (en) Projection system
JP5428378B2 (ja) 画像表示システム、画像通信システム
US11323672B2 (en) Control method for projector and projector
US10536627B2 (en) Display apparatus, method of controlling display apparatus, document camera, and method of controlling document camera
WO2022253336A1 (zh) 激光投影设备及投影图像的校正方法
WO2023246211A1 (zh) 激光投影设备及投影图像的显示方法
CN203275866U (zh) 一种投影机
CN105652567A (zh) 一种投影方法
CN114268775A (zh) 投影系统、方法及存储介质
JP3846444B2 (ja) 投写面上に画像を表示させずに行う画像の表示領域の決定
WO2021164440A1 (zh) 一种投影设备、投影系统及投影方法
JP2000112021A (ja) 投写型表示装置
WO2023000937A1 (zh) 投影设备及投影图像的校正方法
US11019314B2 (en) Projector and method for controlling projector
CN202870456U (zh) 一种家用投影机
US11531255B2 (en) Projector, projection optical device, and method of controlling projector
CN203275867U (zh) 一种投影机
CN112598589A (zh) 激光投影系统及图像校正方法
JP2020182178A (ja) 表示装置及びその制御方法、並びにプログラム
CN106324954B (zh) 一种电子设备
JP2019080176A (ja) 表示装置及びシステム並びに制御方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22845102

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE