WO2023088304A1 - Projection device and projection area correction method - Google Patents

Projection device and projection area correction method

Info

Publication number: WO2023088304A1
Authority: WIPO (PCT)
Prior art keywords: projection, target, projection area, area, coordinates
Application number: PCT/CN2022/132250
Other languages: English (en), French (fr)
Inventors: 卢平光, 郑晴晴, 王昊, 王英俊, 岳国华, 唐高明, 何营昊, 陈先义
Original Assignee: 海信视像科技股份有限公司 (Hisense Visual Technology Co., Ltd.)
Application filed by 海信视像科技股份有限公司
Priority to CN202280063350.5A (publication CN118077192A)
Publication of WO2023088304A1

Classifications

    • H04N 9/317 — Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]: convergence or focusing systems
    • H04N 9/3179 — Projection devices for colour picture display: video signal processing therefor
    • H04N 9/3185 — Projection devices for colour picture display: geometric adjustment, e.g. keystone or convergence
    • G03B 21/53 — Projectors or projection-type viewers: means for automatic focusing, e.g. to compensate thermal effects

Definitions

  • the present application relates to the technical field of projection equipment, and in particular to a projection device and a projection area correction method.
  • a projection device is a display device that can project images or video onto a screen.
  • the projection device can project the laser light of a specific color onto the projection surface through the refraction of the optical lens assembly to form a specific image.
  • the projection device can project the image on the projection surface into the projection area set by the user, for example onto a screen placed by the user in advance.
  • if the position of the projection device shifts, the projection device can no longer project the image entirely into the projection area, and the user experience suffers. For this reason, the projection area needs to be re-determined so that the image can again be projected into it.
  • when an existing projection device re-determines the projection area, the user adjusts the position and projection angle of the device so that the image falls entirely within the projection area.
  • this method requires the user to manually select and adjust the orientation; the process is cumbersome, the projection area cannot be determined accurately, and the user experience is poor.
  • Embodiments of the present application provide a projection device and a method for correcting a projection area.
  • the projection device includes an optical machine and a controller.
  • the optical machine is configured to project the chart onto the projection surface;
  • the controller is configured to perform the following steps:
  • acquire the initial projection area in which the optical machine projects the chart onto the projection surface;
  • the optical machine is controlled to project the target chart to the target projection area.
  • An embodiment of the present application provides a method for correcting a projection area of a projection device, including:
  • acquiring the initial projection area in which the optical machine projects the chart onto the projection surface;
  • the optical machine is controlled to project the target chart to the target projection area.
  • Fig. 1 shows a schematic diagram of a projection screen in some embodiments of the present application.
  • Fig. 2 shows a schematic diagram of the placement of a projection device in some embodiments of the present application.
  • Fig. 3 shows a schematic diagram of the optical path of a projection device in some embodiments of the present application.
  • Fig. 4 shows a schematic diagram of the circuit architecture of a projection device in some embodiments of the present application.
  • Fig. 5 shows a schematic structural diagram of a projection device in some embodiments of the present application.
  • Fig. 6 shows a schematic diagram of the circuit structure of a projection device in some embodiments of the present application.
  • Fig. 7 shows a schematic diagram of a system framework through which a projection device implements display control in some embodiments of the present application.
  • Fig. 8 shows a schematic diagram of a change in the position of the projection device in some embodiments of the present application.
  • Fig. 9 shows an interaction flowchart of the components of a projection device in some embodiments of the present application.
  • Fig. 10 shows a schematic diagram of the first chart in some embodiments of the present application.
  • Fig. 11 shows another schematic diagram of the first chart in some embodiments of the present application.
  • Fig. 12 shows a schematic flowchart of obtaining the mapping relationship in some embodiments of the present application.
  • Fig. 13 shows a schematic diagram of the target projection area in some embodiments of the present application.
  • Fig. 14 shows another schematic diagram of the target projection area in some embodiments of the present application.
  • Fig. 15 shows a further schematic diagram of the target projection area in some embodiments of the present application.
  • Fig. 16 shows a schematic diagram of the projection area before and after correction in some embodiments of the present application.
  • Fig. 17 shows a schematic diagram of a projection area correction method in some embodiments of the present application.
  • Fig. 18 shows a schematic structural diagram of a projection device in some embodiments of the present application.
  • Fig. 19 shows a schematic structural diagram of a projection device according to another embodiment of the present application.
  • Fig. 20 shows a schematic diagram of the signaling interaction sequence through which a projection device implements the anti-eye-shooting function according to another embodiment of the present application.
  • Fig. 21 shows a schematic diagram of the signaling interaction sequence through which a projection device implements the display image correction function according to another embodiment of the present application.
  • Fig. 22 shows a schematic flowchart of a projection device implementing an autofocus algorithm according to another embodiment of the present application.
  • Fig. 23 shows a schematic flowchart of a projection device implementing keystone correction and obstacle avoidance algorithms according to another embodiment of the present application.
  • Fig. 24 shows a schematic flowchart of a projection device implementing a screen-entry algorithm according to another embodiment of the present application.
  • Fig. 25 shows a schematic flowchart of a projection device implementing an anti-eye-shooting algorithm according to another embodiment of the present application.
  • a projection device is a device that can project images or videos onto a screen.
  • the projection device can be connected through different interfaces to computers, radio and television networks, the Internet, VCD (Video Compact Disc) players, DVD (Digital Video Disc) players, game consoles, DV camcorders, and the like, to play the corresponding video signals.
  • Projection equipment is widely used in homes, offices, schools and entertainment venues, etc.
  • the projection device may be in the form of a projector.
  • the projection device may be provided with an optical machine, and the optical machine may project an image, such as a chart selected by the user, onto the projection surface, specifically onto a projection area specified by the user on the projection surface.
  • the projection surface can be a wall, and the projection area can be a specific area on the wall, or a projection screen in front of the wall.
  • the projection screen is used in cinemas, offices, home theaters, large conferences, and other settings. It is a tool for displaying images and video files and can be made in different specifications and sizes according to actual needs. To better match users' viewing habits, the aspect ratio of the screen used with the projection device is usually set to 16:9 to fit the size of the image projected by the device.
  • Figure 1 shows a schematic diagram of a projection screen in some embodiments.
  • the screen edge of the projection screen can be a dark edge band, which is used to highlight the border of the screen, and the central area of the projection screen can be a white screen, which can be used as a projection area for displaying images projected by the projection device.
  • the installation of the projection screen and the projection device can be carried out by professional after-sales technicians.
  • by placing the projection screen and the projection device at positions where the best projection effect is achieved during operation, the image projected by the device can fall entirely within the projection screen, improving the user experience.
  • the user can also set the positions of the two by themselves, which is not limited in the embodiments of the present application.
  • Fig. 2 shows a schematic diagram of placement of projection equipment in some embodiments.
  • the projection screen provided by the present application can be fixed at the first position, and the projection device is placed at the second position, so that the picture projected by the projection device coincides with the projection screen.
  • Fig. 3 shows a schematic diagram of an optical path of a projection device in some embodiments.
  • An embodiment of the present application provides a projection device, including a laser light source 100 , an optical machine 200 , a lens 300 , and a projection medium 400 .
  • the laser light source 100 provides illumination for the optical machine 200;
  • the optical machine 200 modulates the light beam, outputs it to the lens 300 for imaging, and projects it onto the projection medium 400 to form a projection image.
  • the laser light source 100 of the projection device includes a laser component and an optical lens component, and the light beam emitted by the laser component can pass through the optical lens component to provide illumination for the optical machine.
  • the optical lens assembly requires a higher level of environmental cleanliness and hermetic sealing, while the chamber in which the laser assembly is installed can be sealed to a lower dustproof rating, reducing sealing cost.
  • the optical machine 200 of the projection device may be implemented to include a blue optical machine, a green optical machine, and a red optical machine, and may also include a heat dissipation system, a circuit control system, and the like. It should be noted that, in some embodiments, the light-emitting component of the projection device may also be realized by an LED light source.
  • the present application provides a projection device including a three-color optical machine and a controller; the three-color optical machine is used to modulate and generate laser light carrying the pixels of a user interface, and includes a blue optical machine, a green optical machine, and a red optical machine.
  • the controller is configured to: obtain the average gray value of the user interface; and, when it determines that the average gray value is greater than a first threshold and that this has lasted longer than a time threshold, lower the operating current of the red optical machine according to a preset gradient so as to reduce the heating of the three-color optical machine. By reducing the operating current of the red optical machine integrated in the three-color optical machine, overheating of the red optical machine, and thus of the three-color optical machine and the projection device as a whole, can be controlled. A minimal sketch of this logic is given below.
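  • The following Python sketch makes the control loop just described concrete; the helper names (get_ui_frame, set_red_current) and all numeric values are illustrative assumptions, not interfaces defined by the present application.

```python
import time
import numpy as np

GRAY_THRESHOLD = 200   # first threshold on the average gray value (assumed)
TIME_THRESHOLD = 5.0   # duration threshold in seconds (assumed)
CURRENT_STEP = 0.05    # preset gradient for lowering the current, in amperes (assumed)
MIN_CURRENT = 0.5      # safety floor for the red optical machine's current (assumed)

def control_red_current(get_ui_frame, set_red_current, current):
    """Lower the red optical machine's operating current by a preset gradient
    whenever the UI's average gray value stays above the first threshold
    for longer than the time threshold."""
    over_since = None
    while True:
        frame = get_ui_frame()            # grayscale frame of the user interface
        avg_gray = float(np.mean(frame))  # average gray value of the user interface
        if avg_gray > GRAY_THRESHOLD:
            over_since = over_since or time.time()
            if time.time() - over_since > TIME_THRESHOLD:
                current = max(MIN_CURRENT, current - CURRENT_STEP)
                set_red_current(current)  # reduces heating of the three-color optical machine
                over_since = None         # restart the timing window
        else:
            over_since = None
        time.sleep(0.1)
```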
  • the optical machine 200 can be implemented as a three-color optical machine that integrates a blue optical machine, a green optical machine, and a red optical machine.
  • the implementation provided by the present application will be described by taking as an example an optical machine 200 that includes a blue optical machine, a green optical machine, and a red optical machine.
  • the optical system of the projection device is composed of a light source part and an optical machine part.
  • the function of the light source part is to provide illumination for the optical machine, and the function of the optical machine part is to modulate the illumination beam provided by the light source and finally form the projected image.
  • Fig. 4 shows a schematic diagram of a circuit architecture of a projection device in some embodiments.
  • the projection device may include a display control circuit 10, a laser light source 20, at least one laser driving component 30, and at least one brightness sensor 40; the laser light source 20 may include at least one laser corresponding one-to-one to the at least one laser driving component 30.
  • the at least one refers to one or more, and a plurality refers to two or more.
  • the projection device can realize adaptive adjustment. For example, by setting the brightness sensor 40 in the light output path of the laser light source 20, the brightness sensor 40 can detect the first brightness value of the laser light source and send the first brightness value to the display control circuit 10.
  • the display control circuit 10 can obtain the second brightness value corresponding to the driving current of each laser. When it determines that the difference between a laser's second brightness value and first brightness value is greater than a difference threshold, it determines that the laser has developed a COD (catastrophic optical damage) failure; the display control circuit can then adjust the current control signal of that laser's driving component until the difference is less than or equal to the difference threshold, thereby eliminating the COD failure of, for example, the blue laser. The projection device can thus eliminate a laser's COD failure in time, reduce the damage rate of the laser, and improve the image display effect of the projection equipment.
  • the laser light source 20 includes three lasers that correspond one-to-one to the laser driving assembly 30 , and the three lasers may be a blue laser 201 , a red laser 202 and a green laser 203 respectively.
  • the blue laser 201 is used to emit blue laser
  • the red laser 202 is used to emit red laser
  • the green laser 203 is used to emit green laser.
  • the laser driving component 30 may be implemented to include a plurality of sub-laser driving components corresponding to lasers of different colors.
  • the display control circuit 10 is used to output the primary color enable signal and the primary color current control signal to the laser driving component 30 to drive the laser to emit light.
  • the display control circuit 10 is connected to the laser driving components 30 and is used to output at least one enable signal corresponding to the three primary colors of each frame of the multi-frame display image, transmit each enable signal to the corresponding laser driving component 30, output at least one current control signal corresponding to the three primary colors of each frame of image, and transmit each current control signal to the corresponding laser driving component 30.
  • the display control circuit 10 may be a microcontroller unit (microcontroller unit, MCU), also known as a single-chip microcomputer.
  • the current control signal may be a pulse width modulation (pulse width modulation, PWM) signal.
  • the display control circuit 10 can output the blue PWM signal B_PWM corresponding to the blue laser 201 based on the blue primary color component of the image to be displayed, output the red PWM signal R_PWM corresponding to the red laser 202 based on the red primary color component, and output the green PWM signal G_PWM corresponding to the green laser 203 based on the green primary color component.
  • the display control circuit can output the enable signal B_EN corresponding to the blue laser 201 based on the lighting duration of the blue laser 201 in the driving cycle, output the enable signal R_EN corresponding to the red laser 202 based on the lighting duration of the red laser 202 in the driving cycle, and output the enable signal G_EN corresponding to the green laser 203 based on the lighting duration of the green laser 203 in the driving cycle.
  • the blue laser 201 , the red laser 202 and the green laser 203 are respectively connected to the laser driving assembly 30 .
  • the laser driving component 30 can provide corresponding driving current to the blue laser 201 in response to the blue PWM signal and the enable signal sent by the display control circuit 10 .
  • the blue laser 201 is used to emit light under the driving current.
  • the brightness sensor is arranged in the light output path of the laser light source, usually on one side of the light output path, without blocking the light path.
  • at least one brightness sensor 40 is arranged in the light output path of the laser light source 20, and each brightness sensor is connected to the display control circuit 10 for detecting the first brightness value of a laser and sending the first brightness value to the display control circuit 10.
  • the display control circuit 10 is also used to obtain the second brightness value corresponding to the driving current of each laser. If it detects that the difference between a laser's second brightness value and first brightness value is greater than the difference threshold, this indicates that the laser has a COD failure, and the display control circuit 10 adjusts the current control signal of the corresponding laser driving component 30 until the difference is less than or equal to the difference threshold; that is, the COD failure is eliminated by reducing the driving current of the laser.
  • both the first luminance value and the second luminance value are represented as light output power values, wherein the second luminance value may be pre-stored, or may be a luminance value sent back by a luminance sensor in a normal lighting state.
  • the display control circuit will reduce the current control signal of the laser driver component corresponding to the laser, and continuously collect and compare the brightness signal returned by the brightness sensor.
  • if no COD failure is detected, the display control circuit 10 does not need to adjust the current control signal of the laser driving component 30 corresponding to that laser.
  • the display control circuit 10 may store the corresponding relationship between the current and the brightness value.
  • the luminance value corresponding to each current in the corresponding relationship is the initial luminance value that the laser can emit when the laser works normally under the driving of the current (that is, no COD failure occurs).
  • the brightness value may be the initial brightness when the laser is first turned on when it is driven by the current.
  • the display control circuit 10 can obtain the second brightness value corresponding to the driving current of each laser from the corresponding relationship, the driving current is the current actual working current of the laser, and the second brightness value corresponding to the driving current is the brightness value that the laser can emit when it works normally under the driving current.
  • the difference threshold may be a fixed value pre-stored in the display control circuit 10 .
  • when the display control circuit 10 adjusts the current control signal of the laser driving component 30 corresponding to a laser, it can reduce the duty cycle of that current control signal, thereby reducing the driving current of the laser.
  • the brightness sensor 40 can detect the first brightness value of the blue laser 201 and send the first brightness value to the display control circuit 10 .
  • the display control circuit 10 can obtain the driving current of the blue laser 201 and obtain the second brightness value corresponding to that driving current from the stored correspondence between current and brightness. It then checks whether the difference between the second brightness value and the first brightness value is greater than the difference threshold; if so, the blue laser 201 has a COD failure, and the display control circuit 10 reduces the current control signal of the laser driving component 30 corresponding to the blue laser 201.
  • the display control circuit 10 then acquires the first brightness value of the blue laser 201 and the second brightness value corresponding to its driving current again; if the difference is still greater than the difference threshold, the current control signal of the laser driving component 30 corresponding to the blue laser 201 is lowered again. This loop continues until the difference is less than or equal to the difference threshold, so that the COD failure of the blue laser 201 is eliminated by reducing its driving current.
  • the display control circuit 10 can monitor in real time whether each laser has a COD failure; when it determines that any laser has a COD failure, the failure is eliminated in time, its duration is shortened, damage to the laser is reduced, and the image display effect of the projection device is ensured.
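  • The COD detection and elimination described above amounts to a simple feedback loop. Here is a hedged Python sketch; the helper functions and threshold values are illustrative assumptions, since the present application specifies the comparison and the duty-cycle reduction but not a concrete software interface.

```python
DIFF_THRESHOLD = 0.2   # allowed gap between expected and measured output power (assumed)
DUTY_STEP = 0.05       # duty-cycle reduction per iteration (assumed)

def eliminate_cod(read_brightness_sensor, lookup_expected_brightness,
                  get_driving_current, set_duty_cycle, duty=1.0):
    """Reduce the laser's PWM duty cycle until the measured (first) brightness
    is within the difference threshold of the expected (second) brightness."""
    while True:
        first = read_brightness_sensor()                             # measured power
        second = lookup_expected_brightness(get_driving_current())   # expected power
        if second - first <= DIFF_THRESHOLD:
            return duty   # no COD failure, or the failure has been eliminated
        # COD failure detected: reducing the duty cycle reduces the driving current
        duty = max(0.0, duty - DUTY_STEP)
        set_duty_cycle(duty)
```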
  • Fig. 5 shows a schematic structural diagram of a projection device in some embodiments of the present application.
  • the laser light source 20 in the projection device may include an independently arranged blue laser 201, red laser 202, and green laser 203; such a projection device may also be called a three-color projection device. The blue laser 201, the red laser 202, and the green laser 203 are all MCL-packaged lasers, which are small in size and conducive to a compact arrangement of the optical path.
  • the controller includes at least one of a central processing unit (CPU), a video processor, an audio processor, a graphics processing unit (GPU), random access memory (RAM), read-only memory (ROM), first to nth interfaces for input/output, a communication bus, and the like.
  • the projection device may be configured with a camera for cooperating with the projection device to realize adjustment and control of the projection process.
  • the camera configured for the projection device can be implemented as a 3D camera, a depth camera, or a binocular camera; when implemented as a binocular camera, it specifically includes a left camera and a right camera. The binocular camera can capture the picture corresponding to the projection device, that is, the images and charts presented on the projection surface, projected by the device's built-in optical machine.
  • based on the image captured by the camera, the controller of the projection device can determine the included angle between the optical machine and the projection surface and correct the displayed image accordingly, realizing automatic keystone correction.
  • the laser driving component 30 may include a driving circuit 301 , a switching circuit 302 and an amplifying circuit 303 .
  • the driving circuit 301 may be a driving chip.
  • the switch circuit 302 may be a metal-oxide-semiconductor (MOS) transistor.
  • the driving circuit 301 is used to output the driving current to the corresponding laser in the laser light source 20 through the VOUT terminal based on the current control signal sent by the display control circuit 10 , and transmit the received enabling signal to the switch circuit 302 through the ENOUT terminal.
  • the laser may include n sub-lasers connected in series, which are respectively sub-lasers LD1 to LDn. n is a positive integer greater than 0.
  • the switch circuit 302 is connected in series in the current path of the laser, and is used to control the conduction of the current path when the received enable signal is at an effective potential.
  • the amplifying circuit 303 is connected to the detection node E in the current path of the laser light source 20 and to the display control circuit 10, and is used to convert the detected driving current of the laser assembly into a driving voltage, amplify the driving voltage, and transmit the amplified driving voltage to the display control circuit 10.
  • the display control circuit 10 is further configured to determine the amplified driving voltage as the driving current of the laser, and obtain a second brightness value corresponding to the driving current.
  • the amplifying circuit 303 may include: a first operational amplifier A1, a first resistor (also known as a sampling power resistor) R1, a second resistor R2, a third resistor R3 and a fourth resistor R4.
  • the non-inverting input terminal (also known as the positive terminal) of the first operational amplifier A1 is connected to one end of the second resistor R2; the inverting input terminal (also known as the negative terminal) of the first operational amplifier A1 is connected to one end of the third resistor R3 and one end of the fourth resistor R4; and the output terminal of the first operational amplifier A1 is connected to the other end of the fourth resistor R4 and to the processing sub-circuit 3022.
  • One end of the first resistor R1 is connected to the detection node E, and the other end of the first resistor R1 is connected to the reference power supply end.
  • the other end of the second resistor R2 is connected to the detection node E, and the other end of the third resistor R3 is connected to the reference power supply end.
  • the reference power terminal is a ground terminal.
  • the first operational amplifier A1 may further include two power supply terminals, one of which is connected to the power supply terminal VCC, and the other power supply terminal may be connected to the reference power supply terminal.
  • the relatively large driving current of the laser in the laser light source 20 passes through the first resistor R1 and generates a voltage drop; the voltage Vi at one end of the first resistor R1 (that is, at the detection node E) is transmitted through the second resistor R2 to the non-inverting input of the first operational amplifier A1, amplified N times, and then output.
  • the N is the amplification factor of the first operational amplifier A1, and N is a positive number.
  • the magnification ratio N can make the value of the voltage Vfb output by the first operational amplifier A1 be an integer multiple of the value of the driving current of the laser.
  • the value of the voltage Vfb can be equal to the value of the driving current, so that the display control circuit 10 can determine the amplified driving voltage as the driving current of the laser.
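  • Restated in the document's own symbols, the sampling relation is (the example values of N and R1 below are assumptions, not values from the present application):

```latex
V_i = I_d \cdot R_1, \qquad V_{fb} = N \cdot V_i = N \cdot R_1 \cdot I_d
```

  • choosing, for example, N = 1/R1 (say N = 100 when R1 = 0.01 Ω) makes the numeric value of Vfb equal the numeric value of the driving current Id, which is why the display control circuit 10 can read the amplified voltage directly as the driving current.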
  • the display control circuit 10, the driving circuit 301, the switch circuit 302, and the amplifying circuit 303 form a closed loop that realizes feedback adjustment of the laser's driving current, so that the display control circuit 10 can use the difference between the laser's second brightness value and first brightness value to adjust the driving current in time, that is, to adjust the actual brightness of the laser in time, avoiding long-lasting COD failure and improving the accuracy of laser emission control.
  • the laser light source 20 includes a blue laser, a red laser and a green laser.
  • the blue laser 201 can be set at the L1 position
  • the red laser 202 can be set at the L2 position
  • the green laser 203 can be set at the L3 position.
  • the laser light at the position L1 is transmitted once through the fourth dichroic film 604 , is reflected once at the fifth dichroic film 605 , and then enters the first lens 901 .
  • the light efficiency at the L1 position is P1 = Pt × Pf, where Pt represents the transmittance of the fourth dichroic film 604 and Pf represents the reflectance of the fifth dichroic film 605.
  • the light efficiency of the laser light at the position L3 is the highest, and the light efficiency of the laser light at the position L1 is the lowest.
  • the maximum optical power Pb output by the blue laser 201 is 4.5 watts (W)
  • the maximum optical power Pr output by the red laser 202 is 2.5W
  • the maximum optical power Pg output by the green laser 203 is 1.5W. That is, the maximum optical power output by the blue laser 201 is the largest, followed by the maximum optical power output by the red laser 202 , and the maximum optical power output by the green laser 203 is the smallest.
  • the green laser 203 is therefore placed at the L3 position, the red laser 202 is placed at the L2 position, and the blue laser 201 is placed at the L1 position. That is, the green laser 203 is arranged in the optical path with the highest light efficiency, so as to ensure that the projection device can obtain the highest light efficiency.
  • the display control circuit 10 is further configured to restore the current control signal of the laser driving component corresponding to a laser to its initial value when the difference between the laser's second brightness value and first brightness value is less than or equal to the difference threshold; the initial value is the magnitude of the PWM current control signal supplied to the laser in the normal state. In this way, a COD failure can be identified quickly when it occurs and the driving current can be reduced in time, limiting continued damage to the laser and helping it recover by itself. The whole process requires no disassembly or human intervention, which improves the reliability of the laser light source and ensures the projection display quality of the laser projection device.
  • after the projection device is started, it can directly enter the display interface of the signal source selected last time, or the signal source selection interface, where the signal source may be a preset video-on-demand program, an HDMI interface, a live TV interface, or the like;
  • the projector can display the content obtained from the different signal sources.
  • the system of the projection device may include a kernel (Kernel), a command parser (shell), a file system, and an application program.
  • the kernel, shell, and file system make up the basic operating system structure, and they allow users to manage files, run programs, and use the system.
  • after the kernel starts, it activates kernel space, abstracts the hardware, initializes hardware parameters, and runs and maintains virtual memory, the scheduler, signals, and inter-process communication (IPC);
  • the shell and user applications are then loaded;
  • when an application is started, it is compiled into machine code to form a process.
  • the system is divided into four layers; from top to bottom, these are the Applications layer ("application layer"), the Application Framework layer ("framework layer"), the Android Runtime and system library layer ("system runtime layer"), and the kernel layer.
  • at least one application program runs in the application layer; these can be applications that come with the operating system, such as a window program, a system settings program, or a clock program, or applications developed by third-party developers.
  • the application program packages in the application program layer are not limited to the above examples.
  • a laser projection device will be taken as an example to describe the projection device and the display control method based on geometric correction.
  • the laser light emitted by the projection device is reflected by the nanoscale mirrors of a digital micromirror device (DMD) chip. The optical lens is likewise a precision element; when the image plane and the object plane are not parallel, the image projected onto the screen becomes geometrically distorted.
  • Fig. 7 shows a schematic diagram of a system framework for implementing display control by a projection device in some embodiments of the present application.
  • the projection device provided by the present application has the characteristics of telephoto micro-projection.
  • the projection device includes a controller, and the controller can control the display of the optical machine image through preset algorithms to realize automatic keystone correction of the displayed image, automatic screen entry, automatic obstacle avoidance, automatic focus, and anti-eye-shooting functions. Through the display control method based on geometric correction, the projection device can be moved flexibly in the telephoto micro-projection scenario; when problems such as an abnormal picture occur, the controller can control the projection device to perform automatic display correction so that normal display is restored automatically.
  • the geometric correction-based display control system includes an application program service layer (APK Service: Android application package Service), a service layer, and an underlying algorithm library.
  • the application program service layer is used to realize interaction between the projection device and the user. Based on the display of the user interface, the user can configure the various parameters of the projection device and of the display image, and the controller coordinates and calls the algorithm services corresponding to the various functions, so that the projection device can automatically correct the display image when the display becomes abnormal.
  • the service layer may include a correction service, a camera service, a time-of-flight (TOF) service, and the like;
  • the service layer corresponds upward to the application program service layer (APK Service), realizing the specific functions of the projection device's different configuration services; downward, it connects to the algorithm library, cameras, time-of-flight sensors, and other data acquisition services, encapsulating the complex logic of the lower layers.
  • the underlying algorithm library provides correction services and control algorithms for various functions of projection equipment.
  • the algorithm library can perform various mathematical operations based on OpenCV to provide basic computing capabilities for the correction service; OpenCV is a cross-platform computer vision and machine learning software library, released as open source under the BSD license, which can run in a variety of existing operating system environments.
  • the projection device is configured with a gyroscope sensor. While the device is moving, the gyroscope sensor senses the position movement and actively collects movement data; the collected data is then sent through the system framework layer to the application program service layer, where it supports the application data required for user-interface and application interaction, and it can also be used for data calls by the controller when implementing algorithm services.
  • the projection device is configured with a time-of-flight (TOF) sensor; after the time-of-flight sensor collects the corresponding data, the data is sent to the corresponding time-of-flight service of the service layer;
  • after the time-of-flight service acquires the data, it sends the collected data to the application program service layer through the process communication framework, where the data is used for controller data calls, user interfaces, application interaction, and the like.
  • data collected by the camera is sent to the camera service, and the camera service then sends the collected image data to the process communication framework and/or to the projection device correction service; after the correction service receives the collected data sent by the camera service, the controller can call the corresponding control algorithms in the algorithm library for the different functions that need to be realized.
  • data interaction with the application service is performed through the process communication framework, and the calculation results are then fed back to the correction service through the process communication framework; the correction service sends the obtained calculation results to the operating system of the projection device to generate control signal commands, and sends the control signals to the optical machine control driver to control the working state of the optical machine and realize automatic correction of the projection area.
  • the projection device can project the image onto the projection surface, specifically in the projection area preset by the user, which can be a certain area on the wall or a curtain placed by the user in advance. Projected images can be displayed in the area for users to watch.
  • the projection device cannot completely project the image into the projection area.
  • for example, the user may bump into the projection device, changing its position and shifting the projected image so that it is no longer entirely within the projection area.
  • the projection device needs to re-determine the projection area, so as to completely project the image into the projection area, so that the user can watch it.
  • FIG. 8 shows a schematic diagram when the position of the projection device changes in some embodiments.
  • the projection device starts at position A and can completely project into the projection area, which is rectangular.
  • when the projection device moves from position A to position B, the projected image shifts and can no longer lie entirely within the projection area.
  • the user can manually adjust the position and projection angle of the projection device so that the projected image is completely in the projection area.
  • This manual adjustment method is cumbersome and has poor accuracy.
  • the projection device can also photograph the projection surface to obtain an image of the projection surface. Then, the acquired image is subjected to binarization image processing, so that the outline of the object in the image can be displayed more clearly. For the binarized image, all the closed contours contained in it can be extracted, and the closed contour with the largest area and the same internal color can be identified as the projection area for projection.
  • the projection device will recognize the wall as the projection area, causing the image to be projected on the wall instead of the area specified by the user.
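  • For reference, the contour-based identification just described can be sketched with OpenCV (on which this application's algorithm library builds); the threshold choice and the largest-area heuristic below are illustrative assumptions.

```python
import cv2

def find_projection_area(image_bgr):
    """Binarize the photographed projection surface and return the closed
    contour with the largest area as the candidate projection area."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # Binarization makes object outlines in the image clearer
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Extract all closed contours contained in the binarized image
    contours, _ = cv2.findContours(binary, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    # As noted above, this heuristic can wrongly select the whole wall
    # instead of the area specified by the user.
    return max(contours, key=cv2.contourArea)
```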
  • the projection device provided by the embodiment of the present application can accurately determine the projection area, so as to solve the above problems.
  • the projection device has a projection area correction function.
  • the projection device can re-determine the projection area so that the image can be projected entirely into it, thereby realizing correction of the projection area and achieving the effect of the projected image automatically entering the screen.
  • when receiving a projection area correction instruction, the projection device may enable the projection area correction function to correct its current projection area.
  • the projection area correction command refers to a control command for triggering the projection device to automatically correct the projection area.
  • the projection area correction instruction may be an instruction actively input by the user. For example, after the projection device is powered on, it projects an image onto the projection surface; the user can then press a preset projection area correction switch on the projection device, or the projection area correction button on the device's remote control, so that the projection device enables the projection area correction function, re-determines the projection area, and realizes correction of the projection area.
  • the projection area correction instruction can also be generated automatically by the projection device's built-in control program.
  • the projection device actively generates the projection area correction command after it is turned on. For example, when the projection device detects that a video signal is input for the first time after being turned on, it may generate a projection area correction instruction, thereby triggering the projection area correction function.
  • the projection device may also generate the projection area correction instruction by itself during operation. While the projection device is in use, the user may actively move it or accidentally bump it, changing its posture or position so that the image can no longer be projected entirely into the projection area set by the user. In that case, to ensure the user's viewing experience, the projection device can automatically correct the projection area.
  • the projection device can detect its own situation in real time.
  • the projection device detects that its own posture or setting position has changed, it can generate a projection area correction instruction, thereby triggering the projection area correction function.
  • the projection device can monitor its own movement in real time through its configured components and immediately feed the monitoring results back to the controller, so that after the projection device moves, the controller can start the projection area correction function in time and achieve automatic correction of the projection area.
  • the controller receives monitoring data from the gyroscope and TOF sensor to determine whether the projection device moves.
  • the controller may generate a projection area correction command and enable the projection area correction function.
  • the time-of-flight sensor realizes distance measurement and position movement monitoring by adjusting the frequency of the transmitted pulse. Its measurement accuracy will not decrease with the increase of the measurement distance, and it has strong anti-interference ability.
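  • A minimal sketch of this movement-triggered correction is given below; the sensor-reading helpers and both thresholds are assumptions, since the application only specifies that gyroscope and TOF data are monitored and that detected movement triggers the correction instruction.

```python
GYRO_THRESHOLD = 0.5   # rad/s of angular rate treated as movement (assumed)
TOF_THRESHOLD = 0.02   # metres of distance change treated as movement (assumed)

def generate_projection_area_correction_instruction():
    print("projection area correction triggered")  # stand-in for the real instruction

def check_for_movement(read_gyro, read_tof_distance, last_distance):
    """Compare gyroscope and time-of-flight readings against thresholds and
    trigger the projection area correction function when the device moves."""
    angular_rate = read_gyro()        # gyroscope senses rotation of the device
    distance = read_tof_distance()    # TOF sensor measures distance to the projection surface
    if (abs(angular_rate) > GYRO_THRESHOLD
            or abs(distance - last_distance) > TOF_THRESHOLD):
        generate_projection_area_correction_instruction()
    return distance
```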
  • when the projection area correction instruction is detected, the projection device can re-determine the projection area so as to project the image entirely into it, realizing automatic correction of the projection area and achieving the effect of the image automatically entering the screen.
  • Fig. 9 shows an interaction flowchart of components of a projection device in some embodiments of the present application.
  • the projection device may automatically correct the projection area.
  • the projection area correction function can be implemented on the projection image through a preset projection area correction algorithm.
  • the projection device may first determine the projection relationship between the projection surface and the projection device.
  • the projection relationship refers to the relationship by which the projection device projects an image onto the projection surface, specifically the mapping relationship between the chart projected by the device's optical machine and the projection surface.
  • the projection device can determine the location information of the projection area on the projection surface, so as to perform projection and realize correction of the projection area.
  • the projection device may project a first chart onto the projection surface and determine the position information of the projection surface according to the first chart; from the position information of the projection surface, the projection relationship between the projection surface and the projection device can be further determined.
  • based on the binocular camera, the embodiments of the present application can construct the transformation matrix between the projection surface in the world coordinate system and the optical-mechanical coordinate system;
  • this homography relationship between them, also called the projection relationship, makes it possible to realize an arbitrary projection transformation between the projected image and the chart.
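  • As a hedged illustration of using such a homography, the OpenCV sketch below maps chart coordinates onto the projection surface; the four point correspondences are placeholder values, which in practice would come from the detected chart feature points.

```python
import cv2
import numpy as np

# Corners of the chart (source) and their detected positions on the projection
# surface in the camera image (destination); the numbers are assumed examples.
chart_pts = np.float32([[0, 0], [1920, 0], [1920, 1080], [0, 1080]])
surface_pts = np.float32([[102, 88], [1810, 120], [1785, 1002], [90, 980]])

H, _ = cv2.findHomography(chart_pts, surface_pts)

# Any chart point can now be mapped onto the projection surface; the inverse
# homography np.linalg.inv(H) maps surface points back to the chart, which is
# the "arbitrary projection transformation" between the image and the chart.
centre_on_surface = cv2.perspectiveTransform(np.float32([[[960, 540]]]), H)
```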
  • the projection device may first project the first chart.
  • the controller may control the optical machine to project the first chart onto the projection surface (step 902).
  • the controller may also control the camera to photograph the first chart displayed on the projection surface to obtain a first image (step 903).
  • the first chart may contain several feature points, so the first image captured by the camera also includes all the feature points of the first chart.
  • the location information of the projection surface can be determined through the feature points. It should be noted that, for a plane, after the positions of three points in the plane are determined, the position information of the plane can be determined. Therefore, in order to determine the location information of the projection surface, it is necessary to determine the locations of at least three points in the projection surface, that is, the first map needs to include at least three feature points.
  • the location information of the projection surface can be determined according to the at least three feature points.
  • the first chart may contain patterns and color features preset by the user.
  • the first chart may be a checkerboard chart, set as a black-and-white checkerboard.
  • the feature points contained in the checkerboard chart are the corner points of the rectangles.
  • the pattern of the first chart can also be set as a ring chart containing a ring pattern, as shown in Fig. 11; the feature points of the ring chart are the corresponding solid points on each ring.
  • the first chart can also be set as a combination of the above two types of patterns, or as another pattern with identifiable feature points.
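  • For the checkerboard variant of the first chart, the corner feature points can be extracted with standard OpenCV calls, as sketched below; the board size is an assumed parameter that the present application does not fix.

```python
import cv2

def detect_checkerboard_corners(image_gray, pattern_size=(9, 6)):
    """Return the sub-pixel positions of the checkerboard chart's corner
    feature points in the captured image, or None if the chart is not found."""
    found, corners = cv2.findChessboardCorners(image_gray, pattern_size)
    if not found:
        return None
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001)
    return cv2.cornerSubPix(image_gray, corners, (11, 11), (-1, -1), criteria)
```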
  • Fig. 12 shows a schematic flowchart of obtaining a mapping relationship in some embodiments.
  • the controller may control the camera to photograph the first chart to obtain the first image, so as to obtain the positions of the feature points.
  • the camera may be a binocular camera, with one camera arranged on each side of the optical machine.
  • when the binocular camera photographs the first chart, one image is obtained by the left camera and another image is obtained by the right camera.
  • the controller can perform image recognition processing on the two images to obtain the first coordinates of the feature points in any one of the images.
  • the first coordinates refer to coordinate information of feature points in the image coordinate system corresponding to the first image.
  • the image coordinate system refers to the coordinate system with the center of the image as the coordinate origin and the X and Y axes parallel to the two sides of the image.
  • the image coordinate system in the embodiment of the present application can be set as follows: for the projection area preset by the user, the center point of the projection area can be set as the origin, the horizontal direction is the X axis, and the vertical direction is the Y axis.
  • the image coordinate system can be set in advance according to the preset projection area.
  • Step 1201: determine the first coordinates of the feature points in the corrected image; the coordinate information of the feature points in the first image can be determined according to the image coordinate system.
  • a feature point's position in the image captured by the left camera may differ from its position in the image captured by the right camera.
  • the coordinates of the feature point in the camera coordinate system of any one of the left and right cameras can be determined through the two positions of the same feature point in the two images.
  • the second coordinate is used to represent: the coordinate information of the feature point in the camera coordinate system.
  • the camera coordinate system is a rectangular coordinate system established in space with the optical center of the camera as the origin, the optical axis as the Z axis, and the XOY plane parallel to the projection plane.
  • the left camera is used as an example for introduction.
  • Step 1202: obtain, from the first coordinates, the second coordinates of the feature point in the camera coordinate system, denoted P(x, y, z).
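  • One standard way to obtain the second coordinates P(x, y, z) from the two image positions is stereo triangulation, sketched below with OpenCV; the projection matrices would come from a prior stereo calibration of the binocular camera and are assumptions here.

```python
import cv2
import numpy as np

def to_camera_coords(P_left, P_right, pt_left, pt_right):
    """Triangulate one feature point from its first coordinates in the left and
    right images into second coordinates P(x, y, z) in the camera coordinate
    system. P_left and P_right are the 3x4 projection matrices of the two cameras."""
    homogeneous = cv2.triangulatePoints(
        P_left, P_right,
        np.float32(pt_left).reshape(2, 1),
        np.float32(pt_right).reshape(2, 1))
    return (homogeneous[:3] / homogeneous[3]).ravel()  # dehomogenize to (x, y, z)
```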
  • Step 1203: convert the second coordinates into third coordinates of the feature points in the optical-mechanical coordinate system.
  • the third coordinate refers to coordinate information of the feature point in the optical-mechanical coordinate system.
  • the optical-mechanical coordinate system is a rectangular coordinate system established in space with the optical center of the optical machine as the origin, the optical axis as the Z axis, and the plane parallel to the projection plane as the XOY plane.
  • the optical-mechanical coordinate system and the camera coordinate system can be converted to each other, so the coordinates of the feature points in the camera coordinate system can be converted into coordinates in the optical-mechanical coordinate system.
  • the feature points can be converted between the two coordinate systems according to the external parameters between the optical machine and the camera.
  • the external parameters between the optical machine and the camera are equipment parameters marked on the equipment housing or in the manual when the projection device is manufactured. They are usually set based on the function, assembly, manufacture, and parts of the projection device, apply to all projection devices of the same model, and can include the rotation matrix and translation matrix between the optical machine and the camera.
  • the conversion relationship between the optical-mechanical coordinate system and the camera coordinate system can be determined, and the coordinates of the feature points in the optical-mechanical coordinate system can be obtained further.
  • the conversion can be expressed as P'(x', y', z') = R · P(x, y, z) + T, where P'(x', y', z') denotes the coordinates of the feature point in the optical-mechanical coordinate system, R is the rotation matrix between the optical machine and the camera, and T is the translation matrix between the optical machine and the camera.
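  • Applying the external parameters is a one-line matrix operation, sketched below; the numeric rotation and translation are illustrative assumptions, standing in for the factory-calibrated parameters described above.

```python
import numpy as np

R = np.eye(3)                    # rotation matrix between optical machine and camera (assumed)
T = np.array([0.05, 0.0, 0.0])   # translation, e.g. camera 5 cm beside the lens (assumed)

def camera_to_optical_machine(P):
    """Convert a feature point from the camera coordinate system to the
    optical-mechanical coordinate system: P' = R . P + T."""
    return R @ np.asarray(P, dtype=float) + T
```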
  • the projection surface equation of the projection surface in the optical-mechanical coordinate system can be determined.
  • the controller can acquire the first positions of at least three feature points in at least the first image, and determine coordinate information in the camera coordinate system according to the first positions of these feature points. Further, the coordinate information in the camera coordinate system can be converted into coordinate information in the optical machine coordinate system.
  • Step 1204: the controller can fit the coordinates of these feature points to obtain, from the third coordinates, the projection surface equation of the projection surface in the optical-mechanical coordinate system.
  • the projection surface equation can be expressed in the general plane form Ax + By + Cz + D = 0.
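  • Fitting the plane from three or more third coordinates can be done by least squares; the SVD-based sketch below is one common way to do it, offered as an assumption since the present application does not spell out the fitting method.

```python
import numpy as np

def fit_projection_surface(points_xyz):
    """Fit the plane Ax + By + Cz + D = 0 through feature points given in the
    optical-mechanical coordinate system (an (N, 3) array with N >= 3)."""
    pts = np.asarray(points_xyz, dtype=float)
    centroid = pts.mean(axis=0)
    # The plane normal is the right singular vector with the smallest singular value
    _, _, vt = np.linalg.svd(pts - centroid)
    A, B, C = vt[-1]
    D = -float(np.dot(vt[-1], centroid))
    return A, B, C, D
```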
  • Step 1205: after the projection surface equation of the projection surface in the optical-mechanical coordinate system has been determined, obtain the conversion matrix between the optical-mechanical coordinate system and the world coordinate system from the projection surface equation; the conversion matrix is used to characterize the mapping relationship.
  • the world coordinate system is set as follows: the image coordinate system serves as the XOY plane, that is, the projection plane is the XOY plane, with the origin at the center point of the projection area preset by the user; the Z axis is set perpendicular to the projection plane, establishing a spatial coordinate system.
  • the controller can respectively determine the representation of the projection surface in the world coordinate system and the representation of the projection surface in the optical-mechanical coordinate system.
  • the controller may first determine the unit normal vector of the projection surface in the world coordinate system.
  • since the projection plane is the XOY plane of the world coordinate system, the unit normal vector of the projection surface in the world coordinate system can be expressed as n_w = (0, 0, 1)^T.
  • the controller can also obtain the unit normal vector of the projection surface in the optical-mechanical coordinate system from the projection surface equation, n_o = (A, B, C) / √(A² + B² + C²), and the two normals are related by n_o = R1 · n_w.
  • R1 represents the conversion matrix between the optical-mechanical coordinate system and the world coordinate system.
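  • One common way to recover such a rotation is to compute the matrix that aligns the world normal (0, 0, 1) with the fitted plane normal, e.g. via Rodrigues' formula; the patent does not spell out the construction, so this is a sketch under that assumption:

```python
import numpy as np

def rotation_aligning(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Rotation matrix R1 with R1 @ a == b (a, b unit vectors, not
    anti-parallel), built from Rodrigues' formula."""
    v = np.cross(a, b)
    c = float(a @ b)
    if np.isclose(c, 1.0):            # already aligned
        return np.eye(3)
    k = np.array([[0, -v[2], v[1]],
                  [v[2], 0, -v[0]],
                  [-v[1], v[0], 0]])  # skew-symmetric cross-product matrix
    return np.eye(3) + k + k @ k * (1.0 / (1.0 + c))

n_world = np.array([0.0, 0.0, 1.0])
n_engine = np.array([0.1, -0.05, 0.99])
n_engine /= np.linalg.norm(n_engine)
R1 = rotation_aligning(n_world, n_engine)
print(R1 @ n_world)                    # ~ n_engine
```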
  • the conversion matrix can represent the mapping relationship between the chart projected by the optical machine and the projection surface.
  • after determining the mapping relationship, the controller can convert the coordinates of a point between the world coordinate system and the optical-mechanical coordinate system.
  • the coordinate representation of the target area in the world coordinate system can be determined according to the position information of the target area in the projection plane, and the controller can convert this world-coordinate representation into a representation in the optical-mechanical coordinate system according to the conversion matrix, so as to determine the position information of the target area for the projection device; the chart can then be projected directly into the target area.
  • when the projection device is not performing the projection process, there is no projection area on the projection surface, and the specific projection area cannot be determined directly.
  • the projection device can determine the state of the chart to be projected, the projection angle, etc., so as to determine the coordinate representation, in the optical-mechanical coordinate system, of the projection area to be projected.
  • the controller can transform the coordinate representation in the optical machine coordinate system into the coordinate representation in the world coordinate system according to the transformation matrix, so as to determine the position information of the projection area to be projected on the projection plane.
  • the coordinate representation of the projection area can thus be converted freely between the world coordinate system and the optical-mechanical coordinate system.
  • the projection area can be re-determined to realize correction of the projection area.
  • the initial projection area where the optical machine projects the chart onto the projection surface can be obtained first.
  • the projection area to be corrected in the initial projection area is the display area of the image projected by the projection device on the projection plane before the projection area is corrected.
  • the projection device may be in the process of projection, and at this time, the projection image is being displayed in the initial projection area on the projection surface.
  • the user has turned on the projection device, and the projection device has performed a projection process to form a projection image on the projection surface, and the area where the image is currently displayed on the projection surface is the initial projection area.
  • the current initial projection area and the target projection area set by the user may not be the same projection area, causing the projected image to deviate from the position set by the user, and the projection device needs to correct the initial projection area.
  • the projection device may also not be in the process of projection; at this time no projection image is displayed on the projection surface, so the initial projection area is not displayed directly. In this case, the area into which the projection device is about to project is the initial projection area. The projection device can still determine the position information of the area into which it is about to project, that is, the position information of the initial projection area. However, since the projection process has not actually been carried out, the user cannot see the specific initial projection area and cannot confirm whether it is the same area as the set target projection area. At this time, in order to ensure that the projection device can project the image completely into the target projection area, the user can control the projection device to correct the projection area directly, or the projection device can perform projection area correction actively, for example when it is turned on.
  • the controller may obtain the initial projection area according to the mapping relationship.
  • the controller may first determine the initial position information when the optical machine projects the chart onto the projection surface.
  • the initial position information is the position information of the initial projection area in the optical-mechanical coordinate system, and specifically may be the vertex coordinates of the initial projection area.
  • the user can control the projection device to project the chart, thereby forming a projection image on the projection surface.
  • the chart can be an image selected by the user, containing content such as pictures or videos, and its shape is generally rectangular. Therefore, the image of the chart projected onto the projection surface can also be rectangular, which is convenient for users to watch. That is, the projection area corresponding to the projection image is a rectangular area, and once the coordinates of the four vertices of the rectangular area are determined, the position of the rectangular area is determined.
  • no matter whether the projection device has performed the projection process or not, it can determine the area into which it is about to project or has already projected.
  • the projection device can determine the state of the chart to be projected, the projection angle, etc., so as to determine the projection position of the optical machine, that is, the initial position information of the chart projected by the optical machine. Therefore, the controller can obtain the coordinate representation of the initial projection area in the optical-mechanical coordinate system.
  • the controller can convert the initial position information, that is, the vertex coordinates of the initial projection area, into position information in the world coordinate system. Specifically, it can be the vertex coordinates of the initial projection area in the world coordinate system. After determining the position information of the initial projection area in the world coordinate system, the specific initial projection area is determined.
  • the area of the image displayed on the projection surface is the initial projection area.
  • the initial projection area can also be obtained by the following methods:
  • the controller can control the camera to photograph the projection surface; the projection surface image obtained at this time can include the chart being projected by the projection device, and the area where the chart is located is the initial projection area.
  • the controller can identify the projection surface image, determine the coordinate information of the four vertices of the chart, and thereby determine the position of the initial projection area.
  • the controller can control the optical machine to project a preset chart onto the projection surface.
  • the preset chart is used to distinguish it from the projection surface; for example, it can be a green chart.
  • the controller controls the camera to take pictures of the projection surface, and can recognize the complete outline of the chart, thereby determining the position information of the initial projection area.
  • the controller may perform contour correction on the initial projection area to obtain the target projection area.
  • the target projection area is the projection area set by the user. Therefore, after contour correction is performed on the initial projection area, the projection area set by the user can be re-determined, thereby realizing the correction process of the projection area.
  • the controller may first control the camera to photograph the projection surface to obtain an image of the projection surface.
  • the projection surface image will contain a variety of environmental objects, such as curtains, TV cabinets, walls, ceilings, coffee tables, and so on.
  • the position information of the object can be determined. For example, determining the coordinates of the four vertices of the curtain also determines the position information of the curtain on the projection surface.
  • an initial projection area image corresponding to the initial projection area may be generated. Specifically, since the vertex coordinates of the initial projection area in the world coordinate system have already been obtained, the four vertices of the initial projection area can first be marked in the projection surface image, and then connected to form the initial projection area. At this time, the initial projection area is also displayed in the projection surface image.
  • the four vertices of the initial projection area may be respectively corrected to obtain four corrected target vertices.
  • the area formed by the four target vertices can be considered as the target projection area.
  • the target vertices can be obtained according to the vertices of the initial projection area image; each of the four vertices of the initial projection area image corresponds to one corrected target vertex.
  • the controller may perform corner detection processing on the initial projection area image, for example using the Shi-Tomasi corner detection algorithm, to obtain all the corner points contained in the initial projection area image.
  • a corner point can be an extreme point of gray-value or brightness change in the initial projection area image, such as a vertex of an object's outline. Therefore, the vertices of the outlines of objects contained in the image, such as the vertices of the curtain, can be determined through the corner points.
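  • A minimal Shi-Tomasi corner detection sketch with OpenCV (the file name and parameter values are illustrative, not taken from the patent):

```python
import cv2
import numpy as np

img = cv2.imread("projection_surface.png")          # hypothetical input image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Shi-Tomasi corner detection: up to 200 corners; quality and
# minimum-distance thresholds chosen for illustration only.
corners = cv2.goodFeaturesToTrack(gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=10)
corners = corners.reshape(-1, 2) if corners is not None else np.empty((0, 2))
print(f"{len(corners)} corner points detected")
```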
  • a first correction area of a preset first size may be determined with the vertex as the center.
  • initial projection area images of different sizes may correspond to different first sizes; for example, for an initial projection area image of 1280*720, the first size may be set to 20 pixels. Therefore, for each vertex, the vertex can be taken as the center of a circle and 20 pixels as the radius to determine the first correction area corresponding to that vertex.
  • within the first correction area, the controller may obtain the corner point closest to the center of the circle, that is, to the vertex of the initial projection area image; this corner point is referred to as the first corner point in the embodiments of the present application.
  • the first corner point can be regarded as the target vertex corresponding to the vertex of the initial projection area image.
  • the four vertices of the image of the initial projection area and the corresponding target vertices can be determined.
  • the four target vertices form the target projection area.
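  • Putting the last few steps together, a hedged sketch of the vertex correction (the radius and function names are assumptions for illustration):

```python
import numpy as np

def correct_vertices(vertices: np.ndarray, corners: np.ndarray,
                     radius: float = 20.0):
    """For each initial-projection-area vertex, pick the detected corner
    closest to it within the first correction area (a circle of the
    given radius); these corners become the target vertices."""
    targets = []
    for v in vertices:
        d = np.linalg.norm(corners - v, axis=1)
        in_area = np.where(d <= radius)[0]
        if in_area.size == 0:
            targets.append(None)   # no candidate: screen entry fails here
        else:
            targets.append(corners[in_area[np.argmin(d[in_area])]])
    return targets
```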
  • the error between the target projection area obtained by the above method and the real outline of the screen is extremely small, so the target projection area can be considered to be the screen area. A pixel-level screen recognition effect can thus be realized, so that the image can be accurately controlled to enter the screen, greatly improving the user's viewing experience.
  • in a relatively dark scene, the outline of the curtain appears relatively dark.
  • during projection, the projection image is displayed on the projection surface and therefore appears in the projection surface image. Since the projected image is bright and the outline of the curtain is relatively dark, the edges of the projected image may be recognized as corner points, so that the first corner point actually belongs to the projected image and the projected image is mistakenly identified as the curtain. Therefore, after the first corner point is obtained, it can be further checked.
  • the controller may first determine the position information of the first corner point, that is, the coordinates of the first corner point in the projection plane image.
  • the controller can directly convert the coordinates in the image coordinate system to the coordinates in the world coordinate system.
  • the coordinates of the first corner point in the world coordinate system can be converted into coordinates in the optical-mechanical coordinate system, which are referred to as corner point coordinates in the embodiment of the present application.
  • the controller knows the vertex coordinates of the initial projection area, so the vertex coordinates and the corner point coordinates can be compared.
  • the controller can judge whether the vertex coordinates and the corner point coordinates meet a preset second size condition.
  • the preset second size condition is set as follows: the distance between the vertex and the corner point is larger than a preset second size. The second size can be preset by the relevant technical personnel, and can be set to be the same as the first size or different from it, which is not limited in the embodiments of the present application.
  • if the vertex coordinates and the corner point coordinates satisfy the preset second size condition, it means that the vertex of the initial projection area is far away from the first corner point, and the first corner point does not belong to the edge of the initial projection area.
  • the first corner point can be determined as the target vertex, so as to obtain the target projection area, and the target projection area is the screen area.
  • otherwise, the distance between the vertex of the initial projection area and the first corner point is relatively close, and the first corner point belongs to the edge of the initial projection area.
  • in that case the first corner point cannot be used as the target vertex, and the screen entry operation will not be performed; that is, the projection device will not perform the normal projection process.
  • when the vertex coordinates and the corner point coordinates do not meet the preset second size condition, that is, when the first corner point cannot be used as the target vertex, the first corner point may be deleted. The controller can then continue to search for a new first corner point among the remaining corner points in the initial projection area image, and continue to check each new first corner point until a first corner point that meets the condition is obtained and used as the target vertex. If there is no qualified first corner point, the screen entry operation will not be performed, and the projection device will not project.
  • the projection device if the target vertex cannot be obtained, the projection device will not perform projection.
  • the projection device can prompt the user, for example by projecting a screen-entry failure message, to remind the user that the projection device cannot enter the screen normally at its current location.
  • the user can adjust the position of the projection device, and the projection device can correct the projection area again until the target projection area, that is, the screen area, is identified.
  • position information of the target projection area on the projection plane is obtained at this time.
  • the target projection area has four vertices
  • the chart projected by the projection device also has four vertices. In order to project accurately, a one-to-one correspondence between the four vertices of the target projection area and the four vertices of the chart can be obtained.
  • for example, the upper-left vertex of the chart is projected to the upper-left vertex of the target projection area, so as to ensure that the chart can be displayed normally in the target projection area.
  • for the target projection area, the four vertices of upper-left, upper-right, lower-left and lower-right can be determined, so as to obtain the correspondence between the target projection area and the chart.
  • the projection device needs to further determine which of the four target vertices in the target projection area is which, so as to determine the correspondence for projecting the chart.
  • the controller may first determine the state of the target projection area.
  • the target projection area has a forward projection state and an oblique state.
  • the forward projection state refers to a state that is level relative to the reference area, without tilt.
  • the controller can obtain the coordinate information of the four target vertices in the image coordinate system or in the world coordinate system; either of the two coordinate systems can be used.
  • the controller can further determine whether the difference between the X coordinates of the two target vertices satisfies a preset coordinate condition.
  • the preset coordinate condition may be set as: the difference between the X coordinates of the two target vertices is smaller than the preset size.
  • the corresponding relationship of the four target vertices can be further determined according to the state of the target projection area.
  • the controller may first classify the four target vertices to determine two target vertices with smaller X coordinates and two target vertices with larger X coordinates.
  • Fig. 13 shows a schematic diagram of a target projection area in some embodiments of the present application.
  • the coordinates of the four target vertices in the target projection area are: A(x1, y2), B(x1, y1), C(x2, y2) and D(x2, y1).
  • Vertices A and B are two target vertices with smaller X coordinates
  • C and D are two target vertices with larger X coordinates.
  • the controller may determine the target vertex with the larger Y coordinate among the two target vertices with the smaller X coordinate as the first target vertex, and the target vertex with the smaller Y coordinate as the second target vertex. That is, vertex A is determined as the first target vertex, and vertex B is determined as the second target vertex.
  • the controller may determine the target vertex with the larger Y coordinate among the two target vertices with the larger X coordinate as the third target vertex, and the target vertex with the smaller Y coordinate as the fourth target vertex. That is, vertex C is determined as the third target vertex, and vertex D is determined as the fourth target vertex.
  • the first target vertex can be set as the upper left vertex, the second target vertex as the lower left vertex, the third target vertex as the upper right vertex, and the fourth target vertex as the lower right vertex.
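  • A sketch of this classification for the forward-projection state (image coordinates with Y increasing upward are assumed, as in the world coordinate system of Fig. 13):

```python
import numpy as np

def order_vertices_forward(pts: np.ndarray):
    """Order four target vertices of a forward-projection (untilted) area
    as (upper-left, lower-left, upper-right, lower-right).
    Assumes Y increases upward."""
    by_x = pts[np.argsort(pts[:, 0])]
    left, right = by_x[:2], by_x[2:]             # two smaller-X, two larger-X
    ul, ll = sorted(left, key=lambda p: -p[1])   # larger Y first (upper)
    ur, lr = sorted(right, key=lambda p: -p[1])
    return ul, ll, ur, lr
```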
  • the four target vertices can thus be matched one-to-one with the four vertices of the target chart, so as to determine the specific projection relationship between the target projection area and the chart.
  • the controller may first determine two target vertices with smaller X coordinates and two target vertices with larger X coordinates.
  • the target projection area may be inclined to the left or to the right; therefore, the controller may first determine the tilt direction of the target projection area.
  • the controller can obtain the slope of the straight line where the two target vertices with smaller X coordinates are located.
  • Figure 14 shows a schematic diagram of the target projection area in some embodiments. Wherein, the coordinates of the four target vertices in the target projection area are: A(x1, y1), B(x2, y2), C(x3, y3) and D(x4, y4). Vertices A and B are two target vertices with smaller X coordinates, from which the slope K of the straight line AB is determined.
  • the inclination direction of the target projection area can be determined through the slope K, specifically, if the slope is less than 0, it is determined that the target projection area is in the first inclination state, that is, it is inclined to the left. If the slope is greater than 0, it is determined that the target projection area is in the second inclined state, that is, inclined to the right.
  • the slope K of the straight line AB in Fig. 15 is obviously less than 0, so it slopes to the left.
  • a 16:9 screen is taken as an example for the description.
  • the controller can sort the X coordinates of the four target vertices from small to large, and in ascending order determine the four target vertices in turn as the first target vertex, the second target vertex, the third target vertex and the fourth target vertex.
  • vertex A is the first target vertex, that is, the upper left vertex.
  • Vertex B is the second target vertex, namely the lower left vertex.
  • Vertex C is the third target vertex, namely the upper right vertex.
  • Vertex D is the fourth target vertex, namely the lower right vertex.
  • in the second tilt state, the controller can determine the four target vertices, in order of X coordinate from small to large, as the second target vertex, the first target vertex, the fourth target vertex and the third target vertex.
  • vertex A is the first target vertex, that is, the upper left vertex.
  • Vertex B is the second target vertex, namely the lower left vertex.
  • Vertex C is the third target vertex, namely the upper right vertex.
  • Vertex D is the fourth target vertex, namely the lower right vertex.
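  • A hedged sketch combining the tilt test and the tilted-state ordering described above (names and tie-breaking details are assumptions; the forward state is handled separately, as shown earlier):

```python
import numpy as np

def order_vertices_tilted(pts: np.ndarray):
    """Classify the tilt of the target projection area from the slope of
    the line through the two smaller-X vertices, then assign roles by
    ascending X coordinate as described above."""
    by_x = pts[np.argsort(pts[:, 0])]
    a, b = by_x[0], by_x[1]                      # two smaller-X vertices
    k = (b[1] - a[1]) / (b[0] - a[0])            # slope K of line AB
    if k < 0:   # first tilt state: leaning left
        first, second, third, fourth = by_x[0], by_x[1], by_x[2], by_x[3]
    else:       # second tilt state: leaning right
        second, first, fourth, third = by_x[0], by_x[1], by_x[2], by_x[3]
    # first..fourth = upper-left, lower-left, upper-right, lower-right
    return first, second, third, fourth
```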
  • here a 16:9 screen is used for the description. In this case the length of the screen is greater than its width, so the X coordinate is used for the calculation. If the length of the curtain is smaller than its width, the Y coordinate can be used instead.
  • the specific method is similar to the above, and will not be repeated here.
  • a rotating assembly may also be provided behind the screen.
  • the controller can calculate, from the slope of the target projection area, the rotation angle that restores it to the forward projection state, send the rotation angle to the rotating assembly, and control the rotating assembly to rotate by this angle, so that the target projection area returns to the forward projection state and the user's viewing experience is improved.
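  • For instance, the restoring angle can be derived from the slope of an edge with an arctangent; the sketch below measures the left edge's deviation from vertical, and the sign convention of the rotating assembly is an assumption:

```python
import math

def restore_angle_degrees(slope_k: float) -> float:
    """Angle (degrees) between the left edge of the tilted area and the
    vertical; rotating the screen by this angle (direction depends on the
    tilt and on the assembly's convention) restores the forward state."""
    edge_angle = math.degrees(math.atan(slope_k))   # edge vs. horizontal
    return 90.0 - abs(edge_angle)

print(restore_angle_degrees(-5.0))   # steep left-leaning edge -> ~11.3 degrees
```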
  • position information of the target projection area in the world coordinate system may be acquired.
  • the controller can convert the position information in the world coordinate system into the position information in the optical machine coordinate system to obtain the target position information.
  • the controller can control the optical machine to project the target chart to the target projection area, so as to realize the correction of the projection area.
  • Fig. 16 shows a schematic diagram of the projection area before and after correction in some embodiments of the present application. After the projection device moves from position A to position B, the projected image will shift and cannot be completely in the projection area. The projection device can perform projection area correction processing, so that the projection device can completely project into the projection area when it is at the B position.
  • the embodiment of the present application also provides a method for correcting a projection area, which is applied to a projection device, as shown in FIG. 17 , the method includes:
  • Step 1701: in response to the projection area correction instruction, obtain the mapping relationship between the chart projected by the optical machine and the projection surface.
  • Step 1702: obtain, according to the mapping relationship, the initial projection area into which the optical machine projects the chart on the projection surface.
  • Step 1703 performing contour correction on the initial projection area to obtain the target projection area.
  • Step 1704 determine the target position information of the target projection area according to the mapping relationship.
  • Step 1705 according to the target position information, control the optical machine to project the target chart to the target projection area.
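  • As an overview, the five steps can be read as the following pipeline skeleton (the function names are purely illustrative stand-ins for the procedures detailed above):

```python
def correct_projection_area(device):
    """Illustrative outline of steps 1701-1705; the helper names are
    hypothetical, not a real device API."""
    mapping = device.get_chart_to_surface_mapping()             # step 1701
    initial_area = device.get_initial_projection_area(mapping)  # step 1702
    target_area = device.contour_correct(initial_area)          # step 1703
    target_pos = device.locate_in_engine_coords(target_area,
                                                mapping)        # step 1704
    device.project_chart(target_pos)                            # step 1705
```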
  • FIG. 18 shows a schematic structural diagram of a projection device according to an embodiment of the present application.
  • the laser light source in the projection device may include a blue laser 201-20, a red laser 202-20, and a green laser 203-20, and the projection device may also be called a three-color projection device.
  • the blue laser 201-20, the red laser 202-20 and the green laser 203-20 are all MCL-type packaged lasers, which are small in size and facilitate the compact arrangement of optical paths.
  • the at least one brightness sensor may include a first brightness sensor 401-40, a second brightness sensor 402-40 and a third brightness sensor 403-40, wherein the first brightness sensor 401-40 is a blue light brightness sensor or a white light brightness sensor, the second brightness sensor 402-40 is a red light brightness sensor or a white light brightness sensor, and the third brightness sensor 403-40 is a green light brightness sensor or a white light brightness sensor.
  • the first brightness sensor 401-40 is set in the light output path of the blue laser 201-20, specifically, it can be set on the side of the light output path of the collimated beam of the blue laser 201-20.
  • the second brightness sensor 402-40 is arranged in the light output path of the red laser 202-20, specifically on one side of the light output path of the collimated beam of the red laser 202-20.
  • the third brightness sensor 403-40 is arranged in the light output path of the green laser 203-20, specifically on one side of the light output path of the collimated beam of the green laser 203-20. Since the laser light emitted by a laser does not attenuate in its light output path, arranging the brightness sensor in the laser's light output path improves the accuracy with which the brightness sensor detects the first brightness value of the laser.
  • the display control circuit is also used to read the brightness value detected by the first brightness sensor 401-40 when controlling the blue laser 201-20 to emit blue laser light, and to stop reading the brightness value detected by the first brightness sensor 401-40 when controlling the blue laser 201-20 to turn off.
  • the display control circuit is also used to read the brightness value detected by the second brightness sensor 402-40 when controlling the red laser 202-20 to emit red laser light, and to stop reading the brightness value detected by the second brightness sensor 402-40 when controlling the red laser 202-20 to turn off.
  • the display control circuit is also used to read the brightness value detected by the third brightness sensor 403-40 when controlling the green laser 203-20 to emit green laser light, and to stop reading the brightness value detected by the third brightness sensor 403-40 when controlling the green laser 203-20 to turn off.
  • FIG. 19 shows a schematic structural diagram of a projection device according to another embodiment of the present application.
  • the projection device may further include a light pipe 110, which serves as a light-collecting optical component for receiving and homogenizing the three-color laser light output in the combined-light state.
  • the brightness sensor may include a fourth brightness sensor 404, which may be a white light brightness sensor.
  • the fourth brightness sensor 404 is disposed in the light exit path of the light pipe 110 , for example, on the light exit side of the light pipe, close to its light exit surface.
  • the above-mentioned fourth brightness sensor is a white light brightness sensor.
  • the display control circuit is also used to read the brightness value detected by the fourth brightness sensor 404 when controlling the blue laser 201-20, the red laser 202-20 and the green laser 203-20 to turn on in a time-sharing manner, so as to ensure that the fourth brightness sensor 404 can detect the first brightness value of the blue laser 201-20, the first brightness value of the red laser 202-20 and the first brightness value of the green laser 203-20; and to stop reading the brightness value detected by the fourth brightness sensor 404 when the blue laser 201-20, the red laser 202-20 and the green laser 203-20 are all turned off.
  • the fourth brightness sensor 404 is always on when the projection device is projecting images.
  • the projection device may further include a fourth dichroic film 604, a fifth dichroic film 605, a fifth reflector 904, a second lens assembly, a diffusion wheel 150, TIR lens 120, DMD 130 and projection lens 140.
  • the second lens assembly includes a first lens 901-90, a second lens 902-90 and a third lens 903-90.
  • the fourth dichroic film 604 can transmit blue laser light and reflect green laser light.
  • the fifth dichroic film 605 can transmit red laser light and reflect green laser light and blue laser light.
  • the blue laser light emitted by the blue laser 201-20 passes through the fourth dichroic film 604, and then is reflected by the fifth dichroic film 605 and enters the first lens 901-90 for condensing.
  • the red laser light emitted by the red laser 202-20 passes through the fifth dichroic film 605 and directly enters the first lens 901-90 for focusing.
  • the green laser light emitted by the green laser 203-20 is reflected by the fifth reflector 904, reflected by the fourth dichroic film 604 and the fifth dichroic film 605 in turn, and then enters the first lens 901-90 for focusing.
  • after being concentrated by the first lens 901-90, the blue laser light, red laser light and green laser light pass through the rotating diffusion wheel 150 in a time-sharing manner to dissipate speckle, are projected into the light pipe 110 for light homogenization, are shaped by the second lens 902-90 and the third lens 903-90, enter the TIR lens 120 for total internal reflection, are reflected by the DMD 130, pass through the TIR lens 120 again, and are finally projected through the projection lens 140 onto the display screen to form the image to be displayed.
  • FIG. 20 shows a schematic diagram of a signaling interaction sequence of a projection device implementing the anti-eye function according to another embodiment of the present application.
  • the projection device provided by the present application can realize the anti-eye function.
  • the controller can control the user interface to display corresponding prompt information to remind the user to leave the current area, and can also control the user interface to reduce the display brightness, so as to prevent the laser from damaging the user's eyesight.
  • when the projection device is configured in children's viewing mode, the controller will automatically turn on the anti-eye switch.
  • the controller controls the projection device to turn on the anti-eye switch.
  • when the data collected by the time-of-flight (TOF) sensor, the camera and other devices triggers any preset threshold condition, the controller will control the user interface to reduce the display brightness, display prompt information, and reduce the transmitting power, brightness and intensity of the optical machine, in order to protect the user's eyesight.
  • the projection device controller can control the correction service to send signaling to the time-of-flight sensor; step 2001: query the current device status of the projection device; the controller then receives the data fed back by the time-of-flight sensor.
  • Step 2002: the correction service can send signaling to the process communication framework (HSP Core) notifying the algorithm service to start the anti-eye process.
  • step 2003: the process communication framework (HSP Core) will call the service capability from the algorithm library to invoke the corresponding algorithm services, which may include, for example, camera detection algorithms, screenshot algorithms and foreign object detection algorithms.
  • the timing is shown in Figure 20.
  • when the anti-eye switch of the projection device is turned on and the user enters a preset specific area, the projection device will automatically reduce the intensity of the laser light emitted by the optical machine, reduce the display brightness of the user interface, and display safety prompt information.
  • the projection device can control the above-mentioned anti-eye function through the following methods, as shown in Figure 25:
  • based on the projection picture acquired by the camera, the controller uses an edge detection algorithm to identify the target projection area of the projection device; when the target projection area is displayed as a rectangle, the controller obtains the coordinate values of the four vertices of the rectangular target projection area through a preset algorithm.
  • the perspective transformation method can be used to correct the target projection area into a rectangle, and the difference between this rectangle and the projection screenshot can be calculated to determine whether there is a foreign object in the display area; if the judgment result is that there is a foreign object, the projection device automatically triggers the anti-eye function.
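  • A hedged OpenCV sketch of this check (the threshold, the vertex ordering of `quad`, and the way the screenshot is obtained are assumptions):

```python
import cv2
import numpy as np

def foreign_object_in_area(camera_img, screenshot, quad, thresh=30.0):
    """Warp the photographed target projection area (given by its four
    vertices `quad`, ordered to match `dst`) to the screenshot's
    rectangle, then compare; a large mean absolute difference suggests
    a foreign object in the display area."""
    h, w = screenshot.shape[:2]
    dst = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    m = cv2.getPerspectiveTransform(np.float32(quad), dst)
    rectified = cv2.warpPerspective(camera_img, m, (w, h))
    diff = cv2.absdiff(rectified, screenshot)
    return float(diff.mean()) > thresh
```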
  • Step 2501: turn on the anti-eye function.
  • the difference between the camera content of the current frame and that of the previous frame can be used to determine whether a foreign object has entered the area outside the projection range; if it is judged that a foreign object has entered, the projection device automatically triggers the anti-eye function.
  • the projection device can also use a time-of-flight (ToF) camera or time-of-flight sensor to detect real-time depth changes in a specific area; if the depth value changes beyond a preset threshold, the projection device automatically triggers the anti-eye function.
  • the projection device judges whether to enable the anti-eye function based on the collected time-of-flight data, screenshot data, and camera data analysis.
  • step 2502: collect time-of-flight data; step 2503: the controller performs depth difference analysis; step 2504: if the depth difference is greater than the preset threshold X (when the preset threshold X is implemented as 0, any change indicates that a foreign object has entered the specific area of the projection device); step 2505: the screen is dimmed and a prompt pops up. If a user is located in the specific area and his eyesight is at risk of being damaged by the laser, the projection device automatically activates the anti-eye function to reduce the intensity of the laser emitted by the optical machine, reduce the display brightness of the user interface, and display safety prompt information.
  • step 2506: collect screenshot data; step 2507: analyze the color addition mode (RGB) difference; step 2508: if the color addition mode difference is greater than the preset threshold Y, it can be determined that there is a foreign object in the specific area of the projection device; step 2505: the screen is dimmed and a prompt pops up. If there is a user in the specific area whose eyesight is at risk of being damaged by the laser, the projection device automatically activates the anti-eye function to reduce the intensity of the emitted laser, reduce the display brightness of the user interface, and display the corresponding safety prompt information.
  • step 2509: collect camera data; step 2510: obtain the projection coordinates; step 2511: determine the target projection area of the projection device; step 2507: perform color addition mode (RGB) difference analysis in the target projection area; step 2508: if the color addition mode difference is greater than the preset threshold Y, it can be determined that there is a foreign object in the specific area of the projection device; step 2505: the screen is dimmed and a prompt pops up. If there is a user in the specific area whose eyesight is at risk of being damaged by the laser, the projection device automatically activates the anti-eye function, reduces the intensity of the emitted laser, reduces the display brightness of the user interface, and displays the corresponding safety prompt information.
  • Step 2512 the acquired projection coordinates are in the extended area;
  • Step 2513 the controller can still perform color additive mode (RGB) difference analysis in the expanded area;
  • Step 2514: if the color addition mode difference is greater than the preset threshold Y, it can be determined that there is a foreign object in the specific area of the projection device; step 2505: the screen is dimmed and a prompt pops up. If there is a user in the specific area whose eyesight may be damaged by the laser emitted by the projection device, the projection device automatically activates the anti-eye function, reduces the laser intensity, reduces the display brightness of the user interface, and displays the corresponding safety prompt information.
  • FIG. 21 shows a schematic diagram of a signaling interaction sequence of a projection device implementing a display image correction function according to another embodiment of the present application.
  • the projection device can monitor the movement of the device through a gyroscope or a gyroscope sensor.
  • the correction service sends signaling to the gyroscope to query the device status, and receives the signaling fed back by the gyroscope to determine whether the device has moved.
  • the display correction strategy of the projection device can be configured such that when the gyroscope and the time-of-flight sensor change simultaneously, the projection device triggers keystone correction first;
  • the algorithm service starts the keystone correction process;
  • the controller starts and triggers the keystone correction; the controller can also configure the projection device not to respond to commands issued by the remote control buttons while keystone correction is in progress. To cooperate with the keystone correction, the projection device will display a pure white chart.
  • the keystone correction algorithm can construct, based on the binocular camera, the transformation matrices between the projection surface in the world coordinate system and the optical-mechanical coordinate system; it then combines the optical-mechanical intrinsic parameters to calculate the homography between the projection picture and the played chart, and uses the homography to realize arbitrary shape conversion between the projected picture and the played chart.
  • the correction service sends a signaling for informing the algorithm service to start the keystone correction process to the process communication framework (HSP CORE), and the process communication framework further sends a service capability call signaling to the algorithm service to obtain the capability corresponding algorithm;
  • the algorithm service obtains and executes the camera and picture algorithm processing service and the obstacle avoidance algorithm service, and sends them to the process communication framework in the form of signaling; in some embodiments, the process communication framework executes the above algorithms and feeds back the execution results to the Calibration service, the execution results may include successful photographing and successful obstacle avoidance.
  • if a failure result is returned, the user interface will be controlled to display an error prompt, and the user interface will be controlled to display the keystone correction and autofocus charts again.
  • the projection device can identify the screen, and use projection transformation to correct the projected picture so that it is displayed inside the screen, achieving the effect of aligning with the edges of the screen.
  • the projection device can use the time-of-flight (ToF) sensor to obtain the distance between the optical machine and the projection surface, look up the best image distance in a preset mapping table based on that distance, and use an image algorithm to evaluate the sharpness of the projected picture, on the basis of which the image distance can be fine-tuned.
  • the automatic keystone correction signaling sent by the correction service to the process communication framework may include other function configuration instructions, for example, may include control instructions such as whether to implement synchronous obstacle avoidance, whether to enter a scene, etc.
  • the process communication framework sends service capability call signaling to the algorithm service, so that the algorithm service acquires and executes the autofocus algorithm to adjust the viewing distance between the device and the screen; in some embodiments, after applying the autofocus algorithm to realize the corresponding function, the algorithm service may also obtain and execute an automatic screen-entry algorithm, which may include a keystone correction algorithm.
  • the projection device automatically enters the screen, and the algorithm service can set the 8-position coordinates between the projection device and the screen; then, through the autofocus algorithm again, the viewing distance between the projection device and the screen is adjusted; finally, the correction result is fed back to the correction service; step 2104: control the user interface to display the correction result, as shown in FIG. 21.
  • using an autofocus algorithm, the projection device obtains the current object distance with its configured laser ranging to calculate the initial focal length and search range; the projection device then drives the camera to take pictures and uses the corresponding algorithm to perform sharpness evaluation.
  • the projection device searches for the best possible focal length based on the search algorithm, then repeats the above steps of photographing and sharpness evaluation, and finally finds the optimal focal length through sharpness comparison to complete autofocus.
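  • A minimal sketch of such a focus search, using Laplacian-variance sharpness and a simple sweep (the `device.set_focus` / `device.capture` interfaces are hypothetical, and the real search algorithm may be more refined):

```python
import cv2
import numpy as np

def sharpness(img) -> float:
    """Variance of the Laplacian: a common image-sharpness score."""
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    return float(cv2.Laplacian(gray, cv2.CV_64F).var())

def autofocus(device, focus_min: float, focus_max: float, steps: int = 15):
    """Sweep candidate focal positions around the range estimated from
    laser ranging, photograph at each, and keep the sharpest."""
    best_focus, best_score = focus_min, -1.0
    for f in np.linspace(focus_min, focus_max, steps):
        device.set_focus(f)
        score = sharpness(device.capture())
        if score > best_score:
            best_focus, best_score = f, score
    device.set_focus(best_focus)
    return best_focus
```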
  • step 2201: the projection device is started; step 2202: the user moves the device, and the projection device automatically completes correction and refocusing; step 2203: the controller detects whether the autofocus function is enabled, and if it is not enabled, the controller ends the autofocus service; step 2204: when the autofocus function is enabled, the projection device obtains the detection distance of the time-of-flight (TOF) sensor through the middleware for calculation.
  • Step 2205 the controller queries the preset mapping table according to the obtained distance to obtain the approximate focal length of the projection device;
  • step 2206 the middleware sets the obtained focal length to the optical engine of the projection device;
  • Step 2207 after the optical machine emits laser light with the above focal length, the camera will execute the photographing command;
  • Step 2208: the controller judges whether the projection device is in focus according to the acquired photographing result and the evaluation function; if the judgment result meets the preset completion conditions, the autofocus process is ended.
  • step 2209: if the judgment result does not meet the preset completion conditions, the middleware fine-tunes the focal length parameters of the projection device's optical machine, for example gradually fine-tuning the focal length by a preset step length, and sets the adjusted focal length parameter to the optical machine again, thereby repeating the steps of photographing and sharpness evaluation until the optimal focal length is found through sharpness comparison and autofocus is completed, as shown in Figure 22.
  • the projection device provided by the present application can implement a display correction function through a keystone correction algorithm.
  • two sets of external parameters, between the two cameras and between the camera and the optical machine, can be obtained, that is, the rotation and translation matrices; then a specific checkerboard chart is played through the optical machine of the projection device, and the depth values of the projected checkerboard corner points are calculated, for example by solving the xyz coordinate values through the translation relationship between the binocular cameras and the principle of similar triangles; then the projection surface is fitted based on the xyz coordinates, and the rotation and translation relationship with the camera coordinate system is obtained, which can specifically include the pitch relationship (Pitch) and the yaw relationship (Yaw).
  • the Roll parameter value can be obtained through the gyroscope configured in the projection device to assemble the complete rotation matrix, and finally the external parameters from the projection surface in the world coordinate system to the optical-mechanical coordinate system are calculated.
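  • A sketch of assembling the full rotation from the fitted pitch/yaw and the gyroscope roll (the Z·Y·X composition order is an assumption; Euler conventions vary by implementation):

```python
import numpy as np

def rotation_from_euler(pitch: float, yaw: float, roll: float) -> np.ndarray:
    """Compose a rotation matrix from pitch (about X), yaw (about Y) and
    roll (about Z), angles in radians; Z @ Y @ X is one common order."""
    cx, sx = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    cz, sz = np.cos(roll), np.sin(roll)
    rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return rz @ ry @ rx
```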
  • Step 2301 the projection device controller obtains the depth value of the point corresponding to the pixel point of the photo, or the coordinates of the projection point in the camera coordinate system;
  • Step 2302 through the depth value, the middleware obtains the relationship between the optical machine coordinate system and the camera coordinate system;
  • Step 2303 the controller calculates the coordinate value of the projected point in the optical machine coordinate system
  • Step 2304 obtaining the angle between the projection surface and the optical machine based on the coordinate value fitting plane
  • Step 2305 obtain the corresponding coordinates of the projection point in the world coordinate system of the projection surface according to the angle relationship;
  • Step 2306: according to the coordinates of the chart in the optical-mechanical coordinate system and the coordinates of the corresponding points on the projection surface, a homography matrix can be calculated.
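  • Such a homography can be estimated from four or more point correspondences, e.g. with OpenCV (a sketch; the point values are placeholders, not real calibration data):

```python
import cv2
import numpy as np

# Corresponding points: chart coordinates in the optical-mechanical frame
# vs. their measured positions on the projection surface (placeholders).
src = np.float32([[0, 0], [1920, 0], [1920, 1080], [0, 1080]])
dst = np.float32([[12, 8], [1890, 25], [1905, 1060], [30, 1075]])

H, _ = cv2.findHomography(src, dst, method=0)   # least-squares fit
print(H)  # maps chart coordinates to projection-surface coordinates
```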
  • Step 2307 the controller determines whether an obstacle exists based on the above acquired data
  • Step 2308 when obstacles exist, randomly select rectangular coordinates on the projection surface in the world coordinate system, and calculate the area to be projected by the optical machine according to the homography relationship;
  • Step 2309 when the obstacle does not exist, the controller can obtain the feature points of the two-dimensional code, for example;
  • Step 2310 obtaining the coordinates of the two-dimensional code on the prefabricated map card
  • Step 2311 obtaining the homography relationship between the camera photo and the drawing card
  • Step 2312: convert the obtained obstacle coordinates into the chart, thereby obtaining the coordinates of the chart area occluded by the obstacle.
  • Step 2313: according to the coordinates of the obstacle's occlusion area of the chart in the optical-mechanical coordinate system, the coordinates of the occlusion area on the projection surface are obtained through homography matrix transformation; avoiding the obstacle, the area to be projected by the optical machine is calculated according to the homography relationship.
  • the obstacle avoidance algorithm uses the OpenCV algorithm library to complete the contour extraction of foreign objects during the rectangle selection step of the keystone correction algorithm flow, and avoids the obstacle when selecting the rectangle, realizing the projection obstacle avoidance function.
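  • A hedged sketch of such contour extraction with OpenCV (the thresholding choice and area cutoff are illustrative, not the patent's exact parameters):

```python
import cv2

def obstacle_contours(camera_img):
    """Extract candidate foreign-object contours so the rectangle
    selection step can avoid them."""
    gray = cv2.cvtColor(camera_img, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255,
                            cv2.THRESH_BINARY | cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # Keep only contours large enough to matter as obstacles.
    return [c for c in contours if cv2.contourArea(c) > 500]
```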
  • Step 2401 the middleware obtains the QR code image card captured by the camera
  • Step 2402 identifying the feature points of the two-dimensional code, and obtaining the coordinates in the camera coordinate system
  • Step 2403 the controller further acquires the coordinates of the preset image card in the optical-mechanical coordinate system
  • Step 2404 to solve the homography relationship between the camera plane and the optical-mechanical plane
  • Step 2405 the controller identifies the coordinates of the four vertices of the curtain captured by the camera based on the above homography;
  • Step 2406: according to the homography matrix, obtain the range of the chart to be projected by the optical machine so as to enter the screen.
  • the screen entry algorithm is based on the OpenCV algorithm library, which can identify and extract the largest black closed rectangular outline and judge whether it has a 16:9 aspect ratio; a specific chart is projected and photographed with the camera, and finer corner points are extracted from the photos.
  • the corner points are used to calculate the homography between the projection surface (curtain) and the optical-mechanical chart; the four vertices of the screen are converted into the optical-mechanical pixel coordinate system through the homography, so that the chart of the optical machine is converted to within the four vertices of the screen.
  • for a telephoto micro-projection device, for example a micro-projection TV, the projected picture may be distorted after each displacement.
  • for the above problems, the projection device provided by this application can automatically complete the correction, including functions such as automatic keystone correction, automatic screen entry, automatic obstacle avoidance, autofocus and anti-eye.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Geometry (AREA)
  • General Physics & Mathematics (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Projection Apparatus (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Automatic Focus Adjustment (AREA)

Abstract

The present application provides a projection device and a projection area correction method. Upon receiving a projection area correction instruction, the projection device can obtain the mapping relationship between the chart projected by the optical machine and the projection surface. According to the mapping relationship, the initial projection area into which the optical machine projects the chart on the projection surface, i.e. the projection area before correction, can be obtained. The projection device can perform contour correction on the initial projection area to obtain the target projection area, thereby determining the target area set by the user, which may be a screen. The target position information of the target projection area is determined according to the mapping relationship, and the optical machine can be made to project the target chart into the target projection area, thereby correcting the projection area, ensuring that the projection device projects the image completely onto the screen, and improving the user experience.

Description

A projection device and projection area correction method

Cross-Reference to Related Applications

This application claims priority to Chinese patent application No. 202111355866.0, filed on November 16, 2021, and Chinese patent application No. 202210389054.6, filed on April 13, 2022, the entire contents of which are incorporated herein by reference.

Technical Field

The present application relates to the technical field of projection devices, and in particular to a projection device and a projection area correction method.

Background

A projection device is a display device that can project images or video onto a screen. A projection device can refract laser light of specific colors through an optical lens assembly and project it onto a projection surface to form a specific image.

To meet the user's needs, the projection device can project the image onto the projection surface within a projection area set by the user, for example onto a screen placed by the user in advance. When using the projection device, if the position of the projection device shifts, the projection device can no longer project the image completely into the projection area, and the user experience is poor. For this reason, the projection area needs to be re-determined so that the image can be projected into the projection area again.

When an existing projection device re-determines the projection area, the user can adjust the position and projection angle of the projection device so that the image can be completely projected into the projection area. However, this approach requires the user to manually select the adjustment direction; the process is cumbersome, the projection area cannot be determined accurately, and the user experience is poor.
Summary

Embodiments of the present application provide a projection device and a projection area correction method. The projection device includes an optical machine and a controller. The optical machine is configured to project a chart onto a projection surface; the controller is configured to perform the following steps:

in response to a projection area correction instruction, obtaining the mapping relationship between the chart projected by the optical machine and the projection surface;

obtaining, according to the mapping relationship, the initial projection area into which the optical machine projects the chart on the projection surface;

performing contour correction on the initial projection area to obtain a target projection area;

determining target position information of the target projection area according to the mapping relationship;

controlling, according to the target position information, the optical machine to project a target chart into the target projection area.

Embodiments of the present application provide a projection area correction method for a projection device, including:

in response to a projection area correction instruction, obtaining the mapping relationship between the chart projected by the optical machine and the projection surface;

obtaining, according to the mapping relationship, the initial projection area into which the optical machine projects the chart on the projection surface;

performing contour correction on the initial projection area to obtain a target projection area;

determining target position information of the target projection area according to the mapping relationship;

controlling, according to the target position information, the optical machine to project a target chart into the target projection area.
Brief Description of the Drawings

Fig. 1 shows a schematic diagram of a projection screen in some embodiments of the present application;

Fig. 2 shows a schematic diagram of the placement of a projection device in some embodiments of the present application;

Fig. 3 shows a schematic diagram of the optical path of a projection device in some embodiments of the present application;

Fig. 4 shows a schematic diagram of the circuit architecture of a projection device in some embodiments of the present application;

Fig. 5 shows a schematic structural diagram of a projection device in some embodiments of the present application;

Fig. 6 shows a schematic diagram of the circuit structure of a projection device in some embodiments of the present application;

Fig. 7 shows a schematic diagram of the system framework with which a projection device implements display control in some embodiments of the present application;

Fig. 8 shows a schematic diagram of a projection device when its position changes in some embodiments of the present application;

Fig. 9 shows an interaction flowchart of the components of a projection device in some embodiments of the present application;

Fig. 10 shows a schematic diagram of a first chart in some embodiments of the present application;

Fig. 11 shows a schematic diagram of a first chart in some embodiments of the present application;

Fig. 12 shows a schematic flowchart of obtaining a mapping relationship in some embodiments of the present application;

Fig. 13 shows a schematic diagram of a target projection area in some embodiments of the present application;

Fig. 14 shows a schematic diagram of a target projection area in some embodiments of the present application;

Fig. 15 shows a schematic diagram of a target projection area in some embodiments of the present application;

Fig. 16 shows a schematic diagram of the projection area before and after correction in some embodiments of the present application;

Fig. 17 shows a schematic diagram of a projection area correction method according to some embodiments of the present application;

Fig. 18 shows a schematic structural diagram of a projection device according to some embodiments of the present application;

Fig. 19 shows a schematic structural diagram of a projection device according to another embodiment of the present application;

Fig. 20 shows a schematic diagram of the signaling interaction sequence with which a projection device implements the anti-eye function according to another embodiment of the present application;

Fig. 21 shows a schematic diagram of the signaling interaction sequence with which a projection device implements the display image correction function according to another embodiment of the present application;

Fig. 22 shows a schematic flowchart of a projection device implementing an autofocus algorithm according to another embodiment of the present application;

Fig. 23 shows a schematic flowchart of a projection device implementing keystone correction and obstacle avoidance algorithms according to another embodiment of the present application;

Fig. 24 shows a schematic flowchart of a projection device implementing a screen entry algorithm according to another embodiment of the present application;

Fig. 25 shows a schematic flowchart of a projection device implementing an anti-eye algorithm according to another embodiment of the present application.
Detailed Description

To make the purpose, implementations and advantages of the present application clearer, the exemplary implementations of the present application will be described clearly and completely below with reference to the accompanying drawings in the exemplary embodiments of the present application. Obviously, the described exemplary embodiments are only a part of the embodiments of the present application, not all of them.

The embodiments of the present application can be applied to various types of projection devices. A projection device is a device that can project images or video onto a screen. It can be connected through different interfaces to a computer, a broadcasting network, the Internet, a VCD (Video Compact Disc), a DVD (Digital Versatile Disc), a game console, a DV and the like, to play the corresponding video signal. Projection devices are widely used in homes, offices, schools and entertainment venues. In some specific embodiments, the projection device may take the form of a projector.

The projection device may be provided with an optical machine, which can project an image, for example a chart selected by the user, onto the projection surface, specifically into a projection area specified by the user on the projection surface. The projection surface may be a wall; the projection area may be a specific area on the wall, or a projection screen placed in front of the wall.

In some embodiments, a projection screen is a tool used to display images and video files in cinemas, offices, home theaters, large conferences and other occasions, and can be set to different specifications and sizes according to actual needs. In order to make the display effect better match the user's viewing habits, the aspect ratio of the screen corresponding to the projection device is usually set to 16:9 to match the size of the image projected by the projection device. Fig. 1 shows a schematic diagram of a projection screen in some embodiments. The edge of the projection screen may be a dark border band used to highlight the boundary of the screen, and the central area of the projection screen may be a white screen, which can serve as the projection area for displaying the image projected by the projection device.

In order to achieve the best projection effect, the installation positions of the projection screen and the projection device may be handled by professional after-sales technicians, who place the projection screen and the projection device at positions where the best projection effect is achieved during operation, so that the image projected by the projection device is completely within the projection screen, thereby improving the user experience. During use, the user may also set the positions of the two by himself, which is not limited in the embodiments of the present application.

Fig. 2 shows a schematic diagram of the placement of a projection device in some embodiments. In some embodiments, the projection screen provided by the present application can be fixed at a first position, and the projection device is placed at a second position, so that the picture projected by the projection device coincides with the projection screen.

Fig. 3 shows a schematic diagram of the optical path of a projection device in some embodiments.

An embodiment of the present application provides a projection device, including a laser light source 100, an optical machine 200, a lens 300 and a projection medium 400. The laser light source 100 provides illumination for the optical machine 200; the optical machine 200 modulates the light source beam, outputs it to the lens 300 for imaging, and projects it onto the projection medium 400 to form a projection picture.

In some embodiments, the laser light source 100 of the projection device includes a laser assembly and an optical lens assembly. The light beam emitted by the laser assembly can pass through the optical lens assembly to provide illumination for the optical machine. For example, the optical lens assembly requires a higher level of environmental cleanliness and hermetic sealing, while the chamber in which the laser assembly is installed can be sealed with a lower, dust-proof sealing level to reduce sealing cost.

In some embodiments, the optical machine 200 of the projection device may be implemented to include a blue optical machine, a green optical machine and a red optical machine, and may also include a heat dissipation system, a circuit control system and the like. It should be noted that, in some embodiments, the light emitting component of the projection device may also be implemented by an LED light source.

In some embodiments, the present application provides a projection device including a three-color optical machine and a controller. The three-color optical machine is used to modulate and generate the laser light of the pixels contained in the user interface, and includes a blue optical machine, a green optical machine and a red optical machine. The controller is configured to: obtain the average gray value of the user interface; and when determining that the average gray value is greater than a first threshold and its duration is greater than a time threshold, control the operating current value of the red optical machine to decrease according to a preset gradient value, so as to reduce the heat generated by the three-color optical machine. It can be seen that by reducing the operating current of the red optical machine integrated in the three-color optical machine, overheating of the red optical machine can be controlled, thereby controlling overheating of the three-color optical machine and of the projection device.

The optical machine 200 may be implemented as a three-color optical machine, which integrates a blue optical machine, a green optical machine and a red optical machine.

Hereinafter, the implementation of the present application will be described by taking, as an example, the optical machine 200 of the projection device implemented to include a blue optical machine, a green optical machine and a red optical machine.

In some embodiments, the optical system of the projection device consists of a light source part and an optical machine part. The function of the light source part is to provide illumination for the optical machine, and the function of the optical machine part is to modulate the illumination beam provided by the light source, and finally emit it through the lens to form the projection picture.
图4示出了一些实施例中投影设备的电路架构示意图。在一些实施例中,该投影设备可以包括显示控制电路10、激光光源20、至少一个激光器驱动组件30以及至少一个亮度传感器40,该激光光源20可以包括与至少一个激光器驱动组件30一一对应的至少一个激光器。其中,该至少一个是指一个或多个,多个是指两个或两个以上。
基于该电路架构,投影设备可以实现自适应调整。例如,通过在激光光源20的出光路径中设 置亮度传感器40,使亮度传感器40可以检测激光光源的第一亮度值,并将第一亮度值发送至显示控制电路10。
该显示控制电路10可以获取每个激光器的驱动电流对应的第二亮度值,并在确定该激光器的第二亮度值与该激光器的第一亮度值的差值大于差值阈值时,确定该激光器发生COD故障;则显示控制电路可以调整激光器的对应的激光器驱动组件的电流控制信号,直至该差值小于等于该差值阈值,从而消除该蓝色激光器的COD故障;该投影设备能够及时消除激光器的COD故障,降低激光器的损坏率,提高投影设备的图像显示效果。
在一些实施例中,激光光源20包括与激光器驱动组件30一一对应的三个激光器,该三个激光器可以分别为蓝色激光器201、红色激光器202和绿色激光器203。其中,该蓝色激光器201用于出射蓝色激光,该红色激光器202用于出射红色激光,该绿色激光器203用于出射绿色激光。在一些实施例中,激光器驱动组件30可实施为包含多个子激光器驱动组件,分别对应不同颜色的激光器。
显示控制电路10用于向激光器驱动组件30输出基色使能信号以及基色电流控制信号,以驱动激光器发光。显示控制电路10与激光器驱动组件30连接,用于输出与多帧显示图像中的每一帧图像的三种基色一一对应的至少一个使能信号,将至少一个使能信号分别传输至对应的激光器驱动组件30,并输出与每一帧图像的三种基色一一对应的至少一个电流控制信号,将至少一个电流控制信号分别传输至对应的激光器驱动组件30。示例的,该显示控制电路10可以为微控制单元(microcontroller unit,MCU),又称为单片机。其中,该电流控制信号可以是脉冲宽度调制(pulse width modulation,PWM)信号。
在一些实施例中,该显示控制电路10可以基于待显示图像的蓝色基色分量输出与蓝色激光器201对应的蓝色PWM信号B_PWM,基于待显示图像的红色基色分量输出与红色激光器202对应的红色PWM信号R_PWM,基于待显示图像的绿色基色分量输出与绿色激光器203对应的绿色PWM信号G_PWM。显示控制电路可以基于蓝色激光器201在驱动周期内的点亮时长,输出与蓝色激光器201对应的使能信号B_EN,基于红色激光器202在驱动周期内的点亮时长,输出与红色激光器202对应的使能信号R_EN,基于绿色激光器203在驱动周期内的点亮时长,输出与绿色激光器203对应的使能信号G_EN。
在一些实施例中,蓝色激光器201、红色激光器202和绿色激光器203分别与激光器驱动组件30连接。激光器驱动组件30可以响应于显示控制电路10发送的蓝色PWM信号和使能信号,向该蓝色激光器201提供对应的驱动电流。该蓝色激光器201用于在该驱动电流的驱动下发光。
The brightness sensor is arranged in the light output path of the laser light source, usually on one side of the path so as not to block it. As shown in Fig. 2, at least one brightness sensor 40 is arranged in the light output path of the laser light source 20; each brightness sensor is connected to the display control circuit 10, detects the first brightness value of one laser, and sends the first brightness value to the display control circuit 10.

In some embodiments, the display control circuit 10 also obtains the second brightness value corresponding to the drive current of each laser. If it detects that the difference between a laser's second brightness value and its first brightness value is greater than the difference threshold, the laser has suffered a COD failure, and the display control circuit 10 adjusts the current control signal of the laser driver assembly 30 until the difference is less than or equal to the threshold, i.e., it eliminates the COD failure by reducing the laser's drive current. Specifically, both the first and the second brightness values are expressed as optical output power; the second brightness value may be stored in advance or may be the value returned by the brightness sensor while the laser is in the normal light-emitting state. When a laser suffers a COD failure, its optical output power typically drops abruptly, and the first brightness value returned by the sensor falls below half of the normal second brightness value. Upon confirming such a failure, the display control circuit reduces the current control signal of the corresponding laser driver assembly and continuously collects and compares the brightness values returned by the sensor.

In some embodiments, if the detected difference between a laser's second brightness value and its first brightness value is less than or equal to the difference threshold, the laser has not suffered a COD failure and the display control circuit 10 does not need to adjust the current control signal of the corresponding laser driver assembly 30.

The display control circuit 10 may store a correspondence between currents and brightness values. The brightness value corresponding to each current in this correspondence is the initial brightness the laser can emit when operating normally under that current (i.e., without a COD failure), for example its initial brightness when first lit while driven by that current.

In some embodiments, the display control circuit 10 can obtain from this correspondence the second brightness value corresponding to each laser's drive current, where the drive current is the laser's current actual working current and the second brightness value is the brightness the laser can emit when operating normally under that drive current. The difference threshold may be a fixed value stored in advance in the display control circuit 10.

In some embodiments, when adjusting the current control signal of the laser driver assembly 30 corresponding to a laser, the display control circuit 10 may reduce the duty cycle of that current control signal, thereby reducing the laser's drive current.

In some embodiments, the brightness sensor 40 can detect the first brightness value of the blue laser 201 and send it to the display control circuit 10. The display control circuit 10 can obtain the blue laser's drive current and the corresponding second brightness value from the current-brightness correspondence, then check whether the difference between the second and first brightness values exceeds the difference threshold. If it does, the blue laser 201 has suffered a COD failure, and the display control circuit 10 reduces the current control signal of the laser driver assembly 30 corresponding to the blue laser 201. It then obtains the blue laser's first brightness value and the second brightness value corresponding to its drive current again and, if the difference still exceeds the threshold, reduces the control signal again. This loop repeats until the difference is less than or equal to the threshold; the COD failure of the blue laser 201 is thus eliminated by reducing its drive current.
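The detection and recovery loop described above can be summarized in code. The following Python sketch is purely illustrative: the hardware accesses are abstracted as callables, and the threshold and step values are assumptions, since the patent leaves them as unspecified presets.

```python
# Illustrative sketch of the COD-failure feedback loop described above.
# Hardware access is abstracted as callables; thresholds are assumptions.

DIFF_THRESHOLD = 0.5   # assumed brightness-difference threshold (watts)
DUTY_STEP = 0.05       # assumed per-iteration duty-cycle reduction

def correct_cod_failure(read_brightness, expected_brightness, set_duty,
                        duty=1.0):
    """Lower the PWM duty cycle until the measured (first) brightness
    matches the expected (second) brightness for the drive current."""
    while duty > 0.0:
        first = read_brightness()           # sensor feedback
        second = expected_brightness(duty)  # from the current-brightness table
        if second - first <= DIFF_THRESHOLD:
            return duty                     # difference acceptable: cleared
        duty = max(0.0, duty - DUTY_STEP)
        set_duty(duty)                      # lowers the drive current
    return duty
```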
In some embodiments, the display control circuit 10 can monitor in real time whether each laser suffers a COD failure, based on the first brightness value of each laser obtained by the at least one brightness sensor 40 and the second brightness value corresponding to each laser's drive current. When any laser is determined to have a COD failure, the failure is eliminated promptly, shortening its duration, reducing damage to the laser, and ensuring the image display quality of the projection device.

Fig. 5 shows a schematic structural diagram of a projection device in some embodiments of the present application.

In some embodiments, the laser light source 20 in the projection device may include an independently arranged blue laser 201, red laser 202, and green laser 203; such a projection device may also be called a three-color projection device. The blue laser 201, red laser 202, and green laser 203 are all MCL-package lasers, which are small in size and facilitate a compact optical path layout.

In some embodiments, the controller includes at least one of a central processing unit (CPU), a video processor, an audio processor, a graphics processing unit (GPU), RAM (Random Access Memory), ROM (Read-Only Memory), first to n-th input/output interfaces, and a communication bus.

In some embodiments, the projection device may be equipped with a camera that operates in coordination with it to adjust and control the projection process. For example, the camera may be implemented as a 3D camera, a depth camera, or a binocular camera; when implemented as a binocular camera, it specifically includes a left camera and a right camera. The binocular camera can capture the screen corresponding to the projection device, i.e., the image or card presented on the projection surface, which is projected by the device's built-in optical engine.

When the projection device is moved, its projection angle and its distance to the projection surface change, so the projected image deforms and is displayed as a trapezoid or some other distorted shape. Based on the image captured by the camera, the controller can perform automatic keystone correction by coupling the angle between the optical engine and the projection surface with the correct display of the projected image.
Fig. 6 shows a schematic circuit diagram of a projection device in some embodiments of the present application. In some embodiments, the laser driver assembly 30 may include a drive circuit 301, a switch circuit 302, and an amplifier circuit 303. The drive circuit 301 may be a driver chip; the switch circuit 302 may be a metal-oxide-semiconductor (MOS) transistor. The drive circuit 301 is connected to the switch circuit 302, to the amplifier circuit 303, and to the corresponding laser in the laser light source 20. Based on the current control signal sent by the display control circuit 10, it outputs the drive current to the corresponding laser in the laser light source 20 through its VOUT terminal and passes the received enable signal to the switch circuit 302 through its ENOUT terminal. The laser may include n sub-lasers connected in series, LD1 through LDn, where n is a positive integer greater than 0.

The switch circuit 302 is connected in series in the laser's current path and switches the path on when the received enable signal is at the active level. The amplifier circuit 303 is connected to the detection node E in the current path of the laser light source 20 and to the display control circuit 10; it converts the detected drive current of the laser assembly into a drive voltage, amplifies that drive voltage, and transmits the amplified voltage to the display control circuit 10. The display control circuit 10 also determines the laser's drive current from the amplified drive voltage and obtains the second brightness value corresponding to that drive current.

In some embodiments, the amplifier circuit 303 may include a first operational amplifier A1, a first resistor R1 (also called the sampling power resistor), a second resistor R2, a third resistor R3, and a fourth resistor R4.

The non-inverting input (positive terminal) of the first operational amplifier A1 is connected to one end of R2; the inverting input (negative terminal) is connected to one end of R3 and one end of R4; the output of A1 is connected to the other end of R4 and to the processing sub-circuit 3022. One end of R1 is connected to the detection node E, and its other end to the reference supply terminal. The other end of R2 is connected to the detection node E, and the other end of R3 to the reference supply terminal, which is the ground terminal.

In some embodiments, the first operational amplifier A1 may also have two power supply terminals, one connected to the supply terminal VCC and the other to the reference supply terminal.

The relatively large drive current of the laser in the laser light source 20 produces a voltage drop across R1. The voltage Vi at one end of R1 (i.e., at the detection node E) is fed through R2 to the non-inverting input of A1 and amplified N times before being output, where N is the gain of A1 and N is positive. The gain N can be chosen so that the output voltage Vfb of A1 is an integer multiple of the numeric value of the laser's drive current; for example, the value of Vfb may equal the value of the drive current, making it convenient for the display control circuit 10 to take the amplified drive voltage as the laser's drive current.
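For the amplifier topology described above (input through R2 to the non-inverting terminal, feedback from the output through R4 to the inverting terminal, R3 from the inverting terminal to the reference terminal), the standard non-inverting gain relation would apply; this expression is a textbook assumption rather than something stated in the patent:

Vfb = N · Vi, where N = 1 + R4 / R3 and Vi = Idrive · R1

Choosing R1, R3, and R4 so that N · R1 equals 1 in consistent units makes Vfb numerically equal to the drive current, as described above.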
In some embodiments, the display control circuit 10, the drive circuit 301, the switch circuit 302, and the amplifier circuit 303 form a closed loop that provides feedback regulation of the laser's drive current. The display control circuit 10 can therefore use the difference between the laser's second and first brightness values to adjust the drive current in time, i.e., to adjust the laser's actual emission brightness in time, preventing long-lasting COD failures and improving the accuracy of laser emission control. It should be noted that when the laser light source 20 includes one blue laser, one red laser, and one green laser, the blue laser 201 may be arranged at position L1, the red laser 202 at position L2, and the green laser 203 at position L3.

The laser at L1 is transmitted once through the fourth dichroic plate 604 and then reflected once by the fifth dichroic plate 605 before entering the first lens 901. The light efficiency at L1 is P1 = Pt × Pf, where Pt is the transmittance of the dichroic plate and Pf is the reflectance of the dichroic plate or of the fifth reflector.

In some embodiments, among positions L1, L2, and L3, the light efficiency is highest at L3 and lowest at L1. The maximum optical power output by the blue laser 201 is Pb = 4.5 watts (W), by the red laser 202 Pr = 2.5 W, and by the green laser 203 Pg = 1.5 W; that is, the blue laser outputs the largest maximum power, the red laser the next largest, and the green laser the smallest. Therefore the green laser 203 is arranged at L3, the red laser 202 at L2, and the blue laser 201 at L1; in other words, the green laser 203 is placed in the optical path with the highest light efficiency, ensuring that the projection device attains the highest light efficiency.

In some embodiments, the display control circuit 10 is further configured to restore the current control signal of the corresponding laser driver assembly to its initial value, namely the magnitude of the laser's PWM current control signal in the normal state, once the difference between the laser's second and first brightness values is less than or equal to the difference threshold. Thus, when a laser suffers a COD failure it can be identified quickly and the drive current reduced in time, lessening continued damage to the laser and helping it recover by itself. The whole process requires no disassembly or human intervention, which improves the reliability of the laser light source and guarantees the projection display quality of the laser projection device.

In some embodiments, after startup the projection device may directly enter the display interface of the signal source selected last time, or the signal-source selection interface. The signal source may be at least one of a preset video-on-demand program, an HDMI interface, a live TV interface, and the like; after the user selects a signal source, the projector can display the content obtained from that source.
In some embodiments, the system of the projection device may include a kernel, a command parser (shell), a file system, and applications. The kernel, shell, and file system together form the basic operating system structure that lets users manage files, run programs, and use the system. After power-on, the kernel starts, activates kernel space, abstracts the hardware, initializes hardware parameters, and runs and maintains virtual memory, the scheduler, signals, and inter-process communication (IPC). After the kernel starts, the shell and user applications are loaded. An application is compiled into machine code after startup, forming a process.

In some embodiments, the system is divided into four layers: from top to bottom, the Applications layer ("application layer"), the Application Framework layer ("framework layer"), the Android runtime and system library layer ("system runtime layer"), and the kernel layer.

In some embodiments, at least one application runs in the application layer. These may be a window program, a system settings program, or a clock program that comes with the operating system, or applications developed by third-party developers. In specific implementations, the application packages in the application layer are not limited to the above examples.
The embodiments of the present application can be applied to various types of projection devices. Hereinafter, a laser projection device is taken as the example to describe the projection device and the display control method based on geometric correction.

In some embodiments, the laser light emitted by the projection device is reflected by the nanoscale mirrors of a digital micromirror device (DMD) chip. The optical lens is also a precision component; when the image plane and the object plane are not parallel, the image projected onto the screen suffers geometric distortion.

Fig. 7 shows a schematic system framework diagram of a projection device implementing display control in some embodiments of the present application.

In some embodiments, the projection device provided by the present application has the characteristics of a long-throw micro projector. The device includes a controller that performs display control on the optical-engine image through preset algorithms, implementing functions such as automatic keystone correction, automatic screen entry, automatic obstacle avoidance, automatic focusing, and eye protection. It can be understood that with the geometric-correction-based display control method, the device can be moved flexibly in long-throw micro-projection scenarios. Whenever the device is moved, problems such as distortion of the projected image, foreign objects blocking the projection surface, or the projected image straying from the screen may occur; the controller can make the projection device perform the automatic display correction function so that normal display is restored automatically.

In some embodiments, the geometric-correction-based display control system includes an application service layer (APK Service: Android application package Service), a service layer, and an underlying algorithm library.

The application service layer implements the interaction between the projection device and the user. Through the displayed user interface, the user can configure the device's parameters and the displayed image; by coordinating and invoking the algorithm services for the various functions, the controller can implement the function of automatically correcting the displayed image when a display anomaly occurs.

The service layer may include the correction service, the camera service, the time-of-flight (ToF) service, and so on. Upward, these services correspond to the application service layer (APK Service) and implement the specific functions of the device's different configured services; downward, the service layer interfaces with the algorithm library, the camera, the ToF sensor, and other data acquisition services, encapsulating the complex underlying logic.

The underlying algorithm library provides the correction service and the control algorithms with which the device implements its various functions. The library can perform the various mathematical operations based on OpenCV, providing basic computing capability for the correction service. OpenCV is a cross-platform computer vision and machine learning software library released as open source under the BSD license; it can run in a variety of existing operating system environments.
In some embodiments, the projection device is equipped with a gyroscope sensor. While the device moves, the gyroscope senses the position change and actively collects the movement data, which is then sent through the system framework layer to the application service layer to support the application data needed during user-interface and application interaction; the collected data can also be used for the controller's data calls when implementing the algorithm services.

In some embodiments, the projection device is equipped with a time-of-flight (ToF) sensor; after the ToF sensor collects the corresponding data, the data is sent to the corresponding ToF service in the service layer.

After the ToF service obtains the data, it sends the collected data through the inter-process communication framework to the application service layer, where the data is used for the controller's data calls and for interaction with the user interface and applications.

In some embodiments, the data collected by the camera is sent to the camera service, which then sends the captured image data to the inter-process communication framework and/or the projection device correction service. The correction service can receive the camera data sent by the camera service, and the controller can invoke the corresponding control algorithms from the algorithm library according to the different functions to be implemented.

In some embodiments, data is exchanged with the application service through the inter-process communication framework, and the computation results are fed back through the framework to the correction service. The correction service sends the obtained results to the projection device operating system to generate control instructions, which are sent to the optical-engine control driver to control the engine's operating state and achieve automatic correction of the projection area.

When the user switches on the projection device, it can project an image onto the projection surface, specifically into the projection area preset by the user, which may be a region of a wall or a screen set up in advance; the projection area can display the projected image for the user to watch. However, when the user places the device improperly, it cannot project the image fully into the projection area. Or, during use, the user may bump the device and change its position, causing the projected image to shift and no longer lie entirely inside the projection area. In such cases the device needs to re-determine the projection area so that the image is fully projected into it for viewing. Fig. 8 shows a schematic diagram of a projection device in some embodiments when its position changes. The device starts at position A, where the image is fully projected into the rectangular projection area. When the device moves from position A to position B, the projected image shifts and no longer lies entirely within the projection area.

In the related art, the user can manually adjust the position and projection angle of the device so that the projected image lies entirely within the projection area, but this manual adjustment is cumbersome and inaccurate. The projection device can instead photograph the projection surface to obtain an image of it, then binarize the captured image so that object contours are shown more clearly. From the binarized image, all closed contours can be extracted, and the closed contour with the largest area and uniform interior color is taken to be the projection area and projected into. However, when a large area of solid-colored wall surrounds the projection area and the wall's edges form a closed contour, the device will recognize that wall as the projection area, causing the image to be projected onto the wall rather than the user-designated region.

Therefore, the projection device provided by the embodiments of the present application can determine the projection area accurately, solving the above problems.
So that the device can project the image entirely into the user-designated projection area, the projection device has a projection-area correction function. When the image projected by the device does not fully lie within the user-designated projection area, the device can re-determine the projection area so that the image can be fully projected into it, thereby achieving projection-area correction and the effect of automatically fitting the projected image into the screen.

In some embodiments, upon receiving a projection-area correction instruction, the projection device can turn on the projection-area correction function and correct its current projection area. The projection-area correction instruction is a control instruction used to trigger the device to correct the projection area automatically.

In some embodiments, the correction instruction may be entered actively by the user. For example, after the device's power is connected, the device can project an image on the projection surface. The user can then press a preset correction switch on the device, or a correction key on the accompanying remote control, to make the device turn on the correction function, re-determine the projection area, and correct it.

In some embodiments, the correction instruction may also be generated automatically according to the device's built-in control program.

The device may generate the correction instruction actively after booting. For example, when the device detects the first video signal input after power-on, it can generate a projection-area correction instruction, triggering the correction function.

The device may also generate the correction instruction by itself during operation. While using the device, the user may deliberately move it or bump it accidentally, changing the device's posture or position; the projected image then shifts, the projection area changes, and the image can no longer be fully projected into the user-designated area. To preserve the user's viewing experience, the device can then correct the projection area automatically.

Specifically, the device can monitor its own state in real time. When it detects that its posture or position has changed, it can generate a correction instruction, triggering the correction function.

In some embodiments, through its configured components the projection device can monitor its movement in real time and feed the monitoring results immediately to the device's controller, so that after the device moves, the controller starts the correction function promptly and corrects the projection area at the first opportunity. For example, using the device's gyroscope or ToF (Time of Flight) sensor, the controller receives the monitoring data from the gyroscope or ToF sensor and determines whether the device has moved.

After determining that the device has moved, the controller can generate a correction instruction and turn on the correction function.

The ToF sensor measures distance and monitors position movement by adjusting the frequency of its emitted pulses; its measurement accuracy does not decrease as the measured distance increases, and it has strong anti-interference capability.

In some embodiments, when a correction instruction is detected, the device can re-determine the projection area and project the image fully into that area, achieving automatic correction of the projection area and the effect of automatically fitting the image into the screen.
Fig. 9 shows an interaction flowchart of the components of a projection device in some embodiments of the present application.

In some embodiments, upon obtaining a correction instruction, the device can correct the projection area automatically; specifically, the projection-area correction function can be applied to the projected image through a preset projection-area correction algorithm.

To correct the projection area, the device can first determine the projection relationship between the projection surface and the device. In the embodiments of the present application, the projection relationship refers to the relationship by which the device projects images onto the projection surface, specifically the mapping relationship between the card projected by the device's optical engine and the projection surface. Once the projection relationship between the surface and the device is determined, the device can determine the position information of the projection area on the surface and project accordingly, correcting the projection area.

The device can project a first card onto the projection surface and determine the position information of the surface from that first card. From the surface's position information, the projection relationship between the surface and the device can then be determined.

It should be noted that the embodiments of the present application can use the binocular camera to construct the transformation matrix between the projection surface in the world coordinate system and the optical-engine coordinate system. This transformation matrix is the homography between the projected image on the surface and the card played by the optical engine; the homography is also called the projection relationship, and with it any projective conversion between the projected image and the played card can be performed.

In some embodiments, upon obtaining a correction instruction (step 901), the device can first project the first card; specifically, the controller can control the optical engine to project the first card onto the projection surface (step 902).

After the first card is projected, the controller can also control the camera to photograph the first card displayed on the projection surface, obtaining a first image (step 903).

The first card may contain several feature points, so the first image captured by the camera also contains all the feature points of the first card. The position information of the projection surface can be determined from the feature points. It should be noted that, for a plane, once the positions of three of its points are determined, the plane's position information is determined. Hence, to determine the projection surface's position information, the positions of at least three points on the surface must be determined, i.e., the first card must contain at least three feature points; from those at least three feature points, the surface's position information can be determined.

In some embodiments, the first card may contain patterns and color features preset by the user. The first card may be a checkerboard card consisting of alternating black and white squares, as shown in Fig. 10; the feature points of the checkerboard card are the corner points of the rectangles. The pattern in the first card may also be a circular-ring card containing ring patterns, as shown in Fig. 11; the feature points of the ring card are the solid dots corresponding to each ring. In some embodiments, the first card may also combine the two kinds of patterns above, or use other patterns with recognizable feature points.
Fig. 12 shows a schematic flowchart of obtaining the mapping relationship in some embodiments.

In some embodiments, after the optical engine projects the first card onto the projection surface, the controller can control the camera to photograph the first card, obtaining the first image, so as to obtain the positions of the feature points.

Specifically, the camera may be a binocular camera, with one camera arranged on each side of the optical engine. The binocular camera photographs the first card; the left camera captures one image and the right camera captures another.

The controller can perform image recognition on the two images to obtain the first coordinates of a feature point in either image. In the embodiments of the present application, the first coordinates refer to the coordinate information of the feature point in the image coordinate system corresponding to the first image.

The image coordinate system is the coordinate system whose origin is the image center and whose X and Y axes are parallel to the two sides of the image. In the embodiments of the present application it may be defined as follows: for the projection area preset by the user, the center point of that area is taken as the origin, the horizontal direction as the X axis, and the vertical direction as the Y axis. The image coordinate system can thus be set up in advance according to the preset projection area.
Step 1201: determine the first coordinates of the feature points in the correction image; the coordinate information of the feature points in the first image can be determined according to the image coordinate system.

For one and the same feature point, its position in the image captured by the left camera and its position in the image captured by the right camera may differ. From the two positions of the same feature point in the two images, the feature point's coordinates in the camera coordinate system of either camera can be determined. In the embodiments of the present application, the second coordinates denote the coordinate information of the feature point in the camera coordinate system.

In the embodiments of the present application, the camera coordinate system is specifically the spatial rectangular coordinate system established with the camera's optical center as the origin, the optical axis as the Z axis, and the plane parallel to the projection surface as the XOY plane.

For a binocular camera, it suffices to determine the feature point's second coordinates in the camera coordinate system of either camera; the embodiments of the present application take the left camera as the example.

Specifically, based on the position information of the same feature point in the two images captured by the left and right cameras: step 1202, obtain from the first coordinates the feature point's second coordinates in the camera coordinate system, denoted P(x, y, z).

In some embodiments, after the feature point's second coordinates in the left camera's coordinate system are obtained: step 1203, convert the second coordinates into the feature point's third coordinates in the optical-engine coordinate system. In the embodiments of the present application, the third coordinates refer to the coordinate information of the feature point in the optical-engine coordinate system.

In the embodiments of the present application, the optical-engine coordinate system is specifically the spatial rectangular coordinate system established with the optical engine's optical center as the origin, the optical axis as the Z axis, and the plane parallel to the projection surface as the XOY plane. It should be noted that the optical-engine coordinate system and the camera coordinate system are mutually convertible, so a feature point's coordinates in the camera coordinate system can be converted into coordinates in the optical-engine coordinate system. Specifically, the feature point can be converted between the two coordinate systems according to the extrinsic parameters between the optical engine and the camera. These extrinsics are device parameters marked on the housing or in the manual when the device leaves the factory; they are usually determined by the device's function, assembly, manufacture, and parts, apply to all devices of the same model, and may include the rotation matrix and translation matrix between the optical engine and the camera.

From the extrinsics between the optical engine and the camera, the conversion between the optical-engine and camera coordinate systems can be determined, and the feature point's coordinates in the optical-engine coordinate system further obtained.
The conversion formula is:

P′(x′, y′, z′) = RRR · P + TTT   (1)

where P′(x′, y′, z′) is the coordinate of the feature point in the optical-engine coordinate system, RRR is the rotation matrix between the optical engine and the camera, and TTT is the translation matrix between the optical engine and the camera.
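As a minimal illustration of formula (1), the conversion can be carried out with a 3×3 rotation matrix and a translation vector. The numeric values below are placeholders, not calibration data from the patent:

```python
import numpy as np

# Engine-camera extrinsics (placeholder values; real devices use the
# factory-calibrated parameters described above).
RRR = np.eye(3)                    # rotation matrix, camera -> optical engine
TTT = np.array([0.05, 0.0, 0.0])   # translation (meters)

def camera_to_engine(P):
    """Formula (1): P' = RRR @ P + TTT."""
    return RRR @ np.asarray(P, dtype=float) + TTT

P_engine = camera_to_engine([0.1, 0.2, 1.5])  # feature point in camera coords
```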
In some embodiments, after the coordinates of the feature points in the optical-engine coordinate system are obtained, the projection-surface equation of the projection surface in the optical-engine coordinate system can be determined.

It should be noted that the coordinate information of at least three points is needed to determine the position information of a plane. The controller can therefore obtain the first positions of at least three feature points in the first image, determine from these first positions the coordinate information in the camera coordinate system, and further convert the camera-coordinate information into coordinate information in the optical-engine coordinate system.

After the coordinates of at least three feature points in the optical-engine coordinate system are determined, the controller can fit the coordinates of these feature points: step 1204, obtain from the third coordinates the projection-surface equation of the projection surface in the optical-engine coordinate system. The projection-surface equation can be written as

z = ax + by + c   (2)

or, equivalently, in the general plane form:

ax + by − z + c = 0   (3)
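The fitting in step 1204 can be illustrated with an ordinary least-squares plane fit; this is a sketch under the assumption that any standard fitting method is acceptable:

```python
import numpy as np

def fit_projection_plane(points):
    """Fit z = a*x + b*y + c to N >= 3 points given as an (N, 3) array of
    optical-engine coordinates; returns the coefficients (a, b, c)."""
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    coeffs, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    return coeffs  # a, b, c

a, b, c = fit_projection_plane([[0, 0, 2.0], [1, 0, 2.1], [0, 1, 1.9]])
```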
In some embodiments, after the projection-surface equation in the optical-engine coordinate system is determined: step 1205, obtain from the projection-surface equation the transformation matrix between the optical-engine coordinate system and the world coordinate system; this transformation matrix is used to characterize the mapping relationship.

In the embodiments of the present application, the world coordinate system is defined as the spatial coordinate system that takes the plane of the image coordinate system, i.e., the projection surface, as its XOY plane, with the origin at the center point of the projection area preset by the user and the Z axis in the direction perpendicular to the projection surface.

When obtaining the transformation matrix between the optical-engine and world coordinate systems, the controller can determine the representation of the projection surface in the world coordinate system and its representation in the optical-engine coordinate system respectively.

Specifically, the controller can first determine the unit normal vector of the projection surface in the world coordinate system.

Since the projection surface itself is the XOY plane of the world coordinate system, the surface's unit normal vector in the world coordinate system can be expressed as

m = (0, 0, 1)^T   (4)

The controller can also obtain, from the projection-surface equation in the optical-engine coordinate system, the surface's unit normal vector in the optical-engine coordinate system, which can be expressed as

n = (a, b, −1) / √(a² + b² + 1)   (5)
From the unit normal vectors of the projection surface in the two coordinate systems, the transformation matrix between the optical-engine coordinate system and the world coordinate system can be obtained; the relation between them is expressed as

m = R1 · n   (6)

where R1 denotes the transformation matrix between the optical-engine coordinate system and the world coordinate system.

This transformation matrix represents the mapping relationship between the card projected by the optical engine and the projection surface.
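One common way to construct a rotation R1 satisfying formula (6) is the axis-angle method: rotate about n × m by the angle between the two normals. The sketch below uses OpenCV's Rodrigues conversion and is an illustration, not necessarily the exact construction used in the patent:

```python
import numpy as np
import cv2

def rotation_between_normals(n, m):
    """Return R1 such that m = R1 @ n, for unit vectors n and m."""
    n = n / np.linalg.norm(n)
    m = m / np.linalg.norm(m)
    axis = np.cross(n, m)
    s = np.linalg.norm(axis)
    if s < 1e-9:                        # normals already aligned
        return np.eye(3)
    angle = np.arctan2(s, float(np.dot(n, m)))
    rvec = axis / s * angle             # axis-angle (Rodrigues) vector
    R1, _ = cv2.Rodrigues(rvec)
    return R1

a, b = 0.02, -0.05                      # fitted plane coefficients (placeholders)
n = np.array([a, b, -1.0]) / np.sqrt(a**2 + b**2 + 1.0)
m = np.array([0.0, 0.0, 1.0])           # world-frame normal, formula (4)
R1 = rotation_between_normals(n, m)
```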
After the mapping relationship is determined, the controller can convert the coordinates of any point between the world coordinate system and the optical-engine coordinate system.

For a target region already determined on the projection surface, its coordinate representation in the world coordinate system can be determined from the region's position information on the surface. The controller can use the transformation matrix to convert the world-coordinate representation into the optical-engine-coordinate representation, thereby determining the target region's position from the projection device's point of view, and can then project the card directly into that target region.

For the projection device, before any projection takes place there is no projection area on the projection surface, so the specific projection area cannot yet be determined directly. But the device can determine the state of the card it is about to project, the projection angle, and so on, and can therefore determine the coordinate representation, in the optical-engine coordinate system, of the projection area about to be projected. The controller can use the transformation matrix to convert the engine-coordinate representation into the world-coordinate representation, thereby determining the position, on the projection surface, of the projection area to be projected.

That is, with this transformation matrix, the coordinate representation of the projection area can be converted arbitrarily between the world coordinate system and the optical-engine coordinate system.

In some embodiments, from the transformation matrix, i.e., the mapping relationship between the card projected by the engine and the projection surface, the projection area can be re-determined and the projection-area correction achieved.

From the mapping relationship, the initial projection area into which the optical engine projects the card on the projection surface can first be obtained. Specifically, the initial projection area is the projection area awaiting correction, namely the display area of the image projected by the device on the projection surface before the projection area is corrected.

It should be noted that the device may be in the middle of projecting, in which case the initial projection area on the surface is displaying the projected image. For example, the user has switched on the device, the device has carried out a projection process, and a projected image has formed on the surface; the area of the surface currently displaying the image is the initial projection area. However, the current initial projection area and the user-designated target projection area may not be the same projection area, the projected image having shifted relative to the position set by the user, so the device needs to correct this initial projection area.

The device may also not be projecting, in which case no projected image is shown on the surface and the initial projection area is not displayed directly; the area the device is about to project into is then the initial projection area. The device can determine the position information of the area it is about to project into, i.e., the position information of the initial projection area. But since the projection process has not actually taken place, the user cannot identify the specific initial projection area and therefore cannot confirm whether the initial projection area and the designated target projection area are the same. In this case, to ensure the device can project the image entirely into the target projection area, the user can make the device perform projection-area correction directly, or the device can do so on its own initiative, for example actively at startup.
In some embodiments, the controller can obtain the initial projection area according to the mapping relationship.

Specifically, the controller can first determine the initial position information of the optical engine projecting the card onto the projection surface. In the embodiments of the present application, the initial position information is the position information of the initial projection area in the optical-engine coordinate system, specifically the vertex coordinates of the initial projection area.

The user can make the device project a card, forming a projected image on the surface. The card may be an image chosen by the user, including pictures or video, and its shape is generally rectangular; the image projected onto the surface can therefore also be rectangular, for comfortable viewing. That is, the projection area corresponding to the projected image is a rectangular region, and once the coordinate information of the rectangle's four vertices is determined, the position of that rectangular region is determined.

It should be noted that, whether or not the device has already carried out the projection process, it can determine what it is about to project or has projected. The device can determine the state of the card to be projected, the projection angle, and so on, and hence the position the engine projects to, i.e., the initial position information of the card projected by the engine. The controller can therefore obtain the coordinate representation of the initial projection area in the optical-engine coordinate system.

Using the mapping relationship, the controller can convert the initial position information, i.e., the vertex coordinates of the initial projection area, into position information in the world coordinate system, specifically the vertex coordinates of the initial projection area in the world coordinate system. Once the initial projection area's position in the world coordinate system is determined, the specific initial projection area is determined.

In some embodiments, if the device is currently projecting, the area of the image displayed on the surface is the initial projection area, and the initial projection area can also be obtained by the following method:

The controller can control the camera to photograph the projection surface; the resulting projection-surface image contains the card the device is projecting, and the region where that card lies is the initial projection area. The controller can recognize the projection-surface image, determine the coordinate information of the card's four vertices, and thereby determine the position of the initial projection area.

Considering that the card projected by the device may not be fully recognizable, for example when the card's edge regions are all white and the projection surface is also white, the card's edge cannot be distinguished from the surface; the card's four vertices then cannot be recognized in the photographed projection-surface image, and the initial projection area cannot be determined. The controller can therefore control the engine to project a preset card onto the surface. This preset card is used to distinguish the card from the surface and may be a green card. The controller then controls the camera to photograph the surface again, can recognize the complete card outline in it, and determines the position information of the initial projection area.
In some embodiments, after obtaining the initial projection area, the controller can perform contour correction on it to obtain the target projection area, where the target projection area is the projection area designated by the user. Contour correction of the initial projection area thus re-determines the user-designated projection area, accomplishing the projection-area correction processing.

Specifically, the controller can first control the camera to photograph the projection surface, obtaining a projection-surface image. The projection-surface image will contain assorted environmental objects, such as the screen, a TV cabinet, walls, the ceiling, and a coffee table. An object's position information can be determined by determining its contour; for example, determining the coordinates of the screen's four vertices determines the screen's position on the projection surface.

In this projection-surface image, an initial-projection-area image corresponding to the initial projection area can be generated. Specifically, since the vertex coordinates of the initial projection area in the world coordinate system have already been obtained, the four vertices of the initial projection area can first be marked in the surface image and then connected to form the initial projection area; the surface image then also shows the initial projection area.

When performing contour correction on the initial projection area, each of its four vertices can be corrected separately, yielding four corrected target vertices; the region formed by the four target vertices can be regarded as the target projection area.

In some embodiments, the target vertices can be obtained from the vertices of the initial-projection-area image; each of its four vertices corresponds to one corrected target vertex.

Specifically, the controller can perform corner detection processing on the initial-projection-area image. Through a corner detection algorithm, which may be the Shi-Tomasi algorithm, the initial-projection-area image is processed and all the corner points it contains are obtained. Corner points may be extreme points of gray-value or brightness variation in the initial-projection-area image, such as vertices of object contours; through the corner points, the vertices of the objects contained in the image, for example the screen's vertices, can be determined.

It should be noted that no geometric relation holds among the corner points themselves, so corner recognition alone cannot determine the specific screen contour. Instead, for each vertex of the initial-projection-area image, the one corner point it corresponds to after correction can be determined; these four corner points can then be taken as the four vertices of the screen, determining the specific screen contour.

For each vertex of the initial-projection-area image, a preset first size can be determined with that vertex at the center. Specifically, initial-projection-area images of different sizes may correspond to different first sizes; for a 1280×720 initial-projection-area image, for example, the first size may be set to 20 pixels. Thus, for each vertex, the vertex can be taken as the circle center and a radius of 20 pixels used to determine the first correction region corresponding to the vertex.

Within the first correction region, the controller can obtain the corner point closest to the circle center, i.e., to the vertex of the initial-projection-area image; this is called the first corner point in the embodiments of the present application. The first corner point can be taken as the target vertex corresponding to that vertex of the initial-projection-area image.

By the above method, the target vertex corresponding to each of the four vertices of the initial-projection-area image can be determined; the four target vertices form the target projection area.
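The per-vertex search described above can be sketched with OpenCV's Shi-Tomasi detector (cv2.goodFeaturesToTrack). The corner count, quality level, and radius default are illustrative assumptions:

```python
import numpy as np
import cv2

def find_target_vertices(surface_img, initial_vertices, radius=20):
    """For each vertex of the initial projection area, return the nearest
    detected corner within `radius` pixels, or None if there is none."""
    gray = cv2.cvtColor(surface_img, cv2.COLOR_BGR2GRAY)
    corners = cv2.goodFeaturesToTrack(gray, maxCorners=200,
                                      qualityLevel=0.01, minDistance=10)
    if corners is None:
        return [None] * len(initial_vertices)
    corners = corners.reshape(-1, 2)
    targets = []
    for v in np.asarray(initial_vertices, dtype=float):
        d = np.linalg.norm(corners - v, axis=1)
        i = int(np.argmin(d))
        targets.append(corners[i] if d[i] <= radius else None)
    return targets
```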
It should be noted that the error between the target projection area obtained by the above method and the true contour of the screen is extremely small, so the target projection area can be regarded as the screen area. A pixel-level screen recognition effect can be achieved, allowing precise control of the image entering the screen and greatly improving the user's viewing experience.

In some embodiments, consider that in darker scenes the screen contour is relatively dark. If the device is projecting at that time, the projection surface displays the projected image, which therefore appears in the projection-surface photograph. Since the projected image is bright and the screen contour relatively dark, the edges of the projected image may be recognized as corner points, so the first corner point may actually belong to the projected image, and the projected image may be mistakenly determined to be the screen. After the first corner point is obtained, it can therefore be checked further.

Specifically, the controller can first determine the position information of the first corner point, i.e., its coordinates in the projection-surface image. The controller can convert those image-coordinate-system coordinates directly into world-coordinate-system coordinates and, according to the mapping relationship, convert the first corner point's world coordinates into coordinates in the optical-engine coordinate system, called the corner coordinates in the embodiments of the present application. In the optical-engine coordinate system, the controller already knows the vertex coordinates of the initial projection area; it can therefore compare the vertex coordinates with the corner coordinates and judge whether they satisfy a preset second size condition.

In the embodiments of the present application, the preset second size condition is set as: the distance between the vertex coordinates and the corner coordinates is greater than a preset second size. The second size may be a size set in advance by those skilled in the art; it may be set equal to the first size or differ from it, which the embodiments of the present application do not limit.

If the vertex coordinates and corner coordinates are detected to satisfy the preset second size condition, the vertex of the initial projection area and the first corner point are far apart, and the first corner point does not belong to the edge of the initial projection area. The first corner point can then be determined to be the target vertex, yielding the target projection area, and that target projection area is the screen area.

If the vertex coordinates and corner coordinates do not satisfy the preset second size condition, the vertex of the initial projection area and the first corner point are close together, and the first corner point belongs to the edge of the initial projection area. What has been recognized is then not the true screen area, so the first corner point cannot serve as the target vertex; the screen-entry operation is not executed, i.e., the device does not proceed with the normal projection process.

In some embodiments, when the vertex coordinates and corner coordinates do not satisfy the second size condition, i.e., the first corner point cannot serve as a target vertex, the first corner point can be deleted. The search can continue among the remaining corner points of the initial-projection-area image for a new first corner point, which is checked in the same way, until a qualifying first corner point is obtained and can be taken as the target vertex. If there is no qualifying first corner point, the screen-entry operation is not executed and the device does not project.

In some embodiments, if no target vertex can be obtained, the device does not project. The device can also prompt the user, for example by projecting a screen-entry-failure message indicating that the device's placement prevents normal screen entry. The user can then adjust the device's position and the device can perform projection-area correction again, until the target projection area, i.e., the screen area, is recognized.
In some embodiments, when the target projection area is determined, what is obtained is its position information on the projection surface. It should be noted that the target projection area has four vertices and the card projected by the device also has four vertices; for accurate projection, a one-to-one correspondence between the four vertices of the target projection area and the four vertices of the card can be obtained. For example, the card's top-left vertex can be projected to the target area's top-left vertex, ensuring the card is displayed normally in the target projection area.

For a normally placed screen, its top-left, top-right, bottom-left, and bottom-right vertices can be determined, and the correspondence between the target projection area and the card obtained. However, considering that the screen area is not necessarily placed normally and may be tilted, the device also needs to determine further which specific vertex each of the four target vertices of the target projection area is, so as to determine the correspondence for projecting the card.

Specifically, the controller can first judge the state of the target projection area. The embodiments of the present application provide that the target projection area has a front-projection state and a tilted state, where the front-projection state means the area is level relative to the reference region, without tilt. The controller can obtain the coordinate information of the four target vertices in the image coordinate system or the world coordinate system; either of the two coordinate systems can be used.

For the four target vertices, their X coordinates can be tallied. The controller can then judge whether there exist two target vertices whose X-coordinate difference satisfies a preset coordinate condition. The preset coordinate condition may be set as: the difference between the X coordinates of the two target vertices is smaller than a preset size. When the preset coordinate condition is satisfied, the line through those two target vertices can be regarded as vertical, and the whole target projection area can therefore be regarded as in the front-projection state; if the condition is not satisfied, the target projection area is regarded as in the tilted state.

From the state of the target projection area, the correspondence of the four target vertices can be determined further.

In some embodiments, if the target projection area is in the front-projection state, the controller can first classify the four target vertices, determining the two target vertices with smaller X coordinates and the two with larger X coordinates. Fig. 13 shows a schematic diagram of a target projection area in some embodiments of the present application, whose four target vertices have the coordinates A(x1, y2), B(x1, y1), C(x2, y2), and D(x2, y1); vertices A and B are the two with smaller X coordinates, and C and D the two with larger X coordinates.

Of the two target vertices with smaller X coordinates, the controller can determine the one with the larger Y coordinate as the first target vertex and the one with the smaller Y coordinate as the second target vertex; that is, vertex A is determined as the first target vertex and vertex B as the second. Of the two with larger X coordinates, the one with the larger Y coordinate is determined as the third target vertex and the one with the smaller Y coordinate as the fourth; that is, vertex C is the third target vertex and vertex D the fourth.

In the embodiments of the present application, the first target vertex can be set as the top-left vertex, the second as the bottom-left, the third as the top-right, and the fourth as the bottom-right. The four target vertices correspond one-to-one with the four vertices of the target card, determining the specific projection correspondence between the target projection area and the card.

In some embodiments, if the target projection area is in the tilted state, the controller can first determine the two target vertices with smaller X coordinates and the two with larger X coordinates.

It should be noted that the target projection area may tilt to the left or to the right, so the controller can first determine the tilt direction of the target projection area.
The controller can obtain the slope of the line through the two target vertices with smaller X coordinates. Fig. 14 shows a schematic diagram of a target projection area in some embodiments, whose four target vertices have the coordinates A(x1, y1), B(x2, y2), C(x3, y3), and D(x4, y4); vertices A and B are the two with smaller X coordinates, from which the slope K of line AB is determined as

K = (x2 − x1) / (y2 − y1)

(the slope is taken here as the ratio of the X difference to the Y difference, which is well suited to the near-vertical line AB).

The tilt direction of the target projection area can be determined from the slope K. Specifically, if the slope is less than 0, the target projection area is determined to be in the first tilted state, i.e., tilting left; if the slope is greater than 0, the second tilted state, i.e., tilting right. In Fig. 14, the slope K of line AB is clearly less than 0, so the area tilts left.

After the tilt direction of the target projection area is determined, it can be determined further which specific vertex each of the four target vertices is.

The embodiments of the present application take a screen of 16:9 size as the example.

If the target projection area is in the first tilted state, i.e., tilting left, the controller can sort the four target vertices by X coordinate in ascending order and, in that order, determine them as the first, second, third, and fourth target vertices. As shown in Fig. 15, since x1 < x2 < x3 < x4, vertex A is the first target vertex, i.e., the top-left; vertex B is the second, i.e., the bottom-left; vertex C is the third, i.e., the top-right; and vertex D is the fourth, i.e., the bottom-right.

If the target projection area is in the second tilted state, i.e., tilting right, the controller can determine the four target vertices, in ascending order of X coordinate, as the second, first, fourth, and third target vertices. As shown in Fig. 15, since x2 < x1 < x4 < x3, vertex A is the first target vertex, i.e., the top-left; vertex B is the second, i.e., the bottom-left; vertex C is the third, i.e., the top-right; and vertex D is the fourth, i.e., the bottom-right.

It should be noted that the embodiments of the present application are described for a screen of 16:9 size, whose length is greater than its width, so the X coordinates are used for the computation. If the screen's length is less than its width, the Y coordinates can be used; the specific method is similar to the above and is not repeated here.
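The front-projection check and the vertex ordering described above can be condensed into one routine; the threshold value is an illustrative assumption:

```python
import numpy as np

def order_vertices(pts, vertical_eps=5.0):
    """Label four target vertices as (top-left, bottom-left, top-right,
    bottom-right) following the front-projection / tilt logic above.
    `pts` is four (x, y) pairs; `vertical_eps` is the assumed preset
    coordinate condition on the X-coordinate difference."""
    pts = sorted(map(tuple, np.asarray(pts, dtype=float)), key=lambda p: p[0])
    left, right = pts[:2], pts[2:]
    top, bot = sorted(left, key=lambda p: p[1], reverse=True)  # larger Y = upper
    if abs(top[0] - bot[0]) < vertical_eps:        # front-projection state
        tr, br = sorted(right, key=lambda p: p[1], reverse=True)
        return top, bot, tr, br                    # TL, BL, TR, BR
    k = (bot[0] - top[0]) / (bot[1] - top[1])      # slope K = dX / dY of line AB
    if k < 0:                                      # first tilt state: leaning left
        return pts[0], pts[1], pts[2], pts[3]      # ascending X = TL, BL, TR, BR
    return pts[1], pts[0], pts[3], pts[2]          # ascending X = BL, TL, BR, TR
```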
In some embodiments, a rotation assembly can also be arranged behind the screen. The controller can compute, from the slope of the target projection area, the rotation angle that would restore it to the front-projection state, send that rotation angle to the rotation assembly, and control the assembly to rotate by that angle, so that the target projection area returns to the front-projection state and the user's viewing experience is improved.

In some embodiments, after the target projection area is determined, its position information in the world coordinate system can be obtained. According to the projection relationship, the controller can convert the world-coordinate position information into position information in the optical-engine coordinate system, obtaining the target position information.

Based on the target position information, the controller can control the optical engine to project the target card into the target projection area, accomplishing the projection-area correction. Fig. 16 shows a schematic diagram of the projection area before and after correction in some embodiments of the present application. After the device moves from position A to position B, the projected image shifts and no longer lies entirely within the projection area; the device can perform projection-area correction so that at position B it can still project entirely into the projection area.

An embodiment of the present application further provides a projection-area correction method applied to a projection device; as shown in Fig. 17, the method includes:

Step 1701: in response to a projection-area correction instruction, obtain the mapping relationship between the card projected by the optical engine and the projection surface.

Step 1702: obtain, according to the mapping relationship, the initial projection area into which the optical engine projects the card on the projection surface.

Step 1703: perform contour correction on the initial projection area to obtain the target projection area.

Step 1704: determine the target position information of the target projection area according to the mapping relationship.

Step 1705: based on the target position information, control the optical engine to project the target card into the target projection area.

For identical or similar parts among the embodiments in this specification, reference may be made from one to another; they are not repeated here.
Fig. 18 shows a schematic structural diagram of a projection device in an embodiment of the present application.

In some embodiments, the laser light source in the projection device may include an independently arranged blue laser 201-20, red laser 202-20, and green laser 203-20; such a projection device may also be called a three-color projection device. The blue laser 201-20, red laser 202-20, and green laser 203-20 are all MCL-type packaged lasers, which are small in size and facilitate a compact optical path layout.

In some embodiments, referring to Fig. 18, the at least one brightness sensor may include a first brightness sensor 401-40, a second brightness sensor 402-40, and a third brightness sensor 403-40, where the first brightness sensor 401-40 is a blue-light or white-light brightness sensor, the second brightness sensor 402-40 is a red-light or white-light brightness sensor, and the third brightness sensor 403-40 is a green-light or white-light brightness sensor.

The first brightness sensor 401-40 is arranged in the light output path of the blue laser 201-20, specifically on one side of the output path of the blue laser's collimated beam. Likewise, the second brightness sensor 402-40 is arranged in the output path of the red laser 202-20, specifically on one side of the output path of the red laser's collimated beam, and the third brightness sensor 403-40 in the output path of the green laser 203-20, specifically on one side of the output path of the green laser's collimated beam. Since the laser light emitted by each laser is not attenuated along its output path, arranging the brightness sensors in the lasers' output paths improves the accuracy with which the sensors detect the lasers' first brightness values.

The display control circuit is also configured to read the brightness value detected by the first brightness sensor 401-40 while controlling the blue laser 201-20 to emit blue laser light, and to stop reading it when the blue laser 201-20 is switched off.

Likewise, the display control circuit reads the brightness value detected by the second brightness sensor 402-40 while controlling the red laser 202-20 to emit red laser light, and stops reading it when the red laser 202-20 is switched off.

The display control circuit also reads the brightness value detected by the third brightness sensor 403-40 while controlling the green laser 203-20 to emit green laser light, and stops reading it when the green laser 203-20 is switched off.

It should be noted that there may also be a single brightness sensor, arranged in the combined-light path of the three-color laser.
Fig. 19 shows a schematic structural diagram of a projection device in another embodiment of the present application.

In some embodiments, the projection device may further include a light pipe 110, which serves as a light-collecting optical component for receiving and homogenizing the combined three-color laser output.

In some embodiments, the brightness sensors may include a fourth brightness sensor 404, which may be a white-light brightness sensor. The fourth brightness sensor 404 is arranged in the light output path of the light pipe 110, for example on the output side of the light pipe, close to its output face.

The display control circuit is also configured to read the brightness value detected by the fourth brightness sensor 404 when the blue laser 201-20, red laser 202-20, and green laser 203-20 are switched on in time-division, ensuring that the fourth brightness sensor 404 can detect the first brightness value of the blue laser 201-20, the first brightness value of the red laser 202-20, and the first brightness value of the green laser 203-20. When the blue, red, and green lasers are all switched off, it stops reading the value detected by the fourth brightness sensor 404.

In some embodiments, the fourth brightness sensor 404 remains on throughout the device's image projection.

In some embodiments, referring to Figs. 18 and 19, the projection device may further include a fourth dichroic plate 604, a fifth dichroic plate 605, a fifth reflector 904, a second lens assembly, a diffusion wheel 150, a TIR lens 120, a DMD 130, and a projection lens 140. The second lens assembly includes a first lens 901-90, a second lens 902-90, and a third lens 903-90. The fourth dichroic plate 604 can transmit blue laser light and reflect green laser light; the fifth dichroic plate 605 can transmit red laser light and reflect green and blue laser light.

The blue laser light emitted by the blue laser 201-20 passes through the fourth dichroic plate 604 and is then reflected by the fifth dichroic plate 605 into the first lens 901-90 for focusing. The red laser light emitted by the red laser 202-20 passes through the fifth dichroic plate 605 directly into the first lens 901-90 for focusing. The green laser light emitted by the green laser 203-20 is reflected by the fifth reflector 904 and then reflected in turn by the fourth dichroic plate 604 and the fifth dichroic plate 605 before entering the first lens 901-90 for focusing. The blue, red, and green laser light focused by the first lens 901-90 passes in time-division through the rotating diffusion wheel 150 for speckle reduction, is projected into the light pipe 110 for homogenization, is shaped by the second lens 902-90 and the third lens 903-90, undergoes total internal reflection in the TIR lens 120, is reflected by the DMD 130, passes back through the TIR lens 120, and is finally projected by the projection lens 140 onto the display screen, forming the image to be displayed.
Fig. 20 shows a schematic signaling interaction sequence diagram of a projection device implementing the eye-protection function in another embodiment of the present application.

In some embodiments, the projection device provided by the present application can implement an eye-protection function. To prevent the risk of eyesight damage when a user accidentally enters the trajectory of the laser emitted by the device, when the user enters the device's preset specific non-safe region, the controller can make the user interface display corresponding prompt information to remind the user to leave the current region, and the controller can also make the user interface reduce its display brightness to prevent the laser from harming the user's eyesight.

In some embodiments, when the projection device is configured in children's viewing mode, the controller automatically turns on the eye-protection switch.

In some embodiments, after the controller receives position-movement data sent by the gyroscope sensor, or foreign-object intrusion data collected by other sensors, the controller turns on the device's eye-protection switch.

In some embodiments, when the data collected by the time-of-flight (ToF) sensor, the camera, or other devices triggers any preset threshold condition, the controller makes the user interface reduce its display brightness and display prompt information, and lowers the emission power, brightness, and intensity of the optical engine, protecting the user's eyesight.

In some embodiments, the projection device controller can make the correction service send signaling to the ToF sensor: step 2001, query the current device state of the projection device, after which the controller receives the data feedback from the ToF sensor.

Step 2002: the correction service can send to the inter-process communication framework (HSP Core) signaling notifying the algorithm service to start the eye-protection flow. Step 2003: the inter-process communication framework (HSP Core) makes service-capability calls into the algorithm library to invoke the corresponding algorithm services, which may include, for example, a photo-detection algorithm, a screenshot algorithm, and a foreign-object detection algorithm.

Step 2004: the inter-process communication framework (HSP Core) returns the foreign-object detection result, based on the above algorithm services, to the correction service. If the returned result reaches the preset threshold condition, the controller makes the user interface display prompt information and reduce the display brightness; the signaling sequence is shown in Fig. 20.
In some embodiments, with the device's eye-protection switch on, when the user enters the preset specific region the device automatically reduces the intensity of the laser emitted by the optical engine, lowers the display brightness of the user interface, and displays safety prompt information. The device can control the above eye-protection function by the following method, as shown in Fig. 25:

Based on the projection image obtained by the camera, the controller uses an edge detection algorithm to identify the device's target projection area; when the target projection area appears rectangular or nearly rectangular, the controller obtains the coordinate values of the four vertices of the rectangular target projection area through a preset algorithm.

When implementing foreign-object detection within the target projection area, a perspective-transform method can be used to rectify the target projection area into a rectangle, and the difference between the rectangle and the projection screenshot is computed so as to judge whether a foreign object is present in the display area; if the judgment is that a foreign object exists, the device automatically triggers the eye-protection function.

Step 2501: enable eye protection.

When implementing foreign-object detection in a certain region outside the projection range, the difference between the camera content of the current frame and that of the previous frame can be computed to judge whether a foreign object has entered the region outside the projection range; if the judgment is that a foreign object has entered, the device automatically triggers the eye-protection function.

At the same time, the device can also use a time-of-flight (ToF) camera or ToF sensor to detect real-time depth changes in the specific region; if the change in depth exceeds a preset threshold, the device automatically triggers the eye-protection function.

In some embodiments, the device analyzes the collected time-of-flight data, screenshot data, and camera data to judge whether the eye-protection function needs to be turned on.

For example: step 2502, use the collected time-of-flight data; step 2503, the controller performs a depth-difference analysis; step 2504, if the depth difference is greater than a preset threshold X (for example, with X set to 0), it can be determined that a foreign object is within the device's specific region; step 2505, the picture dims and a prompt pops up. If a user is within the specific region, their eyesight is at risk of laser damage, so the device automatically starts the eye-protection function, reducing the laser intensity emitted by the optical engine, lowering the display brightness of the user interface, and displaying safety prompt information.

As another example: step 2506, collect screenshot data; step 2507, perform additive-color (RGB) difference analysis; step 2508, if the RGB difference is greater than a preset threshold Y, it can be determined that a foreign object is within the device's specific region; step 2505, the picture dims and a prompt pops up. If a user is present in the specific region, their eyesight is at risk of laser damage, so the device automatically starts the eye-protection function, reducing the emitted laser intensity, lowering the display brightness of the user interface, and displaying the corresponding safety prompt information.

As another example: step 2509, collect camera data; step 2510, obtain the projection coordinates; step 2511, determine the device's target projection area; step 2507, perform additive-color (RGB) difference analysis within the target projection area; step 2508, if the RGB difference is greater than the preset threshold Y, a foreign object is within the device's specific region; step 2505, the picture dims and a prompt pops up. If a user is present in the specific region, their eyesight is at risk of laser damage, so the device automatically starts the eye-protection function as above.

Step 2512: the obtained projection coordinates lie in an extended region; step 2513, the controller can still perform additive-color (RGB) difference analysis in the extended region; step 2514, if the RGB difference is greater than the preset threshold Y, a foreign object is within the device's specific region; step 2505, the picture dims and a prompt pops up. If a user is present in the specific region, their eyesight is at risk of damage from the laser emitted by the device, so the device automatically starts the eye-protection function, reducing the emitted laser intensity, lowering the display brightness of the user interface, and displaying the corresponding safety prompt information.
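The two detection signals described above, the inter-frame RGB difference and the ToF depth difference, can be sketched as follows. The thresholds X and Y are the patent's unspecified presets and are given placeholder values here:

```python
import numpy as np
import cv2

DEPTH_THRESHOLD_X = 0.0    # placeholder for preset threshold X (meters)
RGB_THRESHOLD_Y = 15.0     # placeholder for preset threshold Y (mean abs diff)

def foreign_object_detected(prev_frame, cur_frame, prev_depth, cur_depth):
    """Trigger eye protection if either the RGB difference between two
    camera frames or the ToF depth change exceeds its threshold."""
    rgb_diff = cv2.absdiff(cur_frame, prev_frame).mean()
    depth_diff = float(np.abs(cur_depth - prev_depth).max())
    return rgb_diff > RGB_THRESHOLD_Y or depth_diff > DEPTH_THRESHOLD_X
```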
Fig. 21 shows a schematic signaling interaction sequence diagram of a projection device implementing the display image correction function in another embodiment of the present application.

In some embodiments, the device usually monitors its movement through a gyroscope or gyroscope sensor. Step 2101: the correction service sends the gyroscope signaling for querying the device state and receives from the gyroscope feedback signaling used to determine whether the device has moved.

In some embodiments, the device's display correction strategy may be configured so that when the gyroscope and the ToF sensor both register changes, the device triggers keystone correction first. After the gyroscope data has been stable for a preset length of time: step 2102, notify the algorithm service to start the keystone correction flow. The controller starts and triggers keystone correction, and it can also configure the device not to respond to commands from the remote-control keys while keystone correction is in progress; to support the keystone correction, the device projects a pure-white card.

The keystone correction algorithm can use the binocular camera to construct the transformation matrix between the projection surface in the world coordinate system and the optical-engine coordinate system; it then combines the engine intrinsics to compute the homography between the projected image and the played card, and uses that homography to perform arbitrary shape conversion between the projected image and the played card.

In some embodiments, the correction service sends the signaling notifying the algorithm service to start the keystone correction flow to the inter-process communication framework (HSP CORE), and the framework further sends service-capability call signaling to the algorithm service to obtain the algorithm corresponding to the capability.

The algorithm service obtains and executes the photographing and image algorithm processing service and the obstacle-avoidance algorithm service and sends them, carried in signaling, to the inter-process communication framework. In some embodiments, the framework executes the above algorithms and feeds the execution results back to the correction service; the execution results may include photographing success and obstacle-avoidance success.

In some embodiments, if an error occurs while the device executes the above algorithms or transfers data, the correction service makes the user interface display an error-return prompt and makes the user interface project the keystone correction and autofocus cards again.

Through the automatic obstacle-avoidance algorithm, the device can recognize the screen and, using the projection transformation, correct the projected image to display within the screen, achieving alignment with the screen edges.

Through the autofocus algorithm, the device can use the time-of-flight (ToF) sensor to obtain the distance between the optical engine and the projection surface, look up the optimal image distance in a preset mapping table based on that distance, and evaluate the sharpness of the projected image with an image algorithm, fine-tuning the image distance on that basis.

In some embodiments: step 2103, the automatic keystone correction signaling that the correction service sends to the inter-process communication framework may include other function configuration instructions, for example control instructions on whether to perform synchronized obstacle avoidance and whether to perform screen entry.

The inter-process communication framework sends service-capability call signaling to the algorithm service, so that the algorithm service obtains and executes the autofocus algorithm, adjusting the viewing distance between the device and the screen. In some embodiments, after the autofocus algorithm has been applied to implement the corresponding function, the algorithm service can also obtain and execute the automatic screen-entry algorithm, a process that may include the keystone correction algorithm.

In some embodiments, by executing automatic screen entry, the algorithm service can set the eight position coordinates between the device and the screen; it then adjusts the viewing distance between the device and the screen again through the autofocus algorithm, and finally feeds the correction result back to the correction service. Step 2104: the user interface is made to display the correction result, as shown in Fig. 21.
In some embodiments, through the autofocus algorithm the projection device uses its configured laser ranging to obtain the current object distance, from which the initial focal length and the search range are computed; the device then drives the camera (Camera) to take photographs and evaluates sharpness with the corresponding algorithm.

Within the above search range, the device looks for the possible optimal focal length with a search algorithm, then repeats the photographing and sharpness-evaluation steps, finally finding the optimal focal length through sharpness comparison and completing autofocus.

For example: step 2201, the projection device starts; step 2202, the user moves the device, and the device refocuses after automatically completing correction; step 2203, the controller checks whether the autofocus function is enabled, and when it is not enabled, the controller ends the autofocus task; step 2204, when the autofocus function is enabled, the device obtains through the middleware the distance detected by the time-of-flight (ToF) sensor for computation.

Step 2205: the controller looks up a preset mapping table according to the obtained distance to get the approximate focal length of the device; step 2206, the middleware sets the obtained focal length on the device's optical engine.

Step 2207: after the optical engine emits laser light at the above focal length, the camera executes the photographing instruction; step 2208, the controller judges from the obtained photographing result and the evaluation function whether the device's focusing is complete. If the judgment meets the preset completion condition, the autofocus flow is ended; step 2209, if it does not, the middleware fine-tunes the focal length parameter of the device's optical engine, for example gradually by a preset step, and sets the adjusted focal length parameter on the engine again. The photographing and sharpness-evaluation steps are thus repeated, and the optimal focal length is finally found through sharpness comparison, completing autofocus, as shown in Fig. 22.
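The coarse table lookup followed by the fine sharpness search might be organized as below. The hardware and middleware accesses are abstracted as callables, and the Laplacian-variance sharpness metric is an assumed evaluation function, not the one specified by the patent:

```python
import cv2

FOCUS_STEP = 2     # assumed fine-tuning step (focus-motor units)
MAX_ITERS = 20

def sharpness(image):
    """Assumed focus metric: variance of the Laplacian."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def autofocus(read_distance, lookup_focus, set_focus, capture):
    """Coarse lookup from the ToF distance, then hill-climb on sharpness.
    The four callables abstract the middleware and hardware."""
    focus = lookup_focus(read_distance())      # approximate focal position
    set_focus(focus)
    best_focus, best_score = focus, sharpness(capture())
    for step in (FOCUS_STEP, -FOCUS_STEP):     # search in both directions
        f = best_focus
        for _ in range(MAX_ITERS):
            f += step
            set_focus(f)
            score = sharpness(capture())
            if score <= best_score:            # sharpness stopped improving
                break
            best_focus, best_score = f, score
    set_focus(best_focus)
    return best_focus
```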
In some embodiments, the projection device provided by the present application can implement the display correction function through the keystone correction algorithm.

First, based on the calibration algorithm, two sets of extrinsic parameters can be obtained: between the two cameras, and between the camera and the optical engine, i.e., the rotation and translation matrices. The device's optical engine then plays a specific checkerboard card, and the depth values of the projected checkerboard corner points are computed, for example by solving the xyz coordinate values through the translation relation between the binocular cameras and the principle of similar triangles. The projection surface is then fitted from these xyz values, and its rotation and translation relative to the camera coordinate system are obtained, which may specifically include the pitch relation (Pitch) and the yaw relation (Yaw).

The roll (Roll) parameter value can be obtained from the device's configured gyroscope, so that the complete rotation matrix is assembled; finally, the extrinsics from the projection surface in the world coordinate system to the optical-engine coordinate system are computed.

Combining the camera and optical-engine R and T values computed in the above steps, the transformation between the projection-surface world coordinate system and the optical-engine coordinate system can be derived; combined with the engine intrinsics, the homography matrix from points on the projection surface to points of the engine card can be composed.

Finally, a rectangle is selected on the projection surface, and the corresponding coordinates of the engine card are solved inversely through the homography. These coordinates are the correction coordinates; setting them on the optical engine accomplishes keystone correction.
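The final step can be expressed compactly with OpenCV's homography solver: given matched point pairs between the projection surface and the engine card, the rectangle chosen on the surface is mapped back to card coordinates. The point values below are placeholders:

```python
import numpy as np
import cv2

# Matched points: projection-surface (world plane) <-> engine card pixels.
# Placeholder correspondences; in practice they come from the projected
# checkerboard corners described above.
world_pts = np.float32([[0, 0], [1.6, 0], [1.6, 0.9], [0, 0.9]])
card_pts = np.float32([[12, 30], [1890, 8], [1902, 1065], [5, 1040]])

H, _ = cv2.findHomography(world_pts, card_pts)

# Rectangle selected on the projection surface (meters), mapped to the
# card pixel coordinates that the optical engine should actually drive.
rect = np.float32([[[0.1, 0.05], [1.5, 0.05], [1.5, 0.85], [0.1, 0.85]]])
correction_coords = cv2.perspectiveTransform(rect, H)
```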
For example, the flow, as shown in Fig. 23, includes:

Step 2301: the projection device controller obtains the depth value of the point corresponding to each photo pixel, or the coordinates of the projection point in the camera coordinate system;

Step 2302: from the depth values, the middleware obtains the relation between the optical-engine coordinate system and the camera coordinate system;

Step 2303: the controller computes the coordinate values of the projection point in the optical-engine coordinate system;

Step 2304: a plane is fitted from the coordinate values to obtain the angle between the projection surface and the optical engine;

Step 2305: from the angle relation, the corresponding coordinates of the projection point in the world coordinate system of the projection surface are obtained;

Step 2306: from the coordinates of the card in the optical-engine coordinate system and the coordinates of the corresponding points on the projection surface, the homography matrix can be computed.

Step 2307: the controller judges, based on the data obtained above, whether an obstacle exists;

Step 2308: when an obstacle exists, rectangle coordinates are chosen arbitrarily on the projection surface in the world coordinate system, and the region the optical engine should project is computed from the homography relation;

Step 2309: when no obstacle exists, the controller can, for example, obtain the feature points of a two-dimensional code;

Step 2310: obtain the coordinates of the two-dimensional code in the prefabricated card;

Step 2311: obtain the homography relation between the camera photo and the preset card;

Step 2312: convert the obtained obstacle coordinates into the card, yielding the coordinates of the card region occluded by the obstacle.

Step 2313: from the coordinates, in the optical-engine coordinate system, of the card region occluded by the obstacle, the coordinates of the occluded region on the projection surface are obtained through the homography matrix; step 2314, rectangle coordinates are chosen on the projection surface in the world coordinate system while avoiding the obstacle, and the region the optical engine should project is found from the homography relation.

It can be understood that, at the rectangle-selection step of the keystone correction algorithm flow, the obstacle-avoidance algorithm uses the algorithm library (OpenCV) to extract the contour of the foreign object and avoids that obstacle when selecting the rectangle, implementing the projection obstacle-avoidance function.
In some embodiments, as shown in Fig. 24, the flow includes:

Step 2401: the middleware obtains the two-dimensional-code card photographed by the camera;

Step 2402: the feature points of the two-dimensional code are recognized and their coordinates in the camera coordinate system obtained;

Step 2403: the controller further obtains the coordinates of the preset card in the optical-engine coordinate system;

Step 2404: the homography relation between the camera plane and the optical-engine plane is solved;

Step 2405: based on the above homography relation, the controller recognizes the coordinates of the four vertices of the screen photographed by the camera;

Step 2406: from the homography matrix, the range of the card the optical engine should project onto the screen is obtained.

It can be understood that in some embodiments the screen-entry algorithm, based on the algorithm library (OpenCV), can recognize and extract the largest black closed rectangular contour and judge whether it is of 16:9 size; a specific card is projected and photographed with the camera, multiple corner points are extracted from the photo to compute the homography between the projection surface (the screen) and the card played by the optical engine, the four screen vertices are converted through the homography into the engine pixel coordinate system, and converting the engine card to the four screen vertices completes the computation and comparison.
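The contour-screening step of the screen-entry algorithm could be sketched as follows; the binarization threshold and the aspect-ratio tolerance are illustrative assumptions:

```python
import cv2

def find_screen_contour(image, ratio_tol=0.1):
    """Return the largest closed quadrilateral contour whose aspect ratio
    is approximately 16:9, or None if no candidate passes the check."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in sorted(contours, key=cv2.contourArea, reverse=True):
        approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
        if len(approx) == 4:                      # quadrilateral candidate
            _, _, w, h = cv2.boundingRect(approx)
            if abs(w / h - 16 / 9) < ratio_tol:   # 16:9 aspect check
                return approx
    return None
```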
A long-throw micro projection device (for example, a micro-projection TV) can be moved flexibly, so after each move the projected image may be distorted; in addition, foreign objects may block the projection surface, or the projected image may stray from the screen. For these problems, the projection device and the geometric-correction-based display control method provided by the present application can complete correction automatically, implementing functions including automatic keystone correction, automatic screen entry, automatic obstacle avoidance, automatic focusing, and eye protection.

For ease of explanation, the above description has been given in connection with specific implementations. However, the above discussion in some embodiments is not intended to be exhaustive or to limit the implementations to the specific forms disclosed above. Many modifications and variations can be obtained in light of the above teachings. The above implementations were chosen and described to better explain the present disclosure, so that those skilled in the art can make better use of the implementations.

Claims (19)

  1. A projection device, comprising:
    an optical engine configured to project a card onto a projection surface;
    a controller configured to:
    in response to a projection-area correction instruction, obtain the mapping relationship between the card projected by the optical engine and the projection surface;
    obtain, according to the mapping relationship, the initial projection area into which the optical engine projects the card on the projection surface;
    perform contour correction on the initial projection area to obtain a target projection area;
    determine target position information of the target projection area according to the mapping relationship;
    control, according to the target position information, the optical engine to project a target card into the target projection area.
  2. The projection device according to claim 1, further comprising:
    a camera configured to photograph the image displayed on the projection surface;
    the controller being configured to:
    in the step of obtaining the mapping relationship between the card projected by the optical engine and the projection surface,
    control the optical engine to project a first card onto the projection surface and control the camera to photograph the first card to obtain a first image, the first card containing feature points;
    obtain first coordinates of the feature points in the image coordinate system corresponding to the first image;
    obtain, according to the first coordinates, second coordinates of the feature points in the camera coordinate system;
    convert the second coordinates into third coordinates of the feature points in the optical-engine coordinate system;
    obtain, according to the third coordinates, the projection-surface equation of the projection surface in the optical-engine coordinate system;
    obtain, according to the projection-surface equation, the transformation matrix between the optical-engine coordinate system and the world coordinate system, the transformation matrix being used to characterize the mapping relationship.
  3. The projection device according to claim 2, the controller being configured to:
    in the step of obtaining the first coordinates of the feature points in the image coordinate system corresponding to the first image,
    determine at least three feature points in the first image and obtain the first coordinates of each of the at least three feature points;
    in the step of obtaining, according to the third coordinates, the projection-surface equation of the projection surface in the optical-engine coordinate system,
    fit the third coordinates corresponding to the at least three feature points to obtain the projection-surface equation of the projection surface in the optical-engine coordinate system.
  4. The projection device according to claim 2, the controller being configured to:
    in the step of performing contour correction on the initial projection area to obtain the target projection area,
    control the camera to photograph the projection surface to obtain a projection-surface image;
    generate, in the projection-surface image, an initial-projection-area image corresponding to the initial projection area;
    obtain target vertices according to the vertices of the initial-projection-area image;
    determine the region formed by the target vertices as the target projection area.
  5. The projection device according to claim 4, the controller being configured to:
    in the step of obtaining target vertices according to the vertices of the initial-projection-area image,
    perform corner detection processing on the initial-projection-area image to obtain all corner points contained in the initial-projection-area image;
    taking each vertex of the initial-projection-area image as the center, within a preset first size, obtain the first corner point closest to that vertex of the initial-projection-area image;
    determine the closest first corner point as the target vertex.
  6. The projection device according to claim 5, the controller being configured to:
    in the step of obtaining, according to the mapping relationship, the initial projection area into which the optical engine projects the card on the projection surface,
    determine initial position information of the optical engine projecting the card onto the projection surface, the initial position information being the vertex coordinates of the initial projection area in the optical-engine coordinate system;
    convert, according to the mapping relationship, the vertex coordinates into position information in the world coordinate system;
    determine the initial projection area according to the position information in the world coordinate system.
  7. The projection device according to claim 6, the controller being further configured to:
    before the step of determining the closest first corner point as the target vertex,
    determine the coordinates of the first corner point in the world coordinate system and convert them, according to the mapping relationship, into corner coordinates in the optical-engine coordinate system;
    judge whether the vertex coordinates and the corner coordinates satisfy a preset second size condition;
    if so, execute the step of determining the closest first corner point as the target vertex.
  8. The projection device according to claim 5, the controller being configured to:
    after the step of determining the closest first corner point as the target vertex,
    obtain coordinate information of the target vertices in the image coordinate system or the world coordinate system, the number of target vertices being four;
    judge whether there exist two target vertices whose X-coordinate difference satisfies a preset coordinate condition;
    if so, determine that the target projection area is in a front-projection state; if not, determine that the target projection area is in a tilted state;
    if the target projection area is in the front-projection state, determine the two target vertices with smaller X coordinates and the two target vertices with larger X coordinates;
    of the two target vertices with smaller X coordinates, determine the one with the larger Y coordinate as the first target vertex and the one with the smaller Y coordinate as the second target vertex; of the two target vertices with larger X coordinates, determine the one with the larger Y coordinate as the third target vertex and the one with the smaller Y coordinate as the fourth target vertex; the four target vertices correspond one-to-one with the four vertices of the target card.
  9. The projection device according to claim 8, the controller being configured to:
    if the target projection area is in the tilted state, determine the two target vertices with smaller X coordinates and the two target vertices with larger X coordinates;
    obtain the slope of the line through the two target vertices with smaller X coordinates; if the slope is less than 0, determine that the target projection area is in a first tilted state; if the slope is greater than 0, determine that the target projection area is in a second tilted state;
    if the target projection area is in the first tilted state, determine the four target vertices, in ascending order of X coordinate, as the first, second, third, and fourth target vertices; if the target projection area is in the second tilted state, determine the four target vertices, in ascending order of X coordinate, as the second, first, fourth, and third target vertices.
  10. The projection device according to claim 1, the controller being further configured to:
    based on the projection image obtained by the camera, identify the device's target projection area with an edge detection algorithm; when the target projection area appears rectangular or nearly rectangular, obtain the coordinate values of the four vertices of the rectangular target projection area through a preset algorithm.
  11. The projection device according to claim 10, the controller being further configured to:
    rectify the target projection area into a rectangle using a perspective-transform method, and compute the difference between the rectangle and the projection screenshot, so as to judge whether a foreign object is present in the display area.
  12. The projection device according to claim 10, the controller being further configured to:
    when detecting foreign objects in a certain region outside the projection range, compute the difference between the camera content of the current frame and that of the previous frame to judge whether a foreign object has entered the region outside the projection range; if a foreign object is judged to have entered, automatically trigger the eye-protection function.
  13. The projection device according to claim 10, the controller being further configured to:
    detect real-time depth changes in a specific region using a time-of-flight camera or a time-of-flight sensor; if the change in depth exceeds a preset threshold, the projection device automatically triggers the eye-protection function.
  14. The projection device according to claim 10, the controller being further configured to: analyze the collected time-of-flight data, screenshot data, and camera data to judge whether the eye-protection function needs to be turned on.
  15. The projection device according to claim 10, the controller being further configured to:
    if a user is in the specific region and the user's eyesight is at risk of laser damage, automatically start the eye-protection function, reducing the laser intensity emitted by the optical engine, lowering the display brightness of the user interface, and displaying safety prompt information.
  16. The projection device according to claim 10, the controller being further configured to:
    monitor device movement through a gyroscope or gyroscope sensor; send the gyroscope signaling for querying the device state, and receive from the gyroscope feedback signaling for determining whether the device has moved.
  17. The projection device according to claim 10, the controller being further configured to:
    after the gyroscope data has been stable for a preset length of time, control the start and triggering of keystone correction, and not respond to commands from the remote-control keys while keystone correction is in progress.
  18. The projection device according to claim 10, the controller being further configured to:
    recognize the screen through the automatic obstacle-avoidance algorithm and, using the projection transformation, correct the projected image to display within the screen, achieving alignment with the screen edges.
  19. A projection-area correction method for a projection device, the method comprising:
    in response to a projection-area correction instruction, obtaining the mapping relationship between the card projected by an optical engine and a projection surface;
    obtaining, according to the mapping relationship, the initial projection area into which the optical engine projects the card on the projection surface;
    performing contour correction on the initial projection area to obtain a target projection area;
    determining target position information of the target projection area according to the mapping relationship;
    controlling, according to the target position information, the optical engine to project a target card into the target projection area.
PCT/CN2022/132250 2021-11-16 2022-11-16 Projection device and projection area correction method WO2023088304A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202280063350.5A CN118077192A (zh) 2021-11-16 2022-11-16 Projection device and projection area correction method

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN202111355866.0 2021-11-16
CN202111355866 2021-11-16
CN202210389054.6A CN114827563A (zh) 2021-11-16 2022-04-13 Projection device and projection area correction method
CN202210389054.6 2022-04-13

Publications (1)

Publication Number Publication Date
WO2023088304A1 (zh)

Family Applications (3)

Application Number / Publication / Title
PCT/CN2022/122810 WO2023087950A1 (zh): Projection device and display control method
PCT/CN2022/132250 WO2023088304A1 (zh): Projection device and projection area correction method
PCT/CN2022/132368 WO2023088329A1 (zh): Projection device and projected image correction method
