WO2023087951A1 - Projection device and display control method for projected images - Google Patents

Projection device and display control method for projected images

Info

Publication number
WO2023087951A1
Authority
WO
WIPO (PCT)
Prior art keywords
projection
image
projection device
area
controller
Prior art date
Application number
PCT/CN2022/122816
Other languages
English (en)
French (fr)
Inventor
王昊
何营昊
卢平光
王英俊
岳国华
唐高明
陈先义
郑晴晴
Original Assignee
海信视像科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN202210050600.3A external-priority patent/CN114205570A/zh
Priority claimed from CN202210399514.3A external-priority patent/CN114866751A/zh
Application filed by 海信视像科技股份有限公司
Priority to CN202280063190.4A priority Critical patent/CN118104229A/zh
Publication of WO2023087951A1 publication Critical patent/WO2023087951A1/zh

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]

Definitions

  • the present application relates to the technical field of display devices, and in particular, to a projection device and a display control method for projected images.
  • a deviation in projection angle or projection distance caused by movement of the projector can prevent the projected image from being projected entirely into the preset projection area, for example producing a trapezoidal projected image; the projector's image correction function can then adjust the display position and display shape of the projected image so that it is displayed entirely within the preset projection area and the center of the projected image coincides with the center of the projection area.
  • the position of the projection surface in the camera coordinate system is usually obtained through an RGBD (RGB-Depth) camera configured on the projector;
  • the position of the projected image in the camera coordinate system is then converted to the optical machine coordinate system; finally, the angle between the projector's optical machine and the projection surface is calculated and used to correct the projected image.
  • a first aspect of the embodiments of the present application provides a projection device, including: an optical machine, used to project playback content onto a projection surface; a binocular camera, including a left camera and a right camera, used to acquire a display image of the projection surface;
  • the controller is configured to: after detecting movement of the projection device, control the binocular camera to acquire a first image and a second image successively projected by the optical machine onto the projection surface, where the first image corresponds to projecting a first chart and the second image corresponds to projecting a second chart, the first image and the second image being compared by corresponding features to determine the homography relationship between the projected chart and the projection surface in the world coordinate system; and, based on the homography relationship, transform the image captured by one of the binocular cameras into a rectangular area in the world coordinate system and control the optical machine to project the playback content into the rectangular area, the rectangular area being used to replace the deformed projection area formed on the projection surface after the projection device moved.
  • a second aspect of the embodiments of the present application provides a display control method for a projected image, the method comprising: after movement is detected, acquiring a first image and a second image successively projected onto the projection surface, the first image corresponding to projecting a first chart and the second image corresponding to projecting a second chart, the first image and the second image being compared by corresponding features to determine the homography relationship between the projected chart and the projection surface in the world coordinate system; and, based on the homography relationship, transforming the image captured by one of the binocular cameras into a rectangular area in the world coordinate system and projecting the playback content into the rectangular area, the rectangular area being used to replace the deformed projection area formed on the projection surface after the movement.
  • FIG. 1A is a schematic diagram of placement of projection equipment according to an embodiment of the present application.
  • FIG. 1B is a schematic diagram of an optical path of a projection device according to an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a circuit structure of a projection device according to an embodiment of the present application
  • FIG. 3 is a schematic structural diagram of a projection device according to an embodiment of the present application.
  • FIG. 4 is a schematic structural diagram of a projection device according to another embodiment of the present application.
  • FIG. 5 is a schematic diagram of a circuit structure of a projection device according to an embodiment of the present application.
  • FIG. 6A is a schematic diagram of a system framework for realizing display control by a projection device according to an embodiment of the present application
  • FIG. 6B is a schematic diagram of the signaling interaction sequence of the projection device implementing the anti-eye (eye protection) function according to another embodiment of the present application.
  • FIG. 6C is a schematic diagram of a signaling interaction sequence for realizing a display image correction function of a projection device according to another embodiment of the present application.
  • FIG. 6D is a schematic flow diagram of a projection device implementing an autofocus algorithm according to another embodiment of the present application.
  • FIG. 6E is a schematic flow diagram of a projection device implementing trapezoidal correction and obstacle avoidance algorithms according to another embodiment of the present application.
  • FIG. 6F is a schematic flow diagram of a projection device implementing a screen entry algorithm according to another embodiment of the present application.
  • FIG. 6G is a schematic flow diagram of a projection device implementing an anti-eye algorithm according to another embodiment of the present application.
  • FIG. 6H is a schematic projection diagram of a projection device according to another embodiment of the present application.
  • FIG. 6I is a schematic projection diagram of a projection device according to another embodiment of the present application.
  • FIG. 6J is a schematic diagram of the process of automatically correcting the projected image by the projection device according to another embodiment of the present application.
  • FIG. 6K is a schematic diagram of the process of automatically correcting the projected image by the projection device according to another embodiment of the present application.
  • FIG. 7 is a schematic diagram of a lens structure of a projection device in another embodiment of the present application.
  • FIG. 8 is a schematic structural diagram of a distance sensor and a camera of a projection device in another embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of a distance sensor and a camera in another embodiment of the present application.
  • FIG. 10 is a schematic diagram of image correction triggered by a projection device in another embodiment of the present application.
  • FIG. 11 is a schematic diagram of determining whether the projection device moves in another embodiment of the present application.
  • Fig. 12 is a schematic diagram of the interaction between the controller, the optical machine and the projection surface in another embodiment of the present application.
  • Fig. 13 is a schematic diagram of a white card for controlling the optical-mechanical projection calibration of the projection device in another embodiment of the present application;
  • FIG. 14 is a schematic diagram of a projection device controlling an optomechanical projection correction chart in another embodiment of the present application.
  • Fig. 15 is a schematic diagram of a calibration chart in another embodiment of the present application.
  • Fig. 16 is a schematic diagram of another calibration chart in another embodiment of the present application.
  • Fig. 17 is a flow chart of the projection device triggering the eye protection mode in another embodiment of the present application.
  • FIG. 1A is a schematic diagram of placement of projection equipment according to an embodiment of the present application.
  • a projection device provided by the present application includes a projection screen 1 and a device 2 for projection.
  • the projection screen 1 is fixed at a first position, and the device 2 for projection is placed at a second position so that the projected picture coincides with the projection screen 1.
  • this step is performed by professional after-sales technicians; that is, the second position is the optimal placement of the device 2 for projection.
  • FIG. 1B is a schematic diagram of an optical path of a projection device according to an embodiment of the present application.
  • An embodiment of the present application provides a projection device, including a laser light source 100 , an optical machine 200 , a lens 300 , and a projection medium 400 .
  • the laser light source 100 provides illumination for the light machine 200
  • the light machine 200 modulates the light beam, and outputs it to the lens 300 for imaging, and projects it to the projection medium 400 to form a projection image.
  • the laser light source of the projection device includes a laser component and an optical lens component, and the light beam emitted by the laser component can pass through the optical lens component to provide illumination for the optical machine.
  • the optical lens components require a higher level of environmental cleanliness and a hermetic level of sealing, while the chamber in which the laser component is installed can be sealed to a lower, dust-proof level to reduce sealing costs.
  • the light engine 200 of the projection device may be implemented to include a blue light engine, a green light engine, and a red light engine, and may also include a heat dissipation system, a circuit control system, and the like. It should be noted that, in some embodiments, the light emitting component of the projection device may also be realized by an LED (Light Emitting Diode) light source.
  • the present application provides a projection device including a three-color light engine and a controller; the three-color light engine is used to modulate and generate the laser light that carries the pixels of the user interface, and includes a blue light engine, a green light engine, and a red light engine.
  • the controller is configured to: obtain the average gray value of the user interface; and, when the average gray value is determined to be greater than a first threshold for longer than a time threshold, lower the operating current of the red light engine by a preset gradient to reduce heating of the three-color light engine. By reducing the operating current of the red light engine integrated in the three-color light engine, overheating of the red light engine, and thus of the three-color light engine and the projection device, can be controlled.
  • the light machine 200 can be implemented as a three-color light machine, and the three-color light machine integrates a blue light machine, a green light machine, and a red light machine.
  • the technical solution provided by the present application will be described by taking the light machine 200 of the projection device as an example including a blue light machine, a green light machine, and a red light machine.
  • the optical system of the projection device is composed of a light source part and an optical machine part.
  • the function of the light source part is to provide illumination for the light machine, and the function of the light machine part is to modulate the illumination beam provided by the light source and finally form the projected picture.
  • the laser assembly includes a red laser module, a green laser module, and a blue laser module, and each laser module is sealed against dust at its corresponding installation port by a sealing ring (fluororubber or other sealing materials can be used).
  • FIG. 2 is a schematic diagram of a circuit structure of a projection device according to an embodiment of the present application.
  • the projection device provided by the present disclosure includes multiple sets of lasers. A brightness sensor is arranged in the light output path of the laser light source, so that it can detect a first brightness value of the laser light source and send the first brightness value to the display control circuit.
  • the display control circuit can obtain a second brightness value corresponding to the driving current of each laser; when it determines that the difference between the second brightness value and the first brightness value of a laser is greater than a difference threshold, it determines that the laser has a COD (Catastrophic Optical Damage) failure; the display control circuit then adjusts the current control signal of the laser driving component corresponding to that laser until the difference is less than or equal to the difference threshold, thereby eliminating the COD failure of the laser;
  • the projection device can eliminate the COD failure of the laser in time, reduce the damage rate of the laser, and ensure the image display effect of the projection device.
  • the projection device may include a display control circuit 10, a laser light source 20, at least one laser driving component 30, and at least one brightness sensor 40, and the laser light source 20 may include at least one laser in one-to-one correspondence with the at least one laser driving component 30.
  • the at least one refers to one or more, and a plurality refers to two or more.
  • the projection device includes a laser driver assembly 30 and a brightness sensor 40.
  • the laser light source 20 includes three lasers that correspond one-to-one to the laser driving assemblies 30, and the three lasers can be a blue laser 201, a red laser 202, and a green laser 203 respectively.
  • the blue laser 201 is used to emit blue laser light
  • the red laser 202 is used to emit red laser light
  • the green laser 203 is used to emit green laser light.
  • the laser driving component 30 may be implemented to include a plurality of sub-laser driving components corresponding to lasers of different colors.
  • the display control circuit 10 is used to output the primary color enable signal and the primary color current control signal to the laser drive assembly 30 to drive the laser to emit light.
  • the display control circuit 10 is connected to the laser driving assembly 30, and is used to output at least one enable signal corresponding to the three primary colors of each frame of the multi-frame display image and transmit each enable signal to the corresponding laser driving assembly 30, and to output at least one current control signal corresponding to each of the three primary colors of each frame and transmit each current control signal to the corresponding laser driving component 30.
  • the display control circuit 10 may be a microcontroller unit (microcontroller unit, MCU), also known as a single-chip microcomputer.
  • the current control signal may be a pulse width modulation (pulse width modulation, PWM) signal.
  • the blue laser 201 , the red laser 202 and the green laser 203 are respectively connected to the laser driving assembly 30 .
  • the laser driving component 30 can provide corresponding driving current to the blue laser 201 in response to the blue PWM signal B_PWM and the enable signal B_EN sent by the display control circuit 10 .
  • the blue laser 201 is used to emit light under the driving current.
  • the brightness sensor is arranged in the light output path of the laser light source, usually on one side of the light output path, without blocking the light path.
  • at least one brightness sensor 40 is arranged in the light path of the laser light source 20, and each brightness sensor is connected with the display control circuit 10 for detecting the first brightness value of a laser, and the first brightness value sent to the display control circuit 10.
  • the display control circuit 10 can obtain the second brightness value corresponding to the driving current of each laser from the stored correspondence; the driving current is the laser's current actual working current, and the second brightness value corresponding to that driving current is the brightness the laser can emit when working normally at that driving current.
  • the difference threshold may be a fixed value pre-stored in the display control circuit 10 .
  • when the display control circuit 10 adjusts the current control signal of the laser driving component 30 corresponding to the laser, it can reduce the duty cycle of that current control signal, thereby reducing the driving current of the laser.
  • the brightness sensor 40 can detect the first brightness value of the blue laser 201 and send the first brightness value to the display control circuit 10 .
  • the display control circuit 10 can obtain the driving current of the blue laser 201 and obtain the second brightness value corresponding to that driving current from the stored correspondence between current and brightness value. It then detects whether the difference between the second brightness value and the first brightness value is greater than the difference threshold; if so, the blue laser 201 has a COD failure, and the display control circuit 10 lowers the current control signal of the laser driving component 30 corresponding to the blue laser 201.
  • the display control circuit 10 then acquires the first brightness value of the blue laser 201 and the second brightness value corresponding to its driving current again; if the difference between the second brightness value and the first brightness value is still greater than the difference threshold, the current control signal of the laser driving component 30 corresponding to the blue laser 201 is lowered again. This loops until the difference is less than or equal to the difference threshold, so that by reducing the driving current of the blue laser 201, the COD failure of the blue laser 201 is eliminated.
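  • as an illustration only (not part of the patent disclosure), the following Python sketch simulates the feedback loop described above; the current-to-brightness model, the sensor model, and the duty-cycle step are invented values:

```python
DIFF_THRESHOLD = 5.0   # assumed brightness-difference threshold
DUTY_STEP = 0.05       # assumed per-iteration duty-cycle reduction

def expected_brightness(duty: float) -> float:
    """Second brightness value: what a healthy laser emits at this duty cycle."""
    return 100.0 * duty

def measured_brightness(duty: float) -> float:
    """First brightness value reported by the brightness sensor; a laser with a
    COD failure emits less than expected at high drive current."""
    return 100.0 * duty * (0.6 if duty > 0.5 else 1.0)

def eliminate_cod(duty: float = 0.9) -> float:
    """Lower the PWM duty cycle (i.e. the drive current) until the measured and
    expected brightness agree to within the threshold."""
    while expected_brightness(duty) - measured_brightness(duty) > DIFF_THRESHOLD:
        duty -= DUTY_STEP   # reduce the current control signal of the laser driver
    return duty

if __name__ == "__main__":
    print(f"stable duty cycle: {eliminate_cod():.2f}")
```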
  • FIG. 3 is a schematic structural diagram of a projection device according to an embodiment of the present application.
  • the laser light source 20 in the projection device may include an independently arranged blue laser 201, red laser 202, and green laser 203, in which case the projection device may also be called a three-color projection device; the blue laser 201, the red laser 202, and the green laser 203 are all MCL-type packaged lasers, which are small in size and facilitate a compact arrangement of the optical paths.
  • the at least one brightness sensor 40 may include a first brightness sensor 401, a second brightness sensor 402, and a third brightness sensor 403, where the first brightness sensor 401 is a blue light brightness sensor or a white light brightness sensor, the second brightness sensor 402 is a red light brightness sensor or a white light brightness sensor, and the third brightness sensor 403 is a green light brightness sensor or a white light brightness sensor.
  • the display control circuit 10 is also used to read the brightness value detected by the first brightness sensor 401 when controlling the blue laser 201 to emit blue laser light. And when the blue laser 201 is controlled to be turned off, stop reading the brightness value detected by the first brightness sensor 401 .
  • the display control circuit 10 is also used to read the brightness value detected by the second brightness sensor 402 when controlling the red laser 202 to emit red laser light, and stop reading the brightness value detected by the second brightness sensor 402 when controlling the red laser 202 to be turned off. Brightness value.
  • the display control circuit 10 is also used to read the luminance value detected by the third luminance sensor 403 when controlling the green laser 203 to emit green laser light, and stop reading the luminance value detected by the third luminance sensor 403 when controlling the green laser 203 to turn off. brightness value.
  • FIG. 4 is a schematic structural diagram of a projection device according to another embodiment of the present application.
  • the projection device may further include a light pipe 110, which is used as a light-collecting optical component for receiving and homogenizing the output three-color laser light in a combined light state.
  • the brightness sensor 40 may include a fourth brightness sensor 404, which may be a white light brightness sensor.
  • the fourth brightness sensor 404 is arranged in the light exit path of the light pipe 110, for example, on the light exit side of the light pipe, close to its light exit surface.
  • the above-mentioned fourth brightness sensor is a white light brightness sensor.
  • the display control circuit 10 is also used to read the brightness value detected by the fourth brightness sensor 404 when controlling the blue laser 201, the red laser 202, and the green laser 203 to turn on time-divisionally, so as to ensure that the fourth brightness sensor 404 can detect the first brightness value of the blue laser 201, the first brightness value of the red laser 202, and the first brightness value of the green laser 203, and to stop reading the brightness value detected by the fourth brightness sensor 404 when the blue laser 201, the red laser 202, and the green laser 203 are all turned off.
  • the fourth brightness sensor 404 is always on when the projection device is projecting images.
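  • purely as a hedged illustration of the time-division reading described above (the helper function and data below are assumptions, not the patent's implementation), a sample from the white-light sensor is kept only while exactly one laser is enabled, so the sample can be attributed to that laser:

```python
from typing import Dict, Optional, Tuple

def sample_white_sensor(enables: Dict[str, bool], reading: float) -> Optional[Tuple[str, float]]:
    """Attribute a white-light sensor reading to the single laser that is on;
    skip the sample when all lasers are off (or more than one is on)."""
    active = [colour for colour, on in enables.items() if on]
    if len(active) != 1:
        return None          # stop reading when the lasers are all turned off
    return active[0], reading

# assumed time-division enable pattern over one frame, with made-up readings
frames = [({"blue": True,  "red": False, "green": False}, 42.0),
          ({"blue": False, "red": True,  "green": False}, 31.5),
          ({"blue": False, "red": False, "green": True},  18.2),
          ({"blue": False, "red": False, "green": False},  0.0)]

for enables, reading in frames:
    print(sample_white_sensor(enables, reading))
```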
  • the projection device may further include a fourth dichroic film 604, a fifth dichroic film 605, a fifth reflector 904, a second lens assembly 90, a diffusion wheel 150, a TIR lens 120, a DMD 130, and a projection lens 140; the second lens assembly 90 includes a first lens 901, a second lens 902, and a third lens 903.
  • the fourth dichroic film 604 can transmit blue laser light and reflect green laser light.
  • the fifth dichroic film 605 can transmit red laser light and reflect green laser light and blue laser light.
  • FIG. 5 is a schematic diagram of a circuit structure of a projection device according to an embodiment of the present application.
  • the laser driving component 30 may include a driving circuit 301 , a switching circuit 302 and an amplifying circuit 303 .
  • the driving circuit 301 may be a driving chip.
  • the switch circuit 302 may be a metal-oxide-semiconductor (MOS) transistor.
  • the driving circuit 301 is respectively connected with the switch circuit 302 , the amplification circuit 303 and the corresponding laser included in the laser light source 20 .
  • the driving circuit 301 is used to output the driving current to the corresponding laser in the laser light source 20 through the VOUT terminal based on the current control signal sent by the display control circuit 10 , and transmit the received enabling signal to the switch circuit 302 through the ENOUT terminal.
  • the laser may include n sub-lasers connected in series, which are respectively sub-lasers LD1 to LDn. n is a positive integer greater than 0.
  • the switch circuit 302 is connected in series in the current path of the laser, and is used to control the conduction of the current path when the received enable signal is at an effective potential.
  • the amplifying circuit 303 is respectively connected to the detection node E in the current path of the laser light source 20 and to the display control circuit 10, and is used to convert the detected driving current of the laser component into a driving voltage, amplify the driving voltage, and transmit the amplified driving voltage to the display control circuit 10.
  • the display control circuit 10 is further configured to determine the driving current of the laser from the amplified driving voltage, and to obtain a second brightness value corresponding to that driving current.
  • the amplifying circuit 303 may include: a first operational amplifier A1, a first resistor (also known as a sampling power resistor) R1, a second resistor R2, a third resistor R3 and a fourth resistor R4.
  • the display control circuit 10, the driving circuit 301, the switch circuit 302, and the amplifying circuit 303 form a closed loop to realize feedback adjustment of the laser driving current, so that the display control circuit 10 can use the difference between the laser's second brightness value and first brightness value to adjust the driving current of the laser in time, that is, to adjust the actual brightness of the laser in time, avoiding long-lasting COD failure of the laser and improving the accuracy of laser emission control.
  • the laser light source 20 includes a blue laser, a red laser and a green laser.
  • the blue laser 201 can be set at the L1 position
  • the red laser 202 can be set at the L2 position
  • the green laser 203 can be set at the L3 position.
  • the laser light at position L1 is transmitted once through the fourth dichroic film 604 , reflected once through the fifth dichroic film 605 , and enters the first lens 901 .
  • the light efficiency at the L1 position is P1 = Pt × Pf,
  • where Pt represents the transmittance of a dichroic film
  • and Pf represents the reflectance of a dichroic film or of the fifth reflector.
  • the light efficiency of the laser light at the position L3 is the highest, and the light efficiency of the laser light at the position L1 is the lowest.
  • the maximum optical power Pb output by the blue laser 201 is 4.5 watts (W)
  • the maximum optical power Pr output by the red laser 202 is 2.5W
  • the maximum optical power Pg output by the green laser 203 is 1.5W. That is, the maximum optical power output by the blue laser 201 is the largest, followed by the maximum optical power output by the red laser 202, and the maximum optical power output by the green laser 203 is the smallest.
  • the green laser 203 is therefore placed at the L3 position, the red laser 202 is placed at the L2 position, and the blue laser 201 is placed at the L1 position. That is, the green laser 203 is arranged in the optical path with the highest light efficiency, so as to ensure that the projection device can obtain the highest light efficiency.
  • the controller includes at least one of a central processing unit (CPU), a video processor, an audio processor, a graphics processing unit (GPU), RAM (Random Access Memory), ROM (Read-Only Memory), first to n-th interfaces for input/output, a communication bus, and the like.
  • after the projection device is started, it can directly enter the display interface of the most recently selected signal source, or the signal source selection interface, where the signal source can be a preset video-on-demand program, an HDMI interface, or a live TV interface
  • the projector can display the content obtained from different signal sources.
  • the embodiments of the present application may be applied to various types of projection devices.
  • the projection device and the display control method for automatically correcting the projected picture will be described below by taking the projection device as an example.
  • a projection device is a device that can project images or videos onto a screen.
  • through different interfaces, the projection device can be connected to computers, radio and television networks, the Internet, VCD (Video Compact Disc) players, DVD (Digital Versatile Disc) players, game consoles, DV camcorders, and the like to play the corresponding video signals.
  • Projection equipment is widely used in homes, offices, schools and entertainment venues, etc.
  • the camera configured on the projection device can be implemented as a 3-dimensional (3D) camera or a binocular camera; when implemented as a binocular camera, it specifically includes a left camera and a right camera;
  • the camera can capture the picture corresponding to the projection device, that is, the image and playback content presented on the projection surface, which are projected by the built-in laser component of the projection device.
  • the projection device controller can use a deep-learning neural network that couples the included angle between the optical machine and the projection surface with the correct display of the projected image to realize automatic keystone correction; however, this correction method is slow, and a large amount of scene data is required for model training to reach a given accuracy, so it is not well suited to scenarios where the user's device must be corrected instantly and quickly.
  • the relationship among distance, horizontal angle and offset angle can be established in advance
  • the projection device controller obtains the current distance from the light machine to the projection surface, and determines the angle between the light machine and the projection surface at this moment in combination with the associated relationship, so as to realize the projection image correction;
  • the included angle is specifically the angle between the central axis of the optical machine and the projection surface; however, in some complex environments the pre-established relationship cannot adapt to every situation, and projected-image correction may fail.
  • the projection device provided by the present application can maximize the advantages of the two dimming methods and weaken the impact of their disadvantages by combining their working modes and dynamically switching dimming according to the working scene.
  • the projection device can monitor the movement of the device in real time through its configured components and immediately feed the monitoring results back to the projection device controller, so that after the projection device moves, the controller immediately starts the image correction function and the projected image is corrected automatically as soon as possible.
  • the controller receives monitoring data from the gyroscope and TOF sensor to determine whether the projection device moves;
  • after determining that the projection device has moved, the controller starts the automatic projected-image correction process and/or the obstacle avoidance process, so as to realize functions such as trapezoidal projection area correction and projection obstacle avoidance.
  • the time-of-flight sensor realizes distance measurement and position movement monitoring by adjusting the frequency of the transmitted pulse. Its measurement accuracy will not decrease with the increase of the measurement distance, and it has strong anti-interference ability.
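  • for illustration only, a minimal sketch of such a movement check is given below; the thresholds and field names are assumptions rather than values from the patent:

```python
from dataclasses import dataclass

@dataclass
class DeviceState:
    pitch: float    # degrees, from the gyroscope
    yaw: float      # degrees, from the gyroscope
    tof_mm: float   # distance to the projection surface, from the TOF sensor

ANGLE_EPS = 0.5     # assumed attitude-change threshold, degrees
DIST_EPS = 10.0     # assumed distance-change threshold, millimetres

def has_moved(prev: DeviceState, cur: DeviceState) -> bool:
    """True when either the gyroscope attitude or the TOF distance has changed
    beyond its threshold since the last stable state."""
    return (abs(cur.pitch - prev.pitch) > ANGLE_EPS
            or abs(cur.yaw - prev.yaw) > ANGLE_EPS
            or abs(cur.tof_mm - prev.tof_mm) > DIST_EPS)

# when has_moved(...) is True the controller would start keystone correction
# and/or obstacle avoidance, as described in the surrounding text
prev = DeviceState(pitch=0.0, yaw=0.0, tof_mm=2300.0)
cur = DeviceState(pitch=1.8, yaw=0.1, tof_mm=2260.0)
print(has_moved(prev, cur))   # True -> trigger automatic correction
```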
  • the laser light emitted by the projection device is reflected by the nanoscale mirror elements of the DMD (Digital Micromirror Device) chip; the optical lens is also a precision element, and when the image plane and the object plane are not parallel, the image projected onto the screen is geometrically distorted.
  • FIG. 6A is a schematic diagram of a system framework for realizing display control by a projection device according to an embodiment of the present application.
  • the projection device provided by the present application has the characteristics of telephoto micro-projection.
  • the projection device includes a controller, and the controller can control the display of the optical machine through preset algorithms, so as to realize automatic keystone correction of the displayed image, automatic screen entry, automatic obstacle avoidance, automatic focusing, and anti-eye (eye protection) functions.
  • through the display control method based on geometric correction, the projection device can be moved flexibly in a telephoto micro-projection scene; when problems such as an abnormal picture occur, the controller can control the projection device to perform automatic display correction so that it automatically returns to a normal display.
  • the geometric correction-based display control system includes an application program service layer (APK Service: Android Application Package Service), a service layer, and an underlying algorithm library.
  • the application service layer is used to realize interaction between the projection device and the user; through the user interface, the user can configure various parameters of the projection device and the displayed picture, and the controller coordinates and calls the algorithm services corresponding to the various functions, so that the displayed picture of the projection device can be corrected automatically when the display is abnormal.
  • the service layer can include a correction service, a camera service, a time-of-flight (TOF) service, and so on; these services correspond to the application program service layer (APK Service) and realize the specific functions of the different services configured on the projection device; downwards, the service layer connects to the algorithm library, the camera, the time-of-flight sensor, and other data acquisition services, encapsulating the complex underlying logic.
  • the underlying algorithm library provides correction services and control algorithms for various functions of projection equipment.
  • the algorithm library can perform various mathematical operations based on OpenCV to provide basic computing capabilities for the correction services; OpenCV is a cross-platform computer vision and machine learning software library released as open source under the BSD license, and it can run on a variety of existing operating systems.
  • the projection device is configured with a gyroscope sensor; while the device is moving, the gyroscope sensor can sense the position movement and actively collect movement data; the collected data is then sent to the application service layer through the system framework layer, where it supports the application data required for user interface interaction and application program interaction, and the collected data can also be called by the controller when implementing the algorithm services.
  • the projection device is configured with a time-of-flight (TOF) sensor, and after the time-of-flight sensor collects corresponding data, the data will be sent to the corresponding time-of-flight service of the service layer;
  • after the above-mentioned time-of-flight service acquires the data, it sends the collected data to the application program service layer through the process communication framework, where the data is used for controller data calls, user interface interaction, application program interaction, and the like.
  • the projection device is configured with a camera for collecting images, and the camera may be implemented as a binocular camera, or a depth camera, or a 3D camera, etc.;
  • the data collected by the camera is sent to the camera service, and the camera service then sends the collected image data to the process communication framework and/or the projection device correction service; the correction service receives the camera data sent by the camera service, and the controller calls the control algorithm in the algorithm library corresponding to each function that needs to be implemented.
  • data interaction with the application service is performed through the process communication framework, and the calculation result is then fed back to the correction service through the process communication framework; the correction service sends the obtained calculation result to the operating system of the projection device to generate a control signal, and sends the control signal to the optical machine control driver to control the working state of the optical machine and realize automatic correction of the displayed image.
  • FIG. 6B is a schematic diagram of a signaling interaction sequence of a projection device implementing an anti-eye function according to another embodiment of the present application.
  • the projection device can realize the anti-eye function.
  • the controller can control the user interface to display corresponding prompt information to remind the user to leave the current area, and the controller can also control the user interface to reduce the display brightness to prevent the laser from damaging the user's eyesight.
  • when the projection device is configured in a children's viewing mode, the controller will automatically turn on the anti-eye switch. In some embodiments, after receiving position-movement data sent by the gyroscope sensor, or foreign-object-intrusion data collected by other sensors, the controller will control the projection device to turn on the anti-eye switch.
  • when any preset threshold condition is triggered by the time-of-flight (TOF) sensor or by data collected by the camera, the controller will control the user interface to reduce the display brightness, display prompt information, and reduce the emission power, brightness, and intensity of the optical machine.
  • the projection device controller can control the calibration service to send signaling to the time-of-flight sensor to query the current status of the projection device, and the controller will receive data feedback from the time-of-flight sensor.
  • the correction service sends signaling to the process communication framework notifying the algorithm service to start the anti-eye process, and the process communication framework calls the corresponding algorithm service capability from the algorithm library;
  • the algorithms can include a camera detection algorithm, a screenshot algorithm, a foreign object detection algorithm, and the like.
  • based on the above algorithm services, the process communication framework can return the foreign object detection result to the correction service; if the returned result reaches the preset threshold condition, the controller will control the user interface to display prompt information and reduce the display brightness.
  • the signaling sequence is shown in Figure 6B.
  • when the anti-eye function is on and the user enters a preset specific area, the projection device will automatically reduce the intensity of the laser emitted by the optical machine, reduce the display brightness of the user interface, and display safety prompt information;
  • control of the anti-eye function can be achieved by the following methods:
  • an edge detection algorithm is used to identify the projection area of the projection device; when the projection area appears as a rectangle or an approximate rectangle, the controller obtains the coordinate values of the four vertices of that projection area through a preset algorithm;
  • a perspective transformation can be used to rectify the projection area to a rectangle, and the difference between that rectangle and the projected screenshot can be calculated to determine whether there is a foreign object in the display area; if it is judged that there is a foreign object, the projection device automatically activates the anti-eye function (see the sketch after this list);
  • the difference between the camera content of the current frame and that of the previous frame can be used to judge whether a foreign object has entered the area outside the projection area; if so, activation of the anti-eye function is triggered;
  • the projection device can also use a time-of-flight (ToF) camera or a time-of-flight sensor to detect real-time depth changes in a specific area; if the depth value changes beyond a preset threshold, the projection device will automatically trigger the anti-eye function.
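  • the following sketch illustrates the perspective-transformation check in the second method above; it is an assumption-laden example using standard OpenCV calls, not the patent's code, and the corner coordinates, thresholds, and synthetic images are made up:

```python
import cv2
import numpy as np

def foreign_object_in_projection(camera_img, screenshot, corners,
                                 diff_thresh=30, ratio_thresh=0.02):
    """corners: four vertices of the projection area found by edge detection,
    ordered top-left, top-right, bottom-right, bottom-left (image pixels)."""
    h, w = screenshot.shape[:2]
    dst = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    H = cv2.getPerspectiveTransform(np.float32(corners), dst)
    rectified = cv2.warpPerspective(camera_img, H, (w, h))   # projection area as a rectangle

    diff = cv2.absdiff(cv2.cvtColor(rectified, cv2.COLOR_BGR2GRAY),
                       cv2.cvtColor(screenshot, cv2.COLOR_BGR2GRAY))
    changed = np.count_nonzero(diff > diff_thresh) / diff.size
    return changed > ratio_thresh      # True -> activate the anti-eye function

# synthetic example: a uniform projected frame with a dark blob standing in for a hand
screenshot = np.full((360, 640, 3), 200, np.uint8)
camera = screenshot.copy()
cv2.circle(camera, (320, 180), 60, (30, 30, 30), -1)
corners = [(0, 0), (639, 0), (639, 359), (0, 359)]
print(foreign_object_in_projection(camera, screenshot, corners))   # True
```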
  • FIG. 6G is a schematic flow diagram of the projection device implementing the anti-eye algorithm.
  • the projection device judges whether the anti-eye function needs to be enabled based on the collected time-of-flight data, screenshot data, and camera data analysis.
  • Step 6G00-1 collecting time-of-flight (TOF) data
  • Step 6G00-2 performing depth difference analysis according to the collected time-of-flight data
  • Step 6G00-3 judging whether the depth difference is greater than the preset threshold X; if the depth difference is greater than the preset threshold X (here X is 0), execute step 6G03;
  • step 6G03 the screen becomes dark and a prompt pops up.
  • when the depth difference is greater than the preset threshold X, which is 0, it can be determined that there is a foreign object in the specific area of the projection device
  • the projection device will automatically activate the anti-eye function to reduce the intensity of the laser light emitted by the light machine, reduce the brightness of the user interface display, and display safety reminders.
  • Step 6G01-1 collecting screenshot data
  • Step 6G01-2 performing additive color mode (RGB) difference analysis on the collected screenshot data;
  • Step 6G01-3 judging whether the RGB difference is greater than the preset threshold Y, if the RGB difference is greater than the preset threshold Y, execute step 6G03;
  • step 6G03 the screen becomes dark and a prompt pops up.
  • the projection device performs a difference analysis of the additive color mode (RGB) based on the collected screenshot data.
  • Step 6G02-1 collecting camera data
  • Step 6G02-2 obtaining the projection coordinates from the collected camera data; if the obtained projection coordinates are in the projection area, execute step 6G01-3, and if they are in the extended area, still execute step 6G01-3;
  • Step 6G02-3 performing color addition mode (RGB) difference analysis according to the collected camera data
  • Step 6G02-4 judging whether the RGB difference is greater than the preset threshold Y, and if the RGB difference is greater than the preset threshold Y, execute step 6G03.
  • step 6G03 the screen becomes dark and a prompt pops up.
  • the projection device obtains the projection coordinates according to the collected camera data; then determines the projection area of the projection device according to the projection coordinates, and further performs a difference analysis of the additive color mode (RGB) in the projection area; when the difference is greater than the preset threshold Y, it can be determined that there is a foreign object in a specific area of the projection device; the projection device will automatically activate the anti-eye function.
  • in the extended area, the controller can still perform additive color mode (RGB) difference analysis; if the difference is greater than the preset threshold Y, it can be determined that a foreign object has entered the specific area of the projection device, and the projection device will automatically activate the anti-eye function.
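  • for illustration only, a compact sketch of the decision flow of steps 6G00 to 6G03 is given below; the threshold Y, the array shapes, and the helper names are assumptions (the text only states that X is 0):

```python
import numpy as np

THRESH_X = 0.0    # depth-difference threshold (the text states X is 0)
THRESH_Y = 25.0   # assumed RGB-difference threshold

def should_protect_eyes(depth_prev, depth_cur, frame_prev, frame_cur) -> bool:
    """Trigger when the TOF depth difference exceeds X or the RGB difference
    of successive frames/screenshots exceeds Y."""
    depth_diff = np.abs(depth_cur - depth_prev).max()
    rgb_diff = np.abs(frame_cur.astype(np.int16) - frame_prev.astype(np.int16)).mean()
    return depth_diff > THRESH_X or rgb_diff > THRESH_Y

def on_foreign_object_detected() -> None:
    # corresponds to step 6G03: dim the picture and pop up a safety prompt
    print("dim screen, reduce laser power, show safety prompt")

depth0 = np.full((8, 8), 2.3)                     # metres, empty projection area
depth1 = depth0.copy(); depth1[3:5, 3:5] = 1.1    # someone steps into the beam
frame0 = np.full((8, 8, 3), 120, np.uint8)
frame1 = frame0.copy()
if should_protect_eyes(depth0, depth1, frame0, frame1):
    on_foreign_object_detected()
```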
  • FIG. 6C is a schematic diagram of a signaling interaction sequence of a projection device implementing a display image correction function according to another embodiment of the present application.
  • the projection device monitors the movement of the device through a gyroscope or a gyroscope sensor.
  • the calibration service sends a signaling to the gyroscope to query the status of the device, and receives the signaling fed back by the gyroscope to determine whether the device is moving.
  • the display correction strategy can be configured as follows: when the gyroscope and the time-of-flight sensor both report a change, the projection device triggers keystone correction first; to carry out keystone correction, the projection device will display a pure white chart.
  • based on the binocular camera, the keystone correction algorithm can construct the transformation matrix between the projection surface in the world coordinate system and the optical machine coordinate system, and combine it with the optical machine's intrinsic parameters to calculate the homography relationship between the projected image and the played chart; this homography relationship, also called a mapping relationship, is used to realize arbitrary shape conversion between the projected image and the played chart.
  • the correction service sends a signaling for notifying the algorithm service to start the keystone correction process to the process communication framework, and the process communication framework further sends a service capability call signaling to the algorithm service to obtain the algorithm corresponding to the capability;
  • the algorithm service obtains and executes the camera and photo algorithm processing service and the obstacle avoidance algorithm service, and sends the results to the process communication framework through signaling; in some embodiments, the process communication framework executes the above algorithms and feeds the execution results back to the correction service; the execution results may include feedback such as successful photographing and successful obstacle avoidance.
  • the correction service can also control the user interface to display an error-return prompt, and control the user interface to display the keystone correction and autofocus charts again.
  • the projection device can identify the screen and use projection transformation to correct the projected picture so that it is displayed inside the screen, achieving alignment with the edges of the screen.
  • the projection device can use the time-of-flight (ToF) sensor to obtain the distance between the optical machine and the projection surface, look up the best image distance in a preset mapping table based on that distance, and use an image algorithm to evaluate the sharpness of the projected picture, on the basis of which the image distance can be fine-tuned.
  • the automatic keystone correction signaling sent by the correction service to the process communication framework may include other function configuration instructions, such as whether to implement synchronous obstacle avoidance, whether to enter the curtain, and other control instructions.
  • the process communication framework sends service-capability call signaling to the algorithm service, so that the algorithm service acquires and executes the autofocus algorithm to adjust the focus for the distance between the device and the screen; in some embodiments, after the autofocus algorithm has been applied, the algorithm service can also obtain and execute the automatic screen-entry algorithm, which may include the keystone correction algorithm.
  • the projection device executes automatic screen entry, and the algorithm service sets the eight position coordinates between the projection device and the screen; the autofocus algorithm is then run again to adjust the focus for the distance between the projection device and the screen; finally, the correction result is fed back to the correction service, and the user interface is controlled to display the correction result, as shown in Figure 6C.
  • in the autofocus algorithm, the projection device uses its configured laser ranging to obtain the current object distance and calculate an initial focal length and search range; the projection device then drives the camera to take pictures and uses a corresponding algorithm to evaluate sharpness.
  • the projection device searches for the best possible focal length based on the search algorithm, then repeats the above steps of photographing and sharpness evaluation, and finally finds the optimal focal length through sharpness comparison to complete autofocus.
  • Step 6D01 the user moves the device, and the projection device automatically completes the calibration and then refocuses;
  • Step 6D02 the controller will detect whether the auto-focus function is enabled, if not, the controller will end the auto-focus service, and if so, execute step 6D03;
  • Step 6D03 the middleware obtains the detection distance of time-of-flight (TOF);
  • when the autofocus function is turned on, the projection device obtains the detection distance of the time-of-flight (TOF) sensor through the middleware for calculation.
  • Step 6D04 obtain the approximate focal length according to the distance query mapping table
  • Step 6D05 the middleware sets the focal length to the optical machine
  • the controller queries the preset mapping table according to the obtained distance to obtain the focal length of the projection device; then the middleware sets the obtained focal length to the optical engine of the projection device.
  • Step 6D06 taking pictures with the camera
  • Step 6D07 according to the evaluation function, judge whether the focus is completed, if so, end the auto-focus process, otherwise execute step 6D08;
  • step 6D08 the middleware fine-tunes the focal length (step size), and executes step 6D05 again.
  • the camera executes the photographing command; the controller judges whether the projection device is in focus according to the captured image and the evaluation function; if the judgment result meets the preset completion conditions, the autofocus process ends;
  • otherwise, the middleware fine-tunes the focal length parameter of the projection device's optical machine, for example stepping the focal length by the preset step size, and sets the adjusted focal length to the optical machine again; the photographing and sharpness evaluation steps are thus repeated, and the optimal focal length is finally found through sharpness comparison to complete autofocus.
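  • as a hedged illustration of the autofocus loop of steps 6D01 to 6D08, the sketch below starts from an initial focal position, scores each capture with the variance of the Laplacian, and nudges the focus by a shrinking step until the score stops improving; the helper functions, step sizes, and synthetic camera are assumptions, not the patent's implementation:

```python
import cv2
import numpy as np

def sharpness(gray: np.ndarray) -> float:
    """Higher is sharper: variance of the Laplacian response."""
    return float(cv2.Laplacian(gray, cv2.CV_64F).var())

def autofocus(capture_at, initial_focus: float, step: float = 0.5, max_iters: int = 40) -> float:
    """Hill-climb from the focal position looked up via the TOF distance:
    keep moving while sharpness improves, otherwise reverse and halve the step."""
    focus = initial_focus
    best = sharpness(capture_at(focus))
    for _ in range(max_iters):
        candidate = focus + step
        score = sharpness(capture_at(candidate))
        if score <= best:
            step = -step / 2.0          # no improvement: reverse and shrink the step
            if abs(step) < 0.05:
                break                   # focus considered complete
        else:
            best, focus = score, candidate
    return focus

# synthetic "camera": frames get blurrier the further the focus is from 10.0
rng = np.random.default_rng(0)
scene = (rng.random((120, 160)) * 255).astype(np.uint8)

def capture_at(focus: float) -> np.ndarray:
    sigma = 0.3 + abs(focus - 10.0)     # defocus modelled as Gaussian blur
    return cv2.GaussianBlur(scene, (0, 0), sigma)

print(f"focal position found: {autofocus(capture_at, initial_focus=8.0):.2f}")   # ~10.0
```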
  • the projection device can project a picture of 75-100 inches; after the user purchases the laser projection device and the projection is installed and adjusted, the projection device at position A obtains a suitable projection area A, and projection area A is rectangular, as shown on the left side of Figure 6H;
  • a deformed projection area is often generated, such as the trapezoidal projection area shown on the right side of Figure 6H; it will be appreciated that this arises from shortcomings in the placement of the projection device.
  • the placement position of the projection device can change; with a fixed screen, how well the picture of the projection device matches the screen depends entirely on the position of the projection device, so changes in the position of the projection device lead to changes in the projection position, causing the picture to no longer fit the screen.
  • the projection device provided by the present application can realize the display correction function through the keystone correction algorithm.
  • the present application provides a method for automatically correcting the projected image, including the following steps:
  • Step 6J01 monitor the position change between the projector and the projection surface in real time
  • Step 6J02 the time-of-flight or gyro sensor detects a change
  • Step 6J03 projecting a white chart
  • Step 6J04 taking and storing photos with binocular cameras
  • Step 6J05 projecting the characteristic map card
  • Step 6J06 taking and storing photos with binocular cameras
  • after the controller detects that the position of the projection device has changed, it controls the optical machine to project the first chart and the second chart onto the projection surface in sequence; the first image corresponds to projecting the first chart, and the second image corresponds to projecting the second chart;
  • the controller first controls the optical machine to project a white image card to the projection surface, and then controls the binocular camera to take pictures of the projection surface to obtain a picture corresponding to the white image card.
  • the white image card is the above-mentioned first image card.
  • the first picture card can also be implemented as other color picture cards;
  • the controller controls the optical machine to project the characteristic picture card to the projection surface again
  • the characteristic picture card is the above-mentioned second picture card
  • the second chart can be implemented as, for example, a checkerboard (grid) chart, or another chart with distinctive graphic and color features, as shown in Figure 6H.
  • Step 6J07 identifying the feature points corresponding to the binocular camera
  • the controller can determine the homography relationship between the projected chart and the projection surface in the world coordinate system by comparing corresponding features of the first image and the second image, and correct the projected image based on that relationship.
  • Step 6J08 calculating the coordinates of the points on the projection surface corresponding to the camera feature points in the camera coordinate system
  • Step 6J09 obtaining the coordinates of the points on the projection surface in the world coordinate system
  • Step 6J10 obtaining the homography relationship between the projection feature map card and the feature points of the projection surface
  • Step 6J11 obtain the largest rectangular area in the world coordinate system
  • Step 6J12 calculating the area to be projected by the optical machine according to the homography relationship between the projection feature map and the projection surface.
  • the controller obtains the coordinates of the corresponding feature points of the binocular camera in the image coordinate system according to an image recognition algorithm, and then, from these image-coordinate feature points, calculates the coordinates of the projection surface in the coordinate system of either camera of the binocular pair, for example the left camera, so as to obtain the coordinates of the feature points of the projection surface in the world coordinate system; a minimal triangulation sketch is given below.
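  • The following is a minimal NumPy sketch of the similar-triangles triangulation step for a rectified binocular pair; the focal length f, principal point (cx, cy), baseline and the function name are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def triangulate_points(pts_left, pts_right, f, cx, cy, baseline):
    """Recover 3D coordinates (in the left-camera frame) of matched feature
    points from a rectified stereo pair using Z = f * B / disparity.
    pts_left / pts_right are (N, 2) pixel arrays; disparity must be non-zero."""
    pts_left = np.asarray(pts_left, dtype=float)
    pts_right = np.asarray(pts_right, dtype=float)
    disparity = pts_left[:, 0] - pts_right[:, 0]   # horizontal pixel shift
    z = f * baseline / disparity                    # depth by similar triangles
    x = (pts_left[:, 0] - cx) * z / f               # back-project to 3D
    y = (pts_left[:, 1] - cy) * z / f
    return np.stack([x, y, z], axis=1)              # (N, 3) camera-frame points
```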
  • From these feature-point coordinates, the plane equation of the projection surface in the left-camera coordinate system can be fitted and obtained, for example in the general form aX + bY + cZ + d = 0.
  • the controller then calculates the unit normal vector of the projection surface in the left-camera coordinate system; a world coordinate system 1 is then defined, which takes the origin of the left-camera coordinate system as its origin and establishes a spatial rectangular coordinate system whose XOY plane is parallel to the projection surface, so that in world coordinate system 1 the projection surface is described by a plane of constant Z; from the unit normal vector, the rotation matrix between the two coordinate systems can then be obtained.
  • Using this rotation matrix, the coordinates of a projection-surface point P in world coordinate system 1 are obtained; then, taking the projection of the origin of world coordinate system 1 onto the projection surface as the new origin and the projection surface as the XOY plane, world coordinate system 2 is established. In world coordinate system 2 the coordinates of the projected point P are expressed as (x″, y″, 0), and the controller can thus obtain the coordinates of all projected feature points in world coordinate system 2.
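  • A compact sketch of this plane-fitting and coordinate-transformation step is shown below; the SVD plane fit, the Rodrigues-style rotation construction and the shift into world coordinate system 2 are standard techniques assumed here for illustration, not the patent's exact formulas.

```python
import numpy as np

def fit_plane(points_cam):
    """Least-squares plane fit through 3D points in the left-camera frame;
    returns the unit normal of the projection surface."""
    centroid = points_cam.mean(axis=0)
    _, _, vt = np.linalg.svd(points_cam - centroid)
    normal = vt[-1]                          # direction of least variance
    return normal / np.linalg.norm(normal)

def rotation_to_plane(normal):
    """Rotation mapping the plane normal onto the world Z axis, so that the
    projection surface becomes the XOY plane of the world system."""
    z_world = np.array([0.0, 0.0, 1.0])
    v = np.cross(normal, z_world)
    s, c = np.linalg.norm(v), float(normal.dot(z_world))
    if s < 1e-9:                             # already (anti-)parallel
        return np.eye(3) if c > 0 else np.diag([1.0, -1.0, -1.0])
    vx = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])
    return np.eye(3) + vx + vx @ vx * ((1 - c) / s**2)   # Rodrigues form

def to_world_2(points_cam):
    """Express the projected feature points in 'world coordinate system 2':
    origin on the projection surface, surface as the XOY plane (z ~ 0)."""
    r = rotation_to_plane(fit_plane(points_cam))
    pts_w1 = points_cam @ r.T                # world coordinate system 1
    pts_w2 = pts_w1 - np.array([0.0, 0.0, np.median(pts_w1[:, 2])])
    return pts_w2[:, :2]                     # (x'', y'') on the surface
```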
  • the homography relationship between the projected chart and the projection surface in the world coordinate system may also be referred to as a mapping relationship.
  • In some embodiments, before the projection device projects the playback content onto the projection surface, if the controller detects that an obstacle is blocking the projection beam between the optical machine and the projection surface, the controller starts the obstacle avoidance process.
  • the controller first determines whether the obstacle avoidance function of the projection device is turned on; if it is not turned on, no obstacle detection is performed. If the obstacle avoidance function is turned on, the controller calculates the homography matrix, that is, the homography relationship, between either camera, for example the left camera, and the optical machine. The projection device stores the feature points of the projection chart; after feature-point recognition is performed on the image of the feature chart captured by the left camera, the recognized points are compared with the stored chart feature points, and the homography matrix between the left camera and the optical machine can be calculated, as sketched below.
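  • A minimal OpenCV sketch of estimating that camera-to-optical-machine homography from matched feature points; the point arrays and function name are illustrative assumptions.

```python
import cv2
import numpy as np

def camera_to_om_homography(pts_camera, pts_chart):
    """pts_camera: chart feature points detected in the left-camera photo (N, 2).
    pts_chart:  the same features' stored positions in the projection chart
    (optical-machine pixel coordinates), also (N, 2).  Returns the 3x3 matrix
    mapping camera pixels to optical-machine pixels."""
    h, _ = cv2.findHomography(np.float32(pts_camera),
                              np.float32(pts_chart), cv2.RANSAC, 3.0)
    return h
```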
  • the controller performs grayscale processing on the captured images acquired by the binocular camera, and then performs clustering processing based on a clustering algorithm.
  • the controller clusters the above-mentioned gray-scaled captured images into projection beam areas and obstacle areas according to the clustering algorithm and gray value.
  • the grayscale image of the projection area is clustered into two categories, of which the category with the largest gray value is taken to be the projection beam, and the category with the smallest gray value is determined to be the obstacle area.
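  • A hedged sketch of such a two-class gray-value clustering using OpenCV's k-means; the function name and parameters are illustrative assumptions rather than the patent's concrete algorithm.

```python
import cv2
import numpy as np

def split_beam_and_obstacles(gray):
    """Cluster the grayscale pixels of the projection area into two classes;
    the brighter cluster is treated as the projection beam, the darker one
    as candidate obstacle area.  Returns a boolean mask (True = beam)."""
    samples = gray.reshape(-1, 1).astype(np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, labels, centers = cv2.kmeans(samples, 2, None, criteria, 3,
                                    cv2.KMEANS_PP_CENTERS)
    beam_label = int(np.argmax(centers))          # cluster with largest mean gray
    return labels.reshape(gray.shape) == beam_label
```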
  • the controller binarizes the projection beam area and the obstacle area so that it is convenient to extract a first closed contour larger than a preset obstacle-avoidance area threshold; the first closed contour corresponds to an obstacle, for example a household item located between the projection device and the curtain;
  • the controller can also extract a second closed contour that is smaller than the preset obstacle-avoidance area threshold; the second closed contour corresponds to a non-obstacle, such as a small stain on the wall or curtain. After the second closed contour is extracted, it can be filled in as projection beam area in order to improve the recognition efficiency of the subsequent algorithm;
  • the first closed contour can be used by the controller to avoid that contour during optical-machine projection, so that the final rectangular area generated on the projection surface accurately avoids the obstacle and the projection obstacle-avoidance function is realized; at the same time, the identification and filling of the second closed contour will not affect the position of the final rectangular area generated on the projection surface during the subsequent algorithm identification process.
  • Referring to FIG. 6K, a method for the projection device to automatically correct the projected image includes the following steps:
  • Step 6K01 obtaining the homography relationship between the projection feature map card and the feature points of the projection surface
  • Step 6K02 determine whether obstacle avoidance is enabled, if so, execute step 6K03, otherwise execute step 6K07;
  • Step 6K03 obtaining the camera and optomechanical homography matrix
  • Step 6K04 obstacle contour detection
  • Step 6K05 converting the picture into the optical-mechanical coordinate system
  • Step 6K06 transforming the binarized picture of the extracted obstacle outline into the world coordinate system
  • Step 6K07 obtaining the largest rectangular area in the world coordinate system
  • Step 6K08 calculate the area to be projected by the optical machine according to the homography relationship between the projection feature map and the projection surface.
  • the controller binarizes the above two categories of gray values: the category with the largest gray value can be configured as 255 and the category with the small gray value as 0, after which the closed contours are extracted. In this step, in order to improve the user experience and to avoid small objects, such as stains on the wall, being recognized as obstacles when the projection device performs the obstacle avoidance function, a preset obstacle-avoidance area threshold can be defined: contours whose area is larger than one percent of the projection area are identified as obstacles, while contours whose area is smaller than one percent of the projection area are not identified as obstacles and are not avoided; at the same time, the gray value of the regions where such extremely small contours are located is filled with the maximum value 255, so that the projection device will not avoid them during subsequent obstacle avoidance. A sketch of this thresholding and filling step is given below.
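  • A minimal OpenCV sketch of the contour thresholding and filling just described, interpreting the threshold as one percent of the projection (beam) area; the function name, threshold interpretation and structure are illustrative assumptions.

```python
import cv2
import numpy as np

def filter_obstacle_contours(beam_mask, min_area_ratio=0.01):
    """Binarize the clustered image (beam = 255, obstacle = 0), keep only
    closed contours whose area exceeds min_area_ratio of the beam area as
    obstacles, and fill smaller ones back to 255 so tiny stains are not avoided."""
    binary = np.where(beam_mask, 255, 0).astype(np.uint8)
    inverted = cv2.bitwise_not(binary)             # obstacle regions become white
    contours, _ = cv2.findContours(inverted, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    beam_area = int(beam_mask.sum())
    obstacles = []
    for c in contours:
        if cv2.contourArea(c) >= min_area_ratio * beam_area:
            obstacles.append(c)                    # real obstacle: keep and avoid
        else:
            cv2.drawContours(binary, [c], -1, 255, thickness=-1)  # fill as beam
    return binary, obstacles
```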
  • In some other implementations, the projection device directly adopts binarization for obstacle contour detection and then uses an empirical threshold to identify obstacles; this approach has lower recognition efficiency in complex environments. For example, when the projection beam is projected onto a black screen area, an overly large empirical threshold causes regions that contain no obstacle to be mistakenly identified as obstacles, which degrades the obstacle-avoidance effect.
  • the projection device controller can convert the image captured by either camera of the binocular pair into the world coordinate system based on the homography relationship obtained above, that is, the mapping relationship, to obtain the final rectangular area.
  • After the left camera of the binocular pair takes a picture, the controller performs grayscale processing and binarization on the captured image, and then converts the binarized image into the optical-machine coordinate system; the controller further converts the binarized, contour-recognized image obtained from the left camera into world coordinate system 2 through the homography matrix. From the resulting picture, the controller selects a rectangular area in world coordinate system 2 and controls the optical machine to project the playback content into that rectangular area; the rectangular area is used to replace the deformed projection area formed on the projection surface after the projection device moves, as shown in FIG. 6H.
  • the projection device controller can determine the coordinates of the center point in the world coordinate system of the captured image corresponding to the projection chart based on the aforementioned homography relationship.
  • For example, if the projection chart of the projection device's optical machine is defined at a resolution of 3840*2160, the controller can calculate the center point (1920, 1080) of the projection chart, and then, from the obtained homography matrix, calculate the coordinates of the chart's center point in world coordinate system 2.
  • the controller expands the rectangle based on the coordinates of its center point until it reaches the gray value change boundary, and obtains the four vertices of the rectangle at this moment as the vertices of the final rectangular area.
  • the selected rectangle is enlarged step by step around the chart center-point coordinates in world coordinate system 2 while keeping a 16:9 ratio; the gray value of the projectable area is set to 255 and the gray value of the corresponding obstacle area is set to 0. When the expanding rectangle reaches a gray-value change boundary, for example a 0-valued obstacle region or the edge of the projectable area, the controller stops the expansion, and the four vertices of the final rectangular area are obtained at that moment; these can be converted into the optical-machine coordinate system to obtain the area to be projected by the optical machine of the projection device. A sketch of this expansion loop follows.
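  • A minimal sketch of growing a 16:9 rectangle from the chart center until it hits a gray-value boundary; the step size and function name are illustrative assumptions.

```python
import numpy as np

def grow_rectangle(world_img, cx, cy, step=2):
    """Grow a 16:9 rectangle outwards from the chart centre (cx, cy) in the
    world-coordinate image (255 = projectable, 0 = obstacle / outside) and
    keep the last rectangle whose border stayed entirely inside the 255 region."""
    h, w = world_img.shape
    half_w, half_h = 16 * step, 9 * step
    best = None
    while True:
        x0, x1 = int(cx - half_w), int(cx + half_w)
        y0, y1 = int(cy - half_h), int(cy + half_h)
        if x0 < 0 or y0 < 0 or x1 >= w or y1 >= h:
            break                                   # reached the image border
        border = np.concatenate([world_img[y0, x0:x1 + 1],
                                 world_img[y1, x0:x1 + 1],
                                 world_img[y0:y1 + 1, x0],
                                 world_img[y0:y1 + 1, x1]])
        if (border != 255).any():                   # hit a gray-value boundary
            break
        best = (x0, y0, x1, y1)                     # still fully projectable
        half_w += 16 * step                         # keep 16:9 while expanding
        half_h += 9 * step
    return best                                     # four vertices of the rectangle
```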
  • In some embodiments, the controller obtains the final rectangular area by finding the largest inscribed rectangle whose gray value is 255 in the captured image and then correcting the width and height of that rectangle according to the aspect ratio of the projection chart. The controller acquires the largest inscribed rectangle in the 255-valued region of the captured image and calculates its aspect ratio; if the aspect ratio of the largest inscribed rectangle differs from that of the projection chart, the width or height is reduced accordingly so that the rectangle matches the chart aspect ratio. The controller can fix any vertex of the largest inscribed rectangle to locate the rectangular area corresponding to the projection area; from that vertex and the corrected width and height, the projection device obtains the finally selected rectangle, and the controller converts its four vertices into the optical-machine coordinate system to obtain the area to be projected by the optical machine.
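  • A hedged sketch of the largest-inscribed-rectangle search (classic "largest rectangle in a histogram" scan) followed by an aspect-ratio correction; the 16:9 target and the anchoring at the top-left vertex are illustrative assumptions.

```python
import numpy as np

def max_inscribed_rectangle(binary):
    """Largest axis-aligned rectangle containing only 255 pixels; returns (x, y, w, h)."""
    rows, cols = binary.shape
    heights = np.zeros(cols, dtype=int)
    best = (0, 0, 0, 0, 0)                          # area, x, y, w, h
    for row in range(rows):
        heights = np.where(binary[row] == 255, heights + 1, 0)
        stack = []                                  # indices of increasing bars
        for x in range(cols + 1):
            cur = heights[x] if x < cols else 0     # sentinel flushes the stack
            while stack and heights[stack[-1]] >= cur:
                h = heights[stack.pop()]
                left = stack[-1] + 1 if stack else 0
                w = x - left
                if h * w > best[0]:
                    best = (h * w, left, row - h + 1, w, h)
            stack.append(x)
    _, x, y, w, h = best
    return x, y, w, h

def fit_to_aspect(x, y, w, h, ratio=16 / 9):
    """Shrink the inscribed rectangle to the chart aspect ratio, keeping the
    top-left vertex fixed."""
    if w / h > ratio:
        w = int(h * ratio)                          # too wide: trim width
    else:
        h = int(w / ratio)                          # too tall: trim height
    return x, y, w, h
```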
  • In some embodiments, the controller first obtains, based on a calibration algorithm, two sets of extrinsic parameters, namely rotation and translation matrices, between the two cameras and between the camera and the optical machine; it then projects a specific checkerboard chart through the optical machine of the projection device and calculates the depth values of the projected checkerboard corner points, for example solving their xyz coordinates through the translation relationship between the binocular cameras and the principle of similar triangles; it then fits the projection surface based on these xyz values and obtains its rotation and translation relationships with respect to the camera coordinate system, which may specifically include the pitch relationship (Pitch) and the yaw relationship (Yaw). The Roll parameter value can be obtained through the gyroscope configured in the projection device so as to assemble the complete rotation matrix, and finally the extrinsic parameters from the projection surface in the world coordinate system to the optical-machine coordinate system are calculated.
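  • A rough sketch of extracting pitch and yaw from the fitted plane normal and composing the full rotation once the roll angle is read from the gyroscope; the axis conventions and rotation order are illustrative assumptions.

```python
import numpy as np

def pitch_yaw_from_normal(normal):
    """Pitch and yaw of the projection surface relative to the optical axis,
    derived from the fitted plane normal (camera assumed to look along +Z)."""
    nx, ny, nz = normal / np.linalg.norm(normal)
    pitch = np.degrees(np.arctan2(ny, nz))          # rotation about X
    yaw = np.degrees(np.arctan2(nx, nz))            # rotation about Y
    return pitch, yaw

def full_rotation(pitch, yaw, roll):
    """Compose the complete rotation matrix with the gyroscope roll angle
    (degrees, Z-Y-X composition assumed)."""
    p, y, r = np.radians([pitch, yaw, roll])
    rx = np.array([[1, 0, 0], [0, np.cos(p), -np.sin(p)], [0, np.sin(p), np.cos(p)]])
    ry = np.array([[np.cos(y), 0, np.sin(y)], [0, 1, 0], [-np.sin(y), 0, np.cos(y)]])
    rz = np.array([[np.cos(r), -np.sin(r), 0], [np.sin(r), np.cos(r), 0], [0, 0, 1]])
    return rz @ ry @ rx
```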
  • Referring to FIG. 6E, a flow diagram of the projection device implementing the keystone correction and obstacle avoidance algorithms is given, including the following steps:
  • Step 6E01 the projection device controller obtains the depth value of the point corresponding to the pixel point of the photo, or the coordinates of the projection point in the camera coordinate system;
  • Step 6E02 through the depth value, the middleware obtains the relationship between the optical machine coordinate system and the camera coordinate system;
  • Step 6E03 the controller calculates the coordinate value of the projected point in the optical machine coordinate system
  • Step 6E04 obtaining the angle between the projection surface and the optical machine based on the coordinate value fitting plane
  • Step 6E05 obtain the corresponding coordinates of the projection point in the world coordinate system of the projection surface according to the angle relationship;
  • Step 6E06 according to the coordinates of the map card in the optical-mechanical coordinate system and the coordinates of the corresponding points on the projection plane projection surface, a homography matrix can be calculated;
  • Step 6E07 the controller judges whether an obstacle exists based on the above-mentioned acquired data, if so, execute step 6E08, otherwise execute step 6E09;
  • Step 6E08 arbitrarily select rectangular coordinates on the projection surface in the world coordinate system, and calculate the area to be projected by the optical machine according to the homography relationship;
  • Step 6E09 when the obstacle does not exist, the controller can obtain the feature points of the two-dimensional code, for example;
  • Step 6E10 then obtain the coordinates of the two-dimensional code on the prefabricated map card
  • Step 6E11 then obtain the homography relationship between the camera photo and the drawing card
  • Step 6E12 converting the obtained coordinates of the obstacle into the map, and then obtaining the coordinates of the map blocked by the obstacle;
  • Step 6E13 according to the coordinates of the occlusion area of the obstacle map in the optical-mechanical coordinate system, the coordinates of the occlusion area of the projection surface are obtained through the transformation of the homography matrix;
  • Step 6E14 select rectangular coordinates on the projection surface in the world coordinate system while avoiding the obstacle, and calculate the area to be projected by the optical machine according to the homography relationship.
  • the obstacle avoidance algorithm uses the OpenCV algorithm library to complete the contour extraction of foreign objects at the rectangle-selection step of the keystone correction algorithm flow, and avoids the obstacle when selecting the rectangle, so as to realize the projection obstacle-avoidance function.
  • Referring to FIG. 6F, a schematic flow chart of the projection device implementing the screen-entry algorithm includes the following steps:
  • Step 6F01 the middleware acquires the QR code image card captured by the camera
  • Step 6F02 identifying the feature points of the two-dimensional code, and obtaining the coordinates in the camera coordinate system
  • step 6F03 the controller further acquires the coordinates of the preset image card in the optical-mechanical coordinate system
  • Step 6F04 solving the homography relationship between the camera plane and the optical-mechanical plane
  • Step 6F05 based on the above homography, the controller identifies the coordinates of the four vertices of the curtain captured by the camera;
  • step 6F06 according to the homography matrix, the range of the chart to be projected by the screen light machine is acquired.
  • the screen-entry algorithm can identify and extract the largest black closed rectangular outline based on the algorithm library and judge whether it has a 16:9 size; a specific chart is projected and photographed with the camera, and multiple corner points are extracted from the photo to calculate the homography between the projection surface (curtain) and the optical-machine chart; the four vertices of the screen are then converted into the optical-machine pixel coordinate system through this homography, and the optical-machine chart is converted to the four vertices of the screen, completing the screen-entry calculation. A sketch of this flow is given below.
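  • A hedged OpenCV sketch of locating the curtain outline and mapping its four vertices into optical-machine pixels; the binarization threshold, ratio tolerance and function name are illustrative assumptions.

```python
import cv2
import numpy as np

def find_screen_and_map(photo_gray, h_cam_to_om, ratio_tol=0.1):
    """Locate the largest dark closed quadrilateral (the curtain frame) in the
    camera photo, check that it is roughly 16:9, and map its four vertices
    into the optical-machine pixel coordinate system with the homography."""
    _, binary = cv2.threshold(photo_gray, 60, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    quad = cv2.approxPolyDP(largest, 0.02 * cv2.arcLength(largest, True), True)
    if len(quad) != 4:
        return None                                 # not a closed quadrilateral
    x, y, w, h = cv2.boundingRect(quad)
    if abs(w / h - 16 / 9) > ratio_tol:             # not a 16:9 screen
        return None
    pts = quad.reshape(-1, 1, 2).astype(np.float32)
    return cv2.perspectiveTransform(pts, h_cam_to_om).reshape(-1, 2)
```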
  • the long-focus micro-projection television can be moved flexibly, and the projected picture may be distorted after each displacement; the projection device provided by the present application can automatically correct the above problems, including realizing automatic keystone correction, automatic screen entry, automatic obstacle avoidance, automatic focusing, anti-eye protection and other functions.
  • this application also provides a display control method for automatically correcting the projected picture.
  • the method realizes the automatic correction of the projected picture on the projection device.
  • the specific implementation of the method has been described in detail in the foregoing control scheme and will not be repeated here.
  • the beneficial effect of the embodiments of the present application is that, by projecting the first and second charts, the corresponding first and second images can be obtained; further, by comparing the feature points of the first and second images, the homography relationship between the projected chart and the projection surface can be determined, the captured image can be converted into the world coordinate system based on this homography relationship, the corrected projection rectangular area can be determined, and the deformed projection area generated after the projection device is displaced can be corrected automatically.
  • Based on a trigger-correction method, the present application also provides a display control method for automatically correcting projected images, which can be applied to a projection device including a projection screen 1 and a device 2 for projection, so that in a scene where the projection device triggers image correction and the eye protection mode by default, for example while the projection device is being moved, the problem of falsely triggering the image correction and eye protection functions is solved.
  • Fig. 7 is a schematic diagram of the lens structure of the projection device 2 in some embodiments.
  • the lens 300 of the projection device may further include an optical assembly 310 and a driving motor 320 .
  • the optical component 310 is a lens group composed of one or more lenses, which can refract the light emitted by the optical machine 200, so that the light emitted by the optical machine 200 can be transmitted to the projection surface to form a transmitted content image.
  • the optical assembly 310 may include a lens barrel and a plurality of lenses disposed in the lens barrel. According to whether the position of a lens can be moved, the lenses in the optical assembly 310 can be divided into a movable lens 311 and a fixed lens 312; by changing the position of the movable lens 311 and adjusting the distance between the movable lens 311 and the fixed lens 312, the overall focal length of the optical assembly 310 is changed. Therefore, the driving motor 320, which is connected to the movable lens 311 in the optical assembly 310, can drive the movable lens 311 to change its position, thereby realizing the auto-focus function.
  • the focusing process described in some embodiments of the present application refers to changing the position of the movable lens 311 by means of the driving motor 320, thereby adjusting the distance between the movable lens 311 and the fixed lens 312, that is, adjusting the position of the image plane. According to the imaging principle of the lens combination in the optical assembly 310, adjusting the focal length is actually adjusting the image distance; but in terms of the overall structure of the optical assembly 310, adjusting the position of the movable lens 311 is equivalent to adjusting the overall focal length of the optical assembly 310.
  • the lens of the projection device needs to be adjusted to different focal lengths so as to transmit a clear image on the projection surface.
  • Since the projection device may be placed at different positions by the user, the distance between the projection device and the projection surface varies, and different focal lengths are required; therefore, in order to adapt to different usage scenarios, the projection device needs to adjust the focal length of the optical assembly 310.
  • Fig. 8 is a schematic structural diagram of a distance sensor and a camera in some embodiments.
  • the device 2 for projection can also have a built-in or external camera 700 , and the camera 700 can take images of images projected by the projection device to obtain projection content images.
  • the projection device checks the definition of the projected content image to determine whether the current lens focal length is appropriate, and adjusts the focal length if it is not appropriate.
  • the projection device can continuously adjust the lens position and take pictures, and find the focus position by comparing the clarity of the front and rear position pictures, so as to adjust the moving lens 311 in the optical assembly to Suitable location.
  • the controller 500 may first control the driving motor 320 to gradually move the moving lens 311 from the focus start position to the focus end position, and continuously obtain projected content images through the camera 700 during this period. Then, by performing definition detection on multiple projected content images, the position with the highest definition is determined, and finally the driving motor 320 is controlled to adjust the moving lens 311 from the focusing terminal to the position with the highest definition, and automatic focusing is completed.
  • Fig. 9 is a schematic diagram when the position of the projection device changes in some embodiments.
  • the projection device starts at position A and can project onto a suitable projection area, which is rectangular in shape and can usually completely and accurately cover the corresponding rectangular screen.
  • After the projection device is moved to another position, deformed projected images, such as trapezoidal images, are often generated, causing the projected image to no longer match the rectangular screen.
  • projection equipment has image correction functions.
  • the projection device can automatically correct the projected image to obtain a regular-shaped image, which can be a rectangular image, thereby realizing image correction.
  • the projection device may activate an image correction function to correct the projected image.
  • the image correction instruction refers to a control instruction used to trigger the projection device to automatically perform image correction.
  • the image correction instruction may be an instruction actively input by the user. For example, after the projection device is powered on, the projection device can project an image on the projection surface. At this time, the user can press the image correction switch preset in the projection device, or the image correction button on the remote control of the projection device, so that the projection device turns on the image correction function, and automatically performs image correction on the projected image.
  • the image correction instruction can also be automatically generated according to the built-in control program of the projection device.
  • the projection device actively generates the image correction instruction after it is turned on. For example, when the projection device detects that a video signal is input for the first time after being turned on, it may generate an image correction instruction, thereby triggering the image correction function.
  • The image correction instruction may also be generated by the projection device itself during operation. Considering that the user may actively move the projection device or accidentally touch it during use, resulting in changes in its posture or installation position, the projected image may also become a trapezoidal image; at this time, in order to ensure the user's viewing experience, the projection device can automatically perform image correction.
  • the projection device can detect its own situation in real time.
  • the projection device detects that its posture or setting position has changed, it can generate an image correction instruction, thereby triggering the image correction function.
  • the projection device can monitor the movement of the device in real time through its configuration components, and immediately feed back the monitoring results to the projection device controller, so that after the projection device moves, the controller immediately starts the image correction function and realizes it in the first time Projected image correction.
  • the controller receives monitoring data from the acceleration sensor to determine whether the projection device moves.
  • the controller can generate an image correction command and enable the image correction function.
  • the camera captures the image in real time.
  • When it is detected that a person has entered the projection area, the projection device will automatically trigger the eye protection mode. In this mode, in order to reduce the damage of the light source to human eyes, the projection device can change the brightness of the image by adjusting the brightness of the light source, thereby protecting human eyes.
  • the projection device acquires acceleration data in real time based on the gyroscope sensor, and determines whether the projection device moves according to the change of the acceleration data. If the projection device is moved, the image correction function is only activated when the projection device is in a static state.
  • However, while the projection device is moving, the acquired acceleration data may be abnormal; for example, the projection device may be determined to be in a static state during movement on the basis of such abnormal data. Therefore, when the projection device is moving, the image correction and eye protection functions may be falsely triggered.
  • the present application proposes a projection device that may include an optical machine, a light source, an acceleration sensor, a camera, and a controller.
  • the optical machine is used to project the playback content preset by the user onto the projection surface
  • the projection surface can be a wall or a curtain.
  • the images captured by the camera are used to perform image correction for the projection device.
  • the light source is used to provide illumination for the light machine.
  • the acceleration sensor is used to determine whether the projection device moves, so as to solve the problem of falsely triggering the image correction and eye protection functions while the projection device is moving.
  • In some embodiments, after receiving an image correction instruction, the projection device may trigger the image correction function.
  • the generation method of the image correction instruction includes, but is not limited to, generation by the user selecting a preset image correction switch, or automatic generation based on a change of the gyroscope sensor. Therefore, the embodiment of the present application will be described in detail with the image correction switch being in the on state and the image correction switch being in the off state. It can be understood that when the image correction switch is turned on, the projection device may automatically enable the image correction function based on the change of the gyroscope sensor. When the image correction switch is turned off, the projection device does not enable the image correction function.
  • FIG. 10 is a schematic diagram of image correction triggered by a projection device in an embodiment of the present application.
  • the controller acquires the acceleration data collected by the acceleration sensor. If the acceleration data is less than the preset acceleration threshold, it is determined that the projection device is in a static state, and the controller controls the optical machine to project the correction chart onto the projection surface, determines the area to be projected on the projection surface based on the correction chart, and projects the playback content to the area to be projected.
  • the controller monitors and receives the acceleration data collected by the gyroscope sensor, that is, the acceleration sensor in real time.
  • the acceleration data includes acceleration data on three axis coordinates (X, Y and Z axes). Since the acceleration data represent the acceleration data collected in the coordinate directions of the three axes, when the acceleration data on one of the axes changes, it means that the projection device is displaced in the coordinate direction of the axis. Therefore, whether the projection device moves can be determined by judging the acceleration data in the directions of the three axes. Next, convert the acceleration data in the coordinate directions of the three axes into angular velocity data, and determine the movement angle of the projection device in the coordinate directions of the three axes through the angular velocity data.
  • the controller calculates the movement angles of the projection device in the coordinate directions of the three axes from the previously collected acceleration data and the currently collected acceleration data. If the movement angles in the coordinate directions of the three axes are all less than the preset acceleration threshold, it is determined that the projection device is in a static state, and the optical machine can be controlled to project the correction chart onto the projection surface to trigger the image correction process.
  • the embodiment of the present application does not set numerical limitations on the preset acceleration threshold, and those skilled in the art can determine the value of the preset acceleration threshold according to actual needs, such as 0.1, 0.2, 0.3, etc., these designs do not Beyond the scope of protection of the embodiments of the present application.
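  • As a rough illustration of how one accelerometer sample can be turned into movement angles and compared against such a threshold, a hedged sketch follows; the threshold value, angle formulas and function names are illustrative assumptions.

```python
import math

ACCEL_THRESHOLD = 0.1   # illustrative value; the patent leaves the threshold to the designer

def tilt_angles(ax, ay, az):
    """Convert one accelerometer sample (in g) into approximate tilt angles
    about the X and Y axes."""
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    return roll, pitch

def is_static(prev_sample, cur_sample, threshold=ACCEL_THRESHOLD):
    """Treat the device as static when the per-axis change between the previous
    and the current sample stays below the preset threshold."""
    return all(abs(c - p) < threshold for p, c in zip(prev_sample, cur_sample))
```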
  • FIG. 11 is a schematic diagram of determining whether the projection device moves in the embodiment of the present application, including the following steps:
  • Step 1101 judging whether the acceleration data is less than the preset acceleration threshold, if not, execute step 1102, if yes, execute step 1104;
  • Step 1102 assigning the data label corresponding to the acceleration data as the second state identifier
  • Step 1103 representing that the projection device is in a moving state
  • Step 1104 assigning the data label corresponding to the acceleration data as the first state identifier
  • Step 1105 acquiring the acceleration data at the first moment
  • Step 1106 detecting whether the data tag corresponding to the acceleration data at the first moment is the first state identifier, if not, execute step 1107, and if yes, execute step 1108;
  • Step 1107 representing that the projection device is in a moving state
  • Step 1108 acquiring the acceleration data at the second moment
  • Step 1109 detecting whether the acceleration data at the second moment is less than the preset acceleration threshold, if not, execute step 1110, if yes, execute step 1111;
  • Step 1110 assigning the data label corresponding to the second acceleration data as the second state identifier
  • Step 1111 trigger the image automatic correction process.
  • the controller is further configured to: if the acceleration data is less than the preset acceleration threshold, assign the data label corresponding to the acceleration data as a first state identifier, and the first state identifier is used to indicate that the projection device is in a static state. If the acceleration data is greater than the preset acceleration threshold, the data label is assigned a second status identifier, and the second status identifier is used to indicate that the projection device is in a moving state.
  • In some embodiments, the preset acceleration threshold is 0.1. If the acceleration data is greater than 0.1, the current data label is assigned the second state flag, for example movingType is moving and staticType is false; if the acceleration data is less than 0.1, the current data label is assigned the first state flag, for example staticType is true.
  • the controller is further configured to: acquire the acceleration data at the first moment based on a preset duration. Next, the data label corresponding to the acceleration data at the first moment is detected. If the data tag value is the first state identifier, the acceleration data at the second moment is obtained; wherein, the interval between the second moment and the first moment is a preset time length. According to the acceleration data at the second moment, the moving angle of the projection device is calculated, so as to control the optical machine to project the correction chart to the projection surface according to the moving angle.
  • the default duration is 1s.
  • the acceleration data at the first moment is acquired again after an interval of 1 second. Determine the data label of the acceleration data at the first moment. At this point, the data label will appear in two situations.
  • Case 1 If the data label of the acceleration data at the first moment is the second state flag movingType is moving and staticType is false, then the acceleration data at the second moment will be acquired again after an interval of 1s. Determine the data label of the acceleration data at the second moment again. If the data label of the acceleration data at the second moment is still the second state flag movingType is moving and staticType is false, it is determined that the projection device is in a moving state, and the projection device does not trigger subsequent image correction process.
  • the acceleration data is collected by the gyroscope sensor in real time and sent to the controller. Furthermore, the process of acquiring acceleration data at different times and determining the corresponding data label can be performed multiple times, so as to more accurately determine whether the projection device is really in a static state.
  • the embodiment of the present application does not specifically limit the execution times of the acquiring and judging process, and can be set multiple times according to the device state corresponding to the specific projection device, and all of them are within the protection scope of the embodiment of the present application.
  • the controller is configured to: acquire the acceleration data at a third moment based on the preset duration, wherein the interval between the third moment and the moment of acquiring the previous acceleration data is the preset duration; if the acceleration data at the third moment is less than the preset acceleration threshold, calculate the movement angle of the projection device, so as to control the optical machine to project the correction chart onto the projection surface according to the movement angle.
  • Case 2: If the acceleration data obtained by the controller is greater than 0.1, the data label of the acceleration data is the second state flag (movingType is moving and staticType is false), and the projection device can be considered to be in a moving state. Since the acceleration sensor collects data and sends it to the controller in real time, the controller still acquires and checks acceleration data in real time after the projection device enters the moving state; it therefore acquires the acceleration data at the third moment again after an interval of 1 s, and if the acceleration data at the third moment is still greater than 0.1, it determines that the projection device is still in a moving state.
  • If, instead, the acceleration data at the third moment is less than the preset acceleration threshold, the controller calculates the moving angle of the projection device from the acceleration data and triggers the image correction process, which will not be repeated here. A sketch of this debouncing check is shown below.
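  • A simplified sketch of the "two consecutive static samples, one preset interval apart" check before the correction chart is projected; read_accel and is_below_threshold are hypothetical callables, and the interval and threshold are illustrative.

```python
import time

CHECK_INTERVAL = 1.0   # seconds, the "preset duration" used in this sketch

def wait_until_static(read_accel, is_below_threshold):
    """Only report the device as static (and thus ready for correction) after
    two consecutive below-threshold samples taken one interval apart."""
    if not is_below_threshold(read_accel()):       # first moment: still moving
        return False
    time.sleep(CHECK_INTERVAL)
    if not is_below_threshold(read_accel()):       # second moment: moved again
        return False
    return True                                    # safe to project the correction chart
```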
  • the image correction process specifically includes: the controller controls the optical machine to project the correction chart onto the projection surface, determines the area to be projected on the projection surface based on the correction chart, and finally projects the playback content to the area to be projected.
  • the calibration card includes a calibration white card and a calibration chart card.
  • the controller controls the optical machine to project the correction white card to the projection surface and extracts the first correction coordinates in the correction white card.
  • the controller controls the optical machine to project the calibration map onto the projection surface and extract the second calibration coordinates in the calibration map.
  • the controller acquires an angular relationship between the coordinates based on the first corrected coordinates and the second corrected coordinates, wherein the angular relationship is a projection relationship between the playing content of the optical machine projection and the projection surface.
  • the controller updates the first corrected coordinates based on the angle relationship, so that the area to be projected is determined according to the updated first corrected coordinates.
  • After the controller receives the input image correction instruction, referring to FIG. 13, it controls the optical machine to project the correction white card onto the projection surface and extracts the first correction coordinates from the correction white card, wherein the first correction coordinates include the four vertex coordinates of the correction white card. Referring to FIG. 14, the controller then controls the optical machine to project the correction chart onto the projection surface and extracts the second correction coordinates from the correction chart, wherein the second correction coordinates include the four vertex coordinates of the correction chart.
  • the controller compares the coordinates of the four vertices of the corrected white card with the coordinates of the four vertices of the corrected chart, such as comparing the coordinates of the vertices of the upper left corner of the corrected white card with the coordinates of the vertices of the upper left corner of the corrected chart , compare the vertex coordinates of the lower left corner in the corrected white card with the vertex coordinates of the lower left corner in the corrected chart, compare the vertex coordinates of the upper right corner in the corrected white card with the vertex coordinates of the upper right corner in the corrected chart, and Compare the vertex coordinates of the lower right corner in the calibration white card with the vertex coordinates of the lower right corner in the calibration chart.
  • the optical machine is controlled to project the playback content to the area to be projected for users to watch.
  • the projection device may obtain the angle relationship between the first correction coordinate and the second correction coordinate through a preset correction algorithm.
  • the projection relationship refers to the projection relationship in which the projection device projects an image onto the projection surface, specifically, the mapping relationship between the playback content of the projection device's optical-mechanical projection and the projection surface.
  • the projection device can convert the first correction coordinates into the second correction coordinates through a preset correction algorithm, and then complete the updating of the first correction coordinates.
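  • One plausible reading of this coordinate update, sketched with OpenCV: estimate the perspective transform between the two sets of four correction vertices and map a desired display rectangle back through its inverse to get the region the optical machine must drive. The function name, the use of the inverse transform and the target rectangle are all assumptions for illustration.

```python
import cv2
import numpy as np

def correction_mapping(first_coords, second_coords, target_rect):
    """first_coords / second_coords: the four vertices extracted from the
    correction white card and the correction chart.  target_rect: the desired
    rectangular display area in the corrected plane.  Returns that rectangle
    expressed in the original (optical-machine) coordinates."""
    m = cv2.getPerspectiveTransform(np.float32(first_coords),
                                    np.float32(second_coords))
    m_inv = np.linalg.inv(m)                        # map back to the source plane
    rect = np.float32(target_rect).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(rect, m_inv).reshape(-1, 2)
```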
  • When the controller controls the optical machine to project the correction white card and the correction chart onto the projection surface, the controller can also control the camera to photograph the correction white card and the correction chart displayed on the projection surface, so as to obtain a correction white card image and a correction chart image. The first correction coordinates and the second correction coordinates can therefore be obtained from the correction white card image and the correction chart image captured by the camera.
  • the image coordinate system refers to the coordinate system with the center of the image as the coordinate origin and the X and Y axes parallel to the two sides of the image.
  • the center point of the projection area can be set as the origin, the horizontal direction is the X axis, and the vertical direction is the Y axis.
  • The first correction coordinates and the second correction coordinates include, but are not limited to, four vertex coordinates; the number and positions of the coordinates can be adjusted according to the correction white card, the correction chart and the projection device.
  • the calibration chart may contain patterns and color features preset by the user.
  • Figure 15 is a schematic diagram of a calibration chart in some embodiments.
  • Three kinds of correction areas can be arranged successively along the direction of the correction chart from the center to the edge.
  • the first calibration area is located in the middle area of the calibration chart, such as area A in the figure.
  • the first correction area may be a checkerboard pattern or a circular ring pattern.
  • the second calibration area is located between the first calibration area and the third calibration area, such as the four B areas in the figure, including four calibration areas B1-B4.
  • the area of each second correction area may be the same and may be smaller than the area of the first correction area.
  • Each second correction area includes a pattern preset by the user, wherein the patterns of each second correction area may be the same or different, which is not limited in this embodiment of the present application.
  • the third correction area is located in the edge area of the correction chart, as shown in the 16 C areas in the figure, including 16 correction areas C1-C16.
  • the area of each third correction area may be the same and smaller than the area of the second correction area.
  • Each third correction area may include a pattern preset by the user, wherein the patterns of each third correction area may be the same or different.
  • the second correction coordinates may be the four vertex coordinates of all the patterns in the three correction areas, or the four vertex coordinates of any several patterns in any correction area.
  • the pattern in the correction chart can also be set as a donut chart, and the donut chart includes a donut pattern. It should be noted that the correction chart can also be set as a combination of the above two types of patterns, or can also be set as other patterns that can realize image correction.
  • In some embodiments, each time a correction card is projected onto the projection surface, an auto-focusing process is triggered.
  • the autofocus process performed after the calibration white card is projected onto the projection surface is called the first focus.
  • the automatic focusing process after projecting the calibration chart onto the projection surface is called the second focusing.
  • the projection device provided by the embodiment of the present application further includes a lens, and the lens includes an optical assembly and a driving motor; the driving motor is connected to the optical assembly.
  • the focal length of the optical assembly is adjusted by driving the motor to realize the first focusing process and the second focusing process.
  • the controller controls the lens to adopt a contrast-based focusing method and controls the driving motor to drive the optical assembly to move step by step; the contrast value of the image captured at each lens position is calculated, the target position is determined according to the contrast values as the position corresponding to the image with the highest contrast, and the driving motor is then controlled to move the optical assembly to the target position.
  • the projection device can also use an autofocus algorithm to obtain the current object distance by using its configured laser rangefinder to calculate the initial focal length and search range. Then the projection device drives the camera to take pictures, and uses the corresponding algorithm to evaluate the definition. Therefore, within the above search range, the projection device searches for a possible best focal length based on a search algorithm. Then repeat the above steps of taking photos and sharpness evaluation, and finally find the optimal focal length through sharpness comparison to complete autofocus.
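  • A minimal sketch of such a contrast-based focus search; the Laplacian-variance sharpness score, the callables move_lens_to / capture_frame and the position list are illustrative assumptions rather than the device's actual interfaces.

```python
import cv2
import numpy as np

def sharpness(image_gray):
    """Contrast score of one captured frame: variance of the Laplacian."""
    return cv2.Laplacian(image_gray, cv2.CV_64F).var()

def contrast_autofocus(move_lens_to, capture_frame, positions):
    """Step the moving lens through the candidate positions, score each
    captured frame, and park the lens at the sharpest position."""
    scores = []
    for pos in positions:
        move_lens_to(pos)                       # drive motor to this step
        scores.append(sharpness(capture_frame()))
    best = positions[int(np.argmax(scores))]
    move_lens_to(best)                          # final move to the target position
    return best
```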
  • the projection device needs to detect whether the auto-focus function is enabled before triggering image correction.
  • If the auto-focus function is not enabled, the controller will not trigger the auto-focus process.
  • the projection device will automatically trigger the first focusing process, the second focusing process and the focusing process after image correction is completed.
  • the controller can also trigger the eye protection mode after performing the image correction process.
  • the display brightness of the user interface is reduced to prevent the risk of vision damage caused by the user accidentally entering the range of the laser trajectory emitted by the projection device.
  • the controller can control the user interface to display corresponding prompt information to remind the user to leave the current area.
  • Referring to FIG. 17, the process of triggering the eye protection mode includes the following steps:
  • Step 1701 judging whether the children's viewing mode switch is turned on, if so, execute step 1702, otherwise end the trigger;
  • Step 1702 judging whether the anti-eye switch is turned on, if so, execute step 1703, otherwise end the trigger;
  • Step 1703 judging whether the image correction is performed, if not, execute step 1704, if yes, end the trigger;
  • Step 1704 judging whether autofocus is in progress, if not, execute step 1705, otherwise end triggering;
  • Step 1705 trigger to enter the eye protection mode.
  • the controller detects whether the children's viewing mode is turned on, for example by detecting whether the children's viewing mode switch is turned on. With the children's viewing mode switch turned on, the controller detects whether the anti-eye switch is turned on. If the anti-eye switch is turned on, then when a user enters the projection area the projection device triggers the eye protection mode, that is, it automatically reduces the laser intensity emitted by the light source, reduces the display brightness of the user interface, and displays safety prompt information.
  • the controller can also automatically turn on the anti-eye switch. After receiving the acceleration data sent by the acceleration sensor or the data collected by other sensors, the controller will control the projection device to turn on the anti-eye switch.
  • when the controller triggers the above eye protection function after performing the image correction process, the controller is further configured to: acquire preset frames of images through the camera and calculate the similarity value of the corresponding images in the preset frames; if the similarity value is greater than the similarity threshold, adjust the brightness of the light source to a preset brightness, wherein the preset brightness is lower than the brightness during normal projection.
  • For example, the controller acquires two frames of images and calculates the similarity value between the image of the current frame and the image of the previous frame. If the similarity value is less than the similarity threshold, it is determined that the two frames are the same and no person has entered the projection area; if the similarity value is greater than the similarity threshold, it is determined that the two frames are not the same and a person has entered the projection area. In this way, when a person is detected entering the projection area, the brightness of the light source is adjusted to the preset brightness, so the purpose of protecting human eyes is achieved by reducing the brightness of the light source. It should be noted that the present application does not limit how a person entering the projection area is detected, and determining whether the preset frames of images are the same is used only as an example. A hedged sketch of such a frame-difference check follows.
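  • The sketch below scores how much the current camera frame deviates from the previous one and treats a score above the threshold as "a person has entered the projection area"; the mean-absolute-difference metric and the threshold value are illustrative assumptions, not the patent's concrete similarity measure.

```python
import cv2
import numpy as np

DIFFERENCE_THRESHOLD = 12.0    # illustrative "similarity" threshold

def person_entered(prev_gray, cur_gray, threshold=DIFFERENCE_THRESHOLD):
    """Mean absolute pixel difference between two consecutive grayscale frames;
    exceeding the threshold is taken to mean the scene changed (e.g. a person
    entered), so the light-source brightness should be lowered."""
    diff = cv2.absdiff(cur_gray, prev_gray)
    return float(np.mean(diff)) > threshold
```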
  • the controller is further configured to: detect the correction progress of the image; if the correction progress is not in progress, acquire the acceleration data at the current moment. And adjust the brightness of the light source to the preset brightness according to the acceleration data at the current moment.
  • the controller needs to detect the correction progress of the image. If no image correction is in progress, the acceleration data at the current moment is acquired and checked, and whether the projection device is in a moving state is determined based on that data; if the projection device is in a static state, the brightness of the light source is adjusted to the preset brightness, that is, the eye protection mode is triggered.
  • the controller is further configured to: detect the progress of focusing on the image. If the focus progress is not in progress, then acquire and detect the acceleration data at the current moment. According to the acceleration data at the current moment, it is determined whether the projection device is in a moving state, and if the projection device is in a static state, the eye protection mode is triggered.
  • When the controller detects whether the auto-focus function is turned on, it can detect whether the auto-focus switch is turned on. If the auto-focus switch is not turned on, the eye protection mode is triggered directly. If the auto-focus switch is turned on, the controller needs to judge the focusing progress of the image at the current moment, and if focusing is not in progress, the eye protection mode is triggered.
  • the projection device can trigger the image correction process based on the change of the acceleration sensor, and trigger the eye protection mode after the image self-correction process is completed.
  • Based on the acceleration data collected by the acceleration sensor, it can be determined more accurately whether the projection device is in a static state. This prevents the projection device from falsely triggering the image correction process and entering the eye protection mode while it is moving, and further avoids triggering the image correction process and the auto-focus process at the same time as entering the eye protection mode, which improves the user experience.
  • the controller when the image correction switch of the projection device is turned off, the controller does not trigger the image correction process.
  • since the anti-eye switch is in the on state, when a person is detected entering the projection area the projection device automatically enters the eye protection mode, that is, it reduces the laser intensity emitted by the light source, reduces the display brightness of the user interface, and displays safety prompts. It should be noted that, for the specific steps of triggering the eye protection mode, reference may be made to the description in the above embodiments, and details are not repeated here.
  • the image correction switch when the image correction switch is off by default, after the projection device detects the on-off state of the auto-focus switch, if the auto-focus switch is off, the eye protection mode is triggered. At the same time, in the process of triggering the eye protection mode, by detecting the progress of correction and focusing, it is avoided to trigger the image correction process and the auto focus process at the same time when entering the eye protection mode, which improves the user experience.
  • the projection device acquires the acceleration data collected by the acceleration sensor in the scene where image correction and eye protection mode can be triggered by default. If the acceleration data is less than the preset acceleration threshold, control the optomechanical projection correction card to the projection surface. Determine the area to be projected on the projection surface based on the calibration card, and project the playing content to the area to be projected. Thus, image correction is triggered after it is determined that the projection device is in a static state after being moved. In addition, the image with the same content in the preset frame is acquired through the camera, and the brightness of the light source is adjusted according to the image to trigger the eye protection function.
  • In this way, the eye protection mode is triggered based on whether the corresponding images in the preset frames have the same content, which solves the problem of falsely triggering the image correction and eye protection functions while the projection device is moving and improves the user experience.
  • the present application also provides a trigger correction method, which is applied to a projection device.
  • the projection device includes a light source, an acceleration sensor, an optical machine, a camera, and a controller; the trigger correction method includes:
  • acquiring the acceleration data collected by the acceleration sensor; if the acceleration data is less than the preset acceleration threshold, controlling the optical machine to project the correction chart onto the projection surface; determining the area to be projected on the projection surface based on the correction chart, and projecting the playback content to the area to be projected;
  • the method includes: if the acceleration data is less than a preset acceleration threshold, assigning the data label corresponding to the acceleration data as a first state identifier, the first state identifier is used to indicate that the projection device is in a static state; if the acceleration data is greater than If the acceleration threshold is preset, the data label is assigned a second status identifier, and the second status identifier is used to indicate that the projection device is in a moving state.
  • the method includes: acquiring the acceleration data at a first moment based on a preset duration; detecting the data label corresponding to the acceleration data at the first moment; if the data label value is the first state identifier, acquiring the acceleration data at a second moment, wherein the interval between the second moment and the first moment is the preset duration; and calculating the movement angle of the projection device according to the acceleration data at the second moment, so as to control the optical machine to project the correction chart onto the projection surface according to the movement angle.
  • the method includes: after the step of acquiring the acceleration data collected by the acceleration sensor, acquiring the acceleration data at a third moment based on the preset duration, wherein the interval between the third moment and the moment of acquiring the acceleration data is the preset duration; and, if the acceleration data at the third moment is less than the preset acceleration threshold, calculating the movement angle of the projection device, so as to control the optical machine to project the correction chart onto the projection surface according to the movement angle.
  • the method includes: in the step in which the acceleration data is less than the preset acceleration threshold, acquiring the previously collected acceleration data; calculating the movement angles of the projection device in the coordinate directions of the three axes according to the acceleration data; and, if the movement angles in the coordinate directions of the three axes are all less than the preset acceleration threshold, controlling the optical machine to project the correction chart onto the projection surface.
  • the method includes: in the step of adjusting the brightness of the light source to a preset brightness, calculating the similarity value of the corresponding image in the preset frame. If the similarity value is greater than the similarity threshold, the brightness of the light source is adjusted to a preset brightness.
  • the method includes: detecting the correction progress of the image; if correction is not in progress, acquiring the acceleration data at the current moment; and, according to the acceleration data at the current moment, adjusting the brightness of the light source to the preset brightness.
  • the correction card includes a correction white card and a correction chart; the method includes: in the step of determining the area to be projected on the projection surface based on the correction card, sequentially controlling the optical machine to project the correction white card and the correction chart onto the projection surface; respectively extracting the first correction coordinates from the correction white card and the second correction coordinates from the correction chart; acquiring the angle relationship between the first correction coordinates and the second correction coordinates, the angle relationship being the projection relationship between the playback content and the projection surface; and updating the first correction coordinates based on the angle relationship, so that the area to be projected is determined according to the updated first correction coordinates.
  • the projection device further includes a lens; the lens includes an optical assembly and a driving motor, and the driving motor is connected to the optical assembly to adjust the focal length of the optical assembly; the method includes: controlling the driving motor to move the optical assembly step by step; calculating the contrast value corresponding to the lens at each moved position; determining a target position according to the contrast values, where the target position is the position corresponding to the image with the highest contrast value; and controlling the driving motor to move the optical assembly to the target position.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Transforming Electric Information Into Light Information (AREA)

Abstract

本申请涉及显示设备技术领域，特别地，涉及一种投影设备及投影图像的显示控制方法，一定程度上可以解决投影设备组件装配不规范导致参数精度低，测算夹角测量难度大导致的夹角数据不准确，从而造成投影设备反复校正投影图像、或校正时间长、或不能准确校正投影图像的问题，所述投影设备包括：光机；双目相机；控制器，被配置为：在投影设备移动后，先后获取光机投射至投影面的第一图像及第二图像，第一图像对应于第一图卡，第二图像对应于第二图卡，所述第一图像、第二图像用于对应特征比对以确定在世界坐标系下投影图卡与投影面的映射关系；基于映射关系获取拍摄图像转换至世界坐标系下的矩形区域，控制光机将播放内容投射至所述矩形区域。

Description

一种投影设备及投影图像的显示控制方法
相关申请的交叉引用
本申请要求在2021年11月16日提交、申请号为202111355866.0;在2022年04月15日提交、申请号为202210399514.3;在2022年01月17日提交、申请号为202210050600.3的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及显示设备技术领域,特别地,涉及一种投影设备及投影图像的显示控制方法。
背景技术
投影仪位置移动后造成投射角度、或投影距离的偏差,会引发投影图像不能完全投射至预设投影区域,如形成梯形投影图像;通过投影仪的校正投影图像功能,可以修正投影图像的显示位置、及显示形状,使投影图像完整显示在预设投影区域内、且投影图像中心与投影区域中心重合。
在一些自动校正投影图像的实现中,通常首先通过投影仪配置的RGBD(RGB Deep:彩色深度)相机获取投影面在相机坐标系下的位置;然后通过标定方式获取投影仪光机与RGBD相机之间的外参,将投影图像在相机坐标系下的位置转换到光机坐标系下;最后获取投影仪光机与投影面的测算夹角进行测算,从而实现投影图像的校正。
然而,在投影仪光机、相机、幕布等组件装配不规范存在误差时,会导致上述参数精度降低;且标定方式工艺复杂,获取投影仪光机与投影面夹角有时会不准确,造成投影仪反复校正投影图像、或校正时间长、或不能准确校正投影图像。
发明内容
本申请实施例的第一方面提供一种投影设备,包括:光机,用于将播放内容投射至投影面;双目相机,包括左相机与右相机,用于获取所述投影面的显示图像;控制器,被配置为:在检测到投影设备移动后,控制双目相机获取光机先后投射至所述投影面的第一及第二图像,所述第一图像对应于投射第一图卡,所述第二图像对应于投射第二图卡,所述第一图像、第二图像用于对应特征比对确定在世界坐标系下投影图卡与所述投影面的单应性关系;基于所述单应性关系获取所述双目相机其中一个相机拍摄图像转换至世界坐标系下的矩形区域,控制光机将播放内容投射至所述矩形区域,所述矩形区域用于替代投影设备移动后在投影面形成的畸形投影区域。
本申请实施例的第二方面提供一种投影画面的显示控制方法,所述方法包括:在检测到移动后,获取先后投射至投影面的第一图像及第二图像,所述第一图像对应于投射第一图卡,所述第二图像对应于投射第二图卡,所述第一图像、第二图像用于对应特征比对确定在世界坐标系下投影图卡与所述投影面的单应性关系;基于所述单应性关系获取双目相机其中一个相机拍摄图像转换至世界坐标系下的矩形区域,将播放内容投射至所述矩形区域,所述矩形区域用于替代移动后在投影面形成的畸形投影区域。
附图说明
图1A为本申请一实施例投影设备的摆放示意图;
图1B为本申请一实施例投影设备光路示意图;
图2为本申请一实施例投影设备的电路架构示意图;
图3为本申请一实施例投影设备的结构示意图;
图4为本申请另一实施例投影设备的结构示意图;
图5为本申请一实施例投影设备的电路结构示意图;
图6A为本申请一实施例投影设备实现显示控制的系统框架示意图;
图6B为本申请另一实施例投影设备实现防射眼功能的信令交互时序示意图；
图6C为本申请另一实施例投影设备实现显示画面校正功能的信令交互时序示意图;
图6D为本申请另一实施例投影设备实现自动对焦算法的流程示意图;
图6E为本申请另一实施例投影设备实现梯形校正、避障算法的流程示意图;
图6F为本申请另一实施例投影设备实现入幕算法的流程示意图;
图6G为本申请另一实施例投影设备实现防射眼算法的流程示意图;
图6H为本申请另一实施例投影设备的投影示意图;
图6I为本申请另一实施例投影设备的投影示意图;
图6J为本申请另一实施例投影设备实现自动校正投影图像的流程示意;
图6K为本申请另一实施例投影设备实现自动校正投影图像的流程示意;
图7为本申请另一实施例中投影设备的镜头结构示意图;
图8为本申请另一实施例中投影设备的距离传感器和相机结构示意图;
图9为本申请另一实施例中距离传感器和相机结构示意图;
图10为本申请另一实施例中投影设备触发图像校正的示意图;
图11为本申请另一实施例中判定投影设备是否移动的示意图;
图12为本申请另一实施例中控制器、光机和投影面之间的交互示意图;
图13为本申请另一实施例中投影设备控制光机投射校正白卡的示意图;
图14为本申请另一实施例中投影设备控制光机投射校正图卡的示意图;
图15为本申请另一实施例中一种校正图卡的示意图;
图16为本申请另一实施例中另一种校正图卡的示意图;
图17为本申请另一实施例中投影设备触发护眼模式的流程图。
具体实施方式
为使本申请的目的和实施方式更加清楚,下面将结合本申请示例性实施例中的附图,对本申请示例性实施方式进行清楚、完整地描述,显然,描述的示例性实施例仅是本申请一部分实施例,而不是全部的实施例。
图1A为本申请一实施例投影设备的摆放示意图。
在一些实施例中,本申请提供的一种投影设备包括投影屏幕1和用于投影的装置2。投影屏幕1固定于第一位置上,用于投影的装置2放置于第二位置上,使得其投影出的画面与投影屏幕1吻合,该步骤为专业售后技术人员操作,也即第二位置为用于投影的装置2的最佳摆放位置。
图1B为本申请一实施例投影设备光路示意图。
本申请实施例提供了一种投影设备,包括激光光源100,光机200,镜头300,投影介质400。其中,激光光源100为光机200提供照明,光机200对光源光束进行调制,并输出至镜头300进行成像,投射至投影介质400形成投影画面。
在一些实施例中，投影设备的激光光源包括激光器组件和光学镜片组件，激光器组件发出的光束可透过光学镜片组件进而为光机提供照明。其中，例如，光学镜片组件需要较高等级的环境洁净度、气密等级密封；而安装激光器组件的腔室可以采用密封等级较低的防尘等级密封，以降低密封成本。
在一些实施例中,投影设备的光机200可实施为包括蓝色光机、绿色光机、红色光机,还可以包括散热系统、电路控制系统等。需要说明的是,在一些实施例中,投影设备的发光部件还可以通过LED(Light Emitting Diode:发光二极管)光源实现。
在一些实施例中,本申请提供了一种投影设备,包括三色光机和控制器;其中,三色光机用于调制生成用户界面包含像素点的激光,包括蓝色光机、绿色光机和红色光机;控制器被配置为:获取用户界面的平均灰度值;判定所述平均灰度值大于第一阈值、且其持续时间大于时间阈值时,控制所述红色光机的工作电流值按照预设梯度值降低,以减小所述三色光机的发热。可以发现,通过降低三色光机中所集成红色光机的工作电流,可以实现控制所述红色光机的过热,以实现控制三色光机及投影设备的过热。
光机200可实施为三色光机,所述三色光机集成蓝色光机、绿色光机、红色光机。
下文中将以投影设备的光机200实施为包括蓝色光机、绿色光机、红色光机为例,对本申请提供的技术方案进行阐述。
在一些实施例中,投影设备的光学系统由光源部分和光机部分组成,光源部分的作用是为光机提供照明,光机部分的作用是对光源提供的照明光束进行调制,最后通过镜头出射形成投影画面。
在一些实施例中,激光器组件包括红色激光器模组、绿色激光器模组、蓝色激光器模组、各个激光器模组与相应安装口均通过密封圈(采用氟橡胶或其他密封材料皆可)防尘密封安装。
图2为本申请一实施例投影设备的电路架构示意图。
在一些实施例中,本公开提供的投影设备包括多组激光器,通过在激光光源的出光路径中设置亮度传感器,亮度传感器可以检测激光光源的第一亮度值,并将第一亮度值发送至显示控制电路。
该显示控制电路可以获取每个激光器的驱动电流对应的第二亮度值,并在确定该激光器的第二亮度值与该激光器的第一亮度值的差值大于差值阈值时,确定该激光器发生COD(Catastrophic Optical Damage:光学灾变损伤)故障;则显示控制电路可以调整激光器的对应的激光器驱动组件的电流控制信号,直至该差值小于等于该差值阈值,从而消除该蓝色激光器的COD故障;该投影设备能够及时消除激光器的COD故障,降低了激光器的损坏率,确保了投影设备的图像显示效果。
在一些实施例中,该投影设备可以包括显示控制电路10、激光光源20、至少一个激光器驱动组件30以及至少一个亮度传感器40,该激光光源20可以包括与至少一个激光器驱动组件30一一对应的至少一个激光器。其中,该至少一个是指一个或多个,多个是指两个或两个以上。
在一些实施例中,投影设备包括激光器驱动组件30和一个亮度传感器40,相应的,该激光光源20包括与激光器驱动组件30一一对应的三个激光器,该三个激光器可以分别为蓝色激光器201、红色激光器202和绿色激光器203。其中,该蓝色激光器201用于出射蓝色激光,该红色激光器202用于出射红色激光,该绿色激光器203用于出射绿色激光。在一些实施例中,激光器驱动组件30可实施为包含多个子激光器驱动组件,分别对应不同颜色的激光器。
显示控制电路10用于向激光器驱动组件30输出基色使能信号以及基色电流控制信号,以驱动激光器发光,具体地,如图2所示,显示控制电路10与激光器驱动组件30连接,用于输出与多帧显示图像中的每一帧图像的三种基色一一对应的至少一个使能信号,将至少一个使能信号分别传输至对应的激光器驱动组件30,以及,输出与每一帧图像的三种基色一一对应的至少一个电流控制信号,将至少一个电流控制信号分别传输至对应的激光器驱动组件30。示例的,该显示控制电路10可以为微控制单元(microcontroller unit,MCU),又称为单片机。其中,该电流控制信号可以是脉冲宽度调制(pulse width modulation,PWM)信号。
在一些实施例中,蓝色激光器201、红色激光器202和绿色激光器203分别与激光器驱动组件30连接。激光器驱动组件30可以响应于显示控制电路10发送的蓝色PWM信号B_PWM和使能信号B_EN,向该蓝色激光器201提供对应的驱动电流。该蓝色激光器201用于在该驱动电流的驱动下发光。
亮度传感器设置于激光光源的出光路径中,通常设置在出光路径的一侧,而不会遮挡光路。如图2所示,至少一个亮度传感器40设置在激光光源20的出光路径中,该每个亮度传感器与显示控制电路10连接,用于检测一个激光器的第一亮度值,并将第一亮度值发送至显示控制电路10。
在一些实施例中,显示控制电路10可以从该对应关系中获取每个激光器的驱动电流对应的第二亮度值,该驱动电流为激光器当前的实际工作电流,该驱动电流对应的第二亮度值为激光器在该驱动电流的驱动下正常工作时能够发出的亮度值。该差值阈值可以为显示控制电路10中预先存储的固定数值。
在一些实施例中,显示控制电路10在调整与激光器对应的激光器驱动组件30的电流控制信号时,可以降低与激光器对应的激光器驱动组件30的电流控制信号的占空比,从而降低激光器的驱动电流。
在一些实施例中,亮度传感器40可以检测蓝色激光器201的第一亮度值,并将该第一亮度值发送至显示控制电路10。该显示控制电路10可以获取该蓝色激光器201的驱动电流,并从电流与亮度值的对应关系中获取该驱动电流对应的第二亮度值。之后检测第二亮度值与第一亮度值之间的差值是否大于差值阈值,若该差值大于差值阈值,表明该蓝色激光器201发生COD故障,则显示控制电路10可以降低与该蓝色激光器201对应的激光器驱动组件30的电流控制信号。之后显示控制电路10可以再次获取蓝色激光器201的第一亮度值,以及蓝色激光器201的驱动电流对应的第二亮度值,并在第二亮度值与第一亮度值之间的差值大于差值阈值时,再次降低与该蓝色激光器201对应的激光器驱动组件30的电流控制信号。如此循环,直至该差值小于等于差值阈值。由此通过降低蓝色激光器201的驱动电流,消除该蓝色激光器201的COD故障。
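下面给出一段仅作示意的Python草图，用于说明上述"读取实测亮度—查询期望亮度—降低PWM占空比—再次比较"的闭环调节思路；其中read_brightness、lookup_expected_brightness、read_drive_current、get_duty_cycle、set_duty_cycle等接口名称均为本文假设，并非本申请限定的实现：

```python
# 示意性草图：COD故障检测与驱动电流的反馈调节（各接口均为假设）
def eliminate_cod_fault(read_brightness, lookup_expected_brightness,
                        read_drive_current, get_duty_cycle, set_duty_cycle,
                        diff_threshold=0.1, step=0.05, min_duty=0.1):
    """循环降低电流控制信号占空比，直至期望亮度与实测亮度的差值不大于阈值。"""
    while True:
        measured = read_brightness()                      # 亮度传感器检测的第一亮度值
        expected = lookup_expected_brightness(read_drive_current())  # 驱动电流对应的第二亮度值
        if expected - measured <= diff_threshold:         # 差值不大于差值阈值，判定无COD故障
            return True
        duty = get_duty_cycle() - step                    # 降低PWM占空比，从而降低驱动电流
        if duty < min_duty:                               # 保护性下限，避免无限降低
            return False
        set_duty_cycle(duty)
```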
图3为本申请一实施例投影设备的结构示意图。
在一些实施例中,该投影设备中的激光光源20可以包括独立设置的蓝色激光器201、红色激光器202和绿色激光器203,该投影设备也可以称为三色投影设备,蓝色激光器201、红色激光器202和绿色激光器203均为MCL型封装激光器,其体积小,利于光路的紧凑排布。
在一些实施例中,参考图3,该至少一个亮度传感器40可以包括第一亮度传感器401、第二亮度传感器402和第三亮度传感器403,其中,第一亮度传感器401为蓝光亮度传感器或者白光亮度传感器,第二亮度传感器402为红光亮度传感器或者白光亮度传感器,该第三亮度传感器403为绿光亮度传感器或者白光亮度传感器。
该显示控制电路10还用于在控制蓝色激光器201出射蓝色激光时,读取该第一亮度传感器401检测的亮度值。并在控制该蓝色激光器201关闭时,停止读取该第一亮度传感器401检测的亮度值。
该显示控制电路10还用于在控制红色激光器202出射红色激光时,读取该第二亮度传感器402检测的亮度值,并在控制红色激光器202关闭时,停止读取第二亮度传感器402检测的亮度值。
该显示控制电路10还用于在控制绿色激光器203出射绿色激光时,读取该第三亮度传感器403检测的亮度值,并在控制绿色激光器203关闭时,停止读取该第三亮度传感器403检测的亮度值。
需要说明的是,亮度传感器也可以为一个,设置于三色激光的合光路径中。
图4为本申请另一实施例投影设备的结构示意图。
在一些实施例中,投影设备还可以包括光导管110,光导管110作为集光光学部件,用于接收并匀化输出合光状态的三色激光。
在一些实施例中，亮度传感器40可以包括第四亮度传感器404，该第四亮度传感器404可以为白光亮度传感器。其中，该第四亮度传感器404设置在光导管110的出光路径中，比如设置于光导管的出光侧，靠近其出光面。以及，上述第四亮度传感器为白光亮度传感器。
该显示控制电路10还用于在控制蓝色激光器201、红色激光器202和绿色激光器203分时开启时,读取该第四亮度传感器404检测的亮度值,以确保该第四亮度传感器404可以检测到该蓝色激光器201的第一亮度值、该红色激光器202的第一亮度值和该绿色激光器203的第一亮度值。并在控制该蓝色激光器201、红色激光器202和绿色激光器203均关闭时,停止读取该第四亮度传感器404检测的亮度值。
在一些实施例中,在投影设备投影图像的过程中,该第四亮度传感器404一直处于开启状态。
在一些实施例中,参考图3或图4,该投影设备还可以包括第四二向色片604、第五二向色片605、第五反射镜904、第二透镜组件90、扩散轮150、TIR透镜120、DMD 130和投影镜头140。其中,该第二透镜组件90包括第一透镜901、第二透镜902和第三透镜903。该第四二向色片604可以透过蓝色激光,反射绿色激光。该第五二向色片605可以透过红色激光,反射绿色激光和蓝色激光。
图5为本申请一实施例投影设备的电路结构示意图。
在一些实施例中,激光器驱动组件30可以包括驱动电路301、开关电路302和放大电路303。该驱动电路301可以为驱动芯片。该开关电路302可以为金属氧化物半导体(metal-oxide-semiconductor,MOS)管。
其中,该驱动电路301分别与开关电路302、放大电路303以及激光光源20所包括的对应的激光器连接。该驱动电路301用于基于显示控制电路10发送的电流控制信号通过VOUT端向激光光源20中对应的激光器输出驱动电流,并通过ENOUT端将接收到的使能信号传输至开关电路302。其中,该激光器可以包括串联的n个子激光器,分别为子激光器LD1至LDn。n为大于0的正整数。
开关电路302串联在激光器的电流通路中,用于在接收到的使能信号为有效电位时,控制电流通路导通。
放大电路303分别与激光光源20的电流通路中的检测节点E和显示控制电路10连接,用于将检测到的激光器组件的驱动电流转换为驱动电压,放大该驱动电压,并将放大后的驱动电压传输至显示控制电路10。
显示控制电路10还用于将放大后的驱动电压确定为激光器的驱动电流,并获取该驱动电流对应的第二亮度值。
在一些实施例中,放大电路303可以包括:第一运算放大器A1、第一电阻(又称取样功率电阻)R1、第二电阻R2、第三电阻R3和第四电阻R4。
在一些实施例中,显示控制电路10、驱动电路301、开关电路302和放大电路303形成闭环,以实现对该激光器的驱动电流的反馈调节,从而使得该显示控制电路10可以通过激光器的第二亮度值与第一亮度值的差值,及时调节该激光器的驱动电流,也即是及时调节该激光器的实际发光亮度,避免激光器长时间发生COD故障,同时提高了对激光器发光控制的准确度。
需要说明的是,参考图3和图4,若激光光源20包括一个蓝色激光器、一个红色激光器和一个绿色激光器。该蓝色激光器201可以设置在L1位置处,该红色激光器202可以设置在L2位置处,绿色激光器203可以设置在L3位置处。
参考图3和图4，L1位置处的激光经过第四二向色片604一次透射，再经过第五二向色片605反射一次后进入第一透镜901中。该L1位置处的光效率P1=Pt×Pf。其中，Pt表示的是二向色片的透射率，Pf表示的是二向色片或者第五反射镜904的反射率。
在一些实施例中，在L1、L2和L3三个位置中，L3位置处的激光的光效率最高，L1位置处的激光的光效率最低。由于蓝色激光器201输出的最大光功率Pb=4.5瓦(W)，红色激光器202输出的最大光功率Pr=2.5W，绿色激光器203输出的最大光功率Pg=1.5W。即蓝色激光器201输出的最大光功率最大，红色激光器202输出的最大光功率次之，绿色激光器203输出的最大光功率最小。因此将绿色激光器203设置在L3位置处，将红色激光器202设置在L2位置处，将蓝色激光器201设置在L1位置处。也即是将绿色激光器203设置在光效率最高的光路中，从而确保投影设备能够获得最高的光效率。
在一些实施例中控制器包括中央处理器(Central Processing Unit,CPU)，视频处理器，音频处理器，图形处理器(Graphics Processing Unit,GPU)，RAM(Random Access Memory,RAM)，ROM(Read-Only Memory,ROM)，用于输入/输出的第一接口至第n接口，通信总线(Bus)等中的至少一种。
在一些实施例中,投影设备启动后可以直接进入上次选择的信号源的显示界面,或者信号源选择界面,其中信号源可以是预置的视频点播程序,还可以是HDMI接口,直播电视接口等中的至少一种,用户选择不同的信号源后,投影机可以显示从不同信号源获得的内容。
本申请实施例可以应用于各种类型的投影设备。下文中将以投影设备为例,对投影设备及自动校正投影画面的显示控制方法进行阐述。
投影设备是一种可以将图像、或视频投射到屏幕上的设备，投影设备可以通过不同的接口同计算机、广电网络、互联网、VCD(Video Compact Disc:视频高密光盘)、DVD(Digital Versatile Disc:数字化视频光盘)、游戏机、DV等相连接播放相应的视频信号。投影设备广泛应用于家庭、办公室、学校和娱乐场所等。
在一些实施例中,投影设备配置的相机可具体实施为3维(3-Dimensional,3D)相机,或双目相机;在相机实施为双目相机时,具体包括左相机、以及右相机;双目相机可获取投影设备对应的幕布,即投影面所呈现的图像及播放内容,该图像或播放内容由投影设备内置的激光器组件投射。
当投影设备移动位置后，其投射角度、以及至投影面的距离发生变化，会导致投影图像发生形变，投影图像会显示为梯形图像、或其他畸形图像；投影设备控制器可基于深度学习神经网络，通过耦合光机与投影面之间的夹角和投影图像的正确显示实现自动梯形校正；但是，该修正方法修正速度慢，且需要大量的场景数据进行模型训练才能达到一定的精度，因此不适合用户设备移动后需要即时、快速校正的场景。
在一些实施例中,对于投影图像校正,可预先创建距离、水平夹角及偏移角之间的关联关系;
然后投影设备控制器通过获取光机至投影面的当前距离,结合所属关联关系确定该时刻光机与投影面的夹角,实现投影图像校正;
其中,所述夹角具体实施为光机中轴线与投影面的夹角;但是,在一些复杂环境下,可能会发生提前创建的关联关系不能适应所有复杂环境的情况,而导致投影图像校正失败。
在一些实施例中,本申请提供的投影设备通过结合其工作模式、工作场景的动态切换调光,可最大程度发挥两种调光方式优点、弱化其缺点造成的影响。投影设备通过其配置组件可实时监测设备移动,并将监测结果即时反馈至投影设备控制器,以实现投影设备移动后,控制器即时启动图像校正功能在第一时间实现投影图像自动校正。
例如,通过投影设备配置的陀螺仪、或TOF(Time of Flight:飞行时间)传感器,控制器接收来自陀螺仪、TOF传感器的监测数据,以判定投影设备是否发生移动;
在判定投影设备发生移动后，控制器将启动投影画面自动校正流程和/或避障流程，从而实现诸如梯形投影区域校正功能、以及投影障碍物规避功能。
其中,飞行时间传感器通过调节发射脉冲的频率改变实现距离测量和位置移动监测,其测量精度不会随着测量距离的增大而降低,且抗干扰能力强。
在一些实施例中,投影设备发出的激光通过DMD(Digital Micromirror Device:数字微镜器件)芯片的纳米级镜片反射,其中光学镜头也是精密元件,当像面、以及物面不平行时,会使得投影到屏幕的图像发生几何形状畸变。
图6A为本申请一实施例投影设备实现显示控制的系统框架示意图。
在一些实施例中,本申请提供的投影设备具备长焦微投的特点,投影设备包括控制器,所述控制器通过预设算法可对光机画面进行显示控制,以实现显示画面自动梯形校正、自动入幕、自动避障、自动对焦、以及防射眼等功能。
可以理解,投影设备通过基于几何校正的显示控制方法,可实现长焦微投场景下的灵活位置移动;设备在每次移动过程中,针对可能出现的投影画面失真、投影面异物遮挡、投影画面从幕布异常等问题,控制器可控制投影设备实现自动显示校正功能,使其自动恢复正常显示。
在一些实施例中,基于几何校正的显示控制系统,包括应用程序服务层(APK Service:Android Application Package Service)、服务层、以及底层算法库。
应用程序服务层用于实现投影设备和用户之间的交互;基于用户界面的显示,用户可对投影设备的各项参数以及显示画面进行配置,控制器通过协调、调用各种功能对应的算法服务,可实现投影设备在显示异常时自动校正其显示画面的功能。
服务层可包括校正服务、摄像头服务、飞行时间(TOF)服务等内容,所述服务向上对应于应用程序服务层(APK Service),实现投影设备不同配置服务的对应特定功能;服务层向下对接算法库、相机、飞行时间传感器等数据采集业务,实现封装底层复杂逻辑。
底层算法库提供校正服务、以及投影设备实现各种功能的控制算法,算法库可基于OpenCV(基于许可开源)完成各种数学运算,为校正服务提供基础计算能力;OpenCV是一个基于BSD许可开源发行的跨平台计算机视觉、机器学习软件库,可以运行在多种现有操作系统环境中。
在一些实施例中,投影设备配置有陀螺仪传感器;设备在移动过程中,陀螺仪传感器可感知位置移动、并主动采集移动数据;然后通过系统框架层将已采集数据发送至应用程序服务层,用于支撑用户界面交互、应用程序交互过程中所需应用数据,其中的采集数据还可用于控制器在算法服务实现中的数据调用。
在一些实施例中,投影设备配置有飞行时间(TOF)传感器,在飞行时间传感器采集到相应数据后,所述数据将被发送至服务层对应的飞行时间服务;
上述飞行时间服务获取数据后,将采集数据通过进程通信框架发送至应用程序服务层,数据将用于控制器的数据调用、用户界面、程序应用等交互使用。
在一些实施例中,投影设备配置有用于采集图像的相机,所述相机可实施为双目相机、或深度相机、或3D相机等;
相机采集数据将发送至摄像头服务,然后由摄像头服务将采集图像数据发送至进程通信框架、和/或投影设备校正服务;所述投影设备校正服务可接收摄像头服务发送的相机采集数据,控制器针对所需实现的不同功能可在算法库中调用对应的控制算法。
在一些实施例中,通过进程通信框架、与应用程序服务进行数据交互,然后经进程通信框架将计算结果反馈至校正服务;校正服务将获取的计算结果发送至投影设备操作系统,以生成控制信令,并将控制信令发送至光机控制驱动以控制光机工况、实现显示图像的自动校正。
图6B为本申请另一实施例投影设备实现防射眼功能的信令交互时序示意图。
在一些实施例中,投影设备可实现防射眼功能,为防止用户偶然进入投影设备射出激光轨迹范围内而导致的视力损害危险,在用户进入投影设备所在的预设特定非安全区域时,控制器可控制用户界面显示对应的提示信息,以提醒用户离开当前区域,控制器还可控制用户界面降低显示亮度,以防止激光对用户视力造成伤害。
在一些实施例中，投影设备被配置为儿童观影模式时，控制器将自动开启防射眼开关。在一些实施例中，在接收陀螺仪传感器发送的位置移动数据、或接收其它传感器所采集的异物入侵数据后，控制器将控制投影设备开启防射眼开关。
在一些实施例中,在飞行时间(TOF)传感器、摄像头所采集数据触发任一预设阈值条件时,控制器将控制用户界面降低显示亮度、显示提示信息、降低光机发射功率、亮度、强度。
在一些实施例中,投影设备控制器可控制校正服务向飞行时间传感器发送信令,查询投影设备当前状态,控制器将接受来自飞行时间传感器的数据反馈。
校正服务向进程通信框架发送通知算法服务以启动防射眼流程信令,进程通信框架将从算法库进行服务能力调用以调取对应算法服务;例如算法可包括:拍照检测算法、截图画面算法、以及异物检测算法等。
进程通信框架基于上述算法服务,可返回异物检测结果至校正服务;针对返回结果,若达到预设阈值条件,控制器将控制用户界面显示提示信息、降低显示亮度,其信令时序如图6B所示。
在一些实施例中,防射眼在开启状态下,用户进入预设特定区域时,投影设备将自动降低光机发出激光强度、降低用户界面显示亮度、显示安全提示信息;投影设备对上述防射眼功能的控制,可通过以下方法实现:
基于相机获取的投影画面,利用边缘检测算法识别投影设备的投影区域;在投影区域显示为矩形、或类矩形时,控制器通过预设算法获取上述矩形投影区域四个顶点的坐标值;
在实现对于投影区域内的异物检测时,可使用透视变换方法将投影区域校正为矩形,计算矩形和投影截图的差值,以实现判断显示区域内是否有异物;若判断结果为存在异物,投影设备自动启动防射眼。
在实现对投影区域外一定区域的异物检测时,可将当前帧的相机内容、和上一帧的相机内容做差值,以判断投影区域外区域是否有异物进入;若判断有异物进入,将触发防射眼功能的启动。
与此同时，投影设备还可利用飞行时间(ToF)相机、或飞行时间传感器检测特定区域的实时深度变化；若深度值变化超过预设阈值，投影设备将自动触发防射眼功能。
在一些实施例中,如图6G所示,给出投影设备实现防射眼算法的流程示意图,投影设备基于采集的飞行时间数据、截图数据、以及相机数据分析判断是否需要开启防射眼功能。投影设备实现防射眼算法的方式有以下三种:
方式一:
步骤6G00-1,采集飞行时间(TOF)数据;
步骤6G00-2,根据采集的飞行时间数据,做深度差值分析;
步骤6G00-3,判断深度差值是否大于预设阈值X,若深度差值大于预设阈值X,X实施为0时,执行步骤6G03;
步骤6G03,画面变暗,弹出提示。
如果深度差值大于预设阈值X,所述预设阈值X实施为0时,则可判定有异物已处于投影设备的特定区域;
若用户位于特定区域,其视力存在被激光损害风险,投影设备将自动启动防射眼功能,以降低光机发出激光强度、降低用户界面显示亮度、并显示安全提示信息。
方式二:
步骤6G01-1,采集截图数据;
步骤6G01-2,根据采集的截图数据做加色模式(RGB)差值分析;
步骤6G01-3,判断RGB差值是否大于预设阈值Y,若RGB差值大于预设阈值Y,执行步骤6G03;
步骤6G03,画面变暗,弹出提示。
投影设备根据已采集截图数据做加色模式(RGB)的差值分析,在差值大于预设阈值Y时,可判定有异物处于投影设备的特定区域;所述特定区域内若存在用户,其视力存在被激光损害风险,投影设备将自动启动防射眼功能。
方式三:
步骤6G02-1,采集相机数据;
步骤6G02-2,根据采集的相机数据获取投影坐标,若获取的投影坐标处于投影区域,执行步骤6G01-3,若获取的投影坐标处于扩展区域,仍执行步骤6G01-3;
步骤6G02-3,根据采集的相机数据做加色模式(RGB)差值分析;
步骤6G02-4,判断RGB差值是否大于预设阈值Y,若RGB差值大于预设阈值Y,执行步骤6G03。
步骤6G03,画面变暗,弹出提示。
投影设备根据已采集相机数据,从而获取投影坐标;然后根据所述投影坐标确定投影设备的投影区域,进一步在投影区域内进行加色模式(RGB)的差值分析;在差值大于预设阈值Y时,则可判定有异物已处于投影设备的特定区域;投影设备将自动启动防射眼功能。
若获取的投影坐标处于扩展区域,控制器仍可在所述扩展区域进行加色模式(RGB)的差值分析;如果差值大于预设阈值Y,则可判定有异物已处于投影设备的特定区域,投影设备将自动启动防射眼功能。
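为便于理解上述三种方式的共同判定思路，下面给出一个简化的Python示意片段；其中深度阈值X与RGB差值阈值Y仅为占位取值，输入数据的获取方式亦为本文假设，并非本申请限定的实现：

```python
import numpy as np

def should_enable_eye_protection(depth_prev, depth_curr, frame_prev, frame_curr,
                                 depth_threshold=0.0, rgb_threshold=30.0):
    """深度差值或画面RGB差值任一超过阈值，即认为特定区域存在异物，需启动防射眼。"""
    # 方式一：飞行时间数据的深度差值分析
    depth_diff = np.abs(np.asarray(depth_curr, dtype=np.float32) -
                        np.asarray(depth_prev, dtype=np.float32))
    if float(depth_diff.max()) > depth_threshold:
        return True
    # 方式二/方式三：对截图或相机画面做加色模式(RGB)差值分析
    rgb_diff = np.abs(np.asarray(frame_curr, dtype=np.int16) -
                      np.asarray(frame_prev, dtype=np.int16))
    return float(rgb_diff.mean()) > rgb_threshold
```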
图6C为本申请另一实施例投影设备实现显示画面校正功能的信令交互时序示意图。
在一些实施例中,投影设备通过陀螺仪或陀螺仪传感器对设备移动进行监测。校正服务向陀螺仪发出用于查询设备状态的信令,并接收陀螺仪所反馈用于判定设备是否发生移动的信令。
在一些实施例中,显示校正策略可配置为:在陀螺仪、飞行时间传感器同时发生变化时,投影设备优先触发梯形校正;在梯形校正进行时,控制器不响应遥控器按键发出的指令,以配合梯形校正的实现,投影设备将打出纯白图卡。
其中，梯形校正算法可基于双目相机构建世界坐标系下投影面与光机坐标系的转换矩阵，结合光机内参，计算出投影图像与播放图卡的单应性关系，所述单应性关系也称为映射关系，并利用该单应性关系实现投影图像与播放图卡之间的任意形状转换。
在一些实施例中,校正服务发送用于通知算法服务启动梯形校正流程的信令至进程通信框架,所述进程通信框架进一步发送服务能力调用信令至算法服务,以获取能力对应的算法;
算法服务获取执行拍照和画面算法处理服务、避障算法服务,并将其通过信令携带发送至进程通信框架;在一些实施例中,进程通信框架执行上述算法,并将执行结果反馈给校正服务,所述执行结果可包括拍照成功、以及避障成功等反馈信息。
在一些实施例中,投影设备执行上述算法、或数据传送过程中,若出现错误,校正服务将控制用户界面显示出错返回提示,并控制用户界面再次梯形校正、自动对焦图卡。
通过自动避障算法,投影设备可识别幕布;并利用投影变化,将投影画面校正至幕布内显示,实现与幕布边沿对齐的效果。通过自动对焦算法,投影设备可利用飞行时间(ToF)传感器获取光机与投影面距离,基于所述距离在预设的映射表中查找最佳像距,并利用图像算法评价投影画面清晰程度,以此为依据实现微调像距。
在一些实施例中,校正服务发送至进程通信框架的自动梯形校正信令可包含其他功能配置指令,例如包含是否实现同步避障、是否入幕等控制指令。
进程通信框架发送服务能力调用信令至算法服务，使算法服务获取执行自动对焦算法，实现调节设备与幕布之间的视距；在一些实施例中，在应用自动对焦算法实现功能后，算法服务还可获取、执行自动入幕算法，过程中可包含梯形校正算法。
投影设备执行自动入幕，算法服务设置投影设备与幕布之间的8个位置坐标；然后再次通过自动对焦算法，实现投影设备与幕布的视距调节；最终，将校正结果反馈至校正服务，并控制用户界面显示校正结果，如图6C所示。
在一些实施例中,投影设备通过自动对焦算法,利用其配置的激光测距可获得当前物距,以计算初始焦距及搜索范围;然后投影设备驱动相机(Camera)进行拍照,并利用对应算法进行清晰度评价。
投影设备在上述搜索范围内,基于搜索算法查找可能的最佳焦距,然后重复上述拍照、清晰度评价步骤,最终通过清晰度对比找到最优焦距,完成自动对焦。
例如,在投影设备启动后,实现自动对焦算法的步骤如图6D所示,包括以下步骤:
步骤6D01,用户移动设备,投影设备自动完成校正后重新对焦;
步骤6D02,控制器将检测自动对焦功能是否开启,若否则控制器将结束自动对焦业务,是则执行步骤6D03;
步骤6D03,中间件获取飞行时间(TOF)的检测距离;
当自动对焦功能开启时,投影设备将通过中间件获取飞行时间(TOF)传感器的检测距离进行计算。
步骤6D04,根据距离查询映射表获取大致焦距;
步骤6D05,中间件设置该焦距给光机;
控制器根据获取的距离查询预设的映射表,以获取投影设备的焦距;然后中间件将获取焦距设置到投影设备的光机。
步骤6D06,摄像头拍照;
步骤6D07,根据评价函数,判断对焦是否完成,若是则结束自动对焦流程,否则执行步骤6D08;
步骤6D08，中间件微调焦距(步长)，再次执行步骤6D05。光机以上述焦距发出激光后，摄像头将执行拍照指令；控制器根据获取的拍摄图像、评价函数，判断投影设备对焦是否完成；如果判定结果符合预设完成条件，则控制自动对焦流程结束；
如果判定结果不符合预设完成条件,中间件将微调投影设备光机的焦距参数,例如可以预设步长逐渐微调焦距,并将调整的焦距参数再次设置到光机;从而实现反复拍照、清晰度评价步骤,最终通过清晰度对比找到最优焦距完成自动对焦。
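以下为上述"查表获取大致焦距—拍照做清晰度评价—按步长微调"流程的一个示意性Python实现；get_tof_distance、lookup_focus、set_focus、capture、sharpness等函数均为假设的接口名称，仅用于说明流程：

```python
def auto_focus(get_tof_distance, lookup_focus, set_focus, capture, sharpness,
               step=1, max_iters=50):
    """先按TOF距离查映射表设置粗焦距，再以预设步长微调，保留清晰度最高的位置。"""
    focus = lookup_focus(get_tof_distance())        # 根据检测距离查询映射表得到大致焦距
    set_focus(focus)                                # 中间件将焦距设置给光机
    best_focus, best_score = focus, sharpness(capture())
    for _ in range(max_iters):
        focus += step                               # 以预设步长微调焦距（像距）
        set_focus(focus)
        score = sharpness(capture())                # 评价函数计算投影画面清晰程度
        if score > best_score:
            best_focus, best_score = focus, score
        else:
            break                                   # 清晰度开始下降，结束搜索
    set_focus(best_focus)
    return best_focus
```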
在一些实施例中,投影设备可以投出75-100英寸的画面;用户在购买了激光投影设备后,经过投影安装调试,使处于A位置的投影设备可获取合适的A投影区域,所述A投影区域呈矩形,如图6H左侧所示;
在投影设备位置由A移动至B后,经常会生成畸形的投影区域,如图6H右侧所示的梯形投影区域;可以理解,投影设备在摆放上存在弊端,出于打扫等类似原因,投影设备的摆放位置会发生改变;而在屏幕固定的情况下,投影设备画面与屏幕的吻合全靠投影设备的位置来确定,投影设备的位置发生变化,会导致投影位置发生变化,造成画面与屏幕不吻合的问题。
因此,当投影设备的位置发生改变时,用户需要调整投影设备归位到最佳摆放位置上,以重新调整画面的边缘和屏幕的相对关系,保证画面与屏幕吻合。但对于不熟悉投影设备的普通用户来说,自行调节通常存在调节后的激光画面并非呈现周正矩形的情况,通过本申请提供的自动校正投影图像显示控制方法,可以实现软件方式的投影图像修正,如图6I所示;
下文将就投影设备对畸形投影区域的校正进行阐述。在一些实施例中,本申请提供的投影设备可通过梯形校正算法实现显示校正功能,如图6J所示,本申请提供一种自动校正投影图像的方法,包括以下步骤:
步骤6J01,实时监测投影仪和投影面之间的位置变化;
步骤6J02,飞行时间或陀螺仪传感器检测到变化;
步骤6J03,投射白色图卡;
步骤6J04,双目相机拍摄、存储照片;
步骤6J05,投射特征图卡;
步骤6J06,双目相机拍摄、存储照片;
控制器检测到投影设备位置发生变化后,控制器将控制光机向投影面投射第一图卡和第二图卡;在投射图卡的过程中,控制器将控制双目相机获取投射第一图卡对应的第一图像、以及投射第二图卡对应的第二图像;
例如,控制器首先将控制光机投射白色图卡至投影面,并随即控制双目相机对投影面进行拍照获取白色图卡对应的图片,所述白色图卡即上述第一图卡,所述第一图卡还可实施为其他颜色图卡;
投影设备在存储上述白色图卡对应的图片后,控制器控制光机再次投射特征图卡至投影面,所述特征图卡即上述第二图卡,所述第二图卡例如可实施为棋盘格图卡、或其他具有图形、颜色特征的图卡,如图6H所示。
步骤6J07,识别双目相机对应的特征点;
在一些实施例中,控制器通过比对第一图像和第二图像的对应特征,可确定世界坐标系下投影图卡和投影面的单应性关系,从而获取投影图卡及投影面显示之间的关联关系进行投影图像修正。
步骤6J08,计算相机特征点对应的投影面上点在相机坐标系下的坐标;
步骤6J09,获取投影面的点在世界坐标系下的坐标;
步骤6J10,获取投射特征图卡与投影面特征点的单应性关系;
步骤6J11,世界坐标系获取最大矩形区域;
步骤6J12,根据投射特征图卡与投影面之间的单应性关系求出光机要投射的区域。
控制器根据图像识别算法获取双目相机对应特征点在图像坐标系下的坐标,然后根据双目相机特征点图像坐标系下的坐标计算出投影面在双目相机其中任一相机,如左相机坐标系下的坐标,以获取投影面特征点在世界坐标系下的坐标。
例如,根据求出的在左相机坐标系下的离散点三维坐标,求解超静定方程,可计算获取左相机坐标系下投影面的平面方程,表示为:
z=a*x+b*y+c;
然后基于上述平面方程，控制器计算获取左相机坐标系下投影面的单位法向量n；继续定义世界坐标系1：以左相机坐标系原点为原点，以平行于投影面的平面为XOY面，建立空间直角坐标系，因而投影面在世界坐标系1下的方程表示为：
z=c′;
其中，c′为常数；投影面在世界坐标系1下的单位法向量m表示为：
m=(0,0,1)^T；
根据上述投影面单位法向量在不同坐标系下的表示,可求得两个坐标系间的旋转矩阵,表示为:
m=R1*n;
根据上述旋转矩阵R1，与投影平面点P(x′,y′,z′)在左相机坐标系下的坐标，可得到投影平面点P在世界坐标系1下的坐标；以世界坐标系1的原点在投影面的投影为原点，投影面为XOY面，建立世界坐标系2，投影点P在世界坐标系2的坐标表示为(x″,y″,0)，控制器可获取所有投影特征点在世界坐标系2下的坐标；
根据上述求得的投影特征点的坐标、与获取光机投影图卡特征点(已知)计算单应性矩阵,可获取投影特征图卡与投影面特征点的单应性关系,即世界坐标系下投影图卡与所述投影面的单应性关系,也可称为映射关系。
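下面给出一个基于numpy与OpenCV的示意性计算片段，概括上述"最小二乘拟合投影面—求单位法向量—由对应特征点求单应性矩阵"的步骤；其中points_cam、card_points、world_points_2d等输入均为假设的数据结构，实际实现以具体产品为准：

```python
import numpy as np
import cv2

def plane_and_homography(points_cam, card_points, world_points_2d):
    """points_cam: 左相机坐标系下投影面离散点(N,3)；card_points/world_points_2d: 图卡特征点与其在世界坐标系2下的对应点(N,2)。"""
    # 1) 求解超静定方程，拟合平面 z = a*x + b*y + c
    A = np.column_stack([points_cam[:, 0], points_cam[:, 1], np.ones(len(points_cam))])
    (a, b, c), *_ = np.linalg.lstsq(A, points_cam[:, 2], rcond=None)
    # 2) 左相机坐标系下投影面的单位法向量n；世界坐标系1下的法向量为m=(0,0,1)^T
    n = np.array([a, b, -1.0], dtype=np.float64)
    n /= np.linalg.norm(n)
    # 3) 由图卡特征点与其世界坐标系2下的对应坐标计算单应性矩阵（映射关系）
    H, _ = cv2.findHomography(np.float32(card_points), np.float32(world_points_2d))
    return (a, b, c), n, H
```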
在一些实施例中,投影设备在将播放内容投射至投影面前,如控制器检测到光机与投影面之间存在阻挡投影光束的障碍物,控制器将启动避障流程。
当投影设备配置有壁障功能开关时,控制器判定投影设备的避障功能是否开启,若不开启则不进行障碍物检测;若投影设备开启避障功能,则控制器计算获取双目相机中任一相机,例如左相机与光机的单应性矩阵,即单应性关系。
投影设备中存储有投影图卡的特征点,对获取左相机拍摄到的特征图卡对应图像进行特征点识别后,将其与已存储的投影图卡特征点进行比对,可计算获取左相机、与光机的单应性矩阵。
例如,控制器将双目相机获取的拍摄图像进行灰度化处理,然后基于聚类算法进行聚类处理。
在聚类过程中,控制器根据聚类算法及灰度值将上述灰度化处理的拍摄图像聚类为投影光束区域及障碍物区域,如按照灰度大小,将投影区域内的灰度图聚为两类,其中投影光束灰度值必然是最大的一类,将灰度值小的一类判定为障碍物区域。
可以理解,控制器将投影光束区域及障碍物区域进行二值化,这样可以方便提取大于预设避障面积阈值的第一闭合轮廓,该第一闭合轮廓对应于障碍物,例如处于投影设备和幕布之间的家居物件;
在上述过程中,控制器还可以提取小于预设避障面积阈值的第二闭合轮廓,该第二闭合轮廓对应于非障碍物,例如墙壁或幕布的小污点等;控制器在识别得到第二闭合轮廓后,为了提高后续算法识别效率,可以将该第二闭合轮廓填充为投影光束区域;
第一闭合轮廓可用于控制器控制光机投影过程中避开该闭合轮廓,以使得投影面生成的最终矩形区域可以准确的避开投影避障物,实现投影避障功能;与此同时,第二闭合轮廓的识别、以及填充在后续算法识别过程中,也不会影响投影面生成的最终矩形区域位置,如图6K所示,给出投影设备实现自动校正投影图像的方法,包括以下步骤:
步骤6K01,获取投射特征图卡与投影面特征点之间的单应性关系;
步骤6K02,判断避障是否开启,若是执行步骤6K03,否则执行步骤6K07;
步骤6K03,获取相机与光机单应性矩阵;
步骤6K04,障碍物轮廓检测;
步骤6K05,将图片转换到光机坐标系下;
步骤6K06,将提取障碍物轮廓的二值化图片转化到世界坐标系下;
步骤6K07,世界坐标系获取最大矩形区域;
步骤6K08,根据投射特征图卡与投影面之间的单应性关系求出光机要投射的区域。
例如,控制器将上述两类灰度值再进行二值化处理,大灰度值可配置为255,小灰度值可配置为0;然后在二值化处理后的图像中提取包含的所有闭合轮廓;在该步骤中,为提高用户体验效果、避免投影设备执行避障功能时将一些很小的物体,例如墙壁存在的污点识别为障碍物进行躲避,可以定义预设避障面积阈值;
将大于轮廓区域面积百分之一的轮廓认定为障碍物,小于轮廓区域面积百分之一的轮廓不认定为障碍物也不进行避障;与此同时,控制器将所述小于轮廓区域面积百分之一的轮廓所在区域的灰度值填充为最大值255,以便投影设备在后续避障时不会避开这种极小轮廓。
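下面用一个简化的OpenCV片段示意上述"灰度聚类—二值化—按面积阈值筛选并填充小轮廓"的过程；此处以K-Means按灰度聚成两类、以图像面积的百分之一作为预设避障面积阈值，均仅为示意性假设：

```python
import numpy as np
import cv2

def extract_obstacle_mask(image_bgr, area_ratio=0.01):
    """返回二值图：255为投影光束区域，0为障碍物区域；小于面积阈值的轮廓被填充回光束区域。"""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    samples = gray.reshape(-1, 1).astype(np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 10, 1.0)
    _, labels, centers = cv2.kmeans(samples, 2, None, criteria, 3, cv2.KMEANS_PP_CENTERS)
    bright = int(np.argmax(centers))                     # 灰度值大的一类为投影光束区域
    binary = np.where(labels.reshape(gray.shape) == bright, 255, 0).astype(np.uint8)
    # 在障碍物（灰度0）区域中提取闭合轮廓，并按面积阈值区分第一/第二闭合轮廓
    contours, _ = cv2.findContours(255 - binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    area_threshold = area_ratio * gray.size              # 预设避障面积阈值
    for contour in contours:
        if cv2.contourArea(contour) < area_threshold:    # 第二闭合轮廓（非障碍物）
            cv2.drawContours(binary, [contour], -1, 255, -1)  # 填充为投影光束区域
    return binary
```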
需要说明的是，在一些实施例中投影设备对障碍轮廓检测直接采用二值化处理；然后采用经验阈值去识别障碍物；该方法在应对复杂环境时其识别效率会下降，例如投影光束投射至黑色幕布区域时，如果经验阈值选择过大，会将不包含障碍物的投影也错误识别成障碍物，影响避障效果。
在一些实施例中,投影设备控制器可基于前文中获取的单应性关系、即映射关系,将双目相机中任一相机所获取的拍摄图像转换至世界坐标系下,得到最终的矩形区域。
双目相机的左相机拍摄图片后,控制器将该拍摄图像进行灰度二值化;然后控制器根据双目相机左相机、与投影设备光机之间的单应性矩阵将二值化图像转换到光机坐标系下;
可以理解,二值化图像在进行单应性矩阵转换后,相当于只保留了投影区域范围内图像,其他无关区域将被剔除,因此后续步骤中进行避障区域检索只需考虑该投影区域内即可;
然后,控制器将左相机所获取的经过二值化轮廓识别后的图像通过单应性矩阵再次转换至世界坐标系2下;根据该获取图片,控制器在世界坐标系2下选取矩形区域,并控制光机将播放内容投射至该矩形区域,所述矩形区域用于替代投影设备移动后在投影面形成的畸形投影区域,如图6H所示。
可以理解，对于常见的微投类型投影设备，由于其使用环境复杂，不可避免地会存在光机与投影墙面或幕布不垂直而导致投影画面产生梯形的情况；在投影区域存在障碍物时，会造成投影画面被障碍物遮挡，本申请提供的自动校正投影画面的显示控制方法可以实现梯形自动校正以及避开障碍物的功能。
在一些实施例中,投影设备控制器基于前文所述单应性关系,可确定投影图卡对应拍摄图像在世界坐标系中的中心点坐标。
例如，投影设备光机的投影图卡被定义为3840*2160分辨率，控制器可计算获取该投影图卡的中心点(1920,1080)；然后根据已获取的单应性矩阵，计算得到投影图卡中心点在世界坐标系2下的坐标。
控制器根据该投影图卡的宽高比,以其中心点坐标为基准进行矩形扩张直至抵达灰度值变化边界,获取该时刻矩形的四个顶点作为最终矩形区域顶点。
例如，该投影图卡的投射尺寸宽高比配置为16:9，则选取的矩形基于图卡中心点在世界坐标系2下的坐标按照16:9的比例依次增大；投影区域灰度值设置为255，相应的障碍物区域灰度值设置为0；
因而,当矩形四条边上的点碰到灰度值为0值的区域时,控制器将控制该矩形停止扩大,并且该时刻可获取最终矩形区域的四个顶点;可将其转换到光机坐标系下,获取投影设备光机要投射的区域。
在一些实施例中,控制器获取最终矩形区域可通过获取拍摄图像中灰度值为255的最大内接矩形,然后依据投影图卡的宽高比修正所述最大内接矩形的宽值、高值。
例如,控制器获取拍摄图像中像素为255区域内的最大内接矩形,并且计算该最大内接矩形的宽高比;若该最大内接矩形的宽高比:
width/height>16:9;
则选取的最终矩形高不变,控制器将其宽值修正为:
width1=height/9*16;
若该最大内接矩形的宽高比:
width/height<16:9;
则选取的最终矩形宽不变,控制器将其高修正为:
height1=width/16*9;
控制器基于最终修正后的宽值、高值，可确定最大内接矩形的任意顶点以定位投影区域对应的矩形区域；即根据获取的宽、高以及计算得到的最大内接矩形的任意1个顶点，投影设备可得到最终选取的矩形；控制器将矩形的四个顶点转换到光机坐标系下，获取光机要投射的区域。
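下面的Python片段示意按16:9宽高比修正最大内接矩形并得到最终矩形顶点的计算逻辑；最大内接矩形本身的搜索过程此处省略，顶点(x, y)及宽高均为假设的输入：

```python
def fit_rect_to_aspect(x, y, width, height, aspect_w=16, aspect_h=9):
    """以最大内接矩形的一个顶点(x, y)为基准，按宽高比修正宽值或高值，返回最终矩形的四个顶点。"""
    if width / height > aspect_w / aspect_h:
        width = height / aspect_h * aspect_w       # 宽高比过大：高不变，修正宽值
    else:
        height = width / aspect_w * aspect_h       # 宽高比过小：宽不变，修正高值
    # 世界坐标系2下最终矩形的四个顶点，后续再转换到光机坐标系
    return [(x, y), (x + width, y), (x + width, y + height), (x, y + height)]
```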
在一些实施例中，控制器首先基于标定算法，可获取两相机之间、相机与光机之间的两组外参，即旋转、平移矩阵；然后通过投影设备的光机播放特定棋盘格图卡，并计算投影棋盘格角点深度值，例如通过双目相机之间的平移关系及相似三角形原理求解xyz坐标值；之后再基于所述xyz拟合出投影面，并求得其与相机坐标系的旋转关系与平移关系，具体可包括俯仰关系(Pitch)和偏航关系(Yaw)。
通过投影设备配置的陀螺仪可得到卷(Roll)参数值,以组合出完整旋转矩阵,最终计算求得世界坐标系下投影面到光机坐标系的外参。
结合上述步骤中计算获取的相机与光机的R、T值,可以得出投影面世界坐标系与光机坐标系的转换关系;结合光机内参,可以组成投影面的点到光机图卡点的单应性矩阵。
最终在投影面选择矩形,利用单应性反求光机图卡对应的坐标,该坐标就是校正坐标,将其设置到光机,即可实现梯形校正。
如图6E所示,给出投影设备实现梯形校正、避障算法的流程示意图,包括以下步骤:
步骤6E01,投影设备控制器获取照片像素点对应点的深度值,或投影点在相机坐标系下的坐标;
步骤6E02,通过深度值,中间件获取光机坐标系与相机坐标系关系;
步骤6E03,然后控制器计算得到投影点在光机坐标系下的坐标值;
步骤6E04,基于坐标值拟合平面获取投影面与光机的夹角;
步骤6E05,根据夹角关系获取投影点在投影面的世界坐标系中的对应坐标;
步骤6E06,根据图卡在光机坐标系下的坐标与投影平面投影面对应点的坐标,可计算得到单应性矩阵;
步骤6E07,控制器基于上述已获取数据判定障碍物是否存在,若是执行步骤6E08,否则执行步骤6E09;
执行步骤6E08,在世界坐标系下的投影面上任取矩形坐标,根据单应性关系计算出光机要投射的区域;
步骤6E09,障碍物不存在时,控制器例如可获取二维码特征点;
步骤6E10,然后获取二维码在预制图卡的坐标;
步骤6E11,然后获取相机照片与图纸图卡单应性关系;
步骤6E12,将获取的障碍物坐标转换到图卡中,即可获取障碍物遮挡图卡坐标;
步骤6E13,依据障碍物图卡遮挡区域在光机坐标系下坐标,通过单应性矩阵转换得到投影面的遮挡区域坐标;
步骤6E14,在世界坐标系下投影面上任取矩形坐标,同时避开障碍物,根据单应性关系求出光机要投射的区域。
可以理解,避障算法在梯形校正算法流程选择矩形步骤时,利用算法(OpenCV)库完成异物轮廓提取,选择矩形时避开该障碍物,以实现投影避障功能。
在一些实施例中,如图6F所示,为投影设备实现入幕算法的流程示意图,包括以下步骤:
步骤6F01,中间件获取相机拍到的二维码图卡;
步骤6F02,识别二维码特征点,获取在相机坐标系下的坐标;
步骤6F03,控制器进一步获取预置图卡在光机坐标系下的坐标;
步骤6F04,求解相机平面与光机平面的单应性关系;
步骤6F05,控制器基于上述单应性关系,识别相机拍到的幕布四个顶点坐标;
步骤6F06,根据单应性矩阵获取投影到幕布光机要投射图卡的范围。
可以理解,在一些实施例中,入幕算法基于算法库可识别最大黑色闭合矩形轮廓并提取,判断是否为16:9尺寸;投影特定图卡并使用相机拍摄照片,提取照片中多个角点用于计算投影面(幕布)与光机播放图卡的单应性,将幕布四顶点通过单应性转换至光机像素坐标系,将光机图卡转换至幕布四顶点即可完成计算比对。
长焦微投电视具有灵活移动的特点，每次位移后投影画面可能会出现失真，另外如投影面存在异物遮挡、或投影画面从幕布异常时，本申请提供的投影设备可针对上述问题自动完成校正，包括实现自动梯形校正、自动入幕、自动避障、自动对焦、防射眼等功能。
基于上文投影设备实现自动校正投影画面的显示控制方案和相关附图的介绍,本申请还提供了一种自动校正投影画面的显示控制方法,所述方法在投影设备实现投影画面自动校正的显示控制方案中已进行详细阐述,在此不再赘述。
本申请实施例的有益效果在于，通过投影第一、第二图卡，可以获取对应的第一、第二图像；进一步通过比对第一、第二图像的特征点，可以确定投影图卡与投影面的单应性关系，实现基于单应性关系将拍摄图像转换至世界坐标系、确定修正后的投影矩形区域，实现投影设备位置变化后对生成的畸形投影区域进行自动校正。
本申请提供一种基于触发校正方法的自动校正投影图像的显示控制方法，可以应用于投影设备，所述投影设备包括投影屏幕1、用于投影的装置2，以在默认可以触发图像校正和护眼模式的场景下（如投影设备的移动过程中），解决误触发图像校正以及护眼功能的问题。
图7为在一些实施例中用于投影的装置2的镜头结构示意图。为了支持用于投影的装置2的自动调焦过程,如图7所示,投影设备的镜头300还可以包括光学组件310和驱动马达320。其中,光学组件310是由一个或多个透镜组成的透镜组,可以对光机200发射的光线进行折射,使光机200发出的光线能够透射到投影面上,形成透射内容影像。
光学组件310可以包括镜筒以及设置在镜筒内的多个透镜。根据透镜位置是否能够移动,光学组件310中的透镜可以划分为移动镜片311和固定镜片312,通过改变移动镜片311的位置,调整移动镜片311和固定镜片312之间的距离,改变光学组件310整体焦距。因此,驱动马达320可以通过连接光学组件310中的移动镜片311,带动移动镜片311进行位置移动,实现自动调焦功能。
需要说明的是，本申请部分实施例中所述的调焦过程是指通过驱动马达320改变移动镜片311的位置，从而调整移动镜片311相对于固定镜片312之间的距离，即调整像面位置。因此，根据光学组件310中镜片组合的成像原理，所述调整焦距实则为调整像距；但就光学组件310的整体结构而言，调整移动镜片311的位置等效于调整光学组件310的整体焦距。
当投影设备与投影面之间相距不同距离时，需要投影设备的镜头调整不同的焦距从而在投影面上透射清晰的图像。而在投影过程中，投影设备与投影面的间隔距离会因用户摆放位置的不同而变化，因而需要不同的焦距。因此，为适应不同的使用场景，投影设备需要调节光学组件310的焦距。
图8为在一些实施例中距离传感器和相机结构示意图。如图8所示，用于投影的装置2还可以内置或外接相机700，相机700可以对投影设备投射的画面进行图像拍摄，以获取投影内容图像。投影设备再通过对投射内容图像进行清晰度检测，确定当前镜头焦距是否合适，并在不合适时进行焦距调整。基于相机700拍摄的投影内容图像进行自动调焦时，投影设备可以通过不断调整镜头位置并拍照，并通过对比前后位置图片的清晰度找到调焦位置，从而将光学组件中的移动镜片311调整至合适的位置。例如，控制器500可以先控制驱动马达320将移动镜片311从调焦起点位置逐渐移动至调焦终点位置，并在此期间不断通过相机700获取投影内容图像。再通过对多个投影内容图像进行清晰度检测，确定清晰度最高的位置，最后控制驱动马达320将移动镜片311从调焦终点位置调整到清晰度最高的位置，完成自动调焦。
图9为一些实施例中投影设备位置改变时的示意图。投影设备开始处于A位置,可以投射到合适的投影区域,投影区域呈矩形,通常可完整、准确的覆盖对应的矩形幕布。当投影设备由A位置移动至B位置后,经常会生成畸形投影图像,例如出现梯形图像,造成投影图像与矩形幕布不吻合的问题。
通常，投影设备具有图像校正功能。当投影设备在投影面中投射的投影图像出现梯形等畸形图像时，投影设备可以自动对投影图像进行校正，得到规则形状的图像，可以是矩形图像，从而实现图像校正。
具体可以设置为,当接收到图像校正指令时,投影设备可以开启图像校正功能,对投影图像进行校正。图像校正指令是指用于触发投影设备自动进行图像校正的控制指令。
在一些实施例中,图像校正指令可以是用户主动输入的指令。例如,在接通投影设备的电源后,投影设备可以在投影面上投射出图像。此时,用户可以按下投影设备中预先设置的图像校正开关,或投影设备配套遥控器上的图像校正按键,使投影设备开启图像校正功能,自动对投影图像进行图像校正。
在一些实施例中,图像校正指令也可以根据投影设备内置的控制程序自动生成。
可以是投影设备在开机后主动生成图像校正指令。例如,当投影设备检测到开机后的首次视频信号输入时,可以生成图像校正指令,从而触发图像校正功能。
也可以是投影设备在工作过程中，自行生成图像校正指令。考虑到用户在使用投影设备的过程中，可能主动移动投影设备或者不小心碰到投影设备，导致投影设备的摆放姿态或设置位置发生改变，投影图像也可能变为梯形图像。此时，为了保证用户的观看体验，投影设备可以自动进行图像校正。
具体的,投影设备可以实时检测自身的情况。当投影设备检测到自身摆放姿态或设置位置发生改变时,可以生成图像校正指令,从而触发图像校正功能。
在一些实施例中,投影设备通过其配置组件可实时监测设备的移动,并将监测结果即时反馈至投影设备控制器,以实现投影设备移动后,控制器即时启动图像校正功能在第一时间实现投影图像校正。例如,通过投影设备配置的加速度传感器,控制器接收来自加速度传感器的监测数据,以判定投影设备是否发生移动。
在判定投影设备发生移动后,控制器可以生成图像校正指令,并开启图像校正功能。
在一些实施例中,在投影设备进行图像校正的过程中,相机实时对图像进行拍摄。同时,如果检测到有人物进入到图像中,投影设备则会自动触发护眼模式。这样,为了减少光源对人眼的伤害,投影设备可以通过调节光源亮度使图像亮度发生变化,进而实现保护人眼的效果。
通常，在投影设备的移动过程中，投影设备基于陀螺仪传感器实时获取加速度数据，通过加速度数据的变化，判定投影设备是否发生移动。投影设备移动后，需在投影设备处于静止状态时才启动图像校正功能。
然而，陀螺仪传感器的硬件性能可能导致获取的加速度数据为异常数据，例如根据异常数据在移动过程中将投影设备误判为静止状态。因此，在投影设备处于移动状态时，会误触发图像校正以及护眼功能。
为此，本申请提出了一种投影设备，可以包括光机、光源、加速度传感器、相机和控制器。其中，光机用于将用户预先设定的播放内容投射至投影面，投影面可以是墙面或者幕布。投影设备可以基于相机拍摄的图像进行图像校正。光源用于为光机提供照明。加速度传感器用于判定投影设备是否发生移动。由此解决在投影设备的移动过程中误触发图像校正以及护眼功能的问题。
需要说明的是,当投影设备接收到图像校正指令时,投影设备可以触发图像校正功能。进而,图像校正指令的生成方法包括但不限于是用户选择预先设置的图像校正开关生成的,也可以为基于陀螺仪传感器的变化自动生成的。因此,本申请实施例将以图像校正开关为开启状态以及图像校正开关为关闭状态两种状态进行具体阐述。可以理解的,在图像校正开关为开启状态时,投影设备可以基于陀螺仪传感器的变化自动开启图像校正功能。在图像校正开关为关闭状态时,投影设备不开启图像校正功能。
在一些实施例中，图10为本申请实施例中投影设备触发图像校正的示意图。参见图10，控制器获取加速度传感器采集的加速度数据。如果加速度数据小于预设加速度阈值，判定投影设备处于静止状态，控制器则控制光机投射校正卡至投影面。并基于校正卡确定投影面中的待投影区域，以及将播放内容投射至待投影区域。
控制器实时监测并接收陀螺仪传感器即加速度传感器采集的加速度数据。其中,加速度数据包括三个轴坐标(X、Y及Z轴)上的加速度数据。由于加速度数据表示三个轴坐标方向上采集的加速度数据,当其中一个轴上的加速度数据变化时,说明在该轴坐标方向上,投影设备发生位移。因此,可通过判断加速度数据三个轴方向上的数据确定投影设备是否发生移动。接着,将三个轴坐标方向的加速度数据转换为角速度数据,通过角速度数据判定投影设备在三个轴坐标方向上的移动角度。这样,控制器通过获取上一次采集的加速度数据以及当前采集的加速度数据,计算投影设备在三个轴坐标方向上的移动角度。如果在三个轴坐标方向上的移动角度均小于预设加速度阈值,则判定投影设备处于静止状态,即可控制光机投射校正图卡至投影面,以触发图像校正过程。需要说明的是,本申请实施例对预设加速度阈值不作数值上的限定,本领域技术人员可以根据实际需求自行确定预设加速度阈值的取值,例如0.1、0.2、0.3等,这些设计都没有超出本申请实施例的保护范围。
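下面给出一个示意性的静止判定函数；此处将"三个轴坐标方向上的移动角度"简化为前后两次加速度数据的差值比较，阈值0.1亦仅为示例，并非本申请限定的实现：

```python
def is_static(prev_accel, curr_accel, threshold=0.1):
    """prev_accel/curr_accel 为上一次与当前采集的三轴(X、Y、Z)加速度数据。"""
    # 三个轴坐标方向上的变化量均小于预设阈值时，判定投影设备处于静止状态
    return all(abs(curr - prev) < threshold
               for prev, curr in zip(prev_accel, curr_accel))
```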
在一些实施例中,图11为本申请实施例中判定投影设备是否移动的示意图,包括以下步骤:
步骤1101,判断加速度数据是否小于预设加速度阈值,若否执行步骤1102,是则执行步骤1104;
步骤1102,将加速度数据对应的数据标签赋值为第二状态标识;
步骤1103,表征投影设备处于移动状态;
步骤1104,将加速度数据对应的数据标签赋值为第一状态标识;
步骤1105,获取第一时刻的加速度数据;
步骤1106,检测第一时刻的加速度数据对应的数据标签是否为第一状态标识,若否则执行步骤1107,是则执行步骤1108;
步骤1107,表征投影设备处于移动状态;
步骤1108,获取第二时刻的加速度数据;
步骤1109,检测第二时刻加速度数据是否小于预设加速度阈值,若否则执行步骤1110,若是则执行步骤1111;
步骤1110，将第二时刻的加速度数据对应的数据标签赋值为第二状态标识；
步骤1111,触发图像自动校正过程。
参见图11,控制器还被配置为:如果加速度数据小于预设加速度阈值,则将加速度数据对应的数据标签赋值为第一状态标识,第一状态标识用于表征投影设备处于静止状态。如果加速度数据大于预设加速度阈值,则将数据标签赋值为第二状态标识,第二状态标识用于表征投影设备处于移动状态。
预设加速度阈值为0.1。当加速度数据大于0.1时,则赋值当前的数据标签为第二状态标识,如movingType为moving,staticType为false。当加速度数据小于0.1时,则赋值当前的数据标签为第一状态标识,如movingType为moving,staticType为true。
为了更加精确的判定投影设备是否真正处于静止状态,如果加速度数据小于预设加速度阈值,控制器还被配置为:基于预设时长获取第一时刻的加速度数据。接着,检测第一时刻的加速度数据对应的数据标签。如果数据标签值为第一状态标识,获取第二时刻的加速度数据;其中,第二时刻与第一时刻之间间隔预设时长。根据第二时刻的加速度数据,计算投影设备的移动角度,以根据移动角度控制光机将校正图卡投射至投影面。
预设时长为1s。当加速度数据小于0.1时,间隔1s后再次获取第一时刻的加速度数据。判断第一时刻加速度数据的数据标签。此时,数据标签会出现两种情况。
第一种情况：如第一时刻加速度数据的数据标签是第二状态标识movingType为moving，staticType为false，则间隔1s后再次获取第二时刻的加速度数据。再次判断第二时刻加速度数据的数据标签，如第二时刻加速度数据的数据标签仍是第二状态标识movingType为moving，staticType为false，则判定投影设备处于移动状态，投影设备则不触发后续图像校正过程。
第二种情况:如第一时刻加速度数据的数据标签是第一状态标识movingType为moving,staticType为true,则间隔1s后再次获取第二时刻的加速度数据。再次判断第二时刻加速度数据的数据标签,如第二时刻加速度数据的数据标签仍是第一状态标识movingType为moving,staticType为true,则判定投影设备处于静止状态,投影设备则触发后续图像校正过程。
需要说明的是,由于加速度数据是由陀螺仪传感器实时采集并发送至控制器的。进而,可进行多次获取不同时刻的加速度数据以及判断对应的数据标签的过程,以便于更加精确的判定投影设备是否真正的处于静止状态。本申请实施例不对获取和判断过程的执行次数进行具体限定,可根据具体投影设备对应的设备状态进行多次设置,且均在本申请实施例的保护范围内。
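下面以一个简化的Python草图示意"间隔预设时长、多次检查数据标签"的确认过程；get_accel_sample为假设的数据获取接口，间隔时长与检查次数均为示例取值：

```python
import time

def confirm_static(get_accel_sample, threshold=0.1, interval=1.0, checks=2):
    """间隔预设时长连续多次采样，均小于阈值（第一状态标识）时才触发后续图像校正。"""
    for _ in range(checks):
        time.sleep(interval)                   # 间隔预设时长（如1s）
        if get_accel_sample() >= threshold:    # 数据标签为第二状态标识，判定处于移动状态
            return False
    return True                                # 连续多次均为第一状态标识，判定处于静止状态
```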
在一些实施例中,控制器在获取加速度传感器采集的加速度数据的步骤之后,如果加速度数据大于预设加速度阈值,控制器被配置为:基于预设时长获取第三时刻的加速度数据;其中,第三时刻与获取加速度数据的时刻间隔预设时长。如果第三时刻的加速度数据小于预设加速度阈值,计算投影设备的移动角度,以根据移动角度控制光机将校正图卡投射至投影面。
如果控制器获取的加速度数据大于0.1,此时,加速度数据的数据标签是第二状态标识movingType为moving,staticType为false。此时可认为投影设备处于移动状态。由于加速度传感器是实时采集并发送至控制器的,因此在投影设备处于移动状态后控制器仍实时获取并检测加速度数据。这样,控制器间隔1s后再次获取第三时刻的加速度数据。如果第三时刻的加速度数据仍大于0.1,则判定投影设备仍为移动状态。如果第三时刻的加速度数据小于0.1,则同上述继续多次获取不同时刻的加速度数据并判断对应的数据标签,以确定投影设备处于静止状态。如果投影设备处于静止状态,控制器根据加速度数据计算投影设备的移动角度,触发图像校正过程,在此不再赘述。
在一种可实现方式中,图像校正过程的过程具体包括:控制器控制光机投射校正卡至投影面。并基于校正卡确定投影面中的待投影区域。最后将播放内容投射至待投影区域。其中,校正卡包括校正白卡和校正图卡。
以下结合图12对图像校正过程进行具体阐述。
参见图12,控制器接收输入的图像校正指令后,控制光机投射校正白卡至投影面并提取校正白卡中的第一校正坐标。接着,控制器控制光机投校正图卡至投影面并提取校正图卡中的第二校正坐标。控制器基于第一校正坐标和第二校正坐标获取坐标之间的角度关系,其中,角度关系为光机投射的播放内容和投影面之间的投影关系。最后,控制器基于角度关系更新第一校正坐标,以使根据更新完成的第一校正坐标确定待投影区域。
控制器接收输入的图像校正指令后,参见图13,控制光机投射校正白卡至投影面。继续提取校正白卡中的第一校正坐标,其中,第一校正坐标包括校正白卡的四个顶点坐标。参见图14,控制器控制光机投射校正图卡至投影面,继续提取校正图卡中的第二校正坐标,其中,第二校正坐标包括校正图卡的四个顶点坐标。这样,控制器将校正白卡的四个顶点坐标以及校正图卡的四个顶点坐标进行比对,如将校正白卡中左上角的顶点坐标与校正图卡中左上角的顶点坐标进行比对、将校正白卡中左下角的顶点坐标与校正图卡中左下角的顶点坐标进行比对、将校正白卡中右上角的顶点坐标与校正图卡中右上角的顶点坐标进行比对以及将校正白卡中右下角的顶点坐标与校正图卡中右下角的顶点坐标进行比对。通过比对计算两两坐标的角度关系,并根据角度关系更新校正白卡中的四个顶点坐标。这样,可以基于完成更新校正白卡中的四个顶点坐标确定待投影区域即完成图像校正。最后控制光机将播放内容投射至待投影区域,供用户进行观看。
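下面以一个简化的OpenCV片段示意"分别提取两组顶点坐标—求取两者间的投影关系—更新第一校正坐标"的过程；此处用两组顶点间的单应性矩阵近似表示文中的角度关系，仅为示意性假设：

```python
import numpy as np
import cv2

def update_first_correction_coords(white_corners, chart_corners):
    """white_corners/chart_corners 分别为校正白卡与校正图卡的四个顶点坐标，形状为(4, 2)。"""
    src = np.asarray(white_corners, dtype=np.float32)
    dst = np.asarray(chart_corners, dtype=np.float32)
    H, _ = cv2.findHomography(src, dst)                  # 两组顶点坐标之间的投影（角度）关系
    updated = cv2.perspectiveTransform(src.reshape(-1, 1, 2), H)
    return updated.reshape(-1, 2)                        # 更新后的第一校正坐标，用于确定待投影区域
```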
在一种可实现的方式中,投影设备可以通过预设的校正算法来获取第一校正坐标和第二校正坐标之间的角度关系。投射关系指的是投影设备将图像投射到投影面的投射关系,具体为投影设备的光机投射的播放内容和投影面之间的映射关系。
在一种可实现的方式中,投影设备可以通过预设的校正算法将第一校正坐标转化为第二校正坐标,进而完成第一校正坐标的更新。
为了获取第一校正坐标以及第二校正坐标,控制器控制光机将校正白卡和校正图卡投射至投影面后,控制器还可以控制相机对投影面中显示的校正白卡和校正图卡进行拍摄,得到校正白卡图像和校正图卡图像。因此,可以从相机拍摄到的校正白卡图像和校正图卡图像中获取第一校正坐标以及第二校正坐标。
在校正白卡图像和校正图卡图像对应的图像坐标系下,确定第一校正坐标以及第二校正坐标。图像坐标系指的是:以图像中心为坐标原点,X,Y轴平行于图像两边的坐标系。再如:对于用户预先设定的投影区域,可以将该投影区域的中心点设置为原点,水平方向为X轴,垂直方向为Y轴。
需要说明的是,第一校正坐标以及第二校正坐标包括但不限于四个顶点坐标,可根据校正白卡和校正图卡以及投影设备调整坐标的数量和坐标的位置。
在一些实施例中,校正图卡中可以包含用户预先设定的图案、颜色特征。图15为一些实施例中校正图卡的示意图。校正图卡中可以包括三种校正区域:第一校正区域、第二校正区域和第三校正区域。三种校正区域可以沿着校正图卡从中心到边缘的方向逐次排列。其中,第一校正区域位于校正图卡的中间区域,如图中的A区域。第一校正区域可以是棋盘格图案,也可以是圆环图案。第二校正区域位于第一校正区域和第三校正区域之间,如图中的四个B区域,包括B1-B4四个校正区域。每个第二校正区域的面积可以是相同的,都可以小于第一校正区域的面积。每个第二校正区域中都包括用户预先设定的图案,其中,每个第二校正区域的图案可以是相同的,也可以是不同的,本申请实施例不做限定。第三校正区域位于校正图卡的边缘区域,如图中的16个C区域,包括C1-C16共16个校正区域。每个第三校正区域的面积可以是相同的,都小于第二校正区域的面积。每个第三校正区域中都可以包括用户预先设定的图案,其中,每个第三校正区域的图案可以是相同的,也可以是不同的。
进而,第二校正坐标可以是三个校正区域中所有图案的四个顶点坐标,也可以是任意校正区域中任意几个图案的四个顶点坐标。
在一些实施例中,如图16所示,校正图卡中的图案也可设置为圆环图卡,圆环图卡包括圆环图案。需要说明的是,校正图卡还可设置为上述两类图案的组合,或者还可以设置为其他具有可实现图像校正的图案。
在一些实施例中,控制器依次控制光机将校正白卡和校正图卡投射至投影面后,均会触发自动对焦过程。为便于描述,将校正白卡投射至投影面后进行的自动对焦过程称为第一次对焦。将校正图卡投射至投影面后进行的自动调焦过程称为第二次对焦。
本申请实施例提供的投影设备还包括镜头,镜头包括光学组件和驱动马达;驱动马达连接光学组件。通过驱动马达调整光学组件的焦距,以实现第一次对焦过程和第二次对焦过程。在执行第一次对焦过程和第二次对焦过程时,控制器控制镜头采用反差式对焦法,控制驱动马达带动光学组件逐步移动。同时计算镜头在每个移动位置中对应的对比度反差值。根据对比度反差值确定目标位置,目标位置为对比度反差值最高的图像对应位置。最终,控制驱动马达按照目标位置移动光学组件的位置。
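以下为反差式对焦的一个示意性实现；步进、拍照接口均为假设，并以拉普拉斯算子的方差作为"对比度反差值"的占位计算方式：

```python
import cv2

def contrast_autofocus(move_to, capture, positions):
    """遍历各移动位置计算对比度反差值，将光学组件移动到反差值最高的目标位置。"""
    best_pos, best_contrast = positions[0], -1.0
    for pos in positions:
        move_to(pos)                                          # 驱动马达带动光学组件逐步移动
        gray = cv2.cvtColor(capture(), cv2.COLOR_BGR2GRAY)
        contrast = cv2.Laplacian(gray, cv2.CV_64F).var()      # 以拉普拉斯方差近似反差值
        if contrast > best_contrast:
            best_pos, best_contrast = pos, contrast
    move_to(best_pos)                                         # 按目标位置移动光学组件
    return best_pos
```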
在一些可实现的方式中,投影设备还可以通过自动对焦算法,利用其配置的激光测距可获得当前物距,以计算初始焦距及搜索范围。然后投影设备驱动相机进行拍照,并利用对应算法进行清晰度评价。因此,投影设备在上述搜索范围内,基于搜索算法查找可能的最佳焦距。然后重复上述拍照、清晰度评价步骤,最终通过清晰度对比找到最优焦距,完成自动对焦。
在一些实施例中,在投影设备启动以及用户改变其位置后,投影设备在触发图像校正之前需检测自动对焦功能是否开启。当自动对焦功能未开启时,控制器将不触发自动对焦过程。当自动对焦功能开启时,投影设备将自动触发第一对焦过程、第二对焦过程以及在完成图像校正之后的对焦过程。
在一些实施例中,控制器在执行图像校正过程后,还可以触发护眼模式。通过触发护眼模式降低用户界面显示亮度,防止用户偶然进入投影设备射出激光轨迹范围内而导致的视力损害危险。在用户进入投影设备所在的投影区域时,控制器可以控制用户界面显示对应的提示信息,以提醒用户离开当前区域。
如图17所述,给出投影设备触发护眼模式的流程图,包括以下步骤:
步骤1701,判断儿童观影模式开关是否开启,若是则执行步骤1702,否则结束触发;
步骤1702，判断防射眼开关是否开启，若是则执行步骤1703，否则结束触发；
步骤1703,判断图像校正是否进行,若否则执行步骤1704,是则结束触发;
步骤1704，判断自动对焦是否进行，若否则执行步骤1705，是则结束触发；
步骤1705,触发进入护眼模式。在一种可实现的方式中,参见图17,控制器在执行图像校正过程后,检测儿童观影模式是否开启,如检测儿童观影模式开关是否开启。在儿童观影模式开关开启的状态下,控制器检测防射眼开关是否开启。如果防射眼开关在开启状态,用户进入投影区域时,投影设备将触发进入护眼模式,即自动降低光源发出激光强度、降低用户界面显示亮度以及显示安全提示信息。
在一些实施例中,控制器也可以自动开启防射眼开关。在接收加速度传感器发送的加速度数据或接收其他传感器所采集的数据后,控制器将控制投影设备开启防射眼开关。
在一种可实现的方式中,控制器在执行图像校正过程后触发上述护眼功能时,控制器还被配置为:通过相机获取预设帧具有相同内容的图像。计算预设帧中对应图像的相似度值;如果相似度值大于相似度阈值,将光源的亮度调整至预设亮度。其中,预设亮度小于正常投影时的亮度。
控制器获取两帧的图像，如计算当前帧的图像和上一帧的图像的相似度值。如果相似度值小于相似度阈值，则判定两帧图像相同，且没有人物进入投影区域。如果相似度值大于相似度阈值，则判定两帧图像不相同，有人物进入投影区域。这样，在检测到有人物进入投影区域时，将光源的亮度调整至预设亮度。因此，通过将光源亮度调低以实现保护人眼的目的。需要说明的是，本申请不对如何检测人物进入投影区域的实现方式进行限定，仅以判定预设帧图像是否相同作为示例。
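下面给出一个示意性片段，以两帧灰度图的平均绝对差作为帧间差异（即文中"相似度值"的占位实现），差异超过阈值时视为有人物进入投影区域；阈值取值仅为示例：

```python
import numpy as np
import cv2

def person_entered(frame_prev, frame_curr, threshold=10.0):
    """比较相邻两帧画面，差异大于阈值时返回True，提示需将光源亮度调整至预设亮度。"""
    g1 = cv2.cvtColor(frame_prev, cv2.COLOR_BGR2GRAY).astype(np.float32)
    g2 = cv2.cvtColor(frame_curr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    diff = float(np.mean(np.abs(g1 - g2)))      # 帧间平均绝对差，作为相似度值的占位
    return diff > threshold
```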
由于图像校正、自动对焦过程、护眼模式不能同步触发，在进行图像校正或者自动对焦过程时，不能触发护眼模式，以防止影响校正效果。因此，如果预设帧对应的图像不相同，继续参见图17，控制器还被配置为：检测对图像的校正进度；如果校正进度为未进行，则获取当前时刻的加速度数据，并根据当前时刻的加速度数据，将光源的亮度调整至预设亮度。
这样,在检测有人物进入投影区域后,控制器需检测对图像的校正进度。如果没进行图像校正,则获取并检测当前时刻的加速度数据。通过当前时刻的加速度数据判定投影设备是否处于移动状态。如果投影设备处于静止状态,则将光源的亮度调整至预设亮度即触发护眼模式。
在一些实施例中,如果校正进度为未进行,控制器还被配置为:检测对图像的对焦进度。如果对焦进度为未进行,则获取并检测当前时刻的加速度数据。根据当前时刻的加速度数据判定投影设备是否处于移动状态,如果投影设备处于静止状态,则触发护眼模式。
控制器在检测自动对焦功能是否开启时,可以检测自动对焦开关是否开启。如果自动对焦开关未开启,即触发护眼模式。如果自动对焦开关开启,控制器需判断当前时刻对图像的对焦进度。如果对焦进度为未进行,再触发护眼模式。
以上为本申请实施例中在图像校正开关为默认开启状态场景下，投影设备可以基于加速度传感器的变化触发图像校正过程，并在完成图像校正过程后，触发进入护眼模式。通过对加速度传感器采集的加速度数据的检测，更加精确地判定投影设备是否处于静止状态。避免了投影设备在移动状态时误触发图像校正过程及进入护眼模式。进一步避免在进入护眼模式同时触发图像校正过程和自动对焦过程，提升了用户的使用体验。
在一些实施例中,在投影设备的图像校正开关为关闭状态时,控制器即不会触发图像校正过程。此时,由于防射眼在开启状态下,检测到人物进入投影区域时,投影设备将自动进入护眼模式,即执行降低光源发出激光强度、降低用户界面显示亮度以及显示安全提示信息等过程。需要说明的是,具体触发护眼模式的步骤参照上述实施例中的描述,在此不再赘述。
这样，在图像校正开关为默认关闭状态时，投影设备检测自动对焦开关的开启或关闭状态后，如果自动对焦开关为关闭状态，则触发护眼模式。同时，在触发护眼模式的过程中，通过检测校正进度和对焦进度，避免在进入护眼模式时同时触发图像校正过程和自动对焦过程，提升了用户的使用体验。
由以上技术方案可知,本申请一些实施例中提供的投影设备,在默认可以触发图像校正和护眼模式的场景下,获取加速度传感器采集的加速度数据。如果加速度数据小于预设加速度阈值,控制光机投射校正卡至投影面。基于校正卡确定投影面中的待投影区域,以及将播放内容投射至待投影区域。由此,在判定投影设备移动后处于静止状态后触发图像校正。并且后续通过相机获取预设帧具有相同内容的图像,根据图像调整光源的亮度即触发护眼功能。这样,在触发图像校正过程后,再基于预设帧具有相同内容的图像触发护眼模式。解决在投影设备的移动过程中,误触发图像校正以及护眼功能的问题,提升用户的使用体验。
在一些实施例中,本申请还提供了一种触发校正方法,应用于投影设备,投影设备包括光源、加速度传感器、光机、相机以及控制器;触发校正方法包括:
获取加速度传感器采集的加速度数据;如果加速度数据小于预设加速度阈值,控制光机投射校正图卡至投影面;基于校正图卡确定投影面中的待投影区域,以及将播放内容投射至待投影区域;通过相机获取预设帧具有相同内容的图像;将光源的亮度调整至预设亮度;其中,预设亮度小于正常投影时的亮度。
在一些实施例中,方法包括:如果加速度数据小于预设加速度阈值,则将加速度数据对应的数据标签赋值为第一状态标识,第一状态标识用于表征投影设备处于静止状态;如果加速度数据大于预设加速度阈值,则将数据标签赋值为第二状态标识,第二状态标识用于表征投影设备处于移动状态。
在一些实施例中,如果加速度数据小于预设加速度阈值,方法包括:基于预设时长获取第一时刻的加速度数据;检测第一时刻的加速度数据对应的数据标签;如果数据标签值为第一状态标识,获取第二时刻的加速度数据;其中,第二时刻与第一时刻之间间隔预设时长;根据第二时刻的加速度数据,计算投影设备的移动角度,以根据移动角度控制光机将校正图卡投射至投影面。
在一些实施例中,如果加速度数据大于预设加速度阈值,方法包括:在获取加速度传感器采集的加速度数据的步骤之后,基于预设时长获取第三时刻的加速度数据;其中,第三时刻与获取加速度数据的时刻间隔预设时长;如果第三时刻的加速度数据小于预设加速度阈值,计算投影设备的移动角度,以根据移动角度控制光机将校正图卡投射至投影面。
在一些实施例中,方法包括:在加速度数据小于预设加速度阈值的步骤中,获取上一次采集的加速度数据;根据加速度数据计算投影设备在三个轴坐标方向上的移动角度;如果在三个轴坐标方向上的移动角度均小于预设加速度阈值,则控制光机投射校正图卡至投影面。
在一些实施例中,方法包括:在将光源的亮度调整至预设亮度的步骤中,计算预设帧中对应图像的相似度值。如果相似度值大于相似度阈值,则将光源的亮度调整至预设亮度。
在一些实施例中，如果预设帧数中对应的图像不相同，方法包括：检测对图像的校正进度；如果校正进度为未进行，获取当前时刻的加速度数据；根据当前时刻的加速度数据，将光源的亮度调整至预设亮度。
在一些实施例中,投影设备还包括镜头,镜头包括光学组件和驱动马达;驱动马达连接光学组件,以调整光学组件的焦距;方法包括:在控制光机投射校正白卡的步骤之后,控制驱动马达带动光学组件逐步移动;计算镜头在每个移动位置中对应的对比度反差值;根据对比度反差值确定目标位置,目标位置为对比度反差值最高的图像对应位置;控制驱动马达按照目标位置移动光学组件的位置。
本说明书中各个实施例之间相同相似的部分互相参照即可,在此不再赘述。

Claims (19)

  1. 一种投影设备,包括:
    光机,用于投射播放内容至投影面;
    相机,用于获取所述投影面的显示图像;
    控制器,被配置为:
    在检测到投影设备移动后,控制所述相机先后获取所述光机投射至所述投影面的第一图像及第二图像,所述第一图像对应于所述光机投射的第一图卡,所述第二图像对应于所述光机投射的第二图卡,所述第一图像、第二图像用于对应特征比对以确定在世界坐标系下投影图卡与所述投影面的映射关系;
    基于所述映射关系获取所述相机拍摄图像转换至所述世界坐标系下的矩形区域,并控制所述光机将播放内容投射至所述矩形区域,所述矩形区域用于替代所述投影设备移动后在所述投影面形成的畸形投影区域。
  2. 如权利要求1所述投影设备,控制器控制所述光机将播放内容投射至所述矩形区域前,所述控制器还被配置为:
    将拍摄图像进行灰度化处理;
    基于聚类算法及灰度值将所述灰度化处理的拍摄图像聚类为投影光束区域及障碍物区域;
    将所述投影光束区域及障碍物区域进行二值化以提取大于预设避障面积阈值的第一闭合轮廓,所述第一闭合轮廓对应于障碍物,所述第一闭合轮廓用于避开所述投影面生成的所述矩形区域以实现投影避障。
  3. 如权利要求2所述投影设备,控制器控制所述光机将播放内容投射至所述矩形区域前,所述控制器还被配置为:
    将所述投影光束区域及障碍物区域进行二值化以提取小于预设避障面积阈值的第二闭合轮廓,所述第二闭合轮廓对应于非障碍物,所述第二闭合轮廓填充为投影光束区域,所述第二闭合轮廓不影响所述投影面生成的所述矩形区域的位置。
  4. 如权利要求1所述投影设备,控制器基于所述映射关系获取所述相机拍摄图像转换至世界坐标系下的矩形区域,具体包括所述控制器:
    基于所述映射关系,确定投影图卡对应拍摄图像在世界坐标系中的中心点坐标;
    根据所述投影图卡的宽高比以所述中心点坐标为基准进行矩形扩张,直至抵达灰度值变化边界,获取该时刻矩形的四个顶点作为所述矩形区域的顶点。
  5. 如权利要求1所述投影设备,控制器基于所述映射关系获取所述相机拍摄图像转换至世界坐标系下的矩形区域,具体包括所述控制器:
    获取拍摄图像中灰度值为255的最大内接矩形;
    依据所述投影图卡的宽高比修正所述最大内接矩形的宽值、高值;
    基于所述宽值、高值得到所述最大内接矩形的任意顶点以定位所述矩形区域。
  6. 如权利要求1所述投影设备,所述控制器还被配置为:
    控制所述光机将播放内容投射至所述矩形区域的步骤之后,基于相机获取的投影面的显示图像,利用边缘检测算法识别投影设备的投影区域;在投影区域显示为矩形、或类矩形时,控制器通过预设算法获取上述矩形投影区域四个顶点的坐标值。
  7. 如权利要求6所述投影设备,所述控制器还被配置为:
    使用透视变换方法校正投影区域为矩形,计算矩形和投影截图的差值,以实现判断显示区域内是否有异物。
  8. 如权利要求6所述投影设备,所述控制器还被配置为:
    在实现对投影区域外一定区域的异物检测时,可将当前帧的相机内容、和上一帧的相机内容做差值,以判断投影范围外区域是否有异物进入;若判断有异物进入,投影设备自动触发防射眼功能。
  9. 如权利要求6所述投影设备,所述控制器还被配置为:
    利用飞行时间相机、或飞行时间传感器检测特定区域的实时深度变化;若深度值变化超过预设阈值,投影设备将自动触发防射眼功能。
  10. 如权利要求6所述投影设备,所述控制器还被配置为:基于采集的飞行时间数据、截图数据、以及相机数据分析判断是否需要开启防射眼功能。
  11. 如权利要求6所述投影设备,所述控制器还被配置为:
    若用户位于特定区域,其视力存在被激光损害风险,将自动启动防射眼功能,以降低光机发出激光强度、降低用户界面显示亮度、并显示安全提示信息。
  12. 如权利要求6所述投影设备,所述控制器还被配置为:
    通过陀螺仪、或陀螺仪传感器对设备移动进行监测;向陀螺仪发出用于查询设备状态的信令,并接收陀螺仪反馈用于判定设备是否发生移动的信令。
  13. 如权利要求6所述投影设备,所述控制器还被配置为:
    在陀螺仪数据稳定预设时间长度后,控制启动触发梯形校正,在梯形校正进行时不响应遥控器按键发出的指令。
  14. 如权利要求6所述投影设备,所述控制器还被配置为:
    通过自动避障算法识别幕布,并利用投影变化,将播放内容校正至幕布内显示,实现与幕布边沿对齐的效果。
  15. 一种用于投影设备的投影图像的显示控制方法,所述方法包括:
    在检测到投影设备移动后,先后获取投射至投影面的第一图像及第二图像,所述第一图像对应于投射第一图卡,所述第二图像对应于投射第二图卡,所述第一图像、第二图像用于对应特征比对以确定在世界坐标系下投影图卡与所述投影面的映射关系,所述投影设备包括:用于投射播放内容至投影面的光机和用于获取所述投影面的显示图像的相机;
    基于所述映射关系获取相机拍摄图像转换至世界坐标系下的矩形区域,将播放内容投射至所述矩形区域,所述矩形区域用于替代移动后在投影面形成的畸形投影区域。
  16. 如权利要求15所述投影图像的显示控制方法,将播放内容投射至所述矩形区域前,所述方法还包括:
    将拍摄图像进行灰度化处理;
    基于聚类算法及灰度值将所述灰度化处理的拍摄图像聚类为投影光束区域及障碍物区域;
    将所述投影光束区域及障碍物区域进行二值化以提取大于预设避障面积阈值的第一闭合轮廓,所述第一闭合轮廓对应于障碍物,所述第一闭合轮廓用于避开所述投影面生成的所述矩形区域以实现投影避障。
  17. 如权利要求16所述投影图像的显示控制方法,将播放内容投射至所述矩形区域前,所述方法还包括:
    将所述投影光束区域及障碍物区域进行二值化以提取小于预设避障面积阈值的第二闭合轮廓,所述第二闭合轮廓对应于非障碍物,所述第二闭合轮廓填充为投影光束区域,所述第二闭合轮廓不影响所述投影面生成的所述矩形区域的位置。
  18. 如权利要求15所述投影图像的显示控制方法，基于所述映射关系获取所述相机拍摄图像转换至世界坐标系下的矩形区域，具体包括：
    基于所述映射关系,确定投影图卡对应拍摄图像在世界坐标系中的中心点坐标;
    根据所述投影图卡的宽高比以所述中心点坐标为基准进行矩形扩张,直至抵达灰度值变化边界,获取该时刻矩形的四个顶点作为所述矩形区域的顶点。
  19. 如权利要求15所述投影图像的显示控制方法,基于所述映射关系获取所述相机拍摄图像转换至世界坐标系下的矩形区域,具体包括:
    获取所述拍摄图像中灰度值为255的最大内接矩形;
    依据所述投影图卡的宽高比修正所述最大内接矩形的宽值、高值;
    基于所述宽值、高值得到所述最大内接矩形的任意顶点以定位所述矩形区域。
PCT/CN2022/122816 2021-11-16 2022-09-29 一种投影设备及投影图像的显示控制方法 WO2023087951A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202280063190.4A CN118104229A (zh) 2021-11-16 2022-09-29 一种投影设备及投影图像的显示控制方法

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
CN202111355866 2021-11-16
CN202111355866.0 2021-11-16
CN202210050600.3A CN114205570A (zh) 2021-11-16 2022-01-17 一种投影设备及自动校正投影图像的显示控制方法
CN202210050600.3 2022-01-17
CN202210399514.3A CN114866751A (zh) 2022-04-15 2022-04-15 一种投影设备及触发校正方法
CN202210399514.3 2022-04-15

Publications (1)

Publication Number Publication Date
WO2023087951A1 true WO2023087951A1 (zh) 2023-05-25

Family

ID=86396219

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/122816 WO2023087951A1 (zh) 2021-11-16 2022-09-29 一种投影设备及投影图像的显示控制方法

Country Status (2)

Country Link
CN (1) CN118104229A (zh)
WO (1) WO2023087951A1 (zh)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101916175A (zh) * 2010-08-20 2010-12-15 浙江大学 自适应于投影表面的智能投影方法
US20180176520A1 (en) * 2016-12-16 2018-06-21 Cj Cgv Co., Ltd. Method of automatically correcting projection area based on image photographed by photographing device and system therefor
CN110336987A (zh) * 2019-04-03 2019-10-15 北京小鸟听听科技有限公司 一种投影仪畸变校正方法、装置和投影仪
CN110677634A (zh) * 2019-11-27 2020-01-10 成都极米科技股份有限公司 投影仪的梯形校正方法、装置、系统及可读存储介质
CN112584113A (zh) * 2020-12-02 2021-03-30 深圳市当智科技有限公司 基于映射校正的宽屏投影方法、系统及可读存储介质
CN114205570A (zh) * 2021-11-16 2022-03-18 海信视像科技股份有限公司 一种投影设备及自动校正投影图像的显示控制方法
CN114866751A (zh) * 2022-04-15 2022-08-05 海信视像科技股份有限公司 一种投影设备及触发校正方法

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101916175A (zh) * 2010-08-20 2010-12-15 浙江大学 自适应于投影表面的智能投影方法
US20180176520A1 (en) * 2016-12-16 2018-06-21 Cj Cgv Co., Ltd. Method of automatically correcting projection area based on image photographed by photographing device and system therefor
CN110336987A (zh) * 2019-04-03 2019-10-15 北京小鸟听听科技有限公司 一种投影仪畸变校正方法、装置和投影仪
CN110677634A (zh) * 2019-11-27 2020-01-10 成都极米科技股份有限公司 投影仪的梯形校正方法、装置、系统及可读存储介质
CN112584113A (zh) * 2020-12-02 2021-03-30 深圳市当智科技有限公司 基于映射校正的宽屏投影方法、系统及可读存储介质
CN114205570A (zh) * 2021-11-16 2022-03-18 海信视像科技股份有限公司 一种投影设备及自动校正投影图像的显示控制方法
CN114885136A (zh) * 2021-11-16 2022-08-09 海信视像科技股份有限公司 投影设备和图像校正方法
CN114866751A (zh) * 2022-04-15 2022-08-05 海信视像科技股份有限公司 一种投影设备及触发校正方法

Also Published As

Publication number Publication date
CN118104229A (zh) 2024-05-28

Similar Documents

Publication Publication Date Title
WO2023088329A1 (zh) 投影设备及投影图像校正方法
WO2023087947A1 (zh) 一种投影设备和校正方法
US20160337626A1 (en) Projection apparatus
CN114866751A (zh) 一种投影设备及触发校正方法
CN116055696A (zh) 一种投影设备及投影方法
CN115002432A (zh) 一种投影设备及避障投影方法
CN115002433A (zh) 投影设备及roi特征区域选取方法
JP2012181264A (ja) 投影装置、投影方法及びプログラム
WO2023087951A1 (zh) 一种投影设备及投影图像的显示控制方法
WO2023088303A1 (zh) 投影设备及避障投影方法
CN114760454A (zh) 一种投影设备及触发校正方法
CN116320335A (zh) 一种投影设备及调整投影画面尺寸的方法
CN114928728A (zh) 投影设备及异物检测方法
CN115623181A (zh) 一种投影设备及投影画面移动方法
CN115604445A (zh) 一种投影设备及投影避障方法
CN114885142B (zh) 一种投影设备及调节投影亮度方法
WO2023087948A1 (zh) 一种投影设备及显示控制方法
WO2023087960A1 (zh) 投影设备及调焦方法
WO2024066776A9 (zh) 投影设备及投影画面处理方法
WO2023115857A1 (zh) 激光投影设备及投影图像的校正方法
CN118075435A (zh) 一种投影设备及指令响应方法
JP2017152765A (ja) プロジェクター及びプロジェクターの制御方法
CN116095287A (zh) 一种投影设备标定方法、标定系统及投影设备
CN115604442A (zh) 一种投影设备及调整光源亮度的方法
CN118158367A (zh) 一种投影设备及投影画面入幕方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22894489

Country of ref document: EP

Kind code of ref document: A1