CN114466173A - Projection equipment and projection display control method for automatically throwing screen area - Google Patents


Info

Publication number
CN114466173A
Authority
CN
China
Prior art keywords
image
projection
curtain
closed contour
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210006233.7A
Other languages
Chinese (zh)
Inventor
卢平光
何营昊
王昊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Publication of CN114466173A publication Critical patent/CN114466173A/en
Priority to CN202280063192.3A priority Critical patent/CN118104230A/en
Priority to PCT/CN2022/122810 priority patent/WO2023087950A1/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141Constructional details thereof
    • H04N9/317Convergence or focusing systems
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details
    • G03B21/53Means for automatic focusing, e.g. to compensate thermal effects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3185Geometric adjustment, e.g. keystone or convergence

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Geometry (AREA)
  • General Physics & Mathematics (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Projection Apparatus (AREA)
  • Automatic Focus Adjustment (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Focusing (AREA)

Abstract

The present application relates to the technical field of display devices, and in particular to a projection device and a projection display control method for automatically projecting into a curtain area. It can, to some extent, solve the problems that the projection angle must be manually fine-tuned after the position of the projection device is adjusted, or that the projector misidentifies a large solid-color area such as a wall as the curtain, so that the played content cannot be accurately projected onto the curtain projection area. The projection device includes: a projection assembly; and a controller configured to: obtain a second image by luminance analysis and binarization of the gray-scale map of an acquired first image; determine a primary closed contour contained in the second image, the primary closed contour containing a secondary closed contour; and, when the secondary closed contour is determined to be a convex quadrilateral, project the played content into the secondary closed contour, the secondary closed contour corresponding to the projection area of the curtain. The curtain includes a curtain edge band corresponding to the primary closed contour, and the projection area is surrounded by the curtain edge band.

Description

Projection device and projection display control method for automatically projecting into a curtain area
The present application claims priority to Chinese patent application No. 202111355866.0, entitled "Projection apparatus and display control method based on geometric correction", filed with the China National Intellectual Property Administration on November 16, 2021, which is incorporated herein by reference in its entirety.
Technical Field
The present application relates to the technical field of display devices, and in particular to a projection device and a projection display control method for automatically projecting into a curtain area.
Background
A projector is a device that projects images or video onto a curtain for display. Through its various interfaces it can be connected to a computer, a VCD, DVD or BD player, a game console, a DV camera, or broadcast television and video signal sources, and play the corresponding video signals.
In some implementations of display control for projecting played content onto a curtain projection area, the projector first acquires an image of the curtain area; it then performs binarization on the acquired image so that the contours of objects in the image are displayed more clearly; finally, based on the binarized image, the projector extracts all closed contours contained in it and determines the closed contour with the largest area and a uniform interior color as the curtain projection area.
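The grayscale-and-threshold step of this approach can be sketched in pure Python (a hypothetical simplification for illustration: a real projector would use an optimized image pipeline, and the mean-luminance threshold chosen here is an assumption):

```python
def to_grayscale(rgb_image):
    """Convert an RGB image (nested lists of (r, g, b) tuples) to a gray-scale
    map using the ITU-R BT.601 luminance weights."""
    return [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row]
            for row in rgb_image]

def binarize(gray_image):
    """Threshold the gray-scale map at its mean luminance: pixels brighter than
    the mean become 1 (candidate curtain surface), the rest become 0."""
    pixels = [p for row in gray_image for p in row]
    threshold = sum(pixels) / len(pixels)
    return [[1 if p > threshold else 0 for p in row] for row in gray_image]

# A toy 2x2 "environment image": a bright curtain column next to dark surroundings.
image = [[(250, 250, 250), (20, 20, 20)],
         [(240, 240, 240), (30, 30, 30)]]
binary = binarize(to_grayscale(image))
```

Contour extraction would then run on `binary`; in practice this stage is typically handled by a library routine rather than hand-written code.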
However, when there is a large solid-color wall around the curtain to be projected onto, and the wall edges form a closed contour, the projector may recognize the wall as the curtain, so that the played content is projected onto the wall instead of onto the designated curtain.
Disclosure of Invention
In order to solve the problems that the projection angle must be manually fine-tuned after the position of the projection device is adjusted, or that a large solid-color area such as a wall is misidentified as the curtain, so that the played content cannot be accurately projected onto the curtain projection area, the present application provides a projection device and a projection display control method for automatically projecting into the curtain area.
The embodiment of the application is realized as follows:
A first aspect of the embodiments of the present application provides a projection device, including: a projection assembly, configured to project played content onto a curtain corresponding to the projection device; and a controller configured to: perform binarization on a first image acquired by a camera, based on a luminance analysis of the gray-scale map of the first image, to obtain a second image; determine a primary closed contour contained in the second image, wherein the primary closed contour contains a secondary closed contour; and, when the secondary closed contour is determined to be a convex quadrilateral, control the projection assembly to project the played content into the secondary closed contour, wherein the secondary closed contour corresponds to the projection area of the curtain; wherein the curtain includes a curtain edge band corresponding to the primary closed contour, and the projection area is surrounded by the curtain edge band.
A second aspect of the embodiments of the present application provides a projection display control method for automatically projecting into a curtain area, the method including: performing binarization on a first image, based on a luminance analysis of the acquired gray-scale map of the first image, to obtain a second image, wherein the first image is an image of the environment; determining a primary closed contour contained in the second image, wherein the primary closed contour contains a secondary closed contour; and, when the secondary closed contour is determined to be a convex quadrilateral, projecting the played content into the secondary closed contour, wherein the secondary closed contour corresponds to the projection area of the curtain; wherein the curtain includes a curtain edge band corresponding to the primary closed contour, and the projection area is surrounded by the curtain edge band.
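The convex-quadrilateral test by which a candidate secondary closed contour is accepted as the projection area can be illustrated as follows (a minimal sketch, under the assumption that the contour has already been reduced to four (x, y) vertices in order; the patent does not specify the exact test used):

```python
def cross(o, a, b):
    """Z-component of the cross product of vectors o->a and o->b."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def is_convex_quadrilateral(points):
    """Return True if the four vertices form a convex quadrilateral:
    the cross products at all four corners must share the same sign."""
    if len(points) != 4:
        return False
    signs = []
    for i in range(4):
        o, a, b = points[i], points[(i + 1) % 4], points[(i + 2) % 4]
        c = cross(o, a, b)
        if c == 0:          # degenerate (collinear) corner
            return False
        signs.append(c > 0)
    return all(signs) or not any(signs)

# A rectangle is convex; an arrow-head shape (one reflex vertex) is not.
rect = [(0, 0), (4, 0), (4, 3), (0, 3)]
arrow = [(0, 0), (4, 0), (1, 1), (0, 3)]
```

Checking the sign of the turn at every corner rejects concave shapes such as the arrow-head, which a wall with an alcove or an occluded curtain edge might produce.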
Beneficial effects of the present application: by acquiring the first image, an image of the environment in which the curtain corresponding to the projector is located can be obtained; further, by creating the gray-scale map of the first image, the accuracy with which the projector identifies the closed contours corresponding to environmental elements can be improved; further, by constructing the primary closed contour and the secondary closed contour, the screening range of candidate areas for the curtain projection area can be narrowed; further, by determining that the secondary closed contour is a convex quadrilateral, the curtain projection area contained in the candidate areas can be identified. This improves the accuracy of identifying the curtain projection area, avoids the need for the user to manually fine-tune the projection angle, and ensures that after the projector is moved, its played content can still be automatically projected onto the curtain projection area.
Drawings
In order to illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1A is a schematic view of a projection apparatus according to an embodiment of the present disclosure;
FIG. 1B is a schematic diagram of an optical path of a projection apparatus according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a circuit architecture of a projection apparatus according to an embodiment of the present application;
FIG. 3 is a schematic structural diagram of a projection apparatus according to an embodiment of the present application;
FIG. 4 is a schematic structural diagram of a projection apparatus according to another embodiment of the present application;
FIG. 5 is a schematic circuit diagram of a projection apparatus according to an embodiment of the present application;
FIG. 6A is a schematic diagram of a screen corresponding to a projector according to an embodiment of the present application;
FIG. 6B is a schematic diagram of a first image of an environment in which a projector according to another embodiment of the present application is located;
FIG. 6C is a schematic diagram of a first image and its corresponding grayscale map according to another embodiment of the present application;
FIG. 6D is a schematic diagram of a second image obtained after binarization of the image of the environment in which a projector is located, according to an embodiment of the present application;
FIG. 6E is a schematic diagram of the binarization of the closed contour corresponding to a curtain, according to an embodiment of the present application;
FIG. 6F is a schematic diagram of a projector according to an embodiment of the present application recognizing a large area of solid color wall as a projection area of a curtain;
FIG. 7A is a schematic diagram of a system framework for implementing display control of a projection device according to an embodiment of the present application;
FIG. 7B is a schematic timing diagram of signaling interaction for implementing the anti-glare (eye-protection) function by a projection apparatus according to another embodiment of the present application;
FIG. 7C is a schematic diagram illustrating a timing sequence of signaling interaction for implementing a display screen correction function by a projection apparatus according to another embodiment of the present application;
FIG. 7D is a flowchart illustrating an implementation of an auto-focus algorithm by a projection device according to another embodiment of the present application;
FIG. 7E is a schematic flowchart of a projection device implementing a keystone correction and obstacle avoidance algorithm according to another embodiment of the present application;
FIG. 7F is a schematic flow chart illustrating a projection apparatus implementing a screen entry algorithm according to another embodiment of the present application;
FIG. 7G is a schematic flowchart of a projection apparatus implementing an anti-glare (eye-protection) algorithm according to another embodiment of the present application.
Detailed Description
To make the purpose and embodiments of the present application clearer, the following will clearly and completely describe the exemplary embodiments of the present application with reference to the attached drawings in the exemplary embodiments of the present application, and it is obvious that the described exemplary embodiments are only a part of the embodiments of the present application, but not all of the embodiments.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between similar or analogous objects or entities and not necessarily for describing a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
Fig. 1A shows a schematic layout of a projection device according to an embodiment of the present application.
In some embodiments, the present application provides a projection system comprising a projection screen 1 and a projection device 2. The projection screen 1 is fixed at a first position, and the projection device 2 is placed at a second position so that the projected picture matches the projection screen 1; this step is performed by a professional after-sales technician, i.e., the second position is the optimal placement position of the projection device 2.
FIG. 1B is a schematic diagram of an optical path of a projection apparatus according to an embodiment of the present application.
The light-emitting component of the projection device can be implemented as a laser or as a light source such as an LED. The projection device and the projection display control scheme for automatically projecting into the curtain area provided by the present application are described below taking a laser-type projection device as an example.
In some embodiments, the projection device may include a laser light source 100, an optical engine 200, a lens 300, and a projection medium 400. The laser light source 100 provides illumination for the optical engine 200, and the optical engine 200 modulates light source beams, outputs the modulated light source beams to the lens 300 for imaging, and projects the modulated light source beams to the projection medium 400 to form a projection image.
In some embodiments, the laser light source of the projection device includes a projection assembly and an optical lens assembly; in the laser-type projection device provided by the present application, the projection assembly is embodied as a laser assembly, which is not described again below.
The light beam emitted by the laser assembly passes through the optical lens assembly to provide illumination for the optical engine. The optical lens assembly requires a higher level of environmental cleanliness and a hermetic-grade seal, while the chamber in which the laser assembly is installed can be sealed to a lower, dust-proof grade to reduce sealing cost.
In some embodiments, the light engine 200 of the projection apparatus may be implemented to include a blue light engine, a green light engine, a red light engine, a heat dissipation system, a circuit control system, and the like. It should be noted that, in some embodiments, the light emitting component of the projector may also be implemented by an LED light source.
In some embodiments, the present application provides a projection device comprising a three-color optical engine and a controller. The three-color optical engine is used to modulate and generate the laser light carrying the pixels of the user interface, and integrates a blue optical engine, a green optical engine and a red optical engine. The controller is configured to: acquire an average gray value of the user interface; and, when it determines that the average gray value is greater than a first threshold and has remained so for longer than a time threshold, control the operating current of the red optical engine to decrease by a preset gradient so as to reduce the heat generated by the three-color optical engine. It can be seen that overheating of the red optical engine can be controlled by reducing its operating current, thereby controlling overheating of the three-color optical engine and of the projection device as a whole.
The optical engine 200 may be implemented as such a three-color optical engine, integrating a blue optical engine, a green optical engine and a red optical engine; the technical solution provided by the present application is described below on this basis.
In some embodiments, the optical system of the projection device is composed of a light source part and an optical machine part, the light source part is used for providing illumination for the optical machine, the optical machine part is used for modulating illumination light beams provided by the light source, and finally the illumination light beams are emitted through the lens to form a projection picture.
In some embodiments, the light source portion specifically includes a housing, a laser assembly, and an optical lens assembly; the light beam emitted by the laser assembly is shaped and combined by the optical lens assembly to provide illumination for the optical engine. The laser assembly includes a light-emitting chip, collimating lenses, bonding wires and other devices, and is usually supplied as a packaged component. The optical lens assembly, which likewise consists of precision parts, places higher demands on environmental cleanliness: if dust settles on a lens surface, it will on the one hand impair the optical processing effect of the lens and attenuate the emitted brightness, ultimately degrading the image the projection device projects through its lens; on the other hand, dust can absorb the heat of the high-energy laser beam and easily damage the lens.
In some embodiments, the optical lens assembly includes at least a convex lens. The convex lens is part of a telescopic system, which generally consists of a convex lens and a concave lens and is used to shrink a laser beam of larger cross-section into one of smaller cross-section. The convex lens generally has a large surface and is usually arranged close to the light outlet of the laser, where it can receive a large-area laser beam; acting as a large window through which the beam passes, it reduces light loss.
The optical lens assembly may further include a concave lens, a light-combining lens, and a light-homogenizing or speckle-dissipating component, used to reshape and combine the laser beams to meet the requirements of the illumination system.
In some embodiments, the laser assembly includes a red laser module, a green laser module, and a blue laser module, and each laser module and the corresponding mounting port are mounted in a dust-proof sealing manner by a sealing ring (made of fluororubber or other sealing materials).
Fig. 2 is a schematic diagram illustrating a circuit architecture of a projection device according to an embodiment of the present application.
In some embodiments, the projection device provided by the present application includes multiple groups of lasers. By arranging a brightness sensor in the light-exit path of the laser light source, the sensor can detect a first brightness value of the laser light source and send it to the display control circuit.
The display control circuit can acquire a second brightness value corresponding to the driving current of each laser and, when the difference between a laser's second brightness value and its first brightness value is greater than a difference threshold, determine that the laser has a COD (catastrophic optical damage) fault. The display control circuit then adjusts the current control signal of the laser driving component corresponding to that laser until the difference is less than or equal to the difference threshold, thereby eliminating the COD fault. In this way the projection device can eliminate a laser's COD fault in time, reduce the damage rate of the laser, and ensure the image display effect of the projection device.
In some embodiments, the projection device may include a display control circuit 10, a laser light source 20, at least one laser driving assembly 30, and at least one brightness sensor 40, and the laser light source 20 may include at least one laser in one-to-one correspondence with the at least one laser driving assembly 30. Wherein, the at least one means one or more, and the plurality means two or more.
In some embodiments, the projection device includes a laser driving assembly 30 and a brightness sensor 40, and accordingly, the laser light source 20 includes three lasers, which may be a blue laser 201, a red laser 202, and a green laser 203, corresponding to the laser driving assembly 30 in a one-to-one correspondence. The blue laser 201 is used for emitting blue laser, the red laser 202 is used for emitting red laser, and the green laser 203 is used for emitting green laser. In some embodiments, the laser driving assembly 30 may be implemented to include a plurality of sub-laser driving assemblies, each corresponding to a laser of a different color.
The display control circuit 10 is configured to output primary-color enable signals and primary-color current control signals to the laser driving components 30 to drive the lasers to emit light. Specifically, as shown in fig. 2, the display control circuit 10 is connected to the laser driving components 30 and is configured to output, for the three primary colors of each frame of a multi-frame display image, at least one enable signal and transmit it to the corresponding laser driving component 30, and likewise to output at least one current control signal for those three primary colors and transmit it to the corresponding laser driving component 30. For example, the display control circuit 10 may be a microcontroller unit (MCU), also called a single-chip microcomputer, and the current control signal may be a pulse-width modulation (PWM) signal.
In some embodiments, the display control circuit 10 may output a blue PWM signal B _ PWM corresponding to the blue laser 201 based on a blue primary color component of an image to be displayed, a red PWM signal R _ PWM corresponding to the red laser 202 based on a red primary color component of the image to be displayed, and a green PWM signal G _ PWM corresponding to the green laser 203 based on a green primary color component of the image to be displayed. The display control circuit may output an enable signal B _ EN corresponding to the blue laser 201 based on a lighting period of the blue laser 201 in a drive period, output an enable signal R _ EN corresponding to the red laser 202 based on a lighting period of the red laser 202 in a drive period, and output an enable signal G _ EN corresponding to the green laser 203 based on a lighting period of the green laser 203 in a drive period.
The laser driving assembly 30 is connected to the corresponding laser for providing a corresponding driving current to the laser connected thereto in response to the received enable signal and the current control signal, and each laser is used for emitting light under the driving of the driving current provided by the laser driving assembly 30.
In some embodiments, blue laser 201, red laser 202, and green laser 203 are each connected to laser drive assembly 30. The laser driving assembly 30 may provide a corresponding driving current to the blue laser 201 in response to the blue PWM signal B _ PWM and the enable signal B _ EN transmitted from the display control circuit 10. The blue laser 201 is used to emit light under the drive of the drive current.
The brightness sensor is arranged in the light-emitting path of the laser light source, and is usually arranged at one side of the light-emitting path, so that the light path is not blocked. As shown in fig. 2, at least one brightness sensor 40 is disposed in the light outgoing path of the laser light source 20, and each brightness sensor is connected to the display control circuit 10, and is configured to detect a first brightness value of one laser and send the first brightness value to the display control circuit 10.
In some embodiments, the display control circuit 10 is further configured to obtain the second brightness value corresponding to the driving current of each laser. If it detects that the difference between a laser's second brightness value and its first brightness value is greater than the difference threshold, indicating that the laser has a COD fault, the display control circuit 10 adjusts the current control signal of the laser driving assembly 30 until the difference is less than or equal to the difference threshold; that is, the COD fault is eliminated by reducing the driving current of the laser. Specifically, both brightness values characterize the optical output power: the second brightness value may be pre-stored, or may be a brightness value previously returned by the brightness sensor in the normal lighting state. If the laser suffers a COD fault, its optical output power usually drops suddenly, and the first brightness value returned by the brightness sensor will be less than half the normal second brightness value. Once the fault is confirmed, the display control circuit reduces the current control signal of the laser driving component corresponding to the laser, and continues to collect and compare the brightness signal returned by the brightness sensor.
In some embodiments, if the detected difference between the second brightness value of the laser and the first brightness value of the laser is less than or equal to the difference threshold, indicating that the laser has no COD fault, the display control circuit 10 does not need to adjust the current control signal of the laser driving component 30 corresponding to the laser.
The display control circuit 10 may store a corresponding relationship between the current and the brightness value. The brightness value corresponding to each current in the corresponding relationship is an initial brightness value which can be emitted by the laser when the laser normally works under the driving of the current (namely, when no COD fault occurs). For example, the brightness value may be an initial brightness when the laser is first turned on when it is operated under the driving of the current.
In some embodiments, the display control circuit 10 may obtain, from the corresponding relationship, a second brightness value corresponding to a driving current of each laser, where the driving current is a current actual operating current of the laser, and the second brightness value corresponding to the driving current is a brightness value that can be emitted when the laser normally operates under the driving of the driving current. The difference threshold may be a fixed value stored in advance in the display control circuit 10.
In some embodiments, the display control circuit 10 may reduce the duty ratio of the current control signal of the laser driving component 30 corresponding to the laser when adjusting the current control signal of the laser driving component 30 corresponding to the laser, thereby reducing the driving current of the laser.
In some embodiments, the brightness sensor 40 may detect a first brightness value of the blue laser 201 and send the first brightness value to the display control circuit 10. The display control circuit 10 can obtain the driving current of the blue laser 201, and obtain the second brightness value corresponding to the driving current from the corresponding relationship between the current and the brightness value. Then, it is detected whether the difference between the second brightness value and the first brightness value is greater than a difference threshold, and if the difference is greater than the difference threshold, it indicates that the blue laser 201 has a COD fault, the display control circuit 10 may decrease the current control signal of the laser driving component 30 corresponding to the blue laser 201. The display control circuit 10 may then obtain the first brightness value of the blue laser 201 and the second brightness value corresponding to the driving current of the blue laser 201 again, and decrease the current control signal of the laser driving component 30 corresponding to the blue laser 201 again when the difference between the second brightness value and the first brightness value is greater than the difference threshold. And the operation is circulated until the difference is less than or equal to the difference threshold. Thereby eliminating the COD failure of the blue laser 201 by reducing the drive current of the blue laser 201.
In some embodiments, the display control circuit 10 may monitor in real time whether each laser has a COD fault, according to the first brightness value of each laser acquired by the at least one brightness sensor 40 and the second brightness value corresponding to each laser's driving current. When any laser is determined to have a COD fault, the fault is eliminated in time, which shortens the duration of the fault, reduces damage to the laser, and ensures the image display effect of the projection apparatus.
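As a minimal sketch of this monitoring loop (the function names, the duty-cycle step, and the current-to-brightness table below are all illustrative assumptions, not part of the embodiment):

```python
def regulate_laser(read_brightness, read_current, set_duty, duty,
                   current_to_brightness, diff_threshold, step=5):
    """Lower the PWM duty cycle (in percent) until the measured first
    brightness is within diff_threshold of the expected second brightness,
    mirroring the COD-fault feedback loop described above."""
    while duty > 0:
        first = read_brightness()                        # brightness sensor 40
        second = current_to_brightness[read_current()]   # stored mapping
        if second - first <= diff_threshold:             # no fault / recovered
            return duty
        duty = max(0, duty - step)                       # reduce drive current
        set_duty(duty)
    return duty
```

Here `read_brightness`, `read_current` and `set_duty` stand in for the sensor read, the amplified-voltage read-back, and the PWM output of the display control circuit 10.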
Fig. 3 shows a schematic structural diagram of a projection apparatus according to an embodiment of the present application.
In some embodiments, the laser light source 20 in the projection apparatus may include a blue laser 201, a red laser 202, and a green laser 203, which are independently disposed, and the projection apparatus may also be referred to as a three-color projection apparatus, where the blue laser 201, the red laser 202, and the green laser 203 are MCL-type packaged lasers, which are small in size and beneficial to the compact arrangement of light paths.
In some embodiments, referring to fig. 3, the at least one brightness sensor 40 may include a first brightness sensor 401, a second brightness sensor 402, and a third brightness sensor 403, wherein the first brightness sensor 401 is a blue or white brightness sensor, the second brightness sensor 402 is a red or white brightness sensor, and the third brightness sensor 403 is a green or white brightness sensor.
The first brightness sensor 401 is disposed in the light-emitting path of the blue laser 201, specifically on the light-emitting path side of the collimated light beam of the blue laser 201; similarly, the second brightness sensor 402 is disposed in the light-emitting path of the red laser 202, specifically on the light-emitting path side of the collimated light beam of the red laser 202, and the third brightness sensor 403 is disposed in the light-emitting path of the green laser 203, specifically on the light-emitting path side of the collimated light beam of the green laser 203. Because the emitted laser light is not attenuated within the light-emitting path, placing a brightness sensor in the laser's light-emitting path improves the precision with which it detects the laser's first brightness value.
The display control circuit 10 is also configured to read the brightness value detected by the first brightness sensor 401 when controlling the blue laser 201 to emit blue laser light, and to stop reading that brightness value when controlling the blue laser 201 to turn off.
The display control circuit 10 is further configured to read the brightness value detected by the second brightness sensor 402 when the red laser 202 is controlled to emit red laser, and stop reading the brightness value detected by the second brightness sensor 402 when the red laser 202 is controlled to be turned off.
The display control circuit 10 is further configured to read the brightness value detected by the third brightness sensor 403 when the green laser 203 is controlled to emit green laser, and to stop reading the brightness value detected by the third brightness sensor 403 when the green laser 203 is controlled to be turned off.
Fig. 4 shows a schematic structural diagram of a projection device according to another embodiment of the present application.
In some embodiments, the projection apparatus may further include a light guide 110, and the light guide 110 is used as a light collecting optical component for receiving and homogenizing the three-color laser beams in the combined light state.
In some embodiments, the at least one brightness sensor 40 may include a fourth brightness sensor 404, which is a white-light brightness sensor. The fourth brightness sensor 404 is disposed in the light-emitting path of the light guide 110, for example on the light-emitting side of the light guide, near the light-emitting surface.
The display control circuit 10 is further configured to read the brightness value detected by the fourth brightness sensor 404 when the blue laser 201, the red laser 202 and the green laser 203 are controlled to turn on in a time-sharing manner, which ensures that the fourth brightness sensor 404 can detect the first brightness value of each of the blue laser 201, the red laser 202 and the green laser 203, and to stop reading that brightness value when the blue laser 201, the red laser 202 and the green laser 203 are all controlled to turn off.
In some embodiments, the fourth brightness sensor 404 is always in an on state during the projection of the image by the projection device.
In some embodiments, referring to fig. 3 and 4, the projection device may further include a fourth dichroic plate 604, a fifth dichroic plate 605, a fifth mirror 904, a second lens assembly 90, a diffusion wheel 150, a TIR lens 120, a DMD 130, and a projection lens 140. The second lens assembly 90 comprises a first lens 901, a second lens 902 and a third lens 903. The fourth dichroic plate 604 transmits blue laser light and reflects green laser light. The fifth dichroic plate 605 transmits red laser light and reflects green and blue laser light.
The blue laser light emitted from the blue laser 201 passes through the fourth dichroic plate 604 and is reflected by the fifth dichroic plate 605 into the first lens 901 for condensing. The red laser light emitted from the red laser 202 passes through the fifth dichroic plate 605 and directly enters the first lens 901 to be condensed. The green laser light emitted from the green laser 203 is reflected by the fifth mirror 904, then reflected by the fourth dichroic plate 604 and the fifth dichroic plate 605 in sequence, and enters the first lens 901 to be condensed. The blue, red and green laser light condensed by the first lens 901 passes in a time-sharing manner through the rotating diffusion wheel 150 to eliminate speckle, is projected into the light guide 110 for light homogenization, enters the TIR lens 120 for total reflection after being shaped by the second lens 902 and the third lens 903, passes back through the TIR lens 120 after being reflected by the DMD 130, and is projected through the projection lens 140 onto a display screen to form the image to be displayed.
Fig. 5 shows a schematic circuit diagram of a projection apparatus according to an embodiment of the present application.
In some embodiments, the laser drive assembly 30 may include a drive circuit 301, a switching circuit 302, and an amplification circuit 303. The driving circuit 301 may be a driving chip. The switch circuit 302 may be a metal-oxide-semiconductor (MOS) transistor.
The driving circuit 301 is connected to the switching circuit 302, the amplifying circuit 303, and the corresponding laser included in the laser light source 20. The driving circuit 301 is configured to output a driving current to the corresponding laser in the laser light source 20 through the VOUT terminal based on the current control signal sent by the display control circuit 10, and to transmit a received enable signal to the switch circuit 302 through the ENOUT terminal. The laser may comprise n sub-lasers LD1 to LDn connected in series, where n is a positive integer.
The switch circuit 302 is connected in series in the current path of the laser and is configured to make the current path conduct when the received enable signal is at its active level.
The amplifying circuit 303 is connected to the detection node E in the current path of the laser light source 20 and to the display control circuit 10, respectively, and is configured to convert the detected driving current of the laser into a driving voltage, amplify that voltage, and transmit the amplified driving voltage to the display control circuit 10.
The display control circuit 10 is further configured to determine the amplified driving voltage as a driving current of the laser, and obtain a second brightness value corresponding to the driving current.
In some embodiments, the amplification circuit 303 may include: the circuit comprises a first operational amplifier A1, a first resistor (also called a sampling power resistor) R1, a second resistor R2, a third resistor R3 and a fourth resistor R4.
The non-inverting input terminal (also called positive terminal) of the first operational amplifier a1 is connected to one end of the second resistor R2, the inverting input terminal (also called negative terminal) of the first operational amplifier a1 is connected to one end of the third resistor R3 and one end of the fourth resistor R4, respectively, and the output terminal of the first operational amplifier a1 is connected to the other end of the fourth resistor R4 and the processing sub-circuit 3022, respectively. One end of the first resistor R1 is connected to the detection node E, and the other end of the first resistor R1 is connected to a reference power source terminal. The other end of the second resistor R2 is connected to the detection node E, and the other end of the third resistor R3 is connected to a reference power source terminal. The reference power source terminal is a ground terminal.
In some embodiments, the first operational amplifier a1 may further include two power supply terminals, one of which may be connected to the power supply terminal VCC and the other of which may be connected to the reference power supply terminal.
The driving current of the laser included in the laser light source 20 generates a voltage drop across the first resistor R1, and the voltage Vi at one end of R1 (i.e., at the detection node E) is transmitted through the second resistor R2 to the non-inverting input of the first operational amplifier a1, amplified N times by the first operational amplifier a1, and output. Here N is the amplification factor of the first operational amplifier a1, a positive number. The amplification factor N may be chosen such that the value of the voltage Vfb output by the first operational amplifier a1 is an integer multiple of the value of the laser's driving current. For example, the value of Vfb may equal that of the driving current, so that the display control circuit 10 can treat the amplified driving voltage directly as the driving current of the laser.
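As a numeric sketch: if R3 and R4 form the standard non-inverting feedback network, so that N = 1 + R4/R3 (an assumption about the topology — the embodiment only states the gain is N), the resistors can be chosen so that Vfb is numerically equal to the driving current. All component values below are illustrative.

```python
def sense_voltage(i_drive, r1):
    # Voltage Vi at detection node E: drop across sampling resistor R1.
    return i_drive * r1

def amplifier_gain(r3, r4):
    # Non-inverting op-amp gain, N = 1 + R4/R3 (assumed topology).
    return 1.0 + r4 / r3

# With R1 = 0.1 ohm, a gain of N = 10 gives Vfb = I * R1 * N = I,
# so the display control circuit can read Vfb directly as the current.
r1, r3, r4 = 0.1, 1000.0, 9000.0
i_drive = 3.0                          # amperes, illustrative
vfb = sense_voltage(i_drive, r1) * amplifier_gain(r3, r4)
```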
In some embodiments, the display control circuit 10, the driving circuit 301, the switching circuit 302 and the amplifying circuit 303 form a closed loop that implements feedback adjustment of the laser's driving current. The display control circuit 10 can thus adjust the driving current in time based on the difference between a laser's second brightness value and its first brightness value, that is, adjust the laser's actual emission brightness in time, avoiding a prolonged COD failure and improving the accuracy of the laser's light-emission control.
It should be noted that, referring to fig. 3 and 4, when the laser light source 20 includes a blue laser 201, a red laser 202 and a green laser 203, the blue laser 201 may be disposed at the L1 position, the red laser 202 at the L2 position, and the green laser 203 at the L3 position.
Referring to fig. 3 and 4, the laser light at the L1 position is transmitted once through the fourth dichroic plate 604 and reflected once by the fifth dichroic plate 605 before entering the first lens 901. The light efficiency at the L1 position is therefore P1 = Pt × Pf, where Pt denotes the transmittance of a dichroic plate and Pf denotes the reflectance of a dichroic plate or of the fifth mirror 904.
In some embodiments, among the three positions L1, L2 and L3, the light efficiency of the laser at the L3 position is the highest and that at the L1 position is the lowest. The maximum optical power Pb output by the blue laser 201 is 4.5 watts (W), the maximum optical power Pr output by the red laser 202 is 2.5 W, and the maximum optical power Pg output by the green laser 203 is 1.5 W; that is, the blue laser 201 outputs the largest maximum optical power, the red laser 202 the next largest, and the green laser 203 the smallest. Thus the green laser 203 is disposed at the L3 position, the red laser 202 at the L2 position, and the blue laser 201 at the L1 position; that is, the green laser 203 is disposed in the optical path with the highest light efficiency, ensuring that the projection apparatus obtains the highest overall light efficiency.
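Reading the light paths in fig. 3 together with the efficiency expression above, the three positions can be compared numerically. The transmittance and reflectance values, and the assumption that every reflector has the same reflectance Pf, are illustrative, not from the embodiment:

```python
# Illustrative optical constants (assumptions):
pt = 0.96        # dichroic-plate transmittance Pt
pf = 0.99        # dichroic-plate / mirror reflectance Pf

# Each transmission multiplies by Pt, each reflection by Pf:
p_l1 = pt * pf   # blue at L1: transmit plate 604, reflect off plate 605
p_l2 = pt        # red at L2: transmit plate 605 only
p_l3 = pf ** 3   # green at L3: mirror 904, then plates 604 and 605
```

With typical coatings (Pf slightly above Pt) this reproduces the ordering stated above: L3 highest, L1 lowest.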
In some embodiments, the display control circuit 10 is further configured to restore the current control signal of the laser driving component corresponding to a laser to an initial value when the difference between the laser's second brightness value and its first brightness value is less than or equal to the difference threshold, where the initial value is the magnitude of the PWM current control signal for that laser in the normal state. In this way, when a laser develops a COD fault it can be rapidly identified and measures to reduce the driving current taken in time, which limits continuing damage and allows the laser to recover on its own; the whole process requires no disassembly and no human intervention, improving the reliability of the laser light source and ensuring the projection display quality of the laser projection apparatus.
The embodiments of the application can be applied to various types of projection equipment. Hereinafter, taking a projector as an example, the projection equipment and the projection display control method for automatically throwing the screen area will be described.
The projector is a device capable of projecting images or videos onto a screen. Through different interfaces it can be connected to a computer, a broadcast television network, the Internet, a Video Compact Disc (VCD) player, a Digital Versatile Disc (DVD) player, a game machine, a DV camcorder and the like to play the corresponding video signals. Projection apparatuses are widely used in homes, offices, schools and entertainment venues.
Fig. 6A is a schematic diagram of a curtain corresponding to a projector according to an embodiment of the present application.
In some embodiments, the projection curtain is a tool used for displaying images and video files in settings such as movies, offices, home theaters and large conferences, and can be made in different sizes according to actual needs. In some embodiments, to make the display effect better match users' viewing habits, the aspect ratio of the curtain corresponding to the projector is usually set to 16:9, and the laser assembly can project the playing content onto the curtain corresponding to the projector, as shown in fig. 6A.
Most screens look white, but different screens have different reflection characteristics, and different reflectivities for light of different colors; common screens include diffuse-reflection screens and regression-type (retro-reflective) screens.
The white plastic screen is a typical diffuse-scattering screen: it scatters the projector's incident light uniformly in all directions, so the same image can be seen from every angle. A diffuse-reflection screen offers an ultra-wide viewing range and a soft image, but the influence of external and stray light must be considered; in an environment with external or stray light, that light is scattered and reflected together with the image light and overlaps it, lowering the image quality. A conventional diffuse-scattering screen therefore performs best when the projection application is used without external or stray light.
It should be noted that a wall also scatters light diffusely, but because a wall has not undergone treatments such as color correction and light absorption, an image displayed on it suffers from color misalignment, dispersion, spurious light in dark areas, and insufficient brightness and contrast; a wall is therefore not a good choice as a screen.
The glass-bead screen is a typical regression-type screen: because the glass beads on its surface reflect light back around the incident direction of the projection light, a bright and vivid image can be seen at a normal viewing position. Since the screen's diffuse scattering is suppressed to a certain extent, the image seen near the front of the screen differs from that seen at a large angle; near the front of the screen an image with good brightness, contrast and gradation can be seen. In an environment with external or stray light, because the screen reflects that light back along its own incident direction, the projector's image light overlaps very little with the external and stray light, so a vivid image can still be obtained.
It can be understood that a wide-viewing-angle, low-gain screen is suitable when there are many viewers or viewers sit far off to the sides, while a narrow-viewing-angle, high-gain screen suits viewing in a long, narrow space. Choosing a screen with appropriate gain helps improve contrast, increase the gray levels of the image, brighten colors and enhance the visibility of the image. Both diffuse-reflection and regression-type screens can be used in places with good light shading and light absorption, and a regression-type screen can be chosen for a family living room; any screen may be chosen when the projector is placed on a desktop, while a diffuse-reflection screen, or one with a larger half-value angle, should be chosen when the projector is ceiling-mounted.
In some embodiments, the peripheral edge of the curtain corresponding to the projector carries a dark edge line; this edge line generally has a certain width, so it may also be referred to as an edge band. The projection equipment and the projection display control method for automatically throwing the screen area provided by the present application use the edge-band characteristics of the curtain to identify the curtain in the environment stably, efficiently and accurately, and to re-enter the curtain quickly and automatically after the projector has been moved; the curtain is shown in fig. 6A.
In some embodiments, the projector provided by the application is equipped with a camera. After a user moves the projector, in order for the laser assembly to again project the playing content accurately onto the projection area of the curtain, the controller controls the camera to photograph the environment in which the projection device is located, obtaining a first image; the position of the curtain's projection area can then be determined by analyzing and processing the first image. That is, the controller controls the projector's camera to capture a first image of the area where the curtain is located, which reduces the amount of computation needed by the subsequent curtain-identification algorithm.
The first image may include various environmental objects, such as a curtain, a television cabinet, a wall, a ceiling and a tea table; among these, the projector provided by the present application needs to find the curtain, whose edge is dark in color, as shown in fig. 6B.
The controller of the projector analyzes and processes the first image, identifies the curtain among the environmental factors it contains through an algorithm, and controls the projection direction of the laser assembly so that the playing content is projected accurately onto the projection area of the curtain.
In some embodiments, in order to identify the curtain projection area among the environmental factors in the first image more easily and accurately, the controller analyzes the brightness of the gray-scale map of the first image at the moment the environmental image is acquired to obtain the most appropriate binarization threshold, and binarizes the first image with it to obtain a corresponding second image. Using the most appropriate binarization threshold keeps the extracted outline of each environmental element in the second image as clear as possible, which facilitates the closed-contour extraction in the subsequent algorithm.
First, the controller generates a corresponding gray-scale distribution, namely a gray-scale map, from the acquired first image; in fig. 6C, the right-hand image is the gray-scale map corresponding to the first image.
Then the controller determines the gray-scale value corresponding to the brightest region in the gray-scale map of the first image, where the 0-255 gray-scale map reflects the proportion that the brightest part of the first image occupies in the whole image; for example, the gray-scale value of the brightest part of the gray-scale map in fig. 6C is assumed to be 130.
Finally, centered on the acquired gray-scale value, the controller selects a preset number of gray-scale values within a preset range above and below it as candidate thresholds and binarizes the first image repeatedly until the extraction of the curtain's typical features from the result meets a preset condition, yielding the second image; the gray-scale value so selected is the binarization threshold used to obtain the second image.
For example, with the black region in fig. 6C fixed, the starting point of the binarization threshold for the first image may be provisionally set to 130. The controller then takes 120, 122, 124, 126, 128, 130, 132, 134, 136, 138 and 140 in turn as binarization thresholds and binarizes the first image with each, obtaining several binarized images; it then analyzes these binarized images and identifies the one containing the curtain features as the second image. The combination of curtain features, namely the dark curtain edge band and the white curtain projection area, appears in a binarized image as a secondary closed contour contained within a primary closed contour, as shown in fig. 6D.
In some embodiments, a fixed value could be chosen as the binarization threshold for the first image; however, a photograph of the curtain area taken by the projector's camera is sometimes poor, because the shooting environment strongly affects the final image, and if, for example, a high threshold is chosen at night, most areas may be binarized into edges.
It can be understood that the most accurate way to binarize the first image into the second image is to traverse all thresholds from 0 to 255 and analyze the edge image of each resulting binarized image to find the one with the best edge quality; however, such a complete traversal is computationally expensive. The projection display control method for automatically throwing the screen area provided by the present application therefore uses gray-scale-map brightness analysis to build a preferred threshold interval, and performs the traversal detection only within that interval to obtain the best binarized image.
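The restricted sweep can be sketched in plain Python on a list-of-lists grayscale image; the window width, the step of 2 gray levels, and the `has_curtain_features` predicate (standing in for the nested-contour check of the later steps) are illustrative assumptions:

```python
def binarize(gray, threshold):
    # Pixels at or above the threshold become 1 (white), the rest 0 (dark).
    return [[1 if px >= threshold else 0 for px in row] for row in gray]

def candidate_thresholds(peak_gray, half_window=10, step=2):
    """Thresholds within a preset range around the gray level of the
    brightest region, instead of a full 0-255 traversal."""
    lo = max(0, peak_gray - half_window)
    hi = min(255, peak_gray + half_window)
    return list(range(lo, hi + 1, step))

def sweep(gray, peak_gray, has_curtain_features):
    # has_curtain_features(binary) stands in for the nested-contour test
    # that identifies the edge band enclosing the white curtain area.
    for t in candidate_thresholds(peak_gray):
        binary = binarize(gray, t)
        if has_curtain_features(binary):
            return t, binary
    return None, None
```

With `peak_gray = 130`, `candidate_thresholds` reproduces the 120, 122, ..., 140 sequence of the example above.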
In some embodiments, after the projector's controller completes the binarization of the first image, it identifies and extracts the closed contours contained in the second image; when a closed contour itself contains a secondary closed contour, that combination can tentatively be judged to match the color features of the curtain to a certain extent.
It can be understood that, owing to the color and structural features of the curtain, during closed-contour identification the outer edge lines of the curtain's edge band are identified as the larger closed contour, which may also be called the primary closed contour, while the inner edge lines of the edge band are identified as a smaller closed contour, which may also be called the secondary closed contour; that is, the controller determines the primary closed contour contained in the second image and the secondary closed contour contained within it.
For example, in a second image containing several environmental-factor images, the controller analyzes and identifies the closed contour corresponding to each of them and looks for closed contours with this hierarchical relationship as candidate curtain projection areas; that is, the controller first acquires all primary closed contours in the second image that contain secondary closed contours, and rejects single-layer closed contours that contain none.
The primary closed contour may also be called the parent closed contour, and the secondary closed contour contained in it the child closed contour; that is, the primary and secondary closed contours stand in a parent-child relationship. It can be understood that among the parent closed contours containing child closed contours identified in the second image, one necessarily corresponds to the curtain; in other words, only a region comprising both the parent and child contour layers can be a candidate for the curtain's projection area.
For example, in the closed-contour diagram shown in fig. 6E, four closed contours A, B, C and D are identified in total. A is a primary closed contour, i.e. a parent closed contour; B, C and D are all secondary closed contours, i.e. child closed contours, enclosed by contour A. The controller takes the B, C and D secondary closed contours as candidate areas for the curtain's projection area and continues the algorithmic identification.
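The parent/child screening can be sketched as follows. Contours are represented here as (id, parent_id) pairs, loosely modeled on a contour-hierarchy table such as OpenCV's (an assumption — the embodiment names no library), and the single-layer contour E is a hypothetical addition to the fig. 6E example:

```python
def screen_candidates(contours):
    """contours: list of (contour_id, parent_id) pairs, parent_id None for
    top-level (primary) contours.  Keeps only primary contours that contain
    at least one secondary contour, and returns those secondary contours
    as candidate curtain projection areas."""
    children = {}
    for cid, parent in contours:
        if parent is not None:
            children.setdefault(parent, []).append(cid)
    primaries = [cid for cid, parent in contours
                 if parent is None and cid in children]
    candidates = [c for p in primaries for c in children[p]]
    return primaries, candidates

# Fig. 6E example: A is a primary contour containing B, C and D;
# E is a hypothetical single-layer contour with no children, so it is rejected.
hierarchy = [("A", None), ("B", "A"), ("C", "A"), ("D", "A"), ("E", None)]
```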
In some embodiments, the controller performs a convex-quadrilateral determination on each acquired secondary closed contour in the candidate regions; if a secondary closed contour is a convex quadrilateral, the controller recognizes it as the projection area corresponding to the curtain, i.e. the projection area enclosed by the curtain's dark edge band, and controls the projector's laser assembly to project onto that secondary closed contour, so that the playing content accurately covers the curtain's projection area.
First, after the camera acquires the first image of the environment in which the projector is located, the controller binarizes the first image into the second image so that the closed contours in the image can be extracted more accurately, and then extracts the various closed contours contained in the second image so as to better reflect the environmental elements they correspond to.
It can be seen that environmental elements in the living room of the first image, such as furniture, home appliances, articles and walls, are recognized well in the binarized image as long as their colors stand in a dark-light contrast with their surroundings; for example, the curtain edge-band area, the white curtain area inside the curtain edge, the television-cabinet area, the sofa and the tea table marked in the figure can all be accurately recognized by the controller.
Then the controller performs polygon fitting on the secondary closed contours in the regions satisfying the hierarchical relationship; because the curtain projection area is a standard rectangle, the controller keeps as candidate areas only those secondary closed contours whose fitting result is a quadrilateral. Closed contours fitted as triangles, circles, pentagons or other irregular shapes are thus rejected, so that the subsequent algorithm can continue to identify the rectangular closed contour corresponding to the curtain projection area.
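As a simplified stand-in for the polygon fitting (real implementations typically use a polyline simplification such as OpenCV's approxPolyDP — an assumption, the embodiment names none), the side count of a sampled contour can be taken as the number of vertices at which the boundary actually turns:

```python
def corner_count(contour, tol=1e-9):
    """Count vertices of a closed contour (ordered (x, y) points) where the
    cross product of adjacent edges is non-zero, i.e. the boundary turns."""
    n = len(contour)
    corners = 0
    for i in range(n):
        ax, ay = contour[i - 1]          # wraps to the last point at i = 0
        bx, by = contour[i]
        cx, cy = contour[(i + 1) % n]
        cross = (bx - ax) * (cy - by) - (by - ay) * (cx - bx)
        if abs(cross) > tol:
            corners += 1
    return corners

def is_quadrilateral(contour):
    # Keep only contours that fit a four-sided polygon (candidate curtain).
    return corner_count(contour) == 4
```

Collinear sample points along a side (such as edge midpoints) contribute no corner, so a sampled rectangle still counts as four-sided.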
After the quadrilateral secondary closed contours are identified, the controller judges the concavity or convexity of each candidate area, i.e. of each quadrilateral secondary closed contour, and takes the candidate area that is a convex quadrilateral as the projection area corresponding to the curtain.
In a concave quadrilateral, some side, when extended in both directions, leaves the other sides on different sides of the resulting line; in a convex quadrilateral, extending any side in both directions leaves all the other sides on the same side of the resulting line.
A concave quadrilateral differs from a convex one in that exactly one of its interior angles is greater than 180° and less than 360°; of the remaining three angles, the two adjacent to the largest angle must be acute, while the angle opposite the largest angle may be acute, right or obtuse, and the exterior angle at the largest vertex equals the sum of the other three interior angles. A convex quadrilateral has no interior angle greater than 180°, and the line through any side does not cross the other sides, i.e. the other three sides lie on one side of the line through the fourth side, and the sum of any three sides is greater than the fourth. For example, assuming the identified secondary closed contours are as shown in fig. 6E, the controller performs polygon fitting on the three secondary closed contours B, C and D.
B is fitted as a quadrilateral closed contour, C as a pentagonal closed contour, and D as a near-circular closed contour; therefore the quadrilateral closed contour B is retained as the secondary-closed-contour candidate for the curtain's projection area, while the C and D closed contours are removed from the candidate set.
Given the structural shape features of the curtain, the controller then judges the concavity or convexity of the acquired quadrilateral closed contour B; since an actual curtain can only be a convex quadrilateral, the controller may perform the determination with a concave-convex quadrilateral decision algorithm, which may be implemented, for example, as follows:
For a convex quadrilateral, connecting any two non-adjacent vertices yields two triangles whose areas sum to the area of the original quadrilateral; for a concave quadrilateral, the two triangles obtained by connecting the diagonal that does not pass through the concave vertex have areas that do not sum to the area of the original quadrilateral. By applying this convex-quadrilateral decision algorithm, images corresponding to other, irrelevant environmental factors contained in the primary closed contour can be removed, so that the convex-quadrilateral secondary closed contour corresponding to the curtain projection area is obtained accurately.
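The diagonal-split area test can be sketched in plain Python (a sketch of the decision rule, not the embodiment's implementation):

```python
def tri_area(a, b, c):
    # Unsigned area of triangle abc (cross-product / shoelace formula).
    return abs((b[0] - a[0]) * (c[1] - a[1])
               - (c[0] - a[0]) * (b[1] - a[1])) / 2.0

def quad_area(p):
    # Unsigned shoelace area of quadrilateral p0..p3 in boundary order.
    s = sum(p[i][0] * p[(i + 1) % 4][1] - p[(i + 1) % 4][0] * p[i][1]
            for i in range(4))
    return abs(s) / 2.0

def is_convex_quad(p, eps=1e-9):
    """Convex iff, for BOTH diagonals, the two triangles the diagonal cuts
    off have areas summing to the quadrilateral's own area; for a concave
    quadrilateral, the diagonal that misses the reflex vertex over-counts."""
    a = quad_area(p)
    split_02 = tri_area(p[0], p[1], p[2]) + tri_area(p[0], p[2], p[3])
    split_13 = tri_area(p[1], p[2], p[3]) + tri_area(p[1], p[3], p[0])
    return abs(split_02 - a) < eps and abs(split_13 - a) < eps
```

Both diagonals are checked because the diagonal through the reflex vertex of a concave quadrilateral still splits it cleanly; only the other diagonal exposes the concavity.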
It can be understood that, through the above algorithm steps, the accuracy with which the projector identifies the projection area of the curtain is improved; the curtain can be identified whether or not it is a solid color, and the curtain projection area can be extracted while the projector is playing any picture.
When the secondary closed contour identified by the controller further contains a tertiary closed contour, the tertiary closed contour corresponds to the image generated by the playing content; upon detecting that the secondary closed contour contains a tertiary closed contour, the controller does not extract or analyze the tertiary closed contour. In this way, even if the projector is moved while it is working, automatic screen entry can still be realized and the projection content is accurately cast into the curtain projection area.
It will be appreciated that in the above algorithmic implementation of identifying the projected area of the curtain, the dark edge band of the curtain corresponds to the primary closed contour identified by the controller, and the white curtain area inside the dark edge band corresponds to the convex quadrilateral secondary closed contour identified by the controller.
In some embodiments, the controller uses the closed contour with the largest area in the first image as the identification condition of the projection area of the curtain.
After the projector acquires the first image of the environment, a fixed binarization threshold may be selected, for example binarizing the first image at 20% brightness to obtain the second image; it may then happen that some area of the curtain, such as the lower-right corner, fails to form a closed contour, so that the subsequent closed-contour extraction contains obvious errors. The controller then searches for the closed contour with the largest area among all closed contours and judges whether the color inside it is consistent; if the color inside a closed contour is consistent and its area is the largest, that closed contour is determined to be the curtain projection area, as shown in fig. 6F.
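For illustration, a minimal Python sketch of this fixed-threshold branch follows, with hypothetical helper names; a production implementation would use OpenCV's `findContours` rather than this toy flood-fill labeling:

```python
def threshold(img, t):
    # fixed binarization: pixel -> 1 if brightness >= t else 0
    return [[1 if v >= t else 0 for v in row] for row in img]

def largest_region(mask):
    # 4-connected component labeling via iterative flood fill;
    # returns the pixel set of the largest foreground region
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    best = set()
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                comp, stack = set(), [(y, x)]
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    comp.add((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if len(comp) > len(best):
                    best = comp
    return best

def is_uniform(img, region, tol=10):
    # "color inside the closed contour is consistent" check (tol assumed)
    vals = [img[y][x] for y, x in region]
    return max(vals) - min(vals) <= tol
```

The largest region passing the uniformity check would be taken as the curtain projection area under this scheme.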
However, even if the projector can extract an accurate closed contour, when a large solid-color closed contour region exists in the captured first image, the final result of this algorithm may be biased, for example by the large solid-color wall region shown in fig. 6F.
Based on the above display control scheme by which the display device automatically casts the playing content into the curtain projection area, and on the introduction of the related drawings, the application also provides a projection display control method for automatically projecting into the curtain area, the method comprising: binarizing a first image to obtain a second image based on the brightness analysis of the gray-scale map of the first image at the acquisition moment, wherein the first image is an environment image; determining a primary closed contour contained in the second image, wherein the primary closed contour contains a secondary closed contour; and, when the secondary closed contour is judged to be a convex quadrilateral, projecting the playing content onto the secondary closed contour so as to cover the projection area corresponding to the curtain; wherein the curtain comprises a curtain edge band corresponding to the primary closed contour, the projection area is surrounded by the curtain edge band, and the curtain is used for displaying the projection of the playing content. The method has been elaborated in the display control scheme by which the display device automatically casts the playing content into the curtain projection area, and is not described herein again.
In some embodiments, binarizing the first image to obtain a second image based on the luminance analysis of the gray-scale image of the first image at the time of acquisition specifically includes: determining a gray scale value corresponding to the maximum brightness ratio in the first image gray scale image; and repeatedly binarizing the first image by taking the gray-scale values as the center and selecting a preset number of gray-scale values as a threshold value, and extracting the typical feature of the curtain to obtain a binarized image reaching a preset condition as a second image. The method has been elaborated in detail in a display control scheme for realizing that the display device automatically puts the playing content into the projection area of the curtain, and is not described herein again.
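A minimal sketch of this threshold-sweep idea follows, with hypothetical names and an assumed sweep width; each candidate threshold would be tried in turn until the extracted contours satisfy the preset condition:

```python
def peak_gray(gray):
    # gray level accounting for the largest share of pixels
    # (the "gray scale value corresponding to the maximum brightness ratio")
    hist = [0] * 256
    for row in gray:
        for v in row:
            hist[v] += 1
    return max(range(256), key=lambda g: hist[g])

def threshold_sweep(gray, span=5):
    # candidate thresholds centred on the histogram peak; the
    # "preset number" of levels (2*span + 1 here) is an assumption
    p = peak_gray(gray)
    return [max(0, min(255, p + d)) for d in range(-span, span + 1)]
```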
In some embodiments, determining the projection area corresponding to the curtain containing the secondary closed contour in the second image specifically includes: acquiring all primary closed contours, including the secondary closed contours, in the second image; performing polygon fitting on the secondary closed contours, and judging a secondary closed contour whose fitting result is a quadrilateral contour to be a candidate region of the curtain projection area; and judging the concavity or convexity of the curtain projection area candidate region, and determining a candidate region whose result is a convex quadrilateral to be the projection area corresponding to the curtain. The method has been elaborated in the display control scheme by which the display device automatically casts the playing content into the curtain projection area, and is not described herein again.
In some embodiments, in determining the projection region corresponding to the curtain containing the secondary closed contour in the second image, the method further comprises: when the second-level closed contour also comprises a third-level closed contour generated by playing the content, the third-level closed contour is not subjected to extraction analysis. The method has been elaborated in detail in a display control scheme for realizing that the display device automatically puts the playing content into the projection area of the curtain, and is not described herein again.
In some embodiments, acquiring the first image of the environment in which the projection apparatus is located includes acquiring, by the controller, the first image of the area in which the curtain is located. The method has been elaborated in the display control scheme by which the display device automatically casts the playing content into the curtain projection area, and is not described herein again.
In some embodiments, the laser light of the projector is reflected by the nanoscale micromirrors of a digital micromirror device (DMD) chip; the optical lens is likewise a precision element, and when the image plane and the object plane are not parallel, the image projected onto the screen is geometrically distorted.
Fig. 7A is a schematic diagram of a system framework for implementing display control of a projection device according to an embodiment of the present application.
In some embodiments, the projector provided by the application has the characteristic of long-focus micro-projection and comprises a controller; through preset algorithms, the controller can perform display control on the projected picture so as to realize functions such as automatic trapezoidal correction, automatic screen entry, automatic obstacle avoidance, automatic focusing, and eye protection.
It can be understood that, through the display control method based on geometric correction provided by the application, the projector can be moved flexibly in a long-focus micro-projection scene; for problems that may occur after each movement of the equipment, such as distortion of the projection picture, foreign matter blocking the projection plane, or the projection picture leaving the curtain, the controller can control the projector to perform automatic display correction, so that the projector automatically returns to normal display.
In some embodiments, the display control system based on geometric correction provided by the present application includes an application Service layer (APK Service), a Service layer, and an underlying algorithm library.
The application program service layer is used for realizing interaction between the projector and the user; based on the display of the user interface, the user can configure various parameters and display pictures of the projector, and the controller can realize the function of automatically correcting the display pictures of the projector when the display of the projector is abnormal by coordinating and calling the algorithm service corresponding to various functions.
The Service layer may include contents such as a correction service, a camera service, and a Time of Flight (TOF) service; upward, the service layer interfaces with the application program service layer (APK Service) to realize the specific functions configured by the different services of the projector; downward, the service layer connects to the algorithm library and to data acquisition services such as the camera and the time-of-flight sensor, encapsulating the complex logic of the bottom layer and transmitting service data to the corresponding services.
The underlying algorithm library may provide correction services, and control algorithms for the projector to perform various functions, and may perform various mathematical operations based on OpenCV, for example, to provide basic capabilities for the correction services. OpenCV is a cross-platform computer vision and machine learning software library issued based on BSD license (open source), and can run on operating systems such as Linux, Windows, Android, and Mac OS.
In some embodiments, the projector is further configured with a gyroscope sensor; in the moving process of the projector, the gyroscope sensor can sense position movement and actively acquire moving data, then the acquired data are sent to the application program service layer through the system framework layer so as to support application data required in the user interface interaction and application program interaction processes, and the acquired data can also be used for data calling of the controller in the implementation of algorithm service.
In some embodiments, the projector is configured with a Time of Flight (TOF) sensor, and after the Time of Flight sensor collects corresponding data, the data is sent to a Time of Flight service corresponding to a service layer;
the time-of-flight service continues to send the data collected by the time-of-flight sensor through the process communication framework (HSP Core) to the projector's application services layer, which data is used for data calls by the controller and interactive use with the user interface, program applications.
In some embodiments, the projector is configured with a camera for capturing images, which may be implemented, for example, as a binocular camera, or a depth camera, etc.; the acquired data is sent to a camera service, and then the camera service sends the image data acquired by the binocular camera to a process communication framework (HSP Core) and/or a projector correction service for realizing the function of the projector.
In some embodiments, the projector calibration service may receive camera acquisition data sent by the camera service, and the controller may invoke respective corresponding control algorithms in the algorithm library for different functions that need to be implemented.
In some embodiments, data interaction can be performed with the application service through the process communication framework, then the calculation result is fed back to the correction service through the process communication framework, the correction service sends the obtained calculation result to the projector operating system to generate a corresponding control signaling, and the control signaling is sent to the optical-mechanical control driver to control the working condition of the optical-mechanical and realize automatic correction of the display effect.
Fig. 7B is a schematic timing diagram of signaling interaction of a projection apparatus implementing the eye-protection function according to another embodiment of the present application.
In some embodiments, the projector provided by the application can realize an eye-protection function. To prevent the danger of eyesight damage when a user accidentally enters the laser emission path of the projector, when the user enters a preset unsafe area of the projector, the controller can control the user interface to display corresponding prompt information reminding the user to leave the current area, and can also control the user interface to reduce the display brightness to prevent the laser from damaging the user's eyesight.
In some embodiments, the controller automatically turns on the eye-protection switch when the projector is configured in the child viewing mode.
In some embodiments, after the controller receives position movement data sent by the gyroscope sensor, or foreign object intrusion data collected by other sensors, the controller controls the projector to turn on the eye-protection switch.
In some embodiments, when data collected by a time of flight (TOF) sensor, a camera device, or the like triggers any preset threshold condition, the controller controls the user interface to reduce display brightness, display prompt information, and reduce the emission power, brightness, and intensity of the optical engine, so as to protect the eyesight of the user.
In some embodiments, the projector controller may control the correction service to send signaling to the time-of-flight sensor to query the current device status of the projector, and then the controller receives the data fed back by the time-of-flight sensor.
The correction service may send, to the process communication framework (HSP Core), signaling notifying the algorithm service to start the eye-protection procedure; the process communication framework (HSP Core) then calls the corresponding algorithm services from the algorithm library, which may include a capture detection algorithm, a screenshot algorithm, a foreign object detection algorithm, and the like;
the process communication framework (HSP Core) returns a foreign object detection result to a correction service based on the algorithm service; for the returned result, if the preset threshold condition is reached, the controller controls the user interface to display the prompt information and reduce the display brightness, and the signaling timing sequence is shown in fig. 7B.
In some embodiments, when the eye-protection switch of the projector is in the on state and a user enters the preset specific area, the projector automatically reduces the intensity of the laser emitted by the optical machine, reduces the display brightness of the user interface, and displays safety prompt information. The projector can control the eye-protection function by the following method:
the controller identifies the projection area of the projector by applying an edge detection algorithm to the projection picture acquired by the camera; when the projection area is displayed as a rectangle or a quasi-rectangle, the controller acquires the coordinate values of the four vertices of the rectangular projection area through a preset algorithm;
when the foreign matter detection in the projection area is realized, the projection area can be corrected to be rectangular by using a perspective transformation method, and the difference value between the rectangular and the projection screenshot is calculated to judge whether the foreign matter exists in the display area; if the judgment result shows that foreign matters exist, the projector automatically triggers the anti-eye-shooting function to start.
For foreign object detection in a certain area outside the projection range, the difference between the camera content of the current frame and that of the previous frame can be taken to judge whether a foreign object has entered the area outside the projection range; if the projector judges that a foreign object has entered, it automatically triggers the eye-protection function.
At the same time, the projector may also detect real-time depth changes of a specific area using a time-of-flight (ToF) camera or a time-of-flight sensor; if the change in depth value exceeds a preset threshold, the projector automatically triggers the eye-protection function.
In some embodiments, the projector determines whether the eye-protection function needs to be turned on based on analysis of the collected time-of-flight data, screenshot data, and camera data.
For example, the controller performs a depth difference analysis on the collected time-of-flight data; if the depth difference is greater than a preset threshold X (which may be implemented as 0, for instance), it can be determined that a foreign object is in the specific area of the projector. If a user is located in this specific area, the laser could damage the user's eyesight, so the projector automatically starts the eye-protection function to reduce the intensity of the laser emitted by the optical machine, reduce the display brightness of the user interface, and display safety prompt information.
For another example, the projector performs an additive color (RGB) difference analysis on the acquired screenshot data; if the additive color difference is greater than a preset threshold Y, it can be determined that a foreign object is in the specific area of the projector. If a user is in this specific area, the laser could damage the user's eyesight, so the projector automatically starts the eye-protection function, reduces the intensity of the emitted laser, reduces the display brightness of the user interface, and displays corresponding safety prompt information.
For another example, the projector acquires projection coordinates from the acquired camera data, determines its projection area from these coordinates, and then performs an additive color (RGB) difference analysis within the projection area; if the additive color difference is greater than the preset threshold Y, it can be determined that a foreign object is in the specific area of the projector. If a user in this specific area risks eyesight damage from the laser, the projector automatically starts the eye-protection function, reduces the intensity of the emitted laser, reduces the display brightness of the user interface, and displays corresponding safety prompt information.
If the acquired projection coordinates fall in an extended area, the controller can still perform the additive color (RGB) difference analysis in the extended area; if the additive color difference is greater than the preset threshold Y, it can be determined that a foreign object is in the specific area of the projector. If a user in this specific area could have their eyesight damaged by the laser emitted by the projector, the projector automatically starts the eye-protection function, reduces the intensity of the emitted laser, reduces the display brightness of the user interface, and displays corresponding safety prompt information, as shown in fig. 7G.
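The analyses above can be sketched as follows; the numeric values standing in for the thresholds X and Y are placeholders, not the patent's settings:

```python
def rgb_region_diff(frame_a, frame_b, region):
    # mean absolute per-channel RGB difference inside `region`,
    # where region is an iterable of (y, x) pixel coordinates
    total = n = 0
    for y, x in region:
        for c in range(3):
            total += abs(frame_a[y][x][c] - frame_b[y][x][c])
            n += 1
    return total / n

def should_protect(depth_diff, rgb_diff, x_thresh=0.0, y_thresh=30.0):
    # eye protection triggers when either analysis trips its threshold
    return depth_diff > x_thresh or rgb_diff > y_thresh
```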
Fig. 7C is a schematic signaling interaction timing diagram illustrating a projection device implementing a display screen correction function according to another embodiment of the present application.
In some embodiments, the projector may monitor device movement through a gyroscope or gyroscope sensor. The correction service sends signaling to the gyroscope to query the device state, and receives the returned signaling to determine whether the device has moved.
In some embodiments, the display correction strategy of the projector may be configured so that, when the gyroscope and the time-of-flight sensor change simultaneously, the projector preferentially triggers trapezoidal correction; the controller starts trapezoidal correction only after the gyroscope data has been stable for a preset duration. The controller may further configure the projector not to respond to remote-control key instructions while trapezoidal correction is in progress; and, to support the correction, the projector projects a pure white graphic card.
The trapezoidal correction algorithm can construct a projection plane and a light machine coordinate system conversion matrix under a world coordinate system based on a binocular camera; and further combining the parameters in the optical machine to calculate the homography of the projection picture and the playing card, and realizing the arbitrary shape conversion between the projection picture and the playing card by utilizing the homography.
In some embodiments, the correction service sends a signaling for notifying the algorithm service to start the trapezoidal correction process to a process communication framework (HSP CORE), and the process communication framework further sends a service capability call signaling to the algorithm service to acquire an algorithm corresponding to the capability;
the algorithm service acquires and executes photographing and picture algorithm processing services and obstacle avoidance algorithm services, and sends the services to a process communication framework in a signaling carrying manner; in some embodiments, the process communication framework executes the above algorithm and feeds back the execution results, which may include a photographing success and an obstacle avoidance success, to the correction service.
In some embodiments, if an error occurs while the projector executes an algorithm or transmits data, the correction service controls the user interface to display an error prompt and to project the trapezoidal correction and auto-focus graphic card again.
Through an automatic obstacle avoidance algorithm, the projector can identify the curtain; and the projection picture is corrected to the interior of the curtain for display by utilizing the projection change, so that the effect of aligning with the edge of the curtain is realized.
Through an automatic focusing algorithm, the projector can acquire the distance between the optical machine and the projection surface by using a time-of-flight (ToF) sensor, search the optimal image distance in a preset mapping table based on the distance, evaluate the definition degree of a projection image by using an image algorithm, and realize fine adjustment of the image distance according to the optimal image distance.
In some embodiments, the automatic trapezoidal correction signaling sent by the correction service to the process communication framework may include other function configuration instructions, for example control instructions such as whether to implement synchronous obstacle avoidance and whether to perform screen entry.
The process communication framework sends a service capability calling signaling to the algorithm service, so that the algorithm service acquires and executes an automatic focusing algorithm to realize the adjustment of the visual distance between the equipment and the curtain; in some embodiments, after the auto-focusing algorithm is applied to realize the corresponding function, the algorithm service may further obtain to execute an auto-entry algorithm, which may include a trapezoidal correction algorithm.
In some embodiments, when the projector performs automatic screen entry, the algorithm service may set 8 position coordinates between the projector and the curtain; the viewing distance between the projector and the curtain is then adjusted by the auto-focusing algorithm; finally, the correction result is fed back to the correction service, and the user interface is controlled to display the correction result, as shown in fig. 7C.
In some embodiments, the projector may obtain the current object distance by using the configured laser ranging through an auto-focusing algorithm to calculate the initial focal length and the search range; the projector then drives the Camera (Camera) to take a picture and a corresponding algorithm is used for sharpness evaluation.
Within the search range, the projector searches for the possible optimal focal length based on a search algorithm, then repeats the photographing and sharpness-evaluation steps, and finally finds the optimal focal length through sharpness comparison to complete auto-focusing.
For example, after the projector is started, the user moves the device; the projector automatically finishes the correction and then refocuses, and the controller detects whether the automatic focusing function is started or not; when the automatic focusing function is not started, the controller ends the automatic focusing service; when the automatic focusing function is started, the projector acquires the detection distance of a time of flight (TOF) sensor through the middleware and calculates;
the controller inquires a preset mapping table according to the acquired distance so as to acquire the approximate focal length of the projector; then the middleware sets the acquired focal length to an optical machine of the projector;
after the optical machine emits laser at the focal length, the camera executes a photographing instruction; the controller judges whether the focusing of the projector is finished or not according to the obtained photographing result and the evaluation function; if the judgment result meets the preset completion condition, controlling the automatic focusing process to be ended; if the judgment result does not meet the preset completion condition, the middleware finely adjusts the focal length parameter of the projector optical machine, for example, the focal length can be gradually finely adjusted by preset step length, and the adjusted focal length parameter is set to the optical machine again; therefore, the steps of repeated photographing and definition evaluation are realized, and finally, the optimal focal length is found through definition contrast to complete automatic focusing, as shown in fig. 7D.
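The photograph-evaluate-adjust loop described above can be sketched as below, using Laplacian variance as a stand-in for the unspecified sharpness evaluation function; `capture` is any function that photographs at a given focus setting, and all names are hypothetical:

```python
def sharpness(img):
    # variance of a 4-neighbour Laplacian response as a focus measure
    # (a common choice; the patent does not specify its evaluation function)
    h, w = len(img), len(img[0])
    vals = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            vals.append(4 * img[y][x] - img[y - 1][x] - img[y + 1][x]
                        - img[y][x - 1] - img[y][x + 1])
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / len(vals)

def autofocus(capture, lo, hi, step):
    # coarse sweep: photograph at each focus setting in [lo, hi] and
    # keep the sharpest; fine adjustment would repeat this with a
    # smaller step around the winner, as in the preset-step fine tuning
    best_f, best_s, f = lo, -1.0, lo
    while f <= hi:
        s = sharpness(capture(f))
        if s > best_s:
            best_f, best_s = f, s
        f += step
    return best_f
```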
In some embodiments, the projector provided herein can implement a display correction function through a keystone correction algorithm.
Firstly, based on a calibration algorithm, two groups of external parameters, namely rotation and translation matrices, can be obtained between the two cameras and between the cameras and the optical machine; then a special checkerboard graphic card is played through the optical machine of the projector, and the depth values of the corner points of the projected checkerboard are calculated, for example solving the xyz coordinate values through the translation relation between the binocular cameras and the principle of similar triangles; a projection plane is then fitted based on these xyz values, and the rotation and translation relations between the projection plane and the camera coordinate system are solved, specifically including the Pitch and Yaw relations.
The values of Roll (Roll) parameters can be obtained through a gyroscope configured by a projector so as to combine a complete rotation matrix, and finally, the external parameters from a projection surface to an optical machine coordinate system under a world coordinate system are calculated and solved.
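The plane-fitting step can be illustrated with a least-squares fit; extracting pitch and yaw from the fitted normal is shown under the assumption that the camera's optical axis is the z axis (the function names are hypothetical, not the patent's):

```python
import numpy as np

def fit_plane(points):
    # least-squares plane z = a*x + b*y + c through the 3-D checkerboard
    # corner points; returns the unit normal and the offset c
    P = np.asarray(points, float)
    A = np.c_[P[:, 0], P[:, 1], np.ones(len(P))]
    (a, b, c), *_ = np.linalg.lstsq(A, P[:, 2], rcond=None)
    normal = np.array([-a, -b, 1.0])
    return normal / np.linalg.norm(normal), c

def pitch_yaw(normal):
    # tilt of the fitted plane relative to the optical (z) axis, in degrees
    nx, ny, nz = normal
    yaw = np.degrees(np.arctan2(nx, nz))
    pitch = np.degrees(np.arctan2(ny, nz))
    return pitch, yaw
```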
Combining the R, T values of the camera and the optical machine obtained by calculation in the above steps, the conversion relation between the world coordinate system of the projection plane and the optical machine coordinate system can be obtained; combining the internal parameters of the optical machine, a homography matrix from the point of the projection surface to the point of the optical machine graph card can be formed.
Finally, a rectangle is selected on the projection surface, and the homography is used to reversely solve the corresponding coordinates on the optical machine graphic card; these are the correction coordinates, and setting them on the optical machine realizes the trapezoidal correction.
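As an illustrative numpy sketch of this back-solving step (names hypothetical), a homography is formed from four point correspondences and its inverse maps the chosen rectangle back to card coordinates:

```python
import numpy as np

def homography(src, dst):
    # direct linear transform with h22 fixed to 1; src/dst are four
    # corresponding points (optical machine card -> projection surface)
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A += [[x, y, 1, 0, 0, 0, -u * x, -u * y],
              [0, 0, 0, x, y, 1, -v * x, -v * y]]
        b += [u, v]
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def back_project(H, rect):
    # map a rectangle chosen on the projection surface back into card
    # coordinates by applying the inverse homography; the results are
    # the correction coordinates set on the optical machine
    Hinv = np.linalg.inv(H)
    out = []
    for x, y in rect:
        u, v, w = Hinv @ np.array([x, y, 1.0])
        out.append((u / w, v / w))
    return out
```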
For example, the projector controller obtains the depth value of the point corresponding to each pixel of the photograph, i.e. the coordinates of the projection point in the camera coordinate system; the middleware acquires the relation between the optical machine coordinate system and the camera coordinate system according to the depth values;
then the controller calculates to obtain the coordinate value of the projection point under the optical machine coordinate system, and obtains the included angle between the projection surface and the optical machine based on the coordinate value fitting plane, and then obtains the corresponding coordinate of the projection point in the world coordinate system of the projection surface according to the included angle relationship; and calculating to obtain a homography matrix according to the coordinates of the graphic card in the optical machine coordinate system and the coordinates of the corresponding point of the projection plane.
The controller determines whether an obstacle exists based on the acquired data. When no obstacle exists, a rectangle is taken on the projection surface in the world coordinate system, and the area to be projected by the optical machine is calculated according to the homography relation. When an obstacle exists, the controller can acquire the feature points of the two-dimensional code, obtain the coordinates of the two-dimensional code on the prefabricated graphic card, solve the homography relation between the camera picture and the graphic card, and convert the acquired obstacle coordinates onto the graphic card, thereby obtaining the coordinates at which the obstacle blocks the graphic card.
Obtaining the coordinates of the shielded area of the projection plane through homography matrix transformation according to the coordinates of the shielded area of the obstacle map card in the optical machine coordinate system, arbitrarily taking rectangular coordinates on the projection plane in the world coordinate system, avoiding the obstacle, and obtaining the area to be projected by the optical machine according to the homography relation, wherein the logic flow is shown in fig. 7E.
It can be understood that, in the obstacle avoidance algorithm, when the trapezoidal correction algorithm selects a rectangle step, the extraction of the foreign body outline is completed by using an algorithm (OpenCV) library, and when the rectangle is selected, the obstacle is avoided, so that the projection obstacle avoidance function is realized.
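As a toy sketch of the rectangle-selection step under obstacle avoidance, assume the obstacle has been reduced to an axis-aligned bounding box in card coordinates (a real implementation works with arbitrary contours extracted via OpenCV); all names are hypothetical:

```python
def avoid_obstacle(screen, obstacle):
    # screen, obstacle: (x0, y0, x1, y1) axis-aligned boxes in card
    # coordinates; return the largest sub-rectangle of `screen` that
    # lies entirely clear of `obstacle`
    sx0, sy0, sx1, sy1 = screen
    ox0, oy0, ox1, oy1 = obstacle
    # candidate rectangles on each side of the obstacle, clipped to screen
    cands = [
        (sx0, sy0, min(sx1, ox0), sy1),  # left of obstacle
        (max(sx0, ox1), sy0, sx1, sy1),  # right of obstacle
        (sx0, sy0, sx1, min(sy1, oy0)),  # above obstacle
        (sx0, max(sy0, oy1), sx1, sy1),  # below obstacle
    ]
    def area(r):
        return max(0, r[2] - r[0]) * max(0, r[3] - r[1])
    return max(cands, key=area)
```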
In some embodiments, the middleware acquires a two-dimensional code graphic card shot by a camera, identifies characteristic points of the two-dimensional code, and acquires coordinates in a camera coordinate system; the controller further obtains coordinates of the preset graphic card in an optical machine coordinate system to solve the homography relation between the camera plane and the optical machine plane; the controller identifies coordinates of four top points of the curtain shot by the camera based on the homography, and obtains a range of the chart projected to the curtain optical machine according to the homography, as shown in fig. 7F.
It can be appreciated that, in some embodiments, the screen-entry algorithm is based on the OpenCV algorithm library and can identify and extract the largest black closed rectangular contour and determine whether its aspect ratio is 16:9. A specific graphic card is projected, a picture is shot by the camera, and a plurality of corner points are extracted from the picture to calculate the homography between the projection surface (curtain) and the graphic card played by the optical machine; through the homography, the four vertices of the curtain are converted into the optical machine pixel coordinate system, and conversely the optical machine graphic card is converted to the four vertices of the curtain, completing the calculation and comparison.
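The two geometric tests named above, that a candidate contour is a convex quadrilateral and that its size is roughly 16:9, can be sketched as follows. This is an assumption-laden sketch: the source performs contour extraction in OpenCV, while the bounding-box aspect check and its tolerance here are hypothetical simplifications:

```python
import numpy as np

def is_convex_quad(quad):
    """True if four ordered vertices form a convex quadrilateral: the z-components
    of all consecutive edge cross products share one sign."""
    q = np.asarray(quad, dtype=float)
    crosses = []
    for i in range(4):
        a = q[(i + 1) % 4] - q[i]                # edge i
        b = q[(i + 2) % 4] - q[(i + 1) % 4]      # edge i+1 (wrapping)
        crosses.append(a[0] * b[1] - a[1] * b[0])
    crosses = np.asarray(crosses)
    return bool(np.all(crosses > 0) or np.all(crosses < 0))

def is_16_9(quad, tol=0.05):
    """Rough 16:9 check on the quad's axis-aligned bounding box (tolerance is
    a hypothetical parameter, not taken from the source)."""
    q = np.asarray(quad, dtype=float)
    w = q[:, 0].max() - q[:, 0].min()
    h = q[:, 1].max() - q[:, 1].min()
    return abs(w / h - 16.0 / 9.0) < tol
```

A contour passing both tests would be accepted as the curtain's projection area; a concave "dart" shape or a 4:3 rectangle would be rejected.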
The long-throw micro-projection television can be moved flexibly, so the projection picture may be distorted after each displacement; in addition, foreign matter may be present on the projection surface, or the projection picture may deviate from the curtain. For these problems, the projection device and display control method based on geometric correction can complete correction automatically, realizing functions such as automatic trapezoidal correction, automatic screen entry, automatic obstacle avoidance, automatic focusing, and eye protection.
By creating the first image, the method and device can obtain an image of the environment where the curtain corresponding to the projector is located; further, creating the gray-scale map of the first image improves the accuracy with which the projector identifies the closed contours corresponding to environmental elements; further, constructing the primary closed contour and the secondary closed contour narrows the screening range of candidate areas for the curtain projection area; further, by judging that a secondary closed contour is a convex quadrilateral, the curtain projection area contained in the candidate areas can be identified, which improves identification accuracy, spares the user from manually fine-tuning the projection angle, and ensures that, after the projector is moved, its playing content is automatically projected into the projection area of the matching curtain.
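The first step of this pipeline, binarizing the first image around the gray level with the maximum brightness ratio (claims 2 and 7), can be sketched with a histogram peak and a threshold band. The function name and the `window` width standing in for the "preset number of gray scale values" are hypothetical:

```python
import numpy as np

def binarize_around_peak(gray, window=40):
    """Binarize a uint8 grayscale image around its dominant gray level.

    The gray value occupying the largest share of pixels is assumed to be the
    curtain surface; a `window`-wide band of gray levels around it is kept as
    foreground (255), everything else becomes background (0)."""
    hist = np.bincount(gray.ravel(), minlength=256)
    peak = int(hist.argmax())                  # gray level with maximum ratio
    lo, hi = peak - window // 2, peak + window // 2
    return ((gray >= lo) & (gray <= hi)).astype(np.uint8) * 255

# Synthetic example: a bright "curtain" with one dark foreign patch
gray = np.full((100, 100), 200, dtype=np.uint8)
gray[10:20, 10:20] = 30
binary = binarize_around_peak(gray)
```

The resulting binary image is what the contour stage operates on: the curtain edge band yields the primary closed contour, and the enclosed projection area yields the secondary one.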
Moreover, those skilled in the art will appreciate that aspects of the present application may be illustrated and described in terms of several patentable species or situations, including any new and useful combination of processes, machines, manufacture, or materials, or any new and useful improvement thereon. Accordingly, various aspects of the present application may be embodied entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.) or in a combination of hardware and software. The above hardware or software may be referred to as "data block", "controller", "engine", "unit", "component", or "system". Furthermore, aspects of the present application may be represented as a computer product, including computer readable program code, embodied in one or more computer readable media.
The computer storage medium may comprise a propagated data signal with the computer program code embodied therewith, for example, on baseband or as part of a carrier wave. The propagated signal may take any of a variety of forms, including electromagnetic, optical, etc., or any suitable combination. A computer storage medium may be any computer-readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code located on a computer storage medium may be propagated over any suitable medium, including radio, cable, fiber optic cable, RF, or the like, or any combination of the preceding.
Computer program code required for the operation of various portions of the present application may be written in any one or more programming languages, including an object oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python, and the like, a conventional programming language such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, ABAP, a dynamic programming language such as Python, Ruby, and Groovy, or other programming languages, and the like. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any form of network, such as a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet), or in a cloud computing environment, or as a service, such as a software as a service (SaaS).
Additionally, the order in which elements and sequences of the processes described herein are processed, the use of alphanumeric characters, or the use of other designations, is not intended to limit the order of the processes and methods described herein, unless explicitly claimed. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing server or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the application, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not intended to require more features than are expressly recited in the claims. Indeed, claimed subject matter may lie in less than all of the features of a single embodiment disclosed above.
The entire contents of each patent, patent application publication, and other material cited in this application, such as articles, books, specifications, publications, and documents, are hereby incorporated by reference into this application, except for any application history document that is inconsistent with or conflicts with the contents of this application, and except for any document that would limit the broadest scope of the claims of this application (whether presently or later appended). It is noted that if any description, definition, and/or use of a term in the material accompanying this application is inconsistent with or contrary to that set forth in this application, the description, definition, and/or use of the term in this application shall control.

Claims (10)

1. A projection device, comprising:
the projection component is used for projecting the playing content to a curtain corresponding to the projection equipment;
a controller configured to:
binarizing a first image to obtain a second image based on a luminance analysis of a gray-scale map of the first image obtained by a camera;
determining a primary closed contour contained in the second image, wherein the primary closed contour contains a secondary closed contour;
when the secondary closed contour is judged to be a convex quadrangle, controlling the projection component to project the playing content to the secondary closed contour, wherein the secondary closed contour corresponds to a projection area of the curtain; wherein the curtain comprises a curtain edge band corresponding to the primary closed contour, the projected area being surrounded by the curtain edge band.
2. The projection device of claim 1, wherein, in binarizing the first image to obtain the second image based on the luminance analysis of the gray-scale map of the first image obtained by the camera, the controller is further configured to:
determine the gray scale value corresponding to the maximum brightness ratio in the gray-scale map of the first image;
select a preset number of gray scale values as thresholds centered on that gray scale value, binarize the first image, and take the binarized image containing the curtain features as the second image.
3. The projection device of claim 1, wherein, in determining the projection area corresponding to the curtain containing the secondary closed contour in the second image, the controller is further configured to:
acquire all primary closed contours including the secondary closed contours in the second image;
perform polygon fitting on the secondary closed contours, and judge a secondary closed contour whose fitting result is a quadrilateral contour to be a candidate area of the curtain projection area;
perform a convexity judgment on the candidate areas of the curtain projection area, and judge a candidate area whose result is a convex quadrilateral to be the projection area corresponding to the curtain.
4. The projection device of claim 1, wherein, in determining the projection area corresponding to the curtain containing the secondary closed contour in the second image, the controller is further configured such that:
when the secondary closed contour further comprises a tertiary closed contour generated by projection of the projection device, the controller does not perform extraction analysis on the tertiary closed contour.
5. The projection device of claim 1, wherein, in controlling the camera to obtain a first image of the environment in which the projection device is located, the controller is further configured to:
control the camera to acquire a first image of the area where the curtain corresponding to the projection device is located.
6. A projection display control method for automatically projecting into a curtain area, comprising:
performing binarization on a first image to obtain a second image based on the brightness analysis of the obtained first image gray-scale image, wherein the first image is an environment image;
determining a primary closed contour contained in the second image, wherein the primary closed contour contains a secondary closed contour;
when the secondary closed contour is judged to be a convex quadrangle, projecting the playing content to the secondary closed contour, wherein the secondary closed contour corresponds to a projection area of the curtain; wherein the curtain comprises a curtain edge band corresponding to the primary closed contour, the projected area being surrounded by the curtain edge band.
7. The projection display control method for automatically projecting into a curtain area as claimed in claim 6, wherein the step of binarizing the first image to obtain the second image based on the luminance analysis of the gray-scale map of the first image specifically comprises:
determining the gray scale value corresponding to the maximum brightness ratio in the gray-scale map of the first image;
selecting a preset number of gray scale values as thresholds centered on that gray scale value, binarizing the first image, and taking the binarized image containing the curtain features as the second image.
8. The projection display control method for automatically projecting into a curtain area as claimed in claim 6, wherein the step of determining the projection area corresponding to the curtain containing the secondary closed contour in the second image specifically comprises:
acquiring all primary closed contours including the secondary closed contours in the second image;
performing polygon fitting on the secondary closed contours, and judging a secondary closed contour whose fitting result is a quadrilateral contour to be a candidate area of the curtain projection area;
performing a convexity judgment on the candidate areas of the curtain projection area, and judging a candidate area whose result is a convex quadrilateral to be the projection area corresponding to the curtain.
9. The projection display control method for automatically projecting into a curtain area as claimed in claim 6, wherein, in determining the projection area corresponding to the curtain containing the secondary closed contour in the second image, the method further comprises:
when a tertiary closed contour generated by projection is further included in the secondary closed contour, not performing extraction analysis on the tertiary closed contour.
10. The projection display control method for automatically projecting into a curtain area as claimed in claim 6, wherein obtaining the first image comprises: obtaining the first image of the area where the curtain corresponding to the projection device is located.
CN202210006233.7A 2021-11-16 2022-01-05 Projection equipment and projection display control method for automatically throwing screen area Pending CN114466173A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202280063192.3A CN118104230A (en) 2021-11-16 2022-09-29 Projection equipment and display control method
PCT/CN2022/122810 WO2023087950A1 (en) 2021-11-16 2022-09-29 Projection device and display control method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2021113558660 2021-11-16
CN202111355866 2021-11-16

Publications (1)

Publication Number Publication Date
CN114466173A true CN114466173A (en) 2022-05-10

Family

ID=80658581

Family Applications (13)

Application Number Title Priority Date Filing Date
CN202210006233.7A Pending CN114466173A (en) 2021-11-16 2022-01-05 Projection equipment and projection display control method for automatically throwing screen area
CN202210050600.3A Active CN114205570B (en) 2021-11-16 2022-01-17 Projection equipment and display control method for automatically correcting projection image
CN202210168263.8A Active CN114401390B (en) 2021-11-16 2022-02-23 Projection equipment and projection image correction method based on optical machine camera calibration
CN202210325721.4A Active CN114885136B (en) 2021-11-16 2022-03-29 Projection apparatus and image correction method
CN202210345204.3A Active CN114727079B (en) 2021-11-16 2022-03-31 Projection equipment and focusing method based on position memory
CN202210343444.XA Pending CN114885138A (en) 2021-11-16 2022-03-31 Projection equipment and automatic focusing method
CN202210343443.5A Active CN114885137B (en) 2021-11-16 2022-03-31 Projection equipment and automatic focusing method
CN202210389054.6A Active CN114827563B (en) 2021-11-16 2022-04-13 Projection apparatus and projection area correction method
CN202210583357.1A Active CN115022606B (en) 2021-11-16 2022-05-25 Projection equipment and obstacle avoidance projection method
CN202210709400.4A Active CN115174877B (en) 2021-11-16 2022-06-21 Projection device and focusing method thereof
CN202280063192.3A Pending CN118104230A (en) 2021-11-16 2022-09-29 Projection equipment and display control method
CN202280063350.5A Pending CN118077192A (en) 2021-11-16 2022-11-16 Projection equipment and projection area correction method
CN202280063329.5A Pending CN118104231A (en) 2021-11-16 2022-11-16 Projection apparatus and projection image correction method


Country Status (2)

Country Link
CN (13) CN114466173A (en)
WO (3) WO2023087950A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115002432A (en) * 2022-05-30 2022-09-02 海信视像科技股份有限公司 Projection equipment and obstacle avoidance projection method
CN115002430A (en) * 2022-05-17 2022-09-02 深圳市当智科技有限公司 Projection method, projector, and computer-readable storage medium
WO2023087950A1 (en) * 2021-11-16 2023-05-25 海信视像科技股份有限公司 Projection device and display control method

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118476211A (en) * 2021-11-16 2024-08-09 海信视像科技股份有限公司 Projection device and focusing method
CN118476210A (en) * 2021-11-16 2024-08-09 海信视像科技股份有限公司 Projection equipment and display control method
CN118541967A (en) * 2021-11-16 2024-08-23 海信视像科技股份有限公司 Projection device and correction method
CN118104229A (en) * 2021-11-16 2024-05-28 海信视像科技股份有限公司 Projection equipment and display control method of projection image
CN114760454A (en) * 2022-05-24 2022-07-15 海信视像科技股份有限公司 Projection equipment and trigger correction method
CN114640832A (en) * 2022-02-11 2022-06-17 厦门聚视智创科技有限公司 Automatic correction method for projected image
CN115002429B (en) * 2022-05-07 2023-03-24 深圳市和天创科技有限公司 Projector capable of automatically calibrating projection position based on camera calculation
CN114885142B (en) * 2022-05-27 2024-05-17 海信视像科技股份有限公司 Projection equipment and method for adjusting projection brightness
CN115314689A (en) * 2022-08-05 2022-11-08 深圳海翼智新科技有限公司 Projection correction method, projection correction device, projector and computer program product
CN115314691B (en) * 2022-08-09 2023-05-09 北京淳中科技股份有限公司 Image geometric correction method and device, electronic equipment and storage medium
CN115061415B (en) * 2022-08-18 2023-01-24 赫比(成都)精密塑胶制品有限公司 Automatic process monitoring method and device and computer readable storage medium
CN115474032B (en) * 2022-09-14 2023-10-03 深圳市火乐科技发展有限公司 Projection interaction method, projection device and storage medium
CN115529445A (en) * 2022-09-15 2022-12-27 海信视像科技股份有限公司 Projection equipment and projection image quality adjusting method
WO2024066776A1 (en) * 2022-09-29 2024-04-04 海信视像科技股份有限公司 Projection device and projection-picture processing method
CN115361540B (en) * 2022-10-20 2023-01-24 潍坊歌尔电子有限公司 Method and device for self-checking abnormal cause of projected image, projector and storage medium
CN115760620B (en) * 2022-11-18 2023-10-20 荣耀终端有限公司 Document correction method and device and electronic equipment
WO2024124978A1 (en) * 2022-12-12 2024-06-20 海信视像科技股份有限公司 Projection device and projection method
CN116095287B (en) * 2022-12-28 2024-08-23 海信集团控股股份有限公司 Projection equipment calibration method, calibration system and projection equipment
CN116723395B (en) * 2023-04-21 2024-08-16 深圳市橙子数字科技有限公司 Non-inductive focusing method and device based on camera
CN116993879B (en) * 2023-07-03 2024-03-12 广州极点三维信息科技有限公司 Method for automatically avoiding obstacle and distributing light, electronic equipment and storage medium
CN116886881B (en) * 2023-07-26 2024-09-24 深圳市极鑫科技有限公司 Projector based on omnidirectional trapezoidal technology
CN117278735B (en) * 2023-09-15 2024-05-17 山东锦霖智能科技集团有限公司 Immersive image projection equipment
CN117830437B (en) * 2024-03-01 2024-05-14 中国科学院长春光学精密机械与物理研究所 Device and method for calibrating internal and external parameters of large-view-field long-distance multi-view camera
CN118247776B (en) * 2024-05-24 2024-07-23 南昌江铃集团胜维德赫华翔汽车镜有限公司 Automobile blind area display and identification method and system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080266253A1 (en) * 2007-04-25 2008-10-30 Lisa Seeman System and method for tracking a laser spot on a projected computer screen image
CN102236784A (en) * 2010-05-07 2011-11-09 株式会社理光 Screen area detection method and system
CN110769225A (en) * 2018-12-29 2020-02-07 成都极米科技股份有限公司 Projection area obtaining method based on curtain and projection device
CN111932571A (en) * 2020-09-25 2020-11-13 歌尔股份有限公司 Image boundary identification method and device and computer readable storage medium
CN113286134A (en) * 2021-05-25 2021-08-20 青岛海信激光显示股份有限公司 Image correction method and shooting equipment

Family Cites Families (64)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005031267A (en) * 2003-07-09 2005-02-03 Sony Corp Picture projection device and picture projection method
JP3951984B2 (en) * 2003-08-22 2007-08-01 日本電気株式会社 Image projection method and image projection apparatus
JP2006109088A (en) * 2004-10-05 2006-04-20 Olympus Corp Geometric correction method in multi-projection system
JP4984968B2 (en) * 2007-02-28 2012-07-25 カシオ計算機株式会社 Projection apparatus, abnormality control method and program
JP4831219B2 (en) * 2008-10-29 2011-12-07 セイコーエプソン株式会社 Projector and projector control method
CN102681312B (en) * 2011-03-16 2015-06-24 宏瞻科技股份有限公司 Human eye safety protection system of laser projection system
JP2013033206A (en) * 2011-07-06 2013-02-14 Ricoh Co Ltd Projection display device, information processing device, projection display system, and program
CN103293836A (en) * 2012-02-27 2013-09-11 联想(北京)有限公司 Projection method and electronic device
CN103002240B (en) * 2012-12-03 2016-11-23 深圳创维数字技术有限公司 A kind of method and apparatus setting avoiding obstacles projection
JP6201359B2 (en) * 2013-03-22 2017-09-27 カシオ計算機株式会社 Projection system, projection method, and projection program
JP2015128242A (en) * 2013-12-27 2015-07-09 ソニー株式会社 Image projection device and calibration method of the same
CN103905762B (en) * 2014-04-14 2017-04-19 上海索广电子有限公司 Method for automatically detecting projection picture for projection module
CN103942796B (en) * 2014-04-23 2017-04-12 清华大学 High-precision projector and camera calibration system and method
JP2016014712A (en) * 2014-07-01 2016-01-28 キヤノン株式会社 Shading correction value calculation device and shading correction value calculation method
JP6186599B1 (en) * 2014-12-25 2017-08-30 パナソニックIpマネジメント株式会社 Projection device
CN104536249B (en) * 2015-01-16 2016-08-24 努比亚技术有限公司 The method and apparatus of regulation projector focal length
CN104835143A (en) * 2015-03-31 2015-08-12 中国航空无线电电子研究所 Rapid projector system parameter calibration method
JP2016197768A (en) * 2015-04-02 2016-11-24 キヤノン株式会社 Image projection system and control method of projection image
WO2016194191A1 (en) * 2015-06-04 2016-12-08 日立マクセル株式会社 Projection-type picture display apparatus and picture display method
CN105208308B (en) * 2015-09-25 2018-09-04 广景视睿科技(深圳)有限公司 A kind of method and system for the best projection focus obtaining projecting apparatus
US10630884B2 (en) * 2016-03-23 2020-04-21 Huawei Technologies Co., Ltd. Camera focusing method, apparatus, and device for terminal
CN107318007A (en) * 2016-04-27 2017-11-03 中兴通讯股份有限公司 The method and device of projected focus
CN107547881B (en) * 2016-06-24 2019-10-11 上海顺久电子科技有限公司 A kind of auto-correction method of projection imaging, device and laser television
CN106713879A (en) * 2016-11-25 2017-05-24 重庆杰夫与友文化创意有限公司 Obstacle avoidance projection method and apparatus
KR101820905B1 (en) * 2016-12-16 2018-01-22 씨제이씨지브이 주식회사 An image-based projection area automatic correction method photographed by a photographing apparatus and a system therefor
CN109215082B (en) * 2017-06-30 2021-06-22 杭州海康威视数字技术股份有限公司 Camera parameter calibration method, device, equipment and system
CN109426060A (en) * 2017-08-21 2019-03-05 深圳光峰科技股份有限公司 Projector automatic focusing method and projector
CN107479168A (en) * 2017-08-22 2017-12-15 深圳市芯智科技有限公司 A kind of projector that can realize rapid focus function and focusing method
KR101827221B1 (en) * 2017-09-07 2018-02-07 주식회사 조이펀 Mixed Reality Content Providing Device with Coordinate System Calibration and Method of Coordinate System Calibration using it
CN109856902A (en) * 2017-11-30 2019-06-07 中强光电股份有限公司 Projection arrangement and Atomatic focusing method
CN110058483B (en) * 2018-01-18 2022-06-10 深圳光峰科技股份有限公司 Automatic focusing system, projection equipment, automatic focusing method and storage medium
WO2019203002A1 (en) * 2018-04-17 2019-10-24 ソニー株式会社 Information processing device and method
CN110769214A (en) * 2018-08-20 2020-02-07 成都极米科技股份有限公司 Automatic tracking projection method and device based on frame difference
CN111147732B (en) * 2018-11-06 2021-07-20 浙江宇视科技有限公司 Focusing curve establishing method and device
CN109544643B (en) * 2018-11-21 2023-08-11 北京佳讯飞鸿电气股份有限公司 Video camera image correction method and device
CN109495729B (en) * 2018-11-26 2023-02-10 青岛海信激光显示股份有限公司 Projection picture correction method and system
CN110769226B (en) * 2019-02-27 2021-11-09 成都极米科技股份有限公司 Focusing method and focusing device of ultra-short-focus projector and readable storage medium
CN110769227A (en) * 2019-02-27 2020-02-07 成都极米科技股份有限公司 Focusing method and focusing device of ultra-short-focus projector and readable storage medium
CN110336987B (en) * 2019-04-03 2021-10-08 北京小鸟听听科技有限公司 Projector distortion correction method and device and projector
CN110336951A (en) * 2019-08-26 2019-10-15 厦门美图之家科技有限公司 Contrast formula focusing method, device and electronic equipment
CN110636273A (en) * 2019-10-15 2019-12-31 歌尔股份有限公司 Method and device for adjusting projection picture, readable storage medium and projector
CN112799275B (en) * 2019-11-13 2023-01-06 青岛海信激光显示股份有限公司 Focusing method and focusing system of ultra-short-focus projection lens and projector
CN111028297B (en) * 2019-12-11 2023-04-28 凌云光技术股份有限公司 Calibration method of surface structured light three-dimensional measurement system
CN111050150B (en) * 2019-12-24 2021-12-31 成都极米科技股份有限公司 Focal length adjusting method and device, projection equipment and storage medium
CN111050151B (en) * 2019-12-26 2021-08-17 成都极米科技股份有限公司 Projection focusing method and device, projector and readable storage medium
CN110996085A (en) * 2019-12-26 2020-04-10 成都极米科技股份有限公司 Projector focusing method, projector focusing device and projector
CN111311686B (en) * 2020-01-15 2023-05-02 浙江大学 Projector defocus correction method based on edge perception
CN113554709A (en) * 2020-04-23 2021-10-26 华东交通大学 Camera-projector system calibration method based on polarization information
CN111429532B (en) * 2020-04-30 2023-03-31 南京大学 Method for improving camera calibration accuracy by utilizing multi-plane calibration plate
CN113301314B (en) * 2020-06-12 2023-10-24 阿里巴巴集团控股有限公司 Focusing method, projector, imaging apparatus, and storage medium
CN112050751B (en) * 2020-07-17 2022-07-22 深圳大学 Projector calibration method, intelligent terminal and storage medium
CN112584113B (en) * 2020-12-02 2022-08-30 深圳市当智科技有限公司 Wide-screen projection method and system based on mapping correction and readable storage medium
CN112598589A (en) * 2020-12-17 2021-04-02 青岛海信激光显示股份有限公司 Laser projection system and image correction method
CN112904653A (en) * 2021-01-26 2021-06-04 四川长虹电器股份有限公司 Focusing method and focusing device for projection equipment
CN112995624B (en) * 2021-02-23 2022-11-08 峰米(北京)科技有限公司 Trapezoidal error correction method and device for projector
CN112995625B (en) * 2021-02-23 2022-10-11 峰米(北京)科技有限公司 Trapezoidal correction method and device for projector
CN112689136B (en) * 2021-03-19 2021-07-02 深圳市火乐科技发展有限公司 Projection image adjusting method and device, storage medium and electronic equipment
CN113099198B (en) * 2021-03-19 2023-01-10 深圳市火乐科技发展有限公司 Projection image adjusting method and device, storage medium and electronic equipment
CN112804507B (en) * 2021-03-19 2021-08-31 深圳市火乐科技发展有限公司 Projector correction method, projector correction system, storage medium, and electronic device
CN113038105B (en) * 2021-03-26 2022-10-18 歌尔股份有限公司 Projector adjusting method and adjusting apparatus
CN113160339B (en) * 2021-05-19 2024-04-16 中国科学院自动化研究所苏州研究院 Projector calibration method based on Molaque law
CN113473095B (en) * 2021-05-27 2022-10-21 广景视睿科技(深圳)有限公司 Method and device for obstacle avoidance dynamic projection
CN113489961B (en) * 2021-09-08 2022-03-22 深圳市火乐科技发展有限公司 Projection correction method, projection correction device, storage medium and projection equipment
CN114466173A (en) * 2021-11-16 2022-05-10 海信视像科技股份有限公司 Projection equipment and projection display control method for automatically throwing screen area



Also Published As

Publication number Publication date
CN114401390B (en) 2024-08-20
CN118077192A (en) 2024-05-24
CN114885138A (en) 2022-08-09
CN115174877A (en) 2022-10-11
WO2023088304A1 (en) 2023-05-25
CN115174877B (en) 2024-05-28
CN114205570A (en) 2022-03-18
CN114727079A (en) 2022-07-08
CN114885137B (en) 2024-05-31
CN114885136A (en) 2022-08-09
CN114885136B (en) 2024-05-28
CN114827563A (en) 2022-07-29
CN114885137A (en) 2022-08-09
WO2023088329A1 (en) 2023-05-25
CN115022606B (en) 2024-05-17
CN114401390A (en) 2022-04-26
CN114205570B (en) 2024-08-27
CN114727079B (en) 2024-08-20
CN114827563B (en) 2024-08-23
WO2023087950A1 (en) 2023-05-25
CN115022606A (en) 2022-09-06
CN118104231A (en) 2024-05-28
CN118104230A (en) 2024-05-28

Similar Documents

Publication Publication Date Title
CN114466173A (en) Projection equipment and projection display control method for automatically throwing screen area
CN111435985B (en) Projector control method, projector and projection system
JP5396012B2 (en) System that automatically corrects the image before projection
US9348212B2 (en) Image projection system and image projection method
WO2023087947A1 (en) Projection device and correction method
CN103576428A (en) Laser projection system with security protection mechanism
US20180278900A1 (en) Image display apparatus and method for controlling the same
CN104658462B (en) The control method of projector and projector
JP2013064827A (en) Electronic apparatus
US20240305754A1 (en) Projection device and obstacle avoidance projection method
CN116320335A (en) Projection equipment and method for adjusting projection picture size
CN113949852A (en) Projection method, projection apparatus, and storage medium
CN116055696A (en) Projection equipment and projection method
CN115883803A (en) Projection equipment and projection picture correction method
JP2021132312A (en) Projection system, control method of projector, and projector
JP2012181264A (en) Projection device, projection method, and program
US11950339B2 (en) Lighting apparatus, and corresponding system, method and computer program product
CN118104229A (en) Projection equipment and display control method of projection image
JP2006004330A (en) Video display system
CN114885142B (en) Projection equipment and method for adjusting projection brightness
JP2012226507A (en) Emitter identification device, emitter identification method and emitter identification program
CN114928728B (en) Projection apparatus and foreign matter detection method
CN118233608A (en) Laser projection correction method and projection display device
CN114928728A (en) Projection apparatus and foreign matter detection method
CN118158367A (en) Projection equipment and projection picture curtain entering method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination