CN116246553A - Projection apparatus and projection method - Google Patents

Projection apparatus and projection method

Info

Publication number
CN116246553A
Authority
CN
China
Prior art keywords
image
curtain
area
projection
rectangular
Prior art date
Legal status
Pending
Application number
CN202211614385.1A
Other languages
Chinese (zh)
Inventor
杨丽娟
李佳琳
王昊
岳国华
Current Assignee
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Priority to CN202211614385.1A
Publication of CN116246553A

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141Constructional details thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3191Testing thereof
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • G09G2340/045Zooming at least part of an image, i.e. enlarging it or shrinking it

Abstract

The application relates to the technical field of display devices, and discloses a projection device and a projection method. The projection device includes a projection assembly, a camera device, and a controller. The controller is configured to: receive a first image sent by the camera device and process the first image to obtain a second image, where the pixel value of each pixel in the second image is either a first pixel value or a second pixel value; if the area of a first connected region formed by first pixel values in the second image is smaller than a first threshold area, adjust the first pixel values in the first connected region to the second pixel value to obtain a third image; determine a first rectangular region in the third image based on the second pixel values, the first rectangular region being the largest rectangular region in the third image composed of second pixel values; and determine the area where the curtain is located from the first rectangular region according to a preset aspect ratio. Applying this technical scheme improves the accuracy with which the projection device identifies the curtain.

Description

Projection apparatus and projection method
Technical Field
The present disclosure relates to the field of display devices, and in particular, to a projection device and a projection method.
Background
A projection device is a display device that can project an image or video onto a curtain; for example, it can project the content to be projected onto the curtain when connected to a computer, the Internet, a smartphone, a video signal source, or the like.
Curtains come in various specifications and types, and for different curtains the projection device needs to be adjusted before projection to ensure that the image to be projected lands on the curtain accurately. For some conventional curtains (e.g., four-sided curtains), the projection device can easily determine the area where the curtain is located and project the content to be projected onto it. For some unconventional curtains (e.g., double-sided curtains), however, the projection device may be unable to determine the curtain's position quickly and accurately, which affects the accuracy of the projection device.
Disclosure of Invention
To solve the above problems, embodiments of the present application provide a projection device and a projection method. The projection device is applicable to curtains of different specifications and types, and improves the accuracy with which the curtain is identified, so that the content to be projected is projected onto the curtain more accurately and the universality of the projection device with respect to curtains is improved.
In one aspect, some embodiments of the present application provide a projection device, including: a projection assembly, a camera device, and a controller coupled to the camera device and the projection assembly respectively. The projection assembly is configured to project an image to be projected onto the curtain. The camera device is configured to photograph the area where the curtain is located to obtain a first image. The controller is configured to: receive the first image sent by the camera device and process it to obtain a second image, where the pixel value of each pixel in the second image is either a first pixel value or a second pixel value; if the area of a first connected region formed by first pixel values in the second image is smaller than a first threshold area, adjust the first pixel values in the first connected region to the second pixel value to obtain a third image; determine a first rectangular region in the third image based on the second pixel values, the first rectangular region being the largest rectangular region in the third image composed of second pixel values; determine the area where the curtain is located within the first rectangular region according to a preset aspect ratio; and, based on the area where the curtain is located, control the projection assembly to project the image to be projected onto the curtain.
In another aspect, some embodiments of the present application further provide a projection method, including: receiving a first image, the first image being an image of the area where the curtain is located; processing the first image to obtain a second image, where the pixel value of each pixel in the second image is either a first pixel value or a second pixel value; if the area of a first connected region formed by first pixel values in the second image is smaller than a first threshold area, adjusting the first pixel values in the first connected region to the second pixel value to obtain a third image; determining a first rectangular region in the third image based on the second pixel values, the first rectangular region being the largest rectangular region in the third image composed of second pixel values; determining the area where the curtain is located within the first rectangular region according to a preset aspect ratio; and, based on the area where the curtain is located, controlling the projection assembly to project the image to be projected onto the curtain.
As can be seen from the above technical solutions, with the projection device and projection method provided in some embodiments of the present application, after the first image captured by the camera device is received, it can be processed into a second image in which each pixel takes either the first pixel value or the second pixel value. The area of the first connected region formed by first pixel values in the second image is then compared with the first threshold area; when it is smaller, the first pixel values in that region are adjusted to the second pixel value, yielding a third image. Next, the first rectangular region, namely the largest rectangular region in the third image composed of second pixel values, is determined, and the area where the curtain is located is determined within it according to the preset aspect ratio. Finally, based on that area, the projection assembly is controlled to project the image to be projected onto the curtain. The projection device provided herein can therefore identify the area of the curtain more accurately and project the content onto the curtain more precisely, while adapting to different curtains, improving the universality of the projection device with respect to curtains.
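The core of the pipeline described above, filling small connected regions of the first pixel value and then finding the largest rectangle of the second pixel value, can be sketched in pure Python. This is a minimal illustration rather than the patent's implementation: it assumes 4-connectivity for the connected regions, pixel values 0 and 255, images as nested lists, and all function names are illustrative.

```python
from collections import deque

FIRST, SECOND = 0, 255  # first pixel value (e.g. black), second pixel value (e.g. white)

def fill_small_regions(img, min_area):
    """Find each connected region of FIRST pixels by flood fill; if its area is
    below min_area, overwrite it with SECOND (yielding the 'third image')."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    for sy in range(h):
        for sx in range(w):
            if img[sy][sx] != FIRST or seen[sy][sx]:
                continue
            region, queue = [], deque([(sy, sx)])
            seen[sy][sx] = True
            while queue:
                y, x = queue.popleft()
                region.append((y, x))
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if 0 <= ny < h and 0 <= nx < w and not seen[ny][nx] \
                            and img[ny][nx] == FIRST:
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            if len(region) < min_area:
                for y, x in region:
                    img[y][x] = SECOND
    return img

def largest_rectangle(img):
    """Largest axis-aligned rectangle made entirely of SECOND pixels, found with
    the max-rectangle-in-histogram technique. Returns (top, left, height, width)."""
    h, w = len(img), len(img[0])
    heights = [0] * w
    best = (0, 0, 0, 0)
    for y in range(h):
        for x in range(w):
            heights[x] = heights[x] + 1 if img[y][x] == SECOND else 0
        stack = []  # column indices with strictly increasing heights
        for x in range(w + 1):
            cur = heights[x] if x < w else 0  # sentinel flushes the stack
            while stack and heights[stack[-1]] >= cur:
                top = stack.pop()
                height = heights[top]
                left = stack[-1] + 1 if stack else 0
                width = x - left
                if height * width > best[2] * best[3]:
                    best = (y - height + 1, left, height, width)
            stack.append(x)
    return best
```

The histogram-based scan keeps the rectangle search linear in the number of pixels, which matters when the third image is a full camera frame.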
Drawings
To illustrate the technical solutions of the present application more clearly, the drawings needed in the embodiments are briefly described below. It will be obvious to those skilled in the art that other drawings can be derived from these drawings without inventive effort.
FIG. 1 is a schematic diagram of a projection system according to an embodiment of the present application;
FIG. 2 is a schematic diagram of the structure of a projection device according to an embodiment of the present application;
FIG. 3 is a schematic diagram of the circuit architecture of a projection device according to an embodiment of the present application;
FIG. 4 is a schematic diagram of an optical path in a projection device according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a projection device according to an embodiment of the present application;
FIG. 6 is a schematic view of a curtain according to an embodiment of the present application;
FIG. 7 is a schematic view of another curtain according to an embodiment of the present application;
FIG. 8 is a schematic flowchart of a projection method according to an embodiment of the present application;
FIG. 9 is a schematic flowchart of another projection method according to an embodiment of the present application;
FIG. 10 is a flowchart of another projection method according to an embodiment of the present application;
FIG. 11 is a flowchart of another projection method according to an embodiment of the present application;
FIG. 12 is a flowchart of another projection method according to an embodiment of the present application;
FIG. 13 is a flowchart of another projection method according to an embodiment of the present application.
Detailed Description
For purposes of clarity and implementation of the present application, exemplary implementations are described below with reference to the accompanying drawings in which they are illustrated. It is apparent that the described implementations are only some, not all, of the examples of the present application.
It should be noted that the brief description of the terms in the present application is only for convenience in understanding the embodiments described below, and is not intended to limit the embodiments of the present application. Unless otherwise indicated, these terms should be construed in their ordinary and customary meaning.
The terms "first", "second", "third", and the like in the description, in the claims, and in the above-described figures are used to distinguish between similar objects or entities and are not necessarily intended to limit a particular order or sequence, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements explicitly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware or/and software code that is capable of performing the function associated with that element.
The embodiments of the present application may be applied to various types of projection devices. Hereinafter, a projector is taken as an example to describe the projection device and the projection method.
Fig. 1 is a schematic diagram of a projection system according to some embodiments of the present application, and fig. 2 is a schematic diagram of a structure of a projection device according to some embodiments of the present application.
In some embodiments, referring to fig. 1, a projection system may include a projection device 1 and a projection screen 2. The projection device 1 is placed at a first position and the projection screen 2 is fixed at a second position, so that the picture projected by the projection device 1 coincides with the projection screen 2. Referring to fig. 2, the projection device 1 includes a laser light source 100, an optical machine 200, and a lens 300. The laser light source 100 provides illumination for the optical machine 200; the optical machine 200 modulates the light beam from the light source and outputs it to the lens 300 for imaging, and the imaged beam is projected onto a projection medium 400 to form a projection picture. In some examples, the projection medium 400 may also be the projection screen 2.
In some embodiments, the laser light source 100 of the projection device 1 includes a laser assembly and an optical lens assembly, and the light beam emitted by the laser assembly passes through the optical lens assembly to provide illumination for the optical machine. For example, the optical lens assembly requires a high level of environmental cleanliness and a hermetic seal, whereas the chamber housing the laser assembly can be sealed to a lower, dust-proof grade to reduce sealing cost.
In some embodiments, the optical machine 200 of the projection device 1 may be implemented to include a blue light engine, a green light engine, a red light engine, a heat dissipation system, a circuit control system, and the like. It should be noted that, in some embodiments, the light emitting component of the projector may also be implemented with an LED light source.
Fig. 3 is a schematic circuit architecture of a projection device according to some embodiments of the present application. In some embodiments, projection device 1 may include display control circuitry 10, a laser light source 100, at least one laser drive assembly 30, and at least one brightness sensor 40; the laser light source 100 may include at least one laser in a one-to-one correspondence with at least one laser driving assembly 30. Wherein, the at least one means one or more, and the plurality means two or more.
In some embodiments, the laser light source 100 includes three lasers, which may be a blue laser 201, a red laser 202, and a green laser 203, respectively, in a one-to-one correspondence with the laser driving assembly 30. The blue laser 201 is used for emitting blue laser light, the red laser 202 is used for emitting red laser light, and the green laser 203 is used for emitting green laser light. In some embodiments, the laser driving assembly 30 may be implemented to include a plurality of sub-laser driving assemblies, each corresponding to a different color laser.
The display control circuit 10 is configured to output control signals (e.g., an enable signal and a current control signal) to the laser driving assembly 30 to drive the lasers to emit light. For example, the display control circuit 10 is connected to the laser driving assembly 30 and is configured to output at least one enable signal corresponding to the three primary colors of each frame of image in a multi-frame display image and transmit it to the corresponding laser driving assembly 30, and to output at least one current control signal corresponding to the three primary colors of each frame of image and transmit it to the corresponding laser driving assembly 30. For example, the display control circuit 10 may be a micro control unit (Micro Controller Unit, MCU), also referred to as a single-chip microcomputer. The current control signal may be a pulse width modulation (Pulse Width Modulation, PWM) signal.
As shown in fig. 3, the display control circuit 10 may output a blue PWM signal B_PWM corresponding to the blue laser 201 based on the blue base color component of an image to be displayed, a red PWM signal R_PWM corresponding to the red laser 202 based on the red base color component, and a green PWM signal G_PWM corresponding to the green laser 203 based on the green base color component. The display control circuit 10 may output the enable signal B_EN corresponding to the blue laser 201 based on the lighting period of the blue laser 201 within the driving period, the enable signal R_EN corresponding to the red laser 202 based on the lighting period of the red laser 202, and the enable signal G_EN corresponding to the green laser 203 based on the lighting period of the green laser 203.
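The mapping from a frame's base color components to the per-laser PWM signals can be sketched as follows. The linear duty-cycle mapping is an assumption made only for illustration; the text does not specify the display control circuit's actual transfer function, and both function names are hypothetical.

```python
def pwm_duty(base_color_component, max_component=255):
    """Map an 8-bit base color component to a PWM duty cycle in [0.0, 1.0].
    A linear mapping is assumed here; the real circuit's transfer function
    is not given in the text."""
    return max(0, min(base_color_component, max_component)) / max_component

def frame_pwm_signals(r, g, b):
    """Per-frame PWM signals for the red, green, and blue lasers
    (R_PWM, G_PWM, B_PWM in the text)."""
    return {"R_PWM": pwm_duty(r), "G_PWM": pwm_duty(g), "B_PWM": pwm_duty(b)}
```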
The laser driving assembly 30 is connected to the corresponding lasers and provides them with driving currents in response to the received enable signal and current control signal; each laser emits light under the driving current provided by the laser driving assembly 30.
For example, as shown in fig. 3, the blue laser 201, the red laser 202, and the green laser 203 are each connected to the laser driving assembly 30. The laser driving assembly 30 may provide a corresponding driving current to the blue laser 201 in response to the blue PWM signal B_PWM and the enable signal B_EN transmitted by the display control circuit 10, and the blue laser 201 emits light when driven by this current.
Based on the circuit architecture of fig. 3, the projection device 1 can detect a first brightness value of the laser light source by arranging the brightness sensor 40 in the light-emitting path of the laser light source 100; the brightness sensor 40 transmits the first brightness value to the display control circuit 10.
The display control circuit 10 may obtain a second brightness value corresponding to the driving current of each laser, and determine that a laser has a COD (catastrophic optical damage) fault when the difference between the second brightness value and the first brightness value of that laser is greater than a difference threshold. The display control circuit then adjusts the current control signal of the laser driving assembly corresponding to that laser until the difference is less than or equal to the threshold, thereby eliminating the COD fault. In this way the projection device can eliminate a laser's COD fault in time, reduce the damage rate of the lasers, and improve the image display effect of the projection device 1.
Fig. 4 is a schematic view of an optical path of a projection apparatus according to some embodiments of the present application.
In some embodiments, the laser light source 100 in the projection device 1 may include a blue laser 201, a red laser 202, and a green laser 203 that are provided independently, and the projection device 1 may therefore also be referred to as a three-color projection device. The blue laser 201, the red laser 202, and the green laser 203 may all be compact MCL-packaged lasers, whose small volume facilitates a compact arrangement of the optical paths.
As shown in fig. 4, the optical machine 200 of the projection device 1 includes an optical assembly 210, which modulates the light beam provided by the laser light source 100 with the image signal of the image to be displayed to obtain a projection light beam.
Fig. 5 is a schematic diagram of a projection device according to some embodiments of the present application. As shown in fig. 5, in some embodiments, the projection device 1 further includes a controller 50, the controller 50 including at least one of a central processing unit (Central Processing Unit, CPU), a video processor, an audio processor, a graphics processor (Graphics Processing Unit, GPU), a random access Memory (Random Access Memory, RAM), a Read-Only Memory (ROM), a first interface to an nth interface for input/output, a communication Bus (Bus), and the like.
In some embodiments, referring to fig. 5, the projection device 1 may further comprise an image capturing device 60 coupled to the controller 50, which cooperates with the projection device 1 to enable adjustment and control of the projection process. The image capturing device 60 configured for the projection device 1 may be, for example, an ordinary camera, a 3D camera, a binocular camera, or a depth camera. When the image capturing device 60 is a binocular camera, it specifically includes a left camera and a right camera; the binocular camera can capture the image and playing content presented on the curtain corresponding to the projection device 1, i.e., on the projection surface, that content being projected by the optical machine 200 built into the projection device 1.
In some embodiments, referring to fig. 5, the projection device 1 may further include a projection assembly 70, and in some examples, the projection assembly 70 may include at least one of the laser light source 100, the light engine 200, and the lens 300 of fig. 2; for example, the projection assembly 70 may include a laser light source 100, an optical engine 200, and a lens 300; alternatively, the projection module 70 has a function of at least one of the laser light source 100, the optical machine 200, and the lens 300.
It should be noted that the embodiments of the present application may be applied to various types of projection devices 1; for example, the projection device 1 may be a projector. A projector is a projection device capable of projecting images or videos onto a screen, and can play corresponding video signals through different interfaces when connected to computers, broadcast television networks, the Internet, video compact discs (VCD), digital versatile discs (DVD), game consoles, DV camcorders, and the like. Projectors are widely used in homes, offices, schools, entertainment venues, and the like. The projection device 1 may also be another type of projection device, which is not limited in this application.
Fig. 6 is a schematic view of a curtain according to some embodiments of the present application. As shown in fig. 6, the curtain 21 may include a projection area 211 and edge lines 212. The projection area 211 is used for displaying the image to be projected by the projection assembly 70.
In some embodiments, the projection assembly 70 may project an image to be projected onto the curtain 21. Illustratively, the projection screen 2 includes a curtain 21, or the curtain 21 may be one implementation of the projection screen 2. The curtain 21 may be used for displaying images to be projected, video files, etc., projected by the projection assembly 70 in movies, offices, home theatres, large meetings, etc.
In some examples, the curtain 21 may be provided in different specifications and sizes according to actual needs. For example, to better match users' viewing habits, the aspect ratio of the curtain 21 may be set to 16:9, or it may be set to 4:3.
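Determining a curtain area of a preset aspect ratio inside a rectangular region, as the controller later does with the first rectangular region, can be sketched as below. The fitting rule shown (use the full width first, and shrink to the available height otherwise) is an assumption, and the function name is illustrative.

```python
def fit_aspect_region(rect_h, rect_w, aspect_w=16, aspect_h=9):
    """Largest (height, width) of a given aspect ratio (default 16:9)
    that fits inside a rect_h x rect_w rectangle."""
    # Try using the full width first; fall back to the full height.
    h = rect_w * aspect_h / aspect_w
    if h <= rect_h:
        return h, rect_w
    return rect_h, rect_h * aspect_w / aspect_h
```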
In other examples, the curtain 21 may also be provided with different edge lines 212 according to actual requirements. Referring to fig. 6, the peripheral edges of the curtain 21 carry dark edge lines 212; for example, the four edge lines 212 of the curtain 21 may be black. The dark edge lines 212 typically have a certain width, so the edge lines 212 may also be referred to as edge strips. The curtain 21 shown in fig. 6 may be referred to as a four-sided curtain. Because the edges of a four-sided curtain carry clearly visible edge lines 212, the projection device 1 can use this feature to identify a four-sided curtain in the environment stably, efficiently, and accurately, so that the projection device 1 can quickly re-align with the curtain after being moved.
However, besides four-sided curtains there are other types of curtains 21, such as double-sided curtains. Fig. 7 is a schematic diagram of a double-sided curtain provided in an embodiment of the present application: two edges of the double-sided curtain carry dark edge lines 212, while the other two edges carry none. For example, as shown in fig. 7, in the double-sided curtain the upper and lower edges may each carry a black edge line 212 of a certain width, while the left and right edges carry no dark edge lines 212. The four edges of a double-sided curtain therefore cannot form a closed quadrangle (e.g., a rectangle), so the projection device 1 may fail to identify the position of the double-sided curtain accurately when projecting, or may identify the double-sided curtain as an obstacle, and the image to be projected cannot be projected onto the curtain 21 accurately.
It should be noted that in a double-sided curtain, any two edges may carry the dark edge lines while the remaining two do not; this is not limited in the present application. For example, the left and right edges of a double-sided curtain may carry black edge lines 212 of a certain width while the upper and lower edges carry none, in which case the positions of the upper and lower edges cannot be identified accurately.
To solve the above problem, the projection device 1 provided in the embodiments of the present application can accurately project the content onto the area of the curtain 21 even for curtains of other specifications that are not four-sided curtains (for example, double-sided curtains), thereby improving the universality of the projection device 1 with respect to curtains 21.
Fig. 8 is a flowchart of the projection process by which the projection device projects an image to be projected onto the curtain. As shown in fig. 8, the projection process includes steps S810 to S860.
In step S810, a first image is received, where the first image is an image of the area where the curtain 21 is located.
In some embodiments, the image capturing device 60 may obtain the first image by photographing the region where the curtain 21 is located. For example, the first image may include various objects in the environment of the area where the curtain 21 is located, such as a wall, wall hangings, wallpaper, and the like. In some examples, the first image captured by the image capturing device 60 may or may not be a color image; some embodiments of the present application take a color first image as an example.
The image pickup device 60 transmits the photographed first image to the controller 50, and the controller 50 receives the first image.
Step S820, processing the first image to obtain a second image; the pixel value of each pixel in the second image is the first pixel value or the second pixel value.
After receiving the first image transmitted by the image capturing device 60, the controller 50 processes it accordingly. In some embodiments, the controller 50 may perform gray-scale conversion, cropping, binarization, and the like on the received first image to obtain the second image. The pixel value of each pixel in the resulting second image is either the first pixel value or the second pixel value; that is, the second image contains only two pixel values, corresponding to two different colors, which facilitates the identification of the curtain 21 by the projection device 1.
Illustratively, the first pixel value may be 0 and the second pixel value may be 255, so that the second image contains black pixels with value 0 and white pixels with value 255, further improving the accuracy of curtain identification.
In some embodiments, as shown in fig. 9, processing the first image in step S820 to obtain the second image includes the following steps S910 to S930.
Step S910, performing gray-scale conversion on the first image to obtain a gray-scale image corresponding to the first image.
In general, the first image is a color image captured by the image capturing device 60. To identify the area of the curtain 21 among the environmental elements in the first image more easily and accurately, the controller 50 may perform gray-scale conversion on the acquired first image, converting the color first image into a corresponding gray map. For example, the controller 50 may adjust the gray value of each pixel of the first image according to a preset transformation relationship, so that the resulting gray map has better image quality and display effect than the first image.
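The gray-scale conversion of step S910 can be sketched as below. The ITU-R BT.601 luma weights used here are an assumption chosen for illustration; the text says only that a preset transformation relationship is applied.

```python
def to_grayscale(rgb_image):
    """Convert an RGB image (nested lists of (r, g, b) tuples) into a gray map.
    The BT.601 weights 0.299/0.587/0.114 are a common choice, assumed here."""
    return [[round(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
            for row in rgb_image]
```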
In step S920, the gray map corresponding to the first image is cropped to obtain a cropped image that includes the curtain 21.
In general, the first image may be relatively large; for example, it may cover the entire wall containing the area of the curtain 21. To locate the curtain area more quickly, the controller 50 may crop the gray map corresponding to the first image to obtain a cropped image. The cropped image is smaller than the gray map corresponding to the first image, and is the part of the first image that includes the area of the curtain 21.
For example, the controller 50 may intercept the gray-scale image corresponding to the first image according to a preset parameter, where the preset parameter may be a parameter set in advance inside the projection device 1, or a default parameter of the projection device 1. The preset parameter may also be set based on internal and external parameters of the light engine 200. For example, if the preset parameter in the projection apparatus 1 is 50 px × 50 px and the gray-scale image corresponding to the first image is 100 px × 100 px, the controller 50 may intercept, from the 100 px × 100 px gray-scale image, a 50 px × 50 px portion that includes the area of the curtain 21 as the intercepted image.
The controller 50 may also intercept the first image based on the maximum projection area of the projection device 1. In some examples, the maximum projection area of the projection device 1 is related to the device type of the projection device 1, and different types of projection devices 1 may have maximum projection areas of different sizes. For example, the projection apparatus 1 may cut out, from the gray-scale image corresponding to the first image, an area having the same size as its maximum projection area as the intercepted image.
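The interception itself can be sketched as a simple sub-image copy; the (top, left) origin parameters are illustrative assumptions, since the patent only requires that the intercepted image be smaller than the gray-scale image and contain the curtain area.

```python
def intercept(gray, top, left, height, width):
    """Return a height x width sub-image of `gray` starting at (top, left)."""
    return [row[left:left + width] for row in gray[top:top + height]]

# e.g. a 50x50 crop out of a 100x100 gray-scale image:
# crop = intercept(gray_image, 25, 25, 50, 50)
```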
The execution order of step S910 and step S920 is not limited; for example, step S920 may be performed first to intercept the first image, and then step S910 may be performed to perform gray-scale conversion on the intercepted image.
Step S930, binarizing the intercepted image to obtain a second image.
In some examples, binarization processing sets the pixel value (also called the gray value) of each pixel in the intercepted image to 0 or 255, so that the intercepted image exhibits an obvious visual effect of only black and white. The pixels in the second image obtained after binarization thus have only two possible pixel values: the pixel value of each pixel is either the first pixel value (e.g., pixel value 0) or the second pixel value (e.g., pixel value 255), so the second image contains only two colors (e.g., black and white). Therefore, by performing binarization processing on the intercepted image, the accuracy with which the projection apparatus 1 identifies the region of the curtain 21 can be further improved.
For example, the controller 50 may store a preset pixel threshold and compare the pixel values of the pixels in the intercepted image against it: when the pixel value of a pixel in the intercepted image is smaller than the preset pixel threshold, the pixel value of that pixel is set to the first pixel value; when the pixel value of a pixel in the intercepted image is greater than or equal to the preset pixel threshold, the pixel value of that pixel is set to the second pixel value.
In some examples, the preset pixel threshold may be obtained through multiple tests; for example, the preset pixel threshold may be 100. That is, when the pixel value of a pixel in the intercepted image is less than 100, it is adjusted to the first pixel value (e.g., pixel value 0), and when it is greater than or equal to 100, it is adjusted to the second pixel value (e.g., pixel value 255). For example, for a bilateral curtain, after binarization the edge lines 212 on the upper and lower edges may appear black (pixel value 0), and the projection area 211 may appear white (pixel value 255). It should be noted that, for a bilateral curtain, the left and right edges may appear white, so the exact area of the curtain 21 cannot yet be determined, and the further determination in step S830 is required.
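A minimal sketch of the thresholding described above, using the example threshold of 100 and the pixel values 0 and 255 from the text:

```python
def binarize(gray, threshold=100, low=0, high=255):
    """Set each pixel below `threshold` to `low` and all others to `high`."""
    return [[low if px < threshold else high for px in row] for row in gray]

second_image = binarize([[30, 120], [99, 100]])  # -> [[0, 255], [0, 255]]
```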
In step S830, if the area of a first connected region composed of the first pixel values in the second image is smaller than a first threshold area, the first pixel values in the first connected region are adjusted to the second pixel values to obtain a third image.
In some embodiments, pixels with the first pixel value in the second image may constitute a plurality of first connected regions, and pixels with the second pixel value may constitute a plurality of second connected regions, where the second connected regions include the region of the curtain 21.
Specifically, each first connected region and each second connected region may be a closed pattern. For example, each first connected region may be composed of black pixels, that is, the pixel values of the pixels in the first connected regions are all 0; each second connected region may be composed of white pixels, that is, the pixel values of the pixels in the second connected regions are all 255.
In some examples, the number of first connected regions or second connected regions may be one, or may be greater than one; if there are multiple first connected regions and second connected regions, the sizes (e.g., the areas) of the individual first connected regions and second connected regions may be the same or different, which is not limited in this application.
The first threshold area may be obtained through multiple tests. For example, the first threshold area may be one quarter of the area of the largest first connected region among the plurality of first connected regions. The first threshold area may be used to indicate whether a first connected region affects the projection effect of the projection device 1 or, in other words, whether the first connected region may be part of the projection area of the projection device 1.
In some examples, when the area of a first connected region composed of the first pixel values is smaller than the first threshold area, the first connected region is considered small and is taken to contain no obstacle that would affect projection by the projection apparatus 1. Thus, to remove areas (or blobs) of abrupt color in the curtain 21, the pixel values of these abrupt color blobs may be processed; for example, the first pixel value of such a portion (e.g., pixel value 0) may be adjusted to the second pixel value (e.g., pixel value 255).
If the area of a first connected region is greater than or equal to the first threshold area, that portion may be an obstacle region and cannot be ignored; in this case, the pixel values in the first connected region are not adjusted, i.e., the first pixel values are maintained.
After the processing in step S830, a third image is obtained. The third image includes one or more second connected regions; illustratively, the region of the curtain 21 may be contained within the largest second connected region in the third image.
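One way to sketch the cleanup of step S830 is a flood fill over the black pixels; the use of 4-connectivity is an illustrative assumption, since the patent does not define how connectivity is measured.

```python
def remove_small_black_regions(img, threshold_area):
    """Repaint any connected region of 0-pixels smaller than threshold_area to 255."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    for sy in range(h):
        for sx in range(w):
            if img[sy][sx] != 0 or seen[sy][sx]:
                continue
            # collect one connected region of black pixels (4-connectivity)
            stack, region = [(sy, sx)], []
            seen[sy][sx] = True
            while stack:
                y, x = stack.pop()
                region.append((y, x))
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if 0 <= ny < h and 0 <= nx < w and not seen[ny][nx] and img[ny][nx] == 0:
                        seen[ny][nx] = True
                        stack.append((ny, nx))
            if len(region) < threshold_area:  # small blob: repaint white
                for y, x in region:
                    img[y][x] = 255
    return img
```

The function modifies the image in place and returns it; regions at or above the threshold are kept as potential obstacle regions.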
In step S840, a first rectangular region is determined in the third image based on the second pixel values in the third image, the first rectangular region being the largest rectangular region in the third image that is composed of the second pixel values.
As can be seen from step S830 above, the second pixel values form one or more second connected regions. First, the largest second connected region among them is determined; then, the first rectangular region is determined within that largest second connected region, that is, the first rectangular region is the largest rectangular region composed of the second pixel values in the second connected region.
It should be noted that the first rectangular area may also be referred to as the first rectangle. In general, the curtain 21 is rectangular, and a rectangular projection area conforms to human viewing habits, giving a better viewing effect. Therefore, the largest rectangular area determined in the second connected region is likely to contain the curtain 21, which facilitates further determining the area of the curtain 21.
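Finding the first rectangular region, the largest axis-aligned rectangle of white pixels, can be sketched with the standard histogram-and-stack technique; the patent does not prescribe an algorithm, so this is one illustrative implementation.

```python
def largest_white_rectangle(img):
    """Return (top, left, height, width) of the largest rectangle of 255s."""
    w = len(img[0])
    heights = [0] * w            # histogram of consecutive white pixels per column
    best, best_area = (0, 0, 0, 0), 0
    for row_idx, row in enumerate(img):
        for x in range(w):
            heights[x] = heights[x] + 1 if row[x] == 255 else 0
        # largest rectangle in this histogram, monotonic stack with sentinel
        stack = []               # indices of columns with increasing heights
        for x in range(w + 1):
            cur = heights[x] if x < w else 0
            while stack and heights[stack[-1]] >= cur:
                h = heights[stack.pop()]
                left = stack[-1] + 1 if stack else 0
                area = h * (x - left)
                if area > best_area:
                    best_area = area
                    best = (row_idx - h + 1, left, h, x - left)
            if x < w:
                stack.append(x)
    return best
```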
In step S850, an area of the curtain 21 is determined in the first rectangular area according to the preset aspect ratio.
Illustratively, the preset aspect ratio may be set according to a length-width parameter of the curtain 21, user requirements, the desired viewing effect, or the like. For example, the preset aspect ratio may be 16:9, or it may be 4:3. Note that the length and width referred to in this application indicate two adjacent sides of the first rectangular area; the length and width may also be referred to as the width and height, and the preset aspect ratio may also be referred to as a preset length-width ratio, which is not limited in this application.
In some embodiments, as shown in fig. 10, determining the region of the curtain 21 from the first rectangular region according to the preset aspect ratio in step S850 includes the following steps S1010 to S1030.
In step S1010, the center point position of the first rectangular area is calculated according to the four corner point positions of the first rectangular area.
In some examples, after the controller 50 determines the first rectangular region (hereinafter also referred to as the first rectangle), the corner coordinates of its four corners may be obtained. For example, the four corner coordinates of the first rectangle may be denoted as A(x1, y1), B(x1, y2), C(x2, y1), and D(x2, y2).
From the four corner coordinates of the first rectangle, the coordinates of its center point can be calculated. For example, denoting the center point as O(x3, y3), the center point coordinates of the first rectangle are x3 = (x1 + x2)/2 and y3 = (y1 + y2)/2, i.e., O((x1 + x2)/2, (y1 + y2)/2).
Step S1020, determining four corner positions of the second rectangular region from the first rectangular region according to the preset aspect ratio based on the center point position.
In some embodiments, the center point position of the second rectangular region is the same as the center point position of the first rectangular region, and the aspect ratio of the second rectangular region is a preset aspect ratio.
That is, after the center point coordinates of the first rectangular region are determined, the center point coordinates of the second rectangular region are also determined: the center point of the first rectangular region is taken as the center point of the second rectangular region, and the aspect ratio of the second rectangular region is taken as the preset aspect ratio. The second rectangular region (hereinafter also referred to as the second rectangle) is thereby determined, along with the corner coordinates of its four corner points.
Specifically, the length (or width) of the first rectangle is W1 = x2 − x1, and the width (or height) of the first rectangle is H1 = y2 − y1. Denote the length of the second rectangle by W2 and its width by H2, and take a preset aspect ratio of 16:9 (i.e., W2/H2 = 16:9) as an example:
When W1/H1 > 16/9, W2 = H1 × (16/9) and H2 = H1; when W1/H1 ≤ 16/9, H2 = W1 × (9/16) and W2 = W1.
The coordinates of the four corner points of the second rectangle can then be expressed as E(x3 − W2/2, y3 − H2/2), F(x3 + W2/2, y3 − H2/2), G(x3 + W2/2, y3 + H2/2), and H(x3 − W2/2, y3 + H2/2).
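The center-point and corner formulas of steps S1010 and S1020 can be sketched directly; the 16:9 default ratio follows the example in the text.

```python
def fit_second_rectangle(x1, y1, x2, y2, ratio_w=16, ratio_h=9):
    """Return corners E, F, G, H of the largest ratio_w:ratio_h rectangle
    centered at the center of the first rectangle (x1, y1)-(x2, y2)."""
    x3, y3 = (x1 + x2) / 2, (y1 + y2) / 2      # center point O
    w1, h1 = x2 - x1, y2 - y1
    if w1 / h1 > ratio_w / ratio_h:            # first rectangle is too wide
        h2 = h1
        w2 = h1 * ratio_w / ratio_h
    else:                                      # too tall (or exact ratio)
        w2 = w1
        h2 = w1 * ratio_h / ratio_w
    return [(x3 - w2 / 2, y3 - h2 / 2), (x3 + w2 / 2, y3 - h2 / 2),
            (x3 + w2 / 2, y3 + h2 / 2), (x3 - w2 / 2, y3 + h2 / 2)]

corners = fit_second_rectangle(0, 0, 32, 9)   # a 32:9 rectangle shrinks to 16:9
```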
In step S1030, the area of the curtain 21 is determined according to the four corner positions of the second rectangular area.
After four corner coordinates of the second rectangle are determined, the area of the curtain 21 may be determined according to the four corner coordinates of the second rectangle.
In some embodiments, as shown in fig. 11, determining the region of the curtain 21 according to the four corner positions of the second rectangular region in step S1030 includes the following steps S1110 to S1140.
Step S1110, performing filtering processing on the intercepted image to obtain a fourth image.
After step S920 in the above embodiment is completed, filtering processing may also be performed on the intercepted image, for example, mean value filtering. Mean value filtering smooths the intercepted image so as to suppress the influence of irrelevant corner points in the intercepted image; the filtered image is the fourth image.
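A minimal sketch of mean value filtering (step S1110); the 3×3 kernel size and the edge handling (border pixels average only their in-bounds neighbors) are illustrative assumptions, as the patent only names the filter type.

```python
def mean_filter(img):
    """Replace each pixel by the mean of its 3x3 neighborhood (clipped at edges)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[ny][nx]
                    for ny in range(max(0, y - 1), min(h, y + 2))
                    for nx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) // len(vals)
    return out
```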
Step S1120, determining a plurality of projection point positions according to the corner positions of the fourth image.
For example, after the fourth image is obtained, a plurality of projection point positions may be determined in the fourth image by corner detection. In some examples, the coordinates of each corner point in the fourth image may be obtained using an OpenCV function; these corner coordinates are the coordinates of the plurality of projection points in the projection area of the projection apparatus 1. That is, the plurality of projection point coordinates may determine the projection area of the projection apparatus 1, onto which the projection apparatus 1 needs to project the image to be projected, and which includes the area of the curtain 21.
In step S1130, among the plurality of projection point positions, the projection point positions closest to the four corner positions of the second rectangular region, respectively, are determined as target projection point positions.
After the plurality of projection point coordinates in the fourth image have been determined and the four corner coordinates of the second rectangular area have been obtained through step S1020: in some examples, if any of the plurality of projection point coordinates coincide with the four corner coordinates of the second rectangular area, those coinciding coordinates are determined as the target projection point coordinates. If no projection point coordinates coincide with the four corner coordinates of the second rectangular area, the projection point coordinates closest to each of the four corner coordinates of the second rectangular area may be determined as the target projection point positions.
For example, among the plurality of projection point coordinates, a projection point coordinate that coincides with the four corner coordinates of the second rectangular region or a projection point coordinate that is closest to the four corner coordinates of the second rectangular region may be determined in a traversal manner.
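The coincide-or-nearest matching can be sketched as a traversal over all detected projection points; a coinciding point has distance zero, so a single nearest-point rule covers both cases described above.

```python
def match_target_points(proj_points, rect_corners):
    """For each corner of the second rectangle, pick the nearest projection point."""
    def dist2(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    return [min(proj_points, key=lambda p: dist2(p, c)) for c in rect_corners]

targets = match_target_points(
    [(0, 0), (10, 1), (1, 9), (11, 10)],       # detected corner points
    [(0, 0), (10, 0), (10, 10), (0, 10)],      # corners of the second rectangle
)
```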
In step S1140, the region composed of the target proxel positions is determined as the region of the curtain 21.
The region composed of the coordinates of the target projection points is the region of the curtain 21, that is, the coordinates of the target projection points are the coordinates of the four corner points of the curtain 21.
In step S860, the projection assembly 70 is controlled to project the image to be projected onto the curtain 21 based on the region of the curtain 21.
After determining the region of the curtain 21, the controller 50 may control the projection assembly 70 to project the image to be projected onto the curtain 21, or the controller 50 may control the projection assembly 70 to project the image to be projected onto the region of the curtain 21.
Fig. 12 is a flowchart of still another projection method according to some embodiments of the present application. As shown in fig. 12, the projection method may further include steps S1211 to S1223.
Step S1211, a first image is received.
The first image is an image of the area where the curtain 21 is located. This step S1211 is similar to step S810 in the above-described embodiment, and will not be described here.
Step S1212, performing gray-scale conversion on the first image to obtain a gray-scale image corresponding to the first image.
This step is similar to step S910 in the above embodiment, and will not be described here again.
After step S1212 is performed, step S1213 and step S1220 may both be performed, where step S1213 may be performed simultaneously with step S1220, or before or after step S1220; the execution order of step S1213 and step S1220 is not limited in this application.
Step S1213, the gray scale image corresponding to the first image is truncated, so as to obtain a truncated image.
Step S1213 is similar to step S920 in the above embodiment and is not repeated here; note that the intercepted image includes the area of the curtain 21.
Step S1214, binarizing the intercepted image to obtain a second image.
This step is similar to step S930 in the above embodiment, and will not be described here again.
In step S1215, it is determined whether the area of a first connected region composed of the first pixel values in the second image is smaller than the first threshold area.
If the area of the first connected region is smaller than the first threshold area, step S1216 continues to be executed; if the area of the first connected region is greater than or equal to the first threshold area, step S1217 continues to be executed.
In step S1216, the first pixel values in the first connected region are adjusted to the second pixel values to obtain a third image.
This step is similar to step S830 in the above embodiment, and will not be described here again.
Step S1217, a first rectangular region is determined in the third image based on the second pixel value in the third image.
The first rectangular area is the largest rectangular area consisting of the second pixel values in the third image. This step S1217 is similar to step S840 in the above-described embodiment, and will not be described here again.
In step S1218, the center point position of the first rectangular area is calculated according to the four corner point positions of the first rectangular area.
This step is similar to step S1010 in the above embodiment, and will not be described here again.
In step S1219, four corner positions of the second rectangular region are determined from the first rectangular region according to the preset aspect ratio based on the center point position.
This step is similar to step S1020 in the above embodiment, and will not be described here again. After step S1219, step S1222 is continued to be executed.
Step S1220, filtering the truncated image to obtain a fourth image.
This step is similar to step S1110 in the above embodiment and is not repeated here. It should be noted that step S1220 may be performed at any time between step S1212 and step S1222.
Step S1221, determining a plurality of projection point positions according to the corner positions of the fourth image.
This step is similar to step S1120 in the above embodiment and is not repeated here. After step S1221, step S1222 continues to be executed.
In step S1222, among the plurality of projection point positions, the projection point position closest to each of the four corner positions of the second rectangular region is determined as the target projection point position.
This step is similar to step S1130 in the above embodiment and is not repeated here.
In step S1223, the area composed of the target proxel positions is determined as the area of the curtain 21.
This step is similar to step S1140 in the above embodiment and is not repeated here.
In some embodiments, step S860 specifically further includes: comparing the area of the region of the curtain 21 with a second threshold area; and, if the area of the region of the curtain 21 is greater than or equal to the second threshold area, controlling the projection component 70 to project the image to be projected onto the curtain 21 based on the region of the curtain 21.
Specifically, comparing the area of the region of the curtain 21 (which may also be referred to as the area of the curtain 21) with the second threshold area indicates the overlap between the projection area of the projection device 1 and the curtain 21, or, in other words, the coverage of the curtain 21 by the projection area of the projection device 1. If the area of the curtain 21 is greater than or equal to the second threshold area, the projection area of the projection device 1 can cover the area of the curtain 21 well; in this case, the projection device 1 has entered the curtain successfully, and the projection component 70 can project the image to be projected onto the curtain 21 well. If the area of the curtain 21 is smaller than the second threshold area, the projection area of the projection device 1 cannot cover the area of the curtain 21 well; in this case, curtain entry has failed, and the projection device 1 needs to repeat the curtain entry correction or perform other operations to achieve further projection.
Illustratively, the setting of the second threshold area is related to the device type of the projection device 1, among other factors; for example, for a better viewing effect, the second threshold area may be set to 85% of the maximum projection area of the projection device 1. It should be noted that the second threshold area may also be set according to actual requirements, which is not limited in this application.
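A sketch of the success check in step S860, using the 85%-of-maximum-projection-area example value above; the exact second threshold area is device-dependent.

```python
def curtain_entry_succeeds(curtain_area, max_projection_area, ratio=0.85):
    """True when the curtain region meets the second threshold area."""
    return curtain_area >= ratio * max_projection_area
```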
Fig. 13 is a flowchart of another projection method according to some embodiments of the present application, as shown in fig. 13, where the projection method includes steps S1310 to S1370.
Step S1310, receiving a curtain entry opening command.
For example, the projection apparatus 1 may be provided with a curtain entry switch, which may be a push switch or a touch switch, which is not limited in this application. In some examples, when the user activates the curtain entry switch, the switch sends a curtain entry opening command to the controller 50, and the controller 50 receives the command.
Step S1320, it is determined whether the second image has a closed rectangle.
For example, the second image may be acquired by performing steps S1211 to S1214 in the above embodiments. By judging whether the second image has a closed rectangle, the distribution of the edge lines 212 of the curtain 21 can be determined, for example, whether the curtain 21 is a four-sided curtain or a bilateral curtain.
If the second image has a closed rectangle, step S1330 is performed; if the second image does not have the closed rectangle, the process continues to step S1340.
In step S1330, the projection module 70 is controlled to project the image to be projected onto the curtain 21 by four-sided curtain entry.
If the second image has a closed rectangle, the curtain 21 is a four-sided curtain, and projection may be performed according to the projection scheme for a four-sided curtain.
Step S1340, entering the control flow for bilateral curtain entry.
At this time, the controller 50 may perform steps S810 to S860 in the above embodiments to obtain the region of the curtain 21.
In step S1350, it is determined whether the area of the curtain 21 is greater than or equal to the second threshold area.
If the area of the curtain 21 is greater than or equal to the second threshold area, step S1360 is performed, and if the area of the curtain 21 is less than the second threshold area, step S1370 is performed.
In step S1360, the projection device 1 succeeds in entering the curtain.
In step S1370, the projection device 1 fails to enter the curtain.
In summary, after receiving a first image captured by the image capturing device 60, the projection apparatus 1 and projection method provided in the embodiments of the present application may process the first image to obtain a second image in which each pixel has either the first pixel value or the second pixel value; then compare the area of each first connected region formed by the first pixel values in the second image with the first threshold area and, when the area of a first connected region is smaller than the first threshold area, adjust the first pixel values in that first connected region to the second pixel value to obtain a third image; next, determine a first rectangular region in the third image based on the second pixel values, the first rectangular region being the largest rectangular region in the third image composed of the second pixel values; determine the region of the curtain 21 in the first rectangular region according to the preset aspect ratio; and finally control the projection assembly 70 to project the image to be projected onto the curtain 21 based on the region of the curtain 21. In this way, the projection device 1 provided in this application can identify the region of the curtain 21 more accurately, so that the content to be projected is projected onto the curtain 21 more accurately; moreover, the projection device 1 provided in this application has higher universality across different curtains 21.
The foregoing detailed description of the embodiments is merely illustrative of the general principles of the present application and should not be taken in any way as limiting the scope of the invention. Any other embodiments developed in accordance with the present application without inventive effort are within the scope of the present application for those skilled in the art.

Claims (10)

1. A projection device, comprising:
a projection assembly configured to: projecting an image to be projected to a curtain;
an image pickup device configured to: shooting an area where the curtain is located to obtain a first image;
a controller coupled to the image capture device and the projection assembly, respectively, and configured to:
acquiring the first image, and processing the first image to obtain a second image; wherein, the pixel value of each pixel in the second image is the first pixel value or the second pixel value;
if the area of a first connected region formed by the first pixel values in the second image is smaller than a first threshold area, adjusting the first pixel values in the first connected region to the second pixel values to obtain a third image;
determining a first rectangular region in the third image based on the second pixel value in the third image, the first rectangular region being a largest rectangular region in the third image composed of the second pixel value;
determining the area of the curtain in the first rectangular area according to a preset length-width ratio;
and controlling the projection component to project the image to be projected to the curtain based on the region of the curtain.
2. The projection device of claim 1, wherein the controller performs determining the region of the curtain in the first rectangular region according to a preset aspect ratio, further configured to:
calculating the center point position of the first rectangular area according to the four corner point positions of the first rectangular area;
determining four corner positions of a second rectangular area from the first rectangular area according to the preset length-width ratio based on the center point position of the first rectangular area; the center point position of the second rectangular area is the same as the center point position of the first rectangular area, and the aspect ratio of the second rectangular area is the preset aspect ratio;
and determining the area of the curtain according to the four corner positions of the second rectangular area.
3. The projection device of claim 2, wherein the controller performs processing of the first image to obtain a second image, further configured to:
performing gray level conversion on the first image to obtain a gray level image corresponding to the first image;
intercepting a gray level image corresponding to the first image to obtain an intercepted image; wherein the intercepted image comprises the curtain;
and performing binarization processing on the intercepted image to obtain the second image.
4. The projection device of claim 3, wherein the controller performs determining the area of the curtain from four corner locations of the second rectangular area, further configured to:
filtering the intercepted image to obtain a fourth image;
determining a plurality of projection point positions according to the angular point positions in the fourth image;
determining the closest projection point positions to the four corner positions of the second rectangular area from the plurality of projection point positions as target projection point positions;
and determining the region formed by the target projection points as the region of the curtain.
5. The projection device of any of claims 1-4, wherein the controller performing the controlling, based on the region of the curtain, of the projection component to project the image to be projected to the curtain is further configured to:
comparing the area of the curtain with a second threshold area;
and if the area of the curtain is larger than or equal to the second threshold area, controlling the projection component to project the image to be projected to the curtain based on the area of the curtain.
6. A projection method, the method comprising:
acquiring a first image, wherein the first image is an image of an area where the curtain is located;
processing the first image to obtain a second image; wherein, the pixel value of each pixel in the second image is the first pixel value or the second pixel value;
if the area of a first connected region formed by the first pixel values in the second image is smaller than a first threshold area, adjusting the first pixel values in the first connected region to the second pixel values to obtain a third image;
determining a first rectangular region in the third image based on the second pixel value in the third image, the first rectangular region being a largest rectangular region in the third image composed of the second pixel value;
determining the area of the curtain in the first rectangular area according to a preset length-width ratio;
and controlling the projection component to project the image to be projected to the curtain based on the region of the curtain.
7. The method of claim 6, wherein the determining the region of the curtain from the first rectangular region according to a preset aspect ratio further comprises:
calculating the center point position of the first rectangular area according to the four corner point positions of the first rectangular area;
determining four corner positions of a second rectangular area from the first rectangular area according to the preset length-width ratio based on the center point position of the first rectangular area; the center point position of the second rectangular area is the same as the center point position of the first rectangular area, and the aspect ratio of the second rectangular area is the preset aspect ratio;
and determining the area of the curtain according to the four corner positions of the second rectangular area.
8. The method of claim 7, wherein processing the first image to obtain a second image further comprises:
performing gray level conversion on the first image to obtain a gray level image corresponding to the first image;
intercepting a gray level image corresponding to the first image to obtain an intercepted image; the intercepted image comprises the curtain;
and performing binarization processing on the intercepted image to obtain the second image.
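The grayscale/crop/binarize chain of claim 8 might look like the following sketch; the BT.601 luma weights and the fixed threshold are assumptions, since the patent specifies neither:

```python
def to_gray(rgb_img):
    """Grayscale conversion; ITU-R BT.601 luma weights are an assumption."""
    return [[round(0.299 * r + 0.587 * g + 0.114 * b) for r, g, b in row]
            for row in rgb_img]

def crop(img, top, left, height, width):
    """Cut out the sub-image expected to contain the curtain."""
    return [row[left:left + width] for row in img[top:top + height]]

def binarize(gray, thresh=128):
    """Fixed-threshold binarization; the threshold value is an assumption
    (an adaptive method such as Otsu's would equally fit the claim)."""
    return [[255 if v >= thresh else 0 for v in row] for row in gray]
```

Chained as `binarize(crop(to_gray(first_image), top, left, h, w))`, this yields the two-valued second image that claim 6 starts from.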
9. The method of claim 8, wherein the determining the region of the curtain according to the positions of the four corner points of the second rectangular region further comprises:
filtering the cropped image to obtain a fourth image;
determining a plurality of projection point positions according to the corner point positions of the fourth image;
determining, from the plurality of projection point positions, the projection point positions closest to the four corner points of the second rectangular region as target projection point positions;
and determining the region formed by the target projection points as the region of the curtain.
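The corner matching in claim 9 — for each corner of the second rectangle, pick the closest detected projection point — reduces to a nearest-neighbour lookup. Euclidean distance and the function name are assumptions:

```python
import math

def match_corners(projection_points, rect_corners):
    """For each corner point of the second rectangular region, return the
    closest candidate projection point (claim 9). Points are (x, y) tuples."""
    return [min(projection_points, key=lambda q: math.dist(c, q))
            for c in rect_corners]
```

The four returned points then bound the curtain region handed to the projection component.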
10. The method of any of claims 6-9, wherein the controlling the projection component to project the image to be projected onto the curtain based on the region of the curtain further comprises:
comparing the area of the region of the curtain with a second threshold area;
and if the area of the region of the curtain is greater than or equal to the second threshold area, controlling the projection component to project the image to be projected onto the curtain based on the region of the curtain.
CN202211614385.1A 2022-12-12 2022-12-12 Projection apparatus and projection method Pending CN116246553A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211614385.1A CN116246553A (en) 2022-12-12 2022-12-12 Projection apparatus and projection method

Publications (1)

Publication Number Publication Date
CN116246553A true CN116246553A (en) 2023-06-09

Family

ID=86625025

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211614385.1A Pending CN116246553A (en) 2022-12-12 2022-12-12 Projection apparatus and projection method

Country Status (1)

Country Link
CN (1) CN116246553A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination