WO2020155995A1 - Projection method and projection device - Google Patents

Projection method and projection device

Info

Publication number
WO2020155995A1
Authority
WO
WIPO (PCT)
Prior art keywords: image, projection, brightness, area, stray light
Application number
PCT/CN2019/129517
Other languages: English (en), French (fr)
Inventor
李勇
高志强
杨伟樑
杨承德
Original Assignee
广景视睿科技(深圳)有限公司
Application filed by 广景视睿科技(深圳)有限公司
Publication of WO2020155995A1
Priority to US17/389,443 (granted as US11886107B2)

Classifications

    • G03B7/08 Control of exposure effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
    • G03B9/02 Diaphragms
    • G03B17/54 Details of cameras or camera bodies adapted for combination with a projector
    • G03B21/14 Projectors or projection-type viewers; details
    • G03B21/53 Means for automatic focusing, e.g. to compensate thermal effects
    • G06T7/12 Image analysis; edge-based segmentation
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G06V10/60 Extraction of image or video features relating to illumination properties, e.g. using a reflectance or lighting model
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/317 Convergence or focusing systems
    • H04N9/3179 Video signal processing therefor
    • H04N9/3182 Colour adjustment, e.g. white balance, shading or gamut
    • H04N9/3194 Testing thereof including sensor feedback

Definitions

  • The embodiments of the present application relate to the field of projection technology, and in particular to a projection method and projection device.
  • As a large-screen display device, the projector has gradually become a standard computer peripheral and is widely used in industries such as education and business.
  • However, the projection quality of existing projection screens is relatively poor.
  • One purpose of the embodiments of the present application is to provide a projection method and projection device that can reduce the influence of stray light.
  • An embodiment of the present application provides a projection method, including:
  • adjusting an aperture, where the aperture is used to control the luminous flux projected to the projection area.
  • An embodiment of the present application provides a projection device, including:
  • an acquisition module, used to acquire a projected image of the projection area while projecting and an environment image while not projecting; and
  • an adjustment module, configured to adjust an aperture according to the projected image and the environment image, where the aperture is used to control the luminous flux projected to the projection area.
  • In the embodiments of the present application, the projected image and the environment image of the projection area are acquired, where the environment image is captured either while a preset projection picture is projected onto the projection area or while nothing is projected.
  • The aperture, which is used to control the luminous flux projected to the projection area, is adjusted according to the two images. By adjusting the aperture, the influence of stray light is reduced and the relationship between projection contrast and brightness is effectively coordinated, so that the two are balanced and projection quality is improved.
  • Figure 1a is a schematic structural diagram of a projection device provided by an embodiment of the present application.
  • Figure 1b is a schematic structural diagram of a projection module provided by an embodiment of the present application.
  • Figure 1c is a schematic structural diagram of an aperture provided by an embodiment of the present application.
  • Figure 1d is a schematic structural diagram of a projection device provided by another embodiment of the present application.
  • Figure 2a is a schematic flowchart of a projection method provided by an embodiment of the present application.
  • Figure 2b is a schematic flowchart of S22 in Figure 2a.
  • Figure 2c is a schematic diagram of a projected image provided by an embodiment of the present application.
  • Figure 2d is a schematic flowchart of S222 in Figure 2b.
  • Figure 2e is a schematic flowchart of S2222 in Figure 2d.
  • Figure 2f is a schematic flowchart of S221 in Figure 2b.
  • Figure 2g is a schematic flowchart of S2211 in Figure 2f.
  • Figure 3a is a schematic flowchart of S2212 in Figure 2f.
  • Figure 3b is a schematic flowchart of S31 in Figure 3a.
  • Figures 4a to 4d are schematic diagrams of the coordinates of the four intersection points of the quadrilateral image provided by an embodiment of the present application.
  • Figure 5 is a schematic flowchart of a projection method provided by another embodiment of the present application.
  • Figure 6a is a schematic structural diagram of a projection device provided by an embodiment of the present application.
  • Figure 6b is a schematic structural diagram of the adjustment module in Figure 6a.
  • Figure 6c is a schematic structural diagram of a projection device provided by another embodiment of the present application.
  • Figure 6d is a schematic structural diagram of a projection device provided by yet another embodiment of the present application.
  • Figure 7 is a schematic structural diagram of a controller provided by an embodiment of the present application.
  • When projecting, users often require the projected picture to have high quality and high contrast to obtain a comfortable, high-quality projection effect. For a dynamic image with fast light-dark transitions, the higher the contrast, the easier it is for human eyes to distinguish the transition, bringing users a realistic visual experience.
  • However, projection equipment faces a trade-off between brightness and contrast: when the projection equipment has high brightness, it is usually accompanied by more stray light, resulting in reduced contrast.
  • When the projection device is smaller, the condenser lens sits closer to the digital micromirror device. Much of the off-state (flat-state) light is then reflected by the condenser lens, and serious stray-light regions appear at the boundary of the projected picture, which reduces the quality of the projected image and affects the viewer's visual experience.
  • To this end, an embodiment of the present application provides a projection device.
  • The projection device can be configured in any shape and used in any suitable business scenario.
  • For example, the projection device can be configured as a projection phone, a large projector, a projection TV, and so on.
  • The projection device 10 includes a projection module 11, an image acquisition module 12, an aperture 13, and a controller 14.
  • The projection module 11 is used to project a projection picture to the projection area; the projection picture can be in any suitable image format.
  • The projection module 11 can be any suitable type of projection module, such as one using CRT, LCD, DLP, or DLV technology.
  • The projection module 11 includes an illumination light source 111 and a projection lens 112.
  • The projection lens 112 is arranged on the light-exit side of the illumination light source 111, and the aperture 13 is arranged between the illumination light source 111 and the projection lens 112. By adjusting the aperture 13, part of the luminous flux from the illumination light source 111 through the projection lens 112 is blocked, thereby reducing the impact of stray light.
  • The image acquisition module 12 is used to capture an image of the projection area.
  • The image acquisition module 12 includes one or more optical sensors and a lens.
  • The one or more optical sensors are arranged on the imaging surface of the lens, so that the optical image formed by the lens is projected onto the sensors.
  • The optical sensors include charge-coupled device (CCD) sensors and complementary metal-oxide-semiconductor (CMOS) sensors.
  • The CMOS sensors can be back-illuminated CMOS sensors or stacked CMOS sensors.
  • In some embodiments, the image acquisition module 12 also integrates an image signal processor (ISP).
  • The ISP processes the output data of the optical sensors, providing functions such as automatic exposure control (AEC), automatic gain control (AGC), automatic white balance (AWB), and color correction.
  • The aperture 13 is used to control the luminous flux projected to the projection area.
  • The aperture 13 can move back and forth along the direction perpendicular to the light-exit axis of the illumination light source 111 to control the luminous flux projected to the projection area.
  • When the aperture 13 moves away from the light-exit axis, it passes more of the projection light from the illumination light source 111 to the projection lens 112.
  • When the aperture 13 moves toward the light-exit axis, it blocks part of the projection light from the illumination light source 111 to the projection lens 112, thereby reducing the influence of stray light and improving contrast.
  • The aperture 13 includes a base 131, a light-blocking sheet 132, and an adjusting device 133.
  • The base 131 is provided with a light-through hole 13a, which has a central axis.
  • The number of light-through holes 13a can be one, or two or more.
  • The light-blocking sheet 132 is disposed on the base 131 and can be moved away from or toward the central axis to adjust the opening area of the light-through hole 13a.
  • The adjusting device 133 is arranged on the base 131 and is connected to both the light-blocking sheet 132 and the controller 14.
  • The adjusting device 133 is used to control the movement of the light-blocking sheet 132.
  • The adjusting device 133 includes a motor and a transmission mechanism: the motor is connected to the transmission mechanism, the transmission mechanism is connected to the light-blocking sheet 132, and the motor is also connected to the controller 14.
  • The controller 14 sends control commands to the motor.
  • Through the transmission mechanism, the motor moves the light-blocking sheet 132 away from or toward the central axis of the light-through hole 13a to adjust its opening area. For example, moving the light-blocking sheet 132 away from the central axis enlarges the opening area of the light-through hole 13a, while moving it toward the central axis reduces the opening area.
  • The transmission mechanism may include a transmission unit composed of a transmission shaft or various transmission linkages.
  • In some embodiments, the projection device 10 further includes a sensor module 15 connected to the controller 14.
  • The sensor module 15 collects environmental data of the projection environment in the projection area and transmits the data to the controller 14.
  • The controller 14 analyzes the environmental data and executes preset projection logic according to the analysis result.
  • The sensor module 15 includes an environment detection device that detects changes in the projection environment in real time, for example the brightness of the ambient light and/or the projection position. From the collected environmental data, the controller 14 determines whether the ambient light brightness and/or the projection position has changed and executes the preset projection logic according to the result.
  • The environment detection device includes a light-sensing detection device and/or a position-movement sensor, where the light-sensing detection device detects the brightness of the ambient light and the position-movement sensor determines the projection position.
  • The projection device provided by the embodiments of the present application reduces the influence of stray light, increasing the projection contrast and thereby improving projection quality.
  • The embodiments of the present application also provide a projection method, which can be applied as a set of instructions to the projection device described above, so that executing the instructions achieves the purpose and functions of the projection method.
  • The projection method S200 includes:
  • The projection area is the area onto which the projection picture is projected; it can be selected by the user according to business requirements, for example a projection screen or a projection wall.
  • The projection picture is displayed at a specific position, and the sum of the areas corresponding to that position is the projection area.
  • The projection area can change rather than being fixed.
  • For example, the projection area can be the projection screen of office A; by enlarging the projection screen, the projection area also becomes larger.
  • The projected image is the image obtained by the projection device photographing the projection area after the projection picture has been projected into it.
  • The projected image includes the image of the projection picture in the projection area and the stray light in the projection area.
  • The environment image may be an image obtained by photographing the projection area while no projection picture is being projected.
  • Since nothing is projected, the projection area contains neither the projection picture nor its stray light, and the environment image includes only the ambient light in the projection area.
  • The environment image may also be an image obtained by photographing the projection area while a preset projection picture, for example a completely black picture, is being projected.
  • The image acquisition module can be installed in the projection device to collect the projected image or the environment image. In some embodiments, the image acquisition module can also be arranged elsewhere; after capturing the projected image or the environment image, it sends the image to the projection device through a wireless module. There are therefore many ways to obtain the projected image or the environment image.
  • In some embodiments, when the projected image needs to be acquired, the projection device automatically starts the projection module to project the projection picture to the projection area, and the image acquisition module then photographs the projection area to obtain the projected image. Immediately afterwards, the projection device turns off the projection module to stop projecting, and the image acquisition module photographs the projection area again to obtain the environment image.
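The capture sequence above can be sketched as a short routine. This is a minimal illustration only: `projector` and `camera` are hypothetical device handles, not an API named by the patent, and the returned images are whatever the camera produces.

```python
def acquire_images(projector, camera, picture):
    """Capture sequence from the text: photograph the projection area while
    projecting, then photograph it again after projection stops.
    `projector` and `camera` are hypothetical device interfaces."""
    projector.project(picture)                 # start projecting the picture
    projected_image = camera.capture()         # picture + stray + ambient light
    projector.stop()                           # stop projecting
    environment_image = camera.capture()       # ambient light only
    return projected_image, environment_image
```

Any object exposing `project`, `stop`, and `capture` methods can be substituted, which also makes the sequence easy to test against stub devices.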
  • The projection device analyzes the projected image and the environment image with an image analysis algorithm to determine whether stray light is present in the projection area and, if so, where it is located.
  • The projection device then adjusts the aperture according to the analysis result to reduce or eliminate the stray light. Since the aperture can block part of the projection light from reaching the projection area, it can reduce or eliminate the stray light that this light would otherwise generate. For example, while stray light is present, the projection device keeps shrinking the aperture, reducing the luminous flux passing through it, until the stray light is reduced or eliminated.
  • By adjusting the aperture, the influence of stray light is reduced and the relationship between projection contrast and brightness is effectively coordinated, so that the two are balanced and projection quality is improved.
  • In order to obtain a higher-quality projected image and facilitate the subsequent reduction or elimination of stray light, the projection device needs an initially clear projection display. For example, in some embodiments, the projection device first initializes the projection function to achieve automatic focusing; after focusing, the device projects clearly and the projection position is not moved. The projection device then projects a preset projection picture, which includes an image with a white background and a black frame, to the projection area.
  • In this picture, the contrast between the white area and the surrounding area is pronounced, that is, the brightness of the white area differs considerably from the brightness of the surrounding area in the same image.
  • S22 includes:
  • Stray light is light that forms around the projection picture when it is projected onto the projection area and that reduces the quality of the projected image.
  • Ambient light is the light projected from the surrounding environment onto the projection area.
  • The surrounding environment includes the natural environment or a light source installed in the natural environment that can generate light.
  • The stray-light area image is the image captured where stray light falls on the projection area; the projected image includes the stray-light area image.
  • The ambient-light area image is the image captured where ambient light falls on the projection area; it is contained not only in the projected image but also in the environment image.
  • The projected image 2c0 includes the picture image 2c1, the stray-light area image 2c2, and the ambient-light area image 2c3, where the picture image 2c1 is the projection picture as captured in the projection area.
  • The stray-light area image 2c2 is formed mainly by stray light,
  • and the ambient-light area image 2c3 is formed mainly by ambient light.
  • The picture image 2c1 is quadrilateral and includes four intersection points A, B, C, and D.
  • The projected image and the environment image can share the same coordinate system, so that the coordinates of the picture image, the stray-light area image, and the ambient-light area image can be determined.
  • When the position of the picture image is mapped into the environment image, that position is occupied by ambient light; that is, although the environment image contains no picture image, the position of the picture image in the projected image can be mapped to the same position in the environment image.
  • The projection device determines the stray-light area image from the projected image, and the ambient-light area image from the environment image, according to the image analysis algorithm.
  • The projection device can then calculate the stray-light brightness of the stray-light area image and the ambient-light brightness of the ambient-light area image.
  • The stray-light brightness can be the average brightness of all pixels in the stray-light area image, the brightness of a specific pixel in that image, or a brightness determined in some other way.
  • Likewise, the ambient-light brightness can be the average brightness of all pixels in the ambient-light area image, the brightness of a specific pixel in that image, or a brightness determined in some other way.
  • The projection device calculates the brightness difference between the stray-light brightness and the ambient-light brightness and determines whether the difference meets a preset threshold condition. If it does, the device stops adjusting the aperture; if not, it continues. For example, while continuing to adjust the aperture, the projection device acquires another frame of the projected image while the projection area is being projected, and adjusts the aperture according to that frame and the environment image until the brightness difference meets the preset threshold condition.
  • If the aperture value corresponding to the frame captured at time t cannot reduce or eliminate the stray light, the device continues to shrink the aperture, lowers the aperture value, and projects again at time t+1, thereby obtaining another frame of the projected image.
  • Once the condition is met, the aperture is no longer adjusted.
  • The projection device adjusts the aperture according to the new frame of the projected image and the original environment image, making full use of the existing environment image data.
  • Because the environment does not change rapidly, the environment image does not need to be acquired again, and the aperture is adjusted according to the new projected frame alone.
  • The preset threshold condition includes the brightness difference being less than a preset threshold: when the difference is less than the threshold, adjustment stops; when it is greater, adjustment continues.
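The adjustment loop above can be sketched as a simple feedback procedure. This is a minimal model, not the patent's implementation: the aperture is represented as a scalar opening factor, and `capture` is a hypothetical callback that projects one new frame at the given opening and returns the measured stray-light brightness.

```python
def adjust_aperture(capture, env_brightness, threshold,
                    aperture=1.0, step=0.25, min_aperture=0.0):
    """Shrink the aperture until the stray-light brightness of a freshly
    captured frame differs from the ambient-light brightness by less than
    the preset threshold.  `capture(aperture)` is a hypothetical callback
    returning the stray-light brightness at that aperture opening."""
    while aperture > min_aperture:
        stray = capture(aperture)                 # one new projected frame
        if abs(stray - env_brightness) < threshold:
            break                                 # condition met: stop adjusting
        aperture -= step                          # otherwise keep shrinking
    return aperture

# Toy model in which stray-light brightness scales with the opening:
result = adjust_aperture(lambda a: 150 + 40 * a,
                         env_brightness=150.0, threshold=20.0)   # → 0.25
```

Note that the environment brightness is measured once and reused across iterations, mirroring the text's point that the environment image need not be reacquired.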
  • S222 includes:
  • The projected image and the environment image have the same size, and a common coordinate system can be established for them. The stray-light area image and the ambient-light area image can therefore match pixels with the same coordinates. For example, if pixel M lies in the stray-light area image at coordinates (10, 20) in the projected image, then the pixel at coordinates (10, 20) in the environment image is also pixel M.
  • Although the coordinates of a pixel common to both images are the same, its brightness may differ between the images. For example, the brightness of pixel M may be 200 in the stray-light area image and 150 in the ambient-light area image.
  • The optimal pixel reference point is the pixel that most objectively and comprehensively reflects the influence of stray light on the projected image; it can be determined by a statistical method.
  • Among all pixels common to the stray-light area image and the ambient-light area image, the projection device selects as the optimal pixel reference point the pixel whose brightness in the stray-light area image differs most from its brightness in the ambient-light area image.
  • For example, suppose the pixels common to the stray-light area image and the ambient-light area image are A1, B1, C1, and D1, where pixel A1 has brightness 200 in the stray-light area image and 195 in the ambient-light area image.
  • Pixel B1 has brightness 220 in the stray-light area image and 200 in the ambient-light area image.
  • Pixel C1 has brightness 220 in the stray-light area image and 160 in the ambient-light area image.
  • Pixel D1 has brightness 190 in the stray-light area image and 180 in the ambient-light area image. Since the brightness difference of pixel C1 between the two images is the largest among A1, B1, C1, and D1, pixel C1 is selected as the optimal pixel reference point.
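The selection rule above reduces to an argmax over the common pixels. A minimal sketch, assuming the two area images are given as mappings from pixel labels to brightness (the patent itself works with shared image coordinates):

```python
def optimal_reference_point(stray_brightness, ambient_brightness):
    """Return the common pixel whose brightness differs most between the
    stray-light area image and the ambient-light area image."""
    common = stray_brightness.keys() & ambient_brightness.keys()
    return max(common,
               key=lambda p: abs(stray_brightness[p] - ambient_brightness[p]))

# The worked example from the text: C1 has the largest difference (60).
stray = {"A1": 200, "B1": 220, "C1": 220, "D1": 190}
ambient = {"A1": 195, "B1": 200, "C1": 160, "D1": 180}
best = optimal_reference_point(stray, ambient)   # → "C1"
```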
  • S2222 includes:
  • S22221 Use a preset image filtering algorithm to separately process the stray-light area image and the ambient-light area image;
  • S22222 Calculate the average stray-light brightness of all pixels in the filtered stray-light area image and the average ambient-light brightness of all pixels in the filtered ambient-light area image.
  • the stray light area image and the ambient light area image are respectively processed by the preset image filtering algorithm, some impurity pixels in the stray light area image and the ambient light area image are filtered out, and the original The brightness of all pixels is optimized.
  • all pixels in the filtered stray light area image are [A1, A2, A3...A100]
  • all pixels in the filtered ambient light area image are [B1, B 2, B 3... B 100], where each pixel has a corresponding brightness.
  • the projection device adds all the brightness of all the pixels in the filtered stray light area image, and again, calculates the average value after all the additions, and the projection device uses the average value as the average value of the stray light brightness.
  • the projection device adds all the brightness of all the pixels in the filtered ambient light area image, and again, calculates the average value after all the additions, and the projection device uses the average value as the ambient light brightness average.
  • the preset image filter algorithm includes a median filter algorithm, and the median filter algorithm in this embodiment uses a 3*3 filter template.
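The 3*3 median filtering and subsequent averaging can be sketched as below. This is a minimal pure-Python illustration; keeping border pixels unchanged is a common convention assumed here, not something the patent specifies.

```python
# Minimal sketch of a 3*3 median filter followed by the brightness
# averaging step. The image is a list of rows of brightness values;
# border pixels are left unchanged (an assumption).

def median_filter_3x3(img):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = [img[y + dy][x + dx]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            window.sort()
            out[y][x] = window[4]  # median of the 9 values
    return out

def mean_brightness(img):
    total = sum(sum(row) for row in img)
    return total / (len(img) * len(img[0]))

img = [[10, 10, 10],
       [10, 255, 10],   # isolated impurity pixel
       [10, 10, 10]]
filtered = median_filter_3x3(img)
print(filtered[1][1])            # impurity replaced by the median: 10
print(mean_brightness(filtered)) # 10.0
```

The isolated 255 "impurity" pixel is replaced by the neighbourhood median, which is exactly why the average computed afterwards is more robust.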
  • there are many ways to determine the stray light area image of the stray light in the projection image. For example, the brightness range of the stray light can be determined, the pixels whose brightness falls within that range can be traversed from the projection image, and those pixels connected to form the stray light area image, thereby determining the stray light area image of the stray light in the projection image. For another example, in some embodiments, referring to FIG. 2f, S221 includes:
  • when the projection picture is projected onto the projection area and captured, the projection image is obtained; the projection image contains the picture image corresponding to the projection picture.
  • stray light is mainly concentrated around the projection screen.
  • for example, the region within roughly 3 to 10 cm of the boundary of the projection picture is the stray light area. Therefore, once the projection image is obtained, if the picture image is determined within it, the stray light area image can be optimally estimated.
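The estimate above — a band just outside the picture boundary — can be sketched as a mask computation. The pixel margin here stands in for the "3 to 10 cm" band and would in practice depend on resolution and projection distance; it is an assumption for illustration.

```python
# Hedged sketch of estimating the stray light region as a band just
# outside the picture image's bounding box. `margin` (in pixels) is a
# stand-in for the physical 3-10 cm band described in the text.

def stray_band_mask(h, w, box, margin):
    """box = (top, left, bottom, right) of the picture image."""
    top, left, bottom, right = box
    mask = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            inside = top <= y <= bottom and left <= x <= right
            near = (top - margin <= y <= bottom + margin and
                    left - margin <= x <= right + margin)
            mask[y][x] = near and not inside  # band outside the picture
    return mask

mask = stray_band_mask(8, 8, (2, 2, 5, 5), 1)
print(mask[1][2], mask[3][3], mask[0][0])  # True False False
```

Pixels marked `True` (just outside the picture, within the margin) are the stray light candidates; pixels inside the picture or far from it are excluded.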
  • S2211 includes:
  • S22112 Traverse each pixel point in the projection image whose brightness is greater than or equal to the average projection brightness;
  • the projection device sums the brightness values of all pixels in the projection image, then calculates the average brightness of all pixels and uses that average as the projection brightness average.
  • the projection device traverses each pixel in the projection image whose brightness is greater than or equal to the average projection brightness and characterizes each such pixel: for example, the brightness of pixels at or above the average projection brightness is set to 255, and the brightness of pixels below the average projection brightness is set to 0.
  • the projection device connects each pixel with a brightness greater than or equal to the average projection brightness, for example, connects all the pixels with a brightness of 255 to form a picture image.
  • the projection device connects each pixel with a brightness lower than the average projection brightness, for example, connects all the pixels with a brightness of 0 to form a stray light area image or an ambient light area image.
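The averaging-and-characterization steps above amount to thresholding the image at its mean brightness. A minimal sketch, with illustrative brightness values:

```python
# Sketch of the thresholding step: pixels at or above the average
# projection brightness become 255 (picture image candidates), the
# rest become 0 (stray light / ambient light candidates).

def binarize_by_mean(img):
    flat = [v for row in img for v in row]
    mean = sum(flat) / len(flat)
    return [[255 if v >= mean else 0 for v in row] for row in img]

img = [[200, 210, 40],
       [220, 230, 30],
       [50, 60, 20]]
print(binarize_by_mean(img))
# -> [[255, 255, 0], [255, 255, 0], [0, 0, 0]]
```

Connecting the 255-valued pixels then yields the picture image, and connecting the 0-valued pixels yields the stray light or ambient light area image, as described above.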
  • S2212 includes:
  • the image boundary of the picture image can be any shape, such as a quadrilateral, triangle, circle, or rhombus.
  • the projection device determines the image boundary of the screen image from the projected image according to the image analysis algorithm. Since the pixels corresponding to the stray light are not located in the picture image, the stray light area image can be formed by connecting the pixels not surrounded by the image boundary of the picture image.
  • the picture image includes a quadrilateral image.
  • S31 includes:
  • S311 Determine the coordinates of the four intersection points of the quadrilateral image
  • S312 Connect the coordinates of every two adjacent intersection points to form an image boundary of the screen image.
  • the quadrilateral image includes four intersections A (x1, y1), B (x2, y2), C (x3, y3) and D (x4, y4), where the brightness of each intersection is 255.
  • Fig. 4a taking intersection A as the reference point, below A, the coordinates of A1 are (x1+1, y1), and its brightness is 255. Above A, the coordinates of A2 are (x1-1, y1), and its brightness is 0. On the right of A, the coordinates of A3 are (x1, y1+1), and its brightness is 255. On the left of A, the coordinates of A4 are (x1, y1-1), and its brightness is 0.
  • Fig. 4b taking intersection B as the reference point, below B, the coordinates of B1 are (x2+1, y2), and its brightness is 255. Above B, the coordinates of B2 are (x2-1, y2), and its brightness is 0. On the right of B, the coordinates of B3 are (x2, y2+1), and its brightness is 0. On the left of B, the coordinates of B4 are (x2, y2-1), and its brightness is 255.
  • Fig. 4c taking intersection C as the reference point, below C, the coordinates of C1 are (x3+1, y3), and its brightness is 0. Above C, the coordinates of C2 are (x3-1, y3), and its brightness is 255. On the right of C, the coordinate of C3 is (x3, y3+1), and its brightness is 255. On the left of C, the coordinates of C4 are (x3, y3-1), and its brightness is 0.
  • Fig. 4d taking intersection D as a reference point, below D, the coordinates of D1 are (x4+1, y4), and its brightness is 0. Above D, the coordinates of D2 are (x4-1, y4), and its brightness is 255. On the right of D, the coordinates of D3 are (x4, y4+1), and its brightness is 0. To the left of D, the coordinates of D4 are (x4, y4-1), and its brightness is 255.
  • when the projection device traverses a particular pixel whose four surrounding pixel reference points satisfy the conditions listed in FIGS. 4a to 4d, the projection device can determine that this particular pixel is an intersection.
  • the selected pixel reference points are not limited to 4. For example, selecting three points may also satisfy the judgment condition.
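The four-neighbour corner test of FIGS. 4a to 4d can be sketched as a pattern lookup on the binarized image. Rows are indexed first, matching the patent's convention that (x1+1, y1) lies "below" (x1, y1); the test image below is illustrative, not taken from the figures.

```python
# Hedged sketch of the four-neighbour corner test from FIGS. 4a-4d,
# applied to a binarized image (255 = picture, 0 = background).
# Each pattern is (below, above, right, left) read off the figures.

CORNER_PATTERNS = {
    "A": (255, 0, 255, 0),  # top-left corner
    "B": (255, 0, 0, 255),  # top-right corner
    "C": (0, 255, 255, 0),  # bottom-left corner
    "D": (0, 255, 0, 255),  # bottom-right corner
}

def corner_type(img, x, y):
    if img[x][y] != 255:
        return None
    neigh = (img[x + 1][y], img[x - 1][y], img[x][y + 1], img[x][y - 1])
    for name, pattern in CORNER_PATTERNS.items():
        if neigh == pattern:
            return name
    return None

img = [[0, 0, 0, 0, 0],
       [0, 255, 255, 255, 0],
       [0, 255, 255, 255, 0],
       [0, 255, 255, 255, 0],
       [0, 0, 0, 0, 0]]
print(corner_type(img, 1, 1), corner_type(img, 1, 3),
      corner_type(img, 3, 1), corner_type(img, 3, 3))  # A B C D
```

Interior and edge pixels match no pattern and return `None`, so only the four true intersections are reported.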
  • the projection method S200 further includes:
  • the projection environment includes ambient light brightness and/or projection position
  • changes in the projection environment include ambient light brightness changes and/or projection position changes.
  • the change of the projection position can cause the change of the brightness of the stray light area, the change of the picture image boundary and the change of the image of the stray light area.
  • detecting changes in the projection environment here may mean acquiring projection images or environment images in real time, processing and analyzing the projection picture images, and checking the reference conditions for judging whether the projection environment has changed.
  • the embodiments of the present application provide a projection device.
  • the projection device of the embodiment of the present application can be used as one of the software function units.
  • the projection device includes a number of instructions, which are stored in a memory, and the processor can access the memory and call the instructions for execution to complete the above projection method.
  • the projection device 600 includes: an acquisition module 61 and an adjustment module 62.
  • the acquiring module 61 is configured to acquire a projection image when the projection area is projected and an environment image, where the environment image includes an image when the projection area is projected with a preset projection screen or an image when the projection area is not projected.
  • the adjustment module 62 is used to adjust the aperture according to the projected image and the environmental image, and the aperture is used to control the light flux projected to the projection area.
  • by adjusting the aperture, the influence of stray light can be reduced and the relationship between projection contrast and brightness effectively coordinated, so that projection contrast and brightness are effectively balanced, thereby improving projection quality.
  • the adjustment module 62 includes: a determination unit 621, a calculation unit 622, and an adjustment unit 623.
  • the determining unit 621 is configured to determine the stray light area image of the stray light in the projected image and the ambient light area image of the ambient light in the environmental image;
  • the calculation unit 622 is configured to calculate the stray light brightness of the stray light area image and the ambient light brightness of the ambient light area image;
  • the adjusting unit 623 is used to adjust the aperture according to the stray light brightness and the ambient light brightness.
  • the calculation unit 622 is specifically configured to: select a common optimal pixel reference point from both the stray light area image and the ambient light area image; and select the brightness of the optimal pixel reference point in the stray light area image as the stray light brightness and its brightness in the ambient light area image as the ambient light brightness.
  • the calculation unit 622 is also specifically configured to: from all pixels common to the stray light area image and the ambient light area image, traverse out the common pixel with the largest difference between its brightness in the stray light area image and its brightness in the ambient light area image as the optimal pixel reference point.
  • the calculation unit 622 is further specifically configured to: process the stray light area image and the ambient light area image separately using a preset image filtering algorithm; calculate the average stray light brightness of all pixels in the filtered stray light area image and the average ambient light brightness of all pixels in the filtered ambient light area image; and select the average stray light brightness as the stray light brightness and the average ambient light brightness as the ambient light brightness.
  • the preset image filtering algorithm includes a median filtering algorithm.
  • the determining unit 621 is specifically configured to: determine the screen image of the projection screen in the projection image; and determine the stray light area image of the stray light from the projection image according to the screen image.
  • the determining unit 621 is specifically configured to: obtain the average projection brightness of all pixels in the projected image; traverse each pixel in the projected image whose brightness is greater than or equal to the average projection brightness; connect each pixel, To form a picture image.
  • the determining unit 621 is specifically configured to: determine the image boundary of the screen image; connect the pixels not surrounded by the image boundary to form a stray light area image.
  • the picture image includes a quadrilateral image.
  • the determining unit 621 is specifically configured to: determine the coordinates of the four intersections of the quadrilateral image; connect the coordinates of every two adjacent intersections to form the image boundary of the screen image.
  • the adjustment unit 623 is specifically configured to: calculate the brightness difference between the stray light brightness and the ambient light brightness; determine whether the brightness difference meets the preset threshold condition; if so, stop adjusting the aperture; if not, continue adjusting the aperture.
  • the adjusting unit 623 is further specifically configured to: continue to obtain another frame of the projection image of the projection area while being projected; and adjust the aperture according to the other frame of the projection image and the environment image until the brightness difference meets the preset threshold condition.
  • the preset threshold condition includes that the brightness difference is less than the preset threshold.
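The adjust-measure-repeat loop described above can be sketched as a simple closed-loop controller. The linear response in `measure` is a toy stand-in assumption, not a claim about real optics, and the step size is arbitrary.

```python
# Simplified closed-loop sketch of the aperture adjustment: keep
# contracting the aperture while the stray-light / ambient-light
# brightness difference is at or above a preset threshold.

def adjust_aperture(measure, aperture, threshold, step=1, min_ap=0):
    """measure(aperture) -> (stray_brightness, ambient_brightness)."""
    while aperture > min_ap:
        stray, ambient = measure(aperture)
        if abs(stray - ambient) < threshold:  # preset threshold condition met
            break
        aperture -= step  # contract the aperture, reducing luminous flux
    return aperture

# Toy model: stray light scales with the aperture, ambient light is fixed.
measure = lambda ap: (150 + 10 * ap, 150)
print(adjust_aperture(measure, aperture=8, threshold=15))  # -> 1
```

Each iteration corresponds to acquiring another frame of the projection image at the new aperture value; the loop terminates once the brightness difference drops below the preset threshold.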
  • the projection apparatus 600 further includes: an initialization module 63 and a projection module 64.
  • the initialization module 63 is used to initialize the projection function to achieve autofocus;
  • the projection module 64 is used to project a preset projection image to the projection area.
  • the preset projection screen includes an image with a white background and a black frame.
  • the projection device 600 further includes: a monitoring module 65 and a judgment module 66.
  • the monitoring module 65 is used to monitor whether the projection environment of the projection area changes
  • the judging module 66 is used to, if so, return and continue acquiring the projection image of the projection area while being projected and the environment image while it is not being projected; if not, execute the preset logic.
  • the projection environment includes ambient light brightness and/or projection position.
  • the above-mentioned projection device can execute the projection method provided in the embodiment of the present application, and has the functional modules and beneficial effects corresponding to the execution method.
  • for technical details not exhaustively described in the projection device embodiment, refer to the projection method provided in the embodiments of the present application.
  • the embodiments of the present application provide a controller.
  • the controller 700 includes: one or more processors 71 and a memory 72.
  • one processor 71 is taken as an example in FIG. 7.
  • the processor 71 and the memory 72 may be connected through a bus or in other ways.
  • the connection through a bus is taken as an example.
  • the memory 72 can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as program instructions corresponding to the projection method in the embodiments of the present application.
  • the processor 71 runs the non-volatile software programs, instructions, and modules stored in the memory 72 to execute the projection methods of the foregoing embodiments, or various functional applications and data processing of the projection devices of the foregoing embodiments.
  • the memory 72 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage devices.
  • the memory 72 may optionally include memories remotely provided with respect to the processor 71, and these remote memories may be connected to the processor 71 via a network. Examples of the aforementioned networks include but are not limited to the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof.
  • the program instructions/modules are stored in the memory 72, and when executed by the one or more processors 71, execute the projection method in any of the foregoing method embodiments, for example, to execute the projection method of the foregoing various embodiments , Or various functional applications and data processing of the projection device in the above embodiments.
  • the embodiments of the present application also provide a non-transitory computer-readable storage medium, the non-transitory computer-readable storage medium stores computer-executable instructions, and the computer-executable instructions are used to make the projection device execute any of the above The projection method described in the item.
  • the embodiment of the present application provides a computer program product.
  • the computer program product includes a computer program stored on a non-volatile computer-readable storage medium.
  • the computer program includes program instructions which, when executed by the projection device, cause the projection device to execute any one of the projection methods.
  • by adjusting the aperture, the influence of stray light can be reduced and the relationship between projection contrast and brightness effectively coordinated, so that projection contrast and brightness are effectively balanced, thereby improving projection quality.
  • the above-described device or device embodiments are merely illustrative.
  • the unit modules described as separate components may or may not be physically separated, and the components displayed as module units may or may not be physical units; they can be located in one place or distributed over multiple network module units. Some or all of the modules can be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • each implementation manner can be implemented by software plus a general hardware platform, and of course, it can also be implemented by hardware.
  • the computer software product can be stored in a computer-readable storage medium, such as ROM/RAM, a magnetic disk, or a CD-ROM, and includes a number of instructions to make a computer device (which may be a personal computer, a server, or a network device, etc.) execute the methods described in each embodiment or certain parts of the embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Projection Apparatus (AREA)

Abstract

The embodiments of the present application relate to the field of projection technology, and in particular to a projection method and projection device. The projection method includes: acquiring a projection image of the projection area while being projected and an environment image, where the environment image includes an image of the projection area while a preset projection picture is projected onto it, or an image of the projection area while it is not being projected; and adjusting an aperture according to the projection image and the environment image, the aperture being used to control the luminous flux projected onto the projection area. Therefore, by adjusting the aperture, the influence of stray light can be reduced and the relationship between projection contrast and brightness effectively coordinated, so that contrast and brightness are effectively balanced, thereby improving projection quality.

Description

Projection Method and Projection Device — Technical Field
The embodiments of the present application relate to the field of projection technology, and in particular to a projection method and projection device.
Background
With the development of computer multimedia display technology, projectors, as preferred large-screen display devices, have gradually become standard computer peripheral products and are widely used in education, business, and other industries.
During projection, due to the influence of stray light from the projector's optical system, the projection quality of existing projection pictures is relatively poor.
Summary
One object of the embodiments of the present application is to provide a projection method and projection device capable of reducing the influence of stray light.
To solve the above technical problem, the embodiments of the present application provide the following technical solutions:
In a first aspect, an embodiment of the present application provides a projection method, including:
acquiring a projection image of the projection area while being projected and an environment image, where the environment image includes an image of the projection area while a preset projection picture is projected onto it, or an image of the projection area while it is not being projected;
adjusting an aperture according to the projection image and the environment image, the aperture being used to control the luminous flux projected onto the projection area.
In a second aspect, an embodiment of the present application provides a projection apparatus, including:
an acquiring module, configured to acquire a projection image of the projection area while being projected and an environment image while it is not being projected;
an adjusting module, configured to adjust an aperture according to the projection image and the environment image, the aperture being used to control the luminous flux projected onto the projection area.
In the projection methods provided by the embodiments of the present application, first, a projection image of the projection area while being projected and an environment image are acquired, where the environment image includes an image of the projection area while a preset projection picture is projected onto it, or an image of the projection area while it is not being projected; second, an aperture is adjusted according to the projection image and the environment image, the aperture being used to control the luminous flux projected onto the projection area. Therefore, by adjusting the aperture, the influence of stray light can be reduced and the relationship between projection contrast and brightness effectively coordinated, so that contrast and brightness are effectively balanced, thereby improving projection quality.
Brief Description of the Drawings
One or more embodiments are exemplarily illustrated by the pictures in the corresponding drawings. These exemplary illustrations do not limit the embodiments. Elements with the same reference numerals in the drawings denote similar elements, and unless otherwise stated, the figures in the drawings are not drawn to scale.
Fig. 1a is a schematic structural diagram of a projection device provided by an embodiment of the present application;
Fig. 1b is a schematic structural diagram of a projection module provided by an embodiment of the present application;
Fig. 1c is a schematic structural diagram of an aperture provided by an embodiment of the present application;
Fig. 1d is a schematic structural diagram of a projection device provided by another embodiment of the present application;
Fig. 2a is a schematic flowchart of a projection method provided by an embodiment of the present application;
Fig. 2b is a schematic flowchart of S22 in Fig. 2a;
Fig. 2c is a schematic diagram of a projection image provided by an embodiment of the present application;
Fig. 2d is a schematic flowchart of S222 in Fig. 2b;
Fig. 2e is a schematic flowchart of S2222 in Fig. 2d;
Fig. 2f is a schematic flowchart of S221 in Fig. 2b;
Fig. 2g is a schematic flowchart of S2211 in Fig. 2f;
Fig. 3a is a schematic flowchart of S2212 in Fig. 2f;
Fig. 3b is a schematic flowchart of S31 in Fig. 3a;
Figs. 4a to 4d are schematic coordinate diagrams of the four intersections of a quadrilateral picture image provided by an embodiment of the present application;
Fig. 5 is a schematic flowchart of a projection method provided by another embodiment of the present application;
Fig. 6a is a schematic structural diagram of a projection apparatus provided by an embodiment of the present application;
Fig. 6b is a schematic structural diagram of the adjustment module in Fig. 6a;
Fig. 6c is a schematic structural diagram of a projection apparatus provided by another embodiment of the present application;
Fig. 6d is a schematic structural diagram of a projection apparatus provided by yet another embodiment of the present application;
Fig. 7 is a schematic structural diagram of a controller provided by an embodiment of the present application.
Detailed Description
To make the objectives, technical solutions, and advantages of the present application clearer, the present application is further described in detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present application and are not intended to limit it.
During projection, users often require the projected picture to have high quality and contrast in order to obtain a comfortable, high-quality projection effect. For dynamic images with fast light-dark transitions, the higher the contrast, the more easily the human eye can distinguish such transitions, giving the user a lifelike visual experience. However, brightness and contrast of a projection device often constrain each other: when the projection device has high brightness, it is also accompanied by more stray light, which reduces contrast.
The smaller the projection device, the closer its condenser lens is to the digital micromirror device. In this case, most of the dark-state light and flat-state light is reflected by the condenser lens, and severe stray light regions appear at the boundary area of the projected picture, degrading projection imaging quality and affecting the viewer's visual experience.
On this basis, an embodiment of the present application provides a projection device. The projection device can be constructed in any shape and used in any suitable business scenario; for example, it can be constructed as a projection mobile phone, a large projector, a projection television, and so on.
Referring to Fig. 1a, the projection device 10 includes a projection module 11, an image acquisition module 12, an aperture 13, and a controller 14.
The projection module 11 is configured to project a projection picture onto the projection area; the projection picture may be an image in any suitable image format. The projection module 11 may be of any suitable type, such as a projection module using CRT, LCD, DLP, or DLV technology.
In some embodiments, referring to Fig. 1b, the projection module 11 includes an illumination light source 111 and a projection lens 112. The projection lens 112 is disposed on the light-emitting side of the illumination light source 111, and the aperture 13 is disposed between the illumination light source 111 and the projection lens 112. By adjusting the aperture 13, the luminous flux of the illumination light source 111 passing through the projection lens 112 is blocked, thereby mitigating the influence of stray light.
The image acquisition module 12 is configured to capture an image of the projection area. The image acquisition module 12 includes one or more optical sensors and a lens; the optical sensors are disposed on the imaging surface of the lens, and when the projection area is photographed, the generated optical image is projected onto the optical sensors through the lens. The optical sensors include a charge-coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) sensor; the CMOS sensor may be a back-illuminated or stacked CMOS sensor.
In some embodiments, the image acquisition module 12 is also integrated with an ISP (Image Signal Processor), which processes the output data of the optical sensors, performing functions such as AEC (auto exposure control), AGC (auto gain control), AWB (auto white balance), color correction, and so on.
The aperture 13 is configured to control the luminous flux projected onto the projection area. For example, the aperture 13 can move back and forth in a direction perpendicular to the light-emitting axis of the illumination light source 111 to control the luminous flux projected onto the projection area. When the aperture 13 moves perpendicular to and away from the light-emitting axis, it increases the portion of projection light cast by the illumination light source 111 toward the projection lens 112. When the aperture 13 moves perpendicular to and toward the light-emitting axis, it blocks part of the projection light cast by the illumination light source 111 toward the projection lens 112, thereby reducing the influence of stray light and improving contrast.
In some embodiments, referring to Fig. 1c, the aperture 13 includes a base 131, a light-blocking plate 132, and an adjusting device 133.
The base 131 is provided with a light-passing hole 13a, which has a central axis. In some embodiments, there may be one light-passing hole 13a, or two or more. When the projection module 11 projects a projection picture, the projection light can pass through the light-passing hole 13a and be projected out.
The light-blocking plate 132 is disposed on the base 131 and can move away from or toward the central axis to adjust the opening area of the light-passing hole 13a.
The adjusting device 133 is disposed on the base 131 and is connected to the light-blocking plate 132 and the controller 14 respectively; the adjusting device 133 is configured to control the movement of the light-blocking plate 132. For example, in some embodiments, the adjusting device 133 includes a motor and a transmission mechanism; the motor is connected to the transmission mechanism, the transmission mechanism is connected to the light-blocking plate 132, and the motor is also connected to the controller 14. The controller 14 issues a control command to the motor, and according to the control command the motor, through the transmission mechanism, controls the light-blocking plate 132 to move away from or toward the central axis of the light-passing hole 13a to adjust its opening area. For example, when the light-blocking plate 132 is moved away from the central axis of the light-passing hole 13a, the opening area of the light-passing hole 13a increases; when it is moved toward the central axis, the opening area decreases.
In some embodiments, the transmission mechanism may include a transmission shaft or a transmission unit composed of various transmission links.
In some embodiments, referring to Fig. 1d, the projection device 10 further includes a sensor module 15 connected to the controller 14.
The sensor module 15 is configured to collect environmental data of the projection environment of the projection area and transmit the environmental data to the controller 14; the controller 14 analyzes the environmental data so as to execute preset projection logic according to the analysis result.
In some embodiments, the sensor module 15 includes an environment detection device that detects projection environment changes in real time, for example, detecting ambient light brightness and/or the projection position. The environment detection device thus obtains environmental data, and by analyzing the environmental data, the controller 14 determines whether the ambient light brightness and/or the projection position has changed and executes preset projection logic according to the change result.
In some embodiments, the environment detection device includes a light-sensing detection device and/or a position movement sensor, where the light-sensing detection device is configured to detect the brightness of ambient light and the position movement sensor is configured to determine the projection position.
In general, the projection device provided by the embodiments of the present application can reduce the influence of stray light, thereby improving projection contrast and projection quality.
As another aspect of the embodiments of the present application, an embodiment of the present application further provides a projection method. The projection method can serve as a set of instructions applied in the projection device provided by the above embodiments, so that the projection device executes the instructions to achieve the objectives or functions of the projection method.
Referring to Fig. 2a, the projection method S200 includes:
S21: Acquire a projection image of the projection area while being projected and an environment image, where the environment image includes an image of the projection area while a preset projection picture is projected onto it, or an image of the projection area while it is not being projected;
In this embodiment, the projection area is the area onto which the projection picture is projected, and can be chosen by the user according to business needs; for example, the projection area may be a projection screen or a projection wall. Generally, after a projection picture is projected to a specific position, the picture is displayed at that position, and the sum of the regions corresponding to that position is the projection area.
It can be understood that the projection area may change and is not fixed. For example, the projection area may be the projection screen of office A; by enlarging the area of the projection screen, the projection area also becomes larger.
In this embodiment, the projection image is the image obtained by the projection device photographing the projection area after the projection picture is projected onto it. The projection image contains the picture image of the projection picture in the projection area, the stray light area image of stray light in the projection area, and the ambient light area image of ambient light in the projection area.
In this embodiment, the environment image may be the image obtained by the projection device photographing the projection area when no projection picture is projected onto it. In the environment image, since the projection area is not being projected, there is neither a picture image of a projection picture nor a stray light area image of stray light in the projection area; the environment image contains only the ambient light area image of ambient light in the projection area.
In some embodiments, the environment image may also be the image obtained by the projection device photographing the projection area when a preset projection picture is projected onto it, where the preset projection picture includes an all-black projection picture and the like.
In this embodiment, the image acquisition module may be disposed inside the projection device to capture the projection image or the environment image. In some embodiments, the image acquisition module may also be disposed at another location; after capturing the projection image or environment image, the image acquisition module sends it to the projection device through a wireless module. Therefore, there are many ways to acquire the projection image or environment image.
In this embodiment, when a projection image is needed, the projection device automatically starts the projection module and projects the projection picture onto the projection area; the image acquisition module then photographs the projection area to obtain the projection image. Next, the projection device turns off the projection module and stops projecting the projection picture onto the projection area; the image acquisition module photographs the projection area to obtain the environment image.
S22: Adjust an aperture according to the projection image and the environment image, the aperture being used to control the luminous flux projected onto the projection area.
In this embodiment, the projection device analyzes the projection image and the environment image according to an image analysis algorithm to determine whether stray light still exists and, if so, the location area of the stray light in the projection area. According to the analysis result, the projection device adjusts the aperture to reduce or eliminate stray light. Since the aperture can block part of the projection light from reaching the projection area, it can reduce or eliminate the stray light generated by that portion of the projection light. For example, while stray light exists, the projection device continuously contracts the aperture to reduce the luminous flux passing through it until the stray light is reduced or eliminated.
In summary, by adjusting the aperture, the influence of stray light can be reduced and the relationship between projection contrast and brightness effectively coordinated, so that contrast and brightness are effectively balanced, thereby improving projection quality.
In some embodiments, to obtain a higher-quality projection picture and facilitate subsequent reduction or elimination of stray light, the projection device needs to initialize clear projection display during projection. For example, in some embodiments, first, the projection device initializes the projection function to achieve autofocus; once autofocused, the projection device has the function of clear projection, and at this point the projection device no longer moves or changes the projection position. Second, the projection device projects a preset projection picture onto the projection area; the preset projection picture includes an image with a white background and a black frame. Since in an image with a white background and a black-frame boundary the contrast between the white region and the surrounding region is obvious — that is, the difference between the brightness of the white region and the brightness of the surrounding region in the same image is relatively large — such an image can better locate the region where stray light is distributed, i.e., better determine the stray light area, making it easier later to determine the stray light area image and the picture image of the projection picture in the projection image.
In some embodiments, the aperture can be adjusted according to the stray light brightness and the ambient light brightness. For example, referring to Fig. 2b, S22 includes:
S221: Determine the stray light area image of stray light in the projection image and the ambient light area image of ambient light in the environment image;
S222: Calculate the stray light brightness of the stray light area image and the ambient light brightness of the ambient light area image;
S223: Adjust the aperture according to the stray light brightness and the ambient light brightness.
In this embodiment, stray light is light that forms around the projection picture when it is projected onto the projection area and degrades the quality of the projection picture; ambient light is the light cast onto the projection area by the surrounding environment, which includes the natural environment or light sources disposed in the natural environment.
The stray light area image is the image captured after stray light is cast onto the projection area; the projection image contains the stray light area image.
The ambient light area image is the image captured after ambient light is cast onto the projection area; the ambient light area image is contained not only in the projection image but also in the environment image.
It can be understood that the stray light area image and the ambient light area image intersect. For example, referring to Fig. 2c, the projection image 2c0 includes a picture image 2c1, a stray light area image 2c2, and an ambient light area image 2c3, where the picture image 2c1 is the image captured of the projection picture projected in the projection area, the stray light area image 2c2 is a stray light area image formed mainly by stray light, and the ambient light area image 2c3 is an ambient light area image formed mainly by ambient light. The picture image 2c1 is quadrilateral, including four intersections A, B, C, and D.
It can be understood that the projection image and the environment image can share the same coordinate system, so as to determine the coordinates of the picture image, the coordinates of the stray light area image, and the coordinates of the ambient light area image. Regarding the correspondence of the picture image between the projection image and the environment image, it can be understood that when the position of the picture image in the projection image is mapped to the environment image, that position may be occupied by ambient light; that is, although there is no picture image in the environment image, the position of the picture image in the projection image can be mapped to the same position in the environment image.
In this embodiment, the projection device determines the stray light area image from the projection image and the ambient light area image from the environment image according to an image analysis algorithm. The projection device can then calculate the stray light brightness of the stray light area image and the ambient light brightness of the ambient light area image, where the stray light brightness may be the average brightness of all pixels in the stray light area image, the brightness of a specific pixel in the stray light area image, or a brightness determined in another way. Likewise, the ambient light brightness may be the average brightness of all pixels in the ambient light area image, the brightness of a specific pixel in the ambient light area image, or a brightness determined in another way.
In this embodiment, there are many ways for the projection device to adjust the aperture according to the stray light brightness and the ambient light brightness. For example, the projection device calculates the brightness difference between the stray light brightness and the ambient light brightness and judges whether the brightness difference satisfies a preset threshold condition: if so, it stops adjusting the aperture; if not, it continues to adjust the aperture. For example, while continuing to adjust the aperture, the projection device continues to acquire another frame of projection image of the projection area while being projected, and adjusts the aperture according to that other frame of projection image and the environment image until the brightness difference satisfies the preset threshold condition.
Since the aperture value corresponding to the projection image frame at time t fails to reduce or eliminate the stray light, the aperture continues to be contracted to lower the aperture value, and the projection picture is projected again at time t+1, thereby yielding another frame of projection image. This continues by analogy until the brightness difference satisfies the preset threshold condition, i.e., once the stray light can be reduced or eliminated, the aperture is no longer adjusted.
Moreover, when readjusting, the projection device adjusts the aperture according to the other frame of projection image and the environment image. In this way, it can make full use of the data of the original environment image; since, generally speaking, the environment does not change rapidly, there is no need to acquire the environment image again, and the aperture is adjusted according to the other frame of projection image.
In some embodiments, the preset threshold condition includes the brightness difference being less than a preset threshold; for example, when the brightness difference is less than the preset threshold, aperture adjustment stops, and when the brightness difference is greater than the preset threshold, aperture adjustment continues.
To improve the precision of aperture adjustment so as to more effectively reduce or eliminate stray light, in some embodiments, referring to Fig. 2d, S222 includes:
S2221: Select a common optimal pixel reference point from both the stray light area image and the ambient light area image;
S2222: Select the brightness of the optimal pixel reference point in the stray light area image as the stray light brightness, and its brightness in the ambient light area image as the ambient light brightness.
In this embodiment, the projection image and the environment image are the same size, and a common coordinate system can be established for them, so that pixels with the same coordinates can be found in both the stray light area image and the ambient light area image. For example, pixel M is located in the stray light area image with coordinates (10, 20) in the projection image; then the pixel at coordinates (10, 20) in the environment image is also pixel M. Although the coordinates of a pixel common to the projection image and the environment image are the same, the brightness of that common pixel may differ or be the same in the different images. For example, the brightness of pixel M in the stray light area image is 200, and the brightness of pixel M in the ambient light area image is 150.
In this embodiment, the optimal pixel reference point is the pixel that most objectively and comprehensively reflects the influence of stray light on the projection picture; the optimal pixel reference point can be determined by statistical methods.
In some embodiments, when selecting the optimal pixel reference point, the projection device traverses all pixels common to the stray light area image and the ambient light area image and takes the common pixel with the largest difference between its brightness in the stray light area image and its brightness in the ambient light area image as the optimal pixel reference point. For example, all common pixels of the two images include pixels A1, B1, C1, and D1, where the brightness of A1 in the stray light area image is 200 and in the ambient light area image is 195; the brightness of B1 is 220 and 200 respectively; the brightness of C1 is 220 and 160 respectively; and the brightness of D1 is 190 and 180 respectively. Since the difference for pixel C1 is the largest among A1, B1, and D1, pixel C1 is selected as the optimal pixel reference point.
To further improve the precision of aperture adjustment so as to reduce or eliminate stray light even more effectively, in some embodiments, referring to Fig. 2e, S2222 includes:
S22221: Process the stray light area image and the ambient light area image separately using a preset image filtering algorithm;
S22222: Calculate the average stray light brightness of all pixels in the filtered stray light area image and the average ambient light brightness of all pixels in the filtered ambient light area image;
S22223: Select the average stray light brightness as the stray light brightness and the average ambient light brightness as the ambient light brightness.
In this embodiment, for example, after the stray light area image and the ambient light area image are each processed by the preset image filtering algorithm, some impurity pixels in the two images are filtered out and the brightness of all the original pixels is optimized. For example, all pixels in the filtered stray light area image are [A1, A2, A3, ..., A100], and all pixels in the filtered ambient light area image are [B1, B2, B3, ..., B100], where each pixel has a corresponding brightness.
Next, the projection device sums the brightness of all pixels in the filtered stray light area image, then calculates the average of the sum and uses that average as the average stray light brightness.
Likewise, the projection device sums the brightness of all pixels in the filtered ambient light area image, then calculates the average of the sum and uses that average as the average ambient light brightness.
In some embodiments, the preset image filtering algorithm includes a median filtering algorithm; the median filter in this embodiment uses a 3*3 filter template.
In some embodiments, there are many ways to determine the stray light area image of stray light in the projection image. For example, the brightness range of stray light can be determined, pixels whose brightness falls within that range can be traversed from the projection image, and those pixels connected to form the stray light area image, thereby determining the stray light area image of stray light in the projection image. As another example, in some embodiments, referring to Fig. 2f, S221 includes:
S2211: Determine the picture image of the projection picture in the projection image;
S2212: Determine the stray light area image of the stray light from the projection image according to the picture image.
In this embodiment, when the projection picture is projected onto the projection area and captured, the projection image is obtained; the projection image contains the picture image corresponding to the projection picture.
As mentioned above, generally, stray light is concentrated mainly around the projection picture; for example, the region within roughly 3 to 10 centimeters of the boundary of the projection picture is the stray light area. Therefore, once the projection image is obtained, if the picture image is determined within it, the stray light area image can be optimally estimated.
In some embodiments, referring to Fig. 2g, S2211 includes:
S22111: Calculate the average projection brightness of all pixels in the projection image;
S22112: Traverse each pixel in the projection image whose brightness is greater than or equal to the average projection brightness;
S22113: Connect those pixels to form the picture image.
In this embodiment, for example, first, the projection device sums the brightness of all pixels in the projection image, then calculates the average brightness of all pixels and uses that average as the average projection brightness.
Second, the projection device traverses each pixel in the projection image whose brightness is greater than or equal to the average projection brightness and characterizes each such pixel; for example, the brightness of pixels at or above the average projection brightness is set to 255, and the brightness of pixels below the average projection brightness is set to 0.
Finally, the projection device connects the pixels whose brightness is greater than or equal to the average projection brightness, for example connecting all pixels with brightness 255, to form the picture image. In some embodiments, the projection device connects the pixels whose brightness is below the average projection brightness, for example connecting all pixels with brightness 0, to form the stray light area image or the ambient light area image.
In some embodiments, referring to Fig. 3a, S2212 includes:
S31: Determine the image boundary of the picture image;
S32: Connect the pixels not enclosed by the image boundary to form the stray light area image.
In this embodiment, the image boundary of the picture image may be any shape, such as a quadrilateral, triangle, circle, or rhombus.
In this embodiment, the projection device determines the image boundary of the picture image from the projection image according to an image analysis algorithm. Since the pixels corresponding to stray light are not located within the picture image, the stray light area image can be formed by connecting the pixels not enclosed by the image boundary of the picture image.
In some embodiments, the picture image includes a quadrilateral image. Referring to Fig. 3b, S31 includes:
S311: Determine the coordinates of the four intersections of the quadrilateral image;
S312: Connect the coordinates of every two adjacent intersections to form the image boundary of the picture image.
In this embodiment, for example, still referring to Fig. 2c, the quadrilateral image includes four intersections A (x1, y1), B (x2, y2), C (x3, y3), and D (x4, y4), where the brightness of each intersection is 255.
Referring to Fig. 4a, taking intersection A as the reference point: below A, the coordinates of A1 are (x1+1, y1), with brightness 255; above A, the coordinates of A2 are (x1-1, y1), with brightness 0; to the right of A, the coordinates of A3 are (x1, y1+1), with brightness 255; to the left of A, the coordinates of A4 are (x1, y1-1), with brightness 0.
Referring to Fig. 4b, taking intersection B as the reference point: below B, the coordinates of B1 are (x2+1, y2), with brightness 255; above B, the coordinates of B2 are (x2-1, y2), with brightness 0; to the right of B, the coordinates of B3 are (x2, y2+1), with brightness 0; to the left of B, the coordinates of B4 are (x2, y2-1), with brightness 255.
Referring to Fig. 4c, taking intersection C as the reference point: below C, the coordinates of C1 are (x3+1, y3), with brightness 0; above C, the coordinates of C2 are (x3-1, y3), with brightness 255; to the right of C, the coordinates of C3 are (x3, y3+1), with brightness 255; to the left of C, the coordinates of C4 are (x3, y3-1), with brightness 0.
Referring to Fig. 4d, taking intersection D as the reference point: below D, the coordinates of D1 are (x4+1, y4), with brightness 0; above D, the coordinates of D2 are (x4-1, y4), with brightness 255; to the right of D, the coordinates of D3 are (x4, y4+1), with brightness 0; to the left of D, the coordinates of D4 are (x4, y4-1), with brightness 255.
When the picture image is a quadrilateral image and the projection device traverses a particular pixel whose four surrounding pixel reference points satisfy the conditions listed in Figs. 4a to 4d, the projection device can determine that this particular pixel is an intersection. In some embodiments, the number of selected pixel reference points is not limited to four; for example, selecting three points may also satisfy the judgment condition.
In some embodiments, referring to Fig. 5, the projection method S200 further includes:
S23: Monitor whether the projection environment of the projection area changes;
S24: If so, return and continue to acquire a projection image of the projection area while being projected and an environment image while it is not being projected;
S25: If not, execute preset logic.
In some embodiments, the projection environment includes ambient light brightness and/or projection position, and a change in the projection environment includes a change in ambient light brightness and/or a change in projection position. A change in projection position can cause changes in the brightness of the stray light area, in the boundary of the picture image, and in the stray light area image. During normal projection display after stray light has been reduced or eliminated, if a change in the projection environment is detected — for example, if the light-sensing detection device detects a certain amount of change in ambient light brightness — automatic aperture adjustment is triggered; if a change in projection position is detected, steps such as autofocus for achieving clear projection need to be performed again. It is worth noting that detecting projection environment changes here may mean acquiring projection images or environment images in real time, processing and analyzing the projection picture images, and checking the reference conditions for judging whether the projection environment has changed.
It should be noted that in the above embodiments there is not necessarily a fixed order among the steps. Those of ordinary skill in the art can understand from the description of the embodiments of the present application that in different embodiments the above steps may be executed in different orders, i.e., in parallel, interchanged, and so on.
As another aspect of the embodiments of the present application, an embodiment of the present application provides a projection apparatus. The projection apparatus of this embodiment can serve as a software functional unit; the projection apparatus includes a number of instructions stored in a memory, and a processor can access the memory and call the instructions for execution to complete the above projection method.
Referring to Fig. 6a, the projection apparatus 600 includes an acquiring module 61 and an adjusting module 62.
The acquiring module 61 is configured to acquire a projection image of the projection area while being projected and an environment image, where the environment image includes an image of the projection area while a preset projection picture is projected onto it, or an image of the projection area while it is not being projected.
The adjusting module 62 is configured to adjust the aperture according to the projection image and the environment image, the aperture being used to control the luminous flux projected onto the projection area.
In summary, by adjusting the aperture, the influence of stray light can be reduced and the relationship between projection contrast and brightness effectively coordinated, so that contrast and brightness are effectively balanced, thereby improving projection quality.
In some embodiments, referring to Fig. 6b, the adjusting module 62 includes a determining unit 621, a calculating unit 622, and an adjusting unit 623.
The determining unit 621 is configured to determine the stray light area image of stray light in the projection image and the ambient light area image of ambient light in the environment image;
The calculating unit 622 is configured to calculate the stray light brightness of the stray light area image and the ambient light brightness of the ambient light area image;
The adjusting unit 623 is configured to adjust the aperture according to the stray light brightness and the ambient light brightness.
In some embodiments, the calculating unit 622 is specifically configured to: select a common optimal pixel reference point from both the stray light area image and the ambient light area image; and select the brightness of the optimal pixel reference point in the stray light area image as the stray light brightness and its brightness in the ambient light area image as the ambient light brightness.
In some embodiments, the calculating unit 622 is further specifically configured to: from all pixels common to the stray light area image and the ambient light area image, traverse out the common pixel with the largest difference between its brightness in the stray light area image and its brightness in the ambient light area image as the optimal pixel reference point.
In some embodiments, the calculating unit 622 is further specifically configured to: process the stray light area image and the ambient light area image separately using a preset image filtering algorithm; calculate the average stray light brightness of all pixels in the filtered stray light area image and the average ambient light brightness of all pixels in the filtered ambient light area image; and select the average stray light brightness as the stray light brightness and the average ambient light brightness as the ambient light brightness.
In some embodiments, the preset image filtering algorithm includes a median filtering algorithm.
In some embodiments, the determining unit 621 is specifically configured to: determine the picture image of the projection picture in the projection image; and determine the stray light area image of the stray light from the projection image according to the picture image.
In some embodiments, the determining unit 621 is specifically configured to: calculate the average projection brightness of all pixels in the projection image; traverse each pixel in the projection image whose brightness is greater than or equal to the average projection brightness; and connect those pixels to form the picture image.
In some embodiments, the determining unit 621 is specifically configured to: determine the image boundary of the picture image; and connect the pixels not enclosed by the image boundary to form the stray light area image.
In some embodiments, the picture image includes a quadrilateral image. The determining unit 621 is specifically configured to: determine the coordinates of the four intersections of the quadrilateral image; and connect the coordinates of every two adjacent intersections to form the image boundary of the picture image.
In some embodiments, the adjusting unit 623 is specifically configured to: calculate the brightness difference between the stray light brightness and the ambient light brightness; judge whether the brightness difference satisfies a preset threshold condition; if so, stop adjusting the aperture; if not, continue adjusting the aperture.
In some embodiments, the adjusting unit 623 is further specifically configured to: continue to acquire another frame of projection image of the projection area while being projected; and adjust the aperture according to the other frame of projection image and the environment image until the brightness difference satisfies the preset threshold condition.
In some embodiments, the preset threshold condition includes the brightness difference being less than a preset threshold.
In some embodiments, referring to Fig. 6c, the projection apparatus 600 further includes an initializing module 63 and a projecting module 64.
The initializing module 63 is configured to initialize the projection function to achieve autofocus;
The projecting module 64 is configured to project a preset projection picture onto the projection area.
In some embodiments, the preset projection picture includes an image with a white background and a black frame. Referring to Fig. 6d, the projection apparatus 600 further includes a monitoring module 65 and a judging module 66.
The monitoring module 65 is configured to monitor whether the projection environment of the projection area changes;
The judging module 66 is configured to, if so, return and continue to acquire a projection image of the projection area while being projected and an environment image while it is not being projected; if not, execute preset logic.
In some embodiments, the projection environment includes ambient light brightness and/or projection position.
It should be noted that the above projection apparatus can execute the projection method provided by the embodiments of the present application and has the functional modules and beneficial effects corresponding to the executed method. For technical details not exhaustively described in the projection apparatus embodiment, refer to the projection method provided by the embodiments of the present application.
As another aspect of the embodiments of the present application, an embodiment of the present application provides a controller. Referring to Fig. 7, the controller 700 includes one or more processors 71 and a memory 72; in Fig. 7, one processor 71 is taken as an example.
The processor 71 and the memory 72 may be connected through a bus or in other ways; in Fig. 7, connection through a bus is taken as an example.
As a non-volatile computer-readable storage medium, the memory 72 can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/modules corresponding to the projection method in the embodiments of the present application. The processor 71 runs the non-volatile software programs, instructions, and modules stored in the memory 72 to execute the projection methods of the above embodiments, or the various functional applications and data processing of the projection apparatuses of the above embodiments.
The memory 72 may include high-speed random-access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some embodiments, the memory 72 may optionally include memories disposed remotely from the processor 71, and these remote memories may be connected to the processor 71 via a network. Examples of such networks include but are not limited to the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof.
The program instructions/modules are stored in the memory 72 and, when executed by the one or more processors 71, execute the projection method in any of the above method embodiments, for example, executing the projection methods of the various embodiments above, or the various functional applications and data processing of the projection apparatuses of the above embodiments.
An embodiment of the present application further provides a non-transitory computer-readable storage medium storing computer-executable instructions, where the computer-executable instructions are used to cause a projection device to execute the projection method described in any of the above items.
An embodiment of the present application provides a computer program product, including a computer program stored on a non-volatile computer-readable storage medium; the computer program includes program instructions which, when executed by a projection device, cause the projection device to execute any one of the projection methods.
In summary, by adjusting the aperture, the influence of stray light can be reduced and the relationship between projection contrast and brightness effectively coordinated, so that contrast and brightness are effectively balanced, thereby improving projection quality.
The apparatus or device embodiments described above are merely illustrative. The unit modules described as separate components may or may not be physically separated, and the components displayed as module units may or may not be physical units; they may be located in one place or distributed over multiple network module units. Some or all of the modules may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
Through the description of the above implementations, those skilled in the art can clearly understand that each implementation can be realized by means of software plus a general hardware platform, and of course also by hardware. Based on this understanding, the above technical solutions, in essence or in the part contributing to the related art, can be embodied in the form of a software product. The computer software product can be stored in a computer-readable storage medium, such as ROM/RAM, a magnetic disk, or an optical disc, and includes a number of instructions to cause a computer device (which may be a personal computer, a server, a network device, etc.) to execute the methods described in the various embodiments or certain parts of the embodiments.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present application and do not limit them. Within the concept of the present application, the technical features of the above embodiments or of different embodiments may also be combined, the steps may be implemented in any order, and many other variations of the different aspects of the present application as described above exist; for brevity, they are not provided in detail. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments can still be modified, or some of the technical features can be equivalently replaced; and these modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.

Claims (17)

  1. 一种投影方法,其特征在于,包括:
    获取投影区域被投影时的投影图像以及环境图像,其中,所述环境图像包括所述投影区域被投影预设投影画面时的图像或所述投影区域未被投影时的图像;
    根据所述投影图像与所述环境图像,调节光圈,所述光圈用于控制投影至所述投影区域的光通量。
  2. 根据权利要求1所述的方法,其特征在于,所述根据所述投影图像与所述环境图像,调节光圈,包括:
    确定杂散光在所述投影图像中的杂散光区域图像以及环境光在所述环境图像中的环境光区域图像;
    计算所述杂散光区域图像的杂散光亮度以及所述环境光区域图像的环境光亮度;
    根据所述杂散光亮度与所述环境光亮度,调节光圈。
  3. 根据权利要求2所述的方法,其特征在于,所述计算所述杂散光区域图像的杂散光亮度以及所述环境光区域图像的环境光亮度,包括:
    从所述杂散光区域图像与所述环境光区域图像两者中选择共同的最优像素参考点;
    选择所述最优像素参考点在所述杂散光区域图像中的亮度作为所述杂散光亮度以及在所述环境光区域图像中的亮度作为所述环境光亮度。
  4. 根据权利要求3所述的方法,其特征在于,所述从所述杂散光区域图像与所述环境光区域图像两者中选择共同的最优像素参考点,包括:
    从所述杂散光区域图像与所述环境光区域图像两者全部共同像素点中,遍历出在所述杂散光区域图像中的亮度与在所述环境光区域图像中的亮度两者差值最大的共同像素点作为最优像素参考点。
  5. 根据权利要求3所述的方法,其特征在于,所述选择所述最优像素参考点在所述杂散光区域图像中的亮度作为所述杂散光亮度以及在所述环境光区域图像中的亮度作为所述环境光亮度,包括:
    采用预设图像滤波算法分别处理所述杂散光区域图像以及所述环境光区域图像;
    求取滤波后的所述杂散光区域图像中全部像素点的杂散光亮度平均值以及滤波后的所述环境光区域图像中全部像素点的环境光亮度平均值;
    选择所述杂散光亮度平均值作为所述杂散光亮度以及所述环境光亮度平均值作为所述环境光亮度。
  6. 根据权利要求2所述的方法,其特征在于,所述确定所述杂散光在所述投影图像中的杂散光区域图像,包括:
    确定投影画面在所述投影图像中的画面图像;
    根据所述画面图像,从所述投影图像中确定所述杂散光的杂散光区域图像。
  7. 根据权利要求6所述的方法,其特征在于,所述确定投影画面在所述投影图像中的画面图像,包括:
    求取所述投影图像中全部像素点的投影亮度平均值;
    遍历出所述投影图像中亮度大于或等于所述投影亮度平均值的各个像素点;
    连通所述各个像素点,以形成所述画面图像。
  8. 根据权利要求7所述的方法,其特征在于,所述根据所述画面图像,从所述投影图像中确定所述杂散光的杂散光区域图像,包括:
    确定所述画面图像的图像边界;
    连通未被所述图像边界包围的像素点,以形成所述杂散光区域图像。
  9. 根据权利要求8所述的方法,其特征在于,所述画面图像包括四边形图像;
    所述确定所述画面图像的图像边界,包括:
    确定所述四边形图像的四个交点坐标;
    连通每相邻两个所述交点坐标,以形成所述画面图像的图像边界。
  10. 根据权利要求2所述的方法,其特征在于,所述根据所述杂散光亮度与所述环境光亮度,调节光圈,包括:
    计算所述杂散光亮度与所述环境光亮度两者的亮度差值;
    判断所述亮度差值是否满足预设阈值条件;
    若是,停止调节光圈;
    若否,继续调节光圈。
  11. The method according to claim 10, wherein said continuing to adjust the aperture comprises:
    continuing to acquire another frame of projection image of the projection area while it is being projected;
    adjusting the aperture according to the other frame of projection image and the environment image, until the brightness difference satisfies the preset threshold condition.
  12. The method according to claim 10, wherein the preset threshold condition comprises the brightness difference being less than a preset threshold.
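Claims 10 to 12 together describe a closed feedback loop. A hypothetical sketch, in which `capture_frame` and `aperture` stand for device interfaces the application does not specify, and the fixed step size and iteration cap are illustration choices:

```python
def adjust_aperture(capture_frame, ambient_brightness, aperture,
                    threshold=5.0, max_iters=50):
    """Closed-loop aperture adjustment (claims 10-12): keep stepping
    the aperture until the stray light brightness is within
    `threshold` of the ambient light brightness.

    `capture_frame()` returns the stray light brightness of a fresh
    frame; `aperture.step(delta)` nudges the aperture open (+) or
    closed (-). Both are hypothetical interfaces."""
    for _ in range(max_iters):
        stray_brightness = capture_frame()
        diff = stray_brightness - ambient_brightness
        if abs(diff) < threshold:      # preset threshold condition met
            return True                # stop adjusting (claim 10)
        # Stray light brighter than ambient: close the aperture one
        # step; dimmer: open it one step (claim 11's repeat).
        aperture.step(-1 if diff > 0 else 1)
    return False                       # give up after max_iters frames
```

Each pass corresponds to claim 11's "acquire another frame, then adjust again", and the `abs(diff) < threshold` test is claim 12's threshold condition.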
  13. The method according to any one of claims 1 to 12, further comprising:
    initializing a projection function to achieve auto-focusing;
    projecting a preset projection picture onto the projection area.
  14. The method according to any one of claims 1 to 12, further comprising:
    monitoring whether a projection environment of the projection area changes;
    if yes, returning to continue acquiring the projection image of the projection area while it is being projected and the environment image while it is not being projected;
    if no, executing preset logic.
  15. The method according to claim 14, wherein the projection environment comprises ambient light brightness and/or a projection position.
  16. A projection device, characterized by comprising:
    a projection module, configured to project a projection picture onto a projection area;
    an image acquisition module, configured to capture an image of where the projection area is located;
    an aperture, configured to control the luminous flux projected onto the projection area; and
    a controller, connected to the projection module and the image acquisition module respectively;
    wherein the projection module comprises an illumination light source and a projection lens, the projection lens is arranged on a light-emitting side of the illumination light source, and the aperture is arranged between the illumination light source and the projection lens;
    wherein the controller comprises:
    at least one processor; and
    a memory communicatively connected to the at least one processor; wherein the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to perform the projection method according to any one of claims 1 to 15.
  17. The projection device according to claim 16, further comprising a sensor module, connected to the controller and configured to collect environment data of the projection environment of the projection area.
PCT/CN2019/129517 2019-01-31 2019-12-28 Projection method and projection device WO2020155995A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/389,443 US11886107B2 (en) 2019-01-31 2021-07-30 Projection method and projection device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910099193.3 2019-01-31
CN201910099193.3A CN109698948B (zh) 2019-01-31 2019-01-31 Projection method and projection device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/389,443 Continuation US11886107B2 (en) 2019-01-31 2021-07-30 Projection method and projection device

Publications (1)

Publication Number Publication Date
WO2020155995A1 (zh)

Family

ID=66234723

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/129517 WO2020155995A1 (zh) 2019-01-31 2019-12-28 投影方法及投影设备

Country Status (3)

Country Link
US (1) US11886107B2 (zh)
CN (1) CN109698948B (zh)
WO (1) WO2020155995A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109698948B (zh) 2019-01-31 2021-04-23 广景视睿科技(深圳)有限公司 Projection method and projection device
CN112367752B (zh) * 2020-10-19 2023-01-10 深圳市太和世纪文化创意有限公司 Immersive hemispherical projection system, control method, and intelligent device
CN118784801A (zh) * 2023-04-06 2024-10-15 宜宾市极米光电有限公司 Brightness adjustment method and apparatus for a projection device, and storage medium

Citations (8)

Publication number Priority date Publication date Assignee Title
CN101017245A (zh) * 2006-02-06 2007-08-15 中华映管股份有限公司 Projection system that changes the influence of the environment on picture display
CN101377610A (zh) * 2007-08-30 2009-03-04 鸿富锦精密工业(深圳)有限公司 Projection system and portable electronic device using the same
CN102316295A (zh) * 2010-07-08 2012-01-11 鸿富锦精密工业(深圳)有限公司 Projector, and correction device and correction method thereof
US8223163B2 (en) * 2008-05-14 2012-07-17 Seiko Epson Corporation Display device, program, and information storage medium
CN105208308A (zh) * 2015-09-25 2015-12-30 广景视睿科技(深圳)有限公司 Method and system for obtaining the optimal projection focus of a projector
CN107102736A (zh) * 2017-04-25 2017-08-29 上海唱风信息科技有限公司 Method for implementing augmented reality
CN107888891A (zh) * 2016-09-30 2018-04-06 海信集团有限公司 Image projection display method and optical engine
CN109698948A (zh) * 2019-01-31 2019-04-30 广景视睿科技(深圳)有限公司 Projection method and projection device

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
KR20070082799A (ko) * 2006-02-17 2007-08-22 엘지전자 주식회사 Projection type image display device and brightness control method thereof
JP5962080B2 (ja) * 2012-03-08 2016-08-03 セイコーエプソン株式会社 Image processing device, projector, and projector control method
JP6089461B2 (ja) * 2012-06-22 2017-03-08 セイコーエプソン株式会社 Projector, image display system, and projector control method
WO2018180530A1 (ja) * 2017-03-28 2018-10-04 ソニー株式会社 Image processing device and method
CN108668118A (zh) * 2017-03-31 2018-10-16 中强光电股份有限公司 Auto-focus system, projector having the auto-focus system, and auto-focus method
CN107807488B (zh) * 2017-10-18 2020-03-17 维沃移动通信有限公司 Camera assembly, aperture adjustment method, and mobile terminal
US10684537B2 (en) * 2017-11-14 2020-06-16 Texas Instruments Incorporated Camera-assisted arbitrary surface characterization and correction
CN207924351U (zh) * 2018-03-27 2018-09-28 广景视睿科技(深圳)有限公司 Aperture and projector
CN108600714A (zh) * 2018-03-29 2018-09-28 联想(北京)有限公司 Projection control method and electronic system

Also Published As

Publication number Publication date
US20210360207A1 (en) 2021-11-18
US11886107B2 (en) 2024-01-30
CN109698948B (zh) 2021-04-23
CN109698948A (zh) 2019-04-30

Similar Documents

Publication Publication Date Title
WO2020155995A1 (zh) Projection method and projection device
WO2018201809A1 (zh) Dual-camera-based image processing apparatus and method
CA2996751C (en) Calibration of defective image sensor elements
US8350954B2 (en) Image processing apparatus and image processing method with deconvolution processing for image blur correction
CN107305312B (zh) System and method for automatically adjusting projection brightness and contrast
TWI640199B (zh) Image capturing apparatus and photographic composition method thereof
US20170134634A1 (en) Photographing apparatus, method of controlling the same, and computer-readable recording medium
KR101941801B1 (ko) Image processing method and apparatus for use with an LED display
JP2020504953A (ja) Camera assembly and mobile electronic device
US20180288339A1 (en) Dynamic frame rate controlled thermal imaging systems and methods
US20170180714A1 (en) Method of enhanced alignment of two means of projection
US20120019709A1 (en) Assisting focusing method using multiple face blocks
WO2011095026A1 (zh) Photographing method and system
WO2023072030A1 (zh) Lens auto-focusing method and apparatus, electronic device, and computer-readable storage medium
TWI703868B (zh) System and method for automatically adjusting projection brightness and contrast
US20230033956A1 (en) Estimating depth based on iris size
CN108289170B (zh) Photographing apparatus and method capable of detecting a metering area, and computer-readable medium
WO2018166170A1 (zh) Image processing method and apparatus, and intelligent conference terminal
CN103364404A (zh) Camera detection method and camera
CN114928728B (zh) Projection device and foreign object detection method
JP5359930B2 (ja) Imaging device, display method, and program
TW201820856A (zh) Detection system capable of detecting the sharpness of a projector's projected image, and detection method thereof
CN109274923A (zh) Intelligent sensing device for industrial equipment
JP5847280B2 (ja) Imaging device, control method of imaging device, and imaging method
US20240114117A1 (en) Method for adjusting projection system and projection system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19913355

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 19.11.2021)

122 Ep: pct application non-entry in european phase

Ref document number: 19913355

Country of ref document: EP

Kind code of ref document: A1