CN114928728A - Projection apparatus and foreign matter detection method - Google Patents


Info

Publication number
CN114928728A
Authority
CN
China
Prior art keywords
image
area
projection
value
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210594204.7A
Other languages
Chinese (zh)
Inventor
岳国华
李佳琳
刘胤伯
高伟
李保成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd
Priority to CN202210594204.7A
Publication of CN114928728A
Legal status: Pending


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3191 Testing thereof
    • H04N9/3179 Video signal processing therefor

Abstract

Some embodiments of the present application provide a projection apparatus and a foreign matter detection method. The apparatus acquires the average gray value of a detection area in an image and adjusts the brightness of the images captured by the camera according to that value, where the detection area is the portion of the projection surface outside the projection area. After the brightness is adjusted, a preset number of image frames and the distance data collected at the capture time of those frames are acquired, and image difference data and distance change data are calculated from them. If the distance change data is greater than a distance threshold and/or the image difference data is greater than a difference threshold, the brightness of the light source is reduced. Foreign matter is thus detected from the camera images of the detection area and/or from the distance data collected by the distance sensor over the projection area, so that even under light interference and/or with poor camera image quality the apparatus can accurately detect whether a user has entered the light-emission area, improving the user experience.

Description

Projection apparatus and foreign matter detection method
Technical Field
The application relates to the technical field of display equipment, in particular to projection equipment and a foreign matter detection method.
Background
A projection device is a display device that projects an image or video onto a screen. It projects laser light of specific colors which, refracted by an optical lens assembly, forms an image on the screen. Because the device is portable, a user can move it during projection to project images or video in different directions.
Generally, if a user looks directly at the projection light source or enters the light-emission area of the projection device, the light source can illuminate the user's eyes and cause injury. To avoid such accidents, the projection device performs foreign matter detection in real time while it is on. If a user approaches the light-emission area, the device reduces the projection brightness of the light source to protect the user's eyes.
However, when the projection device performs foreign matter detection at night or in a dark scene, light interference and the poor quality of the images captured by the camera make it impossible to accurately detect whether a foreign object has entered the light-emission area. The brightness-adjustment operation is therefore never triggered, degrading the user experience.
Disclosure of Invention
Some embodiments of the present application provide a projection device and a foreign matter detection method to solve the problem that, when foreign matter detection is performed at night or in a dark scene, light interference and poor camera image quality prevent the device from accurately detecting whether a foreign object has entered the light-emission area.
In one aspect, some embodiments of the present application provide a projection apparatus, including:
a light source;
a distance sensor configured to acquire distance data;
an optical engine configured to project the played content onto a projection area of a projection surface;
a camera configured to capture a corresponding image of the projection surface;
a controller configured to:
acquire an average gray value of a detection area in the image, where the detection area is the portion of the projection surface outside the projection area;
adjust the brightness of the images captured by the camera according to the average gray value;
acquire a preset number of image frames after the brightness adjustment, together with the distance data at the capture time of each of those frames, and calculate image difference data and distance change data; and
if the distance change data is greater than a distance threshold and/or the image difference data is greater than a difference threshold, control the light source to reduce its brightness.
In another aspect, some embodiments of the present application further provide a foreign matter detection method applied to a projection device, the method comprising the following steps:
acquiring an average gray value of a detection area in an image, where the detection area is the portion of the projection surface outside the projection area;
adjusting the brightness of the images captured by the camera according to the average gray value;
acquiring a preset number of image frames after the brightness adjustment, together with the distance data at the capture time of each of those frames, and calculating image difference data and distance change data; and
if the distance change data is greater than a distance threshold and/or the image difference data is greater than a difference threshold, controlling the light source to reduce its brightness.
According to the above technical solutions, some embodiments of the present application provide a projection device and a foreign matter detection method. The average gray value of the detection area in the image is acquired, and the brightness of the images captured by the camera is adjusted according to it; the detection area is the portion of the projection surface outside the projection area. After the brightness adjustment, a preset number of image frames and the distance data at their capture times are acquired, and image difference data and distance change data are calculated. If the distance change data is greater than a distance threshold and/or the image difference data is greater than a difference threshold, the light source is controlled to reduce its brightness. In this way, even when the projection device performs foreign matter detection at night or in a dark scene, under light interference and/or with poor camera image quality, it can still accurately detect whether a user has entered the light-emission area and trigger the brightness adjustment. This mitigates the harm that bright light from the source could cause to the user's eyes and improves the user experience.
Drawings
To explain the technical solution of the present application more clearly, the drawings used in the embodiments are briefly described below; those skilled in the art can obtain other drawings from these without creative effort.
Fig. 1 is a schematic view illustrating a projection arrangement state of a projection apparatus in an embodiment of the present application;
FIG. 2 is a schematic optical path diagram of a projection apparatus in an embodiment of the present application;
FIG. 3 is a schematic circuit diagram of a projection apparatus according to an embodiment of the present disclosure;
FIG. 4 is a schematic structural diagram of a projection apparatus in an embodiment of the present application;
fig. 5 is a schematic view of a lens structure of a projection apparatus in an embodiment of the present application;
FIG. 6 is a schematic diagram of a distance sensor and a camera of a projection apparatus according to an embodiment of the present disclosure;
FIG. 7 is a block diagram of a system for implementing display control of a projection device according to an embodiment of the present disclosure;
FIG. 8 is a diagram illustrating a projection apparatus performing foreign object detection according to an embodiment of the present disclosure;
FIG. 9 is a schematic diagram of a detection area in a projection plane according to an embodiment of the present disclosure;
FIG. 10 is a schematic view of another detection area in the projection plane in the embodiment of the present application;
FIG. 11 is a schematic diagram of distance change data calculated by the projection device in the embodiment of the present application;
FIG. 12 is a schematic diagram of image difference data calculated by a projection device in an embodiment of the present application;
fig. 13 is a schematic diagram of a rectangular region in a projection image in the embodiment of the present application.
Detailed Description
To make the purpose and embodiments of the present application clearer, the exemplary embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application.
Here, the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between similar or analogous objects or entities and not necessarily for describing a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises" and "comprising," as well as any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to all of the elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
Embodiments of the present application can be applied to various types of projection devices. The projection device and its associated methods are explained below by taking one projection device as an example.
A projection device can project images or video onto a screen, and it can be connected to a computer, a broadcast television network, the Internet, a VCD (Video Compact Disc) or DVD (Digital Versatile Disc) player, a game console, a DV camcorder, and the like through different interfaces to play the corresponding video signals. Projection devices are widely used in homes, offices, schools, entertainment venues, and the like.
Fig. 1 shows a schematic view of a placement state of a projection apparatus according to an embodiment of the present application, and fig. 2 shows a schematic view of an optical path of the projection apparatus according to an embodiment of the present application.
In some embodiments of the present application, referring to figs. 1-2, a projection arrangement includes a projection screen 1 and a projection device 2. The projection screen 1 is fixed at a first position, and the projection device 2 is placed at a second position so that the projected picture matches the projection screen 1. The projection device 2 includes a laser light source 100, an optical engine 200, a lens 300, and a projection medium 400. The laser light source 100 provides illumination for the optical engine 200; the optical engine 200 modulates the source beam and outputs it to the lens 300 for imaging, and the result is projected onto the projection medium 400 to form a projected image.
In some embodiments of the present application, the laser source 100 of the projection device 2 includes a laser assembly 110 and an optical lens assembly 120; the beam emitted by the laser assembly 110 passes through the optical lens assembly 120 to provide illumination for the optical engine 200. For example, the optical lens assembly 120 requires a higher level of environmental cleanliness and hermetic-grade sealing, while the chamber housing the laser assembly can use a lower, dust-proof sealing grade to reduce sealing cost.
In some embodiments of the present application, the light engine 200 of the projection apparatus 2 may be implemented to include a blue light engine, a green light engine, a red light engine, and may further include a heat dissipation system, a circuit control system, and the like. In some embodiments of the present application, the light emitting component of the projection device 2 may also be implemented by an LED light source.
Fig. 3 is a schematic diagram illustrating a circuit architecture of a projection device according to an embodiment of the present application. In some embodiments of the present application, the projection device 2 may include a display control circuit 10, a laser light source 20, at least one laser driving component 30, and at least one brightness sensor 40; the laser light source 20 may include at least one laser in one-to-one correspondence with the at least one laser driving component 30. Here, "at least one" means one or more, and "a plurality" means two or more.
Based on the circuit architecture, the projection device 2 can implement adaptive adjustment. For example, by providing the luminance sensor 40 in the light outgoing path of the laser light source 20, the luminance sensor 40 can detect a first luminance value of the laser light source and send the first luminance value to the display control circuit 10.
The display control circuit 10 can obtain a second brightness value corresponding to the driving current of each laser, and determine that a laser has a COD (catastrophic optical damage) fault when the difference between its second brightness value and its first brightness value is greater than a difference threshold. The display control circuit then adjusts the current control signal of that laser's driving component until the difference is less than or equal to the threshold, thereby eliminating the COD fault. The projection device 2 can thus eliminate COD faults in time, reduce the damage rate of the lasers, and improve the image display effect of the projection device 2.
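As a rough illustration of this fault-handling loop, the sketch below flags a COD fault from the brightness gap and steps the drive current down until the gap closes. The threshold, the multiplicative step, and the assumption that brightness is proportional to drive current are illustrative, not taken from the patent:

```python
def detect_cod_fault(expected_brightness: float,
                     measured_brightness: float,
                     threshold: float) -> bool:
    """A laser is flagged with a COD fault when the brightness expected
    from its driving current (the second brightness value) exceeds the
    sensor-measured brightness (the first value) by more than the threshold."""
    return (expected_brightness - measured_brightness) > threshold

def corrected_drive_current(current: float,
                            expected_brightness: float,
                            measured_brightness: float,
                            threshold: float,
                            step: float = 0.05) -> float:
    """Illustrative feedback loop: lower the drive current until the
    brightness gap is within the threshold. Assumes (hypothetically)
    that expected brightness scales linearly with drive current."""
    while detect_cod_fault(expected_brightness, measured_brightness, threshold):
        current *= (1.0 - step)
        expected_brightness *= (1.0 - step)  # brightness assumed proportional to current
    return current
```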
Fig. 4 shows a schematic structural diagram of a projection apparatus according to an embodiment of the present application.
In some embodiments of the present application, the laser light source 20 in the projection device 2 may include an independently disposed blue laser 201, red laser 202, and green laser 203; such a projection device 2 may also be called a three-color projection device. The blue laser 201, the red laser 202, and the green laser 203 are all MCL (module light weight) packaged lasers, which are small in size and allow a compact optical-path arrangement.
In some embodiments of the present application, the controller includes at least one of a Central Processing Unit (CPU), a video processor, an audio processor, a Graphics Processing Unit (GPU), a Random Access Memory (RAM), a Read-Only Memory (ROM), first through n-th input/output interfaces, a communication bus, and the like.
In some embodiments of the present application, after the projection device 2 starts up, it may directly enter the display interface of the most recently selected signal source, or the signal-source selection interface. The signal source may be a preset video-on-demand program, or at least one of an HDMI interface, a live TV interface, and the like. After the user selects a signal source, the projection device displays the content obtained from it.
In some embodiments of the present application, the projection device 2 may be configured with a camera that cooperates with the projection device 2 to adjust and control the projection process. For example, the camera may be implemented as a 3D camera or a binocular camera; when implemented as a binocular camera, it specifically includes a left camera and a right camera. The binocular camera can capture the screen corresponding to the projection device 2, that is, the image and played content presented on the projection surface, which are projected by the optical engine 200 built into the projection device 2.
The camera captures the image displayed on the projection surface. It may include a lens assembly in which a photosensitive element and lenses are disposed; the lenses refract the incoming light from the scene so that it falls on the photosensitive element. Depending on the camera's specification, the photosensitive element may be based on a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) sensor; its photosensitive material converts the optical signal into an electrical signal, which is output as image data.
Fig. 5 shows a schematic lens structure of the projection device 2 in some embodiments of the present application. To support the auto-focusing process of the projection apparatus 2, as shown in fig. 5, the lens 300 of the projection apparatus 2 may further include an optical assembly 310 and a driving motor 320. The optical component 310 is a lens assembly composed of one or more lenses, and can refract the light emitted by the optical engine 200, so that the light emitted by the optical engine 200 can be transmitted onto the projection surface to form a transmission content image.
The optical assembly 310 may include a lens barrel and a plurality of lenses disposed within the lens barrel. The lens in the optical assembly 310 can be divided into a moving lens 311 and a fixed lens 312 according to whether the position of the lens can be moved, and the overall focal length of the optical assembly 310 can be changed by changing the position of the moving lens 311 and adjusting the distance between the moving lens 311 and the fixed lens 312. Therefore, the driving motor 320 can drive the movable lens 311 to move by connecting the movable lens 311 in the optical assembly 310, thereby implementing an auto-focusing function.
In the focusing process in some embodiments of the present application, the driving motor 320 changes the position of the movable lens 311 and thereby adjusts the distance between the movable lens 311 and the fixed lens 312, that is, the position of the image plane. By the imaging principle of the lens assembly in the optical assembly 310, adjusting the focal length in this way is really adjusting the image distance; but viewed from the overall structure of the optical assembly 310, moving the movable lens 311 is equivalent to adjusting the assembly's overall focal length.
When the projection device 2 is at different distances from the projection surface, the lens of the projection device 2 is required to adjust different focal lengths so as to transmit a clear image on the projection surface. In the projection process, the distance between the projection device 2 and the projection surface may be different depending on the placement position of the user, and thus different focal lengths may be required. Therefore, to adapt to different usage scenarios, the projection device 2 needs to adjust the focal length of the optical assembly 310.
FIG. 6 shows a schematic diagram of a distance sensor and camera configuration in some embodiments of the present application. As shown in fig. 6, the projection device 2 may further include, or be externally connected to, a camera 700 that captures the pictures projected by the projection device 2 to obtain projected content images. The projection device 2 checks the sharpness of the projected content image to determine whether the current focal length of the lens is appropriate, and adjusts the focal length when it is not. When auto-focusing on images captured by the camera 700, the projection device 2 continuously adjusts the lens position and takes pictures, finding the in-focus position by comparing the sharpness of images at successive positions, and then moves the movable lens 311 in the optical assembly to that position. For example, the controller 500 may control the driving motor 320 to move the movable lens 311 gradually from the focusing start position to the focusing end position while continuously acquiring projected content images through the camera 700, and finally drive the movable lens 311 back from the focusing end to the position with the highest sharpness, completing auto-focusing.
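The sharpness-comparison sweep just described can be sketched as follows. The Laplacian-variance metric and the `capture` callback are assumptions for illustration only; the patent does not specify which sharpness measure the device uses:

```python
import numpy as np

def sharpness(image: np.ndarray) -> float:
    """Variance of a discrete Laplacian response: higher means sharper.
    (One common focus metric; an assumption, not the patent's definition.)"""
    lap = (-4.0 * image
           + np.roll(image, 1, axis=0) + np.roll(image, -1, axis=0)
           + np.roll(image, 1, axis=1) + np.roll(image, -1, axis=1))
    return float(lap.var())

def find_focus_position(positions, capture):
    """Sweep the motor from the focusing start to the end position,
    score the frame captured at each stop, return the sharpest stop."""
    return max(positions, key=lambda p: sharpness(capture(p)))
```

A real controller would then drive the motor back to the returned position rather than merely reporting it.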
For the auto-focusing process, after receiving an auto-focus instruction, the controller of the projection device 2 may obtain the separation distance through the distance sensor 600 in response to that instruction. The distance sensor 600 may be a device based on the time-of-flight (TOF) principle that can measure the distance to a target, such as a laser radar or an infrared radar. It may be disposed at the optical engine 200 and includes a transmitting end and a receiving end for the signal. To measure the separation distance, the transmitting end emits a wireless signal toward the projection surface 400; on reaching the projection surface 400 the signal is reflected back to the receiving end. The flight time is computed from the transmit and receive times, the actual flight distance follows from the propagation speed, and from it the separation distance between the projection surface 400 and the optical engine 200 is calculated.
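The TOF computation described above reduces to halving the round-trip flight distance. A minimal sketch, where the nanosecond timing interface is an illustrative assumption:

```python
SPEED_OF_LIGHT_MM_PER_NS = 299.792458  # speed of light, millimetres per nanosecond

def tof_distance_mm(emit_time_ns: float, receive_time_ns: float) -> float:
    """Separation distance from a time-of-flight measurement: the signal
    travels to the projection surface and back, so the one-way distance
    is half the total flight distance (speed times round-trip time)."""
    flight_time_ns = receive_time_ns - emit_time_ns
    return SPEED_OF_LIGHT_MM_PER_NS * flight_time_ns / 2.0
```

A 10 ns round trip therefore corresponds to roughly 1.5 m of separation.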
Fig. 7 is a schematic diagram illustrating a system framework of a projection device implementing display control according to an embodiment of the present application.
In some embodiments of the present application, the projection device 2 has the feature of long-focus micro-projection, and the controller thereof can perform display control on the projection light image through a preset algorithm, so as to realize functions of automatic trapezoidal correction of the display image, automatic screen entry, automatic obstacle avoidance, automatic focusing, anti-glare, and the like.
In some embodiments of the present application, the projection device 2 is configured with a gyroscope sensor. While the device moves, the gyroscope senses the position change and actively collects the movement data. The collected data is then sent through the system framework layer to the application service layer to support the application data needed for user-interface and application interactions, and it can also be called on by the controller when implementing algorithm services.
In some embodiments of the present application, the projection device 2 is configured with a distance sensor 600. After the distance sensor 600 collects data, it sends the data to the time-of-flight service in the service layer; that service forwards the data through the process communication framework to the application service layer, where it is available to the controller, the user interface, and applications.
In some embodiments of the present application, the projection device 2 is configured with a camera 700 for capturing images; the camera 700 may be implemented as a binocular camera, a depth camera, a 3D camera, or the like. The data collected by the camera is sent to the camera service, which forwards the image data to the process communication framework and/or the projection device correction service. The correction service receives the camera data sent by the camera service, and for each function to be implemented, the controller can invoke the corresponding control algorithm from the algorithm library.
In some embodiments of the present application, data is exchanged with the application service through the process communication framework, and the calculation result is fed back to the correction service through the same framework. The correction service sends the result to the projection device's operating system, which generates a control instruction and sends it to the optical engine control driver so as to control the optical engine's working state and automatically correct the displayed image.
In some embodiments of the present application, the correction service sends signaling to the distance sensor 600 to query the current status of the projection device 2, and the controller receives the data fed back by the distance sensor 600. The correction service then notifies the algorithm service, through the process communication framework, to start the eye-protection process; the framework invokes the corresponding services from the algorithm library, which may include a photographing detection algorithm, a screenshot algorithm, a foreign matter detection algorithm, and the like. The process communication framework returns the foreign matter detection result to the correction service; if the returned result reaches a preset threshold condition, the controller makes the user interface display a prompt message and reduces the display brightness.
In some embodiments of the present application, the controller automatically turns on the anti-glare switch when the projection device 2 is configured in the child viewing mode. The controller also turns the switch on after receiving position movement data from the acceleration sensor or foreign matter intrusion data collected by other sensors. The reason is that a user who looks directly at the projection light source, or enters the light-emission area of the projection device 2, can have the light source shine into the eyes and cause injury. To avoid such accidents, the projection device 2 performs foreign matter detection in real time while it is on; if a user approaches the light-emission area, the projection device 2 lowers the projection brightness of the light source to protect the eyes.
However, when the projection device 2 performs foreign matter detection at night or in a dark scene, light interference and poor camera image quality make it impossible to accurately detect whether a foreign object has entered the light-emission area, so the brightness adjustment is never triggered and the user experience degrades.
To this end, some embodiments of the present application provide a projection device. The projection device 2 comprises: a light source, an optical engine 200, a distance sensor 600, a camera 700, and a controller. The distance sensor 600 is configured to collect distance data; the optical engine 200 is configured to project the playing content onto a projection area of the projection plane 400; the camera 700 is configured to capture an image of the projection plane 400. The controller first adjusts the brightness of the image captured by the camera 700, and then acquires the data collected by the distance sensor 600 and the camera 700 after the brightness adjustment. When the acquired data triggers any preset condition, the controller controls the light source to reduce the display brightness. In this way, when the projection device 2 performs foreign object detection at night or in a dark scene, the failure to accurately detect whether a foreign object has entered the light-emitting area due to light interference and poor image quality is avoided. The process of controlling the light source to reduce the display brightness is the eye protection process. For convenience of description, the light source here refers to the laser light source 20 described above.
The process of performing foreign object detection by the projection apparatus according to some embodiments of the present application is further described below with reference to fig. 8.
In some embodiments of the present application, fig. 8 shows a schematic diagram of a projection apparatus performing foreign object detection in an embodiment of the present application. Referring to fig. 8, the controller in the projection device 2 is configured to:
S1: acquire the average gray value of the detection area in the image, where the detection area is the area of the projection plane other than the projection area, and adjust the brightness of the image captured by the camera according to the average gray value.
In some embodiments of the present application, when foreign objects are detected at night or in a dark scene, the strong contrast between the projection area and the rest of the projection plane causes the camera 700 to expose for the projection area and lose the information of the surrounding region. Because the brightness of the region outside the projection area is low, whether a foreign object has entered the light-emitting area may not be detected accurately. At the same time, during foreign object detection a user usually enters the projection area from the region of the projection plane 400 outside the projection area. Therefore, fig. 9 shows a schematic diagram of a detection area in the projection plane in an embodiment of the present application. Referring to fig. 9, the present application sets the detection area around the projection area and increases the brightness of the image according to the detection area, so as to enhance the effective information in the image and improve the accuracy of foreign object detection.
Fig. 10 is a schematic diagram of another detection area in the projection plane in an embodiment of the present application. Referring to fig. 10, in some embodiments of the present application, the controller is configured to: acquire a homography matrix between the image and the projection area; calculate the coordinates of the corner points of the projection area in the camera coordinate system based on the homography matrix; calculate the width and height of the projection area from the corner coordinates; extend both sides of the projection area in the width direction by one third of the width to form extension-width regions, and both sides in the height direction by one third of the height to form extension-height regions; and generate the detection area from the extension-width and extension-height regions.
For example, the controller obtains the coordinate values of the four corner points and the four edge midpoints of the projection area in the optical engine coordinate system, calculated from the image captured by the camera 700. A plane is fitted from these coordinate values to obtain the included angle between the projection plane 400 and the optical engine 200, and from this angle the corresponding coordinates of the four corner points and four edge midpoints in the world coordinate system of the projection plane 400 are obtained. The optical engine 200 then projects a calibration image onto the projection plane 400; from the coordinates of the calibration chart in the optical engine coordinate system and the coordinates of the corresponding points on the projection plane 400, the homography matrix can be calculated. Finally, the coordinate values of the four corner points and four edge midpoints of the projection area in the optical engine coordinate system are converted into the corresponding coordinate values in the camera coordinate system through the homography matrix, and the position, height, width and area of the projection area in the captured image are determined from these coordinate values.
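The coordinate conversion described above reduces, at its core, to mapping points through a 3×3 homography matrix with homogeneous normalization. The following is a minimal pure-Python sketch of that mapping; the matrix values and corner coordinates are illustrative placeholders, not values from the embodiment:

```python
def apply_homography(H, point):
    """Map a 2-D point through a 3x3 homography matrix H.

    The point is given in the optical-engine coordinate system and
    returned in the camera coordinate system after dividing by the
    homogeneous coordinate w.
    """
    x, y = point
    u = H[0][0] * x + H[0][1] * y + H[0][2]
    v = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return (u / w, v / w)

# Illustrative homography: identity rotation plus a translation.
H = [[1.0, 0.0, 10.0],
     [0.0, 1.0, 20.0],
     [0.0, 0.0, 1.0]]
corners_optical_engine = [(0, 0), (100, 0), (100, 60), (0, 60)]
corners_camera = [apply_homography(H, p) for p in corners_optical_engine]
```

In practice the homography would be estimated from the calibration-chart correspondences (for example with OpenCV's `findHomography`) rather than written by hand.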
In this way, the controller generates the detection area based on the width and height of the projection area. For example, when the four corner points of the projection area are at (3, 3), (6, 3), (6, 6) and (3, 6), the extension width d1 is 3 × 1/3 = 1 meter and the extension height d2 is 3 × 1/3 = 1 meter. The region is extended on both sides of the projection area in the width and height directions; the extension-height and extension-width regions together form the detection area, i.e. a ring-shaped region around the projection area.
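The one-third extension rule can be sketched as follows. This is a pure-Python illustration; the corner-list input and the inner/outer bounding rectangles returned are assumptions made for demonstration:

```python
def detection_region(corners, ratio=1.0 / 3.0):
    """Given the four corners of a rectangular projection area (in
    metres), extend the rectangle by `ratio` of its width on the
    left/right and by `ratio` of its height on the top/bottom.
    The detection area is the ring between the two rectangles,
    returned here as (inner, outer) bounding boxes
    (x_min, y_min, x_max, y_max)."""
    xs = [p[0] for p in corners]
    ys = [p[1] for p in corners]
    width = max(xs) - min(xs)
    height = max(ys) - min(ys)
    d1 = width * ratio    # extension width
    d2 = height * ratio   # extension height
    inner = (min(xs), min(ys), max(xs), max(ys))
    outer = (min(xs) - d1, min(ys) - d2, max(xs) + d1, max(ys) + d2)
    return inner, outer

# Corner values from the example in the text: a 3 m x 3 m projection area.
inner, outer = detection_region([(3, 3), (6, 3), (6, 6), (3, 6)])
```

With the example corners, both extensions come out to 1 metre, matching the d1 and d2 values above.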
In other embodiments of the present application, the controller is configured to: calculate the area of the projection region from the corner coordinates; scale that area by a preset proportion to obtain the scaled area; and set detection regions around the projection region whose total area equals the scaled area.
Illustratively, when the four corner points of the projection area are at (3, 3), (6, 3), (6, 6) and (3, 6) and the preset proportion is one third, the area of the projection region is 3 × 3 = 9 square meters and the scaled area is 9 × 1/3 = 3 square meters. The 3 square meters are distributed around the projection area, i.e. the area of the detection region is 3 square meters. The detection area can be set according to the actual use conditions and environment of the projection device 2; for example, three quarters of a square meter may be extended on each side of the projection area on average. Expansion can also follow a preset expansion area; for example, the expanded area in the width direction of the projection area may be larger than that in the height direction.
In some embodiments of the present application, in the step of acquiring the average gray value of the detection area in the image and adjusting the brightness of the image captured by the camera 700 according to the average gray value, the controller is configured to: if the average gray value is greater than the first gray threshold and less than the second gray threshold, normalize the pixel value of each pixel point in the image to obtain normalized pixel values; perform enhancement processing on the normalized pixel values to obtain enhanced pixel values; and finally adjust the brightness of the image captured by the camera 700 according to the enhanced pixel values.
Illustratively, the first gray threshold is 10 and the second gray threshold is 15. If the average gray value is greater than 10 and less than 15, the gray level of the image captured by the camera 700 is low, and the controller triggers the brightness-boosting process. First, the image is normalized according to the following formula (one).
normXi = Xi / 255    formula (one);
Wherein normXi represents a pixel value of an ith pixel point in the normalized image, and Xi represents a pixel value of an ith pixel point in the image before normalization.
To enhance the effective information in the image, the controller is further configured to perform enhancement processing on the image according to the following formula (two):
augXi = Xi + (1 − Xi) × Xi    formula (two);
wherein augXi represents the pixel value of the ith pixel point in the image after the enhancement processing, and Xi represents the pixel value of the ith pixel point in the image before the enhancement processing.
Through the brightness-boosting process, the controller removes weak light interference in the image while increasing the pixel values of all pixel points and enhancing the effective information in the image, which improves the accuracy of the detection result. If the average gray value is greater than the second gray threshold, the brightness of the image captured by the camera 700 already satisfies the condition, and the subsequent foreign object detection process can proceed. Conversely, if the average gray value is smaller than the first gray threshold, the brightness of the image captured by the camera 700 is too low, and images subsequently captured with the original parameters will still fail to meet the foreign object detection requirement; the shooting parameters of the camera therefore need to be adjusted, for example by setting exposure gain and brightness enhancement parameters.
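Formulas (one) and (two) and the gray-threshold gate around them might be combined as in the following sketch. The thresholds 10 and 15 are the example values from the text; the list-of-gray-values input format, and the reading of formula (one) as the usual divide-by-255 normalization, are assumptions:

```python
def normalize_pixel(x):
    # Formula (one): map an 8-bit gray value into [0, 1].
    return x / 255.0

def enhance_pixel(x):
    # Formula (two): augX = X + (1 - X) * X, which boosts values
    # in (0, 1) while leaving 0 and 1 fixed.
    return x + (1.0 - x) * x

def boost_image(pixels, low=10, high=15):
    """Apply the brightness boost only when the average gray value
    lies strictly between the two gray thresholds; otherwise return
    the pixels unchanged (other cases are handled elsewhere)."""
    avg = sum(pixels) / len(pixels)
    if low < avg < high:
        return [enhance_pixel(normalize_pixel(p)) for p in pixels]
    return pixels

boosted = boost_image([10, 12, 14])       # average 12: boost applies
unchanged = boost_image([200, 200, 200])  # bright image: no boost
```

Note that enhance_pixel(x) simplifies to x(2 − x), a concave curve that lifts mid-range values the most.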
S2: acquire a preset number of frames of the brightness-adjusted image together with the distance data at the capture moment of each frame, and calculate image difference data and distance change data. If the distance change data is greater than the distance threshold and/or the image difference data is greater than the difference threshold, control the light source to reduce its brightness.
In some embodiments of the present application, referring to fig. 11, in the step of calculating the distance change data, the controller is configured to:
S301, parse the distance value corresponding to the current frame image from the distance data.
S302, parse the distance change value of the preset-frame image relative to the current frame image.
S303, if the distance value is greater than the distance threshold value and the distance change value is greater than the distance change threshold value, executing S304; otherwise, S305 is executed.
And S304, controlling to reduce the brightness of the light source, namely triggering an eye protection process.
And S305, controlling the light source to operate in a state before the brightness is reduced, namely, not triggering an eye protection process.
In the process in which the controller calculates the distance change data from the distance data at the capture moments of the preset frames of images, illustratively, at least one distance value is parsed from the distance data: the distance value at the capture moment of the first frame image is 3.5 meters and that of the second frame image is 3 meters. The distance change value of the second frame image relative to the first frame image is |3 − 3.5| = 0.5 meters; whether to reduce the brightness of the light source is subsequently determined from the distance value and the distance change value.
In some embodiments of the present application, in the step of determining whether to reduce the brightness of the light source subsequently according to the distance value and the distance variation value, the controller is configured to: setting a first distance range, a second distance range and a third distance range. The first distance range is smaller than the second distance range, and the second distance range is smaller than the third distance range. And setting a first variation threshold, a second variation threshold and a third variation threshold according to the distance data. Wherein the first variation threshold is smaller than the second variation threshold, and the second variation threshold is smaller than the third variation threshold.
Illustratively, the variation thresholds are α1, α2 and α3, the first distance range is 0-2 meters, the second 2-3 meters, and the third 3-4 meters. If the distance value corresponding to the current frame image falls within 0-2 meters, the first variation threshold α1 is used: if the distance change value is greater than α1, an object intrusion is determined and the light source brightness is reduced. Similarly, if the distance value falls within 2-3 meters, the second variation threshold α2 is used, and if the distance change value is greater than α2, an object intrusion is determined and the light source brightness is reduced. If the distance value falls within 3-4 meters, the third variation threshold α3 is used, and if the distance change value is greater than α3, an object intrusion is determined and the light source brightness is reduced. By setting different variation thresholds and dynamically selecting the one corresponding to the current distance value, inaccurate detection results caused by small environmental changes at large projection distances are avoided.
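The dynamic threshold selection might look like the following sketch. The numeric values 0.2, 0.35 and 0.5 metres are hypothetical placeholders for α1-α3, which the text does not specify:

```python
def select_variation_threshold(distance, thresholds=(0.2, 0.35, 0.5)):
    """Pick the variation threshold according to which distance range
    the current reading falls in: 0-2 m -> alpha1, 2-3 m -> alpha2,
    3-4 m -> alpha3. Threshold values here are illustrative only."""
    a1, a2, a3 = thresholds
    if distance <= 2.0:
        return a1
    if distance <= 3.0:
        return a2
    return a3

def object_intruded(distance, change):
    # An intrusion is flagged when the change exceeds the threshold
    # selected for the current distance range.
    return change > select_variation_threshold(distance)
```

Using a larger threshold at larger distances is what prevents small environmental fluctuations from triggering false detections when the projection distance is long.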
In some embodiments of the application, after the step of parsing the data, the controller is configured to: acquiring a near-end threshold value and a far-end threshold value between the projection device 2 and the projection plane 400; if the distance value is greater than the near-end threshold value and the distance value is less than the far-end threshold value, a subsequent process of calculating distance change data is performed.
Illustratively, the closest distance set between the distance sensor 600 and the projection plane 400 is 0.02 meters, i.e. the near-end threshold is 0.02 meters, and the farthest distance is 4 meters, i.e. the far-end threshold is 4 meters. The trusted range of the distance sensor 600 is therefore 0.02 to 4 meters. If the distance value in the distance data lies within this trusted range, the subsequent process of calculating the distance change data is performed; if the distance value is less than 0.02 meters or greater than 4 meters, it lies outside the trusted range of the distance sensor 600 and the subsequent process is not performed.
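The trusted-range gate is a simple strict comparison; a sketch using the example thresholds from the text:

```python
def in_trusted_range(distance, near=0.02, far=4.0):
    """Only distance readings strictly inside the sensor's trusted
    range (near-end and far-end thresholds, here the 0.02 m and 4 m
    example values) are passed on to the change-data calculation."""
    return near < distance < far
```

Readings at or beyond either threshold are simply discarded rather than clamped, since the sensor cannot be trusted there.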
When calculating the distance change data, the controller does not need to compute over the entire projection surface; only the distance change data within the projection area needs to be calculated. Calculating the distance change data within the projection area is only an example; the computation region can be designed according to the actual calculation scene and projection environment.
In some embodiments of the present application, in order to reduce the computation time of the subsequent image difference calculation, the controller performs a straight-line pre-detection process on the boundary of the projection area before the step of calculating the image difference data. Specifically, the controller is configured to: acquire the brightness-adjusted image; convert its image format to YUV; extract the gray-scale component of the YUV image; binarize the gray-scale component with a preset component threshold to obtain a first target image; perform noise removal on the first target image to obtain a denoised image; identify the denoised image with an edge detection algorithm to obtain the edge lines corresponding to the projection area; and identify each edge line with a straight-line detection algorithm. If an edge line consists of at least three straight lines, the step of acquiring the preset number of frames of the brightness-adjusted image is performed.
In the above pre-detection of the projection area boundary, when a human body or an object enters the projection area, the reflecting medium of the projection changes, so that the originally straight projection boundary becomes a broken line or a curve. The subsequent foreign object detection process is therefore performed on images in which the projection boundary has become a broken line or a curve.
In some embodiments of the present application, referring to fig. 12, the controller is further configured to:
S401, parse the difference image of the detection area from the image difference data.
S402, sequentially carrying out gray processing and binarization processing on the difference image to obtain a second target image.
And S403, extracting a foreground image in the second target image.
S404, calculating the ratio of the image area of the foreground image to the area of the detection area.
S405, if the ratio is larger than the difference threshold, executing S406; otherwise, S407 is executed.
And S406, controlling to reduce the brightness of the light source, namely triggering an eye protection process.
And S407, controlling the light source to operate in a state before the brightness is reduced, namely, not triggering the eye protection process.
Illustratively, the binarized image comprises pixel points of a first type and of a second type: the first type take a first value representing the foreground, such as 1, and the second type take a second value representing the background, such as 0. The foreground image is generated from the region of first-type pixel points and the background image from the region of second-type pixel points. By extracting the first-type pixel points with value 1 and the second-type pixel points with value 0, the image area of the foreground image and that of the background image are calculated, giving the image area of the binarized image. Finally, the ratio of the image area of the foreground image to the image area of the binarized image is calculated, and whether to trigger the subsequent eye protection process is determined from this ratio. A difference threshold of 0.1 is set; if the ratio of the foreground image area to the binarized image area is greater than 0.1, the optical engine 200 is controlled to reduce the brightness.
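The foreground-ratio decision of S404-S406 can be sketched as follows, in pure Python on a toy 0/1 grid. Here the binarized image covers exactly the detection area, so the ratio is taken against the whole grid; the 0.1 difference threshold is the example value from the text:

```python
def foreground_ratio(binary_image):
    """binary_image is a 2-D grid of 0/1 values produced by the
    binarization step; 1 marks foreground, 0 marks background.
    Returns the ratio of foreground pixels to all pixels."""
    total = sum(len(row) for row in binary_image)
    foreground = sum(sum(row) for row in binary_image)
    return foreground / total

def triggers_eye_protection(binary_image, diff_threshold=0.1):
    # The eye protection process fires when the foreground occupies
    # more than the difference threshold of the detection area.
    return foreground_ratio(binary_image) > diff_threshold

# Toy binarized difference image: a 2x2 foreground blob in a 4x4 area.
img = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
]
```

For this grid the ratio is 4/16 = 0.25, above the 0.1 threshold, so the brightness reduction would be triggered.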
In some embodiments of the present application, the controller is further configured to loop through the comparison of the image difference data with the difference threshold. A count threshold β is set; the eye protection process is triggered only if the image difference data exceeds the difference threshold for β consecutive times.
When calculating the image difference data, the controller does not need to process the entire projection image captured by the camera 700; only the image difference data within the detection area needs to be calculated. Calculating the image difference data within the detection area is only an example; the computation region can be designed according to the actual calculation scene and projection environment.
In some embodiments of the present application, when the camera 700 shoots continuously, it is generally affected by light and/or camera quality, and two consecutively captured images may differ in overall brightness, causing errors in the difference image and abnormal results in the subsequent foreign object detection. To remove this light interference from the image, the controller is further configured to: after the step of parsing the difference image of the detection area from the image difference data, acquire the top region of the difference image, the top region being the area between the top edge of the projection area and the top edge of the difference image; calculate the average brightness value of the top region; and if the average brightness value is greater than a preset brightness threshold, update the brightness values of the difference image, where the updated brightness values are smaller than the values before updating.
Illustratively, referring to fig. 13, the top region T of the difference image is extracted, where T is a rectangular region directly above the projection area. The average brightness of this rectangular region is calculated, and the preset brightness threshold θ is set to 25. If the average brightness value is greater than 25, the brightness of the difference image is adjusted to be uniform according to the following formula (three).
norm(X) = I(X) × θ / Ī    formula (three);
where norm(X) represents the brightness value of the Xth pixel point after brightness adjustment, I(X) represents the brightness value of the Xth pixel point before adjustment, θ is the preset brightness threshold, and Ī is the average brightness value of the top region.
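The original rendering of formula (three) is illegible in this text, so the following sketch implements one plausible reading: when the top region is too bright, every pixel is scaled so that the region's average brightness falls back to the threshold θ. Treat the exact form of the formula as an assumption:

```python
def normalize_top_region_brightness(pixels, theta=25.0):
    """Assumed reading of formula (three): if the average brightness
    of the top region exceeds theta, scale all pixel brightness
    values by theta / average so the new average equals theta;
    otherwise leave the pixels unchanged."""
    avg = sum(pixels) / len(pixels)
    if avg <= theta:
        return pixels
    return [p * theta / avg for p in pixels]

adjusted = normalize_top_region_brightness([50.0, 50.0])
```

A multiplicative scaling (rather than a subtraction) keeps all brightness values non-negative, which is convenient for a difference image.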
In some embodiments of the present application, in a dim scene the projection area is the brightest region, so its stray light may fall on the detection area. When a video picture is being played, light from the projection area is cast into the detection area and changes continuously, interfering with the foreign object detection there. To this end, the present application removes the projected stray light in the detection area by connected-region elimination. Specifically, the controller is further configured to: in the step of obtaining the second target image, identify the image connected regions in the second target image, an image connected region being a region of pixel points in the second target image that have the same pixel value and adjacent pixel positions; calculate the area of each image connected region; and if the area of an image connected region is smaller than the first area threshold or larger than the second area threshold, remove that connected region.
Illustratively, the first area threshold is set to 300 pixel points and the second area threshold to 3000 pixel points. If an image connected region occupies fewer than 300 pixel points, it is judged to be noise interference; if it occupies more than 3000 pixel points, it is determined to be the projection region. By removing these over-small and over-large connected regions, i.e. the noise interference and the projection region, the accuracy of the subsequent foreground image detection is improved.
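The connected-region elimination step might be sketched as follows, with pure-Python 4-connected labeling. The toy area thresholds of 2 and 6 pixels stand in for the 300 and 3000 of the example, and an in-place grid representation is assumed:

```python
from collections import deque

def connected_components(binary):
    """Return the 4-connected components of a 0/1 grid, each as a
    list of (row, col) coordinates of 1-pixels."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    components = []
    for y in range(h):
        for x in range(w):
            if binary[y][x] == 1 and not seen[y][x]:
                queue, comp = deque([(y, x)]), []
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    comp.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and binary[ny][nx] == 1 and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                components.append(comp)
    return components

def remove_stray_components(binary, min_area=2, max_area=6):
    """Clear components below min_area (noise) or above max_area
    (projection spill); the thresholds here are toy values in place
    of the 300 and 3000 pixel points of the example."""
    for comp in connected_components(binary):
        if len(comp) < min_area or len(comp) > max_area:
            for y, x in comp:
                binary[y][x] = 0
    return binary

# One 1-pixel speck, one 4-pixel blob (kept), one 12-pixel slab.
grid = [
    [1, 0, 0, 0, 0, 0],
    [0, 1, 1, 0, 0, 0],
    [0, 1, 1, 0, 0, 0],
    [0, 0, 0, 0, 0, 0],
    [1, 1, 1, 1, 1, 1],
    [1, 1, 1, 1, 1, 1],
]
cleaned = remove_stray_components(grid)
```

Only the mid-sized blob survives, which is exactly the candidate foreign object the subsequent foreground detection should see.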
In some embodiments of the present application, in order to prevent lighting fixtures arranged in the environment from affecting the foreign object detection process, the present application exploits the fact that the positions of such fixtures do not change to avoid brightness differences in the images captured by the camera 700 during use. The controller is further configured to: after parsing the difference image of the detection area from the image difference data, denoise the difference image with a dilation-erosion (morphological) algorithm. This avoids bright and dark artifacts in the difference image caused by lamps and by light reflected from doors and windows.
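The dilation-erosion denoising can be illustrated with a morphological opening (erosion then dilation) on a binary grid. A cross-shaped structuring element is assumed here; a real implementation would typically use a library such as OpenCV on the full-resolution difference image:

```python
def _neighbours(binary, y, x):
    """Yield the values of the cross-shaped neighbourhood (up, down,
    left, right, centre); out-of-bounds positions count as 0."""
    h, w = len(binary), len(binary[0])
    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1), (y, x)):
        yield binary[ny][nx] if 0 <= ny < h and 0 <= nx < w else 0

def erode(binary):
    # A pixel stays 1 only if its whole cross neighbourhood is 1.
    return [[1 if all(_neighbours(binary, y, x)) else 0
             for x in range(len(binary[0]))] for y in range(len(binary))]

def dilate(binary):
    # A pixel becomes 1 if any pixel in its cross neighbourhood is 1.
    return [[1 if any(_neighbours(binary, y, x)) else 0
             for x in range(len(binary[0]))] for y in range(len(binary))]

def open_denoise(binary):
    # Morphological opening: erosion followed by dilation removes
    # isolated bright specks while keeping the core of larger blobs.
    return dilate(erode(binary))

# A 7x7 grid with one noise speck and one 3x3 blob.
grid = [[0] * 7 for _ in range(7)]
grid[0][6] = 1                      # isolated speck: should vanish
for y in range(2, 5):
    for x in range(1, 4):
        grid[y][x] = 1              # solid blob: core should survive
after = open_denoise(grid)
```

With this small structuring element the 3x3 blob shrinks to a cross around its centre while the speck disappears entirely; on a full-size difference image the effect is to suppress pointwise glints from lamps and window reflections.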
As can be seen from the above solutions, the projection device provided in the above embodiments acquires the average gray value of the detection area in the image and adjusts the brightness of the image captured by the camera 700 according to that value, where the detection area is the area of the projection plane 400 other than the projection area. After the brightness adjustment, a preset number of frames of the adjusted image and the distance data at the capture moment of each frame are acquired, and image difference data and distance change data are calculated. If the distance change data is greater than the distance threshold and/or the image difference data is greater than the difference threshold, the light source brightness is reduced. In this way, when the projection device 2 performs foreign object detection at night or in a dark scene, it can still accurately detect whether a user has entered the light-emitting area and then trigger the brightness-adjustment operation, even with light interference and/or poor image quality from the camera. Injury to the user's eyes by the light source is thus avoided, and the user experience is improved.
The application also provides a foreign matter detection method, which is applied to the projection equipment 2, wherein the projection equipment 2 comprises a light source, a distance sensor 600, an optical machine 200, a camera 700 and a controller; the foreign matter detection method specifically includes the steps of:
and acquiring the average gray value of a detection area in the image, wherein the detection area is an area except the projection area in the projection plane. And adjusting the brightness of the image shot by the camera according to the average gray value. And obtaining the image with the preset frame number after the brightness is adjusted and the distance data of the shooting moment of the image with the preset frame number, and calculating to obtain image difference data and distance change data. And controlling to reduce the brightness of the light source if the distance change data is larger than the distance threshold and/or the image difference data is larger than the difference threshold.
In some embodiments of the present application, the method further comprises: in the step of adjusting the brightness of the image shot by the camera according to the average gray value, if the average gray value is greater than the first gray threshold and the average gray value is less than the second gray threshold, normalizing the pixel values of the pixel points in the image to obtain the normalized pixel values. And performing enhancement processing on the normalized pixel value to obtain an enhanced pixel value. And adjusting the brightness of the image shot by the camera according to the enhanced pixel value.
In some embodiments of the present application, the method further comprises: and analyzing a distance value corresponding to the current frame image and a distance change value of the preset frame image relative to the current frame image in the distance data. And controlling to reduce the brightness of the light source if the distance value is greater than the distance threshold and the distance change value is greater than the distance change threshold.
In some embodiments of the present application, the method further comprises: after the step of resolving the data, a near threshold and a far threshold between the projection device and the projection surface are obtained. If the distance value is greater than the near-end threshold value and the distance value is less than the far-end threshold value, a process of judging that the distance value is greater than the distance threshold value and the distance change value is greater than the distance change threshold value is executed.
In some embodiments of the present application, the method further comprises: in the step of acquiring the average gray value of the detection area in the image, acquiring a homography matrix between the image and the projection area; calculating the coordinates of the corner points of the projection area in the camera coordinate system based on the homography matrix; calculating the width and height of the projection area from the corner coordinates; extending both sides of the projection area in the width direction by one third of the width to form extension-width regions, and both sides in the height direction by one third of the height to form extension-height regions; and generating the detection area from the extension-width and extension-height regions.
In some embodiments of the present application, the method further comprises: before the step of acquiring the preset number of frames of the brightness-adjusted image, acquiring the brightness-adjusted image; converting its image format to YUV; extracting the gray-scale component of the YUV image; binarizing the gray-scale component with a preset component threshold to obtain a first target image; performing noise removal on the first target image to obtain a denoised image; identifying the denoised image with an edge detection algorithm to obtain the edge lines corresponding to the projection area; and identifying each edge line with a straight-line detection algorithm, and if an edge line consists of at least three straight lines, performing the step of acquiring the preset number of frames of the brightness-adjusted image.
In some embodiments of the present application, the method further comprises: parsing the difference image of the detection area from the image difference data; sequentially performing gray-scale processing and binarization on the difference image to obtain a second target image; extracting the foreground image from the second target image; calculating the ratio of the image area of the foreground image to the area of the detection area; and if the ratio is greater than the difference threshold, controlling the light source to reduce its brightness.
In some embodiments of the present application, the method further comprises: after the step of parsing the difference image of the detection area from the image difference data, acquiring the top region of the difference image, the top region being the area between the top edge of the projection area and the top edge of the difference image; calculating the average brightness value of the top region; and if the average brightness value is greater than the preset brightness threshold, updating the brightness values of the difference image, where the updated brightness values are smaller than the values before updating.
In some embodiments of the present application, the method further comprises: in the step of obtaining the second target image, identifying the image connected regions in the second target image, where an image connected region is a region formed by pixel points of the second target image that have the same pixel value and adjacent pixel positions; calculating the area of each image connected region; and, if the area of an image connected region is smaller than a first area threshold and larger than a second area threshold, removing that image connected region.
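A self-contained sketch of the connected-region filter above, using a breadth-first flood fill in place of whatever labelling the implementation actually uses; the threshold values and names are assumptions.

```python
from collections import deque

def remove_mid_sized_regions(mask, first_thresh, second_thresh):
    """mask: binary image as a list of rows of 0/1.
    Erases 4-connected foreground regions whose pixel count lies strictly
    between second_thresh and first_thresh, as the embodiment describes."""
    h, w = len(mask), len(mask[0])
    out = [row[:] for row in mask]
    seen = [[False] * w for _ in range(h)]
    for sy in range(h):
        for sx in range(w):
            if mask[sy][sx] and not seen[sy][sx]:
                # BFS over the 4-connected component starting at (sy, sx).
                comp, q = [], deque([(sy, sx)])
                seen[sy][sx] = True
                while q:
                    y, x = q.popleft()
                    comp.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                if second_thresh < len(comp) < first_thresh:
                    for y, x in comp:
                        out[y][x] = 0
    return out

m = [[0] * 6 for _ in range(6)]
m[0][0] = 1                       # 1-pixel speck: not above the second threshold, kept
for y in (2, 3):
    for x in (2, 3):
        m[y][x] = 1               # 4-pixel blob: between thresholds, removed
filtered = remove_mid_sized_regions(m, first_thresh=10, second_thresh=2)
```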
The same and similar parts in the embodiments in this specification may be referred to one another, and are not described herein again.
Those skilled in the art will readily appreciate that the techniques of the embodiments of the present invention may be implemented as software plus a required general purpose hardware platform. Based on such understanding, the technical solutions in the embodiments of the present invention may be essentially or partially implemented in the form of software products, which may be stored in a storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and include instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method in the embodiments or some parts of the embodiments of the present invention.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.
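As a compact, non-authoritative summary of the detection logic described in the embodiments above: the light source is dimmed when the distance change data and/or the image difference data exceeds its threshold. The function name and the numeric thresholds are assumptions for illustration.

```python
def should_dim(distance_change, image_difference,
               distance_threshold=0.10, difference_threshold=0.25):
    # Dimming triggers on either signal, matching the "and/or" condition.
    return (distance_change > distance_threshold
            or image_difference > difference_threshold)
```

In practice the two signals cover complementary failure modes: the distance sensor reacts to a person approaching the lens, while frame differencing reacts to motion in the detection area around the picture.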

Claims (10)

1. A projection device, comprising:
a light source;
a distance sensor configured to acquire distance data;
an optical engine configured to project the playing content onto a projection area of a projection surface;
a camera configured to capture a corresponding image of the projection surface;
a controller configured to:
acquiring an average gray value of a detection area in the image, wherein the detection area is the area of the projection surface other than the projection area;
adjusting the brightness of the image captured by the camera according to the average gray value;
acquiring a preset number of frames after the brightness adjustment, together with the distance data at the capture moment of each of those frames, and calculating image difference data and distance change data;
and, if the distance change data is greater than a distance threshold and/or the image difference data is greater than a difference threshold, controlling the light source to reduce its brightness.
2. The projection device of claim 1, wherein the controller is configured to:
in the step of adjusting the brightness of the image captured by the camera according to the average gray value, if the average gray value is greater than a first gray threshold and less than a second gray threshold, normalizing the pixel value of each pixel point in the image to obtain normalized pixel values;
enhancing the normalized pixel values to obtain enhanced pixel values;
and adjusting the brightness of the image captured by the camera according to the enhanced pixel values.
3. The projection device of claim 1, wherein the controller is configured to:
analyzing, from the distance data, the distance value corresponding to the current frame image and the distance change value of a preset frame image relative to the current frame image;
and, if the distance value is greater than the distance threshold and the distance change value is greater than a distance change threshold, controlling the light source to reduce its brightness.
4. The projection device of claim 3, wherein the controller is configured to:
after the analyzing step, acquiring a near-end threshold and a far-end threshold for the distance between the projection device and the projection surface;
and, if the distance value is greater than the near-end threshold and less than the far-end threshold, executing the process of determining whether the distance value is greater than the distance threshold and the distance change value is greater than the distance change threshold.
5. The projection device of claim 4, wherein the controller is configured to:
in the step of acquiring the average gray value of the detection area in the image, obtaining a homography matrix between the image and the projection area;
calculating the corner coordinates of the projection area in the camera coordinate system based on the homography matrix;
calculating the width and height of the projection area from the corner coordinates;
extending outward by one third of the width from both sides of the projection area in the width direction to form an extended-width region, and by one third of the height from both sides in the height direction to form an extended-height region;
and generating the detection region from the extended-width region and the extended-height region.
6. The projection device of claim 1, wherein the controller is further configured to:
acquiring the brightness-adjusted image before the step of acquiring the preset number of frames after the brightness adjustment;
converting the image format of the brightness-adjusted image into YUV format;
extracting the gray-scale component of the YUV-format image;
binarizing the gray-scale component with a preset component threshold to obtain a first target image;
performing noise removal on the first target image to obtain a denoised image;
identifying the denoised image with an edge detection algorithm to obtain the edge lines corresponding to the projection area;
and identifying each edge line with a straight-line detection algorithm, and, if the edge lines consist of at least three straight lines, executing the step of acquiring the preset number of frames after the brightness adjustment.
7. The projection device of claim 1, wherein the controller is further configured to:
parsing the difference image within the detection area from the image difference data;
sequentially performing gray-scale processing and binarization on the difference image to obtain a second target image;
extracting the foreground image from the second target image;
calculating the ratio of the image area of the foreground image to the area of the detection area;
and, if the ratio is greater than the difference threshold, controlling the light source to reduce its brightness.
8. The projection device of claim 7, wherein the controller is further configured to:
after the step of parsing the image difference data to obtain the difference image within the detection area, acquiring the top region of the difference image, wherein the top region is the area between the top edge of the projection area and the top edge of the difference image;
calculating the average luminance value of the top region;
and, if the average luminance value is greater than a preset luminance threshold, updating the luminance value corresponding to the difference image, the updated luminance value being smaller than the value before the update.
9. The projection device of claim 8, wherein the controller is further configured to:
in the step of obtaining the second target image, identifying the image connected regions in the second target image, wherein an image connected region is a region formed by pixel points of the second target image that have the same pixel value and adjacent pixel positions;
calculating the area of each image connected region;
and, if the area of an image connected region is smaller than a first area threshold and larger than a second area threshold, removing that image connected region.
10. A foreign matter detection method, applied to a projection device, wherein the projection device comprises a light source, a distance sensor, an optical engine, a camera, and a controller; the method comprises the following steps:
acquiring an average gray value of a detection area in an image, wherein the detection area is the area of the projection surface other than the projection area;
adjusting the brightness of the image captured by the camera according to the average gray value;
acquiring a preset number of frames after the brightness adjustment, together with the distance data at the capture moment of each of those frames, and calculating image difference data and distance change data;
and, if the distance change data is greater than a distance threshold and/or the image difference data is greater than a difference threshold, controlling the light source to reduce its brightness.
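The normalization-and-enhancement path of claim 2 leaves the enhancement function unspecified; one common choice is a gamma mapping, sketched here purely as an assumption (the exponent 0.5 is illustrative, not taken from the patent):

```python
def enhance(values, gamma=0.5):
    """values: gray-scale pixel values in 0..255.
    Normalize to [0, 1], apply a gamma curve, and rescale to 0..255."""
    out = []
    for v in values:
        norm = v / 255.0                            # normalized pixel value
        out.append(round((norm ** gamma) * 255.0))  # enhanced, rescaled
    return out

enhanced = enhance([0, 64, 255])
```

With gamma below 1 the mapping lifts mid-tones, which helps the later difference step in dim ambient light.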
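The detection-region geometry of claim 5 can be illustrated with a small sketch, under assumed names: the projection rectangle is extended outward by one third of its width on each side and one third of its height above and below, and the detection area is the band between the two rectangles.

```python
def detection_bounds(x0, y0, x1, y1):
    """(x0, y0)-(x1, y1): projection-area corners in camera coordinates.
    Returns the outer rectangle; the detection area is the band between
    this rectangle and the projection area itself."""
    w, h = x1 - x0, y1 - y0
    return (x0 - w / 3.0, y0 - h / 3.0, x1 + w / 3.0, y1 + h / 3.0)

outer = detection_bounds(30, 30, 60, 60)   # 30 x 30 projection area
```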
CN202210594204.7A 2022-05-27 2022-05-27 Projection apparatus and foreign matter detection method Pending CN114928728A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210594204.7A CN114928728A (en) 2022-05-27 2022-05-27 Projection apparatus and foreign matter detection method

Publications (1)

Publication Number Publication Date
CN114928728A true CN114928728A (en) 2022-08-19

Family

ID=82811216

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210594204.7A Pending CN114928728A (en) 2022-05-27 2022-05-27 Projection apparatus and foreign matter detection method

Country Status (1)

Country Link
CN (1) CN114928728A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005091665A (en) * 2003-09-17 2005-04-07 Nec Viewtechnology Ltd Projector and obstacle detecting method
JP2008148089A (en) * 2006-12-12 2008-06-26 Nikon Corp Projection apparatus and camera
WO2013088657A1 (en) * 2011-12-16 2013-06-20 日本電気株式会社 Projecting projector device, optical anti-glare method, and optical anti-glare program
WO2013088656A1 (en) * 2011-12-16 2013-06-20 日本電気株式会社 Object detection device, object detection method, and object detection program
CN107426552A (en) * 2016-05-23 2017-12-01 中兴通讯股份有限公司 Anti-glare method and device, projector equipment
CN109883354A (en) * 2019-03-05 2019-06-14 盎锐(上海)信息科技有限公司 Regulating system and method for projection grating modeling
US20210058593A1 (en) * 2019-08-23 2021-02-25 Coretronic Corporation Projection device and brightness adjusting method thereof

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115361542A (en) * 2022-10-24 2022-11-18 潍坊歌尔电子有限公司 Projector cleanliness self-checking method, device, equipment and storage medium
CN115361542B (en) * 2022-10-24 2023-02-28 潍坊歌尔电子有限公司 Projector cleanliness self-checking method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
CN115174877A (en) Projection apparatus and focusing method of projection apparatus
JP5108093B2 (en) Imaging apparatus and imaging method
CN114339194B (en) Projection display method, apparatus, projection device, and computer-readable storage medium
WO2023087947A1 (en) Projection device and correction method
US9961269B2 (en) Imaging device, imaging device body, and lens barrel that can prevent an image diaphragm value from frequently changing
US8199247B2 (en) Method for using flash to assist in focal length detection
CN115002432A (en) Projection equipment and obstacle avoidance projection method
CN114866751A (en) Projection equipment and trigger correction method
CN115883803A (en) Projection equipment and projection picture correction method
CN115002433A (en) Projection equipment and ROI (region of interest) feature region selection method
CN114928728A (en) Projection apparatus and foreign matter detection method
CN116320335A (en) Projection equipment and method for adjusting projection picture size
WO2024055793A1 (en) Projection device and projection image quality adjustment method
JP6336337B2 (en) Imaging apparatus, control method therefor, program, and storage medium
CN116055696A (en) Projection equipment and projection method
CN114760454A (en) Projection equipment and trigger correction method
CN115604445A (en) Projection equipment and projection obstacle avoidance method
CN115623181A (en) Projection equipment and projection picture moving method
CN114885141A (en) Projection detection method and projection equipment
CN114885142B (en) Projection equipment and method for adjusting projection brightness
CN115022606B (en) Projection equipment and obstacle avoidance projection method
CN114885142A (en) Projection equipment and method for adjusting projection brightness
US10498969B2 (en) Image pickup apparatus, control apparatus, and exposure control method
WO2023087948A1 (en) Projection device and display control method
CN115243021A (en) Projection equipment and obstacle avoidance projection method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination