CN114339182B - Focusing method of projection device, projection device and storage medium - Google Patents

Focusing method of projection device, projection device and storage medium

Info

Publication number
CN114339182B
Authority
CN
China
Prior art keywords
image
projection
lens
pixel
definition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111653144.3A
Other languages
Chinese (zh)
Other versions
CN114339182A (en)
Inventor
姜建德
余横
徐赛杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Shunjiu Electronic Technology Co ltd
Original Assignee
Shanghai Shunjiu Electronic Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Shunjiu Electronic Technology Co ltd filed Critical Shanghai Shunjiu Electronic Technology Co ltd
Priority to CN202111653144.3A priority Critical patent/CN114339182B/en
Publication of CN114339182A publication Critical patent/CN114339182A/en
Application granted granted Critical
Publication of CN114339182B publication Critical patent/CN114339182B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

The present application relates to the technical field of projection devices and discloses a focusing method of a projection device, a projection device and a storage medium, which are used to solve the problem in the related art that blurring of the projection picture cannot be corrected in time. When the usage time of the projection device exceeds a preset time threshold and/or the lens temperature exceeds a preset temperature threshold, multiple frames of the image input to the projection device are acquired; if it is determined, based on the acquired frames, that the image currently input to the projection device displays a still picture, the lens position with the highest definition is determined based on the definition of the projected images at a plurality of projection lens positions; and the projection lens is adjusted to the lens position with the highest definition. Because the frames whose definition is compared are fundamentally guaranteed to show the same picture, this user-imperceptible focusing method adjusts the projection lens position more accurately according to definition and improves the projection focusing effect of the projection device.

Description

Focusing method of projection device, projection device and storage medium
Technical Field
The present disclosure relates to the field of projection devices, and in particular to a focusing method for a projection device, a projection device, and a storage medium.
Background
A projection device is a very precise product and its optical lens is a precise element, so even a slight change in the hardware of the projection device may blur the projected picture. Such blurring often appears after the projection device has been used for some time.
In the prior art, common automatic focusing displays a focusing pattern while the user is watching, which seriously affects the user experience, while existing user-imperceptible (no-sense) focusing cannot finely control the effect after focusing, so its result is poor. Consequently, blurring of the projection picture cannot be corrected in time in the related art.
Disclosure of Invention
The purpose of the present application is to provide a focusing method of a projection device, a projection device and a storage medium, which are used to solve the problem in the related art that blurring of the projection picture cannot be corrected in time.
In a first aspect, the present application provides a focusing method of a projection device, the method including:
when the using time of the projection equipment exceeds a preset time threshold or the lens temperature exceeds a preset temperature threshold, acquiring a multi-frame image input to the projection equipment;
if the image currently input to the projection equipment is determined to display a still picture based on the acquired multi-frame image, determining a lens position with highest definition based on the definition of the projection images of the plurality of projection lens positions;
And adjusting the position of the projection lens to the lens position with the highest definition.
In one possible implementation, based on the acquired multi-frame image, determining whether the image currently input to the projection device displays a still picture specifically includes:
acquiring and storing a first frame image of the multi-frame images, and performing the following for each subsequently acquired frame image:
comparing the acquired current frame image with a previous frame image, and determining the similarity between the current frame image and the previous frame image;
if the similarity is smaller than a first similarity threshold, configuring a still picture parameter to be a first value;
if the similarity is greater than or equal to the first similarity threshold, increasing the still picture parameter by a second value;
if the parameter value of the still picture parameter is greater than or equal to the parameter threshold value, determining that the image currently input to the projection device displays a still picture;
and if the parameter value of the still picture parameter is smaller than the parameter threshold value, determining that the image currently input to the projection equipment displays a dynamic picture, taking the current frame image as a first frame image, and returning to execute the step of comparing the acquired current frame image with the previous frame image to determine the similarity between the current frame image and the previous frame image.
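The counting scheme described above can be illustrated with a minimal sketch. The helper frame_similarity() and all constants (reset value, increment, similarity threshold of 0.98, parameter threshold of 50) are illustrative assumptions rather than values fixed by the present application:

```python
# Minimal sketch of the counter-based still-picture check (assumed names and values).

FIRST_VALUE = 0          # value the still-picture parameter is reset to
SECOND_VALUE = 1         # increment applied when two frames are judged the same
FIRST_SIMILARITY_THRESHOLD = 0.98
PARAMETER_THRESHOLD = 50

def is_still_picture(frames, frame_similarity):
    """Return True once enough consecutive input frames are judged to be the same picture."""
    still_param = FIRST_VALUE
    previous = frames[0]                      # store the first frame image
    for current in frames[1:]:                # each subsequently acquired frame
        if frame_similarity(current, previous) >= FIRST_SIMILARITY_THRESHOLD:
            still_param += SECOND_VALUE       # same picture: accumulate the parameter
        else:
            still_param = FIRST_VALUE         # picture changed: restart counting
        if still_param >= PARAMETER_THRESHOLD:
            return True                       # still picture confirmed
        previous = current                    # the current frame becomes the new reference
    return False                              # treated as a moving picture so far
```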
In one possible implementation, based on the acquired multi-frame image, determining whether the image currently input to the projection device displays a still picture specifically includes:
the multi-frame images are pairwise paired to obtain a plurality of image pairs;
determining image similarity for each image pair;
if the image similarity of each pair of image pairs is greater than or equal to a second similarity threshold, determining that the image currently input to the projection device displays a still picture;
and if the image similarity of any image pair is smaller than the second similarity threshold, determining that the image currently input to the projection device displays a dynamic picture, re-acquiring multi-frame images, and returning to the step of pairing the multi-frame images pairwise to obtain a plurality of image pairs.
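A minimal sketch of this pairwise check follows, assuming that "pairwise pairing" means forming every two-frame combination (disjoint consecutive pairs would work analogously); frame_similarity() and the threshold value are illustrative assumptions:

```python
# Minimal sketch of the pairwise still-picture check (assumed names and values).

from itertools import combinations

SECOND_SIMILARITY_THRESHOLD = 0.98   # illustrative value

def is_still_picture_pairs(frames, frame_similarity):
    """Return True only if every image pair formed from the frames is similar enough."""
    for a, b in combinations(frames, 2):                     # pair the frames two by two
        if frame_similarity(a, b) < SECOND_SIMILARITY_THRESHOLD:
            return False                                     # any dissimilar pair: moving picture
    return True                                              # all pairs similar: still picture
```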
In one possible implementation manner, the determining the lens position with the highest definition based on the definition of the projection images of the plurality of projection lens positions specifically includes:
respectively determining the definition of projection images of a plurality of lens positions;
determining a position direction in which the definition of the projected image becomes large based on the definition of the projected images of the plurality of lens positions;
and continuously adjusting the lens position to the next lens position along the position direction, and calculating the definition of the projection image of the next lens position until the lens position of the projection image with the maximum definition in the position direction is determined.
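A minimal sketch of this search follows, assuming lens positions are numeric indices that can be stepped by one, and that move_lens_to(), capture_projection() and sharpness() are available device interfaces; none of these names come from the present application:

```python
# Minimal sketch: probe a few lens positions, pick the direction in which the
# definition grows, then keep stepping in that direction until it stops increasing.

def find_sharpest_position(initial_positions, move_lens_to, capture_projection,
                           sharpness, max_steps=100):
    # 1) determine the definition of the projected image at several lens positions
    scores = []
    for pos in initial_positions:
        move_lens_to(pos)
        scores.append(sharpness(capture_projection()))

    # 2) position direction in which the definition of the projected image becomes larger
    step = 1 if scores[-1] >= scores[0] else -1
    best_pos = initial_positions[scores.index(max(scores))]
    best_score = max(scores)

    # 3) continue adjusting the lens one position at a time along that direction
    next_pos = best_pos + step
    for _ in range(max_steps):                 # bounded walk (lens travel is finite)
        move_lens_to(next_pos)
        score = sharpness(capture_projection())
        if score <= best_score:                # definition no longer increases: stop
            break
        best_pos, best_score = next_pos, score
        next_pos += step
    return best_pos
```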
In one possible implementation manner, determining the definition of the projection image of each lens position specifically includes:
acquiring a pixel value of each pixel point in the projection image of the lens position;
calculating pixel differences between adjacent pixel rows, and adding the pixel differences of the adjacent pixel rows to obtain the pixel difference in the pixel row direction; and
calculating pixel differences between adjacent pixel columns, and adding the pixel differences of each adjacent pixel column to obtain the pixel differences in the direction of the pixel column;
and adding the pixel difference in the pixel row direction and the pixel difference in the pixel column direction to obtain the definition of the projection image.
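A minimal sketch of this row/column definition measure follows. Absolute differences are used so that positive and negative pixel differences do not cancel; the description above writes plain differences, so this is one possible reading rather than the definitive formula:

```python
# Minimal sketch of the row/column definition measure (absolute differences assumed).

import numpy as np

def sharpness_row_col(image):
    """image: 2-D array of pixel values of the projected image."""
    img = image.astype(np.int64)
    dh = int(np.abs(img[1:, :] - img[:-1, :]).sum())   # pixel difference in the pixel row direction
    dv = int(np.abs(img[:, 1:] - img[:, :-1]).sum())   # pixel difference in the pixel column direction
    return dh + dv                                     # definition of the projection image
```

Using absolute (or squared) differences is a standard gradient-based focus measure; with signed differences most terms would cancel and only the image borders would contribute.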
In one possible implementation manner, the determining, for each lens position, the definition of the projection image of the lens position specifically includes:
acquiring a pixel value of each pixel point in the projection image of the lens position;
calculating pixel differences between each pixel point and pixel values of adjacent points in a preset adjacent area respectively to serve as adjacent point pixel differences;
determining the sum of adjacent pixel differences of the pixel point and each adjacent point to obtain a neighborhood pixel difference of the pixel point;
and determining the sum of neighborhood pixel differences of each pixel point in the projection image to obtain the definition of the projection image.
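A minimal sketch of this neighborhood definition measure follows, assuming a 3×3 preset neighborhood and absolute differences; pixels outside the image border are simply skipped. These choices are illustrative assumptions:

```python
# Minimal sketch of the neighborhood definition measure (3x3 neighborhood assumed).

import numpy as np

def sharpness_neighborhood(image):
    """image: 2-D array of pixel values of the projected image."""
    img = image.astype(np.int64)
    height, width = img.shape
    total = 0
    for y in range(height):
        for x in range(width):
            for dy in (-1, 0, 1):                       # 3x3 neighborhood around the pixel
                for dx in (-1, 0, 1):
                    if dy == 0 and dx == 0:
                        continue                        # skip the pixel itself
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < height and 0 <= nx < width:
                        total += abs(img[y, x] - img[ny, nx])   # adjacent-point pixel difference
    return int(total)                                   # sum over all pixels: the definition
```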
In a possible implementation manner, after the usage time of the projection device exceeds a preset time threshold and/or after the lens temperature exceeds a preset temperature threshold, the method further includes:
and returning to the step of acquiring the multi-frame images input to the projection equipment every time the time interval is designated.
In one possible embodiment, the method further comprises:
Continuously monitoring whether the image currently input to the projection device is a still picture or not;
and if the image currently input to the projection equipment is changed from a still picture to a dynamic picture in the process of determining the lens position with the highest definition based on the definition of the projection images of the plurality of projection lens positions, acquiring the lens position with the highest definition obtained by the last focusing operation, and adjusting the position of the projection lens to the lens position with the highest definition obtained by the last focusing operation.
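A minimal sketch of this abort-and-revert behavior follows; is_still(), last_best_position and the other interfaces are assumed names, not part of the present application:

```python
# Minimal sketch: abort the definition search and revert to the previous best
# lens position if the input turns from a still picture into a moving picture.

def focus_with_guard(candidate_positions, move_lens_to, capture_projection,
                     sharpness, is_still, last_best_position):
    best_pos, best_score = last_best_position, float("-inf")
    for pos in candidate_positions:
        if not is_still():                        # picture turned dynamic mid-focus
            move_lens_to(last_best_position)      # fall back to the previous result
            return last_best_position
        move_lens_to(pos)
        score = sharpness(capture_projection())
        if score > best_score:
            best_pos, best_score = pos, score
    move_lens_to(best_pos)                        # adjust to the sharpest position found
    return best_pos
```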
In a second aspect, the present application provides a projection device comprising:
a computer readable storage medium, a processor, a projection lens, wherein:
the projection lens is used for outputting an image;
the computer-readable storage medium is for storing the processor-executable instructions;
Wherein the processor is configured to execute the instructions to implement the projection device focusing method of any one of the first aspect above.
In a third aspect, the present application also provides a computer-readable storage medium comprising:
the instructions in the computer-readable storage medium, when executed by the processor of the projection device, enable the projection device to perform the projection device focusing method of any one of the first aspects described above.
In a fourth aspect, the present application also provides a computer program product comprising a computer program which, when executed by the processor, implements the projection device focusing method of any one of the first aspects described above.
The technical scheme provided by the embodiment of the application at least brings the following beneficial effects:
the embodiment of the application provides a focusing method of projection equipment, the projection equipment and a storage medium, wherein when the using time of the projection equipment exceeds a preset time threshold and/or the lens temperature exceeds a preset temperature threshold, multi-frame images input to the projection equipment are acquired; if the image currently input to the projection equipment is determined to display a still picture based on the acquired multi-frame image, determining a lens position with highest definition based on the definition of the projection images of the plurality of projection lens positions; and adjusting the position of the projection lens to the lens position with the highest definition. The application provides a noninductive focusing method, which can adjust the position of a projection lens to the position with the highest definition by sensing the temperature of the lens and comparing the definition of the projection images at the positions of a plurality of projection lenses of a still image input to projection equipment when the temperature of the lens reaches a certain temperature, and can also solve the problem of projection blurring caused by temperature rise in time when the periodic focusing is performed by adopting preset time. In addition, the image adopted in the application is the image input to the projection equipment, so that whether the image input to the projection equipment is a still image or not is directly judged from the input source of the projection image, the multi-frame projection image for judging the definition is fundamentally ensured to be the same image, the position of the projection lens is adjusted more accurately according to the definition, and the projection focusing effect of the projection equipment is improved.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the application. The objectives and other advantages of the application will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments of the present application will be briefly described below, and it is obvious that the drawings that are described below are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1a is a first schematic structural diagram of a projection device according to an embodiment of the present disclosure;
FIG. 1b is a second schematic structural diagram of a projection device according to an embodiment of the present disclosure;
FIG. 1c is a third schematic structural diagram of a projection device according to an embodiment of the present disclosure;
FIG. 1d is a fourth schematic structural diagram of a projection device according to an embodiment of the present disclosure;
FIG. 1e is a fifth schematic structural diagram of a projection device according to an embodiment of the present disclosure;
Fig. 2 is a flowchart of a focusing method of a projection device according to an embodiment of the present application;
fig. 3 is a schematic flowchart of determining a lens position of a projection image with maximum sharpness according to an embodiment of the present application;
fig. 4 is a flowchart of a method for determining sharpness of a projection image according to an embodiment of the present application;
fig. 5 is a schematic view of a projection image of a current lens position according to an embodiment of the present application;
FIG. 6 is a flowchart of another method for determining sharpness of a projection image according to an embodiment of the present disclosure;
fig. 7 is a schematic diagram of a plurality of lens positions of a projection device according to an embodiment of the present disclosure;
fig. 8 is a flowchart of another focusing method of a projection device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. Wherein the described embodiments are some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
Also, in the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. The term "and/or" merely describes an association between associated objects and indicates that three relationships may exist; for example, "A and/or B" may represent the three cases where A exists alone, A and B exist together, and B exists alone. In addition, in the description of the embodiments of the present application, "plural" means two or more.
The terms "first", "second" are used in the following for descriptive purposes only and are not to be construed as implying or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first", "a second", or "a second" may include one or more such feature, either explicitly or implicitly, and in the description of embodiments of the application, the meaning of "a plurality" is two or more, unless otherwise indicated.
In the following, some terms in the embodiments of the present application are explained for easy understanding by those skilled in the art.
1) Projection device: the device can project images onto a curtain, is widely applied to places such as families, offices and schools, and brings great convenience to life of people. The projection device is a very precise product, the optical lens is also a precise element, so that the condition that the image is blurred or is generated by slight change of hardware of the projection device can be said, and after the projection device is used for a period of time, the image is often slightly blurred, which is caused by thermal expansion and cold contraction of the lens.
2) Focusing: after the projected picture of the projection device becomes blurred, the projection device makes the picture clearer through its focusing function. At present, focusing of projection devices includes manual focusing and automatic focusing.
3) Manual focusing: the definition of the projection picture needs to be adjusted manually, and the focusing mode is to toggle the focusing ring. The focusing ring is generally positioned above the lens, and the principle is that the focusing ring is shifted to adjust the lens inside the lens of the projection equipment, so that the focal length from the lens to the projection wall surface is changed, and the picture becomes clear.
4) Automatic focusing: as the name implies, the sharpness of the picture does not need to be adjusted manually; when the position of the projection device is moved or other conditions blur the picture, the projection device adjusts the sharpness automatically until the picture is sharp. Common automatic focusing displays a focusing pattern in the middle of the screen and completes focusing by detecting the definition of this preset pattern, which takes a long time. Imperceptible (no-sense) focusing measures the distance between the projection device and the projection screen with a ranging device and finds the lens position required for that projection distance by table look-up, so as to complete automatic focusing.
The focusing of the projection equipment is realized by comparing the definition of the image projected onto the screen under different lens positions and then adjusting the lens to the position of the maximum definition of projection. When the lens positions are the same, the projection definition of two identical images is the same.
In the prior art, common automatic focusing displays a focusing pattern while the user is watching, which seriously affects the user experience, while existing user-imperceptible focusing cannot finely control the effect after focusing, so its result is poor. A focusing method of the projection device that improves the current projection effect therefore remains to be provided.
In view of this, the present application provides a focusing method of a projection device, a projection device and a storage medium, which are used for solving the problem of poor projection focusing effect of the projection device.
The inventive concept of the present application can be summarized as follows: in the embodiments of the present application, when the usage time of the projection device exceeds a preset time threshold and/or the lens temperature exceeds a preset temperature threshold, multiple frames of the image input to the projection device are acquired; if it is determined, based on the acquired frames, that the image currently input to the projection device displays a still picture, the lens position with the highest definition is determined based on the definition of the projected images at a plurality of projection lens positions; and the projection lens is adjusted to the lens position with the highest definition. The present application thus provides a user-imperceptible focusing method: by sensing the lens temperature, the projection lens can be moved to the position of highest definition once the lens reaches a certain temperature, by comparing the definition of the projected images, at a plurality of projection lens positions, of a still picture input to the projection device; periodic focusing triggered by a preset time can likewise correct, in time, projection blurring caused by the temperature rise. In addition, the images used in the present application are the images input to the projection device, so whether the input is a still picture is judged directly from the input source of the projected image. This fundamentally guarantees that the multiple projection images whose definition is compared show the same picture, so the projection lens position is adjusted more accurately according to definition and the projection focusing effect of the projection device is improved.
After the inventive concept of the present application is introduced, some simple descriptions are made below on application scenarios applicable to the technical solution of the embodiments of the present application, and it should be noted that the application scenarios described below are only used to illustrate the embodiments of the present application and are not limiting. In specific implementation, the technical scheme provided by the embodiment of the application can be flexibly applied according to actual needs.
Fig. 1a shows a first schematic structure of a projection apparatus of the present application, fig. 1b shows a second schematic structure of a projection apparatus of the present application, fig. 1c shows a third schematic structure of a projection apparatus of the present application, and fig. 1d shows a fourth schematic structure of a projection apparatus of the present application.
As shown in fig. 1a, the projection device 100 may include a processor, a computer-readable storage medium, at least one light source 120, and a light valve 130. The display control assembly 110 may serve as the processor to execute the focusing method provided in the embodiments of the present application. The display control assembly 110 may be a digital light processing chip (DLPC), for example a DLPC6540; the computer-readable storage medium is not shown in fig. 1a. The light source 120 is a laser light source, a bulb light source, or a semiconductor light-emitting diode (LED) light source, and may include at least one group of lasers in one-to-one correspondence with the at least one laser driving assembly 111. "At least one" means one or more, and "a plurality" means two or more; "at least one group" means one or more groups, and each group may include one or more lasers. For example, referring to fig. 1a, if the light source 120 is a laser light source, the light source 120 includes a blue laser 121, a red laser 122, and a green laser 123. The light valve 130 is a digital micromirror device (DMD).
In the embodiment of the present application, referring to fig. 1b, if the projection device is a projection television, the projection device may further include a power source 140, a start control component 150, a program storage component 160, and a main control chip 170, where the main control chip 170 may be used as a processor to implement the focusing method provided in the embodiment of the present application, and the program storage component 160 may be used as a computer readable memory. The main control chip 170 is connected to the start control unit 150 and the display control unit 110, the power supply 140 is connected to the laser driving unit 111, and the program storage unit 160 is connected to the display control unit 110.
The main control chip 170 transmits a start command to the start control module 150, and the start control module 150 starts to operate after receiving the start command, and sequentially outputs, for example, 1.1 volt (V), 1.8V,3.3V,2.5V, and 5V to the display control module 110 according to a power-up timing of the start control module 150 to supply power to the display control module 110. After the power supply voltage and the time sequence are correct, the start control component 150 sends a power sense (POSENSE) signal and a power good (PWRGOOD) signal to the display control component 110, and after receiving the two control signals, the display control component 110 reads a program from the external program storage component 160 and initializes the program, and at this time, the whole projection device starts to work. The display control assembly 110 configures the activation control assembly 150 via serial peripheral interface (serial peripheral interface, SPI) communication and instructs the activation control assembly 150 to begin powering the light valve 130. The start-up control unit 150 outputs 3 voltages to the light valve 130, wherein the voltages are respectively Voltage Bias (VBIAS) of 18V, voltage Reset (VRST) of-14V, and Voltage Offset (VOFS) of 10V, and the light valve 130 starts to operate after the voltage of the light valve 130 is normal. The display control component 110 sends the primary color gradation values of the sub-images to the light valve 130 at 594MHz, for example, over a high-speed serial interface (high-speed serial interface, HSSI) to effect display of the images. Power in the projection device is supplied by power supply 140 converting ac power, for example, 100V to 240V, to dc power for each component.
Referring to fig. 1c, if the light source 120 of the laser projection device includes two sets of red lasers 1201, one set of blue lasers 1202 and one set of green lasers 1203 arranged integrally, the projection device may be referred to as a full-color laser projection device. Since the blue laser 1202 can withstand higher temperatures, it is disposed between the red laser 1201 and the green laser 1203, which is more beneficial to rapid heat dissipation of the red laser 1201 and the green laser 1203, so that the reliability of the integrated sets of lasers is higher. Referring to fig. 1c, the full-color laser projection device may further comprise four reflective mirrors 70, a lens assembly 80, a diffuser 90, a light pipe 60, a total internal reflection (TIR) lens 11, a projection lens 12 and a projection screen 13. The lens assembly 80 comprises a first lens 801-80, a second lens 802-80 and a third lens 803-80. Each set of lasers is provided with a mirror plate 70.
In the process of projecting and displaying a first frame of sub-image, the blue laser light emitted by the blue laser 1202 is reflected by the reflecting mirror 70 at the corresponding position, condensed by the first lenses 801-80, homogenized by the diffusion wheel 90, and then totally reflected and further homogenized by the light guide 60. The red laser light emitted by the red laser 1201 is reflected by the reflecting mirror 70 at the corresponding position, condensed by the first lenses 801-80, speckle-dispersed and chromaticity-homogenized by the diffusion wheel 90, and totally reflected by the light guide 60. The green laser light emitted by the green laser 1203 is reflected by the reflecting mirror 70 at the corresponding position, condensed by the first lenses 801-80, speckle-dispersed and chromaticity-homogenized by the diffusion wheel 90, and totally reflected by the light guide 60. The blue, red and green laser light homogenized by the light guide 60 is shaped in a time-sharing manner by the second lens 802-80 and the third lens 803-80 and enters the TIR lens 11 for total reflection. While the three primary-color lights irradiate the light valve in time sequence, the display control assembly 110 controls the light valve 130 to flip according to the primary color gradation values of the pixels in the first frame of sub-image; the flipped light valve 130 reflects the light totally reflected by the TIR lens 11, which passes through the TIR lens 11 again and is finally projected onto the projection screen 13 through the projection lens 12.
Furthermore, as shown in fig. 1c, the projection device may further comprise: and one first luminance sensor W1 provided on the light emitting side of each laser, the first luminance sensor W1 being for detecting the light emitting luminance of the corresponding one of the lasers. The first luminance sensor W1 provided at the light emitting side of the blue laser 1202 may be a blue luminance sensor. The first luminance sensor W1 provided at the light emitting side of the red laser 1201 may be a red light luminance sensor. The first luminance sensor W1 provided on the light emitting side of the green laser 1203 may be a green luminance sensor.
Alternatively, as shown in fig. 1c, the projection device may further include: a second luminance sensor W2 is disposed on the light emitting side of the light guide 60, and the second luminance sensor W2 may be a white light luminance sensor.
Alternatively still, the projection device may include both the first luminance sensor W1 and the second luminance sensor W2.
In the related art, referring to fig. 1d, the main control chip 170 of the projection television may be used as a processor to implement the focusing method provided in the embodiments of the present application, and the program storage component 160 may be used as a computer-readable storage medium to store computer-executable instructions. After receiving a 4K video signal or digital TV signal, the main control chip 170 decodes the image signal and transmits the image signal with a resolution of 3840×2160 to the field programmable gate array (FPGA) 202 in the form of 8 VX1 signals at a rate of 60 Hz. After the FPGA 202 processes the 3840×2160 image signal, one frame of the 4K (i.e., 3840×2160) signal is decomposed into 4 sub-frame 2K (i.e., 1920×1080) signals and buffered into 2 sets of double data rate (DDR) memory 203 externally connected to the FPGA 202, where the DDR 203 uses a 14-bit address (ADDR) bus and a 32-bit data bus. The FPGA power management outputs 1.1 V, 1.15 V, 1.5 V, 2.5 V, 3.3 V, DDR_VTT and DDR_VREF to power the FPGA 202 and the DDR 203. The FPGA 202 inputs the primary color gradation values of one frame of the 2K (1920×1080) sub-image into the first control chip 208 and the second control chip 209, respectively, in the form of 60-bit transistor-transistor logic (TTL) data. The first control chip 208 and the second control chip 209 each handle half of the primary color gradation values of one frame of sub-image, and transmit primary color gradation values of (960+32)×1080 pixels to the light valve 130 in a 2-way low-voltage differential signaling (LVDS) data format at 240 Hz, where the extra 32 columns of pixels are the pixels that need overlap processing. Because each control chip handles half of the primary color gradation values of a sub-image, high-speed transmission of the primary color gradation values is achieved. The first control chip 208 drives 2 ways of 16 pairs, 32 LVDS pairs in total, to transmit primary color gradation values to the light valve 130 and controls half of the image display; the second control chip 209 likewise drives 2 ways of 16 pairs, 32 LVDS pairs in total, and controls the other half of the image display. That is, the first control chip 208 and the second control chip 209 together drive 4 ways, 64 LVDS pairs in total, to transmit primary color gradation values to the light valve 130 at 240 Hz for 2K (1920×1080) image display, and the 200 millivolt (mV) amplitude between the LVDS data pairs effectively ensures signal integrity and reduces electromagnetic interference (EMI). The power supply of the first control chip 208 and the second control chip 209 is provided by the start control component 150: the first control chip 208 sends a control command to start the start control component 150, and the start control component 150 sequentially outputs 1.1 V, 1.8 V, 3.3 V, 2.5 V and 5 V according to the power-up timing of the first control chip 208 and the second control chip 209 to supply power to them.
After the supply voltages and timing are correct, the start control component 150 outputs the two control signals POSENSE and PWRGOOD to the first control chip 208. After receiving the two control signals, the first control chip 208 reads a program from the external program storage component 160 and performs initialization, at which point the whole projection device starts to operate. The first control chip 208 configures the start control component 150 through SPI communication and sends a power-up command for the light valve 130; the start control component 150 outputs the 3 voltages VBIAS of 18 V, VRST of -14 V and VOFS of 10 V, and the light valve 130 can start to operate once its voltages are normal. For example, the first control chip 208 and the second control chip 209 are DLPC6421 chips.
The display control assembly 110 provided in the embodiments of the present disclosure can implement the functions of one FPGA chip, four DDR chips, and the first and second control chips 208 and 209 of the related art, which simplifies the circuit and reduces cost. The wiring of the PCB on which the display control assembly is mounted is simpler and requires fewer layers; the PCB is smaller and cheaper, which also facilitates miniaturization of the projection device. The other parts of a projection device using the integrated display control assembly 110 are unchanged, facilitating rapid introduction of the product.
Fig. 1e shows a fifth schematic structure of the projection device 100 of the present application. The projection device 100 includes: a light source 120, a projection lens 12, a processor 14, a projection screen 13 and a computer-readable storage medium 15. The structure shown in fig. 1e does not limit the present application.
The light source 120 provides illumination to the projection lens 12, the processor 14 modulates the light beam of the light source, the projection lens 12 images the modulated light beam, and the modulated light beam is projected onto the projection screen 13 to form an image.
A projection lens 12 for projecting an image on a projection screen;
a processor 14 for determining whether the image input to the projection device is a still image or a moving image, and automatically adjusting the position of the projection lens 12 according to the sharpness of the projected image projected on the projection screen by the image input to the projection device. In this application, processor 14 is used to perform any of the projection device focusing methods described herein.
A projection screen 13 for displaying the image projected by the projection lens 12.
A computer readable storage medium 15 for storing processor executable instructions.
The foregoing is merely an alternative embodiment of the disclosure, and is not intended to limit the disclosure, so that any modifications, equivalents, improvements, etc. that fall within the spirit and principles of the present disclosure are intended to be included within the scope of the present disclosure.
In order to further explain the technical solutions provided in the embodiments of the present application, the following details are described with reference to the accompanying drawings and the detailed description.
Common automatic focusing generates a focusing pattern while the user is watching, which seriously affects the user experience. In user-imperceptible (no-sense) focusing, a still picture projected by the projection device is used instead of a focusing pattern, so that the focusing process is not perceived by the user.
In one possible implementation, a camera-capture method may be used to determine whether the picture projected by the projection device is a still picture: two projection images are captured at different lens positions; if their similarity is high, the two projected pictures are considered to be the same image, their sharpness values are calculated and compared, the lens position whose projection image is sharper is selected, and the lens is adjusted to that position, completing user-imperceptible focusing. However, the two images being compared are captured at two different lens positions. On the one hand, even if the input images are the same, the captured projection images may differ to some extent because the projection lens is at different positions when they are captured; on the other hand, even if the input images are different, the captured projection images may look similar because of the blur introduced by moving the lens, and be misjudged as the same image. Either way the sharpness comparison becomes unreliable, so the focusing effect still needs to be improved. For this reason, an embodiment of the present application provides a focusing method of a projection device. Fig. 2 is a flowchart of the focusing method of a projection device according to an embodiment of the present application. As shown in fig. 2, the method includes the following steps:
In step 201, when the usage time of the projection device exceeds a preset time threshold or the lens temperature exceeds a preset temperature threshold, a multi-frame image input to the projection device is acquired.
In one possible implementation, when the projection device has just been started, the lens temperature is low and the lens is not deformed, so the projection device can be focused in the usual ways, such as manual focusing or common automatic focusing. After the projection device has been used for a period of time, the lens temperature may become too high, deforming the lens and affecting picture definition. Therefore, in the embodiments of the present application a preset time threshold or a preset temperature threshold is set, and whether to focus with the focusing method provided by the present application is decided in the following two cases.
First case: when the service time of the projection equipment exceeds a preset time threshold, the temperature of a lens may be too high, and a picture is blurred, and at the moment, focusing is performed by using the focusing method of the projection equipment.
Second case: when the temperature of the lens of the projection equipment exceeds a preset temperature threshold, the fact that the temperature of the lens is too high is indicated, and the picture is blurred, and focusing is performed by using the focusing method of the projection equipment.
Therefore, the problem of projection blurring caused by temperature rise can be solved in time by adopting the preset time for periodical focusing.
The focusing method of the projection device provided in the embodiments of the present application judges stillness on the image input to the projection device rather than on the projection image projected onto the projection curtain. Therefore, no matter how the lens position changes, whether the image input to the projection device is a still picture can be judged directly from the input source of the projected image, which fundamentally guarantees that the multiple frames whose definition is compared show the same picture and makes the definition comparison in automatic focusing more accurate.
In another possible implementation manner, in order to save power consumption, in this embodiment of the present application, after the usage time of the projection device exceeds the preset time threshold or after the lens temperature exceeds the preset temperature threshold, multiple frames of images input to the projection device may be acquired at intervals of a specified duration, for example, the specified duration may be set to be 2 minutes, so that after the usage time of the projection device exceeds the preset time threshold or after the lens temperature exceeds the preset temperature threshold, multiple frames of images input to the projection device may be detected at intervals of 2 minutes. Thus, although it may not be possible to timely detect whether an image input to the projection apparatus is a still image or a moving image, the frequency of focusing by the projection apparatus according to sharpness may be controlled, and power consumption may be saved.
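A minimal sketch of this trigger logic follows; the concrete time and temperature thresholds and all function names are illustrative assumptions, with only the 2-minute interval taken from the example above:

```python
# Minimal sketch of the step 201 trigger: once running time or lens temperature
# exceeds its preset threshold, sample the input frames every specified interval.

import time

TIME_THRESHOLD_S = 30 * 60      # preset usage-time threshold (illustrative)
TEMP_THRESHOLD_C = 45.0         # preset lens-temperature threshold (illustrative)
CHECK_INTERVAL_S = 2 * 60       # specified interval between detections

def focus_monitor(get_uptime_s, get_lens_temp_c, acquire_frames, try_focus):
    while True:
        if (get_uptime_s() > TIME_THRESHOLD_S
                or get_lens_temp_c() > TEMP_THRESHOLD_C):
            frames = acquire_frames()   # multi-frame images input to the projection device
            try_focus(frames)           # still-picture check and definition-based focusing
        time.sleep(CHECK_INTERVAL_S)    # re-check after the specified duration
```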
In step 202, if it is determined that the image currently input to the projection apparatus displays a still picture based on the acquired multi-frame image, a lens position with the highest definition is determined based on the definition of the projection images of the plurality of projection lens positions.
In one possible implementation manner, in order to determine, based on the acquired multiple frame images, whether the image currently input to the projection device displays a still picture, the embodiment of the present application may acquire a first frame image of the multiple frame images to store, and perform, for each frame image acquired subsequently, respectively:
comparing the acquired current frame image with the previous frame image, and determining the similarity between the current frame image and the previous frame image; if the similarity is smaller than the first similarity threshold, configuring the still picture parameter as a first value; if the similarity is greater than or equal to the first similarity threshold, increasing the still picture parameter by a second value;
if the parameter value of the still picture parameter is greater than or equal to the parameter threshold, it is determined that the image currently input to the projection device displays a still picture; if the parameter value is smaller than the parameter threshold, it is determined that the image currently input to the projection device displays a dynamic picture, the current frame image is taken as the first frame image, and the step of comparing the acquired current frame image with the previous frame image to determine their similarity is executed again. In this way a loop that searches for multiple frames showing the same picture is realized. In implementation, the loop may run until a still picture is determined, or it may stop if no still picture has been found after a specified number of frames have been processed, or after the loop has run for a specified duration.
For example, a first similarity threshold may be set in the projection device; the larger the first similarity threshold, the better, because when the similarity of two frames exceeds a large threshold it can be guaranteed as far as possible that the image input to the projection device is a still picture. Assume the first value is 0 and the second value is 1. If the similarity of two frames is smaller than the first similarity threshold, the current frame and the previous frame are not considered the same picture and the still picture parameter is set to 0; if the similarity is greater than or equal to the first similarity threshold, the current frame and the previous frame are considered the same picture and the still picture parameter is increased by 1. Whenever the similarity of two frames falls below the first similarity threshold, the still picture parameter is immediately reset to 0. Once the still picture parameter is greater than or equal to the parameter threshold, it is determined that the image currently input to the projection device displays a still picture, and the definition of the images projected at the plurality of projection lens positions is then acquired. For example, if the similarity of the first and second frames is greater than or equal to the first similarity threshold, the still picture parameter becomes 1; if the similarity of the second and third frames is also greater than or equal to the threshold, the parameter increases by 1 to 2. If at this point the similarity of the third and fourth frames is less than the threshold, the parameter immediately becomes 0; if the similarity of the fourth and fifth frames is then greater than or equal to the threshold, the parameter increases by 1 and becomes 1 again, and so on, until the still picture parameter reaches the parameter threshold, for example 50, at which point it is determined that the image currently input to the projection device displays a still picture.
The similarity between the current frame image and the previous frame image may be obtained by calculating, through the SOC (system on chip), a difference value between the current frame image and the previous frame image input to the projection device; this difference value may be denoted DIFF (difference). The DIFF value is the sum of the differences of the corresponding pixels of the current frame image and the previous frame image. If the two frames are identical, every pixel difference is 0 and the DIFF value is 0; if the two frames are completely different, the pixel difference is 100% and the DIFF value is 100%. The larger the difference between the two frames, the larger the pixel differences and the resulting DIFF value, and the smaller the similarity.
In one possible embodiment, if a percentage is used to represent the similarity between the current frame image and the previous frame image, the similarity may be obtained by subtracting the DIFF percentage from one hundred percent. The DIFF value can be converted into a percentage as follows: for a frame of 4K image with a size of 3840×2160 and a pixel bit width of 8 bits (i.e., pixel values of 0-255), the maximum DIFF value is 3840×2160×255. If the PQ (picture quality) modules in the SOC do not downsample the image, there are three color channels R, G and B; the DIFF value of each color channel can be converted into a percentage using DIFF/(3840×2160×255)×100, and the percentages of the three channels are summed and averaged to obtain the total percentage. If the PQ modules in the SOC downsample the image, the three color channels are mixed into one channel, and the DIFF value of that channel is converted directly into a percentage using the same DIFF/(3840×2160×255)×100 calculation.
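A minimal sketch of this DIFF-to-percentage conversion follows; function and parameter names are illustrative assumptions:

```python
# Minimal sketch: convert per-channel DIFF values into a percentage for a
# 3840x2160 image with 8-bit pixels, following the calculation described above.

WIDTH, HEIGHT, MAX_PIXEL = 3840, 2160, 255

def diff_to_percentage(channel_diffs):
    """channel_diffs: one DIFF value per color channel (three channels if the PQ
    module did not downsample, a single mixed channel if it did)."""
    max_diff = WIDTH * HEIGHT * MAX_PIXEL
    percentages = [d / max_diff * 100.0 for d in channel_diffs]
    return sum(percentages) / len(percentages)          # average over the channels

def similarity_percentage(channel_diffs):
    return 100.0 - diff_to_percentage(channel_diffs)    # similarity = 100% - DIFF%
```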
In view of calculation errors, the first similarity threshold may be set to 98%, and if the similarity of two frames of images is greater than 98%, the still picture parameter may be increased by a second value until the still picture parameter is greater than or equal to the parameter threshold, and it is determined that the image currently input to the projection apparatus displays a still picture.
In another possible embodiment, if the similarity between the current frame image and the previous frame image is represented by a numerical value, the similarity may be obtained by subtracting the DIFF value from the maximum value of the similarity. For example, if the first similarity threshold is 0.8, the obtained DIFF value is 0.1 and similarity is expressed on a 0-1 scale, then 1 minus 0.1 gives a similarity of 0.9; compared with the first similarity threshold of 0.8, the similarity between the current frame image and the previous frame image is larger than the first similarity threshold, so the still picture parameter can be increased by the second value, and once the still picture parameter is greater than or equal to the parameter threshold it is determined that the image currently input to the projection device displays a still picture.
In another possible implementation manner, the difference value and the difference value threshold value between the current frame image and the previous frame image may be directly used to determine the similarity between the current frame image and the previous frame image, if the difference value between the current frame image and the previous frame image is smaller than the difference value threshold value, the still picture parameter is increased by a second value until the still picture parameter is greater than or equal to the parameter threshold value, and it is determined that the image currently input to the projection device displays the still picture; if the difference value is greater than or equal to the difference value threshold, the still picture parameter is configured to be a first value, and the difference value between the rest frame images is continuously judged. For example, the difference value threshold is set to 0.1, and the difference value of the current frame image and the previous frame image, which is labeled DIFF (difference), can be calculated by an SOC (System on Chip). If the difference value of the two frames of images is 0.05, which means that the difference value of the two frames of images is smaller than the difference value threshold, the still picture parameter can be increased by 1 until the still picture parameter is larger than or equal to the parameter threshold, the image currently input to the projection device is determined to display the still picture, if the difference value of the two frames of images is 0.6, which means that the difference value of the two frames of images is larger than the difference value threshold, the still picture parameter can be changed into 0, and then the relationship between the difference value between the current frame of images and the next frame of images and the difference value threshold can be continuously judged.
The manner of calculating the difference value of the current frame image and the previous frame image input to the projection device and the manner of calculating the similarity between the current frame image and the previous frame image are not limited to the manner provided in the embodiments of the present application, and any manner that is used to evaluate the index of the difference between the image frames or that can calculate the similarity between the two frame images is applicable to the embodiments of the present application.
In this way, the parameter threshold ensures as far as possible that the image currently input to the projection device displays a still picture, reduces how often the focusing operation has to be abandoned, and makes the focusing process more stable. At the same time, resetting the parameter avoids the situation where the picture changes midway yet the still picture parameter keeps its accumulated value, so a still frame suitable for automatic focusing can still be found in time and automatic focusing completed promptly.
In another possible implementation of determining, based on the acquired multi-frame images, whether the image currently input to the projection device displays a still picture, the embodiment of the application may pair the multi-frame images pairwise to obtain a plurality of image pairs and then determine the image similarity of each image pair. If the image similarity of every image pair is greater than or equal to a second similarity threshold, it is determined that the image currently input to the projection device displays a still picture; if the image similarity of any image pair is smaller than the second similarity threshold, it is determined that the image currently input to the projection device displays a dynamic picture, multi-frame images are re-acquired, and the step of pairing them pairwise into image pairs is executed again. In this way a loop that searches for multiple frames showing the same picture is realized. In implementation, the loop may run until a still picture is determined, or it may stop if no still picture has been found after a specified number of frames have been processed, or after the loop has run for a specified duration.
For example, 10 frames of images may be acquired and paired two by two to obtain a plurality of image pairs. If the image similarity of any one of these image pairs is smaller than the second similarity threshold, the focusing scheme of the present application is not executed; instead, 10 frames of images are acquired again to judge whether the image currently input to the projection device displays a still picture. Only when the image similarity of every image pair is greater than or equal to the second similarity threshold is it determined that the image currently input to the projection device displays a still picture, and the following operation of determining definition is performed.
Therefore, the multi-frame image can be used as a group of image frames, and the similarity judgment can be carried out on the group of image frames at the same time, so that whether the image currently input to the projection equipment displays a still picture or not can be strictly judged, and the still frame image capable of automatically focusing can be determined, so that the automatic focusing can be conveniently completed in time.
In one possible implementation manner, after determining that the image currently input to the projection device is a still picture, the lens position with the highest definition may be determined based on the definition of the projection images of the plurality of projection lens positions in the embodiment of the present application, which may be implemented as steps shown in fig. 3:
In step 301, the sharpness of the projected images of the plurality of lens positions is determined, respectively.
In one possible implementation, comparing the sharpness of the projection images at two lens positions first requires determining, for each lens position, the sharpness of the projection image at that position, which in this example can be calculated by the steps shown in Fig. 4:
in step 401, a pixel value of each pixel point in the projection image of the current lens position is acquired.
In step 402, pixel differences between adjacent pixel rows are calculated and the pixel differences of each adjacent pixel row are added to obtain a pixel difference in the pixel row direction.
In step 403, pixel differences between adjacent pixel columns are calculated, and the pixel differences of the adjacent pixel columns are added to obtain pixel differences in the pixel column direction.
In step 404, the pixel differences in the pixel row direction and the pixel differences in the pixel column direction are added to obtain the sharpness of the projected image at the current lens position.
For example, if the pixel points in the projection image at the current lens position are as shown in Fig. 5, the sharpness of the projection image at the current lens position can be calculated by first obtaining the pixel value of each pixel point (for example, the pixel value of the pixel point at position A1 is denoted A1, and similarly for the remaining positions). The pixel difference between the first and second rows in the pixel row direction is then (A1-B1)+(A2-B2)+(A3-B3)+(A4-B4), the pixel difference between the second and third rows is (B1-C1)+(B2-C2)+(B3-C3)+(B4-C4), and the pixel difference between the third and fourth rows is (C1-D1)+(C2-D2)+(C3-D3)+(C4-D4); adding these values gives the pixel difference DH of the projection image at the current lens position in the pixel row direction. The pixel difference DV in the pixel column direction of the projection image at the current lens position can be calculated in the same way. Finally, DH and DV are added to obtain the definition of the projection image at the current lens position.
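A minimal NumPy sketch of this row/column difference metric follows. The image is assumed to be captured as a grayscale array, and absolute differences are taken on the assumption that the differences above are meant as magnitudes (a purely signed sum would largely cancel out); the function and variable names are illustrative.

```python
import numpy as np

def sharpness_row_col(image: np.ndarray) -> float:
    """Sum of absolute differences between adjacent pixel rows (DH) plus the sum of
    absolute differences between adjacent pixel columns (DV)."""
    img = image.astype(np.float64)
    dh = np.abs(img[:-1, :] - img[1:, :]).sum()   # pixel difference in the pixel row direction
    dv = np.abs(img[:, :-1] - img[:, 1:]).sum()   # pixel difference in the pixel column direction
    return float(dh + dv)
```

A sharper projection image has stronger local contrast, so this sum grows as the lens approaches focus, which is what the comparisons in the following steps rely on.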
In another possible implementation, the definition of the projection image at each lens position is determined by the steps shown in Fig. 6:
in step 601, a pixel value of each pixel point in the projection image of the current lens position is acquired.
In step 602, a pixel difference between each pixel point and a pixel value of each adjacent point in the preset adjacent area is calculated as an adjacent point pixel difference.
In step 603, the adjacent-point pixel differences between the pixel point and all of its adjacent points are summed to obtain the neighborhood pixel difference of the pixel point.
In step 604, the sum of the neighborhood pixel differences of each pixel in the projected image is determined, resulting in the sharpness of the projected image at the current lens position.
For example, if the pixel points in the projection image at the current lens position are as shown in Fig. 5, the pixel value of each pixel point is first obtained (for example, the pixel value of the pixel point at position A1 is denoted A1, and similarly for the remaining positions). The preset neighborhood is set to 3×3, and the neighborhood pixel difference of each pixel point is calculated with that pixel point as the center. Taking the pixel point at position A1 as the center, its neighborhood pixel difference is (A1-B1)+(A1-A2)+(A1-B2); taking the pixel point at position B2 as the center, its neighborhood pixel difference is (B2-A1)+(B2-A2)+(B2-A3)+(B2-B1)+(B2-B3)+(B2-C1)+(B2-C2)+(B2-C3). The neighborhood pixel differences of all pixel points in the projection image at the current lens position are calculated in the same way and added together to obtain the definition of the projection image at the current lens position.
For pixel points near the image border, the preset neighborhood is truncated: for example, in the preset neighborhood centered on the pixel point at position A1, only three positions contain pixel points, so only the pixel differences between those three pixel points and the pixel point at position A1 are calculated, and the pixel differences for the remaining positions without pixel points default to 0.
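A sketch of this neighborhood variant is shown below, under the same assumptions as the previous sketch (grayscale array, absolute differences, illustrative names); the default 3×3 neighborhood corresponds to radius 1, and border positions without a neighboring pixel contribute 0 exactly as described above.

```python
import numpy as np

def sharpness_neighborhood(image: np.ndarray, radius: int = 1) -> float:
    """Sum, over all pixel points, of the absolute differences to every existing
    neighbor inside a (2*radius+1) x (2*radius+1) window centered on the pixel."""
    img = image.astype(np.float64)
    h, w = img.shape
    total = 0.0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dy == 0 and dx == 0:
                continue
            # region of centre pixels whose neighbor at offset (dy, dx) exists
            y0, y1 = max(0, -dy), h - max(0, dy)
            x0, x1 = max(0, -dx), w - max(0, dx)
            centre = img[y0:y1, x0:x1]
            neighbour = img[y0 + dy:y1 + dy, x0 + dx:x1 + dx]
            total += np.abs(centre - neighbour).sum()
    return float(total)
```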
The method for calculating the definition of the projection image at the current lens position is not limited to the two methods of Fig. 4 and Fig. 6; any method that determines the definition of the projection image at the current lens position from the pixel values of the pixel points in the projection image is applicable to the embodiments of the present application. In this way, the definition of the projection image at each lens position can be determined.
In step 302, a position direction in which the sharpness of the projected image becomes large is determined based on the sharpness of the projected images of the plurality of lens positions.
For example, in Fig. 7, assume that the current lens position is 0, +1 indicates that the lens is moved forward by one position, and -1 indicates that the lens is moved backward by one position. Projection images at these three positions are captured and their definitions C0, C+1 and C-1 are calculated, and the largest of the three is found by comparison. If the maximum definition is C+1, moving the lens position forward yields higher definition; if the maximum definition is C-1, moving the lens position backward yields higher definition.
In step 303, the lens position is continuously adjusted to the next lens position along the position direction in which the definition of the projection image becomes larger, and the definition of the projection image at the next lens position is calculated until the lens position of the projection image with the maximum definition in the position direction is determined.
Illustratively, assume in Fig. 7 that the maximum definition is C+1, indicating that moving the lens position forward yields higher definition. The lens is then adjusted to the +2 position and the definition C+2 is calculated. If C+2 is larger than C+1, the lens is adjusted to the +3 position and C+3 is calculated; the lens position is moved forward in this way and the definitions are compared until the lens position with the maximum projection-image definition is determined. For example, if C+2 is larger than C+1 and C+3 is smaller than or equal to C+2, the projection image at the +2 position is determined to have the maximum definition.
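The hill-climbing search of steps 302 and 303 might look like the sketch below; move_lens_to and capture_projection are hypothetical hardware hooks, and sharpness can be either metric sketched above. The caller then moves the lens to the returned position, as in the following step.

```python
from typing import Callable
import numpy as np

def find_sharpest_position(move_lens_to: Callable[[int], None],
                           capture_projection: Callable[[], np.ndarray],
                           sharpness: Callable[[np.ndarray], float],
                           start: int = 0, max_steps: int = 50) -> int:
    """Compare sharpness at start-1, start and start+1, then keep stepping in the
    improving direction until sharpness stops increasing."""
    def measure(pos: int) -> float:
        move_lens_to(pos)
        return sharpness(capture_projection())

    c0, c_plus, c_minus = measure(start), measure(start + 1), measure(start - 1)
    if c0 >= c_plus and c0 >= c_minus:
        return start                        # current position is already the local maximum
    direction = 1 if c_plus >= c_minus else -1
    best_pos = start + direction
    best = c_plus if direction == 1 else c_minus
    for _ in range(max_steps):
        candidate = best_pos + direction
        value = measure(candidate)
        if value <= best:                   # e.g. C+3 <= C+2: the +2 position wins
            break
        best_pos, best = candidate, value
    return best_pos
```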
In step 203, the position of the projection lens is adjusted to the lens position with the highest definition.
In one possible implementation, once the projection image with the greatest sharpness is determined in step 202, the projection lens of the projection device is adjusted to the lens position of that projection image. For example, in Fig. 7, if the definition of the projection image at the +2 position is the maximum, the position of the projection lens is adjusted to the +2 position. One round of focusing of the projection device is thereby completed: focusing starts, continues until the position with the highest definition is determined, and the projection lens is adjusted to that position, at which the projected image has the highest definition.
In one possible implementation, during the search for the lens position with the highest definition, whether the image input to the projection device is a still picture or a dynamic picture may be continuously determined by the method of step 202, and the projection device continuously monitors the result of this determination. If the image currently input to the projection device is a dynamic picture, the position with the highest definition obtained by the previous focusing operation is retrieved, and the position of the projection lens is adjusted to that position.
For example, when the image input to the projection device is a still picture, the lens position of the projection device is adjusted by the focusing operation to the position P0 with the highest definition. If the image input to the projection device in the current frame is a dynamic picture, the focusing method of the present application is not executed and the lens position of the projection device is kept at P0; if the image input in the current frame is a still picture, the focusing method of the present application continues to be executed to adjust the position of the projection lens.
Fig. 8 is a schematic flowchart of another focusing method of a projection device according to an embodiment of the present application. In this flowchart, the operation of the projection device may be divided into two processes: process one corresponds to step 803, and process two comprises steps 804 and 805. After process one is executed, whether the image currently input to the projection device is a still picture is continuously monitored and the result is output until the projection device is turned off, and whether process two is executed is determined according to the output of process one. As shown in Fig. 8, the focusing operation of the projection device may be implemented as the following steps:
In step 801, the projection device is powered on and selects an appropriate method for focusing.
In step 802, it is determined that the usage time exceeds a preset time threshold or that the lens temperature exceeds a preset temperature threshold.
In step 803, whether the image currently input to the projection device is a still picture is continuously determined. If it is a still picture, the lens position with the maximum projection-image definition is determined in step 804 based on the definitions of the projection images at a plurality of lens positions, and the position of the projection lens is adjusted to that position in step 805. If it is not a still picture and steps 804 and 805 are being executed, they are exited in step 806; if steps 804 and 805 are not being executed, step 806 does not need to be executed.
In step 807, the projection device is powered off. The focusing method provided by the embodiments of the present application stops when the projection device is turned off; from the moment the usage time exceeds the preset time threshold or the lens temperature exceeds the preset temperature threshold until the projection device is turned off, focusing operations are performed using this focusing method.
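Putting the two processes together, the flow of Fig. 8 could be sketched as the loop below, reusing the hypothetical detector and search helpers from the earlier sketches. The device methods, thresholds, and polling interval are all assumptions for illustration, not the patent's reference implementation.

```python
import time

def focusing_loop(device, detector, move_lens_to, capture_projection, sharpness,
                  time_threshold_s: float = 3600.0, temp_threshold_c: float = 60.0):
    """Process one: keep checking whether the input is a still picture.
    Process two: when it is, search for the sharpest lens position and move there;
    otherwise keep (or restore) the best position from the previous focusing round."""
    last_best = device.lens_position()
    while device.is_powered_on():                     # runs until step 807 (power off)
        if (device.usage_time_s() < time_threshold_s and
                device.lens_temperature_c() < temp_threshold_c):
            time.sleep(1.0)                           # trigger of step 802 not met yet
            continue
        frame = device.next_input_frame()
        if detector.update(frame):                    # still picture: steps 804 and 805
            last_best = find_sharpest_position(move_lens_to, capture_projection,
                                               sharpness, start=device.lens_position())
            move_lens_to(last_best)
        else:                                         # dynamic picture: step 806
            move_lens_to(last_best)
        time.sleep(1.0)
```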
Therefore, when the image currently input to the projection device is a still picture, the projection lens is adjusted to the lens position with the maximum definition by comparing the definitions of the projection images at a plurality of lens positions. The focusing operation of the projection device is thus completed, the position of the projection lens is adjusted more accurately according to the definition, and the projection focusing effect of the projection device is improved.
Based on the foregoing description, in the embodiments of the present application, when the usage time of the projection device exceeds a preset time threshold or the lens temperature exceeds a preset temperature threshold, multiple frames of images input to the projection device are acquired; if it is determined, based on the acquired frames, that the image currently input to the projection device displays a still picture, the lens position with the highest definition is determined based on the definitions of the projection images at a plurality of projection lens positions; and the position of the projection lens is adjusted to the lens position with the highest definition. The present application thus provides a focusing method that is imperceptible to the user. When focusing is triggered by temperature, the lens temperature is sensed, and once it reaches a certain temperature, the definitions of the projection images of the still picture input to the projection device are compared across a plurality of projection lens positions, and the projection lens is adjusted to the position with the highest definition; when focusing is triggered periodically by the preset time, the problem of projection blurring caused by temperature rise can likewise be solved in time. Moreover, the images used in the present application are the images input to the projection device, so whether the input displays a still picture is judged directly from the input source of the projection image. This fundamentally ensures that the multiple frames whose definition is evaluated correspond to the same image, so that the position of the projection lens is adjusted more accurately according to the definition, thereby improving the projection focusing effect of the projection device.
In an exemplary embodiment, the present application also provides a computer-readable storage medium including instructions, such as the program storage component 160 including instructions, executable by the main control chip 170 (i.e., a processor) of the projection device 100 to perform the above-described focusing method of the projection device. Alternatively, the storage medium may be a non-transitory computer-readable storage medium, which may be, for example, a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
In an exemplary embodiment, a computer program product is also provided, comprising a computer program which, when executed by a processor, implements a projection device focusing method as provided herein.
The foregoing detailed description of the embodiments is merely illustrative of the general principles of the present application and should not be taken in any way as limiting the scope of the invention. Any other embodiments developed in accordance with the present application without inventive effort are within the scope of the present application for those skilled in the art.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present application without departing from the spirit or scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims and the equivalents thereof, the present application is intended to cover such modifications and variations.

Claims (8)

1. A method of focusing a projection device, the method comprising:
when the using time of the projection equipment exceeds a preset time threshold or the lens temperature exceeds a preset temperature threshold, acquiring a multi-frame image input to the projection equipment;
if the image currently input to the projection equipment is determined to display a still picture based on the acquired multi-frame images, determining a lens position with the highest definition based on the definition of the projection images of a plurality of projection lens positions; wherein determining, based on the acquired multi-frame images, whether the image currently input to the projection device displays a still picture specifically includes: acquiring and storing a first frame image of the multi-frame images, and respectively performing the following for each subsequently acquired frame image: comparing the acquired current frame image with a previous frame image, and determining the similarity between the current frame image and the previous frame image; determining whether the image currently input to the projection device displays a still picture based on the similarity between the current frame image and the previous frame image; wherein the determining whether the image currently input to the projection device displays a still picture based on the similarity between the current frame image and the previous frame image specifically includes: if the similarity is smaller than a first similarity threshold, configuring a still picture parameter to a first value; if the similarity is greater than or equal to the first similarity threshold, increasing the still picture parameter by a second value; if the parameter value of the still picture parameter is greater than or equal to a parameter threshold, determining that the image currently input to the projection device displays a still picture; if the parameter value of the still picture parameter is smaller than the parameter threshold, determining that the image currently input to the projection device displays a dynamic picture, taking the current frame image as the first frame image, and returning to execute the step of comparing the acquired current frame image with the previous frame image to determine the similarity between the current frame image and the previous frame image; or, pairing the multi-frame images pairwise to obtain a plurality of image pairs; determining the image similarity of each image pair; and determining whether the image currently input to the projection device displays a still picture based on the image similarity of each image pair; wherein the determining whether the image currently input to the projection device displays a still picture based on the image similarity of each image pair specifically includes: if the image similarity of every image pair is greater than or equal to a second similarity threshold, determining that the image currently input to the projection device displays a still picture; if the image similarity of any image pair is smaller than the second similarity threshold, determining that the image currently input to the projection device displays a dynamic picture, re-acquiring multi-frame images, and returning to execute the step of pairing the multi-frame images pairwise to obtain a plurality of image pairs; wherein the image similarity between two frames of images can be determined by: calculating, by a system-on-chip (SOC), the sum of the difference values of corresponding pixel points in the two frames of images to obtain a difference value; and representing the image similarity by a hundred-percent value, and taking the value obtained by subtracting the difference value from the hundred-percent value as the image similarity; or representing the image similarity by a numerical value, and taking the value obtained by subtracting the difference value from a maximum numerical value as the image similarity; or determining the image similarity by using the difference value and a difference value threshold;
And adjusting the position of the projection lens to the lens position with the highest definition.
2. The method according to claim 1, wherein determining the lens position with the highest definition based on the definition of the projection images of the plurality of projection lens positions, specifically comprises:
respectively determining the definition of projection images of a plurality of lens positions;
determining a position direction in which the definition of the projected image becomes large based on the definition of the projected images of the plurality of lens positions;
and continuously adjusting the lens position to the next lens position along the position direction, and calculating the definition of the projection image of the next lens position until the lens position of the projection image with the maximum definition in the position direction is determined.
3. The method according to any of claims 1-2, wherein determining for each lens position the sharpness of the projected image of the lens position, in particular comprises:
acquiring a pixel value of each pixel point in the projection image of the lens position;
calculating pixel differences between adjacent pixel rows, and adding the pixel differences of the adjacent pixel rows to obtain pixel differences in the direction of the pixel rows; the method comprises the steps of,
Calculating pixel differences between adjacent pixel columns, and adding the pixel differences of each adjacent pixel column to obtain the pixel differences in the direction of the pixel column;
and adding the pixel difference in the pixel row direction and the pixel difference in the pixel column direction to obtain the definition of the projection image.
4. The method according to any one of claims 1-2, wherein said determining for each lens position the sharpness of the projected image of said lens position, in particular comprises:
acquiring a pixel value of each pixel point in the projection image of the lens position;
calculating pixel differences between each pixel point and pixel values of adjacent points in a preset adjacent area respectively to serve as adjacent point pixel differences;
determining the sum of adjacent pixel differences of the pixel point and each adjacent point to obtain a neighborhood pixel difference of the pixel point;
and determining the sum of neighborhood pixel differences of each pixel point in the projection image to obtain the definition of the projection image.
5. The method of any of claims 1-2, wherein the method further comprises, after the projection device usage time exceeds a preset time threshold and/or after a lens temperature exceeds a preset temperature threshold:
And returning to the step of acquiring the multi-frame images input to the projection equipment every time the time interval is designated.
6. The method according to any one of claims 1-2, wherein the method further comprises:
continuously monitoring whether the image currently input to the projection device is a still picture or not;
and if the image currently input to the projection equipment is changed from a still picture to a dynamic picture in the process of determining the lens position with the highest definition based on the definition of the projection images of the plurality of projection lens positions, acquiring the lens position with the highest definition obtained by the last focusing operation, and adjusting the position of the projection lens to the lens position with the highest definition obtained by the last focusing operation.
7. A projection device, the projection device comprising:
a computer readable storage medium, a processor, a projection lens, wherein:
the projection lens is used for outputting an image;
the computer-readable storage medium is for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement a focusing method of the projection device of any one of claims 1-6.
8. A computer-readable storage medium, comprising:
the instructions in the computer-readable storage medium, when executed by the processor of the projection device of claim 7, enable the projection device to perform the method of focusing of a projection device of any one of claims 1-6.