CN110769238B - Projection environment brightness detection method and device, electronic equipment and medium - Google Patents

Projection environment brightness detection method and device, electronic equipment and medium

Info

Publication number
CN110769238B
CN110769238B (Application CN201911157817.9A)
Authority
CN
China
Prior art keywords
brightness
original image
image
projection
projected
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911157817.9A
Other languages
Chinese (zh)
Other versions
CN110769238A (en)
Inventor
钟波
肖适
王鑫
宁仲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Jimi Technology Co Ltd
Original Assignee
Chengdu Jimi Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Jimi Technology Co Ltd filed Critical Chengdu Jimi Technology Co Ltd
Priority to CN201911157817.9A priority Critical patent/CN110769238B/en
Publication of CN110769238A publication Critical patent/CN110769238A/en
Application granted granted Critical
Publication of CN110769238B publication Critical patent/CN110769238B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/12 Picture reproducers
    • H04N 9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3191 Testing thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/12 Picture reproducers
    • H04N 9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3179 Video signal processing therefor
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/12 Picture reproducers
    • H04N 9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3179 Video signal processing therefor
    • H04N 9/3185 Geometric adjustment, e.g. keystone or convergence

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Geometry (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Projection Apparatus (AREA)
  • Transforming Electric Information Into Light Information (AREA)

Abstract

The application provides a projection environment brightness detection method, which comprises the following steps: acquiring an original image projected by a projector and a projected image, captured by a camera, of the original image projected on a curtain; extracting at least four groups of corresponding feature points from the original image and the projected image respectively; obtaining an evaluation value from the pixel brightness corresponding to the four groups of feature points by using a preset algorithm; and determining the current environment brightness value from the pre-established correspondence between evaluation value and brightness according to the evaluation value and the current distance, wherein the current distance is the distance between the projector and the curtain. The brightness value of the current environment can be obtained accurately and efficiently, and the pre-established correspondence can be applied to projection devices of the same type. This solves the problem in the related art that additional hardware must be added and that the ambient brightness collected by a sensor is inaccurate, so the method has strong practicability. The application also provides a projection environment brightness detection device, an electronic device and a computer-readable storage medium, which have the same beneficial effects.

Description

Projection environment brightness detection method and device, electronic equipment and medium
Technical Field
The present disclosure relates to the field of brightness detection technologies, and in particular, to a method for detecting brightness of a projection environment, a device for detecting brightness of a projection environment, an electronic device, and a computer-readable storage medium.
Background
At present, in the domestic projection field, ambient light intensity is measured with a sensor mounted on the projection device, typically at its front end or rear end. However, because the brightness of the projection device itself and the mounting position of the sensor strongly affect the light-intensity measurement, the resulting projection environment brightness result is very inaccurate.
Therefore, how to provide a solution to the above technical problem is a problem that needs to be solved by those skilled in the art.
Disclosure of Invention
The application aims to provide a projection environment brightness detection method, a projection environment brightness detection device, an electronic device and a computer readable storage medium, which can greatly improve the accuracy of a projection environment brightness detection result. The specific scheme is as follows:
the application provides a projection environment brightness detection method, which comprises the following steps:
acquiring an original image projected by a projector and a projected image of the original image projected on a curtain and shot by a camera;
extracting at least four groups of corresponding feature points from the original image and the projected image respectively;
obtaining evaluation values of the pixel brightness corresponding to the four groups of feature points by using a preset algorithm;
and determining the current environment brightness value from the pre-established corresponding relation between the evaluation value and the brightness according to the evaluation value and the current distance, wherein the current distance is the distance between the projector and the curtain at present.
Optionally, before the obtaining of the original image projected by the projector and the projection image of the original image projected on the curtain shot by the camera, the method further includes:
under different ambient brightness and different reference distances, acquiring a reference original image and a reference projection image of the reference original image shot by the camera projected on the curtain;
extracting at least four groups of corresponding reference feature points from the reference original image and the reference projection image respectively;
obtaining reference evaluation values by using the four groups of corresponding reference characteristic points through the preset algorithm;
establishing a correspondence relationship of the evaluation value-brightness based on the reference evaluation value, the reference distance, and the ambient brightness;
wherein the ambient brightness is brightness collected at a target location using a sensor;
the reference distance is a distance between the projector and the curtain.
Optionally, before the obtaining of the original image projected by the projector and the projection image of the original image projected on the curtain shot by the camera, the method further includes:
when the projector is at a target distance from the curtain, acquiring a target original image and a target projection image of the target original image shot by the camera projected on the curtain;
extracting at least four groups of corresponding target feature points from the target original image and the target projection image respectively;
determining a projection-camera basic mapping relation by using the positions of the four groups of target feature points;
correspondingly, after at least four groups of corresponding feature points are extracted from the original image and the projected image respectively, the method further includes:
determining the actual mapping relation of the projection and the camera by using the positions of the four groups of feature points;
judging whether the difference value of the actual projection-camera mapping relation and the basic projection-camera mapping relation is within the maximum difference value range or not;
and if the difference value is within the maximum difference range, executing the step of obtaining an evaluation value by using a preset algorithm for the pixel brightness corresponding to the four groups of feature points.
Optionally, the extracting at least four groups of corresponding feature points from the original image and the projected image respectively includes:
adjusting the brightness of the original image and the brightness of the projected image so that the brightness of the original image is the same as that of the projected image;
and extracting at least four groups of corresponding feature points from the original image and the projected image after the brightness is adjusted respectively.
Optionally, the extracting at least four groups of corresponding feature points from the original image and the projected image after the brightness adjustment respectively includes:
utilizing an SURF algorithm to position a plurality of groups of original characteristic points from the original image and the projected image after the brightness is adjusted respectively;
and performing feature vector matching on all original feature points by using a RANSAC algorithm to obtain at least four groups of corresponding feature points.
Optionally, before the obtaining of the original image projected by the projector and the projection image of the original image projected on the curtain shot by the camera, the method further includes:
trapezoidal correction is performed.
Optionally, the obtaining of the evaluation value by using a preset algorithm for the pixel brightness corresponding to the four groups of feature points includes:
calculating the pixel brightness corresponding to the four groups of feature points by using Ci = a/(1 - e^(alpha × (Pi - b))) + k to obtain a first parameter a, a second parameter b, a third parameter alpha and a fourth parameter k;
determining an evaluation value by using L = (a/b) × alpha + k based on the first parameter, the second parameter, the third parameter and the fourth parameter;
wherein Pi represents pixel brightness corresponding to the feature point of the projected image, Ci represents pixel brightness corresponding to the feature point of the original image, and L is an evaluation value.
The application provides a projection environment luminance detection device, includes:
the device comprises an image acquisition module, a screen and a control module, wherein the image acquisition module is used for acquiring an original image projected by a projector and a projected image of the original image projected on the screen, which is shot by a camera;
the characteristic point extraction module is used for extracting at least four groups of corresponding characteristic points from the original image and the projected image respectively;
the evaluation value calculation module is used for obtaining evaluation values of the pixel brightness corresponding to the four groups of feature points by utilizing a preset algorithm;
and the current environment brightness value determining module is used for determining the current environment brightness value from the corresponding relation between the evaluation value and the brightness which is established in advance according to the evaluation value and the current distance, wherein the current distance is the distance between the projector and the curtain at present.
The application provides an electronic device, including:
a memory for storing a computer program;
and the processor is used for realizing the steps of the projection environment brightness detection method when executing the computer program.
The present application provides a computer-readable storage medium having stored thereon a computer program which, when being executed by a processor, implements the steps of the projection environment brightness detection method as described above.
The application provides a projection environment brightness detection method, which comprises the following steps: acquiring an original image projected by a projector and a projected image, captured by a camera, of the original image projected on a curtain; extracting at least four groups of corresponding feature points from the original image and the projected image respectively; obtaining an evaluation value from the pixel brightness corresponding to the four groups of feature points by using a preset algorithm; and determining the current environment brightness value from the pre-established evaluation value-brightness correspondence according to the evaluation value and the current distance, wherein the current distance is the distance between the projector and the curtain.
The method selects four groups of feature points from the feature points extracted from the original image and the projected image, calculates an evaluation value from the pixel brightness of these feature points by using a preset algorithm, and determines the brightness value of the current environment from the pre-established evaluation value-brightness correspondence based on the evaluation value and the current distance. The method can obtain the brightness value of the current environment quickly and accurately, and the pre-established evaluation value-brightness correspondence can be applied to projection devices of the same type. This avoids the need in the related art to add a sensor to collect the ambient brightness, as well as the inaccuracy caused by the restrictions on the sensor mounting position. The projection environment detection method provided by the application therefore has strong practicability, reduces the production cost of the device, and improves both the efficiency and the accuracy of ambient brightness detection.
The application also provides a projection environment brightness detection device, electronic equipment and a computer readable storage medium, which all have the beneficial effects and are not repeated herein.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, it is obvious that the drawings in the following description are only embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the provided drawings without creative efforts.
Fig. 1 is a flowchart of a method for detecting a brightness of a projection environment according to an embodiment of the present disclosure;
fig. 2 is a flowchart of a method for establishing a correspondence between an evaluation value and brightness according to an embodiment of the present application;
FIG. 3 is a flowchart of another method for detecting the brightness of the projection environment according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of a projection environment brightness detection apparatus according to an embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
At present, in the domestic projection field, ambient light intensity is measured with a sensor mounted on the projection device, typically at its front end or rear end. However, because the brightness of the projection device itself and the mounting position of the sensor strongly affect the light-intensity measurement, the resulting projection environment brightness result is very inaccurate. In view of this technical problem, the present embodiment provides a projection environment brightness detection method. Four groups of feature points are selected from the feature points extracted from an original image and a projected image, an evaluation value is calculated from the pixel brightness of these feature points by using a preset algorithm, and the brightness value of the current environment is determined from a pre-established evaluation value-brightness correspondence based on the evaluation value and the current distance. The method can obtain the brightness value of the current environment quickly and accurately, and the pre-established evaluation value-brightness correspondence can be applied to projection devices of the same type, thereby avoiding the need in the related art to add a sensor to collect the ambient brightness and the inaccuracy caused by the restrictions on the sensor mounting position. The projection environment detection method provided by the application therefore has strong practicability, reduces the production cost of the device, and improves both the efficiency and the accuracy of ambient brightness detection. Referring to fig. 1, fig. 1 is a flowchart of a method for detecting the brightness of a projection environment according to an embodiment of the present application, which specifically includes:
S101, acquiring an original image projected by a projector and a projected image, captured by a camera, of the original image projected on a curtain.
The original image projected by the projector is the image, stored in the system, that is to be projected on the curtain. This embodiment does not limit the original image: it may be an image chosen by the user in actual use, or a dedicated feature image, as long as the purpose of this embodiment can be achieved. It is understood that the portion of the projected image corresponding to the original image is substantially the same as the original image, so as to ensure that the feature points can be determined accurately.
Before acquiring the original image projected by the projector and the projected image, captured by the camera, of the original image projected on the curtain, the method further includes: performing trapezoidal (keystone) correction.
In this embodiment, the manner of trapezoidal correction is not limited; it may be a conventional trapezoidal correction or an improved one, and the user may choose it as desired. The purpose of this step is to obtain the corresponding feature points more accurately when step S102 is executed.
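As an illustrative, non-limiting sketch of one conventional way to realize such a trapezoidal (keystone) correction, the captured frame can be warped so that the detected projection quadrilateral maps onto an upright rectangle. The corner coordinates and output size below are placeholders, not values taken from this application.

```python
import cv2
import numpy as np

# Placeholder corner coordinates of the projection area as detected in the
# captured frame, and the upright rectangle they should map to.
detected_corners = np.float32([[112, 80], [1180, 95], [1205, 700], [90, 690]])
target_corners = np.float32([[0, 0], [1280, 0], [1280, 720], [0, 720]])

def correct_keystone(frame):
    """Warp the captured frame so the projection quad becomes a rectangle."""
    H = cv2.getPerspectiveTransform(detected_corners, target_corners)
    return cv2.warpPerspective(frame, H, (1280, 720))
```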
And S102, extracting at least four groups of corresponding feature points from the original image and the projected image respectively.
The purpose of this step is to obtain at least four groups of corresponding feature points. In this embodiment, feature points may be extracted from the original image and the projected image by an image algorithm and then matched; when a match succeeds, the successfully matched feature points are taken as one group of corresponding feature points. The information recorded for each feature point includes, but is not limited to, its pixel brightness and its position.
Wherein, S102 may specifically include: adjusting the brightness of the original image and the brightness of the projected image to make the brightness of the original image and the brightness of the projected image the same; and respectively extracting at least four groups of corresponding characteristic points from the original image and the projected image after the brightness is adjusted.
It can be understood that adjusting the brightness of the original image and the brightness of the projected image to the same level may be done in several ways: keep the brightness of the original image unchanged and adjust the projected image to the brightness of the original image; keep the brightness of the projected image unchanged and adjust the original image to the brightness of the projected image; or adjust both the projected image and the original image to a target brightness value. When the brightness is the same, the accuracy of feature point matching is improved, which in turn greatly improves the accuracy of obtaining at least four groups of corresponding feature points.
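A minimal sketch of one of these adjustment strategies (keeping the original image unchanged and scaling the projected image to the same mean brightness) is shown below; the function name and the choice of mean gray level as the brightness measure are assumptions for illustration only.

```python
import cv2
import numpy as np

def match_mean_brightness(original_bgr, projected_bgr):
    """Scale the projected image so its mean gray level matches the original's."""
    orig_gray = cv2.cvtColor(original_bgr, cv2.COLOR_BGR2GRAY)
    proj_gray = cv2.cvtColor(projected_bgr, cv2.COLOR_BGR2GRAY)
    gain = orig_gray.mean() / max(proj_gray.mean(), 1e-6)  # avoid division by zero
    adjusted = np.clip(projected_bgr.astype(np.float32) * gain, 0, 255).astype(np.uint8)
    return original_bgr, adjusted
```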
This embodiment further elaborates on extracting at least four groups of corresponding feature points from the brightness-adjusted original image and projected image, which specifically includes: locating multiple groups of original feature points in the brightness-adjusted original image and projected image respectively by using the SURF algorithm; and performing feature vector matching on all original feature points by using the RANSAC algorithm to obtain at least four groups of corresponding feature points.
Specifically, SURF is used to locate the feature points because the algorithm is robust to rotation and scale changes. Since the two images are inherently different, the number of feature points and the feature vectors computed for the original image and the projected image differ noticeably, so the RANSAC algorithm is used to match the feature vectors, which improves the accuracy of feature point matching.
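A minimal sketch of this SURF + RANSAC step using OpenCV is given below. It assumes an opencv-contrib build (SURF lives in cv2.xfeatures2d and is not available in every distribution), and the brute-force matcher, RANSAC threshold and function name are illustrative choices rather than requirements of this application.

```python
import cv2
import numpy as np

def match_feature_points(original_gray, projected_gray, min_pairs=4):
    """Locate SURF keypoints in both images and keep RANSAC-consistent pairs."""
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    kp1, des1 = surf.detectAndCompute(original_gray, None)
    kp2, des2 = surf.detectAndCompute(projected_gray, None)

    # Brute-force matching of the SURF descriptors (feature vectors).
    matcher = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True)
    matches = matcher.match(des1, des2)

    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # RANSAC rejects mismatched pairs; the surviving inliers are the
    # "corresponding feature points" used in the later steps.
    _, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    inliers = [m for m, keep in zip(matches, mask.ravel()) if keep]
    if len(inliers) < min_pairs:
        raise RuntimeError("fewer than four groups of corresponding feature points found")
    return [(kp1[m.queryIdx].pt, kp2[m.trainIdx].pt) for m in inliers]
```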
And S103, obtaining an evaluation value by using a preset algorithm for the pixel brightness corresponding to the four groups of feature points.
Wherein, S103 may specifically include: calculating the pixel brightness corresponding to the four groups of feature points by using Ci = a/(1 - e^(alpha × (Pi - b))) + k to obtain a first parameter a, a second parameter b, a third parameter alpha and a fourth parameter k; determining an evaluation value by using L = (a/b) × alpha + k based on the first parameter, the second parameter, the third parameter and the fourth parameter; where Pi represents the pixel brightness corresponding to a feature point of the projected image, Ci represents the pixel brightness corresponding to a feature point of the original image, and L is the evaluation value.
It can be understood that the pixel brightness corresponding to the four groups of feature points consists of: pixel brightness P1 of a feature point of the projected image and pixel brightness C1 of the corresponding feature point of the original image; pixel brightness P2 and pixel brightness C2; pixel brightness P3 and pixel brightness C3; and pixel brightness P4 and pixel brightness C4. Substituting the four groups of values into Ci = a/(1 - e^(alpha × (Pi - b))) + k yields C1 = a/(1 - e^(alpha × (P1 - b))) + k; C2 = a/(1 - e^(alpha × (P2 - b))) + k; C3 = a/(1 - e^(alpha × (P3 - b))) + k; C4 = a/(1 - e^(alpha × (P4 - b))) + k. Solving this system gives the first parameter a, the second parameter b, the third parameter alpha and the fourth parameter k. The evaluation value is then determined by L = (a/b) × alpha + k based on the first, second, third and fourth parameters.
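A minimal numerical sketch of this step is given below: the four (Pi, Ci) brightness pairs are substituted into Ci = a/(1 - e^(alpha × (Pi - b))) + k, the resulting system of four equations is solved for a, b, alpha and k, and L = (a/b) × alpha + k is returned. The use of scipy.optimize.fsolve and the initial guess are illustrative assumptions; any equivalent solver could be used.

```python
import numpy as np
from scipy.optimize import fsolve

def evaluation_value(P, C):
    """Solve Ci = a/(1 - exp(alpha*(Pi - b))) + k for a, b, alpha, k and return L."""
    P = np.asarray(P, dtype=float)  # pixel brightness at the projected-image feature points
    C = np.asarray(C, dtype=float)  # pixel brightness at the original-image feature points

    def residuals(params):
        a, b, alpha, k = params
        return a / (1.0 - np.exp(alpha * (P - b))) + k - C

    a, b, alpha, k = fsolve(residuals, x0=[1.0, 1.0, 0.01, 0.0])  # arbitrary initial guess
    return (a / b) * alpha + k

# Example with made-up brightness values:
# L = evaluation_value([120, 135, 150, 180], [110, 128, 145, 170])
```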
And S104, determining the current environment brightness value from the pre-established corresponding relation between the evaluation value and the brightness according to the evaluation value and the current distance, wherein the current distance is the distance between the current projector and the curtain.
It is understood that the evaluation value-brightness correspondence records, for each distance, the correspondence between evaluation values and ambient brightness values. The pre-established evaluation value-brightness correspondence may be, but is not limited to, a table. It is established under different ambient brightness levels and different reference distances: the ambient brightness is collected with a sensor at each reference distance, a reference evaluation value is obtained from the reference feature points extracted from the reference original image and the reference projection image, and the correspondence between evaluation value and brightness is then recorded. For example, at distance A, evaluation value A1 corresponds to ambient brightness a1; at distance A, evaluation value A2 corresponds to ambient brightness a2; at distance A, evaluation value A3 corresponds to ambient brightness a3; at distance B, evaluation value B1 corresponds to ambient brightness b1; at distance B, evaluation value B2 corresponds to ambient brightness b2; at distance B, evaluation value B3 corresponds to ambient brightness b3. When the current distance and the evaluation value are known, the current environment brightness value is determined by looking them up in this table.
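A minimal sketch of such a table lookup is shown below. The table entries are placeholders, and the choices of picking the nearest calibrated distance and linearly interpolating over the evaluation value are illustrative assumptions about how the correspondence could be queried.

```python
import numpy as np

# Placeholder calibration table: {reference distance: [(evaluation value, ambient brightness)]}.
EVAL_BRIGHTNESS_TABLE = {
    2.0: [(0.8, 30.0), (1.4, 120.0), (2.1, 300.0)],
    3.0: [(0.6, 30.0), (1.1, 120.0), (1.7, 300.0)],
}

def lookup_ambient_brightness(current_distance, evaluation_value):
    """Pick the nearest calibrated distance, then interpolate brightness over L."""
    distance = min(EVAL_BRIGHTNESS_TABLE, key=lambda d: abs(d - current_distance))
    entries = sorted(EVAL_BRIGHTNESS_TABLE[distance])
    evals = [e for e, _ in entries]
    brightness = [b for _, b in entries]
    return float(np.interp(evaluation_value, evals, brightness))
```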
Based on the above technical solution, this embodiment extracts feature points from the original image and the projected image, selects four groups of feature points, calculates an evaluation value from their pixel brightness by using a preset algorithm, and determines the brightness value of the current environment from the pre-established evaluation value-brightness correspondence based on the evaluation value and the current distance. The method can obtain the brightness value of the current environment quickly and accurately, and the pre-established evaluation value-brightness correspondence can be applied to projection devices of the same type, thereby avoiding the need in the related art to add a sensor to collect the ambient brightness and the inaccuracy caused by the restrictions on the sensor mounting position. The projection environment detection method provided by the application therefore has strong practicability, reduces the production cost of the device, and improves both the efficiency and the accuracy of ambient brightness detection.
Further, referring to fig. 2, a method for establishing the evaluation value-brightness correspondence used in the projection environment brightness detection method is described. Fig. 2 is a flowchart of a method for establishing the correspondence between an evaluation value and brightness provided in an embodiment of the present application, and includes:
S201, under different ambient brightness levels and different reference distances, acquiring a reference original image and a reference projection image, captured by a camera, of the reference original image projected on a curtain.
In this embodiment, different ambient brightness levels may be set in a gradient. A sensor is placed at a target position between the projector and the curtain; the target position may be an intermediate position, where the influence of the projector's own brightness is small enough to be almost negligible. It can be understood that the ambient brightness values detected by the sensor within a preset region around the target position differ by no more than a preset amount, which ensures the accuracy of the ambient brightness collected by the sensor at the target position.
Specifically, under the first ambient brightness and at the first reference distance, the reference original image and a first reference projection image, captured by the camera, of the reference original image projected on the curtain are acquired; under the first ambient brightness and at the second reference distance, the reference original image and a second reference projection image are acquired; under the first ambient brightness and at the third reference distance, the reference original image and a third reference projection image are acquired; under the second ambient brightness and at the first reference distance, the reference original image and a fourth reference projection image are acquired; under the second ambient brightness and at the second reference distance, the reference original image and a fifth reference projection image are acquired; and so on.
It is understood that the portion of the reference projection image corresponding to the reference original image is substantially the same as the reference original image, so as to ensure that the reference feature points can be determined accurately. Before acquiring the reference original image and the reference projection image, captured by the camera, of the reference original image projected on the curtain, the method further includes: performing trapezoidal correction. In this embodiment, the manner of trapezoidal correction is not limited; it may be a conventional trapezoidal correction or an improved one, and the user may choose it as desired.
S202, at least four groups of corresponding reference feature points are extracted from the reference original image and the reference projection image respectively.
The method specifically comprises the following steps: adjusting the brightness of the reference original image and the brightness of the reference projection image so that the brightness of the reference original image is the same as that of the reference projection image; and extracting at least four groups of corresponding reference characteristic points from the reference original image and the reference projection image after the brightness is adjusted respectively. It can be understood that adjusting the brightness of the reference original image and the brightness of the reference projection image to the same brightness may improve the accuracy of matching the reference feature points, and further greatly improve the accuracy of obtaining at least four sets of corresponding reference feature points.
And S203, obtaining a reference evaluation value by using the four groups of corresponding reference characteristic points through a preset algorithm.
Wherein, S203 may specifically include: calculating the pixel brightness corresponding to the four groups of reference feature points by using Ci = a/(1 - e^(alpha × (Pi - b))) + k to obtain a first parameter a, a second parameter b, a third parameter alpha and a fourth parameter k; determining the reference evaluation value by using L = (a/b) × alpha + k based on the first parameter, the second parameter, the third parameter and the fourth parameter; wherein Pi represents the pixel brightness corresponding to a reference feature point of the reference projection image, Ci represents the pixel brightness corresponding to a reference feature point of the reference original image, and L is the reference evaluation value.
S204, establishing a corresponding relation between the evaluation value and the brightness based on the reference evaluation value, the reference distance and the environment brightness.
Wherein the ambient brightness is brightness acquired at the target position by using a sensor; the reference distance is the distance between the projector and the curtain.
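Steps S201 to S204 can be summarized in a short calibration-loop sketch that reuses the match_feature_points and evaluation_value helpers sketched earlier. capture_projection() and read_sensor_brightness() are hypothetical stand-ins for the camera capture at a given distance and lighting level and for the reference sensor at the target position; they are not part of this application.

```python
def build_eval_brightness_table(reference_original_gray, brightness_settings, reference_distances):
    """Sketch of S201-S204: build the evaluation value-brightness correspondence."""
    table = {}
    for setting in brightness_settings:           # e.g. a gradient of room lighting levels
        for distance in reference_distances:      # projector-to-curtain reference distances
            projected_gray = capture_projection(distance, setting)      # hypothetical helper
            pairs = match_feature_points(reference_original_gray, projected_gray)[:4]
            # Sample the pixel brightness at each corresponding feature point.
            C = [float(reference_original_gray[int(y), int(x)]) for (x, y), _ in pairs]
            P = [float(projected_gray[int(y), int(x)]) for _, (x, y) in pairs]
            L = evaluation_value(P, C)
            ambient = read_sensor_brightness()    # hypothetical sensor at the target position
            table.setdefault(distance, []).append((L, ambient))
    return table
```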
Based on the foregoing embodiments, please refer to fig. 3, where fig. 3 is a flowchart of another projection environment brightness detection method provided in the embodiments of the present application, including:
S301, when the projector is at a target distance from the curtain, acquiring a target original image and a target projection image, captured by the camera, of the target original image projected on the curtain.
S302, at least four groups of corresponding target feature points are extracted from the target original image and the target projection image respectively.
And S303, determining a projection-camera basic mapping relation by using the positions of the four groups of target feature points.
Taking the actual projection use-distance range as a reference, a target distance D is selected that best balances the deviations at the farthest and nearest distances, and the projection-camera basic mapping relation H is established at this target distance D. In this process, the coordinates of the four corners of the target projection image captured by the camera are computed in advance and put in correspondence with the target original image, and the basic mapping relation is calculated as H = C × P', where C is a 3 × 4 matrix formed by the coordinates of the 4 corners of the target projection image in the camera, and P' is the 4 × 3 (pseudo-)inverse matrix of the coordinates of the 4 corners of the target original image.
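A minimal sketch of this matrix computation is shown below. It assumes homogeneous coordinates (so four corners give a 3 × 4 matrix) and uses the Moore-Penrose pseudo-inverse for P'; the function names are illustrative.

```python
import numpy as np

def homogeneous(points_xy):
    """Stack (x, y) points into a 3 x N matrix of homogeneous coordinates."""
    pts = np.asarray(points_xy, dtype=float).T            # 2 x N
    return np.vstack([pts, np.ones((1, pts.shape[1]))])   # 3 x N

def projection_camera_mapping(original_pts, camera_pts):
    """H = C x P': C holds the camera coordinates, P' is the pseudo-inverse
    of the original-image coordinates; works for 4 corners or n feature points."""
    C = homogeneous(camera_pts)                            # 3 x N
    P_pinv = np.linalg.pinv(homogeneous(original_pts))     # N x 3
    return C @ P_pinv                                      # 3 x 3 mapping
```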
S304, acquiring an original image projected by the projector and a projected image projected on the curtain by the original image shot by the camera.
S305, at least four groups of corresponding feature points are extracted from the original image and the projected image respectively.
And S306, determining the actual mapping relation of the projection and the camera by using the positions of the four groups of feature points.
Specifically, SURF is used to locate the feature points because the algorithm is robust to rotation and scale changes. Since the two images are inherently different, the number of feature points and the feature vectors computed for the original image and the projected image differ noticeably, so the RANSAC algorithm is used to match the feature vectors, which improves the accuracy of feature point matching. H1 is then calculated as H1 = Cn × Pn', where Cn is a 3 × n matrix composed of the coordinates of the feature points of the projection on the curtain as seen by the camera, Pn' is the n × 3 (pseudo-)inverse matrix of the coordinates of the corresponding feature points of the original image, and n is 4.
S307, judging whether the difference value of the actual projection-camera mapping relation and the basic projection-camera mapping relation is within the maximum difference value range.
And S308, if the difference value is within the maximum difference value range, obtaining an evaluation value by using a preset algorithm for the pixel brightness corresponding to the four groups of feature points.
S309, determining the current environment brightness value from the pre-established corresponding relation between the evaluation value and the brightness according to the evaluation value and the current distance, wherein the current distance is the distance between the current projector and the curtain.
It can be understood that the target distance is a standard distance, and when step S304 is executed the current distance is the same as or close to the target distance, so the projection-camera basic mapping relation is used as a reference. If the difference between the obtained actual projection-camera mapping relation and the basic projection-camera mapping relation is within the maximum difference range, the accuracy of the finally obtained current environment brightness value is high; if it is not within the maximum difference range, the measurement is stopped, because even if the measurement continued, the accuracy of the obtained current ambient brightness would be low. Therefore, checking whether the difference is within the maximum difference range further ensures the accuracy of the measurement.
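One simple way to implement the comparison in S307 is sketched below: the two mapping relations are normalized and the Frobenius norm of their difference is compared against a maximum difference. The normalization step and the threshold value are illustrative assumptions, not values specified in this application.

```python
import numpy as np

def mapping_within_tolerance(H_actual, H_basic, max_difference=0.05):
    """Return True if the actual mapping stays within the maximum difference range."""
    # Normalize so the overall scale of each mapping does not dominate the comparison.
    Ha = H_actual / H_actual[2, 2]
    Hb = H_basic / H_basic[2, 2]
    return float(np.linalg.norm(Ha - Hb)) <= max_difference
```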
Referring to fig. 4, fig. 4 is a schematic structural diagram of a projection environment brightness detection apparatus provided in an embodiment of the present application. The projection environment brightness detection apparatus described below and the projection environment brightness detection method described above may be referred to correspondingly. The apparatus includes:
the image acquisition module 410, configured to acquire an original image projected by the projector and a projected image, captured by the camera, of the original image projected on the curtain;
a feature point extraction module 420, configured to extract at least four groups of corresponding feature points from the original image and the projected image respectively;
an evaluation value calculation module 430, configured to obtain an evaluation value by using a preset algorithm for pixel brightness corresponding to the four groups of feature points;
and a current environment brightness value determining module 440, configured to determine a current environment brightness value from a pre-established evaluation value-brightness correspondence according to the evaluation value and the current distance, where the current distance is a distance between the current projector and the curtain.
In particular, the apparatus is mainly applied to a projector, including but not limited to a short-focus projector or a long-focus projector; the short-focus projector may be, for example, a laser television.
Optionally, the method further includes:
the reference image acquisition module is used for acquiring a reference original image and a reference projection image which is projected on the curtain by the reference original image shot by the camera under different ambient brightness and different reference distances;
the reference characteristic point extraction module is used for extracting at least four groups of corresponding reference characteristic points from the reference original image and the reference projection image respectively;
the reference evaluation value calculation module is used for obtaining reference evaluation values by utilizing the four groups of corresponding reference characteristic points through a preset algorithm;
the evaluation value-brightness corresponding relation establishing module is used for establishing an evaluation value-brightness corresponding relation based on a reference evaluation value, a reference distance and environment brightness;
wherein the ambient brightness is brightness acquired at the target position by using a sensor;
the reference distance is the distance between the projector and the curtain.
Optionally, the method further includes:
the target image acquisition module is used for acquiring a target original image and a target projection image of the target original image shot by the camera projected on the curtain when the projector is at a target distance from the curtain;
the target characteristic point acquisition module is used for extracting at least four groups of corresponding target characteristic points from the target original image and the target projection image respectively;
the basic mapping relation determining module is used for determining a projection-camera basic mapping relation by utilizing the positions of the four groups of target feature points;
correspondingly, the method also comprises the following steps:
the actual mapping relation determining module is used for determining the actual mapping relation of the projection-camera by utilizing the positions of the four groups of feature points;
the judgment module is used for judging whether the difference value between the actual projection-camera mapping relation and the basic projection-camera mapping relation is within the maximum difference value range or not;
and the execution module is used for executing the step of obtaining an evaluation value of the pixel brightness corresponding to the four groups of feature points by using a preset algorithm if the difference value is within the maximum difference range.
Optionally, the feature point extracting module 420 includes:
the brightness adjusting unit is used for adjusting the brightness of the original image and the brightness of the projected image so as to enable the brightness of the original image to be the same as the brightness of the projected image;
and the extraction unit is used for extracting at least four groups of corresponding characteristic points from the original image and the projected image after the brightness is adjusted.
Optionally, the extracting unit includes:
the original characteristic point determining subunit is used for respectively positioning a plurality of groups of original characteristic points from the original image and the projection image after the brightness is adjusted by utilizing an SURF algorithm;
and the matching subunit is used for performing feature vector matching on all the original feature points by using a RANSAC algorithm to obtain at least four groups of corresponding feature points.
Optionally, the method further includes:
and the correction module is used for executing trapezoidal correction.
Optionally, the evaluation value calculating module 430 includes:
a parameter determining unit, configured to calculate the pixel brightness corresponding to the four groups of feature points by using Ci = a/(1 - e^(alpha × (Pi - b))) + k to obtain a first parameter a, a second parameter b, a third parameter alpha and a fourth parameter k;
an evaluation value determination unit, configured to determine an evaluation value by using L = (a/b) × alpha + k based on the first parameter, the second parameter, the third parameter and the fourth parameter;
where Pi represents pixel brightness corresponding to a feature point of the projected image, Ci represents pixel brightness corresponding to a feature point of the original image, and L is an evaluation value.
Since the embodiment of the projection environment brightness detection apparatus portion and the embodiment of the projection environment brightness detection method portion correspond to each other, please refer to the description of the embodiment of the projection environment brightness detection method portion for the embodiment of the projection environment brightness detection apparatus portion, and details thereof are not repeated here.
In the following, an electronic device provided by an embodiment of the present application is introduced, and the electronic device described below and the projection environment brightness detection method described above may be referred to correspondingly.
The application provides an electronic device, including:
a memory for storing a computer program;
and the processor is used for realizing the steps of the projection environment brightness detection method when executing the computer program.
Since the embodiment of the electronic device portion corresponds to the embodiment of the projection environment brightness detection method portion, please refer to the description of the embodiment of the projection environment brightness detection method portion for the embodiment of the electronic device portion, and details are not repeated here.
In the following, a computer-readable storage medium provided by an embodiment of the present application is described, and the computer-readable storage medium described below and the projection environment brightness detection method described above may be referred to correspondingly.
The present application provides a computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the steps of the above projection environment brightness detection method.
Since the embodiment of the computer-readable storage medium portion corresponds to the embodiment of the projection environment brightness detection method portion, please refer to the description of the embodiment of the projection environment brightness detection method portion for the embodiment of the computer-readable storage medium portion, which is not repeated here.
The embodiments are described in a progressive manner in the specification, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. The device disclosed by the embodiment corresponds to the method disclosed by the embodiment, so that the description is simple, and the relevant points can be referred to the method part for description.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), memory, Read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The projection environment brightness detection method, the projection environment brightness detection device, the electronic device, and the computer-readable storage medium provided by the present application are described in detail above. The principles and embodiments of the present application are explained herein using specific examples, which are provided only to help understand the method and the core idea of the present application. It should be noted that, for those skilled in the art, it is possible to make several improvements and modifications to the present application without departing from the principle of the present application, and such improvements and modifications also fall within the scope of the claims of the present application.

Claims (9)

1. A projection environment brightness detection method is characterized by comprising the following steps:
acquiring an original image projected by a projector and a projected image of the original image projected on a curtain and shot by a camera;
extracting at least four groups of corresponding feature points from the original image and the projected image respectively;
obtaining evaluation values of the pixel brightness corresponding to the four groups of feature points by using a preset algorithm;
determining a current environment brightness value from a pre-established corresponding relation between an evaluation value and brightness according to the evaluation value and a current distance, wherein the current distance is the distance between the projector and the curtain at present;
the obtaining of the evaluation value of the pixel brightness corresponding to the four groups of feature points by using a preset algorithm comprises the following steps:
calculating the pixel brightness corresponding to the four groups of feature points by using Ci = a/(1 - e^(alpha × (Pi - b))) + k to obtain a first parameter a, a second parameter b, a third parameter alpha and a fourth parameter k;
determining an evaluation value by using L = (a/b) × alpha + k based on the first parameter, the second parameter, the third parameter and the fourth parameter;
wherein Pi represents pixel brightness corresponding to the feature point of the projected image, Ci represents pixel brightness corresponding to the feature point of the original image, and L is an evaluation value.
2. The method for detecting the brightness of the projection environment according to claim 1, wherein before the obtaining of the projection image of the original image projected by the projector and the projection image of the original image projected on the curtain captured by the camera, the method further comprises:
under different ambient brightness and different reference distances, acquiring a reference original image and a reference projection image of the reference original image shot by the camera projected on the curtain;
extracting at least four groups of corresponding reference feature points from the reference original image and the reference projection image respectively;
obtaining reference evaluation values by using the four groups of corresponding reference characteristic points through the preset algorithm;
establishing a correspondence relationship of the evaluation value-brightness based on the reference evaluation value, the reference distance, and the ambient brightness;
wherein the ambient brightness is brightness collected at a target location using a sensor;
the reference distance is a distance between the projector and the curtain.
3. The method for detecting the brightness of the projection environment according to claim 1, wherein before the obtaining of the projection image of the original image projected by the projector and the projection image of the original image projected on the curtain captured by the camera, the method further comprises:
when the projector is at a target distance from the curtain, acquiring a target original image and a target projection image of the target original image shot by the camera projected on the curtain;
extracting at least four groups of corresponding target feature points from the target original image and the target projection image respectively;
determining a projection-camera basic mapping relation by using the positions of the four groups of target feature points;
correspondingly, after at least four groups of corresponding feature points are extracted from the original image and the projected image respectively, the method further includes:
determining the actual mapping relation of the projection and the camera by using the positions of the four groups of feature points;
judging whether the difference value of the actual projection-camera mapping relation and the basic projection-camera mapping relation is within the maximum difference value range or not;
and if the difference value is within the maximum difference range, executing the step of obtaining an evaluation value by using a preset algorithm for the pixel brightness corresponding to the four groups of feature points.
4. The method according to claim 1, wherein the extracting at least four groups of corresponding feature points from the original image and the projected image respectively comprises:
adjusting the brightness of the original image and the brightness of the projected image so that the brightness of the original image is the same as that of the projected image;
and extracting at least four groups of corresponding feature points from the original image and the projected image after the brightness is adjusted respectively.
5. The method according to claim 4, wherein the extracting at least four groups of corresponding feature points from the original image and the projected image after brightness adjustment respectively comprises:
utilizing an SURF algorithm to position a plurality of groups of original characteristic points from the original image and the projected image after the brightness is adjusted respectively;
and performing feature vector matching on all original feature points by using a RANSAC algorithm to obtain at least four groups of corresponding feature points.
6. The method for detecting the brightness of the projection environment according to claim 1, wherein before the obtaining of the projection image of the original image projected by the projector and the projection image of the original image projected on the curtain captured by the camera, the method further comprises:
trapezoidal correction is performed.
7. A projection environment brightness detection device is characterized by comprising:
the device comprises an image acquisition module, a screen and a control module, wherein the image acquisition module is used for acquiring an original image projected by a projector and a projected image of the original image projected on the screen, which is shot by a camera;
the characteristic point extraction module is used for extracting at least four groups of corresponding characteristic points from the original image and the projected image respectively;
the evaluation value calculation module is used for obtaining evaluation values of the pixel brightness corresponding to the four groups of feature points by utilizing a preset algorithm;
a current environment brightness value determining module, configured to determine a current environment brightness value from a pre-established evaluation value-brightness correspondence according to the evaluation value and a current distance, where the current distance is a distance between the projector and the curtain at present;
the evaluation value calculation module includes:
a parameter determining unit, configured to calculate the pixel brightness corresponding to the four groups of feature points by using Ci = a/(1 - e^(alpha × (Pi - b))) + k to obtain a first parameter a, a second parameter b, a third parameter alpha and a fourth parameter k;
an evaluation value determination unit, configured to determine an evaluation value by using L = (a/b) × alpha + k based on the first parameter, the second parameter, the third parameter and the fourth parameter;
wherein Pi represents pixel brightness corresponding to the feature point of the projected image, Ci represents pixel brightness corresponding to the feature point of the original image, and L is an evaluation value.
8. An electronic device, comprising:
a memory for storing a computer program;
a processor for implementing the steps of the projection environment brightness detection method according to any one of claims 1 to 6 when executing the computer program.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the projection environment brightness detection method according to any one of claims 1 to 6.
CN201911157817.9A 2019-11-22 2019-11-22 Projection environment brightness detection method and device, electronic equipment and medium Active CN110769238B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911157817.9A CN110769238B (en) 2019-11-22 2019-11-22 Projection environment brightness detection method and device, electronic equipment and medium

Publications (2)

Publication Number Publication Date
CN110769238A CN110769238A (en) 2020-02-07
CN110769238B true CN110769238B (en) 2021-04-27

Family

ID=69338912

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911157817.9A Active CN110769238B (en) 2019-11-22 2019-11-22 Projection environment brightness detection method and device, electronic equipment and medium

Country Status (1)

Country Link
CN (1) CN110769238B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116631306B (en) * 2022-07-21 2024-02-23 宜宾市极米光电有限公司 Brightness adjustment method, brightness adjustment device, display device and storage medium
CN115484447B (en) * 2022-11-14 2023-03-24 深圳市芯图科技有限公司 Projection method, projection system and projector based on high color gamut adjustment
CN116546175B (en) * 2023-06-01 2023-10-31 深圳创疆网络科技有限公司 Intelligent control method and device for realizing projector based on automatic induction

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1410872A (en) * 2001-09-27 2003-04-16 精工爱普生株式会社 Image display system, projector, information storage medium and image processing method
CN1767606A (en) * 2004-10-25 2006-05-03 伯斯有限公司 Enhancing contrast
CN1991571A (en) * 2005-12-29 2007-07-04 联想(北京)有限公司 Luminance self-adjusting projector and method thereof
CN101324749A (en) * 2008-07-24 2008-12-17 上海交通大学 Method for performing projection display on veins plane
CN103327273A (en) * 2012-03-21 2013-09-25 精工爱普生株式会社 Image processing device, projector, and method of controlling projector
CN104318912A (en) * 2014-10-23 2015-01-28 赵辉 Method and device for detecting environmental light brightness
JP2015087321A (en) * 2013-10-31 2015-05-07 セイコーエプソン株式会社 Control apparatus, robot control system, control method, and control program
CN109831658A (en) * 2019-04-03 2019-05-31 贵安新区新特电动汽车工业有限公司 Projected light tone adjusting method and projected light tone engagement positions
US10331207B1 (en) * 2013-03-15 2019-06-25 John Castle Simmons Light management for image and data control

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7911693B2 (en) * 2006-03-20 2011-03-22 Hewlett-Packard Development Company, L.P. Ambient light absorbing screen
US7679602B2 (en) * 2006-09-22 2010-03-16 Aptina Imaging Corporation Graphical user interface based control of imaging parameters including scene illumination parameters

Also Published As

Publication number Publication date
CN110769238A (en) 2020-02-07

Similar Documents

Publication Publication Date Title
CN110769238B (en) Projection environment brightness detection method and device, electronic equipment and medium
CN107766855B (en) Chessman positioning method and system based on machine vision, storage medium and robot
JP5961945B2 (en) Image processing apparatus, projector and projector system having the image processing apparatus, image processing method, program thereof, and recording medium recording the program
CN108389212B (en) Method for measuring foot size and computer readable medium
US20230027389A1 (en) Distance determination method, apparatus and system
US11024052B2 (en) Stereo camera and height acquisition method thereof and height acquisition system
CN104361580B (en) Projected image real-time correction method based on planar screen
CN111083456B (en) Projection correction method, apparatus, projector, and readable storage medium
CN110084133B (en) Obstacle detection method, obstacle detection apparatus, vehicle, computer device, and storage medium
CN109345597B (en) Camera calibration image acquisition method and device based on augmented reality
WO2021136386A1 (en) Data processing method, terminal, and server
CN108871185B (en) Method, device and equipment for detecting parts and computer readable storage medium
CN110087049A (en) Automatic focusing system, method and projector
JP2011182397A (en) Method and apparatus for calculating shift length
CN112753047B (en) Method and system for in-loop calibration and target point setting of hardware of camera and related equipment
KR20130007950A (en) Apparatus and method for detecting region of interest, and the recording media storing the program performing the said method
EP3690800A2 (en) Information processing apparatus, information processing method, and program
CN111681186A (en) Image processing method and device, electronic equipment and readable storage medium
CN111627073B (en) Calibration method, calibration device and storage medium based on man-machine interaction
CN109496326B (en) Image processing method, device and system
CN112995525B (en) Camera exposure method and device for self-walking equipment
CN113749646A (en) Monocular vision-based human body height measuring method and device and electronic equipment
JP2011069797A (en) Displacement measuring device and displacement measuring method
CN101685240B (en) Method for judging focusing quality of image extracting device
CN113375555A (en) Power line clamp measuring method and system based on mobile phone image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant