WO2022198508A1 - Lens abnormality prompting method, apparatus, movable platform and readable storage medium - Google Patents

Lens abnormality prompting method, apparatus, movable platform and readable storage medium

Info

Publication number
WO2022198508A1
Authority
WO
WIPO (PCT)
Prior art keywords
lens
abnormal
photosensitive
probability
movable platform
Prior art date
Application number
PCT/CN2021/082779
Other languages
English (en)
French (fr)
Inventor
徐骥飞
周游
刘洁
Original Assignee
深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority to PCT/CN2021/082779 priority Critical patent/WO2022198508A1/zh
Publication of WO2022198508A1 publication Critical patent/WO2022198508A1/zh

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 17/00: Diagnosis, testing or measuring for television systems or their details

Definitions

  • the present application relates to the field of intelligent detection, and in particular, to a method, device, movable platform, and computer-readable storage medium for indicating abnormality of a lens.
  • a movable platform loaded with lenses such as binocular cameras can obtain information about the surrounding environment based on the lenses, thereby enabling intelligent motion, intelligent navigation, and the like.
  • The lens is the "visual organ" of the movable platform. If part of the lens is dirty or blocked, the corresponding area will exhibit abnormal light sensitivity, and some pixel units of the collected image may show a fixed patch of texture. This affects the accurate collection of surrounding environment information by the movable platform, thereby creating potential safety hazards.
  • the present application provides a method, device, movable platform and computer-readable storage medium for prompting abnormality of a lens.
  • A method for prompting a lens abnormality is provided, the method comprising: acquiring an image collected by the lens; acquiring brightness values of multiple pixel units of the image, the brightness values of the multiple pixel units being obtained by collecting the environment through multiple photosensitive areas of the lens; obtaining, based on the brightness values of the multiple pixel units, an enhancement factor and an attenuation factor of the photosensitive area of the lens corresponding to any pixel unit, wherein the enhancement factor represents the degree of enhancement of the brightness value of the pixel unit corresponding to the photosensitive area relative to an estimated brightness value, the attenuation factor represents the degree of attenuation of the brightness value of the pixel unit corresponding to the photosensitive area relative to the estimated brightness value, and the estimated brightness value represents the brightness value that the pixel unit corresponding to the photosensitive area would obtain by collecting the environment under normal conditions; determining, based on the numerical ranges of the enhancement factor and the attenuation factor, an abnormal photosensitive area among the multiple photosensitive areas of the lens; and generating prompt information based on the position of the abnormal photosensitive area.
  • Another method is provided, the method including: acquiring an image collected by the lens; detecting, based on pixel values of multiple pixel units of the image, whether a light-transmitting component or a photosensitive device of the lens is blocked by a foreign object; and generating prompt information when the foreign-object blockage is detected.
  • A lens abnormality prompting device is provided; the device includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the program, the steps of the methods of the first aspect and the second aspect of the embodiments of the present application are implemented.
  • A movable platform is provided, which includes: a lens, a power system, a memory, and a processor; the lens is used to photograph the environment where the movable platform is located to obtain a depth map, and the depth map provides environmental information for the movement of the movable platform;
  • the power system is used to provide power for the movement of the movable platform;
  • the memory is used for storing program codes;
  • the processor calls the program code, and when the program code is executed, it is used to implement the steps of the methods of the first aspect and the second aspect of the embodiments of the present application.
  • A computer-readable storage medium is provided, where several computer instructions are stored on the computer-readable storage medium, and when the computer instructions are executed, the steps of the methods described in the first aspect and the second aspect of the embodiments of the present application are implemented.
  • In the technical solutions provided by the embodiments of the present application, the image captured by the lens to be checked is acquired, the enhancement factor and attenuation factor are obtained based on the brightness values of multiple pixel units of the image, the abnormal photosensitive area of the lens is then determined based on the numerical ranges of the enhancement factor and the attenuation factor, and prompt information is generated. This makes it possible to determine simply and quickly whether the lens has an abnormal photosensitive area caused by conditions such as dirt or occlusion, as well as the specific location of that area, so that prompt information can be generated to remind users or managers to deal with the abnormal photosensitive area, overcoming the defects of low efficiency and high error rate of the manual detection methods in the related art.
  • FIG. 1 is a schematic diagram of a depth map collected by a binocular camera according to an exemplary embodiment of the present application.
  • FIG. 2 is a schematic diagram of an image of the scene where a movable platform is located, collected according to an exemplary embodiment of the present application.
  • FIG. 3 is a flowchart of a first method for prompting abnormality of a lens according to an exemplary embodiment of the present application.
  • FIG. 4 is a flowchart of a second method for prompting abnormality of a lens according to an exemplary embodiment of the present application.
  • FIG. 5A is an image captured by a left-eye camera blocked by a tripod of a drone according to an exemplary embodiment of the present application.
  • FIG. 5B is an image captured by a right-eye camera blocked by a tripod of a drone according to an exemplary embodiment of the present application.
  • FIG. 5C is an image captured by a camera blocked by a protective cover of a gimbal of a drone according to an exemplary embodiment of the present application.
  • FIG. 6 is a flowchart of a method for determining an abnormal photosensitive area based on brightness values of multiple pixel units according to an exemplary embodiment of the present application.
  • FIG. 7 is a schematic diagram showing the principle of determining an abnormal photosensitive area based on luminance values of multiple pixel units according to an exemplary embodiment of the present application.
  • FIG. 8 is a schematic diagram of a probability map composed of cumulative abnormal probabilities of multiple photosensitive regions according to an exemplary embodiment of the present application.
  • FIG. 9 is a flowchart of a third method for prompting abnormality of a lens according to an exemplary embodiment of the present application.
  • FIG. 10 is a schematic diagram of the hardware structure of a lens abnormality prompting device according to an exemplary embodiment of the present application.
  • FIG. 11 is a schematic diagram of the hardware structure of a mobile platform according to an exemplary embodiment of the present application.
  • Although the terms first, second, third, etc. may be used in this application to describe various information, such information should not be limited by these terms; these terms are only used to distinguish information of the same type from each other.
  • For example, without departing from the scope of the present disclosure, the first information may also be referred to as the second information, and similarly, the second information may also be referred to as the first information.
  • Depending on the context, the word "if" as used herein can be interpreted as "at the time of", "when", or "in response to determining".
  • Computer vision is also in a stage of rapid development. Computer vision relies on an imaging system as its "visual organ" to collect image information, which is then analyzed and processed by a computer, so that the computer can "perceive" the collected images like a human, obtaining useful information and making use of it.
  • A stereo vision system is based on two cameras that photograph the same scene simultaneously from different angles. By analyzing and processing the two pictures, and using the triangular relationship derived from the known position and angle relationship between the two cameras, the distance between an object in the scene and the binocular camera can be calculated. As shown in FIG. 1, in the depth map obtained from the calculated distance relationship, the grayscale of each pixel unit represents its distance from the binocular camera.
  • the movable platform can intelligently perform obstacle avoidance, motion route planning, etc., based on the obtained distance relationship of objects in the scene, so as to realize intelligent perception.
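The triangulation described above can be sketched as follows, assuming a calibrated and rectified stereo pair; the function name and the sample values are illustrative, not from the patent:

```python
# Sketch of the stereo triangulation step: with a rectified binocular
# pair, the depth of a scene point follows from its horizontal disparity.

def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Depth Z = f * B / d for a rectified stereo pair.

    disparity_px: horizontal pixel offset of the same scene point
                  between the left and right images (must be > 0).
    focal_px:     focal length expressed in pixels.
    baseline_m:   distance between the two camera centres in metres.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# A point seen 40 px apart by cameras with f = 800 px and a 0.1 m
# baseline lies 2 m away; nearer objects produce larger disparities.
print(depth_from_disparity(40.0, 800.0, 0.1))  # 2.0
```

Converting each pixel's disparity this way yields the depth map of FIG. 1, in which grayscale encodes distance from the binocular camera.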
  • If the lens of the imaging system, its "visual organ", is dirty or blocked, a fixed texture will appear on the collected image, that is, the brightness values of some pixel units will be abnormal. This affects the computer vision calculation results, leading to missed or false detections, and may even cause safety accidents.
  • FIG. 2 shows a scene in front of the flight direction captured by the binocular camera mounted on a movable platform, a drone, during flight, where the area 201 is an abnormal photosensitive area of the binocular camera. The existence of an abnormal photosensitive area on the right-eye camera means that shooting with the binocular camera cannot obtain the real distance information of objects in the scene corresponding to this area, creating a potential safety hazard for the flight control of the drone.
  • In the related art, whether there is an abnormality in the lens usually requires a staff member or user to check actively. If the lens abnormality check is forgotten before the movable platform moves, a safety hazard results. In addition, the manual detection method has a low degree of intelligence and is inefficient for batch lens inspection. Moreover, lenses are becoming smaller and smaller while the resolution of the human eye is limited, so manual lens inspection often suffers from missed detections and false detections.
  • an embodiment of the present application provides a method for indicating abnormality of a lens, wherein the lens at least includes a light-transmitting component and a photosensitive device.
  • The light-transmitting component may be a glass light-transmitting component, a plastic light-transmitting component, or another light-transmitting component used for transmitting optical signals, which is not limited in this application.
  • the photosensitive device can be various sensors, such as a CMOS sensor, a CCD sensor, etc., or various other photosensitive devices, so as to convert optical signals into electrical signals.
  • the first method for prompting an abnormality of a lens includes the following steps:
  • Step 301: acquire an image collected by the lens;
  • Step 302: based on the brightness values of the multiple pixel units of the image, detect whether the lens has an abnormal photosensitive area, the brightness values of the multiple pixel units being collected from the environment by the multiple photosensitive areas of the lens;
  • Step 303: generate prompt information when the abnormal photosensitive area exists.
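As a rough illustration of steps 301 to 303, the sketch below stands in for the brightness-based detection with a simple block-mean statistic; the patent's actual criterion (the enhancement and attenuation factors) is described later, and all names and thresholds here are illustrative assumptions:

```python
import numpy as np

def detect_abnormal_regions(image: np.ndarray, block: int = 8, z_thresh: float = 5.0):
    """Return (row, col) indices of blocks whose mean brightness deviates
    strongly from the global mean -- a stand-in abnormality test."""
    h, w = image.shape
    means = image[:h - h % block, :w - w % block] \
        .reshape(h // block, block, w // block, block).mean(axis=(1, 3))
    z = (means - means.mean()) / (means.std() + 1e-9)
    return [tuple(idx) for idx in np.argwhere(np.abs(z) > z_thresh)]

def prompt_if_abnormal(image: np.ndarray) -> str:
    regions = detect_abnormal_regions(image)          # steps 301-302
    if regions:                                       # step 303
        return f"lens abnormality suspected at blocks {regions}"
    return "no abnormal photosensitive area detected"

rng = np.random.default_rng(0)
clean = rng.normal(128, 2, (64, 64))
print(prompt_if_abnormal(clean))
dirty = clean.copy()
dirty[0:8, 0:8] -= 100                                # simulate a dark smudge
print(prompt_if_abnormal(dirty))
```

The dark patch shifts one block's mean far from the rest, so only the dirty image triggers the prompt.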
  • the lens is mounted on a movable platform to provide the movable platform with depth information of the scene in which it is located.
  • The lens may be any one or more cameras among a binocular camera or a multi-camera system mounted on the movable platform, for acquiring the depth information of the scene where the movable platform is located.
  • the movable platform includes at least one of the following: drones, unmanned vehicles, intelligent robots, and the like.
  • the movable platform may also be of other movable types capable of carrying other devices, and the embodiment of the present application does not limit the specific type of the movable platform.
  • As shown in FIG. 4, another method for prompting a lens abnormality is provided, and the method includes:
  • Step 401: acquire an image collected by the lens;
  • Step 402: based on the pixel values of the multiple pixel units of the image, detect whether the light-transmitting component or the photosensitive device of the lens is blocked by a foreign object;
  • Step 403: generate prompt information when the foreign-object blockage is detected.
  • the pixel value includes a luminance value as a component
  • the method further includes:
  • Step 4021 Determine whether there is an abnormal photosensitive area in the light-transmitting component or the photosensitive device of the lens based on the brightness values of the plurality of pixel units of the image.
  • Other components of the pixel values, such as the RGB distribution, can also be used to determine whether there is an abnormal photosensitive region in the light-transmitting component or the photosensitive device of the lens, which is not limited in this embodiment of the present application.
  • the lens is mounted on a movable platform for providing the movable platform with depth information of the scene in which it is located.
  • In order to prevent the lens from being exposed to the open environment for a long time, which would lead to the accumulation of dust and contamination, the lens is often protected by a lens protective cover when not in use, and the protective cover is opened when the lens is to be used.
  • However, the protective cover may be insufficiently opened, which in turn causes the light-transmitting component or the photosensitive device of the lens to be blocked, so that when the lens is used to collect environmental information around the movable platform, the collected information is insufficient, bringing safety risks.
  • Detecting in step 402 whether the light-transmitting component or the photosensitive device of the lens is blocked by a foreign object includes: detecting whether there is any adhering matter on the light-transmitting component or the photosensitive device, and whether the light-transmitting component or the photosensitive device is blocked by an external component.
  • the external components may include one or more of the following: a protective cover, a mount for a movable platform, or a power assembly.
  • the protective cover may be a protective cover of the lens itself, or may be a protective cover on a movable platform mounted on the lens.
  • the support of the movable platform can be a tripod, a foldable support table, and the like.
  • the power component may be the moving arm of the movable platform, such as the arm of an unmanned aerial vehicle, a flying paddle, a limb component of an intelligent robot, and the like.
  • the movable platform may be an unmanned aerial vehicle, an unmanned vehicle, an intelligent robot, etc., which is not limited in this embodiment of the present application.
  • When the movable platform is an unmanned aerial vehicle, the external component may be an arm of the unmanned aerial vehicle, a flight-paddle shield of the unmanned aerial vehicle, or a protective cover of the gimbal of the unmanned aerial vehicle, which is not limited in this embodiment of the present application.
  • Examples are described below with reference to FIG. 5A, FIG. 5B and FIG. 5C.
  • FIG. 5A and FIG. 5B are images collected by the drone with the left-eye camera and the right-eye camera, respectively, where the areas indicated by 501 and 502 show the tripod of the drone, which, not being placed in the correct position, forms an occlusion on the left-eye camera and the right-eye camera. Due to this occlusion, there are abnormal photosensitive areas in the areas 501 and 502 of the collected images.
  • Based on the pixel values of the multiple pixel units of the collected image, the existence of the foreign-object occlusion is detected and prompt information is generated, which can avoid the potential safety hazards caused by errors in the collected depth map.
  • FIG. 5C is an image collected by the drone with one of the binocular cameras, where the areas indicated by 503a to 503c result from the gimbal protective cover not being fully opened for some reason when the drone opened it, thereby blocking the lens; the areas 503a to 503c are abnormal photosensitive areas. Based on the pixel values of the multiple pixel units of the collected image, the foreign-object occlusion is detected and prompt information is generated, avoiding the safety risks caused by errors in the collected depth map.
  • the method described in the above embodiments further includes: when it is detected that the light-transmitting component of the lens or the photosensitive device is blocked by a foreign object, controlling the movement of the external component.
  • By controlling the movement of the external component, it can be determined whether the source of the occlusion of the lens is that external component. If the preset range of the motion control is appropriate, the occlusion of the lens may be eliminated. Even if the occlusion cannot be eliminated by controlling the external component, a preset alarm can be issued to remind the user that the light-transmitting component or the photosensitive device of the lens is blocked by a foreign object; it is even possible to control the component to move further so as to eliminate the occlusion, realizing automatic elimination of the occlusion of the lens.
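The recovery logic above can be sketched as a small loop: move a suspected external component (e.g. a protective cover) through a preset range, re-test for occlusion, and alarm if the occlusion persists. The callbacks are hypothetical placeholders, not APIs from the patent:

```python
def resolve_occlusion(move_component, occlusion_detected, max_steps: int = 3) -> str:
    """Try up to max_steps preset movements of the component; report the outcome."""
    if not occlusion_detected():
        return "no occlusion"
    for step in range(1, max_steps + 1):
        move_component(step)                 # e.g. open the cover a bit further
        if not occlusion_detected():
            return f"occlusion cleared by external component (step {step})"
    return "alarm: lens still occluded; source is not the moved component"

# Simulate a gimbal cover that fully opens on the second movement.
state = {"open": 0}
result = resolve_occlusion(
    move_component=lambda s: state.__setitem__("open", state["open"] + 1),
    occlusion_detected=lambda: state["open"] < 2,
)
print(result)  # occlusion cleared by external component (step 2)
```

If the occlusion disappears after a movement, the external component was the source; if it never disappears, the alarm branch corresponds to the preset reminder described above.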
  • the acquired image may be one image collected by the lens, or may be multiple images collected by the lens.
  • When multiple images collected by the lens are acquired, the brightness value of the image and the gradient of the brightness value of the image may be taken as the averages over the multiple images.
  • In step 301 and step 401, the acquired images captured by the lens may be collected when the lens is in different positions and/or attitudes. Having the lens in different positions and/or attitudes can be realized by controlling the position and/or attitude of the movable platform on which the lens is located, or by controlling the position and/or attitude of the lens itself, which is not limited in the embodiments of the present application.
  • Before step 301 and step 401, the method may further include the following step:
  • Step 3011: control the movement of the movable platform, or of the gimbal or mechanical arm connected with the lens, to adjust the shooting position and/or attitude of the lens.
  • The implementations of the different positions and/or attitudes of the lens in the above embodiments are only illustrative; those skilled in the art can use other methods, according to the specific structures of the movable platform and the lens, to place the lens to be detected in different positions and/or attitudes, which is not limited in the embodiments of the present application.
  • The method for prompting a lens abnormality provided by the embodiments of the present application further includes the following step after step 3011:
  • Step 3012: detect whether the change in the position and/or attitude of the lens is greater than a threshold, so as to acquire an image captured by the lens when the change in the position and/or attitude of the lens is greater than the threshold.
  • The change in the position and attitude of the lens can be determined by one or more of a Global Positioning System (GPS), an inertial measurement unit (IMU), a visual-inertial odometry (VIO) system, and the like carried on the lens or the movable platform. The specific determination may be implemented with reference to the related art and will not be repeated in this embodiment of the present application.
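Step 3012 can be sketched as a simple gate, under the assumption that the pose reported by GPS/IMU/VIO is summarised as (x, y, z, yaw); a new image is captured only once the platform has translated or rotated by more than a threshold since the last capture. The thresholds and pose format are illustrative:

```python
import math

def pose_changed(prev, curr, trans_thresh_m: float = 0.5, yaw_thresh_rad: float = 0.2) -> bool:
    """True when translation or yaw change since the last capture exceeds its threshold."""
    dx, dy, dz = (curr[i] - prev[i] for i in range(3))
    translation = math.sqrt(dx * dx + dy * dy + dz * dz)
    yaw_change = abs(curr[3] - prev[3])
    return translation > trans_thresh_m or yaw_change > yaw_thresh_rad

last_capture_pose = (0.0, 0.0, 10.0, 0.0)
print(pose_changed(last_capture_pose, (0.1, 0.0, 10.0, 0.05)))  # False: keep waiting
print(pose_changed(last_capture_pose, (1.0, 0.0, 10.0, 0.0)))   # True: capture a new image
```

Gating the capture this way ensures successive images see genuinely different scene content, so a texture fixed on the lens stands out against the changing background.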
  • In step 302, determining whether there is an abnormal photosensitive area in the lens based on the brightness values of multiple pixel units of the image, and in step 4021, determining whether there is an abnormal photosensitive area in the light-transmitting component or the photosensitive device of the lens based on the brightness values of multiple pixel units of the image, the pixel units used may be pixel units that do not include the sky area.
  • Whether a pixel unit includes a sky area can be determined in various ways. For example, a semantic parsing algorithm based on a deep learning network can be used to detect sky regions in an image. Of course, those skilled in the art should understand that other ways can also be used to determine whether a pixel unit belongs to a sky area. When a pixel unit belongs to a sky area, the pixel unit is excluded and is not used to determine the abnormal photosensitive area of the lens.
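The sky-exclusion step can be sketched as a mask filter: given a semantic mask (1 = sky, 0 = non-sky), for example from a segmentation network that is stubbed out here, only non-sky pixel units are kept for the abnormality statistics:

```python
import numpy as np

def non_sky_brightness(image: np.ndarray, sky_mask: np.ndarray) -> np.ndarray:
    """Return the brightness values of pixel units outside the sky area."""
    return image[sky_mask == 0]

image = np.array([[200.0, 210.0], [90.0, 95.0]])
sky_mask = np.array([[1, 1], [0, 0]])       # top row labelled as sky
print(non_sky_brightness(image, sky_mask))  # [90. 95.]
```

Excluding the near-uniform, very bright sky keeps it from biasing the estimated brightness distribution used later to compute the enhancement and attenuation factors.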
  • Step 302, determining whether there is an abnormal photosensitive area in the lens based on the brightness values of multiple pixel units of the image, and step 4021, determining whether there is an abnormal photosensitive region in the light-transmitting component or the photosensitive device of the lens based on those brightness values, can be realized in various ways, which is not limited in the embodiments of the present application.
  • Step 302, determining whether there is an abnormal photosensitive area in the lens based on the brightness values of multiple pixel units of the image, can be implemented by the method shown in FIG. 6, and the method includes the following steps:
  • Step 601: based on the brightness values of the multiple pixel units, obtain the enhancement factor and attenuation factor of the photosensitive area of the lens corresponding to any pixel unit, wherein the enhancement factor represents the degree of enhancement of the brightness value of the pixel unit corresponding to the photosensitive area relative to the estimated brightness value, the attenuation factor represents the degree of attenuation of the brightness value of the pixel unit corresponding to the photosensitive area relative to the estimated brightness value, and the estimated brightness value represents the brightness value that the pixel unit corresponding to the photosensitive area would obtain by collecting the environment under normal conditions;
  • Step 602 based on the numerical range of the enhancement factor and the attenuation factor, determine an abnormal photosensitive area among the plurality of photosensitive areas of the lens.
  • Step 403, generating prompt information when the foreign-object blockage is detected, includes: generating the prompt information based on the abnormal photosensitive area.
  • The "pixel unit" described in the embodiments of the present application refers to each pixel unit of the image captured by the lens, while the "photosensitive area" refers to an area on the lens itself, which is distinct from a pixel unit of the image; each pixel unit has a unique corresponding photosensitive area.
  • As shown in FIG. 7, the lens 700 for collecting environmental information is generally composed of a light-shielding plate 701, an optical lens 702, and a sensor 703. The light-shielding plate 701 is located at the front end of the lens, protecting the optical lens and filtering some stray light; the optical lens 702 is used for optical imaging of the environment; and the sensor 703 converts the optical signal of the image formed by the optical lens into an electrical signal so that the processor can perform subsequent image processing.
  • The target object point A in the object space is imaged by the optical imaging system (in FIG. 7, the lens 700) to obtain the target image point B in the image space. The imaging process can be expressed as:
  • I(x, y) = I0(x, y) * k(x, y) (1)
  • where I0(x, y) represents the light intensity at the target object point A; I(x, y) represents the light intensity at the target image point B; k(x, y) is the blur kernel, which characterizes how the light of the target object point spreads during imaging; * denotes convolution; and (x, y) represents the coordinates of the target object point and the target image point with the optical axis of the lens as the center.
  • Due to the blur kernel, part of the light intensity from the target object point is imaged at positions other than the target image point, resulting in attenuation of the light intensity; meanwhile, the light intensity of other, non-target object points is imaged at the position of the target image point, resulting in enhancement of the light intensity. Therefore, ideal imaging does not exist.
  • When the lens 700 captures images of the environment in which it is located, the target object point A at a certain position in the environment is imaged through the lens 700. If the light intensity corresponding to the target image point B in the image space is I, the relationship between I0 and I can be written as:
  • I(x, y) = I0(x, y)·[δ(x, y) * k(x, y)] + IΔ(x, y) * k(x, y) (2)
  • where δ(x, y) represents the degree of attenuation of the light intensity at the target image point B caused by scattering points on the imaging path of the target object point A, which cause part of the light intensity of A to be imaged at positions other than B; and IΔ(x, y) represents the light intensity of other object points that, due to scattering points on their imaging paths, is shifted in position and superimposed at the target image point B as additional light intensity.
  • Accordingly, a(x, y) is the enhancement factor, which represents the degree of enhancement, relative to the estimated brightness value, of the brightness value of the pixel unit corresponding to the object point A (that is, at the image point B) in the image obtained when the lens images the target object point A; b(x, y) is the attenuation factor, which represents the degree of attenuation, relative to the estimated brightness value, of the brightness value of the pixel unit corresponding to the object point A (that is, at the image point B) in the obtained image.
  • The estimated brightness value is the brightness value that the corresponding pixel unit would obtain by collecting the environment when there is no abnormal light sensitivity in the photosensitive area of the lens. In the normal case, the light intensity at the target image point B is not abnormally increased by other object points, and b(x, y) = 0 indicates that there is no scattering point on the imaging path of the target object point A causing attenuation of the light intensity at the target image point B corresponding to A.
  • When abnormal conditions such as dirt or occlusion occur on the lens, the scattering state of the affected photosensitive areas changes significantly, which causes the enhancement factor a(x, y) and the attenuation factor b(x, y) of those photosensitive areas to change. For example, when there is an abnormal condition such as dirt in the photosensitive area corresponding to the target object point A, while the other photosensitive areas are normal, the attenuation factor b(x, y) of the pixel unit corresponding to the target object point A becomes larger, while its enhancement factor a(x, y) remains almost unchanged. Therefore, according to the numerical range of the enhancement factor a(x, y) of the pixel unit, or the numerical range of the attenuation factor b(x, y), or the respective numerical ranges of both, or the numerical relationship between them, such as their quotient, it can be determined whether the corresponding photosensitive area of the lens is an abnormal photosensitive area.
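The decision just described can be sketched as follows: a photosensitive area is flagged when its attenuation factor leaves a preset normal range, or when the quotient of the two factors is large. All thresholds here are illustrative placeholders, not values from the patent:

```python
def is_abnormal_area(a: float, b: float,
                     b_max: float = 10.0, ratio_max: float = 10.0) -> bool:
    """Flag an area by the numerical range of b, or by the quotient b/a."""
    if abs(b) > b_max:                       # attenuation factor out of its normal range
        return True
    if a != 0 and abs(b / a) > ratio_max:    # quotient criterion
        return True
    return False

print(is_abnormal_area(a=1.02, b=0.5))    # False: clean area
print(is_abnormal_area(a=1.01, b=35.0))   # True: dirt raises |b| while a barely moves
```

This mirrors the observation above that dirt mainly enlarges b(x, y) while leaving a(x, y) almost unchanged, so either the range of b alone or the b/a relationship separates abnormal areas from normal ones.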
  • In step 601, obtaining, based on the brightness values of the multiple pixel units, the enhancement factor and/or attenuation factor of the photosensitive area of the lens corresponding to any pixel unit can be implemented in various ways, which is not limited in this embodiment of the present application.
  • For example, the enhancement factor and the attenuation factor in step 601 may be determined by a pre-trained deep learning model.
  • The deep learning model can be obtained by a developer inputting into a selected deep learning model a large number of images collected by the lens, with each pixel unit annotated with an enhancement factor and an attenuation factor, so that the deep learning model learns automatically: when an image captured by the lens, whose pixel units carry no enhancement or attenuation factors, is input into the deep learning model, the model automatically outputs the enhancement factor and attenuation factor of each pixel unit of the image.
  • the enhancement factor and the attenuation factor can also be determined by algorithms other than deep learning technology.
  • An example introduction follows:
  • In the following expressions, p1 and p2 represent the coordinates of two adjacent pixels, AVG(·) represents the average value, and |·| represents the absolute value.
  • The expression of the enhancement factor is:
  • a(x, y) = AVG(|I(p1) - I(p2)|) / AVG(|I0(p1) - I0(p2)|) (8)
  • The expression of the attenuation factor is:
  • b(x, y) = AVG(I(x, y)) - a(x, y)·AVG(I0(x, y)) (9)
  • the enhancement factor and the attenuation factor in step 601 may be determined in the following manner:
  • Step 6011 according to the obtained luminance values I(x, y) of multiple pixel units, obtain the estimated luminance value I 0 (x, y) of each pixel unit through a preset fitting algorithm;
  • Step 6012 based on the obtained luminance values I(x, y) of multiple pixel units, and the estimated luminance distribution of the image generated based on the estimated luminance values I 0 (x, y) of each pixel unit, determine The enhancement factor and attenuation factor.
  • the enhancement factor may be determined according to the quotient between the mean value of the first parameter and the mean value of the second parameter, where the first parameter is a gradient of luminance values of a plurality of pixel units of the image, so The second parameter is the gradient of the estimated luminance values of the plurality of pixel units.
  • The enhancement factor can be determined based on formula (8), where the mean of the brightness-value gradients may be computed over all pixel units of the image collected by the lens, or over some pixel units including the target pixel unit; similarly, the mean of the estimated-brightness-value gradients may be computed over all pixel units of the image, or over some pixel units including the target pixel unit, which is not limited in this embodiment of the present application.
  • The target pixel unit is the pixel unit for which it is to be confirmed whether the corresponding photosensitive area is abnormal.
  • the attenuation factor may be determined according to a difference between a third parameter and a fourth parameter, where the third parameter is an average value of luminance values of the plurality of pixel units, and the fourth parameter is the The product of the mean value of the estimated luminance values of the pixel units and a fifth parameter determined from the quotient between the mean value of the first parameter and the mean value of the second parameter.
  • the attenuation factor can be determined based on formula (9), which, consistent with the description above, is b(x, y) = AVG(I(x, y)) - a(x, y) · AVG(I0(x, y)), where a(x, y) is the enhancement factor described above and is not repeated here.
  • AVG(I(x, y)) may be the average of the luminance values of all pixel units of the image captured by the lens, or may be the average of the luminance values of some pixel units including the target pixel unit.
  • AVG(I0(x, y)) may be the mean of the estimated brightness values of all pixel units of the image captured by the lens, or may be the mean of the estimated brightness values of some pixel units including the target pixel unit; this embodiment of the present application does not limit this.
  • the target pixel unit is the pixel unit for which it is to be confirmed whether the corresponding photosensitive area is abnormal.
  • the determination of the enhancement factor a(x, y) and the attenuation factor b(x, y) need not be limited to formula (8) and formula (9).
  • for example, in order to keep the determined value ranges of the enhancement factor a(x, y) and the attenuation factor b(x, y) suitable, neither too large nor too small, formula (8) and formula (9) may each be multiplied by a preset scaling factor, or a preset offset may be added, which is not limited in this embodiment of the present application.
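As a concrete sketch of how formulas (8) and (9) could be computed from an image and its fitted estimate, the expressions below follow the verbal description above (the exact formulas are not reproduced in this translation, so this reading is an assumption):

```python
import numpy as np

def enhancement_attenuation(I, I0):
    """Sketch of formulas (8) and (9) as described in the text.

    I  : observed luminance values of the pixel units (2-D array)
    I0 : estimated luminance values from the fitted model (2-D array)

    The enhancement factor a is read here as the quotient of the mean
    luminance-gradient magnitudes, and the attenuation factor b as the mean
    observed luminance minus a times the mean estimated luminance.
    """
    gy, gx = np.gradient(I.astype(float))
    grad_I = np.hypot(gx, gy)            # first parameter: gradient of I
    gy0, gx0 = np.gradient(I0.astype(float))
    grad_I0 = np.hypot(gx0, gy0)         # second parameter: gradient of I0

    a = grad_I.mean() / max(grad_I0.mean(), 1e-9)   # enhancement factor
    b = I.mean() - a * I0.mean()                    # attenuation factor
    return a, b
```

For a clean lens (I equal to I0) this yields a = 1 and b = 0, while a region whose texture is flattened by dirt and offset by stray light yields a < 1 and b > 0, matching the roles the text assigns to the two factors.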
  • in step 6011, the estimated brightness value I0(x, y) of each pixel unit is determined through a preset fitting algorithm; the choice of fitting algorithm is not limited in this embodiment of the present application.
  • the preset fitting algorithm may be a Random Sample Consensus (RANSAC) algorithm.
  • the basic assumption of the RANSAC algorithm is that the sample contains correct data (inliers, data that can be described by the model) and also contains abnormal data (outliers, data that deviates far from the normal range and cannot be fitted by the data model); such abnormal data may be produced by wrong measurements, wrong assumptions, wrong calculations, and so on.
  • the RANSAC algorithm assumes that, given a correct set of data, there are methods available to calculate model parameters that fit those data.
  • the brightness values of pixel units that are not affected by abnormal conditions such as dirt or occlusion can be regarded as normal data, while the brightness values of pixel units affected by such abnormal conditions can be regarded as abnormal data.
  • the estimated brightness value of the image captured by the lens can be fitted by formula (10) based on the RANSAC algorithm, a polynomial brightness distribution model of the form I0(x, y) = Σi,j pi,j x^i y^j, where (x, y) represents the coordinates of the pixel unit and pi,j represents the polynomial coefficients of the brightness distribution model to be solved.
  • for a first group of pixel units selected as correct data, polynomial fitting is performed to determine pi,j; the luminance values of further groups of pixel units are then substituted into formula (10) with the determined pi,j for checking. If, for most pixel units, the resulting I0(x, y) is not equal to the brightness value of the pixel unit, the determined pi,j is not suitable: the first group of correct data was not truly correct data but contained pixel units affected by abnormal states such as dirt or occlusion. In that case, the abnormal data in the first group can be marked, another group of pixel units re-selected as correct data, polynomial fitting performed again, and pi,j re-determined; the luminance values of further groups of pixel units are again substituted for checking. If, for most pixel units, the I0(x, y) obtained by formula (10) with the determined polynomial coefficients pi,j is equal to the brightness value of the pixel unit, pi,j is suitable and the estimated brightness values are obtained; if the determined pi,j is still inappropriate, the above steps are repeated until a suitable pi,j is determined.
  • in this way, the estimated luminance values I0(x, y) of the image captured by the lens at multiple pixel units can be determined.
  • by computing gradients, the gradient ∇I(x, y) of the luminance values of the multiple pixel units is obtained, and, based on the RANSAC algorithm, I0(x, y) and its gradient ∇I0(x, y) are obtained. When the lens is used to collect multiple images, AVG(I(x, y)), AVG(I0(x, y)), AVG(∇I(x, y)) and AVG(∇I0(x, y)) can be computed over those images, and the enhancement factor and the attenuation factor can then be determined.
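The RANSAC fitting loop described above can be sketched as follows; the polynomial degree, iteration count, inlier tolerance and random seed are illustrative choices, not values taken from the original text, and `np.linalg.lstsq` stands in for the exact solving method:

```python
import numpy as np

def fit_luminance_surface(I, degree=2, iters=50, tol=5.0, seed=0):
    """RANSAC-style fit of the polynomial luminance model of formula (10):
    I0(x, y) = sum over (i, j) of p_ij * x**i * y**j.

    Repeatedly samples a minimal set of pixel units as 'correct data', fits
    the coefficients p_ij, counts how many pixel units the model explains
    within `tol`, and keeps the best model, mirroring the re-selection loop
    described in the text."""
    rng = np.random.default_rng(seed)
    h, w = I.shape
    ys, xs = np.mgrid[0:h, 0:w]
    xs, ys, zs = xs.ravel(), ys.ravel(), I.ravel().astype(float)
    terms = [(i, j) for i in range(degree + 1) for j in range(degree + 1 - i)]
    A = np.stack([xs ** i * ys ** j for i, j in terms], axis=1)

    best_p, best_count = None, -1
    for _ in range(iters):
        pick = rng.choice(len(zs), size=len(terms), replace=False)
        p, *_ = np.linalg.lstsq(A[pick], zs[pick], rcond=None)
        count = int((np.abs(A @ p - zs) < tol).sum())
        if count > best_count:
            best_count, best_p = count, p
    # refit the coefficients on all inliers of the best model
    mask = np.abs(A @ best_p - zs) < tol
    p, *_ = np.linalg.lstsq(A[mask], zs[mask], rcond=None)
    return (A @ p).reshape(h, w)   # estimated I0(x, y) per pixel unit
```

Pixel units in a dirty patch fail the tolerance check, are excluded as abnormal data, and the final refit recovers the smooth luminance surface from the remaining correct data.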
  • determining an abnormal photosensitive area among the plurality of photosensitive areas of the lens may be based on the numerical range of the quotient of the enhancement factor a(x, y) and the attenuation factor b(x, y); that is, according to the numerical range of b(x, y)/a(x, y) or a(x, y)/b(x, y), the abnormal photosensitive area among the multiple photosensitive areas of the lens is determined.
  • alternatively, the abnormal photosensitive area among the multiple photosensitive areas of the lens may be determined by first determining the abnormality probability of each photosensitive area based on the numerical ranges of the enhancement factor a(x, y) and the attenuation factor b(x, y), and then determining, based on that abnormality probability, whether each photosensitive area is abnormal.
  • determining an abnormal photosensitive area among the plurality of photosensitive areas of the lens includes:
  • Step 6021 based on the numerical range of the enhancement factor and the attenuation factor, determine the abnormal probability of each photosensitive area;
  • Step 6022 Determine whether each photosensitive area is abnormal according to the relationship between the abnormal probability and a preset threshold.
  • the abnormal probability of the pixel unit corresponding to each photosensitive area may be determined based on the numerical range of the enhancement factor a(x, y), or based on the numerical range of the attenuation factor b(x, y), or based on the numerical ranges of both the enhancement factor a(x, y) and the attenuation factor b(x, y).
  • the abnormal probability of the photosensitive area may be determined through a pre-established mapping relationship between the numerical range of the enhancement factor and/or the attenuation factor and the abnormal probability.
  • for example, value ranges for determining the abnormal probability of a pixel unit can be preset for the quotient of the attenuation factor and the enhancement factor: when the value lies in [A1, B1], the corresponding abnormal probability is 0.5; when it lies in [A2, B2], the corresponding abnormal probability is 0.6; when it lies in [A3, B3], the corresponding abnormal probability is 0.7; and so on.
  • alternatively, a functional relationship between the numerical range of the enhancement factor and/or the attenuation factor and the abnormal probability can be pre-established to determine the abnormal probability, for example: for each increase of the value by N, the corresponding abnormal probability increases by M.
  • the determined abnormal probability of the pixel unit may then be compared with a preset threshold to determine whether the photosensitive area corresponding to the pixel unit is abnormal.
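A minimal sketch of steps 6021 and 6022 under assumed values: the interval endpoints [A1, B1] and so on, the probabilities attached to them and the decision threshold are not specified in the text, so the numbers below are placeholders for illustration only:

```python
# Hypothetical mapping from the numerical range of the quotient of the
# attenuation factor and the enhancement factor to an abnormal probability;
# the interval bounds A_i, B_i and the probabilities are placeholder values.
INTERVALS = [((0.0, 0.3), 0.5), ((0.3, 0.6), 0.6), ((0.6, 1.0), 0.7)]

def single_abnormal_probability(a, b, intervals=INTERVALS, p_outside=0.1):
    """Step 6021: look up the abnormal probability of a pixel unit from a
    pre-established mapping between value ranges and probabilities."""
    r = b / a if a != 0 else float("inf")
    for (lo, hi), p in intervals:
        if lo <= r < hi:
            return p
    return p_outside

def is_abnormal(prob, threshold=0.65):
    """Step 6022: compare the abnormal probability with a preset threshold."""
    return prob >= threshold
```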
  • the abnormality probability described in step 6021 may be a single abnormality probability determined from the numerical range of the enhancement factor and the attenuation factor obtained from the image collected by the lens, that is, the abnormal probability of the pixel unit determined as described above.
  • the abnormality probability may also be a cumulative abnormality probability within a preset time period, that is, a historical cumulative value of multiple single abnormal probabilities calculated within the preset time period. Consider a preset time period comprising multiple sub-periods t1, t2, ...
  • in sub-period t1, based on the images collected in t1, a first single abnormal probability P1 of the pixel units corresponding to the multiple photosensitive areas of the lens is obtained; in sub-period t2, based on the images collected in t2, a second single abnormal probability P2 of the pixel units corresponding to the multiple photosensitive areas of the lens is obtained.
  • the second single abnormality probability P2 may be directly used as the abnormality probability of the pixel units corresponding to the multiple photosensitive areas of the lens for period t2.
  • because the probability of a single abnormality may be affected by temporary factors, however, it can carry a large error.
  • for example, dust temporarily occluding the lens may produce a high single abnormality probability, so determining the abnormal photosensitive area of the lens based only on the single abnormality probability may be inappropriate, because the dust is likely to no longer block the lens once the position and posture of the lens change. Therefore, in some embodiments, the cumulative abnormal probability corresponding to period t2 may be determined based on the cumulative abnormal probability of the historical period before t2; that is, the product P1 × P2 of P1 and P2 is taken as the abnormal probability of the pixel units corresponding to the multiple photosensitive areas of the lens for period t2. For subsequent moments, similarly, the abnormal probability is determined from the single abnormal probability calculated in the current period and the cumulative abnormal probability determined over the historical periods before that period.
  • the cumulative abnormal probability may be determined in the following manner:
  • Step 6023 based on the numerical range of the enhancement factor and the attenuation factor of the images captured within the preset time period, determine the abnormal probability of a plurality of photosensitive regions within the preset time period;
  • Step 6024 based on the abnormal probability within the preset time period and the accumulated abnormal probability determined before the preset time period, update the cumulative abnormal probability of multiple photosensitive regions.
  • the above determination of the cumulative abnormal probability is only an exemplary illustration; other methods can also be used, for example, jointly determining the cumulative abnormality probability of the preset time period from all single abnormality probabilities determined before the preset time period together with the single abnormality probability determined within it, which is not limited in this embodiment of the present application.
  • updating the cumulative abnormal probability of multiple photosensitive regions can be achieved in the following manner: multiply the abnormal probability of the preset time period by the cumulative abnormal probability determined before the preset time period, and take the product as the updated cumulative abnormal probability.
  • alternatively, updating the cumulative abnormal probability of multiple photosensitive regions can be achieved in the following manner: perform a logarithmic operation on the abnormal probability of the preset time period, add it to the logarithmic-form cumulative abnormal probability from before the preset time period, and use the sum as the updated cumulative abnormal probability.
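The two update forms just described, multiplication and the equivalent addition in logarithmic form, can be sketched as:

```python
import math

def update_cumulative(cumulative, p):
    """Multiplicative form: the updated cumulative abnormal probability is
    the product of the previous cumulative value and the single abnormal
    probability determined for the current preset time period."""
    return cumulative * p

def update_cumulative_log(log_cumulative, p):
    """Logarithmic form: add log(p) to the running logarithmic sum, which
    is numerically safer when many small probabilities are accumulated."""
    return log_cumulative + math.log(p)
```

Exponentiating the logarithmic form recovers exactly the multiplicative form, which is why the text presents them as interchangeable.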
  • in the notation of the accumulation formula, P(n) represents the probability that pixel unit n has an abnormal condition such as dirt or occlusion;
  • n is the number of the pixel unit, serving as an identifier;
  • z1:t-1 represents the brightness measurements of the pixel unit from the start time to time t-1;
  • zt represents the brightness measurement of the pixel unit at time t.
  • the abnormal photosensitive area of the lens can be determined based on the cumulative abnormal probability determined above, or based on an update strategy derived from the cumulative abnormal probability.
  • the probability that the lens area corresponding to a pixel unit of the image is abnormal can be determined by the update strategy given by formula (13);
  • in formula (13), lmax and lmin are a preset maximum probability value and a preset minimum probability value, respectively.
  • the reason for having a preset maximum probability value and minimum probability value is that the accumulated value would otherwise keep increasing as the number of accumulations grows; since probability is a quantity with mathematical meaning, it cannot be allowed to become infinitely large through accumulation, so the preset maximum and minimum probability values keep the final probability value meaningful.
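Formula (13) itself is not reproduced in this translation; the sketch below shows one common way such a clamped update can be written (a log-odds style update bounded by lmax and lmin), so both the exact update rule and the bound values are assumptions for illustration:

```python
import math

def clamped_update(l_prev, l_meas, l_min=-2.0, l_max=4.0):
    """Add the new measurement's contribution to the running value, then
    clamp the result to [l_min, l_max] so that repeated accumulation
    cannot grow without bound, keeping the final value meaningful.
    The bounds l_min and l_max here are placeholder values."""
    return max(l_min, min(l_max, l_prev + l_meas))

def to_probability(l):
    """Map the clamped log-odds value back to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-l))
```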
  • the prompt information described in step 603 may be prompt information in various forms.
  • the prompt information may be a probability map composed of the abnormal probabilities of the plurality of photosensitive regions, where the abnormal probability may be the single abnormal probability described above or the cumulative abnormal probability described above, which is not limited in this embodiment of the present application.
  • FIG. 8 presents an exemplary probability map constructed from the cumulative abnormal probabilities of the plurality of photosensitive regions.
  • the higher the gray value, the higher the cumulative abnormal probability of the corresponding pixel unit. As this exemplary probability map shows, when the prompt information is a probability map, the user or staff can intuitively see the abnormal probabilities of the pixel units corresponding to the multiple photosensitive regions of the lens.
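A probability map like the one described can be rendered by mapping each region's abnormal probability to a gray value, brighter meaning more likely abnormal:

```python
import numpy as np

def probability_map(P):
    """Render per-region abnormal probabilities (values in [0, 1]) as an
    8-bit grayscale image in which a higher gray value means a higher
    cumulative abnormal probability for the corresponding pixel unit."""
    scaled = np.asarray(P, dtype=float) * 255
    return np.clip(np.rint(scaled), 0, 255).astype(np.uint8)
```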
  • the prompt information can also take other forms, for example, a prompt image marking the abnormal photosensitive area of the lens, or a prompt sound; this embodiment of the present application places no restriction on the specific form of the prompt information.
  • the method further includes: sending the prompt information to a display of the client device and/or the service management center for a displayed reminder, or sending the prompt information to a sounding device of the client device and/or the service management center, such as a speaker, for an audible reminder.
  • by actively sending the prompt information to the client device or the service management center for a displayed or audible reminder, the user or the staff of the service management center can discover in time that the lens has an abnormal photosensitive area and deal with the abnormal state, avoiding the occurrence of safety accidents.
  • in addition to the lens used to obtain depth information of the scene where the movable platform is located, the movable platform is usually also equipped with a main camera, which is used to collect images of the scene where the movable platform is located for the user to view.
  • the images containing depth information collected by the lens are usually not displayed to the user.
  • the method may further include: replacing, on the display of the movable platform, the displayed image captured by the main camera with the prompt information, so as to remind the user that the lens has an abnormal photosensitive area.
  • the above-mentioned embodiments provide a first method for prompting an abnormality of a lens, wherein the lens is installed on a movable platform and is used to provide the movable platform with depth information of the scene in which it is located.
  • the embodiment of the present application also provides a third method for prompting abnormality of a lens as shown in FIG. 9 , and the method for prompting abnormality of a lens includes:
  • Step 901 acquiring an image captured by the lens
  • Step 902 acquiring the brightness values of multiple pixel units of the image, where the brightness values of the multiple pixel units are collected from the environment by multiple photosensitive areas of the lens;
  • Step 903 based on the brightness values of the plurality of pixel units, obtain an enhancement factor and an attenuation factor of the photosensitive area of the lens corresponding to any of the pixel units, wherein the enhancement factor represents the degree of enhancement of the brightness value of the pixel unit corresponding to the photosensitive area relative to an estimated brightness value, the attenuation factor represents the degree of attenuation of the brightness value of the pixel unit corresponding to the photosensitive area relative to the estimated brightness value, and the estimated brightness value represents the brightness value of the corresponding pixel unit that would be collected from the environment when the photosensitive area is not abnormal.
  • Step 904 based on the numerical range of the enhancement factor and the attenuation factor, determine an abnormal photosensitive area in the plurality of photosensitive areas of the lens;
  • Step 905 generating prompt information based on the position of the abnormal photosensitive area.
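Steps 901 to 905 can be strung together in a simplified end-to-end sketch. To stay short, the fitted estimate and probability machinery of the earlier steps are replaced here by a local-contrast check (a tile whose gradient energy falls far below the image-wide level behaves like a region with a fixed patch of texture); the tile size and thresholds are illustrative assumptions, not values from the text:

```python
import numpy as np

def lens_abnormality_prompt(I, block=4, atten_threshold=0.5):
    """Simplified pipeline over an image I captured by the lens (step 901):
    read the pixel-unit luminance values (step 902), compare each tile's
    local contrast against the image-wide level as a stand-in for the
    enhancement/attenuation factors (steps 903-904), and generate prompt
    information from the positions of the flagged regions (step 905)."""
    gy, gx = np.gradient(I.astype(float))
    g = np.hypot(gx, gy)                      # per-pixel gradient magnitude
    global_level = g.mean() + 1e-9            # image-wide contrast level
    h, w = I.shape
    abnormal = []
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = g[y:y + block, x:x + block]
            a_local = tile.mean() / global_level   # local "enhancement factor"
            if a_local < atten_threshold:          # texture strongly attenuated
                abnormal.append((y, x))            # top-left pixel of region
    if abnormal:
        return f"lens abnormality at photosensitive regions: {abnormal}"
    return "lens normal"
```

A flat patch pasted over an otherwise textured image is flagged, while a uniformly textured image is reported normal.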
  • the method can be applied to abnormality prompting for a lens that provides depth information of the scene where the movable platform is located, as described in the foregoing embodiments, and can also be applied to abnormality prompting for other lenses (such as mobile-phone lenses, SLR lenses, etc.); the detection method does not limit the specific type of the lens.
  • the lens is mounted on a movable platform.
  • the movable platform includes at least one of the following: drones, unmanned vehicles and intelligent robots.
  • the images are acquired with the lenses in different positions and/or poses.
  • the method further comprises: controlling movement of the movable platform, or of a gimbal (pan/tilt) or robotic arm connected to the lens, so as to adjust the position and/or attitude of the lens.
  • step 701 further includes: detecting whether the lens has undergone a position and/or attitude change greater than a threshold, so as to acquire the image captured by the lens when the position and/or attitude change is greater than the threshold.
  • the pixel units are pixel units that do not include a sky area.
  • step 903 obtaining the enhancement factor and attenuation factor of the photosensitive area of the lens corresponding to any one of the pixel units can be achieved in the following ways:
  • Step 9031 according to the obtained luminance values of the plurality of pixel units, obtain the estimated luminance value of each pixel unit through a preset fitting algorithm
  • Step 9032 Determine the enhancement factor and the attenuation factor based on the acquired luminance values of a plurality of pixel units and the estimated luminance distribution of the image generated based on the estimated luminance values of each pixel unit.
  • the enhancement factor may be determined according to the quotient between the mean value of the first parameter and the mean value of the second parameter; and/or the attenuation factor may be determined according to the difference between the third parameter and the fourth parameter.
  • the first parameter is the gradient of the luminance values of the plurality of pixel units;
  • the second parameter is the gradient of the estimated luminance values of the plurality of pixel units;
  • the third parameter is the mean value of the luminance values of the plurality of pixel units;
  • the fourth parameter is the product of the mean value of the estimated luminance values of the plurality of pixel units and the enhancement factor.
  • the preset fitting algorithm described in step 9031 is a Random Sample Consensus (RANSAC) algorithm.
  • the numerical range of the enhancement factor and the attenuation factor is the numerical range of the quotient of the attenuation factor and the enhancement factor.
  • step 904 determines an abnormal photosensitive area in the plurality of photosensitive areas of the lens, including:
  • Step 9041 based on the numerical range of the enhancement factor and the attenuation factor, determine the abnormal probability of each photosensitive area;
  • Step 9042 Determine whether each photosensitive area is abnormal according to the relationship between the abnormal probability and a preset threshold.
  • the abnormality probability described in step 9041 may likewise be a cumulative abnormality probability within a preset time period.
  • the abnormal cumulative probability may be determined by:
  • Step 9043 based on the numerical range of the enhancement factor and attenuation factor of the images captured within the preset time period, determine the abnormal probability of each photosensitive area within the preset time period;
  • Step 9044 Update the cumulative abnormal probability of each photosensitive area based on the abnormal probability in the preset time period and the cumulative abnormal probability determined before the preset time period.
  • the abnormal probability of the preset time period is the accumulated abnormal probability.
  • in step 9044, the cumulative abnormal probability of each photosensitive area is updated in the manner described in the foregoing embodiments.
  • the prompt information in step 905 includes at least one of the following: a probability map presenting the abnormal probabilities of the plurality of photosensitive regions, such as the image shown in FIG. 8; or a prompt image or prompt sound marking the abnormal photosensitive area.
  • the third method for prompting lens abnormality provided by the embodiments of the present application further includes: sending the prompt information to a display of the client device and/or the service management center for a displayed reminder; and/or sending the prompt information to a sounding device of the client device and/or the service management center for an audible reminder.
  • the third method for prompting an abnormality of a lens provided by the embodiments of the present application is similar to the first method for prompting an abnormality of a lens described above; for the relevant content of each embodiment, reference may be made to the corresponding description above.
  • in this method, an image collected by the lens for which an abnormality prompt is to be made is acquired, an enhancement factor and an attenuation factor are obtained from the luminance values of multiple pixel units of the image, the abnormal photosensitive area of the lens is then determined based on the numerical range of the enhancement factor and the attenuation factor, and prompt information is generated. This makes it possible to determine simply and quickly whether the lens has an abnormal photosensitive area caused by abnormal conditions such as dirt or occlusion, as well as the specific location of that area, so that prompt information can be generated to remind users or managers to deal with the abnormal photosensitive area, overcoming the low efficiency and high error rate of the manual detection used in the related art.
  • an embodiment of the present application further provides a device corresponding to the method for prompting abnormality of a lens.
  • the device includes a memory 1001, a processor 1002, and a computer program stored in the memory and executable on the processor; when the processor executes the program, any method embodiment provided by the embodiments of the present application is implemented.
  • the memory 1001 may be an internal storage unit of an electronic device, such as a hard disk or a memory of the electronic device.
  • the memory 1001 can also be an external storage device of the electronic device, such as a plug-in hard disk, a smart memory card (Smart Media Card, SMC), a Secure Digital (Secure Digital, SD) card equipped on the electronic device, Flash card (Flash Card) and so on. Further, the memory 1001 may also include both an internal storage unit of the electronic device and an external storage device. The memory is used to store the computer program and other programs and data required by the electronic device. The memory may also be used to temporarily store data that has been output or is to be output. When the program stored in the memory is executed, the processor 1002 calls the program stored in the memory 1001 to execute the methods of the foregoing embodiments, which have been described in detail above and will not be repeated here.
  • an embodiment of the present application also provides a movable platform, and a schematic diagram of its hardware structure is shown in FIG. 11 .
  • the movable platform includes a lens 1101, a power system 1102, a memory 1103 and a processor 1104.
  • the lens 1101 is used to collect a depth map of the environment where the movable platform is located, providing environmental information for the movement of the movable platform;
  • the power system 1102 is used to provide power for the movement of the movable platform;
  • the memory 1103 is used to store program code;
  • the processor 1104 calls the program code and, when the program code is executed, implements the methods of the foregoing embodiments, which have been described in detail above and are not repeated here.
  • the embodiments of the present application also provide a computer-readable storage medium storing a computer program which, when executed by a processor, implements all the embodiments of the above methods of the present application; these are not repeated here.
  • the computer-readable storage medium may be an internal storage unit of the electronic device, such as a hard disk or a memory of the electronic device.
  • the computer-readable storage medium may also be an external storage device of the electronic device, such as a plug-in hard disk, a smart memory card (Smart Media Card, SMC), a Secure Digital (SD) card, a flash card (Flash Card), etc. equipped on the electronic device.
  • the computer-readable storage medium may also include both an internal storage unit of the electronic device and an external storage device.
  • the computer-readable storage medium is used to store the computer program and other programs and data required by the electronic device.
  • the computer-readable storage medium can also be used to temporarily store data that has been or will be output.
  • the storage medium may be a magnetic disk, an optical disk, a read-only memory (Read-Only Memory, ROM), or a random access memory (Random Access Memory, RAM) or the like.

Abstract

A lens abnormality prompting method, comprising: acquiring an image captured by the lens (step 901); acquiring the brightness values of multiple pixel units of the image (step 902); based on the brightness values of the multiple pixel units, obtaining an enhancement factor and an attenuation factor of the photosensitive area of the lens corresponding to any pixel unit (step 903), the enhancement factor representing the degree of enhancement of the brightness value of the pixel unit corresponding to the photosensitive area relative to an estimated brightness value, the attenuation factor representing the degree of attenuation of the brightness value of the pixel unit corresponding to the photosensitive area relative to the estimated brightness value, and the estimated brightness value representing the brightness value of the corresponding pixel unit collected from the environment when the photosensitive area is not abnormal; based on the numerical range of the enhancement factor and the attenuation factor, determining an abnormal photosensitive area among the multiple photosensitive areas of the lens (step 904); and generating prompt information based on the position of the abnormal photosensitive area (step 905).

Description

Lens Abnormality Prompting Method and Device, Movable Platform, and Readable Storage Medium

Technical Field

The present application relates to the field of intelligent detection, and in particular to a lens abnormality prompting method and device, a movable platform, and a computer-readable storage medium.

Background

In the field of intelligent perception, a movable platform equipped with lenses such as binocular cameras can acquire information about the surrounding environment based on those lenses, and can thereby achieve intelligent motion, intelligent navigation, and the like. The lens serves as the "visual organ" of the movable platform; if part of the lens is dirty or occluded, the light sensitivity of that area becomes abnormal, and some pixel units of the captured images may show a fixed patch of texture, which affects the accurate collection of surrounding environment information by the movable platform and thus creates potential safety hazards.

In the related art, whether a lens on a movable platform is abnormal usually has to be checked actively by staff or users, which is poorly automated and inefficient. In addition, for small lenses, the miss rate is high due to their small size.
Summary of the Invention

To overcome the defects of low intelligence, low efficiency and high miss rate in the related art, the present application provides a lens abnormality prompting method and device, a movable platform, and a computer-readable storage medium.

According to a first aspect of the embodiments of the present application, a lens abnormality prompting method is provided, the method comprising: acquiring an image captured by the lens; acquiring brightness values of multiple pixel units of the image, the brightness values of the multiple pixel units being collected from the environment by multiple photosensitive areas of the lens; based on the brightness values of the multiple pixel units, obtaining an enhancement factor and an attenuation factor of the photosensitive area of the lens corresponding to any of the pixel units, wherein the enhancement factor represents the degree of enhancement of the brightness value of the pixel unit corresponding to the photosensitive area relative to an estimated brightness value, and the attenuation factor represents the degree of attenuation of the brightness value of the pixel unit corresponding to the photosensitive area relative to the estimated brightness value, the estimated brightness value representing the brightness value of the corresponding pixel unit collected from the environment when the photosensitive area is not abnormal; based on the numerical range of the enhancement factor and the attenuation factor, determining an abnormal photosensitive area among the multiple photosensitive areas of the lens; and generating prompt information based on the position of the abnormal photosensitive area.

According to a second aspect of the embodiments of the present application, another lens abnormality prompting method is provided, the method comprising: acquiring an image captured by the lens; based on the pixel values of multiple pixel units of the image, detecting whether the light-transmitting assembly or the photosensitive device of the lens is occluded by a foreign object; and generating prompt information when such occlusion is detected.

According to a third aspect of the embodiments of the present application, a lens abnormality prompting device is provided, the device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the methods of the first and second aspects of the embodiments of the present application when executing the program.

According to a fourth aspect of the embodiments of the present application, a movable platform is provided, comprising: a lens, a power system, a memory and a processor; the lens is used to capture a depth map of the environment where the movable platform is located, providing environmental information for the movement of the movable platform; the power system is used to provide power for the movement of the movable platform; the memory is used to store program code; and the processor calls the program code and, when the program code is executed, implements the steps of the methods of the first and second aspects of the embodiments of the present application.

According to a fifth aspect of the embodiments of the present application, a computer-readable storage medium is provided, on which several computer instructions are stored, the computer instructions, when executed, implementing the steps of the methods of the first and second aspects of the embodiments of the present application.

The technical solutions provided by the embodiments of the present application may include the following beneficial effects:

By acquiring an image collected by the lens for which an abnormality prompt is to be made, obtaining an enhancement factor and an attenuation factor from the brightness values of multiple pixel units of the image, and then determining the abnormal photosensitive area of the lens based on the numerical range of the enhancement factor and the attenuation factor and generating prompt information, it can be determined simply and quickly whether the lens has an abnormal photosensitive area caused by abnormal conditions such as dirt or occlusion, together with the specific location of that area, so that prompt information can be generated to remind users or managers to deal with the abnormal photosensitive area, overcoming the low efficiency and high error rate of the manual detection used in the related art.

It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit this specification.
Brief Description of the Drawings

In order to explain the technical solutions in the embodiments of the present disclosure more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present disclosure; for a person of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.

FIG. 1 is a schematic diagram of a depth map captured by a binocular camera according to an exemplary embodiment of the present application.

FIG. 2 is a schematic diagram of an image of the scene captured by a movable platform according to an exemplary embodiment of the present application.

FIG. 3 is a flowchart of a first lens abnormality prompting method according to an exemplary embodiment of the present application.

FIG. 4 is a flowchart of a second lens abnormality prompting method according to an exemplary embodiment of the present application.

FIG. 5A is an image captured by a left-eye camera occluded by a drone tripod according to an exemplary embodiment of the present application.

FIG. 5B is an image captured by a right-eye camera occluded by a drone tripod according to an exemplary embodiment of the present application.

FIG. 5C is an image captured by a camera occluded by the protective cover of a drone gimbal according to an exemplary embodiment of the present application.

FIG. 6 is a flowchart of a method for determining an abnormal photosensitive area based on the brightness values of multiple pixel units according to an exemplary embodiment of the present application.

FIG. 7 is a schematic diagram of the principle of determining an abnormal photosensitive area based on the brightness values of multiple pixel units according to an exemplary embodiment of the present application.

FIG. 8 is a schematic diagram of a probability map composed of the cumulative abnormal probabilities of multiple photosensitive areas according to an exemplary embodiment of the present application.

FIG. 9 is a flowchart of a third lens abnormality prompting method according to an exemplary embodiment of the present application.

FIG. 10 is a schematic diagram of the hardware structure of a lens abnormality prompting device according to an exemplary embodiment of the present application.

FIG. 11 is a schematic diagram of the hardware structure of a movable platform according to an exemplary embodiment of the present application.
Detailed Description

Exemplary embodiments will be described in detail here, examples of which are shown in the accompanying drawings. When the following description refers to the drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application; rather, they are merely examples of devices and methods consistent with some aspects of the present disclosure as detailed in the appended claims.

The terms used in this application are for the purpose of describing particular embodiments only and are not intended to limit the present disclosure. The singular forms "a", "said" and "the" used in this specification and the appended claims are also intended to include the plural forms, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.

It should be understood that although the terms first, second, third and so on may be used in this application to describe various pieces of information, such information should not be limited to these terms. These terms are only used to distinguish information of the same type from one another. For example, without departing from the scope of the present disclosure, first information may also be called second information, and similarly, second information may also be called first information. Depending on the context, the word "if" as used herein may be interpreted as "when", "upon" or "in response to determining".

Humanity is entering the information age, and computer-based intelligent computation is being applied ever more widely in various fields. Computer vision, an important field among them, is also developing rapidly. Computer vision relies on an imaging system as its "visual organ" to collect image information, which is then analyzed and processed by a computer, so that the computer can "perceive" from the collected images as a human does, obtaining useful information and putting it to use.

Take a movable platform equipped with a stereo vision system as an example. Based on two cameras, the stereo vision system takes two photographs of the scene at the same moment from different angles. By analyzing the differences between the two photographs, and by exploiting the positional and angular relationship between the two cameras through triangulation, the distances between objects in the scene and the binocular camera can be computed. FIG. 1 shows a depth map obtained from the computed distance relationships, in which the gray level of each pixel unit represents its distance from the binocular camera. Based on the obtained distance relationships of the objects in the scene, the movable platform can intelligently perform obstacle avoidance, motion-path planning and so on, achieving intelligent perception.

However, if the lens of the imaging system, its "visual organ", suffers an abnormal condition such as dirt or occlusion, a fixed patch of texture will appear in the captured images; that is, the brightness values of some pixel units will be abnormal, which in turn affects the results of computer-vision computations, causes false or missed detections, and may even lead to safety accidents.

FIG. 2 shows a scene image of the area ahead of the flight direction captured during flight by a binocular camera mounted on a movable platform, a drone. Region 201 is a fixed patch of texture in the captured image caused by an abnormal condition such as dirt or occlusion on the right-eye camera of the binocular camera. The presence of an abnormal photosensitive area on the right-eye camera means that, when shooting with the binocular camera, the true distance information of objects in the part of the scene corresponding to this region cannot be obtained, thereby creating a safety hazard for the flight control of the drone.

In the related art, whether a lens is abnormal usually has to be checked actively by staff or users. Once the abnormality check of the lens is forgotten before the movable platform starts moving, a safety hazard is created. Moreover, manual inspection is poorly automated and inefficient for batch lens inspection. In addition, as devices trend toward miniaturization, lenses are becoming smaller and smaller, while the resolution of the human eye is limited, so missed and false detections often occur during manual lens inspection.

To overcome the defects of the related art, embodiments of the present application provide a lens abnormality prompting method, wherein the lens comprises at least a light-transmitting assembly and a photosensitive device. The light-transmitting assembly may be a glass light-transmitting assembly, a plastic light-transmitting assembly, or another light-transmitting assembly used to transform optical signals, which is not limited by the present application. The photosensitive device may be any of various sensors, such as a CMOS sensor or a CCD sensor, or another photosensitive device, used to convert optical signals into electrical signals.
如图3所示,为本申请实施例所提供的第一种镜头异常提示方法,所述方法包括如下步骤:
步骤301,获取所述镜头采集的图像;
步骤302,基于所述图像的多个像素单元的亮度值,检测所述镜头是否存在异常感光区域,多个所述像素单元的亮度值由所述镜头的多个感光区域对环境采集得到;
步骤303,当存在所述异常感光区域时生成提示信息。
在一些实施例中,所述镜头被安装在可移动平台上,为所述可移动平台提供所处场景的深度信息。例如,可以为双目摄像头或者多目摄像头中的任一个或以上的摄像头,用于获取所述可移动平台所述场景的深度信息。
在一些实施例中,所述可移动平台至少包括以下之一:无人机、无人汽车、智能机器人等等。当然,本领域技术人员应当理解,所述可移动平台还可以是其他可以移动的、能够搭载其他设备的类型,本申请实施例对所述可移动平台的具体类型不做限制。
在一些实施例中,提供另一种如图4所示的镜头异常提示方法,所述方法包括:
步骤401,获取所述镜头采集的图像;
步骤402,基于所述图像的多个像素单元的像素值,检测所述镜头的透光组件或者感光器件是否受到异物遮挡;
步骤403,当检测到有所述异物遮挡时生成提示信息。
在一些实施例中,所述像素值包括作为分量的亮度值,所述方法还包括:
步骤4021,基于所述图像的多个像素单元的亮度值确定所述镜头的透光组件或者感光器件是否存在异常感光区域。当然,本领域技术人员应当理解,除了根据多个像素单元的亮度值确定所述异常感光区域,还可以通过所述像素值的其他分量,例如RGB分布等等,来确定所述镜头的透光组件或者感光器件是否存在异常感光区域,本申请实施例对此不做限制。
在一些实施例中,所述镜头被安装在可移动平台上,用于为所述可移动平台提供所处场景的深度信息。
对于可移动平台来说,在可移动平台在移动过程中,可能会因为环境中的浮尘、动物排泄物等等的粘附,以及一些外部设备组件的位置不当,例如,在某些产品中,为了避免镜头长期暴露在空间环境中,导致灰尘累积、镜头受污等情况,所述镜头在未被使用时,常常处于被镜头保护罩保护的状态,在所述镜头被使用时,所述保护罩被打开。在保护罩被打开的过程中,保护罩可能会打开不充分,进而导致所述镜头的透光组件或者感光器件被遮挡,造成利用所述镜头采集所述可移动平台周围的环境信息时,采集不充分,带来安全风险。
故在一些实施例中,步骤402中所述镜头的透光组件或者感光器件是否受到异物遮挡,包括:所述透光组件或所述感光器件上是否存在粘附物,以及,所述透光组件或所述感光器件是否被外部组件遮挡。
在一些实施例中,所述外部组件可以包括以下一种或多种:保护罩,可移动平台的支架或动力组件。所述保护罩可以为所述镜头自身的保护罩,也可以为所述镜头所搭载的可移动平台上的保护罩。所述可移动平台的支架可以为脚架、可折叠支撑台等等。所述动力组件可以为所述所述可移动平台的运动臂,例如无人机的机臂,飞行桨,智能机器人的四肢组件等等。当然,以上各个例子仅为示例性说明,并非穷举,所述外部组件还可以是其他类型的外部组件,本申请实施例对此不做限制。
在一些实施例中,所述可移动平台可以为无人机、无人汽车和智能机器人等等,本申请实施例对此也不做限制。
在一些实施例中,所述可移动平台为无人机,所述外部组件可以为所述无人机的机臂,所述无人机的飞行桨保护罩,以及所述无人机的云台的保护罩,本申请实施 例对此不做限制。
The following description is given with reference to FIGS. 5A, 5B and 5C. FIGS. 5A and 5B are images captured by an unmanned aerial vehicle with its left-eye and right-eye cameras, respectively, where the regions indicated by 501 and 502 are a landing-gear bracket of the unmanned aerial vehicle that was not placed in the correct position, forming an occlusion on the left-eye and right-eye cameras. Because of this occlusion, the captured images contain abnormal photosensitive regions at 501 and 502. When, based on the methods provided by the above embodiments of the present application, foreign-object blockage is detected from the pixel values of the plurality of pixel units of the captured images and prompt information is generated, safety hazards caused by errors in the captured depth map can be avoided.
FIG. 5C is an image captured by one of the binocular cameras of an unmanned aerial vehicle, where the regions indicated by 503a to 503c were formed because, for some reason, the gimbal protective cover was not fully opened during the opening process, occluding the lens; regions 503a to 503c are abnormal photosensitive regions. Likewise, when foreign-object blockage is detected from the pixel values of the plurality of pixel units of the captured image and prompt information is generated based on the methods provided by the above embodiments, safety hazards caused by errors in the captured depth map can be avoided.
In some embodiments, the method described above further includes: when it is detected that the light-transmitting component or the photosensitive device of the lens is blocked by a foreign object, controlling the external component to move.
By controlling the external component to move, it can be determined whether the external component is the source of the lens occlusion. When it is, and the preset range of the motion control is appropriate, the occlusion can be eliminated. Even if the occlusion cannot be eliminated by controlling the external component, a preset alarm or other means may be used to remind the user that the light-transmitting component or the photosensitive device of the lens is blocked by a foreign object; the component may even be controlled to move further to eliminate the occlusion automatically.
The image acquired in steps 301 and 401 may be a single image captured by the lens or a plurality of images. When it is a single image, those skilled in the art should understand that, in the embodiments below for determining whether the lens has an abnormal photosensitive region, the average of the brightness values and the average of the brightness-value gradients are simply the brightness values and the brightness-value gradients of that image.
When the lens captures a plurality of images with identical content, the amount of information they provide about the abnormal photosensitive regions of the lens is close to what a single image can provide. Therefore, to provide richer information for determining the abnormal photosensitive regions of the lens, in some embodiments the images acquired in steps 301 and 401 are captured with the lens in different positions and/or attitudes. The different positions and/or attitudes may be achieved by controlling the position and/or attitude of the movable platform on which the lens is mounted, or of the lens itself; the embodiments of the present application do not limit this.
It can be seen from the above embodiments that when the acquired images are captured with the lens in different positions and/or attitudes, different images have different content. If the lens indeed has an abnormal photosensitive region, the brightness values of the same pixel unit across images with different content will exhibit a similar distribution, from which the abnormal state of the corresponding photosensitive region of the lens can be determined. This avoids the misjudgment that can occur when the abnormal photosensitive region is determined from one image, or several images with identical content, whose information is limited.
In some embodiments, to acquire images captured with the lens in different positions and/or attitudes, steps 301 and 401 of the lens abnormality prompting method provided by the embodiments of the present application further include the following step:
Step 3011: controlling the movable platform, or a gimbal or mechanical arm connected to the lens, to move so as to adjust the shooting position and/or attitude of the lens.
Of course, the above implementations of placing the lens in different positions and/or attitudes are merely illustrative; those skilled in the art may, depending on the specific structures of the movable platform and the lens, place the lens to be inspected in different positions and/or attitudes in other ways, and the embodiments of the present application do not limit this either.
When a plurality of images is acquired through the above embodiments, if the position and/or attitude of the lens changes only slightly, the acquired images are likely to have similar content, so the information they provide is still limited, while storing multiple similar images consumes storage space. Therefore, to acquire images carrying useful information as far as possible while saving storage space, in some embodiments the lens abnormality prompting method provided by the embodiments of the present application further includes, after step 3011, the following step:
Step 3012: detecting whether the position and/or attitude change of the lens exceeds a threshold, so that the image captured by the lens is acquired only when the change exceeds the threshold.
It can be seen from the above embodiment that, in the process of acquiring images captured with the lens in different positions and/or attitudes, detecting whether the position and/or attitude change of the lens exceeds a threshold ensures that an image is acquired only when the change exceeds the threshold, which preserves the richness of the information the images provide while saving storage space.
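The gating logic of step 3012 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the pose representation (position plus yaw angle), the distance metric, and the thresholds are all assumed values; on a real platform the pose change would come from GPS, an IMU or VIO.

```python
import math

# Acquire a new image only when the lens has moved or rotated past a
# threshold since the last capture (step 3012). Thresholds are illustrative.
def should_capture(last_pose, pose, pos_thresh=0.5, yaw_thresh=15.0):
    (x0, y0, z0, yaw0), (x1, y1, z1, yaw1) = last_pose, pose
    moved = math.dist((x0, y0, z0), (x1, y1, z1)) > pos_thresh
    # Wrap the yaw difference into [-180, 180] degrees before comparing.
    dyaw = (yaw1 - yaw0 + 180.0) % 360.0 - 180.0
    return moved or abs(dyaw) > yaw_thresh

print(should_capture((0, 0, 0, 0), (0.1, 0, 0, 5)))    # → False
print(should_capture((0, 0, 0, 0), (0.1, 0, 0, 170)))  # → True
```

A capture loop would call `should_capture` on each pose update and store the frame (and the new reference pose) only when it returns `True`.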
In some embodiments, the change in the position and attitude of the lens may be determined by one or more of a Global Positioning System (GPS), an inertial measurement unit (IMU) and a visual-inertial odometer (VIO) carried with the lens. Determining the position and attitude change based on these can be implemented with reference to the related art and is not elaborated here.
Of course, those skilled in the art should understand that the detection of the position and attitude change of the lens may also be implemented in other ways; the embodiments of the present application do not limit this, and the approaches given above are merely illustrative.
For step 302 — determining from the brightness values of the plurality of pixel units whether the lens has an abnormal photosensitive region — and step 4021 — determining from those brightness values whether the light-transmitting component or the photosensitive device of the lens has an abnormal photosensitive region: when the image contains a sky region, the brightness distribution of the sky can differ markedly from that of the other objects captured by the lens, and failing to exclude the sky region may cause a wrong judgment about whether the lens has an abnormal photosensitive region. Therefore, in some embodiments, the pixel units are pixel units that do not include a sky region.
Whether a pixel unit belongs to a sky region can be judged in many ways; for example, a semantic parsing algorithm based on a deep learning network can detect the sky region in the image. Of course, those skilled in the art should understand that other methods may also determine whether a pixel unit belongs to the sky region; when it does, the pixel unit is excluded and not used to determine the abnormal photosensitive region of the lens.
Step 302 and step 4021 may be implemented in many ways, and the embodiments of the present application do not limit this.
In some embodiments, step 302 of determining, based on the brightness values of the plurality of pixel units of the image, whether the lens has an abnormal photosensitive region may be implemented by the method shown in FIG. 6, which includes the following steps:
Step 601: obtaining, based on the brightness values of the plurality of pixel units, an enhancement factor and an attenuation factor of the photosensitive region of the lens corresponding to any pixel unit, where the enhancement factor characterizes the degree of enhancement of the brightness value of the pixel unit corresponding to the photosensitive region relative to an estimated brightness value, the attenuation factor characterizes the degree of attenuation of that brightness value relative to the estimated brightness value, and the estimated brightness value characterizes the brightness value of the corresponding pixel unit obtained by capturing the environment when the photosensitive region is not abnormal;
Step 602: determining, based on the value ranges of the enhancement factor and the attenuation factor, the abnormal photosensitive region among the plurality of photosensitive regions of the lens.
In some embodiments, step 403 of generating prompt information when the foreign-object blockage is detected includes: generating the prompt information based on the abnormal photosensitive region.
It should be noted that, for clarity, a "pixel unit" in the embodiments of the present application is a pixel unit of the image captured by the lens, and a "photosensitive region" is a region on the lens that corresponds one-to-one with a pixel unit of the image.
With reference to FIG. 7, the principle of determining the abnormal photosensitive regions of the lens from the brightness values of the plurality of pixel units using the above method is explained.
A lens 700 for capturing environmental information typically consists of a light shield 701, an optical lens 702 and a sensor 703. The light shield 701 sits at the front of the lens, protecting the optical lens and filtering out part of the stray light; the optical lens 702 optically images the environment; the sensor 703 converts the image formed by the optical lens from an optical signal into an electrical signal for subsequent image processing by a processor.
According to physical optics, in the ideal case an object point A in object space is imaged by the optical imaging system (the lens 700 in FIG. 7) to an image point B in image space, and the imaging process can be expressed as:
I(x, y) = I_0(x, y) * k(x, y)    (1)
where I_0(x, y) is the light intensity at object point A, I(x, y) is the light intensity at image point B, k(x, y) is the blur kernel characterizing the change in light intensity as the rays from the object point pass through the optical system, and (x, y) are the coordinates of the object point and the image point, centered on the optical axis of the lens.
In the real world, however, scattering in the components of the optical system causes part of the intensity from the object point to be imaged at positions other than the target image point, attenuating its intensity, while intensity from other, non-target object points is imaged at the target image point, enhancing it; ideal imaging therefore does not exist.
With reference to FIG. 7, in the real imaging process, when the lens 700 captures an image of its environment and an object point A at some position is imaged by the lens 700 with intensity I at the corresponding image point B, the relationship between I_0 and I can be written as:
I(x, y) = I_0(x, y) · [α(x, y) * k(x, y)] + I_α(x, y) * k(x, y)    (2)
where α(x, y) represents the degree of attenuation of the intensity at image point B caused by scattering points on the imaging path of object point A, which divert part of A's intensity to positions other than B; and I_α(x, y) represents intensity from other object points that is displaced by scattering points on their imaging paths and, acting as additional object points imaged through the lens 700, enhances the intensity at image point B by I_α(x, y) * k(x, y).
Simplifying formula (2) by letting a(x, y) = α(x, y) * k(x, y) and b(x, y) = I_α(x, y) * k(x, y) gives:
I(x, y) = I_0(x, y) · a(x, y) + b(x, y)    (3)
Here a(x, y) is the enhancement factor, characterizing the degree of enhancement, relative to the estimated brightness value, of the brightness value of the pixel unit corresponding to object point A (i.e., at image point B) in the image captured by the lens; b(x, y) is the attenuation factor, characterizing the degree of attenuation of that brightness value relative to the estimated brightness value. The estimated brightness value is the brightness value of the corresponding pixel unit obtained by capturing the environment when the photosensitive region of the lens has no photosensitivity abnormality.
When the lens has no scattering, i.e., under ideal imaging, a(x, y) = 1, indicating that no scattering points on the imaging path of object point A divert its intensity away from the corresponding image point B; and b(x, y) = 0, indicating that no intensity from other object points is scattered onto image point B.
When the lens suffers abnormal conditions such as dirt or occlusion, the scattering state of the affected photosensitive regions of the lens changes noticeably, which in turn changes the enhancement factor a(x, y) and the attenuation factor b(x, y) of those regions. For example, when the photosensitive region of the lens corresponding to object point A is dirty while the other regions are normal, the attenuation factor b(x, y) of the pixel unit corresponding to object point A becomes large while its enhancement factor a(x, y) barely changes. Whether that photosensitive region of the lens is abnormal can then be determined from the value range of the attenuation factor b(x, y) of that pixel unit, from the respective value ranges of its enhancement factor a(x, y) and attenuation factor b(x, y), or from the numerical relationship between them, such as their quotient.
Step 601 — obtaining, based on the brightness values of the plurality of pixel units, the enhancement factor and attenuation factor of the photosensitive region of the lens corresponding to any pixel unit — may be implemented in many ways, and the embodiments of the present application do not limit this.
In some embodiments, the enhancement factor and attenuation factor in step 601 may be determined by a pre-trained deep learning model. Developers may feed a selected deep learning model a large number of lens-captured images in which each pixel unit is annotated with its enhancement and attenuation factors, so that the model learns automatically: when an unannotated lens-captured image is input, the model outputs the enhancement and attenuation factors of each of its pixel units.
Of course, the enhancement factor and the attenuation factor may also be determined by algorithms other than deep learning, which are introduced below by way of example.
For an image captured by the lens, assume that the enhancement factor a(x, y) and the attenuation factor b(x, y) vary smoothly across the image, so that for adjacent pixels a(p_1) ≈ a(p_2) and b(p_1) ≈ b(p_2). Then:
∇a(x, y) ≈ 0, ∇b(x, y) ≈ 0    (4)
where ∇ is the gradient operator and p_1 and p_2 are the coordinates of two adjacent pixels.
Differentiating both sides of formula (3) then gives:
∇I(x, y) ≈ ∇I_0(x, y) · a(x, y)    (5)
Since this equation holds for all images captured by a lens with the same dirt or occlusion abnormality, it follows that:
AVG(|∇I(x, y)|) ≈ AVG(|∇I_0(x, y)|) · a(x, y)    (6)
where AVG denotes the average and | · | the absolute value.
By a derivation similar to the above:
AVG(I(x, y)) ≈ AVG(I_0(x, y)) · a(x, y) + b(x, y)    (7)
From formulas (6) and (7), finally,
the enhancement factor is expressed as:
a(x, y) = AVG(|∇I(x, y)|) / AVG(|∇I_0(x, y)|)    (8)
and the attenuation factor is expressed as:
b(x, y) = AVG(I(x, y)) − AVG(I_0(x, y)) · a(x, y)    (9)
Therefore, in some embodiments, the enhancement factor and attenuation factor in step 601 may be determined as follows:
Step 6011: obtaining the estimated brightness value I_0(x, y) of each pixel unit from the acquired brightness values I(x, y) of the plurality of pixel units through a preset fitting algorithm;
Step 6012: determining the enhancement factor and the attenuation factor based on the acquired brightness values I(x, y) of the plurality of pixel units and the estimated brightness distribution of the image generated from the estimated brightness value I_0(x, y) of each pixel unit.
Those skilled in the art should understand that the above derivation of the enhancement-factor and attenuation-factor expressions is not the only one. Based on other assumptions suited to the application scenario, formula (3) may be transformed into expressions different from formulas (8) and (9) to represent the enhancement and attenuation factors described in the present application; the embodiments of the present application do not limit this.
In some embodiments, the enhancement factor may be determined from the quotient of the mean of a first parameter and the mean of a second parameter, where the first parameter is the gradient of the brightness values of the plurality of pixel units of the image and the second parameter is the gradient of their estimated brightness values.
That is, the enhancement factor may be determined by formula (8), where AVG(|∇I(x, y)|) may be the mean of the brightness-value gradients over all pixel units of the captured image or over a subset of pixel units containing the target pixel unit, and AVG(|∇I_0(x, y)|) likewise may be the mean of the estimated brightness-value gradients over all pixel units or over a subset containing the target pixel unit; the embodiments of the present application do not limit this. The target pixel unit is the pixel unit whose corresponding photosensitive region is to be checked for abnormality.
In some embodiments, the attenuation factor may be determined from the difference between a third parameter and a fourth parameter, where the third parameter is the mean of the brightness values of the plurality of pixel units, the fourth parameter is the product of the mean of their estimated brightness values and a fifth parameter, and the fifth parameter is determined from the quotient of the mean of the first parameter and the mean of the second parameter.
That is, the attenuation factor may be determined by formula (9), where AVG(|∇I(x, y)|) and AVG(|∇I_0(x, y)|) are as described above for the enhancement factor and are not repeated here. AVG(I(x, y)) may be the mean brightness over all pixel units of the captured image or over a subset containing the target pixel unit, and AVG(I_0(x, y)) likewise may be the mean estimated brightness over all pixel units or over a subset containing the target pixel unit; the embodiments of the present application do not limit this. The target pixel unit is the pixel unit whose corresponding photosensitive region is to be checked for abnormality.
Of course, those skilled in the art should understand that determining the enhancement factor a(x, y) and the attenuation factor b(x, y) need not be limited to formulas (8) and (9). For example, in some embodiments, so that the resulting values fall in a suitable range, neither too large nor too small, formulas (8) and (9) may be multiplied by a preset scaling factor or offset by a preset bias; the embodiments of the present application do not limit this.
The fitting algorithm preset in step 6011 for determining the estimated brightness value I_0(x, y) of each pixel unit is not limited by the embodiments of the present application.
In some embodiments, the preset fitting algorithm may be the Random Sample Consensus (RANSAC) algorithm.
The basic assumption of RANSAC is that the samples contain correct data (inliers, data that can be described by the model) as well as abnormal data (outliers, data far outside the normal range that cannot fit the model), which may arise from erroneous measurements, assumptions or calculations. RANSAC further assumes that, given a set of correct data, there exists a method to compute model parameters consistent with these data.
In the scenario of the embodiments of the present application, the brightness values of pixel units unaffected by abnormal conditions such as dirt or occlusion can be regarded as correct data, while those of pixel units affected by such conditions can be regarded as abnormal data.
The estimated brightness values of the image captured by the lens may be fitted with the RANSAC algorithm through formula (10):
I_0(x, y) = Σ_{i,j} p_{i,j} · x^i · y^j    (10)
where (x, y) are the coordinates of a pixel unit and p_{i,j} are the polynomial coefficients of the brightness-distribution model to be solved. When applying RANSAC, since the abnormal regions of the lens — and hence the pixel units affected by dirt or occlusion — are unknown, the brightness values of a group of pixel units of the captured image can be treated as a set of correct data and substituted into formula (10) to fit a first set of coefficients p_{i,j}. Brightness values of further groups of pixel units are then substituted under these coefficients; if, for most pixel units, the I_0(x, y) given by formula (10) with these p_{i,j} does not equal the pixel unit's brightness value, the fitted p_{i,j} is unsuitable: the first group was not truly correct data but contained pixel units affected by dirt, occlusion or other abnormal states. That group is marked as containing abnormal data, a new group of pixel units is selected as correct data, and the polynomial fit is repeated. If, for most pixel units, the resulting I_0(x, y) equals the pixel unit's brightness value, the p_{i,j} is suitable and the estimated brightness value of each pixel unit of the captured image can be obtained; otherwise the above steps are repeated until a suitable p_{i,j} is found. Once the polynomial coefficients p_{i,j} of the brightness-distribution model are determined, the estimated brightness values I_0(x, y) of the image at the plurality of pixel units are determined.
The gradient of the estimated brightness values of the captured image may be obtained, after the estimated brightness values I_0(x, y) of each pixel unit have been obtained with RANSAC, by taking the gradient, at the plurality of pixel units, of the estimated brightness distribution of the image generated from those estimated values, yielding the gradient ∇I_0(x, y) of the estimated brightness values. Alternatively, similarly to solving for the estimated brightness values I_0(x, y), the gradient may be fitted with RANSAC through formula (11):
∇I_0(x, y) = Σ_{i,j} q_{i,j} · x^i · y^j    (11)
where q_{i,j} are the polynomial coefficients of the brightness-gradient-distribution model to be solved, yielding the gradient ∇I_0(x, y) of the estimated brightness values of the plurality of pixel units. Solving for ∇I_0(x, y) is similar to solving for I_0(x, y) and is not elaborated here.
Taking the gradient of the brightness values I(x, y) of the plurality of pixel units of the captured image yields the gradient ∇I(x, y) of the brightness values, and with the RANSAC algorithm I_0(x, y) and ∇I_0(x, y) can be solved. When the lens captures multiple images, AVG(I(x, y)), AVG(I_0(x, y)), AVG(|∇I(x, y)|) and AVG(|∇I_0(x, y)|) can be obtained, from which the enhancement factor and the attenuation factor can be determined.
In some embodiments, determining in step 602 the abnormal photosensitive region among the plurality of photosensitive regions of the lens, based on the value ranges of the enhancement factor and the attenuation factor, may be based on the value range of the quotient of the attenuation factor and the enhancement factor, i.e., of b(x, y)/a(x, y). For example, if b(x_0, y_0)/a(x_0, y_0) lies within [A, B], the photosensitive region corresponding to pixel unit (x_0, y_0) is abnormal, where the specific values of A and B may be determined from extensive statistics or may be empirical values preset by the user; the embodiments of the present application do not limit how the value range is determined.
Step 602 may also determine, based on the value ranges of the enhancement factor a(x, y) and the attenuation factor b(x, y), an abnormality probability for each photosensitive region, and then determine from that probability whether each photosensitive region is abnormal.
Therefore, in some embodiments, determining the abnormal photosensitive region among the plurality of photosensitive regions of the lens includes:
Step 6021: determining the abnormality probability of each photosensitive region based on the value ranges of the enhancement factor and the attenuation factor;
Step 6022: determining whether each photosensitive region is abnormal according to the relationship between the abnormality probability and a preset threshold.
In step 6021, the abnormality probability of the pixel unit corresponding to each photosensitive region may be determined from the value range of the enhancement factor a(x, y), from the value of the attenuation factor b(x, y), or from a relationship between them, such as the value range of their quotient. The abnormality probability of a photosensitive region may be determined through a pre-established mapping between the value ranges of the enhancement factor and/or attenuation factor and the abnormality probability. Taking the determination of a pixel unit's abnormality probability from the value range of b(x, y)/a(x, y) as an example, it may be preset that: when b(x, y)/a(x, y) lies within [A_1, B_1], the corresponding probability is 0.5; within [A_2, B_2], 0.6; within [A_3, B_3], 0.7, and so on. Of course, a functional relationship between the value ranges of the enhancement factor and/or attenuation factor and the abnormality probability may also be pre-established to determine the probability, for example, the probability increases by M for every increase of N in b(x, y)/a(x, y). After the abnormality probability of the pixel unit corresponding to each photosensitive region has been determined from the value ranges of the enhancement and attenuation factors, it can be compared with a preset threshold to determine whether the corresponding photosensitive region is abnormal.
The abnormality probability in step 6021 may be the single-shot abnormality probability determined from the value ranges of the enhancement and attenuation factors computed from the image captured by the lens, e.g., the probability of a pixel unit determined from b(x, y)/a(x, y) as described above.
In some embodiments, the abnormality probability may also be a cumulative abnormality probability within a preset period, i.e., the historical accumulation of the multiple single-shot probabilities computed within the preset period. For a preset period comprising sub-periods t_1, t_2, ..., t_n (t_1 the earliest, t_2 through t_n following in order), the lens may obtain, from the images captured in period t_1, a first single-shot abnormality probability P_1 for the pixel units corresponding to the plurality of photosensitive regions; in period t_2, from the images captured then, a second single-shot abnormality probability P_2. When the abnormality probability is based on the single-shot probability alone, P_2 can be used directly as the abnormality probability for period t_2.
However, a single-shot probability may carry a large error due to temporary factors. For example, if dust happens to occlude the lens during period t_2, yielding a high single-shot probability, determining the abnormal photosensitive region from that probability alone may be inappropriate, since the dust will likely no longer occlude the lens once its position and attitude change. Therefore, in some embodiments, the cumulative probability for period t_2 may be determined from the cumulative probability of the preceding historical periods, i.e., the product P_1 × P_2 is used as the abnormality probability for period t_2 of the pixel units corresponding to the plurality of photosensitive regions. For subsequent periods, likewise, the abnormality probability is the cumulative probability jointly determined by the single-shot probability computed in that period and the cumulative probability of the historical periods before it.
Therefore, in some embodiments, the cumulative abnormality probability may be determined as follows:
Step 6023: determining the abnormality probability of the plurality of photosensitive regions within the preset period, based on the value ranges of the enhancement and attenuation factors of the images captured within the preset period;
Step 6024: updating the cumulative abnormality probability of the plurality of photosensitive regions based on the abnormality probability within the preset period and the cumulative abnormality probability determined before the preset period.
Of course, those skilled in the art should understand that the above embodiment of determining the cumulative abnormality probability is merely illustrative; it may also be determined in other ways, for example, jointly from all the single-shot probabilities determined before the preset period and the single-shot probability determined in the preset period; the embodiments of the present application do not limit this.
In some embodiments, updating the cumulative abnormality probability of the plurality of photosensitive regions may be implemented by multiplying the abnormality probability of the preset period by the cumulative abnormality probability determined before the preset period, the product serving as the updated cumulative probability. The earlier example, using the product P_1 × P_2 as the abnormality probability for period t_2 of the pixel units corresponding to the plurality of photosensitive regions, belongs to this way of computing.
For a computing unit, addition is processed far faster than multiplication. Therefore, in some embodiments, the update may instead take the logarithm of the abnormality probability of the preset period and add it to the logarithmic cumulative abnormality probability before the period, the sum serving as the updated cumulative probability.
The cumulative abnormality probability of the above embodiment can be expressed mathematically as:
L(n|z_{1:t}) = L(n|z_{1:t-1}) + L(n|z_t)    (12)
where L(n|z) = log[ P(n|z) / (1 − P(n|z)) ] is the log-odds form, P(n) denotes the probability that pixel unit n has an abnormal condition such as dirt or occlusion, n is the pixel unit's identifying number, z_{1:t-1} indicates that the pixel unit's brightness values were measured between the start time and time t−1, and z_t indicates the brightness value measured at time t.
As the above embodiment shows, computing the cumulative abnormality probability after taking logarithms accelerates the computation and yields the cumulative result quickly.
Besides being determined from the cumulative abnormality probability above, the abnormal photosensitive region of the lens may also be determined from an update strategy derived from the cumulative probability.
In some embodiments, the probability that the lens region corresponding to a pixel unit of the image is abnormal may be determined by the update strategy of formula (13):
L(n|z_{1:t}) = max(min(L(n|z_{1:t-1}) + L(n|z_t), l_max), l_min)    (13)
where l_max and l_min may be preset maximum and minimum probability values. These presets exist because, under cumulative computation, the value would keep growing with the number of accumulations, while probability is a mathematically meaningful quantity that cannot be unbounded; the preset maximum and minimum keep the finally determined value meaningful.
The prompt information generated in step 303 may take many forms.
In some embodiments, the prompt information is a probability map composed of the abnormality probabilities of the plurality of photosensitive regions, where the probability may be the single-shot abnormality probability or the cumulative abnormality probability described above; the embodiments of the present application do not limit this.
FIG. 8 shows an exemplary probability map composed of the cumulative abnormality probabilities of the plurality of photosensitive regions, where a higher gray value corresponds to a higher cumulative abnormality probability of the pixel unit. As this exemplary map shows, when the prompt information is a probability map, users or staff can intuitively see the abnormality probabilities of the pixel units corresponding to the photosensitive regions of the lens.
Of course, those skilled in the art should understand that the prompt information may also take other forms, for example a prompt image annotating the abnormal photosensitive regions of the lens, or a prompt sound; the embodiments of the present application do not limit the specific form of the prompt information.
In some embodiments, after steps 303 and 403 of the lens abnormality prompting methods provided by the embodiments of the present application, the method further includes: sending the prompt information to a display of a user-side device and/or a service management center for a displayed reminder, or to a sound-producing device, such as a loudspeaker, of the user-side device and/or service management center for an audible reminder.
As the above embodiment shows, actively sending the prompt information to the user-side device or the service management center for a displayed or audible reminder enables the user or the staff of the service management center to discover the abnormal photosensitive region of the lens in time and handle the abnormal state, avoiding safety accidents.
A movable platform, besides carrying the lens used to acquire depth information of its scene, usually also carries a main camera that captures images of the scene for the user to view, while the depth-bearing images captured by the lens are normally not shown to the user. In the lens abnormality prompting method provided by the embodiments of the present application, when the prompt information is of image type, the method may further include: replacing, with the prompt information, the main-camera image shown on the display of the movable platform, to warn that the lens has an abnormal region.
As the above embodiment shows, replacing the main-camera image shown on the display of the movable platform with the prompt information warns the user promptly and effectively that the lens has an abnormal region.
The above embodiments provide the first and second lens abnormality prompting methods, in which the lens is mounted on a movable platform to provide it with depth information of the scene in which it is located. In addition, an embodiment of the present application provides a third lens abnormality prompting method, shown in FIG. 9, which includes:
Step 901: acquiring an image captured by the lens;
Step 902: acquiring brightness values of a plurality of pixel units of the image, the brightness values being obtained by a plurality of photosensitive regions of the lens capturing the environment;
Step 903: obtaining, based on the brightness values of the plurality of pixel units, an enhancement factor and an attenuation factor of the photosensitive region of the lens corresponding to any pixel unit, where the enhancement factor characterizes the degree of enhancement of the brightness value of the corresponding pixel unit relative to an estimated brightness value, the attenuation factor characterizes the degree of attenuation of that brightness value relative to the estimated brightness value, and the estimated brightness value characterizes the brightness value of the corresponding pixel unit obtained by capturing the environment when the photosensitive region is not abnormal;
Step 904: determining, based on the value ranges of the enhancement factor and the attenuation factor, the abnormal photosensitive region among the plurality of photosensitive regions of the lens;
Step 905: generating prompt information based on the position of the abnormal photosensitive region.
As described in the foregoing embodiments, this method may be applied to abnormality prompting for a lens that provides a movable platform with depth information of its scene, or to general lenses (such as those of mobile phones or DSLR cameras); that is, the method does not limit the specific type of lens.
In some embodiments, the lens is mounted on a movable platform.
In some embodiments, the movable platform includes at least any of: an unmanned aerial vehicle, an unmanned vehicle and an intelligent robot.
In some embodiments, the images are captured with the lens in different positions and/or attitudes.
In some embodiments, to achieve this, the method further includes: controlling the movable platform, or a gimbal or mechanical arm connected to the lens, to move so as to adjust the position and/or attitude of the lens.
In some embodiments, in the third lens abnormality prompting method provided by the embodiments of the present application, step 901 further includes: detecting whether the position and/or attitude change of the lens exceeds a threshold, so that the image captured by the lens is acquired when the change exceeds the threshold.
In some embodiments, for the brightness values of the plurality of pixel units in step 903, the pixel units are pixel units that do not include a sky region.
In some embodiments, obtaining in step 903 the enhancement factor and attenuation factor of the photosensitive region of the lens corresponding to any pixel unit may be implemented as follows:
Step 9031: obtaining the estimated brightness value of each pixel unit from the acquired brightness values of the plurality of pixel units through a preset fitting algorithm;
Step 9032: determining the enhancement factor and the attenuation factor based on the acquired brightness values of the plurality of pixel units and the estimated brightness distribution of the image generated from the estimated brightness value of each pixel unit.
In some embodiments, the enhancement factor may be determined from the quotient of the mean of a first parameter and the mean of a second parameter; and/or the attenuation factor may be determined from the difference between a third parameter and a fourth parameter. The first parameter is the gradient of the brightness values of the plurality of pixel units, the second parameter is the gradient of their estimated brightness values; the third parameter is the mean of the brightness values of the plurality of pixel units, and the fourth parameter is the product of the mean of their estimated brightness values and the enhancement factor.
In some embodiments, the preset fitting algorithm in step 9031 is the Random Sample Consensus algorithm.
In some embodiments, the value range of the enhancement factor and the attenuation factor is the value range of the quotient of the attenuation factor and the enhancement factor.
In some embodiments, step 904 of determining, based on the value ranges of the enhancement factor and the attenuation factor, the abnormal photosensitive region among the plurality of photosensitive regions of the lens includes:
Step 9041: determining the abnormality probability of each photosensitive region based on the value ranges of the enhancement factor and the attenuation factor;
Step 9042: determining whether each photosensitive region is abnormal according to the relationship between the abnormality probability and a preset threshold.
In some embodiments, the abnormality probability in step 9041 may be the cumulative abnormality probability within a preset period.
In some embodiments, the cumulative abnormality probability may be determined as follows:
Step 9043: determining the abnormality probability of each photosensitive region within the preset period, based on the value ranges of the enhancement and attenuation factors of the images captured within the preset period;
Step 9044: updating the cumulative abnormality probability of each photosensitive region based on the abnormality probability of the preset period and the cumulative abnormality probability determined before the preset period.
When no cumulative abnormality probability exists before the preset period, the abnormality probability of the preset period is itself the cumulative abnormality probability.
In some embodiments, updating the cumulative abnormality probability of each photosensitive region in step 9044 includes:
multiplying the abnormality probability of the preset period by the cumulative abnormality probability determined before the preset period, the product serving as the updated cumulative abnormality probability; or taking the logarithm of the abnormality probability of the preset period and adding it to the logarithmic cumulative abnormality probability before the preset period, the sum serving as the updated cumulative abnormality probability.
In some embodiments, the prompt information in step 905 includes at least one of: a probability map presenting the abnormality probabilities of the plurality of photosensitive regions, i.e., an image as shown in FIG. 8; a prompt image annotating the abnormal photosensitive region; or a prompt sound.
In some embodiments, the third lens abnormality prompting method provided by the embodiments of the present application further includes: sending the prompt information to a display of the user-side device and/or a service management center for a displayed reminder; and/or sending the prompt information to a sound-producing device of the user-side device and/or service management center for an audible reminder.
The relevant contents of the embodiments of the third lens abnormality prompting method are similar to those of the first method described above and can be referred to there; they are not elaborated here.
Based on the third lens abnormality prompting method provided by the embodiments of the present application — acquiring images captured by the lens to be checked, obtaining the enhancement and attenuation factors from the brightness values of the plurality of pixel units, determining the abnormal photosensitive region of the lens from their value ranges, and generating prompt information — it can be determined simply and quickly whether the lens has an abnormal photosensitive region caused by abnormal conditions such as dirt or occlusion, as well as the specific position of that region, so that prompt information can be generated to remind users or administrators to handle it, overcoming the inefficiency and high missed- and false-detection rates of manual inspection in the related art.
Correspondingly, an embodiment of the present application also provides an apparatus corresponding to the lens abnormality prompting method. As shown in FIG. 10, a hardware structural diagram of a lens abnormality prompting apparatus provided by an embodiment of the present application, the apparatus includes a memory 1001, a processor 1002 and a computer program stored on the memory and runnable on the processor; when the processor executes the program, any method embodiment provided by the embodiments of the present application is implemented. The memory 1001 may be an internal storage unit of an electronic device, such as its hard disk or internal memory; it may also be an external storage device of the electronic device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card or a flash card equipped on the electronic device. Further, the memory 1001 may include both the internal storage unit and the external storage device of the electronic device. The memory stores the computer program and other programs and data required by the electronic device, and may also temporarily store data that has been or will be output. When the stored program is executed, the processor 1002 calls the program stored in the memory 1001 to perform the methods of the foregoing embodiments, which have been described in detail above and are not repeated here.
Of course, those skilled in the art should understand that, depending on the actual functions of the electronic device, other hardware such as a network interface may also be included; this is not elaborated in the present application.
In addition, an embodiment of the present application further provides a movable platform, whose hardware structure is shown schematically in FIG. 11. The movable platform includes a lens 1101, a power system 1102, a memory 1103 and a processor 1104. The lens 1101 captures a depth map of the environment of the movable platform, providing environmental information for its motion; the power system 1102 provides power for the motion of the movable platform; the memory 1103 stores program code; and the processor calls the program code which, when executed, implements the methods of the foregoing embodiments, which have been described in detail above and are not repeated here.
An embodiment of the present application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements all the embodiments of the above methods of the present application; they are not repeated here.
The computer-readable storage medium may be an internal storage unit of an electronic device, such as its hard disk or internal memory; it may also be an external storage device of the electronic device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card or a flash card equipped on the electronic device. Further, the computer-readable storage medium may include both the internal storage unit and the external storage device of the electronic device. It stores the computer program and other programs and data required by the electronic device, and may also temporarily store data that has been or will be output.
Those of ordinary skill in the art can understand that all or part of the processes in the methods of the above embodiments can be completed by instructing relevant hardware through a computer program, which may be stored in a computer-readable storage medium and, when executed, may include the processes of the embodiments of the above methods. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), and the like.
Specific embodiments of the present application have been described above; other embodiments fall within the scope of the appended claims. In some cases, the actions or steps recited in the claims may be performed in an order different from that in the embodiments and still achieve desired results. In addition, the processes depicted in the drawings do not necessarily require the particular order shown, or a sequential order, to achieve desired results; in certain implementations, multitasking and parallel processing are also possible or may be advantageous.
Those skilled in the art will readily conceive of other embodiments of the present application after considering the specification and practicing the invention applied for here. The present application is intended to cover any variations, uses or adaptations that follow its general principles and include common knowledge or customary technical means in the art not applied for by the present application. The specification and embodiments are to be regarded as exemplary only, with the true scope and spirit of the application indicated by the following claims.
It should be understood that the present application is not limited to the precise structures described above and shown in the drawings, and various modifications and changes may be made without departing from its scope. The scope of the present application is limited only by the appended claims.
The above are merely preferred embodiments of the present application and are not intended to limit it; any modification, equivalent replacement, improvement and the like made within the spirit and principles of the present application shall be included within its scope of protection.

Claims (36)

  1. A lens abnormality prompting method, characterized in that the method comprises:
    acquiring an image captured by the lens;
    acquiring brightness values of a plurality of pixel units of the image, the brightness values of the plurality of pixel units being obtained by a plurality of photosensitive regions of the lens capturing an environment;
    obtaining, based on the brightness values of the plurality of pixel units, an enhancement factor and an attenuation factor of the photosensitive region of the lens corresponding to any of the pixel units, wherein the enhancement factor characterizes a degree of enhancement of the brightness value of the pixel unit corresponding to the photosensitive region relative to an estimated brightness value, the attenuation factor characterizes a degree of attenuation of that brightness value relative to the estimated brightness value, and the estimated brightness value characterizes the brightness value of the corresponding pixel unit obtained by capturing the environment when the photosensitive region is not abnormal;
    determining, based on value ranges of the enhancement factor and the attenuation factor, an abnormal photosensitive region among the plurality of photosensitive regions of the lens;
    generating prompt information based on a position of the abnormal photosensitive region.
  2. The method according to claim 1, characterized in that the lens is mounted on a movable platform.
  3. The method according to claim 2, characterized in that the movable platform comprises at least any of: an unmanned aerial vehicle, an unmanned vehicle and an intelligent robot.
  4. The method according to claim 2, characterized in that the images are captured with the lens in different positions and/or attitudes.
  5. The method according to claim 4, characterized in that the method further comprises:
    controlling the movable platform, or a gimbal or mechanical arm connected to the lens, to move so as to adjust the position and/or attitude of the lens.
  6. The method according to claim 4, characterized in that the method further comprises:
    detecting whether a position and/or attitude change of the lens exceeds a threshold, so as to acquire the image captured by the lens when the position and/or attitude change of the lens exceeds the threshold.
  7. The method according to claim 1, characterized in that the pixel units are pixel units that do not include a sky region.
  8. The method according to claim 1, characterized in that obtaining the enhancement factor and the attenuation factor of the photosensitive region of the lens corresponding to any of the pixel units is implemented by:
    obtaining an estimated brightness value of each pixel unit from the acquired brightness values of the plurality of pixel units through a preset fitting algorithm; and determining the enhancement factor and the attenuation factor based on the acquired brightness values of the plurality of pixel units and an estimated brightness distribution of the image generated from the estimated brightness value of each pixel unit.
  9. The method according to claim 8, characterized in that the enhancement factor
    is determined from a quotient of a mean of a first parameter and a mean of a second parameter;
    and/or,
    the attenuation factor is determined from a difference between a third parameter and a fourth parameter;
    wherein the first parameter is the gradient of the brightness values of the plurality of pixel units, the second parameter is the gradient of the estimated brightness values of the plurality of pixel units; the third parameter is the mean of the brightness values of the plurality of pixel units, and the fourth parameter is the product of the mean of the estimated brightness values of the plurality of pixel units and the enhancement factor.
  10. The method according to claim 8, characterized in that the preset fitting algorithm is a random sample consensus algorithm.
  11. The method according to claim 1, characterized in that the value range of the enhancement factor and the attenuation factor is the value range of the quotient of the attenuation factor and the enhancement factor.
  12. The method according to claim 1, characterized in that determining the abnormal photosensitive region among the plurality of photosensitive regions of the lens comprises:
    determining an abnormality probability of each photosensitive region based on the value ranges of the enhancement factor and the attenuation factor;
    determining whether each photosensitive region is abnormal according to a relationship between the abnormality probability and a preset threshold.
  13. The method according to claim 12, characterized in that the abnormality probability is a cumulative abnormality probability within a preset period.
  14. The method according to claim 13, characterized in that the cumulative abnormality probability is determined by:
    determining the abnormality probability of each photosensitive region within the preset period, based on the value ranges of the enhancement factors and attenuation factors of the images captured within the preset period;
    updating the cumulative abnormality probability of each photosensitive region based on the abnormality probability of the preset period and the cumulative abnormality probability determined before the preset period;
    wherein, when no cumulative abnormality probability exists before the preset period, the abnormality probability of the preset period is the cumulative abnormality probability.
  15. The method according to claim 14, characterized in that updating the cumulative abnormality probability of each photosensitive region comprises:
    multiplying the abnormality probability of the preset period by the cumulative abnormality probability determined before the preset period, the product serving as the updated cumulative abnormality probability;
    or,
    taking the logarithm of the abnormality probability of the preset period and adding it to the logarithmic cumulative abnormality probability before the preset period, the sum serving as the updated cumulative abnormality probability.
  16. The method according to claim 1, characterized in that the prompt information comprises at least one of:
    a probability map presenting the abnormality probabilities of the plurality of photosensitive regions, a prompt image annotating the abnormal photosensitive region, or a prompt sound.
  17. The method according to claim 1, characterized in that the method further comprises:
    sending the prompt information to a display of a user-side device and/or a service management center for a displayed reminder;
    and/or,
    sending the prompt information to a sound-producing device of the user-side device and/or service management center for an audible reminder.
  18. A lens abnormality prompting method, characterized in that the method comprises:
    acquiring an image captured by the lens;
    detecting, based on pixel values of a plurality of pixel units of the image, whether a light-transmitting component or a photosensitive device of the lens is blocked by a foreign object;
    generating prompt information when the foreign-object blockage is detected.
  19. The method according to claim 18, characterized in that the pixel value comprises a brightness value as a component, and the method further comprises:
    determining, based on the brightness values of the plurality of pixel units of the image, whether the light-transmitting component or the photosensitive device of the lens has an abnormal photosensitive region.
  20. The method according to claim 18, characterized in that the lens is mounted on a movable platform to provide the movable platform with depth information of the scene in which it is located.
  21. The method according to claim 18, characterized in that detecting whether the light-transmitting component or the photosensitive device of the lens is blocked by a foreign object comprises: whether adhering matter exists on the light-transmitting component or the photosensitive device, and whether the light-transmitting component or the photosensitive device is blocked by an external component.
  22. The method according to claim 21, characterized in that the external component comprises one or more of the following: a protective cover, a bracket of the movable platform, or a power component.
  23. The method according to claim 20 or 22, characterized in that the movable platform comprises at least one of the following: an unmanned aerial vehicle, an unmanned vehicle and an intelligent robot.
  24. The method according to claim 22, characterized in that the movable platform is an unmanned aerial vehicle and the external component comprises one or more of the following: an arm of the unmanned aerial vehicle, a propeller guard of the unmanned aerial vehicle, and a gimbal protective cover of the unmanned aerial vehicle.
  25. The method according to claim 21, characterized in that the method further comprises: controlling the external component to move when it is detected that the light-transmitting component or the photosensitive device of the lens is blocked by a foreign object.
  26. The method according to claim 19, characterized in that determining, based on the brightness values, whether the light-transmitting component or the photosensitive device of the lens has an abnormal photosensitive region comprises:
    obtaining, based on the brightness values of the plurality of pixel units, an enhancement factor and an attenuation factor of the photosensitive region of the lens corresponding to any of the pixel units, wherein the enhancement factor characterizes a degree of enhancement of the brightness value of the pixel unit corresponding to the photosensitive region relative to an estimated brightness value, the attenuation factor characterizes a degree of attenuation of that brightness value relative to the estimated brightness value, and the estimated brightness value characterizes the brightness value of the corresponding pixel unit obtained by capturing the environment around the lens when the photosensitive region is not abnormal;
    determining, based on the value ranges of the enhancement factor and the attenuation factor, the abnormal photosensitive region among the plurality of photosensitive regions of the lens;
    and generating prompt information when the foreign-object blockage is detected comprises:
    generating the prompt information based on the abnormal photosensitive region.
  27. The method according to claim 18, characterized in that the acquired images captured by the lens are captured with the lens in different positions and/or attitudes.
  28. The method according to claim 27, characterized in that the method further comprises:
    controlling a movable platform, or a gimbal or mechanical arm connected to the lens, to move so as to adjust the shooting position and/or attitude of the lens.
  29. The method according to claim 28, characterized in that the method further comprises:
    detecting whether a position and/or attitude change of the lens exceeds a threshold, so as to acquire the image captured by the lens when the position and/or attitude change of the lens exceeds the threshold.
  30. The method according to claim 29, characterized in that the detection is implemented by at least one of a global positioning system, an inertial measurement unit and a visual-inertial odometer of the movable platform.
  31. The method according to claim 18, characterized in that the prompt information comprises at least the following:
    a probability map composed of the abnormality probabilities of the plurality of photosensitive regions of the lens.
  32. The method according to claim 18, characterized in that the method further comprises:
    sending the prompt information to a display of a user-side device and/or a service management center for a displayed reminder;
    and/or,
    sending the prompt information to a sound-producing device of the user-side device and/or service management center for an audible reminder.
  33. The method according to claim 32, characterized in that sending the prompt information to the display of the user-side device and/or service management center for a displayed reminder comprises:
    replacing, with the prompt information, the image captured by the main camera shown on the display, to warn that the lens has an abnormal region.
  34. A lens abnormality prompting apparatus, characterized in that the apparatus comprises a memory, a processor, and a computer program stored on the memory and runnable on the processor, wherein the processor, when executing the program, implements the method of any one of claims 1 to 33.
  35. A movable platform, characterized in that the movable platform comprises: a lens, a power system, a memory and a processor;
    the lens is configured to capture a depth map of the environment in which the movable platform is located, providing environmental information for the motion of the movable platform;
    the power system is configured to provide power for the motion of the movable platform;
    the memory is configured to store program code;
    the processor calls the program code which, when executed, implements the method of any one of claims 1 to 33.
  36. A computer-readable storage medium, characterized in that the computer-readable storage medium stores computer instructions which, when executed, implement the steps of the method of any one of claims 1 to 33.
PCT/CN2021/082779 2021-03-24 2021-03-24 Lens abnormality prompting method and apparatus, movable platform and readable storage medium WO2022198508A1 (zh)
