WO2021134714A1 - Infrared image processing method, dead pixel marking method, and related apparatus - Google Patents

Infrared image processing method, dead pixel marking method, and related apparatus

Info

Publication number: WO2021134714A1 (application PCT/CN2019/130969)
Authority: WIPO (PCT)
Prior art keywords: dead, pixel, user, coordinates, infrared
Application number: PCT/CN2019/130969
Other languages: English (en), French (fr)
Inventors: 张青涛, 王黎, 鄢蕾
Original Assignee: 深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority date: (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by 深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority application: PCT/CN2019/130969
Publication: WO2021134714A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00: Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/60: Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H04N25/68: Noise processing, e.g. detecting, correcting, reducing or removing noise applied to defects

Definitions

  • This application relates to the field of infrared image processing, and in particular to an infrared image processing method, a dead pixel marking method and related devices.
  • In the manufacturing process of an infrared sensor, dead pixels arise due to dust and process variation; additional dead pixels also appear during use. Dead pixels degrade the quality of the infrared sensor, which harms the user experience and interferes with remote temperature-difference detection.
  • the embodiments of the present application provide an infrared image processing method, a dead pixel marking method, and related devices, which can improve the quality of infrared imaging.
  • an embodiment of the present application provides an infrared image processing method, including: obtaining the coordinates of the dead pixels marked by the user, where the dead pixels marked by the user are the dead pixels generated during use of the infrared sensor after it leaves the factory; identifying the dead pixels in the infrared image obtained by the infrared sensor according to the coordinates of the dead pixels marked by the user; and removing the dead pixels in the infrared image.
  • the infrared image processing method provided by the embodiments of the present application can, by means of user marking, remove the dead pixels generated by the infrared sensor during use after it leaves the factory, thereby improving the quality of infrared imaging.
  • an embodiment of the present application provides a method for marking dead pixels of an infrared sensor, which includes: receiving a user operation for marking dead pixels, where the dead pixels are those generated during use of the infrared sensor after it leaves the factory; and, in response to the user operation, determining the coordinates of the dead pixels marked by the user.
  • the method for marking dead pixels of an infrared sensor provided in the embodiments of the present application can determine, through user operations, the dead pixels generated during use of the infrared sensor after it leaves the factory, so that these dead pixels can be removed and the quality of infrared imaging improved.
  • an embodiment of the present application provides an infrared image processing device, including a memory and a processor. The memory is coupled with the processor and is used to store program instructions; the processor is used to call and execute the program instructions in the memory to: obtain the coordinates of the dead pixels marked by the user, where the dead pixels marked by the user are the dead pixels generated by the infrared sensor during use after it leaves the factory; identify the dead pixels in the infrared image obtained by the infrared sensor according to the coordinates of the dead pixels marked by the user; and remove the dead pixels in the infrared image.
  • the embodiment of the present application can remove the dead pixels generated by the infrared sensor during use by the user after leaving the factory by means of user marking, thereby improving the quality of infrared imaging.
  • an embodiment of the present application provides a dead pixel marking device for an infrared sensor, including a memory, a processor, and a user interface. The memory and the user interface are coupled with the processor, and the memory is used to store program instructions; the processor is used to call and execute the program instructions in the memory to: receive, through the user interface, a user operation for marking dead pixels, where the dead pixels are those generated by the infrared sensor during use after it leaves the factory; and, in response to the user operation, determine the coordinates of the dead pixels marked by the user.
  • the dead pixels generated during the user's use of the infrared sensor after it leaves the factory can be determined through user operations, and these dead pixels can be removed, improving the infrared imaging quality.
  • an embodiment of the present application provides a computer-readable storage medium having a computer program stored thereon, and the computer program, when executed by a processor, implements the infrared image processing method provided in the first aspect of the embodiments of the present application.
  • an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the method for marking dead pixels of an infrared sensor provided in the second aspect of the embodiments of the present application.
  • FIG. 1 is a schematic diagram of the architecture of an infrared imaging system provided by an embodiment of the application;
  • FIG. 2 is a schematic flowchart of an infrared image processing method provided by an embodiment of the application;
  • FIG. 3 is a schematic diagram comparing an initial pixel-level response rate data table and a second pixel-level response rate data table provided by an embodiment of the application;
  • FIG. 4 is a schematic diagram of a dead pixel table provided by an embodiment of the application;
  • FIG. 5 is a schematic flowchart of an infrared image processing method provided by another embodiment of this application;
  • FIG. 6 is a schematic flowchart of an infrared image processing method provided by another embodiment of this application;
  • FIGS. 7-11 are schematic diagrams of some user interfaces for marking dead pixels provided by embodiments of this application;
  • FIG. 12 is a schematic flowchart of a method for marking dead pixels of an infrared sensor according to an embodiment of the application;
  • FIG. 13 is a schematic structural diagram of an infrared image processing device provided by an embodiment of the application;
  • FIG. 14 is a schematic structural diagram of a dead pixel marking device of an infrared sensor provided by an embodiment of the application.
  • Static dead pixels can be identified according to the response rate: pixels whose response rate falls outside a preset range are static dead pixels. For example, pixels whose response rate lies within the range [X, Y] are non-static dead pixels, while pixels whose response rate is greater than Y or less than X are static dead pixels. In one example, X is equal to 0.5 and Y is equal to 1.
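As an illustrative sketch (not part of the patent), the static-dead-pixel test above can be expressed as follows, assuming a normalized response-rate map and the example bounds X = 0.5, Y = 1; the function name is hypothetical:

```python
import numpy as np

def find_static_dead_pixels(response, x=0.5, y=1.0):
    """Pixels whose response rate lies outside [x, y] are treated as
    static dead pixels; pixels inside the range are considered normal."""
    response = np.asarray(response, dtype=float)
    mask = (response < x) | (response > y)
    # Return plain (row, col) integer coordinates.
    return [(int(r), int(c)) for r, c in zip(*np.nonzero(mask))]

resp = np.array([[0.9, 0.2],
                 [1.4, 0.8]])
print(find_static_dead_pixels(resp))  # [(0, 1), (1, 0)]
```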
  • A dynamic dead pixel displays normally within a certain range but, beyond that range, appears brighter than the surrounding pixels. This is related to the temperature and gain of the sensor: as the sensor temperature or the gain increases, dynamic dead pixels become more obvious. For example, pixels that appear in some image frames but not in others and whose response rate is greater than a threshold are dynamic dead pixels. Dead pixels newly generated during the user's use of the infrared sensor after it leaves the factory are also dynamic dead pixels.
  • Fig. 1 shows a schematic structural diagram of an infrared imaging system provided by an embodiment of the present invention.
  • the system includes an infrared camera terminal 10 and an infrared display terminal 20.
  • the infrared photographing terminal 10 may include an infrared sensor for acquiring infrared images.
  • the infrared display terminal 20 may be used to receive the infrared image sent by the infrared photographing terminal 10 and display the infrared image.
  • the infrared camera terminal 10 may be mounted on a movable device (including but not limited to aircraft, boats, automobiles, etc.; the aircraft 30 is taken as an example in this application). Specifically, it can be mounted on the gimbal of the aircraft 30 to complete aerial photography missions of corresponding targets during the flight of the aircraft 30. Optionally, when the infrared photographing terminal 10 is mounted on an aircraft, the infrared display terminal 20 may be a ground control device of the aircraft 30.
  • the ground control equipment of the aircraft 30 can establish a communication connection with the aircraft 30 through a wireless connection (for example, one based on WIFI or radio-frequency communication), which is used to control the flight trajectory of the aircraft 30 and to receive infrared images sent by the infrared photographing terminal 10.
  • the ground control equipment of the aircraft 30 may include a display screen for displaying infrared images sent by the infrared photographing terminal 10.
  • the infrared camera terminal 10 is used to complete infrared image/video shooting, and the aircraft 30 is used to implement the infrared image processing process provided in this application, and send the processed infrared image/video to the ground control device for display.
  • the infrared camera terminal 10 can also be used to complete infrared image/video shooting, with the aircraft 30 sending the captured infrared images/videos to the ground control device; the ground control device then implements the infrared image processing process provided by this application and displays the images.
  • the ground control device of the above-mentioned aircraft 30 may be a controller with a joystick, or may be a smart device such as a smart phone or a tablet computer.
  • the infrared photographing terminal 10 may be integrated with the infrared display terminal 20, that is, one device can realize the function of photographing infrared images and the function of displaying infrared images.
  • the infrared camera includes both an infrared photographing terminal 10 and an infrared display terminal 20.
  • the infrared camera is used to complete the infrared image/video shooting and the infrared image processing process provided in this application, and display the processed infrared image/video.
  • the infrared image processing method may include the following steps:
  • the dead pixels marked by the user are the dead pixels generated during the user's use of the infrared sensor after it leaves the factory.
  • the method is applied to an infrared display terminal 20, which includes an infrared sensor, that is, the infrared photographing terminal 10 and the infrared display terminal 20 are integrated.
  • the infrared display terminal 20 may receive a user operation for marking a dead point, and in response to the user operation, determine the coordinates of the dead point marked by the user.
  • the infrared display terminal 20 may include a touch screen, and the user can input a user operation for marking dead pixels on the touch screen.
  • the infrared display terminal 20 may include keys, and the user can input a user operation for marking dead pixels through the keys.
  • the method is executed by the infrared photographing terminal 10, and the infrared photographing terminal 10 includes an infrared sensor.
  • the infrared display terminal 20 may receive a user operation for marking a dead point, and in response to the user operation, determine the coordinates of the dead point marked by the user.
  • the infrared display terminal 20 sends the coordinates of the dead pixels to the infrared photographing terminal 10.
  • the input mode of the user operation for marking dead pixels can be input through a joystick in addition to the touch screen or key input described in the foregoing embodiment.
  • the infrared display terminal 20 may be the ground control device of the aircraft 30.
  • the ground control device can be a controller with a joystick.
  • the method is executed by the infrared display terminal 20, and the infrared photographing terminal 10 includes an infrared sensor.
  • the infrared display terminal 20 can receive the infrared image sent by the infrared photographing terminal 10, and the infrared display terminal 20 can display the infrared image.
  • the user can input a user operation for marking the dead point in the infrared display terminal 20, and in response to the user operation, the infrared display terminal 20 can determine the coordinates of the dead point marked by the user.
  • the input mode of the user operation for marking dead pixels can be input through a joystick in addition to the touch screen or key input described in the foregoing embodiment.
  • the infrared display terminal 20 may be the ground control device of the aircraft 30.
  • the ground control device can be a controller with a joystick.
  • S202: Identify the dead pixels in the infrared image obtained by the infrared sensor according to the coordinates of the dead pixels marked by the user.
  • the initial pixel-level response rate data table and the coordinates of the dead pixels marked by the user can be merged to generate a second pixel-level response rate data table.
  • the initial pixel-level response rate data table and the second pixel-level response rate data table both include the response rates of different pixels corresponding to the pixel points.
  • in the second pixel-level response rate data table, the response rate corresponding to each dead pixel marked by the user is a preset value, for example (but not limited to) 0. By checking whether the response rate corresponding to each pixel in the second pixel-level response rate data table equals this preset value, it can be determined whether that pixel is a dead pixel.
  • the dead pixels in the infrared image are pixels whose response rate is the above-mentioned preset value (such as 0).
  • FIG. 3 exemplarily shows a comparison diagram of the initial pixel-level response rate data table and the second pixel-level response rate data table.
  • the initial pixel-level response rate data table is shown in the left image in Figure 3. If the coordinates of the dead pixels marked by the user are (1, 2), (2, 1), and (n, n-1), then in the second pixel-level response rate data table the response rates corresponding to pixel P(1,2) with coordinates (1, 2), pixel P(2,1) with coordinates (2, 1), and pixel P(n,n-1) with coordinates (n, n-1) are all changed to 0, as shown in the right image in Figure 3.
  • the initial pixel-level response rate data table and the second pixel-level response rate data table may also have other expression forms, which are not limited in the embodiment of the present application.
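A minimal sketch of the fusion step described above (illustrative only; the function name, 0-based indexing, and the preset value 0 are assumptions):

```python
import numpy as np

def merge_marked_dead_pixels(initial_table, marked_coords, preset=0.0):
    """Fuse the initial pixel-level response-rate table with the
    coordinates of the user-marked dead pixels: each marked pixel's
    response rate is overwritten with the preset value, yielding the
    second pixel-level response-rate data table."""
    second_table = np.asarray(initial_table, dtype=float).copy()
    for row, col in marked_coords:
        second_table[row, col] = preset
    return second_table

initial = np.ones((3, 3))                         # all pixels respond normally
second = merge_marked_dead_pixels(initial, [(0, 1), (1, 0), (2, 2)])
# A pixel is identified as dead when its entry equals the preset value.
dead = [(int(r), int(c)) for r, c in zip(*np.nonzero(second == 0.0))]
print(dead)  # [(0, 1), (1, 0), (2, 2)]
```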
  • the first dead pixel table may be generated according to the coordinates of the dead pixels marked by the user.
  • the locations of the dead pixels can be recorded in the first dead pixel table.
  • FIG. 4 exemplarily shows the first dead pixel table, which records the coordinates of the dead pixels marked by the user.
  • the pixel coordinates of the dead pixels shown in Fig. 4 are (1, 2), (2, 1), and (n, n-1).
  • the dead pixels in the infrared sensor can accordingly be identified as P(1,2), P(2,1), and P(n,n-1).
  • the first dead pixel table may also be represented in other forms, which is not limited in the embodiment of the present application.
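For illustration only, the first dead-pixel table can be modeled as a set of coordinates; the concrete coordinate (5, 4) below stands in for the (n, n-1) entry of the figure, and the function name is hypothetical:

```python
def build_dead_pixel_table(marked_coords):
    """The first dead-pixel table records the coordinates of the dead
    pixels marked by the user; a set gives constant-time lookups when
    identifying dead pixels in each incoming frame."""
    return set(marked_coords)

table = build_dead_pixel_table([(1, 2), (2, 1), (5, 4)])
print((1, 2) in table)  # True  -> this pixel is a marked dead pixel
print((0, 0) in table)  # False -> this pixel is not marked
```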
  • the initial pixel-level response rate data table and the coordinates of the dead pixels marked by the user can be stored in static memory, and the second pixel-level response rate data table can be stored in dynamic memory to improve the rate of identifying dead pixels.
  • static memory refers to the fixed storage space allocated when the program is compiled.
  • Dynamic memory refers to the storage space dynamically allocated during program execution.
  • the final temperature value of each dead pixel in the infrared image can be determined according to the infrared image obtained by the infrared sensor and the coordinates of the dead pixels marked by the user. Determining the final temperature value is how the dead pixel is removed.
  • the final temperature value of the dead pixel can be obtained by interpolation according to the temperature values of other pixels around it.
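A possible sketch of this interpolation step (the 4-connected neighbours and the plain mean are assumptions; the patent does not prescribe a specific interpolation kernel):

```python
import numpy as np

def remove_dead_pixel(temps, row, col):
    """Replace a dead pixel's temperature with a value interpolated
    from the surrounding pixels (here: the mean of the 4-connected
    neighbours that lie inside the image)."""
    h, w = temps.shape
    candidates = [(row - 1, col), (row + 1, col), (row, col - 1), (row, col + 1)]
    values = [float(temps[r, c]) for r, c in candidates if 0 <= r < h and 0 <= c < w]
    out = temps.astype(float).copy()
    out[row, col] = sum(values) / len(values)
    return out

temps = np.array([[20.0, 21.0, 20.0],
                  [21.0, 90.0, 21.0],   # the 90.0 reading is a dead pixel
                  [20.0, 21.0, 20.0]])
fixed = remove_dead_pixel(temps, 1, 1)
print(fixed[1, 1])  # 21.0
```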
  • the embodiment of the present application can identify, through user marking, the dead pixels generated during use of the infrared sensor after it leaves the factory, and remove the dead pixels marked by the user, that is, calculate their final temperature values, thereby improving the quality of infrared imaging.
  • the infrared image processing method may include the following steps:
  • S501 is the same as S201, and will not be repeated here.
  • the coordinates of the static dead pixels can be calibrated by the manufacturer of the infrared sensor before the infrared sensor leaves the factory.
  • the embodiment of the present application does not limit the sequence of implementing S502 and S501.
  • S503: Merge the initial pixel-level response rate data table, the coordinates of the dead pixels marked by the user, and the coordinates of the static dead pixels to generate a third pixel-level response rate data table.
  • that is, the initial pixel-level response rate data table, the coordinates of the dead pixels marked by the user, and the coordinates of the static dead pixels are merged to generate the third pixel-level response rate data table.
  • the initial pixel-level response rate data table and the third pixel-level response rate data table both include the response rates corresponding to the different pixels. In the third pixel-level response rate data table, the response rate corresponding to each user-marked dead pixel and each static dead pixel is a preset value, for example (but not limited to) 0.
  • if the response rate corresponding to a pixel in the third pixel-level response rate data table equals the preset value, it can be determined that the pixel is a user-marked dead pixel or a static dead pixel.
  • the user-marked dead pixels and static dead pixels in the infrared image are the pixels whose response rate equals the preset value (such as 0).
  • S504: Identify the dead pixels of the infrared image acquired by the infrared sensor according to the third pixel-level response rate data table.
  • the pixels whose response rate equals the preset value are the dead pixels, including both the dead pixels marked by the user and the static dead pixels.
  • Alternatively, the dead pixels marked by the user can first be merged with the initial pixel-level response rate data table to generate the second pixel-level response rate data table; the static dead pixels are then merged with the initial pixel-level response rate data table to generate the third pixel-level response rate data table.
  • the dead pixels marked by the user in the infrared image can be identified according to the second pixel-level response rate data table, and the static dead pixels in the infrared image can be identified according to the third pixel-level response rate data table.
  • the above-mentioned initial pixel-level response rate data table, the coordinates of the static dead pixels, and the coordinates of the dead pixels marked by the user can be stored in static memory, while the second and third pixel-level response rate data tables can be stored in dynamic memory, to improve the rate of identifying dead pixels.
  • a second dead pixel table may be generated according to the coordinates of the dead pixels marked by the user and the coordinates of the static dead pixels.
  • the first dead pixel table can also be generated separately according to the coordinates of the dead pixels marked by the user, and the second dead pixel table generated separately according to the coordinates of the static dead pixels.
  • the dead pixels marked by the user in the infrared image can be identified by querying the first dead pixel table, and the static dead pixels in the infrared image can be identified by querying the second dead pixel table.
  • the above-mentioned first dead pixel table and second dead pixel table can be stored in dynamic memory to improve the rate of identifying dead pixels.
  • the final temperature value of each dead pixel in the infrared image can be determined according to the infrared image obtained by the infrared sensor, the coordinates of the dead pixels marked by the user, and the coordinates of the static dead pixels. Determining the final temperature value is how the dead pixel is removed.
  • the final temperature value of the dead pixel can be obtained by interpolation according to the temperature values of other pixels around it.
  • the embodiment of the application can identify both the dead pixels generated during the user's use of the infrared sensor after it leaves the factory and the static dead pixels produced during manufacturing, and remove the user-marked dead pixels and static dead pixels, improving the infrared imaging quality.
  • the infrared image processing method may include the following steps:
  • S601 is the same as S501, and will not be repeated here.
  • S602 is the same as S502, and will not be repeated here.
  • S603: Merge the initial pixel-level response rate data table, the coordinates of the dead pixels marked by the user, and the coordinates of the static dead pixels to generate a third pixel-level response rate data table.
  • S603 is the same as S503, and will not be repeated here.
  • S604: Identify the dead pixels of the infrared image acquired by the infrared sensor according to the third pixel-level response rate data table.
  • S604 is the same as S504, and will not be repeated here.
  • S605 is the same as S505, and will not be repeated here.
  • S606: Determine the coordinates of the dynamic dead pixels.
  • Method 1: determine the neighborhood of the first pixel in the infrared image acquired by the infrared sensor, then calculate the average of the temperature values of all pixels in the neighborhood except the first pixel. When the absolute value of the difference between the initial temperature value of the first pixel and this average exceeds a preset threshold, the first pixel is determined to be a dynamic dead pixel.
  • the first pixel is any pixel in the infrared image obtained by the infrared sensor, and the neighborhood includes a preset number of pixels.
  • the coordinate of the first pixel is the coordinate of the dynamic dead pixel.
  • Method 2: determine the neighborhood of the first pixel in the infrared image acquired by the infrared sensor, and determine the probability that the first pixel is a dynamic dead pixel according to the temperature values of the pixels in the neighborhood.
  • the neighborhood includes a preset number of pixels; the first pixel is any pixel in the infrared image obtained by the infrared sensor; the coordinates of the first pixel are the coordinates of the dynamic dead pixels.
  • the aforementioned neighborhood is centered on the first pixel.
  • the aforementioned neighborhood is not centered on the first pixel.
  • S607 Identify the dynamic dead pixels in the infrared image obtained by the infrared sensor according to the coordinates of the dynamic dead pixels.
  • the pixel points corresponding to the coordinates are the dynamic dead pixels.
  • the final temperature value of the first pixel can be obtained by interpolation according to the temperature values of other pixels around the first pixel.
  • the final temperature of the first pixel can be determined according to the initial temperature value of the first pixel, the probability that the first pixel is a dynamic dead pixel, and the reconstructed temperature value of the first pixel.
  • the reconstructed temperature value of the first pixel point is obtained by interpolation according to the temperature value of one or more pixel points in the above-mentioned neighborhood.
  • the final temperature value of the first pixel is the sum of two terms: the product of the reconstructed temperature value and the probability that the first pixel is a dynamic dead pixel, and the product of the initial temperature value of the first pixel and the probability that the first pixel is not a dynamic dead pixel. The probability that the first pixel is a dynamic dead pixel and the probability that it is not sum to 1.
  • T_final = T_recon * P_dead + T_init * (1 - P_dead), where T_final is the final temperature value of the first pixel, T_recon is the reconstructed temperature value of the first pixel, T_init is the initial temperature value of the first pixel, and P_dead is the probability that the first pixel is a dynamic dead pixel.
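The soft-threshold blend above can be written directly (an illustrative sketch; the variable names are assumptions):

```python
def final_temperature(t_initial, t_reconstructed, p_dead):
    """Soft-threshold removal of a dynamic dead pixel:
    T_final = T_recon * P_dead + T_init * (1 - P_dead)."""
    return t_reconstructed * p_dead + t_initial * (1.0 - p_dead)

# Certain dead pixel (P = 1): the reconstructed value replaces the reading.
print(final_temperature(90.0, 21.0, 1.0))  # 21.0
# Uncertain pixel (P = 0.5): the output blends both values equally.
print(final_temperature(90.0, 21.0, 0.5))  # 55.5
```

Because the probability enters as a continuous weight rather than a hard cut, a small hot object that only resembles a dead pixel is attenuated rather than erased, which matches the stated goal of preserving remote detection capability.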
  • the reconstructed temperature value of the first pixel is obtained by interpolation based on the temperature value of a pixel in the above-mentioned neighborhood, and the reconstructed temperature value is the temperature value of the above-mentioned one pixel.
  • for example, the reconstructed temperature value of the first pixel P(i,j) is obtained by interpolation from the temperature value of the pixel P(i,j-1) in the second neighborhood; that is, the reconstructed temperature value of P(i,j) is the temperature value of P(i,j-1).
  • the reconstructed temperature value of the first pixel is obtained by interpolation according to the temperature values of multiple pixel points in the above-mentioned neighborhood, and the reconstructed temperature value is an average value or a weighted average value of the temperature values of the above-mentioned multiple pixel points.
  • for example, the reconstructed temperature value of the first pixel P(i,j) is obtained by interpolating the temperature values of the multiple pixels P(i,j-2), P(i,j-1), P(i,j+1), and P(i,j+2) in the second neighborhood. The reconstructed temperature value may be the average of the temperature values of these four adjacent pixels; alternatively, the four adjacent pixels may carry different weights, and the reconstructed temperature value of the first pixel is then the weighted sum of their temperature values, where the weights of the four adjacent pixels sum to 1.
  • the reconstructed temperature value of the first pixel can also be obtained according to the interpolation of the temperature values of other pixels in the second neighborhood, which is not limited in the embodiment of the present application. For example, it can be determined that the first pixel point is on the edge or contour of the image according to the content of the image, and the direction of interpolation can be a direction along the contour or edge. That is, the reconstructed temperature value of the first pixel may be obtained by interpolation of the temperature values of other pixels located on the contour or edge in the second neighborhood.
  • the embodiment of the application can identify, through user marking, the dead pixels generated during use of the infrared sensor after it leaves the factory, and can also identify static dead pixels and dynamic dead pixels, removing all of them to improve the infrared imaging quality.
  • the soft-threshold judgment method can accurately determine the probability that the current pixel is a dynamic dead pixel, which effectively reduces the side effect of misjudging a single-pixel object (or a 2-3-pixel object) as a dynamic dead pixel; removing dynamic dead pixels this way does not reduce the remote discovery and detection capabilities of the infrared system.
  • the embodiment of the application can identify the dead pixels generated during the user's use of the infrared sensor after it leaves the factory, the static dead pixels produced during manufacturing, and the dynamic dead pixels, and remove all of them to improve the quality of infrared imaging.
  • FIG. 7 exemplarily shows a user interface 70 for displaying infrared images.
  • the user interface 70 is a user interface displayed by the infrared display terminal 20.
  • the user interface 70 may include an image display area 701 and a mark control 702, where:
  • the image display area 701 can be used to display infrared images.
  • the user can view the infrared image taken by the infrared photographing terminal 10 through the image display area 701.
  • the infrared image displayed in the image display area 701 shown in FIG. 7 contains three dead pixels, dead pixel 1 (x1, y1), dead pixel 2 (x2, y2), and dead pixel 3 (x3, y3).
  • the marking control 702 may be used to receive a user operation for starting to mark a dead pixel.
  • the user can input a first user operation for enlarging the area to be selected (the area 7011 in the white dotted line frame in FIG. 7).
  • the first user operation can be, but is not limited to, double-clicking the to-be-selected area, or a two-finger reverse sliding operation. After the area to be selected is enlarged, it is convenient for the user to accurately select the dead pixels.
  • Figure 8 shows the area to be selected after zooming in.
  • the user can input a second user operation in the enlarged to-be-selected area 7011, and the second user operation can be, but is not limited to, a click operation, a double-click operation, or a long-press operation.
  • the infrared display terminal 20 may determine that the pixel at the point acted on by the second user operation (such as (x1, y1)) is the dead pixel to be added, and display the add control 703 shown in FIG. 9 in the user interface 70.
  • the infrared display terminal 20 can detect a user operation (such as a click operation) acting on the adding control 703, and in response to this operation, the infrared display terminal 20 can determine (x1, y1) as the dead pixel coordinates marked by the user.
  • the infrared display terminal 20 can also generate a dead pixel table based on the dead pixels added by the user.
  • the user interface 70 may also include a view control 703 shown in FIG. 10. The user can click the control 703 to view the dead pixel table.
  • FIG. 11 exemplarily shows a user interface 80 for displaying the dead pixel table.
  • The user interface 80 may include a dead pixel table 801 and a return control 802, where:
  • The dead pixel table 801 may be used to display the coordinates of the dead pixels that the user has marked; the coordinates of each dead pixel correspond to a delete control 8011 for deleting that dead pixel.
  • The return control 802 can be used to return to the previous user interface of the current user interface 80, that is, the user interface 70 that displays the infrared image.
  • The infrared display terminal 20 can receive a third user operation of the user to modify the dead pixel table (such as tapping the delete control 8011); in response to this user operation, the infrared display terminal 20 can delete the coordinates of the dead pixel corresponding to the delete control 8011 and update the displayed dead pixel table.
  • The method for marking dead pixels of an infrared sensor can include the following steps:
  • S1201: Receive a first user operation for determining the area to be selected.
  • The first user operation may be, but is not limited to, double-tapping the area to be selected or a two-finger reverse sliding operation.
  • S1202: In response to the first user operation, zoom in and display the area to be selected.
  • S1203: Receive a second user operation acting on any point in the area to be selected.
  • The second user operation may be, but is not limited to, a tap, a double-tap, or a long press acting on that point.
  • S1204: In response to the second user operation, determine the coordinates of the dead pixel marked by the user from the coordinates of that point.
  • The coordinates of that point in the entire infrared image (rather than its coordinates on the current display) are the coordinates of the dead pixel marked by the user.
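As a concrete illustration of how a tapped point maps to a dead-pixel coordinate, the sketch below converts a tap inside the enlarged area back to full-image pixel coordinates (Python; the function name and parameters are illustrative assumptions, since the patent does not prescribe an implementation):

```python
def tap_to_image_coords(tap_x, tap_y, region_origin, zoom):
    """Map a tap in the zoomed view back to full-image pixel coordinates.

    tap_x, tap_y  -- tap position in the zoomed view, in view pixels
    region_origin -- (x0, y0): top-left corner of the enlarged region 7011,
                     in full-image coordinates
    zoom          -- magnification factor of the enlarged view
    """
    x0, y0 = region_origin
    # Undo the zoom, then offset by the region's position in the full image.
    return (x0 + int(tap_x / zoom), y0 + int(tap_y / zoom))

# A tap at view position (40, 60) inside a 4x-zoomed region whose top-left
# corner sits at image coordinates (100, 200):
print(tap_to_image_coords(40, 60, (100, 200), 4))  # -> (110, 215)
```

The point is that the stored coordinate is image-relative, so the resulting dead pixel table stays valid regardless of how the image is later displayed.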
  • S1205: Generate a dead pixel table from the coordinates of the dead pixels marked by the user.
  • The user can repeat S1201-S1204 to mark multiple dead pixels.
  • S1206: Display the dead pixel table.
  • The infrared display terminal 20 may generate a dead pixel table as shown in FIG. 11 from the coordinates of the one or more dead pixels marked by the user.
  • S1207: Receive a third user operation for modifying the dead pixel table.
  • The third user operation may be, for example, a tap acting on the delete control 8011 in FIG. 11.
  • S1208: In response to the third user operation, update the dead pixel table.
  • The infrared display terminal 20 may delete from the dead pixel table the coordinates of the dead pixel corresponding to the delete control 8011 on which the third user operation acts, and display the updated dead pixel table.
  • The infrared display terminal 20 can also process the infrared image acquired by the infrared photographing terminal 10 according to the updated dead pixel table to remove the dead pixels marked by the user.
  • Alternatively, the infrared display terminal 20 may send the updated dead pixel table to the infrared photographing terminal 10, so that the infrared photographing terminal 10 removes the dead pixels marked by the user and sends the corrected infrared image back to the infrared display terminal 20 for presentation to the user.
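The dead pixel table lifecycle described in these steps (add on marking, delete via the delete control, then update) can be sketched minimally as follows (Python; the class and method names are illustrative assumptions, not part of the patent):

```python
class DeadPixelTable:
    """Minimal sketch of the user-marked dead pixel table (FIG. 11)."""

    def __init__(self):
        self.coords = []  # list of (x, y) tuples in full-image coordinates

    def add(self, x, y):
        """S1204/S1205: record a dead pixel marked by the user."""
        if (x, y) not in self.coords:  # avoid duplicate entries
            self.coords.append((x, y))

    def delete(self, x, y):
        """S1207/S1208: remove a dead pixel via its delete control 8011."""
        if (x, y) in self.coords:
            self.coords.remove((x, y))

table = DeadPixelTable()
table.add(1, 2)
table.add(2, 1)
table.delete(1, 2)   # user taps the delete control for (1, 2)
print(table.coords)  # -> [(2, 1)]
```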
  • The embodiment of the present application provides a method for a user to mark dead pixels.
  • By having the user mark dead pixels independently, the dead pixels produced during use after the infrared sensor leaves the factory can be removed, thereby improving the quality of infrared imaging.
  • FIG. 13 shows a schematic structural diagram of an infrared image processing device provided by an embodiment of the present application.
  • The infrared image processing device 130 may include at least one processor 1301 (such as a CPU), a memory 1303, and at least one communication bus 1302, where the communication bus 1302 is used to implement connection and communication between these components.
  • The memory 1303 may be a high-speed RAM memory or a non-volatile memory, such as at least one disk memory; the memory 1303 includes the flash memory in the embodiment of the present invention.
  • The memory 1303 may optionally also be at least one storage system located far away from the aforementioned processor 1301.
  • If the infrared image processing device 130 is the infrared photographing terminal 10, the infrared image processing device 130 may further include an infrared sensor and at least one network interface.
  • The infrared sensor can be used to obtain infrared images.
  • The network interface may optionally include a standard wired interface and a wireless interface (such as a WI-FI interface), and a communication connection with the infrared display terminal 20 can be established through the network interface.
  • If the infrared image processing device 130 is the infrared display terminal 20, the infrared image processing device 130 may also include a display screen, a user interface, and a network interface.
  • The display screen can be used to display infrared images.
  • The network interface may optionally include a standard wired interface and a wireless interface (such as a WI-FI interface), and a communication connection with the infrared photographing terminal 10 can be established through the network interface.
  • The user interface may be used to receive user operations performed by the user on the infrared image, or to control the infrared photographing terminal 10.
  • The user interface may include a touch screen, a keyboard or mouse, a joystick, physical buttons, and so on.
  • The network interface can be connected to a receiver, a transmitter, or other communication modules; other communication modules can include, but are not limited to, a WiFi module, a Bluetooth module, and the like.
  • It can be understood that the infrared image processing device 130 in the embodiment of the present invention may also include receivers, transmitters, and other communication modules.
  • If the infrared image processing device 130 is an integration of the infrared photographing terminal 10 and the infrared display terminal 20, the infrared image processing device 130 may also include an infrared sensor, a display screen, and a user interface.
  • The infrared sensor can be used to obtain infrared images.
  • The display screen can be used to display infrared images.
  • The user interface can be used to receive user operations performed by the user on the infrared image.
  • The processor 1301 may be used to call the program instructions stored in the memory 1303 and perform the following operations: acquiring the coordinates of the dead pixels marked by the user, where the dead pixels marked by the user are dead pixels produced by the infrared sensor during use by the user after leaving the factory; identifying the dead pixels in the infrared image acquired by the infrared sensor according to those coordinates; and removing the dead pixels in the infrared image.
  • If the infrared image processing device 130 is an integration of the infrared photographing terminal 10 and the infrared display terminal 20, then when removing the dead pixels in the infrared image it specifically executes: determining the final temperature values of the dead pixels in the infrared image according to the infrared image acquired by the infrared sensor and the coordinates of the dead pixels marked by the user.
  • If the infrared image processing device 130 is the infrared display terminal 20 and the infrared display terminal includes the infrared sensor, then when acquiring the coordinates of the dead pixels marked by the user it specifically executes: receiving a user operation for marking dead pixels; and, in response to the user operation, determining the coordinates of the dead pixels marked by the user.
  • If the infrared image processing device 130 is the infrared photographing terminal 10, then when acquiring the coordinates of the dead pixels marked by the user it specifically executes: receiving the coordinates of the dead pixels marked by the user sent by the infrared display terminal, where those coordinates are determined based on a user operation acting on the infrared display terminal.
  • If the device is applied to an infrared imaging system, the system includes the infrared photographing terminal and the device, and the infrared photographing terminal includes the infrared sensor.
  • When the infrared image processing device 130 acquires the coordinates of the dead pixels marked by the user, it specifically executes: receiving a user operation for marking the dead pixels; and, in response to the user operation, determining the coordinates of the dead pixels marked by the user.
  • Before identifying the dead pixels in the infrared image acquired by the infrared sensor according to the coordinates of the dead pixels marked by the user, the infrared image processing device 130 further executes: receiving the infrared image sent by the infrared photographing terminal.
  • The infrared image processing device 130 further executes: fusing the initial pixel-level response rate data table and the coordinates of the dead pixels marked by the user to generate a second pixel-level response rate data table. Both the initial and the second pixel-level response rate data tables include different pixel points and the response rate corresponding to each pixel point; in the second pixel-level response rate data table, the response rate corresponding to the dead pixels marked by the user is a preset value.
  • When identifying the dead pixels in the infrared image obtained by the infrared sensor according to the coordinates of the dead pixels marked by the user, it specifically executes: identifying the dead pixels in that infrared image according to the second pixel-level response rate data table.
  • After acquiring the coordinates of the dead pixels marked by the user, the infrared image processing device 130 further executes: generating a first dead pixel table from the coordinates of the dead pixels marked by the user.
  • When identifying the dead pixels in the infrared image obtained by the infrared sensor according to the coordinates of the dead pixels marked by the user, it specifically executes: identifying the dead pixels in that infrared image according to the first dead pixel table.
  • The infrared image processing device 130 further executes: acquiring the coordinates of the static dead pixels calibrated at the factory.
  • When identifying the dead pixels in the infrared image acquired by the infrared sensor based on the coordinates of the dead pixels marked by the user, it specifically executes: identifying the dead pixels in the infrared image based on the coordinates of the dead pixels marked by the user and the coordinates of the static dead pixels.
  • The infrared image processing device 130 further executes: fusing the initial pixel-level response rate data table, the coordinates of the dead pixels marked by the user, and the coordinates of the static dead pixels to generate a third pixel-level response rate data table; in the third pixel-level response rate data table, the response rates corresponding to the dead pixels marked by the user and to the static dead pixels are preset values.
  • When identifying the dead pixels in the infrared image obtained by the infrared sensor based on the coordinates of the dead pixels marked by the user and the coordinates of the static dead pixels, it specifically executes: identifying the dead pixels in that infrared image according to the third pixel-level response rate data table.
  • When removing the dead pixels in the infrared image, it specifically executes: removing the dead pixels marked by the user and the static dead pixels in the infrared image.
  • After acquiring the coordinates of the dead pixels marked by the user, the infrared image processing device 130 further executes: generating a second dead pixel table from the coordinates of the dead pixels marked by the user and the coordinates of the static dead pixels.
  • When identifying the dead pixels in the infrared image obtained by the infrared sensor according to the coordinates of the dead pixels marked by the user, it specifically executes: identifying the dead pixels in that infrared image according to the second dead pixel table.
  • When removing the dead pixels in the infrared image, it specifically executes: determining the final temperature values of the dead pixels in the infrared image according to the infrared image obtained by the infrared sensor, the coordinates of the dead pixels marked by the user, and the coordinates of the static dead pixels.
  • The infrared image processing device 130 further executes: determining the coordinates of the dynamic dead pixels; identifying the dynamic dead pixels in the infrared image obtained by the infrared sensor according to those coordinates; and removing the dynamic dead pixels.
  • When removing the dynamic dead pixels, it specifically executes: determining the final temperature values of the dynamic dead pixels according to the infrared image obtained by the infrared sensor and the coordinates of the dynamic dead pixels.
  • When determining the coordinates of the dynamic dead pixels, the infrared image processing device 130 specifically executes: determining the neighborhood of a first pixel in the infrared image acquired by the infrared sensor, where the first pixel is any pixel in that image and the neighborhood includes a preset number of pixels; calculating the average of the temperature values of all pixels in the neighborhood except the first pixel; and, when the absolute value of the difference between the initial temperature value of the first pixel and the average exceeds a preset threshold, determining that the first pixel is a dynamic dead pixel. The coordinates of the first pixel are the coordinates of the dynamic dead pixel.
  • Alternatively, when determining the coordinates of the dynamic dead pixels, it specifically executes: determining the neighborhood of a first pixel in the infrared image acquired by the infrared sensor, where the neighborhood includes a preset number of pixels and the first pixel is any pixel in that image; and determining, from the temperature values of the pixels in the neighborhood, the probability that the first pixel is a dynamic dead pixel. The coordinates of the first pixel are the coordinates of the dynamic dead pixel.
  • In the latter case, when removing the dynamic dead pixels, it specifically executes: determining the final temperature value of the first pixel according to the initial temperature value of the first pixel, the probability that the first pixel is a dynamic dead pixel, and the reconstructed temperature value of the first pixel, where the reconstructed temperature value is obtained by interpolation from the temperature values of one or more pixels in the neighborhood.
  • The final temperature value of the first pixel is the sum of the following two terms: the product of the reconstructed temperature value and the probability that the first pixel is a dynamic dead pixel, and the product of the initial temperature value and the probability that the first pixel is not a dynamic dead pixel; the probability that the first pixel is a dynamic dead pixel and the probability that it is not sum to 1.
  • The reconstructed temperature value of the first pixel may be obtained by interpolation from the temperature value of one pixel in the neighborhood, in which case the reconstructed temperature value is the temperature value of that pixel; or it may be obtained by interpolation from the temperature values of multiple pixels in the neighborhood, in which case it is the average or weighted average of those temperature values.
  • The aforementioned neighborhood is centered on the first pixel.
  • The functions of the infrared image processing device 130 in this embodiment can be implemented according to the methods in the infrared image processing method embodiments provided in FIG. 2, FIG. 5, or FIG. 6, and will not be repeated here.
  • FIG. 14 shows a schematic structural diagram of a dead pixel marking device for an infrared sensor provided by an embodiment of the present application.
  • The dead pixel marking device 140 for an infrared sensor may include at least one processor 1401 (such as a CPU), a user interface 1403, a memory 1405, at least one communication bus 1402, and a display screen 1406. Optionally, it may also include at least one network interface 1404 and an infrared sensor. The communication bus 1402 is used to implement connection and communication between these components.
  • The user interface 1403 may include a touch screen, a keyboard or mouse, a joystick, physical buttons, and so on. The infrared sensor can be used to obtain infrared images.
  • The network interface 1404 may optionally include a standard wired interface and a wireless interface (such as a WI-FI interface), and a communication connection with the server can be established through the network interface 1404.
  • The memory 1405 may be a high-speed RAM memory or a non-volatile memory, such as at least one disk memory; the memory 1405 includes the flash memory in the embodiment of the present invention.
  • The memory 1405 may optionally also be at least one storage system located far away from the aforementioned processor 1401.
  • The memory 1405, as a computer storage medium, may include an operating system, a network communication module, a user interface module, and program instructions.
  • The network interface 1404 can be connected to a receiver, a transmitter, or other communication modules; other communication modules can include, but are not limited to, a WiFi module, a Bluetooth module, and the like. It can be understood that the dead pixel marking device for an infrared sensor in the embodiment of the present application may also include receivers, transmitters, and other communication modules.
  • The processor 1401 can be used to call the program instructions stored in the memory 1405 and perform the following operations: receiving, through the user interface 1403, a user operation for marking dead pixels, where the dead pixels are produced by the infrared sensor during use by the user after leaving the factory; and, in response to the user operation, determining the coordinates of the dead pixels marked by the user.
  • Before receiving the user operation for marking dead pixels, the processor 1401 further executes: receiving, through the user interface 1403, a first user operation for determining the area to be selected; and, in response to the first user operation, zooming in and displaying the area to be selected.
  • When receiving a user operation for marking a dead pixel through the user interface 1403, the processor 1401 specifically executes: receiving, through the user interface 1403, a second user operation acting on any point in the area to be selected.
  • When determining the coordinates of the dead pixels marked by the user in response to the user operation, it specifically executes: in response to the second user operation, determining the coordinates of the dead pixel marked by the user from the coordinates of that point.
  • After determining the coordinates of the dead pixels marked by the user in response to the user operation, the processor 1401 further executes: generating a dead pixel table from the coordinates of the dead pixels marked by the user, and displaying the dead pixel table.
  • The processor 1401 further executes: receiving, through the user interface 1403, a third user operation for modifying the dead pixel table; and, in response to the third user operation, updating the dead pixel table.
  • The functions of the dead pixel marking device 140 for an infrared sensor of this embodiment can be implemented according to the method in the dead pixel marking method embodiment provided in FIG. 12, and will not be repeated here.
  • The program can be stored in a computer-readable storage medium, and when executed it may include the procedures of the above method embodiments.
  • The storage medium can be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like. In the case of no conflict, the technical features in the embodiments and implementations can be combined arbitrarily.


Abstract

An infrared image processing method, a dead pixel marking method, and related devices. The infrared image processing method includes: acquiring the coordinates of the dead pixels marked by a user (S201), where the dead pixels marked by the user are dead pixels produced by an infrared sensor during use by the user after leaving the factory; identifying the dead pixels in the infrared image acquired by the infrared sensor according to the coordinates of the dead pixels marked by the user (S202); and removing the dead pixels in the infrared image (S203). The method can remove, by way of user marking, the dead pixels produced by the infrared sensor during use after leaving the factory, improving infrared imaging quality.

Description

Infrared image processing method, dead pixel marking method and related devices

Technical Field

This application relates to the field of infrared image processing, and in particular to an infrared image processing method, a dead pixel marking method, and related devices.

Background

During the manufacture of an infrared sensor, dead pixels arise due to the influence of dust, the manufacturing process, and so on. Some dead pixels also arise during use. The existence of dead pixels affects the quality of the infrared sensor, thereby degrading the user experience and interfering with the ability to detect temperature differences at long range.

Most dead-pixel processing techniques in the prior art handle the dead pixels produced during manufacture and cannot handle those produced during use, so the imaging quality of the infrared sensor is not high.

Summary

The embodiments of this application provide an infrared image processing method, a dead pixel marking method, and related devices, which can improve infrared imaging quality.

In a first aspect, an embodiment of this application provides an infrared image processing method, including: acquiring the coordinates of the dead pixels marked by a user, where the dead pixels marked by the user are dead pixels produced by an infrared sensor during use by the user after leaving the factory; identifying the dead pixels in the infrared image acquired by the infrared sensor according to the coordinates of the dead pixels marked by the user; and removing the dead pixels in the infrared image.

The infrared image processing method provided by the embodiment of this application can remove, by way of user marking, the dead pixels produced by the infrared sensor during use after leaving the factory, improving infrared imaging quality.

In a second aspect, an embodiment of this application provides a dead pixel marking method for an infrared sensor, including: receiving a user operation for marking dead pixels, where the dead pixels are produced by the infrared sensor during use by the user after leaving the factory; and, in response to the user operation, determining the coordinates of the dead pixels marked by the user.

The dead pixel marking method for an infrared sensor provided by the embodiment of this application can determine, through user operations, the dead pixels produced by the infrared sensor during use after leaving the factory, so that these dead pixels can be removed and infrared imaging quality improved.

In a third aspect, an embodiment of this application provides an infrared image processing device, including a memory and a processor, where the memory is coupled with the processor and is used to store program instructions, and the processor is used to call the program instructions in the memory and execute: acquiring the coordinates of the dead pixels marked by a user, where the dead pixels marked by the user are dead pixels produced by an infrared sensor during use by the user after leaving the factory; identifying the dead pixels in the infrared image acquired by the infrared sensor according to the coordinates of the dead pixels marked by the user; and removing the dead pixels in the infrared image.

The embodiment of this application can remove, by way of user marking, the dead pixels produced by the infrared sensor during use after leaving the factory, improving infrared imaging quality.

In a fourth aspect, an embodiment of this application provides a dead pixel marking device for an infrared sensor, including a memory, a processor, and a user interface, where the memory and the user interface are coupled with the processor, the memory is used to store program instructions, and the processor is used to call the program instructions in the memory and execute: receiving, through the user interface, a user operation for marking dead pixels, where the dead pixels are produced by the infrared sensor during use by the user after leaving the factory; and, in response to the user operation, determining the coordinates of the dead pixels marked by the user.

The embodiment of this application can determine, through user operations, the dead pixels produced by the infrared sensor during use after leaving the factory, so that these dead pixels can be removed and infrared imaging quality improved.

In a fifth aspect, an embodiment of this application provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the infrared image processing method provided in the first aspect of the embodiments of this application is implemented.

In a sixth aspect, an embodiment of this application provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the dead pixel marking method for an infrared sensor provided in the second aspect of the embodiments of this application is implemented.
Brief Description of the Drawings

To explain the technical solutions in the embodiments of the present invention or in the prior art more clearly, the drawings needed in the embodiments are briefly introduced below.

FIG. 1 is a schematic architectural diagram of an infrared imaging system provided by an embodiment of this application;
FIG. 2 is a schematic flowchart of an infrared image processing method provided by an embodiment of this application;
FIG. 3 is a schematic comparison of the initial pixel-level response rate data table and the second pixel-level response rate data table provided by an embodiment of this application;
FIG. 4 is a schematic diagram of a dead pixel table provided by an embodiment of this application;
FIG. 5 is a schematic flowchart of an infrared image processing method provided by another embodiment of this application;
FIG. 6 is a schematic flowchart of an infrared image processing method provided by another embodiment of this application;
FIGS. 7-11 are schematic diagrams of some user interfaces for marking dead pixels provided by embodiments of this application;
FIG. 12 is a schematic flowchart of a dead pixel marking method for an infrared sensor provided by an embodiment of this application;
FIG. 13 is a schematic structural diagram of an infrared image processing device provided by an embodiment of this application;
FIG. 14 is a schematic structural diagram of a dead pixel marking device for an infrared sensor provided by an embodiment of this application.
Detailed Description

The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments of the present invention.

First, two concepts involved in the embodiments of this application are introduced:

Static dead pixel: can be identified by response rate. For example, a pixel whose response rate is out of range is a static dead pixel: a pixel whose response rate lies within the range X to Y is not a static dead pixel, while a pixel whose response rate is greater than Y or less than X is a static dead pixel. For example, X equals 0.5 and Y equals 1.

Dynamic dead pixel: within a certain pixel range the pixel displays normally, but beyond that range its brightness value is brighter than that of the surrounding pixels. This is related to the temperature and gain of the sensor: when the sensor temperature rises or the gain increases, dynamic dead pixels become more obvious. A pixel whose response rate is greater than a threshold is a dynamic dead pixel. For example, a pixel that is present in some image frames but not in others and whose response rate is greater than the threshold is a dynamic dead pixel. Pixels that newly become defective during use by the user after the infrared sensor leaves the factory are also dynamic dead pixels.

FIG. 1 shows a schematic architectural diagram of an infrared imaging system provided by an embodiment of the present invention. The system includes an infrared photographing terminal 10 and an infrared display terminal 20. The infrared photographing terminal 10 may include an infrared sensor for acquiring infrared images. The infrared display terminal 20 may be used to receive the infrared image sent by the infrared photographing terminal 10 and display it.

In some possible embodiments, the infrared photographing terminal 10 may be mounted on a movable platform (including but not limited to an aircraft, a ship, or a car; this application takes an aircraft 30 as an example), specifically on the gimbal of the aircraft 30, to complete aerial photography tasks for the corresponding targets while the aircraft 30 is in flight. Possibly, when the infrared photographing terminal 10 is mounted on the aircraft, the infrared display terminal 20 may be the ground control device of the aircraft 30. The ground control device of the aircraft 30 can establish a communication connection wirelessly (for example, based on WIFI or radio-frequency communication) to control the flight trajectory of the aircraft 30 and receive the infrared images sent by the infrared photographing terminal 10, and it may include a display screen for displaying those images. In this case, the infrared photographing terminal 10 captures the infrared images/videos, the aircraft 30 carries out the infrared image processing provided by this application, and the processed infrared images/videos are sent to the ground control device for display. Alternatively, the infrared photographing terminal 10 captures the infrared images/videos, the aircraft 30 forwards them to the ground control device, and the ground control device carries out the infrared image processing provided by this application and displays the images.

The ground control device of the aircraft 30 may be a controller with a joystick, or a smart device such as a smartphone or a tablet.

In some other possible embodiments, the infrared photographing terminal 10 may be integrated with the infrared display terminal 20; that is, one device can both capture and display infrared images. For example, an infrared camera includes both the infrared photographing terminal 10 and the infrared display terminal 20. In this case, the infrared camera captures the infrared images/videos, carries out the infrared image processing provided by this application, and displays the processed infrared images/videos.
Next, with reference to the architecture of the infrared imaging system shown in FIG. 1, the infrared image processing method provided by the embodiments of this application is introduced. As shown in FIG. 2, the infrared image processing method may include the following steps:

S201: Acquire the coordinates of the dead pixels marked by the user.

Specifically, the dead pixels marked by the user are dead pixels produced during use by the user after the infrared sensor leaves the factory.

In one possible embodiment, the method is applied to the infrared display terminal 20, and the infrared display terminal 20 includes the infrared sensor; that is, the infrared photographing terminal 10 and the infrared display terminal 20 are integrated. The infrared display terminal 20 can receive a user operation for marking dead pixels and, in response to the user operation, determine the coordinates of the dead pixels marked by the user.

Possibly, the infrared display terminal 20 may include a touch screen on which the user can input the user operation for marking dead pixels.

Possibly, the infrared display terminal 20 may include buttons through which the user can input the user operation for marking dead pixels.

In another possible embodiment, the method is executed by the infrared photographing terminal 10, which includes the infrared sensor. The infrared display terminal 20 can receive a user operation for marking dead pixels and, in response, determine the coordinates of the dead pixels marked by the user; the infrared display terminal 20 then sends the coordinates of the dead pixels to the infrared photographing terminal 10. Besides the touch-screen or button input described in the preceding embodiments, the user operation for marking dead pixels may also be input through a joystick, in which case the infrared display terminal 20 may be the ground control device of the aircraft 30, which may be a controller with a joystick.

In another possible embodiment, the method is executed by the infrared display terminal 20, and the infrared photographing terminal 10 includes the infrared sensor. The infrared display terminal 20 can receive the infrared image sent by the infrared photographing terminal 10 and display it. The user can input a user operation for marking dead pixels on the infrared display terminal 20, and in response to the user operation the infrared display terminal 20 can determine the coordinates of the dead pixels marked by the user. Besides the touch-screen or button input described in the preceding embodiments, the user operation may also be input through a joystick, in which case the infrared display terminal 20 may be the ground control device of the aircraft 30, which may be a controller with a joystick.

S202: Identify the dead pixels in the infrared image acquired by the infrared sensor according to the coordinates of the dead pixels marked by the user.

Possibly, after the coordinates of the dead pixels marked by the user are acquired, the initial pixel-level response rate data table and the coordinates of the dead pixels marked by the user may be fused to generate a second pixel-level response rate data table. Both the initial and the second pixel-level response rate data tables include different pixel points and the response rate corresponding to each pixel point; in the second table, the response rate corresponding to the dead pixels marked by the user is a preset value, for example but not limited to 0. Whether a pixel is a dead pixel can be determined by checking whether its response rate in the second table equals the preset value; the dead pixels in the infrared image are exactly the pixels whose response rate is the preset value (such as 0).

FIG. 3 exemplarily shows a comparison of the initial pixel-level response rate data table and the second pixel-level response rate data table. The initial table is shown on the left of FIG. 3. If the dead pixel coordinates marked by the user are (1,2), (2,1), and (n,n-1), then in the second table the response rates corresponding to the pixel P(1,2) at coordinates (1,2), the pixel P(2,1) at coordinates (2,1), and the pixel P(n,n-1) at coordinates (n,n-1) are all changed to 0, as shown on the right of FIG. 3.

The initial and second pixel-level response rate data tables are not limited to the form shown in FIG. 3; in specific implementations they may take other forms, which this embodiment does not limit.
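A minimal sketch of the fusion described above, setting the response rate of each user-marked dead pixel to a preset value and then identifying dead pixels by that value (Python with NumPy; the preset value 0 follows the example in the text, and the function names are illustrative assumptions):

```python
import numpy as np

def fuse_response_table(initial_table, dead_coords, preset=0.0):
    """Generate the second pixel-level response rate data table by setting
    the response rate of every user-marked dead pixel to the preset value."""
    second_table = initial_table.copy()  # keep the initial table intact
    for x, y in dead_coords:
        second_table[x, y] = preset
    return second_table

def identify_dead_pixels(table, preset=0.0):
    """A pixel is a dead pixel iff its response rate equals the preset value."""
    return [(int(i), int(j)) for i, j in np.argwhere(table == preset)]

initial = np.ones((4, 4))            # all pixels responding normally
second = fuse_response_table(initial, [(1, 2), (2, 1)])
print(identify_dead_pixels(second))  # -> [(1, 2), (2, 1)]
```

Keeping the initial table untouched matches the text's suggestion of storing it in static memory while the fused table lives in dynamic memory.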
Possibly, after the coordinates of the dead pixels marked by the user are acquired, a first dead pixel table may be generated from those coordinates. The first dead pixel table records the positions of the dead pixels, and the dead pixels in the infrared image acquired by the infrared sensor can be identified by querying it.

FIG. 4 exemplarily shows the first dead pixel table, which records the coordinates of the dead pixels marked by the user; the pixel coordinates of the dead pixels shown in FIG. 4 are (1,2), (2,1), and (n,n-1). By querying the first dead pixel table, the dead pixels of the infrared sensor can be identified as P(1,2), P(2,1), and P(n,n-1).

The first dead pixel table is not limited to the form shown in FIG. 4; in specific implementations it may take other forms, which this embodiment does not limit.

Possibly, the initial pixel-level response rate data table and the coordinates of the dead pixels marked by the user can be saved in static memory, while the second pixel-level response rate data table can be saved in dynamic memory, to increase the speed of identifying dead pixels. Static memory refers to fixed storage space allocated when the program is compiled; dynamic memory refers to storage space allocated dynamically during program execution.

S203: Remove the dead pixels in the infrared image.

Specifically, the final temperature values of the dead pixels in the infrared image can be determined according to the infrared image acquired by the infrared sensor and the coordinates of the dead pixels marked by the user; determining the final temperature value is what removes the dead pixel.

Specifically, the final temperature value of a dead pixel can be obtained by interpolation from the temperature values of the other pixels around it.
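A minimal sketch of this interpolation-based removal (Python with NumPy; averaging the valid 4-neighbors is just one possible interpolation, since the text only requires using surrounding pixels):

```python
import numpy as np

def remove_dead_pixel(temps, x, y):
    """Replace the temperature at (x, y) with the mean of its valid
    4-neighbors; the result becomes the pixel's final temperature value."""
    h, w = temps.shape
    neighbors = [(x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)]
    # Keep only neighbors that fall inside the image bounds.
    values = [temps[i, j] for i, j in neighbors if 0 <= i < h and 0 <= j < w]
    out = temps.copy()
    out[x, y] = sum(values) / len(values)
    return out

temps = np.full((3, 3), 20.0)
temps[1, 1] = 80.0                           # a dead pixel reading far too hot
print(remove_dead_pixel(temps, 1, 1)[1, 1])  # -> 20.0
```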
By letting the user mark dead pixels, this embodiment can identify the dead pixels produced during use by the user after the infrared sensor leaves the factory and remove the marked dead pixels, that is, compute their final temperature values, improving infrared imaging quality.
Next, another infrared image processing method provided by an embodiment of this application is introduced. As shown in FIG. 5, the infrared image processing method may include the following steps:

S501: Acquire the coordinates of the dead pixels marked by the user.

Specifically, S501 is the same as S201 and will not be repeated here.

S502: Acquire the coordinates of the static dead pixels calibrated at the factory.

Specifically, the coordinates of the factory-calibrated static dead pixels can be calibrated by the manufacturer of the infrared sensor before the sensor leaves the factory.

Specifically, this embodiment does not limit the order in which S501 and S502 are performed.

S503: Fuse the initial pixel-level response rate data table, the coordinates of the dead pixels marked by the user, and the coordinates of the static dead pixels to generate a third pixel-level response rate data table.

Possibly, after the coordinates of the dead pixels marked by the user and the coordinates of the factory-calibrated static dead pixels are acquired, the initial pixel-level response rate data table may be fused with the coordinates of the user-marked dead pixels and the coordinates of the static dead pixels to generate the third pixel-level response rate data table. Both the initial and the third pixel-level response rate data tables include different pixel points and the response rate corresponding to each pixel point; in the third table, the response rates corresponding to the user-marked dead pixels and to the static dead pixels are a preset value, for example but not limited to 0. Whether a pixel is a user-marked dead pixel or a static dead pixel can be determined by checking whether its response rate in the third table equals the preset value; the user-marked and static dead pixels in the infrared image are exactly the pixels whose response rate is the preset value (such as 0).

S504: Identify the dead pixels of the infrared image acquired by the infrared sensor according to the third pixel-level response rate data table.

Specifically, the pixels whose response rate in the third pixel-level response rate data table equals the preset value are the dead pixels. The dead pixels include the dead pixels marked by the user and the static dead pixels.

The scheme is not limited to fusing the user-marked dead pixels and the static dead pixels together with the initial pixel-level response rate data table to generate the third table; in this embodiment, the user-marked dead pixels may also be fused with the initial table alone to generate the second pixel-level response rate data table, and the static dead pixels then fused with the initial table to generate the third pixel-level response rate data table. The user-marked dead pixels in the infrared image can then be identified from the second table, and the static dead pixels from the third table.

The initial pixel-level response rate data table, the coordinates of the static dead pixels, and the coordinates of the user-marked dead pixels can be saved in static memory, while the second and third pixel-level response rate data tables can be saved in dynamic memory, to increase the speed of identifying dead pixels.

Possibly, after the coordinates of the user-marked dead pixels and of the factory-calibrated static dead pixels are acquired, a second dead pixel table may be generated from the coordinates of the user-marked dead pixels and the coordinates of the static dead pixels. The user-marked and static dead pixels in the infrared image acquired by the infrared sensor can then be identified by querying the second dead pixel table.

The scheme is not limited to generating the second dead pixel table from both sets of coordinates; in specific implementations, a first dead pixel table may also be generated from the coordinates of the user-marked dead pixels alone, and a second dead pixel table generated from the coordinates of the static dead pixels alone. The user-marked dead pixels in the infrared image can then be identified by querying the first dead pixel table, and the static dead pixels by querying the second dead pixel table.

The first and second dead pixel tables can be saved in dynamic memory to increase the speed of identifying dead pixels.

S505: Remove the dead pixels in the infrared image.

Specifically, the final temperature values of the dead pixels in the infrared image can be determined according to the infrared image acquired by the infrared sensor, the coordinates of the user-marked dead pixels, and the coordinates of the static dead pixels; determining the final temperature value is what removes the dead pixel.

Specifically, the final temperature value of a dead pixel can be obtained by interpolation from the temperature values of the other pixels around it.

This embodiment can identify the dead pixels produced during use by the user after the infrared sensor leaves the factory, as well as the static dead pixels produced during manufacture, and remove both the user-marked dead pixels and the static dead pixels, improving infrared imaging quality.
An embodiment of this application further provides another infrared image processing method. As shown in FIG. 6, the infrared image processing method may include the following steps:

S601: Acquire the coordinates of the dead pixels marked by the user.

Specifically, S601 is the same as S501 and will not be repeated here.

S602: Acquire the coordinates of the static dead pixels calibrated at the factory.

Specifically, S602 is the same as S502 and will not be repeated here.

S603: Fuse the initial pixel-level response rate data table, the coordinates of the dead pixels marked by the user, and the coordinates of the static dead pixels to generate a third pixel-level response rate data table.

Specifically, S603 is the same as S503 and will not be repeated here.

S604: Identify the dead pixels of the infrared image acquired by the infrared sensor according to the third pixel-level response rate data table.

Specifically, S604 is the same as S504 and will not be repeated here.

S605: Remove the dead pixels in the infrared image.

Specifically, S605 is the same as S505 and will not be repeated here.

S606: Determine the coordinates of the dynamic dead pixels.

Specifically, the coordinates of the dynamic dead pixels can be determined in the following ways:

Way one: determine the neighborhood of a first pixel in the infrared image acquired by the infrared sensor, then calculate the average of the temperature values of all pixels in the neighborhood other than the first pixel. If the absolute value of the difference between the initial temperature value of the first pixel and the average exceeds a preset threshold, the first pixel is determined to be a dynamic dead pixel. The first pixel is any pixel in the infrared image acquired by the infrared sensor, and the neighborhood includes a preset number of pixels. The coordinates of the first pixel are the coordinates of the dynamic dead pixel.
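Way one can be sketched directly (Python with NumPy; the 3x3 neighborhood and the threshold value are illustrative assumptions, since the patent only specifies a preset number of neighborhood pixels and a preset threshold):

```python
import numpy as np

def find_dynamic_dead_pixels(temps, threshold):
    """Flag a pixel as a dynamic dead pixel when its initial temperature
    deviates from the mean of its neighborhood (excluding itself) by more
    than the threshold. A 3x3 neighborhood centered on the pixel is assumed."""
    h, w = temps.shape
    dead = []
    for x in range(1, h - 1):          # interior pixels only, for simplicity
        for y in range(1, w - 1):
            patch = temps[x - 1:x + 2, y - 1:y + 2]
            mean = (patch.sum() - temps[x, y]) / 8.0  # exclude the center
            if abs(temps[x, y] - mean) > threshold:
                dead.append((x, y))
    return dead

temps = np.full((5, 5), 20.0)
temps[2, 2] = 35.0                                     # one outlier pixel
print(find_dynamic_dead_pixels(temps, threshold=5.0))  # -> [(2, 2)]
```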
Way two: determine the neighborhood of a first pixel in the infrared image acquired by the infrared sensor, and determine, from the temperature values of the pixels in the neighborhood, the probability that the first pixel is a dynamic dead pixel. The neighborhood includes a preset number of pixels; the first pixel is any pixel in the infrared image acquired by the infrared sensor; the coordinates of the first pixel are the coordinates of the dynamic dead pixel.

Possibly, the neighborhood is centered on the first pixel.

Possibly, the neighborhood is not centered on the first pixel.

S607: Identify the dynamic dead pixels in the infrared image acquired by the infrared sensor according to the coordinates of the dynamic dead pixels.

Specifically, once the coordinates of a dynamic dead pixel have been determined, the pixel at those coordinates is the dynamic dead pixel.

S608: Remove the dynamic dead pixels.

If way one above is used to determine the dynamic dead pixels, the final temperature value of the first pixel can be obtained by interpolation from the temperature values of the other pixels around it.

If way two above is used to determine the dynamic dead pixels, the final temperature value of the first pixel can be determined from the initial temperature value of the first pixel, the probability that the first pixel is a dynamic dead pixel, and the reconstructed temperature value of the first pixel, where the reconstructed temperature value is obtained by interpolation from the temperature values of one or more pixels in the neighborhood.

Specifically, the final temperature value of the first pixel is the sum of the following two terms: the product of the reconstructed temperature value and the probability that the first pixel is a dynamic dead pixel, and the product of the initial temperature value and the probability that the first pixel is not a dynamic dead pixel; the probability that the first pixel is a dynamic dead pixel and the probability that it is not sum to 1.

Specifically, T = T_reconstructed × P_dead + T_initial × (1 - P_dead), where T is the final temperature value of the first pixel, T_reconstructed is the reconstructed temperature value of the first pixel, T_initial is the initial temperature value of the first pixel, and P_dead is the probability that the first pixel is a dead pixel.

Possibly, the reconstructed temperature value of the first pixel is obtained by interpolation from the temperature value of one pixel in the neighborhood, in which case the reconstructed temperature value is the temperature value of that pixel.

For example, if the reconstructed temperature value of the first pixel P(i,j) is interpolated from the temperature value of the pixel P(i,j-1) in the second neighborhood, then the reconstructed temperature value of P(i,j) is the temperature value of P(i,j-1).

Possibly, the reconstructed temperature value of the first pixel is obtained by interpolation from the temperature values of multiple pixels in the neighborhood, in which case it is the average or weighted average of those temperature values.

For example, if the reconstructed temperature value of the first pixel P(i,j) is interpolated from the temperature values of the pixels P(i,j-2), P(i,j-1), P(i,j+1), and P(i,j+2) in the second neighborhood, the reconstructed temperature value may be the average of the temperature values of these four neighboring pixels; or the four neighboring pixels may carry different weights, in which case the reconstructed temperature value is the weighted sum of their temperature values. Naturally, the weights of the four neighboring pixels sum to 1.
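The soft-threshold removal of way two, final temperature = reconstructed temperature × P + initial temperature × (1 - P), can be sketched as follows (Python; the linear-ramp probability model and its lo/hi limits are illustrative assumptions, since the patent does not fix how the probability is computed):

```python
def dead_pixel_probability(t_initial, neighbor_temps, lo=2.0, hi=10.0):
    """Soft threshold: a deviation below `lo` gives probability 0, above `hi`
    gives 1, and in between the probability ramps linearly. (lo/hi are
    assumed values; the patent does not specify a model.)"""
    mean = sum(neighbor_temps) / len(neighbor_temps)
    dev = abs(t_initial - mean)
    return min(max((dev - lo) / (hi - lo), 0.0), 1.0)

def final_temperature(t_initial, neighbor_temps, weights=None):
    """Blend the reconstructed and initial temperatures by the probability."""
    if weights is None:                    # plain average reconstruction
        t_recon = sum(neighbor_temps) / len(neighbor_temps)
    else:                                  # weighted average, weights sum to 1
        t_recon = sum(w * t for w, t in zip(weights, neighbor_temps))
    p = dead_pixel_probability(t_initial, neighbor_temps)
    return t_recon * p + t_initial * (1.0 - p)

# A pixel reading 32.0 among neighbors near 20.0 is almost certainly dead:
print(final_temperature(32.0, [20.0, 20.0, 20.0, 20.0]))  # -> 20.0
```

Because the blend is continuous, a small genuine hot spot (low probability) keeps most of its initial temperature instead of being flattened, which is the stated benefit of the soft threshold.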
Not limited to the four neighboring pixels listed above, in specific implementations the reconstructed temperature value of the first pixel may also be obtained by interpolation from the temperature values of other pixels in the second neighborhood, which this embodiment does not limit. For example, if it is determined from the image content that the first pixel lies on an edge or contour of the image, the interpolation direction may follow the contour or edge; that is, the reconstructed temperature value of the first pixel may be interpolated from the temperature values of other pixels in the second neighborhood that lie on that contour or edge.

By letting the user mark dead pixels, this embodiment can identify the dead pixels produced during use by the user after the infrared sensor leaves the factory, and it can also identify static dead pixels and dynamic dead pixels and remove all of them, improving infrared imaging quality. In addition, the soft-threshold judgment method can accurately estimate the probability that the current pixel is a dynamic dead pixel, effectively reducing the side effect of misjudging a single-pixel object (or a 2-3 pixel object) as a dynamic dead pixel, so that removing dynamic dead pixels does not degrade the long-range discovery and detection capability of the infrared system.

The scheme is not limited to identifying and removing the dynamic dead pixels after the dead pixels (static dead pixels and user-marked dead pixels) are removed in S605 as shown in FIG. 6; in specific implementations, the three types of dead pixels may also be removed together after the static dead pixels, the user-marked dead pixels, and the dynamic dead pixels have all been identified. This embodiment does not limit the order in which the three types of dead pixels are removed.

This embodiment can identify the dead pixels produced during use by the user after the infrared sensor leaves the factory, the static dead pixels produced during manufacture, and the dynamic dead pixels, and remove the user-marked, static, and dynamic dead pixels, improving infrared imaging quality.
接下来结合图7-图11介绍本申请实施例提供的一种红外传感器的坏点标记方法。如图12所示,红外传感器的坏点标记方法可以包括以下几个步骤:
S1201:接收用于确定待选区域的第一用户操作。
具体地,第一用户操作可以但不限于是双击待选区域,或者双指反向滑动操作。
S1202:响应于第一用户操作,放大显示待选区域。
S1203:接收作用于待选区域中任意一点的第二用户操作。
具体地,第二用户操作例如可以但不限于是作用于任意一点的点击操作、双击操作或者长按操作等。
S1204:响应于第二用户操作,根据上述任意一点的坐标确定用户标记的坏点的坐标。
具体地,上述任意一点在整幅红外图像中的坐标(而非在当前显示屏中的坐标)即为用户标记的坏点的坐标。
S1205:根据用户标记的坏点的坐标生成坏点表。
具体地,用户可以反复执行上述S1201-S1204来标记多个坏点。
S1206:显示坏点表。
具体地,红外显示终端20可以根据用户标记的一个或多个坏点的坐标生成如图11所示的坏点表。
S1207:接收用于修改坏点的第三用户操作。
具体地,第三用户操作例如可以是作用于图11中的删除控件8011的点击操作。
S1208:响应于第三用户操作,更新坏点表。
具体地,红外显示终端20可以将第三用户操作作用的删除控件8011对应的坏点的坐标从坏点表中删除,并更新删除坏点后的坏点表。
此外,红外显示终端20还可以根据更新后的坏点表对红外拍摄终端10获取的红外图像进行处理,去除用户标记的坏点。
可能地,红外显示终端20还可以将更新后的坏点表发送给红外拍摄终端10,以使红外拍摄终端10去除用户标记的坏点,将去除坏点后的红外图像发送给红外显示终端20,呈现给用户。
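图12中S1201-S1208所示的坏点表维护流程(添加坏点、删除坏点、生成坏点表)可以用如下极简的Python示意类来刻画(类名与数据结构均为笔者假设,仅示意逻辑,不涉及界面显示与终端间通信):

```python
class DeadPixelTable:
    """用户标记坏点表的极简示意,坐标假设为整幅红外图像中的 (x, y)。"""

    def __init__(self):
        self._coords = []                 # 已标记坏点的坐标列表

    def add(self, x, y):
        """对应S1204:根据用户操作的点确定并记录坏点坐标(去重)。"""
        if (x, y) not in self._coords:
            self._coords.append((x, y))

    def remove(self, x, y):
        """对应S1207/S1208:响应删除操作,更新坏点表。"""
        if (x, y) in self._coords:
            self._coords.remove((x, y))

    def table(self):
        """对应S1205/S1206:生成可供显示的坏点表。"""
        return list(self._coords)
```

更新后的表既可在本地用于去除坏点,也可发送给拍摄终端,由拍摄终端按坐标去除坏点后回传图像。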
本申请实施例提供了一种用户标记坏点的方法,通过用户自主标记坏点的方式可以去除红外传感器在出厂之后,在用户使用过程中产生的坏点,提高红外成像画质。
上述详细阐述了本申请实施例的方法,下面为了便于更好地实施本申请实施例的上述方案,相应地,下面还提供用于配合实施上述方案的相关装置。
图13示出了本申请实施例提供的一种红外图像处理装置结构示意图,红外图像处理装置130可以包括:至少一个处理器1301(例如CPU)、存储器1303及至少一个通信总线1302。其中,通信总线1302用于实现这些组件之间的连接通信。存储器1303可以是高速RAM存储器,也可以是非易失性存储器(non-volatile memory),例如至少一个磁盘存储器,存储器1303包括本发明实施例中的flash。存储器1303可选的还可以是至少一个位于远离前述处理器1301的存储系统。
在一种可能的实施例中,红外图像处理装置130为红外拍摄终端10,则红外图像处理装置130还可以包括红外传感器及至少一个网络接口。其中,红外传感器可以用于获取红外图像。网络接口可选的可以包括标准的有线接口、无线接口(如WI-FI接口),通过网络接口可以与红外显示终端20建立通信连接。
在另外一种可能的实施例中,红外处理装置130为红外显示终端20,则红外处理装置130还可以包括显示屏、用户接口及网络接口。其中,显示屏可以用于显示红外图像。网络接口可选的可以包括标准的有线接口、无线接口(如WI-FI接口),通过网络接口可以与红外拍摄终端10建立通信连接。用户接口可以用于接收用户作用于红外图像的用户操作,或者用于操控红外拍摄终端10。其中,用户接口可以包括触摸屏、键盘或鼠标、摇杆、物理按钮等等。
需要说明的是,网络接口可以连接接收器、发射器或其他通信模块,其他通信模块可以包括但不限于WiFi模块、蓝牙模块等,可以理解,本发明实施例中的红外图像处理装置130也可以包括接收器、发射器和其他通信模块等。
在另外一种可能的实施例中,红外处理装置130为红外拍摄终端10与红外显示终端20的集成,则红外处理装置130还可以包括红外传感器、显示屏 及用户接口。其中,红外传感器可以用于获取红外图像。显示屏可以用于显示红外图像。用户接口可以用于接收用户作用于红外图像的用户操作。
处理器1301可以用于调用存储器1303中存储的程序指令,并执行以下操作:
获取用户标记的坏点的坐标;上述用户标记的坏点为红外传感器在出厂后在用户使用过程中产生的坏点;根据上述用户标记的坏点的坐标识别上述红外传感器获取的红外图像中的坏点;去除上述红外图像中的坏点。
在一种可能的实施例中,红外图像处理装置130为红外拍摄终端10与红外显示终端20的集成。红外图像处理装置130执行去除上述红外图像中的坏点时,具体执行:根据上述红外传感器获取的红外图像及上述用户标记的坏点的坐标,确定上述红外图像中的坏点的最终温度值。
在一种可能的实施例中,红外图像处理装置130为红外显示终端20。上述红外显示终端包括上述红外传感器,红外图像处理装置130执行获取用户标记的坏点坐标时,具体执行:接收用于标记坏点的用户操作;响应于上述用户操作,确定上述用户标记的坏点的坐标。
在一种可能的实施例中,红外图像处理装置130为红外拍摄终端10。红外图像处理装置130执行获取用户标记的坏点坐标时,具体执行:接收上述红外显示终端发送的用户标记的坏点的坐标,上述用户标记的坏点的坐标根据用户作用于上述红外显示终端的用户操作确定。
在一种可能的实施例中,上述装置应用于红外成像系统,上述系统包括上述红外拍摄终端及上述装置,上述红外拍摄终端包括上述红外传感器。
红外图像处理装置130执行获取用户标记的坏点坐标时,具体执行:接收用于标记坏点的用户操作;响应于上述用户操作,确定上述用户标记的坏点的坐标。
上述根据上述用户标记的坏点的坐标识别上述红外传感器获取的红外图像中的坏点之前,红外图像处理装置130还执行:接收上述红外拍摄终端发送的上述红外图像。
在一种可能的实施例中,上述获取用户标记的坏点的坐标之后,红外图像处理装置130还执行:融合初始像素级响应率数据表及上述用户标记的坏点的坐标,生成第二像素级响应率数据表;上述初始像素级响应率数据表及上述第二像素级响应率数据表中均包括不同的像素点与上述像素点对应的响应率,上述第二像素级响应率数据表中用户标记的坏点对应的响应率为预设值。
红外图像处理装置130执行根据上述用户标记的坏点的坐标识别上述红外传感器获取的红外图像中的坏点时,具体执行:根据上述第二像素级响应率数据表识别上述红外传感器获取的红外图像中的坏点。
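融合像素级响应率数据表与用户标记坐标、再据此识别坏点的过程,可以用如下Python示意代码表达(用字典表示数据表、以0.0作为坏点响应率的预设值,均为笔者假设):

```python
def fuse_response_table(response_table, bad_coords, preset=0.0):
    """融合初始像素级响应率数据表与用户标记的坏点坐标。

    response_table: {(x, y): 响应率} 形式的初始像素级响应率数据表(形式为假设)。
    bad_coords:     用户标记的坏点坐标集合。
    preset:         坏点对应的预设响应率,这里假设为0.0。
    返回新的"第二像素级响应率数据表",不修改初始表。
    """
    fused = dict(response_table)
    for xy in bad_coords:
        fused[xy] = preset
    return fused


def find_bad_pixels(fused_table, preset=0.0):
    """根据融合后的数据表识别坏点:响应率等于预设值的像素即坏点。"""
    return sorted(xy for xy, r in fused_table.items() if r == preset)
```

静态坏点坐标同样可以作为 `bad_coords` 的一部分参与融合,从而得到同时覆盖静态坏点与用户标记坏点的第三像素级响应率数据表。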
在一种可能的实施例中,上述获取用户标记的坏点的坐标之后,红外图像处理装置130还执行:根据上述用户标记的坏点的坐标生成第一坏点表。
红外图像处理装置130执行根据上述用户标记的坏点的坐标识别上述红外传感器获取的红外图像中的坏点时,具体执行:根据上述第一坏点表识别上述红外传感器获取的红外图像中的坏点。
在一种可能的实施例中,红外图像处理装置130还执行:获取出厂标定的静态坏点的坐标。
红外图像处理装置130执行根据上述用户标记的坏点的坐标识别上述红外传感器获取的红外图像中的坏点时,具体执行:根据上述用户标记的坏点的坐标以及上述静态坏点的坐标识别上述红外图像中的坏点。
在一种可能的实施例中,上述获取用户标记的坏点的坐标之后,红外图像处理装置130还执行:融合上述初始像素级响应率数据表、上述用户标记的坏点的坐标及上述静态坏点的坐标,生成第三像素级响应率数据表;上述第三像素级响应率数据表中上述用户标记的坏点的坐标及上述静态坏点对应的响应率为预设值。
红外图像处理装置130执行根据上述用户标记的坏点的坐标以及上述静态坏点的坐标识别上述红外传感器获取的红外图像中的坏点时,具体执行:根据上述第三像素级响应率数据表识别上述红外传感器获取的红外图像中的坏点。
红外图像处理装置130执行去除上述红外图像中的坏点时,具体执行:去除上述红外图像中的上述用户标记的坏点以及上述静态坏点。
在一种可能的实施例中,上述获取用户标记的坏点的坐标之后,红外图像处理装置130还执行:根据上述用户标记的坏点的坐标及上述静态坏点的坐标生成第二坏点表。
红外图像处理装置130执行根据上述用户标记的坏点的坐标识别上述红外传感器获取的红外图像中的坏点时,具体执行:根据上述第二坏点表识别上述红外传感器获取的红外图像中的坏点。
在一种可能的实施例中,红外图像处理装置130执行去除上述红外图像中的坏点时,具体执行:根据上述红外传感器获取的红外图像、上述用户标记的坏点的坐标及上述静态坏点的坐标,确定上述红外图像中的坏点的最终温度值。
在一种可能的实施例中,红外图像处理装置130还执行:确定动态坏点的坐标;根据上述动态坏点的坐标识别上述红外传感器获取的红外图像中的动态坏点;去除上述动态坏点。
在一种可能的实施例中,红外图像处理装置130执行去除上述动态坏点时,具体执行:根据上述红外传感器获取的红外图像及上述动态坏点的坐标,确定上述动态坏点的最终温度值。
在一种可能的实施例中,红外图像处理装置130执行确定动态坏点的坐标时,具体执行:确定上述红外传感器获取的红外图像中第一像素点的邻域;上述第一像素点为上述红外传感器获取的红外图像中的任意一个像素点,上述邻域中包括预设数量的像素点;计算上述邻域内除上述第一像素点之外的其他所有像素点的温度值的平均值;在上述第一像素点的初始温度值与上述平均值的差值的绝对值超过预设阈值的情况下,确定上述第一像素点为动态坏点;其中,上述第一像素点的坐标为动态坏点的坐标。
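上述基于邻域均值与预设阈值的动态坏点硬阈值判定,可用如下Python示意函数表达(邻域在此简化为同一行上以第一像素点为中心的相邻像素,半径与阈值均为笔者假设的参数):

```python
def is_dynamic_dead_pixel(image, i, j, radius=2, threshold=5.0):
    """按邻域均值判定像素 (i, j) 是否为动态坏点(硬阈值示意)。

    image:     二维温度值列表(每行为一行像素的温度值)。
    radius:    邻域半径,邻域共包含至多 2*radius 个相邻像素。
    threshold: 预设阈值。
    """
    # 取第 i 行上 j 两侧、半径 radius 以内的像素(不含第一像素点本身)
    neighbors = [image[i][k]
                 for k in range(max(0, j - radius),
                                min(len(image[i]), j + radius + 1))
                 if k != j]
    mean = sum(neighbors) / len(neighbors)
    # 初始温度值与邻域均值之差的绝对值超过预设阈值,则判定为动态坏点
    return abs(image[i][j] - mean) > threshold
```

与该硬阈值判定相对,后文的软阈值方案不输出非此即彼的结论,而是给出该像素为动态坏点的概率,再据此混合初始温度值与重建温度值。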
在一种可能的实施例中,红外图像处理装置130执行确定动态坏点的坐标时,具体执行:确定上述红外传感器获取的红外图像中第一像素点的邻域;其中,上述邻域包括预设数量的像素点;上述第一像素点为上述红外传感器获取的红外图像中的任意一个像素点;根据上述邻域内的像素点的温度值确定上述第一像素点为动态坏点的概率;其中,上述第一像素点的坐标为动态坏点的坐标。
红外图像处理装置130执行去除上述动态坏点时,具体执行:根据上述第一像素点的初始温度值、上述第一像素点为动态坏点的概率及上述第一像素点的重建温度值,确定上述第一像素点的最终温度值,其中,上述第一像素点的重建温度值是根据上述邻域内的一个或多个像素点的温度值进行插值得到的。
在一种可能的实施例中,上述第一像素点的最终温度值为以下两者的和: 上述重建温度值与上述第一像素点为动态坏点的概率的乘积、上述第一像素点的初始温度值与上述第一像素点不为动态坏点的概率的乘积;上述第一像素点为动态坏点的概率与上述第一像素点不为动态坏点的概率的和为1。
在一种可能的实施例中,上述第一像素点的重建温度值是根据上述邻域内的一个像素点的温度值进行插值得到的,上述重建温度值为上述一个像素点的温度值;或上述第一像素点的重建温度值是根据上述邻域内的多个像素点的温度值进行插值得到的,上述重建温度值为上述多个像素点的温度值的平均值或加权均值。
在一种可能的实施例中,上述邻域以上述第一像素点为中心。
可理解的是,本实施例的红外图像处理装置130的功能可根据上述图2、图5或图6提供的红外图像处理方法实施例中的方法具体实现,此处不再赘述。
图14示出了本申请实施例提供的一种红外传感器的坏点标记装置结构示意图,红外传感器的坏点标记装置140可以包括:至少一个处理器1401(例如CPU)、用户接口1403、存储器1405、至少一个通信总线1402及显示屏1406。可选地,还可以包括至少一个网络接口1404及红外传感器。其中,通信总线1402用于实现这些组件之间的连接通信。用户接口1403可以包括触摸屏、键盘或鼠标、摇杆、物理按钮等等。红外传感器可以用于获取红外图像。网络接口1404可选的可以包括标准的有线接口、无线接口(如WI-FI接口),通过网络接口1404可以与服务器建立通信连接。存储器1405可以是高速RAM存储器,也可以是非易失性存储器(non-volatile memory),例如至少一个磁盘存储器,存储器1405包括本发明实施例中的flash。存储器1405可选的还可以是至少一个位于远离前述处理器1401的存储系统。如图14所示,作为一种计算机存储介质的存储器1405中可以包括操作系统、网络通信模块、用户接口模块以及程序指令。
需要说明的是,网络接口1404可以连接接收器、发射器或其他通信模块,其他通信模块可以包括但不限于WiFi模块、蓝牙模块等,可以理解,本申请实施例中红外传感器的坏点标记装置也可以包括接收器、发射器和其他通信模块等。
处理器1401可以用于调用存储器1405中存储的程序指令,并执行以下操作:通过用户接口1403接收用于标记坏点的用户操作;上述坏点为上述红外传感器在出厂后在用户使用过程中产生的坏点;响应于上述用户操作,确定用户标记的坏点的坐标。
在一种可能的实施例中,上述接收用于标记坏点的用户操作之前,处理器1401还执行:通过用户接口1403接收用于确定待选区域的第一用户操作;响应于上述第一用户操作,放大显示上述待选区域。
处理器1401通过用户接口1403接收用于标记坏点的用户操作时,具体执行:通过用户接口1403接收作用于上述待选区域中任意一点的第二用户操作。
处理器1401执行响应于上述用户操作,确定用户标记的坏点的坐标时,具体执行:响应于上述第二用户操作,根据上述任意一点的坐标确定上述用户标记的坏点的坐标。
在一种可能的实施例中,上述响应于上述用户操作,确定用户标记的坏点的坐标之后,处理器1401还执行:根据上述用户标记的坏点的坐标生成坏点表;显示上述坏点表。
在一种可能的实施例中,上述显示上述坏点表之后,处理器1401还执行:通过用户接口1403接收用于修改上述坏点表的第三用户操作;响应于上述第三用户操作,更新上述坏点表。
可理解的是,本实施例的红外传感器的坏点标记装置140的功能可根据上述图12提供的红外传感器的坏点标记方法实施例中的方法具体实现,此处不再赘述。
本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程,是可以通过计算机程序来指令相关的硬件来完成,所述的程序可存储于一计算机可读取存储介质中,该程序在执行时,可包括如上述各方法的实施例的流程。其中,所述的存储介质可为磁盘、光盘、只读存储器(Read-Only Memory,ROM)或随机存取存储器(Random Access Memory,RAM)等。在不冲突的情况下,本实施例和实施方案中的技术特征可以任意组合。
以上所揭露的仅为本发明较佳实施例而已,当然不能以此来限定本发明之权利范围,因此依本发明权利要求所作的等同变化,仍属本发明所涵盖的范围。

Claims (46)

  1. 一种红外图像处理方法,其特征在于,包括:
    获取用户标记的坏点的坐标;所述用户标记的坏点为红外传感器在出厂后在用户使用过程中产生的坏点;
    根据所述用户标记的坏点的坐标识别所述红外传感器获取的红外图像中的坏点;
    去除所述红外图像中的坏点。
  2. 如权利要求1所述的方法,其特征在于,所述去除所述红外图像中的坏点,包括:根据所述红外传感器获取的红外图像及所述用户标记的坏点的坐标,确定所述红外图像中的坏点的最终温度值。
  3. 如权利要求1或2所述的方法,其特征在于,所述方法应用于红外显示终端,所述红外显示终端包括所述红外传感器;所述获取用户标记的坏点的坐标,包括:
    接收用于标记坏点的用户操作;
    响应于所述用户操作,确定所述用户标记的坏点的坐标。
  4. 如权利要求1或2所述的方法,其特征在于,所述方法应用于红外成像系统,所述系统包括红外拍摄终端及红外显示终端,所述方法由所述红外拍摄终端执行,所述红外拍摄终端包括所述红外传感器;
    所述获取用户标记的坏点的坐标,包括:
    接收所述红外显示终端发送的用户标记的坏点的坐标,所述用户标记的坏点的坐标根据用户作用于所述红外显示终端的用户操作确定。
  5. 如权利要求1或2所述的方法,其特征在于,所述方法应用于红外成像系统,所述系统包括红外拍摄终端及红外显示终端,所述方法由所述红外显示终端执行,所述红外拍摄终端包括所述红外传感器;
    所述获取用户标记的坏点的坐标,包括:
    接收用于标记坏点的用户操作;
    响应于所述用户操作,确定所述用户标记的坏点的坐标;
    所述根据所述用户标记的坏点的坐标识别所述红外传感器获取的红外图像中的坏点之前,还包括:
    接收所述红外拍摄终端发送的所述红外图像。
  6. 如权利要求1-5任一项所述的方法,其特征在于,所述获取用户标记的坏点的坐标之后,所述方法还包括:
    融合初始像素级响应率数据表及所述用户标记的坏点的坐标,生成第二像素级响应率数据表;所述初始像素级响应率数据表及所述第二像素级响应率数据表中均包括不同的像素点与所述像素点对应的响应率,所述第二像素级响应率数据表中用户标记的坏点对应的响应率为预设值;
    所述根据所述用户标记的坏点的坐标识别所述红外传感器获取的红外图像中的坏点,包括:根据所述第二像素级响应率数据表识别所述红外传感器获取的红外图像中的坏点。
  7. 如权利要求1-5任一项所述的方法,其特征在于,所述获取用户标记的坏点的坐标之后,所述方法还包括:
    根据所述用户标记的坏点的坐标生成第一坏点表;
    所述根据所述用户标记的坏点的坐标识别所述红外传感器获取的红外图像中的坏点,包括:根据所述第一坏点表识别所述红外传感器获取的红外图像中的坏点。
  8. 如权利要求1至5任一项所述的方法,其特征在于,所述方法还包括:
    获取出厂标定的静态坏点的坐标;
    所述根据所述用户标记的坏点的坐标识别所述红外传感器获取的红外图像中的坏点,包括:
    根据所述用户标记的坏点的坐标以及所述静态坏点的坐标识别所述红外图像中的坏点。
  9. 如权利要求8所述的方法,其特征在于,所述获取用户标记的坏点的坐标之后,所述方法还包括:
    融合所述初始像素级响应率数据表、所述用户标记的坏点的坐标及所述静态坏点的坐标,生成第三像素级响应率数据表;所述第三像素级响应率数据表中所述用户标记的坏点的坐标及所述静态坏点对应的响应率为预设值;
    所述根据所述用户标记的坏点的坐标以及所述静态坏点的坐标识别所述红外传感器获取的红外图像中的坏点,包括:
    根据所述第三像素级响应率数据表识别所述红外传感器获取的红外图像中的坏点;
    所述去除所述红外图像中的坏点,包括:
    去除所述红外图像中的所述用户标记的坏点以及所述静态坏点。
  10. 如权利要求8所述的方法,其特征在于,所述获取用户标记的坏点的坐标之后,所述方法还包括:
    根据所述用户标记的坏点的坐标及所述静态坏点的坐标生成第二坏点表;
    所述根据所述用户标记的坏点的坐标识别所述红外传感器获取的红外图像中的坏点,包括:根据所述第二坏点表识别所述红外传感器获取的红外图像中的坏点。
  11. 如权利要求8-10任一项所述的方法,其特征在于,所述去除所述红外图像中的坏点,包括:根据所述红外传感器获取的红外图像、所述用户标记的坏点的坐标及所述静态坏点的坐标,确定所述红外图像中的坏点的最终温度值。
  12. 如权利要求1-11任一项所述的方法,其特征在于,所述方法还包括:
    确定动态坏点的坐标;
    根据所述动态坏点的坐标识别所述红外传感器获取的红外图像中的动态坏点;
    去除所述动态坏点。
  13. 如权利要求12所述的方法,其特征在于,所述去除所述动态坏点,包括:根据所述红外传感器获取的红外图像及所述动态坏点的坐标,确定所述动态坏点的最终温度值。
  14. 如权利要求12或13所述的方法,其特征在于,所述确定动态坏点的坐标,包括:
    确定所述红外传感器获取的红外图像中第一像素点的邻域;所述第一像素点为所述红外传感器获取的红外图像中的任意一个像素点,所述邻域中包括预设数量的像素点;
    计算所述邻域内除所述第一像素点之外的其他所有像素点的温度值的平均值;
    在所述第一像素点的初始温度值与所述平均值的差值的绝对值超过预设阈值的情况下,确定所述第一像素点为动态坏点;其中,所述第一像素点的坐标为动态坏点的坐标。
  15. 如权利要求12或13所述的方法,其特征在于,所述确定动态坏点的坐标,包括:
    确定所述红外传感器获取的红外图像中第一像素点的邻域;其中,所述邻域包括预设数量的像素点;所述第一像素点为所述红外传感器获取的红外图像中的任意一个像素点;
    根据所述邻域内的像素点的温度值确定所述第一像素点为动态坏点的概率;其中,所述第一像素点的坐标为动态坏点的坐标;
    所述去除所述动态坏点,包括:
    根据所述第一像素点的初始温度值、所述第一像素点为动态坏点的概率及所述第一像素点的重建温度值,确定所述第一像素点的最终温度值,其中,所述第一像素点的重建温度值是根据所述邻域内的一个或多个像素点的温度值进行插值得到的。
  16. 如权利要求15所述的方法,其特征在于,所述第一像素点的最终温度值为以下两者的和:所述重建温度值与所述第一像素点为动态坏点的概率的乘积、所述第一像素点的初始温度值与所述第一像素点不为动态坏点的概率的乘积;所述第一像素点为动态坏点的概率与所述第一像素点不为动态坏点的概率的和为1。
  17. 如权利要求15或16所述的方法,其特征在于,所述第一像素点的重建温度值是根据所述邻域内的一个像素点的温度值进行插值得到的,所述重建温度值为所述一个像素点的温度值;或
    所述第一像素点的重建温度值是根据所述邻域内的多个像素点的温度值进行插值得到的,所述重建温度值为所述多个像素点的温度值的平均值或加权均值。
  18. 如权利要求15-17任一项所述的方法,其特征在于,所述邻域以所述第一像素点为中心。
  19. 一种红外传感器的坏点标记方法,其特征在于,包括:
    接收用于标记坏点的用户操作;所述坏点为所述红外传感器在出厂后在用户使用过程中产生的坏点;
    响应于所述用户操作,确定用户标记的坏点的坐标。
  20. 如权利要求19所述的方法,其特征在于,所述接收用于标记坏点的用户操作之前,所述方法还包括:
    接收用于确定待选区域的第一用户操作;
    响应于所述第一用户操作,放大显示所述待选区域;
    所述接收用于标记坏点的用户操作,包括:接收作用于所述待选区域中任意一点的第二用户操作;
    所述响应于所述用户操作,确定用户标记的坏点的坐标,包括:响应于所述第二用户操作,根据所述任意一点的坐标确定所述用户标记的坏点的坐标。
  21. 如权利要求20所述的方法,其特征在于,所述响应于所述用户操作,确定用户标记的坏点的坐标之后,所述方法还包括:
    根据所述用户标记的坏点的坐标生成坏点表;
    显示所述坏点表。
  22. 如权利要求21所述的方法,其特征在于,所述显示所述坏点表之后,所述方法还包括:
    接收用于修改所述坏点表的第三用户操作;
    响应于所述第三用户操作,更新所述坏点表。
  23. 一种红外图像处理装置,其特征在于,包括:存储器及处理器;
    所述存储器与所述处理器耦合,所述存储器用于存储程序指令;所述处理器用于调用所述存储器中的程序指令并执行:
    获取用户标记的坏点的坐标;所述用户标记的坏点为红外传感器在出厂后在用户使用过程中产生的坏点;
    根据所述用户标记的坏点的坐标识别所述红外传感器获取的红外图像中的坏点;
    去除所述红外图像中的坏点。
  24. 如权利要求23所述的装置,其特征在于,所述处理器执行去除所述红外图像中的坏点时,具体执行:根据所述红外传感器获取的红外图像及所述用户标记的坏点的坐标,确定所述红外图像中的坏点的最终温度值。
  25. 如权利要求23或24所述的装置,其特征在于,所述装置为红外显示终端,所述红外显示终端包括所述红外传感器,所述装置还包括用户接口,所述处理器执行获取用户标记的坏点坐标时,具体执行:
    通过所述用户接口接收用于标记坏点的用户操作;
    响应于所述用户操作,确定所述用户标记的坏点的坐标。
  26. 如权利要求23或24所述的装置,其特征在于,所述装置应用于红外成像系统,所述系统包括所述装置及红外显示终端,所述装置还包括所述红外传感器及收发器;
    所述处理器执行获取用户标记的坏点坐标时,具体执行:
    通过所述收发器接收所述红外显示终端发送的用户标记的坏点的坐标,所述用户标记的坏点的坐标根据用户作用于所述红外显示终端的用户操作确定。
  27. 如权利要求23或24所述的装置,其特征在于,所述装置应用于红外成像系统,所述系统包括红外拍摄终端及所述装置,所述红外拍摄终端包括所述红外传感器,所述装置还包括用户接口及收发器;
    所述处理器执行获取用户标记的坏点坐标时,具体执行:
    通过所述用户接口接收用于标记坏点的用户操作;
    响应于所述用户操作,确定所述用户标记的坏点的坐标;
    所述根据所述用户标记的坏点的坐标识别所述红外传感器获取的红外图像中的坏点之前,所述处理器还执行:
    通过所述收发器接收所述红外拍摄终端发送的所述红外图像。
  28. 如权利要求23-27任一项所述的装置,其特征在于,所述获取用户标记的坏点的坐标之后,所述处理器还执行:
    融合初始像素级响应率数据表及所述用户标记的坏点的坐标,生成第二像素级响应率数据表;所述初始像素级响应率数据表及所述第二像素级响应率数据表中均包括不同的像素点与所述像素点对应的响应率,所述第二像素级响应率数据表中用户标记的坏点对应的响应率为预设值;
    所述处理器根据所述用户标记的坏点的坐标识别所述红外传感器获取的红外图像中的坏点时,具体执行:根据所述第二像素级响应率数据表识别所述红外传感器获取的红外图像中的坏点。
  29. 如权利要求23-27任一项所述的装置,其特征在于,所述获取用户标记的坏点的坐标之后,所述处理器还执行:
    根据所述用户标记的坏点的坐标生成第一坏点表;
    所述处理器执行根据所述用户标记的坏点的坐标识别所述红外传感器获取的红外图像中的坏点时,具体执行:根据所述第一坏点表识别所述红外传感器获取的红外图像中的坏点。
  30. 如权利要求23-27任一项所述的装置,其特征在于,所述处理器还执行:
    获取出厂标定的静态坏点的坐标;
    所述处理器执行根据所述用户标记的坏点的坐标识别所述红外传感器获取的红外图像中的坏点时,具体执行:
    根据所述用户标记的坏点的坐标以及所述静态坏点的坐标识别所述红外图像中的坏点。
  31. 如权利要求30所述的装置,其特征在于,所述获取用户标记的坏点的坐标之后,所述处理器还执行:
    融合所述初始像素级响应率数据表、所述用户标记的坏点的坐标及所述静态坏点的坐标,生成第三像素级响应率数据表;所述第三像素级响应率数据表中所述用户标记的坏点的坐标及所述静态坏点对应的响应率为预设值;
    所述处理器执行根据所述用户标记的坏点的坐标以及所述静态坏点的坐标识别所述红外传感器获取的红外图像中的坏点时,具体执行:
    根据所述第三像素级响应率数据表识别所述红外传感器获取的红外图像中的坏点;
    所述处理器执行去除所述红外图像中的坏点时,具体执行:
    去除所述红外图像中的所述用户标记的坏点以及所述静态坏点。
  32. 如权利要求30所述的装置,其特征在于,所述获取用户标记的坏点的坐标之后,所述处理器还执行:
    根据所述用户标记的坏点的坐标及所述静态坏点的坐标生成第二坏点表;
    所述处理器执行根据所述用户标记的坏点的坐标识别所述红外传感器获取的红外图像中的坏点时,具体执行:根据所述第二坏点表识别所述红外传感器获取的红外图像中的坏点。
  33. 如权利要求30-32任一项所述的装置,其特征在于,所述处理器执行去除所述红外图像中的坏点时,具体执行:根据所述红外传感器获取的红外图像、所述用户标记的坏点的坐标及所述静态坏点的坐标,确定所述红外图像中的坏点的最终温度值。
  34. 如权利要求23-33任一项所述的装置,其特征在于,所述处理器还执行:
    确定动态坏点的坐标;
    根据所述动态坏点的坐标识别所述红外传感器获取的红外图像中的动态坏点;
    去除所述动态坏点。
  35. 如权利要求34所述的装置,其特征在于,所述处理器执行去除所述动态坏点时,具体执行:根据所述红外传感器获取的红外图像及所述动态坏点的坐标,确定所述动态坏点的最终温度值。
  36. 如权利要求34或35所述的装置,其特征在于,所述处理器执行确定动态坏点的坐标时,具体执行:确定所述红外传感器获取的红外图像中第一像素点的邻域;所述第一像素点为所述红外传感器获取的红外图像中的任意一个像素点,所述邻域中包括预设数量的像素点;
    计算所述邻域内除所述第一像素点之外的其他所有像素点的温度值的平均值;
    在所述第一像素点的初始温度值与所述平均值的差值的绝对值超过预设阈值的情况下,确定所述第一像素点为动态坏点;其中,所述第一像素点的坐标为动态坏点的坐标。
  37. 如权利要求34或35所述的装置,其特征在于,所述处理器执行确定动态坏点的坐标时,具体执行:确定所述红外传感器获取的红外图像中第一像素点的邻域;其中,所述邻域包括预设数量的像素点;所述第一像素点为所述红外传感器获取的红外图像中的任意一个像素点;
    根据所述邻域内的像素点的温度值确定所述第一像素点为动态坏点的概率;其中,所述第一像素点的坐标为动态坏点的坐标;
    所述处理器执行去除所述动态坏点时,具体执行:
    根据所述第一像素点的初始温度值、所述第一像素点为动态坏点的概率及所述第一像素点的重建温度值,确定所述第一像素点的最终温度值,其中,所述第一像素点的重建温度值是根据所述邻域内的一个或多个像素点的温度值进行插值得到的。
  38. 如权利要求37所述的装置,其特征在于,所述第一像素点的最终温度值为以下两者的和:所述重建温度值与所述第一像素点为动态坏点的概率的乘积、所述第一像素点的初始温度值与所述第一像素点不为动态坏点的概率的乘积;所述第一像素点为动态坏点的概率与所述第一像素点不为动态坏点的概率的和为1。
  39. 如权利要求36或37所述的装置,其特征在于,所述第一像素点的重建温度值是根据所述邻域内的一个像素点的温度值进行插值得到的,所述重建温度值为所述一个像素点的温度值;或
    所述第一像素点的重建温度值是根据所述邻域内的多个像素点的温度值进行插值得到的,所述重建温度值为所述多个像素点的温度值的平均值或加权均值。
  40. 如权利要求37-39任一项所述的装置,其特征在于,所述邻域以所述第一像素点为中心。
  41. 一种红外传感器的坏点标记装置,其特征在于,包括:存储器、处理器及用户接口;
    所述存储器、所述用户接口与所述处理器耦合,所述存储器用于存储程序指令;所述处理器用于调用所述存储器中的程序指令并执行:
    通过所述用户接口接收用于标记坏点的用户操作;所述坏点为所述红外传感器在出厂后在用户使用过程中产生的坏点;
    响应于所述用户操作,确定用户标记的坏点的坐标。
  42. 如权利要求41所述的装置,其特征在于,所述通过所述用户接口接收用于标记坏点的用户操作之前,所述处理器还执行:
    通过所述用户接口接收用于确定待选区域的第一用户操作;
    响应于所述第一用户操作,放大显示所述待选区域;
    所述处理器通过所述用户接口接收用于标记坏点的用户操作时,具体执行:通过所述用户接口接收作用于所述待选区域中任意一点的第二用户操作;
    所述处理器执行响应于所述用户操作,确定用户标记的坏点的坐标时,具体执行:响应于所述第二用户操作,根据所述任意一点的坐标确定所述用户标记的坏点的坐标。
  43. 如权利要求42所述的装置,其特征在于,所述响应于所述用户操作,确定用户标记的坏点的坐标之后,所述处理器还执行:
    根据所述用户标记的坏点的坐标生成坏点表;
    显示所述坏点表。
  44. 如权利要求43所述的装置,其特征在于,所述显示所述坏点表之后,所述处理器还执行:
    通过所述用户接口接收用于修改所述坏点表的第三用户操作;
    响应于所述第三用户操作,更新所述坏点表。
  45. 一种计算机可读存储介质,其上存储有计算机程序,其特征在于,所述计算机程序被处理器执行时实现如权利要求1-18任一项所述的方法。
  46. 一种计算机可读存储介质,其上存储有计算机程序,其特征在于,所述计算机程序被处理器执行时实现如权利要求19-22任一项所述的方法。
PCT/CN2019/130969 2019-12-31 2019-12-31 红外图像处理方法、坏点标记方法及相关装置 WO2021134714A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/130969 WO2021134714A1 (zh) 2019-12-31 2019-12-31 红外图像处理方法、坏点标记方法及相关装置

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/130969 WO2021134714A1 (zh) 2019-12-31 2019-12-31 红外图像处理方法、坏点标记方法及相关装置

Publications (1)

Publication Number Publication Date
WO2021134714A1 true WO2021134714A1 (zh) 2021-07-08

Family

ID=76685812

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/130969 WO2021134714A1 (zh) 2019-12-31 2019-12-31 红外图像处理方法、坏点标记方法及相关装置

Country Status (1)

Country Link
WO (1) WO2021134714A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113538289A (zh) * 2021-07-30 2021-10-22 浙江天铂云科光电股份有限公司 一种利用区域连通动态去除伪坏点的方法

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120019693A1 (en) * 2006-03-24 2012-01-26 Qualcomm Incorporated Method and apparatus for processing bad pixels
CN105191288A (zh) * 2012-12-31 2015-12-23 菲力尔系统公司 异常像素检测
CN110463199A (zh) * 2018-04-10 2019-11-15 深圳市大疆创新科技有限公司 图像传感器坏点检测方法、拍摄装置、无人机及存储介质
CN110567584A (zh) * 2019-07-22 2019-12-13 河南中光学集团有限公司 一种实时红外探测器盲元检测提取及校正的方法


Similar Documents

Publication Publication Date Title
KR102169431B1 (ko) 이미지들의 시퀀스 중의 이미지에서의 객체 경계 안정화를 위한 이미지 처리 장치 및 방법
US9697416B2 (en) Object detection using cascaded convolutional neural networks
EP3410390B1 (en) Image processing method and device, computer readable storage medium and electronic device
WO2019109801A1 (zh) 拍摄参数的调整方法、装置、存储介质及移动终端
US10380756B2 (en) Video processing system and method for object detection in a sequence of image frames
CN108989678B (zh) 一种图像处理方法、移动终端
CN110944160B (zh) 一种图像处理方法及电子设备
WO2018045592A1 (zh) 拍摄图像方法、装置和终端
US10621730B2 (en) Missing feet recovery of a human object from an image sequence based on ground plane detection
EP4273745A1 (en) Gesture recognition method and apparatus, electronic device, readable storage medium, and chip
CN109068063B (zh) 一种三维图像数据的处理、显示方法、装置及移动终端
CN109246351B (zh) 一种构图方法及终端设备
CN109981989B (zh) 渲染图像的方法、装置、电子设备和计算机可读存储介质
CN109104573B (zh) 一种确定对焦点的方法及终端设备
CN109639981B (zh) 一种图像拍摄方法及移动终端
CN110944163A (zh) 一种图像处理方法及电子设备
WO2021134714A1 (zh) 红外图像处理方法、坏点标记方法及相关装置
CN107992894B (zh) 图像识别方法、装置及计算机可读存储介质
CN107705275B (zh) 一种拍照方法及移动终端
CN110913133B (zh) 拍摄方法及电子设备
JP6744237B2 (ja) 画像処理装置、画像処理システムおよびプログラム
US10990802B2 (en) Imaging apparatus providing out focusing and method for controlling the same
JP6451300B2 (ja) 画像処理装置、撮像装置、画像処理方法、及びプログラム
CN108540726B (zh) 连拍图像的处理方法、装置、存储介质及终端
WO2021102939A1 (zh) 图像处理方法及设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19958705

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19958705

Country of ref document: EP

Kind code of ref document: A1