WO2021254233A1 - Object detection method and camera device - Google Patents

Object detection method and camera device

Info

Publication number
WO2021254233A1
WO2021254233A1 (PCT/CN2021/099211)
Authority
WO
WIPO (PCT)
Prior art keywords
detected
image
size
actual size
size range
Prior art date
Application number
PCT/CN2021/099211
Other languages
English (en)
French (fr)
Inventor
林克荣
Original Assignee
杭州海康微影传感科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 杭州海康微影传感科技有限公司
Priority to EP21825409.2A (publication EP4166981A4)
Publication of WO2021254233A1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/50Systems of measurement based on relative movement of target
    • G01S13/58Velocity or trajectory determination systems; Sense-of-movement determination systems
    • G01S13/581Velocity or trajectory determination systems; Sense-of-movement determination systems using transmission of interrupted pulse modulated waves and based upon the Doppler effect resulting from movement of targets
    • G01S13/582Velocity or trajectory determination systems; Sense-of-movement determination systems using transmission of interrupted pulse modulated waves and based upon the Doppler effect resulting from movement of targets adapted for simultaneous range and velocity measurements
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/06Systems determining position data of a target
    • G01S13/08Systems for measuring distance only
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00Radiation pyrometry, e.g. infrared or optical thermometry
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/50Systems of measurement based on relative movement of target
    • G01S13/58Velocity or trajectory determination systems; Sense-of-movement determination systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867Combination of radar systems with cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/886Radar or analogous systems specially adapted for specific applications for alarm systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/25Determination of region of interest [ROI] or a volume of interest [VOI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V10/809Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of classification results, e.g. where the classifiers operate on the same input data
    • G06V10/811Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of classification results, e.g. where the classifiers operate on the same input data the classifiers operating on different input data, e.g. multi-modal recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J2005/0077Imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/28Details of pulse systems
    • G01S7/285Receivers
    • G01S7/288Coherent receivers
    • G01S7/2883Coherent receivers using FFT processing

Definitions

  • This application relates to the field of surveillance, and in particular to an object detection method and camera equipment.
  • Object detection, which can also be called perimeter detection, refers to the technique of detecting whether a preset designated object appears in the monitoring area.
  • Perimeter detection technology is the first line of defense of the entire security system and is of great significance to the entire security system.
  • However, perimeter detection is easily affected by weather, obstructions, and similar factors, which lowers the accuracy of the detection result.
  • the present application provides an object detection method and camera equipment for improving the accuracy of object detection.
  • an object detection method which is applied to a camera device, and includes:
  • the spatial position information includes at least: the distance information of the object to be detected relative to the imaging device and the speed information of the object to be detected;
  • in response to the speed information meeting a preset speed requirement, the calibrated size range corresponding to the object to be detected is determined according to the distance information, and the actual size of the object to be detected is determined from the captured image containing the object to be detected.
  • determining the calibrated size range corresponding to the object to be detected according to the distance information includes:
  • the calibrated size range is determined based on the calibrated size and a preset dimensional error.
  • determining whether the object to be detected is the designated object based on the actual size and the scaled size range includes:
  • the image is a thermal imaging image.
  • determining the actual size of the object to be detected in the image according to the acquired image containing the object to be detected further comprising: determining the current temperature information of the object to be detected according to the thermal imaging image ;
  • determining whether the object to be detected is the designated object includes:
  • the image includes a visible light image.
  • the image includes a thermal imaging image and a visible light image
  • Determining the actual size of the object to be detected in the image according to the captured image containing the object to be detected includes:
  • Determining the actual size of the object to be detected in the image according to the captured image containing the object to be detected further includes: determining the current temperature information of the object to be detected according to the thermal imaging image;
  • detecting whether the object to be detected is the designated object includes:
  • an imaging device including:
  • the radar module is used to send the radar detection signal used to detect the object to be detected and to receive the reflected radar detection signal;
  • An image module for collecting an image containing the object to be detected
  • the processor is configured to obtain the spatial position information of the object to be detected; the spatial position information is determined according to the radar detection signal used to detect the object to be detected, and includes at least: the distance information of the object to be detected relative to the imaging device and the speed information of the object to be detected; in response to the speed information meeting a preset speed requirement, the calibrated size range of the object to be detected is determined according to the distance information, as well as the actual size of the object to be detected in the image; the actual size is determined based on the captured image containing the object to be detected; based on the actual size and the calibrated size range, it is determined whether the object to be detected is the designated object
  • the scaled size range is used to indicate the size range of the designated object in the image when the object to be detected is a designated object.
  • the processor is configured to input the distance information into a preset size prediction model to obtain the corresponding calibrated size when determining the calibrated size range of the object to be detected according to the distance information;
  • the calibrated size range is determined from the calibrated size and the preset size error; wherein the size prediction model represents the corresponding relationship between the distance information and the calibrated size.
  • the processor, when determining whether the object to be detected is the designated object based on the actual size and the calibrated size range, is configured to: in response to the actual size being within the calibrated size range, determine that the object to be detected is the designated object; and in response to the actual size not being within the calibrated size range, determine that the object to be detected is not the designated object.
  • the image module is a thermal imaging module, and the image is a thermal imaging image
  • the processor is further configured to obtain current temperature information of the object to be detected; the current temperature information is determined according to the thermal imaging image;
  • the processor, when determining whether the object to be detected is the designated object based on the actual size and the calibrated size range, is configured to: in response to the actual size being within the calibrated size range and the current temperature information meeting the preset temperature requirement, determine that the object to be detected is the designated object; and in response to the actual size not being within the calibrated size range, and/or the current temperature information not meeting the preset temperature requirement, determine that the object to be detected is not the designated object.
  • the image module is a visible light module, and the image is a visible light image.
  • the image module includes: a thermal imaging module and a visible light module; the image includes: a thermal imaging image and a visible light image;
  • the processor is configured to obtain the first actual size of the object to be detected and the second actual size of the object to be detected when obtaining the actual size of the object to be detected in the image; An actual size is determined based on the thermal imaging image; the second actual size is determined based on the visible light image;
  • the processor is further configured to determine the current temperature information of the object to be detected according to the thermal imaging image
  • the processor, when determining whether the object to be detected is the designated object based on the actual size and the calibrated size range, is configured to: in response to the first actual size and the second actual size both being within the calibrated size range and the current temperature information satisfying the preset temperature requirement, determine that the object to be detected is the designated object; and in response to the first actual size not being within the calibrated size range, and/or the second actual size not being within the calibrated size range, and/or the current temperature information not meeting the preset temperature requirement, determine that the object to be detected is not the designated object.
  • the distance information between the object to be detected and the camera device determined based on the radar detection signal is more accurate, and so is the calibrated size range determined based on that distance information. Therefore, the method proposed in this application, which implements object detection based on the radar detection signal and the image jointly, greatly improves the accuracy of object detection.
  • Fig. 1 is a flowchart of an object detection method shown in an exemplary embodiment of the present application
  • Fig. 2 is a schematic diagram of a camera device shown in an exemplary embodiment of the present application
  • Fig. 3 is a schematic diagram showing an integrated architecture of a radar module according to an exemplary embodiment of the present application
  • Fig. 4a is a schematic diagram of another camera device shown in an exemplary embodiment of the present application.
  • Fig. 4b is a flowchart of an object detection method shown in an exemplary embodiment of the present application.
  • Fig. 5a is a schematic diagram of another camera device shown in an exemplary embodiment of the present application.
  • Fig. 5b is a flowchart of an object detection method shown in an exemplary embodiment of the present application.
  • Fig. 6 is a schematic diagram of a surveillance image collected by a camera device.
  • The terms first, second, third, etc. may be used in this application to describe various information, but the information should not be limited by these terms. These terms are only used to distinguish information of the same type from each other.
  • first information may also be referred to as second information, and similarly, the second information may also be referred to as first information.
  • The word "if" as used herein may be interpreted as "when", "upon", or "in response to determining".
  • the purpose of this application is to propose an object detection method to accurately determine whether the object to be detected in the monitoring area is a designated object.
  • When implemented, the camera device integrates a radar module and an image module.
  • the radar module can send and receive radar detection signals for detecting the object to be detected, and the image module can collect images containing the object to be detected.
  • the camera device can determine the distance information of the object to be detected relative to the camera device and the speed information of the object to be detected according to the radar detection signal used to detect the object to be detected. If the speed information meets the preset speed requirement, the imaging device may determine the calibrated size range corresponding to the object to be detected according to the distance information, and determine the actual size of the object to be detected from the captured image containing it. The imaging device then determines whether the object to be detected is the designated object based on the actual size and the calibrated size range.
  • The method proposed in this application, which realizes object detection based on the radar detection signal and the image together, greatly improves the accuracy of object detection.
  • the specified object is the object that the user wants to monitor, and is the object preset by the user.
  • the designated object can be a person, a car, a small animal, and so on. There is no specific restriction on the specified object here.
  • the object to be detected refers to a movable object that appears in the monitoring area.
  • the object to be detected may be a person, a car, a small animal, etc., and the object to be detected is not specifically limited here.
  • the designated object is a person. If a movable object to be detected appears in the monitoring area, it is necessary to detect whether the object to be detected is a person.
  • the size prediction model can represent the corresponding relationship between the distance information from the object to be detected to the imaging device and the calibrated size.
  • the calibrated size indicates the size of the designated object in the image when the object to be detected is the designated object.
  • the input to the size prediction model is the distance information between the object to be detected and the imaging device, and its output is the calibrated size corresponding to the object to be detected.
  • At least one designated location can be selected in the monitoring area.
  • For each designated location, place the designated object at that location.
  • the imaging device sends out radar detection signals and receives the returned radar detection signals to determine the distance information of the designated object relative to the imaging device. And, the imaging device collects an image of the specified object, and determines the size of the specified object from the collected images as the calibration size. Thus, the imaging device can obtain the distance information and the calibration size corresponding to the designated position.
  • the imaging device can obtain the distance information and the calibrated size corresponding to each designated position.
  • the distance information corresponding to each designated position is used as an independent variable, and the calibrated size corresponding to each designated position is used as a dependent variable, to generate an independent variable-dependent variable pair for each designated position.
  • the independent variable-dependent variable pairs are then fitted (for example, by an interpolation algorithm) to obtain the size prediction model.
  • This assumes that the designated object has the same size in images collected at multiple different positions that are at the same distance from the imaging device.
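The fitting step above can be sketched as a simple one-dimensional interpolation. The calibration pairs below (distance in meters to calibrated pixel size) are illustrative assumptions, not values from the patent:

```python
from bisect import bisect_left

# Hypothetical calibration pairs: (distance to camera in meters,
# calibrated size of the designated object in pixels at that distance).
CALIBRATION = [(5.0, 400.0), (10.0, 200.0), (20.0, 100.0), (40.0, 50.0), (80.0, 25.0)]

def predict_calibrated_size(distance_m):
    """Size prediction model: linearly interpolate the calibrated size
    for a given distance, clamping outside the calibrated range."""
    xs = [d for d, _ in CALIBRATION]
    ys = [s for _, s in CALIBRATION]
    if distance_m <= xs[0]:
        return ys[0]
    if distance_m >= xs[-1]:
        return ys[-1]
    i = bisect_left(xs, distance_m)          # first index with xs[i] >= distance_m
    x0, x1, y0, y1 = xs[i - 1], xs[i], ys[i - 1], ys[i]
    t = (distance_m - x0) / (x1 - x0)        # interpolation fraction
    return y0 + t * (y1 - y0)
```

A production model could equally use spline fitting; the patent only requires some fitted distance-to-size correspondence.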
  • FIG. 1 is a flowchart of an object detection method shown in an exemplary embodiment of the present application. The method may be applied to a camera device and may include the following steps.
  • Step 101 The imaging device determines the spatial position information of the object to be detected according to the radar detection signal used to detect the object to be detected, and the spatial position information includes at least: the distance information of the object to be detected relative to the imaging device and the Speed information of the detected object.
  • the camera device may be a multi-lens camera device.
  • One lens of the multi-lens camera device can be equipped with a radar detector, and another lens can be equipped with a camera for image collection.
  • the camera device can receive a radar detection signal reflected by the object to be detected through a radar detector, and collect an image containing the object to be detected through a camera.
  • the imaging device can determine the spatial position information of the object to be detected based on the radar detection signal.
  • the spatial position information may include: distance information of the object to be detected relative to the imaging device, speed information of the object to be detected, and the like.
  • Step 102 If the speed information meets the preset speed requirement, determine the calibrated size range corresponding to the object to be detected according to the distance information, and determine the object to be detected according to the acquired image containing the object to be detected The actual size of the object; wherein the scaled size range is used to indicate the size range of the specified object in the image when the object to be detected is a specified object.
  • the camera device can detect whether the speed information of the object to be detected meets a preset speed requirement.
  • the speed requirement may be that the speed of the object to be detected is within the preset speed range of the specified object. In one embodiment, for people, the preset speed range is 5-19 km/h, and for cars, the preset speed range is 20-60 km/h.
  • the speed requirement may be that the difference between the speed of the object to be detected and the preset speed of the designated object is within a preset error range (for example, ⁇ 5%).
  • the speed information of the object to be detected does not meet the preset speed requirement, it is determined that the object to be detected is not a designated object.
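The speed gate described above can be sketched as follows, using the example ranges from the text (persons 5-19 km/h, cars 20-60 km/h); the dictionary and function names are illustrative assumptions:

```python
# Preset speed range per designated-object type, in km/h.
SPEED_RANGES_KMH = {"person": (5.0, 19.0), "car": (20.0, 60.0)}

def meets_speed_requirement(speed_kmh, designated="person"):
    """Return True if the radar-measured speed of the object to be detected
    falls within the preset speed range of the designated object."""
    lo, hi = SPEED_RANGES_KMH[designated]
    return lo <= speed_kmh <= hi
```

If this check fails, the object to be detected is immediately ruled out as the designated object, and the size comparison is skipped.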
  • the calibrated size range corresponding to the object to be detected is determined according to the distance information, and the actual size of the object to be detected is determined from the acquired image containing the object to be detected.
  • the camera device can input the distance information between the object to be detected and the camera device into the aforementioned preset size prediction model, and obtain the calibrated size of the object to be detected output by the size prediction model.
  • the camera device can determine the calibration size range based on the calibration size and the preset size error. For example, the camera device can add the calibration size and the preset size error to obtain the calibration size range.
  • the camera device may also use other methods to obtain the calibrated size range, which is only an exemplary description here, and no specific limitation is imposed on it.
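The "calibrated size plus preset size error" construction above can be sketched as a symmetric interval, together with the membership check used in step 103 (names are illustrative):

```python
def calibrated_size_range(calibrated_size, size_error):
    """Calibrated size range = calibrated size plus/minus the preset size error."""
    return (calibrated_size - size_error, calibrated_size + size_error)

def size_matches(actual_size, size_range):
    """True if the actual size in the image falls within the calibrated size range."""
    lo, hi = size_range
    return lo <= actual_size <= hi
```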
  • the meaning of the calibrated size range of the object to be detected is: place the designated object at the current position of the object to be detected, so that the distance between the designated object and the camera device equals the distance between the object to be detected and the camera device;
  • the imaging device then collects images of the designated object at that position, and the size of the designated object in the collected image gives the calibrated size.
  • In other words: assuming that the object to be detected is the designated object, the calibrated size range is the range of sizes the designated object would occupy in the image at the same position.
  • the camera device can obtain the actual size of the object to be detected from the captured image according to the distance between the object to be detected and the camera device.
  • The method of obtaining the actual size is described in detail below through step A and step B.
  • Step A: The camera device can perform object recognition on the image, recognizing from the image the position and the actual size in the image of at least one object.
  • the camera device may input the image into a preset object recognition model, so that the model performs object recognition on the input image and outputs a region frame for at least one object in the image. Then, for each region frame of a recognized object, the imaging device can take the position of the region frame in the image as the position information of the object in the image, and the size of the region frame as the actual size of the object in the image.
  • the object recognition model may be a YOLO model or other models.
  • the object recognition model is only exemplified, and no specific limitation is imposed on it.
  • the imaging device may also adopt other methods to implement step A, which is only an exemplary description here, and no specific limitation is imposed on it.
  • Step B: The camera device can determine, from the at least one object identified in the image, which one is the object to be detected, and its actual size in the image.
  • the corresponding relationship between image position information and distance information is pre-configured on the imaging device.
  • the image position information in the corresponding relationship refers to a position in the image, and the position is unique in the image
  • the distance in the corresponding relationship refers to the distance, relative to the camera device, of the position in the monitored area that corresponds to the image position.
  • image positions correspond one-to-one to positions in the monitoring area, and the distance between a position in the monitoring area and the camera device is unique.
  • Therefore, each image position can be associated with the radar-measured distance of its corresponding position in the monitoring area.
  • according to the corresponding relationship between image position information and distance information, the imaging device can determine the distance between each recognized object and the imaging device from the image position of each object recognized in the image.
  • among the distances between each recognized object and the imaging device, the imaging device can find the one that is the same as the distance between the object to be detected and the imaging device, look up the corresponding image position in the image position-distance relationship, and take the object at the found image position as the object to be detected.
  • the imaging device may then use the actual size of that object in the image as the actual size of the object to be detected in the image. It should be noted that "the same" distance here allows a difference within a preset range.
  • the determined distance between the object to be detected and the imaging device is distance 1
  • two objects are identified from the image, object 1 and object 2, respectively.
  • the position of the recognized object 1 in the image is position 1
  • the actual size in the image is size 1.
  • the position of the identified object 2 in the image is position 2, and the actual size in the image is size 2.
  • the camera device can look up the distance values distance 1 and distance 2 corresponding to position 1 and position 2 in Table 1, and compare them with the distance between the object to be detected and the camera device (namely distance 1) to determine that the distance 1 corresponding to position 1 is the same as the distance between the object to be detected and the imaging device. Then, since the position of object 1 in the image is position 1, the imaging device determines that object 1 is the object to be detected, and uses the actual size of object 1 in the image as the actual size of the object to be detected in the image.
  • two or more objects may be at the same distance from the camera device. If the distance is the same as the distance between the object to be detected and the imaging device, the two or more objects are all determined to be objects to be detected.
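Step B above (matching recognized objects to the radar-measured distance via the pre-configured position-to-distance table) can be sketched as follows. The table contents, tolerance value, and function names are illustrative assumptions:

```python
# Hypothetical pre-configured correspondence: image position -> distance in meters.
POSITION_TO_DISTANCE = {(120, 340): 18.0, (400, 80): 42.5}

def find_objects_at_radar_distance(recognized, radar_distance_m, tolerance_m=1.0):
    """recognized: list of (image_position, actual_size_in_image) pairs from step A.
    Returns every recognized object whose table distance matches the radar-measured
    distance within the preset tolerance; all matches are treated as objects to be
    detected, since several objects can sit at the same distance."""
    matches = []
    for position, size in recognized:
        distance = POSITION_TO_DISTANCE.get(position)
        if distance is not None and abs(distance - radar_distance_m) <= tolerance_m:
            matches.append((position, size))
    return matches
```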
  • Step 103 The camera device determines whether the object to be detected is a designated object based on the actual size and the scaled size range.
  • the scaled size range means: assuming that the object to be detected is a designated object, the size range of the designated object in the image at the same position.
  • the actual size refers to the actual size of the object to be detected in the image.
  • the imaging device first assumes that the object to be detected is a designated object, and then verifies whether the assumption is true by calibrating the size range and the actual size, so as to determine whether the object to be detected is the designated object.
  • the camera device can detect whether the actual size is within the calibrated size range. If the actual size is within the calibrated size range, it is determined that the object to be detected is a designated object. If the actual size is not within the calibrated size range, it is determined that the object to be detected is not a designated object.
  • the thermal imaging image can provide more information about the object, such as the size of the object in the image, the position of the object in the image, and the current temperature of the object. Therefore, in order to improve the accuracy of object detection, the above-mentioned collected image may be a thermal imaging image.
  • the temperature information in the thermal imaging image can be combined to eliminate the interference of some small target objects (such as tree branches, rain and snow, etc.).
  • the imaging device can determine the actual size of the object to be detected in the image according to the collected thermal imaging image containing the object to be detected, and determine the current temperature of the object to be detected according to the thermal imaging image.
  • if the actual size is within the calibrated size range and the current temperature information meets the preset temperature requirement, the imaging device may determine that the object to be detected is the designated object;
  • if the actual size is not within the calibrated size range, and/or the current temperature information does not meet the preset temperature requirement, the imaging device may determine that the object to be detected is not the designated object.
  • the preset temperature requirement is 35-38 degrees Celsius.
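As an illustrative sketch of the combined size-and-temperature check described above (a minimal Python sketch; the function name and the 35-38 degrees Celsius default are example assumptions, not a prescribed implementation):

```python
def is_designated_object(actual_size_mm, calibrated_range_mm, temp_c,
                         temp_range_c=(35.0, 38.0)):
    """Return True only if the measured in-image size falls inside the
    calibrated size range AND the object's current temperature meets the
    preset temperature requirement (e.g. 35-38 degrees Celsius)."""
    size_ok = calibrated_range_mm[0] <= actual_size_mm <= calibrated_range_mm[1]
    temp_ok = temp_range_c[0] <= temp_c <= temp_range_c[1]
    return size_ok and temp_ok
```

For example, a 32mm object at 36.5 degrees passes, while a 51mm object or a 20-degree object is rejected.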
  • the images collected in this application may be thermal imaging images, visible light images, or a combination of the two.
  • the imaging device may determine the actual size of the object to be detected in the image according to the acquired visible light image containing the object to be detected.
  • if the actual size is within the calibrated size range, the imaging device can determine that the object to be detected is the designated object; if the actual size is not within the calibrated size range, the imaging device can determine that the object to be detected is not the designated object.
  • the imaging device may determine the first actual size of the object to be detected in the image according to the collected thermal imaging image containing the object to be detected, determine the current temperature information of the object to be detected according to the thermal imaging image, and determine the second actual size of the object to be detected in the image according to the collected visible light image containing the object to be detected.
  • if the first actual size and the second actual size are both within the calibrated size range and the current temperature information meets the preset temperature requirement, it is determined that the object to be detected is the designated object; if the first actual size is not within the calibrated size range, and/or the second actual size is not within the calibrated size range, and/or the current temperature information does not meet the preset temperature requirement, it is determined that the object to be detected is not the designated object.
  • the imaging device may also fuse the thermal imaging image and the visible light image containing the object to be detected, and output the fused image.
  • the camera device can also output visible light images, thermal imaging images, and so on. There is no specific limitation here.
  • since the fused image contains both the detail information of the visible light image and the temperature information of the thermal imaging image, it carries more information, which is more conducive to subsequent processing of the image.
  • because the radar detection signal is not easily affected by environmental factors, the distance information between the object to be detected and the camera device determined based on the radar detection signal is more accurate, and the calibrated size range determined based on the distance information is likewise more accurate. Therefore, the method proposed in this application of implementing object detection jointly based on the radar detection signal and the image greatly improves the accuracy of object detection.
  • the temperature information in the thermal imaging image is also combined during object detection, so that the interference of small target objects (such as tree branches) with object detection can be reduced, making object detection more accurate.
  • the visible light image is also combined when detecting objects, so that the accuracy of object detection is higher.
  • the above-mentioned method can also be used to detect the objects. For details, please refer to the above-mentioned steps, which will not be repeated here.
  • this application also provides a camera device.
  • FIG. 2 is a schematic diagram of a camera device according to an exemplary embodiment of the present application.
  • the camera equipment includes a radar module, an image module and a processor.
  • the camera device may also include other modules, such as communication modules, etc.
  • the modules integrated with the camera device are only exemplified, and the camera device is not specifically limited.
  • the radar module is connected to the processor through a digital interface, and the image module is connected to the processor through a digital interface.
  • the radar module can adopt an integrated module solution or a discrete module solution.
  • FIG. 3 is a schematic diagram of an integrated architecture of a radar module according to an exemplary embodiment of the present application.
  • the radar module includes: a radar transceiver module (for example, the radar transceiver module includes a radar detector, an antenna, etc.), which is used to send radar detection signals and receive reflected radar detection signals.
  • the digital signal acquisition module performs digital signal acquisition based on the transmitted and reflected radar detection signals.
  • the 2D FFT (two-dimensional fast Fourier transform) module performs two-dimensional fast Fourier transform on the collected digital signal.
  • the two-dimensional peak search module can perform two-dimensional peak search processing on the transformed signal.
  • the constant false alarm detection module performs constant false alarm detection on the output result of the two-dimensional peak search module.
  • the clustering module clusters the constant false alarm detection results.
  • the track output module can determine the spatial position information of the object to be detected based on the clustering result.
  • the spatial position information of the object to be detected may include: the distance between the object to be detected and the imaging device, the speed of the object to be detected, and so on.
  • the above is only an exemplary description of the spatial position information and does not specifically limit it.
  • the modules shown in Figure 3 can be integrated into the radar module.
  • the radar module directly outputs the spatial position information of the object to be detected.
  • part of the modules in Figure 3 can be deployed in the radar module, and another part of the modules can be deployed in the processor of the camera device.
  • for example, deploy the radar transceiver module in the radar module, and deploy the remaining modules in the processor.
  • the radar module sends the radar detection signal and related information to the processor, and the processor determines the spatial position information of the object to be detected from the radar detection signal based on the remaining modules.
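The signal chain of Fig. 3 (digital acquisition, 2D FFT, two-dimensional peak search) can be sketched roughly as follows; the FMCW data layout (chirps by samples) and the synthetic target are illustrative assumptions, and the constant-false-alarm detection and clustering stages are omitted:

```python
import numpy as np

def range_doppler_map(iq_samples):
    """2D FFT over a radar data cube: the FFT along fast time (within each
    chirp) yields range bins; the FFT along slow time (across chirps) yields
    Doppler/velocity bins. iq_samples has shape (num_chirps, samples_per_chirp)."""
    rng = np.fft.fft(iq_samples, axis=1)                       # range FFT
    dopp = np.fft.fftshift(np.fft.fft(rng, axis=0), axes=0)    # Doppler FFT
    return np.abs(dopp)

def strongest_target(rd_map):
    """Two-dimensional peak search: (doppler_bin, range_bin) of the strongest cell."""
    return np.unravel_index(np.argmax(rd_map), rd_map.shape)
```

The peak's range bin maps to the object-to-camera distance and its Doppler bin to the object's speed, which together form the spatial position information used by the processor.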
  • the image module may include: a thermal imaging module, a visible light module, or a thermal imaging module and a visible light module.
  • the camera device includes a radar module, a thermal imaging module, and a processor.
  • the camera device includes a radar module, a thermal imaging module, a visible light module, and a processor.
  • the camera device includes a radar module, a visible light module, and a processor.
  • the thermal imaging module includes: a thermal imaging lens, a processing chip, and so on.
  • the thermal imaging module is not specifically limited here.
  • the thermal imaging module is used to collect thermal imaging images of the object to be detected in the monitoring area.
  • the visible light module includes a visible light lens, a processing chip, etc., and the visible light module is not specifically limited here.
  • the visible light module is used to collect the visible light image of the object to be detected in the monitoring area.
  • the processor is used to implement object detection.
  • the processor may be a logic processing chip, and the logic processing chip is configured as an SOC (system on a chip) running an operating system; object detection is implemented through the operating system.
  • the radar module is used to send a radar detection signal for detecting an object to be detected, and to receive the reflected radar detection signal.
  • the image module is used to collect an image containing the object to be detected.
  • the processor is configured to: obtain the spatial position information of the object to be detected, where the spatial position information is determined according to the radar detection signal used to detect the object to be detected and includes at least the distance information of the object to be detected relative to the imaging device and the speed information of the object to be detected; if the speed information meets the preset speed requirement, determine the calibrated size range of the object to be detected according to the distance information, and obtain the actual size of the object to be detected in the image, where the actual size is determined based on the collected image containing the object to be detected; and determine, based on the actual size and the calibrated size range, whether the object to be detected is a designated object, where the calibrated size range indicates the size range of the designated object at the same position in the image when the object to be detected is a designated object.
  • The function of the processor will be specifically introduced below through step 201 to step 203.
  • Step 201 The processor obtains the spatial position information of the object to be detected; the spatial position information is determined according to the radar detection signal used to detect the object to be detected, and the spatial position information includes at least: the object to be detected relative to the The distance information of the imaging device and the speed information of the object to be detected.
  • the radar module determines the spatial position information of the object to be detected based on the radar detection signal, and sends the spatial position information to the processor.
  • the processor can receive the spatial position information.
  • the radar module sends the radar detection signal and information related to the radar detection signal (such as the sending time and receiving time of the radar detection signal) to the processor, and the processor determines the spatial position information of the object to be detected based on the radar detection signal and the related information.
  • For details, refer to step 101 above, which will not be repeated here.
  • Step 202 If the speed information meets the preset speed requirement, the processor determines the calibrated size range of the object to be detected according to the distance information, and obtains the actual size of the object to be detected in the image; the actual size is determined according to the collected image containing the object to be detected.
  • the processor can detect whether the speed information of the object to be detected meets a preset speed requirement.
  • the speed requirement may be that the speed of the object to be detected is within the preset speed range of the specified object.
  • the speed requirement may be that the difference between the speed of the object to be detected and the preset speed of the designated object is within a preset error range.
  • if the speed information of the object to be detected does not meet the preset speed requirement, it is determined that the object to be detected is not a designated object.
  • if the speed information meets the preset speed requirement, the calibrated size range corresponding to the object to be detected is determined according to the distance information, and the actual size of the object to be detected in the image is determined according to the collected image containing the object to be detected.
  • the processor may input the distance information between the object to be detected and the imaging device into the aforementioned preset size prediction model, and obtain the calibrated size of the object to be detected output by the size prediction model.
  • the processor can determine the calibrated size range based on the calibrated size and the preset size error. For example, the processor can apply the preset size error to the calibrated size (calibrated size plus or minus the size error) to obtain the calibrated size range.
  • the processor may also use other methods to obtain the calibrated size range; this is only an exemplary description and does not constitute a specific limitation.
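A minimal sketch of this step, assuming a hypothetical inverse-distance size prediction model (the description only requires some stored distance-to-calibrated-size correspondence) and the plus/minus 5mm error used later in the worked example:

```python
def predict_calibrated_size(distance_m):
    """Hypothetical size prediction model: maps the object-to-camera
    distance to the calibrated size the designated object should occupy
    in the image. The inverse-distance form is an assumption; any stored
    distance -> calibrated-size correspondence would serve."""
    return 122.1 / distance_m  # constant chosen so 3.7m -> 33mm

def calibrated_size_range(distance_m, size_error_mm=5.0):
    """Calibrated size range = calibrated size +/- preset size error."""
    size = predict_calibrated_size(distance_m)
    return (size - size_error_mm, size + size_error_mm)
```

With these assumptions, a 3.7m distance yields a calibrated size of 33mm and a calibrated size range of 28mm-38mm, matching the worked example later in this description.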
  • the meaning of the calibrated size range of the object to be detected is: if the designated object were placed at the position of the object to be detected and the image module collected an image of the designated object at that position, the size the designated object would occupy in the collected image. In other words, assuming the object to be detected is a designated object, it is the size range of the designated object in the image at the same position.
  • the image module can recognize the captured image to obtain the image position information and actual size of each recognized object in the image, and send the image position information and actual sizes of the multiple objects in the image to the processor.
  • the processor determines the distance corresponding to the image position of each object according to the correspondence between image position information and distance information. Then, the processor can find, among the determined distances, the one that is the same as the distance between the object to be detected and the imaging device, look up the corresponding image position according to that distance, take the object at the found image position as the object to be detected, and take that object's actual size in the image as the actual size of the object to be detected in the image.
  • the image module can send the captured image to the processor.
  • the processor recognizes the collected image, obtains the image position information and actual sizes of the recognized objects in the image, and determines the distance corresponding to each object's image position according to the image position-distance correspondence. Then, the processor finds, among the determined distances, the one that is the same as the distance between the object to be detected and the imaging device, takes the object at the corresponding image position as the object to be detected, and takes that object's actual size in the image as the actual size of the object to be detected in the image.
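The matching step above can be sketched as follows; the data layout (a list of recognized objects and a pre-stored position-to-distance table) and the 0.2m tolerance are illustrative assumptions:

```python
def find_objects_to_detect(radar_distance_m, recognized, pos_to_dist, tol=0.2):
    """Match the radar-detected object to objects recognized in the image.
    recognized: list of dicts {'pos': (x, y), 'size_mm': float} from recognition.
    pos_to_dist: pre-stored image-position -> distance correspondence.
    Returns the in-image sizes of all matching objects (two or more objects
    at the same distance are all treated as objects to be detected)."""
    matches = []
    for obj in recognized:
        if abs(pos_to_dist[obj['pos']] - radar_distance_m) <= tol:
            matches.append(obj['size_mm'])
    return matches
```

Each returned size is then compared against the calibrated size range to decide whether the matched object is the designated object.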
  • step 102 For details, refer to the description of step 102 above, which will not be repeated here.
  • Step 203 The processor determines whether the object to be detected is a designated object based on the actual size and the calibrated size range.
  • when determining the calibrated size range of the object to be detected according to the distance information, the processor is configured to input the distance information into a preset size prediction model to obtain the corresponding calibrated size, and to determine the calibrated size range based on the calibrated size and the preset size error.
  • when determining whether the object to be detected is the designated object based on the actual size and the calibrated size range, the processor is configured to: if the actual size is within the calibrated size range, determine that the object to be detected is the designated object; if the actual size is not within the calibrated size range, determine that the object to be detected is not the designated object.
  • the image module is a visible light module, and the image is a visible light image.
  • the image module is a thermal imaging module, and the image is a thermal imaging image.
  • the processor is further configured to obtain current temperature information of the object to be detected; the current temperature information is determined according to the thermal imaging image.
  • when determining whether the object to be detected is the designated object based on the actual size and the calibrated size range, the processor is configured to: if the actual size is within the calibrated size range and the current temperature information meets the preset temperature requirement, determine that the object to be detected is the designated object; if the actual size is not within the calibrated size range, and/or the current temperature information does not meet the preset temperature requirement, determine that the object to be detected is not the designated object.
  • the thermal imaging module can determine the current temperature information of the object to be detected from the thermal imaging image, and send the current temperature information to the processor.
  • the thermal imaging module sends the thermal imaging image to the processor, and the processor determines the current temperature information of the object to be detected from the thermal imaging image.
  • the image module includes a thermal imaging module and a visible light module, and the image includes a thermal imaging image and a visible light image. When obtaining the actual size of the object to be detected, the processor is configured to obtain the first actual size of the object to be detected in the image and the second actual size of the object to be detected in the image; the first actual size is determined according to the thermal imaging image, and the second actual size is determined according to the visible light image.
  • the processor is further configured to determine the current temperature information of the object to be detected according to the thermal imaging image.
  • when determining whether the object to be detected is a designated object based on the actual size and the calibrated size range, the processor is configured to: if the first actual size and the second actual size are both within the calibrated size range and the current temperature information meets the preset temperature requirement, determine that the object to be detected is the designated object; if the first actual size is not within the calibrated size range, and/or the second actual size is not within the calibrated size range, and/or the current temperature information does not meet the preset temperature requirement, determine that the object to be detected is not the designated object.
  • the thermal imaging module can determine, from the thermal imaging image, the current temperature information of the object to be detected and its first actual size in the image, and send the current temperature information and the first actual size to the processor.
  • the thermal imaging module sends the thermal imaging image to the processor, and the processor determines, from the thermal imaging image, the current temperature information of the object to be detected and its first actual size in the image. This is only an exemplary description and does not constitute a specific limitation.
  • the visible light module can determine the second actual size of the object to be detected in the image from the visible light image, and send the second actual size to the processor.
  • the visible light module sends the visible light image to the processor, and the processor determines the second actual size of the object to be detected in the image from the visible light image.
  • because the radar detection signal is not easily affected by environmental factors, the distance between the object to be detected and the camera device determined based on the radar detection signal is more accurate, and the calibrated size range determined based on the distance information is likewise more accurate. Therefore, the method proposed in this application of implementing object detection jointly based on the radar detection signal and the image greatly improves the accuracy of object detection.
  • this application combines the speed information and size information of the object to be detected for object detection, so that the accuracy of object detection is higher.
  • the temperature information in the thermal imaging image is also combined during object detection, so that the interference of small target objects (such as tree branches) with object detection can be reduced, making object detection more accurate.
  • object detection is also performed based on the actual size of the object to be detected in the thermal imaging image and the visible light image, so that the accuracy of object detection is higher.
  • Fig. 4b is a flowchart of an object detection method shown in an exemplary embodiment of the present application.
  • the camera device can include a radar module, a thermal imaging module, and a processor.
  • the radar module and the thermal imaging module each communicate with the processor through a digital interface.
  • the method may include the following steps.
  • the radar module can receive a radar detection signal reflected by the object to be detected to detect the object, and the thermal imaging module can collect a thermal imaging image containing the object to be detected.
  • Step 401 The processor obtains the spatial position information of the object to be detected; the spatial position information is determined according to the radar detection signal sent and received by the radar module, and the spatial position information includes at least: the object to be detected relative to the camera device The distance information and the speed information of the object to be detected.
  • Step 402 The processor detects whether the speed information meets the speed requirement.
  • If the speed information meets the speed requirement, step 403 is executed; if the speed information does not meet the speed requirement, step 407 is executed.
  • Step 403 The processor determines the scaled size range of the object to be detected according to the distance information, and obtains the actual size of the object to be detected in the image and the current temperature information of the object to be detected; the actual size and current temperature information It is determined based on the collected thermal imaging image containing the object to be detected.
  • Step 404 The processor detects whether the actual size is within the calibrated size range.
  • If the actual size is within the calibrated size range, step 405 is executed; if the actual size is not within the calibrated size range, step 407 is executed.
  • Step 405 The processor detects whether the current temperature information meets the temperature requirement.
  • If the current temperature information meets the temperature requirement, step 406 is executed; if the current temperature information does not meet the temperature requirement, step 407 is executed.
  • Step 406 The processor determines that the object to be detected is a designated object.
  • Step 407 The processor determines that the object to be detected is not a designated object.
  • For details of step 401 to step 407, refer to the above description; they will not be repeated here.
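Steps 401 to 407 can be summarized in one hedged sketch (the speed range, temperature range, and inverse-distance size model are hypothetical examples consistent with the values used elsewhere in this description):

```python
def detect(speed_mps, distance_m, actual_size_mm, temp_c,
           speed_range=(1.3, 2.7), temp_range=(35.0, 38.0), size_err_mm=5.0):
    """Decision flow of steps 401-407: speed gate first, then the
    calibrated-size check, then the temperature check; any failure means
    the object to be detected is not the designated object (step 407)."""
    if not (speed_range[0] <= speed_mps <= speed_range[1]):   # step 402
        return False                                          # step 407
    calibrated = 122.1 / distance_m                           # step 403 (hypothetical model)
    lo, hi = calibrated - size_err_mm, calibrated + size_err_mm
    if not (lo <= actual_size_mm <= hi):                      # step 404
        return False                                          # step 407
    return temp_range[0] <= temp_c <= temp_range[1]           # steps 405-406
```

A person moving at 1.6m/s at 3.7m with a 32mm in-image size and a normal body temperature passes every gate; a fast-swaying branch fails at the speed gate.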
  • Fig. 5b is a flowchart of an object detection method shown in an exemplary embodiment of the present application.
  • the camera device can include a radar module, a thermal imaging module, a visible light module, and a processor.
  • the radar module, the thermal imaging module, and the visible light module each communicate with the processor through a digital interface.
  • the method may include the following steps.
  • the radar module can receive the radar detection signal reflected by the object to be detected so as to detect the object, and the thermal imaging module and the visible light module can collect a thermal imaging image and a visible light image containing the object to be detected, respectively.
  • Step 501 The processor obtains the spatial position information of the object to be detected; the spatial position information is determined according to the radar detection signal sent and received by the radar module, and the spatial position information includes at least: the object to be detected relative to the camera device The distance information and the speed information of the object to be detected.
  • Step 502 The processor detects whether the speed information meets the speed requirement.
  • If the speed information meets the speed requirement, step 503 is executed; if the speed information does not meet the speed requirement, step 507 is executed.
  • Step 503 The processor determines the calibrated size range of the object to be detected according to the distance information, and obtains the first actual size of the object to be detected in the image, the current temperature information of the object to be detected, and the second actual size of the object to be detected in the image.
  • the first actual size and the current temperature information are determined based on the collected thermal imaging image containing the object to be detected; the second actual size is determined based on the collected visible light image containing the object to be detected.
  • Step 504 The processor detects whether the first actual size and the second actual size are both within the calibrated size range.
  • If the first actual size and the second actual size are both within the calibrated size range, step 505 is executed. If the first actual size is not within the calibrated size range and/or the second actual size is not within the calibrated size range, step 507 is executed.
  • Step 505 The processor detects whether the current temperature information meets the temperature requirement.
  • If the current temperature information meets the temperature requirement, step 506 is executed. If the current temperature information does not meet the temperature requirement, step 507 is executed.
  • Step 506 The processor determines that the object to be detected is a designated object.
  • Step 507 The processor determines that the object to be detected is not a designated object.
  • For details of step 501 to step 507, refer to the above description; they will not be repeated here.
  • for related parts, refer to the description of the method embodiment.
  • the device embodiments described above are merely illustrative: the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the solution of the present application. Those of ordinary skill in the art can understand and implement this without creative work.
  • the preset speed requirement is, for example, 1.3m/s to 2.7m/s.
  • a size prediction model is pre-stored in the camera device; the size prediction model represents the correspondence between the distance between a person and the camera device and the calibrated size.
  • the installer can walk around in the monitoring range.
  • the camera device collects distance information at several points and, in combination with the video images, establishes and stores the correspondence between the person-to-camera distance and the calibrated size.
  • the manufacturer can establish the correspondence between the distance between the person and the camera device and the calibration size and store it in the camera device.
  • the camera equipment has been collecting images of the monitored area, so the camera equipment can collect images containing moving objects.
  • Fig. 6 is a schematic diagram of the image collected by the camera device.
  • the object 1 (branch) is swaying with the wind, and the object 2 (person) moves toward the camera device.
  • the camera equipment also sends out radar detection signals all the time.
  • the camera equipment can receive the radar detection signals reflected by the object to be detected.
  • the camera device can determine the distance between the object and the camera device and the speed of the object. For example, the distance between the object 1 and the imaging device is 4.2 m, the movement speed of the object 1 is 5.0 m/s, the distance between the object 2 and the imaging device is 3.7 m, and the speed of the object 2 is 1.6 m/s.
  • by comparing the speed of each object with the preset speed requirement, the imaging device determines that the speed of object 1 does not meet the preset speed requirement, so object 1 is not a designated object; the speed of object 2 meets the preset speed requirement.
  • the imaging device thus determines, from the radar detection signal, that there is an object to be detected for which it must be further determined whether it is the designated object.
  • the camera device inputs the distance 3.7m associated with the speed value 1.6m/s into the size prediction model to obtain a corresponding calibrated size of 33mm; with a preset size error of ±5mm, the calibrated size range is 28mm-38mm.
  • the camera device inputs the image into a preset object recognition model to recognize the image, recognizes the object 2 and the object 3 from the image, and determines the area frames B1 and B2 of the object 2 and the object 3.
  • the imaging device takes the intersection of the diagonals of an area frame as the object's position in the image and the height of the area frame as the object's actual size in the image. Then the position of object 2 is O1 (579,328) with an actual size of 32mm in the image, and the position of object 3 is O2 (1105,413) with an actual size of 51mm in the image.
  • the camera device is also pre-configured with the correspondence between image position and distance, and the distances corresponding to positions O1 and O2 are found to be 3.6m and 2.5m respectively. The imaging device compares these two distance values with the 3.7m distance of the object to be detected and determines that O1 meets the requirement, so object 2 corresponding to position O1 is determined to be the object to be detected, and the 32mm actual size of object 2 in the image is taken as the actual size of the object to be detected in the image.
  • the imaging device compares the actual size of 32mm with the calibrated size range of 28mm-38mm and determines that the actual size is within the calibrated size range, so the object to be detected is determined to be the designated object; that is, object 2 is a person.
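The worked example above can be reproduced end to end in a short sketch (the inverse-distance size model and the 0.2m distance tolerance are assumptions chosen to match the example's numbers):

```python
# Numbers from the worked example: object 1 is a branch, object 2 is a person.
objects = [
    {'name': 'object 1', 'speed': 5.0, 'dist': 4.2},
    {'name': 'object 2', 'speed': 1.6, 'dist': 3.7},
]
pos_to_dist = {(579, 328): 3.6, (1105, 413): 2.5}    # pre-configured table
pos_to_size = {(579, 328): 32.0, (1105, 413): 51.0}  # sizes from recognition

def classify(obj):
    """Speed gate, then calibrated-size check against the matched in-image size."""
    if not (1.3 <= obj['speed'] <= 2.7):                         # speed requirement
        return 'not designated'
    lo, hi = 122.1 / obj['dist'] - 5, 122.1 / obj['dist'] + 5    # 28mm-38mm at 3.7m
    sizes = [pos_to_size[p] for p, d in pos_to_dist.items()
             if abs(d - obj['dist']) <= 0.2]                     # position-distance match
    return 'designated' if any(lo <= s <= hi for s in sizes) else 'not designated'
```

Object 1 fails the speed gate; object 2 matches position O1 (3.6m against 3.7m) and its 32mm size falls inside 28mm-38mm, so it is classified as the designated object.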


Abstract

An object detection method and a camera device. The method includes: determining spatial position information of an object to be detected according to a radar detection signal used to detect the object, the spatial position information including at least distance information of the object to be detected relative to the camera device and speed information of the object to be detected (101); in response to the speed information meeting a preset speed requirement, determining a calibrated size range corresponding to the object to be detected according to the distance information, and determining an actual size of the object to be detected according to a captured image containing the object, where the calibrated size range represents the size range of a specified object in the image when the object to be detected is the specified object (102); and determining, based on the actual size and the calibrated size range, whether the object to be detected is the specified object (103).

Description

Object detection method and camera device
Technical field
This application relates to the field of surveillance, and in particular to an object detection method and a camera device.
Background
Object detection, also called perimeter detection, refers to techniques for detecting whether a preset specified object appears in a monitored area. Perimeter detection is the first line of defense of a security system and is therefore of great importance to the system as a whole.
However, perimeter detection methods are easily affected by weather, occlusions, and the like, which lowers the accuracy of the detection results.
Summary
In view of this, this application provides an object detection method and a camera device for improving the accuracy of object detection.
According to a first aspect of this application, an object detection method is provided. The method is applied to a camera device and includes:
determining spatial position information of an object to be detected according to a radar detection signal used to detect the object, the spatial position information including at least: distance information of the object to be detected relative to the camera device and speed information of the object to be detected;
in response to the speed information meeting a preset speed requirement, determining a calibrated size range corresponding to the object to be detected according to the distance information, and determining an actual size of the object to be detected in a captured image containing the object, where the calibrated size range represents the size range of a specified object in the image when the object to be detected is the specified object;
determining, based on the actual size and the calibrated size range, whether the object to be detected is the specified object.
Optionally, determining the calibrated size range corresponding to the object to be detected according to the distance information includes:
inputting the distance information into a preset size prediction model to obtain a corresponding calibrated size, where the size prediction model represents a correspondence between distance information and calibrated sizes;
determining the calibrated size range based on the calibrated size and a preset size error.
Optionally, determining whether the object to be detected is the specified object based on the actual size and the calibrated size range includes:
in response to the actual size being within the calibrated size range, determining that the object to be detected is the specified object;
in response to the actual size being outside the calibrated size range, determining that the object to be detected is not the specified object.
Optionally, the image is a thermal image.
Optionally, determining the actual size of the object to be detected in the captured image containing the object further includes: determining current temperature information of the object to be detected according to the thermal image;
determining whether the object to be detected is the specified object based on the actual size and the calibrated size range includes:
in response to the actual size being within the calibrated size range and the current temperature information meeting a preset temperature requirement, determining that the object to be detected is the specified object;
in response to the actual size being outside the calibrated size range and/or the current temperature information not meeting the preset temperature requirement, determining that the object to be detected is not the specified object.
Optionally, the image includes a visible-light image.
Optionally, the image includes a thermal image and a visible-light image;
determining the actual size of the object to be detected in the captured image containing the object includes:
determining a first actual size of the object to be detected from the thermal image;
determining a second actual size of the object to be detected from the visible-light image;
determining the actual size of the object to be detected in the captured image containing the object further includes: determining current temperature information of the object to be detected according to the thermal image;
detecting whether the object to be detected is the specified object based on the actual size and the calibrated size range includes:
in response to both the first actual size and the second actual size being within the calibrated size range and the current temperature information meeting the preset temperature requirement, determining that the object to be detected is the specified object;
in response to the first actual size being outside the calibrated size range, and/or the second actual size being outside the calibrated size range, and/or the current temperature information not meeting the preset temperature requirement, determining that the object to be detected is not the specified object.
According to a second aspect of this application, a camera device is provided, including:
a radar module, configured to send a radar detection signal for detecting an object to be detected and to receive the reflected radar detection signal;
an image module, configured to capture an image containing the object to be detected;
a processor, configured to obtain spatial position information of the object to be detected, the spatial position information being determined according to the radar detection signal for detecting the object and including at least: distance information of the object to be detected relative to the camera device and speed information of the object to be detected; in response to the speed information meeting a preset speed requirement, determine a calibrated size range of the object to be detected according to the distance information, and obtain an actual size of the object to be detected in the image, the actual size being determined according to the captured image containing the object; and determine, based on the actual size and the calibrated size range, whether the object to be detected is a specified object;
where the calibrated size range represents the size range of the specified object in the image when the object to be detected is the specified object.
Optionally, when determining the calibrated size range of the object to be detected according to the distance information, the processor is configured to input the distance information into a preset size prediction model to obtain a corresponding calibrated size, and to determine the calibrated size range based on the calibrated size and a preset size error, where the size prediction model represents a correspondence between distance information and calibrated sizes.
Optionally, when determining whether the object to be detected is the specified object based on the actual size and the calibrated size range, the processor is configured to: in response to the actual size being within the calibrated size range, determine that the object to be detected is the specified object; and in response to the actual size being outside the calibrated size range, determine that the object to be detected is not the specified object.
Optionally, the image module is a thermal imaging module and the image is a thermal image;
the processor is further configured to obtain current temperature information of the object to be detected, the current temperature information being determined according to the thermal image;
when determining whether the object to be detected is the specified object based on the actual size and the calibrated size range, the processor is configured to: in response to the actual size being within the calibrated size range and the current temperature information meeting a preset temperature requirement, determine that the object to be detected is the specified object; and in response to the actual size being outside the calibrated size range and/or the current temperature information not meeting the preset temperature requirement, determine that the object to be detected is not the specified object.
Optionally, the image module is a visible-light module and the image is a visible-light image.
Optionally, the image module includes a thermal imaging module and a visible-light module, and the image includes a thermal image and a visible-light image;
when obtaining the actual size of the object to be detected in the image, the processor is configured to obtain a first actual size and a second actual size of the object to be detected, the first actual size being determined according to the thermal image and the second actual size according to the visible-light image;
the processor is further configured to determine current temperature information of the object to be detected according to the thermal image;
when determining whether the object to be detected is the specified object based on the actual size and the calibrated size range, the processor is configured to: in response to both the first actual size and the second actual size being within the calibrated size range and the current temperature information meeting the preset temperature requirement, determine that the object to be detected is the specified object; and in response to the first actual size being outside the calibrated size range, and/or the second actual size being outside the calibrated size range, and/or the current temperature information not meeting the preset temperature requirement, determine that the object to be detected is not the specified object.
As can be seen from the above description, because radar detection signals are not easily affected by environmental factors, the distance between the object to be detected and the camera device determined from the radar detection signal is more accurate, and the calibrated size range derived from that distance information is in turn more accurate. The approach proposed in this application, which performs object detection jointly from radar detection signals and images, therefore greatly improves detection accuracy.
Brief description of the drawings
Fig. 1 is a flowchart of an object detection method according to an exemplary embodiment of this application;
Fig. 2 is a schematic diagram of a camera device according to an exemplary embodiment of this application;
Fig. 3 is a schematic diagram of an integrated architecture of a radar module according to an exemplary embodiment of this application;
Fig. 4a is a schematic diagram of another camera device according to an exemplary embodiment of this application;
Fig. 4b is a flowchart of an object detection method according to an exemplary embodiment of this application;
Fig. 5a is a schematic diagram of another camera device according to an exemplary embodiment of this application;
Fig. 5b is a flowchart of an object detection method according to an exemplary embodiment of this application;
Fig. 6 is a schematic diagram of a surveillance image captured by the camera device.
Detailed description
Exemplary embodiments are described in detail here, with examples shown in the accompanying drawings. When the following description refers to the drawings, unless otherwise indicated, the same numerals in different drawings denote the same or similar elements. The implementations described in the following exemplary embodiments do not represent all implementations consistent with this application; rather, they are merely examples of apparatuses and methods consistent with some aspects of this application as detailed in the appended claims.
The terms used in this application are for the purpose of describing particular embodiments only and are not intended to limit this application. The singular forms "a", "the", and "said" used in this application and the appended claims are also intended to include the plural forms, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and includes any and all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, and so on may be used in this application to describe various pieces of information, such information should not be limited to these terms. These terms are only used to distinguish information of the same type from one another. For example, without departing from the scope of this application, first information may also be called second information, and similarly, second information may also be called first information. Depending on the context, the word "if" as used herein may be interpreted as "when", "upon", or "in response to determining".
This application proposes an object detection method for accurately determining whether an object to be detected appearing in a monitored area is a specified object.
In implementation, the camera device integrates a radar module and an image module. The radar module can send and receive radar detection signals for detecting an object, and the image module can capture images containing the object, so the camera device can determine, from the radar detection signal, the distance of the object to be detected relative to the camera device and the speed of the object. If the speed information meets a preset speed requirement, the camera device can determine the calibrated size range corresponding to the object according to the distance information and determine the actual size of the object according to the captured image containing it. Based on the actual size and the calibrated size range, the camera device determines whether the object to be detected is a specified object.
Because radar detection signals are not easily affected by environmental factors, the distance between the object to be detected and the camera device determined from the radar detection signal is more accurate, and the calibrated size range derived from that distance information is in turn more accurate. The approach proposed in this application, which performs object detection jointly from radar detection signals and images, therefore greatly improves detection accuracy.
Before introducing the object detection method provided by this application, the concepts involved are introduced first.
1. Specified object
The specified object is the object the user wants to monitor, that is, an object preset by the user. It may be a person, a vehicle, a small animal, and so on; no specific limitation is placed on it here.
2. Object to be detected
The object to be detected is any movable object appearing in the monitored area. It may be a person, a vehicle, a small animal, and so on; no specific limitation is placed on it here.
For example, in the embodiments of this application, assume the specified object is a person. If a movable object to be detected appears in the monitored area, it needs to be detected whether that object is a person.
3. Size prediction model
This application pre-trains a size prediction model. The model represents the correspondence between the distance from the object to be detected to the camera device and the calibrated size, where the calibrated size is the size of the specified object in the image when the object to be detected is the specified object. In other words, the input to the size prediction model is the distance between the object to be detected and the camera device, and the output is the calibrated size corresponding to the object.
The construction of the size prediction model is described below.
In implementation, at least one designated position may be selected in the monitored area.
For each designated position, the specified object is placed at that position.
Then the camera device sends a radar detection signal and receives the returned signal to determine the distance of the specified object relative to the camera device. The camera device also captures an image of the specified object and determines the object's size in the captured image as the calibrated size. The camera device thus obtains the distance and calibrated size corresponding to that designated position.
In the same way, the camera device obtains the distance and calibrated size corresponding to each designated position.
Then, with the distance corresponding to each designated position as the independent variable and the calibrated size corresponding to each designated position as the dependent variable, an independent-dependent variable pair is generated for each designated position. The pairs are then fitted (for example, with an interpolation algorithm) to obtain the size prediction model. The size of the specified object in images captured at multiple different positions at the same distance from the camera device is the same.
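The fitting described above can be sketched as follows. This is a minimal illustration only: the (distance, calibrated size) pairs are hypothetical, and linear interpolation stands in for whatever fitting algorithm a real device uses.

```python
import numpy as np

# Hypothetical calibration pairs collected at designated positions:
# distance from the camera device (m) -> size of the specified object
# in the captured image (mm).
distances_m = np.array([2.0, 3.0, 4.0, 5.0])
calib_sizes_mm = np.array([48.0, 39.0, 31.0, 25.0])

def predict_calibrated_size(distance_m: float) -> float:
    """Size prediction model: interpolate a calibrated size from a distance."""
    return float(np.interp(distance_m, distances_m, calib_sizes_mm))

print(predict_calibrated_size(3.5))  # midway between 39 and 31 -> 35.0
```

A denser set of calibration points, or a smoother fit, would be used in practice; the point is only that the model is a distance-to-size lookup.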
The object detection method provided by this application is described in detail below.
Referring to Fig. 1, a flowchart of an object detection method according to an exemplary embodiment of this application, the method can be applied to a camera device and may include the following steps.
Step 101: the camera device determines spatial position information of the object to be detected according to the radar detection signal used to detect the object, the spatial position information including at least: distance information of the object relative to the camera device and speed information of the object.
In the embodiments of this application, the camera device may be a multi-lens camera device, with one lens unit fitted with a radar detector and another fitted with a camera for image capture.
While running, the camera device continuously transmits radar detection signals and continuously captures images. When a moving object to be detected is in the monitored area, the camera device can receive, through the radar detector, the radar detection signal reflected by the object, and capture, through the camera, images containing the object.
The camera device can determine the spatial position information of the object to be detected from the radar detection signal.
The spatial position information may include the distance of the object relative to the camera device, the speed of the object, and so on. This is only an exemplary description of the spatial position information and does not limit it specifically.
Step 102: if the speed information meets a preset speed requirement, the calibrated size range corresponding to the object to be detected is determined according to the distance information, and the actual size of the object is determined according to the captured image containing it; the calibrated size range represents the size range of the specified object in the image when the object to be detected is the specified object.
In implementation, the camera device can check whether the speed information of the object meets a preset speed requirement. For example, the requirement may be that the object's speed is within a preset speed range for the specified object; in one embodiment, the preset range for a person is 5-19 km/h and for a vehicle 20-60 km/h. As another example, the requirement may be that the difference between the object's speed and a preset speed of the specified object is within a preset error range (for example, ±5%). These are only exemplary descriptions of the speed requirement and do not limit it specifically.
If the speed information of the object does not meet the preset speed requirement, it is determined that the object is not the specified object.
If the speed information meets the preset speed requirement, the calibrated size range corresponding to the object is determined according to the distance information, and the actual size of the object is determined according to the captured image containing it.
How these two sizes are obtained is described in detail below.
1) Obtaining the calibrated size range of the object to be detected
In implementation, the camera device can input the distance between the object and the camera device into the preset size prediction model described above and obtain the calibrated size output by the model.
The camera device can then determine the calibrated size range based on the calibrated size and a preset size error. For example, the camera device may apply the preset size error to the calibrated size to obtain the calibrated size range. Of course, the camera device may obtain the calibrated size range in other ways; this is only an exemplary description and does not limit it specifically.
Note that the calibrated size range of the object to be detected means: if the specified object were placed at the current position of the object to be detected, that is, at the same distance from the camera device as the object to be detected, and the camera device captured an image of the specified object at that position, this is the size the specified object would have in the captured image. In other words: assuming the object to be detected is the specified object, this is the actual size the specified object would have in the image at the same position.
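As a sketch, the range computation reduces to applying the preset error on both sides of the model output. The ±5 mm error used here matches the worked example later in the document; the function name is illustrative.

```python
def calibrated_size_range(calibrated_size_mm: float, size_error_mm: float = 5.0):
    """Return (low, high) by applying the preset size error to the calibrated size."""
    return (calibrated_size_mm - size_error_mm, calibrated_size_mm + size_error_mm)

# A calibrated size of 33 mm with a +/-5 mm error gives the range 28-38 mm.
low, high = calibrated_size_range(33.0)
print(low, high)  # 28.0 38.0
```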
2) Obtaining the actual size of the object to be detected
In one embodiment, the camera device can obtain the actual size of the object from the captured image according to the distance between the object and the camera device.
The way the actual size is obtained is described in detail below through steps A and B.
Step A: the camera device performs object recognition on the image and identifies, for at least one object, its position in the image and its actual size in the image.
For example, the camera device may input the image into a preset object recognition model, which performs object recognition on the input image and outputs a region box for at least one object in the image. Then, for each recognized object's region box, the camera device can take the position of the region box in the image as the object's position information in the image and the size of the region box as the object's actual size in the image.
The object recognition model may be a YOLO model or another model; this is only an exemplary description of the object recognition model and does not limit it specifically.
Of course, the camera device may implement step A in other ways; this is only an exemplary description and does not limit it specifically.
Step B: from the at least one object recognized in the image, the camera device determines the object to be detected and its actual size in the image.
The camera device is preconfigured with a correspondence between image position information and distance information. An image position in this correspondence is a position in the image, unique within the image; the distance is the distance from the camera device of the position in the monitored area corresponding to that image position. Image positions correspond one-to-one with positions in the monitored area, and the distance from a position in the monitored area to the camera device is unique, so this correspondence links an image position with the radar-measured distance of the corresponding position in the monitored area.
When implementing step B, the camera device can use this correspondence between image position information and distance information to determine, from the image position of each object recognized in the image, the distance between each recognized object and the camera device.
Then, among the distances of the recognized objects from the camera device, the camera device determines the one equal to the distance between the object to be detected and the camera device, looks up the corresponding image position in the recognized objects' image-position-to-distance relation based on the determined distance, and takes the object at the found image position as the object to be detected. The camera device can take that object's actual size in the image as the actual size of the object to be detected. Note that "equal" here may mean that the difference between the distances is within a preset range.
For example, the correspondence between image position information and distance information is shown in Table 1.
Image position | Distance
Position 1 | Distance 1
Position 2 | Distance 2
Position 3 | Distance 3
Position 4 | Distance 2
Position 5 | Distance 4
Position 6 | Distance 5
Position M | Distance N
Table 1
Suppose the distance determined between the object to be detected and the camera device is Distance 1, and two objects are recognized in the image: object 1 and object 2. Object 1's position in the image is Position 1 and its actual size in the image is Size 1; object 2's position in the image is Position 2 and its actual size in the image is Size 2.
The camera device can look up in Table 1 the distance values Distance 1 and Distance 2 corresponding to Position 1 and Position 2, compare them with the distance between the object to be detected and the camera device (that is, Distance 1), and determine that Distance 1 corresponding to Position 1 equals that distance. Then, since object 1 is at Position 1 in the image, the camera device determines that object 1 is the object to be detected and takes object 1's actual size in the image as the actual size of the object to be detected.
In some embodiments, two or more objects may be at the same distance from the camera device. If that distance equals the distance between the object to be detected and the camera device, all of those objects are determined to be objects to be detected.
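Step B above can be sketched as follows, assuming the recognizer has already produced (image position, actual size) pairs and the position-to-distance table is preconfigured. The function name, the data, and the tolerance value are illustrative, not part of the original disclosure.

```python
def find_objects_to_detect(recognized, position_to_distance, radar_distance_m,
                           tolerance_m=0.2):
    """Match recognized objects against the radar-measured distance.

    recognized: list of (image_position, actual_size_mm) pairs from step A.
    position_to_distance: preconfigured map, image position -> distance (m).
    Returns every (position, size) whose table distance matches the radar
    distance within the tolerance; several objects may match.
    """
    matches = []
    for pos, size_mm in recognized:
        dist = position_to_distance.get(pos)
        if dist is not None and abs(dist - radar_distance_m) <= tolerance_m:
            matches.append((pos, size_mm))
    return matches

# Numbers borrowed from the worked example later in the document.
table = {(579, 328): 3.6, (1105, 413): 2.5}
objs = [((579, 328), 32.0), ((1105, 413), 51.0)]
print(find_objects_to_detect(objs, table, 3.7))  # [((579, 328), 32.0)]
```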
Step 103: the camera device determines, based on the actual size and the calibrated size range, whether the object to be detected is the specified object.
Note that the calibrated size range represents: assuming the object to be detected is the specified object, the size range the specified object would have in the image at the same position, whereas the actual size is the object's true size in the image. In the embodiments of this application, the camera device first assumes that the object to be detected is the specified object and then verifies the assumption using the calibrated size range and the actual size, thereby determining whether the object is the specified object.
In one optional implementation, the camera device can check whether the actual size is within the calibrated size range. If the actual size is within the calibrated size range, the object is determined to be the specified object; if it is not, the object is determined not to be the specified object.
Because a thermal image can provide more information about an object, such as its size in the image, its position in the image, and its current temperature, the captured image may be a thermal image to improve the accuracy of object detection.
Furthermore, to further improve detection accuracy, the temperature information in the thermal image can be combined to rule out interference from small objects (such as branches, rain, or snow).
Therefore, in another optional implementation, the camera device can determine the object's actual size in the image from the captured thermal image containing it, and determine the object's current temperature from that thermal image.
If the actual size is within the calibrated size range and the current temperature information meets a preset temperature requirement, the camera device can determine that the object to be detected is the specified object;
if the actual size is outside the calibrated size range and/or the current temperature information does not meet the preset temperature requirement, the camera device can determine that the object to be detected is not the specified object.
In one embodiment, the preset temperature requirement for a person is 35-38 degrees Celsius.
In addition, since visible-light images provide more image detail and are convenient for evidence collection and confirmation, the image captured in this application may be a thermal image, a visible-light image, or a combination of the two.
In one optional implementation, the camera device can determine the object's actual size in the image from a captured visible-light image containing it.
If the actual size is within the calibrated size range, the camera device can determine that the object to be detected is the specified object; if the actual size is outside the calibrated size range, the camera device can determine that the object to be detected is not the specified object.
In another optional implementation, the camera device can determine the object's first actual size in the image from a captured thermal image containing it, determine the object's current temperature from that thermal image, and determine the object's second actual size in the image from a captured visible-light image containing it.
If both the first actual size and the second actual size are within the calibrated size range and the current temperature information meets the preset temperature requirement, the object is determined to be the specified object; if the first actual size is outside the calibrated size range, and/or the second actual size is outside the calibrated size range, and/or the current temperature information does not meet the preset temperature requirement, the object is determined not to be the specified object.
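The combined check can be sketched as a single predicate. The default thresholds below are hypothetical examples consistent with the values mentioned above (28-38 mm for the size range, 35-38 °C for a person); a real device would obtain them from the size prediction model and its configuration.

```python
def is_specified_object(first_size_mm, second_size_mm, temp_c,
                        size_range=(28.0, 38.0), temp_range=(35.0, 38.0)):
    """The object is the specified object only if both actual sizes fall inside
    the calibrated size range and the current temperature meets the requirement."""
    low, high = size_range
    t_low, t_high = temp_range
    sizes_ok = low <= first_size_mm <= high and low <= second_size_mm <= high
    temp_ok = t_low <= temp_c <= t_high
    return sizes_ok and temp_ok

print(is_specified_object(32.0, 33.0, 36.5))  # True
print(is_specified_object(32.0, 51.0, 36.5))  # False: second size out of range
print(is_specified_object(32.0, 33.0, 20.0))  # False: temperature too low
```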
In addition, in the embodiments of this application, to output an image carrying more information, after determining that the object to be detected is the specified object, the camera device can fuse the thermal image and the visible-light image containing the object and output the fused image. Of course, the camera device may also output the visible-light image, the thermal image, and so on; this is not specifically limited here.
Since the fused image contains both the detail information of the visible-light image and the temperature information of the thermal image, it carries more information and is more useful for subsequent image processing.
As can be seen from the above description, on the one hand, because radar detection signals are not easily affected by environmental factors, the distance between the object to be detected and the camera device determined from the radar detection signal is more accurate, and the calibrated size range derived from the distance information is in turn more accurate; the approach proposed in this application, which performs object detection jointly from radar detection signals and images, therefore greatly improves detection accuracy.
On the other hand, object detection combines the object's speed information and size, making detection more accurate.
Third, in some embodiments, detection also incorporates the temperature information in the thermal image, reducing the interference of small objects (such as branches) with detection and further improving accuracy.
Fourth, in some embodiments, detection also incorporates the visible-light image, further improving accuracy.
When the spatial position information of multiple objects to be detected is determined from radar detection signals, the above method can also be used to detect the objects; see the above steps for details, which are not repeated here.
In addition, this application also provides a camera device.
Referring to Fig. 2, a schematic diagram of a camera device according to an exemplary embodiment of this application, the camera device includes: a radar module, an image module, and a processor.
In practical applications, the camera device may also include other modules, such as a communication module; this is only an exemplary description of the modules integrated in the camera device and does not limit the camera device specifically.
The radar module is connected to the processor through a digital interface, and so is the image module.
1) Radar module
The radar module may adopt an integrated-module scheme or a discrete-module scheme.
Referring to Fig. 3, a schematic diagram of an integrated architecture of a radar module according to an exemplary embodiment of this application.
As shown in Fig. 3, the radar module includes a radar transceiver module (including, for example, a radar detector and an antenna) for sending radar detection signals and receiving the reflected signals. A digital signal acquisition module acquires digital signals based on the transmitted and reflected radar detection signals. A 2D FFT (two-dimensional fast Fourier transform) module performs a two-dimensional fast Fourier transform on the acquired digital signals. A two-dimensional peak search module then performs 2D peak search on the transformed signals, a constant false alarm rate (CFAR) detection module performs CFAR detection on the peak-search output, and a clustering module clusters the CFAR results. A track output module can determine the spatial position information of the object to be detected based on the clustering results; the spatial position information may include the distance between the object and the camera device, the object's speed, and so on. This is only an exemplary description of the spatial position information and does not limit it specifically.
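The 2D FFT stage of this chain can be illustrated with a toy range-Doppler map: a matrix of beat samples (rows = chirps, columns = samples per chirp) is Fourier-transformed along both axes, after which peak search locates targets in range and Doppler (speed) bins. The signal below is synthetic and the bin values are arbitrary; it only demonstrates the transform-then-peak-search idea, not the device's actual signal processing.

```python
import numpy as np

# Synthetic FMCW data cube: 32 chirps x 64 samples per chirp, one target.
n_chirps, n_samples = 32, 64
range_bin, doppler_bin = 10, 5  # the target's true bins (chosen for the demo)
c = np.arange(n_chirps).reshape(-1, 1)
s = np.arange(n_samples).reshape(1, -1)
cube = np.exp(2j * np.pi * (range_bin * s / n_samples + doppler_bin * c / n_chirps))

# 2D FFT: the fast-time axis gives range, the slow-time axis gives Doppler.
rd_map = np.abs(np.fft.fft2(cube))

# 2D peak search: the strongest cell recovers the target's bins.
peak = tuple(int(i) for i in np.unravel_index(np.argmax(rd_map), rd_map.shape))
print(peak)  # (5, 10) -> (doppler bin, range bin)
```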
In the integrated-module scheme, all the modules shown in Fig. 3 can be integrated into the radar module, and the radar module directly outputs the spatial position information of the object to be detected.
In the discrete-module scheme, some of the modules in Fig. 3 can be deployed in the radar module and the rest in the camera device's processor.
For example, the radar transceiver module is deployed in the radar module and the remaining modules in the processor. The radar module sends the radar detection signal and related information to the processor, and the processor uses the remaining modules to determine the object's spatial position information from the radar detection signal.
This is only an exemplary description of the radar module and does not limit it specifically.
2) Image module
The image module may include: a thermal imaging module, a visible-light module, or both a thermal imaging module and a visible-light module.
In one optional implementation, the camera device includes: a radar module, a thermal imaging module, and a processor.
In another optional implementation, the camera device includes: a radar module, a thermal imaging module, a visible-light module, and a processor.
In yet another optional implementation, the camera device includes: a radar module, a visible-light module, and a processor.
The thermal imaging module includes a thermal imaging lens, a processing chip, and so on; the thermal imaging module is not specifically limited here. The thermal imaging module is used to capture thermal images of the object to be detected in the monitored area.
The visible-light module includes a visible-light lens, a processing chip, and so on; the visible-light module is not specifically limited here. The visible-light module is used to capture visible-light images of the object to be detected in the monitored area.
3) Processor
The processor is used to perform object detection. It may be a logic processing chip configured with an SoC (system on a chip), with object detection implemented through the chip's operating system.
In one embodiment of this application, the radar module is configured to send a radar detection signal for detecting the object to be detected and to receive the reflected radar detection signal.
The image module is configured to capture images containing the object to be detected.
The processor is configured to obtain the spatial position information of the object to be detected, the spatial position information being determined according to the radar detection signal for detecting the object and including at least: the distance information of the object relative to the camera device and the speed information of the object; if the speed information meets a preset speed requirement, determine the calibrated size range of the object according to the distance information and obtain the object's actual size in the image, the actual size being determined according to the captured image containing the object; and determine, based on the actual size and the calibrated size range, whether the object is a specified object, where the calibrated size range represents the size range the specified object would have in the image, at the same position, if the object to be detected were the specified object.
The role of the processor is described in detail below through steps 201 to 203.
Step 201: the processor obtains the spatial position information of the object to be detected; the spatial position information is determined according to the radar detection signal for detecting the object and includes at least: the distance information of the object relative to the camera device and the speed information of the object.
In one optional implementation, the radar module determines the object's spatial position information from the radar detection signal and sends the spatial position information to the processor, which receives it.
In another optional implementation, the radar module sends the radar detection signal and information related to it (such as the signal's transmission time and reception time) to the processor, and the processor determines the object's spatial position information from the radar detection signal and the related information. This is not specifically limited here.
For the specific implementation, see step 101, which is not repeated here.
Step 202: if the speed information meets the preset speed requirement, the processor determines the calibrated size range of the object according to the distance information and obtains the object's actual size in the image; the actual size is determined according to the captured image containing the object.
In implementation, the processor can check whether the object's speed information meets a preset speed requirement. For example, the requirement may be that the object's speed is within a preset speed range of the specified object. As another example, the requirement may be that the difference between the object's speed and a preset speed of the specified object is within a preset error range. These are only exemplary descriptions of the speed requirement and do not limit it specifically.
If the object's speed information does not meet the preset speed requirement, the object is determined not to be the specified object.
If the object's speed information meets the preset speed requirement, the calibrated size range corresponding to the object is determined according to the distance information, and the object's actual size in the image is determined according to the captured image containing it.
How these two sizes are obtained is described in detail below.
1) Obtaining the calibrated size range of the object to be detected
In implementation, the processor can input the distance between the object and the camera device into the preset size prediction model described above and obtain the calibrated size output by the model.
The processor can then determine the calibrated size range based on the calibrated size and a preset size error, for example by applying the preset size error to the calibrated size to obtain the calibrated size range. Of course, the processor may obtain the calibrated size range in other ways; this is only an exemplary description and does not limit it specifically.
Note that the calibrated size range of the object means: if the specified object were placed at the current position of the object to be detected and the image module captured an image of the specified object at that position, this is the size the specified object would have in the captured image. In other words: assuming the object to be detected is the specified object, this is the actual size the specified object would have in the image at the same position.
2) Obtaining the actual size of the object to be detected
In one optional way of determining the actual size, the image module recognizes the captured image, obtains the image position information and actual size of each recognized object in the image, and sends them to the processor. The processor determines, through the correspondence between image position information and distance information described above, the distance corresponding to each object's image position. Then, among the determined distances, the processor finds the one equal to the distance between the object to be detected and the camera device, looks up the image position in the recognized objects' image-position-to-distance relation based on the determined distance, takes the object at the found image position as the object to be detected, and takes that object's actual size in the image as the actual size of the object to be detected.
In another optional way of determining the actual size, the image module sends the captured image to the processor. The processor recognizes the captured image, obtains the image position information and actual size of each recognized object in the image, determines the distance corresponding to each object's image position through the correspondence between image positions and distances, finds among them the distance equal to that between the object to be detected and the camera device, looks up the image position based on the determined distance, takes the object at the found image position as the object to be detected, and takes that object's actual size in the image as the actual size of the object to be detected.
For details, see the description of step 102 above, which is not repeated here.
Step 203: the processor determines, based on the actual size and the calibrated size range, whether the object to be detected is a specified object.
Optionally, when determining the calibrated size range of the object according to the distance information, the processor is configured to input the distance information into the preset size prediction model to obtain the corresponding calibrated size, and to determine the calibrated size range based on the calibrated size and the preset size error.
Optionally, when determining whether the object is the specified object based on the actual size and the calibrated size range, the processor is configured to: determine that the object is the specified object if the actual size is within the calibrated size range; and determine that the object is not the specified object if the actual size is outside the calibrated size range.
Optionally, the image module is a visible-light module and the image is a visible-light image.
Optionally, the image module is a thermal imaging module and the image is a thermal image.
Optionally, the processor is further configured to obtain the object's current temperature information; the current temperature information is determined according to the thermal image.
When determining whether the object is the specified object based on the actual size and the calibrated size range, the processor is configured to: determine that the object is the specified object if the actual size is within the calibrated size range and the current temperature information meets the preset temperature requirement; and determine that the object is not the specified object if the actual size is outside the calibrated size range and/or the current temperature information does not meet the preset temperature requirement.
Note that when obtaining the object's current temperature information, in one optional implementation the thermal imaging module can determine the current temperature information from the thermal image and send it to the processor. Of course, in another optional implementation, the thermal imaging module sends the thermal image to the processor, and the processor determines the object's current temperature information from it. This is only an exemplary description and does not limit it specifically.
Optionally, the image module includes a thermal imaging module and a visible-light module, and the image includes a thermal image and a visible-light image; when obtaining the object's actual size, the processor is configured to obtain the object's first actual size in the image and the object's second actual size in the image, the first actual size being determined according to the thermal image and the second actual size according to the visible-light image.
The processor is further configured to determine the object's current temperature information according to the thermal image.
When determining whether the object is a specified object based on the actual size and the calibrated size range, the processor is configured to: determine that the object is the specified object if both the first actual size and the second actual size are within the calibrated size range and the current temperature information meets the preset temperature requirement; and determine that the object is not the specified object if the first actual size is outside the calibrated size range, and/or the second actual size is outside the calibrated size range, and/or the current temperature information does not meet the preset temperature requirement.
Note that when obtaining the object's first actual size and current temperature information, in one optional implementation the thermal imaging module can determine the object's current temperature information and first actual size in the image from the thermal image and send them to the processor. Of course, in another optional implementation, the thermal imaging module sends the thermal image to the processor, and the processor determines the object's current temperature information and first actual size in the image from it. This is only an exemplary description and does not limit it specifically.
When obtaining the object's second actual size, in one optional implementation the visible-light module can determine the object's second actual size in the image from the visible-light image and send it to the processor. Of course, in another optional implementation, the visible-light module sends the visible-light image to the processor, and the processor determines the object's second actual size in the image from it. This is only an exemplary description and does not limit it specifically.
As can be seen from the above description, on the one hand, because radar detection signals are not easily affected by environmental factors, the distance between the object and the camera device determined from them is more accurate, and the calibrated size range derived from the distance information is in turn more accurate; the approach proposed in this application, which performs object detection jointly from radar detection signals and images, therefore greatly improves detection accuracy.
On the other hand, this application combines the object's speed information and size information for object detection, making detection more accurate.
Third, in some embodiments, detection also incorporates the temperature information in the thermal image, reducing the interference of small objects (such as branches) with detection and further improving accuracy.
Fourth, in some embodiments, detection is also based on the object's actual sizes in the thermal image and the visible-light image, further improving accuracy.
Referring to Fig. 4b, a flowchart of an object detection method according to an exemplary embodiment of this application.
The method can be applied to a camera device. As shown in Fig. 4a, the camera device may include: a radar module, a thermal imaging module, and a processor, with the radar module and the thermal imaging module each communicating with the processor through a digital interface. The method may include the following steps.
When an object to be detected appears in the monitored area, the radar module can receive the radar detection signal for detecting the object reflected by the object, and the thermal imaging module can capture thermal images containing the object.
Step 401: the processor obtains the spatial position information of the object to be detected; the spatial position information is determined according to the radar detection signals sent and received by the radar module and includes at least: the object's distance information relative to the camera device and the object's speed information.
Step 402: the processor checks whether the speed information meets the speed requirement.
If the speed information meets the speed requirement, step 403 is performed; if not, step 407 is performed.
Step 403: the processor determines the object's calibrated size range according to the distance information and obtains the object's actual size in the image and the object's current temperature information; the actual size and current temperature information are determined according to the captured thermal image containing the object.
Step 404: the processor checks whether the actual size is within the calibrated size range.
If the actual size is within the calibrated size range, step 405 is performed; if not, step 407 is performed.
Step 405: the processor checks whether the current temperature information meets the temperature requirement.
If the current temperature information meets the temperature requirement, step 406 is performed; if not, step 407 is performed.
Step 406: the processor determines that the object to be detected is the specified object.
Step 407: the processor determines that the object to be detected is not the specified object.
For the specific implementation of steps 401 to 407, see the description above, which is not repeated here.
Referring to Fig. 5b, a flowchart of an object detection method according to an exemplary embodiment of this application.
The method can be applied to a camera device. As shown in Fig. 5a, the camera device may include: a radar module, a thermal imaging module, a visible-light module, and a processor, with the radar module, the thermal imaging module, and the visible-light module each communicating with the processor through a digital interface. The method may include the following steps.
When an object to be detected appears in the monitored area, the radar module can receive the radar detection signal for detecting the object reflected by the object, and the thermal imaging module and the visible-light module can capture thermal images and visible-light images containing the object.
Step 501: the processor obtains the spatial position information of the object to be detected; the spatial position information is determined according to the radar detection signals sent and received by the radar module and includes at least: the object's distance information relative to the camera device and the object's speed information.
Step 502: the processor checks whether the speed information meets the speed requirement.
If the speed information meets the speed requirement, step 503 is performed; if not, step 507 is performed.
Step 503: the processor determines the object's calibrated size range according to the distance information and obtains the object's first actual size in the image, the object's current temperature information, and the object's second actual size in the image; the first actual size and current temperature information are determined according to the captured thermal image containing the object, and the second actual size according to the captured visible-light image containing it.
Step 504: the processor checks whether both the first actual size and the second actual size are within the calibrated size range.
If both are within the calibrated size range, step 505 is performed. If the first actual size and/or the second actual size is not within the calibrated size range, step 507 is performed.
Step 505: the processor checks whether the current temperature information meets the temperature requirement.
If the current temperature information meets the temperature requirement, step 506 is performed. If not, step 507 is performed.
Step 506: the processor determines that the object to be detected is the specified object.
Step 507: the processor determines that the object to be detected is not the specified object.
For the specific implementation of steps 501 to 507, see the description above, which is not repeated here.
For the implementation of the functions and roles of the units in the above device, see the implementation of the corresponding steps in the above method, which is not repeated here.
Since the device embodiments basically correspond to the method embodiments, reference may be made to the relevant description of the method embodiments. The device embodiments described above are merely illustrative: units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the scheme of this application, which a person of ordinary skill in the art can understand and implement without creative effort.
One or more embodiments of this application are described below with reference to a specific application scenario.
A camera device used for home security usually needs to monitor whether a stranger appears, so the specified object is a person; for a person, the preset speed requirement is, for example, 1.3 m/s to 2.7 m/s.
The camera device pre-stores a size prediction model representing the correspondence between a person's distance from the camera device and the calibrated size. In one example, when the camera device is installed, the installer walks around the monitored area while the camera device collects distance information at several points and, combined with the video images, builds and stores the correspondence between a person's distance from the camera device and the calibrated size. In other examples, the manufacturer may build this correspondence and pre-store it in the camera device.
The camera device continuously captures images of the monitored area, so it can capture images containing moving objects. Fig. 6 is a schematic diagram of an image captured by the camera device, in which object 1 (a branch) sways in the wind and object 2 (a person) moves toward the camera device.
The camera device also continuously sends out radar detection signals. When there are moving objects in the monitored area (the swaying branch, the person moving toward the camera device), the camera device can receive the radar detection signals reflected by them. Based on the sent and received radar detection signals, the camera device can determine each object's distance from the camera device and its speed: for example, object 1 is 4.2 m from the camera device moving at 5.0 m/s, and object 2 is 3.7 m from the camera device moving at 1.6 m/s. By comparing the objects' speeds with the preset speed requirement, the camera device determines that object 1's speed does not meet the preset speed requirement, so object 1 is not the specified object, while object 2's speed does meet it. At this point, the camera device determines from the radar detection signal that there is an object to be detected for which it must be further determined whether it is the specified object. The camera device then inputs the distance of 3.7 m associated with the speed value of 1.6 m/s into the size prediction model to obtain a corresponding calibrated size of 33 mm; with a preset size error of ±5 mm, the calibrated size range is 28 mm-38 mm.
The camera device inputs the image into the preset object recognition model to recognize it, identifies object 2 and object 3 in the image, and determines their region boxes B1 and B2. The camera device takes the intersection of the diagonals of a region box as the object's position in the image and the height of the region box as the object's actual size in the image. Object 2's position is then O1 (579, 328) with an actual size of 32 mm in the image, and object 3's position is O2 (1105, 413) with an actual size of 51 mm in the image.
The camera device is also preconfigured with the correspondence between image positions and distances, from which the distances corresponding to positions O1 and O2 can be found to be 3.6 m and 2.5 m. The camera device then compares these two distance values with the 3.7 m distance of the object to be detected, determines that O1 meets the requirement, takes object 2 at position O1 as the object to be detected, and takes object 2's 32 mm actual size in the image as the actual size of the object to be detected.
The camera device then compares the actual size of 32 mm with the calibrated size range of 28 mm-38 mm, determines that the actual size is within the calibrated size range, and therefore determines that the object to be detected is the specified object; that is, object 2 is a person.
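Putting the numbers of this scenario together, the decision can be replayed in a short script. This is a sketch, not the device's implementation: the calibrated size of 33 mm is passed in directly instead of coming from a real size prediction model, and the distance-matching tolerance is a hypothetical value.

```python
def detect(speed_mps, radar_dist_m, detections, pos_to_dist, calib_size_mm,
           speed_range=(1.3, 2.7), size_err_mm=5.0, dist_tol_m=0.2):
    """Replay of the home-security example: speed gate, distance match, size gate."""
    if not (speed_range[0] <= speed_mps <= speed_range[1]):
        return False  # e.g. the swaying branch at 5.0 m/s is rejected here
    low, high = calib_size_mm - size_err_mm, calib_size_mm + size_err_mm
    for pos, size_mm in detections:
        if abs(pos_to_dist[pos] - radar_dist_m) <= dist_tol_m:
            return low <= size_mm <= high  # 28 mm <= size <= 38 mm
    return False

pos_to_dist = {(579, 328): 3.6, (1105, 413): 2.5}
detections = [((579, 328), 32.0), ((1105, 413), 51.0)]
print(detect(5.0, 4.2, detections, pos_to_dist, 33.0))  # False (branch: speed)
print(detect(1.6, 3.7, detections, pos_to_dist, 33.0))  # True (person)
```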
The above are only preferred embodiments of this application and are not intended to limit it. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of this application shall fall within the scope of protection of this application.

Claims (14)

  1. An object detection method, applied to a camera device, comprising:
    determining spatial position information of an object to be detected according to a radar detection signal used to detect the object, the spatial position information comprising at least: distance information of the object to be detected relative to the camera device and speed information of the object to be detected;
    in response to the speed information meeting a preset speed requirement, determining a calibrated size range corresponding to the object to be detected according to the distance information, and determining an actual size of the object to be detected in a captured image containing the object, wherein the calibrated size range represents the size range of a specified object in the image when the object to be detected is the specified object;
    determining, based on the actual size and the calibrated size range, whether the object to be detected is the specified object.
  2. The method according to claim 1, wherein determining the calibrated size range corresponding to the object to be detected according to the distance information comprises:
    inputting the distance information into a preset size prediction model to obtain a corresponding calibrated size, wherein the size prediction model represents a correspondence between distance information and calibrated sizes;
    determining the calibrated size range based on the calibrated size and a preset size error.
  3. The method according to claim 1 or 2, wherein determining whether the object to be detected is the specified object based on the actual size and the calibrated size range comprises:
    in response to the actual size being within the calibrated size range, determining that the object to be detected is the specified object;
    in response to the actual size being outside the calibrated size range, determining that the object to be detected is not the specified object.
  4. The method according to any one of claims 1-3, wherein the image is a thermal image.
  5. The method according to claim 4, wherein determining the actual size of the object to be detected in the captured image containing the object further comprises: determining current temperature information of the object to be detected according to the thermal image;
    determining whether the object to be detected is the specified object based on the actual size and the calibrated size range comprises:
    in response to the actual size being within the calibrated size range and the current temperature information meeting a preset temperature requirement, determining that the object to be detected is the specified object;
    in response to the actual size being outside the calibrated size range and/or the current temperature information not meeting the preset temperature requirement, determining that the object to be detected is not the specified object.
  6. The method according to any one of claims 1-3, wherein the image is a visible-light image.
  7. The method according to any one of claims 1-3, wherein the image comprises a thermal image and a visible-light image;
    determining the actual size of the object to be detected in the captured image containing the object comprises:
    determining a first actual size of the object to be detected from the thermal image;
    determining a second actual size of the object to be detected from the visible-light image;
    determining the actual size of the object to be detected in the captured image containing the object further comprises: determining current temperature information of the object to be detected according to the thermal image;
    detecting whether the object to be detected is the specified object based on the actual size and the calibrated size range comprises:
    in response to both the first actual size and the second actual size being within the calibrated size range and the current temperature information meeting the preset temperature requirement, determining that the object to be detected is the specified object;
    in response to the first actual size being outside the calibrated size range, and/or the second actual size being outside the calibrated size range, and/or the current temperature information not meeting the preset temperature requirement, determining that the object to be detected is not the specified object.
  8. A camera device, comprising:
    a radar module, configured to send a radar detection signal for detecting an object to be detected and to receive the reflected radar detection signal;
    an image module, configured to capture an image containing the object to be detected;
    a processor, configured to obtain spatial position information of the object to be detected, the spatial position information being determined according to the radar detection signal for detecting the object and comprising at least: distance information of the object to be detected relative to the camera device and speed information of the object to be detected; in response to the speed information meeting a preset speed requirement, determine a calibrated size range of the object to be detected according to the distance information, and obtain an actual size of the object to be detected in the image, the actual size being determined according to the captured image containing the object; and determine, based on the actual size and the calibrated size range, whether the object to be detected is a specified object;
    wherein the calibrated size range represents the size range of the specified object in the image when the object to be detected is the specified object.
  9. The device according to claim 8, wherein, when determining the calibrated size range corresponding to the object to be detected according to the distance information, the processor is configured to input the distance information into a preset size prediction model to obtain a corresponding calibrated size, and to determine the calibrated size range based on the calibrated size and a preset size error;
    wherein the size prediction model represents a correspondence between distance information and calibrated sizes.
  10. The device according to claim 8 or 9, wherein, when determining whether the object to be detected is the specified object based on the actual size and the calibrated size range, the processor is configured to: in response to the actual size being within the calibrated size range, determine that the object to be detected is the specified object; and in response to the actual size being outside the calibrated size range, determine that the object to be detected is not the specified object.
  11. The device according to any one of claims 8-10, wherein the image module is a thermal imaging module and the image is a thermal image.
  12. The device according to claim 11, wherein the processor is further configured to obtain current temperature information of the object to be detected, the current temperature information being determined according to the thermal image;
    when determining whether the object to be detected is the specified object based on the actual size and the calibrated size range, the processor is configured to: in response to the actual size being within the calibrated size range and the current temperature information meeting a preset temperature requirement, determine that the object to be detected is the specified object; and in response to the actual size being outside the calibrated size range and/or the current temperature information not meeting the preset temperature requirement, determine that the object to be detected is not the specified object.
  13. The device according to any one of claims 8-10, wherein the image module is a visible-light module and the image is a visible-light image.
  14. The device according to any one of claims 8-10, wherein the image module comprises: a thermal imaging module and a visible-light module;
    the image comprises: a thermal image and a visible-light image;
    when obtaining the actual size of the object to be detected in the image, the processor is configured to obtain a first actual size of the object to be detected and a second actual size of the object to be detected, the first actual size being determined according to the thermal image and the second actual size according to the visible-light image;
    the processor is further configured to determine current temperature information of the object to be detected according to the thermal image;
    when determining whether the object to be detected is the specified object based on the actual size and the calibrated size range, the processor is configured to: in response to both the first actual size and the second actual size being within the calibrated size range and the current temperature information meeting the preset temperature requirement, determine that the object to be detected is the specified object; and in response to the first actual size being outside the calibrated size range, and/or the second actual size being outside the calibrated size range, and/or the current temperature information not meeting the preset temperature requirement, determine that the object to be detected is not the specified object.
PCT/CN2021/099211 2020-06-15 2021-06-09 Object detection method and camera device WO2021254233A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP21825409.2A EP4166981A4 (en) 2020-06-15 2021-06-09 OBJECT RECOGNITION METHOD AND CAMERA DEVICE

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010544549.2 2020-06-15
CN202010544549.2A CN111736140B (zh) 2020-06-15 2020-06-15 一种对象检测方法和摄像设备

Publications (1)

Publication Number Publication Date
WO2021254233A1 true WO2021254233A1 (zh) 2021-12-23

Family

ID=72649190

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/099211 WO2021254233A1 (zh) 2020-06-15 2021-06-09 对象检测方法和摄像设备

Country Status (3)

Country Link
EP (1) EP4166981A4 (zh)
CN (1) CN111736140B (zh)
WO (1) WO2021254233A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111736140B (zh) * 2020-06-15 2023-07-28 杭州海康微影传感科技有限公司 一种对象检测方法和摄像设备

Citations (6)

Publication number Priority date Publication date Assignee Title
JP2007255978A (ja) * 2006-03-22 2007-10-04 Nissan Motor Co Ltd 物体検出方法および物体検出装置
JP2016009474A (ja) * 2014-06-26 2016-01-18 株式会社リコー 物体識別システム、情報処理装置、情報処理方法及びプログラム
WO2017029886A1 (ja) * 2015-08-20 2017-02-23 ソニー株式会社 情報処理装置、情報処理方法、およびプログラム
CN108845574A (zh) * 2018-06-26 2018-11-20 北京艾瑞思机器人技术有限公司 目标识别与追踪方法、装置、设备及介质
CN111045000A (zh) * 2018-10-11 2020-04-21 阿里巴巴集团控股有限公司 监测系统和方法
CN111736140A (zh) * 2020-06-15 2020-10-02 杭州海康微影传感科技有限公司 一种对象检测方法和摄像设备

Family Cites Families (14)

Publication number Priority date Publication date Assignee Title
JP4595833B2 (ja) * 2006-02-24 2010-12-08 トヨタ自動車株式会社 物体検出装置
JP2014215877A (ja) * 2013-04-26 2014-11-17 株式会社デンソー 物体検出装置
GB201509387D0 (en) * 2015-06-01 2015-07-15 Apical Ltd Method and apparatus for image processing (distance detection)
CN106385530B (zh) * 2015-07-28 2022-12-13 杭州海康微影传感科技有限公司 一种双光谱摄像机
JP6643166B2 (ja) * 2016-03-31 2020-02-12 株式会社デンソー 物体認識装置及び物体認識方法
CN107302655B (zh) * 2016-09-29 2019-11-01 维沃移动通信有限公司 一种拍摄取景的调节方法及移动终端
CN108012143B (zh) * 2017-12-04 2021-02-09 深圳市无限动力发展有限公司 双目摄像头标定方法及装置
CN108615321B (zh) * 2018-06-07 2019-10-08 湖南安隆软件有限公司 基于雷达侦测及视频图像行为分析的安防预警系统及方法
CN110609274B (zh) * 2018-06-15 2022-07-01 杭州海康威视数字技术股份有限公司 一种测距方法、装置及系统
CN110874953B (zh) * 2018-08-29 2022-09-06 杭州海康威视数字技术股份有限公司 区域报警方法、装置、电子设备及可读存储介质
US11287523B2 (en) * 2018-12-03 2022-03-29 CMMB Vision USA Inc. Method and apparatus for enhanced camera and radar sensor fusion
CN110207727A (zh) * 2019-07-03 2019-09-06 湖北中达智造科技有限公司 一种发动机气门尺寸测量的温度补偿方法
CN110458888A (zh) * 2019-07-23 2019-11-15 深圳前海达闼云端智能科技有限公司 基于图像的测距方法、装置、存储介质和电子设备
KR102092552B1 (ko) * 2019-11-11 2020-03-24 주식회사 삼주전자 지능형 전천후 영상 표출 카메라 시스템

Patent Citations (6)

Publication number Priority date Publication date Assignee Title
JP2007255978A (ja) * 2006-03-22 2007-10-04 Nissan Motor Co Ltd 物体検出方法および物体検出装置
JP2016009474A (ja) * 2014-06-26 2016-01-18 株式会社リコー 物体識別システム、情報処理装置、情報処理方法及びプログラム
WO2017029886A1 (ja) * 2015-08-20 2017-02-23 ソニー株式会社 情報処理装置、情報処理方法、およびプログラム
CN108845574A (zh) * 2018-06-26 2018-11-20 北京艾瑞思机器人技术有限公司 目标识别与追踪方法、装置、设备及介质
CN111045000A (zh) * 2018-10-11 2020-04-21 阿里巴巴集团控股有限公司 监测系统和方法
CN111736140A (zh) * 2020-06-15 2020-10-02 杭州海康微影传感科技有限公司 一种对象检测方法和摄像设备

Non-Patent Citations (1)

Title
See also references of EP4166981A4 *

Also Published As

Publication number Publication date
EP4166981A4 (en) 2023-11-29
CN111736140A (zh) 2020-10-02
CN111736140B (zh) 2023-07-28
EP4166981A1 (en) 2023-04-19

Similar Documents

Publication Publication Date Title
NL2025935B1 (en) Intelligent personnel evacuation system and method used in subway station fire
KR101758576B1 (ko) 물체 탐지를 위한 레이더 카메라 복합 검지 장치 및 방법
US8115814B2 (en) Mobile tracking system, camera and photographing method
JP2009143722A (ja) 人物追跡装置、人物追跡方法及び人物追跡プログラム
KR20160062880A (ko) 카메라 및 레이더를 이용한 교통정보 관리시스템
KR101553000B1 (ko) 비콘을 이용한 영상 보안 시스템 및 방법, 그리고 이를 위한 객체 관리 장치
CN109076191A (zh) 用于基于摄像机的监视的监视系统和方法
JP2009295140A (ja) 侵入者検知システム及びその方法
CN111474537A (zh) 一种雷达人员监测测量系统及方法
KR20120124785A (ko) 물체의 이동 경로를 추적하는 물체 추적 시스템 및 그 방법
KR101125233B1 (ko) 융합기술기반 보안방법 및 융합기술기반 보안시스템
WO2021254233A1 (zh) 对象检测方法和摄像设备
CN109544870A (zh) 用于智能监控系统的报警判断方法与智能监控系统
CN108734910A (zh) 一种异常行为监控报警系统及其方法
KR20170100892A (ko) 위치 추적 장치
US20220065976A1 (en) Mobile body detection device, mobile body detection method, and mobile body detection program
JP7258732B2 (ja) 通信種別と対象種別との対応関係に基づき端末を同定する装置、プログラム及び方法
KR20150031530A (ko) 무인 항공 감시 장치를 이용한 영상 감시 방법 및 장치
CN109286785B (zh) 一种环境信息共享系统及方法
KR101829274B1 (ko) 비콘을 이용한 영상 보안 시스템 및 방법과 이를 위한 객체 관리 장치
CN114697165B (zh) 基于无人机视觉和无线信号融合的信号源检测方法
KR20060003871A (ko) 검출시스템, 물체검출방법 및 물체검출을 위한 컴퓨터프로그램
CN108256502A (zh) 基于图像处理的安保系统
CN115035470A (zh) 一种基于混合视觉的低小慢目标识别与定位方法及系统
KR101859883B1 (ko) 네트워크 장치

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21825409

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2021825409

Country of ref document: EP

Effective date: 20230116