CN113507577A - Target object detection method, device, equipment and storage medium - Google Patents

Target object detection method, device, equipment and storage medium

Info

Publication number
CN113507577A
CN113507577A (application number CN202110768302.3A)
Authority
CN
China
Prior art keywords
suspected
target
detection
suspected object
visible light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110768302.3A
Other languages
Chinese (zh)
Inventor
唐子立
韩鹏
季俊康
贺小龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hikvision System Technology Co Ltd
Original Assignee
Hangzhou Hikvision System Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Hikvision System Technology Co Ltd filed Critical Hangzhou Hikvision System Technology Co Ltd
Priority to CN202110768302.3A
Publication of CN113507577A
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/30Transforming light or analogous information into electric information
    • H04N5/33Transforming infrared radiation
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07CTIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C1/00Registering, indicating or recording the time of events or elapsed time, e.g. time-recorders for work people
    • G07C1/20Checking timed patrols, e.g. of watchman
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/695Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects

Abstract

The application discloses a target object detection method, apparatus, device, and storage medium. The method includes: determining, in a first detection mode and according to a preset patrol route, whether a suspected object exists in a target space area; and, when the suspected object is determined to exist, confirming in a second detection mode whether the suspected object is the target object. By patrolling in a time-sharing, full-coverage manner, the method performs primary detection of the target space area based on a thermal imaging mechanism and secondary detection of any detected suspected object based on a visible-light tracking mechanism, thereby achieving accurate and reliable identification of the target object.

Description

Target object detection method, device, equipment and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a target object detection method, apparatus, device, and storage medium.
Background
With the development of video surveillance technology, the identification and monitoring of target objects in different scenes, especially moving objects against a dynamic background, are being applied in an increasing number of fields.
For example, in the fishery field, over-fishing in recent years has unbalanced the ecosystem, so there is a need to strengthen fishery supervision by modern technological means and to improve oversight of no-fishing zones and closed seasons in specific waters, achieving full supervision coverage. The basic working principle of existing fishing-behavior detection technology is, for example, as follows: for a specific water area (e.g., an inland river basin or a coastal fishing port), people or fishing vessels engaged in specific fishing activities are identified at different preset positions by means of camera devices. However, the existing detection schemes have the following drawbacks:
(1) The azimuth angles (horizontal angle, pitch angle) and the magnification of a camera device at a given preset position are fixed. To identify targets accurately, the camera is usually set to a large magnification, so the area it can cover is small and detection efficiency is low.
(2) When the camera device switches from one preset position to the next, a detection target may be missed.
(3) Existing camera devices have poor night-vision capability and cannot provide all-weather, real-time, comprehensive detection and supervision of a target space area.
Disclosure of Invention
In view of the above defects in the prior art, embodiments of the present application provide a target object detection method, apparatus, device, and storage medium. By patrolling in a time-sharing, full-coverage manner, the method performs primary detection of the target space area based on a thermal imaging mechanism and secondary detection of any detected suspected object based on a visible-light tracking mechanism, thereby achieving accurate and reliable identification of the target object.
Specifically, one embodiment of the present application provides a target object detection method, including: determining, in a first detection mode and according to a preset patrol route, whether a suspected object exists in a target space area; and, when the suspected object is determined to exist, confirming in a second detection mode whether the suspected object is the target object.
Another embodiment of the present application provides a target object detection apparatus, including: a first detection module, configured to determine, in a first detection mode and according to a preset patrol route, whether a suspected object exists in the target space area; and a second detection module, configured to, when the suspected object is determined to exist, confirm in a second detection mode whether the suspected object is the target object.
Another embodiment of the present application provides a target object detection device, including: a thermal imaging assembly; a visible-light camera assembly; and a controller including a storage unit and a processing unit, wherein the storage unit stores a computer program which, when executed by the processing unit, drives the thermal imaging assembly and the visible-light camera assembly to operate and implements the steps of the target object detection method of any one of the above embodiments.
A further embodiment of the present application provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the program carries out the steps of the target object detection method of any one of the above embodiments.
The embodiments of the present application adopt at least one technical solution that can achieve the following beneficial effects. Objects of a specific type are detected automatically through time-sharing, full-coverage patrol and an intelligent detection and analysis method based on a thermal imaging mechanism, so that all-weather, real-time, comprehensive detection and supervision of the target space area can be realized over a large detection range. Whether a suspected object exists in the target space area is then determined by filtering the objects of the specific type, which reduces misjudgments and false alarms and improves detection accuracy. Finally, the visible-light assembly automatically changes its focal length and tracks the suspected object in real time for a secondary detection, further improving detection accuracy and patrol efficiency.
Drawings
The technical solution and other advantages of the present application will become apparent from the detailed description of the embodiments of the present application with reference to the accompanying drawings.
Fig. 1 shows a schematic flowchart of a target object detection method provided in a first embodiment of the present application.
Fig. 2 shows another schematic flow chart of the target object detection method provided in the first embodiment of the present application.
Fig. 3 shows a schematic diagram of a time-sharing full-coverage inspection method provided by the first embodiment of the present application.
Fig. 4 shows another schematic flow chart of the target object detection method provided in the first embodiment of the present application.
Fig. 5 shows a block diagram of a target object detection apparatus according to a second embodiment of the present application.
Fig. 6 shows a block diagram of a target object detection apparatus according to a third embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," "third," and the like in the description and in the claims of the present application, and in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the objects so described are interchangeable under appropriate circumstances. In the description of the present application, "a plurality" means two or more unless specifically defined otherwise. Furthermore, the terms "comprising" and "having," as well as any variations thereof, are intended to cover a non-exclusive inclusion. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware circuits or integrated circuits, or in different networks and/or processor means and/or microcontroller means.
In the description of the present application, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted" and "connected" are to be construed broadly: a connection may be fixed, removable, or integral; it may be mechanical, electrical, or communicative; and it may be direct, indirect through an intermediate medium, or internal between two elements. The specific meaning of these terms in the present application can be understood by those of ordinary skill in the art as appropriate.
The present application is further described in detail below with reference to the accompanying drawings and the detailed description, so that the objects, features and advantages of the present application can be more clearly understood.
Fig. 1 is a flowchart of a target object detection method according to a first embodiment of the present application. The method may be applied to a detection device having a camera device with pan, tilt, and magnification adjustment functions (i.e., PTZ functions), in which a pan-tilt control unit of the camera device drives the tilt movement, horizontal rotation, and magnification adjustment of a built-in lens. The execution subject of the target object detection method provided in this embodiment may be the target object detection apparatus provided in the second embodiment, and that apparatus may be implemented by software, hardware, or a combination of software and hardware.
The following method embodiment is described taking such a detection device with a PTZ-capable camera device as the execution subject. The target object detection method specifically includes the following steps:
Step S10: determining, in a first detection mode and according to a preset patrol route, whether a suspected object exists in the target space area;
Step S20: when the suspected object is determined to exist, confirming in a second detection mode whether the suspected object is the target object.
Specifically, in this embodiment, when it is determined in the first detection mode, according to the preset patrol route, that a suspected object exists in the target space area, the patrol is suspended and the second detection mode is used to confirm whether the suspected object is the target object. When the second detection mode confirms that the suspected object is not the target object, the patrol resumes and the first detection mode continues to determine, according to the preset patrol route, whether a suspected object exists in the target space area.
As shown in fig. 2, step S10 further includes sub-steps S11 to S13 of the first detection mode:
In step S11, the detection device with the camera device configures patrol rules in advance, for example the patrol route or patrol area and the patrol time or interval, and draws up a patrol plan. Specifically, a patrol route for the target space area is preset, and the entire target space area is scanned and analyzed in a time-sharing, full-coverage patrol manner.
Fig. 3 shows a schematic diagram of the time-sharing, full-coverage patrol manner provided by the first embodiment of the present application. As shown in fig. 3, this patrol manner detects a plurality of adjacent sub-areas of the target space area (e.g., S1, S2, and S3 in fig. 3) along a preset route R1 over consecutive time periods, thereby achieving a full-coverage patrol scan of the target space area. For example, camera device A detects a first target sub-area S1 of the target space area in a first detection pose during a first time period (e.g., for 30 seconds), then detects a second target sub-area S2 adjacent to S1 in a second detection pose during a second time period, and then detects a third target sub-area S3 adjacent to S2 in a third detection pose during a third time period. Camera device A continues detecting the sub-areas of the target space area along the preset patrol route R1, each for a predetermined time period, until it has covered all sub-areas of the target space area. The first, second, and third detection poses have different pan-tilt-zoom (PTZ) values.
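The time-sharing, full-coverage patrol described above can be pictured as stepping the camera through a list of preset detection poses and dwelling in each one while the thermal detection runs. The following Python sketch illustrates that loop under stated assumptions: the `camera.move_to` and `camera.grab_thermal_frame` calls, the pose values, and the 30-second dwell are illustrative stand-ins, not interfaces defined by this disclosure.

```python
import time
from dataclasses import dataclass

@dataclass
class DetectionPose:
    """One preset pose along the patrol route (illustrative units: degrees / zoom factor)."""
    pan: float
    tilt: float
    zoom: float

# Hypothetical patrol route R1 covering adjacent sub-areas S1, S2, S3 of the target space area.
PATROL_ROUTE = [
    DetectionPose(pan=10.0, tilt=-5.0, zoom=1.0),   # first detection pose  -> sub-area S1
    DetectionPose(pan=35.0, tilt=-5.0, zoom=1.0),   # second detection pose -> sub-area S2
    DetectionPose(pan=60.0, tilt=-5.0, zoom=1.0),   # third detection pose  -> sub-area S3
]

DWELL_SECONDS = 30  # time spent detecting in each pose (example figure from the description)

def patrol_once(camera, detect_fn):
    """Scan every sub-area once; return a suspected object as soon as one is found, else None."""
    for pose in PATROL_ROUTE:
        camera.move_to(pose.pan, pose.tilt, pose.zoom)   # assumed PTZ control call
        deadline = time.monotonic() + DWELL_SECONDS
        while time.monotonic() < deadline:
            frame = camera.grab_thermal_frame()          # assumed thermal-capture call
            suspected = detect_fn(frame)                 # first detection mode (thermal)
            if suspected is not None:
                return suspected                         # caller suspends the patrol here
    return None
```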
In addition, configuring the patrol rules may further include obtaining an electronic map of the target space area according to the requirements of a specific scene (for example, for key supervision areas such as inland rivers and harbors) and setting the patrol route according to the electronic map. Taking the detection of fishing behavior as an example, for an inland river scene the target space area may include a river-bank sub-area and a water-surface sub-area, and the patrol route may be set as a Z-shaped route that covers both. It should be understood that other patrol routes may be set according to the target space area and the scene requirements, which the present application does not limit.
In step S12, while detecting along the preset patrol route, whether an object of a specific type appears in the target space area is detected based on an image of the target space area captured via a thermal imaging mechanism.
Illustratively, the aforementioned camera device includes a camera assembly based on a thermal imaging mechanism, in which a thermal imaging core senses thermal radiation through a thermal imaging sensor chip and outputs a grayscale image. A graphics processor (GPU) inside the camera device processes this image by invoking a thermal-imaging-based object feature detection algorithm to determine whether an object of a specific type appears in a number of preset regions of interest. It should be understood that the classification and identification of the image can be performed with existing thermal imaging feature recognition algorithms, which the present application does not limit.
Taking the detection of fishing behavior as an example, when the target space area is the river-bank sub-area, a thermal-imaging-based human body feature recognition algorithm is invoked to detect suspected fishing people on the bank; when the target space area is the water-surface sub-area, a thermal-imaging-based vessel feature recognition algorithm is invoked to detect suspected fishing vessels. These may be existing algorithms based on deep learning and background modeling: foreground target features are extracted from the grayscale image captured by thermal imaging, the features are recognized against person or vessel class labels obtained through pre-training, and it is determined whether the image contains a fishing object of a preset class (i.e., a person or a vessel).
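The thermal detection stage described above amounts to foreground extraction on the grayscale frame followed by classification of each foreground region. The sketch below, offered as an illustration rather than the algorithm actually used by the device, performs the foreground step with OpenCV background subtraction; `classify_region` is a hypothetical stand-in for the pre-trained person/vessel recognition model.

```python
import cv2
import numpy as np

# Background model for the thermal grayscale stream (assumed parameters).
bg_subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25, detectShadows=False)

MIN_AREA_PX = 50  # ignore tiny foreground blobs (assumed threshold)

def detect_specific_objects(gray_frame: np.ndarray, classify_region) -> list:
    """Return [(label, (x, y, w, h)), ...] for foreground regions classified as person/vessel.

    classify_region(crop) stands in for the pre-trained thermal feature recognition
    model; it should return 'person', 'vessel', or None.
    """
    mask = bg_subtractor.apply(gray_frame)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    detections = []
    for contour in contours:
        if cv2.contourArea(contour) < MIN_AREA_PX:
            continue                                   # too small to be a person or vessel
        x, y, w, h = cv2.boundingRect(contour)
        label = classify_region(gray_frame[y:y + h, x:x + w])
        if label in ("person", "vessel"):
            detections.append((label, (x, y, w, h)))
    return detections
```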
A thermal-imaging-based object feature detection algorithm places low requirements on the pixel count of the object to be detected in the captured image, so a thermal imaging lens with a shorter focal length can be selected. This gives the camera device a larger monitoring coverage, shortens the patrol route, improves detection efficiency, and enables full-coverage detection of the entire target space area.
It should be understood that the terms image and picture as used herein include still images and image data streams formed from multiple still images, as well as dynamic images such as video data streams in which successive frames are played at a standard frame rate (e.g., 12, 24, or 25 frames per second).
In step S13, when an object of a specific type exists in the target space area, feature data of that object is acquired, and whether a suspected object exists in the target space area is determined based on the feature data and a preset filtering strategy. Specifically, when it is determined that at least one object of a specific type exists in the target space area, its feature data is acquired; non-target objects are then excluded from the objects of that type based on the feature data and the preset filtering strategy, thereby screening out the suspected objects present in the target space area. The feature data may be acquired according to the relevant pose parameters (e.g., PTZ values) of the camera device, and multiple frames containing the object may be captured from the moment it is detected and analyzed to obtain the feature data, which may include static and/or dynamic features: static features such as the object's size, and dynamic features such as its position and trajectory.
Taking the detection of fishing behavior as an example, when it is determined that a vessel exists in the target water area, the feature data of the vessel is acquired according to the PTZ values of the camera device's pan-tilt, and non-fishing vessels are excluded based on the feature data and a preset filtering strategy. The feature data of a vessel includes at least its length, height, speed, position, loitering trajectory, and dwell time. The preset filtering strategy may be configured in advance according to the specific scene and requirements; it may, for example, filter out large vessels and fast-moving vessels, and use the vessel's operating trajectory data (e.g., dwell time) to assess comprehensively the likelihood that it is a fishing vessel. Illustratively, the preset filtering strategy may include the following policies:
and (3) filtering strategy A: filtering by ship length conditions; for example, when the fishing forbidden service is involved, the fishing ship is often miniaturized, and meanwhile, sand-collecting ships, ferry ships and cargo ships often pass through the river/lake, so that the false alarm rate of equipment can be reduced by filtering large ships (the captain is over 15-20 meters) or monitoring only small ships (the captain is between 3-10 meters).
And (3) filtering strategy B: filtering under the condition of ship speed; also, for example, in a fishing prohibition business, a fishing ship tends to run at a slow speed or stop on the water surface during operation and fishing, and the ship suspected of having fishing behavior can be screened according to the ship speed (for example, the ship speed is lower than 2 m/s).
And (3) filtering strategy C: filtering through the condition of the residence time of the region; similarly, for example, in a fishing administration forbidden fishing business, a fishing ship often has the characteristic of wandering in a certain area during operation and fishing, and the stay time in the area can be set to screen ships suspected of having fishing behaviors; for example, ships with stay periods less than a preset value can be filtered out. If the stay time exceeds a certain time (for example, 1 day), the corresponding ship may be a faulty ship, a abandoned ship, etc., or may be listed as a non-suspicious object according to a policy for filtering, and the device may also automatically mark the position coordinates of the ship, and may directly filter when the ship at the position is detected by future inspection.
Combined filtering strategy D: in order to further reduce the false alarm rate, comprehensive judgment can be carried out through a combined strategy; similarly, for example, in a fishing administration forbidden fishing business, various conditions such as stay time, ship length, ship speed and the like in an area are set, so that suspected fishing ships can be screened out more accurately.
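As referenced in strategy D above, the filtering strategies can be combined into a single check. The sketch below flags a vessel as suspected only when its length, speed, and dwell time all fall inside configurable windows; the threshold values are the example figures quoted above, and the data structures are assumptions made for illustration.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class VesselFeatures:
    length_m: float          # estimated ship length
    speed_mps: float         # estimated speed over the water
    dwell_seconds: float     # time spent loitering in the monitored sub-area
    position: tuple          # (x, y) coordinates, used for the ignore list

@dataclass
class FilterPolicy:
    # Example thresholds taken from strategies A-C in the description.
    min_length_m: float = 3.0
    max_length_m: float = 10.0      # strategy A: keep only small vessels
    max_speed_mps: float = 2.0      # strategy B: slow or stationary vessels
    min_dwell_s: float = 10 * 60    # strategy C: must loiter at least this long (assumed 10 min)
    max_dwell_s: float = 24 * 3600  # strategy C: longer than a day -> likely moored/abandoned

def is_suspected_fishing_vessel(v: VesselFeatures, p: FilterPolicy,
                                ignore_positions: Optional[set] = None) -> bool:
    """Combined strategy D: every individual filter must pass."""
    if ignore_positions and v.position in ignore_positions:
        return False                                     # previously marked abandoned/faulty vessel
    if not (p.min_length_m <= v.length_m <= p.max_length_m):
        return False                                     # strategy A: length filter
    if v.speed_mps > p.max_speed_mps:
        return False                                     # strategy B: speed filter
    if not (p.min_dwell_s <= v.dwell_seconds <= p.max_dwell_s):
        return False                                     # strategy C: dwell-time filter
    return True
```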
In this embodiment, acquiring the feature data of an object of a specific type from the relevant pose parameters (e.g., PTZ values) of the camera device may include: calculating the straight-line distance to the detected object by trigonometry from the installation height of the camera device and its spatial attitude angle; calculating the size and position of the object by similar triangles from parameters such as the number of pixels the object occupies in the image, the detector's pixel count, the pixel size, the distance between the camera device and the object, and the device's field of view; and calculating the object's moving speed and determining its trajectory from the pixel offset of its center point between different frames. It should be understood that the feature data may also be computed by other existing methods, which the present application does not limit.
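The geometric relationships just described can be written out directly. The following sketch assumes a flat water surface, a known mounting height, and a simple pinhole camera model; the function names, the single-axis size estimate, and the frame-interval parameter are illustrative simplifications rather than the exact computation performed by the device.

```python
import math

def distance_to_object(mount_height_m: float, pitch_deg: float) -> float:
    """Line-of-sight distance from the camera to an object on the water surface.

    pitch_deg is the downward tilt of the optical axis toward the object;
    with a flat surface, distance = height / sin(pitch).
    """
    return mount_height_m / math.sin(math.radians(pitch_deg))

def object_size_m(object_pixels: int, sensor_pixels: int, fov_deg: float,
                  distance_m: float) -> float:
    """Similar-triangles estimate of the object's extent along one image axis.

    The scene width covered at distance_m is 2*d*tan(fov/2); the object
    occupies object_pixels / sensor_pixels of that width.
    """
    scene_width_m = 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)
    return scene_width_m * (object_pixels / sensor_pixels)

def object_speed_mps(center_prev_px, center_curr_px, metres_per_pixel: float,
                     frame_interval_s: float) -> float:
    """Speed from the pixel displacement of the object centre between two frames."""
    dx = center_curr_px[0] - center_prev_px[0]
    dy = center_curr_px[1] - center_prev_px[1]
    displacement_m = math.hypot(dx, dy) * metres_per_pixel
    return displacement_m / frame_interval_s
```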
Specifically, as shown in fig. 4, step S20 further includes sub-steps S21 to S23 of the second detection mode:
In step S21, after the suspected object is determined to exist according to the thermal imaging mechanism and the filtering strategy, the visible-light lens is driven to perform a zoom operation so that the suspected object is magnified and displayed.
Specifically, after the suspected object is determined to exist according to the thermal imaging mechanism and the filtering strategy, the camera device automatically calculates a suitable zoom magnification and then drives the visible-light lens to perform the corresponding focal-length zoom operation so as to magnify and display the suspected object. The visible-light lens is driven according to the current attitude information of the pan-tilt (e.g., its PTZ values) and the pixel information of the suspected object in the thermally captured image of the target space area, so as to realize a preset zoom strategy. The preset zoom strategy may require that, after the zoom operation, the suspected object occupies a proportion of the captured picture within a preset threshold range (for example, about 1/4 to 1/3), which facilitates subsequent algorithmic recognition and analysis. Illustratively, in this embodiment the camera device drives the built-in thermal imaging lens and visible-light lens with the same pan-tilt; in other embodiments, separate drive devices may control the thermal imaging lens and the visible-light lens respectively and cooperate with one another.
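One way to realize the preset zoom strategy is to scale the current magnification by the ratio between the desired and current on-screen fraction of the suspected object and clamp the result to the lens limits. The 1/4 to 1/3 target range comes from the description above; the zoom limits and the linear zoom-to-size assumption are simplifications made for this sketch.

```python
def choose_zoom(current_zoom: float, object_width_px: int, frame_width_px: int,
                target_fraction: float = 0.3, min_zoom: float = 1.0,
                max_zoom: float = 30.0) -> float:
    """Pick a visible-light zoom so the suspected object fills ~target_fraction of the frame.

    Assumes the on-screen size of the object scales roughly linearly with zoom,
    which is adequate as a first approximation for a motorised zoom lens.
    """
    current_fraction = object_width_px / frame_width_px
    if current_fraction <= 0:
        return current_zoom                      # nothing to scale against
    desired_zoom = current_zoom * (target_fraction / current_fraction)
    return max(min_zoom, min(max_zoom, desired_zoom))
```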
In step S22, after the focal length magnification changing operation is performed, an image including the suspected object is captured based on a visible light tracking mechanism.
Capturing the image including the suspected object based on the visible-light tracking mechanism further includes calculating the rotation angle of the pan-tilt in real time and driving the visible-light lens to follow the movement of the suspected object based on the calculated angle.
Specifically, the pixel offset of the suspected object's center point from the image center is first calculated in the image captured by the camera device. From the ratio of this pixel offset to the total number of pixels across the image, combined with the field of view of the whole image, the angular offset of the object's center from the image center (i.e., the target deviation angle) is obtained; this is the field angle through which the suspected object in the current image must be moved to reach the image center. The tangent-plane coordinates of this field angle are then converted into the spherical coordinates of the pan-tilt by trigonometry, yielding the pan-tilt rotation angle corresponding to the field angle; rotating the pan-tilt by this angle moves the suspected object to the center of the picture.
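A minimal version of that angle calculation: the pixel offset of the object's center from the image center is converted to a target deviation angle using the lens field of view and the tangent-plane (pinhole) relation. The field-of-view values and the pan-tilt command shown in the usage comment are hypothetical.

```python
import math

def deviation_angles(center_px, frame_size_px, hfov_deg: float, vfov_deg: float):
    """Pan/tilt angles (degrees) needed to move the object centre to the image centre.

    center_px    : (x, y) pixel coordinates of the suspected object's centre
    frame_size_px: (width, height) of the image
    hfov/vfov    : horizontal and vertical field of view at the current zoom position
    """
    width, height = frame_size_px
    # Focal length in pixels from the pinhole model: width/2 = f * tan(HFOV/2)
    fx = (width / 2.0) / math.tan(math.radians(hfov_deg) / 2.0)
    fy = (height / 2.0) / math.tan(math.radians(vfov_deg) / 2.0)
    dx = center_px[0] - width / 2.0
    dy = center_px[1] - height / 2.0
    pan_deg = math.degrees(math.atan2(dx, fx))    # positive -> rotate right
    tilt_deg = math.degrees(math.atan2(dy, fy))   # positive -> rotate down (image y grows downward)
    return pan_deg, tilt_deg

# Usage sketch (hypothetical pan-tilt interface): at ~12.5 analysed frames per second,
# send the relative correction to the pan-tilt each time the object is re-detected.
#   pan, tilt = deviation_angles((860, 410), (1920, 1080), hfov_deg=60.0, vfov_deg=34.0)
#   pan_tilt.move_relative(pan, tilt)
```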
For example, the camera device may analyze images at 12.5 frames per second to identify the suspected object and calculate through what angle the pan-tilt must rotate to bring the object to the center of the picture. The result is then sent to the pan-tilt, which drives the built-in lens to perform the corresponding tilt and horizontal rotation so as to capture images of the moving suspected object, thereby tracking it.
Because the camera device controls the pan-tilt and the zoom lens in real time, the picture stays centered on the tracked object (i.e., the suspected object) throughout the tracking process, while the lens zooms in or out so that the tracked object maintains a roughly constant size in the picture.
In this embodiment, before capturing an image of the suspected object based on the visible-light tracking mechanism, the camera device may acquire current ambient brightness information and/or current time information and decide accordingly whether to switch on fill lighting. Illustratively, the camera device may provide fill light through a built-in laser illuminator.
In step S23, whether the suspected object is the target object is determined synchronously while the image including the suspected object is tracked and captured, where determining whether the suspected object is the target object includes identifying whether the suspected object performs a preset behavior or has a preset characteristic.
Taking the detection of fishing behavior as an example, when the tracked image includes a suspected fishing object (e.g., a suspected fisherman or a suspected fishing vessel), it is identified whether the suspected fisherman performs a predetermined fishing behavior or whether the suspected fishing vessel has predetermined fishing-vessel characteristics.
Specifically, after the camera device finds a suspected fishing object, the pan-tilt is controlled to center the object and the visible-light zoom linkage is performed to magnify it, following steps S21 to S22. Once a stable tracking state is reached, vessel shape recognition and/or recognition of the behavior of people on the bank or on board is carried out to judge whether a fishing vessel is present or whether behaviors such as angling, net casting, electrofishing, or fish poisoning are occurring. If a fishing vessel and/or fishing behavior is found, tracking is ended promptly, an alarm is reported to the relevant client/platform, an image and/or video report may be uploaded at the same time, and the patrol continues in the first detection mode to determine whether a suspected object exists in the target space area. If no fishing vessel or fishing behavior is found, the camera device returns to its original position after the preset tracking time ends and continues the patrol in the first detection mode to determine whether a suspected object exists in the target space area.
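The second-detection workflow just described (track for a bounded time, recognize while tracking, raise an alarm or return to patrol) can be sketched as a single control loop. The `camera`, `tracker`, `recognize_fn`, and `report_fn` interfaces and the 60-second preset tracking time are assumptions for illustration.

```python
import time

def second_detection(camera, tracker, recognize_fn, report_fn,
                     max_track_seconds: float = 60.0) -> bool:
    """Track the suspected object and decide whether it is the target object.

    camera / tracker / recognize_fn / report_fn are hypothetical interfaces;
    recognize_fn(frame) returns a label such as 'fishing_vessel' or 'casting_net',
    or None when no fishing behaviour is recognised.
    Returns True when an alarm was raised; the caller resumes the patrol either way.
    """
    deadline = time.monotonic() + max_track_seconds   # the preset tracking time
    while time.monotonic() < deadline:
        frame = camera.grab_visible_frame()           # visible-light capture (assumed call)
        tracker.follow(frame)                         # keep the object centred via the pan-tilt
        label = recognize_fn(frame)                   # recognition runs while tracking
        if label is not None:
            report_fn(label, frame)                   # report alarm with image/video evidence
            return True                               # end tracking promptly
    return False                                      # no fishing behaviour found within the window
```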
For a captured visible-light image or video, features in the image can be learned with existing deep learning algorithms and applied to a classification and recognition task. Specifically, recognizing the activities of people on the bank or on board may include: annotating the features of fishing tools such as fishing rods, fishing nets, ground cages, and electrofishing gear, or continuous actions such as angling and net casting, using annotation methods such as rectangles, rotated polygons, and segmentation; learning these features with a deep learning algorithm; and using the trained fishing-behavior model to recognize behavior types such as angling, electrofishing, and net casting. Recognizing the vessel's form may include: annotating the features of craft used in place of fishing boats, such as small wooden boats, kayaks, and simple rowboats, with the same annotation methods; learning these features with a deep learning algorithm; and applying the trained fishing-vessel model to identify the specific type of the suspected fishing vessel.
In this embodiment, when a camera device performing a patrol task finds a suspected object but misses the best shooting angle, one or more other camera devices networked with it may perform a joint tracking snapshot, improving patrol efficiency.
In this embodiment, when multiple suspected objects are identified in an image captured based on the thermal imaging mechanism, they may be tracked and snapshotted jointly by one or more other networked camera devices, or tracked and photographed sequentially in a preset order, for example in the order in which they were detected, or by automatically selecting the suspected object closest to the center of the picture.
Fig. 5 shows a block diagram of a target object detection apparatus 100 according to a second embodiment of the present application. The apparatus 100 comprises: a first detection module 10, configured to determine, in a first detection mode and according to a preset patrol route, whether a suspected object exists in the target space area; and a second detection module 20, configured to, when the suspected object is determined to exist, confirm in a second detection mode whether the suspected object is the target object.
The first detection module 10 includes a thermal imaging detection unit 11 and a filtering unit 12. The thermal imaging detection unit 11 is configured to detect whether an object of a specific type is present within the target space area based on an image of the target space area captured via a thermal imaging mechanism. The filtering unit 12 is configured to, when an object of a specific type exists in the target space area, acquire feature data of that object and determine whether the suspected object exists in the target space area based on the feature data and a preset filtering strategy.
The second detection module 20 comprises a zoom element 21, a tracking element 22, and a determination element 23, wherein: the zoom element 21 is configured to drive the visible-light lens to perform a focal-length zoom operation according to the current attitude information of the pan-tilt on which the visible-light lens is mounted and the pixel information of the suspected object in the captured image of the target space area; the tracking element 22 is configured to capture an image including the suspected object based on a visible-light tracking mechanism, and is further configured to calculate the rotation angle of the pan-tilt in real time and drive the visible-light lens to follow the movement of the suspected object based on the calculated angle; and the determination element 23 is configured to determine synchronously, while the image including the suspected object is tracked and captured, whether the suspected object is the target object, where this determination includes identifying whether the suspected object performs a preset behavior or has a preset characteristic.
Other working principles and implementation manners of the components of the target object detection apparatus 100 of this embodiment are the same as or similar to those of the target object detection method described above with reference to the first embodiment, and are not described herein again.
Fig. 6 shows a block diagram of a target object detection apparatus 200 provided in a third embodiment of the present application. The target object detection apparatus 200 includes a thermal imaging component 201, a visible-light camera component 202, and a controller 203. The controller 203 includes a storage unit 2031 and a processing unit 2032, wherein the storage unit 2031 stores a computer program which, when executed by the processing unit 2032, drives the thermal imaging component 201 and the visible-light camera component 202 to operate and implements the steps of the aforementioned target object detection method.
In another embodiment, a computer readable storage medium is provided, which stores computer instructions that, when executed by a processor, implement the steps of the aforementioned target object detection method.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the program can be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the method embodiments described above. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
An application scenario of the embodiments of the present application is the detection of fishing behavior; the technical solution provided herein enables intelligent detection of unauthorized fishing events and output of the results. Based on full-river intelligent patrol analysis acquired by the thermal imaging component (with linked snapshots of people and vessels), specific fishing objects are detected automatically and first filtered to determine whether a suspected fishing object exists. When a suspected fishing object is determined to exist, the visible-light component automatically zooms so that the pixels of the object to be detected meet the requirements of the recognition algorithm, while the suspected object is automatically locked onto and tracked for snapshots. Detection proceeds during tracking: a visible-light recognition algorithm performs secondary identification of the suspected fisherman or suspected fishing vessel, judges whether fishing behavior of a reportable type exists, and reports it promptly.
The target object detection method, apparatus, device and storage medium provided in the embodiments of the present application are described in detail above, and a specific example is applied in the present application to explain the principle and the implementation of the present application, and the description of the above embodiments is only used to help understanding the technical solution and the core idea of the present application; those of ordinary skill in the art will understand that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications or substitutions do not depart from the spirit and scope of the present disclosure as defined by the appended claims.

Claims (10)

1. A target object detection method, the method comprising:
determining, in a first detection mode and according to a preset patrol route, whether a suspected object exists in a target space area; and
when it is determined that the suspected object exists, confirming in a second detection mode whether the suspected object is the target object.
2. The target object detection method of claim 1, wherein the first detection mode comprises:
detecting whether a particular type of object is present within the target spatial region based on an image of the target spatial region captured via a thermal imaging mechanism;
when the object of the specific type exists in the target space region, acquiring feature data of the object of the specific type; and
determining whether the suspected object exists in the target space area based on the feature data and a preset filtering strategy.
3. The target object detection method of claim 1, wherein the second detection mode comprises: after the suspected object is determined to exist according to the first detection mode, driving a visible-light lens, according to current attitude information of a pan-tilt on which the visible-light lens is mounted and pixel information of the suspected object in the captured image of the target space area, to perform a focal-length zoom operation so as to magnify and display the suspected object; and
after the focal-length zoom operation is performed, capturing an image including the suspected object based on a visible-light tracking mechanism.
4. The target object detection method of claim 3, wherein the second detection mode further comprises: acquiring current ambient brightness information and/or current time information and, before the image including the suspected object is captured based on the visible-light tracking mechanism, determining from that information whether to switch on fill lighting;
wherein capturing the image including the suspected object based on the visible-light tracking mechanism comprises: calculating the rotation angle of the pan-tilt in real time and driving the visible-light lens to follow the movement of the suspected object based on the calculated angle.
5. The target object detection method of claim 3, wherein the second detection mode further comprises: synchronously determining whether the suspected object is the target object while tracking and capturing the image including the suspected object, wherein determining whether the suspected object is the target object comprises identifying whether the suspected object performs a preset behavior or has a preset characteristic.
6. The target object detection method of claim 1, further comprising:
when it is determined, in the first detection mode and according to the preset patrol route, that the suspected object exists in the target space area, suspending the patrol and confirming in the second detection mode whether the suspected object is the target object;
when it is confirmed in the second detection mode that the suspected object is not the target object, resuming the patrol and continuing to determine, in the first detection mode and according to the preset patrol route, whether a suspected object exists in the target space area;
wherein determining, in the first detection mode and according to the preset patrol route, whether the suspected object exists in the target space area comprises: performing a patrol scan of the entire target space area according to a preset time-sharing, full-coverage patrol route and determining in the first detection mode whether a suspected object exists in the target space area; and
wherein the target space area is a target water area and the target object is a fisherman or a fishing vessel.
7. A target object detection apparatus, characterized in that the apparatus comprises:
a first detection module, configured to determine, in a first detection mode and according to a preset patrol route, whether a suspected object exists in the target space area; and
a second detection module, configured to, when the suspected object is determined to exist, confirm in a second detection mode whether the suspected object is the target object.
8. The target object detection apparatus of claim 7, wherein the first detection module comprises a thermal imaging detection unit and a filtering unit, wherein:
the thermal imaging detection unit is used for detecting whether a specific type of object exists in the target space region based on the image of the target space region captured by the thermal imaging mechanism; and
the filtering unit is used for acquiring feature data of an object of a specific type when the object of the specific type exists in the target space region, and determining whether the suspected object exists in the target space region based on the feature data and a preset filtering strategy;
the second detection module comprises a zoom element, a tracking element, and a determination element, wherein:
the zoom element is configured to drive the visible-light lens to perform a focal-length zoom operation according to current attitude information of a pan-tilt on which the visible-light lens is mounted and pixel information of the suspected object in the captured image of the target space area;
the tracking element is configured to capture an image including the suspected object based on a visible-light tracking mechanism after the focal-length zoom operation is performed, wherein the tracking element is further configured to calculate the rotation angle of the pan-tilt in real time and drive the visible-light lens to follow the movement of the suspected object based on the calculated angle; and
the determination element is configured to synchronously determine whether the suspected object is the target object while tracking and capturing the image including the suspected object, wherein determining whether the suspected object is the target object comprises identifying whether the suspected object performs a preset behavior or has a preset characteristic.
9. A target object detection apparatus, characterized in that the apparatus comprises:
a thermal imaging assembly;
a visible light camera component; and
a controller comprising a storage unit and a processing unit, wherein the storage unit stores a computer program that, when executed by the processing unit, is capable of driving the thermal imaging assembly and the visible light camera assembly to operate and implement the target object detection method according to any one of claims 1 to 6.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the target object detection method according to any one of claims 1 to 6.
CN202110768302.3A 2021-07-07 2021-07-07 Target object detection method, device, equipment and storage medium Pending CN113507577A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110768302.3A CN113507577A (en) 2021-07-07 2021-07-07 Target object detection method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113507577A (en) 2021-10-15

Family

ID=78011559

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110768302.3A Pending CN113507577A (en) 2021-07-07 2021-07-07 Target object detection method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113507577A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107105207A (en) * 2017-06-09 2017-08-29 北京深瞐科技有限公司 Target monitoring method, target monitoring device and video camera
WO2021047306A1 (en) * 2019-09-10 2021-03-18 中兴通讯股份有限公司 Abnormal behavior determination method and apparatus, terminal, and readable storage medium
CN111770266A (en) * 2020-06-15 2020-10-13 北京世纪瑞尔技术股份有限公司 Intelligent visual perception system
CN112417955A (en) * 2020-10-14 2021-02-26 国电大渡河沙坪水电建设有限公司 Patrol video stream processing method and device

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115019402A (en) * 2022-08-09 2022-09-06 东莞先知大数据有限公司 Shore personnel dangerous behavior detection method, electronic equipment and storage medium
CN115019402B (en) * 2022-08-09 2022-11-25 东莞先知大数据有限公司 Shore personnel dangerous behavior detection method, electronic equipment and storage medium
CN117270580A (en) * 2023-11-21 2023-12-22 长春通视光电技术股份有限公司 Servo control method, system and equipment for tracking unmanned aerial vehicle photoelectric pod target

Similar Documents

Publication Publication Date Title
US10121078B2 (en) Method and system for detection of foreign objects in maritime environments
US7889232B2 (en) Method and system for surveillance of vessels
CN104378582B (en) A kind of intelligent video analysis system and method cruised based on Pan/Tilt/Zoom camera
US9652860B1 (en) System and method for autonomous PTZ tracking of aerial targets
US8041077B2 (en) Method of motion detection and autonomous motion tracking using dynamic sensitivity masks in a pan-tilt camera
CN109872483B (en) Intrusion alert photoelectric monitoring system and method
KR101709751B1 (en) An automatic monitoring system for dangerous situation of persons in the sea
CN109409283A (en) A kind of method, system and the storage medium of surface vessel tracking and monitoring
CN112396116B (en) Thunder and lightning detection method and device, computer equipment and readable medium
CN113507577A (en) Target object detection method, device, equipment and storage medium
US9367748B1 (en) System and method for autonomous lock-on target tracking
WO2001084844A1 (en) System for tracking and monitoring multiple moving objects
CN111163290B (en) Method for detecting and tracking night navigation ship
CN113936029A (en) Video-based illegal fishing automatic detection method and system
JP4764172B2 (en) Method for detecting moving object candidate by image processing, moving object detecting method for detecting moving object from moving object candidate, moving object detecting apparatus, and moving object detecting program
Bloisi et al. Camera based target recognition for maritime awareness
CN111242025A (en) Action real-time monitoring method based on YOLO
CN113012383A (en) Fire detection alarm method, related system, related equipment and storage medium
CN116453276A (en) Marine wind power electronic fence monitoring and early warning method and system
CN114140745A (en) Method, system, device and medium for detecting personnel attributes of construction site
CN114401354A (en) Intelligent control method and system for over-the-horizon monitoring of offshore ship
JP7125843B2 (en) Fault detection system
CN112307943B (en) Water area man-boat target detection method, system, terminal and medium
Liu et al. Ship detection and tracking in nighttime video images based on the method of LSDT
Bloisi et al. Integrated visual information for maritime surveillance

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20211015