CN111815669A - Target tracking method, target tracking device and storage device - Google Patents


Info

Publication number
CN111815669A
Authority
CN
China
Prior art keywords
target frame
target
tracking
light image
frame
Prior art date
Legal status
Granted
Application number
CN202010585510.5A
Other languages
Chinese (zh)
Other versions
CN111815669B (en)
Inventor
李明竹
Current Assignee
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd filed Critical Zhejiang Dahua Technology Co Ltd
Priority to CN202010585510.5A
Publication of CN111815669A
Application granted
Publication of CN111815669B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10004: Still image; Photographic image
    • G06T 2207/10048: Infrared image

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Aiming, Guidance, Guns With A Light Source, Armor, Camouflage, And Targets (AREA)

Abstract

The invention discloses a target tracking method, a target tracking device and a storage device, wherein the method comprises the following steps: collecting a visible light image and an infrared light image of a current scene; tracking and acquiring a first target frame on the visible light image; determining a second target frame in the infrared light image according to the first target frame; selecting a core area in the second target frame and an extension area of the second target frame; dividing pixel points of the extension area by taking the pixel value of the core area as a standard, and determining a second adjustment target frame; adjusting the first target frame based on the second adjustment target frame to obtain a first adjustment target frame; and determining a tracking target according to the first adjustment target frame. In this way, target tracking accuracy can be improved.

Description

Target tracking method, target tracking device and storage device
Technical Field
The present invention relates to the field of target tracking technologies, and in particular, to a target tracking method, a target tracking apparatus, and a storage apparatus.
Background
In existing systems, when the size of a target changes rapidly, zoom tracking by a dome camera relies on visible light alone, and the target is frequently lost because the zoom is inaccurate. To address this, tracking of rapidly scaling targets has mainly been improved at the algorithm level. However, such algorithms lose the original features of the target when it moves closer to the dome camera, and cannot respond to the zoom value quickly enough when the target moves farther away, which makes tracking difficult. Moreover, algorithm-level improvements increase the time consumption of the system.
Disclosure of Invention
In view of this, the present invention provides a target tracking method, a target tracking apparatus and a storage apparatus, which can improve the target tracking accuracy.
In order to solve the technical problems, the invention adopts a technical scheme that: there is provided a target tracking method, the method comprising: collecting a visible light image and an infrared light image of a current scene; tracking and acquiring a first target frame on the visible light image; determining a second target frame in the infrared light image according to the first target frame; selecting a core area in the second target frame and an extension area of the second target frame; dividing pixel points of the extension area by taking the pixel value of the core area as a standard, and determining a second adjustment target frame; adjusting the first target frame based on the second adjustment target frame to obtain a first adjustment target frame; and determining a tracking target according to the first adjustment target frame.
In order to solve the technical problem, the invention adopts another technical scheme that: there is provided a target tracking apparatus comprising a memory and a processor coupled to each other; the processor is used for executing the program instructions stored in the memory so as to realize the target tracking method.
In order to solve the technical problem, the invention adopts another technical scheme that: there is provided a storage device storing program instructions executable by a processor for implementing the target tracking method described above.
The invention has the beneficial effects that: different from the prior art, the invention provides a target tracking method, a target tracking apparatus and a storage apparatus. The method comprises: collecting a visible light image and an infrared light image of a current scene; tracking and acquiring a first target frame on the visible light image; determining a second target frame in the infrared light image according to the first target frame; selecting a core area in the second target frame and an extension area of the second target frame; dividing pixel points of the extension area by taking the pixel value of the core area as a standard, and determining a second adjustment target frame; adjusting the first target frame based on the second adjustment target frame to obtain a first adjustment target frame; and determining a tracking target according to the first adjustment target frame. By combining the visible light image and the infrared light image, the first target frame on the visible light image is adjusted with the second adjustment target frame obtained on the infrared light image, so that target tracking accuracy can be improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and, together with the description, serve to explain the principles of the invention. Moreover, the drawings and the description are not intended to limit the scope of the inventive concept in any way, but rather, together with the specific embodiments, to enable those skilled in the art to understand it.
FIG. 1 is a schematic flow chart diagram of a first embodiment of a target tracking method of the present invention;
FIG. 2 is a flowchart illustrating an embodiment of step S16 in FIG. 1;
FIG. 3 is a flowchart illustrating an embodiment of step S15 in FIG. 1;
FIG. 4 is a schematic flow chart diagram illustrating a second embodiment of the target tracking method of the present invention;
FIG. 5 is a block diagram of an embodiment of the target tracking device of the present invention;
FIG. 6 is a block diagram of an embodiment of a memory device according to the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
Referring to fig. 1, fig. 1 is a schematic flow chart of a target tracking method according to a first embodiment of the invention. It should be noted that the scale estimation method for target tracking described in this embodiment is not limited to the following steps:
s11: and collecting a visible light image and an infrared light image of the current scene.
In this embodiment, a visible light image and an infrared light image currently in the same scene are acquired. The scene can be in any time period of day and night, and the indoor or outdoor is not limited as long as the visible light image and the infrared light image of the current scene can be acquired. Wherein, the visible light image is collected by a common camera, and the infrared light image is collected by an infrared camera and the like. The infrared light image has the characteristic of no background interference, and can be applied to target tracking for 24 hours all day.
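The acquisition step can be realized in many ways; the following is a minimal sketch in Python with OpenCV, assuming the visible-light and infrared sensors are exposed as two separate capture devices (the device indices are placeholders for illustration, not taken from the patent).

    import cv2

    # Hypothetical device indices; the actual layout depends on the camera hardware.
    visible_cap = cv2.VideoCapture(0)   # ordinary (visible-light) sensor
    infrared_cap = cv2.VideoCapture(1)  # infrared sensor

    ok_v, visible_img = visible_cap.read()
    ok_i, infrared_img = infrared_cap.read()
    if not (ok_v and ok_i):
        raise RuntimeError("failed to grab a visible/infrared image pair")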
S12: and tracking and acquiring a first target frame on the visible light image.
In this embodiment, the position information of the first target frame on the visible light image is obtained by tracking through a tracking algorithm. The tracking algorithm may be a KCF algorithm, a siamrpn algorithm, an tld algorithm, a dat algorithm, or the like. The tracking algorithm may be selected according to the requirement, and is not particularly limited.
Specifically, the center coordinates and the width and height information of the first target frame on the visible light image can be obtained through a tracking algorithm, and if the position of the center point of the first target frame and the width and height thereof are: (x1, y1, w1, h 1).
Further, in order to judge whether the scale change of the optical image target can be accurately tracked, the tracking algorithm calculates the first confidence of the first target frame besides the center coordinate and the width and height information of the first target frame. Comparing the first confidence with a first threshold, and if the first confidence is less than or equal to the first threshold, entering a subsequent step S13; if the first confidence coefficient is larger than the first threshold value, the change of the target tracking scale of the visible light image is accurate, namely the first target frame is determined to represent the tracking target, and then the tracking is continued. By comparing the first confidence coefficient with the first threshold value, the effectiveness of tracking the optical image target can be quickly analyzed, when the first confidence coefficient cannot meet the requirement of the first threshold value, the first confidence coefficient can be quickly combined with the infrared image to complete further tracking of the first target frame, and the target tracking efficiency is improved. The first threshold value is an empirical value, and a credible threshold value is tracked for each frame of visible light image. By comparing the first confidence coefficient with the first threshold value, the effectiveness of target tracking of the visible light image can be improved, and the overall accuracy of target tracking is improved.
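As an illustration of this gating logic, the sketch below assumes a hypothetical tracker object whose update call returns both the box (x1, y1, w1, h1) and a confidence score; the patent does not prescribe a particular tracker interface or threshold value, so the names and the number used here are purely illustrative.

    FIRST_THRESHOLD = 0.6  # empirical value per the description; the number is illustrative

    def track_visible(tracker, visible_img):
        """Step S12 plus the first-confidence gate (hypothetical tracker interface)."""
        (x1, y1, w1, h1), confidence = tracker.update(visible_img)
        if confidence > FIRST_THRESHOLD:
            # The scale change on the visible image is tracked accurately:
            # keep the first target frame as the tracking target.
            return (x1, y1, w1, h1), True
        # Otherwise fall through to the infrared-assisted steps S13-S16.
        return (x1, y1, w1, h1), False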
S13: Determining a second target frame in the infrared light image according to the first target frame.
In this embodiment, the position of the second target frame of the target in the infrared light image is calculated from the position of the first target frame on the visible light image. Specifically, the second target frame information can be determined from the first target frame information through an image calibration algorithm; of course, other methods may also be used to obtain the information of the second target frame on the infrared light image, which is not limited here.
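One common realization of such a calibration is a fixed mapping between the two sensors; the sketch below assumes a pre-computed 3x3 homography H_vis2ir is available, which is an assumption made for illustration rather than the patent's prescribed method.

    import cv2
    import numpy as np

    def map_box_to_infrared(box_vis, H_vis2ir):
        """Map the first target frame (center x, center y, w, h) into the infrared image."""
        x1, y1, w1, h1 = box_vis
        corners = np.float32([
            [x1 - w1 / 2, y1 - h1 / 2],
            [x1 + w1 / 2, y1 - h1 / 2],
            [x1 + w1 / 2, y1 + h1 / 2],
            [x1 - w1 / 2, y1 + h1 / 2],
        ]).reshape(-1, 1, 2)
        mapped = cv2.perspectiveTransform(corners, H_vis2ir).reshape(-1, 2)
        xs, ys = mapped[:, 0], mapped[:, 1]
        x2, y2 = (xs.min() + xs.max()) / 2.0, (ys.min() + ys.max()) / 2.0
        w2, h2 = xs.max() - xs.min(), ys.max() - ys.min()
        return (x2, y2, w2, h2)  # second target frame on the infrared image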
S14: Selecting a core area in the second target frame and an extension area of the second target frame; dividing pixel points of the extension area by taking the pixel value of the core area as a standard, and determining a second adjustment target frame.
In this embodiment, the core area and the extension area are formed separately on the basis of the second target frame, the extension area necessarily being larger than the core area. The pixel points of the extension area are divided according to the pixel values of the core area, and a second adjustment target frame is thereby formed.
Specifically, the second target frame is reduced by a first multiple to obtain the core area and enlarged by a second multiple to determine the extension area, with the core area, the extension area and the second target frame sharing the same center point. The first (reduction) multiple is not specifically limited except that it is less than 1, and the second (enlargement) multiple is not specifically limited except that it is greater than 1.
Specifically, the extension area is binarized according to the average value of the pixel values of the core area; connected-domain analysis is then performed on the binarized extension area to obtain the minimum circumscribed rectangle of the largest connected domain, and this minimum circumscribed rectangle is taken as the second adjustment target frame. Obtaining the second adjustment target frame in this way involves no complicated mathematical operations, so it is fast, consumes little time, reduces the amount of computation and can be applied on a wider range of platforms.
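A minimal sketch of this region analysis follows, assuming a grayscale infrared image and rectangles given as integer (left, top, right, bottom) pixel coordinates; the function and variable names are illustrative, not taken from the patent.

    import cv2
    import numpy as np

    def second_adjustment_frame(ir_gray, core_rect, ext_rect):
        """Binarize the extension area by the core area's mean value and return the
        minimum circumscribed rectangle of the largest connected domain."""
        cx0, cy0, cx1, cy1 = (int(v) for v in core_rect)
        ex0, ey0, ex1, ey1 = (int(v) for v in ext_rect)
        core_mean = float(ir_gray[cy0:cy1, cx0:cx1].mean())

        ext = ir_gray[ey0:ey1, ex0:ex1]
        _, binary = cv2.threshold(ext, core_mean, 255, cv2.THRESH_BINARY)

        num, labels, stats, _ = cv2.connectedComponentsWithStats(binary.astype(np.uint8))
        if num <= 1:  # only the background component was found
            return None
        largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))  # skip background label 0
        x, y = stats[largest, cv2.CC_STAT_LEFT], stats[largest, cv2.CC_STAT_TOP]
        w, h = stats[largest, cv2.CC_STAT_WIDTH], stats[largest, cv2.CC_STAT_HEIGHT]
        # Second adjustment target frame in full-image coordinates (left, top, width, height).
        return (ex0 + x, ey0 + y, w, h)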
S15: Adjusting the first target frame based on the second adjustment target frame to obtain a first adjustment target frame.
In this embodiment, the target scale estimated from the second adjustment target frame on the infrared light image is transferred to the first target frame on the visible light image to form the first adjustment target frame.
S16: Determining a tracking target according to the first adjustment target frame.
In this embodiment, if the first adjustment target frame is determined to be the tracking target, tracking continues. In particular, to achieve continuity of target tracking, the target tracking method further includes updating the first target frame obtained by tracking on the visible light image in step S12, that is, re-acquiring a new first target frame, then resuming tracking of the first target frame and re-executing the steps from S13 onwards (S13, S14, S15 and so on).
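To illustrate this continuity, the sketch below strings the per-frame steps together; track_visible is the hypothetical helper from the S12 sketch above, and infrared_adjust is a hypothetical stand-in for steps S13 to S15 (frame mapping, region analysis and scale transfer). The init/update tracker interface is likewise assumed, not specified by the patent.

    def track_stream(tracker, frame_pairs, H_vis2ir, infrared_adjust):
        """Run the tracking pipeline over successive (visible, infrared) frame pairs."""
        results = []
        for visible_img, infrared_img in frame_pairs:
            box, confident = track_visible(tracker, visible_img)
            if not confident:
                # Steps S13-S15: refine the frame with the infrared image.
                box = infrared_adjust(box, infrared_img, H_vis2ir)
            results.append(box)             # tracking target for this frame
            tracker.init(visible_img, box)  # update the first target frame for the next frame
        return results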
Referring to fig. 2, fig. 2 is a schematic flowchart illustrating an embodiment of step S16 in fig. 1. Specifically, the step S16 may include the following steps:
s161: and calculating a second confidence degree of the first adjusting target frame.
In this embodiment, in order to improve the efficiency of tracking the target in the infrared light image, the first adjustment target frame reruns the tracking algorithm in step S12 in the first embodiment of the target tracking method, and the position and width of the central point of the first adjustment target frame are obtained again through the tracking algorithm, and the second confidence is obtained at the same time. The second threshold is an empirical value and is a threshold for which the tracking structure of each frame of target image is credible.
S162: comparing the second confidence coefficient with a second threshold, and if the second confidence coefficient is greater than the second threshold, determining that the first adjustment target frame represents the tracking target; and if the second confidence coefficient is less than or equal to the second threshold value, losing the tracking target.
In this embodiment, the efficiency of tracking the infrared light target can be further improved by comparing the second confidence with the second threshold. And when the second confidence coefficient is larger than a second threshold value, the first adjustment target frame is a tracking target, otherwise, the target tracking is lost. By comparing the second confidence coefficient with the second threshold value, the effectiveness of the infrared light image in estimating the scale change after the visible light image and the infrared light image are combined for use can be rapidly analyzed, so that the overall accuracy of target tracking is improved, and the robustness is excellent.
Referring to fig. 3, fig. 3 is a schematic flowchart illustrating an embodiment of step S15 in fig. 1. Specifically, the step S15 may include the following steps:
s151: and calculating the change scale of the second adjustment target frame compared with the target frame of the previous frame.
In this embodiment, the variation scale is the area of the second adjustment target frame divided by the area of the target frame, and the area of the circumscribed rectangle frame with the smallest area of the second adjustment target frame is assumed to be SmI.e. S2=Sm/(w*h),S2To vary the scale, w and h are the width and height of the second target box. By calculating the second adjustment target frame of the second target frame on the infrared light image, the change scale can be effectively estimated, so that the infrared light image has the characteristic of high precision and fast response.
S152: and adjusting the first target frame based on the change scale to obtain a first adjustment target frame.
In this embodiment, the variation scale S of the infrared image is obtained2Transmitting the image to a first target frame in the visible light image to obtain a first adjusted target frame position (x1, y1, w 1S)2,h1*S2) Wherein the first target frame position is assumed to be (x1, y1, w1, h 1).
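A compact sketch of steps S151 and S152 under these definitions, with all boxes given as (center x, center y, width, height); the function name is illustrative.

    def apply_change_scale(first_box, second_box, adj_box):
        """S151/S152: compute S2 = Sm / (w * h) and rescale the first target frame."""
        x1, y1, w1, h1 = first_box     # first target frame on the visible light image
        _, _, w2, h2 = second_box      # second target frame on the infrared light image
        _, _, wm, hm = adj_box         # minimum circumscribed rectangle (area Sm = wm * hm)
        s2 = (wm * hm) / float(w2 * h2)
        # The center point is kept; only the width and height are rescaled.
        return (x1, y1, w1 * s2, h1 * s2)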
By calculating the change scale of the second adjustment target frame on the infrared light image and adjusting the first target frame with this change scale, tracking accuracy can be improved with good robustness.
Referring to fig. 4, fig. 4 is a flowchart illustrating a target tracking method according to a second embodiment of the present invention. Specifically, the method comprises the following steps:
s101: and collecting a visible light image and an infrared light image of the current scene.
S102: and tracking and acquiring the center coordinate and the width and height information of the first target frame on the visible light image.
In this embodiment, the first target frame is obtained by a tracking algorithm. Assuming that the position of the center point of the first target frame and the width and height thereof are: (x1, y1, w1, h 1).
S103: a first confidence level for the first target box is calculated.
Wherein the first confidence level is calculated by a tracking algorithm.
S104: the first confidence level is compared to a first threshold magnitude.
Comparing the first confidence with a first threshold, and if the first confidence is greater than the first threshold, entering step S105; otherwise, the process proceeds to step S106.
S105: and determining that the first target frame represents a tracking target.
And if the first target frame represents a tracking target, the tracking scale change of the visible light image is accurate, and the visible light image can be continuously tracked.
S106: and determining a second target frame in the infrared light image according to the first target frame.
Wherein, the position and width and height of the center point of the first target frame on the visible light image are assumed as follows: (x1, y1, w1, h1), the first object frame is transformed to form a second object frame. Assuming that the center position and width and height of the second target frame on the infrared light image are: (x2, y2, w2, h 2).
S107: and carrying out reduction of the first multiple on the second target frame to obtain a core area, and carrying out magnification of the second multiple on the second target frame to determine an expansion area, wherein the center points of the core area, the expansion area and the second target frame are the same.
For example, if the second target frame is reduced by one-half to form a core region, the core region is located at ((max (0, (x2-w2)/4), max (0, (y2-h2)/4)), (min (width, (x2+ w2)/4), min (height, (y2+ h 2)/4))). The second target frame is expanded by two times to form an expanded region, and the expanded region is positioned at ((max (0, x2-w2), max (0, y2-h2)), (min (width, x2+ w2), min (height, y2+ h 2))). Wherein the center points of the core region, the extension region, and the second target box are the same.
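A short sketch of these formulas follows, with width and height denoting the infrared image size; the clamping to the image bounds mirrors the max/min expressions above, and the function name is illustrative.

    def core_and_extended(x2, y2, w2, h2, width, height):
        """S107: halve the second target frame for the core area, double it for the
        extension area; both share the center (x2, y2) and are clamped to the image."""
        core = (max(0, x2 - w2 / 4), max(0, y2 - h2 / 4),
                min(width, x2 + w2 / 4), min(height, y2 + h2 / 4))
        extended = (max(0, x2 - w2), max(0, y2 - h2),
                    min(width, x2 + w2), min(height, y2 + h2))
        return core, extended  # each as (left, top, right, bottom)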
S108: and carrying out binarization processing on the extended area according to the average value of the pixel values of the core area.
And after the pixel value of the core region is averaged, the extension region is subjected to binarization processing so as to form a binary image.
S109: and analyzing the connected domain of the expanded region after the binarization processing to obtain a minimum circumscribed rectangle corresponding to the maximum connected domain, and taking the minimum circumscribed rectangle as a second adjustment target frame.
Wherein, the minimum bounding rectangle obtained is assumed to be SmThe S ofmNamely the second adjustment target frame.
S110: and calculating the change scale of the second adjustment target frame compared with the target frame of the previous frame.
Wherein the change scale is assumed to be S2Then S is2=SmAnd/or (w h), wherein the previous frame target box has the width and height of the second target box.
S111: and adjusting the first target frame based on the change scale to obtain a first adjustment target frame.
Wherein S is2After transferring to the first target frame, the center point position and the width and height of the first adjusted target frame are (x1, y1, w 1S)2,h1*S2)。
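As a purely illustrative numerical example (the figures are not from the patent): if the minimum circumscribed rectangle measures 60 x 40 pixels, so Sm = 2400, and the second target frame is 40 x 40 pixels (w2 * h2 = 1600), then S2 = 2400 / 1600 = 1.5; a first target frame of (100, 120, 40, 80) would therefore become (100, 120, 60, 120).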
S112: and calculating a second confidence degree of the first adjusting target frame.
Wherein the second confidence of the first adjustment target box is recalculated.
S113: the second confidence level is compared to a second threshold magnitude.
Wherein, the second confidence is greater than the second threshold, if the second confidence is greater than the second threshold, step S115 is entered; if the second confidence is less than or equal to the second threshold, the process proceeds to step S114. The second confidence coefficient may be equal to or different from the first confidence coefficient, and may be selected according to an actual situation.
S114: the tracking target is lost.
And if the lost tracking target is the frame target lost, continuing to judge the next frame.
S115: and determining a tracking target according to the first adjustment target frame.
Wherein, the tracking target is not lost, and the first adjustment target frame is determined as the tracking target.
In the embodiment, the change scale is estimated by using the infrared light image, and meanwhile, the accuracy is estimated by comparing the second confidence coefficient with the second threshold value, so that the overall tracking accuracy rate is increased, and on the other hand, the infrared camera can be started in a night state, so that night tracking is realized. Meanwhile, in the embodiment, the change scale is estimated through the infrared image, complex mathematical operation is not involved, the speed is high, the time consumption is low, and the application range on a use platform is wider.
Referring to fig. 5, fig. 5 is a schematic diagram of a frame of an embodiment of a target tracking device according to the present invention.
In this embodiment, the target tracking device 200 includes a memory 210 and a processor 220 coupled to each other; the processor 220 is configured to execute the program instructions stored in the memory 210 to implement the target tracking method in any of the above embodiments.
Specifically, the processor 220 is configured to control itself and the memory 210 to implement the target tracking method in any of the above embodiments. Such as: collecting a visible light image and an infrared light image of a current scene; tracking and acquiring a first target frame on the visible light image; determining a second target frame in the infrared light image according to the first target frame; selecting a core area in the second target frame and an extension area of the second target frame; dividing pixel points of the extension area by taking the pixel value of the core area as a standard, and determining a second adjustment target frame; adjusting the first target frame based on the second adjustment target frame to obtain a first adjustment target frame; and determining a tracking target according to the first adjustment target frame.
Processor 220 may also be referred to as a CPU (Central Processing Unit). The processor 220 may be an integrated circuit chip having signal processing capabilities. The processor 220 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. In addition, processor 220 may be jointly implemented by multiple integrated circuit chips.
Referring to fig. 6, fig. 6 is a schematic diagram of a memory device according to an embodiment of the invention.
In this embodiment, the storage device 300 is used for storing program instructions 310, and the program instructions 310, when executed by a processor, implement the target tracking method of any of the above embodiments.
In addition, in the present invention, unless otherwise expressly specified or limited, the terms "connected," "stacked," and the like are to be construed broadly, e.g., as meaning permanently connected, detachably connected, or integrally formed; either directly or indirectly through intervening media, either internally or in any other relationship. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. A method of target tracking, the method comprising:
collecting a visible light image and an infrared light image of a current scene;
tracking and acquiring a first target frame on the visible light image;
determining a second target frame in the infrared light image according to the first target frame;
selecting a core area in the second target frame and an extension area of the second target frame; dividing pixel points of the extension area by taking the pixel value of the core area as a standard, and determining a second adjustment target frame;
adjusting the first target frame based on the second adjustment target frame to obtain a first adjustment target frame;
and determining a tracking target according to the first adjustment target frame.
2. The method of claim 1, further comprising:
calculating a first confidence of the first target frame;
comparing the first confidence with a first threshold, and if the first confidence is less than or equal to the first threshold, executing the step of determining a second target frame in the infrared light image according to the first target frame; and if the first confidence is greater than the first threshold, determining that the first target frame represents a tracking target.
3. The method of claim 1, wherein determining a tracking target according to the first adjusted target box comprises:
calculating a second confidence of the first adjustment target frame;
comparing the second confidence with a second threshold, and if the second confidence is greater than the second threshold, determining that the first adjustment target frame represents a tracking target; and if the second confidence is less than or equal to the second threshold, the tracking target is lost.
4. The method of claim 1, wherein the selecting the core region in the second target box and the extension region of the second target box comprises:
reducing the second target frame by a first multiple to obtain the core area, and enlarging the second target frame by a second multiple to determine the extension area, wherein the core area, the extension area and the second target frame share the same center point.
5. The method according to claim 1, wherein the dividing the pixel points of the extended area based on the pixel value of the core area to determine a second adjustment target frame comprises:
performing binarization processing on the extended area according to the average value of the pixel values of the core area;
and performing connected-domain analysis on the extended area after binarization processing to obtain the minimum circumscribed rectangle corresponding to the largest connected domain, and taking the minimum circumscribed rectangle as the second adjustment target frame.
6. The method of claim 1, wherein the adjusting the first target frame based on the second adjustment target frame to obtain the first adjustment target frame comprises:
calculating the change scale of the second adjustment target frame compared with the previous frame target frame;
and adjusting the first target frame based on the change scale to obtain the first adjustment target frame.
7. The method of claim 1, wherein the tracking acquires a first target frame on the visible light image, comprising:
tracking and acquiring the center coordinates and the width and height information of the first target frame on the visible light image.
8. The method of claim 1, wherein after the step of determining a tracking target according to the first adjustment target frame, the method further comprises:
updating the first target frame obtained by tracking on the visible light image, and repeatedly executing the steps following the tracking and acquiring of the first target frame on the visible light image.
9. An object tracking device comprising a memory and a processor coupled to each other;
the processor is configured to execute the program instructions stored by the memory to implement the object tracking method of any one of claims 1 to 8.
10. A storage device storing program instructions executable by a processor to perform the object tracking method of any one of claims 1 to 8.
CN202010585510.5A 2020-06-23 2020-06-23 Target tracking method, target tracking device and storage device Active CN111815669B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010585510.5A CN111815669B (en) 2020-06-23 2020-06-23 Target tracking method, target tracking device and storage device

Publications (2)

Publication Number Publication Date
CN111815669A true CN111815669A (en) 2020-10-23
CN111815669B CN111815669B (en) 2023-02-28

Family

ID=72844871

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010585510.5A Active CN111815669B (en) 2020-06-23 2020-06-23 Target tracking method, target tracking device and storage device

Country Status (1)

Country Link
CN (1) CN111815669B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106327461A (en) * 2015-06-16 2017-01-11 浙江大华技术股份有限公司 Image processing method and device used for monitoring
WO2018050128A1 (en) * 2016-09-13 2018-03-22 纳恩博(北京)科技有限公司 Target tracking method, electronic device and storage medium
US20190130579A1 (en) * 2017-10-27 2019-05-02 Samsung Electronics Co., Ltd. Method and apparatus for tracking object
WO2019196130A1 (en) * 2018-04-12 2019-10-17 广州飒特红外股份有限公司 Classifier training method and device for vehicle-mounted thermal imaging pedestrian detection

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Zhao Gaopeng et al.: "Infrared and visible light target tracking method based on multi-feature extraction", Acta Armamentarii *
Zheng Chao et al.: "Pedestrian tracking in infrared images based on an improved brightness change function", Optics and Precision Engineering *
Yan Junhua et al.: "Target tracking based on feature fusion of visible and infrared images", Journal of Chinese Inertial Technology *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114697525A (en) * 2020-12-29 2022-07-01 华为技术有限公司 Method for determining tracking target and electronic equipment

Also Published As

Publication number Publication date
CN111815669B (en) 2023-02-28

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant