CN113744299B - Camera control method and device, electronic equipment and storage medium

Info

Publication number: CN113744299B
Application number: CN202111025060.5A
Authority: CN (China)
Prior art keywords: camera, target, data, tracking target, tracking
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other versions: CN113744299A (Chinese)
Inventor: 范柘
Current Assignee: Shanghai Aware Information Technology Co ltd
Original Assignee: Shanghai Aware Information Technology Co ltd
Application filed by Shanghai Aware Information Technology Co ltd

Classifications

    • G06T7/20 Analysis of motion (G Physics; G06 Computing; G06T Image data processing or generation, in general; G06T7/00 Image analysis)
    • G06T7/70 Determining position or orientation of objects or cameras (G06T7/00 Image analysis)
    • G06T2207/10016 Video; Image sequence (G06T2207/10 Image acquisition modality)
    • G06T2207/30241 Trajectory (G06T2207/30 Subject of image; Context of image processing)

Abstract

The embodiment of the invention discloses a camera control method and apparatus, an electronic device, and a storage medium. The camera control method includes: acquiring motion data of a tracking target in the current camera view image; when it is determined that the tracking target has moved out of the image range of the current camera view image, calculating an estimated motion trajectory of the tracking target in a predicted walking area according to the motion data of the tracking target; and determining camera adjustment parameters according to the estimated motion trajectory. The technical scheme of the embodiment of the invention can reduce the camera parameter configuration cost of target tracking, improve the universality of camera control in target tracking, and further improve the user experience.

Description

Camera control method and device, electronic equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of computer vision, in particular to a camera control method and device, electronic equipment and a storage medium.
Background
Target tracking is an important branch of computer vision technology. In the field of target tracking, a camera may be the primary device for capturing and tracking a target.
At present, camera control strategies for target tracking mainly include speed control strategies and position control strategies. A speed control strategy tracks the target by adjusting the speed gear of the camera; a position control strategy continuously performs parameter configuration so that the camera adjusts its pan, tilt, and focal length according to the configured parameters, keeping the tracked target centered and enlarged in the camera's field of view.
However, the speed gears that a camera can actually use are very few, and most of its nominal range is unusable, so when the camera is controlled through a speed control strategy (for example, via an SDK (Software Development Kit)), the speed gears in the camera's nominal range cannot be fully exploited. When the speed control strategy lacks the ability to control speed precisely, overshoot is inevitable, which causes the target to shake frequently on the camera screen and results in a poor user experience. In addition, controlling the camera through a speed control strategy requires frequent sampling, with a control period on the order of milliseconds, which places great stress on network interaction. Because there are many uncertain factors in the field environment (such as the size, distance, speed, and movement direction of the tracked target), when a position control strategy is adopted, the camera position must be adjusted frequently to keep the tracked target at the center of the image at a suitable size, and this frequent adjustment degrades the user experience. Moreover, the existing position control strategy requires complicated parameter configuration to adapt to different application environments, which greatly limits the universality of camera control.
Disclosure of Invention
Embodiments of the present invention provide a camera control method and apparatus, an electronic device, and a storage medium, which can reduce a camera parameter configuration cost for target tracking, improve universality of camera control for target tracking, and further improve user experience.
In a first aspect, an embodiment of the present invention provides a camera control method, including:
acquiring motion data of a tracking target in a current camera view image;
under the condition that the tracked target is determined to move out of the image range of the current camera view image, calculating an estimated motion track of the tracked target in a predicted walking area according to motion data of the tracked target;
and determining camera adjustment parameters according to the estimated motion track.
In a second aspect, an embodiment of the present invention further provides a camera control apparatus, including:
the motion data acquisition module is used for acquiring motion data of a tracking target in a current camera view image;
the estimated motion track calculation module is used for calculating the estimated motion track of the tracking target in the predicted walking area according to the motion data of the tracking target under the condition that the tracking target is determined to move out of the image range of the current camera view image;
and the camera adjustment parameter determining module is used for determining camera adjustment parameters according to the estimated motion trail.
In a third aspect, an embodiment of the present invention further provides an electronic device, where the electronic device includes:
one or more processors;
storage means for storing one or more programs;
when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the camera control method provided by any of the embodiments of the present invention.
In a fourth aspect, an embodiment of the present invention further provides a computer storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the camera control method provided in any embodiment of the present invention.
According to the technical scheme of the embodiment of the invention, motion data of the tracking target in the current camera view image is acquired; when it is determined that the tracking target has moved out of the image range of the current camera view image, an estimated motion trajectory of the tracking target in a predicted walking area is calculated from the motion data; and camera adjustment parameters are determined from the estimated motion trajectory. When the tracked target moves out of the current camera view image, the camera cannot determine the target's position; by predicting the target's motion, the estimated motion trajectory in the predicted walking area can be obtained. After the camera is adjusted according to the camera adjustment parameters determined from this trajectory, the tracking target can be quickly captured again without frequently adjusting the camera position, and the overshoot caused by speed-gear adjustment is avoided. This solves the prior-art problems that a position control strategy requires complex parameter configuration and frequent camera control, that camera control has poor universality, and that camera parameter configuration is costly; it can therefore reduce the camera parameter configuration cost of target tracking, improve the universality of camera control in target tracking, and further improve the user experience.
Drawings
Fig. 1 is a flowchart of a camera control method according to an embodiment of the present invention;
fig. 2 is a flowchart of a camera control method according to a second embodiment of the present invention;
fig. 3 is a schematic diagram of a predictive tracking image embedded in a target walking area according to a second embodiment of the present invention;
fig. 4 is a flowchart of another camera control method according to the second embodiment of the present invention;
fig. 5 is a schematic diagram of a camera control apparatus according to a third embodiment of the present invention;
fig. 6 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention.
It should be further noted that, for the convenience of description, only some but not all of the relevant aspects of the present invention are shown in the drawings. Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the operations (or steps) as a sequential process, many of the operations can be performed in parallel, concurrently or simultaneously. In addition, the order of the operations may be re-arranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, and the like.
Example one
Fig. 1 is a flowchart of a camera control method according to an embodiment of the present invention, where the embodiment is applicable to a case where a camera can track a target without complicated parameter configuration, and the method can be executed by a camera control apparatus, which can be implemented by software and/or hardware, and can be generally integrated in an electronic device. Accordingly, as shown in fig. 1, the method comprises the following operations:
and S110, acquiring motion data of the tracking target in the current camera view image.
Here, the tracking target may be an object that the camera needs to capture and track. The current camera view image may be the image of the camera's current field of view captured by the camera. The motion data can be used to characterize the movement behavior of the tracking target.
In the embodiment of the invention, when an object needs to be captured and tracked by the camera, the current camera view image can be acquired first, and then the tracking target in the current camera view image is determined, so that the motion data of the tracking target in the current camera view image can be obtained by analyzing the tracking target in the current camera view image.
And S120, under the condition that the tracked target is determined to move out of the image range of the current camera view image, calculating the estimated motion track of the tracked target in the predicted walking area according to the motion data of the tracked target.
The predicted walking area may be a predicted range of the walking area of the tracking target when the tracking target moves out of an image range of the current camera view image. The predicted motion trajectory may be a predicted motion trajectory of the tracked object in the predicted walking area.
In the embodiment of the invention, if the tracking target moves out of the image range of the current camera view image, the motion condition of the tracking target can be estimated according to the motion data of the tracking target, so that the camera can quickly track the tracking target moving out of the image range of the current camera view image. Specifically, the predicted motion trail can be calculated according to the motion data of the tracked target, and then the predicted walking area is determined according to the predicted motion trail, so that the predicted motion trail of the tracked target in the predicted walking area is obtained.
And S130, determining camera adjusting parameters according to the estimated motion trail.
The camera adjustment parameters can be used for adjusting the orientation, the focal length, the sharpness, and the like of the camera.
In the embodiment of the invention, in order to enable the estimated motion track in the predicted walking area to be adaptive to the display range of the camera view image, the camera adjustment parameters can be calculated according to the estimated motion track and the display range of the camera view image, so that the camera can be adjusted through the camera adjustment parameters.
According to the technical scheme of this embodiment, motion data of the tracking target in the current camera view image is acquired; when it is determined that the tracking target has moved out of the image range of the current camera view image, the estimated motion trajectory of the tracking target in the predicted walking area is calculated from the motion data; and the camera adjustment parameters are determined from the estimated motion trajectory. When the tracked target moves out of the current camera view image, the camera cannot determine the target's position; by predicting the target's motion, the estimated motion trajectory in the predicted walking area can be obtained. After the camera is adjusted according to the camera adjustment parameters determined from this trajectory, the tracking target can be quickly captured again without frequently adjusting the camera position, and the overshoot caused by speed-gear adjustment is avoided. This solves the prior-art problems that a position control strategy requires complex parameter configuration to control the camera frequently, that camera control has poor universality, and that camera parameter configuration is costly; it can therefore reduce the camera parameter configuration cost of target tracking, improve the universality of camera control in target tracking, and further improve the user experience.
Example two
Fig. 2 is a flowchart of a camera control method according to a second embodiment of the present invention, which is embodied based on the above-described embodiment, and in this embodiment, a specific optional implementation scheme is provided for calculating an estimated motion trajectory of a tracked target in a predicted walking area according to motion data of the tracked target and determining camera adjustment parameters according to the estimated motion trajectory when it is determined that the tracked target moves out of an image range of a current camera view image. Accordingly, as shown in fig. 2, the method includes the following operations:
s210, acquiring motion data of the tracking target in the current camera view image.
Optionally, before acquiring the motion data of the tracking target in the current camera view image, the method may include: capturing the tracking target, determining position data of the tracking target in the camera view image, and centering and enlarging the tracking target according to the position data. After the tracking target has been centered and enlarged in the camera's field of view, the tracking target is captured again, and the re-captured tracking target is tracked to obtain the tracking target in the current camera view image.
For example, if the tracking target is in the upper-left corner of the camera view image and is too small to be easily resolved, the tracking target may be centered and enlarged. Accordingly, the PTZ (Pan/Tilt/Zoom) parameters of the camera can be adjusted through the camera adjustment parameters, specifically by panning and tilting the camera toward the target so that the tracking target is located in the middle area of the camera view image. After the orientation adjustment is completed, the focal length may be further adjusted to enlarge the tracking target in the camera view image. For example, the camera may be a PTZ camera, but the specific type of camera is not limited by the embodiments of the present invention.
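As an illustrative sketch (not part of the claimed method), the centering step can be approximated by converting the target's pixel offset from the image center into pan/tilt corrections. The linear small-angle conversion and the field-of-view values below are assumptions, not values taken from the patent:

```python
def centering_adjustment(target_bbox, image_size, hfov_deg=60.0, vfov_deg=34.0):
    """Approximate pan/tilt angles (degrees) that move the target's
    bounding-box center to the image center.

    target_bbox: (x, y, w, h) in pixels; image_size: (width, height).
    hfov_deg / vfov_deg are assumed horizontal/vertical fields of view.
    """
    x, y, w, h = target_bbox
    img_w, img_h = image_size
    # Offset of the target center from the image center, in pixels.
    dx = (x + w / 2.0) - img_w / 2.0
    dy = (y + h / 2.0) - img_h / 2.0
    # Linear small-angle approximation: pixels -> degrees.
    pan = dx * hfov_deg / img_w
    tilt = dy * vfov_deg / img_h
    return pan, tilt
```

A real PTZ camera would apply these corrections through its control interface, after which the zoom would be adjusted to enlarge the target; the sign conventions depend on the camera's coordinate system.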
S220, under the condition that the tracking target is determined to move out of the image range of the current camera view image, determining the camera motion control interval time.
The camera motion control interval time is the length of time the tracking target takes to traverse the predicted walking area, and can be preset according to user requirements. The larger the camera motion control interval time is set, the smaller the tracking target appears in the camera view image. The default value of the camera motion control interval time may be set to about 10 seconds; the embodiment of the invention does not limit its specific duration. It should be noted that when the tracking target is large, the camera automatically widens its field of view, and when the tracking target moves slowly, the camera automatically adjusts its focus. No matter how the size and the moving speed of the tracking target change, the camera motion control interval time remains unchanged, which can improve the user experience. Optionally, the camera motion control interval time may be the only parameter a user needs to set; once it is set, the camera adjustment parameters can be obtained automatically according to the camera control method.
Accordingly, before capturing the tracking target, the camera motion control interval time may be preset according to the user's needs, and when the tracking target moves out of the image range of the current camera view image, the camera motion control interval time may be obtained from the preset parameters of the camera.
In an alternative embodiment of the present invention, determining that the tracking target moves out of the image range of the current camera view image may include: determining a target walking area of a tracking target in a current camera view image; and under the condition that the tracking target is determined to move out of the target walking area, determining that the tracking target moves out of the image range of the current camera view image.
The target walking area may be a partial area in the current camera view image, and represents an action range of the tracking target in the current camera view image, that is, a display range of the tracking target in the current camera view image.
In the embodiment of the invention, the display range of the tracking target in the camera view image can be determined according to the size of the camera view image, and this display range is taken as the target walking area. After the target walking area is obtained, moving out of the target walking area can serve as the criterion for the tracking target having moved out of the image range of the current camera view image; that is, when the tracking target moves out of the target walking area, it is determined that the tracking target has moved out of the image range of the current camera view image.
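A minimal sketch of the move-out check described above, assuming the target walking area is an axis-aligned rectangle obtained by shrinking the camera view image by a margin (the margin ratio is an assumed illustration, not specified by the patent):

```python
def target_walking_area(image_size, margin_ratio=0.1):
    """Assumed: the target walking area is the view image shrunk by a
    margin on every side. Returns (x_min, y_min, x_max, y_max)."""
    img_w, img_h = image_size
    mx, my = img_w * margin_ratio, img_h * margin_ratio
    return (mx, my, img_w - mx, img_h - my)

def moved_out(target_center, area):
    """True when the tracking target has left the target walking area,
    i.e. it is treated as having moved out of the current view image."""
    x, y = target_center
    x_min, y_min, x_max, y_max = area
    return not (x_min <= x <= x_max and y_min <= y <= y_max)
```

In this sketch a target near the image border triggers the move-out condition slightly before it actually leaves the frame, giving the prediction step a head start.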
And S230, calculating the estimated motion track of the tracking target in the predicted walking area according to the motion data of the tracking target and the camera motion control interval time.
In the embodiment of the invention, the starting point and the end point of the estimated motion trajectory can be calculated according to the motion data of the tracking target and the camera motion control interval time, and the predicted walking area is then determined according to that starting point and end point.
In an optional embodiment of the present invention, calculating an estimated motion trajectory of the tracking target in the predicted walking area according to the motion data of the tracking target and the camera motion control interval time may include: determining region moving-out position data and moving-out speed vector data when the tracking target moves out of the target walking region according to the motion data of the tracking target; determining a camera position movement time; and calculating an estimated motion track according to the region moving-out position data, the moving-out speed vector data, the camera position moving time and the camera motion control interval time.
The region moving-out position data may be the position data at the moment the tracking target moves out of the target walking area. The moving-out speed vector data may be the velocity vector data at that moment, and may include, but is not limited to, the speed magnitude and the speed direction. The camera position moving time may be the time for the camera to reach the designated position according to the camera adjustment parameters, complete the focal length adjustment, and stabilize the camera view image.
In the embodiment of the invention, the motion data of the tracking target can be analyzed to obtain the region moving-out position data and the moving-out speed vector data at the moment the tracking target moves out of the target walking area. The camera position moving time is then obtained, the starting point and end point of the estimated motion trajectory are calculated from the region moving-out position data, the moving-out speed vector data, the camera position moving time, and the camera motion control interval time, and the estimated motion trajectory is determined from that starting point and end point.
In an optional embodiment of the invention, determining the camera position movement time may comprise: acquiring historical offline parameters of a camera, and determining the position moving time of the camera according to the historical offline parameters; wherein the historical offline parameters include historical camera position movement times of the camera.
The historical offline parameter may be data that is locally cached by the camera before the camera is adjusted according to the camera adjustment parameter corresponding to the current camera view image. The historical camera position movement time may be partial data in the historical offline parameters, and is used to represent a time when the camera reaches a specified position according to the camera adjustment parameters in the historical offline parameters, completes focal length adjustment, and stabilizes the camera view image before the camera is adjusted according to the camera adjustment parameters corresponding to the current camera view image.
Accordingly, after the camera is started, its historical offline parameters can be acquired automatically and analyzed to obtain a number of historical camera position moving times. These historical camera position moving times can be averaged, and the average taken as the camera position moving time. It should be noted that the maximum of the historical camera position moving times may also be used; the embodiment of the invention does not limit which data processing method is applied to the historical camera position moving times.
Alternatively, after the tracking target moves out of the current camera view image for the nth time, the average of the n-1 real-time camera position moving times recorded while tracking the same target may be used as the nth camera position moving time, where n is any integer greater than or equal to 2. The camera position moving time may also be determined in real time according to the speed of the tracking target: when the speed exceeds a certain threshold, the camera position moving time can be shortened, and when the speed falls below a certain threshold, it can be lengthened. Determining the camera position moving time in real time from the target's speed helps the camera re-acquire the tracking target quickly.
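A sketch of deriving the camera position moving time from cached historical values, using the averaging (or the conservative maximum) described above; the function name and the fallback value are illustrative assumptions:

```python
def camera_position_move_time(history, strategy="mean"):
    """Estimate the camera position moving time (seconds) from
    historical camera-position move times cached by the camera.

    strategy: "mean" averages the samples; "max" is conservative and
    uses the slowest recorded move.
    """
    if not history:
        return 1.0  # assumed fallback when no history is cached yet
    if strategy == "max":
        return max(history)
    return sum(history) / len(history)
```

Each completed camera move can append its measured settle time to the history, so the estimate improves as the camera tracks more targets.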
In an optional embodiment of the present invention, calculating the estimated motion trajectory according to the region move-out position data, the move-out velocity vector data, the camera position move time, and the camera motion control interval time may include: determining predicted initial position data of the tracking target in a predicted walking area according to the camera position moving time, the area moving-out position data and the moving-out speed vector data; determining predicted target position data of the tracking target in the predicted walking area according to the predicted initial position data, the moving-out speed vector data and the camera motion control interval time; and calculating the estimated motion trail according to the data of the predicted initial position and the data of the predicted target position.
The predicted initial position data may be position data of a start point of the predicted motion trajectory. The predicted target position data may be position data of an estimated motion trajectory end point.
In the embodiment of the invention, the product of the moving-out speed vector data and the camera position moving time may be taken as a first product value, and the sum of the first product value and the region moving-out position data taken as the predicted initial position data of the tracking target in the predicted walking area; that is, predicted initial position = region moving-out position + moving-out speed × camera position moving time. After the predicted initial position data is obtained, the product of the moving-out speed vector data and the camera motion control interval time is taken as a second product value, and the sum of the second product value and the predicted initial position data taken as the predicted target position data of the tracking target in the predicted walking area; that is, predicted target position = predicted initial position + moving-out speed × camera motion control interval time. The line segment connecting the position corresponding to the predicted initial position data and the position corresponding to the predicted target position data can then be used as the estimated motion trajectory.
Since the moving-out speed vector data is vector data, both the first product value and the second product value are vectors, and the sums above are computed component-wise: the horizontal component of the predicted initial position data is the sum of the horizontal component of the first product value and the horizontal component of the region moving-out position data, and the vertical component is computed likewise. The predicted target position data is calculated in the same way and is not repeated here.
Illustratively, take the center of gravity of the target walking area as the origin, and the horizontal and vertical perpendicular bisectors of the target walking area as the abscissa and ordinate axes. Suppose the region moving-out position data is (1, -1), the camera position moving time is 1 second, the camera motion control interval time is 5 seconds, the moving-out speed is 1 m/s, and the moving direction is horizontally to the right. The tracking target, moving horizontally to the right at 1 m/s, reaches (2, -1) within the camera position moving time, so (2, -1) is taken as the predicted initial position data of the tracking target in the predicted walking area. From there, still moving at 1 m/s, the target reaches (7, -1) within the camera motion control interval time, so (7, -1) is taken as the predicted target position data, and the line segment between (2, -1) and (7, -1) is taken as the estimated motion trajectory.
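The position prediction reduces to two component-wise vector sums. A minimal sketch reproducing the worked example above (coordinates, speed, and times taken from the text; the function name is illustrative):

```python
def predict_trajectory(exit_pos, exit_vel, move_time, interval_time):
    """Compute the start and end points of the estimated motion trajectory.

    exit_pos: (x, y) region moving-out position.
    exit_vel: (vx, vy) moving-out speed vector.
    move_time: camera position moving time (s).
    interval_time: camera motion control interval time (s).
    """
    ex, ey = exit_pos
    vx, vy = exit_vel
    # Predicted initial position: where the target will be once the
    # camera has finished moving and the view image is stable.
    p_init = (ex + vx * move_time, ey + vy * move_time)
    # Predicted target position: where the target will be after the
    # camera motion control interval time elapses.
    p_target = (p_init[0] + vx * interval_time, p_init[1] + vy * interval_time)
    return p_init, p_target
```

With the worked example's inputs, the start and end of the trajectory come out to (2, -1) and (7, -1), matching the text.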
And S240, acquiring the area association parameters of the target walking area.
Wherein the region association parameter may be data associated with a position of the target walking region in the current camera view image. For example, the region association parameters may include, but are not limited to, vertex coordinate data of the walking region of the target in the current camera view image, barycenter coordinate data of the walking region of the target, and focal length of the walking region of the target in the current camera view image.
In the embodiment of the present invention, after the target walking area of the tracking target in the current camera view image is obtained, the data associated with the position of the target walking area in the current camera view image may be determined and used as the region association parameter.
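A small sketch of how the region association parameters listed above might be assembled. The dictionary layout, function name, and the use of the vertex mean as the barycenter (which matches the centroid for rectangular regions) are assumptions for illustration only.

```python
def region_association_params(vertices, focal_length):
    """Build region association parameters for a target walking area.

    vertices     -- list of (x, y) vertex coordinates of the target walking
                    area in the current camera view image
    focal_length -- focal length associated with the region
    """
    n = len(vertices)
    # Barycenter as the arithmetic mean of the vertex coordinates
    barycenter = (sum(x for x, _ in vertices) / n,
                  sum(y for _, y in vertices) / n)
    return {"vertices": vertices,
            "barycenter": barycenter,
            "focal_length": focal_length}

params = region_association_params([(0, 0), (4, 0), (4, 2), (0, 2)], 35.0)
print(params["barycenter"])  # (2.0, 1.0)
```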
And S250, determining camera adjustment parameters according to the estimated motion track and the area association parameters of the target walking area.
In the embodiment of the invention, camera adjustment parameters that adapt the estimated motion trajectory to the display range of the camera view image can be calculated according to the estimated motion trajectory and the region association parameters of the target walking area.
In an optional embodiment of the present invention, determining the camera adjustment parameter according to the estimated motion trajectory and the area association parameter of the target walking area may include: calculating a prediction mapping initial position of the tracking target in the prediction tracking image according to the prediction initial position data of the prediction motion track, and calculating a prediction mapping target position of the tracking target in the prediction tracking image according to the prediction target position data of the prediction motion track; determining area tracking center data according to the target walking area; and calculating camera adjustment parameters according to the target walking area, the prediction mapping initial position, the prediction mapping target position and the area tracking center data.
The predicted tracking image may represent an image of the predicted motion trajectory in the predicted walking region. The prediction mapping initial position may be a position in the prediction tracking image corresponding to the prediction initial position data. The prediction mapping target position may be a position in the prediction tracking image corresponding to the prediction target position data. The region tracking center data can be gravity center coordinate data of the target walking region, and is used for representing the gravity center of the target walking region.
In the embodiment of the invention, a point may be selected arbitrarily on a two-dimensional plane as the coordinate origin, and coordinate axes established through it. For example, the center point of the page may be used as the origin, with the horizontal and vertical perpendicular bisectors of the page as the abscissa and ordinate axes, and any point in the page may then be selected as the prediction mapping initial position corresponding to the predicted initial position data in the prediction tracking image. The prediction mapping target position corresponding to the predicted target position data is then determined from the horizontal and vertical differences between the predicted initial position data and the predicted target position data. After the prediction mapping initial position and the prediction mapping target position are obtained, a region enclosing the two positions can be determined such that the center of the line connecting them coincides with the center of gravity of the enclosed region. The prediction tracking image is then formed by the connecting line, the edge of the enclosing region, the prediction mapping initial position, and the prediction mapping target position.
The region tracking center data can be calculated from the vertex position data of the target walking area, and camera adjustment parameters that embed the prediction tracking image in the target walking area can then be calculated from the target walking area, the prediction mapping initial position, the prediction mapping target position, and the region tracking center data. Embedding the prediction tracking image in the target walking area can be understood as follows: the center of gravity of the prediction tracking image coincides with the center of gravity of the target walking area indicated by the region tracking center data, and at least one side of the prediction tracking image coincides with a side of the target walking area; if the prediction tracking image cannot completely coincide with the target walking area, its non-coinciding sides lie inside the target walking area.
Illustratively, suppose the target walking area is a square with side length 1, the prediction tracking image is a square with side length 2, the prediction mapping initial position is the top-left vertex of the prediction tracking image, and the prediction mapping target position is its bottom-right vertex. To embed the prediction tracking image in the target walking area, the prediction tracking image may first be scaled down to half its size, giving a square with side length 1; when the scaled prediction tracking image is centered on the target walking area, each of its sides coincides with a side of the target walking area. The embodiment of the present invention does not limit the specific shape or size of the prediction tracking image.
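The embedding step amounts to finding the scale factor under which the prediction tracking image fits inside the target walking area with at least one side pair coinciding. A minimal sketch, assuming axis-aligned rectangular regions (the function name is illustrative):

```python
def fit_scale(region_w, region_h, image_w, image_h):
    """Scale factor that embeds the prediction tracking image in the target
    walking area: after scaling, one side pair coincides with the region
    boundary and the remaining sides lie on or inside it."""
    return min(region_w / image_w, region_h / image_h)

# Worked example from the text: square target walking area of side 1, square
# prediction tracking image of side 2 -> the image is scaled to half its size.
scale = fit_scale(1, 1, 2, 2)
print(scale)  # 0.5
```

The camera focal length adjustment parameter would then be derived from this scale factor, with the pan-tilt position chosen so the image center coincides with the region tracking center.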
Fig. 3 is a schematic diagram of a prediction tracking image embedded in a target walking area according to a second embodiment of the present invention. As shown in Fig. 3, the tracking target in the current camera view image is located at position T0 and moves out of the target walking area at position T1. The prediction mapping initial position of the tracking target in the prediction tracking image, calculated from the predicted initial position data of the estimated motion trajectory, is T2, and the prediction mapping target position, calculated from the predicted target position data, is T3. Camera adjustment parameters that embed the prediction tracking image in the target walking area are then calculated from the target walking area, the prediction mapping initial position, the prediction mapping target position, and the region tracking center data, so that the prediction tracking image is embedded in the target walking area.
In an optional embodiment of the invention, the camera adjustment parameters may comprise camera position adjustment parameters and/or camera focal length adjustment parameters.
The camera position adjustment parameters may be used to adjust the pan and tilt position of the camera, among other things. The camera focal length adjustment parameter may be used to adjust the focal length of the camera.
Fig. 4 is a flowchart of another camera control method according to a second embodiment of the present invention. As shown in Fig. 4, the camera first performs an initial capture of the tracking target, calculates the camera adjustment parameters required to center and zoom in on the tracking target within the image range of the camera view image, and adjusts the camera accordingly. After the initial adjustment, the camera captures and tracks the tracking target again and judges whether it has moved out of the safe area. If the tracking target has moved out of the safe area (the target walking area), the prediction mapping initial position of the tracking target in the prediction tracking image after the camera position movement time has elapsed is calculated, and the prediction mapping target position and the trajectory of the line connecting the two positions are then determined according to the camera motion control interval time. Camera adjustment parameters that embed the prediction tracking image in the target walking area are calculated from the center of the connecting line, the prediction mapping initial position, the prediction mapping target position, the target walking area, and the region tracking center data. If the tracking target has not moved out of the safe area, tracking continues. The device that calculates these camera adjustment parameters may be a pan-tilt free-motion device, which can calculate camera adjustment parameters matched to the target walking area through different algorithms according to the prediction mapping initial position and the prediction mapping target position.
The safe area may be, for example, the image range of the camera view image reduced by 20%.
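The safe-area test from the flow above can be sketched as follows. The 20% figure follows the text; the rectangle representation, pixel dimensions, and function names are assumptions for illustration.

```python
def safe_area(image_w, image_h, shrink=0.20):
    """Safe area obtained by shrinking the camera view image range by 20%,
    centered in the image. Returns (x_min, y_min, x_max, y_max)."""
    dx = image_w * shrink / 2
    dy = image_h * shrink / 2
    return (dx, dy, image_w - dx, image_h - dy)

def moved_out(pos, area):
    """True if the tracking target position lies outside the safe area."""
    x, y = pos
    x_min, y_min, x_max, y_max = area
    return not (x_min <= x <= x_max and y_min <= y <= y_max)

area = safe_area(1920, 1080)
print(area)                         # (192.0, 108.0, 1728.0, 972.0)
print(moved_out((100, 500), area))  # True: trigger trajectory prediction
print(moved_out((960, 540), area))  # False: continue tracking
```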
According to the technical scheme of this embodiment, the motion data of the tracking target in the current camera view image is acquired; when it is determined that the tracking target has moved out of the image range of the current camera view image, the camera motion control interval time is determined and the estimated motion trajectory of the tracking target in the predicted walking area is calculated according to the motion data of the tracking target and the camera motion control interval time; the region association parameters of the target walking area are then acquired, and the camera adjustment parameters are determined according to the estimated motion trajectory and the region association parameters. When the tracking target moves out of the current camera view image, the camera cannot determine its position. By predicting the motion trajectory of the tracking target, the estimated motion trajectory in the predicted walking area can be obtained, and once the camera is adjusted according to the camera adjustment parameters determined from that trajectory, the tracking target can be quickly captured again without frequent adjustment of the camera position. This avoids the overshoot caused by speed-gear adjustment and solves the prior-art problems that a position control strategy requires complex parameter configuration to frequently control the camera to track the target, that camera control has poor universality, and that camera parameter configuration is costly. The cost of camera parameter configuration for target tracking can thus be reduced, the universality of camera control in target tracking improved, and the user experience further improved.
It should be noted that any permutation and combination between the technical features in the above embodiments also belong to the scope of the present invention.
Example Three
Fig. 5 is a schematic diagram of a camera control apparatus according to a third embodiment of the present invention, and as shown in fig. 5, the apparatus includes: a motion data obtaining module 310, an estimated motion trajectory calculating module 320, and a camera adjustment parameter determining module 330, wherein:
a motion data acquisition module 310, configured to acquire motion data of a tracking target in a current camera view image;
the estimated motion trail calculation module 320 is used for calculating an estimated motion trail of the tracking target in the predicted walking area according to the motion data of the tracking target under the condition that the tracking target is determined to move out of the image range of the current camera view image;
and a camera adjustment parameter determining module 330, configured to determine a camera adjustment parameter according to the estimated motion trajectory.
Optionally, the predicted motion trajectory calculation module 320 is specifically configured to: determining a target walking area of the tracking target in the current camera view image; and under the condition that the tracking target is determined to move out of the target walking area, determining that the tracking target moves out of the image range of the current camera view field image.
Optionally, the predicted motion trajectory calculation module 320 is specifically configured to: determining a camera motion control interval time; and calculating the estimated motion trail of the tracking target in the predicted walking area according to the motion data of the tracking target and the camera motion control interval time.
Optionally, the predicted motion trajectory calculation module 320 is specifically configured to: determining region moving-out position data and moving-out speed vector data when the tracking target moves out of the target walking region according to the motion data of the tracking target; determining a camera position movement time; and calculating the estimated motion trail according to the region moving-out position data, the moving-out speed vector data, the camera position moving time and the camera motion control interval time.
Optionally, the predicted motion trajectory calculation module 320 is specifically configured to: acquiring historical offline parameters of a camera, and determining the position moving time of the camera according to the historical offline parameters; wherein the historical offline parameters include historical camera position movement times of the camera.
Optionally, the predicted motion trajectory calculation module 320 is specifically configured to: determining predicted initial position data of the tracking target in the predicted walking area according to the camera position moving time, the area moving-out position data and the moving-out speed vector data; determining predicted target position data of the tracking target in the predicted walking area according to the predicted initial position data, the moving-out speed vector data and the camera motion control interval time; and calculating the estimated motion trail according to the predicted initial position data and the predicted target position data.
Optionally, the camera adjustment parameter determining module 330 is specifically configured to: acquiring area association parameters of the target walking area; and determining the camera adjustment parameters according to the estimated motion track and the area association parameters of the target walking area.
Optionally, the camera adjustment parameter determining module 330 is specifically configured to: calculating the prediction mapping initial position of the tracking target in a prediction tracking image according to the prediction initial position data of the prediction motion track, and calculating the prediction mapping target position of the tracking target in the prediction tracking image according to the prediction target position data of the prediction motion track; determining region tracking center data according to the target walking region; and calculating the camera adjustment parameter according to the target walking area, the prediction mapping initial position, the prediction mapping target position and the area tracking center data.
Optionally, the camera adjustment parameter includes a camera position adjustment parameter and/or a camera focal length adjustment parameter.
According to the technical scheme of this embodiment, the motion data of the tracking target in the current camera view image is acquired; when it is determined that the tracking target has moved out of the image range of the current camera view image, the estimated motion trajectory of the tracking target in the predicted walking area is calculated according to the motion data of the tracking target, and the camera adjustment parameters are determined according to the estimated motion trajectory. When the tracking target moves out of the current camera view image, the camera cannot determine its position. By predicting the motion trajectory of the tracking target, the estimated motion trajectory in the predicted walking area can be obtained, and once the camera is adjusted according to the camera adjustment parameters determined from that trajectory, the tracking target can be quickly captured again without frequent adjustment of the camera position. This avoids the overshoot caused by speed-gear adjustment and solves the prior-art problems that a position control strategy requires complex parameter configuration to frequently control the camera to track the target, that camera control has poor universality, and that camera parameter configuration is costly. The cost of camera parameter configuration for target tracking can thus be reduced, the universality of camera control in target tracking improved, and the user experience further improved.
The camera control device can execute the camera control method provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method. For technical details that are not described in detail in this embodiment, reference may be made to a camera control method provided in any embodiment of the present invention.
Since the camera control device described above is a device capable of executing the camera control method in the embodiment of the present invention, based on the camera control method described in the embodiment of the present invention, a person skilled in the art can understand the specific implementation of the camera control device in the embodiment and various variations thereof, and therefore, how to implement the camera control method in the embodiment of the present invention by the camera control device is not described in detail herein. The device used by those skilled in the art to implement the camera control method in the embodiments of the present invention is within the scope of the present application.
Example Four
Fig. 6 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention. FIG. 6 illustrates a block diagram of an electronic device 412 that is suitable for use in implementing embodiments of the present invention. The electronic device 412 shown in fig. 6 is only an example and should not bring any limitations to the functionality and scope of use of the embodiments of the present invention. The electronic device 412 may be, for example, a computer device or the like.
As shown in fig. 6, the electronic device 412 is in the form of a general purpose computing device. The components of the electronic device 412 may include, but are not limited to: one or more processors 416, a storage device 428, and a bus 418 that couples the various system components including the storage device 428 and the processors 416.
Bus 418 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures. By way of example, such architectures can include, but are not limited to, an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an enhanced ISA bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnect (PCI) bus.
Electronic device 412 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by electronic device 412 and includes both volatile and nonvolatile media, removable and non-removable media.
Storage 428 may include computer system readable media in the form of volatile Memory, such as Random Access Memory (RAM) 430 and/or cache Memory 432. The electronic device 412 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 434 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 6, commonly referred to as a "hard drive"). Although not shown in FIG. 6, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a Compact disk-Read Only Memory (CD-ROM), a Digital Video disk (DVD-ROM), or other optical media) may be provided. In these cases, each drive may be connected to bus 418 by one or more data media interfaces. Storage 428 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
Program 436 having a set (at least one) of program modules 426 may be stored, for example, in storage 428, such program modules 426 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination may comprise an implementation of a network environment. Program modules 426 generally perform the functions and/or methodologies of embodiments of the present invention as described herein.
The electronic device 412 may also communicate with one or more external devices 414 (e.g., keyboard, pointing device, camera, display 424, etc.), with one or more devices that enable a user to interact with the electronic device 412, and/or with any devices (e.g., network card, modem, etc.) that enable the electronic device 412 to communicate with one or more other computing devices. Such communication may be through an Input/Output (I/O) interface 422. Also, the electronic device 412 may communicate with one or more networks (e.g., a Local Area Network (LAN), Wide Area Network (WAN), and/or a public Network, such as the internet) via the Network adapter 420. As shown, network adapter 420 communicates with the other modules of electronic device 412 over bus 418. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 412, including but not limited to: microcode, device drivers, Redundant processing units, external disk drive Arrays, disk array (RAID) systems, tape drives, and data backup storage systems, to name a few.
The processor 416 executes various functional applications and data processing by executing programs stored in the storage device 428, for example, implementing the camera control method provided by the above-described embodiment of the present invention: acquiring motion data of a tracking target in a current camera view image; under the condition that the tracked target is determined to move out of the image range of the current camera view image, calculating an estimated motion track of the tracked target in a predicted walking area according to motion data of the tracked target; and determining camera adjustment parameters according to the estimated motion track.
According to the technical scheme of this embodiment, the motion data of the tracking target in the current camera view image is acquired; when it is determined that the tracking target has moved out of the image range of the current camera view image, the estimated motion trajectory of the tracking target in the predicted walking area is calculated according to the motion data of the tracking target, and the camera adjustment parameters are determined according to the estimated motion trajectory. When the tracking target moves out of the current camera view image, the camera cannot determine its position. By predicting the motion trajectory of the tracking target, the estimated motion trajectory in the predicted walking area can be obtained, and once the camera is adjusted according to the camera adjustment parameters determined from that trajectory, the tracking target can be quickly captured again without frequent adjustment of the camera position. This avoids the overshoot caused by speed-gear adjustment and solves the prior-art problems that a position control strategy requires complex parameter configuration to frequently control the camera to track the target, that camera control has poor universality, and that camera parameter configuration is costly. The cost of camera parameter configuration for target tracking can thus be reduced, the universality of camera control in target tracking improved, and the user experience further improved.
Example Five
An embodiment five of the present invention further provides a computer storage medium storing a computer program, where the computer program is used to execute the camera control method according to any one of the above embodiments of the present invention when executed by a computer processor: acquiring motion data of a tracking target in a current camera view image; under the condition that the tracked target is determined to move out of the image range of the current camera view image, calculating an estimated motion track of the tracked target in a predicted walking area according to motion data of the tracked target; and determining camera adjustment parameters according to the estimated motion track.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM) or flash Memory), an optical fiber, a portable compact disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, Radio Frequency (RF), etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. Those skilled in the art will appreciate that the present invention is not limited to the particular embodiments described herein, and that various obvious changes, rearrangements and substitutions will now be apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (9)

1. A camera control method, comprising:
acquiring motion data of a tracking target in a current camera view image;
under the condition that the tracking target is determined to move out of the image range of the current camera view image, calculating the estimated motion track of the tracking target in the predicted walking area according to the motion data of the tracking target;
determining camera adjustment parameters according to the estimated motion track;
the determining that the tracking target moves out of the image range of the current camera view image comprises:
determining a target walking area of the tracking target in the current camera view image;
determining that the tracking target moves out of the image range of the current camera view image under the condition that the tracking target is determined to move out of the target walking area;
calculating the estimated motion trail of the tracking target in the predicted walking area according to the motion data of the tracking target, wherein the method comprises the following steps:
determining a camera motion control interval time;
calculating the estimated motion trail of the tracking target in a predicted walking area according to the motion data of the tracking target and the camera motion control interval time;
the calculating the estimated motion trail of the tracking target in the predicted walking area according to the motion data of the tracking target and the camera motion control interval time comprises the following steps:
determining region moving-out position data and moving-out speed vector data when the tracking target moves out of the target walking region according to the motion data of the tracking target;
determining a camera position movement time; the camera position moving time is the time when the camera reaches the designated position according to the camera adjusting parameters, the focal length adjustment is completed, and the camera view image is stable;
and calculating the estimated motion trail according to the region moving-out position data, the moving-out speed vector data, the camera position moving time and the camera motion control interval time.
2. The method of claim 1, wherein the determining a camera position movement time comprises:
acquiring historical offline parameters of the camera, and determining the camera position movement time according to the historical offline parameters;
wherein the historical offline parameters include historical camera position movement times of the camera.
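Claim 2 leaves the aggregation of historical times unspecified; a minimal sketch, assuming a simple average and the illustrative name `estimate_move_time`, could look like this:

```python
# Hypothetical illustration of claim 2: deriving the camera position movement
# time from historical offline parameters. Averaging is one plausible choice;
# the patent does not specify the aggregation.

def estimate_move_time(historical_move_times, default=1.0):
    """Return an estimated camera position movement time in seconds.

    historical_move_times -- past times the camera took to reach a new
    position, finish focal-length adjustment, and stabilize its view image.
    Falls back to `default` when no history is available.
    """
    if not historical_move_times:
        return default
    return sum(historical_move_times) / len(historical_move_times)
```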
3. The method of claim 1, wherein the calculating the estimated motion trajectory according to the region move-out position data, the move-out velocity vector data, the camera position movement time, and the camera motion control interval time comprises:
determining predicted initial position data of the tracking target in the predicted walking area according to the camera position movement time, the region move-out position data, and the move-out velocity vector data;
determining predicted target position data of the tracking target in the predicted walking area according to the predicted initial position data, the move-out velocity vector data, and the camera motion control interval time; and
calculating the estimated motion trajectory according to the predicted initial position data and the predicted target position data.
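The two positions named in claim 3 can be sketched as a pair of constant-velocity extrapolations. Function and variable names are illustrative assumptions, not taken from the patent text:

```python
# Sketch of claim 3: the predicted initial position is where the target
# should be once the camera has finished moving; the predicted target
# position is one control interval later, assuming the target keeps its
# move-out velocity.

def predict_positions(exit_pos, exit_vel, move_time, interval):
    """Return (predicted initial position, predicted target position)."""
    init = (exit_pos[0] + exit_vel[0] * move_time,
            exit_pos[1] + exit_vel[1] * move_time)
    target = (init[0] + exit_vel[0] * interval,
              init[1] + exit_vel[1] * interval)
    return init, target
```

The estimated motion trajectory of the claim would then be the segment from `init` to `target`.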
4. The method of claim 1, wherein the determining camera adjustment parameters according to the estimated motion trajectory comprises:
acquiring area association parameters of the target walking area; and
determining the camera adjustment parameters according to the estimated motion trajectory and the area association parameters of the target walking area.
5. The method of claim 4, wherein the determining the camera adjustment parameters according to the estimated motion trajectory and the area association parameters of the target walking area comprises:
calculating a predicted mapped initial position of the tracking target in a predicted tracking image according to the predicted initial position data of the estimated motion trajectory, and calculating a predicted mapped target position of the tracking target in the predicted tracking image according to the predicted target position data of the estimated motion trajectory;
determining area tracking center data according to the target walking area; and
calculating the camera adjustment parameters according to the target walking area, the predicted mapped initial position, the predicted mapped target position, and the area tracking center data.
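One way claim 5's computation could be realized is sketched below, assuming a caller-supplied world-to-image mapping and simple proportional pan/tilt gains. All names and the proportional-control choice are assumptions; the patent does not disclose a concrete formula:

```python
# Minimal sketch of claim 5: map the predicted positions into the predicted
# tracking image, then derive pan/tilt offsets that bring the predicted
# motion toward the area tracking center.

def camera_adjustment(world_to_image, init_pos, target_pos, area_center,
                      pan_gain=0.05, tilt_gain=0.05):
    """Derive pan/tilt offsets that keep the predicted motion centered.

    world_to_image -- callable mapping a walking-area position to pixel
                      coordinates in the predicted tracking image
    area_center    -- area tracking center data, in pixels
    """
    mapped_init = world_to_image(init_pos)      # predicted mapped initial position
    mapped_target = world_to_image(target_pos)  # predicted mapped target position
    # Midpoint of the predicted motion segment in image coordinates
    mid_x = (mapped_init[0] + mapped_target[0]) / 2
    mid_y = (mapped_init[1] + mapped_target[1]) / 2
    # Proportional pan/tilt toward the area tracking center
    pan = (mid_x - area_center[0]) * pan_gain
    tilt = (mid_y - area_center[1]) * tilt_gain
    return pan, tilt
```

In a real PTZ controller the gains would come from camera calibration, and a zoom (focal length) adjustment would typically be derived from the target's apparent size as well.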
6. The method according to any one of claims 1 to 5, wherein the camera adjustment parameters comprise camera position adjustment parameters and/or camera focal length adjustment parameters.
7. A camera control apparatus, comprising:
a motion data acquisition module, configured to acquire motion data of a tracking target in a current camera view image;
an estimated motion trajectory calculation module, configured to calculate an estimated motion trajectory of the tracking target in a predicted walking area according to the motion data of the tracking target in a case where it is determined that the tracking target moves out of the image range of the current camera view image; and
a camera adjustment parameter determination module, configured to determine camera adjustment parameters according to the estimated motion trajectory;
wherein the estimated motion trajectory calculation module is specifically configured to: determine a target walking area of the tracking target in the current camera view image; and determine that the tracking target moves out of the image range of the current camera view image in a case where the tracking target is determined to move out of the target walking area;
the estimated motion trajectory calculation module is specifically configured to: determine a camera motion control interval time; and calculate the estimated motion trajectory of the tracking target in the predicted walking area according to the motion data of the tracking target and the camera motion control interval time;
the estimated motion trajectory calculation module is specifically configured to: determine, according to the motion data of the tracking target, region move-out position data and move-out velocity vector data at the time the tracking target moves out of the target walking area; determine a camera position movement time; and calculate the estimated motion trajectory according to the region move-out position data, the move-out velocity vector data, the camera position movement time, and the camera motion control interval time; wherein the camera position movement time is the time required for the camera to reach a designated position according to the camera adjustment parameters, complete focal length adjustment, and stabilize the camera view image.
8. An electronic device, characterized in that the electronic device comprises:
one or more processors;
storage means for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the camera control method of any one of claims 1-6.
9. A computer storage medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the camera control method according to any one of claims 1 to 6.
CN202111025060.5A 2021-09-02 2021-09-02 Camera control method and device, electronic equipment and storage medium Active CN113744299B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111025060.5A CN113744299B (en) 2021-09-02 2021-09-02 Camera control method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113744299A CN113744299A (en) 2021-12-03
CN113744299B true CN113744299B (en) 2022-07-12

Family

ID=78734857

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111025060.5A Active CN113744299B (en) 2021-09-02 2021-09-02 Camera control method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113744299B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114608555A (en) * 2022-02-28 2022-06-10 珠海云洲智能科技股份有限公司 Target positioning method, system and storage medium
CN115514858A (en) * 2022-10-09 2022-12-23 江苏超正科技有限公司 Target identification and motion detection method and system based on deep neural network image

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102263933A (en) * 2010-05-25 2011-11-30 杭州华三通信技术有限公司 Intelligent monitoring method and device
CN105353772A (en) * 2015-11-16 2016-02-24 中国航天时代电子公司 Visual servo control method for unmanned aerial vehicle maneuvering target locating and tracking
CN105407283A (en) * 2015-11-20 2016-03-16 成都因纳伟盛科技股份有限公司 Multi-target active recognition tracking and monitoring method
CN105825524A (en) * 2016-03-10 2016-08-03 浙江生辉照明有限公司 Target tracking method and apparatus
CN107409175A (en) * 2015-03-26 2017-11-28 富士胶片株式会社 Follow-up control apparatus, tracking and controlling method, tracing control program and automatic follow shot system
CN107507243A (en) * 2016-06-14 2017-12-22 华为技术有限公司 A kind of camera parameters method of adjustment, instructor in broadcasting's video camera and system
CN108875683A (en) * 2018-06-30 2018-11-23 北京宙心科技有限公司 Robot vision tracking method and system
CN109743541A (en) * 2018-12-15 2019-05-10 深圳壹账通智能科技有限公司 Intelligent control method, device, computer equipment and storage medium
CN110086988A (en) * 2019-04-24 2019-08-02 薄涛 Shooting angle method of adjustment, device, equipment and its storage medium
CN111586303A (en) * 2020-05-22 2020-08-25 浩鲸云计算科技股份有限公司 Control method and device for dynamically tracking road surface target by camera based on wireless positioning technology
CN111986224A (en) * 2020-08-05 2020-11-24 七海行(深圳)科技有限公司 Target behavior prediction tracking method and device
WO2021017283A1 (en) * 2019-07-30 2021-02-04 平安科技(深圳)有限公司 Offline method-based online tracking method and apparatus, computer device, and storage medium
CN112616019A (en) * 2020-12-16 2021-04-06 重庆紫光华山智安科技有限公司 Target tracking method and device, holder and storage medium
CN112859854A (en) * 2021-01-08 2021-05-28 姜勇 Camera system and method of camera robot capable of automatically following camera shooting

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4413957B2 (en) * 2007-08-24 2010-02-10 株式会社東芝 Moving object detection device and autonomous moving object
CN103268480B (en) * 2013-05-30 2016-07-06 重庆大学 A kind of Visual Tracking System and method
CN104751486B (en) * 2015-03-20 2017-07-11 安徽大学 A kind of moving target relay tracking algorithm of many ptz cameras
JP6206857B1 (en) * 2016-08-24 2017-10-04 パナソニックIpマネジメント株式会社 Tracking support device, tracking support system, and tracking support method
CN107392941A (en) * 2017-07-25 2017-11-24 哈尔滨理工大学 A kind of takeoff and landing tracking system and method
CN111291585B (en) * 2018-12-06 2023-12-08 杭州海康威视数字技术股份有限公司 GPS-based target tracking system, method and device and ball machine
US11277556B2 (en) * 2019-04-01 2022-03-15 Jvckenwood Corporation Control device for automatic tracking camera
CN111145213A (en) * 2019-12-10 2020-05-12 中国银联股份有限公司 Target tracking method, device and system and computer readable storage medium
CN111179308B (en) * 2019-12-17 2022-10-11 清华大学 Visual servo-based fruit fly tracking method and system
CN111815679B (en) * 2020-07-27 2022-07-26 西北工业大学 Binocular camera-based trajectory prediction method during loss of spatial target feature points
CN112037257B (en) * 2020-08-20 2023-09-29 浙江大华技术股份有限公司 Target tracking method, terminal and computer readable storage medium thereof


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
PTZ camera target tracking in large complex scenes;Faliang Chang 等;《2010 8th World Congress on Intelligent Control and Automation》;20100823;第2914-2918页 *
Target tracking with PTZ camera control; Pei Renwang; China Master's Theses Full-text Database; 2013-08-15 (No. 8); I138-540 *

Similar Documents

Publication Publication Date Title
JP4575829B2 (en) Display screen position analysis device and display screen position analysis program
CN113744299B (en) Camera control method and device, electronic equipment and storage medium
JP7048764B6 (en) Panorama video target tracking method and panoramic camera
AU2016352215B2 (en) Method and device for tracking location of human face, and electronic equipment
EP3605386A1 (en) Method and apparatus for obtaining vehicle loss assessment image, server and terminal device
US6226388B1 (en) Method and apparatus for object tracking for automatic controls in video devices
US7783076B2 (en) Moving-object tracking control apparatus, moving-object tracking system, moving-object tracking control method, and program
US11102413B2 (en) Camera area locking
US11887318B2 (en) Object tracking
JP6574645B2 (en) Control device for controlling imaging apparatus, control method for imaging apparatus, and program
JP2007208453A (en) Automatic tracking apparatus and method
CN111432115B (en) Face tracking method based on voice auxiliary positioning, terminal and storage device
JP3440916B2 (en) Automatic tracking device, automatic tracking method, and recording medium recording automatic tracking program
JP6551226B2 (en) INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING METHOD, AND PROGRAM
US20200267309A1 (en) Focusing method and device, and readable storage medium
CN113910224B (en) Robot following method and device and electronic equipment
CN111091584B (en) Target tracking method, device, equipment and storage medium
US9465433B2 (en) Information processing device, method and program
EP3432575A1 (en) Method for performing multi-camera automatic patrol control with aid of statistics data in a surveillance system, and associated apparatus
CN103503435B (en) Image aspects error correction device and method
CN106445133B (en) Display adjustment method and system for tracking face movement
CN107993247B (en) Tracking and positioning method, system, medium and computing device
CN113163112B (en) Fusion focus control method and system
KR101576426B1 (en) Apparatus and Method for surveillance using fish eyes lens
US20120300058A1 (en) Control computer and method for regulating mechanical arm using the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant