CN113687627B - Target tracking method based on camera robot - Google Patents
- Publication number
- CN113687627B (application CN202110949804.6A)
- Authority
- CN
- China
- Prior art keywords
- target
- camera
- camera robot
- robot
- mechanical arm
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Program-control systems
- G05B19/02—Program-control systems electric
- G05B19/18—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of program data in numerical form
- G05B19/19—Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of program data in numerical form characterised by positioning or contouring control systems, e.g. to control position from one programmed point to another or to control movement along a programmed continuous path
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/36—Nc in input of data, input key till input tape
- G05B2219/36424—Balance mechanically arm to be moved
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Landscapes
- Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Manufacturing & Machinery (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Manipulator (AREA)
- Studio Devices (AREA)
Abstract
The embodiment of the invention discloses a target tracking method based on a camera robot. The method allows a film or television creator to specify only the key-frame positions of the camera robot's mechanical arm around the tracked target; a target tracking algorithm then continuously adjusts the arm and the camera in real time along the arm's motion trajectory, so that the camera lens stays aimed at the target object and the lens focal distance matches the distance to the target at every instant.
Description
Technical Field
The invention relates to the technical field of camera robots, in particular to a target tracking method based on a camera robot.
Background
With recent advances in technology and the development of film and video production, more and more productions use camera robots for shooting. Target tracking with a camera robot is a convenient method for continuously tracking and shooting a static target object.
Target motion changes the appearance of both the target and the scene; non-rigid target structure, occlusion between the target and the scene, camera motion, and similar factors make the tracking task particularly difficult. Target tracking is typically applied where the position and shape of the target must be known in every frame, and assumptions are commonly used to constrain the tracking problem to a particular application environment.
With target tracking, a creator can build the static target scene in an animation modeling module before shooting, plan the camera's running trajectory with key frames, and use the tracking algorithm to correct the roll, yaw, and pitch angles along the camera's optical-center trajectory, so that the image center stays on the tracked target while the camera moves. Meanwhile, a lens controller can run a lens trajectory to achieve automatic focus pulling. With target tracking, a creator can precisely control the camera robot from a computer to plan the desired camera motion, and the trajectory can be repeated in space continuously and accurately. Accurate repeated shooting greatly improves shooting efficiency and the quality of special-effect shots.
Currently, target tracking in camera motion-control systems generally relies on a tape measure to determine the distance between the camera robot and the target object. Measurement error from the tape greatly degrades the tracking result; professional technicians must work on site, and measuring the distance from the camera's optical center to the target is troublesome for high camera positions. Moreover, redesigning and adjusting the trajectory is time-consuming and expensive in equipment rental, which greatly increases shooting cost.
Therefore, to solve the above technical problems, a simple and easily adjustable target tracking method using a camera robot is needed.
Disclosure of Invention
In view of this, an object of the embodiments of the present invention is to provide a simple and easily adjustable target tracking method based on a camera robot.
In order to achieve the above object, an embodiment of the present invention provides the following technical solution: a target tracking method based on a camera robot, comprising the following steps:
Step S1: establish a three-dimensional space model of the camera robot in a preset scene, design a mechanical-arm trajectory of the camera robot according to the target position, select several of the trajectory's video frames as basic calculation frames, and obtain the pose information of each simulated target from the basic calculation frames;
Step S2: compute a weighted average of the pose information of the simulated targets, take the weighted-average point as the pose of the target's theoretical position, and adjust the parameters of the mechanical arm according to that position so that the camera robot's optical center faces the target;
Step S3: obtain the end-effector trajectory curve of the mechanical arm from the pose of the target's theoretical position, and perform reachability analysis and acceleration analysis on the arm's inverse kinematics to obtain the arm's safe motion range and its joint trajectory;
Step S4: from the joint trajectory, calculate the straight-line distance from each trajectory point to the target in three-dimensional space, and set the focal distance of the camera accordingly;
Step S5: synchronously combine the arm's joint trajectory with the target's preset trajectory, and control the mechanical arm and the camera in real time according to the combined result, so that the target is tracked in real time.
As a further improvement of the present invention, a three-dimensional space model of the target is created based on the right-hand coordinate system in step S1.
As a further improvement of the invention, after the pose information of each simulated target is obtained from the basic calculation frames, the Cartesian trajectory of the camera robot's mechanical arm is planned.
As a further improvement of the present invention, in step S3 the Cartesian trajectory of the mechanical arm is converted into a joint trajectory through reachability analysis and acceleration analysis.
As a further improvement of the invention, the calculation based on the basic calculation frames uses a ray-focusing method to compute the pose information of each simulated target.
As a further improvement of the present invention, the parameters of the robot arm of the camera robot in the step S2 include an attitude roll angle, a pitch angle, and a yaw angle.
The invention has the following advantages:
The camera-robot-based target tracking method provided by the embodiment of the invention lets a film or television creator specify only the key-frame positions of the mechanical arm around the tracked target; a tracking algorithm continuously adjusts the arm and the camera in real time along the arm's motion trajectory, so that the camera lens stays aimed at the target and the lens focal distance matches the distance to the target at every instant. Since only the arm's motion needs attention, the control of target tracking is simplified and easy to adjust.
Drawings
To illustrate the embodiments of the present invention or the prior-art solutions more clearly, the drawings used in the description are briefly introduced below. Obviously, the drawings described below cover only some embodiments of the invention; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of a target tracking method based on a camera robot according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of another expression in the embodiment shown in FIG. 1.
Detailed Description
To make the technical solution of the present invention better understood by those skilled in the art, the technical solution in the embodiments is described below clearly and completely with reference to the drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
As shown in fig. 1 and fig. 2, the flow of a target tracking method based on a camera robot is illustrated. In this embodiment, the method comprises five steps, detailed as follows.
Step S1: establish a three-dimensional space model of the camera robot in a preset scene, design a mechanical-arm trajectory of the camera robot according to the target position, select several of the trajectory's video frames as basic calculation frames, and obtain the pose information of each simulated target from the basic calculation frames. The three-dimensional space model of this embodiment is built on a right-hand coordinate system. After the pose information of each simulated target is obtained, the Cartesian trajectory of the camera robot's mechanical arm is planned.
The pose information of each simulated target is computed from the basic calculation frames with a ray-focusing method. In the preset scene, rays are cast from the camera robot's optical center for several frames (also called key frames); the offset of each camera from the optical center is adjusted, each simulated target point is computed and displayed in the right-hand coordinate system, and the simulated target points are adjusted so that they coincide at the same point in that coordinate system as closely as possible.
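The ray-focusing idea — one optical-center ray per key frame, all of which should pass through the target — can be sketched as a least-squares intersection of 3D rays. This is an illustrative reconstruction, not the patent's actual implementation; the function name and the least-squares formulation are assumptions.

```python
import numpy as np

def nearest_point_to_rays(origins, directions):
    """Least-squares point closest to a set of 3D rays o_i + t * d_i.

    Minimising the sum of squared point-to-ray distances gives the
    linear system  (sum_i (I - d_i d_i^T)) p = sum_i (I - d_i d_i^T) o_i.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = np.asarray(d, float)
        d = d / np.linalg.norm(d)          # unit ray direction
        M = np.eye(3) - np.outer(d, d)     # projector onto plane perpendicular to d
        A += M
        b += M @ np.asarray(o, float)
    return np.linalg.solve(A, b)
```

With two key-frame rays that cross exactly, the result is their intersection; with noisy rays it is the point they "focus" around, which plays the role of the simulated target point.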
Step S2: compute a weighted average of the pose information of the simulated targets, take the weighted-average point as the pose of the target's theoretical position, and adjust the parameters of the camera robot's mechanical arm according to that position so that the optical center faces the target. In this embodiment, the arm parameters include the attitude roll angle, pitch angle, and yaw angle.
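The weighted average of step S2 collapses the per-key-frame simulated target points into one theoretical target position. A minimal sketch, assuming the position part of the pose is averaged componentwise (the function name and weighting scheme are assumptions; the patent does not state how weights are chosen):

```python
import numpy as np

def weighted_target_position(points, weights):
    """Weighted average of simulated target points -> theoretical target position."""
    points = np.asarray(points, float)       # shape (k, 3): one point per key frame
    w = np.asarray(weights, float)
    w = w / w.sum()                          # normalise so the result is a true average
    return w @ points
```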
Each point of the planned arm trajectory carries not only the end-effector position but also its attitude, i.e. (x, y, z, roll, pitch, yaw), which determines the camera robot's shooting direction at each instant of the run. Through the target tracking algorithm, the Cartesian position (x, y, z) of each trajectory point is kept unchanged while the attitude roll angle (roll), pitch angle (pitch), and yaw angle (yaw) are adjusted, so that the optical center at each trajectory point faces the target object.
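Keeping (x, y, z) fixed and re-aiming the attitude amounts to a "look-at" computation. A hedged sketch, assuming the optical axis lies along +x with z up and positive pitch tilting upward (the patent does not specify an axis convention, and roll is left for the trajectory designer):

```python
import math

def aim_angles(cam_pos, target_pos):
    """Yaw and pitch (radians) pointing the assumed +x optical axis at the target."""
    dx = target_pos[0] - cam_pos[0]
    dy = target_pos[1] - cam_pos[1]
    dz = target_pos[2] - cam_pos[2]
    yaw = math.atan2(dy, dx)                    # rotation about z toward the target
    pitch = math.atan2(dz, math.hypot(dx, dy))  # tilt above/below the horizontal
    return yaw, pitch
```

Applied to every (x, y, z) on the Cartesian trajectory, this yields the corrected attitude per trajectory point while the positions stay untouched.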
Step S3: obtain the end-effector trajectory curve of the mechanical arm from the pose of the target's theoretical position, and perform reachability analysis and acceleration analysis on the inverse kinematics of the arm to obtain the arm's safe motion range and joint trajectory. Only a motion trajectory that passes both the reachability analysis and the acceleration analysis can be run on a real mechanical arm. The Cartesian trajectory is converted into a joint trajectory through reachability analysis and acceleration verification. If the inverse kinematics yields no analytic solution during the conversion, step S1 is repeated and the spatial positions of the key frames are adjusted.
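The acceleration verification on an already-computed joint trajectory can be sketched as a limit check with finite differences. This is an assumed simplification: the patent's full analysis runs the arm's inverse kinematics, while this sketch only validates joint-space samples against joint limits and an acceleration bound.

```python
import numpy as np

def trajectory_feasible(q, dt, q_min, q_max, acc_max):
    """q: (N, n_joints) joint samples at fixed period dt.

    Returns False if any sample leaves the joint limits or any
    finite-difference acceleration exceeds acc_max.
    """
    q = np.asarray(q, float)
    if np.any(q < q_min) or np.any(q > q_max):
        return False                          # outside the safe motion range
    acc = np.diff(q, n=2, axis=0) / dt**2     # central second difference
    return bool(np.all(np.abs(acc) <= acc_max))
```

A trajectory that fails this check would, as the description says, send the designer back to step S1 to move the key frames.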
Step S4: based on the joint trajectory of the mechanical arm, calculate the straight-line distance in three-dimensional space from each trajectory point to the target, and set the focal distance of the camera accordingly. When setting the focal distance, the focus and aperture trajectories are planned from a calibration file specific to each camera robot, or to each camera robot's camera.
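The per-frame focus distance of step S4 is just the Euclidean distance from each trajectory point's optical center to the theoretical target. A minimal sketch (mapping these distances through a lens calibration file to motor positions is omitted, as the calibration format is not given in the source):

```python
import numpy as np

def focus_distances(optical_centers, target):
    """Straight-line distance from each trajectory point to the target.

    optical_centers: (N, 3) positions along the joint trajectory (via forward
    kinematics); target: (3,) theoretical target position.
    """
    oc = np.asarray(optical_centers, float)
    return np.linalg.norm(oc - np.asarray(target, float), axis=1)
```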
Step S5: synchronously combine the joint trajectory of the camera robot's mechanical arm with the preset trajectory of the target, and control the mechanical arm and the camera in real time according to the combined result, thereby tracking the target in real time.
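One way to read the synchronous combination of step S5 is as resampling the joint trajectory and the lens (focus) trajectory onto a common control timeline so both can be streamed together. This is an assumed interpretation; the function names and linear interpolation are illustrative choices.

```python
import numpy as np

def synchronize(t_joint, q, t_lens, focus, t_out):
    """Resample joint angles and focus values onto one control timeline.

    t_joint: (N,) timestamps for q (N, n_joints); t_lens: (M,) timestamps
    for focus (M,); t_out: output control ticks. Linear interpolation per joint.
    """
    q = np.asarray(q, float)
    q_sync = np.stack(
        [np.interp(t_out, t_joint, q[:, j]) for j in range(q.shape[1])],
        axis=1,
    )
    f_sync = np.interp(t_out, t_lens, focus)
    return q_sync, f_sync
```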
The target tracking method based on the camera robot provided by the embodiment of the invention lets a film or television creator specify only the key-frame positions of the mechanical arm around the tracked target; the tracking algorithm continuously adjusts the arm and the camera in real time along the arm's motion trajectory, so that the camera lens stays aimed at the target object and the lens focal distance matches the distance to the target at every instant.
Further, since the target tracking method of the embodiment requires attention only to the movement of the mechanical arm, the control of target tracking is simplified and easily adjusted. It will be apparent to those skilled in the art that the invention is not limited to the details of the exemplary embodiments above, and that the invention can be embodied in other specific forms without departing from its spirit or essential characteristics. The embodiments are therefore to be considered in all respects illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description; all changes that come within the meaning and range of equivalency of the claims are intended to be embraced therein. Any reference sign in a claim shall not be construed as limiting the claim concerned.
Furthermore, although this description refers to embodiments, not every embodiment contains only a single technical solution; the description is written this way only for clarity. Those skilled in the art should take the description as a whole, and the embodiments may be combined as appropriate to form further embodiments understood by those skilled in the art.
Claims (6)
1. A target tracking method based on a camera robot is characterized by comprising the following steps:
step S1, establishing a three-dimensional space model of the camera robot in a preset scene, designing a mechanical arm track of the camera robot according to the target position, selecting a plurality of frames in video frames of the mechanical arm track as basic calculation frames, and acquiring pose information of each simulation target based on the basic calculation frames;
step S2: computing a weighted average of the pose information of each simulated target, taking the weighted-average point as the pose information of the target's theoretical position, and adjusting the parameters of the camera robot's mechanical arm according to that theoretical position so that the optical center of the camera robot faces the target;
step S3: acquiring an end-effector trajectory curve of the camera robot's mechanical arm based on the pose information of the target's theoretical position, and performing reachability analysis and acceleration analysis on the inverse kinematics of the mechanical arm to acquire the arm's safe motion range and joint trajectory;
step S4: calculating the linear distance from each position point in the joint track to a target in a three-dimensional space based on the joint track of the mechanical arm of the camera robot, and setting the focal length of a camera of the camera robot;
step S5: and synchronously synthesizing the joint track of the mechanical arm of the camera robot and the preset track of the target, and controlling the mechanical arm of the camera robot and a camera of the camera robot in real time according to the synchronous synthesis result, so that the target is tracked in real time.
2. The camera-robot-based target tracking method according to claim 1, wherein in step S1, a three-dimensional space model of the target is created based on a right-hand coordinate system.
3. The target tracking method based on the camera robot as claimed in claim 2, wherein after pose information of each simulation target is obtained based on the basic calculation frame, a cartesian trajectory of a mechanical arm of the camera robot is planned.
4. The camera-robot-based target tracking method according to claim 3, wherein in step S3 the Cartesian trajectory of the mechanical arm is converted into a joint trajectory through reachability analysis and acceleration analysis.
5. The camera robot-based target tracking method according to claim 1, wherein the calculation method based on the basic calculation frame is to calculate pose information of each simulation target by using a ray focusing method.
6. The target tracking method based on the camera robot as claimed in claim 1, wherein the parameters of the robot arm of the camera robot in the step S2 include an attitude roll angle, a pitch angle and a yaw angle.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202110949804.6A | 2021-08-18 | 2021-08-18 | Target tracking method based on camera robot |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN113687627A (en) | 2021-11-23 |
| CN113687627B (en) | 2022-08-19 |
Family
ID=78580477
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202110949804.6A (Active) | Target tracking method based on camera robot | 2021-08-18 | 2021-08-18 |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN113687627B (en) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112116663A (en) * | 2020-08-20 | 2020-12-22 | 太仓中科信息技术研究院 | Offline programming method and system for camera robot and electronic equipment |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5521843A (en) * | 1992-01-30 | 1996-05-28 | Fujitsu Limited | System for and method of recognizing and tracking target mark |
| CN103400409A (en) * | 2013-08-27 | 2013-11-20 | 华中师范大学 | 3D (three-dimensional) visualization method for coverage range based on quick estimation of attitude of camera |
| CN104647390A (en) * | 2015-02-11 | 2015-05-27 | 清华大学 | Multi-camera combined initiative object tracking method for teleoperation of mechanical arm |
| CN108601626A (en) * | 2015-12-30 | 2018-09-28 | 皇家飞利浦有限公司 | Image-Based Robot Guidance |
| CN111131813A (en) * | 2019-10-29 | 2020-05-08 | 牧今科技 | Method and system for determining pose of camera calibration |
| CN112541946A (en) * | 2020-12-08 | 2021-03-23 | 深圳龙岗智能视听研究院 | Real-time pose detection method of mechanical arm based on perspective multi-point projection |
| CN112859854A (en) * | 2021-01-08 | 2021-05-28 | 姜勇 | Camera system and method of camera robot capable of automatically following camera shooting |
| CN113084827A (en) * | 2021-04-01 | 2021-07-09 | 北京飞影科技有限公司 | Method and device for calibrating optical center position of camera device |
Family Cites Families (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10112303B2 (en) * | 2013-10-25 | 2018-10-30 | Aleksandar Vakanski | Image-based trajectory robot programming planning approach |
| WO2019049331A1 (en) * | 2017-09-08 | 2019-03-14 | 株式会社ソニー・インタラクティブエンタテインメント | Calibration device, calibration system, and calibration method |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US7106361B2 (en) | System and method for manipulating the point of interest in a sequence of images | |
| US7027083B2 (en) | System and method for servoing on a moving fixation point within a dynamic scene | |
| US9369694B2 (en) | Adjusting stereo images | |
| CN106780601B (en) | Spatial position tracking method and device and intelligent equipment | |
| US5457370A (en) | Motion control system for cinematography | |
| US20140232818A1 (en) | Method and device for spherical resampling for video generation | |
| KR102686182B1 (en) | Method and data processing system for image synthesis | |
| JP7185860B2 (en) | Calibration method for a multi-axis movable vision system | |
| US20110249095A1 (en) | Image composition apparatus and method thereof | |
| KR20130075712A (en) | Laser vision sensor and its correction method | |
| JP7082713B2 (en) | Rolling Shutter Correction for images / videos using convolutional neural networks in applications for image / video SFM / SLAM | |
| JP2021124395A (en) | Pan tilt angle calculation device and program therefor | |
| CN103729839A (en) | Outdoor camera tracing method and system based on sensors | |
| CN113687627B (en) | Target tracking method based on camera robot | |
| CN117527993A (en) | Device and method for performing virtual shooting in controllable space | |
| JP2013101525A (en) | Image processing device, method, and program | |
| JP2010183384A (en) | Photographic camera learning apparatus and program therefor | |
| CN120780031A (en) | Cloud deck tracking method, equipment and storage medium based on binocular camera | |
| CN109919976A (en) | Scene automation multiplexing method, equipment and storage medium based on cameras people | |
| CN118870209A (en) | Optical motion capture method and system for movable camera | |
| CN119342348B (en) | A XR virtual studio camera tracking and positioning method and system | |
| CN119693408B (en) | Real-time athlete running track recording method based on multi-view spliced video | |
| JP2020147105A (en) | Camera control device and its program, and multi-view robot camera system | |
| Lee et al. | Infinite Video Generation with Cinematic Camera Trajectory Control | |
| CN115277996A (en) | Real-time film production method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |