CN109120849B - Intelligent shooting method for micro-class tracking - Google Patents

Intelligent shooting method for micro-class tracking

Info

Publication number
CN109120849B
CN109120849B (application CN201811090606.3A)
Authority
CN
China
Prior art keywords
point
positioning
shooting
ssx
calculating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811090606.3A
Other languages
Chinese (zh)
Other versions
CN109120849A (en
Inventor
朱玉荣
彭泽波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anhui Wenxiang Information Technology Co Ltd
Original Assignee
Anhui Wenxiang Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anhui Wenxiang Information Technology Co Ltd filed Critical Anhui Wenxiang Information Technology Co Ltd
Priority to CN201811090606.3A priority Critical patent/CN109120849B/en
Publication of CN109120849A publication Critical patent/CN109120849A/en
Application granted granted Critical
Publication of CN109120849B publication Critical patent/CN109120849B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • G06T3/06

Abstract

The invention relates to an intelligent shooting method for micro-class tracking, which solves the technical problems of off-center framing and poor usability. The method comprises the following steps: step 1, defining the correspondence between the edges of the area to be analyzed and detected and the shooting positioning, and defining a two-dimensional spatial coordinate system as the Z-system coordinates; step 2, using the conversion from three-dimensional space to the two-dimensional coordinate system, converting the four corner points of the Z-system coordinates into their corresponding magnification values, and calculating the Z coordinate of the detection point as the zoom instruction of the shooting camera according to the quadrilateral similarity principle; step 3, using multi-point positioning of the plane of the analysis and detection area, detecting the moving target point by image analysis, calculating the mapping between the detection point and the actual absolute coordinates (x, y), and calculating the positioning instruction of the shooting camera; and step 4, the shooting camera operates according to the positioning and zoom instructions of steps 2 and 3 so that the shot is centered. This technical scheme solves the stated problems well, and the method can be used in micro classes.

Description

Intelligent shooting method for micro-class tracking
Technical Field
The invention relates, in particular, to an intelligent shooting method for micro-class tracking.
Background
A micro lesson is a teaching recording that uses video as its main carrier to capture a teacher's whole teaching process around one or several knowledge points. The quality of the recorded video is very important, and since the main subject of a micro lesson is the teacher, the key task is tracking and shooting the teacher. At present, the podium is divided into several regions during micro-class shooting; each region is assigned a fixed shooting point of the camera, and the shooting point is switched when the teacher enters a given region. This approach is limited because the shooting points are fixed: when teachers of different heights and builds lecture, the person is not centered in the picture, which affects the teaching.
In order to effectively reduce wasted footage and to present well the core content of the whole micro class, namely the exemplary segment of the classroom teaching video, the invention provides an intelligent shooting method for micro-class tracking.
Disclosure of Invention
The invention aims to solve the technical problems of off-center framing and poor usability in the prior art. The intelligent shooting method for micro-class tracking centers the shot conveniently and accurately, and keeps the target centered in the plane at all times without depending on feature changes of the target detection point.
In order to solve the technical problems, the technical scheme is as follows:
An intelligent shooting method for micro-class tracking, comprising:
step 1, defining the correspondence between the edges of the area to be analyzed and detected and the shooting positioning, and defining a two-dimensional spatial coordinate system as the Z-system coordinates;
step 2, using the conversion from three-dimensional space to the two-dimensional coordinate system, converting the four corner points of the Z-system coordinates into their corresponding magnification values, and calculating the Z coordinate of the detection point as the zoom instruction of the shooting camera according to the quadrilateral similarity principle;
step 3, using multi-point positioning of the plane of the analysis and detection area, detecting the moving target point by image analysis, calculating the mapping between the detection point and the actual absolute coordinates (x, y), and calculating the positioning instruction of the shooting camera;
and step 4, the shooting camera operates according to the positioning instruction and zoom instruction of steps 2 and 3, so that the shot is centered.
The working principle of the invention is as follows. The invention first defines a correspondence between the edges of the analysis and detection area and the shooting positioning. Using the conversion from three-dimensional space to the two-dimensional coordinate system, and the fact that the magnification values of the four Z-system corner points change approximately linearly as a point moves toward the origin, the four corner magnification values are interpolated according to the quadrilateral similarity principle to obtain the Z coordinate of the detection point. Based on the multi-point positioning of the detection-area plane, a valid moving target point is detected by image analysis; as long as the detection point lies within the planar area, the mapping relation of the mathematical model built on the multiple points can, at any time, accurately compute the mapping between the detection point and the actual absolute coordinates (x, y) according to the proportional formulas, so that the picture stays centered.
In the above scheme, as an optimization, the shooting positioning in step 1 is performed by 4 positioning cameras, arranged respectively at point A, point B, point C and point D;
in the Z coordinate system, point A is (X1, Y1), point B is (X2, Y2), point C is (X3, Y3) and point D is (X4, Y4).
Further, step 2 comprises:
step A1, setting the center of the shooting camera's picture at the position of each positioning camera, and calculating the corresponding magnification value;
step A2, defining the coordinates of the tracking point as SourceX and SourceY, and when the positioning camera detects the coordinates of the tracking point, calculating the transverse proportion SSX = SourceX/MaxX and the longitudinal proportion SSY = SourceY/MaxY;
step A3, calculating the magnification value of the target point from the corner magnification values obtained in step A1, where A, B, C and D denote the magnification values Z1, Z2, Z3 and Z4 at the four corner points:
CZ=(Q-P)*SSY+P, P=(B-A)*SSX+A, Q=(D-C)*SSX+C;
CZ=((D-C)*SSX+C-((B-A)*SSX+A))*SSY+(B-A)*SSX+A
substituting Z1-Z4, the zoom instruction CZ of the shooting camera is finally calculated as:
CZ=((Z4-Z3)*SSX+Z3-((Z2-Z1)*SSX+Z1))*SSY+(Z2-Z1)*SSX+Z1。
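The interpolation in steps A2 and A3 can be sketched in Python. This is an illustrative reading of the formulas above, not code from the patent; the function name, parameter names, and the example values below are assumptions.

```python
def zoom_command(source_x, source_y, max_x, max_y, z1, z2, z3, z4):
    """Bilinear interpolation of the four corner magnification values.

    (source_x, source_y) is the tracked point reported by the positioning
    cameras, (max_x, max_y) is the positioning frame size, and z1..z4 are
    the calibrated magnification values at corner points A, B, C and D.
    """
    ssx = source_x / max_x    # transverse proportion SSX
    ssy = source_y / max_y    # longitudinal proportion SSY
    p = (z2 - z1) * ssx + z1  # zoom interpolated along edge A -> B
    q = (z4 - z3) * ssx + z3  # zoom interpolated along edge C -> D
    return (q - p) * ssy + p  # blend the two edges by SSY: this is CZ
```

For example, with corner zooms 10, 12, 14 and 18 and the tracked point at the frame center (SSX = SSY = 0.5), the edge values are P = 11 and Q = 16, so CZ = 13.5.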
Further, step A1 comprises:
setting point A (X1, Y1) as the position of the A positioning camera, placing the center of the shooting camera's picture at point A (X1, Y1), and calculating magnification value Z1;
setting point B (X2, Y2) as the position of the B positioning camera, placing the center of the shooting camera's picture at point B (X2, Y2), and calculating magnification value Z2;
setting point C (X3, Y3) as the position of the C positioning camera, placing the center of the shooting camera's picture at point C (X3, Y3), and calculating magnification value Z3;
setting point D (X4, Y4) as the position of the D positioning camera, placing the center of the shooting camera's picture at point D (X4, Y4), and calculating magnification value Z4.
Further, step 3 comprises:
step B1, based on the four points A, B, C and D defined in step 1, calculating:
PX=(X2-X1)*SSX+X1,PY=(Y2-Y1)*SSY+Y1,QX=(X4-X3)*SSX+X3,QY=(Y4-Y3)*SSY+Y3;
calculating a positioning command (CX, CY) of the shooting camera:
CX=(X4-X3-X2+X1)*SSX*SSX+(X3+X2-2*X1)*SSX+X1;
CY=(Y4-Y3-Y2+Y1)*SSY*SSY+(Y3+Y2-2*Y1)*SSY+Y1。
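Expanding PX, PY, QX and QY and blending once more yields the quadratic expressions above. A hedged Python sketch follows; the function name, parameter names and test coordinates are illustrative assumptions, not taken from the patent.

```python
def positioning_command(ssx, ssy,
                        ax, ay, bx, by,     # corner points A and B
                        cx_, cy_, dx, dy):  # corner points C and D
    """Quadratic blend of the four corner coordinates giving (CX, CY).

    Algebraically equivalent to interpolating P between A and B and
    Q between C and D, then blending P and Q, as in step B1.
    """
    cx = (dx - cx_ - bx + ax) * ssx * ssx + (cx_ + bx - 2 * ax) * ssx + ax
    cy = (dy - cy_ - by + ay) * ssy * ssy + (cy_ + by - 2 * ay) * ssy + ay
    return cx, cy
```

As a quick sanity check of the expansion: at SSX = SSY = 0 the result is corner A, and at SSX = SSY = 1 it is corner D.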
the invention has the beneficial effects that: the invention utilizes the mathematical model to establish the mapping change from the geometrically used three-dimensional space to the two-dimensional plane, can quickly calculate the conversion of the physical coordinate by positioning the coordinate, is very convenient and accurate, does not depend on the characteristic change of the target detection point to ensure that the target is always centered in the plane, and is very suitable for shooting the dynamically changed scene in a micro-course.
Drawings
The invention is further illustrated with reference to the following figures and examples.
Fig. 1 is a schematic diagram of the positioning camera arrangement.
Fig. 2 is a schematic diagram of the proportion calculation.
Fig. 3 is a schematic diagram of the magnification value calculation.
Fig. 4 is a schematic diagram of the calculation of the positioning coordinates.
Fig. 5 is a schematic flow chart of an intelligent shooting method for micro-class tracking.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the following embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Example 1
This embodiment provides an intelligent shooting method for micro-class tracking; as shown in fig. 5, the method comprises:
step 1, defining the correspondence between the edges of the area to be analyzed and detected and the shooting positioning, and defining a two-dimensional spatial coordinate system as the Z-system coordinates;
step 2, using the conversion from three-dimensional space to the two-dimensional coordinate system, converting the four corner points of the Z-system coordinates into their corresponding magnification values, and calculating the Z coordinate of the detection point as the zoom instruction of the shooting camera according to the quadrilateral similarity principle;
step 3, using multi-point positioning of the plane of the analysis and detection area, detecting the moving target point by image analysis, calculating the mapping between the detection point and the actual absolute coordinates (x, y), and calculating the positioning instruction of the shooting camera;
and step 4, the shooting camera operates according to the positioning instruction and zoom instruction of steps 2 and 3, so that the shot is centered.
Specifically, as shown in fig. 1, the shooting positioning in step 1 is performed by 4 positioning cameras, arranged respectively at point A, point B, point C and point D;
in the Z coordinate system, point A is (X1, Y1), point B is (X2, Y2), point C is (X3, Y3) and point D is (X4, Y4).
Specifically, step 2 comprises:
step A1, setting the center of the shooting camera's picture at the position of each positioning camera, and calculating the corresponding magnification value;
step A2, defining the coordinates of the tracking point as SourceX and SourceY, and when the positioning camera detects the coordinates of the tracking point, calculating the transverse proportion SSX = SourceX/MaxX and the longitudinal proportion SSY = SourceY/MaxY;
step A3, calculating the magnification value of the target point from the corner magnification values obtained in step A1, where A, B, C and D denote the magnification values Z1, Z2, Z3 and Z4 at the four corner points:
CZ=(Q-P)*SSY+P, P=(B-A)*SSX+A, Q=(D-C)*SSX+C;
CZ=((D-C)*SSX+C-((B-A)*SSX+A))*SSY+(B-A)*SSX+A
as shown in fig. 3, substituting Z1-Z4, the zoom instruction CZ of the shooting camera is finally calculated as:
CZ=((Z4-Z3)*SSX+Z3-((Z2-Z1)*SSX+Z1))*SSY+(Z2-Z1)*SSX+Z1。
Specifically, step A1 comprises:
setting point A (X1, Y1) as the position of the A positioning camera, placing the center of the shooting camera's picture at point A (X1, Y1), and calculating magnification value Z1;
setting point B (X2, Y2) as the position of the B positioning camera, placing the center of the shooting camera's picture at point B (X2, Y2), and calculating magnification value Z2;
setting point C (X3, Y3) as the position of the C positioning camera, placing the center of the shooting camera's picture at point C (X3, Y3), and calculating magnification value Z3;
setting point D (X4, Y4) as the position of the D positioning camera, placing the center of the shooting camera's picture at point D (X4, Y4), and calculating magnification value Z4.
Specifically, step 3 includes:
step B1, based on the four points A, B, C and D defined in step 1, calculating:
PX=(X2-X1)*SSX+X1,PY=(Y2-Y1)*SSY+Y1,QX=(X4-X3)*SSX+X3,QY=(Y4-Y3)*SSY+Y3;
as shown in fig. 4, the positioning command (CX, CY) of the photographing camera is calculated:
CX=(X4-X3-X2+X1)*SSX*SSX+(X3+X2-2*X1)*SSX+X1;
CY=(Y4-Y3-Y2+Y1)*SSY*SSY+(Y3+Y2-2*Y1)*SSY+Y1。
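Under hypothetical names and calibration values (assumptions for illustration, not from the patent), the whole embodiment, from the proportion calculation to the zoom and positioning instructions, can be sketched end to end:

```python
def track(source_x, source_y, max_x, max_y, corners, zooms):
    """Compute (CX, CY, CZ) for one detected tracking point.

    corners: mapping 'A'..'D' -> (X, Y) corner coordinates.
    zooms:   mapping 'A'..'D' -> calibrated magnification value Z.
    """
    ssx = source_x / max_x  # transverse proportion SSX (step A2)
    ssy = source_y / max_y  # longitudinal proportion SSY (step A2)
    (x1, y1), (x2, y2) = corners['A'], corners['B']
    (x3, y3), (x4, y4) = corners['C'], corners['D']
    z1, z2, z3, z4 = zooms['A'], zooms['B'], zooms['C'], zooms['D']
    # Zoom instruction CZ (step A3).
    cz = ((z4 - z3) * ssx + z3 - ((z2 - z1) * ssx + z1)) * ssy \
         + (z2 - z1) * ssx + z1
    # Positioning instruction (CX, CY) (step B1).
    cx = (x4 - x3 - x2 + x1) * ssx * ssx + (x3 + x2 - 2 * x1) * ssx + x1
    cy = (y4 - y3 - y2 + y1) * ssy * ssy + (y3 + y2 - 2 * y1) * ssy + y1
    return cx, cy, cz
```

With a unit-square-like layout and a point at the frame center, the camera is directed to the middle of the area with the mid-range zoom, which matches the centering behaviour the embodiment describes.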
This embodiment uses a mathematical model to establish the geometric mapping from three-dimensional space to the two-dimensional plane, so that the physical coordinates can be computed quickly from the positioning coordinates. The method is convenient and accurate, keeps the target centered in the plane at all times without relying on feature changes of the target detection point, and is well suited to shooting the dynamically changing scenes of a micro class.
Although illustrative embodiments of the invention have been described above to enable those skilled in the art to understand it, the invention is not limited to the scope of those embodiments. To those skilled in the art, all variations that fall within the spirit and scope of the invention as defined by the appended claims are protected.

Claims (2)

1. An intelligent shooting method for micro-class tracking, characterized in that it comprises the following steps:
step 1, defining the correspondence between the edges of the area to be analyzed and detected and the shooting positioning, and defining a two-dimensional spatial coordinate system as the Z-system coordinates, wherein the shooting positioning is performed by 4 positioning cameras arranged respectively at point A, point B, point C and point D, which in the Z coordinate system are (X1, Y1), (X2, Y2), (X3, Y3) and (X4, Y4);
step 2, using the conversion from three-dimensional space to the two-dimensional coordinate system, converting the four corner points of the Z-system coordinates into their corresponding magnification values, and calculating the Z coordinate of the detection point as the zoom instruction of the shooting camera according to the quadrilateral similarity principle;
step 3, using multi-point positioning of the plane of the analysis and detection area, detecting the moving target point by image analysis, calculating the mapping between the detection point and the actual absolute coordinates (x, y), and calculating the positioning instruction of the shooting camera;
step 4, the shooting camera operates according to the positioning instruction and zoom instruction of steps 2 and 3, so that the shot is centered,
wherein the step 2 comprises:
step A1, setting the center of the shooting camera's picture at each positioning camera position and calculating the corresponding magnification value; specifically, setting point A (X1, Y1) as the position of the A positioning camera, placing the center of the shooting camera's picture at point A (X1, Y1), and calculating magnification value Z1;
setting point B (X2, Y2) as the position of the B positioning camera, placing the center of the shooting camera's picture at point B (X2, Y2), and calculating magnification value Z2;
setting point C (X3, Y3) as the position of the C positioning camera, placing the center of the shooting camera's picture at point C (X3, Y3), and calculating magnification value Z3;
setting point D (X4, Y4) as the position of the D positioning camera, placing the center of the shooting camera's picture at point D (X4, Y4), and calculating magnification value Z4;
step A2, defining the coordinates of the tracking point as SourceX and SourceY, and when the positioning camera detects the coordinates of the tracking point, calculating the transverse proportion SSX = SourceX/MaxX and the longitudinal proportion SSY = SourceY/MaxY;
step A3, calculating the magnification value of the target point from the corner magnification values obtained in step A1, where A, B, C and D denote the magnification values Z1, Z2, Z3 and Z4 at the four corner points:
CZ=(Q-P)*SSY+P, P=(B-A)*SSX+A, Q=(D-C)*SSX+C;
CZ=((D-C)*SSX+C-((B-A)*SSX+A))*SSY+(B-A)*SSX+A
substituting Z1-Z4, the zoom instruction CZ of the shooting camera is finally calculated as:
CZ=((Z4-Z3)*SSX+Z3-((Z2-Z1)*SSX+Z1))*SSY+(Z2-Z1)*SSX+Z1。
2. The intelligent shooting method for micro-class tracking according to claim 1, wherein step 3 comprises:
step B1, based on the four points A, B, C and D defined in step 1, calculating:
PX=(X2-X1)*SSX+X1,PY=(Y2-Y1)*SSY+Y1,QX=(X4-X3)*SSX+X3,QY=(Y4-Y3)*SSY+Y3;
calculating a positioning command (CX, CY) of the shooting camera:
CX=(X4-X3-X2+X1)*SSX*SSX+(X3+X2-2*X1)*SSX+X1;
CY=(Y4-Y3-Y2+Y1)*SSY*SSY+(Y3+Y2-2*Y1)*SSY+Y1。
CN201811090606.3A 2018-09-19 2018-09-19 Intelligent shooting method for micro-class tracking Active CN109120849B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811090606.3A CN109120849B (en) 2018-09-19 2018-09-19 Intelligent shooting method for micro-class tracking

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811090606.3A CN109120849B (en) 2018-09-19 2018-09-19 Intelligent shooting method for micro-class tracking

Publications (2)

Publication Number Publication Date
CN109120849A CN109120849A (en) 2019-01-01
CN109120849B true CN109120849B (en) 2020-09-18

Family

ID=64859761

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811090606.3A Active CN109120849B (en) 2018-09-19 2018-09-19 Intelligent shooting method for micro-class tracking

Country Status (1)

Country Link
CN (1) CN109120849B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003346132A (en) * 2002-05-28 2003-12-05 Toshiba Corp Coordinate conversion device and coordinate conversion program
CN1554985A * 2002-02-28 2004-12-15 Camera system, display and control method, control program and readable medium
CN102629986A (en) * 2012-04-10 2012-08-08 广州市奥威亚电子科技有限公司 Automatic tracking and shooting method
CN105354578A (en) * 2015-10-27 2016-02-24 安徽大学 Multi-target object image matching method
CN105894702A (en) * 2016-06-21 2016-08-24 南京工业大学 Invasion detecting alarming system based on multi-camera data combination and detecting method thereof

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9749594B2 (en) * 2011-12-22 2017-08-29 Pelco, Inc. Transformation between image and map coordinates

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1554985A * 2002-02-28 2004-12-15 Camera system, display and control method, control program and readable medium
JP2003346132A (en) * 2002-05-28 2003-12-05 Toshiba Corp Coordinate conversion device and coordinate conversion program
CN102629986A (en) * 2012-04-10 2012-08-08 广州市奥威亚电子科技有限公司 Automatic tracking and shooting method
CN105354578A (en) * 2015-10-27 2016-02-24 安徽大学 Multi-target object image matching method
CN105894702A (en) * 2016-06-21 2016-08-24 南京工业大学 Invasion detecting alarming system based on multi-camera data combination and detecting method thereof

Also Published As

Publication number Publication date
CN109120849A (en) 2019-01-01

Similar Documents

Publication Publication Date Title
Chen et al. High-accuracy multi-camera reconstruction enhanced by adaptive point cloud correction algorithm
CN107358633A (en) Join scaling method inside and outside a kind of polyphaser based on 3 points of demarcation things
CN110300292B (en) Projection distortion correction method, device, system and storage medium
CN103617615B (en) Radial distortion parameter acquisition methods and acquisition device
CN104715479A (en) Scene reproduction detection method based on augmented virtuality
US20190096092A1 (en) Method and device for calibration
CN112686877B (en) Binocular camera-based three-dimensional house damage model construction and measurement method and system
CN104867113B (en) The method and system of perspective image distortion correction
CN102509304A (en) Intelligent optimization-based camera calibration method
CN108362205B (en) Space distance measuring method based on fringe projection
CN105701828A (en) Image-processing method and device
CN114792345B (en) Calibration method based on monocular structured light system
CN112288815B (en) Target die position measurement method, system, storage medium and device
CN111080776A (en) Processing method and system for human body action three-dimensional data acquisition and reproduction
CN107851301A (en) System and method for selecting image to convert
CN114640833A (en) Projection picture adjusting method and device, electronic equipment and storage medium
JP6579727B1 (en) Moving object detection device, moving object detection method, and moving object detection program
CN103533326A (en) System and method for alignment of stereo views
CN101729739A (en) Method for rectifying deviation of image
CN109120849B (en) Intelligent shooting method for micro-class tracking
CN102023763A (en) Positioning method of touch system camera
CN111145266B (en) Fisheye camera calibration method and device, fisheye camera and readable storage medium
KR101673144B1 (en) Stereoscopic image registration method based on a partial linear method
CN104156952B (en) A kind of image matching method for resisting deformation
CN104732538B (en) Camera positioning and tracing method and related system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200824

Address after: 247126 workshop C2, science and Technology Incubation Park, Jiangnan industrial concentration zone, Chizhou City, Anhui Province

Applicant after: Anhui Wenxiang Information Technology Co.,Ltd.

Address before: 100075 Beijing Daxing District, Beijing Economic and Technological Development Zone, No. 26 Kechuang Thirteenth Street, 1 5-storey 501

Applicant before: BEIJING WENXIANG INFORMATION TECHNOLOGY Co.,Ltd.

GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: An Intelligent Shooting Method for Micro Course Tracking

Granted publication date: 20200918

Pledgee: Anhui Jiangnan Industrial Concentration Zone Construction Investment Development (Group) Co.,Ltd.

Pledgor: Anhui Wenxiang Information Technology Co.,Ltd.

Registration number: Y2024980010826