CN104732538B - Camera positioning and tracing method and related system - Google Patents

Camera positioning and tracing method and related system

Info

Publication number
CN104732538B
CN104732538B (application number CN201510138377.8A)
Authority
CN
China
Prior art keywords
camera
target object
parameter
rectangle
obtains
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510138377.8A
Other languages
Chinese (zh)
Other versions
CN104732538A (en)
Inventor
Wang Guomeng (王国孟)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wang Guomeng
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN201510138377.8A
Publication of CN104732538A
Application granted
Publication of CN104732538B
Legal status: Active

Landscapes

  • Image Analysis (AREA)

Abstract

A camera positioning and tracking method and related system. The method includes: acquiring image data of a target object with a first camera and a second camera respectively; computing, from the image data of the target object and prestored image data of four reference points, the coordinate parameters of the target object with each reference point in turn taken as the reference, so as to obtain four sets of coordinate parameters of the target object; computing the average of the four sets of coordinate parameters; and computing the third-camera position parameter matrix corresponding to the target object from the four prestored position parameter matrices corresponding to the four reference points, the width and length of the rectangle, and the average coordinate parameters of the target object. The tracking and positioning of the present invention is precise and simple to operate, and even non-professionals can use it easily and flexibly.

Description

Camera positioning and tracing method and related system
Technical field
The present invention relates to a camera positioning and tracking method and a related system.
Background technology
In automatic recording of school lessons, students need to be located and tracked by a camera tracking system. Single-camera tracking systems and dual-camera tracking systems are currently in common use. However, an existing single-camera system cannot perform positioning and tracking in three-dimensional space for close-up shooting; its positioning is easily inaccurate and its use is very limited. A dual-camera system can achieve static positioning in three-dimensional space, but when the two cameras are controlled to perform dynamic tracking and positioning for close-up shooting, professionals must be involved and many professional parameters must be set; the operating process is quite complicated, the tracking and positioning accuracy is low, and repeated manual adjustment is usually required, which is time-consuming.
Summary of the invention
In view of the shortcomings of the prior art, the present invention aims to provide a camera positioning and tracking method and a related system that solve the above technical problems.
To achieve the above object, the present invention adopts the following technical solutions:
A camera positioning and tracking method, characterized in that it comprises the following steps:
Step S101: acquiring image data of a target object with a first camera and a second camera respectively;
Step S102: computing, from the image data of the target object and prestored image data of four reference points, the coordinate parameters of the target object with each reference point in turn taken as the reference, so as to obtain four sets of coordinate parameters of the target object; wherein the four reference points form a rectangle on a horizontal plane in space, and the straight line through the first camera and the second camera is parallel to one side of the rectangle;
Step S103: computing the average of the four sets of coordinate parameters; and
Step S104: computing the third-camera position parameter matrix corresponding to the target object from the four prestored position parameter matrices corresponding to the four reference points, the width and length of the rectangle, and the average coordinate parameters of the target object.
A camera positioning and tracking system, characterized by comprising a first camera, a second camera, a third camera, and a processor;
the first camera and the second camera are each configured to generate image data of the target object;
the processor is configured to compute, from the image data of the target object and prestored image data of four reference points, the coordinate parameters of the target object with each reference point in turn taken as the reference, so as to obtain four sets of coordinate parameters of the target object; wherein the four reference points form a rectangle on a horizontal plane in space, and the straight line through the first camera and the second camera is parallel to one side of the rectangle;
the processor is further configured to compute the average of the four sets of coordinate parameters, and to compute the third-camera position parameter matrix corresponding to the target object from the four prestored position parameter matrices corresponding to the four reference points, the width and length of the rectangle, and the average coordinate parameters of the target object.
A positioning and tracking system, characterized by comprising the following modules:
an image acquisition module, configured to acquire image data of a target object with a first camera and a second camera respectively;
a coordinate parameter acquisition module, configured to compute, from the image data of the target object and prestored image data of four reference points, the coordinate parameters of the target object with each reference point in turn taken as the reference, so as to obtain four sets of coordinate parameters of the target object, and to compute the average of the four sets of coordinate parameters; wherein the four reference points form a rectangle on a horizontal plane in space, and the straight line through the first camera and the second camera is parallel to one side of the rectangle; and
a position parameter matrix acquisition module, configured to compute the third-camera position parameter matrix corresponding to the target object from the four prestored position parameter matrices corresponding to the four reference points, the width and length of the rectangle, and the average coordinate parameters of the target object.
The beneficial effects of the present invention are at least as follows:
During tracking and positioning according to the present invention, the target object only needs to be photographed by the first camera and the second camera; the position parameter matrix with which the third camera photographs the target object is then computed automatically, so that the third camera can take an accurate close-up shot. Throughout the whole operation the user only needs to photograph the target object to achieve accurate tracking and positioning; the operation is very simple, and even non-professionals can use it easily and flexibly.
Brief description of the drawings
Fig. 1 is a general flowchart of a preferred embodiment of the camera positioning and tracking method of the present invention.
Fig. 2 is a schematic diagram of the layout of the reference points, the target object, the first camera, and the second camera involved in the method of Fig. 1.
Fig. 3 is a schematic structural diagram of a preferred embodiment of the camera positioning and tracking system of the present invention.
Fig. 4 is a module diagram of a preferred embodiment of the positioning and tracking system of the present invention.
Embodiment
The present invention is further described below with reference to the accompanying drawings and embodiments:
Referring to Fig. 1, the present invention relates to a camera positioning and tracking method whose preferred embodiment comprises the following steps:
Step S101: acquiring image data of a target object with a first camera and a second camera respectively.
For example, referring to Fig. 2, points A, B, C, and D are the four reference points, points O1 and O2 are the first camera and the second camera respectively, the straight line O1O2 is parallel to side CD of the rectangle, and point S is the target object.
Step S102: computing, from the image data of the target object and the prestored image data of the four reference points, the coordinate parameters of the target object with each reference point in turn taken as the reference, so as to obtain four sets of coordinate parameters of the target object;
wherein the four reference points form a rectangle on a horizontal plane in space, and the straight line through the first camera and the second camera is parallel to one side of the rectangle.
In practical applications, two sets of image data containing the four reference points can be acquired in advance with the first camera and the second camera respectively, for use in subsequent tracking and positioning.
For example, the coordinate parameters of the target object are computed from geometric figures, such as triangles, that have reference points A, B, C, and D as vertices.
In this embodiment, step S102 obtains the coordinate parameters of the target object with one of the reference points as the reference through the following sub-steps. For convenience of description, reference point A is taken as the reference; however, the following procedure is not limited to reference point A and applies equally to the other three reference points.
Step S1021: obtaining ∠AO1S and ∠AO2S according to the camera projection principle. For example, if the first camera O1 has a 3.6 mm wide-angle lens with a 60-degree projection angle, then, for an image frame 800 pixels wide with the target object S located at pixel 400, ∠AO1S can be found to be 30 degrees from the offsets of reference point A and target object S in the image of the first camera O1. ∠AO2S can be found in the same way.
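As an illustration only (not code from the patent), the pixel-to-angle step can be sketched as below. It assumes reference point A projects at the left edge of the frame and a simple proportional mapping between pixel offset and viewing angle, matching the 60-degree / 800-pixel example above; a real lens would need calibration and distortion correction.

```python
def pixel_to_angle_deg(target_px, reference_px, frame_width_px, fov_deg):
    """Angle between the rays toward the reference point and the target,
    assuming the viewing angle varies proportionally with pixel offset."""
    return abs(target_px - reference_px) * fov_deg / frame_width_px

# Example from the description: 60-degree wide-angle lens, 800-pixel-wide frame,
# reference point A at the frame edge (pixel 0), target S at pixel 400.
angle_AO1S = pixel_to_angle_deg(400, 0, 800, 60.0)  # -> 30.0 degrees
```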
Step S1022: computing ∠AO1O2 and ∠AO2O1 by inverse trigonometric functions from the preset distance parameters AB, AC, CH1, and O1O2. For example, ∠AO1O2 can be obtained from the arctangent function in the right triangle △AO1H1.
Step S1023: subtracting ∠AO1S from ∠AO1O2 to obtain ∠SO1O2, and subtracting ∠AO2S from ∠AO2O1 to obtain ∠SO2O1.
Step S1024: obtaining the distance parameter SO1 in triangle △SO1O2 by the law of sines.
Step S1025: computing the coordinate parameters (x, y) of the target object S by formula group I, which resolves the distance SO1 and the angle ∠SO1O2 into the coordinates (x, y) of S.
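The sub-steps S1021 to S1025 can be sketched as follows. This is only a sketch, not the patent's formula group I (which is not reproduced in this text): it assumes the camera baseline O1O2 is centred under side CD and is wider than the rectangle, so that H1 lies between O1 and O2, and it expresses the result with the origin at O1 and the x-axis along O1O2 — both conventions are assumptions.

```python
import math

def locate_target(angle_AO1S_deg, angle_AO2S_deg, AB, AC, CH1, O1O2):
    """Triangulate target S from the two bearings of step S1021, using the
    preset distances AB, AC, CH1 and O1O2 of step S1022 (reference point A)."""
    AH1 = AC + CH1                 # A to H1, the foot of AC's extension on O1O2
    O1H1 = (O1O2 - AB) / 2.0       # assumed: baseline centred under side CD
    O2H1 = O1O2 - O1H1

    # Step S1022: angles toward A at O1 and O2 via inverse trigonometry.
    angle_AO1O2 = math.degrees(math.atan2(AH1, O1H1))
    angle_AO2O1 = math.degrees(math.atan2(AH1, O2H1))

    # Step S1023: angles toward the target S at O1 and O2.
    angle_SO1O2 = angle_AO1O2 - angle_AO1S_deg
    angle_SO2O1 = angle_AO2O1 - angle_AO2S_deg

    # Step S1024: distance SO1 by the law of sines in triangle SO1O2.
    angle_O1SO2 = 180.0 - angle_SO1O2 - angle_SO2O1
    SO1 = O1O2 * math.sin(math.radians(angle_SO2O1)) / math.sin(math.radians(angle_O1SO2))

    # Step S1025: resolve SO1 and the angle at O1 into planar coordinates
    # (assumed convention: origin at O1, x along O1->O2, y toward the rectangle).
    x = SO1 * math.cos(math.radians(angle_SO1O2))
    y = SO1 * math.sin(math.radians(angle_SO1O2))
    return x, y
```

Repeating the same computation with B, C, and D as the reference (with the corresponding distances) yields the four coordinate sets that step S103 averages.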
Step S103: computing the average of the four sets of coordinate parameters.
Step S104: computing the third-camera position parameter matrix corresponding to the target object from the four prestored position parameter matrices corresponding to the four reference points, the width and length of the rectangle, and the average coordinate parameters of the target object, so that the camera can be moved precisely and quickly to take a close-up shot of the target object.
In practical applications, the position parameter matrix with which the third camera shoots each reference point can be obtained in advance for subsequent use.
In this embodiment, step S104 obtains the third-camera position parameter matrix corresponding to the target object by formula group II, wherein P0, P1, P2, and P3 are the four position parameter matrices, AB is the length of the rectangle, AC is the width of the rectangle, PS is the camera position parameter matrix corresponding to the target object, sy and sx are the average coordinate parameters of the target object, and P20 and P31 are intermediate variables introduced only to keep the expressions short and unambiguous; formula group II could be merged into a single formula without P20 and P31, so these intermediate variables do not affect the protection scope of the invention.
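Because formula group II is only described here through its inputs and intermediate variables, the sketch below shows one plausible reading: a bilinear blend of the four prestored matrices, with P20 interpolated between P2 and P0, P31 between P3 and P1, and PS between P20 and P31. The pairing of P0 to P3 with reference points A to D, the exact weights, and the reading of sx and sy as coordinates measured within the rectangle (0 ≤ sx ≤ AB, 0 ≤ sy ≤ AC) are assumptions, not the patent's published formula.

```python
import numpy as np

def third_camera_position(P0, P1, P2, P3, AB, AC, sx, sy):
    """Blend the four prestored position parameter matrices (assumed to
    correspond to reference points A, B, C and D) at the averaged target
    coordinates (sx, sy) inside the AB-by-AC rectangle."""
    u = sx / AB                    # normalized position along the length AB
    v = sy / AC                    # normalized position along the width AC
    P20 = P2 + (P0 - P2) * v       # intermediate blend along the A-C edge
    P31 = P3 + (P1 - P3) * v       # intermediate blend along the B-D edge
    return P20 + (P31 - P20) * u   # PS: position parameters for the third camera

# Example with 2-element pan/tilt parameter "matrices" at the four corners:
P0, P1, P2, P3 = (np.array(p, dtype=float) for p in ([0, 10], [40, 10], [0, 30], [40, 30]))
PS = third_camera_position(P0, P1, P2, P3, AB=8.0, AC=6.0, sx=4.0, sy=3.0)  # centre -> [20. 20.]
```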
During tracking and positioning according to the present invention, the target object only needs to be photographed by the first camera and the second camera; the position parameter matrix with which the third camera photographs the target object is then computed automatically, so that the third camera can take an accurate close-up shot. Throughout the whole operation the user only needs to photograph the target object to achieve accurate tracking and positioning; the operation is very simple, and even non-professionals can use it easily and flexibly.
The above four reference points may be the four corners of a rectangular area, or four participants such as students.
Referring to Fig. 3, the present invention further relates to a camera positioning and tracking system whose preferred embodiment comprises a first camera, a second camera, a third camera, and a processor;
the first camera and the second camera are configured to generate two sets of image data of the target object;
the processor is configured to compute, from the image data and prestored image data of four reference points, the coordinate parameters of the target object with each reference point in turn taken as the reference, so as to obtain four sets of coordinate parameters of the target object;
specifically, the processor obtains ∠AO1S and ∠AO2S according to the camera projection principle; computes ∠AO1O2 and ∠AO2O1 by inverse trigonometric functions from the preset distance parameters AB, AC, CH1, and O1O2; subtracts ∠AO1S from ∠AO1O2 to obtain ∠SO1O2, and subtracts ∠AO2S from ∠AO2O1 to obtain ∠SO2O1; obtains the distance parameter SO1 in △SO1O2 by the law of sines; and computes by formula group I the coordinate parameters (x, y) of the target object S with one of the reference points as the reference;
the processor is further configured to compute the average of the four sets of coordinate parameters, and to compute the third-camera position parameter matrix corresponding to the target object from the four prestored position parameter matrices corresponding to the four reference points, the width and length of the rectangle, and the average coordinate parameters of the target object;
specifically, the processor obtains the third-camera position parameter matrix corresponding to the target object by formula group II, wherein P0, P1, P2, and P3 are the four position parameter matrices, AB is the length of the rectangle, AC is the width of the rectangle, PS is the camera position parameter matrix corresponding to the target object, P20 and P31 are intermediate variables, and sy and sx are the average coordinate parameters of the target object.
Referring to Fig. 4, the present invention further relates to a positioning and tracking system comprising the following modules:
an image acquisition module, configured to acquire image data of a target object with a first camera and a second camera respectively;
a coordinate parameter acquisition module, configured to compute, from the image data of the target object and prestored image data of four reference points, the coordinate parameters of the target object with each reference point in turn taken as the reference, so as to obtain four sets of coordinate parameters of the target object, and to compute the average of the four sets of coordinate parameters;
a position parameter matrix acquisition module, configured to compute the third-camera position parameter matrix corresponding to the target object from the four position parameter matrices of the four prestored reference points, the width and length of the rectangle, and the average coordinate parameters of the target object.
Preferably, the coordinate parameter acquisition module obtains ∠AO1S and ∠AO2S according to the camera projection principle; computes ∠AO1O2 and ∠AO2O1 by inverse trigonometric functions from the preset distance parameters AB, AC, CH1, and O1O2; subtracts ∠AO1S from ∠AO1O2 to obtain ∠SO1O2, and subtracts ∠AO2S from ∠AO2O1 to obtain ∠SO2O1; obtains the distance parameter SO1 in △SO1O2 by the law of sines; and computes by formula group I the coordinate parameters (x, y) of the target object S with one of the reference points as the reference.
Preferably, the position parameter matrix acquisition module obtains the third-camera position parameter matrix corresponding to the target object by formula group II, wherein P0, P1, P2, and P3 are the four position parameter matrices, AB is the length of the rectangle, AC is the width of the rectangle, PS is the camera position parameter matrix corresponding to the target object, P20 and P31 are intermediate variables, and sy and sx are the average coordinate parameters of the target object.
Preferably, the first camera and the second camera are fixed-position cameras, and the third camera is a pan-tilt (PTZ) camera.
" first " of the present invention, " second " are used to distinguish different objects, do not play differentiation order.
For those skilled in the art, technical scheme that can be as described above and design, make other each It is kind corresponding to change and deform, and all these change and deformation should all belong to the protection model of the claims in the present invention Within enclosing.

Claims (6)

1. A camera positioning and tracking method, characterized in that it comprises the following steps:
Step S101: acquiring image data of a target object with a first camera and a second camera respectively;
Step S102: computing, from the image data of the target object and prestored image data of four reference points, the coordinate parameters of the target object with each reference point in turn taken as the reference, so as to obtain four sets of coordinate parameters of the target object; wherein the four reference points form a rectangle on a horizontal plane in space, and the straight line through the first camera and the second camera is parallel to one side of the rectangle;
Step S103: computing the average of the four sets of coordinate parameters; and
Step S104: computing the third-camera position parameter matrix corresponding to the target object from the four prestored position parameter matrices corresponding to the four reference points, the width and length of the rectangle, and the average coordinate parameters of the target object;
wherein step S102 obtains the coordinate parameters of the target object with one of the reference points as the reference through the following sub-steps:
Step S1021: obtaining ∠AO1S and ∠AO2S according to the camera projection principle;
Step S1022: computing ∠AO1O2 and ∠AO2O1 by inverse trigonometric functions from the preset distance parameters AB, AC, CH1, and O1O2;
Step S1023: subtracting ∠AO1S from ∠AO1O2 to obtain ∠SO1O2, and subtracting ∠AO2S from ∠AO2O1 to obtain ∠SO2O1;
Step S1024: obtaining the distance parameter SO1 in △SO1O2 by the law of sines; and
Step S1025: computing the coordinate parameters (x, y) of the target object S by formula group I;
wherein A, B, and C are three of the four reference points, points O1 and O2 are the first camera and the second camera respectively, point S is the target object, and H1 is the intersection of the extension of AC with O1O2.
2. The camera positioning and tracking method as claimed in claim 1, characterized in that step S104 obtains the third-camera position parameter matrix corresponding to the target object by formula group II, wherein P0, P1, P2, and P3 are the four position parameter matrices respectively, AB is the length of the rectangle, AC is the width of the rectangle, PS is the camera position parameter matrix corresponding to the target object, P20 and P31 are intermediate variables, and sy and sx are the average coordinate parameters of the target object.
3. A camera positioning and tracking system, characterized by comprising a first camera, a second camera, a third camera, and a processor;
the first camera and the second camera are each configured to generate image data of a target object;
the processor is configured to compute, from the image data of the target object and prestored image data of four reference points, the coordinate parameters of the target object with each reference point in turn taken as the reference, so as to obtain four sets of coordinate parameters of the target object; wherein the four reference points form a rectangle on a horizontal plane in space, and the straight line through the first camera and the second camera is parallel to one side of the rectangle;
the processor is further configured to compute the average of the four sets of coordinate parameters; and
to compute the third-camera position parameter matrix corresponding to the target object from the four prestored position parameter matrices corresponding to the four reference points, the width and length of the rectangle, and the average coordinate parameters of the target object;
the processor obtains ∠AO1S and ∠AO2S according to the camera projection principle; computes ∠AO1O2 and ∠AO2O1 by inverse trigonometric functions from the preset distance parameters AB, AC, CH1, and O1O2; subtracts ∠AO1S from ∠AO1O2 to obtain ∠SO1O2, and subtracts ∠AO2S from ∠AO2O1 to obtain ∠SO2O1; obtains the distance parameter SO1 in △SO1O2 by the law of sines; and computes by formula group I the coordinate parameters (x, y) of the target object S with one of the reference points as the reference;
wherein A, B, and C are three of the four reference points, points O1 and O2 are the first camera and the second camera respectively, point S is the target object, and H1 is the intersection of the extension of AC with O1O2.
4. The camera positioning and tracking system as claimed in claim 3, characterized in that the processor obtains the third-camera position parameter matrix corresponding to the target object by formula group II, wherein P0, P1, P2, and P3 are the four position parameter matrices respectively, AB is the length of the rectangle, AC is the width of the rectangle, PS is the camera position parameter matrix corresponding to the target object, P20 and P31 are intermediate variables, and sy and sx are the average coordinate parameters of the target object.
5. A positioning and tracking system, characterized by comprising the following modules:
an image acquisition module, configured to acquire image data of a target object with a first camera and a second camera respectively;
a coordinate parameter acquisition module, configured to compute, from the image data of the target object and prestored image data of four reference points, the coordinate parameters of the target object with each reference point in turn taken as the reference, so as to obtain four sets of coordinate parameters of the target object, and to compute the average of the four sets of coordinate parameters; wherein the four reference points form a rectangle on a horizontal plane in space, and the straight line through the first camera and the second camera is parallel to one side of the rectangle;
a position parameter matrix acquisition module, configured to compute the third-camera position parameter matrix corresponding to the target object from the four prestored position parameter matrices corresponding to the four reference points, the width and length of the rectangle, and the average coordinate parameters of the target object;
wherein the coordinate parameter acquisition module obtains ∠AO1S and ∠AO2S according to the camera projection principle; computes ∠AO1O2 and ∠AO2O1 by inverse trigonometric functions from the preset distance parameters AB, AC, CH1, and O1O2; subtracts ∠AO1S from ∠AO1O2 to obtain ∠SO1O2, and subtracts ∠AO2S from ∠AO2O1 to obtain ∠SO2O1; obtains the distance parameter SO1 in △SO1O2 by the law of sines; and computes by formula group I the coordinate parameters (x, y) of the target object S with one of the reference points as the reference;
wherein A, B, and C are three of the four reference points, points O1 and O2 are the first camera and the second camera respectively, point S is the target object, and H1 is the intersection of the extension of AC with O1O2.
6. The positioning and tracking system as claimed in claim 5, characterized in that the position parameter matrix acquisition module obtains the third-camera position parameter matrix corresponding to the target object by formula group II, wherein P0, P1, P2, and P3 are the four position parameter matrices respectively, AB is the length of the rectangle, AC is the width of the rectangle, PS is the camera position parameter matrix corresponding to the target object, P20 and P31 are intermediate variables, and sy and sx are the average coordinate parameters of the target object.
CN201510138377.8A 2015-03-26 2015-03-26 Camera positioning and tracing method and related system Active CN104732538B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510138377.8A CN104732538B (en) 2015-03-26 2015-03-26 Camera positioning and tracing method and related system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510138377.8A CN104732538B (en) 2015-03-26 2015-03-26 Camera positioning and tracing method and related system

Publications (2)

Publication Number Publication Date
CN104732538A CN104732538A (en) 2015-06-24
CN104732538B true CN104732538B (en) 2017-11-07

Family

ID=53456406

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510138377.8A Active CN104732538B (en) 2015-03-26 2015-03-26 Camera positioning and tracing method and related system

Country Status (1)

Country Link
CN (1) CN104732538B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201926149A (en) * 2017-11-22 2019-07-01 財團法人資訊工業策進會 Section procedure tracking system and section procedure tracking system method
CN111380457B (en) * 2018-12-29 2024-02-06 上海晨兴希姆通电子科技有限公司 Positioning method and system for material tray

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102620713A (en) * 2012-03-26 2012-08-01 梁寿昌 Method for measuring distance and positioning by utilizing dual camera
CN103777643A (en) * 2012-10-23 2014-05-07 北京网动网络科技股份有限公司 Automatic camera tracking system based on image positioning and tracking method
CN104349037A (en) * 2013-07-29 2015-02-11 浙江大华系统工程有限公司 Method, device and system for automatic tracking of moving target

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19534415A1 (en) * 1995-09-16 1997-03-20 Alain Piaget Method and device for detecting and measuring three-dimensional bodies or any surface

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102620713A (en) * 2012-03-26 2012-08-01 梁寿昌 Method for measuring distance and positioning by utilizing dual camera
CN103777643A (en) * 2012-10-23 2014-05-07 北京网动网络科技股份有限公司 Automatic camera tracking system based on image positioning and tracking method
CN104349037A (en) * 2013-07-29 2015-02-11 浙江大华系统工程有限公司 Method, device and system for automatic tracking of moving target

Also Published As

Publication number Publication date
CN104732538A (en) 2015-06-24

Similar Documents

Publication Publication Date Title
US20240153143A1 (en) Multi view camera registration
CN105716542B (en) A kind of three-dimensional data joining method based on flexible characteristic point
US10726580B2 (en) Method and device for calibration
KR100966592B1 (en) Method for calibrating a camera with homography of imaged parallelogram
CN105096283B (en) The acquisition methods and device of panoramic picture
CN107767422A (en) A kind of fish-eye bearing calibration, device and portable terminal
CN107358633A (en) Join scaling method inside and outside a kind of polyphaser based on 3 points of demarcation things
CN105469389B (en) A kind of grid ball target for vision sensor calibration and corresponding scaling method
KR20170018009A (en) Image processing method and apparatus
CN105894511B (en) Demarcate target setting method, device and parking assistance system
JP5633058B1 (en) 3D measuring apparatus and 3D measuring method
CN107038722A (en) A kind of equipment localization method and device
CN110505468B (en) Test calibration and deviation correction method for augmented reality display equipment
CN107084680A (en) A kind of target depth measuring method based on machine monocular vision
CN105447865B (en) A kind of method and apparatus for assessing panoramic mosaic algorithm static state joining quality
CN107016707B (en) A kind of integration imaging super large three-dimensional scenic shooting method for correcting image
CN110415304B (en) Vision calibration method and system
CN102930551B (en) Camera intrinsic parameters determined by utilizing projected coordinate and epipolar line of centres of circles
KR20160117143A (en) Method, device and system for generating an indoor two dimensional plan view image
CN108734738A (en) Camera calibration method and device
CN104732538B (en) Camera positioning and tracing method and related system
CN103035007B (en) Solving camera intrinsic parameters by using frustum of prism
CN106952262A (en) A kind of deck of boat analysis of Machining method based on stereoscopic vision
CN106524995A (en) Positioning method for detecting spatial distances of target objects on basis of visible-light images in real time
CN110012236A (en) A kind of information processing method, device, equipment and computer storage medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20170922

Address after: 510000 No. 16, Mingzhu Road, Guangzhou economic and Technological Development Zone, Guangdong, Guangzhou

Applicant after: Wang Guomeng

Address before: LAN Guangzhou economic and Technological Development Zone, Guangdong province Guangzhou city four street 510000 No. 9 No. 6 building three or four floor

Applicant before: CREATOR CORPORATION

TA01 Transfer of patent application right
GR01 Patent grant
GR01 Patent grant