CN106648360A - Locating method and device for 3D ball machine - Google Patents
- Publication number: CN106648360A
- Application number: CN201611088774.XA
- Authority
- CN
- China
- Prior art keywords
- angle
- impact point
- width
- sin
- height
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Processing Or Creating Images (AREA)
- Image Processing (AREA)
Abstract
The invention provides a locating method and device for a 3D ball machine. The coordinate values, width, and height of the target point to be located on the screen are obtained, and the horizontal movement angle of the target point is calculated; then, according to the coordinate values, width, and height of the target point on the screen, combined with its horizontal movement angle, the vertical movement angle of the target point is calculated; finally, the target point is located according to its horizontal movement angle and vertical movement angle. Because the locating combines the horizontal movement angle and the vertical movement angle, it is accurate. Moreover, the zoom of the locating method can be scaled intelligently according to the locating accuracy, reducing the error introduced by optical magnification.
Description
Technical field
The application relates to the field of video surveillance, and more particularly to a locating method and device for 3D ball machines.
Background technology
At present, with the development and needs of the security market, the intelligent 3D (three-dimensional) positioning function is becoming increasingly widespread. Particularly in the monitoring of large areas such as traffic, squares, and airports, the intelligent 3D positioning function can quickly position the region selected by the user, zoom appropriately according to the direction and size of the selected region, and quickly restore the original monitoring state after zooming. Intelligent 3D positioning lets the user, when monitoring a large area, quickly locate and magnify a local detail picture and then rapidly restore the original monitoring state. In many safe-city and traffic-monitoring projects, the intelligent 3D positioning function plays a very important role; therefore, whether a device has a 3D positioning function has become one of the major criteria in deploying security monitoring equipment.
However, during research on and practice with the prior art, the inventor found that existing 3D positioning technologies still have the following defects. On the one hand, existing 3D positioning technologies compute the horizontal and vertical movement angles independently of each other; this movement method is comparatively simple, and it is difficult to guarantee positioning accuracy. On the other hand, existing positioning technologies do not take into account the error introduced by optical zoom, so the zoom level cannot truly scale intelligently during monitoring.
Content of the invention
The application provides a locating method and device for a 3D ball machine that can position a target accurately.
According to a first aspect of the application, a locating method for a 3D ball machine is provided, including: obtaining the coordinate values, width, and height of a target point to be located on the screen; calculating the horizontal movement angle of the target point; calculating the vertical movement angle of the target point according to its coordinate values, width, and height on the screen, combined with its horizontal movement angle; and locating the target point according to its horizontal movement angle and vertical movement angle.
Optionally, after the target point is located, the method further includes: zooming in on or out of the target point.
Optionally, zooming in on or out of the target point includes: obtaining the zoom box that the user marks in the ball machine video area; converting the zoom box into a rectangle referenced to the video resolution to obtain the size of the reference area; calculating the optical zoom value according to the size of the reference area; and controlling the ball machine to zoom in or out according to the optical zoom value.
Optionally, converting the zoom box into a rectangle referenced to the video resolution to obtain the size of the reference area includes: calculating the width and height of the converted reference rectangle according to the coordinates of the upper-left corner, the width, and the height of the zoom box, combined with the width and height of a preset reference rectangle and the resolution of the video; and calculating the size of the reference area from the width and height of the converted reference rectangle.
Optionally, calculating the horizontal movement angle of the target point includes: calculating the horizontal movement angle γ of the target point according to the following formula:
γ=atan(|x|*sin(atan(2*|y|*tan(α/2)/w))/(|y|*cos(&+atan(2*|y|*tan(α/2)/w))))
where x and y are respectively the horizontal and vertical coordinate values of the target point's position, α is the current horizontal viewing angle of the ball machine movement, & is the current vertical angle of the movement, and w is the width of the video resolution.
Optionally, calculating the vertical movement angle of the target point according to its coordinate values, width, and height on the screen, combined with its horizontal movement angle, includes: calculating the vertical movement angle θ of the target point according to the following formula:
θ=asin(2*|x|*sin(β/2)/(w*sin(γ)))-asin(cos(90-&)*sin(β/2)/w);
where γ is the horizontal movement angle of the target point, β is the current horizontal viewing angle of the ball machine movement, & is the current vertical angle of the movement, and w is the width of the video resolution.
According to a second aspect of the application, a positioning device for a 3D ball machine is provided, including: an acquiring unit, configured to obtain the coordinate values, width, and height of a target point to be located on the screen; a horizontal calculation unit, configured to calculate the horizontal movement angle of the target point; a vertical calculation unit, configured to calculate the vertical movement angle of the target point according to its coordinate values, width, and height on the screen, combined with its horizontal movement angle; and a positioning unit, configured to locate the target point according to its horizontal movement angle and vertical movement angle.
Optionally, the device further includes: a scaling unit, configured to zoom in on or out of the target point.
Optionally, the scaling unit includes: an acquisition module, configured to obtain the zoom box that the user marks in the ball machine video area; a conversion module, configured to convert the zoom box into a rectangle referenced to the video resolution to obtain the size of the reference area; a calculation module, configured to calculate the optical zoom value according to the size of the reference area; and a zoom module, configured to control the ball machine to zoom in or out according to the optical zoom value.
Optionally, the horizontal calculation unit is specifically configured to: calculate the horizontal movement angle γ of the target point according to the following formula:
γ=atan(|x|*sin(atan(2*|y|*tan(α/2)/w))/(|y|*cos(&+atan(2*|y|*tan(α/2)/w))))
where x and y are respectively the horizontal and vertical coordinate values of the target point's position, α is the current horizontal viewing angle of the ball machine movement, & is the current vertical angle of the movement, and w is the width of the video resolution;
and the vertical calculation unit is specifically configured to: calculate the vertical movement angle θ of the target point according to the following formula:
θ=asin(2*|x|*sin(β/2)/(w*sin(γ)))-asin(cos(90-&)*sin(β/2)/w);
where γ is the horizontal movement angle of the target point, β is the current horizontal viewing angle of the ball machine movement, & is the current vertical angle of the movement, and w is the width of the video resolution.
With the locating method and device for 3D ball machines of the application, the coordinate values, width, and height of the target point to be located on the screen are obtained, and the horizontal movement angle of the target point is calculated; then, according to the coordinate values, width, and height of the target point on the screen, combined with its horizontal movement angle, the vertical movement angle of the target point is calculated; finally, the target point is located according to its horizontal movement angle and vertical movement angle. Because the positioning combines the horizontal angle and the vertical angle, it is accurate; moreover, the zoom of the application's scheme can scale intelligently according to the positioning accuracy, reducing the error introduced by optical zoom.
Description of the drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily understood from the following description of the embodiments taken in conjunction with the accompanying drawings, in which:
Fig. 1 is the method flow chart of the embodiment of the present application;
Fig. 2 is a calculation diagram of the horizontal movement in the embodiment of the present application;
Fig. 3 is a calculation diagram of the vertical movement in the embodiment of the present application;
Fig. 4 is a flow chart of zooming in on or out of the target point in the embodiment of the present application;
Fig. 5 is a schematic structural diagram of a device of the embodiment of the present application;
Fig. 6 is a schematic structural diagram of another device of the embodiment of the present application;
Fig. 7 is a schematic structural diagram of yet another device of the embodiment of the present application.
Specific embodiments
The present invention is described in further detail below through specific embodiments with reference to the accompanying drawings. The application provides a locating method and device for a 3D ball machine that can position a target accurately and restore the original monitoring state.
Embodiment one:
Referring to Fig. 1, the method flow chart of embodiment one of the present application, the embodiment provides a locating method for a 3D ball machine, which may specifically include the following steps:
S10: obtain the coordinate values, width, and height of the target point to be located on the screen.
S20: calculate the horizontal movement angle of the target point.
S30: calculate the vertical movement angle of the target point according to its coordinate values, width, and height on the screen, combined with its horizontal movement angle.
S40: locate the target point according to its horizontal movement angle and vertical movement angle.
In this embodiment, step S20 of calculating the horizontal movement angle of the target point includes:
calculating the horizontal movement angle γ of the target point according to the following formula:
γ=atan(|x|*sin(atan(2*|y|*tan(α/2)/w))/(|y|*cos(&+atan(2*|y|*tan(α/2)/w))))
where x and y are respectively the horizontal and vertical coordinate values of the target point's position, α is the current horizontal viewing angle of the ball machine movement, & is the current vertical angle of the movement, and w is the width of the video resolution.
The detailed process and principle by which step S20 of this embodiment calculates the horizontal movement angle of the target point are explained below:
As shown in Fig. 2, the principle of the horizontal movement is that the horizontal angle is obtained through a series of geometric operations from the position of the target point on the screen, the current horizontal viewing angle of the movement, and the current video resolution.
Step 1. Analyze the target point
Once target point A on the screen is determined, the relevant parameters of A (the abscissa x and the ordinate y) are also determined. Because the picture captured by the movement is spherical, when the movement rotates horizontally at a constant vertical angle, every target point on the screen runs along a corresponding track. As in Fig. 2, after target point A is determined, when the camera rotates horizontally at a constant vertical viewing angle, the track of A on the screen is a fixed arc. To calculate the horizontal movement angle more accurately, the corresponding point on the screen is projected onto the horizontal plane OZN for analysis. In Fig. 2, ∠EOM is the horizontal movement angle of target point A, where E is assumed to be the location of point A after the horizontal move.
Step 2. Algorithm analysis
Because the video resolution and the current horizontal viewing angle of the movement are known, let the width of the video resolution PH be w, the current horizontal viewing angle ∠HOK be α, and the screen perpendicular bisector OK be h. The following relation holds:
h=w/(2*tan(α/2))    (1)
Because the coordinates of target point A are known, i.e. BK=y, let ∠EOK be β and the length of line OE be m. In right triangle BOK the following relations hold:
β=atan(|y|/h)    (2)
m=|y|/sin(β)    (3)
The current vertical angle of the movement is ∠KOM, and lines OE and OK lie in one plane. From the analysis of Fig. 2, let the current vertical angle ∠KOM of the movement be & and ∠EOM be φ; then φ=β+&. In right triangles EOM and BOM, let OM be s and the target angle ∠BOM be γ; then
s=m*cos(φ)    (4)
γ=atan(|x|/s)    (5)
In summary, from formulas (1), (2), (3), (4) and (5), the horizontal target movement angle γ is obtained:
γ=atan(|x|*sin(atan(2*|y|*tan(α/2)/w))/(|y|*cos(&+atan(2*|y|*tan(α/2)/w))))
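As an illustration only, the derivation above can be followed step by step in Python. The function name, the degree-based angle units, and the requirement that x and y be nonzero are assumptions of this sketch, not part of the patent:

```python
import math

def horizontal_angle(x, y, alpha_deg, vert_deg, w):
    """Horizontal movement angle gamma (degrees) of a target point at screen
    coordinates (x, y), following formulas (1)-(5) of the derivation.
    Assumes x and y are nonzero (target not on a screen axis)."""
    alpha = math.radians(alpha_deg)  # current horizontal viewing angle of the movement
    vert = math.radians(vert_deg)    # current vertical angle of the movement (& in the text)
    h = w / (2 * math.tan(alpha / 2))           # (1) screen perpendicular bisector
    beta = math.atan(abs(y) / h)                # (2) angle EOK
    m = abs(y) / math.sin(beta)                 # (3) length of OE
    s = m * math.cos(beta + vert)               # (4) length of OM, with phi = beta + &
    return math.degrees(math.atan(abs(x) / s))  # (5) gamma = atan(|x|/s)
```

Substituting (1)-(4) into (5) cancels the intermediate lengths and reproduces the closed form stated above, with β=atan(2*|y|*tan(α/2)/w).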
In this embodiment, step S30 of calculating the vertical movement angle of the target point according to its coordinate values, width, and height on the screen, combined with its horizontal movement angle, specifically includes:
calculating the vertical movement angle θ of the target point according to the following formula:
θ=asin(2*|x|*sin(β/2)/(w*sin(γ)))-asin(cos(90-&)*sin(β/2)/w);
where γ is the horizontal movement angle of the target point, β is the current horizontal viewing angle of the ball machine movement, & is the current vertical angle of the movement, and w is the width of the video resolution.
The detailed process and principle by which step S30 of this embodiment calculates the vertical movement angle of the target point are explained below:
As shown in Fig. 3, the principle of the vertical movement is that the vertical angle is obtained through a series of geometric operations from the position of the target point on the screen, the current vertical angle of the movement, and the current horizontal and vertical viewing angles of the movement. The vertical movement is not independent: it is related to the horizontal movement, and the horizontal movement angle is one of the input parameters of the vertical movement algorithm.
Step 10. Analyze the target point
Once target point A on the screen is determined, i.e. the coordinate values, width, and height of target point A on the screen are determined, the system automatically computes the horizontal movement angle from the target's parameters. Because the picture captured by the movement is spherical, each point on the screen runs on a corresponding arc track, and the larger the vertical viewing angle of the movement, the smaller the radius of the corresponding arc. For target point A (x, y) in Fig. 3, when the vertical angle & of the movement is constant, point A moves on an arc of radius r2 on the screen. As the top view shows, to move point A to the center of the screen, A must first be moved horizontally by the angle γ and then moved vertically. As the side view shows, after A has been moved horizontally by γ, the vertical position (y) of A has already changed, i.e. the vertical viewing angle of point A has changed and A has moved to point B. Therefore, when calculating the vertical movement angle, the y value of A cannot be used as the input variable of the vertical movement algorithm; doing so would cause the vertical movement to overshoot or undershoot.
Step 20. Algorithm analysis
From the target-point analysis of the movement, combining the top view and the side view, the vertical movement angle θ of the movement is:
θ=Φ-α
Before target point A is determined, the radius R of the sphere can be obtained from the current horizontal viewing angle β of the movement:
R=w/(2*sin(β/2))    (1)
Before target point A is determined, r1 can be obtained from the current vertical angle & of the movement:
r1=R*cos(90-&)    (2)
When calculating the vertical movement angle, the horizontal movement angle of the target point must be determined first, and its value is then used as one of the inputs of the vertical algorithm. For point A in Fig. 3, after A is determined, the horizontal movement algorithm yields the horizontal movement angle γ of the movement, from which r3 can be obtained as follows:
r3=|x|/sin(γ)    (3)
From the side-view analysis:
α=asin(r1/R), Φ=asin(r3/R)    (4)
Summing up, from formulas (1), (2), (3) and (4), the vertical movement angle θ of point A is obtained:
θ=asin(2*|x|*sin(β/2)/(w*sin(γ)))-asin(cos(90-&)*sin(β/2)/w)
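For illustration, the vertical-angle derivation can be followed via the intermediate formulas (1)-(4), computing θ = Φ − α from the intermediate radii rather than from the final closed form. The function name, the degree-based angle units, and the test values are assumptions of this sketch:

```python
import math

def vertical_angle(x, gamma_deg, beta_deg, vert_deg, w):
    """Vertical movement angle theta (degrees) per formulas (1)-(4):
    R = w/(2 sin(beta/2)), r1 = R cos(90 - &), r3 = |x|/sin(gamma),
    theta = Phi - alpha with alpha = asin(r1/R) and Phi = asin(r3/R)."""
    beta = math.radians(beta_deg)    # current horizontal viewing angle of the movement
    vert = math.radians(vert_deg)    # current vertical angle of the movement (& in the text)
    gamma = math.radians(gamma_deg)  # horizontal movement angle from step S20
    R = w / (2 * math.sin(beta / 2))         # (1) sphere radius
    r1 = R * math.cos(math.pi / 2 - vert)    # (2)
    r3 = abs(x) / math.sin(gamma)            # (3)
    alpha = math.asin(r1 / R)                # (4) side-view angles
    phi = math.asin(r3 / R)
    return math.degrees(phi - alpha)         # theta = Phi - alpha
```

Note that γ, the output of the horizontal movement algorithm, is an input here, reflecting the dependence described in step 10.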
Embodiment two:
The embodiment of the present application provides a locating method for a 3D ball machine similar in steps to embodiment one; the difference is that the locating method of this embodiment may further include the following step after the target point is located:
S50: zoom in on or out of the target point.
Specifically, step S50 of zooming in on or out of the target point may include the following steps:
S501: obtain the zoom box that the user marks in the ball machine video area.
S502: convert the zoom box into a rectangle referenced to the video resolution, and obtain the size of the reference area.
S503: calculate the optical zoom value according to the size of the reference area.
S504: control the ball machine to zoom in or out according to the optical zoom value.
Specifically, step S502 of converting the zoom box into a rectangle referenced to the video resolution and obtaining the size of the reference area may include the following steps:
S502A: calculate the width and height of the converted reference rectangle according to the coordinates of the upper-left corner, the width, and the height of the zoom box, combined with the width and height of the preset reference rectangle and the resolution of the video.
S502B: calculate the size of the reference area from the width and height of the converted reference rectangle.
The principle of zooming in on or out of the target point in this embodiment is illustrated below.
In one application scenario, the user watches the video of a front-end network ball machine through a terminal and can draw a rectangular frame in the ball machine video area to position precisely.
As shown in Fig. 4, when the mouse is dragged from the lower-right corner toward the upper-left corner on the image, the image is shrunk; dragging in the opposite direction magnifies it.
The size and direction of the drawn rectangle are delivered to the ball machine device, which computes an intelligent magnification or reduction factor.
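The drag behavior described here can be sketched as follows; the function name and the screen-coordinate convention (y increasing downward) are assumptions of this illustration:

```python
def drag_zoom_direction(start, end):
    """Classify a mouse drag over the image: dragging from the lower-right
    toward the upper-left shrinks the image, any other direction magnifies,
    per the Fig. 4 description. Coordinates are (x, y) screen pixels."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    return "shrink" if dx < 0 and dy < 0 else "magnify"
```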
For ease of calculation, and to match the internal parameters, the rectangle parameters delimited by the user and the reference width and height must first be converted into values related to the video resolution. Here the resolution is used as the device's internal reference width and height, and the rectangle drawn by the user on the screen is converted into reference values at that resolution.
The parameters used in this step are: the rectangle's upper-left corner coordinates, width, and height, and the reference rectangle's width and height. The conversion is performed as follows.
Assume the rectangle's upper-left corner coordinates are (x, y) and its width and height are w and h respectively; the reference rectangle's width and height are refw and refh; and the video resolution is W, H. Let the converted rectangle's center coordinates be x1, y1 and its width and height be w1, h1. The width, height, and center-point coordinates of the user-drawn rectangle after conversion are:
x1=x*W/refw+w*W/(2*refw)-W/2
y1=y*H/refh+h*H/(2*refh)-H/2
w1=w*W/refw
h1=h*H/refh
After the above conversion, the rectangle drawn by the user becomes a rectangle referenced to the video resolution.
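The four conversion formulas can be sketched directly in Python; the function name and the sample values are assumptions of this illustration:

```python
def to_reference_rect(x, y, w, h, refw, refh, W, H):
    """Convert a user-drawn rectangle (upper-left corner (x, y), width w,
    height h, given against a refw x refh reference surface) into a rectangle
    referenced to the W x H video resolution; returns the converted center
    (x1, y1) and width/height (w1, h1), with the origin at the frame center."""
    x1 = x * W / refw + w * W / (2 * refw) - W / 2  # center x after scaling
    y1 = y * H / refh + h * H / (2 * refh) - H / 2  # center y after scaling
    w1 = w * W / refw                               # scaled width
    h1 = h * H / refh                               # scaled height
    return x1, y1, w1, h1
```

For example, a box covering the whole refw x refh surface converts to a centered full-resolution rectangle (x1 = y1 = 0), consistent with a center-origin coordinate system.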
The width and height of the image displayed by the terminal are X and Y respectively, with default maxima of 704.0 and 576.0.
Reference area S=(w1*h1).
The size of the reference area S is compared with the maximum rectangular area (704.0*576.0):
Above 1/4 of the maximum rectangular area: an optical zoom of 8 times, for a ball machine with a maximum of 20 times.
Above 1/16 and within 1/4 of the maximum rectangular area: an optical zoom of 7 times, for a ball machine with a maximum of 20 times.
......
Above 1/(256*256) and within 1/(256*64) of the maximum rectangular area: an optical zoom of 1 time, for a ball machine with a maximum of 20 times.
And so on by analogy.
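The area-to-zoom mapping can be sketched as follows. The text states only the endpoints (8x above 1/4 of the maximum area, 1x above 1/(256*256)); the intermediate brackets are inferred to shrink by a factor of 4 per zoom step, so the thresholds are an assumption of this illustration:

```python
def optical_zoom(w1, h1, max_w=704.0, max_h=576.0):
    """Map the reference area S = w1*h1 to an optical zoom value: 8x when S
    exceeds 1/4 of the maximum rectangular area, decreasing by one zoom step
    for each further factor-of-4 drop in area, down to 1x above 1/4**8 of the
    maximum area; 0 means the box falls below every bracket."""
    s = w1 * h1
    max_area = max_w * max_h
    for level in range(8, 0, -1):              # 8x down to 1x
        if s > max_area / 4 ** (9 - level):    # 8x: > 1/4, 7x: > 1/16, ...
            return level
    return 0
```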
Embodiment three:
Referring to Fig. 5, a schematic structural diagram of a device of the embodiment of the present application, the embodiment provides a positioning device for a 3D ball machine, which may include:
an acquiring unit 50, configured to obtain the coordinate values, width, and height of the target point to be located on the screen;
a horizontal calculation unit 51, configured to calculate the horizontal movement angle of the target point;
a vertical calculation unit 52, configured to calculate the vertical movement angle of the target point according to its coordinate values, width, and height on the screen, combined with its horizontal movement angle;
a positioning unit 53, configured to locate the target point according to its horizontal movement angle and vertical movement angle.
Referring to Fig. 6, in one embodiment, the positioning device for a 3D ball machine of the application may further include:
a scaling unit 54, configured to zoom in on or out of the target point.
Referring to Fig. 7, in a preferred embodiment, the scaling unit 54 specifically includes:
an acquisition module 540, configured to obtain the zoom box that the user marks in the ball machine video area;
a conversion module 541, configured to convert the zoom box into a rectangle referenced to the video resolution to obtain the size of the reference area;
a calculation module 542, configured to calculate the optical zoom value according to the size of the reference area;
a zoom module 543, configured to control the ball machine to zoom in or out according to the optical zoom value.
In one embodiment, the above horizontal calculation unit 51 is specifically configured to:
calculate the horizontal movement angle γ of the target point according to the following formula:
γ=atan(|x|*sin(atan(2*|y|*tan(α/2)/w))/(|y|*cos(&+atan(2*|y|*tan(α/2)/w))))
where x and y are respectively the horizontal and vertical coordinate values of the target point's position, α is the current horizontal viewing angle of the ball machine movement, & is the current vertical angle of the movement, and w is the width of the video resolution;
and the above vertical calculation unit 52 is specifically configured to:
calculate the vertical movement angle θ of the target point according to the following formula:
θ=asin(2*|x|*sin(β/2)/(w*sin(γ)))-asin(cos(90-&)*sin(β/2)/w);
where γ is the horizontal movement angle of the target point, β is the current horizontal viewing angle of the ball machine movement, & is the current vertical angle of the movement, and w is the width of the video resolution.
With the locating method and device for 3D ball machines of the application, the coordinate values, width, and height of the target point to be located on the screen are obtained, and the horizontal movement angle of the target point is calculated; then, according to the coordinate values, width, and height of the target point on the screen, combined with its horizontal movement angle, the vertical movement angle of the target point is calculated; finally, the target point is located according to its horizontal movement angle and vertical movement angle. Because the positioning combines the horizontal angle and the vertical angle, it is accurate; moreover, the zoom of the application's scheme can scale intelligently according to the positioning accuracy, reducing the error introduced by optical zoom. In addition, after 3D positioning with this embodiment's scheme, the user can immediately restore the original state before the positioning, including the horizontal angle, the vertical angle, and the magnification or reduction factor.
The above content is a further detailed description of the present invention in combination with specific embodiments, and the specific implementation of the invention cannot be regarded as being limited to these descriptions. For those of ordinary skill in the technical field of the invention, several simple deductions or substitutions can also be made without departing from the concept of the present invention.
Claims (10)
1. A locating method for a 3D ball machine, characterized by including:
obtaining the coordinate values, width, and height of a target point to be located on the screen;
calculating the horizontal movement angle of the target point;
calculating the vertical movement angle of the target point according to its coordinate values, width, and height on the screen, combined with its horizontal movement angle;
locating the target point according to its horizontal movement angle and vertical movement angle.
2. The locating method for a 3D ball machine as claimed in claim 1, characterized in that, after the target point is located, the method further includes:
zooming in on or out of the target point.
3. The locating method for a 3D ball machine as claimed in claim 2, characterized in that zooming in on or out of the target point includes:
obtaining the zoom box that the user marks in the ball machine video area;
converting the zoom box into a rectangle referenced to the video resolution, to obtain the size of the reference area;
calculating the optical zoom value according to the size of the reference area, and controlling the ball machine to zoom in or out according to the optical zoom value.
4. The locating method for a 3D ball machine as claimed in claim 3, characterized in that converting the zoom box into a rectangle referenced to the video resolution to obtain the size of the reference area includes:
calculating the width and height of the converted reference rectangle according to the coordinates of the upper-left corner, the width, and the height of the zoom box, combined with the width and height of a preset reference rectangle and the resolution of the video;
calculating the size of the reference area from the width and height of the converted reference rectangle.
5. The locating method for a 3D ball machine as claimed in any one of claims 1-4, characterized in that calculating the horizontal movement angle of the target point includes:
calculating the horizontal movement angle γ of the target point according to the following formula:
γ=atan(|x|*sin(atan(2*|y|*tan(α/2)/w))/(|y|*cos(&+atan(2*|y|*tan(α/2)/w))))
where x and y are respectively the horizontal and vertical coordinate values of the target point's position, α is the current horizontal viewing angle of the ball machine movement, & is the current vertical angle of the movement, and w is the width of the video resolution.
6. The locating method for a 3D ball machine as claimed in claim 5, characterized in that calculating the vertical movement angle of the target point according to its coordinate values, width, and height on the screen, combined with its horizontal movement angle, includes:
calculating the vertical movement angle θ of the target point according to the following formula:
θ=asin(2*|x|*sin(β/2)/(w*sin(γ)))-asin(cos(90-&)*sin(β/2)/w);
where γ is the horizontal movement angle of the target point, β is the current horizontal viewing angle of the ball machine movement, & is the current vertical angle of the movement, and w is the width of the video resolution.
7. A positioning device for a 3D ball machine, characterized by including:
an acquiring unit, configured to obtain the coordinate values, width, and height of a target point to be located on the screen;
a horizontal calculation unit, configured to calculate the horizontal movement angle of the target point;
a vertical calculation unit, configured to calculate the vertical movement angle of the target point according to its coordinate values, width, and height on the screen, combined with its horizontal movement angle;
a positioning unit, configured to locate the target point according to its horizontal movement angle and vertical movement angle.
8. The positioning device for a 3D ball machine according to claim 7, further comprising:
a scaling unit, configured to zoom in or zoom out on the target point.
9. The positioning device for a 3D ball machine according to claim 8, wherein the scaling unit comprises:
an acquisition module, configured to acquire a size box marked by a user in the video area of the ball machine;
a conversion module, configured to convert the size box into a rectangle referenced to the video resolution, so as to obtain the size of the reference area;
a calculation module, configured to calculate the numerical value of the optical zoom according to the size of the reference area;
a zoom module, configured to control the ball machine to zoom in or zoom out according to the numerical value of the optical zoom.
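The claim 9 pipeline (size box → reference rectangle at video resolution → optical zoom value) could be sketched as below. The patent text does not disclose the actual conversion or zoom formulas, so the choices here (expanding the box to the frame's aspect ratio, and taking the magnification that makes it fill the frame) and all function and parameter names are hypothetical illustrations only.

```python
def zoom_from_size_box(box_w, box_h, frame_w=1920, frame_h=1080):
    """Hypothetical sketch of the claim 9 scaling units.

    The user-drawn size box is converted into a reference rectangle
    with the same aspect ratio as the video resolution; the optical
    zoom value is then the magnification needed for that reference
    area to fill the whole frame.  This is one plausible reading,
    not the formula actually claimed in the patent.
    """
    # Conversion module: expand the box to the frame's aspect ratio.
    ref_w = max(box_w, box_h * frame_w / frame_h)
    ref_h = ref_w * frame_h / frame_w
    # Calculation module: zoom value that makes the box fill the frame.
    zoom = frame_w / ref_w
    return ref_w, ref_h, zoom
```

A box one quarter of the frame size would, under this reading, yield an optical zoom value of 4x.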
10. The positioning device for a 3D ball machine according to any one of claims 7-9, wherein the horizontal calculation unit is specifically configured to:
calculate the horizontal movement angle γ of the target point according to the following formula:
γ = atan(|x| * sin(atan(2*|y|*tan(α/2)/w)) / (|y| * cos(& + atan(2*|y|*tan(α/2)/w))))
wherein x and y are respectively the horizontal and vertical coordinate values of the position of the target point, α is the current horizontal field angle of the ball machine, w is the width of the video resolution, and & is the current vertical angle of the ball machine;
and the vertical calculation unit is specifically configured to:
calculate the vertical movement angle θ of the target point according to the following formula:
θ = asin(2*|x|*sin(β/2)/(w*sin(γ))) - asin(cos(90° - &)*sin(β/2)/w);
wherein γ is the horizontal movement angle of the target point, β is the current horizontal field angle of the ball machine, & is the current vertical angle of the ball machine, and w is the width of the video resolution.
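The angle formulas of claims 5, 6 and 10 can be sketched in code as follows. This is only an illustrative reading of the claims, not the patented implementation: the claims' doubled "atan(atan(...))" is read here as a single atan (the doubling looks like an extraction artifact), all angles are assumed to be in degrees, and the function and parameter names are mine.

```python
import math

def pan_tilt_angles(x, y, w, alpha_deg, beta_deg, tilt_deg):
    """Sketch of the pan/tilt formulas in claims 5, 6 and 10.

    x, y      -- screen coordinates of the target point
    w         -- width of the video resolution (pixels)
    alpha_deg -- current horizontal field angle alpha (degrees)
    beta_deg  -- field angle beta used in the vertical formula (degrees)
    tilt_deg  -- current vertical angle of the ball machine ("&" in
                 the claims), in degrees
    """
    alpha = math.radians(alpha_deg)
    beta = math.radians(beta_deg)
    tilt = math.radians(tilt_deg)

    # Claim 5: horizontal movement angle gamma
    # (the inner atan term is the tilt offset toward the point).
    inner = math.atan(2 * abs(y) * math.tan(alpha / 2) / w)
    gamma = math.atan(abs(x) * math.sin(inner) /
                      (abs(y) * math.cos(tilt + inner)))

    # Claim 6: vertical movement angle theta; cos(90 - &) is taken
    # with degree arguments as written in the claim.
    theta = (math.asin(2 * abs(x) * math.sin(beta / 2) / (w * math.sin(gamma)))
             - math.asin(math.cos(math.radians(90 - tilt_deg))
                         * math.sin(beta / 2) / w))

    return math.degrees(gamma), math.degrees(theta)
```

Note that the asin arguments must stay within [-1, 1], which bounds the usable range of |x| relative to the frame width; the claims do not state how out-of-range inputs are handled.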
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611088774.XA CN106648360B (en) | 2016-11-30 | 2016-11-30 | Positioning method and device of 3D ball machine |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611088774.XA CN106648360B (en) | 2016-11-30 | 2016-11-30 | Positioning method and device of 3D ball machine |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106648360A true CN106648360A (en) | 2017-05-10 |
CN106648360B CN106648360B (en) | 2020-11-17 |
Family
ID=58814787
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201611088774.XA Active CN106648360B (en) | 2016-11-30 | 2016-11-30 | Positioning method and device of 3D ball machine |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106648360B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110248148A (en) * | 2018-09-25 | 2019-09-17 | 浙江大华技术股份有限公司 | A kind of method and device of determining positional parameter |
CN111210472A (en) * | 2019-12-31 | 2020-05-29 | 山东信通电子股份有限公司 | 3D positioning method, device, equipment and medium for video picture |
WO2021134507A1 (en) * | 2019-12-31 | 2021-07-08 | 海能达通信股份有限公司 | Video monitoring positioning method and video monitoring system |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4240108A (en) * | 1977-10-03 | 1980-12-16 | Grumman Aerospace Corporation | Vehicle controlled raster display system |
CN102291571A (en) * | 2011-08-11 | 2011-12-21 | 杭州华三通信技术有限公司 | Method and device for realizing frame-pulling scaling in monitoring system |
CN102915043A (en) * | 2012-10-17 | 2013-02-06 | 天津市亚安科技股份有限公司 | Method for increasing location accuracy of cloud platform |
CN102932598A (en) * | 2012-11-06 | 2013-02-13 | 苏州科达科技股份有限公司 | Method for intelligently tracking image on screen by camera |
CN103905792A (en) * | 2014-03-26 | 2014-07-02 | 武汉烽火众智数字技术有限责任公司 | 3D positioning method and device based on PTZ surveillance camera |
CN104125390A (en) * | 2013-04-28 | 2014-10-29 | 浙江大华技术股份有限公司 | Method and device for locating spherical camera |
CN104504685A (en) * | 2014-12-04 | 2015-04-08 | 高新兴科技集团股份有限公司 | Enhanced reality video camera virtual tag real-time high-precision positioning method |
- 2016-11-30: CN application CN201611088774.XA filed; granted as CN106648360B (status: Active)
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4240108A (en) * | 1977-10-03 | 1980-12-16 | Grumman Aerospace Corporation | Vehicle controlled raster display system |
CN102291571A (en) * | 2011-08-11 | 2011-12-21 | 杭州华三通信技术有限公司 | Method and device for realizing frame-pulling scaling in monitoring system |
CN102915043A (en) * | 2012-10-17 | 2013-02-06 | 天津市亚安科技股份有限公司 | Method for increasing location accuracy of cloud platform |
CN102932598A (en) * | 2012-11-06 | 2013-02-13 | 苏州科达科技股份有限公司 | Method for intelligently tracking image on screen by camera |
CN104125390A (en) * | 2013-04-28 | 2014-10-29 | 浙江大华技术股份有限公司 | Method and device for locating spherical camera |
CN103905792A (en) * | 2014-03-26 | 2014-07-02 | 武汉烽火众智数字技术有限责任公司 | 3D positioning method and device based on PTZ surveillance camera |
CN104504685A (en) * | 2014-12-04 | 2015-04-08 | 高新兴科技集团股份有限公司 | Enhanced reality video camera virtual tag real-time high-precision positioning method |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110248148A (en) * | 2018-09-25 | 2019-09-17 | 浙江大华技术股份有限公司 | A kind of method and device of determining positional parameter |
CN110248148B (en) * | 2018-09-25 | 2022-04-15 | 浙江大华技术股份有限公司 | Method and device for determining positioning parameters |
CN111210472A (en) * | 2019-12-31 | 2020-05-29 | 山东信通电子股份有限公司 | 3D positioning method, device, equipment and medium for video picture |
WO2021134507A1 (en) * | 2019-12-31 | 2021-07-08 | 海能达通信股份有限公司 | Video monitoring positioning method and video monitoring system |
Also Published As
Publication number | Publication date |
---|---|
CN106648360B (en) | 2020-11-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Huang et al. | A 3D GIS-based interactive registration mechanism for outdoor augmented reality system | |
CN106797456B (en) | Projected picture correcting method, means for correcting and robot | |
CN103631698B (en) | Camera PTZ (pan/tilt/zoom) control method and device for target tracking | |
KR101626065B1 (en) | Apparatus and method for markerless motion capturing | |
JP5586765B2 (en) | Camera calibration result verification apparatus and method | |
CN107358633A (en) | Join scaling method inside and outside a kind of polyphaser based on 3 points of demarcation things | |
CN109074083A (en) | Control method for movement, mobile robot and computer storage medium | |
CN105701828B (en) | A kind of image processing method and device | |
CN106648360A (en) | Locating method and device for 3D ball machine | |
WO2021129305A1 (en) | Calibration rod testing method for optical motion capture system, device, apparatus, and storage medium | |
CN103686065A (en) | Cloud mirror cluster control method and device of monitoring equipment based on GIS (geographic information system) interoperability | |
CN101894380B (en) | Method for tracing target object in panoramic video automatically | |
CN102917171A (en) | Small target locating method based on pixel | |
CN103258329A (en) | Camera calibration method based on one-dimensional feature of balls | |
CN101636748A (en) | The coupling based on frame and pixel of the graphics images to camera frames for computer vision that model generates | |
CN112215308B (en) | Single-order detection method and device for hoisted object, electronic equipment and storage medium | |
JP2009210331A (en) | Camera calibration apparatus and camera calibration method | |
CN112422653A (en) | Scene information pushing method, system, storage medium and equipment based on location service | |
CN105631454B (en) | A kind of ball machine localization method, equipment and ball machine | |
CN103617631A (en) | Tracking method based on center detection | |
CN108717704A (en) | Method for tracking target, computer installation based on fish eye images and computer readable storage medium | |
CN104125390B (en) | A kind of localization method and device for ball-shaped camera | |
CN112258641A (en) | Automatic configuration system and method for inspection point, storage medium, equipment and robot | |
Carozza et al. | Image-based localization for an indoor VR/AR construction training system | |
JP2019121176A (en) | Position specifying apparatus, position specifying method, position specifying program, and camera apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CP03 | Change of name, title or address | ||
Address after: 518000 6th floor, building 3, Nanyou 4th Industrial Zone, Nanshan Avenue, Nanshan District, Shenzhen City, Guangdong Province
Patentee after: Shenzhen Sanjiang Intelligent Control Technology Co.,Ltd.
Address before: 518054 6th floor, building 3, Nanyou 4th Industrial Zone, Nanshan Avenue, Nanshan District, Shenzhen City, Guangdong Province
Patentee before: SHENZHEN FANHAI SANJIANG TECHNOLOGY DEVELOPMENT Co.,Ltd.