CN107065935A - Pan-tilt head control method and device for optical flow positioning, and target tracking system - Google Patents
Pan-tilt head control method and device for optical flow positioning, and target tracking system
- Publication number
- CN107065935A CN107065935A CN201710178765.8A CN201710178765A CN107065935A CN 107065935 A CN107065935 A CN 107065935A CN 201710178765 A CN201710178765 A CN 201710178765A CN 107065935 A CN107065935 A CN 107065935A
- Authority
- CN
- China
- Prior art keywords
- light source
- point
- optical sensor
- target light
- value
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D3/00—Control of position or direction
- G05D3/12—Control of position or direction using feedback
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Automation & Control Theory (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The present invention discloses a pan-tilt head control method and device for optical flow positioning, and a target tracking system. In the method, a lens, a filter plate and an optical sensor are arranged in sequence on the pan-tilt head. Light emitted by a target light source mounted on the object to be tracked enters the lens and, after being filtered by the filter plate, strikes the optical sensor. The method comprises the following steps: obtaining the spot area that the target light source projects onto the optical sensor; calculating the distance between the target light source point in the spot area and the centre of the optical sensor; and rotating the pan-tilt head until the target light source point lies at the centre of the optical sensor. Using the optical parameters projected by the target light source onto the optical sensor, with the sensor centre as the reference point, the present invention rotates the pan-tilt head until the target light source point coincides with the sensor centre, so that the object to be tracked is tracked and precisely located, improving tracking efficiency.
Description
Technical field
The present invention relates to the field of image processing, and in particular to a pan-tilt head control method and device for optical flow positioning, and a target tracking system.
Background technology
Current target tracking and localization methods generally rely on matching a large number of feature points and then identifying those target feature points; face recognition frameworks, for example, complete recognition, and then localization or tracking, by extracting numerous facial features. With such methods, once the face turns sideways or rotates during motion, tracking easily fails; current target tracking is therefore easily affected by environmental changes and other factors that make feature-point matching fail.
Optical flow is a simple and practical representation of image motion. It is usually defined as the apparent motion of the brightness pattern in an image sequence, i.e. the expression, on the imaging plane of a vision sensor, of the velocity of points moving on the surfaces of objects in space. With the development of science and technology, research on optical flow has come to play a key role in computer vision tasks such as object segmentation, recognition and tracking. Optical flow is an important method of motion image analysis: a target light source is placed on the object to be tracked, the position at which its light falls on the optical sensor behind the lens is obtained, and the object is tracked or located with the pan-tilt head. Optical flow thus expresses the change of the image; because it carries information about the target's motion, an observer can use it to determine how the target is moving and thereby track or locate the target object. In the prior art, however, as the light source moves, the spot falling on the optical sensor also shifts position, and tracking performance suffers when the offset is large.
Summary of the invention
The technical problem to be solved by the embodiments of the present invention is therefore the prior-art defect that, when optical flow is used for target tracking, the position at which the light source falls on the sensor behind the lens shifts as the light source moves, causing tracking errors. To this end, the embodiments of the present invention provide the following technical solutions:
An embodiment of the present invention provides a pan-tilt head control method for optical flow positioning. A lens, a filter plate and an optical sensor are arranged in sequence on the pan-tilt head; light emitted by a target light source mounted on the object to be tracked enters the lens and, after being filtered by the filter plate, strikes the optical sensor. The method comprises the following steps: obtaining the spot area that the target light source projects onto the optical sensor; calculating the distance between the target light source point in the spot area and the centre of the optical sensor; and rotating the pan-tilt head until the target light source point lies at the centre of the optical sensor.
Optionally, obtaining the spot area that the target light source projects onto the optical sensor includes: establishing a data coordinate map whose data are equivalent to the lattice of light source points on the optical sensor; setting a brightness threshold for the target light source and a length-to-width ratio for the spot area; reading the value at each coordinate point of the data coordinate map, the value being the brightness of the light source; judging whether the value exceeds the brightness threshold; if it does, judging whether the shape formed by the qualifying points constitutes a rectangular area, and if it does not, filtering out the stray light at that brightness; if the points do form a rectangular area, calculating the length-to-width ratio of that rectangle; judging whether this ratio equals the preset spot-area length-to-width ratio; and, if it does not, filtering out the stray light in that rectangular area as well.
Optionally, rotating the pan-tilt head until the target light source point lies at the centre of the optical sensor includes: selecting a reference judgement point at the centre of the data coordinate map; calculating the integer average of the rectangular area's horizontal and vertical coordinates; obtaining, from the integer average, the equivalent point of the target light source point; comparing the horizontal and vertical coordinate distances between the equivalent point and the reference judgement point; and, according to those distances, rotating the pan-tilt head in the corresponding directions.
Optionally, the pan-tilt head control method for optical flow positioning further includes: setting a correspondence between the brightness value of the target light source point and the actual distance between the pan-tilt head and the target light source.
An embodiment of the present invention provides a pan-tilt head control device for optical flow positioning. A lens, a filter plate and an optical sensor are arranged in sequence on the pan-tilt head; light emitted by a target light source mounted on the object to be tracked enters the lens and, after being filtered by the filter plate, strikes the optical sensor. The device includes: an acquisition module for obtaining the spot area that the target light source projects onto the optical sensor; a calculation module for calculating the distance between the target light source point in the spot area and the centre of the optical sensor; and a control module for rotating the pan-tilt head until the target light source point lies at the centre of the optical sensor.
Optionally, the acquisition module includes: an establishing unit for establishing a data coordinate map whose data are equivalent to the lattice of light source points on the optical sensor; a setting unit for setting the brightness threshold of the target light source and the spot-area length-to-width ratio; a reading unit for reading the value at each coordinate point of the data coordinate map, the value being the brightness of the light source; a first judging unit for judging whether the value exceeds the brightness threshold; a second judging unit for judging, when the value exceeds the threshold, whether the shape formed by the data points constitutes a rectangular area; a first filtering unit for filtering out the stray light at that brightness when the first judging unit finds the value does not exceed the threshold; a first calculating unit for calculating the rectangle's length-to-width ratio when the data points do form a rectangular area; a third judging unit for judging whether the rectangle's length-to-width ratio equals the spot-area ratio; and a second filtering unit for filtering out the stray light in the rectangular area when the third judging unit finds the rectangle's ratio does not equal the spot-area ratio.
Optionally, the control module includes: a selecting unit for selecting a reference judgement point at the centre of the data coordinate map; a second calculating unit for calculating the integer average of the rectangular area's coordinates; an acquiring unit for obtaining, from the integer average, the equivalent point of the target light source point; a comparing unit for comparing the horizontal and vertical coordinate distances between the equivalent point and the reference judgement point; and a control unit for rotating the pan-tilt head in the corresponding directions according to those distances.
Optionally, the pan-tilt head control device for optical flow positioning further includes: a setting module for setting a correspondence between the brightness value of the target light source point and the actual distance between the pan-tilt head and the target light source.
An embodiment of the present invention provides a target tracking system, including: a target light source, mounted on the object to be tracked; a pan-tilt head, rotatable in multiple directions; a lens, arranged on the pan-tilt head, which collects the light emitted by the target light source; a filter plate, arranged behind the lens, which filters interfering light out of the incident light; an optical sensor, arranged behind the filter plate, on which the filtered light forms a spot; a camera, arranged on the pan-tilt head, for photographing the object to be tracked; and the pan-tilt head control device described above, for rotating the pan-tilt head so that the light source point of the target light source coincides with the centre of the optical sensor.
The technical solutions of the embodiments of the present invention have the following advantages:
The present invention discloses a pan-tilt head control method and device for optical flow positioning, and a target tracking system. In the method, a lens, a filter plate and an optical sensor are arranged in sequence on the pan-tilt head; light emitted by a target light source mounted on the object to be tracked enters the lens and, after being filtered, strikes the optical sensor. The method comprises: obtaining the spot area that the target light source projects onto the optical sensor; calculating the distance between the target light source point in the spot area and the centre of the optical sensor; and rotating the pan-tilt head until the target light source point lies at the sensor centre. Using the optical parameters projected onto the sensor, with the sensor centre as the reference point, the pan-tilt head is rotated until the target light source point coincides with the sensor centre, so that the object to be tracked is tracked and precisely located, improving tracking efficiency.
Brief description of the drawings
In order to explain the specific embodiments of the present invention or the prior-art technical solutions more clearly, the drawings needed for describing the specific embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flow chart of the pan-tilt head control method for optical flow positioning in Embodiment 1 of the present invention;
Fig. 2 is another flow chart of the pan-tilt head control method for optical flow positioning in Embodiment 1;
Fig. 3 is the data coordinate map from which the method of Embodiment 1 obtains the spot area;
Fig. 4 is the data coordinate map showing the method of Embodiment 1 driving the target light source point into coincidence with the centre of the optical sensor;
Fig. 5 is a structural block diagram of the pan-tilt head control device for optical flow positioning in Embodiment 2;
Fig. 6 is another structural block diagram of the pan-tilt head control device for optical flow positioning in Embodiment 2;
Fig. 7 is a structural schematic of the target tracking system in Embodiment 3.
Detailed description of the embodiments
The technical solutions of the embodiments of the present invention are described below clearly and completely with reference to the drawings. Obviously, the embodiments described are only some, not all, of the embodiments of the invention. All other embodiments obtained from the embodiments of the present invention by those of ordinary skill in the art without creative effort fall within the scope of protection of the present invention.
In the description of the embodiments of the present invention, it should be noted that orientation or positional terms such as "centre", "upper", "lower", "left", "right", "vertical", "horizontal", "inner" and "outer" indicate orientations or positional relationships as shown in the drawings; they are used only to simplify the description of the embodiments and do not indicate or imply that the devices or elements referred to must have a particular orientation or be constructed and operated in a particular orientation, and are therefore not to be construed as limiting the invention. Furthermore, the terms "first", "second" and "third" are used for description only and are not to be understood as indicating or implying relative importance.
In the description of the embodiments of the present invention, it should also be noted that, unless otherwise clearly specified and limited, the terms "mounted", "connected" and "coupled" are to be understood broadly: the connection may be fixed, detachable or integral; mechanical or electrical; direct, or indirect through an intermediary, or internal between two elements; wireless or wired. Those of ordinary skill in the art can understand the specific meanings of the above terms in the present invention according to the specific circumstances.
In addition, the technical features involved in the different embodiments of the invention described below may be combined with each other as long as they do not conflict.
Embodiment 1
An embodiment of the present invention provides a pan-tilt head control method for optical flow positioning. A lens, a filter plate and an optical sensor are arranged in sequence on the pan-tilt head; light emitted by a target light source mounted on the object to be tracked enters the lens and, after being filtered by the filter plate, strikes the optical sensor. As shown in Fig. 1, the method comprises the following steps:
S1. Obtain the spot area that the target light source projects onto the optical sensor.
Any light source can be used here, for example an 850 nm infrared source; the method is not limited to 850 nm infrared, and light of any waveband will do. The target light source is whatever source meets the tracking requirement: for example, an 850 nm narrow-band filter plate removes all other light so that only the 850 nm source is tracked, and the pan-tilt head is rotated, under the control of the optical parameters projected onto the sensor, until the target light source point coincides with the centre point. By moving the pan-tilt head (the body carrying the lens, filter plate and optical sensor) so that the lens points at the target light source point, rotation of the head tracks or locates the object to be tracked.
As an optional implementation, step S1 of obtaining the spot area that the target light source projects onto the optical sensor includes, as shown in Fig. 2:
S11. Establish a data coordinate map whose data are equivalent to the lattice of light source points on the optical sensor. The data coordinate map here is built on horizontal and vertical coordinate axes for the relevant parameters; in this embodiment it is used to obtain the brightness parameters of the light and the positions of the light source points within the coordinates. The data coordinate map is in effect the data equivalent of the optical sensor, composed of a lattice of many points. As shown in Fig. 3, the lattice of light source points projected onto the optical sensor (SENSOR) has a resolution of 10*10; a 10*10-resolution SENSOR is used here to explain the principle.
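The 10*10 data coordinate map described above can be pictured as a plain array of brightness readings. This is an illustrative Python model, not part of the patent; the cell values are hypothetical and chosen to match the figures discussed in the later steps.

```python
# Illustrative model of the 10*10 "data coordinate map": a grid of sensed
# brightness values, 0 where no target light falls (hypothetical values).
grid = [[0] * 10 for _ in range(10)]  # grid[y][x], 10*10 SENSOR lattice

# A hypothetical 3-column x 2-row spot cast by the target light source:
spot = {(7, 7): 144, (8, 7): 221, (9, 7): 177,
        (7, 8): 232, (8, 8): 200, (9, 8): 212}
for (x, y), brightness in spot.items():
    grid[y][x] = brightness
```

Reading the map then simply means iterating over `grid` and inspecting each cell's brightness value.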
S12. Set the brightness threshold of the target light source and the spot-area length-to-width ratio. The optical parameters of the target light source include its brightness. To obtain the brightness value of the target light source, a brightness threshold is set in advance as a reference value for the control algorithm, say 143. The light of the target light source passes through the lens and forms a spot area on the optical sensor. Similarly, to identify the target light source, the length-to-width ratio of the target spot area is preset as another reference value for the control algorithm, say 3:2.
S13. Read the value at each coordinate point of the data coordinate map; the value is the brightness of the light source. In theory every grid cell of Fig. 3 holds a number; cells receiving no target light can be taken as 0. The number in a cell represents the sensed brightness, and in general the sensed brightness of every point of the SENSOR is read.
S14. Judge whether the value exceeds the brightness threshold. Analysing the points of Fig. 3 whose value exceeds 143 (the preset brightness threshold) and ignoring all points below 143 yields the regions in the two boxes. The horizontal and vertical coordinates of these two regions are read, and the extents formed by those coordinates, i.e. the length and width of each region, are determined.
S15. If a value exceeds the brightness threshold, judge whether the shape formed by the data points constitutes a rectangular area. In Fig. 3, the values 144, 221, 177, 232, 200 and 212 in the first, larger box, and the values 220 and 210 in the second, smaller box, are all greater than 143, so they all exceed the preset brightness threshold. Once this requirement is met, it is judged whether the data points in each of the two boxes enclose a rectangular region.
S16. If a value does not exceed the brightness threshold, filter out the stray light at that brightness. All points whose brightness is judged in step S14 not to exceed the threshold 143 are ignored, which filters out stray light that is not the target light source.
S17. If the data points can form a rectangular area, calculate its length-to-width ratio. The spot images that can form rectangular regions are analysed; in general, the length and width of the rectangle are examined. As shown in Fig. 3, the first, larger box satisfies the rectangle requirement, and reading off X columns : Y rows = 3:2 gives the rectangle's length-to-width ratio. This embodiment takes a rectangle as the reference shape, but other shapes could of course be chosen.
S18. Judge whether the rectangle's length-to-width ratio equals the spot-area length-to-width ratio. In step S17 the reading X columns : Y rows = 3:2, i.e. the rectangle's length-to-width ratio, matches the target spot-area ratio preset in step S12.
S19. If a rectangle's length-to-width ratio does not equal the spot-area ratio, filter out the stray light in that rectangular area as well. Reading the second, smaller box of Fig. 3 gives X columns : Y rows = 2:1, which does not satisfy the target spot-area ratio and is rejected. The region formed on the optical sensor by the spot projected by the target light source is thereby obtained as the final target spot area; the judgement of the target light source's spot shape is complete, stray light has been filtered out accordingly, and interference is avoided.
S2. Calculate the distance between the target light source point in the spot area and the centre of the optical sensor. The target light source point lies in the target spot area. The positions of the target light source point and of the sensor centre on the optical sensor are obtained, and the distance between them is compared with the sensor centre as the reference. Using the data coordinate map established in step S11, the distance between the projected target light source point and the sensor centre is calculated, so that the pan-tilt head can be rotated to bring the two into coincidence.
S3. Rotate the pan-tilt head until the target light source point lies at the centre of the optical sensor. The light emitted by the source enters the lens, passes the filter plate and strikes the optical sensor; as the source moves, the data of the light source point falling on the sensor change. Using this pattern of change and the distance between the target light source point and the sensor centre calculated in step S2, the amounts by which the horizontal motor and pitch motor on the pan-tilt head must move are determined, and the head is rotated until the target light source point falls at the centre of the sensor. In this way the source can move arbitrarily while the head stays aimed at the target light source point at every moment, achieving locating and tracking.
As an optional implementation of the pan-tilt head control method for optical flow positioning in this embodiment, step S3 of rotating the pan-tilt head until the target light source point lies at the centre of the optical sensor includes:
First step: select a reference judgement point at the centre of the data coordinate map. As shown in Fig. 4, the resolution of the optical sensor (SENSOR) is 10*10; the point (X5, Y6) at the centre of the SENSOR is set as the reference judgement point. The rectangular box is the spot area that the target light source point casts on the SENSOR, and a point taken on this rectangular area represents the equivalent point of the target light source within the spot area.
Second step: calculate the integer average of the rectangular area's coordinates. The rectangular area matching the target spot area is settled by computing the centre point of the region. In Fig. 4 the effective columns of the rectangular box are 7, 8 and 9, and the effective rows are 7, 8 and 9. The algorithm is: average of abscissa X = sum of abscissas / number of columns, and average of ordinate Y = sum of ordinates / number of rows; both averages are then rounded to integers.
Third step: obtain the equivalent point of the target light source point from the integer averages. As in the first step, the effective columns of the rectangular box are 7, 8 and 9 and the effective rows are 7, 8 and 9, so by the formula, average X = (7+8+9)/3 = 8 and average Y = (7+8+9)/3 = 8, making the equivalent point (X8, Y8).
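The second and third steps can be condensed into a small helper. This is a sketch under the assumption that the spot region is given as a set of (x, y) cells; it reproduces the worked example's equivalent point (8, 8).

```python
# Integer-average "equivalent point" of a spot region: average the distinct
# effective columns and rows, then truncate to integers, as in the worked
# example where columns/rows 7, 8, 9 give the point (X8, Y8).
def equivalent_point(region):
    cols = sorted({x for x, _ in region})   # distinct effective columns
    rows = sorted({y for _, y in region})   # distinct effective rows
    return (sum(cols) // len(cols), sum(rows) // len(rows))
```

For the Fig. 4 region spanning columns 7-9 and rows 7-9, `equivalent_point` returns (8, 8).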
Fourth step: compare the horizontal and vertical coordinate distances between the equivalent point and the reference judgement point. The third step gave the equivalent point (X8, Y8), which is compared with the reference judgement point (X5, Y6). Since X8 > X5, the equivalent point of the target light source lies to the right of the SENSOR centre (with the SENSOR as the frame of reference, the target light source itself is on the SENSOR's left side), so a command is sent for the head's horizontal motor to rotate left; likewise, since Y8 > Y6, the head's pitch motor is driven upward. As the head rotates, the reference judgement point effectively moves toward the upper-right corner of Fig. 4 until it coincides with the equivalent point; once the abscissas and ordinates are identical, the head's motors are stopped.
Fifth step: according to the horizontal and vertical coordinate distances between the equivalent point and the reference judgement point, rotate the head in the corresponding directions.
The detailed process of controlling the head's motion is: if the abscissa of the target light source's equivalent point is greater than that of the reference judgement point, drive the horizontal motor to the left; if it is smaller, drive the horizontal motor to the right; if they are equal, stop the horizontal motor. If the ordinate of the equivalent point is greater than that of the reference judgement point, drive the pitch motor upward; if it is smaller, drive the pitch motor downward; if they are equal, stop the pitch motor.
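The fifth-step rules reduce to sign comparisons. A hedged sketch, using the reference judgement point (X5, Y6) from Fig. 4; the command names are illustrative, not taken from the patent:

```python
# Direction decision for the two motors, per the fifth-step rules.
def motor_commands(equiv, ref=(5, 6)):
    ex, ey = equiv
    rx, ry = ref
    # Horizontal motor: equivalent point right of reference -> rotate left.
    pan = "left" if ex > rx else "right" if ex < rx else "stop"
    # Pitch motor: equivalent point above reference -> move up.
    tilt = "up" if ey > ry else "down" if ey < ry else "stop"
    return pan, tilt
```

For the worked example, `motor_commands((8, 8))` yields `("left", "up")`, matching the fourth step's decisions.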
As an optional implementation, the pan-tilt head control method for optical flow positioning in this embodiment further includes: setting a correspondence between the brightness value of the target light source point and the actual distance between the head and the target light source.
Specifically, the farther the light source is from the head, the lower its brightness: for the same source, the values registered in the SENSOR lattice cells fall as the distance grows. For example, for the same target light source the brightness value is 220 at a distance of 2 m from the head, 145 at 3 m, 132 at 4 m and 125 at 5 m. Combining this with the brightness threshold of step S12 matches the target light source's effect: with the preset threshold of 143, a distance limit is reached.
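With the sample readings above, the threshold of 143 implicitly caps the working distance. A small sketch; the distance-brightness pairs are the ones quoted in this paragraph, and the linear structure of the mapping is not claimed by the patent:

```python
# Brightness of the same source at several distances (values from the text),
# filtered by the preset threshold 143: only 2 m and 3 m remain trackable.
samples = {2: 220, 3: 145, 4: 132, 5: 125}   # distance (m) -> brightness
threshold = 143
trackable = sorted(d for d, b in samples.items() if b > threshold)
```

Here `trackable` comes out as `[2, 3]`: readings at 4 m and 5 m fall below the threshold and would be filtered out as stray light.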
Embodiment 2
An embodiment of the present invention provides a pan-tilt head control device for optical flow positioning. A lens, a filter plate and an optical sensor are arranged in sequence on the pan-tilt head; light emitted by a target light source mounted on the object to be tracked enters the lens and, after being filtered by the filter plate, strikes the optical sensor. As shown in Fig. 5, the device includes:
An acquisition module 51, for obtaining the spot area that the target light source projects onto the optical sensor;
A calculation module 52, for calculating the distance between the target light source point in the spot area and the centre of the optical sensor;
A control module 53, for rotating the pan-tilt head until the target light source point lies at the centre of the optical sensor.
As an optional implementation of the pan-tilt head control device for optical flow positioning in this embodiment, the acquisition module 51 includes, as shown in Fig. 6:
An establishing unit 511, for establishing a data coordinate map whose data are equivalent to the lattice of light source points on the optical sensor;
A setting unit 512, for setting the brightness threshold of the target light source and the spot-area length-to-width ratio;
A reading unit 513, for reading the value at each coordinate point of the data coordinate map, the value being the brightness of the light source;
A first judging unit 514, for judging whether the value exceeds the brightness threshold;
A second judging unit 515, for judging, when the value exceeds the brightness threshold, whether the shape formed by the data points constitutes a rectangular area;
A first filtering unit 516, for filtering out the stray light at that brightness when the first judging unit finds the value does not exceed the threshold;
A first calculating unit 517, for calculating the rectangle's length-to-width ratio when the data points can form a rectangular area;
A third judging unit 518, for judging whether the rectangle's length-to-width ratio equals the spot-area length-to-width ratio;
A second filtering unit 519, for filtering out the stray light in the rectangular area when the third judging unit finds the rectangle's ratio does not equal the spot-area ratio.
As an optional implementation of the pan-tilt head control device for optical flow positioning in this embodiment, the control module 53 includes:
Unit is chosen, is judged a little for choosing benchmark at the center of data coordinates figure;
Second computing unit, the integer average value for calculating rectangular area length and width ratio;
Acquiring unit, for according to integer average value, obtaining the equivalent point of target light source point;
Comparing unit, judges the transverse and longitudinal coordinate of point apart from size for comparing equivalent point and benchmark;
Control unit, for judging the transverse and longitudinal coordinate of point according to equivalent point and benchmark apart from of different sizes, control head with
Different directions are rotated.
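The control decision made by these units can be sketched as below. The equivalent point is taken here as the integer average of the spot rectangle's corner coordinates, and the unit-step motor commands are an assumed sign convention; neither detail is fixed by the patent.

```python
# Sketch of the control module's decision. The reference judging point is
# the center of the data coordinate map; the equivalent point is the
# integer average of the spot's bounding-box corners. Sign conventions for
# the pan/tilt motors are assumptions.

def control_step(bbox, map_width, map_height):
    """Return (pan, tilt) commands in {-1, 0, +1} that move the spot
    toward the center of the coordinate map."""
    x0, y0, x1, y1 = bbox
    # Second computing / acquiring unit: equivalent point of the spot.
    ex, ey = (x0 + x1) // 2, (y0 + y1) // 2
    # Choosing unit: reference judging point at the map center.
    cx, cy = map_width // 2, map_height // 2
    # Comparing / control unit: pick a rotation direction per axis.
    pan = 0 if ex == cx else (1 if ex < cx else -1)
    tilt = 0 if ey == cy else (1 if ey < cy else -1)
    return pan, tilt
```

When the equivalent point already coincides with the reference point, both commands are zero and the head holds still.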
As an optional implementation, the head control device for optical-flow positioning in this embodiment further comprises: a setting module, configured to set a correspondence between the brightness value of the target light source point and the actual distance between the head and the target light source.
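One simple way to realize such a correspondence is a calibrated lookup table with linear interpolation, sketched below. The calibration pairs are invented for illustration; a real device would measure them against its own light source and optics.

```python
# Sketch of the setting module's brightness-to-distance correspondence.
# Hypothetical calibration table: a brighter spot means a closer target.
CALIBRATION = [(50, 5.0), (120, 2.0), (220, 0.5)]  # (brightness, metres)

def brightness_to_distance(brightness):
    """Estimate the head-to-target distance from the spot's brightness
    value by linear interpolation between calibrated pairs."""
    pts = sorted(CALIBRATION)
    if brightness <= pts[0][0]:
        return pts[0][1]   # dimmer than calibrated range: farthest known
    if brightness >= pts[-1][0]:
        return pts[-1][1]  # brighter than calibrated range: nearest known
    for (b0, d0), (b1, d1) in zip(pts, pts[1:]):
        if b0 <= brightness <= b1:
            t = (brightness - b0) / (b1 - b0)
            return d0 + t * (d1 - d0)
```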
Embodiment 3
An embodiment of the present invention provides a target tracking system which, as shown in FIG. 7, comprises:
Target light source 71, arranged on the object to be tracked. The light source point may be any light source, provided that the target light source emits light that allows the head 72 to precisely track the target object to be tracked.
Head 72, rotatable in multiple directions. The movement of the head 72 is realized by the rotation of its internal horizontal motor and pitch motor. A control chip inside the head 72 applies the head control method for optical-flow positioning of Embodiment 1 to control the rotation of the two motors; while the two motors rotate, the camera lens 721, filter plate 722 and optical sensor 723 connected to them are driven and regulated together.
Camera lens 721, arranged on the head 72, capable of collecting the light emitted by the target light source. In general, a wide-angle lens is preferred here, as it improves the tracking effect: with a wide-angle lens, a moving target can be tracked effectively over a larger field of view.
Filter plate 722, arranged behind the camera lens, for filtering out interfering light sources from the incident light. It is matched to the wavelength of the target light source 71 and blocks the interference of extraneous invalid light with the optical sensor 723 (SENSOR).
Optical sensor 723, arranged behind the filter plate 722; the filtered light enters the optical sensor and forms a light spot. Optical parameters are calculated from the spot area incident on the optical sensor 723; in combination with the camera lens 721 and the filter plate 722, the target light source is identified and its parameters analysed, and the head is then controlled to rotate so that the target light source point coincides with the center of the optical sensor.
Image pickup device 724, arranged on the head 72, for photographing the object to be tracked. With an image pickup device 724 placed above the head 72, tracking photography can be realized.
The head control device of Embodiment 2 controls the head to rotate so that the light source point of the target light source 71 coincides with the center of the optical sensor.
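Putting the components of Embodiment 3 together gives a closed control loop: read the spot position, compare it with the sensor center, and step the motors until the two coincide. The sketch below simulates that loop; the function name, the one-unit step model and the simulated spot are illustrative stand-ins for the real motor drive and sensor readout.

```python
# Closed-loop sketch of the tracking system in Embodiment 3. The spot is
# simulated as a point that moves one coordinate unit per pan/tilt step;
# real hardware hooks would replace the step model.

def track_until_centered(spot, center, max_steps=100):
    """Step a simulated head until the spot sits at the sensor center.
    Returns the number of steps taken (capped at max_steps)."""
    x, y = spot
    cx, cy = center
    steps = 0
    while (x, y) != (cx, cy) and steps < max_steps:
        # One unit of pan/tilt per iteration, each axis moving in the
        # direction that reduces its distance to the center.
        if x != cx:
            x += 1 if x < cx else -1
        if y != cy:
            y += 1 if y < cy else -1
        steps += 1
    return steps
```

Because both axes step simultaneously, the number of iterations is governed by the larger of the two coordinate distances.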
Obviously, the above embodiments are merely examples given for clarity of illustration and are not a limitation on the embodiments. For those of ordinary skill in the art, other changes or variations in different forms can also be made on the basis of the above description. There is no need and no way to exhaust all embodiments here, and the obvious changes or variations derived therefrom remain within the protection scope of the invention.
Claims (9)
1. A head control method for optical-flow positioning, wherein a camera lens, a filter plate and an optical sensor are arranged in sequence on the head, and light emitted by a target light source arranged on an object to be tracked enters the camera lens and, after being filtered by the filter plate, enters the optical sensor, characterised in that the method comprises the following steps:
obtaining the spot area projected by the target light source onto the optical sensor;
calculating the distance between the target light source point in the spot area and the center of the optical sensor;
controlling the head to rotate until the target light source point is located at the center of the optical sensor.
2. The method according to claim 1, characterised in that obtaining the spot area projected by the target light source onto the optical sensor comprises:
establishing a data coordinate map, the data being equivalent to the light-source dot matrix on the optical sensor;
setting the luminance threshold and the spot-area aspect ratio of the target light source;
reading the value at each coordinate point on the data coordinate map, the value being the brightness of the light source;
judging whether the value is greater than the luminance threshold;
if the value is greater than the luminance threshold, judging whether the data points form a rectangular area;
if the value is not greater than the luminance threshold, filtering out the stray light at that brightness;
if the data points can form a rectangular area, calculating the aspect ratio of the rectangular area;
judging whether the rectangular-area aspect ratio is equal to the spot-area aspect ratio;
if the rectangular-area aspect ratio is not equal to the spot-area aspect ratio, filtering out the stray light in the rectangular area again.
3. The method according to claim 1 or 2, characterised in that controlling the head to rotate until the target light source point is located at the center of the optical sensor comprises:
choosing a reference judging point at the center of the data coordinate map;
calculating the integer average of the length and width of the rectangular area;
obtaining the equivalent point of the target light source point according to the integer average value;
comparing the horizontal and vertical coordinate distances between the equivalent point and the reference judging point;
controlling the head to rotate in different directions according to the differences in the horizontal and vertical coordinate distances between the equivalent point and the reference judging point.
4. The method according to claim 3, characterised by further comprising: setting a correspondence between the brightness value of the target light source point and the actual distance between the head and the target light source.
5. A head control device for optical-flow positioning, wherein a camera lens, a filter plate and an optical sensor are arranged in sequence on the head, and light emitted by a target light source arranged on an object to be tracked enters the camera lens and, after being filtered by the filter plate, enters the optical sensor, characterised in that the device comprises:
an acquisition module, configured to obtain the spot area projected by the target light source onto the optical sensor;
a computing module, configured to calculate the distance between the target light source point in the spot area and the center of the optical sensor;
a control module, configured to control the head to rotate until the target light source point is located at the center of the optical sensor.
6. The device according to claim 5, characterised in that the acquisition module comprises:
an establishing unit, configured to establish a data coordinate map, the data being equivalent to the light-source dot matrix on the optical sensor;
a setting unit, configured to set the luminance threshold and the spot-area aspect ratio of the target light source;
a reading unit, configured to read the value at each coordinate point on the data coordinate map, the value being the brightness of the light source;
a first judging unit, configured to judge whether the value is greater than the luminance threshold;
a second judging unit, configured to judge, if the value is greater than the luminance threshold, whether the data points form a rectangular area;
a first filtering unit, configured to filter out the stray light at that brightness if it is judged that the value is not greater than the luminance threshold;
a first computing unit, configured to calculate the aspect ratio of the rectangular area if the data points can form a rectangular area;
a third judging unit, configured to judge whether the rectangular-area aspect ratio is equal to the spot-area aspect ratio;
a second filtering unit, configured to filter out the stray light in the rectangular area again if the third judging unit judges that the rectangular-area aspect ratio is not equal to the spot-area aspect ratio.
7. The device according to claim 5 or 6, characterised in that the control module comprises:
a choosing unit, configured to choose a reference judging point at the center of the data coordinate map;
a second computing unit, configured to calculate the integer average of the length and width of the rectangular area;
an acquiring unit, configured to obtain the equivalent point of the target light source point according to the integer average value;
a comparing unit, configured to compare the horizontal and vertical coordinate distances between the equivalent point and the reference judging point;
a control unit, configured to control the head to rotate in different directions according to the differences in the horizontal and vertical coordinate distances between the equivalent point and the reference judging point.
8. The device according to claim 7, characterised by further comprising: a setting module, configured to set a correspondence between the brightness value of the target light source point and the actual distance between the head and the target light source.
9. A target tracking system, characterised by comprising:
a target light source, arranged on an object to be tracked;
a head, rotatable in multiple directions;
a camera lens, arranged on the head and capable of collecting the light emitted by the target light source;
a filter plate, arranged behind the camera lens, for filtering out interfering light sources from the incident light;
an optical sensor, arranged behind the filter plate, the filtered light entering the optical sensor and forming a light spot;
an image pickup device, arranged on the head, for photographing the object to be tracked; and
the head control device according to any one of claims 5 to 8, for controlling the head to rotate so that the light source point of the target light source coincides with the center of the optical sensor.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710178765.8A CN107065935A (en) | 2017-03-23 | 2017-03-23 | A kind of cloud platform control method, device and Target Tracking System positioned for light stream |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107065935A true CN107065935A (en) | 2017-08-18 |
Family
ID=59618003
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710178765.8A Pending CN107065935A (en) | 2017-03-23 | 2017-03-23 | A kind of cloud platform control method, device and Target Tracking System positioned for light stream |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107065935A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5581250A (en) * | 1995-02-24 | 1996-12-03 | Khvilivitzky; Alexander | Visual collision avoidance system for unmanned aerial vehicles |
CN101651784A (en) * | 2009-09-24 | 2010-02-17 | 上海交通大学 | Video tracking system of panoramic pan-tilt-zoom camera |
CN101813523A (en) * | 2010-04-30 | 2010-08-25 | 中国科学院安徽光学精密机械研究所 | Device and method for measuring atmospheric coherence length of mobile beacon |
CN103297702A (en) * | 2013-05-06 | 2013-09-11 | 中航华东光电有限公司 | Image processing device for aviation onboard helmet-mounted locating system and image processing method thereof |
CN105373140A (en) * | 2014-08-20 | 2016-03-02 | 深圳Tcl新技术有限公司 | Light source tracking method and system |
CN205408001U (en) * | 2016-03-09 | 2016-07-27 | 张立秀 | Automatic trail imaging system |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112955844A (en) * | 2020-06-30 | 2021-06-11 | 深圳市大疆创新科技有限公司 | Target tracking method, device, system and storage medium |
WO2022000242A1 (en) * | 2020-06-30 | 2022-01-06 | 深圳市大疆创新科技有限公司 | Target tracking method, device, and system, and storage medium |
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20170818 |