CN106340043A - Image identification spatial localization method and image identification spatial localization system - Google Patents
- Publication number
- CN106340043A CN106340043A CN201610718986.5A CN201610718986A CN106340043A CN 106340043 A CN106340043 A CN 106340043A CN 201610718986 A CN201610718986 A CN 201610718986A CN 106340043 A CN106340043 A CN 106340043A
- Authority
- CN
- China
- Prior art keywords
- image
- globe
- luminescence body
- type luminescence
- camera head
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06T5/70
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10141—Special mode during image acquisition
Abstract
The invention provides an image identification spatial localization method and system. The system comprises an image module, a localization module, and a processing module. The image module includes an image processing device and at least two camera devices, one of which serves as the main camera device. The localization module includes a spherical luminous body that can emit light. Compared with the prior art, localization is performed by analyzing the images captured by the camera devices: distance is measured from the correspondence between the diameter of the spherical luminous body in the captured image and its distance from the camera, direction is measured from the correspondence between the center coordinate of the spherical luminous body in the image and the direction angle, and the position of the spherical luminous body is recovered from the measured distance and angle. This provides a novel, efficient, and accurate means of localization.
Description
Technical field
The present invention relates to the field of spatial localization, and more particularly to an image recognition spatial localization method and system.
Background art
Spatial localization is typically performed by optical or ultrasonic measurement, deriving the spatial position of the object under test by building a model. A typical optical spatial localization system determines the spatial position of the object by laser scanning and reception at optical sensors. Such systems often suffer from bulky measuring equipment, long measurement times, and an inability to measure in real time, which greatly limits their range of application.
Summary of the invention
To overcome the defects of cumbersome equipment and long measurement time in current spatial localization systems, the present invention provides an image recognition spatial localization method and system with simple equipment and a short measurement time.
The technical solution adopted by the present invention is as follows. An image recognition spatial localization method is provided, involving an image module, a localization module, and a processing module. The image module includes an image processing device and at least two camera devices, one of which is the main camera device. The localization module includes a spherical luminous body that can emit light; the spherical luminous body is spherical in shape. The image recognition spatial localization process comprises the following steps:
S1: the main camera device captures an image of the spherical luminous body and transmits the image information to the image processing device; the image processing device processes the image information and transmits the result to the processing module;
S2: the processing module further processes the information transmitted by the image processing device, determines, from the processing result and the position information of the multiple camera devices stored in the processing module, the camera device nearest to the spherical luminous body, and issues a command instructing that nearest camera device to capture an image;
S3: the camera device nearest to the spherical luminous body captures an image and transmits the image information to the image processing device; the image processing device processes the image information and transmits the result to the processing module;
S4: the processing module further processes the information transmitted by the image processing device and derives the position information of the spherical luminous body.
Preferably, all the camera devices are fixed in position and orientation.
Preferably, the outer surface of the spherical luminous body is coated with an optical material, so that the boundary of the spherical luminous body can be clearly distinguished when it is lit.
Preferably, the image processing device can segment the region of the spherical luminous body in the image captured by the camera device, and determine from the segmented region the geometric center of the spherical luminous body in the image and the diameter data of the spherical luminous body.
Preferably, the image processing device computes the coordinates of all points in the spherical luminous body region of the captured image, and obtains the center-point coordinate of the spherical luminous body in the image by a weighted mean.
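The weighted-mean step above can be sketched as follows. This is a minimal illustration, not the patent's implementation; in particular, using pixel brightness (or any per-pixel value) as the weight is an assumption, since the patent only says the center is obtained "by a weighted mean".

```python
def weighted_center(points):
    """Weighted-mean center of the luminous-body region.

    points: iterable of (x, y, w) tuples, i.e. pixel coordinates plus a
    weight per pixel. The choice of weight is an assumption here.
    """
    total = sum(w for _, _, w in points)
    x = sum(px * w for px, _, w in points) / total
    y = sum(py * w for _, py, w in points) / total
    return x, y

# Uniform weights over a 3x3 block of pixels centered at (2, 2)
pixels = [(px, py, 1.0) for px in (1, 2, 3) for py in (1, 2, 3)]
print(weighted_center(pixels))  # -> (2.0, 2.0)
```

With uniform weights this reduces to the plain centroid of the region; non-uniform weights would bias the center toward the brighter pixels.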
Preferably, the image processing device segments the region of the spherical luminous body by the following steps:
S101: the image processing device applies infrared optical processing to the captured image to obtain the infrared response region;
S202: the image processing device partitions the infrared response region into a spherical region and noise regions, and excludes the noise regions;
S303: the image processing device further analyzes the spherical region and, within the spherical region, distinguishes the fluorescence area from the entity area according to a preset infrared threshold.
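The three segmentation steps can be sketched as below. This is only an illustrative reading of S101 to S303: the patent does not give the thresholds (they are "set in advance"), and treating the largest connected blob as the spherical region while discarding smaller blobs as noise is an assumption about how the noise-reduction step works.

```python
from collections import deque

def segment_sphere(ir, response_thresh, entity_thresh):
    """Sketch of steps S101-S303 over a 2-D list of infrared intensities."""
    h, w = len(ir), len(ir[0])
    # S101: infrared response region (pixels above the response threshold)
    response = [[ir[y][x] > response_thresh for x in range(w)] for y in range(h)]

    # S202: connected components over the response region; keep the largest
    # blob as the spherical region, treat smaller blobs as noise regions
    seen = [[False] * w for _ in range(h)]
    sphere = []
    for sy in range(h):
        for sx in range(w):
            if response[sy][sx] and not seen[sy][sx]:
                blob, q = [], deque([(sy, sx)])
                seen[sy][sx] = True
                while q:
                    y, x = q.popleft()
                    blob.append((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if 0 <= ny < h and 0 <= nx < w and response[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                if len(blob) > len(sphere):
                    sphere = blob

    # S303: within the spherical region, split the solid body (entity area)
    # from the surrounding glow (fluorescence area) by the infrared threshold
    entity = [(y, x) for y, x in sphere if ir[y][x] > entity_thresh]
    glow = [(y, x) for y, x in sphere if ir[y][x] <= entity_thresh]
    return entity, glow

ir = [[0] * 6 for _ in range(6)]
for y in range(1, 4):
    for x in range(1, 4):
        ir[y][x] = 5          # dim glow
ir[2][2] = 10                 # bright solid center
ir[5][5] = 5                  # isolated noise pixel
entity, glow = segment_sphere(ir, 1, 8)
print(entity)  # -> [(2, 2)]
```

The isolated pixel is rejected in S202 because it is not part of the largest blob, and the entity area ends up being only the bright center.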
An image recognition spatial localization system is also provided, in which the image module further includes a fixed-end embedded control module.
Preferably, the localization module further includes a power supply device, a mobile-end embedded control module, and an operation device; the mobile-end embedded control module is electrically connected with the power supply device and the operation device.
Preferably, the localization module includes an operation device, and the mobile-end embedded control module can process the commands issued by the operation device.
Preferably, the spherical luminous body is mounted on top of a virtual reality helmet.
Compared with the prior art, the present invention performs localization by analyzing the images captured by the camera devices: distance is measured from the correspondence between the diameter data of the spherical luminous body in the captured image and distance, angle is measured from the correspondence between the center coordinate of the spherical luminous body in the image and the direction angle, and the position of the spherical luminous body is recovered from the measured distance and angle, providing a novel, efficient, and accurate means of localization. By combining a main camera device with multiple camera devices and selecting the camera device best suited to shoot, the accuracy of position measurement is greatly improved. All camera devices are fixed in position and orientation, which simplifies computing the relative position of the spherical luminous body and the camera devices. Coating the surface of the spherical luminous body with an optical material makes its boundary easy to distinguish when it is lit, increasing measurement accuracy. The image processing device segments the relevant region of the picture by infrared optical processing, noise reduction, and thresholding, which both speeds up data acquisition and improves the accuracy of the acquired data. Mounting the spherical luminous body on top of the virtual reality helmet makes it convenient for the camera devices to capture its image and prevents it from being occluded by the user's body, improving the shooting success rate.
Brief description of the drawings
The invention is further described below with reference to the drawings and embodiments, in which:
Fig. 1 is a module diagram of the image recognition spatial localization system of the present invention;
Fig. 2 is a shooting diagram of the camera devices of the image recognition spatial localization system of the present invention;
Fig. 3 is a diagram of a picture processed by the image processing device;
Fig. 4 is a parameter-processing diagram for a circular spherical luminous body image;
Fig. 5 is a parameter-processing diagram for a non-circular spherical luminous body image;
Fig. 6 is a principle diagram of the image recognition spatial localization method of the present invention;
Fig. 7 is a flow diagram of the image recognition spatial localization method of the present invention.
Specific embodiments
To overcome the defects of cumbersome equipment and long measurement time in current spatial localization systems, the present invention provides an image recognition spatial localization method and system with simple equipment and a short measurement time.
For a clearer understanding of the technical features, objects, and effects of the present invention, specific embodiments are described in detail below with reference to the accompanying drawings.
Referring to Figs. 1 and 2, the image recognition spatial localization method and system of the present invention includes an image module 1, a localization module 2, and a processing module 3. The image module 1 includes a fixed-end embedded control module 16, an image processing device 18, and camera devices 12. The camera devices 12 are fixed in position and orientation. The fixed-end embedded control module 16 is electrically connected with the image processing device 18 and the camera devices 12, and the image processing device 18 analyzes and processes the pictures captured by the camera devices 12. The processing module 3 is electrically connected with the fixed-end embedded control module 16, and the two can exchange information with each other.
The localization module 2 includes a power supply device 25, a mobile-end embedded control module 23, an operation device 27, and a spherical luminous body 21. The spherical luminous body 21 is spherical in shape, which makes it easy for the image module 1 to identify and evaluate it. The outer surface of the spherical luminous body 21 is coated with an optical coating, so that its boundary can be clearly distinguished when it is lit. The spherical luminous body 21 is mounted on top of a virtual reality helmet 100, which makes it convenient for the camera devices 12 to capture its image and prevents it from being occluded by the user's body. The mobile-end embedded control module 23 is electrically connected with the power supply device 25, the operation device 27, and the spherical luminous body 21; the spherical luminous body 21 is also electrically connected with the power supply device 25. The mobile-end embedded control module 23 can process the commands issued by the operation device 27, enriching the functions of the localization module 2.
Referring to Fig. 3, Fig. 3 shows the image obtained after the image processing device 18 applies infrared optical processing to the captured image. The camera device 12 captures an image of the spherical luminous body 21 and transmits it to the image processing device 18. After infrared optical processing, the position of the spherical luminous body 21 is much more distinct in the processed image. On receiving the image captured by the camera device 12, the image processing device 18 first applies infrared optical processing to obtain the infrared response region 41. Next, in the noise-reduction step, the image processing device 18 partitions the infrared response region 41 into a spherical region 43 and noise regions 42, and excludes the noise regions 42. After noise reduction, the image processing device 18 applies thresholding to the spherical region 43: within the spherical region 43, a fluorescence area 431 and an entity area 432 are distinguished according to a preset infrared threshold, and the entity area 432 is the segmented region of the spherical luminous body 21. Segmenting the region of the spherical luminous body 21 in this way is fast and accurate: optical processing quickly isolates the spherical region for further processing, and removing the fluorescence area 431 eliminates its interference with the measurement result, increasing accuracy.
Referring to Fig. 4, once the region of the spherical luminous body 21 has been segmented, the coordinate of every point of the spherical luminous body in the picture 10 is uniquely determined, and the image processing device 18 finds the center point k(x, y) of the spherical luminous body 21 by locating its geometric center. The geometric center can be determined from the boundary points of the region along the horizontal and vertical coordinate axes, i.e. jointly by the lateral boundary points a(x1, y1) and a'(x2, y2) and the longitudinal boundary points b(x3, y3) and b'(x4, y4), that is:
(x, y) = (1/2(x1 + x2), 1/2(y3 + y4)).
Alternatively, the coordinates of all points of the spherical luminous body 21 can be computed and the coordinate of the center point k obtained by a weighted mean. The diameter data d of the region of the spherical luminous body 21 can also be computed; as is clearly visible in Fig. 3, the diameter d is the length of the lateral separation aa', i.e. d = |x2 − x1|.
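The boundary-point construction above can be written out directly. A minimal sketch, assuming the segmented region is given as a list of pixel coordinates:

```python
def center_and_diameter(region):
    """Center k and diameter d from the region's boundary points, following
    (x, y) = (1/2(x1 + x2), 1/2(y3 + y4)) and d = |x2 - x1|.

    region: iterable of (x, y) pixel coordinates of the sphere's image.
    """
    xs = [x for x, _ in region]
    ys = [y for _, y in region]
    x1, x2 = min(xs), max(xs)      # lateral boundary points a, a'
    y3, y4 = min(ys), max(ys)      # longitudinal boundary points b, b'
    k = ((x1 + x2) / 2, (y3 + y4) / 2)
    d = abs(x2 - x1)
    return k, d

# A filled 5x5 block of pixels spanning x = 10..14, y = 20..24
region = [(x, y) for x in range(10, 15) for y in range(20, 25)]
print(center_and_diameter(region))  # -> ((12.0, 22.0), 4)
```

For a truly circular image the lateral and longitudinal extents agree, so either axis yields the same diameter; the lateral one is used here, matching the text.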
Referring to Fig. 5, in some cases the image of the spherical luminous body 21 captured by the camera device 12 may be partially occluded because of the viewing angle, which affects the measurement of the center point k and the diameter data d. In this case the radial direction of the spherical luminous body 21 can be drawn and the two endpoints c(x5, y5) and c'(x6, y6) of the diameter determined. The coordinate of the center point k is then:
(x, y) = (1/2(x5 + x6), 1/2(y5 + y6)),
and the diameter data d of the spherical luminous body 21 is:
d = ((x5 − x6)² + (y5 − y6)²)^(1/2).
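The occluded case reduces to the midpoint and length of the segment cc'. A short sketch of those two formulas:

```python
import math

def center_and_diameter_occluded(c, c_prime):
    """Partially occluded sphere image: recover the center k and the
    diameter d from the endpoints c(x5, y5) and c'(x6, y6) of a diameter
    that is still fully visible."""
    (x5, y5), (x6, y6) = c, c_prime
    k = ((x5 + x6) / 2, (y5 + y6) / 2)           # midpoint of cc'
    d = math.hypot(x5 - x6, y5 - y6)             # ((x5-x6)^2 + (y5-y6)^2)^(1/2)
    return k, d

print(center_and_diameter_occluded((0, 0), (3, 4)))  # -> ((1.5, 2.0), 5.0)
```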
After the coordinate of the center point k of the spherical luminous body 21 and the diameter data d of the spherical luminous body 21 have been obtained, the image module 1 transmits these data to the processing module 3. In the captured picture 10, the diameter data d of the spherical luminous body 21 is directly related to the distance d' between the spherical luminous body 21 and the camera device 12: each distance d' corresponds to a unique diameter d, so the distance d' can be derived from the size of the diameter data d in the picture 10. The correspondence between d and d' is first stored in the processing module 3, so that the processing module 3 can derive the distance d' between the camera device 12 and the spherical luminous body 21 from the diameter data d. The relative direction of the spherical luminous body 21 with respect to the camera device 12 can likewise be obtained from the coordinate of the center point k. Because the camera device 12 is fixed, each coordinate in the picture 10 corresponds to unique angle information relative to the orientation of the camera device 12, so the corresponding angle and direction can be derived from the coordinate in the picture 10. Similarly, the angle information corresponding to the coordinate of each point in the picture 10 is stored in the processing module 3, and the processing module 3 derives the angle of the spherical luminous body 21 relative to the camera device 12 from the coordinate of its center point k. From this angle information together with the distance d' between the spherical luminous body 21 and the camera device 12, the spatial coordinates of the spherical luminous body 21 relative to the camera device 12, and hence its specific position, are obtained.
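The two stored correspondences (diameter to distance, pixel coordinate to angle) can be sketched as lookup tables plus a final combination step. Everything below is illustrative: the calibration values, the linear interpolation between them, and the pinhole-style pixel-to-angle mapping are assumptions, since the patent only states that the correspondences are stored in the processing module.

```python
import math

# Illustrative calibration table (values are assumptions, not from the
# patent): image diameter d in pixels -> physical distance d' in metres
DIAM_TO_DIST = [(20, 4.0), (40, 2.0), (80, 1.0)]

def distance_from_diameter(d):
    """Look up d' from d, interpolating linearly between calibrated pairs."""
    pts = sorted(DIAM_TO_DIST)
    for (d0, r0), (d1, r1) in zip(pts, pts[1:]):
        if d0 <= d <= d1:
            t = (d - d0) / (d1 - d0)
            return r0 + t * (r1 - r0)
    raise ValueError("diameter outside calibrated range")

def direction_from_pixel(x, y, cx, cy, focal_px):
    """Fixed camera: each pixel corresponds to a unique pair of direction
    angles. A pinhole model with principal point (cx, cy) and focal length
    in pixels is assumed here."""
    return math.atan2(x - cx, focal_px), math.atan2(y - cy, focal_px)

def locate(d, x, y, cx=320.0, cy=240.0, focal_px=500.0):
    """Combine the distance d' and the direction angles into camera-frame
    coordinates of the sphere center."""
    r = distance_from_diameter(d)
    yaw, pitch = direction_from_pixel(x, y, cx, cy, focal_px)
    return (r * math.sin(yaw),
            r * math.sin(pitch),
            r * math.cos(yaw) * math.cos(pitch))

print(locate(80, 320, 240))  # sphere dead ahead at 1 m -> (0.0, 0.0, 1.0)
```

A real system would replace the toy table with a dense calibration of this specific camera and sphere, but the structure (two stored correspondences combined into one position) matches the text.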
Referring to Figs. 6 and 7, during localization, when the spherical luminous body 21 is far from the camera device 12, its image as captured by the camera device 12 is very small. The blurred boundary between the fluorescence area 431 and the entity area 432, and the small number of pixels occupied by the image of the spherical luminous body 21, then cause a large error when the distance d' between the spherical luminous body 21 and the camera device 12 is computed from the diameter data d of the image. In this case multiple camera devices 12 are needed to determine the position of the spherical luminous body 21. Multiple camera devices 12 are arranged in the localization workspace, among which the one with the largest field of view is the main camera device 121. When localization starts, the main camera device 121 captures an image and transmits it to the image processing device 18 for processing; the image processing device 18 determines the relevant data of the spherical luminous body 21 by infrared optical processing, noise reduction, and thresholding. In general, the position of the main camera device 121 is set as a reference so that it covers the entire localization area; it is relatively far from the spherical luminous body 21, so the image of the spherical luminous body 21 it captures is small and the position data derived from this image has a larger error. After the processing module 3 has judged the position data of the spherical luminous body 21 from the image captured by the main camera device 121, it compares this position data with the position data of the multiple camera devices 12 stored in the processing module 3, selects the camera device 12 nearest to the position of the spherical luminous body 21, and issues a command for that camera device 12 to capture an image of the spherical luminous body 21. The image processing device 18 applies infrared optical processing, noise reduction, and thresholding to the image of the spherical luminous body 21 captured by that camera device 12, determines the relevant data of the spherical luminous body 21, and transmits it to the processing module 3 for final processing, yielding the specific position of the spherical luminous body 21. Because this camera device 12 is close to the spherical luminous body 21, the image of the spherical luminous body 21 occupies more pixels, its outline is clearer, and the boundary between the fluorescence area 431 and the entity area 432 is much more distinct, all of which greatly improves the measurement accuracy.
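The camera-selection step of this two-stage scheme is a nearest-neighbor choice over the fixed, known camera positions. A minimal sketch; the camera names and positions below are illustrative, not from the patent:

```python
import math

def pick_camera(cameras, coarse_position):
    """Select the camera device nearest to the coarse sphere position
    estimated from the main camera's image.

    cameras: dict mapping camera name -> fixed (x, y, z) position, as
    stored in the processing module; coarse_position: (x, y, z).
    """
    return min(cameras, key=lambda name: math.dist(cameras[name], coarse_position))

cams = {
    "main_121": (0.0, 0.0, 3.0),
    "cam_a": (4.0, 0.0, 3.0),
    "cam_b": (0.0, 4.0, 3.0),
}
print(pick_camera(cams, (3.5, 0.2, 1.0)))  # -> cam_a
```

The coarse estimate from the wide-angle main camera only needs to be accurate enough to rank the cameras; the selected nearby camera then provides the high-resolution image for the final measurement.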
Compared with the prior art, the present invention performs localization by analyzing the images captured by the camera devices 12: distance is measured from the correspondence between the diameter data of the spherical luminous body 21 in the captured image and distance, angle is measured from the correspondence between the center coordinate of the spherical luminous body 21 in the image and the direction angle, and the position of the spherical luminous body 21 is recovered from the measured distance and angle, providing a novel, efficient, and accurate means of localization. Combining the main camera device 121 with multiple camera devices 12 and selecting the camera device 12 best suited to shoot greatly improves the accuracy of position measurement. All camera devices 12 are fixed in position and orientation, which simplifies computing the relative position of the spherical luminous body 21 and the camera devices 12. Coating the surface of the spherical luminous body 21 with an optical material makes its boundary easy to distinguish when it is lit, increasing measurement accuracy. The image processing device 18 segments the relevant region of the picture 10 by infrared optical processing, noise reduction, and thresholding, which both speeds up data acquisition and improves the accuracy of the acquired data. Mounting the spherical luminous body 21 on top of the virtual reality helmet 100 makes it convenient for the camera devices 12 to capture its image and prevents it from being occluded by the user's body, improving the shooting success rate.
Embodiments of the present invention have been described above with reference to the drawings, but the invention is not limited to these specific embodiments, which are illustrative rather than restrictive. Under the inspiration of the present invention, a person of ordinary skill in the art may devise many further forms without departing from the concept of the invention and the scope of the claims, and all of these fall within the protection of the present invention.
Claims (10)
1. An image recognition spatial localization method, characterized by involving an image module, a localization module, and a processing module, wherein the image module includes an image processing device and at least two camera devices, the camera devices include a main camera device, the localization module includes a spherical luminous body that can emit light, the spherical luminous body is spherical in shape, and the image recognition spatial localization process comprises the following steps:
S1: the main camera device captures an image of the spherical luminous body and transmits the image information to the image processing device; the image processing device processes the image information and transmits the result to the processing module;
S2: the processing module further processes the information transmitted by the image processing device, determines, from the processing result and the position information of the multiple camera devices stored in the processing module, the camera device nearest to the spherical luminous body, and issues a command instructing the camera device nearest to the spherical luminous body to capture an image;
S3: the camera device nearest to the spherical luminous body captures an image and transmits the image information to the image processing device; the image processing device processes the image information and transmits the result to the processing module;
S4: the processing module further processes the information transmitted by the image processing device and derives the position information of the spherical luminous body.
2. The image recognition spatial localization method according to claim 1, characterized in that all the camera devices are fixed in position and orientation.
3. The image recognition spatial localization method according to claim 1, characterized in that the outer surface of the spherical luminous body is coated with an optical coating, and the optical coating makes the boundary of the spherical luminous body clearly distinguishable when the spherical luminous body is lit.
4. The image recognition spatial localization method according to claim 1, characterized in that the image processing device can segment the region of the spherical luminous body in the image captured by the camera device, and determine from the segmented region the geometric center of the spherical luminous body in the image and the diameter data of the spherical luminous body.
5. The image recognition spatial localization method according to claim 4, characterized in that the image processing device computes the coordinates of all points of the spherical luminous body region in the image captured by the camera device, and obtains the center-point coordinate of the spherical luminous body in the image by a weighted mean.
6. The image recognition spatial localization method according to claim 4, characterized in that the image processing device segments the region of the spherical luminous body by the following steps:
S101: the image processing device applies infrared optical processing to the captured image to obtain the infrared response region;
S202: the image processing device partitions the infrared response region into a spherical region and noise regions, and excludes the noise regions;
S303: the image processing device further analyzes the spherical region and, within the spherical region, distinguishes the fluorescence area from the entity area according to a preset infrared threshold.
7. An image recognition spatial localization system applying the image recognition spatial localization method according to claim 1, characterized in that the image module further includes a fixed-end embedded control module, and the fixed-end embedded control module is electrically connected with the image processing device and the camera devices.
8. The image recognition spatial localization system according to claim 7, characterized in that the localization module further includes a power supply device and a mobile-end embedded control module, and the mobile-end embedded control module is electrically connected with the power supply device.
9. The image recognition spatial localization system according to claim 8, characterized in that the localization module includes an operation device, the operation device is electrically connected with the mobile-end embedded control module, and the mobile-end embedded control module can process the commands issued by the operation device.
10. The image recognition spatial localization system according to claim 8, characterized in that the system further includes a virtual reality helmet, and the spherical luminous body is mounted on top of the virtual reality helmet.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610718986.5A CN106340043A (en) | 2016-08-24 | 2016-08-24 | Image identification spatial localization method and image identification spatial localization system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106340043A true CN106340043A (en) | 2017-01-18 |
Family
ID=57825244
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610718986.5A Pending CN106340043A (en) | 2016-08-24 | 2016-08-24 | Image identification spatial localization method and image identification spatial localization system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106340043A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108414195A (en) * | 2018-01-17 | 2018-08-17 | 深圳市绚视科技有限公司 | Detection method, device, system and the storage device of light source emitter to be measured |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1567384A (en) * | 2003-06-27 | 2005-01-19 | 史中超 | Method of image acquisition, digitized measure and reconstruction of three-dimensional object |
JP2006317441A (en) * | 2005-05-10 | 2006-11-24 | Pixart Imaging Inc | Localization system including image display and image sensor |
CN103438904A (en) * | 2013-08-29 | 2013-12-11 | 深圳市宇恒互动科技开发有限公司 | Inertial positioning method and system using vision-aided correction |
CN105867611A (en) * | 2015-12-29 | 2016-08-17 | 乐视致新电子科技(天津)有限公司 | Space positioning method, device and system in virtual reality system |
-
2016
- 2016-08-24 CN CN201610718986.5A patent/CN106340043A/en active Pending
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106054929B (en) | Automatic UAV landing guidance method based on optical flow | |
CN106197422B (en) | UAV positioning and target tracking method based on two-dimensional tags | |
CN104067111B (en) | Automated systems and methods for tracking and monitoring differences on a target object | |
CN110142785A (en) | Visual servoing method for inspection robots based on object detection | |
CN107680593A (en) | Speech enhancement method and device for a smart device | |
CN106707296A (en) | UAV detection and recognition method based on a dual-aperture photoelectric imaging system | |
CN108919838A (en) | Automatic power transmission line tracking method for UAVs based on binocular vision | |
CN108032011B (en) | Weld seam initial point guiding device and method based on laser structured light | |
CN107741175B (en) | Artificial-intelligence precision aiming method | |
CN112215860A (en) | UAV positioning method based on image processing | |
CN106162144A (en) | Visual image processing device, system and intelligent machine for night vision | |
KR101347450B1 (en) | Image sensing method using dual cameras and apparatus thereof | |
CN113115008B (en) | Master-slave inspection system and method for pipe galleries (utility tunnels) | |
CN108174111B (en) | Target image capture method for inspection robots | |
CN107065871A (en) | Machine-vision-based identification and positioning system and method for a self-propelled dining car | |
US20110150300A1 (en) | Identification system and method | |
CN106650701A (en) | Binocular-vision-based method and apparatus for detecting obstacles in indoor shadowed environments | |
CN108171753A (en) | Stereoscopic vision localization method based on centroid feature points and neighborhood grayscale cross-correlation | |
CN104253944A (en) | Gaze-based voice command issuing device and method | |
CN108161930A (en) | Vision-based robot positioning system and method | |
CN109366501A (en) | Air hockey robot control method, device and air hockey equipment | |
CN104966302B (en) | Detection and localization method for laser crosses at arbitrary angles | |
CN109270652A (en) | Optical lens assembly system for eyeglasses | |
CN108931982A (en) | Visual navigation system and method for mobile robot equipment | |
CN106326890A (en) | Space positioning method based on image recognition and space positioning system thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | ||
Application publication date: 2017-01-18 |