CN108592937A - Night navigation road-condition identification system for field exploration - Google Patents
- Publication number
- CN108592937A (publication), CN201810437491.4A (application)
- Authority
- CN
- China
- Prior art keywords
- data
- module
- signal
- image
- echo
- Prior art date
- Legal status
- Withdrawn
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3644—Landmark guidance, e.g. using POIs or conspicuous other objects
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3635—Guidance using 3D or perspective road maps
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/365—Guidance using head up displays or projectors, e.g. virtual vehicles or arrows projected on the windscreen or on the road itself
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Traffic Control Systems (AREA)
- Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
Abstract
The present invention provides a night navigation road-condition identification system for field exploration, including: AR glasses, an infrared camera, an ultrasonic sensor, a micro projector, and a central controller. By applying augmented reality to night navigation road-condition identification, the system offers a higher degree of autonomous analysis than traditional navigation equipment and requires no GPS navigation: the wearer only needs to put on the mobile AR glasses and can observe visualized road-condition information at any time while walking, which avoids the new dangers that inaccurate navigation brings to an explorer. In darkness, or whenever the surroundings cannot be seen clearly, the system presents the explorer with a concrete terrain display image that merges with the real scene, improving the explorer's night vision.
Description
Technical field
The present invention relates to the field of road-condition identification devices, and in particular to a night navigation road-condition identification system for field exploration.
Background technology
Field exploration refers to tourism with an exploratory or adventure character that takes the natural outdoor environment as its venue; because of its excitement and unpredictability, it is deeply popular among young people. The destinations of field exploration are often sparsely populated, dangerous areas far from cities, such as the countryside, mountains, and deserts. While bringing the explorer stimulation and excitement, such trips also carry enormous safety risks. For example, an explorer may be struck by natural disasters such as earthquakes or debris flows during the expedition, or lose the way in unfamiliar terrain and stray into precipitous ground, leading to tragedies such as falling off a cliff, tumbling down a slope, or drowning.

These dangers all stem from the explorer's unclear picture of the current road conditions. Especially during exploration of dark remote mountains and caves, and while walking at night, explorers usually rely on a headlamp or flashlight for illumination; however, because the range a headlamp illuminates is limited, the explorer's view of the surroundings remains insufficient, and the root problem is not solved.
For this reason, most field explorers carry navigation equipment to obtain real-time positioning. In the wild, however, the GPS signal is often weak or cannot be received at all, so navigation fails. Moreover, civilian GPS itself carries a positioning error (an error range of 5 to 20 meters) that cannot be effectively resolved. Whether with traditional GPS navigation technology or professional GPS navigation hardware, in complicated terrain such as mountain paths the device in most cases simply takes the nearest stored route and cannot give guidance tailored to the user's actual needs in that terrain; used at night, it can even bring new danger to the explorer, defeating the purpose of navigation.
Summary of the invention
The object of the present invention is to solve the technical problem that prior-art navigation equipment for field exploration cannot provide road-condition information to an explorer at night. To this end, a night navigation road-condition identification system for field exploration is proposed. Based on augmented-reality display technology, the system provides the explorer with a visualized real-scene display, so that even at night the explorer knows in real time the specific environment he or she is currently in.

To achieve the above object, the night navigation road-condition identification system for field exploration provided by the invention specifically includes: AR glasses, an infrared camera, an ultrasonic sensor, a micro projector, and a central controller;
at least two infrared cameras are symmetrically arranged on the two temple arms of the AR glasses; the infrared cameras collect infrared night-vision images within their visual range and send the generated image data to the central controller;

the signal output of the ultrasonic sensor is connected to the central controller; the sensor emits ultrasonic signals toward the surroundings and sends the collected echo signals to the central controller;

the central controller builds a three-dimensional spatial model of the device's environment from the received image data and spatially locates the echo signals within that model, generating positioning data; at the same time, it performs ranging analysis on the received echo signals and converts the ranging data into imagery, obtaining virtual display objects; it renders the virtual display objects in combination with the infrared scene and projects the rendered objects, via the micro projector, at the coordinate positions indicated by the positioning data; the central controller is also used to recognize human gesture data in the received image data, generate corresponding action data after analyzing the gestures, and perform human-machine interaction with that action data;

the signal input of the micro projector is connected to the central controller; the projection end of the micro projector faces the half-transmitting mirror arranged on the AR glasses and projects the image output by the central controller onto the half-transmitting mirror.
As a further improvement of the above technical scheme, the central controller includes: an image signal processing module, an acoustic signal processing module, a spatial model building module, a ranging module, an echo-signal locating module, an image rendering module, and a gesture recognition module.

The image signal processing module receives the image signal output by the infrared camera and processes it into image data for the spatial model building module and the gesture recognition module to use.

The acoustic signal processing module receives the acoustic signal output by the ultrasonic sensor and processes it into acoustic data for the ranging module to use.

The spatial model building module builds the three-dimensional spatial model of the system's environment from the image data.

The ranging module compares the ultrasonic emission time with the echo time in the acoustic data and computes the sound path of the echo signal from the time difference and the speed of sound.

The echo-signal locating module spatially locates each echo signal from the ultrasonic sensor's coordinate position in the three-dimensional spatial model and the direction from which the echo was received, generating positioning data.

The image rendering module converts the sound path of the echo signal into imagery, obtaining a virtual display object; it places the virtual display object at the coordinate position indicated by the positioning data and applies illumination rendering to it according to the brightness of the infrared scene frame.

The gesture recognition module recognizes human gesture data in the image data and generates corresponding action data after analyzing the gestures.
As a further improvement of the above technical scheme, the image signal processing module includes: an A/D converter, a signal amplifier, and a filter, which successively perform analog-to-digital conversion, amplification, and filtering on the image signal output by the infrared camera.
As a further improvement of the above technical scheme, the acoustic signal processing module includes: an A/D converter, a signal amplifier, and a filter, which successively perform analog-to-digital conversion, amplification, and filtering on the echo signal output by the ultrasonic sensor.
As a further improvement of the above technical scheme, the central controller further includes: a data analysis module, a strategy matching module, and a strategy storage module.

The data analysis module receives the sound-path data of the echo signals and obtains, by analysis, road-surface state data and environmental state data for the system's current location. The road-surface state data include: road gradient, surface evenness, and road width; the environmental state data include: obstacle height, vegetation density, vegetation height, and water depth.

The strategy matching module matches the road-surface state data and environmental state data against the feature combination sequences stored in the strategy storage module and extracts the implementation strategy corresponding to the matched feature combination sequence.

The strategy storage module stores the various feature combination sequences; each feature combination sequence is a mapping between a combination of road-surface state data and environmental state data and an implementation strategy. Implementation strategies include: indicating a direction and route of travel, indicating a refuge position, and stopping to wait for rescue.

The image rendering module is further used to convert the implementation strategy into imagery, obtaining a virtual display object; it places the virtual display object at the coordinate position indicated by the positioning data and applies illumination rendering to it according to the brightness of the infrared scene frame.
As a further improvement of the above technical scheme, the central controller further includes a feature recognition module, which identifies the characteristic data of animals in the infrared night-vision image and sends the acquired characteristic data to a warning device provided in the system, triggering the warning device to emit an alarm signal.
The advantages of the night navigation road-condition identification system for field exploration of the present invention are as follows:

By applying augmented reality to night navigation road-condition identification, the system offers a higher degree of autonomous analysis than traditional navigation equipment and requires no GPS navigation: the wearer only needs to put on the mobile AR glasses and can observe visualized road-condition information at any time while walking, which avoids the new dangers that inaccurate navigation brings to an explorer. In darkness, or whenever the surroundings cannot be seen clearly, the system presents the explorer with a concrete terrain display image that merges with the real scene, improving the explorer's night vision. In addition, the system provides the explorer with a series of visual analysis data: without needing rich exploration experience to analyze the road conditions, the explorer learns the safety level of the current environment directly and is given a corresponding implementation strategy. This raises the safety factor of field exploration, ensures the safety of outdoor activities, broadens the range of people who can use the system, and reduces the wrong directions caused by human misjudgment.
Description of the drawings
Fig. 1 is a structural diagram of the night navigation road-condition identification system for field exploration provided by the invention;
Fig. 2 is a structural diagram of the central controller in one embodiment of the invention;
Fig. 3 is a structural diagram of the central controller in another embodiment of the invention;
Fig. 4 is a structural diagram of the image signal processing module in an embodiment of the invention;
Fig. 5 is a structural diagram of the acoustic signal processing module in an embodiment of the invention;
Fig. 6a is a schematic diagram of the AR glasses being worn, provided in an embodiment of the invention;
Fig. 6b is a schematic diagram of the external structure of the AR glasses, provided in an embodiment of the invention;
Fig. 7 is an effect diagram of virtual imaging using the AR glasses provided in an embodiment of the invention.
Reference numerals
1. AR glasses  2. image acquisition device
3. micro projector  4. half-transmitting mirror
Specific implementation mode
The night navigation road-condition identification system for field exploration of the present invention is described in detail below with reference to the accompanying drawings and embodiments.
As shown in Fig. 1, the night navigation road-condition identification system for field exploration provided by the invention includes: AR glasses, an infrared camera, an ultrasonic sensor, a micro projector, and a central controller.
As shown in Fig. 6b, at least two infrared cameras 2 are symmetrically arranged on the two temple arms of the AR glasses 1; the infrared cameras 2 collect infrared night-vision images within their visual range and send the generated image data to the central controller. The way the AR glasses 1 are worn is shown in Fig. 6a.

The signal output of the ultrasonic sensor is connected to the central controller; the sensor emits ultrasonic signals toward the surroundings and sends the collected echo signals to the central controller.

The central controller builds a three-dimensional spatial model of the device's environment from the received image data and spatially locates the echo signals within that model, generating positioning data. At the same time, it performs ranging analysis on the received echo signals and converts the ranging data into imagery, obtaining virtual display objects; it renders the virtual display objects in combination with the infrared scene and projects the rendered objects, via the micro projector, at the coordinate positions indicated by the positioning data. The central controller is also used to recognize human gesture data in the received image data, generate corresponding action data after analyzing the gestures, and perform human-machine interaction with that action data.

The signal input of the micro projector is connected to the central controller; as shown in Fig. 6b, the projection end of the micro projector 3 faces the half-transmitting mirror 4 arranged on the AR glasses 1 and projects the image output by the central controller onto the half-transmitting mirror 4.

The half-transmitting mirror 4 reflects the light projected by the micro projector 3 while transmitting light incident from the external environment, so that the real scene and the virtual objects are fused into a single image at the human eye.
AR technology, as the cutting edge of current virtual-reality technology, is widely applied across many industries. It is a technique in which the concept of image processing is combined with multi-view tools: what is commonly called augmented reality. The goal of this technology is to overlay a virtual world on the real world on a screen and interact with it; achieving this effect requires fusing cameras, sensors, and other multimedia with the scene. Augmented reality not only presents the information of the real world but also displays virtual information at the same time, so that the two kinds of information complement and superimpose on each other.

The present invention applies the above AR technology to a night navigation road-condition identification system: it collects night-vision infrared images of the actual environment around the device and builds a three-dimensional spatial model of that environment from the collected image data; it then spatially locates the ultrasonic echo signals within the model while performing ranging analysis on the received echo signals, converts the ranging data into imagery to obtain virtual display objects, and, in combination with the infrared scene, displays the virtual display objects directly on the AR glasses at the located coordinate positions, thereby indicating the environmental conditions around the system.
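The locating step just described — placing a virtual display object in the model at the position an echo was received from — can be sketched as follows. The sensor pose at the origin, the polar echo representation, and the small dataclass are illustrative assumptions, not details given in the patent.

```python
import math
from dataclasses import dataclass

@dataclass
class Echo:
    range_m: float       # sound-path distance returned by the ranging step
    azimuth_rad: float   # horizontal direction the echo was received from
    elevation_rad: float # vertical direction the echo was received from

def locate_echo(sensor_pos, echo):
    """Place an echo in the 3-D model: start from the sensor's coordinate
    and move the measured range along the direction of reception (the
    patent's echo-signal locating step)."""
    x0, y0, z0 = sensor_pos
    horiz = echo.range_m * math.cos(echo.elevation_rad)
    return (x0 + horiz * math.cos(echo.azimuth_rad),
            y0 + horiz * math.sin(echo.azimuth_rad),
            z0 + echo.range_m * math.sin(echo.elevation_rad))

# An obstacle 4 m straight ahead of a sensor at the origin:
print(locate_echo((0.0, 0.0, 0.0), Echo(4.0, 0.0, 0.0)))  # → (4.0, 0.0, 0.0)
```

The returned coordinate is what the rendering step would use as the display position of the virtual object.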
The system of the invention combines spatial positioning, obstacle recognition, and image conversion with AR technology, and can detect and display the environmental state information of the current location in real time. Compared with existing navigation equipment, a user of the system only needs to wear the mobile AR glasses to observe visualized road-condition information at any time while walking (for example, the distribution of shrubs on both sides of the road, as shown in Fig. 7), without using GPS navigation, which avoids the new danger that inaccurate navigation brings to explorers. At the same time, in darkness or whenever the surroundings cannot be seen clearly, the system presents the explorer with a concrete terrain display image that merges with the real scene and improves the explorer's night vision, which traditional navigation equipment cannot do. In addition, because the system also incorporates perception of human gestures, it supports human-machine interaction and makes manual control of the system convenient.
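The gesture-based human-machine interaction mentioned above can be sketched as a mapping from a recognized gesture label to an action command. The gesture names and actions below are illustrative assumptions; the patent states only that gesture data are analyzed into action data, without enumerating gestures.

```python
# Hypothetical gesture vocabulary -> system actions. Both sides of this
# mapping are assumptions for illustration only.
GESTURE_ACTIONS = {
    "swipe_left":  "previous overlay page",
    "swipe_right": "next overlay page",
    "open_palm":   "pause projection",
    "fist":        "resume projection",
}

def gesture_to_action(gesture_label: str) -> str:
    """Translate a recognized gesture into an action command,
    ignoring gestures outside the known vocabulary."""
    return GESTURE_ACTIONS.get(gesture_label, "ignore")

print(gesture_to_action("open_palm"))  # → pause projection
```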
To realize the intelligent control functions of the central controller described above, as shown in Fig. 2, the central controller includes: an image signal processing module, an acoustic signal processing module, a spatial model building module, a ranging module, an echo-signal locating module, an image rendering module, and a gesture recognition module.

The image signal processing module receives the image signal output by the infrared camera and processes it into image data for the spatial model building module and the gesture recognition module to use.

The acoustic signal processing module receives the acoustic signal output by the ultrasonic sensor and processes it into acoustic data for the ranging module to use.

The spatial model building module builds the three-dimensional spatial model of the system's environment from the image data.

The ranging module compares the ultrasonic emission time with the echo time in the acoustic data and computes the sound path of the echo signal from the time difference and the speed of sound.
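The ranging rule just stated — sound path from the emission-to-echo time difference and the speed of sound — is the standard ultrasonic time-of-flight calculation. A minimal sketch follows; the 343 m/s speed of sound and the halving for the round trip are common assumptions, not figures given in the patent.

```python
SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 °C; an assumed constant

def echo_distance(t_emit_s: float, t_echo_s: float) -> float:
    """One-way distance to the reflecting obstacle.

    The sound path covered in (t_echo - t_emit) is a round trip
    (sensor -> obstacle -> sensor), so the obstacle distance is half of it.
    """
    if t_echo_s <= t_emit_s:
        raise ValueError("echo must arrive after emission")
    return (t_echo_s - t_emit_s) * SPEED_OF_SOUND_M_S / 2.0

# An echo arriving 20 ms after emission corresponds to about 3.43 m:
print(round(echo_distance(0.0, 0.020), 2))  # → 3.43
```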
The echo-signal locating module spatially locates each echo signal from the ultrasonic sensor's coordinate position in the three-dimensional spatial model and the direction from which the echo was received, generating positioning data.

The image rendering module converts the sound path of the echo signal into imagery, obtaining a virtual display object; it places the virtual display object at the coordinate position indicated by the positioning data and applies illumination rendering to it according to the brightness of the infrared scene frame.

The gesture recognition module recognizes human gesture data in the image data and generates corresponding action data after analyzing the gestures.
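The illumination-rendering step in the module list above — matching a virtual object's brightness to the infrared scene frame — can be sketched as a linear scaling by the frame's mean luminance. The 8-bit pixel range and the linear law are illustrative assumptions; the patent does not specify the rendering formula.

```python
def illumination_scale(scene_frame, base_color):
    """Scale a virtual display object's 8-bit RGB colour so that its
    brightness tracks the mean luminance of the infrared scene frame."""
    flat = [px for row in scene_frame for px in row]
    mean_luma = sum(flat) / len(flat)   # mean pixel value, 0..255
    factor = mean_luma / 255.0          # assumed linear scaling law
    return tuple(min(255, round(c * factor)) for c in base_color)

# A dark frame dims the overlay so it does not glare against the scene:
dark_frame = [[16, 16], [16, 16]]
print(illumination_scale(dark_frame, (255, 128, 0)))  # → (16, 8, 0)
```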
For an explorer engaged in outdoor activities, effectively identifying the risks of those activities is extremely important. Many experienced explorers can judge the potential safety hazards of their environment from its state and perceive a safer walking route; explorers who lack experience, however, generally cannot identify the potential danger and therefore make wrong judgments. Taking mountaineering as an example, reports show that since 2000 domestic mountaineering and outdoor-activity accidents have been rising overall and diversifying in type. Among the more than ten accident types, such as falls, being stranded, illness, and getting lost, getting lost has for years remained one of the main types of mountaineering accident, while falls and falls from height account for the highest share of injuries and deaths. The reported data show that the number of getting-lost accidents in 2016 grew by 65.5% compared with 2015, the largest increase over the years. Getting lost includes losing the way in darkness, falling behind the group, losing the way in dense fog, losing the way on a new route, and other types with various inducing factors. Practice teaches, however, that the greatest enemy of the climber is not the mountain environment but wrong decisions: the serious injuries and even deaths that accidents cause climbers all originate from their own erroneous decisions.
For this purpose, the system of the invention also provides road-condition risk assessment and decision-making. To realize this function, as shown in Fig. 3, the central controller further includes: a data analysis module, a strategy matching module, and a strategy storage module.

The data analysis module receives the sound-path data of the echo signals and obtains, by analysis, road-surface state data and environmental state data for the system's current location. The road-surface state data include: road gradient, surface evenness, and road width; the environmental state data include: obstacle height, vegetation density, vegetation height, and water depth.

The strategy matching module matches the road-surface state data and environmental state data against the feature combination sequences stored in the strategy storage module and extracts the implementation strategy corresponding to the matched feature combination sequence.

The strategy storage module stores the various feature combination sequences; each feature combination sequence is a mapping between a combination of road-surface state data and environmental state data and an implementation strategy. Implementation strategies include: indicating a direction and route of travel, indicating a refuge position, and stopping to wait for rescue.

The image rendering module is further used to convert the implementation strategy into imagery, obtaining a virtual display object; it places the virtual display object at the coordinate position indicated by the positioning data and applies illumination rendering to it according to the brightness of the infrared scene frame.
By matching the current road-surface state data and environmental state data against the strategies stored in the system, the system of the invention lets the explorer know the safety level of the current environment and gives a corresponding implementation strategy (for example, walking in the direction indicated by the arrow across the section of gentle terrain, as shown in Fig. 7), thereby raising the safety factor of field exploration and ensuring the safety of outdoor activities.
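The matching just described can be sketched as a lookup from a discretized feature combination to one of the three strategy types the patent names. The discretization thresholds (15° gradient, 1 m obstacle) and the two-feature key are illustrative assumptions; a real strategy store would cover all the state variables listed above.

```python
# Strategy store: feature combination -> implementation strategy
# (the three strategy types named in the patent).
STRATEGY_STORE = {
    ("gentle", "low_obstacle"):  "indicate travel direction and route",
    ("steep",  "low_obstacle"):  "indicate refuge position",
    ("steep",  "high_obstacle"): "stop and wait for rescue",
}

def match_strategy(road_gradient_deg: float, obstacle_height_m: float) -> str:
    """Discretize the measured state into a feature combination and look
    up the stored strategy. Thresholds here are illustrative only."""
    gradient = "gentle" if road_gradient_deg < 15 else "steep"
    obstacle = "low_obstacle" if obstacle_height_m < 1.0 else "high_obstacle"
    return STRATEGY_STORE.get((gradient, obstacle),
                              "indicate travel direction and route")

print(match_strategy(5.0, 0.3))   # → indicate travel direction and route
print(match_strategy(30.0, 2.0))  # → stop and wait for rescue
```

The extracted strategy string is what the image rendering module would then convert into a virtual display object.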
In addition, to further improve the safety of field exploration, the system of the invention also provides a warning function. As shown in Fig. 3, the central controller further includes a feature recognition module, which identifies the characteristic data of animals in the infrared night-vision image and sends the acquired characteristic data to the warning device provided in the system, triggering the warning device to emit an alarm signal. The warning device can use sound, light, and similar forms to alert the explorer, and the dangerous position can also be shown on the AR glasses through the micro projector, reminding the explorer to avoid an approaching wild beast in time.
As shown in Fig. 4, the image signal processing module can, through its A/D converter, convert the received analog signal into a digital signal that the spatial model building module and the gesture recognition module can use. At the same time, to reduce noise interference and improve detection accuracy, the signal processing module may also include a signal amplifier and a filter, which amplify and filter the signal output by the infrared camera.

As shown in Fig. 5, the acoustic signal processing module can, through its A/D converter, convert the received analog signal into a digital signal that the ranging module can use. At the same time, to reduce noise interference and improve detection accuracy, the acoustic signal processing module may also include a signal amplifier and a filter, which amplify and filter the signal output by the ultrasonic sensor.
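The conditioning chain of Figs. 4 and 5 — amplify, filter, digitize — can be sketched in software as follows. In the patent this chain is analog hardware in front of the A/D converter; the gain, the moving-average window, and the 8-bit quantization below are illustrative assumptions.

```python
def condition(samples, gain=4.0, window=3, full_scale=1.0, bits=8):
    """Amplify, moving-average filter, and quantize an echo waveform,
    mirroring the amplifier -> filter -> A/D chain of the patent."""
    amplified = [s * gain for s in samples]
    # Simple moving average as the noise filter.
    filtered = [sum(amplified[max(0, i - window + 1):i + 1]) /
                len(amplified[max(0, i - window + 1):i + 1])
                for i in range(len(amplified))]
    # Clamp to full scale and quantize to the A/D converter's levels.
    levels = (1 << bits) - 1
    return [round(max(0.0, min(full_scale, v)) / full_scale * levels)
            for v in filtered]

print(condition([0.0, 0.1, 0.2, 0.1]))  # → [0, 51, 102, 136]
```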
Finally, it should be noted that the above embodiments are intended only to illustrate, not to limit, the technical scheme of the present invention. Although the invention has been described in detail with reference to the embodiments, those of ordinary skill in the art will understand that modifications or equivalent substitutions of the technical scheme of the invention that do not depart from its spirit and scope should all be covered by the claims of the present invention.
Claims (6)
1. A night navigation road-condition identification system for field exploration, characterized by including: AR glasses, an infrared camera, an ultrasonic sensor, a micro projector, and a central controller;
at least two infrared cameras (2) are symmetrically arranged on the two temple arms of the AR glasses (1); the infrared cameras (2) collect infrared night-vision images within their visual range and send the generated image data to the central controller;
the signal output of the ultrasonic sensor is connected to the central controller; the sensor emits ultrasonic signals toward the surroundings and sends the collected echo signals to the central controller;
the central controller builds a three-dimensional spatial model of the device's environment from the received image data and spatially locates the echo signals within that model, generating positioning data; at the same time, it performs ranging analysis on the received echo signals and converts the ranging data into imagery, obtaining virtual display objects; it renders the virtual display objects in combination with the infrared scene and projects the rendered objects, via the micro projector (3), at the coordinate positions indicated by the positioning data; the central controller is also used to recognize human gesture data in the received image data, generate corresponding action data after analyzing the gestures, and perform human-machine interaction with that action data;
the signal input of the micro projector (3) is connected to the central controller; the projection end of the micro projector (3) faces the half-transmitting mirror (4) arranged on the AR glasses (1) and projects the image output by the central controller onto the half-transmitting mirror (4).
2. The night travel road condition recognition system for field exploration according to claim 1, wherein the central controller comprises: an image signal processing module, an acoustic signal processing module, a spatial model building module, a ranging module, an echo signal positioning module, an image rendering module and a gesture recognition module;
The image signal processing module receives the image signal output by the infrared camera (2) and processes it to generate the image data used by the spatial model building module and the gesture recognition module;
The acoustic signal processing module receives the acoustic signal output by the ultrasonic sensor and processes it to generate the acoustic data used by the ranging module;
The spatial model building module builds a three-dimensional spatial model of the environment around the system from the image data;
The ranging module compares the ultrasonic launch time with the echo time in the acoustic data, and calculates the sound path of the echo signal from the time difference and the speed of sound;
The echo signal positioning module performs spatial localization of the echo signal, based on the coordinate position of the ultrasonic sensor in the three-dimensional spatial model and the direction from which the echo signal is received, and generates positioning data;
The image rendering module converts the sound path of the echo signal into an image to obtain a virtual display object, selects the display position of the virtual display object according to the coordinate position indicated by the positioning data, and performs illumination rendering of the virtual display object according to the brightness of the infrared scene image frame;
The gesture recognition module identifies human gesture data in the image data and generates corresponding action data by analyzing the gesture data.
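The ranging rule in claim 2 reduces to time-of-flight arithmetic: the sound path is the time difference between emission and echo multiplied by the speed of sound, and the one-way distance to the reflecting surface is half of that. A minimal sketch, assuming sound travels at a fixed 343 m/s (air at roughly 20 °C), which the patent does not specify:

```python
# Time-of-flight ranging as described in claim 2. The speed of sound is
# an assumed constant; a real system would compensate for temperature.

SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees Celsius

def sound_path(t_launch: float, t_echo: float, c: float = SPEED_OF_SOUND) -> float:
    """Total distance travelled by the ultrasonic pulse (out and back), in metres."""
    return (t_echo - t_launch) * c

def target_distance(t_launch: float, t_echo: float, c: float = SPEED_OF_SOUND) -> float:
    """One-way distance to the reflecting obstacle, in metres."""
    return sound_path(t_launch, t_echo, c) / 2.0
```

For example, an echo received 20 ms after launch corresponds to a sound path of about 6.86 m, i.e. an obstacle roughly 3.43 m away.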
3. The night travel road condition recognition system for field exploration according to claim 2, wherein the image signal processing module comprises an A/D converter, a signal amplifier and a filter, which successively perform analog-to-digital conversion, amplification and filtering of the image signal output by the infrared camera (2).
4. The night travel road condition recognition system for field exploration according to claim 2, wherein the acoustic signal processing module comprises an A/D converter, a signal amplifier and a filter, which successively perform analog-to-digital conversion, amplification and filtering of the echo signal output by the ultrasonic sensor.
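Claims 3 and 4 describe the same three-stage conditioning chain for both signal paths: analog-to-digital conversion, then amplification, then filtering. A toy sketch of that chain, in which the quantization depth, gain, and moving-average window are all illustrative assumptions:

```python
# Toy version of the A/D -> amplify -> filter chain of claims 3 and 4.
# Real hardware performs these stages in analog/digital circuitry; the
# numeric choices here are assumptions for illustration only.

def quantize(samples, levels=256, v_max=1.0):
    """A/D conversion: map analog values in [0, v_max] to integer codes."""
    return [min(levels - 1, int(s / v_max * levels)) for s in samples]

def amplify(codes, gain=2):
    """Amplification: scale digital codes by a fixed gain."""
    return [c * gain for c in codes]

def moving_average(codes, window=3):
    """Low-pass filtering: moving average over a sliding window."""
    out = []
    for i in range(len(codes)):
        lo = max(0, i - window + 1)
        out.append(sum(codes[lo:i + 1]) // (i - lo + 1))
    return out
```

The three functions compose in the claimed order: `moving_average(amplify(quantize(raw_samples)))`.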
5. The night travel road condition recognition system for field exploration according to claim 2, wherein the central controller further comprises: a data analysis module, a strategy matching module and a strategy storage module;
The data analysis module receives the sound path data of the echo signals and analyzes them to obtain road surface state data and environment state data for the current location of the system; the road surface state data comprise road gradient, surface evenness and road width, and the environment state data comprise obstacle height, vegetation density, vegetation height and water depth;
The strategy matching module matches the road surface state data and the environment state data against the feature composite sequences stored in the strategy storage module, and extracts the implementation strategy corresponding to the matched feature composite sequence;
The strategy storage module stores the feature composite sequences, each of which is a mapping between a combination of road surface state data and environment state data and an implementation strategy; the implementation strategies include: indicating the direction and route of advance, indicating a hedging position, and stopping to wait for rescue;
The image rendering module is further configured to convert the implementation strategy into an image to obtain a virtual display object, select the display position of the virtual display object according to the coordinate position indicated by the positioning data, and perform illumination rendering of the virtual display object according to the brightness of the infrared scene image frame.
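The strategy matching of claim 5 is a lookup: a composite sequence of road-surface and environment features selects a stored implementation strategy. A minimal sketch, in which the feature classes and strategy texts are illustrative assumptions rather than values given in the patent:

```python
# Toy strategy matching: a feature composite sequence (here a tuple of
# coarse feature classes) is matched against stored sequences, and the
# mapped implementation strategy is returned. All entries below are
# illustrative assumptions.
from typing import Optional, Tuple

STRATEGY_TABLE = {
    # (gradient class, evenness class, obstacle class) -> strategy
    ("gentle", "even", "clear"): "advance: keep current heading",
    ("steep", "uneven", "clear"): "advance: detour along contour line",
    ("gentle", "even", "blocked"): "hedge: move to indicated safe position",
    ("steep", "uneven", "blocked"): "stop and wait for rescue",
}

def match_strategy(features: Tuple[str, str, str]) -> Optional[str]:
    """Return the implementation strategy for a feature composite
    sequence, or None when no stored sequence matches."""
    return STRATEGY_TABLE.get(features)
```

The returned strategy string would then be handed to the image rendering module for conversion into a projected virtual display object.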
6. The night travel road condition recognition system for field exploration according to claim 2, wherein the central controller further comprises a feature recognition module for identifying animal feature data in the infrared night-vision image and sending the acquired feature data to a warning device arranged in the system, thereby triggering the warning device to raise an alarm.
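The path in claim 6 is detection, forwarding, alarm: animal feature data found in the infrared image is sent to the warning device, which raises the alarm. A sketch of that wiring, with the detection itself stubbed out as pre-labelled records; the record fields and class names are illustrative assumptions:

```python
# Toy wiring of feature recognition to the warning device of claim 6.
# Detection results are stubbed as dicts; only the forwarding/trigger
# path is sketched here.

class WarningDevice:
    """Stand-in for the warning device arranged in the system."""
    def __init__(self):
        self.alarms = []

    def trigger(self, feature_data):
        """Record the forwarded feature data as a raised alarm."""
        self.alarms.append(feature_data)

def forward_animal_features(detections, device):
    """Send each detected animal feature record to the warning device;
    return the number of alarms raised so far."""
    for feature_data in detections:
        if feature_data.get("kind") == "animal":
            device.trigger(feature_data)
    return len(device.alarms)
```

Non-animal detections (rocks, vegetation) pass through without triggering the device.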
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810437491.4A CN108592937A (en) | 2018-05-09 | 2018-05-09 | A kind of night flight or navigation road conditions for field exploration identify system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108592937A true CN108592937A (en) | 2018-09-28 |
Family
ID=63636532
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810437491.4A Withdrawn CN108592937A (en) | 2018-05-09 | 2018-05-09 | A kind of night flight or navigation road conditions for field exploration identify system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108592937A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104367447A (en) * | 2014-11-05 | 2015-02-25 | 东莞市安拓普塑胶聚合物科技有限公司 | Auxiliary walking system and method |
CN105182535A (en) * | 2015-09-28 | 2015-12-23 | 大连楼兰科技股份有限公司 | Method of using intelligent glasses for vehicle maintenance |
US20160026257A1 (en) * | 2014-07-23 | 2016-01-28 | Orcam Technologies Ltd. | Wearable unit for selectively withholding actions based on recognized gestures |
CN105445743A (en) * | 2015-12-23 | 2016-03-30 | 南京创维信息技术研究院有限公司 | Ultrasonic blind man guide system and realizing method thereof |
CN107622539A (en) * | 2017-10-30 | 2018-01-23 | 成都极致空觉科技有限公司 | The intelligent AR night watching devices of night watching efficiency can be improved |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109583372A (en) * | 2018-11-29 | 2019-04-05 | 北京谷东网科技有限公司 | Augmented reality system and its apparatus for nighttime driving |
CN109992873A (en) * | 2019-03-27 | 2019-07-09 | 北京交通大学 | The mounting design method of station oriented identification |
CN114173657A (en) * | 2019-05-31 | 2022-03-11 | 瑞思迈私人有限公司 | System and method for minimizing cognitive decline using augmented reality |
CN114007711A (en) * | 2019-06-19 | 2022-02-01 | 环球城市电影有限责任公司 | Techniques for selective viewing of projected images |
CN111442784A (en) * | 2020-04-03 | 2020-07-24 | 北京四维智联科技有限公司 | Road guiding method, device and equipment based on AR navigation |
CN114152253A (en) * | 2021-12-06 | 2022-03-08 | 华东师范大学 | All-weather hiking auxiliary system and method based on deep learning and big data |
CN114152253B (en) * | 2021-12-06 | 2024-05-17 | 华东师范大学 | All-weather hiking auxiliary system and method based on deep learning and big data |
CN115752481A (en) * | 2022-12-09 | 2023-03-07 | 广东车卫士信息科技有限公司 | AR navigation method, AR glasses, medium and equipment based on image recognition |
CN115752481B (en) * | 2022-12-09 | 2023-09-01 | 广东车卫士信息科技有限公司 | AR navigation method, AR glasses, medium and equipment based on image recognition |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108592937A (en) | A kind of night flight or navigation road conditions for field exploration identify system | |
US10378863B2 (en) | Smart wearable mine detector | |
CN111344592B (en) | Detector for determining the position of at least one object | |
Pajares | Overview and current status of remote sensing applications based on unmanned aerial vehicles (UAVs) | |
JP2023085535A (en) | Detector for optically detecting at least one object | |
CN111587384A (en) | Detector for determining a position of at least one object | |
US20170344833A1 (en) | Method and system for identifying an individual with increased body temperature | |
CN105416584B (en) | Life tracking UAS after a kind of calamity | |
CN113557549A (en) | Detector for determining a position of at least one object | |
CN106859929A (en) | A kind of Multifunctional blind person guiding instrument based on binocular vision | |
CN102389361A (en) | Blindman outdoor support system based on computer vision | |
CN106162144A (en) | A kind of visual pattern processing equipment, system and intelligent machine for overnight sight | |
CN109672875A (en) | Fire-fighting and rescue intelligent helmet, fire-fighting and rescue method and Related product | |
CN113544745A (en) | Detector for determining a position of at least one object | |
CN109730910A (en) | Vision-aided system and its ancillary equipment, method, the readable storage medium storing program for executing of trip | |
KR102166432B1 (en) | Method for replying disaster situation using smart drone | |
CN106597690A (en) | Visually impaired people passage prediction glasses based on RGB-D camera and stereophonic sound | |
CN114152253B (en) | All-weather hiking auxiliary system and method based on deep learning and big data | |
KR20200067286A (en) | 3D scan and VR inspection system of exposed pipe using drone | |
Zhang et al. | Forest fire detection solution based on UAV aerial data | |
Martínez-de-Dios et al. | Multi-UAV experiments: application to forest fires | |
KR20220066783A (en) | Estimation method of forest biomass | |
CN110418290A (en) | A kind of location tracking system | |
US11281287B2 (en) | Method of generating an augmented reality environment | |
Sharma et al. | An edge-controlled outdoor autonomous UAV for colorwise safety helmet detection and counting of workers in construction sites |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
2019-03-06 | TA01 | Transfer of patent application right | Address after: 102600 Unit 403, Unit 1, 15th Floor, Changfengyuan, Huangcun Town, Daxing District, Beijing; Applicant after: Dong Jinhuan. Address before: 300000 Tianjin Binhai New Area Jingu Highway Petroleum Xincun, Building 6, Room 206; Applicants before: He Hui, Jiao Baocheng, Xiang Guangli
2018-09-28 | WW01 | Invention patent application withdrawn after publication | Application publication date: 2018-09-28