CN113069330A - Outdoor travel direction induction method for visually impaired people based on intelligent terminal - Google Patents

Outdoor travel direction induction method for visually impaired people based on intelligent terminal

Info

Publication number
CN113069330A (application CN202110312365.8A)
Authority
CN (China)
Prior art keywords
point, path, intersection, angle, latitude
Legal status
Granted
Other languages
Chinese (zh)
Other versions
CN113069330B (en)
Inventors
季晓勇
张财旺
周依娜
Current Assignee
Nanjing University
Original Assignee
Nanjing University
Application filed by Nanjing University; priority to CN202110312365.8A
Publication of CN113069330A; application granted and published as CN113069330B
Current legal status
Active

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H3/00 Appliances for aiding patients or disabled persons to walk about
    • A61H3/06 Walking aids for blind persons
    • A61H3/061 Walking aids for blind persons with electronic detecting or guiding means
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Pain & Pain Management (AREA)
  • Epidemiology (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Rehabilitation Therapy (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Navigation (AREA)

Abstract

The invention provides an outdoor travel direction induction method for visually impaired people based on an intelligent terminal, which comprises the following steps: step 1, creating a user-defined blind-guide map storing the detailed road network conditions of a target area; step 2, the visually impaired user selects a destination, the blind-guide map of the area is loaded, and the safest route with an acceptable distance from the departure place to the destination is planned and selected; and step 3, while the visually impaired user walks, direction guidance is provided for each step of advance according to the longitude and latitude coordinates and the orientation azimuth at each moment, and pre-stored obstacle information, warning information and turning information along the way is broadcast to the user in time, so that the user is guided to walk and turn along the route calibrated in the blind-guide map and finally reaches the destination. The invention helps visually impaired people travel outdoors steadily and safely; because it uses an intelligent terminal as its carrier, it is lightweight, inexpensive and highly operable.

Description

Outdoor travel direction induction method for visually impaired people based on intelligent terminal
Technical Field
The invention belongs to the field of travel navigation of blind people, and particularly relates to an outdoor travel direction induction method for visually impaired people based on an intelligent terminal.
Background
According to a report on global population health estimates, about 217 million people worldwide are currently visually impaired, of whom about 36 million are completely blind. Many visually impaired people can accomplish their travel purposes in areas they are familiar with, but in most cases they face completely unfamiliar environments in which independent travel is extremely difficult and dangerous, which greatly hinders the socialization of visually impaired people. Route indications and direction guidance in physical space are generally purely visual elements, such as guideboards and road signs, while vision is the primary human perceptual channel: information acquired through vision accounts for 83% of the total information humans acquire.
Therefore, visually impaired people who want to travel independently in an unfamiliar environment face two problems: first, how to plan and select, from the many potential routes to the destination, the safest route with an acceptable distance before departure; and second, how to walk and turn step by step along the established route while travelling and receive warnings in time.
Disclosure of Invention
The invention addresses the above application requirements and technical problems and accomplishes the application scenario of outdoor travel for visually impaired people at low cost and with simple operation. Its purpose is to provide an outdoor travel direction induction method for visually impaired people based on an intelligent terminal (in the invention, the intelligent terminal can be a smartphone), which comprises the following steps:
step 1, creating a user-defined blind guide map storing detailed road network conditions in a target area, wherein the user-defined blind guide map draws road intersections, paths and obstacle warning information by longitude and latitude coordinate points;
step 2, the visually impaired user selects a destination through the intelligent terminal, the intelligent terminal loads the self-defined blind guide map of the area, and a route which is the safest and has a proper distance from the departure place to the destination is planned and selected;
step 3, advancing induction: in the walking process of the visually impaired users, direction guidance is given to the visually impaired users for each step of advancing according to the longitude and latitude coordinates and the orientation azimuth of the visually impaired users at each moment, and obstacle information, warning information and turning information prestored in the process can be timely broadcasted to the users so as to guide the visually impaired users to walk and turn according to a path calibrated by a blind guiding map and finally reach a destination.
In the step 1, the user-defined blind guide map is drawn by a traffic network formed by road intersections and bidirectional paths on the right side of the road in the area;
the path is the curve that begins X1 (typically 5) meters from the starting intersection, runs along the right side of the road at a distance of X2 (typically 1) meters from the edge, and ends X1 (typically 5) meters before the ending intersection; each road has one or two paths;
the intersection is a road intersection for short;
wherein the intersection comprises the following fields:
id, which represents a primary key and is the unique identifier of the road intersection table;
the junctionCode is an intersection code, the field is set as the code of the road intersection, and the length of two bytes is taken as the unique identifier of the intersection;
the type is an intersection type, the field stores the type of the intersection, and the intersection type is divided into five types of X type, T type, Y type, L type and O type;
wherein the path includes the following fields:
id, which represents the primary key and is the unique identifier of the path table;
the roadCode is a path code, the field stores the path code of each path, and the path code is formed by splicing the initial intersection code and the end intersection code of the path;
the type is a path type, the field stores the type of the path, and the type of the path is divided into a straight line and a curve;
name is the road name; this field stores the name of the road on which the path lies;
distance is the path distance; this field stores the length of the path.
In the step 1, the self-defined blind guiding map draws road intersections, paths and obstacle warning information by longitude and latitude coordinate points, namely, the paths are drawn by path points with fixed spacing distances, the road intersections are drawn by inflection points in the intersections, and the inflection points and the path points store the obstacle information and necessary warning information along the way;
the path point includes the following fields:
id, which represents a primary key, is a unique identifier of the path point table (here, the storage table structure of the database is referred to, and the path point table is referred to as the database table storing path point data);
latitude is the path point latitude, and the field stores the path point latitude;
longitude is the path point longitude; this field stores the longitude of the path point;
attention is warning information, and the field stores necessary warning information on the path;
obstacle is obstacle information, and the field stores the obstacle information encountered on the path;
obsacleStor is the obstacle grading index; this field stores an index that grades the obstacle by degree of risk, taking one of five values: 1, 2, 3, 4 and infinity; the higher the index, the higher the risk, and infinity indicates that the obstacle cannot be passed;
the roadId is a path identifier, and the field stores a primary key identifier corresponding to a path where the path point is located;
the inflection point is defined as an intersection point of straight lines where each path connected with one intersection is located, the storage sequence of the inflection point is a counterclockwise sequence in a physical space, and the inflection point comprises the following fields:
id represents a primary key and is the unique identifier of the inflection point table;
latitude is inflection point latitude, and the field stores the inflection point latitude;
longitude is the inflection point longitude; this field stores the longitude of the inflection point;
turning direction is the intersection turning direction; this field stores the turning direction to be taken when passing through the inflection point, one of left, straight and right;
the junctionId is a road intersection identifier, and the field stores a main key identifier corresponding to the road intersection where the inflection point is located.
The step 2 comprises the following steps:
step 2-1: the intelligent terminal is vertically fixed in front of the chest by the visually impaired user, one side of the screen faces towards the chest, the upper edge of the intelligent terminal is inclined towards the front of the body, and the azimuth angle of the intelligent terminal at any moment is the orientation angle of the visually impaired user;
step 2-2: the method comprises the steps that a destination is input by a vision-impaired user, a starting place is determined by an intelligent terminal, and a starting path and a starting intersection are determined;
step 2-3: loading all path codes of a road network in a target area to construct a symbolic graph;
step 2-4: taking the path length of each path as a weight, leading the initial intersection of the path to point to the ending intersection, and constructing a weighted directed graph;
step 2-5: calculating the shortest route distance shortestDist between the departure place and the destination by using the Dijkstra algorithm;
step 2-6: taking a multiplier ratio, calculating the route constraint distance limitDist (see the code sketch following step 2-9):
limitDist=shortestDist*ratio;
step 2-7: searching out, in the weighted directed graph, all potential feasible routes whose distance from the departure place to the destination is not greater than the route constraint distance limitDist, by means of a depth-first search with backtracking;
step 2-8: the potential feasible routes are sorted and screened, the first route is the optimal route, and the sorting standard is as follows according to the priority sequence from high to low: the sum of the route barrier indexes is the lowest, the number of intersections passed by the route is the least, and the route distance is the shortest;
step 2-9: and obtaining the path code and the road intersection code related to the optimal route in sequence.
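As an illustration of steps 2-3 to 2-6, the following Java sketch builds the directed, length-weighted intersection graph from the path codes and computes the shortest route distance with Dijkstra's algorithm; class and method names are illustrative and are not the identifiers of the actual implementation.

```java
import java.util.*;

// Illustrative sketch of steps 2-3 to 2-6. A path code is the concatenation of its start
// and end intersection codes, so splitting a path code in half yields the two vertices of
// one directed, length-weighted edge.
public class RoutePlanner {

    // adjacency: intersection code -> (neighbouring intersection code -> path length in metres)
    private final Map<String, Map<String, Double>> graph = new HashMap<>();

    // steps 2-3 / 2-4: register one path as a directed edge from start to end intersection
    public void addPath(String pathCode, double distance) {
        String from = pathCode.substring(0, pathCode.length() / 2);
        String to = pathCode.substring(pathCode.length() / 2);
        graph.computeIfAbsent(from, k -> new HashMap<>()).put(to, distance);
    }

    // step 2-5: Dijkstra shortest route distance between departure and destination intersections
    public double shortestDist(String source, String target) {
        Map<String, Double> dist = new HashMap<>();
        PriorityQueue<Map.Entry<String, Double>> pq = new PriorityQueue<>(Map.Entry.comparingByValue());
        dist.put(source, 0.0);
        pq.add(Map.entry(source, 0.0));
        while (!pq.isEmpty()) {
            Map.Entry<String, Double> cur = pq.poll();
            String u = cur.getKey();
            if (cur.getValue() > dist.get(u)) continue;          // stale queue entry
            if (u.equals(target)) return cur.getValue();
            for (Map.Entry<String, Double> e : graph.getOrDefault(u, Map.of()).entrySet()) {
                double alt = cur.getValue() + e.getValue();
                if (alt < dist.getOrDefault(e.getKey(), Double.MAX_VALUE)) {
                    dist.put(e.getKey(), alt);
                    pq.add(Map.entry(e.getKey(), alt));
                }
            }
        }
        return Double.MAX_VALUE;                                  // destination unreachable
    }

    // step 2-6: route constraint distance, the ratio defaulting to 1.5 per the description
    public double limitDist(String source, String target, double ratio) {
        return shortestDist(source, target) * ratio;
    }
}
```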
Step 2-2 comprises:
the input destination refers to the intersection code of the input target intersection;
the determining of the departure place, the starting path and the starting intersection specifically comprises the following steps:
step 2-2-1, taking the user position as the departure place, calculating the latitude and longitude coordinate at an azimuth of 0 degrees and a distance of 20 meters according to the second-point latitude/longitude formula, wherein the latitude of this coordinate is called the upper latitude boundary and is denoted hiLat;
let the starting point be point S, with Sw the latitude of point S, Sj the longitude of point S, Bs the azimuth angle, Ls the distance, and R the mean radius of the earth; the second-point latitude/longitude formula comprises:
c = (180/π)·(Ls/R)
a = arccos(cos(90-Sw)·cos(c) + sin(90-Sw)·sin(c)·cos(Bs))
ΔLng = arcsin(sin(c)·sin(Bs)/sin(a))
Tw = 90 - a
Tj = Sj + ΔLng
wherein c is the angle at the geocenter subtended by the arc of length Ls connecting its two end points on the earth's surface; a is the deviation of the latitude of the resulting point T from 90 degrees north latitude; ΔLng is the deviation of the longitude of the resulting point T from the longitude of point S;
the second point in the second-point latitude/longitude formula is T, with Tw the latitude of point T and Tj the longitude of point T; the formula is used for calculating the latitude and longitude coordinates of the point T reached by travelling a distance Ls from a point S on the earth's surface along azimuth Bs.
Step 2-2-2, respectively calculating longitude and latitude coordinates of positions 20 meters away along the azimuth angles of 180 degrees, 90 degrees and 270 degrees to obtain a lower latitude boundary, an upper longitude boundary and a lower longitude boundary of a calculation range, and respectively marking as loLat, hiLng and loLng;
step 2-2-3, finding the path points in the blind-guide map with loLat < m < hiLat and loLng < n < hiLng, wherein m denotes path point latitude and n denotes path point longitude;
step 2-2-4, calculating the distance between each path point found in step 2-2-3 and the user position according to the two-point distance formula and finding the nearest path point; the remaining path points of the path containing that point, taken in its direction of travel, form the starting path, and the ending intersection of that path is the first intersection of the whole route, called the starting intersection;
the two-point distance formula is expressed as follows:
let P and Q be two points, with the latitude and longitude of P being Pw and Pj, the latitude and longitude of Q being Qw and Qj, and the distance between the two points being L;
L = 2R·arcsin(√(sin²((Pw-Qw)/2) + cos(Pw)·cos(Qw)·sin²((Pj-Qj)/2)))
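The following Java sketch collects the geometric helpers of step 2-2: the second-point latitude/longitude formula, the two-point distance formula (written here in haversine form, an assumption consistent with the note elsewhere in the description that the arcsine is evaluated in radians), and the 20-meter search bounds of steps 2-2-1 and 2-2-2. Names and the Earth radius value are illustrative.

```java
// Geometric helpers for step 2-2 (angles in degrees, distances in metres).
public final class SphereGeo {
    static final double R = 6371000.0;   // mean Earth radius in metres (assumed value)

    /** Second-point formula: latitude/longitude reached from (sw, sj) after
     *  travelling ls metres along azimuth bs (0 = north, clockwise). */
    static double[] secondPoint(double sw, double sj, double bs, double ls) {
        double c = Math.toDegrees(ls / R);                       // geocentric angle of the arc
        double a = Math.toDegrees(Math.acos(
                Math.cos(Math.toRadians(90 - sw)) * Math.cos(Math.toRadians(c))
              + Math.sin(Math.toRadians(90 - sw)) * Math.sin(Math.toRadians(c))
              * Math.cos(Math.toRadians(bs))));
        double dLng = Math.toDegrees(Math.asin(
                Math.sin(Math.toRadians(c)) * Math.sin(Math.toRadians(bs))
              / Math.sin(Math.toRadians(a))));
        return new double[] {90 - a, sj + dLng};                 // {Tw, Tj}
    }

    /** Two-point distance L between P (pw, pj) and Q (qw, qj), haversine form. */
    static double distance(double pw, double pj, double qw, double qj) {
        double dPhi = Math.toRadians(qw - pw) / 2;
        double dLam = Math.toRadians(qj - pj) / 2;
        double h = Math.sin(dPhi) * Math.sin(dPhi)
                 + Math.cos(Math.toRadians(pw)) * Math.cos(Math.toRadians(qw))
                 * Math.sin(dLam) * Math.sin(dLam);
        return 2 * R * Math.asin(Math.sqrt(h));
    }

    /** Search bounds of steps 2-2-1/2-2-2: points 20 m due N, S, E and W of the user. */
    static double[] searchBounds(double userLat, double userLng) {
        double hiLat = secondPoint(userLat, userLng,   0, 20)[0];
        double loLat = secondPoint(userLat, userLng, 180, 20)[0];
        double hiLng = secondPoint(userLat, userLng,  90, 20)[1];
        double loLng = secondPoint(userLat, userLng, 270, 20)[1];
        return new double[] {loLat, hiLat, loLng, hiLng};
    }
}
```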
the step 3 comprises the following steps:
step 3-1: extracting paths and intersections from the user-defined blind guiding map in sequence according to the path codes and the road intersection codes obtained by the optimal route;
step 3-2: extracting a path point sequence forming the initial path aiming at the initial path in the step 2-2;
further extracting path point sequences constituting the paths extracted in the step 3-1;
screening inflection points from each intersection extracted in the step 3-1 according to the following rules to form an inflection point sequence:
taking the nearest inflection point to the last path point of the intersection entering path as an entering inflection point;
taking an inflection point nearest to a first path point of an intersection departure path as a departure inflection point;
in an inflection point storage table of the intersection, sequentially extracting inflection points from an entering inflection point to a leaving inflection point;
connecting the path point sequence and the inflection point sequence in sequence to form a guide point sequence;
step 3-3: determining the turning direction of each intersection in the optimal route, and storing corresponding turning information in a first turning point of a turning point sequence corresponding to the intersection in the guide point sequence extracted in the step 3-2;
step 3-4: calculating a direction angle which should advance by using a true heading angle calculation formula according to longitude and latitude coordinates and guide point coordinates of the vision-impaired user in real time, wherein the direction angle which should advance is called a target orientation angle, and the user is guided step by combining the orientation angles of the user to finish traveling;
step 3-5: and pre-stored barrier information, warning information and turning information encountered in the advancing process are broadcasted to a user.
In step 3-3, determining the turning direction of each intersection in the optimal route includes:
step 3-3-1, for the last two path points of the entering path, calculate the entering direction angle according to the true course angle calculation formula and denote it enterAngle; for the first two path points of the departure path, calculate the departure direction angle according to the true course angle calculation formula and denote it leaveAngle;
step 3-3-2, judge the turning direction from the relative positions of the entering direction angle and the departure direction angle: if the departure direction angle deviates from the entering direction angle by no more than a threshold delta of 30 degrees, the turn is judged to be straight; if the departure direction angle deviates to the left of the entering direction angle by more than delta, it is judged to be a left turn; and if it deviates to the right of the entering direction angle by more than delta, it is judged to be a right turn;
step 3-3-3, calculate the reverse angle of the entering direction angle, reverse = (enterAngle + 180) % 360, where % is the modulo operator, and divide the discussion into the following three cases based on the interval [enterAngle - delta, enterAngle + delta]:
Case one: enterAngle - delta < 0
If leaveAngle ≥ enterAngle - delta + 360 or leaveAngle ≤ enterAngle + delta, the turn is judged to be straight; if leaveAngle ∈ (enterAngle + delta, reverse), it is judged to be a right turn; the remaining cases are left turns;
Case two: enterAngle + delta ≥ 360
If leaveAngle ≥ enterAngle - delta or leaveAngle ≤ enterAngle + delta - 360, the turn is judged to be straight; if leaveAngle ∈ (reverse, enterAngle - delta), it is judged to be a left turn; the remaining cases are right turns;
Case three: enterAngle - delta ≥ 0 and enterAngle + delta < 360
If leaveAngle ∈ [enterAngle - delta, enterAngle + delta], the turn is judged to be straight; otherwise, two situations are distinguished according to whether enterAngle is greater than 180:
1) when enterAngle < 180, if leaveAngle ∈ (enterAngle + delta, reverse), it is judged to be a right turn, and the remaining cases are left turns;
2) when enterAngle ≥ 180, if leaveAngle ∈ (reverse, enterAngle - delta), it is judged to be a left turn, and the remaining cases are right turns.
In step 3-4, the true heading angle calculation formula is as follows:
two points A and B are given;
Aw is the latitude of point A and Aj the longitude of point A; Bw is the latitude of point B and Bj the longitude of point B;
c=arccos(cos(90-Bw)·cos(90-Aw)+sin(90-Bw)·sin(90-Aw)·cos(Bj-Aj))
ΔB = arccos((cos(90-Bw) - cos(90-Aw)·cos(c)) / (sin(90-Aw)·sin(c)))
bearing = ΔB if Bj ≥ Aj, otherwise bearing = 360 - ΔB
c is an angle corresponding to a minor arc formed by the two points A, B and the geocentric on the earth surface;
bearing is the initial true course angle of A pointing to B; Δ B is an intermediate parameter;
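A Java sketch of the true course angle computation is given below. The split into the intermediate angle ΔB and a quadrant correction follows the description, but the exact correction rule (mirroring ΔB when B lies to the west of A) is an assumption, since the original image formulas are not reproduced in the text; it holds when the longitude difference is below 180 degrees.

```java
public final class Bearing {
    /** Initial true course angle in degrees, clockwise from north, from A (aw, aj) to B (bw, bj). */
    public static double trueHeading(double aw, double aj, double bw, double bj) {
        double c = Math.acos(clamp(
                Math.cos(Math.toRadians(90 - bw)) * Math.cos(Math.toRadians(90 - aw))
              + Math.sin(Math.toRadians(90 - bw)) * Math.sin(Math.toRadians(90 - aw))
              * Math.cos(Math.toRadians(bj - aj))));                      // geocentric angle A..B (radians)
        double dB = Math.toDegrees(Math.acos(clamp(
                (Math.cos(Math.toRadians(90 - bw)) - Math.cos(Math.toRadians(90 - aw)) * Math.cos(c))
              / (Math.sin(Math.toRadians(90 - aw)) * Math.sin(c)))));     // unsigned bearing, 0..180
        return (bj >= aj) ? dB : 360 - dB;                                // east of A: keep; west: mirror
    }

    private static double clamp(double x) {                               // guard acos against rounding
        return Math.max(-1.0, Math.min(1.0, x));
    }
}
```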
in step 3-4, the following two tasks are performed simultaneously:
task one: correcting the user orientation in real time, directing it to advance towards the target guidance point:
in each step the visually impaired user advances toward the guide point in the guide point list that is closest to him or her, called the target guide point; the point following the target guide point is called the next target guide point; the intelligent terminal calculates, every 0.5 seconds, the relative positions of the point where the user is located, the target guide point and the next target guide point, and performs direction induction in combination with the user's current orientation;
the direction angle between the user position point and the target guide point is calculated according to the true course angle calculation formula and is called the target orientation angle, denoted tB; the user's orientation at that moment is represented by the azimuth angle of the intelligent terminal, i.e. the user orientation angle, denoted dB, and the reverse angle of dB is reverse = (dB + 180) % 360; taking the target orientation angle as the reference, the direction and the amount by which the user orientation angle should be corrected are calculated, i.e. the direction correction angle that the intelligent terminal must convey to the visually impaired user at that moment;
the direction correction angle consists of two parts: first, the deflection to the left or to the right, denoted turn; second, the amount of deflection, denoted degree; the calculation of the direction correction angle is divided into the following two cases:
Case one: dB ≥ 180
If tB ∈ (reverse, dB], turn is left and degree = dB - tB;
otherwise, i.e. if tB ∈ [0, reverse] ∪ (dB, 360), turn is right and degree = tB - dB; if degree < 0, degree = degree + 360;
Case two: dB < 180
If tB ∈ [dB, reverse), turn is right and degree = tB - dB;
otherwise, i.e. if tB ∈ [0, dB) ∪ [reverse, 360), turn is left and degree = dB - tB; if degree < 0, degree = degree + 360;
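A Java sketch of the direction correction angle of task one follows; the "otherwise" branches correspond to the complementary angle sets described above, and packaging the result as a string is only illustrative.

```java
public final class DirectionCorrector {
    /** Direction correction (turn side + degrees) from the user orientation angle dB
     *  toward the target orientation angle tB, both in [0, 360). */
    public static String correction(double dB, double tB) {
        double reverse = (dB + 180) % 360;
        String turn;
        double degree;
        if (dB >= 180) {                       // case one
            if (tB > reverse && tB <= dB) {    // tB in (reverse, dB]
                turn = "left";
                degree = dB - tB;
            } else {                           // tB in [0, reverse] U (dB, 360)
                turn = "right";
                degree = tB - dB;
                if (degree < 0) degree += 360;
            }
        } else {                               // case two: dB < 180
            if (tB >= dB && tB < reverse) {    // tB in [dB, reverse)
                turn = "right";
                degree = tB - dB;
            } else {                           // tB in [0, dB) U [reverse, 360)
                turn = "left";
                degree = dB - tB;
                if (degree < 0) degree += 360;
            }
        }
        return turn + " " + Math.round(degree) + " degrees";
    }
}
```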
And a second task: target guidance point iteration:
under the direction guidance the visually impaired user slowly approaches the target guide point and must switch to the next target guide point at an appropriate time; the appropriate time is based on the following two conditions, and the target guide point is switched to the next target guide point when either condition is met:
condition one: the distance between the user's current position and the target guide point is less than a threshold value threshold (typically 1 meter);
condition two: the deviation of the target orientation angle tB from the path local direction angle pB is greater than 90;
the path local direction angle pB is the direction angle from the target guide point to the next target guide point, calculated according to the true course angle calculation formula, when both the target guide point and the next target guide point are path points;
the deviation of the target orientation angle tB from the path local direction angle pB is judged to be greater than 90 as follows:
if pB ∈ (90, 270) and |tB - pB| ≥ 90, the deviation of tB from pB is judged to be greater than 90;
if pB ∉ (90, 270) and pB < 270, and tB ∈ [pB + 90, pB + 270], the deviation of tB from pB is judged to be greater than 90;
if pB ∉ (90, 270) and pB ≥ 270, and tB ∈ [(pB + 90) % 360, pB - 90], the deviation of tB from pB is judged to be greater than 90.
Beneficial effects:
The method is implemented on an intelligent terminal platform, so that outdoor travel for visually impaired people can be accomplished at low cost and with simple operation, without relying on customized special equipment: an optimal route is planned and selected, and point-by-point guidance keeps the user travelling in the correct direction. Meanwhile, the dedicated blind-guide map is easy to construct: a basic map for blind guidance can be built simply by walking through the area, following simple rules, while carrying an intelligent terminal on which the software implementing the method is installed. In addition, the method requires neither the construction of a large amount of special infrastructure nor major changes to the environment.
Drawings
The foregoing and/or other advantages of the invention will become further apparent from the following detailed description of the invention when taken in conjunction with the accompanying drawings.
FIG. 1 is a flow chart of an implementation of the present invention.
Fig. 2 is a conceptual diagram of a path in the present invention.
Fig. 3 is a schematic diagram of the type of intersection in the present invention.
Fig. 4 is a block diagram of a route planning selection step in the present invention.
FIG. 5 is a schematic diagram of the construction of a corner sequence in the present invention.
Fig. 6 is a schematic diagram of a guide point sequence in the present invention.
Fig. 7 is a schematic view of turning direction determination in the present invention.
FIG. 8 is a schematic diagram of the direction induction in the present invention.
Fig. 9 is a display view of the direction guide machine of the present invention.
Fig. 10 is a flowchart of calculation of the orientation correction angle.
Fig. 11 is a specific determination flowchart in which the degree of deviation of the target orientation angle tB from the path local direction angle pB is greater than 90.
Detailed Description
Fig. 1 is a flowchart of an outdoor travel direction induction method for visually impaired people based on an intelligent terminal according to an embodiment of the present invention.
The user-defined blind guide map in the step 1 is drawn by a traffic network formed by all road intersections in an area and bidirectional paths on the right side of the roads;
the path is a curve obtained from a position 5 meters away from the starting intersection to a position 5 meters away from the ending intersection along the right side of the road and 1 meter away from the edge, and each road has one to two paths;
FIG. 2 is a schematic diagram of the path concept of the present invention;
the intersection is the short for the road intersection;
wherein the intersection comprises the following fields:
id, which represents a primary key and is the unique identifier of the road intersection table;
the junctionCode is an intersection code, the field is set as the code of the road intersection, and the length of two bytes is taken as the unique identifier of the intersection;
the type is an intersection type, the field stores the type of the intersection, and the intersection type can be divided into five types, namely X type, T type, Y type, L type and O type;
FIG. 3 is a schematic diagram of the present invention relating to five types of intersections;
wherein the path includes the following fields:
id, which represents the primary key and is the unique identifier of the path table;
the roadCode is a path code, the field stores the path code of each path, and the path code is formed by splicing the initial intersection code and the end intersection code of the path;
the type is a path type, and the field stores the type of the path and is divided into a straight line or a curve;
name is the road name; this field stores the name of the road on which the path lies;
distance is the path distance; this field stores the length of the path.
Step 1, drawing road intersections, paths, obstacle warning information and other elements by longitude and latitude coordinate points of the map, namely drawing the paths by path points with fixed spacing distances, drawing the road intersections by inflection points in intersections, wherein the inflection points and the path points store obstacle information and necessary warning information along the way;
the path point includes the following fields:
id, which represents the primary key and is the unique identifier of the path point table;
latitude is the path point latitude, and the field stores the path point latitude;
longitude is the path point longitude; this field stores the longitude of the path point;
attention is warning information; this field stores necessary warning information on the path, such as uphill sections and slippery road surfaces;
obstacle is obstacle information, and the field stores the obstacle information encountered on the path;
obsacleStor is the obstacle grading index; this field stores an index that grades the obstacle by degree of risk, taking one of the five values 1, 2, 3, 4 and infinity; the higher the index, the higher the risk, and infinity indicates that the obstacle cannot be passed;
the roadId is a path identifier, and the field stores a primary key identifier corresponding to a path where the path point is located;
the inflection point is defined as an intersection point of straight lines where each path connected with one intersection is located, the storage sequence is a counterclockwise sequence in a physical space, and the inflection point comprises the following fields:
id represents a primary key and is the unique identifier of the inflection point table;
latitude is inflection point latitude, and the field stores the inflection point latitude;
longitude is the inflection point longitude; this field stores the longitude of the inflection point;
turning direction is the intersection turning direction; this field stores the turning direction to be taken when passing through the inflection point, one of left, straight and right;
the junctionId is a road intersection identifier, and the field stores a main key identifier corresponding to the road intersection where the inflection point is located;
wherein the longitude and latitude are all under WGS84 coordinate system.
In the practical verification, the lightweight embedded SQLite database built into the Android system is used to store the paths, intersections, path points and inflection points.
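A possible SQLite layout for these four tables is sketched below in Java using Android's SQLiteOpenHelper; the table and column names are illustrative renderings of the fields listed above, not the exact schema of the implementation.

```java
import android.content.Context;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;

// Illustrative SQLite layout of the blind-guide map storage.
public class GuideMapDbHelper extends SQLiteOpenHelper {

    public GuideMapDbHelper(Context ctx) {
        super(ctx, "guide_map.db", null, 1);
    }

    @Override
    public void onCreate(SQLiteDatabase db) {
        // road intersections (junctionCode is the intersection code; type is X/T/Y/L/O)
        db.execSQL("CREATE TABLE junction (id INTEGER PRIMARY KEY, junctionCode TEXT, type TEXT)");
        // paths (roadCode = start intersection code + end intersection code)
        db.execSQL("CREATE TABLE road (id INTEGER PRIMARY KEY, roadCode TEXT, type TEXT,"
                + " name TEXT, distance REAL)");
        // path points; an impassable obstacle can be stored with a large sentinel score
        db.execSQL("CREATE TABLE road_point (id INTEGER PRIMARY KEY, latitude REAL, longitude REAL,"
                + " attention TEXT, obstacle TEXT, obstacleScore REAL, roadId INTEGER)");
        // inflection points, stored counter-clockwise per intersection
        db.execSQL("CREATE TABLE inflection_point (id INTEGER PRIMARY KEY, latitude REAL,"
                + " longitude REAL, turnDirection TEXT, junctionId INTEGER)");
    }

    @Override
    public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
        // no migrations in this sketch
    }
}
```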
Step 2, the vision-impaired user selects a destination through the intelligent terminal, the program loads the self-defined blind guide map of the area, and an optimal route which is the safest and has an acceptable distance from the departure place to the destination is planned and selected;
FIG. 4 is a block diagram of the steps for selecting the optimal route portion for the entire step 2 planning;
step 2-1: the intelligent terminal is vertically fixed in front of the chest by a vision-impaired user through a portable support, one side of a screen faces towards the chest, and the upper edge of the intelligent terminal is inclined by about 30 degrees towards the front of the body, so that the normal function of an electronic compass in the intelligent terminal is ensured, and the azimuth angle of the intelligent terminal at any moment is the orientation angle of the vision-impaired user;
step 2-2: the method comprises the steps that a destination is input by a vision-impaired user, an intelligent terminal determines a place of departure and then determines a starting path and a starting intersection;
inputting the destination means inputting, by voice, the intersection code of the target intersection;
determining the departure place and then the starting path and the starting intersection proceeds as follows:
first, taking the user's position as the departure place, calculate the latitude and longitude coordinate at an azimuth of 0 degrees and a distance of 20 meters according to the second-point latitude/longitude formula; the latitude of this coordinate is called the upper latitude boundary hiLat;
the second-point latitude/longitude formula:
let the starting point be point S, with Sw the latitude of point S, Sj the longitude of point S, Bs the azimuth angle, Ls the distance, and R the mean radius of the earth;
c = (180/π)·(Ls/R)
a = arccos(cos(90-Sw)·cos(c) + sin(90-Sw)·sin(c)·cos(Bs))
ΔLng = arcsin(sin(c)·sin(Bs)/sin(a))
Tw = 90 - a
Tj = Sj + ΔLng
wherein the second point is T, with Tw the latitude of point T and Tj the longitude of point T;
second, similarly calculate the latitude and longitude coordinates of the positions 20 meters away along azimuths of 180, 90 and 270 degrees to obtain the lower latitude boundary, the upper longitude boundary and the lower longitude boundary of the search range, denoted loLat, hiLng and loLng respectively;
third, find in the blind-guide map database the path points with loLat < latitude < hiLat and loLng < longitude < hiLng;
fourth, calculate the distance between each found path point and the user position according to the two-point distance formula and find the nearest path point; the remaining path points of the path containing that point, taken in its direction of travel, form the starting path, and the ending intersection of that path is the starting intersection;
the two-point distance formula:
let P and Q be two points with latitude and longitude Pw, Pj and Qw, Qj respectively, and let L be the distance between the two points;
L = 2R·arcsin(√(sin²((Pw-Qw)/2) + cos(Pw)·cos(Qw)·sin²((Pj-Qj)/2)))
wherein the arcsine function is evaluated in radians and the remaining functions in degrees;
step 2-3: loading all path codes of a road network in a target area, and constructing a symbolic graph;
loading the road network in the target area means pulling all path codes from the blind-guide map database; each path code is split in half to obtain its two intersection codes, each road intersection is a vertex of the graph, and this step establishes a one-to-one correspondence between intersection codes and integer array subscripts, forming the symbol graph SymbolGraph;
step 2-4: taking the path length of each path as the weight and directing each path from its starting intersection to its ending intersection, construct the weighted directed graph EdgeWeightedDigraph on the basis of the symbol graph SymbolGraph;
step 2-5: obtain the shortest route distance shortestDist between the departure place and the destination by applying the Dijkstra algorithm to the edge-weighted digraph;
step 2-6: with the multiplier ratio defaulting to 1.5 and adjustable by the user, calculate the route constraint distance limitDist:
limitDist=shortestDist·ratio;
step 2-7: search, in the weighted directed graph EdgeWeightedDigraph, for all potential feasible routes PotentialRoutes whose distance from the departure place to the destination is not greater than the route constraint distance limitDist, using a depth-first search with backtracking (a sketch follows step 2-9);
step 2-8: sort and screen the potential feasible routes PotentialRoutes; the first-ranked route is the optimal route MostSuitableRoute;
the sequencing standard comprises the following in sequence according to the priority order:
(1) the sum of the route obstacle indices is lowest;
(2) the number of intersections passed by the route is minimum;
(3) the distance of the route is shortest;
step 2-9: obtaining path codes and road intersection codes related to the optimal route in sequence;
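A Java sketch of steps 2-7 to 2-8, depth-first backtracking over the edge-weighted digraph followed by ranking of the potential routes, is given below; the per-path obstacle sums and the adjacency map are assumed to come from the blind-guide map database, and all names are illustrative.

```java
import java.util.*;

// Illustrative sketch of steps 2-7 to 2-9.
public class RouteSearch {
    static class Route {
        List<String> junctions = new ArrayList<>();  // intersection codes in visiting order
        double distance;                             // total path length in metres
        double obstacleSum;                          // sum of obstacle grading indices along the route
    }

    final Map<String, Map<String, Double>> graph;    // from -> (to -> path length)
    final Map<String, Double> obstacleSumOfPath;     // path code (from+to) -> obstacle index sum

    RouteSearch(Map<String, Map<String, Double>> graph, Map<String, Double> obstacleSumOfPath) {
        this.graph = graph;
        this.obstacleSumOfPath = obstacleSumOfPath;
    }

    // step 2-7: all potential feasible routes from source to target within limitDist
    List<Route> potentialRoutes(String source, String target, double limitDist) {
        List<Route> result = new ArrayList<>();
        Deque<String> stack = new ArrayDeque<>();
        stack.push(source);
        dfs(source, target, limitDist, 0, 0, stack, new HashSet<>(Set.of(source)), result);
        return result;
    }

    private void dfs(String u, String target, double limitDist, double dist, double obs,
                     Deque<String> stack, Set<String> onPath, List<Route> result) {
        if (u.equals(target)) {
            Route r = new Route();
            r.junctions = new ArrayList<>(stack);
            Collections.reverse(r.junctions);        // stack iterates newest-first
            r.distance = dist;
            r.obstacleSum = obs;
            result.add(r);
            return;
        }
        for (Map.Entry<String, Double> e : graph.getOrDefault(u, Map.of()).entrySet()) {
            String v = e.getKey();
            double d = dist + e.getValue();
            if (d > limitDist || onPath.contains(v)) continue;   // prune long or cyclic branches
            double o = obs + obstacleSumOfPath.getOrDefault(u + v, 0.0);
            onPath.add(v); stack.push(v);
            dfs(v, target, limitDist, d, o, stack, onPath, result);
            stack.pop(); onPath.remove(v);
        }
    }

    // step 2-8: lowest obstacle sum, then fewest intersections, then shortest distance
    static Route mostSuitable(List<Route> routes) {
        routes.sort(Comparator.comparingDouble((Route r) -> r.obstacleSum)
                .thenComparingInt(r -> r.junctions.size())
                .thenComparingDouble(r -> r.distance));
        return routes.isEmpty() ? null : routes.get(0);
    }
}
```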
Step 3 of the present invention comprises the following:
step 3-1: extracting path data and intersection data from a user-defined blind guiding map database in sequence according to the path code and the road intersection code obtained by the optimal route;
step 3-2: extracting the path point sequence forming the starting path determined in step 2-2;
further extracting path point sequences constituting the paths extracted in the step 3-1;
screening out required inflection points according to a certain rule for each intersection extracted in the step 3-1 to form an inflection point sequence;
the screening rule is as follows with reference to fig. 5:
firstly, taking an inflection point nearest to the last path point of an entering path of the intersection as an entering inflection point;
taking the nearest inflection point to the first path point of the departure path of the intersection as the departure inflection point;
extracting inflection points from the entering inflection point to the leaving inflection point in the inflection point storage table of the intersection in sequence;
connecting the path point sequence and the inflection point sequence in sequence to form a guide point sequence, wherein fig. 6 is a schematic diagram of the guide point sequence;
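The assembly of the guide point sequence in step 3-2 can be sketched as follows; the point type and the nearest-point test (an equirectangular approximation, adequate only for ranking nearby candidates) are illustrative.

```java
import java.util.*;

// Illustrative sketch of step 3-2: building the guide point sequence.
class GeoPoint {
    double lat, lng;
    GeoPoint(double lat, double lng) { this.lat = lat; this.lng = lng; }
}

class GuidePointBuilder {
    /** Inflection points of one intersection, taken from the entering inflection point to
     *  the leaving inflection point in the stored counter-clockwise order. */
    static List<GeoPoint> cornerSequence(List<GeoPoint> inflections,
                                         GeoPoint lastPointOfEnterPath,
                                         GeoPoint firstPointOfLeavePath) {
        int enter = nearest(inflections, lastPointOfEnterPath);   // entering inflection point
        int leave = nearest(inflections, firstPointOfLeavePath);  // leaving inflection point
        List<GeoPoint> seq = new ArrayList<>();
        for (int i = enter; ; i = (i + 1) % inflections.size()) { // walk the stored order, wrapping
            seq.add(inflections.get(i));
            if (i == leave) break;
        }
        return seq;
    }

    /** Guide points = path point sequences joined with the corner sequence of each intersection. */
    static List<GeoPoint> guidePoints(List<List<GeoPoint>> pathPointSeqs,
                                      List<List<GeoPoint>> cornerSeqs) {
        List<GeoPoint> guide = new ArrayList<>();
        for (int i = 0; i < pathPointSeqs.size(); i++) {
            guide.addAll(pathPointSeqs.get(i));
            if (i < cornerSeqs.size()) guide.addAll(cornerSeqs.get(i));
        }
        return guide;
    }

    static int nearest(List<GeoPoint> pts, GeoPoint ref) {
        int best = 0;
        double bestDist = Double.MAX_VALUE;
        for (int i = 0; i < pts.size(); i++) {
            double d = approxDist2(pts.get(i), ref);
            if (d < bestDist) { bestDist = d; best = i; }
        }
        return best;
    }

    /** Squared equirectangular approximation of the separation, in degrees squared. */
    static double approxDist2(GeoPoint a, GeoPoint b) {
        double dLat = a.lat - b.lat;
        double dLng = (a.lng - b.lng) * Math.cos(Math.toRadians((a.lat + b.lat) / 2));
        return dLat * dLat + dLng * dLng;
    }
}
```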
step 3-3: judging the turning direction of each intersection in the optimal route according to the direction of the entering route and the direction of the leaving route, and storing the turning information in the first turning point of the corresponding turning point sequence extracted in the step 3-2;
the turning direction determination method is specifically described below with reference to fig. 7:
for the last two path points of the entering path, the entering direction angle is calculated according to the true course angle calculation formula and denoted enterAngle; for the first two path points of the departure path, the departure direction angle is calculated according to the true course angle calculation formula and denoted leaveAngle;
the turning direction is judged from the relative positions of the two direction angles, with the threshold denoted delta = 30 degrees: if the departure direction angle deviates from the entering direction angle by no more than 30 degrees, the turn is judged to be straight; if the departure direction angle deviates to the left of the entering direction angle by more than 30 degrees, it is judged to be a left turn; and if it deviates to the right of the entering direction angle by more than 30 degrees, it is judged to be a right turn;
and the interval of the azimuth angle degree of the electronic compass of the intelligent terminal is [0,360), so the specific judgment mode of steering is as follows:
the reverse angle of the entering direction angle is calculated as reverse = (enterAngle + 180) % 360 (% being the modulo operator), and the discussion is divided into three cases based on the interval [enterAngle - delta, enterAngle + delta]:
Case one: enterAngle - delta < 0
If leaveAngle ≥ enterAngle - delta + 360 or leaveAngle ≤ enterAngle + delta, the turn is judged to be straight. If leaveAngle ∈ (enterAngle + delta, reverse), it is judged to be a right turn. The remaining cases are left turns.
Case two: enterAngle + delta ≥ 360
If leaveAngle ≥ enterAngle - delta or leaveAngle ≤ enterAngle + delta - 360, the turn is judged to be straight. If leaveAngle ∈ (reverse, enterAngle - delta), it is judged to be a left turn. The remaining cases are right turns.
Case three: enterAngle - delta ≥ 0 and enterAngle + delta < 360
If leaveAngle ∈ [enterAngle - delta, enterAngle + delta], the turn is judged to be straight. Otherwise, two situations are distinguished according to whether enterAngle is greater than 180: 1) when enterAngle < 180, if leaveAngle ∈ (enterAngle + delta, reverse), it is judged to be a right turn, and the remaining cases are left turns; 2) when enterAngle ≥ 180, if leaveAngle ∈ (reverse, enterAngle - delta), it is judged to be a left turn, and the remaining cases are right turns.
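A Java sketch transcribing the three turning-direction cases above follows; the tolerance delta = 30 degrees and the interval endpoints follow the description, while the function name and return values are illustrative.

```java
public final class TurnJudge {
    /** Turning direction at an intersection from the entering direction angle enterAngle and
     *  the departure direction angle leaveAngle, both in [0, 360). */
    public static String turnDirection(double enterAngle, double leaveAngle) {
        final double delta = 30;
        double reverse = (enterAngle + 180) % 360;
        if (enterAngle - delta < 0) {                                        // case one
            if (leaveAngle >= enterAngle - delta + 360 || leaveAngle <= enterAngle + delta) return "straight";
            if (leaveAngle > enterAngle + delta && leaveAngle < reverse) return "right";
            return "left";
        } else if (enterAngle + delta >= 360) {                              // case two
            if (leaveAngle >= enterAngle - delta || leaveAngle <= enterAngle + delta - 360) return "straight";
            if (leaveAngle > reverse && leaveAngle < enterAngle - delta) return "left";
            return "right";
        } else {                                                             // case three
            if (leaveAngle >= enterAngle - delta && leaveAngle <= enterAngle + delta) return "straight";
            if (enterAngle < 180) {
                return (leaveAngle > enterAngle + delta && leaveAngle < reverse) ? "right" : "left";
            } else {
                return (leaveAngle > reverse && leaveAngle < enterAngle - delta) ? "left" : "right";
            }
        }
    }
}
```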
Step 3-4: calculating a direction to be advanced in real time according to the longitude and latitude coordinates and the guide point coordinates of the vision-impaired user, namely a target orientation angle, and gradually guiding the user to finish traveling by combining the orientation angle of the user;
calculating a target orientation angle by adopting a true course angle calculation formula;
the true heading angle calculation formula is expressed as follows:
c=arccos(cos(90-Bw)·cos(90-Aw)+sin(90-Bw)·sin(90-Aw)·cos(Bj-Aj))
ΔB = arccos((cos(90-Bw) - cos(90-Aw)·cos(c)) / (sin(90-Aw)·sin(c)))
bearing = ΔB if Bj ≥ Aj, otherwise bearing = 360 - ΔB
the procedure of this step performs two tasks simultaneously:
task one: as shown in fig. 8, the user orientation is corrected in real time, directing it to proceed towards the target guidance point:
in each step the visually impaired user advances toward the nearest guide point in the guide point list, called the target guide point; the point following the target guide point is called the next target guide point; the system calculates, every 0.5 seconds, the relative positions of the user's point, the target guide point and the next target guide point, and performs direction induction in combination with the user's current orientation.
Calculating a formula according to the true heading angle, wherein a direction angle between a user position point and a target guide point is called a target orientation angle and is recorded as tB; the orientation of the user at the moment, namely the orientation angle of the intelligent terminal, is called as a user orientation angle and is recorded as dB, and the reverse angle of the dB is recorded as (dB + 180)% 360; and calculating the direction and the degree which should be corrected by the user orientation angle by taking the target orientation angle as a reference, namely the direction correction angle which needs to be transmitted to the vision-impaired user by the system at the moment, wherein fig. 9 is a direction-guiding real machine display diagram.
The direction correction angle consists of two points: first, deflection to the left or to the right, denoted turn; second, how much deflection, denoted as degree, the direction correction angle is calculated as shown in FIG. 10:
Case one: dB ≥ 180
If tB ∈ (reverse, dB], turn is left and degree = dB - tB;
otherwise, i.e. if tB ∈ [0, reverse] ∪ (dB, 360), turn is right and degree = tB - dB; if degree < 0, degree = degree + 360;
Case two: dB < 180
If tB ∈ [dB, reverse), turn is right and degree = tB - dB;
otherwise, i.e. if tB ∈ [0, dB) ∪ [reverse, 360), turn is left and degree = dB - tB; if degree < 0, degree = degree + 360;
And a second task: a target bootstrap point iteration;
under the direction guidance the visually impaired user slowly approaches the target guide point, and must then switch to the next target guide point at an appropriate time so that the whole travel process is formed. For the appropriate time the method provides two conditions, and the target guide point is switched to the next point when either condition is met:
condition one: the distance between the user's current position and the target guide point is less than 1 meter, this threshold being denoted threshold;
and a second condition: the degree of deviation of the target orientation angle tB from the path local orientation angle pB is greater than 90;
the path local direction angle pB is the direction angle from the target guide point to the next target guide point, calculated according to the true heading angle calculation formula, when both the target guide point and the next target guide point are path points;
the determination that the deviation of the target orientation angle tB from the path local direction angle pB is greater than 90, shown in fig. 11, is made as follows:
if pB ∈ (90, 270) and |tB - pB| ≥ 90, the deviation of tB from pB is judged to be greater than 90;
if pB ∉ (90, 270) and pB < 270, and tB ∈ [pB + 90, pB + 270], the deviation of tB from pB is judged to be greater than 90;
if pB ∉ (90, 270) and pB ≥ 270, and tB ∈ [(pB + 90) % 360, pB - 90], the deviation of tB from pB is judged to be greater than 90.
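The guide point switching of task two can be sketched as follows; both conditions and the three-branch deviation test follow the description, with illustrative names.

```java
public final class GuidePointSwitcher {
    /** Switch to the next guide point when the user is within `threshold` metres of the current
     *  target guide point, or when the target orientation angle tB deviates from the path local
     *  direction angle pB by more than 90 degrees (angles in [0, 360)). */
    public static boolean shouldSwitch(double distToTarget, double threshold, double tB, double pB) {
        if (distToTarget < threshold) return true;                // condition one (e.g. 1 m)
        // condition two: deviation of tB from pB greater than 90 degrees
        if (pB > 90 && pB < 270) {
            return Math.abs(tB - pB) >= 90;
        } else if (pB < 270) {                                    // pB in [0, 90]
            return tB >= pB + 90 && tB <= pB + 270;
        } else {                                                  // pB in [270, 360)
            return tB >= (pB + 90) % 360 && tB <= pB - 90;
        }
    }
}
```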
Step 3-5: and the information of the pre-stored obstacles and the warning information encountered in the advancing process are broadcasted to the user.
If the obstacle information is stored in the front target guide point during traveling, the system broadcasts to help the visually impaired people to avoid;
warning information including POI (point of interest) pre-stored special road conditions and the like can be broadcasted to the user so as to deepen the cognition of the visually impaired people on the surrounding environment;
and (3) informing the visually impaired user of the next turning information before each intersection, wherein the turning information is stored in the first turning point in the turning point sequence extracted from each intersection before the user approaches each intersection, and the information can be broadcasted to the user at the moment.
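One way the prompts of step 3-5 could be voiced on the Android terminal is through the platform TextToSpeech engine, as sketched below; the use of TextToSpeech and the Chinese locale are assumptions, since the description only states that the information is broadcast to the user.

```java
import android.content.Context;
import android.speech.tts.TextToSpeech;
import java.util.Locale;

// Illustrative voice broadcasting of obstacle, warning and turning prompts.
public class VoicePrompter {
    private final TextToSpeech tts;

    public VoicePrompter(Context ctx) {
        tts = new TextToSpeech(ctx, status -> {
            if (status == TextToSpeech.SUCCESS) tts.setLanguage(Locale.CHINA);
        });
    }

    /** Announce a prompt, e.g. a direction correction such as "turn left 30 degrees". */
    public void announce(String message) {
        tts.speak(message, TextToSpeech.QUEUE_FLUSH, null, "guide-prompt");
    }

    public void shutdown() {
        tts.shutdown();
    }
}
```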
The invention provides an outdoor travel direction induction method for visually impaired people based on an intelligent terminal; there are many specific methods and ways to implement this technical solution. All components not specified in this embodiment can be implemented with existing technology.

Claims (9)

1. An outdoor travel direction induction method for visually impaired people based on an intelligent terminal is characterized by comprising the following steps:
step 1, creating a user-defined blind guide map storing detailed road network conditions in a target area, wherein the user-defined blind guide map draws road intersections, paths and obstacle warning information by longitude and latitude coordinate points;
step 2, the visually impaired user selects a destination through the intelligent terminal, the intelligent terminal loads the self-defined blind guide map of the area, and a route which is the safest and has a proper distance from the departure place to the destination is planned and selected;
step 3, advancing induction: in the walking process of the visually impaired users, direction guidance is given to the visually impaired users for each step of advancing according to the longitude and latitude coordinates and the orientation azimuth of the visually impaired users at each moment, and obstacle information, warning information and turning information prestored in the process can be timely broadcasted to the users so as to guide the visually impaired users to walk and turn according to a path calibrated by a blind guiding map and finally reach a destination.
2. The method of claim 1, wherein:
in the step 1, the user-defined blind guide map is drawn by a traffic network formed by road intersections and bidirectional paths on the right side of the road in the area;
the path is the curve that begins X1 meters from the starting intersection, runs along the right side of the road at a distance of X2 meters from the edge, and ends X1 meters before the ending intersection; each road has one or two paths;
the intersection is a road intersection for short;
wherein the intersection comprises the following fields:
id, which represents a primary key and is the unique identifier of the road intersection table;
the junctionCode is an intersection code, the field is set as the code of the road intersection, and the length of two bytes is taken as the unique identifier of the intersection;
the type is an intersection type, the field stores the type of the intersection, and the intersection type is divided into five types of X type, T type, Y type, L type and O type;
wherein the path includes the following fields:
id, which represents the primary key and is the unique identifier of the path table;
the roadCode is a path code, the field stores the path code of each path, and the path code is formed by splicing the initial intersection code and the end intersection code of the path;
the type is a path type, the field stores the type of the path, and the type of the path is divided into a straight line and a curve;
name is the road name; this field stores the name of the road on which the path lies;
distance is the path distance; this field stores the length of the path.
3. The method of claim 2,
in the step 1, the self-defined blind guiding map draws road intersections, paths and obstacle warning information by longitude and latitude coordinate points, namely, the paths are drawn by path points with fixed spacing distances, the road intersections are drawn by inflection points in the intersections, and the inflection points and the path points store the obstacle information and necessary warning information along the way;
the path point includes the following fields:
id, which represents the primary key and is the unique identifier of the path point table;
latitude is the path point latitude, and the field stores the path point latitude;
longitude is the path point longitude; this field stores the longitude of the path point;
attention is warning information, and the field stores necessary warning information on the path;
obstacle is obstacle information, and the field stores the obstacle information encountered on the path;
obsacleStor is the obstacle grading index; this field stores an index that grades the obstacle by degree of risk, taking one of five values: 1, 2, 3, 4 and infinity; the higher the index, the higher the risk, and infinity indicates that the obstacle cannot be passed;
the roadId is a path identifier, and the field stores a primary key identifier corresponding to a path where the path point is located;
the inflection point is defined as an intersection point of straight lines where each path connected with one intersection is located, the storage sequence of the inflection point is a counterclockwise sequence in a physical space, and the inflection point comprises the following fields:
id represents a primary key and is the unique identifier of the inflection point table;
latitude is inflection point latitude, and the field stores the inflection point latitude;
longitude is the inflection point longitude; this field stores the longitude of the inflection point;
turning direction is the intersection turning direction; this field stores the turning direction to be taken when passing through the inflection point, one of left, straight and right;
the junctionId is a road intersection identifier, and the field stores a main key identifier corresponding to the road intersection where the inflection point is located.
4. A method according to claim 3, characterized in that step 2 comprises the steps of:
step 2-1: the intelligent terminal is vertically fixed in front of the chest by the visually impaired user, one side of the screen faces towards the chest, the upper edge of the intelligent terminal is inclined towards the front of the body, and the azimuth angle of the intelligent terminal at any moment is the orientation angle of the visually impaired user;
step 2-2: the method comprises the steps that a destination is input by a vision-impaired user, a starting place is determined by an intelligent terminal, and a starting path and a starting intersection are determined;
step 2-3: loading all path codes of a road network in a target area to construct a symbolic graph;
step 2-4: taking the path length of each path as a weight, leading the initial intersection of the path to point to the ending intersection, and constructing a weighted directed graph;
step 2-5: calculating the shortest route distance shortestDist between the departure place and the destination by using the Dijkstra algorithm;
step 2-6: taking a multiplier ratio, calculating the route constraint distance limitDist:
limitDist=shortestDist*ratio;
step 2-7: searching out, in the weighted directed graph, all potential feasible routes whose distance from the departure place to the destination is not greater than the route constraint distance limitDist, by means of a depth-first search with backtracking;
step 2-8: the potential feasible routes are sorted and screened, the first route is the optimal route, and the sorting standard is as follows according to the priority sequence from high to low: the sum of the route barrier indexes is the lowest, the number of intersections passed by the route is the least, and the route distance is the shortest;
step 2-9: and obtaining the path code and the road intersection code related to the optimal route in sequence.
5. The method of claim 4, wherein step 2-2 comprises:
the input destination refers to the intersection code of the input target intersection;
the determining of the departure place, the starting path and the starting intersection specifically comprises the following steps:
step 2-2-1, taking the user position as a starting place, calculating a latitude and longitude coordinate with the azimuth angle of 0 degree and the distance of 20 meters according to a second point latitude formula, wherein the latitude of the latitude coordinate is called as an upper latitude boundary and is marked as hiLat;
let the starting point be point S, SwIs the latitude of the point S, SjLongitude of S point, BsIs an azimuth angle, LsIf R is the average radius of the earth, the second point longitude and latitude formula includes:
Figure FDA0002990349070000031
a=arccos(cos(90-Sw)·cos(c)+sin(90-Sw)·sin(c)·cos(Bs))
Figure FDA0002990349070000032
Tw=90-a
Tj=Sj+ΔLng
wherein c is the arc length LsThe two end points of the arc on the earth surface are connected with the corresponding angle behind the geocentric; a is the deviation value between the latitude of the obtained point T and the northern latitude of 90 degrees; delta Lng refers to the deviation value of the longitude of the obtained point T and the longitude of the S point;
the second point in the second point longitude and latitude formula is T, where Tw is the latitude of point T and Tj is the longitude of point T; the formula is evaluated in radians;
step 2-2-2, respectively calculating longitude and latitude coordinates of positions 20 meters away along the azimuth angles of 180 degrees, 90 degrees and 270 degrees to obtain a lower latitude boundary, an upper longitude boundary and a lower longitude boundary of a calculation range, and respectively marking as loLat, hiLng and loLng;
step 2-2-3, finding the path points in the blind guide map satisfying loLat < m < hiLat and loLng < n < hiLng, where m is the path point latitude and n is the path point longitude;
step 2-2-4, calculating the distance between each path point found in step 2-2-3 and the user position according to the distance formula between two points, and finding the nearest path point; the path on which the nearest path point lies, taken as the remaining path points in the forward direction from the nearest path point, forms the starting path; the ending intersection corresponding to this path is the first intersection of the whole route and is called the starting intersection;
the distance formula between two points is as follows:
let there be two points P and Q, where the latitude and longitude of P are Pw and Pj respectively, the latitude and longitude of Q are Qw and Qj respectively, and the distance between the two points is L;
L=R·arccos(cos(90-Pw)·cos(90-Qw)+sin(90-Pw)·sin(90-Qw)·cos(Pj-Qj))
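A minimal Python sketch of the second point longitude and latitude formula, the two-point distance formula, and the nearest-path-point search of steps 2-2-1 to 2-2-4; the earth-radius constant and the function names are illustrative assumptions, and the trigonometric forms follow the reconstructed formulas above.

    import math

    EARTH_RADIUS_M = 6371000.0  # assumed average earth radius in metres

    def second_point(s_lat, s_lng, azimuth_deg, distance_m, R=EARTH_RADIUS_M):
        # Second point longitude/latitude formula: the point a given distance away
        # from S along a given azimuth, evaluated in radians.
        Sw, Sj = math.radians(s_lat), math.radians(s_lng)
        Bs = math.radians(azimuth_deg)
        c = distance_m / R                                    # geocentric angle of the arc Ls
        a = math.acos(math.cos(math.pi / 2 - Sw) * math.cos(c)
                      + math.sin(math.pi / 2 - Sw) * math.sin(c) * math.cos(Bs))
        d_lng = math.asin(math.sin(c) * math.sin(Bs) / math.sin(a))
        return math.degrees(math.pi / 2 - a), math.degrees(Sj + d_lng)   # (Tw, Tj) in degrees

    def distance(p_lat, p_lng, q_lat, q_lng, R=EARTH_RADIUS_M):
        # Distance formula between two points (spherical law of cosines).
        Pw, Pj = math.radians(p_lat), math.radians(p_lng)
        Qw, Qj = math.radians(q_lat), math.radians(q_lng)
        cos_arc = (math.cos(math.pi / 2 - Pw) * math.cos(math.pi / 2 - Qw)
                   + math.sin(math.pi / 2 - Pw) * math.sin(math.pi / 2 - Qw) * math.cos(Pj - Qj))
        return R * math.acos(max(-1.0, min(1.0, cos_arc)))    # clamp against rounding error

    def nearest_path_point(user_lat, user_lng, path_points):
        # path_points: iterable of (latitude, longitude) tuples from the blind guide map
        hi_lat, _ = second_point(user_lat, user_lng, 0, 20)    # upper latitude boundary hiLat
        lo_lat, _ = second_point(user_lat, user_lng, 180, 20)  # lower latitude boundary loLat
        _, hi_lng = second_point(user_lat, user_lng, 90, 20)   # upper longitude boundary hiLng
        _, lo_lng = second_point(user_lat, user_lng, 270, 20)  # lower longitude boundary loLng
        candidates = [(m, n) for (m, n) in path_points
                      if lo_lat < m < hi_lat and lo_lng < n < hi_lng]
        if not candidates:
            return None
        return min(candidates, key=lambda p: distance(user_lat, user_lng, p[0], p[1]))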
6. The method of claim 5, wherein step 3 comprises the steps of:
step 3-1: extracting paths and intersections from the user-defined blind guiding map in sequence according to the path codes and the road intersection codes obtained by the optimal route;
step 3-2: for the starting path in step 2-2, extracting the sequence of path points that constitute it;
further extracting the sequences of path points that constitute each path extracted in step 3-1;
for each intersection extracted in step 3-1, screening the inflection points according to the following rules to form an inflection point sequence:
taking the inflection point nearest to the last path point of the path entering the intersection as the entering inflection point;
taking the inflection point nearest to the first path point of the path leaving the intersection as the leaving inflection point;
in the inflection point storage table of the intersection, sequentially extracting the inflection points from the entering inflection point to the leaving inflection point;
connecting the path point sequences and inflection point sequences in order to form the guide point sequence (see the sketch following this claim);
step 3-3: determining the turning direction at each intersection in the optimal route, and storing the corresponding turning information in the first inflection point of the inflection point sequence corresponding to that intersection in the guide point sequence extracted in step 3-2;
step 3-4: calculating in real time, from the longitude and latitude coordinates of the visually impaired user and the coordinates of the guide point, the direction angle in which the user should advance according to the true heading angle calculation formula; this direction angle is called the target orientation angle, and it is combined with the user's orientation angle to guide the user step by step until the trip is completed;
step 3-5: broadcasting to the user the pre-stored obstacle information, warning information and turning information encountered during travel.
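The following Python sketch, referenced in step 3-2 above, illustrates one way the guide point sequence could be assembled; the data layout (path["points"], junction["inflection_points"], each inflection point carrying a "pos" coordinate pair) and the reuse of the distance() helper from the earlier sketch are assumptions for illustration only.

    def build_guide_points(start_path_points, route_paths, route_junctions):
        # start_path_points: path points of the starting path from step 2-2
        # route_paths[i]:     path leaving route_junctions[i], a dict with a "points" list
        # route_junctions[i]: intersection dict with an ordered "inflection_points" table
        guide_points = list(start_path_points)
        enter_points = start_path_points
        for junction, leave_path in zip(route_junctions, route_paths):
            inflections = junction["inflection_points"]
            # entering inflection point: nearest to the last path point of the entering path
            enter_idx = min(range(len(inflections)),
                            key=lambda i: distance(*enter_points[-1], *inflections[i]["pos"]))
            # leaving inflection point: nearest to the first path point of the leaving path
            leave_idx = min(range(len(inflections)),
                            key=lambda i: distance(*leave_path["points"][0], *inflections[i]["pos"]))
            # extract inflection points sequentially from the entering one to the leaving one
            if leave_idx >= enter_idx:
                selected = inflections[enter_idx:leave_idx + 1]
            else:
                selected = inflections[leave_idx:enter_idx + 1][::-1]
            guide_points.extend(p["pos"] for p in selected)
            guide_points.extend(leave_path["points"])
            enter_points = leave_path["points"]
        return guide_points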
7. The method of claim 6, wherein in step 3-3, determining the turning direction of each intersection in the optimal route comprises:
step 3-3-1, for the last two path points of the entering path, calculating the entering direction angle according to the true heading angle calculation formula, denoted enterAngle; for the first two path points of the leaving path, calculating the leaving direction angle according to the true heading angle calculation formula, denoted leaveAngle;
step 3-3-2, judging the turning direction according to the relative positions of the entering direction angle and the leaving direction angle: if the deviation of the leaving direction angle from the entering direction angle is not greater than a threshold delta, it is judged to be straight; if the leaving direction angle deviates to the left of the entering direction angle by more than the threshold delta, it is judged to be a left turn; if the leaving direction angle deviates to the right of the entering direction angle by more than the threshold delta, it is judged to be a right turn;
step 3-3-3, calculating the reverse angle of the entering direction angle, reverse=(enterAngle+180)%360, where % is the modulus operator; three cases are distinguished with respect to the interval [enterAngle-delta, enterAngle+delta], as follows:
Case one: enterAngle-delta < 0;
if leaveAngle ≥ enterAngle-delta+360 or leaveAngle ≤ enterAngle+delta, it is judged to be straight; if leaveAngle ∈ (enterAngle+delta, reverse), it is judged to be a right turn; the remaining cases are left turns;
Case two: enterAngle+delta ≥ 360;
if leaveAngle ≥ enterAngle-delta or leaveAngle ≤ enterAngle+delta-360, it is judged to be straight; if leaveAngle ∈ (reverse, enterAngle-delta), it is judged to be a left turn; the remaining cases are right turns;
Case three: enterAngle-delta ≥ 0 and enterAngle+delta < 360;
if leaveAngle ∈ [enterAngle-delta, enterAngle+delta], it is judged to be straight; otherwise, two sub-cases are distinguished according to whether enterAngle is less than 180:
1) when enterAngle < 180: if leaveAngle ∈ (enterAngle+delta, reverse), it is judged to be a left turn; otherwise, a right turn;
2) when enterAngle ≥ 180: if leaveAngle ∈ (reverse, enterAngle-delta), it is judged to be a left turn; otherwise, a right turn.
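As a hedged illustration of step 3-3, the sketch below determines the turn with a single signed angular difference (assuming heading angles in degrees, measured clockwise, in [0, 360)); this is a compact equivalent of the straight/left/right decision, not a literal transcription of the three interval cases in the claim, and the default delta value is an assumption.

    def turn_direction(enter_angle, leave_angle, delta=15.0):
        # signed difference mapped to [-180, 180); positive means leaveAngle lies clockwise of enterAngle
        diff = (leave_angle - enter_angle + 180.0) % 360.0 - 180.0
        if abs(diff) <= delta:
            return "straight"
        return "right" if diff > 0 else "left"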
8. The method of claim 7, wherein in step 3-4, the true heading angle is calculated by the following equation:
let there be two points A and B;
Aw is the latitude of point A and Aj is the longitude of point A; Bw is the latitude of point B and Bj is the longitude of point B;
c=arccos(cos(90-Bw)·cos(90-Aw)+sin(90-Bw)·sin(90-Aw)·cos(Bj-Aj))
θ=arccos((cos(90-Bw)-cos(90-Aw)·cos(c))/(sin(90-Aw)·sin(c)))
bearing=θ if Bj ≥ Aj; bearing=360-θ if Bj < Aj
c is the angle at the geocenter corresponding to the minor arc between the two points A and B on the earth's surface;
bearing is the initial true heading angle from A to B.
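A small Python sketch of the true heading angle: it returns the initial great-circle bearing from A to B in degrees clockwise from north. It uses the standard atan2 form rather than the arccos form reconstructed above, which is an equivalent way of obtaining the same bearing; the function name is an assumption.

    import math

    def true_heading(a_lat, a_lng, b_lat, b_lng):
        # initial great-circle bearing from point A to point B, in [0, 360) degrees clockwise from north
        Aw, Aj = math.radians(a_lat), math.radians(a_lng)
        Bw, Bj = math.radians(b_lat), math.radians(b_lng)
        d_lng = Bj - Aj
        x = math.sin(d_lng) * math.cos(Bw)
        y = math.cos(Aw) * math.sin(Bw) - math.sin(Aw) * math.cos(Bw) * math.cos(d_lng)
        return (math.degrees(math.atan2(x, y)) + 360.0) % 360.0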
9. Method according to claim 8, characterized in that in step 3-4, the following two tasks are performed simultaneously:
Task one: correcting the user's orientation in real time and guiding the user to advance towards the target guide point:
in each step the visually impaired user advances towards the guide point in the guide point list that is currently closest to the user, called the target guide point; the point after the target guide point is called the next target guide point; every 0.5 seconds the intelligent terminal calculates the relative position relationship among the user's position, the target guide point and the next target guide point, and performs direction induction in combination with the user's current orientation;
the direction angle from the user's position point to the target guide point is calculated according to the true heading angle calculation formula; this angle is called the target orientation angle and is denoted tB; the user's orientation at that moment is represented by the orientation angle of the intelligent terminal, i.e. the user's orientation angle, denoted dB, and its reverse angle is denoted reverse=(dB+180)%360; taking the target orientation angle as the reference, the direction and the degree by which the user's orientation angle should be corrected are calculated, i.e. the direction correction angle that the intelligent terminal must convey to the visually impaired user at that moment;
the direction correction angle consists of two parts: first, the deflection to the left or to the right, denoted turn; second, the degree of deflection, denoted degree; the calculation of the direction correction angle is divided into the following two cases (a sketch follows after the two cases):
Case one: dB ≥ 180;
if tB ∈ (reverse, dB], turn is left and degree = dB - tB;
otherwise (tB ∉ (reverse, dB]), turn is right and degree = tB - dB; if degree < 0, then degree = degree + 360;
Case two: dB < 180;
if tB ∈ [dB, reverse), turn is right and degree = tB - dB;
otherwise (tB ∉ [dB, reverse)), turn is left and degree = dB - tB; if degree < 0, then degree = degree + 360;
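As referenced above, the following Python sketch follows the two cases of the direction correction angle literally; angles are degrees in [0, 360), and the function name is an assumption.

    def correction_angle(dB, tB):
        # returns (turn, degree): which way and by how many degrees the user should turn
        reverse = (dB + 180.0) % 360.0
        if dB >= 180.0:
            if reverse < tB <= dB:            # case one: tB in (reverse, dB]
                return "left", dB - tB
            degree = tB - dB                  # otherwise turn right
            if degree < 0:
                degree += 360.0
            return "right", degree
        else:
            if dB <= tB < reverse:            # case two: tB in [dB, reverse)
                return "right", tB - dB
            degree = dB - tB                  # otherwise turn left
            if degree < 0:
                degree += 360.0
            return "left", degree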
Task two: iterating the target guide point:
under the direction guidance the visually impaired user gradually approaches the target guide point and must switch to the next target guide point at the proper time; the proper time is based on the following two conditions, and the target guide point is switched to the next target guide point when either condition is met:
Condition one: the distance between the user's current position and the target guide point is less than a threshold;
Condition two: the deviation of the target orientation angle tB from the path local direction angle pB is greater than 90;
the path local direction angle refers to the direction angle from the target guide point to the next target guide point, calculated according to the true heading angle calculation formula when both the target guide point and the next target guide point are path points; it is denoted pB;
whether the deviation of the target orientation angle tB from the path local direction angle pB is greater than 90 is determined as follows:
if pB ∈ (90, 270) and |tB - pB| ≥ 90, it is determined that the deviation of tB from pB is greater than 90;
otherwise, if pB ∉ (90, 270) and pB < 270, and tB ∈ [pB + 90, pB + 270], it is determined that the deviation of tB from pB is greater than 90;
otherwise, if pB ∉ (90, 270) and pB ≥ 270, and tB ∈ [(pB + 90) % 360, pB - 90], it is determined that the deviation of tB from pB is greater than 90.
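Finally, a Python sketch of the guide-point switching test of task two, reading condition one as the distance between the user's current position and the target guide point; the deviation test uses the circular angular separation between tB and pB, which is equivalent to the three pB intervals spelled out above, and the distance threshold value is an assumed example, not taken from the claim.

    def should_switch(user_to_target_m, tB, pB, threshold_m=3.0):
        # condition one: the user is within threshold_m of the current target guide point
        # condition two: tB deviates from the path local direction angle pB by 90 or more
        sep = abs(tB - pB) % 360.0
        sep = min(sep, 360.0 - sep)           # circular angular separation in [0, 180]
        return user_to_target_m < threshold_m or sep >= 90.0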
CN202110312365.8A 2021-03-24 2021-03-24 Outdoor travel direction induction method for visually impaired people based on intelligent terminal Active CN113069330B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110312365.8A CN113069330B (en) 2021-03-24 2021-03-24 Outdoor travel direction induction method for visually impaired people based on intelligent terminal

Publications (2)

Publication Number Publication Date
CN113069330A true CN113069330A (en) 2021-07-06
CN113069330B CN113069330B (en) 2022-04-22

Family

ID=76613632

Country Status (1)

Country Link
CN (1) CN113069330B (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant