CN111494175B - Navigation method based on head-mounted equipment, head-mounted equipment and storage medium - Google Patents

Navigation method based on head-mounted equipment, head-mounted equipment and storage medium Download PDF

Info

Publication number
CN111494175B
Authority
CN
China
Prior art keywords
head
blind road
lamp strip
navigation
virtual blind
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010392129.7A
Other languages
Chinese (zh)
Other versions
CN111494175A (en)
Inventor
喻纯
史元春
杨辞源
许书畅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Interactive Future Beijing Technology Co ltd
Tsinghua University
Original Assignee
Interactive Future Beijing Technology Co ltd
Tsinghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Interactive Future Beijing Technology Co ltd and Tsinghua University
Priority to CN202010392129.7A
Publication of CN111494175A
Application granted
Publication of CN111494175B
Legal status: Active
Anticipated expiration

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H: PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 3/00: Appliances for aiding patients or disabled persons to walk about
    • A61H 3/06: Walking aids for blind persons
    • A61H 3/061: Walking aids for blind persons with electronic detecting or guiding means
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/26: Navigational instruments specially adapted for navigation in a road network
    • G01C 21/28: Navigation in a road network with correlation of data from several navigational instruments
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H: PHYSICAL THERAPY APPARATUS
    • A61H 2201/00: Characteristics of apparatus not provided for in the preceding codes
    • A61H 2201/16: Physical interface with patient
    • A61H 2201/1602: Kind of interface, e.g. head rest, knee support or lumbar support
    • A61H 2201/1604: Head
    • A61H 2201/1607: Holding means therefor

Landscapes

  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Rehabilitation Therapy (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Pain & Pain Management (AREA)
  • Epidemiology (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

The application discloses a navigation method based on a head-mounted device. The head-mounted device comprises a lamp strip with a preset number of light-emitting devices arranged along its length. The navigation method comprises the following steps: determining environment information and the device position of the head-mounted device; generating a virtual blind road according to the navigation destination, the environment information and the device position; determining the target walking direction corresponding to the current device position on the virtual blind road; and lighting the target light-emitting device corresponding to the target walking direction according to the correspondence between the light-emitting devices in the lamp strip and walking directions. The application can thereby navigate visually impaired people accurately while avoiding obstacles. The application also discloses a head-mounted device and a storage medium, which have the same beneficial effects.

Description

Navigation method based on head-mounted equipment, head-mounted equipment and storage medium
Technical Field
The application relates to the technical field of intelligent wearable equipment, in particular to a navigation method based on head-mounted equipment, the head-mounted equipment and a storage medium.
Background
Safe, independent travel is the main problem faced by visually impaired people, most of whom currently rely on a cane as their travel aid. However, a cane can only give early warning of near-ground obstacles in the area directly in front of the body; it cannot avoid obstacles or provide navigation in a complex road environment. Visually impaired people therefore still face complicated road conditions and obstacles, a lack of understanding of the surrounding environment, and an easy loss of the sense of direction when travelling.
Therefore, how to navigate accurately for the visually impaired while avoiding obstacles is a technical problem that those skilled in the art need to solve.
Disclosure of Invention
The application aims to provide a navigation method based on a head-mounted device, the head-mounted device and a storage medium, which can accurately navigate for visually impaired people on the premise of avoiding obstacles.
In order to solve the above technical problem, the present application provides a navigation method based on a head-mounted device, where the head-mounted device includes a lamp strip with a preset number of light-emitting devices arranged along its length. The navigation method includes:
determining environment information and the device position of the head-mounted device;
generating a virtual blind road according to the navigation destination, the environment information and the device position;
determining the target walking direction corresponding to the current device position on the virtual blind road;
and lighting the target light-emitting device corresponding to the target walking direction according to the correspondence between the light-emitting devices in the lamp strip and walking directions.
Optionally, the method further includes:
determining the rotation angle of the head-mounted device according to the motion data of the wearing part;
and updating the correspondence between the light-emitting devices in the lamp strip and walking directions according to the rotation angle.
Optionally, the method further includes:
acquiring motion data of the wearing part with a motion detection device, where the motion detection device comprises any one or combination of a camera, a gyroscope, an accelerometer, a magnetometer and an inertial measurement module.
Optionally, generating a virtual blind road according to the navigation destination, the environment information and the device position includes:
determining the position of an obstacle according to the environment information;
generating the virtual blind road according to the obstacle position, the navigation destination and the device position;
where the starting point of the virtual blind road is the device position and its end point is the navigation destination, the virtual blind road does not intersect any obstacle, the width of the virtual blind road is greater than a preset value, and the center line of the virtual blind road starts at the device position.
Optionally, determining the target walking direction corresponding to the current device position on the virtual blind road includes:
judging whether the distance between the current device position and the center line of the virtual blind road is less than or equal to a preset distance, where the preset distance is one half of the blind road width;
if so, setting the point on the center line of the virtual blind road closest to the current device position as a reference point, and taking the tangent direction of the center line at the reference point as the target walking direction;
if not, updating the virtual blind road according to the current device position, the navigation destination and the environment information.
Optionally, before determining the environment information and device position of the head-mounted device, the method further includes:
collecting the environment information of the head-mounted device with a distance measuring device;
the distance measuring device comprises any one or combination of a camera, a radar, an infrared ranging sensor and an ultrasonic sensor. When the distance measuring device includes a camera, the environment information includes a depth map; when it includes a radar, the environment information includes radar ranging information; when it includes an infrared ranging sensor, the environment information includes infrared ranging information; and when it includes an ultrasonic sensor, the environment information includes ultrasonic ranging information.
Optionally, generating a virtual blind road according to the navigation destination, the environment image and the device position includes:
calculating the shortest walking path between the navigation destination and the device position;
judging whether the length of the shortest walking path is greater than a preset walking distance, where the preset walking distance is determined according to the collection range of the device that collects the environment information;
if so, selecting a navigation intermediate point on the shortest walking path and generating the virtual blind road according to the navigation intermediate point, the environment image and the device position, where the walking distance between the navigation intermediate point and the device position is less than or equal to the preset walking distance.
The application also provides a head-mounted device comprising a processor, a lamp strip controller and a lamp strip, where a preset number of light-emitting devices are arranged along the length of the lamp strip;
the processor is used for determining the environment information and device position of the head-mounted device; for generating a virtual blind road according to the navigation destination, the environment information and the device position; for determining the target walking direction corresponding to the current device position on the virtual blind road; and for determining the target light-emitting device corresponding to the target walking direction according to the correspondence between the light-emitting devices in the lamp strip and walking directions, and sending a control instruction corresponding to the target light-emitting device to the lamp strip controller;
and the lamp strip controller is used for lighting the target light-emitting device of the lamp strip according to the control instruction.
Optionally, the head-mounted device further includes:
a motion detection device for determining the rotation angle of the head-mounted device according to the motion data of the wearing part, so that the processor can update the correspondence between the light-emitting devices in the lamp strip and walking directions according to the rotation angle.
The present application further provides a storage medium having a computer program stored thereon which, when executed, implements the steps of the above navigation method based on a head-mounted device.
The application provides a navigation method based on a head-mounted device, where the head-mounted device comprises a lamp strip with a preset number of light-emitting devices arranged along its length. The navigation method comprises: determining environment information and the device position of the head-mounted device; generating a virtual blind road according to the navigation destination, the environment information and the device position; determining the target walking direction corresponding to the current device position on the virtual blind road; and lighting the target light-emitting device corresponding to the target walking direction according to the correspondence between the light-emitting devices in the lamp strip and walking directions.
According to the method and the device, the virtual blind road is generated from the environment information of the head-mounted device, the device position and the navigation destination, so that the user can be guided to walk along the virtual blind road to reach the navigation destination. The application is implemented on a head-mounted device provided with a lamp strip carrying a preset number of light-emitting devices along its length, and each light-emitting device on the lamp strip corresponds to one walking direction. When the wearer of the head-mounted device is navigated along the virtual blind road, the target light-emitting device corresponding to the target walking direction is lit according to the correspondence between the light-emitting devices in the lamp strip and walking directions. The wearer can determine the direction in which to walk from the position of the light in the lamp strip, so the application can navigate visually impaired people accurately while avoiding obstacles. The application also provides a head-mounted device and a storage medium with the above beneficial effects, which are not repeated here.
Drawings
In order to illustrate the embodiments of the present application more clearly, the drawings needed for the embodiments are briefly described below. The drawings described below show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a flowchart of a method for head-mounted device based navigation according to an embodiment of the present disclosure;
fig. 2 is a flowchart of a method for setting a target walking direction according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Referring to fig. 1, fig. 1 is a flowchart of a head-mounted device based navigation method according to an embodiment of the present disclosure.
The specific steps may include:
s101: determining environmental information and device location of the head-mounted device;
the embodiment is applied to head-mounted equipment such as intelligent hats, intelligent glasses, head-mounted displays and intelligent helmets, the head-mounted equipment can comprise lamp belts, and a preset number of light-emitting devices are arranged in the length direction of the lamp belts. When the user wears the head-mounted device, the length direction of the lamp strip is parallel to the pupil distance direction of the user. As a possible embodiment, a plurality of light emitting devices are arranged at equal intervals in the length direction of the light strip, and the light emitting devices may include, for example, incandescent lamps, LED lamps, flexible neon lamps or laser lamps. As another possible implementation, the present embodiment may also use an OLED screen or an LCD screen as the light emitting device on the light strip.
This step determines information about the current position of the head-mounted device. The environment information includes road-surface information and information about the space ahead of the walking direction, and describes the physical conditions of the space surrounding the device; the device position is position information describing where the head-mounted device currently is. From the environment information, this embodiment can determine the height variation of the road surface within a preset area and the obstacles in that area. Obstacles may include inanimate objects as well as pedestrians around the user. As a possible implementation, the head-mounted device may contain a means for detecting environment information, such as a camera (possibly a depth camera), a radar, an infrared ranging sensor or an ultrasonic sensor, as well as a positioning module for determining the device position. Specifically, this embodiment may collect surrounding environment information with a miniature Intel RealSense D435i RGBD (depth) camera worn by the user.
S102: generating a virtual blind road according to the navigation destination, the environment information and the equipment position;
before this step, there may be a navigation instruction received, and a speech recognition device may be present in a specific headset, and a navigation destination may be determined by recognizing speech of a user. The further head-mounted equipment can also receive navigation instructions transmitted by other devices (such as a mobile phone) through Bluetooth, and the head-mounted equipment can determine a navigation destination by analyzing the navigation instructions.
This step generates the virtual blind road used to indicate the user's walking route according to the navigation destination, the environment information and the device position. When generating the virtual blind road, the navigation destination serves as the walking end point and the device position as the walking start point, and impassable positions can be determined from the environment information. A virtual blind road generated from these three inputs can therefore guide the user to the navigation destination while preventing the user from hitting obstacles on the way. Specifically, this embodiment can plan the walkable region ahead, i.e. the virtual blind road, through computer vision: the user walks along the virtual blind road, avoids the obstacles and reaches the destination.
As a possible implementation, the present embodiment may generate the virtual blind road by:
step 1: calculating a shortest walking path between the navigation destination and the device location;
Step 2: judging whether the length of the shortest walking path is greater than a preset walking distance; if yes, go to step 3; if not, end the flow;
the preset walking distance is determined according to the collection range of the device that collects the environment information, and is positively correlated with that range.
Step 3: selecting a navigation intermediate point on the shortest walking path, and generating the virtual blind road according to the navigation intermediate point, the environment image and the device position, where the walking distance between the navigation intermediate point and the device position is less than or equal to the preset walking distance.
Because the collection range of the device that collects the environment information is limited, a navigation intermediate point can be selected and the virtual blind road generated from it whenever the walking distance between the navigation destination and the device position exceeds the preset walking distance.
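The intermediate-point selection in steps 1 to 3 can be sketched as follows. This is a minimal illustration in Python, not the patent's implementation: the function name, the polyline representation of the planned path, and the linear interpolation at the range limit are assumptions made for illustration.

```python
import math

def select_waypoint(path, max_range):
    """Walk along the planned shortest path (a polyline starting at the
    device position) and return the farthest point whose cumulative
    walking distance stays within the sensor collection range, i.e. the
    'preset walking distance'. If the whole path fits, return its end."""
    travelled = 0.0
    waypoint = path[0]
    for a, b in zip(path, path[1:]):
        seg = math.dist(a, b)
        if travelled + seg > max_range:
            # Interpolate to land exactly at the range limit.
            t = (max_range - travelled) / seg
            return (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))
        travelled += seg
        waypoint = b
    return waypoint
```

With a 7 m path and a 5 m sensing range, the returned waypoint lies 5 m along the path, so the virtual blind road is planned only within what the sensors can actually observe.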
S103: determining a target walking direction corresponding to the current equipment position in the virtual blind road;
the user wearing the head-mounted equipment continuously walks in the virtual blind sidewalk, so that the current equipment position of the head-mounted equipment is continuously changed, and each current equipment position has a corresponding target walking direction in the virtual blind sidewalk. As a possible implementation manner, a point on the center line of the virtual blind road, which is closest to the current device position, may be set as a reference point, and a tangential direction of the reference point on the center line of the virtual blind road may be set as the target walking direction.
S104: and lightening the target light-emitting device corresponding to the target walking direction according to the corresponding relation between the light-emitting device in the lamp strip and the walking direction.
The correspondence between the light-emitting devices in the lamp strip and walking directions is preset in this embodiment, and once the target walking direction is determined, the corresponding target light-emitting device can be lit based on that correspondence. The light-emitting devices on the lamp strip other than the target one may remain off. When visually impaired people wear the head-mounted device, they can determine the current walking direction from the change of light, which realizes their navigation. Here "visually impaired people" refers to people with low vision, a specific medical concept distinct from total blindness: low-vision users retain weak eyesight and can, to some degree, perceive changes of light or the blurred shape of objects, and most visually impaired people belong to this low-vision group.
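The preset correspondence between walking direction and light-emitting device can be illustrated with a simple angular mapping. This is a hypothetical sketch: the number of LEDs, the 120-degree field spanned by the strip, and the function name are assumptions, not values from the patent.

```python
def led_for_direction(angle_deg, num_leds=16, fov_deg=120.0):
    """Map a target walking direction (angle relative to the head's
    forward axis in degrees, negative = left) onto the index of the
    LED to light on a strip spanning fov_deg in front of the eyes."""
    half = fov_deg / 2.0
    clamped = max(-half, min(half, angle_deg))
    frac = (clamped + half) / fov_deg   # 0.0 = far left, 1.0 = far right
    return min(num_leds - 1, int(frac * num_leds))
```

Straight ahead lights the middle of the strip, and directions beyond the strip's span clamp to its end LEDs, matching the idea that "whichever position's lamp lights, the user walks in that direction".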
The wearer of the head-mounted device mentioned in this embodiment may be a visually impaired person; that is, the application plans a route (the virtual blind road) for a visually impaired user, and this embodiment guides a low-vision user to walk along the virtual blind road by exploiting the user's remaining light perception. Direction information is conveyed to the user by lighting a lamp placed in front of the user's eyes. The virtual blind road information planned by the computer in the previous step can be sent over Bluetooth to an Arduino in the head-mounted device, and the Arduino controls the lamp strip arranged in front of the user's eyes: the lit region of the strip corresponds to the correct walking direction, and the user walks toward whichever position lights up.
As a feasible implementation, an Arduino, a Bluetooth module, an IMU and an LED lamp strip can be integrated on the head-mounted device: the Arduino switches the LED lamps on and off, Bluetooth transmits the virtual blind road information planned by the computer, and the IMU detects the user's motion information. Wearing the device, the user perceives the position of the light on the brim in front of the eyes and judges the proper walking direction, thereby assisting low-vision visually impaired people to travel independently. An IMU (Inertial Measurement Unit) is an inertial sensing module that reports its acceleration and angular velocity to the system at high frequency (on the order of 100 Hz). This embodiment may use the IMU to obtain the user's body orientation information.
This embodiment first generates the virtual blind road according to the environment information of the head-mounted device, the device position and the navigation destination, so as to guide the user to walk along the virtual blind road to the navigation destination. The embodiment is implemented on a head-mounted device provided with a lamp strip carrying a preset number of light-emitting devices along its length, each corresponding to one walking direction. When the wearer is navigated along the virtual blind road, the target light-emitting device corresponding to the target walking direction is lit according to the correspondence between the light-emitting devices in the lamp strip and walking directions. The wearer can determine the direction to walk from the position of the light in the strip, so this embodiment navigates visually impaired people accurately while avoiding obstacles.
As a feasible implementation, building on the embodiment of fig. 1, while the target light-emitting device corresponding to the target walking direction is lit, the motion detection device in the head-mounted device may also acquire motion data of the wearing part, determine the rotation angle of the head-mounted device from that data, and update the correspondence between the light-emitting devices in the lamp strip and walking directions according to the rotation angle. In this way the lighting state changes in real time as the user walks, so that the lit region of the lamp strip always maps to the direction the user should move, and the world-frame direction indicated by the lit target light-emitting device does not change however the user turns or shakes the head.
Specifically, the motion detection device may include any one or combination of a camera, a gyroscope, an accelerometer, a magnetometer and an inertial measurement module. The motion detection device can calculate the user's orientation from the detected motion data, and the position of the lit region can be corrected accordingly.
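The head-rotation correction described above reduces to subtracting the IMU-reported yaw from the world-frame target direction before mapping it onto the strip. The following Python sketch illustrates that angle arithmetic; the function name and the degree-based convention are illustrative assumptions.

```python
def compensate_rotation(target_world_deg, head_yaw_deg):
    """Keep the lit LED pointing at the same world direction while the
    head turns: convert the world-frame target direction into the head
    (lamp strip) frame by subtracting the IMU yaw, wrapped to
    (-180, 180] so left/right stays symmetric."""
    rel = (target_world_deg - head_yaw_deg) % 360.0
    if rel > 180.0:
        rel -= 360.0
    return rel
```

If the target direction is due north (0 degrees) and the user turns their head 90 degrees to the right, the strip-frame angle becomes -90 degrees, so a lamp at the far left lights up and steers the user back toward the target.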
Referring to fig. 2, fig. 2 is a flowchart of a method for setting a target walking direction according to an embodiment of the present application. This embodiment further describes S102 and S103 of the embodiment corresponding to fig. 1, and a further implementation can be obtained by combining the two. The embodiment may include the following steps:
S201: determining the position of an obstacle according to the environment information;
In this embodiment, a distance measuring device can collect the environment information of the head-mounted device. When the distance measuring device includes a camera, the environment information includes a depth map; when it includes a radar, radar ranging information; when it includes an infrared ranging sensor, infrared ranging information; and when it includes an ultrasonic sensor, ultrasonic ranging information. The obstacle position can then be determined from the depth image, radar ranging information, infrared ranging information or ultrasonic ranging information.
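For the depth-map case, obstacle candidates can be found by thresholding the metric distances. This is a deliberately simplified Python sketch (a real pipeline would also use the camera intrinsics to project pixels into 3D); the function name and the 1.5 m threshold are illustrative assumptions.

```python
def obstacle_cells(depth_map, threshold_m=1.5):
    """Scan a depth image (2D list of metric distances, as produced by
    an RGBD camera) and return the (row, col) cells closer than the
    threshold, i.e. candidate obstacle positions."""
    hits = []
    for r, row in enumerate(depth_map):
        for c, d in enumerate(row):
            if 0 < d < threshold_m:   # depth 0 commonly marks invalid pixels
                hits.append((r, c))
    return hits
```

The returned cells mark regions the virtual blind road must route around.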
S202: generating the virtual blind road according to the position of the obstacle, the navigation destination and the position of the equipment;
the starting point of the virtual blind road generated in the step is the equipment position, the end point of the virtual blind road is the navigation destination, the virtual blind road does not interfere with an obstacle, the blind road width of the virtual blind road is larger than a preset value, and the starting point of the center line of the virtual blind road is the equipment position.
S203: judging whether the distance between the current device position and the center line of the virtual blind road is less than or equal to a preset distance; if yes, go to S204; if not, go to S205;
where the preset distance is one half of the blind road width;
S204: setting the point on the center line of the virtual blind road closest to the current device position as a reference point, and taking the tangent direction of the center line at the reference point as the target walking direction;
s205: and updating the virtual blind road according to the current equipment position, the navigation destination and the environment information.
When the user wearing the head-mounted device walks off the virtual blind road, the virtual blind road can be updated as in S205 so that navigation continues.
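The S203/S205 branch above amounts to comparing the deviation from the center line against half the blind road width. The sketch below illustrates that decision in Python, approximating the distance to the center line by the distance to its nearest sampled vertex (adequate for a densely sampled polyline); names and representation are illustrative assumptions.

```python
import math

def check_deviation(centerline, pos, road_width):
    """Return 'follow' when the device is within half the blind-road
    width of the center line (S203 -> S204), otherwise 'replan'
    (S203 -> S205, i.e. regenerate the virtual blind road)."""
    preset = road_width / 2.0   # the preset distance of S203
    d = min(math.hypot(pos[0] - x, pos[1] - y) for x, y in centerline)
    return "follow" if d <= preset else "replan"
```

As long as the user stays inside the blind road, the tangent of the center line keeps being issued as the target direction; once the user drifts outside, the road is replanned from the new position.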
Further, an embodiment of the application also provides a head-mounted device. The head-mounted device can comprise a processor, a lamp strip controller and a lamp strip, where a preset number of light-emitting devices are arranged along the length of the lamp strip;
the processor is used for determining the environment information and device position of the head-mounted device; for generating a virtual blind road according to the navigation destination, the environment information and the device position; for determining the target walking direction corresponding to the current device position on the virtual blind road; and for determining the target light-emitting device corresponding to the target walking direction according to the correspondence between the light-emitting devices in the lamp strip and walking directions, and sending a control instruction corresponding to the target light-emitting device to the lamp strip controller;
and the lamp strip controller is used for lighting the target light-emitting device of the lamp strip according to the control instruction.
In this embodiment, a virtual blind road is first generated according to the environment information of the head-mounted device, the device position and the navigation destination, so as to guide the user to walk along the virtual blind road to the navigation destination. The embodiment is implemented on a head-mounted device provided with a lamp strip in which a preset number of light-emitting devices are arranged along its length, each light-emitting device corresponding to a walking direction. When the wearer is navigated along the virtual blind road, the target light-emitting device corresponding to the target walking direction is lit according to the correspondence between the light-emitting devices in the lamp strip and walking directions. The wearer can determine the current walking direction from the position of the lit device on the lamp strip, so this embodiment provides accurate navigation for visually impaired persons while avoiding obstacles.
Further, the head-mounted device may also comprise:
a motion detection device, configured to acquire motion data of the wearing part and determine the rotation angle of the head-mounted device from the motion data, so that the processor can update the correspondence between the light-emitting devices in the lamp strip and walking directions according to the rotation angle; wherein the motion detection device comprises any one or a combination of a camera, a gyroscope, an accelerometer, a magnetometer and an inertial measurement module.
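The correspondence between light-emitting devices and walking directions, including its correction by the head's rotation angle, can be sketched as follows. The evenly spaced strip geometry, the 180-degree angular span, the angle convention and the function name are all illustrative assumptions; the patent does not fix these details.

```python
def led_index_for_direction(target_dir_deg, head_yaw_deg, num_leds,
                            strip_span_deg=180.0):
    """Map a target walking direction to the index of the LED to light.

    target_dir_deg -- target walking direction in world coordinates (degrees)
    head_yaw_deg   -- current heading (rotation angle) of the head-mounted
                      device; feeding in the updated yaw realizes the
                      rotation-angle correction of the LED/direction
                      correspondence
    num_leds       -- number of light-emitting devices on the lamp strip
    """
    # Direction relative to where the wearer faces, folded into (-180, 180].
    relative = (target_dir_deg - head_yaw_deg + 180.0) % 360.0 - 180.0
    # Clamp to the angular span covered by the lamp strip.
    half = strip_span_deg / 2
    relative = max(-half, min(half, relative))
    # Spread LED indices evenly: -half -> 0, 0 -> middle, +half -> num_leds-1.
    frac = (relative + half) / strip_span_deg
    return round(frac * (num_leds - 1))
```

With a 9-LED strip, walking straight ahead lights the middle LED; after the wearer turns the head 90 degrees, the same world-frame direction maps to an LED at the edge of the strip, which is exactly the effect of updating the correspondence by the rotation angle.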
Further, the head-mounted device may also comprise:
a distance measuring device, configured to collect the environment information of the head-mounted device;
wherein the distance measuring device comprises any one or a combination of a camera, a radar, an infrared ranging sensor and an ultrasonic sensor; when the distance measuring device comprises a camera, the environment information comprises a depth map; when the distance measuring device comprises a radar, the environment information comprises radar ranging information; when the distance measuring device comprises an infrared ranging sensor, the environment information comprises infrared ranging information; and when the distance measuring device comprises an ultrasonic sensor, the environment information comprises ultrasonic ranging information.
Since the device embodiments correspond to the method embodiments, reference may be made to the description of the method embodiments for details of the device embodiments, which are not repeated here.
The present application also provides a storage medium having a computer program stored thereon, which, when executed, implements the steps provided by the above embodiments. The storage medium may include various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the others, and for the same or similar parts the embodiments may be referred to one another. Since the device disclosed in the embodiments corresponds to the method disclosed in the embodiments, its description is brief, and reference may be made to the method description for the relevant details. It should be noted that those skilled in the art can make several improvements and modifications to the present application without departing from its principle, and such improvements and modifications also fall within the scope of the claims of the present application.
It is further noted that, in the present specification, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.

Claims (8)

1. A navigation method based on a head-mounted device, wherein the head-mounted device comprises a lamp strip, a preset number of light-emitting devices are arranged along the length direction of the lamp strip, and the navigation method comprises:
determining environmental information and device location of the head-mounted device;
generating a virtual blind road according to a navigation destination, the environment information and the device position; wherein the virtual blind road does not interfere with obstacles, and the blind road width of the virtual blind road is greater than a preset value;
determining a target walking direction corresponding to the current device position in the virtual blind road;
lighting a target light-emitting device corresponding to the target walking direction according to the correspondence between the light-emitting devices in the lamp strip and walking directions;
wherein the navigation method further comprises:
determining the rotation angle of the head-mounted device according to motion data of the wearing part; and
updating the correspondence between the light-emitting devices in the lamp strip and walking directions according to the rotation angle.
2. The navigation method of claim 1, further comprising:
acquiring the motion data of the wearing part by using a motion detection device; wherein the motion detection device comprises any one or a combination of a camera, a gyroscope, an accelerometer, a magnetometer and an inertial measurement module.
3. The navigation method according to claim 1, wherein the generating a virtual blind road according to the navigation destination, the environment information and the device position comprises:
determining the position of an obstacle according to the environment information; and
generating the virtual blind road according to the position of the obstacle, the navigation destination and the device position;
wherein the starting point of the virtual blind road is the device position, the end point of the virtual blind road is the navigation destination, the virtual blind road does not interfere with obstacles, the blind road width of the virtual blind road is greater than a preset value, and the starting point of the center line of the virtual blind road is the device position.
4. The navigation method according to claim 3, wherein the determining a target walking direction corresponding to the current device position in the virtual blind road comprises:
judging whether the distance between the current device position and the center line of the virtual blind road is less than or equal to a preset distance; wherein the preset distance is one half of the blind road width;
if so, setting the point on the center line of the virtual blind road closest to the current device position as a reference point, and taking the tangent direction of the center line at the reference point as the target walking direction; and
if not, updating the virtual blind road according to the current device position, the navigation destination and the environment information.
5. The navigation method according to claim 1, further comprising, before determining the environment information and the device position of the head-mounted device:
collecting the environment information of the head-mounted device by using a distance measuring device;
wherein the distance measuring device comprises any one or a combination of a camera, a radar, an infrared ranging sensor and an ultrasonic sensor; when the distance measuring device comprises a camera, the environment information comprises a depth map; when the distance measuring device comprises a radar, the environment information comprises radar ranging information; when the distance measuring device comprises an infrared ranging sensor, the environment information comprises infrared ranging information; and when the distance measuring device comprises an ultrasonic sensor, the environment information comprises ultrasonic ranging information.
6. The navigation method according to any one of claims 1 to 5, wherein the generating a virtual blind road according to the navigation destination, the environment information and the device position comprises:
calculating the shortest walking path between the navigation destination and the device position;
judging whether the distance of the shortest walking path is greater than a preset walking distance; wherein the preset walking distance is determined according to the acquisition range of the device that collects the environment information; and
if so, selecting a navigation intermediate point on the shortest walking path, and generating the virtual blind road according to the navigation intermediate point, the environment information and the device position; wherein the walking distance between the navigation intermediate point and the device position is less than or equal to the preset walking distance.
7. A head-mounted device, comprising a processor, a lamp strip controller and a lamp strip, wherein a preset number of light-emitting devices are arranged along the length direction of the lamp strip;
the processor is configured to determine environment information and a device position of the head-mounted device; to generate a virtual blind road according to a navigation destination, the environment information and the device position; to determine a target walking direction corresponding to the current device position in the virtual blind road; and to determine, according to the correspondence between the light-emitting devices in the lamp strip and walking directions, a target light-emitting device corresponding to the target walking direction and send a control instruction corresponding to the target light-emitting device to the lamp strip controller; wherein the virtual blind road does not interfere with obstacles, and the blind road width of the virtual blind road is greater than a preset value;
the lamp strip controller is configured to light the target light-emitting device of the lamp strip according to the control instruction; and
the head-mounted device further comprises a motion detection device, configured to determine the rotation angle of the head-mounted device according to motion data of the wearing part, so that the processor updates the correspondence between the light-emitting devices in the lamp strip and walking directions according to the rotation angle.
8. A storage medium having stored thereon computer-executable instructions which, when loaded and executed by a processor, carry out the steps of the head-mounted device based navigation method according to any one of claims 1 to 6.
CN202010392129.7A 2020-05-11 2020-05-11 Navigation method based on head-mounted equipment, head-mounted equipment and storage medium Active CN111494175B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010392129.7A CN111494175B (en) 2020-05-11 2020-05-11 Navigation method based on head-mounted equipment, head-mounted equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010392129.7A CN111494175B (en) 2020-05-11 2020-05-11 Navigation method based on head-mounted equipment, head-mounted equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111494175A CN111494175A (en) 2020-08-07
CN111494175B true CN111494175B (en) 2021-11-30

Family

ID=71865441

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010392129.7A Active CN111494175B (en) 2020-05-11 2020-05-11 Navigation method based on head-mounted equipment, head-mounted equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111494175B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113624236A (en) * 2021-08-06 2021-11-09 西安电子科技大学 Mobile device-based navigation system and navigation method for blind people
CN113850232B (en) * 2021-11-09 2024-06-11 安徽农业大学 Virtual blind road monitoring system
CN114767483A (en) * 2022-05-07 2022-07-22 宏脉信息技术(广州)股份有限公司 Intelligent navigation cap and control method thereof

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9508269B2 (en) * 2010-08-27 2016-11-29 Echo-Sense Inc. Remote guidance system
CN102973395B (en) * 2012-11-30 2015-04-08 中国舰船研究设计中心 Multifunctional intelligent blind guiding method, processor and multifunctional intelligent blind guiding device
CN106017489B (en) * 2016-04-01 2019-01-18 李晓莉 It is visually impaired with navigation pen based on Beidou positioning
CN107132667B (en) * 2017-04-06 2019-08-02 上海交通大学 Secondary row suitable for tubular visual field patient walks glasses and auxiliary traveling method
CN107356262A (en) * 2017-07-11 2017-11-17 上海共佰克智能科技有限公司 Correct the method for navigation direction and the device of amendment navigation direction
CN107328424B (en) * 2017-07-12 2020-12-11 三星电子(中国)研发中心 Navigation method and device
CN107990902B (en) * 2017-12-29 2019-08-16 达闼科技(北京)有限公司 Air navigation aid, navigation system based on cloud, electronic equipment
CN108403389A (en) * 2018-04-03 2018-08-17 闽南师范大学 A kind of intelligent blind-guiding cap
CN109031306A (en) * 2018-06-29 2018-12-18 合肥东恒锐电子科技有限公司 A kind of navigation methods and systems for disturbance people

Also Published As

Publication number Publication date
CN111494175A (en) 2020-08-07

Similar Documents

Publication Publication Date Title
CN111494175B (en) Navigation method based on head-mounted equipment, head-mounted equipment and storage medium
US10769463B2 (en) Training of vehicles to improve autonomous capabilities
US11852493B1 (en) System and method for sensing walked position
CN104127302B (en) A kind of visually impaired people's walking along the street safety navigation method
US20200043368A1 (en) Personal navigation system
JP6600310B2 (en) Navigation method based on see-through head-mounted device
AU2024202759A1 (en) Autonomous Vehicles and Advanced Driver Assistance
JP4748389B2 (en) Route search device
KR20180093753A (en) A stick for the blind
JP2007205764A (en) Route searching apparatus
AU2018267553B2 (en) Systems and methods to train vehicles
AU2018267541A1 (en) Systems and methods of training vehicles
JP2013117766A (en) Level difference detection system
US20190155024A1 (en) Image display device and image display method
CN109938973A (en) A kind of visually impaired person's air navigation aid and system
CN105716600B (en) Pedestrian navigation system and method
KR101655820B1 (en) Time of arrival notice system and there of method using smart glasses
CN116394981A (en) Vehicle control method, automatic driving prompting method and related devices
Scalvini et al. Outdoor navigation assistive system based on robust and real-time visual–auditory substitution approach
KR101893880B1 (en) Smart headlight for bicycle
WO2021124299A1 (en) Walking assisting systems and methods
EP4270416A1 (en) Infectious disease-monitoring smart device
WO2023089327A1 (en) Handheld guidance device for the visually-impaired
ES2447641B2 (en) NAVIGATION ASSISTANCE SYSTEM FOR INVIDENT OR VISUAL DEFICIENCY PERSONS
CN114569416A (en) Blind guiding system and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant