EP3204722A2 - Trajectory construction interface in an environment, and assembly of an environment and a trajectory construction interface - Google Patents
- Publication number
- EP3204722A2 (application EP15798152.3A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- user
- environment
- trajectory
- information
- azimuth
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H3/00—Appliances for aiding patients or disabled persons to walk about
- A61H3/06—Walking aids for blind persons
- A61H3/061—Walking aids for blind persons with electronic detecting or guiding means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H9/00—Pneumatic or hydraulic massage
- A61H9/005—Pneumatic massage
- A61H9/0078—Pneumatic massage with intermittent or alternately inflated bladders or cuffs
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3652—Guidance using non-audiovisual output, e.g. tactile, haptic or electric stimuli
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3664—Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/64—Three-dimensional objects
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C9/00—Individual registration on entry or exit
- G07C9/30—Individual registration on entry or exit not involving the use of a pass
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B21/00—Teaching, or communicating with, the blind, deaf or mute
- G09B21/001—Teaching or communicating with blind persons
- G09B21/003—Teaching or communicating with blind persons using tactile presentation of the information, e.g. Braille displays
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F9/00—Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
- A61F9/08—Devices or methods enabling eye-patients to replace direct visual perception by another kind of perception
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H3/00—Appliances for aiding patients or disabled persons to walk about
- A61H3/06—Walking aids for blind persons
- A61H3/061—Walking aids for blind persons with electronic detecting or guiding means
- A61H2003/063—Walking aids for blind persons with electronic detecting or guiding means with tactile perception
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H15/00—Massage by means of rollers, balls, e.g. inflatable, chains, or roller chains
- A61H2015/0007—Massage by means of rollers, balls, e.g. inflatable, chains, or roller chains with balls or rollers rotating about their own axis
- A61H2015/0014—Massage by means of rollers, balls, e.g. inflatable, chains, or roller chains with balls or rollers rotating about their own axis cylinder-like, i.e. rollers
- A61H2015/0021—Massage by means of rollers, balls, e.g. inflatable, chains, or roller chains with balls or rollers rotating about their own axis cylinder-like, i.e. rollers multiple on the same axis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2201/00—Characteristics of apparatus not provided for in the preceding codes
- A61H2201/16—Physical interface with patient
- A61H2201/1602—Physical interface with patient kind of interface, e.g. head rest, knee support or lumbar support
- A61H2201/1604—Head
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2201/00—Characteristics of apparatus not provided for in the preceding codes
- A61H2201/16—Physical interface with patient
- A61H2201/1602—Physical interface with patient kind of interface, e.g. head rest, knee support or lumbar support
- A61H2201/1619—Thorax
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2201/00—Characteristics of apparatus not provided for in the preceding codes
- A61H2201/16—Physical interface with patient
- A61H2201/1602—Physical interface with patient kind of interface, e.g. head rest, knee support or lumbar support
- A61H2201/1635—Hand or arm, e.g. handle
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2201/00—Characteristics of apparatus not provided for in the preceding codes
- A61H2201/16—Physical interface with patient
- A61H2201/1602—Physical interface with patient kind of interface, e.g. head rest, knee support or lumbar support
- A61H2201/164—Feet or leg, e.g. pedal
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2201/00—Characteristics of apparatus not provided for in the preceding codes
- A61H2201/16—Physical interface with patient
- A61H2201/1602—Physical interface with patient kind of interface, e.g. head rest, knee support or lumbar support
- A61H2201/1645—Physical interface with patient kind of interface, e.g. head rest, knee support or lumbar support contoured to fit the user
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2201/00—Characteristics of apparatus not provided for in the preceding codes
- A61H2201/16—Physical interface with patient
- A61H2201/1602—Physical interface with patient kind of interface, e.g. head rest, knee support or lumbar support
- A61H2201/165—Wearable interfaces
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2201/00—Characteristics of apparatus not provided for in the preceding codes
- A61H2201/50—Control means thereof
- A61H2201/5023—Interfaces to the user
- A61H2201/5048—Audio interfaces, e.g. voice or music controlled
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2201/00—Characteristics of apparatus not provided for in the preceding codes
- A61H2201/50—Control means thereof
- A61H2201/5058—Sensors or detectors
- A61H2201/5064—Position sensors
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2201/00—Characteristics of apparatus not provided for in the preceding codes
- A61H2201/50—Control means thereof
- A61H2201/5058—Sensors or detectors
- A61H2201/5079—Velocity sensors
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2201/00—Characteristics of apparatus not provided for in the preceding codes
- A61H2201/50—Control means thereof
- A61H2201/5058—Sensors or detectors
- A61H2201/5084—Acceleration sensors
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2201/00—Characteristics of apparatus not provided for in the preceding codes
- A61H2201/50—Control means thereof
- A61H2201/5058—Sensors or detectors
- A61H2201/5092—Optical sensor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2201/00—Characteristics of apparatus not provided for in the preceding codes
- A61H2201/50—Control means thereof
- A61H2201/5097—Control means thereof wireless
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2203/00—Additional characteristics concerning the patient
- A61H2203/04—Position of the patient
- A61H2203/0425—Sitting on the buttocks
- A61H2203/0431—Sitting on the buttocks in 90°/90°-position, like on a chair
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H2205/00—Devices for specific parts of the body
- A61H2205/02—Head
- A61H2205/021—Scalp
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H23/00—Percussion or vibration massage, e.g. using supersonic vibration; Suction-vibration massage; Massage with moving diaphragms
- A61H23/02—Percussion or vibration massage, e.g. using supersonic vibration; Suction-vibration massage; Massage with moving diaphragms with electric or magnetic drive
- A61H23/0218—Percussion or vibration massage, e.g. using supersonic vibration; Suction-vibration massage; Massage with moving diaphragms with electric or magnetic drive with alternating magnetic fields producing a translating or oscillating movement
Definitions
- the present invention relates to the field of systems for constructing a trajectory in space, and more particularly to a trajectory construction interface in an environment and to an assembly of an environment and a trajectory construction interface, enabling hands-free, eyes-free operation: a blind or partially sighted person can construct and follow his own trajectory in the environment, and any person without disability can be guided very precisely without using sight, hearing or hands.
- a trajectory is the sequence of positions described by a moving point.
- a trajectory can be trivial (a sequence of segments connecting navigation points) or complex (combinations of curves with variable radii passing through points associated with tangents to the curve).
- the trajectory that a person describes depends on an objective: for example, the shortest path, or the fastest travel time taking into account lateral or longitudinal limits of adhesion as a function of speed.
- a specialist is able to extrapolate his optimal trajectory in real time from a limited amount of information about his evolution environment, combined with his own situation (speed, orientation, etc.). He can then autonomously make immediate decisions such as turning, slowing down or speeding up, without a system imposing them on him.
- chord points: the necessary and sufficient information the specialist needs are particular points commonly known as "chord points". To extrapolate a complex trajectory, the specialist must know at all times, and simultaneously, the position, the direction of passage and the tangent direction of at least one chord point. For optimization purposes, he must also know his own speed. The position of this point alone is insufficient for extrapolating a complex trajectory. For example, if the chord point is a few inches from a wall, the specialist must understand that he is to pass through this chord point parallel to the wall. The tangent information attached to the chord point is therefore indispensable. A simple series of waypoints, without this tangent information, is unusable for constructing complex trajectories.
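The role of the tangent can be made concrete with a short sketch (not part of the patent; the names and the cubic Hermite interpolation scheme are illustrative assumptions): given the user's current position and heading, and a chord point carrying its tangent direction, a smooth candidate trajectory through the chord point can be extrapolated.

```python
import math
from dataclasses import dataclass

@dataclass
class ChordPoint:
    x: float            # position (m)
    y: float
    tangent_deg: float  # tangent direction at the point (degrees)

def hermite_path(p0, heading_deg, cp, steps=5):
    """Extrapolate a smooth path from the user's state (position p0,
    heading) to the chord point, matching the tangent direction there."""
    h0 = math.radians(heading_deg)
    h1 = math.radians(cp.tangent_deg)
    d = math.hypot(cp.x - p0[0], cp.y - p0[1])
    t0 = (d * math.cos(h0), d * math.sin(h0))  # start tangent vector
    t1 = (d * math.cos(h1), d * math.sin(h1))  # tangent at chord point
    path = []
    for i in range(steps + 1):
        s = i / steps
        a = 2*s**3 - 3*s**2 + 1        # cubic Hermite basis functions
        b = s**3 - 2*s**2 + s
        c = -2*s**3 + 3*s**2
        e = s**3 - s**2
        path.append((a*p0[0] + b*t0[0] + c*cp.x + e*t1[0],
                     a*p0[1] + b*t0[1] + c*cp.y + e*t1[1]))
    return path

# A waypoint alone fixes only the endpoint; the tangent_deg field is
# what makes the curve arrive parallel to, e.g., a nearby wall.
```

A waypoint list without tangents would leave the arrival direction at each point undetermined, which is precisely the deficiency the text describes.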
- the goal for the athlete is to reach the end of a course in a minimum time, without leaving the limits of the environment imposed on him (ski slope, lane in athletics, edge of the road or velodrome in cycling, edge of the road or circuit in motor racing or motorcycling).
- the athlete must balance his speed with his ability to maintain or change his direction of evolution in the environment.
- additional parameters come into play, such as the ability of the equipment to accelerate or decelerate depending, for example, on speed or radius of curvature.
- the optimization of these parameters is reflected in the trajectory described by the athlete in his environment of evolution.
- the top athlete is the one who excels at the same time in the technique of his particular sport (running, going fast on skis, managing a drift in a car, optimizing braking) and in planning his trajectory in the environment which will be imposed on him for an event.
- the best athlete is the one who exploits his intrinsic capacities to the maximum in the environment, without exceeding its limits.
- this is the purpose of the trajectory: it stems from real expertise on the athlete's part. It is the combination of the athlete's capabilities and the constraints of the evolution environment.
- the athlete relies on reference points that he follows throughout his course. These points may be waypoints, chord points, orientation change points, deceleration points or acceleration points.
- these points are predetermined by the athlete and his concentration is focused on the landmarks distant from his instantaneous situation.
- the system according to the invention provides athletes with innovative support in the complex process of controlling trajectories.
- the system according to the invention can be used both as part of a pedagogical approach but also in an operational situation as an effective medium of information useful for the management of the trajectory.
- a key feature of the system according to the invention is that it can also be used effectively by visually impaired and blind people.
- US patent application US2011268300 discloses a tactile guidance system comprising tactile stimulators in a headgear that can produce tactile sensations at different locations around the head to transfer information, such as a direction or azimuth, to a person; the system further comprises magnetic sensors, accelerometers and/or a GPS.
- the system can convey only one type of information at a time to the user by touch, which limits the amount of trajectory information sent to the user.
- this system does not detect objects in the environment that are not in the environment mapping.
- US patent application US2008120029 describes a tactile navigation system which transfers position information to the user in tactile form; using a GPS, the system can provide direction and distance information to the user via a belt carrying four tactile actuators located at the four cardinal points.
- this system cannot convey to the user, by touch, other navigation information such as the passage gate's own azimuth or the boundaries of the evolution environment.
- this system does not detect objects in the environment that are not in the environment mapping.
- since this system has only four tactile actuators, its orientation accuracy is very limited.
- this system does not allow use at high speed, nor optimized trajectory management.
- French patent application FR2771626 describes a system allowing blind or visually impaired people to orient themselves and move in an unknown environment, the system comprising transmitters arranged along the path and a portable object comprising tactile means for indicating to the user the direction to follow, based on the orientation reference, the selected destination and the information emitted by the transmitters.
- this system cannot convey to the user, by touch, other navigation information such as the passage gate's own azimuth or environmental boundary information.
- the exchange of information between the transmitters and the mobile object is achieved via wireless communication, which forces the object worn by the user to include a wireless receiver.
- this system does not allow use at high speed, nor optimized trajectory management.
- U.S. patent US 5,470,233 A discloses a system allowing a blind person to navigate in an urban environment. This system does not allow use at high speed, nor optimized trajectory management.
- German patent application DE 10 2011 119864 A1 describes a navigation device having a tactile interface. This system does not allow use at high speed, nor optimized trajectory management. In addition, it does not allow hands-free use.
- the subject of the present invention is an interface for constructing a trajectory in an environment for a user, the user having at a given moment a position and a direction on said trajectory, characterized by the fact that the interface includes:
- environment recognition means for indicating in real time to the user, via first means for transmitting information to the user by sensory means, direction information for a future passage gate on the trajectory.
- the interface further comprises:
- distance calculation means making it possible to indicate in real time to the user, via second means for transmitting information to the user by sensory means, distance information to said future passage gate on the trajectory.
- the expression "by sensory means" is understood to mean haptically (in particular tactilely) or by sound.
- the invention uses a target passage gate defined by a right limit and a left limit of passage, its direction relative to the user's own reference frame, and its distance from the user.
- the target gateway can be fixed in the environment or mobile.
- the interface is equipped with:
- a haptic system transmitting information on the right and left limits of the passage gate, and a haptic or sound system providing information on the distance between the user and the passage gate.
- in athletics, during a race, a camera is positioned on a blind athlete, for example on his chest.
- a sighted athlete, the guide of the blind athlete, wears a visual marker on his back (for example, a specific pattern such as a cross).
- the system constantly detects the position of the guide in front of the blind athlete, and constantly indicates to the user, haptically, the direction between the axis of the camera and the guide.
- the camera or cameras can also be used to detect anomalies, such as obstacles (moving or not), and alert the user.
- this system also makes it possible to determine the speed and the orientation of the camera, and therefore of the user, by reading the scrolling of predefined environment markers (the known spacing of standard ground marks, for example).
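A minimal sketch of this speed estimate (hypothetical function name and values; the patent does not specify the computation): with ground marks of known spacing, the timestamps at which successive marks cross the camera image give the speed directly.

```python
def speed_from_markers(detection_times_s, marker_spacing_m):
    """Estimate speed from the timestamps at which equally spaced
    ground marks are detected as they scroll through the image."""
    if len(detection_times_s) < 2:
        raise ValueError("need at least two marker detections")
    elapsed = detection_times_s[-1] - detection_times_s[0]
    travelled = marker_spacing_m * (len(detection_times_s) - 1)
    return travelled / elapsed  # m/s

# Marks every 5 m detected at t = 0, 1 and 2 s imply 5 m/s.
```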
- the interface according to the invention further comprises third means for transmitting information to the user by sensory means, the environment recognition means making it possible to indicate in real time to the user, through these third means, the right-limit and left-limit information of said future gate.
- the limits of the environment are advantageously transmitted to the user according to a time projection, for example at 0, 0.1, 1 or more seconds ahead, depending on user situation data.
- the interface according to the invention further comprises means for determining the distance between the user and another reference user located downstream on the same trajectory.
- one or more cameras are positioned on a blind athlete, for example on his helmet.
- a sighted skier, the guide of the blind skier, wears a visual marker on his back (e.g. a specific pattern).
- the system continuously detects the position of the guide in front of the blind athlete, as well as his distance, thanks to a radar system or a radio system (radio transmitter on the guide, radio receiver on the blind skier, with a system determining the distance of the radio source).
- the interface according to the invention further comprises a relational database of the environment, containing all the elements constituting the environment and their respective distance and position relationships, and means for locating the user in the environment.
- the relational database of the environment is a database of points of the environment associated with coordinates, possibly enriched by data specific to points or areas of the environment. In very detailed versions, such a relational environment database can be found in mapping databases.
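A minimal sketch of such a relational point database (purely illustrative; the patent does not prescribe an implementation): points with coordinates, optionally enriched with attributes, supporting a nearest-point query from the user's position.

```python
import math

class EnvironmentDB:
    """Point database: coordinates plus optional attributes
    (e.g. 'passage gate', 'boundary', 'obstacle')."""

    def __init__(self):
        self.points = {}  # name -> ((x, y), attribute dict)

    def add(self, name, xy, **attrs):
        self.points[name] = (xy, attrs)

    def nearest(self, xy):
        """Return the name of the stored point closest to position xy."""
        return min(self.points,
                   key=lambda n: math.dist(xy, self.points[n][0]))

db = EnvironmentDB()
db.add("gate1", (10.0, 0.0), kind="passage gate")
db.add("wall_corner", (2.0, 1.0), kind="boundary")
```

A full cartographic database would add the distance and position relations between elements; this sketch only shows the point-with-attributes core.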
- the interface may advantageously include a means of locating the user's position and its derivatives (speed, acceleration); such means may be a GPS, landmarks visible by infrared or TV camera (HD, UHD, 4K), an accelerometer, an inertial unit, electronic compasses, or a localization system using the telephone network ...
- the camera system can allow on the one hand a redundancy of position information, which is an advantage for the reliability of the information, but also a direct detection of the real environment, in particular for real-time visual signals: unmapped obstacle, flag indicating a situation at risk, etc.
- the interface according to the invention further comprises fourth means for transmitting information to the user by sensory means, to indicate to the user the direction of passage of said future passage gate.
- a camera is positioned on a blind sporting user, for example on his chest.
- the system is set to indicate in real time the position, the right and left boundaries, and the passage orientation of a mobile virtual passage gate permanently located 5 meters (a configurable value) in front of the sporting user.
- the camera continuously provides the trajectory construction interface with information on the bounding lines of the athlete's lane (position, curvature).
- the trajectory building interface calculates in real time the situation of the virtual gate with respect to the sporting user by using the data collected by the camera.
- the trajectory construction interface calculates the direction of passage of the gate in real time using the radius of curvature of the corridor lines.
- the trajectory construction interface transmits it haptically to the user, for example with a system of one or more vibrating cells or with a system of mechanical pointers.
- the system indicates in real time the position of the user relative to the right and left limits of the passage gate, which allows the user to keep himself positioned in his lane at all times.
- this system allows, for example, a blind person to run lane races (100 m, 200 m, 400 m in athletics, for example) without the help of a guide, which is currently indispensable.
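The virtual-gate geometry in this example can be sketched as follows (simplified 2-D geometry with hypothetical names; the patent itself works from the lane lines' radius of curvature): given the left and right lane-boundary points sampled at the lookahead distance, the gate's centre, width and passage direction follow from elementary geometry.

```python
import math

def virtual_gate(left_xy, right_xy):
    """Given the left and right lane-boundary points at the lookahead
    distance (e.g. 5 m ahead, configurable), return the gate centre,
    its width, and its passage direction in degrees (the normal to
    the left-right segment, pointing forward)."""
    cx = (left_xy[0] + right_xy[0]) / 2.0
    cy = (left_xy[1] + right_xy[1]) / 2.0
    dx = right_xy[0] - left_xy[0]
    dy = right_xy[1] - left_xy[1]
    width = math.hypot(dx, dy)
    # Rotate the left->right vector by +90 deg ((x, y) -> (-y, x))
    # to obtain the forward passage direction.
    passage_deg = math.degrees(math.atan2(dx, -dy))
    return (cx, cy), width, passage_deg

# Example: user facing +x, left boundary at y = +1 m and right
# boundary at y = -1 m, both sampled 5 m ahead.
centre, width, passage = virtual_gate((5.0, 1.0), (5.0, -1.0))
```

The haptic channels then only need to encode the offset of the user from `centre` within `width` and the deviation of his heading from `passage`.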
- the interface according to the invention further comprises at least one of: fifth means for transmitting information to the user by sensory means, to indicate to the user the position of the reference user; sixth means for transmitting information to the user by sensory means, to indicate to the user the speed of the reference user; and seventh means for transmitting information to the user by sensory means, to indicate to the user the acceleration of the reference user.
- speed here means a vector quantity (direction, sense, magnitude) and not the simple speed value.
- this makes it possible in particular to add, in real time, the reference user's orientation information as well as the distance between the reference user and the user, thus dispensing with a radio and/or radar system.
- each of the first to seventh means of transmitting information to the user by sensory means is one of a haptic tool positioned on a part of the body of the user and a sound tool.
- each haptic tool is one of:
- one or more pointers in contact with a part of the body of the user
- a mechanical finger system controlled by a pneumatic network, preferably without metal components.
- one or more pointers can move on a slideway mechanically, electromagnetically, pneumatically, or hydraulically.
- the displacement of each pointer is determined according to the evolution of the haptic information that it must transmit.
- Each pointer may be in contact with the user via a non-abrasive surface, a wheel or a vibrating device.
- each sound tool is at least one of an audio headset and at least one loudspeaker.
- the sound tool can transmit information on the speed of the user and its proximity to a point, boundary or area of the environment, or alert the user of the proximity of an obstacle.
- the sound information on his speed can be transmitted to the user in different ways: absolute speed, or speed relative to a reference speed at the point where the user is located; the information may be conveyed by a voice speaking to the user, or by frequency modulation.
- the sound information on proximity can be transmitted to the user in different ways: a countdown before a cue, or frequency modulation.
- each sound tool can use the "left" and "right" channels separately to broadcast two different types of information (speed on one side, proximity on the other, for example).
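Frequency modulation of a speed signal could look like the following sketch (the reference speed, base pitch and semitone scaling are invented parameters, not values from the patent): pitch rises when the user is faster than the local reference speed and falls when he is slower.

```python
def speed_to_frequency(speed_mps, ref_speed_mps=5.0,
                       base_hz=440.0, semitones_per_mps=2.0):
    """Map the gap between the user's speed and a local reference
    speed to an audio pitch: faster than the reference -> higher
    tone, slower -> lower tone (equal-tempered semitone steps)."""
    delta = speed_mps - ref_speed_mps
    return base_hz * 2.0 ** (semitones_per_mps * delta / 12.0)

# At the reference speed the tone is the base pitch; 6 m/s above it
# (at 2 semitones per m/s) the pitch is one octave higher.
```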
- each haptic tool is positioned on the head, neck, torso, arms and / or legs of the user.
- each environment recognition means is constituted by one of an infrared camera, a TV camera and a photographic sensor connected to an image recognition computer program. There can of course be more than one camera, for example to provide a 360° vision.
- at least two cameras can be used to determine the relative position of a body part with respect to a frame of reference that can be the body itself or the environment of the user.
- a camera is fixed on the head of the user, another on his torso.
- by differential analysis of the images from the two cameras, it is possible to determine the position of the head of the user relative to the torso.
- a camera is fixed on the head of the user, another is fixed on a car in which the user is.
- by differential analysis of the images from the two cameras, it is possible to determine the position of the user's head relative to the vehicle.
- Stereoscopic vision can also allow distance calculation.
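The stereoscopic distance calculation mentioned above classically uses the disparity between the two images. A minimal sketch, assuming a pinhole model with the focal length and disparity expressed in pixels (the function name is illustrative):

```python
def stereo_distance(focal_px, baseline_m, disparity_px):
    """Depth from a stereo pair: Z = f * B / d (pinhole model).

    focal_px: focal length in pixels; baseline_m: distance between the two
    cameras in metres; disparity_px: horizontal shift of the same point
    between the two images, in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```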
- the distance calculating means are constituted by at least one of a radar, a radio wave transmitter-receiver pair, a transmitter-receiver pair of ultrasonic waves.
- the system constantly indicates to the user the direction between the axis of the camera and the guide haptically, and the distance between the guide and the user haptically or audibly.
- the means for determining the distance between the user and a reference user located downstream on the same trajectory is constituted by a camera system carried by the user connected to an image recognition program, and by a sign capable of being recognized on the reference user, such that the image processing software, after capturing the image of the sign on the reference user, is able to deduce from the image of the sign the distance between the user and the reference user.
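Deducing distance from the image of the sign can rely on the pinhole projection relation (apparent size is inversely proportional to distance). A sketch under that assumption, for a sign of known physical height; the names and units are illustrative:

```python
def distance_from_sign(focal_px, sign_height_m, sign_height_px):
    """Distance to the reference user from the apparent size of the sign:
    distance = f * real_height / apparent_height (pinhole model)."""
    if sign_height_px <= 0:
        raise ValueError("apparent height must be positive")
    return focal_px * sign_height_m / sign_height_px
```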
- the relational database is a cartography of the environment, the means for locating the user in his environment being constituted by at least one of a GPS system, a Galileo system and a Glonass system.
- the present invention proposes, according to one embodiment, an assembly of an environment and a trajectory construction interface in the environment, the trajectory construction interface comprising a cartography of the environment, the environment comprising at least one object, the trajectory construction interface comprising at least one real-time calculation means, a user position determination means, a user real azimuth determination means, a haptic stimulation indication means of the position of the at least one object, and a haptic stimulation indication means of the own azimuth of the at least one object.
- this assembly can provide haptically to the user, who may be blind or visually impaired, multiple pieces of navigation information simultaneously, such as direction and azimuth information, so that the user can navigate in the environment along a trajectory that he chooses.
- this set makes it possible to detect objects positioned in the environment, without the latter being in the environment mapping, and without the path construction interface having a wireless communication means.
- the assembly according to the present invention allows any person to move in a natural way in his environment without the sense of sight, said assembly providing in real time to its user the synthesized information of his environment of evolution, which will allow him to make his own movement choices.
- the real azimuth of a user is the direction followed by this user, that is to say the axis of the user's head. For a sighted user, the real azimuth is the direction of his gaze.
- the principle of the invention is based on the fact that, when the user moves, the body of the user uses the position of the head as a natural reference.
- the position of the head naturally draws the body onto a trajectory in space, so the trajectory construction interface of the invention indicates in real time to the user the direction in which to go and therefore the direction of the objective relative to his body; specific information to define a trajectory is also sent in real time to the user by haptic and/or sound means.
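The direction of the objective relative to the user's head can be expressed as a relative bearing. A minimal Python sketch, assuming a flat local frame with x pointing east and y north, and azimuths in degrees clockwise from north (conventions not specified in the application):

```python
import math

def relative_bearing(user_pos, target_pos, head_azimuth_deg):
    """Angle from the head axis to the target, in [-180, 180);
    negative means 'turn left', positive 'turn right'."""
    dx = target_pos[0] - user_pos[0]  # east offset
    dy = target_pos[1] - user_pos[1]  # north offset
    target_azimuth = math.degrees(math.atan2(dx, dy)) % 360.0
    return (target_azimuth - head_azimuth_deg + 180.0) % 360.0 - 180.0
```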
- the subject of the present invention is therefore an assembly of an environment and a trajectory construction interface in the environment, the trajectory construction interface comprising a cartography of the environment, the environment comprising at least one object, characterized in that the trajectory construction interface comprises:
- a memory in which the mapping is stored;
- an indication means by haptic stimulation of the own azimuth of the at least one object, said indication means by haptic stimulation of the own azimuth of the at least one object being controlled by the real-time calculation means based on the map stored in the memory, user position information from the position determination means, and/or real user azimuth information from the real azimuth determination means.
- the memory may especially be one of a random access memory, a read only memory, a volatile memory or a flash memory.
- the real-time calculation means may especially be one of a microprocessor, a microcontroller, an embedded system, an FPGA or an ASIC.
- position and own azimuth information of at least one object in the environment can be indicated in real time to the user haptically depending on the position and the real azimuth of the user, the at least one object in the environment possibly being in the mapping of the environment stored in the memory; the user thus has information on the path to follow in the environment, knowing the position of the target object to follow or to cross as well as the direction of evolution of the target to follow or the direction of passage of the object to be crossed.
- the user can do without the sense of sight to move in the environment, the information on the path he must follow in the environment being transmitted haptically.
- the trajectory construction interface further comprises a distance indication means of at least one object relative to the user, said means for indicating the distance of the at least one object being controlled by the real-time calculation means according to the map stored in the memory, user position information derived from the position determination means and/or real user azimuth information from the real azimuth determination means. Indications on the speed and acceleration of the user can also be provided.
- the user also receives, in real time, distance indications of the at least one object in the environment, the user being able to adapt his speed and trajectory according to the distance separating him from the object in the environment.
- the trajectory construction interface further comprises an environment limit indication means, said environment limit indication means being controlled by the real-time calculation means based on the map stored in the memory, user position information from the position determination means and/or real user azimuth information from the real azimuth determination means.
- the position determination means is one of a GPS, a Galileo system and a Glonass system, and / or one or more cameras, preferably infrared (IR) cameras, said cameras being able to locate objects of the environment so that the real-time computing means can determine the user's position using the map.
- the GPS, Galileo or Glonass system makes it possible to locate in real time the position of the user in the environment, so as to know his position relative to the map stored in the memory.
- the camera(s) make it possible to locate objects in the environment, for example by detecting the color or shape of the objects, or by detecting the frequency of a signal visible by the camera sensors (IR or not), in order to locate in real time the position of the user relative to the information contained in the map stored in the memory.
- in the case of IR cameras, these detect IR waves radiated by specific objects in the environment.
- the position determination means preferably contains a GPS and several cameras for redundancy of position information, the whole being thus more secure.
- the real azimuth determination means is an electronic compass or an inertial unit positioned on the user's head.
- the electronic compass / inertial unit allows the trajectory construction interface to know in real time the orientation of the head of the user, the real-time calculation means sending the trajectory information to the user according to the orientation of his head.
- the trajectory construction interface furthermore comprises one or more accelerometers, the accelerometer or accelerometers possibly being included in the inertial unit.
- the accelerometer or accelerometers allow the real-time calculation means to know in real time the acceleration of the user in the environment.
- the invention may therefore comprise an electronic compass and accelerometers, or an inertial unit or a combination of an inertial unit, an electronic compass and accelerometers.
- the means for indicating by haptic stimulation the position of the at least one object is a haptic tool positioned on a part of the user's body, and the means for indicating by haptic stimulation the own azimuth of the at least one object is another haptic tool positioned on a part of the body of the user.
- the expression "own azimuth of an object" refers to the direction and orientation of the object according to its own frame of reference.
- for example, the "own azimuth" of a door designates its direction of passage and its orientation with respect to its own reference frame.
- the haptic tools make it possible to indicate to the user respectively object position information and object own azimuth information without the user needing the sense of sight, the user possibly being, for example, a visually impaired or blind person.
- the distance indicating means of the at least one object is a haptic tool positioned on a part of the body of the user and / or a sound tool.
- the object distance indication information can be provided to the user in a haptic and / or audible manner, without the user needing the sense of sight.
- the environmental limit indication means is a haptic tool positioned on a part of the body of the user and / or a sound tool.
- the environmental limit indication information may be provided to the user in a haptic and / or audible manner, without the user needing the sense of sight.
- the sound tool(s) is an audio headset or one or more loudspeakers.
- the sound tool may be an audio headset, and in the case where the trajectory construction interface is jointly worn by the user and a vehicle, the sound tool may be loudspeakers arranged in the vehicle.
- the at least one object is a primary trajectory gate, characterized by a left limit, a right limit and an own azimuth, the own azimuth of the primary gate corresponding to the direction of passage of the primary gate by the user.
- a trajectory gate is a passage gate through which the user must pass in his evolution on the trajectory in the environment, the passage gate being defined by a left limit and a right limit as well as a direction of passage; the passage gate can be statically defined in the environment, dynamically defined by the trajectory construction interface, or parameterized on the fly by the user.
- the primary trajectory gate is the next gateway that the user must cross on the path in the environment.
- the assembly transmits to the user position information of the left and right limits of the primary gate via the object position indicating means, the own azimuth information of the primary gate via the object azimuth indicating means, and optionally distance information of the left and right limits of the primary gate via the object distance indicating means, the user thus having all the information necessary to move to the primary gate and cross it.
- the secondary trajectory gate is the passage gate that follows the primary trajectory gate, the secondary gate becoming the new primary gate after the passage of the previous primary gate.
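The gate information listed above (limit positions, own azimuth, distance) can be derived from the map and the user's state. A hedged sketch, reusing as assumptions a flat east/north frame and azimuths in degrees clockwise from north; the function name is illustrative:

```python
import math

def gate_guidance(user_pos, left_limit, right_limit,
                  gate_azimuth_deg, head_azimuth_deg):
    """Return (relative bearing to the gate centre in degrees,
    distance to the gate centre in metres, angular error between
    the head azimuth and the gate's own azimuth in degrees)."""
    cx = (left_limit[0] + right_limit[0]) / 2.0
    cy = (left_limit[1] + right_limit[1]) / 2.0
    dx, dy = cx - user_pos[0], cy - user_pos[1]
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0
    bearing_rel = (bearing - head_azimuth_deg + 180.0) % 360.0 - 180.0
    azimuth_err = (gate_azimuth_deg - head_azimuth_deg + 180.0) % 360.0 - 180.0
    return bearing_rel, math.hypot(dx, dy), azimuth_err
```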
- the environment further comprises one or more environment boundary objects, preferably a left boundary and a right boundary.
- the user may further receive environment limit information via the environment limit indicating means, the environment boundary objects being defined in the environment mapping and/or physically arranged in the environment and detected by the cameras.
- the environment further comprises one or more reference objects, previously positioned on the map or on the fly in the environment.
- these cue objects are markers for specific actions (for example, braking, steering, jumping, etc.) or elements of the environment that must be dynamically managed, such as obstacles (for example, a stopped or moving car, a person on the move, a red light, etc.) or game partners (for example, opponents or team-mates in a team sport) or a particular object (for example, a ball, a balloon, etc.), the markers for specific actions being previously arranged on the map or detected by the cameras, the other elements of the dynamically managed environment being detected by the cameras.
- each haptic tool is one of:
- one or more pointers in contact with a part of the body of the user
- a mechanical finger system controlled by a pneumatic network, preferably without metal components.
- one or more pointers can move on a slideway mechanically, electromagnetically, pneumatically, or hydraulically.
- the displacement of each pointer is determined according to the evolution of the haptic information that it must transmit.
- Each pointer may be in contact with the user via a non-abrasive surface, a wheel or a vibrating device.
- the haptic tool or tools used in the context of the present invention may also be connected objects.
- a safety belt equipped with haptic information devices may indicate the direction of a primary and / or secondary door, and / or its distance and / or orientation.
- the steering wheel of a car or a boat, the stick of an airplane, the handles of a trolley, or any device that the user grasps in connection with a trip may be used as a haptic tool according to its capabilities: a parameterization system will define the precise nature of the information that will be supported by the connected object as well as its calibration, according to the characteristics of the connected object and the preferences of the user.
- each haptic tool can accurately indicate to the user the information of the associated indicating means, each haptic tool being able to be disposed on a different part of the body of the user.
- each haptic tool is positioned on the user's head, neck, torso, arms and / or legs.
- two contact wheels sliding on a slide can, for example, be arranged on the user's head in order to indicate to the user position and distance information of the primary gate, the position of the two contact wheels relative to the user's skull indicating the position of the two primary gate limits relative to the orientation of the user's head, and the spacing of the two contact wheels indicating to the user his distance from the primary gate.
- vibrating cells may be arranged at a constant pitch around the user's head in order to indicate to the user the own azimuth information of the primary gate, one of the vibrating cells being in vibration so as to indicate the azimuth of the primary gate relative to the orientation of the head of the user.
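Selecting which of the cells arranged at constant pitch should vibrate amounts to quantizing the relative azimuth. A minimal sketch, assuming (the application does not say) that cell 0 faces straight ahead, indices increase clockwise, and eight cells are used:

```python
def active_cell(relative_azimuth_deg, n_cells=8):
    """Index of the cell closest to the given relative azimuth
    (degrees, clockwise, 0 = straight ahead)."""
    step = 360.0 / n_cells
    return int(((relative_azimuth_deg % 360.0) + step / 2.0) // step) % n_cells
```

With eight cells, an azimuth of 90° to the right activates the cell over the right ear, and -90° the cell over the left ear.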
- a plurality of mini-actuators, actuatable electromagnetically or by air or liquid, may be arranged at a constant pitch around the torso of the user to indicate to the user secondary gate position information, only one mini-actuator being actuated at a time to indicate the position of the secondary gate relative to the orientation of the user's head.
- inflatable pockets may, for example, be disposed on each arm of the user to indicate to the user environment limit information, with only one pocket inflated on each arm to indicate respectively the distances of the left and right boundaries of the environment relative to the user.
- the trajectory construction interface is able to be wirelessly connected to applications specific to the environment, such as a guidance system application, the trajectory construction interface receiving real-time information about the environment, such as environment changes, from the applications specific to the environment.
- the subject of the present invention is an assembly of an environment and a trajectory construction interface in the environment as described below, the trajectory construction interface comprising a mapping of the environment, the environment comprising at least one object, characterized by the fact that:
- the calculation and memory means store the map
- the calculation means operate in real time;
- the trajectory construction interface further comprises a user's position determination means and a user's real azimuth determination means;
- the first means of transmitting information to the user by sensory means indicates the direction of the at least one object, said first means of transmitting information to the user by sensory means being controlled by the real-time calculation means according to the map stored in the calculation and memory means, user position information from the position determination means and/or real user azimuth information from the real azimuth determination means; and
- the second means of transmitting information to the user by sensory means indicates the distance of the at least one object, said second means being controlled by the real-time calculation means according to the map stored in the calculation and memory means, user position information from the position determination means and/or real user azimuth information from the real azimuth determination means.
- the distance calculating means calculate a distance of the at least one object relative to the user, said means for calculating the distance of the at least one object being controlled by the real-time calculation means according to the map stored in the calculation and memory means, user position information derived from the position determination means and/or real user azimuth information from the real azimuth determination means.
- the position determination means is one of a GPS, a Galileo system, a Glonass system, and at least one camera, preferably infrared (IR), but also possibly TV, HD, UHD, 4K, etc., the at least one camera being able to locate objects of the environment so that the real-time calculation means can determine the position of the user using the map.
- the real azimuth determination means is one of an electronic compass and an inertial unit positioned on the head of the user.
- the trajectory construction interface further comprises at least one accelerometer, the accelerometer or accelerometers possibly being included, where appropriate, in the inertial unit.
- the at least one object is a primary trajectory gate, characterized by a left boundary, a right boundary and a proper azimuth, the own azimuth of the primary gate corresponding to the direction of passage of the primary door by the user.
- the environment further comprises a secondary trajectory gate object, corresponding to the point of passage following the point of passage of the primary gate (34), the secondary gate becoming the new primary gate after the passage of the previous primary gate (34).
- the trajectory construction interface is able to be connected wirelessly to applications specific to the environment, such as a guidance system application, the trajectory construction interface receiving real-time information about the environment, such as changes to the environment, from the applications specific to the environment.
- the trajectory construction interface can receive in real time information concerning the airport or the metro via applications specific to the airport or to the metro.
- information concerning, for example, a boarding gate, a baggage claim area or a guide during a connection in the case of an airport, or a metro platform or a guide during a connection in the case of a metro.
- trajectory construction interfaces according to the present invention can be interconnected to share data with each other.
- all the other interfaces of the group benefit from this information.
- the relative positions of the members of the group are known to each other through the sharing of information and through the means of recognition of the environment (cameras). Any individual interface according to the invention can be seen as a primary or secondary gate by another interface according to the invention.
- the interface comprises a means for detecting the inclination around the y-axis, which lies in the horizontal reference plane of the user and is perpendicular to the user's reference frontal axis, coupled to a device with haptic emitters called the "vertical system", which indicates to the user an angular position around this same y-axis.
- the haptic emitter device is composed of one or more tactile pointers that transmit "more upward" or "more downward" information to the user according to a setting that the user can refine according to his sensitivity.
- the inclination detection means may be a connected object, with or without a wire, integral with any part of the user's body (head, arm, etc.) or with any object used for objective designation (binoculars, lamp, weapon, etc.).
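The "more upward" / "more downward" decision can be sketched as a simple comparison with a dead band around the target inclination; the dead-band value stands in for the user-adjustable setting and is an assumption:

```python
def vertical_cue(pitch_deg, target_pitch_deg, deadband_deg=2.0):
    """Return which vertical-system pointer to drive: 'up', 'down',
    or 'hold' when within the user-adjustable dead band."""
    error = target_pitch_deg - pitch_deg
    if error > deadband_deg:
        return "up"
    if error < -deadband_deg:
        return "down"
    return "hold"
```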
- the haptic transmitter device relating to the information processed by this detection means can be placed on any part of the user's body.
- This device allows ALT-Sen to designate an objective not only in the reference plane of the user, but in space.
- the trajectory construction interface may comprise a remote system for determining the position of the head: the position of the head is determined by an external system, for example one or more cameras fixed on a support (handlebars or dashboard of a vehicle, for example). This system for determining the position of the head can then complement or replace the on-board system integral with the user (compass, accelerometer, inertial unit, angular position detection, etc.).
- the trajectory construction interface may comprise a remote system for determining the direction of the gaze: the direction of the gaze of the user is determined by an external system, for example one or more cameras fixed on a support (handlebars or dashboard of a vehicle, for example). This system for determining the direction of the gaze can then complement or replace the on-board system integral with the user (compass, accelerometer, inertial unit, angular position detection, etc.).
- the trajectory construction interface may comprise an obstacle indication system: a fixed or mobile obstacle detection system (radar, ultrasonic detection system, cameras) is coupled to a haptic belt that indicates to the user in real time the presence of obstacles.
- This belt may be provided with one or more haptic information transmission devices.
- these haptic information transmission devices indicate the proximity of the obstacles by frequency variation, and possibly their relative direction, with a precision which depends on the number of haptic information transmission devices of the fixed or mobile obstacle detection system.
- the trajectory construction interface may comprise an unplanned obstacle indication system: a fixed or mobile obstacle detection system (radar, ultrasound detection system, cameras) is coupled to a sound signal management system. In the event of detection of at least one fixed or mobile obstacle, the system indicates by sound the presence, the distance and the direction of the obstacle (s) detected.
- the trajectory construction interface may comprise a video system in front of the eyes of the user.
- This system superimposes the environment with additional information to assist the user in his move.
- the nature and position of the information displayed depend on the parameters processed by the interface (position of the head, position of the user, speed, etc.).
- this video system can be used to display a virtual guide for use by a visually impaired person in speed events (alpine skiing, running, cycling), or to display information such as speed or the proximity of a particular point, a limit, in addition to sound and haptic data, thanks to color codes or strobe effects.
- the path construction interface may include a radio system for determining the distance to a target.
- the target is equipped with an active radio transmitter system.
- the user is equipped with a reception system which detects the strength of the signal in reception.
- the distance to the target is determined by measuring the received signal, which is a function of the distance to the source. There may exist one or more targets: the targets are identified thanks to a signal of their own.
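One common way to turn received signal strength into a distance is the log-distance path-loss model; the application does not name a model, so this is a sketch under that assumption (the reference power at 1 m and the path-loss exponent would have to be calibrated for the actual transmitter):

```python
def distance_from_rssi(rssi_dbm, rssi_at_1m_dbm=-40.0, path_loss_exp=2.0):
    """Estimated distance in metres from received power, using
    d = 10 ** ((P_1m - P_rx) / (10 * n))."""
    return 10.0 ** ((rssi_at_1m_dbm - rssi_dbm) / (10.0 * path_loss_exp))
```

With an exponent of 2 (free space), every 20 dB of attenuation corresponds to a tenfold increase in distance.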
- a radar worn by the user can be used to detect the objects of his environment.
- Figure 1 is a block diagram of a path construction interface according to the present invention
- Figure 2 is a perspective view of a haptic head tool of the trajectory construction interface according to a preferred embodiment of the present invention
- Figure 2A is an enlargement of Figure 2 on the primary door own azimuth haptic indication tool of the haptic head tool;
- Figure 2B is an enlargement of Figure 2 on the primary door limit position indication haptic tool of the haptic head tool
- Figure 3 is a perspective view of a haptic torso tool of the trajectory construction interface according to a preferred embodiment of the present invention
- Figure 4 is a perspective view of a haptic arm tool of the trajectory construction interface according to a preferred embodiment of the present invention
- Figure 4A is a sectional view of the haptic arm tool of Figure 4 at the wrist;
- Figure 5 is a schematic view of an exemplary environmental trajectory according to the present invention.
- Figure 6 is a perspective view of a user in a vehicle equipped with the path construction interface according to a second preferred embodiment of the present invention.
- An environment and trajectory construction interface assembly includes a trajectory construction interface 1 and an environment (not shown in FIG. 1), a user of the trajectory construction interface 1 traveling on a trajectory in the environment, the environment comprising at least one object, fixed or mobile in the environment.
- the trajectory construction interface 1 comprises a memory 2 in which is stored a cartography 2a of the environment, a user position determination means 3, a user's head azimuth determination means 4 and an accelerometer 5.
- said accelerometer may advantageously be replaced or supplemented by an inertial unit, without departing from the scope of the present invention.
- the memory 2 may be in particular one of a random access memory, a read only memory, a volatile memory or a flash memory.
- the user position determination means 3 comprises a GPS 3a and several IR cameras 3b.
- the GPS 3a makes it possible to determine in real time the GPS position and the speed of the user in the environment by defining the three-dimensional position of the user in GPS coordinates, the GPS 3a being high-frequency and high-precision, preferably of the order of 5 cm.
- the IR cameras 3b make it possible to locate IR objects in the environment in order to determine in real time the position of the user with respect to these IR objects, the IR objects being or not in the cartography 2a, the IR objects being able to emit data intrinsic to the environment (e.g. boundaries, signs or messages) or anomaly data (e.g. a static obstacle or a moving object).
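The speed of the user can, for example, be derived from two successive three-dimensional fixes; the application does not specify how the GPS 3a computes speed, so this is a minimal illustrative sketch:

```python
def gps_velocity(fix_a, fix_b, dt_s):
    """Velocity vector in m/s from two 3-D positions (metres, local frame)
    taken dt_s seconds apart."""
    if dt_s <= 0:
        raise ValueError("dt_s must be positive")
    return tuple((b - a) / dt_s for a, b in zip(fix_a, fix_b))
```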
- the GPS 3a and the IR cameras 3b make it possible to obtain a redundancy of position information of the user, the whole being thus more secure.
- the user position determination means 3 could comprise only a GPS 3a or only IR cameras 3b, without departing from the scope of the present invention.
- the user position determination means 3 could also include a Galileo or Glonass system in place of the GPS 3a, without departing from the scope of the present invention.
- the cameras could also be non-IR, without departing from the scope of the present invention, the cameras then being able to locate shapes or colors of objects of the environment.
- the user's real azimuth determination means 4 comprises an electronic compass 4a, the electronic compass 4a being disposed on the head of the user and making it possible to measure in real time the absolute orientation of the user's head.
- said electronic compass 4a can advantageously be replaced or supplemented by an inertial unit, without departing from the scope of the present invention.
- the accelerometer 5 makes it possible to measure in real time the three-dimensional and angular accelerations of the user.
- trajectory construction interface 1 may not include an accelerometer 5, without departing from the scope of the present invention.
- the trajectory construction interface 1 further comprises real-time calculation means 6, said real-time calculation means 6 being connected to the memory 2, to the position determination means 3, to the real azimuth determination means 4 and to the accelerometer 5 in order to receive their respective measured information.
- the real-time calculation means 6 may be in particular one of a microprocessor, a microcontroller, an embedded system, an FPGA or an ASIC.
- the real-time calculation means 6 compiles the information from the memory 2, the position determination means 3, the real azimuth determination means 4 and the accelerometer 5 in order to determine in real time the current three-dimensional trajectory of the user in the cartography 2a of the environment, by calculating the orientation of the head of the user (via the electronic compass 4a), the user's speed (via the GPS 3a), the absolute position of the user in the environment (via the GPS 3a), the relative position of the user in the environment (via the IR cameras 3b), and the three-dimensional and angular accelerations of the user (via the GPS 3a and the accelerometer 5), the coherence of the information being checked using the redundancy of the information sources.
- the trajectory construction interface 1 further comprises an object position haptic stimulation indication means 7, an object own azimuth haptic stimulation indication means 8, an object distance indication means 9 and an environment limit indication means 10.
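The coherence check on the redundant position sources can be sketched as a simple distance test between the GPS position 3a and the camera-derived position 3b; the tolerance value and the function name are illustrative assumptions:

```python
import math

def positions_consistent(gps_pos, camera_pos, tolerance_m=0.5):
    """True if the two redundant position estimates agree within
    the given tolerance in metres."""
    return math.dist(gps_pos, camera_pos) <= tolerance_m
```

A disagreement beyond the tolerance would signal that one of the sources is faulty or that the map is out of date.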
- path construction interface 1 might not include object distance indication means 9 and environmental limit indication means 10, without departing from the scope of this invention.
- the means of indication by haptic stimulation of object position 7 is controlled by the real-time calculation means 6 according to the map 2a stored in the memory 2, user position information from the position determination means 3, real user azimuth information from the real azimuth determination means 4 and/or acceleration information from the accelerometer 5, the means of indication by haptic stimulation of object position 7 haptically indicating to the user the position of at least one object in the environment.
- the object azimuth haptic stimulation indication means 8 is controlled by the real-time calculation means 6 as a function of the map 2a stored in the memory 2, of user position information from the position determination means 3, actual user azimuth information from the real azimuth determination means 4 and / or acceleration information from the accelerometer 5, the indication means by haptic stimulation of the object's own azimuth 8 haptically indicating to the user the azimuth of at least one object in the environment.
- the object distance indication means 9 is controlled by the real time calculation means 6 as a function of the map 2a stored in the memory 2, of the user position information from the position determination means. 3, actual user azimuth information from the actual azimuth determination means 4 and / or acceleration information from the accelerometer 5, the object distance indicating means 9 haptically or audibly indicating to the user the distance of at least one object in the environment from the user.
- the environment limit indication means 10 is controlled by the real time calculation means 6 according to the map 2a stored in the memory 2, user position information from the position determination means 3, real user azimuth information from the real azimuth determination means 4 and / or acceleration information from The accelerometer 5, the environmental limit indicating means 10 haptically or audibly indicating to the user the distance of environmental limits from the user.
- the haptic object position indication means 7 is a haptic tool positioned on a part of the user's body, and the haptic object-specific azimuth indication means 8 is another haptic tool positioned on a part of the user's body.
- the environmental limit indication means 10 is a haptic tool positioned on a part of the body of the user and / or a sound tool.
- the user can thus do without the sense of sight to move in the environment, the information concerning the trajectory he must follow in the environment being transmitted to him haptically or audibly, the user knowing in real time the position of the next object to cross as well as the direction of passage through that object.
- the trajectory construction interface 1 can either be carried entirely by the user, for example when the user is moving while walking or skiing, or be carried jointly by the user and a vehicle driven by the user, for example when the user is driving a car, a motorcycle, etc.
- an object of the environment that the user must cross is a primary trajectory door, said primary door comprising a left limit, a right limit and a specific azimuth, the azimuth specific to the primary door corresponding to the direction in which the user passes through the primary door.
- another object of the environment that the user must then cross is a secondary trajectory door, corresponding to the passage point following that of the primary door, the secondary door becoming the new primary door once the previous primary door has been crossed.
- the environment preferably comprises a left boundary border and a right boundary border.
- other objects of the environment are landmarks, positioned beforehand on the map 2a or on the fly in the environment.
- FIGS. 2A and 2B show a haptic head tool 11 of the trajectory construction interface 1 according to a preferred embodiment of the present invention.
- the haptic head tool 11 comprises a haptic tool for indicating the primary azimuth of the primary door 12, a haptic tool for indicating the primary door limit position 13 and two IR cameras 13a, 13b.
- the haptic primary door specific azimuth indication tool 12 has a strip 14 disposed around the circumference of the user's skull, the strip 14 having a plurality of vibrating cells 15 uniformly disposed around the strip 14, one of the vibrating cells 15 being vibrated to indicate to the user the azimuth of the primary door relative to the orientation of the user's head.
- the haptic primary door specific azimuth indication tool 12 could also be composed of a matrix of haptic pointers, such as a hood having several rows and several columns of vibrating cells, without departing from the scope of the present invention.
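Selecting which of the uniformly spaced vibrating cells 15 to drive reduces to quantizing the door azimuth relative to the head orientation; a minimal sketch, assuming twelve cells and degree-valued azimuths (both illustrative assumptions):

```python
def cell_for_azimuth(door_azimuth_deg, head_azimuth_deg, n_cells=12):
    """Return the index of the vibrating cell to activate: cell 0 faces
    straight ahead, indices increase clockwise around the strip.
    Twelve cells is an illustrative assumption."""
    relative = (door_azimuth_deg - head_azimuth_deg) % 360.0
    width = 360.0 / n_cells  # angular sector covered by each cell
    return int((relative + width / 2.0) // width) % n_cells
```

Re-evaluating this on every compass update keeps the active cell locked on the door as the head turns.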
- the primary door limit position indication haptic tool 13 comprises a rigid gantry 16, located on the top of the user's skull, on which is fixed a flexible slide 17, the gantry 16 also being fixed to the strip 14 at two opposite sides.
- the positions of the left and right limits of the primary door are respectively defined by movable pointers 18, 19 sliding on the slide 17, the pointers 18, 19 always being in contact with the skull of the user.
- the IR cameras 13a, 13b are fixed on the top of the gantry 16 and turned in the direction of the user's gaze, the IR cameras 13a, 13b being designed to detect IR objects in the environment in order to inform the real-time calculation means 6.
- the position of the gantry 16 can be adjusted for the comfort of the user, in particular its angular position about the axis crossing the skull transversely (right ear to left ear).
- the tension of the slide 17 is also adjustable.
- the haptic head tool 11 can also be integrated into a helmet that can be attached to the user's head.
- the strip 14 comprises a body support 14a, such as a helmet, on which is fixed a semi-rigid support membrane 14b, the membrane 14b bearing the vibrating cells 15 in contact with the user's skull, one of the vibrating cells being vibrated to indicate the azimuth of the primary door to the user.
- a guide wire 20 is also attached to the gantry 16, parallel to the slide 17.
- the movable pointer 18 comprises a frame 18a on which a contact wheel 18b is rotatably mounted, the contact wheel 18b always being in contact with the skull of the user, the frame 18a being fixedly connected to the guide wire 20 and slidably connected to the slide 17.
- an electric motor 21 is also fixed on the gantry 16, said electric motor 21 making it possible to move the guide wire 20 parallel to the slide 17, which moves the movable pointer 18 along the slide 17 in order to indicate to the user a contact point corresponding to the position of one of the primary door limits relative to the user's head.
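The target position of the movable pointer 18 on the slide 17 can be derived from the bearing of the door limit relative to the head; a sketch under assumptions, where the angular span of the slide and its length are illustrative values not given in the patent:

```python
def pointer_position_mm(limit_bearing_deg, head_azimuth_deg,
                        slide_span_deg=120.0, slide_length_mm=200.0):
    """Map a door-limit bearing (relative to the head orientation) onto a
    linear position along the slide: 0 mm is the left end, slide_length_mm
    the right end. Bearings outside the span are clamped to the ends."""
    relative = (limit_bearing_deg - head_azimuth_deg + 180.0) % 360.0 - 180.0
    half = slide_span_deg / 2.0
    relative = max(-half, min(half, relative))  # clamp to the slide's span
    return (relative + half) / slide_span_deg * slide_length_mm
```

The motor 21 would then servo the guide wire until the pointer reaches the computed position, one such computation per door limit.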
- the electric motor 21 associated with the mobile pointer 18 could also be secured to the frame 18a of the moving pointer 18, or connected to the frame 18a by a chain or belt transmission system, without departing from the scope of this invention.
- the primary door limit position indication haptic tool 13 also includes (but not shown) an additional guide wire and an additional electric motor associated with the moving pointer 19 in order to move it on the slide 17 to indicate to the user a contact point relative to the position of the other of the primary door boundaries with respect to the user's head.
- the primary door limit position indication haptic tool 13 could also be a system of guided rotating rings each carrying a haptic pointer, without departing from the scope of the present invention.
- the portion of the movable pointer 18, 19 which is in contact with the user must not be braked or blocked by the user's hair; to this end:
- the pointer 18, 19 is mounted on an adjustable damping system (spring, pneumatic, hydraulic, etc.) to ensure a permanent and comfortable contact pressure for the user.
- this suspension can also be provided via the connection between the slide 17 and the device attached to the user's body (for example, a helmet).
- the haptic tools 12 and 13 could also be arranged on other parts of the user's body, such as the neck, torso, arms and/or legs, without departing from the scope of the present invention.
- the haptic tools 12 and 13 could also be composed of a matrix of haptic pointers, such as a hood having several rows and several columns of vibrating cells in the case where the tools are arranged on the user's head, without departing from the scope of the present invention.
- the haptic tool 12 could also be a pointer in contact with a part of the body of the user sliding on a slideway, mini-actuators operated by electromagnetism, air or liquid in order to be in contact with a part of the body of the user, or a pressure point operated by inflation of pockets, without departing from the scope of the present invention.
- haptic tool 13 could also be vibrating cells in contact with a part of the user's body, mini-actuators actuated by electromagnetism, air or liquid in order to be in contact with a part of the body of the user, or pressure points operated by inflation of pockets, without departing from the scope of the present invention.
- FIG. 3 shows a haptic torso tool 22 of the trajectory construction interface 1 according to the preferred embodiment of the present invention.
- the haptic torso tool 22 is a haptic tool for indicating the secondary door position.
- the haptic torso tool 22 has a strip 23 arranged around the circumference of the user's torso, the strip 23 comprising several vibrating cells 24 disposed uniformly around the strip 23, one of the vibrating cells 24 being vibrated to indicate to the user the position of the secondary door relative to the orientation of the user's head.
- the haptic torso tool 22 may also be attached to a body support such as a belt or plastron.
- the position of the secondary door could also be indicated with a type of haptic tool different from that which indicates the specific azimuth of the primary door, in order to allow the user to distinguish the information more easily, without departing from the scope of the present invention.
- the haptic torso tool 22 could also be a pointer sliding on a slideway in contact with the user's torso, mini-actuators operated by electromagnetism, air or liquid in contact with the user's torso, or a pressure point operated by inflation of pockets, without departing from the scope of the present invention.
- the position of the secondary door could be indicated by a haptic tool on a part of the body other than the torso, without departing from the scope of the present invention.
- since the position of the secondary door and the azimuth of the primary door are both indicated by a haptic tool with vibrating cells, it is possible to distinguish the two types of information by the size of the vibrating contact surface (vibrating plate with a larger surface, or vibrating segment) or by the vibration frequencies.
- the haptic arm tool 25 is a haptic tool for indicating the distance to the right boundary border of the environment, the haptic arm tool 25 being disposed on the right arm of the user.
- an identical but symmetrical haptic arm tool is also disposed on the user's left arm, as a haptic tool for indicating the distance to the left boundary border of the environment.
- One of the vibrating cells 27, 28, 29a, 29b, 29c is vibrated to indicate in real time the user's position relative to the right boundary of the environment border.
- the vibrating cells 29a, 29b, 29c accurately indicate the proximity of the right boundary, the vibration of the vibrating cell 29a indicating that the user is close to the right boundary, the vibration of the vibrating cell 29b indicating that the user is almost on the right boundary, and vibration of the vibrating cell 29c indicating that the user is on the right boundary.
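The graded indication by cells 29a, 29b and 29c amounts to thresholding the computed distance to the boundary; the threshold values below are illustrative assumptions, not values given in the patent.

```python
def boundary_cell(distance_m, thresholds_m=(5.0, 2.0, 0.5)):
    """Return which vibrating cell to drive for the right boundary border:
    None when far away, '29a' when close, '29b' when almost on it,
    '29c' when on the boundary. Thresholds are illustrative."""
    far, near, on = thresholds_m
    if distance_m > far:
        return None
    if distance_m > near:
        return '29a'
    if distance_m > near and False:  # unreachable guard removed below
        return None
    if distance_m > on:
        return '29b'
    return '29c'
```

A mirrored call with the distance to the left border would drive the symmetrical tool on the left arm.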
- the haptic arm tool 25 could also be pointers in contact with the user's arm sliding on a slideway, mini-actuators actuated by electromagnetism, air or liquid in order to be in contact with the arm of the user, or a pressure point actuated by inflation of pockets, without departing from the scope of the present invention.
- the positions of the left and right boundary borders of the environment could be indicated by a haptic tool on a body part other than the arms, without departing from the scope of the present invention.
- the environment is a driving track 30 on which the user moves, for example in a car 31, the driving track 30 having a left evolution environment limit 32 and a right evolution environment limit 33 that the user must not cross; in this example, the driving track 30 forms a hairpin turn.
- a current primary door object 34 and a current secondary door object 35 are on the driving track 30, the user having to pass successively through the primary door 34 and the secondary door 35 in order to negotiate the hairpin turn, the primary door 34 being characterized by a left limit 34a, a right limit 34b and a specific azimuth 34c, the secondary door 35 being characterized by a left limit 35a, a right limit 35b and a specific azimuth 35c, this information being indicated to the user when the secondary door 35 becomes the new primary door as soon as the user has crossed the current primary door 34.
- the primary door object 34 and the secondary door object 35 are recorded in the environment map and/or physically arranged on the driving track 30 so as to be identifiable by the cameras.
- the environment also includes a braking point marker object 36, a turning point marker object 37 and a chord point marker object 38, the braking point marker object 36 being disposed at the turn entry, the turning point marker object 37 being disposed at the primary door 34, and the chord point marker object 38 being disposed at the secondary door 35.
- the marker objects 36, 37, 38 are recorded in the map and/or physically arranged on the driving track 30 and can be identified by the cameras.
- the haptic head tool 11 indicates to the user the positions of the left and right limits 34a, 34b of the primary door 34 and the own azimuth 34c of the primary door 34.
- the haptic torso tool 22 indicates to the user the position of the secondary door 35.
- Two haptic arm tools 25 respectively indicate to the user the left evolution environment boundary 32 and the right evolution environment boundary 33.
- a sound tool indicates to the user the crossing of the marker objects 36, 37, 38, in order to assist the user when driving the car 31 by indicating the information essential to negotiating the turn, that is to say the braking point, the turning point and the chord point.
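The promotion of the secondary door to primary door once the current primary door is crossed, as on the hairpin turn above, can be sketched as a 2D segment-intersection test; representing each door by its left and right limit points follows the description, while the function names and flat 2D coordinates are illustrative assumptions.

```python
def crossed(left, right, prev_pos, pos):
    """True when the user's displacement segment prev_pos->pos strictly
    intersects the door segment left->right (all points are 2D tuples)."""
    def side(a, b, p):  # sign of the cross product (b - a) x (p - a)
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])
    return (side(left, right, prev_pos) * side(left, right, pos) < 0
            and side(prev_pos, pos, left) * side(prev_pos, pos, right) < 0)

def advance_doors(doors, prev_pos, pos):
    """Drop the primary door (doors[0], a (left, right) pair) once crossed,
    so the former secondary door becomes the new primary door."""
    if doors and crossed(doors[0][0], doors[0][1], prev_pos, pos):
        doors = doors[1:]
    return doors
```

Running this on each position update keeps the haptic head and torso tools pointed at the current primary and secondary doors respectively.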
- the user 39 is installed in a vehicle equipped with the trajectory construction interface 1, the user being able to drive the vehicle.
- the user 39 carries a haptic head tool 40 identical to the haptic head tool 11 except that it does not have IR cameras and has an electronic compass 4a and an accelerometer 5 fixed on it.
- the electronic compass 4a and the accelerometer 5 can be combined with or included in an inertial unit; the user 39 also carries the haptic torso tool 22, the haptic arm tool 25, and a haptic arm tool 41 identical to the haptic tool 25 except that it indicates the left boundary border of the environment.
- the vehicle is equipped with two IR cameras 42a, 42b disposed on the top of the driver's seat 43 of the vehicle, and two speakers 44a, 44b respectively disposed on each side of the driver's seat 43, the IR cameras 42a, 42b being able to identify marker objects in the environment and the speakers 44a, 44b being able to indicate the marker object information to the user audibly.
- the IR cameras could be arranged on a helmet worn by the user, without departing from the scope of the present invention.
- the trajectory construction interface 1 continuously records all the data received and returned so that the user can analyze his session, this recording also serving as a "black box" in the event of an incident.
- the different subsystems are monitored by cross-checking the GPS data, the inertial unit / accelerometer data, the camera data and the mapping.
- correct mechanical operation is also monitored, in particular the correct positioning of the pointers.
- in case of inconsistency of information or malfunction of one of the subsystems, the set generates alerts and may eventually shut down.
- the set can also alert the user if his evolution leaves a predefined frame (for example, trajectory relative to speed).
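The "predefined frame" alert can be sketched as a speed-dependent tolerance envelope on the lateral offset from the reference trajectory; the envelope values and the function shape are illustrative assumptions, not taken from the patent.

```python
def trajectory_alert(speed_mps, lateral_offset_m, envelope):
    """Alert when the lateral offset from the reference line exceeds the
    tolerance allowed at the current speed. `envelope` maps a speed
    ceiling (m/s) to the widest tolerated offset (m); values illustrative."""
    for speed_ceiling in sorted(envelope):
        if speed_mps <= speed_ceiling:
            return lateral_offset_m > envelope[speed_ceiling]
    return True  # faster than any defined envelope: always alert
```

The envelope naturally tightens at higher speeds, matching the idea that a trajectory acceptable at walking pace may be unsafe when driving.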
- the autonomy of the trajectory construction interface 1 is provided by batteries, possibly rechargeable by photovoltaic cells.
- the real-time calculation means 6 is connected to the various peripherals of the interface (cameras, inertial unit / compass, haptic tools, etc.) either by electrical harness or wirelessly (Bluetooth, Wifi, etc.).
- the trajectory construction interface 1 is able to be wirelessly connected to environment-specific applications, such as a guidance system application, the trajectory construction interface 1 receiving in real time environmental information, such as environmental changes, from these specific applications.
- the user can thus direct his gaze in the right direction: the motorway exit, the street to take. This use requires only a partial use of the overall capabilities of the set (only the orientation of the head towards a point of the environment in 2D).
- the trajectory construction interface is controlled by a navigation system.
- the interface directs the user's head in real time towards the next exit or the next change of direction, or towards the lane in which to drive.
- the trajectory construction interface may in particular continue to guide the user reliably in areas not covered by a satellite tracking system (such as underground tunnels, for example).
- a reference user (guide) is associated with the user.
- the reference user constitutes a mobile primary door.
- the information transmitted to the user by the sensory information transmission means is then: the user-primary door direction, the right and left limits of the primary door, the user-primary door distance, and alerts on the environment.
- the means for recognizing the environment consist of a camera, for example an infrared camera.
- the user-primary door distance is calculated by the camera reading a mark carried by the reference user or a ground mark.
- the distance calculation can be improved by a radio or radar distance calculation system.
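The camera-based distance reading can be sketched with a pinhole-camera model applied to a mark of known physical size; the mark width and focal length below are illustrative assumptions.

```python
def mark_distance_m(mark_width_px, mark_width_m=0.25, focal_length_px=800.0):
    """Estimate the distance to a mark of known physical width from its
    apparent width in the image: distance = f * real_width / pixel_width.
    mark_width_m and focal_length_px are illustrative assumptions."""
    return focal_length_px * mark_width_m / mark_width_px
```

A radio or radar ranging measurement, when available, would simply be fused with this optical estimate to reduce error.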
- a relational database of the environment can be used, which includes a mapping of the environment, with the racing lines and the ground markings, thus making it possible to calculate a speed and/or a user-reference user distance.
- the relational database is integrated or accessed by a wireless network, conventionally.
- the distances calculated on the basis of the marks are calculated, also conventionally, by image processing.

Athletics: running in the lane without a guide
- the primary door is a virtual mobile primary door.
- the means for recognizing the environment consist of a camera, for example an infrared camera.
- a relational database of the environment can be used, which includes a mapping of the environment, with the race lines, the ground markings, thus making it possible to calculate a speed and / or a user-primary door distance.
- the relational database is integrated or accessed by a wireless network, conventionally.
- the distances calculated on the basis of the marks are calculated, also conventionally, by image processing.
- a reference user (guide) is associated with the user.
- the reference user constitutes a mobile primary door.
- the information transmitted to the user by the sensory information transmission means is then: the user-primary door direction, the right and left limits of the primary door, the specific azimuth of the primary door, the user-primary door distance, and alerts on the environment.
- the means for recognizing the environment consist of a camera, for example an infrared camera.
- the reference user is equipped with means for determining his direction of movement, for example an electronic compass.
- a reference user (guide) is associated with the user.
- the reference user constitutes a mobile primary door.
- the means for recognizing the environment consist of one or more cameras, for example of the infrared camera type.
- a relational database of the environment can be used, which includes a mapping of the environment.
- the relational database is integrated or accessed by a wireless network, conventionally.
- the distances calculated on the basis of the marks are calculated, also conventionally, by image processing.
- the user is equipped with a GPS / accelerometer / inertial unit set, making it possible, in conjunction with the relational database, to determine his position, speed, acceleration and azimuth.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1459619A FR3026638B1 (fr) | 2014-10-07 | 2014-10-07 | Ensemble environnement et interface de stimulation tactile de guidage sur une trajectoire dans l'environnement |
PCT/FR2015/052635 WO2016055721A2 (fr) | 2014-10-07 | 2015-10-01 | Interface de construction de trajectoire dans un environnement et ensemble environnement et interface de construction de trajectoire |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3204722A2 true EP3204722A2 (fr) | 2017-08-16 |
Family
ID=52021263
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP15798152.3A Withdrawn EP3204722A2 (fr) | 2014-10-07 | 2015-10-01 | Interface de construction de trajectoire dans un environnement et ensemble environnement et interface de construction de trajectoire |
Country Status (5)
Country | Link |
---|---|
US (1) | US10507157B2 (fr) |
EP (1) | EP3204722A2 (fr) |
CA (1) | CA2963058A1 (fr) |
FR (1) | FR3026638B1 (fr) |
WO (1) | WO2016055721A2 (fr) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106671084B (zh) * | 2016-12-20 | 2019-11-15 | 华南理工大学 | 一种基于脑机接口的机械臂自主辅助方法 |
JP2019148949A (ja) * | 2018-02-27 | 2019-09-05 | 株式会社 イマテック | 支援システム及び支援方法 |
DE102018212869A1 (de) * | 2018-08-01 | 2020-02-06 | Volkswagen Aktiengesellschaft | Konzept zum Vermitteln einer Richtungsangabe an einen Nutzer |
CN110428694A (zh) * | 2019-08-01 | 2019-11-08 | 深圳市机场股份有限公司 | 一种模拟驾驶的训练方法、移动终端和计算机存储介质 |
IL276989A (en) * | 2020-08-28 | 2022-03-01 | Rapoport Anatoli | Navigation aid for the blind and visually impaired |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5470233A (en) * | 1994-03-17 | 1995-11-28 | Arkenstone, Inc. | System and method for tracking a pedestrian |
FR2771626B1 (fr) * | 1997-12-01 | 2000-05-26 | Fabien Beckers | Procede et systeme permettant aux personnes non ou mal voyantes de s'orienter et de se diriger dans un environnement inconnu |
GB2428927A (en) * | 2005-08-05 | 2007-02-07 | Hewlett Packard Development Co | Accurate positioning of a time lapse camera |
US20080120029A1 (en) | 2006-02-16 | 2008-05-22 | Zelek John S | Wearable tactile navigation system |
US20130218456A1 (en) | 2006-02-16 | 2013-08-22 | John S. Zelek | Wearable tactile navigation system |
US8600800B2 (en) * | 2008-06-19 | 2013-12-03 | Societe Stationnement Urbain Developpements et Etudes (SUD SAS) | Parking locator system including promotion distribution system |
TW201106941A (en) | 2009-08-28 | 2011-03-01 | Univ Nat Taiwan | Electronic blind-navigation device and corresponding electronic blind-navigation cane |
US8457844B2 (en) * | 2010-04-12 | 2013-06-04 | Delphi Technologies, Inc. | Parallel parking assistant system and method thereof |
US8995678B2 (en) | 2010-04-30 | 2015-03-31 | Honeywell International Inc. | Tactile-based guidance system |
US20120176525A1 (en) * | 2011-01-12 | 2012-07-12 | Qualcomm Incorporated | Non-map-based mobile interface |
US8752851B2 (en) * | 2011-07-25 | 2014-06-17 | Shia-Lin Chen | Auxiliary device for bicycle |
DE102011119864A1 (de) * | 2011-12-01 | 2013-06-06 | Deutsche Telekom Ag | Verfahren zum Betrieb eines handhaltbaren elektronischen Navigationsgerätes und Navigationsgerät hierzu |
US20140309876A1 (en) * | 2013-04-15 | 2014-10-16 | Flextronics Ap, Llc | Universal vehicle voice command system |
US8744771B2 (en) * | 2012-03-26 | 2014-06-03 | Navteq B.V. | Reverse natural guidance |
US8552847B1 (en) * | 2012-05-01 | 2013-10-08 | Racing Incident Pty Ltd. | Tactile based performance enhancement system |
KR101539331B1 (ko) * | 2014-02-04 | 2015-07-28 | 고려대학교 산학협력단 | 양방향 통신 기능을 갖는 차량용 내비게이터를 이용한 주차 유도 시스템 및 그 방법 |
KR102302439B1 (ko) * | 2014-02-21 | 2021-09-15 | 삼성전자주식회사 | 전자 장치 |
US9588586B2 (en) * | 2014-06-09 | 2017-03-07 | Immersion Corporation | Programmable haptic devices and methods for modifying haptic strength based on perspective and/or proximity |
DE102014212843A1 (de) * | 2014-07-02 | 2016-01-07 | Robert Bosch Gmbh | Verfahren zur Parkplatzvermittlung und Freier-Parkplatz-Assistenzsystem |
- 2014
  - 2014-10-07 FR FR1459619A patent/FR3026638B1/fr active Active
- 2015
  - 2015-10-01 US US15/517,419 patent/US10507157B2/en active Active
  - 2015-10-01 EP EP15798152.3A patent/EP3204722A2/fr not_active Withdrawn
  - 2015-10-01 CA CA2963058A patent/CA2963058A1/fr not_active Abandoned
  - 2015-10-01 WO PCT/FR2015/052635 patent/WO2016055721A2/fr active Application Filing
Non-Patent Citations (2)
Title |
---|
None * |
See also references of WO2016055721A2 * |
Also Published As
Publication number | Publication date |
---|---|
FR3026638B1 (fr) | 2016-12-16 |
US10507157B2 (en) | 2019-12-17 |
FR3026638A1 (fr) | 2016-04-08 |
CA2963058A1 (fr) | 2016-04-14 |
US20180235833A1 (en) | 2018-08-23 |
WO2016055721A2 (fr) | 2016-04-14 |
WO2016055721A3 (fr) | 2016-06-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6740196B2 (ja) | ロボティックトレーニングシステムおよび方法 | |
WO2016055721A2 (fr) | Interface de construction de trajectoire dans un environnement et ensemble environnement et interface de construction de trajectoire | |
US10180329B2 (en) | Robotic golf caddy | |
US9963199B2 (en) | Components, systems and methods of bicycle-based network connectivity and methods for controlling a bicycle having network connectivity | |
US20180173223A1 (en) | Robotic Golf Caddy | |
CN104127302B (zh) | 一种视障人士行路安全导航方法 | |
US10453264B1 (en) | System for simulating a virtual fitness partner | |
US11710422B2 (en) | Driving analysis and instruction device | |
US20210113914A1 (en) | A gait controlled mobility device | |
EP2350565A1 (fr) | Dispositif et procede de determination d'une caracteristique d'une trajectoire formee de positions successives d'un accelerometre triaxial lie de maniere solidaire a un element mobile | |
US10429454B2 (en) | Method and system for calibrating a pedometer | |
US20170227574A1 (en) | Method and system for calibrating a pedometer | |
KR101763404B1 (ko) | 헤드 마운트 디스플레이 장치를 통해 디스플레이되는 도로 영상의 로드 정보가 헬스 자전거에 부여된 운동 시스템 | |
EP3392128A1 (fr) | Composants, systèmes et procédés de connectivité de réseau pour bicyclette et procédés pour commander une bicyclette présentant une connectivité de réseau | |
US20170319939A1 (en) | Sports apparatus for providing information | |
US10288446B2 (en) | System and method for movement triggering a head-mounted electronic device while inclined | |
US11432777B2 (en) | Sports apparatus for providing information | |
US10285649B1 (en) | Wheelchair movement measurement and analysis | |
US10197592B2 (en) | Method and system for calibrating a pedometer | |
US10527452B2 (en) | Method and system for updating a calibration table for a wearable device with speed and stride data | |
US20200134922A1 (en) | Movable body | |
US20080294341A1 (en) | Interactive route information transmitting and receiving device for two-wheel vehicle | |
WO2022228921A1 (fr) | Zones de vision croisées | |
GB2567231A (en) | A sports apparatus for providing information | |
KR20190069199A (ko) | 무인 자동차 가상체험 방법 및 시스템 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
20170508 | 17P | Request for examination filed | |
| AK | Designated contracting states | Kind code of ref document: A2. Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| AX | Request for extension of the european patent | Extension state: BA ME |
| DAV | Request for validation of the european patent (deleted) | |
| DAX | Request for extension of the european patent (deleted) | |
20181010 | 17Q | First examination report despatched | |
20200102 | 17Q | First examination report despatched | |
| RAP1 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: VAILLANT, YANNICK |
| RIN1 | Information on inventor provided before grant (corrected) | Inventor name: VAILLANT, YANNICK |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
20200714 | 18D | Application deemed to be withdrawn | |