EP3500823A1 - System for assisting navigation using visual and lateralized interfaces - Google Patents

System for assisting navigation using visual and lateralized interfaces

Info

Publication number
EP3500823A1
Authority
EP
European Patent Office
Prior art keywords
orientation
human
message
electronic device
machine interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP17768828.0A
Other languages
English (en)
French (fr)
Inventor
Philippe LECA
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ontracks
Original Assignee
Ontracks
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ontracks filed Critical Ontracks
Publication of EP3500823A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/265 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network constructional aspects of navigation devices, e.g. housings, mountings, displays
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3632 Guidance using simplified or iconic instructions, e.g. using arrows
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3667 Display of a road map
    • G01C21/367 Details, e.g. road map scale, orientation, zooming, illumination, level of detail, scrolling of road map or positioning of current position marker
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3688 Systems comprising multiple parts or multiple output devices (not client-server), e.g. detachable faceplates, key fobs or multiple output screens
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F3/1438 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display using more than one graphics controller
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00 Aspects of display data processing
    • G09G2340/14 Solving problems related to the presentation of information to be displayed

Definitions

  • the invention relates to a visual and lateralized navigation aid system usable by most people, whatever the mode of travel and the space to be covered.
  • the invention thus offers, in particular and in a non-limiting way, valuable help during a hike, a bike ride or a ski outing.
  • the invention also remains relevant for delivering a navigation aid during a journey on or in a motorized vehicle.
  • the invention can also be exploited during journeys made on the surface of the water or in the air. Moving as freely as possible in space is a constant human concern.
  • this freedom of movement does not, however, mean renouncing a certain comfort, the pursuit of performance, or a form of serenity and safety during a trip: preventing, for example, any risk of distraction during a mountain hike, or any disappointment at not having been able to reach a remarkable viewpoint for lack of knowledge of a route or a site.
  • Many solutions have been developed, from the most basic to the most sophisticated, to achieve this goal with varying degrees of success.
  • the operator of a motorized land vehicle, such as an automobile or a carrier of persons or goods, can thus consult, during a trip, a screen positioned within the cabin of said vehicle, displaying in real time a traffic plan, in the form of a dynamic road map, and delivering textual, semi-figurative and audible guidance indications along a journey.
  • to this end, a vehicle is equipped, near the dashboard, with an electronic navigation aid device, with or without a touch screen, of sufficient display size, generally on the order of ten to twenty centimeters diagonally, so that the driver can visually and comfortably follow and consult the road map updated in real time, admittedly at the cost of diverting some of his attention from driving.
  • Such equipment also includes a computer and satellite communication means for locating the vehicle in space.
  • the driver may, prior to departure, enter a desired destination; according to configuration parameters stored in the memory of said equipment and reflecting certain preferences of said driver, such as the shortest route in distance or time, the most fuel-efficient route, toll avoidance, etc., the computer of said equipment determines a relevant route and then delivers graphical and audible routing indications along the path.
  • Certain solutions, such as those described for example in documents EP 2690406 A1 and WO 00/18612, make it possible to repeat or relocate all or part of the graphic orientation information, generally delivered by a console or a central screen within a cockpit, onto side display devices on which the driver of a land vehicle thus adapted naturally rests his eyes during maneuvers, such as the mirrors.
  • Such solutions are reserved for the automobile and result in expensive rear-view mirrors dedicated to a given vehicle.
  • Such adapted mobile equipment offers a similar solution in terms of functionality, while providing a mobility allowing the user not to make his navigation aid equipment permanently resident within a given vehicle.
  • Such mobile devices generally cooperate with suitable support means ensuring the holding and positioning of said equipment close to the driver's field of vision.
  • such supports provide a reversible attachment to a dashboard air vent grille, a side window or the windshield of the vehicle, via a suction cup for example.
  • Such solutions are expensive and require sophisticated and fragile equipment.
  • there are, however, situations ill-suited to such a navigation aid device: this is for example the case for a journey on foot, by bicycle or, more generally, on a two-wheeler, motorized or not.
  • Some operators have proposed positioning a mobile device, like those mentioned above, on the frame, the handlebars or the stem of a bicycle.
  • suitable supports have thus been developed to hold, for example, a mobile phone on a bicycle, or even a dedicated mobile device of reduced size compared with what is found in an automobile.
  • this type of electronic equipment is not intended to be resistant to bad weather and falls. It can also be easily stolen by a malicious third party.
  • a hiker therefore usually prefers to store his valuable equipment in a waterproof pocket and to consult the course of his journey only during breaks, at the risk of having deviated from the selected route.
  • Other equipment, waterproof and more robust than a smart mobile phone, is available. It includes a screen capable of displaying figurative signs, such as arrows. The dimensions of said screen are modest, a few centimeters in diameter or radius, so as not to hinder the user as he moves.
  • Some equipment is similar to an alternative compass, which can be positioned in the center of the handlebars. Thus, a moving luminous point on the periphery of a circular dial indicates a relative direction.
  • a central electronic device comprising two light-emitting diodes controls the illumination of the first or second diode according to the geographical position delivered by a satellite positioning system operated by a mobile phone placed, in communication with said central electronic device, in a garment pocket provided for this purpose.
  • the light signal of each diode can be routed to the distal parts of the sleeves of the jacket.
  • such a solution is restrictive because it is tied to and integrated into the garment being worn. If the user has to remove said garment, because of weather conditions for example, or conversely cover it, for example with a second raincoat or warmer clothing, the light indications are no longer usable by the user.
  • the light signal delivered by such a garment is more than succinct, since it consists only of two light sources reduced to their simplest expression.
  • the document WO 2007/105937 discloses an electronic belt comprising a plurality of vibrating means distributed around the abdominal belt of the wearer.
  • the different vibrating means are controlled by a computer in response to indications or commands emanating from a navigation aid device cooperating with said computer.
  • the wearer of the belt can interpret orientation instructions transmitted by this or that vibrating means.
  • Such a solution is, however, cumbersome and of limited use to a sighted hiker, because it requires some learning in order to interpret precise and complex indications.
  • a hiker on an all-terrain bike may not perceive the vibrations drowned in the jolts caused by rough terrain.
  • a hiker therefore currently has no choice but to resort to navigation aid equipment ill-suited to his activity, requiring him to consult a screen that generally renders graphic indications outside his natural field of vision, at the risk of going astray and unnecessarily lengthening a route because of an inappropriate frequency of consultation, or even of falling or being injured through a loss of concentration linked to reading and/or interpreting a barely intuitive instruction while riding or route-following.
  • the invention achieves such an objective by providing a visual and lateralized navigation aid system, thus overcoming the disadvantages of the known solutions. New prospects for hiking are thus within the reach of all, whatever the mode of travel. In the rest of the document, the following notions are specified.
  • a "non-lateralized" aid may consist of a visual indication, centralized or not, delivered within said field of vision or outside of it, forcing a loss of visual tracking of the current path and requiring an interpretation by the user that is not instinctive and therefore indirect.
  • a display of an indication in the form of a menu, or in literal or figurative form, located in the center of the field of view or on one of its sides in an undifferentiated manner whatever the nature of the indication, will therefore not be considered a "lateralized" indication or aid.
  • the term "direction" is used to describe the current path taken by a user of a system according to the invention.
  • a "direction" thus covers, without distinction, the notion of trajectory, when a displacement is made in the absence of a road or physical track, and that of path as such.
  • the terms "change of direction" are used similarly and encompass the notion of deflection or modification of a path, or the selection of a path among a plurality of possible paths downstream of the current path.
  • the term "orientation setpoint" encompasses any setpoint aimed at modifying a current direction, as defined previously, that is to say a selection of a downstream path or a deflection of a trajectory, possibly including a degree or, more generally, any complementary information useful for the user to make a relevant change of direction.
  • a system according to the invention requires no structural modification of clothing or vehicle elements, or support, to use said system;
  • the invention provides a visual and lateralized navigation aid system, delivering visual and lateralized orientation indications on the left and/or on the right of a user's field of vision, comprising an electronic device and a first human-machine interface, said electronic device and first human-machine interface cooperating via respective communication means, said electronic device comprising a processing unit arranged to produce an orientation setpoint and to develop and transmit an orientation setpoint message encoding said orientation setpoint to said first human-machine interface, said first human-machine interface comprising display means and a processing unit, said display means being controlled by said processing unit of the first human-machine interface to display a first light indication in response to the reception and decoding of said orientation setpoint message.
  • said system comprises a second human-machine interface, said second human-machine interface cooperating with the electronic device and comprising display means and a processing unit, said display means being driven by said processing unit of the second human-machine interface to display a second light indication in response to the reception and decoding of an orientation setpoint message.
  • the electronic device of such a system is adapted to develop and trigger the transmission of an orientation setpoint message to the first human-machine interface or to the second human-machine interface, according to the orientation of the change of direction concerned by said orientation setpoint.
  • each of the first and second human-machine interfaces of such a system may comprise, or cooperate with, a main body arranged to direct the emission of the first and second light indications respectively to the left and to the right of the field of view of the user of the system.
  • said main body can be arranged to be fitted to one of the limbs of the user of the system.
  • each human-machine interface may consist of a bracelet that can be positioned on one of said user's wrists.
  • the electronic device can develop and trigger the transmission of an orientation setpoint message to the first human-machine interface if the change of direction concerned by said orientation setpoint induces a directional orientation to the left of said user.
  • said electronic device can develop and trigger the sending of an orientation setpoint message to the second human-machine interface if the change of direction concerned by said orientation setpoint induces a directional orientation to the right of said user.
  • the display means of the first and second human-machine interfaces may advantageously comprise first and second display spaces, driven by the processing unit of the human-machine interface concerned, to display respectively a lateralized orientation light signal and a time gauge reflecting the imminence of the change of direction indicated by said first display space.
  • the orientation setpoint message advantageously comprises a time-lapse datum of the change of direction whose value defines said imminence of the change of direction.
  • said first and second display spaces may consist respectively of two series of at least one light emitting diode.
  • the display means of the first and second human-machine interfaces may comprise a third display space, controlled by the processing unit of the human-machine interface concerned, to display a figurative symbol among a determined plurality of symbols.
  • the orientation setpoint message then includes an additional field whose value designates said figurative symbol from among said determined plurality of symbols.
  • the first and second human-machine interfaces can include means for alerting to the imminence of a change of direction, said warning means delivering information of a nature other than light and being actuated by the processing unit of the human-machine interface having received the orientation setpoint message when the imminence of the change of direction deduced from said message reaches a determined threshold.
  • the electronic device may comprise second communication means for collecting data from at least one remote satellite, the processing unit of said electronic device using said data to determine the terrestrial position of said electronic device.
  • the invention provides that the electronic device and one of the two human-machine interfaces may constitute a single physical entity.
  • said electronic device can be a smart mobile phone having a program memory cooperating with the processing unit of said electronic device and storing instructions of an application program whose execution or interpretation by said processing unit causes the implementation of a method for generating an orientation setpoint message intended for one of the two human-machine interfaces.
  • the invention also provides such a method for generating an orientation setpoint message, implemented by the processing unit of an electronic device of a visual and lateralized navigation aid system.
  • such a method comprises a step for encoding an orientation setpoint in the form of an orientation setpoint message and sending said message, which consists in developing and transmitting said orientation setpoint message to one of the two human-machine interfaces of said system according to the orientation of the change of direction concerned by the orientation setpoint.
  • the invention also relates to a computer program product comprising a plurality of program instructions which, when previously loaded into the program memory of an electronic device of such a system and then executed or interpreted by the processing unit of said electronic device, cause the implementation of a method for generating an orientation setpoint message as discussed above.
  • FIG. 1 describes an exemplary architecture of a navigation aid system according to the invention comprising two human-machine interfaces communicating with an intelligent mobile phone adapted by the implementation of an appropriate application;
  • FIG. 2 describes a functional example of a method for restoring an orientation indication implemented by one of the man-machine interfaces of a navigation aid system according to FIG. 1;
  • FIG. 3 describes an exemplary method for generating setpoint messages implemented by an electronic device of a navigation aid system according to the invention, said device possibly being the intelligent mobile telephone described in connection with Figure 1 or, alternatively, being integrated into one of the man-machine interfaces also described in Figure 1;
  • FIG. 4A is a schematic view of an exemplary embodiment of means for displaying lateralized visual indications of a human-machine interface of a navigation aid system according to the invention;
  • FIG. 4B describes several non-exhaustive examples of figurative information optionally displayed by a human-machine interface of a navigation aid system according to the invention;
  • FIG. 5 presents a three-dimensional view of one of the two human-machine interfaces of a navigation aid system according to the invention in the form of a communicating electronic bracelet.
  • Figure 1 shows a simplified architecture of a first embodiment of a visual and lateralized navigation aid system according to the invention.
  • Such a system comprises two man-machine interfaces 10L and 10R, for example in the form of two communicating electronic bracelets such as that described later in connection with FIG. 5.
  • the two human-machine interfaces 10L and 10R are similar in their architecture and operation.
  • Figure 1 presents, for the sake of simplification, a detailed view of the means included in the man-machine interface 10L.
  • This comprises a processing unit 11, in the form of one or more computers or microcontrollers cooperating by means of communication buses, symbolized by double arrows in FIG. 1, with a program memory 13 and a data memory 12.
  • Said memories 12 and 13 may constitute one and the same physical entity or be physically or logically dissociated.
  • the program memory 13 is intended to store, in particular, the instructions of a computer program product P10.
  • Said program instructions are interpretable or executable by the processing unit 11 and cause the implementation of a method for restoring an orientation indication IL, IR or IC, such as the method 100 described later in connection with FIG. 2.
  • the processing unit 11 also cooperates with first communication means 14 responsible for providing a communication C1, advantageously but not exclusively wirelessly, with a remote electronic device 20.
  • Such a communication C1 can be implemented, for example, by relying on a proximity wireless communication protocol such as Bluetooth, ZigBee, iBeacon or any other equivalent protocol or technology.
  • Alternatively, a communication C1 can be implemented in a wired manner, via a USB (Universal Serial Bus) link for example.
  • a human-machine interface 10L or 10R furthermore comprises display means 15, in the form of one or more screens whose sizes are deliberately reduced to their simplest expression.
  • Such display means 15 may consist, as non-limiting examples, of two or even three display spaces.
  • the first display space 15-1 may consist of one or more light-emitting diodes, or of a screen, advantageously flexible to reduce the risk of breakage in the event of a fall, for example a liquid-crystal screen, or any other equivalent.
  • Figure 4A describes more specifically, and in conjunction with FIG. 2, an exemplary embodiment of such a first display space 15-1 in the form of a row of five light-emitting diodes. Such a diode is symbolized by a square with rounded edges: when it is extinguished, said symbol is white; when said diode is illuminated, said symbol is black.
  • the main function of this first display space 15-1 is to deliver a first lateralized orientation light signal IL to the user U.
  • when said user U wears the two human-machine interfaces 10L and 10R respectively on his two wrists, for example when said human-machine interfaces 10L and 10R consist of two separate electronic bracelets such as that described in FIG. 5, said user U immediately and instinctively knows that he must take a direction to his left when the first display space 15-1 of the interface 10L positioned on his left wrist delivers a light signal IL. Conversely, the user knows that he must take a direction to his right when the first display space 15-1 of the human-machine interface 10R, which is positioned on his right wrist, delivers such a light signal IR.
  • the light information IL or IR is instantly taken into account by the user because the latter, for example if he is a cycling hiker, has his forearms in his field of vision at all times, his hands resting on his handlebars.
  • the information delivered by the human-machine interfaces 10L and 10R is therefore present in said field of view.
  • the light signal IL or IR being lateralized (that is to say, only one of the two human-machine interfaces 10L and 10R delivers such a light signal at a given instant, apart from the reporting of a guidance incident), the brain of the user U instantly assimilates such information, in a completely intuitive way and without any mental effort.
  • a man-machine interface 10L or 10R comprises a second display space 15-2.
  • This may be similar in nature to the first display space 15-1 described previously, as shown by way of non-limiting example in Figure 4A.
  • Such a second display space 15-2 may thus consist of a row of light-emitting diodes or a screen. Its function, however, is that of a time gauge reflecting the imminence of the change of direction indicated by the first display space 15-1.
  • the more said time gauge is lit, that is to say the more light-emitting diodes are lit, the more imminent the change of direction.
  • Figure 4A thus shows the light information IL or IR at three distinct instants t1, t2 and t3.
  • At instant t1, the human-machine interface 10L or 10R concerned does not deliver any light information.
  • the two display spaces 15-1 and 15-2 then respectively consist of two rows of five extinguished light-emitting diodes.
  • At instant t2, the first display space 15-1 delivers clear light information (the five diodes constituting it are lit).
  • the user then knows that a change of direction is near.
  • the time gauge indicates (two of the five diodes are lit) that said change should occur in n seconds, for example sixteen seconds.
  • a third diode would be lit four seconds later for example and so on.
  • At instant t3, the five diodes of the time gauge are lit, thus warning the user U that he must make the change of direction: to his left, if it is the human-machine interface 10L, positioned on the left of his field of vision, which delivers this light information IL, or conversely to his right, if it is the human-machine interface 10R, positioned on the right of said field of vision, which delivers said visual information IR.
  • Any other method or technique could alternatively be implemented to materialize and display such a time gauge. It is the same for the durations associated with said gauge.
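  • By way of illustration only, the mapping between the time remaining before a change of direction and the filling of such a time gauge can be pictured as in the following sketch; the five-diode row, the four-second step and the function name are assumptions chosen to reproduce the sixteen-second example given above, not a specification taken from the patent.

```python
# Hedged sketch: filling of the time gauge 15-2 as a function of the time
# remaining before the change of direction. The five-diode row and the
# four-second step are assumptions chosen to match the example above
# (sixteen seconds -> two diodes lit, one more diode every four seconds).

GAUGE_DIODES = 5     # diodes composing the second display space 15-2
TIME_STEP_S = 4.0    # assumed "time gradient" of the gauge (user-adjustable preference)

def lit_diodes(seconds_remaining: float) -> int:
    """Return how many diodes of the time gauge should be lit."""
    if seconds_remaining < 0:
        return GAUGE_DIODES
    lit = GAUGE_DIODES + 1 - int(seconds_remaining // TIME_STEP_S)
    return max(0, min(GAUGE_DIODES, lit))

if __name__ == "__main__":
    for s in (24, 20, 16, 12, 8, 4, 0):
        print(f"{s:>2} s before the turn -> {lit_diodes(s)} diode(s) lit")
```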
  • if the hiker is a pedestrian, for example, it will be possible to determine, that is to say to parameterize, the time gradient of the gauge, for example by soliciting an input means, such as a push button 18 or a microphone or any other equivalent means, for transmitting parameters Pu to the processing unit 11.
  • Such preferences Pu are recorded in the data memory 12.
  • the user preferences Pu may also emanate from the electronic device 20 in communication with the two human-machine interfaces 10L and 10R, as will be described later.
  • the invention provides, as with the time gauge described in connection with the second display space 15-2, that the first space 15-1 can describe a directional gauge.
  • the more said directional gauge is illuminated (that is to say, the greater the number of lit light-emitting diodes composing, by way of non-limiting example, said first display space), the more marked the change of direction must be with regard to the current direction.
  • Other tests have, however, shown a loss of intuitive and instinctive perception of such information IL or IR with this variant.
  • a preferred embodiment of the display means 15 may consist in providing a third, optional display space 15-3, preferably in the form of a screen capable of displaying figurative symbols, such as the symbols S1 to S5 illustrated in Figure 4B as non-limiting and non-exhaustive examples.
  • the first display space 15-1 remains basic and Boolean, that is to say a simple and straightforward light signal (all the diodes are off in the absence of signaling or on to indicate a change of direction), and this information is completed by the optional display of a complementary symbol by the third display space 15-3.
  • FIG. 4B thus presents some non-exhaustive examples of symbols whose display can be triggered by the processing unit 11 of a human-machine interface according to the invention, such as those 10L and 10R described in connection with the figures 1 and 5.
  • a first example of a symbol, referenced S1 in FIG. 4B, indicates that the user should veer slightly to the right with respect to the current trajectory.
  • a second example S2 simply instructs the user U to turn right.
  • the symbol S3 may indicate that the said change of direction on the right of the user must be clearly marked.
  • a fourth example of a figurative symbol S4 may indicate a near U-turn to be made via the right.
  • FIG. 4B also describes a fifth example of a symbol S5 in the form of a number, in this case the digit '2'.
  • such a symbol S5, used instead of the figurative symbols S1 to S4 described above, can be useful to indicate the rank of an exit of a roundabout, for example.
  • said symbol S5 describing the digit '2' may thus indicate to take the second exit.
  • the invention also provides, to increase the lateralization of the visual information delivered to a user, that the display means 15 of the first 10L and second 10R human-machine interfaces, respectively intended to be positioned on the left and on the right of the field of view of said user, also have an axial symmetry.
  • thus, the third display space 15-3 is arranged on the left of said display means 15 of the human-machine interface 10L and on the right of those of the interface 10R. The same applies to the directions of virtual "filling" of the time gauges 15-2, or even of the directional gauges 15-1. Any other ergonomic arrangement of said display means 15 enhancing the intuitiveness of the lateralized visual information IL and IR perceived by the user could alternatively or additionally be considered.
  • the invention makes it possible to deliver timely, complete and relevant information to the user to guide him in his journey.
  • the latter perceives, in his field of vision, that is to say without necessarily taking his eyes off his current direction and thus without loss of concentration, contrary to what previous solutions impose, a clear item of information IL or IR linked to a change of direction, said information being lateralized and therefore directly and instinctively taken into account by the user, supplemented by a time gauge indicating the imminence of said change of direction, or even by additional information in the case of multiple possible routes, so as to avoid any navigation error.
  • said display means 15 may consist only of two rows of a few light-emitting diodes, offering a particularly economical and robust solution.
  • the human-machine interfaces 10L and 10R may further comprise means 16 for alerting to the imminence of a change of direction, complementary to the display means 15.
  • the nature of such complementary alert means 16 is chosen so that they deliver a signal IC of a nature other than light, for example audible or, preferably, vibratory.
  • Such means 16 may therefore consist of a loudspeaker, a beeper (or buzzer) or a moving-mass device such as a mobile-phone vibrator. Said means 16 cooperate with the processing unit 11, which controls them electronically. Said warning means 16 may be actuated by the processing unit 11, for example when the first display space 15-1 is activated, thus at the earliest, and then again when the time gauge 15-2 indicates the instant at which to effect said change of direction, such as the time t3 described in connection with Figure 4A.
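  • The threshold-based actuation of such warning means 16 can be sketched as follows; the threshold value, the pulse duration and the vibrate() placeholder standing in for the actual electronic control are assumptions, not elements specified by the patent.

```python
# Hedged sketch: the warning means 16 (vibrator or buzzer) are actuated only
# when the imminence of the change of direction reaches a determined
# threshold. The threshold (4 s), the pulse duration ("for example a second")
# and the vibrate() placeholder are illustrative assumptions.

ALERT_THRESHOLD_S = 4.0   # assumed threshold stored in the data memory 12
PULSE_DURATION_S = 1.0    # assumed actuation duration of the means 16

def vibrate(duration_s: float) -> None:
    """Placeholder for the electronic control of the warning means 16."""
    print(f"[means 16] vibrating for {duration_s} s")

def maybe_alert(seconds_remaining: float, already_alerted: bool) -> bool:
    """Fire one alert pulse per setpoint message once the turn is imminent."""
    if not already_alerted and 0 <= seconds_remaining <= ALERT_THRESHOLD_S:
        vibrate(PULSE_DURATION_S)
        return True
    return already_alerted

if __name__ == "__main__":
    alerted = False
    for s in (16, 12, 8, 4, 0):   # simulated countdown towards the turn
        alerted = maybe_alert(s, alerted)
```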
  • a human-machine interface 10L or 10R may include a source or reserve of electrical energy, such as one or more batteries, sized to ensure the operation of electronic components constituting said interface.
  • Each human-machine interface 10L and 10R is thus autonomous and offers a decentralized functional architecture of great modularity, advantageously having a direct link with an electronic device 20, as described below, responsible for sending the orientation setpoint messages.
  • Each human-machine interface 10L or 10R may be, depending on the embodiment, linked to or totally independent of the other human-machine interface of the pair. The failure of one may then have no consequence for the second.
  • each human-machine interface 10L or 10R can cooperate with, or be integrated into, a support means 10S adapted to the situation, sport or leisure activity concerned.
  • such a support means 10S will in particular be designed to cooperate with, or enclose, all or part of a human body member, for example a wrist or, more generally, a distal part of a limb.
  • said support means 10S, advantageously separate from a housing as such, not shown in the figures, which would be responsible for housing and grouping the electronic elements of said human-machine interface 10L or 10R, makes it possible to use the same human-machine interface in different configurations, that is to say worn like a watch via a bracelet, positioned on the top of a wrist, on the inside or on the periphery thereof, or even on the top of a hand, so as not to impede the user in his gestures and to maintain an excellent lateralized vision while said user is moving.
  • An electronic device 20 further comprises a program memory 23 and a data memory 22.
  • Said memories 22 and 23 may constitute a single physical entity or be physically or logically dissociated.
  • the memory 23 is intended to store, in particular, the instructions of an application computer program product PA.
  • Said program instructions are interpretable or executable by the processing unit 21 and cause the implementation of a method for generating an orientation setpoint message, such as the method 200 described later in connection with Figure 3.
  • the data memory 22 is arranged to record preferences of the user, preset routes, etc.
  • the data memories 22 and 12 moreover advantageously record identifiers specific to each element 20, 10L and 10R, or even secrets traditionally exploited to implement the secure and/or confidential communication C1 mentioned above. All or only part of these identifiers and/or said secrets stored in the data memories 22 and 12 can be exchanged through said configuration messages mp.
  • an electronic device 20 further comprises second communication means 25, generally known as GPS (Global Positioning System) receivers, to determine said location by triangulation from data ms emitted by satellites.
  • Said means 25 thus make it possible to establish such a remote communication C3 with a plurality of satellites, such as those referenced ST1 and ST2 in FIG. 1.
  • said communication means 25 may cooperate with other satellite systems such as, but not limited to, Galileo, GLONASS, or Beidou.
  • the communication means 24 and 25 may constitute physically separate entities from one another or constitute one and the same physical entity.
  • the invention further provides alternative embodiments of the electronic device 20 and human-machine interfaces 10L and 10R.
  • a first variant may consist of simplifying the structure of one of the two man-machine interfaces 10L or 10R.
  • one of the two man-machine interfaces may be the slave of the other.
  • the man-machine interface 10L may be responsible for receiving any message me or mp emanating from the electronic device 20.
  • if a message me concerns a change of direction to the right, said message me is propagated, that is to say repeated, by the human-machine interface 10L to the sister human-machine interface 10R, via a communication C2, advantageously wireless or alternatively wired.
  • the human-machine interfaces 10L and 10R comprise second communication means 17 cooperating with their respective processing units 11 for encoding and/or decoding secondary setpoint messages mes corresponding to the propagated setpoint messages.
  • the processing unit 11 of the slave human-machine interface can be simplified so as to be no more than a signal controller driving the display means and/or the complementary warning means 16.
  • In this case, it is the processing unit 11 of the master human-machine interface, in the example represented in FIG. 1 the human-machine interface 10L, which produces the control signals of said display means 15 and/or warning means 16 of the slave human-machine interface 10R.
  • the communication C2 as well as the second communication means 17 are then adapted and arranged accordingly.
  • the complementary warning means 16 may be present only on the master human-machine interface 10L. The same applies to the possible input means 18, likewise present only on the master human-machine interface 10L.
  • the invention also provides a fourth variant, not described in FIG. 1, according to which the implementation of all or part of the processing performed by the electronic device 20 can be carried out by the processing unit 11 of one of said human-machine interfaces 10L or 10R. The invention even provides that the electronic device 20 and one of said human-machine interfaces 10L or 10R may form one and the same physical entity. In this case, the communication C1 is reduced to its simplest expression.
  • the processing units 11 and 21 can also be merged, as can the data memories 12 and 22, or even the program memories 13 and 23 mentioned above.
  • one of said electronic bracelets, such as that described in connection with Figure 5, is then responsible for determining its current terrestrial position and for producing the orientation setpoints.
  • FIG. 2 describes a non-limiting exemplary embodiment of a method 100 for rendering an orientation indication, implemented by the processing unit 11 of a human-machine interface according to the invention, such as the human-machine interfaces 10L and 10R described with reference to FIG. 1.
  • Such a method 100 includes a first step 101, prior to any rendering of an orientation indication, for configuring the operation of said human-machine interface 10L or 10R.
  • such a step consists in taking into consideration possible preferences Pu of the user U, but also in participating in a discovery procedure between the human-machine interface implementing said method 100 and the electronic device 20, for example via one or more messages referenced mp in FIG. 1, or even with a slave human-machine interface.
  • This step 101 makes it possible in particular to determine, and to write into the data memory 12, information characterizing the fact that the human-machine interface 10L or 10R whose processing unit 11 implements said method 100 is intended to be positioned on the left or on the right of the field of view of the user U.
  • the method 100 described in FIG. 2 then comprises an iterative processing, triggered by the reception, in a step 102, of an orientation setpoint message me.
  • Such a message advantageously includes several informative fields, such as an identifier of the human-machine interface that is the recipient of and concerned by said setpoint message, an orientation datum whose predetermined value designates a particular change of direction from a set of possible pre-established changes, a time-lapse datum of the change of direction, or even a complementary and optional datum.
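  • Purely as an illustration, such a setpoint message me carrying the informative fields listed above could be laid out as in the following sketch; the field names, the byte layout and the numeric codes are assumptions, the patent not specifying a concrete encoding.

```python
import struct
from dataclasses import dataclass
from typing import Optional

# Hedged sketch of an orientation setpoint message "me" carrying the fields
# listed above. Field names, the 5-byte layout and the numeric codes are
# illustrative assumptions; the patent does not define a concrete encoding.

ORIENTATIONS = {0: "keep direction", 1: "turn left", 2: "turn right", 3: "turn around"}

@dataclass
class SetpointMessage:
    recipient_id: int             # identifier of the recipient interface (10L or 10R)
    orientation: int              # change of direction among pre-established ones
    seconds_to_change: int        # time-lapse datum defining the imminence
    symbol: Optional[int] = None  # optional field designating a figurative symbol

    _FMT = ">BBHB"                # assumed layout: 1 + 1 + 2 + 1 bytes, big-endian

    def encode(self) -> bytes:
        sym = 0xFF if self.symbol is None else self.symbol
        return struct.pack(self._FMT, self.recipient_id, self.orientation,
                           self.seconds_to_change, sym)

    @classmethod
    def decode(cls, raw: bytes) -> "SetpointMessage":
        rid, ori, secs, sym = struct.unpack(cls._FMT, raw)
        return cls(rid, ori, secs, None if sym == 0xFF else sym)

if __name__ == "__main__":
    me = SetpointMessage(recipient_id=0x0B, orientation=2, seconds_to_change=16, symbol=2)
    assert SetpointMessage.decode(me.encode()) == me
    print(ORIENTATIONS[me.orientation], "in", me.seconds_to_change, "s")
```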
  • Said iterative processing comprises a subsequent step 103 for verifying that the human-machine interface whose processing unit implements said method 100 is indeed the recipient of the orientation setpoint message me received.
  • such a check may consist in comparing the value of the identifier of the recipient contained in said message me with that of the identifier specific to said human-machine interface implementing said method 100, recorded for example in the data memory 12.
  • It may alternatively or additionally consist in filtering the orientation setpoint messages me so as to retain only those carrying an orientation setpoint relating to a change of direction on the left or on the right, according to the parameterization of the human-machine interface in the configuration step 101.
  • the method then comprises a step 104 for extracting and recognizing the characteristics of the change of direction and its imminence from the values of the data read from the setpoint message, in view of a table of possible changes of direction advantageously stored in the data memory 12.
  • the iterative processing of the method 100 therefore comprises a first step 105 for producing a control signal of the first display space 15-1.
  • step 105 thus consists in driving a display consisting of a plurality of light-emitting diodes so that the latter illuminate, or in triggering the display of a graphic content developed or read from the data memory 12 by a screen constituting the display space 15-1.
  • the first display space 15-1 thus changes from a "diodes off" state, at a time t1 prior to receiving the setpoint message me, to a "diodes lit" state.
  • the method 100 includes a step 106 for producing a control signal of the second display space 15-2 of the display means 15.
  • step 106 thus consists in driving a display consisting of a plurality of light-emitting diodes so that all or some of them illuminate, or in triggering the display of a graphic content developed or read from the data memory 12 by a screen constituting the display space 15-2.
  • This step 106 consists of producing and triggering the display of the time gauge associated with the change of direction.
  • the second display space 15-2 thus goes from a "diodes off" state, at a time t1 prior to receiving the orientation setpoint message me, to a "diodes lit" state such as those described at times t2 and t3, according to the imminence of said change of direction.
  • steps 105 and 106 may also be implemented in parallel, before or after an optional step 107 for controlling the complementary alert means 16 if the human-machine interface comprises such means.
  • such a step 107 may consist in actuating said warning means 16 for a predetermined duration (for example one second) and at a likewise predetermined frequency.
  • the invention provides that such an alert is effective only at certain moments of the change-of-direction signaling.
  • thus, no control signal of the warning means 16 need be produced at the time the first and second display spaces are actuated.
  • the invention provides one or more determined thresholds whose values can be entered in the data memory 12, for example following the implementation of the configuration step 101 mentioned above.
  • When the imminence of the change of direction deduced from an orientation setpoint message me reaches the value of one of said determined thresholds, step 107 is triggered. The same applies to an optional further step 108 for actuating the third display space, if it exists. Such a step aims to drive said third display space 15-3 so that the latter displays a graphic content developed or read from the data memory 12 by a screen.
  • This step 108 consists in producing and triggering the display of one of the symbols described, as non-limiting examples, in FIG. 4B, according to the characteristics of the change of orientation extracted in step 104 from the setpoint message me.
  • the iterative processing constituted by the steps 102 to 108 is then completed.
  • the lateralized graphic information IL or IR rendered by the display means 15 to the user U is maintained until the receipt of a next setpoint message me.
  • said method 100 can comprise a step 110 for producing a secondary setpoint message mes, as mentioned previously, intended for said slave human-machine interface.
  • a configuration test, referenced 109 in FIG. 2, is provided for this purpose. Said steps 109 and 110 are implemented only if the test 103 has determined (situation symbolized by the link 103n in FIG. 2) that the setpoint message received at step 102 was not intended for the master human-machine interface.
  • If the processing unit 11 implementing said method 100 does not have to manage setpoint messages intended for a slave human-machine interface (situation symbolized by the link 109n in Figure 2), the iterative processing triggered at the end of step 102 stops until the next receipt of a setpoint message me.
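  • A minimal sketch of this iterative processing on the master interface side is given below; the dictionary-based messages, the queue transport and the reduction of steps 104 to 108 to a single render function are assumptions made for brevity, not the patent's implementation.

```python
import queue

# Hedged sketch of the iterative processing of method 100 on the master
# human-machine interface: step 102 (reception of a setpoint message me),
# step 103 (recipient check), steps 104 to 108 (rendering, reduced here to a
# print) and steps 109/110 (propagation of a secondary message mes to the
# slave interface). The dictionary messages and the queue transport are
# illustrative assumptions.

def render(me: dict) -> None:
    """Stand-in for steps 104-108: drive the display spaces 15-1/15-2/15-3 and the alert 16."""
    print(f"display: {me['orientation']} turn, {me['seconds']} s remaining")

def handle_setpoint(me: dict, my_id: str, slave_outbox: "queue.Queue") -> None:
    if me["recipient"] == my_id:      # step 103: this interface is the recipient
        render(me)                    # steps 104-108
    else:                             # steps 109/110: forward a message mes to the slave
        slave_outbox.put(me)

if __name__ == "__main__":
    inbox, slave_outbox = queue.Queue(), queue.Queue()
    inbox.put({"recipient": "10L", "orientation": "left", "seconds": 16})
    inbox.put({"recipient": "10R", "orientation": "right", "seconds": 8})
    while not inbox.empty():          # step 102: iterative reception
        handle_setpoint(inbox.get(), my_id="10L", slave_outbox=slave_outbox)
    print(slave_outbox.qsize(), "message(s) forwarded to the slave interface 10R")
```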
  • the invention provides an electronic device 20, for example in the form of a smart mobile phone, adapted to develop orientation setpoint messages me intended for said human-machine interfaces 10L and 10R.
  • such an adaptation resides, for example, in the loading of the instructions of an application program PA whose execution or interpretation by the processing unit 21 of said electronic device 20 causes the implementation of a suitable method, the main features of which are illustrated by the method 200 described with reference to FIG. 3 by way of non-limiting example.
  • Such a method 200 includes a first step 201 for determining or selecting a route along which a user wishes to be guided.
  • the processing unit 21 of the electronic device 20 triggers the display of one or more graphic or audible selection menus via display means, a loudspeaker and/or input means not shown in Figure 1. These may consist of a touch screen, for example, or a microphone.
  • Such a method 200 may further comprise a step 202 for configuring the subsequent execution, by the processing unit 21, of the other steps of said method 200 according to one or more preferences of the user U.
  • such preferences may be translated, via said display and/or input means of the electronic device 20, into one or more operating parameters whose respective values can advantageously be recorded in the data memory 22 of the electronic device 20.
  • the user U of the electronic device 20 can thus inform the processing unit 21 of his choices and preferences: selection of the shortest route for a given destination or of the itinerary presenting the least difficulty, a periodicity and/or a given delay or distance preceding the crossing of a characteristic point of the route in order to benefit from an orientation setpoint, etc.
  • it is then advantageous to implement a pairing or discovery protocol 203.
  • Such a step is in particular required if the communication C1 established between the different entities is said to be wireless.
  • such a step 203 could be optional if the communication C1 is wired.
  • such a step 203 may consist in transmitting, via the communication means 24, a discovery request, for example via a Bluetooth wireless proximity communication protocol, and in waiting to receive in response a message mp emanating from a human-machine interface 10L and/or 10R comprising the identifier of said human-machine interface, a secret, or any other element characterizing said interface.
  • Such an exchange consists in the joint implementation of step 203 by the processing unit 21 of the electronic device 20 and of the step 101, mentioned above, of a method 100 by the processing unit 11 of a human-machine interface 10L or 10R.
  • the first step 204 consists in taking into consideration the terrestrial position of the electronic device 20. For this, step 204 commands the processing unit 21 to collect the data transmitted by a plurality of satellites via the GPS receiver means 25 or any equivalent. Knowing the desired route determined in step 202, the respective terrestrial coordinates of a plurality of characteristic crossing points of which are stored in the data memory 22, a step 205 consists in producing an orientation setpoint intended to keep the electronic device 20, and therefore its user U, on the desired route, or to bring said user U back onto the latter.
  • step 205 can be triggered at the end of a time period of predetermined duration or at the approach of one of said characteristic crossing points mentioned above.
  • step 205 determines whether the user must maintain his direction or inflect it.
  • an orientation setpoint is then developed: move to the left, move to the right, or even turn around.
  • Such a setpoint may be enriched, in the case of a bifurcation or junction, by determining a degree of deviation of the required path or a tracking index (e.g. select the nth exit of a roundabout).
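  • Steps 204 and 205 can be pictured, under simplifying assumptions, as a comparison between the current heading and the bearing towards the next characteristic crossing point of the route; the angular thresholds and the bearing formula used below are illustrative choices, not values taken from the patent.

```python
import math

# Hedged sketch of steps 204 and 205: deduce an orientation setpoint by
# comparing the current heading with the initial bearing towards the next
# characteristic crossing point of the route. The angular thresholds
# (20 degrees and 150 degrees) are illustrative assumptions.

def bearing_deg(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Initial great-circle bearing, in degrees from North, from point 1 to point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

def orientation_setpoint(heading_deg: float, lat: float, lon: float,
                         next_lat: float, next_lon: float) -> str:
    delta = (bearing_deg(lat, lon, next_lat, next_lon) - heading_deg + 180.0) % 360.0 - 180.0
    if abs(delta) < 20.0:     # assumed tolerance: keep the current direction
        return "keep direction"
    if abs(delta) > 150.0:    # assumed threshold: the user should turn around
        return "turn around"
    return "right" if delta > 0 else "left"

if __name__ == "__main__":
    # Heading due North while the next crossing point lies to the East -> "right".
    print(orientation_setpoint(0.0, 45.0, 5.0, 45.0, 5.01))
```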
  • If the orientation setpoint developed in step 205 relates to a recommended change of direction on the left of the user, a step 206 consists in encoding said orientation setpoint in the form of an orientation setpoint message me intended for the human-machine interface positioned on the left of the user's field of vision, in this case the interface 10L. If, on the other hand, said orientation setpoint relates to a recommended change of direction on the right of said user, step 206 consists in elaborating an orientation setpoint message me intended for the human-machine interface positioned on the right of the field of vision of said user, in this case, according to Figure 1, the human-machine interface 10R. More generally, step 206 thus consists in triggering the transmission of an orientation setpoint message me to the first or the second human-machine interface, according to the orientation of the change of direction concerned by the orientation setpoint encoded by said message me.
  • step 205 can also determine that it is preferable for the user U to turn back, because he tends to depart inexorably from the recommended route.
  • In this case, step 206 may consist in developing and sending two messages me addressed respectively to the two human-machine interfaces 10L and 10R and encoding the same orientation setpoint, for example with the semantics "turn your direction around". As indicated by the situation t3 in FIG. 4A, the simultaneous taking into account of such a setpoint by the two human-machine interfaces 10L and 10R spontaneously alerts the user U.
  • Faced with the concomitant visual indications IL and IR, which would induce simultaneous and contradictory instructions or actions, for example "turn left" and "turn right", the user understands instantly that he is lost or that he is facing a limit of the guidance. He can then make a U-turn or at least question his current path.
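  • The routing performed by step 206, including the duplication of the message me to both interfaces when a turn-around is recommended, can be sketched as follows; the send() placeholder standing in for the communication means 24 and the interface identifiers are assumptions made for illustration.

```python
# Hedged sketch of step 206: the electronic device 20 addresses the setpoint
# message me to the human-machine interface matching the side of the
# recommended change of direction, and to both interfaces when a turn-around
# is recommended. The send() placeholder standing in for the communication
# means 24 and the interface identifiers are illustrative assumptions.

INTERFACES = {"left": "10L", "right": "10R"}

def send(interface_id: str, me: dict) -> None:
    """Placeholder for the transmission over the communication C1 (e.g. Bluetooth)."""
    print(f"-> {interface_id}: {me}")

def dispatch_setpoint(orientation: str, seconds_to_change: int, symbol=None) -> None:
    me = {"orientation": orientation, "seconds": seconds_to_change, "symbol": symbol}
    if orientation == "turn around":
        # Both interfaces receive the same setpoint: the simultaneous IL and IR
        # indications spontaneously warn the user that he has left the route.
        for interface_id in INTERFACES.values():
            send(interface_id, dict(me, recipient=interface_id))
    elif orientation in INTERFACES:
        send(INTERFACES[orientation], dict(me, recipient=INTERFACES[orientation]))
    # "keep direction": no message is sent and both interfaces stay dark

if __name__ == "__main__":
    dispatch_setpoint("right", 16, symbol="S2")
    dispatch_setpoint("turn around", 0, symbol="S4")
```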
  • when the display means 15 of one of the two human-machine interfaces, or of each of said interfaces, comprise a third display space 15-3, the latter can display the symbol S4, described by way of example in FIG. 4B.
  • the message me produced in step 206 may then include a meta-information item designating said symbol, or a predetermined characteristic information item instructing each human-machine interface of the need for a U-turn.
  • more generally, such a step 206 consists in producing a message me advantageously encoding several informative fields, such as an identifier of the human-machine interface that is the recipient of said setpoint message, an orientation datum whose predetermined value indicates a change of orientation among a set of possible pre-established changes, a time-lapse datum of the change of orientation, or even additional and optional data, known as meta-information.
  • Steps 204, 205 and 206 are thus iterated until the user U reaches his destination.
  • FIG. 5 schematically illustrates, in a very simplified manner, an exemplary embodiment of a human-machine interface 10L or 10R packaged in the advantageous form of a bracelet whose main body 10S constitutes a support for the elements 11 to 19 previously described, and of which only the display means 15, comprising three display spaces 15-1, 15-2 and 15-3 as described by way of example in FIGS. 4A and 4B, are not shown, for simplification purposes.
  • Said support 10S can be arranged to encapsulate the electronic members or elements, which can additionally be united in a housing that does not obscure the display means 15, so as to give them excellent protection against shocks and weather, with the exception of all or part of the "active face" of said display means 15, that is to say any visible part of a screen or of light-emitting diodes, so as to allow the visual transmission of the graphic indications delivered by said display means 15.
  • Said body 10S may advantageously have a boss, not described in Figure 5, to further orient the display means towards the field of view of the user.
  • Said body 10S may further comprise any locking means, not described in FIG. 5.
  • the invention can not be limited to this single mode of conditioning a human-machine interface.
  • the main body or support 10S of such a man-machine interface may constitute all or part of a glove in which the user could slide a hand or more generally a garment and adapted to convey one or two interfaces man-machine 10L, 10R.
  • the body 10S of such a man-machine interface can be arranged to cooperate with the handlebars of a two-wheeled vehicle or any equivalent, such as a bicycle, a motorcycle, a motorized watercraft, etc., and be advantageously fixed near one of the parts distal of said handlebar, that is to say near one of the handles of said handlebar.
  • said body 10S may further cooperate with, or comprise, fastening elements for a dashboard of a vehicle having a passenger compartment, for example a car, a truck, a boat, etc.
  • two human-machine interfaces 10L and 10R can be arranged respectively on the left and right within the field of view of the driver.
  • the invention also provides for arranging the body or support 10S of such a man-machine interface to satisfy the constraints imposed by other hobbies or sports.
  • such a man-machine interface can be arranged to be positioned on an accessory that is visually accessible in the field of view of the user, such as a ski near its tip, or the upper part of a ski or hiking pole.
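
By way of illustration only, the sketch below gives one possible shape for the setpoint message described in step 206 and for the pair of setpoints used to signal a U-turn. The language (Python), the field names recipient, orientation, time_lapse_s and meta, the OrientationChange enumeration, the JSON encoding and the helper make_u_turn_setpoints are all assumptions introduced for this example; the patent does not prescribe any concrete message format.

```python
# Illustrative sketch only: the field names, enumeration values and JSON
# encoding are assumptions made for the example, not the patent's format.
from dataclasses import dataclass
from enum import Enum
from typing import List, Optional
import json


class OrientationChange(Enum):
    """Pre-established set of possible changes of orientation."""
    TURN_LEFT = "turn_left"
    TURN_RIGHT = "turn_right"
    CHANGE_DIRECTION = "change_direction"  # e.g. the "change your direction" semantics


@dataclass
class SetpointMessage:
    """Hypothetical structure of a setpoint message produced at step 206."""
    recipient: str                   # identifier of the recipient interface, e.g. "10L" or "10R"
    orientation: OrientationChange   # orientation datum taken from the pre-established set
    time_lapse_s: float              # time-lapse datum attached to the change of orientation
    meta: Optional[dict] = None      # optional additional data, known as meta-information

    def encode(self) -> bytes:
        """Serialize the setpoint before sending it to the man-machine interface."""
        payload = {
            "recipient": self.recipient,
            "orientation": self.orientation.value,
            "time_lapse_s": self.time_lapse_s,
            "meta": self.meta,
        }
        return json.dumps(payload).encode("utf-8")


def make_u_turn_setpoints() -> List[SetpointMessage]:
    """Build two setpoints carrying the same orientation instruction,
    addressed respectively to interfaces 10L and 10R.

    Since each interface renders the instruction on its own side, taking both
    setpoints into account simultaneously produces contradictory indications
    ("turn left" and "turn right"), which alerts the user that a U-turn is
    needed or that a guidance limit has been reached.
    """
    meta = {"symbol": "S4"}  # hypothetical meta-information designating the U-turn symbol
    return [
        SetpointMessage("10L", OrientationChange.CHANGE_DIRECTION, 0.0, meta),
        SetpointMessage("10R", OrientationChange.CHANGE_DIRECTION, 0.0, meta),
    ]


if __name__ == "__main__":
    for setpoint in make_u_turn_setpoints():
        print(setpoint.encode())
```

In this reading, the lateralization is carried by the recipient interface itself, so the production step only has to decide which interfaces to address, with which orientation datum and with which timing.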

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Navigation (AREA)
  • Instructional Devices (AREA)
  • Traffic Control Systems (AREA)
EP17768828.0A 2016-08-19 2017-08-18 System zur unterstützung der navigation unter verwendung von visuellen und lateralisierten schnittstellen Withdrawn EP3500823A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1657832A FR3055158B1 (fr) 2016-08-19 2016-08-19 Systeme d'aide a la navigation visuelle et lateralisee
PCT/FR2017/052242 WO2018033686A1 (fr) 2016-08-19 2017-08-18 Systeme d'aide a la navigation utilisant des interfaces visuelles et lateralisees

Publications (1)

Publication Number Publication Date
EP3500823A1 true EP3500823A1 (de) 2019-06-26

Family

ID=57137150

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17768828.0A Withdrawn EP3500823A1 (de) 2016-08-19 2017-08-18 System zur unterstützung der navigation unter verwendung von visuellen und lateralisierten schnittstellen

Country Status (9)

Country Link
US (1) US10563989B2 (de)
EP (1) EP3500823A1 (de)
JP (1) JP2019529946A (de)
KR (1) KR20190039585A (de)
CN (1) CN109983302A (de)
AU (1) AU2017313346A1 (de)
CA (1) CA3034184A1 (de)
FR (1) FR3055158B1 (de)
WO (1) WO2018033686A1 (de)

Family Cites Families (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2952119A (en) * 1956-01-30 1960-09-13 Rodi & Wienerberger Ag Detachable elastic linkage
US4055057A (en) * 1976-07-06 1977-10-25 Kolman Anita M Coupling device for the opposed ends of bracelets
US5247814A (en) * 1990-12-17 1993-09-28 Mcdonald Gordon T Combination eyeglass holder and wrist bracelet
US5266958A (en) * 1992-11-27 1993-11-30 Motorola, Inc. Direction indicating apparatus and method
FR2783778B1 (fr) 1998-09-30 2001-02-16 De Villeroche Gerard Jodon Accessoire pour vehicule automobile comprenant un ecran d'affichage d'informations pour le conducteur
GB0004688D0 (en) * 2000-02-28 2000-04-19 Radley Smith Philip J Bracelet
DE202004000890U1 (de) * 2004-01-21 2004-04-01 Wöbke, Klaus Armreif
KR20060064930A (ko) * 2004-12-09 2006-06-14 엘지전자 주식회사 개선된 네비게이션 시스템
SE528297C2 (sv) * 2005-02-21 2006-10-10 Dennis Jansson Anordning som navigationshjälpmedel för indikering av kurs
US20060242599A1 (en) * 2005-04-22 2006-10-26 Creative Technology Ltd. Improvements in and Relating to Searching on a User Interface
WO2007105937A1 (en) 2006-03-10 2007-09-20 Tomtom International B.V. Tactile device, navigation device and system comprising such a tactile device and navigation device
US7609503B2 (en) * 2007-11-12 2009-10-27 Roland Hee Insulated metal grounding bracelet
TW200942782A (en) * 2008-04-14 2009-10-16 Mitac Int Corp Navigation direction indication device
JP5188910B2 (ja) * 2008-09-03 2013-04-24 シャープ株式会社 進路報知装置、判定結果送信装置、これらの装置の制御方法、これらの装置を含む指示方向案内システム、進路報知プログラム、判定結果送信プログラム並びにこれらのプログラムを記録したコンピュータ読み取り可能な記録媒体
US20100085279A1 (en) * 2008-10-02 2010-04-08 Repko Sean R Interactive display bracelet
US9288836B1 (en) * 2011-03-18 2016-03-15 Marvell International Ltd. Electronic bracelet
US20130086502A1 (en) * 2011-09-30 2013-04-04 Nokia Corporation User interface
US9250768B2 (en) * 2012-02-13 2016-02-02 Samsung Electronics Co., Ltd. Tablet having user interface
EP2850510A2 (de) * 2012-05-18 2015-03-25 Apple Inc. Vorrichtung, verfahren und grafische benutzeroberfläche zur manipulation von benutzeroberflächen auf basis von fingerabdrucksensoreingaben
US9638537B2 (en) * 2012-06-21 2017-05-02 Cellepathy Inc. Interface selection in navigation guidance systems
EP3150966B1 (de) * 2012-07-27 2019-06-26 Harman Becker Automotive Systems GmbH Navigationssystem und verfahren zur navigation
EP2740381B1 (de) * 2012-12-04 2015-09-16 Omega SA Verstellbarer Armbandverschluss
WO2014087200A1 (en) * 2012-12-07 2014-06-12 Nokia Corporation An apparatus and method to provide a user with an indication of a direction to a particular destination.
KR102131358B1 (ko) * 2013-06-17 2020-07-07 삼성전자주식회사 유저 인터페이스 장치 및 유저 인터페이스 장치의 동작 방법
CN104223613B (zh) * 2014-09-26 2016-09-28 京东方科技集团股份有限公司 智能手环显示控制系统及智能手环
US9772190B2 (en) * 2014-10-14 2017-09-26 Polar Electro Oy Orientation during swimming
KR102297360B1 (ko) * 2014-11-10 2021-09-01 푸마 에스이 미리 정해진 주행 또는 보행 경로를 따라 주자 또는 보행자를 안내하는 방법 및 기기
US20170303646A1 (en) * 2014-12-29 2017-10-26 Loop Devices, Inc. Functional, socially-enabled jewelry and systems for multi-device interaction
KR102318887B1 (ko) * 2015-03-06 2021-10-29 삼성전자주식회사 웨어러블 전자 장치 및 그 제어 방법
FR3034890B1 (fr) * 2015-04-10 2018-09-07 Cn2P Dispositif electronique de projection interactif
KR20160143338A (ko) * 2015-06-05 2016-12-14 엘지전자 주식회사 이동단말기 및 그 제어방법
US9898039B2 (en) * 2015-08-03 2018-02-20 Toyota Motor Engineering & Manufacturing North America, Inc. Modular smart necklace
US10154316B2 (en) * 2016-02-26 2018-12-11 Apple Inc. Motion-based configuration of a multi-user device
US10024680B2 (en) * 2016-03-11 2018-07-17 Toyota Motor Engineering & Manufacturing North America, Inc. Step based guidance system
EP3270278B1 (de) * 2016-07-14 2020-06-24 Volkswagen Aktiengesellschaft Verfahren zum betreiben eines bediensystems und bediensystem
US10012505B2 (en) * 2016-11-11 2018-07-03 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable system for providing walking directions
US10591730B2 (en) * 2017-08-25 2020-03-17 II Jonathan M. Rodriguez Wristwatch based interface for augmented reality eyewear

Also Published As

Publication number Publication date
CA3034184A1 (fr) 2018-02-22
JP2019529946A (ja) 2019-10-17
AU2017313346A1 (en) 2019-04-11
FR3055158B1 (fr) 2019-11-08
CN109983302A (zh) 2019-07-05
KR20190039585A (ko) 2019-04-12
FR3055158A1 (fr) 2018-02-23
US10563989B2 (en) 2020-02-18
US20190178652A1 (en) 2019-06-13
WO2018033686A1 (fr) 2018-02-22

Similar Documents

Publication Publication Date Title
KR102433291B1 (ko) 웨어러블 글래스 및 웨어러블 글래스의 컨텐트 제공 방법
US10527722B2 (en) Radar sensor system providing situational awareness information
CN105547318B (zh) 一种智能头戴设备和智能头戴设备的控制方法
CN108431667A (zh) 信息处理装置、信息处理方法和程序
US11710422B2 (en) Driving analysis and instruction device
CN110100153B (zh) 信息提供系统
CA2255118C (fr) Appareil individuel d'orientation
US11110933B2 (en) Driving support device, wearable device, driving support system, driving support method, and computer-readable recording medium
KR20240065182A (ko) 개인 이동성 시스템의 ar 기반 성능 변조
KR20240091285A (ko) 개인 이동성 시스템을 이용한 증강 현실 강화된 게임플레이
WO2017153708A1 (en) Cyclist accessory system
CN110062937B (zh) 信息提供系统
EP3204722A2 (de) Schnittstelle zur konstruktion einer trajektorie in einer umgebung und umgebungsanordnung und trajektoriekonstruktionsschnittstelle
FR3055158B1 (fr) Systeme d'aide a la navigation visuelle et lateralisee
FR3093072A1 (fr) Poignée avec dispositif indicateur de changement de direction rétractable et système communicant multicanal
US11219797B2 (en) Real-time sensor based balance gamification and feedback
FR3091768A1 (fr) Système d’aide à la navigation visuelle et latéralisée
WO1998017352A1 (fr) Dispositif portatif d'aide a la randonnee
EP3529100A1 (de) Fahrerassistenzsystem für ein fahrzeug mit einem smartphone und einer fernschnittstellenvorrichtung
JP2011220899A (ja) 情報提示システム
EP4241044A1 (de) Benutzerführungsassistenzsystem zur ausrichtung auf oder beobachtung einer bestimmten interessenszone
FR3144343A1 (fr) Principe d'aide aux personnes pour se déplacer à l'aide de vibrations parvenant à leur montre connectée.
WO2015071728A1 (fr) Procédé et système de navigation sensitif
CN115686294A (zh) 导览模式用户界面和体验

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20190214

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20200116

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20200603