WO2007042846A1 - Navigation method and corresponding implementation device - Google Patents

Navigation method and corresponding implementation device

Info

Publication number
WO2007042846A1
Authority
WO
WIPO (PCT)
Prior art keywords
picture
output
input
control unit
pictures
Prior art date
Application number
PCT/HU2005/000133
Other languages
English (en)
Inventor
Zoltán KOVÁCS
Zoltán FEICHT
Original Assignee
Kovacs Zoltan
Feicht Zoltan
Priority date
Filing date
Publication date
Application filed by Kovacs Zoltan, Feicht Zoltan filed Critical Kovacs Zoltan
Publication of WO2007042846A1 publication Critical patent/WO2007042846A1/fr

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34: Route searching; Route guidance
    • G01C21/36: Input/output arrangements for on-board computers
    • G01C21/3626: Details of the output of route guidance instructions
    • G01C21/3647: Guidance involving output of stored or live camera images or video streams
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/09: Arrangements for giving variable traffic instructions
    • G08G1/0962: Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967: Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096708: Systems involving transmission of highway information where the received information might be used to generate an automatic action on the vehicle control
    • G08G1/096716: Systems involving transmission of highway information where the received information does not generate an automatic action on the vehicle control
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/09: Arrangements for giving variable traffic instructions
    • G08G1/0962: Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967: Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096733: Systems involving transmission of highway information where a selection of the information might take place
    • G08G1/09675: Systems involving transmission of highway information where a selection from the received information takes place in the vehicle
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/09: Arrangements for giving variable traffic instructions
    • G08G1/0962: Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967: Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096766: Systems involving transmission of highway information where the system is characterised by the origin of the information transmission
    • G08G1/096791: Systems involving transmission of highway information where the origin of the information is another vehicle
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/09: Arrangements for giving variable traffic instructions
    • G08G1/0962: Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968: Systems involving transmission of navigation instructions to the vehicle
    • G08G1/096805: Systems involving transmission of navigation instructions to the vehicle where the transmitted instructions are used to compute a route
    • G08G1/096827: Systems involving transmission of navigation instructions to the vehicle where the route is computed onboard
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/09: Arrangements for giving variable traffic instructions
    • G08G1/0962: Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968: Systems involving transmission of navigation instructions to the vehicle
    • G08G1/096855: Systems involving transmission of navigation instructions to the vehicle where the output is provided in a suitable form to the driver
    • G08G1/096861: Systems involving transmission of navigation instructions to the vehicle where the immediate route instructions are output to the driver, e.g. arrow signs for next turn

Definitions

  • The subject of the invention is a navigation procedure, primarily for vehicles, and a device to implement the procedure. It can be employed expediently in vehicle navigation, traffic control, pedestrian navigation, tourism, marketing or to promote shops and services.
  • the route search, routing and/or navigation systems used at present are known to show the route on so-called traditional (classical) maps, and with the help of this route plan the user (driver) can get from the start point to the destination point.
  • The navigation equipment used recently also shows the optimal route on a traditional map, but is generally supplied with a Global Positioning System (GPS). So the exact location of the vehicle can be shown to the driver on the map, and as the vehicle approaches crossings or partings, sound or other signals (e.g. arrows) supply further information about the heading to follow.
  • GPS (Global Positioning System)
  • With GPS positioning solutions the user can feel more at ease because the system communicates the heading to follow: straight on, turn left or turn right.
  • the disadvantage of GPS positioning is that the user does not get adequate information about the junctions in advance, so in a busy crossing that interconnects several roads, there is for example scarce information available about how to go on.
  • a driver for example will find it difficult to give an eye to the traffic, the navigation device and the choice of the suitable lane at the same time, so he can get uncertain and hesitant in making the correct decision when he is using the device.
  • The GPS positioning solution requires a proper view of the satellites, which is not guaranteed in every case (for example, permanent use of the device is not possible in narrow urban streets); moreover, the accuracy of currently available GPS receivers (approx. 10 m) is not satisfactory, which can make correct identification quite random and lead to mistakes, mainly at closely spaced crossings in cities.
  • 3D navigation systems are also well known. Their disadvantage is that they display computer-generated pictures, and these pictures will not necessarily appear as the user physically perceives the specific environment.
  • It is also well known for a specific crossing, area or perhaps a short route to be depicted or shown in a 360° panoramic view, generally for the purposes of tourism.
  • The panoramic view shows the area, the object or the course from the centre of a crossing or an area. But these pictures are less useful in navigation because they are shown from a point that is just opposite the traveller, so when the driver approaches a crossing from a specific direction, the pictures will be displayed from the opposite direction than in reality, or the driver will only get a correct picture when he is already in the crossing, so the necessary information is not available in advance with this method either.
  • The GPS-supported navigation system expounded in patent description DE 19935718 OS, comprising a board computer, an LCD display, a databank, a GPS interface, a radio modem with radio Internet access and local databanks, represents the state of the art.
  • The navigation system maps out the actual real environment in a three-dimensional picture and shows it on the display for the traveller's benefit.
  • The three-dimensional pictures and the orienting objects (such as crossings, traffic lights, filling stations, houses, trees, bus stops, telephone booths, traffic signs, fire hydrants, water taps, etc.) are displayed as the driver actually sees them when he is moving in the street, so that they can help him choose the proper course, especially when turning.
  • The published invention (No. HU P0303417) also represents the state of the art. Its essential property is that when moving in poor visibility conditions, or in situations where obscured ground objects or dangerous points are invisible to the driver, a virtually generated three-dimensional picture is displayed, based on the user's actual location and the digitised map information. When this solution is used, the driver can see the road section much further on and will notice much more quickly any dangerous situations that are due to high speed or any unexpected roadblocks, whereas in water traffic all dangerous waters are indicated in advance and the proposed heading can be viewed.
  • The virtual three-dimensional environment is updated with computer and GPS support in both solutions outlined above, so the environment and the pictures are not displayed as the user physically perceives them in reality.
  • The invention expounded in patent description No. WO 03/002943 also represents the state of the art. It is related to the operational procedures of a navigation system and to the navigation system of vehicles, especially motor vehicles.
  • This solution uses a positioning system to define all the data related to the geographical location of the vehicle, then an electronic data processor system transmits this set of data to a database with access to a digital route map. On the basis of the above data and directions, the location of the vehicle versus the destination point is given on the digital route plan. Visual and/or acoustic outputs are at the disposal of the driver of the vehicle.
  • The supplementary information is stored in the navigation system and can be easily retrieved from it on demand. The driver and/or any other person can display it in the vehicle, on the output, through the navigation system.
  • The supplementary information for navigation is actually uploaded to the navigation system in the form of a user program, which helps the user and/or any other persons to use it, as so-called "navigation information", when displaying it in the vehicle.
  • The picture used for the supplementary information is not a real one but is computer-transformed and digitised.
  • The vehicle navigation system in patent description US 2005/0027439 also represents the state of the art.
  • the navigation information is transmitted to the display in the vehicle in the form of video data and in a standard video format.
  • Data such as still or motion pictures can be displayed on the video.
  • the vehicle is fitted with a navigation system decoder and a video data display, to connect to the entertainment system in the vehicle.
  • the video data include map data, user information, instructions for direction, video information (still or motion picture) or information about the destination point or objects of public interest.
  • In this well-known vehicle navigation system, again, the video information does not include real pictures but computer-generated ones.
  • The invention is intended to eliminate the deficiencies of the well-known solutions, to develop a navigation procedure primarily applicable in vehicles, and to produce a device to implement the procedure, all leading to simplified and safer position detection and decision-making, quick orientation that requires less abstraction effort and less focused attention, being based on the visual perception of real pictures, and therefore to a kind of high-precision navigation that materially excludes the possibility of mistakes.
  • The solution developed in the invention is based on the following recognition. Pictures (photographs or video shots) are made in advance of a specific environment, landscape or settlement, for example the streets of a locality with the house numbers designated, crossings, partings, junctions, and objects and buildings of public interest. A picture database is created from the store of these pictures. The pictures selected from this picture database, for example to cover a specific route, are completed with supplementary information and are displayed on a suitable picture display device, or are put at the user's disposal in another form, for example in a book or guide with printed pictures, at an acceptable distance before the decision-making and orientation task. The user can perceive these visually, in the manner the human eye is used to, in accordance with the real situation, and can consequently recognise and identify the location, or this comparison and identification can be made with a suitable device. With this, the solution elaborated in the invention meets its objectives.
  • Figure 3 skeleton diagram for a preferential design of the device developed in the invention, suitable for traffic control
  • Figure 4 skeleton diagram for a preferential design of the device developed in the invention, suitable for pedestrian navigation and use in mobile appliances.
  • Figure 1 shows the skeleton diagram of the device developed in the invention, containing an interconnected user interface 1 and control unit 2.
  • the first output 11 of the user interface 1 is connected to the first input 201 of the control unit 2, its first input 12 to the first output 202 of the control unit 2, its second input 13 to the second output 203 of the control unit 2, and its third input 14 to the third output 204 of the control unit 2 in the specific case. (This connection is marked with a broken line in the Figure.)
  • the fourth output 206 of the control unit 2 is connected to the first input 33 of the routing unit 3, its third input 207 to the second output 32 of the routing unit 3, its fourth input 208 to the first output 31 of the routing unit 3.
  • the third output 34 of the routing unit 3 is connected to the input 42 of the map database 4, its second input 35 to the output 41 of the map database 4. Additionally the navigation device has a picture database 5, and its first output 51 is connected to the fifth input 210 of the control unit 2, its first input 52 to the fifth output 209 of the control unit 2.
  • Figure 2 shows a skeleton diagram for a preferential design of the device developed in the invention, suitable for navigation on the basis of position detection and continuous positioning.
  • The device shown in Figure 1 also has a shooting device 6; its output 61 is connected to the fourth input 16 of the user interface 1, its input 62 to the third output 17 of the user interface 1, and the second output 15 of the user interface 1 to the second input 205 of the control unit 2.
  • the sixth output 211 of the control unit 2 is connected to the first input 73 of the picture-based positioning unit 7, its sixth input 212 to the second output 72 of the picture-based positioning unit 7, its seventh input 213 to the first output 71 of the picture-based positioning unit 7, the second input 74 of the picture-based positioning unit 7 to the second output 54 of the picture database 5 and its third output 75 to the second input 53 of the picture database 5.
  • Figure 3 shows a skeleton diagram for a preferential design of the device developed in the invention, suitable for traffic control.
  • the device shown in Figure 2 has a picture analysis unit 8 in addition to the units outlined above and its output 81 is connected to the eighth input 214 of the control unit 2, its input 82 to the sixth output 211 of the control unit 2 and the interconnected first input 73 of the picture-based positioning unit 7.
  • Figure 4 shows a skeleton diagram for a preferential design of the device developed in the invention, suitable for pedestrian navigation and use in mobile appliances.
  • The user interface 1 and the shooting device 6 are placed in the mobile appliance, as a user system U (marked with a broken line in the Figure).
  • the other units, such as the control unit 2, the routing unit 3, the map database 4, the picture database 5 and the picture-based positioning unit 7, are all placed in a central system C (similarly marked with a broken line in the Figure), and the connection between the user system U and the central system C is realised through mobile communication, preferably with GPRS, EDGE or 3G.
  • The user interface 1 (display and input unit) is in charge of displaying the pictures (photographs, video shots), maps, data input surfaces and voice-based information on the one hand, and of inputting the user's (operator's) commands in various modes, for example through a control tool or key button, as well as receiving and assessing the answers to and information about the commands, on the other.
  • the control unit 2 is in charge of controlling the other units, receiving, transmitting data and information and converting them into the format required in the individual units.
  • The routing unit 3 is in charge of building the optimal route between two points (the start point and the destination point), also including any intermediate points, in accordance with the well-known procedures and algorithms used in everyday practice.
  • the map database 4 is in charge of storing the data of the well-known traditional maps used in everyday practice, the digital maps and the routes.
  • The picture database 5 is in charge of storing, as data, the pictures generated from the still or motion pictures pre-shot from each approachable direction of all the decision-making locations (crossings, partings, junctions) stored in the map database 4.
  • The shooting device 6 (camera) is in charge of making pre-shots of the decision-making locations and streets, so that by comparing these with the pictures of the picture database 5, the user's location can be determined.
  • the picture-based positioning unit 7 is in charge of comparing the picture, received from the control unit 2, with the pictures stored in the picture database 5, and in case of agreement, to return the address of the transmitted picture to the control unit 2.
  • the picture analysis unit 8 is in charge of analysing the picture received from the control unit 2 and specifying the traffic control features of those who participate in traffic.
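The matching step performed by the picture-based positioning unit 7 can be illustrated with a minimal sketch, under the hypothetical assumption (the patent does not specify an implementation) that every stored picture is reduced to a numeric feature vector and tagged with its address:

```python
import math

# Illustrative sketch of the picture-based positioning unit 7: each stored
# picture in the picture database 5 is represented by a feature vector keyed
# by its address. A query picture is matched against the store and, in case
# of sufficient agreement, the address of the matching picture is returned.
# The feature vectors and addresses below are invented examples.

def similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def locate(query, picture_db, threshold=0.95):
    """Return the address of the best-matching stored picture, or None."""
    best_addr, best_score = None, threshold
    for addr, features in picture_db.items():
        score = similarity(query, features)
        if score >= best_score:
            best_addr, best_score = addr, score
    return best_addr

picture_db = {
    "Main St / 2nd Ave": [0.9, 0.1, 0.4],
    "Main St / 3rd Ave": [0.2, 0.8, 0.5],
}
print(locate([0.88, 0.12, 0.41], picture_db))  # matches "Main St / 2nd Ave"
```

A real unit would of course match images rather than hand-made vectors, but the contract is the same: picture in, address out, `None` when no stored picture agrees closely enough.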
  • the first step is to assemble the picture database 5.
  • The picture is made in a mode natural for the human eye, generally from the viewpoint corresponding with the travel direction, nearly parallel with the longitudinal axis of the road or route, or, in specific cases, for navigation with the accuracy of a street name or house number, nearly perpendicular to the longitudinal axis of the road or route.
  • Pictures are made from each approachable direction of all the decision-making locations (crossing, parting, junction, building) stored in the map database 4.
  • the pictures of the picture database 5 are generated from the above mentioned, pre-shot still pictures or motion pictures and these, with regard to format, will expediently be electronic still pictures or motion pictures or printed still pictures.
  • Positioning can be manual, with an external appliance or with picture-based position detection. In manual positioning, you yourself (or the user) set the address data of the start point. Positioning with an external appliance can be carried out with a GPS or on the basis of mobile cell information, used to define the address of the start point. If positioning is neither manual nor made with an external appliance but takes the form of picture-based position detection, the device shown in Figure 1 is, in the basic case, completed with the shooting device 6 and the picture-based positioning unit 7, and this set corresponds to the skeleton diagram of the device suitable for navigation on the basis of position detection and continuous positioning, in Figure 2.
  • Picture-based position detection: the picture made of the start point with the shooting device 6 is compared, using the picture-based positioning unit 7, with the pictures supplied with address data and stored in the picture database 5, and in case of a match the address of the picture is returned to the control unit 2 through the first output 71.
  • Picture-based position detection can take two forms: intermittent or continuous. In intermittent picture-based position detection, position detection is only carried out at the decision-making locations, whereas in continuous picture-based position detection it is done on a continuous basis, so the momentary positions can be watched and perceived constantly. Knowing the address of the start point, you then have to determine the optimal route between the start point and the destination point.
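The three positioning modes named above (manual, external appliance, picture-based) can be summarised in a small dispatch sketch; the mode names and resolver callables are hypothetical stand-ins, not part of the patent:

```python
# Illustrative dispatch over the three start-point positioning modes the text
# describes: manual entry, external appliance (GPS / mobile cell lookup) and
# picture-based detection. All resolver arguments are invented placeholders.

def position_start_point(mode, **kwargs):
    if mode == "manual":
        return kwargs["address"]                        # user-entered address
    if mode == "external":
        return kwargs["gps_lookup"]()                   # GPS / cell-based lookup
    if mode == "picture":
        return kwargs["match_picture"](kwargs["shot"])  # picture database match
    raise ValueError(f"unknown positioning mode: {mode}")

print(position_start_point("manual", address="Main St 12"))
print(position_start_point(
    "picture",
    shot="fp-cross-A",
    match_picture={"fp-cross-A": "Main St / 2nd Ave"}.get,
))
```

Whichever mode is used, the result is the same kind of value: the address of the start point, which the routing step below takes as its input.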
  • The address data (street/house number) of the points to be reached or passed by are transmitted to the control unit 2 through the first output 11 of the user interface 1.
  • the address data (address list for the route plan) received from the user interface 1 are transmitted to the routing unit 3 through the fourth output 206 of the control unit 2.
  • the optimal route plan is built with the routing unit 3 and the map database 4.
  • the address list for the route plan - as data needed to select the map - is transmitted from the third output 34 of the routing unit 3 to the input 42 of the map database 4, and the data of the restricted, selected map of the relevant area are returned from the output 41 to the second input 35 of the routing unit 3.
  • the optimal route between the start point and the destination point, including the list of decision-making locations in sequence, is produced on the second output 32 of the routing unit 3.
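The "well-known procedures and algorithms" the text leaves unspecified include shortest-path searches such as Dijkstra's algorithm. A minimal sketch, assuming (hypothetically) that the map database 4 exposes the road network as a weighted graph whose nodes are the decision-making locations:

```python
import heapq

# Illustrative sketch of the routing unit 3: Dijkstra's algorithm over a
# weighted graph of decision-making locations (crossings, junctions). The
# graph, node names and weights below are invented examples.

def optimal_route(graph, start, destination):
    """Return the cheapest route as an ordered list of locations, or None."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == destination:
            return path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, weight in graph.get(node, {}).items():
            if neighbour not in visited:
                heapq.heappush(queue, (cost + weight, neighbour, path + [neighbour]))
    return None  # destination unreachable

graph = {
    "start": {"crossing A": 2.0, "crossing B": 5.0},
    "crossing A": {"crossing B": 1.0, "destination": 6.0},
    "crossing B": {"destination": 2.0},
}
print(optimal_route(graph, "start", "destination"))
```

The returned list is exactly the "list of decision-making locations in sequence" that the routing unit 3 produces on its second output 32.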
  • These locations are, as mentioned above, crossings, partings, junctions and streets as a general rule, but can also be the pictures of buildings, for example house gates or facades, which means that essentially they are locations where a decision must be made as to where, in which direction, to continue the trip.
  • The list of decision-making locations is completed with supplementary information, for example which direction you are approaching from (so that the relevant picture can be selected) and whether the specific street can be left (controlling information about the turning direction). Then the list of decision-making locations and the information to support navigation are assembled.
  • the pictures stored in the picture database 5 are pre-shot still or motion pictures and these, with regard to format, will expediently be electronic still pictures or motion pictures or printed still pictures.
  • The pictures generally show the decision-making locations from the viewpoint corresponding with the travel direction, nearly parallel with the longitudinal axis of the road or route. But the pictures can also be made nearly perpendicular to the longitudinal axis of the road or route, showing the facades of the lines of buildings along the two sides of the street.
  • the control unit 2 ranges the selected pictures in the sequence of the route and adds some navigating (control) signals according to the turning directions to show where to turn at the end of the specific street to follow the correct course.
  • These navigating signals can be visual signals (e.g. arrow, street name) and/or phonic instructions (e.g. turn right). They are essentially required to determine the turning direction unambiguously and in due time.
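The ranging-and-annotation step of the control unit 2 can be sketched as follows; the picture placeholders, turn labels and arrow glyphs are illustrative assumptions, not specified by the patent:

```python
# Illustrative sketch of the control unit 2 ranging the selected pictures in
# route sequence and attaching a navigating signal (visual arrow plus text)
# to each one. Pictures are stood in for by address-tagged placeholders.

TURN_SIGNALS = {
    "left": "<- turn left",
    "right": "-> turn right",
    "straight": "^ straight on",
}

def annotate_route(decision_points):
    """decision_points: list of (address, turn) pairs in route sequence."""
    annotated = []
    for address, turn in decision_points:
        picture = f"picture[{address}]"  # placeholder for the stored picture
        signal = TURN_SIGNALS.get(turn, "^ straight on")
        annotated.append((picture, signal))
    return annotated

route = [("Main St / 2nd Ave", "right"), ("2nd Ave / Oak Rd", "left")]
for picture, signal in annotate_route(route):
    print(picture, signal)
```

Each (picture, signal) pair is what the control unit 2 then pushes to the user interface 1 display, one decision-making location at a time.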
  • The first selected picture and the navigating signals are transmitted from the first output 202 of the control unit 2 to the user interface 1 unit and are shown on the display of the user interface 1 unit.
  • The navigation information, e.g. control signals to follow the proper course, is also transmitted to the user interface 1 unit through the second output 203 of the control unit 2, and, in specific cases, the traditional map information to show the route is also transmitted there from the third output 204 of the control unit 2 (marked with a broken line in the Figure). Then, based on the picture displayed, the location of the start point can be identified and you can start in the correct direction, always minding the navigating signals.
  • When you have reached the location, you have to proceed in the proper, set direction, based on the previously retrieved picture and the related navigating signals. Afterwards you have to repeat the above step and continue with the procedure until you have reached the destination point.
  • The solution developed in the invention is expediently applicable primarily in vehicle navigation, that is, navigation relying on picture-based position detection and on continuous positioning.
  • the navigation device demonstrated in Figure 1 and employed in the above basic case should be expediently used with the shooting device 6 and the picture-based positioning unit 7, both fitted in the vehicle, which is shown in Figure 2.
  • Positioning can for example be made manually or with picture-based position detection, and you have to proceed as explained above until the optimal route is determined.
  • the shooting device 6 transmits the continuously shot live pictures through its output 61 to the fourth input 16 of the user interface 1 and then through the second output 15 of the user interface 1 to the control unit 2.
  • the live pictures received from the shooting device 6 are transmitted to the first input 73 of the picture-based positioning unit 7.
  • The picture selection data are transmitted from the third output 75 of the picture-based positioning unit 7 to the picture database 5, through its second input 53, whereas the selected pictures and the related addresses are transmitted from the second output 54 of the picture database 5 to the picture-based positioning unit 7, through its second input 74.
  • The control unit 2 continuously compares the pictures that are stored in the picture database 5 with the live pictures made by the shooting device 6, and when the stored picture coincides with the live picture (that is, the locations match), the control unit 2 supplies the live picture, in place of the stored picture, with the navigating signals, and that is shown through the first input 12 of the user interface 1, on its display.
  • When you have left the first decision-making location, the control unit 2 continuously sends the live picture to the display of the user interface 1, so that when you reach the next decision-making location, it can be compared with the picture in the picture database 5 and the live picture can be supplied with navigating signals.
  • The live motion picture is continuously processed with the picture-based positioning unit 7, which facilitates the continuous and real-time determination of the vehicle position; moreover, there is no need for user feedback concerning the decision-making locations, which all in all leads to safer and more accurate navigation. It is safer because continuous position detection is not subjective (by man) but objective (automatic, with a device), and more accurate because more position detections can be made per unit of time.
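A sketch of this continuous comparison loop, under the simplifying and purely hypothetical assumption that each live frame and each stored picture reduce to a comparable fingerprint string:

```python
# Illustrative sketch of continuous picture-based position detection: live
# frames stream in from the shooting device 6; whenever a frame agrees with a
# stored picture, the live picture is supplied with the navigating signal in
# place of the stored one, as the text describes. Fingerprints are invented.

def continuous_positioning(live_frames, stored):
    """stored: fingerprint -> (address, signal). Yields (frame, overlay)."""
    for frame in live_frames:
        if frame in stored:
            address, signal = stored[frame]
            yield frame, f"{address}: {signal}"  # live picture + navigating signal
        else:
            yield frame, None                    # between decision-making locations

stored = {"fp-cross-A": ("Main St / 2nd Ave", "turn right")}
frames = ["fp-road-1", "fp-cross-A", "fp-road-2"]
print([overlay for _, overlay in continuous_positioning(frames, stored)])
```

Because the generator is driven frame by frame, position detection happens automatically and repeatedly, which is exactly the objectivity and rate argument made above.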
  • the units - so the user interface 1, the control unit 2, the routing unit 3, the map database 4, the picture database 5, the shooting device 6 and the picture-based positioning unit 7 - are all built in one device, reasonably in the vehicle.
  • the solution developed in the invention is also expediently applicable in traffic control, based on vehicle navigation.
  • the vehicle navigation device is completed with the picture analysis unit 8, as shown in Figure 3.
  • the live picture received from the shooting device 6 through the user interface 1 and the control unit 2 is transmitted to the input 82 of the picture analysis unit 8, and the features of those who participate in traffic (e.g. car speed, standing cars, roadblocks) are transmitted from its output 81 through the control unit 2 to the display of the user interface 1.
  • The application of the picture analysis unit 8 must be completed by establishing a connection between the vehicles, which can follow two structures: (I) each traffic control device, with its picture analysis unit 8, is connected to a central unit (central server) and communication is managed through that unit; (II) the traffic control devices of the vehicles communicate with one another directly. The connection is based on mobile communication in both cases.
  • Traffic control is based on these steps: the traffic in front of the vehicle is continuously shot with the shooting device 6, which is located on the vehicle, and the traffic features (e.g. car speed, standing cars, roadblocks) of the specific location are continuously analysed and determined with the picture analysis unit 8.
  • the traffic control devices share the generated traffic status data according to the above structure I or II.
  • the current traffic conditions can be continuously traced, which is a basic condition of traffic control.
  • This system can offer alternative routes with the help of the control unit 2, the routing unit 3 and the knowledge of the address of the destination point, and the travellers can choose the route that best suits them through the user interface 1.
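One plausible way, not specified by the patent, to let the shared traffic status influence the alternative-route proposal is to scale the routing weights of congested segments before the routing unit 3 recomputes the route:

```python
# Illustrative sketch of the traffic-control step: each vehicle's picture
# analysis unit 8 reports traffic features for its current road segment, a
# shared status store is updated, and congested segments receive higher
# routing weights. Segment names, speeds and the scaling rule are invented.

def share_status(status_store, segment, avg_speed):
    """A traffic control device reports (segment, average speed in km/h)."""
    status_store[segment] = avg_speed
    return status_store

def adjusted_weight(base_weight, segment, status_store, free_flow=50.0):
    """Scale a segment's routing weight by how far speed is below free flow."""
    speed = status_store.get(segment, free_flow)
    factor = free_flow / max(speed, 1.0)
    return base_weight * factor

status = {}
share_status(status, "Main St", 10.0)            # congestion reported
print(adjusted_weight(2.0, "Main St", status))   # 2.0 * 50/10 = 10.0
print(adjusted_weight(2.0, "Oak Rd", status))    # no report: unchanged, 2.0
```

Feeding the adjusted weights back into the shortest-path search makes the congested segment five times as expensive here, so an alternative route can win.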
  • The solution developed in the invention is also expediently applicable in pedestrian navigation, with the help of a mobile device.
  • The user interface 1 and the shooting device 6 are installed in a user system U, whereas the other units (the control unit 2, the routing unit 3, the map database 4, the picture database 5 and the picture-based positioning unit 7) are all in a central system C, as illustrated in Figure 4.
  • The user system U is expediently installed in a mobile phone, while the central system C is installed on a server computer.
  • the mobile phone held by the user must be fitted with camera and picture display functions.
  • the connection between the user system U and the central system C is expediently developed with mobile communication, for example GPRS.
  • An advantage of mobile communication is that with the help of the cell information the user's location can be determined with the accuracy of some comers in a relatively big settlement, for example in a town.
  • the user determines the start point, which can be the momentary stay-in position or any other location. If this is done manually, the user determines the address data of the start point himself and enters them in the mobile phone. If this is done with an external appliance, the user uses his mobile phone to query the list of streets in his vicinity, applying cell information-based positioning, chooses his actual stay-in point from the list and enters the house number on the keyboard. When positioning applies picture- based position detection, the user makes a picture of the crossing or junction that is the closest to him on Ms mobile phone and sends it to the central system C.
  • the picture-based positioning unit 7 compares the picture with the pictures stored in the picture database 5 and supplied with address data, and in case of a match it sends the address of the picture through the first output 71 to the control unit 2 and the user's mobile phone. If the picture-based positioning unit 7 is unable to identify the crossing or junction accurately, it returns the pictures that most resemble the picture of the start point through its second output 72 and the first output 202 of the control unit 2 to the mobile phone display, and, having examined the pictures, the user can choose where he is or initiate another identification process. When the start point is determined, the address data of the destination point have to be entered (e.g.
  • the central system C sends the first picture of the start point to the mobile phone display, which in turn will display it for the user.
  • the user identifies the location of the start point on the basis of the picture and sets off in the proper direction, using the navigating signals (e.g. an arrow) in the picture.
  • the user presses a button on the mobile phone or gives a phonic instruction to record that he has left the crossing or junction.
  • the central system C selects the picture of the subsequent decision-making location, including the supplementary information, with the help of the control unit 2 and the picture database 5 and returns it to the user's mobile phone.
  • the user receives this picture, recognises and identifies it as he gets to the location and, following the navigating signals (instructions), he sets off in the given direction. Thereafter the user continues the above steps of the procedure until he reaches the destination point.
  • the pre-shot pictures are also applicable in tourism, when the printed pictures of a city are distributed to the tourists in the form of a "traveller's companion", an itinerary or the annex to a guidebook.
  • the solution developed in the invention has met its objectives and offers the following advantages:
    - navigation is made with real pictures: the user sees the same on the device as what he can visually perceive, so no map-reading abstraction skills are needed,
    - position detection and decision-making are simpler, safer and more accurate, because the user is supplied with visual information, based on real pictures, before reaching the navigation decision-making location,
    - orientation requires less abstraction effort and focused attention,
    - use of the device does not necessarily require an external information or signal source,
    - the pictures are shot with identical picture properties, which helps uniform and objective orientation and decision-making.
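The picture-based position detection described above - the positioning unit 7 comparing the user's picture with the address-tagged pictures in the picture database 5 and returning either the matched address or the most similar candidate pictures - can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: the feature vectors, the cosine-similarity measure, and the `threshold` and `top_k` parameters are hypothetical stand-ins for whatever picture-comparison method the positioning unit actually uses.

```python
import math
from dataclasses import dataclass


@dataclass
class StoredPicture:
    """A picture in the picture database, supplied with address data."""
    address: str     # address data attached to the stored picture
    features: list   # pre-computed feature vector (hypothetical descriptor)


def cosine_similarity(a, b):
    """Similarity between two feature vectors, in [0, 1] for non-negative features."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0


def locate(query_features, database, threshold=0.9, top_k=3):
    """Mimic the positioning unit: return ('match', address) on a confident
    match, otherwise ('candidates', [...]) with the most similar pictures
    so the user can choose his location or retry."""
    ranked = sorted(database,
                    key=lambda p: cosine_similarity(query_features, p.features),
                    reverse=True)
    best = ranked[0]
    if cosine_similarity(query_features, best.features) >= threshold:
        return ("match", best.address)
    return ("candidates", [p.address for p in ranked[:top_k]])
```

A confident match sends back a single address (the path through output 71 to the control unit 2), while an ambiguous query returns the closest candidates for the user to inspect on the phone display.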

Abstract

The present invention relates to a navigation method intended mainly for vehicles, in which, with the help of the user interface (1), the control unit (2), the routing unit (3) and the map database (4), the optimal route between the start point and the destination point is determined, comprising the sequential list of decision-making locations. The invention is characterised in that the pictures of the decision-making locations are selected with the help of the control unit (2) from the picture database (5), which contains real pictures in line with the sequential list, and are provided with navigating signals. The first picture of the start point is then displayed on the user interface (1), so that the location of the start point can be identified on the basis of the picture and the proper direction taken. The invention also relates to the device for implementing the navigation method.
PCT/HU2005/000133 2005-10-11 2005-12-14 Procédé de navigation et dispositif de mise en œuvre correspondant WO2007042846A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
HUP0500937 2005-10-11
HU0500937A HUP0500937A2 (en) 2005-10-11 2005-10-11 Navigation method and equipment mainly for vehicles

Publications (1)

Publication Number Publication Date
WO2007042846A1 true WO2007042846A1 (fr) 2007-04-19

Family

ID=89986334

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/HU2005/000133 WO2007042846A1 (fr) 2005-10-11 2005-12-14 Procédé de navigation et dispositif de mise en œuvre correspondant

Country Status (2)

Country Link
HU (1) HUP0500937A2 (fr)
WO (1) WO2007042846A1 (fr)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106445367B (zh) * 2016-10-14 2019-06-04 上海语途信息技术有限公司 基于三维设计软件进行操作界面设计与更换的方法


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10132598A (ja) * 1996-10-31 1998-05-22 Sony Corp ナビゲート方法、ナビゲーション装置及び自動車
JPH11271074A (ja) * 1998-03-20 1999-10-05 Fujitsu Ltd 目印画像照合装置及び目印画像照合方法及びプログラム記憶媒体
GB2337653A (en) * 1998-05-19 1999-11-24 Pleydell Bouverie David Archie Route calculation and display apparatus
US20040210382A1 (en) * 2003-01-21 2004-10-21 Tatsuo Itabashi Information terminal apparatus, navigation system, information processing method, and computer program
EP1484579A1 (fr) * 2003-06-03 2004-12-08 Samsung Electronics Co., Ltd. Apparei et méthode pour télécharger et afficher des images de positionnement global dans un système de navigation

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
PATENT ABSTRACTS OF JAPAN vol. 1998, no. 10 31 August 1998 (1998-08-31) *
PATENT ABSTRACTS OF JAPAN vol. 2000, no. 01 31 January 2000 (2000-01-31) *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2072954A1 (fr) * 2007-12-20 2009-06-24 Telecom Design Srl Méthode pour la représentation d'un environnement dans un système de navigation par l'utilisation de photographies panoramiques.
EP2075541A1 (fr) * 2007-12-31 2009-07-01 STMicroelectronics Design and Application GmbH Système amélioré de navigation de véhicule
WO2009120303A1 (fr) * 2008-03-24 2009-10-01 Google Inc. Images panoramiques dans des directions de conduite
US8428873B2 (en) 2008-03-24 2013-04-23 Google Inc. Panoramic images within driving directions
US8359157B2 (en) 2008-04-07 2013-01-22 Microsoft Corporation Computing navigation device with enhanced route directions view
CN101340661B (zh) * 2008-08-14 2011-12-28 北京中星微电子有限公司 实现导游控制的移动设备和服务器以及导游控制方法
WO2011154050A1 (fr) * 2010-06-11 2011-12-15 Tomtom International B.V. Dispositif et procédé de navigation comprenant des instructions améliorées contenant une image panoramique d'une scène

Also Published As

Publication number Publication date
HU0500937D0 (en) 2005-12-28
HUP0500937A2 (en) 2007-05-29

Similar Documents

Publication Publication Date Title
Raubal et al. Enriching wayfinding instructions with local landmarks
US9240029B2 (en) Street level video simulation display system and method
US7831387B2 (en) Visually-oriented driving directions in digital mapping system
ES2534311T3 (es) Método de funcionamiento de un sistema de navegación que usa imágenes
DE60316327T2 (de) Routenberechnung um verkehrshindernisse herum unter verwendung markierter umleitungen
US9212927B2 (en) Map view
EP1612707B1 (fr) Procédé de collecte d'informations pour une base de données géographique à l'emploi dans un système de navigation
US7688229B2 (en) System and method for stitching of video for routes
WO2007042846A1 (fr) Procédé de navigation et dispositif de mise en œuvre correspondant
JP7061634B2 (ja) インテリジェント防災システムおよびインテリジェント防災方法
WO2005098362A1 (fr) Systeme et procede de navigation
CN102003965A (zh) 操作导航系统以提供路线指引的方法
CN101750072A (zh) 三维动画视频导航方法及系统
US6847886B2 (en) Method and apparatus for finding a location in a digital map
EP4089370A1 (fr) Procédé et dispositif pour vérifier la localisation et l'orientation actuelles d'un utilisateur à l'aide de repères
US7398156B2 (en) Point search apparatus and in-vehicle navigation apparatus
CN113240816A (zh) 基于ar和语义模型的城市精确导航方法及其装置
JP2000171264A (ja) 経路誘導方法
Gintner et al. Improving reverse geocoding: Localization of blind pedestrians using conversational ui
Bodgan et al. Using 3D urban models for pedestrian navigation support
Mikulowski et al. Ontological support for teaching the blind students spatial orientation using virtual sound reality
JP2973862B2 (ja) 動画像を利用した経路設定装置
CN102235868A (zh) 操作导航系统以提供路线指引的方法
Hirtle Wayfinding and orientation: cognitive aspects of human navigation
JP3651784B2 (ja) ナビゲーション装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 05821997

Country of ref document: EP

Kind code of ref document: A1