CN100561133C - Method for matching camera images with map data in a mobile terminal, and route navigation method - Google Patents


Info

Publication number
CN100561133C
CN100561133C, CNB2006100927889A, CN200610092788A
Authority
CN
China
Prior art keywords
target
positional information
camera
display
captured
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CNB2006100927889A
Other languages
Chinese (zh)
Other versions
CN1880918A (en)
Inventor
郑文镐
Current Assignee (the listed assignees may be inaccurate)
LG Electronics Inc
Original Assignee
LG Electronics Inc
Priority date (an assumption, not a legal conclusion)
Filing date
Publication date
Application filed by LG Electronics Inc filed Critical LG Electronics Inc
Publication of CN1880918A
Application granted
Publication of CN100561133C


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S 19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S 19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/01 Detecting movement of traffic to be counted or controlled
    • G08G 1/04 Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/09 Arrangements for giving variable traffic instructions
    • G08G 1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G 1/0968 Systems involving transmission of navigation instructions to the vehicle
    • G08G 1/0969 Systems involving transmission of navigation instructions to the vehicle having a display in the form of a map

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Navigation (AREA)
  • Instructional Devices (AREA)

Abstract

The present invention relates to matching a camera image with map data in a mobile terminal equipped with a camera. Based on the position information of the camera, the position information of display objects included in the map data, such as buildings, is calculated, and the calculated position information is used to match the targets captured by the camera with the display objects. The captured image is shown on the screen, the text information of the display object matched with each target is read from the map data, and the text is displayed at the position of the corresponding target on the screen. When the user moves, the match information between targets and display objects is used to guide the user along the travel route, so that the user can verify targets such as buildings in person while being navigated.

Description

Method for matching camera images with map data in a mobile terminal, and route navigation method
Cross-reference to related applications
This application claims the benefit of the earlier filing dates of, and priority to, Korean application No. 10-2005-0051093 filed on June 14, 2005, Korean application No. 10-2005-0051099 filed on June 14, 2005, and Korean application No. 10-2005-0054397 filed on June 23, 2005, the contents of which are incorporated herein by reference.
Technical field
The present invention relates to a method for matching a camera image with map data in a mobile terminal. The invention also relates to a route navigation method using images from the camera of a mobile terminal.
Background art
In the Global Positioning System (GPS), four or more GPS satellites are placed in each of six orbital planes, each inclined at 55 degrees to the earth's equator, so as to maximize coverage for users anywhere on the earth's surface. Each GPS satellite orbits the earth approximately every 12 hours, transmitting navigation messages.
A GPS receiver receives the regularly transmitted navigation messages from at least four of the GPS satellites. From them the receiver detects the distance between the receiver and each GPS satellite and the position vector (3D position fix) of each satellite, and uses these to calculate its own position vector.
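As a rough illustration of this position fix (not part of the patent), the receiver's coordinates can be recovered from known anchor positions and measured distances by linearizing the range equations. The 2-D least-squares sketch below uses hypothetical anchor points in place of real satellite ephemerides; a real GPS solution works in 3-D and also solves for the receiver clock bias.

```python
import numpy as np

def trilaterate_2d(anchors, dists):
    """Estimate (x, y) from anchor points and measured distances.

    Subtracting the first range equation from the others turns
    (x - xi)^2 + (y - yi)^2 = di^2 into a linear least-squares system.
    """
    (x1, y1), d1 = anchors[0], dists[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], dists[1:]):
        A.append([2 * (xi - x1), 2 * (yi - y1)])
        b.append(d1**2 - di**2 + xi**2 + yi**2 - x1**2 - y1**2)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol  # estimated (x, y)

# Receiver actually at (3, 4); three hypothetical anchors.
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true = np.array([3.0, 4.0])
dists = [float(np.hypot(*(true - np.array(a)))) for a in anchors]
x, y = trilaterate_2d(anchors, dists)
```

With noise-free distances the linear system is exact and the estimate matches the true position.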
In a navigation system with a GPS receiver, the position vector detected by the GPS receiver is map-matched onto a digital map shown on the display screen. As a value-added service, the navigation system provides a route guidance service that informs the user of the current position or of the route from a starting point to a destination. The user of the navigation system can therefore easily find the destination from the starting point or current position by receiving the route guidance service of the navigation system.
In recent years, navigation systems have been installed in various portable terminals, such as mobile phones, personal digital assistants (PDAs), iBook phones and smart phones. Hereinafter, for simplicity, such a portable terminal is referred to simply as a mobile terminal. Thus, even a user walking to a destination can be guided along the route to the destination.
If the user of a mobile terminal with a navigation system uses the service that guides the user to an intended destination along a route, the user must usually check the real targets on the street against the targets shown on the digital map on the display of the mobile terminal.
In this case, the user must visually check and match the targets on the map against the real targets on the street one by one. For example, to be sure that a specified building "A" shown on the screen is the same as the actual building, the user must compare the building he or she actually sees on the street with the buildings displayed on the digital map, and then work out which building on the map really corresponds to building "A" on the screen.
In particular, if a target building is far from the user's current position, the user must go to the target building in person to obtain its information. Only by moving to the position of the target building and matching the information obtained there against the building shown on the map can the user confirm that the desired building corresponds to the target building on the map.
In addition, when the user travels to a destination by receiving a route from the navigation system on the mobile terminal, the navigation system usually guides the user along the route on a map displayed on the screen. The user must therefore find the destination by checking the buildings on the map against the real target buildings on the street one by one.
Summary of the invention
The present invention is directed to matching a camera image with map data in a mobile terminal.
Additional features and advantages of the invention will be set forth in the description that follows, and in part will be apparent from the description or may be learned by practice of the invention. The objects and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description, the claims, and the appended drawings.
To achieve these and other advantages, and in accordance with the purpose of the present invention as embodied and broadly described, the invention is embodied in a method for matching a camera image with map data in a mobile terminal. The method comprises: capturing at least one target with a camera on the mobile terminal; determining the position information of the camera that captures the at least one target; calculating the position information of the at least one captured target; calculating, based on the current position of the camera, the position information of at least one display object included in the map data; comparing the position information of the at least one captured target with the position information of the at least one display object; and, if the position information of the at least one display object is consistent with the position information of the at least one captured target, matching the at least one display object with the at least one captured target.
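The claimed flow can be paraphrased in a few lines of code. This is a sketch under stated assumptions, not the patent's implementation: every position is reduced to a local east/north coordinate pair in meters, and the names (`CapturedTarget`, `DisplayObject`, `match_targets`) and the 15 m tolerance are illustrative.

```python
from dataclasses import dataclass

@dataclass
class CapturedTarget:      # a target seen in the camera image
    tag: str               # e.g. an index within the frame
    pos: tuple             # calculated (east, north) position in meters

@dataclass
class DisplayObject:       # a building etc. taken from the map data
    text_info: str         # e.g. the building name stored in the map
    pos: tuple

def match_targets(captured, displayed, tol_m=15.0):
    """Pair each captured target with the nearest display object
    whose position agrees within tol_m meters, if any."""
    pairs = []
    for t in captured:
        best, best_d = None, tol_m
        for d in displayed:
            err = ((t.pos[0] - d.pos[0])**2 + (t.pos[1] - d.pos[1])**2) ** 0.5
            if err <= best_d:
                best, best_d = d, err
        if best is not None:
            pairs.append((t, best))
    return pairs

targets = [CapturedTarget("t0", (120.0, 45.0))]
objects = [DisplayObject("A Building", (118.0, 47.0)),
           DisplayObject("B Building", (300.0, -60.0))]
pairs = match_targets(targets, objects)
```

Here "A Building" lies within tolerance of the captured target and is matched; "B Building" is too far away and is ignored.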
In one aspect of the invention, determining the position information of the camera that captures the at least one target comprises: determining the current position of the camera from a GPS message received by a GPS receiver on the mobile terminal; determining the azimuth of the central axis of the camera using an azimuth sensor on the mobile terminal; and determining the tilt of the camera using a tilt sensor on the mobile terminal.
Preferably, the position information of the at least one captured target and of the at least one display object is calculated from the determined position information of the camera, based on the current position of the camera and the azimuth of its central axis. Preferably, the position information of the camera comprises the current position of the camera and the azimuth of its central axis.
In another aspect of the invention, calculating the position information of the at least one captured target comprises: extracting a contour from the at least one captured target; establishing a position-information calculation point for the at least one target from the extracted contour; calculating the azimuth of the calculation point based on the azimuth of the central axis of the camera; and calculating the distance from the current position of the camera to the calculation point.
Preferably, the contour is extracted from the at least one target by capturing an image of the at least one captured target. Preferably, the position information of the camera further comprises the tilt of the camera. Preferably, calculating the position information of the at least one captured target further comprises correcting the calculated distance from the current position of the camera to the calculation point.
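Once the calculation point's azimuth and distance are known, its geographic position can be projected from the camera's fix. The sketch below uses a flat-earth approximation that is adequate at camera-to-building ranges; the function name and coordinate values are my own, not from the patent.

```python
import math

def project_point(cam_lat, cam_lon, azimuth_deg, dist_m):
    """Position of a point dist_m meters from the camera along
    azimuth_deg (degrees clockwise from north), local flat-earth model."""
    earth_r = 6371000.0
    az = math.radians(azimuth_deg)
    d_north = dist_m * math.cos(az)
    d_east = dist_m * math.sin(az)
    lat = cam_lat + math.degrees(d_north / earth_r)
    lon = cam_lon + math.degrees(
        d_east / (earth_r * math.cos(math.radians(cam_lat))))
    return lat, lon

# Hypothetical fix: camera on the equator, target 1000 m due east.
lat, lon = project_point(0.0, 127.0, 90.0, 1000.0)
```

At 1000 m due east the latitude is unchanged and the longitude shifts by roughly 0.009 degrees.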
In still another aspect of the invention, calculating the position information of the at least one display object based on the current position of the camera comprises: matching the position information of the camera with the map data; establishing a position-information calculation point for the at least one display object; and calculating the position information of the calculation point of the at least one display object based on the map-matched position of the camera. Preferably, the position information of the camera comprises the current position of the camera and the azimuth of its central axis.
Preferably, calculating the position information of the calculation point of the at least one display object comprises calculating the distance from the current position of the camera to the calculation point of the at least one display object, and calculating the azimuth of the calculation point based on the azimuth of the central axis of the camera.
Preferably, the at least one display object is matched to the at least one captured target when the error between the position information of the at least one captured target and that of the at least one display object is within a predetermined value.
In still another aspect of the invention, the method further comprises reading from the map data the text information of the at least one display object matched with the at least one captured target, inserting the read text information at the corresponding position in the at least one captured target, and displaying the at least one captured target with the read text information inserted.
Preferably, inserting the read text information at the corresponding position in the at least one captured target comprises determining a display position for the text information at the position of the at least one captured target matched with the at least one display object, and mapping the text information of the at least one display object onto the determined display position.
Preferably, displaying the at least one captured target with the read text information inserted comprises determining a display color and displaying the contour and text information of the at least one captured target in the determined display color.
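One way such a display position might be derived (my own sketch, with an assumed image width and horizontal field of view, neither given by the patent) is to map the difference between a target's azimuth and the camera's central-axis azimuth onto a pixel column:

```python
def label_x(target_az, cam_az, img_width_px=640, hfov_deg=60.0):
    """Pixel column for a label: azimuth offset from the camera's
    central axis, scaled by the pixels-per-degree of the assumed FOV."""
    off = (target_az - cam_az + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
    if abs(off) > hfov_deg / 2:
        return None                      # target lies outside the frame
    return round(img_width_px / 2 + off * (img_width_px / hfov_deg))

x_center = label_x(45.0, 45.0)   # target dead ahead -> image center
x_right = label_x(60.0, 45.0)    # 15 degrees right of the axis
```

A target on the central axis lands at the middle column; one 15 degrees to the right lands three quarters of the way across a 640-pixel frame with a 60-degree field of view.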
According to another embodiment of the present invention, a route navigation method using images from the camera of a mobile terminal comprises: searching for a route from a starting point to a destination using map data stored in the mobile terminal; determining the position information of a camera on the mobile terminal; matching at least one display object included in the map with at least one target captured by the camera if the position information of the at least one display object is consistent with the position information of the at least one captured target; determining a navigation target among the at least one target captured by the camera; inserting a navigation target icon at the position of the determined navigation target in the at least one captured target; displaying the navigation target icon on the screen of the mobile terminal; and guiding the user along the route.
In one aspect of the invention, determining the position information of the camera on the mobile terminal comprises: determining the current position of the camera from a GPS message received by a GPS receiver on the mobile terminal; determining the azimuth of the central axis of the camera using an azimuth sensor on the mobile terminal; and determining the tilt of the camera using a tilt sensor on the mobile terminal.
In another aspect of the invention, matching the at least one display object included in the map data with the at least one target captured by the camera comprises: calculating the position information of the at least one captured target; calculating the position information of the at least one display object included in the map data; comparing the calculated position information of the at least one captured target with the calculated position information of the at least one display object; and matching the at least one display object with the at least one captured target based on their respective position information.
Preferably, the position information of the at least one captured target and of the at least one display object included in the map data is calculated based on the position information of the camera. Preferably, the position information of the camera comprises the current position of the camera and the azimuth of its central axis.
In still another aspect of the invention, calculating the position information of the at least one captured target comprises: extracting a contour from the at least one captured target; establishing a position-information calculation point for the at least one target from the extracted contour; calculating the azimuth of the calculation point based on the azimuth of the central axis of the camera; and calculating the distance from the current position of the camera to the calculation point.
Preferably, the position information of the camera further comprises the tilt of the camera. Preferably, calculating the position information of the at least one captured target further comprises correcting the calculated distance from the current position of the camera to the calculation point.
In still another aspect of the invention, calculating the position information of the at least one display object included in the map data comprises: matching the position information of the camera with the map data; establishing a position-information calculation point for the at least one display object; and calculating the position information of the calculation point of the at least one display object based on the map-matched position of the camera.
Preferably, the position information of the camera comprises the current position of the camera and the azimuth of its central axis. Preferably, calculating the position information of the calculation point of the at least one display object comprises calculating the distance from the current position of the camera to the calculation point of the at least one display object, and calculating the azimuth of the calculation point based on the azimuth of the central axis of the camera.
Preferably, the at least one display object is matched to the at least one captured target when the error between the position information of the at least one captured target and that of the at least one display object is within a predetermined value.
In still another aspect of the invention, the method further comprises reading from the map data the text information of the at least one display object matched with the at least one captured target, inserting the read text information at the corresponding position in the at least one captured target, and displaying the at least one captured target with the read text information inserted.
Preferably, inserting the read text information at the corresponding position in the at least one captured target comprises determining a display position for the text information at the position of the at least one captured target matched with the at least one display object, and mapping the text information of the at least one display object onto the determined display position.
Preferably, displaying the at least one captured target with the read text information inserted comprises determining a display color and displaying the contour and text information of the at least one captured target in the determined display color.
Preferably, determining the navigation target comprises determining whether a target present in the at least one captured target can serve as the destination; if such a target can serve as the destination, determining that target as the navigation target; and, if no target can serve as the destination, selecting from the at least one captured target a target lying on the route and determining the selected target as the navigation target.
Preferably, displaying the navigation target comprises displaying a specific navigation target icon on the screen of the mobile terminal according to whether the target determined as the navigation target is the destination.
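This selection rule fits in a few lines. In the sketch below, targets, the destination and the route are represented by name lists; these data shapes, and the boolean that distinguishes the two icon styles, are assumptions of the sketch, not the patent's implementation.

```python
def pick_navigation_target(matched_names, destination, route_names):
    """Prefer the destination if it is visible among the matched
    targets; otherwise fall back to a visible target on the route."""
    if destination in matched_names:
        return destination, True     # use the destination icon style
    for name in matched_names:
        if name in route_names:
            return name, False       # use the waypoint icon style
    return None, False               # nothing suitable in this frame

visible = ["C Tower", "B Building"]
route = ["B Building", "D Plaza", "A Building"]
target, is_dest = pick_navigation_target(visible, "A Building", route)
```

The destination "A Building" is not visible here, so the visible on-route target "B Building" becomes the navigation target and gets the non-destination icon.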
A first object of the present invention is to provide a method for matching a camera image with map data in a mobile terminal, by which targets such as buildings are photographed with the camera on the mobile terminal and the targets in the captured image are matched with the display objects in the map data.
A second object is to provide a method for matching a camera image with map data in a mobile terminal, by which, when a target in the captured image is matched with a display object in the map data, the text information of the display object is shown on the screen together with the image taken by the camera.
A third object is to provide a route navigation method using images from the camera of a mobile terminal, by which the display objects in the map data are matched with the targets in the captured image and the route from the user's current position to the destination is navigated using the images taken by the camera.
A fourth object is to provide a route navigation method using images from the camera of a mobile terminal, by which, while the user is guided along the route, the text information of the display objects in the map data is shown on the screen matched with the targets in the captured camera image, so as to guide the route.
The GPS receiver receives the navigation messages, and the position determined from the received navigation messages is taken as the current position of the camera; the position information of the camera is then completed by determining the azimuth of the central axis of the camera and the tilt of the camera from the azimuth and tilt sensors.
In addition, based on the position information determined for the camera, the position information of at least one target present in the captured image and the position information of the display objects included in the map data are calculated, and targets and display objects having the same position information are matched with each other.
The position information of the at least one target is calculated by extracting the contours of the targets from the captured image so as to separate the individual targets, and setting, for each separated target, a center point from which its position information is calculated. The azimuth of the center point used for calculating the target's position information is then calculated using the azimuth of the camera, and the distance from the current position of the camera to the center point of the target is calculated.
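A minimal sketch of deriving a center point from a separated target and turning its pixel offset into an azimuth. The binary-mask representation of a separated target and the linear pixel-to-degree mapping are simplifying assumptions, not the patent's method:

```python
def mask_centroid(mask):
    """Centroid (col, row) of a binary mask given as a list of rows."""
    pts = [(x, y) for y, row in enumerate(mask)
                  for x, v in enumerate(row) if v]
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

def centroid_azimuth(cx, cam_az, img_width_px=640, hfov_deg=60.0):
    """Azimuth of the center point: the camera-axis azimuth plus the
    angular offset of the centroid column from the image center."""
    return cam_az + (cx - img_width_px / 2) * (hfov_deg / img_width_px)

# Tiny toy mask: a 2x2 blob of "target" pixels inside a 2x4 frame.
mask = [[0, 1, 1, 0],
        [0, 1, 1, 0]]
cx, cy = mask_centroid(mask)
```

A centroid at the image center inherits the camera's azimuth; a centroid 160 pixels to the right of center, with a 60-degree field over 640 pixels, adds 15 degrees.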
The position information of the display objects included in the map data is calculated by matching the position information of the camera with the map data and setting a center point for each display object. The distance and azimuth between the camera and each display object are then calculated using the map-matched position and azimuth of the camera.
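For short camera-to-building ranges, the distance and azimuth from the map-matched camera position to a display object's center point can be approximated with an equirectangular model. This is a sketch; the coordinate values are hypothetical and the patent does not specify the geodesy:

```python
import math

def range_and_bearing(cam_lat, cam_lon, obj_lat, obj_lon):
    """Short-range distance (m) and azimuth (degrees clockwise from
    north) from the camera to a display object's center point."""
    earth_r = 6371000.0
    d_north = math.radians(obj_lat - cam_lat) * earth_r
    d_east = (math.radians(obj_lon - cam_lon)
              * earth_r * math.cos(math.radians(cam_lat)))
    dist = math.hypot(d_east, d_north)
    bearing = math.degrees(math.atan2(d_east, d_north)) % 360.0
    return dist, bearing

# A building roughly 1 km north of a hypothetical fix at (37.5, 127.0).
dist, bearing = range_and_bearing(37.5, 127.0, 37.509, 127.0)
```

A display object due north yields an azimuth of 0 degrees and a distance of about one kilometer.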
The matching between the at least one target and the display objects is carried out by comparing the position information of the at least one target with the position information of each display object and, if the comparison shows that the error between the two pieces of position information is within a preset range, matching the display object whose position-information error is within the preset range with the at least one target.
In addition, the text information of the display object matched with the at least one target is read from the map data, the read text information is inserted at the position of the relevant target in the captured image where the display object is matched, and the information is displayed.
The insertion and display of the text information in the captured image are carried out by determining the display position of the text information from the position of the target matched with the display object and mapping the text information of the display object onto the determined display position. The display color for the display object and its text information is then determined, and the contour and text information of the display object are displayed in the predetermined display color.
For the third and fourth objects of the invention, the starting point and destination are determined, and the route from the starting point to the destination is searched by loading the map data. As the user travels along the searched route, targets are photographed with the camera and at least one target in the captured image is matched with the display objects included in the map data.
In addition, among the targets in the captured image, at least one target matched with a predetermined display object is determined as the navigation target, and a navigation target icon is inserted at the position of the determined navigation target and shown on the screen, so as to guide the route from the starting point to the destination.
The navigation target is determined in such a way that, if the destination is present in the captured image, the destination is determined as the navigation target. For example, if a predetermined building is entered as the destination, it is judged whether that building is present in the captured image, and, if it is, that building is determined as the navigation target. If the destination is not present in the captured image, at least one target that is matched with a display object, is present in the captured image and lies on the route is selected and determined as the navigation target.
It is to be understood that both the foregoing general description and the following detailed description of the present invention are exemplary and explanatory, and are intended to provide further explanation of the invention as claimed.
Brief description of the drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention. Features, elements and aspects referenced by the same numbers in different figures represent identical, equivalent or similar features, elements or aspects according to one or more embodiments.
Fig. 1 is a block diagram illustrating the structure of a navigation system according to an embodiment of the invention.
Fig. 2 is a signal flow diagram illustrating the operation of displaying the text information of a display object at the position of a target, according to an embodiment of the invention.
Fig. 3 is a schematic diagram illustrating the operation of calculating the position information of a target, according to an embodiment of the invention.
Fig. 4 is a signal flow diagram illustrating the operation of displaying the text information at the position of a display object, according to an embodiment of the invention.
Fig. 5 illustrates an example of displaying the text information of display objects, according to an embodiment of the invention.
Fig. 6 is a signal flow diagram illustrating the route guidance operation, according to an embodiment of the invention.
Figs. 7a and 7b illustrate examples of the operation of displaying a navigation target icon on the captured image, according to an embodiment of the invention.
Detailed description of the embodiments
The present invention relates to be used for the method for the map datum coupling of the image of camera and portable terminal and the itinerary air navigation aid of using the image of the camera in the portable terminal.To explain the present invention with reference to the accompanying drawing of illustration the preferred embodiments of the present invention.
Fig. 1 is a block diagram illustrating the structure of a navigation system according to an embodiment of the invention. Referring to Fig. 1, reference number 100 denotes a GPS receiver. The GPS receiver 100 receives the navigation messages regularly transmitted by a plurality of GPS satellites and uses them to extract position coordinates.
Reference number 102 denotes a camera for photographing targets, and reference number 104 denotes an image processor. The image processor 104 captures the image of the target photographed by the camera 102 and performs image processing on it, such as inserting text information into the captured image and inserting a navigation target icon.
Reference number 106 denotes a map data storage in which map data is stored in advance. The map data in the map data storage 106 includes, for the various display objects such as buildings, position information, and text information such as the building names of the display objects.
Reference number 108 denotes a sensor unit. The sensor unit 108 is equipped with a tilt sensor for detecting tilt, such as a gyroscope, and an azimuth sensor, and detects the azimuth and tilt of the camera 102.
Reference number 110 denotes an input unit. The input unit 110 has a plurality of function keys and generates the corresponding operation commands in response to the user's selection of those keys. That is, by operating the function keys of the input unit 110, the user can enter a command for matching targets, a route guidance command, and the starting point and destination of a route to be navigated.
Reference number 112 denotes a controller. The controller 112 controls the camera 102 so that it photographs an intended target, and controls the image processor 104 so that it captures and processes the image of the intended target taken by the camera 102. In response to the output signal of the GPS receiver 100, the controller 112 determines the current position of the camera 102, and in response to the detection signal of the sensor unit 108, it determines the azimuth and tilt at which the camera 102 photographs a target. Using this azimuth and tilt, the controller 112 matches the photographed targets with the display objects in the map data stored in the map data storage 106. The controller 112 reads from the map data storage 106 the text information of the display object matched with a target, and controls the insertion of the read text information at the position of the matched target in the captured image. The controller 112 can also search for the user's route and control guidance of the user along the searched route.
Reference number 114 denotes a display driver. Under the control of the controller 112, the display driver 114 shows the image of the target photographed by the camera 102, together with the text information, on a display unit 116.
Reference number 118 denotes a voice signal generator. Under the control of the controller 112, the voice signal generator 118 generates voice signals for guiding the user along the route and outputs them to a speaker 120 so that the user can hear them.
In the navigation system configured as described above, to match the targets in the image photographed by the camera 102 with the display objects in the map data stored in the map data storage 106, the controller 112 first operates the GPS receiver 100 to receive navigation messages and detect the current position, and then operates the camera 102 to photograph a given target. When the camera 102 has photographed the target, the controller 112 controls the image processor 104 to capture the photographed image.
When the image processor 104 has captured the image photographed by the camera 102, the controller 112 determines the position information of the camera 102 from the output signal of the GPS receiver 100 and the detection signals of the sensor unit 108. That is, the controller 112 determines the current position of the camera 102 from the output signal of the GPS receiver 100, and determines the azimuth and tilt of the camera 102 from the detection signals of the sensor unit 108.
The position information of each target in the captured image is then calculated using the current position and azimuth just determined. The calculated position information is corrected by the tilt of the camera 102 so that the position information of each target is calculated accurately.
The controller 112 reads, from the map data storage 106, the map data of a predetermined area including the current position of the camera 102, and matches the current position of the camera 102 with the read map data. Using the matched current position and azimuth of the camera 102, the position information of each display object included in the map data, such as a building, is calculated.
The controller 112 compares the calculated position information of the targets with that of the display objects and, according to the comparison result, matches the targets with the display objects. When the matching between the targets and the display objects is complete, the controller 112 reads the text information of each matched display object. Under the control of the controller 112, the read text information is inserted at the position of the corresponding target in the image captured by the image processor 104. The captured image with the inserted text information is then output to the display unit 116 through the display driver 114 and displayed on the screen.
Preferably, the insertion positions of the individual text information items are distinguished from one another. Moreover, the text information items are preferably inserted and displayed at positions where they do not overlap one another.
Fig. 2 is a signal flow chart illustrating the operation of displaying text information at the positions of display objects according to an embodiment of the present invention. Referring to Fig. 2, the controller 112 determines whether a matching command for matching at least one target in the image photographed by the camera 102 with the display objects included in the map data has been generated (S200). The matching command may be issued directly by the user operating the input unit 110; it may also be generated when the user operates the input unit 110 to display target information or to navigate a travel route.
If the matching command is generated, the controller 112 operates the GPS receiver 100 and the camera 102 (S202), and controls the image processor 104 to capture the image photographed by the camera 102 at predetermined time intervals (S204).
The controller 112 then determines the position information of the camera 102 photographing the target (S206). Preferably, the controller 112 takes the current position calculated from the navigation messages received by the GPS receiver 100 as the current position of the camera 102, and determines the azimuth of the central axis of the camera 102 from the azimuth detected by the azimuth sensor, such as the gyroscope, of the sensor unit 108.
Once the position information of the camera 102 is determined, the controller 112 calculates the position information of at least one target from the image captured by the image processor 104 (S208).
The calculation of the position information of the at least one target (S208) is performed by first determining, from the detection signal of the sensor unit 108, the tilt of the camera 102 photographing the target (S208-2). That is, the sensor unit 108 has a tilt sensor for detecting the tilt of the camera 102, and the controller 112 determines the tilt of the camera 102 from the output of this tilt sensor.
When the tilt of the camera 102 is determined, the controller 112 controls the image processor 104 to separate the at least one target from the captured image (S208-4). Preferably, the image processor 104 extracts the contour of each target from the captured image and uses the extracted contours to separate the targets. The controller 112 then sets a center point for extracting the position coordinates of each separated target (S208-6). When the center points of the targets are set, the controller 112 calculates the position information of the corresponding targets based on the current position and azimuth of the camera 102 (S208-8).
For example, when four targets 300-1 to 300-4 are extracted from the image photographed by the camera 102 as shown in Fig. 3, the bottom center point of each target 300-1 to 300-4 is set as the center point for extracting the position information of the corresponding target. Azimuths θ1, θ2, θ3 and θ4 of the center points 301-1 to 301-4 of the four targets 300-1 to 300-4 are then calculated relative to the azimuth 311 of the central axis of the camera 102, determined from the detection signal of the sensor unit 108, and distances r1, r2, r3 and r4 are calculated.
The distances r1, r2, r3 and r4 from the current position 310 of the camera 102 to the center points 301-1 to 301-4 of the four targets 300-1 to 300-4 vary with the tilt and magnification of the camera 102. Therefore, the distances r1, r2, r3 and r4 are corrected using the tilt of the camera 102 detected by the sensor unit 108 and the magnification with which the targets were photographed (S208-10 of Fig. 2). In this way, the horizontal distances from the current position 310 of the camera 102 to the center points 301-1 to 301-4 of the four targets 300-1 to 300-4 can be calculated accurately.
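The projection of a target's center point into map coordinates from the camera position, azimuth and corrected distance can be sketched as follows. This is a minimal illustration, not the patent's implementation: the flat-earth coordinate frame (meters, x east / y north), the tilt-correction model r_h = r·cos(tilt), and all function and variable names are assumptions, since the description only states that the distance is corrected by the tilt and magnification of the camera.

```python
import math

def target_position(cam_x, cam_y, azimuth_deg, slant_r, tilt_deg):
    """Project a target's center point into local map coordinates
    (meters, x east / y north) from the camera's current position,
    the azimuth of the center point (degrees clockwise from north),
    and the measured distance corrected for camera tilt."""
    # Illustrative correction: keep only the horizontal component.
    r_h = slant_r * math.cos(math.radians(tilt_deg))
    theta = math.radians(azimuth_deg)
    return (cam_x + r_h * math.sin(theta),
            cam_y + r_h * math.cos(theta))

# Camera at the origin, target 100 m away at azimuth 90 degrees
# (due east), with the camera held level (tilt 0):
x, y = target_position(0.0, 0.0, 90.0, 100.0, 0.0)
```

With a tilt of 60 degrees the same measured distance would yield a horizontal distance of 50 m under this model, which is why the correction in step S208-10 matters for accuracy.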
Meanwhile, referring to Fig. 2, while calculating the position information of the at least one target, the controller 112 loads the map data stored in the map data storage 106. Preferably, the controller 112 loads the map data of a predetermined area from the map data storage 106 based on the detected current position of the camera 102 (S210). The controller 112 then calculates, from the loaded map data and based on the position information of the camera 102, the position information of a plurality of display objects, such as buildings (S212).
Preferably, the calculation of the position information of the plurality of display objects (S212) comprises matching the current position of the camera 102 with the map data (S212-2), setting a center point for each display object (S212-4), and calculating the distance from the determined current position of the camera 102 to the center point of each display object and, based on the determined azimuth of the camera 102, the azimuth of the center point of each display object (S212-6).
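The distance and azimuth of a display object's center point relative to the camera (steps S212-4 to S212-6) reduce to a rectangular-to-polar conversion. The sketch below assumes a local flat-earth frame (meters, x east / y north) and an azimuth convention of degrees clockwise from north; these conventions and the names are illustrative, not prescribed by the patent.

```python
import math

def display_object_polar(cam_x, cam_y, obj_x, obj_y):
    """Compute the distance (m) and azimuth (degrees clockwise from
    north) of a display object's center point as seen from the
    camera's map-matched position."""
    dx, dy = obj_x - cam_x, obj_y - cam_y
    r = math.hypot(dx, dy)
    # atan2(east, north) gives the bearing clockwise from north.
    azimuth = math.degrees(math.atan2(dx, dy)) % 360.0
    return r, azimuth

# A building whose center point lies 30 m east and 40 m north
# of the camera:
r, az = display_object_polar(0.0, 0.0, 30.0, 40.0)
```

These per-object (distance, azimuth) pairs are what gets compared against the values computed for the photographed targets in the matching step.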
When the position information of the targets in the captured image and the position information of the display objects in the map data have been calculated, the controller 112 compares the calculated position information of the at least one target with the position information of the display objects, and matches the display object having the corresponding position information with the at least one target (S214).
Preferably, an error value is set in advance; the position information of the at least one target is compared with the position information of the display objects included in the map data, and a display object whose position information agrees with that of a target within the preset error value is matched with that target.
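The matching within a preset error value can be sketched as a nearest-neighbor pairing. The Euclidean-distance metric, the greedy one-to-one pairing, and all identifiers here are illustrative assumptions; the patent only requires that a display object's position agree with a target's position within the preset error value.

```python
def match_targets(targets, display_objects, max_error=10.0):
    """Pair each photographed target with the nearest display object
    whose computed position agrees within `max_error` (meters).
    `targets` and `display_objects` map ids to (x, y) positions."""
    matches = {}
    used = set()  # each display object is matched at most once
    for tid, (tx, ty) in targets.items():
        best, best_d = None, max_error
        for oid, (ox, oy) in display_objects.items():
            if oid in used:
                continue
            d = ((tx - ox) ** 2 + (ty - oy) ** 2) ** 0.5
            if d <= best_d:
                best, best_d = oid, d
        if best is not None:
            matches[tid] = best
            used.add(best)
    return matches

targets = {"t1": (100.0, 200.0), "t2": (-50.0, 80.0)}
objects = {"city_hall": (103.0, 198.0),
           "library": (-48.0, 82.0),
           "bank": (400.0, 0.0)}
result = match_targets(targets, objects)
```

Here "bank" stays unmatched because it lies far outside the error value, which mirrors how unrelated display objects in the loaded map area are ignored.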
After matching the at least one target in the captured image with the display objects in the map data, the controller 112 reads, from the map data storage 106, the text information of the display objects matched with the at least one target (S216). The controller 112 then inserts the read text information into the captured image and displays it on the screen (S218). Preferably, the text information of each matched display object is inserted at the position of the corresponding target and output to the display driver 114 for display on the screen of the display unit 116.
To insert the text information at the position of the at least one target matched with a display object and display it on the screen, the controller 112 first determines the display position at which the text information is to be inserted for the matched target (S400), as shown in Fig. 4. The display position is determined as an upper, middle or lower position of the target at which the text information does not overlap other text information. Once the display position of the text information is determined, the controller 112 displays the text information at the determined position (S402).
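The choice among upper, middle and lower display positions that avoids overlapping other labels (step S400) can be sketched with a simple rectangle-overlap test. The candidate order, pixel geometry (y increasing downward) and fallback rule are illustrative assumptions.

```python
def choose_label_position(target_box, label_size, placed):
    """Return the first of the upper, middle and lower label
    rectangles around `target_box` (x, y, w, h) that does not
    overlap any rectangle already in `placed`."""
    x, y, w, h = target_box
    lw, lh = label_size
    candidates = [
        (x, y - lh, lw, lh),              # above the target
        (x, y + (h - lh) // 2, lw, lh),   # middle of the target
        (x, y + h, lw, lh),               # below the target
    ]

    def overlaps(a, b):
        ax, ay, aw, ah = a
        bx, by, bw, bh = b
        return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

    for cand in candidates:
        if not any(overlaps(cand, p) for p in placed):
            return cand
    return candidates[0]  # fall back to the upper position

# The first label goes above its target; a second label for the
# same spot is pushed down to the middle position.
box = (10, 50, 40, 60)
first = choose_label_position(box, (40, 12), [])
second = choose_label_position(box, (40, 12), [first])
```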
In the present invention, so that the user can accurately associate each item of text information with the corresponding target, a display color is determined for the at least one target matched with a display object and its text information (S404), and the contour of the target and the text information are displayed in the determined color (S406).
For example, as shown in Fig. 5, the camera 102 captures a photographed image, and a target 500 in the captured image is matched with a display object in the map data. A display position 510 for the text information of the display object is determined, and the text information is inserted at the determined display position 510 and displayed. Preferably, the contour of the target 500 and the text information are displayed in the same color.
After the text information has been inserted at the positions of the targets in the captured image as described above, the controller 112 determines whether a matching end command for the targets has been generated (S220 of Fig. 2). The matching end command may be generated by the user operating the input unit 110, or when the guidance of the user's travel route is terminated.
Referring to Fig. 2, if the matching end command has not been generated, the controller 112 returns to step S204 and captures the image photographed by the camera 102. The targets in the captured image are thus matched with the display objects in the map data, and the operation of displaying the text information is repeated. If the matching end command is generated, the controller 112 terminates the operation of displaying the text information.
Meanwhile, to navigate the user's travel route, the controller 112 first receives, through the input unit 110, the starting point and destination of the travel route to be navigated. When the starting point and destination are input, the controller 112 loads the map data stored in the map data storage 106 and searches for a travel route from the starting point to the destination using the loaded map data.
When the travel route has been found, the controller 112 captures, through the image processor 104, the image photographed by the camera 102 in order to match the targets in the image with the display objects included in the map data. The controller 112 also displays the captured image on the screen of the display unit 116, so that the matching information between the targets and the display objects is used to navigate the user's travel route.
Referring to Fig. 6, the controller 112 determines whether a travel route navigation command has been input from the input unit 110 (S600). If the navigation command is input, the controller 112 turns on the GPS receiver 100 and the camera 102 for normal operation (S602), and then determines the starting point and destination of the travel route to be navigated (S604).
Preferably, if the current position is input through the input unit 110 as the starting point, the position determined by the GPS receiver 100 from the received navigation messages is set as the starting point of the travel route. When the user commands the navigation of a travel route, the controller 112 loads the map data stored in the map data storage 106 and displays it on the screen of the display unit 116, so that the user can directly input the starting point and destination on the map displayed on the display unit 116.
When the starting point and destination are determined, the controller 112 extracts map data from the map data storage 106 (S606) and searches for an optimal travel route from the starting point to the destination using the extracted map data (S608). Preferably, the route search is performed by one of the various search operations known in the art.
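One of the well-known route search operations the description alludes to is Dijkstra's algorithm over a weighted road graph; the sketch below shows it on a toy graph. The graph, node names and cost units are purely illustrative, and the patent does not commit to any particular search method.

```python
import heapq

def shortest_route(graph, start, dest):
    """Dijkstra's shortest path over {node: [(neighbor, cost), ...]}.
    Returns the node list from start to dest, or None if unreachable."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    visited = set()
    while heap:
        d, node = heapq.heappop(heap)
        if node in visited:
            continue
        visited.add(node)
        if node == dest:
            break
        for nbr, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    if dest not in dist:
        return None
    path, node = [], dest
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1]

# Illustrative road graph: costs could be distances in km.
roads = {
    "start":  [("plaza", 2.0), ("bridge", 5.0)],
    "plaza":  [("bridge", 1.0), ("station", 6.0)],
    "bridge": [("station", 2.0)],
}
route = shortest_route(roads, "start", "station")
```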
When the route search is complete, the controller 112 controls the image processor 104 to capture the image photographed by the camera 102 at predetermined time intervals (S610), and matches the targets in the captured image with the display objects included in the map data (S612). The text information of the display objects matched with the targets is then read from the map data in the map data storage 106, inserted into the captured image, and displayed (S614).
Preferably, the operation of matching the targets in the captured image with the display objects included in the map data and displaying the text information is the same as described above with reference to Fig. 2. After the matching between the targets and the display objects and the display of the text information, the controller 112 determines whether a target corresponding to the destination exists in the captured image (S616).
If a target corresponding to the destination exists in the captured image, the controller 112 sets that target as the navigation target (S618). If no target corresponding to the destination exists in the captured image, the controller 112 sets a given target located on the travel route as the navigation target (S620).
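The decision in steps S616 to S620 can be sketched as follows: prefer the destination if it is among the matched targets in the captured image, otherwise fall back to a target lying on the travel route. The data shapes and identifiers are illustrative assumptions.

```python
def select_navigation_target(matched_targets, destination, route_targets):
    """Return (target_id, is_destination) for the navigation target:
    the destination if it appears among the matched targets in the
    captured image, otherwise the first matched target that lies on
    the travel route, otherwise (None, False)."""
    if destination in matched_targets:
        return destination, True
    for target in matched_targets:
        if target in route_targets:
            return target, False
    return None, False

# The destination is not visible yet, but a building on the route is:
target, is_dest = select_navigation_target(
    ["office_tower", "post_office"],   # targets matched in the image
    "city_hall",                       # destination display object
    {"post_office", "plaza"})          # display objects on the route
```

The boolean result corresponds to the two different indicators 700 and 710 of Figs. 7a and 7b.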
Once the navigation target is determined, the controller 112 inserts a navigation target indicator at the position of the target corresponding to the navigation target in the captured image, and displays the image with the inserted navigation target indicator on the display unit 116 through the display driver 114 so that the user can see it (S622).
The controller 112 then navigates the user along the travel route using the image and text information displayed on the display unit 116 (S624). When the direction of the travel route changes or the navigation target is to be announced, the controller 112 controls the audio signal generator 118 to generate navigation audio signals and output the generated audio signals to the speaker 120, so that the user is guided along the travel route by voice. The controller 112 then determines whether the navigation of the travel route to the destination has been completed (S626).
If, according to the result of the determination, the navigation of the travel route to the destination has not yet been completed, the controller 112 captures the image photographed by the camera 102, matches the targets in the captured image with the display objects in the map data to set the navigation target, and then repeats the operation of inserting and displaying the navigation target indicator. Preferably, the navigation target indicator is displayed differently depending on whether or not the target set as the navigation target is the destination, so that the user can recognize the destination.
Preferably, if the target set as the navigation target is the destination, a navigation target indicator 700 is displayed on it, as shown in Fig. 7a. If a given target located on the travel route is set as the navigation target, a different navigation target indicator 710 is displayed on it, as shown in Fig. 7b.
The foregoing descriptions of specific embodiments of the present invention have been presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the invention to the precise forms disclosed; obviously, many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, thereby enabling others skilled in the art to best utilize the invention and the various embodiments, with various modifications, as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the appended claims and their equivalents.

Claims (32)

1. A method of matching an image from a camera with map data in a mobile terminal, the method comprising:
photographing at least one target by a camera located on the mobile terminal;
determining position information of the camera photographing the at least one target;
calculating position information of the photographed at least one target;
calculating, based on a current position of the camera, position information of at least one display object included in the map data;
comparing the position information of the photographed at least one target with the position information of the at least one display object; and
matching the at least one display object with the photographed at least one target if the position information of the at least one display object is consistent with the position information of the photographed at least one target.
2. The method of claim 1, wherein determining the position information of the camera photographing the at least one target comprises:
determining the current position of the camera according to GPS messages received by a GPS receiver located on the mobile terminal;
determining an azimuth of a central axis of the camera using an azimuth sensor located on the mobile terminal; and
determining a tilt of the camera using a tilt sensor located on the mobile terminal.
3. the method for claim 1, wherein, position angle based on the central shaft of the current location of described camera and described camera, by the positional information of determined described camera, calculate the positional information of at least one captured target and the positional information of described at least one display-object.
4. the method for claim 1, wherein the positional information of described camera comprises the position angle of the central shaft of the current location of described camera and described camera.
5. The method of claim 4, wherein calculating the position information of the photographed at least one target comprises:
extracting a contour from the photographed at least one target;
establishing a position information calculation point of the at least one target from the extracted contour;
calculating an azimuth of the position information calculation point based on the azimuth of the central axis of the camera; and
calculating a distance from the current position of the camera to the position information calculation point.
6. The method of claim 5, wherein extracting the contour from the at least one target is performed by capturing the photographed at least one target.
7. The method of claim 4, wherein the position information of the camera further comprises a tilt of the camera.
8. The method of claim 5, wherein calculating the position information of the photographed at least one target further comprises correcting the calculated distance from the current position of the camera to the position information calculation point.
9. the method for claim 1, wherein based on the current location of described camera, the positional information of calculating described at least one display-object comprises:
Positional information and map datum coupling with described camera;
Set up the positional information calculation point of described at least one display-object; And
Based on the position of the camera of described map datum coupling, calculate the positional information of the positional information calculation point of described at least one display-object.
10. method as claimed in claim 9, wherein, the positional information of described camera comprises the position angle of the central shaft of the current location of described camera and described camera.
11. method as claimed in claim 9, wherein, the positional information of calculating the positional information calculation point of described at least one display-object comprises:
The distance of calculating from the current location of described camera to the positional information calculation point of described at least one display-object; And
Based on the position angle of the central shaft of described camera, the position angle of calculating described positional information calculation point.
12. the method for claim 1, wherein, when the error amount between the positional information of the positional information of captured described at least one target and described at least one display-object is in predetermined value, carry out that described at least one display-object is matched at least one captured target.
13. the method for claim 1 further comprises:
Read text message with at least one display-object of at least one captured object matching from map datum;
Relevant position at least one target that the text message insertion of being read is captured; And
Be presented at wherein insert at least one target text message of reading, captured.
14. method as claimed in claim 13, wherein, the relevant position that the text message of being read is inserted at least one captured target comprises:
With the position of at least one captured target of described at least one display-object coupling, determine the display position of described text message; And
The text message of described at least one display-object is mapped on the determined display position.
15. method as claimed in claim 13 wherein, is presented at least one target text message, captured that wherein insertion read and comprises:
Determine Show Color; And
Use determined Show Color, show the profile and the described text message of at least one captured target.
16. A travel route navigation method using an image from a camera in a mobile terminal, the method comprising:
searching for a travel route from a starting point to a destination using map data stored in the mobile terminal;
determining a position of a camera located on the mobile terminal;
matching at least one display object included in the map data with at least one target photographed by the camera if position information of the at least one display object is consistent with position information of the photographed at least one target;
determining a navigation target among the at least one target photographed by the camera;
inserting a navigation target icon at the position of the determined navigation target in the photographed at least one target;
displaying the navigation target icon on a screen of the mobile terminal; and
navigating the travel route.
17. The method of claim 16, wherein determining the position information of the camera located on the mobile terminal comprises:
determining a current position of the camera according to GPS messages received by a GPS receiver located on the mobile terminal;
determining an azimuth of a central axis of the camera using an azimuth sensor located on the mobile terminal; and
determining a tilt of the camera using a tilt sensor located on the mobile terminal.
18. The method of claim 16, wherein matching the at least one display object included in the map data with the at least one target photographed by the camera comprises:
calculating position information of the photographed at least one target;
calculating position information of the at least one display object included in the map data;
comparing the calculated position information of the photographed at least one target with the calculated position information of the at least one display object; and
matching the at least one display object with the photographed at least one target based on their respective position information.
19. The method of claim 18, wherein the position information of the photographed at least one target and of the at least one display object included in the map data is calculated based on the position information of the camera.
20. The method of claim 18, wherein the position information of the camera comprises the current position of the camera and the azimuth of the central axis of the camera.
21. The method of claim 20, wherein calculating the position information of the photographed at least one target comprises:
extracting a contour from the photographed at least one target;
establishing a position information calculation point of the at least one target from the extracted contour;
calculating an azimuth of the position information calculation point based on the azimuth of the central axis of the camera; and
calculating a distance from the current position of the camera to the position information calculation point.
22. The method of claim 21, wherein the position information of the camera further comprises a tilt of the camera.
23. The method of claim 21, wherein calculating the position information of the photographed at least one target further comprises correcting the calculated distance from the current position of the camera to the position information calculation point.
24. The method of claim 18, wherein calculating the position information of the at least one display object included in the map data comprises:
matching the position information of the camera with the map data;
establishing a position information calculation point of the at least one display object; and
calculating position information of the position information calculation point of the at least one display object based on the position of the camera matched with the map data.
25. The method of claim 24, wherein the position information of the camera comprises the current position of the camera and the azimuth of the central axis of the camera.
26. The method of claim 24, wherein calculating the position information of the position information calculation point of the at least one display object comprises:
calculating a distance from the current position of the camera to the position information calculation point of the at least one display object; and
calculating an azimuth of the position information calculation point based on the azimuth of the central axis of the camera.
27. The method of claim 18, wherein matching the at least one display object with the photographed at least one target is performed when an error value between the position information of the photographed at least one target and the position information of the at least one display object is within a predetermined value.
28. The method of claim 18, further comprising:
reading, from the map data, text information of the at least one display object matched with the photographed at least one target;
inserting the read text information at a corresponding position of the photographed at least one target; and
displaying the photographed at least one target having the read text information inserted therein.
29. The method of claim 28, wherein inserting the read text information at the corresponding position of the photographed at least one target comprises:
determining a display position of the text information at the position of the photographed at least one target matched with the at least one display object; and
mapping the text information of the at least one display object onto the determined display position.
30. The method of claim 28, wherein displaying the photographed at least one target having the read text information inserted therein comprises:
determining a display color; and
displaying a contour of the photographed at least one target and the text information using the determined display color.
31. The method of claim 16, wherein determining the navigation target comprises:
determining whether a target usable as the destination is present among the photographed at least one target;
setting, if a target present among the photographed at least one target is determined to be usable as the destination, that target as the navigation target; and
setting, if no target is determined to be usable as the destination, a target selected from among the photographed at least one target as lying on the travel route as the navigation target.
32. The method of claim 31, wherein displaying the navigation target comprises displaying a specific navigation target icon on the screen of the mobile terminal according to whether the target set as the navigation target is the destination.
CNB2006100927889A 2005-06-14 2006-06-14 Map datum and route air navigation aid in matching camera photographic images and the portable terminal device Active CN100561133C (en)

Applications Claiming Priority (4)

- KR1020050051099, priority date 2005-06-14
- KR1020050051093A (KR100674805B1), priority date 2005-06-14, filing date 2005-06-14: "Method for matching building between camera image and map data"
- KR1020050051093, priority date 2005-06-14
- KR1020050054397, priority date 2005-06-23

Publications (2)

- CN1880918A, published 2006-12-20
- CN100561133C, published 2009-11-18


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102313554A (en) * 2010-06-30 2012-01-11 株式会社电装 Onboard navigation system
CN102519478A (en) * 2011-11-16 2012-06-27 深圳市凯立德科技股份有限公司 Streetscape destination guiding method and device

Families Citing this family (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100775123B1 (en) * 2006-09-15 2007-11-08 삼성전자주식회사 Method of indexing image object and image object indexing system using the same
KR100862148B1 (en) * 2007-01-08 2008-10-09 에스케이 텔레콤주식회사 Building information data integration apparatus and method on digital map
JP2010534455A (en) * 2007-03-24 2010-11-04 躍軍 閻 Portable digital imaging system combining positioning navigation information and image information
CN101109643B (en) * 2007-08-22 2011-06-29 广东瑞图万方科技有限公司 Navigation apparatus
DE112008003481T5 (en) * 2007-12-28 2010-12-30 Mitsubishi Electric Corp. navigation device
KR20090123227A (en) * 2008-05-27 2009-12-02 삼성전자주식회사 Offering apparatus of searching service, method and program thereof
US8098894B2 (en) 2008-06-20 2012-01-17 Yahoo! Inc. Mobile imaging device as navigator
CN101619976B (en) * 2008-07-01 2016-01-20 联想(北京)有限公司 A kind of position positioning retrieval device and method
KR101541076B1 (en) 2008-11-27 2015-07-31 삼성전자주식회사 Apparatus and Method for Identifying an Object Using Camera
WO2011025236A2 (en) * 2009-08-24 2011-03-03 Samsung Electronics Co., Ltd. Mobile device and server exchanging information with mobile apparatus
CN101753807B (en) * 2009-12-16 2012-11-28 惠州Tcl移动通信有限公司 Image pick-up device
US20110169947A1 (en) * 2010-01-12 2011-07-14 Qualcomm Incorporated Image identification using trajectory-based location determination
US20110279446A1 (en) 2010-05-16 2011-11-17 Nokia Corporation Method and apparatus for rendering a perspective view of objects and content related thereto for location-based services on mobile device
CN102338639B (en) * 2010-07-26 2015-04-22 联想(北京)有限公司 Information processing device and information processing method
US9639857B2 (en) 2011-09-30 2017-05-02 Nokia Technologies Oy Method and apparatus for associating commenting information with one or more objects
KR101265711B1 (en) 2011-11-30 2013-05-20 주식회사 이미지넥스트 3d vehicle around view generating method and apparatus
KR101919366B1 (en) * 2011-12-22 2019-02-11 한국전자통신연구원 Apparatus and method for recognizing vehicle location using in-vehicle network and image sensor
CN103954293B (en) * 2012-05-30 2016-10-05 常州市新科汽车电子有限公司 The method of work of navigator
US9404751B2 (en) * 2012-06-06 2016-08-02 Samsung Electronics Co., Ltd. Apparatus and method for providing 3D map showing area of interest in real time
CN102829788A (en) * 2012-08-27 2012-12-19 北京百度网讯科技有限公司 Live action navigation method and live action navigation device
CN103090875A (en) * 2012-11-26 2013-05-08 华南理工大学 Real-time real-scene matching vehicle navigation method and device based on double cameras
CN103077624B (en) * 2012-12-28 2015-07-29 天津爱迪尔软件开发有限公司 A kind of instant navigation road condition system based on GPS and air navigation aid
CN103134489B (en) * 2013-01-29 2015-12-23 北京凯华信业科贸有限责任公司 The method of target localization is carried out based on mobile terminal
CN104034335B (en) * 2013-03-08 2017-03-01 联想(北京)有限公司 Method and image capture device that image shows
KR101459522B1 (en) * 2013-02-15 2014-11-07 브이앤아이 주식회사 Location Correction Method Using Additional Information of Mobile Instrument
KR102222336B1 (en) 2013-08-19 2021-03-04 삼성전자주식회사 User terminal device for displaying map and method thereof
WO2015025195A1 (en) * 2013-08-23 2015-02-26 Insight Robotics Limited A method of determining the location of a point of interest and the system thereof
US9818196B2 (en) 2014-03-31 2017-11-14 Xiaomi Inc. Method and device for positioning and navigating
CN104596509B (en) * 2015-02-16 2020-01-14 杨阳 Positioning method and system, and mobile terminal
JP6288060B2 (en) * 2015-12-10 2018-03-07 カシオ計算機株式会社 Autonomous mobile device, autonomous mobile method and program
CN105890597B (en) * 2016-04-07 2019-01-01 浙江漫思网络科技有限公司 A kind of assisted location method based on image analysis
CN106998447B (en) * 2017-03-31 2018-05-11 大庆安瑞达科技开发有限公司 Wide area, oil field infrared panorama imaging radar scout command and control system
DE102018200827A1 (en) * 2018-01-19 2019-07-25 Robert Bosch Gmbh Method for aligning cards of a LIDAR system
CN113739797A (en) * 2020-05-31 2021-12-03 华为技术有限公司 Visual positioning method and device
EP4359733A1 (en) * 2021-06-22 2024-05-01 Grabtaxi Holdings Pte. Ltd. Method and system for gathering image training data for a machine learning model

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100513660B1 (en) * 2003-05-23 2005-09-09 엘지전자 주식회사 Method for creating three-dimensional map from two-dimensional map
KR20050058810A (en) * 2003-12-12 2005-06-17 주식회사 에브리웨어 Image processing system and method for electronic map

Also Published As

Publication number Publication date
KR20060130420A (en) 2006-12-19
CN1880918A (en) 2006-12-20
KR100674805B1 (en) 2007-01-29

Similar Documents

Publication Publication Date Title
CN100561133C (en) Method for matching camera-photographed image with map data and travel route guidance method in portable terminal
EP2840358B1 (en) Matching camera-photographed image with map data in portable terminal
US9546879B2 (en) User terminal, method for providing position and method for guiding route thereof
JP4994028B2 (en) Gasoline price information collection system, gasoline price information collection method, and navigation apparatus
CN101583842B (en) Navigation system, portable terminal device, and peripheral-image display method
US7822545B2 (en) Mobile terminal with navigation function
CN1896684B (en) Geographic data collecting system
US8600677B2 (en) Method for feature recognition in mobile communication terminal
EP3168571B1 (en) Utilizing camera to assist with indoor pedestrian navigation
US20080040024A1 (en) Method and apparatus of displaying three-dimensional arrival screen for navigation system
KR20140054162A (en) Method for ensuring continuity of service of a personal navigation device and device thereof
KR102392998B1 (en) Route guidandce apparatus and control method for the same
KR100734678B1 (en) Method for displaying building information
JP4352031B2 (en) Navigation system, terminal device, and map display method
EP2565676A1 (en) Apparatus and method for correcting position information of portable terminal in multi-path zone
KR100668969B1 (en) Method for guiding road using camera image
KR20000013568A (en) Navigation system displaying photograph image of target point
JP2009198508A (en) Route guidance device
RU2324236C2 (en) Method for matching a camera-photographed image with map data in a portable terminal and method for travel route guidance
JP6537189B2 (en) Map display device and map display method
JP7250499B2 (en) Equipment location management system
JP4544008B2 (en) Navigation device
JPH10132562A (en) Distance measuring equipment
JPH03226623A (en) Navigation apparatus for vehicle
JPH03191814A (en) Vehicle-running guiding apparatus

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant