WO1999060338A1 - Information processing apparatus and pedestrian navigation system using the same - Google Patents
- Publication number
- WO1999060338A1 (PCT/JP1998/002151, JP9802151W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- unit
- pedestrian
- route
- conversion
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S367/00—Communications, electrical: acoustic wave systems and devices
- Y10S367/91—Portable sonar devices
Definitions
- the present invention relates to the field of information support technology for a moving pedestrian, and more particularly to a pedestrian navigation system for navigating a pedestrian by transmitting a position and information related to the position.
- In a conventional system, the attributes of the information reproduced at the destination are the same as the attributes sent from the source: if the source information is an image, the destination information is also an image; if the source is text, the destination is text; if it is a figure, it is played back as a figure; and if it is voice, it is played back as voice. In this state it was difficult to exchange information sufficiently, and in such a system the communication of information to users with disabilities, such as the visually impaired and the hearing impaired, cannot be said to be adequate.
- In the present invention, the media information of the transmission source is translated (converted) into an appropriate medium according to the state of action of the system user, so that information appropriate to the user's situation can be provided. For a visually impaired person, for example, if the attributes of the information from the source include not only audio but other attributes as well, more appropriate information can be provided by translating (converting) the media information received at the destination. If these media translation functions are implemented in real time at the source or destination, they can be realized by a function that dispatches various real-time media recognition functions and real-time media synthesis functions according to the usage situation.
- FIG. 1 is a system configuration diagram of the present invention.
- FIG. 2 is a system configuration diagram of a fixed route calculation device with a communication function.
- FIG. 3 is a system configuration diagram of a portable route guidance device with a communication function and a position / orientation detection function.
- Fig. 4 shows a fixed facility marker device with a communication function.
- Fig. 5 is an example of conventional route guidance.
- FIG. 6 is an example of route guidance according to the present invention.
- Figure 7 is an example of a facility map.
- Fig. 8 is an example of installation of facility markers.
- Fig. 9 is a side view of an example of installation of a facility marker.
- FIG. 10 is a plan view of an example of installation of a facility marker.
- FIG. 11 is an external view of a fixed route calculation device with a communication function.
- Fig. 12 shows an example where media cannot be transmitted.
- Figure 13 is an example of conversion by media translation.
- Fig. 14 is a basic operation diagram of media translation.
- FIG. 15 is a configuration diagram of an information processing apparatus according to the present invention.
- Figure 16 shows a road map and a facility map.
- FIG. 17 is an example of a route.
- Fig. 18 is a flow chart of the Dijkstra method.
- Fig. 19 shows an example of the portable route guidance device modified for both cars and pedestrians.
- FIG. 20 is a flow diagram of the path sentence analysis.
- FIG. 21 is a flowchart of extracting word candidates.
- FIG. 22 is a keyword matching flow chart.
- FIG. 23 is a diagram showing a flow chart of generating a graphical word string.
- Fig. 24 is a diagram showing route guidance sentences, keywords, analysis results of route sentences, and examples of figures.
- FIG. 25 is a flowchart of the graphic processing.
- FIG. 26 is a diagram showing a graphic drawing example.
- FIG. 27 is a diagram showing a graphic drawing example.
- FIG. 28 is a diagram showing a graphic drawing example.
- FIG. 29 is a diagram showing a graphic drawing example.
- FIG. 30 is a diagram showing a figure drawing example.
- Fig. 31 shows a media conversion table for sensor output.
- FIG. 32 is a diagram showing a configuration example of an optical sensor.
- FIG. 33 is a diagram showing a configuration example of an acceleration sensor.
- FIG. 34 is a diagram showing a configuration example of a volume sensor.
- FIG. 35 is a diagram showing a flow of a user action recognition processing section.
- FIG. 36 is a diagram showing a flow of a media translation control processing section.

BEST MODE FOR CARRYING OUT THE INVENTION
- FIG. 1 shows an example of the system configuration of the present invention.
- This system consists of a fixed route calculation and information providing device 1 with a communication function (hereafter, information providing device 1), a portable route guidance device 2 with a communication function and a position/orientation detection function (hereafter, route guidance device 2), and a fixed facility marker device 3 (hereafter, marker device 3).
- In general outline, a user (pedestrian) carries the route guidance device 2 and reports, for example, the current position and destination through the route guidance device 2.
- the information providing device 1 calculates an appropriate route using a necessary database (a map or the like) and returns the route information to the route guiding device 2.
- The user then moves to the destination according to the guidance provided by the route guidance device 2.
- Detailed route guidance can be provided by receiving, with the route guidance device 2, information from the marker devices 3 installed near facilities such as building entrances and elevators.
- FIG. 2 shows an example of a system configuration diagram of the information providing apparatus 1 according to the present invention.
- The information providing device 1 includes a pedestrian database (for example, a pedestrian digital road map 15, a facility map 16 including building structure and store information, public transport route and schedule information 17, information 18 on construction and traffic regulations that hinder walking and moving, and accident and traffic obstacle information 19), a route calculation unit 11 that finds an appropriate route from the database according to user requests (current location, destination, etc.), an information selection unit 12 that selects the necessary information in response to requests for facility guidance information, a guidance information generation unit 13 that separates user input into route-related requests and information-related requests and integrates the information obtained by the route calculation unit 11 and the information selection unit 12 to generate the necessary guidance information, and a communication unit 14 that communicates with the route guidance device 2 carried by the user.
- FIG. 3 shows an example of a system configuration of the route guidance device 2 according to the present invention.
- The route guidance device 2 is battery-driven. It includes a communication unit for communicating with the information providing device 1 and the marker device 3; a communication data storage unit for recording and storing communication information; an information navigation unit 203 that generates guidance information for pedestrians using, among other things, the facility marker orientation obtained from the beam receiver 210 attached to the route guidance device 2; a walking sensor 206 based on an optical sensor 205 or an acceleration sensor; a user action recognition unit 207 that recognizes the usage state (route guidance device 2 held in the hand or in a bag, pedestrian walking, running, stopped, etc.); and a media translation unit 204 that, based on the output of the user action recognition unit 207, converts information input by the user into a form that the information navigation unit 203 can receive, or converts guidance information obtained from the information navigation unit 203 into a medium that the user can receive, together with the input/output devices connected to it (display 213, speaker 214, vibrator 215, microphone 216, tablet 217, keyboard 218).
- FIG. 11 is a diagram showing an appearance of the route guidance device 2.
- The route guidance device 2 includes a flat display 213 on which a touch panel is superimposed, a GPS antenna 209, a beam receiver 210, a communication antenna 201, a speaker 214, and a microphone 216.
- The GPS antenna 209 and the communication antenna 201 may interfere with each other depending on the radio frequencies used and make operation unstable. It is therefore better either to install them at a distance from each other as shown in the figure, or to use a common antenna and separate the signals internally.
- FIG. 4 is a diagram showing a system configuration example of the marker device 3 according to the present invention.
- The marker device 3 includes an information generating unit 37 that stores and outputs facility information, one or more angle information superimposing units 34, 35, 36 each connected to the information generating unit 37, and a beam transmitter connected to each angle information superimposing unit.
- A communication unit 38 may also be provided for connection with an external information control device.
- Each beam transmitter uses light, for example, and its radiation range is limited by a lens system.
- As the angle, for example, a bearing measured clockwise through 360 degrees with map north as 0 degrees may be used.
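The bearing convention above (0 degrees at map north, increasing clockwise through 360 degrees) can be sketched as follows; the helper names are illustrative, not from the patent.

```python
import math

def bearing_to_vector(bearing_deg):
    """Convert a map bearing (0 deg = north, clockwise) to an (east, north) unit vector."""
    rad = math.radians(bearing_deg)
    return (math.sin(rad), math.cos(rad))

def relative_turn(current_deg, target_deg):
    """Signed turn in degrees from the current heading to a target bearing, in (-180, 180]."""
    diff = (target_deg - current_deg) % 360.0
    return diff - 360.0 if diff > 180.0 else diff
```

Note that with this clockwise-from-north convention, east is 90 degrees, so the usual sin/cos roles are swapped relative to standard mathematical angles.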
- The marker device 3 is installed at points where route guidance is provided, such as building entrances, stairs, elevators, and reception desks. As shown in FIG. 8, the marker device 3 is installed, for example, with beam transmitters facing three directions above the reception desk.
- Fig. 9 shows a side view
- Fig. 10 shows a top view.
- In the case of light, a lens system is used; in the case of radio waves, a predetermined beam is obtained by the form of the antenna.
- a position and orientation transmitter such as a marker device 3 installed at each point in the facility is required.
- Fig. 16 shows an example of a database based on a road map (a) and a facility map (b).
- The facility map (b) contains the position of each marker in the facility, the beam radiation direction, link information (IDs of adjacent nodes), and the like.
- To separate the route into a road route and a facility route, an intermediate goal is set at a transit point m on the road.
- the point ID of the adjacent road is acquired from the link information on the facility map.
- In the route calculation, the route on the road (start point, C1, C3, C4, m) is obtained first, and then the route within the facility (m, mh, mf1, me, mf3, mf3r2) is obtained.
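The two-stage calculation can be sketched as follows; the adjacency lists and the breadth-first helper are illustrative stand-ins for the patent's map databases and route calculation unit.

```python
# Hypothetical adjacency lists standing in for the road map and facility map.
ROAD_MAP = {"start": ["C1"], "C1": ["C3"], "C3": ["C4"], "C4": ["m"], "m": []}
FACILITY_MAP = {"m": ["mh"], "mh": ["mf1"], "mf1": ["me"], "me": ["mf3"],
                "mf3": ["mf3r2"], "mf3r2": []}

def two_stage_route(road, facility, start, transit, goal):
    """Route on the road map up to the transit point, then inside the facility."""
    def bfs(graph, s, g):                     # stand-in for the route calculation
        frontier, seen = [[s]], {s}
        while frontier:
            path = frontier.pop(0)
            if path[-1] == g:
                return path
            for nxt in graph.get(path[-1], []):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(path + [nxt])
        return None

    road_part = bfs(road, start, transit)
    facility_part = bfs(facility, transit, goal)
    return road_part + facility_part[1:]      # drop the duplicated transit point
```

Splitting the search this way means each stage touches only one database, which is the motivation for setting the intermediate goal at m.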
- Figure 18 shows the Dijkstra method, which is a typical algorithm for calculating the route.
- This algorithm takes the start point as the initial value, successively determines the minimum evaluation value (for example, distance) to adjacent points (nodes) up to the goal point, and finally obtains the route with the minimum evaluation value from the start point to the goal point.
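A minimal version of the Dijkstra method of Fig. 18, using distance as the evaluation value (the graph encoding is an assumption):

```python
import heapq

def dijkstra(graph, start, goal):
    """graph: {node: [(neighbor, distance), ...]}. Returns (total, path) or None."""
    queue = [(0.0, start, [start])]   # priority queue ordered by cost so far
    best = {}
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path          # first time the goal is popped, cost is minimal
        if node in best and best[node] <= cost:
            continue                   # already settled with a cheaper cost
        best[node] = cost
        for nxt, d in graph.get(node, []):
            heapq.heappush(queue, (cost + d, nxt, path + [nxt]))
    return None
```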
- The calculation time can be reduced by searching only the facility map of the floor at the goal point.
- FIG. 17 shows an example of the route calculation result in this example.
- the route information includes the name of the waypoint, the position of the waypoint, the approach angle to the waypoint, and the departure angle from the waypoint.
- Since pedestrians turn left and right and look back, their instantaneous heading changes drastically; with the approach and departure angles, the correct direction of travel can still be obtained.
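For instance, a turn instruction at a waypoint can be derived from the stored approach and departure angles alone, independent of the pedestrian's fluctuating heading (the angle thresholds below are illustrative assumptions):

```python
def turn_instruction(approach_deg, departure_deg):
    """Derive a guidance phrase from the approach/departure bearings
    (measured clockwise from map north) stored for a waypoint."""
    turn = (departure_deg - approach_deg) % 360.0
    if turn > 180.0:
        turn -= 360.0                 # normalize to (-180, 180]
    if abs(turn) < 30.0:              # assumed threshold for "straight"
        return "go straight"
    if abs(turn) > 150.0:             # assumed threshold for "turn back"
        return "turn back"
    return "turn right" if turn > 0 else "turn left"
```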
- The user carrying the route guidance device 2 is guided from the start point to the goal point. Outdoors, the current position and heading are obtained by GPS and compared with the route information to guide the user toward the destination. However, since GPS cannot be used inside a facility (e.g., a museum), guidance is continued by beam reception: the bearing can be obtained from the angle information, and the current position can be obtained by beam intensity measurement.
- Guidance methods using GPS and beams can be applied, for example, to a system for guiding a car from an urban area to an underground parking lot. It is also applicable to a system that moves containers from a container yard to a predetermined position in a ship in port logistics.
- the information conversion function will be described with reference to FIGS. 12 to 15.
- Recent information services are multi-media, and often express certain information using multiple media such as text, graphics, images, and audio.
- While moving (e.g., walking), visual information cannot be fully utilized, because watching a display can lead to accidents.
- voice information can be annoying to others. The same can be said for visual information for the visually impaired and audio information for the hearing impaired.
- Therefore, the attributes of information that a pedestrian can send and receive change with the situation, and the conversion that bridges this change is the media translation function.
- The basic operation, as shown in FIG. 14, is to convert a given medium A into text by a recognition technique, and to reproduce this in the final medium by a synthesis technique.
- The apparatus configuration is as shown in FIG. 15. That is, a media conversion table is added to the configuration of FIG. 14 to specify into which media the input media are to be converted.
- The basic configuration consists of a multimedia information receiving unit 151 that inputs multimedia information, a media translation module 152 (information converting unit) that converts the information received by the multimedia information receiving unit 151, and a multimedia information output unit 153 that outputs the information converted by the media translation module 152.
- The media translation module includes a usage state detector 154 that detects the usage state of the user of the device, a usage state judging unit 155 that judges the user's usage state from the detection results of the usage state detector 154 and a media conversion table 157 set in advance, and an information converter 156 that converts the information received by the multimedia information receiving unit 151 according to the judgment result of the usage state judging unit 155.
- The usage state detector may specifically be an optical sensor, an acceleration sensor, a sound volume sensor that measures the level picked up by a sound-collecting microphone, a temperature sensor, or the like.
- The media conversion table 157 is set either by grasping the action state with the sensors (205, 206), which correspond to the usage state detector 154, or by explicit setting by the pedestrian (user) through a menu or the like.
- For example, if the optical sensor reading is low (a dark place), it is judged that the device is in a bag or the like, and information is output audibly through an earphone or the like; if vibration analysis of the acceleration sensor or the like shows that the device is stationary, it is judged that the user is holding the route guidance device 2, and image information is presented as it is. Automatic change according to the information presentation situation thus becomes possible. Also, if the ambient sound level measured by the microphone indicates a noisy environment, voice information such as "C-3 is not accessible due to construction work" is converted into written form by voice recognition and displayed.
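The automatic selection just described can be sketched as a small decision rule; the thresholds and medium names are assumptions standing in for the media conversion table:

```python
def select_output_medium(light_level, is_moving, noise_level):
    """Pick a presentation medium from the usage state detected by the sensors.
    Thresholds are illustrative, not from the patent."""
    if light_level < 50:        # dark: device likely in a bag or pocket
        return "audio"          # read guidance aloud through an earphone
    if noise_level > 80:        # noisy surroundings: speech is unreliable
        return "text"           # show recognized speech as text on the display
    if not is_moving:           # bright and stationary: user is looking at it
        return "image"          # present image/map information as-is
    return "audio"              # walking: avoid demanding visual attention
```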
- FIG. 19 shows an example of carrying a pedestrian route guide and a car route guide together.
- It is a partially modified view of the portable route guidance device (FIG. 3).
- the user action recognition unit determines whether the user is walking or driving, and sends the result to the communication unit.
- a method of generating a route map from a route guidance sentence will be described with reference to FIGS. 20 to 30.
- The route guidance sentence is converted into a word string for graphic representation by the four steps in FIG. 20.
- First, word candidates are extracted from the guidance text. At that time, by focusing, for example, on runs of characters of the same character type, words can easily be extracted.
- Each word candidate is then compared with the keywords (key in Fig. 24) to find matches, and the attribute of the matching keyword (location, direction, route, distance, etc.) is attached to the word candidate.
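The keyword-matching step can be sketched like this; the keyword table is a hypothetical stand-in for the one in Fig. 24:

```python
# Hypothetical keyword table in the spirit of Fig. 24: keyword -> attribute.
KEYWORDS = {
    "Station A": "location",
    "north": "direction",
    "Odori": "route",
    "300 m": "distance",
}

def tag_word_candidates(candidates):
    """Keep only candidates found in the keyword table, attaching the attribute."""
    return [(w, KEYWORDS[w]) for w in candidates if w in KEYWORDS]
```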
- The above graphic word string is rendered into a figure by the procedure shown in FIG. 25.
- the first starting point is set in the waypoint buffer (processing 25-2).
- This waypoint is set near the center of the map screen (processing 25-3).
- In this state, the map drawing initial screen shown in Fig. 26 is set in the display memory.
- processing 25-4 to processing 25-15 are repeated until the waypoint becomes the end point.
- Station A, the starting point, is extracted from the above graphic word string (processing 25-4) and written into the display memory as a predetermined symbol or figure as shown in FIG. 27 (processing 25-5).
- Next, the direction (north), route (Odori), and distance (300 m) are extracted, and if the route line can be displayed on the screen at the initially set scale (processing 25-9), a line of predetermined length is drawn from the starting point, so that FIG. 28 is obtained in the display memory. The next waypoint is then taken out, and the process repeats.
- a route diagram can be generated from the route guidance sentence.
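The geometric core of the drawing step, placing the next waypoint from a direction and a distance, can be sketched as follows (the pixels-per-metre scale is an assumption):

```python
import math

def next_point(x, y, bearing_deg, dist_m, scale=0.1):
    """Screen position of the next waypoint: bearing measured clockwise from
    map north, distance scaled to pixels (scale px/m); screen y grows downward."""
    rad = math.radians(bearing_deg)
    return (x + math.sin(rad) * dist_m * scale,
            y - math.cos(rad) * dist_m * scale)
```

Applied to the example above ("north, 300 m"), the line is drawn straight up from the starting point, 30 pixels long at the assumed scale.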
- Route guidance sentences can be obtained by kana-kanji conversion from a keyboard, voice recognition, handwritten character recognition, or offline image recognition.
- the obtained route guidance sentence can be converted into a voice by voice synthesis, a graphic by the above processing, and the like.
- For the state determination processing by the behavior sensors, a classification table as shown in FIG. 31 can be provided, using, for example, light amount information from the optical sensor, movement information from the acceleration sensor, and volume information from the microphone.
- The information medium to be used for exchange in each determined state is defined in the table, which is stored in a memory area accessible by the user action recognition processing unit 206.
- FIG. 32 to FIG. 34 schematically show configuration examples of the sensor portions.
- Fig. 32 shows an example of the configuration of the optical sensor unit.
- The output voltage of a phototransistor or the like is converted from analog to digital (AD conversion in the figure) and stored in a buffer, which the user action recognition processing unit 206 reads as needed.
- the output of the optical sensor generally increases when the user is looking at the display, and it is determined that the state is “bright”.
- the output of the optical sensor is low in a state of being put in a bag or the like, and it is determined that the state is “dark”.
- FIG. 33 shows an example of the configuration of a three-axis acceleration sensor unit.
- The output of each sensor is stored in a buffer after AD conversion, and the contents of the buffer selected by the selector are read by the user action recognition processing unit 206 as needed.
- Fig. 34 shows an example of the configuration of a microphone sensor.
- The microphone output, which is generally AC, is converted to DC by a rectifier circuit, and the value is AD converted and stored in a buffer.
- As with the other sensors, the user action recognition processing unit 206 reads out the contents of the buffer selected by the selector as needed.
- FIG. 35 shows a processing flow of the user action recognition processing unit 206.
- The output of each sensor is taken out (processing 206-1 to processing 206-3), the classification table is accessed and the corresponding state is selected (processing 206-4), and the corresponding information media conversion mode is extracted and passed to the media translation control processing section 204.
- FIG. 36 shows a processing flow of the media translation control processing section 204.
- The information to be transmitted or received is extracted (processing 204-1), and it is determined whether the conversion mode corresponding to the extracted information is other than "no conversion" (processing 204-2).
- If conversion is necessary, recognition processing is performed on the source media (processing 204-3). For example, when converting voice information into character information, the voice information is converted into a character string by voice recognition processing. Character recognition, image recognition, and so on are used as needed.
- Then synthesis processing is performed for output (processing 204-4). In the above example, the process of expanding the character string into font lines corresponds to this. Speech synthesis, graphic drawing, and so on are used as needed.
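The flow of processings 204-1 to 204-4 can be sketched as a dispatch; the recognizer and synthesizers below are stubs standing in for real speech recognition and font/speech synthesis engines:

```python
# Stubs standing in for real recognition/synthesis engines (assumptions).
def recognize_speech(audio):
    return audio.upper()             # pretend speech recognition

def render_font(text):
    return f"[{text}]"               # pretend expansion of a string into font lines

def synthesize_speech(text):
    return f"<speech:{text}>"        # pretend speech synthesis

def media_translate(info, conversion_mode):
    """Outline of Fig. 36: return unchanged if no conversion is needed (204-2),
    otherwise recognize on the source medium (204-3) and synthesize for output (204-4)."""
    if conversion_mode == "none":
        return info
    if conversion_mode == "speech_to_text":
        text = recognize_speech(info["audio"])
        return {"text": render_font(text)}
    if conversion_mode == "text_to_speech":
        return {"audio": synthesize_speech(info["text"])}
    raise ValueError(f"unknown conversion mode: {conversion_mode}")
```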
- As an effect of the present invention, a detailed route guidance service to a facility can be provided when a pedestrian wants to reach a destination.
- Another advantage is that when information is exchanged while walking, working, or in a meeting, the method of information exchange can be changed according to the situation, so that necessary information can be obtained at a sufficiently high density.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Navigation (AREA)
- Traffic Control Systems (AREA)
Description
Claims
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/463,782 US6259990B1 (en) | 1998-05-15 | 1998-05-15 | Information processing apparatus and pedestrian navigation system using the same |
CN98814044.6A CN1292867A (zh) | 1998-05-15 | 1998-05-15 | 信息处理装置和使用该装置的行人导向系统 |
PCT/JP1998/002151 WO1999060338A1 (fr) | 1998-05-15 | 1998-05-15 | Dispositif de traitement de donnees et systeme de navigation pour pietons utilisant ce dispositif |
EP98919615A EP1081462A4 (en) | 1998-05-15 | 1998-05-15 | DATA PROCESSING DEVICE AND PEDESTRIAN NAVIGATION SYSTEM THAT THE DEVICE USES |
KR10-2000-7001875A KR100373666B1 (ko) | 1998-05-15 | 1998-05-15 | 정보 처리 장치 및 정보 처리 장치를 사용한 보행자용내비게이션 시스템 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP1998/002151 WO1999060338A1 (fr) | 1998-05-15 | 1998-05-15 | Dispositif de traitement de donnees et systeme de navigation pour pietons utilisant ce dispositif |
Publications (1)
Publication Number | Publication Date |
---|---|
WO1999060338A1 true WO1999060338A1 (fr) | 1999-11-25 |
Family
ID=14208205
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP1998/002151 WO1999060338A1 (fr) | 1998-05-15 | 1998-05-15 | Dispositif de traitement de donnees et systeme de navigation pour pietons utilisant ce dispositif |
Country Status (5)
Country | Link |
---|---|
US (1) | US6259990B1 (ja) |
EP (1) | EP1081462A4 (ja) |
KR (1) | KR100373666B1 (ja) |
CN (1) | CN1292867A (ja) |
WO (1) | WO1999060338A1 (ja) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001206649A (ja) * | 2000-01-24 | 2001-07-31 | Shimizu Corp | 携帯発信機信号を利用したエレベータ運行管理システム |
JP2001317947A (ja) * | 2000-03-01 | 2001-11-16 | Matsushita Electric Ind Co Ltd | ナビゲーション装置 |
WO2002006771A1 (de) * | 2000-07-13 | 2002-01-24 | Mueller Juergen W | Verfahren zur verknüpfung geographischer und kommerzieller daten sowie deren bereitstellung |
KR100719217B1 (ko) * | 2000-10-31 | 2007-05-16 | 엘지전자 주식회사 | 이동통신망에서의 목표물 위치확인 서비스 제공방법 |
JP2007139790A (ja) * | 2005-11-18 | 2007-06-07 | Navteq North America Llc | 詳細なローカル・データを有する地理データベース |
JP2008224507A (ja) * | 2007-03-14 | 2008-09-25 | Denso Corp | カーナビゲーション装置 |
JP2010091554A (ja) * | 2008-07-25 | 2010-04-22 | Navteq North America Llc | 公開エリア地図の位置付け |
JP6211217B1 (ja) * | 2016-08-10 | 2017-10-11 | 三菱電機ビルテクノサービス株式会社 | ビル用ビーコンシステム |
Families Citing this family (49)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6826472B1 (en) * | 1999-12-10 | 2004-11-30 | Tele Atlas North America, Inc. | Method and apparatus to generate driving guides |
GB0003150D0 (en) * | 2000-02-12 | 2000-04-05 | Univ Newcastle | Navigation and routing system |
US7035650B1 (en) * | 2000-06-14 | 2006-04-25 | International Business Machines Corporation | System and method for providing directions |
WO2002011396A2 (en) * | 2000-08-01 | 2002-02-07 | Hrl Laboratories, Llc | Apparatus and method for context-sensitive dynamic information service |
US6581000B2 (en) * | 2001-01-04 | 2003-06-17 | Carnegie Mellon University | Position location system and method |
US20040155815A1 (en) * | 2001-05-14 | 2004-08-12 | Motorola, Inc. | Wireless navigational system, device and method |
FR2825226B1 (fr) * | 2001-05-25 | 2008-02-01 | Fabien Beckers | Procede et systeme pour fournir des informations en relation avec la position occupee par un utilisateur dans un site |
JP2003050846A (ja) * | 2001-08-07 | 2003-02-21 | Hitachi Ltd | 情報伝達システム及びそれに用いる旅行サーバ及び携帯端末及び情報伝達方法 |
US8493370B2 (en) * | 2001-08-29 | 2013-07-23 | Palm, Inc. | Dynamic brightness range for portable computer displays based on ambient conditions |
KR100635460B1 (ko) * | 2001-10-27 | 2006-10-17 | 멀티화인테크(주) | 휴대용 안내 정보 제공 장치 |
EP1456808B1 (de) * | 2001-12-21 | 2005-06-22 | Siemens Aktiengesellschaft | Vorrichtung zum erfassen und darstellen von bewegungen |
US6933840B2 (en) | 2002-01-29 | 2005-08-23 | Hewlett-Packard Development Company, L.P. | System and method for configuring a printing device for a physical environment |
US7480512B2 (en) * | 2004-01-16 | 2009-01-20 | Bones In Motion, Inc. | Wireless device, program products and methods of using a wireless device to deliver services |
JP2004227468A (ja) * | 2003-01-27 | 2004-08-12 | Canon Inc | 情報提供装置、情報提供方法 |
US7688222B2 (en) | 2003-09-18 | 2010-03-30 | Spot Devices, Inc. | Methods, systems and devices related to road mounted indicators for providing visual indications to approaching traffic |
US7319387B2 (en) * | 2004-03-17 | 2008-01-15 | 3M Innovaative Properties Company | GPS interface for locating device |
US7024782B1 (en) * | 2004-10-28 | 2006-04-11 | Texas Instruments Incorporated | Electronic device compass operable irrespective of localized magnetic field |
US7386682B2 (en) * | 2005-02-11 | 2008-06-10 | International Business Machines Corporation | Reducing number of rejected snoop requests by extending time to respond to snoop request |
US7496445B2 (en) * | 2005-04-27 | 2009-02-24 | Proxemics, Llc | Wayfinding |
US7761226B1 (en) | 2005-07-27 | 2010-07-20 | The United States Of America As Represented By The Secretary Of The Navy | Interactive pedestrian routing system |
JP5002140B2 (ja) * | 2005-08-24 | 2012-08-15 | クラリオン株式会社 | ナビゲーション装置およびナビゲーション処理方法 |
JP2007066106A (ja) * | 2005-08-31 | 2007-03-15 | Fujitsu Ltd | 経路案内装置 |
US7706973B2 (en) * | 2006-01-03 | 2010-04-27 | Navitrail Llc | Computer-aided route selection |
US20070156335A1 (en) | 2006-01-03 | 2007-07-05 | Mcbride Sandra Lynn | Computer-Aided Route Selection |
US8000892B2 (en) * | 2007-06-12 | 2011-08-16 | Campus Destinations, Inc. | Pedestrian mapping system |
US20090112473A1 (en) * | 2007-10-31 | 2009-04-30 | Hung Sung Lu | Method for providing location and promotional information associated with a building complex |
TW200934207A (en) * | 2008-01-21 | 2009-08-01 | Inventec Appliances Corp | Method of automatically playing text information in voice by an electronic device under strong light |
US20130293396A1 (en) | 2008-03-15 | 2013-11-07 | James R. Selevan | Sequenced guiding systems for vehicles and pedestrians |
US8384562B2 (en) * | 2008-03-25 | 2013-02-26 | University Of Idaho | Advanced accessible pedestrian control system for the physically disabled |
US8374780B2 (en) * | 2008-07-25 | 2013-02-12 | Navteq B.V. | Open area maps with restriction content |
US8417446B2 (en) | 2008-07-25 | 2013-04-09 | Navteq B.V. | Link-node maps based on open area maps |
US8099237B2 (en) | 2008-07-25 | 2012-01-17 | Navteq North America, Llc | Open area maps |
US8339417B2 (en) * | 2008-07-25 | 2012-12-25 | Navteq B.V. | Open area maps based on vector graphics format images |
US20100023251A1 (en) * | 2008-07-25 | 2010-01-28 | Gale William N | Cost based open area maps |
US8229176B2 (en) | 2008-07-25 | 2012-07-24 | Navteq B.V. | End user image open area maps |
WO2010035274A2 (en) * | 2008-09-23 | 2010-04-01 | Girish Patil | A self - service kiosk providing path information to users |
WO2011092639A1 (en) * | 2010-01-29 | 2011-08-04 | Nokia Corporation | Systems, methods, and apparatuses for providing context-based navigation services |
JP5110405B2 (ja) * | 2010-04-07 | 2012-12-26 | Murata Machinery, Ltd. | Traveling vehicle system |
US8818714B2 (en) * | 2010-12-10 | 2014-08-26 | Sony Corporation | Portable navigation device and method with active elements |
US8621394B2 (en) | 2011-08-26 | 2013-12-31 | Nokia Corporation | Method, apparatus and computer program product for displaying items on multiple floors in multi-level maps |
US9253606B2 (en) | 2013-03-04 | 2016-02-02 | Here Global B.V. | Structure access characteristics determined from mobile unit data |
US11313546B2 (en) | 2014-11-15 | 2022-04-26 | James R. Selevan | Sequential and coordinated flashing of electronic roadside flares with active energy conservation |
US9593959B2 (en) * | 2015-03-31 | 2017-03-14 | International Business Machines Corporation | Linear projection-based navigation |
CN105167967B (zh) * | 2015-09-14 | 2018-04-03 | Shenzhen Grandsun Electronic Co., Ltd. | Blind guiding method and system |
US9989376B2 (en) | 2016-05-12 | 2018-06-05 | Tata Consultancy Services Limited | Systems and methods for generating signature ambient sounds and maps thereof |
US10677599B2 (en) * | 2017-05-22 | 2020-06-09 | At&T Intellectual Property I, L.P. | Systems and methods for providing improved navigation through interactive suggestion of improved solutions along a path of waypoints |
JP7100998B2 (ja) * | 2018-03-08 | 2022-07-14 | Honda Motor Co., Ltd. | Vehicle control device, vehicle control method, and program |
JP7216893B2 (ja) * | 2019-03-06 | 2023-02-02 | Toyota Motor Corp. | Mobile body and movement system |
WO2021024029A1 (en) * | 2019-08-08 | 2021-02-11 | Kukreja Ani Dave | Method and system for intelligent and adaptive indoor navigation for users with single or multiple disabilities |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08171558A (ja) * | 1994-12-19 | 1996-07-02 | Saburo Tanatsugi | Simultaneous conversion of voice information into character information and method for externally displaying the simultaneously converted character information |
JPH09126804A (ja) * | 1995-11-06 | 1997-05-16 | Toyota Motor Corp | Route guidance system |
JPH09282589A (ja) * | 1996-04-15 | 1997-10-31 | Nippon Telegraph & Telephone Corp (NTT) | Position and direction determination execution device and system |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5470233A (en) * | 1994-03-17 | 1995-11-28 | Arkenstone, Inc. | System and method for tracking a pedestrian |
US5508699A (en) * | 1994-10-25 | 1996-04-16 | Silverman; Hildy S. | Identifier/locator device for visually impaired |
JPH09287974A (ja) * | 1996-04-23 | 1997-11-04 | Nissan Motor Co., Ltd. | In-vehicle information providing device |
US5842145A (en) * | 1996-07-08 | 1998-11-24 | Zimmer; John S. | Apparatus for providing individualized maps to pedestrians |
US5806017A (en) * | 1996-08-19 | 1998-09-08 | Board Of Regents The University Of Texas System | Electronic autorouting navigation system for visually impaired persons |
- 1998-05-15 KR KR10-2000-7001875A patent/KR100373666B1/ko not_active IP Right Cessation
- 1998-05-15 WO PCT/JP1998/002151 patent/WO1999060338A1/ja not_active Application Discontinuation
- 1998-05-15 EP EP98919615A patent/EP1081462A4/en not_active Withdrawn
- 1998-05-15 US US09/463,782 patent/US6259990B1/en not_active Expired - Fee Related
- 1998-05-15 CN CN98814044.6A patent/CN1292867A/zh active Pending
Non-Patent Citations (1)
Title |
---|
See also references of EP1081462A4 * |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001206649A (ja) * | 2000-01-24 | 2001-07-31 | Shimizu Corp | Elevator operation management system using portable transmitter signals |
JP2001317947A (ja) * | 2000-03-01 | 2001-11-16 | Matsushita Electric Ind Co Ltd | Navigation device |
WO2002006771A1 (de) * | 2000-07-13 | 2002-01-24 | Mueller Juergen W | Method for linking geographic and commercial data and making them available |
KR100719217B1 (ko) * | 2000-10-31 | 2007-05-16 | LG Electronics Inc. | Method for providing a target location service over a mobile communication network |
JP2007139790A (ja) * | 2005-11-18 | 2007-06-07 | Navteq North America Llc | Geographic database with detailed local data |
JP2008224507A (ja) * | 2007-03-14 | 2008-09-25 | Denso Corp | Car navigation device |
JP2010091554A (ja) * | 2008-07-25 | 2010-04-22 | Navteq North America Llc | Positioning open area maps |
JP6211217B1 (ja) * | 2016-08-10 | 2017-10-11 | Mitsubishi Electric Building Techno-Service Co., Ltd. | Beacon system for buildings |
WO2018029831A1 (ja) * | 2016-08-10 | 2018-02-15 | Mitsubishi Electric Building Techno-Service Co., Ltd. | Beacon system for buildings |
Also Published As
Publication number | Publication date |
---|---|
US6259990B1 (en) | 2001-07-10 |
CN1292867A (zh) | 2001-04-25 |
KR20010023244A (ko) | 2001-03-26 |
EP1081462A1 (en) | 2001-03-07 |
EP1081462A4 (en) | 2003-08-27 |
KR100373666B1 (ko) | 2003-02-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO1999060338A1 (fr) | Data processing device and pedestrian navigation system using the same | |
US7277846B2 (en) | Navigation system | |
US8924149B2 (en) | Outdoor to indoor navigation system | |
CN102274109B (zh) | Handheld navigation device and navigation method for visually impaired persons | |
JP2674521B2 (ja) | Mobile body guidance device | |
US5832406A (en) | Vehicle navigation apparatus and method for route finding at road crossings | |
US7916948B2 (en) | Character recognition device, mobile communication system, mobile terminal device, fixed station device, character recognition method and character recognition program | |
US8694323B2 (en) | In-vehicle apparatus | |
US9864577B2 (en) | Voice recognition device and display method | |
KR100892079B1 (ko) | Navigation system | |
JPWO2016174955A1 (ja) | Information processing device and information processing method | |
JP4037866B2 (ja) | Position estimation device, position estimation method, and position estimation program for a mobile body | |
US6374183B1 (en) | Vehicle guidance method for navigation system | |
US9596204B2 (en) | Determination of a navigational text candidate | |
WO2003102816A1 (fr) | Data providing system | |
JPH10282987A (ja) | Speech recognition system and method | |
US20050144011A1 (en) | Vehicle mounted unit, voiced conversation document production server, and navigation system utilizing the same | |
JP3929011B2 (ja) | Navigation device | |
JP2004012155A (ja) | Pedestrian navigation system and portable information terminal | |
JPH11325946A (ja) | In-vehicle navigation device | |
KR100832940B1 (ko) | Navigation system providing acoustic route information | |
KR20030037190A (ko) | Navigation system using a mobile communication terminal and control method thereof | |
JP2024048581A (ja) | Guidance system, information providing device, guidance method, information providing method, and program | |
JPH11125533A (ja) | Navigation device and navigation method | |
JP2000311177A (ja) | Residential area guidance device and residential area guidance system | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 98814044.6 Country of ref document: CN |
|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): CN JP KR US |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE |
|
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 09463782 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1998919615 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020007001875 Country of ref document: KR |
|
WWP | Wipo information: published in national office |
Ref document number: 1998919615 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 1020007001875 Country of ref document: KR |
|
WWG | Wipo information: grant in national office |
Ref document number: 1020007001875 Country of ref document: KR |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: 1998919615 Country of ref document: EP |