WO2020255286A1 - Pairing display device, pairing display system and pairing display method - Google Patents
- Publication number
- WO2020255286A1 (PCT/JP2019/024248; JP2019024248W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- unit
- moving body
- processing unit
- pairing
- Prior art date
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/001—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
- G09G3/002—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to project the image of a two-dimensional display, such as an array of light emitting or modulating elements or a CRT
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/123—Traffic control systems for road vehicles indicating the position of vehicles, e.g. scheduled vehicles; Managing passenger vehicles circulating according to a fixed timetable, e.g. buses, trains, trams
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/005—Traffic control systems for road vehicles including pedestrian guidance indicator
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/74—Projection arrangements for image reproduction, e.g. using eidophor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3191—Testing thereof
- H04N9/3194—Testing thereof including sensor feedback
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2354/00—Aspects of interface with display user
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/04—Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
Definitions
- Facilities include, for example, hospitals, airports or commercial facilities.
- the moving body either carries the user in the facility or moves following the user. Examples of the moving body include an electric wheelchair, an electric cart, and a mobile robot.
- the user can travel to a destination in the facility by informing the moving body of that destination.
- the server transmits reservation status data, which indicates the current usage reservation status of the robots, to the user terminal; the user reviews the reservation status data on the user terminal, and
- reservation data for reserving the use of the chosen robot is transmitted to the server.
- the server confirms the robot usage reservation by authenticating the user based on the reservation data.
- in this way, a user terminal can be used to make a reservation for using a robot, that is, to perform pairing.
- the robot authenticates the user who has arrived at the facility and performs pairing.
- however, the user generally does not know which of the robots placed in the facility are already paired, and therefore does not know which robot to approach for pairing. The user must either search for an available robot by himself or herself, or ask the facility manager which robots are available, which is inconvenient.
- the present invention solves the above problem, and its purpose is to obtain a pairing display device, a pairing display system, and a pairing display method capable of displaying that a moving body and a user are paired.
- the pairing display device includes a detection processing unit that detects, using a detection unit, a user paired with a moving body having a display unit and the detection unit, and an output processing unit that, when the user is detected by the detection processing unit, displays information indicating that the user is paired with the moving body on the floor surface around the moving body by using the display unit.
- a user paired with a moving body is detected by using the detection unit, and when the user is detected, information indicating that the user is paired with the moving body is displayed on the floor surface around the moving body by using the display unit. By visually recognizing the information displayed on the floor surface, the user can recognize that he or she is paired with the moving body displaying that information.
- FIG. 4A is a block diagram showing a hardware configuration that realizes the function of the pairing display device according to the first embodiment.
- FIG. 4B is a block diagram showing a hardware configuration for executing software that realizes the function of the pairing display device according to the first embodiment.
- FIG. 5 is a diagram showing a display example of the pairing state of a moving body and a user.
- FIG. 6 is a block diagram showing the configuration of the pairing display device according to the second embodiment.
- FIG. 8A is a diagram showing an example of an operation image according to the second embodiment.
- FIG. 8B is a diagram showing the progress of the operation with respect to the operation image of FIG. 8A.
- FIG. 8C is a diagram showing the completion of the operation on the operation image of FIG. 8A.
- FIG. 9A is a diagram showing an operation of an operation image by a user with an attendant.
- FIG. 9B is a diagram showing an example of a display for confirming an attendant.
- FIG. 9C is a diagram showing a display example of the pairing state between the moving body, the user, and the attendant.
- FIG. 10 is a block diagram showing the configuration of the pairing display device according to the third embodiment.
- FIG. 1 is a block diagram showing a configuration of the pairing display device 10 according to the first embodiment.
- the mobile body 1 can move autonomously, and examples thereof include an electric wheelchair, an electric cart, and a mobile robot.
- the moving body 1 shown in FIG. 1 includes a display unit 2, a detection unit 3, a sound output unit 4, and a pairing display device 10.
- the display unit 2 is a display unit that displays information on the floor surface B around the moving body 1, and is, for example, a projector (projection unit) that projects information on the floor surface B. Further, the display unit 2 can display information three-dimensionally on the floor surface B around the moving body 1. For example, when the display unit 2 is a projector, the projector projects information on the floor surface B around the moving body 1 in three dimensions.
- “displaying in three dimensions” or “projecting in three dimensions” means displaying or projecting information in an expression that looks three-dimensional to human vision.
- the display unit 2 does not necessarily have to display the information in three dimensions, and may display the information in two dimensions.
- the detection unit 3 is a detection unit that detects the user A around the moving body 1, and is, for example, a camera device capable of photographing the surroundings of the moving body 1.
- the user A is a person paired with the mobile body 1, and the appearance information of the user A is registered in the pairing display device 10.
- the camera device serving as the detection unit 3 photographs the user A and outputs the video information to the pairing display device 10.
- the detection unit 3 may be a sensor that combines any of infrared rays, light, and sound waves with a camera device.
- the sound output unit 4 is an audio output unit that outputs audio around the moving body 1, and is, for example, a speaker.
- the sound output unit 4 outputs sound effect information, voice guidance, and an alarm instructed by the pairing display device 10.
- the sound effect information is sound information corresponding to the information displayed on the floor surface B around the moving body 1 by the display unit 2.
- the pairing display device 10 displays the pairing between the mobile body 1 and the user A.
- the pairing display device 10 shown in FIG. 1 includes an output processing unit 10a and a detection processing unit 10b.
- the output processing unit 10a uses the display unit 2 to display information indicating that the user A is paired with the moving body 1 on the floor surface B around the moving body 1.
- the information indicating that the user A is paired with the mobile body 1 is, for example, an image 20 of at least one of the user A's name, face image, or specific mark.
- the output processing unit 10a displays the image 20 on the floor surface B around the moving body 1 at the timing when the user A enters the detection range of the detection unit 3.
- the output processing unit 10a can display the image 20 in an area on the floor surface B around the moving body 1 that is an effective detection range of the detection unit 3.
- the effective detection range is a range in which the detection unit 3 can stably detect an object.
- when the detection unit 3 is a camera device, the effective detection range is defined by the viewing angle of the camera device, and is also called the stable detection range. Since the image 20 is displayed in the area that is the effective detection range of the detection unit 3, the user A can be guided into the area where the detection unit 3 detects stably.
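The notion of an effective detection range defined by the camera's viewing angle can be illustrated with a little geometry. The sketch below is illustrative only — the function name, parameters, and the circular field-of-view simplification are assumptions, not from the publication. It checks whether a floor point lies inside the band of floor that a downward-tilted camera can see.

```python
import math


def in_effective_range(point_xy, camera_height, tilt_deg, fov_deg):
    """Return True if a floor point (x, y), given in metres relative to
    the point on the floor directly below the camera, lies inside the
    band of floor covered by the camera's field of view.

    The camera is assumed to look down at `tilt_deg` below horizontal
    with a circular field of view of `fov_deg` (a simplification of a
    real lens and of the effective detection range described above).
    """
    x, y = point_xy
    dist = math.hypot(x, y)
    # Depression angle from the camera down to the floor point.
    depression = math.degrees(math.atan2(camera_height, dist))
    half_fov = fov_deg / 2.0
    # Inside the footprint if that angle falls within tilt +/- half-FOV.
    return tilt_deg - half_fov <= depression <= tilt_deg + half_fov
```

For example, with a camera 1 m above the floor, tilted 45 degrees down with a 60-degree field of view, a point 1 m out lies inside the stable range, while a point 10 m out falls below the lower edge of the field of view and is outside it.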
- the image 20 is display information for showing that the user A is paired with the moving body 1, and is composed of figures, characters, or a combination thereof.
- the image 20 may be an animation image whose display mode changes with time.
- the output processing unit 10a projects the image 20 on the floor surface B around the moving body 1 in three dimensions using the projector.
- the output processing unit 10a may use the sound output unit 4 to output sound according to the information displayed on the floor surface B around the moving body 1. Since the output mode of the sound effect is defined by the frequency, rhythm, and tempo of the sound effect, the output processing unit 10a may change these.
- the detection processing unit 10b uses the detection unit 3 to detect the user A who is operating the image 20. For example, the detection processing unit 10b analyzes video of the surroundings of the moving body 1 captured by the camera device serving as the detection unit 3, and can detect the user A in the video based on the appearance information of the user A and the image analysis result. For the image analysis, a method such as pattern matching is used.
- FIG. 2 is a flowchart showing a pairing display method according to the first embodiment.
- the detection processing unit 10b detects the user A paired with the moving body 1 (step ST1). For example, the detection processing unit 10b performs image analysis of the image around the moving body 1 taken by the camera device, and detects the user A based on the image analysis result. If user A is not detected (step ST1; NO), the detection processing unit 10b repeats the detection process in step ST1 until user A is detected.
- if the user A is detected (step ST1; YES), the output processing unit 10a displays the image 20 on the floor surface B around the moving body 1 by using the display unit 2 (step ST2).
- the face image of the user A is displayed on the floor surface B as the image 20.
- by visually recognizing the image 20 displayed on the floor surface B, the user A can recognize that he or she is paired with the moving body 1.
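The ST1/ST2 flow just described can be sketched as a simple polling loop. This is a hedged Python sketch in which a stubbed `detect_user` callable stands in for the camera-based detection processing unit 10b and `display_on_floor` for the output processing unit 10a; both names and the stubbing are illustrative, not from the publication.

```python
def pairing_display_loop(detect_user, display_on_floor, max_polls=100):
    """Step ST1: poll the detection unit until the paired user is found.
    Step ST2: display the pairing image (e.g. the user's face image)
    on the floor surface around the moving body.
    """
    for _ in range(max_polls):
        user = detect_user()                     # ST1: analyse a camera frame
        if user is not None:                     # step ST1; YES
            display_on_floor(f"pairing image for {user}")  # step ST2
            return user
    return None                                  # step ST1; NO on every poll


# Usage with stubs: the "camera" only sees user A on the third frame.
frames = iter([None, None, "user A"])
shown = []
result = pairing_display_loop(lambda: next(frames, None), shown.append)
```

After the loop, `result` holds the detected user and `shown` holds the one pairing image that was "projected" once detection succeeded.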
- FIG. 3 is a block diagram showing a configuration of the pairing display system according to the first embodiment.
- the pairing display system shown in FIG. 3 is a system that displays that the mobile body 1 and the user A are paired, and includes the mobile body 1 and the server 30.
- the moving body 1 can move autonomously, and examples thereof include an electric wheelchair, an electric cart, and a mobile robot.
- the mobile body 1 shown in FIG. 3 includes a display unit 2, a detection unit 3, a sound output unit 4, and a communication unit 5.
- the communication unit 5 communicates with the server 30.
- the display unit 2, the detection unit 3, and the sound output unit 4 each operate based on the control signal received from the server 30 through the communication unit 5.
- the display unit 2 is a display unit that displays information on the floor surface B around the mobile body 1 based on the control information received from the server 30 through the communication unit 5.
- the display unit 2 is, for example, a projector that projects information on the floor surface B.
- the detection unit 3 detects the user A around the moving body 1 and transmits the detection result to the server 30 through the communication unit 5.
- the sound output unit 4 outputs voice to the surroundings of the mobile body 1 based on the control information received from the server 30 through the communication unit 5.
- the server 30 is a device that displays the pairing of the mobile body 1 and the user A by using the display unit 2 based on the information received from the mobile body 1. As shown in FIG. 3, the server 30 includes an output processing unit 10a, a detection processing unit 10b, and a communication unit 10c. The communication unit 10c communicates with the communication unit 5 included in the mobile body 1.
- the output processing unit 10a displays the image 20 on the floor surface B around the moving body 1 by transmitting a control signal to the moving body 1 through the communication unit 10c to control the display unit 2.
- the detection processing unit 10b detects the user A by transmitting a control signal to the mobile body 1 through the communication unit 10c and controlling the detection unit 3.
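In this system configuration the server never touches the display or camera directly; it acts on the moving body only by exchanging control signals over the paired communication units. A minimal sketch of that indirection follows — the message format, class names, and string-based "frame" matching are hypothetical stand-ins, not the publication's protocol.

```python
class MobileBody:
    """Stands in for moving body 1: its communication unit 5 receives
    control signals and dispatches them to the display or detection unit."""

    def __init__(self):
        self.displayed = []                         # what display unit 2 projects
        self.camera_frames = ["frame-with-user-A"]  # stubbed camera output

    def receive(self, signal):
        if signal["target"] == "display":           # control display unit 2
            self.displayed.append(signal["payload"])
            return {"ok": True}
        if signal["target"] == "detect":            # query detection unit 3
            frame = self.camera_frames.pop(0) if self.camera_frames else ""
            return {"ok": True, "frame": frame}
        return {"ok": False}


class PairingServer:
    """Stands in for server 30: units 10a/10b act on the moving body only
    by sending control signals through communication unit 10c."""

    def __init__(self, link):
        self.link = link                            # stands in for the network

    def detect_user(self):                          # detection processing unit 10b
        reply = self.link.receive({"target": "detect"})
        return "user A" if "user-A" in reply.get("frame", "") else None

    def show_pairing(self, user):                   # output processing unit 10a
        self.link.receive({"target": "display",
                           "payload": f"paired with {user}"})


body = MobileBody()
server = PairingServer(body)
user = server.detect_user()
if user:
    server.show_pairing(user)
```

The design point is that `PairingServer` holds no reference to the display or camera themselves, only to the communication link, mirroring the split between server 30 and the units on the moving body.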
- the pairing display device 10 includes a processing circuit for executing the processing from step ST1 to step ST2 in FIG.
- the processing circuit may be dedicated hardware, or may be a CPU (Central Processing Unit) that executes a program stored in the memory.
- FIG. 4A is a block diagram showing a hardware configuration that realizes the functions of the pairing display device 10.
- FIG. 4B is a block diagram showing a hardware configuration for executing software that realizes the functions of the pairing display device 10.
- the input interface 100 is an interface for relaying information output from the detection unit 3 to the detection processing unit 10b included in the pairing display device 10.
- the output interface 101 is an interface for relaying information output from the output processing unit 10a to the display unit 2, the sound output unit 4, or both of them.
- the processing circuit 102 is, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.
- the functions of the output processing unit 10a and the detection processing unit 10b in the pairing display device 10 may be realized by separate processing circuits, or these functions may be collectively realized by one processing circuit.
- when the processing circuit is the processor 103 shown in FIG. 4B, the functions of the output processing unit 10a and the detection processing unit 10b in the pairing display device 10 are realized by software, firmware, or a combination of software and firmware.
- the software or firmware is described as a program and stored in the memory 104.
- the memory 104 is, for example, a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory).
- a part may be realized by dedicated hardware and a part may be realized by software or firmware.
- for example, the function of the output processing unit 10a may be realized by the processing circuit 102 as dedicated hardware, while the function of the detection processing unit 10b may be realized by the processor 103 reading and executing the program stored in the memory 104.
- the processing circuit can realize the above functions by hardware, software, firmware, or a combination thereof.
- FIG. 5 is a diagram showing a display example of a pairing state between the moving body 1A and the user A1 and a pairing state between the moving body 1B and the user A2.
- the user A1 is paired with the mobile body 1A
- the user A2 is paired with the mobile body 1B.
- a pairing display device 10 is mounted on the moving body 1A and the moving body 1B, respectively.
- the output processing unit 10a indicates that the mobile body 1A and the user A1 are paired by using the display unit 2 of the mobile body 1A. Display on the surrounding floor surface B. Similarly, in the pairing display device 10 mounted on the mobile body 1B, the output processing unit 10a uses the display unit 2 to move that the mobile body 1B and the user A2 are in a paired state. Display on the floor surface B around the body 1B.
- the detection processing unit 10b detects the movement of the user A1 by using the detection unit 3.
- the output processing unit 10a displays the image 20a below the user A1, and the dotted-line image 20b extending from the display unit 2 of the moving body 1A to the image 20a, on the floor surface B in accordance with the movement of the user A1 detected by the detection processing unit 10b.
- similarly, the output processing unit 10a displays the image 20a below the user A2, and the dotted-line image 20b extending from the display unit 2 of the moving body 1B to the image 20a, on the floor surface B in accordance with the movement of the user A2 detected by the detection processing unit 10b.
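The tracking display described above — the mark under the user (image 20a) and the dotted line from the moving body to that mark (image 20b) — can be recomputed from each newly detected user position. The following is an illustrative sketch; the coordinates, function name, and dot spacing are assumptions, not taken from the publication.

```python
def pairing_marks(body_xy, user_xy, steps=5):
    """Return floor positions for the mark under the user (image 20a)
    and the dotted line from the moving body toward that mark (image 20b).
    Calling this again whenever the detection unit reports a new user
    position makes both projected images follow the user. Coordinates
    are simple 2-D floor points.
    """
    bx, by = body_xy
    ux, uy = user_xy
    # Dotted line: evenly spaced dots from the body toward the user.
    dots = [(bx + (ux - bx) * i / steps, by + (uy - by) * i / steps)
            for i in range(1, steps)]
    return {"under_user": (ux, uy), "dotted_line": dots}


# As the user walks, recomputing with the newly detected position
# makes the projected marks track the user.
marks = pairing_marks((0.0, 0.0), (2.0, 2.0))
```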
- the detection processing unit 10b notifies the user terminal 40, through the communication unit 10e, that pairing between the user A and the moving body 1 has been established.
- the user A uses the user terminal 40 to transmit the appearance information of the user A to the pairing display device 10A.
- the communication unit 10e outputs the appearance information of the user A received from the user terminal 40 to the detection processing unit 10b.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
- Selective Calling Equipment (AREA)
- Telephone Function (AREA)
- Transforming Electric Information Into Light Information (AREA)
- Traffic Control Systems (AREA)
- Controls And Circuits For Display Device (AREA)
- Manipulator (AREA)
Abstract
Description
FIG. 1 is a block diagram showing the configuration of the pairing display device 10 according to Embodiment 1. The moving body 1 can move autonomously; examples include an electric wheelchair, an electric cart, and a mobile robot. The moving body 1 shown in FIG. 1 includes the display unit 2, the detection unit 3, the sound output unit 4, and the pairing display device 10.
FIG. 2 is a flowchart showing the pairing display method according to Embodiment 1.
The detection processing unit 10b detects the user A paired with the moving body 1 (step ST1). For example, the detection processing unit 10b performs image analysis on video of the surroundings of the moving body 1 captured by the camera device, and detects the user A based on the result of the image analysis. If the user A is not detected (step ST1; NO), the detection processing unit 10b repeats the detection process of step ST1 until the user A is detected.
FIG. 3 is a block diagram showing the configuration of the pairing display system according to Embodiment 1. In FIG. 3, components identical to those in FIG. 1 are given the same reference numerals, and their description is omitted. The pairing display system shown in FIG. 3 is a system that displays that the moving body 1 and the user A are paired, and includes the moving body 1 and the server 30.
The functions of the output processing unit 10a and the detection processing unit 10b in the pairing display device 10 are realized by a processing circuit. That is, the pairing display device 10 includes a processing circuit for executing the processing from step ST1 to step ST2 of FIG. 2. The processing circuit may be dedicated hardware, or may be a CPU (Central Processing Unit) that executes a program stored in a memory.
FIG. 5 is a diagram showing a display example of the pairing state between the moving body 1A and the user A1 and the pairing state between the moving body 1B and the user A2. In FIG. 5, the user A1 is paired with the moving body 1A, and the user A2 is paired with the moving body 1B. A pairing display device 10 is mounted on each of the moving bodies 1A and 1B.
Like Embodiment 1, the pairing display device according to Embodiment 2 displays an image on the floor surface indicating that the moving body and the user are paired, and then displays an image on the floor surface that the user can operate. A user who has realized from the display of Embodiment 1 that he or she is paired with the moving body can recognize, through his or her own operation on the image of Embodiment 2, that the pairing with the moving body has been established (since the pairing itself was established before this operation, the operation is a pseudo operation), and can therefore use the moving body 1 with confidence.
FIG. 7 is a flowchart showing the pairing display method according to Embodiment 2. Before the processing shown in FIG. 7 is executed, the image 20 of Embodiment 1 (the image indicating that the moving body 1 and the user A are paired) is displayed on the floor surface B, and the user A recognizes that he or she is paired with the moving body 1. FIG. 8A is a diagram showing an example of the image 20 to be operated by the user A, and shows a push-button-shaped image 20 displayed three-dimensionally on the floor surface B. The image 20 shown in FIG. 8A is a video projected in three dimensions on the floor surface B by the display unit 2. FIG. 8B is a diagram showing the progress of an operation on the image 20 of FIG. 8A. FIG. 8C is a diagram showing the completion of the operation on the image 20 of FIG. 8A.
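The progress display of Figs. 8B and 8C suggests a dwell-based interaction: the projected button fills up while the user's foot stays inside it. The publication does not specify the mechanism, so the following is a hedged sketch of one plausible implementation, with illustrative names and a stubbed boolean sensor signal.

```python
def operation_progress(samples, dwell_required=3):
    """Track the progress of a foot press on the projected button image.

    `samples` is a sequence of booleans from the detection unit: True
    while the user's foot is inside the button area. Returns one
    progress value (0.0 to 1.0) per sample for the display to render,
    from operation start (Fig. 8A) through progress (Fig. 8B) to
    completion (Fig. 8C).
    """
    progress_values, held = [], 0
    for inside in samples:
        held = held + 1 if inside else 0   # leaving the button resets it
        progress_values.append(min(held / dwell_required, 1.0))
    return progress_values
```

Holding for the full dwell time drives the rendered progress to 1.0 (operation complete); stepping off the button part-way resets the fill to zero.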
FIG. 10 is a block diagram showing the configuration of the pairing display device 10A according to Embodiment 3. The moving body 1 can move autonomously; examples include an electric wheelchair, an electric cart, and a mobile robot. The moving body 1 shown in FIG. 10 includes the display unit 2, the detection unit 3, the sound output unit 4, and the pairing display device 10A. The user A is a user registered with a service that allows use of the moving body 1, and the identification information of the user A is set in the pairing display device 10A. The user A carries a user terminal 40. The user terminal 40 is a terminal device that communicates with the pairing display device 10A, and is, for example, a smartphone, a mobile phone, or a tablet information terminal.
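Embodiment 3's terminal-based flow — the user terminal registers the user's identifying appearance information, and the device notifies the terminal once the detected person matches — can be sketched as follows. The equality-based matching and the message content are simplified stand-ins for the publication's image-based recognition and communication, not its actual method.

```python
class PairingDisplayDeviceSketch:
    """Hypothetical sketch of the Embodiment-3 flow: the user terminal
    sends the user's appearance information; once the detection unit
    finds a matching person, the device notifies the terminal that
    pairing with the moving body is established."""

    def __init__(self):
        self.registered = {}   # user id -> appearance info (via unit 10e)
        self.notified = []     # notifications sent back to user terminals

    def receive_appearance(self, user_id, appearance):
        # Stands in for communication unit 10e receiving terminal data.
        self.registered[user_id] = appearance

    def on_detection(self, appearance_seen):
        # Stands in for detection processing unit 10b matching a person.
        for user_id, appearance in self.registered.items():
            if appearance == appearance_seen:   # simplified image matching
                self.notified.append((user_id, "pairing established"))
                return user_id
        return None


dev = PairingDisplayDeviceSketch()
dev.receive_appearance("user A", "red coat")
who = dev.on_detection("red coat")
```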
Claims (11)
- A pairing display device comprising: a detection processing unit that detects, using a detection unit, a user paired with a moving body having a display unit and the detection unit; and an output processing unit that, when the user is detected by the detection processing unit, displays information indicating that the user is paired with the moving body on a floor surface around the moving body by using the display unit.
- The pairing display device according to claim 1, wherein the output processing unit displays, by using the display unit, an operation image for performing an operation relating to pairing between the moving body and the user on the floor surface around the moving body, and the detection processing unit detects, by using the detection unit, the user operating the operation image.
- The pairing display device according to claim 2, wherein the output processing unit displays the operation image in an area of the floor surface around the moving body that is an effective detection range of the detection unit.
- The pairing display device according to any one of claims 1 to 3, wherein the output processing unit displays an image indicating that the moving body and the user are in a paired state on the floor surface around the moving body.
- The pairing display device according to claim 4, wherein the display unit is a projection unit that projects an image on the floor surface around the moving body, and the output processing unit displays, by using the projection unit, an image composed of lines, figures, characters, or a combination thereof as the image indicating that the moving body and the user are in a paired state.
- The pairing display device according to claim 2, further comprising a confirmation unit that confirms the progress of an operation on the operation image.
- The pairing display device according to claim 1, wherein the detection processing unit detects an attendant of the user by using the detection unit, and, when the attendant is detected by the detection processing unit, the output processing unit displays information indicating that the user and the attendant are paired with the moving body on the floor surface around the moving body.
- The pairing display device according to claim 1, further comprising a communication unit that communicates with a user terminal, wherein the communication unit receives usage reservation data transmitted by the user using the user terminal, and, when the user who transmitted the usage reservation data using the user terminal is detected by the detection processing unit, the output processing unit displays information indicating that the user is paired with the moving body on the floor surface around the moving body by using the display unit.
- The pairing display device according to claim 1 or 2, wherein the moving body has a sound output unit, and the output processing unit outputs, by using the sound output unit, a sound corresponding to the information displayed on the floor surface around the moving body.
- A pairing display system comprising: a moving body having a display unit and a detection unit; a detection processing unit that detects, using the detection unit, a user paired with the moving body; and an output processing unit that, when the user is detected by the detection processing unit, displays information indicating that the user is paired with the moving body on a floor surface around the moving body by using the display unit.
- A pairing display method comprising: a step in which a detection processing unit detects, using a detection unit, a user paired with a moving body having a display unit and the detection unit; and a step in which an output processing unit, when the user is detected by the detection processing unit, displays information indicating that the user is paired with the moving body on a floor surface around the moving body by using the display unit.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112019007321.4T DE112019007321B4 (de) | 2019-06-19 | 2019-06-19 | Pairing display device, pairing display system and pairing display method |
JP2019560777A JP6746013B1 (ja) | 2019-06-19 | 2019-06-19 | Pairing display device, pairing display system and pairing display method |
CN201980097340.1A CN113950711B (zh) | 2019-06-19 | 2019-06-19 | Pairing display device, pairing display system and pairing display method |
KR1020217040054A KR102449306B1 (ko) | 2019-06-19 | 2019-06-19 | Pairing display device, pairing display system and pairing display method |
PCT/JP2019/024248 WO2020255286A1 (ja) | 2019-06-19 | 2019-06-19 | Pairing display device, pairing display system and pairing display method |
US17/527,165 US20220076598A1 (en) | 2019-06-19 | 2021-11-16 | Pairing display device, pairing display system, and pairing display method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2019/024248 WO2020255286A1 (ja) | 2019-06-19 | 2019-06-19 | Pairing display device, pairing display system and pairing display method |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/527,165 Continuation US20220076598A1 (en) | 2019-06-19 | 2021-11-16 | Pairing display device, pairing display system, and pairing display method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020255286A1 true WO2020255286A1 (ja) | 2020-12-24 |
Family
ID=72146122
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2019/024248 WO2020255286A1 (ja) | 2019-06-19 | 2019-06-19 | Pairing display device, pairing display system and pairing display method |
Country Status (6)
Country | Link |
---|---|
US (1) | US20220076598A1 (ja) |
JP (1) | JP6746013B1 (ja) |
KR (1) | KR102449306B1 (ja) |
CN (1) | CN113950711B (ja) |
DE (1) | DE112019007321B4 (ja) |
WO (1) | WO2020255286A1 (ja) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US12073750B2 (en) * | 2020-12-08 | 2024-08-27 | Nec Corporation | Pedestrian guidance device, pedestrian guidance method, and computer-readable recording medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015179414A (ja) * | 2014-03-19 | 2015-10-08 | 株式会社日本総合研究所 | Imaging information sharing system and imaging information sharing method using an automated driving traffic system |
WO2015152304A1 (ja) * | 2014-03-31 | 2015-10-08 | エイディシーテクノロジー株式会社 | Driving assistance device and driving assistance system |
WO2016002527A1 (ja) * | 2014-06-30 | 2016-01-07 | みこらった株式会社 | Mobile body calling system, calling device, and wireless communication device |
WO2018230533A1 (ja) * | 2017-06-16 | 2018-12-20 | 本田技研工業株式会社 | Vehicle dispatch service providing device, vehicle dispatch service providing method, and program |
WO2018230679A1 (ja) * | 2017-06-16 | 2018-12-20 | 本田技研工業株式会社 | Pickup and drop-off management device, pickup and drop-off management method, and program |
WO2018230698A1 (ja) * | 2017-06-16 | 2018-12-20 | 本田技研工業株式会社 | Event vehicle dispatch device, event vehicle dispatch method, program, and management system |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4214860B2 (ja) | 2003-08-12 | 2009-01-28 | 沖電気工業株式会社 | Relay system using a robot, relay program for a robot, and method therefor |
JP4771147B2 (ja) * | 2005-10-24 | 2011-09-14 | 清水建設株式会社 | Route guidance system |
DE102007033391A1 (de) * | 2007-07-18 | 2009-01-22 | Robert Bosch Gmbh | Informationsvorrichtung, Verfahren zur Information und/oder Navigation von einer Person sowie Computerprogramm |
US8373657B2 (en) * | 2008-08-15 | 2013-02-12 | Qualcomm Incorporated | Enhanced multi-touch detection |
US9586135B1 (en) * | 2008-11-12 | 2017-03-07 | David G. Capper | Video motion capture for wireless gaming |
US20130285919A1 (en) * | 2012-04-25 | 2013-10-31 | Sony Computer Entertainment Inc. | Interactive video system |
JP5942840B2 (ja) * | 2012-12-21 | 2016-06-29 | ソニー株式会社 | Display control system and recording medium |
JP6111706B2 (ja) * | 2013-02-01 | 2017-04-12 | セイコーエプソン株式会社 | Position detection device, adjustment method, and adjustment program |
KR101917700B1 (ko) * | 2013-12-23 | 2018-11-13 | 엘지전자 주식회사 | Mobile terminal and control method thereof |
US9682477B2 (en) * | 2015-03-24 | 2017-06-20 | Toyota Jidosha Kabushiki Kaisha | Robot communication of intent and functioning |
CN106470236B (zh) * | 2015-08-20 | 2019-05-10 | 腾讯科技(深圳)有限公司 | Taxi-hailing method, apparatus and system based on a mobile terminal |
EP3520684B1 (en) * | 2016-09-30 | 2024-06-12 | Asia Air Survey Co., Ltd. | Moving-body information providing system, and moving-body information providing program |
KR102003940B1 (ko) * | 2016-11-11 | 2019-10-01 | 엘지전자 주식회사 | Autonomous vehicle and control method thereof |
WO2019079790A1 (en) * | 2017-10-21 | 2019-04-25 | Eyecam, Inc | ADAPTIVE GRAPHIC USER INTERFACE SYSTEM |
EP3613638A1 (en) * | 2018-08-10 | 2020-02-26 | Lg Electronics Inc. | Vehicle display system for vehicle |
WO2020031740A1 (ja) * | 2018-08-10 | 2020-02-13 | ソニー株式会社 | Control device, control method, and program |
KR102619558B1 (ko) * | 2018-11-16 | 2024-01-02 | 현대모비스 주식회사 | Control system for autonomous vehicle and control method thereof |
DE102019206644B3 (de) | 2019-05-08 | 2020-08-06 | Audi Ag | Verfahren zum Betrieb einer Umgebungsbeleuchtungseinrichtung eines Kraftfahrzeugs und Kraftfahrzeug |
US11072277B2 (en) * | 2019-09-20 | 2021-07-27 | Adway International Inc. | Method and apparatus to dynamically identify a vehicle |
-
2019
- 2019-06-19 CN CN201980097340.1A patent/CN113950711B/zh active Active
- 2019-06-19 WO PCT/JP2019/024248 patent/WO2020255286A1/ja active Application Filing
- 2019-06-19 JP JP2019560777A patent/JP6746013B1/ja active Active
- 2019-06-19 KR KR1020217040054A patent/KR102449306B1/ko active IP Right Grant
- 2019-06-19 DE DE112019007321.4T patent/DE112019007321B4/de active Active
-
2021
- 2021-11-16 US US17/527,165 patent/US20220076598A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
CN113950711B (zh) | 2023-11-21 |
US20220076598A1 (en) | 2022-03-10 |
KR20220002663A (ko) | 2022-01-06 |
KR102449306B1 (ko) | 2022-09-29 |
DE112019007321T5 (de) | 2022-07-07 |
CN113950711A (zh) | 2022-01-18 |
JPWO2020255286A1 (ja) | 2021-09-13 |
JP6746013B1 (ja) | 2020-08-26 |
DE112019007321B4 (de) | 2024-05-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105093526B (zh) | Glasses-type terminal and control method thereof | |
JP6401120B2 (ja) | Network authentication method and system based on an eye-tracking procedure | |
JP5779641B2 (ja) | Information processing device, method, and program | |
JP2020504887A (ja) | Systems and methods for assisting users with disabilities | |
KR102463806B1 (ko) | Movable electronic device and operation method thereof | |
KR102055677B1 (ko) | Mobile robot and control method thereof | |
EP3731118B1 (en) | Electronic device and method for performing biometric authentication function and intelligent agent function using user input in electronic device | |
CN109478288B (zh) | Virtual reality system and information processing system | |
CN113035196A (zh) | Contactless operation method and device for a self-service kiosk | |
US10299982B2 (en) | Systems and methods for blind and visually impaired person environment navigation assistance | |
WO2020255286A1 (ja) | Pairing display device, pairing display system and pairing display method | |
KR20210029388A (ko) | System for object detection and guidance for the visually impaired | |
WO2018112688A1 (zh) | Amblyopia assistance method and apparatus | |
CN112330380A (zh) | Order creation method and apparatus, computer device, and computer-readable storage medium | |
US11195525B2 (en) | Operation terminal, voice inputting method, and computer-readable recording medium | |
KR101629758B1 (ko) | Unlocking method and program for a glasses-type wearable device | |
JP2018025931A (ja) | Operation support device, operation support method, and program | |
WO2018056169A1 (ja) | Dialogue device, processing method, and program | |
EP4163765A1 (en) | Method and apparatus for initiating an action | |
WO2018047932A1 (ja) | Dialogue device, robot, processing method, and program | |
US20210049998A1 (en) | Information processing apparatus, information processing method, and program | |
JP2024058978A (ja) | Service providing system, service providing method, and service providing program | |
WO2018061871A1 (ja) | Terminal device, information processing system, processing method, and program | |
JP2013041012A (ja) | Imaging device and imaging method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2019560777 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19933651 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 20217040054 Country of ref document: KR Kind code of ref document: A |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 19933651 Country of ref document: EP Kind code of ref document: A1 |