WO2021106122A1 - Display control device, display system, and display control method - Google Patents

Display control device, display system, and display control method

Info

Publication number
WO2021106122A1
WO2021106122A1 PCT/JP2019/046470 JP2019046470W WO2021106122A1 WO 2021106122 A1 WO2021106122 A1 WO 2021106122A1 JP 2019046470 W JP2019046470 W JP 2019046470W WO 2021106122 A1 WO2021106122 A1 WO 2021106122A1
Authority
WO
WIPO (PCT)
Prior art keywords
mobile
information
acquisition unit
facility
image
Prior art date
Application number
PCT/JP2019/046470
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
直紀 古畑
Original Assignee
三菱電機株式会社
Priority date
Filing date
Publication date
Application filed by 三菱電機株式会社 filed Critical 三菱電機株式会社
Priority to PCT/JP2019/046470 (WO2021106122A1)
Priority to DE112019007831.3T (DE112019007831T5)
Priority to CN201980102161.2A (CN114730185A)
Priority to JP2020522395A (JP6833111B1)
Publication of WO2021106122A1
Priority to US17/700,495 (US20220215666A1)

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/206 Instruments for performing navigational calculations specially adapted for indoor navigation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/164 Centralised systems, e.g. external to vehicles
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/20 Control system inputs
    • G05D1/22 Command input arrangements
    • G05D1/221 Remote-control arrangements
    • G05D1/222 Remote-control arrangements operated by humans
    • G05D1/224 Output arrangements on the remote controller, e.g. displays, haptics or speakers
    • G05D1/2244 Optic
    • G05D1/2247 Optic providing the operator with simple or augmented images from one or more cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30232 Surveillance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2380/00 Specific applications
    • G09G2380/10 Automotive applications

Definitions

  • the present invention relates to a display control device and a display control method.
  • Conventionally, there is a technique for avoiding contact between a vehicle traveling on a road and a person walking on the road (see, for example, Patent Document 1).
  • However, the technique described in Patent Document 1 avoids contact between a vehicle traveling on a road and a person walking on the road; contact between a person moving in a facility and a moving body moving in the facility is not considered.
  • An object of the present invention is to provide a display control device capable of providing, to a person moving in a facility, information for avoiding contact between the person and a moving body moving in the facility.
  • The display control device of the present invention includes: a first mobile information acquisition unit that acquires first mobile body information indicating the position, moving speed, and moving direction of a first moving body moving in a facility; a second mobile information acquisition unit that acquires second mobile body information indicating the position, moving speed, and moving direction of a second moving body moving in the facility; an image acquisition unit that, based on the first mobile body information acquired by the first mobile information acquisition unit and the second mobile body information acquired by the second mobile information acquisition unit, acquires image information indicating a display image to be displayed, by a display output device installed in the facility, in a space in the facility that is visible to the first moving body; and an image output unit that outputs the image information acquired by the image acquisition unit.
  • FIG. 1 is a block diagram showing an example of the configuration of a main part of the display system according to the first embodiment to which the display control device according to the first embodiment is applied.
  • FIG. 2 is a layout diagram showing an example of the arrangement of the display output device included in the display system according to the first embodiment in the facility.
  • FIG. 3 is a block diagram showing an example of the configuration of a main part of the display control device according to the first embodiment.
  • FIG. 4A is a diagram showing an example of image information acquired by the image acquisition unit included in the display control device according to the first embodiment.
  • FIG. 4B is a diagram showing another example of image information acquired by the image acquisition unit included in the display control device according to the first embodiment.
  • FIG. 5 is a diagram showing another example of image information acquired by the image acquisition unit included in the display control device according to the first embodiment.
  • FIGS. 6A and 6B are diagrams showing an example of the hardware configuration of the display control device according to the first embodiment.
  • FIG. 7 is a flowchart showing an example of processing of the display control device according to the first embodiment.
  • FIG. 8 is a block diagram showing an example of the configuration of a main part of the display system according to the second embodiment to which the display control device according to the second embodiment is applied.
  • FIG. 9 is a block diagram showing an example of the configuration of a main part of the display control device according to the second embodiment.
  • FIGS. 10A, 10B, 10C, and 10D are diagrams showing an example of image information acquired by the image acquisition unit included in the display control device according to the second embodiment when the in-facility equipment is an elevator.
  • FIG. 11 is a diagram showing an example of image information acquired by the image acquisition unit included in the display control device according to the second embodiment when the in-facility equipment is an automatic door.
  • FIG. 12 is a flowchart showing an example of processing of the display control device according to the second embodiment.
  • Embodiment 1. The display control device 100 according to the first embodiment will be described with reference to FIGS. 1 to 7. The configuration of a main part of the display system 10 according to the first embodiment, to which the display control device 100 according to the first embodiment is applied, will be described with reference to FIGS. 1 and 2.
  • FIG. 1 is a block diagram showing an example of the configuration of a main part of the display system 10 according to the first embodiment to which the display control device 100 according to the first embodiment is applied.
  • the display system 10 includes a display output device 11, a sensor 12, a mobile body detection device 13, a storage device 14, and a display control device 100.
  • the display output device 11, the sensor 12, the mobile detection device 13, the storage device 14, and the display control device 100 included in the display system 10 are each connected via a network 19 capable of transmitting and receiving information.
  • the display output device 11 is a display device installed in the facility and displaying a display image in the space in the facility.
  • the space in the facility referred to here includes an area on the structure constituting the facility, a space composed of the structure constituting the facility, and the like.
  • the display output device 11 is a projection type display device such as a front projector.
  • the display output device 11 is installed in the facility, projects a display image toward a predetermined area on the structure constituting the facility, and displays the display image on the structure.
  • the display image displayed by the display output device 11 is not limited to the still image, but may be a moving image.
  • The structure constituting the facility on which the display output device 11 displays the display image is, for example, a floor in the facility, a wall in the facility, a door in the facility, or a landing door or landing operation panel of an elevator arranged in the facility.
  • the display output device 11 is not limited to the projection type display device as long as it is a display device installed in the facility and displaying a display image in the space in the facility.
  • For example, the display output device 11 may be a display such as an LED (Light Emitting Diode) display or an organic EL (Electro-Luminescence) display arranged on a structure constituting the facility, or may be an aerial display.
  • An aerial display is a device that forms an image in the air.
  • In the following, an example will be described in which the display output device 11 installed on a ceiling or wall in the facility projects the display image toward a floor in the facility to display the display image on the floor.
  • FIG. 2 is a layout diagram showing an example of the arrangement of the display output device 11 included in the display system 10 according to the first embodiment in the facility.
  • FIG. 2 is a view of a part of the floor in the facility as viewed from above.
  • four display output devices 11-1, 11-2, 11-3, and 11-4 are shown as display output devices 11.
  • The number of display output devices 11 is not limited to four; as long as there is at least one display output device 11, there may be three or fewer, or five or more.
  • The rectangles shown by broken lines in FIG. 2 are the display areas of the display images corresponding to the four display output devices 11-1, 11-2, 11-3, and 11-4, respectively.
  • FIG. 2 shows a first moving body 1 and a second moving body 2 which are both moving bodies moving in the facility.
  • In FIG. 2, the number of moving bodies moving in the facility is two, but it may be three or more as long as it is two or more.
  • A plurality of moving bodies moving in the facility may also be treated as one moving body.
  • The first moving body 1 and the second moving body 2 are each a person moving in the facility or a moving body such as an automatic driving mobile device (for example, a self-propelled robot) moving in the facility.
  • FIG. 2 shows a first moving body 1 moving on a first passage and a second moving body 2 moving on a second passage orthogonal to the first passage, each moving toward the intersection of the first passage and the second passage.
  • the movement of a person moving in the facility is a movement by walking, a movement by running, or a movement using a wheelchair or the like.
  • the movement of the automatic driving movement device that moves in the facility is a traveling movement by rotating the wheels, a walking movement by operating the legs or the like, or the like.
  • A person moving in the facility may include an automatic driving mobile device that can identify the display content of the display image displayed by the display output device 11 by a known image recognition technique or the like and can move according to the identified display content. In this case, the movement of the person moving in the facility that is an automatic driving mobile device is, for example, a traveling movement by rotating wheels or a walking movement by operating legs or the like.
  • the sensor 12 is an imaging device such as a digital still camera, a digital video camera, or a surveillance camera.
  • the sensor 12 is installed, for example, at a position where the first moving body 1 and the second moving body 2 shown in FIG. 2 can be photographed.
  • The sensor 12 is arranged, for example, on the ceiling at the intersection of the first passage and the second passage shown in FIG. 2.
  • The sensor 12 transmits sensor information indicating the captured image to the mobile body detection device 13 via the network 19.
  • the number of sensors 12 is assumed to be one, but the number of sensors 12 may be two or more.
  • The mobile body detection device 13 detects the position of the first moving body 1 moving in the first passage by a known image analysis technique based on the sensor information transmitted by the sensor 12. Further, the mobile body detection device 13 detects the moving speed and moving direction of the first moving body 1 by a known image analysis technique based on a plurality of positions of the first moving body 1 detected at different times. Further, the mobile body detection device 13 generates first mobile body information indicating the detected position, moving speed, and moving direction of the first moving body 1, and outputs the generated first mobile body information to the display control device 100.
  • Similarly, the mobile body detection device 13 detects the position of the second moving body 2 moving in the second passage by a known image analysis technique based on the sensor information transmitted by the sensor 12. Further, the mobile body detection device 13 detects the moving speed and moving direction of the second moving body 2 by a known image analysis technique based on a plurality of positions of the second moving body 2 detected at different times. Further, the mobile body detection device 13 generates second mobile body information indicating the detected position, moving speed, and moving direction of the second moving body 2, and outputs the generated second mobile body information to the display control device 100.
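The specification treats the detection itself as a known image-analysis technique, but the velocity estimate it describes, derived from positions detected at different times, can be illustrated with a short sketch (the class and function names below are illustrative, not taken from the patent):

```python
import math
from dataclasses import dataclass

@dataclass
class MobileBodyInfo:
    """Position, moving speed, and moving direction of one moving body."""
    position: tuple      # (x, y) in facility coordinates, in metres
    speed: float         # metres per second
    direction: float     # heading in radians, 0 = +x axis

def estimate_mobile_body_info(p_prev, t_prev, p_curr, t_curr):
    """Derive moving speed and moving direction from two positions of the same
    moving body detected at different times, as the detection device does."""
    dx, dy = p_curr[0] - p_prev[0], p_curr[1] - p_prev[1]
    dt = t_curr - t_prev
    speed = math.hypot(dx, dy) / dt if dt > 0 else 0.0
    direction = math.atan2(dy, dx)
    return MobileBodyInfo(position=tuple(p_curr), speed=speed, direction=direction)

# Example: a body moved from (2.0, 3.0) to (2.6, 3.8) in one second.
info = estimate_mobile_body_info((2.0, 3.0), 0.0, (2.6, 3.8), 1.0)
```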
  • the types of the first mobile body 1 and the second mobile body 2 are, for example, a person, an automatic driving mobile device such as a self-propelled robot, and the like.
  • The "person" type of the first moving body 1 and the second moving body 2 is not limited to a so-called human being, and may include, for example, a self-propelled robot or the like that can identify the display content of the display image displayed by the display output device 11 by a known image recognition technique and move according to the identified display content.
  • The mobile body detection device 13 may detect only one of the types, either a person or an automatic driving mobile device, and when it cannot detect that type, it may regard the moving body as being of the other type.
  • the moving body detecting device 13 detects the types of the first moving body 1 and the second moving body 2 by the image analysis technique based on the sensor information transmitted by the sensor 12.
  • The mobile body detection device 13 includes information indicating the detected type of the first moving body 1 in the first mobile body information and outputs it to the display control device 100, and includes information indicating the detected type of the second moving body 2 in the second mobile body information and outputs it to the display control device 100. Since methods of detecting the position, moving speed, and moving direction of the first moving body 1 and the second moving body 2 from the sensor information by image analysis techniques are known, their description is omitted. Likewise, since methods of detecting the types of the first moving body 1 and the second moving body 2 from the sensor information by image analysis techniques are known, their description is omitted.
  • the mobile body detection device 13 is arranged at a predetermined place such as a server room in the facility.
  • The place where the mobile body detection device 13 is arranged is not limited to the inside of the facility; the mobile body detection device 13 may be placed outside the facility as long as it can transmit and receive information to and from the sensor 12 and the display control device 100 via the network 19.
  • The sensor 12 is not limited to an imaging device as long as the mobile body detection device 13 can detect the positions of the first moving body 1 and the second moving body 2 based on the sensor information transmitted from the sensor 12.
  • the sensor 12 may be an infrared sensor, an ultrasonic sensor, a radar sensor, a laser ranging sensor, or the like.
  • a method of detecting the positions of the first moving body 1 and the second moving body 2 based on the sensor information output by an infrared sensor, an ultrasonic sensor, a radar sensor, a laser ranging sensor, or the like is known. Therefore, the description is omitted.
  • The first moving body 1 and the second moving body 2 may emit radio waves or the like that enable position identification, such as those of an RFID (Radio Frequency Identification) tag or a beacon conforming to BLE (Bluetooth Low Energy (registered trademark)).
  • In this case, the sensor 12 may receive the radio wave emitted by the RFID tag or the beacon conforming to BLE or the like, and output information indicating the received radio wave as sensor information.
  • a method of detecting the positions of the first mobile body 1 and the second mobile body 2 based on the information indicating the radio wave emitted by the RFID tag or the information indicating the radio wave such as a beacon conforming to BLE or the like is known. Therefore, the description thereof will be omitted.
  • The display control device 100 acquires the first mobile body information and the second mobile body information output by the mobile body detection device 13, and outputs image information indicating a display image corresponding to the first mobile body information and the second mobile body information.
  • the display control device 100 is arranged at a predetermined place such as a server room in the facility.
  • The place where the display control device 100 is arranged is not limited to the inside of the facility; the display control device 100 may be placed outside the facility as long as it can acquire information from the mobile body detection device 13 and the storage device 14 via the network 19. Details of the display control device 100 will be described later.
  • the storage device 14 includes, for example, a storage medium such as an SSD (Solid State Drive) or an HDD (Hard Disk Drive).
  • the storage device 14 reads the information corresponding to the request from the storage medium according to the request from the display control device 100, and outputs the read information to the display control device 100.
  • the storage device 14 is arranged at a predetermined place such as a server room in the facility. The place where the storage device 14 is arranged is not limited to the inside of the facility, and if the display control device 100 can acquire information from the storage device 14 via the network 19, the storage device 14 is outside the facility. It may be arranged.
  • FIG. 3 is a block diagram showing an example of the configuration of the main part of the display control device 100 according to the first embodiment.
  • The display control device 100 includes a first mobile information acquisition unit 110, a second mobile information acquisition unit 120, a contact prediction unit 130, a movement plan acquisition unit 140, a movement instruction unit 150, a facility layout map acquisition unit 160, an image acquisition unit 170, and an image output unit 180.
  • the first moving body information acquisition unit 110 acquires the first moving body information indicating the position, moving speed, and moving direction of the first moving body 1 moving in the facility. Specifically, for example, the first mobile information acquisition unit 110 acquires the first mobile information generated and output by the mobile detection device 13 via the network 19.
  • the second mobile body information acquisition unit 120 acquires the second mobile body information indicating the position, the moving speed, and the moving direction of the second moving body 2 moving in the facility. Specifically, for example, the second mobile information acquisition unit 120 acquires the second mobile information generated and output by the mobile detection device 13 via the network 19.
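As a purely illustrative sketch of what such acquired mobile body information might look like on the display control device side (the JSON field names and wire format are assumptions; the patent does not define an interface between the mobile body detection device 13 and the display control device 100):

```python
import json

def parse_mobile_body_info(payload: bytes) -> dict:
    """Parse one mobile body information record received over the network 19."""
    record = json.loads(payload)
    return {
        "position": tuple(record["position"]),    # (x, y) in facility coordinates
        "speed": float(record["speed"]),          # metres per second
        "direction": float(record["direction"]),  # heading in radians
        "type": record.get("type", "unknown"),    # e.g. "person" or "automatic_driving"
    }

# Example payload as the mobile body detection device 13 might send it:
example = b'{"position": [12.0, 3.5], "speed": 1.2, "direction": 1.57, "type": "person"}'
first_mobile_body_info = parse_mobile_body_info(example)
```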
  • the mobile body detection device 13 will be described as operating as a device different from the display control device 100, but the present invention is not limited to this.
  • The display control device 100 may include the mobile body detection device 13, and the display control device 100 may acquire the first mobile body information and the second mobile body information output by the mobile body detection device 13 included in the display control device 100.
  • When the first moving body 1 is an automatic driving mobile device such as a self-propelled robot, the first mobile information acquisition unit 110 may acquire, from the first moving body 1 and without going through the mobile body detection device 13, first mobile body information indicating the position, moving speed, and moving direction of the first moving body 1 output by a navigation system (not shown) included in the first moving body 1. Similarly, when the second moving body 2 is an automatic driving mobile device such as a self-propelled robot, the second mobile information acquisition unit 120 may acquire, from the second moving body 2 and without going through the mobile body detection device 13, second mobile body information indicating the position, moving speed, and moving direction of the second moving body 2 output by a navigation system included in the second moving body 2.
  • In this case, the first moving body 1, which is an automatic driving mobile device, includes a wireless communication unit (not shown) and outputs the first mobile body information via the wireless communication unit; the first mobile body information output by the first moving body 1 is acquired by the first mobile information acquisition unit 110 via a wireless access point (not shown) connected to the network 19 and via the network 19. Similarly, the second moving body 2, which is an automatic driving mobile device, includes a wireless communication unit and outputs the second mobile body information via the wireless communication unit; the second mobile body information output by the second moving body 2 is acquired by the second mobile information acquisition unit 120 via the wireless access point connected to the network 19 and via the network 19.
  • When the first moving body 1 is an automatic driving mobile device and moves by acquiring, via the network 19, control information output by a movement control device (not shown) that controls the movement of the automatic driving mobile device, the first mobile information acquisition unit 110 may acquire, from the movement control device and without going through the mobile body detection device 13, first mobile body information indicating the position, moving speed, and moving direction of the first moving body 1.
  • Similarly, when the second moving body 2 is an automatic driving mobile device and moves by acquiring, via the network 19, control information output by the movement control device that controls the movement of the automatic driving mobile device, the second mobile information acquisition unit 120 may acquire, from the movement control device and without going through the mobile body detection device 13, second mobile body information indicating the position, moving speed, and moving direction of the second moving body 2.
  • When the first mobile information acquisition unit 110 acquires the first mobile body information without going through the mobile body detection device 13 as described above, the first moving body 1 or the movement control device may include, in the first mobile body information, information indicating that the type of the first moving body 1 is an automatic driving mobile device such as a self-propelled robot, and output the first mobile body information. Similarly, when the second mobile information acquisition unit 120 acquires the second mobile body information without going through the mobile body detection device 13, the second moving body 2 or the movement control device may include, in the second mobile body information, information indicating that the type of the second moving body 2 is an automatic driving mobile device such as a self-propelled robot, and output the second mobile body information.
  • The image acquisition unit 170 acquires, based on the first mobile body information acquired by the first mobile information acquisition unit 110 and the second mobile body information acquired by the second mobile information acquisition unit 120, image information indicating a display image to be displayed in the space in the facility by the display output device 11 installed in the facility.
  • the image acquisition unit 170 acquires the image information by reading the image information from the storage device 14, for example.
  • Specifically, based on the first mobile body information and the second mobile body information, the image acquisition unit 170 acquires image information indicating a display image to be displayed in a space visible to the first moving body 1 or the second moving body 2, at least one of which is of the type "person".
  • With this configuration, the display control device 100 can provide information for avoiding contact between the first moving body 1 moving in the facility and the second moving body 2 moving in the facility to the first moving body 1 or the second moving body 2 that is a person moving in the facility.
  • the display image indicated by the image information acquired by the image acquisition unit 170 is not limited to the still image, but may be a moving image. Further, the image information acquired by the image acquisition unit 170 is not limited to one.
  • For example, when there are a plurality of display output devices 11-1, 11-2, 11-3, and 11-4 as shown in FIG. 2, the image acquisition unit 170 may acquire image information corresponding to each of the plurality of display output devices 11-1, 11-2, 11-3, and 11-4 based on the first mobile body information and the second mobile body information. The details of the image acquisition unit 170 will be described later.
  • the image output unit 180 outputs the image information acquired by the image acquisition unit 170. Specifically, for example, the image output unit 180 outputs the image information acquired by the image acquisition unit 170 to the display output device 11 installed in the facility. More specifically, for example, the image output unit 180 outputs the image information acquired by the image acquisition unit 170 as an image signal for the display output device 11 to output as a display image.
  • the display output device 11 receives the image information output by the image output unit 180, and displays the display image indicated by the image information in the space in the facility.
  • The facility layout map acquisition unit 160 acquires facility layout map information indicating the arrangement positions of structures such as passages, walls, and equipment in the facility.
  • For example, the facility layout map acquisition unit 160 acquires the facility layout map information by reading it from the storage device 14 via the network 19.
  • the facility layout map acquisition unit 160 outputs the acquired facility layout map information to the image acquisition unit 170.
  • The facility layout map acquisition unit 160 is not an essential component of the display control device 100. When the display control device 100 includes the facility layout map acquisition unit 160, the image acquisition unit 170 may acquire the image information based on the facility layout map information acquired by the facility layout map acquisition unit 160 in addition to the first mobile body information acquired by the first mobile information acquisition unit 110 and the second mobile body information acquired by the second mobile information acquisition unit 120.
  • When the first moving body 1 or the second moving body 2 is an automatic driving mobile device, the movement plan acquisition unit 140 acquires movement plan information indicating the movement plan of the first moving body 1 or the second moving body 2 that is the automatic driving mobile device. For example, the movement plan acquisition unit 140 acquires the movement plan information by reading it from the storage device 14 via the network 19. The movement plan acquisition unit 140 may also acquire the movement plan information from the automatic driving mobile device or from the above-mentioned movement control device via the network 19. The movement plan acquisition unit 140 outputs the acquired movement plan information to the image acquisition unit 170.
  • the first mobile body 1 or the second mobile body 2 which is an automatic driving mobile device moves in the facility based on a predetermined movement plan indicated by the movement plan information.
  • The movement plan acquisition unit 140 is not an essential component of the display control device 100. When the display control device 100 includes the movement plan acquisition unit 140, the image acquisition unit 170 may acquire the image information based on the movement plan information acquired by the movement plan acquisition unit 140 in addition to the first mobile body information acquired by the first mobile information acquisition unit 110 and the second mobile body information acquired by the second mobile information acquisition unit 120.
  • The movement plan information acquired by the movement plan acquisition unit 140 is not limited to information indicating the movement plan of the first moving body 1 or the second moving body 2 that is an automatic driving mobile device, and may be information indicating a movement prediction result obtained by predicting the moving direction or the like of the first moving body 1 or the second moving body 2.
  • The contact prediction unit 130 predicts, based on the first mobile body information acquired by the first mobile information acquisition unit 110 and the second mobile body information acquired by the second mobile information acquisition unit 120, whether or not there is a possibility that the first moving body 1 and the second moving body 2 will come into contact with each other.
  • the contact prediction unit 130 outputs the prediction result of predicting whether or not the first moving body 1 and the second moving body 2 may come into contact with each other to the image acquisition unit 170.
  • The contact prediction unit 130 is not an essential component of the display control device 100. When the display control device 100 includes the contact prediction unit 130, the image acquisition unit 170 may acquire the image information based on the prediction result of the contact prediction unit 130 in addition to the first mobile body information acquired by the first mobile information acquisition unit 110 and the second mobile body information acquired by the second mobile information acquisition unit 120.
  • When the display control device 100 includes the facility layout map acquisition unit 160, the contact prediction unit 130 may predict whether or not there is a possibility that the first moving body 1 and the second moving body 2 will come into contact with each other based on the facility layout map information acquired by the facility layout map acquisition unit 160 in addition to the first mobile body information acquired by the first mobile information acquisition unit 110 and the second mobile body information acquired by the second mobile information acquisition unit 120. Further, when the first moving body 1 or the second moving body 2 is an automatic driving mobile device moving in the facility and the display control device 100 includes the movement plan acquisition unit 140, the contact prediction unit 130 may also use the movement plan information acquired by the movement plan acquisition unit 140 for the prediction.
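The patent does not prescribe a particular prediction algorithm. A minimal sketch of the idea, assuming straight-line motion at constant speed and a fixed proximity threshold (both of which are assumptions made here for illustration only), could look like this:

```python
import math

def predict_contact(info1, info2, horizon_s=10.0, step_s=0.5, threshold_m=1.0):
    """Return True if the first and second moving bodies are predicted to come
    closer than threshold_m within the prediction horizon."""
    def position_at(info, t):
        x, y = info["position"]
        return (x + info["speed"] * math.cos(info["direction"]) * t,
                y + info["speed"] * math.sin(info["direction"]) * t)

    t = 0.0
    while t <= horizon_s:
        (x1, y1), (x2, y2) = position_at(info1, t), position_at(info2, t)
        if math.hypot(x1 - x2, y1 - y2) < threshold_m:
            return True
        t += step_s
    return False
```

When facility layout map information or a movement plan is available, the straight-line assumption would be replaced by the passage geometry or the planned route.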
  • When the first moving body 1 or the second moving body 2 is an automatic driving mobile device, the movement instruction unit 150 outputs movement instruction information indicating a movement instruction, such as pausing or resuming movement, to the first moving body 1 or the second moving body 2 that is the automatic driving mobile device. Specifically, the movement instruction unit 150 outputs movement instruction information corresponding to the display image indicated by the image information acquired by the image acquisition unit 170. For example, the movement instruction unit 150 outputs the movement instruction information to the automatic driving mobile device or the above-mentioned movement control device via the network 19.
  • the movement instruction unit 150 is not an essential configuration in the display control device 100.
  • In the following description, the first moving body 1 is a person moving in the facility, and the second moving body 2 is an automatic driving mobile device such as a self-propelled robot moving in the facility, or a person moving in the facility.
  • FIGS. 4A and 4B are diagrams showing an example of image information acquired by the image acquisition unit 170 according to the first embodiment based on the first mobile body information and the second mobile body information. More specifically, FIGS. 4A and 4B show a state in which the display image indicated by the image information acquired by the image acquisition unit 170 is displayed by the display output device 11, and are views of a part of the floor in the facility as seen from above. FIGS. 4A and 4B show a first passage and a second passage orthogonal to the first passage, similar to the layout diagram shown in FIG. 2. Further, in FIGS. 4A and 4B, four display output devices 11-1, 11-2, 11-3, and 11-4 are shown as the display output devices 11. Further, FIGS. 4A and 4B show a first moving body 1 moving on the first passage and a second moving body 2 moving on the second passage.
  • the first moving body 1 shown in FIGS. 4A and 4B is moving on the first passage in the direction of the arrow X shown in FIGS. 4A and 4B. Further, the second moving body 2 shown in FIGS. 4A and 4B is moving on the second passage in the direction of the arrow Y shown in FIGS. 4A and 4B.
  • For example, the image acquisition unit 170 determines, based on the first mobile body information and the second mobile body information, whether the moving body that will enter the intersection of the first passage and the second passage first is the first moving body 1 or the second moving body 2, and acquires image information indicating a display image that urges the moving body entering the intersection first to move and urges the moving body entering the intersection later to stop. Further, for example, based on the first mobile body information and the second mobile body information, the image acquisition unit 170 acquires image information indicating a display image that urges the moving body entering the intersection later to move after the moving body that entered the intersection first has passed through the intersection.
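As one way to realize this decision (estimated arrival time is used here as the criterion, and the image file names are placeholders; neither is fixed by the patent):

```python
def time_to_intersection(info, intersection_xy):
    """Estimated time for a moving body to reach the intersection at its
    current speed; a stationary body is treated as arriving last."""
    dx = intersection_xy[0] - info["position"][0]
    dy = intersection_xy[1] - info["position"][1]
    distance = (dx * dx + dy * dy) ** 0.5
    return distance / info["speed"] if info["speed"] > 0 else float("inf")

def select_display_images(info1, info2, intersection_xy):
    """Return (image for the first moving body, image for the second moving body)."""
    if time_to_intersection(info1, intersection_xy) <= time_to_intersection(info2, intersection_xy):
        return "go_arrow.png", "stop_mark.png"   # first moving body 1 enters first
    return "stop_mark.png", "go_arrow.png"       # second moving body 2 enters first
```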
  • FIG. 4A shows a state in which, when the first moving body 1, which is a person moving in the facility, enters the intersection first, the image acquisition unit 170 acquires, for example, image information indicating a display image that urges the first moving body 1 to move and suggests to the first moving body 1 that the second moving body 2 will stop. FIG. 4A also shows a state in which, in the same situation, the image acquisition unit 170 acquires, for example, image information indicating a display image that suggests to the second moving body 2, when the second moving body 2 is a person moving in the facility, that the first moving body 1 will move, and urges the second moving body 2 to stop.
  • With this configuration, the display control device 100 can provide information for avoiding contact between the first moving body 1, which is a person moving in the facility, and the second moving body 2 moving in the facility to the first moving body 1. Further, when the second moving body 2 is also a person moving in the facility, the display control device 100 can provide information for avoiding contact between the first moving body 1 and the second moving body 2 to both the first moving body 1 and the second moving body 2.
  • the image acquisition unit 170 may determine the image information to be acquired based on the priority passage rule which is a movement rule provided in advance in addition to the first mobile body information and the second mobile body information.
  • the priority passage rule sets a priority for the movement of a moving body moving in the first passage and the movement of the moving body moving in the second passage, for example.
  • the information indicating the priority passage rule may be possessed by the image acquisition unit 170 in advance, or the image acquisition unit 170 may acquire the information from the storage device 14 via the network 19.
  • For example, based on the first mobile body information, the second mobile body information, and the priority passage rule provided in advance, the image acquisition unit 170 acquires image information indicating a display image that urges the moving body moving in the passage with the higher priority to move and urges the moving body moving in the passage with the lower priority to stop. When the moving body moving in the first passage is given priority over the moving body moving in the second passage, the image acquisition unit 170 acquires, for example, image information indicating the display image shown in FIG. 4A. Further, for example, after the moving body moving in the passage with the higher priority has passed through the intersection of the first passage and the second passage, the image acquisition unit 170 acquires image information indicating a display image that urges the moving body moving in the passage with the lower priority to move.
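A sketch of applying such a pre-provided priority passage rule (the dictionary representation of the rule is an assumption made for illustration):

```python
# Pre-provided movement rule: smaller number = higher-priority passage.
PRIORITY_PASSAGE_RULE = {"first_passage": 1, "second_passage": 2}

def displays_from_passage_priority(rule, body_passages):
    """body_passages maps a moving-body name to the passage it is moving in;
    the body in the higher-priority passage gets the 'go' display."""
    best_passage = min(body_passages.values(), key=lambda passage: rule[passage])
    return {body: ("go" if passage == best_passage else "stop")
            for body, passage in body_passages.items()}

# Example: moving body 1 is in the first passage, moving body 2 in the second.
print(displays_from_passage_priority(
    PRIORITY_PASSAGE_RULE,
    {"first_moving_body": "first_passage", "second_moving_body": "second_passage"}))
# -> {'first_moving_body': 'go', 'second_moving_body': 'stop'}
```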
  • With this configuration as well, the display control device 100 can provide the contact-avoidance information to the first moving body 1, which is a person moving in the facility, and, when the second moving body 2 is also a person moving in the facility, to both the first moving body 1 and the second moving body 2.
  • The image acquisition unit 170 may also determine the image information to be acquired based on a priority moving body rule, which is a movement rule provided in advance, in addition to the first mobile body information and the second mobile body information.
  • the priority moving body rule sets a priority on, for example, the movement of a person moving in a facility and the movement of an autonomous driving mobile device moving in the facility.
  • the information indicating the priority moving body rule may be previously possessed by the image acquisition unit 170, or the image acquisition unit 170 may acquire the information from the storage device 14 via the network 19.
  • For example, based on the first mobile body information, the second mobile body information, and the priority moving body rule provided in advance, the image acquisition unit 170 acquires image information indicating a display image that urges the one of the person moving in the facility and the automatic driving mobile device moving in the facility that has the higher priority to move, and urges the other moving body, which has the lower priority, to stop.
  • the image acquisition unit 170 acquires, for example, image information indicating the display image shown in FIG. 4A.
  • Further, for example, after the moving body with the higher priority has passed through the intersection of the first passage and the second passage, the image acquisition unit 170 acquires image information indicating a display image that urges the moving body with the lower priority to move.
  • With this configuration, in a situation where a person moving in the facility and an automatic driving mobile device moving in the facility coexist, the display control device 100 can provide information for avoiding contact between the first moving body 1, which is a person moving in the facility, and the second moving body 2 moving in the facility to the first moving body 1.
  • The image acquisition unit 170 determines the image information to be acquired based on, for example, the first mobile body information and the second mobile body information. Specifically, for example, the image acquisition unit 170 calculates a first distance, which is the distance from the position of the intersection of the first passage and the second passage to the position of the first moving body 1 indicated by the first mobile body information, and a second distance, which is the distance from the position of the intersection to the position of the second moving body 2 indicated by the second mobile body information, and determines the image information to be acquired according to the first distance and the second distance.
  • For example, the image acquisition unit 170 calculates the first distance and the second distance based on the first mobile body information, the second mobile body information, and the facility layout map information acquired by the facility layout map acquisition unit 160.
  • For example, the image acquisition unit 170 acquires image information indicating a display image whose color, size, shape, or animation mode in a moving image differs depending on whether the first distance or the second distance is longer or shorter than a predetermined distance.
  • The image acquisition unit 170 may also determine the image information to be acquired according to the moving speed of the first moving body 1 indicated by the first mobile body information or the moving speed of the second moving body 2 indicated by the second mobile body information. More specifically, for example, the image acquisition unit 170 acquires image information indicating a display image whose size, shape, or animation mode in a moving image differs depending on whether the moving speed of the first moving body 1 or the moving speed of the second moving body 2 is slower or faster than a predetermined moving speed.
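A minimal sketch of this distance- and speed-dependent selection (the thresholds and attribute names are illustrative assumptions):

```python
def choose_display_style(distance_m, speed_mps,
                         near_threshold_m=5.0, fast_threshold_mps=1.5):
    """Vary colour, size, and animation of the display image according to how
    close to the intersection and how fast a moving body is."""
    return {
        "color": "red" if distance_m < near_threshold_m else "yellow",
        "size": "large" if distance_m < near_threshold_m else "normal",
        "animation": "blinking" if speed_mps > fast_threshold_mps else "steady",
    }

# Example: a fast moving body 3 m from the intersection.
print(choose_display_style(distance_m=3.0, speed_mps=2.0))
# -> {'color': 'red', 'size': 'large', 'animation': 'blinking'}
```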
  • The image acquisition unit 170 may also acquire the image information based on the movement plan information, acquired by the movement plan acquisition unit 140, indicating the movement plan along which the automatic driving mobile device moves, in addition to the first mobile body information and the second mobile body information.
  • FIG. 4B shows a case in which the second moving body 2, which is an automatic driving mobile device moving in the second passage, turns right toward the first moving body 1 at the intersection of the first passage and the second passage according to the movement plan indicated by the movement plan information, and moves in the first passage after turning right. In this case, as shown in FIG. 4B, the image acquisition unit 170 acquires image information indicating a display image determined in consideration of the size, turning radius, inner wheel difference, and the like of the second moving body 2, which is an automatic driving mobile device, for the person moving in the facility. Further, after the second moving body 2 passes through the intersection of the first passage and the second passage, the image acquisition unit 170 acquires image information based on, for example, the first mobile body information and the second mobile body information.
  • With this configuration as well, in a situation where a person moving in the facility and an automatic driving mobile device moving in the facility coexist, the display control device 100 can provide information for avoiding contact between the first moving body 1, which is a person moving in the facility, and the second moving body 2 moving in the facility to the first moving body 1.
  • When the prediction result of the contact prediction unit 130 indicates that there is a possibility that the first moving body 1 and the second moving body 2 will come into contact with each other, the image acquisition unit 170 acquires, based on the first mobile body information and the second mobile body information, image information indicating a display image to be displayed in the space in the facility by the display output device 11 installed in the facility as information for avoiding contact between the first moving body 1, which is a person moving in the facility, and the second moving body 2 moving in the facility. Further, when the prediction result of the contact prediction unit 130 indicates that there is no possibility that the first moving body 1 and the second moving body 2 will come into contact with each other, the image acquisition unit 170 may acquire predetermined image information for calling the attention of the first moving body 1, which is a person moving in the facility, to the fact that the second moving body 2 is approaching.
  • With this configuration, when it is predicted that there is a possibility of contact between the first moving body 1 and the second moving body 2, the display control device 100 can provide information for avoiding contact between the first moving body 1, which is a person moving in the facility, and the second moving body 2 moving in the facility to the first moving body 1.
  • Further, when it is predicted that there is no possibility of contact between the first moving body 1 and the second moving body 2, the display control device 100 can omit acquiring and outputting image information indicating a display image or the like that prompts the first moving body 1 or the second moving body 2 to stop.
  • When the second moving body 2 is an automatic driving mobile device moving in the facility and the image acquisition unit 170 acquires image information indicating a display image that urges the first moving body 1, which is a person moving in the facility, to move, the image acquisition unit 170 may cause the movement instruction unit 150 to output movement instruction information indicating an instruction to temporarily stop the movement of the automatic driving mobile device that is the second moving body 2. Further, after the first moving body 1, which is a person moving in the facility, has passed through the intersection of the first passage and the second passage, the image acquisition unit 170 may cause the movement instruction unit 150 to output movement instruction information indicating an instruction to resume the movement of the automatic driving mobile device.
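The pause/resume exchange could be sketched as below; the endpoint address, message format, and helper name are hypothetical, since the patent only states that movement instruction information is output to the automatic driving mobile device or its movement control device via the network 19:

```python
import json
import socket

ROBOT_ADDRESS = ("robot-2.facility.local", 5000)   # hypothetical endpoint

def send_movement_instruction(command: str) -> None:
    """Send a movement instruction ('pause' or 'resume') to the automatic
    driving mobile device or its movement control device."""
    message = json.dumps({"instruction": command}).encode("utf-8")
    with socket.create_connection(ROBOT_ADDRESS, timeout=1.0) as conn:
        conn.sendall(message)

# While the person (first moving body 1) is shown the "go" display:
#     send_movement_instruction("pause")
# After the person has passed through the intersection:
#     send_movement_instruction("resume")
```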
  • FIG. 5 is a diagram showing another example of image information acquired by the image acquisition unit 170 according to the first embodiment based on the first mobile body information and the second mobile body information. More specifically, FIG. 5 shows a state in which the display image indicated by the image information acquired by the image acquisition unit 170 is displayed by the display output device 11, and is a view of a part of the floor in the facility as seen from above.
  • FIG. 5 shows the first passage. Further, FIG. 5 shows a first moving body 1 and a second moving body 2 moving on the first passage. The first moving body 1 shown in FIG. 5 is moving on the first passage in the direction of the arrow X shown in FIG. Further, the second moving body 2 shown in FIG. 5 is moving on the first passage in the direction of the arrow Y shown in FIG.
  • the image acquisition unit 170 may acquire image information based on a traffic rule which is a movement rule provided in advance.
  • the traffic rule gives priority to left-hand traffic or right-hand traffic based on the rules or customs of the passage through which the moving body moves, for example.
  • the information indicating the traffic rule may be previously possessed by the image acquisition unit 170, or the image acquisition unit 170 may acquire the information from the storage device 14 via the network 19.
  • For example, based on the first mobile body information, the second mobile body information, and the traffic rule provided in advance, the image acquisition unit 170 acquires image information indicating a display image that prompts the first moving body 1 and the second moving body 2 to keep to one side of the passage. FIG. 5 shows a case where a display image prompting the first moving body 1 and the second moving body 2 to move toward the left side is displayed.
  • With this configuration as well, the display control device 100 can provide information for avoiding contact between the first moving body 1, which is a person moving in the facility, and the second moving body 2 moving in the facility to the first moving body 1, and, when the second moving body 2 is also a person moving in the facility, to both the first moving body 1 and the second moving body 2.
  • The display control device 100 may cause the display output device 11 to display the display image and also transmit control information for causing the second moving body 2, which is an automatic driving mobile device, to perform a movement corresponding to the display image.
  • FIGS. 6A and 6B are diagrams showing an example of the hardware configuration of the display control device 100 according to the first embodiment.
  • the display control device 100 is composed of a computer, which has a processor 601 and a memory 602.
  • In the memory 602 of the computer, a program for causing the computer to function as the first mobile information acquisition unit 110, the second mobile information acquisition unit 120, the contact prediction unit 130, the movement plan acquisition unit 140, the movement instruction unit 150, the facility layout map acquisition unit 160, the image acquisition unit 170, and the image output unit 180 is stored.
  • The processor 601 reads and executes the program stored in the memory 602, whereby the functions of the first mobile information acquisition unit 110, the second mobile information acquisition unit 120, the contact prediction unit 130, the movement plan acquisition unit 140, the movement instruction unit 150, the facility layout map acquisition unit 160, the image acquisition unit 170, and the image output unit 180 are realized.
  • Alternatively, the display control device 100 may be configured by a processing circuit 603. In this case, the functions of the first mobile information acquisition unit 110, the second mobile information acquisition unit 120, the contact prediction unit 130, the movement plan acquisition unit 140, the movement instruction unit 150, the facility layout map acquisition unit 160, the image acquisition unit 170, and the image output unit 180 may be realized by the processing circuit 603.
  • the display control device 100 may be composed of a processor 601, a memory 602, and a processing circuit 603 (not shown).
  • the first mobile information acquisition unit 110, the second mobile information acquisition unit 120, the contact prediction unit 130, the movement plan acquisition unit 140, the movement instruction unit 150, the facility layout acquisition unit 160, the image acquisition unit 170, and Some of the functions of the image output unit 180 may be realized by the processor 601 and the memory 602, and the remaining functions may be realized by the processing circuit 603.
  • the processor 601 uses, for example, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a microprocessor, a microcontroller, or a DSP (Digital Signal Processor).
  • The memory 602 uses, for example, a semiconductor memory or a magnetic disk. More specifically, the memory 602 uses, for example, a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory), or an HDD.
  • The processing circuit 603 is, for example, an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), an FPGA (Field-Programmable Gate Array), or a system LSI (Large-Scale Integration).
  • FIG. 7 is a flowchart showing an example of processing of the display control device 100 according to the first embodiment.
  • the display control device 100 repeatedly executes, for example, the processing of the flowchart.
  • First, in step ST701, the facility layout drawing acquisition unit 160 acquires the facility layout drawing information.
  • Next, in step ST702, the first mobile information acquisition unit 110 acquires the first mobile body information.
  • Next, in step ST703, the second mobile information acquisition unit 120 acquires the second mobile body information.
  • Next, in step ST704, the movement plan acquisition unit 140 acquires the movement plan information indicating the movement plan according to which the automatic driving mobile device moves.
  • Next, in step ST705, the contact prediction unit 130 predicts whether or not the first moving body 1 and the second moving body 2 may come into contact with each other, and outputs the prediction result.
  • Next, in step ST706, the image acquisition unit 170 acquires the image information.
  • Next, in step ST707, the image output unit 180 outputs the image information.
  • Next, in step ST708, when the second moving body 2 is an automatic driving mobile device that moves in the facility, the movement instruction unit 150 outputs movement instruction information indicating a movement instruction for the automatic driving mobile device.
  • After step ST708, the display control device 100 ends the processing of the flowchart. After finishing the processing of the flowchart, the display control device 100 returns to step ST702 and repeatedly executes the processing of the flowchart. If the display control device 100 does not include the facility layout drawing acquisition unit 160, the process of step ST701 is omitted. If the display control device 100 does not include the movement plan acquisition unit 140, the process of step ST704 is omitted. If the display control device 100 does not include the contact prediction unit 130, the process of step ST705 is omitted. Further, if the display control device 100 does not include the movement instruction unit 150, the process of step ST708 is omitted. Further, the order of the processes from step ST701 to step ST703 is arbitrary.
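  • As a rough illustration only, the following sketch (in Python) runs one pass of the FIG. 7 flow described above; the "units" dictionary, the unit objects, and their acquire/predict/output methods are assumptions made for the sketch and are not defined in this specification.

```python
# Illustrative sketch of one pass of the FIG. 7 flow (first embodiment).
# The "units" dict and the acquire/predict/output method names are assumed;
# optional units may be absent, in which case the matching step is skipped,
# mirroring the "process ... is omitted" notes above.
def run_display_control_once(units):
    layout = units["layout"].acquire() if units.get("layout") else None       # ST701
    first = units["first"].acquire()                                          # ST702: first mobile body information
    second = units["second"].acquire()                                        # ST703: second mobile body information
    plan = units["plan"].acquire() if units.get("plan") else None             # ST704
    prediction = (units["contact"].predict(first, second)
                  if units.get("contact") else None)                          # ST705
    image = units["image"].acquire(first, second, layout, plan, prediction)   # ST706
    units["output"].output(image)                                             # ST707
    if units.get("instruction"):                                              # ST708: automatic driving mobile device only
        units["instruction"].output(first, second, image)
    return image
```

  • In an actual device, this pass would be repeated, returning to step ST702 after each run, as described above.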
  • As described above, the display control device 100 according to the first embodiment includes the first mobile information acquisition unit 110 that acquires the position, moving speed, and moving direction of the first moving body 1 moving in the facility, the second mobile information acquisition unit 120 that acquires the position, moving speed, and moving direction of the second moving body 2 moving in the facility, the image acquisition unit 170 that acquires, based on the first mobile body information acquired by the first mobile information acquisition unit 110 and the second mobile body information acquired by the second mobile information acquisition unit 120, image information indicating a display image to be displayed in a space in the facility by the display output device 11 installed in the facility, and the image output unit 180 that outputs the image information acquired by the image acquisition unit 170.
  • With this configuration, the display control device 100 can provide the first mobile body 1, which is a person moving in the facility, with information for avoiding contact between the first mobile body 1 and the second mobile body 2 moving in the facility.
  • Further, with this configuration, even when the second mobile body 2 is moving in a blind spot of the first mobile body 1, the display control device 100 can provide the first mobile body 1, which is a person moving in the facility, with information for avoiding contact between the first mobile body 1 and the second mobile body 2.
  • Further, with this configuration, the first mobile body 1, who is a person moving in the facility, does not need to carry a mobile terminal.
  • That is, even if the first mobile body 1 does not carry a mobile terminal, the display control device 100 can provide the first mobile body 1 with information for avoiding contact between the first mobile body 1 and the second mobile body 2.
  • Further, whereas a pedestrian needs to shift his or her line of sight from the traveling direction to a mobile terminal in order to visually check danger information displayed on the mobile terminal, the display control device 100 configured in this way displays the information for avoiding contact between the first mobile body 1 and the second mobile body 2 in a space in the facility that is visible to the first mobile body 1. It is therefore possible to shorten the time required for the person who is the first mobile body 1 to take an appropriate action after confirming the information.
  • Further, the image acquisition unit 170 is configured to acquire, based on the first mobile body information and the second mobile body information, image information indicating a display image to be displayed in a space visible to the first moving body 1 or the second moving body 2 whose type is a person.
  • With this configuration, the display control device 100 can provide a person moving in the facility with information for avoiding contact between the person and a moving body moving in the facility.
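  • As one illustrative reading of the above, the sketch below selects which moving body the display image should be made visible to based on a type field; the MobileBodyInfo fields, units, and the "person" label are assumptions for illustration, not definitions from this specification.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class MobileBodyInfo:
    """Illustrative container for the first/second mobile body information."""
    position: Tuple[float, float]  # (x, y) in facility coordinates (assumed)
    speed: float                   # moving speed (assumed unit: m/s)
    direction: float               # moving direction (assumed unit: radians)
    kind: str                      # "person" or "autonomous" (assumed labels)

def display_targets(first: MobileBodyInfo, second: MobileBodyInfo) -> List[MobileBodyInfo]:
    """Return the moving bodies whose type is a person; the display image is
    shown in a space visible to these moving bodies."""
    return [body for body in (first, second) if body.kind == "person"]
```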
  • Further, the display control device 100 includes the facility layout drawing acquisition unit 160 that acquires facility layout drawing information indicating the arrangement positions of the structures constituting the facility, and the image acquisition unit 170 is configured to acquire the image information based on the facility layout drawing information acquired by the facility layout drawing acquisition unit 160, in addition to the first mobile body information acquired by the first mobile information acquisition unit 110 and the second mobile body information acquired by the second mobile information acquisition unit 120.
  • With this configuration, the display control device 100 can provide the first mobile body 1, which is a person moving in the facility, with information for avoiding contact between the first mobile body 1 and the second mobile body 2 moving in the facility, in accordance with the arrangement of the structures constituting the facility.
  • Further, the first mobile body 1 or the second mobile body 2 is an automatic driving mobile device that moves in the facility based on a predetermined movement plan, and the display control device 100 includes the movement plan acquisition unit 140 that acquires movement plan information indicating the movement plan of the first mobile body 1 or the second mobile body 2 which is the automatic driving mobile device. The image acquisition unit 170 is configured to acquire the image information based on the movement plan information acquired by the movement plan acquisition unit 140, in addition to the first mobile body information acquired by the first mobile information acquisition unit 110 and the second mobile body information acquired by the second mobile information acquisition unit 120. With this configuration, the display control device 100 can provide a person moving in the facility with information for avoiding contact between the person and the automatic driving mobile device moving in the facility, in accordance with the movement plan of the automatic driving mobile device.
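  • One conceivable use of such movement plan information, sketched below under the assumption that the plan is a list of waypoints in facility coordinates, is to check whether a person's position lies close to the planned path of the automatic driving mobile device; the distance threshold and the point-to-segment test are illustrative choices, not taken from this specification.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]

def distance_point_to_segment(p: Point, a: Point, b: Point) -> float:
    """Shortest distance from point p to the segment a-b."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0.0:
        return math.dist(p, a)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
    return math.dist(p, (ax + t * dx, ay + t * dy))

def person_near_planned_path(person_pos: Point, plan: List[Point],
                             threshold: float = 1.5) -> bool:
    """True if the person is within `threshold` of any leg of the planned path
    (the waypoint list and threshold are illustrative assumptions)."""
    return any(distance_point_to_segment(person_pos, a, b) < threshold
               for a, b in zip(plan, plan[1:]))
```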
  • Further, the display control device 100 includes the contact prediction unit 130 that predicts contact between the first mobile body 1 and the second mobile body 2 based on the first mobile body information acquired by the first mobile information acquisition unit 110 and the second mobile body information acquired by the second mobile information acquisition unit 120, and the image acquisition unit 170 is configured to acquire the image information based on the prediction result output by the contact prediction unit 130, in addition to the first mobile body information acquired by the first mobile information acquisition unit 110 and the second mobile body information acquired by the second mobile information acquisition unit 120.
  • With this configuration, when there is a possibility of contact between a person moving in the facility and an automatic driving mobile device moving in the facility, the display control device 100 can provide the person with information for avoiding the contact.
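  • A very simple form of such a prediction, sketched below, treats each moving body as a point moving at constant velocity derived from its position, moving speed, and moving direction, and reports a possible contact when the closest approach within a short time horizon falls below a distance threshold; the horizon and threshold values are assumptions for illustration.

```python
import math
from typing import Tuple

def may_contact(pos1: Tuple[float, float], speed1: float, dir1: float,
                pos2: Tuple[float, float], speed2: float, dir2: float,
                horizon: float = 5.0, threshold: float = 1.0) -> bool:
    """Constant-velocity closest-approach test between two moving bodies.
    The horizon (seconds) and threshold (distance) are assumed values."""
    # Relative position and relative velocity of body 2 with respect to body 1.
    px, py = pos2[0] - pos1[0], pos2[1] - pos1[1]
    vx = speed2 * math.cos(dir2) - speed1 * math.cos(dir1)
    vy = speed2 * math.sin(dir2) - speed1 * math.sin(dir1)
    v_sq = vx * vx + vy * vy
    if v_sq == 0.0:
        t_closest = 0.0  # same velocity: the gap between the bodies never changes
    else:
        t_closest = max(0.0, min(horizon, -(px * vx + py * vy) / v_sq))
    closest = math.hypot(px + vx * t_closest, py + vy * t_closest)
    return closest < threshold
```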
  • Further, the image acquisition unit 170 is configured to acquire the image information based on a predetermined movement rule, in addition to the first mobile body information acquired by the first mobile information acquisition unit 110 and the second mobile body information acquired by the second mobile information acquisition unit 120.
  • With this configuration, the display control device 100 can provide a person moving in the facility with information for avoiding contact between the person and a moving body moving in the facility, taking the predetermined movement rule into consideration.
  • FIG. 8 is a block diagram showing an example of the configuration of a main part of the display system 10a according to the second embodiment to which the display control device 100a according to the second embodiment is applied.
  • In the display system 10a, the in-facility equipment 15 is added to the display system 10 according to the first embodiment, and the display control device 100 in the display system 10 according to the first embodiment is replaced with the display control device 100a.
  • The display system 10a includes the display output device 11, the sensor 12, the mobile body detection device 13, the storage device 14, the in-facility equipment 15, and the display control device 100a.
  • The display output device 11, the sensor 12, the mobile body detection device 13, the storage device 14, the in-facility equipment 15, and the display control device 100a included in the display system 10a are connected to one another via a network 19 capable of transmitting and receiving information.
  • In FIG. 8, the same blocks as those shown in FIG. 1 are designated by the same reference numerals, and the description thereof will be omitted.
  • The in-facility equipment 15 is equipment installed in the facility, such as an elevator, an automatic door, a security gate, or a platform door of a station facility, through which a person moving in the facility or an automatic driving mobile device, such as a self-propelled robot, moving in the facility goes in and out.
  • The sensor 12 is installed in the in-facility equipment 15.
  • The sensor 12 outputs the sensor information to the in-facility equipment 15.
  • The in-facility equipment 15 acquires the sensor information output by the sensor 12 and outputs the sensor information to the mobile body detection device 13.
  • the in-facility equipment 15 outputs the equipment state information indicating the operating state of the equipment installed in the facility to the display control device 100a.
  • For example, if the in-facility equipment 15 is an elevator, the equipment state information is information indicating the position of the car in the hoistway, information indicating the operation state of the landing operation panel, information indicating the operation state of the car operation panel, and the like. Further, if the in-facility equipment 15 is equipment such as an automatic door, a security gate, or a platform door, the equipment state information is information indicating the open/closed state of the door or the gate.
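  • Purely for illustration, such equipment state information could be carried in simple records like the following; the field names and values are assumptions, not part of this specification.

```python
# Illustrative equipment state payloads (field names and values are assumed).
elevator_state = {
    "equipment": "elevator",
    "car_floor": 3,                # position of the car in the hoistway
    "hall_call_registered": True,  # operation state of the landing operation panel
    "car_call_registered": False,  # operation state of the car operation panel
    "landing_door": "closed",      # open/closed state of the landing door
}

automatic_door_state = {
    "equipment": "automatic_door",  # likewise for a security gate or platform door
    "door": "open",                 # open/closed state of the door or gate
}
```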
  • The display control device 100a acquires the first mobile body information and the second mobile body information output by the mobile body detection device 13 and the equipment state information output by the in-facility equipment 15, and outputs image information indicating a display image corresponding to the first mobile body information and the second mobile body information to the in-facility equipment 15.
  • The in-facility equipment 15 acquires the image information output by the display control device 100a and outputs the acquired image information to the display output device 11. More specifically, for example, the in-facility equipment 15 outputs the image information acquired from the display control device 100a to the display output device 11 as an image signal to be output as a display image.
  • FIG. 9 is a block diagram showing an example of the configuration of the main part of the display control device 100a according to the second embodiment.
  • In the display control device 100a, the equipment state acquisition unit 190 is added to the display control device 100 according to the first embodiment, and the image acquisition unit 170 and the image output unit 180 in the display control device 100 according to the first embodiment are changed to the image acquisition unit 170a and the image output unit 180a. That is, the display control device 100a includes the first mobile information acquisition unit 110, the second mobile information acquisition unit 120, the contact prediction unit 130, the movement plan acquisition unit 140, the movement instruction unit 150, the facility layout drawing acquisition unit 160, the equipment state acquisition unit 190, the image acquisition unit 170a, and the image output unit 180a.
  • In FIG. 9, the same blocks as those shown in FIG. 3 are designated by the same reference numerals, and the description thereof will be omitted.
  • The equipment state acquisition unit 190 acquires the equipment state information output by the in-facility equipment 15.
  • The image acquisition unit 170a acquires the image information based on the equipment state information acquired by the equipment state acquisition unit 190.
  • Similarly to the image information acquired by the image acquisition unit 170 according to the first embodiment, the image information acquired by the image acquisition unit 170a is image information indicating a display image to be displayed in a space in the facility by the display output device 11 installed in the facility.
  • the image output unit 180a outputs the image information acquired by the image acquisition unit 170a.
  • the image output unit 180a outputs the image information to the in-facility equipment 15.
  • The display control device 100a may generate control information for causing a moving body that is an automatic driving mobile device to perform a movement corresponding to the display image indicated by the image information output by the image output unit 180a, and may transmit the control information to the moving body.
  • The functions of the first mobile information acquisition unit 110, the second mobile information acquisition unit 120, the contact prediction unit 130, the movement plan acquisition unit 140, the movement instruction unit 150, the facility layout drawing acquisition unit 160, the equipment state acquisition unit 190, the image acquisition unit 170a, and the image output unit 180a may be realized by the processor 601 and the memory 602 in the hardware configuration shown in FIGS. 6A and 6B in the first embodiment, or may be realized by the processing circuit 603.
  • The image information acquired by the image acquisition unit 170a according to the second embodiment based on the first mobile body information, the second mobile body information, and the equipment state information will now be described with reference to FIGS. 10A, 10B, 10C, and 10D.
  • FIGS. 10A, 10B, 10C, and 10D are diagrams showing, for the case where the in-facility equipment 15 is an elevator, states in which the display images indicated by the image information acquired by the image acquisition unit 170a are displayed by the display output device 11 in the order of FIGS. 10A, 10B, 10C, and 10D.
  • FIG. 10A shows a state in which two moving bodies moving in the facility are waiting, on a certain floor in the facility, for the elevator car to arrive. FIG. 10A also shows a state in which one or more moving bodies (not shown) moving in the facility are riding in the car.
  • In the following description, the two moving bodies waiting on the floor for the car to arrive are regarded as one moving body and are referred to as the first moving body 1, and the one or more moving bodies that have entered the car are referred to as the second moving body 2 (not shown).
  • At least one of the two moving bodies constituting the first moving body 1 will be described as being a person moving in the facility.
  • The in-facility equipment 15, which is an elevator, outputs, to the display control device 100a as the equipment state information, information indicating the operation state of the landing operation panel on the floor, information indicating the operation state of the car operation panel in the car, information indicating that the car is moving toward the floor, and the like.
  • The information indicating the operation state of the landing operation panel on the floor is information indicating an operation for stopping the car at the floor, and the information indicating the operation state of the car operation panel in the car is also information indicating an operation for stopping the car at the floor.
  • The sensor 12 (not shown) installed in the car outputs sensor information to the mobile body detection device 13.
  • The mobile body detection device 13 acquires the sensor information output by the sensor 12, generates the second mobile body information indicating the position and the like of the second moving body 2, and outputs the generated second mobile body information to the display control device 100a.
  • The sensor 12 (not shown) installed on the floor outputs sensor information to the mobile body detection device 13.
  • The mobile body detection device 13 acquires the sensor information output by the sensor 12, generates the first mobile body information indicating the position and the like of the first moving body 1, and outputs the generated first mobile body information to the display control device 100a.
  • The image acquisition unit 170a acquires, based on the first mobile body information acquired by the first mobile information acquisition unit 110, the second mobile body information acquired by the second mobile information acquisition unit 120, and the equipment state information acquired by the equipment state acquisition unit 190, image information indicating, for example, a display image that suggests to the first moving body 1 that the second moving body 2 may come out of the car and that prompts the first moving body 1 to stop.
  • FIG. 10A shows a state in which the display output device 11 (not shown) installed on the floor side of the elevator displays the display image indicated by the image information. With this configuration, the display control device 100a can provide a person moving in the facility with information for avoiding contact between the person and a moving body moving in the facility.
  • FIG. 10B shows a state in which the car arrives at the floor and the landing door is opened after the state shown in FIG. 10A.
  • As in FIG. 10A, the two moving bodies waiting on the floor for the car to arrive are regarded as one moving body and are referred to as the first moving body 1, and the one or more moving bodies that have entered the car are referred to as the second moving body 2.
  • The in-facility equipment 15, which is an elevator, outputs, to the display control device 100a as the equipment state information, information indicating that the car is stopped at the floor and that the landing door on the floor is open, and the like.
  • The sensor 12 (not shown) installed in the car outputs sensor information to the mobile body detection device 13.
  • The mobile body detection device 13 acquires the sensor information output by the sensor 12, generates the second mobile body information indicating the position and the like of the second moving body 2, and outputs the generated second mobile body information to the display control device 100a.
  • The sensor 12 (not shown) installed on the floor outputs sensor information to the mobile body detection device 13.
  • The mobile body detection device 13 acquires the sensor information output by the sensor 12, generates the first mobile body information indicating the position and the like of the first moving body 1, and outputs the generated first mobile body information to the display control device 100a.
  • The image acquisition unit 170a acquires, based on the first mobile body information acquired by the first mobile information acquisition unit 110, the second mobile body information acquired by the second mobile information acquisition unit 120, and the equipment state information acquired by the equipment state acquisition unit 190, image information indicating, for example, a display image that prompts the second moving body 2 to get out of the car and prompts the first moving body 1 to stop.
  • FIG. 10B shows a state in which the display output device 11 (not shown) installed on the floor side of the elevator displays the display image indicated by the image information.
  • With this configuration, the display control device 100a can provide a person moving in the facility with information for avoiding contact between the person and a moving body moving in the facility.
  • FIG. 10C shows a state, after the state shown in FIG. 10B, in which the one or more moving bodies that were riding in the car have finished getting out of the elevator car and the two moving bodies moving in the facility have not yet boarded the car.
  • Here, the first moving body 1 will be described as a person moving in the facility, and the second moving body 2 will be described as an automatic driving mobile device, such as a self-propelled robot, moving in the facility, or as a person moving in the facility.
  • The in-facility equipment 15, which is an elevator, outputs, to the display control device 100a as the equipment state information, information indicating that the car is stopped at the floor and that the landing door on the floor is open, and the like.
  • The sensor 12 (not shown) installed on the floor outputs sensor information to the mobile body detection device 13.
  • The mobile body detection device 13 acquires the sensor information output by the sensor 12, generates the first mobile body information indicating the position and the like of the first moving body 1 and the second mobile body information indicating the position and the like of the second moving body 2, and outputs the generated first mobile body information and second mobile body information to the display control device 100a.
  • The image acquisition unit 170a acquires, based on the first mobile body information acquired by the first mobile information acquisition unit 110, the second mobile body information acquired by the second mobile information acquisition unit 120, and the equipment state information acquired by the equipment state acquisition unit 190, image information indicating, for example, a display image that prompts the second moving body 2 to get into the car and prompts the first moving body 1 to stop.
  • FIG. 10C shows a state in which the display output device 11 (not shown) installed on the floor side of the elevator displays the display image indicated by the image information.
  • With this configuration, the display control device 100a can provide a person moving in the facility with information for avoiding contact between the person and a moving body moving in the facility.
  • FIG. 10D shows a state, after the state shown in FIG. 10C, in which the second moving body 2 has finished boarding the car and the first moving body 1 has not yet boarded the car.
  • The in-facility equipment 15, which is an elevator, outputs, to the display control device 100a as the equipment state information, information indicating that the car is stopped at the floor and that the landing door on the floor is open, and the like.
  • The sensor 12 (not shown) installed in the car outputs sensor information to the mobile body detection device 13.
  • The mobile body detection device 13 acquires the sensor information output by the sensor 12, generates the second mobile body information indicating the position and the like of the second moving body 2, and outputs the generated second mobile body information to the display control device 100a.
  • The sensor 12 (not shown) installed on the floor outputs sensor information to the mobile body detection device 13.
  • The mobile body detection device 13 acquires the sensor information output by the sensor 12, generates the first mobile body information indicating the position and the like of the first moving body 1, and outputs the generated first mobile body information to the display control device 100a.
  • The image acquisition unit 170a acquires, based on the first mobile body information acquired by the first mobile information acquisition unit 110, the second mobile body information acquired by the second mobile information acquisition unit 120, and the equipment state information acquired by the equipment state acquisition unit 190, image information indicating, for example, a display image that prompts the first moving body 1 to get into the car.
  • FIG. 10D shows a state in which the display output device 11 (not shown) installed on the floor side of the elevator displays the display image indicated by the image information.
  • With this configuration, the display control device 100a can provide a person moving in the facility with information for avoiding contact between the person and a moving body moving in the facility.
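  • Taken together, the sequence of FIGS. 10A to 10D can be read as a small decision table over the equipment state and the boarding situation of the two moving bodies; the sketch below is one illustrative way to express it, with assumed state flags and message strings that are not taken from this specification.

```python
def select_elevator_display(landing_door_open: bool,
                            occupants_in_car: bool,
                            second_boarded: bool) -> str:
    """Pick a message (standing in for a display image) for the waiting
    first moving body, following the order of FIGS. 10A to 10D."""
    if occupants_in_car and not landing_door_open:
        return "Please wait: the arriving car is occupied."     # FIG. 10A
    if occupants_in_car and landing_door_open:
        return "Please wait: passengers are exiting the car."   # FIG. 10B
    if landing_door_open and not second_boarded:
        return "Please wait: the robot is boarding the car."    # FIG. 10C
    if landing_door_open and second_boarded:
        return "Please board the car."                          # FIG. 10D
    return ""
```

  • In such a sketch, the selected message would stand in for the display image output by the display output device 11 installed on the floor side, as described above.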
  • FIG. 11 is a diagram showing an example of the image information acquired by the image acquisition unit 170a according to the second embodiment based on the first mobile body information, the second mobile body information, and the equipment state information when the in-facility equipment 15 is an automatic door. More specifically, FIG. 11 is a diagram showing a state in which the display image indicated by the image information acquired by the image acquisition unit 170a is displayed by the display output device 11 when the in-facility equipment 15 is an automatic door, and is a view of a certain floor in the facility seen from the ceiling toward the floor surface.
  • FIG. 11 shows the first moving body 1, which moves in the facility in the direction of the arrow X shown in FIG. 11 and passes through the automatic door, in a state before it passes through the automatic door. FIG. 11 also shows the second moving body 2, which moves in the facility in the direction of the arrow Y shown in FIG. 11 and passes through the automatic door, in a state before it passes through the automatic door.
  • Here, the first mobile body 1 will be described as a person moving in the facility, and the second mobile body 2 will be described as an automatic driving mobile device, such as a self-propelled robot, moving in the facility, or as a person moving in the facility.
  • The in-facility equipment 15, which is an automatic door, outputs information indicating the open/closed state of the automatic door to the display control device 100a as the equipment state information.
  • The sensor 12 (not shown), installed on a structure constituting the facility such as the automatic door, a wall in the vicinity of the automatic door, or the ceiling, outputs sensor information to the mobile body detection device 13.
  • The mobile body detection device 13 acquires the sensor information output by the sensor 12, generates the first mobile body information indicating the position and the like of the first moving body 1 and the second mobile body information indicating the position and the like of the second moving body 2, and outputs the generated first mobile body information and second mobile body information to the display control device 100a.
  • The image acquisition unit 170a acquires, based on the first mobile body information acquired by the first mobile information acquisition unit 110, the second mobile body information acquired by the second mobile information acquisition unit 120, and the equipment state information acquired by the equipment state acquisition unit 190, image information indicating, for example, a display image that prompts the first moving body 1 to stop and suggests to the first moving body 1 that the second moving body 2 will move.
  • The image information acquired by the image acquisition unit 170a may indicate a display image or the like that prompts the first moving body 1 to stop and prompts the second moving body 2 to move, or may indicate a display image or the like that suggests to the second moving body 2 that the first moving body 1 will stop and that prompts the second moving body 2 to move.
  • FIG. 11 shows a state in which the display output device 11 installed on the wall near the automatic door displays a display image that prompts the first mobile body 1, which is a person moving in the facility, to stop and suggests that the second mobile body 2, which is an automatic driving mobile device moving in the facility, will move.
  • With this configuration, the display control device 100a can provide a person moving in the facility with information for avoiding contact between the person and a moving body moving in the facility.
  • The image information acquired by the image acquisition unit 170a has been described with reference to FIG. 11 by taking the case where the in-facility equipment 15 is an automatic door as an example; however, the same configuration is possible when the in-facility equipment 15 is a security gate, a platform door of a station facility, or the like.
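  • The automatic door case can be illustrated in the same assumed style: the sketch below compares how soon each moving body would reach the shared doorway and asks the later one to stop; this arrival-time rule is an assumption for the sketch, whereas the description above simply prompts the person to stop and suggests that the other moving body will move.

```python
import math
from typing import Dict, Tuple

def door_guidance(first_pos: Tuple[float, float], first_speed: float,
                  second_pos: Tuple[float, float], second_speed: float,
                  door_pos: Tuple[float, float]) -> Dict[str, str]:
    """Assumed rule: the moving body that would reach the door later is asked
    to stop, and the other is suggested to pass through first."""
    def eta(pos, speed):
        return math.dist(pos, door_pos) / speed if speed > 0 else math.inf
    if eta(first_pos, first_speed) >= eta(second_pos, second_speed):
        return {"first": "stop", "second": "pass through the door"}
    return {"first": "pass through the door", "second": "stop"}
```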
  • The image acquisition unit 170a may acquire the image information based on a predetermined movement rule, in addition to the first mobile body information acquired by the first mobile information acquisition unit 110, the second mobile body information acquired by the second mobile information acquisition unit 120, and the equipment state information acquired by the equipment state acquisition unit 190.
  • The manner in which the image acquisition unit 170a acquires the image information based on the prediction result output by the contact prediction unit 130, the movement plan information acquired by the movement plan acquisition unit 140, the facility layout drawing information acquired by the facility layout drawing acquisition unit 160, or a predetermined movement rule has already been described in the first embodiment, and thus the description thereof will be omitted.
  • FIG. 12 is a flowchart showing an example of processing of the display control device 100a according to the second embodiment.
  • the display control device 100a repeatedly executes, for example, the processing of the flowchart.
  • First, in step ST1201, the facility layout drawing acquisition unit 160 acquires the facility layout drawing information.
  • Next, in step ST1202, the first mobile information acquisition unit 110 acquires the first mobile body information.
  • Next, in step ST1203, the second mobile information acquisition unit 120 acquires the second mobile body information.
  • Next, in step ST1204, the equipment state acquisition unit 190 acquires the equipment state information.
  • Next, in step ST1205, the movement plan acquisition unit 140 acquires the movement plan information indicating the movement plan according to which the automatic driving mobile device moves.
  • Next, in step ST1206, the contact prediction unit 130 predicts whether or not the first moving body 1 and the second moving body 2 may come into contact with each other, and outputs the prediction result.
  • Next, in step ST1207, the image acquisition unit 170a acquires the image information.
  • Next, in step ST1208, the image output unit 180a outputs the image information.
  • Next, in step ST1209, when the second moving body 2 is an automatic driving mobile device that moves in the facility, the movement instruction unit 150 outputs movement instruction information indicating a movement instruction for the automatic driving mobile device.
  • After step ST1209, the display control device 100a ends the processing of the flowchart. After finishing the processing of the flowchart, the display control device 100a returns to step ST1202 and repeatedly executes the processing of the flowchart. If the display control device 100a does not include the facility layout drawing acquisition unit 160, the process of step ST1201 is omitted. If the display control device 100a does not include the movement plan acquisition unit 140, the process of step ST1205 is omitted. If the display control device 100a does not include the contact prediction unit 130, the process of step ST1206 is omitted. Further, if the display control device 100a does not include the movement instruction unit 150, the process of step ST1209 is omitted. Further, the order of the processes from step ST1201 to step ST1204 is arbitrary.
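  • In the same illustrative style as the sketch given for FIG. 7, one pass of the FIG. 12 flow adds only the acquisition of the equipment state information and passes it to the image acquisition unit; the unit objects and method names remain assumptions for the sketch.

```python
# Illustrative sketch of one pass of the FIG. 12 flow (second embodiment).
# Only ST1204 and the extra argument to the image acquisition unit differ
# from the FIG. 7 sketch; unit objects and method names are assumed.
def run_display_control_once_2(units):
    layout = units["layout"].acquire() if units.get("layout") else None       # ST1201
    first = units["first"].acquire()                                          # ST1202
    second = units["second"].acquire()                                        # ST1203
    equipment = units["equipment"].acquire()                                  # ST1204: equipment state information
    plan = units["plan"].acquire() if units.get("plan") else None             # ST1205
    prediction = (units["contact"].predict(first, second)
                  if units.get("contact") else None)                          # ST1206
    image = units["image"].acquire(first, second, equipment,
                                   layout, plan, prediction)                  # ST1207
    units["output"].output(image)                                             # ST1208
    if units.get("instruction"):                                              # ST1209
        units["instruction"].output(first, second, image)
    return image
```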
  • As described above, the display control device 100a according to the second embodiment includes the first mobile information acquisition unit 110 that acquires the position, moving speed, and moving direction of the first moving body 1 moving in the facility, the second mobile information acquisition unit 120 that acquires the position, moving speed, and moving direction of the second moving body 2 moving in the facility, the equipment state acquisition unit 190 that acquires the equipment state information indicating the operating state of the equipment installed in the facility, the image acquisition unit 170a that acquires, based on the equipment state information acquired by the equipment state acquisition unit 190 in addition to the first mobile body information acquired by the first mobile information acquisition unit 110 and the second mobile body information acquired by the second mobile information acquisition unit 120, image information indicating a display image to be displayed in a space in the facility by the display output device 11 installed in the facility, and the image output unit 180a that outputs the image information acquired by the image acquisition unit 170a.
  • With this configuration, the display control device 100a can provide a person moving in the facility with information for avoiding contact between the person and a moving body moving in the facility, in accordance with the operating state of the equipment installed in the facility.
  • the display control devices 100, 100a and the display systems 10, 10a can also be applied to facilities such as parking lots where vehicles such as automobiles travel.
  • In this case, the automatic driving mobile device may be a vehicle, and the equipment installed in the facility is, for example, a gate provided at the entrance or exit of the parking lot, or a vehicle lock device such as a lock plate that restricts the movement of a parked vehicle.
  • the display control device of the present invention can be applied to a display system that displays a display image in a facility.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • User Interface Of Digital Computer (AREA)
PCT/JP2019/046470 2019-11-28 2019-11-28 表示制御装置、表示システム、及び、表示制御方法 WO2021106122A1 (ja)

Priority Applications (5)

Application Number Priority Date Filing Date Title
PCT/JP2019/046470 WO2021106122A1 (ja) 2019-11-28 2019-11-28 表示制御装置、表示システム、及び、表示制御方法
DE112019007831.3T DE112019007831T5 (de) 2019-11-28 2019-11-28 Anzeigesteuerungsvorrichtung, anzeigesystem und anzeigesteuerungsverfahren
CN201980102161.2A CN114730185A (zh) 2019-11-28 2019-11-28 显示控制装置、显示系统和显示控制方法
JP2020522395A JP6833111B1 (ja) 2019-11-28 2019-11-28 表示制御装置、表示システム、及び、表示制御方法
US17/700,495 US20220215666A1 (en) 2019-11-28 2022-03-22 Display control device, display system, and display control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/046470 WO2021106122A1 (ja) 2019-11-28 2019-11-28 表示制御装置、表示システム、及び、表示制御方法

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/700,495 Continuation US20220215666A1 (en) 2019-11-28 2022-03-22 Display control device, display system, and display control method

Publications (1)

Publication Number Publication Date
WO2021106122A1 true WO2021106122A1 (ja) 2021-06-03

Family

ID=74665133

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/046470 WO2021106122A1 (ja) 2019-11-28 2019-11-28 表示制御装置、表示システム、及び、表示制御方法

Country Status (5)

Country Link
US (1) US20220215666A1 (de)
JP (1) JP6833111B1 (de)
CN (1) CN114730185A (de)
DE (1) DE112019007831T5 (de)
WO (1) WO2021106122A1 (de)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230056993A1 (en) * 2020-02-28 2023-02-23 Nec Corporation Authentication terminal, entrance/exit management system, entrance/exit management method, and program
JP7400641B2 (ja) * 2020-07-01 2023-12-19 トヨタ自動車株式会社 情報処理装置、情報処理方法、及び、プログラム
JP2022177375A (ja) * 2021-05-18 2022-12-01 株式会社日立製作所 人機械協調制御システム並びに人機械協調制御方法

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001184599A (ja) * 1999-12-27 2001-07-06 Hitachi Eng Co Ltd 施設利用者情報探索システム
JP2007149053A (ja) * 2005-10-24 2007-06-14 Shimizu Corp 道案内システムおよび道案内方法
WO2014024254A1 (ja) * 2012-08-07 2014-02-13 株式会社日立製作所 自律走行装置の利用支援ツール、運用管理センタ、運用システム及び自律走行装置
JP2014153118A (ja) * 2013-02-06 2014-08-25 Chugoku Electric Power Co Inc:The 移動体の位置又は向きを示す情報を取得するシステム及び方法
JP2017026568A (ja) * 2015-07-28 2017-02-02 ソニー株式会社 情報処理装置、情報処理方法及びプログラム
JP2019087268A (ja) * 2018-12-27 2019-06-06 大日本印刷株式会社 光学システム
JP2019144168A (ja) * 2018-02-22 2019-08-29 パナソニックIpマネジメント株式会社 ナビゲーション方法、ナビゲーションシステム、移動体、及び、ナビゲーションプログラム

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5942840B2 (ja) * 2012-12-21 2016-06-29 ソニー株式会社 表示制御システム及び記録媒体
KR102071575B1 (ko) * 2013-04-23 2020-01-30 삼성전자 주식회사 이동로봇, 사용자단말장치 및 그들의 제어방법
TWI547355B (zh) * 2013-11-11 2016-09-01 財團法人工業技術研究院 人機共生安全監控系統及其方法
JP2015201113A (ja) 2014-04-10 2015-11-12 スズキ株式会社 歩行者と運転者間の情報共用システム
CN109789990B (zh) * 2016-10-04 2020-12-15 三菱电机株式会社 自主移动体控制装置
KR102608046B1 (ko) * 2016-10-10 2023-11-30 엘지전자 주식회사 공항용 안내 로봇 및 그의 동작 방법
WO2019163412A1 (ja) * 2018-02-22 2019-08-29 パナソニックIpマネジメント株式会社 ナビゲーション方法、ナビゲーションシステム、移動体、及び、ナビゲーションプログラム
JP7360406B2 (ja) * 2018-06-26 2023-10-12 ファナック アメリカ コーポレイション ロボット型ピッキングシステムのための拡張現実可視化
US11082667B2 (en) * 2018-08-09 2021-08-03 Cobalt Robotics Inc. Contextual automated surveillance by a mobile robot
US10780897B2 (en) * 2019-01-31 2020-09-22 StradVision, Inc. Method and device for signaling present driving intention of autonomous vehicle to humans by using various V2X-enabled application
JP7345128B2 (ja) * 2019-05-20 2023-09-15 パナソニックIpマネジメント株式会社 歩行者装置および交通安全支援方法

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001184599A (ja) * 1999-12-27 2001-07-06 Hitachi Eng Co Ltd 施設利用者情報探索システム
JP2007149053A (ja) * 2005-10-24 2007-06-14 Shimizu Corp 道案内システムおよび道案内方法
WO2014024254A1 (ja) * 2012-08-07 2014-02-13 株式会社日立製作所 自律走行装置の利用支援ツール、運用管理センタ、運用システム及び自律走行装置
JP2014153118A (ja) * 2013-02-06 2014-08-25 Chugoku Electric Power Co Inc:The 移動体の位置又は向きを示す情報を取得するシステム及び方法
JP2017026568A (ja) * 2015-07-28 2017-02-02 ソニー株式会社 情報処理装置、情報処理方法及びプログラム
JP2019144168A (ja) * 2018-02-22 2019-08-29 パナソニックIpマネジメント株式会社 ナビゲーション方法、ナビゲーションシステム、移動体、及び、ナビゲーションプログラム
JP2019087268A (ja) * 2018-12-27 2019-06-06 大日本印刷株式会社 光学システム

Also Published As

Publication number Publication date
CN114730185A (zh) 2022-07-08
US20220215666A1 (en) 2022-07-07
DE112019007831T5 (de) 2022-08-11
JP6833111B1 (ja) 2021-02-24
JPWO2021106122A1 (de) 2021-06-03

Similar Documents

Publication Publication Date Title
US11619998B2 (en) Communication between autonomous vehicle and external observers
JP6833111B1 (ja) 表示制御装置、表示システム、及び、表示制御方法
US11225246B2 (en) Vehicle control method and apparatus, and storage medium
CN107257994B (zh) 用于在停车环境中对机动车进行交通协调的方法
WO2018037954A1 (ja) 移動体制御装置、移動体制御方法、及び、移動体
JP7308992B2 (ja) Lidarベースの通信
US20200133261A1 (en) Method and apparatus for controlling autonomous driving of vehicle, electronic device and storage medium
CN111469832A (zh) 用于自主代客停车的系统、方法、基础设施和车辆
US11815887B2 (en) Vehicle control device, vehicle control method, vehicle, information processing device, information processing method, and program
CN111951566A (zh) 车辆控制系统、车辆控制方法及存储介质
JP6945630B2 (ja) 車両管理システム
EP4057253B1 (de) System, verwaltungsverfahren, automatisch fahrendes fahrzeug und programm
JP2020082918A (ja) 車両の制御装置及び乗客輸送システム
US20220063665A1 (en) Systems And Methods For Directing A Parked Vehicle To Travel From A First Parking Spot To A Second Parking Spot
JP2019185629A (ja) 車高検知装置および車高検知方法
WO2022070250A1 (ja) 情報処理装置、情報処理方法、および、プログラム
US20220343757A1 (en) Information processing apparatus, information processing system, and information processing method
JP7217415B2 (ja) 情報処理装置、情報処理方法及び情報処理プログラム
US20220308556A1 (en) Delivery robot and notification method
CN108284830A (zh) 用于运行停放在第一位置上的机动车的方法和装置
US11333523B2 (en) Vehicle control device, output device, and input and output device
CN113168773A (zh) 移动体控制设备、移动体控制方法、移动体、信息处理装置、信息处理方法以及程序
WO2022196082A1 (ja) 情報処理装置、情報処理方法、及び、プログラム
US20240142967A1 (en) Secure software communication with autonomous vehicles
US20240143705A1 (en) Secure software communication with autonomous vehicles

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2020522395

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19954217

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 19954217

Country of ref document: EP

Kind code of ref document: A1