WO2021245886A1 - Traffic management system for a building - Google Patents

Traffic management system for a building

Info

Publication number
WO2021245886A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
information
building
management system
unit
Prior art date
Application number
PCT/JP2020/022138
Other languages
English (en)
Japanese (ja)
Inventor
剛 山崎
英治 湯浅
Original Assignee
Mitsubishi Electric Corporation (三菱電機株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation (三菱電機株式会社)
Priority to JP2022529256A (JP7294538B2)
Priority to PCT/JP2020/022138 (WO2021245886A1)
Priority to CN202080101375.0A (CN115698632A)
Publication of WO2021245886A1

Classifications

    • G: PHYSICS
      • G01: MEASURING; TESTING
        • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
          • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C 1/00-G01C 19/00
            • G01C 21/26: Navigation specially adapted for navigation in a road network
              • G01C 21/34: Route searching; Route guidance
      • G06: COMPUTING; CALCULATING OR COUNTING
        • G06F: ELECTRIC DIGITAL DATA PROCESSING
          • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
            • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
              • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
      • G08: SIGNALLING
        • G08G: TRAFFIC CONTROL SYSTEMS
          • G08G 1/00: Traffic control systems for road vehicles
            • G08G 1/005: Traffic control systems for road vehicles including pedestrian guidance indicator

Definitions

  • This disclosure relates to a traffic management system for buildings.
  • Patent Document 1 discloses an example of a guidance system.
  • The guidance system guides users based on the congestion status of the elevator landing.
  • However, the system of Patent Document 1 guides users by means of a guidance display unit provided near the entrance/exit of the building. Therefore, it cannot guide users who are moving inside the building.
  • The present disclosure provides a traffic management system for a building that can guide users moving inside the building.
  • The building traffic management system includes: a congestion information acquisition unit that acquires, for each of a plurality of floors, congestion information on one or more users in a building having the plurality of floors; a position information acquisition unit that acquires the position information, within the building, of a first user who carries a mobile device among the one or more users; a route information generation unit that generates route information for guiding the first user based on the position information of the first user acquired by the position information acquisition unit, purpose information of the first user, and the congestion information acquired by the congestion information acquisition unit for the floor, among the plurality of floors, on which the first user is located; and a guidance information generation unit that generates guidance information by which the mobile device guides the first user according to the route information generated by the route information generation unit.
  • Alternatively, the building traffic management system includes: the mobile device carried by the first user among one or more users of a building having a plurality of floors; a congestion information acquisition unit that acquires, for each of the plurality of floors, congestion information on the one or more users in the building; a position information acquisition unit that acquires the position information of the first user within the building; a route information generation unit that generates route information for guiding the first user based on the position information of the first user acquired by the position information acquisition unit, the purpose information of the first user, and the congestion information acquired by the congestion information acquisition unit for the floor, among the plurality of floors, on which the first user is located; and a guidance information generation unit that generates guidance information by which the mobile device guides the first user according to the route information generated by the route information generation unit.
  • Such a building traffic management system can guide users moving inside the building.
  • FIG. 1 is a block diagram of the traffic management system according to Embodiment 1.
  • FIG. 2 is a diagram showing an example of the functions of the mobile device according to Embodiment 1.
  • FIG. 3 is a diagram showing an example of the related information stored in the related information storage unit according to Embodiment 1.
  • FIG. 4 is a diagram showing an example of the route information generated by the route information generation unit according to Embodiment 1.
  • FIG. 5 is a diagram showing an example of the guidance information generated by the guidance information generation unit according to Embodiment 1.
  • FIG. 6 is a diagram showing an example of an image captured by the field-of-view camera according to Embodiment 1.
  • FIG. 7 is a diagram showing an example of the guidance information generated by the guidance information generation unit according to Embodiment 1.
  • FIG. 8 is a diagram showing an example of the guidance information generated by the guidance information generation unit according to Embodiment 1.
  • FIG. 9 is a diagram showing an example of the guidance information generated by the guidance information generation unit according to Embodiment 1.
  • FIG. 10 is a flowchart showing an example of the operation of the traffic management system according to Embodiment 1.
  • FIG. 11 is a hardware configuration diagram of the main part of the traffic management system according to Embodiment 1.
  • FIG. 12 is a block diagram of the traffic management system according to Embodiment 2.
  • FIG. 13 is a hardware configuration diagram of the main part of the traffic management system according to Embodiment 1.
  • FIG. 14 is a diagram showing an example of the virtual design generated by the virtual design generation unit according to Embodiment 2.
  • FIG. 15 is a block diagram of the traffic management system according to Embodiment 3.
  • FIG. 16 is a diagram showing an example of the viewpoint information accumulated by the information storage unit according to Embodiment 3.
  • FIG. 17 is a block diagram of the traffic management system according to Embodiment 4.
  • FIG. 18 is a plan view of the landing according to Embodiment 4.
  • FIG. 19 is a diagram showing an example of the guidance information generated by the guidance information generation unit according to Embodiment 4.
  • FIG. 20 is a block diagram of the traffic management system according to Embodiment 5.
  • FIG. 21 is a diagram showing a user boarding the car according to Embodiment 5.
  • FIG. 22 is a diagram showing an example of the guidance information generated by the guidance information generation unit according to Embodiment 5.
  • FIG. 1 is a configuration diagram of the traffic management system 1 according to the first embodiment.
  • The traffic management system 1 is applied to, for example, a building 2 having a plurality of floors.
  • In this example, the building 2 is an office building.
  • The floor-to-floor transportation means are the devices or equipment used by users of the building 2 to move between the plurality of floors.
  • The floor-to-floor transportation means include, for example, stairs 3, an escalator 4, and an elevator 5.
  • The stairs 3 are provided between two adjacent floors.
  • The escalator 4 is a device that transports users riding on its circulating steps between two floors.
  • The building 2 is provided with a hoistway 6 for the elevator 5.
  • The hoistway 6 is a vertically long space spanning a plurality of floors.
  • A landing 7 for the elevator 5 is provided on each floor of the building 2.
  • Landing equipment is provided at the landing 7.
  • The landing equipment is the equipment of the elevator 5 provided at the landing 7.
  • The landing equipment includes a landing operation panel 8, a landing display panel 9, a landing door 10, and a three-sided frame 11.
  • The landing operation panel 8 is a device that accepts operations such as registration of a user's landing call at the landing 7.
  • The landing display panel 9 is a device that displays the status of the elevator 5.
  • The landing door 10 is provided at the opening leading from the landing 7 to the hoistway 6.
  • The landing door 10 is a door that separates the landing 7 from the hoistway 6.
  • The three-sided frame 11 is a frame that surrounds the opening leading from the landing 7 to the hoistway 6 on its upper, left, and right sides.
  • A coded image 12 is attached to a wall surface of the landing 7.
  • The coded image 12 is, for example, an image in which information identifying the landing 7 and the floor of the elevator 5 is encoded.
  • The coded image 12 is, for example, a two-dimensional code symbol conforming to the ISO/IEC 18004 standard.
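As an illustration, decoding such a coded image 12 might yield a short payload identifying the building, floor, and landing. The `key=value;key=value` payload format and its field names below are assumptions for this sketch; the disclosure only states that information identifying the landing 7 and the floor is encoded.

```python
def parse_landing_payload(payload: str) -> dict:
    """Parse the decoded payload of a coded image into landing information.

    Assumes a hypothetical 'key=value;key=value' payload such as
    'bldg=2;floor=3;landing=A'; the actual encoding is not specified
    in the disclosure.
    """
    fields = {}
    for pair in payload.split(";"):
        key, _, value = pair.partition("=")
        fields[key.strip()] = value.strip()
    return fields

# Example: a payload decoded from the two-dimensional code symbol.
info = parse_landing_payload("bldg=2;floor=3;landing=A")
```

The two-dimensional code itself would be read by a camera and decoded by an off-the-shelf QR decoder; only the payload parsing is sketched here.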
  • The elevator 5 includes a car 13.
  • the car 13 is a device for transporting a user between a plurality of floors by traveling vertically along the hoistway 6.
  • The car 13 travels along the hoistway 6 by, for example, a driving force generated by a hoisting machine (not shown).
  • the car 13 includes a car door 14.
  • The car door 14 is a device that, when the car 13 is stopped at any floor, opens and closes in conjunction with the landing door 10 of that floor so that users can get on and off.
  • the car 13 includes a car device.
  • the car device is a device provided inside the car 13.
  • the car equipment includes a car operation panel 15 and a car display panel 16.
  • the car operation panel 15 is a device that accepts operations such as registration of a user's car call inside the car 13.
  • the car display panel 16 is a device that displays the status of the elevator 5.
  • the status of the elevator 5 includes information such as the position of the car 13.
  • the traffic management system 1 is equipped with an elevator management board 17.
  • the elevator management board 17 includes an operation management unit 18 and a first communication unit 19.
  • the operation management unit 18 is a part that manages the operation of the elevator 5.
  • The management of the operation of the elevator 5 by the operation management unit 18 includes, for example, registration of calls and running of the car 13 in response to the registered calls.
  • the first communication unit 19 is a part that communicates with an external device of the elevator management board 17.
  • the first communication unit 19 is connected to the elevator 5 so that, for example, a control signal from the operation management unit 18 can be output.
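The call registration and car dispatch described above can be sketched as follows. The single-car, first-come-first-served model is a deliberate simplification for illustration; real group-control logic is far more involved and is not specified in the disclosure.

```python
from collections import deque

class OperationManagementUnit:
    """Minimal sketch of elevator operation management: registered calls
    are served in first-come-first-served order by a single car
    (a simplifying assumption, not the disclosed control method)."""

    def __init__(self, initial_floor: int = 1):
        self.car_floor = initial_floor
        self.calls = deque()

    def register_call(self, floor: int) -> None:
        # Register a landing call or car call for the given floor.
        if floor not in self.calls:
            self.calls.append(floor)

    def step(self) -> int:
        # Run the car to the next registered call, if any,
        # and return the car's resulting floor.
        if self.calls:
            self.car_floor = self.calls.popleft()
        return self.car_floor
```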
  • Building 2 is used by one or more users. At least one of the users carries the mobile device 20.
  • the user carrying the mobile device 20 is an example of the first user.
  • The mobile device 20 is, for example, a portable device that displays images to a user on the move. In this example, the mobile device 20 is eyewear worn over the user's eyes.
  • The mobile device 20 includes a second communication unit 21, a display unit 22, a reading unit 23, a line-of-sight detection unit 24, a voice recognition unit 25, a field-of-view camera 26, and an image recognition unit 27.
  • the second communication unit 21 is a part that communicates with an external device of the mobile device 20.
  • the second communication unit 21 is equipped with a wireless communication function.
  • the second communication unit 21 is connected to a communication network such as a telephone line network or the Internet.
  • the second communication unit 21 may be equipped with a short-range wireless communication function such as RFID (Radio Frequency Identifier).
  • The second communication unit 21 may be equipped with a wireless communication function based on, for example, the IEEE 802.11 standard.
  • The display unit 22 is a part that displays images to the user on the move.
  • the image may include text.
  • the image may be either a still image or a moving image.
  • the reading unit 23 is a part that reads the user's authentication information.
  • Authentication information is information used for user authentication.
  • the authentication information is, for example, user-specific information.
  • The authentication information may be, for example, the user's biometric information.
  • The user's biometric information may be, for example, the user's retinal information or iris information.
  • the line-of-sight detection unit 24 is a part that detects the line of sight of a user carrying a mobile device 20.
  • the line-of-sight detection unit 24 detects the user's line of sight with, for example, a visible light camera, an infrared light source, an infrared camera, or the like.
  • the voice recognition unit 25 is a part that performs voice recognition of the voice of the user who carries the mobile device 20.
  • the voice recognition unit 25 has, for example, a microphone that collects the user's voice.
  • The field-of-view camera 26 is a camera that captures the area in front of the user wearing the mobile device 20, which is eyewear.
  • The field-of-view camera 26 is directed toward the front of the user.
  • The field-of-view camera 26 captures, for example, the user's field of view.
  • The image recognition unit 27 is a part that recognizes the images captured by the field-of-view camera 26.
  • the traffic management system 1 includes a plurality of fixed cameras 28 and a building management board 29.
  • Each fixed camera 28 is a camera fixedly installed facing an interior area of the building 2.
  • Each fixed camera 28 captures the area it faces inside the building 2.
  • one or more of the plurality of fixed cameras 28 are provided on each floor.
  • Each fixed camera 28 outputs the captured image to the building management board 29.
  • The building management board 29 includes a third communication unit 30, a congestion information acquisition unit 31, a position information acquisition unit 32, a route information generation unit 33, a guidance information generation unit 34, an authentication unit 35, and a related information storage unit 36.
  • the third communication unit 30 is a part that communicates with an external device of the building management board 29.
  • the external equipment of the building management board 29 includes the elevator management board 17.
  • the third communication unit 30 communicates with the second communication unit 21 of the mobile device 20 through a communication network such as a telephone line network or the Internet.
  • the congestion information acquisition unit 31 is a part that acquires congestion information in the building 2 for each floor.
  • The congestion information is information representing the state of congestion caused by one or more users of the building 2.
  • The congestion information includes information such as the distribution of the degree of congestion in the building 2. It may further include information such as the utilization rate of the floor-to-floor transportation means in the building 2.
  • The degree of congestion is an index of congestion such as the density of users in an area of the building 2. When there is only one user in the building 2, the congestion information may simply indicate that there are no other users.
  • The congestion information may include information such as the position or movement speed of each user.
  • The congestion information may include predicted values of future congestion. The predicted values are computed by the congestion information acquisition unit 31 or the like based on information such as the position and movement speed of each user.
  • The congestion information acquisition unit 31 acquires congestion information by, for example, image recognition on the images captured by the fixed cameras 28.
  • For example, the congestion information acquisition unit 31 detects, by motion detection or the like, users passing through the area captured by each fixed camera 28.
  • The congestion information acquisition unit 31 then acquires congestion information based on, for example, the density of the detected users.
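The degree of congestion as a user density, and a predicted future value derived from positions and movement speeds, could be computed along these lines. The rectangular-zone representation and the linear extrapolation are assumptions for this sketch; the disclosure does not fix a particular prediction method.

```python
def congestion_degree(positions, zone, area_m2):
    """Density (users per square metre) of detected users inside a zone.

    `positions` are (x, y) user coordinates on the floor, e.g. obtained by
    motion detection on the fixed-camera images; `zone` is an axis-aligned
    rectangle (xmin, ymin, xmax, ymax). Both representations are assumed.
    """
    xmin, ymin, xmax, ymax = zone
    count = sum(1 for x, y in positions
                if xmin <= x <= xmax and ymin <= y <= ymax)
    return count / area_m2

def predicted_positions(positions, velocities, horizon_s):
    """Linearly extrapolate each user's position `horizon_s` seconds ahead;
    a predicted congestion degree can then be computed the same way."""
    return [(x + vx * horizon_s, y + vy * horizon_s)
            for (x, y), (vx, vy) in zip(positions, velocities)]
```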
  • the congestion information acquisition unit 31 acquires the utilization rate of the escalator 4 based on, for example, the load information of the escalator 4.
  • the congestion information acquisition unit 31 may acquire the utilization rate of the escalator 4 based on the image of the fixed camera 28 that captures the escalator 4.
  • the congestion information acquisition unit 31 acquires the utilization rate of the elevator 5 based on, for example, information on the operation status of the elevator 5 acquired from the operation management unit 18.
  • the congestion information acquisition unit 31 may acquire the utilization rate of the elevator 5 based on the image of the fixed camera 28 that captures the landing 7 of the elevator 5.
  • The position information acquisition unit 32 is a part that acquires the position information, within the building 2, of the user carrying the mobile device 20.
  • the user's location information includes information on the floor on which the user is currently located.
  • the user's position information includes information on the position on the floor surface on the floor where the user is currently located.
  • the user's location information may include information on the orientation of the user.
  • the user's location information is detected, for example, by the mobile device 20 carried by the user.
  • the mobile device 20 detects the user's position information based on, for example, a satellite navigation system, an autonomous navigation system, an indoor positioning system applied to a building 2, or a combination of these systems.
  • The mobile device 20 detects the orientation of the user, for example, by image recognition on images of the inside of the building 2 captured by the field-of-view camera 26.
  • Alternatively, the mobile device 20 may have a sensor such as a gyro sensor and detect the orientation of the user based on the sensor's detection signal.
  • The user's position information detected by the mobile device 20 is transmitted by the second communication unit 21 to the third communication unit 30.
  • The position information acquisition unit 32 acquires the position information transmitted from the mobile device 20.
  • Alternatively, the user's position information may be acquired by image recognition on the images captured by the fixed cameras 28.
  • In that case, the position information acquisition unit 32 tracks the user carrying the mobile device 20 through the areas captured by the fixed cameras 28 by motion detection and tracking. The position information acquisition unit 32 may identify the user by, for example, face recognition or gait recognition based on the user's walking pattern.
  • The position information acquisition unit 32 acquires the user's position information by tracking the user across the images captured by the fixed cameras 28.
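A minimal stand-in for the tracking step, assuming user positions have already been extracted from each frame by motion detection; greedy nearest-neighbour association and the `max_jump` gating threshold are illustrative choices, not something the disclosure prescribes.

```python
import math

def track_step(prev_tracks, detections, max_jump=1.5):
    """Associate new detections with existing tracks by nearest neighbour.

    `prev_tracks` maps track id -> last (x, y) position; `detections` is a
    list of (x, y) positions from the current frame. Greedy matching within
    `max_jump` metres: a crude stand-in for the motion detection and
    tracking described above.
    """
    updated = {}
    free = list(detections)
    for tid, (px, py) in prev_tracks.items():
        if not free:
            break
        best = min(free, key=lambda d: math.hypot(d[0] - px, d[1] - py))
        if math.hypot(best[0] - px, best[1] - py) <= max_jump:
            updated[tid] = best
            free.remove(best)
    return updated
```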
  • the route information generation unit 33 is a part that generates route information that guides the user.
  • The route information includes information on the route from the user's current position to the user's destination. When the user moves from the current floor to a destination on another floor, the route information includes information on the use of one of the floor-to-floor transportation means.
  • The route information generation unit 33 generates route information based on the user's position information, the user's purpose information, the congestion information on the floor where the user is currently located, and the congestion information on the floor including the user's destination.
  • the user's purpose information is information representing the user's purpose.
  • the purpose information includes, for example, information on the user's destination.
  • the guidance information generation unit 34 is a part that generates guidance information in which the mobile device 20 guides the user according to the route information generated by the route information generation unit 33.
  • the guidance information is information that can be recognized by the user, such as images, characters, or voice.
  • the guidance information generation unit 34 generates an image to be displayed to the user by the display unit 22 of the mobile device 20 as guidance information.
  • the guidance information generated by the guidance information generation unit 34 is transmitted by the third communication unit 30 to the second communication unit 21 of the mobile device 20.
  • The authentication unit 35 is a part that authenticates the user carrying the mobile device 20 based on the authentication information.
  • The authentication unit 35 performs authentication, for example, by collating the received authentication information with pre-registered authentication information of users who can be authenticated.
  • The authentication information is transmitted to the third communication unit 30 by the second communication unit 21 of the mobile device 20.
  • the related information storage unit 36 is a part that stores related information related to the user who carries the mobile device 20.
  • the related information storage unit 36 stores the related information in association with the user's authentication information.
  • The related information storage unit 36 stores the related information so that it can be read out based on the user's authentication information.
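The storage and read-out keyed by authentication information might be sketched as below. Hashing the raw biometric bytes before using them as a dictionary key is an added assumption (a common precaution), not something the disclosure specifies.

```python
import hashlib

class RelatedInfoStore:
    """Sketch of the related information storage unit: related information
    is stored and retrieved keyed by the user's authentication information
    (here, raw biometric bytes hashed with SHA-256)."""

    def __init__(self):
        self._records = {}

    @staticmethod
    def _key(auth_info: bytes) -> str:
        # Derive a stable lookup key from the authentication information.
        return hashlib.sha256(auth_info).hexdigest()

    def store(self, auth_info: bytes, related_info: dict) -> None:
        self._records[self._key(auth_info)] = related_info

    def read(self, auth_info: bytes):
        # Returns the related information, or None if not registered.
        return self._records.get(self._key(auth_info))
```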
  • FIG. 2 is a diagram showing an example of the function of the mobile device 20 according to the first embodiment.
  • FIG. 2 shows a schematic view of the mobile device 20 as viewed from the side of the user's head.
  • The mobile device 20 is, for example, eyewear such as smart glasses or an HMD (Head Mounted Display).
  • In that case, the display that displays images is an example of the display unit 22.
  • The mobile device 20 may be retinal-projection laser eyewear that displays an image by scanning a laser over the retina of the wearing user.
  • In that case, the optical system, including the laser light source, that scans the user's retina is an example of the display unit 22.
  • The reading unit 23 is arranged, for example, in front of the eyes of the wearing user, and reads the biometric information from in front of the user's eyes. Alternatively, the reading unit 23 may read the user's biometric information through a mirror (not shown) arranged in front of the user's eyes. In this example, the reading unit 23 reads the user's retinal information.
  • the line-of-sight detection unit 24 is arranged, for example, in front of the eyes of the wearing user. At this time, the line-of-sight detection unit 24 detects the user's line of sight based on the direction of the user's eyeball detected from the front of the user's eyes. Alternatively, the line-of-sight detection unit 24 may detect the user's line of sight through a mirror (not shown) arranged in front of the user's eyes.
  • The field-of-view camera 26 is directed toward the front of the wearing user.
  • The field-of-view camera 26 is arranged, for example, between the user's eyes.
  • Alternatively, the field-of-view camera 26 may be arranged outside both of the user's eyes.
  • FIG. 3 is a diagram showing an example of related information stored in the related information storage unit 36 according to the first embodiment.
  • The related information storage unit 36 stores the related information in association with the user's biometric information, which serves as the authentication information.
  • The related information includes information on the user's attributes.
  • The information on the user's attributes includes, for example, information on the user's gender and the user's age or age group.
  • Related information includes the language used by the user.
  • the language used by the user is, for example, a language that the user can understand.
  • the related information includes information on the floor that can be used by the user among the plurality of floors.
  • The related information includes information on the functions of the user's body.
  • the information on the function of the body includes, for example, the information on the motor function of the user or the information on the sensory function of the user.
  • the information on the function of the body includes, for example, information such as the presence or absence of a disability of the user.
  • Information on physical function includes information such as whether or not an auxiliary device is used.
  • the information on the function of the body may include information such as a temporary change in the function of the body due to an injury or the like.
  • the related information storage unit 36 may store information related to the user's thoughts and actions as related information.
  • the related information may include information such as a user's hobbies or preferences.
  • The related information may include the user's health information.
  • the health information includes, for example, information on the physical constitution of the user or information such as a history of the amount of activity of the user.
  • the related information may include the user's affiliation information.
  • the affiliation information includes, for example, information on the organization to which the user belongs, information on the project in which the user participates, and the like.
  • the related information may include information on the user's schedule.
  • the related information may include information on the place of work of the user.
  • the related information may include information such as an in-house newsletter of the organization to which the user belongs.
  • the related information storage unit 36 may acquire information related to the user in cooperation with an external system of the traffic management system 1.
  • the related information storage unit 36 has, for example, gender "male”, age “20 years old”, language “English”, available floor “1st floor, 2nd floor, 5th floor” as related information associated with the first biometric information. , And the 7th floor ”, and information including the physical function“ healthy person ”, etc. are memorized.
  • the related information storage unit 36 has, for example, gender "female”, age “30 years old”, language “Japanese”, available floors “2nd floor, 3rd floor, and” as related information associated with the second biometric information. Memorize information including “4th floor” and physical function "use of wheelchair”.
  • the related information storage unit 36 has, for example, gender "female”, age “40 years old”, language “Chinese”, available floors “5th and 9th floors” as related information associated with the third biometric information. , And information including physical function "low vision” and so on.
  • FIG. 4 is a diagram showing an example of route information generated by the route information generation unit 33 according to the first embodiment.
  • FIGS. 5 and 7 to 9 are diagrams showing examples of the guidance information generated by the guidance information generation unit 34 according to the first embodiment.
  • FIG. 6 is a diagram showing an example of an image captured by the field-of-view camera 26 according to the first embodiment.
  • The upper part of FIG. 4 shows an example of a plan view of the floor on which the user is currently located. The lower part of FIG. 4 shows an example of a plan view of the floor including the user's destination.
  • In this example, the user moves inside the building 2 to a destination inside an area where entry and exit are controlled.
  • The area is, for example, a room that is locked and controlled by an electric lock 37.
  • the congestion information acquisition unit 31 of the building management board 29 acquires congestion information for each floor based on information such as images taken by each fixed camera 28.
  • The user carrying the mobile device 20 wears it as eyewear.
  • the reading unit 23 reads the biometric information of the user wearing the mobile device 20.
  • the second communication unit 21 transmits the biometric information of the user to the building management board 29.
  • the authentication unit 35 of the building management board 29 authenticates the user carrying the mobile device 20.
  • the position information acquisition unit 32 acquires the user's position information.
  • The route information generation unit 33 and the guidance information generation unit 34 read the user's related information stored in the related information storage unit 36 based on the biometric information. Reading of related information that has already been read may be omitted.
  • the route information generation unit 33 acquires the purpose information of the user.
  • The purpose information is input by the user, for example, through the mobile device 20.
  • The input of the purpose information is performed, for example, through the voice recognition unit 25 of the mobile device 20.
  • Alternatively, the input of the purpose information may be performed through the line-of-sight detection unit 24 of the mobile device 20.
  • Input through the line-of-sight detection unit 24 is performed, for example, by keeping the line of sight directed, for longer than a preset time, at the area where a candidate of the purpose information is displayed by the display unit 22.
  • The purpose information may also be acquired based on the user's related information.
  • The destination given as the purpose information is, for example, the user's work place.
  • Alternatively, the destination given as the purpose information may be, for example, the location of a conference room registered in the user's schedule.
  • Acquisition of purpose information that has already been acquired may be omitted.
  • Already acquired purpose information may be updated by a new input of purpose information or the like.
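The dwell-based line-of-sight input described above, where a candidate is selected once the gaze stays on its displayed area for longer than a preset time, can be sketched as follows; the sample-based gaze representation and the rectangular display regions are assumptions for illustration.

```python
def dwell_selection(gaze_samples, regions, dwell_s=1.0, sample_dt=0.1):
    """Select the candidate whose on-screen region the gaze stays inside
    for at least `dwell_s` seconds.

    `gaze_samples` is a time-ordered list of (x, y) gaze points taken at
    `sample_dt` intervals; `regions` maps a candidate name to a rectangle
    (xmin, ymin, xmax, ymax). A hypothetical data layout.
    """
    needed = int(round(dwell_s / sample_dt))
    streaks = {name: 0 for name in regions}
    for gx, gy in gaze_samples:
        for name, (xmin, ymin, xmax, ymax) in regions.items():
            if xmin <= gx <= xmax and ymin <= gy <= ymax:
                streaks[name] += 1
                if streaks[name] >= needed:
                    return name  # dwell threshold reached: candidate selected
            else:
                streaks[name] = 0  # gaze left the region: reset the streak
    return None
```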
  • the route information generation unit 33 generates route information based on the user's location information, the user's purpose information, the congestion information for the floor on which the user is currently located, the congestion information for the floor including the user's destination, and the user's related information.
  • the route information generation unit 33 sets the user's destination based on the purpose information. When the purpose information includes the destination itself, the route information generation unit 33 sets it as the user's destination. When the user's destination is on the floor on which the user is currently located, the route information generation unit 33 generates route information based on the user's location information, the user's purpose information, the congestion information for that floor, and the user's related information.
  • the route information generation unit 33 generates route information so as to avoid, for example, a place currently congested or a place where congestion is expected.
  • when the user's destination is on a different floor, the route information generation unit 33 selects one of the floor-to-floor moving means and generates route information using the selected means.
  • the route information generation unit 33 may, for example, select the floor-to-floor moving means so as to avoid congested means.
  • the route information generation unit 33 may generate route information using the escalator 4 or the stairs 3.
  • when the related information includes the physical function "wheelchair use" or the like, the route information generation unit 33 generates, for example, route information passing through places with few steps. At this time, the route information generation unit 33 may generate route information using the elevator 5.
  • the route information generation unit 33 generates route information using the elevator 5.
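As a rough illustration, the selection of a floor-to-floor moving means from related information and congestion might look like the following sketch. The field names, the 0.0 to 1.0 congestion scale, and the candidate list are assumptions made for the example, not details from the patent:

```python
def select_means(related_info, congestion):
    """Pick a floor-to-floor moving means for one user.

    related_info: dict with an assumed "physical_function" list.
    congestion:   dict mapping each means name to a load in [0.0, 1.0].
    """
    if "wheelchair use" in related_info.get("physical_function", []):
        # route must avoid steps regardless of congestion
        return "elevator"
    # otherwise prefer the least congested of the available means
    candidates = ["elevator", "escalator", "stairs"]
    return min(candidates, key=lambda m: congestion.get(m, 0.0))
```

The real unit 33 would feed the chosen means into full route generation; this sketch isolates only the selection rule.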
  • FIG. 5 shows an example of guidance information displayed by the mobile device 20.
  • the guidance information generation unit 34 generates guidance information based on the route information generated by the route information generation unit 33.
  • the guidance information generation unit 34 generates, for example, an image that guides the user to the route of the route information.
  • the guidance information generation unit 34 uses the user's position information to generate an image of an arrow along the route of the route information.
  • the third communication unit 30 transmits the generated guidance information to the mobile device 20.
  • the display unit 22 of the mobile device 20 displays the guidance information received by the second communication unit 21.
  • the display unit 22 displays guidance information as an image (AR: Augmented Reality) displayed overlaid on the image of the user's actual visual field.
  • the display unit 22 displays guidance information as an AR image displayed superimposed on the image captured by the visual field camera 26.
  • the display unit 22 may display the guidance information as a virtually generated VR image (VR: Virtual Reality) of the user's visual field. At this time, the situation around the user recognized by the visual field camera 26 and the image recognition unit 27 may be reflected in the VR image.
  • the user wearing the mobile device 20 moves through the building 2 according to the display of the display unit 22. After that, the user arrives at the landing 7 of the elevator 5.
  • FIG. 6 shows an example of an image of the landing 7 of the elevator 5 taken by the field camera 26.
  • the image recognition unit 27 of the portable device 20 performs image recognition of the image taken by the visual field camera 26.
  • the image recognition unit 27 recognizes one of the landing devices by image recognition.
  • the image recognition unit 27 recognizes, for example, a landing operation panel 8, a landing display panel 9, a landing door 10, or a three-way frame 11 as landing equipment.
  • the second communication unit 21 transmits a recognition signal indicating that the landing device has been recognized to the elevator management board 17.
  • when the image recognition unit 27 recognizes the coded image 12 at the landing, the second communication unit 21 may transmit a recognition signal indicating that the coded image 12 has been recognized to the elevator management board 17.
  • the recognition signal is transmitted to the elevator management board 17 via the building management board 29.
  • the operation management unit 18 of the elevator management board 17 detects that the user has arrived at the landing 7 of the elevator 5 when the first communication unit 19 receives the recognition signal.
  • the operation management unit 18 registers the call from the departure floor to the destination floor of the user.
  • the departure floor of the user is the floor including the landing 7 where the user arrived.
  • the user's destination floor is, for example, a floor including the user's destination, or a transit floor on the route to the user's destination.
  • the departure floor and the destination floor are transmitted from the building management board 29 to the elevator management board 17 together with, for example, a recognition signal.
  • the operation management unit 18 drives the car 13 so as to answer the registered call.
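A minimal sketch of the call-registration step described above, assuming the recognition signal is forwarded together with the departure and destination floors as stated; the data shape and class name are illustrative:

```python
class OperationManager:
    """Toy stand-in for the operation management unit 18."""

    def __init__(self):
        self.calls = []  # registered (departure, destination) calls

    def on_recognition_signal(self, signal):
        """Register a call when a recognition signal arrives.

        signal: dict carrying the departure and destination floors, which
        the building management board 29 is said to forward along with
        the recognition signal."""
        call = (signal["departure_floor"], signal["destination_floor"])
        self.calls.append(call)  # the car 13 is then driven to answer it
        return call
```

In the described system, receiving this signal is what stands in for the user pressing a landing button: arrival at the landing 7, detected by image recognition, triggers the call.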
  • the destination floor of the user may be selected by the user.
  • the display unit 22 of the mobile device 20 displays the floor that can be used by the user registered in the related information of the user as a candidate for the destination floor.
  • the image displayed here is generated by, for example, the guidance information generation unit 34 or the like.
  • the user selects a destination floor from the candidates displayed through the voice recognition unit 25, the line-of-sight detection unit 24, or the like.
  • the display unit 22 of the mobile device 20 may display the user's related information and the like.
  • the display unit 22 displays, for example, information on the user's schedule, information on the user's in-house newsletter, and the like.
  • the display unit 22 may display the content information.
  • the content information may be, for example, news, a weather forecast, an advertisement, or management information of the building 2.
  • the image of the information displayed here is generated by, for example, the guidance information generation unit 34 or the like.
  • the display unit 22 of the mobile device 20 displays an image prompting the user to board the car 13.
  • the image displayed here is generated as guidance information by, for example, the guidance information generation unit 34.
  • the display unit 22 of the mobile device 20 may display the same content as when waiting at the landing 7.
  • FIG. 7 shows an example of guidance information displayed by the mobile device 20.
  • when the car 13 is closer to the user's destination floor than a preset proximity, the guidance information generation unit 34 generates an image indicating arrival at the destination floor as guidance information.
  • the guidance information generation unit 34 generates the image when, for example, the distance from the car 13 to the user's destination floor is shorter than the preset distance.
  • the guidance information generation unit 34 generates an image including an icon image showing the operation of opening the car door 14, a message notifying the arrival at the destination floor, and a message showing the floor of the destination floor.
  • the message is generated based on the language registered as the user's related information.
  • the third communication unit 30 transmits the generated guidance information to the mobile device 20.
  • FIG. 8 shows an example of guidance information displayed by the mobile device 20 when a language different from that in FIG. 7 is registered in the related information.
  • the guidance information generation unit 34 generates guidance information including a common icon image regardless of the language of the related information.
  • the guidance information generation unit 34 may generate guidance information including an icon image corresponding to the user's cultural area when the related information includes information representing the user's cultural area.
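The language-dependent message combined with a language-independent icon might be assembled as in this sketch. The message catalog, the `language` field, and the icon name are assumptions made for illustration:

```python
# illustrative message catalog keyed by the language registered in the
# user's related information
ARRIVAL_MESSAGES = {
    "en": "Arriving at floor {floor}",
    "ja": "{floor}階に到着します",
}

def arrival_guidance(related_info, floor):
    """Build arrival guidance: localized message, common icon."""
    lang = related_info.get("language", "en")
    template = ARRIVAL_MESSAGES.get(lang, ARRIVAL_MESSAGES["en"])
    # the icon is shared across languages, as the text above describes
    return {"icon": "door_open", "message": template.format(floor=floor)}
```

Extending this to culture-specific icons, as the text suggests, would only require keying the icon choice on a cultural-area field in the related information.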
  • the user moves to the destination according to the display displayed on the mobile device 20.
  • the authentication unit 35 of the building management board 29 notifies the electric lock 37 that the user can be authenticated based on the biometric information.
  • the electric lock 37 unlocks the door. As a result, the user can smoothly pass through the inside of the building 2 to the destination.
  • FIG. 9 shows an example of guidance information displayed by the mobile device 20 when an emergency event occurs.
  • the traffic management system 1 guides the user to evacuate when an emergency event occurs.
  • the emergency event is an event requiring evacuation in the building 2 such as an earthquake or a fire.
  • when an emergency event occurs, the route information generation unit 33 generates route information including an evacuation route and an evacuation timing.
  • the route information generation unit 33 sets the user's destination as an evacuation area.
  • the evacuation area is, for example, a place preset in the building 2 in response to an emergency event.
  • the route information generation unit 33 may select the evacuation area based on, for example, the user's location information, the congestion information for each floor, the location where the emergency event occurred, and the user's related information.
  • the route information generation unit 33 generates route information including an evacuation route with the evacuation area as the user's destination. At this time, the evacuation route is generated based on the user's related information. For example, when the related information includes the physical function "wheelchair use", the route information generation unit 33 generates an evacuation route passing through places with few steps.
  • the route information generation unit 33 sets an evacuation timing for each user so that, for example, movement does not become difficult due to congestion.
  • the evacuation timing is set based on, for example, the position of the user, congestion information on each floor, and the like.
  • the evacuation timing may be set based on the user's related information. For example, when the user's related information includes the physical function "visual impairment" and the route information generation unit 33 obtains information that a caregiver is heading to the user, the user's evacuation timing may be set to after the caregiver arrives.
  • the guidance information generation unit 34 generates guidance information based on the route information generated by the route information generation unit 33. For example, before the user's evacuation timing, the guidance information generation unit 34 generates an image including a message requesting to wait until the evacuation timing as guidance information. After the user's evacuation timing, the guidance information generation unit 34 generates an image of an arrow along the evacuation route using the user's position information.
  • the third communication unit 30 transmits the generated guidance information to the mobile device 20.
  • the display unit 22 of the mobile device 20 displays the guidance information received by the second communication unit 21.
  • the display unit 22 of the mobile device 20 may display information on an emergency event that has occurred, information on the response status to the emergency event, and the like.
  • the image of the information displayed here is generated by, for example, the guidance information generation unit 34 or the like.
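The per-user evacuation timing described above might be decided as in the following sketch. The congestion threshold, the field names, and the returned labels are assumptions made for illustration, not details from the patent:

```python
CONGESTION_WAIT_THRESHOLD = 0.7  # assumed cutoff for staggering departures

def evacuation_timing(user, floor_congestion, caregiver_en_route):
    """Decide one user's evacuation timing.

    user:             dict with assumed "physical_function" and "floor" keys.
    floor_congestion: dict mapping floor number to a load in [0.0, 1.0].
    caregiver_en_route: True if a caregiver is heading to this user.
    """
    if "visual impairment" in user.get("physical_function", []) and caregiver_en_route:
        # wait for assistance, as the example in the text describes
        return "after_caregiver_arrives"
    # stagger departures: congested floors wait, clear floors go at once
    if floor_congestion.get(user["floor"], 0.0) > CONGESTION_WAIT_THRESHOLD:
        return "wait"
    return "evacuate_now"
```

Before the timing arrives, the guidance information generation unit 34 would show a "please wait" message; after it, the evacuation-route arrows.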
  • FIGS. 10 and 11 are flowcharts showing an example of the operation of the traffic management system 1 according to the first embodiment.
  • FIG. 10 shows an example of the operation of the traffic management system 1 in a normal time.
  • step S1 the congestion information acquisition unit 31 of the building management board 29 acquires congestion information for each floor. After that, the operation of the traffic management system 1 proceeds to step S2.
  • step S2 the authentication unit 35 of the building management board 29 determines whether the authentication information that can be authenticated is transmitted from the mobile device 20.
  • when the third communication unit 30 has not received authentication information from the mobile device 20, the authentication unit 35 sets the determination result to No. When the authentication information received by the third communication unit 30 from the mobile device 20 can be authenticated, the authentication unit 35 sets the determination result to Yes.
  • the determination result is Yes, the operation of the traffic management system 1 proceeds to step S3.
  • the determination result is No, the operation of the traffic management system 1 ends.
  • step S3 the route information generation unit 33 and the guidance information generation unit 34 of the building management board 29 read the related information from the related information storage unit 36 based on the authentication information.
  • the route information generation unit 33 acquires the target information of the user. After that, the operation of the traffic management system 1 proceeds to step S4. If the related information has already been read, the operation of step S3 may be omitted.
  • step S4 the location information acquisition unit 32 of the building management board 29 acquires the location information of the user. After that, the operation of the traffic management system 1 proceeds to step S5.
  • step S5 the building management board 29 determines whether the user has arrived at the destination based on the user's location information or the like. When the determination result is No, the operation of the traffic management system 1 proceeds to step S6. When the determination result is Yes, the operation of the traffic management system 1 ends.
  • step S6 the operation management unit 18 of the elevator management board 17 determines whether the user using the elevator 5 has arrived at the landing 7 based on the recognition signal transmitted from the mobile device 20 or the like.
  • when the determination result is Yes, the operation of the traffic management system 1 proceeds to step S7. When the determination result is No, the operation of the traffic management system 1 proceeds to step S8.
  • step S7 the traffic management system 1 processes the movement using the elevator 5. After that, the operation of the traffic management system 1 proceeds to step S8.
  • step S8 the route information generation unit 33 of the building management board 29 generates route information for guiding the user based on the user's location information, purpose information, related information, congestion information, and the like. After that, the operation of the traffic management system 1 proceeds to step S9.
  • step S9 the guidance information generation unit 34 of the building management board 29 generates guidance information based on the user's related information, route information, and the like.
  • the third communication unit 30 transmits the generated guidance information to the mobile device 20 so that the display unit 22 can display it. After that, the operation of the traffic management system 1 proceeds to step S1.
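One pass of the normal-time flow of FIG. 10 can be sketched as a single function, with each functional unit stubbed as a callable. Every name here is illustrative; the real processing is split across the boards as the text describes:

```python
def traffic_management_pass(units, user):
    """Run steps S1-S9 once for one user; returns "end" or "continue"."""
    congestion = units["acquire_congestion"]()                   # S1
    if not units["authenticate"](user):                          # S2: No ends
        return "end"
    related = units["read_related_info"](user)                   # S3
    purpose = units["acquire_purpose"](user)
    position = units["acquire_position"](user)                   # S4
    if units["arrived_at_destination"](user, position):          # S5: Yes ends
        return "end"
    if units["at_elevator_landing"](user, position):             # S6
        units["process_elevator_movement"](user)                 # S7 (FIG. 11)
    route = units["generate_route"](position, purpose, related, congestion)  # S8
    guidance = units["generate_guidance"](route, related)        # S9
    units["send_to_mobile_device"](user, guidance)
    return "continue"                                            # back to S1
```

Passing the units in as a dict of callables keeps the sketch testable with stubs; the flowchart's loop would simply call this function repeatedly until it returns "end".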
  • FIG. 11 shows an example of the operation of the traffic management system 1 related to the processing of movement using the elevator 5.
  • step S71 the operation management unit 18 of the elevator management board 17 registers the user's call. After that, the operation of the traffic management system 1 proceeds to step S72.
  • step S72 the operation management unit 18 of the elevator management board 17 determines whether the car 13 has arrived at the destination floor of the user. When the determination result is No, the operation of the traffic management system 1 proceeds to step S73. When the determination result is Yes, the operation of the traffic management system 1 proceeds to step S74.
  • step S73 the guidance information generation unit 34 of the building management board 29 generates an image to be displayed to the user waiting at the landing 7.
  • the third communication unit 30 transmits the generated image to the mobile device 20 so that the display unit 22 can display it.
  • step S74 the guidance information generation unit 34 of the building management board 29 generates guidance information for urging the user waiting at the landing 7 to board the car 13.
  • the third communication unit 30 transmits the generated guidance information to the mobile device 20 so that the display unit 22 can display it. After that, the operation of the traffic management system 1 proceeds to step S75.
  • step S75 the operation management unit 18 of the elevator management board 17 determines whether the car 13 is closer than the preset proximity to the user's destination floor. When the determination result is No, the operation of the traffic management system 1 proceeds to step S76. When the determination result is Yes, the operation of the traffic management system 1 proceeds to step S77.
  • step S76 the guidance information generation unit 34 of the building management board 29 generates an image to be displayed to the user in the car 13.
  • the third communication unit 30 transmits the generated image to the mobile device 20 so that the display unit 22 can display it.
  • step S77 the guidance information generation unit 34 of the building management board 29 generates guidance information for notifying the user in the car 13 of the arrival at the destination floor.
  • the third communication unit 30 transmits the generated guidance information to the mobile device 20 so that the display unit 22 can display it.
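The display selection in FIG. 11 (which image steps S73, S76, and S77 produce) can be compressed into a small sketch. The state flags and returned image names are assumptions, and the boarding prompt of step S74 is omitted for brevity:

```python
def elevator_guidance_step(car_at_user_landing, car_near_destination):
    """Choose which image the guidance information generation unit 34
    would produce for the current state of the car 13."""
    if not car_at_user_landing:
        # S72 No -> S73: image for the user waiting at the landing 7
        return "waiting_image"
    if not car_near_destination:
        # S75 No -> S76: image for the user riding in the car 13
        return "in_car_image"
    # S75 Yes -> S77: notify arrival at the destination floor
    return "arrival_notice"
```

Called on each poll of the car state, this reproduces the branch structure of the flowchart while leaving call registration (S71) to the operation management unit 18.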
  • the building 2 to which the traffic management system 1 is applied is not limited to the office building.
  • the building 2 may be, for example, an apartment house, a commercial facility, an accommodation facility, a public facility, or the like.
  • the combination, replacement, omission, etc. of the configuration may be performed according to the type or scale of the applicable building 2 or the characteristics of the user, to the extent that the purpose of the present disclosure is not deviated.
  • the mobile device 20 may be a general-purpose information terminal such as a smartphone.
  • the reading unit 23 may read information such as a user's password as authentication information. Further, the reading unit 23 may read the user's face image information, voice information, fingerprint information, or the like as biological information. When the mobile device 20 has a portion to be attached to the user's ear, the reading unit 23 may read ear acoustic information or the like as the user's biological information.
  • the image recognition unit 27 may perform image recognition of an image taken by a camera provided in the portable device 20.
  • the mobile device 20 may include a reproduction unit that reproduces guidance information generated as voice.
  • the congestion information acquisition unit 31 may acquire congestion information by an infrared sensor, an infrared camera, or the like provided in the building 2.
  • the congestion information acquisition unit 31 may acquire congestion information based on, for example, the signal strength of the radio signal transmitted from the second communication unit 21 of the mobile device 20 carried by each user. The signal strength is acquired by, for example, a plurality of receivers provided in the building 2.
  • the congestion information acquisition unit 31 may acquire congestion information based on the location information of each user.
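Congestion estimation from receiver signal strengths, as suggested above, might be sketched as follows. The RSSI threshold, the reading format, and the per-floor head count as the congestion figure are all assumptions for illustration:

```python
RSSI_THRESHOLD_DBM = -70  # assumed cutoff: a device heard this strongly
                          # is counted as present on the receiver's floor

def congestion_per_floor(readings):
    """Count distinct devices per floor from receiver readings.

    readings: list of (floor, device_id, rssi_dbm) tuples reported by
    the receivers provided in the building 2."""
    devices_on_floor = {}
    for floor, device_id, rssi in readings:
        if rssi >= RSSI_THRESHOLD_DBM:
            # a set deduplicates repeated reports of the same device
            devices_on_floor.setdefault(floor, set()).add(device_id)
    return {floor: len(devs) for floor, devs in devices_on_floor.items()}
```

A real deployment would calibrate the threshold per receiver and smooth over time, but the per-floor tallies are what the route information generation unit 33 would consume.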
  • the route information generation unit 33 may have already generated route information for other users.
  • such route information is an example of another person's route information.
  • the route information generation unit 33 may generate the user's route information based on another person's route information. For example, when another person's route information for a wheelchair user uses the elevator 5, the route information generation unit 33 may generate route information for an able-bodied user that uses the escalator 4 or the stairs 3.
  • the route information generation unit 33 may generate route information for each user so as to control traffic and avoid congestion throughout the building 2.
  • the detection that the user has arrived at the landing 7 of the elevator 5 may be performed by the position information acquisition unit 32 of the building management board 29 or the like based on the position information of the user.
  • the arrival of the user at the landing 7 may be detected by receiving the radio signal transmitted from the second communication unit 21 of the mobile device 20 by the receiver provided at the landing 7.
  • the building management board 29 and the elevator management board 17 may be implemented on the same hardware.
  • the guidance information generation unit 34 may be provided in the mobile device 20.
  • in this case, the third communication unit 30 of the building management board 29 transmits a part or all of the route information generated by the route information generation unit 33 to the mobile device 20.
  • the user-related information may be stored in the storage unit of the mobile device 20 carried by the user.
  • the traffic management system 1 includes a mobile device 20, a congestion information acquisition unit 31, a position information acquisition unit 32, a route information generation unit 33, and a guidance information generation unit 34.
  • the mobile device 20 is carried by the first user among one or more users of the building 2.
  • Building 2 has a plurality of floors.
  • the congestion information acquisition unit 31 acquires congestion information by one or more users in the building 2 for each of the plurality of floors.
  • the position information acquisition unit 32 acquires the position information in the building 2 of the first user.
  • the route information generation unit 33 generates route information for guiding the first user based on the first user's location information, the first user's purpose information, and the congestion information about the floor on which the first user is located.
  • the guidance information generation unit 34 generates guidance information in which the mobile device 20 guides the first user according to the route information.
  • guidance information that guides the user is displayed on the mobile device 20 carried by the user. Therefore, the user can check the guidance information at any place inside the building 2. As a result, the traffic management system 1 can smoothly guide a user moving inside the building 2. In addition, conditions such as congestion may differ from floor to floor inside the building 2. Since the route information generation unit 33 generates the user's route information based on congestion information acquired for each floor, guidance of the user inside the building 2 becomes more reliable.
  • the guidance information generation unit 34 generates, as guidance information, an image that the mobile device 20 displays to guide the first user.
  • the mobile device 20 is eyewear worn at the position of the eyes of the first user.
  • since the guidance information is displayed as an image, the user can intuitively understand it.
  • since the mobile device 20 is eyewear, the user can check the guidance information without using his or her hands. As a result, the user can confirm the guidance information while moving, and the guidance of the user inside the building 2 becomes smoother.
  • the traffic management system 1 can display guidance information as visual information to a user of low vision or the like. At this time, the range of users targeted by the traffic management system 1 becomes wider.
  • the portable device 20 has a voice recognition unit 25.
  • the voice recognition unit 25 recognizes the voice of the first user.
  • the mobile device 20 accepts input of the first user's purpose information by the voice recognized by the voice recognition unit 25.
  • the portable device 20 has a line-of-sight detection unit 24.
  • the line-of-sight detection unit 24 detects the line of sight of the first user.
  • the mobile device 20 accepts input of the first user's purpose information by the line of sight detected by the line-of-sight detection unit 24.
  • the user can input the purpose information without using his or her hands. Therefore, the convenience of the user is improved.
  • the traffic management system 1 includes a reading unit 23 and a related information storage unit 36.
  • the reading unit 23 reads the authentication information of the first user.
  • the reading unit 23 is provided in, for example, a mobile device 20 which is eyewear.
  • the reading unit 23 reads the biometric information of the first user as the authentication information when the mobile device 20 is attached to the first user.
  • the related information storage unit 36 stores the related information related to the first user in a readable manner based on the authentication information read by the reading unit 23.
  • the guidance information generation unit 34 generates guidance information based on the related information of the first user.
  • the related information storage unit 36 stores information in the language used by the first user as related information.
  • the related information storage unit 36 stores information on the floor that can be used by the first user among the plurality of floors as related information.
  • the related information storage unit 36 stores information on the physical function of the first user as related information.
  • information such as guidance information is presented in a language that the user can understand, or in a form according to each user's circumstances such as whether or not the user has a disability. Since it becomes easier for the user to grasp the content of the presented guidance information, the guidance of the user inside the building 2 becomes smoother. Further, when the mobile device 20 is a wearable device such as eyewear, the reading unit 23 can continuously read biometric information as authentication information. At this time, unauthorized replacement of users is prevented. The security of the building 2 is improved when the authentication information is used for cooperation with the entrance / exit management of the building 2.
  • the route information generation unit 33 generates route information based on the related information of the first user.
  • guidance information is generated to follow route information that reflects each user's circumstances, such as the presence or absence of a disability. Since guidance information is presented for a movement route the user can actually traverse, the guidance of the user becomes smoother.
  • the route information generation unit 33 generates route information including the evacuation route of the first user based on the related information of the first user when an emergency event occurs. Further, the route information generation unit 33 generates route information including the evacuation timing of the first user based on the related information of the first user when an emergency event occurs.
  • a plurality of floor-to-floor transportation means are provided.
  • the route information generation unit 33 selects one of the plurality of floor-to-floor transportation means based on the related information of the first user.
  • the route information generation unit 33 generates route information using the selected floor-to-floor moving means.
  • the actually available floor-to-floor transportation method is selected according to the circumstances of each user. As a result, the guidance of the user becomes smoother.
  • the route information generation unit 33 generates route information based on congestion information about the floor including the destination of the first user.
  • the situation such as congestion may change for each floor inside the building 2. Since the route information is generated based on the congestion information before and after the user moves between the floors, the guidance of the user becomes smoother.
  • when the route information generation unit 33 has already generated another person's route information, it generates the route information for the first user based on that route information.
  • another person's route information is route information that guides a user other than the first user among the plurality of users.
  • route information adjusted between each user is generated, so that the user's guidance becomes smoother.
  • the traffic management system 1 includes an operation management unit 18.
  • the operation management unit 18 registers a call from the first user's departure floor to the destination floor when the first user arrives at the landing 7 of the elevator 5. Further, the operation management unit 18 detects that the first user has arrived at the landing 7 by the recognition signal transmitted by the mobile device 20.
  • the recognition signal is transmitted when the mobile device 20 recognizes, with its camera, the coded image 12 attached to the landing 7. It is also transmitted when the mobile device 20 recognizes, with its camera, a landing device of the elevator 5.
  • the guided user's movement becomes smoother. Further, since arrival at the landing 7 is detected based on the image of the landing equipment or the coded image 12 attached to the landing 7, the call is registered by the user's natural movement. Since the coded image 12 need only be recognized from the image taken by the camera rather than by the user's own eyes, its size or color can be adjusted within the range in which image recognition is possible. Therefore, the influence of the coded image 12 on the design of the landing 7 is suppressed.
  • the guidance information generation unit 34 generates guidance information including information on the call of the first user.
  • FIG. 12 is a hardware configuration diagram of a main part of the traffic management system 1 according to the first embodiment.
  • Each function of the traffic management system 1 can be realized by a processing circuit.
  • the processing circuit includes at least one processor 100a and at least one memory 100b.
  • the processing circuit may include at least one dedicated hardware 200 in addition to, or instead of, the processor 100a and the memory 100b.
  • when the processing circuit includes the processor 100a and the memory 100b, each function of the traffic management system 1 is realized by software, firmware, or a combination of software and firmware. At least one of the software and the firmware is written as a program. The program is stored in the memory 100b. The processor 100a realizes each function of the traffic management system 1 by reading and executing the program stored in the memory 100b.
  • the processor 100a is also referred to as a CPU (Central Processing Unit), a processing device, an arithmetic device, a microprocessor, a microcomputer, or a DSP (Digital Signal Processor).
  • the memory 100b is composed of, for example, a non-volatile or volatile semiconductor memory such as a RAM, a ROM, a flash memory, an EPROM, or an EEPROM.
  • when the processing circuit includes the dedicated hardware 200, the processing circuit is realized by, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC, an FPGA, or a combination thereof.
  • Each function of the traffic management system 1 can be realized by its own processing circuit, or the functions can be realized collectively by a single processing circuit. For each function of the traffic management system 1, a part may be realized by the dedicated hardware 200 and the remainder by software or firmware. In this way, the processing circuit realizes each function of the traffic management system 1 by the dedicated hardware 200, by software, by firmware, or by a combination of these.
  • FIG. 13 is a configuration diagram of the traffic management system 1 according to the second embodiment.
  • the building 2 to which the traffic management system 1 is applied is an apartment house such as a residence.
  • the building management board 29 of the traffic management system 1 has a virtual design generation unit 38.
  • the virtual design generation unit 38 is a part that generates an image of a virtual design inside the building 2.
  • the image of the design generated by the virtual design generation unit 38 includes, for example, an image of the appearance of the walls, fittings, columns, ceilings, floors, and the like of the building 2.
  • the image of the design generated by the virtual design generation unit 38 includes an image of the appearance of the landing equipment of the elevator 5.
  • the image of the design generated by the virtual design generation unit 38 includes an image of the design inside the car 13 of the elevator 5. Images of the design inside the car 13 include images of the interior walls, ceilings, and floors of the car 13, the appearance of the car door 14, and the appearance of the car equipment.
  • the virtual design image generated by the virtual design generation unit 38 is an image that the display unit 22 of the mobile device 20 superimposes on the image of the actual design.
  • the mobile device 20 is eyewear worn by the user.
  • the image of the virtual design is, for example, an AR image or a VR image.
  • the related information storage unit 36 may include design information selected and registered in advance by the user.
  • the design registered here includes, for example, colors and patterns of surfaces such as walls.
  • the virtual design generation unit 38 generates a virtual design based on the user's related information.
  • FIG. 14 is a diagram showing an example of a virtual design generated by the virtual design generation unit 38 according to the second embodiment.
  • FIG. 14 shows an example of a virtual design inside the car 13 displayed by the mobile device 20.
  • the route information generation unit 33 starts generating route information when, for example, a user wearing a mobile device 20 which is eyewear goes out.
  • whether the user is going out is determined, for example, from the user's operation of the mobile device 20. Alternatively, it may be determined from, for example, the user's location information, or from an image taken by a fixed camera 28 provided in a common area of the user's residential floor.
  • the route information generation unit 33 sets, for example, the entrance of the building 2 as the destination of the user.
  • the operation management unit 18 registers a call from the user's residential floor to the entrance floor.
  • the user's residential floor is, for example, a floor including the user's residence.
  • the mobile device 20 worn by the user detects, with the visual field camera 26 and the image recognition unit 27, other users riding in the car 13 together with the user.
  • the mobile device 20 transmits information such as the relative positions and contours of the detected other users to the virtual design generation unit 38.
  • the virtual design generation unit 38 generates an image of a virtual design around the user based on, for example, the user's position information. For example, when the user is in the car 13, the virtual design generation unit 38 individually generates virtual design images of the inner wall surfaces, ceiling, and floor of the car 13, the car door 14, the car equipment, and so on. At this time, the virtual design generation unit 38 may generate the virtual design based on information such as the relative positions or contours of other users transmitted from the mobile device 20 worn by the user. The virtual design generation unit 38 generates, for example, an AR image in which images of the car equipment and the like are superimposed in the foreground of the actual image of the other users. Alternatively, the virtual design generation unit 38 may generate a VR image representing the other users.
  • in that case, the virtual design generation unit 38 generates, for example, a VR image in which images of the car equipment and the like are superimposed in the foreground of the image representing the other users.
  • the virtual design generation unit 38 generates, for example, an image in which the image of the car operation panel 15 is superimposed on the image of the other users.
  • the virtual design generation unit 38 may render the superimposed image of the car equipment as a semi-transparent image over the image of the other users.
  • the virtual design generation unit 38 places the image of the other users, or the image superimposed on it, in the foreground of the virtual design images of the wall surfaces, ceiling, floor, and car door 14 inside the car 13.
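The layer ordering described above — virtual interior surfaces at the back, the image of other users in front of them, and virtual car equipment such as the car operation panel 15 in the foreground, optionally semi-transparent — can be sketched per pixel. The function name, the `None`-as-transparent convention, and the blend representation are assumptions for illustration, not the patent's rendering method.

```python
# Per-pixel sketch of the layer order: equipment in front, passengers next,
# virtual interior surfaces at the back. None marks a transparent layer.

def visible_layer(background, passenger, equipment, equipment_alpha=1.0):
    """Resolve one pixel front-to-back."""
    if equipment is not None:
        if equipment_alpha >= 1.0 or passenger is None:
            return equipment
        # Semi-transparent equipment blended over the passenger image,
        # as the description above allows for the car-equipment overlay.
        return ("blend", equipment, passenger)
    if passenger is not None:
        return passenger
    return background

pixel = visible_layer("virtual wall", "other user", "car operation panel 15")
```

A pixel covered by equipment shows the equipment; a pixel covered only by a passenger shows the passenger in front of the virtual interior, matching the ordering in the bullets above.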
  • the third communication unit 30 transmits the image of the virtual design generated by the virtual design generation unit 38 to the second communication unit 21 of the mobile device 20.
  • the mobile device 20 displays an image of a virtual design received by the second communication unit 21.
  • the mobile device 20 may display guidance information by superimposing it on an image of a virtual design.
  • the mobile device 20 may display the content information on the image of the virtual design, for example, when the user is in the car 13.
  • the content information may be, for example, news, a weather forecast, an advertisement, or a notification from the manager of the building 2.
  • the mobile device 20 may acquire information such as the amount of activity of the user based on, for example, an acceleration sensor.
  • the route information generation unit 33 starts generating route information when the user returns home.
  • the return of the user is determined in the same manner as the user going out, for example.
  • the route information generation unit 33 sets, for example, the entrance of the user's residence, which is stored as the user's related information, as the destination.
  • the route information generation unit 33 selects the means of moving between floors based on, for example, the user's amount of activity.
  • the user's amount of activity is transmitted, for example, from the mobile device 20 to the building management board 29.
  • Alternatively, the amount of activity may be taken from, for example, a history of activity amounts registered in the related information.
  • the route information generation unit 33 selects the stairs 3 as the means of moving between floors when, for example, the user's amount of activity has been low.
  • the route information generation unit 33 may also select the means of moving between floors based on circumstances such as the amount of luggage the user is carrying.
  • the user's circumstances are determined based on, for example, an image taken by a fixed camera 28 provided in a common area such as the entrance.
  • when the user is carrying a large amount of luggage, for example, the route information generation unit 33 selects a cargo elevator 5 as the means of moving between floors.
  • Alternatively, the route information generation unit 33 may select an ordinary elevator 5 as the means of moving between floors.
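The selection rules in the bullets above can be condensed into a single decision function. This is a hedged sketch: the activity threshold, the priority of luggage over activity, and all names are illustrative assumptions, not values given in the description.

```python
# Illustrative decision rule for the means of moving between floors:
# much luggage -> cargo elevator; low activity -> stairs; otherwise the
# ordinary elevator. Threshold and priority order are assumed.

def select_means(activity_amount, much_luggage, activity_threshold=5000):
    if much_luggage:
        return "cargo elevator 5"  # large luggage takes priority
    if activity_amount < activity_threshold:
        return "stairs 3"          # low activity: encourage walking
    return "elevator 5"            # ordinary elevator otherwise

means = select_means(activity_amount=2000, much_luggage=False)
```

Under these assumptions a user with low daily activity and no luggage is routed to the stairs 3, matching the bullet above.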
  • the operation management unit 18 registers a call from the entrance floor to the user's residential floor.
  • the traffic management system 1 includes a virtual design generation unit 38.
  • the virtual design generation unit 38 generates an image of a virtual design inside the building 2.
  • the image of the virtual design is displayed by the mobile device 20 to the first user.
  • the image of the virtual design includes an image of the inside of the car 13 displayed by the mobile device 20 to the first user when the first user is in the car 13.
  • the design of the building 2 that reflects the preferences of each user can be displayed to the user.
  • the user can spend more comfortably inside the building 2.
  • the user can select a design that is less likely to cause discomfort, such as a feeling of confinement, even in an enclosed space such as the inside of the car 13.
  • the virtual design generation unit 38 generates an image in which the image of the car equipment provided inside the car 13 is superimposed in the foreground of the image of other users riding in the car 13 together with the first user among the plurality of users.
  • the virtual design generation unit 38 may likewise generate an image in which an image of landing equipment such as the landing display panel 9 or the landing operation panel 8 is superimposed in the foreground of the image of other users present at the landing 7 with the first user. Further, the virtual design image of the car equipment or the landing equipment need not reflect the size of the actual equipment.
  • the virtual design image of the car equipment or the landing equipment may, for example, be displayed larger than the actual equipment so that the user can recognize it more easily. It may also be displayed in a place where no actual car equipment or landing equipment exists.
  • operations on the car equipment or landing equipment displayed as a virtual design image may be performed, for example, through the mobile device 20.
  • the operation is performed, for example, through the line-of-sight detection unit 24 or the like.
  • an operation through the line-of-sight detection unit 24 is performed, for example, by keeping the line of sight directed at the area where an operable part such as a button is displayed in the virtual design image for longer than a preset time.
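The dwell-time gaze operation just described can be sketched as follows. The rectangle test, the sampling format, and the default 1.0-second threshold are illustrative assumptions; the description only requires that the gaze stay on the displayed button for longer than a preset time.

```python
# Minimal dwell-time sketch of the gaze-based operation: the operation fires
# once the line of sight has stayed inside the button's display region for
# longer than a preset time.

def gaze_dwell_trigger(samples, region, dwell_time=1.0):
    """samples: (t_seconds, x, y) gaze points in display coordinates.
    region: (x0, y0, x1, y1) rectangle where the button is displayed."""
    x0, y0, x1, y1 = region
    enter_t = None
    for t, x, y in samples:
        if x0 <= x <= x1 and y0 <= y <= y1:
            if enter_t is None:
                enter_t = t                # gaze entered the button region
            elif t - enter_t >= dwell_time:
                return True                # held long enough: operate
        else:
            enter_t = None                 # gaze left the region: reset
    return False

triggered = gaze_dwell_trigger(
    [(0.0, 5, 5), (0.5, 6, 5), (1.1, 6, 6)], region=(0, 0, 10, 10))
```

Resetting the timer whenever the gaze leaves the region keeps a passing glance from triggering the operation.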
  • the virtual design generation unit 38 may be provided in the mobile device 20. In that case, the communication resources otherwise needed to transmit virtual design images are saved, and display misalignment of the virtual design due to communication delay is suppressed. On the other hand, when the virtual design generation unit 38 is provided on the building management board 29, the information-processing load on the mobile device 20 is reduced.
  • FIG. 15 is a configuration diagram of the traffic management system 1 according to the third embodiment.
  • the building 2 to which the traffic management system 1 is applied is a commercial facility.
  • Commercial facilities include, for example, a plurality of stores.
  • the building management board 29 of the traffic management system 1 has an information storage unit 39.
  • the information storage unit 39 is a part that accumulates and stores information on users' behavior in the building 2.
  • the information on the user's behavior may include, for example, the user's viewpoint information.
  • FIG. 16 is a diagram showing an example of viewpoint information accumulated by the information storage unit 39 according to the third embodiment.
  • the viewpoint information is information on the point at which a user wearing the mobile device 20, which is eyewear, is looking.
  • the viewpoint information includes, as visual field image information, the image taken by the visual field camera 26.
  • the viewpoint information includes, as line-of-sight information, the point within the visual field image to which the user's line of sight, detected by the line-of-sight detection unit 24, is directed.
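One viewpoint-information record as just defined — a reference to the visual field image from camera 26 plus the gaze point within it from the line-of-sight detection unit 24 — can be sketched as a small data structure. The field names and the `gazed_region` helper are invented for illustration.

```python
from dataclasses import dataclass

# Sketch of one viewpoint-information record: a reference to the visual
# field image (camera 26) plus the gaze point within it (unit 24).

@dataclass
class ViewpointRecord:
    timestamp: float
    frame_id: str   # the visual field image this gaze point refers to
    gaze_x: int     # gaze point in image coordinates
    gaze_y: int

    def gazed_region(self, half=50):
        """Box around the gaze point, e.g. for cropping the looked-at area."""
        return (self.gaze_x - half, self.gaze_y - half,
                self.gaze_x + half, self.gaze_y + half)

rec = ViewpointRecord(timestamp=12.5, frame_id="frame_0031",
                      gaze_x=320, gaze_y=180)
```

Storing the gaze point in the image's own coordinates ties each line-of-sight sample to the exact visual field image it was detected in, which is what lets the accumulated records reflect what the user looked at.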
  • the mobile device 20 transmits the authentication information to the building management board 29 when the user arrives at the building 2.
  • the mobile device 20 may transmit purpose information to the building management board 29.
  • the purpose information is input based on, for example, the user's voice or line of sight.
  • the purpose information may be, for example, information such as a product the user intends to purchase.
  • the purpose information may be input before the user arrives at the building 2.
  • the purpose information may be updated while the user is staying in the building 2.
  • the mobile device 20 may be lent to a user from a commercial facility.
  • in that case, the user inputs, through the mobile device 20, related information including the user's attributes and the like when entering the commercial facility.
  • the user may also input the purpose information.
  • the route information generation unit 33 sets the destination based on the user's purpose information. For example, when the purpose information is a product to be purchased, the route information generation unit 33 sets a store that handles that product as the destination.
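Setting the destination from purpose information, as described above, amounts to a lookup from the purchase-target product to a store. The product-to-store table and all names below are invented for illustration; the description does not specify how stores are indexed.

```python
# Sketch: when the purpose information is a product to be purchased, the
# store handling that product becomes the destination. Table is invented.

STORES = {"coffee beans": ("Store A", 2), "sneakers": ("Store B", 4)}

def destination_from_purpose(purchase_target):
    """Map a purchase-target product to a destination store and floor."""
    store, floor = STORES[purchase_target]
    return {"destination": store, "floor": floor}

dest = destination_from_purpose("sneakers")
```

The returned floor is what the route information generation unit 33 would then use when choosing the means of moving between floors.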
  • the route information generation unit 33 may select the means of moving between floors based on circumstances such as the amount of luggage the user is carrying.
  • the route information generation unit 33 may select the means of moving between floors based on the circumstances of the user's companions, such as whether a stroller is in use.
  • the route information generation unit 33 may select the means of moving between floors based on the user's usage circumstances, such as whether a shopping cart is in use. For example, when the user is using a stroller, a shopping cart, or the like, the route information generation unit 33 selects the elevator 5 as the means of moving between floors.
  • the mobile device 20 may display content information when doing so does not interfere with the movement of the user or of other users nearby.
  • the mobile device 20 displays content information, for example, when the user is stopped.
  • the content information may be audio information.
  • the content information is generated by, for example, the guidance information generation unit 34, the virtual design generation unit 38, or the like.
  • the content information may be, for example, an advertisement.
  • the displayed advertisement may be selected based on related information such as a user's attribute or preference.
  • the image recognition unit 27 of the mobile device 20 may detect the product picked up by the user based on the image of the visual field camera 26. At this time, the mobile device 20 transmits the detected product information to the building management board 29.
  • the guidance information generation unit 34 or the like of the building management board 29 may generate content information including advertisements for the detected product or for products related to it. Further, the image recognition unit 27 of the mobile device 20 may detect a product that the user picked up but did not purchase. In that case, the display unit 22 of the mobile device 20 may display content information including an advertisement for that product again after, for example, a preset time has elapsed.
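The re-display rule just described — show an advertisement again for a product the user picked up but did not purchase, once a preset time has passed — can be sketched as follows. The event model and the 60-second default delay are illustrative assumptions.

```python
# Sketch: advertisements for products picked up but not purchased are
# shown again after a preset delay. Names and the delay are assumed.

def products_to_readvertise(picked_up, purchased, now, delay=60.0):
    """picked_up maps product -> time it was picked up (seconds)."""
    return sorted(product for product, t in picked_up.items()
                  if product not in purchased and now - t >= delay)

ads = products_to_readvertise({"tea": 10.0, "soap": 100.0},
                              purchased={"tea"}, now=200.0)
```

Purchased products are excluded outright, and recently handled products are held back until the delay has elapsed, so the user is not shown the advertisement immediately after putting the product down.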
  • the visual field camera 26, the line-of-sight detection unit 24, and the image recognition unit 27 of the mobile device 20 generate viewpoint information.
  • the viewpoint information may include the result of detection of the product picked up by the user.
  • the mobile device 20 transmits the generated viewpoint information to the building management board 29.
  • the building management board 29 accumulates the received viewpoint information in the information storage unit 39.
  • the guidance information generation unit 34 or the like of the building management board 29 may generate content information or the like based on the information stored in the information storage unit 39.
  • the user behavior information stored in the information storage unit 39 is not limited to viewpoint information.
  • the information on the user's behavior may include, for example, the history of the route traveled by the user.
  • the mobile device 20 may be a wearable device worn on an arm or the like, or a device other than eyewear, such as a smartphone.
  • the traffic management system 1 includes an information storage unit 39.
  • the information storage unit 39 stores viewpoint information.
  • the viewpoint information includes visual field image information and line-of-sight information.
  • the visual field image information is information of the image taken by the visual field camera 26.
  • the visual field camera 26 is provided in the mobile device 20.
  • the visual field camera 26 is directed toward the area in front of the first user.
  • the line-of-sight information is information on the line of sight of the first user detected by the line-of-sight detection unit 24.
  • the line-of-sight detection unit 24 is provided in the mobile device 20.
  • the viewpoint information can reflect the objects of the user's interest. It is therefore valuable information that can be used, for example, to select advertisements for the user. Since the viewpoint information is collected by the mobile device 20 that guides the user's movement, the traffic management system 1 can accumulate viewpoint information while providing a benefit to the user.
  • the traffic management system 1 can be used, for example, to propose a tenant layout to the owner or manager of a commercial facility by using the accumulated viewpoint information or the like. Further, the traffic management system 1 can be used to provide information on products that are of great interest to users, for example, to tenants of commercial facilities or manufacturers of products handled in commercial facilities, based on the accumulated viewpoint information.
  • the information stored in the information storage unit 39 may be processed so that individual users cannot be identified, and then traded as big data or the like.
  • the information stored in the information storage unit 39 may be deposited in, for example, an information bank.
  • the information storage unit 39 may be provided in the mobile device 20. In that case, the communication resources otherwise needed to transmit the stored information are saved. Further, when the user owns the mobile device 20, the user can manage the accumulated information himself or herself. The user may then receive compensation for providing the information directly, for example, by depositing the accumulated information in an information bank.
  • FIG. 17 is a configuration diagram of the traffic management system 1 according to the fourth embodiment.
  • the building 2 to which the traffic management system 1 is applied is an accommodation facility such as a hotel.
  • the elevator 5 is equipped with a plurality of cars 13.
  • a plurality of landing doors 10 are provided at the landing 7 of the elevator 5.
  • each landing door 10 corresponds to one of the cars 13.
  • the landing door 10 at the landing 7 of a floor that corresponds to a car 13 opens and closes in conjunction with the opening and closing of the car door 14 of that car 13.
  • the operation management unit 18 assigns a user's call to one of the plurality of cars 13 based on, for example, the operation efficiency of the elevator 5.
  • FIG. 18 is a plan view of the landing 7 according to the fourth embodiment.
  • FIG. 19 is a diagram showing an example of guidance information generated by the guidance information generation unit 34 according to the fourth embodiment.
  • FIG. 18 shows a user carrying a mobile device 20 and waiting for the arrival of the car 13 at the landing 7.
  • the elevator 5 includes six cars 13, Units A through F.
  • the operation management unit 18 assigns the user's call to Unit F.
  • a user who is unaware that the call has been assigned to Unit F is waiting facing the landing door 10 corresponding to Unit B. At this time, the user's back is turned to the landing door 10 corresponding to Unit F.
  • the guidance information generation unit 34 generates guidance information including information for specifying the car 13 to which the user's call is assigned.
  • the information of the car 13 to which the user's call is assigned is provided, for example, from the operation management unit 18 to the guidance information generation unit 34.
  • FIG. 19 shows an example of guidance information displayed by the mobile device 20.
  • the guidance information generation unit 34 generates guidance information including the number of the car 13, such as "Unit F", as the information specifying the car 13 to which the user's call is assigned. Further, the guidance information generation unit 34 generates guidance information including an image such as an arrow indicating the direction, as seen from the user, of the car 13 to which the call is assigned.
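Turning the assigned car's position into the "direction as seen from the user" shown in the arrow can be sketched as a bearing computation. The coordinate convention (x right, y up, angles counterclockwise), the 90-degree sectors, and all names are illustrative assumptions; in the system described above the positions would come from the position information acquisition unit 32.

```python
import math

# Sketch: compute the assigned car's direction relative to the user's
# heading, for the guidance arrow. Conventions are assumed.

def arrow_direction(user_xy, user_heading_deg, car_xy):
    """Return 'ahead', 'left', 'right', or 'behind' for the assigned car."""
    dx = car_xy[0] - user_xy[0]
    dy = car_xy[1] - user_xy[1]
    bearing = math.degrees(math.atan2(dy, dx))            # world-frame bearing
    # Wrap the relative angle into (-180, 180].
    rel = (bearing - user_heading_deg + 180.0) % 360.0 - 180.0
    if -45.0 <= rel <= 45.0:
        return "ahead"
    if 45.0 < rel < 135.0:
        return "left"      # counterclockwise of the user's heading
    if -135.0 < rel < -45.0:
        return "right"
    return "behind"

# As in FIG. 18: the user faces Unit B while Unit F is directly behind.
direction = arrow_direction((0.0, 0.0), 90.0, (0.0, -3.0))
```

The same computation would serve the fifth embodiment's door-direction arrow, with the car door's position in place of the car's.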
  • the third communication unit 30 transmits the generated guidance information to the mobile device 20.
  • the display unit 22 of the mobile device 20 displays the guidance information received by the second communication unit 21.
  • the destination of the user is, for example, the room in which the user stays.
  • the room in which the user stays is entered, for example, at check-in.
  • a mobile device 20 associated with the room in which the user is staying may be lent to the user, for example, at check-in.
  • the display unit 22 of the mobile device 20 may display the content information to the user, for example, when the user is waiting at the landing 7 of the elevator 5 or is in the car 13.
  • the content information may include information about the accommodation facility, precautions for using the facility, tourist information about the surrounding area, and the like.
  • the content information displayed here may take the place of printed materials provided in each room.
  • the electric lock 37 in the room where the user stays may be unlocked based on the authentication information read by the reading unit 23 of the mobile device 20.
  • the operation management unit 18 assigns the call of the first user to any of the plurality of cars 13.
  • the guidance information generation unit 34 generates guidance information including information for specifying the car 13 to which the call of the first user is assigned. Further, the guidance information generation unit 34 generates guidance information including the direction of the car 13 to which the call of the first user is assigned. The direction of the car 13 is the direction seen from the first user at the landing 7.
  • the user can use the elevator 5 more smoothly. Further, since the direction of the car 13 as seen from the user is shown as guidance information, the user can more easily recognize the car 13. This makes the movement of the user smoother.
  • FIG. 20 is a configuration diagram of the traffic management system 1 according to the fifth embodiment.
  • the building 2 to which the traffic management system 1 is applied is a public facility.
  • the car 13 of the elevator 5 has two car doors 14. One car door 14 is arranged at the front of the car 13, and the other at the rear. On some of the plurality of floors, the landing 7 is provided on the front side of the car 13; on the other floors, it is provided on the rear side. When the car 13 stops at a floor, the car door 14 on the side where that floor's landing 7 is provided opens and closes.
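The through-type arrangement just described pairs each floor with the side of the car 13 on which its landing 7 sits, and the car door 14 on that side is the one that opens. The floor-to-side table below is an invented example, not data from the description.

```python
# Sketch: each floor's landing 7 is on either the front or rear side of the
# car 13, and the car door 14 on that side opens there. Table is invented.

LANDING_SIDE = {1: "front", 2: "rear", 3: "front", 4: "rear"}

def door_to_open(floor):
    """Which of the two car doors 14 opens at this floor's landing 7."""
    return LANDING_SIDE[floor]

opens = door_to_open(2)
```

Comparing `door_to_open` for the departure floor and the destination floor shows exactly the case the guidance addresses: the two floors may use opposite doors.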
  • FIG. 21 is a diagram showing a user riding in the car 13 according to the fifth embodiment.
  • on the user's departure floor, the landing 7 is provided on the front side of the car 13.
  • on the user's destination floor, the landing 7 is provided on the rear side of the car 13.
  • a user who is unaware that the car door 14 that opens differs between the departure floor and the destination floor stands with his or her back to the rear car door 14, which opens at the destination floor. In this case, the user may take time to get off the car 13. Therefore, the guidance information generation unit 34 generates guidance information including information indicating the car door 14 that opens at the user's destination floor.
  • FIG. 22 is a diagram showing an example of guidance information generated by the guidance information generation unit 34 according to the fifth embodiment.
  • FIG. 22 shows an example of guidance information displayed by the mobile device 20.
  • the guidance information generation unit 34 generates guidance information including a message indicating the car door 14 that opens at the user's destination floor. Further, the guidance information generation unit 34 generates guidance information including an image such as an arrow indicating the direction, as seen from the user, of the car door 14 that opens at the destination floor.
  • the third communication unit 30 transmits the generated guidance information to the mobile device 20.
  • the display unit 22 of the mobile device 20 displays the guidance information received by the second communication unit 21.
  • when the first user is in a car 13 having a plurality of car doors 14, the guidance information generation unit 34 generates guidance information including the direction of the car door 14 that opens on the first user's destination floor.
  • the direction of the car door 14 is the direction seen from the first user who is in the car 13.
  • the traffic management system according to this disclosure can be applied to buildings with multiple floors.
  • 1 traffic management system, 2 building, 3 stairs, 4 escalator, 5 elevator, 6 hoistway, 7 landing, 8 landing operation panel, 9 landing display panel, 10 landing door, 11 three-sided frame, 12 coded image, 13 car, 14 car door, 15 car operation panel, 16 car display panel, 17 elevator management panel, 18 operation management unit, 19 first communication unit, 20 mobile device, 21 second communication unit, 22 display unit, 23 reading unit, 24 line-of-sight detection unit, 25 voice recognition unit, 26 visual field camera, 27 image recognition unit, 28 fixed camera, 29 building management board, 30 third communication unit, 31 congestion information acquisition unit, 32 position information acquisition unit, 33 route information generation unit, 34 guidance information generation unit, 35 authentication unit, 36 related information storage unit, 37 electric lock, 38 virtual design generation unit, 39 information storage unit, 100a processor, 100b memory, 200 dedicated hardware

Abstract

The invention relates to a traffic management system for a building that guides a user moving inside the building. A traffic management system (1) is provided with a mobile device (20), a congestion information acquisition unit (31), a position information acquisition unit (32), a route information generation unit (33), and a guidance information generation unit (34). The mobile device (20) is carried by a first user among one or more users in a building (2). The building (2) has a plurality of floors. The congestion information acquisition unit (31) acquires, for each floor, information on congestion caused by the one or more users in the building (2). The position information acquisition unit (32) acquires position information of the first user in the building (2). The route information generation unit (33) generates route information for directing the first user, based on the position information of the first user, intention information of the first user, and the congestion information for the floor on which the first user is located. The guidance information generation unit (34) generates guidance information for the mobile device (20) to guide the first user in accordance with the route information.
PCT/JP2020/022138 2020-06-04 2020-06-04 Système de gestion de circulation dans un bâtiment WO2021245886A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2022529256A JP7294538B2 (ja) 2020-06-04 2020-06-04 建物の交通管理システム
PCT/JP2020/022138 WO2021245886A1 (fr) 2020-06-04 2020-06-04 Système de gestion de circulation dans un bâtiment
CN202080101375.0A CN115698632A (zh) 2020-06-04 2020-06-04 建筑物的交通管理系统

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/022138 WO2021245886A1 (fr) 2020-06-04 2020-06-04 Système de gestion de circulation dans un bâtiment

Publications (1)

Publication Number Publication Date
WO2021245886A1 true WO2021245886A1 (fr) 2021-12-09

Family

ID=78830284

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/022138 WO2021245886A1 (fr) 2020-06-04 2020-06-04 Système de gestion de circulation dans un bâtiment

Country Status (3)

Country Link
JP (1) JP7294538B2 (fr)
CN (1) CN115698632A (fr)
WO (1) WO2021245886A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7416316B1 (ja) 2023-06-07 2024-01-17 三菱電機ビルソリューションズ株式会社 エレベーターシステム

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002286491A (ja) * 2001-03-22 2002-10-03 Kokusai Kogyo Co Ltd 携帯用ナビゲーションシステム
JP2011007696A (ja) * 2009-06-26 2011-01-13 Sanyo Electric Co Ltd 経路探索装置、経路探索システムおよび経路探索プログラム
JP2011242881A (ja) * 2010-05-14 2011-12-01 Chugoku Electric Power Co Inc:The 避難誘導方法、及びこの方法に用いる情報処理システム
JP2013513804A (ja) * 2009-12-09 2013-04-22 クアルコム,インコーポレイテッド 屋内ナビゲーション環境で命令を低減するための方法および装置
JP2015225025A (ja) * 2014-05-29 2015-12-14 株式会社日立システムズ 眼鏡型のウェアラブル端末および該ウェアラブル端末を使用する建屋内行先誘導システム
WO2015194017A1 (fr) * 2014-06-19 2015-12-23 日立マクセル株式会社 Dispositif portable sur soi et procédé d'authentification associé
WO2016117061A1 (fr) * 2015-01-22 2016-07-28 株式会社野村総合研究所 Terminal portatif et système de traitement d'informations utilisant ce terminal
JP2017026568A (ja) * 2015-07-28 2017-02-02 ソニー株式会社 情報処理装置、情報処理方法及びプログラム
WO2018195099A1 (fr) * 2017-04-19 2018-10-25 Magic Leap, Inc. Exécution de tâche multimodale et édition de texte pour un système portable
JP2019516949A (ja) * 2016-05-19 2019-06-20 アリババ・グループ・ホールディング・リミテッドAlibaba Group Holding Limited 屋内ナビゲーションのための方法、装置及びシステム
WO2019181153A1 (fr) * 2018-03-20 2019-09-26 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et support d'enregistrement
WO2019234899A1 (fr) * 2018-06-07 2019-12-12 三菱電機株式会社 Système de commande de dispositif


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7416316B1 (ja) 2023-06-07 2024-01-17 三菱電機ビルソリューションズ株式会社 エレベーターシステム

Also Published As

Publication number Publication date
CN115698632A (zh) 2023-02-03
JP7294538B2 (ja) 2023-06-20
JPWO2021245886A1 (fr) 2021-12-09

Similar Documents

Publication Publication Date Title
AU2021200009B2 (en) System and method for alternatively interacting with elevators
JP6619760B2 Elevator apparatus, elevator system, and method for controlling an autonomous robot
US10095315B2 (en) System and method for distant gesture-based control using a network of sensors across the building
RU2527883C2 Method for controlling a hoisting equipment system
US20120234631A1 (en) Simple node transportation system and control method thereof
TW201532940A Elevator control system
US20130277153A1 (en) Conveying system
JP6927867B2 Elevator system
EP3453665A1 Dynamic display of information for building occupants
KR20180137549A Elevator system and car call estimation method
US11816934B2 (en) Smart airport and cabin system to avoid touch points and maintain social distancing
JP6180682B1 Security gate and elevator system
WO2021245886A1 Building traffic management system
KR101947570B1 Elevator system performing user-customized operation
JP5596423B2 Elevator control system
CN207671480U Elevator control system based on face recognition
US20210101776A1 (en) Elevator system
JP6068691B1 Elevator operation management system and operation management method
JP2003226474A Elevator system
WO2022153899A1 Guidance system
JP6719357B2 Elevator system
JP6969654B1 Guidance system and elevator system
JP5804958B2 Elevator with a function for preferentially rescuing vulnerable evacuees
KR102514128B1 Artificial intelligence device for providing connection between a plurality of home devices, and method therefor
JPWO2021245886A5 (fr)

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20938633

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022529256

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20938633

Country of ref document: EP

Kind code of ref document: A1