WO2022244372A1 - Environmental state notification device, environmental state notification method, and program
- Publication number
- WO2022244372A1 (PCT/JP2022/008481)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- state
- unit
- environmental
- information
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/18—Status alarms
- G08B21/24—Reminder alarms, e.g. anti-loss alarms
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
Definitions
- The present disclosure relates to an environmental state notification device, an environmental state notification method, and a program.
- Patent Document 1 discloses a wearable terminal equipped with a camera. In this terminal, the angle of view of the camera is controlled according to the action state of the user.
- To ensure safety, pedestrians need to pay sufficient attention not only to vehicles but also to other entities such as suspicious persons. Furthermore, preventing pedestrians from approaching and entering areas requiring caution is very effective from the viewpoint of improving pedestrian safety.
- The present disclosure has been made in view of the circumstances described above, and provides techniques that are advantageous for improving user safety.
- One aspect of the present disclosure relates to an environmental state notification device including: an environment information acquisition unit that is attached to a user and acquires environment information data indicating information on the user's surrounding environment; an environmental state recognition unit that recognizes the state of the surrounding environment for the user based on the environment information data; and a notification unit that notifies the user according to the state of the surrounding environment recognized by the environmental state recognition unit.
- The environment information acquisition unit may include a wave sensor that receives at least one of electromagnetic waves and sound waves from the environment around the user and acquires the environment information data.
- The wave sensor may acquire an image of the environment around the user.
- The environment information acquisition unit may include an electromagnetic wave emission unit that emits detection electromagnetic waves or detection sound waves, and the wave sensor may receive the reflected detection electromagnetic waves or detection sound waves.
- The environment information acquisition unit may include a location information acquisition unit that acquires the user's location information and a map information matching unit that acquires the environment information data based on the location information and map information.
- The location information acquisition unit may acquire the user's location information from a satellite positioning system.
- The environmental state notification device may include a communication unit that performs data communication, and the environmental state recognition unit may recognize the state of the surrounding environment for the user based on the environment information data acquired by the environment information acquisition unit and environment information data sent from a first external device via the communication unit.
- The environmental state notification device may include a user state information acquisition unit that acquires user state data indicating the state of the user, and the environmental state recognition unit may recognize the state of the surrounding environment for the user based on the environment information data and the user state data.
- The user state information acquisition unit may include a sensor that detects at least one of the user's movement speed and acceleration.
- The environmental state recognition unit may predict the user's moving route based on the user state data, and recognize the state of the surrounding environment for the user based on the predicted moving route.
- The environmental state recognition unit may acquire a dynamic map of the user's surrounding environment and recognize the state of the surrounding environment for the user based on the environment information data and the dynamic map.
- The notification unit may be attached to the user.
- The notification unit may notify the user through at least one of auditory, visual, and tactile sensations.
- The notification unit may notify a second external device that exists in the user's surrounding environment and is not attached to the user, according to the state of the user's surrounding environment recognized by the environmental state recognition unit.
- The second external device may be a vehicle.
- The environmental state notification device may include a communication unit that performs data communication, and the communication unit may send data indicating the state of the surrounding environment for the user recognized by the environmental state recognition unit to a third external device.
- The environmental state notification device may include an installation state detection unit that detects the installation state of the environment information acquisition unit with respect to the user, and the notification unit may notify the user according to the detection result of the installation state detection unit.
- Another aspect of the present disclosure relates to an environmental state notification method including: a step in which an environment information acquisition unit attached to a user acquires environment information data indicating information on the user's surrounding environment; a step in which an environmental state recognition unit recognizes the state of the surrounding environment for the user based on the environment information data; and a step in which a notification unit notifies the user according to the state of the surrounding environment recognized by the environmental state recognition unit.
- Another aspect of the present disclosure relates to a program that causes a computer to execute: a procedure for acquiring environment information data indicating information on the surrounding environment of a user by an environment information acquisition unit attached to the user; a procedure for recognizing the state of the surrounding environment for the user based on the environment information data; and a procedure for notifying the user, by a notification unit, according to the recognized state of the surrounding environment.
- FIG. 1 is a diagram showing an example of a wearable device that functions as an environmental state notification device.
- FIG. 2 is a diagram showing an example of the functional configuration of the wearable device.
- FIG. 3 is a diagram showing an example of the functional configuration of the environmental state notification device.
- FIG. 4 is a diagram showing another example of the functional configuration of the environmental state notification device.
- FIG. 5 is a diagram showing another example of the functional configuration of the environmental state notification device.
- FIG. 6 is a diagram showing another example of the functional configuration of the wearable device.
- FIG. 7 is a diagram showing another example of the functional configuration of the wearable device.
- FIG. 8 is a flow chart showing an example of an environmental state notification method.
- FIG. 9 is a diagram showing another example of the functional configuration of the environmental state notification device.
- FIG. 10 is a diagram showing another example of the functional configuration of the environmental state notification device.
- FIG. 11 is a diagram showing another example of the functional configuration of the environmental state notification device.
- FIG. 12 is a diagram showing another example of the functional configuration of the environmental state notification device.
- FIG. 13 is a diagram showing another example of the functional configuration of the environmental state notification device.
- FIG. 1 is a diagram showing an example of the wearable device 11 that functions as the environmental state notification device 10.
- FIG. 2 is a diagram showing an example of the functional configuration of the wearable device 11.
- The wearable device 11 shown in FIG. 1 is attached to the clothing of user 1.
- The attachment point of the wearable device 11 is not limited to clothes, and the wearable device 11 can be attached to user 1 in any manner.
- Typically, the wearable device 11 is worn by user 1 in the form of a wristwatch, a neck strap, eyeglasses, earphones (including headphones), a ring, or the like.
- The wearable device 11 shown in FIG. 1 includes an attachment part 21, a control unit 22, a memory 23, an environment information acquisition unit 24, an environmental state recognition unit 25, and a notification unit 26.
- The attachment part 21 is used to attach the wearable device 11 to user 1.
- The specific structure of the attachment part 21 is not limited, and the manner in which the wearable device 11 is attached to user 1 is also not limited.
- As an example, the attachment part 21 may be provided with a magnet, and the wearable device 11 can be detachably attached to clothes by pinching the clothes using magnetic force.
- The attachment part 21 may attach the wearable device 11 to the clothes of user 1 in another form, such as a clip.
- The attachment part 21 can also be configured as a belt (wristband).
- In this way, the attachment part 21 has a configuration and shape suitable for the manner in which the wearable device 11 is attached to user 1.
- The control unit 22 controls each functional configuration unit of the wearable device 11.
- The control unit 22 typically includes various processors such as a CPU (Central Processing Unit) or an MPU (Micro Processing Unit).
- The specific configuration of the control unit 22 is not limited.
- The control unit 22 may be configured as a single device that integrally controls the plurality of functional configuration units included in the wearable device 11.
- Alternatively, the control unit 22 may be composed of multiple devices.
- In that case, each functional configuration unit of the wearable device 11 may incorporate a corresponding control device (control unit 22).
- The memory 23 stores various data (including programs).
- The memory 23 includes at least one of a nonvolatile storage medium and a volatile storage medium.
- The memory 23 can be configured by, for example, an EEPROM (Electrically Erasable Programmable Read-Only Memory) or a RAM (Random Access Memory).
- The memory 23 can also be configured by a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
- Each functional configuration unit of the wearable device 11 (including the control unit 22) may read various data from the memory 23, store new data in the memory 23, or update data stored in the memory 23.
- The environment information acquisition unit 24 is attached to user 1 and acquires environment information data indicating information on the surrounding environment of user 1.
- The specific configuration of the environment information acquisition unit 24 and the specific information included in the environment information data are not limited.
- As an example, the environment information acquisition unit 24 includes a wave sensor that operates under the control of the control unit 22.
- The wave sensor receives at least one of electromagnetic waves and sound waves (for example, ultrasonic waves) from the environment around user 1 and acquires the environment information data.
- The electromagnetic waves referred to here can include light such as visible light and radio waves such as millimeter waves. The wave sensor can therefore be configured as an image sensor that acquires an image (still image and/or moving image) of the surrounding environment of user 1, or as a sensor that receives radio waves or light reflected by objects around user 1.
- Specifically, the environment information acquisition unit 24 may include a camera (imaging device), LiDAR (Light Detection and Ranging), a ToF (Time of Flight) sensor, a millimeter wave radar, and/or an ultrasonic sensor.
- The camera can adopt various shooting methods, such as a stereo camera, a monocular camera, or an infrared camera.
- The environment information acquisition unit 24 may have a sensor fusion unit (not shown).
- This sensor fusion unit performs sensor fusion processing to obtain new information from a combination of different types of sensor data (e.g., image data and radar detection data).
- Sensor fusion processing may include, for example, processing that integrates, fuses, or combines multiple types of sensor data.
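The kind of fusion described above can be sketched as a simple late-fusion step that combines camera detections with radar range measurements. The sketch below is illustrative only and not part of the disclosure; all names, field layouts, and the bearing-matching rule are assumptions.

```python
# Hypothetical late-fusion step: camera detections (object label, bearing) are
# combined with radar returns (bearing, range) so that each detected object
# gains a distance estimate. Data shapes are illustrative assumptions.

def fuse(camera_detections, radar_returns, bearing_tolerance_deg=5.0):
    """Attach the nearest matching radar range to each camera detection."""
    fused = []
    for det in camera_detections:
        # Keep only radar returns whose bearing roughly matches this detection.
        candidates = [r for r in radar_returns
                      if abs(r["bearing_deg"] - det["bearing_deg"]) <= bearing_tolerance_deg]
        # Take the closest matching return; None if the radar saw nothing there.
        range_m = min((r["range_m"] for r in candidates), default=None)
        fused.append({**det, "range_m": range_m})
    return fused

detections = [{"label": "vehicle", "bearing_deg": 10.0}]
returns = [{"bearing_deg": 12.0, "range_m": 8.5}, {"bearing_deg": 80.0, "range_m": 3.0}]
print(fuse(detections, returns))  # the vehicle gains range_m = 8.5
```

The fused record carries strictly more information than either sensor alone, which is the "new information from a combination of different types of sensor data" the passage refers to.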
- The environmental state recognition unit 25 recognizes the state of the surrounding environment for user 1 based on the environment information data acquired by the environment information acquisition unit 24.
- The "state of the surrounding environment for user 1" here may include the safety of the surrounding environment for user 1. For example, whether or not a mobile object such as a vehicle or a bicycle is approaching user 1, and whether or not the current position or predicted future position of user 1 is a position requiring attention, can be included in the "state of the surrounding environment for user 1". In addition, whether or not a mobile object such as a vehicle, or a creature such as a human, located near user 1 is exhibiting suspicious behavior can also be included in the "state of the surrounding environment for user 1".
- The "state of the surrounding environment for user 1" may also include states of the surrounding environment that may affect user 1 in ways other than safety.
- The specific analysis method used by the environmental state recognition unit 25 is not limited.
- As an example, the environmental state recognition unit 25 is equipped with AI (Artificial Intelligence) and can analyze the environment information data using learned data obtained by deep learning.
- The learned data may include learned parameters and an inference program incorporating the learned parameters.
- The learned parameters and the inference program are obtained, for example, as a result of learning using raw data and/or training data sets derived from raw data.
- The environmental state recognition unit 25 of this example can use learned data in which "environment information data and/or data derived from the environment information data" is associated with "data relating to the state of the surrounding environment for user 1".
- That is, by applying the "environment information data and/or data derived from the environment information data" to the learned data as input, the environmental state recognition unit 25 can obtain "data regarding the state of the surrounding environment for user 1" as output.
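The input/output contract just described (environment information data in, surrounding-environment state out) can be mimicked with a trivial stand-in for the learned data. This is purely illustrative: the class, the feature names, and the linear score are assumptions, and a real implementation would run a trained neural network in place of the toy rule.

```python
# Illustrative stub of the recognition contract: learned parameters are applied
# to environment features to produce "data regarding the state of the
# surrounding environment". The linear score stands in for a real model.

class EnvironmentStateRecognizer:
    def __init__(self, learned_params):
        # learned_params stands in for trained weights obtained by deep learning.
        self.params = learned_params

    def infer(self, env_features):
        # A real inference program would evaluate a neural network here; a
        # weighted sum plus threshold merely shows the data flow.
        score = sum(self.params.get(k, 0.0) * v for k, v in env_features.items())
        return {"attention_required": score >= 1.0, "score": score}

recognizer = EnvironmentStateRecognizer({"approaching_vehicle": 1.0, "on_roadway": 0.8})
print(recognizer.infer({"approaching_vehicle": 1.0, "on_roadway": 0.0}))
```

Whatever model is used, the environmental state recognition unit 25 only needs this mapping from input features to an output state for the notification unit 26 to act on.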
- The notification unit 26 notifies user 1 according to the "state of the surrounding environment for user 1" recognized by the environmental state recognition unit 25.
- The method by which the notification unit 26 notifies user 1 is not limited.
- The notification unit 26 attached to user 1 can typically notify user 1 through the senses of user 1 (for example, at least one of the auditory, visual, and tactile senses).
- As an example, the notification unit 26 can notify user 1 of the "state of the surrounding environment for user 1" by means of voice or music provided to user 1 via speakers or earphones.
- Auditory notification enables prompt and intuitive notification of user 1 and basically does not impair the visual information available to user 1.
- In particular, when stereophonic sound technology is used, the notification unit 26 can provide new effects such as enabling user 1 to recognize the position and direction of an object requiring attention and inducing reflexive safety behavior by user 1.
- An example of the use of stereophonic technology in the notification unit 26 will be described later.
- The notification unit 26 can also notify user 1 of the "state of the surrounding environment for user 1" by means of an image provided to user 1 through glasses (for example, AR (Augmented Reality) glasses).
- For example, the notification unit 26 can display an image showing the state of the surrounding environment for user 1 on the glasses.
- Alternatively, the notification unit 26 may turn off the image display on the glasses so that user 1 can directly see the surrounding environment.
- The notification unit 26 can also notify user 1 of the "state of the surrounding environment for user 1" by haptic changes such as vibration provided to user 1 via a haptics device (for example, a vibration device or a pressure-variable device).
- FIG. 3 is a diagram showing an example of the functional configuration of the environmental state notification device 10.
- In this example, the environment information acquisition unit 24 includes an imaging device 31 that acquires an image of the environment around user 1, and has an image sensor (for example, a CMOS (Complementary Metal Oxide Semiconductor) sensor) as a wave sensor.
- The imaging device 31 transmits the captured image of the surrounding environment to the environmental state recognition unit 25 as the environment information data D11.
- The environmental state recognition unit 25 has an image analysis device 32 that analyzes the captured image sent from the imaging device 31.
- The image analysis device 32 recognizes the "state of the surrounding environment for user 1" by analyzing the captured image. For example, the image analysis device 32 can recognize whether the captured image includes a moving object such as a vehicle approaching user 1, an unsafe area such as a roadway, and/or other objects to which pedestrians should pay attention.
- The specific image analysis method performed by the image analysis device 32 is not limited.
- The image analysis device 32 may recognize the types of objects around user 1 by performing recognition processing such as semantic segmentation.
- The image analysis device 32 may also perform self-position estimation and environment map (local map) creation using techniques such as SLAM (Simultaneous Localization and Mapping) based on the image data acquired by the imaging device 31.
- A local map can be created, for example, as a three-dimensional high-precision map or an occupancy grid map.
- A three-dimensional high-precision map can be created, for example, as a point cloud map, which will be described later.
- An occupancy grid map can be created by dividing the three-dimensional or two-dimensional space around user 1 into grids (lattices) of a predetermined size and showing the occupancy state of objects in grid units.
- The occupancy state of an object referred to here is indicated by, for example, the presence or absence of an object and its existence probability.
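The occupancy grid idea above can be sketched in a few lines: the 2D space around user 1 is divided into fixed-size cells, and each cell stores an existence probability. This sketch is illustrative only; the cell size, threshold, and coordinate convention are assumptions, not part of the disclosure.

```python
# Hypothetical 2D occupancy grid: space around the user is divided into cells
# of a predetermined size, each holding an object-existence probability.

class OccupancyGrid:
    def __init__(self, size_m=20.0, cell_m=0.5):
        self.cell_m = cell_m
        n = int(size_m / cell_m)
        self.prob = [[0.0] * n for _ in range(n)]  # existence probability per cell

    def mark(self, x_m, y_m, probability):
        # Convert a metric position (relative to the grid origin) to a cell index.
        i, j = int(y_m / self.cell_m), int(x_m / self.cell_m)
        self.prob[i][j] = max(self.prob[i][j], probability)

    def occupied(self, x_m, y_m, threshold=0.5):
        i, j = int(y_m / self.cell_m), int(x_m / self.cell_m)
        return self.prob[i][j] >= threshold

grid = OccupancyGrid()
grid.mark(3.2, 4.7, 0.9)        # e.g. a detected vehicle
print(grid.occupied(3.2, 4.7))  # True
print(grid.occupied(1.0, 1.0))  # False
```

Storing probabilities per cell rather than a binary flag is what allows the "presence or absence of an object and the existence probability" to be represented in the same structure.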
- The image analysis device 32 determines whether or not it is necessary to call user 1's attention based on the analysis result of the captured image.
- For example, when the captured image includes an object to which a pedestrian should pay attention, the image analysis device 32 may determine that it is necessary to call user 1's attention. On the other hand, if the captured image does not include such an object, the image analysis device 32 may determine that there is no need to call user 1's attention.
- The image analysis device 32 transmits a notification instruction signal D12 to the notification unit 26 depending on whether or not it is necessary to call user 1's attention.
- For example, when the image analysis device 32 determines that it is necessary to call user 1's attention, it transmits a notification instruction signal D12 indicating that determination to the notification unit 26.
- When the image analysis device 32 determines that it is not necessary to call user 1's attention, it may transmit a notification instruction signal D12 indicating that determination to the notification unit 26, or it may not send the notification instruction signal D12 to the notification unit 26 at all.
- That is, the image analysis device 32 may transmit the notification instruction signal D12 to the notification unit 26 regardless of whether or not it is necessary to call user 1's attention.
- Alternatively, the image analysis device 32 may transmit the notification instruction signal D12 to the notification unit 26 only when user 1 needs to be alerted.
- The notification unit 26 notifies user 1 according to the state of the surrounding environment based on the notification instruction signal D12 sent from the image analysis device 32.
- The notification unit 26 of this example includes an audio device, a vibration device, and/or a video device.
- When the notification unit 26 receives a notification instruction signal D12 sent when it is necessary to call user 1's attention, the notification unit 26 emits a notification (audio, vibration, and/or video) toward user 1 that calls user 1's attention.
- When the notification unit 26 receives a notification instruction signal D12 sent when it is not necessary to call user 1's attention, the notification unit 26 may issue no notification to user 1, or may issue a notification indicating that there is no need for attention. Further, the notification unit 26 may refrain from issuing notifications to user 1 while it is not receiving the notification instruction signal D12 from the image analysis device 32.
- In this way, the environmental state notification device 10 of this example can notify user 1 of the existence of a target requiring attention in the surrounding environment based on the captured image data acquired by the imaging device 31 of the wearable device 11. Based on such notification, user 1 can grasp the state of the surrounding environment and take appropriate action as necessary.
- The environmental state notification device 10 of this example is thus configured as a pedestrian safety protection device that improves the safety of pedestrians (user 1).
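The flow just described for FIG. 3 (analysis result → notification instruction signal D12 → notification action) can be sketched as follows. The sketch is illustrative only: the signal's field names, the `alert`/`all_clear` action names, and the modality handling are assumptions, not the disclosed signal format.

```python
# Illustrative sketch of the FIG. 3 notification flow: the image analysis side
# builds a notification instruction signal (D12), and the notification unit
# side chooses an action from it. Field and action names are assumptions.

def make_d12(objects_requiring_attention):
    """Image analysis side: build the notification instruction signal D12."""
    return {"attention_required": len(objects_requiring_attention) > 0,
            "objects": objects_requiring_attention}

def notify(d12):
    """Notification unit side: choose an action from the D12 signal."""
    if d12 is None:
        return None                       # no signal received -> no notification
    if d12["attention_required"]:
        return ("alert", d12["objects"])  # audio / vibration / video alert
    return ("all_clear", [])              # optional "no attention needed" notice

print(notify(make_d12(["approaching vehicle"])))
print(notify(make_d12([])))
```

The `None` branch corresponds to the variant in which the image analysis device 32 sends D12 only when attention is required, and the `all_clear` branch to the variant in which D12 is sent either way.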
- FIG. 4 is a diagram showing another example of the functional configuration of the environmental state notification device 10.
- In this example, the environment information acquisition unit 24 includes an emitter (emission unit) 35 that emits detection electromagnetic waves or detection sound waves, and a receiver (wave sensor) 36 that receives the reflected detection electromagnetic waves or detection sound waves.
- The detection electromagnetic waves can be either light or radio waves.
- When the emitter 35 and the receiver 36 are configured as LiDAR or a ToF sensor, the emitter 35 emits light as detection electromagnetic waves toward the surrounding environment, and the receiver 36 receives the light reflected by the surrounding environment.
- When the emitter 35 and the receiver 36 are configured as a millimeter wave radar, the emitter 35 emits radio waves in the millimeter wave band as detection electromagnetic waves toward the surrounding environment, and the receiver 36 receives the radio waves reflected by the surrounding environment.
- The environment information acquisition unit 24 transmits the surrounding environment information based on the reflected detection electromagnetic waves or reflected detection sound waves received by the receiver 36 to the environmental state recognition unit 25 as the environment information data D11.
- The environmental state recognition unit 25 of this example is equipped with a received data analysis device 37.
- The received data analysis device 37 recognizes the "state of the surrounding environment for user 1" by analyzing the surrounding environment information based on the detection results (the reflected detection electromagnetic waves or reflected detection sound waves) of the receiver 36 sent from the environment information acquisition unit 24. The received data analysis device 37 then determines whether or not it is necessary to call user 1's attention based on the analysis result of the surrounding environment information.
- The received data analysis device 37 can recognize the same kinds of targets as the image analysis device 32 (see FIG. 3) and can likewise decide whether it is necessary to call user 1's attention. For example, the received data analysis device 37 may perform self-position estimation and environment map creation using techniques such as SLAM based on the detection results of the receiver 36.
- The received data analysis device 37 transmits a notification instruction signal D12 to the notification unit 26 depending on whether or not it is necessary to call user 1's attention.
- The notification unit 26 notifies user 1 of the state of the surrounding environment based on the notification instruction signal D12 sent from the received data analysis device 37.
- According to the environmental state notification device 10 of this example, it is possible to notify user 1 of the existence of a target requiring attention in the surrounding environment based on the surrounding environment information acquired by the emitter 35 and the receiver 36.
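For the emitter/receiver configuration of FIG. 4, the basic ToF-style computation is converting the round-trip time of the emitted light into a distance and comparing it against a caution threshold. The sketch below is illustrative only; the 5 m threshold and function names are assumptions, not values from the disclosure.

```python
# Hypothetical ToF-style distance computation for the FIG. 4 example: a
# round-trip time measured by the receiver is converted to a distance, and a
# warning is raised when a reflecting object is closer than a threshold.

SPEED_OF_LIGHT_M_S = 299_792_458.0

def distance_from_round_trip(t_seconds):
    # The emitted light travels to the object and back, so halve the path.
    return SPEED_OF_LIGHT_M_S * t_seconds / 2.0

def attention_needed(t_seconds, warn_within_m=5.0):
    return distance_from_round_trip(t_seconds) <= warn_within_m

# A reflection arriving after ~20 ns corresponds to roughly 3 m.
t = 20e-9
print(round(distance_from_round_trip(t), 2))  # ~3.0 (meters)
print(attention_needed(t))                    # True
```

A millimeter-wave radar variant works the same way in principle, since radio waves also propagate at the speed of light; an ultrasonic variant would substitute the speed of sound.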
- FIG. 5 is a diagram showing another example of the functional configuration of the environmental state notification device 10.
- In this example, the location information acquisition unit 40 acquires the location information of user 1.
- The location information acquisition unit 40 can acquire the location information of user 1 (strictly speaking, of the wearable device 11) from a GNSS (Global Navigation Satellite System) 42, for example.
- The location information acquisition unit 40 may also acquire the location information of user 1 by means other than the GNSS 42, for example via a beacon.
- the map information matching unit 41 acquires the environment information data D11 based on the location information of the user 1 acquired by the location information acquisition unit 40 and the map information.
- the map information matching unit 41 of this example acquires map information from the map information providing device 43 .
- the map information providing device 43 may be provided as part of the wearable device 11 or may be provided separately from the wearable device 11 .
- the map information matching unit 41 may read map information from the memory 23 (see FIG. 2) of the wearable device 11 acting as the map information providing device 43 .
- the map information may be sent to the map information matching unit 41 from the map information providing device 43 (for example, a server) connected to the wearable device 11 via a network such as the Internet.
- the map information matching unit 41 may acquire from the map information providing device 43 map information of a limited range determined based on the location information of the user 1 acquired by the location information acquisition unit 40 .
- the map information matching unit 41 may send information necessary for acquiring map information of such a limited range (for example, position information of the user 1) to the map information providing device 43.
- the map information providing device 43 may transmit the map information of the limited range to the map information matching unit 41 based on the information sent from the map information matching unit 41.
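- the exchange described above (the matching unit sends the user's position, and the providing device returns only the nearby portion of the map) can be sketched as follows; the feature list `MAP_FEATURES`, the tuple format, and the 2 km default radius are illustrative assumptions of this sketch, not part of the disclosure.

```python
import math

# Hypothetical map store held by the map information providing device 43:
# each feature is (name, latitude, longitude).
MAP_FEATURES = [
    ("crosswalk A", 35.6595, 139.7005),
    ("station exit", 35.6581, 139.7017),
    ("roadway B", 35.6900, 139.7500),
]

def distance_m(lat1, lon1, lat2, lon2):
    """Rough planar distance in meters (adequate for city-scale ranges)."""
    dy = (lat2 - lat1) * 111_320.0
    dx = (lon2 - lon1) * 111_320.0 * math.cos(math.radians(lat1))
    return math.hypot(dx, dy)

def limited_range_map(user_lat, user_lon, radius_m=2000.0):
    """Return only the map features within radius_m of the user's position,
    as the providing device would in response to the matching unit's request."""
    return [f for f in MAP_FEATURES
            if distance_m(user_lat, user_lon, f[1], f[2]) <= radius_m]
```

- restricting the transfer to this limited range keeps the map payload small, which matters for a battery-powered wearable device.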
- the map information acquired by the map information matching unit 41 may be a three-dimensional high-precision map, for example, a high-precision map such as a dynamic map, a point cloud map, or a vector map.
- a dynamic map is a map including, for example, four layers of dynamic information, quasi-dynamic information, quasi-static information, and static information, as will be described later.
- a point cloud map is a map composed of a point cloud (point cloud data).
- a vector map is, for example, a map in which traffic information such as lanes and traffic signal positions are associated with a point cloud map.
- the point cloud map and the vector map may be provided from an external device such as a server, or may be provided by the wearable device 11 (for example, the environment information acquisition unit 24 or the environmental state recognition unit 25).
- the environment information acquisition unit 24 transmits the environment information data D11 thus acquired (for example, information data in which the position of the user 1 is reflected on the map information) to the environment state recognition unit 25.
- the environmental state recognition unit 25 includes a position information analysis device 44 that analyzes the environment information data D11 sent from the environment information acquisition unit 24.
- the location information analysis device 44 recognizes "the state of the surrounding environment for the user 1" by analyzing the environment information data D11 based on the location information of the user 1 and the map information. As an example, recognition of whether or not the user 1 is located in an area requiring attention (for example, the roadway or the vicinity of the roadway) may be included in the recognition of the "state of the surrounding environment for the user 1".
- the position information analysis device 44 determines whether or not it is necessary to call attention to the user 1 based on the analysis result of the environment information data D11.
- the location information analysis device 44 may determine that it is necessary to call attention to the user 1, for example, when the user 1 is located in an area that requires attention. On the other hand, if the user 1 is not located in an area that requires attention, the location information analysis device 44 may determine that the user 1 does not need to be alerted.
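- the decision made by the position information analysis device 44 can be sketched as follows, assuming (hypothetically) that areas requiring attention are held as axis-aligned boxes in local map coordinates; real map data would use polygons.

```python
# Hypothetical caution areas as axis-aligned boxes (min_x, min_y, max_x, max_y)
# in local map coordinates; a real system would use map polygons instead.
CAUTION_AREAS = {
    "roadway": (0.0, 0.0, 30.0, 8.0),
    "roadway margin": (0.0, 8.0, 30.0, 10.0),
}

def in_caution_area(x, y):
    """Return the name of the caution area containing (x, y), or None."""
    for name, (x0, y0, x1, y1) in CAUTION_AREAS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def needs_alert(x, y):
    """Mirror of the analysis device's decision: alert exactly when the
    user's position falls inside an area requiring attention."""
    return in_caution_area(x, y) is not None
```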
- the position information analysis device 44 transmits a notification instruction signal D12 to the notification unit 26 depending on whether or not it is necessary to call attention to the user 1.
- the notification unit 26 notifies the user 1 according to the state of the surrounding environment based on the notification instruction signal D12 sent from the position information analysis device 44.
- with the environmental state notification device 10 of this example, it is possible to notify the user 1 of the existence of a target requiring attention in the surrounding environment based on the location information of the user 1 and the map information.
- FIG. 6 is a diagram showing another example of the functional configuration of the wearable device 11.
- the wearable device 11 shown in FIG. 6 includes an attachment state detection unit 27 in addition to the attachment unit 21, the control unit 22, the memory 23, the environment information acquisition unit 24, the environment state recognition unit 25, and the notification unit 26 described above.
- the attachment state detection unit 27 detects the attachment state of the environment information acquisition unit 24 to the user 1 under the control of the control unit 22. That is, the attachment state detection unit 27 can detect the attachment state of the environment information acquisition unit 24 based on the environment information data acquired by the environment information acquisition unit 24 and/or the state of the surrounding environment for the user 1 recognized by the environment state recognition unit 25.
- when the environment information data acquired by the environment information acquisition unit 24 falls within the data range expected in normal operation, the attachment state detection unit 27 may detect that the environment information acquisition unit 24 is properly attached to the user 1. On the other hand, if the environment information data is not included in that expected data range, the attachment state detection unit 27 may detect that the environment information acquisition unit 24 is not properly attached to the user 1.
- similarly, when the state of the surrounding environment recognized by the environment state recognition unit 25 is consistent with proper attachment, the attachment state detection unit 27 may detect that the environment information acquisition unit 24 is properly attached; otherwise, it may detect that the environment information acquisition unit 24 is not properly attached.
- the attachment state detection unit 27 may further detect the type of attachment failure based on the environment information data acquired by the environment information acquisition unit 24 and/or the state of the surrounding environment for the user 1 recognized by the environment state recognition unit 25.
- the attachment state detection unit 27 may detect a defect in the location where the environment information acquisition unit 24 is attached. That is, when the data obtained from the environment information acquisition unit 24 and the environment state recognition unit 25 indicate that the attachment location of the environment information acquisition unit 24 is inappropriate for acquiring the environment information data, the attachment state detection unit 27 can detect a defect in the attachment location of the environment information acquisition unit 24.
- for example, when the data obtained from the environment information acquisition unit 24 and the environmental state recognition unit 25 indicate that the environment information acquisition unit 24 (for example, a sensor) is blocked and therefore inappropriate for acquiring environment information data, a defect in the attachment location of the environment information acquisition unit 24 may be detected.
- likewise, when the data obtained from the environment information acquisition unit 24 and the environmental state recognition unit 25 indicate that the detection direction (orientation) of the environment information acquisition unit 24 (for example, a sensor) is inappropriate, a defect in the attachment of the environment information acquisition unit 24 may be detected.
- for example, when the environment information acquisition unit 24 is originally intended to detect the surrounding environment in front of the user 1 but actually detects the surrounding environment behind the user 1, a faulty attachment may be detected.
- whether or not the detection direction of the environment information acquisition unit 24 is inappropriate may be determined, for example, based on information indicating the direction of travel of the user 1 derived from the environment information data D11 (images, etc.) acquired by the environment information acquisition unit 24.
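- the two detections described above (sensor data outside the range expected in normal operation, and a detection direction that disagrees with the user's direction of travel) might be sketched like this; the light-level range, the 80% threshold, and the 60-degree tolerance are assumptions for illustration only.

```python
EXPECTED_LUX_RANGE = (5.0, 100_000.0)  # hypothetical normal-operation range

def attachment_ok(samples, expected=EXPECTED_LUX_RANGE):
    """Detect whether the sensor is properly attached: readings stuck outside
    the expected range (e.g. near zero when the sensor is covered by
    clothing) indicate an improper attachment state."""
    lo, hi = expected
    in_range = [lo <= s <= hi for s in samples]
    # Require most samples in range; a few outliers are tolerated.
    return sum(in_range) >= 0.8 * len(samples)

def direction_ok(user_heading_deg, sensor_heading_deg, tol_deg=60.0):
    """Detect an inappropriate detection direction: a forward-facing sensor
    should roughly share the user's direction of travel."""
    diff = abs((user_heading_deg - sensor_heading_deg + 180.0) % 360.0 - 180.0)
    return diff <= tol_deg
```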
- the notification unit 26 notifies the user 1 according to the detection result of the attachment state detection unit 27.
- for example, when the attachment state detection unit 27 detects that the attachment state of the environment information acquisition unit 24 is inappropriate, the notification unit 26 issues a notification to that effect to the user 1.
- on the other hand, when the attachment state is appropriate, the notification unit 26 may notify the user 1 that the attachment state of the environment information acquisition unit 24 is appropriate, or may issue no notification at all.
- with the environmental state notification device 10 of this example, it is possible to notify the user 1 of the attachment state of the environment information acquisition unit 24, and to encourage the user 1 to attach the environment information acquisition unit 24 appropriately.
- FIG. 7 is a diagram showing another example of the functional configuration of the wearable device 11.
- the wearable device 11 shown in FIG. 7 includes a user state information acquisition unit 28 in addition to the attachment unit 21, the control unit 22, the memory 23, the environment information acquisition unit 24, the environment state recognition unit 25, the notification unit 26, and the attachment state detection unit 27.
- the user status information acquisition unit 28 acquires user status data indicating information about the status of user 1.
- the specific information included in the user state data is not limited, but includes information about the state of user 1 that can affect the state of the surrounding environment for user 1. As an example, information regarding the speed and/or acceleration of movement of user 1 may be included in the user state data.
- the user state information acquisition unit 28 may include a sensor that detects at least one of the speed and acceleration of movement of the user 1 .
- the sensors included in the user state information acquisition unit 28 are not limited.
- the user state information acquisition unit 28 may include a speed sensor, an acceleration sensor, an angular velocity sensor (gyro sensor), and/or an inertial measurement unit (IMU) integrating them.
- the user state information acquisition unit 28 can also include a functional configuration unit that acquires other information indicating movement of the user 1 (see, for example, the position information acquisition unit 40 and the map information collation unit 41 shown in FIG. 5).
- the environment state recognition unit 25 recognizes the state of the surrounding environment for the user 1 based on the environment information data acquired by the environment information acquisition unit 24 and the user state data acquired by the user state information acquisition unit 28 .
- for example, the environmental state recognition unit 25 may predict the travel route of the user 1 based on the user state data, and recognize the state of the surrounding environment for the user 1 based on the predicted travel route and the environment information data. For example, the environmental state recognition unit 25 can recognize whether or not the predicted movement route of the user 1 approaches or passes through an area or object requiring the attention of the user 1.
- in this way, the environmental state recognition unit 25 can recognize not only whether the current situation of the user 1 can threaten the safety of the user 1, but also whether the predicted future situation of the user 1 is likely to threaten the safety of the user 1.
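- a minimal sketch of this prediction, assuming a constant-velocity model of the user state data and box-shaped caution areas (both simplifications of this sketch, not prescribed by the text):

```python
def predict_positions(x, y, vx, vy, horizon_s=3.0, step_s=0.5):
    """Constant-velocity prediction of the user's travel route from the user
    state data (position plus velocity), sampled every step_s seconds."""
    t, out = 0.0, []
    while t <= horizon_s:
        out.append((x + vx * t, y + vy * t))
        t += step_s
    return out

def route_enters_area(route, area):
    """True if any predicted position falls inside the caution area
    (an axis-aligned box (x0, y0, x1, y1) in this sketch)."""
    x0, y0, x1, y1 = area
    return any(x0 <= px <= x1 and y0 <= py <= y1 for px, py in route)
```

- a warning triggered on `route_enters_area` fires before the user actually reaches the area, which is the point of recognizing the predicted future situation rather than only the current one.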
- the notification unit 26 notifies the user 1 according to the "state of the surrounding environment for the user 1" recognized by the environmental state recognition unit 25.
- with the environmental state notification device 10 of this example, it is possible to notify the user 1 of the existence of an object requiring attention in the surrounding environment at present and/or in the future based on the user state data.
- FIG. 8 is a flow chart showing an example of the environmental state notification method.
- first, the environment information acquisition unit 24 starts acquiring environment information data indicating information about the surrounding environment of the user 1, and the user state information acquisition unit 28 starts acquiring user state data indicating the state of the user 1 (S11 in FIG. 8).
- then the attachment state of the wearable device 11 including the environment information acquisition unit 24 is detected by the attachment state detection unit 27 (S12). Specifically, the attachment state detection unit 27 determines whether or not the attachment position of the wearable device 11 (especially the environment information acquisition unit 24) is appropriate (S13).
- the specific method of warning issued from the notification unit 26 at this time is not limited.
- the notification unit 26 can issue a warning to the user 1 through sound, video and/or vibration, similar to the notification to the user 1 described above.
- this makes the user 1 aware of situations in which the wearable device 11 (especially the environment information acquisition unit 24) is hidden by the body or clothing, the orientation of the environment information acquisition unit 24 is inappropriate, or the wearing position of the wearable device 11 is inappropriate.
- after the notification unit 26 issues a warning regarding an abnormality in the attachment state of the wearable device 11 including the environment information acquisition unit 24, the above steps S12 and S13 are repeated.
- on the other hand, if the attachment position is appropriate, the environmental state notification device 10 proceeds to the next step (i.e., S15).
- in this way, the user 1 is encouraged to attach the wearable device 11 correctly, and the environmental state notification device 10 does not proceed to the next step S15 until the wearable device 11 is properly attached to the user 1.
- in step S15, the environmental state recognition unit 25 recognizes the state of the surrounding environment for the user 1 based on the environment information data and the user state data.
- specifically, the environmental state recognition unit 25 recognizes the state of the surrounding environment of the user 1 and estimates the predicted course of the user 1 based on the environment information data and the user state data. More specifically, information (including position information) on the movement of a moving object such as a vehicle and information on areas requiring attention in the surrounding environment are acquired as environment information data in this example; however, any other information may be included in the environment information data.
- the environmental state recognition unit 25 determines whether or not the moving object is approaching the predicted course of user 1 (for example, the predicted position of user 1) (S16).
- if a moving object is approaching the predicted course of the user 1, the notification unit 26 issues a notification to warn the user 1 of the approach of the moving object (S17). As a result, the user 1 can recognize the approach of the moving object and take action as necessary.
- after the notification unit 26 warns the user 1 of the approach of the moving object, in this example, detection and determination of the attachment state of the wearable device 11 to the user 1 (S12 and S13) are performed again. Note that, after the notification unit 26 warns the user 1 of the approach of a moving object, recognition of the state of the surrounding environment and estimation of the predicted course of the user (S15) may instead be performed again.
- on the other hand, if no moving object is approaching the predicted course of the user 1, the environmental state recognition unit 25 determines whether or not the user 1 is approaching an area requiring caution (S18).
- if the user 1 is approaching an area requiring caution, the notification unit 26 issues a notification to the user 1 warning that the user 1 is approaching the area requiring attention (S19).
- after the notification unit 26 issues a warning to the user 1 that the user 1 is approaching an area requiring caution, the environmental state notification device 10 of this example returns to step S12 described above. However, the environmental state notification device 10 may return to step S15 without returning to step S12.
- on the other hand, if the user 1 is not approaching an area requiring caution, the notification unit 26 does not notify the user 1.
- the notification unit 26 notifies the user 1 according to the state of the surrounding environment for the user 1 recognized by the environmental state recognition unit 25 .
- the series of processes (S11 to S19) described above may be performed repeatedly. Also, part of the above-described series of processes (for example, S12 to S19 or S15 to S19) may be performed repeatedly.
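- the control flow of steps S12 to S19 above can be sketched as one pass of a loop, with the device-specific detection and recognition work injected as callables; the stub method names (`attachment_ok`, `moving_object_approaching`, `entering_caution_area`) are assumptions of this sketch, not terms from the disclosure.

```python
def notification_cycle(sensors, recognizer, notifier):
    """One pass of the flow of FIG. 8 (S12..S19).
    sensors.attachment_ok() -> bool                (S12/S13)
    recognizer.moving_object_approaching() -> bool (S15/S16)
    recognizer.entering_caution_area() -> bool     (S18)
    notifier(msg) delivers the warning to the user (S14/S17/S19)."""
    if not sensors.attachment_ok():
        notifier("warn: fix wearable attachment")    # attachment warning
        return "reattach"                            # repeat S12/S13 next pass
    if recognizer.moving_object_approaching():
        notifier("warn: moving object approaching")  # S17
        return "warned_moving_object"
    if recognizer.entering_caution_area():
        notifier("warn: entering caution area")      # S19
        return "warned_area"
    return "no_notification"                         # nothing to report
```

- repeating `notification_cycle` realizes the repetition of S11 to S19 (or a part thereof) described above.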
- FIG. 9 is a diagram showing another example of the functional configuration of the environmental state notification device 10.
- the wearable device 11 shown in FIG. 9 does not include the environmental state recognition unit 25, but includes a communication unit 29.
- the communication unit 29 communicates data with the external communication unit 55 of the external device 51 .
- External device 51 includes environmental state recognition section 25 in addition to external communication section 55 .
- the communication method between the communication unit 29 and the external communication unit 55 is not limited. Further, communication may be performed between the communication unit 29 and the external communication unit 55 using a plurality of communication methods complying with two or more communication standards.
- the communication unit 29 can communicate with the external communication unit 55 based on wireless communication methods such as 5G (5th generation mobile communication system), LTE (Long Term Evolution), and DSRC (Dedicated Short Range Communications).
- the external network with which the communication unit 29 communicates includes, for example, the Internet, a cloud network, or a provider's own network.
- the communication method that the communication unit 29 performs with the external network is not particularly limited as long as it is a wireless communication method that enables digital two-way communication at a communication speed of a predetermined value or more and a distance of a predetermined value or more.
- the communication unit 29 can communicate with a communication device existing near the communication unit 29 (that is, the wearable device 11) using P2P (Peer To Peer) technology.
- communication devices that exist near the wearable device 11 include, for example, communication devices mounted on moving bodies such as vehicles and bicycles, communication devices installed at fixed positions in shops and the like, and MTC (Machine Type Communication) terminals.
- the communication unit 29 may receive electromagnetic waves transmitted by a road traffic information communication system (VICS (Vehicle Information and Communication System) (registered trademark)) such as radio wave beacons, optical beacons, and FM multiplex broadcasting.
- the communication unit 29 and the external communication unit 55 may be directly connected via wireless communication that assumes relatively short-range communication, such as near field communication (NFC) or Bluetooth (registered trademark).
- the environment information data D11 acquired by the environment information acquisition unit 24 and the user state data D13 acquired by the user state information acquisition unit 28 are sent to the environmental state recognition unit 25 via the communication unit 29 and the external communication unit 55.
- the environmental state recognition unit 25 recognizes the state of the surrounding environment for the user 1 based on the sent environmental information data D11 and user state data D13.
- the environmental state recognition unit 25 sends a notification instruction signal D12 corresponding to the recognized state of the surrounding environment for the user 1 to the notification unit 26 via the external communication unit 55 and the communication unit 29 .
- the notification unit 26 notifies the user 1 according to the sent notification instruction signal D12.
- the notification instruction signal D12 may be sent directly from the external device 51 to the notification unit 26 via the communication unit 29, or may be sent to the notification unit 26 via another functional configuration unit (for example, the control unit 22) of the wearable device 11.
- FIG. 10 is a diagram showing another example of the functional configuration of the environmental state notification device 10.
- the configuration of the wearable device 11 shown in FIG. 10 is basically the same as the wearable device 11 shown in FIG. 7 described above.
- the notification unit 26 in this example also notifies the external device 52 (second external device).
- the external device 52 is typically a mobile object (including a passenger) such as a vehicle, but is not limited.
- the notification method from the notification unit 26 to the external device 52 is not limited.
- the notification unit 26 can notify the external device 52 by, for example, emitting sound or light (for example, warning sound or warning light emission).
- the notification unit 26 may also send a notification signal to the external device 52 via the communication unit (not shown in FIG. 10) of the wearable device 11.
- the network between the notification unit 26 and the external device 52 is, for example, a CAN (Controller Area Network), a LIN (Local Interconnect Network), or a LAN (Local Area Network). Such a network can be configured by a vehicle-mounted communication network or a bus complying with FlexRay (registered trademark), Ethernet (registered trademark), or other digital two-way communication standards.
- the notification signal sent from the notification unit 26 to the external device 52 may contain arbitrary information. For example, warning information, user 1 location information, environmental information data, user state data, and/or map information may be sent from the notification unit 26 to the external device 52 .
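- one possible shape for such a notification signal, sketched with hypothetical field names (the text only says which kinds of information the signal may carry, not how it is structured):

```python
from dataclasses import dataclass, field, asdict
from typing import Optional

@dataclass
class NotificationSignal:
    """Illustrative payload for the signal sent from the notification unit 26
    to the external device 52; field names are assumptions of this sketch."""
    warning: str
    user_position: Optional[tuple] = None       # location information of user 1
    environment_data: dict = field(default_factory=dict)  # environment information data
    user_state: dict = field(default_factory=dict)        # user state data

def encode(signal: NotificationSignal) -> dict:
    """Flatten the signal into a plain dict for transmission over the network
    between the notification unit 26 and the external device 52."""
    return asdict(signal)
```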
- the notification unit 26 notifies both the user 1 and the external device 52, so that the safety of the user 1 is further improved.
- FIG. 11 is a diagram showing another example of the functional configuration of the environmental state notification device 10.
- the wearable device 11 shown in FIG. 11 includes a communication unit 29 that communicates data with an external device (third external device) 53 .
- the communication unit 29 of this example sends to the external device 53 the surrounding environmental state data D14 indicating the state of the surrounding environment for the user 1 recognized by the environmental state recognition unit 25 . Thereby, the external device 53 can grasp the state of the surrounding environment for the user 1 .
- the external device 53 is not limited.
- the external device 53 can be configured as a mobile terminal owned by a guardian of the user 1 and/or a terminal device owned by a company (for example, an insurance company or a security company). In this case, the state of the environment surrounding the user 1 can be notified to the guardian or the company, and the safety of the user 1 can be further improved.
- FIG. 12 is a diagram showing another example of the functional configuration of the environmental state notification device 10.
- the configuration of the wearable device 11 shown in FIG. 12 is basically the same as the wearable device 11 shown in FIG. 11 described above.
- the communication unit 29 of this example performs data communication with the external device (first external device) 54 and receives the environment information data D11 and/or the user state data D13 sent from the external device 54 .
- the external device 54 may send information about the attribute of the mobile object to the communication unit 29 as the environment information data D11.
- the attribute of the moving body referred to here corresponds to, for example, information regarding whether or not the external device 54 is a moving body equipped with a safety support device capable of recognizing the user 1, and may be other information that the external device 54 can provide.
- based on the environment information data acquired by the environment information acquisition unit 24 and the environment information data and/or user state data sent from the external device 54 via the communication unit 29, the environmental state recognition unit 25 recognizes the state of the surrounding environment for the user 1.
- in this way, the state of the surrounding environment for the user 1 can be recognized based on the environment information data acquired by both the wearable device 11 and the external device 54. The state of the surrounding environment for the user 1 can also be recognized based on the environment information data and user state data acquired by the wearable device 11 together with the user state data acquired by the external device 54. Therefore, it is possible to recognize the state of the surrounding environment for the user 1 more accurately, and the safety of the user 1 is further improved.
- FIG. 13 is a diagram showing another example of the functional configuration of the environmental state notification device 10.
- although the attachment unit 21, the control unit 22, and the memory 23 are not shown in FIG. 13, the wearable device 11 shown in FIG. 13 may include them.
- the wearable device 11 of this example includes an external sensor 61 and an internal sensor 63 .
- the external sensor 61 detects the state of the user 1's surrounding environment.
- External sensors 61 may therefore include, for example, cameras, LiDAR, ToF and/or millimeter wave radars.
- the internal sensor 63 detects the state of the user 1.
- the internal sensor 63 may thus include, for example, a GNSS and/or an IMU.
- the data detected by the external world sensor 61 is analyzed by the external world information analysis unit 62, and the information of the surrounding environment of the user 1 is acquired by the external world information analysis unit 62. Information on the surrounding environment of the user 1 thus obtained is sent from the external world information analysis section 62 to the environment state recognition section 25 .
- the data detected by the internal world sensor 63 is analyzed by the internal world information analysis unit 64, and information on the state of the user 1 is acquired by the internal world information analysis unit 64.
- the internal world information analysis unit 64 can calculate the moving direction and/or moving distance of the user 1 from the detection data of the internal world sensor 63 based on, for example, PDR (Pedestrian Dead Reckoning), and obtain information on the movement route of the user 1. Information about the state of the user 1 thus obtained is sent from the internal world information analysis unit 64 to the environmental state recognition unit 25.
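- a minimal PDR sketch of this calculation, accumulating assumed per-step heading changes and step lengths into a movement route (a real implementation would derive both quantities from IMU step detection and heading estimation):

```python
import math

def pdr_track(start_xy, heading_deg, steps):
    """Minimal Pedestrian Dead Reckoning: integrate per-step heading changes
    and step lengths into a movement route.
    'steps' is a list of (heading_change_deg, step_length_m) per detected step."""
    x, y = start_xy
    heading = heading_deg
    route = [(x, y)]
    for dheading, length in steps:
        heading += dheading                             # update walking direction
        x += length * math.cos(math.radians(heading))   # advance one step
        y += length * math.sin(math.radians(heading))
        route.append((round(x, 6), round(y, 6)))
    return route
```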
- Information received by the communication unit 29 from a device external to the wearable device 11 is analyzed by the communication information analysis unit 65, and the communication analysis information is acquired by the communication information analysis unit 65.
- the communication analysis information thus obtained is sent from the communication information analysis section 65 to the environmental state recognition section 25 .
- information that does not require analysis may be transmitted from the communication unit 29 to the environmental state recognition unit 25 via the communication information analysis unit 65, or may skip the communication information analysis unit 65 and be transmitted directly to the environmental state recognition unit 25.
- the external device that communicates with the communication unit 29 is not limited.
- the communication unit 29 can, for example, communicate data with servers, infrastructure equipment (eg, base stations and access points), and/or mobile units (eg, vehicle controllers).
- Information received by the communication unit 29 is not limited, and the communication unit 29 can receive, for example, map information, infrastructure information, and vehicle information.
- the communication information analysis unit 65 may transmit and receive data with the external world information analysis unit 62 and the internal world information analysis unit 64.
- the communication information analysis unit 65 may transmit map information received via the communication unit 29 to the internal world information analysis unit 64 .
- the internal world information analysis unit 64 may acquire information about the state of the user 1 by using map matching technology based on the map information received from the communication information analysis unit 65, for example.
- alternatively, information regarding the state of the user 1 may be sent from the internal world information analysis unit 64 to the communication information analysis unit 65, and the communication information analysis unit 65 may acquire the information regarding the state of the user 1 using map matching technology.
- the highly accurate information regarding the state of the user 1 thus obtained is sent to the environmental state recognition section 25 .
- the environmental state recognition unit 25 of this example acquires the dynamic map of the surrounding environment of the user 1 from the dynamic map providing unit 66 .
- the dynamic map providing unit 66 is provided outside the wearable device 11 and can be configured by, for example, a server.
- a dynamic map is a database-like map containing high-precision 3D geospatial information (basic map information) and additional map information, and is sometimes called a local dynamic map.
- a dynamic map typically includes multiple layers depending on how often the information is updated. For example, a permanent static data layer, a transient static data layer, a transient dynamic data layer, and a highly dynamic data layer are included in the dynamic map.
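- the four layers can be sketched as a simple keyed structure; the layer contents shown here are illustrative placeholders of this sketch, not items enumerated by the text.

```python
# Layered dynamic-map sketch: the keys mirror the four update-frequency
# layers named in the text; the values are illustrative placeholders.
dynamic_map = {
    "permanent_static":  {"road_geometry": "3D lane shapes", "signs": "fixed signs"},
    "transient_static":  {"roadworks": "planned construction zones"},
    "transient_dynamic": {"congestion": "traffic jam extents", "weather": "local rain"},
    "highly_dynamic":    {"vehicles": "moving object positions", "signals": "light phases"},
}

def layers_updated_at_least(map_, order=("permanent_static", "transient_static",
                                          "transient_dynamic", "highly_dynamic"),
                            min_layer="transient_dynamic"):
    """Return the layers refreshed at least as often as min_layer, i.e. the
    portion the recognition unit would re-query on every cycle."""
    idx = order.index(min_layer)
    return [name for name in order[idx:] if name in map_]
```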
- the environmental state recognition unit 25 recognizes the state of the surrounding environment for the user 1 based on the environmental information data, user state data and dynamic map obtained as described above.
- the notification unit 26 notifies the user 1 according to the state of the surrounding environment for the user 1 recognized by the environmental state recognition unit 25 .
- Information regarding the state of the surrounding environment for the user 1 recognized by the environmental state recognition unit 25 may be sent to the external device via the communication unit 29 .
- the notification unit 26 notifies the user 1 according to the state of the surrounding environment recognized by the environmental state recognition unit 25, but the content of the notification is not limited.
- the notification unit 26 can notify the user 1 of the state of the surrounding environment.
- the notification unit 26 can notify the user 1 that a mobile object such as a vehicle or bicycle is approaching the user 1.
- the notification unit 26 can notify the user 1 of the existence of a moving object exhibiting suspicious behavior.
- the notification unit 26 can notify the user 1 of the attributes of vehicles located near the user 1 .
- for example, if a vehicle located near the user 1 is equipped with a safety support device capable of recognizing the user 1, the notification unit 26 does not need to notify the user 1.
- on the other hand, if the notification unit 26 detects a vehicle that is not equipped with such a device, that vehicle can be notified to the user 1 as a vehicle requiring attention.
- the notification unit 26 can also notify the user 1 of the existence of a person exhibiting suspicious behavior. In this case, as shown in FIG. 11 described above, it is effective to transmit the information indicating the presence of such a person (that is, the surrounding environmental state data D14) to the external device 53 as well.
- the notification unit 26 can notify the user 1 of "the state of the surrounding environment for the user 1" related to the behavior of the user 1.
- the notification unit 26 can notify the user 1 that the user 1 is approaching or entering an area that requires attention.
- Such areas requiring caution may include, for example, not only areas that are inherently dangerous for the user 1 (e.g., roadways), but also areas that are potentially dangerous for the user 1.
- Whether or not the user 1 is approaching or entering an area requiring attention can be determined by the environmental state recognition unit 25 based on the detection results of the sensors of the environment information acquisition unit 24. It can also be determined by the environmental state recognition unit 25 based on the map information acquired by the environment information acquisition unit 24 and the location information of the user 1.
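As a rough illustration of the map-and-position determination described above, the following Python sketch classifies the user's position against a circular caution area. The circular geometry, the warning margin, and all coordinates are invented for the example; a real implementation would match against map polygons.

```python
import math

def classify_position(user_xy, area_center_xy, area_radius_m, margin_m=5.0):
    """Return 'entering', 'approaching', or 'clear' for a circular caution area."""
    d = math.dist(user_xy, area_center_xy)  # straight-line distance to the area center
    if d <= area_radius_m:
        return "entering"                   # user is inside the caution area
    if d <= area_radius_m + margin_m:
        return "approaching"                # user is within the warning margin
    return "clear"

print(classify_position((0.0, 0.0), (3.0, 4.0), 6.0))        # → entering
print(classify_position((0.0, 0.0), (3.0, 4.0), 4.0, 2.0))   # → approaching
```

A notification could then be triggered whenever the classification changes from "clear" to "approaching" or "entering".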
- the notification unit 26 can notify the user 1 that the user 1 has violated traffic rules. Whether or not the user 1 is violating the traffic rules can be determined from the sensor detection results of the environmental information acquisition unit 24 .
- For example, the image sensor of the environment information acquisition unit 24 captures an image of a traffic signal, and the environmental state recognition unit 25 analyzes the captured image to grasp the state of the signal, thereby making it possible to determine whether or not the user 1 is violating traffic rules. Also, based on the detection results of the sensors of the environment information acquisition unit 24, the environmental state recognition unit 25 can determine whether the location where the user 1 is located is a roadway, thereby determining whether the user 1 is violating traffic rules. Similarly, based on the map information acquired by the environment information acquisition unit 24 and the location information of the user 1, the environmental state recognition unit 25 can determine whether the location where the user 1 is located is a roadway, thereby determining whether or not the user 1 is violating traffic rules.
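The two determination paths above (signal state from image analysis, and location type from sensors or map matching) can be reduced to a simple decision rule. In the sketch below, both inputs are hypothetical placeholders; the actual image analysis and map matching are outside the scope of the example.

```python
def is_violating(signal_state, location_type, user_moving):
    """Toy traffic-rule check.

    signal_state: 'red', 'green', or None (assumed output of image analysis)
    location_type: e.g. 'roadway', 'crosswalk', 'sidewalk' (assumed map lookup)
    user_moving: whether the pedestrian is currently in motion
    """
    if signal_state == "red" and user_moving:
        return True   # moving through a crossing against a red signal
    if location_type == "roadway":
        return True   # pedestrian located on the roadway itself
    return False

print(is_violating("red", "crosswalk", True))   # → True
print(is_violating("green", "sidewalk", True))  # → False
```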
- the notification unit 26 can notify the user 1 that the user 1 has not performed sufficient safety confirmation actions, such as when the user 1 does not come to a temporary stop before crossing the road.
- the notification unit 26 can notify the user 1 that the user 1 is taking an action that is difficult for the vehicle driver to predict (for example, sudden acceleration, sudden deceleration, meandering, etc.).
- Whether or not the user 1 has performed sufficient safety confirmation actions, and whether or not the user 1 is taking an action that is difficult to predict, can be determined based on the detection results of the sensors of the environment information acquisition unit 24 and/or the user state information acquisition unit 28.
- For example, based on the detection results of a sensor that detects the speed and/or acceleration of the user 1, it is possible to determine whether the user 1 has performed sufficient safety confirmation actions and whether the user 1 is taking an action that is difficult to predict. These determinations can also be made based on the map information acquired by the environment information acquisition unit 24 and the location information of the user 1.
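One simple way to flag sudden acceleration or deceleration from the speed sensor data mentioned above is a threshold on the sample-to-sample acceleration. The sampling period and acceleration limit below are illustrative assumptions, not values from the disclosure.

```python
def detect_unpredictable(speeds_mps, dt_s=0.1, accel_limit=3.0):
    """Return indices of samples where |acceleration| exceeds accel_limit (m/s^2)."""
    flags = []
    for i in range(1, len(speeds_mps)):
        accel = (speeds_mps[i] - speeds_mps[i - 1]) / dt_s  # finite-difference acceleration
        if abs(accel) > accel_limit:
            flags.append(i)
    return flags

# A sudden stop between the third and fourth samples is flagged:
print(detect_unpredictable([1.4, 1.5, 1.5, 0.2]))  # → [3]
```

A production system would smooth the sensor signal before differencing, but the thresholding idea is the same.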
- the notification unit 26 can notify the user 1 of an abnormal attachment state of the wearable device 11 (in particular, the environment information acquisition unit 24 and/or the user state information acquisition unit 28).
- the notification unit 26 can notify the user 1 that the wearable device 11 is in an abnormal attachment state. Further, when it is determined, from the information acquired by the environment information acquisition unit 24 and/or the user state information acquisition unit 28, that the detection direction of the environment information acquisition unit 24 is inappropriate, the notification unit 26 can notify the user 1 of the abnormal attachment state of the wearable device 11.
- Stereophonic sound function: When the state of the surrounding environment for the user 1 requires the user 1 to quickly grasp that state or to take quick avoidance action, it is desirable for the notification unit 26 to give the user 1 a quick and intuitive notification.
- the notification unit 26 can perform notification based on stereophonic technology, for example.
- Stereophonic technology is technology that provides the user 1 with three-dimensional sound effects, for example, technology that reproduces pseudo-stereoscopic sound sources.
- the specific method and device configuration for realizing stereophonic technology are not limited.
- For example, the technology and devices called "360 REALITY AUDIO" (registered trademark) are also based on stereophonic technology.
- If the notification unit 26 simply issues a warning to the user 1, the user 1 may not be able to instantly recognize the situation and take appropriate action.
- In contrast, by notifying the user 1 with stereophonic sound, the notification unit 26 can prompt the user 1 to instantly recognize the situation and induce the user 1 to take reflexive action, which is advantageous for improving the safety of the user 1.
- A specific example of the stereophonic sound function is described below.
- the notification unit 26 may give directivity to the degree of emphasis of the notification sound (warning sound, etc.) according to the state of the surrounding environment for the user 1 .
- the notification unit 26 can notify the user 1 using a sound that is emphasized in the direction toward the user 1 from an object requiring attention (for example, a moving object such as a vehicle or an obstacle such as a utility pole).
- the notification unit 26 can notify the user 1 by acquiring the positional relationship between the object requiring attention and the user 1 and creating stereophonic sound corresponding to the positional relationship. Also, the notification unit 26 may change the volume, tempo and/or other characteristics of the emphasized sound provided to the user 1 according to the distance between the user 1 and the object requiring attention.
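The positional relationship described above can be mapped to simple rendering parameters: a nearer object gives a louder, faster alert, and the azimuth sets a left/right balance. Real stereophonic rendering (e.g., HRTF-based) is far more involved; the formulas and constants below are assumptions made only to sketch the idea.

```python
import math

def alert_params(user_xy, object_xy, heading_rad=0.0):
    """Derive toy stereo parameters from the object-to-user geometry."""
    dx = object_xy[0] - user_xy[0]
    dy = object_xy[1] - user_xy[1]
    distance = math.hypot(dx, dy)
    azimuth = math.atan2(dx, dy) - heading_rad   # 0 = straight ahead of the user
    pan = math.sin(azimuth)                      # -1 = full left, +1 = full right
    volume = min(1.0, 5.0 / max(distance, 0.5))  # louder when the object is closer
    tempo_bpm = 60 + 120 * volume                # faster warning beeps when closer
    return {"pan": round(pan, 2), "volume": round(volume, 2),
            "tempo_bpm": round(tempo_bpm)}

print(alert_params((0, 0), (5, 0)))  # object 5 m directly to the user's right
```

Feeding these parameters to an audio engine would make the warning sound appear to come from the direction of the object, with urgency rising as it approaches.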
- the notification unit 26 may acquire information about the positional relationship between the object requiring attention and the user 1 from the control unit 22 or an external device, or derive the information based on the environment information data and/or the user state data. good too.
- When the notification sound (including the emphasized sound) provided from the notification unit 26 to the user 1 is created by a functional configuration unit other than the notification unit 26 (for example, the control unit 22), the information about the positional relationship between the object requiring attention and the user 1 is sent to that other functional configuration unit. In this case, the information about the positional relationship between the object requiring attention and the user 1 need not be sent to the notification unit 26.
- By listening to the sound (especially the emphasized sound) emitted from the notification unit 26, the user 1 can quickly and intuitively grasp the positional relationship between the object requiring attention and the user 1.
- the notification unit 26 may provide the user 1 with a sound that induces a reflex action of the user 1 according to the state of the surrounding environment for the user 1 based on stereophonic technology.
- For example, the notification unit 26 can notify the user 1 using a sound that is emphasized in the direction from the area requiring attention toward the user 1 (e.g., an evocative sound).
- For example, the notification unit 26 may give a notification in which the user 1 suddenly hears the sound of a bat being swung overhead, thereby inducing the user 1 to crouch.
- the notification unit 26 acquires the positional relationship between the object requiring attention and the user 1, and generates stereophonic sound corresponding to the positional relationship, thereby making it possible to notify the user 1.
- Alternatively, a functional configuration unit other than the notification unit 26 may acquire the positional relationship between the object requiring attention and the user 1, create stereophonic sound data corresponding to that positional relationship, and send the data to the notification unit 26, thereby enabling the notification unit 26 to notify the user 1.
- the notification unit 26 may provide the notification sound for the user 1 to the user 1 by combining it with music according to the state of the surrounding environment for the user 1 .
- the music provided to the user 1 from the notification unit 26 may be changed according to the distance or relative speed of approach of the user 1 to the object or area requiring attention.
- For example, the notification unit 26 may vary the melody, instrumental sounds, tempo, and/or volume (for example, the volume of a specific instrumental sound) of the music provided to the user 1 according to the relationship between the user 1 and the object or area requiring attention. Further, the notification unit 26 may give directivity to the degree of emphasis of the music according to the state of the surrounding environment for the user 1.
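As a toy illustration of music that changes with the user's approach to an object or area requiring attention, the sketch below scales the tempo and a percussion gain with distance. The distance bands and mix parameters are invented for the example and are not taken from the disclosure.

```python
def adapt_music(distance_m):
    """Return illustrative playback parameters for the given distance (meters)."""
    if distance_m > 50:
        return {"tempo_scale": 1.0, "percussion_gain": 0.5}   # normal playback
    if distance_m > 10:
        # Ramp the percussion and tempo up linearly as the user closes in.
        t = (50 - distance_m) / 40
        return {"tempo_scale": 1.0 + 0.3 * t,
                "percussion_gain": 0.5 + 0.5 * t}
    return {"tempo_scale": 1.3, "percussion_gain": 1.0}       # urgent mix

print(adapt_music(60))  # far away: unmodified music
print(adapt_music(5))   # very close: fastest, most emphatic mix
```

The same scheme extends naturally to other musical characteristics (melody, a specific instrument's volume) mentioned in the text above.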
- As a result, the user 1 can quickly and intuitively grasp the state of the surrounding environment for the user 1, and can also experience music that changes according to the actual walking location, time, and surrounding conditions.
- the technical categories that embody the above technical ideas are not limited.
- the above technical ideas may be embodied by a computer program for causing a computer to execute one or more procedures (steps) included in the method of manufacturing or using the above apparatus.
- the above technical idea may be embodied by a computer-readable non-transitory recording medium in which such a computer program is recorded.
- The present disclosure can also take the following configurations.
- [Item 1] An environmental state notification device comprising: an environment information acquisition unit that is attached to a user and acquires environment information data indicating information about the surrounding environment of the user; an environmental state recognition unit that recognizes the state of the surrounding environment for the user based on the environment information data; and a notification unit that notifies the user according to the state of the surrounding environment for the user recognized by the environmental state recognition unit.
- [Item 2] The environmental state notification device according to item 1, wherein the environment information acquisition unit includes a wave sensor that acquires the environment information data by receiving at least one of electromagnetic waves and sound waves from the environment around the user.
- [Item 3] The environmental state notification device according to item 2, wherein the wave sensor acquires an image of the environment around the user.
- [Item 4] The environmental state notification device according to item 2 or 3, wherein the environment information acquisition unit includes an emission unit that emits detection electromagnetic waves or detection sound waves, and the wave sensor receives the reflected detection electromagnetic waves or detection sound waves.
- [Item 5] The environmental state notification device according to any one of items 1 to 4, wherein the environment information acquisition unit includes a position information acquisition unit that acquires position information of the user, and a map information matching unit that acquires the environment information data based on the position information and map information.
- [Item 6] The environmental state notification device according to item 5, wherein the position information acquisition unit acquires the position information of the user from a satellite positioning system.
- [Item 7] The environmental state notification device according to any one of items 1 to 6, comprising a communication unit that performs data communication, wherein the environmental state recognition unit recognizes the state of the surrounding environment for the user based on the environment information data acquired by the environment information acquisition unit and the environment information data sent from a first external device via the communication unit.
- [Item 8] The environmental state notification device according to any one of items 1 to 7, comprising a user state information acquisition unit that acquires user state data indicating the state of the user, wherein the environmental state recognition unit recognizes the state of the surrounding environment for the user based on the environment information data and the user state data.
- [Item 9] The environmental state notification device according to item 8, wherein the user state information acquisition unit includes a sensor that detects at least one of the speed and acceleration of movement of the user.
- [Item 10] The environmental state notification device according to item 8 or 9, wherein the environmental state recognition unit predicts the movement course of the user based on the user state data, and recognizes the state of the surrounding environment for the user based on the predicted movement course.
- [Item 11] The environmental state notification device according to any one of items 1 to 10, wherein the environmental state recognition unit acquires a dynamic map of the surrounding environment of the user, and recognizes the state of the surrounding environment for the user based on the environment information data and the dynamic map.
- [Item 12] The environmental state notification device according to any one of items 1 to 11, wherein the notification unit is attached to the user.
- [Item 13] The environmental state notification device according to item 12, wherein the notification unit notifies the user through at least one of auditory, visual, and tactile sensation.
- [Item 14] The environmental state notification device according to any one of items 1 to 13, wherein the notification unit notifies a second external device that is present in the surrounding environment of the user and is not attached to the user, according to the state of the surrounding environment for the user recognized by the environmental state recognition unit.
- [Item 15] The environmental state notification device according to item 14, wherein the second external device is a vehicle.
- [Item 16] The environmental state notification device according to any one of items 1 to 15, comprising a communication unit that performs data communication, wherein the communication unit sends data indicating the state of the surrounding environment for the user recognized by the environmental state recognition unit to a third external device.
- [Item 17] The environmental state notification device according to any one of items 1 to 16, comprising an attachment state detection unit that detects the attachment state of the environment information acquisition unit with respect to the user, wherein the notification unit notifies the user according to the detection result of the attachment state detection unit.
- [Item 18] An environmental state notification method comprising: a step of acquiring, by an environment information acquisition unit attached to a user, environment information data indicating information about the surrounding environment of the user; a step in which an environmental state recognition unit recognizes the state of the surrounding environment for the user based on the environment information data; and a step in which a notification unit notifies the user according to the state of the surrounding environment for the user recognized by the environmental state recognition unit.
- [Item 19] A program for causing a computer to execute: a procedure of acquiring, by an environment information acquisition unit attached to a user, environment information data indicating information about the surrounding environment of the user; a procedure in which an environmental state recognition unit recognizes the state of the surrounding environment for the user based on the environment information data; and a procedure in which a notification unit notifies the user according to the state of the surrounding environment for the user recognized by the environmental state recognition unit.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Human Computer Interaction (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Business, Economics & Management (AREA)
- Emergency Management (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
As described above, the notification unit 26 notifies the user 1 according to the state of the surrounding environment for the user 1 recognized by the environmental state recognition unit 25, but the content of the notification is not limited.
When the state of the surrounding environment for the user 1 requires the user 1 to quickly grasp that state or to take quick avoidance action, it is desirable for the notification unit 26 to give the user 1 a quick and intuitive notification.
The notification unit 26 may give directivity to the degree of emphasis of the notification sound (warning sound or the like) according to the state of the surrounding environment for the user 1.
The notification unit 26 may provide the user 1 with a sound that induces a reflexive action of the user 1, based on stereophonic sound technology, according to the state of the surrounding environment for the user 1.
The notification unit 26 may provide the notification sound for the user 1 to the user 1 by fusing it with music according to the state of the surrounding environment for the user 1.
It should be noted that the embodiments and modifications disclosed in this specification are merely examples in all respects and are not to be construed as limiting. The above-described embodiments and modifications can be omitted, replaced, and changed in various forms without departing from the scope and spirit of the appended claims. For example, the above-described embodiments and modifications may be combined in whole or in part, and embodiments other than those described above may be combined with the above-described embodiments or modifications. The effects of the present disclosure described in this specification are merely examples, and other effects may also be obtained.
An environmental state notification device comprising:
an environment information acquisition unit that is attached to a user and acquires environment information data indicating information about the surrounding environment of the user;
an environmental state recognition unit that recognizes the state of the surrounding environment for the user based on the environment information data; and
a notification unit that notifies the user according to the state of the surrounding environment for the user recognized by the environmental state recognition unit.
The environmental state notification device according to item 1, wherein the environment information acquisition unit includes a wave sensor that acquires the environment information data by receiving at least one of electromagnetic waves and sound waves from the environment around the user.
The environmental state notification device according to item 2, wherein the wave sensor acquires an image of the environment around the user.
The environmental state notification device according to item 2 or 3, wherein the environment information acquisition unit includes an emission unit that emits detection electromagnetic waves or detection sound waves, and the wave sensor receives the reflected detection electromagnetic waves or detection sound waves.
The environmental state notification device according to any one of items 1 to 4, wherein the environment information acquisition unit includes: a position information acquisition unit that acquires position information of the user; and a map information matching unit that acquires the environment information data based on the position information and map information.
The environmental state notification device according to item 5, wherein the position information acquisition unit acquires the position information of the user from a satellite positioning system.
The environmental state notification device according to any one of items 1 to 6, comprising a communication unit that performs data communication, wherein the environmental state recognition unit recognizes the state of the surrounding environment for the user based on the environment information data acquired by the environment information acquisition unit and the environment information data sent from a first external device via the communication unit.
The environmental state notification device according to any one of items 1 to 7, comprising a user state information acquisition unit that acquires user state data indicating the state of the user, wherein the environmental state recognition unit recognizes the state of the surrounding environment for the user based on the environment information data and the user state data.
The environmental state notification device according to item 8, wherein the user state information acquisition unit includes a sensor that detects at least one of the speed and acceleration of movement of the user.
The environmental state notification device according to item 8 or 9, wherein the environmental state recognition unit predicts the movement course of the user based on the user state data, and recognizes the state of the surrounding environment for the user based on the predicted movement course.
The environmental state notification device according to any one of items 1 to 10, wherein the environmental state recognition unit acquires a dynamic map of the surrounding environment of the user, and recognizes the state of the surrounding environment for the user based on the environment information data and the dynamic map.
The environmental state notification device according to any one of items 1 to 11, wherein the notification unit is attached to the user.
The environmental state notification device according to item 12, wherein the notification unit notifies the user through at least one of auditory, visual, and tactile sensation.
The environmental state notification device according to any one of items 1 to 13, wherein the notification unit notifies a second external device that is present in the surrounding environment of the user and is not attached to the user, according to the state of the surrounding environment for the user recognized by the environmental state recognition unit.
The environmental state notification device according to item 14, wherein the second external device is a vehicle.
The environmental state notification device according to any one of items 1 to 15, comprising a communication unit that performs data communication, wherein the communication unit sends data indicating the state of the surrounding environment for the user recognized by the environmental state recognition unit to a third external device.
The environmental state notification device according to any one of items 1 to 16, comprising an attachment state detection unit that detects the attachment state of the environment information acquisition unit with respect to the user, wherein the notification unit notifies the user according to the detection result of the attachment state detection unit.
An environmental state notification method comprising:
a step of acquiring, by an environment information acquisition unit attached to a user, environment information data indicating information about the surrounding environment of the user;
a step in which an environmental state recognition unit recognizes the state of the surrounding environment for the user based on the environment information data; and
a step in which a notification unit notifies the user according to the state of the surrounding environment for the user recognized by the environmental state recognition unit.
A program for causing a computer to execute:
a procedure of acquiring, by an environment information acquisition unit attached to a user, environment information data indicating information about the surrounding environment of the user;
a procedure in which an environmental state recognition unit recognizes the state of the surrounding environment for the user based on the environment information data; and
a procedure in which a notification unit notifies the user according to the state of the surrounding environment for the user recognized by the environmental state recognition unit.
10 environmental state notification device
11 wearable device
21 attachment unit
22 control unit
23 memory
24 environment information acquisition unit
25 environmental state recognition unit
26 notification unit
27 attachment state detection unit
28 user state information acquisition unit
29 communication unit
31 imaging device
32 image analysis device
35 emitter
36 receiver
37 received data analysis device
40 position information acquisition unit
41 map information matching unit
43 map information providing device
44 position information analysis device
51-54 external devices
55 external communication unit
61 external-world sensor
62 external-world information analysis unit
63 internal-world sensor
64 internal-world information analysis unit
65 communication information analysis unit
66 dynamic map providing unit
D11 environment information data
D12 notification instruction signal
D13 user state data
D14 surrounding environment state data
Claims (19)
- An environmental state notification device comprising: an environment information acquisition unit that is attached to a user and acquires environment information data indicating information about the surrounding environment of the user; an environmental state recognition unit that recognizes the state of the surrounding environment for the user based on the environment information data; and a notification unit that notifies the user according to the state of the surrounding environment for the user recognized by the environmental state recognition unit.
- The environmental state notification device according to claim 1, wherein the environment information acquisition unit includes a wave sensor that acquires the environment information data by receiving at least one of electromagnetic waves and sound waves from the environment around the user.
- The environmental state notification device according to claim 2, wherein the wave sensor acquires an image of the environment around the user.
- The environmental state notification device according to claim 2, wherein the environment information acquisition unit includes an emission unit that emits detection electromagnetic waves or detection sound waves, and the wave sensor receives the reflected detection electromagnetic waves or detection sound waves.
- The environmental state notification device according to claim 1, wherein the environment information acquisition unit includes: a position information acquisition unit that acquires position information of the user; and a map information matching unit that acquires the environment information data based on the position information and map information.
- The environmental state notification device according to claim 5, wherein the position information acquisition unit acquires the position information of the user from a satellite positioning system.
- The environmental state notification device according to claim 1, comprising a communication unit that performs data communication, wherein the environmental state recognition unit recognizes the state of the surrounding environment for the user based on the environment information data acquired by the environment information acquisition unit and the environment information data sent from a first external device via the communication unit.
- The environmental state notification device according to claim 1, comprising a user state information acquisition unit that acquires user state data indicating the state of the user, wherein the environmental state recognition unit recognizes the state of the surrounding environment for the user based on the environment information data and the user state data.
- The environmental state notification device according to claim 8, wherein the user state information acquisition unit includes a sensor that detects at least one of the speed and acceleration of movement of the user.
- The environmental state notification device according to claim 8, wherein the environmental state recognition unit predicts the movement course of the user based on the user state data, and recognizes the state of the surrounding environment for the user based on the predicted movement course.
- The environmental state notification device according to claim 1, wherein the environmental state recognition unit acquires a dynamic map of the surrounding environment of the user, and recognizes the state of the surrounding environment for the user based on the environment information data and the dynamic map.
- The environmental state notification device according to claim 1, wherein the notification unit is attached to the user.
- The environmental state notification device according to claim 12, wherein the notification unit notifies the user through at least one of auditory, visual, and tactile sensation.
- The environmental state notification device according to claim 1, wherein the notification unit notifies a second external device that is present in the surrounding environment of the user and is not attached to the user, according to the state of the surrounding environment for the user recognized by the environmental state recognition unit.
- The environmental state notification device according to claim 14, wherein the second external device is a vehicle.
- The environmental state notification device according to claim 1, comprising a communication unit that performs data communication, wherein the communication unit sends data indicating the state of the surrounding environment for the user recognized by the environmental state recognition unit to a third external device.
- The environmental state notification device according to claim 1, comprising an attachment state detection unit that detects the attachment state of the environment information acquisition unit with respect to the user, wherein the notification unit notifies the user according to the detection result of the attachment state detection unit.
- An environmental state notification method comprising: a step of acquiring, by an environment information acquisition unit attached to a user, environment information data indicating information about the surrounding environment of the user; a step in which an environmental state recognition unit recognizes the state of the surrounding environment for the user based on the environment information data; and a step in which a notification unit notifies the user according to the state of the surrounding environment for the user recognized by the environmental state recognition unit.
- A program for causing a computer to execute: a procedure of acquiring, by an environment information acquisition unit attached to a user, environment information data indicating information about the surrounding environment of the user; a procedure in which an environmental state recognition unit recognizes the state of the surrounding environment for the user based on the environment information data; and a procedure in which a notification unit notifies the user according to the state of the surrounding environment for the user recognized by the environmental state recognition unit.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/557,829 US20240220001A1 (en) | 2021-05-21 | 2022-03-01 | Environmental state notification device, environmental state notification method, and program |
JP2023522245A JPWO2022244372A1 (ja) | 2021-05-21 | 2022-03-01 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021086237 | 2021-05-21 | ||
JP2021-086237 | 2021-05-21 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022244372A1 true WO2022244372A1 (ja) | 2022-11-24 |
Family
ID=84140892
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/008481 WO2022244372A1 (ja) | 2021-05-21 | 2022-03-01 | 環境状態通知装置、環境状態通知方法及びプログラム |
Country Status (3)
Country | Link |
---|---|
US (1) | US20240220001A1 (ja) |
JP (1) | JPWO2022244372A1 (ja) |
WO (1) | WO2022244372A1 (ja) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016170854A1 (ja) * | 2015-04-22 | 2016-10-27 | ソニー株式会社 | 情報処理装置、情報処理方法、及びプログラム |
JP2017536595A (ja) * | 2014-09-26 | 2017-12-07 | ハーマン インターナショナル インダストリーズ インコーポレイテッド | 歩行者情報システム |
-
2022
- 2022-03-01 WO PCT/JP2022/008481 patent/WO2022244372A1/ja active Application Filing
- 2022-03-01 JP JP2023522245A patent/JPWO2022244372A1/ja active Pending
- 2022-03-01 US US18/557,829 patent/US20240220001A1/en active Pending
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017536595A (ja) * | 2014-09-26 | 2017-12-07 | ハーマン インターナショナル インダストリーズ インコーポレイテッド | 歩行者情報システム |
WO2016170854A1 (ja) * | 2015-04-22 | 2016-10-27 | ソニー株式会社 | 情報処理装置、情報処理方法、及びプログラム |
Also Published As
Publication number | Publication date |
---|---|
US20240220001A1 (en) | 2024-07-04 |
JPWO2022244372A1 (ja) | 2022-11-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9947215B2 (en) | Pedestrian information system | |
US20200241552A1 (en) | Using classified sounds and localized sound sources to operate an autonomous vehicle | |
EP2700032B1 (en) | A comprehensive and intelligent system for managing traffic and emergency services | |
US20180090009A1 (en) | Dynamic traffic guide based on v2v sensor sharing method | |
US10909759B2 (en) | Information processing to notify potential source of interest to user | |
CN111724616B (zh) | 基于人工智能的数据获取及共享的方法与装置 | |
WO2015184578A1 (en) | Adaptive warning management for advanced driver assistance system (adas) | |
KR20190011582A (ko) | 외부 이동 수단의 움직임과 관련된 데이터에 기반하여, 식별 정보가 변경된 외부 이동 수단을 확인하는 전자 장치 및 그 동작 방법 | |
US20180290590A1 (en) | Systems for outputting an alert from a vehicle to warn nearby entities | |
JP2017535008A (ja) | 道路交通利用者の移動モデルを形成するための方法及び装置 | |
JP2017527939A (ja) | 交通区域を監視する方法及び装置 | |
CN112363507A (zh) | 意图识别 | |
WO2018198926A1 (ja) | 電子機器、路側機、電子機器の動作方法および交通システム | |
JPWO2017163514A1 (ja) | 眼鏡型ウェアラブル端末、その制御方法および制御プログラム | |
US20220343757A1 (en) | Information processing apparatus, information processing system, and information processing method | |
CN113841100A (zh) | 自主行驶控制设备、自主行驶控制系统和自主行驶控制方法 | |
US11904893B2 (en) | Operating a vehicle | |
US11772640B2 (en) | Systems and methods for vehicular collision detection based on multi-dimensional collision models | |
JPWO2020039798A1 (ja) | 情報提供装置、情報提供方法、情報提供システム、コンピュータプログラム、及びデータ構造 | |
WO2022244372A1 (ja) | 環境状態通知装置、環境状態通知方法及びプログラム | |
WO2019117104A1 (ja) | 情報処理装置および情報処理方法 | |
CN114586081A (zh) | 信息处理设备、信息处理系统和信息处理方法 | |
Gupta et al. | Collision detection system for vehicles in hilly and dense fog affected area to generate collision alerts | |
WO2023204076A1 (ja) | 音響制御方法及び音響制御装置 | |
US20240122780A1 (en) | Autonomous vehicle with audio user guidance |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22804293 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023522245 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18557829 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 22804293 Country of ref document: EP Kind code of ref document: A1 |