WO2023243338A1 - Information processing device, information processing method, program, and information processing system - Google Patents

Information processing device, information processing method, program, and information processing system

Info

Publication number
WO2023243338A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
peripheral
audio
sensor
surrounding
Prior art date
Application number
PCT/JP2023/019250
Other languages
English (en)
Japanese (ja)
Inventor
正幸 横山
孝悌 清水
Original Assignee
ソニーグループ株式会社 (Sony Group Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニーグループ株式会社 (Sony Group Corporation)
Publication of WO2023243338A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S 15/88 Sonar systems specially adapted for specific applications
    • G01S 15/93 Sonar systems specially adapted for specific applications for anti-collision purposes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 Lidar systems specially adapted for specific applications
    • G01S 17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 21/00 Teaching, or communicating with, the blind, deaf or mute

Definitions

  • the present technology relates to an information processing device, an information processing method, a program, and an information processing system that can be applied to broadcasting information about the surrounding environment.
  • the reliability of the optical distance detector is calculated based on the difference between the distance information obtained from the optical distance detector and the distance information obtained from the ultrasonic distance detector. If the difference between the two pieces of distance information is large, it is determined that the reliability is low, and the distance information obtained from the optical distance detector is corrected using the distance information obtained from the ultrasonic distance detector.
  • the purpose of the present technology is to provide an information processing device, an information processing method, a program, and an information processing system that are capable of reporting information about the surrounding environment with high accuracy.
  • an information processing device includes a peripheral information acquisition section, an audio information generation section, and a notification control section.
  • the surrounding information acquisition unit obtains first surrounding information and second surrounding information regarding the surrounding environment based on detection results of one or more object detection sensors.
  • the audio information generation unit generates first audio information using first musical tone data for notifying the first peripheral information, based on the first peripheral information, and generates second audio information using second musical tone data for notifying the second peripheral information, based on the second peripheral information.
  • the notification control unit outputs both the first audio information and the second audio information.
  • first peripheral information and second peripheral information are acquired based on detection results of one or more object detection sensors. Then, first audio information is generated using the first musical tone data based on the first peripheral information, and second audio information is generated using the second musical tone data based on the second peripheral information. Both the first audio information and the second audio information are output. Thereby, it becomes possible to notify both the first surrounding information and the second surrounding information via audio, and it becomes possible to notify the user of information about the surrounding environment with high accuracy.
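  • as a rough illustration only, the following Python sketch mirrors this claimed flow; all names, values, and tone labels are hypothetical, since the patent specifies no implementation:

        # Minimal sketch of the acquire -> generate -> output flow described
        # above, with hypothetical names and placeholder tone data.

        def acquire_peripheral_info(sensor_readings):
            # First and second peripheral information derived from one or
            # more object detection sensors (here: plain distance values).
            return sensor_readings["laser_m"], sensor_readings["ultrasonic_m"]

        def generate_audio(peripheral_info, tone_data):
            # Map a distance reading onto a musical tone parameter (volume).
            volume = max(0.0, min(1.0, 1.0 - peripheral_info / 5.0))
            return {"tone": tone_data, "volume": volume}

        def notify(first_audio, second_audio):
            # The notification control outputs BOTH pieces of audio information.
            for audio in (first_audio, second_audio):
                print(f"play {audio['tone']} at volume {audio['volume']:.2f}")

        first_info, second_info = acquire_peripheral_info(
            {"laser_m": 1.2, "ultrasonic_m": 3.5})
        notify(generate_audio(first_info, "main_melody"),
               generate_audio(second_info, "accompaniment"))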
  • the one or more object detection sensors may include a first object detection sensor and a second object detection sensor.
  • the peripheral information acquisition unit may acquire the first peripheral information based on the detection result of the first object detection sensor, and may acquire the second peripheral information based on the detection result of the second object detection sensor.
  • the first object detection sensor may be a first ranging sensor that operates according to a first method.
  • the second object detection sensor may be a second ranging sensor that operates according to a second method different from the first method.
  • the first object detection sensor may be a first distance measurement sensor arranged with the first direction as the detection direction.
  • the second object detection sensor may be a second distance measurement sensor arranged with a detection direction in a second direction different from the first direction.
  • the one or more object detection sensors may be sensors that generate image information.
  • the peripheral information acquisition unit may acquire the first peripheral information based on information on some pixel areas of the image information, and may acquire the second peripheral information based on information on other pixel areas of the image information.
  • the one or more object detection sensors may be sensors that generate image information.
  • the peripheral information acquisition unit may acquire information regarding a first type of object detected based on the image information as the first peripheral information, and may acquire information regarding a second type of object different from the first type as the second peripheral information.
  • the surrounding information acquisition unit may generate integrated surrounding information based on the first surrounding information and the second surrounding information.
  • the first ranging sensor may operate using an optical laser method.
  • the second ranging sensor may operate using an ultrasonic method.
  • the surrounding information acquisition unit may generate the integrated surrounding information based on the stability of detection by the first distance measurement sensor and the stability of detection by the second distance measurement sensor.
  • the peripheral information acquisition unit may generate, as the integrated surrounding information, information indicating that a light-transmitting member or a light-absorbing member exists in the periphery.
  • the peripheral information acquisition unit may further use hardness information, acquired as the second peripheral information based on the detection result of the second distance measurement sensor, to generate, as the integrated surrounding information, information regarding at least one of the material and the type of object for the light-transmitting member or the light-absorbing member.
  • the audio information generation unit may determine whether or not to output the first audio information based on the first peripheral information, and if it is determined that the first audio information is not to be output, the notification control unit may restrict the output of the first audio information.
  • the first audio information may be first musical tone information constituting a predetermined song.
  • the second audio information may be second musical tone information constituting the predetermined music piece.
  • the audio information generation unit may generate the first audio information by controlling musical tone parameters of the first musical tone data based on the first peripheral information.
  • the musical sound parameters may include at least one of volume, frequency, pitch, speed, BPM, or tempo.
  • the first surrounding information may include distance information.
  • the audio information generating section may generate the first audio information by controlling the musical tone parameters based on the distance information.
  • the audio information generation unit may control localization of the first audio information based on a detection direction of the first ranging sensor.
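  • as a minimal, hypothetical sketch of such musical tone parameter control, the following maps distance to volume and BPM and maps a sensor's detection direction to stereo localization; the function names and value ranges are assumptions, not taken from the patent:

        import math

        def tone_parameters(distance_m, max_range_m=4.0):
            # Closer obstacles -> louder and faster, within clamped bounds.
            proximity = max(0.0, min(1.0, 1.0 - distance_m / max_range_m))
            return {
                "volume": 0.2 + 0.8 * proximity,   # 0.2 .. 1.0
                "bpm": 60 + int(120 * proximity),  # 60 .. 180 beats per minute
            }

        def stereo_pan(detection_azimuth_deg):
            # Localize the sound toward the sensor's detection direction:
            # -90 deg (left) -> pan -1.0, +90 deg (right) -> pan +1.0.
            clamped = max(-90.0, min(90.0, detection_azimuth_deg))
            return math.sin(math.radians(clamped))

        print(tone_parameters(1.0), stereo_pan(45))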
  • An information processing method is an information processing method executed by a computer system, and includes acquiring first peripheral information and second peripheral information regarding the surrounding environment based on detection results of one or more object detection sensors. First audio information is generated using first musical tone data for notifying the first peripheral information, based on the first peripheral information, and second audio information is generated using second musical tone data for notifying the second peripheral information, based on the second peripheral information. Both the first audio information and the second audio information are output.
  • a program causes a computer system to execute the following steps.
  • acquiring first peripheral information and second peripheral information regarding the surrounding environment based on detection results of one or more object detection sensors; generating first audio information using first musical tone data for notifying the first peripheral information, based on the first peripheral information; generating second audio information using second musical tone data for notifying the second peripheral information, based on the second peripheral information; and outputting both the first audio information and the second audio information.
  • An information processing system includes one or more object detection sensors, the surrounding information acquisition section, the audio information generation section, and the notification control section.
  • the information processing system may further include an audio output unit that outputs the first audio information and the second audio information, and an information output unit that outputs information to the user.
  • FIG. 1 is a schematic diagram for explaining an overview of a surrounding information notification system according to an embodiment of the present technology.
  • FIG. 2 is a schematic diagram showing an example of the functional configuration of the surrounding information notification system.
  • FIG. 3 is a flowchart showing an example of the basic operation of the surrounding information notification system.
  • FIGS. 4 to 6 are schematic diagrams for explaining configuration examples of the sensor section.
  • FIG. 7 is a schematic diagram showing an example of the configuration of one or more object detection sensors.
  • FIG. 8 is a schematic diagram showing another example of the configuration of one or more object detection sensors.
  • FIG. 9 is a block diagram for explaining notification of surrounding information according to the first embodiment.
  • FIG. 10 is a schematic diagram showing an example of first audio information and second audio information.
  • FIG. 11 is a table showing differences between the laser distance measurement sensor and the ultrasonic distance measurement sensor.
  • Further drawings include: a schematic diagram for explaining an example in which an image sensor is arranged as the object detection sensor; a block diagram and a flowchart for realizing notification of surrounding information according to a second embodiment; a table showing an example of determination of the situation of the surrounding environment; schematic diagrams showing cases where an obstacle exists at a position in the front direction of the user, where an obstacle exists on the ground in front of the user, and where a fall danger point exists in the front direction of the user; a table showing an example of a process for notifying obstacles and fall danger points; a schematic diagram for explaining another example of the process for notifying obstacles and fall danger points; a schematic diagram showing detection examples of a "ground obstacle (large)" and a "ground obstacle (small)"; a block diagram, a flowchart, and a schematic diagram of an obstacle space map for realizing notification of surrounding information according to a third embodiment; a schematic diagram showing another configuration example of the surrounding information notification system; a schematic diagram and a flowchart of a surrounding information notification system according to a fourth embodiment; a schematic diagram showing another example of a method of outputting audio according to distance; a block diagram illustrating an example of the hardware configuration of a computer (information processing device) that can be used to construct a surrounding information notification system according to the present technology; a block diagram showing an example of a schematic configuration of a vehicle control system; and an explanatory diagram showing an example of installation positions of an outside-vehicle information detection section and an imaging section.
  • FIG. 1 is a schematic diagram for explaining an overview of a surrounding information notification system according to an embodiment of the present technology.
  • the surrounding information notification system 1 is constructed as a system that can be used by visually impaired people, such as people who are blind or have amblyopia (low vision). That is, the user 2 of the surrounding information notification system 1 is a visually impaired person.
  • the surrounding information notification system 1 corresponds to an embodiment of an information processing system according to the present technology.
  • the user 2 uses a white cane 4 to grasp the situation of the surrounding environment when moving on the ground 3.
  • the user 2 can grasp the situation on the ground 3 based on the sensation (tactile sensation) obtained through the white cane 4.
  • for example, the user 2 can grasp objects 5 such as cars (vehicles), utility poles, and signboards that are present in the direction in which the user 2 is traveling. It is also possible to grasp stairs going upward (up stairs), escalators going upward (up escalators), and the like. In addition, it is possible to grasp the situation of various objects 5 such as stairs going downward (down stairs), escalators going downward (down escalators), the edge of a station platform (the boundary between the platform and the tracks), and Braille blocks installed on the ground 3.
  • in FIG. 1, a car is illustrated as the object 5.
  • in the present disclosure, an "obstacle" includes any object 5 on the ground 3, such as a telephone pole, a signboard, a wall, an upward staircase, an upward escalator, a pedestrian, a bicycle, or a motorbike.
  • any shape or area where there is a risk of the user 2 falling, such as stairs going downward (down stairs), escalators going downward (down escalators), the edge of a station platform, or a hole, is included in the concept of a "fall danger point." In the present disclosure, the "fall danger point" is included in the "downwardly concave area."
  • the surrounding information notification system 1 is capable of reporting surrounding information regarding the surrounding environment to the user 2 with high precision.
  • the user 2 can avoid various dangers based on the notified surrounding information. For example, it is possible to avoid collision with the object 5 that becomes an obstacle during movement (walking, etc.). It is also possible to avoid falling at dangerous points, such as falling down stairs or falling from a station platform onto the tracks.
  • the surrounding information notification system 1 can also be called a danger avoidance system.
  • the notification of surrounding information can also be called the reporting of surrounding information.
  • the surrounding information notification system 1 includes a sensor section 6, an information output section 7, and a controller 8.
  • the sensor unit 6 performs sensing regarding the surrounding environment.
  • the information output unit 7 outputs information to the user 2.
  • the controller 8 controls the operation of the sensor section 6 and the information output section 7.
  • the controller 8 acquires surrounding information regarding the surrounding environment and notifies the user 2 of the surrounding information.
  • FIG. 2 is a schematic diagram showing an example of the functional configuration of the surrounding information notification system 1.
  • the sensor section 6 includes one or more object detection sensors 10.
  • the object detection sensor 10 includes any sensor capable of outputting information, a signal, or the like from which the object 5 can be detected.
  • any sensor that can output information (signal) that can determine whether or not the object 5 is detected (ON/OFF) is included.
  • any sensor capable of detecting various information regarding the object 5, such as the distance to the object 5, the shape of the object 5, the size of the object 5, and the material of the object 5, may also be used as the object detection sensor 10.
  • in the present disclosure, "detection" can also be referred to as "sensing."
  • a sensor that acquires biological information such as pulse, heartbeat, body temperature, and brain waves may be used as necessary.
  • as the object detection sensor 10, for example, a distance measurement sensor, an image sensor (digital camera), an infrared sensor, or the like is used.
  • examples of the distance measurement sensor include an optical laser distance measurement sensor (hereinafter referred to as a laser distance measurement sensor), an ultrasonic distance measurement sensor, a stereo camera, a ToF (Time of Flight) sensor, LiDAR (Light Detection and Ranging / Laser Imaging Detection and Ranging), and a structured light distance measurement sensor.
  • as the image sensor, for example, a CMOS (Complementary Metal-Oxide Semiconductor) sensor or a CCD (Charge Coupled Device) sensor is used.
  • a sensor having both the functions of an image sensor and a ranging sensor may be used.
  • a ToF sensor or the like that can detect distance information for each pixel may be used.
  • An image sensor corresponds to one embodiment of a sensor that generates image information.
  • the information output unit 7 is configured by an arbitrary device for outputting information to the user 2.
  • a speaker 11 and a vibration device 12 are used as an example of the information output section 7.
  • the speaker 11 outputs audio. By driving the speaker 11, it becomes possible to notify the user 2 of information via audio.
  • a headset 13 including a speaker 11 is used as the information output section 7 and is attached to the user's 2 head.
  • the headset 13 is not limited to an overhead type, but may be an in-ear type, a canal type, an open-ear type, or a head-mounted device. Further, it may be a device having hearing aid processing such as a hearing aid or a sound collector.
  • the speaker 11 functions as an audio output section. That is, the information output section 7 is configured to include an audio output section.
  • the vibration device 12 outputs vibration.
  • the vibration device 12 is placed at any position that contacts the user's 2 body.
  • any vibration motor or the like that can generate notification vibrations or the like can be used as the vibration device 12.
  • the surrounding information notification system 1 further includes a communication section 14 and a storage section 15.
  • the communication unit 14 and the storage unit 15 are connected to the controller 8 via a bus or the like.
  • the communication unit 14 is a module for performing network communication, short-range wireless communication, etc. with other devices.
  • a wireless LAN module such as WiFi or a communication module such as Bluetooth (registered trademark) is provided.
  • the sensor unit 6 shown in FIG. 2 and the controller 8 may be communicably connected via wireless communication or the like.
  • a communication section is also configured in the sensor section 6 (not shown).
  • an object detection sensor 10 including a communication section is used.
  • the information output unit 7 shown in FIG. 2 and the controller 8 may be communicably connected via wireless communication or the like.
  • the information output section 7 is also configured with a communication section (not shown).
  • the headset 13 shown in FIG. 1 includes a communication section, and is connected to the controller 8 via wireless communication.
  • the storage unit 15 is a storage device such as a nonvolatile memory, and for example, an HDD (Hard Disk Drive) or SSD (Solid State Drive) is used. In addition, any computer-readable non-transitory storage medium may be used.
  • the storage unit 15 stores a control program for controlling the overall operation of the surrounding information notification system 1.
  • the storage unit 15 also stores a history of detection results (sensing results) by the sensor unit 6, a history of acquired surrounding information, user information regarding the user 2, information such as the methods and characteristics of the sensor unit 6 and the information output unit 7, and other various information necessary for operating the surrounding information notification system 1. Note that the method of installing the control program and the like is not limited.
  • the controller 8 controls the operation of each block included in the surrounding information notification system 1.
  • the controller 8 includes hardware necessary for a computer, such as a processor such as a CPU, GPU, or DSP, memory such as a ROM or RAM, and a storage device such as an HDD.
  • the information processing method according to the present technology is executed by the CPU loading the program according to the present technology stored in the storage unit 15 or the memory into the RAM and executing it.
  • instead of or in addition to the CPU, a PLD (Programmable Logic Device) such as an FPGA (Field Programmable Gate Array), or another device such as an ASIC (Application Specific Integrated Circuit), may be used.
  • the processor of the controller 8 executes a program (for example, an application program) according to the present technology, whereby the peripheral information acquisition section 17, the notification information generation section 18, and the notification control section 19 are realized as functional blocks.
  • the information processing method according to this embodiment is executed by these functional blocks. Note that dedicated hardware such as an IC (integrated circuit) may be used as appropriate to realize each functional block.
  • the surrounding information acquisition unit 17 obtains surrounding information regarding the surrounding environment based on the detection results of one or more object detection sensors 10.
  • the peripheral information acquisition unit 17 can acquire the presence or absence of a peripheral object 5, the distance to the object 5, the shape of the object 5, the size of the object 5, the material of the object 5, etc. as peripheral information.
  • for example, when a distance measurement sensor is used as the object detection sensor 10, it is possible to obtain information regarding fall danger points, such as descending stairs or the edge of a station platform, as surrounding information.
  • the peripheral information acquisition unit 17 can also acquire information such as the presence or absence of a fall danger point, the distance to the fall danger point, the shape of the fall danger point, the size of the fall danger point, and the like.
  • various peripheral information regarding the surrounding environment may be acquired.
  • the distance to the object 5 is detected by the distance measurement sensor. That is, surrounding information is detected by the ranging sensor. In this way, peripheral information may be detected by the object detection sensor 10 in some cases. That is, the surrounding information may be generated by the sensor unit 6 in some cases. In this case, the surrounding information acquisition unit 17 obtains surrounding information by receiving the surrounding information from the sensor unit 6.
  • peripheral information may be generated by performing recognition processing, analysis processing, etc. based on information, signals, etc. detected by one or more object detection sensors 10.
  • the peripheral information acquisition unit 17 acquires peripheral information by generating peripheral information based on detection results by one or more object detection sensors.
  • acquiring peripheral information based on detection results by one or more object detection sensors 10 includes both receiving peripheral information detected by the one or more object detection sensors 10 and generating peripheral information based on the detection results by the one or more object detection sensors 10.
  • in this surrounding information notification system 1, it is also possible to acquire the following detection results as surrounding information (a rough sketch of the threshold-based detection follows the list).
  • when the distance measurement value (distance information) of a distance measurement sensor whose detection direction is set to the front direction (traveling direction), upward direction, downward direction, left side, right side, etc. of the user 2 falls to or below a threshold value, it is detected that an obstacle is approaching in that detection direction.
  • by performing object recognition on image information acquired by an image sensor, specific objects such as people and cars are detected.
  • when the distance measurement value of a distance measurement sensor directed downward and forward reaches or exceeds a threshold value, it is detected that a step or the like is approaching.
  • when the surrounding information notification system 1 is configured as a vehicle-mounted sensor, pedestrians in front are detected, and obstacles and people behind are detected when entering a garage.
  • the threshold value may be set automatically or dynamically on the surrounding information notification system 1 side, or may be set by the user 2 as appropriate.
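  • the following is a rough sketch of such threshold-based detection; the threshold values are hypothetical, since the patent leaves them to the system or the user:

        def detect_events(readings, obstacle_threshold_m=1.5,
                          ground_baseline_m=1.0, step_margin_m=0.3):
            # readings: mapping of detection direction -> measured distance (m).
            events = []
            for direction, distance in readings.items():
                if direction == "down_forward":
                    # A value well ABOVE the baseline suggests a step or
                    # drop-off (the ground is farther away than expected).
                    if distance >= ground_baseline_m + step_margin_m:
                        events.append((direction, "step or drop approaching"))
                elif distance <= obstacle_threshold_m:
                    events.append((direction, "obstacle approaching"))
            return events

        print(detect_events({"front": 1.2, "left": 3.0, "down_forward": 1.6}))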
  • for the object recognition, any image recognition technology may be used, such as image size conversion, character recognition, shape recognition, matching processing using a model image of the object, edge detection, or projective transformation.
  • any machine learning algorithm using, for example, a DNN (Deep Neural Network), an RNN (Recurrent Neural Network), or a CNN (Convolutional Neural Network) may be used. For example, AI (artificial intelligence) that performs deep learning may be used.
  • for example, by using semantic segmentation, it is also possible to determine the type of object for each pixel in the image.
  • the machine learning algorithm may be applied to any processing within the present disclosure.
  • as material information, it is possible to acquire information related to hardness, such as amplitude information of ultrasonic reflected waves.
  • information regarding other materials may also be acquired.
  • the notification information generation unit 18 generates notification information for notifying the user 2 of surrounding information.
  • the notification information includes any information for realizing output of peripheral information by the speaker 11 and the vibration device 12 arranged as the information output section 7.
  • the notification information includes audio information to be output from the speaker 11 and output control information for specifying how to output the audio information.
  • the audio information may take various forms, such as a message like "There is an obstacle ahead," musical sound information (melody, accompaniment, etc.) that makes up a certain song, or a notification sound like "beep beep."
  • as the output control information, arbitrary information defining volume, pitch, playback speed, BPM (Beats Per Minute), sound localization (localization direction), etc. may be generated. For example, by controlling the localization of sound, it is also possible to provide information using stereophonic sound.
  • the notification information generation unit 18 generates vibration information for vibrating the vibration device 12 as notification information. For example, vibration information for realizing various vibration patterns in which vibration strength (amplitude), frequency, tempo, etc. are specified is generated as notification information.
  • the notification control section 19 controls the information output section 7 based on the notification information.
  • the speaker 11 is driven by the notification control unit 19, and audio information generated as notification information is output. Further, the vibration device 12 is driven, and a vibration pattern corresponding to the vibration information generated as notification information is output.
  • a device including the controller 8 corresponds to an embodiment of an information processing device according to the present technology.
  • for example, when the sensor section 6 and the controller 8 are integrally configured, the information processing apparatus according to the present technology is realized in a form that includes the one or more object detection sensors 10.
  • the information output section 7 and the controller 8 may be integrally configured.
  • the controller 8 may be configured in the headset 13 worn on the user's 2 head.
  • an embodiment of the information processing apparatus according to the present technology is implemented in a form that includes a device for notification such as the speaker 11.
  • the sensor section 6, the information output section 7, and the controller 8 may be integrally configured.
  • one embodiment of the information processing apparatus according to the present technology is realized in a form including one or more object detection sensors 10 and a notification device such as a speaker 11. In this way, it is possible to adopt various forms as the peripheral information notification system 1.
  • FIG. 3 is a flowchart showing an example of the basic operation of the surrounding information notification system 1.
  • the surrounding information acquisition section 17 obtains surrounding information based on the detection result by the sensor section 6 (step 101).
  • the notification information generation unit 18 generates notification information for notifying surrounding information (step 102).
  • step 102 notification information corresponding to the surrounding information to be notified to the user 2 is generated.
  • for example, in step 101, surrounding information indicating that the edge of a station platform, which is a fall danger point, exists in the immediate vicinity of the user 2 is acquired.
  • in this case, for example, audio information corresponding to the danger level (degree of danger) is generated as notification information.
  • vibration information such that a powerful vibration pattern with a large amplitude and a suppressed frequency is output from the vibration device 12 is generated as notification information.
  • the information output unit 7 is controlled by the notification control unit 19 based on the notification information, and surrounding information is notified to the user 2 (step 103).
  • the speaker 11 and the vibration device 12 are controlled by the notification control section 19.
  • the user 2 can grasp the situation of the surrounding environment through sound and vibration (tactile sense), and can move while avoiding danger.
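  • a minimal sketch of this acquire/generate/notify cycle (steps 101 to 103); the danger levels and vibration parameters are illustrative assumptions:

        def notify_once(sensed_distance_m):
            # Step 101: acquire surrounding information (one distance value).
            danger = "high" if sensed_distance_m < 0.5 else "low"
            # Step 102: generate notification information. Following the
            # description above, high danger uses a powerful vibration with
            # large amplitude and suppressed (low) frequency.
            vibration = ({"amplitude": 1.0, "freq_hz": 40} if danger == "high"
                         else {"amplitude": 0.3, "freq_hz": 120})
            audio = "urgent tone" if danger == "high" else "calm tone"
            # Step 103: drive the information output section.
            print(f"audio={audio}, vibration={vibration}")

        notify_once(0.4)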
  • in FIG. 4A, the sensor section 6 is arranged at the waist position (belt position) of the user 2, on the front side of the user 2.
  • in FIG. 4B, the sensor unit 6 is arranged on the head of the user 2, on the front side of the user 2.
  • the sensor main body 21 in which one or more object detection sensors 10 are arranged is configured as a wearable device that can be worn by the user 2.
  • the sensor section 6 is realized by the wearable device. In this way, one or more object detection sensors 10 may be placed in a wearable device worn by the user 2.
  • the user 2 can realize the sensor unit 6 at various positions by wearing the sensor main body 21 configured as a wearable device.
  • as the wearable device, various forms may be adopted, such as a wristband type worn on the wrist, a bracelet type worn on the upper arm, a headband type (head-mounted type) worn on the head, a neckband type worn around the neck, a torso type worn on the chest, a belt type worn on the waist, or an anklet type worn on the ankle.
  • in addition, wearable devices in the form of glasses, rings, necklaces, earrings, or piercings, a form that can be attached to the toe of a shoe, a form that can be attached to any position with a clip, etc. may be adopted.
  • the sensor section 6 is realized in a form that can be held by the user 2.
  • a sensor main body 21 in which one or more object detection sensors 10 are arranged is configured as a device that can be held by the user 2.
  • the sensor main body 21 is held by the right hand holding the white cane 4.
  • the sensor main body 21 is held by the left hand on the opposite side from the right hand holding the white cane 4. In this way, one or more object detection sensors 10 may be placed on a device held by the user 2.
  • the sensor main body 21 (sensor section 6) is mounted on another device held by the user 2.
  • a sensor body 21 on which one or more object detection sensors 10 are arranged is mounted on a carrier 22 that is pulled and moved by the user 2.
  • the sensor body 21 is mounted on a handcart 23 that the user 2 pushes to move. In this way, the sensor unit 6 may be realized by mounting the sensor main body 21 on another device held by the user 2.
  • the sensor body 21 may be mounted on the white cane 4 held by the user 2.
  • a configuration in which the sensor main body 21 (sensor unit 6) is mounted on another device held by the user 2 is included in a configuration in which one or more object detection sensors 10 are placed in a device held by the user 2.
  • various variations can also be considered for the configuration of the one or more object detection sensors 10. For example, by arbitrarily selecting and setting the number of object detection sensors 10, the type (method, etc.) of the object detection sensors 10, the attitude (detection direction, etc.) of the object detection sensors 10, the sensing parameters of the object detection sensors 10 (frame rate, gain, laser intensity, etc.), and the like, it is possible to realize various configurations.
  • FIG. 7 is a schematic diagram showing an example of the configuration of one or more object detection sensors 10.
  • one or more object detection sensors 10 are arranged in a sensor main body 21 that can be held by the user 2. By changing the orientation of the sensor body 21, the user 2 can scan and sense the surrounding environment.
  • two ranging sensors with different methods are used as the one or more object detection sensors 10.
  • in this example, an optical laser distance measurement sensor (laser ranging sensor) 25 and an ultrasonic distance measurement sensor (ultrasonic ranging sensor) 26 are used.
  • the two distance measuring sensors 25 and 26 are arranged on the sensor main body 21 so that their detection directions are the same.
  • sensing is performed by the laser ranging sensor 25 and the ultrasonic ranging sensor 26, with the direction in which the sensor body 21 is directed by the user 2 as the detection direction.
  • the detection results by the two types of distance measuring sensors 25 and 26 make it possible to acquire highly accurate surrounding information and to notify the user 2 of the same.
  • the method of the distance measuring sensor employed is not limited and may be set arbitrarily.
  • FIG. 8 is a schematic diagram showing another example of the configuration of one or more object detection sensors 10.
  • one or more object detection sensors 10 are arranged, for example, in a wearable device (not shown) that can be worn on the user's 2 hand.
  • one or more object detection sensors 10 are arranged near the portion of the white cane 4 that the user 2 holds in his/her hand.
  • two ranging sensors 27 and 28 having different detection directions are used (the detection directions are denoted by the symbols of the ranging sensors 27 and 28).
  • the example shown in FIG. 8 is an embodiment that includes a first ranging sensor arranged with a first direction as its detection direction and a second ranging sensor arranged with a second direction, different from the first direction, as its detection direction. Either of the distance measurement sensors 27 and 28 may be used as the first distance measurement sensor.
  • the distance measurement sensor 27 is the first distance measurement sensor.
  • the distance measurement sensor 27 is arranged so that the front direction of the user 2 is the detection direction. Therefore, the first direction is the front direction of the user 2.
  • the distance measuring sensor 27 will be referred to as the front side distance measuring sensor 27 using the same reference numerals.
  • the front distance measuring sensor 27 is arranged at a height H from the ground 3 so that the direction parallel to the ground 3 is the detection direction. Note that the front direction of the user 2 can also be said to be the direction of movement of the user 2.
  • the distance measurement sensor 28 serves as a second distance measurement sensor, and is arranged so that the direction toward the measurement point P set on the ground 3 is the detection direction. Therefore, the second direction is the direction from the position of the user's 2 hand (the position at the height H) toward the measurement point P.
  • the distance measuring sensor 28 will be referred to as the ground side distance measuring sensor 28 using the same reference numerals.
  • the measurement point P is set at a position on the ground 3 that is a predetermined distance D away from the user 2 along the front direction.
  • the size of the distance D is determined, for example, depending on how far the object 5 on the ground 3 or the falling dangerous point is desired to be detected. For example, if it is desired to quickly detect an object 5 on the ground 3 or a fall danger point at a relatively far position, the distance D is set to be relatively long. If it is desired to detect the object 5 on the ground 3 or the fall danger point when it is in a relatively close position, the distance D is set to be relatively short.
  • the distance D to the measurement point P may be set taking into consideration the moving speed of the user 2 and the like. Of course, the distance D to the measurement point P may be arbitrarily set by the user 2, for example, based on other viewpoints.
  • the intersection angle θ between the detection direction of the ground-side ranging sensor 28 and the ground 3 can be calculated from the desired distance D using the following trigonometric formula:
  • θ = arctan(H / D) … (1)
  • the height H from the ground 3 of the front-side distance measurement sensor 27 and the ground-side distance measurement sensor 28 is 0.75 m.
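  • as a worked example of equation (1), the following computes the intersection angle θ for the height H = 0.75 m given above and an illustrative distance D of 2.0 m:

        import math

        def intersection_angle_deg(height_m, distance_m):
            # Equation (1): theta = arctan(H / D).
            return math.degrees(math.atan2(height_m, distance_m))

        print(f"{intersection_angle_deg(0.75, 2.0):.1f} deg")  # about 20.6 deg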
  • in this way, it becomes possible to obtain detection results in two channels, the front direction and the ground direction. Thereby, it becomes possible to acquire highly accurate surrounding information and to notify the user 2 of it.
  • the detection directions of the two distance measuring sensors 27 and 28, that is, the first direction and the second direction, are not limited and may be set arbitrarily.
  • any combination of directions may be employed, such as (front direction, back direction), (front direction, upper direction), (front direction, left direction), (front direction, right direction).
  • the distance measuring sensor may be arranged with the direction in which the situation is desired to be known as the detection direction.
  • for example, when the detection directions are (front direction, left direction), the measurement point P may be set at a position on the left-side wall a predetermined distance D away from the user 2 along the front direction.
  • an arbitrary number of three or more distance measuring sensors may be arranged so that their detection directions are different from each other.
  • three ranging sensors may be arranged so as to provide three channels in the front direction and in the left and right directions perpendicular to the front direction (directions of both left and right walls). Any variation may be adopted as the number of ranging sensors and the detection direction of each ranging sensor.
  • the user 2 may be able to set the detection directions of the distance measuring sensors 27 and 28 on the application.
  • for example, a GUI (Graphical User Interface) for such settings may be displayed by the application.
  • FIG. 9 is a block diagram for explaining notification of surrounding information according to the first embodiment.
  • the peripheral information acquisition unit 17 acquires first peripheral information 30 and second peripheral information 31. That is, at least two different types of peripheral information are acquired.
  • the sensor section 6 having the configuration illustrated in FIG. 7 is adopted. That is, it is assumed that a laser ranging sensor 25 and an ultrasonic ranging sensor 26 are arranged.
  • the surrounding information obtained based on the detection result of the laser ranging sensor 25 is obtained as the first surrounding information 30.
  • surrounding information acquired based on the detection result of the ultrasonic ranging sensor 26 is acquired as second surrounding information 31.
  • the presence or absence (ON/OFF) of detection of the object 5 output from the laser ranging sensor 25, the distance to the object 5, the material (hardness) of the object 5, etc. are acquired as the first peripheral information 30.
  • information such as whether or not the object 5 is detected (ON/OFF), the distance to the object 5, and the material (hardness) of the object 5 outputted from the ultrasonic ranging sensor 26 is acquired as second peripheral information 31.
  • the laser distance measurement sensor 25 can detect the distance to the object 5, etc. without being affected by the hardness of the object 5. Conversely, it is often difficult to detect the material (hardness) of the object 5 using the laser ranging sensor 25. In this case, for example, the laser distance measurement sensor 25 may output information indicating that the object 5 is undetectable as information about the material (hardness) of the object 5.
  • the notification information generation section 18 includes audio signal processing sections 32 and 33 and a speech synthesis processing section 34.
  • the storage unit 15 also stores first musical tone data for notifying the first peripheral information 30 and second musical tone data for notifying the second peripheral information 31.
  • the musical tone data includes any data that constitutes a musical tone.
  • the data includes data in which a predetermined scale or melody is defined, audio data of a specific musical instrument, and the like.
  • the musical tone data of the main melody, the musical tone data of the sub melody, etc. of a predetermined song may be used individually.
  • audio data of specific instruments playing a given song (melody instruments such as piano, violin, and vocals; bass instruments such as bass guitar, contrabass, and bass drum; percussion instruments such as glockenspiel, drums, bells, and chimes; etc.) may be used.
  • data such as an electronic sound that is reproduced discontinuously and periodically, like "beep-beep-beep," may also be used.
  • based on the first peripheral information 30, the audio signal processing unit 32 generates first audio information using the first musical tone data.
  • the audio signal processing unit 33 generates second audio information using the second musical tone data based on the second peripheral information 31.
  • the first audio information and the second audio information are generated as broadcast information.
  • FIG. 10 is a schematic diagram showing an example of first audio information and second audio information.
  • the first musical sound data and the second musical sound data are musical sound data that, when output together to the user 2, form a combination that can be listened to without musical discomfort.
  • musical tone data of a certain part of a predetermined musical piece is set as the first musical tone data.
  • musical sound data of other parts constituting the same music piece is set.
  • any combination may be adopted as the first musical sound data and the second musical sound data, such as (main melody, sub melody), (melody, accompaniment), (melody of a high-pitched instrument, melody of a low-pitched instrument), or (melody of a melody instrument, sound of a percussion instrument).
  • the audio signal processing unit 32 generates first audio information by controlling musical tone parameters based on the first peripheral information 30.
  • musical sound parameters include volume, frequency, pitch, speed of song reproduction, BPM, and tempo.
  • musical tone parameter control such as increasing the volume, pitch, and tempo is executed to generate the first audio information.
  • the first audio information may be generated by controlling the musical tone parameters based on the distance information.
  • audio data of a specific musical instrument may be generated as the first audio information in response to the detection of the object 5.
  • the audio signal processing unit 33 generates second audio information by controlling musical tone parameters based on the second peripheral information 31.
  • first musical tone information constituting a predetermined music piece is generated as the first audio information.
  • second musical tone information constituting the same song is generated.
  • the speech synthesis processing unit 34 synthesizes the first speech information and the second speech information to generate one speech information (synthesized speech information). That is, the synthesized speech information is generated by the speech synthesis processing unit 34 so that both the first speech information and the second speech information illustrated in FIG. 10 are output. Any mixing technique or the like for synthesizing audio information (audio data) may be used.
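  • a minimal sketch of such synthesis, using two sine-wave parts as stand-ins for the first and second musical tone data (the sample rate, frequencies, and volumes are illustrative):

        import math

        SAMPLE_RATE = 8000  # Hz

        def sine_part(freq_hz, volume, seconds=0.5):
            n = int(SAMPLE_RATE * seconds)
            return [volume * math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE)
                    for i in range(n)]

        def mix(first, second):
            # Additive mix with simple clipping, standing in for the speech
            # synthesis that lets both parts be heard at once.
            return [max(-1.0, min(1.0, a + b)) for a, b in zip(first, second)]

        melody = sine_part(440.0, volume=0.6)         # first audio information
        accompaniment = sine_part(110.0, volume=0.4)  # second audio information
        combined = mix(melody, accompaniment)
        print(len(combined), "samples mixed")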
  • an audio output section 35 is configured within the notification control section 19.
  • the audio output unit 35 controls the speaker 11 to output synthesized audio information in which the first audio information and the second audio information are synthesized. As a result, both the first audio information and the second audio information are output from the speaker 11.
  • the user 2 is able to grasp the first peripheral information 30 through the first audio information and the second peripheral information 31 through the second audio information. That is, the user 2 can grasp the first peripheral information 30 and the second peripheral information 31 at the same time via voice. As a result, it becomes possible to notify the user 2 of information about the surrounding environment with high accuracy.
  • a melody of a certain musical instrument is output as the first audio information.
  • the melody of another musical instrument is output as the second audio information.
  • the first audio information and the second audio information are combined and output together, they are played back to the user 2 as one melody.
  • a sound of a certain scale is output. Sounds of different scales are output as second audio information.
  • the first audio information and the second audio information are synthesized and output together, they form a chord and are played back to the user 2.
  • the user 2 moves while scanning the surrounding area using the sensor section 6 having the configuration shown in FIG. 7 in which the laser distance measurement sensor 25 and the ultrasonic distance measurement sensor 26 are arranged.
  • the sensor section 6 may be mounted on the white cane 4.
  • the first peripheral information 30 is acquired by the peripheral information acquisition unit 17 based on the detection result of the laser ranging sensor 25. Further, second surrounding information 31 is acquired based on the detection result of the ultrasonic ranging sensor 26. It is assumed that the detection direction of the laser distance measurement sensor 25 and the detection direction of the ultrasonic distance measurement sensor 26 are set in the same direction.
  • FIG. 11 is a table showing differences between the laser distance measurement sensor 25 and the ultrasonic distance measurement sensor 26.
  • since ultrasonic waves have low directivity, the detection range of the ultrasonic ranging sensor 26 is wide.
  • since the laser has high directivity, the detection range of the laser ranging sensor 25 is narrow. Both the ultrasonic distance measurement sensor 26 and the laser distance measurement sensor 25 return a distance measurement value when an object 5 exists within the detection range; the laser distance measurement sensor 25 is easier to aim, making it possible to know with high accuracy whether or not there is an object 5 at the point where the white cane 4 is pointed.
  • the ultrasonic ranging sensor 26 has a wide detection range, so while only the approximate direction of the detected object 5 can be known, it enables detection over a wide range. If we compare it with vision, it can be said that the laser distance measurement sensor 25 is close to the fovea, which has a narrow field of view, and the ultrasonic distance measurement sensor 26 is close to the peripheral field, which has a wide field of view.
  • when the laser of the laser ranging sensor 25 crosses the boundary of a thin object 5 in response to a slight hand movement, it is possible to know the width of the object 5 in conjunction with the hand movement.
  • with the ultrasonic ranging sensor 26, which has a wide detection range, the object 5 does not leave the detection range unless the user moves his or her hand widely, so it is difficult for the user 2 to know information such as the width and height of the object 5 in detail.
  • for this reason, the laser ranging sensor 25 is assigned musical sounds (musical sound data) of instruments suitable for a main melody with many scales (for example, piano, violin, vocals, etc.), which can convey detailed changes in the detection results.
  • since the ultrasonic ranging sensor 26 has a wide detection range, it notifies the detection of some object 5 relatively frequently compared to the laser ranging sensor 25, which has a narrow detection range. For this reason, it is assigned, for example, musical tones of accompaniment or bass instruments (e.g., bass guitar, contrabass, bass drum) suitable for continuous notification.
  • when the detection result is not a continuous value such as a distance value but a binary value such as whether or not the object 5 is detected (ON/OFF), musical sounds of percussion instruments such as glockenspiel, drums, bells, and chimes may be assigned, for example.
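  • one way to express the instrument assignment just described is a simple mapping; the concrete instruments and labels are hypothetical examples drawn from the lists above:

        # Narrow-beam laser -> melody instrument with fine pitch control;
        # wide-beam ultrasonic -> continuous accompaniment or bass part;
        # binary (ON/OFF) detections -> one-shot percussion cues.
        ASSIGNMENT = {
            "laser":      {"part": "main melody", "instrument": "piano"},
            "ultrasonic": {"part": "accompaniment", "instrument": "bass guitar"},
            "binary":     {"part": "one-shot cue", "instrument": "chime"},
        }

        def tone_for(sensor_kind):
            return ASSIGNMENT.get(sensor_kind, {"part": "none", "instrument": None})

        print(tone_for("laser"), tone_for("binary"))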
  • in addition, the laser distance measurement sensor 25 has difficulty detecting objects with low light reflectance, such as glass or black materials, whereas the ultrasonic ranging sensor 26 is not suited to detecting soft materials.
  • both types of distance measuring sensors have advantages and disadvantages.
  • with the surrounding information notification system 1, it is possible to compensate for the shortcomings of both distance measuring sensors. For example, even if there is a wall or door made of glass that cannot be detected by the laser, the ultrasonic sensor returns a measured distance value, so it is possible to notify the user 2 that some object 5 is nearby. In this way, it is possible to increase the types of objects 5 that can be detected. It is also possible to improve environmental resistance.
  • for example, the laser distance measurement sensor 25 is used as the main distance measurement sensor and is assigned musical tone data of a high-pitched instrument.
  • the secondary ultrasonic ranging sensor 26 is assigned musical tone data of a bass instrument.
  • in this way, musical sound data is assigned so that the user 2 can clearly hear the audio corresponding to both distance measuring sensors. This makes it possible to notify the user 2 of information about the surrounding environment with very high accuracy.
  • when the main melody is output according to the distance measurement value of the laser distance measurement sensor 25, the output may be interrupted or become inaudible. It is difficult for the user 2 to judge whether this is due to the influence of small objects 5 such as trees or snow, or due to a light-transmitting or light-absorbing material.
  • in such a case, if it can be determined that the object 5 is a light-transmitting material or a light-absorbing material, it is also possible to further utilize hardness information obtained from the ultrasonic ranging sensor 26. For example, it is assumed that objects 5 that are light-transmitting or light-absorbing and have high hardness are often highly dangerous obstacles, such as glass walls or black walls. Based on such an assumption, it is possible to estimate that an object having light transmittance or light absorption and low hardness is not glass or the like but a soft light-absorbing material such as black clothing.
  • the peripheral information acquisition unit 17 can further generate peripheral information based on the first peripheral information 30 and the second peripheral information 31. That is, it is possible to integrate the first surrounding information and the second surrounding information to generate surrounding information regarding the surrounding environment (hereinafter referred to as integrated surrounding information).
  • the surrounding information acquisition unit 17 may generate integrated surrounding information based on the stability of detection by the laser ranging sensor 25 and the stability of detection by the ultrasonic ranging sensor 26. Note that in the present disclosure, the stability of sensor detection is included in the sensor detection result.
  • For example, it is also possible to determine information regarding at least one of the material and the object type of a light-transmitting or light-absorbing member, and to generate such detailed information as integrated peripheral information.
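  • A minimal sketch of how such integrated peripheral information might be derived, assuming boolean detection-stability flags and a hardness estimate as inputs (the category labels and function name are illustrative):

```python
from typing import Optional

def integrate_peripheral_info(laser_detection_stable: bool,
                              ultrasonic_distance_m: Optional[float],
                              hardness_high: bool) -> str:
    """Combine the two sensors' results into integrated peripheral information."""
    if laser_detection_stable:
        # The laser sees the object reliably: an ordinary reflective object.
        return "ordinary object (laser-detectable)"
    if ultrasonic_distance_m is None:
        return "no object detected"
    # The laser fails but ultrasound returns a distance: the object is
    # light-transmitting or light-absorbing.
    if hardness_high:
        return "hard transparent/black obstacle (e.g. glass wall, black wall)"
    return "soft light-absorbing object (e.g. black clothing)"
```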
  • the shape of an object can be grasped in detail by converting the detection results from the laser distance measurement sensor 25 into minute changes in tone that are linked to the movements of the body or movable parts.
  • By converting the wide-range detection results of the ultrasonic ranging sensor 26 into audio such as accompaniment, it is possible to detect in advance the presence of obstacles that are likely to block the route. It is thus also possible to realize a division of roles analogous to that between the fovea and the peripheral visual field in vision, as well as a division of roles that compensates for the shortcomings of each ranging sensor.
  • While the laser ranging sensor 25 is normally relied upon, it is also possible to rely on information from the ultrasonic ranging sensor 26 in places where there is glass or black material with low light reflectance.
  • the specific types of the first peripheral information 30 and the second peripheral information 31 shown in FIG. 9 are not limited.
  • surrounding information obtained based on the detection result of a ranging sensor arranged with the first direction as the detection direction may be obtained as the first surrounding information 30.
  • surrounding information obtained based on the detection result of a ranging sensor arranged with the second direction as the detection direction may be obtained as the second surrounding information 31.
  • the first surrounding information 30 is acquired based on the detection result of the front distance measuring sensor 27 arranged with the front direction as the detection direction, for example.
  • the second surrounding information 31 may be acquired based on the detection result of the ground-side ranging sensor 28 arranged with the direction toward the measurement point P as the detection direction. This allows the user 2 to simultaneously grasp information about the environment on the front side and information on the environment on the ground side via audio.
  • the first surrounding information 30 may be acquired based on the detection result of the ground-side distance measurement sensor 28, and the second surrounding information 31 may be obtained based on the detection result of the front-side distance measurement sensor 27.
  • the distance measurement value of the front side distance measurement sensor 27 is acquired as the first peripheral information 30. Then, musical tone parameters are controlled according to the measured distance value, and a predetermined melody is output as first audio information.
  • the presence or absence of a fall danger point is acquired based on the distance measurement value of the ground-side distance measurement sensor 28. For example, when the distance measurement value of the ground-side distance measurement sensor 28 becomes large, it is determined that there is a fall danger point. In response to detection of a fall danger point, second audio information is generated and output from audio data of a percussion instrument or the like.
  • By outputting both the melody, which is the first audio information, and the percussion sound, which is the second audio information, the user 2 can simultaneously grasp the proximity of the object 5 in the front direction and the presence or absence of a fall danger point on the ground 3.
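  • A sketch of these two output channels under stated assumptions (the threshold values, the note mapping, and the function names are hypothetical): the melody channel continuously tracks the front side distance value, while the percussion channel fires only when a fall danger point is detected.

```python
def melody_note(front_distance_m: float, max_range_m: float = 5.0) -> int:
    """Continuous channel: a MIDI note that rises as the object 5 approaches."""
    ratio = max(0.0, min(1.0, 1.0 - front_distance_m / max_range_m))
    return 60 + round(ratio * 24)

def percussion_trigger(ground_distance_m: float, baseline_m: float,
                       jump_threshold_m: float = 0.5) -> bool:
    """Discrete channel: fires when the ground side distance jumps upward,
    i.e. a fall danger point is detected."""
    return ground_distance_m - baseline_m > jump_threshold_m
```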
  • Distance measuring sensors of the same type are arranged so that their detection directions are different from each other.
  • a plurality of laser ranging sensors are arranged with different detection directions such as front, back, left, right, top and bottom.
  • the main melody is assigned to the laser distance measurement sensor in the front direction.
  • Accompaniment etc. are assigned to the laser ranging sensor in a direction other than the front direction.
  • The localization of the first audio information may be controlled based on the detection direction of the first ranging sensor among the plurality of ranging sensors, and the localization of the second audio information may be controlled based on the detection direction of the second ranging sensor. Such processing is also possible.
  • FIG. 12 is a schematic diagram for explaining an example in which an image sensor is arranged as the object detection sensor 10.
  • Image information 38 is generated by the image sensor and output to the surrounding information acquisition unit 17 as a detection result.
  • the surrounding information acquisition unit 17 performs object recognition processing on the image information 38.
  • Through the object recognition processing, it is possible to obtain, as peripheral information, the presence or absence of detection of the object 5 (ON/OFF), the type of the object 5, the distance to the object 5, the material (hardness) of the object 5, and so on.
  • the first peripheral information 30 may be acquired based on the information of the upper half pixel region 38a of the image information 38. Further, the second peripheral information 31 may be acquired based on the information of the lower half pixel region 38b of the image information 38. In this way, the first peripheral information 30 may be acquired based on information on a part of the pixel area of the image information 38. Further, the second peripheral information may be acquired based on information on other pixel areas in the image information 38. This allows the user 2 to simultaneously grasp information on the upper environment and information on the lower environment via audio.
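  • For example, the pixel-region split might be implemented as follows (a sketch assuming NumPy image arrays of shape (height, width, channels)):

```python
import numpy as np

def split_pixel_regions(image: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Upper half -> source of first peripheral info, lower half -> second."""
    half = image.shape[0] // 2
    return image[:half], image[half:]
```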
  • For example, information regarding a car 39 detected based on the image information 38 is acquired as the first peripheral information 30, and information regarding a person 40 detected based on the image information 38 is acquired as the second peripheral information 31. In this way, the first peripheral information 30 and the second peripheral information 31 may be acquired for each type of object detected based on the image information 38. That is, information regarding a first type of object detected based on the image information 38 may be acquired as the first peripheral information, and information regarding a second type different from the first type may be acquired as the second peripheral information.
  • The first type and second type of object can be set arbitrarily. For example, any combination such as (person, vehicle), (motorcycle, automobile), (adult, child), or (pedestrian, bicycle) can be set as the first type and second type.
  • the user 2 can simultaneously grasp information regarding two different types of objects via voice.
  • The notification information generation unit 18 may determine whether to output the first audio information based on the first peripheral information 30. For example, it is determined whether or not to output the first audio information based on whether or not the object 5 is detected, the distance to the object 5, and the like. For example, if the object 5 is not detected, or if the distance to the object 5 is greater than a predetermined threshold (e.g., 5 m), it is determined that the first audio information is not to be output, and the output of the first audio information by the audio output unit 35 is restricted. That is, when the first peripheral information 30 satisfies a predetermined condition, the first audio information is output, and when the predetermined condition is not satisfied, the first audio information is not output. Such processing is also possible.
  • the conditions that serve as the criteria for determining whether or not to output the first audio information may be set arbitrarily.
  • Regarding the output of the second audio information as well, it may be determined whether or not to output the second audio information based on predetermined conditions, and the output may be controlled based on the determination result. The same determination condition may be set for the output of the first audio information and the output of the second audio information, or different determination conditions may be set separately.
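  • A minimal sketch of such output gating, assuming the 5 m threshold from the example above (the function and parameter names are hypothetical, and a different condition could be used per channel):

```python
from typing import Optional

def should_output_audio(object_detected: bool,
                        distance_m: Optional[float],
                        threshold_m: float = 5.0) -> bool:
    """Output audio only while the peripheral information meets the condition."""
    if not object_detected or distance_m is None:
        return False                  # nothing to notify
    return distance_m <= threshold_m  # mute when the object is far away
```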
  • first musical tone data, second musical tone data, and third musical tone data are prepared for each of first peripheral information, second peripheral information, and third peripheral information.
  • First musical tone data is used to generate first audio information based on the first peripheral information.
  • Second audio information is generated using the second musical tone data based on the second peripheral information.
  • Third audio information is generated using third musical tone data based on the third peripheral information.
  • a sensor capable of outputting multidimensional information may be disposed.
  • an image sensor capable of outputting image information 38 shown in FIG. 12 and the like can also be said to be a sensor capable of outputting multidimensional information, with information on each pixel being one-dimensional information.
  • As the object detection sensor 10 that can output multidimensional information, it is also possible to use a ToF (Time of Flight) sensor that can acquire distance information for each pixel.
  • Audio information may be generated using musical tone data for each piece of pixel information and output together.
  • As described above, in the surrounding information notification system 1, the first peripheral information 30 and the second peripheral information 31 are acquired by the controller 8 based on the detection results of the one or more object detection sensors 10. Then, first audio information is generated using the first musical tone data based on the first peripheral information 30, and second audio information is generated using the second musical tone data based on the second peripheral information 31. Both the first audio information and the second audio information are output. This makes it possible to notify both the first peripheral information and the second peripheral information via audio, and to notify the user 2 of information about the surrounding environment with high accuracy.
  • Ranging sensors include, for example, optical laser systems, ultrasonic systems, and stereo cameras. Optical laser systems have had problems such as being unable to measure the distance to objects with low light reflectance and being affected by environmental light. The ultrasonic method has had the problem that it is difficult to narrow down the ranging area because the sound waves spread out. A configuration in which a plurality of ranging sensors of different methods are combined can be cited as a way to solve such problems.
  • For example, consider a configuration in which a laser ranging sensor and an ultrasonic ranging sensor are selectively switched and used as appropriate. While the highly directional laser ranging sensor is outputting detection results for obstacles in the front direction, the output may suddenly switch to detection results from the ultrasonic ranging sensor, whose wide detection range covers objects present over a broad surrounding area. It may then be difficult for the user 2 to understand whether the detection result currently being reported via audio or the like is information about the front direction or information about a wide surrounding area, making it difficult to avoid danger.
  • In contrast, in the present technology, the first peripheral information 30 and the second peripheral information 31 are each converted into audio information, and both can be notified to the user 2 at the same time.
  • the peripheral information acquisition unit 17 corresponds to an embodiment of the peripheral information acquisition unit according to the present technology.
  • the audio signal processing units 32 and 33 and the audio synthesis processing unit 34 configured in the notification information generation unit 18 correspond to an embodiment of the audio information generation unit according to the present technology.
  • the audio output unit 35 configured in the notification control unit 19 corresponds to an embodiment of the notification control unit according to the present technology, which outputs both first audio information and second audio information.
  • the speech synthesis processing section 34 can also be regarded as a block that also functions as a notification control section according to the present technology.
  • the laser ranging sensor 25 corresponds to an embodiment of the first object detection sensor according to the present technology. Further, the laser distance measurement sensor 25 also corresponds to an embodiment of a first distance measurement sensor that operates according to the first method (optical laser method).
  • the ultrasonic ranging sensor 26 corresponds to an embodiment of the second object detection sensor according to the present technology. Further, the ultrasonic ranging sensor 26 also corresponds to an embodiment of a second ranging sensor that operates according to a second method (ultrasonic method) different from the first method.
  • The front side ranging sensor 27 corresponds to an embodiment of the first object detection sensor according to the present technology. Further, the front side ranging sensor 27 also corresponds to an embodiment of a first ranging sensor arranged with the first direction (front direction) as the detection direction.
  • the ground-side ranging sensor 28 corresponds to an embodiment of the second object detection sensor according to the present technology. Further, the ground-side ranging sensor 28 also corresponds to an embodiment of a second ranging sensor arranged with a detection direction in a second direction (ground direction) different from the first direction.
  • FIG. 13 is a block diagram showing a configuration example for realizing notification of surrounding information according to the second embodiment.
  • In the surrounding information notification system 41, the configuration of the sensor unit 6 shown in FIG. 8 is adopted. That is, a front side ranging sensor 27 whose detection direction is the front direction and a ground side ranging sensor 28 whose detection direction is toward the measurement point P on the ground 3 (ground direction) are used.
  • the surrounding information acquisition section 17 includes a distance information acquisition section 42 and a situation determination section 43.
  • The distance information acquisition unit 42 acquires first distance information detected by a first ranging sensor arranged with a first direction as its detection direction, and second distance information detected by a second ranging sensor arranged with a second direction different from the first direction as its detection direction.
  • the distance information detected by the front distance measuring sensor 27 (hereinafter referred to as front distance information) is acquired as the first distance information. Further, distance information detected by the ground-side distance measuring sensor 28 (hereinafter referred to as ground-side distance information) is acquired as second distance information.
  • The situation determination unit 43 determines the situation of the surrounding environment based on at least one of first detection information including the fluctuation and variation of the first distance information and second detection information including the fluctuation and variation of the second distance information.
  • In this embodiment, first detection information including the fluctuation and variation of the front side distance information and second detection information including the fluctuation and variation of the ground side distance information are used. That is, the situation of the surrounding environment is determined using four pieces of information: the fluctuation of the front side distance information, the variation of the front side distance information, the fluctuation of the ground side distance information, and the variation of the ground side distance information.
  • The "fluctuation of distance information" includes any information related to the fluctuation of the distance information, such as the magnitude of the fluctuation, the direction of the fluctuation (increase/decrease), the duration of fluctuation, and the duration of no fluctuation.
  • The "variation of distance information" includes any information regarding the variation of a plurality of pieces of distance information detected in time series at a predetermined frame rate. For example, information regarding the variation of the pieces of distance information detected in the most recent predetermined period, or of the predetermined number of pieces detected most recently, may be used. For example, values indicating variation, such as the variance and the deviation (standard deviation), are calculated. The fluctuation time of the deviation, the magnitude of the fluctuation of the deviation, the direction of the fluctuation of the deviation (increase/decrease), the time during which the deviation does not fluctuate, and the like are acquired as information regarding the "variation of distance information". Note that the variance and the deviation (standard deviation) can be calculated using known arithmetic expressions.
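  • A sketch of how the fluctuation and the deviation (standard deviation) might be tracked over the most recent samples; the class name and the window length of 30 frames are assumptions:

```python
from collections import deque
import statistics

class DistanceChannel:
    """Tracks fluctuation and variation of one ranging sensor's output."""

    def __init__(self, window: int = 30):
        self.samples: deque = deque(maxlen=window)

    def push(self, distance_m: float) -> None:
        self.samples.append(distance_m)

    def fluctuation(self) -> float:
        """Signed change between the two most recent distance values."""
        if len(self.samples) < 2:
            return 0.0
        return self.samples[-1] - self.samples[-2]

    def deviation(self) -> float:
        """Standard deviation over the most recent samples in the window."""
        if len(self.samples) < 2:
            return 0.0
        return statistics.stdev(self.samples)
```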
  • the presence or absence of a surrounding object 5, the distance to the object 5, the shape of the object 5, the type of the object 5, the size of the object 5, the material of the object 5, etc. may be output as a determination result.
  • the presence or absence of a fall danger point, the distance to the fall danger point, the type of fall danger point, the shape of the fall danger point, the size of the fall danger point, etc. can be output as the determination results.
  • other situations may also be determined.
  • the front side distance information and the ground side distance information acquired by the distance information acquisition unit 42 are information included in the surrounding information according to the present technology. Further, the determination result of the situation of the surrounding environment outputted by the situation judgment unit 43 is also information included in the surrounding information according to the present technology.
  • FIG. 14 is a flowchart showing an example of notification of surrounding information according to this embodiment.
  • the surrounding information notification system 41 starts up (step 201).
  • a power button or the like is installed on the sensor body 21, and the user 2 presses the power button or the like.
  • the power may be turned on by voice input by the user 2.
  • Automatic calibration of the ranging sensor is started (step 202).
  • automatic calibration is performed on the front-side ranging sensor 27 and the ground-side ranging sensor 28.
  • It is determined whether the automatic calibration results are normal (step 203). Specifically, the distance information (distance measurement value) of each ranging sensor is acquired, and it is determined whether the value is appropriate. For example, if the sensor body 21 is attached incorrectly, or if the hand of the user 2 or the like is covering a ranging sensor, the distance measurement value will not be appropriate and it will be determined that there is an abnormality. Likewise, if a ranging sensor is out of order, it is determined that there is an abnormality.
  • If the result is abnormal, a voice guide indicating an error is output from the speaker 11 (step 204). For example, a voice guide such as "Please check whether the device is worn properly" is output.
  • the user 2 corrects the orientation of each distance measuring sensor as a countermeasure for the error (step 205).
  • the distance information acquisition unit 42 acquires front side distance information and ground side distance information detected at a predetermined frame rate (step 206).
  • the situation determination unit 43 determines the situation of the surrounding environment based on fluctuations and dispersion of front side distance information and ground side distance information, which are two channels of distance information on the front side and the ground side. In this embodiment, determination results regarding obstacles and fall danger points are output. Specifically, it is determined whether an obstacle and a falling point are present (step 207).
  • FIG. 15 is a table showing an example of determination of the situation of the surrounding environment.
  • 16 to 18 are schematic diagrams for explaining the determination example shown in FIG. 15.
  • FIG. 16 is a schematic diagram showing a case where an obstacle exists in a position in the front direction of the user 2.
  • FIG. 17 is a schematic diagram showing a case where an obstacle exists on the ground 3 in front of the user 2.
  • FIG. 18 is a schematic diagram showing a case where a fall danger point exists at a position in the front direction of the user 2.
  • The situation determination unit 43 can determine that an obstacle 44 exists at a position in the front direction when the front side distance information becomes small and the fluctuation time of the deviation of the front side distance information is longer than a predetermined time. More specifically, it is possible to determine that an obstacle 44 with a height of at least H exists at a position in the front direction.
  • A threshold regarding the front side distance information may also be set. For example, when the front side distance information becomes smaller than a predetermined threshold value, it may be determined that the obstacle 44 exists. For example, when the front side distance becomes smaller than a threshold value such as 5 m or 10 m, it is determined that the obstacle 44 exists at a position in the front direction. This makes it possible to detect with high precision an obstacle 44 that exists within a distance where there is a possibility of collision.
  • the "predetermined time” that serves as the criterion for determination may be appropriately set when constructing the surrounding information notification system 41. For example, by arranging the obstacle 44 at a position in the front direction and performing calibration or the like, an appropriate time during which the obstacle 44 can be detected is calculated. A threshold value is set as a "predetermined time” based on the calculated time. For example, the calculated time may be directly used as the threshold, or a time close to the calculated time may be used as the threshold. When the variation time of the deviation of the front side distance is longer than the threshold value, it is possible to determine that the variation time of the deviation of the front side distance is "longer than a predetermined time". Of course, the settings are not limited to this.
  • In FIG. 16A, a car 45 exists as the obstacle 44. In FIG. 16B, an upward staircase 46 exists as the obstacle 44.
  • As the user 2 walks, the car 45 relatively approaches the measurement point P. When the car 45 comes close to the measurement point P, the ground side distance information becomes smaller. In other words, the ground side distance information does not substantially change until the car 45 approaches the measurement point P. At the timing when the ground side distance information becomes smaller, the front side distance information becomes approximately equal to the distance D from the user 2 to the measurement point P.
  • Similarly, as the user 2 walks, the upward staircase 46 relatively approaches the vicinity of the measurement point P. When the upward staircase 46 comes close to the measurement point P, the ground side distance information becomes smaller. In other words, the ground side distance information does not substantially change until the upward staircase 46 approaches the measurement point P. On the other hand, the step portions on the upper side of the upward staircase 46 are located further back in the front direction than the lowest step, away from the user 2. Therefore, when the upward staircase 46 approaches the measurement point P, the front side distance information takes a value larger than the distance D from the user 2 to the measurement point P. After that, as the user 2 moves toward the upward staircase, the ground side distance information becomes still smaller, and the deviation of the ground side distance information continues to fluctuate. That is, until the front side distance information becomes approximately equal to the distance D to the measurement point P, the ground side distance information keeps decreasing and its deviation continues to fluctuate.
  • In this way, it is possible to determine whether the obstacle 44 present in the front direction is, for example, a car 45 or an upward staircase 46 by focusing on the front side distance information at the timing when the ground side distance information becomes small.
  • Specifically, when the front side distance information becomes smaller and the fluctuation time of the deviation of the front side distance information is longer than a predetermined time, and the ground side distance information does not change until the front side distance information becomes smaller than a predetermined threshold, it is determined that an object other than an object constructed obliquely upward away from the user exists at a position in the front direction. On the other hand, when the front side distance information becomes smaller and the fluctuation time of its deviation is longer than the predetermined time, and the ground side distance information also becomes small with the fluctuation time of its deviation longer than a predetermined time, it is determined that an object constructed obliquely upward away from the user exists at a position in the front direction.
  • the "predetermined threshold” regarding the front distance information is set based on the distance D on the ground 3 from the user 2 to the measurement point P.
  • the distance D may be used as it is as the threshold value.
  • a value close to the distance D may be used as the threshold.
  • the "predetermined threshold” may be calculated by calibration or the like, or may be set arbitrarily by the user 2.
  • A "state in which the distance information does not change" includes not only a state in which the distance information does not change at all, but also a state in which there is almost no change. For example, a range with a relatively small width may be set, and a state in which the distance information falls within that range may be defined as the "state in which the distance information does not change."
  • In this embodiment, an object in the shape of an upward staircase is determined to be an "object constructed obliquely upward away from the user." Accordingly, an "object other than an object constructed obliquely upward away from the user" is an object other than an upward-staircase-shaped object.
  • a car 45 is an embodiment of an object other than an upwardly directed staircase-shaped object.
  • the object is not limited to the car 45, and includes any object other than an object shaped like an upward staircase.
  • an ascending staircase 46 is one embodiment of an upwardly directed staircase-shaped object.
  • the object is not limited to the upward staircase 46, and for example, an upward escalator is also included in the upward staircase-shaped object. Other arbitrary step-shaped objects are also included.
  • a "frontal obstacle” corresponds to an object other than an upward staircase-shaped object.
  • the "up stairs/up escalator” corresponds to a staircase-shaped object heading upward.
  • In the table of FIG. 15, the condition on the ground side distance information for distinguishing a "frontal obstacle" from "up stairs/up escalator" is evaluated over the period until the front side distance information becomes smaller than the predetermined threshold.
  • an obstacle 48 (object 5) whose height is lower than H exists on the ground 3.
  • an obstacle 48 whose height is lower than H will be referred to as a ground obstacle 48 using the same reference numeral.
  • the ground obstacle 48 shown in FIG. 17A is larger in size than the ground obstacle 48 shown in FIG. 17B.
  • In a state where the front side distance information does not change, when the ground side distance information becomes small and the fluctuation time of the deviation of the ground side distance information is longer than a predetermined time, it is determined that a ground obstacle 48 larger than a predetermined size exists on the ground 3. In addition, in a state where the front side distance information does not change, if the ground side distance information becomes small and the fluctuation time of the deviation is shorter than the predetermined time, it is determined that a ground obstacle 48 smaller than the predetermined size exists on the ground 3.
  • When the ground obstacle 48 shown in FIG. 17A exists on the ground 3, the fluctuation time of the deviation of the ground side distance information becomes longer than when the ground obstacle 48 shown in FIG. 17B exists. Based on the fluctuation time of the deviation of the ground side distance information, it is possible to determine the relative size of the ground obstacle 48 existing on the ground 3. That is, it is possible to determine whether the ground obstacle 48 is larger than a "predetermined size".
  • a threshold value (“predetermined time") is set for the variation time of the deviation of the ground side distance information. Then, when the variation time of the deviation of the ground side distance information is longer than the threshold value, it is determined that a ground obstacle 48 having a relatively large size exists on the ground 3. If the variation time of the deviation of the ground side distance information is shorter than the threshold value, it is determined that a ground obstacle 48 with a relatively small size exists on the ground 3.
  • a "predetermined size” that is a criterion for determining the size of the ground obstacle 48.
  • a "predetermined size” is set as a criterion for determining the size of the ground obstacle 48, and the ground side distance information is set so that determination can be made based on the set "predetermined size”.
  • a threshold value (“predetermined time”) may be set for the variation time of the deviation.
  • the threshold value (“predetermined time” or “predetermined size”) serving as a criterion for determination may be arbitrarily set by the system or the user.
  • the moving speed (walking speed) of the user 2 may be acquired and the threshold value may be set using the information.
  • The size of the ground obstacle 48 may be defined based on the height of the ground obstacle 48, the area of the ground obstacle 48 seen from above, or both of these parameters, and may be set arbitrarily.
  • Ground obstacle (large) listed in the table of FIG. 15 corresponds to a ground obstacle 48 that is larger than a predetermined size (relatively large in size) that exists on the ground 3.
  • “Ground obstacle (small)” corresponds to a ground obstacle 48 existing on the ground 3 that is smaller than a predetermined size (relatively small in size).
  • In FIG. 18A, a downward staircase 51 exists as the fall danger point 50 at a position in the front direction of the user 2. In FIG. 18B, an edge 52 of a station platform exists as the fall danger point 50 at a position in the front direction. In either case, a fall danger point 50, which is an area recessed downward, exists at a position in the front direction.
  • In FIG. 15, the front side distance information for a falling point is described as being unchanged or decreasing. However, the determination is not limited to this; regardless of the front side distance information, it may be determined that the fall danger point 50 exists when the ground side distance information becomes large.
  • In FIGS. 18A and 18B, a downward staircase 51 and an edge 52 of a platform are illustrated as fall danger points 50. However, the fall danger point is not limited to these, and it is also possible to determine, based on the ground side distance information, the presence of a downward escalator or any other fall danger point 50 where there is a risk of the user 2 falling. Note that the falling point in FIG. 15 corresponds to the fall danger point 50.
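  • The determination logic of FIG. 15 can be summarized by the sketch below; the input flags are assumed to be precomputed from the fluctuations and deviations of the two channels, and the threshold names and default values are hypothetical:

```python
def determine_situation(front_decreased: bool, front_dev_time_s: float,
                        ground_decreased: bool, ground_increased: bool,
                        ground_dev_time_s: float, front_unchanged: bool,
                        t_front_s: float = 1.0, t_ground_s: float = 1.0) -> str:
    """Classify the surroundings from the front/ground channels (cf. FIG. 15)."""
    if ground_increased:
        # The ground distance jumps up: a downward recess lies ahead.
        return "falling point (down stairs, down escalator, platform edge)"
    if front_decreased and front_dev_time_s > t_front_s:
        if ground_decreased and ground_dev_time_s > t_ground_s:
            return "up stairs / up escalator"
        return "frontal obstacle"
    if front_unchanged and ground_decreased:
        if ground_dev_time_s > t_ground_s:
            return "ground obstacle (large)"
        return "ground obstacle (small)"
    return "no obstacle / fall danger point"
```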
  • the notification information generation unit 18 and the notification control unit 19 execute notification processing of the obstacles 44 (48) and the falling danger points 50 (step 208).
  • FIG. 19 is a table showing an example of a process for notifying obstacles and falling danger points.
  • the notification information generation unit 18 determines the danger level for movement of the user 2 based on the determination result by the situation determination unit 43. Then, notification information is generated and output so as to correspond to the danger level.
  • In this embodiment, the danger level is determined to be higher when a fall danger point 50 exists than when an obstacle 44 (48) exists. For the fall danger point 50, the danger level is set to "high", while the danger level of the obstacle 44 (48) is set to "medium" or "low", which is lower than "high". This makes it possible to notify the user 2 more strongly of the danger of falling.
  • Further, the danger level is determined depending on the type of obstacle 44 (48). Specifically, the danger level of a "frontal obstacle" is set to "medium", since the degree of injury to the user 2 in the event of a collision is assumed to be moderate. The danger level of a "ground obstacle (large)" is set to "medium", since the possibility of tripping and the degree of injury to the user 2 in the event of a collision are moderate. The danger level of a "ground obstacle (small)" is set to "low", since the possibility of tripping and the degree of injury to the user 2 in the event of a collision are low. The danger level of "up stairs/up escalator" is set to "low", since the degree of injury to the user 2 is low.
  • Of course, the setting of the danger level is not limited to this example. The danger level may be dynamically set by the system based on data on the user's status at the time (gender, age, health condition, whether or not a hearing aid is worn, etc.), or it may be set arbitrarily by the user. For example, the degree of risk posed by falling may differ between a young person and an elderly person. Therefore, if the user 2 is a young person, the danger level of "ground obstacle (large)" may be set to "medium", but changed to "high" if the user 2 is an elderly person. Note that the distance D and the musical tone data may also be set as appropriate based on such user status data.
  • In this embodiment, notification by audio and notification by vibration are used together. Specifically, audio notification is performed for "frontal obstacles" and "up stairs/up escalators", and vibration notification is performed for "ground obstacles (large)", "ground obstacles (small)", and "down stairs, down escalators, and other falling points".
  • audio information is generated as notification information in order to notify the situation of the surrounding environment corresponding to the front direction.
  • vibration information is generated as notification information in order to notify the situation of the surrounding environment corresponding to the ground direction.
  • As a result, the user 2 can grasp the situation in the front direction through sound, and the situation in the ground direction through vibration. That is, it becomes possible to notify information about the surrounding environment with high accuracy.
  • simultaneous notification to the user 2 via voice as described in the first embodiment may be adopted.
  • Conversely, vibration information may be generated as the notification information for the situation of the surrounding environment corresponding to the front direction, and audio information may be generated as the notification information for the situation corresponding to the ground direction.
  • a "safety distance range” and a “notification distance range” are set.
  • the "safe distance range” is a distance range that is far from the obstacle 44 (48) or the fall danger point 50 and is determined to be safe. If the distance from the obstacle 44 (48) or fall danger point 50 is included in the "safe distance range", no notification is necessary and reproduction of the notification information is stopped. That is, the output of audio information from the speaker 11 and the output of the vibration pattern according to the vibration information of the vibration device 12 are stopped.
  • the “notification distance range” is a distance range in which it is determined that the obstacle 44 (48) or fall danger point 50 is approaching and that notification is necessary.
  • The notification according to the danger level is executed, for example, as follows:
  "Frontal obstacle" (danger level "medium"): a discontinuous mid-range sound is output.
  "Up stairs/up escalator" (danger level "low"): a discontinuous low-range sound is output.
  "Ground obstacle (large)" (danger level "medium"): a discontinuous mid-range vibration is output.
  "Ground obstacle (small)" (danger level "low"): a discontinuous high-range vibration is output.
  "Down stairs, down escalators, and other falling points" (danger level "high"): a discontinuous low-frequency vibration is output.
  In this way, by executing notification according to the danger level, highly accurate notification can be achieved, and the user 2 can intuitively grasp the danger level.
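  • A sketch of this danger-level-dependent output mapping (the modality and register names mirror the list above; the emit function and its print-based output are hypothetical stand-ins for driving the speaker 11 and vibration device 12):

```python
NOTIFICATION_MAP = {
    # situation: (modality, register, danger level)
    "frontal obstacle":            ("sound",     "mid",  "medium"),
    "up stairs / up escalator":    ("sound",     "low",  "low"),
    "ground obstacle (large)":     ("vibration", "mid",  "medium"),
    "ground obstacle (small)":     ("vibration", "high", "low"),
    "falling point (down stairs, down escalator, platform edge)":
                                   ("vibration", "low",  "high"),
}

def notify(situation: str) -> None:
    """Emit a discontinuous sound or vibration according to the danger level."""
    entry = NOTIFICATION_MAP.get(situation)
    if entry is None:
        return  # safe: no notification
    modality, register, level = entry
    print(f"output discontinuous {register}-range {modality} (danger: {level})")
```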
  • the "safety distance range” and the “notification distance range” can be set using, for example, a threshold related to distance. If the obstacle 44 (48) or the fall danger point 50 is further away than the threshold value, it is determined that the distance is within the "safe distance range”. If the obstacle 44 (48) or the falling danger point 50 is closer than the threshold value, it is determined that the obstacle 44 (48) or the fall danger point 50 is included in the "reported distance range”.
  • the threshold value is set, for example, based on the distance D on the ground 3 from the user 2 to the measurement point P.
  • the distance D may be used as it is as the threshold value.
  • a value close to the distance D may be used as the threshold.
  • For example, when the distance D from the user 2 to the measurement point P is relatively small, the distance D is used as it is as the threshold value. When the distance D from the user 2 to the measurement point P is relatively large, a value shorter than the distance D is used as the threshold value.
  • any other setting method may be adopted.
  • the method of determining the danger level, the method of outputting notification information, the method of setting the "safe distance range” and the “notification distance range”, etc. are not limited and may be set arbitrarily.
  • For example, personalization and customization may be freely performed based on the environment in which the user 2 moves (such as what objects are present on the route walked every day), information about the walking of the user 2 (such as walking speed), and so on.
  • FIG. 20 is a schematic diagram for explaining another example of the process for notifying obstacles and falling danger points.
  • the "notification distance range” is further divided into a “soft notification distance range” and a “danger notification distance range.”
  • the "soft notification distance range” is a distance range for notifying that the obstacle 44 (48) or fall danger point 50 is approaching. In other words, it is a distance range that notifies you of the need for vigilance.
  • The "danger notification distance range" is a distance range in which the obstacle 44 (48) or fall danger point 50 is close and the danger level is high. In other words, it is a distance range that notifies the user that they are about to collide with the obstacle 44 (48) or fall at the fall danger point 50.
  • 4 m is set as the distance indicating the boundary between the "safety distance range” and the “notification distance range”.
  • 2 m is set as the distance indicating the boundary between the "soft alert distance range” and the “danger alert distance range.”
  • That is, the range from 0 m to 2 m from an obstacle or fall danger point is the "danger notification distance range", and the range from 2 m to 4 m is the "soft notification distance range". The "safe distance range" is the range of 4 m or more from an obstacle or fall danger point.
  • Of course, the distance indicating the boundary between the "safe distance range" and the "notification distance range" may be set arbitrarily.
  • In this embodiment, the tempo of the discontinuously output sounds and vibrations is switched between the two notification ranges. In the "soft notification distance range", discontinuous sounds and vibrations are output at a relatively slow tempo. In the "danger notification distance range", discontinuous sounds and vibrations are output at a relatively fast tempo.
  • The control is not limited to this, and the intensity of the sound, the playback speed of a song, the BPM, the intensity of vibration, the frequency, and the like may be controlled.
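  • A sketch of the range-dependent tempo control with the 4 m / 2 m boundaries above (the tempo values themselves are assumptions):

```python
from typing import Optional

def notification_tempo(distance_m: float,
                       notify_boundary_m: float = 4.0,
                       danger_boundary_m: float = 2.0) -> Optional[float]:
    """Return a repetition tempo in Hz; None means playback is stopped."""
    if distance_m >= notify_boundary_m:
        return None   # "safe distance range": no notification
    if distance_m >= danger_boundary_m:
        return 1.0    # "soft notification distance range": slow tempo
    return 4.0        # "danger notification distance range": fast tempo
```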
  • the notification of surrounding information continues until the user 2 turns off the main body power.
  • the operation of the peripheral information notification system 41 ends (step 209).
  • FIG. 21 is a schematic diagram showing an example of detection of a "ground obstacle (large)" and a "ground obstacle (small)". As shown in FIG. 21, until the detection range of the ground side ranging sensor 28 reaches the ground obstacle 48, the ground side distance information remains unchanged, being based on the distance to the ground 3, and the deviation of the ground side distance information is stable. That is, as shown in FIG. 21, there is a "deviation stabilization period".
  • When detecting the ground obstacle 48, the deviation of the ground side distance information may take an unspecified value at the detection start timing and the detection end timing. Therefore, the deviation of the ground side distance information at the detection start timing of the ground obstacle 48 and at the detection end timing may be processed so as not to be included in the obstacle determination. That is, deviation data excluding the deviations at the detection start timing and the detection end timing is used as the data for determining an obstacle. This makes it possible to detect the ground obstacle 48 with high accuracy.
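  • For example, such exclusion might be realized by simply trimming the deviation samples at both ends of the detection interval (a sketch; the data layout is an assumption):

```python
def usable_deviation_data(deviations: list) -> list:
    """Drop the deviation samples at the detection start/end timings so that
    their unspecified values do not enter the obstacle determination."""
    return deviations[1:-1] if len(deviations) > 2 else []
```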
  • As described above, in the surrounding information notification system 41, the controller 8 generates notification information based on first detection information including the fluctuation and variation of the front side distance information (first distance information) detected by the front side ranging sensor 27 (first ranging sensor), and second detection information including the fluctuation and variation of the ground side distance information (second distance information) detected by the ground side ranging sensor 28 (second ranging sensor). This makes it possible to detect information about the surrounding environment with high accuracy and notify the user 2 of it.
  • In addition, danger avoidance can be realized not with camera recognition technology, which is ill-suited to low cost, weight reduction, and miniaturization, but with lightweight and inexpensive ranging sensors, making it possible to implement the above-mentioned accessibility device.
  • As shown in FIG. 15, by focusing on the fluctuations and variations of the distance information in the two channels of the "front direction" and the "ground direction", it becomes possible to grasp the surrounding environment and situation while walking in detail and instantly. In other words, it is possible to simultaneously detect "collision with an obstacle" and "fall due to an abnormality of the floor surface". This has the effect of improving the safety and sense of security of the visually impaired user 2.
  • the detection directions of the plurality of ranging sensors may be set arbitrarily. Based on the fluctuations and variations in distance information of a plurality of distance measuring sensors whose detection directions are set in various directions, it is possible to notify the user 2 of various environmental information with high accuracy.
  • the type of fall danger point may be able to be determined by appropriately setting the number of ranging sensors and the detection direction of each ranging sensor. For example, it may be possible to determine whether the point is a fall danger point in the shape of a staircase going downward or a fall danger point in another shape. Furthermore, it may be possible to determine that it is at the edge of a station platform. Then, the danger level may be determined according to the type of fall danger point.
  • Further, the threshold value for determining whether an obstacle is detected in the front direction and the distance D from the user 2 to the measurement point P may be adjusted based on the routes and paths that the user 2 takes every day. For example, route data frequently used by the user 2 is acquired using GPS or the like. Based on the route data, settings such as the maximum distance for obstacle detection in the front direction and the irradiation angle in the ground direction may be automatically adjusted (personalized).
  • For example, if the route has many upward staircases, settings suitable for detecting upward stairs are adopted. If the route has many fall danger points, settings suitable for detecting the fall danger points are adopted. The settings may also be automatically adjusted along the route.
  • A voice input device such as a microphone may also be provided, and settings such as the maximum distance for obstacle detection in the front direction and the irradiation angle in the ground direction may be adjusted by voice input from the user 2.
  • a motor mechanism, an actuator mechanism, or the like can be appropriately configured as a mechanism for automatically adjusting settings such as the maximum distance for obstacle detection in the front direction and the irradiation angle in the ground direction.
  • the configuration that can control the detection direction of each ranging sensor can also be said to be a direction control unit that changes at least one of the first direction and the second direction.
  • It may also be possible to selectively switch between a mode in which only the front side ranging sensor 27 is driven, a mode in which only the ground side ranging sensor 28 is driven, and a mode in which both ranging sensors 27 and 28 are driven.
  • a mode in which only important channels are driven may be automatically set depending on the remaining battery level, or the user 2 may be able to set the mode as appropriate.
  • If an image sensor or the like is installed and a person can be detected based on image information, the detected person may be excluded from obstacle detection. Further, as the configuration of the sensor unit 6, an integrated configuration with a smartphone may be adopted by utilizing the ranging sensor or the like mounted on the smartphone.
  • the distance information acquisition section 42 corresponds to one embodiment of the distance information acquisition section according to the present technology.
  • the situation determining unit 43 corresponds to an embodiment of the situation determining unit according to the present technology.
  • the notification information generation section 18 and the notification control section 19 function as an embodiment of a notification section that generates and outputs notification information for notifying the determination result by the situation determination section.
  • FIG. 22 is a block diagram showing a configuration example for realizing notification of surrounding information according to the third embodiment.
  • the sensor unit 6 is further equipped with a 9-axis sensor 55 and a GPS 56.
  • the 9-axis sensor 55 includes a 3-axis acceleration sensor, a 3-axis gyro sensor, and a 3-axis compass sensor.
  • the nine-axis sensor 55 can detect acceleration, angular velocity, and orientation in three axes of the sensor section 6 (sensor main body 21).
  • An IMU (Inertial Measurement Unit) sensor may also be used.
  • the GPS 56 acquires information on the current position of the sensor section 6 (sensor main body 21). Further, a sensor that acquires biological information such as pulse, heartbeat, body temperature, and brain waves may be used as necessary.
  • The controller 8 further includes a self-position estimation unit 57 and a map information generation unit 58. These blocks are realized by the processor of the controller 8 executing a program according to the present technology. Dedicated hardware such as an IC (integrated circuit) may be used as appropriate to realize each functional block.
  • the self-position estimating section 57 estimates the self-position of the sensor section 6 (sensor main body 21).
  • the self-position includes the position and orientation of the sensor body 21.
  • the self-position estimating unit 57 can calculate position information indicating where the sensor body 21 is located and posture information such as which direction the sensor body 21 is facing. Furthermore, based on the posture information of the sensor body 21, it is possible to detect which direction the user 2 is currently facing. In other words, it is possible to detect which direction the front distance measuring sensor 27 is facing.
  • the attitude, position, and movement (movement) of the sensor body 21 can also be regarded as the attitude, position, and movement (movement) of the user 2.
  • the self-position of the sensor body 21 is calculated based on the detection results from the sensor section 6.
  • an image sensor or the like for acquiring surrounding image information may be installed.
  • For example, a three-dimensional coordinate system is set for the surrounding space. Coordinate values (for example, XYZ coordinate values) in an absolute coordinate system (world coordinate system) may be used as the position information, or coordinate values (for example, xyz coordinate values or uvd coordinate values) in a relative coordinate system with a predetermined point as the reference (origin) may be used. In the latter case, the origin serving as the reference may be set arbitrarily.
  • the self-position estimating unit 57 calculates position coordinates in the set three-dimensional coordinate system.
  • As the posture information, for example, with the X axis as the pitch axis, the Y axis as the roll axis, and the Z axis as the yaw axis, a pitch angle, a roll angle, and a yaw angle are calculated with respect to the front direction of the user 2 (sensor body 21).
  • the specific format of the position information and posture information of the user 2 (sensor body 21) is not limited.
  • The algorithm for estimating the self-position of the sensor body 21 is not limited, and any algorithm such as SLAM (Simultaneous Localization and Mapping) may be used. Any machine learning algorithm or the like may also be used.
  • the map information generation unit 58 generates an obstacle space map corresponding to the surrounding environment based on the history of determination results by the situation determination unit 43.
  • the obstacle space map corresponds to one embodiment of surrounding map information according to the present technology.
  • the situation determination unit 43 can detect the obstacles 44 (48) and the fall danger points 50.
  • FIG. 23 is a flowchart showing an example of notification of surrounding information according to this embodiment. Steps 301-307 in FIG. 23 are similar to steps 201-207 shown in FIG. 14. In this embodiment, in step 308, the map information generation unit 58 generates an obstacle space map.
  • FIG. 24 is a schematic diagram showing an example of an obstacle space map.
  • a three-dimensional coordinate system is set in which the XY direction is the horizontal direction and the Z direction is the height direction.
  • In step 308, based on the self-position of the user 2 and the yaw angle (rotation angle about the Z axis) of the ranging sensor (front side ranging sensor 27), the XY coordinate values of the detected obstacles 44 (48) and fall danger points 50 are calculated as position information. As a result, an obstacle space map 60 containing the position information of the detected obstacles 44 (48) and fall danger points 50 is generated.
  • the obstacles 44 (48) and fall danger points 50 are all schematically illustrated in the same manner.
  • It is also possible to generate an obstacle space map 60 that includes information such as the types of the obstacles 44 (48) and the types of the fall danger points 50. That is, an obstacle space map 60 that includes the spatial position information, types, and attributes of the obstacles 44 (48) and fall danger points 50 may be generated.
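  • A sketch of registering a detection into such a map from the self-position and the yaw angle (the class and field names are hypothetical):

```python
import math
from dataclasses import dataclass, field
from typing import List

@dataclass
class AvoidanceObject:
    x: float   # XY position on the obstacle space map
    y: float
    kind: str  # e.g. "frontal obstacle", "falling point"

@dataclass
class ObstacleSpaceMap:
    objects: List[AvoidanceObject] = field(default_factory=list)

    def register(self, user_x: float, user_y: float, yaw_rad: float,
                 distance_m: float, kind: str) -> None:
        """Project a detection onto the XY plane using self-position and yaw."""
        self.objects.append(AvoidanceObject(
            x=user_x + distance_m * math.cos(yaw_rad),
            y=user_y + distance_m * math.sin(yaw_rad),
            kind=kind))
```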
  • the obstacles 44 (48) and the fall danger points 50 may be collectively referred to as objects to be avoided.
  • the object to be avoided can also be called a dangerous object.
  • the notification information generation unit 18 and notification control unit 19 execute notification via audio.
  • In this embodiment, notification using stereophonic sound is performed for the object to be avoided that is closest to the user 2 on the obstacle space map 60. That is, the localization of the audio information is set so that the sound is heard from the position of the closest object to be avoided relative to the current orientation (front direction) of the user 2. In addition, the audio information is output such that the volume is attenuated according to the distance to the object to be avoided.
  • In this embodiment, notification using stereophonic sound is performed for the object to be avoided at the closest distance from the user 2, but which objects to be avoided are targeted for stereophonic notification may be set arbitrarily.
  • notification using stereophonic sound may be performed regarding the object to be avoided that is closest to the user 2 and the object to be avoided that is the second closest to the user 2 . Furthermore, if a plurality of objects to be avoided are located at the same distance from the user 2, notification using stereophonic sound may be performed for all of the plurality of objects.
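  • A sketch of how the localization and volume of the stereophonic notification might be derived for an object on the map (it builds on the hypothetical ObstacleSpaceMap above; the distance-attenuation law is an assumption):

```python
import math
from typing import Tuple

def spatial_audio_params(obj_x: float, obj_y: float,
                         user_x: float, user_y: float,
                         user_yaw_rad: float) -> Tuple[float, float]:
    """Return (azimuth_rad, volume): azimuth relative to the user's front
    direction, volume attenuated with distance to the object to be avoided."""
    dx, dy = obj_x - user_x, obj_y - user_y
    distance = math.hypot(dx, dy)
    azimuth = math.atan2(dy, dx) - user_yaw_rad  # 0 = straight ahead
    volume = 1.0 / (1.0 + distance)              # assumed attenuation law
    return azimuth, volume
```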
  • For example, consider a case where notification is performed only in response to the detection of an object to be avoided that exists at a position in the front direction.
  • With the obstacle space map 60 of FIG. 24A, it is possible to detect and notify an object to be avoided that exists at a position in front of the user 2.
  • Suppose that the user 2 changes the direction of travel to the right in order to avoid the notified object to be avoided. If detection and notification end when the previously detected object to be avoided deviates from the front direction of the user 2, the user 2 may then walk right past the object without it being detected. As a result, the user may collide with the end of the obstacle 44 (48) or fall from the end of the fall danger point 50.
  • Further, depending on the orientation and movement of the sensor unit 6 (sensor body 21), the detection of the object to be avoided may fail and the notification may end. In that case, the user 2 may move toward the object to be avoided that was detected immediately before, causing a collision with, or a fall from, that object. That is, it may become impossible to avoid an object to be avoided even once it has been detected.
  • In this embodiment, an obstacle space map 60 is generated that includes the position information of objects to be avoided that have been detected in the past. An object to be avoided that is close to the user 2 is then notified by stereophonic sound so that its direction relative to the user 2 is reflected. Thereby, even if the object to be avoided deviates from the front direction of the user 2, it is possible to notify the user 2 of it. This makes it possible to solve the above-mentioned problems and improve the success rate of avoiding a detected object to be avoided.
  • the notification method described in the second embodiment and the notification method described in the third embodiment may be used together. That is, both real-time notification for the front direction and ground direction and notification using stereophonic sound using the obstacle space map 60 may be performed.
  • the notification method described in the first embodiment may be used in combination.
  • If the user 2 is also hearing-impaired (for example, hard of hearing, or using a hearing aid or sound collector), notification using vibration instead of stereophonic sound is also effective. That is, vibration is presented to the body part corresponding to the position of the closest object to be avoided relative to the user 2's current direction (front direction). Further, the strength of the vibration may be attenuated according to the distance to the object to be avoided.
  • Whether to notify using stereophonic sound or vibration may be set appropriately by the user 2. Further, if the user 2 uses a hearing aid, a sound collector, or the like, processing such as hearing-aid processing may be applied to the stereophonic sound used for notification.
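  • As a rough sketch of the vibration alternative described above (assuming, purely for illustration, a belt of evenly spaced vibration motors around the body), the azimuth of the closest object to be avoided can be mapped to a motor index, with intensity attenuated by distance:

```python
import math

def vibration_command(azimuth_rad, dist_m, num_motors=8, max_range_m=10.0):
    """Map an object's azimuth (relative to the front direction) to one of
    num_motors motors arranged clockwise around the body, and derive a
    distance-attenuated vibration strength in [0, 1]."""
    sector = (azimuth_rad % (2 * math.pi)) / (2 * math.pi)
    motor_index = int(sector * num_motors) % num_motors
    strength = max(0.0, 1.0 - dist_m / max_range_m)
    return motor_index, strength
```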
  • FIG. 25 is a schematic diagram showing another configuration example of the surrounding information notification system according to the present embodiment.
  • the peripheral information notification system 64 is realized by the controller 8 shown in FIG. 1 and the server device 63 arranged on the network 62 working together.
  • the network 62 is constructed by, for example, the Internet or a wide area communication network.
  • any WAN (Wide Area Network), LAN (Local Area Network), etc. may be used, and the protocol for constructing the network 62 is not limited.
  • the server device 63 includes hardware necessary for configuring a computer, such as a CPU, ROM, RAM, and HDD.
  • the server device 63 can be realized by any computer such as a PC (Personal Computer).
  • the map information generation unit 58 is implemented by the server device 63.
  • the determination result by the situation determination unit 43 ("obstacle/fall danger point information" in the figure) is transmitted to the server device 63 via the network 62.
  • information on the self-position estimated by the self-position estimation unit 57 is also transmitted to the server device 63 via the network 62.
  • In the server device 63, a history of determination results by the situation determination unit 43 is stored in an obstacle information DB. The map information generation unit 58 then generates an obstacle space map 60 as illustrated in FIG. 24. The obstacle space map 60 generated by the server device 63 is transmitted to the controller 8 via the network 62.
  • the DB may be constructed in a storage device within the server device 63, or may be constructed in an external storage device that the server device 63 can access.
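  • The server-side flow can be pictured with the following sketch: the obstacle information DB accumulates "obstacle/fall danger point information" keyed by the position at which it was observed, and the obstacle space map 60 is derived from that history. The class and method names and the grid-cell representation are illustrative assumptions only.

```python
from collections import defaultdict

class ObstacleInfoDB:
    """Illustrative stand-in for the obstacle information DB on the server
    device 63: stores a history of situation-determination results keyed by
    map grid cell."""
    def __init__(self, cell_size_m=0.5):
        self.cell_size = cell_size_m
        self.history = defaultdict(list)  # (ix, iy) -> list of results

    def add(self, position, result):
        """position: (x, y) from self-position estimation; result: the
        'obstacle/fall danger point information' for that observation."""
        cell = (int(position[0] // self.cell_size),
                int(position[1] // self.cell_size))
        self.history[cell].append(result)

    def build_obstacle_space_map(self):
        """Derive an obstacle space map: latest result per observed cell."""
        return {cell: results[-1] for cell, results in self.history.items()}
```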
  • the notification information generation unit 18 and notification control unit 19 of the controller 8 execute notification using stereophonic sound based on the received obstacle space map 60.
  • In this way, the surrounding information notification system 64 can be realized as a cloud system (cloud computing).
  • Even if the edge terminal on the user 2 side does not have high processing capacity, notification using a wide-range obstacle space map 60 can be realized.
  • This makes it possible to provide an accessibility device that is lightweight and inexpensive, yet can report information about the surrounding environment with high accuracy.
  • the server device 63 on the network 62 may integrate "obstacle/fall danger point information" sent from multiple users 2 using the surrounding information notification system 64. Then, the obstacle space map 60 for each user 2 may be generated based on the integrated "obstacle/fall danger point information".
  • the sensor main body 21 worn by the user 2 walking in a certain area performs a search in the front direction and the ground direction.
  • “Obstacle/fall danger point information” related to the detected object to be avoided is transmitted from the controller 8 to the server device 63 via the network 62.
  • the server device 63 integrates the information on the avoidable object detected by the user 2's search and the information on the avoidable object detected by the other user 2's search, and stores it in the obstacle information DB. Then, an obstacle space map 60 is generated that includes both the positional information of the avoidable object detected by the user 2's search and the positional information of the avoidable object detected by the search of another user 2. The generated obstacle space map 60 is transmitted to both the user 2 and other users 2 via the network 62.
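  • Integrating detections from multiple users then amounts to merging their contributions before building the shared map; a minimal union-style merge (all names assumed for illustration) could look like:

```python
def merge_detections(per_user_maps):
    """Merge obstacle space maps contributed by multiple users: a cell is
    marked if any user's search detected an object to be avoided there."""
    merged = {}
    for user_map in per_user_maps:
        for cell, info in user_map.items():
            merged.setdefault(cell, info)  # keep the first report per cell
    return merged
```

  • In practice such a system would presumably also age out stale reports (a parked car moves; fallen objects get cleared), which is one reason to keep the full history per cell as in the DB sketch above.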
  • FIG. 26 is a schematic diagram showing a configuration example of a surrounding information notification system according to the fourth embodiment.
  • the surrounding information notification system 66 according to this embodiment is configured by a cloud system. That is, the peripheral information notification system 66 is realized by the controller 8 shown in FIG. 1 and the server device 63 arranged on the network 62 working together.
  • the surrounding information notification system 66 includes a guide device 67 that is communicably connected to the controller 8 and the server device 63 via the network 62.
  • the guidance device 67 has a display and is used as a remote terminal by an operator 68 (a sender of route guidance and the like) who provides route guidance to the user 2.
  • the route guidance can also be called a walking guidance notification.
  • the server device 63 stores the information on the object to be avoided ("obstacle/fall danger point information") sent from the user 2 in the obstacle information DB. Further, the server device 63 has built therein a real world map information DB in which map information of the real world is stored. For example, map information of various regions is acquired from a map server that provides map services on the network 62, and stored in the real world map information DB.
  • FIG. 27 is a flowchart showing an example of notification of surrounding information according to this embodiment. Steps 401-408 in FIG. 27 are similar to steps 301-308 shown in FIG. 23.
  • In step 408, the controller 8 transmits the determination result by the situation determination unit 43 (“obstacle/fall danger point information”) and the self-position information estimated by the self-position estimation unit 57 to the server device 63 via the network 62. Then, the map information generation unit 58 in the server device 63 generates an obstacle space map 60.
  • Furthermore, the map information generation unit 58 generates a real space dangerous object map 69 in which surrounding real-world information is added to the obstacle space map 60, based on the user 2's position information in the real world.
  • the obstacle space map 60 and real world map information are linked to generate a real space dangerous object map 69 to which landmark information and the like are added (step 409).
  • The real space dangerous object map 69 includes the user 2's real-world position information, the real-world position information and attribute information of the objects to be avoided (dangerous objects) such as the obstacles 44 (48) and the fall danger points 50, and real-world information such as landmark information. Note that the real-world information includes arbitrary geographic information such as place names and topography.
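  • Linking the obstacle space map with real-world map information can be sketched as follows; the landmark lookup and the coordinate transform are illustrative assumptions, not the patent's method.

```python
def build_real_space_map(obstacle_map, landmarks, map_to_world):
    """Attach real-world information (here: the nearest landmark) to each
    dangerous-object entry of the obstacle space map.

    obstacle_map: {cell: attribute_info} in map coordinates.
    landmarks: non-empty list of (name, (wx, wy)) in real-world coordinates.
    map_to_world: function converting a map cell to real-world coordinates.
    """
    entries = []
    for cell, attribute in obstacle_map.items():
        wx, wy = map_to_world(cell)
        name, _ = min(landmarks,
                      key=lambda lm: (lm[1][0] - wx) ** 2 + (lm[1][1] - wy) ** 2)
        entries.append({"world_pos": (wx, wy),
                        "attribute": attribute,
                        "near_landmark": name})
    return entries
```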
  • the real space dangerous object map 69 is an embodiment of real surrounding map information according to the present technology.
  • the real space dangerous object map 69 is transmitted to the guide device 67 via the network 62 and displayed on the display of the guide device 67.
  • the display mode of the real space dangerous object map 69 is not limited and may be set arbitrarily.
  • an icon of an object to be avoided may be superimposed on map information of the real world, and when the icon is selected, detailed information of the object to be avoided may be displayed.
  • the display area of the display may be divided and information regarding the avoidable object may be displayed in a list.
  • The operator 68 executes route guidance using the real space dangerous object map 69. For example, it becomes possible to provide route guidance that combines real-world information with information about dangerous objects to be avoided, such as "There are stairs leading down to the entrance of ○○ station 5 meters ahead. Please proceed with caution.", "A car is parked at the entrance of the ○○ parking lot. Please stop temporarily.", or "There is a large object fallen on the ground. Please slow down." As a result, the user 2 can obtain information that a healthy person would normally obtain, and safety in avoiding danger can be further improved.
  • the notification information generation unit 18 in the controller 8 receives guidance information including the contents of the route guidance of the operator 68 (step 410). Based on the received guidance information, the notification control unit 19 outputs the contents of the route guidance from the speaker 11.
  • the configuration and method for transmitting the guidance information to the controller 8 on the user 2 side via the network 62 are not limited.
  • the guidance information can be transmitted using a well-known technique using a voice input device such as a microphone of the guidance device 67, a communication device, or the like.
  • The route guidance provided by the operator 68 can also be regarded as notification using real-world information based on the real space dangerous object map 69. Therefore, the mechanism provided in the guidance device 67 for transmitting guidance information including the contents of the operator 68's route guidance can be said to function as a "notification unit" in the surrounding information notification system 66.
  • Automatic voice route guidance based on the real space dangerous object map 69 may be performed instead of the route guidance by the operator 68. Automatic voice route guidance based on the real space dangerous object map 69 is also included in the notification using real world information based on the real space dangerous object map 69. Further, the mechanism that executes route guidance using automatic voice functions as a "notification unit" in the surrounding information notification system 66.
  • the real space dangerous object map 69 may be transmitted to the controller 8 on the user 2 side via the network 62. Then, the notification information generation section 18 and the notification control section 19 may perform route guidance or the like based on the real space dangerous object map 69.
  • Furthermore, as a company that creates smart accessibility products, the company's brand image and value of existence as a company aiming to contribute to society can be expected to be enhanced.
  • FIG. 28 is a schematic diagram showing another example of a method of outputting sound according to distance.
  • musical tone information such as a predetermined song is played in the "safe distance range.” For example, songs that the user 2 likes are played.
  • When an object to be avoided is detected, a detection notification sound indicating the detection is faded in so as to be superimposed on the musical tone information.
  • As the detection notification sound, for example, a discontinuous mid-range sound is output.
  • the mixing amount for the detection notification sound is increased linearly, but the present invention is not limited to this, and various other fade-in controls may be adopted.
  • both the musical tone information and the detection notification sound are output at the maximum standard level.
  • both the musical tone information and the detection notification sound are faded out.
  • the mixing amount for the musical tone information is reduced in a curve (rapidly), and the mixing amount for the detection notification sound is reduced linearly (reduced at a constant rate).
  • Various such fade-out controls may be employed.
  • the danger notification sound indicating that danger is approaching is faded in.
  • As the danger notification sound, for example, a discontinuous high-frequency sound is output.
  • The danger notification sound becomes louder, up to the maximum standard level, to strongly alert the user 2.
  • If a simple buzzer sound is output whenever an object to be avoided is detected, the buzzer may sound continuously in crowded station premises, inside an elevator, or when using an escalator, which can be unpleasant.
  • music information of "favorite music” is used as the main sound source, and detection notification sound such as sonar sound is mixed in a fading manner according to the approach distance to the object to be avoided. To go. When the object to be avoided is extremely close, the main music, detection notification sound, and danger notification sound are cross-faded and mixed. Such notification becomes possible, and it becomes possible to eliminate notifications that are unpleasant for the user 2.
  • the object to be avoided is a person (not the object to be avoided)
  • The necessary distance measurement channels may be automatically switched based on real-world map information or the like. For example, in the configuration shown in FIG. 8, sensing in the front direction by the front-side distance measurement sensor 27 and sensing in the ground direction by the ground-side distance measurement sensor 28 may be automatically switched based on surrounding information. For example, when the user 2 is moving on a station platform, sensing toward the ground can be prioritized.
  • The walking speed of the user 2 may be estimated from changes in the front-side distance information and used, together with the duration of the deviation in the ground-side distance information, to determine the size of the ground obstacle 48.
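  • A minimal sketch of that estimate, under the assumption that the front-side sensor is looking at a roughly static surface while the user walks toward it:

```python
def walking_speed(front_distances_m, dt_s):
    """Estimate walking speed from consecutive front-side distance samples:
    toward a static surface, the distance shrinks at the walking speed."""
    diffs = [a - b for a, b in zip(front_distances_m, front_distances_m[1:])]
    if not diffs:
        return 0.0
    return max(0.0, sum(diffs) / (len(diffs) * dt_s))

def ground_obstacle_size(speed_mps, deviation_duration_s):
    """Approximate the along-track size of a ground obstacle 48 from how
    long the ground-side distance deviates while the user passes it."""
    return speed_mps * deviation_duration_s
```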
  • By applying the peripheral information notification system according to the present technology, it is also possible to realize a device compatible with a speech-type UI for visually impaired people.
  • A surrounding information notification system according to the present technology may also be constructed for healthy people. For example, a front-side distance measuring sensor whose detection direction is the front direction and a back-side distance measuring sensor whose detection direction is the back direction (toward the rear side) may be arranged. Detection of a suspicious person sneaking up from behind may then be performed based on the back-side distance information detected by the back-side distance measuring sensor.
  • The surrounding information notification system according to the present technology may be provided to blindfolded healthy people at a theme park or the like where they can experience what blind people experience. Furthermore, when a healthy person uses a smartphone while walking, a surrounding information notification system according to the present technology may be constructed by attaching a distance measurement sensor or the like facing the same direction as the outward-facing camera. Further, a surrounding information notification system according to the present technology may be constructed for a vehicle, a drone, or the like, and the surrounding information may be notified to a pilot or the like.
  • FIG. 29 is a block diagram showing an example of a hardware configuration of a computer (information processing device) 70 that can be used to construct a peripheral information notification system according to the present technology.
  • the computer 70 includes a CPU 71, a ROM 72, a RAM 73, an input/output interface 75, and a bus 74 that connects these to each other.
  • a display section 76, an input section 77, a storage section 78, a communication section 79, a drive section 80, and the like are connected to the input/output interface 75.
  • the display section 76 is a display device using, for example, liquid crystal, EL, or the like.
  • the input unit 77 is, for example, a keyboard, pointing device, touch panel, or other operating device.
  • the input section 77 includes a touch panel
  • the touch panel can be integrated with the display section 76.
  • the storage unit 78 is a nonvolatile storage device, such as an HDD, flash memory, or other solid-state memory.
  • the drive section 80 is a device capable of driving a removable recording medium 81 such as an optical recording medium or a magnetic recording tape.
  • the communication unit 79 is a modem, router, or other communication equipment that can be connected to a LAN, WAN, etc., and is used to communicate with other devices. The communication unit 79 may communicate using either wired or wireless communication. The communication unit 79 is often used separately from the computer 70.
  • Information processing by the computer 70 having the above hardware configuration is realized by cooperation between software stored in the storage unit 78 or the ROM 72, and hardware resources of the computer 70.
  • the information processing method according to the present technology is realized by loading a program constituting software stored in the ROM 72 or the like into the RAM 73 and executing it.
  • The program is installed on the computer 70 via, for example, the removable recording medium 81.
  • the program may be installed on the computer 70 via a global network or the like.
  • any computer-readable non-transitory storage medium may be used.
  • The information processing method (peripheral information notification method) and program according to the present technology may be executed by multiple computers communicably connected via a network or the like, thereby constructing the information processing device according to the present technology. That is, the information processing method and program according to the present technology can be executed not only in a computer system configured by a single computer but also in a computer system in which multiple computers operate in conjunction with each other. Note that in the present disclosure, a system means a collection of multiple components (devices, modules (parts), etc.), and it does not matter whether all the components are in the same housing.
  • a plurality of devices housed in separate casings and connected via a network and a single device in which a plurality of modules are housed in one casing are both systems.
  • Execution of the information processing method and program according to the present technology by a computer system includes both the case where, for example, acquisition of surrounding information (obtaining distance information, situation determination), generation of notification information (generation of audio information, generation of vibration information), and notification control are executed by a single computer, and the case where each process is executed by a different computer.
  • execution of each process by a predetermined computer includes causing another computer to execute part or all of the process and acquiring the results. That is, the information processing method and program according to the present technology can also be applied to a cloud computing configuration in which one function is shared and jointly processed by a plurality of devices via a network.
  • In the present disclosure, expressions such as "centered", "uniform", "equal", "identical", "orthogonal", "parallel", "symmetrical", "extending", "axial", "cylindrical", "columnar", "ring-shaped", and "annular" include not only the exact states ("perfectly centered", "perfectly uniform", and so on) but also states that fall within a predetermined range (for example, a ±10% range) of those states. Therefore, even when words such as "approximately" or "substantially" are not added, concepts that could be expressed by adding "approximately" or "substantially" may be included. Conversely, when a state is expressed with words such as "approximately" or "substantially", the exact state is not necessarily excluded.
  • the technology according to the present disclosure can be applied to various products.
  • For example, the technology according to the present disclosure may be realized as a device mounted on any type of mobile body, such as a car, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility vehicle, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).
  • FIG. 30 is a block diagram showing a schematic configuration example of a vehicle control system 7000, which is an example of a mobile object control system to which the technology according to the present disclosure can be applied.
  • Vehicle control system 7000 includes multiple electronic control units connected via communication network 7010.
  • The vehicle control system 7000 includes a drive system control unit 7100, a body system control unit 7200, a battery control unit 7300, a vehicle exterior information detection unit 7400, an in-vehicle information detection unit 7500, and an integrated control unit 7600.
  • The communication network 7010 connecting these control units may be, for example, an in-vehicle communication network conforming to an arbitrary standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark).
  • Each control unit includes a microcomputer that performs arithmetic processing according to various programs, a storage unit that stores the programs executed by the microcomputer or parameters used in various calculations, and a drive circuit that drives the devices to be controlled. Each control unit also includes a network I/F for communicating with other control units via the communication network 7010, and a communication I/F for communicating with devices or sensors inside and outside the vehicle by wired or wireless communication. In FIG. 30, as the functional configuration of the integrated control unit 7600, a microcomputer 7610, a general-purpose communication I/F 7620, a dedicated communication I/F 7630, a positioning unit 7640, a beacon receiving unit 7650, an in-vehicle device I/F 7660, an audio/image output unit 7670, an in-vehicle network I/F 7680, and a storage unit 7690 are illustrated.
  • the other control units similarly include a microcomputer, a communication I/F, a storage section, and the like.
  • the drive system control unit 7100 controls the operation of devices related to the drive system of the vehicle according to various programs.
  • For example, the drive system control unit 7100 functions as a control device for a driving force generation device, such as an internal combustion engine or a drive motor, that generates the driving force of the vehicle, a driving force transmission mechanism that transmits the driving force to the wheels, a steering mechanism that adjusts the steering angle of the vehicle, and a braking device that generates the braking force of the vehicle.
  • the drive system control unit 7100 may have a function as a control device such as ABS (Antilock Brake System) or ESC (Electronic Stability Control).
  • a vehicle state detection section 7110 is connected to the drive system control unit 7100.
  • The vehicle state detection unit 7110 includes, for example, at least one of a gyro sensor that detects the angular velocity of the axial rotational motion of the vehicle body, an acceleration sensor that detects the acceleration of the vehicle, and sensors for detecting the operation amount of the accelerator pedal, the operation amount of the brake pedal, the steering angle of the steering wheel, the engine speed, the wheel rotation speed, and the like.
  • the drive system control unit 7100 performs arithmetic processing using signals input from the vehicle state detection section 7110, and controls the internal combustion engine, the drive motor, the electric power steering device, the brake device, and the like.
  • the body system control unit 7200 controls the operations of various devices installed in the vehicle body according to various programs.
  • the body system control unit 7200 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as a headlamp, a back lamp, a brake lamp, a turn signal, or a fog lamp.
  • radio waves transmitted from a portable device that replaces a key or signals from various switches may be input to the body control unit 7200.
  • the body system control unit 7200 receives input of these radio waves or signals, and controls the door lock device, power window device, lamp, etc. of the vehicle.
  • the battery control unit 7300 controls the secondary battery 7310, which is a power supply source for the drive motor, according to various programs. For example, information such as battery temperature, battery output voltage, or remaining battery capacity is input to the battery control unit 7300 from a battery device including a secondary battery 7310. The battery control unit 7300 performs arithmetic processing using these signals, and controls the temperature adjustment of the secondary battery 7310 or the cooling device provided in the battery device.
  • the external information detection unit 7400 detects information external to the vehicle in which the vehicle control system 7000 is mounted. For example, at least one of an imaging section 7410 and an external information detection section 7420 is connected to the vehicle exterior information detection unit 7400.
  • the imaging unit 7410 includes at least one of a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras.
  • The vehicle exterior information detection unit 7420 includes, for example, at least one of an environmental sensor for detecting the current weather or meteorological conditions, and a surrounding information detection sensor for detecting other vehicles, obstacles, pedestrians, and the like around the vehicle equipped with the vehicle control system 7000.
  • the environmental sensor may be, for example, at least one of a raindrop sensor that detects rainy weather, a fog sensor that detects fog, a sunlight sensor that detects the degree of sunlight, and a snow sensor that detects snowfall.
  • the surrounding information detection sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) device.
  • the imaging section 7410 and the vehicle external information detection section 7420 may be provided as independent sensors or devices, or may be provided as a device in which a plurality of sensors or devices are integrated.
  • FIG. 31 shows an example of the installation positions of the imaging section 7410 and the vehicle external information detection section 7420.
  • The imaging units 7910, 7912, 7914, 7916, and 7918 are provided at, for example, at least one of the following positions of the vehicle 7900: the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield inside the vehicle.
  • An imaging unit 7910 provided in the front nose and an imaging unit 7918 provided above the windshield inside the vehicle mainly acquire images in front of the vehicle 7900.
  • Imaging units 7912 and 7914 provided in the side mirrors mainly capture images of the sides of the vehicle 7900.
  • An imaging unit 7916 provided in the rear bumper or back door mainly acquires images of the rear of the vehicle 7900.
  • the imaging unit 7918 provided above the windshield inside the vehicle is mainly used to detect preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
  • FIG. 31 shows an example of the imaging ranges of the imaging units 7910, 7912, 7914, and 7916. Imaging range a indicates the imaging range of the imaging unit 7910 provided on the front nose, imaging ranges b and c indicate the imaging ranges of the imaging units 7912 and 7914 provided on the side mirrors, respectively, and imaging range d indicates the imaging range of the imaging unit 7916 provided on the rear bumper or back door. For example, by superimposing the image data captured by the imaging units 7910, 7912, 7914, and 7916, an overhead image of the vehicle 7900 viewed from above can be obtained.
  • the external information detection units 7920, 7922, 7924, 7926, 7928, and 7930 provided at the front, rear, sides, corners, and the upper part of the windshield inside the vehicle 7900 may be, for example, ultrasonic sensors or radar devices.
  • External information detection units 7920, 7926, and 7930 provided on the front nose, rear bumper, back door, and upper part of the windshield inside the vehicle 7900 may be, for example, LIDAR devices.
  • These external information detection units 7920 to 7930 are mainly used to detect preceding vehicles, pedestrians, obstacles, and the like.
  • the vehicle exterior information detection unit 7400 causes the imaging unit 7410 to capture an image of the exterior of the vehicle, and receives the captured image data. Further, the vehicle exterior information detection unit 7400 receives detection information from the vehicle exterior information detection section 7420 to which it is connected.
  • When the vehicle exterior information detection unit 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the vehicle exterior information detection unit 7400 causes it to transmit ultrasonic waves, electromagnetic waves, or the like, and receives information on the received reflected waves.
  • Based on the received information, the vehicle exterior information detection unit 7400 may perform object detection processing for persons, cars, obstacles, signs, characters on the road surface, and the like, or distance detection processing.
  • the external information detection unit 7400 may perform environment recognition processing to recognize rain, fog, road surface conditions, etc. based on the received information.
  • the vehicle exterior information detection unit 7400 may calculate the distance to the object outside the vehicle based on the received information.
  • the outside-vehicle information detection unit 7400 may perform image recognition processing or distance detection processing for recognizing people, cars, obstacles, signs, characters on the road, etc., based on the received image data.
  • The vehicle exterior information detection unit 7400 may perform processing such as distortion correction or alignment on the received image data, and may also synthesize image data captured by different imaging units 7410 to generate an overhead image or a panoramic image.
  • the outside-vehicle information detection unit 7400 may perform viewpoint conversion processing using image data captured by different imaging units 7410.
  • the in-vehicle information detection unit 7500 detects in-vehicle information.
  • a driver condition detection section 7510 that detects the condition of the driver is connected to the in-vehicle information detection unit 7500.
  • the driver state detection unit 7510 may include a camera that images the driver, a biosensor that detects biometric information of the driver, a microphone that collects audio inside the vehicle, or the like.
  • the biosensor is provided, for example, on a seat surface or a steering wheel, and detects biometric information of a passenger sitting on a seat or a driver holding a steering wheel.
  • The in-vehicle information detection unit 7500 may calculate the degree of fatigue or concentration of the driver based on the detection information input from the driver state detection unit 7510, or may determine whether the driver is dozing off.
  • the in-vehicle information detection unit 7500 may perform processing such as noise canceling processing on the collected audio signal.
  • the integrated control unit 7600 controls overall operations within the vehicle control system 7000 according to various programs.
  • An input section 7800 is connected to the integrated control unit 7600.
  • the input unit 7800 is realized by a device that can be inputted by the passenger, such as a touch panel, a button, a microphone, a switch, or a lever.
  • the integrated control unit 7600 may be input with data obtained by voice recognition of voice input through a microphone.
  • The input unit 7800 may be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device, such as a mobile phone or a PDA (Personal Digital Assistant), that supports operation of the vehicle control system 7000.
  • the input unit 7800 may be, for example, a camera, in which case the passenger can input information using gestures. Alternatively, data obtained by detecting the movement of a wearable device worn by a passenger may be input. Furthermore, the input section 7800 may include, for example, an input control circuit that generates an input signal based on information input by a passenger or the like using the input section 7800 described above and outputs it to the integrated control unit 7600. By operating this input unit 7800, a passenger or the like inputs various data to the vehicle control system 7000 and instructs processing operations.
  • the storage unit 7690 may include a ROM (Read Only Memory) that stores various programs executed by the microcomputer, and a RAM (Random Access Memory) that stores various parameters, calculation results, sensor values, etc. Further, the storage unit 7690 may be realized by a magnetic storage device such as a HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
  • the general-purpose communication I/F 7620 is a general-purpose communication I/F that mediates communication with various devices existing in the external environment 7750.
  • The general-purpose communication I/F 7620 may implement a cellular communication protocol such as GSM (registered trademark) (Global System for Mobile communications), WiMAX (registered trademark), LTE (registered trademark) (Long Term Evolution), or LTE-A (LTE-Advanced), or another wireless communication protocol such as wireless LAN (also referred to as Wi-Fi (registered trademark)) or Bluetooth (registered trademark).
  • The general-purpose communication I/F 7620 may connect to a device (for example, an application server or a control server) existing on an external network (for example, the Internet, a cloud network, or an operator-specific network) via, for example, a base station or an access point.
  • Further, the general-purpose communication I/F 7620 may connect to a terminal existing near the vehicle (for example, a terminal of a driver or a pedestrian, a store terminal, or an MTC (Machine Type Communication) terminal) using, for example, P2P (Peer To Peer) technology.
  • the dedicated communication I/F 7630 is a communication I/F that supports communication protocols developed for use in vehicles.
  • The dedicated communication I/F 7630 may implement a standard protocol such as WAVE (Wireless Access in Vehicle Environment), which is a combination of the lower-layer IEEE 802.11p and the upper-layer IEEE 1609, DSRC (Dedicated Short Range Communications), or a cellular communication protocol.
  • The dedicated communication I/F 7630 typically carries out V2X communication, a concept that includes one or more of vehicle-to-vehicle communication, vehicle-to-infrastructure communication, vehicle-to-home communication, and vehicle-to-pedestrian communication.
  • The positioning unit 7640 performs positioning by receiving, for example, a GNSS signal from a GNSS (Global Navigation Satellite System) satellite (for example, a GPS signal from a GPS (Global Positioning System) satellite), and generates position information including the latitude, longitude, and altitude of the vehicle. Note that the positioning unit 7640 may specify the current position by exchanging signals with a wireless access point, or may acquire position information from a terminal such as a mobile phone, PHS, or smartphone that has a positioning function.
  • the beacon receiving unit 7650 receives, for example, radio waves or electromagnetic waves transmitted from a wireless station installed on the road, and obtains information such as the current location, traffic jams, road closures, or required travel time. Note that the function of the beacon receiving unit 7650 may be included in the dedicated communication I/F 7630 described above.
  • the in-vehicle device I/F 7660 is a communication interface that mediates connections between the microcomputer 7610 and various in-vehicle devices 7760 present in the vehicle.
  • the in-vehicle device I/F 7660 may establish a wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), or WUSB (Wireless USB).
  • Further, the in-vehicle device I/F 7660 may establish a wired connection such as USB (Universal Serial Bus), HDMI (registered trademark) (High-Definition Multimedia Interface), or MHL (Mobile High-definition Link).
  • the in-vehicle device 7760 may include, for example, at least one of a mobile device or wearable device owned by a passenger, or an information device carried into or attached to the vehicle.
  • Further, the in-vehicle device 7760 may include a navigation device that searches for a route to an arbitrary destination. The in-vehicle device I/F 7660 exchanges control signals or data signals with these in-vehicle devices 7760.
  • the in-vehicle network I/F 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010.
  • the in-vehicle network I/F 7680 transmits and receives signals and the like in accordance with a predetermined protocol supported by the communication network 7010.
  • The microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 according to various programs, based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680. For example, the microcomputer 7610 may calculate a control target value for the driving force generation device, the steering mechanism, or the braking device based on the acquired information about the inside and outside of the vehicle, and output a control command to the drive system control unit 7100.
  • For example, the microcomputer 7610 may perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions, including collision avoidance or impact mitigation of the vehicle, following travel based on the inter-vehicle distance, vehicle speed maintenance, vehicle collision warning, vehicle lane departure warning, and the like.
  • Further, the microcomputer 7610 may perform cooperative control for the purpose of automated driving, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, and the like based on the acquired information about the surroundings of the vehicle.
  • The microcomputer 7610 may generate three-dimensional distance information between the vehicle and surrounding objects, such as structures and people, based on information acquired via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon receiving unit 7650, the in-vehicle device I/F 7660, and the in-vehicle network I/F 7680, and may create local map information including surrounding information of the current position of the vehicle. Furthermore, the microcomputer 7610 may predict dangers such as a vehicle collision, an approaching pedestrian, or entry into a closed road based on the acquired information, and generate a warning signal.
  • the warning signal may be, for example, a signal for generating a warning sound or lighting a warning lamp.
  • the audio and image output unit 7670 transmits an output signal of at least one of audio and images to an output device that can visually or audibly notify information to the occupants of the vehicle or to the outside of the vehicle.
  • an audio speaker 7710, a display section 7720, and an instrument panel 7730 are illustrated as output devices.
  • Display unit 7720 may include, for example, at least one of an on-board display and a head-up display.
  • the display section 7720 may have an AR (Augmented Reality) display function.
  • the output device may be other devices other than these devices, such as headphones, a wearable device such as a glasses-type display worn by the passenger, a projector, or a lamp.
  • When the output device is a display device, the display device visually displays results obtained from the various processes performed by the microcomputer 7610, or information received from other control units, in various formats such as text, images, tables, and graphs. When the output device is an audio output device, the audio output device converts an audio signal consisting of reproduced audio data, acoustic data, or the like into an analog signal and outputs it audibly.
  • Two or more control units connected via the communication network 7010 may be integrated into one control unit.
  • each control unit may be composed of a plurality of control units.
  • vehicle control system 7000 may include another control unit not shown.
  • some or all of the functions performed by one of the control units may be provided to another control unit.
  • predetermined arithmetic processing may be performed by any one of the control units.
  • Sensors or devices connected to any of the control units may be connected to other control units, and multiple control units may send and receive detection information to and from each other via the communication network 7010.
  • a computer program for realizing each function of the information processing apparatus according to the present embodiment described using FIG. 2 and the like can be implemented in any control unit or the like. It is also possible to provide a computer-readable recording medium in which such a computer program is stored.
  • the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like.
  • the above computer program may be distributed, for example, via a network, without using a recording medium.
  • the information processing device according to the present embodiment described using FIG. 2 and the like can be applied to the integrated control unit 7600 of the application example shown in FIG. 30.
  • The components of the information processing device described using FIG. 2 and the like may be realized in a module for the integrated control unit 7600 shown in FIG. 30.
  • the information processing device described using FIG. 2 and the like may be realized by a plurality of control units of vehicle control system 7000 shown in FIG. 30.
  • (1) An information processing device comprising: a peripheral information acquisition unit that acquires first peripheral information and second peripheral information regarding the surrounding environment based on detection results of one or more object detection sensors; an audio information generation unit that generates, based on the first peripheral information, first audio information using first musical tone data for notifying the first peripheral information, and generates, based on the second peripheral information, second audio information using second musical tone data for notifying the second peripheral information; and a notification control unit that outputs both the first audio information and the second audio information.
  • (2) The information processing device according to (1), wherein the one or more object detection sensors include a first object detection sensor and a second object detection sensor, and the peripheral information acquisition unit acquires the first peripheral information based on the detection result of the first object detection sensor and acquires the second peripheral information based on the detection result of the second object detection sensor.
  • (3) The information processing device, wherein the first object detection sensor is a first ranging sensor that operates according to a first method, and the second object detection sensor is a second ranging sensor that operates according to a second method different from the first method.
  • (4) The information processing device, wherein the first object detection sensor is a first ranging sensor arranged with a first direction as its detection direction, and the second object detection sensor is a second ranging sensor arranged with a second direction different from the first direction as its detection direction.
  • (5) The information processing device according to (1), wherein the one or more object detection sensors are sensors that generate image information, and the peripheral information acquisition unit acquires the first peripheral information based on information on some pixel regions of the image information and acquires the second peripheral information based on information on other pixel regions of the image information.
  • (6) The information processing device, wherein the one or more object detection sensors are sensors that generate image information, and the peripheral information acquisition unit acquires, as the first peripheral information, information regarding a first type of object detected based on the image information, and acquires, as the second peripheral information, information regarding a second type of object, different from the first type, detected based on the image information.
  • (7) The information processing device, wherein the peripheral information acquisition unit generates integrated peripheral information based on the first peripheral information and the second peripheral information.
  • (8) The information processing device, wherein the first ranging sensor operates by an optical laser method, the second ranging sensor operates by an ultrasonic method, and the peripheral information acquisition unit generates the integrated peripheral information based on the stability of detection by the first ranging sensor and the stability of detection by the second ranging sensor.
  • (9) The information processing device, wherein the peripheral information acquisition unit generates the integrated peripheral information indicating that a light-transmitting member or a light-absorbing member exists in the periphery.
  • (10) The information processing device, wherein the peripheral information acquisition unit generates, as the integrated peripheral information, information regarding at least one of the material and the type of the light-transmitting member or the light-absorbing member, further based on hardness information acquired as the second peripheral information from the detection result of the second ranging sensor.
  • (11) The information processing device, wherein the audio information generation unit determines, based on the first peripheral information, whether or not to output the first audio information, and, when determining not to output the first audio information, restricts output of the first audio information by the notification control unit.
  • (12) The information processing device according to any one of (1) to (11), wherein the first audio information is first musical tone information constituting a predetermined music piece, and the second audio information is second musical tone information constituting the predetermined music piece.
  • (13) The information processing device, wherein the audio information generation unit generates the first audio information by controlling musical tone parameters of the first musical tone data based on the first peripheral information.
  • (14) The information processing device, wherein the musical tone parameters include at least one of volume, frequency, pitch, speed, BPM, and tempo.
  • (15) The information processing device according to (13) or (14), wherein the first peripheral information includes distance information, and the audio information generation unit generates the first audio information by controlling the musical tone parameters based on the distance information.
  • (16) The information processing device, wherein the localization of the first audio information is controlled based on the detection direction of the first ranging sensor.
  • An information processing system comprising: a peripheral information acquisition unit that acquires first peripheral information and second peripheral information regarding the surrounding environment based on detection results of one or more object detection sensors; an audio information generation unit that generates, based on the first peripheral information, first audio information using first musical tone data for notifying the first peripheral information, and generates, based on the second peripheral information, second audio information using second musical tone data for notifying the second peripheral information; and a notification control unit that outputs both the first audio information and the second audio information.
  • The information processing device according to any one of (1) to (16), wherein the one or more object detection sensors are arranged in a device worn by the user or a device held by the user.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Health & Medical Sciences (AREA)
  • Electromagnetism (AREA)
  • Acoustics & Sound (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Theoretical Computer Science (AREA)
  • Emergency Alarm Devices (AREA)

Abstract

An information processing device according to one aspect of the present technology comprises a peripheral information acquisition unit, an audio information generation unit, and a notification control unit. The peripheral information acquisition unit acquires first peripheral information and second peripheral information regarding the surrounding environment based on a detection result from one or more object detection sensors. The audio information generation unit generates first audio information using first musical tone data for notifying the first peripheral information, based on the first peripheral information, and generates second audio information using second musical tone data for notifying the second peripheral information, based on the second peripheral information. The notification control unit outputs both the first audio information and the second audio information.
PCT/JP2023/019250 2022-06-15 2023-05-24 Dispositif de traitement d'informations, procédé de traitement d'informations, programme, et système de traitement d'informations WO2023243338A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022096704 2022-06-15
JP2022-096704 2022-06-15

Publications (1)

Publication Number Publication Date
WO2023243338A1 true WO2023243338A1 (fr) 2023-12-21

Family

ID=89191174

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/019250 WO2023243338A1 (fr) 2022-06-15 2023-05-24 Dispositif de traitement d'informations, procédé de traitement d'informations, programme, et système de traitement d'informations

Country Status (1)

Country Link
WO (1) WO2023243338A1 (fr)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014113410A (ja) * 2012-12-12 2014-06-26 Yamaguchi Univ 路面状態判別報知装置
EP3157233A1 (fr) * 2015-10-13 2017-04-19 Thomson Licensing Dispositif portatif, procédé pour faire fonctionner un tel dispositif et programme informatique
US20190282433A1 (en) * 2016-10-14 2019-09-19 United States Government As Represented By The Department Of Veterans Affairs Sensor based clear path robot guide
JP2020508440A (ja) * 2017-02-21 2020-03-19 ブラスウェイト ヘイリーBRATHWAITE,Haley パーソナルナビゲーションシステム
WO2019225192A1 (fr) * 2018-05-24 2019-11-28 ソニー株式会社 Dispositif de traitement d'informations et procédé de traitement d'informations

Similar Documents

Publication Publication Date Title
US7650001B2 (en) Dummy sound generating apparatus and dummy sound generating method and computer product
US9159236B2 (en) Presentation of shared threat information in a transportation-related context
US20050175186A1 (en) Dummy sound generating apparatus and dummy sound generating method and computer product
US11237241B2 (en) Microphone array for sound source detection and location
WO2020203657A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme de traitement d'informations
US11590985B2 (en) Information processing device, moving body, information processing method, and program
JPWO2020100585A1 (ja) 情報処理装置、および情報処理方法、並びにプログラム
US20220017093A1 (en) Vehicle control device, vehicle control method, program, and vehicle
US20220018932A1 (en) Calibration apparatus, calibration method, program, and calibration system and calibration target
JP2011162055A (ja) 擬似走行音発生装置および擬似走行音発生システム
WO2019039281A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, programme et corps mobile
WO2020189156A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, dispositif de commande de mouvement et procédé de commande de mouvement
WO2021187039A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme d'ordinateur
US11904893B2 (en) Operating a vehicle
WO2021070768A1 (fr) Dispositif de traitement d'informations, système de traitement d'informations et procédé de traitement d'informations
US20210055116A1 (en) Get-off point guidance method and vehicular electronic device for the guidance
WO2023243338A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, programme, et système de traitement d'informations
WO2023243339A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, programme et système de traitement d'informations
WO2019117104A1 (fr) Dispositif et procédé de traitement d'informations
EP4171021A1 (fr) Dispositif de commande, système de projection, procédé de commande, et programme
US20200175959A1 (en) Apparatus, system, method and computer program
JP7302477B2 (ja) 情報処理装置、情報処理方法および情報処理プログラム
JP7469358B2 (ja) 交通安全支援システム
JP7469359B2 (ja) 交通安全支援システム
WO2023204076A1 (fr) Procédé de commande acoustique et dispositif de commande acoustique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23823638

Country of ref document: EP

Kind code of ref document: A1