WO2017057055A1 - Information processing device, information terminal, and information processing method - Google Patents

Information processing device, information terminal, and information processing method

Info

Publication number
WO2017057055A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
information
moving body
accident
map
Prior art date
Application number
PCT/JP2016/077428
Other languages
English (en)
Japanese (ja)
Inventor
貝野 彰彦
嵩明 加藤
江島 公志
辰吾 鶴見
福地 正樹
Original Assignee
ソニー株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニー株式会社 filed Critical ソニー株式会社
Publication of WO2017057055A1


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00 Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095 Predicting travel path or likelihood of collision
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems

Definitions

  • the present technology relates to an information processing device, an information terminal, and an information processing method, and particularly relates to an information processing device, an information terminal, and an information processing method that can prevent an accident of a moving body.
  • For example, it has been proposed that a navigation device mounted on a host vehicle receive information about an oncoming vehicle from a server and search for and display a point where the host vehicle can pass the oncoming vehicle.
  • the navigation device calculates the current position, speed, and direction of the host vehicle by map matching with a GPS (Global Positioning System) receiver, a vehicle speed sensor, an angular velocity gyroscope, and a map database. Then, when the host vehicle is traveling on a mountain road or a narrow road with poor visibility, the navigation device transmits the current position, speed and direction of the host vehicle, error information, and destination point information to the server.
  • the server determines the passing possibility of each vehicle based on the information received from each vehicle, and transmits approach information including current position information, direction information, and speed information of other vehicles to the navigation device of each vehicle.
  • the navigation device that has received the approach information searches for a point where it can pass another vehicle and displays it on a map (for example, see Patent Document 1).
  • the present technology has been made in view of such a situation, and is intended to reliably prevent an accident of a moving body such as a vehicle.
  • An information processing apparatus according to one aspect of the present technology includes a receiving unit that receives, from an information terminal provided in each moving body, position information indicating the position and speed of each moving body estimated based on an image captured from that moving body, a motion prediction unit that predicts the movement of each moving body based on the estimated position and speed of each moving body, and a risk prediction unit that predicts an accident between the moving bodies based on the predicted movements of the moving bodies.
  • The receiving unit can receive, from each information terminal, a local map indicating the positions in three-dimensional space of feature points in an image photographed from each moving body; a global map update unit that updates, based on the received local maps, a global map indicating the positions of the feature points in three-dimensional space in a predetermined region can further be provided; and the motion prediction unit can further predict the motion of each moving body based on the global map.
  • the motion predicting unit can predict the motion of each moving body avoiding a stationary object on the global map.
  • The position information can include the movement of each moving body predicted by the corresponding information terminal, and the risk prediction unit can predict accidents between the moving bodies based on the movements of the moving bodies predicted by the information terminals.
  • the danger notification unit can cause the information terminal to transmit control information used for processing for the mobile body to avoid an accident.
  • The receiving unit can receive, from each information terminal, a detection result of a dangerous area where there is a risk of an accident, and a dangerous area map update unit that updates a dangerous area map indicating the dangerous areas based on the detection results from the information terminals, and a dangerous area notification unit that notifies each information terminal of the dangerous areas based on the dangerous area map, can be further provided.
  • Alternatively, the receiving unit can receive, from each information terminal, a detection result of a dangerous area where there is a risk of an accident, and a dangerous area map update unit that updates a dangerous area map indicating the dangerous areas based on the detection results from the information terminals, and a dangerous area determination unit that determines, based on the dangerous area map, whether each moving body is in a dangerous area and notifies the corresponding information terminal of the determination result, can be further provided.
  • a storage unit for storing the position information related to the accident can be further provided.
  • the information processing apparatus transmits position information indicating the position and speed of each moving body estimated based on an image captured from each moving body to each moving body.
  • An information terminal according to another aspect of the present technology is an information terminal provided in a moving body, and includes an estimation unit that estimates the position and speed of the moving body based on an image captured from the moving body, a transmission unit that transmits position information including the estimated position and speed to an information processing device, and a risk avoidance processing unit that performs a process for avoiding an accident when the information processing device notifies the information terminal of a risk that the moving body will have an accident.
  • the estimation unit can estimate the position and speed of the moving body based on the relative position between the feature point in the image photographed from the moving body and the moving body.
  • A local map generation unit that generates a local map indicating the positions of the feature points in three-dimensional space can be further provided, and the transmission unit can further transmit the local map to the information processing apparatus.
  • An object detection unit that detects objects around the moving body based on the feature points, and a risk prediction unit that predicts an accident of the moving body based on the estimated position and speed of the moving body and the detection result of the objects, can be further provided, and the risk avoidance processing unit can further perform a process for avoiding the accident when the risk prediction unit predicts a risk that the moving body will have an accident.
  • The estimation unit can further predict the movement of the moving body, and the transmission unit can transmit the position information including the prediction result of the movement of the moving body to the information processing device.
  • The transmission unit can transmit the position information to the information processing apparatus when it is determined, based on information from the information processing apparatus, that the moving body is in a dangerous area where there is a risk of an accident.
  • the risk prediction unit may further detect the dangerous region, and the transmission unit may transmit the detection result of the dangerous region to the information processing apparatus.
  • An information processing method according to another aspect of the present technology includes an estimation step in which an information terminal provided in a moving body estimates the position and speed of the moving body based on an image taken from the moving body, a transmission step of transmitting position information including the estimated position and speed to an information processing device, and a risk avoidance processing step of performing a process for avoiding an accident when the information processing device notifies the information terminal of a risk that the moving body will have an accident.
  • position information indicating a position and a speed of each moving body estimated based on an image captured from each moving body is received from an information terminal provided in each moving body. Then, an accident between the moving bodies is predicted based on the movements of the moving bodies predicted based on the estimated position and speed of the moving bodies.
  • In another aspect of the present technology, the position and speed of the moving body are estimated based on an image captured from the moving body, and position information including the estimated position and speed of the moving body is transmitted to the information processing apparatus.
  • When the information processing apparatus notifies of a risk that the moving body will have an accident, a process for avoiding the accident is performed.
  • the prediction accuracy of accidents between moving bodies is improved. As a result, it is possible to reliably prevent an accident of the moving body.
  • FIG. 1 is a block diagram illustrating an embodiment of an information processing system to which the present technology is applied. FIG. 2 is a block diagram showing a first embodiment of an information terminal. FIG. 3 is a block diagram showing a first embodiment of a SLAM processing unit. FIG. 4 is a block diagram showing a first embodiment of a server. FIG. 5 is a diagram showing an example of a dangerous area map. FIG. 6 is a flowchart for explaining a first embodiment of processing of the information terminal. FIGS. 7 and 8 are flowcharts for explaining a first embodiment of processing of the server.
  • FIG. 1 shows an embodiment of an information processing system 1 to which the present technology is applied.
  • the information processing system 1 is configured to include information terminals 11-1 to 11-n and a server 12.
  • the information terminals 11-1 to 11-n are simply referred to as information terminals 11 when it is not necessary to distinguish them individually.
  • the information terminals 11 and the server 12 are connected to each other via a base station (not shown), the network 13, and the like, and communicate with each other.
  • Any wireless communication method can be adopted as the communication method of the information terminal 11.
  • As the communication method of the server 12, any wired or wireless communication method can be adopted.
  • the information terminal 11 and the server 12 can directly communicate with each other.
  • Hereinafter, although each information terminal 11 and the server 12 communicate via the network 13 and the like, the description "via the network 13 or the like" is omitted for ease of understanding, and expressions such as "each information terminal 11 and the server 12 communicate with each other" and "each information terminal 11 and the server 12 transmit and receive information and data" are used.
  • Each information terminal 11 is, for example, an in-vehicle information terminal used for a car navigation system or an automatic driving system, or a mobile information terminal such as a smartphone, a mobile phone, a tablet, a wearable device, or a notebook personal computer.
  • each information terminal 11 is provided in a mobile body.
  • the mobile body provided with each information terminal 11 includes, for example, a mobile body that moves on land or in the air.
  • Such moving bodies include, for example, vehicles, people, airplanes, helicopters, drones, robots, and the like.
  • Each information terminal 11 may be permanently installed on the moving body, or may be temporarily installed on, mounted on, or carried by the moving body.
  • Hereinafter, when the moving body provided with a given information terminal 11 is to be distinguished from other moving bodies, it is referred to as the own moving body.
  • The information processing system 1 performs risk prediction for each moving body provided with an information terminal 11 and performs processing for avoiding the accidents predicted for each moving body.
  • the server 12 integrates information, data, and the like from each information terminal 11, performs risk prediction between the moving objects, and notifies each information terminal 11 of the prediction result.
  • each information terminal 11 performs risk prediction of the own moving body based on an image taken from the own moving body.
  • Each information terminal 11 performs processing for avoiding the accident predicted for its own moving body based on the risk prediction results of the information terminal 11 itself and the server 12.
  • the server 12 performs a simulation of the state of occurrence of the accident of the moving body based on, for example, information and data from each information terminal 11.
  • FIG. 2 shows a configuration example of the function of the information terminal 11a which is the first embodiment of the information terminal 11 of FIG.
  • the information terminal 11a is configured to include cameras 101L and 101R, a reception unit 102, an information processing unit 103, and a transmission unit 104.
  • the information processing unit 103 is configured to include a SLAM (Simultaneous Localization and Mapping) processing unit 111, a dangerous area determination unit 112, a risk prediction unit 113, and a risk avoidance processing unit 114.
  • the camera 101L captures the moving direction of the moving object from the left side.
  • the camera 101L supplies an image (hereinafter referred to as a left image) obtained as a result of shooting to the SLAM processing unit 111.
  • the camera 101R captures the moving direction of the moving body from the right side.
  • the camera 101R supplies an image (hereinafter referred to as a right image) obtained as a result of shooting to the SLAM processing unit 111.
  • the receiving unit 102 receives various types of information, data, and the like from other information terminals 11a, the server 12, and other servers (not shown), and supplies them to each unit of the information terminal 11a.
  • the receiving unit 102 receives a global map from the server 12 and supplies the global map to the SLAM processing unit 111.
  • the receiving unit 102 receives the dangerous area information from the server 12 and supplies the dangerous area information to the dangerous area determination unit 112.
  • the receiving unit 102 receives the danger notification information from the server 12 and supplies it to the danger avoidance processing unit 114.
  • the global map is a map showing the position in a three-dimensional space of a stationary object in a predetermined wide area.
  • the global map includes information indicating the position and feature amount of a feature point of a stationary object in a predetermined region on a three-dimensional spatial coordinate system.
  • the spatial coordinate system is represented by, for example, latitude, longitude, and height from the ground.
  • the dangerous area information is information indicating the position of a dangerous area where an accident may occur.
  • The danger notification information is information for notifying the danger to the information terminal 11a provided in a moving body at risk of having an accident, when the server 12 predicts that risk.
  • The SLAM processing unit 111 uses SLAM technology to estimate the speed, position, and orientation of the own moving body based on the left image, the right image, and the global map, to detect objects around the own moving body, and to generate a local map.
  • the SLAM processing unit 111 supplies position / orientation information indicating the estimated position and orientation of the moving body to the dangerous area determination unit 112 and the risk prediction unit 113.
  • the SLAM processing unit 111 supplies speed information indicating the estimated speed of the moving body to the danger prediction unit 113.
  • the SLAM processing unit 111 supplies position information including the estimated position and speed of the moving body to the transmission unit 104.
  • the SLAM processing unit 111 notifies the danger prediction unit 113 of the detection result of objects around the moving body.
  • the SLAM processing unit 111 supplies the generated local map to the transmission unit 104.
  • the SLAM processing unit 111 generates global map transmission request information for requesting transmission of the global map, and supplies it to the transmission unit 104.
  • the local map is a map indicating the position in the three-dimensional space of a stationary object around each moving object, and is generated by each information terminal 11.
  • the local map includes information indicating the position and the feature amount on the three-dimensional spatial coordinate system of the feature point of the stationary object around each moving object, as in the global map.
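  • As a rough illustration of how such local and global maps might be represented, the following is a minimal sketch assuming a simple list of feature points with a 3D position and a descriptor; the class and field names are illustrative and not taken from this disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class FeaturePoint:
    # Position on the three-dimensional spatial coordinate system
    # (for example latitude, longitude, and height from the ground).
    position: Tuple[float, float, float]
    # Feature amount (descriptor) used to match the point between images.
    descriptor: List[float]

@dataclass
class LocalMap:
    # Feature points of stationary objects around one moving body,
    # generated by a single information terminal.
    points: List[FeaturePoint] = field(default_factory=list)

@dataclass
class GlobalMap:
    # Feature points of stationary objects over a predetermined wide area,
    # maintained on the server by merging local maps.
    points: List[FeaturePoint] = field(default_factory=list)
```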
  • the dangerous area determination unit 112 determines whether or not the own moving body is in the dangerous area based on the dangerous area information and the position and orientation information of the own moving body, and notifies the SLAM processing unit 111 of the determination result.
  • the danger prediction unit 113 performs the danger prediction and the dangerous area detection process of the moving object based on the speed information and the position and orientation information of the moving object and the detection result of the objects around the moving object.
  • the risk prediction unit 113 notifies the risk avoidance processing unit 114 of the result of the risk prediction. Further, the danger prediction unit 113 generates danger area notification information for notifying the detected danger area, and supplies the danger area notification information to the transmission unit 104.
  • the danger avoidance processing unit 114 performs a process for avoiding the danger of the own mobile body based on the risk prediction result by the danger prediction unit 113 and the danger notification information from the server 12.
  • the transmission unit 104 transmits various information, data, and the like to the other information terminals 11a, the server 12, and other servers (not shown). For example, the transmission unit 104 transmits position information, a local map, dangerous area notification information, and global map transmission request information to the server 12.
  • FIG. 3 shows a functional configuration example of the SLAM processing unit 111a which is the first embodiment of the SLAM processing unit 111 of FIG.
  • the SLAM processing unit 111a is configured to include an estimation unit 201, a position information generation unit 202, an object detection unit 203, and a local map generation unit 204.
  • The estimation unit 201 estimates the movement amount, position, orientation, and speed of the own moving body based on the relative positions between the own moving body and the feature points in the left and right images captured by the cameras 101L and 101R.
  • The estimation unit 201 includes image correction units 211L and 211R, a feature point detection unit 212, a parallax matching unit 213, a distance estimation unit 214, a feature amount calculation unit 215, a map information storage unit 216, a motion matching unit 217, a movement amount estimation unit 218, an object dictionary storage unit 219, an object recognition unit 220, a position/orientation information storage unit 221, a position/orientation estimation unit 222, and a speed estimation unit 223.
  • The image correction unit 211L and the image correction unit 211R correct the left image supplied from the camera 101L and the right image supplied from the camera 101R, respectively, so that the images face the same direction.
  • the image correction unit 211L supplies the corrected left image to the feature point detection unit 212 and the motion matching unit 217.
  • the image correction unit 211R supplies the corrected right image to the parallax matching unit 213.
  • the feature point detection unit 212 detects a feature point of the left image.
  • the feature point detection unit 212 supplies two-dimensional position information indicating the position of each detected feature point on the two-dimensional image coordinate system to the parallax matching unit 213 and the feature amount calculation unit 215.
  • The image coordinate system is represented by, for example, an x coordinate and a y coordinate in the image.
  • the parallax matching unit 213 detects the feature point of the right image corresponding to the feature point detected in the left image. Thereby, the parallax which is the difference between the position on the left image of each feature point and the position on the right image is obtained.
  • the parallax matching unit 213 supplies the distance estimation unit 214 with two-dimensional position information indicating the position of each feature point on the image coordinate system in the left image and the right image.
  • The distance estimation unit 214 estimates the distance to each feature point based on the parallax of each feature point between the left image and the right image, and further calculates the position of each feature point on the three-dimensional spatial coordinate system.
  • the distance estimation unit 214 supplies the feature amount calculation unit 215 with three-dimensional position information indicating the position of each feature point on the spatial coordinate system.
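  • The distance estimation from parallax follows the usual stereo geometry; below is a minimal sketch assuming rectified cameras with a known focal length (in pixels), principal point, and baseline, none of which are specified in this disclosure.

```python
import numpy as np

def estimate_depth(x_left, x_right, focal_px, baseline_m):
    """Depth of a feature point from its horizontal disparity between the
    rectified left and right images (depth = f * B / disparity)."""
    disparity = x_left - x_right               # pixels
    if disparity <= 0:
        return None                            # point at infinity or mismatched
    return focal_px * baseline_m / disparity   # metres

def backproject(x, y, depth, focal_px, cx, cy):
    """3-D position of the image point (x, y) at the given depth, expressed
    in the left-camera coordinate system."""
    X = (x - cx) * depth / focal_px
    Y = (y - cy) * depth / focal_px
    return np.array([X, Y, depth])
```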
  • the feature amount calculation unit 215 calculates the feature amount of each feature point of the left image.
  • the feature amount calculation unit 215 causes the map information storage unit 216 to store the feature point information including the three-dimensional position information of each feature point and the feature amount.
  • The map information storage unit 216 stores the global map supplied from the server 12 in addition to the feature point information used for the local map.
  • the motion matching unit 217 acquires the three-dimensional position information of each feature point detected in the previous frame from the map information storage unit 216. Next, the motion matching unit 217 detects feature points corresponding to each feature point detected in the previous frame in the left image of the current frame. Then, the motion matching unit 217 supplies the movement amount estimation unit 218 with the three-dimensional position information in the previous frame of each feature point and the two-dimensional position information indicating the position on the image coordinate system in the current frame. .
  • The movement amount estimation unit 218 estimates the amount of movement of the position and orientation of the own moving body (more precisely, the camera 101L) between frames.
  • the movement amount estimation unit 218 supplies movement amount information indicating the estimated movement amount of the position and posture of the moving body to the object detection unit 203, the position / orientation estimation unit 222, and the speed estimation unit 223.
  • The object recognition unit 220 recognizes an object in the left image based on the object dictionary stored in the object dictionary storage unit 219. Based on the recognition result of the object, the object recognition unit 220 sets the initial values of the position and orientation (hereinafter referred to as the initial position and initial orientation) of the own moving body (more precisely, the camera 101L) in the spatial coordinate system.
  • the object recognizing unit 220 causes the position and orientation information storage unit 221 to store initial position and orientation information indicating the set initial position and initial orientation.
  • The position/orientation estimation unit 222 estimates the position and orientation of the own moving body based on the initial position/orientation information or the position/orientation information of the previous frame stored in the position/orientation information storage unit 221 and the estimated movement amount of the own moving body. In addition, the position/orientation estimation unit 222 corrects the estimated position and orientation of the own moving body based on the global map stored in the map information storage unit 216 as necessary. The position/orientation estimation unit 222 supplies position/orientation information indicating the estimated position and orientation of the own moving body to the dangerous area determination unit 112, the danger prediction unit 113, the position information generation unit 202, and the speed estimation unit 223, and stores it in the position/orientation information storage unit 221.
  • the speed estimation unit 223 estimates the speed of the mobile body by dividing the estimated travel amount of the mobile body by the elapsed time.
  • the speed estimation unit 223 supplies speed information indicating the estimated speed to the danger prediction unit 113 and the position information generation unit 202.
  • the position information generation unit 202 generates position information including the position and speed of the moving body when notified from the dangerous area determination unit 112 that the moving body is in the dangerous area.
  • the position information generation unit 202 supplies the generated position information to the transmission unit 104.
  • The object detection unit 203 detects stationary objects and moving bodies around the own moving body.
  • the object detection unit 203 notifies the danger prediction unit 113 and the local map generation unit 204 of the detection results of the stationary object and the moving body around the moving body.
  • When notified from the dangerous area determination unit 112 that the own moving body is in a dangerous area, the local map generation unit 204 generates a local map based on the detection results of the stationary objects and moving bodies around the own moving body and the feature point information of the current frame stored in the map information storage unit 216. The local map generation unit 204 supplies the generated local map to the transmission unit 104. The local map generation unit 204 also generates global map transmission request information that includes the generated local map and requests the server 12 to transmit the global map, and supplies it to the transmission unit 104.
  • FIG. 4 shows an example of the functional configuration of the server 12a, which is the first embodiment of the server 12 of FIG.
  • the server 12a includes a reception unit 301, a position information storage unit 302, a map information storage unit 303, a global map update unit 304, a dangerous area map update unit 305, a motion prediction unit 306, a risk prediction unit 307, a risk notification unit 308, and a simulation unit. 309, a global map search unit 310, a dangerous area notification unit 311, and a transmission unit 312.
  • the receiving unit 301 receives various types of information, data, and the like from each information terminal 11a and other servers (not shown), and supplies them to each unit of the server 12a.
  • the receiving unit 301 receives position information from each information terminal 11 a and stores it in the position information storage unit 302.
  • the receiving unit 301 receives the local map and the dangerous area information from each information terminal 11 a and stores them in the map information storage unit 303.
  • the reception unit 301 receives global map transmission request information from each information terminal 11 a and supplies the global map transmission request information to the global map search unit 310.
  • The receiving unit 301 also receives, from another server or the like, accident information indicating the occurrence of an accident of a moving body and instructions to execute an accident simulation, and supplies them to the simulation unit 309.
  • the global map update unit 304 updates the global map stored in the map information storage unit 303 based on the local map generated by each information terminal 11a stored in the map information storage unit 303.
  • the dangerous area map update unit 305 updates the dangerous area map stored in the map information storage unit 303 based on the dangerous area notification information from each information terminal 11a stored in the map information storage unit 303.
  • the dangerous area map is a map showing the position of the dangerous area where there is a risk of an accident occurring.
  • the area covered by the global map is divided into grids, and whether or not each area is a dangerous area is indicated.
  • the grid of the dangerous area map may have a hierarchical structure.
  • FIG. 5 shows an example of a dangerous area map in which the grid has a two-layer structure. First, it is determined whether or not each grid is a dangerous area in the first hierarchy. Then, in the grid determined to be a dangerous area in the first hierarchy, it is determined whether or not each grid in the second hierarchy obtained by further dividing the grid is a dangerous area.
  • the grid G1 in the first layer which is a dangerous area indicated by diagonal lines, is divided into a plurality of grids in the second layer.
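  • A minimal sketch of such a two-layer grid lookup is shown below; the cell sizes, the planar (x, y) keying, and the set-based storage are assumptions made only for illustration.

```python
class DangerousAreaMap:
    """Two-layer grid: a coarse grid over the area covered by the global map,
    refined by a finer grid only inside coarse cells marked as dangerous."""

    def __init__(self, coarse_cell_m=100.0, fine_cell_m=10.0):
        self.coarse_cell_m = coarse_cell_m   # first-layer cell size
        self.fine_cell_m = fine_cell_m       # second-layer cell size
        self.coarse = set()                  # dangerous first-layer cells
        self.fine = set()                    # dangerous second-layer cells

    def _cell(self, x, y, size):
        return (int(x // size), int(y // size))

    def mark_dangerous(self, x, y):
        self.coarse.add(self._cell(x, y, self.coarse_cell_m))
        self.fine.add(self._cell(x, y, self.fine_cell_m))

    def is_dangerous(self, x, y):
        # Check the first layer; only when that cell is dangerous is the
        # answer refined with the second layer.
        if self._cell(x, y, self.coarse_cell_m) not in self.coarse:
            return False
        return self._cell(x, y, self.fine_cell_m) in self.fine
```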
  • the motion prediction unit 306 predicts the motion of each mobile object based on the position information from each information terminal 11 a stored in the position information storage unit 302 and the global map stored in the map information storage unit 303. To do.
  • The motion prediction unit 306 notifies the risk prediction unit 307 of the motion prediction result of each moving body.
  • the danger prediction unit 307 performs danger prediction of each moving body based on the prediction result of the movement of each moving body, and supplies the prediction result to the danger notification unit 308.
  • The danger notification unit 308 generates danger notification information for notifying the danger to the information terminal 11a provided in a moving body for which a dangerous state has been detected, that is, a moving body predicted to have an accident.
  • the danger notification unit 308 supplies the generated danger notification information to the transmission unit 312.
  • The simulation unit 309 sets the position information storage unit 302 so as to save the position information related to the accident indicated in the accident information.
  • In addition, when the simulation unit 309 receives an accident simulation execution command from an input unit (not shown), or from another server or the like via the receiving unit 301, it performs a simulation of the occurrence of the specified accident based on the position information stored in the position information storage unit 302.
  • the simulation unit 309 generates simulation data for simulating an accident occurrence state and supplies the simulation data to the transmission unit 312.
  • The global map search unit 310 searches the global map stored in the map information storage unit 303 for the region corresponding to the local map included in the global map transmission request information.
  • the global map search unit 310 extracts a map of the searched area and its surrounding area from the global map, and supplies the extracted map to the transmission unit 312.
  • the dangerous area notification unit 311 generates dangerous area information for each dangerous area shown in the dangerous area map stored in the map information storage unit 303 and supplies the dangerous area information to the transmission unit 312.
  • the transmission unit 312 transmits various information, data, and the like to each information terminal 11a and other servers. For example, the transmission unit 312 transmits danger notification information, simulation data, a global map, and dangerous area information to each information terminal 11a.
  • In step S1, the estimation unit 201 estimates the movement amount, position, and orientation of the own moving body.
  • Specifically, the image correction unit 211L and the image correction unit 211R correct the left image supplied from the camera 101L and the right image supplied from the camera 101R, respectively, so that the images face the same direction.
  • the image correction unit 211L supplies the corrected left image to the feature point detection unit 212 and the motion matching unit 217.
  • the image correction unit 211R supplies the corrected right image to the parallax matching unit 213.
  • the feature point detection unit 212 detects a feature point of the left image.
  • As the feature point detection method, an arbitrary method such as Harris corner detection can be used.
  • the feature point detection unit 212 supplies two-dimensional position information indicating the position of each detected feature point on the image coordinate system to the parallax matching unit 213.
  • the parallax matching unit 213 detects the feature point of the right image corresponding to the feature point detected in the left image.
  • the parallax matching unit 213 supplies the distance estimation unit 214 with two-dimensional position information indicating the position of each feature point on the image coordinate system in the left image and the right image.
  • The distance estimation unit 214 estimates the distance to each feature point based on the parallax of each feature point between the left image and the right image, and further calculates the position of each feature point on the three-dimensional spatial coordinate system.
  • the distance estimation unit 214 supplies the feature amount calculation unit 215 with three-dimensional position information indicating the position of each feature point on the spatial coordinate system.
  • the feature amount calculation unit 215 calculates the feature amount of each feature point of the left image.
  • As the feature amount, for example, an arbitrary feature amount such as SURF (Speeded-Up Robust Features) can be used.
  • the feature amount calculation unit 215 causes the map information storage unit 216 to store the feature point information including the three-dimensional position information of each feature point and the feature amount.
  • the motion matching unit 217 acquires the three-dimensional position information of each feature point detected in the previous frame from the map information storage unit 216. Next, the motion matching unit 217 detects feature points corresponding to each feature point detected in the previous frame in the left image of the current frame. Then, the motion matching unit 217 supplies the movement amount estimation unit 218 with the three-dimensional position information in the previous frame of each feature point and the two-dimensional position information indicating the position on the image coordinate system in the current frame. .
  • The movement amount estimation unit 218 estimates the movement amount of the own moving body (more precisely, the camera 101L) between the previous frame and the current frame. For example, the movement amount estimation unit 218 calculates the movement amount dX that minimizes the value of the cost function f in the following equation (1): f(dX) = Σ |Z_t - proj(dX, M_t-1)|^2 ... (1)
  • Here, the movement amount dX indicates the movement amount of the position and orientation of the own moving body (more precisely, the camera 101L) from the previous frame to the current frame. Specifically, the movement amount dX indicates the movement amount of the position in the three axis directions (three degrees of freedom) and of the posture around each axis (three degrees of freedom) in the spatial coordinate system.
  • M_t-1 and Z_t indicate the positions of a corresponding feature point in the previous frame and in the current frame. More specifically, M_t-1 indicates the position of the feature point on the spatial coordinate system in the previous frame, and Z_t indicates the position of the feature point on the image coordinate system in the current frame.
  • proj(dX, M_t-1) indicates the position obtained by projecting the position M_t-1 of the feature point in the previous frame on the spatial coordinate system onto the image coordinate system of the left image of the current frame using the movement amount dX. That is, proj(dX, M_t-1) is an estimate of the position of the feature point on the left image of the current frame based on the position M_t-1 of the feature point in the previous frame and the movement amount dX.
  • The movement amount estimation unit 218 obtains the movement amount dX that minimizes the sum of squares of Z_t - proj(dX, M_t-1) over the feature points, as shown in equation (1), by, for example, the least squares method. That is, the movement amount estimation unit 218 obtains the movement amount dX that minimizes the error when the positions of the feature points of the left image of the current frame on the image coordinate system are estimated based on the positions M_t-1 of the feature points on the spatial coordinate system in the previous frame and the movement amount dX.
  • the movement amount estimation unit 218 supplies movement amount information indicating the obtained movement amount dX to the object detection unit 203, the position / orientation estimation unit 222, and the speed estimation unit 223.
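  • A minimal sketch of this minimization is given below, assuming a pinhole camera model and a six-parameter motion (three translations and three rotations); the helper names and the use of SciPy's least-squares solver are illustrative choices, not part of this disclosure.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def project(points_3d, fx, fy, cx, cy):
    """Pinhole projection of Nx3 camera-frame points onto the image."""
    x = fx * points_3d[:, 0] / points_3d[:, 2] + cx
    y = fy * points_3d[:, 1] / points_3d[:, 2] + cy
    return np.stack([x, y], axis=1)

def residuals(dx, M_prev, Z_curr, fx, fy, cx, cy):
    """Z_t - proj(dX, M_t-1) for every feature point, flattened."""
    rot = Rotation.from_rotvec(dx[3:]).as_matrix()
    moved = (rot @ M_prev.T).T + dx[:3]      # apply candidate movement dX
    return (Z_curr - project(moved, fx, fy, cx, cy)).ravel()

def estimate_movement(M_prev, Z_curr, fx, fy, cx, cy):
    """Movement amount dX = (tx, ty, tz, rx, ry, rz) minimizing the sum of
    squared reprojection errors, in the spirit of equation (1)."""
    result = least_squares(residuals, x0=np.zeros(6),
                           args=(M_prev, Z_curr, fx, fy, cx, cy))
    return result.x
```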
  • the position / orientation estimation unit 222 acquires position / orientation information in the previous frame from the position / orientation information storage unit 221. Then, the position / orientation estimation unit 222 adds the movement amount dX estimated by the movement amount estimation unit 218 to the position and posture of the own movement body in the previous frame, thereby Estimate posture.
  • the position / orientation estimation unit 222 acquires initial position / orientation information from the position / orientation information storage unit 221 when estimating the position and orientation of the moving body in the first frame. Then, the position / orientation estimation unit 222 estimates the position and orientation of the own moving body by adding the movement amount dX estimated by the movement amount estimation unit 218 to the initial position and initial posture of the own moving body.
  • the position / orientation estimation unit 222 corrects the estimated position and orientation of the moving object based on the global map stored in the map information storage unit 216 as necessary.
  • the position / orientation estimation unit 222 supplies position / orientation information indicating the estimated position and orientation of the moving object to the dangerous region determination unit 112, the risk prediction unit 113, the position information generation unit 202, and the speed estimation unit 223. And stored in the position and orientation information storage unit 221.
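  • A minimal sketch of this accumulation, representing both the pose and the per-frame movement amount as 4x4 homogeneous transforms (a representation chosen here for illustration; the disclosure does not fix one):

```python
import numpy as np

def update_pose(prev_pose, dX):
    """Accumulate the per-frame movement amount dX onto the previous pose.
    Both are 4x4 homogeneous transforms mapping camera coordinates to the
    spatial coordinate system."""
    return prev_pose @ dX

# Starting from the initial position and initial orientation set by the
# object recognition unit, the current pose is obtained by applying each
# frame's estimated movement amount in turn.
pose = np.eye(4)             # initial position / initial orientation
frame_motions = [np.eye(4)]  # movement amounts dX estimated per frame
for dX in frame_motions:
    pose = update_pose(pose, dX)
```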
  • In step S2, the speed estimation unit 223 estimates the speed of the own moving body. Specifically, the speed estimation unit 223 estimates the speed of the own moving body by dividing the movement amount dX estimated by the movement amount estimation unit 218 by the elapsed time. The speed estimation unit 223 supplies speed information indicating the estimated speed to the danger prediction unit 113 and the position information generation unit 202.
  • In step S3, the object detection unit 203 detects surrounding objects. Specifically, the object detection unit 203 acquires the feature point information of the previous frame and the current frame from the map information storage unit 216. Next, the object detection unit 203 performs matching between the feature points of the previous frame and the feature points of the current frame, and detects the movement of each feature point between the frames. Then, based on the movement amount dX estimated by the movement amount estimation unit 218, the object detection unit 203 distinguishes between feature points that move in accordance with the movement of the own moving body and feature points that do not.
  • the object detection unit 203 detects a stationary object around the moving body based on a feature point that moves corresponding to the movement of the moving body. In addition, the object detection unit 203 detects a moving body around the moving body based on feature points that do not move corresponding to the movement of the moving body. The object detection unit 203 notifies the danger prediction unit 113 and the local map generation unit 204 of the detection results of the stationary object and the moving body around the moving body.
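  • The separation into stationary objects and surrounding moving bodies can be sketched as a reprojection-consistency test against the estimated movement amount; the fixed pixel threshold below is an assumption made only for illustration.

```python
import numpy as np

def split_static_dynamic(M_prev, Z_curr, dX_rot, dX_trans,
                         fx, fy, cx, cy, threshold_px=2.0):
    """Feature points whose image motion is explained by the own moving
    body's movement dX are attributed to stationary objects; the others to
    moving bodies around the own moving body."""
    moved = (dX_rot @ M_prev.T).T + dX_trans      # previous 3-D points moved by dX
    x = fx * moved[:, 0] / moved[:, 2] + cx
    y = fy * moved[:, 1] / moved[:, 2] + cy
    predicted = np.stack([x, y], axis=1)          # where static points should appear
    error = np.linalg.norm(Z_curr - predicted, axis=1)
    static_idx = np.where(error <= threshold_px)[0]
    dynamic_idx = np.where(error > threshold_px)[0]
    return static_idx, dynamic_idx
```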
  • In step S4, the dangerous area determination unit 112 determines whether or not the own moving body is in a dangerous area. Specifically, when the own moving body is in the reception area of the dangerous area information transmitted from the server 12a in step S102 of FIG. 7 described later, the receiving unit 102 receives the dangerous area information. The receiving unit 102 supplies the received dangerous area information to the dangerous area determination unit 112.
  • the dangerous area determination unit 112 determines whether the position of the moving body estimated by the position / orientation estimation unit 222 is within the dangerous area indicated in the received dangerous area information. And when it determines with the own mobile body being in a danger area, a process progresses to step S5.
  • In step S5, the information terminal 11a transmits the position information and the local map to the server 12a. Specifically, the dangerous area determination unit 112 notifies the position information generation unit 202 and the local map generation unit 204 that the own moving body is in a dangerous area.
  • the position information generation unit 202 generates position information including the speed of the moving body estimated by the speed estimation unit 223 and the position of the moving body included in the position / orientation information from the position / orientation estimation unit 222.
  • the position information generation unit 202 supplies the generated position information to the transmission unit 104.
  • the local map generation unit 204 acquires feature point information of the current frame from the map information storage unit 216. Next, the local map generation unit 204 deletes information on the feature points of the surrounding moving objects detected by the object detection unit 203 from the acquired feature point information. Then, the local map generation unit 204 generates a local map based on the remaining feature point information, and supplies the generated local map to the transmission unit 104.
  • the transmission unit 104 transmits the acquired position information and local map to the server 12a.
  • On the other hand, in step S4, when the dangerous area information has not been received, or when the position of the own moving body estimated by the position/orientation estimation unit 222 is not within a dangerous area indicated in the received dangerous area information, the dangerous area determination unit 112 determines that the own moving body is not in a dangerous area. Then, the process of step S5 is skipped, and the process proceeds to step S6.
  • In step S6, the danger prediction unit 113 performs risk prediction. Specifically, the danger prediction unit 113 predicts the risk of an accident such as a collision, rear-end collision, or contact with a surrounding object based on the estimated speed and position of the own moving body and the detection result of the surrounding objects. Note that the danger prediction unit 113 may further use the estimated posture of the own moving body for the risk prediction as necessary. The danger prediction unit 113 notifies the danger avoidance processing unit 114 of the result of the risk prediction.
  • The danger prediction unit 113 may also calculate a degree of risk indicating the likelihood and severity of an accident based on the speed and traveling direction of the own moving body and the surrounding moving bodies, the distance between the own moving body and surrounding objects, and the like.
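  • One simple way to obtain such a degree of risk is a closest-approach calculation under a constant-velocity assumption; the sketch below, including the 10 m reference distance and the 5 s horizon, is an illustrative assumption rather than a formula taken from this disclosure.

```python
import numpy as np

def risk_degree(own_pos, own_vel, other_pos, other_vel, horizon_s=5.0):
    """Rough risk score in [0, 1] based on how close the own moving body and
    another object are predicted to get within the horizon."""
    rel_pos = np.asarray(other_pos, float) - np.asarray(own_pos, float)
    rel_vel = np.asarray(other_vel, float) - np.asarray(own_vel, float)
    speed2 = float(rel_vel @ rel_vel)
    # Time of closest approach, clamped to [0, horizon].
    t_min = 0.0 if speed2 == 0.0 else float(np.clip(-(rel_pos @ rel_vel) / speed2,
                                                    0.0, horizon_s))
    closest = float(np.linalg.norm(rel_pos + rel_vel * t_min))
    return float(np.clip(1.0 - closest / 10.0, 0.0, 1.0))
```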
  • In step S7, the danger avoidance processing unit 114 determines whether or not there is a risk that the own moving body will have an accident.
  • For example, when the danger avoidance processing unit 114 determines, based on the result of the risk prediction by the danger prediction unit 113, that there is a risk that the own moving body will have an accident, the process proceeds to step S8.
  • Further, in step S109 described later, the server 12a transmits danger notification information to the moving body predicted to be at risk of an accident. When the danger avoidance processing unit 114 receives the danger notification information via the receiving unit 102, it determines that there is a risk that the own moving body will have an accident, and the process proceeds to step S8.
  • In step S8, the danger avoidance processing unit 114 performs a process for avoiding the accident.
  • For example, when the own moving body is a vehicle, the danger avoidance processing unit 114 warns the person driving or maneuvering it. Further, for example, when the own moving body is a person, the danger avoidance processing unit 114 warns that person.
  • the warning content can be arbitrarily set. For example, only danger notification may be performed, or a method for avoiding an accident may be notified.
  • the warning method any method such as an image, sound, blinking light, vibration of the operation unit, or the like can be adopted.
  • The danger avoidance processing unit 114 can also control the operation of the own moving body so as to avoid an accident. For example, in order to avoid an accident, the danger avoidance processing unit 114 can stop or decelerate the own moving body that is a vehicle, or change its traveling direction.
  • the processing content may be changed according to the risk level.
  • the danger avoidance processing unit 114 may select and execute a process from stop, deceleration, direction change, warning, or the like of the moving body according to the degree of danger.
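  • Such a selection by degree of risk could look like the following; the thresholds and the set of actions are purely illustrative assumptions.

```python
def choose_avoidance_action(risk):
    """Pick an avoidance process according to the degree of risk."""
    if risk >= 0.8:
        return "stop"           # stop the own moving body
    if risk >= 0.5:
        return "decelerate"     # decelerate the own moving body
    if risk >= 0.3:
        return "change_course"  # change the traveling direction
    return "warn"               # only warn the driver or the person
```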
  • When the danger notification information from the server 12a includes control information, the danger avoidance processing unit 114 performs processing according to the control information.
  • In step S9, the danger prediction unit 113 determines whether or not the predicted accident partner is a moving body. If it is determined that the predicted accident partner is a moving body, the process proceeds to step S10.
  • In step S10, the danger prediction unit 113 notifies the server 12a of the dangerous area. Specifically, the danger prediction unit 113 regards the current position of the own moving body as a dangerous area where there is a risk of an accident with another moving body, and generates dangerous area notification information for notifying the dangerous area.
  • The dangerous area notification information includes, for example, the position of the own moving body, the time when the dangerous area was detected, the position of the other moving body that is the predicted accident partner, and environmental conditions such as the weather.
  • the danger prediction unit 113 transmits the generated dangerous area notification information to the server 12a via the transmission unit 104.
  • the server 12a receives the dangerous area notification information in step S101 of FIG. 7 described later.
  • On the other hand, if it is determined in step S9 that the predicted accident partner is not a moving body but a stationary object, the process of step S10 is skipped, and the process proceeds to step S11.
  • Also, if it is determined in step S7 that there is no risk that the own moving body will have an accident, the processes in steps S8 to S10 are skipped, and the process proceeds to step S11.
  • In step S11, the local map generation unit 204 determines whether to request transmission of the global map. If it is determined to request transmission of the global map, the process proceeds to step S12.
  • The conditions for requesting transmission of the global map can be set arbitrarily. For example, transmission of the global map may be requested when a global map is not stored in the map information storage unit 216, or it may be requested under other conditions.
  • In step S12, the local map generation unit 204 requests transmission of the global map. Specifically, the local map generation unit 204 generates a local map by the same process as in step S5 described above. Then, the local map generation unit 204 generates global map transmission request information including the generated local map, and transmits it to the server 12a via the transmission unit 104.
  • the server 12a receives the global map transmission request information in step S103 of FIG. 7 to be described later, and transmits the global map around the requesting mobile body in step S105.
  • In step S13, the receiving unit 102 receives the global map transmitted from the server 12a.
  • the receiving unit 102 stores the received global map in the map information storage unit 216.
  • Thereafter, the process returns to step S1, and the processes after step S1 are executed.
  • On the other hand, if it is determined in step S11 that transmission of the global map is not requested, the process returns to step S1, and the processes after step S1 are executed.
  • In step S101, the receiving unit 301 starts receiving the position information, local maps, and dangerous area notification information transmitted from each information terminal 11a.
  • the receiving unit 301 stores the received position information in the position information storage unit 302.
  • the receiving unit 301 also stores the received local map and dangerous area notification information in the map information storage unit 303.
  • In step S102, the server 12a starts transmitting dangerous area information.
  • the dangerous area notification unit 311 generates dangerous area information for each dangerous area shown in the dangerous area map stored in the map information storage unit 303.
  • Each dangerous area information includes information indicating the position of the dangerous area.
  • the dangerous area notification unit 311 supplies the generated dangerous area information to the transmission unit 312.
  • the transmission unit 312 transmits, for example, each dangerous area information to a corresponding dangerous area and an area including the vicinity thereof.
  • In step S103, the global map search unit 310 determines whether transmission of a global map has been requested.
  • When the global map transmission request information transmitted from an information terminal 11a in step S12 described above is received, the global map search unit 310 determines that transmission of the global map has been requested, and the process proceeds to step S104.
  • In step S104, the global map search unit 310 searches for the global map around the requesting moving body. Specifically, the global map search unit 310 searches the global map stored in the map information storage unit 303 for the region corresponding to the feature points and feature amounts of the local map included in the global map transmission request information.
  • In step S105, the global map search unit 310 transmits the global map. Specifically, the global map search unit 310 extracts from the global map a map of the region found in the process of step S104 and its surrounding area. The global map search unit 310 then transmits the extracted map to the requesting information terminal 11a via the transmission unit 312.
  • On the other hand, if it is determined in step S103 that transmission of the global map has not been requested, the processes in steps S104 and S105 are skipped, and the process proceeds to step S106.
  • In step S106, the motion prediction unit 306 predicts the motion of each moving body. Specifically, the motion prediction unit 306 predicts the motion of each moving body based on the position information from each information terminal 11a accumulated in the position information storage unit 302 between a predetermined time ago and the present time. The motion prediction unit 306 notifies the risk prediction unit 307 of the motion prediction result of each moving body.
  • The motion prediction unit 306 may further predict the motion of each moving body using the global map. For example, when the global map shows a stationary object in the traveling direction of a moving body predicted from the position information, the motion prediction unit 306 may predict that the moving body will move so as to avoid the stationary object. That is, the motion prediction unit 306 may predict a motion in which the moving body avoids a stationary object on the global map. Thereby, the movement of the moving body can be predicted more accurately.
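  • A minimal sketch of the motion prediction from accumulated position information is given below, using a constant-velocity extrapolation; the disclosure does not fix a motion model, so this is only an assumed example.

```python
def predict_motion(history, dt=0.1, steps=20):
    """Predict future (x, y) positions of one moving body from its recent
    position reports [(t, x, y), ...] by extrapolating the average velocity
    observed over the history (constant-velocity assumption)."""
    history = sorted(history)
    (t0, x0, y0), (t1, x1, y1) = history[0], history[-1]
    span = max(t1 - t0, 1e-6)
    vx, vy = (x1 - x0) / span, (y1 - y0) / span
    return [(x1 + vx * dt * k, y1 + vy * dt * k) for k in range(1, steps + 1)]
```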
  • In step S107, the risk prediction unit 307 performs risk prediction. Specifically, based on the predicted motion of each moving body, the risk prediction unit 307 predicts the risk that each moving body will have an accident such as a collision, rear-end collision, or contact with another moving body.
  • For example, when it is predicted, based on the prediction results of the movements of the moving bodies, that the distance between a plurality of moving bodies will fall within a predetermined distance, the risk prediction unit 307 predicts that there is a risk of an accident between those moving bodies.
  • For example, assume that a vehicle 401 and a vehicle 402, which are moving bodies, run side by side in one lane, and a vehicle 403 and a vehicle 404, which are also moving bodies, run side by side in the opposite lane.
  • Suppose the vehicles 401 to 404 are traveling at speeds v1a to v4a, respectively. If speed v4a > speed v3a and the distance between the vehicle 403 and the vehicle 404 is predicted to fall within a predetermined distance, it is predicted that there is a risk of an accident between the vehicle 403 and the vehicle 404 (a dangerous state).
  • Likewise, suppose the vehicles 401 to 404 are traveling at speeds v1b to v4b, respectively. If the vehicle 402 is traveling toward the oncoming lane and the distance between the vehicle 402 and the vehicle 403 is predicted to fall within a predetermined distance, it is predicted that there is a risk of an accident between the vehicle 402 and the vehicle 403 (a dangerous state).
  • The risk prediction unit 307 may also calculate a degree of risk indicating the likelihood of an accident occurring and the severity of the accident based on the speed and traveling direction of the moving bodies, the distance between the moving bodies, and the like.
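  • Combining such predicted trajectories, the check that the distance between moving bodies falls within a predetermined distance can be sketched as follows; the 3 m threshold and the dictionary input format are assumptions made only for illustration.

```python
import numpy as np

def predict_accidents(trajectories, threshold_m=3.0):
    """trajectories: {body_id: [(x, y), ...]} predicted positions sampled at
    common time steps.  Returns the pairs of moving bodies whose predicted
    distance falls within the predetermined distance (dangerous state)."""
    ids = list(trajectories)
    dangerous_pairs = []
    for i in range(len(ids)):
        for j in range(i + 1, len(ids)):
            a = np.asarray(trajectories[ids[i]], float)
            b = np.asarray(trajectories[ids[j]], float)
            steps = min(len(a), len(b))
            if steps == 0:
                continue
            dists = np.linalg.norm(a[:steps] - b[:steps], axis=1)
            if dists.min() <= threshold_m:
                dangerous_pairs.append((ids[i], ids[j]))
    return dangerous_pairs
```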
  • step S108 the danger prediction unit 307 determines whether or not a dangerous state has been detected. For example, if a risk of an accident between moving bodies has been detected as a result of the risk prediction in step S107, the risk prediction unit 307 determines that a dangerous state has been detected, and the process proceeds to step S109.
  • step S109 the danger notification unit 308 notifies each moving body that is at risk of an accident of the danger.
  • Specifically, the danger prediction unit 307 supplies the danger notification unit 308 with information indicating the position, speed, traveling direction, and the like of each moving body that may be involved in an accident.
  • the danger notification unit 308 generates, for each moving body at risk of an accident, danger notification information including the position, speed, traveling direction, and the like of the other moving body that is the potential accident partner.
  • the danger notification information may include information indicating a method for avoiding an accident, control information used for processing for the mobile body to avoid an accident, and the like.
  • the risk notification unit 308 may change the content of the risk notification information according to the risk level.
  • the danger notification unit 308 may include control information for the moving object in the danger notification information when the risk level is high, and may not include the control information when the risk level is low.
  • the danger notification unit 308 may change the content of the control information included in the danger notification information according to the degree of danger.
  • the danger notification unit 308 may include control information for instructing to stop the moving body when the degree of danger is high, and may include control information for instructing deceleration of the moving body when the degree of danger is low.
  • the danger notification unit 308 transmits the generated danger notification information to the information terminal 11a provided in the mobile body at risk of an accident via the transmission unit 313, respectively.
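  • A minimal sketch of how such notification content might be assembled per endangered body, with the control information varied by risk degree as described above. The field names and the two thresholds are assumptions introduced for the example.

```python
def build_danger_notification(partner, degree, high=2.0, low=0.5):
    """partner: dict with 'position', 'speed', 'heading' of the accident partner."""
    notification = {
        "partner_position": partner["position"],
        "partner_speed": partner["speed"],
        "partner_heading": partner["heading"],
        "risk_degree": degree,
    }
    if degree >= high:
        notification["control"] = {"command": "stop"}        # high risk: stop
    elif degree >= low:
        notification["control"] = {"command": "decelerate"}  # lower risk: slow down
    # below `low`, only the warning itself is sent, with no control information
    return notification
```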
  • On the other hand, if it is determined in step S108 that a dangerous state has not been detected, the process of step S109 is skipped, and the process proceeds to step S110.
  • step S110 the global map update unit 304 determines whether to update the global map. If it is determined to update the global map, the process proceeds to step S111.
  • the conditions for updating the global map can be set arbitrarily.
  • the global map may be updated on a predetermined date, day of the week, or time zone.
  • the global map may be updated at predetermined time intervals.
  • the global map may be updated when the accumulated amount of the local map from each information terminal 11a after the previous update becomes a predetermined amount or more.
  • step S111 the global map update unit 304 updates the global map. For example, the global map update unit 304 compares the local maps accumulated in the map information storage unit 303 since the previous update with the global map stored in the map information storage unit 303, and extracts areas whose information differs. For extracted areas in which the reliability of the local maps is high, the global map update unit 304 replaces that area of the global map with the local map information.
  • the area where the reliability of the local map is high is, for example, an area where the information of the local maps from a predetermined number or more of the information terminals 11a match.
  • Thereby, for example, information on a moving body that has stopped moving, such as a vehicle parked for a long time, can be reflected in the global map.
  • Likewise, when the surroundings change because of fallen trees, fallen rocks, or the like, the change can be reflected in the global map.
  • the global map update unit 304 stores the updated global map in the map information storage unit 303.
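  • As a rough illustration of the reliability-based merge described above, the sketch below models both maps as dictionaries keyed by a cell identifier and overwrites a global-map cell only when at least a minimum number of terminals agree on its content. The grid model and the agreement threshold are assumptions made for the sketch, not this disclosure's map format.

```python
from collections import Counter, defaultdict

def update_global_map(global_map, local_maps, min_agreement=3):
    """local_maps: {terminal_id: {cell_id: content}}. Returns an updated copy."""
    votes = defaultdict(Counter)
    for local_map in local_maps.values():
        for cell, content in local_map.items():
            votes[cell][content] += 1
    updated = dict(global_map)
    for cell, counter in votes.items():
        content, count = counter.most_common(1)[0]
        if count >= min_agreement and updated.get(cell) != content:
            updated[cell] = content  # enough terminals agree: trust the local maps
    return updated
```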
  • On the other hand, if it is determined in step S110 that the global map is not to be updated, the process of step S111 is skipped, and the process proceeds to step S112.
  • step S112 the dangerous area map update unit 305 determines whether or not to update the dangerous area map. If it is determined to update the dangerous area map, the process proceeds to step S113.
  • the conditions for updating the dangerous area map can be arbitrarily set.
  • the dangerous area map may be updated on a predetermined date, day of the week, or time zone.
  • the dangerous area map may be updated at predetermined time intervals.
  • the dangerous area map may be updated in accordance with the update of the global map.
  • the dangerous area map may be updated when the accumulation amount of the dangerous area information from each information terminal 11a after the previous update becomes a predetermined amount or more. .
  • step S113 the dangerous area map update unit 305 updates the dangerous area map. For example, among the areas not currently set as dangerous areas in the dangerous area map, the dangerous area map update unit 305 newly sets as a dangerous area any area for which the frequency of being notified as a dangerous area by the information terminals 11a exceeds a predetermined threshold. Conversely, for an area currently set as a dangerous area, the dangerous area map update unit 305 may remove the dangerous area setting when the frequency of being notified as a dangerous area by the information terminals 11a is very low.
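  • A minimal sketch of that add/remove rule, with notification counts accumulated since the previous update; the threshold values and the set-based representation are assumptions made for the example.

```python
def update_danger_map(danger_areas, notification_counts,
                      add_threshold=10, remove_threshold=1):
    """danger_areas: set of area ids; notification_counts: {area_id: count}."""
    updated = set(danger_areas)
    for area, count in notification_counts.items():
        if area not in updated and count >= add_threshold:
            updated.add(area)          # frequently reported: newly set as dangerous
    for area in list(updated):
        if notification_counts.get(area, 0) <= remove_threshold:
            updated.discard(area)      # rarely reported any more: remove the setting
    return updated
```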
  • On the other hand, if it is determined in step S112 that the dangerous area map is not to be updated, the process of step S113 is skipped, and the process proceeds to step S114.
  • step S114 the simulation unit 309 determines whether an accident has occurred. For example, the simulation unit 309 determines that an accident has occurred when it receives accident information indicating the occurrence of an accident of a moving body from an external server, an information terminal 11a, or the like via the receiving unit 301, and the process proceeds to step S115.
  • an eCall system server is assumed as an external server that transmits accident information.
  • step S115 the simulation unit 309 performs setting so as to save position information related to the accident.
  • Specifically, the simulation unit 309 sets, as position information related to the accident that has occurred, the position information that was stored in the position information storage unit 302 within a predetermined period before and after the accident occurrence time and whose indicated position falls within a predetermined range including the accident location.
  • For example, even when old position information is deleted from the position information storage unit 302, the position information related to the accident is retained as it is without being deleted.
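  • The sketch below marks stored records as accident-related using exactly that pair of conditions, a time window around the accident and a radius around its location, so they can be exempted from later purges. The record schema, window, and radius are assumptions for the example.

```python
import math

def mark_accident_records(records, accident_time, accident_pos,
                          time_window=60.0, radius=200.0):
    """records: dicts with 't', 'x', 'y'. Flags and returns the related ones."""
    related = []
    for rec in records:
        close_in_time = abs(rec["t"] - accident_time) <= time_window
        close_in_space = math.hypot(rec["x"] - accident_pos[0],
                                    rec["y"] - accident_pos[1]) <= radius
        if close_in_time and close_in_space:
            rec["protected"] = True   # keep even when old records are purged
            related.append(rec)
    return related
```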
  • On the other hand, if it is determined in step S114 that no accident has occurred, the process of step S115 is skipped, and the process proceeds to step S116.
  • step S116 the simulation unit 309 determines whether an accident simulation execution has been commanded. For example, the simulation unit 309 receives an accident simulation execution command via an input unit (not shown), or receives an accident simulation execution command from another server or the like via the reception unit 301. If so, it is determined that an accident simulation execution has been commanded, and the process proceeds to step S117.
  • step S117 the simulation unit 309 executes an accident simulation.
  • the simulation unit 309 acquires position information related to the accident to be reproduced by the simulation from the position information storage unit 302.
  • the simulation unit 309 generates and displays a simulation image indicating an accident occurrence state based on the acquired position information. For example, in the simulation image, the movement of the moving body before and after the occurrence of the accident near the accident site is reproduced.
  • Alternatively, the simulation unit 309 generates simulation data for displaying the above-described simulation image based on the acquired position information, and transmits the generated simulation data to another device via the transmission unit.
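  • A minimal sketch of turning the stored accident-related records into time-ordered frames that a viewer could replay to reproduce the movement of each body around the accident; the record schema and frame interval are assumptions for the example.

```python
from itertools import groupby

def build_simulation_frames(related_records, frame_interval=0.1):
    """related_records: dicts with 't', 'body_id', 'x', 'y'. Returns frames."""
    records = sorted(related_records, key=lambda r: r["t"])
    frames = []
    for slot, group in groupby(records, key=lambda r: int(r["t"] / frame_interval)):
        frames.append({
            "time": slot * frame_interval,
            "bodies": {r["body_id"]: (r["x"], r["y"]) for r in group},
        })
    return frames
```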
  • Thereafter, the process returns to step S103, and the processes after step S103 are executed.
  • On the other hand, when it is determined in step S116 that execution of the accident simulation has not been instructed, the process returns to step S103, and the processes after step S103 are executed.
  • the accuracy of estimation of the position and speed of the moving body is improved, and the accuracy of the risk prediction of the moving body is improved, so that an accident of the moving body can be surely prevented.
  • For example, when the position of a moving body is measured using GPS, the position cannot be measured accurately in urban areas where radio waves from GPS satellites are difficult to receive. Moreover, the measurement accuracy of civilian GPS positioning is on the order of 10 m. Therefore, when GPS is used, the accuracy of the risk prediction for the moving body may decrease.
  • each information terminal 11a can stably estimate the position, posture, and speed of each moving object with high accuracy regardless of the surrounding conditions and environment.
  • the position of a moving body can be estimated with an accuracy of several centimeters. Therefore, the estimation accuracy of the movement of each moving body is improved, and the accuracy of the danger prediction of the moving body is improved.
  • the accuracy of the risk prediction is improved as compared with the case where the risk prediction is performed by each mobile body alone.
  • SLAM is basically a technique for estimating the position and orientation of a moving object relative to a stationary object, and is not a technique whose main purpose is to recognize surrounding moving objects. In addition, SLAM cannot recognize other moving objects that are not shown in the image in the blind spot of the camera.
  • For example, the field of view in front of the vehicle 431 is blocked by the parked vehicle 433. Therefore, it is difficult for the vehicle 431 to recognize the movement of the vehicle 432 approaching in the oncoming lane.
  • the server 12a collects the positional information of the vehicles 421 and 422, and the vehicles 431 and 432, predicts the movement of each vehicle, and then predicts the danger of each vehicle. As a result, the server 12a can predict and prevent accidents in the vehicles 421 and 422 and accidents in the vehicles 431 and 432.
  • the server 12a collects a local map from each information terminal 11a, and updates the global map based on the collected local map. Therefore, since the change of the situation of each place can be reflected to a global map rapidly and correctly, the precision of the danger prediction of each moving body improves, and the accident of each moving body can be prevented reliably.
  • Furthermore, since the server 12a stores position information related to an accident and can later simulate how the accident occurred, it can support on-site verification, analysis of the cause of the accident, planning of accident-prevention measures, and the like.
  • FIG. 13 shows an example of the functional configuration of the information terminal 11b, which is the second embodiment of the information terminal 11. In the figure, parts corresponding to those in FIG. 2 are denoted by the same reference numerals, and their description is omitted as appropriate.
  • the information terminal 11b is different from the information terminal 11a in FIG. 2 in that an information processing unit 501 is provided instead of the information processing unit 103.
  • the information processing unit 501 is different from the information processing unit 103 in that a dangerous area determination unit 511 is provided instead of the dangerous area determination unit 112.
  • the dangerous area determination unit 511 receives the determination result as to whether or not the mobile body is in the dangerous area from the server 12b (FIG. 14) via the reception unit 102.
  • the dangerous area determination unit 511 determines whether or not the mobile body is in the dangerous area based on the received determination result, and notifies the SLAM processing unit 111 of the determination result.
  • FIG. 14 shows an example of the functional configuration of the server 12b, which is the second embodiment of the server 12. In the figure, parts corresponding to those in FIG. 4 are denoted by the same reference numerals, and their description is omitted as appropriate.
  • the server 12b is different from the server 12a of FIG. 4 in that a global map search unit 601 and a dangerous region determination unit 602 are provided instead of the global map search unit 310 and the dangerous region notification unit 311.
  • the global map search unit 601 receives global map transmission request information from each information terminal 11b via the receiving unit 301. Then, the global map search unit 601 searches for an image of the local map included in the global map transmission request information in the global map stored in the map information storage unit 303. The global map search unit 601 extracts a map of the searched area and its surrounding area from the global map, and transmits the extracted map to the requesting information terminal 11b via the transmission unit 312. In addition, the global map search unit 601 detects the current position of the requesting mobile body based on the search result, and supplies the detection result to the dangerous area determination unit 602.
  • the dangerous area determination unit 602 determines whether or not the current position of the requesting moving body detected by the global map search unit 601 is within a dangerous area indicated in the dangerous area map stored in the map information storage unit 303.
  • the dangerous area determination unit 602 transmits the determination result to the information terminal 11b that has requested the global map via the transmission unit 312.
  • steps S201 to S203 processing similar to that in steps S1 to S3 in FIG. 6 is executed.
  • step S204 the dangerous area determination unit 511 determines whether or not the own moving body is in a dangerous area. Specifically, if the determination result that the own moving body is in a dangerous area has been received from the server 12b, the dangerous area determination unit 511 determines that the own moving body is in a dangerous area, and the process proceeds to step S205.
  • step S205 the position information and the local map are transmitted to the server 12b, as in the process of step S5 of FIG. 6.
  • On the other hand, if the dangerous area determination unit 511 receives from the server 12b the determination result that the own moving body is outside the dangerous area, or if no determination result has been received from the server 12b, it determines in step S204 that the own moving body is not in a dangerous area. Then, the process of step S205 is skipped, and the process proceeds to step S206.
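  • A small sketch of this terminal-side behaviour: uploads of position information and the local map are gated on the server's latest dangerous-area determination. The message fields and the callback structure are assumptions introduced for the example, not this disclosure's protocol.

```python
class DangerAreaGate:
    """Holds the server's latest determination and gates uploads on it."""

    def __init__(self):
        self.in_danger_area = False

    def on_server_message(self, message):
        # `message` is assumed to carry the server's determination result
        self.in_danger_area = bool(message.get("in_danger_area", False))

    def maybe_upload(self, send, position_info, local_map):
        if self.in_danger_area:
            send({"position": position_info, "local_map": local_map})
            return True
        return False  # outside a dangerous area: nothing is transmitted
```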
  • steps S206 to S210 processing similar to that in steps S6 to S10 in FIG. 6 is executed.
  • step S211 it is determined whether or not to request transmission of the global map, as in the process of step S11 of FIG. 6. If it is determined to request transmission of the global map, the process proceeds to step S212.
  • step S212 global map transmission request information is transmitted to the server 12b in the same manner as in step S12 of FIG. 6.
  • the server 12b receives the global map transmission request information in step S302 of FIG. 16 to be described later, and in step S305 transmits the global map around the requesting moving body together with the dangerous area determination result.
  • step S213 the receiving unit 102 receives the global map and the dangerous area determination result transmitted from the server 12b.
  • the receiving unit 102 stores the received global map in the map information storage unit 216 and supplies the dangerous area determination result to the dangerous area determination unit 511.
  • Thereafter, the process returns to step S201, and the processes after step S201 are executed.
  • On the other hand, if it is determined in step S211 that transmission of the global map is not to be requested, the process returns to step S201, and the processes after step S201 are executed.
  • step S301 reception of the position information, local maps, and dangerous area notification information transmitted from the information terminals 11b is started, in the same manner as in step S101 of FIG. 7.
  • step S302 it is determined whether or not transmission of the global map has been requested, in the same manner as in step S103 of FIG. 7. If it is determined that transmission of the global map has been requested, the process proceeds to step S303.
  • step S303 the global map around the requesting moving body is searched for, in the same manner as in step S104 of FIG. 7.
  • In addition, the global map search unit 601 detects the current position of the requesting moving body based on the search result, and supplies the detection result to the dangerous area determination unit 602.
  • step S304 the dangerous area determination unit 602 determines whether or not the requesting moving body is in a dangerous area. Specifically, the dangerous area determination unit 602 determines whether or not the current position of the requesting moving body detected by the global map search unit 601 is within a dangerous area indicated in the dangerous area map stored in the map information storage unit 303.
  • step S305 the server 12b transmits the global map and the determination result of the dangerous area.
  • Specifically, the global map search unit 601 extracts from the global map a map of the area found in the process of step S303 and its surrounding area.
  • the global map search unit 601 supplies the extracted map to the transmission unit 312.
  • the dangerous area determination unit 602 notifies the transmission unit 312 of the determination result in the process of step S304.
  • the transmission unit 312 transmits the map supplied from the global map search unit 310 and the determination result notified from the dangerous area determination unit 602 to the requesting information terminal 11b.
  • On the other hand, if it is determined in step S302 that transmission of the global map is not requested, the processes in steps S303 to S305 are skipped, and the process proceeds to step S306.
  • steps S306 to S317 processing similar to that in steps S106 to S117 of FIG. 7 is executed. Then the process returns to step S302, and the processes after step S302 are executed.
  • As described above, since the server 12b determines whether or not each moving body is in a dangerous area, the processing load of each information terminal 11b can be reduced.
  • The third embodiment differs from the first embodiment in the configuration of the SLAM processing unit 111 of the information terminal 11.
  • FIG. 18 shows a functional configuration example of the SLAM processing unit 111b, which is the second embodiment of the SLAM processing unit 111 of FIG. 2. In the figure, parts corresponding to those in FIG. 3 are denoted by the same reference numerals, and their description is omitted as appropriate.
  • the SLAM processing unit 111b is different from the SLAM processing unit 111a in FIG. 3 in that an estimation unit 701 is provided instead of the estimation unit 201.
  • the estimation unit 701 differs from the estimation unit 201 in that a feature amount calculation unit 711, an image search unit 712, a feature point matching unit 713, a position / orientation estimation unit 714, and a movement amount estimation unit 715 are provided in place of the feature amount calculation unit 215, the motion matching unit 217, the movement amount estimation unit 218, the object dictionary storage unit 219, the object recognition unit 220, and the position and orientation estimation unit 222.
  • the feature amount calculation unit 711 calculates the feature amount of each feature point of the left image in the same manner as the feature amount calculation unit 215 of the SLAM processing unit 111a.
  • the feature quantity calculation unit 711 supplies the feature point information regarding each feature point to the image search unit 712 and the feature point matching unit 713 and causes the map information storage unit 216 to store the feature point information.
  • the image search unit 712 searches for an image similar to the left image from the global map stored in the map information storage unit 216 based on the feature amount information supplied from the feature amount calculation unit 711. That is, the image search unit 712 searches the global map for an image around the moving object.
  • the image search unit 712 supplies the detected image (hereinafter referred to as a similar image) to the feature point matching unit 713.
  • the feature point matching unit 713 detects a feature point of a similar image corresponding to the feature point detected by the feature point detection unit 212 based on the feature amount information supplied from the feature amount calculation unit 711.
  • the feature point matching unit 713 supplies the similar image and the detection result of the feature point to the position / orientation estimation unit 714.
  • the position / orientation estimation unit 714 estimates the position and orientation of the own moving body (more precisely, of the camera 101L) in the global map (that is, in the spatial coordinate system) based on the positions of the corresponding feature points in the similar image.
  • the position / orientation estimation unit 714 supplies position / orientation information indicating the position and orientation of the own moving body to the dangerous area determination unit 112, the danger prediction unit 113, the position information generation unit 202, the speed estimation unit 223, and the movement amount estimation unit 715, and stores it in the position / orientation information storage unit 221.
  • Based on the position and orientation information of the previous frame and the current frame stored in the position / orientation information storage unit 221, the movement amount estimation unit 715 estimates the amount of movement of the position and orientation of the own moving body between the previous frame and the current frame. The movement amount estimation unit 715 supplies movement amount information indicating the estimated movement amount of the position and orientation of the moving body to the object detection unit 203 and the speed estimation unit 223.
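  • A rough sketch of this style of localisation against a stored map, using off-the-shelf OpenCV primitives: ORB features are computed for the current left image, the most similar stored keyframe is found by descriptor matching, and the camera pose is then estimated with PnP from that keyframe's known 3-D feature positions. The keyframe structure and thresholds are assumptions made for the example, and the real pipeline described above differs in detail.

```python
import cv2
import numpy as np

def estimate_pose_against_map(left_image, map_keyframes, camera_matrix):
    """map_keyframes: dicts with 'descriptors' (ORB) and 'points_3d' per keypoint."""
    orb = cv2.ORB_create(1000)
    keypoints, descriptors = orb.detectAndCompute(left_image, None)
    if descriptors is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    best = None
    for frame in map_keyframes:                      # crude "image search" step
        matches = matcher.match(descriptors, frame["descriptors"])
        if best is None or len(matches) > len(best[1]):
            best = (frame, matches)
    if best is None or len(best[1]) < 6:
        return None

    frame, matches = best
    obj_pts = np.float32([frame["points_3d"][m.trainIdx] for m in matches])
    img_pts = np.float32([keypoints[m.queryIdx].pt for m in matches])
    ok, rvec, tvec, _ = cv2.solvePnPRansac(obj_pts, img_pts, camera_matrix, None)
    return (rvec, tvec) if ok else None              # camera pose in map coordinates
```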
  • the position and orientation of the moving object are directly estimated using the global map.
  • In the above description, the SLAM processing unit 111a performs motion matching, movement amount estimation, position and orientation estimation, and object detection using images of adjacent frames (the image of the previous frame and the image of the current frame);
  • however, these processes may be performed using images that are two or more frames apart (for example, an image from N frames earlier, where N is two or more, and the image of the current frame). The same applies to the SLAM processing unit 111b.
  • a risk level may be set for each risk area of the risk area map, and the information terminal 11 may perform processing according to the risk level.
  • Further, the dangerous areas and their degrees of danger differ depending on the time zone, season, weather, and so on. Therefore, a plurality of dangerous area maps may be created according to the time zone, season, weather, and so on, and used selectively depending on the situation.
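  • A small sketch of selecting among such context-specific maps, falling back to coarser keys when no exact match exists; the keying scheme is an assumption made for the example.

```python
def select_danger_map(danger_maps, time_zone, season, weather, default=None):
    """danger_maps is keyed by tuples of increasing specificity."""
    for key in ((time_zone, season, weather),
                (time_zone, season),
                (time_zone,)):
        if key in danger_maps:
            return danger_maps[key]   # most specific available map wins
    return default
```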
  • In the above description, the information terminal 11 notifies the server 12, as a dangerous area, of a location where a risk of an accident with another moving body has been detected.
  • However, the dangerous area may be detected by a different method. For example, a location where, in the left image or the right image, the area blocking the field of view in the traveling direction occupies a predetermined ratio or more may be detected as a dangerous area.
  • Further, each information terminal 11 may predict the movement of its own moving body by the same process as the motion prediction unit 306 of the server 12a,
  • and the prediction result may be included in the position information and transmitted to the server 12.
  • the server 12 may perform risk prediction of each mobile object based on the prediction result of the movement of the mobile object from each information terminal 11.
  • the cameras 101L and 101R may be provided outside the information terminal 11.
  • the present technology can also be applied to cases where the moving body is something other than a vehicle, such as a person, an airplane, a helicopter, or a drone.
  • the present technology can be applied not only to a vehicle that moves by a prime mover, but also to a vehicle that is driven by a rail or an overhead line, a vehicle that moves by human power, and the like.
  • the present technology can be applied regardless of differences in vehicle driving methods (for example, automatic driving, manual driving, remote control, etc.).
  • SLAM technology can be applied by photographing the ground from the moving body.
  • This technology can also be applied to prevent accidents between two or more types of moving objects.
  • the present invention can be applied to prevent accidents between cars, bicycles and people, accidents between cars and trains, and the like.
  • FIG. 19 is a block diagram showing an example of the hardware configuration of a computer that executes the above-described series of processing by a program.
  • In the computer, a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 902, and a RAM (Random Access Memory) 903 are connected to one another via a bus 904.
  • An input / output interface 905 is further connected to the bus 904.
  • An input unit 906, an output unit 907, a storage unit 908, a communication unit 909, and a drive 910 are connected to the input / output interface 905.
  • the input unit 906 includes a keyboard, a mouse, a microphone, and the like.
  • the output unit 907 includes a display, a speaker, and the like.
  • the storage unit 908 includes a hard disk, a nonvolatile memory, and the like.
  • the communication unit 909 includes a network interface or the like.
  • the drive 910 drives a removable medium 911 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
  • In the computer configured as described above, the CPU 901, for example, loads the program stored in the storage unit 908 into the RAM 903 via the input / output interface 905 and the bus 904 and executes it, whereby the above-described series of processing is performed.
  • the program executed by the computer (CPU 901) can be provided by being recorded on a removable medium 911 as a package medium, for example.
  • the program can be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
  • the program can be installed in the storage unit 908 via the input / output interface 905 by attaching the removable medium 911 to the drive 910.
  • the program can be received by the communication unit 909 via a wired or wireless transmission medium and installed in the storage unit 908.
  • the program can be installed in the ROM 902 or the storage unit 908 in advance.
  • the program executed by the computer may be a program in which processing is performed in time series in the order described in this specification, or a program in which processing is performed in parallel or at necessary timing such as when a call is made.
  • In this specification, a system means a set of a plurality of components (devices, modules (parts), and the like), and it does not matter whether all the components are in the same housing. Accordingly, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
  • the present technology can take a cloud computing configuration in which one function is shared by a plurality of devices via a network and is jointly processed.
  • each step described in the above flowchart can be executed by one device or can be shared by a plurality of devices.
  • the plurality of processes included in the one step can be executed by being shared by a plurality of apparatuses in addition to being executed by one apparatus.
  • the information terminal 11 can be realized as a device mounted on any type of vehicle such as an automobile, an electric car, a hybrid electric car, and a motorcycle.
  • FIG. 20 is a block diagram illustrating an example of a schematic configuration of a vehicle control system 1000 to which the present technology can be applied.
  • the vehicle control system 1000 includes a plurality of electronic control units connected via a communication network 1001.
  • the vehicle control system 1000 includes a drive system control unit 1002, a body system control unit 1004, a battery control unit 1005, a vehicle exterior information detection device 1007, a vehicle interior information detection device 1010, and an integrated control unit 1012.
  • the communication network 1001 that connects the plurality of control units may be an in-vehicle communication network compliant with an arbitrary standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or FlexRay (registered trademark).
  • Each control unit includes a microcomputer that performs arithmetic processing according to various programs, a storage unit that stores the programs executed by the microcomputer and the parameters used for various calculations, and a drive circuit that drives the devices to be controlled.
  • Each control unit includes a network I / F for communicating with other control units via the communication network 1001, and a communication I / F for wired or wireless communication with devices or sensors inside and outside the vehicle.
  • In FIG. 20, as the functional configuration of the integrated control unit 1012, a microcomputer 1051, a general-purpose communication I / F 1052, a dedicated communication I / F 1053, a positioning unit 1054, a beacon receiving unit 1055, an in-vehicle device I / F 1056, an audio image output unit 1057, an in-vehicle network I / F 1058, and a storage unit 1059 are illustrated.
  • other control units include a microcomputer, a communication I / F, a storage unit, and the like.
  • the drive system control unit 1002 controls the operation of the device related to the drive system of the vehicle according to various programs.
  • the drive system control unit 1002 includes a driving force generator for generating a driving force of a vehicle such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to wheels, and a steering angle of the vehicle. It functions as a control device such as a steering mechanism that adjusts and a braking device that generates a braking force of the vehicle.
  • the drive system control unit 1002 may have a function as a control device such as an ABS (Antilock Brake System) or an ESC (Electronic Stability Control).
  • a vehicle state detection unit 1003 is connected to the drive system control unit 1002.
  • the vehicle state detection unit 1003 includes, for example, a gyro sensor that detects the angular velocity of the axial rotational motion of the vehicle body, an acceleration sensor that detects the acceleration of the vehicle, or an accelerator pedal operation amount, a brake pedal operation amount, and a steering wheel steering. At least one of sensors for detecting an angle, an engine speed, a rotational speed of a wheel, or the like is included.
  • the drive system control unit 1002 performs arithmetic processing using a signal input from the vehicle state detection unit 1003, and controls an internal combustion engine, a drive motor, an electric power steering device, a brake device, or the like.
  • the body system control unit 1004 controls the operation of various devices mounted on the vehicle body according to various programs.
  • the body system control unit 1004 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as a headlamp, a back lamp, a brake lamp, a blinker, or a fog lamp.
  • Radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 1004.
  • the body system control unit 1004 receives the input of these radio waves or signals, and controls the vehicle door lock device, power window device, lamp, and the like.
  • the battery control unit 1005 controls the secondary battery 1006 that is a power supply source of the drive motor according to various programs. For example, information such as a battery temperature, a battery output voltage, or a remaining battery capacity is input to the battery control unit 1005 from a battery device including the secondary battery 1006. The battery control unit 1005 performs arithmetic processing using these signals, and controls the temperature adjustment control of the secondary battery 1006 or the cooling device provided in the battery device.
  • the vehicle outside information detection device 1007 detects information outside the vehicle on which the vehicle control system 1000 is mounted.
  • the imaging unit 1008 and the outside information detection unit 1009 is connected to the outside information detection device 1007.
  • the imaging unit 1008 includes at least one of a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras.
  • the vehicle exterior information detection unit 1009 includes, for example, an environmental sensor for detecting the current weather or meteorological conditions, or a surrounding information detection sensor for detecting other vehicles, obstacles, pedestrians, and the like around the vehicle on which the vehicle control system 1000 is mounted.
  • the environmental sensor may be, for example, at least one of a raindrop sensor that detects rainy weather, a fog sensor that detects fog, a sunshine sensor that detects sunlight intensity, and a snow sensor that detects snowfall.
  • the ambient information detection sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) device.
  • the imaging unit 1008 and the vehicle exterior information detection unit 1009 may be provided as independent sensors or devices, or may be provided as a device in which a plurality of sensors or devices are integrated.
  • FIG. 21 shows an example of installation positions of the imaging unit 1008 and the vehicle exterior information detection unit 1009.
  • the imaging units 1101F, 1101L, 1101R, 1101B, and 1101C are provided, for example, at least at one of the following positions of the vehicle 1100: the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior.
  • the imaging unit 1101F provided in the front nose and the imaging unit 1101C provided in the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 1100.
  • the imaging units 1101L and 1101R provided in the side mirror mainly acquire an image of the side of the vehicle 1100.
  • the imaging unit 1101B provided in the rear bumper or the back door mainly acquires an image behind the vehicle 1100.
  • the imaging unit 1101C provided on the upper part of the windshield in the passenger compartment is mainly used for detecting a preceding vehicle or a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
  • FIG. 21 shows an example of shooting ranges of the respective imaging units 1101F, 1101L, 1101R, and 1101B.
  • the imaging range a indicates the imaging range of the imaging unit 1101F provided in the front nose
  • the imaging ranges b and c indicate the imaging ranges of the imaging units 1101L and 1101R provided in the side mirrors, respectively
  • and the imaging range d indicates the imaging range of the imaging unit 1101B provided in the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 1101F, 1101L, 1101R, and 1101B, an overhead image of the vehicle 1100 as viewed from above is obtained.
  • Vehicle exterior information detection units 1102F, 1102FL, 1102FR, 1102ML, 1102MR, 1102C, 1102BL, 1102BR, and 1102B provided on the front, rear, sides, corners, and upper part of the windshield in the vehicle interior of the vehicle 1100 may be, for example, ultrasonic sensors or radar devices.
  • the vehicle outside information detection units 1102F, 1102C, and 1102B provided on the front nose, the rear bumper, the back door, and the windshield in the vehicle interior of the vehicle 1100 may be LIDAR devices, for example.
  • These outside information detection units 1102F to 1102B are mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, and the like.
  • the vehicle outside information detection device 1007 causes the imaging unit 1008 to capture an image outside the vehicle and receives the captured image data. Further, the vehicle exterior information detection device 1007 receives detection information from the vehicle exterior information detection unit 1009 connected thereto. When the vehicle exterior information detection unit 1009 is an ultrasonic sensor, a radar device, or a LIDAR device, the vehicle exterior information detection device 1007 transmits ultrasonic waves, electromagnetic waves, or the like, and receives received reflected wave information.
  • the vehicle outside information detection device 1007 may perform an object detection process or a distance detection process such as a person, a car, an obstacle, a sign, or a character on a road surface based on the received information.
  • the vehicle outside information detection device 1007 may perform an environment recognition process for recognizing rainfall, fog, road surface conditions, or the like based on the received information.
  • the vehicle outside information detection device 1007 may calculate a distance to an object outside the vehicle based on the received information.
  • the vehicle outside information detection device 1007 may perform an image recognition process or a distance detection process for recognizing a person, a car, an obstacle, a sign, a character on a road surface, or the like based on the received image data.
  • Further, the vehicle exterior information detection device 1007 may perform processing such as distortion correction or alignment on the received image data, and may combine image data captured by different imaging units 1008 to generate an overhead image or a panoramic image.
  • the vehicle exterior information detection device 1007 may perform viewpoint conversion processing using image data captured by different imaging units 1008.
  • the in-vehicle information detection device 1010 detects in-vehicle information.
  • a driver state detection unit 1011 that detects the driver's state is connected to the in-vehicle information detection apparatus 1010.
  • the driver state detection unit 1011 may include a camera that captures an image of the driver, a biological sensor that detects biological information of the driver, a microphone that collects sound in the passenger compartment, and the like.
  • the biometric sensor is provided, for example, on a seat surface or a steering wheel, and detects biometric information of an occupant sitting on the seat or a driver holding the steering wheel.
  • the in-vehicle information detection device 1010 may calculate the degree of fatigue or the degree of concentration of the driver based on the detection information input from the driver state detection unit 1011, and may determine whether the driver is dozing off.
  • the vehicle interior information detection apparatus 1010 may perform a process such as a noise canceling process on the collected audio signal.
  • the integrated control unit 1012 controls the overall operation in the vehicle control system 1000 according to various programs.
  • An input unit 1018 is connected to the integrated control unit 1012.
  • the input unit 1018 is realized by a device that can be input by a passenger, such as a touch panel, a button, a microphone, a switch, or a lever.
  • the input unit 1018 may be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or a PDA (Personal Digital Assistant) that supports operation of the vehicle control system 1000.
  • the input unit 1018 may be, for example, a camera. In that case, the passenger can input information by gesture.
  • the input unit 1018 may include, for example, an input control circuit that generates an input signal based on information input by a passenger or the like using the input unit 1018 and outputs the input signal to the integrated control unit 1012.
  • a passenger or the like operates the input unit 1018 to input various data to the vehicle control system 1000 or instruct a processing operation.
  • the storage unit 1059 may include a RAM (Random Access Memory) that stores various programs executed by the microcomputer and a ROM (Read Only Memory) that stores various parameters, calculation results, sensor values, and the like.
  • the storage unit 1059 may be realized by a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
  • General-purpose communication I / F 1052 is a general-purpose communication I / F that mediates communication with various devices existing in the external environment 1016.
  • the general-purpose communication I / F 1052 may implement a cellular communication protocol such as GSM (registered trademark) (Global System of Mobile communications), WiMAX, LTE (Long Term Evolution), or LTE-A (LTE-Advanced), or another wireless communication protocol such as wireless LAN (also referred to as Wi-Fi (registered trademark)).
  • the general-purpose communication I / F 1052 may connect, for example via a base station or an access point, to a device (for example, an application server or a control server) existing on an external network (for example, the Internet, a cloud network, or an operator-specific network). Further, the general-purpose communication I / F 1052 may connect, for example using P2P (Peer To Peer) technology, to a terminal existing in the vicinity of the vehicle (for example, a pedestrian's or a store's terminal, or an MTC (Machine Type Communication) terminal).
  • a terminal for example, a pedestrian or a store terminal, or an MTC (Machine Type Communication) terminal
  • P2P Peer To To Peer
  • the dedicated communication I / F 1053 is a communication I / F that supports a communication protocol formulated for use in a vehicle.
  • the dedicated communication I / F 1053 may implement a standard protocol such as WAVE (Wireless Access in Vehicle Environment) or DSRC (Dedicated Short Range Communication) that is a combination of the lower layer IEEE 802.11p and the upper layer IEEE 1609 .
  • the dedicated communication I / F 1053 typically performs V2X communication, a concept that includes one or more of vehicle-to-vehicle communication, vehicle-to-infrastructure communication, and vehicle-to-pedestrian communication.
  • the positioning unit 1054 receives a GNSS signal from a GNSS (Global Navigation Satellite System) satellite (for example, a GPS signal from a GPS (Global Positioning System) satellite), performs positioning, and generates position information including the latitude, longitude, and altitude of the vehicle.
  • the positioning unit 1054 may specify the current position by exchanging signals with the wireless access point, or may acquire position information from a terminal such as a mobile phone, PHS, or smartphone having a positioning function.
  • the beacon receiving unit 1055 receives, for example, radio waves or electromagnetic waves transmitted from radio stations or the like installed on the road, and acquires information such as the current position, traffic jams, closed roads, or required time. Note that the function of the beacon receiving unit 1055 may be included in the dedicated communication I / F 1053 described above.
  • the in-vehicle device I / F 1056 is a communication interface that mediates connections between the microcomputer 1051 and various devices existing in the vehicle.
  • the in-vehicle device I / F 1056 may establish a wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), or WUSB (Wireless USB).
  • the in-vehicle device I / F 1056 may establish a wired connection via a connection terminal (and a cable if necessary).
  • the in-vehicle device I / F 1056 exchanges a control signal or a data signal with, for example, a mobile device or wearable device that a passenger has, or an information device that is carried in or attached to the vehicle.
  • the in-vehicle network I / F 1058 is an interface that mediates communication between the microcomputer 1051 and the communication network 1001.
  • the in-vehicle network I / F 1058 transmits and receives signals and the like in accordance with a predetermined protocol supported by the communication network 1001.
  • the microcomputer 1051 of the integrated control unit 1012 controls the vehicle control system 1000 according to various programs based on information acquired via at least one of the general-purpose communication I / F 1052, the dedicated communication I / F 1053, the positioning unit 1054, the beacon receiving unit 1055, the in-vehicle device I / F 1056, and the in-vehicle network I / F 1058.
  • For example, the microcomputer 1051 may calculate a control target value of the driving force generation device, the steering mechanism, or the braking device based on the acquired information inside and outside the vehicle, and may output a control command to the drive system control unit 1002.
  • the microcomputer 1051 may perform cooperative control for the purpose of vehicle collision avoidance or impact mitigation, follow-up travel based on inter-vehicle distance, vehicle speed maintenance travel, automatic driving, and the like.
  • the microcomputer 1051 may create local map information including peripheral information on the current position of the vehicle based on information acquired via at least one of the general-purpose communication I / F 1052, the dedicated communication I / F 1053, the positioning unit 1054, the beacon receiving unit 1055, the in-vehicle device I / F 1056, and the in-vehicle network I / F 1058. Further, the microcomputer 1051 may predict a danger such as a vehicle collision, the approach of a pedestrian or the like, or entry onto a closed road based on the acquired information, and may generate a warning signal. The warning signal may be, for example, a signal for generating a warning sound or lighting a warning lamp.
  • the sound image output unit 1057 transmits an output signal of at least one of sound and image to an output device capable of visually or audibly notifying information to a vehicle occupant or the outside of the vehicle.
  • an audio speaker 1013, a display unit 1014, and an instrument panel 1015 are illustrated as output devices.
  • the display unit 1014 may include at least one of an on-board display and a head-up display, for example.
  • the display unit 1014 may have an AR (Augmented Reality) display function.
  • the output device may be another device such as a headphone, a projector, or a lamp other than these devices.
  • When the output device is a display device, the display device visually displays results obtained by the various processes performed by the microcomputer 1051, or information received from other control units, in various formats such as text, images, tables, and graphs. When the output device is an audio output device, the audio output device converts an audio signal composed of reproduced audio data, acoustic data, or the like into an analog signal and outputs it audibly.
  • At least two control units connected via the communication network 1001 may be integrated as one control unit.
  • each control unit may be configured by a plurality of control units.
  • the vehicle control system 1000 may include another control unit not shown.
  • some or all of the functions of any of the control units may be given to other control units. That is, as long as information is transmitted and received via the communication network 1001, the predetermined arithmetic processing may be performed by any of the control units.
  • a sensor or device connected to one of the control units may be connected to another control unit, and a plurality of control units may transmit / receive detection information to / from each other via the communication network 1001. .
  • the information processing unit 103 of the information terminal 11a in FIG. 2 and the information processing unit 501 of the information terminal 11b in FIG. 13 can be applied to the integrated control unit 1012 in FIG.
  • the reception unit 102 and the transmission unit 104 of the information terminal 11a in FIG. 2 and the information terminal 11b in FIG. 13 can be applied to the general-purpose communication I / F 1052 in FIG.
  • the cameras 101L and 101R of the information terminal 11a in FIG. 2 and the information terminal 11b in FIG. 13 can be applied to the imaging unit 1008 in FIG. It is assumed that the cameras 101L and 101R are provided at the positions of the vehicle exterior information detection units 1102FL and 1102FR in FIG. 21, for example, in order to increase the baseline length.
  • At least some of the components of the information processing unit 103 and the information processing unit 501 may be realized in a module (for example, an integrated circuit module configured with one die) for the integrated control unit 1012 illustrated in FIG. 20.
  • the information processing unit 103 and the information processing unit 501 may be realized by a plurality of control units of the vehicle control system 1000 illustrated in FIG.
  • a computer program for realizing the functions of the information processing unit 103 and the information processing unit 501 can be installed in any control unit or the like. It is also possible to provide a computer-readable recording medium in which such a computer program is stored.
  • the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like. Further, the above computer program may be distributed via a network, for example, without using a recording medium.
  • the present technology can take the following configurations.
  • An information processing apparatus comprising: a receiving unit that receives, from an information terminal provided in each moving body, position information indicating the position and speed of the moving body estimated based on an image photographed from that moving body; and a risk prediction unit that predicts an accident between the moving bodies based on the movements of the moving bodies predicted based on the estimated positions and speeds of the moving bodies.
  • the information processing apparatus according to (1) further including a motion prediction unit that predicts a motion of each moving body based on the estimated position and speed of each moving body.
  • The information processing apparatus according to (2), wherein the receiving unit receives, from each information terminal, a local map indicating the positions in a three-dimensional space of feature points in an image photographed from each moving body, the apparatus further comprises a global map update unit that updates, based on the received local maps, a global map indicating the positions of feature points in a predetermined area in the three-dimensional space, and the motion prediction unit further predicts the motion of each moving body based on the global map.
  • the motion prediction unit predicts a motion in which each moving body avoids a stationary object on the global map.
  • The information processing apparatus according to (1), wherein the position information includes the movement of each moving body predicted by each information terminal, and the risk prediction unit predicts an accident between the moving bodies based on the movements of the moving bodies predicted by the information terminals.
  • The information processing apparatus according to any one of (1) to (5), further comprising a danger notification unit that notifies the danger to the information terminal provided in a moving body that the risk prediction unit has predicted to be at risk of an accident.
  • The information processing apparatus according to any one of (1) to (7), wherein the receiving unit receives, from each information terminal, a detection result of a dangerous area where an accident may occur, the apparatus further comprising: a dangerous area map update unit that updates a dangerous area map indicating the dangerous areas based on the detection results of the dangerous areas by the information terminals; and a dangerous area notification unit that notifies each of the information terminals of the dangerous areas based on the dangerous area map.
  • The information processing apparatus according to any one of (1) to (7), wherein the receiving unit receives, from each information terminal, a detection result of a dangerous area where an accident may occur, the apparatus further comprising: a dangerous area map update unit that updates a dangerous area map indicating the dangerous areas based on the detection results of the dangerous areas by the information terminals; and a dangerous area determination unit that determines, based on the dangerous area map, whether or not each moving body is in a dangerous area and notifies the information terminal of the determination result.
  • the information processing apparatus according to any one of (1) to (9), further including a storage unit that stores location information related to the accident when the accident information indicating the occurrence of the accident of the mobile object is received.
  • the information processing apparatus further including a simulation unit that simulates an accident occurrence state of the moving body based on the position information stored in the storage unit.
  • An information processing method in which an information processing device performs: a receiving step of receiving, from an information terminal provided in each moving body, position information indicating the position and speed of the moving body estimated based on an image captured from that moving body; and a risk prediction step of predicting an accident between the moving bodies based on the movements of the moving bodies predicted based on the estimated positions and speeds of the moving bodies.
  • An information terminal comprising: an estimation unit that estimates the position and speed of the moving body based on an image taken from the moving body; a transmission unit that transmits position information including the estimated position and speed of the moving body to an information processing device; and a risk avoidance processing unit that performs a process for avoiding an accident when a risk that the moving body will have an accident is notified from the information processing device.
  • the estimation unit estimates a position and a speed of the moving body based on a relative position between a feature point in an image captured from the moving body and the moving body.
  • The information terminal according to (14), further comprising a local map generation unit that generates a local map indicating the positions of the feature points in a three-dimensional space, wherein the transmission unit further transmits the local map to the information processing device.
  • The information terminal according to (14) or (15), further comprising: an object detection unit that detects an object around the moving body based on the feature points; and a risk prediction unit that predicts an accident of the moving body based on the estimated position and speed of the moving body and the detection result of the object, wherein the risk avoidance processing unit further performs a process for avoiding an accident when the risk prediction unit predicts a risk that the moving body will have an accident.
  • The information terminal according to any one of (13) to (16), wherein the estimation unit predicts the movement of the moving body based on the estimated position and speed of the moving body, and the transmission unit transmits the position information including the prediction result of the movement of the moving body to the information processing device.
  • The information terminal according to any one of (13) to (17), wherein the transmission unit transmits the position information to the information processing device when it is determined, based on information from the information processing device, that the moving body is in a dangerous area where there is a risk of an accident.
  • The information terminal according to (18), wherein the risk prediction unit further detects the dangerous area, and the transmission unit transmits the detection result of the dangerous area to the information processing device.
  • An information processing method for an information terminal provided on a moving body, the method comprising: an estimation step of estimating the position and speed of the moving body based on images captured from the moving body; a transmission step of transmitting position information including the estimated position and speed of the moving body to an information processing device; and a risk avoidance processing step of performing processing for avoiding an accident when the information processing device issues a notification that there is a risk of the moving body having an accident. (An illustrative sketch of this terminal-side flow follows this list.)
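
The items above describe the terminal side only at the architectural level. The following is a minimal illustrative sketch, not part of the publication, of how an information terminal might combine an estimation unit, a transmission unit, and a risk avoidance processing unit. All names (PositionInfo, estimate_motion, InformationTerminal, send_fn) are hypothetical, and the feature-point arithmetic is a crude stand-in for a real visual-odometry pipeline.

from dataclasses import dataclass
from typing import Callable, List, Tuple
import time

@dataclass
class PositionInfo:
    terminal_id: str
    timestamp: float
    position: Tuple[float, float]   # metres, in a shared map frame
    velocity: Tuple[float, float]   # metres per second

def estimate_motion(prev_pts: List[Tuple[float, float]],
                    curr_pts: List[Tuple[float, float]],
                    dt: float) -> Tuple[float, float]:
    # Crude stand-in for visual odometry: take the terminal's own velocity to be
    # the negative of the mean displacement of static feature points between frames.
    n = len(prev_pts)
    dx = sum(c[0] - p[0] for p, c in zip(prev_pts, curr_pts)) / n
    dy = sum(c[1] - p[1] for p, c in zip(prev_pts, curr_pts)) / n
    return (-dx / dt, -dy / dt)

class InformationTerminal:
    def __init__(self, terminal_id: str, send_fn: Callable[[PositionInfo], None]):
        self.terminal_id = terminal_id
        self.send_fn = send_fn              # transmission unit: e.g. a socket or HTTP call
        self.position = (0.0, 0.0)

    def on_frame(self, prev_pts, curr_pts, dt):
        # Estimation unit: update own position and speed from tracked feature points.
        vx, vy = estimate_motion(prev_pts, curr_pts, dt)
        self.position = (self.position[0] + vx * dt, self.position[1] + vy * dt)
        self.send_fn(PositionInfo(self.terminal_id, time.time(), self.position, (vx, vy)))

    def on_danger_notification(self):
        # Risk avoidance processing unit: warn the driver, brake, replan, etc.
        print(f"[{self.terminal_id}] danger notified - starting avoidance processing")

# Example: feature points appear to shift 0.5 m backwards over 0.1 s,
# so the terminal is estimated to be moving forwards at 5 m/s.
terminal = InformationTerminal("vehicle_A", send_fn=print)
terminal.on_frame([(10.0, 2.0), (12.0, -1.0)], [(9.5, 2.0), (11.5, -1.0)], dt=0.1)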

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The present technology relates to an information processing device, an information terminal, and an information processing method that make it possible to reliably prevent accidents between moving bodies. The information processing device is provided with: a receiving unit that receives, from information terminals mounted on each moving body, position information indicating the position and speed of the moving bodies estimated based on images captured from those moving bodies; and a risk prediction unit that predicts accidents between the moving bodies based on the movement of each moving body predicted from its estimated position and speed. The present invention can be applied, for example, to systems for predicting and preventing accidents between vehicles.
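
As a rough illustration of the danger prediction described in the abstract, the sketch below extrapolates each moving body's reported position under a constant-velocity assumption and flags pairs whose predicted separation falls below a safety margin within a short time horizon. It is an assumption-laden simplification, not the method claimed in the publication; the function name predict_collisions and all thresholds are hypothetical.

import math

def predict_collisions(bodies, horizon=5.0, step=0.1, min_gap=2.0):
    # bodies: dict mapping a body id to ((x, y), (vx, vy)) in a shared map frame.
    # Positions are extrapolated under a constant-velocity assumption; a pair is
    # reported if its predicted separation drops below min_gap metres within horizon.
    warnings = []
    ids = list(bodies)
    for i in range(len(ids)):
        for j in range(i + 1, len(ids)):
            (p1, v1), (p2, v2) = bodies[ids[i]], bodies[ids[j]]
            t = 0.0
            while t <= horizon:
                x1, y1 = p1[0] + v1[0] * t, p1[1] + v1[1] * t
                x2, y2 = p2[0] + v2[0] * t, p2[1] + v2[1] * t
                if math.hypot(x1 - x2, y1 - y2) < min_gap:
                    warnings.append((ids[i], ids[j], round(t, 1)))
                    break
                t += step
    return warnings

# Example: two vehicles approaching the same intersection at 10 m/s.
bodies = {
    "vehicle_A": ((0.0, -30.0), (0.0, 10.0)),    # heading north
    "vehicle_B": ((-30.0, 0.0), (10.0, 0.0)),    # heading east
}
print(predict_collisions(bodies))   # [('vehicle_A', 'vehicle_B', 2.9)]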
PCT/JP2016/077428 2015-09-30 2016-09-16 Dispositif de traitement d'informations, terminal d'informations et procédés de traitement d'informations WO2017057055A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-193360 2015-09-30
JP2015193360A JP2017068589A (ja) 2015-09-30 2015-09-30 情報処理装置、情報端末、及び、情報処理方法

Publications (1)

Publication Number Publication Date
WO2017057055A1 true WO2017057055A1 (fr) 2017-04-06

Family

ID=58423777

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/077428 WO2017057055A1 (fr) 2015-09-30 2016-09-16 Dispositif de traitement d'informations, terminal d'informations et procédés de traitement d'informations

Country Status (2)

Country Link
JP (1) JP2017068589A (fr)
WO (1) WO2017057055A1 (fr)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6979782B2 (ja) * 2017-04-17 2021-12-15 株式会社ゼンリン 3次元地図データおよび制御装置
WO2019008755A1 (fr) * 2017-07-07 2019-01-10 マクセル株式会社 Système de traitement d'informations, et infrastructure de système de traitement d'informations et procédé de traitement d'informations utilisé pour ce dernier
WO2019065546A1 (fr) * 2017-09-29 2019-04-04 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ Procédé de création de données tridimensionnelles, dispositif client et serveur
JP6676025B2 (ja) 2017-10-23 2020-04-08 本田技研工業株式会社 車両制御装置、車両制御方法、およびプログラム
US10706308B2 (en) * 2018-08-07 2020-07-07 Accenture Global Solutions Limited Image processing for automated object identification
WO2020095541A1 (fr) * 2018-11-06 2020-05-14 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et programme
JP7134252B2 (ja) * 2018-12-03 2022-09-09 株式会社Nttドコモ マップデータ生成装置
KR102651410B1 (ko) 2018-12-28 2024-03-28 현대자동차주식회사 자율 발렛 주차를 지원하는 시스템 및 방법, 그리고 이를 위한 인프라 및 차량
JP2022516849A (ja) * 2019-01-18 2022-03-03 ベステル エレクトロニク サナイー ベ ティカレト エー.エス. ヘッドアップディスプレイシステム、方法、データキャリア、処理システム及び車両
JP7243524B2 (ja) * 2019-08-23 2023-03-22 トヨタ自動車株式会社 自動運転システム
WO2021131797A1 (fr) * 2019-12-25 2021-07-01 ソニーグループ株式会社 Dispositif de traitement d'informations, système de traitement d'informations et procédé de traitement d'informations
JP7422177B2 (ja) 2022-03-31 2024-01-25 本田技研工業株式会社 交通安全支援システム

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000306194A (ja) * 1999-04-21 2000-11-02 Toshiba Corp 自動走行支援システム
JP2007248321A (ja) * 2006-03-17 2007-09-27 Sumitomo Electric Ind Ltd 車両走行位置推定システム及び車両走行位置推定方法
JP2012155654A (ja) * 2011-01-28 2012-08-16 Sony Corp 情報処理装置、報知方法及びプログラム
JP2013149191A (ja) * 2012-01-23 2013-08-01 Masahiro Watanabe 交通安全支援システム
JP2014035639A (ja) * 2012-08-08 2014-02-24 Toshiba Corp 交通事故発生予報装置、方法およびプログラム

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110936953A (zh) * 2018-09-21 2020-03-31 大众汽车有限公司 提供周围环境图像的方法和设备与具有这种设备的机动车
WO2020188895A1 (fr) * 2019-03-18 2020-09-24 日本電気株式会社 Serveur de bord informatique, procédé de commande et support lisible par ordinateur non transitoire
JPWO2020188895A1 (ja) * 2019-03-18 2021-12-23 日本電気株式会社 エッジコンピューティングサーバ、制御方法、及び制御プログラム
JP7173287B2 (ja) 2019-03-18 2022-11-16 日本電気株式会社 エッジコンピューティングサーバ、制御方法、及び制御プログラム
US20220355823A1 (en) * 2019-06-18 2022-11-10 Ihi Corporation Travel route generation device and control device
US11999384B2 (en) * 2019-06-18 2024-06-04 Ihi Corporation Travel route generation device and control device
US20220414983A1 (en) * 2019-11-19 2022-12-29 Sony Group Corporation Information processing device, information processing method, and program
CN114830205B (zh) * 2020-01-17 2023-12-08 日立安斯泰莫株式会社 电子控制装置及车辆控制系统
CN114830205A (zh) * 2020-01-17 2022-07-29 日立安斯泰莫株式会社 电子控制装置及车辆控制系统
CN113177428A (zh) * 2020-01-27 2021-07-27 通用汽车环球科技运作有限责任公司 用于物体跟踪的实时主动物体融合
CN113362646A (zh) * 2020-03-05 2021-09-07 本田技研工业株式会社 信息处理装置、车辆、计算机可读存储介质以及信息处理方法
CN113362646B (zh) * 2020-03-05 2023-04-07 本田技研工业株式会社 信息处理装置、车辆、计算机可读存储介质以及信息处理方法
US11622228B2 (en) 2020-03-05 2023-04-04 Honda Motor Co., Ltd. Information processing apparatus, vehicle, computer-readable storage medium, and information processing method
WO2023179988A1 (fr) * 2022-03-24 2023-09-28 Zf Friedrichshafen Ag Communication heuristique avec des usagers de la route
WO2023243558A1 (fr) * 2022-06-15 2023-12-21 ソニーグループ株式会社 Dispositif de traitement d'informations, programme, et système de traitement d'informations

Also Published As

Publication number Publication date
JP2017068589A (ja) 2017-04-06

Similar Documents

Publication Publication Date Title
WO2017057055A1 (fr) Dispositif de traitement d'informations, terminal d'informations et procédés de traitement d'informations
US10753757B2 (en) Information processing apparatus and information processing method
US10970877B2 (en) Image processing apparatus, image processing method, and program
US11450026B2 (en) Information processing apparatus, information processing method, and mobile object
US11363235B2 (en) Imaging apparatus, image processing apparatus, and image processing method
US11501461B2 (en) Controller, control method, and program
EP3835823B1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, programme informatique, système de traitement d'informations et dispositif de corps mobile
CN112119282A (zh) 信息处理装置、移动装置、方法和程序
US20210033712A1 (en) Calibration apparatus, calibration method, and program
JP2023126642A (ja) 情報処理装置、情報処理方法、及び、情報処理システム
JPWO2018180579A1 (ja) 撮像制御装置、および撮像制御装置の制御方法、並びに移動体
US20200191975A1 (en) Information processing apparatus, self-position estimation method, and program
CN112368598A (zh) 信息处理设备、信息处理方法、计算机程序和移动设备
US20230230368A1 (en) Information processing apparatus, information processing method, and program
US20200230820A1 (en) Information processing apparatus, self-localization method, program, and mobile body
US20220276655A1 (en) Information processing device, information processing method, and program
JP2022113054A (ja) 情報処理装置、情報処理方法、プログラムおよび移動装置
JP7363890B2 (ja) 情報処理装置、情報処理方法及びプログラム
US20220012552A1 (en) Information processing device and information processing method
US20210042886A1 (en) Image processing apparatus, image processing method, and program
WO2022059489A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, et programme
WO2020195969A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, et programme
WO2020116204A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, programme, dispositif de commande de corps mobile, et corps mobile
JP2024003806A (ja) 情報処理装置、情報処理方法、及びプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16851229

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16851229

Country of ref document: EP

Kind code of ref document: A1