WO2023026760A1 - Object detection device and object detection method - Google Patents


Info

Publication number
WO2023026760A1
Authority
WO
WIPO (PCT)
Prior art keywords
detection
vehicle
movement
detection device
sensor
Prior art date
Application number
PCT/JP2022/029149
Other languages
French (fr)
Japanese (ja)
Inventor
秀典 田中
真澄 福万
幹生 大林
拓也 中川
Original Assignee
DENSO Corporation
Toyota Motor Corporation
Priority date
Filing date
Publication date
Application filed by DENSO Corporation and Toyota Motor Corporation
Publication of WO2023026760A1 publication Critical patent/WO2023026760A1/en

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 15/00 - Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S 15/02 - Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems, using reflection of acoustic waves
    • G01S 15/50 - Systems of measurement based on relative movement of the target
    • G01S 15/52 - Discriminating between fixed and moving objects or between objects moving at different speeds
    • G01S 15/88 - Sonar systems specially adapted for specific applications
    • G01S 15/89 - Sonar systems specially adapted for specific applications, for mapping or imaging
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 - Traffic control systems for road vehicles
    • G08G 1/16 - Anti-collision systems

Definitions

  • The present disclosure relates to an object detection device and an object detection method.
  • Patent Document 1 proposes an in-vehicle object discrimination device that determines whether an object around the vehicle is a moving object or a stationary object. With this device, two ultrasonic sensors detect the same area at different times, and if the detection results are the same, the object is determined to be stationary. In this way, an obstacle map containing only stationary objects is created, and a parking assistance system is constructed.
  • An object of the present disclosure is to provide an object detection device and an object detection method that can identify in a short time whether an object existing around a vehicle is a moving object.
  • To this end, the object detection device detects an object existing around the vehicle, and includes: an object detection unit that detects an object within the detection area of a search wave sensor installed in the vehicle as a first object, based on the detection result of the search wave sensor; a movement detection unit that, taking an area including the detection area as a target area, detects the movement of a second object in the target area based on the detection result of a detection device that sequentially detects objects within the range of the target area as the second object; and an object identification unit that identifies the first object as a moving object when the movement of the second object is detected by the movement detection unit and the first object and the second object are the same.
  • Similarly, the object detection method detects an object existing around the vehicle, and includes: detecting an object within the detection area of a search wave sensor installed in the vehicle as a first object, based on the detection result of the search wave sensor; taking an area including the detection area as a target area, detecting the movement of a second object in the target area based on the detection result of a detection device that sequentially detects objects within the range of the target area as the second object; and identifying the first object as a moving object when movement of the second object is detected and the first object and the second object are the same.
  • According to these, since the detection result of the detection device is used to detect the movement of an object within the detection area of the search wave sensor, the detection of an object based on the detection result of the search wave sensor and the detection of the movement of an object based on the detection result of the detection device can be carried out in parallel. Therefore, since the standby state of the conventional technology does not occur, it is possible to quickly identify whether or not an object existing around the vehicle is a moving object.
  • FIG. 1 is a schematic configuration diagram of a parking assistance system including an object detection device according to an embodiment of the present disclosure.
  • FIG. 2 is a schematic plan view showing a schematic configuration of a vehicle equipped with the object detection device according to the embodiment of the present disclosure.
  • FIG. 3 is an explanatory diagram for explaining a state in which another vehicle passes the own vehicle when the own vehicle is parked.
  • FIG. 4 is a flowchart showing an example of object detection processing executed by the control device.
  • FIG. 5 is an explanatory diagram for explaining identification of a moving object by the object detection device.
  • FIG. 6 is an explanatory diagram for explaining identification of a stationary object by the object detection device.
  • FIG. 7 is an explanatory diagram for explaining an obstacle map.
  • An embodiment of the present disclosure will be described below with reference to FIGS. 1 to 7.
  • Here, a case where the object detection device 10 is incorporated in a parking assistance system 1 of a vehicle will be described as an example.
  • Although the object detection device 10 is incorporated into the parking assistance system 1 here, it may instead be incorporated into a system other than the parking assistance system 1.
  • The parking assistance system 1 includes a perimeter monitoring device 20, a control device 30, an HMIECU 40, a brake ECU 50, and a powertrain ECU 60.
  • The perimeter monitoring device 20 is directly connected to the control device 30 so that the monitoring result of the perimeter monitoring device 20 is input to the control device 30.
  • Communication between the control device 30 and the HMIECU 40, the brake ECU 50, and the powertrain ECU 60 is possible via an in-vehicle communication bus 70 such as an in-vehicle LAN (Local Area Network).
  • The control device 30 is connected to various sensors for vehicle control, such as a wheel speed sensor and a steering angle sensor.
  • A wheel speed sensor is provided for each of the four wheels and generates a detection signal corresponding to the rotation state of each wheel as a pulse output.
  • The steering angle sensor outputs a detection signal corresponding to the direction and amount of the steering operation.
  • The surroundings monitoring device 20 is an autonomous sensor that monitors the surrounding environment of its own vehicle (hereinafter referred to as the own vehicle Va).
  • The perimeter monitoring device 20 detects three-dimensional objects around the own vehicle Va, such as moving objects like other vehicles Vb and stationary objects like structures on the road, as obstacles. The vehicle is equipped with a search wave sensor 21, a surroundings monitoring camera 22, and the like as the perimeter monitoring device 20.
  • The search wave sensor 21 transmits search waves to a predetermined range around the own vehicle Va.
  • The search wave sensor 21 includes a plurality of ultrasonic sensors arranged at predetermined intervals along the traveling direction of the own vehicle Va on the side of the own vehicle Va.
  • Specifically, the search wave sensor 21 includes a left front side sensor SLf, a left rear side sensor SLb, a right front side sensor SRf, a right rear side sensor SRb, a millimeter wave radar (not shown), and a LiDAR (light detection and ranging, not shown).
  • The left front side sensor SLf, the left rear side sensor SLb, the right front side sensor SRf, and the right rear side sensor SRb are arranged on the side of the own vehicle Va along the traveling direction of the own vehicle Va.
  • That is, ultrasonic sensors are provided on the front left, rear left, front right, and rear right of the own vehicle Va.
  • In the drawings, the front-rear direction of the own vehicle Va is indicated by DR1, and the left-right direction of the own vehicle Va is indicated by DR2.
  • Each ultrasonic sensor generates ultrasonic waves, receives the reflected waves, detects the distance to an object existing in the pointing direction of the ultrasonic sensor based on the time from generation of the ultrasonic wave to reception of the reflected wave, and outputs the result as a detection signal.
  • Hereinafter, the distance detected by the ultrasonic sensor will be referred to as the detection distance.
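  • The detection distance follows directly from the echo time of flight. The following is an illustrative sketch, not from the patent; the speed-of-sound constant and function name are assumptions. The one-way distance is half of speed times round-trip time:

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at 20 degrees C (assumption)

def detection_distance(echo_time_s: float) -> float:
    """Return the one-way distance to the reflecting object.

    The ultrasonic wave travels to the object and back, so the
    detection distance is half of (speed of sound * round-trip time).
    """
    return SPEED_OF_SOUND_M_S * echo_time_s / 2.0
```

  • An echo received 10 ms after transmission thus corresponds to an object roughly 1.7 m away.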
  • The detection area of the left front side sensor SLf is Rfl, that of the left rear side sensor SLb is Rrl, that of the right front side sensor SRf is Rfr, and that of the right rear side sensor SRb is Rrr.
  • The distance between the left front side sensor SLf and the left rear side sensor SLb and the distance between the right front side sensor SRf and the right rear side sensor SRb are each set to a predetermined, substantially equal distance.
  • The left front side sensor SLf and the right front side sensor SRf are arranged in the tire housings on the front wheel side of the own vehicle Va.
  • The left rear side sensor SLb and the right rear side sensor SRb are arranged in the portion of the rear bumper located on the side of the own vehicle Va, in a rear fender portion, or the like.
  • The left front side sensor SLf, the left rear side sensor SLb, the right front side sensor SRf, and the right rear side sensor SRb face the side of the own vehicle Va and detect the distance to an object present on the side of the own vehicle Va.
  • The surroundings monitoring camera 22 images a predetermined range around the own vehicle Va as target areas RL and RR.
  • The surroundings monitoring camera 22 outputs imaging data obtained by imaging the surroundings of the own vehicle Va as an imaging result.
  • The surroundings monitoring camera 22 includes a front camera CF, a back camera CB, a left side camera CL, a right side camera CR, and the like.
  • The front camera CF is provided, for example, on the front bumper or front grille that constitutes the front mask of the vehicle.
  • The back camera CB is provided, for example, on the rear bumper or in the vicinity of the rear bumper.
  • The left side camera CL is provided, for example, on the left side mirror ML or in the vicinity of the left side mirror ML.
  • The left side camera CL uses an area including the detection areas Rfl and Rrl of the left front side sensor SLf and the left rear side sensor SLb as a target area RL.
  • The right side camera CR is provided, for example, on the right side mirror MR or in the vicinity of the right side mirror MR.
  • The right side camera CR uses an area including the detection areas Rfr and Rrr of the right front side sensor SRf and the right rear side sensor SRb as a target area RR.
  • In this way, the surroundings monitoring camera 22 regards the areas including the detection areas Rfl, Rrl, Rfr, and Rrr of the search wave sensor 21 as the target areas RL and RR, and constitutes a detection device that sequentially detects objects within the range of the target areas RL and RR as "second objects".
  • Of the side sensors SLf, SLb, SRf, and SRb, one of the ultrasonic sensors adjacent in the traveling direction of the own vehicle Va is referred to as the first sensor, and the other as the second sensor.
  • The surroundings monitoring camera 22 sequentially detects the "second object" by using the areas including the detection area of the first sensor and the detection area of the second sensor as the target areas RL and RR.
  • The control device 30 constitutes an ECU (that is, an electronic control unit) that performs various controls for realizing the parking assistance method in the parking assistance system 1, and is composed of a microcomputer equipped with a CPU, a storage unit 31, I/O, and the like.
  • The storage unit 31 includes ROM, RAM, EEPROM, and the like. That is, the storage unit 31 has a volatile memory such as RAM and a nonvolatile memory such as EEPROM.
  • The storage unit 31 is composed of a non-transitory tangible recording medium. For example, the storage unit 31 holds a predetermined amount of various information obtained from the search wave sensor 21 and the surroundings monitoring camera 22 in chronological order.
  • The control device 30 executes parking assistance control including object detection processing based on the monitoring results of the perimeter monitoring device 20 described above.
  • In this parking assistance control, obstacles such as "stationary objects" existing around the own vehicle Va are recognized, a moving route that avoids the obstacles is calculated when the vehicle is parked, and the movement of the vehicle along the moving route is assisted. If a recognized obstacle is a moving object, the "moving object" may no longer exist by the time the vehicle starts to park after recognizing it. For this reason, processing such as calculating the movement route with "moving objects" excluded from the obstacles is performed.
  • The assistance for moving the vehicle along the movement route is realized, for example, by presenting the movement route so that the driver can visually grasp it, or by controlling the braking/driving force to move the vehicle directly along the movement route. Therefore, in order to execute this control, control signals are transmitted from the control device 30 to the HMIECU 40, the brake ECU 50, and the powertrain ECU 60 through the in-vehicle communication bus 70.
  • The control device 30 includes an object detection unit 32, a movement detection unit 33, an object identification unit 34, a map generation unit 35, and a support control unit 36 as functional units.
  • The object detection unit 32 detects an object within the detection areas Rfl, Rrl, Rfr, and Rrr of the search wave sensor 21 as a "first object", based on the detection result of the search wave sensor 21 installed in the own vehicle Va.
  • Specifically, the object detection unit 32 calculates the distance between the vehicle and the object as the detection distance based on the sensor output of each of the side sensors SLf, SLb, SRf, and SRb. Then, the object detection unit 32 identifies where the object is by moving triangulation from the detection distances based on the sensor outputs of the side sensors SLf, SLb, SRf, and SRb.
  • For example, if the detection distance becomes shorter as the ultrasonic sensor moves forward, it is estimated that the object is positioned ahead of the ultrasonic sensor; if the detection distance does not change, it is estimated that the object is located to the side of the ultrasonic sensor.
  • Note that the detection of the "first object" by the object detection unit 32 may be realized by a method other than the moving triangulation method.
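  • Moving triangulation can be illustrated as intersecting two range circles measured at two points along the direction of travel. The following is a minimal sketch under assumed geometry (a point target and measurements in a common 2D frame; the function name and arguments are not from the patent):

```python
import math

def moving_triangulation(dx: float, r1: float, r2: float):
    """Locate a point target from two detection distances r1 and r2
    taken dx metres apart along the direction of travel.

    Returns (x, y): x is measured forward from the first measurement
    point, y is the lateral distance to the target.  Returns None when
    the two readings are inconsistent with a single point target.
    """
    # Intersection of the circles |p| = r1 and |p - (dx, 0)| = r2.
    x = (dx * dx + r1 * r1 - r2 * r2) / (2.0 * dx)
    y_sq = r1 * r1 - x * x
    if y_sq < 0.0:
        return None
    return x, math.sqrt(y_sq)
```

  • For a target 1.2 m to the side and 0.5 m ahead of the first measuring point, readings of r1 = r2 = 1.3 m taken 1 m apart recover (0.5, 1.2). A shrinking detection distance (r2 < r1) places the target further ahead, matching the qualitative rule described above.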
  • The movement detection unit 33 detects the movement of the "second object" in the detection areas Rfl, Rrl, Rfr, and Rrr of the side sensors SLf, SLb, SRf, and SRb based on the detection results of the surroundings monitoring camera 22 that constitutes the detection device.
  • For example, the movement detection unit 33 extracts a feature point corresponding to the "second object" in the captured image based on the imaging data around the own vehicle Va, and detects the movement of the "second object" based on the time change of the feature point and the movement state of the own vehicle Va. Extraction of feature points can be realized by using, for example, a Sobel filter using first-order differentiation or a Laplacian filter using second-order differentiation. Further, the movement state of the own vehicle Va can be acquired based on the sensor outputs of, for example, the wheel speed sensor and the steering angle sensor. Note that the method of detecting movement of the "second object" is not limited to the above.
  • The detection of the movement of the "second object" may also be realized using, for example, optical flow, which detects the movement of an object by vectorizing the movement of feature points between frames, or inter-frame differences, which detect the movement of an object from the difference between frames of the captured image.
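  • The inter-frame difference approach can be sketched in a few lines. This is a generic illustration, not the patent's implementation; it compares two grayscale frames given as nested lists and omits the compensation for the own vehicle's motion that a real system would need:

```python
def frame_difference_motion(prev_frame, curr_frame, threshold=10):
    """Count pixels whose intensity changed by more than `threshold`
    between two grayscale frames and report whether motion was seen."""
    changed = sum(
        1
        for prev_row, curr_row in zip(prev_frame, curr_frame)
        for a, b in zip(prev_row, curr_row)
        if abs(a - b) > threshold
    )
    return changed > 0
```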
  • The object identification unit 34 identifies whether the "first object" detected by the object detection unit 32 is a "moving object" or a "stationary object".
  • Specifically, the object identification unit 34 identifies the "first object" as a "moving object" when the movement of the "second object" is detected by the movement detection unit 33 and the "first object" and the "second object" are the same.
  • The object identification unit 34 grasps the position of the "first object" from the detection result of the object detection unit 32, and grasps the position of the "second object" from the monitoring result of the surroundings monitoring camera 22.
  • When the position of the "first object" and the position of the "second object" substantially match, the object identification unit 34 determines that the "second object" and the "first object" are the same. Consider, for example, the case where another vehicle Vb passes the own vehicle Va.
  • In this case, the object detection unit 32 detects the other vehicle Vb as the "first object", and the movement detection unit 33 detects the movement of the other vehicle Vb as the movement of the "second object". Then, since the position of the "first object" and the position of the "second object" substantially match, the object identification unit 34 identifies the other vehicle Vb, which is the "first object", as a "moving object".
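  • The "substantially match" test can be sketched as a simple distance threshold between the sonar-derived and camera-derived positions. The tolerance value and names here are illustrative assumptions, not taken from the patent:

```python
def is_same_object(pos_first, pos_second, tolerance_m=0.5):
    """Judge the sonar-detected 'first object' and the camera-detected
    'second object' to be the same when their estimated positions lie
    within tolerance_m metres of each other."""
    dx = pos_first[0] - pos_second[0]
    dy = pos_first[1] - pos_second[1]
    return dx * dx + dy * dy <= tolerance_m * tolerance_m
```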
  • The map generation unit 35 generates an obstacle map MP that defines the positional relationship between the own vehicle Va and the obstacles existing around the own vehicle Va.
  • The map generation unit 35 generates the obstacle map MP, for example, by SLAM technology, based on the detection result of the "first object" by the object detection unit 32 and the identification result of the "first object" by the object identification unit 34.
  • The map generation unit 35 generates, for example, a two-dimensional obstacle map MP or a three-dimensional obstacle map MP.
  • SLAM is an abbreviation for Simultaneous Localization and Mapping.
  • The map generation unit 35 excludes "moving objects" from the obstacle map MP. Specifically, the map generation unit 35 registers the "stationary objects" identified by the object identification unit 34 in the obstacle map MP, and does not register the "moving objects" identified by the object identification unit 34 in the obstacle map MP.
  • The support control unit 36 performs parking assistance control so that the vehicle can be parked while avoiding obstacles, based on the information about the obstacles in the obstacle map MP. For example, the support control unit 36 excludes moving objects from the obstacles, as objects that will have already moved away and will not exist at the time of parking, and calculates a movement route that avoids the stationary objects among the obstacles as obstacles to be avoided. Then, in order to assist the vehicle in moving along the movement route, it outputs appropriate control signals to the HMIECU 40, the brake ECU 50, and the powertrain ECU 60 through the in-vehicle communication bus 70.
  • The HMIECU 40 has a display unit 41 and an audio output unit 42, such as a speaker, which constitute an HMI (abbreviation for Human Machine Interface).
  • The display unit 41 is a device that provides the user with information in a visual manner.
  • The display unit 41 is configured by, for example, a touch panel display in which a display function and an operation function are integrated, or a head-up display that projects information onto a transparent glass element.
  • The HMIECU 40 receives image data of the surroundings of the own vehicle Va captured by the surroundings monitoring camera 22, and controls the display unit 41 to display the data together with a virtual vehicle image showing the own vehicle Va.
  • For example, the HMIECU 40 performs processing such as clearly indicating the planned parking position of the own vehicle Va in a bird's-eye view image showing the own vehicle Va, indicating the movement route with an arrow, and emphasizing obstacles.
  • The HMIECU 40 also controls various meter displays, such as the vehicle speed display and the engine speed display.
  • The display unit 41 displays information related to parking assistance.
  • The audio output unit 42 issues an alarm sound or information related to parking assistance.
  • The HMIECU 40 indicates a planned operation of the vehicle, such as "Move forward" or "Reverse", through the display unit 41 or the audio output unit 42, and issues instructions for preparations for automatic parking, such as "Set the shift position to 'D'".
  • The brake ECU 50 constitutes a braking control device that performs various brake controls; it automatically generates brake fluid pressure by driving an actuator for brake fluid pressure control, pressurizes the wheel cylinders, and generates braking force. When the brake ECU 50 receives a control signal from the support control unit 36, it controls the braking force of each wheel so as to move the vehicle along the movement route.
  • The powertrain ECU 60 constitutes a driving force control device that performs various driving force controls, and generates a desired driving force by controlling the engine or motor rotation speed and controlling the transmission.
  • When the powertrain ECU 60 receives a control signal from the support control unit 36, it controls the driving force of the drive wheels so as to move the vehicle along the movement route.
  • The brake ECU 50 and the powertrain ECU 60 are included here so that the system is capable of performing automatic parking.
  • The HMIECU 40 is included in order to display the bird's-eye view image and displays relating to parking assistance. However, these are not essential, and are used selectively as needed.
  • The parking assistance system 1 is configured as described above.
  • The parking assistance system 1 includes the object detection device 10.
  • The parking assistance system 1 identifies whether an object existing around the own vehicle Va is a "moving object" or a "stationary object" by the object detection processing executed by the object detection device 10, and thereby acquires information about obstacles around the vehicle. In the parking assistance system 1, the support control unit 36 performs parking assistance control based on the information about the obstacles acquired by the object detection device 10 so that the vehicle can be parked while avoiding the obstacles.
  • The object detection device 10 of the present embodiment includes the search wave sensor 21, the surroundings monitoring camera 22, the storage unit 31, the object detection unit 32, the movement detection unit 33, the object identification unit 34, and the map generation unit 35. An overview of the object detection processing executed by the object detection device 10 will be described below with reference to the flowchart of FIG. 4.
  • The object detection device 10 performs the object detection processing periodically or irregularly, for example, when a start switch such as the ignition switch of the own vehicle Va is turned on.
  • Each process shown in this flowchart is implemented by the corresponding functional unit of the object detection device 10. Further, each step for realizing this processing can also be understood as a step for realizing the object detection method.
  • First, the object detection device 10 reads various information in step S100.
  • That is, the object detection device 10 sequentially reads the sensor outputs from, for example, the search wave sensor 21, the surroundings monitoring camera 22, the wheel speed sensor, and the steering angle sensor.
  • In step S110, the object detection device 10 performs detection processing for the "first object". Specifically, the object detection device 10 detects an object within the detection areas Rfl, Rrl, Rfr, and Rrr of the search wave sensor 21 as the "first object", based on the detection results of the search wave sensor 21 installed in the own vehicle Va. This processing is performed by the object detection unit 32 of the object detection device 10.
  • In step S120, the object detection device 10 performs movement detection processing for the "second object". Specifically, the object detection device 10 detects the movement of the "second object" in the detection areas Rfl, Rrl, Rfr, and Rrr of the search wave sensor 21 based on the detection result of the surroundings monitoring camera 22. This processing is performed by the movement detection unit 33 of the object detection device 10.
  • In step S130, the object detection device 10 determines whether or not the object detection unit 32 has detected the "first object". If the "first object" has not been detected, the object detection device 10 skips the subsequent processes and exits this processing. On the other hand, if the "first object" has been detected, the object detection device 10 proceeds to step S140.
  • In step S140, the object detection device 10 determines whether or not the movement detection unit 33 has detected movement of the "second object". Specifically, the object detection device 10 determines whether movement of an object has been detected in the target areas RL and RR including the detection areas Rfl, Rrl, Rfr, and Rrr of the search wave sensor 21.
  • If movement of the "second object" has been detected, the object detection device 10 determines in step S150 whether or not the "first object" detected by the object detection unit 32 and the "second object" detected by the movement detection unit 33 are the same object. For example, if the position of the "first object" ascertained from the detection result of the object detection unit 32 and the position of the "second object" ascertained from the monitoring result of the surroundings monitoring camera 22 substantially match, the object detection device 10 determines that the "second object" and the "first object" are the same object. Note that this determination process is not limited to the above; it may also be realized by grasping and comparing the outer shapes of the "first object" and the "second object".
  • If, in step S150, the "second object" and the "first object" are determined to be the same object, the object detection device 10 proceeds to step S160 and identifies the "first object" as a "moving object".
  • Otherwise, the object detection device 10 proceeds to step S170 and identifies the "first object" as a "stationary object". Note that each process from steps S130 to S170 is performed by the object identification unit 34.
  • In step S180, the object detection device 10 generates the obstacle map MP and exits the object detection processing. Specifically, the object detection device 10 registers obstacles identified as "stationary objects" by the object identification unit 34 in the obstacle map MP, and does not register obstacles identified as "moving objects" by the object identification unit 34 in the obstacle map MP. The object detection device 10 stores the obstacle map MP in the storage unit 31. Note that the process of step S180 is performed by the map generation unit 35.
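  • Steps S130 to S170 amount to a small decision procedure. The following hedged sketch illustrates it; the position representation, the moving flag, and the matching tolerance are illustrative assumptions, since the patent leaves the exact matching criterion open:

```python
def identify_first_object(first_pos, second_detections, tol_m=0.5):
    """Identify a detected 'first object' at first_pos as 'moving'
    when a moving 'second object' exists at substantially the same
    position; otherwise identify it as 'stationary'.

    second_detections: iterable of ((x, y), is_moving) tuples from
    the camera-based movement detection.
    """
    for (x, y), is_moving in second_detections:
        dx, dy = first_pos[0] - x, first_pos[1] - y
        if is_moving and dx * dx + dy * dy <= tol_m * tol_m:
            return "moving"
    return "stationary"
```

  • Both "no movement detected" (S140: NO) and "movement detected but not the same object" (S150: NO) fall through to the "stationary" branch, mirroring step S170.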
  • FIG. 5 is an example showing, in chronological order, the positional relationship between the own vehicle Va and a mobile object MO when the own vehicle Va and the mobile object MO, such as another vehicle Vb, pass each other on the road.
  • FIG. 6 is an example showing, in chronological order, the positional relationship between the own vehicle Va and an installation object OB when the own vehicle Va travels on the road past the installation object OB. A specific example of the object detection processing will be described below with reference to FIGS. 5 and 6.
  • When the own vehicle Va is in the position shown in the upper part of FIG. 5, the side wall SW provided along the road is detected by the left front side sensor SLf and the left rear side sensor SLb as the "first object". On the other hand, since the mobile object MO is located outside the range of the target area RL of the surroundings monitoring camera 22, the object detection device 10 does not detect the movement of the mobile object MO. Therefore, the object detection device 10 identifies the side wall SW, which is the "first object", as a "stationary object".
  • As the vehicles approach each other, the left front side sensor SLf detects the mobile object MO as the "first object".
  • At this time, the object detection device 10 detects the movement of the mobile object MO as the movement of the "second object".
  • Here, the "first object" and the "second object" are the same mobile object MO and have the same position, outer shape, and so on. Therefore, the object detection device 10 identifies the mobile object MO, which is the "first object", as a "moving object".
  • As the vehicles continue to pass each other, the left rear side sensor SLb detects the mobile object MO as the "first object".
  • Again, the object detection device 10 detects the movement of the mobile object MO as the movement of the "second object".
  • Since the "first object" and the "second object" are the same mobile object MO, with the same position, outer shape, and so on, the object detection device 10 identifies the mobile object MO, which is the "first object", as a "moving object".
  • In the case of FIG. 6, the side wall SW provided along the road is likewise detected by the left front side sensor SLf and the left rear side sensor SLb as the "first object".
  • At this time, the installation object OB is located outside the range of the target area RL of the surroundings monitoring camera 22, and the object detection device 10 does not detect the installation object OB. Therefore, the object detection device 10 identifies the side wall SW, which is the "first object", as a "stationary object".
  • As the own vehicle Va advances, the left front side sensor SLf detects the installation object OB as the "first object".
  • Although the object detection device 10 detects the installation object OB as the "second object", it does not detect any movement of the "second object". Therefore, the object detection device 10 identifies the installation object OB, which is the "first object", as a "stationary object".
  • Subsequently, the left rear side sensor SLb detects the installation object OB as the "first object".
  • Again, although the object detection device 10 detects the installation object OB as the "second object", it does not detect any movement of the "second object". Therefore, the object detection device 10 identifies the installation object OB, which is the "first object", as a "stationary object".
  • after that, the object detection device 10 generates an obstacle map MP.
  • the obstacle map MP is constructed by adding information corresponding to each "stationary object" to a grid map obtained by dividing the area around the own vehicle Va into small cells (for example, a mesh).
  • the object detection device 10 described above detects, as the "first object", an object within the detection areas Rfl, Rrl, Rfr, and Rrr of the probe wave sensor 21 based on the detection results of the probe wave sensor 21 installed in the own vehicle Va.
  • the object detection device 10 detects, as the "second object", an object in the target areas RL and RR, which include the detection areas Rfl, Rrl, Rfr, and Rrr of the probe wave sensor 21, based on the detection results of the perimeter monitoring camera 22, and detects the movement of the "second object" in the target areas RL and RR. Then, when the movement of the "second object" is detected and the "first object" and the "second object" are the same, the object detection device 10 identifies the "first object" as a "moving object".
  • the detection of an object based on the detection results of the probe wave sensor 21 and the detection of the movement of the object based on the detection results of the perimeter monitoring camera 22 can be performed in parallel.
  • since the standby state of the conventional technology does not occur, it is possible to identify in a short time whether or not an object existing around the own vehicle Va is a "moving object".
  • when the movement detection unit 33 does not detect movement of the "second object", the object identification unit 34 identifies the "first object" as a "stationary object". According to this, it is possible to identify in a short time whether an object existing around the own vehicle Va is a "stationary object".
  • when the movement detection unit 33 detects movement of the "second object" but the "first object" and the "second object" are different, the object identification unit 34 identifies the "first object" as a "stationary object". This also makes it possible to identify in a short time whether or not an object existing around the own vehicle Va is a "stationary object".
  • the object detection device 10 includes a map generator 35 that generates an obstacle map MP that defines the positional relationship between the vehicle and obstacles existing around the own vehicle Va.
  • the map generation unit 35 registers the “stationary object” identified by the object identification unit 34 in the obstacle map MP, and does not register the “moving object” identified by the object identification unit 34 in the obstacle map MP. According to this, a "stationary object” among the objects existing around the host vehicle Va can be quickly reflected in the obstacle map MP as an obstacle.
  • the probe wave sensor 21 includes a plurality of ultrasonic sensors arranged at predetermined intervals along the traveling direction of the vehicle Va on the sides of the vehicle Va.
  • of the plurality of ultrasonic sensors, one of a pair adjacent to each other in the traveling direction of the vehicle Va is taken as the first sensor, and the other as the second sensor.
  • the surroundings monitoring camera 22 sequentially detects the second object by using the areas including the detection area of the first sensor and the detection area of the second sensor as target areas RL and RR.
  • since the means for detecting the "first object" is redundant, it is possible to appropriately specify whether the "first object" is a "stationary object" or a "moving object". For example, even if another vehicle Vb that is temporarily stopped while the own vehicle Va and the other vehicle Vb pass each other is identified as a "stationary object" based on the detection result of the first sensor, it can be changed to a "moving object" based on the detection result of the second sensor.
  • the target areas RL and RR for the perimeter monitoring camera 22 to detect the "second object" include the detection area of the first sensor and the detection area of the second sensor. According to this, the device configuration of the object detection device 10 can be simplified as compared with the case where a camera is provided for each of the detection area of the first sensor and the detection area of the second sensor.
  • the detection device is configured by the perimeter monitoring camera 22, but is not limited to this; the detection device may be configured by any device other than the perimeter monitoring camera 22 as long as it can detect the movement of an object.
  • the object detection device 10 preferably identifies the "first object" as a "stationary object" when the movement detection unit 33 does not detect movement of the "second object", but is not limited to this. For example, when the movement detection unit 33 does not detect movement of the "second object", the object detection device 10 may identify each of the "first object" and the "second object" as a "stationary object".
  • the object detection device 10 preferably identifies the "first object" as a "stationary object" when the movement detection unit 33 detects movement of the "second object" and the "first object" and the "second object" are different, but is not limited to this. For example, in that case, the object detection device 10 may identify the "first object" as a "stationary object" while identifying the "second object" as a "moving object".
  • the object detection device 10 creates the obstacle map MP after identifying the "first object" as either a "stationary object" or a "moving object", but is not limited to this. For example, instead of creating the obstacle map MP, the object detection device 10 may notify the user of the detection of a "moving object" through at least one of the display unit 41 and the audio output unit 42.
  • two ultrasonic sensors are installed on each of the left and right sides of the vehicle Va.
  • One ultrasonic sensor may be installed on each side.
  • the probe wave sensor 21 may be composed of, for example, one ultrasonic sensor.
  • the object detection device 10 may detect the "first object" based on the output of a millimeter wave radar or LiDAR instead of the ultrasonic sensor.
  • the target areas RL and RR of the perimeter monitoring camera 22 are desirably set so as to include the respective detection areas of the ultrasonic sensors adjacent in the traveling direction of the host vehicle Va, but are not limited to this.
  • the target areas RL and RR of the surroundings monitoring camera 22 may include the detection area of one of the adjacent ultrasonic sensors and not include the detection area of the other sensor.
  • the object detection device 10 may also identify whether an object around the own vehicle Va is a "moving object" or a "stationary object" when the own vehicle Va moves backward (so-called reversing).
  • the object detection device 10 of the present disclosure is incorporated in the parking assistance system 1
  • the object detection device 10 is not limited to this, and may be incorporated in another system, such as a collision prevention system. That is, the object detection device 10 can be applied to applications other than those described above.
  • the vehicle's external environment information is obtained from a sensor
  • the controller and techniques of the present disclosure may be implemented on a dedicated computer provided by configuring a processor and memory programmed to perform one or more functions embodied by computer programs.
  • the controller and techniques of the present disclosure may be implemented in a dedicated computer provided by configuring the processor with one or more dedicated hardware logic circuits.
  • the control unit and method of the present disclosure may be implemented on one or more dedicated computers configured by a combination of a processor and memory programmed to perform one or more functions and a processor configured by one or more dedicated hardware logic circuits.
  • the computer program may also be stored as computer-executable instructions on a computer-readable non-transitional tangible recording medium.
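As a non-authoritative illustration, the identification rule recapped in the points above can be sketched in a few lines. The function name and the string labels are assumptions introduced here, not part of the disclosure.

```python
# Hypothetical sketch of the described identification rule: a "first object"
# from the probe wave sensor is classified as a moving object only when the
# camera-based detection device observes movement of a "second object" that
# is the same object. All names here are illustrative.

def classify_first_object(movement_detected: bool, same_object: bool) -> str:
    """Classify a detected first object as 'moving object' or 'stationary object'."""
    if movement_detected and same_object:
        return "moving object"
    # No movement detected, or the moving second object is a different object:
    # treat the first object as stationary (the text also discusses variants
    # that additionally mark the differing second object as moving).
    return "stationary object"
```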

Abstract

An object detection device (10) comprises an object detection unit (32) that detects, as a first object, an object within the range of a detection area (Rfl, Rrl, Rfr, Rrr) of a probe wave sensor (21) installed in a vehicle on the basis of a detection result from the probe wave sensor. The object detection device comprises a movement detection unit (33) that detects the movement of a second object in an area of interest (RL, RR) on the basis of a detection result from a detection device (22) that successively detects, as the second object, an object within the range of the area of interest, the area of interest being set to an area that includes the detection area. The object detection device comprises an object identification unit (34) that identifies the first object as a moving object if the movement of the second object is detected by the movement detection unit and the first and second objects are the same.

Description

Object detection device and object detection method

Cross-reference to related applications
 This application is based on Japanese Patent Application No. 2021-139030 filed on August 27, 2021, the contents of which are incorporated herein by reference.
 The present disclosure relates to an object detection device and an object detection method.
 Conventionally, Patent Document 1 proposes an in-vehicle object discrimination device that determines whether an object around the vehicle is a moving object or a stationary object. In this device, two ultrasonic sensors detect the same area at different times; if the detection results are the same, the object is determined to be stationary, and if the detection results differ, the object is determined to be moving. In this way, an obstacle map containing only stationary objects is created and used to construct a parking assistance system.
JP 2013-20458 A
 However, in the device of Patent Document 1, at the stage where only one of the two ultrasonic sensors has detected an object, it cannot be determined whether the object is a moving object or a stationary object, and the device remains in a standby state until the other sensor detects the object. This is undesirable because it delays the reflection of objects existing around the vehicle on the obstacle map.
 An object of the present disclosure is to provide an object detection device and an object detection method capable of identifying in a short time whether an object existing around a vehicle is a moving object.
According to one aspect of the present disclosure, an object detection device detects an object existing around a vehicle, and includes:
an object detection unit that detects, as a first object, an object within the detection area of a probe wave sensor installed in the vehicle, based on a detection result of the probe wave sensor;
a movement detection unit that detects movement of a second object in a target area based on a detection result of a detection device that sequentially detects, as the second object, an object within the target area, the target area being set to an area including the detection area; and
an object identification unit that identifies the first object as a moving object when the movement detection unit detects movement of the second object and the first object and the second object are the same.
According to another aspect of the present disclosure, an object detection method detects an object existing around a vehicle, and includes:
detecting, as a first object, an object within the detection area of a probe wave sensor installed in the vehicle, based on a detection result of the probe wave sensor;
detecting movement of a second object in a target area based on a detection result of a detection device that sequentially detects, as the second object, an object within the target area, the target area being set to an area including the detection area; and
identifying the first object as a moving object when movement of the second object is detected and the first object and the second object are the same.
 In this way, if the detection result of the detection device is used to detect movement of an object within the detection area of the probe wave sensor, the detection of the object based on the probe wave sensor and the detection of the object's movement based on the detection device can be performed in parallel. Therefore, the standby state of the conventional technology does not occur, and whether an object existing around the vehicle is a moving object can be identified in a short time.
 The parenthesized reference numerals attached to the components indicate an example of the correspondence between those components and the specific components described in the embodiment below.
FIG. 1 is a schematic configuration diagram of a parking assistance system including an object detection device according to an embodiment of the present disclosure.
FIG. 2 is a schematic plan view showing the configuration of a vehicle equipped with the object detection device according to the embodiment of the present disclosure.
FIG. 3 is an explanatory diagram illustrating a state in which another vehicle passes the own vehicle when the own vehicle is being parked.
FIG. 4 is a flowchart showing an example of object detection processing executed by the control device.
FIG. 5 is an explanatory diagram illustrating identification of a moving object by the object detection device.
FIG. 6 is an explanatory diagram illustrating identification of a stationary object by the object detection device.
FIG. 7 is an explanatory diagram illustrating an obstacle map.
 An embodiment of the present disclosure will be described with reference to FIGS. 1 to 7. In this embodiment, a case where the object detection device 10 is incorporated in a parking assistance system 1 of a vehicle is described as an example. Although the object detection device 10 is incorporated in the parking assistance system 1 here, it may be incorporated in a system other than the parking assistance system 1.
 As shown in FIG. 1, the parking assistance system 1 includes a perimeter monitoring device 20, a control device 30, an HMI ECU 40, a brake ECU 50, and a powertrain ECU 60. The perimeter monitoring device 20 is directly connected to the control device 30 so that its monitoring results are input to the control device 30. The control device 30 can communicate with the HMI ECU 40, the brake ECU 50, and the powertrain ECU 60 via an in-vehicle communication bus 70 such as an in-vehicle LAN (Local Area Network).
 Although not shown, various sensors for vehicle control, such as wheel speed sensors and a steering angle sensor, are connected to the control device 30. A wheel speed sensor is provided for each of the four wheels and generates a pulse detection signal corresponding to the rotation state of the corresponding wheel. The steering angle sensor outputs a detection signal corresponding to the steering direction and the amount of steering operation.
 The perimeter monitoring device 20 is an autonomous sensor that monitors the environment around the own vehicle (hereinafter, own vehicle Va). For example, the perimeter monitoring device 20 detects, as detection targets, obstacles composed of three-dimensional objects around the own vehicle Va, such as moving objects like other vehicles Vb and stationary objects like structures on the road. The vehicle is equipped with a probe wave sensor 21, a perimeter monitoring camera 22, and the like as the perimeter monitoring device 20.
 The probe wave sensor 21 transmits probe waves to a predetermined range around the own vehicle Va. The probe wave sensor 21 includes a plurality of ultrasonic sensors arranged at predetermined intervals along the traveling direction of the own vehicle Va on its sides. Specifically, the probe wave sensor 21 includes a left front side sensor SLf, a left rear side sensor SLb, a right front side sensor SRf, a right rear side sensor SRb, a millimeter wave radar (not shown), and a LiDAR (Light Detection and Ranging, not shown).
 As shown in FIG. 2, the left front side sensor SLf, the left rear side sensor SLb, the right front side sensor SRf, and the right rear side sensor SRb are arranged on the sides of the own vehicle Va along its traveling direction. In this embodiment, ultrasonic sensors are provided at the front left, rear left, front right, and rear right of the own vehicle Va. In FIG. 2, the front-rear direction of the own vehicle Va is indicated by DR1, and the left-right direction by DR2.
 Each ultrasonic sensor generates an ultrasonic wave, receives its reflected wave, and detects the distance to an object existing in the sensor's pointing direction based on the time from transmission to reception of the reflected wave, outputting the result as a detection signal. Hereinafter, the distance detected by an ultrasonic sensor is referred to as the detection distance. As shown in FIG. 2, the detection area of the left front side sensor SLf is denoted Rfl, that of the left rear side sensor SLb is Rrl, that of the right front side sensor SRf is Rfr, and that of the right rear side sensor SRb is Rrr.
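As an illustrative aside, the time-of-flight relation underlying the detection distance can be sketched as follows. The speed-of-sound constant and the names are assumptions for the sketch, not values from the disclosure.

```python
# Time-of-flight distance sketch for an ultrasonic sensor: the detection
# distance is half the echo round-trip time multiplied by the speed of sound
# (about 343 m/s in air at 20 degrees Celsius; an assumed constant).

SPEED_OF_SOUND_M_S = 343.0

def detection_distance_m(round_trip_time_s: float) -> float:
    """Distance to the reflecting object from the echo round-trip time."""
    return SPEED_OF_SOUND_M_S * round_trip_time_s / 2.0
```

For example, an echo returning after 10 ms corresponds to an object roughly 1.7 m away.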
 The distance between the left front side sensor SLf and the left rear side sensor SLb and the distance between the right front side sensor SRf and the right rear side sensor SRb are set to a predetermined interval and are substantially equal. For example, the left front side sensor SLf and the right front side sensor SRf are arranged in the tire housings on the front wheel side of the own vehicle Va. The left rear side sensor SLb and the right rear side sensor SRb are arranged in the portions of the rear bumper located on the sides of the own vehicle Va, in the rear fender portions, or the like. These side sensors face the sides of the own vehicle Va and detect the distance to objects existing beside the own vehicle Va.
 The perimeter monitoring camera 22 images predetermined ranges around the own vehicle Va as target areas RL and RR, and outputs the captured image data as the imaging result. The perimeter monitoring camera 22 includes a front camera CF, a back camera CB, a left side camera CL, a right side camera CR, and the like.
 The front camera CF is provided, for example, on the front bumper or the front grille that constitutes the front mask of the vehicle. The back camera CB is provided, for example, on or near the rear bumper.
 The left side camera CL is provided, for example, on or near the left side mirror ML. The target area RL of the left side camera CL includes the detection areas Rfl and Rrl of the left front side sensor SLf and the left rear side sensor SLb.
 The right side camera CR is provided, for example, on or near the right side mirror MR. The target area RR of the right side camera CR includes the detection areas Rfr and Rrr of the right front side sensor SRf and the right rear side sensor SRb.
 In this embodiment, the perimeter monitoring camera 22 constitutes a detection device that sequentially detects, as a "second object", an object within the target areas RL and RR, which are set to include the detection areas of the probe wave sensor 21. Of the side sensors SLf, SLb, SRf, and SRb, one of a pair of ultrasonic sensors adjacent in the traveling direction of the own vehicle Va is taken as the first sensor and the other as the second sensor. The perimeter monitoring camera 22 then sequentially detects the "second object" using areas that include both the detection area of the first sensor and that of the second sensor as the target areas RL and RR.
 The control device 30 constitutes an ECU (electronic control unit) that performs various controls for realizing the parking assistance method in the parking assistance system 1, and is composed of a microcomputer including a CPU, a storage unit 31, I/O, and the like.
 The storage unit 31 includes ROM, RAM, EEPROM, and the like. That is, the storage unit 31 has a volatile memory such as RAM and a nonvolatile memory such as EEPROM, and is composed of a non-transitory tangible recording medium. For example, the storage unit 31 holds a predetermined amount of various information obtained from the probe wave sensor 21 and the perimeter monitoring camera 22 in chronological order.
 The control device 30 executes parking assistance control, including object detection processing, based on the monitoring results of the perimeter monitoring device 20. In this parking assistance control, obstacles such as "stationary objects" existing around the own vehicle Va are recognized, a movement path for parking the vehicle while avoiding the obstacles is calculated, and assistance is provided to move the vehicle along that path. If a recognized obstacle is a moving object, the "moving object" may no longer be present by the time the vehicle starts to park. For this reason, processing such as calculating the movement path with "moving objects" excluded from the obstacles is performed. When assisting the movement of the vehicle along the path, for example, the path is presented visually to the driver, or braking/driving force control is performed to move the vehicle directly along the path. To execute this control, control signals are transmitted from the control device 30 through the in-vehicle communication bus 70 to the HMI ECU 40, the brake ECU 50, and the powertrain ECU 60.
 Specifically, in addition to the storage unit 31, the control device 30 includes an object detection unit 32, a movement detection unit 33, an object identification unit 34, a map generation unit 35, and an assistance control unit 36 as functional units.
 The object detection unit 32 detects, as a "first object", an object within the detection areas Rfl, Rrl, Rfr, and Rrr of the probe wave sensor 21 based on the detection results of the probe wave sensor 21 installed in the own vehicle Va.
 The object detection unit 32, for example, calculates the distance between the vehicle and an object as the detection distance based on the sensor output of each of the side sensors SLf, SLb, SRf, and SRb. The object detection unit 32 then identifies where the object is by moving triangulation from the detection distances based on these sensor outputs.
 In moving triangulation, for example, if the detection distance decreases as the ultrasonic sensor moves forward, the object is determined to be in front of the sensor; if the detection distance does not change, the object is determined to be beside the sensor. The detection of the "first object" by the object detection unit 32 may be realized by a method other than moving triangulation.
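A minimal sketch of a moving-triangulation step of the kind described above, assuming two range readings of the same object taken from two sensor positions along the travel axis; the function name and coordinate conventions are illustrative assumptions, not the actual implementation.

```python
import math

# Hypothetical moving-triangulation sketch: the vehicle moves along the x axis,
# so one sensor measures the range to the same object from positions x0 and x1.
# The object lies on the intersection of the two range circles; solving for the
# intersection gives its (x, y) position, with y the lateral offset.

def triangulate(x0: float, r0: float, x1: float, r1: float):
    """Return (x, y) of the object from two range readings, or None if unsolvable."""
    d = x1 - x0
    if d == 0.0:
        return None  # no baseline: the two readings were taken at the same spot
    # Intersect the circles (x - x0)^2 + y^2 = r0^2 and (x - x1)^2 + y^2 = r1^2.
    x = (r0 ** 2 - r1 ** 2 + x1 ** 2 - x0 ** 2) / (2.0 * d)
    y_sq = r0 ** 2 - (x - x0) ** 2
    if y_sq < 0.0:
        return None  # inconsistent readings: circles do not intersect
    return (x, math.sqrt(y_sq))
```

For an object at (2, 1), ranges measured from x = 0 and x = 1 recover that position.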
 The movement detection unit 33 detects movement of the "second object" in the detection areas Rfl, Rrl, Rfr, and Rrr of the side sensors SLf, SLb, SRf, and SRb based on the detection results of the perimeter monitoring camera 22 constituting the detection device.
 The movement detection unit 33, for example, extracts feature points corresponding to the "second object" in the captured image based on the image data of the surroundings of the own vehicle Va, and detects movement of the "second object" based on the temporal change of the feature points and the movement state of the own vehicle Va. Feature point extraction can be realized, for example, with a Sobel filter using first-order differentiation or a Laplacian filter using second-order differentiation. The movement state of the own vehicle Va can be acquired, for example, from the outputs of the wheel speed sensors and the steering angle sensor. The method of detecting movement of the "second object" is not limited to the above; it may be realized, for example, by optical flow, which detects movement by vectorizing the motion of feature points between frames, or by inter-frame differencing, which detects movement from the difference between frames of the captured images.
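A minimal sketch of the inter-frame-difference alternative mentioned above, assuming grayscale frames given as nested lists and illustrative threshold values; it flags movement when enough pixels change between consecutive frames.

```python
# Inter-frame difference sketch (one of the alternatives named in the text):
# movement is flagged when the fraction of pixels whose grayscale value changes
# by more than pixel_threshold exceeds ratio_threshold. Both thresholds are
# illustrative assumptions; a real system would also compensate for ego-motion.

def movement_detected(prev_frame, curr_frame,
                      pixel_threshold=30, ratio_threshold=0.05) -> bool:
    """Compare two frames given as 2-D lists of 0-255 grayscale values."""
    changed = total = 0
    for row_prev, row_curr in zip(prev_frame, curr_frame):
        for a, b in zip(row_prev, row_curr):
            total += 1
            if abs(a - b) > pixel_threshold:
                changed += 1
    return total > 0 and changed / total >= ratio_threshold
```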
 The object identification unit 34 identifies whether the "first object" detected by the object detection unit 32 is a "moving object" or a "stationary object".
 When the movement detection unit 33 detects movement of the "second object" and the "second object" and the "first object" are the same, the object identification unit 34 identifies the "first object" as a moving object. The object identification unit 34, for example, grasps the position of the "first object" from the detection results of the object detection unit 32 and the position of the "second object" from the monitoring results of the perimeter monitoring camera 22. When the position of the "first object" and the position of the "second object" substantially match, the object identification unit 34 determines that the "second object" and the "first object" are the same. For example, as shown in FIG. 3, when the own vehicle Va passes another vehicle Vb, the object detection unit 32 detects the other vehicle Vb as the "first object", and the movement detection unit 33 detects the movement of the other vehicle Vb as movement of the "second object". Since the positions of the "first object" and the "second object" substantially match, the object identification unit 34 identifies the other vehicle Vb, detected as the "first object", as a "moving object".
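The same-object determination described above (positions substantially matching) could be sketched as a tolerance check in a common vehicle coordinate frame; the tolerance value and the names are assumptions introduced for illustration.

```python
# Sketch of the same-object test: the "first object" (from the probe wave
# sensor) and the "second object" (from the camera) are treated as the same
# when their estimated positions lie within a tolerance. 0.5 m is an assumed
# tolerance, not a value from the disclosure.

def is_same_object(first_pos, second_pos, tolerance_m=0.5) -> bool:
    """first_pos, second_pos: (x, y) positions in a common vehicle frame."""
    dx = first_pos[0] - second_pos[0]
    dy = first_pos[1] - second_pos[1]
    return (dx * dx + dy * dy) ** 0.5 <= tolerance_m
```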
 The map generation unit 35 generates an obstacle map MP that defines the positional relationship between the own vehicle Va and obstacles existing around it. The map generation unit 35 generates the obstacle map MP using, for example, SLAM technology, based on the detection result of the "first object" by the object detection unit 32 and the identification result of the "first object" by the object identification unit 34. The map generation unit 35 generates, for example, a two-dimensional or a three-dimensional obstacle map MP. Note that SLAM is an abbreviation for Simultaneous Localization and Mapping.
 Here, when the "first object" detected by the object detection unit 32 is a "moving object", a situation can arise in which, by the time the own vehicle Va is about to start parking after the "moving object" was detected, that "moving object" no longer exists around the own vehicle Va. For this reason, the map generation unit 35 excludes "moving objects" from the obstacle map MP. Specifically, the map generation unit 35 registers a "stationary object" identified by the object identification unit 34 in the obstacle map MP, but does not register a "moving object" identified by the object identification unit 34.
 The support control unit 36 performs parking support control based on the information about obstacles in the obstacle map MP so that the vehicle can park while avoiding them. For example, the support control unit 36 excludes moving objects from the obstacles, treating them as having already moved away by the time of parking, and calculates a movement route that avoids the stationary objects among the obstacles as the obstacles to be avoided. Then, to assist moving the vehicle along that route, it outputs appropriate control signals to the HMIECU 40, the brake ECU 50, and the powertrain ECU 60 through the on-vehicle communication bus 70.
 The HMIECU 40 has a display unit 41 and an audio output unit 42 such as a speaker, which constitute an HMI (Human Machine Interface). The display unit 41 is a device that presents information to the user visually. The display unit 41 is configured by, for example, a touch-panel display that integrates display and operation functions, or a head-up display that projects information onto a transparent glass element.
 The HMIECU 40, for example, receives image data of the surroundings of the own vehicle Va captured by the surroundings monitoring camera 22 and controls the display unit 41 to display it together with a virtual vehicle image representing the own vehicle Va. The HMIECU 40 performs processing such as indicating the planned parking position of the own vehicle Va in a bird's-eye view image showing the own vehicle Va, showing the movement route with arrows, and highlighting obstacles. The HMIECU 40 also performs control for various meter displays such as vehicle speed and engine speed. Upon receiving a control signal from the support control unit 36, the HMIECU 40 displays parking-support information on the display unit 41 and outputs warning sounds or parking-support announcements from the audio output unit 42. For example, through the display unit 41 or the audio output unit 42, the HMIECU 40 indicates planned vehicle operations such as "Moving forward" or "Backing up", and issues instructions in preparation for automatic parking such as "Please set the shift position to 'D'".
 The brake ECU 50 constitutes a braking control device that performs various brake controls; by driving an actuator for brake fluid pressure control, it automatically generates brake fluid pressure, pressurizes the wheel cylinders, and produces braking force. When the brake ECU 50 receives a control signal from the support control unit 36, it controls the braking force of each wheel so as to move the vehicle along the movement route.
 The powertrain ECU 60 constitutes a driving force control device that performs various driving force controls, generating a desired driving force by controlling the engine or motor speed and controlling the transmission. When the powertrain ECU 60 receives a control signal from the support control unit 36, it controls the driving force of the drive wheels so as to move the vehicle along the movement route.
 Note that the brake ECU 50 and the powertrain ECU 60 are included here to constitute a system capable of automatic parking, and the HMIECU 40 is included to display the bird's-eye view image and parking-support information. However, these components are not essential and are used selectively as needed.
 The parking support system 1 according to the present embodiment is configured as described above. The parking support system 1 includes the object detection device 10. Through the object detection processing executed by the object detection device 10, the parking support system 1 identifies whether an object existing around the own vehicle Va is a "moving object" or a "stationary object" and obtains information about obstacles around the own vehicle Va. The support control unit 36 of the parking support system 1 then performs parking support control based on the obstacle information acquired by the object detection device 10 so that the vehicle can park while avoiding the obstacles.
 Here, the object detection device 10 of the present embodiment includes the probe wave sensor 21, the surroundings monitoring camera 22, the storage unit 31, the object detection unit 32, the movement detection unit 33, the object identification unit 34, and the map generation unit 35. An overview of the object detection processing executed by the object detection device 10 is described below with reference to the flowchart of FIG. 4. The object detection device 10 executes the object detection processing periodically or irregularly, for example, once a start switch such as the ignition switch of the own vehicle Va is turned on. Each process shown in this flowchart is realized by a corresponding functional unit of the object detection device 10. Each step realizing this processing can also be understood as a step realizing the object detection method.
 As shown in FIG. 4, the object detection device 10 reads various information in step S100. The object detection device 10 sequentially reads sensor outputs from, for example, the probe wave sensor 21, the surroundings monitoring camera 22, a wheel speed sensor, and a steering angle sensor.
 Next, in step S110, the object detection device 10 performs detection processing for the "first object". Specifically, the object detection device 10 detects an object within the detection areas Rfl, Rrl, Rfr, and Rrr of the probe wave sensor 21 installed on the own vehicle Va as the "first object", based on the detection result of the probe wave sensor 21. This processing is performed by the object detection unit 32 of the object detection device 10.
 Next, in step S120, the object detection device 10 performs movement detection processing for the "second object". Specifically, the object detection device 10 detects movement of the "second object" in the detection areas Rfl, Rrl, Rfr, and Rrr of the probe wave sensor 21 based on the detection result of the "second object" by the surroundings monitoring camera 22. This processing is performed by the movement detection unit 33 of the object detection device 10.
 Next, in step S130, the object detection device 10 determines whether the object detection unit 32 has detected a "first object". If no "first object" has been detected, the object detection device 10 skips the subsequent steps and exits this processing. If a "first object" has been detected, the object detection device 10 proceeds to step S140.
 In step S140, the object detection device 10 determines whether the movement detection unit 33 has detected movement of a "second object". Specifically, the object detection device 10 determines whether movement of an object has been detected in the target areas RL and RR, which include the detection areas Rfl, Rrl, Rfr, and Rrr of the probe wave sensor 21.
 If movement of the "second object" has been detected, the object detection device 10 determines in step S150 whether the "first object" detected by the object detection unit 32 and the "second object" whose movement was detected by the movement detection unit 33 are the same object. For example, the object detection device 10 determines that the "second object" and the "first object" are the same object when the position of the "first object" obtained from the detection result of the object detection unit 32 and the position of the "second object" obtained from the monitoring result of the surroundings monitoring camera 22 substantially match. This determination processing is not limited to the above; it may instead be realized by, for example, determining the outer shape of the "first object" from the detection result of the object detection unit 32, determining the outer shape of the "second object" from the monitoring result of the surroundings monitoring camera 22, and comparing the two.
 If the determination in step S150 finds that the "second object" and the "first object" are the same object, the object detection device 10 proceeds to step S160 and identifies the "first object" as a "moving object".
 On the other hand, if no movement of the "second object" has been detected, or if the "second object" and the "first object" are not the same object, the object detection device 10 identifies the "first object" as a "stationary object" in step S170. Note that each process from steps S130 to S170 is performed by the object identification unit 34.
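Steps S130 to S170 amount to a small decision procedure. The function below is an illustrative sketch of that flow; the argument names and return values are assumptions, not the disclosed implementation.

```python
def classify_first_object(first_detected, second_moving, same_object):
    """Sketch of the classification flow of steps S130-S170 in Fig. 4."""
    if not first_detected:               # S130: no "first object" detected
        return None                      # skip the remaining steps
    if second_moving and same_object:    # S140 and S150 both satisfied
        return "moving"                  # S160: identify as a moving object
    return "stationary"                  # S170: identify as a stationary object

# Passing vehicle (Fig. 5, middle): sonar hit, camera sees the same object move.
r1 = classify_first_object(True, True, True)
# Roadside installation (Fig. 6, middle): sonar hit, no movement detected.
r2 = classify_first_object(True, False, False)
# No detection by the probe wave sensor at all.
r3 = classify_first_object(False, True, True)
```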
 Once it has been identified in this way whether the "first object" is a "moving object" or a "stationary object", the object detection device 10 proceeds to step S180. In step S180, the object detection device 10 generates the obstacle map MP and exits the object detection processing. Specifically, the object detection device 10 registers obstacles identified as "stationary objects" by the object identification unit 34 in the obstacle map MP, and does not register obstacles identified as "moving objects" by the object identification unit 34. The object detection device 10 stores the obstacle map MP in the storage unit 31. Note that the processing of step S180 is performed by the map generation unit 35.
 Here, FIG. 5 shows an example, in chronological order, of the positional relationship between the own vehicle Va and a moving body MO such as another vehicle Vb when they pass each other on a road. FIG. 6 shows an example, in chronological order, of the positional relationship between the own vehicle Va and an installed object OB when the own vehicle Va travels past the installed object OB on a road. Specific examples of the object detection processing are described below with reference to FIGS. 5 and 6.
 With the own vehicle Va at the position shown in the upper part of FIG. 5, the side wall SW provided along the road is detected as the "first object" by the left front side sensor SLf and the left rear side sensor SLb. Meanwhile, since the moving body MO is outside the target area RL of the surroundings monitoring camera 22, the object detection device 10 does not detect the movement of the moving body MO. The object detection device 10 therefore identifies the side wall SW, i.e., the "first object", as a "stationary object".
 When the own vehicle Va advances to the position shown in the middle part of FIG. 5, the left front side sensor SLf detects the moving body MO as the "first object". At this point, since the moving body MO is within the target area RL of the surroundings monitoring camera 22, the object detection device 10 detects the movement of the moving body MO as movement of the "second object". The "first object" and the "second object" are the same moving body MO, and their positions, outer shapes, and so on match. The object detection device 10 therefore identifies the moving body MO, i.e., the "first object", as a "moving object".
 Then, when the own vehicle Va advances to the position shown in the lower part of FIG. 5, the left rear side sensor SLb detects the moving body MO as the "first object". At this point, since the moving body MO is within the target area RL of the surroundings monitoring camera 22, the object detection device 10 detects the movement of the moving body MO as movement of the "second object". The "first object" and the "second object" are the same moving body MO, and their positions, outer shapes, and so on match. The object detection device 10 therefore identifies the moving body MO, i.e., the "first object", as a "moving object".
 Likewise, with the own vehicle Va at the position shown in the upper part of FIG. 6, the side wall SW provided along the road is detected as the "first object" by the left front side sensor SLf and the left rear side sensor SLb. The installed object OB is outside the target area RL of the surroundings monitoring camera 22, so the object detection device 10 does not detect the installed object OB. The object detection device 10 therefore identifies the side wall SW, i.e., the "first object", as a "stationary object".
 When the own vehicle Va advances to the position shown in the middle part of FIG. 6, the left front side sensor SLf detects the installed object OB as the "first object". At this point, since the installed object OB is stationary within the target area RL of the surroundings monitoring camera 22, the object detection device 10 merely detects the installed object OB as the "second object" and does not detect any movement of the "second object". The object detection device 10 therefore identifies the installed object OB, i.e., the "first object", as a "stationary object".
 Then, when the own vehicle Va advances to the position shown in the lower part of FIG. 6, the left rear side sensor SLb detects the installed object OB as the "first object". At this point, since the installed object OB is stationary within the target area RL of the surroundings monitoring camera 22, the object detection device 10 merely detects the installed object OB as the "second object" and does not detect any movement of the "second object". The object detection device 10 therefore identifies the installed object OB, i.e., the "first object", as a "stationary object".
 The object detection device 10 then generates the obstacle map MP. As shown in FIG. 7, for example, the obstacle map MP is constructed by adding information corresponding to "stationary objects" to a grid map that divides the area around the own vehicle Va into small cells (for example, a mesh).
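A minimal sketch of such a grid map follows. The cell size, coordinate convention, and data layout are assumptions for illustration; the point is that only detections classified as "stationary" are registered, while "moving" detections are excluded as described above.

```python
def build_obstacle_map(detections, cell_size_m=0.5):
    """Hypothetical grid obstacle map: the area around the vehicle is divided
    into square cells, and only 'stationary' detections occupy cells."""
    occupied = set()
    for x, y, kind in detections:
        if kind == "stationary":
            cell = (int(x // cell_size_m), int(y // cell_size_m))
            occupied.add(cell)
    return occupied

detections = [
    (2.1, 0.3, "stationary"),   # side wall SW
    (2.6, 0.3, "stationary"),   # side wall SW, next reflection point
    (1.0, 0.4, "moving"),       # passing vehicle Vb: not registered
]
obstacle_map = build_obstacle_map(detections)
```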
 The object detection device 10 described above detects an object within the detection areas Rfl, Rrl, Rfr, and Rrr of the probe wave sensor 21 installed on the own vehicle Va as the "first object", based on the detection result of the probe wave sensor 21. The object detection device 10 detects movement of the "second object" in the target areas RL and RR, which include the detection areas Rfl, Rrl, Rfr, and Rrr of the probe wave sensor 21, based on the detection result of the surroundings monitoring camera 22, which sequentially detects objects in those target areas as the "second object". Then, when movement of the "second object" is detected and the "first object" and the "second object" are the same, the object detection device 10 identifies the "first object" as a "moving object".
 With this configuration, detection of an object based on the detection result of the probe wave sensor 21 and detection of object movement based on the detection result of the surroundings monitoring camera 22 can be performed in parallel. In this case, no standby state arises as in the prior art, so whether an object existing around the own vehicle Va is a "moving object" can be identified in a short time.
 The present embodiment also provides the following effects.
 (1) When the movement detection unit 33 detects no movement of the "second object", the object identification unit 34 identifies the "first object" as a "stationary object". This makes it possible to identify in a short time whether an object existing around the own vehicle Va is a "stationary object".
 (2) When the movement detection unit 33 detects movement of the "second object" but the "first object" and the "second object" are different, the object identification unit 34 identifies the "first object" as a "stationary object". This also makes it possible to identify in a short time whether an object existing around the own vehicle Va is a "stationary object".
 (3) The object detection device 10 includes the map generation unit 35, which generates the obstacle map MP defining the positional relationship between the vehicle and obstacles existing around the own vehicle Va. The map generation unit 35 registers "stationary objects" identified by the object identification unit 34 in the obstacle map MP and does not register "moving objects" identified by the object identification unit 34. This allows "stationary objects" among the objects existing around the own vehicle Va to be reflected in the obstacle map MP as obstacles at an early stage.
 (4) The probe wave sensor 21 includes a plurality of ultrasonic sensors arranged at predetermined intervals along the traveling direction of the own vehicle Va on its sides. Of the plurality of ultrasonic sensors, let one of two ultrasonic sensors adjacent in the traveling direction of the own vehicle Va be the first sensor and the other be the second sensor. The surroundings monitoring camera 22 then sequentially detects the second object using areas that include both the detection area of the first sensor and the detection area of the second sensor as the target areas RL and RR.
 This makes the detection means for the "first object" redundant, enabling appropriate identification of whether the "first object" is a "stationary object" or a "moving object". For example, even if another vehicle Vb that temporarily stops while passing the own vehicle Va is identified as a "stationary object" based on the detection result of the first sensor, the identification can be changed to "moving object" based on the detection result of the second sensor.
 In addition, the target areas RL and RR in which the surroundings monitoring camera 22 detects the "second object" include both the detection area of the first sensor and the detection area of the second sensor. This simplifies the device configuration of the object detection device 10 compared with providing a separate camera for each of the detection areas of the first and second sensors.
 (Other embodiments)
 Representative embodiments of the present disclosure have been described above, but the present disclosure is not limited to the above embodiments and can be modified in various ways, for example as follows. The above embodiment describes the detailed configuration of the object detection device 10 and the detailed content of the object detection processing, but the disclosure is not limited to these, and parts of them may differ.
 The above embodiment illustrates a detection device constituted by the surroundings monitoring camera 22, but the detection device is not limited to this; any device capable of detecting the movement of an object may be used, and the detection device may be constituted by something other than the surroundings monitoring camera 22.
 As in the above embodiment, it is desirable that the object detection device 10 identify the "first object" as a "stationary object" when the movement detection unit 33 detects no movement of the "second object", but this is not limiting. For example, when the movement detection unit 33 detects no movement of the "second object", the object detection device 10 may identify both the "first object" and the "second object" as "stationary objects".
 As in the above embodiment, it is desirable that the object detection device 10 identify the "first object" as a "stationary object" when the movement detection unit 33 detects movement of the "second object" but the "first object" and the "second object" are different, but this is not limiting. For example, in that case the object detection device 10 may identify the "first object" as a "stationary object" while identifying the "second object" as a "moving object".
 As in the above embodiment, it is desirable that the object detection device 10 create the obstacle map MP after identifying the "first object" as either a "stationary object" or a "moving object", but this is not limiting. For example, instead of creating the obstacle map MP, the object detection device 10 may notify the user of the detection of a "moving object" through at least one of the display unit 41 and the audio output unit 42.
 In the above embodiment, two ultrasonic sensors are installed on each of the left and right sides of the own vehicle Va, but more ultrasonic sensors may be installed on the own vehicle Va, or one ultrasonic sensor may be installed on each of its left and right sides. The probe wave sensor 21 may also be constituted by, for example, a single ultrasonic sensor. Further, the object detection device 10 may detect the "first object" based on the output of a millimeter wave radar or LiDAR instead of an ultrasonic sensor.
 As in the above embodiment, it is desirable that the target areas RL and RR of the surroundings monitoring camera 22 be set so as to include the detection areas of ultrasonic sensors adjacent in the traveling direction of the own vehicle Va, but this is not limiting. The target areas RL and RR of the surroundings monitoring camera 22 may include the detection area of one of two adjacent ultrasonic sensors and not include the detection area of the other.
 The above embodiment describes an example of identifying whether an object around the own vehicle Va is a "moving object" or a "stationary object" when the own vehicle Va moves forward, but the object detection device 10 is not limited to this. The object detection device 10 may identify whether an object around the own vehicle Va is a "moving object" or a "stationary object" when the own vehicle Va moves in reverse (i.e., backs up).
 The above embodiment illustrates the object detection device 10 of the present disclosure incorporated into the parking support system 1, but the object detection device 10 is not limited to this; it can also be incorporated into other systems, such as a collision prevention system. That is, the object detection device 10 is applicable beyond the uses described above.
 In the above-described embodiments, it goes without saying that the elements constituting the embodiments are not necessarily essential, except where they are explicitly stated to be essential or where they are clearly essential in principle.
 In the above-described embodiments, where numerical values such as the number, value, amount, or range of the constituent elements are mentioned, the embodiments are not limited to those specific numbers, except where a number is explicitly stated to be essential or where it is clearly limited to a specific number in principle.
 In the above-described embodiments, references to the shapes, positional relationships, and the like of the constituent elements are not limiting, except where a specific shape, positional relationship, or the like is explicitly stated or is required in principle.
 In the above-described embodiments, where it is described that external environment information about the vehicle is acquired from a sensor, the sensor may be omitted and the external environment information may instead be received from a server or cloud outside the vehicle. Alternatively, the sensor may be omitted, information related to the external environment information may be acquired from a server or cloud outside the vehicle, and the external environment information may be estimated from the acquired related information.
 The control unit and methods of the present disclosure may be realized by a dedicated computer provided by configuring a processor and memory programmed to execute one or more functions embodied in a computer program. The control unit and methods of the present disclosure may be realized by a dedicated computer provided by configuring a processor with one or more dedicated hardware logic circuits. The control unit and methods of the present disclosure may also be realized by one or more dedicated computers configured as a combination of a processor and memory programmed to execute one or more functions and a processor configured with one or more hardware logic circuits. The computer program may be stored on a computer-readable, non-transitory tangible recording medium as instructions to be executed by a computer.
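The substitutions described above — millimeter-wave radar or LiDAR in place of the ultrasonic sensors, or a server/cloud feed in place of an on-board sensor — amount to hiding the source of detections behind a common interface so the rest of the object detection logic is unchanged. The following is a minimal illustrative sketch of that idea; all class and function names are assumptions, not taken from this publication:

```python
from abc import ABC, abstractmethod


class DetectionSource(ABC):
    """Common interface: anything that reports object positions around
    the vehicle (on-board ultrasonic sensor, radar, LiDAR, or a cloud feed)."""

    @abstractmethod
    def get_detections(self) -> list[tuple[float, float]]:
        """Return detected object positions as (x, y) in the vehicle frame."""


class UltrasonicArray(DetectionSource):
    """On-board sensor source, e.g. echoes from the probe wave sensor."""

    def __init__(self, echoes: list[tuple[float, float]]):
        self._echoes = echoes

    def get_detections(self) -> list[tuple[float, float]]:
        return self._echoes


class CloudFeed(DetectionSource):
    """External environment information received from a server or cloud
    outside the vehicle, replacing the on-board sensor."""

    def __init__(self, payload: list[tuple[float, float]]):
        self._payload = payload

    def get_detections(self) -> list[tuple[float, float]]:
        return self._payload


def first_objects(source: DetectionSource) -> list[tuple[float, float]]:
    # Downstream detection logic is the same regardless of the source.
    return source.get_detections()


print(first_objects(UltrasonicArray([(1.0, 2.0)])))  # [(1.0, 2.0)]
print(first_objects(CloudFeed([(3.0, 4.0)])))        # [(3.0, 4.0)]
```

Swapping `UltrasonicArray` for `CloudFeed` changes only the constructed object, which is one way the sensor could be "omitted" as the paragraph above contemplates.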

Claims (6)

  1.  An object detection device that detects an object present around a vehicle, comprising:
     an object detection unit (32) that detects, as a first object, an object within a detection area (Rfl, Rrl, Rfr, Rrr) of a probe wave sensor (21) installed on the vehicle, based on a detection result of the probe wave sensor;
     a movement detection unit (33) that detects movement of a second object in a target area (RL, RR) based on a detection result of a detection device (22) that successively detects, as the second object, an object within the target area, the target area being set so as to include the detection area; and
     an object identification unit (34) that identifies the first object as a moving object when the movement detection unit detects movement of the second object and the first object and the second object are the same.
  2.  The object detection device according to claim 1, wherein the object identification unit identifies the first object as a stationary object when the movement detection unit does not detect movement of the second object.
  3.  The object detection device according to claim 1, wherein the object identification unit identifies the first object as a stationary object when the movement detection unit detects movement of the second object and the first object and the second object are different.
  4.  The object detection device according to claim 2 or 3, further comprising a map generation unit (35) that generates an obstacle map defining a positional relationship between the vehicle and obstacles present around the vehicle,
     wherein the map generation unit registers the stationary object identified by the object identification unit in the obstacle map and does not register the moving object identified by the object identification unit in the obstacle map.
  5.  The object detection device according to any one of claims 1 to 4, wherein the probe wave sensor includes a plurality of ultrasonic sensors (SLf, SLb, SRf, SRb) arranged at predetermined intervals along the traveling direction of the vehicle on a side of the vehicle, and,
     where one of two ultrasonic sensors adjacent in the traveling direction is a first sensor and the other is a second sensor,
     the detection device successively detects the second object using, as the target area, an area that includes both the detection area of the first sensor and the detection area of the second sensor.
  6.  An object detection method for detecting an object present around a vehicle, comprising:
     detecting, as a first object, an object within a detection area (Rfl, Rrl, Rfr, Rrr) of a probe wave sensor (21) installed on the vehicle, based on a detection result of the probe wave sensor;
     detecting movement of a second object in a target area (RL, RR) based on a detection result of a detection device (22) that successively detects, as the second object, an object within the target area, the target area being set so as to include the detection area; and
     identifying the first object as a moving object when movement of the second object is detected and the first object and the second object are the same.
PCT/JP2022/029149 2021-08-27 2022-07-28 Object detection device and object detection method WO2023026760A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-139030 2021-08-27
JP2021139030A JP2023032736A (en) 2021-08-27 2021-08-27 Object detection device, and object detection method

Publications (1)

Publication Number Publication Date
WO2023026760A1 true WO2023026760A1 (en) 2023-03-02

Family

ID=85323039

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/029149 WO2023026760A1 (en) 2021-08-27 2022-07-28 Object detection device and object detection method

Country Status (2)

Country Link
JP (1) JP2023032736A (en)
WO (1) WO2023026760A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013020458A (en) * 2011-07-12 2013-01-31 Daihatsu Motor Co Ltd On-vehicle object discrimination device
US20160116593A1 (en) * 2014-10-23 2016-04-28 Hyundai Mobis Co., Ltd. Object detecting apparatus, and method of operating the same
JP2018010466A (en) * 2016-07-13 2018-01-18 株式会社Soken Object detection device
JP2021064098A (en) * 2019-10-11 2021-04-22 株式会社デンソー Control unit

Also Published As

Publication number Publication date
JP2023032736A (en) 2023-03-09


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22861051

Country of ref document: EP

Kind code of ref document: A1