WO2023152871A1 - Information processing device, control method, program, and storage medium - Google Patents
Information processing device, control method, program, and storage medium
- Publication number: WO2023152871A1 (PCT/JP2022/005353)
- Authority: WIPO (PCT)
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
Description
- This disclosure relates to technology for processing measured data.
- Patent Literature 1 discloses a forward vehicle recognition device that detects the distance and inclination of a forward vehicle by changing the lighting pattern used to project a light projection pattern depending on the detection state of that pattern.
- Because the number of data points obtained for a distant curb is small, it is generally difficult to determine that such a curb is an obstacle.
- a main object of the present disclosure is to provide an information processing device, a control method, a program, and a storage medium storing the program, which can suitably detect a curbstone as an obstacle based on measurement data output by the measurement device.
- One aspect of the claimed invention is an information processing device comprising: acquisition means for acquiring measurement data output by a measurement device; obstacle detection means for detecting, from the measurement data, an obstacle point, which is data representing a measured position of an obstacle; ground detection means for detecting, from the measurement data, a ground point, which is data representing a measured position on the ground; curb determination means for determining, based on the travelable area of a moving body determined at the processing time immediately before the current processing time, that a ground point corresponding to a curb is an obstacle point; and travelable area determination means for determining the travelable area at the current processing time based on the obstacle points and the ground points.
- Another aspect is a control method executed by an information processing device, comprising: acquiring measurement data output by a measurement device; detecting, from the measurement data, an obstacle point, which is data representing a measured position of an obstacle; detecting, from the measurement data, a ground point, which is data representing a measured position on the ground; determining, based on the travelable area of a moving body determined at the processing time immediately before the current processing time, that a ground point corresponding to a curb is an obstacle point; and determining the travelable area at the current processing time based on the obstacle points and the ground points.
- Yet another aspect is a program that causes a computer to execute a process of: acquiring measurement data output by a measurement device; detecting, from the measurement data, an obstacle point, which is data representing a measured position of an obstacle; detecting, from the measurement data, a ground point, which is data representing a measured position on the ground; determining, based on the travelable area of a moving body determined at the processing time immediately before the current processing time, that a ground point corresponding to a curb is an obstacle point; and determining the travelable area at the current processing time based on the obstacle points and the ground points.
- FIG. 1 shows a schematic configuration of a lidar according to an embodiment.
- FIG. 2 is an example of a flowchart of the overall processing according to the embodiment.
- FIG. 3 is an example of a flowchart showing the procedure of distant curb point determination processing.
- FIG. 4 is an example of a flowchart showing the procedure of travelable area determination processing.
- FIG. 5 shows an overview of determining the travelable area.
- FIG. 6 is an example of a flowchart of a second distant curb point determination process.
- FIG. 7 shows a block diagram of a lidar system according to a modification.
- In one mode, the information processing device includes: acquisition means for acquiring measurement data output by a measurement device; obstacle detection means for detecting, from the measurement data, an obstacle point, which is data representing a measured position of an obstacle; ground detection means for detecting, from the measurement data, a ground point, which is data representing a measured position on the ground; curb determination means for determining, based on the travelable area determined at the processing time immediately before the current processing time, that a ground point corresponding to a curb is an obstacle point; and travelable area determination means for determining the travelable area at the current processing time based on the obstacle points and the ground points.
- the information processing device can accurately determine that the ground point corresponding to the curbstone, which has been erroneously detected as the ground point, is the obstacle point.
- In one mode, the curb determination means sets a temporary travelable area at the current processing time based on the travelable area at the immediately preceding processing time, and determines, based on the temporary travelable area, that the ground point corresponding to the curbstone is an obstacle point.
- the information processing device can accurately determine the ground point corresponding to the curb based on the travelable area at the immediately preceding processing time.
- the curb determination means corrects the ground point existing near the boundary position of the temporary travelable area to the obstacle point. According to this aspect, the information processing device can accurately determine the ground point corresponding to the curb.
- the curb determination means sets the temporary travelable area based on the travelable area at the immediately preceding processing time and the movement information of the measuring device. According to this aspect, the information processing device can accurately set the provisional travelable area at the current processing time.
- the information processing device further includes ground point correction means for correcting the obstacle point existing within the temporary travelable area to the ground point.
- the curb determining means determines that the ground point corresponding to the curb is the obstacle point based on continuity in the direction in which the road extends. According to this aspect, the information processing device can appropriately correct the ground point corresponding to the curb to the obstacle point.
- the travelable area determining means extends the travelable area within a predetermined distance from the current processing time determined based on the obstacle point and the ground point. , determine the entire drivable area at the current processing time. According to this aspect, the information processing device can accurately determine the entire travelable area at the current processing time.
- the obstacle is an object existing near a road boundary.
- the object is at least one of a curbstone, vegetation, a road shoulder, and a fallen object near a road boundary.
- In one mode of the control method executed by an information processing device, measurement data output by a measurement device is acquired; an obstacle point, which is data representing a measured position of an obstacle, is detected from the measurement data; a ground point, which is data representing a measured position on the ground, is detected from the measurement data; a ground point corresponding to a curb is determined to be an obstacle point based on the travelable area of the moving body determined at the processing time immediately before the current processing time; and the travelable area at the current processing time is determined based on the obstacle points and the ground points.
- Thereby, the information processing device can accurately determine that a ground point corresponding to a curbstone, which was erroneously detected as a ground point, is an obstacle point.
- In one mode, the program causes a computer to execute a process of: acquiring measurement data output by a measurement device; detecting, from the measurement data, an obstacle point, which is data representing a measured position of an obstacle; detecting, from the measurement data, a ground point, which is data representing a measured position on the ground; determining, based on the travelable area of the moving body determined at the processing time immediately preceding the current processing time, that a ground point corresponding to a curb is an obstacle point; and determining the travelable area at the current processing time.
- the computer can accurately determine the ground point corresponding to the curb that has been erroneously detected as the ground point to be the obstacle point.
- the program is stored in a storage medium.
- FIG. 1 shows a schematic configuration of a lidar 100 according to this embodiment.
- The lidar 100 is mounted on, for example, a vehicle that supports driving assistance such as autonomous driving.
- the lidar 100 irradiates a laser beam in a predetermined angular range in the horizontal and vertical directions, and receives light (also referred to as “reflected light”) returned by the laser beam reflected by an object.
- the distance from the lidar 100 to the object is discretely measured, and point cloud information indicating the three-dimensional position of the object is generated.
- the lidar 100 is installed so that the ground such as the road surface is included in the measurement range.
- the lidar 100 mainly includes a transmitter 1, a receiver 2, a beam splitter 3, a scanner 5, a piezo sensor 6, a controller 7, and a memory 8.
- The transmitter 1 is a light source that emits pulsed laser light toward the beam splitter 3.
- the transmitter 1 includes, for example, an infrared laser emitting element.
- The transmitter 1 is driven based on the drive signal "Sg1" supplied from the controller 7.
- the receiving unit 2 is, for example, an avalanche photodiode, generates a detection signal "Sg2" corresponding to the amount of received light, and supplies the generated detection signal Sg2 to the control unit 7.
- The beam splitter 3 transmits the pulsed laser light emitted from the transmitter 1. The beam splitter 3 also reflects the light reflected by the scanner 5 toward the receiver 2.
- the scanner 5 is, for example, an electrostatically driven mirror (MEMS mirror), and the tilt (that is, the optical scanning angle) changes within a predetermined range based on the drive signal "Sg3" supplied from the control unit 7.
- The scanner 5 reflects the laser light that has passed through the beam splitter 3 toward the outside of the lidar 100, and reflects the reflected light incident from the outside of the lidar 100 toward the beam splitter 3.
- a point measured by irradiating the laser beam within the measurement range of the lidar 100 or measurement data thereof is also referred to as a "point to be measured”.
- The scanner 5 is provided with a piezo sensor 6.
- the piezo sensor 6 detects strain caused by the stress of the torsion bar that supports the mirror portion of the scanner 5 .
- The piezo sensor 6 supplies the generated detection signal "Sg4" to the controller 7.
- the detection signal Sg4 is used for detecting the orientation of the scanner 5.
- the memory 8 is composed of various volatile and nonvolatile memories such as RAM (Random Access Memory), ROM (Read Only Memory), and flash memory.
- the memory 8 stores programs necessary for the control unit 7 to execute predetermined processing.
- The memory 8 also stores various parameters referred to by the control unit 7. Further, the memory 8 stores the latest point cloud information, for a predetermined number of frames, generated by the control unit 7, and the like.
- the control unit 7 includes various processors such as a CPU (Central Processing Unit) and a GPU (Graphics Processing Unit).
- the control unit 7 executes a predetermined process by executing a program stored in the memory 8 .
- the control unit 7 is an example of a computer that executes programs. Note that the control unit 7 is not limited to being realized by software programs, and may be realized by any combination of hardware, firmware, and software.
- The control unit 7 may be a user-programmable integrated circuit such as an FPGA (Field-Programmable Gate Array) or a microcontroller, or may be an ASSP (Application Specific Standard Product), an ASIC (Application Specific Integrated Circuit), or the like.
- the control unit 7 functionally includes a transmission drive block 70 , a scanner drive block 71 , a point cloud information generation block 72 and a point cloud information processing block 73 .
- the transmission drive block 70 outputs a drive signal Sg1 for driving the transmission section 1.
- the drive signal Sg1 includes information for controlling the light emission time of the laser light emitting element included in the transmitter 1 and the light emission intensity of the laser light emitting element.
- the transmission drive block 70 controls the emission intensity of the laser light emitting element included in the transmission section 1 based on the drive signal Sg1.
- the scanner drive block 71 outputs a drive signal Sg3 for driving the scanner 5.
- the drive signal Sg3 includes a horizontal drive signal corresponding to the resonance frequency of the scanner 5 and a vertical drive signal for vertical scanning. Further, the scanner drive block 71 monitors the detection signal Sg4 output from the piezo sensor 6 to detect the scanning angle of the scanner 5 (that is, the laser beam emission direction).
- The point cloud information generation block 72 generates point cloud information that indicates, for each emission direction, the distance (measurement distance) from the lidar 100 as a reference point to the object irradiated with the laser beam in that direction (that is, the direction of the laser beam). In this case, the point cloud information generation block 72 calculates the time from when the laser light is emitted until the receiving unit 2 detects the reflected light as the time of flight of the light. Then, the point cloud information generation block 72 generates point cloud information indicating the set of points corresponding to pairs of the measurement distance derived from the calculated flight time and the emission direction of the laser beam corresponding to the reflected light received by the receiving unit 2, and supplies the generated point cloud information to the point cloud information processing block 73.
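The flight-time-to-distance conversion described above can be sketched as follows; the constant and function names are illustrative and not taken from the patent.

```python
# Speed of light in vacuum [m/s].
C = 299_792_458.0

def tof_distance(t_emit: float, t_receive: float) -> float:
    """Return the measurement distance for a pulse emitted at t_emit
    whose reflection is detected at t_receive (times in seconds)."""
    flight_time = t_receive - t_emit
    # Divide by 2 because the light travels to the object and back.
    return C * flight_time / 2.0

# Example: a reflection detected 400 ns after emission (~60 m away).
d = tof_distance(0.0, 400e-9)
```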
- the point cloud information obtained by scanning all the points to be measured once will be referred to as point cloud information for one frame.
- the point cloud information can be regarded as an image in which each measurement direction is a pixel and the measurement distance in each measurement direction is a pixel value.
- In this case, pixels arranged in the vertical direction correspond to different elevation/depression angles of the laser emission direction, and pixels arranged in the horizontal direction correspond to different horizontal angles of the laser emission direction.
- The three-dimensional coordinate system used to express the measured positions is also called the "reference coordinate system."
- the reference coordinate system is assumed to be a three-dimensional coordinate system in which the horizontal plane (that is, the plane parallel to the ground) is the XY axis and the height direction perpendicular to the horizontal plane is the Z axis. Also, the X-axis is assumed to be parallel to the front direction of the vehicle (that is, the extending direction of the road).
- The origin of the reference coordinate system is set, for example, at the position of the lidar 100.
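A measurement given as (distance, horizontal angle, elevation/depression angle) can be mapped into this X-forward, Z-up reference coordinate system roughly as follows; the angle conventions are an assumption for illustration.

```python
import math

def to_reference_frame(r: float, azimuth_deg: float, elevation_deg: float):
    """Convert a (measurement distance, horizontal angle, elevation angle)
    triple into X/Y/Z coordinates of the reference coordinate system
    (X forward along the road, Z up). Degree-based angles are assumed."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = r * math.cos(el) * math.cos(az)  # forward (road direction)
    y = r * math.cos(el) * math.sin(az)  # lateral (left/right)
    z = r * math.sin(el)                 # height
    return x, y, z
```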
- a frame of point cloud information obtained at the current processing time is called a "current frame”
- a frame of point cloud information obtained in the past is also called a "past frame”.
- the point cloud information processing block 73 executes predetermined processing on the point cloud information generated by the point cloud information generation block 72 .
- The point cloud information processing block 73 performs processing for removing, from the point cloud information, data generated by erroneously detecting an object (also referred to as "noise data" or "false alarm data").
- the point cloud information processing block 73 further adds flag information indicating whether or not each measured point is noise data to the point cloud information.
- "Effective points": points to be measured corresponding to data generated by detecting an actually existing object.
- Points to be measured other than effective points: points to be measured corresponding to noise data.
- The point cloud information processing block 73 also classifies the points to be measured represented by the point cloud information supplied from the point cloud information generation block 72, and adds classification information representing the classification result to the point cloud information. Specifically, the point cloud information processing block 73 detects points to be measured representing the ground (including road surface paint such as white lines), also referred to as "ground points", and points to be measured representing obstacles on or around the road (including vehicles ahead and features near the road boundary), also referred to as "obstacle points", and generates classification information for each point to be measured based on the detection results.
- Furthermore, the point cloud information processing block 73 identifies points to be measured that are erroneously determined (classified) as obstacle points due to the beam width (so-called footprint size) of the laser beam (also referred to as "erroneous obstacle determination points"), and corrects the classification of each erroneous obstacle determination point to a ground point.
- In addition, considering that a distant curb is difficult to determine as an obstacle point (that is, it tends to be determined as a ground point), the point cloud information processing block 73 corrects the classification of a point to be measured determined to correspond to a distant curb (also referred to as a "distant curb point") to an obstacle point. Further, the point cloud information processing block 73 divides the current frame, regarded as an image, into grids, and executes travelable area determination processing that determines whether or not each grid represents an area in which the vehicle can travel (also referred to as a "travelable area").
- The point cloud information processing block 73 associates the point cloud information and the information on the travelable area (also referred to as "travelable area information") for each frame with time information indicating the processing time of that frame, and stores them in the memory 8. Details of the processing of the point cloud information processing block 73 will be described later.
- the point cloud information processing block 73 is an example of the "acquisition means”, the “obstacle detection means”, the “ground detection means”, the “ground point correction means”, the “curb determination means”, and the “drivable area determination means”.
- the lidar 100 excluding the point cloud information processing block 73 is an example of the "measurement device".
- The point cloud information and travelable area information generated by the point cloud information processing block 73 may be output, for example, to a device that controls driving assistance such as autonomous driving of the vehicle (also referred to as a "driving support device").
- The driving support device may be, for example, a vehicle ECU (Electronic Control Unit), or an in-vehicle device such as a car navigation device electrically connected to the vehicle.
- The lidar 100 is not limited to a scanning lidar that sweeps a laser beam across the field of view; it may be a flash lidar that generates three-dimensional data by diffusing laser light over the field of view of a two-dimensional array sensor.
- FIG. 2 is an example of a flowchart showing the procedure for processing point cloud information.
- the point cloud information processing block 73 repeatedly executes the processing shown in FIG. 2 for each cycle of generating point cloud information for one frame.
- the point cloud information generation block 72 generates point cloud information based on the detection signal Sg2 (step S01). In this case, the point cloud information generation block 72 generates the point cloud information of the current frame corresponding to the current processing time based on the detection signal Sg2 generated by one scan of the scanning target range of the lidar 100 .
- the point cloud information processing block 73 executes noise removal processing, which is processing for removing noise data from the point cloud information generated in step S01 (step S02).
- the point cloud information processing block 73 may perform arbitrary noise removal processing.
- the point cloud information processing block 73 regards the data of the measured points where the intensity of the reflected light received by the receiver 2 is less than a predetermined threshold value as noise data, and determines the measured points representing data other than the noise data. regarded as a valid point.
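A minimal sketch of this intensity-threshold noise test follows; the field names and threshold value are illustrative, not from the patent.

```python
def mark_valid_points(points, intensity_threshold):
    """Flag each measured point: points whose received reflected-light
    intensity is below the threshold are regarded as noise data, and
    the remaining points are regarded as valid (effective) points."""
    return [dict(p, valid=(p["intensity"] >= intensity_threshold))
            for p in points]

# Example with an assumed threshold of 0.5.
pts = mark_valid_points([{"intensity": 0.9}, {"intensity": 0.1}], 0.5)
```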
- the point cloud information processing block 73 executes processing for classifying each effective point of the point cloud information after noise removal processing (step S03).
- the point cloud information processing block 73 estimates a plane representing the ground based on the effective points represented by the point cloud information after noise removal processing, and determines that the effective points present at positions higher than the plane by a predetermined threshold or more are obstacles. It is determined as a point, and other effective points are determined as ground points.
- the point cloud information processing block 73 uses the point cloud data of effective points to estimate a plane representing the ground by obtaining a plane equation in the reference coordinate system by the least squares method.
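The least-squares plane fit and the height-based ground/obstacle split can be sketched with NumPy as below; the 0.15 m threshold and the array layout are assumptions for illustration.

```python
import numpy as np

def classify_points(xyz: np.ndarray, height_threshold: float = 0.15):
    """Fit a ground plane z = a*x + b*y + c to the points (rows of
    [x, y, z]) by least squares, then flag points lying more than
    height_threshold above the fitted plane as obstacle points; the
    rest are ground points. The threshold value is illustrative."""
    # Design matrix [x, y, 1] for the plane equation.
    A = np.c_[xyz[:, 0], xyz[:, 1], np.ones(len(xyz))]
    coeffs, *_ = np.linalg.lstsq(A, xyz[:, 2], rcond=None)  # [a, b, c]
    residual = xyz[:, 2] - A @ coeffs  # height above the fitted plane
    is_obstacle = residual > height_threshold
    return coeffs, is_obstacle
```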
- the point cloud information processing block 73 performs processing for estimating the ground height (also referred to as "ground height estimation processing") based on the ground points determined in step S03 (step S04). In this case, for example, the point cloud information processing block 73 estimates the ground height based on the plane equation calculated from the ground points.
- the point cloud information processing block 73 executes erroneous obstacle determination point correction processing (step S05).
- For example, the point cloud information processing block 73 corrects points to be measured that were classified as obstacle points due to reflection on road surface paint, such as white lines, to ground points, taking into account the height displacement and depth displacement caused by the footprint size.
- Specifically, obstacle points whose height difference from surrounding ground points is equal to or less than a first threshold value are extracted as candidates for erroneous obstacle determination points, and a candidate whose depth distance difference from the surrounding ground points is larger than a second threshold value is corrected to a ground point.
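As a sketch, the two-threshold test just described might look like this; the threshold values and field names are illustrative assumptions.

```python
def is_false_obstacle(point, ground_z, ground_depth,
                      h_thresh=0.1, d_thresh=0.5):
    """Return True if an obstacle point should be corrected to a ground
    point: it sits at roughly the surrounding ground height (candidate
    extraction, first threshold) while its measured depth deviates from
    the surrounding ground by more than the second threshold, as a flat
    paint mark smeared by the beam footprint would."""
    height_diff = abs(point["z"] - ground_z)
    depth_diff = abs(point["depth"] - ground_depth)
    return height_diff <= h_thresh and depth_diff > d_thresh
```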
- the point to be measured that has been corrected to the ground point will also be referred to as a "corrected ground point”.
- In a second correction process, the point cloud information processing block 73 detects erroneous obstacle determination points that could not be corrected to corrected ground points in the first correction process, based on the size of the cluster formed by the obstacle points, and corrects them to corrected ground points.
- the point cloud information processing block 73 resets the points erroneously determined to be corrected ground points in the first correction process and the second correction process as obstacle points.
- For example, the point cloud information processing block 73 corrects corrected ground points located directly above or below an obstacle point back to obstacle points, corrects clusters of a predetermined number (for example, three) or more corrected ground points consecutive in a given direction back to obstacle points, or corrects corrected ground points back to obstacle points based on past frames.
- the point cloud information processing block 73 executes distant curb point determination processing (step S06).
- In this case, the point cloud information processing block 73 sets a provisional travelable area (also referred to as a "temporary travelable area") for the current frame based on the travelable area information of the past frame one processing time earlier, and determines distant curb points classified as ground points based on the temporary travelable area. Then, the point cloud information processing block 73 corrects the classification of each distant curb point to an obstacle point.
- the point cloud information processing block 73 executes travelable area determination processing (step S07).
- the point cloud information processing block 73 divides the current frame, which is regarded as an image, into grids and performs processing for determining whether or not each grid is a travelable area.
- FIG. 3 is an example of a flowchart showing the procedure of the distant curb point determination process.
- the point cloud information processing block 73 sets a temporary travelable area for the current frame based on the travelable area information of the past frame corresponding to the processing time immediately before the current processing time (step S11).
- In this case, the point cloud information processing block 73 sets the temporary travelable area based on information regarding the movement of the lidar 100 (also referred to as "lidar movement information").
- For example, the point cloud information processing block 73 generates lidar movement information indicating the moving speed and orientation change of the lidar 100 based on vehicle speed pulse information, vehicle angular velocity information in the yaw direction, and the like received, via a predetermined communication protocol such as CAN, from the vehicle in which the lidar 100 is mounted.
- Alternatively, the point cloud information processing block 73 may generate lidar movement information based on detection signals output by various sensors, such as an acceleration sensor, provided in the lidar 100.
- When the point cloud information processing block 73 determines, based on the lidar movement information, that the lidar 100 has not moved between the past frame and the current frame, it assumes there is no large change in the travelable area between consecutive frames and sets the travelable area of the past frame as the temporary travelable area.
- On the other hand, when the point cloud information processing block 73 determines, based on the lidar movement information, that the lidar 100 has moved between the past frame and the current frame, it calculates the amount of movement of the lidar 100. Then, the point cloud information processing block 73 sets a temporary travelable area in which the calculated amount of movement is reflected in the travelable area of the past frame. As a result, the point cloud information processing block 73 can set the temporary travelable area in consideration of the movement of the lidar 100.
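Under the assumption that the travelable area is kept as a set of XY points in the reference coordinate system, projecting the previous frame's area into the current frame with the lidar movement information might look like this (a 2-D rigid transform; the representation is a simplification, not the patent's grid-based one):

```python
import math

def temporary_drivable_area(prev_area_xy, dx, dyaw):
    """Project the previous frame's travelable area into the current
    frame using lidar movement information: forward displacement dx [m]
    and yaw change dyaw [rad]. Names and interface are illustrative."""
    if dx == 0.0 and dyaw == 0.0:
        # The lidar has not moved: reuse the previous area as-is.
        return list(prev_area_xy)
    c, s = math.cos(-dyaw), math.sin(-dyaw)
    moved = []
    for x, y in prev_area_xy:
        xs, ys = x - dx, y  # shift by the forward displacement
        moved.append((c * xs - s * ys, s * xs + c * ys))  # undo the yaw
    return moved
```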
- the point cloud information processing block 73 regards ground points existing near the boundary of the temporary travelable area as distant curb points and corrects them to obstacle points (step S12).
- In this case, the point cloud information processing block 73 recognizes the boundary between the temporary travelable area and other areas on the current frame, and corrects the classification of points to be measured that are classified as ground points and that correspond to the boundary pixels, or to neighboring pixels within a predetermined pixel range of the boundary, to obstacle points.
- the point cloud information processing block 73 can determine a distant curb point with a small number of points as an obstacle point.
- the point cloud information processing block 73 corrects the obstacle points within the temporary travelable area to ground points (step S13). In this case, the point cloud information processing block 73 corrects obstacle points (excluding those corrected to obstacle points in step S12) existing in the temporary travelable area to ground points.
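Steps S12 and S13 together can be sketched as one pass over the classified points; the predicate interface is a hypothetical simplification of the pixel-range check described in the text.

```python
def distant_curb_correction(points, in_area, near_boundary):
    """One pass implementing steps S12-S13: ground points near the
    temporary travelable area boundary become obstacle points (distant
    curb points), and obstacle points strictly inside the area become
    ground points. in_area / near_boundary are caller-supplied
    predicates standing in for the pixel-based tests in the text."""
    corrected = []
    for p in points:
        label = p["label"]
        if label == "ground" and near_boundary(p):
            label = "obstacle"  # S12: distant curb point
        elif label == "obstacle" and in_area(p) and not near_boundary(p):
            label = "ground"    # S13: stray obstacle point inside area
        corrected.append(dict(p, label=label))
    return corrected
```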
- FIG. 4 is an example of a flowchart showing the procedure of the travelable area determination process.
- the point cloud information processing block 73 sets a grid for the current frame (step S21).
- the point cloud information processing block 73 regards the current frame as an image, and sets a grid by dividing it vertically and horizontally by a predetermined number of pixels.
- Each grid is a rectangular area having the predetermined number of pixels vertically and horizontally.
- the point cloud information processing block 73 classifies grids with obstacle points as "1" and grids with no obstacle points as "0" (step S22).
- the point cloud information processing block 73 refers to the classification result indicating whether the measured point corresponding to each pixel constituting a grid cell is a ground point or an obstacle point, and classifies the grids as described above.
- the point cloud information processing block 73 determines the travelable area corresponding to the current frame based on the classification result of each grid (step S23). In this case, the point cloud information processing block 73 regards a continuous row of "0" grids including the center of each horizontal line as a travelable area.
- FIG. 5 shows an overview of the method of determining the travelable area in step S23.
- FIG. 5 shows the result of grid classification in the portion of the current frame that is closer to the lidar 100, and the travelable area that is set based on the classification result.
- the "reference line” is a line representing the center of the image in the horizontal direction.
- the point cloud information processing block 73 searches leftward from the reference line, in order from the horizontal line of the nearest grids, for grids classified as "0", and sets the grids before the first grid classified as "1" as the travelable area. Similarly, the point cloud information processing block 73 searches rightward from the reference line for grids classified as "0", and sets the grids before the first grid classified as "1" as the travelable area. After that, the point cloud information processing block 73 shifts the horizontal line to be searched upward by one line and performs the same processing.
- the point cloud information processing block 73 performs the above-described processing on the horizontal lines of grids that are within a predetermined distance from the lidar 100 (or within a predetermined number of grids from the lower end), thereby determining a part of the travelable area (that is, the travelable area within the predetermined distance from the lidar 100). Then, the point cloud information processing block 73 determines the rest of the travelable area by extending the determined part of the travelable area in the depth direction. In this case, the point cloud information processing block 73 calculates, for example, straight lines in the reference coordinate system representing the left and right boundary lines of the determined partial travelable area by regression analysis such as the least-squares method, and sets the area between the calculated straight lines as the remaining travelable area. As a result, the point cloud information processing block 73 can suitably set the entire travelable area in the current frame.
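The grid search of steps S22 and S23 can be sketched as follows, assuming the grid classification is held as a 2D array with row 0 nearest the lidar; the function name and array layout are illustrative:

```python
import numpy as np

def determine_travelable_grids(obstacle_grid):
    """Sketch of step S23: 'obstacle_grid' is a 2D array where a cell is
    1 if it contains an obstacle point and 0 otherwise (step S22), with
    row 0 being the horizontal line nearest the lidar.  Each line is
    scanned outward from the reference (center) column, and the cells
    before the first '1' cell are marked travelable."""
    n_rows, n_cols = obstacle_grid.shape
    travelable = np.zeros_like(obstacle_grid, dtype=bool)
    ref = n_cols // 2  # reference line: horizontal center of the frame
    for row in range(n_rows):
        if obstacle_grid[row, ref] == 1:
            continue  # center cell itself blocked: nothing travelable here
        # search leftward from the reference line
        for col in range(ref, -1, -1):
            if obstacle_grid[row, col] == 1:
                break
            travelable[row, col] = True
        # search rightward from the reference line
        for col in range(ref + 1, n_cols):
            if obstacle_grid[row, col] == 1:
                break
            travelable[row, col] = True
    return travelable
```

In the patent, this search is applied only to the lines within a predetermined distance; the resulting left and right boundaries are then extended in the depth direction by a least-squares line fit (for example with `np.polyfit`) to obtain the remaining travelable area.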
- as the distant curb point determination process, the point cloud information processing block 73 may perform, in addition to the process of determining distant curb points based on the travelable area, a process of determining distant curb points based on their continuity in the direction in which the road extends (also referred to as the "second distant curb point determination process").
- the point cloud information processing block 73 determines a distant curb point based on the premise that the distant curb point exists continuously (extended) in the X-axis direction.
- FIG. 6 is an example of a flowchart of the distant curb point determination second process.
- the point cloud information processing block 73 executes the processing of this flowchart together with the flowchart of the distant curb point determination process shown in FIG. 3 .
- the point cloud information processing block 73 searches for a corrected ground point corrected from an obstacle point to a ground point by erroneous obstacle determination point correction processing or the like in the current frame (step S31).
- the point cloud information processing block 73 may search, by an arbitrary method, for ground points (including corrected ground points) existing near the boundary of the road, and perform the following processing on the searched points.
- the point cloud information processing block 73 counts, for each of the searched corrected ground points, the surrounding points that have a positional relationship consistent with the continuity of a curb (step S32). Specifically, for each corrected ground point, the point cloud information processing block 73 counts the surrounding points that are obstacle points, corrected ground points, or corrected obstacle points (counting obstacle points only is also possible) and whose distance on the YZ plane is within a threshold.
- the surrounding points are measured points of the current frame corresponding to pixels whose vertical displacement is within a predetermined number of lines (for example, 30 lines) and whose displacement along the horizontal line is less than a predetermined pixel difference (for example, a difference of 3 pixels).
- the point cloud information processing block 73 regards the corrected ground point where the count number in step S32 is equal to or greater than a predetermined number as a distant curb point, and corrects it to an obstacle point (step S33).
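Steps S31 through S33 above can be sketched as the following counting routine; the threshold values (`yz_thresh`, `min_count`) and the label names are illustrative assumptions, not values given in the patent:

```python
import math

# Labels whose points count toward curb continuity (per the patent text:
# obstacle points, corrected ground points, and corrected obstacle points;
# counting obstacle points only is also possible).
COUNTABLE = {"obstacle", "corrected_ground", "corrected_obstacle"}

def find_distant_curb_points(points, max_dline=30, max_dcol=3,
                             yz_thresh=0.3, min_count=5):
    """Sketch of steps S31-S33 of the second distant curb point
    determination process.  'points' maps a pixel (line, column) to
    ((x, y, z), label).  For each corrected ground point, the surrounding
    points with a countable label, within max_dline lines vertically,
    less than max_dcol columns horizontally, and within yz_thresh on the
    YZ plane, are counted; candidates reaching min_count are regarded as
    distant curb points (to be corrected to obstacle points)."""
    candidates = [p for p, (_, lab) in points.items()
                  if lab == "corrected_ground"]
    curb_points = []
    for (line, col) in candidates:
        (_, y0, z0), _ = points[(line, col)]
        count = 0
        for (l2, c2), ((_, y, z), lab) in points.items():
            if (l2, c2) == (line, col) or lab not in COUNTABLE:
                continue
            if (abs(l2 - line) <= max_dline and abs(c2 - col) < max_dcol
                    and math.hypot(y - y0, z - z0) <= yz_thresh):
                count += 1
        if count >= min_count:
            curb_points.append((line, col))  # correct to obstacle point
    return curb_points
```

Because a curb extends in the X-axis direction, its points line up over neighbouring scan lines at nearly the same Y and Z, which is exactly what the YZ-plane distance test exploits.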
- instead of executing the processing of this flowchart separately from the processing of the flowchart of FIG. 3, the point cloud information processing block 73 may consider the continuity of distant curb points in the processing of step S12 of FIG. 3.
- in this case, the point cloud information processing block 73 corrects, to an obstacle point, a ground point that exists near the boundary of the temporary travelable area and that satisfies the above-described continuity condition.
- distant curb points can thus be determined with higher accuracy by utilizing the property that distant curb points are continuous in the X-axis direction.
- the configuration of the lidar 100 is not limited to the configuration shown in FIG. 1 .
- a device other than the lidar 100 may have the point cloud information processing block 73 of the control unit 7, or a function corresponding to the point cloud information processing block 73 .
- FIG. 7 is a configuration diagram of a lidar system according to a modification.
- the lidar system has a lidar 100X and an information processing device 200 .
- the lidar 100X supplies the point cloud information generated by the point cloud information generation block 72 to the information processing device 200.
- the information processing device 200 has a control unit 7A and a memory 8.
- the memory 8 stores information necessary for the control section 7A to execute processing.
- the control unit 7A functionally has a point cloud information acquisition block 72A and a point cloud information processing block 73 .
- the point cloud information acquisition block 72A receives the point cloud information generated by the point cloud information generation block 72 of the lidar 100X and supplies the received point cloud information to the point cloud information processing block 73 .
- the point cloud information processing block 73 performs the same processing as the point cloud information processing block 73 of the above-described embodiment on the point cloud information supplied from the point cloud information acquisition block 72A.
- the information processing device 200 may be realized by a driving support device. Further, the parameter information necessary for processing may be stored in another device having a memory that the information processing device 200 can refer to. According to the configuration of this modification, the information processing device 200 can generate accurate classification information for each measured point in the point cloud information generated by the lidar 100X.
- the control unit 7 of the lidar 100 functions as the information processing device in the present invention, and functionally includes acquisition means, obstacle detection means, ground detection means, curb determination means, and travelable area determination means.
- the acquisition means acquires point cloud data.
- the obstacle detection means detects obstacle points, which are data representing measured points of an obstacle, from the point cloud data.
- the ground detection means detects ground points, which are data representing measured points on the ground, from the point cloud data.
- the curb determining means determines that the ground point corresponding to the curb is the obstacle point based on the travelable area of the vehicle determined at the processing time immediately before the current processing time.
- the travelable area determination means determines the travelable area at the current processing time based on the obstacle point and the ground point. As a result, the lidar 100 can accurately determine even a curb, which is liable to be determined as a ground point, as an obstacle point.
- Non-transitory computer readable media include various types of tangible storage media.
- Examples of non-transitory computer-readable media include magnetic storage media (e.g., floppy disks, magnetic tapes, hard disk drives), magneto-optical storage media (e.g., magneto-optical discs), CD-ROM (Read Only Memory), CD-R, CD-R/W, and semiconductor memory (e.g., mask ROM, PROM (Programmable ROM), EPROM (Erasable PROM), flash ROM, RAM (Random Access Memory)).
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Electromagnetism (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Optical Radar Systems And Details Thereof (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
An information processing device comprising:
acquisition means for acquiring measurement data output by a measurement device;
obstacle detection means for detecting, from the measurement data, an obstacle point, which is data representing a measured position of an obstacle;
ground detection means for detecting, from the measurement data, a ground point, which is data representing a measured position of the ground;
curb determination means for determining, based on a travelable area of a moving body determined at the processing time immediately before the current processing time, that the ground point corresponding to a curb is the obstacle point; and
travelable area determination means for determining the travelable area at the current processing time based on the obstacle point and the ground point.
A control method executed by an information processing device, the method comprising:
acquiring measurement data output by a measurement device;
detecting, from the measurement data, an obstacle point, which is data representing a measured position of an obstacle;
detecting, from the measurement data, a ground point, which is data representing a measured position of the ground;
determining, based on a travelable area of a moving body determined at the processing time immediately before the current processing time, that the ground point corresponding to a curb is the obstacle point; and
determining the travelable area at the current processing time based on the obstacle point and the ground point.
A program causing a computer to execute processing of:
acquiring measurement data output by a measurement device;
detecting, from the measurement data, an obstacle point, which is data representing a measured position of an obstacle;
detecting, from the measurement data, a ground point, which is data representing a measured position of the ground;
determining, based on a travelable area of a moving body determined at the processing time immediately before the current processing time, that the ground point corresponding to a curb is the obstacle point; and
determining the travelable area at the current processing time based on the obstacle point and the ground point.
FIG. 1 shows a schematic configuration of the lidar 100 according to the present embodiment. The lidar 100 is mounted, for example, on a vehicle that performs driving support such as autonomous driving. The lidar 100 emits laser light over predetermined horizontal and vertical angle ranges and receives the light returned by reflection of the laser light from an object (also referred to as "reflected light"), thereby discretely measuring the distance from the lidar 100 to the object and generating point cloud information indicating the three-dimensional position of the object. The lidar 100 is installed so that its measurement range includes the ground, such as a road surface.
FIG. 2 is an example of a flowchart showing the procedure of processing the point cloud information. The point cloud information processing block 73 repeatedly executes the processing shown in FIG. 2 at every cycle in which one frame of point cloud information is generated.
Next, the distant curb point determination process executed in step S06 of FIG. 2 will be described in detail. FIG. 3 is an example of a flowchart showing the procedure of the distant curb point determination process.
Next, the travelable area determination process executed in step S07 of FIG. 2 will be described in detail. FIG. 4 is an example of a flowchart showing the procedure of the travelable area determination process.
Next, modifications suitable for the above-described embodiment will be described. The following modifications may be applied to the above-described embodiment in combination.
As the distant curb point determination process, the point cloud information processing block 73 may perform, in addition to the process of determining distant curb points based on the travelable area, a process of determining distant curb points based on their continuity in the direction in which the road extends (also referred to as the "second distant curb point determination process"). In this case, the point cloud information processing block 73 determines distant curb points on the premise that distant curb points exist continuously (in an extended manner) in the X-axis direction.
The configuration of the lidar 100 is not limited to the configuration shown in FIG. 1. For example, a device other than the lidar 100 may have the point cloud information processing block 73 of the control unit 7, or a function corresponding to the point cloud information processing block 73.
2 receiver
3 beam splitter
5 scanner
6 piezo sensor
7, 7A control unit
8 memory
100, 100X lidar
200 information processing device
Claims (12)
- An information processing device comprising:
acquisition means for acquiring measurement data output by a measurement device;
obstacle detection means for detecting, from the measurement data, an obstacle point, which is data representing a measured position of an obstacle;
ground detection means for detecting, from the measurement data, a ground point, which is data representing a measured position of the ground;
curb determination means for determining, based on a travelable area of a moving body determined at a processing time immediately before a current processing time, that the ground point corresponding to a curb is the obstacle point; and
travelable area determination means for determining the travelable area at the current processing time based on the obstacle point and the ground point. - The information processing device according to claim 1, wherein the curb determination means sets a temporary travelable area at the current processing time based on the travelable area at the immediately preceding processing time, and determines, based on the temporary travelable area, that the ground point corresponding to the curb is the obstacle point.
- The information processing device according to claim 2, wherein the curb determination means corrects the ground point existing near the boundary position of the temporary travelable area to the obstacle point.
- The information processing device according to claim 2 or 3, wherein the curb determination means sets the temporary travelable area based on the travelable area at the immediately preceding processing time and movement information of the measurement device.
- The information processing device according to any one of claims 2 to 4, further comprising ground point correction means for correcting the obstacle point existing in the temporary travelable area to the ground point.
- The information processing device according to any one of claims 1 to 5, wherein the curb determination means determines, based on continuity in the direction in which the road extends, that the ground point corresponding to the curb is the obstacle point.
- The information processing device according to any one of claims 1 to 6, wherein the travelable area determination means determines the entire travelable area at the current processing time by extending the travelable area within a predetermined distance at the current processing time, the travelable area being determined based on the obstacle point and the ground point.
- The information processing device according to any one of claims 1 to 7, wherein the obstacle is an object existing near a road boundary.
- The information processing device according to claim 8, wherein the object is at least one of a curb, vegetation, a roadside object, and a fallen object near a road boundary.
- A control method executed by an information processing device, the method comprising:
acquiring measurement data output by a measurement device;
detecting, from the measurement data, an obstacle point, which is data representing a measured position of an obstacle;
detecting, from the measurement data, a ground point, which is data representing a measured position of the ground;
determining, based on a travelable area of a moving body determined at a processing time immediately before a current processing time, that the ground point corresponding to a curb is the obstacle point; and
determining the travelable area at the current processing time based on the obstacle point and the ground point. - A program causing a computer to execute processing of: acquiring measurement data output by a measurement device; detecting, from the measurement data, an obstacle point, which is data representing a measured position of an obstacle; detecting, from the measurement data, a ground point, which is data representing a measured position of the ground; determining, based on a travelable area of a moving body determined at a processing time immediately before a current processing time, that the ground point corresponding to a curb is the obstacle point; and determining the travelable area at the current processing time based on the obstacle point and the ground point. - A storage medium storing the program according to claim 11.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280090621.6A CN118696252A (zh) | 2022-02-10 | 2022-02-10 | 信息处理装置、控制方法、程序以及存储介质 |
PCT/JP2022/005353 WO2023152871A1 (ja) | 2022-02-10 | 2022-02-10 | 情報処理装置、制御方法、プログラム及び記憶媒体 |
JP2023579953A JPWO2023152871A1 (ja) | 2022-02-10 | 2022-02-10 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2022/005353 WO2023152871A1 (ja) | 2022-02-10 | 2022-02-10 | 情報処理装置、制御方法、プログラム及び記憶媒体 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023152871A1 true WO2023152871A1 (ja) | 2023-08-17 |
Family
ID=87563943
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/005353 WO2023152871A1 (ja) | 2022-02-10 | 2022-02-10 | 情報処理装置、制御方法、プログラム及び記憶媒体 |
Country Status (3)
Country | Link |
---|---|
JP (1) | JPWO2023152871A1 (ja) |
CN (1) | CN118696252A (ja) |
WO (1) | WO2023152871A1 (ja) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008082750A (ja) | 2006-09-26 | 2008-04-10 | Toyota Motor Corp | 前方車両認識装置 |
JP2011028659A (ja) * | 2009-07-29 | 2011-02-10 | Hitachi Automotive Systems Ltd | 道路形状認識装置 |
JP2016045507A (ja) * | 2014-08-19 | 2016-04-04 | 日野自動車株式会社 | 運転支援システム |
US20160131762A1 (en) * | 2014-11-07 | 2016-05-12 | Hyundai Mobis Co., Ltd. | Apparatus and method for determining available driving space |
CN109254289A (zh) * | 2018-11-01 | 2019-01-22 | 百度在线网络技术(北京)有限公司 | 道路护栏的检测方法和检测设备 |
JP2019027973A (ja) * | 2017-08-01 | 2019-02-21 | 株式会社デンソー | 物標判定装置 |
JP2019106022A (ja) * | 2017-12-13 | 2019-06-27 | 株式会社Soken | 路側物認識装置 |
JP2020149079A (ja) * | 2019-03-11 | 2020-09-17 | 本田技研工業株式会社 | 路面検出装置 |
-
2022
- 2022-02-10 JP JP2023579953A patent/JPWO2023152871A1/ja active Pending
- 2022-02-10 CN CN202280090621.6A patent/CN118696252A/zh active Pending
- 2022-02-10 WO PCT/JP2022/005353 patent/WO2023152871A1/ja active Application Filing
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008082750A (ja) | 2006-09-26 | 2008-04-10 | Toyota Motor Corp | 前方車両認識装置 |
JP2011028659A (ja) * | 2009-07-29 | 2011-02-10 | Hitachi Automotive Systems Ltd | 道路形状認識装置 |
JP2016045507A (ja) * | 2014-08-19 | 2016-04-04 | 日野自動車株式会社 | 運転支援システム |
US20160131762A1 (en) * | 2014-11-07 | 2016-05-12 | Hyundai Mobis Co., Ltd. | Apparatus and method for determining available driving space |
JP2019027973A (ja) * | 2017-08-01 | 2019-02-21 | 株式会社デンソー | 物標判定装置 |
JP2019106022A (ja) * | 2017-12-13 | 2019-06-27 | 株式会社Soken | 路側物認識装置 |
CN109254289A (zh) * | 2018-11-01 | 2019-01-22 | 百度在线网络技术(北京)有限公司 | 道路护栏的检测方法和检测设备 |
JP2020149079A (ja) * | 2019-03-11 | 2020-09-17 | 本田技研工業株式会社 | 路面検出装置 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2023152871A1 (ja) | 2023-08-17 |
CN118696252A (zh) | 2024-09-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8724094B2 (en) | Apparatus and method of recognizing presence of objects | |
JP4428208B2 (ja) | 車両用物体認識装置 | |
JP4793094B2 (ja) | 走行環境認識装置 | |
US7136753B2 (en) | Object recognition apparatus for vehicle, inter-vehicle control apparatus, and distance measurement apparatus | |
US8810445B2 (en) | Method and apparatus for recognizing presence of objects | |
JP5145986B2 (ja) | 物体検出装置及び測距方法 | |
JP6505470B2 (ja) | ノイズ除去方法および物体認識装置 | |
JP5488518B2 (ja) | 道路端検出装置、運転者支援装置、および道路端検出方法 | |
JP2008026997A (ja) | 歩行者認識装置及び歩行者認識方法 | |
JP3736521B2 (ja) | 車両用物体認識装置 | |
JP3736520B2 (ja) | 車両用障害物認識装置 | |
JP2005291788A (ja) | 車両用物体認識装置 | |
JP7095640B2 (ja) | 物体検出装置 | |
JP2019105654A (ja) | ノイズ除去方法および物体認識装置 | |
JP2024072823A (ja) | 情報処理装置、制御方法、プログラム及び記憶媒体 | |
JP5556317B2 (ja) | 物体認識装置 | |
JP2004184332A (ja) | 車両用物体認識装置及び車間制御装置 | |
WO2023152871A1 (ja) | 情報処理装置、制御方法、プログラム及び記憶媒体 | |
JP4878483B2 (ja) | レーダ装置 | |
WO2022190364A1 (ja) | 情報処理装置、情報処理方法、プログラム及び記憶媒体 | |
JP2022139739A (ja) | 情報処理装置、情報処理方法、プログラム及び記憶媒体 | |
JP2023064482A (ja) | 情報処理装置、制御方法、プログラム及び記憶媒体 | |
WO2023152870A1 (ja) | 情報処理装置、制御方法、プログラム及び記憶媒体 | |
JPH05113482A (ja) | 車載用追突防止装置 | |
WO2022186099A1 (ja) | 情報処理装置、情報処理方法、プログラム及び記憶媒体 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22925892 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2023579953 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022925892 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2022925892 Country of ref document: EP Effective date: 20240910 |