WO2023025983A1 - A device for assisted steering - Google Patents

A device for assisted steering

Info

Publication number
WO2023025983A1
Authority
WO
WIPO (PCT)
Prior art keywords
points
point
control unit
group
vehicle
Prior art date
Application number
PCT/FI2022/050537
Other languages
French (fr)
Inventor
Pasi PYYKÖNEN
Matti Kutila
Mikko Tarkiainen
Original Assignee
Teknologian Tutkimuskeskus Vtt Oy
Priority date
Filing date
Publication date
Application filed by Teknologian Tutkimuskeskus Vtt Oy
Publication of WO2023025983A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/42 Simultaneous measurement of distance and other co-ordinates
    • G01S17/87 Combinations of systems using electromagnetic waves other than radio waves
    • G01S17/875 Combinations of systems using electromagnetic waves other than radio waves for determining attitude
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/003 Transmission of data between radar, sonar or lidar systems and remote stations

Definitions

  • Figure 3 shows the same vehicle setup as Figure 1 but in an operational situation where the vehicle 100 is a bus that stands or runs on a supporting surface 102 that is tilted with respect to a plane perpendicular to the z-direction.
  • The hydraulic system of the bus maintains the body of the bus in an upright position even if the road is tilted. In addition, minor unevenness of the road may wobble the body, not much but quite unpredictably. The vehicle becomes correspondingly tilted with respect to the supporting surface 102, so that many of the criteria applied to analyse the point cloud in the situation of Figure 1 are no longer applicable in the situation of Figure 3. This is specifically the case for analyses that need greater accuracy, like positioning the bus accurately beside a curb, or driving the bus under a charging station.
  • To alleviate this, the control unit determines from the point cloud at least one reference area and uses information from that reference area to compensate the effect of the tilt of the vehicle before any criterion for object detection is applied.
  • Figure 4 illustrates example positions for the reference areas. For many applications, a required accuracy can be achieved through one reference area, like Area 1, but by use of additional reference areas, accuracy can be improved.
  • the control unit is configured to extract from the point cloud at least a first group of points that represent at least one selected region of a surface that supports the vehicle.
  • Tilt of the vehicle refers herein to tilt of the vehicle with respect to the supporting surface, and it can then be determined based on coordinates of the points in the first group of points. With the information on the tilt of the vehicle, the effect of this tilt can be compensated from the point cloud before any criterion for object detection is applied.
  • The control unit may further be configured to extract from the point cloud a second group of points that represent another selected region of the surface that supports the vehicle. Tilt of the vehicle can then be determined based on points in the first group of points and in the second group of points.
  • even more reference areas can be applied. Availability of reference areas depends naturally on the extent of the field of view of the scanning unit. For example, if necessary, additional scanning units may be positioned in different parts of the body of the vehicle and be jointly connected to the control unit to provide reference areas and their respective groups of points to the control unit so that they can be used jointly for determination of the tilt of the vehicle. As shown in Figure 4, one reference area may be, for example, in front of the vehicle and another reference area on the back of the vehicle. Other reference area configurations may also be used without deviating from the scope.
  • As an example, let us look in more detail at the case of Figures 1 and 3, and one reference area (Area 1 of Figure 4). If we assume that the scanning unit is fixed to the front of the bus, as shown in Figure 4, and one coordinate unit corresponds with 1 metre, an example condition for a first group of points could be {(0 ≤ x_gi ≤ 0.4) ∧ (0 ≤ y_gi ≤ 0.2)}, wherein x_gi represents x-coordinates and y_gi represents y-coordinates of the first group of points, in other words points of the point cloud included in the selected reference area.
  • Figures 4 and 5 illustrate the reference area 110 corresponding to a first group of points in this example in two different views.
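Extraction of the first group of points amounts to a bounding-box filter over the cloud. The following is a minimal sketch in plain C++ with the bounds of the example condition above; the struct and function names are illustrative, and a library filter such as PCL's pcl::PassThrough could be used for the same purpose.

```cpp
#include <vector>

// A point of the cloud, coordinates in metres relative to the
// scanning unit (the point of origin).
struct Point {
    double x, y, z;
};

// Extract the first group of points: points whose x- and
// y-coordinates fall inside the selected reference area,
// here 0 <= x <= 0.4 and 0 <= y <= 0.2 as in the example.
std::vector<Point> extract_reference_area(const std::vector<Point>& cloud) {
    std::vector<Point> group;
    for (const Point& p : cloud) {
        if (p.x >= 0.0 && p.x <= 0.4 && p.y >= 0.0 && p.y <= 0.2) {
            group.push_back(p);
        }
    }
    return group;
}
```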
  • the control unit is then configured to determine a tilt of the vehicle from said first group of points.
  • the determination may be based on one calculation, or several calculations may be combined for more accuracy.
  • A most straightforward method would be to select any two points from the reference area, i.e. two points of the supporting surface, here the road, and determine the tilt of the vehicle based on their x- and z-coordinates.
  • The control unit may be configured to determine an average point p_ave that represents an average of vertical coordinates of the points in the first group of points, and a minimum point p_min that represents a minimum of vertical coordinates of the points in the first group of points.
  • The computations may also be implemented in parallel to provide a way to check and ensure that the computed tilt angle is correct. It should be noted that these points are advantageous examples that are easily implemented with known library functions of available software systems. Other selected points and/or other averaging methods, well known to a person skilled in the art, may be applied without deviating from the scope of protection.
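The aggregate points are easily computed. The sketch below takes p_ave and p_min over the z-coordinates as stated; since the text does not spell out how they are combined into the angle, the two-point form of the straightforward method above is used for the tilt (an assumption, with illustrative names):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct Point {
    double x, y, z;
};

// Average of the vertical (z) coordinates of a group of points.
double average_z(const std::vector<Point>& group) {
    double sum = 0.0;
    for (const Point& p : group) sum += p.z;
    return sum / static_cast<double>(group.size());
}

// Minimum of the vertical (z) coordinates of a group of points.
double minimum_z(const std::vector<Point>& group) {
    double m = group.front().z;
    for (const Point& p : group) m = std::min(m, p.z);
    return m;
}

// Tilt angle from the x- and z-coordinates of two points of the
// supporting surface: the slope of the line through them.
double tilt_angle(const Point& a, const Point& b) {
    return std::atan2(b.z - a.z, b.x - a.x);
}
```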
  • the determinations are implemented in the control unit with coded programs, advantageously invoking library software available in the specific software system applied by the control unit.
  • Most code used by modern applications is readily provided in these system libraries.
  • The choice of the applied library depends on a diverse range of requirements, such as desired features, ease of the API, portability or platform/compiler dependence (e.g. Linux, Windows, Visual C++, GCC), and performance in speed, to name some.
  • An applicable example of such a library is the Point Cloud Library (PCL).
  • When the tilt angle α is determined, the control unit is configured to apply it to create a compensated point cloud where coordinates of the points are adjusted by compensating the effect of the tilt of the vehicle.
  • the effect of the tilt of the vehicle may be compensated by rotating coordinates of the points of the point cloud.
  • the control unit may be configured to apply a library function to rotate the coordinates or it may be configured with written code to implement the rotation.
  • Eigen is a high-level C++ library of template headers for linear algebra, matrix and vector operations, geometrical transformations, numerical solvers and related algorithms. With Eigen, the operation could be implemented with the function Affine3f::rotate.
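A dependency-free sketch of the same rotation is given below, assuming the tilt angle α was determined in the x-z plane so that the compensation is a rotation about the y-axis; with Eigen, the equivalent transform could be built from Affine3f and AngleAxisf. The names are illustrative.

```cpp
#include <cmath>
#include <vector>

struct Point {
    double x, y, z;
};

// Create a compensated point cloud: rotate every point by -alpha
// about the y-axis so that the measured tilt of the supporting
// surface is cancelled and surface points end up near z = 0.
std::vector<Point> compensate_tilt(const std::vector<Point>& cloud, double alpha) {
    const double c = std::cos(alpha);
    const double s = std::sin(alpha);
    std::vector<Point> out;
    out.reserve(cloud.size());
    for (const Point& p : cloud) {
        // 2-D rotation by -alpha in the x-z plane; y is unchanged.
        out.push_back({c * p.x + s * p.z, p.y, -s * p.x + c * p.z});
    }
    return out;
}
```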
  • the control unit is then configured to detect the one or more objects of interest from the compensated point cloud.
  • the control unit is configured to apply one or more defined criteria to the compensated point cloud to extract from the point cloud points of interest that correspond with the one or more objects of interest in the field of view of the scanning unit.
  • The PCL framework contains algorithms including filtering, feature estimation, surface reconstruction, registration, model fitting and segmentation. These algorithms can be used, for example, to filter outliers from noisy data, stitch 3D point clouds together, segment relevant parts of a scene, extract key points and compute descriptors to recognize objects in the world based on their geometric appearance, create surfaces from point clouds and visualize them (https://pointclouds.org/documentation/index.html).
  • the flow chart of Figure 7 illustrates steps of a method implemented in the control unit 106 of the device described with Figures 1 to 6.
  • the method begins by the control unit receiving (stage 700) from the scanning unit a point cloud, which has a field of view around a point of origin.
  • the point of origin is located in the scanning unit that is fixed to a vehicle.
  • the control unit extracts (stage 702) from the point cloud at least a first group of points that represent at least one selected region (REF) of a surface supporting the vehicle.
  • The control unit selects (stage 704) at least two points from the first group of points, and determines (stage 706) tilt of the vehicle based on coordinates of the at least two points in said first group of points.
  • the control unit creates (stage 708) a compensated point cloud (cPC) where coordinates of the points are adjusted by compensating the effect of the tilt.
  • One or more objects of interest can then be detected (stage 710) from the compensated point cloud.
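The stages above can be strung together into one sketch. The detection criterion at stage 710 (a plain height threshold) and the choice of the two points for the tilt are illustrative stand-ins, not the claimed method:

```cpp
#include <cmath>
#include <vector>

struct Point {
    double x, y, z;
};

// Stage 702: extract the first group of points, i.e. points inside
// the selected reference area (bounds from the earlier example).
std::vector<Point> reference_group(const std::vector<Point>& cloud) {
    std::vector<Point> g;
    for (const Point& p : cloud)
        if (p.x >= 0.0 && p.x <= 0.4 && p.y >= 0.0 && p.y <= 0.2) g.push_back(p);
    return g;
}

// Stages 704-706: determine the tilt from two points of the group
// (here simply the first and last point).
double tilt(const std::vector<Point>& g) {
    const Point& a = g.front();
    const Point& b = g.back();
    return std::atan2(b.z - a.z, b.x - a.x);
}

// Stage 708: create the compensated point cloud (cPC) by rotating
// every point by -alpha about the y-axis.
std::vector<Point> compensate(const std::vector<Point>& cloud, double alpha) {
    const double c = std::cos(alpha), s = std::sin(alpha);
    std::vector<Point> out;
    for (const Point& p : cloud)
        out.push_back({c * p.x + s * p.z, p.y, -s * p.x + c * p.z});
    return out;
}

// Stage 710: detect points of interest from the cPC; a height
// threshold stands in for a real detection criterion.
std::vector<Point> detect(const std::vector<Point>& cPC, double min_z) {
    std::vector<Point> hits;
    for (const Point& p : cPC)
        if (p.z > min_z) hits.push_back(p);
    return hits;
}
```

With a cloud containing two road points and one higher point (e.g. a curb edge), the road points collapse to z ≈ 0 after compensation and only the higher point survives the threshold.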

Abstract

A device that includes a scanning unit and a control unit communicatively connected to each other. The scanning unit is configured to be fixed to a vehicle and generate a point cloud that has a field of view around a point of origin located in the scanning unit. The control unit is configured to receive the point cloud and extract from it at least a first group of points that represent at least one selected region of a surface supporting the vehicle. The control unit is further configured to determine a tilt of the vehicle from said first group of points and create a compensated point cloud where coordinates of the points are adjusted by compensating the effect of the tilt. The control unit is configured to detect one or more objects of interest for steering functions from the compensated point cloud.

Description

A DEVICE FOR ASSISTED STEERING
FIELD
The disclosure relates to assisted steering, and particularly to use of point clouds in assisted steering functions.
BACKGROUND
The term vehicle refers in general to means of carrying or transporting people or goods. In its broadest scope, the term includes watercraft, amphibious vehicles, aircraft and spacecraft, but most often the term refers to a land vehicle, a piece of mechanized equipment that applies steering and drive forces against the ground. The contact to the ground may be implemented through wheels, tracks, rails or skis, for example. Driving of a vehicle means controlling operation and movement of the vehicle, and in the early days, vehicles were practically fully controlled by human drivers. Through the development of mechatronics, artificial intelligence and multi-agent systems, more and more of the monitoring, agency and action functions have been transferred to be implemented by automated driving systems, even if the human remains responsible for the vehicle's performance as the operator.
One important function of vehicular automation is to enable a driver (human or automated) to steer a vehicle accurately relative to a physical landmark. Conventional satellite-based navigation systems are applicable for some steering functions, but their accuracy is typically not sufficient for steering functions where tolerances are in the order of twenty centimetres or less. Examples of such steering functions include driving a bus to the exact vicinity of a curb, or positioning a bus to an exact position under a charging station.
For more accurate operations, special-purpose machines have been conventionally equipped with an inertial measurement unit (IMU) that includes a combination of accelerometers, gyroscopes and/or magnetometers. However, IMUs are integrated systems where the sensors are jointly adapted for a specific type of a vehicle and for a defined set of steering functions. Due to this, they are quite expensive and not applicable for general use in devices that can be installed in various types of land vehicles, or even retrofitted to an existing car, truck or bus.
Another conventionally applicable vehicle control and navigation method is to use a lidar, a measuring system that detects and locates objects on the same principle as a radar but emits pulsed laser light instead of microwaves. Lidar generates a point cloud that represents measured distances to surrounding objects. However, a lidar needs to be fixed to a vehicle, and a vehicle does not provide a static frame of reference for the point cloud. Land vehicles include various suspension systems and active mechanisms that add to passenger comfort and assist passenger entry and exit. Most of these systems operate independently of each other or are dynamically controlled by the driver. These unpredictable tilting and swinging motions of the body of the vehicle introduce distortion that prevents use of point clouds for steering functions that require higher accuracy than what is achievable with satellite-based navigation systems.
BRIEF DESCRIPTION
An object of the present disclosure is to provide a device and a method to alleviate at least some of the above described challenges in assisted steering functions.
This object is achieved with a device and a method, which are characterized by what is stated in the independent claims. Some exemplary embodiments of the disclosure are disclosed in the dependent claims.
The following examples are based on the idea of determining the tilt of a surface that supports a vehicle based on its point cloud coordinates and using the tilt to compensate the effect of the tilt in the point cloud coordinates before using them to detect objects in the vicinity of the vehicle.
BRIEF DESCRIPTION OF THE DRAWINGS
In the following the disclosure will be described in greater detail by means of preferred embodiments with reference to the accompanying drawings, in which
Figure 1 illustrates some basic elements and a vehicle setup for a device applicable in assisted steering;
Figure 2 illustrates components of a control unit of the device;
Figure 3 shows an example of the vehicle setup of Figure 1 in an operational situation;
Figure 4 illustrates example positions for applicable reference areas;
Figure 5 illustrates a reference area of Figure 4 in another view;
Figure 6 illustrates the principle for determining the tilt of the supporting surface; and
Figure 7 illustrates steps of a method implemented in the control unit of the device.
DETAILED DESCRIPTION
The block chart of Figure 1 illustrates some basic elements necessary to understand the following examples of the invention. Figure 1 shows a setup of an initial situation where a vehicle 100 stands or runs on a supporting surface 102. The supporting surface 102 is even, i.e. it is essentially perpendicular to the direction of gravitation so that the vehicle presses against the supporting surface and suspension systems of the vehicle can maintain a neutral mode wherein the body remains or runs without tilting or swinging on the supporting surface 102. A scanning unit 104 is fixed to the vehicle and is configured to generate a point cloud that represents measured distances to objects around the scanning unit. The scanning unit emits pulses of light and measures the time before the reflected light pulses are seen by the detector. Since the speed of light is known, the round-trip time determines the travel distance of a light pulse, which is twice the distance between the scanning unit and a point of reflection. The scanning unit 104 thus uses emitted and detected light pulses to generate a point cloud, wherein each point has its set of Cartesian coordinates (x, y, z). These coordinates represent measured distances between the scanning unit and detected objects. Devices generating point clouds through 3-D scanning are widely known and commercially available, so their operation will not be described in more detail herein. Fixing of the scanning unit to the vehicle in this context means that during operation, the scanning unit does not move in relation to the body of the vehicle and thus provides a point of origin for the coordinates in the point cloud. This fixing can be made permanent, for example by welding or use of a permanent adhesive, or the scanning unit may be releasably fixed to the body of the vehicle with screws or by various locking/latching mechanisms.
The scanning unit 104 is communicatively coupled to a control unit 106 and is thus enabled to feed the point cloud data to the control unit. The control unit 106 processes point cloud information from at least one scanning unit 104, but a vehicle can be equipped with more than one scanning unit, and the control unit can process information from them in an integrated manner. The block chart of Figure 2 illustrates components of the control unit. The control unit 106 is a device that may comprise a processing component 210. The processing component 210 is a combination of one or more computing devices suitable for performing systematic execution of operations upon predefined data. The processing component may comprise one or more arithmetic logic units, special registers and control circuits. The processing component may comprise or may be connected to a memory unit 212 that provides a data medium where computer-readable data or programs, or user data can be stored. The memory unit may comprise one or more units of volatile or non-volatile memory, for example EEPROM, ROM, PROM, RAM, DRAM, SRAM, firmware, programmable logic, etc. The control unit 106 may also comprise, or be connected to, an interface unit 214 that comprises at least one input unit for inputting data to the internal processes of the control unit, and at least one output unit for outputting data from the internal processes of the control unit.
If a line interface is applied, the interface unit 214 typically comprises plug-in units acting as a gateway for information delivered to its external connection points and for information fed to the lines connected to its external connection points. If a radio interface is applied, the interface unit 214 typically comprises a radio transceiver unit, which includes a transmitter and a receiver. A transmitter of the radio transceiver unit may receive a bitstream from the processing component 210 and convert it to a radio signal for transmission by an antenna. Correspondingly, the radio signals received by the antenna may be led to a receiver of the radio transceiver unit, which converts the radio signal into a bitstream that is forwarded for further processing to the processing component 210. Different line or radio interfaces may be implemented in one interface unit.
The interface unit 214 may also comprise a user interface with a keypad, a touch screen, a microphone, or the like for inputting data, and a screen, a touch screen, a loudspeaker, or the like for outputting data to the manual or automated driver of the vehicle. The output data may include, for example, information on an object of interest that has been detected by the scanning unit in a form applicable for a manual or automated steering function of the vehicle.
The memory unit 212, the processing component 210 and the interface unit 214 are electrically interconnected to provide means for performing systematic execution of operations on the received and/or stored data according to predefined, essentially programmed processes. These operations comprise the procedures described herein for the control unit 106 of the device of Figure 1. The point cloud typically includes a limited set of points that in combination provide a field of view covering a predefined region around the scanning unit. In the field of view, some specific object is relevant and needs to be detected for a steering function. The control unit 106 is thus configured to apply one or more defined criteria to the point cloud and thereby extract from the point cloud one or more points of interest that correspond with one or more objects of interest in the vicinity of the vehicle.
Figure 1 illustrates the notation of the coordinate system applied in this description. The point of origin is the scanning unit; a direction parallel to the direction of gravitation is considered as the vertical direction and is denoted as the z-direction. A direction parallel to the direction in which the vehicle moves is considered as a first horizontal direction and is denoted as the y-direction. A second horizontal direction is orthogonal to the vertical direction and the first horizontal direction and is denoted as the x-direction. As an example, let us consider a situation where a driver wants to stop a bus exactly at a defined distance from the curb. For this, the control unit needs to apply a criterion that extracts from the point cloud a set of points of interest that most likely include points that correspond to the edge of the curb. A problem in the use of such a criterion is, however, illustrated with Figure 3.
Figure 3 shows the same vehicle setup as Figure 1, but in an operational situation where the vehicle 100 is a bus that stands or runs on a supporting surface 102 that is tilted with respect to a plane perpendicular to the z-direction. For passenger comfort, the hydraulic system of the bus maintains the body of the bus in an upright position even if the road is tilted. In addition, during driving, minor unevenness of the road may further wobble the body, not much but quite unpredictably. As can be understood from the drawing, the vehicle becomes correspondingly tilted with respect to the supporting surface 102, so that many of the criteria applied to analyse the point cloud in the situation of Figure 1 are no longer applicable in the situation of Figure 3. This is specifically the case for analyses that need greater accuracy, like positioning the bus accurately beside a curb or driving the bus under a charging station.
In the following examples, this problem is overcome by configuring the control unit to implement a stage where the control unit determines from the point cloud at least one reference area and uses information from that reference area to compensate the effect of the tilt of the vehicle before any criterion for object detection is applied. Figure 4 illustrates example positions for the reference areas. For many applications, a required accuracy can be achieved through one reference area, like Area 1, but by use of additional reference areas, accuracy can be improved. For a reference area, the control unit is configured to extract from the point cloud at least a first group of points that represent at least one selected region of a surface that supports the vehicle. Tilt of the vehicle refers herein to tilt of the vehicle with respect to the supporting surface, and it can then be determined based on coordinates of the points in the first group of points. With the information on the tilt of the vehicle, the effect of this tilt can be compensated from the point cloud before any criterion for object detection is applied.
For added accuracy, the control unit may be configured to extract from the point cloud a second group of points that represent another selected region of the surface that supports the vehicle. Tilt of the vehicle can then be determined based on points in the first group of points and in the second group of points. As shown in Figure 4, even more reference areas can be applied. Availability of reference areas depends naturally on the extent of the field of view of the scanning unit. For example, if necessary, additional scanning units may be positioned in different parts of the body of the vehicle and be jointly connected to the control unit to provide reference areas and their respective groups of points to the control unit so that they can be used jointly for determination of the tilt of the vehicle. As shown in Figure 4, one reference area may be, for example, in front of the vehicle and another reference area on the back of the vehicle. Other reference area configurations may also be used without deviating from the scope.
As an example, let us look in more detail at the case of Figures 1 and 3, and one reference area (Area 1 of Figure 4). If we assume that the scanning unit is fixed to the front of the bus, as shown in Figure 4, and one coordinate unit corresponds to 1 metre, an example condition for a first group of points could be {0 < xg1 < 0.4 ∧ 0 < yg1 < 0.2}, wherein xg1 represents x-coordinates and yg1 represents y-coordinates of the points of the point cloud included in the selected reference area. Figures 4 and 5 illustrate the reference area 110 corresponding to a first group of points in this example in two different views.
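The region condition above can be sketched in code. The following snippet is an illustrative sketch only, not the implementation of the patent (which names no code for this step): points are modelled as plain (x, y, z) tuples, and the bounds 0.4 and 0.2 come from the example condition.

```python
def extract_first_group(cloud, x_max=0.4, y_max=0.2):
    """Select the first group of points: points of the cloud whose
    x- and y-coordinates fall inside the selected reference area
    0 < x < x_max and 0 < y < y_max (one coordinate unit = 1 metre)."""
    return [(x, y, z) for (x, y, z) in cloud
            if 0.0 < x < x_max and 0.0 < y < y_max]

# Points as (x, y, z) tuples; only the first two lie inside the area.
cloud = [(0.1, 0.10, -1.95), (0.3, 0.05, -1.90),
         (0.5, 0.10, -2.05), (0.2, 0.30, -2.00)]
group1 = extract_first_group(cloud)
```

In practice, this selection would be performed with the filtering routines of a point cloud library rather than a list comprehension, but the condition itself is the same.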
The control unit is then configured to determine a tilt of the vehicle from said first group of points. The determination may be based on one calculation, or several calculations may be combined for more accuracy. A most straightforward method would be to select any two points from the reference area and determine the tilt of the vehicle based on their x- and z-coordinates. However, as the supporting surface (here the road) may have dents and bumps, a more accurate result can be achieved by means of averaging. The control unit may be configured to determine an average point pave that represents an average of vertical coordinates of the points in the first group of points, and a minimum point pmin that represents a minimum of vertical coordinates of the points in the first group of points. As shown in Figure 6, the tilt of the vehicle (angle a) may then be determined from x- and z-coordinates of the average point and the minimum point as:

a = arctan((zave - zmin) / (xave - xmin))
Alternatively, the control unit may be configured to compute a maximum point pmax that represents a maximum of vertical coordinates of the points in the first group of points, and a minimum point pmin that represents a minimum of vertical coordinates of the points in the first group of points, and the tilt of the vehicle (angle a) may then be determined from x- and z-coordinates of the maximum point and the minimum point as:

a = arctan((zmax - zmin) / (xmax - xmin))

The computations may also be implemented in parallel to provide a way to check and ensure that the computed tilt angle is correct. It should be noted that these points are advantageous examples that are easily implemented with known library functions of available software systems. Other selected points and/or other averaging methods, well known to a person skilled in the art, may be applied without deviating from the scope of protection.
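Both determinations can be sketched as follows. This is an illustrative outline only (the description suggests library functions for the actual implementation); points are (x, y, z) tuples with z as the vertical coordinate, and the degenerate case where the reference x-coordinates coincide is assumed not to occur.

```python
import math

def tilt_average_min(points):
    """Tilt angle a = arctan((zave - zmin) / (xave - xmin)),
    from the average point and the minimum-z point of the group."""
    x_ave = sum(p[0] for p in points) / len(points)
    z_ave = sum(p[2] for p in points) / len(points)
    x_min, _, z_min = min(points, key=lambda p: p[2])
    return math.atan((z_ave - z_min) / (x_ave - x_min))

def tilt_max_min(points):
    """Tilt angle a = arctan((zmax - zmin) / (xmax - xmin)),
    from the maximum-z and minimum-z points of the group."""
    x_max, _, z_max = max(points, key=lambda p: p[2])
    x_min, _, z_min = min(points, key=lambda p: p[2])
    return math.atan((z_max - z_min) / (x_max - x_min))

# Reference points lying on a surface tilted by 45 degrees:
group = [(0.0, 0.1, 0.0), (0.2, 0.1, 0.2), (0.4, 0.1, 0.4)]
a1 = tilt_average_min(group)  # pi/4 radians
a2 = tilt_max_min(group)      # pi/4 radians
```

Running the two estimators in parallel, as the text proposes, gives a simple consistency check: on an even surface the two angles agree, and a large discrepancy indicates dents, bumps or outlier points in the reference area.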
The determinations are implemented in the control unit with coded programs, advantageously invoking library software available in the specific software system applied by the control unit. Much of the required functionality is readily provided in such system libraries. The choice of the applied library depends on a diverse range of requirements, such as the desired features, ease of use of the API, portability or platform/compiler dependence (for example Linux, Windows, Visual C++, GCC), and runtime performance, to name some. For example, the Point Cloud Library (PCL) is a large-scale open project for point cloud processing.
When the tilt angle a is determined, the control unit is configured to apply it to create a compensated point cloud where coordinates of the points are adjusted by compensating the effect of the tilt of the vehicle. The effect of the tilt of the vehicle may be compensated by rotating coordinates of the points of the point cloud. The control unit may be configured to apply a library function to rotate the coordinates, or it may be configured with written code to implement the rotation. For example, Eigen is a high-level C++ library of template headers for linear algebra, matrix and vector operations, geometrical transformations, numerical solvers and related algorithms. With Eigen, the operation could be implemented with the function Affine3f::rotate (https://eigen.tuxfamily.org/dox/group__TutorialGeometry.html).
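For illustration, the rotation can also be written out directly without a library. The sketch below is an assumed stand-in for the Eigen-based rotation the text refers to: it rotates each point by -a about the y-axis (i.e. in the x-z plane), so that a supporting surface tilted by angle a becomes level.

```python
import math

def compensate_tilt(cloud, alpha):
    """Create a compensated point cloud: rotate every (x, y, z)
    point by -alpha about the y-axis, cancelling the tilt alpha."""
    c, s = math.cos(-alpha), math.sin(-alpha)
    return [(c * x - s * z, y, s * x + c * z) for (x, y, z) in cloud]

# Points on a 45-degree slope flatten onto the z = 0 plane:
tilted = [(0.0, 0.1, 0.0), (0.2, 0.1, 0.2), (0.4, 0.1, 0.4)]
level = compensate_tilt(tilted, math.pi / 4)
```

In a production control unit, the same transformation would be one matrix-vector multiplication per point, applied with whatever linear algebra library the software system provides.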
The control unit is then configured to detect the one or more objects of interest from the compensated point cloud. Typically, the control unit is configured to apply one or more defined criteria to the compensated point cloud to extract from the point cloud points of interest that correspond with the one or more objects of interest in the field of view of the scanning unit. For example, the PCL framework contains algorithms including filtering, feature estimation, surface reconstruction, registration, model fitting and segmentation. These algorithms can be used, for example, to filter outliers from noisy data, stitch 3D point clouds together, segment relevant parts of a scene, extract key points and compute descriptors to recognize objects in the world based on their geometric appearance, create surfaces from point clouds and visualize them (https://pointclouds.org/documentation/index.html). Naturally, the software libraries and other programming tools mentioned in the above description are examples only. For a person skilled in the art it is clear that a range of specially coded or commercially available products may be used to determine the tilt angle of a supporting surface based on its point cloud coordinates, and to use the tilt angle to compensate the effect of the tilt in coordinates of points before detecting objects in the scanned field of view of the point cloud.
The flow chart of Figure 7 illustrates steps of a method implemented in the control unit 106 of the device described with Figures 1 to 6. The method begins by the control unit receiving (stage 700) from the scanning unit a point cloud, which has a field of view around a point of origin. The point of origin is located in the scanning unit that is fixed to a vehicle. The control unit extracts (stage 702) from the point cloud at least a first group of points that represent at least one selected region (REF) of a surface supporting the vehicle. The control unit selects (stage 704) at least two points from the first group of points, and determines (stage 706) tilt of the vehicle based on coordinates of the at least two points in said first group of points. The control unit creates (stage 708) a compensated point cloud (cPC) where coordinates of the points are adjusted by compensating the effect of the tilt.
One or more objects of interest can then be detected (710) from the compensated point cloud.
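The stages above can be combined into one sketch. As before, this is an illustrative outline under assumed conventions (tuple-based points, the example reference area, average/minimum tilt estimation), not the implementation of the patent; a real control unit would invoke the library routines discussed earlier.

```python
import math

def compensate_point_cloud(cloud, x_max=0.4, y_max=0.2):
    """Stages 702-708: extract the first group of points, determine
    the tilt from the average and minimum points, and rotate the
    whole cloud by -a to compensate the effect of the tilt."""
    # Stage 702: points inside the selected reference region (REF).
    group = [p for p in cloud if 0 < p[0] < x_max and 0 < p[1] < y_max]
    # Stages 704-706: tilt from the average point and the minimum point.
    x_ave = sum(p[0] for p in group) / len(group)
    z_ave = sum(p[2] for p in group) / len(group)
    x_min, _, z_min = min(group, key=lambda p: p[2])
    a = math.atan((z_ave - z_min) / (x_ave - x_min))
    # Stage 708: compensated point cloud (cPC), rotated by -a.
    c, s = math.cos(-a), math.sin(-a)
    return [(c * x - s * z, y, s * x + c * z) for (x, y, z) in cloud]

cloud = [(0.1, 0.1, 0.1), (0.3, 0.1, 0.3),  # reference-area points
         (2.0, 0.5, 2.0)]                   # a farther point of interest
compensated = compensate_point_cloud(cloud)
```

Stage 710, object detection, then operates on the compensated cloud, for example by applying the distance-to-curb criterion discussed with Figures 1 and 3.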

Claims

1. A device including a scanning unit and a control unit communicatively connected to each other, wherein the scanning unit is configured to be fixed to a vehicle and generate a point cloud having a field of view around a point of origin located in the scanning unit; the scanning unit is configured to feed the point cloud to the control unit; wherein the control unit is configured to extract from the point cloud at least a first group of points that represent a first selected region of a surface supporting the vehicle; the control unit is configured to select at least two points from the first group of points; the control unit is configured to determine a tilt of the vehicle based on coordinates of the at least two points in said first group of points; the control unit is configured to create a compensated point cloud where coordinates of the points are adjusted by compensating the effect of the tilt; the control unit is configured to detect one or more objects of interest from the compensated point cloud.
2. A device according to claim 1, characterized in that the control unit is configured to extract from the point cloud points of interest that fulfil at least one defined criterion and thus correspond with the one or more objects of interest in the field of view of the scanning unit; the control unit is configured to apply the points of interest for the detection of the one or more objects of interest.
3. The device according to claim 1 or 2, characterized in that the control unit is configured to use vertical coordinates to select the at least two points in the first group of points, wherein a vertical coordinate of each point in the first group of points indicates a distance from the point of origin to said point in a direction parallel to the direction of gravitation.
4. The device according to claim 3, characterized in that the control unit is configured to determine an average point representing an average of vertical coordinates of the points in the first group of points; determine a minimum point representing a minimum of vertical coordinates of the points in the first group of points; determine the tilt from coordinates of the average point and the minimum point.

5. The device according to claim 3, characterized in that the control unit is configured to determine an average point representing an average of vertical coordinates of the points in the first group of points; determine a maximum point representing a maximum of vertical coordinates of the points in the first group of points; determine the tilt from coordinates of the average point and the maximum point.

6. The device according to any of claims 1 to 5, characterized in that the control unit is configured to extract from the point cloud also a second group of points that represent a second selected region of the surface supporting the vehicle.

7. The device according to claim 6, characterized in that the first selected region is in front of the vehicle and the second selected region is on the back of the vehicle.

8. The device according to any of claims 1 to 7, characterized in that the control unit is configured to detect the one or more objects of interest in a direction that is orthogonal to the direction of gravitation and orthogonal to the direction in which the vehicle moves.

9. The device according to any of claims 1 to 7, characterized in that the control unit is configured to detect the one or more objects of interest in a direction in which the vehicle moves.
10. A method for a control unit communicatively connected to a scanning unit releasably fixed to a vehicle, the method comprising: receiving a point cloud having a field of view around a point of origin located in the scanning unit fixed to the vehicle; extracting from the point cloud at least a first group of points that represent a first selected region of a surface supporting the vehicle; selecting at least two points from the first group of points; determining a tilt of the vehicle based on coordinates of the at least two points in said first group of points; creating a compensated point cloud where coordinates of the points are adjusted by compensating the effect of the tilt; detecting one or more objects of interest from the compensated point cloud.

11. The method according to claim 10, characterized by extracting from the point cloud points of interest that fulfil at least one defined criterion and thus correspond with the one or more objects of interest in the field of view of the scanning unit; applying the points of interest for the detection of the one or more objects of interest.

12. The method according to claim 10 or 11, characterized by using vertical coordinates of the at least two points in the first group of points to determine the tilt of the vehicle, wherein a vertical coordinate of each point in the first group of points indicates a distance from the point of origin to the point in a direction parallel to the direction of gravitation.

13. The method according to claim 12, characterized by determining an average point representing an average of vertical coordinates of the points in the first group of points; determining a minimum point representing a minimum of vertical coordinates of the points in the first group of points; determining the tilt from coordinates of the average point and the minimum point.
14. The method according to claim 12, characterized by determining an average point representing an average of vertical coordinates of the points in the first group of points; determining a maximum point representing a maximum of vertical coordinates of the points in the first group of points; determining the tilt from coordinates of the average point and the maximum point.

15. A computer program product, comprising instructions which, when the program is executed by a computer operated as a control unit, cause the computer to carry out the steps of the method of any of claims 10 to 14.
PCT/FI2022/050537 2021-08-23 2022-08-17 A device for assisted steering WO2023025983A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FI20215882 2021-08-23
FI20215882A FI20215882A1 (en) 2021-08-23 2021-08-23 A device for assisted steering

Publications (1)

Publication Number Publication Date
WO2023025983A1 true WO2023025983A1 (en) 2023-03-02

Family

ID=83322556

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2022/050537 WO2023025983A1 (en) 2021-08-23 2022-08-17 A device for assisted steering

Country Status (2)

Country Link
FI (1) FI20215882A1 (en)
WO (1) WO2023025983A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9052721B1 (en) * 2012-08-28 2015-06-09 Google Inc. Method for correcting alignment of vehicle mounted laser scans with an elevation map for obstacle detection
DE102017105209A1 (en) * 2017-03-13 2018-09-13 Valeo Schalter Und Sensoren Gmbh Determination of inclination angles with a laser scanner
DE102019117312A1 (en) * 2019-06-27 2020-12-31 Valeo Schalter Und Sensoren Gmbh Determining a pitch angle position of an active optical sensor system
US20220260717A1 (en) * 2019-06-27 2022-08-18 Valeo Schalter Und Sensoren Gmbh Determining a pitch angle position of an active optical sensor system

Also Published As

Publication number Publication date
FI20215882A1 (en) 2023-02-24


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22770030

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE