SE545062C2 - Method and control arrangement for determining displacement of a vehicle sensor - Google Patents

Method and control arrangement for determining displacement of a vehicle sensor

Info

Publication number
SE545062C2
Authority
SE
Sweden
Application number
SE2051312A
Other languages
Swedish (sv)
Other versions
SE2051312A1 (en)
Inventor
Daniel Tenselius
Erik Johansson
Fredrich Claezon
Mattias Johasson
Original Assignee
Scania Cv Ab
Application filed by Scania Cv Ab filed Critical Scania Cv Ab
Priority to SE2051312A priority Critical patent/SE545062C2/en
Publication of SE2051312A1 publication Critical patent/SE2051312A1/en
Publication of SE545062C2 publication Critical patent/SE545062C2/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/10Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to vehicle motion
    • G05D1/43
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62DMOTOR VEHICLES; TRAILERS
    • B62D53/00Tractor-trailer combinations; Road trains
    • G05D2109/10

Abstract

The present disclosure relates to a method and a control arrangement (10) for determining displacement of a sensor in relation to a reference pose. A first aspect of the disclosure relates to a method for determining displacement of a first sensor (11a) in relation to a reference pose (p1) of the first sensor (11a), wherein the first sensor (11a) is included in a set of sensors (11) arranged at a vehicle (1), wherein the set of sensors (11) are configured for use in autonomous operation of the vehicle (1). The method comprises obtaining (S1), using one or more second sensors (11b) from the set of sensors (11), sensor data indicative of a pose of the first sensor (11a); and determining (S2) the displacement (d) of the first sensor (11a), based on the sensor data obtained by the one or more second sensors (11b). The disclosure also relates to a computer program, to a computer-readable medium, to a control arrangement and to a vehicle (1).

Description

Method and control arrangement for determining displacement of a vehicle sensor

Technical field

The present disclosure relates to a method and a control arrangement for determining displacement of a sensor in relation to a reference pose. In particular, the disclosure relates to determining a pose of a first sensor included in a set of sensors arranged at a vehicle, wherein the set of sensors is configured for use in autonomous operation of the vehicle. The disclosure also relates to a computer program, to a computer-readable medium and to a vehicle.
Background

Autonomous operation of vehicles within reasonably unrestricted operational design domains requires the use of sensors mounted at the vehicle to perceive the environment. For accurate use of the sensor data, the poses of these sensors have to be known. In addition, the relations between the sensors and the environment, and between the sensors themselves, also need to be known.
These relations can be measured and compensated for by means of calibration and adjustment of the sensors. It is also possible to measure the pose of the vehicle by means of, for example, inertial measurement units (IMUs). The IMU measurements are taken in relation to the magnetic field of the earth. However, sometimes the parts of the vehicle at which the sensors are mounted are not rigid but flex during operation. The parts may for example move relative to each other, and they can also change in shape. These effects are often caused by the movement of the vehicle over uneven terrain or by changes in longitudinal and lateral position. Movement between parts of the vehicle can be estimated using displacement sensors, for example IMUs, arranged at different parts of the vehicle. For example, patent application US2019163201 A1 proposes that data indicative of a displacement of the cab relative to the chassis can be obtained from a displacement sensor configured to obtain measurements of such a displacement.
However, to accurately measure the movement and shape change of several different parts of a vehicle, multiple displacement sensors have to be deployed in order to detect how the different parts move in relation to each other. This makes the sensor pose determination ineffective, because of long tolerance chains and a large number of expensive IMUs. Thus, there is a need for other ways of positioning sensors arranged at a vehicle.
Summary

It is an object of the disclosure to alleviate at least some of the drawbacks of the prior art. Thus, it is an object of this disclosure to provide less complex ways of determining poses of sensors in relation to a vehicle without adding extra hardware. It is a further objective to provide a way of determining the poses that can be used during operation of the vehicle.
To meet these objectives, this disclosure proposes techniques where sensors used for perception during autonomous operation are also utilized to perceive the poses of other sensors.
According to a first aspect, the disclosure relates to a method for determining displacement of a first sensor in relation to a reference pose of the first sensor. The first sensor is included in a set of sensors arranged at a vehicle. The set of sensors is configured for use in autonomous operation of the vehicle. The method comprises obtaining, using one or more second sensors from the set of sensors, sensor data indicative of a pose of the first sensor. The method further comprises determining the displacement of the first sensor based on the sensor data obtained by the one or more second sensors. Thereby, sensors that are in any case required for autonomous operation are used for determining sensor poses, which reduces the cost, complexity and uncertainties of the system. In this way, fewer sensors are required, as the need for a plurality of IMUs is eliminated, and a more direct measuring method is achieved.

In some embodiments, there is a Line-of-Sight between the at least one second sensor and the first sensor, or between the at least one second sensor and a part of the vehicle on which the first sensor is attached, when obtaining the sensor data indicative of the pose. Thereby, a movement or rotation of the first sensor may be detected directly (or indirectly, via the part of the vehicle).

In some embodiments, the determining comprises determining a pose of a part of the vehicle on which the first sensor is attached. This is beneficial, as it is often easier to detect a feature of a part of the vehicle, such as a corner, than of the sensor itself.

In some embodiments, the determining is based on reference poses of the individual sensors of the set of sensors. Using information about reference poses facilitates the determination of a deviation.

In some embodiments, the determining comprises using a vehicle model defining how the individual sensors, or the parts of the vehicle on which the sensors are attached, can move in relation to each other.
Thereby, determination of sensor poses is facilitated, as possible sensor movement is limited by the model.

In some embodiments, the determining comprises detecting at least one feature of the first sensor and/or at least one feature of a part of the vehicle on which the first sensor is arranged, and comparing the pose of the at least one feature to a reference pose of the at least one feature. Feature detection is one possible technique that is commonly available and may be used to implement the proposed method.

In some embodiments, the at least one feature comprises one or more of a surface, an edge, a corner, a shape and a colour. These features are typically easy to detect using a feature detection algorithm.

In some embodiments, determining the displacement comprises determining the pose in six degrees of freedom. Thus, the pose of the first sensor may be fully determined using the method.

In some embodiments, the method comprises determining a displacement of the first sensor based on fused sensor data obtained by a plurality of second sensors.
Thereby, better accuracy may be achieved.

In some embodiments, one of the plurality of second sensors is configured to measure distance with a resolution that meets a first resolution criterion, and another sensor of the plurality of second sensors is configured to measure angles with a resolution that meets a second resolution criterion. In this way, good accuracy is achieved in terms of both position and rotation of the first sensor.

In some embodiments, the at least one second sensor comprises a distance sensor and/or an image sensor. Distance sensors are typically good at measuring position, while image sensors are good at measuring rotation.

In some embodiments, the method comprises compensating for the determined displacement of the first sensor during autonomous operation of the vehicle. Thereby, the vehicle can be autonomously operated in a safe way.
According to a second aspect, the disclosure relates to a control arrangement for controlling a vehicle, the control arrangement being configured to perform the method according to the first aspect.
According to a third aspect, the disclosure relates to a vehicle comprising the control arrangement of the second aspect.
According to a fourth aspect, the disclosure relates to a computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method according to the first aspect.
According to a fifth aspect, the disclosure relates to a computer-readable medium comprising instructions which, when executed by a computer, cause the computer to carry out the method according to the first aspect.
Brief description of the drawings

Fig. 1 illustrates a vehicle 1, where the proposed method for determining displacement of a first sensor may be implemented.
Figs. 2a and 2b illustrate cabin movement of the vehicle in Fig. 1.
Fig. 3 illustrates a second sensor obtaining sensor data indicative of a pose of a first sensor.
Fig. 4 illustrates determination of displacement of sensors 11 caused by skewing of a chassis of the vehicle of Fig. 1.
Fig. 5 illustrates a computer implemented method for determining displacement of a first sensor in relation to a reference pose of the first sensor.
Fig. 6 illustrates a control arrangement of the vehicle of Fig. 1 according to an example embodiment.
Detailed description

This disclosure proposes utilizing a set of sensors intended for perception during autonomous operation to also position (i.e. perceive the poses of) each other. In other words, this disclosure proposes to let the sensors in an autonomy sensor suite determine each other's poses. The disclosure also proposes using the set of sensors to perceive the shape of bodies or parts of the vehicle and of the environment. Information about the shape of bodies or parts of the vehicle and the environment can be used for perceiving the poses. In the following disclosure, embodiments of a method for determining displacement of a first sensor in relation to a reference pose of the first sensor will be explained with reference to Figs. 1 to 6.

Fig. 1 conceptually illustrates a vehicle 1, here a truck, where the proposed method for determining displacement of a first sensor in relation to a reference pose may be implemented. The proposed technique is applicable to any vehicle implementing any type of perception for autonomous operation. Note that autonomous operation is herein not limited to fully autonomous operation, but also includes autonomous functions such as lane keeping and parking assistance.
The vehicle 1 comprises equipment required for autonomous driving, such as an autonomous driving system, a navigation system, sensors, meters etc. The autonomous driving system is configured to operate the vehicle autonomously. The sensors and meters are configured to provide vehicle parameters for use by the autonomous driving system. The navigation system is configured to determine a curvature of an upcoming road. For simplicity, only parts of the vehicle 1 that are related to the proposed technique are illustrated in Fig. 1 and described herein.
The illustrated vehicle 1 comprises a set of sensors 11 and a control arrangement 10. The set of sensors 11 is configured for use in autonomous operation of the vehicle 1 and is sometimes referred to as an autonomy sensor suite. These sensors play an essential role in automated driving. For example, the autonomy sensor suite allows the vehicle to monitor its surroundings, detect oncoming obstacles, and safely plan its paths. In other words, the set of sensors 11 is configured for use in autonomous driving. For example, the set of sensors 11 is configured for use in object detection, ego-vehicle localisation or odometry during autonomous operation. In other words, the set of sensors 11 is for use in perceiving the vehicle's own position and movement, as well as its surroundings, during autonomous driving. The sensors 11 may comprise, but are not limited to, lidars or radars for distance measurements and sensors for angular measurements. As mentioned above, for accurate use of the sensors 11, the poses (orientation and position) of the sensors 11 in the vehicle coordinate frame need to be known. The vehicle coordinate frame is typically defined in relation to the centre of the rear wheel axle.
The set of sensors 11 is arranged at different parts 1a, 1b of the vehicle 1. In the illustrated example, one sensor (denoted first sensor 11a) is mounted at a first part 1a, here the cabin, and two other sensors (denoted second sensors 11b) are mounted at a second part 1b, here the chassis. In many situations there is a Line-of-Sight between individual sensors within the set of sensors 11. Alternatively, there is a Line-of-Sight between one sensor and a part of the vehicle on which another sensor is attached. For example, in this example there is a line of sight between the second sensors 11b (mounted at the chassis) and the part 1a of the vehicle 1 (here the cabin) on which the first sensor 11a is attached. Consequently, the individual sensors of the set of sensors may be used to detect or sense each other. For example, the second sensors 11b may capture images picturing the first sensor or picturing the part on which the first sensor is attached. It is for example assumed that any displacement of the part 1a on which the first sensor 11a is attached corresponds to the displacement of the first sensor 11a.
To improve driver comfort and stability, the cabin is typically flexibly suspended on the chassis. Hence, when the vehicle 1 is driving on an uneven road, the relation between the cabin and the chassis will typically vary, as illustrated in Figs. 2a and 2b. Consequently, the pose of the first sensor 11a in the vehicle coordinate system may also vary while driving the vehicle 1. This disclosure is based on the insight that the other sensors in the set of sensors 11 can be used to determine, for example, such a deviation. For example, a second sensor 11b mounted on the chassis can be used to obtain sensor data indicative of a pose of a first sensor 11a mounted at the cabin, as illustrated in the example embodiment of Fig. 3. For example, the second sensor can be used to detect the movement of the part of the vehicle on which the first sensor is attached (here referred to as a first part 1a). In the example of Fig. 3, the first part 1a is the cabin. The pose is determined by detecting movement of the upper left corner of the cabin using feature detection in one or more images. Hence, a deviation d of the cabin 1a from a reference pose (e.g. pre-configured as straight up) may then be determined based on the deviation of the corner. In the example of Fig. 3, the second sensor is assumed to be fixed in the vehicle coordinate frame. Thus, the deviation d may be determined by detecting a movement of the position p2 of the upper left corner of the cabin and comparing it with its position p1 in the reference pose in the vehicle coordinate frame.
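In the fixed-second-sensor case above, the deviation reduces to comparing the detected corner position p2 with the stored reference position p1 in the vehicle coordinate frame. A minimal sketch of that comparison (the function name and all coordinate values are illustrative assumptions, not taken from the disclosure):

```python
import numpy as np

def corner_displacement(p1, p2):
    """Deviation d of a tracked cabin corner in the vehicle coordinate
    frame: detected position p2 minus reference position p1."""
    return np.asarray(p2, dtype=float) - np.asarray(p1, dtype=float)

# Illustrative values in metres: reference corner position p1 with the cabin
# straight up, and a position p2 detected by the chassis-mounted second sensor.
p1 = [4.00, 1.20, 3.10]
p2 = [4.00, 1.35, 3.05]

d = corner_displacement(p1, p2)  # lateral shift of 0.15 m and a 0.05 m drop
```

Any non-zero component of d directly indicates how far the cabin (and thus the first sensor attached to it) has deviated from the reference pose.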
The proposed method may be used in even more complex scenarios, where several or all of the sensors may move in relation to each other. For example, it is possible that the chassis of the vehicle is not 100 % rigid but can skew if forces are applied. Fig. 4 illustrates determination of displacement of sensors 11 caused by skewing of a chassis of the vehicle of Fig. 1. In the example of Fig. 4, four sensors 11 are positioned in different corners of the chassis, such that each sensor 11 is in line-of-sight with at least two other sensors 11. Skewing of the chassis may cause the sensors to move in relation to each other. However, with the proposed method described below, the positions of the sensors 11 may be calculated based on combined sensor data obtained by all the sensors. In general, three measurements are required for accurate positioning of an object. However, provided that the dynamics of the chassis is known to some extent, fewer measurements may be required, as the dynamics may serve as a restriction on possible movement. For example, a mathematical model may be used to describe the shape of the chassis. Sensor data from one or more of the sensors viewing each other (i.e. sensors in line-of-sight of each other) may then be used to determine a present state of the model and thereby also the present positions of the individual sensors.
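The model-constrained idea above can be sketched with a deliberately simple toy: a one-parameter chassis model whose single skew angle is recovered from line-of-sight distance measurements between corner sensors. The model, dimensions, sensor layout and brute-force search are all illustrative assumptions; the disclosure does not prescribe a particular model or solver.

```python
import numpy as np

# Hypothetical one-parameter chassis model: a rectangular frame of length L
# and width W whose front end yaws by a skew angle theta relative to the
# rear, moving the two front sensor mounts.
L, W = 6.0, 2.4  # illustrative frame dimensions in metres

def sensor_positions(theta):
    """2-D positions of four corner-mounted sensors as a function of theta."""
    rear = np.array([[0.0, -W / 2], [0.0, W / 2]])
    c, s = np.cos(theta), np.sin(theta)
    front = np.array([[L * c, -W / 2 + L * s], [L * c, W / 2 + L * s]])
    return np.vstack([rear, front])

def estimate_skew(measured, pairs, thetas=np.linspace(-0.1, 0.1, 2001)):
    """Pick the model state whose predicted inter-sensor distances best
    match the line-of-sight distance measurements (brute-force search)."""
    def residual(theta):
        pos = sensor_positions(theta)
        pred = [np.linalg.norm(pos[i] - pos[j]) for i, j in pairs]
        return np.sum((np.array(pred) - measured) ** 2)
    return min(thetas, key=residual)

pairs = [(0, 2), (0, 3), (1, 2), (1, 3)]  # sensor pairs in line of sight
true_pos = sensor_positions(0.03)         # simulate a slightly skewed chassis
measured = np.array([np.linalg.norm(true_pos[i] - true_pos[j]) for i, j in pairs])
theta_hat = estimate_skew(measured, pairs)
```

Because the model leaves only one degree of freedom, four pairwise distances are more than enough to pin down the state and, with it, every sensor position.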
The proposed method for determining displacement of a first sensor 11a in relation to a reference pose will now be described with reference to the flow chart of Fig. 5 and the vehicle of Figs. 1-4. The method is performed in a control arrangement 10. In this example, the control arrangement 10 is arranged in the vehicle 1. However, it must be appreciated that the control arrangement 10 may alternatively be at least partly arranged off-board, as long as it can communicate with the sensors 11. The method may be implemented as a computer program comprising instructions which, when the program is executed by a computer (e.g. a processor in the control arrangement 10, Fig. 6), cause the computer to carry out the method. According to some embodiments, the computer program is stored in a computer-readable medium (e.g. a memory or a compact disc) that comprises instructions which, when executed by a computer, cause the computer to carry out the method.
A reference pose of a sensor 11 herein refers to a pre-configured reference pose which, for example, corresponds to a pose (position and/or rotation) of the sensor 11 when the vehicle is standing on a flat and horizontal surface and no external forces are acting on the vehicle 1. Thus, the pre-configured reference pose may correspond to a predefined mounting pose. The pre-configured reference pose is typically known, i.e. it is stored in the control arrangement 10. The method is for example performed during operation of the vehicle 1, while parts of the vehicle 1 are deformed and/or move in relation to each other, which might cause the sensors 11 to deviate from their reference poses as explained above. A deviation herein refers to a deviation in any degree of freedom (in three-dimensional space). Hence, it might be a rotation around any axis and/or a movement in any direction.
As discussed above, the set of sensors 11 is typically mounted at certain positions at the vehicle 1. The positions are suitable for use in autonomous operation but also allow the sensors to perceive features of relevant bodies of the vehicle. The relevant bodies are for example other sensors, or parts on which other sensors are attached. The features may be natural features, such as corners, edges, surfaces, shapes and colours, and/or augmented features, such as markers placed for the purpose of positioning a sensor. Many of these features are indicative of a pose of the sensor. The sensors may also be configured to perceive suitable features of themselves and of the environment.
The proposed method utilises the fact that the sensors 11 can determine each other's poses. More specifically, the proposed method comprises obtaining S1, using one or more second sensors 11b from the set of sensors, sensor data indicative of a pose of a first sensor 11a from the set of sensors 11. In other words, sensor data indicative of a pose of a first sensor 11a in the set of sensors is recorded using one or more of the other sensors in the set. The sensor data comprises for example images, distance or position measurements, etc. The measurements may require Line-of-Sight. Hence, in some embodiments there is a Line-of-Sight between the at least one second sensor 11b and the first sensor 11a, or between the at least one second sensor 11b and a part 1a of the vehicle 1 on which the first sensor 11a is attached, when obtaining S1 the sensor data indicative of the pose.
The pose of the first sensor 11a is then determined based on the obtained sensor data. The pose of the first sensor is defined in relation to a reference pose of the sensor. The reference pose is, as mentioned above, typically pre-programmed. In other words, the method further comprises determining S2 a displacement d of the first sensor, based on the sensor data obtained by the one or more second sensors 11b.
The determining may be performed in different ways. In a very simple example, the pose of the second sensor 11b in the vehicle coordinate system is known. The pose of the first sensor 11a in the vehicle coordinate system may then be directly derived from sensor data recorded by the second sensor 11b. One example would be using feature recognition in an image to detect lateral deviation of the cabin in the example of Fig. 2. In other words, the determining comprises determining S2 a pose of a part 1a of the vehicle 1 on which the first sensor 11a is attached. Longitudinal movement of a cabin may in the same way be detected using a distance measurement, using e.g. lidar or radar. In other words, in some embodiments, the determining S2 comprises detecting at least one feature of a part 1a of the vehicle 1 on which the first sensor 11a is arranged and comparing a pose of the feature to a reference pose p1 of the feature. "Attached to" herein for example refers to the sensor 11a being rigidly mounted, or flexibly mounted with an at least partly known relation, to the part.
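The distance-based case above amounts to a single subtraction: the longitudinal shift is the measured range minus the stored reference range. A minimal sketch (function name, sign convention and values are illustrative assumptions):

```python
def longitudinal_displacement(measured_range, reference_range):
    """Longitudinal cabin shift inferred from one chassis-to-cabin distance
    measurement; positive means the cabin moved away from the chassis-mounted
    distance sensor (the sign convention is an assumption)."""
    return measured_range - reference_range

# Illustrative ranges in metres from a chassis-mounted lidar to the cabin wall.
shift = longitudinal_displacement(1.27, 1.25)  # about 0.02 m forward shift
```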
Alternatively, the first sensor 11a itself may be detected by the second sensor 11b. This may be done by detecting the shape of the first sensor 11a or by detecting a marker or pattern (e.g. a cross in a certain colour) arranged on the first sensor 11a. In other words, in some embodiments, the determining S2 comprises detecting at least one feature of the first sensor 11a and comparing the pose of the feature to a reference pose p1 of the feature.

In some embodiments, the sensor data is evaluated based on the pose of the second sensor 11b. The pose of the second sensor 11b may be known, e.g. fixed in the vehicle coordinate system. Alternatively, only a reference pose of the second sensor 11b is known. In such a scenario, the pose of the second sensor 11b might also need to be determined, using for example one or more sensors from the set of sensors 11, to determine a deviation of the second sensor 11b from its reference pose. Alternatively, it may be assumed that the second sensor 11b is positioned at its reference pose. In other words, in some embodiments, the determining S2 is based on reference poses of the individual sensors of the set of sensors. For example, the determining is based on the reference poses of the first and second sensors 11a, 11b.

In cases where several or all of the sensors are expected to deviate from their reference poses, knowledge about the vehicle's dynamics can be used to determine the deviations of the sensors 11. Typically, the sensors can only deviate in certain directions. There might for example be known relations between deviations of two or more individual sensors, such as mechanical constraints or common rigid fixation points. For example, two sensors that are attached to the same part will always deviate (move) in the same direction.
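Evaluating sensor data based on the second sensor's pose, as described above, amounts to mapping a detected feature from the second sensor's own frame into the vehicle coordinate frame before comparing it with a reference pose. A planar (2-D) sketch under assumed names and values; a full treatment would use a 6-DOF transform:

```python
import numpy as np

def to_vehicle_frame(p_sensor, sensor_xy, sensor_yaw):
    """Map a feature position measured in the second sensor's frame into
    the vehicle coordinate frame, given that sensor's planar pose
    (position sensor_xy and heading sensor_yaw)."""
    c, s = np.cos(sensor_yaw), np.sin(sensor_yaw)
    R = np.array([[c, -s], [s, c]])  # rotation of the sensor frame
    return R @ np.asarray(p_sensor, dtype=float) + np.asarray(sensor_xy, dtype=float)

# Illustrative: a second sensor mounted at (1.0, 0.5) in the vehicle frame and
# yawed 90 degrees sees a feature 2 m straight ahead of itself; in the vehicle
# frame that feature lies at (1.0, 2.5).
p_vehicle = to_vehicle_frame([2.0, 0.0], [1.0, 0.5], np.pi / 2)
```

If only a reference pose of the second sensor is known, the same transform is applied with that reference pose, accepting the corresponding uncertainty.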
In some embodiments, the determining S2 of a displacement comprises using a vehicle model defining how the individual sensors, or the parts of the vehicle 1 on which the sensors are attached, can move in relation to each other. The vehicle model may describe how the sensors 11 may move in relation to each other. The movement may include both rotation and position.
As mentioned above, three measurements are typically required for positioning an object. In some embodiments, one of these measurements may be in relation to an external reference. In the example of Fig. 4, it is possible that each sensor sees a common reference, e.g. a feature on the ground, and that it also sees two other sensors from the set. This information will typically be enough to position all sensors in relation to each other, as there are then three measurements related to each sensor.
The determined displacement may involve position and/or rotation of the first sensor. In some embodiments, a displacement in any direction is determined. In other words, in some embodiments, determining S2 the displacement comprises determining the pose in six degrees of freedom. In other words, a system can be developed that can accurately calculate and compensate for the pose and shape changes. This is done by analysing the pose in three directions x, y, z and, in addition, rotation described by roll, pitch and yaw. Alternatively, in some embodiments, only deviations in some selected directions are determined.

In some embodiments, the determining S2 comprises determining a displacement d of the first sensor based on fused sensor data obtained by a plurality of second sensors 11b. In practice, this means that more information is used to determine the deviation. Sensor fusion is a well-known technique for bringing together inputs from multiple sensors, such as radars, lidars and cameras, to form a single model or image of an object. The resulting model is more accurate because it balances the strengths of the different types of sensors. In general, different types of sensors are suitable for measuring rotation and for measuring distance. For example, a lidar can be used to determine a position with high accuracy, e.g. with a resolution above a certain level or threshold. On the other hand, an image sensor is typically better at estimating an angle or rotation. Stated differently, an image sensor can measure angle or rotation with high resolution, e.g. a resolution above a certain level or threshold. In other words, in some embodiments, one of the plurality of second sensors 11b is configured to measure distance with a resolution that meets a first resolution criterion, and another sensor of the plurality of second sensors 11b is configured to measure angles with a resolution that meets a second resolution criterion.
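The complementary-sensor idea above can be sketched as combining an accurate range from a distance sensor with an accurate bearing from an image sensor into one position estimate, each sensor contributing the quantity it resolves best. This is an illustrative simplification, not the disclosure's prescribed fusion method:

```python
import numpy as np

def fuse_range_bearing(r, phi):
    """Combine a distance sensor's range r (metres) with an image sensor's
    bearing phi (radians) into a single 2-D position estimate."""
    return np.array([r * np.cos(phi), r * np.sin(phi)])

# Illustrative: lidar range 5.0 m, camera bearing 30 degrees.
pos = fuse_range_bearing(5.0, np.deg2rad(30.0))
```

In a fuller implementation each measurement would carry an uncertainty, and the fused estimate would weight the inputs accordingly (e.g. inverse-variance weighting or a Kalman filter).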
The determined deviation of the first sensor can then be used to calibrate sensor data recorded by the first sensor, for example when autonomously operating the vehicle. Hence, in some embodiments, the method comprises compensating S3 for the determined displacement of the first sensor 11a during autonomous operation of the vehicle. This typically involves calibrating all sensors to the same coordinate frame.

In some embodiments, one sensor 11 is not a single sensor but a sensor module comprising several sensors that are mounted such that their internal relations are fixed and known. For example, the sensors are rigidly mounted to a rigid structure of the sensor module. In these embodiments, the displacement of the entire sensor module (from a reference pose of the sensor module) may be determined using the technique described above. For example, the relations between the sensors of the sensor module are pre-calibrated (before installation), or the sensors are mounted to a structure or frame of the sensor module with a certain accuracy. In some embodiments, one sensor is a sensor module comprising several sensors of different types. For example, one sensor module may comprise one radar, one lidar and one image sensor.
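The compensation step S3 described above can be sketched as correcting points reported by the displaced first sensor back into the common vehicle frame using the determined displacement d. The sketch assumes a pure translation of the sensor; handling the full 6-DOF case would add a rotation. All names and values are illustrative:

```python
import numpy as np

def compensate(points, reference_mount, displacement):
    """Shift points reported by a displaced sensor into the vehicle frame,
    assuming the sensor has translated by `displacement` away from its
    nominal mounting position (rotation omitted for brevity)."""
    actual_mount = np.asarray(reference_mount, float) + np.asarray(displacement, float)
    return np.asarray(points, float) + actual_mount

# Illustrative: a detection 10 m ahead of a cabin sensor whose nominal mount
# is at (2.0, 0.0, 2.5) and whose determined displacement d is (0.0, 0.15, -0.05).
pts = compensate([[10.0, 0.0, 0.0]], [2.0, 0.0, 2.5], [0.0, 0.15, -0.05])
```

Without the compensation, the 0.15 m lateral and 0.05 m vertical cabin deviation would bias every detection made by the first sensor.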
Now turning to Fig. 6, which illustrates the control arrangement 10 configured to implement the proposed method in more detail. In some embodiments, the control arrangement 10 is a "unit" in a functional sense. Hence, in some embodiments the control arrangement 10 comprises several physical control devices that operate in cooperation.
The control arrangement 10 comprises one or more ECUs. An ECU is basically a digital computer that controls one or more electrical systems (or electrical subsystems) of the vehicle 1 based on, e.g., information read from sensors and meters placed at various parts and in different components of the vehicle 1. ECU is a generic term used in automotive electronics for any embedded system that controls one or more functions of the electrical system or subsystems in a transport vehicle. The control arrangement 10 comprises for example an Automated-Driving Control Unit, an ADAS control unit, or any other suitable ECU.
The control arrangement 10, or more specifically the processor 101 of the control arrangement 10, is configured to cause the control arrangement 10 to perform all aspects of the method described above and below. This is typically done by running computer program code, stored in the data storage or memory 102, in the processor 101 of the control arrangement 10. The data storage 102 may also be configured to store semi-static vehicle parameters such as vehicle dimensions.
The control arrangement 10 may also comprise a communication interface (not shown) for communicating with other control units of the vehicle and/or with off-board systems.
More specifically, the control arrangement is configured to obtain S1, using one or more second sensors 11b from the set of sensors, sensor data indicative of a pose of the first sensor 11a, and to determine S2 the displacement d of the first sensor based on the sensor data obtained by the one or more second sensors 11b. In some embodiments, the control arrangement 10 is configured to compensate S3 for the determined displacement of the first sensor 11a during autonomous operation of the vehicle.
The proposed technique has herein been explained with reference to a tractor-trailer vehicle. However, the proposed technique may also be applied to more complex vehicles, such as vehicles with multiple actuated steering axles, and articulated vehicles composed of a tractor, a dolly, and a trailer. The proposed technique can be readily extended to consider alternative desired driving behaviours besides centring the vehicle on the road. Based on the current traffic situation, it might be beneficial to plan paths that maximize the distance between the vehicle swept area and oncoming traffic. To further validate the approach, we plan to evaluate the proposed methods in real-world tests using autonomous heavy-duty vehicles.
The terminology used in the description of the embodiments as illustrated in the accompanying drawings is not intended to be limiting of the described method, control arrangement or computer program. Various changes, substitutions and/or alterations may be made without departing from the disclosed embodiments as defined by the appended claims.
The term "or" as used herein is to be interpreted as a mathematical OR, i.e., as an inclusive disjunction, not as a mathematical exclusive OR (XOR), unless expressly stated otherwise. In addition, the singular forms "a", "an" and "the" are to be interpreted as "at least one", thus also possibly comprising a plurality of entities of the same kind, unless expressly stated otherwise. It will be further understood that the terms "includes", "comprises", "including" and/or "comprising" specify the presence of stated features, actions, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, actions, integers, steps, operations, elements, components, and/or groups thereof. A single unit such as, e.g., a processor may fulfil the functions of several items recited in the claims.

Claims (15)

1. A method for determining displacement of a first sensor (11a) in relation to a reference pose (pl) of the first sensor (11a), wherein the first sensor (11a) is included in a set of sensors (11) arranged at a vehicle, wherein the set of sensors (11) are configured for use in autonomous operation of the vehicle (1), the method comprising: - obtaining (S1), using one or more second sensors (11b) from the set of sensors, sensor data indicative of a pose of the first sensor (11a); and - determining (S2) the displacement (d) of the first sensor, based on the sensor data obtained by the one or more second sensors (11b), wherein the determining (S2) comprises detecting at least one feature of the first sensor (11a) and comparing the pose of the feature to a reference pose (pl) of the feature.
2. The method according to claim 1, wherein there is a Line-of-Sight between the at least one second sensor (11b) and the first sensor (11a), or between the at least one second sensor (11b) and a part (1a) of the vehicle (1) on which the first sensor (11a) is attached, when obtaining (S1) the sensor data indicative of the pose.
3. The method according to claim 1 or 2, wherein the determining comprises determining (S2) a pose of a part (1a) of the vehicle (1) on which the first sensor (11a) is attached.
4. The method according to any of the preceding claims, wherein the determining (S2) is based on reference poses of the individual sensors of the set of sensors.
5. The method according to any of the preceding claims, wherein the determining (S2) a displacement comprises using a vehicle model defining how the individual sensors, or parts of the vehicle (1) on which the sensors are attached, can move in relation to each other.
6. The method according to any of the preceding claims, wherein the determining (S2) further comprises detecting at least one feature of a part (1a) of the vehicle (1) on which the first sensor (11a) is arranged and comparing the pose of the feature to a reference pose (pl) of the feature.
7. The method according to claim 6, wherein the at least one feature comprises one or more of a surface, an edge, a corner, a shape and a colour.
8. The method according to claim 7, wherein the determining (S2) the displacement comprises determining the pose in six degrees of freedom.
9. The method according to any of the preceding claims, comprising determining (S2) a displacement (d) of the first sensor based on fused sensor data obtained by a plurality of second sensors (11b).
10. The method according to claim 9, wherein one of the plurality of second sensors (11b) is configured to measure distance with a resolution that meets a first resolution criterion and wherein another sensor of the plurality of second sensors (11b) is configured to measure angles with a resolution that meets a second resolution criterion.
11. The method according to any of the preceding claims, wherein the method comprises: - compensating (S3) for the determined displacement of the first sensor (11a) during autonomous operation of the vehicle.
12. A method according to any of the preceding claims wherein the set of sensors (11) are configured for use in at least one of object detection, ego vehicle localisation and odometry during autonomous operation of the vehicle.
13. A computer program comprising instructions which, when the computer program is executed by a computer, cause the computer to carry out the method according to any one of the preceding claims.
14. A control arrangement (10) configured for controlling a vehicle (1), the control arrangement (10) being configured to perform the method according to any one of claims 1 to
15. A vehicle (1) comprising a control arrangement (10) according to claim 14.
SE2051312A 2020-11-10 2020-11-10 Method and control arrangement for determining displacement of a vehicle sensor SE545062C2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
SE2051312A SE545062C2 (en) 2020-11-10 2020-11-10 Method and control arrangement for determining displacement of a vehicle sensor

Publications (2)

Publication Number Publication Date
SE2051312A1 SE2051312A1 (en) 2022-05-11
SE545062C2 true SE545062C2 (en) 2023-03-21

Family

ID=81851771

Family Applications (1)

Application Number Title Priority Date Filing Date
SE2051312A SE545062C2 (en) 2020-11-10 2020-11-10 Method and control arrangement for determining displacement of a vehicle sensor

Country Status (1)

Country Link
SE (1) SE545062C2 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180284243A1 (en) * 2017-03-31 2018-10-04 Uber Technologies, Inc. Autonomous Vehicle Sensor Calibration System
US20190066323A1 (en) * 2017-08-24 2019-02-28 Trimble Inc. Excavator Bucket Positioning Via Mobile Device
US20190163201A1 (en) * 2017-11-30 2019-05-30 Uber Technologies, Inc. Autonomous Vehicle Sensor Compensation Using Displacement Sensor
US20190163189A1 (en) * 2017-11-30 2019-05-30 Uber Technologies, Inc. Autonomous Vehicle Sensor Compensation By Monitoring Acceleration
EP3699630A1 (en) * 2019-02-25 2020-08-26 KNORR-BREMSE Systeme für Nutzfahrzeuge GmbH System and method for compensating a motion of a vehicle component
US10798303B1 (en) * 2019-09-09 2020-10-06 Tusimple, Inc. Techniques to compensate for movement of sensors in a vehicle
DE102019205504A1 (en) * 2019-04-16 2020-10-22 Zf Friedrichshafen Ag Control device and method as well as computer program product

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
'Modularised data driven modelling and real time estimation of cabin dynamics in heavy-duty vehicles'; In: ip.com Prior Art Database Technical Disclosure; 2019-09-02. *

Also Published As

Publication number Publication date
SE2051312A1 (en) 2022-05-11
