WO2021078711A1 - Procédé de délivrance d'un signal de commande à une unité d'un véhicule (Method for outputting a control signal to a unit of a vehicle)


Info

Publication number: WO2021078711A1
Authority: WO (WIPO/PCT)
Application number: PCT/EP2020/079445
Prior art keywords: vehicle, image, unit, image data, steering angle
Other languages: German (de), English (en)
Inventors: Markus Mueller, Azaruddin Syed, Vipin Kamaraju, Nikhil Sawant, Camille Marbach
Original assignee: Robert Bosch GmbH
Priority date: 2019-10-24
Filing date: 2020-10-20
Publication date: 2021-04-29
Application filed by Robert Bosch GmbH
Publication of WO2021078711A1

Classifications

    • B: Performing operations; transporting
    • B60: Vehicles in general
    • B60R: Vehicles, vehicle fittings, or vehicle parts, not otherwise provided for
    • B60R 1/00: Optical viewing arrangements; real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R 1/20: Real-time viewing arrangements using optical image capturing systems
    • B60R 1/22: Real-time viewing arrangements for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R 1/23: Real-time viewing arrangements for viewing an area outside the vehicle with a predetermined field of view
    • B60R 1/27: Real-time viewing arrangements providing all-round vision, e.g. using omnidirectional cameras
    • B60R 2300/00: Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R 2300/10: Viewing arrangements characterised by the type of camera system used
    • B60R 2300/105: Viewing arrangements using multiple cameras
    • B60R 2300/30: Viewing arrangements characterised by the type of image processing
    • B60R 2300/302: Image processing combining image information with GPS information or vehicle data, e.g. vehicle speed, gyro, steering angle data
    • B60R 2300/303: Image processing using joined images, e.g. multiple camera images
    • B60R 2300/60: Viewing arrangements monitoring and displaying vehicle exterior scenes from a transformed perspective
    • B60R 2300/602: Transformed perspective with an adjustable viewpoint
    • B60R 2300/605: Transformed perspective with automatic adjustment of the viewpoint
    • B60R 2300/70: Viewing arrangements characterised by an event-triggered choice to display a specific image among a selection of captured images

Definitions

  • The invention relates to a method for outputting a control signal to a unit of a vehicle, a control unit for executing the method, a system comprising the control unit and the unit of the vehicle, and a vehicle according to the preamble of the independent claims.
  • The present invention also relates to a computer program and a machine-readable storage medium.
  • DE 10 2011 014368 A1 discloses an imaging driver assistance system.
  • A composite, virtual view is generated from camera images, which corresponds to a projection onto a cylinder with an elliptical base surrounding the vehicle.
  • An image section shown on a display unit is dynamically adapted as a function of the current steering angle.
  • The present invention relates to a method for outputting a control signal to a unit of a vehicle, in particular for controlling the unit of the vehicle using the output control signal.
  • The method comprises a step of receiving, by means of a control unit, first image data from a first camera unit of the vehicle and second image data from a second camera unit of the vehicle.
  • The first image data represent a first sub-area of a vehicle environment.
  • The second image data represent a second sub-area of the vehicle environment, the second sub-area adjoining the first sub-area or overlapping the first sub-area.
  • The method further comprises a step of receiving steering angle information relating to a steering angle of the vehicle by means of the control unit.
  • The method further comprises a step of determining image information relating to an image of the vehicle environment by means of the control unit.
  • The image information is determined using the received first and second image data and the received steering angle information.
  • The image of the vehicle environment is an image transformed into the vehicle's own perspective.
  • The method also comprises a step of outputting the control signal as a function of the determined image information to the unit of the vehicle by means of the control unit; a minimal sketch of this flow follows below.
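  • The following is a minimal, non-binding sketch of the claimed flow in Python. The helper logic (simple horizontal concatenation in place of real stitching and perspective transformation) and the dict-based control signal are illustrative assumptions, not part of the disclosure.

    import numpy as np

    def output_control_signal(img1: np.ndarray, img2: np.ndarray,
                              steering_angle_deg: float) -> dict:
        """Receive two camera images and a steering angle, determine image
        information, and build a control signal for a vehicle unit."""
        # Stand-in for stitching plus perspective transformation: the real
        # method projects the data onto a projection surface and re-images
        # it with a steering-angle-dependent virtual camera.
        combined = np.hstack([img1, img2])  # assumes equal image heights
        image_info = {"image": combined, "steering_angle": steering_angle_deg}
        return {"target_unit": "display", "payload": image_info}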
  • The present invention also relates to a control unit for outputting a control signal to a unit of a vehicle.
  • The control unit is set up to receive first image data from a first camera unit of the vehicle and second image data from a second camera unit of the vehicle.
  • The first image data represent a first sub-area of a vehicle environment.
  • The second image data represent a second sub-area of the vehicle environment that adjoins the first sub-area or overlaps the first sub-area.
  • The control unit is further set up to receive steering angle information relating to a steering angle of the vehicle.
  • The control unit is also set up to determine image information relating to an image of the vehicle environment using the received first and second image data and the received steering angle information.
  • The image of the vehicle environment is an image transformed into the vehicle's own perspective.
  • The control unit is also set up to output the control signal as a function of the determined image information to the unit of the vehicle.
  • The present invention also relates to a computer program which is set up to carry out and/or control all steps of a method described above, as well as a machine-readable storage medium with such a computer program stored on it.
  • The present invention also relates to a system comprising the unit and the control unit described above.
  • The present invention also relates to a vehicle.
  • The vehicle has a first camera unit for providing first image data of a first sub-area of a vehicle environment, a second camera unit for providing second image data of a second sub-area of the vehicle environment adjoining or partially overlapping the first sub-area, and the system described above, that is, the control unit and the unit controllable by means of the output control signal.
  • The vehicle can be a passenger car or a commercial vehicle. It is conceivable that the vehicle is a mobile work machine.
  • The vehicle can be a vehicle for agricultural, construction or logistics purposes.
  • The vehicle can be an industrial truck, in particular a forklift, a tractor or an excavator.
  • The vehicle comprises at least a first and a second camera unit, preferably at least one further camera unit, in particular a third camera unit and a fourth camera unit. It is also conceivable that the vehicle has more than four, for example six, camera units.
  • The camera units are preferably arranged at different spatial positions on the vehicle.
  • One or more of the camera units can be arranged on a body, a chassis or a superstructure, an attachment, a working arm and/or a lifting unit of the vehicle.
  • The camera units are preferably arranged on a contour of the vehicle, for example on a roof or roof spar, on a counterweight or, for example, on an upper end region of a lifting mast.
  • The camera units preferably have different spatial orientations or alignments, i.e. differently aligned optical axes. In other words, the camera units can have mutually different camera poses.
  • Different camera poses can be understood to mean poses of at least two cameras, the cameras differing from one another with regard to their spatial position and/or their spatial orientation or alignment.
  • Each of the camera units is designed to capture, in particular optically, a sub-area of the vehicle environment.
  • The sub-area of the vehicle environment captured by means of a camera unit can correspond to a detection area of the respective camera unit.
  • The camera units are designed to provide image data relating to the respective captured sub-area of the vehicle environment.
  • The camera units are designed to record the respective captured sub-area from the perspective of the corresponding camera unit.
  • The sub-areas of the vehicle environment captured by means of the camera units can include objects in the vicinity of the vehicle, for example other vehicles and/or people.
  • The sub-areas of the vehicle environment captured by means of the camera units preferably at least partially include a surface or a roadway and/or a contour and/or an attachment and/or a lifting unit of the vehicle.
  • The captured sub-areas can adjoin one another in pairs or, in particular, overlap in pairs.
  • The captured sub-areas preferably overlap at most partially.
  • The captured sub-areas can overlap horizontally and/or vertically.
  • The first camera unit is a front camera, i.e. a camera unit oriented along a preferred direction of travel, or forward.
  • The second camera unit can, for example, be a rear camera, that is to say a camera unit oriented counter to a preferred direction of travel, or backward.
  • Alternatively, the second camera unit is a side camera, that is to say a camera unit oriented transversely, in particular orthogonally, to a preferred direction of travel, or laterally.
  • Particularly preferably, the first camera unit is a front camera, the second and fourth camera units are side cameras arranged on opposite sides, and the third camera unit is a rear camera.
  • Each of the camera units can be a mono camera or a stereo camera.
  • One or more of the camera units are preferably cameras with a fisheye lens in order to capture the largest possible angular range of the vehicle environment.
  • The angular range that can be detected by means of one of the camera units can have a horizontal extent, for example, in a range greater than or equal to 150° and less than or equal to 210°, in particular 180°.
  • The image of the vehicle environment is determined or calculated using the received first and second image data.
  • The image of the vehicle environment is based at least partially on the first image data and at least partially on the second image data.
  • The image of the vehicle environment is an image composed or combined from the first and second image data.
  • The image of the vehicle environment can be put together from the first and the second image data by means of a stitching method.
  • The first and second image data are preferably combined or merged in the horizontal direction.
  • The image composed from the first and second image data can have a vertical composite edge or stitching edge.
  • The method comprises a step of assembling or stitching the received image data to form the composed image of the vehicle environment; a sketch of such horizontal stitching follows below.
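  • A minimal sketch of horizontal stitching along a vertical stitching edge, assuming colour images (H, W, 3) of equal height that are already rectified to a common projection and share a known number of overlapping columns; real stitching would additionally handle seam selection and photometric alignment.

    import numpy as np

    def stitch_horizontal(left: np.ndarray, right: np.ndarray,
                          overlap: int) -> np.ndarray:
        """Blend two equally tall colour images along a vertical seam."""
        # Linear feathering weights across the overlapping columns.
        alpha = np.linspace(1.0, 0.0, overlap)[None, :, None]
        blended = alpha * left[:, -overlap:] + (1 - alpha) * right[:, :overlap]
        return np.hstack([left[:, :-overlap], blended, right[:, overlap:]])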
  • The image of the vehicle environment is an image transformed into the vehicle's own perspective by means of a transformation method.
  • The vehicle's own perspective is a perspective of the vehicle that differs from the perspectives of at least the first and second camera units, in particular from those of all camera units.
  • The vehicle's own perspective is a perspective of a virtual camera unit, i.e. of a camera unit that is not actually present on the vehicle.
  • The virtual camera unit has a camera pose that differs from the camera poses of at least the first and second camera units, preferably of all camera units, of the vehicle. It is conceivable that the position of the virtual camera unit on the vehicle is predetermined and/or predeterminable, in particular as an unchangeable position.
  • The position of the virtual camera unit can be defined in a coordinate system assigned to the vehicle, i.e. a vehicle-specific coordinate system, for example by specifying associated X, Y, Z coordinates in that coordinate system.
  • The coordinate system assigned to the vehicle can have its coordinate origin, for example, in the middle of an axle of the vehicle, for example on a rear axle, or below the axle at the level of an underside of the vehicle.
  • The alignment of the virtual camera unit on the vehicle preferably depends at least on the steering angle information.
  • The method preferably includes a step of transforming, in particular projecting, the first image data of the first camera unit and the second image data of the second camera unit.
  • The combining of the first and second image data to form the combined image of the vehicle environment can take place before or after the step of transforming, in particular projecting, the image data of the camera units. In other words, the image can be transformed into the vehicle's own perspective from the already combined first and second image data, or it can be combined from first and second image data that have each been transformed into the vehicle's own perspective.
  • The first and second image data can be transformed from the coordinate system of the respective camera unit into a common coordinate system.
  • The common coordinate system is preferably a coordinate system with a fixed reference point on the vehicle. It is conceivable that a coordinate origin of the coordinate system is arranged on the vehicle.
  • The origin of the coordinate system can, for example, be assigned to a position of a vehicle center or to the center of gravity of the vehicle.
  • The origin of coordinates can also correspond to a position of the virtual camera unit on the vehicle.
  • The first and second image data transformed into the common coordinate system can furthermore be projected onto a predefined and/or predefinable projection surface.
  • The projection surface is a virtual or imaginary surface.
  • The projection surface can at least partially surround the vehicle, preferably completely in the horizontal direction.
  • The projection surface is preferably a horizontally and vertically curved surface. It is conceivable that the projection surface is a dish-shaped or bowl-shaped surface.
  • The projection surface can have a bottom surface and a side surface. The bottom surface and the side surface are preferably spaced apart from one another by a transition surface of the projection surface.
  • The bottom surface can be oriented essentially horizontally to the vehicle, preferably at the level of an underbody of the vehicle or of the roadway.
  • The size of the bottom surface corresponds essentially to the size of the outline of the vehicle in a top view, i.e. the bottom surface is only slightly larger than the outline of the vehicle in a top view.
  • The side surface can be oriented essentially perpendicularly or orthogonally to the bottom surface or to the vehicle.
  • The transition surface can represent an essentially continuous transition between the bottom surface and the side surface.
  • The transition surface can represent a curved transition, in particular a parabolic transition, between the bottom surface and the side surface; a sketch of such a bowl profile follows below.
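  • A minimal sketch of a rotationally symmetric bowl profile with a flat bottom surface and a parabolic transition into an increasingly steep side wall; the radius r0 and the curvature k are illustrative assumptions, not values from the description.

    import numpy as np

    def bowl_height(r, r0: float = 2.0, k: float = 0.5):
        """Height z of the projection surface at ground distance r (metres):
        flat bottom surface inside radius r0, parabolic transition into the
        side surface outside it (z = k * (r - r0)^2)."""
        r = np.asarray(r, dtype=float)
        return np.where(r <= r0, 0.0, k * (r - r0) ** 2)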
  • From this, a vehicle-specific perspective of a virtual camera unit onto the vehicle environment can be determined or calculated.
  • For this purpose, a camera pose, i.e. a camera position and a camera orientation, is specified relative to the projection surface.
  • In addition, one or more optical imaging parameters of the virtual camera unit and/or a detection area of the virtual camera unit are specified.
  • The perspective of the virtual camera unit onto the vehicle environment can be determined or calculated using the specified camera pose, the optical imaging parameters and/or the specified detection area, and the specified projection surface. For this purpose, for example, known optical imaging rules can be used to determine which sub-area of the specified projection surface, i.e. which part of the image data projected onto it, can be optically imaged or captured by means of the virtual camera unit; a pinhole-model sketch of this step follows below.
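  • A minimal pinhole-model sketch of this re-imaging step, assuming 3-D points that have already been lifted onto the projection surface in the vehicle coordinate system; the focal length, image size and the choice of the x axis as viewing direction are assumptions for illustration.

    import numpy as np

    def project_to_virtual_camera(points_world: np.ndarray, yaw_deg: float,
                                  cam_pos: np.ndarray, f: float = 800.0,
                                  w: int = 1280, h: int = 720) -> np.ndarray:
        """Pixel coordinates of projection-surface points as seen by a
        virtual camera rotated by yaw_deg about the vertical (z) axis."""
        yaw = np.radians(yaw_deg)
        R = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                      [np.sin(yaw),  np.cos(yaw), 0.0],
                      [0.0,          0.0,         1.0]])
        cam = (points_world - cam_pos) @ R   # world -> camera coordinates
        in_front = cam[:, 0] > 0.1           # keep points ahead of the camera
        u = w / 2 + f * cam[in_front, 1] / cam[in_front, 0]
        v = h / 2 - f * cam[in_front, 2] / cam[in_front, 0]
        return np.stack([u, v], axis=1)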
  • A predefined and/or predeterminable spatial model of the vehicle can also be taken into account.
  • The projection surface and the spatial model of the vehicle are preferably specified in the same coordinate system.
  • The vehicle's own perspective, generated by projecting the image data of the camera units onto a projection surface and by determining the sub-area of the projected image data that can be captured by a virtual camera unit, thus essentially corresponds to the perspective of a camera unit virtually or notionally arranged on the vehicle.
  • The vehicle's own perspective is a perspective that faces away from the vehicle, i.e. is directed towards the vehicle environment.
  • Image information relating to the image of the vehicle environment transformed into the vehicle's own perspective can be understood to mean information relating to a sub-area or a subset of the first and second image data from which the image is generated.
  • An overall image composed of the image data and projected onto the projection surface can be stored in a memory unit, so that the image information relating to the image can be an indication of the partial image of the overall image that corresponds to the image.
  • Alternatively, the image information comprises the image of the vehicle environment transformed into the vehicle's own perspective itself.
  • For example, the image information specifies a horizontal and/or vertical angular range with respect to the image data projected onto a common projection surface.
  • A steering angle of a vehicle can be understood to mean an angle between a current or instantaneous direction of travel of the vehicle and the direction of travel of a predetermined and/or predeterminable forward travel of the vehicle.
  • The steering angle can be a steering wheel angle.
  • The steering angle can also be an angle between two vehicle parts rotatably mounted against one another, for example an articulation angle between the front and rear carriage of a wheel loader or an angle of rotation between the upper carriage and the undercarriage of an excavator.
  • The steering angle can also be an angle between the current alignment of a steerable axle of the vehicle and the alignment of that steerable axle when driving along a preferred direction of travel of the vehicle.
  • The steering angle can be provided by a sensor unit that detects the steering angle, for example a steering wheel angle sensor, or by a vehicle bus, for example a CAN bus.
  • The steering angle can also be provided by a control unit which is set up to control the steering unit of the vehicle.
  • The steering angle information can be information relating to a value of the steering angle.
  • The value of the steering angle can be a current or present value of the steering angle.
  • It is conceivable that the image of the vehicle environment is an image composed of the first image data and the second image data if the current value of the steering angle lies in a range between a first lower limit value and a first upper limit value. It is also conceivable that the image of the vehicle environment is an image generated from the first image data alone or the second image data alone if the current value of the steering angle lies below the first lower limit value or above the first upper limit value; a sketch of this range logic follows below.
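  • A minimal sketch of this conceivable range logic; the limit values used here are illustrative assumptions.

    def select_image_source(angle_deg: float, lower: float = -15.0,
                            upper: float = 15.0) -> str:
        """Choose the image data source(s) from the current steering angle."""
        if lower <= angle_deg <= upper:
            return "first+second (stitched)"   # between the limit values
        return "first only" if angle_deg < lower else "second only"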
  • The output control signal can be a wireless or wired, in particular electronic, signal.
  • The control signal can refer to the determined image information and/or to the image transformed into the vehicle's own perspective.
  • Alternatively, the control signal can comprise the determined image information and/or the image transformed into the vehicle's own perspective.
  • The method comprises a step of controlling the unit of the vehicle using the output control signal. That is to say, the vehicle comprises a unit that can be controlled by means of the output control signal.
  • The unit can be, for example, a display unit and/or an image processing unit of the vehicle.
  • The display unit can be controlled using or by means of the output control signal in order to display or present the image of the vehicle environment transformed into the vehicle's own perspective, for example to an operator of the vehicle. It is conceivable that the method includes a step of displaying, by means of the display unit, the image of the vehicle environment transformed into the vehicle's own perspective.
  • The display unit can be arranged on the vehicle, preferably on, in particular in, a driver's cab of the vehicle.
  • The display unit can have a screen, a display surface, for example of an infotainment system, or a display or monitor.
  • The display unit can be a display unit that is already present on the vehicle or one that is retrofitted to the vehicle.
  • The image processing unit can be controlled using or by means of the output control signal in order to analyze, process or render the image of the vehicle environment transformed into the vehicle's own perspective.
  • The image can be analyzed, for example, by means of an object recognition algorithm in order to detect, in particular classify, objects in the vehicle environment based on the image. Since the object recognition is limited, at least in a first step, to the image of the vehicle environment transformed into the vehicle's own perspective, it can be designed particularly efficiently: not all of the image data from the camera units of the vehicle are analyzed, but only that part of the first and second image data actually used to create the image.
  • Accordingly, the method includes an analysis step.
  • The image processing unit can be a separate unit or part of the control unit. It is also conceivable that, based on the objects detected by means of the object recognition algorithm, a further control signal is output to a warning unit and/or to a control unit for a vehicle actuator, for example a drive, a brake, a steering unit and/or a lifting unit.
  • The method can include a step of controlling the warning unit and/or the vehicle actuator using the further control signal in order to output an acoustic, optical and/or haptic warning and/or to intervene in the vehicle, in particular to accelerate, decelerate and/or steer the vehicle and/or to adjust a lifting unit of the vehicle.
  • The steering angle information may also include information relating to a change, in particular a time course, of the steering angle.
  • The time course of the steering angle can include at least a current value of the steering angle and a past value of the steering angle.
  • The time course of the steering angle can include a rate of change of the steering angle.
  • For example, the horizontal image area can be enlarged in the steering direction (when the difference between the current and a past steering angle value is greater than zero) and reduced against the steering direction, or vice versa.
  • By default, the horizontal image area of the image can have a value range of [-90°; 90°].
  • When a steering angle of 0° is reached, the horizontal image area of the image can then have, instead of the value range of [-90°; 90°], a value range of [-80°; 100°] or of [-100°; 80°], depending on the preceding steering direction; a sketch of this shift follows below.
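  • A minimal sketch of shifting the horizontal image range with the most recent steering direction. The 10° shift reproduces the example values above; treating it as a fixed gain is an assumption.

    def horizontal_range(last_steering_change_deg: float,
                         gain: float = 10.0,
                         base: tuple = (-90.0, 90.0)) -> tuple:
        """Shift the base range towards the direction of the last change."""
        if last_steering_change_deg > 0:
            shift = gain
        elif last_steering_change_deg < 0:
            shift = -gain
        else:
            shift = 0.0
        return (base[0] + shift, base[1] + shift)

    # Example: after steering to the left (positive change), the base range
    # of [-90°, 90°] becomes [-80°, 100°], as in the text.
    assert horizontal_range(5.0) == (-80.0, 100.0)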
  • In this case, the steering angle information includes information relating to a change in the steering angle.
  • The image can thus include a sub-area of the vehicle environment that is particularly relevant for operating the vehicle.
  • Optionally, the image information is determined as a function of travel direction information relating to a travel direction of the vehicle. It is conceivable that, for a given steering wheel angle or a given angle between two vehicle parts rotatably mounted against one another, at least a first and a second image of the vehicle environment transformed into the vehicle's own perspective are determined.
  • The image information then includes information relating to the first image in the case of forward travel and information relating to the second image in the case of reverse travel of the vehicle.
  • If no travel direction information is available, the image last determined as a function of travel direction information can be used as the image.
  • Alternatively, an image determined independently of the travel direction information can be used.
  • In this way, the transformed image always includes that area of the vehicle environment in or towards which the vehicle is moving, as a result of which safety when operating the vehicle is further increased.
  • It is also conceivable that the image information is determined as a function of a predefined and/or predefinable value for the steering angle.
  • In this case, the image information is determined independently of the value of the received steering angle.
  • The vehicle's own perspective of the transformed image then does not depend on the steering angle and, in particular, is unchanged.
  • Below a lower threshold value of the steering angle, the virtual camera unit thus does not pivot; its position and alignment remain constant.
  • The perspective or the alignment of the virtual camera unit can then be changed as a function of the steering angle only when the steering angle is greater than the lower threshold value, e.g. for extreme steering angle ranges such as occur in forklifts, excavators, wheel loaders, tractors or tracked vehicles, but not in conventional cars.
  • Furthermore, an upper threshold value of the steering angle can be defined, and the image information can be determined as a function of this upper threshold value. It is conceivable that, for steering angles greater than or equal to the upper threshold value, the image information is independent of the actual value of the steering angle.
  • For example, the image information for all values of the steering angle greater than or equal to the upper threshold value can be identical to the image information for the upper threshold value of the steering angle.
  • The upper threshold value can be selected depending on the direction of travel of the vehicle. In other words, when the direction of travel changes, the upper threshold value can change, in particular at least in terms of magnitude; a sketch of this threshold behaviour follows below.
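  • A minimal sketch of the combined threshold behaviour (no pivoting below the lower threshold value, saturation at the upper threshold value); the threshold values themselves are illustrative assumptions.

    def virtual_yaw(steering_deg: float, lower: float = 30.0,
                    upper: float = 70.0) -> float:
        """Steering-dependent yaw of the virtual camera unit, in degrees."""
        mag = abs(steering_deg)
        if mag < lower:
            return 0.0                       # alignment unchanged
        yaw = min(mag, upper) - lower        # follows the steering angle
        return yaw if steering_deg >= 0 else -yaw  # saturates above `upper`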
  • Optionally, further image data from at least one further camera unit of the vehicle are received, the further image data representing a further sub-area of the vehicle environment adjoining the first and/or the second sub-area or overlapping the first and/or the second sub-area.
  • In particular, the further image data represent a further sub-area of the vehicle environment adjoining the second and/or the first sub-area or overlapping the second and/or the first sub-area.
  • For example, third image data from a third camera unit of the vehicle and fourth image data from a fourth camera unit of the vehicle are received.
  • The third image data represent a third sub-area of the vehicle environment that adjoins the second sub-area or overlaps the second sub-area.
  • The fourth image data represent a fourth sub-area of the vehicle environment that adjoins the third and/or the first sub-area or overlaps the third and/or the first sub-area.
  • The image information relating to the image of the vehicle environment is then additionally determined using the further image data received, in particular the third and fourth image data.
  • It is conceivable that the image of the vehicle environment is an image composed of the second image data and the further image data if the current value of the steering angle lies in a range between a second lower limit value and a further upper limit value.
  • The image of the vehicle environment can be an image composed of the second image data and the third image data if the current value of the steering angle lies in a range between a second lower limit value and a second upper limit value.
  • The image of the vehicle environment can also be an image composed of the third image data and the fourth image data if the current value of the steering angle lies in a range between a third lower limit value and a third upper limit value.
  • The image of the vehicle environment can likewise be an image composed of the fourth image data and the first image data if the current value of the steering angle lies in a range between a fourth lower limit value and a fourth upper limit value.
  • The rule here is that the limit values differ from one another.
  • The image of the vehicle environment can also be composed of at least three sets of image data, for example of the first, second and further image data, if the current value of the steering angle lies in a defined, predetermined and/or predeterminable range; a table-driven sketch of this pair selection follows below.
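  • A minimal table-driven sketch of this selection for a four-camera system; the limit values and camera labels are illustrative assumptions (such wide angle ranges would fit, e.g., the rotation angle of an excavator rather than a car).

    RANGES = [  # (lower limit value, upper limit value, image data sources)
        (-15.0,  15.0, ("front",)),
        ( 15.0,  60.0, ("front", "right")),
        ( 60.0, 120.0, ("right", "rear")),
        (120.0, 180.0, ("rear", "left")),
    ]

    def sources_for(steering_deg: float) -> tuple:
        """Camera(s) whose image data compose the image for this angle."""
        for lower, upper, cams in RANGES:
            if lower <= steering_deg < upper:
                return cams
        return ("front",)  # fallback outside the tabulated ranges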
  • The method can be carried out for more than two, preferably for four, six or more than six camera units, in particular for a multi-camera system or a surround view system, as a result of which particularly good monitoring of the vehicle environment and correspondingly extremely safe operation of the vehicle are possible.
  • Also advantageous is a computer program product or computer program with program code which can be stored on a machine-readable carrier or storage medium, such as a semiconductor memory, a hard disk or an optical memory, and which is used for performing, implementing and/or controlling the steps of the method according to one of the embodiments described above, especially when the program product or program is executed on a computer or a device.
  • FIG. 1 shows a schematic representation of a forklift truck;
  • FIG. 2 shows the forklift truck according to FIG. 1;
  • FIG. 3 shows a plan view of the forklift truck according to FIGS. 1 and 2; and
  • FIG. 4 shows a flow chart of a method for outputting a control signal to a unit of a vehicle.
  • The forklift truck 10 has a first camera unit 12, a second camera unit 14, a third camera unit 16 and a fourth camera unit 18 in order to be able to detect the environment of the forklift truck 10 by means of the camera units 12, 14, 16, 18.
  • The forklift truck 10 further comprises a control unit 20 and a display unit (not shown).
  • The camera units 12, 14, 16, 18 are each designed as CMOS cameras 12, 14, 16, 18 with a detection area or field of view of 192° in the horizontal direction and of more than 94° in the vertical direction.
  • The first camera 12 is arranged on a front roof pillar 22 of the forklift truck 10 and is aimed at a front sub-area of the vehicle environment.
  • Alternatively, the first camera 12 can also be arranged, in particular laterally, on a lifting mast of the forklift truck 10.
  • The first camera 12 is thus designed as a front camera 12.
  • The second camera 14 and the fourth camera 18 are arranged on opposite lateral roof pillars 24, 28 of the forklift truck 10 and are each aimed at a lateral sub-area of the vehicle environment.
  • The second and fourth cameras 14, 18 are thus designed as side cameras 14, 18.
  • The third camera 16 is arranged on a rear roof pillar 26 of the forklift truck 10 and is aimed at a rear sub-area of the vehicle environment. Because of their arrangement, the cameras 12, 14, 16, 18 enable 360° coverage of the surroundings of the forklift truck 10.
  • The control unit 20 is arranged in the area of a driver's cab of the forklift truck 10.
  • The control unit 20 is connected to each of the four cameras 12, 14, 16, 18 by means of a wired electronic connection 32, 34, 36, 38. Image data from the cameras 12, 14, 16, 18 are transmitted to the control unit 20 by means of the connections 32, 34, 36, 38.
  • The control unit 20 is also connected to the display unit by means of a further wired electronic connection (not shown).
  • FIG. 2 shows a representation of the forklift truck 10 according to FIG. 1.
  • In addition, the detection areas 42, 44, 46, 48 of the cameras 12, 14, 16, 18 are shown.
  • The detection areas 42, 44, 46, 48 of cameras 12, 14, 16, 18 arranged adjacently on the forklift truck 10 partially overlap, each pair forming an overlap area 43, 45, 49.
  • An object in the overlap area 43 is therefore captured both by the front camera 12 and by the side camera 14.
  • FIG. 3 shows a top view of the forklift truck 10 according to FIGS. 1 and 2. Shown are the detection areas 42, 44, 46, 48 of the cameras 12, 14, 16, 18 and the overlap areas 43, 45, 47, 49 of the detection areas 42, 44, 46, 48. In addition, joining or stitching edges 52, 54, 56, 58 are shown for each of the overlap areas 43, 45, 47, 49. The image data of the respective cameras 12, 14, 16, 18 are combined or stitched on or along each of the stitching edges 52, 54, 56, 58.
  • Furthermore, an imaginary position of a virtual camera unit 50 on the forklift truck 10 is shown.
  • The imaginary position of the virtual camera unit 50 corresponds to a head position of a vehicle driver sitting on the forklift truck 10.
  • The alignment or orientation of the virtual camera unit 50 can be adapted as a function of steering angle information and travel direction information of the forklift truck 10.
  • The alignment or orientation of the virtual camera unit 50 is preferably rotated, based on the steering angle information and the travel direction information, about a vertical axis of the forklift truck 10 running through the imaginary position of the virtual camera unit 50.
  • The perspective of the virtual camera unit 50 thus corresponds to a vehicle's own perspective in the sense of the perspective of a vehicle driver sitting on the forklift truck 10.
  • The control unit 20 of the forklift truck 10 is set up to permanently receive image data from the cameras 12, 14, 16, 18 of the forklift truck 10.
  • The image data each represent the sub-area 42, 44, 46, 48 of the vehicle environment encompassed by the detection area 42, 44, 46, 48 of the respective camera 12, 14, 16, 18.
  • Accordingly, the sub-areas 42, 44, 46, 48 of the vehicle environment also overlap.
  • The control unit 20 is also set up to receive a value of a steering angle from a control unit which is set up to control the steering unit of the forklift truck 10.
  • The control unit 20 is set up to determine image information relating to an image of the vehicle environment using the received image data and the received steering angle.
  • To this end, the control unit 20 is set up to transform the image data into a representation from the perspective of the virtual camera 50.
  • The image information comprises the transformed image, which has a horizontal image angle that depends on the received steering angle.
  • With a steering angle of 0°, the image has a horizontal image angle of greater than or equal to -50° and less than or equal to 50° when driving forward.
  • When reversing with a steering angle of 0°, the image has a horizontal image angle of greater than or equal to 130° or less than or equal to -130°, i.e. a rearward-facing range crossing ±180°.
  • When driving forward along a trajectory 60 corresponding to a steering angle of 45°, the image has a horizontal image angle of greater than or equal to -5° and less than or equal to 95°; a worked check of these values follows below.
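  • A worked check of the example values: with a 100°-wide window centred on the driving direction, adding the steering angle as an offset reproduces the ranges given above. This additive relation is an assumption consistent with the stated numbers, not something stated explicitly in the description.

    def image_window(steering_deg: float, half_width: float = 50.0) -> tuple:
        """Horizontal image angle range for forward travel."""
        return (steering_deg - half_width, steering_deg + half_width)

    assert image_window(0.0) == (-50.0, 50.0)   # forward, steering angle 0°
    assert image_window(45.0) == (-5.0, 95.0)   # forward, steering angle 45°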
  • In this case, the image is an image composed of the first image data from the first camera 12 and the second image data from the second camera 14.
  • The control unit 20 is set up to output a control signal with the determined image information to the display unit of the forklift truck 10 in order to display the image to an operator of the forklift truck 10 by means of the display unit.
  • FIG. 4 shows a flow chart of a method for outputting a control signal to a unit of a vehicle. The method is provided in its entirety with the reference symbol 100.
  • The method 100 can be carried out while the vehicle is in motion, i.e. while the vehicle is travelling. As an alternative or in addition, the method 100 can also be carried out when the vehicle is stationary and not moving.
  • The vehicle can be, for example, the forklift truck 10 according to FIG. 1.
  • The method 100 comprises a step of capturing 110 an environment of the vehicle by means of a plurality of camera units of the vehicle in order to provide a control unit of the vehicle with image data that represent the vehicle environment.
  • In step 110a, a first sub-area of the vehicle environment is captured by means of a first camera unit of the vehicle, and in step 110b a second sub-area of the vehicle environment is captured by means of a second camera unit of the vehicle.
  • The first camera unit provides first image data that represent the first sub-area of the vehicle environment.
  • The second camera unit provides second image data that represent the second sub-area of the vehicle environment.
  • The first and second image data are transmitted by wire from the respective camera unit to the control unit and are received in step 120 by means of the control unit.
  • The method 100 further includes a step of transforming 130 the first and second image data into the vehicle's own perspective.
  • The vehicle's own perspective is a perspective of a virtual camera unit.
  • The virtual camera unit has a camera pose facing away from the vehicle, i.e. directed towards the vehicle environment.
  • The method 100 includes a step of joining 140 or stitching 140 the transformed first and transformed second image data to form an overall image of the vehicle environment.
  • The transformed first and transformed second image data are put together in the horizontal direction.
  • The method 100 includes a step of receiving 150 steering angle information relating to a steering angle of the vehicle by means of the control unit.
  • The steering angle information here includes a current value of the steering angle and optionally a time course of the steering angle.
  • The method 100 includes a step of receiving 160 travel direction information relating to a current travel direction of the vehicle or a currently selected travel gear of the vehicle.
  • The method 100 further comprises a step of determining 170, by means of the control unit, image information relating to an image of the vehicle environment using the received first and second image data, the received steering angle information and the received travel direction information.
  • That partial image which corresponds to the perspective of the virtual camera onto the vehicle environment is selected as the image.
  • The method includes a step of outputting 190 the control signal as a function of the determined image information to a display unit of the vehicle by means of the control unit.
  • The control signal comprises the determined or selected partial image, i.e. the partial image is transmitted to the display unit by means of the control signal.
  • The method further comprises a step of controlling 200 the unit of the vehicle using the output control signal by means of the control unit.
  • Specifically, the display unit of the vehicle is controlled in order to display the image to an operator of the vehicle; a sketch of the overall loop follows below.
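  • A minimal sketch of steps 110 to 200 as one processing loop, assuming hypothetical camera, sensor and display objects as well as the helper functions sketched above; the reference numerals in the comments follow the flow chart of FIG. 4.

    def run_method_100(cameras, steering_sensor, direction_sensor, display,
                       transform, stitch_all, determine_image):
        while True:
            frames = [cam.capture() for cam in cameras]           # 110
            # 120: the control unit receives the image data (implicit here)
            transformed = [transform(f) for f in frames]          # 130
            overall = stitch_all(transformed)                     # 140
            angle = steering_sensor.read()                        # 150
            direction = direction_sensor.read()                   # 160
            image = determine_image(overall, angle, direction)    # 170
            display.show(image)                                   # 190, 200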
  • If an exemplary embodiment comprises an "and/or" link between a first feature and a second feature, this is to be read such that the exemplary embodiment, according to one specific embodiment, has both the first feature and the second feature and, according to a further specific embodiment, has either only the first feature or only the second feature.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The invention relates to a method for outputting a control signal to a unit of a vehicle (10), comprising the following steps: receiving, by means of a control unit, first image data from a first camera unit (12) of the vehicle (10), the first image data representing a first sub-area (42) of a vehicle environment, and second image data from a second camera unit (14) of the vehicle (10), the second image data representing a second sub-area (44) of the vehicle environment adjoining the first sub-area (42) or overlapping the first sub-area (42); receiving steering angle information relating to a steering angle of the vehicle by means of the control unit; determining, by means of the control unit, image information relating to an image of the vehicle environment using the received first and second image data and the received steering angle information, the image of the vehicle environment being an image transformed into the vehicle's own perspective; and outputting the control signal as a function of the determined image information to the unit of the vehicle (10) by means of the control unit.
PCT/EP2020/079445 2019-10-24 2020-10-20 Procédé de délivrance d'un signal de commande à une unité d'un véhicule WO2021078711A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102019216368.8A DE102019216368A1 (de) 2019-10-24 2019-10-24 Verfahren zum Ausgeben eines Steuersignals an eine Einheit eines Fahrzeugs
DE102019216368.8 2019-10-24

Publications (1)

Publication Number Publication Date
WO2021078711A1 true WO2021078711A1 (fr) 2021-04-29

Family

ID=73030083

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2020/079445 WO2021078711A1 (fr) 2019-10-24 2020-10-20 Procédé de délivrance d'un signal de commande à une unité d'un véhicule

Country Status (2)

Country Link
DE (1) DE102019216368A1 (fr)
WO (1) WO2021078711A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102022203951A1 (de) 2022-04-25 2023-10-26 Psa Automobiles Sa Verfahren und Vorrichtung zum Projizieren von Objekten

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102021211710A1 (de) 2021-10-18 2023-04-20 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zum Ausgeben eines Steuersignals an eine fahrzeugseitige Anzeigeeinheit eines zumindest eine erste und eine zweite Kameraeinheit umfassenden Fahrzeugs

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100013930A1 (en) * 2006-10-11 2010-01-21 Masatoshi Matsuo Video display apparatus and video display method
DE102011014368A1 (de) 2011-03-17 2012-09-20 DSP-Weuffen GmbH Verfahren und Vorrichtung für ein bildgebendes Fahrerassistenzsystem
US20180309962A1 (en) * 2016-01-13 2018-10-25 Socionext Inc. Surround view monitor apparatus

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4467313B2 (ja) * 2004-01-09 2010-05-26 株式会社オートネットワーク技術研究所 車両周辺撮像装置
CN104285441B (zh) * 2012-05-22 2018-01-05 三菱电机株式会社 图像处理装置



Also Published As

Publication number Publication date
DE102019216368A1 (de) 2021-04-29

Similar Documents

Publication Publication Date Title
EP3501897B1 (fr) Système de visualisation permettant d'apprehender l'environnement d'un véhicule
EP3219533B1 (fr) Système de vision pour un véhicule, notamment pour un véhicule utilitaire
EP3510463B1 (fr) Ensemble de capteurs pour un véhicule utilitaire à déplacement autonome et procédé de détection d'images environnantes
EP3135822B1 (fr) Engin automobile et procede d'affichage de l'environnement d'un engin automobile
DE112013002543B4 (de) Displayvorrichtung für Industriemaschine mit Eigenantrieb
DE102016120349A1 (de) Anhängerrückfahrassistenzsystem mit zielmanagement
DE102016210824A1 (de) Anhängungshilfe mit Schwenk-/Zoomansicht sowie virtueller Draufsicht
EP1631835A1 (fr) Dispositif et procede pour determiner une orientation dans l'espace d'une semi-remorque ou d'une remorque
DE102016100326A1 (de) Zielüberwachungssystem mit einer Linsenreinigungsvorrichtung
EP3135823B1 (fr) Engin automobile et procede d'affichage de l'environnement d'un chargeur frontal ou arriere
EP2273471B1 (fr) Procédé de fonctionnement d'un système d'assistance du conducteur d'un véhicule
DE102010048185A1 (de) Selbstfahrende Baumaschine
EP3437929A1 (fr) Système de vision à champs de vision / effet d'incrustation de la zone de vision en fonction de la situation de conduite
DE112013000873T5 (de) Peripherieüberwachungsvorrichtung für Transportfahrzeug
WO2021078711A1 (fr) Procédé de délivrance d'un signal de commande à une unité d'un véhicule
DE102016208372A1 (de) Verfahren, Vorrichtung und ein mobiles Anwendergerät zum Erzeugen einer Fahrerinformation im Zusammenhang mit einem von einem Fahrzeug zu zumindest einem Zeitpunkt überdeckten Geländebereich
DE102015109537B4 (de) Objektvermeidung für ein anhänger-rückfahrassistenzsystem
EP3794927A1 (fr) Machine de travail agricole
DE102016115313A1 (de) Verfahren zum Unterstützen eines Fahrers eines Gespanns, Fahrerassistenzsystem sowie Gespann
EP3380357B1 (fr) Système d'aide à la conduite pourvu d'un dispositif de traitement adaptatif de données d'images
DE102011080720A1 (de) Visualisierung einer Rampenabfahrt
DE102021201728A1 (de) System und verfahren zur übermittlung des vorhandenseins von nahegelegenen objekten in einem arbeitsbereich
EP2603403B1 (fr) Methode d'affichage d'images sur un panneau d'affichage dans un véhicule automobile, système d'aide à la conduite et véhicule
DE102019216147B3 (de) Verfahren zur Bestimmung einer Befahrbarkeit eines Bereichs eines Geländes für ein Fahrzeug
DE102016208373A1 (de) Verfahren, Vorrichtung und ein mobiles Anwendergerät zum Erzeugen einer Fahrerinformation umfassend zumindest ein grafisches Element im Zusammenhang mit zumindest einem Rad des Fahrzeugs

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20797693

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20797693

Country of ref document: EP

Kind code of ref document: A1