US20220144187A1 - Camera system for a trailer hitch system - Google Patents

Camera system for a trailer hitch system

Info

Publication number
US20220144187A1
US20220144187A1 (application US17/521,394)
Authority
US
United States
Prior art keywords
trailer
degree image
vehicle
image view
combined
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/521,394
Other languages
English (en)
Inventor
Christian Sperrle
PhaniKumar K. Bhamidipati
Matthew J. Barton
Ammar Jamal Eddin
Niara Simpson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Priority to US17/521,394
Publication of US20220144187A1
Assigned to ROBERT BOSCH GMBH (assignment of assignors interest; see document for details). Assignors: EDDIN, AMMAR JAMAL; BHAMIDIPATI, PHANIKUMAR K.; BARTON, MATTHEW J.; SPERRLE, CHRISTIAN; SIMPSON, NIARA
Legal status: Abandoned

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/04Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/26Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view to the rear of the vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/27Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/06Automatic manoeuvring for parking
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/09Taking automatic action to avoid collision, e.g. braking and steering
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/18Propelling the vehicle
    • B60W30/18009Propelling the vehicle related to particular drive situations
    • B60W30/18036Reversing
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • G06K9/00805
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265Mixing
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
    • B60R2300/102Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using 360 degree surveillance camera system
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
    • B60R2300/105Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/60Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
    • B60R2300/607Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective from a bird's eye viewpoint
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/802Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for monitoring and displaying vehicle exterior blind spot views
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146Display means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403Image sensing, e.g. optical camera
    • B60W2420/42
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2520/00Input parameters relating to overall vehicle dynamics
    • B60W2520/22Articulation angle, e.g. between tractor and trailer
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2530/00Input parameters relating to vehicle conditions or values, not covered by groups B60W2510/00 or B60W2520/00
    • B60W2530/203Presence of trailer
    • B60W2530/205Dimensions of trailer

Definitions

  • Embodiments relate to automotive camera systems.
  • Vehicles such as automobiles, trucks, SUVs, vans, recreational vehicles, and the like may include a multiple-camera system, sometimes known as a near-range camera system, that can be used to provide a 360-degree view (in 2D or 3D) of the vehicle itself (a top-down or “bird's-eye” view).
  • When a trailer is hitched to the vehicle, a view from one or more of the cameras of the near-range camera system may be obstructed, which may affect the generated 360-degree view. In addition, the 360-degree view will not include the trailer, limiting a user's ability to view objects surrounding the trailer.
  • FIG. 1 is a block diagram of a trailer camera system, according to some embodiments.
  • FIG. 2A is a block diagram of a camera system of the vehicle of the trailer camera system of FIG. 1 , according to some embodiments.
  • FIG. 2B is a block diagram of a camera system of the trailer of the trailer camera system of FIG. 1 , according to some embodiments.
  • FIG. 3 is a block diagram of an electronic controller of the trailer camera system of FIG. 1 , according to some embodiments.
  • FIG. 4 is a flow chart of a method for generating a 360-degree image view of a vehicle coupled to a trailer of the system of FIG. 1 , according to some embodiments.
  • FIG. 5 is a diagram illustrating the generated views from video images captured by the cameras of the system of FIG. 1 .
  • FIG. 6 is an example 360-degree image view generated by the system of FIG. 1 , according to some embodiments.
  • FIG. 7 is a table illustrating a plurality of generated views from video images captured by the cameras of the system of FIG. 1 , according to some embodiments.
  • FIG. 8A is an example of a modified combined 360-degree image view, according to some embodiments.
  • FIG. 8B is an example of a modified combined 360-degree image view, according to some embodiments.
  • FIG. 8C is an example of a modified combined 360-degree image view, according to some embodiments.
  • FIG. 9 is an example of a modified combined 360-degree image view, according to some embodiments.
  • the present specification relates generally to the field of rear camera systems for vehicles.
  • Vehicles such as automobiles, trucks, SUVs, vans, recreational vehicles, and the like may include a rear camera system, sometimes known as a backup camera or reversing camera.
  • the rear camera is configured to capture an image of the area behind the vehicle, generally the area towards the ground.
  • the area may include a blind spot hidden from view of the rear-view mirror and side view mirrors.
  • the image is transferred to a display, allowing the driver to monitor the area behind the vehicle.
  • embodiments may include hardware, software, and electronic components or modules that, for purposes of discussion, may be illustrated and described as if the majority of the components were implemented solely in hardware.
  • the electronic based aspects may be implemented in software (for example, stored on non-transitory computer-readable medium) executable by one or more processors.
  • “control units” and “controllers” described in the specification can include one or more electronic processors, one or more memory modules including non-transitory computer-readable medium, one or more input/output interfaces, and various connections (for example, a system bus) connecting the components.
  • each of the example systems presented herein is illustrated with a single exemplar of each of its component parts. Some examples may not describe or illustrate all components of the systems. Other embodiments may include more or fewer of each of the illustrated components, may combine some components, or may include additional or alternative components.
  • FIG. 1 is a block diagram of one example embodiment of a trailer camera system 100 .
  • the trailer camera system 100 is integrated into a vehicle 102 and a trailer 104 .
  • the vehicle 102 is equipped with a trailer hitch 103 , positioned at the rear of the vehicle 102 .
  • the trailer 104 has a trailer coupling (or coupler) 105 positioned at the front of the trailer 104 .
  • the trailer hitch 103 may be one of numerous kinds of hitches (for example, a ball-type trailer hitch having a ball or, for example, a hitch that is received by a recess of the trailer coupler 105 ) to connect (or hitch) the trailer 104 to the vehicle 102 .
  • the trailer 104 may be one of numerous types of vehicle trailers (for example, an enclosed trailer, vehicle hauling trailer, recreational vehicle (RV) trailer, and the like). While the trailer 104 is described below (in particular, regarding the method 400 in FIG. 4 ) as being an enclosed trailer, this should not be considered limiting. The systems and methods described herein are applicable to other types of trailers.
  • the trailer camera system 100 includes an electronic controller 106 , a human machine interface (HMI) 108 , a display 110 , a first plurality of cameras 112 A positioned on the trailer 104 , a second plurality of cameras 112 B positioned on the vehicle 102 , and other vehicle systems 116 .
  • the electronic controller 106 , the HMI 108 , the display 110 , the plurality of cameras 112 A and 112 B, and the other vehicle systems 116 , as well as other various modules and components of the vehicle 102 are communicatively coupled to each other via wired connections, wireless connections, or some combination thereof.
  • All or parts of the connections used in the system 100 may be implemented using various communication networks, for example, a Bluetooth™ network, a controller area network (CAN), a wireless local area network (for example, Wi-Fi), a wireless accessory personal area network (PAN), and the like.
  • the electronic controller 106 includes a plurality of electrical and electronic components that provide power, operational control, and protection to the components and modules within the electronic controller 106 .
  • the electronic controller 106 includes, among other things, an electronic processor 202 (for example, an electronic microprocessor, microcontroller, or other suitable programmable device), a memory 204 , and an input/output interface 206 .
  • the electronic processor 202 , the memory 204 , and the input/output interface 206 , as well as the other various modules are connected by one or more control or data buses.
  • the electronic controller 106 is implemented partially or entirely in hardware (for example, using a field-programmable gate array (“FPGA”), an application specific integrated circuit (“ASIC”), or other devices).
  • the electronic processor 202 obtains and provides information (for example, from the memory 204 and/or the input/output interface 206 ), and processes the information by executing one or more software instructions or modules, capable of being stored, for example, in a random access memory (“RAM”) area of the memory 204 or a read only memory (“ROM”) of the memory 204 or another non-transitory computer readable medium (not shown).
  • the software can include firmware, one or more applications, program data, filters, rules, one or more program modules, and other executable instructions.
  • the memory 204 can include one or more non-transitory computer-readable media and includes a program storage area and a data storage area.
  • “non-transitory computer-readable media” comprises all computer-readable media but does not consist of a transitory, propagating signal.
  • the program storage area and the data storage area can include combinations of different types of memory, for example, read-only memory (“ROM”), random access memory (“RAM”), electrically erasable programmable read-only memory (“EEPROM”), flash memory, or other suitable digital memory devices.
  • the electronic processor 202 is connected to the memory 204 and executes software, including firmware, one or more applications, program data, filters, rules, one or more program modules, and other executable instructions.
  • the electronic processor 202 retrieves from the memory 204 and executes, among other things, instructions related to the control processes and methods described herein.
  • the input/output interface 206 is configured to receive input and to provide system output.
  • the input/output interface 206 obtains information and signals from, and provides information and signals to (for example, over one or more wired and/or wireless connections) devices and/or components both internal and external to the system 100 .
  • the electronic controller 106 may include additional, fewer, or different components.
  • the electronic controller 106 may include a transceiver or separate transmitting and receiving components, for example, a transmitter and a receiver. Some or all of the components of the electronic controller 106 may be dispersed and/or integrated into other devices/components of the system 100 (for example, a vehicle control module or VCM, not shown, of the vehicle 102 ).
  • the cameras of the pluralities of cameras 112 A and 112 B of the system 100 are video cameras, positioned to capture video images of the areas surrounding the trailer 104 and the vehicle 102 , respectively.
  • one or more of the cameras 112 A and 112 B are moveable (for example, using pan, tilt, or zoom functions) to capture video images of other areas on or around the trailer 104 and/or vehicle 102 .
  • one or more of the plurality of cameras 112 A and 112 B may be part of a back-up video camera system of the vehicle 102 . Backup video cameras are known and will not be described in further detail.
  • FIG. 2A is a block diagram of the plurality of cameras 112 B of the vehicle 102 .
  • each of the cameras of the plurality of cameras 112 B is positioned at a respective portion of the vehicle 102 to capture video images of a respective area surrounding the vehicle 102 .
  • the field of view (image capture) of one or more of the plurality of cameras 112 B may overlap with a field of view of another camera of the plurality 112 B.
  • the fields of view of each of the cameras of the plurality of cameras 112 B collectively capture image data of the complete area surrounding the vehicle 102 .
  • the electronic controller 106 is configured to stitch the images captured by the plurality of cameras 112 B together to generate a 360-degree (“bird's eye”) view of the vehicle 102 and the area surrounding the vehicle 102 .
  • the plurality of cameras 112 A of the trailer 104 are similarly positioned at respective portions of the trailer 104 to capture the complete area surrounding the trailer 104 .
  • the field of view (image capture) of one or more of the plurality of cameras 112 A may overlap with a field of view of another camera of the plurality 112 A.
  • FIG. 2B is a diagram illustrating an example positioning of the plurality of cameras 112 A on the trailer 104 .
  • the electronic controller 106 is configured to stitch the images captured by the plurality of cameras 112 A together to generate a 360-degree (“bird's eye”) view of the trailer 104 and the area surrounding the trailer 104 .
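The application does not prescribe a particular stitching implementation for these bird's-eye views. Purely as an illustration, the Python/OpenCV sketch below shows one conventional way such a top-down composite can be produced: each camera frame is warped onto the ground plane with a precomputed calibration homography and the warped frames are averaged onto a common canvas. The camera names, homographies, and canvas size are assumptions made for the example, not details taken from the application.

```python
# Illustrative sketch only: one conventional way to build a top-down
# ("bird's-eye") composite from several cameras. The ground-plane
# homographies are assumed to come from an offline calibration.
import cv2
import numpy as np

def birds_eye_composite(frames, homographies, canvas_size=(800, 800)):
    """Warp each camera frame onto the ground plane and blend the results.

    frames       -- dict of camera name -> BGR image (numpy array)
    homographies -- dict of camera name -> 3x3 ground-plane homography
    canvas_size  -- (width, height) of the composite in pixels
    """
    w, h = canvas_size
    acc = np.zeros((h, w, 3), np.float32)     # accumulated color
    weight = np.zeros((h, w, 1), np.float32)  # number of cameras covering a pixel

    for name, frame in frames.items():
        warped = cv2.warpPerspective(frame, homographies[name], (w, h))
        warped = warped.astype(np.float32)
        covered = (warped.sum(axis=2, keepdims=True) > 0).astype(np.float32)
        acc += warped * covered
        weight += covered

    composite = acc / np.maximum(weight, 1.0)  # average where views overlap
    return composite.astype(np.uint8)
```

In practice the overlap regions would typically use feathered or seam-based blending rather than a plain average, but the overall structure is the same.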
  • each of the pluralities of cameras 112 A and 112 B may include more or fewer cameras than illustrated.
  • the HMI 108 provides an interface between the vehicle 102 and the driver.
  • the HMI 108 is communicatively coupled to the electronic controller 106 and receives input from the driver, receives information from the electronic controller 106 , and provides feedback (for example, audio, visual, haptic, or a combination thereof) to the driver based on the received information.
  • the HMI 108 provides suitable input mechanisms, for example, a button, a touch-screen display having menu options, voice recognition, and the like for providing inputs from the driver that may be used by the electronic controller 106 as it controls the vehicle 102 .
  • the HMI 108 provides visual output, for example, a graphic user interface having graphical elements or indicators (for example, fixed or animated icons), lights, colors, text, images (for example, from the cameras 112 A and 112 B), combinations of the foregoing, and the like.
  • the HMI 108 includes a suitable display device, for example the display 110 , for displaying the visual output, for example, an instrument cluster, a mirror, a heads-up display, a center console display screen (for example, a liquid crystal display (LCD) touch screen, or an organic light-emitting diode (OLED) touch screen), or through other suitable devices.
  • the HMI 108 includes a graphical user interface (GUI) (for example, generated by the electronic controller 106 , from instructions and data stored in the memory, and presented on a center console display screen) that enables a user to interact with the system 100 .
  • the HMI 108 may also provide audio output to the driver, for example, a chime, buzzer, voice output, or other suitable sound through a speaker included in the HMI 108 or separate from the HMI 108 .
  • the HMI 108 includes components configured to provide haptic outputs to the driver, for example, to vibrate one or more vehicle components (for example, the vehicle's steering wheel and the driver's seat), for example, through the use of a vibration motor.
  • HMI 108 provides a combination of visual, audio, and haptic outputs.
  • the HMI 108 causes the visual, audio, and haptic outputs to be produced by a smart phone, a smart tablet, a smart watch, or other portable or wearable electronic device communicatively coupled to the vehicle 102 .
  • the other vehicle systems 116 include controllers, sensors, actuators, and the like for controlling aspects of the operation of the vehicle 102 (for example, acceleration, braking, shifting gears, and the like).
  • the other vehicle systems 116 are configured to send and receive data relating to the operation of the vehicle 102 to and from the electronic controller 106 .
  • the system 100 may include a steering controller 118 coupled to a steering system (not shown) of the vehicle 102 .
  • the steering controller 118 may be configured to automatically steer the vehicle 102 in response to commands received from, among other things, the electronic controller 106 .
  • the steering controller 118 may also receive steering commands from a steering wheel of the vehicle 102 (for example, in a “drive by wire” system).
  • the electronic controller 106 is configured to perform a parking and/or reverse assist function to guide (visually or automatically via the steering controller 118 ) the vehicle 102 (with or without the trailer 104 ) into a user-desired area surrounding the vehicle 102 to park.
  • FIG. 4 illustrates an exemplary method 400 for generating a 360-degree image view display of the vehicle 102 coupled to the trailer 104 .
  • the method 400 is explained in terms of the electronic controller 106 , in particular the electronic processor 202 .
  • portions of the method 400 may be distributed among multiple devices (for example, one or more additional controllers/processors of the system 100 ).
  • the electronic processor 202 receives a first plurality of video images from the plurality of cameras 112 A positioned on the trailer 104 and, at block 404 , receives a second plurality of video images from the plurality of cameras 112 B positioned on the vehicle 102 .
  • the electronic processor 202 determines a trailer angle of the trailer 104 in relation to the vehicle 102 (for example, via an image analysis of one or more images from one or more of the cameras 112 A and 112 B or a trailer angle sensor, which is not shown, within the trailer hitch 103 ).
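The application leaves open whether the trailer angle comes from a hitch-mounted angle sensor or from image analysis. Purely as an illustrative sketch, the helper below accepts either source; the tracked drawbar point, the ground-plane coordinate convention, and the function name are assumptions introduced for this example.

```python
# Illustrative sketch: obtain a trailer angle either from a hitch-angle sensor
# reading (if available) or from the ground-plane position of a tracked point
# on the trailer drawbar seen by the rear camera.
import math

def trailer_angle_deg(sensor_reading=None, drawbar_point=None,
                      hitch_point=(0.0, 0.0)):
    """Return the trailer angle in degrees relative to the vehicle's long axis.

    sensor_reading -- angle in degrees from a hitch-angle sensor, if available
    drawbar_point  -- (x, y) ground-plane position in meters of a tracked point
                      on the trailer drawbar (x forward, y to the left)
    hitch_point    -- (x, y) ground-plane position of the hitch ball
    """
    if sensor_reading is not None:
        return sensor_reading                  # prefer the direct measurement
    dx = drawbar_point[0] - hitch_point[0]     # drawbar lies behind the hitch, dx < 0
    dy = drawbar_point[1] - hitch_point[1]
    # Angle of the trailer centerline versus pointing straight behind the vehicle.
    return math.degrees(math.atan2(dy, -dx))
```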
  • the electronic processor 202 generates a first 360-degree image view of an area surrounding the trailer 104 based on an image stitching of the first plurality of video images from the first plurality of cameras 112 A and, at block 410 , generates a second 360-degree image view of an area surrounding the vehicle 102 based on an image stitching of the second plurality of video images from the second plurality of cameras 112 B.
  • the electronic processor 202 generates a combined 360-degree image view from the first 360-degree image view and the second 360-degree image view based on the trailer angle and, at block 414 , displays the combined 360-degree image view on a display (for example, the display 110 ).
  • the combined 360-degree image view includes both the area surrounding the trailer and the area surrounding the vehicle.
  • the combined 360-degree image view is, in some embodiments, a blend/image stitching of the first and second 360-degree image views.
  • the first and second 360-degree image views are combined via a blending/stitching algorithm such that the resulting combined 360-degree image view is a top-down view of the vehicle 102 coupled to the trailer 104 and the area surrounding both.
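The application does not spell out the combining algorithm. The sketch below is one plausible reading of this step, assuming both bird's-eye views share the same pixels-per-meter scale: the trailer view is rotated about its coupler by the measured trailer angle and overlaid on a canvas that also carries the vehicle view, with both views anchored at the hitch point. The canvas layout, anchor, and priority-blend rule are illustrative assumptions.

```python
# Illustrative sketch: compose one top-down view of vehicle + trailer by
# rotating the trailer's bird's-eye image about the coupler by the trailer
# angle and overlaying it on a shared canvas with the vehicle's view.
import cv2
import numpy as np

def combine_views(vehicle_view, trailer_view, trailer_angle_deg,
                  hitch_px_vehicle, hitch_px_trailer, canvas_size=(1200, 800)):
    """All inputs are bird's-eye images at the same pixels-per-meter scale.

    hitch_px_vehicle -- (x, y) pixel location of the hitch ball in the vehicle view
    hitch_px_trailer -- (x, y) pixel location of the coupler in the trailer view
    """
    w, h = canvas_size
    anchor = (w // 2, h // 3)          # where the hitch/coupler lands on the canvas

    # Translate the vehicle view so its hitch point sits at the anchor.
    M_veh = np.float32([[1, 0, anchor[0] - hitch_px_vehicle[0]],
                        [0, 1, anchor[1] - hitch_px_vehicle[1]]])
    vehicle_warped = cv2.warpAffine(vehicle_view, M_veh, (w, h))

    # Rotate the trailer view about its coupler by the trailer angle, then
    # translate so the coupler also sits at the anchor.
    M_trl = cv2.getRotationMatrix2D(hitch_px_trailer, trailer_angle_deg, 1.0)
    M_trl[0, 2] += anchor[0] - hitch_px_trailer[0]
    M_trl[1, 2] += anchor[1] - hitch_px_trailer[1]
    trailer_warped = cv2.warpAffine(trailer_view, M_trl, (w, h))

    # Simple priority blend: vehicle pixels win where both views have content.
    combined = vehicle_warped.copy()
    empty = combined.sum(axis=2) == 0
    combined[empty] = trailer_warped[empty]
    return combined
```

A production implementation would blend the seam around the hitch and handle differing resolutions, but the geometric idea, rotating one view about the articulation point by the trailer angle, is the core of this step.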
  • FIG. 5 illustrates a diagram 500 including the vehicle 102 and the trailer 104 .
  • Boxes 502 A and 502 B indicate the 360-degree view of the area surrounding the vehicle 102 and the trailer 104 ( 504 A and 504 B respectively).
  • Box 504 C indicates the resulting viewable resolution of the combined 360-degree image view.
  • FIG. 6 is an example of a resulting combined 360-degree image view 600 .
  • the electronic processor 202 is configured to generate the combined 360-degree image view based on additional factors. For example, in some embodiments, the electronic processor 202 generates the combined 360-degree image view based on a position (x, y, z, pitch, roll, yaw) of one or more of the first plurality of cameras 112 A in relation to a position of one or more of the second plurality of cameras 112 B so both the 360-degree image views of the trailer 104 and the vehicle 102 , when stitched together, visually appear to have been captured from approximately the same point of view (for example, scaled, zoomed, cropped, panned, skewed, and the like).
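As a minimal sketch of the scaling part of that alignment (pitch, roll, and yaw are normally absorbed by each rig's own ground-plane homography during calibration), the helper below resamples a view from its native pixels-per-meter resolution to a shared target resolution; the parameter names are assumptions for the example.

```python
# Illustrative sketch: bring two independently generated top-down views to a
# common pixels-per-meter scale before stitching, so the composite looks as if
# it were captured from a single overhead viewpoint.
import cv2

def to_common_scale(view, ppm_source, ppm_target):
    """Rescale a bird's-eye image from its native pixels-per-meter to the target."""
    factor = ppm_target / ppm_source
    return cv2.resize(view, None, fx=factor, fy=factor,
                      interpolation=cv2.INTER_LINEAR)
```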
  • the electronic processor 202 is configured to modify the combined 360-degree image view based on whether the vehicle 102 is turning (for example, turning onto a road or changing lanes on a road).
  • the electronic processor 202 may determine that the vehicle is turning based on a steering angle of the steering wheel (determined, for example, via a steering wheel angle sensor, not shown).
  • the electronic processor 202 determines that the vehicle 102 is turning, or going to turn, based on information from a route planning/navigation assistance system being used by a driver of the vehicle 102 .
  • FIG. 7 is a table illustrating how a resulting rotation (during turning) of the vehicle 102 or the trailer 104 (as well as a length of the trailer 104 ) affects a viewable resolution area of the combined 360-degree image view. As illustrated, both factors may leave areas of missing image information as the rotating vehicle 102 or trailer 104 (and their respective cameras 112 B and 112 A) move out of the resolution box.
  • the electronic processor 202 may be configured to fill in (via augmentation) the areas of missing image information to increase the size of the total viewable resolution box. The electronic processor 202 may then pan, zoom, and/or scale a desired area of the viewable resolution area to display on the display 110 , as shown in the modified combined 360-degree image views in FIG. 7 .
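A minimal sketch of this display-preparation step, assuming uncovered pixels are simply filled with a neutral placeholder color and the region of interest has been chosen elsewhere (for example, from the trailer angle or a user input):

```python
# Illustrative sketch: fill areas of the combined view that no camera covers
# with a neutral color, then crop and scale a region of interest to the
# display resolution.
import cv2

def prepare_for_display(combined, roi, display_size=(1280, 720),
                        fill_color=(64, 64, 64)):
    """combined -- composite top-down image; roi -- (x, y, w, h) crop window."""
    out = combined.copy()
    uncovered = out.sum(axis=2) == 0    # pixels reached by no camera
    out[uncovered] = fill_color         # neutral placeholder "augmentation"
    x, y, w, h = roi
    cropped = out[y:y + h, x:x + w]
    return cv2.resize(cropped, display_size, interpolation=cv2.INTER_LINEAR)
```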
  • Modification of the combined 360-degree image view may also be based on one or more dimensions (for example, width or height) of the trailer 104 .
  • the electronic processor 202 may determine the dimension information of the trailer 104 , for example, directly from a user input (for example, via HMI 108 ) or automatically calculated via video analysis from images from one or more of the cameras 112 A and 112 B.
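As a small illustrative sketch of the automatic case, assuming a bounding box of the trailer has already been detected in the trailer's bird's-eye view and the view's metric scale is known (both the detection step and the names below are assumptions):

```python
# Illustrative sketch: convert a trailer bounding box measured in the top-down
# view into metric width and length using the view's pixels-per-meter scale.
def trailer_dimensions_m(bbox_px, pixels_per_meter):
    """bbox_px -- (x, y, w, h) bounding box of the trailer in the top-down view.
    Returns (width_m, length_m)."""
    _, _, w, h = bbox_px
    return w / pixels_per_meter, h / pixels_per_meter
```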
  • the electronic processor 202 may also modify the combined 360-degree image view based on a user input (for example, received via HMI 108 ).
  • the combined 360-degree image view includes one or more augmented indications or items.
  • the electronic processor 202 is configured to augment an indication of one or more dimensions of the trailer 104 into the combined 360-degree image view.
  • the electronic processor 202 is configured to determine a predicted trajectory of the trailer 104 based on the trailer angle (or, in embodiments where the vehicle 102 includes a trailer back-up assist system, a desired trajectory of the trailer 104 ) and augment the combined 360-degree image view to include an indication of the predicted and/or desired trajectory.
  • Example combined views 800 A- 800 C are shown in FIGS. 8A-8C respectively.
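The application does not specify the trajectory model behind the predicted-trajectory overlay. One common choice, shown here purely as a sketch, is a kinematic single-track vehicle-trailer model integrated over a short horizon; the lengths, horizon, and time step are example assumptions, and the hitch is taken to sit at the vehicle's rear axle for simplicity.

```python
# Illustrative sketch: predict the trailer path with a simple kinematic
# vehicle-trailer model given vehicle speed, front-wheel steering angle, the
# current trailer (hitch) angle, and the relevant lengths.
import math

def predict_trailer_path(v, steer_rad, hitch_rad, wheelbase, hitch_to_axle,
                         horizon_s=3.0, dt=0.1):
    """Return a list of (x, y) trailer-axle positions in vehicle-start coordinates.

    v             -- vehicle speed in m/s (negative while reversing)
    steer_rad     -- front-wheel steering angle in radians
    hitch_rad     -- current trailer angle (vehicle heading minus trailer heading)
    wheelbase     -- vehicle wheelbase in meters
    hitch_to_axle -- distance from the hitch to the trailer axle in meters
    """
    x = y = theta = 0.0          # vehicle rear-axle pose (heading theta)
    psi = theta - hitch_rad      # trailer heading
    path = []
    for _ in range(int(horizon_s / dt)):
        theta += (v / wheelbase) * math.tan(steer_rad) * dt       # vehicle yaw rate
        psi += (v / hitch_to_axle) * math.sin(theta - psi) * dt   # trailer yaw rate
        x += v * math.cos(theta) * dt
        y += v * math.sin(theta) * dt
        # Trailer axle sits hitch_to_axle behind the hitch along the trailer heading.
        path.append((x - hitch_to_axle * math.cos(psi),
                     y - hitch_to_axle * math.sin(psi)))
    return path
```

For a back-up assist overlay the same equations are used with a negative speed; the resulting points would then be mapped into the combined view using the composite's pixels-per-meter scale.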
  • the combined 360-degree image view may be utilized by the electronic processor 202 as part of a trailer back-up assist program. For example, when displayed on the display, a user may touch and drag the indication of the desired trajectory (for example, trajectory line 902 in the combined view 900 of FIG. 9 ) to modify the desired trajectory and the electronic processor 202 may update the trailer back-up assist program accordingly.
  • the electronic processor 202 is configured to identify one or more objects within the combined 360-degree image view and augment the combined 360-degree image view to highlight the object.
  • the object may be a stationary or moving object (for example a pedestrian, a bicycle, a vehicle, and the like).
  • the electronic processor 202 is configured to detect an object (within or outside the region within the combined 360-degree image view) that may intersect with a predicted/desired trajectory of the trailer 104 and, in response, augment the combined 360-degree image view with a visual indication of the object.
  • An object not within the combined 360-degree image view may be detected by the electronic processor 202 if the object is within a field of view of the one or more cameras 112 A and 112 B of the system 100 .
  • the electronic processor 202 is configured to predict a trajectory of an object (for example, if the object is a moving object) within or outside of the region within the combined 360-degree image view and determine whether a collision between the trailer 104 and the moving object may occur (for example, based on a predicted trajectory of the trailer 104 and/or a distance of the object's predicted trajectory to the trailer 104 ). The electronic processor 202 may then augment a visual indication into the combined 360-degree image view to alert the driver of the possible collision. The electronic processor 202 may additionally provide one or more indications (for example, an audible or haptic alert) to the user of the vehicle 102 to notify the user of a detected object and/or possible collision. In some embodiments, the electronic processor 202 may be configured to, following an initial indication of a possible collision, automatically control the vehicle 102 so as to avoid the possible collision (for example, automatically brake the vehicle 102 ).
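As an illustrative sketch of such a check, assuming a constant-velocity prediction for the detected object and a predicted trailer path like the one from the trajectory sketch above; the distance threshold and time step are assumptions:

```python
# Illustrative sketch: constant-velocity prediction for a detected moving
# object plus a minimum-distance check against the predicted trailer path.
import math

def collision_risk(trailer_path, obj_pos, obj_vel, dt=0.1, threshold_m=1.0):
    """trailer_path -- list of (x, y) predicted trailer positions, one per dt
    obj_pos, obj_vel -- current object position (m) and velocity (m/s)

    Returns (risk, time_to_conflict_s); risk is True if the predicted object
    position comes within threshold_m of the predicted trailer position.
    """
    for i, (tx, ty) in enumerate(trailer_path):
        ox = obj_pos[0] + obj_vel[0] * i * dt
        oy = obj_pos[1] + obj_vel[1] * i * dt
        if math.hypot(ox - tx, oy - ty) < threshold_m:
            return True, i * dt
    return False, None
```

A production system would compare against the trailer's full footprint and allow for sensing uncertainty rather than using a single point-to-point distance, but the gating logic that triggers the visual, audible, or haptic alert (and, optionally, automatic braking) is similar.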
  • embodiments provide, among other things, a trailer hitch guidance system including a human machine interface.

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/521,394 US20220144187A1 (en) 2020-11-06 2021-11-08 Camera system for a trailer hitch system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202063110755P 2020-11-06 2020-11-06
US17/521,394 US20220144187A1 (en) 2020-11-06 2021-11-08 Camera system for a trailer hitch system

Publications (1)

Publication Number Publication Date
US20220144187A1 2022-05-12

Family

ID=81256123

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/521,394 Abandoned US20220144187A1 (en) 2020-11-06 2021-11-08 Camera system for a trailer hitch system

Country Status (3)

Country Link
US (1) US20220144187A1 (de)
CN (1) CN114449214A (de)
DE (1) DE102021212050A1 (de)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150145995A1 (en) * 2013-11-22 2015-05-28 At&T Intellectual Property I, L.P. Enhanced view for connected cars
US20170280091A1 (en) * 2014-08-18 2017-09-28 Jaguar Land Rover Limited Display system and method
US20170341583A1 (en) * 2016-05-27 2017-11-30 GM Global Technology Operations LLC Systems and methods for towing vehicle and trailer with surround view imaging devices
US20180101736A1 (en) * 2016-10-11 2018-04-12 Samsung Electronics Co., Ltd. Method for providing a sight securing image to vehicle, electronic apparatus and computer readable recording medium therefor
US20200074735A1 (en) * 2018-08-30 2020-03-05 Valeo Comfort And Driving Assistance Conditional availability of vehicular mixed-reality
US20200164799A1 (en) * 2018-11-28 2020-05-28 Valeo Comfort And Driving Assistance Mixed reality view for enhancing pedestrian safety

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150286878A1 (en) 2014-04-08 2015-10-08 Bendix Commercial Vehicle Systems Llc Generating an Image of the Surroundings of an Articulated Vehicle
DE102016109954A1 Method for assisting a driver of a vehicle-trailer combination when maneuvering the combination, driver assistance system, and motor vehicle
DE102016115313A1 Method for assisting a driver of a vehicle-trailer combination, driver assistance system, and vehicle-trailer combination

Also Published As

Publication number Publication date
CN114449214A (zh) 2022-05-06
DE102021212050A1 (de) 2022-05-12

Similar Documents

Publication Publication Date Title
US11763573B2 (en) Vehicular control system
US20220144169A1 (en) Rear-view camera system for a trailer hitch system
EP2487906B1 Control device and surroundings monitoring device for a vehicle
CN108621943B System and method for dynamically displaying images on a vehicle electronic display
WO2002089485A1 Method and device for displaying an image from a camera on board a vehicle
US20130096820A1 Virtual display system for a vehicle
JP2005311868A Vehicle periphery viewing device
WO2018150642A1 Periphery monitoring device
JP2010116086A In-vehicle display device, display method, and display program
KR102288950B1 Vehicle and control method thereof
JP2017111739A Driving support device and driving support method
CN111094115B Method, device, and readable medium for operating a display unit of a vehicle
JP5195776B2 Vehicle periphery monitoring device
US20220144187A1 Camera system for a trailer hitch system
EP2481636A1 Parking assistance system and method
US11495193B2 Vehicle display system and vehicle display method
US9940908B2 Display control device
US11457181B1 Vehicle maneuver assist with environment-fixed overhead composite image
WO2019073885A1 Driving support device
US20120086798A1 System and method for automatic dynamic guidelines
US20200231099A1 Image processing apparatus
CN112449625B Method, system, and trailer combination for assisting in maneuvering a trailer combination
US20200148222A1 Driving support device
JP2022052416A Image processing device

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: ROBERT BOSCH GMBH, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SPERRLE, CHRISTIAN;BHAMIDIPATI, PHANIKUMAR K.;BARTON, MATTHEW J.;AND OTHERS;SIGNING DATES FROM 20211101 TO 20220823;REEL/FRAME:061399/0854

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION