GB2469438A - Displaying movement of an object - Google Patents

Displaying movement of an object

Info

Publication number
GB2469438A
GB2469438A
Authority
GB
United Kingdom
Prior art keywords
vehicle
image
area
movement
obscure
Prior art date
Legal status
Granted
Application number
GB0904049A
Other versions
GB0904049D0 (en)
GB2469438B (en)
Inventor
Trevor Kellway
Current Assignee
APPLIC SOLUTIONS
Original Assignee
APPLIC SOLUTIONS
Priority date
Filing date
Publication date
Application filed by APPLIC SOLUTIONS
Priority to GB0904049.4A
Publication of GB0904049D0
Publication of GB2469438A
Application granted
Publication of GB2469438B
Legal status: Active


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60D VEHICLE CONNECTIONS
    • B60D1/00 Traction couplings; Hitches; Draw-gear; Towing devices
    • B60D1/24 Traction couplings; Hitches; Draw-gear; Towing devices characterised by arrangements for particular functions
    • B60D1/36 Traction couplings; Hitches; Draw-gear; Towing devices characterised by arrangements for particular functions for facilitating connection, e.g. hitch catchers, visual guide means, signalling aids
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/002 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles specially adapted for covering the peripheral part of the vehicle, e.g. for viewing tyres, bumpers or the like
    • B60R1/003 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles specially adapted for covering the peripheral part of the vehicle, e.g. for viewing tyres, bumpers or the like for viewing trailer hitches
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/254 Analysis of motion involving subtraction of images
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/165 Anti-collision systems for passive traffic, e.g. including static obstacles, trees
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30181 Earth observation
    • G06T2207/30188 Vegetation; Agriculture
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T2207/30261 Obstacle

Abstract

A driving assistance system is provided for a vehicle (1, figure 1) and comprises an image pick-up means 101 for receiving an image captured by a camera (2). The image is analysed to identify a static portion 104 which is obscured, e.g. by part of the vehicle, and a dynamic viewable portion 103. An image of an object in the dynamic viewable portion is recorded by recording means 106 and subsequent movement of the vehicle is monitored by motion tracking means 105. An image synthesis and recomposition means 107 then generates a synthesised image of the object after it has passed at least partially into the static obscured portion, using the information from the motion tracking means to identify the correct location of the object. As a result, a driver of the vehicle may receive information regarding the location of objects relative to the vehicle even when they have passed into a region which cannot be viewed by the camera. This finds particular utility, for example, when the driver wishes to couple the vehicle to a trailer (4), in which case a hitch (3) mounted to the vehicle must be manoeuvred so as to coincide with a coupling point (5) of the trailer.

Description

DISPLAYING MOVEMENT OF AN OBJECT
Field of the Invention
The present invention relates to a method and apparatus for displaying movement of an object, in particular, but not exclusively, as the object passes from a viewable area to an obscure area.
Background to the Invention
Recent developments in both camera and display technology have enabled the provision of a wide range of visual aids. These aids can enhance images observed by a user, and can also provide visual information about regions that the user cannot observe.
In many systems, the purpose of the visual aid is to improve the user's ability to interact with the physical world by providing information that was previously obscure or unclear.
One particular example of a visual aid of this type is referred to as a Driver Assistance System. Driver Assistance Systems display to a driver of a vehicle images picked up by a camera observing the rear of the vehicle and its surroundings. Such systems are becoming widespread. They are designed to assist drivers when they are reversing by providing a superior view to that given by conventional rear-view mirrors.
Often, such systems employ wide-angle cameras to cover a wide visible range in order to provide the maximum possible viewing area to the driver, allowing him or her to avoid collisions with obstacles located behind the vehicle.
Although these Driver Assistance Systems can provide valuable assistance when reversing a car or similar vehicle, there are limits to the accuracy and utility of the information conventionally provided.
For example, should a driver wish to attach a trailer, caravan or the like to a vehicle, it is typical for the vehicle to comprise a hitch, such as a tow ball or other type of coupling point, at its rear. The driver is required to reverse the vehicle so that this hitch coincides with a corresponding coupling point on the trailer. In some examples, there are a number of coupling points that need to be aligned, such as for a "three-point hitch" used by tractors and farm machinery. This is a cause of considerable difficulty, and although Driver Assistance Systems as described above can be of some help, they do not entirely solve the problem. Reasons for this include the sometimes awkward viewing angle of the rear-view camera and parallax effects resulting from the wide-angle nature of typical lenses. In some examples, it has been suggested to electronically manipulate the image to correct some of the most egregious distortions introduced by the wide-angle lens system. However, the improvement arising from such approaches is limited.
It has been proposed to enhance Driver Assistance Systems by providing means to predict the motion of the vehicle. This is done by taking control inputs from the vehicle's steering angle and superimposing a predicted locus or path of the vehicle on the rear-view camera image displayed to the user. The user is then able to manipulate the locus by turning the steering wheel, and once the locus coincides with the trailer's coupling point the user knows that he or she need only reverse the vehicle with the steering wheel in that position until the hitch on the vehicle and the coupling point on the trailer coincide.
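By way of illustration, such a locus might be computed as in the following minimal Python sketch, which assumes a kinematic bicycle model and a pre-calibrated homography H mapping ground-plane coordinates to image pixels; neither the motion model nor the calibration method is specified in the patent, and the function name, sample spacing and tolerance are illustrative assumptions:

    import numpy as np
    import cv2

    def predicted_locus(steer_rad, wheelbase_m, H, n_points=20, step_m=0.25):
        """Predict the ground path of the vehicle's hitch while reversing at
        a fixed steering angle, projected into rear-camera pixel coordinates.

        steer_rad   -- current steering angle in radians (0 = straight back)
        wheelbase_m -- distance between the front and rear axles
        H           -- 3x3 homography mapping ground-plane metres to pixels,
                       assumed to come from a one-off calibration
        """
        if abs(steer_rad) < 1e-3:                # effectively straight: a line
            pts = [(0.0, -i * step_m) for i in range(n_points)]
        else:
            R = wheelbase_m / np.tan(steer_rad)  # signed turning radius
            pts = []
            for i in range(n_points):
                phi = i * step_m / R             # arc angle after i steps
                # circular arc about the turning centre at (R, 0), reversing
                pts.append((R - R * np.cos(phi), -R * np.sin(phi)))
        ground = np.array(pts, dtype=np.float32).reshape(-1, 1, 2)
        return cv2.perspectiveTransform(ground, H).reshape(-1, 2)

The arc is sampled in the vehicle's ground frame and only then projected, so the same geometry serves any camera once H is known.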
In order to maximize the utility of such Driver Assistance Systems, it is desirable to mount the rear-view camera on the vehicle with a clear, unobstructed view of both the vehicle's hitch and the trailer's coupling point. However, this is not always possible, due to the shape of modern vehicles and the limited availability of suitable mounting points. In such cases, it may not be possible to view either the hitch or the coupling point. This clearly limits the level of assistance offered by the Driver Assistance Systems, as it is necessary for the driver to know the location of both the hitch and the coupling point in order to bring them together.
Efforts have been made to mitigate this difficulty by superimposing an image of the hitch on the image shown to the driver, even if this is not in the line of sight of the camera.
This is possible because the hitch does not move relative to the camera, so the superimposed hitch can be arranged in the correct location relative to the real image when the device is set up, with confidence that this will permanently remain a true representation of reality. However, such a technique cannot be applied to the coupling point of the trailer, which moves relative to the vehicle, and as such can only provide limited assistance to the driver, especially when there is a significant region obscured from view that the coupling point must pass through to reach the hitch.
As a result, conventional systems often provide insufficient assistance to the driver in the coupling of vehicles to trailers or the like. More generally, conventional visual aids are of limited benefit in circumstances in which an object of interest is obscured from the view of the image capture device used.
Summary of the Invention
According to a first aspect of the present invention, there is provided a method for displaying movement of an object, comprising the steps of: capturing an image of the object in a viewable area; monitoring movement of the object as it passes to an obscure area; generating synthesised images of the object after it has passed at least partially into the obscure area, the synthesised images including the captured image relocated in the synthesised images according to the monitored movement; and displaying the synthesised images.
According to a second aspect of the present invention, there is provided an apparatus for displaying movement of an object, comprising: imaging means arranged to capture an image of the object in a viewable area; a processor arranged to monitor movement of the object as it passes to an obscure area, and arranged to generate synthesised images of the object after it has passed at least partially into the obscure area, the synthesised images including the captured image relocated in the synthesised images according to the monitored movement; and a display arranged to display the synthesised images.
The present invention allows images of an object to be produced even after it has passed into an obscure region that cannot be directly observed. Information gleaned from an area that can be seen can be used to synthesise images of the object in an area which is obscured from view. As such, the present invention facilitates the management and manipulation of objects in an obscure area. The movement of the object may be relative rather than absolute; it is not necessary for the object itself to move for such relative movement to occur, as it may be the result of the movement of other features. The movement may be relative to the imaging means. In other words, the object may be stationary relative to its environment/the background but move relative to the imaging means as the imaging means itself moves.
The present invention finds particular utility in assisting in the reversing of vehicles.
For example, the object may be a trailer having a coupling point, and the user may desire to reverse such that the coupling point engages with a hitch on the vehicle. The present invention provides assistance in reaching this aim even if the hitch lies in an obscure area which cannot be viewed directly.
Preferably, the captured image is a photographic image. In this way, the synthesised images can include a direct recreation of the image of the object when viewable. That is to say, a user of the system can be presented with a convincing likeness in the obscure area, ensuring that the nature of the object is clear. However, it is to be understood that the captured image may alternatively be an indirect representation of the object, rather than a photographic image. For example, the captured image may be a computer generated representation of the object. This representation need not necessarily reflect the actual visual appearance of the object.
Preferably, the viewable area and the obscure area are within the field of view of an imaging means. In preferred embodiments, the obscure area is defined by a foreground object in the field of view. The obscure area in this example is thus a region in which the foreground object obscures those behind it.
In some preferred embodiments, the step of monitoring movement of the object comprises observing movement of the object in the viewable area. In this way, the movement of the object in the viewable area is used to ensure it is accurately located when synthesised in the obscure area. That is to say, movement of the object is directly monitored. This finds particular utility when part of an object is in the viewable area, while a remaining part lies in the obscure area.
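One way such direct monitoring might be realised is template matching against the recorded image of the object; a minimal Python sketch, assuming grayscale frames and that normalised cross-correlation is adequate (the patent leaves the tracking technique open, and the function name is an assumption):

    import cv2

    def track_object(frame_gray, template_gray):
        """Locate the recorded patch of the object in the current frame.

        Returns the (x, y) top-left corner of the best match and its score;
        a falling score suggests the object is passing out of the viewable
        area into the obscure area.
        """
        result = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
        _, score, _, top_left = cv2.minMaxLoc(result)
        return top_left, score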
Alternatively or additionally, movement of the object may be indirectly monitored from other available information. For example, the step of monitoring movement of the object may depend on an input received from a vehicle having a fixed spatial relationship to the obscure area. That is to say, the movement of the object may be determined according to the movement of the vehicle. The vehicle may be controlled by a driver from a position in the vehicle, may be remote-controlled, or may move autonomously. The input may comprise information regarding the trajectory and velocity of the vehicle. In a specific example, the input gives information regarding the current status of the steering wheel of the vehicle.
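The indirect alternative might be sketched as dead reckoning: the object's last known position in the vehicle frame is advanced using the vehicle's own motion. The function below is an illustrative small-time-step approximation under bicycle-model kinematics; the 0.04 s default matches the 25 frames per second mentioned later in the description, and all names and sign conventions are assumptions:

    import math

    def dead_reckon(obj_x, obj_y, speed_mps, steer_rad, wheelbase_m, dt=0.04):
        """Advance a world-stationary object's position in the vehicle frame
        by one time step (+y points forward; pass a negative speed when
        reversing).
        """
        yaw_rate = speed_mps * math.tan(steer_rad) / wheelbase_m
        dtheta = yaw_rate * dt                  # vehicle heading change
        dx, dy = 0.0, speed_mps * dt            # vehicle displacement (body frame)
        # Subtract the vehicle's motion, then rotate by the opposite of its
        # heading change, so the object moves oppositely in the vehicle frame.
        px, py = obj_x - dx, obj_y - dy
        cos_t, sin_t = math.cos(-dtheta), math.sin(-dtheta)
        return (px * cos_t - py * sin_t, px * sin_t + py * cos_t)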
Preferably, the synthesised images further comprise one or more simulated static elements in the obscure area. In this way, the synthesised images display the relationship of the object to other features in the obscure area. This is of particular benefit when the user wishes to align existing static elements in the obscure area with the object. For example, the static element may be a vehicle's hitch which the user wishes to align with the object.
Preferably, the synthesised images further comprise an observed image of the viewable area. This allows the user to see the relationship of the obscure area to the viewable area. In this way, a user is presented with a seamless image of both obscure and viewable regions of space. As an object passes from the viewable area to the obscure area, its movement is monitored and an image of it is synthesised in the image of the obscure area. The user need not even be aware of this change.
Preferably, the synthesised images further comprise an estimated locus of future movement of the object. This provides an aid for the manipulation of objects in the obscure area. In one example, the estimated locus depends on a current trajectory of the object.
Alternatively, the estimated locus may depend upon an input received from a vehicle having a fixed spatial relationship to the obscure area. For example, the input may comprise information regarding the current trajectory of a vehicle.
It can also be appreciated that the invention can be implemented using computer program code. Indeed, according to a third aspect of the present invention, there is therefore provided computer software or computer program code adapted to carry out the method described above when processed by a processing means. The computer software or computer program code can be carried by a computer readable medium. The medium may be a physical storage medium such as a Read Only Memory (ROM) chip.
Alternatively, it may be a disk such as a Digital Versatile Disk (DVD-ROM) or Compact Disk (CD-ROM). It could also be a signal such as an electronic signal over wires, an optical signal or a radio signal such as to a satellite or the like. The invention also extends to a processor running the software or code, e.g. a computer configured to carry out the method described above.
Brief Description of the Drawings
A preferred embodiment of the present invention will now be described by way of example only with reference to the accompanying drawings, in which:
Figure 1 illustrates the features of a vehicle and trailer in an uncoupled position;
Figure 2 illustrates the view from a camera mounted on the vehicle when the vehicle and trailer are in the uncoupled position;
Figure 3 illustrates the view from a camera mounted on the vehicle when the vehicle and trailer are coupled;
Figure 4 is a schematic illustration of the components of a Driver Assistance system according to the present invention;
Figure 5 illustrates the configuration of a Driver Assistance system according to the preferred embodiment of the present invention; and
Figures 6A to 6C illustrate a sequence of images manipulated by the preferred embodiment.
Detailed Description
Figure 1 illustrates the relative positioning of an exemplary tractor 1 and a piece of farm machinery 4 before they are connected. The tractor 1 is equipped with a hitch 3 (in this case a tow ball) to engage with a coupling point 5 integrated with the farm machinery 4. In order to connect the farm machinery 4 to the tractor 1, the driver must reverse the tractor 1 until the hitch 3 and the coupling point 5 coincide.
Although the example shown in Figure 1 relates to the coupling of a tractor 1, this could be replaced by a vehicle 1 of any type, including, but not limited to: cars, caravans, vans, trucks and lorries. Indeed, the vehicle need not necessarily be automotive, and the present invention may find utility in the field of boats or the like.
The tractor 1 is further provided with a rear view camera 2, which assists the driver in reversing the tractor 1 by providing an improved view of objects to the rear. The dashed lines 6 and 8 represent the lower and upper extremes respectively of the area imaged by the camera 2. There will, of course, be corresponding sideward constraints in the imaged area although these are not represented in Figure 1.
Although only a single camera is shown in Figure 1, one skilled in the art will understand that an array of cameras may be employed. The images from the cameras can then be displayed separately, or, preferably, the images from the available cameras may be combined and treated as a single image. In one example, two cameras are used, placed at the left-most rear and right-most rear corners of the tractor respectively.
Figure 1 shows how the actual viewable area available to the camera 2 is limited by the shape of the tractor 1. In particular, although the camera 2 is able to image the region between dashed lines 6 and 8, the region between dashed lines 6 and 7 is obscured by the upper surface of the tractor 1. Only the region between dashed lines 7 and 8 is viewable to the camera 2. As a result, although the farm machinery 4 and coupling point 5 are viewable when in the position shown in Figure 1, the hitch 3 is not viewable by the camera 2.
Figure 2 illustrates the view from the camera 2 when the tractor 1 and farm machinery 4 are in the positions shown in Figure 1. Features obscured from view by the roof 10 of the tractor 1 are shown as dotted lines. Of particular interest is the location of the hitch 3.
Figure 3 illustrates the view from the camera 2 when the tractor 1 and farm machinery 4 are engaged via the hitch 3 and coupling point 5. Again, those features obscured from the camera are shown in dotted lines. Figure 3 clearly shows that both hitch 3 and coupling point 5 are obscured in this position. As a result, a direct view from the camera 2 is only of limited assistance to the driver of the tractor 1 as it will not show the hitch 3 or the coupling point 5 as they are brought into alignment.
Figure 4 shows the principal components of a driving assistance system according to the preferred embodiment. The camera 2 and the tractor 1 provide inputs to a processor 11, which drives a display 13.
Figure 5 is a block diagram illustrating the configuration of the Driving Assistance System. The processor 11 includes an image pick-up means 101 to receive an image captured by the camera 2. In a typical example, the camera 2 will pass 25 images per second to the image pick-up means 101.
The image pick-up means 101 then forwards captured images to image segmentation means 102. The image segmentation means 102 is active to divide the captured image into a dynamic portion 103 and a static portion 104. The image segmentation means 102 may make this division in a number of ways: for example, on the basis of observed movement in the initial image, or a preset division of sections of the image.
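In the preset-division case, the segmentation reduces to applying a fixed binary mask defined when the camera is installed; a minimal Python sketch, in which the mask and its derivation are assumptions:

    import numpy as np

    def segment(frame, static_mask):
        """Split a captured frame into its dynamic and static portions using
        a preset boolean mask (True where the vehicle body obscures the view).
        """
        dynamic = frame.copy()
        dynamic[static_mask] = 0        # blank out the obscured region
        static = frame.copy()
        static[~static_mask] = 0        # keep only the obscured region
        return dynamic, static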
The dynamic portion 103 of the image includes those sections which can potentially move relative to the camera 2, such as the road, the farm machinery 4 and its coupling point 5, and is the region which is viewable to the camera 2. In contrast, the static portion 104 is the region which is obscure to the camera 2, and will not change, since it is essentially an image of the top of the tractor 1 to which the camera 2 is mounted.
Recording means 106 are provided to store a captured image from the dynamic portion 103, while motion tracking means 105 are provided to detect motion of an object in the dynamic portion 103. This information is passed to image synthesis and recomposition means 107 to generate synthesised images which combine the static portion 104 and the dynamic portion 103, and also include a synthesised image of any object that has passed from the dynamic portion 103 to the static portion 104, on the basis of information provided by the recording means 106 and motion tracking means 105. The recording means 106 provides the image of the object that is to be synthesised, while the motion tracking means 105 allows calculation of its position by monitoring movement of the object.
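A minimal sketch of this recomposition, pasting the recorded patch into the combined frame at the position reported by the motion tracking means (boundary clipping is omitted for brevity, and the top-left coordinate convention is an assumption):

    import numpy as np

    def recompose(dynamic, static, patch, patch_xy):
        """Combine the live dynamic portion with the static portion, then
        overlay the recorded object image at its tracked (x, y) position.
        The two portions are disjoint, so simple addition recombines them.
        """
        synthesised = dynamic + static
        x, y = patch_xy
        h, w = patch.shape[:2]
        synthesised[y:y + h, x:x + w] = patch   # relocated object image
        return synthesised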
When synthesising the image of the object, the image synthesis and recomposition means 107 may also resize the image provided by the recording means 106 in order to adjust for the relative movement of the object and the tractor 1. In particular, the image recorded of an object in the dynamic portion 103 will be smaller than the object would appear in the static portion 104 were it visible, as the static portion 104 is closer to the tractor 1. The image synthesis and recomposition means 107 is arranged to resize the recorded image to compensate for this effect.
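Under a pinhole camera model, apparent size scales inversely with distance, so the compensation might look like the following sketch; the distances would come from the motion tracking, and the names are illustrative assumptions:

    import cv2

    def rescale_patch(patch, dist_recorded_m, dist_now_m):
        """Resize the recorded object image to its expected apparent size at
        the new, closer position (pinhole model: size is proportional to
        1/distance, so the scale factor exceeds 1 as the object approaches).
        """
        scale = dist_recorded_m / dist_now_m
        return cv2.resize(patch, None, fx=scale, fy=scale,
                          interpolation=cv2.INTER_LINEAR)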
The image synthesis and recomposition means 107 may also include additional information in the synthesised images it generates. For example, it may add details of static objects, such as the hitch 3, obscured from the view of the camera 2, and may additionally or alternatively provide a simulated locus of the projected path of objects in the image which is calculated on the basis of their current trajectory or the position of the steering wheel of the tractor 1.
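Drawing these additions onto the synthesised frame is straightforward once pixel coordinates are known; a sketch using OpenCV drawing primitives, in which the colours, marker size and integer pixel convention are arbitrary assumptions:

    import numpy as np
    import cv2

    def draw_overlays(frame, hitch_px, locus_px):
        """Draw a marker for the obscured hitch and the projected path.

        hitch_px -- (x, y) integer pixel position of the simulated hitch
        locus_px -- N x 2 array of pixel points along the predicted locus
        """
        out = frame.copy()
        cv2.circle(out, hitch_px, 6, (0, 0, 255), thickness=-1)   # hitch marker
        pts = np.asarray(locus_px, dtype=np.int32).reshape(-1, 1, 2)
        cv2.polylines(out, [pts], isClosed=False,
                      color=(0, 255, 0), thickness=2)             # predicted path
        return out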
The image synthesis and recomposition means 107 passes synthesised images comprising the above mentioned aspects to a display 13 for display to a user. The user will typically be the driver of the tractor 1 and will be able to use the image provided to control the tractor 1 as desired (for example, with the intention of bringing the hitch 3 and the coupling point 5 into alignment).
The action of the preferred embodiment of the present invention in use will now be described with reference to Figures 6A to 6C.
Figure 6A shows the image captured by the camera 2 at a time t=0. In order to aid understanding of the present invention, Figure 6A also shows some features, including the hitch 3, that would be obscured by the roof of the tractor 1 and therefore would in fact not be present in the image received by the camera; these are represented by dotted lines.
At t=0 the tractor 1 and the farm machinery 4 are in the relative positions shown in Figure 1. The image pick-up means 101 receives the image from the camera 2, and the image segmentation means 102 divides this observed image into a static portion 104 (which is that portion obscured by the top of the tractor 1) and a dynamic portion 103 (which is the remainder of the image). The recording means 106 records an image of the highlighted area 202 in Figure 6A and stores this information for use at a later time.
The image synthesis and recomposition means 107 recombines the static portion 104 and the dynamic portion 103. The image synthesis and recomposition means 107 also overlays a simulated tractor rear 205, including an image of the hitch 3, onto the image. The resulting synthesised image is passed to the display 13 for display to the driver of the tractor 1.
Figure 6B illustrates the image received by the camera 2 at a later time t=1.
Again, some features that are in fact obscured by the top of the tractor are shown in the Figure as dotted lines to aid understanding of the invention, although these would not exist in the image received by the camera 2. It is clear from Figure 6B that both the hitch 3 and the coupling point 5 are obscured from the view of the camera 2.
The image pick-up means 101 receives the camera image shown in Figure 6B at time t=1. The image segmentation means 102 divides this into a static portion 104 and a dynamic portion 103 as before. The motion tracking means 105 monitors the movement of the farm machinery 4 since the image received at t=0, and is therefore able to predict where the rear of the farm machinery 4 is located, even though it is now obscured.
The image synthesis and recomposition means 107 then combines the available information to generate a synthesised image and forwards this to the display 13. The synthesised image comprises: a simulated rear portion 205 of the tractor 1; a simulated rear portion 202 of the farm machinery 4, the location of which has been monitored by the motion tracking means 105 and the image of which was captured by the camera 2 and recorded by the recording means 106; and the current image of the dynamic portion 103.
The result is shown in Figure 6C. This combined image illustrates the position of both the tractor 1 and hitch 3, and the farm machinery 4 and coupling point 5, to the driver, aiding in the alignment of these features and consequently the engagement of the farm machinery 4 to the tractor 1.
The image of the simulated rear portion 202 of the farm machinery 4 need not be a direct copy of the image 202 that was recorded at time t=0. For example, the image 202 may be manipulated to take account of the different angle from which it is viewed at time t=1. Moreover, as mentioned previously, the size of the image may be adjusted to account for the change in its relative position between times t=0 and t=1. Alternatively or additionally, a change in colour or other characteristic may be applied to the image 202 to indicate to the user that it is being simulated.
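One such characteristic change might be a simple tint, blending the simulated patch towards a flat colour so the driver can tell it is synthesised; a minimal sketch, in which the colour and blend strength are arbitrary assumptions:

    import numpy as np

    def tint_patch(patch, colour=(0, 200, 255), strength=0.3):
        """Blend the recorded patch towards a flat colour so the synthesised
        object is visually distinguishable from live imagery.
        """
        overlay = np.empty_like(patch)
        overlay[:] = colour                      # flat tint colour
        blended = (1.0 - strength) * patch + strength * overlay
        return blended.astype(patch.dtype)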
Although the above description refers particularly to the example of an automotive tractor for connection with farm machinery, the present invention finds utility in other fields. For example, the present invention may assist in the control of boats or the like.
More generally, the present invention provides advantages in any instance where it is desirable to manage or manipulate objects that cannot be directly viewed.
Other variations and modifications will be apparent to the skilled person.
Such variations and modifications may involve equivalent and other features which are already known and which may be used instead of, or in addition to, features described herein. Features that are described in the context of separate embodiments may be provided in combination in a single embodiment. Conversely, features which are described in the context of a single embodiment may also be provided separately or in any suitable sub-combination.
It should be noted that the term "comprising" does not exclude other elements or steps, the term "a" or "an" does not exclude a plurality, a single feature may fulfil the functions of several features recited in the claims and reference signs in the claims shall not be construed as limiting the scope of the claims. It should also be noted that the Figures are not necessarily to scale; emphasis instead generally being placed upon illustrating the principles of the present invention.

Claims (32)

  1. A method for displaying movement of an object, comprising the steps of: capturing an image of the object in a viewable area; monitoring movement of the object as it passes to an obscure area; generating synthesised images of the object after it has passed at least partially into the obscure area, the synthesised images including the captured image relocated in the synthesised images according to the monitored movement; and displaying the synthesised images.
  2. A method according to claim 1, wherein the captured image is a photographic image.
  3. A method according to claim 1 or claim 2, wherein the step of monitoring movement of the object comprises observing movement of the object in the viewable area.
  4. A method according to any one of the preceding claims, wherein the step of monitoring movement of the object comprises receiving an input from a vehicle having a fixed spatial relationship to the obscure area.
  5. A method according to claim 4, wherein the input comprises information regarding the trajectory and velocity of the vehicle.
  6. A method according to any one of the preceding claims, wherein the viewable area and the obscure area are within the field of view of an imaging means.
  7. A method according to claim 6, wherein the obscure area is defined by a foreground object in the field of view.
  8. A method according to any one of the preceding claims, wherein the synthesised images further comprise one or more simulated static elements in the obscure area.
  9. A method according to claim 8, wherein one of the static elements comprises a vehicle hitch.
  10. A method according to any one of the preceding claims, wherein the synthesised images further comprise an observed image of the viewable area.
  11. A method according to any one of the preceding claims, wherein the synthesised images further comprise an estimated locus of future movement of the object.
  12. A method according to claim 11, wherein the estimated locus depends on the monitored movement of the object.
  13. A method according to claim 11, wherein the estimated locus depends upon an input received from a/the vehicle having a fixed spatial relationship to the obscure area.
  14. A method according to claim 13, wherein the input comprises information regarding the current trajectory of the vehicle.
  15. A method according to any one of the preceding claims, wherein the object is one or more coupling points on a trailer.
  16. A computer program product comprising computer executable instructions for carrying out the method of any one of the preceding claims.
  17. An apparatus for displaying movement of an object, comprising: imaging means arranged to capture an image of the object in a viewable area; a processor arranged to monitor movement of the object as it passes to an obscure area, and arranged to generate synthesised images of the object after it has passed at least partially into the obscure area, the synthesised images including the captured image relocated in the synthesised images according to the monitored movement; and a display arranged to display the synthesised images.
  18. An apparatus according to claim 17, wherein the captured image is a photographic image.
  19. An apparatus according to claim 17 or claim 18, wherein the imaging means is further arranged to observe movement of the object in the viewable area, and wherein the processor is arranged to monitor movement of the object in dependence on the observed movement of the object in the viewable area.
  20. An apparatus according to any one of claims 17 to 19, wherein the processor is arranged to monitor movement of the object based on an input received from a vehicle having a fixed spatial relationship to the obscure area.
  21. An apparatus according to claim 20, wherein the input comprises information regarding the trajectory and velocity of the vehicle.
  22. An apparatus according to any one of claims 17 to 21, wherein the viewable area and the obscure area are within the field of view of the imaging means.
  23. An apparatus according to claim 22, wherein the obscure area is defined by a foreground object in the field of view.
  24. An apparatus according to any one of claims 17 to 23, wherein the synthesised images further comprise one or more simulated static elements in the obscure area.
  25. An apparatus according to claim 24, wherein one of the static elements comprises a vehicle hitch.
  26. An apparatus according to any one of claims 17 to 25, wherein the synthesised images further comprise an observed image of the viewable area.
  27. An apparatus according to any one of claims 17 to 26, wherein the synthesised images further comprise an estimated locus of future movement of the object.
  28. An apparatus according to claim 27, wherein the estimated locus depends on the monitored movement of the object.
  29. An apparatus according to claim 27, wherein the estimated locus depends upon an input received from a/the vehicle having a fixed spatial relationship to the obscure area.
  30. An apparatus according to claim 29, wherein the input comprises information regarding the current trajectory of the vehicle.
  31. An apparatus according to any one of claims 17 to 30, wherein the object is a trailer having one or more coupling points.
  32. An apparatus according to any one of claims 17 to 31, wherein the imaging means is a rear-view camera mounted on a vehicle.
GB0904049.4A 2009-03-09 2009-03-09 Display movement of an object Active GB2469438B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB0904049.4A GB2469438B (en) 2009-03-09 2009-03-09 Display movement of an object

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0904049.4A GB2469438B (en) 2009-03-09 2009-03-09 Display movement of an object

Publications (3)

Publication Number Publication Date
GB0904049D0 GB0904049D0 (en) 2009-04-22
GB2469438A true GB2469438A (en) 2010-10-20
GB2469438B GB2469438B (en) 2014-04-09

Family

ID=40600763

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0904049.4A Active GB2469438B (en) 2009-03-09 2009-03-09 Display movement of an object

Country Status (1)

Country Link
GB (1) GB2469438B (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11358639B2 (en) 2020-03-11 2022-06-14 Ford Global Technologies, Llc Trailer hitching assistance system with contact mitigation measures

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1249365A1 (en) * 2001-04-09 2002-10-16 Matsushita Electric Industrial Co., Ltd. Driving aiding system
US20020149673A1 (en) * 2001-03-29 2002-10-17 Matsushita Electric Industrial Co., Ltd. Image display method and apparatus for rearview system
EP1251025A2 (en) * 2001-04-09 2002-10-23 Matsushita Electric Industrial Co., Ltd. Driving aiding system
US20050074143A1 (en) * 2003-10-02 2005-04-07 Nissan Motor Co., Ltd. Vehicle backing assist apparatus and vehicle backing assist method
JP2006252577A (en) * 2006-05-11 2006-09-21 Nippon Soken Inc Map data generating apparatus
JP2006298256A (en) * 2005-04-22 2006-11-02 Aisin Aw Co Ltd Parking supporting method and parking supporting device
US20080129539A1 (en) * 2006-04-12 2008-06-05 Toyota Jidosha Kabushiki Kaisha Vehicle surrounding monitoring system and vehicle surrounding monitoring method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8888120B2 (en) * 2007-01-25 2014-11-18 Target Hitch Llc Towing vehicle guidance for trailer hitch connection
US20100324770A1 (en) * 2007-07-03 2010-12-23 J. Edward Ramsey Trailer hitch alignment device and method


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018037032A1 (en) * 2016-08-26 2018-03-01 Jaguar Land Rover Limited A vehicle camera system
IT201700054083A1 (en) * 2017-05-18 2018-11-18 Cnh Ind Italia Spa SYSTEM AND METHOD OF AUTOMATIC CONNECTION BETWEEN TRACTOR AND TOOL
WO2018210990A1 (en) * 2017-05-18 2018-11-22 Cnh Industrial Italia S.P.A. System and method for automatic connection between a tractor and an implement
CN110636753A (en) * 2017-05-18 2019-12-31 凯斯纽荷兰(中国)管理有限公司 System and method for automatic connection between a tractor and an implement
CN110636753B (en) * 2017-05-18 2022-06-07 凯斯纽荷兰(中国)管理有限公司 System and method for automatic connection between tractor and implement
US10870323B2 (en) 2018-07-18 2020-12-22 Ford Global Technologies, Llc Compensation for trailer coupler geometry in automatic hitch operation

Also Published As

Publication number Publication date
GB0904049D0 (en) 2009-04-22
GB2469438B (en) 2014-04-09

Similar Documents

Publication Publication Date Title
JP7105754B2 (en) IMAGING DEVICE AND METHOD OF CONTROLLING IMAGING DEVICE
JP7245295B2 (en) METHOD AND DEVICE FOR DISPLAYING SURROUNDING SCENE OF VEHICLE-TOUCHED VEHICLE COMBINATION
CN106573577B (en) Display system and method
US7212653B2 (en) Image processing system for vehicle
US8553081B2 (en) Apparatus and method for displaying an image of vehicle surroundings
WO2002089485A1 (en) Method and apparatus for displaying pickup image of camera installed in vehicle
GB2554427B (en) Method and device for detecting a trailer
JP2020537216A (en) Image processing method and equipment
GB2529408A (en) Display system and method
GB2469438A (en) Displaying movement of an object
US8581984B2 (en) Vehicle circumference monitor apparatus
JP3988551B2 (en) Vehicle perimeter monitoring device
US20230113406A1 (en) Image processing system, mobile object, image processing method, and storage medium
US20230098424A1 (en) Image processing system, mobile object, image processing method, and storage medium
WO2016047037A1 (en) Vehicular image-processing apparatus
US20120086798A1 (en) System and method for automatic dynamic guidelines
CN112449625B (en) Method, system and trailer combination for assisting in scheduling operation of trailer combination
US20200128215A1 (en) Hitch assist backup camera system
US11832019B2 (en) Method for harmonizing images acquired from non overlapping camera views
EP1405776A1 (en) Movable body circumstance monitoring apparatus
US20220144187A1 (en) Camera system for a trailer hitch system
JP3947117B2 (en) Vehicle periphery image processing system
JP2024050331A (en) Mobile body and imaging device installation method