SE2250066A1 - Method and projection device for assisting a driver of a heavy road vehicle, in surveilling surroundings - Google Patents

Method and projection device for assisting a driver of a heavy road vehicle, in surveilling surroundings

Info

Publication number
SE2250066A1
Authority
SE
Sweden
Prior art keywords
vehicle
light pattern
driver
predefined
projecting
Prior art date
Application number
SE2250066A
Inventor
Jan Johansson
Jan Söderlund
Tomas Skeppström
Original Assignee
Scania Cv Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Scania Cv Ab filed Critical Scania Cv Ab
Priority to SE2250066A priority Critical patent/SE2250066A1/en
Priority to DE102022132150.9A priority patent/DE102022132150A1/en
Publication of SE2250066A1 publication Critical patent/SE2250066A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/02Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments
    • B60Q1/22Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments for reverse drive
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/48Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for parking purposes
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/30Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating rear of vehicle, e.g. by means of reflecting surfaces
    • B60Q1/305Indicating devices for towed vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/02Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments
    • B60Q1/24Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments for lighting other areas than only the way ahead
    • B60Q1/249Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments for lighting other areas than only the way ahead for illuminating the field of view of a sensor or camera
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q2400/00Special features or arrangements of exterior signal lamps for vehicles
    • B60Q2400/50Projected symbol or information, e.g. onto the road or car body
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/26Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view to the rear of the vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/806Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for aiding parking

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Lighting Device Outwards From Vehicle And Optical Signal (AREA)

Abstract

The present disclosure relates to vehicles. According to a first aspect the disclosure relates to a method for assisting a driver of a Heavy Road, HR, vehicle 1, in surveilling surroundings of the HR vehicle. The HR vehicle comprises a surrounding monitoring arrangement 11 configured to display, to the driver, an image of a monitored zone 3 adjacent to the HR vehicle 1. The method comprises projecting, using one or more light sources 12a attached to the HR vehicle 1, a predefined light pattern 2 comprising one or more distinct guiding objects 2a, 2b, on one or more surfaces located within the monitored zone 3, whereby the driver is able to see the predefined light pattern in the image displayed by the surrounding monitoring arrangement 11. The disclosure also relates to a projection device and to a vehicle comprising a surrounding monitoring arrangement 11 and a projection device 12.

Description

Method and projection device for assisting a driver of a heavy road vehicle, in surveilling surroundings

Technical field

The present disclosure relates to exterior surveillance systems for vehicles. More specifically, the disclosure relates to a method and a projection device for assisting a driver of a heavy road vehicle in surveilling surroundings. The disclosure also relates to a computer program, to a computer-readable medium and to a vehicle.
Background

Vehicles of today are commonly equipped with monitoring systems that visualize the vehicle surroundings on monitors and/or mirrors. Most monitoring systems utilize information from cameras arranged on the vehicle to create one or more images of the vehicle surroundings. Different cameras may be used to cover different portions of the vehicle surroundings, such as generic surveillance cameras, or mirror-replacing cameras designed to comply with general laws or requirements.
An image of the vehicle may be overlaid (i.e., superimposed) on an image of the surroundings by means of algorithms. However, such overlaying requires detailed knowledge of the movement of the vehicle, of the positions and angles of the cameras, as well as of the vehicle dimensions. Hence, the monitoring systems require access to information about camera positions and vehicle dimensions, in particular the physical footprint, in order to be able to show correct information on the monitor.
For some vehicles, such as cars, this might not be a problem as the dimensions and driving properties are typically static and known beforehand. Other vehicles, such as Heavy Road, HR, vehicles, are typically configurable to have different physical footprints, such as variable length or overhang. For these vehicles, the monitoring system and the vehicle must be fully integrated to be able to provide correct information on the monitors. In addition, trucks and other vehicles may use a wide variety of monitoring systems from different suppliers. The monitoring systems are either mounted at the factory or retrofitted by an owner of the vehicle.
Therefore, it is not always possible to preprogram the monitoring systems with adequate vehicle information.
Hence, it may not be possible to provide the monitoring system with information that can cope with the varying dimensions and positions of the vehicle and cameras without individual calibration and/or active sensors. Calibration takes time and can be costly. Sensors built into a camera/monitor system may only be needed once and are therefore likely to be expensive in relation to the benefit. Hence, there is a need for improved techniques for monitoring the surroundings of HR vehicles that are independent of vehicle configuration.
Summary

It is an object of the disclosure to alleviate at least some of the drawbacks of the prior art. Thus, it is an object of this disclosure to provide techniques for facilitating monitoring of the surroundings of vehicles. In particular, it is an object to provide techniques for assisting a driver of a Heavy Road, HR, vehicle in surveilling the surroundings of the HR vehicle that are compatible with various monitoring systems and that do not require the monitoring system to be integrated into the vehicle.
According to a first aspect the disclosure relates to a method for assisting a driver of a Heavy Road, HR, vehicle, in surveilling surroundings of the HR vehicle. The HR vehicle comprises a surrounding monitoring arrangement configured to display, to the driver, an image of a monitored zone adjacent to the HR vehicle. The method comprises projecting, using one or more light sources attached to the HR vehicle, a predefined light pattern comprising one or more distinct guiding objects, on one or more surfaces located within the monitored zone, whereby the driver is able to see the predefined light pattern in the image displayed by the surrounding monitoring arrangement. By projecting guiding objects on the ground, or on nearby objects, the guiding objects will automatically appear in ordinary monitors or mirrors used to monitor the surroundings of the vehicle. A driver may learn what the projected guiding objects mean and can then use them to judge size of, and distance to, other objects displayed by the monitoring arrangement. The method is independent of vehicle type, size or position of cameras or mirrors of the monitoring arrangement. Furthermore, the technique may be fitted either in the factory or after the HR vehicle is placed on the market.

In some embodiments, the projected predefined light pattern is indicative of a predefined distance to a reference position of the HR vehicle. Thereby, the guiding objects can be used to estimate a distance to the reference position.

In some embodiments, the reference position comprises a reference point, a reference line, a surface, a side, or an edge of the HR vehicle that is affected by a current physical footprint of the HR vehicle. Hence, the guiding objects can be used to determine the distance to various parts of the HR vehicle and may thus be utilized in various situations.

In some embodiments, the HR vehicle is configurable to have different physical footprints, and the reference position is affected by a current physical footprint of the HR vehicle. Hence, the proposed technique is applicable also to vehicles that are configurable to have different physical footprints.

In some embodiments, the projecting comprises projecting the predefined light pattern on a flat vehicle support surface, such that the one or more guiding objects are positioned at predefined distances from the reference position when the predefined light pattern is projected onto a vehicle support surface that is flat. Hence, an accurate estimate of a distance to an object may be determined by studying the guiding objects.

In some embodiments, the projecting comprises projecting two distinct guiding objects using two light beams that are angled towards each other such that they converge in at least one two-dimensional representation. Thereby, a distance between the guiding objects formed by the light beams will vary depending on a distance to a surface on which they are projected. A driver may then estimate the distance to the surface based on the distance between the guiding objects.

In some embodiments, the light beams intersect at a predefined distance from the reference position. Thus, the convergence point may be used to determine when the HR vehicle is positioned at a certain distance from an object, which may be useful for example when reversing towards a loading dock or similar.

In some embodiments, the predefined distance is static or is user configurable.
Hence, the projecting of the predefined light pattern may be configured to indicate a certain distance suitable for a particular application, such as loading the HR vehicle.

In some embodiments, the projected predefined light pattern is configured such that the predefined light pattern indicates a topology of a surface on which it is projected. Hence, the predefined light pattern can be used to indicate a hole in the ground or a vertical plane behind the vehicle.

In some embodiments, the method comprises capturing the image, using an image sensor, and displaying the image on a display. Hence, the predefined light pattern may be displayed to a driver using an ordinary reversing camera or similar.

In some embodiments, the predefined light pattern is projected with a wavelength outside the visible spectrum, which is detectable by the image sensor, and the displaying comprises imaging the predefined light pattern on the display with a wavelength in the visible spectrum. Thereby, the predefined light pattern will not be visible to people outside the vehicle and will thus not affect other road users.

In some embodiments, the surrounding monitoring arrangement comprises a mirror arranged to image the one or more surfaces where the predefined light pattern is projected. Hence, the predefined light pattern may be displayed to a driver using an ordinary rear-view mirror.

In some embodiments, the one or more distinct guiding objects comprise one or more of a point, a line, a grid, and a scale. Thus, various objects may be projected based on driver desires and needs.

In some embodiments, the one or more distinct guiding objects comprise at least two distinct guiding objects having different colors. By projecting guiding objects having different colors, the driver may quickly detect a certain distance etc.

In some embodiments, the method comprises activating and/or deactivating the projecting of the predefined light pattern. Hence, the surrounding surveilling assistance may only be activated when needed.

In some embodiments, the projecting is activated in response to one or more triggers including one or more of: a reverse gear being engaged, turn signals being activated, the HR vehicle having a speed below a certain level, a certain steering angle, or a trigger activated by a driver. These are examples of various situations where a driver may benefit from assistance in surveilling the surroundings.

In some embodiments, the projecting comprises projecting the predefined light pattern using movable and/or directable light sources arranged on the HR vehicle. Hence, the predefined light pattern may be moved based on the vehicle configuration by simply moving and/or redirecting the light sources.

In some embodiments, the projecting comprises projecting the predefined light pattern using one or more light sources arranged in taillights, or in a reverse camera, of the HR vehicle. By arranging the light sources in parts of the vehicle that are affected by a reconfiguration, the light pattern will always be positioned at a certain distance from a certain reference point (such as a rear side of the vehicle), independently of the vehicle configuration.

In some embodiments, the method is performed independently of the surrounding monitoring arrangement. Hence, the proposed technique may be used with an independent surrounding monitoring arrangement, such as a retrofitted arrangement provided by an external supplier.
According to a second aspect, the disclosure relates to a projection device configured to assist a driver of a HR vehicle in surveilling surroundings of the HR vehicle, wherein the HR vehicle comprises a surrounding monitoring arrangement configured to display, to the driver, an image of a monitored zone adjacent to the HR vehicle. The projection device comprises one or more light sources configured to be attached to the HR vehicle, optical components, and an attachment device configured to attach the projection device to the HR vehicle. The one or more light sources and the optical components are arranged such that, when the projection device is attached to the HR vehicle, a predefined light pattern comprising one or more distinct guiding objects is projected on one or more surfaces located within the monitored zone, such that the driver is able to see the predefined light pattern in an image displayed by the surrounding monitoring arrangement.

In some embodiments, the projection device is configured to perform the method according to any one of the embodiments of the first aspect.
According to a third aspect, the disclosure relates to a vehicle comprising a surrounding monitoring arrangement configured to display, to the driver, an image of a monitored zone adjacent to the HR vehicle, and the projection device according to the second aspect.
Brief description of the drawings

Fig. 1 illustrates a vehicle comprising a monitoring arrangement.
Fig. 2 illustrates a vehicle configurable to have different physical footprints.
Fig. 3a illustrates a vehicle comprising a projection device according to the second aspect.
Fig. 3b illustrates the vehicle of Fig. 3a seen from above.
Fig. 4 illustrates the projection device in further detail.
Fig. 5 is a flowchart illustrating the method according to the first aspect.
Fig. 6a illustrates a vehicle projecting distinct guiding objects using two light beams that are converging in at least one plane.
Fig. 6b illustrates the predefined light pattern projected by the vehicle in Fig. 6a from above.
Detailed description

It is herein proposed to use a projection device that is separate from the monitoring system to project graphics on the ground or on objects near the vehicle. The projected graphics are then visible in an image shown by the monitoring system. In this way a driver can learn what the projected graphics mean and can judge size and distance on-screen in relation to the projected graphics. The graphics can either be projected permanently or based on triggers, such as on request from the truck or the driver. For example, the graphics may be projected when the speed is low or while reversing. The proposed technique will now be described with reference to Figs. 1-6.
Fig. 1 illustrates a top view of a Heavy Road, HR, vehicle 1, here a truck, in which the proposed method for assisting a driver of a Heavy Road, HR, vehicle in surveilling surroundings may be implemented. The vehicle comprises a monitoring system, herein called surrounding monitoring arrangement 11, configured to display to the driver at least one image of a monitored zone 3 adjacent to the HR vehicle. In other words, the surrounding monitoring arrangement 11 is configured to show (or provide) one or more images to the driver.
The surrounding monitoring arrangement 11, herein also referred to simply as the monitoring arrangement 11, may be configured to display one or more images on mirrors 11a and/or on displays 11c that are visible to the driver. Hence, in some embodiments, the surrounding monitoring arrangement 11 comprises a mirror 11a arranged to image the monitored zone. In other words, in some embodiments the one or more images are displayed in a mirror, such as on ordinary rear-view and/or side-view mirrors. In some embodiments, the surrounding monitoring arrangement 11 in addition or alternatively comprises a digital camera, herein also called image sensor 11b, configured to capture the image, and a display 11c configured to display the image. The surrounding monitoring arrangement 11 may also comprise a combination of cameras 11b, displays 11c and mirrors 11a, as well as a control device 11d configured to control the capturing and displaying. The control device 11d may also be configured to control the position and angle of view of the mirrors 11a and/or cameras 11b. For example, a servo motor may be physically coupled to the mirror or camera in order to automatically rotate, slide, pan or otherwise move the mirror or camera.
As mentioned above, the surrounding monitoring arrangement 11 may operate independently of the rest of the vehicle 1. It may for example be retrofitted by an owner of the vehicle after manufacturing of the vehicle. This means that the surrounding monitoring arrangement 11 may not be aware of vehicle properties such as the vehicle footprint and driving properties, which may be affected by a current load of the vehicle.
Fig. 2 illustrates a HR vehicle 1, herein also referred to as simply a vehicle 1, being configurable to have different physical footprints. Physical footprint herein refers to an area on a surface covered by the vehicle, i.e., the surface covered by the vehicle when seen straight from above. When the vehicle 1 is reconfigured to have another physical footprint, one or more reference positions r of the vehicle 1 may be affected. For example, a reference position r may move to a new position r'. The positions r and r' are defined in relation to a static position p of the vehicle that is not affected by the reconfiguration. The static position p may for example be a center point of a cabin 15 of the vehicle. The vehicle 1 in Fig. 2 is reconfigurable to have varying length and rear overhang. In the illustrated example, an alternative configuration with different length and rear overhang is illustrated with dashed lines. In this example the rear edge of the vehicle constitutes a reference position r that moves in relation to the cabin 15 to a new position r' when the vehicle 1 is reconfigured to have a longer overhang or length. In other words, in some embodiments, the proposed technique is implemented in a HR vehicle 1 that is configurable to have different physical footprints. More specifically, in some embodiments the vehicle 1 comprises at least one reference position r that is affected by a current physical footprint of the HR vehicle 1. Various reference positions may be affected by such a reconfiguration. In other words, in some embodiments, the reference position r comprises a reference point, a reference line, a surface, a side, or an edge of the HR vehicle 1 that is dependent on a current physical footprint of the HR vehicle 1.
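As a purely illustrative sketch (not part of the disclosure), the relation between the static position p, the reference position r and the moved position r' could be represented in software as an offset from p that is updated when the footprint is reconfigured; the class name and numerical values below are assumptions made for the example (in Python):

from dataclasses import dataclass

@dataclass
class FootprintModel:
    # Static position p (e.g., the cabin centre) in vehicle coordinates, in metres.
    p_x: float = 0.0
    # Longitudinal distance from p to the rear edge, i.e. to the reference position r.
    rear_offset: float = 7.5  # hypothetical value, for illustration only

    def reference_position(self) -> float:
        """x-coordinate of the rear edge r, measured rearwards from p."""
        return self.p_x - self.rear_offset

model = FootprintModel()
r = model.reference_position()        # r for the current configuration
model.rear_offset = 9.0               # reconfiguration: longer length / rear overhang
r_prime = model.reference_position()  # r' after the reconfiguration
print(f"r = {r:.1f} m, r' = {r_prime:.1f} m (relative to p)")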
Vehicles with a reconfigurable footprint may be problematic for the surrounding monitoring arrangement 11 to handle because, as the surrounding monitoring arrangement 11 does not always know the footprint of the vehicle 1, it is unable to overlay it on the display 11c. Hence, it may be difficult for a driver to estimate an exact distance to objects in the surroundings. A projection device 12 will now be described that can assist a driver in such situations.
Fig. 3a illustrates a HR vehicle 1 comprising the proposed projection device 12 that is configured to assist a driver of a HR vehicle 1 in surveilling surroundings of the HR vehicle 1. The projection device 12 is configured to project a predefined light pattern 2 comprising one or more distinct guiding objects on one or more surfaces located within a monitored zone 3 that the driver can monitor using the surrounding monitoring arrangement 11. The guiding objects are distinct in the sense that they are readily distinguishable by a human eye. For example, they have a sharp contour or similar. Thereby, a driver of the vehicle 1 is able to see the predefined light pattern 2 in at least one image displayed by the surrounding monitoring arrangement 11 and use it to determine a distance to objects in the surroundings. In the illustrated example, the predefined light pattern 2 comprises guiding objects formed by four lines 2a projected on the ground within a predefined distance d from a reference position r of the HR vehicle 1. In this example, the reference position r is a rear side of the vehicle 1. By studying the lines 2a the driver will learn how close the vehicle can be driven to an object, which is helpful for example when reversing towards an object such as a loading dock.
Fig. 3b illustrates the vehicle 1 of Fig. 3a seen from above. It must be appreciated that the vehicle 1 comprises a plurality of electrical systems and subsystems. However, for simplicity only some parts of the vehicle 1 that are associated with the proposed method are shown in Figs. 3a and 3b. Hence, the illustrated vehicle 1 comprises a projection device 12 and a surrounding monitoring arrangement 11.
The surrounding monitoring arrangement 11 is configured to display, to the driver, an image of a monitored zone 3 adjacent to the HR vehicle 1, as described in Fig. 1.
The projection device 12 is configured to project a predefined light pattern 2 that a driver can see in the monitoring arrangement 11. An example embodiment of the projection device 12 is illustrated in Fig. 4. More specifically, the illustrated projection device 12 comprises one or more light sources 12a, optical components 12b, and an attachment mechanism 12c. In some embodiments, the illustrated projection device 12 also comprises an enclosure 12d and a control arrangement 12e.
The one or more light sources 12a are configured to be attached to the HR vehicle 1. In some embodiments the light sources are movable and/or directable. The light sources may be configured to be manually moved (i.e., repositioned) and/or re-directed. For example, the light sources 12a may be moved or directed based on predefined instructions provided by a manufacturer. This may be done when the HR vehicle has been reconfigured to have a different spatial footprint. Alternatively, the light sources 12a may be configured to be automatically moved or re-directed. For example, a servo motor may be physically coupled to the light sources 12a in order to automatically rotate, slide, pan or otherwise move the light sources 12a. In some embodiments, the light sources 12a are arranged in taillights, or in a reverse camera, of the HR vehicle 1. Thus, the light sources may be fitted during manufacturing. By placing the light sources 12a in, for example, the taillights, the light sources 12a will automatically be moved when the length of the vehicle 1 is adjusted, as the taillights are always repositioned such that they are positioned at the back of the vehicle 1.
The optical components 12b are configured to create a predefined light pattern 2 from light provided by the one or more light sources 12a. This may for example be done by guiding the light through one or more lenses or by using lasers. The optical components 12b may be configurable such that the predefined light pattern 2 is exchangeable or programmable. For example, the optical components 12b may comprise an automated lens handling system or a programmable laser imaging system.
The attachment mechanism 12c is configured to attach the one or more light sources 12a and the optical components 12b to the HR vehicle 1. The attachment mechanism 12c comprises for example a clamp, a clip, rails, screws, or any other suitable mechanism. The enclosure 12d is a shell or a container that assembles and/or protects the projection device 12.
The projection device 12 may also be referred to as a projection assembly, as it may comprise assembled components. It must be appreciated that the projection device may also comprise physically separated light sources that do not form one single assembly, as illustrated in Figs. 6a-b.
The one or more light sources 12a and the optical components 12b are arranged such that, when the projection device 12 is attached to the HR vehicle 1, a predefined light pattern 2 comprising one or more distinct guiding objects is projected on one or more surfaces located within the monitored zone 3. The predefined light pattern 2 is projected such that the driver is able to see the predefined light pattern 2 in the image displayed by the surrounding monitoring arrangement 11.
The control arrangement 12e comprises for example a microprocessor that is electrically connected to the light sources and to the optical components. Hence, the control arrangement 12e is configured to control the one or more light sources 12a and the optical components 12b. For example, the control arrangement 12e may be configured to switch the one or more light sources on and off. In some embodiments the control arrangement 12e is configured to control the optical components to move, exchange or re-program the predefined light pattern. In some embodiments, the control arrangement 12e is configured to control the position and/or direction of the movable and/or directable light sources 12a.

In some embodiments, the control arrangement 12e is configured to cause the projection device 12 to perform at least parts of the method for assisting a driver of a HR vehicle that is described above and below. This is typically done by running computer program code, stored in a data storage or memory, in a processor of the control arrangement 12e. The control arrangement 12e may also comprise a communication interface for communicating with other control units of the vehicle and/or with external systems. Note that the control arrangement 12e does not need to communicate with the surrounding monitoring arrangement 11. Hence, in some embodiments, the control arrangement 12e is independent of the surrounding monitoring arrangement 11.

The control arrangement 12e may comprise one or more ECUs. ECU is a generic term used in automotive electronics for any embedded system that controls one or more functions of the electrical system or subsystems in a transport vehicle. A vehicle typically comprises a plurality of ECUs that communicate over a Controller Area Network, CAN. For example, the control arrangement may be configured to communicate with the Instrument Cluster, ICL, whereby the projection device can be controlled by the driver from the dashboard.
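One possible, purely hypothetical software structure for such a control arrangement is sketched below; the class and method names, and the driver interfaces they call, are assumptions for illustration and are not prescribed by the disclosure (in Python):

class ControlArrangement:
    """Sketch of a control arrangement (cf. 12e) for the projection device (cf. 12)."""

    def __init__(self, light_sources, optics, servo=None):
        self.light_sources = light_sources  # assumed driver objects for the LEDs/lasers (12a)
        self.optics = optics                # assumed programmable optics (12b)
        self.servo = servo                  # optional actuator for movable light sources

    def set_projection(self, enabled: bool) -> None:
        # Switch the one or more light sources on and off.
        for source in self.light_sources:
            source.set_enabled(enabled)     # assumed driver API

    def select_pattern(self, pattern_id: str) -> None:
        # Exchange or re-program the predefined light pattern (2).
        self.optics.load_pattern(pattern_id)  # assumed optics API

    def aim(self, pan_deg: float, tilt_deg: float) -> None:
        # Re-direct movable/directable light sources, e.g. after a footprint change.
        if self.servo is not None:
            self.servo.move_to(pan=pan_deg, tilt=tilt_deg)  # assumed servo API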
The proposed method for assisting a driver of a Heavy Road, HR, vehicle 1 in surveilling surroundings will now be described with reference to Fig. 5. The method is for use in a HR vehicle 1 comprising a surrounding monitoring arrangement 11 configured to display to the driver an image of a monitored zone 3 adjacent to the HR vehicle 1, as illustrated in Fig. 1. The method is performed by a projection device 12. In some embodiments the method is at least partly automatically performed by a control arrangement 12e of the projection device; alternatively, one or more steps may be performed manually. In some embodiments, some steps may be performed by the monitoring arrangement 11.
The method is typically performed during normal operation of the vehicle when the driver needs assistance in surveilling the surroundings. The proposed method comprises projecting S1, using one or more light sources 12a attached to the HR vehicle, a predefined light pattern 2 comprising one or more distinct guiding objects on one or more surfaces located within the monitored zone 3. In other words, guiding objects are projected on surfaces such as the ground or other objects located in the vicinity of the vehicle 1. For example, the monitored zone 3 where the guiding objects are projected extends a few meters outside the physical footprint of the vehicle 1. The monitored zone 3 does not need to have a symmetrical shape but may basically be defined by the monitoring arrangement 11. As the predefined light pattern 2 is projected on surfaces located within the monitored zone 3, a driver can see the predefined light pattern 2 in the image displayed by the surrounding monitoring arrangement 11. In other words, the projecting S1 is performed such that the driver is able to see the predefined light pattern 2 in the image displayed by the surrounding monitoring arrangement 11.
The guiding objects may not be projected all the time but may be activated only in situations when the driver needs assistance. In other words, in some embodiments, the method comprises activating S0 the projecting S1 of the predefined light pattern 2. In some embodiments, the projecting is activated in response to one or more triggers including one or more of: a reverse gear being engaged, turn signals being activated, the HR vehicle 1 having a speed below a certain level, a certain steering angle (such as above 10-20°), or a trigger activated by a driver. Alternatively, the projecting may be activated all the time.
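As an illustration only, such a trigger condition could be expressed as follows; the signal names and the numerical thresholds are assumptions for the example, not values taken from the disclosure (in Python):

def projection_should_be_active(reverse_gear: bool,
                                turn_signal_on: bool,
                                speed_kmh: float,
                                steering_angle_deg: float,
                                driver_request: bool,
                                speed_limit_kmh: float = 10.0,
                                steering_threshold_deg: float = 15.0) -> bool:
    """Activation logic (cf. S0): any single trigger is enough to activate the projecting."""
    return (reverse_gear
            or turn_signal_on
            or driver_request
            or speed_kmh < speed_limit_kmh
            or abs(steering_angle_deg) >= steering_threshold_deg)

# Example: reversing at walking pace with no other trigger active.
assert projection_should_be_active(True, False, 4.0, 0.0, False)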
The projected predefined light pattern 2 comprising one or more distinct guiding objects may be designed in various ways. In some embodiments, the projected predefined light pattern 2 is indicative of a predefined distance d to a reference position r of the HR vehicle 1. In other words, the projected pattern 2 may indicate a distance d to a position, such as a distance to a rear or a side of the vehicle. The driver may use such a light pattern to avoid colliding with objects while reversing or in narrow passages.
The distinct guiding objects may comprise one or more points or lines. In other embodiments the distinct guiding objects may form a grid or a scale. In principle, any suitable shape that is readily distinguishable by a human eye can be used.

In some embodiments, the distinct guiding objects have different colors, such as different distinct colors. In other words, the individual guiding objects may have different colors, wherein the individual colors may indicate a property of the guiding object, such as a distance to the vehicle. In some embodiments, the distinct guiding objects are separated from each other. As an alternative, or in addition, one guiding object may have different colors, such as a line that changes color the longer the distance to the vehicle becomes. For example, different colors may be used to indicate different distances to the reference position. For example, red may mean "very close", yellow "quite close" and green may be used to indicate a "safe distance", i.e., a low risk of collision. In other words, in some embodiments the distinct guiding objects comprise text, color and/or symbols indicating individual distances to the reference position.

If the pattern is projected on the ground, the objects may be projected at fixed distances from the vehicle, provided that the surface that it stands on is flat. For example, lines may be projected on the ground with a predefined distance, such as 10-30 cm, in between. The line closest to the vehicle 1 may indicate the physical footprint (Fig. 3a). Hence, the guiding object closest to the vehicle 1 may be aligned with the rear overhang. In other words, in some embodiments, the projecting S1 comprises projecting the predefined light pattern 2 on a flat vehicle support surface, such that the one or more guiding objects (such as lines 2a (Fig. 3) or points 2b (Fig. 6)) are positioned at predefined distances d from the reference position r when the predefined light pattern is projected onto a vehicle support surface that is flat. The predefined distance d may be static or configurable. More specifically, the certain distance may be reconfigured by moving or redirecting the light sources 12a or by adjusting the optical components 12b. For example, the driver may adjust the distance based on a particular loading dock.
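The geometry behind placing lines at fixed distances on a flat support surface can be illustrated with a small worked example; the mounting height and line spacing below are assumed values, and the sketch assumes a light source mounted approximately above the reference position r (in Python):

import math

def depression_angle_deg(h_m: float, d_m: float) -> float:
    """Angle below horizontal at which a beam from height h_m hits flat ground d_m away."""
    return math.degrees(math.atan2(h_m, d_m))

h = 1.0                                # assumed lamp height above the ground (m)
line_distances = [0.2, 0.4, 0.6, 0.8]  # e.g. lines 20 cm apart behind the reference position r
for d in line_distances:
    print(f"line at {d:.1f} m -> aim {depression_angle_deg(h, d):.1f} degrees below horizontal")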
The predefined light pattern 2 may also be designed such that it can be used to reveal a topology of the surface that the vehicle 1 stands on, such as the ground. More specifically, the shape of the predefined light pattern will be affected by the topology of the surface on which it is projected. Typically, the guiding objects will be projected such that they have a certain shape (e.g., a circle, line or dot) when projected on a flat surface. This shape will be distorted if the surface is uneven. A distortion of the predefined light pattern caused by an uneven surface will be more apparent when viewed from an angle different from the projection angle of the predefined light pattern.
Hence, the predefined light pattern 2 will typically be distorted if the surface on which it is projected is uneven. For example, a straight line will curve, or the distance between evenly spaced objects may change, on an uneven surface. This effect may for example be used to detect a hole or recess in the ground or to estimate a distance to a vertical plane, such as a wall behind the vehicle 1. In other words, a topology of terrain, soil or surroundings may be highlighted and/or made visible because of the projected light pattern. In one example, 50 horizontal lines are projected on an apparently smooth and newly paved surface. However, there is an obstacle on the surface (such as a curb, road bump or pit) that has the same color and pattern as the surface and is therefore not easily visible. Because the lines distort at the obstacle, it becomes very obvious to the driver that there is a deviation. In some embodiments, the monitoring arrangement 11 is configured to detect such a deviation and warn the driver or enhance the image to highlight the obstacle. In other words, in some embodiments, the projected predefined light pattern 2 is configured such that the predefined light pattern 2 indicates a topology of a surface on which it is projected.
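The reason the pattern reveals topology is ordinary triangulation: a beam aimed at a given depression angle hits a raised or lowered surface at a different point along the ground, which a camera viewing from another angle sees as a distortion of the line. The sketch below uses an assumed beam angle and obstacle height purely for illustration (in Python):

import math

def spot_shift_m(dz_m: float, depression_deg: float) -> float:
    """Ground-plane shift of a projected spot when the surface height changes by dz_m."""
    return dz_m / math.tan(math.radians(depression_deg))

# Assumed example: a beam aimed 10 degrees below horizontal meeting a 5 cm curb.
print(f"{spot_shift_m(0.05, 10.0):.2f} m shift of the projected spot")  # roughly 0.28 m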
The driver can see the guiding objects in the monitoring arrangement 11, either in mirrors 11a or on displays 11c. In some embodiments, the surrounding monitoring arrangement 11 comprises a mirror 11a arranged to image the monitored zone comprising the one or more surfaces where the predefined light pattern 2 is projected. In this case the driver only needs to look in the mirror to see the guiding objects.

Alternatively, the monitoring arrangement 11 captures one or more images of the monitored zone 3 and displays the captured images on a display arranged in the cabin 15. In other words, in some embodiments, the method comprises capturing S3 the image, using an image sensor 11b, and displaying S4 the image on a display 11c. However, it shall be appreciated that this does not require that the monitoring arrangement 11 is aware of the projected light pattern, as it will show up in the image anyway. In some embodiments, the monitoring arrangement 11 is aware of the projected light pattern and may assist the driver in detecting and highlighting objects.

In some embodiments, the predefined light pattern 2 is projected with a wavelength outside the visible spectrum. This may be beneficial as no one outside the vehicle 1 will be disturbed by the light. A wavelength outside the visible spectrum may anyway be detected by the image sensor 11b, and thereby it may be displayed, such as overlaid, on the image shown to the driver on the display 11c. In other words, in some embodiments, the displaying S4 comprises imaging the predefined light pattern 2 on the display with a wavelength in the visible spectrum.

In some embodiments, the projection of the predefined light pattern 2 comprising one or more distinct guiding objects is deactivated when it is not needed or used. Hence, in some embodiments, the method comprises deactivating S4 the projecting S1 of the predefined light pattern 2. In some embodiments, the projecting is deactivated in response to one or more triggers including one or more of: a reverse gear being disengaged, turn signals being deactivated, the HR vehicle 1 having a speed above a certain level, or any trigger activated by a driver, such as user input. Activation and deactivation are up to implementation and may be different for different vehicles and different scenarios.
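The step of imaging a detected pattern on the display in a visible color, mentioned above, can be sketched as a simple image-processing operation; the example below assumes OpenCV and NumPy are available, uses an arbitrary brightness threshold, and only illustrates the overlay step (an image sensor sensitive to the projected wavelength would still be required) (in Python):

import cv2
import numpy as np

def overlay_pattern(frame_bgr: np.ndarray, brightness_threshold: int = 220) -> np.ndarray:
    """Re-draw bright projected pixels in visible green on top of the camera frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, brightness_threshold, 255, cv2.THRESH_BINARY)
    overlay = frame_bgr.copy()
    overlay[mask > 0] = (0, 255, 0)  # image the pattern with a visible (green) color
    return cv2.addWeighted(frame_bgr, 0.6, overlay, 0.4, 0.0)

# Usage with a synthetic frame (no camera required): a bright projected "line".
frame = np.zeros((120, 160, 3), dtype=np.uint8)
frame[100:104, 20:140] = 255
shown = overlay_pattern(frame)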
Figs. 6a and 6b illustrate another example embodiment of a predefined light pattern 2 projected by a projection device 12 arranged on a vehicle 1. Fig. 6a illustrates the vehicle 1 obliquely from behind and Fig. 6b illustrates a top view of the vehicle 1. In this embodiment the predefined light pattern 2 is projected using two light beams E that are projected from a projector device 12 comprising individual light sources 12a that are spatially separated from each other (here arranged inside the taillights). The beams are herein represented by a corresponding vector E. Each beam projects one guiding object 2b; in the illustrated example the guiding objects 2b are points.

In these embodiments the predetermined light pattern 2 is projected using light beams E that are intersecting or skewed (i.e., they are not parallel), with an angle, or skew angle, of less than 90 degrees in between. Stated differently, the light beams E are angled towards each other in at least one plane or two-dimensional representation. More specifically, they are angled towards each other such that the distance between the beams E decreases (in the two-dimensional plane) in an area closest to the vehicle, i.e. from the projector device 12 and up to the point of convergence, whereafter the distance between the beams increases again. In other words, the light beams E are projected such that they converge in at least one two-dimensional representation. A two-dimensional representation (or projection) of a vector, such as a beam E, onto a 2D plane is calculated by subtracting the component of E that is orthogonal to the 2D plane from E. Hence, a representation on the horizontal plane (x-y in Fig. 6a) corresponds to removing the vertical component z from E, while keeping the horizontal components x, y. A representation of the beams in the horizontal plane is shown in Fig. 6b. It must be appreciated that in this representation the convergence of the beams E is independent of whether the light sources 12a are arranged at the same or different heights (i.e. z-coordinates).

In other words, in some embodiments, the projecting S1 comprises projecting two distinct guiding objects 2b using two light beams 5 that are converging in at least one two-dimensional representation. For example, in the illustrated example the light beams E converge when represented in the horizontal plane that is parallel with a surface on which the vehicle 1 stands (as illustrated in Fig. 6b). The light sources 12a may also be positioned right on top of each other, such that the light beams E only converge when represented in a vertical plane (x-z or y-z plane of Fig. 6a), such as seen from the side. The beams may also converge in more than one two-dimensional representation. The effect of using beams angled towards each other is that a spatial relation between the guiding objects 2b projected by the beams E will depend on a distance to a surface on which they are projected. In some embodiments, the beams E are angled such that, in an area closest to the vehicle, the distance between two individual guiding objects decreases with the distance from the vehicle 1. The driver may then use the distance between the objects to estimate the distance to an obstacle in the monitored zone 3. In some embodiments, the light beams are substantially coplanar, which implies that they are at least partially intersecting as they are angled towards each other. For example, the light sources are arranged at about the same height (same z-coordinate).
Then, the light beams E can be combined to form one object, such as a point, on an obstacle, or on the ground, at a certain distance d from the vehicle 1. The guiding objects 2b may also have other shapes, such as circles or triangles, that move towards, or away from, each other on the surface they are projected on, based on a distance between the projector device 12 and the surface. Fig. 6b illustrates how the guiding objects will be projected on surfaces 4 (the dashed lines) at different distances from the back of the vehicle 1. In some embodiments, the light beams 5 intersect at a predefined distance d from the reference position r (here the rear part of the vehicle 1) within the monitored zone 3. If an object is closer or further away, two points will be visible to the driver. In some embodiments the guiding objects formed by the individual light beams E are different in color, whereby a driver may get additional information. For example, two colors are combined into a third color at the certain distance d. Alternatively, the two guiding objects formed by the two light beams may have different shapes such that they form a third shape when they are combined at the certain distance d.
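The distance cue given by the converging beams can be made concrete with a small geometric sketch: two beams that start a baseline apart (for example, one in each taillight) and intersect at the predefined distance d produce spots whose lateral separation shrinks linearly to zero at d and grows again beyond it. The baseline and d below are assumed example values (in Python):

def spot_separation_m(s_m: float, baseline_m: float = 2.4, d_m: float = 1.5) -> float:
    """Lateral separation of the two guiding objects on a surface s_m behind the vehicle."""
    return abs(baseline_m * (1.0 - s_m / d_m))

for s in (0.5, 1.0, 1.5, 2.0):
    print(f"surface at {s:.1f} m -> guiding objects {spot_separation_m(s):.2f} m apart")
# At exactly s = d = 1.5 m the two guiding objects merge into a single point.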
The terminology used in the description of the embodiments as illustrated in the accompanying drawings is not intended to be limiting of the described method, control arrangement or computer program. Various changes, substitutions and/or alterations may be made without departing from the disclosure embodiments as defined by the appended claims.

The term "or" as used herein is to be interpreted as a mathematical OR, i.e., as an inclusive disjunction, not as a mathematical exclusive OR (XOR), unless expressly stated otherwise. In addition, the singular forms "a", "an" and "the" are to be interpreted as "at least one", thus also possibly comprising a plurality of entities of the same kind, unless expressly stated otherwise. It will be further understood that the terms "includes", "comprises", "including" and/or "comprising" specify the presence of stated features, actions, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, actions, integers, steps, operations, elements, components and/or groups thereof. A single unit such as, e.g., a processor may fulfil the functions of several items recited in the claims.

Claims (22)

Claims
1. A method for assisting a driver of a Heavy Road, HR, vehicle (1), in surveilling surroundings of the HR vehicle (1), wherein the HR vehicle (1) comprises a surrounding monitoring arrangement (11) configured to display, to the driver, an image of a monitored zone (3) adjacent to the HR vehicle (1), the method comprising:
- projecting (S1), using one or more light sources (12a) attached to the HR vehicle, a predefined light pattern (2) comprising one or more distinct guiding objects (2a, 2b), on one or more surfaces located within the monitored zone (3), such that the driver is able to see the predefined light pattern (2) in the image displayed by the surrounding monitoring arrangement (11).
2. The method according to claim 1, wherein the projected predefined light pattern (2) is indicative of a predefined distance (d) to a reference position (r) of the HR vehicle (1).
3. The method according to claim 2, wherein the reference position (r) comprises a reference point, a reference line, a surface, a side, or an edge of the HR vehicle (1) that is affected by a current physical footprint of the HR vehicle (1).
4. The method according to claim 2 or 3, wherein the HR vehicle (1) is configurable to have different physical footprints, and wherein the reference position (r) is affected by a current physical footprint of the HR vehicle (1).
5. The method according to any one of claims 2 to 4, wherein the projecting (S1) comprises projecting the predefined light pattern (2) on a flat vehicle support surface, such that the one or more guiding objects (2a) are positioned at predefined distances (d) from the reference position (r) upon the predefined light pattern is projected onto a vehicle support surface being flat.
6. The method according to any one of claims 2 to 5, wherein the projecting (S1) comprises projecting two distinct guiding objects (2b) using two light beams (5) that are angled towards each other such that they converge in at least one two-dimensional representation.
7. The method according to claim 6, wherein the light beams (5) at least partially intersect at a predefined distance (d) from the reference position (r).
8. The method according to any one of claims 2 to 7, wherein the predefined distance (d) is static or is user configurable.
9. The method according to any one of the preceding claims, wherein the projected predefined light pattern (2) is configured such that the predefined light pattern (2) indicates a topology of a surface on which it is projected.
10. The method according to any one of the preceding claims, wherein the method comprises capturing (S3) the image, using an image sensor (11b), and displaying (S4) the image on a display (11c).
11. The method according to claim 10, wherein the predefined light pattern (2) is projected with a wavelength outside the visible spectrum which is detectable by the image sensor (11b), and wherein the displaying (S4) comprises imaging the predefined light pattern (2) on the display with a wavelength in the visible spectrum.
12. The method according to any one of the preceding claims, wherein the surrounding monitoring arrangement (11) comprises a mirror (11a) arranged to image the monitored zone comprising the one or more surfaces where the predefined light pattern (2) is projected.
13. The method according to any one of the preceding claims, wherein the one or more distinct guiding objects comprises one or more of a point, a line, a grid, and a scale.
14. The method according to any one of the preceding claims, wherein the one or more distinct guiding objects comprises at least two distinct guiding objects having different colors.
15. The method according to any one of the preceding claims, wherein the method comprises activating (S0) and/or deactivating (S4) the projecting (S1) of the predefined light pattern (2).
16. The method according to claim 15, wherein the projecting is activated in response to one or more triggers including one or more of: a reverse gear being engaged, turn signals being activated, the HR vehicle (1) having a speed below a certain level, a certain steering angle, or a trigger activated by a driver.
17. The method according to any one of the preceding claims, wherein the projecting (S1) comprises projecting the predefined light pattern (2) using movable and/or directable light sources (12a) arranged on the HR vehicle (1).
18. The method according to any one of the preceding claims, wherein the projecting (S1) comprises projecting the predefined light patterns using one or more light sources (12a) arranged in taillights, or in a reverse camera of the HR vehicle (1).
19. The method according to any one of the preceding claims, wherein the method is performed independently of the surrounding monitoring arrangement (11).
20. A projection device (12) configured to assist a driver of a Heavy Road, HR, vehicle (1), in surveilling surroundings of the HR vehicle (1), wherein the HR vehicle (1) comprises a surrounding monitoring arrangement (11) configured to display, to the driver, an image of a monitored zone (3) adjacent to the HR vehicle (1), wherein the projection device (12) comprises:
- one or more light sources (12a) configured to be attached to the HR vehicle (1),
- optical components (12b), and
- an attachment mechanism (12c) configured to attach the projection device (12) to the HR vehicle (1),
and wherein the one or more light sources (12a) and the optical components (12b) are arranged such that when the projection device (12) is attached to the HR vehicle (1) a predefined light pattern (2) comprising one or more distinct guiding objects is projected on one or more surfaces located within the monitored zone (3), such that the driver is able to see the predefined light pattern (2) in an image displayed by the surrounding monitoring arrangement (11).
21. The projection device (12) according to claim 20, wherein the projection device (12) is configured to perform the method according to any one of claims 2-
22. A Heavy Road, HR, vehicle (1) comprising:
- a surrounding monitoring arrangement (11) configured to display, to the driver, an image of a monitored zone (3) adjacent to the HR vehicle (1), and
- the projection device (12) according to claim 20 or 21.
SE2250066A 2022-01-25 2022-01-25 Method and projection device for assisting a driver of a heavy road vehicle, in surveilling surroundings SE2250066A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
SE2250066A SE2250066A1 (en) 2022-01-25 2022-01-25 Method and projection device for assisting a driver of a heavy road vehicle, in surveilling surroundings
DE102022132150.9A DE102022132150A1 (en) 2022-01-25 2022-12-05 Method and projection device for supporting a driver of a heavy-duty vehicle in monitoring an environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
SE2250066A SE2250066A1 (en) 2022-01-25 2022-01-25 Method and projection device for assisting a driver of a heavy road vehicle, in surveilling surroundings

Publications (1)

Publication Number Publication Date
SE2250066A1 true SE2250066A1 (en) 2023-07-26

Family

ID=87068466

Family Applications (1)

Application Number Title Priority Date Filing Date
SE2250066A SE2250066A1 (en) 2022-01-25 2022-01-25 Method and projection device for assisting a driver of a heavy road vehicle, in surveilling surroundings

Country Status (2)

Country Link
DE (1) DE102022132150A1 (en)
SE (1) SE2250066A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6204754B1 (en) * 1999-06-17 2001-03-20 International Business Machines Corporation Proximity indicating system
US6543917B1 (en) * 2000-11-20 2003-04-08 Peter Till Vehicle position indicating device
DE10248650A1 (en) * 2002-10-18 2004-05-06 Daimlerchrysler Ag Parking aid for motor vehicle has light source that radiates light in visible wavelength range towards road surface so that partial or complete second vehicle profile is represented on road surface
DE102013222137A1 (en) * 2013-10-30 2015-04-30 Continental Automotive Gmbh Camera arrangement for a motor vehicle, motor vehicle and method
JP2016222078A (en) * 2015-05-29 2016-12-28 株式会社ビートソニック Vehicular rearview monitoring method
US20180257546A1 (en) * 2017-03-08 2018-09-13 Ford Global Technologies, Llc Vehicle illumination assembly
WO2020025313A1 (en) * 2018-07-30 2020-02-06 HELLA GmbH & Co. KGaA Marking device and vehicle trailer with the marking device
US20200055446A1 (en) * 2017-06-21 2020-02-20 Mitsubishi Electric Corporation Rear wheel position indicator


Also Published As

Publication number Publication date
DE102022132150A1 (en) 2023-07-27

Similar Documents

Publication Publication Date Title
US11693422B2 (en) Sensor array for an autonomously operated utility vehicle and method for surround-view image acquisition
US11285875B2 (en) Method for dynamically calibrating a vehicular trailer angle detection system
US10434945B2 (en) Method and device for displaying an image of the surroundings of a vehicle combination
US10589680B2 (en) Method for providing at least one information from an environmental region of a motor vehicle, display system for a motor vehicle driver assistance system for a motor vehicle as well as motor vehicle
EP2682329B1 (en) Vehicle trailer connect system
US9507345B2 (en) Vehicle control system and method
US8199975B2 (en) System and method for side vision detection of obstacles for vehicles
US7379089B2 (en) Apparatus and method for monitoring the immediate surroundings of a vehicle
US20090022423A1 (en) Method for combining several images to a full image in the bird's eye view
NL2018281B1 (en) Method and system for alerting a truck driver
EP1562146A2 (en) Image-based detection of motion in vehicle environment
US20160375829A1 (en) Display System For Vehicles, In Particular Commercial Vehicles
US20190163988A1 (en) Periphery monitoring device
EP3294611B1 (en) Predicted position display for vehicle
JP2023531637A (en) Human Machine Interface for Commercial Vehicle Camera Systems
US20200084395A1 (en) Periphery monitoring device
US20190389488A1 (en) Surroundings monitoring device
WO2014158081A1 (en) A system and a method for presenting information on a windowpane of a vehicle with at least one mirror
KR20230021737A (en) Apparatus for verifying the position or orientation of a sensor in an autonomous vehicle
SE2250066A1 (en) Method and projection device for assisting a driver of a heavy road vehicle, in surveilling surroundings
JP6855254B2 (en) Image processing device, image processing system, and image processing method
KR20160143247A (en) Apparatus and method for warning a dangerous element of surrounding of vehicle
CN110691726B (en) Method and device for evaluating the state of a driver, and vehicle
US11967237B2 (en) Blind spot warning method and system for a motor vehicle having a trailer coupled thereto
US20230356653A1 (en) Information processing apparatus, information processing method, program, and projection apparatus