US11997418B2 - Indirect viewing system and method for adjusting a frame rate - Google Patents
- Publication number
- US11997418B2 (application US17/522,231)
- Authority
- US
- United States
- Prior art keywords
- image
- images
- frame rate
- percentage
- output device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/01—Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
- H04N7/0127—Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level by changing the field or frame frequency of the incoming video signal, e.g. frame rate converter
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/76—Circuitry for compensating brightness variation in the scene by influencing the image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/77—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/303—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using joined images, e.g. multiple camera images
Definitions
- the present invention relates to an indirect viewing system, which can be used in particular as a mirror replacement system (mirror substitute system) according to UN/ECE-R46 for a vehicle, and to a method for adjusting a frame repetition rate (refresh rate) of images acquired by an image sensor of an image acquisition device of the viewing system.
- the above problem is solved by using special, high-priced image sensors that enable high resolution with full dynamic range and a refresh rate of, for example, 60 fps.
- a standard available image sensor with a frame rate of less than 60 fps is used and the frame rate is increased from, for example, 30 fps to 60 fps computationally by a processing unit of the camera system by outputting each image frame twice in succession on the monitor.
- a computational increase of the frame rate from, for example, 40 fps to 60 fps can be performed by outputting every second image frame twice in succession on the monitor.
- One disadvantage of the above-mentioned known techniques for adapting the refresh rate of the image sensor to the refresh rate of the LCD panel is that the image impression for the vehicle driver corresponds to that of a 30 fps or 40 fps camera system and thus appears less smooth, i.e. more "faltering", making it more difficult to assess high relative speeds between the vehicle itself and other road users.
- the relative speeds are lower for mirror replacement systems according to UN/ECE-R46 of groups I-IV than for mirror replacement systems of groups V and VI, rear view cameras or surround view (bird's eye) systems, due to the image perspective of the representation of the image content when using the systems.
- An object of the invention is to provide a camera system for a vehicle as well as a method, which adapt the frame rate of an image sensor to the frame rate of a display panel (LCD panel, OLED panel, LED panel, etc.) and at the same time give the vehicle driver the impression of a smooth, non-faltering image sequence.
- the indirect viewing system for a vehicle comprises at least one image acquisition device comprising an image sensor for continuously acquiring images at a first frame rate corresponding to the frame rate of the image sensor.
- Standard image sensors used in automotive applications typically have a refresh rate of less than 60 fps. This low frame rate means that less data needs to be transmitted within the system, allowing the use of lower cost components.
- images captured by the image capture device are temporarily stored in an image memory for further processing of the images in time after the images are captured.
- this further processing is carried out in such a way that the refresh rate of the image sensor is adapted to a desired refresh rate of 60 fps, in this case prescribed by the vehicle manufacturer, in order to be able to display, for the human eye, a smooth motion sequence on an image output device with a refresh rate of 60 fps despite the low refresh rate of the image sensor.
- the processing of the images temporarily stored in the image memory is performed by an image processing device which continuously calculates at least one new image from at least two consecutively captured images.
- the newly calculated images have a second frame rate that is higher than the first frame rate at which the images were captured.
- the first frame rate is at 40 fps
- the second frame rate is at 60 fps.
- any number of frame rates can be adjusted to any number of higher or lower frame rates. This makes it possible to combine a wider variety of image sensors with different LCD panels, thereby increasing the number of possible technical solutions and providing cost advantages.
- the indirect viewing system comprises an image output device capable of outputting images at the second, higher refresh rate (60 fps). Specifically, the image output device outputs the images continuously recalculated by the image processing device at this second refresh rate, to which the refresh rate of the image sensor has been adapted.
- the continuous recalculation of new images in the image processing device is specifically performed by summing two successive images, each multiplied by a percentage factor, to produce a recalculated image.
- the percentage factor corresponds to a temporal position of the recalculated image with respect to the temporal position of the underlying at least two successively acquired images.
- the above-mentioned percentage factors depend in particular on the image refresh rate of the image capturing device or the image sensor and the image refresh rate of the image output device. These percentage factors may be predetermined and stored in a memory or calculated by the image processing device at system run time.
- the present invention in generating a new image, specifically brightness information and/or color information of each pixel in a captured image is multiplied by a corresponding percentage factor, and respective pixels corresponding to each other in the at least two successively captured images are summed to generate the new image.
- no sequence of successive images is analyzed in an attempt to find identical or related image parts in order to determine motion vectors, which could then be used to reconstruct the position of an object at the desired point in time between two successive images.
- the computational effort for generating a new image is reduced. In this case, the generation of so-called “ghost images” is accepted, since these do not impair the impression of a smooth motion sequence for the human eye.
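The per-pixel weighted summation described above can be sketched as follows. This is an illustrative model, not the patented implementation: frames are reduced to short lists of brightness values, and `blend_frames`, the sample data, and the 0.17/0.83 factors are hypothetical.

```python
# Sketch: blend two successively captured frames into one new frame by
# per-pixel weighted summation, with no motion estimation (ghosting accepted).

def blend_frames(frame_a, frame_b, factor_a, factor_b):
    """Multiply each pixel of both frames by its percentage factor and sum
    the pixels that sit at the same spatial position."""
    if len(frame_a) != len(frame_b):
        raise ValueError("frames must have the same resolution")
    return [factor_a * a + factor_b * b for a, b in zip(frame_a, frame_b)]

older = [100, 100, 200, 200]   # hypothetical 4-pixel frame captured first
newer = [100, 200, 200, 100]   # frame captured one capture period later

# Factors summing to 1.0 (100 %) keep the overall brightness unchanged;
# a sum above 1.0 brightens the new image, a sum below 1.0 darkens it.
blended = blend_frames(older, newer, 0.17, 0.83)
```

In a real system the same factors would be applied to each colour channel of the pixel data or, as the description notes, to the raw sensor data.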
- the brightness information and/or color information of each pixel of captured images may not be used in the calculation of a new image, but the brightness information and/or color information of the raw data of the image sensor itself.
- the percentage factor corresponds to the temporal position of the newly calculated image with respect to the temporal position of the at least two consecutively acquired images.
- the percentage factors may be stored as constants in the image processing device.
- a grid for the factor determination can be stored in a memory, wherein the grid for the factor determination depends on the image refresh rate of the image capturing device and the image refresh rate of the image output device, i.e. on a (first) image refresh rate of the image sensor to be adapted to the (second) image refresh rate of the image output device.
- the above-mentioned predetermined grid for factor determination may have a so-called initial phase offset between the first and second image refresh rates, so that the percentage factors depend not only on the image refresh rate of the image capturing device and the image refresh rate of the image output device, but also on this initial phase offset between the first and second image refresh rates.
- By additionally using the initial phase offset, as many new images as possible can be calculated, of which as few as possible correspond to originally acquired images.
- the sum of the percentage factors by which two successive images are multiplied may be equal to 100%. If the sum is greater than 100%, image brightening occurs, whereas if the sum is less than 100%, image darkening occurs in the new image generated from these two successive images.
- the image refresh rate of the image capture device is lower than the image refresh rate of the image output device, and both image refresh rates may be static or fixed, so that they do not change during operation of the system.
- the image refresh rate of the image capture device changes dynamically as a function of vehicle states, for example speed, forward and reverse travel, standstill, parking process, maneuvering process, turning process, etc., as a function of vehicle signals, for example turn signals, reverse gear, brightness sensors, acceleration sensors, etc., as a function of a manual input by a user, for example pressing a button, etc., and/or as a function of signals from the indirect viewing system itself, for example a brightness detected by the image capture device via the image sensor, by a brightness sensor in the image output device, etc.
- the exposure time of the image sensor can be extended, thereby achieving higher sensitivity and thus better visibility in dark environments.
- the image processing device is integrally formed in the image capture device or in the image output device
- the intermediate image memory is integrally formed in the image processing device.
- no more than 200 ms should elapse from the time of capturing the first of the at least two successive images to displaying the newly calculated image derived therefrom to the driver.
- the method comprises a continuous acquiring (capturing) of images at the image frame rate of the image sensor of the image capture device, an intermediate storage (caching) of the acquired images for processing that takes place after the images are acquired, a continuous calculation of at least one new image from at least two successively acquired images, wherein the newly calculated images have the image frame rate of the image output device, and an outputting of the continuously recomputed images at the frame rate of the image output device.
- calculating the at least one new image is performed by forming a sum of at least two successive images each multiplied by a percentage factor, the percentage factor corresponding to the temporal position of the newly calculated image with respect to the temporal position of the at least two successively acquired images.
- the brightness information and/or color information of each pixel in a captured image is multiplied, separately or in combination, by a corresponding percentage factor, and respective corresponding pixels in the at least two successively captured images are then summed to generate the new image.
- alternatively, the brightness information and/or color information of the raw data of the image sensor may be multiplied, separately or in combination, by a corresponding percentage factor, and the respective results in the at least two consecutively captured images are then summed to generate the new image.
- the percentage factors in the method according to the invention depend on the frame rate of the image capturing device and the frame rate of the image output device.
- the percentage factors may depend not only on the frame rates of the image capturing device and the image output device, but additionally on an initial phase offset between the frame rates.
- the percentage factors may be stored as constants in the image processing device or calculated at runtime by the image processing device.
- the sum of the percentage factors by which at least two successive images are multiplied may be equal to 100%, or greater than 100% in the case of image brightening or less than 100% in the case of image darkening of the newly generated image.
- the frame rate of the image capture device and the frame rate of the image output device may be static and not change during operation.
- the image repetition rate of the image capture device may change dynamically in response to vehicle conditions, vehicle signals, manual input by the user, and/or signals from a brightness sensor of the image capture device and/or signals from a brightness sensor of the image output device, similar to the above system of the invention.
- the image processing device may be integrated in the image capture device or the image output device, and/or the image memory may be integrated in the image processing device, similar to the above system according to the present invention.
- FIG. 1 is a schematic view of an indirect viewing system according to an embodiment of the present invention
- FIG. 2 is a schematic block diagram for explaining the calculation of a new image according to an embodiment of the present invention
- FIG. 3 is a first example of a frame rate increasing according to the prior art
- FIG. 4 is a first embodiment of a frame rate increasing according to the invention.
- FIG. 5 is a second embodiment of a frame rate increasing according to the invention.
- FIG. 6 is a second example of a frame rate increasing according to the prior art
- FIG. 7 is a third embodiment of a frame rate increasing according to the present invention.
- FIG. 8 is a fourth embodiment of a frame rate increasing according to the invention.
- FIG. 9 is a flowchart according to an embodiment of the method according to the present invention.
- FIG. 10 is a flowchart for illustrating the calculation of a new image according to the invention.
- FIG. 1 shows a schematic block diagram of an indirect viewing system for a vehicle according to a preferred embodiment of the invention.
- the indirect viewing system 1 has two image capture devices 2 , each having an image sensor 3 for continuously acquiring images at a first frame rate corresponding to the frame rate of the respective image sensor. Only one, or also several, image capture devices may be present. In the following, only the image capture device 2 shown on the left in the figure will be described.
- the frame rate of the image sensor 3 is, for example, equal to 45 frames per second (fps) or 50 fps.
- the image capture device 2 is connected to an image memory 4 that temporarily stores the images captured by the image capture devices 2 .
- the images captured by the image capturing devices 2 are continuously acquired images that are displayed to a driver of the vehicle in near real time on an image output device 5 .
- the images acquired by the image capture device 2 can also remain permanently stored in the image memory 4 for later use in an accident analysis, in the investigation of property damage, etc.
- the viewing system 1 further comprises an image processing device 6 , which is connected to the image memory 4 and uses the captured images stored in the image memory 4 for a continuous calculation of at least one new image from at least two successively captured images stored in the image memory 4 .
- the image processing device 6 is shown in FIG. 1 as a separate component, but may also be integrated in the image memory 4 , or the image memory 4 may be integrated in the image processing device 6 .
- the image processing device 6 is further connected to the image output device 5 to display the images recomputed in the image processing device 6 on the image output device 5 at a higher frame rate than the image sensor 3 of the image capture device 2 can pick up.
- the calculation process performed in the image processing device 6 to calculate a new image from at least two successively acquired images will be described in more detail later.
- FIG. 1 further shows a first sensor 7 and a second sensor 8 , each of which is connected to the image storage device 4 .
- the first sensor 7 and the second sensor 8 can alternatively also be connected to the image processing device 6 and/or the image capture device 2 .
- the first sensor 7 detects vehicle conditions, such as speed, forward and reverse, stationary, parking, maneuvering, turning, etc.
- the second sensor 8 detects vehicle signals, such as turn signal, reverse gear, brightness sensor, acceleration sensor.
- the first and second sensors 7 , 8 may also additionally or alternatively detect a manual input from a driver or a brightness detected via the image sensor 3 of the image capture device 2 , respectively, to dynamically adjust the image frame rate of the image capture device 2 depending on the signals detected by the first and second sensors 7 , 8 .
- the exposure time of the image sensor 3 can be extended, thereby achieving higher sensitivity and thus better visibility in dark environments.
- FIG. 2 shows a schematic diagram illustrating the calculation of a new image according to the invention.
- the images 9 , 10 shown in FIG. 2 are acquired by the at least one image capture device 2 shown in FIG. 1 and stored in the image memory 4 as subsequently acquired images.
- the image processing device 6 of FIG. 1 takes these images 9 , 10 stored in the image memory 4 and multiplies the image 9 by a percentage factor 1. Furthermore, the image processing device 6 multiplies the image 10 by a percentage factor 2. The calculation or determination of the percentage factors is described in more detail later.
- each pixel in the image 10 is multiplied by the percentage factor 2.
- the percentage factors 1, 2 may differ from each other. For example, the sum of percentage factor 1 and percentage factor 2 equals 100%, i.e. 1, as described in more detail later.
- images 9 , 10 are summed to produce a new image 11 .
- the pixel 12 in image 9 whose brightness and/or color information has been multiplied by the percentage factor 1 and the pixel 13 in image 10 that has been multiplied by the percentage factor 2 are summed.
- those pixels in image 9 are added to those in image 10 which are locally located at the same position.
- pixel 12 in image 9 is added to pixel 13 , even though pixel 12 would correspond to pixel 14 in image 10 due to a movement of image content.
- the above process is performed for each single pixel in images 9 and 10 to calculate the new image 11 that is displayed on the image output device 5 .
- FIG. 3 shows a first example of increasing the frame rate from 50 fps to 60 fps according to the prior art.
- the top row in FIG. 3 shows ten images acquired by the image capture device 2 at an image frame rate of 50 fps at 0, 20, . . . , 180 ms.
- the bottom row in FIG. 3 shows eleven images output at a frame rate of 60 fps.
- the image captured at 0 ms is reproduced twice in succession at 0 ms and 16.6 ms.
- the image captured by the image capture device at 100 ms is reproduced twice in succession at 100 ms and 116.6 ms.
- the images captured at 20 ms, 40 ms, 60 ms, 80 ms, 120 ms, 140 ms, 160 ms, and 180 ms by the image capture device are each output once but time-shifted on the image output device, as shown in FIG. 3 , to obtain a frame rate of 60 fps.
- no images are recomputed or newly calculated from at least two successive images captured by the image capture device, as described below with reference to FIG. 4 .
- FIG. 4 shows a first embodiment of the invention for increasing a frame rate from 50 fps to 60 fps.
- FIG. 4 shows in the middle row, similar to FIG. 3 , ten images captured by the image capture device at a frame rate of 50 fps.
- the image at 0 ms and the image at 20 ms are processed according to the process described in FIG. 2 to form a new image 1 in the top line in FIG. 4 by multiplying the image captured at 0 ms by the percentage factor 1.0, multiplying the image captured at 20 ms by the percentage factor 0.0, and summing the resulting images to form the new image 1 . Due to the percentage factors of 0.0 and 1.0, the new calculated image 1 in this case is identical to the image acquired at 0 ms.
- the new image 1 is output at 20 ms on the image output device 5 .
- the two images taken at 0 ms and 20 ms are further computed into a new image 2 , as shown in the top line of FIG. 4 .
- the image acquired at 0 ms is multiplied by the percentage factor 0.17, and the image acquired at 20 ms is multiplied by the percentage factor 0.83.
- the resulting images are summed to form the new image 2 .
- the newly calculated image 2 is output at 36.6 ms.
- the image 3 is calculated from the image captured at 20 ms by the image capture device and the subsequent image captured at 40 ms, as shown in the top line in FIG. 4 , by multiplying the image captured at 20 ms by the percentage factor 0.33, multiplying the image captured at 40 ms by the percentage factor 0.67, and summing the resulting images to form the new image 3 .
- the top line in FIG. 4 corresponds to a grid (raster) for determining the percentage factors to convert a frame rate of 50 fps to 60 fps.
- the grid is predefined in a memory as described above, for example, or is calculated at runtime. The grid changes depending on the frame rate of the image capture device and the frame rate of the image output device, as will become clear later with reference to FIG. 7 .
- the percentage factors shown in FIG. 4 correspond to the temporal position of the newly calculated image in relation to the temporal position of the at least two successively captured images.
- the percentage factors correspond to a kind of weighting of the captured images, which is carried out on the basis of temporal criteria, i.e. on the basis of the temporal relationship or distance between the time of the image to be output and the time of the captured image.
- the newly generated image 2 in the top line of FIG. 4 is at 36.6 ms, i.e., between the image captured at 20 ms and the image captured at 40 ms in the middle line of FIG. 4 .
- the percentage factor by which the image captured at 20 ms is multiplied is calculated by the temporal distance (36.6 ms-20 ms) divided by the temporal distance of two successive images at 50 fps (20 ms).
- the percentage factor by which the image captured at 0 ms is multiplied to generate the new image 2 is calculated by the temporal distance between the image captured at 40 ms and the newly generated image 2 (40 ms-36.6 ms) divided by the temporal distance of two successive images at 50 fps (20 ms).
- the percentage factors are determined from the grid for the remaining captured images and used to generate the new images 3 to 11 .
- the newly generated image 1 is output at 20 ms and calculated from the images acquired at 0 ms and 20 ms using the percentage factors 1 and 0 as described above.
- the newly generated image 2 is output at 36.6 ms and generated from the images acquired at 0 ms and 20 ms with the percentage factors 0.17 and 0.83.
- the newly generated image 3 is output at 53.3 ms and is generated from the images captured at 20 ms and 40 ms with the percentage factors 0.33 and 0.67.
- the newly generated images 4 to 11 are output on the image output device at a frame rate of 60 fps.
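The factor grid of FIG. 4 can be reproduced from the two frame rates alone, following the temporal-distance rule stated above. This is a sketch under that assumption; `percentage_factors` is a hypothetical helper, not part of the patent.

```python
# For the k-th output image (k = 0, 1, 2, ...), the content lies at a temporal
# position of k / f_out; the weight of the later of the two surrounding
# captured frames is the fractional position of that time within one capture
# interval, and the weight of the earlier frame is the remainder to 100 %.

def percentage_factors(k, f_in=50.0, f_out=60.0):
    """Return (index of earlier captured frame, factor_earlier, factor_later)
    for the k-th newly calculated output image."""
    pos = k * f_in / f_out      # temporal position measured in capture periods
    i = int(pos)                # index of the earlier captured frame
    frac = pos - i              # fractional position between frames i and i+1
    return i, 1.0 - frac, frac

# New image 2 of FIG. 4 (output at 36.6 ms, blending the 0 ms and 20 ms frames):
i, fa, fb = percentage_factors(1)
```

Rounding `fa` and `fb` to two decimals reproduces the 0.17/0.83 and 0.33/0.67 pairs named in the text.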
- FIG. 5 shows a second embodiment for increasing a frame rate from 50 fps to 60 fps.
- the second embodiment differs from the first embodiment in that, in addition, the grid for factor determination comprises an initial phase offset, as shown in FIG. 5 .
- An initial phase shift means that the grid shown in FIG. 4 is shifted in time.
- the grid of FIG. 4 is shifted to the right in such a way that the first newly generated image is generated at 8.3 ms.
- the temporal position of 8.3 ms with respect to the temporal position of the images acquired at 0 ms and 20 ms results in the percentage factors 0.58 and 0.42, which are calculated as described above. Similar to FIG. 4 , the sum of the percentage factors used to calculate the new image 1 equals 100%.
- the second embodiment with the initial phase offset has the advantage that as few as possible newly calculated images coincide with the captured images.
- without the initial phase offset (FIG. 4), the recalculated images 1 and 7 output at the 20 ms and 120 ms time points are identical to the images captured at 0 ms and 100 ms.
- due to the phase shift, none of the newly generated images 1 to 11 is identical to any of the captured images.
- the output new image 1 is calculated from the images taken at 0 ms and 20 ms with the percentage factors 0.58 and 0.42
- the newly output new image 2 is calculated from the images taken or captured at 20 ms and 40 ms with the percentage factors 0.75 and 0.25
- the newly output image 3 is calculated from the images taken at 40 ms and 60 ms with the percentage factors 0.92 and 0.08
- the newly output image 4 is calculated from the images acquired at 40 ms and 60 ms with the percentage factors 0.08 and 0.92
- the new image 5 is calculated from the images taken at 60 ms and 80 ms with the percentage factors 0.25 and 0.75, and so on.
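The grid of FIG. 5 uses the same factor rule, but with the initial phase offset added to the content time. The sketch below assumes the 8.3 ms figure corresponds to half an output period (1/120 s); `factors_with_offset` is a hypothetical helper.

```python
# Same computation as for FIG. 4, shifted by an initial phase offset so that
# no output image coincides exactly with a captured image.

def factors_with_offset(k, f_in=50.0, f_out=60.0, offset_s=1.0 / 120.0):
    """(index of earlier frame, factor_earlier, factor_later) for output k,
    with an initial phase offset (here 8.33 ms) between the frame rates."""
    pos = (offset_s + k / f_out) * f_in   # position in capture periods
    i = int(pos)
    frac = pos - i
    return i, 1.0 - frac, frac            # factors for frames i and i + 1
```

Rounded to two decimals, this yields the 0.58/0.42, 0.75/0.25, and 0.92/0.08 pairs listed above for new images 1 to 3.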
- FIG. 6 shows a second example of increasing a frame rate from 45 fps to 60 fps according to the prior art.
- the top line of FIG. 6 shows nine images captured by the image capture device at 45 fps.
- the bottom line in FIG. 6 shows the output on the image output device at 60 fps.
- An increase from 45 fps to 60 fps is achieved according to the second example of the prior art by reproducing the images recorded at 0 ms, 66.6 ms and 133.3 ms twice in succession.
- the image recorded at 0 ms is reproduced twice at 0 ms and 16.6 ms
- the image recorded at 66.6 ms is reproduced at 66.6 ms and 83.3 ms
- the image recorded at 133.3 ms is reproduced at 133.3 ms and at 150 ms to obtain a frame rate increase from 45 fps to 60 fps.
- FIG. 7 shows a third embodiment of increasing a frame rate from 45 fps to 60 fps according to the invention. Similar to the first embodiment of FIG. 4 , the frame rate increase from 45 fps to 60 fps does not use a phase shift. As shown in FIG. 7 , the grid in the top row of FIG. 7 differs from the grid for factor determination in FIG. 4 in that the temporal intervals between the newly generated images are different. This results from the fact that according to the third embodiment, there is a frame rate increase from 45 fps to 60 fps, instead of from 50 fps to 60 fps.
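The 45 fps to 60 fps grid of FIG. 7 follows from the same rule with the other input rate; the following sketch (with the hypothetical helper `factors_45_to_60`) illustrates how the temporal intervals differ from FIG. 4 and how, without a phase offset, every fourth output image coincides with a captured image.

```python
# 45 -> 60 fps: the factor pattern repeats every 4 output images (3 captures).

def factors_45_to_60(k):
    pos = k * 45.0 / 60.0       # content position in capture periods
    i = int(pos)
    frac = pos - i
    return i, 1.0 - frac, frac  # factors for captured frames i and i + 1
```

Outputs 0, 4, 8, ... get factors (1.0, 0.0), i.e. they reproduce a captured frame unchanged, which is why the fourth embodiment adds a phase offset here as well.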
- FIG. 8 shows a fourth embodiment of increasing a frame rate from 45 fps to 60 fps according to the invention.
- the fourth embodiment differs from the third embodiment in that an initial phase offset is used similar to the second embodiment.
- the sum of the percentage factors for calculating a new image is in each case 100%, i.e. 1.
- the percentage factors may be increased such that the sum is greater than 100%.
- the sum can be less than 100%.
- FIG. 9 shows a flowchart according to the invention for matching the image frame rate of an image sensor of an image capture device to the image frame rate of an image output device.
- In step S1, images are continuously captured at a first frame rate (the frame rate of the image sensor).
- In step S2, the captured images are stored in a buffer memory.
- In step S3, at least one new image is calculated from two successively captured images stored in the buffer; step S3 is described in more detail with reference to FIG. 10.
- In step S4, each newly calculated image is output at a frame rate that matches the frame rate of the image output device.
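Steps S1 to S4 can be sketched as a single processing loop. The buffer handling and the linear weighting below are simplifying assumptions, and all names are illustrative rather than taken from the patent:

```python
from collections import deque

def convert_stream(captured, fps_out: float = 60.0):
    """Steps S1-S4 of FIG. 9 as a loop over a captured image stream.

    captured: iterable of (timestamp_s, image) pairs at the sensor frame
    rate; images are flat lists of pixel values.
    """
    buffer = deque(maxlen=2)                  # S2: buffer two successive images
    output, t_out, dt_out = [], 0.0, 1.0 / fps_out
    for t, img in captured:                   # S1: continuous capture
        buffer.append((t, img))
        if len(buffer) < 2:
            continue
        (t0, img0), (t1, img1) = buffer
        while t0 <= t_out <= t1:              # S3: blend with percentage factors
            a = (t_out - t0) / (t1 - t0)
            new = [(1 - a) * p0 + a * p1 for p0, p1 in zip(img0, img1)]
            output.append((t_out, new))       # S4: output on the fps_out grid
            t_out += dt_out
    return output

# Three images captured at 45 fps yield three output images on the 60 fps grid
frames = [(i / 45, [float(45 * i)]) for i in range(3)]
result = convert_stream(frames)
```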
- FIG. 10 shows a flowchart that illustrates step S3 of FIG. 9 in more detail.
- First, each captured image is multiplied by its respective predetermined percentage factor, determined as described above.
- Then at least two successive images obtained by this multiplication are summed. In this respect, reference is also made to the above description of FIG. 2.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102020129908.7 | 2020-11-12 | ||
DE102020129908.7A DE102020129908A1 (en) | 2020-11-12 | 2020-11-12 | Indirect layer system and method for adjusting a refresh rate |
Publications (2)
Publication Number | Publication Date |
---|---|
US20220150443A1 US20220150443A1 (en) | 2022-05-12 |
US11997418B2 true US11997418B2 (en) | 2024-05-28 |
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/522,231 Active 2041-11-11 US11997418B2 (en) | 2020-11-12 | 2021-11-09 | Indirect viewing system and method for adjusting a frame rate |
Country Status (6)
Country | Link |
---|---|
US (1) | US11997418B2 (en) |
EP (1) | EP4002835A1 (en) |
JP (1) | JP2022077975A (en) |
KR (1) | KR102637241B1 (en) |
CN (1) | CN114500905A (en) |
DE (1) | DE102020129908A1 (en) |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6597282B2 (en) * | 2015-12-22 | 2019-10-30 | 株式会社デンソー | Vehicle display device |
Patent Citations (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0147886B2 (en) | 1981-02-09 | 1989-10-17 | Tokyo Shibaura Electric Co | |
JPH0474082A (en) | 1990-07-13 | 1992-03-09 | Fujitsu Ltd | Pal signal/cif signal conversion circuit |
JPH05336499A (en) | 1992-05-29 | 1993-12-17 | Sanyo Electric Co Ltd | Standard television broadcasting system converting device |
JP2005184395A (en) | 2003-12-18 | 2005-07-07 | Sumitomo Electric Ind Ltd | Method, system and apparatus for image processing, and photographing equipment |
JP2006081047A (en) | 2004-09-13 | 2006-03-23 | Shibasoku:Kk | Number of field converting apparatus |
JP2007251254A (en) | 2006-03-13 | 2007-09-27 | Astro Design Inc | Frame rate converter and frame rate converting method |
US20080170161A1 (en) | 2006-12-28 | 2008-07-17 | Hitachi, Ltd. | Image processing apparatus and image display apparatus provided with the same |
US20110109796A1 (en) | 2009-11-09 | 2011-05-12 | Mahesh Subedar | Frame Rate Conversion Using Motion Estimation and Compensation |
US20120050074A1 (en) * | 2010-02-26 | 2012-03-01 | Bechtel Jon H | Automatic vehicle equipment monitoring, warning, and control system |
EP2377725A1 (en) | 2010-04-19 | 2011-10-19 | SMR Patents S.à.r.l. | Side mirror simulation |
US20160353054A1 (en) * | 2014-02-04 | 2016-12-01 | Marat Gilmutdinov | Techniques for frame repetition control in frame rate up-conversion |
US20150294479A1 (en) | 2014-04-15 | 2015-10-15 | Intel Corporation | Fallback detection in motion estimation |
US20170225621A1 (en) * | 2014-08-11 | 2017-08-10 | Seiko Epson Corporation | Vehicle imaging device, vehicle imaging display system, and vehicle |
WO2016047087A1 (en) | 2014-09-24 | 2016-03-31 | パナソニックIpマネジメント株式会社 | On-board electronic mirror |
EP3200449A1 (en) | 2014-09-24 | 2017-08-02 | Panasonic Intellectual Property Management Co., Ltd. | On-board electronic mirror |
US20170291550A1 (en) | 2014-09-24 | 2017-10-12 | Panasonic Intellectual Property Management Co., Ltd. | On-board electronic mirror |
US20180091768A1 (en) | 2016-09-28 | 2018-03-29 | Gopro, Inc. | Apparatus and methods for frame interpolation based on spatial considerations |
US20210044777A1 (en) * | 2019-08-08 | 2021-02-11 | Netflix, Inc. | Frame rate conversion |
Non-Patent Citations (4)
Title |
---|
Isberg et al., "Frame rate up-conversion of real-time high-definition remote surveillance video," Apr. 30, 2012, pp. 1-90. |
Office Action dated Aug. 22, 2023 issued in Korean Patent Application No. 10-2021-0155276. |
Office Action dated Dec. 20, 2022 issued in Japanese Patent Application No. 2021-178423. |
Reconsideration Report dated Jan. 18, 2024 issued in Japanese Patent Application No. 2021-178423. |
Also Published As
Publication number | Publication date |
---|---|
EP4002835A1 (en) | 2022-05-25 |
DE102020129908A1 (en) | 2022-05-12 |
US20220150443A1 (en) | 2022-05-12 |
JP2022077975A (en) | 2022-05-24 |
KR102637241B1 (en) | 2024-02-19 |
KR20220064929A (en) | 2022-05-19 |
CN114500905A (en) | 2022-05-13 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MEKRA LANG GMBH & CO. KG, GERMANY Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LANG, WERNER;GEIβENDOERFER, PETER;REDLINGSHOEFER, ANDREAS;SIGNING DATES FROM 20211026 TO 20211104;REEL/FRAME:058060/0638 |
AS | Assignment |
Owner name: MEKRA LANG GMBH & CO. KG, GERMANY Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE THE SPELLING OF THE SECOND INVENTORS NAME PREVIOUSLY RECORDED AT REEL: 058060 FRAME: 0638. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNORS:LANG, WERNER;GEISSENDOERFER, PETER;REDLINGSHOEFER, ANDREAS;SIGNING DATES FROM 20211026 TO 20211104;REEL/FRAME:058105/0371 |