WO2022023157A1 - Debayering with multiple camera rotations - Google Patents
- Publication number
- WO2022023157A1 (PCT/EP2021/070520)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- array
- image
- color component
- samples
- sample
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4015—Image demosaicing, e.g. colour filter arrays [CFA] or Bayer patterns
Definitions
- In some embodiments, the first color component is green, the second color component is red, and the third color component is blue.
- In other embodiments, the first color component is green, the second color component is blue, and the third color component is red.
- a method in some embodiments includes: capturing a first input image using a first Bayer filter array and a second input image using a second Bayer filter array, where the first and second Bayer filter arrays have substantially perpendicular orientations; and obtaining a combined image from the first image and the second image, wherein each pixel in the combined image includes a first color component obtained from the first image, a second color component obtained from the second image, and a third color component obtained by interpolating a value from at least one of the first and second input images.
- FIG. 1 is a system diagram illustrating an example wireless transmit/receive unit (WTRU) that may be used in some embodiments.
- FIG. 2 is a schematic illustration of a demosaicing process performed on an image captured using a Bayer filter.
- FIG. 3 is a schematic illustration of a demosaicing process performed on an image captured using a Bayer filter, illustrating which resulting samples are interpolated and which are not.
- FIG. 4 is a schematic illustration of image samples captured using a Bayer filter in a landscape orientation, which may be used as an input to methods performed according to some embodiments.
- FIG. 5 is a schematic illustration of image samples captured using a Bayer filter in a portrait orientation, which may be used as an input to methods performed according to some embodiments.
- FIG. 6 is a schematic illustration of an example demosaicing method according to some embodiments, combining image samples as captured according to FIGs. 4 and 5.
- FIG. 7 is a schematic illustration of a mobile device with two image sensors that may be used to capture a portrait-oriented image and a landscape-oriented image for processing according to some embodiments.
- FIGs. 8A and 8B illustrate arrays of image samples that may be combined in a demosaicing process according to some embodiments.
- FIG. 9 illustrates an array of image samples that may be generated from the image samples of FIGs. 8A and 8B.
- FIG. 10 illustrates a method according to some embodiments.
- FIG. 1 is a system diagram illustrating an example wireless transmit/receive unit (WTRU) 102.
- the WTRU 102 may include a processor 118, a transceiver 120, a transmit/receive element 122, a speaker/microphone 124, a keypad 126, a display/touchpad 128, non-removable memory 130, removable memory 132, a power source 134, a global positioning system (GPS) chipset 136, and/or other peripherals 138, among others.
- the processor 118 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuit (ASIC) circuits, Field Programmable Gate Array (FPGA) circuits, any other type of integrated circuit (IC), a state machine, and the like.
- the processor 118 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the WTRU 102 to operate in a wireless environment.
- the processor 118 may be coupled to the transceiver 120, which may be coupled to the transmit/receive element 122. While FIG. 1 depicts the processor 118 and the transceiver 120 as separate components, it will be appreciated that the processor 118 and the transceiver 120 may be integrated together in an electronic package or chip.
- the transmit/receive element 122 may be configured to transmit signals to, or receive signals from, a base station over the air interface 116.
- the transmit/receive element 122 may be an antenna configured to transmit and/or receive RF signals.
- the transmit/receive element 122 may be an emitter/detector configured to transmit and/or receive IR, UV, or visible light signals, for example.
- the transmit/receive element 122 may be configured to transmit and/or receive both RF and light signals. It will be appreciated that the transmit/receive element 122 may be configured to transmit and/or receive any combination of wireless signals.
- the WTRU 102 may include any number of transmit/receive elements 122. More specifically, the WTRU 102 may employ MIMO technology. Thus, in one embodiment, the WTRU 102 may include two or more transmit/receive elements 122 (e.g., multiple antennas) for transmitting and receiving wireless signals over the air interface 116.
- the transceiver 120 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 122 and to demodulate the signals that are received by the transmit/receive element 122.
- the WTRU 102 may have multi-mode capabilities.
- the transceiver 120 may include multiple transceivers for enabling the WTRU 102 to communicate via multiple RATs, such as NR and IEEE 802.11, for example.
- the processor 118 of the WTRU 102 may be coupled to, and may receive user input data from, the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128 (e.g., a liquid crystal display (LCD) display unit or organic light-emitting diode (OLED) display unit).
- the processor 118 may also output user data to the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128.
- the processor 118 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 130 and/or the removable memory 132.
- the non-removable memory 130 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device.
- the removable memory 132 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like.
- the processor 118 may access information from, and store data in, memory that is not physically located on the WTRU 102, such as on a server or a home computer (not shown).
- the processor 118 may receive power from the power source 134, and may be configured to distribute and/or control the power to the other components in the WTRU 102.
- the power source 134 may be any suitable device for powering the WTRU 102.
- the power source 134 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), etc.), solar cells, fuel cells, and the like.
- the processor 118 may also be coupled to the GPS chipset 136, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the WTRU 102.
- the WTRU 102 may receive location information over the air interface 116 from a base station and/or determine its location based on the timing of the signals being received from two or more nearby base stations. It will be appreciated that the WTRU 102 may acquire location information by way of any suitable location-determination method while remaining consistent with an embodiment.
- the processor 118 may further be coupled to other peripherals 138, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired or wireless connectivity.
- the peripherals 138 may include an accelerometer, an e-compass, a satellite transceiver, a digital camera (for photographs and/or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands free headset, a Bluetooth® module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, a Virtual Reality and/or Augmented Reality (VR/AR) device, an activity tracker, and the like.
- the peripherals 138 may include one or more sensors; the sensors may be one or more of a gyroscope, an accelerometer, a Hall effect sensor, a magnetometer, an orientation sensor, a proximity sensor, a temperature sensor, a time sensor, a geolocation sensor, an altimeter, a light sensor, a touch sensor, a barometer, a gesture sensor, a biometric sensor, and/or a humidity sensor.
- the WTRU 102 may include a full duplex radio for which transmission and reception of some or all of the signals (e.g., associated with particular subframes for both the UL (e.g., for transmission) and the downlink (e.g., for reception)) may be concurrent and/or simultaneous.
- the full duplex radio may include an interference management unit to reduce and/or substantially eliminate self-interference via either hardware (e.g., a choke) or signal processing via a processor (e.g., a separate processor (not shown) or via processor 118).
- the WTRU 102 may include a half-duplex radio for which transmission and reception of some or all of the signals (e.g., associated with particular subframes for either the UL (e.g., for transmission) or the downlink (e.g., for reception)).
- Although the WTRU is described in FIG. 1 as a wireless terminal, it is contemplated that in certain representative embodiments such a terminal may use (e.g., temporarily or permanently) wired communication interfaces with the communication network.
- Devices such as for example a WTRU, may have multiple cameras whose sensors may be portrait or landscape oriented.
- Example embodiments combine information from different camera sensors, which may have different orientations, for use in a debayering or other demosaicing process.
- the pixels of camera sensors are generally designed to capture light (photons) from a broad spectrum to give a “luminance” frame.
- various different techniques may be used.
- three frames are captured with three different filters (Red/Green/Blue) at substantially the same time. With the red filter, each pixel is red, and so on for green and blue.
- a color picture is obtained. This can be done with three sensors (each dedicated to a color) or one sensor with three filters.
- one frame is captured, and a Bayer filter array is applied on the sensor.
- FIG. 2 is a schematic illustration of an example of a demosaicing process with a Bayer matrix.
- FIG. 3 is a schematic illustration of a demosaicing process with a Bayer matrix, illustrating that the color values of some pixels are interpolated. In some processes, 75% of the red and blue pixels are interpolated and 50% of the green pixels are interpolated.
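These proportions can be checked with a toy model (an illustration only, not the patent's implementation): lay the 2x2 RGGB Bayer tile over a sensor patch and count, per color, the fraction of positions that lack a true sample.

```python
import numpy as np

# One 2x2 RGGB Bayer tile, repeated over an 8x8 sensor patch.
tile = np.tile(np.array([["R", "G"],
                         ["G", "B"]]), (4, 4))

# Fraction of pixel positions whose value must be interpolated, per color:
# a color is "true" only where the filter passed it.
interpolated = {c: 1.0 - float(np.mean(tile == c)) for c in "RGB"}
print(interpolated)  # {'R': 0.75, 'G': 0.5, 'B': 0.75}
```

The counts match the figures above: 75% of red and blue samples and 50% of green samples are interpolated in single-sensor debayering.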
- Example embodiments are intended to increase the RGB output quality by decreasing the number of interpolated pixels. Some embodiments operate to make use of multiple camera orientations to increase demosaicing process quality.
- a first image sensor of a device is in “landscape mode,” which provides color samples as illustrated in FIG. 4.
- a second image sensor is in “portrait mode,” which provides color samples as illustrated in FIG. 5.
- the portrait mode sensor in this example is rotated by +90° or -90° with respect to the landscape mode sensor.
- FIG. 6 illustrates the merging of the sensor captures. As illustrated in FIG. 6, the merger of the sensor captures allows for a reduced level of interpolation. In this example, only 50% of the red and blue pixels are now interpolated and none of the green pixels are interpolated.
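The reduction can be verified with the same toy model (a sketch under the simplifying assumption of perfect per-pixel alignment): a Bayer tile rotated by 90° has green at exactly the positions the unrotated tile misses, so the union of the two captures covers every green position.

```python
import numpy as np

rggb = np.tile(np.array([["R", "G"],
                         ["G", "B"]]), (4, 4))   # e.g. landscape sensor
grbg = np.rot90(rggb, k=-1)                      # same pattern rotated 90 degrees

# A color is "true" at a position if either capture sampled it there.
for c in "RGB":
    covered = (rggb == c) | (grbg == c)
    print(c, "interpolated after merge:", 1.0 - covered.mean())
# prints 0.5 for R and B, and 0.0 for G
```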
- calibration between the two sensors is performed during manufacturing.
- the subjects of the images sensor captures are distant enough from the device to prevent a disparity effect between the two sensors (e.g. a disparity effect of less than one pixel).
- the merging of sensor captures is not performed during macro mode.
- Interpolation may be performed by hardware or software. Different interpolation techniques may be selected to achieve a balance between complexity, power consumption, and reactivity. Interpolation will typically be better with more “true” pixel colors.
- FIG. 7 illustrates an example of a device (e.g. a WTRU) employed in some embodiments.
- the example device includes an image sensor with a portrait orientation and another image sensor with a landscape orientation.
- the image sensors can be configured to capture images at least partly of the same entity (object, scene, target, aim,...), which typically means that the image sensors are aimed in the same direction.
- the image captures are performed in portrait and landscape mode at the same time.
- One potential benefit of such an embodiment is that it is not necessary to select one orientation to take the picture. Images are captured by both sensors, and the user may subsequently select whether to keep one or both of the images. In addition, it may be easier to hold the device in portrait mode even when it is desired to capture an image in landscape mode.
- Such a device equipped with both portrait and landscape image sensors may be used to implement a demosaicing process as described herein.
- images are captured substantially simultaneously by the two sensors, and a merge between landscape/portrait sensors is made before algorithm interpolation.
- the capture of the images may be performed in response to a user input triggering the capture of a photograph (e.g. pressing a shutter button on a camera app).
- Example embodiments are not limited to the capture of still images.
- the methods described herein for processing of images may be performed in some embodiments on the frames of video images, for example to merge frames of video captured with a sensor having a portrait orientation and video captured with a sensor having a landscape orientation.
- a camera application on a WTRU such as a smartphone or other user device
- a specific color mode (“Enhanced colors” for example) may be activated, in response to which a demosaicing process such as those described herein may be implemented.
- this mode is disabled for close subjects, such as when macro mode is in use.
- the “Enhanced colors” mode may be automatically disabled when the autofocus is focusing on very close subjects.
- a first input image is captured using a first Bayer filter array, and a second input image is captured using a second Bayer filter array.
- the first and second Bayer filter arrays have at least substantially perpendicular orientations.
- a combined image is obtained from the first image and the second image, wherein each pixel in the combined image includes a first color component obtained from the first image, a second color component obtained from the second image, and a third color component obtained by interpolating a value from at least one of the first and second input images.
- a Bayer filter array that is turned 90° has the same pattern of colors as a Bayer filter array that has been shifted by, for example, one pixel in an appropriate direction.
- the principles described herein may thus be employed with images captured using Bayer filter arrays that are appropriately rotated and/or shifted with respect to one another.
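This equivalence is easy to confirm numerically (an illustration, not part of the patent): a clockwise 90° rotation of a tiled RGGB pattern yields the same color layout as a one-pixel horizontal shift of that pattern.

```python
import numpy as np

rggb = np.tile(np.array([["R", "G"],
                         ["G", "B"]]), (4, 4))

rotated = np.rot90(rggb, k=-1)       # 90 degrees clockwise -> a GRBG tiling
shifted = np.roll(rggb, -1, axis=1)  # shifted one pixel horizontally -> also GRBG

print(np.array_equal(rotated, shifted))  # True
```

Because the Bayer pattern is periodic with period 2 in both directions, rotation and an appropriate shift produce identical color layouts, which is why rotated and shifted arrays may be used interchangeably here.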
- FIG. 8A illustrates an example portion of a first array of image samples, such as image samples captured with an image sensor in portrait mode.
- Each of the image samples is associated with a color component.
- the image sample Gi ,x -i ,y -i is an image sample in the first array that is at position (x-1 ,y-1) and is associated with a green color component.
- the image sample Bi ,x,y -i is an image sample in the first array that is at position (x,y-1) and is associated with a blue color component.
- the image sample Ri ,x -i ,y is an image sample in the first array that is at position (x-1 ,y) and is associated with a red color component.
- FIG. 8B illustrates an example portion of a second array of image samples, such as image samples captured with an image sensor in landscape mode.
- Each of the image samples in the second array is associated with a color component.
- the image sample R2 ,i -i j -i is an image sample in the second array that is at position (i-1 ,j-1 ) and is associated with a red color component.
- the image sample G2 , U- I is an image sample in the second array that is at position (i,j-1) and is associated with a green color component.
- the image sample B2 , u is an image sample in the second array that is at position (i,j) and is associated with a blue color component.
- a combined array of image samples is generated based on the first and second arrays of image samples.
- An example of a portion of a combined array of image samples is illustrated in FIG. 9.
- the image samples in the combined array are generated based on the arrays of FIGs. 8A and 8B.
- the samples in the combined array may have positions indicated by, for example, coordinates (p,q).
- Each position (p,q) in the combined array corresponds to a sample position in the first array and a sample position in the second array.
- the sample at position (p,q) in the combined array corresponds to position (x,y) in the first array and to position (i,j) in the second array.
- samples at positions (x+c,y+d), (p+c,q+d), and (i+c,j+d) correspond to one another, for various positive and negative integer values of c and d.
- the correspondence or mapping between sample positions may be predetermined in manufacturing of an image capturing device, it may be determined during a calibration procedure, or it may be determined by comparing features of two captured images to align those images, among other examples. It may be noted that not every pixel position in one array necessarily has a corresponding pixel in another array. For example, as seen in FIG. 6, there are portions of a landscape image that do not overlay any corresponding area in a portrait image, and vice versa.
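The size of the region in which corresponding pixels exist follows directly from the two frame geometries. The sketch below is a simplification (it assumes the two frames share a center and that a pure translation maps one to the other, which the patent does not require):

```python
def overlap_size(landscape, portrait):
    """Width and height of the centered region covered by both frames.

    landscape, portrait: (width, height) tuples of the two captures.
    Pixels outside this region have no corresponding sample in the
    other array and would fall back to single-sensor demosaicing.
    """
    (lw, lh), (pw, ph) = landscape, portrait
    return min(lw, pw), min(lh, ph)

print(overlap_size((4000, 3000), (3000, 4000)))  # (3000, 3000)
```

For a 4000x3000 landscape frame and a 3000x4000 portrait frame, only the central 3000x3000 square overlaps, matching the non-overlapping borders visible in FIG. 6.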
- the sample has a first value G3,p+1,q+1 associated with a first color component (green, in this example), which is obtained from the value G1,x+1,y+1 at the corresponding position in the first array.
- the sample 902 has a second value R3,p+1,q+1 associated with a second color component (red, in this example) that is obtained from a corresponding position in the second array.
- the sample 902 has a third value B3,p+1,q+1 associated with a third color component (blue, in this example) that is obtained by interpolating a value from at least one of the first and second image samples.
- For example, the third value may be obtained by averaging the two horizontally neighboring blue samples in the first array: B3,p+1,q+1 = (B1,x,y+1 + B1,x+2,y+1)/2.
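Translated into 0-based array indexing (a toy illustration; the concrete array and coordinates below are stand-ins, not values from the patent), this interpolation is a two-sample average:

```python
import numpy as np

# Stand-in blue-plane samples of the first array, indexed as B1[row y, column x].
B1 = np.arange(25, dtype=float).reshape(5, 5)

x, y = 1, 1
# B3,p+1,q+1 = (B1,x,y+1 + B1,x+2,y+1) / 2:
# average of the two horizontal blue neighbours on row y+1.
b_interp = (B1[y + 1, x] + B1[y + 1, x + 2]) / 2
print(b_interp)  # 12.0
```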
- other interpolation techniques that use samples from the first and/or second arrays may alternatively be used.
- the sample 904 may neighbor the sample 902 either vertically or horizontally.
- a third value B3,p+1,q associated with the third color component is obtained by interpolating a value from at least one of the first and second image samples.
- the sample 906 may neighbor the sample 902 either vertically or horizontally.
- a second value R3,p,q associated with the second color component is obtained by interpolating a value from at least one of the first and second image samples.
- the sample 908 may neighbor sample 902 diagonally.
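Putting the four sample cases together, the combined array can be sketched end to end. This is a toy implementation under assumptions not stated in the patent: perfect per-pixel alignment, an RGGB first array, a GRBG (rotated) second array, and simple vertical averaging for the missing red and blue rows.

```python
import numpy as np

def bayer_positions(h, w, pattern):
    """Boolean mask per color for a 2x2-tiled Bayer pattern string such as 'RGGB'."""
    grid = np.empty((h, w), dtype="<U1")
    grid[0::2, 0::2], grid[0::2, 1::2] = pattern[0], pattern[1]
    grid[1::2, 0::2], grid[1::2, 1::2] = pattern[2], pattern[3]
    return {c: grid == c for c in "RGB"}

def merge_demosaic(m1, m2):
    """Combine an RGGB mosaic m1 and an aligned GRBG mosaic m2 into an HxWx3 RGB image.

    The union of the two mosaics covers every green position, so green is
    never interpolated; red and blue are known on alternating rows, and the
    missing rows are filled by vertical averaging (wrapping at the borders).
    """
    h, w = m1.shape
    k1 = bayer_positions(h, w, "RGGB")
    k2 = bayer_positions(h, w, "GRBG")
    out = np.empty((h, w, 3))
    for ch, c in enumerate("RGB"):
        # Take true samples from whichever array captured this color here.
        plane = np.where(k1[c], m1, np.where(k2[c], m2, np.nan))
        missing = np.isnan(plane)
        if missing.any():
            above = np.roll(plane, 1, axis=0)
            below = np.roll(plane, -1, axis=0)
            plane[missing] = ((above + below) / 2)[missing]
        out[..., ch] = plane
    return out
```

On a constant-color test scene, the merged output reproduces the scene exactly, since every interpolated value is an average of two true samples of the same color.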
- color components are primarily described herein as being green, red, and blue components, other components may alternatively be used. While some samples are described herein as being obtained using a Bayer filter array, the principles described herein may also be employed on arrays of samples collected using techniques other than Bayer filter arrays. It may also be noted that the methods described herein for generating a combined array of samples are not necessarily performed on the same device or devices that captured the samples in the first place. For example, raw image data captured by a first and second sensor on a user’s mobile device may later be combined in a post-processing method, which may be performed on a different device, such as the user’s home computer or a cloud-based processor.
- one or more of the methods described herein is performed by an apparatus that includes a processor configured to perform the described methods.
- the processor may be configured to perform the methods using appropriate instructions stored in a computer-readable medium (e.g. a non-transitory medium).
- Some embodiments include modules that carry out (i.e., perform, execute, and the like) the various functions described herein in connection with the respective modules.
- a module includes hardware (e.g., one or more processors, one or more microprocessors, one or more microcontrollers, one or more microchips, one or more application-specific integrated circuits (ASICs), one or more field programmable gate arrays (FPGAs), one or more memory devices) deemed suitable by those of skill in the relevant art for a given implementation.
- Each described module may also include instructions executable for carrying out the one or more functions described as being carried out by the respective module, and it is noted that those instructions could take the form of or include hardware (i.e., hardwired) instructions, firmware instructions, software instructions, and/or the like, and may be stored in any suitable non-transitory computer-readable medium or media, such as commonly referred to as RAM, ROM, etc.
- Examples of computer-readable storage media include a read-only memory (ROM), a random-access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks and digital versatile disks (DVDs).
- a processor in association with software may be used to implement a radio frequency transceiver for use in a WTRU, UE, terminal, base station, RNC, or any host computer.
Abstract
In an example method, a first input image is captured using a first Bayer filter array, and a second input image is captured using a second Bayer filter array. The first and second Bayer filter arrays may have substantially perpendicular orientations. A combined image is obtained from the first image and the second image, wherein each pixel in the combined image includes a first color component obtained from the first image, a second color component obtained from the second image, and a third color component obtained by interpolating a value from at least one of the first and second input images. Methods described herein may be used to combine images that have been simultaneously captured with a portrait-oriented image sensor and a landscape-oriented image sensor.
Description
DEBAYERING WITH MULTIPLE CAMERA ROTATIONS
BACKGROUND
[0001] The present disclosure relates to the processing of digital images. Specifically, some embodiments relate to techniques for reducing mosaic effects that can occur when full-color images are captured using a Bayer filter array.
SUMMARY
[0002] A method according to some embodiments includes: obtaining a first array of image samples; obtaining a second array of image samples; generating a combined array of image samples, wherein, for at least a first sample in the combined array: a first value associated with a first color component is obtained from a corresponding position in the first array; a second value associated with a second color component is obtained from a corresponding position in the second array; and a third value associated with a third color component is obtained by interpolating a value from at least one of the first and second image samples.
[0003] In some embodiments, for at least a second sample in the combined array: a first value associated with the first color component is obtained from a corresponding position in the second array; a second value associated with the second color component is obtained by interpolating a value from at least one of the first and second image samples; and a third value associated with the third color component is obtained from a corresponding position in the first array. The second sample may neighbor the first sample either vertically or horizontally.
[0004] In some embodiments, for at least a third sample in the combined array: a first value associated with the first color component is obtained from a corresponding position in the second array; a second value associated with the second color component is obtained from a corresponding position in the first array; and a third value associated with the third color component is obtained by interpolating a value from at least one of the first and second image samples. The third sample may neighbor the first sample either vertically or horizontally.
[0005] In some embodiments, for at least a fourth sample in the combined array: a first value associated with the first color component is obtained from a corresponding position in the first array; a second value associated with the second color component is obtained by interpolating a value from at least one of the first and second image samples; and a third value associated with the third color component is obtained from a corresponding position in the second array. The fourth sample neighbors the first sample diagonally.
[0006] In some embodiments, the first and second arrays of image samples are obtained using a Bayer filter array. The Bayer filter arrays used to capture the first and second arrays of image samples may have substantially perpendicular orientations.
[0007] In some embodiments, the first array of image samples is obtained using a first image sensor on a user device and the second array of image samples is obtained using a second image sensor on the user device, the first and second image sensors having substantially perpendicular orientations. The first array of image samples and the second array of image samples may be captured substantially simultaneously using the first and second image sensors. The substantially simultaneous capture of the first and second array of image samples may be performed in response to an input on the user device triggering a photograph.
[0008] In some embodiments, generating the combined array of image samples may be performed in response to a determination that the user device is not in a macro mode. In some embodiments, generating the combined array of image samples may be performed in response to a determination that an autofocus distance is greater than a threshold distance.
[0009] In some embodiments, the first color component is green, the second color component is red, and the third color component is blue.
[0010] In some embodiments, the first color component is green, the second color component is blue, and the third color component is red.
[0011] A method in some embodiments includes: capturing a first input image using a first Bayer filter array and a second input image using a second Bayer filter array, where the first and second Bayer filter arrays have substantially perpendicular orientations; and obtaining a combined image from the first image and the second
image, wherein each pixel in the combined image includes a first color component obtained from the first image, a second color component obtained from the second image, and a third color component obtained by interpolating a value from at least one of the first and second input images.
[0012] Further embodiments include an apparatus comprising a processor configured to perform any of the methods described herein.
BRIEF DESCRIPTION OF THE DRAWINGS
[0013] FIG. 1 is a system diagram illustrating an example wireless transmit/receive unit (WTRU) that may be used in some embodiments.
[0014] FIG. 2 is a schematic illustration of a demosaicing process performed on an image captured using a Bayer filter.
[0015] FIG. 3 is a schematic illustration of a demosaicing process performed on an image captured using a Bayer filter, illustrating which resulting samples are interpolated and which are not.
[0016] FIG. 4 is a schematic illustration of image samples captured using a Bayer filter in a landscape orientation, which may be used as an input to methods performed according to some embodiments.
[0017] FIG. 5 is a schematic illustration of image samples captured using a Bayer filter in a portrait orientation, which may be used as an input to methods performed according to some embodiments.
[0018] FIG. 6 is a schematic illustration of an example demosaicing method according to some embodiments, combining image samples as captured according to FIGs. 4 and 5.
[0019] FIG. 7 is a schematic illustration of a mobile device with two image sensors that may be used to capture a portrait-oriented image and a landscape-oriented image for processing according to some embodiments.
[0020] FIGs. 8A and 8B illustrate arrays of image samples that may be combined in a demosaicing process according to some embodiments.
[0021] FIG. 9 illustrates an array of image samples that may be generated from the image samples of FIGs. 8A and 8B.
[0022] FIG. 10 illustrates a method according to some embodiments.
EXAMPLE APPARATUS FOR IMPLEMENTATION OF THE EMBODIMENTS
[0023] FIG. 1 is a system diagram illustrating an example wireless transmit/receive unit (WTRU) 102. As shown in FIG. 1, the WTRU 102 may include a processor 118, a transceiver 120, a transmit/receive element 122, a speaker/microphone 124, a keypad 126, a display/touchpad 128, non-removable memory 130, removable memory 132, a power source 134, a global positioning system (GPS) chipset 136, and/or other peripherals 138, among others. It will be appreciated that the WTRU 102 may include any sub-combination of the foregoing elements while remaining consistent with an embodiment.
[0024] The processor 118 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) circuits, any other type of integrated circuit (IC), a state machine, and the like. The processor 118 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the WTRU 102 to operate in a wireless environment. The processor 118 may be coupled to the transceiver 120, which may be coupled to the transmit/receive element 122. While FIG. 1 depicts the processor 118 and the transceiver 120 as separate components, it will be appreciated that the processor 118 and the transceiver 120 may be integrated together in an electronic package or chip.
[0025] The transmit/receive element 122 may be configured to transmit signals to, or receive signals from, a base station over the air interface 116. For example, in one embodiment, the transmit/receive element 122 may be an antenna configured to transmit and/or receive RF signals. In an embodiment, the transmit/receive element 122 may be an emitter/detector configured to transmit and/or receive IR, UV, or visible light signals, for example. In yet another embodiment, the transmit/receive element 122 may be configured to transmit and/or receive both RF and light signals. It will be appreciated that the transmit/receive element 122 may be configured to transmit and/or receive any combination of wireless signals.
[0026] Although the transmit/receive element 122 is depicted in FIG. 1 as a single element, the WTRU 102 may include any number of transmit/receive elements 122.
More specifically, the WTRU 102 may employ MIMO technology. Thus, in one embodiment, the WTRU 102 may include two or more transmit/receive elements 122 (e.g., multiple antennas) for transmitting and receiving wireless signals over the air interface 116.
[0027] The transceiver 120 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 122 and to demodulate the signals that are received by the transmit/receive element 122. As noted above, the WTRU 102 may have multi-mode capabilities. Thus, the transceiver 120 may include multiple transceivers for enabling the WTRU 102 to communicate via multiple RATs, such as NR and IEEE 802.11, for example.
[0028] The processor 118 of the WTRU 102 may be coupled to, and may receive user input data from, the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128 (e.g., a liquid crystal display (LCD) display unit or organic light-emitting diode (OLED) display unit). The processor 118 may also output user data to the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128. In addition, the processor 118 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 130 and/or the removable memory 132. The non-removable memory 130 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device. The removable memory 132 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like. In other embodiments, the processor 118 may access information from, and store data in, memory that is not physically located on the WTRU 102, such as on a server or a home computer (not shown).
[0029] The processor 118 may receive power from the power source 134, and may be configured to distribute and/or control the power to the other components in the WTRU 102. The power source 134 may be any suitable device for powering the WTRU 102. For example, the power source 134 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), etc.), solar cells, fuel cells, and the like.
[0030] The processor 118 may also be coupled to the GPS chipset 136, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the WTRU 102. In addition to, or in lieu of, the
information from the GPS chipset 136, the WTRU 102 may receive location information over the air interface 116 from a base station and/or determine its location based on the timing of the signals being received from two or more nearby base stations. It will be appreciated that the WTRU 102 may acquire location information by way of any suitable location-determination method while remaining consistent with an embodiment.
[0031] The processor 118 may further be coupled to other peripherals 138, which may include one or more software and/or hardware modules that provide additional features, functionality, and/or wired or wireless connectivity. For example, the peripherals 138 may include an accelerometer, an e-compass, a satellite transceiver, a digital camera (for photographs and/or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands free headset, a Bluetooth® module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, a Virtual Reality and/or Augmented Reality (VR/AR) device, an activity tracker, and the like. The peripherals 138 may include one or more sensors, and the sensors may be one or more of a gyroscope, an accelerometer, a Hall effect sensor, a magnetometer, an orientation sensor, a proximity sensor, a temperature sensor, a time sensor, a geolocation sensor, an altimeter, a light sensor, a touch sensor, a barometer, a gesture sensor, a biometric sensor, and/or a humidity sensor.
[0032] The WTRU 102 may include a full duplex radio for which transmission and reception of some or all of the signals (e.g., associated with particular subframes for both the UL (e.g., for transmission) and the downlink (e.g., for reception)) may be concurrent and/or simultaneous. The full duplex radio may include an interference management unit to reduce and/or substantially eliminate self-interference via either hardware (e.g., a choke) or signal processing via a processor (e.g., a separate processor (not shown) or via processor 118). In an embodiment, the WTRU 102 may include a half-duplex radio for which transmission and reception of some or all of the signals (e.g., associated with particular subframes) is limited to either the UL (e.g., for transmission) or the downlink (e.g., for reception).
[0033] Although the WTRU is described in FIG. 1 as a wireless terminal, it is contemplated that in certain representative embodiments such a terminal may
use (e.g., temporarily or permanently) wired communication interfaces with the communication network.
DETAILED DESCRIPTION
[0034] Devices, such as a WTRU, may have multiple cameras whose sensors may be portrait or landscape oriented. Example embodiments combine information from different camera sensors, which may have different orientations, for use in a debayering or other demosaicing process.
[0035] The pixels of camera sensors are generally designed to capture light (photons) from a broad spectrum to give a “luminance” frame. To get color information, various techniques may be used. In one such technique, three frames are captured with three different filters (Red/Green/Blue) at substantially the same time. With the red filter, each pixel is red, and so on for green and blue. By combining the three (RGB) frames, a color picture is obtained. This can be done with three sensors (each dedicated to a color) or one sensor with three filters. In other techniques, one frame is captured, and a Bayer filter array is applied on the sensor. An interpolation algorithm (for example, nearest neighbor, bilinear, smooth hue, or variable number of gradients (VNG)) is used to obtain a color picture. FIG. 2 is a schematic illustration of an example of a demosaicing process with a Bayer matrix.
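A single-mosaic demosaicing step of this kind can be sketched as follows. This is an illustrative bilinear-style sketch only: the RGGB layout and the normalized 3×3 averaging are assumptions, not the specific algorithm of any embodiment described here.

```python
import numpy as np

def box_sum(a):
    """Sum over each pixel's 3x3 neighborhood (zero-padded borders)."""
    p = np.pad(a, 1)
    h, w = a.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

def bilinear_demosaic(mosaic):
    """Reconstruct an RGB image from one Bayer mosaic (RGGB layout assumed):
    each missing color value becomes the mean of the true samples of that
    color within the surrounding 3x3 neighborhood."""
    h, w = mosaic.shape
    masks = {c: np.zeros((h, w), bool) for c in "RGB"}
    for k, c in enumerate("RGGB"):  # 2x2 tile: row 0 = R,G; row 1 = G,B
        masks[c][k // 2::2, k % 2::2] = True
    out = np.empty((h, w, 3))
    for ch, c in enumerate("RGB"):
        plane = np.where(masks[c], mosaic, 0.0)
        num = box_sum(plane)
        den = box_sum(masks[c].astype(float))
        # Keep true samples; fill the rest with the neighborhood average.
        out[..., ch] = np.where(masks[c], mosaic, num / np.maximum(den, 1))
    return out
```

On a uniformly lit scene this reproduces the input value in all three channels; on real content, the interpolated sites are where mosaic artifacts arise.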
[0036] FIG. 3 is a schematic illustration of a demosaicing process with a Bayer matrix, illustrating that the color values of some pixels are interpolated. In some processes, 75% of the red and blue pixels are interpolated and 50% of the green pixels are interpolated.
[0037] Example embodiments are intended to increase the RGB output quality by decreasing the number of interpolated pixels. Some embodiments operate to make use of multiple camera orientations to increase demosaicing process quality.
[0038] In an example embodiment, a first image sensor of a device is in “landscape mode,” which provides color samples as illustrated in FIG. 4. On the same device, a second image sensor is in “portrait mode,” which provides color samples as illustrated in FIG. 5. The portrait mode sensor in this example is rotated by +90° or -90° with respect to the landscape mode sensor. FIG. 6 illustrates the merging of the sensor captures. As illustrated in FIG. 6, the merger of the sensor captures
allows for a reduced level of interpolation. In this example, only 50% of the red and blue pixels are now interpolated and none of the green pixels are interpolated.
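The interpolation fractions above can be tallied directly from a 2×2 Bayer tile. The RGGB pattern and its 90° rotation are assumed as a concrete example; the arithmetic simply counts true samples per channel before and after the overlay.

```python
# A single RGGB tile holds 1 R, 2 G and 1 B true sample out of 4 sites,
# so per channel the fraction of pixels that must be interpolated is:
single = {"R": 1 - 1/4, "G": 1 - 2/4, "B": 1 - 1/4}

# Rotating the Bayer array 90 degrees moves the G sites to the other
# diagonal and swaps the R and B sites, so overlaying the two captures
# yields a true G at all 4 sites and true R and B at 2 sites each:
merged = {"R": 1 - 2/4, "G": 1 - 4/4, "B": 1 - 2/4}

print(single)  # {'R': 0.75, 'G': 0.5, 'B': 0.75}
print(merged)  # {'R': 0.5, 'G': 0.0, 'B': 0.5}
```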
[0039] In some embodiments, calibration between the two sensors is performed during manufacturing. In some embodiments, the subjects of the image sensor captures are distant enough from the device to prevent a disparity effect between the two sensors (e.g. a disparity effect of less than one pixel). In some embodiments, the merging of sensor captures is not performed during macro mode.
[0040] Interpolation may be performed by hardware or software. Different interpolation techniques may be selected to achieve a balance between complexity, power consumption, and reactivity. Interpolation will typically be better with more “true” pixel colors.
[0041] FIG. 7 illustrates an example of a device (e.g. a WTRU) employed in some embodiments. The example device includes an image sensor with a portrait orientation and another image sensor with a landscape orientation. The image sensors can be configured to capture images at least partly of the same entity (object, scene, target, etc.), which typically means that the image sensors are aimed in the same direction.
[0042] In some embodiments, the image captures are performed in portrait and landscape mode at the same time. One potential benefit of such an embodiment is that it is not necessary to select one orientation to take the picture. Images are captured by both sensors, and the user may subsequently select whether to keep one or both of the images. In addition, it may be easier to hold the device in a portrait mode even if it is desired to capture an image in landscape mode.
[0043] Such a device equipped with both portrait and landscape image sensors may be used to implement a demosaicing process as described herein. In an example use, images are captured substantially simultaneously by the two sensors, and a merge between landscape/portrait sensors is made before algorithm interpolation. The capture of the images may be performed in response to a user input triggering the capture of a photograph (e.g. pressing a shutter button on a camera app).
[0044] Example embodiments are not limited to the capture of still images. The methods described herein for processing of images may be performed in some
embodiments on the frames of video images, for example to merge frames of video captured with a sensor having a portrait orientation and video captured with a sensor having a landscape orientation.
[0045] In some embodiments, in a camera application on a WTRU, such as a smartphone or other user device, a specific color mode (“Enhanced colors,” for example) may be activated, in response to which a demosaicing process such as those described herein may be implemented. In some embodiments, this mode is disabled for close subjects, such as when macro mode is in use. In some embodiments, the “Enhanced colors” mode may be automatically disabled when the autofocus is focusing on very close subjects.
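The gating just described can be sketched as a simple predicate. The function name, parameter names, and the 1 m default threshold are hypothetical; the description only requires that the mode be disabled in macro mode or when focusing on very close subjects.

```python
def enhanced_colors_active(mode_enabled: bool, macro_mode: bool,
                           autofocus_distance_m: float,
                           threshold_m: float = 1.0) -> bool:
    """Decide whether to run the combined (dual-sensor) demosaicing.
    All names and the default threshold are illustrative assumptions."""
    if not mode_enabled or macro_mode:
        return False
    # Disable the mode when autofocus indicates a very close subject.
    return autofocus_distance_m > threshold_m
```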
[0046] In a method according to some embodiments, a first input image is captured using a first Bayer filter array and a second input image is captured using a second Bayer filter array. The first and second Bayer filter arrays have at least substantially perpendicular orientations. A combined image is obtained from the first image and the second image, wherein each pixel in the combined image includes a first color component obtained from the first image, a second color component obtained from the second image, and a third color component obtained by interpolating a value from at least one of the first and second input images.
[0047] While some example embodiments operate to combine images from a sensor with a portrait orientation and a sensor with a landscape orientation, the principles described herein may also be implemented using sensors with other relative orientations or with the same orientation. In this regard, it may be noted that a Bayer filter array that is turned 90° has the same pattern of colors as a Bayer filter array that has been shifted by, for example, one pixel in an appropriate direction. The principles described herein may thus be employed with images captured using Bayer filter arrays that are appropriately rotated and/or shifted with respect to one another.
[0048] Figure 10 illustrates a method according to an embodiment of the present principles. In an example embodiment, in step S1002, a first array of image samples is obtained. FIG. 8A illustrates an example portion of a first array of image samples, such as image samples captured with an image sensor in portrait mode. Each of the image samples is associated with a color component. For example, the image sample G1,x-1,y-1 is an image sample in the first array that is at position (x-1,y-1) and is associated with a green color component. The image sample B1,x,y-1 is an image sample in the first array that is at position (x,y-1) and is associated with a blue color component. The image sample R1,x-1,y is an image sample in the first array that is at position (x-1,y) and is associated with a red color component.
[0049] In step S1004, a second array of image samples is also obtained. FIG. 8B illustrates an example portion of a second array of image samples, such as image samples captured with an image sensor in landscape mode. Each of the image samples in the second array is associated with a color component. For example, the image sample R2,i-1,j-1 is an image sample in the second array that is at position (i-1,j-1) and is associated with a red color component. The image sample G2,i,j-1 is an image sample in the second array that is at position (i,j-1) and is associated with a green color component. The image sample B2,i,j is an image sample in the second array that is at position (i,j) and is associated with a blue color component.
[0050] In step S1006, the first and second arrays of image samples are merged. Merging of the first and second arrays of image samples may make use of a mapping between positions in the first array and positions in the second array. For example, a mapping may indicate that position (x,y) in the first array corresponds to position (i,j) in the second array, where i=x+a and j=y+b.
[0051] In this example method, a combined array of image samples is generated based on the first and second arrays of image samples. An example of a portion of a combined array of image samples is illustrated in FIG. 9. The image samples in the combined array are generated based on the arrays of FIGs. 8A and 8B. The samples in the combined array may have positions indicated by, for example, coordinates (p,q). Each position (p,q) in the combined array corresponds to a sample position in the first array and a sample position in the second array. In the example of FIG. 9, the sample at position (p,q) in the combined array corresponds to position (x,y) in the first array and to position (i,j) in the second array. More generally, samples at positions (x+c,y+d), (p+c,q+d), and (i+c,j+d) correspond to one another, for various positive and negative integer values of c and d. The correspondence or mapping between sample positions may be predetermined in manufacturing of an image capturing device, it may be determined during a calibration procedure, or it may be determined by comparing features of two captured images to align those images, among other examples. It may be noted that not every pixel position in one array necessarily has a corresponding pixel in another array. For example, as seen in FIG. 6, there are portions of a landscape image that do not overlay any corresponding area in a portrait image, and vice versa.
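The correspondence described above, including the case of non-overlapping regions, can be sketched as a pure translation with a bounds check. The offsets a and b and the second array's dimensions are assumed known (e.g. from factory calibration); a pure translation is only one of the alignment models mentioned.

```python
def corresponding_position(x, y, a, b, width2, height2):
    """Map sample (x, y) in the first array to (i, j) = (x + a, y + b)
    in the second array. Returns None when the mapped position falls in
    a region the second sensor did not capture (no correspondence)."""
    i, j = x + a, y + b
    if 0 <= i < width2 and 0 <= j < height2:
        return i, j
    return None
```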
[0052] For at least one of the samples 902 in the combined array of FIG. 9, the sample has a first value G3,p+1,q+1 associated with a first color component (green, in this example), which is obtained from the value G1,x+1,y+1 from the corresponding position in the first array. The sample 902 has a second value R3,p+1,q+1 associated with a second color component (red, in this example) that is obtained from a corresponding position in the second array. The sample 902 has a third value B3,p+1,q+1 associated with a third color component (blue, in this example) that is obtained by interpolating a value from at least one of the first and second image samples. In the particular example of FIG. 9, this interpolated value is obtained as the average of the two nearest samples associated with the same color component, such that B3,p+1,q+1 = (B1,x,y+1 + B1,x+2,y+1)/2. However, other interpolation techniques that use samples from the first and/or second arrays may alternatively be used.
[0053] In some embodiments, for another one of the samples, 904, in the combined array, a first value G3,p,q+1 associated with the first color component (e.g. green) is obtained from a corresponding position in the second array, such that G3,p,q+1 = G2,i,j+1. A second value R3,p,q+1 associated with the second color component (e.g. red) is obtained by interpolating a value from at least one of the first and second image samples. For example, in some embodiments, R3,p,q+1 = (R2,i-1,j+1 + R2,i+1,j+1)/2. A third value B3,p,q+1 associated with the third color component (e.g. blue) is obtained from a corresponding position in the first array, such that B3,p,q+1 = B1,x,y+1. The sample 904 may neighbor the sample 902 either vertically or horizontally.
[0054] In some embodiments, for another one of the samples, 906, a first value G3,p+1,q associated with the first color component is obtained from a corresponding position in the second array, such that G3,p+1,q = G2,i+1,j. A second value R3,p+1,q associated with the second color component is obtained from a corresponding position in the first array, such that R3,p+1,q = R1,x+1,y. A third value B3,p+1,q associated with the third color component is obtained by interpolating a value from at least one of the first and second image samples. The sample 906 may neighbor the sample 902 either vertically or horizontally.
[0055] In some embodiments, for another one of the samples, 908, a first value G3,p,q associated with the first color component is obtained from a corresponding position in the first array, such that G3,p,q = G1,x,y. A second value R3,p,q associated with the second color component is obtained by interpolating a value from at least one of the first and second image samples. A third value B3,p,q associated with the third color component is obtained from a corresponding position in the second array, such that B3,p,q = B2,i,j. The sample 908 may neighbor sample 902 diagonally.
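The per-site rules of paragraphs [0052]–[0055] can be sketched as follows. The concrete patterns (RGGB for the first mosaic, GRBG for its 90°-rotated counterpart) are illustrative assumptions, and a normalized 3×3 neighborhood average stands in for the two-nearest-sample averages given above; with these patterns, green is a true sample at every combined site while red and blue are true at half of the sites.

```python
import numpy as np

def box_sum(a):
    """Sum over each pixel's 3x3 neighborhood (zero-padded borders)."""
    p = np.pad(a, 1)
    h, w = a.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

def bayer_masks(pattern, h, w):
    """Per-channel sample masks for a 2x2 Bayer pattern string such as
    'RGGB' (row 0 = R,G; row 1 = G,B)."""
    masks = {c: np.zeros((h, w), bool) for c in "RGB"}
    for k, c in enumerate(pattern):
        masks[c][k // 2::2, k % 2::2] = True
    return masks

def merge_bayer_pair(m1, m2, p1="RGGB", p2="GRBG"):
    """Merge two aligned Bayer mosaics whose patterns are 90-degree
    rotations of one another. True samples are taken from whichever
    mosaic sampled that color at that site; remaining red/blue sites
    are filled by averaging nearby true samples of the same color."""
    h, w = m1.shape
    k1, k2 = bayer_masks(p1, h, w), bayer_masks(p2, h, w)
    out = np.empty((h, w, 3))
    for ch, c in enumerate("RGB"):
        known = k1[c] | k2[c]
        plane = np.where(k1[c], m1, np.where(k2[c], m2, 0.0))
        den = box_sum(known.astype(float))
        interp = box_sum(plane) / np.maximum(den, 1)
        out[..., ch] = np.where(known, plane, interp)
    return out
```

With these pattern choices the green masks of the two mosaics tile the full grid, so no green value is interpolated, matching the reduction illustrated in FIG. 6.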
[0056] While the color components are primarily described herein as being green, red, and blue components, other components may alternatively be used. While some samples are described herein as being obtained using a Bayer filter array, the principles described herein may also be employed on arrays of samples collected using techniques other than Bayer filter arrays. It may also be noted that the methods described herein for generating a combined array of samples are not necessarily performed on the same device or devices that captured the samples in the first place. For example, raw image data captured by a first and second sensor on a user’s mobile device may later be combined in a post-processing method, which may be performed on a different device, such as the user’s home computer or a cloud-based processor.
[0057] In some embodiments, one or more of the methods described herein is performed by an apparatus that includes a processor configured to perform the described methods. The processor may be configured to perform the methods using appropriate instructions stored in a computer-readable medium (e.g. a non-transitory medium).
[0058] Note that various hardware elements of one or more of the described embodiments are referred to as “modules” that carry out (i.e., perform, execute, and the like) various functions that are described herein in connection with the respective modules. As used herein, a module includes hardware (e.g., one or more processors, one or more microprocessors, one or more microcontrollers, one or more microchips, one or more application-specific integrated circuits (ASICs), one or more field programmable gate arrays (FPGAs), one or more memory devices) deemed suitable by those of skill in the relevant art for a given implementation.
Each described module may also include instructions executable for carrying out the one or more functions described as being carried out by the respective module, and it is noted that those instructions could take the form of or include hardware (i.e., hardwired) instructions, firmware instructions, software instructions, and/or the like, and may be stored in any suitable non-transitory computer-readable medium or media, such as commonly referred to as RAM, ROM, etc.
[0059] Although features and elements are described above in particular combinations, one of ordinary skill in the art will appreciate that each feature or element can be used alone or in any combination with the other features and elements. In addition, the methods described herein may be implemented in a computer program, software, or firmware incorporated in a computer-readable medium for execution by a computer or processor. Examples of computer-readable storage media include, but are not limited to, a read only memory (ROM), a random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks, and digital versatile disks (DVDs). A processor in association with software may be used to implement a radio frequency transceiver for use in a WTRU, UE, terminal, base station, RNC, or any host computer.
Claims
What is Claimed: 1. A method comprising: receiving a first array of image samples; receiving a second array of image samples; generating a combined array of image samples, wherein, for at least a first sample in the combined array: a first value associated with a first color component is obtained from a corresponding position in the first array; a second value associated with a second color component is obtained from a corresponding position in the second array; and a third value associated with a third color component is obtained by interpolating a value associated with the third color component from at least one of the first and second image samples.
2. The method of claim 1, wherein for at least a second sample in the combined array: a first value associated with the first color component is obtained from a corresponding position in the second array; a second value associated with the second color component is obtained by interpolating a value from at least one of the first and second image samples; and a third value associated with the third color component is obtained from a corresponding position in the first array.
3. The method of claim 2, wherein the second sample neighbors the first sample either vertically or horizontally.
4. The method of any of the preceding claims, wherein for at least a third sample in the combined array: a first value associated with the first color component is obtained from a corresponding position in the second array;
a second value associated with the second color component is obtained from a corresponding position in the first array; and a third value associated with the third color component is obtained by interpolating a value from at least one of the first and second image samples.
5. The method of claim 4, wherein the third sample neighbors the first sample either vertically or horizontally.
6. The method of any of the preceding claims, wherein for at least a fourth sample in the combined array: a first value associated with the first color component is obtained from a corresponding position in the first array; a second value associated with the second color component is obtained by interpolating a value from at least one of the first and second image samples; and a third value associated with the third color component is obtained from a corresponding position in the second array.
7. The method of claim 6, wherein the fourth sample neighbors the first sample diagonally.
8. The method of any of the preceding claims, wherein the first and second arrays of image samples are obtained using a Bayer filter array.
9. The method of any of the preceding claims, wherein the first array of image samples is received from a first image sensor on a user device and the second array of image samples is received from a second image sensor on the user device, the first and second image sensors having at least substantially perpendicular orientations.
10. The method of claim 9, wherein the first array of image samples and the second array of image samples are captured at least substantially simultaneously using the first and second image sensors.
11. The method of claim 10, wherein the at least substantially simultaneous capture of the first and second array of image samples is performed in response to an input on the user device triggering a photograph.
12. The method of claim 11, wherein generating the combined array of image samples is performed in response to a determination that the user device is not in a macro mode.
13. The method of any of the preceding claims, wherein the first color component is green, the second color component is red, and the third color component is blue.
14. The method of any of claims 1-12, wherein the first color component is green, the second color component is blue, and the third color component is red.
15. A method comprising: capturing a first input image using a first Bayer filter array and a second input image using a second Bayer filter array, where the first and second Bayer filter arrays have substantially perpendicular orientations; and obtaining a combined image from the first image and the second image, wherein each pixel in the combined image includes a first color component obtained from the first image, a second color component obtained from the second image, and a third color component obtained by interpolating a value from at least one of the first and second input images.
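The per-pixel combination recited in the method claims above can be pictured with a short sketch. This is an illustrative example only, not the claimed method itself: it assumes a hypothetical RGGB layout for the first mosaic and a GRBG layout (RGGB rotated 90 degrees) for the second, takes two direct color components per pixel from the two mosaics, and interpolates the one component neither mosaic captures there by averaging the nearest samples of that color in the first mosaic.

```python
import numpy as np

# Hypothetical layouts for illustration: the first sensor uses an RGGB
# mosaic; the second, rotated 90 degrees, presents GRBG at the same pixel
# positions. Each map gives, for (row % 2, col % 2), the color captured
# there: 0 = R, 1 = G, 2 = B.
FIRST_PATTERN = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 2}   # RGGB
SECOND_PATTERN = {(0, 0): 1, (0, 1): 0, (1, 0): 2, (1, 1): 1}  # GRBG

def combine_bayer(first, second):
    """Merge two perpendicular Bayer mosaics into one RGB array.

    Two of the three color components at each pixel come directly from
    the mosaics; the third (missing from both at that position) is
    interpolated from neighboring samples of that color in the first mosaic.
    """
    h, w = first.shape
    rgb = np.zeros((h, w, 3), dtype=float)
    for y in range(h):
        for x in range(w):
            key = (y % 2, x % 2)
            c1, c2 = FIRST_PATTERN[key], SECOND_PATTERN[key]
            rgb[y, x, c1] = first[y, x]
            rgb[y, x, c2] = second[y, x]
            missing = 3 - c1 - c2  # the channel neither mosaic captured here
            neighbors = [first[ny, nx]
                         for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1),
                                        (y - 1, x - 1), (y - 1, x + 1),
                                        (y + 1, x - 1), (y + 1, x + 1))
                         if 0 <= ny < h and 0 <= nx < w
                         and FIRST_PATTERN[(ny % 2, nx % 2)] == missing]
            rgb[y, x, missing] = sum(neighbors) / len(neighbors) if neighbors else 0.0
    return rgb
```

Because the two patterns disagree everywhere, `missing` is always the single remaining channel, so every pixel ends up with a direct red or blue, a direct green, and one interpolated value, which is the complementarity the perpendicular orientation provides.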
16. A device comprising a processor configured to: receive a first array of image samples; receive a second array of image samples; generate a combined array of image samples, wherein, for at least a first sample in the combined array: a first value associated with a first color component is obtained from a corresponding position in the first array;
a second value associated with a second color component is obtained from a corresponding position in the second array; and a third value associated with a third color component is obtained by interpolating a value associated with the third color component from at least one of the first and second image samples.
17. The device of claim 16, wherein the processor is further configured to, for at least a second sample in the combined array: obtain a first value associated with the first color component from a corresponding position in the second array; obtain a second value associated with the second color component by interpolating a value from at least one of the first and second image samples; and obtain a third value associated with the third color component from a corresponding position in the first array.
18. The device of claim 17, wherein the second sample neighbors the first sample either vertically or horizontally.
19. The device of any of claims 16-18, wherein the processor is further configured to, for at least a third sample in the combined array: obtain a first value associated with the first color component from a corresponding position in the second array; obtain a second value associated with the second color component from a corresponding position in the first array; and obtain a third value associated with the third color component by interpolating a value from at least one of the first and second image samples.
20. The device of claim 19, wherein the third sample neighbors the first sample either vertically or horizontally.
21. The device of any of claims 16-20, wherein the processor is further configured to, for at least a fourth sample in the combined array:
obtain a first value associated with the first color component from a corresponding position in the first array; obtain a second value associated with the second color component by interpolating a value from at least one of the first and second image samples; and obtain a third value associated with the third color component from a corresponding position in the second array.
22. The device of claim 21, wherein the fourth sample neighbors the first sample diagonally.
23. The device of any of claims 16-22, further comprising a Bayer filter array configured to obtain the first and second arrays of image samples.
24. The device of any of claims 16-23, further comprising a first image sensor configured to capture the first array of image samples, and a second image sensor configured to capture the second array of image samples, the first and second image sensors having at least substantially perpendicular orientations.
25. The device of claim 24, wherein the device is configured to capture the first array of image samples and the second array of image samples at least substantially simultaneously using the first and second image sensors.
26. The device of claim 25, wherein the device is configured to perform the at least substantially simultaneous capture of the first and second array of image samples in response to an input on the device triggering a photograph.
27. The device of claim 26, wherein the device is configured to generate the combined array of image samples in response to a determination that the device is not in a macro mode.
28. The device of any of claims 16-27, wherein the first color component is green, the second color component is red, and the third color component is blue.
29. The device of any of claims 16-27, wherein the first color component is green, the second color component is blue, and the third color component is red.
30. A device comprising a processor configured to: capture a first input image using a first Bayer filter array and a second input image using a second Bayer filter array, where the first and second Bayer filter arrays have substantially perpendicular orientations; and obtain a combined image from the first image and the second image, wherein each pixel in the combined image includes a first color component obtained from the first image, a second color component obtained from the second image, and a third color component obtained by interpolating a value from at least one of the first and second input images.
31. A method comprising: receiving a first array of image samples corresponding to an image from a first image sensor in a first orientation; receiving, simultaneously with the first array, a second array of image samples corresponding to the image from a second image sensor in an orientation rotated relative to the first orientation; and generating a combined array of image samples by combining corresponding positions from the received first array of image samples with the received second array of image samples.
32. A device comprising a processor configured to: receive a first array of image samples corresponding to an image from a first image sensor in a first orientation; receive, simultaneously with the first array, a second array of image samples corresponding to the image from a second image sensor in an orientation rotated relative to the first orientation; and generate a combined array of image samples by combining corresponding positions from the received first array of image samples with the received second array of image samples.
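The rotated-orientation relationship underlying claims 31 and 32 can be illustrated with a small sketch. This is a hypothetical example, assuming an RGGB tile for the first sensor and a 90-degree counterclockwise rotation (NumPy's `rot90` default) for the second: at every pixel position the two tiles capture different colors, so the arrays complement each other when combined position-by-position.

```python
import numpy as np

# A single 2x2 Bayer tile for the first sensor (illustrative RGGB layout).
RGGB = np.array([["R", "G"],
                 ["G", "B"]])

# The second sensor's tile as seen in the first sensor's frame: the same
# pattern rotated 90 degrees counterclockwise, which yields GBRG.
ROTATED = np.rot90(RGGB)

for y in range(2):
    for x in range(2):
        pair = (RGGB[y, x], ROTATED[y, x])
        # The two sensors never sample the same color at the same position,
        # so each pixel gets two direct components; the third is interpolated.
        assert pair[0] != pair[1]
        print((y, x), "->", pair)
```

The same complementarity holds for a clockwise rotation; only which rotated pattern (GBRG versus GRBG) lands on the first sensor's grid changes.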
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP20305870 | 2020-07-30 | ||
EP20305870.6 | 2020-07-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022023157A1 (en) | 2022-02-03 |
Family
ID=72046822
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2021/070520 WO2022023157A1 (en) | 2020-07-30 | 2021-07-22 | Debayering with multiple camera rotations |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2022023157A1 (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013025530A1 (en) * | 2011-08-12 | 2013-02-21 | Intuitive Surgical Operations, Inc. | An image capture unit in a surgical instrument |
JP2013172218A (en) * | 2012-02-20 | 2013-09-02 | Sony Corp | Imaging device, image processing method, and program |
2021-07-22: WO application PCT/EP2021/070520 (WO2022023157A1) filed — active Application Filing
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8908013B2 (en) | Systems and methods for collaborative image capturing | |
US10762664B2 (en) | Multi-camera processor with feature matching | |
CN102640499B (en) | System and method for demosaicing image data using weighted gradients | |
US9489719B2 (en) | Image processing device, imaging device, computer, image processing method and computer readable non-transitory medium | |
JP5690974B2 (en) | Imaging apparatus and focus control method | |
KR102480600B1 (en) | Method for low-light image quality enhancement of image processing devices and method of operating an image processing system for performing the method | |
US20150363913A1 (en) | Adaptive filter demosaicizing for super resolution | |
JP5775977B2 (en) | Image processing apparatus, imaging apparatus, image processing method, and image processing program | |
US9536289B2 (en) | Restoration filter generation device and method, image processing device, imaging device, and non-transitory computer-readable medium | |
CN108605099A (en) | The method and terminal taken pictures for terminal | |
US20240119566A1 (en) | Image processing method and apparatus, and electronic device | |
EP3818694A1 (en) | Method and system for near-eye focal plane overlays for 3d perception of content on 2d displays | |
WO2017071542A1 (en) | Image processing method and apparatus | |
JP5747124B2 (en) | Imaging device | |
JP2018504841A (en) | A method for mobile devices to improve camera image quality by detecting whether the mobile device is indoors or outdoors | |
US9307211B2 (en) | Image processing system, transmitting-side device and receiving-side device | |
JPWO2016080081A1 (en) | Imaging apparatus, imaging method, and program | |
JP7142715B2 (en) | Ambient light detection method and terminal | |
JP5542248B2 (en) | Imaging device and imaging apparatus | |
WO2019157427A1 (en) | Image processing | |
JP5749409B2 (en) | Imaging apparatus, image processing method, and program | |
KR102285756B1 (en) | Electronic system and image processing method | |
JP6324879B2 (en) | Imaging apparatus and control method thereof | |
CN112116530B (en) | Fisheye image distortion correction method, device and virtual display system | |
WO2022023157A1 (en) | Debayering with multiple camera rotations |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21748624 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 21748624 Country of ref document: EP Kind code of ref document: A1 |