EP2955916A1 - Systems and methods to capture a stereoscopic image pair - Google Patents

Systems and methods to capture a stereoscopic image pair

Info

Publication number
EP2955916A1
Authority
EP
European Patent Office
Prior art keywords
image
capture
disparity
imaging sensor
stereoscopic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP15178955.9A
Other languages
German (de)
English (en)
French (fr)
Inventor
Szepo Robert Hung
Ruben M Velarde
Thomas Wesley OSBORNE
Liang Liang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qualcomm Inc
Original Assignee
Qualcomm Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qualcomm Inc filed Critical Qualcomm Inc
Publication of EP2955916A1
Legal status: Withdrawn


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 Processing image signals
    • H04N13/128 Adjusting depth or disparity
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/55 Depth or shape recovery from multiple images
    • G06T7/593 Depth or shape recovery from multiple images from stereo images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/207 Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H04N13/221 Image signal generators using stereoscopic image cameras using a single 2D image sensor using the relative movement between cameras and objects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/271 Image signal generators wherein the generated image signals comprise depth maps or disparity maps
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/296 Synchronisation thereof; Control thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20068 Projection on vertical or horizontal image axis

Definitions

  • One aspect of the invention relates to imaging devices, and in particular, to methods, apparatus, and systems for the capture of stereoscopic images utilizing a single imaging sensor on an imaging device.
  • However, a second digital imaging sensor comes with some disadvantages.
  • For example, the additional cost of a second imaging sensor and the associated electronics can be prohibitive in at least some market segments.
  • A second imaging sensor also impacts the usability of the device in a number of ways.
  • For example, accommodation of a second imaging sensor requires the device to be somewhat larger than a device with only a single sensor.
  • Furthermore, the power capabilities of the device must be sized to support powering both imaging sensors simultaneously. This may require larger and more costly power handling circuitry. Battery life of the device may also be affected, perhaps requiring a larger battery.
  • Some of the embodiments may comprise a method of capturing a stereoscopic image.
  • The method may comprise the capturing of a first image through an imaging sensor.
  • The method may further comprise the capturing of a second image through the imaging sensor, and the determining of the vertical disparity between the first image and the second image.
  • The method may further comprise determining the horizontal disparity between the first image and the second image, determining the geometric distortion between the first image and the second image, determining a convergence point between the first image and the second image, and applying a correction to create at least one corrected image.
  • In some embodiments, the convergence point is determined based on a depth range and a depth histogram.
  • In some embodiments, the vertical disparity is determined by the cross correlation of row sum vectors.
  • Some embodiments may further comprise the generating of a stereoscopic image pair based on the corrected image. In other embodiments, the above elements may be performed repetitively.
  • Other embodiments may comprise a method of capturing a stereoscopic image.
  • The method may comprise the capturing of a first image through an imaging sensor, the displaying of a directional indicator on an electronic display, the capturing of a second image through the imaging sensor, and the generating of a stereoscopic image based on the first image and the second image.
  • The method may further comprise displaying a portion of the first image on a corresponding portion of an electronic display and displaying a portion of a preview image from the imaging sensor on a corresponding portion of the electronic display.
  • The method may further comprise displaying a transparent version of the first image on an electronic display, or displaying a transparent version of a preview image on the electronic display.
  • The method may further comprise displaying an indication on the electronic display of the horizontal shift required to capture a high quality image.
  • Other embodiments may include the displaying of a dynamically estimated quality indicator.
  • In some embodiments, the capturing of the second image is performed in response to a user actuated control, while in other embodiments, the second image is captured automatically.
  • For example, the automatic capture of the second image may be based on the horizontal disparity between a first image and a preview image.
  • In other embodiments, the capture of the second image is based at least in part on input from an auto focus module, an accelerometer, or a frame disparity between the first image and the real time image.
  • Other embodiments may comprise an imaging device including an imaging sensor and an electronic processor, wherein the electronic processor is configured to control the imaging sensor.
  • These embodiments may also include a control module configured to capture a first image using the imaging sensor, capture a second image using the imaging sensor, determine the vertical disparity between the first image and the second image, determine the horizontal disparity between the first image and the second image, determine the geometric distortion between the first image and the second image, determine a convergence point between the first image and the second image, and apply a correction to create at least one corrected image.
  • Some embodiments may further comprise creation of a stereoscopic image pair based on the corrected image.
  • In some embodiments, these elements may be performed repetitively.
  • In some embodiments, the imaging device may further comprise a wireless telephone handset.
  • In other embodiments, the control module may be configured to capture the second image automatically.
  • Some embodiments of the device further comprise a user actuated control, wherein the control module is further configured to capture the first image in response to a first actuation of the user actuated control and to capture a second image in response to a second actuation of the user actuated control.
  • Other embodiments may comprise an imaging device comprising an imaging sensor, an electronic display, and a processor, wherein the processor is configured to control the imaging sensor and the electronic display.
  • These embodiments further comprise a control module configured to capture a first image using the imaging sensor, display a directional indicator on the electronic display, capture a second image using the imaging sensor, and generate a stereoscopic image based on the first image and the second image.
  • In some embodiments, the control module is further configured to determine the horizontal disparity between the first image and the second image, and the display of the directional indicator is based on the horizontal disparity.
  • In other embodiments, the device further comprises an accelerometer, wherein the display of the directional indicator is based on input from the accelerometer.
  • In some embodiments, the control module is further configured to display a portion of the first image on the electronic display, while in still other embodiments, the control module is further configured to display a portion of a preview image on the electronic display. In other embodiments, the portion of the first image and the portion of the preview image are displayed simultaneously.
  • Other embodiments include a non-transitory computer readable medium containing processor executable instructions that are operative to cause a processor to capture a first image using an imaging sensor, capture a second image using the imaging sensor, determine the vertical disparity between the first image and the second image, determine the horizontal disparity between the first image and the second image, determine the geometric distortion between the first image and the second image, determine a convergence point between the first image and the second image, and apply a correction to create at least one corrected image.
  • Other embodiments further comprise executable instructions operative to cause a processor to generate a stereoscopic image pair based on the corrected image.
  • Some other embodiments include instructions operative to cause a processor to determine the vertical disparity based on the cross correlation of row sum vectors.
  • Other embodiments include a non-transitory computer readable medium containing processor executable instructions that are operative to cause a processor to capture a first image using an imaging sensor, display a directional indicator, capture a second image through the imaging sensor, and generate a stereoscopic image based on the first image and the second image.
  • Other embodiments further include instructions operative to cause a processor to display a portion of the first image on a portion of an electronic display and display a portion of a preview image on a portion of the electronic display.
  • Some other embodiments may comprise instructions operative to cause a processor to display a transparent version of the first image on an electronic display or display a transparent version of a preview image on the electronic display.
  • Still other embodiments may comprise an imaging device comprising means for capturing a first image through an imaging sensor, means for capturing a second image through the imaging sensor, means for determining the vertical disparity between the first image and the second image, means for determining the horizontal disparity between the first image and the second image, means for determining the geometric distortion between the first image and the second image, means for determining a convergence point between the first image and the second image, and means for applying a correction to create at least one corrected image.
  • Some embodiments may further comprise means for generating a stereoscopic image based on the corrected image.
  • Other embodiments may comprise an imaging device comprising means for capturing a first image using an imaging sensor, means for displaying a directional indicator, means for capturing a second image through the imaging sensor, and means for generating a stereoscopic image based on the first image and the second image.
  • Implementations disclosed herein provide systems, methods and apparatus for capturing a stereoscopic image with a device including only one imaging sensor. Particularly, some embodiments described herein contemplate capturing two separate images using the one imaging sensor, and generating a stereoscopic image based on the two images. One embodiment includes providing a directional indicator on an electronic display, indicating in which direction the imaging sensor should be moved before capturing the second image.
  • In the following description, examples may be described as a process, which is depicted as a flowchart, a flow diagram, a finite state diagram, a structure diagram, or a block diagram.
  • Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently, and the process can be repeated. In addition, the order of the operations may be rearranged.
  • A process is terminated when its operations are completed. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a software function, its termination corresponds to a return of the function to the calling function or the main function.
  • Figure 1 depicts a high-level block diagram of a device 100 having a set of components including a processor 120 linked to an imaging sensor 115.
  • A working memory 105, storage 110, electronic display 125, and memory 130 are also in communication with the processor 120.
  • Device 100 may be a cell phone, digital camera, personal digital assistant, or the like. Device 100 may also be a more stationary device such as a desktop personal computer, video conferencing station, or the like that uses an internal or external camera for capturing images.
  • A plurality of applications may be available to the user on device 100. These applications may include traditional photographic applications, high dynamic range imaging, panoramic video, or stereoscopic imaging such as 3D images or 3D video.
  • Processor 120 may be a general purpose processing unit or a processor specially designed for imaging applications. As shown, the processor 120 is connected to a memory 130 and a working memory 105.
  • The memory 130 stores an imaging sensor control module 135, disparity removal module 140, convergence adjustment module 145, geometric distortion estimation and correction module 150, shift and crop module 155, encoding module 160, user interface module 165, capture control module 170, and operating system 175. These modules include instructions that configure the processor to perform various image processing and device management tasks.
  • Working memory 105 may be used by processor 120 to store a working set of processor instructions contained in the modules of memory 130. Alternatively, working memory 105 may also be used by processor 120 to store dynamic data created during the operation of device 100.
  • As mentioned above, the processor is configured by several modules stored in the memories.
  • The imaging sensor control module 135 includes instructions that configure the processor 120 to adjust the focus position of imaging sensor 115.
  • The imaging sensor control module 135 also includes instructions that configure the processor 120 to capture images with the imaging sensor 115. Therefore, processor 120, along with imaging sensor control module 135, imaging sensor 115, and working memory 105, represent one means for capturing an image using an imaging sensor.
  • The disparity removal module 140 provides instructions that configure the processor 120 to detect and eliminate vertical disparity between two images captured by imaging sensor 115. Disparity removal module 140 may also provide instructions to detect horizontal disparity between two images captured by imaging sensor 115.
  • The convergence adjustment module 145 contains instructions that configure the processor to adjust the convergence point between two images captured with the imaging sensor 115.
  • Geometric Distortion Estimation and Correction module 150 contains instructions that configure the processor to detect geometric distortion caused by misalignment of two images captured by imaging sensor 115.
  • Shift and crop module 155 includes instructions that configure the processor 120 to shift image one and image two in relation to each other in order to correct for vertical disparity between the two images.
  • Shift and crop module 155 may also include instructions to crop image 1 and/or image 2 to achieve consistent alignment between the two images.
  • Encoding module 160 includes instructions that configure the processor to encode images captured by imaging sensor 115 into a stereoscopic image. Therefore, instructions contained within encoding module 160 represent one means for generating a stereoscopic image based on a first image and a second image.
  • User interface module 165 includes instructions that configure the processor to display information on the electronic display 125.
  • Capture control module 170 includes instructions that control the overall image processing functions of device 100.
  • For example, capture control module 170 may include instructions that call subroutines in imaging sensor control module 135 in order to configure the processor 120 to capture a first and second image using the imaging sensor 115.
  • Capture control module 170 may then call disparity removal module 140 to determine the horizontal disparity between the two images.
  • Capture control module 170 may then call geometric distortion estimation and correction module 150 to determine the geometric distortion between the first and second images.
  • Capture control module 170 may then call subroutines within the convergence adjustment module 145 to adjust a convergence point between the two images.
  • Operating System module 175 configures the processor to manage the memory and processing resources of device 100.
  • For example, operating system module 175 may include device drivers to manage hardware resources such as the electronic display 125, storage 110, or imaging sensor 115. Therefore, in some embodiments, instructions contained in the image processing modules discussed above may not interact with these hardware resources directly, but instead interact through standard subroutines or APIs located in operating system component 175. Instructions within operating system 175 may then interact directly with these hardware components.
  • Processor 120 may write data to storage module 110. While storage module 110 is represented graphically as a traditional disk device, those with skill in the art would understand that multiple embodiments could include either a disk based storage device or one of several other types of storage media, including a memory disk, USB drive, flash drive, remotely connected storage medium, virtual disk driver, or the like.
  • Figure 1 depicts a device comprising separate components, including a processor, an imaging sensor, and memory.
  • Figure 1 also illustrates two memory components: memory component 130, comprising several modules, and a separate memory 105 comprising a working memory.
  • For example, a design may utilize ROM or static RAM memory for the storage of processor instructions implementing the modules contained in memory 130.
  • Alternatively, processor instructions may be read at system startup from a disk storage device that is integrated into device 100 or connected via an external device port. The processor instructions may then be loaded into RAM to facilitate execution by the processor.
  • For example, working memory 105 may be a RAM memory, with instructions loaded into working memory 105 before execution by the processor 120.
  • Figure 2 is a flow chart illustrating a process 200 that runs within one embodiment of the capture control module 170 of Figure 1.
  • The process 200 begins at start block 205 and then transitions to block 210 where a first image is captured.
  • The first image may be captured by instructions in capture control module 170 calling subroutines inside imaging sensor control module 135.
  • Imaging sensor control module 135 then configures the processor to control imaging sensor 115, possibly via operating system module 175, to capture an image.
  • Process 200 then moves to block 215 where a second image is captured.
  • Process 200 then moves to block 220 where instructions determine the vertical disparity between the first and second images. These instructions may be located in the disparity removal module 140 of Figure 1. It is well known that vertical disparity between two stereoscopic images can create nausea, headaches, and other physical effects. Therefore, the removal of vertical disparity from a stereoscopic image ensures a pleasant viewing experience.
  • Block 220 of process 200 may determine the vertical disparity between the first and second images by first summing the rows of each image. This summation process creates two vectors, one vector for each image. Each element of a vector represents one row sum for an image.
  • An example vector is shown in Figure 3, item 310.
  • Here, the rows of the image 305 in Figure 3 have been summed, producing the vector represented by graph 310.
  • Row sum vectors for two images, taken by the same imaging sensor but from different positions, are also illustrated in Figure 3.
  • Image 320 is taken from a first position and image 330 is taken from a second position. Differences in the two graphs represent variation between the two images.
  • However, the vectors have substantial similarities; for example, there is a general correspondence between the peaks and valleys of the two graphs. These similarities allow a best fit operation to be performed on the two vectors. In some embodiments, a best fit may be determined by identifying an offset between the two vectors that minimizes the sum of absolute differences between positions of the two vectors. While row summing provides a relatively simple solution to disparity recognition and adjustment, it has some disadvantages. For example, its effectiveness is scene dependent, and it may fail completely in some cases. Additionally, its precision can be affected when there is misalignment between the two images; for example, a misalignment in pitch can affect the accuracy of a row summing based solution.
  • Image misalignment due to scaling can also affect the accuracy of a vertical disparity determination based on row summing.
  • Other embodiments may form vectors based on the results of a horizontal edge detection process. A best fit is then performed on the horizontal edge vectors in a similar manner as that described above.
  • This best fit operation will identify an offset by which one graph may be adjusted to best align with the other. This offset can be applied to one of the images to align the images vertically. Therefore, instructions within the disparity removal module 140 performing a best fit of row sum vectors represent one means for determining the vertical disparity between two images.
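For illustration, here is a minimal NumPy sketch of the row-sum approach described above: each image is reduced to a row sum vector, and the best fit is found by searching for the offset that minimizes the sum of absolute differences over the overlap. The function names and the ±64-row search window are illustrative assumptions, and the images are assumed to be same-sized grayscale arrays taller than the search window.

```python
import numpy as np

def row_sum_vector(image):
    """Sum each row of a grayscale image, producing one element per row."""
    return image.astype(np.float64).sum(axis=1)

def best_fit_offset(vec_a, vec_b, max_shift=64):
    """Return the shift of vec_b relative to vec_a that minimizes the
    mean absolute difference over the overlapping elements (a SAD-style
    best fit, normalized so shorter overlaps are not favored)."""
    best_shift, best_cost = 0, float("inf")
    for shift in range(-max_shift, max_shift + 1):
        if shift >= 0:
            a, b = vec_a[shift:], vec_b[:len(vec_b) - shift]
        else:
            a, b = vec_a[:shift], vec_b[-shift:]
        cost = float(np.mean(np.abs(a - b)))
        if cost < best_cost:
            best_cost, best_shift = cost, shift
    return best_shift

def vertical_disparity(image_a, image_b, max_shift=64):
    """Estimate the vertical disparity (in rows) between two images."""
    return best_fit_offset(row_sum_vector(image_a),
                           row_sum_vector(image_b), max_shift)
```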
  • Block 220 may also determine vertical disparity using other techniques. For example, some embodiments may identify a best match within a local neighborhood of the two images. For instance, an embodiment may locate key feature points within one image and search for the best match in the other image. Therefore, instructions within the disparity removal module 140 performing a best match within a local neighborhood for a key feature point represent another means for determining the vertical disparity between two images.
  • Alternate embodiments may utilize an embedded motion sensor such as an accelerometer to determine angular motion and positional shift between two images.
  • In these embodiments, the relative vertical disparity can be computed based on the relative motion occurring between the capture of the two images.
  • Therefore, instructions within a disparity removal module 140 utilizing an embedded motion sensor such as an accelerometer to calculate the relative motion occurring between the capture of two images represent another means for determining the vertical disparity between two images.
  • Next, process 200 moves to block 225 where the horizontal disparity is determined.
  • Horizontal disparity detection may be performed in a similar manner to vertical disparity detection, with the exception that vectors are created by summing the columns of an image instead of the rows.
  • This column summation process is also illustrated in Figure 3.
  • In Figure 3, graph 340 represents the values of a vector created by summing the columns of an image.
  • A correction for the horizontal disparity can be obtained by determining a best fit between the two images' column sum vectors. Therefore, instructions that sum the columns of two images to create two column sum vectors, and perform a best fit between the two column sum vectors, represent one means for determining a horizontal disparity between two images.
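The same best-fit search can be reused for column sums, building on `best_fit_offset` and the NumPy import from the sketch above; the names here are likewise illustrative.

```python
def column_sum_vector(image):
    """Sum each column of a grayscale image, one element per column."""
    return image.astype(np.float64).sum(axis=0)

def horizontal_disparity(image_a, image_b, max_shift=64):
    """Estimate the horizontal disparity (in columns) between two images."""
    return best_fit_offset(column_sum_vector(image_a),
                           column_sum_vector(image_b), max_shift)
```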
  • Alternate embodiments may utilize the feature point technique described above to also determine horizontal disparity. Therefore, instructions in a disparity removal module 140 determining a best match for a feature point in a local neighborhood of two images represent another means for determining a horizontal disparity between the two images.
  • Process 200 then moves to block 230 where the geometric distortion is determined.
  • Block 230 may be performed by instructions contained in the geometric distortion estimation and correction module 150 that configure processor 120.
  • In some embodiments, one means for geometric distortion estimation is instructions that perform feature point matching within the geometric distortion estimation and correction module 150.
  • Other means for determining a geometric distortion between two images may utilize a motion sensor such as an accelerometer. Instructions record the relative shift in location between the capture of the two images. Most modern accelerometers can measure this shift across six independent axes. If an accelerometer is embedded in an imaging device such as device 100, some embodiments may also utilize it to assist in estimation of the vertical and horizontal disparity discussed above.
  • Process 200 then moves to block 235, where a correction is applied to create at least one corrected image. Block 235 may be performed by instructions contained in any one or combination of the disparity removal module 140, convergence adjustment module 145, geometric distortion estimation and correction module 150, shift and crop module 155, or the capture control module 170 of device 100, illustrated in Figure 1. Instructions in these modules represent one means for applying a correction to create a corrected image.
  • Block 235 may rely on the angular displacement information determined in block 230. Once the angular displacement is known, a three dimensional projection matrix is estimated. One image may then be corrected to properly match the other image. Instructions implementing these techniques represent one means for applying a correction to an image.
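As a rough sketch of the feature-point approach, the following uses OpenCV's ORB detector and a RANSAC-fitted homography as the projective transform; the text does not name a particular detector or library, so ORB, the match count, and the RANSAC threshold are assumptions.

```python
import cv2
import numpy as np

def correct_geometric_distortion(image_a, image_b, max_matches=100):
    """Estimate a projective transform between two views via feature-point
    matching and warp image_b to align with image_a. Assumes the views
    overlap enough to yield at least four good matches."""
    orb = cv2.ORB_create()
    kp_a, des_a = orb.detectAndCompute(image_a, None)
    kp_b, des_b = orb.detectAndCompute(image_b, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)
    matches = matches[:max_matches]

    src = np.float32([kp_b[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_a[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)

    # RANSAC rejects outlier matches while fitting the homography.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    h, w = image_a.shape[:2]
    return cv2.warpPerspective(image_b, H, (w, h))
```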
  • Alternatively, block 235 may shift and/or crop one or both images.
  • For example, the first image may be cropped to remove disparity with respect to the second image.
  • In this case, the second image will also need to be cropped to maintain dimensions equivalent to the first image.
  • This cropping results in a stereoscopic image pair with a smaller vertical field of view than that of the original images.
  • In practice, eliminating vertical disparity typically requires the removal of a maximum of five percent of image height at the bottom and top of the image to produce a vertically aligned stereoscopic image pair. This reduces the vertical field of view by a total of ten percent.
  • Instructions contained in a shift and crop module 155 that crop one or both images as described above represent another means for applying a correction to create a corrected image.
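A minimal sketch of the crop step, assuming the vertical disparity has already been measured in whole rows and that a positive offset means the second image sits lower; the sign convention and function name are illustrative.

```python
def crop_vertical_disparity(image_a, image_b, offset):
    """Crop rows from opposite edges so the two images align vertically,
    keeping identical dimensions for the resulting stereoscopic pair."""
    d = abs(int(offset))
    if d == 0:
        return image_a, image_b
    if offset > 0:
        return image_a[:-d], image_b[d:]   # drop bottom of a, top of b
    return image_a[d:], image_b[:-d]       # drop top of a, bottom of b
```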
  • Next, process 200 moves to block 240 where the convergence point of the first and second images is determined.
  • Block 240 may be performed by instructions contained within the convergence adjustment module 145 of Figure 1.
  • For example, one means for determining the convergence point in some embodiments is instructions that set the convergence point to be one half of the global horizontal disparity.
  • Another means to determine a convergence point between two images is for instructions to first estimate a depth range for the scene and create a depth histogram. The convergence point is then set by instructions such that the stereoscopic depth falls into a comfortable viewing zone.
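A minimal sketch of the first approach, assuming the global horizontal disparity is measured in whole pixels and that cropping opposite vertical edges is an acceptable way to shift each view by half the disparity; the sign convention is an assumption.

```python
def set_convergence(image_left, image_right, global_disparity):
    """Place the convergence point at one half of the global horizontal
    disparity by cropping opposite edges of the two views."""
    half = abs(int(global_disparity)) // 2
    if half == 0:
        return image_left, image_right
    if global_disparity > 0:
        return image_left[:, half:], image_right[:, :-half]
    return image_left[:, :-half], image_right[:, half:]
```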
  • Finally, process 200 moves to block 245 where a stereoscopic image pair is created based on any corrected images.
  • Block 245 may be performed by instructions in encoding module 160. The original first and second images may also be used. Process 200 then transitions to end state 250.
  • One embodiment of a cropping process is illustrated by the images shown in Figure 4. Illustrated are two images of a building on a body of water. Image 410 was taken from a slightly lower perspective than image 420, but using the same image sensor. Accordingly, image 410 includes a greater portion of the lake in its field of view, while image 420 includes a greater portion of the sky in its field of view. The portions of each image not included in the other image are represented by the shaded portions of each image, identified as 415 and 425. As illustrated, the two images include significant vertical disparity, which may be eliminated before forming a stereoscopic image pair. To eliminate the vertical disparity, the shaded portions 415 and 425 of each image will be cropped, resulting in a final field of view represented by the common portion of the two images, identified by the bracketed area 440.
  • Figures 5A and 5B are flow charts depicting one embodiment of processes used by a capture control module to capture a stereoscopic image pair.
  • Figure 5A is a flow chart illustrating a process 500 that runs within one embodiment of the capture control module 170 of Figure 1. Closely related to the illustrated process 500 is process 550, illustrated in Figure 5B. Process 550 may also run within one embodiment of the capture control module 170 of Figure 1. Process 500 and process 550 work together to display information on an electronic display regarding the optimal position of the imaging sensor and to capture the first and second images of a stereoscopic image pair. The embodiment illustrated by process 500 and process 550 relies on user input to capture the second image before forming a stereoscopic image. Process 500 is responsible for capturing the first image of the stereoscopic image pair and for managing the display of information on the electronic display, while process 550 captures the second image and exits the device from stereoscopic capture mode.
  • Process 500 may be performed by instructions included in capture control module 170 of device 100, illustrated in Figure 1.
  • Process 500 begins at start state 505 and then transitions to block 510 where process 500 waits for an image capture command.
  • The image capture command may occur when a user actuates a device control, or by more automatic means.
  • For example, the imaging device 100 may include a self timer that automatically captures an image after a particular delay.
  • Alternatively, other embodiments of device 100 may include remote control means that command an image capture remotely, for example via either a wired or wireless connection to device 100.
  • When the image capture command occurs, process 500 moves to block 512 where a first image is captured.
  • Instructions implementing process 500 may invoke subroutines located in imaging sensor control module 135 of device 100, illustrated in Figure 1, to capture the first image. Instructions in those subroutines may configure processor 120 to control imaging sensor 115 to capture the first image.
  • Process 500 then moves to block 514 where instructions store the image to a data store. In some embodiments, the image may be stored in a data store such as data store 110 of Figure 1.
  • Process 500 then moves to decision block 516 where instructions determine if the device is currently in a stereoscopic capture mode. Such a mode may be enabled for example if a user has selected a stereoscopic mode of operation before performing the first image capture command. If the stereoscopic capture mode is not enabled, process 500 moves to block 530 and process 500 ends.
  • If the stereoscopic capture mode is enabled, process 500 moves to block 518 and displays a portion of the first captured image on the electronic display.
  • Embodiments of the display of the first captured image are described in more detail in the explanation of Figure 8 and Figure 9 below.
  • Process 500 then moves to block 520 where instructions capture a preview image.
  • A preview image may be a real time image as perceived by the imaging sensor 115 of device 100 of Figure 1.
  • Process 500 then moves to block 522 where a portion of the preview image is also displayed on the electronic display.
  • Embodiments illustrating the display functions of blocks 518-522 are described further in the explanation of Figure 8 and Figure 9 below.
  • Next, process 500 moves to block 524, where instructions calculate the horizontal disparity between image 1 and the preview image.
  • Block 524 may be performed by subroutines contained in the disparity removal module 140 of device 100, illustrated in Figure 1. Calculating the horizontal disparity may utilize one of the techniques described above with respect to Figure 2, including row summation, an orientation sensor, or points of interest matching.
  • Next, process 500 moves to block 526 where instructions display one or more indicators. Block 526 is explained in more detail in the discussion of Figure 10 below. After the indicators are displayed in block 526, process 500 returns to decision block 516. Process 500 then repeats as described above.
  • Turning to Figure 5B, process 550 begins at start state 555 and then moves to block 560 where process 550 waits for a second image capture command.
  • When the command occurs, process 550 captures the second image in block 565.
  • Block 565 may be implemented by instructions in the capture control module 170, or by the imaging sensor control module 135 of device 100, illustrated in Figure 1.
  • Process 550 then moves to block 570 where the second image is stored to a data store.
  • Process 550 then moves to block 575 where instructions turn off the stereoscopic capture mode. Note that when the stereoscopic capture mode is turned off by block 575 of process 550, decision block 516 of process 500 will transition process 500 to end block 530.
  • In this way, process 550 and process 500 interact to complete the capture of the stereoscopic image pair, while displaying a directional indicator on a display.
  • Figure 6 represents an alternative embodiment of a stereoscopic image pair capture process.
  • Process 600 may be implemented by instructions contained in capture control module 170 of device 100, illustrated in Figure 1. Unlike process 500 and process 550, process 600 captures the second stereoscopic image automatically.
  • Process 600 begins at start block 605 and then moves to block 607 where instructions implementing process 600 wait for an image capture command. When the image capture command occurs, process 600 moves to block 610 and instructions capture image 1.
  • Image 1 represents the first of two images required to create the stereoscopic image pair.
  • Next, process 600 moves to block 615 where instructions write image 1 to a data store.
  • A data store may include a nonvolatile memory such as a flash disk, external hard drive or the like, or it may be a volatile memory such as RAM or the working memory 105 illustrated in Figure 1.
  • Process 600 then moves to block 620 where instructions cause a portion of image 1 to be displayed on an electronic display.
  • Block 620 may be performed in some embodiments by instructions contained in user interface module 165 of device 100, illustrated in Figure 1.
  • The electronic display may be a display similar to display 125 of device 100, illustrated in Figure 1.
  • Process 600 then moves to block 625 where instructions capture a preview image.
  • The preview image may be a real time image captured from an imaging sensor, for example, the imaging sensor 115 of device 100, illustrated in Figure 1.
  • Process 600 then transitions to block 630, where a portion of the preview image is also displayed on an electronic display.
  • Block 630 may also be performed in some embodiments by instructions contained in user interface module 165 of device 100.
  • Process 600 then moves to block 635 where the horizontal disparity between the preview image and image 1 is calculated.
  • Block 635 may be performed by instructions contained in disparity removal module 140 of device 100, and use any of the techniques discussed earlier to calculate horizontal disparity, including a best fit of vectors created by summing the columns of image 1 and the preview image, points of interest matching between image 1 and the preview image, or utilization of an accelerometer to determine the relative position of image 1 with respect to a preview image.
  • Next, process 600 moves to decision block 640, where the current horizontal disparity is compared against the thresholds required for a second image. In some embodiments, if the current horizontal disparity is within parameters to produce an adequate stereoscopic image pair, process 600 moves to block 645 where the second image is captured. Process 600 then moves to end block 675. If the current horizontal disparity is outside the thresholds needed for an adequate stereoscopic image pair, process 600 moves to decision block 670.
  • At decision block 670, process 600 determines whether a user actuated control has been actuated.
  • In the illustrated embodiment, the stereoscopic imaging device provides for an automatic capture mode, but also allows the user to override the automatic capture process and capture the second image of a stereoscopic image pair manually. Block 670 provides for this capability. If the user actuated control has been actuated, process 600 moves to block 645, where the second image is captured. If no user actuated control has been actuated, process 600 moves from decision block 670 to decision block 655.
  • At decision block 655, the current horizontal disparity is compared against boundary thresholds. Boundary thresholds establish whether the horizontal disparity is so great as to require the imaging device to abort the automatic capture of a second image. Aborting the stereoscopic image capture may be required, for example, when the image processing algorithms are unable to determine a correlation between image 1 and the preview image. In such a case, it may be necessary to automatically exit the stereoscopic image capture mode to avoid spurious results for the user. If the current horizontal disparity is beyond these boundaries, process 600 moves to block 650 where an error is generated. Instructions implementing block 650 may generate an error message on display 125 of device 100, for example. These instructions may be contained within the user interface module 165 of device 100. If the horizontal disparity remains within boundaries such that a stereoscopic image pair may be captured, process 600 moves to block 660.
  • At block 660, one or more display indications are provided on an electronic display, for example, display 125 of device 100, illustrated in Figure 1.
  • Block 660 is discussed in more detail in the explanation of Figures 8-10 below. After the indicators are displayed in block 660, process 600 returns to block 625 and process 600 repeats.
  • Block 640 may include instructions implementing more complex techniques to determine whether a second image should be captured. For example, some embodiments may consider not only whether the current horizontal disparity is within an acceptable range, but also whether the current horizontal disparity is trending toward producing an even higher quality stereoscopic image pair, or conversely whether the trend is toward a lower quality stereoscopic image pair.
  • Figure 7 is a graph of horizontal disparity as a function of the pan distance of one imaging sensor. Acceptable stereoscopic images are normally produced when the horizontal disparity falls within the shaded portion of the graph, indicated by the bracketed area 760 of the y-axis.
  • The darkly shaded portions 720 and 730 represent horizontal disparity values that produce a stereoscopic image pair of acceptable quality.
  • The narrow lightly shaded region 740 represents optimal horizontal disparity.
  • The diagonal line 750 represents the horizontal disparity for one example imaging scenario.
  • The initial horizontal disparity between the first image and the preview image may be close to zero, as illustrated by point 705 in Figure 7.
  • As the imaging sensor pans, the horizontal disparity may enter the acceptable range while remaining sub-optimal. This condition is represented by point 770.
  • While a second image captured at this point will provide a stereoscopic image pair of acceptable quality, it will not be of optimal quality.
  • Therefore, some embodiments may wait to capture the second image. For example, waiting may result in a new horizontal disparity within the optimal zone, represented by point 780. Initiating the capture of the second image at this point may result in a stereoscopic image pair of significantly better quality than if the second image had been captured at point 770.
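The zone-and-trend logic of Figure 7 might be sketched as follows. Disparity is expressed as a fraction of image width; only the roughly 1/30-of-field-of-view optimum mentioned later is suggested by the text, so the acceptable band and abort limit below are illustrative placeholders.

```python
OPTIMAL_DISPARITY = 1.0 / 30.0               # ~1/30 of the field of view
ACCEPTABLE = (0.5 * OPTIMAL_DISPARITY,       # illustrative acceptable band
              1.5 * OPTIMAL_DISPARITY)
ABORT_LIMIT = 3.0 * OPTIMAL_DISPARITY        # illustrative boundary threshold

def auto_capture_decision(disparity, previous_disparity):
    """Return 'capture', 'wait', or 'abort' for the current preview frame."""
    d = abs(disparity)
    if d > ABORT_LIMIT:
        return "abort"   # correlation with image 1 likely lost; exit 3D mode
    if ACCEPTABLE[0] <= d <= ACCEPTABLE[1]:
        # If disparity is still trending toward the optimum, waiting may
        # yield a higher quality pair; otherwise capture before the
        # disparity leaves the acceptable zone.
        still_improving = (abs(d - OPTIMAL_DISPARITY)
                           < abs(abs(previous_disparity) - OPTIMAL_DISPARITY))
        return "wait" if still_improving else "capture"
    return "wait"
```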
  • Figure 8 represents one embodiment of an image displayed on an electronic display that includes a directional indicator for capturing a second image.
  • Image 810 represents the first image captured in a stereoscopic image sequence.
  • Image 810 might be image 1 of process 500 or the first image captured in block 210 of process 200.
  • Image 810 may be captured by imaging sensor 115 of device 100 of Figure 1.
  • Image 820 represents one embodiment of an image on an electronic display, which includes several displayed elements.
  • The upper half of the display includes the upper half of image 810. This portion of the display may be controlled by block 518 of process 500 in some embodiments. Other embodiments may control this portion of the display with block 620 of process 600.
  • The lower half of the displayed image 820 includes another image portion. In the illustrated embodiment, this image portion is a preview image.
  • The display of the preview image may be controlled by block 522 of process 500.
  • Alternatively, block 630 of process 600 may control this portion of the display in some embodiments.
  • The upper half of display 820 also includes an arrow 830.
  • The arrow indicates which direction the imaging sensor should be moved to provide optimal horizontal disparity between image 810, displayed in the upper half of display 820, and the preview image, displayed in the lower half of display 820.
  • Arrow 830 may change color to indicate the stereoscopic quality that will be achieved if the snapshot is captured in the current camera position. For example, when stereoscopic quality is far from optimal, the arrow may be red. In some embodiments, arrow 830 may transition to a yellow color as horizontal disparity transitions into a reasonable yet suboptimal zone.
  • The length of arrow 830 may also extend or contract depending on the amount of additional imaging sensor displacement needed to achieve an optimal horizontal disparity. When horizontal disparity achieves an optimal position, in some embodiments the arrow may transition to a different symbol, for example a green light. Alternatively, the arrow may disappear entirely or change to another form.
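One way the arrow's direction, color, and length might be derived from the current and optimal horizontal disparity is sketched below; the color band edges and minimum length are illustrative, while the 75-percent cap echoes the limit discussed with process 1000 later.

```python
def arrow_state(current_disparity, optimal_disparity, display_width):
    """Map the remaining displacement to the on-screen arrow's state."""
    remaining = optimal_disparity - current_disparity
    error = abs(remaining) / optimal_disparity     # 0.0 means optimal

    if error < 0.1:                                # close enough to optimal
        return {"symbol": "green_light"}           # arrow becomes a green light
    color = "yellow" if error < 0.5 else "red"     # illustrative bands

    direction = "right" if remaining > 0 else "left"
    # Length grows with the displacement still needed, clamped between a
    # perceivable minimum and 75 percent of the display width.
    length = int(min(0.75 * display_width,
                     max(0.05 * display_width, error * display_width)))
    return {"symbol": "arrow", "direction": direction,
            "color": color, "length": length}
```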
  • Displayed image 820 may also include a gridline or ruler indicator 840.
  • The gridline communicates the allowed horizontal shift between the first image, displayed in the upper half of the display 820 in this embodiment, and the preview image, displayed in the lower half of displayed image 820 in this embodiment.
  • The gridline may also change color to indicate the level of stereoscopic image quality achieved if the second image is captured in the present imaging sensor position. For example, in some embodiments, the gridline may be red when the current horizontal disparity will result in poor stereoscopic image quality. The gridline may become yellow when the horizontal disparity approaches a reasonable level, and green when the horizontal disparity provides for good stereoscopic image quality.
  • Some embodiments may also populate a portion of display image 820 with a dynamically calculated stereoscopic image quality indicator, such as status bar 850 shown in the lower portion of display image 820.
  • The small arrow above status bar 850 moves horizontally to indicate the stereoscopic image quality level if a second image is captured in the current sensor position.
  • Each color zone within the horizontal bar in the illustrated embodiment corresponds to a particular stereoscopic image pair quality level.
  • Image processing instructions may dynamically calculate the horizontal disparity between a first image and the current preview or real time image to determine where to position the arrow.
  • Other embodiments may choose a different form for their dynamic quality indicator.
  • Figure 9 represents another embodiment of an image displayed on an electronic display to provide guidance for capturing a second image.
  • The first image 910 represents the first image of a stereoscopic image pair.
  • Image 920 represents one embodiment of an image displayed on an electronic display after first image 910 has been captured. Of note is that in image 910, the hump in the brown bear's back is approximately centered in the image.
  • The brightly lit bear in image 920 is a preview image, or a real time image as currently perceived by a device imaging sensor.
  • The preview image in image 920 shows the bear's hump to the left of center. This shift in the bear's position is due to the imaging sensor being panned to the right after image 910 was captured.
  • The dimly lit image of the bear to the right of the brightly lit preview image corresponds to image 910, which is semi-transparently overlaid on the preview image.
  • The dimly lit bear may be displayed by block 620 of process 600 or block 518 of process 500.
  • The brightly lit bear may represent a preview or real time image, and may be displayed by block 522 of process 500 or block 630 of process 600.
  • As illustrated, this embodiment of a stereoscopic apparatus display also includes a directional indicator 930 and a quality indicator 940.
  • The directional indicator 930 indicates a need to pan further right.
  • The quality indicator communicates that a low quality stereoscopic image pair will result if the second image is captured at the present imaging sensor position. This is shown by the small arrow over a particular portion of the horizontal quality bar 940.
  • Figure 10 is a flow chart illustrating a process 1000 that runs within one embodiment of the capture control module 170 or user interface module 165 of Figure 1.
  • Process 1000 starts at start state 1005 and then moves to block 1010.
  • In some embodiments, block 1010 is implemented by instructions in the capture control module 170 that determine the horizontal disparity between a first image and the current preview or real time image.
  • Alternatively, the capture control module 170 may call subroutines in the disparity removal module 140, which includes instructions to determine a horizontal disparity between two images in some embodiments.
  • Next, process 1000 moves to block 1020, where an initial direction for a directional indication is determined.
  • An example directional indication is the arrow 830 of Figure 8.
  • Some embodiments may default to a left direction after the capture of image 1 in block 512 of process 500, illustrated in Figure 5A, or block 610 of process 600, illustrated in Figure 6.
  • Other embodiments may default to a right direction.
  • In other embodiments, the direction of arrow 830 may be determined by the horizontal disparity between the first image and the current preview or real time image.
  • Next, process 1000 moves to block 1030, where the length of the indicator is determined.
  • Some embodiments may maintain an indicator of fixed length, while other embodiments may vary the length of the directional indicator based on the horizontal disparity between the first captured image and the preview or real time image.
  • The length of the directional indicator may also vary based on the distance between an optimal horizontal disparity and the current horizontal disparity. Additionally, some embodiments may limit the maximum and minimum length of the directional indicator. For example, in some embodiments, the directional indicator length may be limited to seventy-five percent of the horizontal width of the display. Conversely, the length may be limited by a lower bound when the horizontal disparity is close to optimal. This lower bound may be determined by the size of the display, or may be limited to a size easily perceivable by the human eye.
  • When the horizontal disparity approaches optimum, the directional indicator may transition from one form to another.
  • For example, the directional indicator may transform from an arrow, when horizontal disparity is not optimal, to a green light, represented by a green circle or other green indicator, indicating that the second picture can now be taken.
  • Other embodiments may transition to a flashing indicator, to communicate a degree of urgency in capturing the second image.
  • Next, process 1000 moves to block 1040, where any old directional indicator is erased (if present) and the new indicator is displayed based on the parameters determined in blocks 1020 and 1030. Therefore, a capture control module 170 or user interface module 165 containing instructions that perform blocks 1010-1040 described above represents one means for displaying a directional indicator that indicates a direction to move the imaging sensor.
  • Process 1000 then moves to block 1050, where the delta between the current horizontal disparity and an optimal horizontal disparity is mapped to a particular quality indication.
  • The quality indicator may resemble item 850 of Figure 8.
  • Optimal horizontal disparity can be approximately 1/30 of the field of view.
  • With the disparity at this optimum, the illustrated embodiment of Figure 8 will place the small arrow above the horizontal quality bar 850 over a green portion of the bar.
  • The region to the right of the small arrow represents too small a horizontal disparity, while the region to the left of the arrow represents progressively larger horizontal disparities, with a bright red region at the far left end of the bar representing the worst position for capturing a second image.
  • Finally, process 1000 moves to block 1070, where a gridline is displayed on the screen.
  • The gridline may be item 840 of Figure 8.
  • The gridline communicates the optimal horizontal disparity between the first image and the real time or preview image. Since optimal horizontal disparity is typically found at approximately 1/30 of the field of view, the length of the gridline will typically be in that proportion to the width of the display. After the gridline has been displayed, process 1000 moves to end state 1080.
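To make the proportions concrete, a small sketch under the stated 1/30 guideline; the quality band edges are illustrative placeholders, not values from the text.

```python
def gridline_length(display_width):
    """Optimal disparity is ~1/30 of the field of view, so the gridline
    is drawn in that proportion to the display width."""
    return display_width / 30.0

def quality_zone(current_disparity, optimal_disparity):
    """Map the delta from the optimal disparity to a color zone on the
    quality bar."""
    delta = abs(current_disparity - optimal_disparity) / optimal_disparity
    if delta < 0.2:
        return "green"    # near-optimal stereoscopic quality
    if delta < 0.6:
        return "yellow"   # acceptable but sub-optimal
    return "red"          # poor stereoscopic quality
```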
  • The various illustrative logical blocks and modules described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), or a field programmable gate array (FPGA).
  • A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
  • A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
  • A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, a hard disk, a removable disk, a CD-ROM, or any other form of non-transitory storage medium known in the art.
  • An exemplary computer-readable storage medium is coupled to the processor such that the processor can read information from, and write information to, the computer-readable storage medium.
  • In the alternative, the storage medium may be integral to the processor.
  • The processor and the storage medium may reside in an ASIC.
  • The ASIC may reside in a user terminal, camera, or other device.
  • Alternatively, the processor and the storage medium may reside as discrete components in a user terminal, camera, or other device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Studio Devices (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
EP15178955.9A 2011-08-12 2012-08-06 Systems and methods to capture a stereoscopic image pair Withdrawn EP2955916A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/208,838 US9191649B2 (en) 2011-08-12 2011-08-12 Systems and methods to capture a stereoscopic image pair
EP12745985.7A EP2742696B1 (en) 2011-08-12 2012-08-06 Systems and methods to capture a stereoscopic image pair

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
EP12745985.7A Division-Into EP2742696B1 (en) 2011-08-12 2012-08-06 Systems and methods to capture a stereoscopic image pair
EP12745985.7A Division EP2742696B1 (en) 2011-08-12 2012-08-06 Systems and methods to capture a stereoscopic image pair

Publications (1)

Publication Number Publication Date
EP2955916A1 true EP2955916A1 (en) 2015-12-16

Family

ID=46642659

Family Applications (2)

Application Number Title Priority Date Filing Date
EP12745985.7A Active EP2742696B1 (en) 2011-08-12 2012-08-06 Systems and methods to capture a stereoscopic image pair
EP15178955.9A Withdrawn EP2955916A1 (en) 2011-08-12 2012-08-06 Systems and methods to capture a stereoscopic image pair

Family Applications Before (1)

Application Number Title Priority Date Filing Date
EP12745985.7A Active EP2742696B1 (en) 2011-08-12 2012-08-06 Systems and methods to capture a stereoscopic image pair

Country Status (6)

Country Link
US (1) US9191649B2 (en)
EP (2) EP2742696B1 (en)
JP (1) JP6042434B2 (ja)
KR (1) KR101958044B1 (ko)
CN (1) CN103733617B (zh)
WO (1) WO2013025391A2 (en)

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103026716A (zh) * 2010-07-27 2013-04-03 松下电器产业株式会社 摄像装置
KR101773616B1 (ko) * 2011-05-16 2017-09-13 엘지디스플레이 주식회사 영상처리방법과 이를 이용한 입체영상 표시장치
US9402065B2 (en) * 2011-09-29 2016-07-26 Qualcomm Incorporated Methods and apparatus for conditional display of a stereoscopic image pair
US20130107008A1 (en) * 2011-10-31 2013-05-02 Nokia Corporation Method, apparatus and computer program product for capturing images
KR101873747B1 (ko) * 2011-12-27 2018-07-03 엘지전자 주식회사 이동 단말기 및 그 제어방법
KR101918030B1 (ko) * 2012-12-20 2018-11-14 삼성전자주식회사 하이브리드 멀티-뷰 랜더링 방법 및 장치
CN104113684B (zh) * 2013-04-15 2017-09-22 宏达国际电子股份有限公司 控制方法及电子装置
EP2988093B1 (en) * 2013-04-19 2019-07-17 Toppan Printing Co., Ltd. Three-dimensional shape measurement device, three-dimensional shape measurement method, and three-dimensional shape measurement program
DE102013014536B4 (de) * 2013-09-03 2015-07-09 Sew-Eurodrive Gmbh & Co Kg Verfahren zur Übertragung von Information und Vorrichtung zur Durchführung des Verfahrens
CN103945207B (zh) * 2014-04-24 2015-09-02 浙江大学 一种基于视点合成的立体图像垂直视差消除方法
CN105306921A (zh) * 2014-06-18 2016-02-03 中兴通讯股份有限公司 一种基于移动终端的三维照片拍摄方法及移动终端
US10089396B2 (en) * 2014-07-30 2018-10-02 NthGen Software Inc. System and method of a dynamic interface for capturing vehicle data
WO2016113429A2 (en) * 2015-01-16 2016-07-21 Imra Europe S.A.S. Self-rectification of stereo camera
EP3089449B1 (en) * 2015-04-30 2020-03-04 InterDigital CE Patent Holdings Method for obtaining light-field data using a non-light-field imaging device, corresponding device, computer program product and non-transitory computer-readable carrier medium
US9813621B2 (en) * 2015-05-26 2017-11-07 Google Llc Omnistereo capture for mobile devices
US10341543B2 (en) * 2016-04-28 2019-07-02 Qualcomm Incorporated Parallax mask fusion of color and mono images for macrophotography
CN107046638A (zh) * 2016-12-30 2017-08-15 无锡易维视显示技术有限公司 单摄像头的3d影像拍摄方法
WO2018223267A1 (en) * 2017-06-05 2018-12-13 Shanghaitech University Method and system for hyperspectral light field imaging
JP2019057840A (ja) * 2017-09-21 2019-04-11 トヨタ自動車株式会社 撮像装置
US11721010B2 (en) 2019-09-22 2023-08-08 Openlane, Inc. Vehicle self-inspection apparatus and method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050100207A1 (en) * 1996-06-28 2005-05-12 Kurt Konolige Realtime stereo and motion analysis on passive video images using an efficient image-to-image comparison algorithm requiring minimal buffering
US20070248260A1 (en) * 2006-04-20 2007-10-25 Nokia Corporation Supporting a 3D presentation
KR100962329B1 (ko) * 2009-02-05 2010-06-10 연세대학교 산학협력단 스테레오 카메라 영상으로부터의 지면 추출 방법과 장치 및이와 같은 방법을 구현하는 프로그램이 기록된 기록매체
US20110141227A1 (en) * 2009-12-11 2011-06-16 Petronel Bigioi Stereoscopic (3d) panorama creation on handheld device

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4754327A (en) 1987-03-20 1988-06-28 Honeywell, Inc. Single sensor three dimensional imaging
US4966436A (en) 1987-04-30 1990-10-30 Christopher A. Mayhew Apparatus for obtaining images for use in displaying a three-dimensional
US5157484A (en) 1989-10-23 1992-10-20 Vision Iii Imaging, Inc. Single camera autosteroscopic imaging system
JPH08262527A (ja) * 1995-03-24 1996-10-11 Canon Inc カメラ
US5883695A (en) 1997-09-19 1999-03-16 Paul; Eddie Method and apparatus for producing stereoscopic images with single sensor
GB9810553D0 (en) 1998-05-15 1998-07-15 Tricorder Technology Plc Method and apparatus for 3D representation
WO2001097173A1 (en) * 2000-06-15 2001-12-20 Lifef/X Networks, Inc. Basis functions of three-dimensional models for compression, transformation and streaming
JP2003244727A (ja) * 2002-02-13 2003-08-29 Pentax Corp ステレオ画像撮像装置
GB2405764A (en) * 2003-09-04 2005-03-09 Sharp Kk Guided capture or selection of stereoscopic image pairs.
CN1998153A (zh) 2004-05-10 2007-07-11 辉达公司 用于视频数据的处理器
JP2006005417A (ja) * 2004-06-15 2006-01-05 Canon Inc 撮影装置
JP4712661B2 (ja) 2006-09-22 2011-06-29 オリンパスイメージング株式会社 撮像装置
JP4828486B2 (ja) * 2007-08-14 2011-11-30 富士フイルム株式会社 デジタルカメラ、撮影方法及び撮影プログラム
US20100316282A1 (en) 2009-06-16 2010-12-16 Hope Clinton B Derivation of 3D information from single camera and movement sensors
US20110025830A1 (en) 2009-07-31 2011-02-03 3Dmedia Corporation Methods, systems, and computer-readable storage media for generating stereoscopic content via depth map creation
KR101661969B1 (ko) 2009-11-02 2016-10-04 엘지전자 주식회사 휴대 단말기 및 그 동작 제어방법
JP2013062557A (ja) * 2010-01-14 2013-04-04 Panasonic Corp デジタル撮影装置及び、3d撮影方法

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050100207A1 (en) * 1996-06-28 2005-05-12 Kurt Konolige Realtime stereo and motion analysis on passive video images using an efficient image-to-image comparison algorithm requiring minimal buffering
US20070248260A1 (en) * 2006-04-20 2007-10-25 Nokia Corporation Supporting a 3D presentation
KR100962329B1 (ko) * 2009-02-05 2010-06-10 연세대학교 산학협력단 스테레오 카메라 영상으로부터의 지면 추출 방법과 장치 및이와 같은 방법을 구현하는 프로그램이 기록된 기록매체
US20110141227A1 (en) * 2009-12-11 2011-06-16 Petronel Bigioi Stereoscopic (3d) panorama creation on handheld device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
RALPH ALTMANN: "3D-Bearbeitung", vol. 2/11, 30 June 2011 (2011-06-30), pages 28 - 31, XP002684266, Retrieved from the Internet <URL:http://www.heise.de/artikel-archiv/df/2011/2/16_kiosk> [retrieved on 20150902] *

Also Published As

Publication number Publication date
WO2013025391A3 (en) 2013-04-11
KR20140053313A (ko) 2014-05-07
KR101958044B1 (ko) 2019-03-13
US9191649B2 (en) 2015-11-17
EP2742696B1 (en) 2015-11-04
JP6042434B2 (ja) 2016-12-14
EP2742696A2 (en) 2014-06-18
US20130038701A1 (en) 2013-02-14
CN103733617B (zh) 2015-11-25
JP2014527756A (ja) 2014-10-16
CN103733617A (zh) 2014-04-16
WO2013025391A2 (en) 2013-02-21

Similar Documents

Publication Publication Date Title
US9191649B2 (en) Systems and methods to capture a stereoscopic image pair
US10171791B2 (en) Methods and apparatus for conditional display of a stereoscopic image pair
US10389948B2 (en) Depth-based zoom function using multiple cameras
CN107690649B (zh) 数字拍摄装置及其操作方法
CA2969482C (en) Method and apparatus for multiple technology depth map acquisition and fusion
JP5365885B2 (ja) 手持ち式電子装置、それに適用される二重像取得方法及びそれにロードされるプログラム
CA2941143C (en) System and method for multi-focus imaging
US9516223B2 (en) Motion-based image stitching
EP3544286B1 (en) Focusing method, device and storage medium
EP2562715A1 (en) Portable electric equipment and method of processing a series of frames
KR20160051803A (ko) 인터랙티브 이미지 합성
CN107105157A (zh) 从手持设备所捕获的多个图像进行肖像图像合成
EP3316568B1 (en) Digital photographing device and operation method therefor
CN105120172A (zh) 一种移动终端前后置摄像头拍照方法及移动终端
JP2014526823A (ja) 立体画像ペアの改善された切り取りのための方法および装置
CN116724558A (zh) 由手持式设备以横向模式拍摄视频的技术
CN106922181B (zh) 方向感知自动聚焦
US9843715B2 (en) Photographic apparatus, stroboscopic image prediction method, and a non-transitory computer readable storage medium storing stroboscopic image prediction program
KR20120085556A (ko) 디지털 촬영 장치 및 그의 이미지 제공 방법
WO2022178781A1 (en) Electric device, method of controlling electric device, and computer readable storage medium
KR20150016871A (ko) 촬상 장치, 표시 장치, 촬상 방법 및 촬상 프로그램
CN104902161A (zh) 一种信息处理方法及电子设备

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20150729

AC Divisional application: reference to earlier application

Ref document number: 2742696

Country of ref document: EP

Kind code of ref document: P

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

17Q First examination report despatched

Effective date: 20160628

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20160909