EP2494524A2 - Algorithms for estimating precise and relative object distances in a scene - Google Patents

Algorithms for estimating precise and relative object distances in a scene

Info

Publication number
EP2494524A2
EP2494524A2 (application EP10842442A)
Authority
EP
European Patent Office
Prior art keywords
image
distance
curve information
focus position
generating
Legal status
Granted
Application number
EP10842442A
Other languages
German (de)
English (en)
Other versions
EP2494524B1 (fr)
EP2494524A4 (fr)
Inventor
Earl Wong
Pingshan Li
Mamoru Sugiura
Makibi Nakamura
Hidenori Kushida
Yoshihiro Murakami
Soroj Triteyaprasert
Current Assignee
Sony Group Corp
Sony Electronics Inc
Original Assignee
Sony Corp
Sony Electronics Inc
Application filed by Sony Corp, Sony Electronics Inc
Publication of EP2494524A2 (fr)
Publication of EP2494524A4 (fr)
Application granted
Publication of EP2494524B1 (fr)
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/55 Depth or shape recovery from multiple images
    • G06T7/571 Depth or shape recovery from multiple images from focus
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10 Geometric effects
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G06T2207/10141 Special mode during image acquisition
    • G06T2207/10148 Varying focus

Definitions

  • The present invention relates to the field of image processing. More specifically, the present invention relates to determining object distances in a scene.
  • A depth map (DM) is a map that shows, for every pixel, the distance from the camera sensor to the corresponding point in the scene.
  • Traditional methods of DM generation include range sensors, which use acoustic waves, project laser patterns or scan the scene by some other means to measure the distance from the camera, and stereoscopic systems, which use two or more cameras/lenses to acquire multiple images of the scene and then match them in order to triangulate the points in the scene. In both cases, single-lens cameras require additional hardware to generate the DM.
  • Two picture matching curve information is able to be used to determine precise object distance or relative object distance in a scene. Acquiring two images with different blur information, in addition to the curve information, enables a device to determine distance information of objects in a scene.
  • The distance information is able to be used in image processing, including generating a depth map, which is then able to be used in many imaging applications.
  • A method implemented on a device comprises acquiring a first image of a scene, acquiring a second image of the scene and utilizing curve information to determine a device-to-object distance of an object in the scene.
  • The curve information is precalculated. Utilizing the curve information includes: determining the number of convolutions used to blur one of the first image and the second image to the blurriness of the other of the first image and the second image, using the number of convolutions to determine an object-to-focus position distance based on the curve information, computing a device-to-focus position distance and adding the object-to-focus position distance and the device-to-focus position distance to determine the device-to-object distance.
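The distance computation described above can be sketched as a short Python routine. The curve is represented here as (iterations, object-to-focus distance) samples with linear interpolation between them; the function names and the sample values are illustrative assumptions, not taken from the patent.

```python
from bisect import bisect_left

def object_to_focus_distance(iterations, curve):
    """Interpolate an object-to-focus position distance from a
    precalculated matching curve, given the matched convolution count.
    `curve` is a list of (iterations, distance) pairs sorted by
    iterations (hypothetical representation)."""
    xs = [it for it, _ in curve]
    ys = [d for _, d in curve]
    if iterations <= xs[0]:
        return ys[0]
    if iterations >= xs[-1]:
        return ys[-1]
    i = bisect_left(xs, iterations)
    t = (iterations - xs[i - 1]) / (xs[i] - xs[i - 1])
    return ys[i - 1] + t * (ys[i] - ys[i - 1])

def device_to_object_distance(iterations, curve, device_to_focus):
    """Total distance: the device-to-focus position distance plus the
    object-to-focus position distance read off the curve."""
    return device_to_focus + object_to_focus_distance(iterations, curve)

# Illustrative calibration samples and focus distance (made-up values)
curve = [(0, 0.0), (10, 1.0), (20, 2.0)]
total = device_to_object_distance(15, curve, 3.0)  # 3.0 + 1.5
```

On a real device the convolution count would come from blur matching between the two captures, and the curve from a prior calibration step.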
  • The curve information includes multiple curves. The method further comprises generating the curve information.
  • Generating the curve information includes acquiring multiple images at different blur quantities.
  • The first image and the second image have a different blur amount.
  • The different blur amount is achieved by changing the focus position between acquiring the first image and the second image.
  • The method further comprises generating a depth map.
  • The method further comprises storing the depth map.
  • The method further comprises utilizing the depth map to perform an application.
  • The application is selected from the group consisting of auto focus, auto exposure, zoom setting, aperture setting, flash setting, shutter speed, white balance, noise reduction, gamma correction, motion estimation, image/video compression, generating blur, quality improvement, generating a 3-D image, shadow removal and object segmentation.
  • The device is selected from the group consisting of a personal computer, a laptop computer, a computer workstation, a server, a mainframe computer, a handheld computer, a personal digital assistant, a cellular/mobile telephone, a smart appliance, a gaming console, a digital camera, a digital camcorder, a camera phone, an iPod®, a video player, a DVD writer/player, a television and a home entertainment system.
  • The distance calculation module utilizes the curve information by: determining a number of convolutions used to blur one of the first image and the second image to a blurriness of the other of the first image and the second image, using the number of convolutions to determine an object-to-focus position distance based on the curve information, computing a device-to-focus position distance and adding the object-to-focus position distance and the device-to-focus position distance to determine the device-to-object distance.
  • The curve information is determined by acquiring target data of multiple images at different blur quantities.
  • The curve information includes multiple curves.
  • The first image and the second image have a different blur amount. The different blur amount is achieved by changing the focus position between acquiring the first image and the second image.
  • The system further comprises a depth map generation module operatively coupled to the distance calculation module, the depth map generation module configured for generating a depth map.
  • The depth map is stored.
  • The depth map is utilized to perform an application.
  • The application is selected from the group consisting of auto focus, auto exposure, zoom setting, aperture setting, flash setting, shutter speed, white balance, noise reduction, gamma correction, motion estimation, image/video compression, generating blur, quality improvement, generating a 3-D image, shadow removal and object segmentation.
  • The device is selected from the group consisting of a personal computer, a laptop computer, a computer workstation, a server, a mainframe computer, a handheld computer, a personal digital assistant, a cellular/mobile telephone, a smart appliance, a gaming console, a digital camera, a digital camcorder, a camera phone, an iPod®, a video player, a DVD writer/player, a television and a home entertainment system.
  • A device comprises a memory for storing an application, the application configured for acquiring a first image of a scene, acquiring a second image of the scene and utilizing curve information to determine a distance of one or more objects in the scene, and a processing component coupled to the memory, the processing component configured for processing the application.
  • Utilizing the curve information includes determining a number of convolutions used to blur one of the first image and the second image to a blurriness of the other of the first image and the second image, using the number of convolutions to determine an object-to-focus position distance based on the curve information, computing a device-to-focus position distance and adding the object-to-focus position distance and the device-to-focus position distance to determine the device-to-object distance.
  • The curve information is predetermined.
  • The curve information includes multiple curves.
  • The application is further configured for generating the curve information.
  • Generating the curve information includes acquiring multiple images at different blur quantities.
  • The first image and the second image have a different blur amount.
  • The different blur amount is achieved by changing the focus position between acquiring the first image and the second image.
  • The application is further configured for generating a depth map.
  • The depth map is stored.
  • The application is further configured for utilizing the depth map to perform an imaging application.
  • The imaging application is selected from the group consisting of auto focus, auto exposure, zoom setting, aperture setting, flash setting, shutter speed, white balance, noise reduction, gamma correction, motion estimation, image/video compression, generating blur, quality improvement, generating a 3-D image, shadow removal and object segmentation.
  • The device is selected from the group consisting of a personal computer, a laptop computer, a computer workstation, a server, a mainframe computer, a handheld computer, a personal digital assistant, a cellular/mobile telephone, a smart appliance, a gaming console, a digital camera, a digital camcorder, a camera phone, an iPod®, a video player, a DVD writer/player, a television and a home entertainment system.
  • A method of generating curve information on a device comprises: (a) acquiring a first image and a second image of a test object in a scene with a changed focus position for the second image, (b) computing a change in blur between the first image and the second image and (c) repeating (a) and (b) for a plurality of different focus positions to generate the curve information.
  • The method further comprises identifying the test object in the scene.
  • The image of the test object is acquired for a fixed zoom and aperture. Generating the curve information occurs while calibrating the device.
  • The curve information is stored on the device.
  • The device is selected from the group consisting of a personal computer, a laptop computer, a computer workstation, a server, a mainframe computer, a handheld computer, a personal digital assistant, a cellular/mobile telephone, a smart appliance, a gaming console, a digital camera, a digital camcorder, a camera phone, an iPod®, a video player, a DVD writer/player, a television and a home entertainment system.
  • FIG. 1 illustrates a graph of a curve according to some embodiments.
  • FIGs. 2A-B illustrate a diagram of a camera acquiring a picture according to some embodiments.
  • FIGs. 3A-B illustrate a diagram of using convolutions to affect the focus of a picture according to some embodiments.
  • FIG. 4 illustrates a diagram of convolutions and distance according to some embodiments.
  • FIG. 5 illustrates a curve according to some embodiments.
  • FIG. 6 illustrates an exemplary image and a set of convolutions to blur the image according to some embodiments.
  • FIG. 7 illustrates a diagram for determining distances in an image according to some embodiments.
  • FIG. 8 illustrates a diagram of a series of curves according to some embodiments.
  • FIG. 9 illustrates a flowchart of a method of utilizing a device to determine object distances according to some embodiments.
  • FIG. 10 illustrates a block diagram of an exemplary computing device configured for determining object distances according to some embodiments.
  • The inputs to this mathematical formulation are intrinsic camera parameters, such as focal length, aperture size and other information, as well as two picture matching curve information.
  • The two picture matching curve information is able to be used to determine precise object distance or relative object distance in a scene.
  • A two picture matching curve is a physical quantity computed from data captured by any imaging device that employs a Charge-Coupled Device (CCD) or Complementary Metal-Oxide-Semiconductor (CMOS) sensor.
  • A specific point on the two picture matching curve is generated in the following manner. For a fixed camera zoom and aperture, a picture of a test target or object is captured for a specific camera focus position. Next, the camera focus position is changed by some small quantity and a second picture is captured. The change in blur is then computed for the test target or object. The process is then repeated for different camera focus positions, thereby generating a two picture matching curve.
  • The computed two picture matching curve is theoretically linear when a Gaussian convolution kernel is applied. Otherwise, the curve is monotonically increasing/decreasing. Due to imperfections in the camera lens, linearity only exists over a fixed range, even if the camera focus position is changed by a fixed M depths of field for each new camera focus position relative to its previous camera focus position.
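The curve-generation loop can be simulated end to end in Python with synthetic Gaussian defocus: each "capture" is the same test pattern blurred by an amount that grows with the focus step, and the matched convolution count per step traces out one point of the curve. The blur model, kernel width and helper names are assumptions for illustration; for a Gaussian kernel the counts grow roughly linearly, consistent with the linearity noted above.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def matching_iterations(img, sigma_n, sigma_next, kernel_sigma=0.5, max_iters=300):
    """Number of small Gaussian convolutions needed to take the capture
    with blur sigma_n to the blur of the next capture (assumed model)."""
    current = gaussian_filter(img, sigma_n)    # picture at focus position n
    target = gaussian_filter(img, sigma_next)  # picture at focus position n+1
    best, count = float("inf"), 0
    for k in range(1, max_iters + 1):
        current = gaussian_filter(current, kernel_sigma)  # one iteration
        err = float(np.mean((current - target) ** 2))
        if err >= best:          # blurred past the best match: stop
            break
        best, count = err, k
    return count

rng = np.random.default_rng(0)
test_target = rng.random((64, 64))   # synthetic test pattern
curve = []
for n in range(1, 6):                # sweep focus positions
    s_n = 0.8 * n                    # modeled defocus blur at step n
    s_next = 0.8 * (n + 1)
    curve.append(matching_iterations(test_target, s_n, s_next))
# `curve` holds one matched-iteration count per focus step
```

With real captures the two blurs come from physically changing the focus position rather than from a synthetic model, but the matching loop is the same.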
  • An example of a captured two picture matching curve is shown in Figure 1.
  • The picture number references the captured pictures (associated with different camera focus positions).
  • An iterations number is computed and associated with non-overlapping regions in the scene.
  • The number is then able to be used to determine the relative order of the objects in a scene (e.g. which objects are closer to the camera and which are farther), or to approximate the actual distance of the objects in the scene.
  • The matrix containing the iteration information is defined to be an iteration map.
  • Suppose a scene in question has a rudimentary 3 x 3 iteration map.
  • In general, the dimensions of the iteration map are n x m.
  • The iteration map often contains non-overlapping regions that contain two distinct depths (such as a foreground object and a background object) corresponding to border regions. At these locations, the iteration map information is inaccurate.
  • A rudimentary filtering or clustering scheme is able to be used to improve the iteration map estimate at the non-border locations.
  • The information in the iteration map is then able to be used to determine relative and precise object distances.
  • The relative object distance is able to be determined, assuming the object in question comprises several non-overlapping regions / multiple neighboring entries in the iteration map.
  • If each adjacent camera position corresponds to a movement of M depths of field, then the iterations number is able to be used to compute the object distance. This is illustrated in Figure 1.
  • Suppose the iterations number for the object in question at the current focus position is K.
  • The distance of the object is then able to be determined to be K * M depths of field.
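As a toy numeric illustration of the iteration map and the depths-of-field scaling described above (the 3 x 3 values and M below are invented for the example, not taken from the patent):

```python
import numpy as np

# Hypothetical 3 x 3 iteration map: one matched-convolution count per
# non-overlapping region of the scene.
iteration_map = np.array([[2, 2, 5],
                          [2, 3, 5],
                          [3, 3, 6]])

# Relative order: entries with smaller counts correspond to regions
# nearer the focus position; equal counts share a depth.
nearest_region = np.unravel_index(np.argmin(iteration_map), iteration_map.shape)

# If each adjacent camera position corresponds to M depths of field,
# an entry with iterations number K approximates K * M depths of field.
M = 0.5  # assumed depths of field per focus step
distance_in_dof = iteration_map * M
```

A filtering or clustering pass over `iteration_map` would be applied before this scaling to clean up the inaccurate border entries.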
  • Each successive picture number location is computed from a mathematical formula.
  • Each successive picture number location is computed as the forward [or rear] depth of field location from some reference picture number 1 [or Nmax], or some other picture number in between 1 and Nmax.
  • The picture number locations are able to be computed using the code shown below (where H denotes the hyperfocal distance and distance_focus the focus distance):
  • Dno = (H * distance_focus) / (H + distance_focus);
  • s_new = distance_focus; % Dfo;
  • The rear depth of field locations are able to be computed and used.
  • The new picture number locations are able to be computed in the forward direction based on the "textbook" depth of field definition.
  • Here, distance_focus = 1, and both Dno1 and Dfo1 are computed using the above formulas.
  • The following equation is then solved:
  • Dno2 = Dfo1;
  • distance_focus2 = (Dno2 * H) / (H - Dno2);
  • The process is repeated to generate all subsequent distance focus positions:
  • Dfo2 = (H * distance_focus2) / (H - distance_focus2);
  • Dno3 = Dfo2;
  • Dfo1 = Dfo2;
  • The new picture number locations are computed in the reverse direction based on the "textbook" depth of field definition.
  • distance_focus = H/2
  • Both Dno1 and Dfo1 are computed using the above formulas.
  • Dfo2 = Dno1;
  • distance_focus2 = (Dfo2 * H) / (H + Dfo2);
  • The process is repeated to generate all subsequent distance focus positions:
  • Dno2 = (H * distance_focus2) / (H + distance_focus2);
  • The picture number locations are computed using a pre-determined mathematical formula. By iterating the mathematical formula, the object depth associated with a specific iterations number is then able to be determined.
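The forward recursion can be written out explicitly. The near/far limit formulas mirror the "textbook" expressions above, with H the hyperfocal distance; the function names and the convention of abutting successive depth-of-field ranges are illustrative.

```python
def dof_limits(s, H):
    """Near and far depth-of-field limits for focus distance s and
    hyperfocal distance H (thin-lens approximation, requires s < H)."""
    Dno = (H * s) / (H + s)   # near limit
    Dfo = (H * s) / (H - s)   # far limit
    return Dno, Dfo

def forward_focus_positions(s1, H, count):
    """Chain focus positions so each new picture's near limit equals
    the previous picture's far limit (forward direction)."""
    positions = [s1]
    s = s1
    for _ in range(count - 1):
        _, Dfo = dof_limits(s, H)
        # invert the near-limit formula: s = Dno * H / (H - Dno)
        s = (Dfo * H) / (H - Dfo)
        positions.append(s)
    return positions

positions = forward_focus_positions(1.0, 1000.0, 5)
```

The reverse direction is symmetric: set the new far limit to the previous near limit and invert the far-limit formula instead, s = (Dfo * H) / (H + Dfo).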
  • Figure 2A illustrates a diagram of a camera acquiring a picture according to some embodiments.
  • A camera viewing a scene has specified settings such as zoom (z1) and aperture size (a1).
  • The camera also has a focus position (f1). From this information, the distance (D1) is able to be computed.
  • D1 is the distance from the camera to where the camera is focused.
  • Figure 2B illustrates a diagram of a camera acquiring a picture according to some embodiments.
  • The zoom and aperture size are kept the same, but the focus position (f2) is changed. Then, from this information, the distance (D2) is able to be computed.
  • In general, distance = function(z1, a1, f1).
  • In Figure 2B, in contrast to Figure 2A, the image is focused closer to the sun, with the person blurred. The distance from the camera to where the camera is focused is D2, which is beyond the person.
  • Figure 3A illustrates a diagram of using convolutions to affect the focus of a picture according to some embodiments.
  • Two pictures are taken; for example, the first picture is initially focused on a person with the sun blurred, and the second picture is focused closer to the sun, with the person blurred.
  • To match the blur of the two pictures, mathematical convolution operations are able to be applied.
  • The convolutions are iterated, and the closeness of the first picture and the second picture is checked after each iteration. Thus, each iteration blurs the person more.
  • At some iteration the blurring will match, and blurring beyond that will result in the first picture being blurred more than the second picture.
  • The number of convolutions is able to be recorded, such as M convolutions.
  • Figure 3B illustrates a diagram of using convolutions to affect the focus of a picture according to some embodiments. Similar to the convolutions to blur the person in Figure 3A, the sun is able to be blurred in N convolutions to achieve a matching blur.
  • Figure 4 illustrates a diagram of convolutions and distance according to some embodiments.
  • When the picture pair is beyond the focus point, the sign of the convolution operator is positive, and when the picture pair is before the focus point, the sign is negative.
  • The sign indicates direction. This enables obtaining a sequence of numbers depending on where the picture pair is taken.
  • From such a sequence, a curve is able to be generated. Then, using picture pairs, convolutions related to the picture pairs and the generated curve, distances are able to be determined as shown in Figure 5.
  • Figure 6 illustrates an exemplary image and a set of convolutions to blur the image according to some embodiments.
  • Figure 7 illustrates a diagram for determining distances in an image according to some embodiments.
  • The number of convolutions to blur the sharper car to the less sharp car of Figure 6 is L convolutions.
  • Via the curve, L convolutions results in a distance, d_car, for the car.
  • The focus position is known; thus, the distance from the focus position to the car is able to be determined.
  • The distance from the car to the camera is d_car + d_camera, where d_camera is the distance from the camera to the focus position.
  • The curve is generated for each camera.
  • The curve is generated and stored on the camera when the camera is calibrated.
  • In some embodiments, multiple curves are generated to improve performance.
  • The slopes of the curves may be slightly different depending on a number of factors, such as where the camera is focused, so it is possible that one curve is more appropriate to use than another.
  • In some embodiments, a curve is selected from the set of curves based on where the camera is focused.
  • In other embodiments, the curve is selected based on another factor. Acquiring a picture pair involves capturing two pictures with some fraction of depth of field separation.
  • The depth of field separation is a rational number. For example, the separation is one depth of field, two depths of field, one half of a depth of field or another fraction.
  • One or more curves are stored within a device such as a camera.
  • Figure 8 illustrates a diagram of a series of curves according to some embodiments. The focus positions of the two pictures are used to determine which curve is appropriate, and what the distance is based on that curve. The curves are able to be modified if camera parameters are changed. Using the curves and the information described above, distances of objects in a picture, and a depth map, are able to be determined. The depth map is able to be determined by establishing distances for many objects in a scene, which are then mapped out, thus generating the depth map. The curves are able to be stored in any data structure, such as a lookup table.
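A minimal sketch of such a curve selection rule, assuming each stored curve is tagged with the focus position it was calibrated at (the dict layout, sample values and nearest-match criterion are assumptions, not from the patent):

```python
def select_curve(curves, focus_position):
    """Pick the stored curve whose calibration focus position is
    closest to where the camera is currently focused."""
    return min(curves, key=lambda c: abs(c["focus_position"] - focus_position))

# Illustrative lookup table: curves keyed by calibration focus position,
# each with (iterations, distance) samples.
stored_curves = [
    {"focus_position": 0.5, "samples": [(0, 0.0), (10, 0.4)]},
    {"focus_position": 2.0, "samples": [(0, 0.0), (10, 1.0)]},
    {"focus_position": 8.0, "samples": [(0, 0.0), (10, 2.5)]},
]
chosen = select_curve(stored_curves, 2.3)  # nearest calibration point wins
```

Other selection factors (zoom, aperture) could be folded into the key function in the same way.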
  • Figure 9 illustrates a flowchart of a method of utilizing a device to determine object distances according to some embodiments.
  • A first image is acquired.
  • A second image is acquired.
  • The first image and the second image have different blur amounts.
  • A determined curve is utilized to determine a distance of one or more objects in the acquired images. The distance is computed by determining the number of convolutions used to blur a sharper image (e.g. the first image) to a less sharp image (e.g. the second image), then using the number of convolutions to determine the distance from the object to the focus position (e.g. the object-to-focus position distance) based on the curve (see Figure 5).
  • The distance from the camera to the focus position (e.g. the device-to-focus position distance) is able to be computed. Then, adding the distance from the camera to the focus position to the distance from the focus position to the object gives the total distance of the object from the camera.
  • Relative object distance is also able to be computed. For example, if focus-position-to-object distances are calculated for two separate objects, then those distances are able to be used to determine the relative object distance.
  • The curve is determined by acquiring images of a target with different blur quantities, as described above.
  • A depth map is generated based on the determined distances, in the step 906.
  • In some embodiments, the curve is determined and stored before the first image is acquired.
  • The depth map is stored. In some embodiments, the depth map is not stored and the step 908 is able to be skipped.
  • The depth map is utilized to perform applications such as those described below.
  • FIG. 10 illustrates a block diagram of an exemplary computing device configured for determining distances of objects in an image according to some embodiments.
  • The computing device 1000 is able to be used to acquire, store, compute, communicate and/or display information such as images and videos. For example, the computing device 1000 is able to acquire and store a picture, as well as use information from the acquired picture to perform calculations.
  • A hardware structure suitable for implementing the computing device 1000 includes a network interface 1002, a memory 1004, a processor 1006, I/O device(s) 1008, a bus 1010 and a storage device 1012.
  • The choice of processor is not critical as long as a suitable processor with sufficient speed is chosen.
  • The memory 1004 is able to be any conventional computer memory known in the art.
  • The storage device 1012 is able to include a hard drive, CDROM, CDRW, DVD, DVDRW, flash memory card or any other storage device.
  • The computing device 1000 is able to include one or more network interfaces 1002.
  • An example of a network interface includes a network card connected to an Ethernet or other type of LAN.
  • The I/O device(s) 1008 are able to include one or more of the following: keyboard, mouse, monitor, display, printer, modem, touchscreen, button interface and other devices.
  • Distance application(s) 1030 used to perform the distance methods are likely to be stored in the storage device 1012 and memory 1004 and processed as applications are typically processed. More or fewer components than shown in Figure 10 are able to be included in the computing device 1000. In some embodiments, distance processing hardware 1020 is included. Although the computing device 1000 in Figure 10 includes applications 1030 and hardware 1020 for distance applications, the distance applications are able to be implemented on a computing device in hardware, firmware, software or any combination thereof.
  • The distance application(s) 1030 include several applications and/or modules. In some embodiments, the distance application(s) 1030 include an image/picture/video acquisition module 1032 configured for acquiring multiple images/pictures/videos (e.g. a first image/picture/video and a second image/picture/video), a curve generation module 1034 configured for generating a curve, a distance calculation module 1036 configured for determining/calculating a distance of an object within the image/picture/video and a depth map generation module 1038 configured for generating a depth map.
  • Suitable computing devices include a personal computer, a laptop computer, a computer workstation, a server, a mainframe computer, a handheld computer, a personal digital assistant, a cellular/mobile telephone, a smart appliance, a gaming console, a digital camera, a digital camcorder, a camera phone, an iPod®, a video player, a DVD writer/player, a television, a home entertainment system or any other suitable computing device.
  • The applications utilizing distances and/or a depth map are able to set parameters including, but not limited to, auto focus, auto exposure, zoom setting, aperture setting, flash setting and shutter speed. These applications are able to be implemented automatically or manually by a user. Then, a user is able to acquire the image/video as he would acquire any image/video, such as pointing at a target and pressing a button. While the image/video is being acquired, additional applications are able to affect the image including, but not limited to, white balance, noise reduction, gamma correction, motion estimation and image/video compression.
  • Post processing is able to occur after the image/video is acquired.
  • The image is stored on the camera/camcorder or on another device such as a computer.
  • The user is able to perform post processing operations on the image/video.
  • In some embodiments, the post processing occurs automatically without user input. Examples of image post processing include, but are not limited to, generating blur, quality improvement, generating a 3-D image, shadow removal and object segmentation. All of these steps are able to benefit from the distance information and the depth map.
  • Image/video processing is able to be improved in a number of ways.
  • The curve is generated using acquired image information.
  • The number of convolutions to blur a sharper image to a more blurred image is able to be recorded and then used with the curve to determine a distance (such as an object-to-focal position distance).
  • The distance from the focal position to the camera is able to be calculated.
  • These two distances together give the distance from the object to the camera or other device.
  • Image acquisition is able to be improved by camera settings being configured appropriately before the image/video is acquired.
  • Image processing of the acquired image is able to be improved.
  • Post processing of the image is also able to be improved using the distance information as well. Improvements include more efficient processing, better quality images/videos, additional features and other improvements.


Abstract

According to the invention, two picture matching curve information is used to determine the precise object distance or the relative object distance in a scene. Acquiring two images having different blur information, in addition to the curve information, enables a device to determine distance information of objects in a scene. The distance information can be used in image processing, including generating a depth map, which is then used in many imaging applications.
EP10842442.5A 2009-12-16 2010-12-02 Algorithms for estimating precise and relative object distances in a scene Active EP2494524B1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US12/639,841 US8229172B2 (en) 2009-12-16 2009-12-16 Algorithms for estimating precise and relative object distances in a scene
PCT/US2010/058711 WO2011084279A2 (fr) 2009-12-16 2010-12-02 Algorithms for estimating precise and relative object distances in a scene

Publications (3)

Publication Number Publication Date
EP2494524A2 (fr) 2012-09-05
EP2494524A4 EP2494524A4 (fr) 2017-02-22
EP2494524B1 EP2494524B1 (fr) 2021-06-16

Family

Family ID: 44142954

Family Applications (1)

Application Number Title Priority Date Filing Date
EP10842442.5A Active EP2494524B1 (fr) Algorithms for estimating precise and relative object distances in a scene

Country Status (6)

Country Link
US (1) US8229172B2 (fr)
EP (1) EP2494524B1 (fr)
JP (1) JP2013512626A (fr)
KR (1) KR101429371B1 (fr)
CN (1) CN102640189B (fr)
WO (1) WO2011084279A2 (fr)

Families Citing this family (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8184196B2 (en) * 2008-08-05 2012-05-22 Qualcomm Incorporated System and method to generate depth data using edge detection
US8553093B2 (en) * 2008-09-30 2013-10-08 Sony Corporation Method and apparatus for super-resolution imaging using digital imaging devices
US8335390B2 (en) * 2010-03-22 2012-12-18 Sony Corporation Blur function modeling for depth of field rendering
JP2012003233A (ja) * 2010-05-17 2012-01-05 Sony Corp 画像処理装置、画像処理方法およびプログラム
US8488900B2 (en) * 2010-06-23 2013-07-16 Digimarc Corporation Identifying and redressing shadows in connection with digital watermarking and fingerprinting
US9307134B2 (en) * 2011-03-25 2016-04-05 Sony Corporation Automatic setting of zoom, aperture and shutter speed based on scene depth map
JP2013005091A (ja) * 2011-06-14 2013-01-07 Pentax Ricoh Imaging Co Ltd 撮像装置および距離情報取得方法
US8929607B2 (en) * 2011-12-01 2015-01-06 Sony Corporation System and method for performing depth estimation utilizing defocused pillbox images
KR101393869B1 (ko) * 2012-07-10 2014-05-12 엘지이노텍 주식회사 3D camera module and method of driving the same
US9336302B1 (en) 2012-07-20 2016-05-10 Zuci Realty Llc Insight and algorithmic clustering for automated synthesis
US9066002B2 (en) 2012-08-29 2015-06-23 Sony Corporation System and method for utilizing enhanced scene detection in a depth estimation procedure
JP5995614B2 (ja) * 2012-08-31 2016-09-21 キヤノン株式会社 Distance information estimation apparatus
US9894269B2 (en) * 2012-10-31 2018-02-13 Atheer, Inc. Method and apparatus for background subtraction using focus differences
CN104853172B (zh) * 2014-02-19 2017-11-07 联想(北京)有限公司 Information processing method and electronic device
US20150271467A1 (en) * 2014-03-20 2015-09-24 Neal Weinstock Capture of three-dimensional images using a single-view camera
US9418432B2 (en) * 2014-03-28 2016-08-16 Sony Corporation Imaging system with depth estimation mechanism and method of operation thereof
CN106416217A (zh) * 2014-04-17 2017-02-15 索尼公司 Depth-assisted scene recognition for a camera
US9804392B2 (en) 2014-11-20 2017-10-31 Atheer, Inc. Method and apparatus for delivering and controlling multi-feed data
CN105721852B (zh) * 2014-11-24 2018-12-14 奥多比公司 Method, storage device and system for determining depth-refined image capture instructions
WO2016114427A1 (fr) * 2015-01-16 2016-07-21 재단법인 다차원 스마트 아이티 융합시스템 연구단 Method and device for reducing computational complexity when estimating depth information from an image
US9639946B2 (en) * 2015-03-11 2017-05-02 Sony Corporation Image processing system with hybrid depth estimation and method of operation thereof
US9723197B2 (en) * 2015-03-31 2017-08-01 Sony Corporation Depth estimation from image defocus using multiple resolution Gaussian difference
WO2017007047A1 (fr) * 2015-07-08 2017-01-12 재단법인 다차원 스마트 아이티 융합시스템 연구단 Method and device for compensating spatial depth non-uniformity using jittered comparison
WO2017114846A1 (fr) * 2015-12-28 2017-07-06 Robert Bosch Gmbh Depth-sensing based system for detecting, tracking, estimating and identifying occupancy in real time
JP6377295B2 2016-03-22 2018-08-22 三菱電機株式会社 Distance measuring device and distance measuring method
SE541141C2 (en) * 2016-04-18 2019-04-16 Moonlightning Ind Ab Focus pulling with a stereo vision camera system
WO2018014254A1 (fr) 2016-07-20 2018-01-25 SZ DJI Technology Co., Ltd. Method and apparatus for zooming relative to an object
US10027879B2 (en) * 2016-11-15 2018-07-17 Google Llc Device, system and method to provide an auto-focus capability based on object distance information
US11205103B2 (en) 2016-12-09 2021-12-21 The Research Foundation for the State University Semisupervised autoencoder for sentiment analysis
US10277889B2 (en) * 2016-12-27 2019-04-30 Qualcomm Incorporated Method and system for depth estimation based upon object magnification
CN106959075B (zh) * 2017-02-10 2019-12-13 深圳奥比中光科技有限公司 Method and system for accurate measurement using a depth camera
JP7002050B2 (ja) * 2018-07-11 2022-02-10 パナソニックIpマネジメント株式会社 Imaging device
US11115582B2 (en) * 2018-11-28 2021-09-07 Jvckenwood Corporation Imaging control apparatus, imaging apparatus, and recording medium
CN111435436B (zh) * 2019-12-13 2021-01-08 珠海大横琴科技发展有限公司 Perimeter intrusion prevention method and device based on target position
CN113936199B (zh) * 2021-12-17 2022-05-13 珠海视熙科技有限公司 Image object detection method and device, and camera equipment
KR20230155939A 2022-05-04 2023-11-13 한화비전 주식회사 Video analysis apparatus and method
KR20240040005A 2022-09-20 2024-03-27 한화비전 주식회사 Video analysis apparatus and method
WO2024063242A1 (fr) * 2022-09-20 2024-03-28 한화비전 주식회사 Apparatus and method for image analysis

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3481631B2 (ja) * 1995-06-07 2003-12-22 ザ トラスティース オブ コロンビア ユニヴァーシティー イン ザ シティー オブ ニューヨーク Apparatus and method for determining the three-dimensional shape of an object using active illumination and relative blurring in images caused by defocus
US6005607A (en) * 1995-06-29 1999-12-21 Matsushita Electric Industrial Co., Ltd. Stereoscopic computer graphics image generating apparatus and stereoscopic TV apparatus
JP4403477B2 (ja) * 2000-01-26 2010-01-27 ソニー株式会社 Image processing apparatus and image processing method
US20070016425A1 (en) * 2005-07-12 2007-01-18 Koren Ward Device for providing perception of the physical environment
US7929801B2 (en) 2005-08-15 2011-04-19 Sony Corporation Depth information for auto focus using two pictures and two-dimensional Gaussian scale space theory
JP4752733B2 (ja) * 2005-12-02 2011-08-17 ソニー株式会社 Imaging apparatus, imaging method, and method for designing an imaging apparatus
US20070189750A1 (en) * 2006-02-16 2007-08-16 Sony Corporation Method of and apparatus for simultaneously capturing and generating multiple blurred images
US7711201B2 (en) * 2006-06-22 2010-05-04 Sony Corporation Method of and apparatus for generating a depth map utilized in autofocusing
JP2008067093A (ja) * 2006-09-07 2008-03-21 Nikon Corp Camera system, image processing apparatus, and image processing program
US8280194B2 (en) 2008-04-29 2012-10-02 Sony Corporation Reduced hardware implementation for a two-picture depth map algorithm
US8199248B2 (en) * 2009-01-30 2012-06-12 Sony Corporation Two-dimensional polynomial model for depth estimation based on two-picture matching
TWI393980B (zh) * 2009-06-08 2013-04-21 Nat Univ Chung Cheng Method for calculating the depth of field and method for calculating the blurred state of an image

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2011084279A2 *

Also Published As

Publication number Publication date
KR20120099713A (ko) 2012-09-11
EP2494524B1 (fr) 2021-06-16
CN102640189B (zh) 2015-05-27
WO2011084279A2 (fr) 2011-07-14
CN102640189A (zh) 2012-08-15
WO2011084279A3 (fr) 2011-09-29
KR101429371B1 (ko) 2014-08-11
US8229172B2 (en) 2012-07-24
JP2013512626A (ja) 2013-04-11
US20110142287A1 (en) 2011-06-16
EP2494524A4 (fr) 2017-02-22

Similar Documents

Publication Publication Date Title
US8229172B2 (en) Algorithms for estimating precise and relative object distances in a scene
JP6742732B2 (ja) Method for generating an HDR image of a scene based on a trade-off between luminance distribution and motion
US9998666B2 (en) Systems and methods for burst image deblurring
CN107409166B (zh) Automatic generation of panning shots
Zhang et al. Denoising vs. deblurring: HDR imaging techniques using moving cameras
US9313419B2 (en) Image processing apparatus and image pickup apparatus where image processing is applied using an acquired depth map
US20130057714A1 (en) Image pickup device, image processing device, image processing method, and image processing program
US10306210B2 (en) Image processing apparatus and image capturing apparatus
EP2704423A1 (fr) Image processing apparatus, image processing method, and image processing program
US20120300115A1 (en) Image sensing device
CN110536057A (zh) Image processing method and apparatus, electronic device, and computer-readable storage medium
US9361704B2 (en) Image processing device, image processing method, image device, electronic equipment, and program
CN110084765B (zh) Image processing method, image processing apparatus and terminal device
CN112419161B (zh) Image processing method and apparatus, storage medium, and electronic device
JP6645711B2 (ja) Image processing apparatus, image processing method, and program
JP2009171341A (ja) Blur correction device and imaging device
CN112132879A (zh) Image processing method, apparatus and storage medium
CN111835968B (zh) Image sharpness restoration method and apparatus, and image capturing method and apparatus
CN112950692B (zh) Image depth-of-field processing method and system based on a mobile game platform
JP6548409B2 (ja) Image processing apparatus, control method and control program therefor, and imaging apparatus
Stupich Low Power Parallel Rolling Shutter Artifact Removal
CN115103108A (zh) Anti-shake processing method and apparatus, electronic device, and computer-readable storage medium
CN113055584A (zh) Focusing method based on degree of blur, lens controller, and camera module
Deever Improved image capture using liveview images

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20120531

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Ref document number: 602010067138

Country of ref document: DE

Free format text: PREVIOUS MAIN CLASS: G06T0015100000

Ipc: G06T0007571000

A4 Supplementary search report drawn up and despatched

Effective date: 20170124

RIC1 Information provided on ipc code assigned before grant

Ipc: G06T 7/571 20170101AFI20170118BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20191105

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20210113

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602010067138

Country of ref document: DE

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1402959

Country of ref document: AT

Kind code of ref document: T

Effective date: 20210715

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Ref country code: NL

Ref legal event code: FP

RAP4 Party data changed (patent owner data changed or rights of a patent transferred)

Owner name: SONY ELECTRONICS INC.

Owner name: SONY GROUP CORPORATION

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210616

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210616

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210916

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210616

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1402959

Country of ref document: AT

Kind code of ref document: T

Effective date: 20210616

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210916

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210917

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210616

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210616

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210616

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20211018

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210616

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210616

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210616

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210616

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210616

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210616

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210616

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210616

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602010067138

Country of ref document: DE

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210616

26N No opposition filed

Effective date: 20220317

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210616

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210616

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210616

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20211231

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20211202

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20211202

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20211231

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20211231

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20211231

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20220616

Year of fee payment: 13

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20101202

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210616

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230606

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: NL

Payment date: 20231121

Year of fee payment: 14

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20231121

Year of fee payment: 14

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20231122

Year of fee payment: 14

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20210616