US20160247285A1 - Image processing device and image depth processing method - Google Patents
- Publication number
- US20160247285A1 (application US14/706,090)
- Authority
- US
- United States
- Prior art keywords
- image
- depth
- foreground
- background
- background image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/51—Indexing; Data structures therefor; Storage structures
-
- G06T7/0051—
-
- G06F17/3028—
-
- G06K9/4661—
-
- G06K9/6215—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G06T7/0081—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/194—Segmentation; Edge detection involving foreground-background segmentation
-
- G06K2009/4666—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G06T2207/20144—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/12—Acquisition of 3D measurements of objects
Definitions
- The image processing device and the image depth processing method allow the user to set the number of layer images and the number of re-focus operations, to simulate the depth of field of any practical lens according to the aperture simulation parameters, and to increase the brightness of bright areas in advance so as to strengthen at least one bright area of the simulation image.
- In the image processing device and the image depth processing method, by blurring the foreground image, blurring a local image of the background image near the edge of the foreground image, and implementing a fading process to fade the margin image in the background image, a simulation image can be generated that satisfies the principles of geometric optics (that is, a simulation image having a graduated depth of field, which looks natural and continuous).
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Software Systems (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- General Engineering & Computer Science (AREA)
- Studio Devices (AREA)
- Image Processing (AREA)
- Studio Circuits (AREA)
Abstract
Description
- 1. Field of the Invention
- The instant disclosure relates to an image processing device and an image processing method; in particular, to an image processing device and an image processing method that can generate a simulation image having a depth of field according to a depth image and a reference image.
- 2. Description of Related Art
- As technology develops, smart phones and digital cameras have become smaller and smaller, which makes them portable so the user can take photos at any time. Generally speaking, the apertures of smart phones and digital cameras are small, so they can generate a clear photo no matter whether the captured scene is far or near; however, this also makes them unable to generate photographic images having a depth of field that emphasizes a certain object.
- On the other hand, the digital single lens reflex (DSLR) camera has a bigger aperture, so it can blur the image of an element outside the focal area while keeping a specific element within the focal area clear. However, a DSLR camera with a big aperture is bulky, costly and not as portable.
- Therefore, in recent years, many image depth processing methods have been applied to smart phones and digital cameras to blur part of the photographic image, in order to emphasize a certain element in the photographic image. However, the principles of geometric optics are not considered in traditional image depth processing methods, so it is hard to estimate the proper blurring degree of elements within and outside the focal area, which makes the blurred photographic images look unnatural or discontinuous.
- The instant disclosure provides an image processing device and an image depth processing method based on geometric optics. The image processing device executes the image depth processing method, so that it can generate a simulation image having a depth of field that looks natural and continuous according to a reference image and a depth image corresponding to the reference image.
- The instant disclosure also provides an image depth processing method. The image depth processing method comprises: each time obtaining a background image and a foreground image from a reference image according to an order indicating a depth of field shown in a depth image (from far to near), wherein the depth image corresponds to the reference image and a depth value of the background image is larger than a depth value of the foreground image; blurring the foreground image and a local image of the blurred background image that is near the margin of the foreground image; and after blurring the foreground image and the local image, forming a simulation image according to the foreground image and the background image.
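The far-to-near layering described above can be sketched in code. The following is a much-simplified, hypothetical Python illustration (a 1-D "image", a box blur standing in for the aperture-dependent blur, and premultiplied-alpha compositing; none of the names or parameter choices come from the disclosure itself):

```python
def box_blur(row, radius):
    """Box-blur a 1-D row of pixels; radius 0 leaves the row sharp."""
    if radius <= 0:
        return list(row)
    n = len(row)
    return [sum(row[max(0, i - radius):min(n, i + radius + 1)])
            / (min(n, i + radius + 1) - max(0, i - radius))
            for i in range(n)]

def simulate_dof(row, depth, focus_depth):
    """Composite depth layers from far to near. Each layer is blurred in
    proportion to its distance from the focus depth, and its mask is blurred
    too, so layer margins fade instead of cutting off sharply."""
    n = len(row)
    composite = [0.0] * n
    for d in sorted(set(depth), reverse=True):    # far -> near
        radius = abs(d - focus_depth)             # toy stand-in for the lens model
        layer = [row[i] if depth[i] == d else 0.0 for i in range(n)]
        mask = [1.0 if depth[i] == d else 0.0 for i in range(n)]
        layer_b = box_blur(layer, radius)         # premultiplied layer colour
        mask_b = box_blur(mask, radius)           # feathered coverage
        composite = [layer_b[i] + (1.0 - mask_b[i]) * composite[i]
                     for i in range(n)]
    return composite
```

With focus on the nearest layer, in-focus pixels survive unchanged while farther layers spread into soft gradients, which is the graduated depth of field the method aims for.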
- In one of the embodiments of the instant disclosure, after blurring the foreground image and the local image and before forming the simulation image, the image depth processing method further comprises: executing a fading process so as to fade the margin image of the background image.
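The fading process can be pictured as a cross-fade across a band around the margin. Below is a hypothetical one-dimensional sketch (the function name, the linear ramp, and the `fade_width` parameter are illustrative assumptions, not details taken from the disclosure):

```python
def fade_margin(background, local_blurred, margin_index, fade_width):
    """Blend the re-blurred local image into the once-blurred background
    across a band centred on the margin: fade the local image out and the
    background in, so the seam between the two blur levels disappears."""
    out = []
    for i in range(len(background)):
        # t ramps 0 -> 1 across the band (0 = local-image side, 1 = background side)
        t = min(max((i - margin_index) / fade_width + 0.5, 0.0), 1.0)
        out.append((1.0 - t) * local_blurred[i] + t * background[i])
    return out
```

Far from the margin each source is used unchanged; inside the band the two blur levels mix gradually, which is the "fade in / fade out" described for Step S209.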
- The instant disclosure also provides an image processing device, and the image processing device comprises a memory module and a processing module. The processing module is coupled to the memory module. The processing module is configured to execute the above mentioned image depth processing method. The memory module is configured to store the reference image and the depth image.
- The instant disclosure further provides an image processing device, and the image processing device comprises an image capturing module and a processing module. The image capturing module is configured to capture images of a scene so as to generate a plurality of photographic images. The processing module is coupled to the image capturing module, to generate a reference image according to the photographic images and to generate a depth image corresponding to the reference image. The processing module each time obtains a background image and a foreground image from the reference image according to an order indicated by the depth of field shown in the depth image (from far to near), wherein a depth value of the background image is larger than a depth value of the foreground image. The processing module blurs the background image according to the depth value of the background image, and blurs the foreground image and a local image of the blurred background image that is near the margin of the foreground image according to the depth value of the foreground image. After the processing module blurs the foreground image and the local image, the processing module forms a simulation image according to the foreground image and the background image.
- To sum up, via the image processing device and the image processing method provided by the embodiments of the instant disclosure, after the image processing device blurs the background image according to the depth value of the background image, the image processing device blurs the foreground image and a local image of the background image according to the depth value of the foreground image, such that the transition between the foreground image and the background image can be explained by geometrical optics, which makes the image look natural and continuous. In addition, the image processing device can further execute a fading process, so as to fade the margin image of the background image and thereby generate a simulation image having a depth of field and a large aperture value.
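The appeal to geometrical optics can be made concrete with the standard thin-lens circle-of-confusion relation (textbook optics, not a formula quoted from this disclosure): a point away from the focused distance renders as a disk whose diameter grows with the aperture diameter and with the depth difference, which is why a layer's blurring degree can be driven by its depth difference from the reference depth. A small sketch:

```python
def coc_diameter(aperture, focal_len, focus_dist, subject_dist):
    """Thin-lens circle-of-confusion diameter on the sensor for a point at
    subject_dist when the lens is focused at focus_dist (all values in the
    same unit, e.g. mm). Bigger apertures and bigger depth differences give
    bigger blur circles."""
    return (aperture * focal_len * abs(subject_dist - focus_dist)
            / (subject_dist * (focus_dist - focal_len)))
```

For example, an in-focus point yields a zero-diameter circle, and halving the aperture diameter halves the blur circle of every out-of-focus point.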
- For further understanding of the instant disclosure, reference is made to the following detailed description illustrating the embodiments and examples of the instant disclosure. The description is only for illustrating the instant disclosure, not for limiting the scope of the claim.
- Embodiments are illustrated by way of example and not by way of limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:
- FIG. 1 shows a block diagram of an image processing device of one embodiment of the instant disclosure;
- FIG. 2 shows a flow chart of an image depth processing method of one embodiment of the instant disclosure;
- FIG. 3 shows a reference image stored in a memory module of the image processing device;
- FIG. 4 shows a schematic diagram of a background image obtained according to the reference image shown in FIG. 3;
- FIG. 5 shows a schematic diagram of another background image obtained according to the reference image shown in FIG. 3;
- FIG. 6 shows a schematic diagram of still another background image obtained according to the reference image shown in FIG. 3; and
- FIG. 7 shows a schematic diagram of a simulation image generated according to the reference image shown in FIG. 3.
- The aforementioned illustrations and following detailed descriptions are exemplary for the purpose of further explaining the scope of the instant disclosure. Other objectives and advantages related to the instant disclosure will be illustrated in the subsequent descriptions and appended drawings. In the drawings, the size and relative sizes of layers and regions may be exaggerated for clarity.
- It will be understood that, although the terms first, second, third, and the like, may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer or section from another. Thus, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section without departing from the teachings of the instant disclosure. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
- Please refer to FIG. 1. FIG. 1 shows a block diagram of an image processing device of one embodiment of the instant disclosure. As shown in FIG. 1, an image processing device 1 comprises a memory module 11, a processing module 12 and a display module 13, and the processing module 12 is connected to the memory module 11 and the display module 13. In this embodiment, the image processing device 1 is a smart phone, a laptop, a personal computer, a tablet computer, a digital camera, a digital photo frame or another electronic device capable of computing and displaying, and it is not limited thereto. - The
memory module 11 is a storage medium that stores a reference image, a depth image (that is, a gray-scale image with a gray-scale range of 0˜255, wherein a brighter color (a greater gray-scale value) means a nearer position, and vice versa) and at least one aperture simulation parameter. The depth image corresponds to the reference image. The memory module 11 is, for example, an embedded temporary memory, a physical memory or an external storage device (such as an external memory card). In one embodiment, the reference image is an image of full depth of field having a clear foreground image and a clear background image, but it is not limited thereto. In other words, the reference image can also be an image which is only partly clear. The depth image can be generated via laser distance measurement, binocular vision, structured light or the light field effect, which is well known by those skilled in the art and is not repeated herein. The aperture simulation parameter is, for example, the shape of the aperture, the size of the aperture or the focal length. - The
processing module 12 obtains the reference image, the depth image and the aperture simulation parameters from the memory module 11. The processing module 12 determines the depth value of each pixel in the reference image according to the depth image, and accordingly separates the reference image into a plurality of layer images having different depth values. The processing module 12 implements a blurring process on the layer images according to the depth values of the layer images and the aperture simulation parameters, so as to generate a simulation image having a larger aperture and a depth of field like that of an image photographed by a single-lens camera. In one embodiment, the processing module 12 is an application-specific integrated circuit (ASIC), a programmable microprocessor, a digital signal processor (DSP), a programmable logic device (PLD) or a CPU with a software module, and it is not limited thereto. - The
display module 13 is, for example, a liquid crystal display screen that displays the reference image for a user to click any position of the reference image. Thereby, the processing module 12 determines the reference depth value according to the depth value of the pixel corresponding to the clicked position. It should be noted that the display module 13 can also be a digital display screen with a touch-sensing function or another general digital display screen, and it is not limited thereto. In this embodiment, the user can click any position of the reference image via a mouse, a keyboard or another input module, such that the processing module 12 determines a reference depth value according to the depth value of the pixel corresponding to the clicked position. The user can also directly input a value via the input module, and the processing module 12 can use the input value as the reference depth value, but it is not limited thereto. - In this embodiment, after the
processing module 12 determines a reference depth value, the processing module 12 calculates the difference between the depth value of each layer image and the reference depth value, and determines the blurring degrees of the layer images according to the differences. Moreover, the processing module 12 makes the display module 13 display different aperture shapes for the user to choose from according to the aperture simulation parameter. After the user chooses an aperture shape via a display module 13 with a touch-sensing function, a mouse, a keyboard or another input module, the processing module 12 generates a simulation image having a specific bokeh shape (such as a star, heart, circle, pentagon or other shape) according to the aperture shape. It should be noted that, if the user does not choose an aperture shape, the processing module 12 automatically loads a predetermined aperture simulation parameter so as to generate a simulation image whose bokeh shape is like a circular Gaussian function. - In addition, the way for the
image processing device 1 to obtain the reference image and the depth image is that the user saves the reference image and the corresponding depth image in the memory module 11 in advance, but it is not limited thereto. In other words, the image processing device 1 can also generate the reference image and the depth image itself. In detail, the image processing device 1 comprises an image capturing module (not shown), and the image capturing module is connected to the processing module 12. The image capturing module comprises a lens, a light-sensing element and an aperture, and is used to capture images of a scene so as to generate a plurality of photographic images. The light-sensing element is, for example, a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS). In this embodiment, the processing module 12 generates the reference image and the depth image according to the photographic images. It should be noted that how the processing module 12 generates the reference image and the corresponding depth image according to the photographic images is well known by those skilled in the art, and is not repeated herein. - Please refer to
FIGS. 1-3. FIG. 2 shows a flow chart of an image depth processing method of one embodiment of the instant disclosure, and FIG. 3 shows a reference image stored in a memory module of the image processing device. The image depth processing method can be applied to the image processing device 1 shown in FIG. 1, and can generate, according to the reference image shown in FIG. 3, a simulation image having a larger aperture and a depth of field like that of an image photographed by a single-lens camera. Thus, the image depth processing method in this embodiment is described as follows via the image processing device 1 shown in FIG. 1 and the reference image shown in FIG. 3. It should be noted that the reference image shown in FIG. 3 is merely for describing the working principle of the image depth processing method, which does not mean that the image depth processing method can only be applied to the reference image shown in FIG. 3. In other words, the image depth processing method can generate simulation images according to other reference images. - In Step S201, the
processing module 12 obtains N layer images from the reference image according to the depth values shown in the depth image (from far to near), wherein N is an integer greater than 1 and the depth values of the layer images are different. As shown in FIG. 3, the processing module 12 obtains four layer images from the reference image P1 according to the depth image corresponding to the reference image P1, which are a first layer image N1, a second layer image N2, a third layer image N3 and a fourth layer image N4, wherein the higher the layer number of the layer image is, the greater its depth value is (the further its depth of field). It should be noted that, before the processing module 12 obtains the N layer images from the reference image, the user can set the value of N via the display module 13 having a touch-sensing function, a mouse or a keyboard, so as to set the number of layer images and the number of re-focus operations, wherein N can also be a predetermined value. - In Step S203, after the user chooses a reference depth value via a
display module 13 having a touch-sensing function, a mouse or a keyboard, the processing module 12 respectively calculates the differences between the depth values of the layer images and the reference depth value, so as to obtain the difference value corresponding to each layer image. The processing module 12 determines the blurring degree of each layer image according to the difference values, wherein the larger the difference value is, the larger the blurring degree of the layer image is. Also, the memory module 11 stores a look-up table (not shown), and the look-up table records a plurality of blurring parameters. Assuming that the user clicks the first layer image N1 via the display module 13 having a touch-sensing function, the processing module 12 uses the depth value of the first layer image N1 as the reference depth value, and respectively calculates the difference values between the depth values of the layer images and the reference depth value so as to correspondingly obtain the first, second, third and fourth difference values. The processing module 12 obtains the corresponding blurring parameters from the look-up table according to the difference values, and then determines the blurring degree of each layer image according to the blurring parameters. - In Step S205, the
processing module 12 uses the Nth layer image having the greatest depth value as a background image (such as the fourth layer image N4), and uses the N−1th layer image as a foreground image (such as the third layer image N3), wherein the depth value of the N−1th layer image is smaller than that of the Nth layer image. After that, the processing module 12 blurs the background image according to the predetermined aperture simulation parameter (or the aperture simulation parameter chosen by the user) and the corresponding difference value of the Nth layer image. - In Step S206, the
processing module 12 composes the foreground image and a local image of the blurred background image, which is near the edge of the foreground image. Please refer to FIG. 4. FIG. 4 shows a schematic diagram of a background image obtained according to the reference image shown in FIG. 3. In this step, the processing module 12 obtains a local image P11 from the fourth layer image N4, and the local image P11 corresponds to a part of the fourth layer image N4 which is near the edge of the third layer image N3. The processing module 12 composes the local image P11 and the third layer image N3 into a composite image. - In Step S207, the
processing module 12 blurs the composite image formed by the foreground image and the local image of the background image, such that the transition of the depth of field from the fourth layer image N4 to the third layer image N3 looks continuous and natural. - In Step S209, after Step S207 the blurring degree of the local image and the blurring degree of the background image are different, so a margin image exists in the background image. Thus, in this step, the
processing module 12 implements a fading process to fade the margin image in the background image. Please refer to FIG. 5. FIG. 5 shows a schematic diagram of another background image obtained according to the reference image shown in FIG. 3. As shown in FIG. 5, after Step S207, the blurring degree of the local image P11 and the blurring degree of the fourth layer image N4 are different, and thus a margin image L1 exists in the fourth layer image N4. Therefore, in this step, the processing module 12 fades in the fourth layer image N4 along the margin image L1 (that is, it makes the fourth layer image N4 near the margin image L1 gradually clearer) and fades out the local image P11 along the margin image L1 (that is, it makes the local image P11 near the margin image L1 gradually blurrier), so as to fade the margin image L1. - In Step S211, the
processing module 12 determines whether (N−1) equals 1. If yes, the method goes to Step S213 to form a simulation image according to the current background image and foreground image. If not, it goes to Step S215 to subtract 1 from N, and then it goes to Step S217. - In Step S217, the
processing module 12 forms a new background image according to the current background image and foreground image to replace the old background image, and uses the N−1th layer image as a new foreground image to replace the old foreground image. After Step S215, N is 3 (that is, N=4−1), and thus the processing module 12 uses the second layer image N2 as the new foreground image, and forms the new background image according to the fourth layer image N4 and the third layer image N3 which were processed by Steps S203˜S209 (as shown in FIG. 5). - After Step S217, the
processing module 12 again implements Step S206 to compose the foreground image (that is, the second layer image N2) and a local image of the background image (that is, a local image P12 of the background image, which is near the edge of the second layer image N2) into a composite image. - After that, the
processing module 12 implements Step S207 to blur the composite image generated in Step S206, according to the aperture simulation parameter and the depth value corresponding to the second layer image N2. - Please refer to
FIG. 6. FIG. 6 shows a schematic diagram of still another background image obtained according to the reference image shown in FIG. 3. As shown in FIG. 6, after Step S207 there is a margin image L2 in the background image, and thus the processing module 12 again executes Step S209 so as to fade in the background image along the margin image L2 (that is, to make the background image near the margin image L2 gradually clearer) and to fade out the local image P12 along the margin image L2 (that is, to make the local image P12 near the margin image L2 gradually blurrier), so as to fade the margin image L2. - After that, the
processing module 12 again executes Step S211 to determine that (N−1) is not equal to 1 (that is, N−1=2). In addition, the processing module 12 again executes Step S217 to use the first layer image N1 as the new foreground image and to form a new background image according to the background image and foreground image processed via Steps S207˜S209 (as shown in FIG. 6). - After that, the
processing module 12 again executes Step S206 to compose the foreground image (that is, the first layer image N1) and a local image of the background image (that is, a local image P13 of the background image, which is near the edge of the first layer image N1) into a composite image. - After that, the
processing module 12 executes Step S207 and Step S209 to blur the composite image generated in Step S206 according to the aperture simulation parameter and the corresponding depth value of the first layer image N1. The processing module 12 fades in the background image along a margin image (not shown) in the background image and fades out the local image P13 along the margin image, so as to fade the margin image. - After that, the
processing module 12 again executes Step S211 and determines that N−1 is equal to 1 (that is, 2−1=1), and then executes Step S213 to generate a simulation image. Specifically, when N−1 equals 1, all of the layer images have been processed. Therefore, please refer to FIG. 7. FIG. 7 shows a schematic diagram of a simulation image generated according to the reference image shown in FIG. 3. As shown in FIG. 7, the processing module 12 generates, according to the current background image and foreground image, a simulation image having a larger aperture and a depth of field like that of an image photographed by a single-lens camera. - From the above, after the
image processing device 1 blurs the background image, the image processing device 1 blurs the foreground image according to the difference value corresponding to the foreground image and the aperture simulation parameters, and the image processing device 1 also blurs a local image of the background image according to that same difference value and the aperture simulation parameters. After that, the image processing device 1 implements a fading process along a margin image in the background image, so as to generate a simulation image that satisfies the principles of geometrical optics, so that the simulation image has a graduated depth of field and looks natural and continuous. - Moreover, the depth value of the first layer image N1 is taken as the reference depth value in the above embodiment to generate a simulation image having a near depth of field, but it is not limited thereto. In other words, the
image processing device 1 can also take the depth value of another layer image as the reference depth value according to the position clicked by the user, so as to generate a simulation image having another kind of depth of field. However, it should be noted that, no matter which layer image's depth value the image processing device 1 takes as the reference depth value, the image processing device 1 processes the images in order from the layer image having the greatest depth value to the layer image having the smallest depth value (that is, from the farthest layer image to the nearest layer image), so as to generate a simulation image satisfying the principles of geometrical optics. - Additionally, in another embodiment, before the
image processing device 1 blurs the layer images, the image processing device 1 increases the image brightness value of at least one bright area in the reference image P1. Specifically, there may be bright areas in the reference image P1, such as a light-concentrating point or a reflective surface. Usually, a bright area has the greatest brightness value, which is 255 (the brightness of general images ranges from 0 to 255), so if the reference image P1 is blurred without increasing the brightness of the bright area in advance, the brightness of the bright area in the blurred reference image P1 would decrease. Thereby, the simulation image generated by the image processing device 1 would not satisfy the principles of optics. Therefore, before the image processing device 1 executes Step S205, the image processing device 1 increases the brightness of the bright area in advance (for example, from 255 to 500), so as to prevent the brightness of the bright area from decreasing during the blurring of each layer image. However, it is not limited thereto, and those skilled in the art could choose to skip this step based on need. - It should also be mentioned that the image depth processing method can be applied to still images and also to dynamic images (that is, animation) to simulate depth of field, but it is not limited thereto.
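The highlight-boosting step above can be illustrated with a toy example. In the sketch below (hypothetical names; a 3-tap mean stands in for the real blur), near-clipped pixels are lifted above 255 before blurring and clipped back to the displayable range afterwards, so a specular point stays bright instead of averaging away:

```python
def boost_highlights(row, threshold=250, boosted=500):
    """Raise near-clipped highlight pixels beyond the display range before
    blurring (the description's example raises 255 to 500), so averaging
    does not wash bright points out of the bokeh."""
    return [boosted if p >= threshold else p for p in row]

def mean3(row):
    """Minimal 3-tap box blur with edge clamping, standing in for the
    layer-blurring of Step S205."""
    n = len(row)
    return [sum(row[max(0, i - 1):min(n, i + 2)]) / (min(n, i + 2) - max(0, i - 1))
            for i in range(n)]

row = [0, 0, 255, 0, 0]                                     # a single specular point
plain = [min(255, p) for p in mean3(row)]                   # highlight fades to ~85
kept = [min(255, p) for p in mean3(boost_highlights(row))]  # stays noticeably brighter
```

The boosted version keeps the blurred highlight well above the un-boosted one while still fitting the 0-255 output range after clipping.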
- It is clarified that the sequence of steps in
FIG. 2 is set for ease of instruction, and the sequence of the steps is not a condition for demonstrating embodiments of the instant disclosure. - To sum up, the image processing device and the image depth processing method allow the user to set the number of layer images and the number of times of re-focus, to simulate the depth of field of any practical lens according to the aperture simulation parameters, and also to increase the brightness of the image of the bright area in advance so as to strengthen at least one image of the bright area of the simulation image. In the image processing device and the image depth processing method, by blurring the foreground image, blurring a local image of the background image near the edge of the foreground image, and implementing a fading process to fade the margin image in the background image, a simulation image can be generated that satisfies the physics principles of geometrical optics (that is, a simulation image having graduated depth of field, which looks natural and continuous).
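The fading process named in the summary can be expressed as a mask-driven array operation: background pixels within a few pixels of the foreground edge are attenuated by a distance-based alpha. The following NumPy sketch is a hedged approximation; the dilation-based distance measure, the linear ramp, and the function name are assumptions, not the claimed fading process.

```python
import numpy as np

def fade_margin(background, fg_mask, fade_width=3):
    """Fade background pixels within fade_width of the foreground edge.
    Pixels touching the foreground keep 1/(fade_width+1) of their value,
    rising linearly to full value beyond the fade width."""
    def dilate_once(m):
        # grow a boolean mask by one pixel in the 4-neighborhood
        out = m.copy()
        out[1:, :] |= m[:-1, :]
        out[:-1, :] |= m[1:, :]
        out[:, 1:] |= m[:, :-1]
        out[:, :-1] |= m[:, 1:]
        return out

    alpha = np.ones(background.shape, dtype=float)
    grown = fg_mask.astype(bool)
    for step in range(1, fade_width + 1):
        ring = dilate_once(grown) & ~grown   # pixels at distance `step`
        alpha[ring] = step / (fade_width + 1)
        grown |= ring
    return background * alpha
```

Pixels under the foreground mask itself are left at full value here, since the foreground image is composited over them afterwards; only the background margin is faded.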
- The descriptions illustrated supra set forth simply the preferred embodiments of the instant disclosure; however, the characteristics of the instant disclosure are by no means restricted thereto. All changes, alterations, or modifications conveniently considered by those skilled in the art are deemed to be encompassed within the scope of the instant disclosure delineated by the following claims.
Claims (16)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW104106025A | 2015-02-25 | ||
TW104106025A TWI566601B (en) | 2015-02-25 | 2015-02-25 | Image processing device and image depth processing method |
TW104106025 | 2015-02-25 |
Publications (2)
Publication Number | Publication Date |
---|---|
US9412170B1 (en) | 2016-08-09
US20160247285A1 (en) | 2016-08-25
Family
ID=56555986
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/706,090 Active US9412170B1 (en) | 2015-02-25 | 2015-05-07 | Image processing device and image depth processing method |
Country Status (2)
Country | Link |
---|---|
US (1) | US9412170B1 (en) |
TW (1) | TWI566601B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109408718A (en) * | 2018-10-23 | 2019-03-01 | 西安艾润物联网技术服务有限责任公司 | Information-pushing method and Related product |
WO2021107384A1 (en) * | 2019-11-29 | 2021-06-03 | Samsung Electronics Co., Ltd. | Generation of bokeh images using adaptive focus range and layered scattering |
US11094041B2 (en) | 2019-11-29 | 2021-08-17 | Samsung Electronics Co., Ltd. | Generation of bokeh images using adaptive focus range and layered scattering |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10217195B1 (en) * | 2017-04-17 | 2019-02-26 | Amazon Technologies, Inc. | Generation of semantic depth of field effect |
CN112950692B (en) * | 2019-11-26 | 2023-07-14 | 福建天晴数码有限公司 | Image depth of field processing method and system based on mobile game platform |
CN113052754B (en) * | 2019-12-26 | 2022-06-07 | 武汉Tcl集团工业研究院有限公司 | Method and device for blurring picture background |
CN112270728A (en) * | 2020-10-27 | 2021-01-26 | 维沃移动通信有限公司 | Image processing method and device, electronic equipment and readable storage medium |
CN112532882B (en) * | 2020-11-26 | 2022-09-16 | 维沃移动通信有限公司 | Image display method and device |
WO2023245362A1 (en) * | 2022-06-20 | 2023-12-28 | 北京小米移动软件有限公司 | Image processing method and apparatus, electronic device, and storage medium |
TWI811043B (en) * | 2022-07-28 | 2023-08-01 | 大陸商星宸科技股份有限公司 | Image processing system and image object superimposition apparatus and method thereof |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE10011411C2 (en) * | 2000-03-09 | 2003-08-14 | Bosch Gmbh Robert | Imaging fire detector |
US20100265313A1 (en) * | 2009-04-17 | 2010-10-21 | Sony Corporation | In-camera generation of high quality composite panoramic images |
JP5460173B2 (en) * | 2009-08-13 | 2014-04-02 | 富士フイルム株式会社 | Image processing method, image processing apparatus, image processing program, and imaging apparatus |
US20110110591A1 (en) * | 2009-11-09 | 2011-05-12 | Ming-Hwa Sheu | Multi-point image labeling method |
KR101470693B1 (en) * | 2012-07-31 | 2014-12-08 | 엘지디스플레이 주식회사 | Image data processing method and stereoscopic image display using the same |
CN104253939A (en) * | 2013-06-27 | 2014-12-31 | 聚晶半导体股份有限公司 | Focusing position adjusting method and electronic device |
AU2013206601A1 (en) * | 2013-06-28 | 2015-01-22 | Canon Kabushiki Kaisha | Variable blend width compositing |
CN103945118B (en) * | 2014-03-14 | 2017-06-20 | 华为技术有限公司 | Image weakening method, device and electronic equipment |
-
2015
- 2015-02-25 TW TW104106025A patent/TWI566601B/en active
- 2015-05-07 US US14/706,090 patent/US9412170B1/en active Active
Also Published As
Publication number | Publication date |
---|---|
TW201631955A (en) | 2016-09-01 |
TWI566601B (en) | 2017-01-11 |
US9412170B1 (en) | 2016-08-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9412170B1 (en) | Image processing device and image depth processing method | |
US11210799B2 (en) | Estimating depth using a single camera | |
EP3599760B1 (en) | Image processing method and apparatus | |
US10015469B2 (en) | Image blur based on 3D depth information | |
US9444991B2 (en) | Robust layered light-field rendering | |
CN110121882B (en) | Image processing method and device | |
US9639945B2 (en) | Depth-based application of image effects | |
US9906772B2 (en) | Method for performing multi-camera capturing control of an electronic device, and associated apparatus | |
US20220215568A1 (en) | Depth Determination for Images Captured with a Moving Camera and Representing Moving Features | |
CN108848367B (en) | Image processing method and device and mobile terminal | |
AU2013206601A1 (en) | Variable blend width compositing | |
US9332195B2 (en) | Image processing apparatus, imaging apparatus, and image processing method | |
CN105989574A (en) | Image processing device and image field-depth processing method | |
WO2018210308A1 (en) | Blurring method and apparatus for image, storage medium, and electronic device | |
US9538074B2 (en) | Image processing apparatus, imaging apparatus, and image processing method | |
CN107093395B (en) | Transparent display device and image display method thereof | |
US20230033956A1 (en) | Estimating depth based on iris size | |
US8665349B2 (en) | Method of simulating short depth of field and digital camera using the same | |
TWI541761B (en) | Image processing method and electronic device thereof | |
Han et al. | Virtual out of focus with single image to enhance 3D perception | |
RU2540786C2 (en) | Method and system for dynamic generation of three-dimensional animation effects |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: NATIONAL TAIWAN UNIVERSITY, TAIWAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WU, JIUN-HUEI;LI, ZONG-SIAN;REEL/FRAME:035583/0787 Effective date: 20150504 Owner name: LITE-ON TECHNOLOGY CORPORATION, TAIWAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WU, JIUN-HUEI;LI, ZONG-SIAN;REEL/FRAME:035583/0787 Effective date: 20150504 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |