US8718331B2 - Image detecting apparatus and method thereof - Google Patents
- Publication number
- US8718331B2 (application US12/844,953; US84495310A)
- Authority
- US
- United States
- Prior art keywords
- image
- eye
- block matching
- consecutive
- eye image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/161—Encoding, multiplexing or demultiplexing different image signal components
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2213/00—Details of stereoscopic systems
- H04N2213/007—Aspects relating to detection of stereoscopic image format, e.g. for adaptation to the display format
Definitions
- the present disclosure relates to an image frame detecting mechanism, and more particularly, to an image frame detecting apparatus and a method thereof capable of performing left-eye/right-eye image determination.
- stereo image display technologies are based on the concept that an image signal is divided into left-eye images and right-eye images having different visual angles.
- the left-eye images and the right-eye images are respectively transmitted to the left eye and the right eye of a viewer via a stereo image display, and are then projected into a stereo image in the human brain. Accordingly, the left-eye images and the right-eye images are interleaved with each other in a common stereo image signal.
- a sequence of the left-eye images and the right-eye images is not particularly designated in the common stereo image signal, i.e., positions of the left-eye images and the right-eye images are not designated. Therefore, in order to accurately transmit the left-eye images and right-eye images to the left eye and the right eye, respectively, it is crucial to first detect the sequence of the left-eye image and right-eye image of the stereo image.
- one object of the present disclosure is to provide an image detecting apparatus and a method thereof capable of detecting left-eye/right-eye image frames to effectively and accurately detect a sequence or positions of left-eye/right-eye image frames of a stereo image signal.
- the image detecting apparatus and the method thereof are also capable of detecting a dimension of a current image frame, i.e., whether the current image frame is a stereo image or a two-dimensional (2D) image can be determined.
- an image detecting method comprises performing block matching on a target area corresponding to two consecutive image frames of an image signal to determine a motion vector; and performing left-eye/right-eye image determination on a current image frame from the two consecutive image frames according to the motion vector.
- an image detecting apparatus comprises a calculating unit and a determining unit.
- the calculating unit performs block matching on a target area corresponding to two consecutive image frames of an image signal to determine a motion vector.
- the determining unit performs left-eye/right-eye image determination on a current image frame from the consecutive image frames according to the motion vector.
- FIG. 1 is a block diagram of an image detecting apparatus in accordance with an embodiment of the present disclosure.
- FIG. 2 is a schematic diagram illustrating operations of block matching by the image detecting apparatus in FIG. 1 .
- FIG. 3A and FIG. 3B are schematic diagrams of examples of left-eye/right-eye images having different sequences in an input image signal Y in .
- FIG. 4A to FIG. 4C are schematic diagrams illustrating detection of an input image signal Y in with different frame rates by the image detecting apparatus in FIG. 1 .
- FIG. 5 is a flow chart of operations of the image detecting apparatus in FIG. 1 .
- FIG. 6 is a timing diagram of a stereo glass control signal generated by a processing unit in FIG. 1 .
- FIG. 1 shows a block diagram of an image detecting apparatus 100 in accordance with an embodiment of the present disclosure.
- the image detecting apparatus 100 comprises a scaling unit 105 , a storage unit 110 , a calculating unit 115 , a determining unit 120 , and a processing unit 125 .
- the calculating unit 115 performs block matching on a target area corresponding to two consecutive image frames of an input image signal Y in to calculate a plurality of block matching differences, selects a relatively small block matching difference from the plurality of block matching differences, and determines a motion vector according to the relatively small block matching difference.
- the relatively small block matching difference is the minimum block matching difference from the plurality of block matching differences.
- the determining unit 120 performs left-eye/right-eye image determination on a current image frame from the two consecutive image frames according to the motion vector to generate a determination result.
- the processing unit 125 compiles statistics of a plurality of determination results of a plurality of areas of the current image frame to determine a dimension of the current image frame, and to determine the current image frame as a left-eye image or a right-eye image.
- image frames of the input image signal Y in are scaled down, by the scaling unit 105 , to a plurality of down-scaled image frames before the calculating unit 115 performs block matching.
- the calculating unit 115 performs block matching according to two consecutive down-scaled image frames to determine motion vectors.
- the scaling unit 105 and the storage unit 110 are not limitations of the present disclosure as they are not main components for performing left-eye/right-eye image determination.
- the scaling unit 105 scales down the image frames of the input image signal Y in by either horizontal scale down (HSD) or vertical scale down (VSD), so as to respectively generate down-scaled image frames, which are then written into the storage unit 110 and outputted to the calculating unit 115 .
- the scaling unit 105 scales down the image frames by sampling or averaging; a minimal down-scaling sketch is given below.
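The patent does not fix a particular down-scaling filter beyond "sampling or averaging", so the following is only a minimal sketch of what the scaling unit 105 might do; the function names, the factor of 2, and the use of NumPy are assumptions for illustration.

```python
import numpy as np

def vertical_scale_down(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """VSD by simple row sampling: keep every `factor`-th scan line."""
    return frame[::factor, :]

def horizontal_scale_down(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """HSD by averaging each group of `factor` horizontally adjacent pixels."""
    h, w = frame.shape
    w_trim = (w // factor) * factor          # drop a partial group at the right edge
    groups = frame[:, :w_trim].reshape(h, w_trim // factor, factor)
    return groups.mean(axis=2)
```

A down-scaled frame produced this way would then be written into the storage unit 110 and, one frame later, read back as the previous frame for block matching.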
- the calculating unit 115 receives a down-scaled image frame transmitted from the scaling unit 105 and reads a down-scaled image frame from the storage unit 110 .
- since the calculating unit 115 almost simultaneously receives and reads the down-scaled image frames, as far as a time sequence is concerned, when the down-scaled frame received by the calculating unit 115 from the scaling unit 105 is a current image frame F n , the down-scaled frame read from the storage unit 110 is a frame prior to the current image frame F n , e.g., the previous image frame F n−1 or F n−2 . Therefore, the calculating unit 115 performs block matching on a target area corresponding to the current image frame F n and the previous image frame F n−1 to calculate the block matching differences.
- the calculating unit 115 performs block matching in the horizontal direction due to visual angle characteristics of left-eye/right-eye images. More specifically, the calculating unit 115 first performs block matching on a plurality of image blocks of two consecutive image frames to generate a plurality of block matching values, and then adds up the plurality of block matching values to generate one of the abovementioned block matching differences; the remaining block matching differences are obtained by iterating the foregoing operations.
- the plurality of image blocks are image blocks covered by each of the scan line areas of the image frames, and the calculating unit 115 performs block matching on image blocks covered by corresponding scan line areas of the current image frame F n and the previous image frame F n−1 .
- FIG. 2 shows a schematic diagram of operations of block matching by the image frame detecting apparatus 100 in FIG. 1 .
- the calculating unit 115 performs block matching on an image block M j of a scan line area L k corresponding to the previous image frame F n−1 and 2R+1 blocks M j−R ′ to M j+R ′ of a scan line area L k ′ corresponding to the current image frame F n to generate a plurality of block matching values (e.g., sums of absolute differences (SAD) between pixel values), where k represents the kth scan line area, j represents the jth block in the horizontal direction, and the plurality of block matching values correspond to different horizontal motion vectors.
- the calculating unit 115 performs block matching on a different block (e.g., a block M j+1 ) of the scan line area L k corresponding to the previous image frame F n−1 and a plurality of blocks (e.g., blocks M j−R+1 ′ to M j+R+1 ′) of the scan line area L k ′ corresponding to the current image frame F n to generate a plurality of block matching values. Therefore, for every horizontal motion vector, the different image blocks of the scan line area L k corresponding to the previous image frame F n−1 can generate a plurality of block matching values.
- the calculating unit 115 calculates 2R+1 block matching values respectively corresponding to 2R+1 different horizontal motion vectors. Likewise, for the block M j+1 , the calculating unit 115 obtains another 2R+1 block matching values respectively corresponding to the 2R+1 motion vectors via the foregoing operations. After iterating the foregoing operations N times, i.e., after block matching is performed on N different blocks of the scan line area L k , every horizontal motion vector corresponds to N block matching values (i.e., N SADs).
- for each of the horizontal motion vectors, e.g., a horizontal motion vector (1, 0) corresponding to moving rightwards by one block, the calculating unit 115 adds up the N SADs corresponding to the motion vector (1, 0) to generate a block matching difference, and accordingly generates a plurality of block matching differences respectively corresponding to the 2R+1 horizontal motion vectors. A concrete sketch of this accumulation follows.
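As a concrete reading of the block matching just described, the sketch below sums per-block SADs over one scan line area for each candidate horizontal shift (expressed in whole blocks, as in the (1, 0) example) and picks the shift with the minimum block matching difference. The block width, the search radius R, and the edge handling are assumptions for illustration, not values taken from the patent.

```python
import numpy as np

def scanline_motion_vector(prev_strip: np.ndarray, cur_strip: np.ndarray,
                           block_w: int = 16, R: int = 4):
    """Block matching for one scan line area.

    prev_strip / cur_strip: luma strips of shape (strip_height, width) taken at
    the same vertical position from the previous frame F_{n-1} and the current
    frame F_n.  Candidate horizontal motion vectors are expressed in whole
    blocks: block M_j of the previous strip is compared with blocks
    M_{j-R}' .. M_{j+R}' of the current strip.  Returns (best_shift, diffs),
    where diffs[s + R] is the block matching difference (sum of per-block SADs)
    for horizontal shift s.
    """
    _, width = prev_strip.shape
    n_blocks = width // block_w
    diffs = np.zeros(2 * R + 1)

    for s in range(-R, R + 1):                  # candidate horizontal motion vectors
        total = 0
        for j in range(n_blocks):               # blocks M_j of the previous strip
            k = j + s                            # candidate block M_k' in the current strip
            if k < 0 or k >= n_blocks:           # edge handling is an assumption
                continue
            prev_blk = prev_strip[:, j * block_w:(j + 1) * block_w].astype(np.int32)
            cur_blk = cur_strip[:, k * block_w:(k + 1) * block_w].astype(np.int32)
            total += np.abs(prev_blk - cur_blk).sum()   # SAD of one block pair
        diffs[s + R] = total

    best_shift = int(np.argmin(diffs)) - R       # minimum block matching difference
    return best_shift, diffs
```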
- the minimum block matching difference is selected from the plurality of block matching differences, and a dimension of the current image frame F n is determined according to a motion vector corresponding to the minimum block matching difference, i.e., it is determined whether the current image frame F n is a stereo image or a plane image. For example, when the current image frame F n is a stereo image, it is further determined whether the current image frame F n is a left-eye image or a right-eye image according to a motion vector.
- FIG. 3A and FIG. 3B respectively show schematic diagrams of examples of left-eye/right-eye images having different sequences in an input image signal Y in .
- suppose a current image frame received by the image detecting apparatus 100 is the right-eye image 300 R of a stereo image 300 , represented by cubes in FIG. 3A , and a previous image frame is the left-eye image 300 L, i.e., the image detecting apparatus 100 first receives the left-eye image 300 L and then receives the right-eye image 300 R.
- for forming a stereo image, referring to FIG. 3A and FIG. 3B , a same image may appear at different positions in the left-eye image 300 L and the right-eye image 300 R. For example, an edge 305 formed by the front plane and the side plane of the cube of the left-eye image 300 L is nearer the right side than that of the right-eye image 300 R.
- in the situation of FIG. 3A , the motion vector corresponding to the cube edge 305 , as generated by the calculating unit 115 from performing block matching, is directed rightwards; otherwise, when the previous image frame is the right-eye image 300 R and the current image frame is the left-eye image 300 L (as shown in FIG. 3B ), the motion vector corresponding to the cube edge 305 is directed leftwards. Therefore, according to the direction of the motion vector, it can be determined whether the current image frame is the left-eye image 300 L or the right-eye image 300 R, thereby determining whether other stereo images are left-eye or right-eye images.
- when the calculated motion vector is zero, it means that the images in the two consecutive image frames do not move, such that it is determined that the frames are 2D images rather than stereo images. Accordingly, in this embodiment, besides determining a dimension of the current image frame, it is also determined whether the current image frame is a left-eye image or a right-eye image according to the calculated motion vector.
- when the motion vector is directed rightwards (as in the example of FIG. 3A ), the determining unit 120 determines that the scan line area L k ′ of the current image frame F n is a right-eye image, and generates a determination result “1” accordingly.
- the determination result “1” is recorded in a flag corresponding to the scan line area L k ′ for subsequent statistics compiling by the processing unit 125 .
- conversely, when the motion vector is directed leftwards, the determining unit 120 determines that the current image frame F n in the scan line area L k ′ is a left-eye image, and generates a determination result “2” to be recorded in the flag corresponding to the scan line area L k ′. In addition, when the motion vector does not indicate any direction, the determining unit 120 determines that the current image frame F n in the scan line area L k ′ is neither a left-eye image nor a right-eye image but a 2D image, and generates a determination result “0” to be recorded in the flag corresponding to the scan line area L k ′. A minimal mapping of these cases is sketched below.
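Assuming the sign convention of the FIG. 3A example (a rightward motion vector indicates that the current frame is a right-eye image), the flag assignment for one scan line area could be written as follows; the numeric flag values 1, 2 and 0 are the ones used in the description above.

```python
def classify_scanline(motion_shift: int) -> int:
    """Map the motion vector of one scan line area L_k' to a flag value:
    1 = right-eye image, 2 = left-eye image, 0 = no horizontal motion (2D)."""
    if motion_shift > 0:        # motion vector directed rightwards (FIG. 3A case)
        return 1
    if motion_shift < 0:        # motion vector directed leftwards (FIG. 3B case)
        return 2
    return 0                    # zero motion vector: neither eye, treat as 2D
```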
- the determining unit 120 determines the current image frame F n as the left-eye image or the right-eye image according to the motion vector calculated by the calculating unit 115 with respect to the scan line areas L k and L k ′.
- the calculating unit 115 respectively performs block matching on a plurality of scan line areas (or even all scan line areas) of the previous image frame F n−1 and the current image frame F n to determine motion vectors corresponding to the scan line areas.
- a flag value corresponding to each of the scan line areas may indicate an image frame as a left-eye image, a right-eye image or a 2D image.
- when a determination result indicates that an image frame is a left-eye image, i.e., when a flag value is “2”, the processing unit 125 increases a count value of a first counter; when the determination result indicates that the frame is a right-eye image, i.e., when the flag value is “1”, the processing unit 125 increases a count value of a second counter.
- after the processing unit 125 compiles statistics of the determination results (i.e., flag values) of the current image frame F n in all scan line areas, it is determined whether the current image frame F n is a left-eye image, a right-eye image or a 2D image according to the count values of the first counter and the second counter.
- when a ratio of the count value of the first counter to the count value of the second counter is higher than a first threshold V th1 , the processing unit 125 determines that the current image frame F n is a left-eye image; when the ratio is lower than a second threshold V th2 , the processing unit 125 determines that the current image frame F n is a right-eye image, where the second threshold V th2 is lower than the first threshold V th1 .
- in other words, when a majority of the determination results of the current image frame F n in all scan line areas are left-eye images, the processing unit 125 determines the current image frame F n as the left-eye image; when a majority of the determination results are right-eye images, the processing unit 125 determines the current image frame F n as the right-eye image.
- when the ratio of the first counter to the second counter is between the first threshold V th1 and the second threshold V th2 , the processing unit 125 determines the current image frame F n with reference to a determination result of another image frame (e.g., a previous image frame F n−1 or a next frame F n+1 ).
- when a majority of the determination results are “0” (i.e., most motion vectors indicate no horizontal displacement), the processing unit 125 determines that the current image frame F n is a 2D image or a plane image, to avoid mistakenly determining a plane image as a stereo image. A sketch of this statistics compiling follows.
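A possible reading of the statistics compiling is sketched below. The counter names follow the description above, while the concrete threshold values, the zero-motion test, and the return labels are placeholders rather than values taken from the patent.

```python
def classify_frame(flags, v_th1: float = 2.0, v_th2: float = 0.5) -> str:
    """Compile statistics of the per-scan-line flag values of frame F_n.

    Returns 'left', 'right', '2d', or 'undecided' (defer to the determination
    result of a neighbouring frame such as F_{n-1} or F_{n+1}).
    """
    first_counter = sum(1 for f in flags if f == 2)    # left-eye votes ("2")
    second_counter = sum(1 for f in flags if f == 1)   # right-eye votes ("1")

    if first_counter == 0 and second_counter == 0:
        return '2d'                                    # every area reported zero motion

    ratio = first_counter / max(second_counter, 1)     # guard against division by zero
    if ratio > v_th1:
        return 'left'
    if ratio < v_th2:
        return 'right'
    return 'undecided'                                 # between V_th1 and V_th2
```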
- FIG. 4A to FIG. 4C show schematic diagrams illustrating detection of an input image signal Y in with different frame rates by the image detecting apparatus in FIG. 1 .
- the image detecting apparatus 100 detects the input image signal Y in with a frame rate of 60 Hz.
- the input image signal Y in has a left-eye image L or a right-eye image R at one frame time point, and the image detecting apparatus 100 first receives the left-eye image L and then receives the right-eye image R in sequence.
- the calculating unit 115 is designed to determine whether the input image signal Y in comprises a motion image according to a plurality of image frames comprising a plurality of odd image frames or a plurality of even image frames (i.e., a plurality of left-eye images or a plurality of right-eye images). More specifically, the calculating unit 115 may determine whether the input image signal Y in comprises a motion image according to a plurality of odd image frames; likewise, the calculating unit 115 may determine whether the input image signal Y in comprises a motion image according to a plurality of even image frames.
- the processing unit 125 determines whether a current image frame is a left-eye image or a right-eye image according to a result of whether the input image signal Y in comprises a motion image and statistics of the foregoing determination results. For example, when the input image signal Y in in FIG. 4A comprises a motion image, the processing unit 125 performs left-eye/right-eye determination on two consecutive image frames at a same time point (e.g., a left-eye image L and a right-eye image R at a time point t 1 ), but not on two consecutive image frames at two different time points (e.g., a right-eye image R at the time point t 1 and a left-eye image L at a time point t 2 ), so as to prevent misjudgments.
- referring to FIG. 4A , the processing unit 125 determines that the current image frame is a right-eye image R, and accordingly the previous image frame or the next image frame is a left-eye image L. At this point, flag values of frames not processed at those time points are marked “X” for distinction. Therefore, even if the input image signal Y in comprises a motion image, the image detecting apparatus 100 still can effectively and accurately detect a sequence of left-eye/right-eye images of the input image signal Y in .
- when the input image signal Y in does not comprise any motion image (i.e., the input image signal Y in only comprises static images), regardless of whether the two consecutive image frames are at a same time point or at different time points, the processing unit 125 performs dimension determination or left-eye/right-eye image determination on them. Accordingly, when the previous image frame and the current image frame are at a same time point (e.g., a left-eye image L and a right-eye image R at the time point t 1 ), the determining unit 120 determines that a majority of flag values corresponding to the current image frame are recorded as “1”, and determines that the current image frame is a right-eye image R.
- likewise, when the previous image frame and the current image frame are at different time points, the determining unit 120 determines that a majority of flag values corresponding to the current image frame are recorded as “2”, and determines that the current image frame is a left-eye image L. Accordingly, the image detecting apparatus 100 is capable of effectively detecting a sequence of left-eye/right-eye images of the input image signal Y in .
- the image detecting apparatus 100 first receives a right-eye image R of the input image signal Y in and then receives a left-eye image L.
- the processing unit 125 determines a dimension of the current image frame according to a result of determining whether the input image signal Y in comprises a motion image and statistics of the foregoing determination results.
- the input image signal Y in has a frame rate of 120 Hz, and each of left-eye images and each of right-eye images of the input image signal Y in correspond to different time points.
- the image detecting apparatus 100 first receives a left-eye image L and then receives a right-eye image R.
- the processing unit 125 determines a dimension of the input image signal Y in and a sequence of left-eye/right-eye images according to a result of determining whether the input image signal Y in comprises a motion image and statistics of the foregoing determination results.
- when a motion image is detected in an image frame of the input image signal Y in , the processing unit 125 performs neither the dimension determination nor the sequence determination of left-eye/right-eye images, to prevent misjudgments. On the contrary, when the input image signal Y in does not comprise any motion image, the processing unit 125 performs the dimension determination or the sequence determination of left-eye/right-eye images on two consecutive image frames. A condensed form of this gating rule is sketched below.
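The motion-dependent behaviour described for FIG. 4A to FIG. 4C can be condensed into a single gating predicate. This is a simplified reading rather than the patented implementation, and the argument names are illustrative.

```python
def should_judge_pair(same_time_point: bool, signal_has_motion: bool) -> bool:
    """Gate left/right (or dimension) determination for a pair of consecutive frames.

    If the input signal contains a motion image, only pairs belonging to the same
    time point are judged, so that object motion is not mistaken for disparity;
    a static signal allows any pair of consecutive frames to be judged.
    """
    return same_time_point or not signal_has_motion
```

At 120 Hz (FIG. 4C), no pair of consecutive frames shares a time point, so a detected motion image disables both determinations, which matches the behaviour described above.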
- FIG. 5 shows a flow chart of operations of the image detecting apparatus 100 in FIG. 1 . Note that, provided the same effect is achieved in practice, the steps of the present image detecting method need not be executed in the sequence shown in FIG. 5 , and can be interleaved with other steps of the same flow.
- in Step 500 , the scaling unit 105 scales down image frames of the input image signal Y in to generate down-scaled image frames.
- the calculating unit 115 performs block matching on scan line areas corresponding to two consecutive down-scaled image frames to generate a plurality of block matching differences, and selects the minimum block matching difference from the plurality of block matching differences to determine a motion vector.
- the determining unit 120 performs left-eye/right-eye image determination on a current image frame in a scan line area according to the motion vector corresponding to the minimum block matching difference to generate a determination result.
- in Step 520 , it is determined whether determination results of the current image frame in all scan line areas have been generated.
- when the result of Step 520 is positive, Step 525 is performed; otherwise, Step 510 is performed.
- in Step 525 , the processing unit 125 compiles statistics of the determination results of the current image frame in all scan line areas to determine whether the current image frame is a left-eye image or a right-eye image. The flow ends in Step 530 . The sketch below ties these steps together.
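Tying the steps of FIG. 5 together, a driver for one frame pair might look like the sketch below. It reuses the helper sketches given earlier (horizontal_scale_down, vertical_scale_down, scanline_motion_vector, classify_scanline, classify_frame), and the strip height is an arbitrary choice.

```python
def detect_frame_eye(prev_frame, cur_frame, strip_height: int = 16) -> str:
    """One pass of the FIG. 5 flow for a single frame pair:
    scale down, block-match each scan line area, classify each area,
    then compile the statistics into a frame-level decision."""
    prev_s = horizontal_scale_down(vertical_scale_down(prev_frame))
    cur_s = horizontal_scale_down(vertical_scale_down(cur_frame))

    flags = []
    n_strips = prev_s.shape[0] // strip_height
    for k in range(n_strips):                          # every scan line area L_k
        rows = slice(k * strip_height, (k + 1) * strip_height)
        shift, _ = scanline_motion_vector(prev_s[rows], cur_s[rows])
        flags.append(classify_scanline(shift))

    return classify_frame(flags)
```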
- the calculating unit 115 can also perform block matching on image blocks within a plurality of corresponding areas of other sizes, e.g., block matching may be performed on image blocks within a square area rather than only on the image blocks in the scan line areas.
- the calculating unit 115 can select a representative image block from the plurality of corresponding areas, and respectively calculate a plurality of block matching values as the plurality of block matching differences according to the representative image block and a plurality of different horizontal motion vectors.
- referring to FIG. 2 , the calculating unit 115 selects the block M j in the scan line area L k corresponding to the previous image frame F n−1 as a representative image block, and performs block matching on the block M j and the blocks M j−R ′ to M j+R ′ in the scan line area L k ′ corresponding to the current image frame F n , so as to generate a plurality of SADs as the plurality of block matching differences; the minimum SAD among them serves as the minimum block matching difference provided to the subsequent determining unit 120 to determine whether the current image frame F n in the scan line area L k ′ is a right-eye image, a left-eye image, or a plane image.
- the calculating unit 115 need not perform block matching on all image blocks within the plurality of corresponding areas.
- in other words, certain image blocks may be omitted while block matching is performed only on certain representative image blocks, and such modifications are within the spirit and scope of the present disclosure; a sketch of the representative-block variant follows.
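Under the same assumptions as the earlier block matching sketch, the representative-block variant reduces to computing the 2R+1 SADs for a single block M j and treating them directly as the block matching differences; the index j is assumed to lie at least R blocks away from either edge of the strip.

```python
import numpy as np

def representative_block_motion(prev_strip: np.ndarray, cur_strip: np.ndarray,
                                j: int, block_w: int = 16, R: int = 4):
    """Representative-block variant: the 2R+1 SADs between block M_j of the
    previous strip and blocks M_{j-R}' .. M_{j+R}' of the current strip are
    used directly as the block matching differences.
    Assumes j lies at least R blocks away from either edge of the strip."""
    prev_blk = prev_strip[:, j * block_w:(j + 1) * block_w].astype(np.int32)
    sads = {}
    for s in range(-R, R + 1):
        k = j + s
        cur_blk = cur_strip[:, k * block_w:(k + 1) * block_w].astype(np.int32)
        sads[s] = int(np.abs(prev_blk - cur_blk).sum())
    best_shift = min(sads, key=sads.get)       # minimum SAD selects the motion vector
    return best_shift, sads
```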
- the processing unit 125 of the image detecting apparatus 100 further generates a stereo glass control signal according to a determination result of a sequence of left-eye/right-eye images.
- FIG. 6 shows a timing diagram of a stereo glass control signal generated by the processing unit 125 .
- an original vertical data enable signal VDE in of the input image signal Y in has a periodic high logic level.
- a rising edge of the high logic level represents a start time point of a left-eye image or a right-eye image
- a falling edge of the high logic level represents an end time point of a left-eye image or a right-eye image.
- the time T 1 represents a time point of practically completing scanning a right-eye image.
- since the image detecting apparatus 100 performs left-eye/right-eye image determination on the input image signal Y in , in a practical situation an overall time delay (e.g., a time difference T d ) may already have been introduced into the original input image signal Y in by the time a current image frame is determined as a left-eye/right-eye image and the image data is outputted. Therefore, apart from determining a sequence of left-eye/right-eye images of the input image signal Y in , the processing unit 125 also estimates the time delay T d introduced into the input image signal Y in by the image detecting apparatus 100 .
- the processing unit 125 generates a stereo glass control signal R ctrl of a right-eye image and a stereo glass control signal L ctrl of a left-eye image with reference to the original vertical data enable signal VDE in and operation delays resulted from all of the units. Since the greater part of the overall processing delay of the image detecting apparatus 100 results from the block matching performed by the calculating unit 115 , the processing unit 125 may generate the stereo glass control signal R ctrl of the right-eye image and the stereo glass control signal L ctrl of the left-eye image only with reference to the original vertical data enable signal VDE in and the delay resulted from the block matching performed by the calculating unit 115 ; such a principle is also within the spirit and scope of the present disclosure.
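As an illustration of the delay compensation only (signal polarity, time units and data layout are not specified in the text above), the stereo glass control edges could be derived from the rising edges of VDE in , the detected eye sequence and the estimated delay T d roughly as follows.

```python
def glasses_control_edges(vde_rising_edges, eye_sequence, delay_td):
    """Shift each VDE_in rising edge by the estimated processing delay T_d and
    route it to the right-eye or left-eye shutter control list according to the
    detected sequence.  Times are in arbitrary units (e.g. milliseconds)."""
    r_ctrl, l_ctrl = [], []
    for t, eye in zip(vde_rising_edges, eye_sequence):
        t_open = t + delay_td                  # compensate the detector's delay
        if eye == 'right':
            r_ctrl.append(t_open)
        else:
            l_ctrl.append(t_open)
    return r_ctrl, l_ctrl
```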
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Image Analysis (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/844,953 US8718331B2 (en) | 2009-07-29 | 2010-07-28 | Image detecting apparatus and method thereof |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US22927709P | 2009-07-29 | 2009-07-29 | |
US12/844,953 US8718331B2 (en) | 2009-07-29 | 2010-07-28 | Image detecting apparatus and method thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
US20110026776A1 (en) | 2011-02-03 |
US8718331B2 (en) | 2014-05-06 |
Family
ID=43527048
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/844,953 Active 2031-04-12 US8718331B2 (en) | 2009-07-29 | 2010-07-28 | Image detecting apparatus and method thereof |
Country Status (3)
Country | Link |
---|---|
US (1) | US8718331B2 (en) |
CN (1) | CN101990108B (zh) |
TW (1) | TWI422213B (zh) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012068948A (ja) * | 2010-09-24 | 2012-04-05 | Renesas Electronics Corp | 顔属性推定装置およびその方法 |
CN102447916A (zh) * | 2010-09-30 | 2012-05-09 | 宏碁股份有限公司 | 立体影像显示装置与方法 |
KR101793283B1 (ko) * | 2011-04-20 | 2017-11-02 | 엘지디스플레이 주식회사 | 재깅 개선방법과 이를 이용한 입체영상 표시장치 |
JP2013038454A (ja) * | 2011-08-03 | 2013-02-21 | Sony Corp | 画像処理装置および方法、並びにプログラム |
KR101846552B1 (ko) * | 2011-11-30 | 2018-04-09 | 엘지디스플레이 주식회사 | 표시패널과 필름 패턴 리타더의 미스 얼라인 검사 시스템 및 방법 |
EP2667354B1 (en) * | 2012-05-24 | 2015-07-08 | Thomson Licensing | Method and apparatus for analyzing stereoscopic or multi-view images |
EP4068930B1 (en) | 2021-04-01 | 2024-03-13 | Ovh | A rack system for housing an electronic device |
CA3151725A1 (en) | 2021-04-01 | 2022-10-01 | Ovh | Immersion cooling system with dual dielectric cooling liquid circulation |
US11924998B2 (en) | 2021-04-01 | 2024-03-05 | Ovh | Hybrid immersion cooling system for rack-mounted electronic assemblies |
Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4819255A (en) * | 1986-11-19 | 1989-04-04 | Kabushiki Kaisha Toshiba | Stereo X-ray fluorography apparatus |
US6314211B1 (en) * | 1997-12-30 | 2001-11-06 | Samsung Electronics Co., Ltd. | Apparatus and method for converting two-dimensional image sequence into three-dimensional image using conversion of motion disparity into horizontal disparity and post-processing method during generation of three-dimensional image |
US6411326B1 (en) * | 1997-05-21 | 2002-06-25 | Olympus Optical Co., Ltd. | Stereo image display unit |
US20040135740A1 (en) * | 2002-10-11 | 2004-07-15 | Seiji Sato | Polarization means and its position holding mechanism |
US20080247462A1 (en) * | 2007-04-03 | 2008-10-09 | Gary Demos | Flowfield motion compensation for video compression |
US20090189830A1 (en) * | 2008-01-23 | 2009-07-30 | Deering Michael F | Eye Mounted Displays |
US20090278918A1 (en) * | 2008-05-07 | 2009-11-12 | Marcus Michael A | Display using bidirectionally scanned linear modulator |
US7671888B2 (en) * | 2003-08-08 | 2010-03-02 | Olympus Corporation | Stereoscopic-endoscope display control apparatus and stereoscopic endoscope system |
US20100149321A1 (en) * | 2008-12-11 | 2010-06-17 | Ushiki Suguru | Image processing apparatus, image processing method, and program |
US20100238267A1 (en) * | 2007-03-16 | 2010-09-23 | Thomson Licensing | System and method for combining text with three dimensional content |
US7817106B2 (en) * | 2004-09-15 | 2010-10-19 | Sharp Kabushiki Kaisha | Display device, viewing angle control device, and electronic apparatus |
US20110199466A1 (en) * | 2010-02-17 | 2011-08-18 | Kim Daehun | Image display device, 3d viewing device, and method for operating the same |
US8436893B2 (en) * | 2009-07-31 | 2013-05-07 | 3Dmedia Corporation | Methods, systems, and computer-readable storage media for selecting image capture positions to generate three-dimensional (3D) images |
US8441520B2 (en) * | 2010-12-27 | 2013-05-14 | 3Dmedia Corporation | Primary and auxiliary image capture devcies for image processing and related methods |
US8477425B2 (en) * | 2010-02-28 | 2013-07-02 | Osterhout Group, Inc. | See-through near-eye display glasses including a partially reflective, partially transmitting optical element |
US8488246B2 (en) * | 2010-02-28 | 2013-07-16 | Osterhout Group, Inc. | See-through near-eye display glasses including a curved polarizing film in the image source, a partially reflective, partially transmitting optical element and an optically flat film |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003061116A (ja) * | 2001-08-09 | 2003-02-28 | Olympus Optical Co Ltd | 立体映像表示装置 |
EP2315454B1 (en) * | 2002-09-27 | 2012-07-25 | Sharp Kabushiki Kaisha | 3-D image display device |
KR100932977B1 (ko) * | 2005-07-05 | 2009-12-21 | 삼성모바일디스플레이주식회사 | 입체 영상 표시 장치 |
KR100679054B1 (ko) * | 2006-02-15 | 2007-02-06 | 삼성전자주식회사 | 입체 영상을 디스플레이하는 장치 및 방법 |
CN101459761B (zh) * | 2007-12-13 | 2014-12-24 | 晨星半导体股份有限公司 | 图像处理方法及其相关装置 |
CN101282492B (zh) * | 2008-05-23 | 2010-07-21 | 清华大学 | 三维影像显示深度调整方法 |
- 2010
- 2010-03-12 TW TW099107224A patent/TWI422213B/zh active
- 2010-05-25 CN CN2010101927295A patent/CN101990108B/zh active Active
- 2010-07-28 US US12/844,953 patent/US8718331B2/en active Active
Patent Citations (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4819255A (en) * | 1986-11-19 | 1989-04-04 | Kabushiki Kaisha Toshiba | Stereo X-ray fluorography apparatus |
US6411326B1 (en) * | 1997-05-21 | 2002-06-25 | Olympus Optical Co., Ltd. | Stereo image display unit |
US6314211B1 (en) * | 1997-12-30 | 2001-11-06 | Samsung Electronics Co., Ltd. | Apparatus and method for converting two-dimensional image sequence into three-dimensional image using conversion of motion disparity into horizontal disparity and post-processing method during generation of three-dimensional image |
US20040135740A1 (en) * | 2002-10-11 | 2004-07-15 | Seiji Sato | Polarization means and its position holding mechanism |
US7671888B2 (en) * | 2003-08-08 | 2010-03-02 | Olympus Corporation | Stereoscopic-endoscope display control apparatus and stereoscopic endoscope system |
US7817106B2 (en) * | 2004-09-15 | 2010-10-19 | Sharp Kabushiki Kaisha | Display device, viewing angle control device, and electronic apparatus |
US20100238267A1 (en) * | 2007-03-16 | 2010-09-23 | Thomson Licensing | System and method for combining text with three dimensional content |
US20080247462A1 (en) * | 2007-04-03 | 2008-10-09 | Gary Demos | Flowfield motion compensation for video compression |
US20090189830A1 (en) * | 2008-01-23 | 2009-07-30 | Deering Michael F | Eye Mounted Displays |
US8134591B2 (en) * | 2008-05-07 | 2012-03-13 | Eastman Kodak Company | Display using bidirectionally scanned linear modulator |
US20090278918A1 (en) * | 2008-05-07 | 2009-11-12 | Marcus Michael A | Display using bidirectionally scanned linear modulator |
US20100149321A1 (en) * | 2008-12-11 | 2010-06-17 | Ushiki Suguru | Image processing apparatus, image processing method, and program |
US8436893B2 (en) * | 2009-07-31 | 2013-05-07 | 3Dmedia Corporation | Methods, systems, and computer-readable storage media for selecting image capture positions to generate three-dimensional (3D) images |
US20110199466A1 (en) * | 2010-02-17 | 2011-08-18 | Kim Daehun | Image display device, 3d viewing device, and method for operating the same |
US8477425B2 (en) * | 2010-02-28 | 2013-07-02 | Osterhout Group, Inc. | See-through near-eye display glasses including a partially reflective, partially transmitting optical element |
US8488246B2 (en) * | 2010-02-28 | 2013-07-16 | Osterhout Group, Inc. | See-through near-eye display glasses including a curved polarizing film in the image source, a partially reflective, partially transmitting optical element and an optically flat film |
US8441520B2 (en) * | 2010-12-27 | 2013-05-14 | 3Dmedia Corporation | Primary and auxiliary image capture devcies for image processing and related methods |
Non-Patent Citations (2)
Title |
---|
Machine translation-JP 2008-315524 Ushika et al Dec. 11, 2008. * |
Also Published As
Publication number | Publication date |
---|---|
TW201105106A (en) | 2011-02-01 |
CN101990108A (zh) | 2011-03-23 |
TWI422213B (zh) | 2014-01-01 |
US20110026776A1 (en) | 2011-02-03 |
CN101990108B (zh) | 2012-07-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8718331B2 (en) | Image detecting apparatus and method thereof | |
US9204140B2 (en) | Display device and display method | |
EP2477158B1 (en) | Multi-view rendering apparatus and method using background pixel expansion and background-first patch matching | |
JP5153940B2 (ja) | 動き補償を用いた画像の奥行き抽出のためのシステムおよび方法 | |
TWI493505B (zh) | 影像處理方法以及影像處理裝置 | |
US20140240471A1 (en) | Method, device and apparatus for generating stereoscopic images using a non-stereoscopic camera | |
JP2011004388A (ja) | 多視点映像表示装置および方法 | |
JP5257248B2 (ja) | 画像処理装置および方法、ならびに画像表示装置 | |
JPH0927969A (ja) | 複数画像の中間像生成方法及び視差推定方法および装置 | |
US20120194905A1 (en) | Image display apparatus and image display method | |
US20120087571A1 (en) | Method and apparatus for synchronizing 3-dimensional image | |
US8970670B2 (en) | Method and apparatus for adjusting 3D depth of object and method for detecting 3D depth of object | |
JP2011155431A (ja) | フレームレート変換装置および映像表示装置 | |
US9277202B2 (en) | Image processing device, image processing method, image display apparatus, and image display method | |
US20160014387A1 (en) | Multiple view image display apparatus and disparity estimation method thereof | |
US9113140B2 (en) | Stereoscopic image processing device and method for generating interpolated frame with parallax and motion vector | |
JP4892105B1 (ja) | 映像処理装置、映像処理方法および映像表示装置 | |
JP2013165306A (ja) | 立体映像表示装置 | |
JPWO2012117462A1 (ja) | 立体映像処理装置および方法ならびに立体映像表示装置 | |
US20130307941A1 (en) | Video processing device and video processing method | |
US20120249733A1 (en) | Measuring apparatus for measuring stereo video format and associated method | |
JP2013247522A (ja) | 画像処理装置及び画像処理方法 | |
JP5395884B2 (ja) | 映像処理装置、映像処理方法および映像表示装置 | |
JP5490252B2 (ja) | 立体画像処理装置、立体画像表示装置及び立体画像処理方法 | |
WO2012099544A1 (en) | A method, an apparatus and a computer program product for estimating motion between frames of a video sequence |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MSTAR SEMICONDUCTOR, INC., TAIWAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LIANG, REN KUAN;HUNG, YU-CHIEH;TSAI, MENG-CHE;SIGNING DATES FROM 20100702 TO 20100707;REEL/FRAME:024752/0068 |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551) Year of fee payment: 4 |
|
AS | Assignment |
Owner name: MEDIATEK INC., TAIWAN Free format text: MERGER;ASSIGNOR:MSTAR SEMICONDUCTOR, INC.;REEL/FRAME:050665/0001 Effective date: 20190124 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |