WO2012050366A2 - 3d image display apparatus and display method thereof - Google Patents

3d image display apparatus and display method thereof Download PDF

Info

Publication number
WO2012050366A2
WO2012050366A2 (PCT/KR2011/007595)
Authority
WO
WIPO (PCT)
Prior art keywords
display element
display
eye
depth value
image
Prior art date
Application number
PCT/KR2011/007595
Other languages
French (fr)
Other versions
WO2012050366A3 (en)
Inventor
Su-Jin Yeon
Sang-Il Lee
Hye-Won Lee
Bo-Mi Kim
Moon-Sik Jeong
Yeon-Hee Lee
Original Assignee
Samsung Electronics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd. filed Critical Samsung Electronics Co., Ltd.
Priority to EP11832753.5A priority Critical patent/EP2628304A4/en
Priority to CN201180049444.9A priority patent/CN103155579B/en
Priority to JP2013533768A priority patent/JP2014500642A/en
Priority to BR112013008559A priority patent/BR112013008559A2/en
Priority to AU2011314521A priority patent/AU2011314521B2/en
Priority to RU2013121611/08A priority patent/RU2598989C2/en
Publication of WO2012050366A2 publication Critical patent/WO2012050366A2/en
Publication of WO2012050366A3 publication Critical patent/WO2012050366A3/en

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/361Reproducing mixed stereoscopic images; Reproducing mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/172Processing image signals image signals comprising non-image signal components, e.g. headers or format information
    • H04N13/183On-screen display [OSD] information, e.g. subtitles or menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/128Adjusting depth or disparity
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/339Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using spatial multiplexing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/398Synchronisation thereof; Control thereof

Definitions

  • The present invention relates to a Three-Dimensional (3D) image display apparatus and a display method thereof, and more particularly to a 3D image display apparatus and a display method thereof which can provide a 3D Graphical User Interface (GUI).
  • 3D stereoscopic image technology has very diverse application fields, such as information communication, broadcasting, medical treatment, educational training, military affairs, games, animation, virtual reality, Computer-Aided Design (CAD), industrial technology, and the like, and may be the core basic technology of the next-generation 3D stereoscopic multimedia information communication which is commonly required in these fields.
  • A 3D effect occurs through the complex interaction of the change in thickness of the crystalline lens according to the position of the observed object, the difference in angle between both eyes and the object, the difference in position and shape of the object as seen by the left and right eyes, the disparity occurring in accordance with the movement of the object, and other effects caused by various kinds of psychology and memory.
  • The binocular disparity that occurs due to the distance of about 6-7 cm between the two human eyes may be the most important factor. Due to the binocular disparity, the two eyes see the same object at slightly different angles, and due to this difference in angle, a different image is formed on each eye. These two images are transferred to the viewer’s brain through the retinas, and the brain fuses the two kinds of information so that the viewer can perceive the original 3D stereoscopic image.
  • a 3D image is composed of a left-eye image that is recognized by a left eye and a right-eye image that is recognized by a right eye. Also, the 3D display apparatus expresses a 3D effect of an image using the disparity between the left-eye image and the right-eye image. As described above, an environment in which a 3D image is implemented by alternately displaying the left-eye image and the right-eye image is called a stereo 3D image.
  • FIGS. 1 and 17 are diagrams explaining problems in the related art.
  • FIG. 1 shows a general 2D graphic (for example, 2.5D or 3D) User Interface (UI) screen, which expresses a difference in selection (attention) by giving variety to visual graphic elements, such as a position, size, color, and the like.
  • FIG. 17 shows a method of expressing a UI through stereo 3D, in which an object is expressed with a depth value in the Z-axis direction by utilizing the difference in viewpoint between both eyes that occurs when the object on the screen is viewed, and attention information between an element selected on the screen and the remaining elements is stereoscopically expressed through such a depth value.
  • However, when a 3D image is displayed on a background UI with depth, the 3D image may appear differently than it should, or may cause visual fatigue to a user.
  • an aspect of the present invention provides a 3D image display apparatus and a display method thereof, which can arrange and provide depth values among 3D display elements.
  • a display method of a 3D image display apparatus includes displaying a first display element having a first depth value; adjusting at least one depth value of the first display element and a second display element having a second depth value to be displayed in superimposition with or displayed on the first display element in a state where the first display element having the first depth value is displayed; and displaying the first display element and the second display element in superimposition with the first display element or on the first display element, of which the depth value has been adjusted, wherein at least one of the first display element and the second display element is displayed with an adjusted depth value.
  • a 3D image display apparatus includes a display processing unit for generating a first display element having a first depth value and a second display element having a second depth value; a display unit for displaying the generated first and second display elements; and a control unit for adjusting and displaying at least one depth value of the first display element and the second display element having the second depth value to be displayed in superimposition with or displayed on the first display element in a state where the first display element having the first depth value is displayed, wherein at least one of the first display element and the second display element is displayed with an adjusted depth value.
  • the display state of the 3D display elements can be visually stabilized, and a user’s attention and recognition with respect to the 3D display elements can be heightened.
  • FIGS. 1 and 17 are diagrams explaining problems in the related art
  • FIG. 2 is a diagram illustrating a 3D image providing system according to an embodiment of the present invention
  • FIG. 3 is a block diagram illustrating the configuration of a display apparatus according to an embodiment of the present invention.
  • FIG. 4 is a diagram illustrating the disparity between a left-eye image and a right-eye image according to an embodiment of the present invention
  • FIG. 18 is a diagram illustrating the relationship between the disparity and the depth value according to an embodiment of the present invention.
  • FIGS. 5, 19, 20 are diagrams illustrating cases to which a display method according to an embodiment of the present invention is applied;
  • FIGS. 6, 21, 22 and 7, 23, 24 are diagrams illustrating a display method according to an embodiment of the present invention.
  • FIGS. 8, 25, 26 and 9, 27, 28 are diagrams illustrating a display method according to another embodiment of the present invention.
  • FIG. 10 is a flowchart illustrating a display method of a 3D image display apparatus according to an embodiment of the present invention.
  • FIG. 11 is a diagram illustrating examples of a 3D image to which a display method according to an embodiment of the present invention is applied.
  • FIGS. 12 and 29 are diagrams illustrating a method of adjusting disparity information according to an embodiment of the present invention.
  • FIGS. 13 and 30 are flowcharts illustrating methods of adjusting depth according to embodiments of the present invention.
  • FIGS. 14 and 31 are diagrams illustrating methods of adjusting a set of left-eye and right-eye images and a reference surface in accordance with a previously-stored imaging distance according to embodiments of the present invention.
  • FIGS. 15, 32, 33 are diagrams illustrating examples to which methods of adjusting depth according to embodiments of the present invention are applied.
  • FIGS. 16, 34, 35 are diagrams illustrating other examples to which methods of adjusting depth according to embodiments of the present invention are applied.
  • FIG. 2 is a diagram illustrating a 3D image providing system according to an embodiment of the present invention.
  • the 3D image providing system includes a display apparatus 100 for displaying a 3D image on a display and 3D glasses 200 for viewing the 3D image.
  • the 3D image display apparatus 100 may be implemented to display a 3D image or to display both a 2D image and a 3D image.
  • In the case where the 3D image display apparatus 100 displays a 2D image, the same method as in an existing 2D display apparatus may be used, while in the case where the 3D image display apparatus 100 displays a 3D image, the received 2D image may be converted into a 3D image and the converted 3D image may be displayed on the screen.
  • a 3D image that is received from an imaging device such as a camera or a 3D image that is captured by a camera, edited/processed in a broadcasting station, and transmitted from the broadcasting station may be received and processed to be displayed on the screen.
  • the 3D image display apparatus 100 can process a left-eye image and a right-eye image with reference to the format of the 3D image, and make the processed left-eye image and right-eye image be time-divided and alternately displayed.
  • a user can view the 3D image through alternate seeing of the left-eye image and the right-eye image that are displayed on the display apparatus 100 with the left eye and the right eye using the 3D glasses 200.
  • the observer recognizes minutely different image information through the left eye and the right eye.
  • the observer acquires depth information on the 3D object by combining the minutely different image information, and feels the 3D effect.
  • the 3D image display apparatus 100 enables the observer to feel the 3D image by providing images that the left eye and the right eye of the observer can see to the observer when the observer actually observes the 3D object.
  • a difference in images that the left eye and the right eye of the observer see is called disparity. If such disparity has a positive value, the observer feels as if the 3D object is positioned closer to a predetermined reference surface in a direction of the observer, and if the disparity has a negative value, the observer feels as if the 3D object is spaced apart in an opposite direction to the observer.
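The sign convention described above can be sketched as a small illustration (the function name and string labels are hypothetical, not from the patent):

```python
def perceived_position(disparity: float) -> str:
    """Map the sign of the disparity to where the observer perceives
    the 3D object relative to the predetermined reference surface.

    Positive disparity -> the object appears closer to the observer;
    negative disparity -> the object appears farther away;
    zero disparity     -> the object appears on the reference surface.
    """
    if disparity > 0:
        return "in front of reference surface (toward observer)"
    if disparity < 0:
        return "behind reference surface (away from observer)"
    return "on reference surface"

print(perceived_position(2.0))   # object perceived toward the observer
print(perceived_position(-1.5))  # object perceived away from the observer
```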
  • the 3D glasses 200 may be implemented by active type shutter glasses.
  • The shutter glass type corresponds to a display method using the disparity of both eyes, which enables the observer to perceive depth, through a brain action, from images observed at different angles, by synchronizing the image output of the display apparatus with the on/off operation of the left and right shutters of the 3D glasses.
  • the principle of the shutter glass type is to synchronize left and right image frames that are reproduced in the 3D image display apparatus 100 with a shutter mounted on the 3D glasses 200. That is, as left and right glasses of the 3D glasses are selectively opened and closed according to left and right image sync signals of the 3D image display apparatus 100, the 3D stereoscopic image is provided.
  • the 3D image display apparatus 100 can display a 3D display element, for example, a 3D UI (particularly, a GUI ) on the screen together with the 3D image.
  • The GUI is a means for inputting a user command through selection of an icon or menu that is displayed on the display.
  • the user may move a cursor with reference to a menu, a list, an icon, and the like, which are displayed on the display through the GUI, and select an item on which the cursor is located.
  • Because the 3D image display apparatus 100 can implement a 3D effect through adjustment of only the disparity between a left-eye image and a right-eye image, it can provide the 3D GUI without separate image processing (scaling, texture, and perspective effect processing).
  • FIG. 3 is a block diagram illustrating the configuration of a display apparatus according to an embodiment of the present invention.
  • The 3D image display apparatus 100 includes an image receiving unit 110, an image processing unit 120, a display unit 130, a control unit 140, a UI processing unit 150, a storage unit 160, a user interface unit 170, and a sync signal processing unit 180.
  • Although FIG. 2 illustrates that the 3D image display apparatus 100 is a 3D TeleVision (TV), this is merely exemplary, and the 3D image display apparatus 100 according to an embodiment of the present invention may be implemented by any device that can display 3D UI elements, such as a digital TV, a mobile communication terminal, a mobile telephone, a Personal Digital Assistant (PDA), a smart phone, a Digital Multimedia Broadcasting (DMB) phone, an MPEG Audio Layer III (MP3) player, an audio appliance, a portable TV, or a digital camera.
  • the image receiving unit 110 receives and demodulates a 2D or 3D image signal that is received by wire or wirelessly from a broadcasting station or a satellite. Further, the image receiving unit 110 may be connected to an external appliance such as a camera to receive a 3D image from the external appliance.
  • The external appliance may be connected wirelessly or by wire through an interface such as S-Video, component, composite, D-Sub, Digital Visual Interface (DVI), or High-Definition Multimedia Interface (HDMI). Since the 2D image processing method is well known to those skilled in the art, the following explanation will focus on the 3D image processing method.
  • a 3D image is an image composed of at least one frame.
  • One frame may include a left-eye image and a right-eye image, or each frame may be composed of a left-eye frame or a right-eye frame. That is, a 3D image is an image that is generated according to one of diverse 3D formats.
  • the 3D image received in the image receiving unit 110 may be in diverse formats, and particularly may be in a format according to one of a general top-bottom type, a side-by-side type, a horizontal interleave type, a vertical interleave type or checker board type, and a sequential frame.
  • the image receiving unit 110 transfers the received 2D image or 3D image to the image processing unit 120.
  • the image processing unit 120 performs signal processing, such as video decoding, format analysis, and video scaling, and a task of GUI addition and the like, with respect to the 2D image or 3D image that is received in the image receiving unit 110.
  • The image processing unit 120 generates a left-eye image and a right-eye image, which correspond to the size of one screen (for example, 1920×1080), using the format of the 2D image or 3D image that is input to the image receiving unit 110.
  • For example, if the format of the 3D image is the top-bottom type, the side-by-side type, the horizontal interleave type, the vertical interleave or checker board type, or the sequential frame type, the image processing unit 120 generates the left-eye and right-eye images to be provided to the user by extracting a left-eye image portion and a right-eye image portion from each image frame and performing expansion scaling or interpolation of the extracted left-eye and right-eye images.
  • The image processing unit 120 extracts the left-eye image or the right-eye image from each frame and prepares to provide the extracted image to the user.
  • information on the format of the input 3D image may be included in the 3D image signal or may not be included therein.
  • the image processing unit 120 extracts the information on the format by analyzing the 3D image, and processes the received 3D image according to the extracted information.
  • the image processing unit 120 processes the received 3D image according to the format input from the user, or processes the received 3D image according to a preset format.
  • The image processing unit 120 performs time division of the extracted left-eye image and right-eye image and alternately transfers the time-divided left-eye image and right-eye image to the display unit 130. That is, the image processing unit 120 transfers the left-eye image and the right-eye image to the display unit 130 in the temporal order of “left-eye image (L1) → right-eye image (R1) → left-eye image (L2) → right-eye image (R2) → ...”.
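The temporal ordering described above can be sketched as a minimal interleaving routine (the function name is an assumption for illustration):

```python
def time_division(left_frames, right_frames):
    """Interleave left-eye and right-eye frames in the order
    L1, R1, L2, R2, ... as transferred to the display unit."""
    sequence = []
    for left, right in zip(left_frames, right_frames):
        sequence.append(left)
        sequence.append(right)
    return sequence

print(time_division(["L1", "L2"], ["R1", "R2"]))  # ['L1', 'R1', 'L2', 'R2']
```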
  • the image processing unit 120 may insert an On-Screen Display (OSD) image generated by an OSD processing unit 150 into a black image, or process and provide the OSD image itself as one image.
  • OSD On-Screen Display
  • the display unit 130 alternately outputs the left-eye image and the right-eye image output from the image processing unit 120 to the user.
  • the control unit 140 controls the whole operation of the display apparatus 100 according to a user command transferred from the user interface unit 170 or a preset option.
  • control unit 140 controls the image receiving unit 110 and the image processing unit 120 to receive the 3D image, separate the received 3D image into a left-eye image and a right-eye image, and perform scaling or interpolation of the separated left-eye image and right-eye image with a size in which the separated left-eye image and right-eye image can be displayed on one screen.
  • control unit 140 controls the display unit 130 to be switched so that the polarization direction of the image that is provided through the display unit 130 coincides with the left-eye image or the right-eye image.
  • The control unit 140 may control the operation of the UI processing unit 150, to be described later.
  • the UI processing unit 150 may generate a display element that is displayed to overlap the 2D or 3D image output to the display unit 130, and insert the generated display element into the 3D image.
  • the UI processing unit 150 may set and generate depth values that are different according to the execution order of display elements such as, for example, UI elements, attributes thereof, and the like.
  • The depth value is a numerical value that indicates the degree of perceived depth in the 3D image.
  • The 3D image can express depth corresponding not only to the up, down, left, and right positions on the screen but also to the forward and backward positions along the viewer’s line of sight.
  • the depth feeling is determined by the disparity between the left-eye image and the right-eye image.
  • the depth value of the 3D content list GUI corresponds to the disparity between the left-eye GUI and the right-eye GUI. The relationship between the depth value and the disparity will be described in more detail with reference to FIGS. 5 and 6 later.
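Assuming a simple linear relationship between the depth value and the pixel disparity (the scale factor below is an illustrative assumption, not a value from the patent), the correspondence can be sketched as:

```python
# Hypothetical scale factor: pixels of disparity per unit of depth value.
PIXELS_PER_DEPTH_UNIT = 2

def disparity_for_depth(depth_value: int) -> int:
    """Horizontal pixel offset between the left-eye and right-eye GUIs
    that yields the given depth value (toy linear model)."""
    return depth_value * PIXELS_PER_DEPTH_UNIT

def depth_for_disparity(disparity_px: int) -> float:
    """Inverse mapping: recover the depth value from a pixel disparity."""
    return disparity_px / PIXELS_PER_DEPTH_UNIT

# Round trip: a depth value maps to a disparity and back unchanged.
assert depth_for_disparity(disparity_for_depth(5)) == 5
```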
  • The UI elements may be displayed to overlap the display image, as a screen that displays characters or figures, such as a menu screen, a caution expression, the time, or the channel number, on the display screen.
  • a caution expression may be displayed as a UI element in an OSD form according to a preset option or event.
  • When a user operates input devices, such as an operation panel or a remote controller, in order to select a desired function, menus such as a main menu, a sub-menu, and the like may be displayed on the display screen as UI elements in an OSD form.
  • Such menus may include option items that can be selected in the display apparatus or items that can adjust the function of the display apparatus.
  • the UI processing unit 150 may perform tasks of 2D/3D conversion of UI elements, transparency, color, size, shape and position adjustment, highlight, animation effect, and the like, under the control of the control unit 140.
  • the control unit 140 may calculate a value of the relative depth of a second display element to a first display element, may detect a set of left-eye and right-eye images that correspond to the calculated relative depth value from among a plurality of sets of previously-stored left-eye and right-eye images that correspond to different depth values, and may replace the left-eye and right-eye images of the second display element with the detected set of left-eye and right-eye images.
  • control unit 140 may replace one of the left-eye and right-eye images of the second display element with another image.
  • control unit 140 may adjust the distance, on a screen, between the left-eye and right-eye images of the second display element in accordance with the distance between the left-eye and right-eye images of the first display element, and may display the distance-adjusted left-eye and right-eye images.
  • the first display element may be a background element
  • the second display element may be a content element on the background element
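The replacement of the second display element's images with a previously-stored left-eye/right-eye set matching the calculated relative depth can be sketched as follows (the dictionary layout and function name are assumptions, not from the patent):

```python
def select_image_set(first_depth, second_depth, stored_sets):
    """Pick the previously-stored (left, right) image set whose depth
    value is closest to the relative depth of the second display
    element with respect to the first.

    'stored_sets' maps a depth value to a (left_image, right_image) pair.
    """
    relative_depth = second_depth - first_depth
    best_depth = min(stored_sets, key=lambda d: abs(d - relative_depth))
    return stored_sets[best_depth]

# Toy catalog of pre-rendered image pairs keyed by depth value.
stored = {0: ("L0", "R0"), 2: ("L2", "R2"), 4: ("L4", "R4")}
print(select_image_set(first_depth=2, second_depth=6, stored_sets=stored))
# relative depth 4 -> stored set ('L4', 'R4')
```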
  • The storage unit 160 is a storage medium in which various kinds of programs required to operate the 3D image display apparatus 100 are stored, and may be implemented by a memory, a Hard Disk Drive (HDD), and the like.
  • The storage unit may include a Read-Only Memory (ROM) for storing programs for performing the operation of the control unit 140, a Random Access Memory (RAM) for temporarily storing data generated during the operation of the control unit 140, and the like.
  • the storage unit 160 may further include an Electrically Erasable and Programmable ROM (EEPROM) for storing various kinds of reference data.
  • the user interface unit 170 transfers a user command that is received from input means such as a remote controller, an input panel, or the like, to the control unit 140.
  • the input panel may be a touch pad, a key pad that is composed of various kinds of function keys, numeral keys, special keys, character keys, and the like, or a touch screen.
  • the sync signal processing unit 180 generates a sync signal for alternately opening the left-eye shutter glass and the right-eye shutter glass of the 3D glasses 200 to match the display timing of the left-eye image and the right-eye image, and transmits the sync signal to the 3D glasses 200. Accordingly, the 3D glasses 200 are alternately opened and closed, so that the left-eye image is displayed on the display unit 130 in the left-eye open timing of the 3D glasses 200 and the right-eye image is displayed on the display unit 130 in the right-eye open timing of the 3D glasses 200.
  • the sync signal may be transmitted in the form of infrared rays.
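The synchronization between displayed frames and shutter openings can be sketched as a simple schedule, pairing each displayed image with the shutter that is open at that moment (names are illustrative):

```python
def shutter_schedule(num_frames: int):
    """Build (displayed_image, open_shutter) pairs: the left-eye image
    is shown while the left shutter glass is open, and the right-eye
    image while the right shutter glass is open."""
    schedule = []
    for i in range(1, num_frames + 1):
        schedule.append((f"L{i}", "left_open"))
        schedule.append((f"R{i}", "right_open"))
    return schedule

print(shutter_schedule(2))
```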
  • the control unit 140 controls the whole operation of the 3D image display apparatus 100 according to a user operation that is transferred from the user interface unit 170.
  • control unit 140 controls the OSD processing unit 150 to generate an OSD that corresponds to the user operation that is transferred from the user interface unit 170, and controls the sync signal processing unit to generate and transmit the sync signal that is synchronized with the output timing of the left-eye image and the right-eye image.
  • control unit 140 can operate to adjust at least one depth value of the first UI element and the second UI element using the depth value of the first UI element.
  • control unit 140 can adjust a difference in depth values between the first UI element and the second UI element in consideration of the respective depth values of the first UI element and the second UI element.
  • The control unit 140 can change the second depth value of the second UI element to a preset depth value, and then change the first depth value of the first UI element by the same amount as the change applied to the second UI element.
  • the preset depth value may be a value that is smaller than the second depth value.
  • the preset depth value may include a depth value of a display screen.
  • the control unit 140 can adjust the depth values of the plurality of UI elements which have been executed to be displayed to the same depth value.
  • the adjusted depth value may be smaller than the depth value of the newly executed UI element.
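The depth adjustment described above, which moves the second UI element to a preset depth (for example, the depth of the display screen) and shifts the first UI element by the same amount so that their relative depth is preserved, can be sketched as a minimal illustration under assumed integer depth values:

```python
SCREEN_DEPTH = 0  # preset depth value of the display screen (an assumption)

def adjust_depths(first_depth, second_depth, preset=SCREEN_DEPTH):
    """Move the second UI element to the preset depth and shift the
    first UI element by the same delta, preserving relative depth."""
    delta = preset - second_depth
    return first_depth + delta, preset

new_first, new_second = adjust_depths(first_depth=2, second_depth=5)
print(new_first, new_second)  # relative difference of 3 is preserved
```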
  • The control unit 140 can restore the adjusted depth values of the first and second UI elements to their original depth values if the superimposed display of the first and second UI elements is canceled.
  • The UI element that is executed to be displayed in superimposition with an already-displayed UI element may be a UI element having a feedback property, whose event contents are related to the already executed UI element, or a UI element having at least one of alarm, caution, and popup properties, whose event contents are not related to the already executed UI element.
  • The 3D glasses 200 enable a user to view the left-eye image and the right-eye image through the left eye and the right eye, respectively, by alternately opening and closing the left-eye shutter glass and the right-eye shutter glass according to the sync signal received from the 3D image display apparatus 100.
  • the display unit 130 may include detailed configurations, such as a panel driving unit (not illustrated), a display panel unit (not illustrated), a backlight driving unit (not illustrated), and a backlight emitting unit (not illustrated), and the detailed explanation thereof will be omitted.
  • The depth value is determined by the disparity between the left-eye image and the right-eye image, and this will now be described in detail with reference to FIG. 4.
  • FIG. 4 is a diagram illustrating the disparity between the left-eye image and the right-eye image according to an embodiment of the present invention.
  • FIG. 4 illustrates that an object 410 of the left-eye image and an object 420 of the right-eye image overlap each other. However, in the case of an actual display on the screen, the object 410 of the left-eye image and the object 420 of the right-eye image are alternately displayed.
  • the degree of mismatch between the object 410 of the left-eye image and the object 420 of the right-eye image is called the disparity.
  • FIG. 18 illustrates the relationship between the disparity and the depth value according to an embodiment of the present invention.
  • FIG. 18 illustrates the disparity that occurs between a TV screen and a user’s eyes.
  • the user’s eyes perceive disparity according to the distance between the two eyes.
  • an object that is closer to the user has a larger disparity.
  • the left-eye image and the right-eye image are displayed in one position without disparity.
  • the left-eye image 440 and the right-eye image 445 are displayed in positions which are spaced apart from each other by a disparity of “1”.
  • the left-eye image 440 and the right-eye image 445 are displayed in positions which are spaced apart from each other by a disparity of “2”.
  • the 3D TV 100 can set the depth value of the 3D GUI using the disparity between the left-eye GUI and the right-eye GUI without separate image processing.
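The disparity-to-depth relationship described above can be sketched as follows. This is an illustrative sketch only, not code from the disclosure; the function name `eye_positions` is hypothetical, and the sign convention is abstracted away.

```python
def eye_positions(x, disparity):
    """Return the horizontal screen positions of the left-eye and
    right-eye copies of an element centered at x.

    A disparity of 0 places both copies at the same position (the
    screen plane); a larger disparity gives a larger depth value,
    making the element appear closer to the viewer."""
    half = disparity / 2.0
    return (x - half, x + half)  # (left-eye x, right-eye x)

# Disparities of "0" and "2" from the description above:
assert eye_positions(100, 0) == (100.0, 100.0)  # on the screen plane
assert eye_positions(100, 2) == (99.0, 101.0)   # spaced apart by 2
```

Because only a horizontal offset is involved, the depth value of a GUI element can be set by this offset alone, with no separate image processing.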
  • although FIGS. 5A to 9C illustrate a UI in a 2D state, it is to be noted that they actually indicate a stereo 3D GUI that is implemented by alternately displaying a left-eye GUI and a right-eye GUI.
  • FIGS. 5, 19, 20 are diagrams illustrating cases to which a display method according to an embodiment of the present invention is applied.
  • UIs “A” and “B” are positioned with depth values which are equal to or larger than that of the display screen in the Z-axis direction through the pixel disparity.
  • a left-eye image that is projected from a left-eye pixel is formed as an image having a predetermined disparity from a right-eye image
  • the right-eye image that is projected from a right-eye pixel is formed as an image having a predetermined disparity from the left-eye image.
  • the observer can feel the 3D effect by obtaining the same depth information as in the case where the observer sees the actual 3D object with the left eye and the right eye.
  • since the currently selected UI “B” is executed later than the UI “A”, it may be positioned at an upper end of the display screen.
  • a UI “B1” that is executed by a user’s input on the selected UI “B” may be positioned at the upper end of the display screen with a depth value that is larger than that of the UI “A” or “B” in the Z-axis direction.
  • the UI “B1” is a kind of UI event that is related to the UI “B”, and may be a UI element having a feedback character as a result of execution according to the user’s input.
  • an additionally executed UI “C” may be positioned at the upper end of the display screen with a depth value that is larger than that of the UI “A” or “B” in the Z-axis direction.
  • the UI “C” may be a new UI element that is executed through generation of a new window, such as an alarm or caution message window or a popup form, as a separate UI event that is not related to the UI “A” or “B”.
  • FIGS. 6, 21, 22, 7, 23 and 24 are diagrams illustrating a display method according to an embodiment of the present invention.
  • FIGS. 6, 21, 22, 7, 23 and 24 are related to a display method in the case where the UI execution screen is changed from the UI execution screen as illustrated in FIG. 5 to the UI execution screen as illustrated in FIG. 19.
  • FIGS. 6, 21 and 22 illustrate front views of a stereo 3D screen
  • FIGS. 7, 23 and 24 illustrate top views.
  • the state illustrated in FIGS. 6, 21 and 22 corresponds to the UI execution state illustrated in FIGS. 7, 23 and 24.
  • a UI element 1-1 611-1 having a predetermined depth value as illustrated in FIG. 21 may be executed in superimposition according to a user’s command or a preset option.
  • the case where the UI element 1-1 611-1 executed in superimposition is a UI element that is related to the already executed UI element A (for example, a UI element having the feedback characteristic to the UI element A-1 611) will be described as an example.
  • the depth values of the already executed UI elements A, B, and C and the UI element 1-1 611-1 to be newly executed may be adjusted and displayed as illustrated in FIG. 22.
  • Respective UI elements illustrated in FIG. 7 correspond to respective UI elements illustrated in FIG. 6, and it can be confirmed that the illustrated UI element A 610 (in particular, UI element A-1 611) is being executed.
  • Respective UI elements illustrated in FIG. 23 correspond to respective UI elements illustrated in FIG. 21, and it can be confirmed that the illustrated UI element 1-1 611-1 having a predetermined depth value is being executed in superimposition with the UI element A-1 611.
  • Respective UI elements illustrated in FIG. 24 correspond to respective UI elements illustrated in FIG. 22, and the illustrated UI element 1-1 611-1 executed in superimposition can be moved to a predetermined depth value Z(*), and then the already executed UI elements can be moved by the amount by which the UI element 1-1 611-1 has been moved, that is, Z(1-1)-Z(*).
  • the UI element 1-1 611-1 that is lastly input by the user maintains its character of having the depth value at the uppermost end of the current display screen.
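The adjustment described above can be sketched as follows. This is a hypothetical illustration, not the disclosed implementation: the newly superimposed element is moved to a preset depth Z(*), and every already executed element is shifted by the same amount, Z(new)-Z(*), so the new element stays at the uppermost depth.

```python
def adjust_depths(existing, z_new, z_preset):
    """existing: dict mapping element name -> current depth value.
    Move the new element to z_preset and shift all already executed
    elements by the same amount the new element moved.
    Returns (adjusted existing depths, adjusted new depth)."""
    shift = z_new - z_preset  # the amount the new element moves, Z(new)-Z(*)
    adjusted = {name: z - shift for name, z in existing.items()}
    return adjusted, z_preset

existing = {"A": 1, "B": 2, "C": 3}
adjusted, z_11 = adjust_depths(existing, z_new=4, z_preset=2)
# relative order is preserved and the new element remains uppermost
assert z_11 == 2 and adjusted == {"A": -1, "B": 0, "C": 1}
```

Because all elements move by the same amount, the relative depth ordering between the already executed elements is kept intact.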
  • FIGS. 8, 25, 26, 9, 27 and 28 are diagrams illustrating a display method according to another embodiment of the present invention.
  • FIGS. 8, 25, 26, 9, 27 and 28 are related to the display method in the case where the UI execution display screen is changed from the UI execution display screen as illustrated in FIG. 5 to the UI execution display screen as illustrated in FIG. 20.
  • FIGS. 8, 25 and 26 illustrate front views of a stereo 3D screen
  • FIGS. 9, 27 and 28 illustrate top views.
  • the state illustrated in FIGS. 8, 25 and 26 corresponds to the UI execution state illustrated in FIGS. 9, 27 and 28.
  • a UI element 840 having a predetermined depth value as illustrated in FIG. 25 may be executed in superimposition according to a user’s command or a preset option.
  • the case where the UI element 840 executed in superimposition is a UI element that is not related to the already executed UI elements A, B, and C 810, 820, and 830 will be described as an example.
  • a UI element D may be a new UI element that is executed through generation of a new window, such as an alarm or caution message window or a popup form, as a separate UI event that is not related to the already executed UI elements A, B, and C.
  • the depth values of the already executed UI elements A, B, and C 810, 820, and 830 and the UI element 840 to be newly executed may be adjusted and displayed as illustrated in FIG. 26.
  • Respective UI elements illustrated in FIG. 9 correspond to respective UI elements illustrated in FIG. 8, and it can be confirmed that the illustrated UI element A 810 is being executed.
  • Respective UI elements illustrated in FIG. 27 correspond to respective UI elements illustrated in FIG. 25, and it can be confirmed that the illustrated UI element 840 having a predetermined depth value is being executed in superimposition with the UI elements A, B, and C 810, 820, and 830.
  • Respective UI elements illustrated in FIG. 28 correspond to respective UI elements illustrated in FIG. 26, and the already executed UI elements A, B, and C 810, 820, and 830, except for the illustrated UI element 840 executed in superimposition, can be moved to the same depth value Z(#).
  • the UI elements A, B, and C 810, 820, and 830 which are merged with the same depth value Z(#) maintain the 3D UI character having the predetermined depth value except for the case where Z(#) is “0”.
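The merging adjustment described above can be sketched as follows; this is an illustrative sketch only, with hypothetical names, not the disclosed implementation. When an unrelated element such as an alarm popup is superimposed, all already executed elements are merged to one common depth Z(#) that stays smaller than the new element's depth.

```python
def merge_depths(existing, z_new, z_merged):
    """Give every already executed element the same depth z_merged;
    z_merged must stay below z_new so the superimposed element (e.g.
    an alarm popup) remains in front of the merged elements."""
    assert z_merged < z_new
    return {name: z_merged for name in existing}

merged = merge_depths({"A": 1, "B": 2, "C": 3}, z_new=4, z_merged=1)
assert set(merged.values()) == {1}  # all merged to Z(#) = 1
```

As noted above, the merged elements keep a 3D UI character as long as Z(#) is nonzero; with Z(#) equal to 0 they would collapse onto the screen plane.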
  • the depth value is adjusted in the method as illustrated in FIGS. 6, 21, 22, 7, 23 and 24 in the case of the UI that is related to the currently executed UI element, while the depth value is adjusted in the method as illustrated in FIGS. 8, 25, 26, 9, 27 and 28 in the case of the UI that is not related to the currently executed UI element.
  • this is merely exemplary, and it is possible to apply the display method as illustrated in FIGS. 6, 21, 22, 7, 23 and 24 and the display method as illustrated in FIGS. 8, 25, 26, 9, 27 and 28 regardless of the character of the UI element.
  • FIG. 10 is a flowchart illustrating a display method of a 3D image display apparatus according to an embodiment of the present invention.
  • a first UI element having a first depth value is displayed (S1010), and a second UI element having a second depth value is executed to be displayed in superimposition with the first UI element (S1020).
  • the depth value may correspond to the disparity between the left-eye UI and the right-eye UI.
  • At least one depth value of the first UI element and the second UI element is adjusted using the depth value of the first UI element (S1030).
  • At least one of the first UI element and the second UI element, of which the depth value has been adjusted in step S1030, is displayed (S1040).
  • In step S1030, the difference in depth values between the first UI element and the second UI element can be adjusted in consideration of the respective depth values of the first UI element and the second UI element.
  • In step S1030, the second depth value of the second UI element can be changed to a preset depth value, and then the first depth value of the first UI element can be changed by the amount by which the depth value of the second UI element has been changed.
  • the preset depth value may be a value that is smaller than the second depth value of the second UI element.
  • the preset depth value may include the depth value of the display screen.
  • a third UI element having a third depth value may be displayed before execution of the second UI element.
  • the depth value-adjusting step may adjust the first and third depth values of the first and third UI elements to the same depth value if the second UI element having the second depth value is executed to be displayed in superimposition with the first UI element and the third UI element.
  • the adjusted same depth value of the first and third UI elements may be a value that is smaller than the second depth value of the second UI element.
  • the adjusted depth value of the first and second UI elements can be adjusted to the original depth values if the execution of the superimposition display of the first and second UI elements is canceled.
  • the second UI element may be a UI element having a feedback property that includes event contents related to the first UI element, or a UI element having at least one property of alarm, caution, and popup that include event contents which are not related to the first UI element.
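Steps S1010 to S1040, together with the restoration of original depth values on cancellation described above, can be sketched as a minimal depth manager. The class and its structure are hypothetical illustrations, not the disclosed control unit.

```python
class DepthManager:
    def __init__(self):
        self.depths = {}  # element -> current depth value
        self.saved = {}   # element -> original depth before adjustment

    def display(self, element, depth):               # S1010
        self.depths[element] = depth

    def superimpose(self, element, depth, preset):   # S1020-S1040
        self.saved = dict(self.depths)               # remember originals
        shift = depth - preset                       # amount the new element moves
        for name in self.depths:                     # shift first element(s)
            self.depths[name] -= shift
        self.depths[element] = preset                # new element at preset depth

    def cancel(self, element):                       # superimposition canceled
        del self.depths[element]
        self.depths.update(self.saved)               # restore original depths

m = DepthManager()
m.display("first", 2)
m.superimpose("second", 4, preset=3)
assert m.depths == {"first": 1, "second": 3}
m.cancel("second")
assert m.depths == {"first": 2}
```

The preset depth corresponds to the preset value described above, which may be smaller than the second depth value or equal to the depth of the display screen.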
  • FIG. 11 is a diagram illustrating an example of a 3D image to which a display method according to an embodiment of the present invention is applied.
  • a display method according to an embodiment of the present invention may be applied when content B with depth or a 3D image including content B is displayed over a background UI A with depth.
  • FIGS. 12 and 29 are diagrams illustrating a method of adjusting disparity information according to an embodiment of the present invention.
  • the content 1212 or the 3D image 1213 may appear differently than intended.
  • the content 1212 or the 3D image 1213 may be displayed as recessed into the background UI 1211 or being distant from the background UI 1211.
  • the left-eye and right-eye images 1214 and 1215 of the 3D image 1213 may be replaced with left-eye and right-eye images 1214-1 and 1215-1, respectively, which have been generated for adjusting the disparity.
  • the left-eye and right-eye images 1214-1 and 1215-1 may be images that are created considering the depth of the background UI 1211 and the depth of the 3D image 1213.
  • the left-eye and right-eye images 1214-1 and 1215-1 may be the left-eye and right-eye images of a 3D image with a depth of (z+1).
  • on the background UI 1211, which has a depth of +1, the left-eye and right-eye images of a 3D image 1213-1 that replaces the 3D image 1213 may appear to protrude beyond the background UI 1211 by as much as +z.
  • FIGS. 13 and 30 are flowcharts illustrating methods of adjusting depth according to embodiments of the present invention.
  • an event for displaying a 3D image (the current 3D image) on a background UI with a Z-axis depth value may occur.
  • a 3D photo may be displayed over a frame with a Z-axis depth value.
  • a plurality of sets of left-eye and right-eye images of an object that correspond to different distances from the object may be called.
  • the plurality of sets of left-eye and right-eye images may be sets of left-eye and right-eye images that are captured at different distances from the object by a 3D camera.
  • a set of left-eye and right-eye images having a relative depth, on the Z-axis, to the background UI may be searched for from the plurality of sets of left-eye and right-eye images.
  • the Z-axis depth of the background UI is +1
  • a set of left-eye and right-eye images of the object that are captured at a distance of +1 may be searched for from the plurality of sets of left-eye and right-eye images.
  • a set of left-eye and right-eye images with a depth of (z+1) may be searched for from the plurality of sets of left-eye and right-eye images.
  • a plurality of left-eye and right-eye images that correspond to different imaging distances may be stored in advance, as shown in FIG. 14.
  • In step S1340, the left-eye and right-eye images of the current 3D image may be replaced with the left-eye and right-eye images, respectively, that are returned in step S1330.
  • In step S1350, the distance on a screen between the returned left-eye and right-eye images may be adjusted in accordance with the distance between the left-eye and right-eye images of the background UI, and the distance-adjusted left-eye and right-eye images may be displayed. Accordingly, the reference surface for the returned left-eye and right-eye images may be adjusted to correspond with the background UI.
  • the distance between the left-eye and right-eye images of the background UI is +1 and the distance between the left-eye and right-eye images of the current 3D image is +d
  • the distance between the returned left-eye and right-eye images is adjusted by +1 so that they are displayed a distance of (d+1) apart from each other.
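The method of FIG. 13 described above can be sketched as follows. This is a hedged illustration under assumed data structures (a dict of pre-stored stereo pairs keyed by depth); the names `select_pair` and `align_distance` are hypothetical.

```python
def select_pair(pairs, relative_depth, background_depth):
    """Search the pre-stored sets of left-eye and right-eye images for
    the pair whose depth equals the desired relative depth plus the
    background depth, e.g. a depth of (z+1) for a background of +1."""
    return pairs[relative_depth + background_depth]   # S1330

def align_distance(pair_distance, background_distance):
    """Shift the returned pair by the background UI's own left/right
    separation so its reference surface matches the background, i.e.
    the images end up (d+1) apart when the background distance is +1."""
    return pair_distance + background_distance        # S1350

# placeholder pairs keyed by the imaging depth at which they were captured
pairs = {2: ("L@2", "R@2"), 3: ("L@3", "R@3")}
assert select_pair(pairs, relative_depth=2, background_depth=1) == ("L@3", "R@3")
assert align_distance(pair_distance=2, background_distance=1) == 3
```

The selected pair then replaces the left-eye and right-eye images of the current 3D image (S1340) before the distance adjustment is applied.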
  • In step S1321, in response to an event for displaying a 3D image (the current 3D image) on a background UI with a Z-axis depth value in step S1311, one of the left-eye and right-eye images of the current 3D image may be replaced with the other image. Accordingly, the depth of the current 3D image may be removed.
  • In step S1331, the distance between the replaced left-eye and right-eye images may be adjusted in accordance with the distance on a screen between the left-eye and right-eye images of the background UI, and the distance-adjusted left-eye and right-eye images may be displayed. Accordingly, the reference surface for the replaced left-eye and right-eye images may be adjusted to correspond with the background UI.
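The alternative method (steps S1321 and S1331) can be sketched as follows; this is a hypothetical illustration, not the disclosed implementation. The current 3D image is flattened by copying one eye image over the other, and only the background UI's disparity is then re-introduced so the image sits on the background's reference surface.

```python
def flatten_and_offset(left_img, right_img, background_disparity):
    """Remove the depth of the current 3D image by replacing the
    right-eye image with the left-eye image (S1321), then separate the
    identical copies by the background UI's own left/right distance so
    the image takes on the background's depth (S1331)."""
    right_img = left_img  # depth removed: both eyes see the same image
    return left_img, right_img, background_disparity

l, r, d = flatten_and_offset("L", "R", background_disparity=1)
assert l == r == "L" and d == 1
```

Unlike the method of FIG. 13, no pre-stored stereo pairs are needed here, at the cost of losing the object's own depth.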
  • FIGS. 14 and 31 are diagrams illustrating methods of adjusting a set of left-eye and right-eye images and a reference surface in accordance with a previously-stored imaging distance according to embodiments of the present invention.
  • FIGS. 15, 32 and 33 are diagrams illustrating examples to which methods of adjusting depth according to embodiments of the present invention are applied.
  • FIG. 15 illustrates an example in which an object 1512 with depth is displayed over a background thumbnail 1511 that is displayed with depth on a display screen 1510.
  • FIGS. 32 and 33 illustrate methods of adjusting depth according to embodiments of the present invention being applied to the example illustrated in FIG. 15.
  • FIG. 32 illustrates the method illustrated in FIG. 13 being applied to the example illustrated in FIG. 15.
  • the depth of an object 1512 with respect to a background UI 1511 may be adjusted, and the depth-adjusted object 1512-1 may be displayed.
  • FIG. 33 illustrates the method illustrated in FIG. 30 being applied to the example illustrated in FIG. 15.
  • the depth of an object 1512 with respect to a background UI 1511 may be adjusted, and the 3D depth-adjusted object 1512-2 may be displayed.
  • FIGS. 16, 34 and 35 are diagrams illustrating other examples to which methods of adjusting depth according to embodiments of the present invention are applied.
  • FIG. 16 illustrates an example in which a background UI 1611 has different Z-axis depths from one point (x, y) to another on a display screen 1610 and an object 1612 with depth is displayed on the background UI 1611.
  • the object 1612 moves from a point (x1, y1) to a point (x2, y2).
  • FIGS. 34 and 35 illustrate methods of adjusting depth according to embodiments of the present invention being applied to the example illustrated in FIG. 16.
  • FIG. 34 illustrates the method illustrated in FIG. 13 being applied to the example illustrated in FIG. 16.
  • the depth of an object 1612 with respect to a background UI 1611 may be adjusted, and the depth-adjusted object 1612-1 may be displayed.
  • FIG. 35 illustrates the method illustrated in FIG. 30 being applied to the example illustrated in FIG. 16.
  • the depth of an object 1612 with respect to a background UI 1611 may be adjusted, and the 3D depth-adjusted object 1612-2 may be displayed.
  • the present invention may include a computer readable recording medium that includes a program for executing the display method of the 3D image display apparatus as described above.
  • the computer readable recording medium includes all kinds of recording devices in which data that can be read by a computer system is stored. Examples of computer readable recording media include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like. Further, the computer readable recording medium may be distributed over computer systems connected through a network, so that code readable by the computers may be stored and executed in a distributed manner.
  • the display state of the 3D UI elements can be visually stabilized.
  • objects in the 3D image may be displayed naturally with as much depth as the background UI.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A display method of a Three-Dimensional (3D) display apparatus is provided. The display method includes displaying a first display element having a first depth value; adjusting at least one depth value of the first display element and a second display element having a second depth value to be displayed in superimposition with or displayed on the first display element in a state where the first display element having the first depth value is displayed; and displaying the first display element and the second display element in superimposition with the first display element or on the first display element, of which the depth value has been adjusted, wherein at least one of the first display element and the second display element is displayed with an adjusted depth value. Accordingly, a user's attention and recognition can be heightened in executing the User Interface (UI).

Description

3D IMAGE DISPLAY APPARATUS AND DISPLAY METHOD THEREOF
The present invention relates to a Three-Dimensional (3D) image display apparatus and a display method thereof, and more particularly to a 3D image display apparatus and a display method thereof, which can provide a 3D Graphical User Interface (GUI).
3D stereoscopic image technology has very diverse application fields, such as information communication, broadcasting, medical treatment, educational training, military affairs, games, animation, virtual reality, Computer-Aided Design (CAD), industrial technology, and the like, and may be the core basic technology of the next-generation 3D stereoscopic multimedia information communication which is commonly required in these fields.
In general, a 3D effect occurs through complex actions of the degree of change in thickness of a crystalline lens according to the position of an object to be observed, a difference in angle between both eyes and an object, a difference in position and shape of an object between left and right eyes, disparity occurring in accordance with the movement of an object, and other effects caused by various kinds of psychologies and memories.
Among them, the binocular disparity that occurs due to a distance of about 6-7 cm between the two human eyes may be the most important factor. Due to the binocular disparity, the two eyes see the same object at different angles, and due to this difference in angle, different images are formed on the two eyes, respectively. These two images are transferred to the viewer’s brain through the retinas, and the brain accurately harmonizes the two kinds of information, so that the viewer can perceive the original 3D stereoscopic image.
A 3D image is composed of a left-eye image that is recognized by a left eye and a right-eye image that is recognized by a right eye. Also, the 3D display apparatus expresses a 3D effect of an image using the disparity between the left-eye image and the right-eye image. As described above, an environment in which a 3D image is implemented by alternately displaying the left-eye image and the right-eye image is called a stereo 3D image.
In order to express a 3D image in a Two-Dimensional (2D) image, methods for changing the transparency, performing a shading process, changing texture, and the like, have been used. However, in the case of using a 3D display apparatus, a 3D effect can be given even to a UI.
FIGS. 1 and 17 are diagrams explaining problems in the related art.
FIG. 1 shows a general 2D graphic (for example, 2.5D or 3D) User Interface (UI) screen, which expresses a difference in selection (attention) by giving variety to visual graphic elements, such as a position, size, color, and the like.
FIG. 17 shows a method of expressing a UI through stereo 3D, in which an object is expressed with a depth value in a Z-axis direction through utilization of a difference in visual point between both eyes that occurs when the object existing on the screen is seen, and attention information between an element selected on the screen and the remaining elements is stereoscopically expressed through such a depth value.
As illustrated in FIG. 17, in the case where UI elements having the same character are expressed in superimposition with each other on the screen where one or more UI elements having depth values are stereoscopically reproduced using stereo 3D, the depth values of the existing 3D UI elements and the new UI elements collide with each other, causing visual interference.
Accordingly, since the distinction between the selected UI object and the unselected UI object becomes unclear on the screen, users may be confused in terms of visibility or UI operations, and excessive 3D values generated by a plurality of UI elements shown on the screen may cause users visual fatigue.
Additionally, when a 3D image is displayed on a background UI with depth, the 3D image may appear differently than intended, or may cause visual fatigue to a user.
The present invention has been made to address at least the above problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present invention provides a 3D image display apparatus and a display method thereof, which can arrange and provide depth values among 3D display elements.
According to one aspect of the present invention, a display method of a 3D image display apparatus includes displaying a first display element having a first depth value; adjusting at least one depth value of the first display element and a second display element having a second depth value to be displayed in superimposition with or displayed on the first display element in a state where the first display element having the first depth value is displayed; and displaying the first display element and the second display element in superimposition with the first display element or on the first display element, of which the depth value has been adjusted, wherein at least one of the first display element and the second display element is displayed with an adjusted depth value.
According to another aspect of the present invention, a 3D image display apparatus includes a display processing unit for generating a first display element having a first depth value and a second display element having a second depth value; a display unit for displaying the generated first and second display elements; and a control unit for adjusting and displaying at least one depth value of the first display element and the second display element having the second depth value to be displayed in superimposition with or displayed on the first display element in a state where the first display element having the first depth value is displayed, wherein at least one of the first display element and the second display element is displayed with an adjusted depth value.
Accordingly, the display state of the 3D display elements can be visually stabilized, and a user’s attention and recognition with respect to the 3D display elements can be heightened.
The above and other aspects, features and advantages of the present invention will be more apparent from the following detailed description when taken in conjunction with the accompanying drawings, in which:
FIGS. 1 and 17 are diagrams explaining problems in the related art;
FIG. 2 is a diagram illustrating a 3D image providing system according to an embodiment of the present invention;
FIG. 3 is a block diagram illustrating the configuration of a display apparatus according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating the disparity between a left-eye image and a right-eye image according to an embodiment of the present invention;
FIG. 18 is a diagram illustrating the relationship between the disparity and the depth value according to an embodiment of the present invention;
FIGS. 5, 19, 20 are diagrams illustrating cases to which a display method according to an embodiment of the present invention is applied;
FIGS. 6, 21, 22 and 7, 23, 24 are diagrams illustrating a display method according to an embodiment of the present invention;
FIGS. 8, 25, 26 and 9, 27, 28 are diagrams illustrating a display method according to another embodiment of the present invention; and
FIG. 10 is a flowchart illustrating a display method of a 3D image display apparatus according to an embodiment of the present invention.
FIG. 11 is a diagram illustrating examples of a 3D image to which a display method according to an embodiment of the present invention is applied.
FIGS. 12 and 29 are diagrams illustrating a method of adjusting disparity information according to an embodiment of the present invention.
FIGS. 13 and 30 are flowcharts illustrating methods of adjusting depth according to embodiments of the present invention.
FIGS. 14 and 31 are diagrams illustrating methods of adjusting a set of left-eye and right-eye images and a reference surface in accordance with a previously-stored imaging distance according to embodiments of the present invention.
FIGS. 15, 32, 33 are diagrams illustrating examples to which methods of adjusting depth according to embodiments of the present invention are applied.
FIGS. 16, 34, 35 are diagrams illustrating other examples to which methods of adjusting depth according to embodiments of the present invention are applied.
Hereinafter, preferred embodiments of the present invention are described in detail with reference to the accompanying drawings. For reference, in explaining the present invention, well-known functions or constructions will not be described in detail so as to avoid obscuring the description with unnecessary detail.
FIG. 2 is a diagram illustrating a 3D image providing system according to an embodiment of the present invention. As illustrated in FIG. 2, the 3D image providing system includes a display apparatus 100 for displaying a 3D image on a display and 3D glasses 200 for viewing the 3D image.
The 3D image display apparatus 100 may be implemented to display a 3D image or to display both a 2D image and a 3D image.
In the case where the 3D image display apparatus 100 displays a 2D image, the same method as the existing 2D display apparatus may be used, while in the case where the 3D image display apparatus 100 displays a 3D image, the received 2D image may be converted into a 3D image and the converted 3D image may be displayed on the screen. According to circumstances, a 3D image that is received from an imaging device such as a camera or a 3D image that is captured by a camera, edited/processed in a broadcasting station, and transmitted from the broadcasting station may be received and processed to be displayed on the screen.
In particular, the 3D image display apparatus 100 can process a left-eye image and a right-eye image with reference to the format of the 3D image, and make the processed left-eye image and right-eye image be time-divided and alternately displayed. A user can view the 3D image through alternate seeing of the left-eye image and the right-eye image that are displayed on the display apparatus 100 with the left eye and the right eye using the 3D glasses 200.
In general, since the left eye and the right eye of an observer observe one 3D object in minutely different positions, the observer recognizes minutely different image information through the left eye and the right eye. The observer acquires depth information on the 3D object by combining the minutely different image information, and feels the 3D effect.
The 3D image display apparatus 100 according to the present invention enables the observer to feel the 3D image by providing images that the left eye and the right eye of the observer can see to the observer when the observer actually observes the 3D object. In this case, a difference in images that the left eye and the right eye of the observer see is called disparity. If such disparity has a positive value, the observer feels as if the 3D object is positioned closer to a predetermined reference surface in a direction of the observer, and if the disparity has a negative value, the observer feels as if the 3D object is spaced apart in an opposite direction to the observer.
The 3D glasses 200 may be implemented by active type shutter glasses. The shutter glass type corresponds to a display method using the disparity of both eyes, which enables the observer to perceive a sense of space, produced by the brain from images observed at different angles, through synchronization of the image providing of the display apparatus with the on/off operation of the left and right eyes of the 3D glasses.
The principle of the shutter glass type is to synchronize left and right image frames that are reproduced in the 3D image display apparatus 100 with a shutter mounted on the 3D glasses 200. That is, as left and right glasses of the 3D glasses are selectively opened and closed according to left and right image sync signals of the 3D image display apparatus 100, the 3D stereoscopic image is provided.
On the other hand, the 3D image display apparatus 100 can display a 3D display element, for example, a 3D UI (particularly, a GUI) on the screen together with the 3D image. Here, the GUI is a means for inputting a user command through selection of an icon or menu that is displayed on the display. For example, the user may move a cursor with reference to a menu, a list, an icon, and the like, which are displayed on the display through the GUI, and select an item on which the cursor is located.
Since the 3D image display apparatus 100 can implement a 3D image through adjustment of only the disparity between a left-eye image and a right-eye image for the 3D effect, it can provide the 3D GUI without separate image processing (scaling, texture, and perspective effect processing).
FIG. 3 is a block diagram illustrating the configuration of a display apparatus according to an embodiment of the present invention.
Referring to FIG. 3, the 3D image display apparatus 100 according to an embodiment of the present invention includes an image receiving unit 110, an image processing unit 120, a display unit 130, a control unit 140, a UI processing unit 150, a storage unit 160, a user interface unit 170, and a sync signal processing unit 180.
On the other hand, although FIG. 2 illustrates that the 3D image display apparatus 100 is a 3D TeleVision (TV), this is merely exemplary, and the 3D image display apparatus 100 according to an embodiment of the present invention may be implemented as any device that can display 3D UI elements, such as a digital TV, a mobile communication terminal, a mobile telephone, a Personal Digital Assistant (PDA), a smart phone, a Digital Multimedia Broadcasting (DMB) phone, an MPEG Audio Layer III (MP3) player, an audio appliance, a portable TV, and a digital camera.
The image receiving unit 110 receives and demodulates a 2D or 3D image signal that is received by wire or wirelessly from a broadcasting station or a satellite. Further, the image receiving unit 110 may be connected to an external appliance such as a camera to receive a 3D image from the external appliance. The external appliance may be connected wirelessly or by wire through an interface such as S-Video, component, composite, D-Sub, Digital Visual Interface (DVI), and High-Definition Multimedia Interface (HDMI). Since a 2D image processing method is well known to those skilled in the art, the following explanation will focus on a 3D image processing method.
As described above, a 3D image is an image composed of at least one frame. One frame may include both a left-eye image and a right-eye image, or the frames may alternate between left-eye frames and right-eye frames. That is, a 3D image is an image that is generated according to one of diverse 3D formats.
Accordingly, the 3D image received in the image receiving unit 110 may be in diverse formats, and particularly may be in a format according to one of a general top-bottom type, a side-by-side type, a horizontal interleave type, a vertical interleave type or checker board type, and a sequential frame type.
The image receiving unit 110 transfers the received 2D image or 3D image to the image processing unit 120.
The image processing unit 120 performs signal processing, such as video decoding, format analysis, and video scaling, and a task of GUI addition and the like, with respect to the 2D image or 3D image that is received in the image receiving unit 110.
In particular, the image processing unit 120 generates a left-eye image and a right-eye image, which correspond to the size of one screen (for example, 1920×1080), using the format of the 2D image or 3D image that is input to the image receiving unit 110.
For example, if the format of the 3D image is a format according to the top-bottom type, the side-by-side type, the horizontal interleave type, the vertical interleave type or checker board type, or the sequential frame, the image processing unit 120 generates the left-eye image and right-eye image to be provided to the user by extracting a left-eye image portion and a right-eye image portion from each image frame and performing expansion scaling or interpolation of the extracted left-eye image and right-eye image.
Further, if the format of the 3D image is of a general frame sequence type, the image processing unit 120 extracts the left-eye image or the right-eye image from each frame and prepares to provide the extracted image to the user.
On the other hand, information on the format of the input 3D image may be included in the 3D image signal or may not be included therein.
For example, if the information on the format of the input 3D image is included in the 3D image signal, the image processing unit 120 extracts the information on the format by analyzing the 3D image, and processes the received 3D image according to the extracted information. By contrast, if the information on the format of the input 3D image is not included in the 3D image signal, the image processing unit 120 processes the received 3D image according to the format input from the user, or processes the received 3D image according to a preset format.
The image processing unit 120 performs time division of the extracted left-eye image and right-eye image and alternately transfers the time-divided left-eye image and right-eye image to the display unit 130. That is, the image processing unit 120 transfers the left-eye image and the right-eye image to the display unit 130 in the temporal order of “left-eye image (L1) → right-eye image (R1) → left-eye image (L2) → right-eye image (R2) → …”.
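The time-division ordering described above can be sketched as follows (a minimal Python illustration, not the patent's actual implementation; the function name and list representation are assumptions):

```python
def interleave_frames(left_frames, right_frames):
    """Alternate left-eye and right-eye frames in the order
    L1, R1, L2, R2, ... for sequential stereo output."""
    sequence = []
    for left, right in zip(left_frames, right_frames):
        sequence.append(left)   # left-eye frame is transferred first
        sequence.append(right)  # then the matching right-eye frame
    return sequence

print(interleave_frames(["L1", "L2"], ["R1", "R2"]))
# ['L1', 'R1', 'L2', 'R2']
```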
Further, the image processing unit 120 may insert an On-Screen Display (OSD) image generated by the UI processing unit 150 into a black image, or process and provide the OSD image itself as one image.
The display unit 130 alternately outputs the left-eye image and the right-eye image output from the image processing unit 120 to the user.
The control unit 140 controls the whole operation of the display apparatus 100 according to a user command transferred from the user interface unit 170 or a preset option.
In particular, the control unit 140 controls the image receiving unit 110 and the image processing unit 120 to receive the 3D image, separate the received 3D image into a left-eye image and a right-eye image, and perform scaling or interpolation of the separated left-eye image and right-eye image with a size in which the separated left-eye image and right-eye image can be displayed on one screen.
Further, the control unit 140 controls the display unit 130 to be switched so that the polarization direction of the image that is provided through the display unit 130 coincides with the left-eye image or the right-eye image.
Further, the control unit 140 may control the operation of the UI processing unit 170 to be described later.
The UI processing unit 150 may generate a display element that is displayed to overlap the 2D or 3D image output to the display unit 130, and insert the generated display element into the 3D image.
Further, the UI processing unit 150 may set and generate depth values that differ according to the execution order of display elements such as, for example, UI elements, their attributes, and the like. Here, the depth value means a numerical value that indicates the degree of depth in the 3D image. A 3D image can express depth that corresponds not only to the up, down, left, and right positions on the screen but also to the forward and backward positions along the viewer's line of sight. In this case, the depth is determined by the disparity between the left-eye image and the right-eye image. Accordingly, the depth value of the 3D content list GUI corresponds to the disparity between the left-eye GUI and the right-eye GUI. The relationship between the depth value and the disparity will be described in more detail with reference to FIGS. 5 and 6 later.
Here, the UI elements may be displayed to overlap the displayed image, as a screen that displays characters or figures such as a menu, a caution expression, the time, or a channel number on the display screen.
For example, a caution expression may be displayed as a UI element in an OSD form according to a preset option or event.
On the other hand, as a user operates input devices such as an operation panel and a remote controller in order to select a desired function from the menus, a main menu, a sub-menu, and the like, may be displayed on the display screen as UI elements in an OSD form.
Such menus may include option items that can be selected in the display apparatus or items that can adjust the function of the display apparatus.
Further, the UI processing unit 150 may perform tasks of 2D/3D conversion of UI elements, transparency, color, size, shape and position adjustment, highlight, animation effect, and the like, under the control of the control unit 140.
The control unit 140 may calculate a value of the relative depth of a second display element to a first display element, may detect a set of left-eye and right-eye images that correspond to the calculated relative depth value from among a plurality of sets of previously-stored left-eye and right-eye images that correspond to different depth values, and may replace the left-eye and right-eye images of the second display element with the detected set of left-eye and right-eye images.
Further, the control unit 140 may replace one of the left-eye and right-eye images of the second display element with another image.
Further, the control unit 140 may adjust the distance, on a screen, between the left-eye and right-eye images of the second display element in accordance with the distance between the left-eye and right-eye images of the first display element, and may display the distance-adjusted left-eye and right-eye images.
The first display element may be a background element, and the second display element may be a content element on the background element.
The storage unit 160 is a storage medium in which various kinds of programs which are required to operate the 3D image display apparatus 100 are stored, and may be implemented by a memory, a Hard Disk Drive (HDD), and the like. For example, the storage unit may include a Read-Only Memory (ROM) for storing programs for performing the operation of the control unit 140, a Random Access Memory (RAM) for temporarily storing data according to the operation performance of the control unit 140, and the like. The storage unit 160 may further include an Electrically Erasable and Programmable ROM (EEPROM) for storing various kinds of reference data.
The user interface unit 170 transfers a user command that is received from input means such as a remote controller, an input panel, or the like, to the control unit 140.
Here, the input panel may be a touch pad, a key pad that is composed of various kinds of function keys, numeral keys, special keys, character keys, and the like, or a touch screen.
The sync signal processing unit 180 generates a sync signal for alternately opening the left-eye shutter glass and the right-eye shutter glass of the 3D glasses 200 to match the display timing of the left-eye image and the right-eye image, and transmits the sync signal to the 3D glasses 200. Accordingly, the 3D glasses 200 are alternately opened and closed, so that the left-eye image is displayed on the display unit 130 in the left-eye open timing of the 3D glasses 200 and the right-eye image is displayed on the display unit 130 in the right-eye open timing of the 3D glasses 200. Here, the sync signal may be transmitted in the form of infrared rays.
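The alternating open/close operation driven by the sync signal can be sketched as follows (an illustrative helper, not the patent's implementation; frame timestamps and the even/odd convention are assumptions):

```python
def shutter_schedule(frame_times):
    """Pair each display timestamp with the shutter to open:
    left-eye frames on even indices, right-eye frames on odd ones,
    matching the alternating L/R output of the display unit."""
    return [("left" if i % 2 == 0 else "right", t)
            for i, t in enumerate(frame_times)]

print(shutter_schedule([0, 8, 16, 24]))
# [('left', 0), ('right', 8), ('left', 16), ('right', 24)]
```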
The control unit 140 controls the whole operation of the 3D image display apparatus 100 according to a user operation that is transferred from the user interface unit 170.
In particular, the control unit 140 controls the image receiving unit 110 and the image processing unit 120 to receive the 3D image, separate the received 3D image into a left-eye image and a right-eye image, and perform scaling or interpolation of the separated left-eye image and right-eye image with a size in which the separated left-eye image and right-eye image can be displayed on one screen.
Further, the control unit 140 controls the UI processing unit 150 to generate an OSD that corresponds to the user operation that is transferred from the user interface unit 170, and controls the sync signal processing unit 180 to generate and transmit the sync signal that is synchronized with the output timing of the left-eye image and the right-eye image.
Further, if a second UI element having a second depth value is executed to be displayed in superimposition with a first UI element in a state where the first UI element having a first depth value is displayed, the control unit 140 can operate to adjust at least one depth value of the first UI element and the second UI element using the depth value of the first UI element.
Specifically, the control unit 140 can adjust a difference in depth values between the first UI element and the second UI element in consideration of the respective depth values of the first UI element and the second UI element.
Specifically, the control unit 140 can change the second depth value of the second UI element to a preset depth value, and then change the first depth value of the first UI element by the same amount by which the second depth value has been changed. Here, the preset depth value may be a value that is smaller than the second depth value. Further, the preset depth value may be the depth value of the display screen.
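The two-step adjustment just described can be sketched as follows (illustrative Python under the assumption of a simple linear depth scale; the names are hypothetical, not the patent's implementation):

```python
def adjust_related(first_depth, second_depth, preset_depth):
    """Move the newly executed (second) UI element to the preset depth,
    then shift the first element by the same amount, so their relative
    depth difference is preserved."""
    shift = second_depth - preset_depth   # amount the second element moved
    return first_depth - shift, preset_depth

# The second element drops from depth 3 to the preset depth 1;
# the first element is shifted by the same amount, from 1 to -1.
print(adjust_related(1, 3, 1))  # (-1, 1)
```

Note that the depth difference between the two elements (2 in this example) is unchanged by the adjustment.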
Further, if a new UI element is executed to be displayed in superimposition with a plurality of UI elements in a state where the plurality of UI elements having the corresponding depth values have been executed to be displayed, the control unit 140 can adjust the depth values of the plurality of UI elements which have been executed to be displayed to the same depth value. Here, the adjusted depth value may be smaller than the depth value of the newly executed UI element.
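The merge behavior for a newly executed, unrelated element can be sketched as follows (a hypothetical helper; Z(#) denotes the common merged depth, and the guard condition reflects the rule that the merged depth stays below the new element's depth):

```python
def adjust_unrelated(existing_depths, new_depth, merged_depth):
    """Collapse all already-executed UI elements to one common depth
    Z(#) that stays smaller than the new element's depth."""
    if merged_depth >= new_depth:
        raise ValueError("merged depth must stay below the new element")
    return [merged_depth] * len(existing_depths), new_depth

print(adjust_unrelated([1, 2, 3], 5, 1))  # ([1, 1, 1], 5)
```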
Further, the control unit 140 can restore the adjusted depth values of the first and second UI elements to their original depth values if the superimposed display of the first and second UI elements is canceled.
On the other hand, the UI that is executed to be displayed in superimposition with the UI element which has been executed to be displayed may be a UI element having a feedback property that includes event contents related to the already executed UI element, or a UI element having at least one property of alarm, caution, and popup that include event contents which are not related to the already executed UI element.
The 3D glasses 200 enable a user to view the left-eye image and the right-eye image through the left eye and the right eye, respectively, by alternately opening and closing the left-eye shutter glass and the right-eye shutter glass according to the sync signal received from the 3D image display apparatus 100.
On the other hand, the display unit 130 may include detailed configurations, such as a panel driving unit (not illustrated), a display panel unit (not illustrated), a backlight driving unit (not illustrated), and a backlight emitting unit (not illustrated), and the detailed explanation thereof will be omitted.
In this case, the depth value is determined by the disparity between the left-eye image and the right-eye image, and this will now be described in detail with reference to FIG. 4.
FIG. 4 is a diagram illustrating the disparity between the left-eye image and the right-eye image according to an embodiment of the present invention.
FIG. 4 illustrates that an object 410 of the left-eye image and an object 420 of the right-eye image overlap each other. However, in the case of an actual display on the screen, the object 410 of the left-eye image and the object 420 of the right-eye image are alternately displayed.
As illustrated in FIG. 4, the degree of mismatch between the object 410 of the left-eye image and the object 420 of the right-eye image is called the disparity.
FIG. 18 illustrates the relationship between the disparity and the depth value according to an embodiment of the present invention.
FIG. 18 illustrates the disparity that occurs between a TV screen and the user's eyes. The user's two eyes perceive disparity according to the distance between them.
Further, as illustrated in FIG. 18, it can be confirmed that an object that is closer to the user has a larger disparity. Specifically, in the case of displaying an object that is positioned on the surface of the TV screen (that is, the depth value is “0”), the left-eye image and the right-eye image are displayed in one position without disparity. By contrast, in the case of displaying an object in a position which is somewhat closer to the viewer and thus has a depth value of “-1”, the left-eye image 440 and the right-eye image 445 are displayed in positions which are spaced apart from each other by a disparity of “1”. Further, in the case of displaying an object in a position which is still closer to the viewer and thus has a depth value of “-2”, the left-eye image 440 and the right-eye image 445 are displayed in positions which are spaced apart from each other by a disparity of “2”.
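The numerical relationship above (depth 0 gives no separation, depth -1 a separation of 1, depth -2 a separation of 2) can be sketched as follows (illustrative helpers; the symmetric placement of the two eye images around the object's on-screen position is an assumption):

```python
def disparity_for_depth(depth):
    """Per the FIG. 18 mapping: an object on the screen surface
    (depth 0) has zero disparity; each depth step toward the viewer
    adds one unit of left/right separation."""
    return abs(depth)

def eye_positions(center_x, depth):
    """Place the left-eye and right-eye copies symmetrically around the
    object's on-screen position, separated by the disparity."""
    d = disparity_for_depth(depth)
    return center_x - d / 2, center_x + d / 2

print(eye_positions(100, -2))  # (99.0, 101.0)
```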
As described above, it can be confirmed that the depth value is a value that corresponds to the disparity. Accordingly, the 3D TV 100 can set the depth value of the 3D GUI using the disparity between the left-eye GUI and the right-eye GUI without separate image processing.
Hereinafter, with reference to FIGS. 5 to 28, a method of adjusting a depth value of a UI element will be described. Although these figures illustrate a UI in a 2D state, it is to be noted that they actually indicate a stereo 3D GUI that is implemented by alternately displaying a left-eye GUI and a right-eye GUI.
FIGS. 5, 19, 20 are diagrams illustrating cases to which a display method according to an embodiment of the present invention is applied.
As illustrated in FIG. 5, on a screen where a stereo 3D image can be reproduced, UIs “A” and “B” are positioned with depth values which are equal to or larger than that of the display screen in the Z-axis direction through the pixel disparity. Specifically, a left-eye image that is projected from a left-eye pixel is formed as an image having a predetermined disparity from a right-eye image, and the right-eye image that is projected from a right-eye pixel is formed as an image having a predetermined disparity from the left-eye image. Accordingly, when the left eye and the right eye of an observer recognize the left-eye image that is projected from the left-eye pixel and the right-eye image that is projected from the right-eye pixel, the observer can feel the 3D effect through obtaining the same depth information as in the case where the observer sees an actual 3D object through the left eye and the right eye.
In this case, since the currently selected UI “B” is executed later than the UI “A”, it may be positioned at an upper end of the display screen.
Thereafter, as illustrated in FIG. 19, a UI “B1” that is executed by a user’s input on the selected UI “B” may be positioned at the upper end of the display screen with a depth value that is larger than that of the UI “A” or “B” in the Z-axis direction. Here, the UI “B1” is a kind of UI event that is related to the UI “B”, and may be a UI element having a feedback property as a result of execution according to the user’s input.
Further, as illustrated in FIG. 20, an additionally executed UI “C” may be positioned at the upper end of the display screen with a depth value that is larger than that of the UI “A” or “B” in the Z-axis direction. Here, the UI “C” may be a new UI element that is executed through generation of a new window, such as an alarm or caution message window or a popup form, as a separate UI event that is not related to the UI “A” or “B”.
FIGS. 6, 21, 22, 7, 23 and 24 are diagrams illustrating a display method according to an embodiment of the present invention.
FIGS. 6, 21, 22, 7, 23 and 24 are related to a display method in the case where the UI execution screen is changed from the UI execution screen as illustrated in FIG. 5 to the UI execution screen as illustrated in FIG. 19.
Here, FIGS. 6, 21 and 22 illustrate front views of a stereo 3D screen, and FIGS. 7, 23 and 24 illustrate top views. The state illustrated in FIGS. 6, 21 and 22 corresponds to the UI execution state illustrated in FIGS. 7, 23 and 24.
As illustrated in FIG. 6, on a screen where UI elements A, B, and C 610, 620, and 630 having different depth values are executed, a UI element 1-1 611-1 having a predetermined depth value as illustrated in FIG. 21 may be executed in superimposition according to a user’s command or a preset option. Now, a case where the UI element 1-1 611-1 executed in superimposition is a UI element that is related to the already executed UI element A (for example, a UI element having the feedback characteristic with respect to the UI element A-1 611) will be described as an example.
When the UI element 1-1 611-1 having a predetermined depth value is executed in superimposition as illustrated in FIG. 21, the depth values of the already executed UI elements A, B, and C and the UI element 1-1 611-1 to be newly executed may be adjusted and displayed as illustrated in FIG. 22.
A detailed method of adjusting the depth value of the newly executed UI element 1-1 611-1 will be described with reference to FIGS. 7, 23 and 24.
Respective UI elements illustrated in FIG. 7 correspond to respective UI elements illustrated in FIG. 6, and it can be confirmed that the illustrated UI element A 610 (in particular, UI element A-1 611) is being executed.
Respective UI elements illustrated in FIG. 23 correspond to respective UI elements illustrated in FIG. 21, and it can be confirmed that the illustrated UI element 1-1 611-1 having a predetermined depth value is being executed in superimposition with the UI element A-1 611.
Respective UI elements illustrated in FIG. 24 correspond to respective UI elements illustrated in FIG. 22. The illustrated UI element 1-1 611-1 executed in superimposition can be moved to a predetermined depth value Z(*), and then the already executed UI elements can be moved by the amount by which the UI element 1-1 611-1 has been moved, that is, Z(1-1)-Z(*).
Even in this case, the UI element 1-1 611-1 that was last input by the user retains the largest depth value, so that it remains at the uppermost end of the current display screen.
FIGS. 8, 25, 26, 9, 27 and 28 are diagrams illustrating a display method according to another embodiment of the present invention.
FIGS. 8, 25, 26, 9, 27 and 28 are related to the display method in the case where the UI execution display screen is changed from the UI execution display screen as illustrated in FIG. 5 to the UI execution display screen as illustrated in FIG. 20.
Here, FIGS. 8, 25 and 26 illustrate front views of a stereo 3D screen, and FIGS. 9, 27 and 28 illustrate top views. The state illustrated in FIGS. 8, 25 and 26 corresponds to the UI execution state illustrated in FIGS. 9, 27 and 28.
As illustrated in FIG. 8, on a screen where UI elements A, B, and C 810, 820, and 830 having different depth values are executed, a UI element 840 having a predetermined depth value as illustrated in FIG. 25 may be executed in superimposition according to a user’s command or a preset option. Now, a case where the UI element 840 executed in superimposition is a UI element that is not related to the already executed UI elements A, B, and C 810, 820, and 830 will be described as an example. For example, a UI element D may be a new UI element that is executed through generation of a new window, such as an alarm or caution message window or a popup form, as a separate UI event that is not related to the already executed UI elements A, B, and C.
In the case where the UI element 840 having a predetermined depth value is executed in superimposition as illustrated in FIG. 25, the depth values of the already executed UI elements A, B, and C 810, 820, and 830 and the UI element 840 to be newly executed may be adjusted and displayed as illustrated in FIG. 26.
A detailed method of adjusting the depth value of the newly executed UI element 840 will be described with reference to FIGS. 9, 27 and 28.
Respective UI elements illustrated in FIG. 9 correspond to respective UI elements illustrated in FIG. 8, and it can be confirmed that the illustrated UI element A 810 is being executed.
Respective UI elements illustrated in FIG. 27 correspond to respective UI elements illustrated in FIG. 25, and it can be confirmed that the illustrated UI element 840 having a predetermined depth value is being executed in superimposition with the UI elements A, B, and C 810, 820, and 830.
Respective UI elements illustrated in FIG. 28 correspond to respective UI elements illustrated in FIG. 26, and the already executed UI elements A, B, and C 810, 820, and 830, except for the illustrated UI element 840 executed in superimposition, can be moved to the same depth value Z(#).
Even in this case, the UI elements A, B, and C 810, 820, and 830 which are merged to the same depth value Z(#) retain their 3D UI character with the predetermined depth value, except in the case where Z(#) is “0”.
On the other hand, in the embodiments illustrated in FIGS. 6, 21, 22, 7, 23, 24, 8, 25, 26, 9, 27 and 28, a case where the UI elements are executed in superimposition in the +Z-axis direction is exemplified for convenience of explanation. However, this is merely exemplary, and the same principle can be applied in the case where the UI elements are executed in the -Z-axis direction and in the case where the UI elements are executed mixedly in the +Z-axis and -Z-axis directions.
Further, in the embodiments illustrated in FIGS. 6, 21, 22, 7, 23, 24, 8, 25, 26, 9, 27 and 28, it is exemplified that the depth value is adjusted by the method illustrated in FIGS. 6, 21, 22, 7, 23 and 24 in the case of a UI that is related to the currently executed UI element, while the depth value is adjusted by the method illustrated in FIGS. 8, 25, 26, 9, 27 and 28 in the case of a UI that is not related to the currently executed UI element. However, this is merely exemplary, and it is possible to apply either display method regardless of the character of the UI element.
FIG. 10 is a flowchart illustrating a display method of a 3D image display apparatus according to an embodiment of the present invention.
Referring to FIG. 10, according to the display method of the 3D image display apparatus, a first UI element having a first depth value is displayed (S1010), and a second UI element having a second depth value is executed to be displayed in superimposition with the first UI element (S1020). Here, the depth value may correspond to the disparity between the left-eye UI and the right-eye UI.
Then, at least one depth value of the first UI element and the second UI element is adjusted using the depth value of the first UI element (S1030).
Thereafter, at least one of the first UI element and the second UI element, of which the depth value has been adjusted in step S1030, is displayed (S1040).
Here, in step S1030, the difference in depth values between the first UI element and the second UI element can be adjusted in consideration of the respective depth values of the first UI element and the second UI element.
Further, in step S1030, the second depth value of the second UI element can be changed to a preset depth value, and then the first depth value of the first UI element can be changed by the same amount by which the second depth value has been changed.
Here, the preset depth value may be a value that is smaller than the second depth value of the second UI element.
Further, the preset depth value may include the depth value of the display screen.
Further, a third UI element having a third depth value may be displayed before execution of the second UI element. In this case, the depth value-adjusting step may adjust the first and third depth values of the first and third UI elements to the same depth value if the second UI element having the second depth value is executed to be displayed in superimposition with the first UI element and the third UI element.
Here, the adjusted same depth value of the first and third UI elements may be a value that is smaller than the second depth value of the second UI element.
Further, the adjusted depth values of the first and second UI elements can be restored to their original depth values if the superimposed display of the first and second UI elements is canceled.
On the other hand, the second UI element may be a UI element having a feedback property that includes event contents related to the first UI element, or a UI element having at least one property of alarm, caution, and popup that includes event contents which are not related to the first UI element.
FIG. 11 is a diagram illustrating an example of a 3D image to which a display method according to an embodiment of the present invention is applied.
Referring to FIG. 11, a display method according to an embodiment of the present invention may be applied when content B with depth or a 3D image including content B is displayed over a background UI A with depth.
FIGS. 12 and 29 are diagrams illustrating a method of adjusting disparity information according to an embodiment of the present invention.
Referring to FIG. 12, when content 1212 with depth or a 3D image 1213 including the content 1212 is displayed on a background UI 1211 with depth, the content 1212 or the 3D image 1213 may appear differently than intended. For example, the content 1212 or the 3D image 1213 may be displayed as recessed into the background UI 1211 or being distant from the background UI 1211.
In this example, referring to FIG. 29, the left-eye and right-eye images 1214 and 1215 of the 3D image 1213 may be replaced with left-eye and right-eye images 1214-1 and 1215-1, respectively, which have been prepared for adjusting disparity.
The left-eye and right-eye images 1214-1 and 1215-1 may be images that are created considering the depth of the background UI 1211 and the depth of the 3D image 1213.
For example, when the depth, on the Z-axis, of the background UI 1211 is +1 and the depth, on the Z-axis, of the 3D image 1213 is +z, the left-eye and right-eye images 1214-1 and 1215-1 may be the left-eye and right-eye images of a 3D image with a depth of (z+1).
Accordingly, when the background UI 1211, which has a depth of +1, is set as a reference surface, the left-eye and right-eye images of a 3D image 1213-1 that replaces the 3D image 1213 may appear to protrude beyond the background UI 1211 by as much as +z.
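The depth arithmetic in this example can be sketched as follows (a trivial but illustrative helper; the additive model follows the (z+1) example above and is an assumption about the stored image sets):

```python
def replacement_depth(background_depth, image_depth):
    """Depth of the substitute left/right image pair: with the
    background UI as the reference surface, the image must still
    protrude by its own depth, so a pair with depth
    (image depth + background depth) is selected."""
    return background_depth + image_depth

# a depth-2 image over a depth-1 background needs a stored depth-3 pair
print(replacement_depth(1, 2))  # 3
```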
FIGS. 13 and 30 are flowcharts illustrating methods of adjusting depth according to embodiments of the present invention.
Referring to FIG. 13, in step S1310, an event for displaying a 3D image (the current 3D image) on a background UI with a Z-axis depth value may occur. In an example, a 3D photo may be displayed over a frame with a Z-axis depth value.
In step S1320, a plurality of sets of left-eye and right-eye images of an object that correspond to different distances from the object may be called. The plurality of sets of left-eye and right-eye images may be sets of left-eye and right-eye images that are captured at different distances from the object by a 3D camera.
In step S1330, a set of left-eye and right-eye images having a relative depth, on the Z-axis, to the background UI may be searched for from the plurality of sets of left-eye and right-eye images. For example, when the Z-axis depth of the background UI is +1, a set of left-eye and right-eye images of the object that are captured at a distance of +1 may be searched for from the plurality of sets of left-eye and right-eye images. In this example, if the current 3D image has a depth of +z, a set of left-eye and right-eye images with a depth of (z+1) may be searched for from the plurality of sets of left-eye and right-eye images.
In order to accomplish the above, a plurality of left-eye and right-eye images that correspond to different imaging distances may be stored in advance, as shown in FIG. 14.
In step S1340, the left-eye and right-eye images of the current 3D image may be replaced with the left-eye and right-eye images, respectively, that are returned in step S1330.
In step S1350, the distance on a screen between the returned left-eye and right-eye images may be adjusted in accordance with the distance between the left-eye and right-eye images of the background UI, and the distance-adjusted left-eye and right-eye images may be displayed. Accordingly, the reference surface for the returned left-eye and right-eye images may be adjusted to correspond with the background UI.
For example, referring to FIG. 31, when the distance between the left-eye and right-eye images of the background UI is +1 and the distance between the left-eye and right-eye images of the current 3D image is +d, the distance between the returned left-eye and right-eye images is adjusted by +1 so that they are displayed a distance of (d+1) apart from each other.
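The disparity adjustment in step S1350 amounts to shifting the returned left-eye and right-eye images further apart by the background UI's on-screen disparity, so that the background becomes the reference surface. A minimal sketch, assuming horizontal pixel positions for each eye image (the coordinates and function name are illustrative):

```python
def adjust_disparity(left_x, right_x, background_disparity):
    """Widen the on-screen separation of the returned stereo pair by
    the disparity of the background UI. Returns the new horizontal
    positions of the left-eye and right-eye images."""
    shift = background_disparity / 2.0
    return left_x - shift, right_x + shift

# A pair displayed d = 4 px apart over a background with disparity 1
# ends up (d + 1) = 5 px apart.
left, right = adjust_disparity(-2.0, 2.0, 1.0)
```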
Referring to FIG. 30, in step S1321, in response to an event for displaying a 3D image (the current 3D image) on a background UI with a Z-axis depth value in step S1311, one of the left-eye and right-eye images of the current 3D image may be replaced with the other image. Accordingly, the depth of the current 3D image may be removed.
In step S1331, the distance between the replaced left-eye and right-eye images may be adjusted in accordance with the distance on a screen between the left-eye and right-eye images of the background UI, and the distance-adjusted left-eye and right-eye images may be displayed. Accordingly, the reference surface for the replaced left-eye and right-eye images may be adjusted to correspond with the background UI.
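The two steps of FIG. 30 can be sketched together: duplicating one eye image into the other zeroes the disparity of the current 3D image (step S1321), and reapplying the background UI's disparity places the image on the reference surface (step S1331). The function and variable names below are illustrative, not from the specification:

```python
def remove_and_rebase_depth(left_img, right_img, background_disparity):
    """Replace the right-eye image with a copy of the left-eye image,
    removing the 3D image's own depth, then separate the pair by the
    background UI's disparity so it sits on the reference surface."""
    right_img = left_img              # step S1321: depth removed
    disparity = background_disparity  # step S1331: rebased onto background
    return left_img, right_img, disparity
```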
FIGS. 14 and 31 are diagrams illustrating methods of adjusting a set of left-eye and right-eye images and a reference surface in accordance with a previously-stored imaging distance according to embodiments of the present invention.
FIGS. 15, 32 and 33 are diagrams illustrating examples to which methods of adjusting depth according to embodiments of the present invention are applied.
FIG. 15 illustrates an example in which an object 1512 with depth is displayed over a background thumbnail 1511 that is displayed with depth on a display screen 1510.
FIGS. 32 and 33 illustrate methods of adjusting depth according to embodiments of the present invention being applied to the example illustrated in FIG. 15.
More specifically, FIG. 32 illustrates the method illustrated in FIG. 13 being applied to the example illustrated in FIG. 15.
Referring to FIGS. 15 and 32, the depth of an object 1512 with respect to a background UI 1511 may be adjusted, and the depth-adjusted object 1512-1 may be displayed.
FIG. 33 illustrates the method illustrated in FIG. 30 being applied to the example illustrated in FIG. 15.
In FIG. 33, similar to FIG. 32, the depth of an object 1512 with respect to a background UI 1511 may be adjusted, and the 3D depth-adjusted object 1512-2 may be displayed.
FIGS. 16, 34 and 35 are diagrams illustrating other examples to which methods of adjusting depth according to embodiments of the present invention are applied.
FIG. 16 illustrates an example in which a background UI 1611 has a Z-axis depth that varies from one point (x, y) to another on a display screen 1610, and an object 1612 with depth is displayed on the background UI 1611.
For example, referring to FIG. 16, the object 1612 moves from a point (x1, y1) to a point (x2, y2).
FIGS. 34 and 35 illustrate methods of adjusting depth according to embodiments of the present invention being applied to the example illustrated in FIG. 16.
More specifically, FIG. 34 illustrates the method illustrated in FIG. 13 being applied to the example illustrated in FIG. 16.
Referring to FIGS. 16 and 34, the depth of an object 1612 with respect to a background UI 1611 may be adjusted, and the depth-adjusted object 1612-1 may be displayed.
FIG. 35 illustrates the method illustrated in FIG. 30 being applied to the example illustrated in FIG. 16.
In FIG. 35, as in FIG. 34, the depth of an object 1612 with respect to a background UI 1611 may be adjusted, and the 3D depth-adjusted object 1612-2 may be displayed.
Further, the present invention may include a computer readable recording medium that includes a program for executing the display method of the 3D image display apparatus as described above. The computer readable recording medium includes any kind of recording device in which data that can be read by a computer system is stored. Examples of computer readable recording media include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like. Further, the computer readable recording medium may be distributed over computer systems connected through a network, so that computer-readable code is stored and executed in a distributed fashion.
Accordingly, by arranging the depth values among the 3D UI elements, the display state of the 3D UI elements can be visually stabilized.
Further, a user's attention to, and recognition of, the 3D UI elements can be heightened.
Further, when a 3D image is displayed over a background UI with depth, objects in the 3D image may be displayed naturally with as much depth as the background UI.
Further, it is possible to remove the depth of a 3D image and, thus, prevent any inconsistency between disparity information of objects in the 3D image and the depth of a background UI.
While the invention has been shown and described with reference to certain embodiments thereof, it will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention, as defined by the appended claims and their equivalents.

Claims (15)

  1. A display method of a Three-Dimensional (3D) image display apparatus, the method comprising:
    displaying a first display element having a first depth value;
    adjusting at least one depth value of the first display element and a second display element having a second depth value to be displayed in superimposition with or displayed on the first display element in a state where the first display element having the first depth value is displayed; and
    displaying the first display element and the second display element in superimposition with the first display element or on the first display element, of which the depth value has been adjusted,
    wherein at least one of the first display element and the second display element is displayed with an adjusted depth value.
  2. The display method of claim 1, wherein adjusting the at least one depth value comprises adjusting a difference in depth values between the first display element and the second display element in consideration of respective depth values of the first display element and the second display element.
  3. The display method of claim 1, wherein adjusting the at least one depth value comprises changing the second depth value of the second display element to a preset depth value, and changing the first depth value of the first display element to the depth value to which the second display element has been changed.
  4. The display method of claim 3, wherein the preset depth value is smaller than the second depth value of the second display element or a depth value of a display screen.
  5. The display method of claim 1, further comprising:
    displaying a third display element having a third depth value before displaying the second display element; and
    adjusting the third depth value of the third display element to a same depth value as the first display element, when the second display element is to be displayed in superimposition with the first display element and the third display element.
  6. The display method of claim 1, further comprising:
    adjusting the adjusted depth value of the first and second display elements to original depth values if the superimposition display of the first and second display elements is canceled.
  7. The display method of claim 1, wherein the second display element includes a display element having a feedback property that includes event contents related to the first display element, and a display element having at least one property of an alarm, a caution, and a popup that includes event contents that are not related to the first display element.
  8. The display method of claim 1, wherein the first depth value corresponds to a disparity between left-eye and right-eye images of the first display element and the second depth value corresponds to a disparity between left-eye and right-eye images of the second display element.
  9. The display method of claim 1, wherein adjusting the at least one depth value comprises:
    calculating a value of a relative depth of the second display element to the first display element;
    detecting a set of left-eye and right-eye images that correspond to the calculated value of relative depth from among a plurality of sets of previously-stored sets of left-eye and right-eye images that correspond to different depth values; and
    replacing left-eye and right-eye images of the second display element with the detected set of left-eye and right-eye images, respectively.
  10. The display method of claim 1, wherein adjusting the at least one depth value comprises:
    replacing one of left-eye and right-eye images of the second display element with the other image.
  11. The display method of claim 10, further comprising:
    adjusting a distance between the detected left-eye and right-eye images in accordance with a distance between left-eye and right-eye images of the first display element and displaying the distance-adjusted left-eye and right-eye images.
  12. The display method of claim 11, further comprising:
    adjusting a distance between the replaced left-eye and right-eye images in accordance with a distance between left-eye and right-eye images of the first display element and displaying the distance-adjusted left-eye and right-eye images.
  13. The display method of claim 10, wherein the first display element is a background element and the second display element is a content element on the background element.
  14. The display method of claim 10, wherein the first display element includes a background element and the second display element includes a content element on the background element.
  15. A Three-Dimensional (3D) image display apparatus, the apparatus comprising:
    a display processing unit for generating a first display element having a first depth value and a second display element having a second depth value;
    a display unit for displaying the generated first and second display elements; and
    a control unit for adjusting and displaying at least one depth value of the first display element and the second display element having the second depth value to be displayed in superimposition with or displayed on the first display element in a state where the first display element having the first depth value is displayed,
    wherein at least one of the first display element and the second display element is displayed with an adjusted depth value.
PCT/KR2011/007595 2010-10-12 2011-10-12 3d image display apparatus and display method thereof WO2012050366A2 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
EP11832753.5A EP2628304A4 (en) 2010-10-12 2011-10-12 3d image display apparatus and display method thereof
CN201180049444.9A CN103155579B (en) 2010-10-12 2011-10-12 3D rendering display device and display packing thereof
JP2013533768A JP2014500642A (en) 2010-10-12 2011-10-12 3D image display device and display method thereof
BR112013008559A BR112013008559A2 (en) 2010-10-12 2011-10-12 display method of a three-dimensional image display apparatus (3d), and three-dimensional image display apparatus (3d)
AU2011314521A AU2011314521B2 (en) 2010-10-12 2011-10-12 3D image display apparatus and display method thereof
RU2013121611/08A RU2598989C2 (en) 2010-10-12 2011-10-12 Three-dimensional image display apparatus and display method thereof

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
KR10-2010-0099323 2010-10-12
KR20100099323 2010-10-12
KR1020110001127A KR20120037858A (en) 2010-10-12 2011-01-05 Three-dimensional image display apparatus and user interface providing method thereof
KR10-2011-0001127 2011-01-05
KR10-2011-0102629 2011-10-07
KR1020110102629A KR20120037350A (en) 2010-10-11 2011-10-07 Three-dimensional image display apparatus and display method thereof

Publications (2)

Publication Number Publication Date
WO2012050366A2 true WO2012050366A2 (en) 2012-04-19
WO2012050366A3 WO2012050366A3 (en) 2012-06-21

Family

ID=46138875

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2011/007595 WO2012050366A2 (en) 2010-10-12 2011-10-12 3d image display apparatus and display method thereof

Country Status (9)

Country Link
US (1) US20120086714A1 (en)
EP (1) EP2628304A4 (en)
JP (1) JP2014500642A (en)
KR (2) KR20120037858A (en)
CN (1) CN103155579B (en)
AU (1) AU2011314521B2 (en)
BR (1) BR112013008559A2 (en)
RU (1) RU2598989C2 (en)
WO (1) WO2012050366A2 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102508651B (en) * 2011-09-29 2015-04-15 深圳超多维光电子有限公司 Realization method and system of user interface as well as electronic equipment
US9381431B2 (en) * 2011-12-06 2016-07-05 Autodesk, Inc. Property alteration of a three dimensional stereoscopic system
KR101691839B1 (en) * 2012-09-14 2017-01-02 엘지전자 주식회사 Method and apparatus of controlling a content on 3-dimensional display
DE102013000880A1 (en) 2013-01-10 2014-07-10 Volkswagen Aktiengesellschaft Method and apparatus for providing a user interface in a vehicle
WO2014157749A1 (en) * 2013-03-28 2014-10-02 케이디씨 주식회사 Multi-display device and displaying method using same
KR20140144056A (en) 2013-06-10 2014-12-18 삼성전자주식회사 Method for object control and an electronic device thereof
JP2015119373A (en) * 2013-12-19 2015-06-25 ソニー株式会社 Image processor and method, and program
CN106610833B (en) * 2015-10-27 2020-02-04 北京国双科技有限公司 Method and device for triggering overlapped HTML element mouse event
KR102335209B1 (en) * 2015-11-30 2021-12-03 최해용 Virtual Reality Display Mobile Device
KR20210063118A (en) * 2019-11-22 2021-06-01 삼성전자주식회사 Display apparatus and controlling method thereof

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010064118A1 (en) 2008-12-01 2010-06-10 Imax Corporation Methods and systems for presenting three-dimensional motion pictures with content adaptive information
US20100208040A1 (en) 2009-02-19 2010-08-19 Jean-Pierre Guillou Preventing interference between primary and secondary content in a stereoscopic display
US20110292176A1 (en) 2009-02-17 2011-12-01 Samsung Electronics Co., Ltd. Method and apparatus for processing video image

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8369607B2 (en) * 2002-03-27 2013-02-05 Sanyo Electric Co., Ltd. Method and apparatus for processing three-dimensional images
JP2005223495A (en) * 2004-02-04 2005-08-18 Sharp Corp Stereoscopic video image display apparatus and method
KR100649523B1 (en) * 2005-06-30 2006-11-27 삼성에스디아이 주식회사 Stereoscopic image display device
RU2313191C2 (en) * 2005-07-13 2007-12-20 Евгений Борисович Гаскевич Method and system for generation of a stereo image
KR100783552B1 (en) * 2006-10-11 2007-12-07 삼성전자주식회사 Input control method and device for mobile phone
ATE472230T1 (en) * 2007-03-16 2010-07-15 Thomson Licensing SYSTEM AND METHOD FOR COMBINING TEXT WITH THREE-DIMENSIONAL CONTENT
US8384769B1 (en) * 2007-05-23 2013-02-26 Kwangwoon University Research Institute For Industry Cooperation 3D image display method and system thereof
WO2009020277A1 (en) * 2007-08-06 2009-02-12 Samsung Electronics Co., Ltd. Method and apparatus for reproducing stereoscopic image using depth control
US8228327B2 (en) * 2008-02-29 2012-07-24 Disney Enterprises, Inc. Non-linear depth rendering of stereoscopic animated images
JP4925354B2 (en) * 2008-03-31 2012-04-25 富士フイルム株式会社 Image processing apparatus, image display apparatus, imaging apparatus, and image processing method
JP5449162B2 (en) * 2008-07-31 2014-03-19 三菱電機株式会社 Video encoding apparatus, video encoding method, video reproduction apparatus, and video reproduction method
US8335425B2 (en) * 2008-11-18 2012-12-18 Panasonic Corporation Playback apparatus, playback method, and program for performing stereoscopic playback
KR20100077270A (en) * 2008-12-29 2010-07-08 엘지전자 주식회사 Digital television and method of providing graphical user interfaces using same
JP2010250562A (en) * 2009-04-15 2010-11-04 Sony Corp Data structure, recording medium, playback apparatus, playback method, and program
CN101938670A (en) * 2009-06-26 2011-01-05 Lg电子株式会社 Image display device and method of operation thereof
EP2452506A4 (en) * 2009-07-07 2014-01-22 Lg Electronics Inc Method for displaying three-dimensional user interface
JP2011081453A (en) * 2009-10-02 2011-04-21 Toshiba Corp Apparatus and method for reproducing video
JP2011081480A (en) * 2009-10-05 2011-04-21 Seiko Epson Corp Image input system


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2628304A4

Also Published As

Publication number Publication date
CN103155579B (en) 2016-11-16
US20120086714A1 (en) 2012-04-12
EP2628304A2 (en) 2013-08-21
BR112013008559A2 (en) 2016-07-12
KR20120037858A (en) 2012-04-20
KR20120037350A (en) 2012-04-19
RU2598989C2 (en) 2016-10-10
AU2011314521A1 (en) 2013-04-11
RU2013121611A (en) 2014-11-20
WO2012050366A3 (en) 2012-06-21
EP2628304A4 (en) 2014-07-02
AU2011314521B2 (en) 2014-12-11
CN103155579A (en) 2013-06-12
JP2014500642A (en) 2014-01-09

Similar Documents

Publication Publication Date Title
WO2012050366A2 (en) 3d image display apparatus and display method thereof
WO2011059261A2 (en) Image display apparatus and operating method thereof
WO2011102699A2 (en) Electronic device and method for reproducing three-dimensional images
WO2010151027A4 (en) Video display device and operating method therefor
WO2011062335A1 (en) Method for playing contents
WO2011059260A2 (en) Image display apparatus and image display method thereof
WO2011059270A2 (en) Image display apparatus and operating method thereof
WO2012044128A4 (en) Display device, signal-processing device, and methods therefor
WO2011055950A2 (en) Image display apparatus, method for controlling the image display apparatus, and image display system
WO2011021894A2 (en) Image display apparatus and method for operating the same
WO2010151028A2 (en) Image display apparatus, 3d glasses, and method for operating the image display apparatus
WO2010151044A2 (en) Image-processing method for a display device which outputs three-dimensional content, and display device adopting the method
WO2013100376A1 (en) Apparatus and method for displaying
EP2923232A1 (en) Head mount display and method for controlling the same
WO2012002690A2 (en) Digital receiver and method for processing caption data in the digital receiver
WO2011129566A2 (en) Method and apparatus for displaying images
WO2011155766A2 (en) Image processing method and image display device according to the method
WO2019172523A1 (en) Display device and image processing method thereof
WO2012128399A1 (en) Display device and method of controlling the same
WO2011059266A2 (en) Image display apparatus and operation method therefor
WO2019164045A1 (en) Display device and method for image processing thereof
WO2012046990A2 (en) Image display apparatus and method for operating the same
WO2013015466A1 (en) Electronic device for displaying three-dimensional image and method of using the same
WO2010050691A2 (en) Methods and apparatuses for processing and displaying image
WO2014007414A1 (en) Terminal for increasing visual comfort sensation of 3d object and control method thereof

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase (Ref document number: 201180049444.9; Country of ref document: CN)
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 11832753; Country of ref document: EP; Kind code of ref document: A2)
ENP Entry into the national phase (Ref document number: 2013533768; Country of ref document: JP; Kind code of ref document: A)
ENP Entry into the national phase (Ref document number: 2011314521; Country of ref document: AU; Date of ref document: 20111012; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
REEP Request for entry into the european phase (Ref document number: 2011832753; Country of ref document: EP)
WWE Wipo information: entry into national phase (Ref document number: 2011832753; Country of ref document: EP)
ENP Entry into the national phase (Ref document number: 2013121611; Country of ref document: RU; Kind code of ref document: A)
REG Reference to national code (Ref country code: BR; Ref legal event code: B01A; Ref document number: 112013008559; Country of ref document: BR)
ENP Entry into the national phase (Ref document number: 112013008559; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20130409)