US10324686B2 - Electronic device and operation method therefor - Google Patents

Electronic device and operation method therefor

Info

Publication number
US10324686B2
Authority
US
United States
Prior art keywords
display panel
optical element
electronic device
cover
generate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US15/119,595
Other versions
US20170054971A1 (en)
Inventor
Shaohui JIAO
Haitao Wang
Mingcai ZHOU
Tao Hong
Weiming Li
Xiying WANG
Dong Kyung Nam
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd
Priority claimed from PCT/KR2015/001466 (WO2015122712A1)
Publication of US20170054971A1
Assigned to SAMSUNG ELECTRONICS CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: NAM, DONG KYUNG; HONG, TAO; JIAO, SHAOHUI; LI, WEIMING; WANG, HAITAO; WANG, XIYING; ZHOU, MINGCAI
Application granted
Publication of US10324686B2
Legal status: Active (adjusted expiration)

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16 - Sound input; Sound output
    • G06F3/167 - Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 - Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 - Constructional details or arrangements
    • G06F1/1613 - Constructional details or arrangements for portable computers
    • G06F1/1633 - Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1637 - Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G06T15/10 - Geometric effects
    • G06T15/20 - Perspective computation
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 - Image reproducers
    • H04N13/302 - Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/305 - Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using lenticular lenses, e.g. arrangements of cylindrical lenses
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 - Image reproducers
    • H04N13/302 - Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/307 - Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using fly-eye lenses, e.g. arrangements of circular lenses
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 - Image reproducers
    • H04N13/356 - Image reproducers having separate monoscopic and stereoscopic modes
    • H04N13/359 - Switching between monoscopic and stereoscopic modes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2215/00 - Indexing scheme for image rendering
    • G06T2215/16 - Using real world measurements to influence rendering
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2213/00 - Details of stereoscopic systems
    • H04N2213/001 - Constructional or mechanical details

Definitions

  • Apparatuses and methods consistent with exemplary embodiments relate to an electronic device and an operation method thereof.
  • a mobile device may have an autostereoscopic display function.
  • An autostereoscopic display function is technology that enables a user of the mobile device to view a natural three-dimensional (3D) image without a need for special glasses.
  • a mobile device may generate a 3D image by refracting light, output by a display panel, in different directions in space using a parallax barrier and/or a lens array.
  • One or more exemplary embodiments may provide an electronic device to switch between a two-dimensional (2D) display operation and a three-dimensional (3D) display operation and to display a 2D image and a 3D image.
  • One or more exemplary embodiments may also provide technology for rapidly generating 3D image data by generating a multi-view image using a parallel rendering technique.
  • an electronic device including a display panel, an optical element, and a controller configured to sense a position of the optical element with respect to the display panel, generate a three-dimensional (3D) image through the display panel and the optical element in a state in which the display panel overlaps the optical element, and generate a two-dimensional (2D) image through the display panel in a state in which the optical element is detached from the display panel.
  • the controller may be configured to measure a displacement of the optical element and generate the 3D image based on the measured displacement.
  • the controller may be configured to calculate a rendering parameter for generating the 3D image based on the measured displacement.
  • the measured displacement may include a rotation parameter and a translation parameter.
  • the controller may include a mode selector configured to generate a mode selection signal (SEL) for a display operation of the display panel and the optical element based on a position of the display panel and a position of the optical element, a displacement sensor configured to measure a displacement of the optical element in response to the SEL, a parameter generator configured to calculate a rendering parameter for rendering based on the measured displacement, and a graphic processing unit (GPU) configured to generate the 3D image using the rendering parameter.
  • the controller may further include an aligning unit configured to acquire information on a state of alignment between the optical element and the display panel, and the GPU is configured to generate the 3D image using the rendering parameter based on the information on the state of alignment.
  • the mode selector may include a position sensor configured to sense the position of the display panel and the position of the optical element and generate a sensing signal, and a mode controller configured to determine the position of the display panel and the position of the optical element in response to the sensing signal and generate the SEL based on a result of the determination.
  • the mode selector may further include a switching button unit configured to generate a switching signal in response to a user input, and the mode controller may be configured to generate the SEL in response to the switching signal.
  • the mode selector may further include a voice command processing unit configured to recognize a voice command of a user and generate the switching signal by processing the voice command, and the mode controller may be configured to generate the SEL in response to the switching signal.
  • the optical element may be at least one of a microlens array, a microprism array, and a lenticular lens array.
  • the optical element may be disposed in a cover of the electronic device and the display panel is disposed in a main body of the electronic device.
  • the electronic device may be a portable device.
  • the cover may be at least one of a flip close type, a flip over type, a slide type, and a rotation type.
  • an operation method of an electronic device including sensing a position relationship between an optical element and a display panel, and generating a three-dimensional (3D) image through the display panel and the optical element in a state in which the display panel overlaps the optical element, and generating a two-dimensional (2D) image through the display panel in a state in which the optical element is detached from the display panel.
  • the generating of the 3D image may include measuring a displacement of the optical element and generating the 3D image based on the measured displacement.
  • the generating of the 3D image may further include calculating a rendering parameter for generating the 3D image based on the measured displacement.
  • the measured displacement may include a rotation parameter and a translation parameter.
  • the optical element may be at least one of a microlens array, a microprism array, and a lenticular lens array.
  • the optical element may be disposed in a cover of the electronic device and the display panel is disposed in a main body of the electronic device.
  • the cover may be at least one of a flip close type, a flip over type, a slide type, and a rotation type.
  • FIG. 1 is a diagram illustrating an electronic device according to an exemplary embodiment
  • FIG. 2 is a diagram illustrating an exemplary range of movement of a cover illustrated in FIG. 1 ;
  • FIG. 3 is a block diagram of the electronic device illustrated in FIG. 1 according to an exemplary embodiment
  • FIGS. 4A and 4B are diagrams illustrating methods of disposing a lens array when an optical element illustrated in FIG. 3 is implemented as a lens array according to exemplary embodiments;
  • FIG. 5 is a block diagram illustrating a controller illustrated in FIG. 3 according to an exemplary embodiment
  • FIG. 6 is a diagram illustrating a positional relationship between a lens array and a display panel illustrated in FIG. 3 when the display panel performs as a three-dimensional (3D) display according to an exemplary embodiment
  • FIG. 7 is a block diagram illustrating an example of a mode selector illustrated in FIG. 5 ;
  • FIG. 8 is a block diagram illustrating another example of the mode selector illustrated in FIG. 5 ;
  • FIG. 9 is a block diagram illustrating still another example of the mode selector illustrated in FIG. 5 ;
  • FIG. 10 is a diagram illustrating an example of displacement information generated by a displacement sensor illustrated in FIG. 5 ;
  • FIG. 11 is a flowchart illustrating an example of a parameter generator illustrated in FIG. 5;
  • FIGS. 12A and 12B are diagrams illustrating examples of a parameter generator illustrated in FIG. 5 ;
  • FIG. 13 is a flowchart illustrating a method of generating three-dimensional (3D) image data of a graphic processing unit (GPU) illustrated in FIG. 5 according to an exemplary embodiment
  • FIG. 14 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to an exemplary embodiment
  • FIG. 15 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment
  • FIG. 16 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment
  • FIG. 17 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment
  • FIG. 18 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment
  • FIG. 19 is a diagram illustrating still another example of an operation method of the electronic device illustrated in FIG. 1 ;
  • FIG. 20 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment.
  • FIG. 21 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment.
  • FIG. 1 is a diagram illustrating an electronic device according to an exemplary embodiment
  • FIG. 2 is a diagram illustrating an exemplary range of movement of a cover illustrated in FIG. 1 according to an exemplary embodiment.
  • an electronic device 10 includes a main body 100 and a cover 200 .
  • the electronic device 10 may be a personal computer (PC), a data server, or a portable device.
  • a portable device may be a laptop computer, a mobile phone, a smartphone, a tablet PC, a mobile internet device (MID), a personal digital assistant (PDA), an enterprise digital assistant (EDA), a digital still camera, a digital video camera, a portable multimedia player (PMP), a personal navigation device or a portable navigation device (PND), a portable game console, or an e-book.
  • the electronic device 10 is considered to be a mobile phone as illustrated in FIG. 1 .
  • the cover 200 is connected to the main body 100 .
  • the cover 200 may be an integral type which is combined with the main body 100 .
  • the cover 200 may be a removable type which is detachable from the main body 100 .
  • the cover 200 may be operated (and moved) with respect to the main body 100 by the exertion of a physical force from an outside source.
  • the cover 200 may overlap a portion of the main body 100 .
  • the cover 200 may overlap an entirety of the main body 100 .
  • the cover 200 may be moveable such that it may be made to move into a position in which it overlaps the entirety of the main body 100 by exertion of the physical force.
  • FIG. 1 illustrates that the cover 200 is a flip close type
  • the cover 200 is not limited to the flip close type.
  • the cover 200 may be any of various types.
  • the cover 200 may be a flip over type, a slide type, or a rotation type.
  • the cover 200 includes some electronic elements of the electronic device 10 , and the electronic elements included in the cover 200 may be electrically connected to electronic elements included in the main body 100 .
  • Materials comprising the cover 200 may be transparent, translucent, or opaque.
  • FIG. 3 is a block diagram of the electronic device illustrated in FIG. 1 according to an exemplary embodiment.
  • the electronic device 10 includes an optical element 310 , a display panel 330 , and a controller 350 .
  • the optical element 310 may be included in the cover 200 , and the display panel 330 may be included in the main body 100 .
  • the optical element 310 outputs a three-dimensional (3D) image by refracting the rays that the display panel 330 emits as a two-dimensional (2D) image.
  • the optical element 310 may be at least one of a parallax barrier and a lens array.
  • FIGS. 4A and 4B are diagrams illustrating methods of disposing a lens array when an optical element illustrated in FIG. 3 is implemented as a lens array, according to exemplary embodiments.
  • when the optical element 310 is a lens array, the lens array may be disposed such that the lenses thereof face upward, away from the display panel.
  • a protective layer may be disposed on the lens array to protect the lens array from being damaged.
  • materials of the protective layer may be transparent, and the protective layer may be a touch layer including a touch sensor.
  • the lens array may alternatively be disposed such that the lenses thereof face downward, toward the display panel.
  • the touch layer may be disposed on an upper portion of the lens array
  • the protective layer may also be disposed on the upper portion of the lens array.
  • the lens array may be a microlens array, a microprism array, or a lenticular lens array.
  • When the optical element 310 is a lens array, for example, a microlens array, the optical element 310 concurrently provides parallax images in a horizontal direction and a vertical direction, and provides a plurality of visual images. Thus, the optical element 310 displays a real and natural 3D image. Even when the display panel 330 of the electronic device 10 is rotated, the optical element 310 concurrently provides different visual images in the horizontal direction and the vertical direction through the use of the microlens array. Thus, a user of the electronic device 10 views a 3D image even when the display panel 330 of the electronic device 10 is rotated.
  • Materials of the optical element 310 may be transparent.
  • the optical element 310 is assumed to be a lens array 310 .
  • the display panel 330 may be a liquid crystal display (LCD) panel. Also, the display panel 330 may be a touch screen panel, a thin-film-transistor liquid crystal display (TFT-LCD) panel, a light emitting diode (LED) display panel, an organic LED (OLED) display panel, an active-matrix OLED (AMOLED) display panel, or a flexible display panel.
  • the display panel 330 may be included in the main body 100 .
  • the lens array 310 and the display panel 330 perform one of a 2D display operation or a 3D display operation in response to being controlled by the controller 350 .
  • the controller 350 senses a position state of the lens array 310 with respect to the display panel 330 , and generates a 3D image via the display panel 330 and the lens array 310 or generates a 2D image via the display panel 330 based on the sensed position state.
  • FIG. 5 is a block diagram illustrating a controller illustrated in FIG. 3 according to an exemplary embodiment.
  • the controller 350 includes a mode selector 351 , a graphic processing unit (GPU) 353 , a displacement sensor 355 , a parameter generator 357 , and an aligning unit 359 .
  • the mode selector 351 generates a mode selection signal (SEL) for selecting a display operation of the lens array 310 and the display panel 330 based on the position state of the lens array 310 and a position state of the display panel 330 .
  • the mode selector 351 generates an SEL having a first level, for example, a low level or logic 0, such that the display panel 330 performs the 2D display operation.
  • the mode selector 351 generates an SEL having a second level, for example, a high level or logic 1, such that the lens array 310 and the display panel 330 perform the 3D display operation.
  • the electronic device 10 switches between the 2D display operation and the 3D display operation, and displays a 2D image or a 3D image.
  • FIG. 6 is a diagram illustrating a positional relationship of a lens array and a display panel illustrated in FIG. 3 when the display panel performs a 3D display operation, according to an exemplary embodiment
  • a gap G between a preset plane of the lens array 310 and a panel of the display panel 330 corresponds to a focal distance of the lens array 310 .
  • the gap G between the lens array 310 included in the cover 200 and the display panel 330 included in the main body 100 corresponds to the focal distance of the lens array 310 .
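As an aside on the geometry above: with the gap G equal to the focal distance f, each pixel under a lens is steered into an approximately collimated beam whose direction depends on the pixel's offset from the lens axis. This relation is standard thin-lens optics for lens-array displays, not text from the patent:

```latex
% Assumed thin-lens geometry (not from the patent): with the gap G equal to
% the focal distance f of a lens element, a pixel at lateral offset d from
% the axis of the lens above it is steered into the view direction
G = f, \qquad \theta(d) \approx \arctan\!\left(\frac{d}{f}\right),
% so neighboring pixels under one lens feed different viewing directions,
% which is what lets the overlapped panel act as a 3D display.
```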
  • the mode selector 351 generates an SEL such that the lens array 310 and the display panel 330 perform the 3D display operation.
  • the state in which the cover 200 overlaps the main body 100 is referred to as a state in which the cover 200 is disposed above the main body 100 and an overlapping area of the cover 200 and the main body 100 is greater than or equal to a preset area.
  • the preset area may be a maximum area in which the cover 200 overlaps the main body 100 .
  • a state in which the cover 200 overlaps the electronic device 10 may be a state in which the cover 200 and the main body 100 entirely overlap with each other, for example, a state in which the cover 200 is entirely touching the main body 100 .
  • a state in which the cover 200 overlaps the electronic device 10 may be a state in which the cover 200 slides inwardly (or downwardly) and then reaches a stopping point at an opposite side (or a stopping point at a bottom side).
  • the preset area may be set to be 90% of the maximum area in which the cover 200 and the main body 100 entirely overlap each other. The preset area corresponding to 90% of the maximum area is only an example and embodiments described herein are not limited thereto.
  • in a state other than a state in which the lens array 310 overlaps the display panel 330 , the mode selector 351 generates the SEL such that the display panel 330 performs the 2D display operation.
  • the state may be a state in which the lens array 310 is detached from or spaced from the display panel 330 .
  • the mode selector 351 outputs the SEL to the GPU 353 and/or the displacement sensor 355 .
  • FIG. 7 is a block diagram illustrating an example of a mode selector illustrated in FIG. 5 .
  • a mode selector 351 A includes a mode controller 351 - 1 and a position sensor 351 - 3 .
  • the mode selector 351 A is an example of the mode selector 351 illustrated in FIG. 5 .
  • the position sensor 351 - 3 is provided in the main body 100 and/or the cover 200 .
  • the position sensor 351 - 3 senses the position state of the lens array 310 and the position state of the display panel 330 , and generates a sensing signal SS, based on a type of the cover 200 , when the cover 200 moves with respect to the main body 100 .
  • the cover may be, for example, a flip close type, a flip over type, a slide type, or a rotation type.
  • the position sensor 351 - 3 outputs the sensing signal SS to the mode controller 351 - 1 .
  • the mode controller 351 - 1 determines the position state of the lens array 310 and the position state of the display panel 330 in response to the sensing signal SS, and generates an SEL based on a result of the determination. For example, the mode controller 351 - 1 determines an overlapping state of the lens array 310 and the display panel 330 , in response to the sensing signal SS.
  • the mode controller 351 - 1 generates the SEL such that the lens array 310 and the display panel 330 perform a 3D display operation in a state in which the lens array 310 overlaps the display panel 330 .
  • the mode controller 351 - 1 may generate the SEL such that the display panel 330 performs a 2D display operation in a state other than the state in which the lens array 310 overlaps the display panel 330 .
  • FIG. 8 is a block diagram illustrating another example of the mode selector illustrated in FIG. 5 .
  • a mode selector 351 B includes the mode controller 351 - 1 and a switching button unit 351 - 5 .
  • the mode selector 351 B is another example of the mode selector 351 illustrated in FIG. 5 .
  • the switching button unit 351 - 5 generates a switching signal in response to a user input.
  • the mode controller 351 - 1 generates an SEL in response to the switching signal output from the switching button unit 351 - 5 .
  • when a level of the switching signal is a first level, for example, a low level or logic 0, the mode controller 351 - 1 generates the SEL such that the display panel 330 performs a 2D display operation.
  • when the level of the switching signal is a second level, for example, a high level or logic 1, the mode controller 351 - 1 generates the SEL such that the lens array 310 and the display panel 330 perform a 3D display operation.
  • a user of the electronic device 10 selects the 2D display operation or the 3D display operation using the switching button unit 351 - 5 .
  • the mode selector 351 B may also include the position sensor 351 - 3 illustrated in FIG. 7 (not shown in FIG. 8 ).
  • the user of the electronic device 10 selects an operation state, for example, an ON state or an OFF state, of the position sensor 351 - 3 .
  • when the position sensor 351 - 3 is in the ON state, the mode selector 351 B automatically controls a display operation mode of the display panel 330 through the position sensor 351 - 3 .
  • when the position sensor 351 - 3 is in the OFF state, the mode selector 351 B controls the display operation mode through the switching button unit 351 - 5 .
  • thus, the electronic device 10 switches between the 2D display operation and the 3D display operation through simple switching, as sketched below.
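To tie the preceding bullets together, here is a minimal Python sketch of the mode-controller logic. It is an illustration, not the patent's implementation: the logic-0/logic-1 SEL convention follows the description above, while the rule that the switching signal is consulted only when the position sensor is off is one reading of the text, and the names are invented for this sketch:

```python
from enum import IntEnum
from typing import Optional

class Sel(IntEnum):
    """Mode selection signal: first level (logic 0) -> 2D, second level (logic 1) -> 3D."""
    MODE_2D = 0
    MODE_3D = 1

def mode_controller(overlapping: Optional[bool],
                    switching_signal: Optional[int]) -> Sel:
    """Generate SEL from the position sensor or a switching signal.

    overlapping:      the position sensor's overlap determination, or None
                      when the user has set the position sensor to OFF.
    switching_signal: level from the switching button unit or the voice
                      command processing unit, or None if none was produced.
    """
    if overlapping is not None:
        # Automatic control through the position sensor.
        return Sel.MODE_3D if overlapping else Sel.MODE_2D
    if switching_signal is not None:
        # Control through the switching button / voice command unit.
        return Sel.MODE_3D if switching_signal == 1 else Sel.MODE_2D
    return Sel.MODE_2D  # default to the plain 2D display operation (assumption)

# Cover closed over the panel, position sensor ON -> 3D display operation.
assert mode_controller(overlapping=True, switching_signal=None) is Sel.MODE_3D
# Position sensor OFF, user presses the switching button (logic 1) -> 3D.
assert mode_controller(overlapping=None, switching_signal=1) is Sel.MODE_3D
```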
  • FIG. 9 is a block diagram illustrating still another example of the mode selector illustrated in FIG. 5 .
  • a mode selector 351 C includes the mode controller 351 - 1 and a voice command processing unit 351 - 7 .
  • the mode selector 351 C is still another example of the mode selector 351 illustrated in FIG. 5 .
  • the voice command processing unit 351 - 7 recognizes a voice command of a user of the electronic device 10 , and generates a switching signal by processing the recognized voice command.
  • the voice command processing unit 351 - 7 generally performs a voice recognizing operation and a processing operation.
  • the voice command processing unit 351 - 7 may be an independent circuit connected to a microphone (not shown), or may be integrated with a computing unit (not shown).
  • the computing unit may be a processor, for example, a central processing unit (CPU).
  • the mode controller 351 - 1 generates the SEL in response to the switching signal. For example, when a level of the switching signal is a first level, for example, a low level or logic 0, the mode controller 351 - 1 generates the SEL such that the display panel 330 performs a 2D display operation. When the level of the switching signal is a second level, for example, a high level or logic 1, the mode controller 351 - 1 generates the SEL such that the lens array 310 and the display panel 330 perform a 3D display operation.
  • for example, when the user of the electronic device 10 says "3D display" as a voice command, the mode selector 351 C generates the SEL such that the display panel 330 performs the 3D display operation in a state in which the cover 200 overlaps the main body 100 .
  • the mode selector 351 C may further include the position sensor 351 - 3 illustrated in FIG. 7 (not shown in FIG. 9 ).
  • the user of the electronic device 10 may select an operation state, for example, an ON state or an OFF state, of the position sensor 351 - 3 .
  • when the position sensor 351 - 3 is in the ON state, the mode selector 351 C automatically controls a display operation mode through the position sensor 351 - 3 .
  • when the position sensor 351 - 3 is in the OFF state, the mode selector 351 C controls the display operation mode through the voice command processing unit 351 - 7 .
  • thus, the electronic device 10 switches between the 2D display operation and the 3D display operation through simple switching.
  • the user may use the 2D display operation to access basic functions of a mobile terminal, such as message editing and calendar viewing, and may use the 3D display operation to access additional functions of a mobile terminal, such as video watching and/or game playing.
  • the displacement sensor 355 operates in response to the SEL. For example, when a level of the SEL is a second level, an operation of the displacement sensor 355 may start.
  • the displacement sensor 355 measures a displacement DI of the lens array 310 with respect to the display panel 330 , and outputs the measured displacement DI to the parameter generator 357 .
  • the measured displacement DI may include a translation parameter T and a rotation parameter R of the lens array 310 with respect to the display panel 330 .
  • for example, R may be a 3×3 rotation matrix and T may be a 3×1 translation vector, as in the sketch below.
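As an illustration of how the measured displacement DI might be applied, the following sketch assumes the conventional rigid-transform reading, in which a point p in lens-array coordinates maps to R p + T in display-panel coordinates. The patent does not spell out coordinate conventions, so the frames, units, and example values here are assumptions:

```python
import numpy as np

def lens_to_panel(points_lens: np.ndarray, R: np.ndarray, T: np.ndarray) -> np.ndarray:
    """Map 3D points from lens-array coordinates to display-panel coordinates.

    points_lens: (N, 3) array of points in the lens array's frame.
    R:           (3, 3) rotation matrix from the measured displacement DI.
    T:           (3,)   translation vector from the measured displacement DI.
    """
    return points_lens @ R.T + T  # p' = R p + T, applied row-wise

# Example: cover rotated 1 degree about the panel normal and shifted 0.2 mm in x.
theta = np.deg2rad(1.0)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
T = np.array([0.2, 0.0, 0.0])  # millimetres, assumed unit
lens_centers = np.array([[0.0, 0.0, 0.0], [0.5, 0.0, 0.0]])  # two lens centers
print(lens_to_panel(lens_centers, R, T))
```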
  • the displacement sensor 355 includes at least one displacement sensor.
  • the displacement sensor 355 may include one or more of an inductance type displacement sensor, a capacitance type displacement sensor, an induction sensor, a raster sensor, a magnetostriction displacement sensor, a magnetic grid sensor, a rotation generator, and a photoelectric encoder.
  • the displacement sensor 355 may be provided in the main body 100 and/or the cover 200 .
  • the displacement sensor 355 may include a first displacement sensor and a second displacement sensor.
  • the first displacement sensor may be provided in the main body 100
  • the second displacement sensor may be provided in the cover 200 .
  • the aligning unit 359 acquires information on a state of alignment of the lens array 310 and the display panel 330 .
  • the aligning unit 359 may include a first aligning unit and a second aligning unit.
  • the first aligning unit may be provided in the main body 100
  • the second aligning unit may be provided in the cover 200 .
  • the aligning unit 359 acquires the information on the state of alignment of the lens array 310 and the display panel 330 in a state in which the lens array 310 overlaps the display panel 330 .
  • the aligning unit 359 acquires the information on the state of alignment based on a structure of a textured surface or an embossed surface.
  • the first aligning unit may include an aligning hole structure and/or an aligning groove structure.
  • the second aligning unit may include an aligning protrusion. When the second aligning unit is inserted into the first aligning unit, the aligning unit 359 acquires the information on the state of alignment between the lens array 310 and the display panel 330 .
  • the aligning unit 359 acquires the information on the state of alignment between the lens array 310 and the display panel 330 based on force.
  • the first aligning unit and the second aligning unit may be magnetic materials.
  • one of the first aligning unit and the second aligning unit may be a magnetic material, and the other one may be a magnetic metal.
  • the aligning unit 359 acquires the information on the state of alignment between the lens array 310 and the display panel 330 .
  • the first aligning unit and the second aligning unit may be aligned by an electromagnetic force.
  • the aligning unit 359 outputs the information on the state of alignment to the GPU 353 .
  • the aligning unit 359 is not limited to the aforementioned structure.
  • the parameter generator 357 receives the measured displacement DI.
  • the parameter generator 357 calculates a rendering parameter R_PR for rendering based on a rotation parameter R and a translation parameter T included in the measured displacement DI.
  • FIG. 11 is a flowchart illustrating an example of a parameter generator illustrated in FIG. 5 .
  • FIGS. 12A and 12B are diagrams illustrating examples of a parameter generator illustrated in FIG. 5 .
  • the parameter generator 357 calculates a location on the display panel 330 of each lens included in the lens array 310 based on the rotation parameter R and the translation parameter T.
  • the parameter generator 357 determines a lens corresponding to each pixel of an interlaced image based on the calculated locations.
  • the parameter generator 357 generates a plurality of ray clusters C 1 , C 2 , and C 3 by clustering rays of a light field based on the determined lenses.
  • the ray clusters C 1 , C 2 , and C 3 correspond to view frustums VF 1 , VF 2 , and VF 3 , respectively.
  • FIGS. 12A and 12B illustrate the three ray clusters C 1 , C 2 , and C 3 according to horizontal directions and the three view frustums VF 1 , VF 2 , and VF 3 corresponding to the respective ray clusters C 1 , C 2 , and C 3 .
  • a plurality of rays of the ray cluster C 1 corresponds to the view frustum VF 1 .
  • a plurality of rays of the ray cluster C 2 corresponds to the view frustum VF 2 .
  • a plurality of rays of the ray cluster C 3 corresponds to the view frustum VF 3 .
  • each of the view frustums VF 1 , VF 2 , and VF 3 may be a perspective view frustum.
  • each of the view frustums VF 1 , VF 2 , and VF 3 may be a shear perspective view frustum.
  • Each of the view frustums VF 1 , VF 2 , and VF 3 corresponding to the respective ray clusters C 1 , C 2 , and C 3 may include the rendering parameter R_PR for rendering.
  • the rendering parameter R_PR may include viewpoints and viewing angles of the view frustums VF 1 , VF 2 , and VF 3 .
  • the parameter generator 357 calculates the rendering parameter R_PR for each of the view frustums VF 1 , VF 2 , and VF 3 corresponding to the ray clusters C 1 , C 2 , and C 3 .
  • the parameter generator 357 transmits the rendering parameter R_PR to the GPU 353 , as sketched below.
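A rough Python sketch of the clustering and parameter computation follows. It is a reconstruction for illustration only, restricted to one dimension: lens centers are assumed known in panel coordinates (after applying the measured displacement), each pixel is assigned to its nearest lens to form a ray cluster, and each cluster's rendering parameter R_PR is summarized as a viewpoint at the lens center plus a viewing angle derived from an assumed lens pitch and focal distance:

```python
import numpy as np

def cluster_pixels(pixel_x: np.ndarray, lens_centers_x: np.ndarray) -> np.ndarray:
    """Assign each pixel to the lens (ray cluster) whose center is nearest.

    pixel_x:        (P,) pixel x-positions on the display panel (same units).
    lens_centers_x: (L,) lens center x-positions projected onto the panel,
                    already corrected by the measured displacement (R, T).
    Returns a (P,) array of lens indices.
    """
    # Distance from every pixel to every lens center; argmin picks the cluster.
    d = np.abs(pixel_x[:, None] - lens_centers_x[None, :])
    return d.argmin(axis=1)

def frustum_parameters(lens_center_x: float, pitch: float, focal: float) -> dict:
    """Rendering parameters R_PR for one cluster: a viewpoint and a viewing angle.

    Taking the viewpoint at the lens center and deriving the viewing angle
    from the lens pitch and focal distance are assumptions for illustration.
    """
    half_angle = np.arctan((pitch / 2.0) / focal)
    return {"viewpoint_x": lens_center_x, "viewing_angle": 2.0 * half_angle}

# Example: 9 pixels under 3 lenses with pitch 3.0 (arbitrary units), focal 5.0.
pixels = np.arange(9, dtype=float)
lenses = np.array([1.0, 4.0, 7.0])
print(cluster_pixels(pixels, lenses))          # -> cluster index per pixel
print(frustum_parameters(lenses[0], 3.0, 5.0)) # -> R_PR for cluster C1
```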
  • the GPU 353 generates 2D image data or 3D image data in response to an SEL.
  • when a level of the SEL is a first level, the GPU 353 generates the 2D image data and outputs the generated 2D image data to the display panel 330 . Thus, the display panel 330 performs a 2D display operation.
  • when a level of the SEL is a second level, the GPU 353 generates the 3D image data and outputs the generated 3D image data to the display panel 330 .
  • the lens array 310 and the display panel 330 perform a 3D display operation.
  • the 3D image data may be interlaced 3D image data.
  • FIG. 13 is a flowchart illustrating a method of generating three-dimensional (3D) image data of a graphic processing unit (GPU) illustrated in FIG. 5 according to an exemplary embodiment.
  • the GPU 353 generates 3D image data using the rendering parameter R_PR in response to a determination that the lens array 310 and the display panel 330 are not aligned based on information on the state of alignment output by the aligning unit 359 .
  • the GPU 353 generates a multi-view image by performing geometry duplication based on the rendering parameter R_PR. For example, the GPU 353 performs the geometry duplication on 3D content of each of the ray clusters C 1 , C 2 , and C 3 , and generates the multi-view image by performing parallel rendering on the view frustums VF 1 , VF 2 , and VF 3 corresponding to the respective ray clusters C 1 , C 2 , and C 3 .
  • the multi-view image may be generated using a geometry shader of the GPU 353 .
  • the GPU 353 stores the multi-view image as a single texture image in a memory (not shown).
  • the memory may be a volatile memory or a non-volatile memory.
  • the volatile memory may be a dynamic random access memory (DRAM), a static random access memory (SRAM), a thyristor RAM (T-RAM), a zero capacitor RAM (Z-RAM), or a twin transistor RAM (TTRAM).
  • the non-volatile memory may be an electrically erasable programmable read-only memory (EEPROM), a flash memory, a magnetic RAM (MRAM), a spin-transfer torque (STT) MRAM, a conductive bridging RAM (CBRAM), a ferroelectric RAM (FeRAM), a phase change RAM (PRAM), a resistive RAM (RRAM), a nanotube RRAM, a polymer RAM (PoRAM), a nano floating gate memory (NFGM), a holographic memory, a molecular electronics memory device, or an insulator resistance change memory.
  • the GPU 353 generates the 3D image data by rearranging pixels of the multi-view image.
  • the rearranging may be performed by a pixel shader or a fragment shader of the GPU 353 .
  • the GPU 353 outputs the 3D image data to the display panel 330 .
  • the GPU 353 may rapidly generate the 3D image data by generating the multi-view image through parallel rendering, as sketched below.
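The rearrangement step can be pictured with a small array-based sketch, standing in for the pixel/fragment shader. This is a schematic reconstruction under assumptions, not the patent's shader code: the multi-view image is modeled as a (V, H, W, 3) array (the "single texture" holding V rendered views), and the per-pixel view map that a real implementation would derive from the lens layout is supplied as an input:

```python
import numpy as np

def interlace(multi_view: np.ndarray, view_index: np.ndarray) -> np.ndarray:
    """Rearrange pixels of a multi-view image into interlaced 3D image data.

    multi_view: (V, H, W, 3) array, one rendered image per view frustum.
    view_index: (H, W) array giving, for each display pixel, which view's
                sample the lens above it sends toward the viewer. How this
                map is derived from the lens layout is assumed here.
    Returns an (H, W, 3) interlaced image for the display panel.
    """
    V, H, W, _ = multi_view.shape
    rows, cols = np.mgrid[0:H, 0:W]            # per-pixel row/column indices
    return multi_view[view_index, rows, cols]  # gather: pixel-wise view pick

# Example: 3 views of a 4x6 panel; cycle views across columns (toy mapping).
views = np.random.rand(3, 4, 6, 3)
vidx = (np.arange(6)[None, :] % 3) * np.ones((4, 1), dtype=int)
print(interlace(views, vidx).shape)  # -> (4, 6, 3)
```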
  • FIG. 14 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to an exemplary embodiment.
  • the electronic device 10 switches between a 2D display and a 3D display in a flip close type manner.
  • a connector 400 may have a rotation axis along a short edge of the electronic device 10 .
  • FIG. 15 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment.
  • the electronic device 10 switches between a 2D display and a 3D display in a flip close type manner.
  • the connector 400 may have a rotation axis along a long edge of the electronic device 10 .
  • FIG. 16 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment.
  • the electronic device 10 switches between a 2D display and a 3D display in a slide type manner.
  • a direction of sliding may be perpendicular to a long edge of the electronic device 10 .
  • FIG. 17 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment.
  • the electronic device 10 switches between a 2D display and a 3D display in a slide type manner.
  • a direction of sliding may be perpendicular to a short edge of the electronic device 10 .
  • FIG. 18 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment.
  • referring to FIG. 18 , the electronic device 10 switches between a 2D display and a 3D display in a rotation type manner.
  • although FIG. 18 illustrates that a rotation direction of the cover is anticlockwise, the rotation direction is not limited thereto. The rotation direction may be clockwise.
  • FIG. 19 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment.
  • the electronic device 10 switches between a 2D display and a 3D display in a flip close type manner.
  • the cover 200 overlaps a portion of the main body 100 in the closed position.
  • FIG. 20 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment.
  • the electronic device 10 switches between a 2D display and a 3D display in a flip close type manner.
  • a material of the cover 200 may be transparent or translucent.
  • FIG. 21 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment.
  • the electronic device 10 switches between a 2D display and a 3D display in a flip over type manner.
  • the cover 200 is flipped from a front surface of the display panel 330 to a back surface of the main body 100 .
  • when the cover 200 is disposed on the front surface of the display panel 330 , the electronic device 10 may be in a 3D display operation state.
  • when the cover 200 is flipped to the back surface of the main body 100 , the electronic device 10 may be in a 2D display operation state.
  • Exemplary embodiments include computer-readable media including program instructions to enable a computer to implement various operations.
  • the media may also include, alone or in combination with the program instructions, data files, data structures, tables, and the like.
  • the media and program instructions may be those specially designed and constructed for the purposes of exemplary embodiments.
  • Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD ROM disks; magneto-optical media such as floptical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory devices (ROM) and random access memory (RAM).
  • Examples of program instructions include both machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer using an interpreter.
  • the described hardware devices may be configured to act as one or more software modules in order to perform the operations of the above-described exemplary embodiments, or vice versa.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Devices For Indicating Variable Information By Combining Individual Elements (AREA)
  • Telephone Set Structure (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)

Abstract

An electronic device and an operation method therefor are provided. The electronic device may include: a display panel; an optical element; and a control unit which senses a location of the optical element, generates a 3D image via the display panel and the optical element in a state in which the display panel and the optical element overlap each other, and generates a 2D image via the display panel in a state in which the optical element is detached or separated from the display panel.

Description

The present application is a National Stage Entry of International Application PCT/KR2015/001466, filed Feb. 13, 2015, which claims the benefit of Chinese Patent Application No. 201410053867.3, filed Feb. 17, 2014, and Korean Patent Application No. 10-2014-0111369, filed Aug. 26, 2014.
TECHNICAL FIELD
Apparatuses and methods consistent with exemplary embodiments relate to an electronic device and an operation method thereof.
BACKGROUND ART
Current mobile devices include the functions of smartphones, as well as various other functions. For example, a mobile device may have an autostereoscopic display function. An autostereoscopic display function is technology that enables a user of the mobile device to view a natural three-dimensional (3D) image without a need for special glasses.
Generally speaking, a mobile device may generate a 3D image by refracting light, output by a display panel, in different directions in space using a parallax barrier and/or a lens array.
SUMMARY
One or more exemplary embodiments may provide an electronic device to switch between a two-dimensional (2D) display operation and a three-dimensional (3D) display operation and to display a 2D image and a 3D image.
One or more exemplary embodiments may also provide technology for rapidly generating 3D image data by generating a multi-view image using a parallel rendering technique.
According to an aspect of an exemplary embodiment, there is provided an electronic device including a display panel, an optical element, and a controller configured to sense a position of the optical element with respect to the display panel, generate a three-dimensional (3D) image through the display panel and the optical element in a state in which the display panel overlaps the optical element, and generate a two-dimensional (2D) image through the display panel in a state in which the optical element is detached from the display panel.
The controller may be configured to measure a displacement of the optical element and generate the 3D image based on the measured displacement.
The controller may be configured to calculate a rendering parameter for generating the 3D image based on the measured displacement.
The measured displacement may include a rotation parameter and a translation parameter.
The controller may include a mode selector configured to generate a mode selection signal (SEL) for a display operation of the display panel and the optical element based on a position of the display panel and a position of the optical element, a displacement sensor configured to measure a displacement of the optical element in response to the SEL, a parameter generator configured to calculate a rendering parameter for rendering based on the measured displacement, and a graphic processing unit (GPU) configured to generate the 3D image using the rendering parameter.
The controller may further include an aligning unit configured to acquire information on a state of alignment between the optical element and the display panel, and the GPU is configured to generate the 3D image using the rendering parameter based on the information on the state of alignment.
The mode selector may include a position sensor configured to sense the position of the display panel and the position of the optical element and generate a sensing signal, and a mode controller configured to determine the position of the display panel and the position of the optical element in response to the sensing signal and generate the SEL based on a result of the determination.
The mode selector may further include a switching button unit configured to generate a switching signal in response to a user input, and the mode controller may be configured to generate the SEL in response to the switching signal.
The mode selector may further include a voice command processing unit configured to recognize a voice command of a user and generate the switching signal by processing the voice command, and the mode controller may be configured to generate the SEL in response to the switching signal.
The optical element may be at least one of a microlens array, a microprism array, and a lenticular lens array.
The optical element may be disposed in a cover of the electronic device and the display panel is disposed in a main body of the electronic device.
The electronic device may be a portable device.
The cover may be at least one of a flip close type, a flip over type, a slide type, and a rotation type.
According to an aspect of another exemplary embodiment, there is provided an operation method of an electronic device, the method including sensing a position relationship between an optical element and a display panel, and generating a three-dimensional (3D) image through the display panel and the optical element in a state in which the display panel overlaps the optical element, and generating a two-dimensional (2D) image through the display panel in a state in which the optical element is detached from the display panel.
The generating of the 3D image may include measuring a displacement of the optical element and generating the 3D image based on the measured displacement.
The generating of the 3D image may further include calculating a rendering parameter for generating the 3D image based on the measured displacement.
The measured displacement may include a rotation parameter and a translation parameter.
The optical element may be at least one of a microlens array, a microprism array, and a lenticular lens array.
The optical element may be disposed in a cover of the electronic device and the display panel is disposed in a main body of the electronic device.
The cover may be at least one of a flip close type, a flip over type, a slide type, and a rotation type.
BRIEF DESCRIPTION OF DRAWINGS
The above and other exemplary aspects and advantages will be more apparent from the following detailed description of exemplary embodiments in which:
FIG. 1 is a diagram illustrating an electronic device according to an exemplary embodiment;
FIG. 2 is a diagram illustrating an exemplary range of movement of a cover illustrated in FIG. 1;
FIG. 3 is a block diagram of the electronic device illustrated in FIG. 1 according to an exemplary embodiment;
FIGS. 4A and 4B are diagrams illustrating methods of disposing a lens array when an optical element illustrated in FIG. 3 is implemented as a lens array according to exemplary embodiments;
FIG. 5 is a block diagram illustrating a controller illustrated in FIG. 3 according to an exemplary embodiment;
FIG. 6 is a diagram illustrating a positional relationship between a lens array and a display panel illustrated in FIG. 3 when the display panel performs as a three-dimensional (3D) display according to an exemplary embodiment;
FIG. 7 is a block diagram illustrating an example of a mode selector illustrated in FIG. 5;
FIG. 8 is a block diagram illustrating another example of the mode selector illustrated in FIG. 5;
FIG. 9 is a block diagram illustrating still another example of the mode selector illustrated in FIG. 5;
FIG. 10 is a diagram illustrating an example of displacement information generated by a displacement sensor illustrated in FIG. 5;
FIG. 11 is a flowchart illustrating an example of a parameter generator illustrated in FIG. 5;
FIGS. 12A and 12B are diagrams illustrating examples of a parameter generator illustrated in FIG. 5;
FIG. 13 is a flowchart illustrating a method of generating three-dimensional (3D) image data of a graphic processing unit (GPU) illustrated in FIG. 5 according to an exemplary embodiment;
FIG. 14 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to an exemplary embodiment;
FIG. 15 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment;
FIG. 16 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment;
FIG. 17 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment;
FIG. 18 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment;
FIG. 19 is a diagram illustrating still another example of an operation method of the electronic device illustrated in FIG. 1;
FIG. 20 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment; and
FIG. 21 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment.
DETAILED DESCRIPTION
Hereinafter, exemplary embodiments will be described in detail with reference to the accompanying drawings.
FIG. 1 is a diagram illustrating an electronic device according to an exemplary embodiment, and FIG. 2 is a diagram illustrating an exemplary range of movement of a cover illustrated in FIG. 1 according to an exemplary embodiment.
Referring to FIGS. 1 and 2, an electronic device 10 includes a main body 100 and a cover 200.
The electronic device 10 may be a personal computer (PC), a data server, or a portable device.
A portable device may be a laptop computer, a mobile phone, a smartphone, a tablet PC, a mobile internet device (MID), a personal digital assistant (PDA), an enterprise digital assistant (EDA), a digital still camera, a digital video camera, a portable multimedia player (PMP), a personal navigation device or a portable navigation device (PND), a portable game console, or an e-book.
Hereinafter, for ease and convenience of description, the electronic device 10 is considered to be a mobile phone as illustrated in FIG. 1.
The cover 200 is connected to the main body 100. In an example, the cover 200 may be an integral type which is combined with the main body 100. In another example, the cover 200 may be a removable type which is detachable from the main body 100.
The cover 200 may be operated (and moved) with respect to the main body 100 by the exertion of a physical force from an outside source. In an example, the cover 200 may overlap a portion of the main body 100. In another example, the cover 200 may overlap an entirety of the main body 100. As illustrated in FIG. 2, the cover 200 may be moveable such that it may be made to move into a position in which it overlaps the entirety of the main body 100 by exertion of the physical force.
Although FIG. 1 illustrates that the cover 200 is a flip close type, the cover 200 is not limited to the flip close type. The cover 200 may be any of various types. For example, the cover 200 may be a flip over type, a slide type, or a rotation type.
The cover 200 includes some electronic elements of the electronic device 10, and the electronic elements included in the cover 200 may be electrically connected to electronic elements included in the main body 100.
Materials comprising the cover 200 may be transparent, translucent, or opaque.
FIG. 3 is a block diagram of the electronic device illustrated in FIG. 1 according to an exemplary embodiment.
Referring to FIGS. 1 through 3, the electronic device 10 includes an optical element 310, a display panel 330, and a controller 350.
The optical element 310 may be included in the cover 200, and the display panel 330 may be included in the main body 100. The optical element 310 outputs a three-dimensional (3D) image by refracting the rays that the display panel 330 emits as a two-dimensional (2D) image.
The optical element 310 may be at least one of a parallax barrier and a lens array.
FIGS. 4A and 4B are diagrams illustrating methods of disposing a lens array when an optical element illustrated in FIG. 3 is implemented as a lens array, according to exemplary embodiments.
As illustrated in FIG. 4A, when the optical element 310 is a lens array, the lens array may be disposed such that the lenses thereof face upward, away from the display panel. In this example, a protective layer may be disposed on the lens array to protect the lens array from being damaged. For example, materials of the protective layer may be transparent, and the protective layer may be a touch layer including a touch sensor.
As illustrated in FIG. 4B, when the optical element 310 is a lens array, the lens array may alternatively be disposed such that the lenses thereof face downward, toward the display panel. In this example, the touch layer may be disposed on an upper portion of the lens array, and the protective layer may also be disposed on the upper portion of the lens array.
The lens array may be a microlens array, a microprism array, or a lenticular lens array.
When the optical element 310 is a lens array, for example, a microlens array, the optical element 310 concurrently provides parallax images in a horizontal direction and a vertical direction, and provides a plurality of visual images. Thus, the optical element 310 displays a real and natural 3D image. Even when the display panel 330 of the electronic device 10 is rotated, the optical element 310 concurrently provides different visual images in the horizontal direction and the vertical direction through the use of the microlens array. Thus, a user of the electronic device 10 views a 3D image even when the display panel 330 of the electronic device 10 is rotated.
Materials of the optical element 310 may be transparent.
Hereinafter, for ease and convenience of descriptions, the optical element 310 is assumed to be a lens array 310.
The display panel 330 may be a liquid crystal display (LCD) panel. Also, the display panel 330 may be a touch screen panel, a thin-film-transistor liquid crystal display (TFT-LCD) panel, a light emitting diode (LED) display panel, an organic LED (OLED) display panel, an active-matrix OLED (AMOLED) display panel, or a flexible display panel. For example, the display panel 330 may be included in the main body 100.
The lens array 310 and the display panel 330 perform one of a 2D display operation and a 3D display operation under the control of the controller 350.
The controller 350 senses a position state of the lens array 310 with respect to the display panel 330, and, based on the sensed position state, generates a 3D image via the display panel 330 and the lens array 310 or generates a 2D image via the display panel 330.
FIG. 5 is a block diagram illustrating a controller illustrated in FIG. 3 according to an exemplary embodiment.
Referring to FIGS. 1 through 5, the controller 350 includes a mode selector 351, a graphic processing unit (GPU) 353, a displacement sensor 355, a parameter generator 357, and an aligning unit 359.
The mode selector 351 generates a mode selection signal (SEL) for selecting a display operation of the lens array 310 and the display panel 330 based on the position state of the lens array 310 and a position state of the display panel 330. The mode selector 351 generates an SEL having a first level, for example, a low level or logic 0, such that the display panel 330 performs the 2D display operation. Alternatively, the mode selector 351 generates an SEL having a second level, for example, a high level or logic 1, such that the lens array 310 and the display panel 330 perform the 3D display operation.
Thus, the electronic device 10 switches between the 2D display operation and the 3D display operation, and displays a 2D image or a 3D image.
FIG. 6 is a diagram illustrating a positional relationship between the lens array and the display panel illustrated in FIG. 3 when the display panel performs a 3D display operation, according to an exemplary embodiment.
As illustrated in FIG. 6, when the display panel 330 performs the 3D display operation, a gap G between a preset plane of the lens array 310 and a plane of the display panel 330 corresponds to a focal distance of the lens array 310. In a state in which the cover 200 overlaps the main body 100, the gap G between the lens array 310 included in the cover 200 and the display panel 330 included in the main body 100 corresponds to the focal distance of the lens array 310.
For example, in a state in which the lens array 310 overlaps the display panel 330, the mode selector 351 generates an SEL such that the lens array 310 and the display panel 330 perform the 3D display operation. The state in which the cover 200 overlaps the main body 100 refers to a state in which the cover 200 is disposed above the main body 100 and an overlapping area of the cover 200 and the main body 100 is greater than or equal to a preset area. The preset area may be a maximum area in which the cover 200 overlaps the main body 100. In an example, in an electronic device 10 of a flip close type, the state in which the cover 200 overlaps the main body 100 may be a state in which the cover 200 and the main body 100 entirely overlap each other, for example, a state in which the cover 200 entirely touches the main body 100. In another example, in an electronic device 10 of a slide type, the state in which the cover 200 overlaps the main body 100 may be a state in which the cover 200 slides inwardly (or downwardly) and then reaches a stopping point at an opposite side (or a stopping point at a bottom side). Alternatively, the preset area may be set to be 90% of the maximum area in which the cover 200 and the main body 100 entirely overlap each other. The preset area corresponding to 90% of the maximum area is only an example, and embodiments described herein are not limited thereto.
In another example, in a state other than a state in which the lens array 310 overlaps the display panel 330, the mode selector 351 generates the SEL such that the display panel 330 performs the 2D display operation. The state may be a state in which the lens array 310 is detached from or spaced from the display panel 330.
The mode selector 351 outputs the SEL to the GPU 353 and/or the displacement sensor 355.
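The selection rule above may be summarized in code. The following Python sketch is illustrative only; the function name select_mode and the area-fraction formulation are assumptions made here, while the two signal levels and the 90% preset-area example come from the description above.

SEL_2D = 0  # first level (low level, logic 0): 2D display operation
SEL_3D = 1  # second level (high level, logic 1): 3D display operation

PRESET_AREA_FRACTION = 0.9  # preset area as 90% of the maximum overlap area

def select_mode(overlap_area: float, max_overlap_area: float) -> int:
    """Generate the mode selection signal SEL from the sensed overlap area."""
    if max_overlap_area > 0 and overlap_area / max_overlap_area >= PRESET_AREA_FRACTION:
        return SEL_3D  # the cover overlaps the main body: 3D display operation
    return SEL_2D  # the lens array is detached or spaced from the panel: 2D display operation

# Example: a slide-type cover at 95% of full overlap selects the 3D display operation.
assert select_mode(overlap_area=95.0, max_overlap_area=100.0) == SEL_3D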
FIG. 7 is a block diagram illustrating an example of a mode selector illustrated in FIG. 5.
Referring to FIGS. 1 through 7, a mode selector 351A includes a mode controller 351-1 and a position sensor 351-3. The mode selector 351A is an example of the mode selector 351 illustrated in FIG. 5.
The position sensor 351-3 is provided in the main body 100 and/or the cover 200. The position sensor 351-3 senses the position state of the lens array 310 and the position state of the display panel 330, and generates a sensing signal SS, based on a type of the cover 200, when the cover 200 moves with respect to the main body 100. As noted above, the cover may be, for example, a flip close type, a flip over type, a slide type, or a rotation type. The position sensor 351-3 outputs the sensing signal SS to the mode selector 351-1.
The mode controller 351-1 determines the position state of the lens array 310 and the position state of the display panel 330 in response to the sensing signal SS, and generates an SEL based on a result of the determination. For example, the mode controller 351-1 determines an overlapping state of the lens array 310 and the display panel 330 in response to the sensing signal SS.
For example, the mode controller 351-1 generates the SEL such that the lens array 310 and the display panel 330 perform a 3D display operation in a state in which the lens array 310 overlaps the display panel 330.
Furthermore, the mode controller 351-1 may generate the SEL such that the display panel 330 performs a 2D display operation in a state other than the state in which the lens array 310 overlaps the display panel 330.
FIG. 8 is a block diagram illustrating another example of the mode selector illustrated in FIG. 5.
Referring to FIGS. 1 through 6, and 8, a mode selector 351B includes the mode controller 351-1 and a switching button unit 351-5. The mode selector 351B is another example of the mode selector 351 illustrated in FIG. 5.
The switching button unit 351-5 generates a switching signal in response to a user input.
The mode controller 351-1 generates an SEL in response to the switching signal output from the switching button unit 351-5.
For example, when a level of the switching signal is a first level, for example, a low level or logic 0, the mode controller 351-1 generates the SEL such that the display panel 330 performs a 2D display operation. When the level of the switching signal is a second level, for example, a high level or logic 1, the mode controller 351-1 generates the SEL such that the lens array 310 and the display panel 330 perform a 3D display operation.
According to this exemplary embodiment, a user of the electronic device 10 selects the 2D display operation or the 3D display operation using the switching button unit 351-5.
The mode selector 351B may also include the position sensor 351-3 illustrated in FIG. 7 (not shown in FIG. 8). Here, the user of the electronic device 10 selects an operation state, for example, an ON state or an OFF state, of the position sensor 351-3. When the operation state of the position sensor 351-3 is the ON state, the mode selector 351B automatically controls a display operation mode of the display panel 330 through the position sensor 351-3. When the operation state of the position sensor 351-3 is the OFF state, the mode selector 351B controls the display operation mode through the switching button unit 351-5.
In short, the electronic device 10 switches between the 2D display operation and the 3D display operation through a simple switching action, according to the preference of the user.
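As an illustrative sketch only, the arbitration between the position sensor and the switching button unit (and, analogously, the voice command processing unit described next) may be organized as follows; all names here are hypothetical and do not appear in the embodiments.

SEL_2D, SEL_3D = 0, 1  # the two levels of the mode selection signal

class ModeSelectorSketch:
    """Hypothetical arbitration between automatic and manual mode sources."""

    def __init__(self, position_sensor_on: bool = True) -> None:
        self.position_sensor_on = position_sensor_on  # user-selected ON/OFF state

    def select(self, lens_overlaps_panel: bool, switching_signal: int) -> int:
        if self.position_sensor_on:
            # ON state: the position sensor drives the display operation mode.
            return SEL_3D if lens_overlaps_panel else SEL_2D
        # OFF state: the switching button (or voice command) drives the mode.
        return SEL_3D if switching_signal == 1 else SEL_2D

# Example: position sensor in the OFF state, switching signal at the second level.
assert ModeSelectorSketch(position_sensor_on=False).select(False, 1) == SEL_3D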
FIG. 9 is a block diagram illustrating still another example of the mode selector illustrated in FIG. 5.
Referring to FIGS. 1 through 6 and 9, a mode selector 351C includes the mode controller 351-1 and a voice command processing unit 351-7. The mode selector 351C is still another example of the mode selector 351 illustrated in FIG. 5.
The voice command processing unit 351-7 recognizes a voice command of a user of the electronic device 10, and generates a switching signal by processing the recognized voice command. The voice command processing unit 351-7 generally performs a voice recognizing operation and a processing operation. The voice command processing unit 351-7 may be an independent circuit connected to a microphone (not shown), or may be integrated with a computing unit (not shown). For example, the computing unit may be a processor, for example, a central processing unit (CPU).
The mode controller 351-1 generates the SEL in response to the switching signal. For example, when a level of the switching signal is a first level, for example, a low level or logic 0, the mode controller 351-1 generates the SEL such that the display panel 330 performs a 2D display operation. When the level of the switching signal is a second level, for example, a high level or logic 1, the mode controller 351-1 generates the SEL such that the lens array 310 and the display panel 330 perform a 3D display operation.
For example, when the user of the electronic device 10 says “3D display” as an example of the voice command, the mode controller 351-1 generates the SEL such that the display panel 330 performs the 3D display operation in a state in which the cover 200 overlaps the main body 100.
The mode selector 351C may further include the position sensor 351-3 illustrated in FIG. 7 (not shown in FIG. 9). Here, the user of the electronic device 10 may select an operation state, for example, an ON state or an OFF state, of the position sensor 351-3. When the operation state of the position sensor 351-3 is the ON state, the mode selector 351C automatically controls a display operation mode through the position sensor 351-3. When the operation state of the position sensor 351-3 is the OFF state, the mode selector 351C controls the display operation mode through the voice command processing unit 351-7.
In short, the electronic device 10 switches between the 2D display operation and the 3D display operation through a simple switching action, according to the preference of the user.
When the electronic device 10 is a portable device, the user may use the 2D display operation to access basic functions of a mobile terminal, such as message editing and calendar viewing, and may use the 3D display operation to access additional functions of a mobile terminal, such as video watching and/or game playing.
Referring to FIGS. 1 through 9, the displacement sensor 355 operates in response to the SEL. For example, when a level of the SEL is a second level, an operation of the displacement sensor 355 may start.
The displacement sensor 355 measures a displacement DI of the lens array 310 with respect to the display panel 330, and outputs the measured displacement DI to the parameter generator 357. As illustrated in FIG. 10, the measured displacement DI may include a translation parameter T and a rotation parameter R of the lens array 310 with respect to the display panel 330. For example, R may be a 3×3 rotation matrix, and T may be a translation vector.
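To make the displacement concrete, the following sketch applies a rotation matrix R and a translation vector T to map a point on the lens array into the coordinate frame of the display panel; the in-plane rotation, the units, and the function name are illustrative assumptions only.

import numpy as np

def to_panel_frame(p_lens: np.ndarray, R: np.ndarray, T: np.ndarray) -> np.ndarray:
    """Map a lens-array point into display panel coordinates using DI = (R, T)."""
    return R @ p_lens + T

# Example: one degree of in-plane rotation plus a small lateral shift.
theta = np.deg2rad(1.0)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
T = np.array([0.05, 0.0, 0.0])
print(to_panel_frame(np.array([1.0, 0.0, 0.0]), R, T))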
The displacement sensor 355 includes at least one displacement sensor. The displacement sensor 355 may include one or more of an inductance type displacement sensor, a capacitance type displacement sensor, an induction sensor, a raster sensor, a magnetostriction displacement sensor, a magnetic grid sensor, a rotation generator, and a photoelectric encoder.
The displacement sensor 355 may be provided in the main body 100 and/or the cover 200. For example, the displacement sensor 355 may include a first displacement sensor and a second displacement sensor. The first displacement sensor may be provided in the main body 100, and the second displacement sensor may be provided in the cover 200.
The aligning unit 359 acquires information on a state of alignment of the lens array 310 and the display panel 330. The aligning unit 359 may include a first aligning unit and a second aligning unit. The first aligning unit may be provided in the main body 100, and the second aligning unit may be provided in the cover 200.
The aligning unit 359 acquires the information on the state of alignment of the lens array 310 and the display panel 330 in a state in which the lens array 310 overlaps the display panel 330.
For example, the aligning unit 359 acquires the information on the state of alignment based on a structure of a textured surface or an embossed surface. The first aligning unit may include an aligning hole structure and/or an aligning groove structure. The second aligning unit may include an aligning protrusion. When the second aligning unit is inserted into the first aligning unit, the aligning unit 359 acquires the information on the state of alignment between the lens array 310 and the display panel 330.
According to another example, the aligning unit 359 acquires the information on the state of alignment between the lens array 310 and the display panel 330 based on a force, for example, a magnetic force. The first aligning unit and the second aligning unit may be magnetic materials. Alternatively, one of the first aligning unit and the second aligning unit may be a magnetic material, and the other one may be a magnetic metal. When the first aligning unit and the second aligning unit are aligned by a magnetic force, the aligning unit 359 acquires the information on the state of alignment between the lens array 310 and the display panel 330. The first aligning unit and the second aligning unit may be aligned by an electromagnetic force.
The aligning unit 359 outputs the information on the state of alignment to the GPU 353. The aligning unit 359 is not limited to the aforementioned structure.
The parameter generator 357 receives the measured displacement DI. The parameter generator 357 calculates a rendering parameter R_PR for rendering based on a rotation parameter R and a translation parameter T included in the measured displacement DI.
FIG. 11 is a flowchart illustrating an example of an operation of the parameter generator illustrated in FIG. 5. FIGS. 12A and 12B are diagrams illustrating examples of operations of the parameter generator illustrated in FIG. 5.
Referring to FIGS. 1 through 12B, in operation 1110, the parameter generator 357 calculates a space location, with respect to the display panel 330, of each lens included in the lens array 310 based on the rotation parameter R and the translation parameter T.
In operation 1120, the parameter generator 357 determines a lens corresponding to each pixel of an interlaced image based on the calculated space location.
In operation 1130, the parameter generator 357 generates a plurality of ray clusters C1, C2, and C3 by clustering rays of a light field based on the determined lens. The ray clusters C1, C2, and C3 correspond to view frustums VF1, VF2, and VF3, respectively. For ease and convenience of description, FIGS. 12A and 12B illustrate the three ray clusters C1, C2, and C3 arranged in a horizontal direction and the three view frustums VF1, VF2, and VF3 corresponding to the respective ray clusters C1, C2, and C3.
A plurality of rays of the ray cluster C1 corresponds to the view frustum VF1. A plurality of rays of the ray cluster C2 corresponds to the view frustum VF2. A plurality of rays of the ray cluster C3 corresponds to the view frustum VF3.
For example, each of the view frustums VF1, VF2, and VF3 may be a perspective view frustum. Also, each of the view frustums VF1, VF2, and VF3 may be a shear perspective view frustum.
Each of the view frustums VF1, VF2, and VF3 corresponding to the respective ray clusters C1, C2, and C3 may include the rendering parameter R_PR for rendering.
The rendering parameter R_PR may include viewpoints and viewing angles of the view frustums VF1, VF2, and VF3.
In operation 1140, the parameter generator 357 calculates the rendering parameter R_PR for each of the view frustums VF1, VF2, and VF3 corresponding to the ray clusters C1, C2, and C3.
In operation 1150, the parameter generator 357 transmits the rendering parameter R_PR to the GPU 353.
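Operations 1110 through 1150 may be sketched end to end as follows. This is a simplified illustration, not the method itself: the one-dimensional lens layout, the nearest-lens pixel assignment, and the pinhole-style frustum model are assumptions made here for clarity.

import numpy as np

def lens_locations(n: int, pitch: float, R: np.ndarray, T: np.ndarray) -> np.ndarray:
    # Operation 1110: space location of each lens given rotation R and translation T.
    lens = np.stack([np.arange(n) * pitch, np.zeros(n), np.zeros(n)], axis=1)
    return lens @ R.T + T

def assign_pixels(pixel_x: np.ndarray, centers: np.ndarray) -> np.ndarray:
    # Operation 1120: the lens corresponding to each pixel of the interlaced image,
    # taken here as the nearest lens center along the horizontal axis.
    return np.abs(pixel_x[:, None] - centers[None, :, 0]).argmin(axis=1)

def frustum_parameters(center: np.ndarray, gap: float, pitch: float):
    # Operations 1130 and 1140: each ray cluster yields a view frustum whose
    # viewpoint and viewing angle together form the rendering parameter R_PR.
    viewpoint = center + np.array([0.0, 0.0, gap])
    viewing_angle = 2.0 * np.arctan(pitch / (2.0 * gap))
    return viewpoint, viewing_angle

# Example with three lenses, mirroring the ray clusters C1, C2, and C3.
R, T = np.eye(3), np.zeros(3)
centers = lens_locations(3, pitch=1.0, R=R, T=T)
clusters = assign_pixels(np.linspace(0.0, 2.0, 9), centers)
r_pr = [frustum_parameters(c, gap=0.5, pitch=1.0) for c in centers]  # operation 1150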
The GPU 353 generates 2D image data or 3D image data in response to an SEL.
When a level of the SEL is a first level, the GPU 353 generates the 2D image data and outputs the generated 2D image data to the display panel 330. Thus, the display panel 330 performs a 2D display operation.
When a level of the SEL is a second level, the GPU 353 generates the 3D image data and outputs the generated 3D image data to the display panel 330. Thus, the lens array 310 and the display panel 330 perform a 3D display operation. For example, the 3D image data may be interlaced 3D image data.
FIG. 13 is a flowchart illustrating a method of generating three-dimensional (3D) image data of a graphic processing unit (GPU) illustrated in FIG. 5 according to an exemplary embodiment.
Referring to FIGS. 1 through 13, the GPU 353 generates 3D image data using the rendering parameter R_PR in response to a determination, based on the information on the state of alignment output by the aligning unit 359, that the lens array 310 and the display panel 330 are not aligned.
In operation 1310, the GPU 353 generates a multi-view image by performing geometry duplication based on the rendering parameter R_PR. For example, the GPU 353 performs the geometry duplication on 3D content of each of the ray clusters C1, C2, and C3, and generates the multi-view image by performing parallel rendering on the view frustums VF1, VF2, and VF3 corresponding to the respective ray clusters C1, C2, and C3. For example, the multi-view image may be generated using a geometry shader of the GPU 353.
In operation 1320, the GPU 353 stores the multi-view image as a single texture image in a memory (not shown). The memory may be a volatile memory or a non-volatile memory.
The volatile memory may be a dynamic random access memory (DRAM), a static random access memory (SRAM), a thyristor RAM (T-RAM), a zero capacitor RAM (Z-RAM), or a twin transistor RAM (TTRAM).
The non-volatile memory may be an electrically erasable programmable read-only memory (EEPROM), a flash memory, a magnetic RAM (MRAM), a spin-transfer torque (STT) MRAM, a conductive bridging RAM (CBRAM), a ferroelectric RAM (FeRAM), a phase change RAM (PRAM), a resistive RAM (RRAM), a nanotube RRAM, a polymer RAM (PoRAM), a nano floating gate memory (NFGM), a holographic memory, a molecular electronics memory device, or an insulator resistance change memory.
In operation 1330, the GPU 353 generates the 3D image data by rearranging pixels of the multi-view image. The rearranging may be performed by a pixel shader or a fragment shader of the GPU 353.
In operation 1340, the GPU 353 outputs the 3D image data to the display panel 330. The GPU 353 may rapidly generate the 3D image data by generating the multi-view image by performing the parallel rendering.
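A minimal sketch of operations 1310 through 1340 follows, with render_view standing in for the geometry-shader duplication and parallel rendering; the simple column-wise interlacing below is an illustrative pixel rearrangement chosen here, not the specific mapping of the embodiments.

import numpy as np

def render_view(view_index: int, h: int, w: int) -> np.ndarray:
    # Stand-in for operation 1310: render one view of the multi-view image.
    return np.full((h, w), view_index, dtype=np.uint8)

def generate_3d_image(n_views: int = 3, h: int = 4, w: int = 9) -> np.ndarray:
    views = [render_view(i, h, w) for i in range(n_views)]  # operation 1310
    texture = np.stack(views)  # operation 1320: store as a single texture image
    out = np.empty((h, w), dtype=np.uint8)
    for x in range(w):  # operation 1330: rearrange pixels of the multi-view image
        out[:, x] = texture[x % n_views, :, x]
    return out  # operation 1340: 3D image data output to the display panel

print(generate_3d_image())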
FIG. 14 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to an exemplary embodiment.
In FIG. 14, the electronic device 10 switches between a 2D display and a 3D display in a flip close type manner. A connector 400 may have a rotation axis along a short edge of the electronic device 10.
FIG. 15 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment.
In FIG. 15, the electronic device 10 switches between a 2D display and a 3D display in a flip close type manner. The connector 400 may have a rotation axis along a long edge of the electronic device 10.
FIG. 16 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment.
In FIG. 16, the electronic device 10 switches between a 2D display and a 3D display in a slide type manner. A direction of sliding may be perpendicular to a long edge of the electronic device 10.
FIG. 17 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment.
In FIG. 17, the electronic device 10 switches between a 2D display and a 3D display in a slide type manner. A direction of sliding may be perpendicular to a short edge of the electronic device 10.
FIG. 18 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment.
In FIG. 18, the electronic device 10 switches between a 2D display and a 3D display in a rotation type manner. Although FIG. 18 illustrates that a rotation direction of a cover is anticlockwise, the rotation direction is not limited thereto. The rotation direction may be clockwise.
FIG. 19 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment.
In FIG. 19, the electronic device 10 switches between a 2D display and a 3D display in a flip close type manner. The cover 200 overlaps a portion of the main body 100 in the closed position.
FIG. 20 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment.
In FIG. 20, the electronic device 10 switches between a 2D display and a 3D display in a flip close type manner. A material of the cover 200 may be transparent or translucent.
FIG. 21 is a diagram illustrating an operation method of the electronic device illustrated in FIG. 1 according to another exemplary embodiment.
In FIG. 21, the electronic device 10 switches between a 2D display and a 3D display in a flip over type manner. The cover 200 is flipped from a front surface of the display panel 330 to a back surface of the main body 100. When the cover 200 is flipped onto the front surface of the display panel 330, the electronic device 10 may be in a 3D display operation state. When the cover 200 is flipped away and distanced from the front surface of the display panel 330, the electronic device 10 may be in a 2D display operation state. In addition, when the cover 200 is flipped and fully folded to the back surface of the main body 100, the electronic device 10 may be in the 2D display operation state.
Exemplary embodiments include computer-readable media including program instructions to enable a computer to implement various operations. The media may also include, alone or in combination with the program instructions, data files, data structures, tables, and the like. The media and program instructions may be those specially designed and constructed for the purposes of exemplary embodiments. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD ROM disks; magneto-optical media such as optical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory devices (ROM) and random access memory (RAM). Examples of program instructions include both machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer using an interpreter. The described hardware devices may be configured to act as one or more software modules in order to perform the operations of the above-described exemplary embodiments, or vice versa.
Although a few exemplary embodiments have been shown and described, the present disclosure is not limited to the described exemplary embodiments. Instead, it would be appreciated by those skilled in the art that changes may be made to these exemplary embodiments without departing from the principles and spirit of the disclosure, the scope of which is defined by the claims and their equivalents.

Claims (15)

The invention claimed is:
1. An electronic device comprising:
a display panel;
an optical element; and
a controller configured to:
determine a position of the optical element with respect to the display panel;
control the display panel to generate a three-dimensional (3D) image in a state in which the display panel overlaps the optical element;
control the display panel to generate a two-dimensional (2D) image in a state in which the optical element is spaced from the display panel; and
in the state in which the display panel overlaps the optical element, determine a rendering parameter of the 3D image based on an alignment between the display panel and the optical element.
2. The electronic device of claim 1, wherein the alignment comprises a rotational displacement between the display panel and the optical element and a translation between the display panel and the optical element.
3. The electronic device of claim 1, wherein the controller comprises:
a mode selector configured to generate a mode selection signal based on the determined position;
a displacement sensor configured to measure a displacement of the optical element in response to the mode selection signal;
an aligning unit configured to determine the alignment between the display panel and the optical element;
a parameter generator configured to calculate the rendering parameter based on the alignment; and
a graphic processing unit (GPU) configured to generate the 3D image using the rendering parameter.
4. The electronic device of claim 3, wherein the mode selector comprises:
a position sensor configured to sense a position of the display panel and a position of the optical element and generate a sensing signal; and
a mode controller configured to determine the position of the optical element in response to the sensing signal and to generate the mode selection signal based on the determined position.
5. The electronic device of claim 4, wherein the mode selector further comprises a switching button unit configured to generate a switching signal in response to a user input, and wherein the mode controller is configured to generate the mode selection signal in response to the switching signal.
6. The electronic device of claim 4, wherein the mode selector further comprises a voice command processing unit configured to recognize a voice command of a user and generate the switching signal by processing the voice command, and wherein the mode controller is configured to generate the mode selection signal in response to the switching signal.
7. The electronic device of claim 1, wherein the optical element comprises at least one of a microlens array, a microprism array, and a lenticular lens array.
8. The electronic device of claim 1, wherein the electronic device comprises a cover and a main body, and wherein optical element is disposed in the cover and the display panel is disposed in the main body.
9. The electronic device of claim 8, wherein the cover is moveable with respect to the main body.
10. The electronic device of claim 1, wherein the electronic device is a portable device.
11. An operation method of an electronic device, the method comprising:
determining a position of an optical element with respect to a display panel;
outputting a three-dimensional (3D) image through the display panel and the optical element based on the determined position indicating that the display panel overlaps the optical element, and determining a rendering parameter of the 3D image based on an alignment between the display panel and the optical element; and
outputting a two-dimensional (2D) image through the display panel based on the determined position indicating that the optical element is spaced from the display panel.
12. The method of claim 11, wherein the alignment comprises a rotational displacement between the display panel and the optical element and a translation between the display panel and the optical element.
13. The method of claim 11, wherein the optical element comprises at least one of a microlens array, a microprism array, and a lenticular lens array.
14. The method of claim 11, wherein the electronic device comprises a cover and a main body, and wherein the optical element is disposed in the cover and the display panel is disposed in the main body.
15. The method of claim 14, wherein the cover is moveable with respect to the main body.
US15/119,595 2014-02-17 2015-02-13 Electronic device and operation method therefor Active 2035-07-19 US10324686B2 (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
CN201410053867 2014-02-17
CN201410053867.3A CN104853008B (en) 2014-02-17 2014-02-17 Portable device and method capable of switching between two-dimensional display and three-dimensional display
CN201410053867.3 2014-02-17
KR1020140111369A KR102187186B1 (en) 2014-02-17 2014-08-26 Electronic device and method thereof
KR10-2014-0111369 2014-08-26
PCT/KR2015/001466 WO2015122712A1 (en) 2014-02-17 2015-02-13 Electronic device and operation method therefor

Publications (2)

Publication Number Publication Date
US20170054971A1 US20170054971A1 (en) 2017-02-23
US10324686B2 true US10324686B2 (en) 2019-06-18

Family

ID=53852356

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/119,595 Active 2035-07-19 US10324686B2 (en) 2014-02-17 2015-02-13 Electronic device and operation method therefor

Country Status (3)

Country Link
US (1) US10324686B2 (en)
KR (1) KR102187186B1 (en)
CN (1) CN104853008B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102501384B1 (en) * 2016-02-17 2023-02-20 삼성전자 주식회사 Electronic device and method for controlling operation thereof
CN113168011A (en) * 2018-11-30 2021-07-23 株式会社小糸制作所 Head-up display
GB202218713D0 (en) * 2022-12-13 2023-01-25 Temporal Res Ltd An imaging method and an imaging device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB0400371D0 (en) * 2004-01-09 2004-02-11 Koninkl Philips Electronics Nv Volumetric display
JP5250755B2 (en) * 2008-03-25 2013-07-31 株式会社サンメディカル技術研究所 Assisted artificial heart pump drive device and assisted artificial heart system
US9361699B2 (en) * 2012-06-27 2016-06-07 Imec Taiwan Co. Imaging system and method

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006514340A (en) 2003-04-08 2006-04-27 イクスドライデー テヒノロギーズ ゲーエムベーハー Method for producing a three-dimensional display screen
US20080316302A1 (en) 2004-04-13 2008-12-25 Koninklijke Philips Electronics, N.V. Autostereoscopic Display Device
KR20040097972A (en) 2004-10-25 2004-11-18 이정혜 A Method of displaying 3-D image on small size display panels
KR20060060409 (en) 2004-11-30 2006-06-05 엘지전자 주식회사 Picture display apparatus for mobile terminal
US20070153380A1 (en) 2006-01-03 2007-07-05 Samsung Electronics Co., Ltd. High-resolution field sequential autostereoscopic display
JP2007319237A (en) 2006-05-30 2007-12-13 Namco Bandai Games Inc Game apparatus
US8368745B2 (en) * 2008-09-19 2013-02-05 Samsung Electronics Co., Ltd. Apparatus and method to concurrently display two and three dimensional images
US20110013879A1 (en) * 2009-07-14 2011-01-20 Nintendo Co., Ltd. Information processing system, information processing apparatus, and computer-readable storage medium having information processing program stored therein
US20110157329A1 (en) * 2009-12-28 2011-06-30 Acer Incorporated Method for switching to display three-dimensional images and digital display system
US20110157169A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation Operating system supporting mixed 2d, stereoscopic 3d and multi-view 3d displays
US20120249545A1 (en) 2011-03-31 2012-10-04 Yoon-Soo Kim User interface apparatus for providing representation of 3d theme for cover page and method of operating the same
KR20130045109 (en) 2011-10-25 2013-05-03 엘지전자 주식회사 Display module and mobile terminal having the same
US20130196709A1 (en) * 2012-01-31 2013-08-01 Lg Electronics Inc. Mobile terminal, controlling method thereof and recording medium thereof
KR101307774B1 (en) 2012-05-22 2013-09-12 동아대학교 산학협력단 Design method of 2d-3d switchable lc lens
US20140015743A1 (en) * 2012-07-11 2014-01-16 Samsung Electronics Co., Ltd. Flexible display apparatus and operating method thereof
US20140111713A1 (en) * 2012-10-23 2014-04-24 Kabushiki Kaisha Toshiba Liquid crystal optical element and image device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Search Report dated May 1, 2015 issued by the International Searching Authority in counterpart International Patent Application No. PCT/KR2015/001466 (PCT/ISA/210).
Written Opinion dated May 1, 2015 issued by the International Searching Authority in counterpart International Patent Application No. PCT/KR2015/001466 (PCT/ISA/237).

Also Published As

Publication number Publication date
US20170054971A1 (en) 2017-02-23
CN104853008B (en) 2020-05-19
KR102187186B1 (en) 2020-12-04
KR20150097369A (en) 2015-08-26
CN104853008A (en) 2015-08-19

Similar Documents

Publication Publication Date Title
US10652515B2 (en) Information processing apparatus, stereoscopic display method, and program
US9495805B2 (en) Three dimensional (3D) display terminal apparatus and operating method thereof
US8970478B2 (en) Autostereoscopic rendering and display apparatus
RU2559720C2 (en) Device and method of user's input for control over displayed data
US20160217602A1 (en) Method for generating eia and apparatus capable of performing same
US9684412B2 (en) Method and apparatus for generating a three-dimensional user interface
US9612444B2 (en) Display apparatus and control method thereof
US20120054690A1 (en) Apparatus and method for displaying three-dimensional (3d) object
BR112014016867B1 (en) MOBILE VIEWING DEVICE, A METHOD TO ALLOW A USER TO OBTAIN BOTH A THREE-DIMENSIONAL CONTENT AND A TWO-DIMENSIONAL CONTENT VIEW WITH THE USE OF A COMPUTER-READABLE MOBILE VIEWING DEVICE
US9432652B2 (en) Information processing apparatus, stereoscopic display method, and program
US20130222363A1 (en) Stereoscopic imaging system and method thereof
US8988500B2 (en) Information processing apparatus, stereoscopic display method, and program
US10324686B2 (en) Electronic device and operation method therefor
US20120120029A1 (en) Display to determine gestures
TW201503050A (en) Three dimensional data visualization
US11508131B1 (en) Generating composite stereoscopic images
KR101841719B1 (en) Multi-display apparatus and the operation method thereof
CN105867597B (en) 3D interaction method and 3D display equipment
US9674501B2 (en) Terminal for increasing visual comfort sensation of 3D object and control method thereof
US11145113B1 (en) Nested stereoscopic projections

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

AS Assignment

Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:JIAO, SHAOHUI;WANG, HAITAO;ZHOU, MINGCAI;AND OTHERS;SIGNING DATES FROM 20190311 TO 20190503;REEL/FRAME:049100/0616

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4