US20200228717A1 - User terminal apparatus and control method thereof
- Publication number: US20200228717A1 (application No. US16/834,848)
- Authority: US (United States)
- Prior art keywords: display area, processor, camera, terminal apparatus, user terminal
- Prior art date
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- H04N5/23293
- H04N5/23216
- H04N5/23219
- H04N5/23245
- G06F1/1637—Details related to the display arrangement, including those related to the mounting of the display in the housing
- G06F1/1641—Details related to the display arrangement, the display being formed by a plurality of foldable display components
- G06F1/1647—Details related to the display arrangement, including at least an additional display
- G06F1/1652—Details related to the display arrangement, the display being flexible, e.g. mimicking a sheet of paper, or rollable
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
- H04N23/62—Control of parameters via user interfaces
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/667—Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
- H04M2250/12—Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
- H04M2250/16—Details of telephonic subscriber devices including more than one display unit
Definitions
- the present disclosure relates to a user terminal apparatus and a control method thereof. More particularly, the present disclosure relates to a user terminal apparatus which uses displays disposed on the front surface and the rear surface of the user terminal apparatus when photographing, and a control method thereof.
- a camera is provided on the rear surface of the electronic device.
- in response to a display being provided on the rear surface of the electronic device, selfie photographing may be easily performed.
- a user may photograph while checking a photography state using the rear display.
- an aspect of the present disclosure is to provide a user terminal apparatus which controls displays disposed on the front surface and the rear surface of the user terminal apparatus based on a photographing situation, and a control method thereof.
- a user terminal apparatus includes a display including a main display area which is disposed on a front surface of the user terminal apparatus, and a sub display area which extends from one side of the main display area and is disposed on at least one area of a rear surface of the user terminal apparatus, a camera configured to photograph an image, and a processor configured to display a live view acquired through the camera on one of the main display area or the sub display area, and control the display to display, in response to an orientation of the user terminal apparatus being changed, the live view on another one of the main display area or the sub display area.
- a control method of a user terminal apparatus which includes a display including a main display area disposed on a front surface of the user terminal apparatus and a sub display area which extends from one side of the main display area and is disposed on at least one area of a rear surface of the user terminal apparatus, and a camera configured to photograph an image, includes displaying a live view acquired through the camera on one of the main display area or the sub display area, and displaying, in response to an orientation of the user terminal apparatus being changed, the live view on another one of the main display area or the sub display area.
- FIGS. 1A and 1B are views to illustrate an example of a configuration of a display of a user terminal apparatus according to various embodiments of the present disclosure
- FIG. 2A is a block diagram illustrating a configuration of a user terminal apparatus according to an embodiment of the present disclosure
- FIG. 2B is a block diagram illustrating an example of a detailed configuration of a user terminal apparatus according to an embodiment of the present disclosure
- FIG. 2C illustrates various modules stored in a storage according to an embodiment of the present disclosure
- FIG. 3 illustrates an example of using one of a main display area, a sub display area, and a round display area according to an embodiment of the present disclosure
- FIG. 4 illustrates an example of using at least two of a main display area, a sub display area, and a round display area according to an embodiment of the present disclosure
- FIG. 5 illustrates respective areas and an example of an operation corresponding to a touch input according to an embodiment of the present disclosure
- FIG. 6 illustrates an operation in response to the orientation of the user terminal apparatus being changed according to an embodiment of the present disclosure
- FIGS. 7A and 7B illustrate an example of a method for changing a provided image according to a display area according to various embodiments of the present disclosure
- FIGS. 8A and 8B illustrate an example of a method for executing a specific function through a main display area according to various embodiments of the present disclosure
- FIG. 9 illustrates a method for displaying a live view on both a main display area and a sub display area according to an embodiment of the present disclosure
- FIG. 10 illustrates a method for displaying a live view on a main display area and displaying a specific content on a sub display area according to an embodiment of the present disclosure
- FIGS. 11A and 11B illustrate an example of a method for determining an area to display a live view according to various embodiments of the present disclosure
- FIG. 12 illustrates a method for detecting a change in an orientation of a user terminal apparatus according to an embodiment of the present disclosure
- FIGS. 13A and 13B illustrate an example of an operation corresponding to a change in an orientation of a user terminal apparatus according to various embodiments of the present disclosure
- FIGS. 14A and 14B illustrate an example of a method for executing a photographing function according to various embodiments of the present disclosure
- FIG. 15 illustrates a method for using the round display area according to an embodiment of the present disclosure
- FIG. 16 illustrates a camera which is provided on a front surface of a user terminal apparatus according to an embodiment of the present disclosure
- FIGS. 17A to 17C illustrate an example of a case in which other applications are used according to various embodiments of the present disclosure
- FIGS. 18A and 18B illustrate an example of a configuration of a display according to various embodiments of the present disclosure.
- FIG. 19 is a flowchart to illustrate a control method of a user terminal apparatus according to an embodiment of the present disclosure.
- the terms first and second used in various embodiments are used to distinguish various elements from one another regardless of an order or importance of the corresponding elements. Accordingly, the order or importance of the elements is not limited by these terms.
- a first element may be named a second element without departing from the scope of right of various embodiments of the present disclosure, and similarly, a second element may be named a first element.
- in response to an element (e.g., a first element) being coupled with/to another element (e.g., a second element), the element may be directly coupled with/to the other element, or there may be an intervening element (e.g., a third element) between the element and the other element.
- FIGS. 1A and 1B are views to illustrate an example of a configuration of a display of a user terminal apparatus according to various embodiments of the present disclosure.
- the left view shows the front surface of a user terminal apparatus 100
- the right view shows the rear surface of the user terminal apparatus 100
- a front display is disposed on the front surface of the user terminal apparatus 100
- a rear display and a camera are disposed on the rear surface of the user terminal apparatus 100
- the front display and the rear display may be connected with each other, and the rear display may be smaller than the front display. However, this should not be considered as limiting, and the front display and the rear display may have the same size.
- the camera may be disposed on the front surface of the user terminal apparatus 100 .
- the front display will be explained as a main display area 10 and the rear display will be explained as a sub display area 20 .
- the left view is a front view showing the entire display in which the main display area 10 , the sub display area 20 , and a round display area 30 are connected with one another, and the right view is a rear view of the entire display.
- the sub display area 20 may extend from one side of the main display area 10 and may be disposed on at least one area of the rear surface of the user terminal apparatus 100 .
- the sub display area 20 may extend from the top of the main display area 10 to be bent.
- the bent area may have a curved shape, but is not limited to this and may form an angle according to the type of the display.
- the round display area 30 is an area for connecting the main display area 10 and the sub display area 20 . As described above, the round display area 30 may have a curved shape or an angular shape. The round display area 30 may be distinguished from the main display area 10 and the sub display area 20 by boundary lines 30 - 1 , 30 - 2 .
- the boundary lines 30 - 1 , 30 - 2 shown in FIG. 1B are merely an example and may be changed.
- the boundary lines 30 - 1 , 30 - 2 may be determined by a manufacturer at the time of manufacturing, and may be changed by a user.
- the sizes of the main display area 10 , the sub display area 20 , and the round display area 30 may be changed and the size of a content displayed on each area may be changed.
- the display encloses the upper side of the user terminal apparatus 100 .
- the display may enclose any one of the lower side, the left side surface, and the right side surface of the user terminal apparatus 100 .
- the display may enclose a plurality of side surfaces rather than a single side surface.
- the display may include a touch pad, and the display and the touch pad may be implemented in the form of a touch screen by forming a mutual layer structure.
- the touch pad is also bent similarly to the display, and thus a touch input may be interpreted opposite to what the user intends. For example, in response to the user dragging from a certain point of the main display area 10 to an upper point and dragging from a certain point of the sub display area 20 to an upper point, the two drags differ only in the touch area but are made in the same physical direction on the touch pad.
- in other words, since the touch pad is bent, a drag on the sub display area 20 may be received in the direction opposite to the direction intended by the user. Therefore, the user terminal apparatus 100 may be set to recognize the direction of a touch input on a certain area as the opposite direction. The details of such a setting are beyond the scope of the present disclosure, and thus a detailed description thereof is omitted.
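As an illustration, the following is a minimal sketch of such a direction remapping in Kotlin (a language the disclosure itself does not use); the `DisplayArea` and `DragEvent` types and the choice of flipped axes are assumptions for illustration, not part of the patent:

```kotlin
// Hypothetical sketch: because the touch pad wraps over the top edge,
// a drag the user perceives as "upward" on the rear (sub) area arrives
// with inverted deltas relative to the front. Remap such events so both
// areas report the same logical direction to the application layer.
enum class DisplayArea { MAIN, SUB, ROUND }

data class DragEvent(val area: DisplayArea, val dx: Float, val dy: Float)

fun toLogicalDrag(e: DragEvent): DragEvent =
    when (e.area) {
        // Assumption: the sub display area flips both axes; actual panel
        // wiring may require flipping only the vertical axis.
        DisplayArea.SUB -> e.copy(dx = -e.dx, dy = -e.dy)
        else -> e
    }
```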
- a receiver may be provided on the side surface or lower portion of the user terminal apparatus 100 .
- a directional receiver may be provided.
- the receiver may be provided on an integral cover which may interwork with the user terminal apparatus 100 .
- a processor 130 may reverse the top and bottom of an image and display the image while the user is talking on the phone.
- the present disclosure will be explained on the assumption that the sub display area 20 has a structure extending from the main display area 10 through the round display area 30 of the curved shape, and the camera is disposed on the rear surface of the user terminal apparatus 100 . Additionally, various embodiments will be expanded and described when the display has other configurations and the location of the camera is changed.
- the direction of the main display area 10 and the sub display area 20 going toward the round display area 30 will be explained as being upward from the user terminal apparatus 100 , and the opposite direction will be explained as being downward.
- the left side and the right side will be explained with reference to a display area which is viewed by the user. Accordingly, the left side and the right side when the user views the main display area 10 are reversed when the user views the sub display area 20 .
- FIG. 2A is a block diagram illustrating a configuration of a user terminal apparatus according to an embodiment of the present disclosure.
- a user terminal apparatus 100 includes a display 110 , a camera 120 , and a processor 130 .
- FIG. 2A illustrates overall elements of the user terminal apparatus 100 when the user terminal apparatus 100 is provided with various functions such as a display function, a control function, and the like. Accordingly, some of the elements shown in FIG. 2A may be omitted or changed and other elements may be added according to an embodiment.
- the display 110 may include a main display area 10 which is disposed on the front surface of the user terminal apparatus 100 , a sub display area 20 which extends from one side of the main display area 10 and is disposed on at least one area of the rear surface of the user terminal apparatus 100 , and a round display area 30 for connecting the main display area 10 and the sub display area 20 .
- the sub display area 20 may be configured to enclose the entire rear surface of the user terminal apparatus 100 .
- front surface and rear surface are used for convenience of explanation and are not limited by the meaning thereof.
- the front surface and the rear surface may refer to one side surface and the other side surface regarding a specific electronic device.
- the display 110 extends from one side of the main display area 10 .
- the display 110 may extend from all side surfaces of the main display area 10 and cover the entire user terminal apparatus 100 .
- the sub display area 20 of the display 110 may extend from the upper side of the main display area 10 to be bent and may be disposed on the upper area of the rear surface.
- a connection part between the main display area 10 and the sub display area 20 may be formed in the shape of “U” and have a curved line when the main display area 10 and the sub display area 20 are viewed from the side.
- the connection part between the main display area 10 and the sub display area 20 may be formed in the shape of a squared “C” and have an angle of 90° when the main display area 10 and the sub display area 20 are viewed from the side.
- various connection parts may be formed based on the type of the user terminal apparatus 100 .
- the display 110 may display various user interfaces (UIs) under the control of the processor 130 .
- the display 110 may display a live view, a gallery application, an animation, and the like.
- the display 110 may display different contents on the main display area 10 , the sub display area 20 , and the round display area 30 under the control of the processor 130 .
- the display 110 may display a moving picture on the main display area 10 , an image on the sub display area 20 , and a UI for transmitting a message on the round display area 30 .
- the display 110 may display a content by interlocking at least two of the main display area 10 , the sub display area 20 , and the round display area 30 .
- the display 110 may display a moving picture on the main display area 10 and display a UI for controlling the moving picture on the sub display area 20 .
- the display 110 may display a UI for providing a function unrelated to the moving picture on the round display area 30 .
- the display 110 may display the same content on at least two of the main display area 10 , the sub display area 20 , and the round display area 30 .
- the display 110 may display the same content on the main display area 10 and the sub display area 20 and may display a separate content on the round display area 30 .
- the display 110 may be implemented by using a liquid crystal display (LCD) panel, an organic light emitting diodes (OLED) display, a plasma display panel (PDP), and the like, but is not limited to these.
- the display 110 may be implemented by using a transparent display, a flexible display, and the like, according to circumstances.
- the camera 120 is configured to photograph a still image or a moving picture under the control of the user.
- the camera 120 may photograph a still image at a specific time or may continuously photograph a still image.
- the camera 120 provides the acquired image to the display 110 , and a live view may be displayed on at least one of the main display area 10 and the sub display area 20 .
- the camera 120 may photograph the user or a background image according to the orientation of the user terminal apparatus 100 .
- the camera 120 may include a plurality of cameras such as a front camera and a rear camera.
- the camera 120 includes a lens, a shutter, an aperture, a solid state imaging device, an analog front end (AFE), and a timing generator (TG).
- the shutter adjusts a time at which light reflected from a subject enters the user terminal apparatus 100
- the aperture adjusts an amount of light entering the lens by mechanically increasing or reducing the size of an opening through which light enters.
- the solid state imaging device accumulates the light reflected from the subject as photo-charge and outputs the image generated by the photo-charge as an electric signal.
- the TG outputs a timing signal to read out pixel data of the solid state imaging device, and the AFE samples and digitizes the electric signal outputted from the solid state imaging device.
- the processor 130 may control the overall operation of the user terminal apparatus 100 .
- the processor 130 may control the display 110 to display a live view acquired through the camera 120 on one of the main display area 10 and the sub display area 20 , and to display the live view on the other one of the main display area 10 and the sub display area 20 in response to the orientation of the user terminal apparatus 100 being changed.
- the processor 130 may determine whether the orientation of the user terminal apparatus 100 is changed using a gravity sensor, an acceleration sensor, a gyro sensor, and the like.
- the processor 130 may determine whether the orientation of the user terminal apparatus 100 is changed by analyzing the live view acquired by the camera 120 . However, this should not be considered as limiting.
- the processor 130 may also change the area to display the live view in response to the user terminal apparatus 100 being moved or shaken in a specific direction, in addition to changing the area in response to the orientation of the user terminal apparatus 100 being changed.
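A hypothetical sketch of the orientation-based switching described above, reusing the `DisplayArea` enum from the earlier sketch; the pitch-angle model, the `ROTATION_THRESHOLD_DEG` value, and the `LiveViewController` API are assumptions rather than the patent's method:

```kotlin
// Hypothetical sketch: toggle the live-view area when the device pitch
// (e.g., from gravity/gyro sensor fusion) crosses a threshold, as when
// the user flips the phone so the rear camera faces them.
const val ROTATION_THRESHOLD_DEG = 150f   // assumed "predetermined angle"

class LiveViewController(private var area: DisplayArea = DisplayArea.MAIN) {
    fun onPitchChanged(pitchDeg: Float) {
        // Camera facing away from the user: live view on the main area.
        // Flipped past the threshold: live view on the rear sub area.
        val target =
            if (pitchDeg >= ROTATION_THRESHOLD_DEG) DisplayArea.SUB
            else DisplayArea.MAIN
        if (target != area) {
            area = target
            showLiveViewOn(area)   // hand the camera frames to the other panel
        }
    }

    private fun showLiveViewOn(a: DisplayArea) { /* platform-specific */ }
}
```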
- the processor 130 may display the live view and additionally may display a guide line.
- the processor 130 may display a graphical UI (GUI) for executing a gallery application, a GUI for executing an image correction application, and the like, in addition to a GUI for photographing.
- the processor 130 may display the live view on the main display area 10 and may not provide information on the sub display area 20 .
- the processor 130 may display the live view on the sub display area 20 and may not provide information on the main display area 10 .
- the information may not be provided in various ways. For example, power may not be supplied to the display 110 , or black may be displayed on the display 110 .
- the processor 130 may activate a touch function of the main display area 10 while displaying the live view on the sub display area 20 without providing information on the main display area 10 .
- the processor 130 may photograph and store an image.
- the processor 130 may change a photographing setting value to correspond to the predetermined direction.
- the processor 130 may display the live view on the main display area 10 , and, in response to recognizing the live view as including a person's face, the processor 130 may control the display 110 to display the face area included in the live view on the sub display area 20 .
- the processor 130 may crop a part of the live view to correspond to the size of the sub display area 20 and control the display 110 to display the cropped part on the sub display area 20 .
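One plausible implementation of such a crop, as a sketch only; the `RectI` type and the strategy of matching the sub display area's aspect ratio around the face center are assumptions:

```kotlin
// Hypothetical sketch: expand a detected face rectangle to the sub
// display area's aspect ratio, clamped to the live-view frame bounds.
data class RectI(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    val width get() = right - left
    val height get() = bottom - top
}

fun cropForSubDisplay(face: RectI, frameW: Int, frameH: Int,
                      subW: Int, subH: Int): RectI {
    val targetAspect = subW.toFloat() / subH
    // Grow the face box to the target aspect ratio, then clamp to the frame.
    var w = face.width
    var h = (w / targetAspect).toInt()
    if (h < face.height) { h = face.height; w = (h * targetAspect).toInt() }
    w = w.coerceAtMost(frameW)
    h = h.coerceAtMost(frameH)
    val cx = (face.left + face.right) / 2
    val cy = (face.top + face.bottom) / 2
    val left = (cx - w / 2).coerceIn(0, frameW - w)
    val top = (cy - h / 2).coerceIn(0, frameH - h)
    return RectI(left, top, left + w, top + h)
}
```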
- the processor 130 may display the live view on the main display area 10 , and, in response to recognizing the live view as including a person, the processor 130 may control the display 110 to display an animation on the sub display area 20 .
- the processor 130 may determine a distance to a subject, and, in response to determining the distance is shorter than a predetermined distance, the processor 130 may control the display 110 to display the live view on the sub display area 20 , and, in response to determining the distance is longer than the predetermined distance, control the display 110 to display the live view on the main display area 10 .
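A sketch of this distance rule; the patent only says "predetermined distance", so the `NEAR_THRESHOLD_M` value below is an invented placeholder, and the distance source (proximity sensor, autofocus data) is likewise assumed:

```kotlin
// Hypothetical sketch: pick the live-view area from the subject distance.
const val NEAR_THRESHOLD_M = 0.6f   // assumed arm's-length threshold

fun areaForSubjectDistance(distanceM: Float): DisplayArea =
    if (distanceM < NEAR_THRESHOLD_M) DisplayArea.SUB   // close-up selfie: rear panel
    else DisplayArea.MAIN                               // normal scene shot: front panel
```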
- the user terminal apparatus 100 may further include a plurality of sensors, and the processor 130 may determine whether the orientation of the user terminal apparatus 100 is changed based on at least one of a location and a motion of the user terminal apparatus 100 and a user's grip detected by the plurality of sensors.
- in this case, the processor 130 may determine that the orientation of the user terminal apparatus 100 is changed.
- FIG. 2B is a block diagram illustrating an example of a detailed configuration of a user terminal apparatus according to an embodiment of the present disclosure.
- a user terminal apparatus 100 includes a display 110 , a camera 120 , a processor 130 , a global positioning system (GPS) chip 145 , a storage 140 , a sensor, a communicator 150 , a user interface 155 , an audio processor 160 , a video processor 170 , a speaker 180 , a button 181 , and a microphone 182 .
- the display 110 may be divided into the main display area 10 , the sub display area 20 , and the round display area 30 as described above.
- the display 110 may be implemented by using various types of displays such as an LCD, an OLED display, a PDP, and the like.
- the display 110 may further include a driving circuit which is implemented by using an amorphous silicon thin film transistor (a-si TFT), a low temperature poly silicon (LTPS) TFT, an organic TFT (OTFT), and the like, and a backlight unit.
- the display 110 may be combined with a touch sensor included in the sensor and may be implemented as a touch screen.
- the touch sensor may include at least one of a touch panel and a pen recognition panel.
- the touch panel may detect a user's finger gesture input and output a touch event value corresponding to a detected touch signal.
- the touch panel may be mounted under at least one of the main display area 10 , the sub display area 20 , and the round display area 30 of the display 110 .
- the touch panel may detect the user's finger gesture input in a capacitive method or a resistive method.
- the capacitive method calculates touch coordinates by detecting minute electricity excited in a user's body.
- the resistive method includes two electrode plates embedded in the touch panel, and calculates touch coordinates by detecting an electric current flowing due to contact between upper and lower plates at a touched point.
- the pen recognition panel may detect a user's pen gesture input according to a user's operation of using a touch pen (e.g., a stylus pen, a digitizer pen), and output a pen proximity event value or a pen touch event value.
- the pen recognition panel may be mounted under at least one of the main display area 10 , the sub display area 20 , and the round display area 30 of the display 110 .
- the pen recognition panel may be implemented in an electromagnetic resonance (EMR) method, for example, and may detect a touch or a proximity input according to a change in the intensity of an electromagnetic field caused by the proximity or touch of the pen.
- the pen recognition panel may include an electromagnetic induction coil sensor (not shown) having a grid structure, and an electronic signal processor (not shown) which provides an alternating current (AC) signal having a predetermined frequency to each loop coil of the electromagnetic induction coil sensor in sequence.
- a magnetic field transmitted from the corresponding loop coil generates a current in the resonant circuit of the pen based on the mutual electromagnetic induction.
- an induction magnetic field is generated from the coil forming the resonant circuit in the pen, and the pen recognition panel detects the induction magnetic field from the loop coil in a signal reception state, and thus detects the proximity location or touch location of the pen.
- the pen recognition panel may be configured differently according to a display area.
- both the touch panel and the pen recognition panel may be provided in the main display area 10
- only the touch panel may be provided in the sub display area 20 and the round display area 30 .
- the processor 130 may deactivate a specific panel by shutting off the power to the specific panel.
- the processor 130 may supply power to the touch panel and the pen recognition panel and receive a touch or a pen input, but may deactivate a specific panel by disregarding an input to the specific panel in a software level.
- the processor 130 may receive at least one of a hand touch and a pen touch according to whether the touch panel and the pen recognition panel of the main display area 10 are activated. For example, the processor 130 may activate only the touch panel not to receive the pen touch input and receive only the hand touch input or may activate only the pen recognition panel not to receive the hand touch input and receive only the pen touch input.
- the touch panel and the pen recognition panel may be provided on the entire display area and the pen recognition panel of some areas may be deactivated.
- the processor 130 may deactivate only the pen recognition panel corresponding to the sub display area 20 and the round display area 30 .
- the touch panel and the pen recognition panel may be implemented as a single panel.
- the processor 130 may detect a touch input and a pen input through the entire display area.
- the processor 130 may perform an operation corresponding to an input based on at least one of an input type and an input area. For example, the processor 130 may disregard a specific input such as a pen input in a software level. In addition, the processor 130 may disregard a pen input to the sub display area 20 in a software level.
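A minimal sketch of such software-level gating, reusing the `DisplayArea` enum; the `InputPolicy` table and its particular contents are assumptions:

```kotlin
// Hypothetical sketch: both panels stay powered, but events are dropped
// by input type and display area before they reach the application.
enum class InputType { FINGER, PEN }

data class InputPolicy(val allowed: Map<DisplayArea, Set<InputType>>)

val policy = InputPolicy(
    mapOf(
        DisplayArea.MAIN to setOf(InputType.FINGER, InputType.PEN),
        DisplayArea.SUB to setOf(InputType.FINGER),   // pen input disregarded here
        DisplayArea.ROUND to setOf(InputType.FINGER),
    )
)

fun shouldDispatch(area: DisplayArea, type: InputType): Boolean =
    policy.allowed[area]?.contains(type) ?: false
```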
- the processor 130 may control the overall operations of the user terminal apparatus 100 using various programs stored in the storage 140 .
- the processor 130 may include a random access memory (RAM) 131 , a read only memory (ROM) 132 , a main central processing unit (CPU) 133 , a graphic processor 134 , first to n-th interfaces 135 - 1 to 135 - n , and a bus 136 .
- the RAM 131 , the ROM 132 , the main CPU 133 , the graphic processor 134 , and the first to n-th interfaces 135 - 1 to 135 - n may be connected with one another via the bus 136 .
- the first to n-th interfaces 135 - 1 to 135 - n may be connected with the above-described various elements.
- One of the interfaces may be a network interface which is connected with an external device via a network.
- the main CPU 133 may access the storage 140 and perform booting using an operating system (O/S) stored in the storage 140 .
- the main CPU 133 may perform various operations using various programs stored in the storage 140 .
- the ROM 132 may store a set of instructions for booting a system.
- the main CPU 133 may copy the O/S stored in the storage 140 into the RAM 131 according to a command stored in the ROM 132 , and boot the system by executing the O/S.
- the main CPU 133 may copy various application programs stored in the storage 140 into the RAM 131 , and perform various operations by executing the application programs copied into the RAM 131 .
- the graphic processor 134 may generate a screen including various objects such as an icon, an image, a text, and the like, using a calculator (not shown) and a renderer (not shown).
- the calculator (not shown) may calculate attribute values of objects to be displayed according to a layout of the screen, such as a coordinate value, a shape, a size, a color, and the like, based on a received control command.
- the renderer (not shown) may generate the screen of various layouts including objects based on the attribute values calculated by the calculator (not shown).
- the screen generated by the renderer (not shown) may be displayed in the display area of the display 110 .
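The calculator/renderer split might be organized as in the following sketch; the attribute fields, the scale factor, and the console-based renderer are illustrative assumptions, not the disclosed implementation:

```kotlin
// Hypothetical sketch: the "calculator" derives per-object attributes
// (coordinates, size, color) from a control command and the target area;
// the "renderer" composes a screen from the attributed objects.
data class ObjectAttrs(val x: Int, val y: Int, val w: Int, val h: Int,
                       val color: Int, val text: String)

fun calculateAttrs(command: String, area: DisplayArea): List<ObjectAttrs> {
    // Assumed layout rule: the smaller sub display area halves object sizes.
    val scale = if (area == DisplayArea.SUB) 0.5f else 1.0f
    return listOf(
        ObjectAttrs(x = 0, y = 0,
                    w = (320 * scale).toInt(), h = (48 * scale).toInt(),
                    color = 0xFFFFFF, text = command)
    )
}

fun render(objects: List<ObjectAttrs>) {
    // Stand-in for the real renderer: just describe each draw call.
    objects.forEach { println("draw '${it.text}' at (${it.x},${it.y}) ${it.w}x${it.h}") }
}
```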
- the above-described operations of the processor 130 may be achieved by a program stored in the storage 140 .
- the storage 140 may store a variety of data such as an O/S software module for driving the user terminal apparatus 100 , a photographing module, and an application module.
- the processor 130 may process and display an input image based on information stored in the storage 140 .
- the GPS chip 145 is an element for receiving a GPS signal from a GPS satellite, and calculating a current location of the user terminal apparatus 100 .
- the processor 130 may calculate the user's location using the GPS chip 145 .
- the communicator 150 is configured to communicate with various kinds of external devices according to various kinds of communication methods.
- the communicator 150 includes a Wi-Fi chip 151 , a Bluetooth chip 152 , a wireless communication chip 153 , and a near field communication (NFC) chip 154 .
- the processor 130 may communicate with various external devices using the communicator 150 .
- the Wi-Fi chip 151 and the Bluetooth chip 152 communicate in a Wi-Fi method and a Bluetooth method, respectively.
- a variety of connection information such as a service set identifier (SSID) and a session key may be exchanged first, and communication may be established using the connection information, and then a variety of information may be exchanged.
- the wireless communication chip 153 refers to a chip which communicates according to various communication standards such as The Institute of Electrical and Electronics Engineers (IEEE), ZigBee, 3rd generation (3G), 3G partnership project (3GPP), long term evolution (LTE), and the like.
- the NFC chip 154 refers to a chip which operates in an NFC method using a band of 13.56 MHz from among various radio frequency identification (RF-ID) frequency bands such as 135 kHz, 13.56 MHz, 433 MHz, 860-960 MHz, and 2.45 GHz.
- the communicator 150 may perform unidirectional communication or bidirectional communication with an electronic device. When the communicator 150 performs the unidirectional communication, the communicator 150 may receive signals from the electronic device. When the communicator 150 performs the bidirectional communication, the communicator 150 may receive signals from the electronic device or transmit signals to the electronic device.
- the user interface 155 may receive various types of user interaction.
- the user interface 155 may be implemented in the form of a touch screen forming a mutual layer structure with a touch pad.
- the user interface 155 may be used as the above-described display 110 .
- the sensor may include a touch sensor, a geomagnetic sensor, a gyro sensor, an acceleration sensor, a proximity sensor, a grip sensor, and the like.
- the sensor may detect various operations such as rotation, tilt, pressure, approach, grip, and the like, in addition to the above-described touch.
- the touch sensor may be implemented in a capacitive method or a resistive method.
- the capacitive type touch sensor is a sensor which calculates touch coordinates by detecting minute electricity excited in a user's body when a part of the user's body touches the surface of the display, using a dielectric substance coated on the surface of the display.
- the resistive type touch sensor includes two electrode plates embedded in the user terminal apparatus 100 , and, when the user touches the screen, calculates touch coordinates by detecting an electric current flowing due to contact between upper and lower plates at the touched point.
- infrared beam, surface acoustic wave, integral strain gauge, piezo electric, and the like may be used to detect a touch interaction.
- the user terminal apparatus 100 may determine whether a touch object such as a finger or a stylus pen touches or approaches using a magnetic field sensor, an optical sensor, a proximity sensor, and the like, instead of the touch sensor.
- the geomagnetic sensor is a sensor for detecting a rotational state, a moving direction, and the like, of the user terminal apparatus 100 .
- the gyro sensor is a sensor for detecting a rotational angle of the user terminal apparatus 100 . Both the geomagnetic sensor and the gyro sensor may be provided, but, even when only one of them is provided, the user terminal apparatus 100 may detect a rotation state.
- the acceleration sensor is a sensor for detecting how the user terminal apparatus 100 is tilted.
- the proximity sensor is a sensor for detecting a motion which approaches without directly contacting the display surface.
- the proximity sensor may be implemented by using various types of sensors such as a high-frequency oscillation type proximity sensor which forms a high frequency magnetic field and detects an electric current induced by a magnetic characteristic which is changed when an object approaches, a magnetic type proximity sensor which uses a magnet, and a capacitive type proximity sensor which detects capacitance that changes when an object approaches, and the like.
- the grip sensor may be disposed on the rear surface, edge, or handle part separately from the touch sensor provided on the touch screen, and detects a user's grip.
- the grip sensor may be implemented as a pressure sensor in addition to the touch sensor.
- the audio processor 160 is an element for processing audio data.
- the audio processor 160 may perform various processing operations such as decoding, amplification, noise filtering, and the like, with respect to the audio data.
- the video processor 170 is an element for processing video data.
- the video processor 170 may perform various image processing operations such as decoding, scaling, noise filtering, frame rate conversion, resolution conversion, and the like, with respect to the video data.
- the speaker 180 is an element for outputting not only various audio data processed by the audio processor 160 but also various notification sounds, voice messages, and the like.
- the button 181 may include various types of buttons such as a mechanical button, a touch pad, a wheel, and the like, formed on a certain area of the user terminal apparatus 100 , such as the front surface, the side surface, and the rear surface of the body exterior of the user terminal apparatus 100 .
- the microphone 182 is an element for receiving an input of a user voice or other sounds and converting the user voice or sound into audio data.
- the user terminal apparatus 100 may further include a universal serial bus (USB) port to which a USB connector may be connected, various external input ports for connecting to various external terminals such as a headset, a mouse, and a local area network (LAN), a digital multimedia broadcasting (DMB) chip for receiving and processing a DMB signal, various sensors, and the like.
- FIG. 2C illustrates various modules stored in a storage according to an embodiment of the present disclosure.
- the software of FIG. 2C may be stored in the storage 140 , but is not limited to this.
- the software may be stored in various kinds of storing means used in the user terminal apparatus 100 .
- software including an O/S 191 , a kernel 192 , middleware 193 , an application 194 , and the like may be stored in the user terminal apparatus 100 .
- the O/S 191 controls and manages the overall operations of hardware. That is, the O/S 191 is a software layer which is responsible for basic functions such as hardware management, memory management, and security.
- the kernel 192 serves as a channel to transmit various signals including a touch signal, and the like, detected by the display 110 to the middleware 193 .
- the middleware 193 includes various software modules to control the operations of the user terminal apparatus 100 .
- the middleware 193 includes a main UI framework 193 - 1 , a window manager 193 - 2 , a sub UI framework 193 - 3 , a security module 193 - 4 , a system manager 193 - 5 , a connectivity manager 193 - 6 , an X11 module 193 - 7 , an APP manager 193 - 8 , and a multimedia framework 193 - 9 .
- the main UI framework 193 - 1 is a module which provides various UIs to be displayed on the main display area 10 of the display 110
- the sub UI framework 193 - 3 is a module which provides various UIs to be displayed on the sub display area 20 .
- the main UI framework 193 - 1 and the sub UI framework 193 - 3 may include an image compositor module to configure various objects, a coordinates compositor module to calculate coordinates for displaying the objects, a rendering module to render the configured objects on the calculated coordinates, and a two dimensional (2D)/three dimensional (3D) UI toolkit to provide a tool for configuring a UI in the form of 2D or 3D.
- the window manager 193 - 2 may detect a touch event using a user's body or a pen or other input events. In response to such an event being detected, the window manager 193 - 2 transmits an event signal to the main UI framework 193 - 1 or the sub UI framework 193 - 3 such that an operation corresponding to the event is performed.
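A sketch of this dispatch, reusing the `DisplayArea` and `DragEvent` types from the earlier sketches; routing round-area events to the sub UI framework is an assumption:

```kotlin
// Hypothetical sketch: the window manager routes a detected event to the
// UI framework that owns the display area where the event occurred.
interface UiFramework { fun handle(event: DragEvent) }

class WindowManagerSketch(
    private val mainUi: UiFramework,   // main UI framework (193-1)
    private val subUi: UiFramework,    // sub UI framework (193-3)
) {
    fun dispatch(event: DragEvent) = when (event.area) {
        DisplayArea.MAIN -> mainUi.handle(event)
        DisplayArea.SUB, DisplayArea.ROUND -> subUi.handle(event)
    }
}
```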
- various program modules such as a writing module which, when the user touches and drags on the screen, draws a line following the trace of the dragging, and an angle calculation module which calculates a pitch angle, a roll angle, a yaw angle, and the like, based on a sensor value detected by the sensor, may be stored.
- the security module 193 - 4 is a module which supports certification, permission, and secure storage for hardware.
- the system manager 193 - 5 monitors the states of the elements in the user terminal apparatus 100 , and provides the result of the monitoring to the other modules. For example, in response to a battery level being low, an error being generated, or communication being disconnected, the system manager 193 - 5 provides the result of the monitoring to the main UI framework 193 - 1 or the sub UI framework 193 - 3 to output a notification message or a notification sound.
- the connectivity manager 193 - 6 is a module which supports wire or wireless network connection.
- the connectivity manager 193 - 6 may include various sub modules such as a DNET module, a universal plug and play (UPnP) module, and the like.
- the X11 module 193 - 7 is a module which receives various event signals from a variety of hardware provided in the user terminal apparatus 100 .
- the event recited herein refers to an event in which a user operation is detected, an event in which a system alarm is generated, an event in which a specific program is executed or ends, and the like.
- the APP manager 193 - 8 is a module which manages the execution states of various applications installed in the storage 140 .
- the APP manager 193 - 8 may call and execute an application corresponding to the event. That is, in response to an event in which at least one object is selected being detected, the APP manager 193 - 8 may call an application corresponding to the object and execute the application.
- the multimedia framework 193 - 9 is a module which reproduces multimedia contents which are stored in the user terminal apparatus 100 or provided from external sources.
- the multimedia framework 193 - 9 may include a player module, a camcorder module, a sound processing module, and the like. Accordingly, the multimedia framework 193 - 9 may reproduce various multimedia contents, generate a screen and a sound, and reproduce the same.
- the storage 140 may be additionally provided with various programs such as a sensing module to analyze signals sensed by various sensors, a messaging module such as a messenger program, a short message service (SMS) and multimedia message service (MMS) program, and an email program, a call information aggregator program module, a voice over internet protocol (VoIP) module, a web browser module, and the like.
- the user terminal apparatus 100 may be implemented by using various kinds of devices such as a mobile phone, a tablet personal computer (PC), a laptop PC, a personal digital assistant (PDA), a Moving Picture Experts Group phase 1 or phase 2 (MPEG-1 or MPEG-2) audio layer 3 (MP3) player, an electronic album device, a television (TV), a PC, a kiosk, and the like. Accordingly, the elements described in FIGS. 2B and 2C may be changed in various ways according to the type of the user terminal apparatus 100 .
- the user terminal apparatus 100 may be implemented in various shapes and configurations.
- FIG. 3 illustrates an example of using only one of a main display area, a sub display area, and a round display area according to an embodiment of the present disclosure.
- the processor 130 may determine an area to provide information based on the orientation of the user terminal apparatus 100 . In response to the processor 130 providing information to one area, the processor 130 may not provide information to the other two areas.
- the processor 130 may determine the area to provide information. In addition, in response to a call or a text message being received, the processor 130 may determine the area to provide information.
- the processor 130 determines the area to provide information based on the orientation of the user terminal apparatus 100 .
- this should not be considered as limiting.
- the processor 130 may recognize the user and provide information through an area which is close to the user's eyes.
- the provided information may vary according to an executed application
- the processor 130 may change the display area through a user's touch input on an area to which no information is provided. For example, in response to a user input of continuously touching a certain area of the sub display area 20 being received while information is provided on the main display area 10 , the processor 130 may display the information provided on the main display area 10 on the sub display area 20 .
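A hypothetical sketch of this long-touch handover; the `HOLD_MS` duration and the `AreaSwitcher` API are assumptions:

```kotlin
// Hypothetical sketch: a long press on an area that currently shows no
// information pulls the active content over to that area.
const val HOLD_MS = 800L   // assumed long-press duration

class AreaSwitcher(private var active: DisplayArea = DisplayArea.MAIN) {
    fun onTouchHeld(area: DisplayArea, heldMs: Long) {
        if (area != active && heldMs >= HOLD_MS) {
            moveContent(from = active, to = area)
            active = area
        }
    }

    private fun moveContent(from: DisplayArea, to: DisplayArea) {
        // Re-render the current screen with the target area's layout and
        // blank the previous area (e.g., display black or cut its power).
    }
}
```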
- the processor 130 may display the same information in different ways according to the area to display the information. For example, the processor 130 may change the layout of a home screen according to the display area as shown in FIG. 3 .
- the processor 130 may change the area to display information. For example, in response to the user terminal apparatus 100 being rotated by more than a predetermined angle while the home screen is being displayed on the main display area 10 , the processor 130 may display the home screen on the sub display area 20 .
- the processor 130 may change the layout of information to be provided.
- the processor 130 may change not only an object but also the size, content, and layout of the object.
- the processor 130 may change the operation state of an application which is being executed and provide the application to another display area.
- the processor 130 may divide displayed information and display divided pieces of information on the other display areas.
- FIG. 4 illustrates an example of using at least two areas of a main display area, a sub display area, and a round display area according to an embodiment of the present disclosure.
- the processor 130 may provide pieces of information related to each other to two areas of the main display area 10 , the sub display area 20 , and the round display area 30 .
- the processor 130 may display a moving picture on the main display area 10 and may display a UI for controlling the moving picture on the round display area 30 .
- the processor 130 may provide pieces of information unrelated to each other to two areas of the main display area 10 , the sub display area 20 , and the round display area 30 .
- the processor 130 may display a call reception UI on the round display area 30 , and move the UI for controlling the moving picture to the sub display area 20 and display the UI on the sub display area 20 .
- the processor 130 may display a telephone call UI on the round display area 30 and continue reproducing the moving picture. In this case, the processor 130 may mute the moving picture. In addition, the processor 130 may pause the moving picture.
- the processor 130 may provide pieces of information related to each other using all of the main display area 10 , the sub display area 20 , and the round display area 30 .
- the processor 130 may display a moving picture on the main display area 10 , display a UI for controlling the moving picture on the sub display area 20 , and display a UI showing a moving picture reproduction time on the round display area 30 .
- the processor 130 may provide pieces of information unrelated to each other using all of the main display area 10 , the sub display area 20 , and the round display area 30 .
- the processor 130 may provide pieces of information related to each other to two areas of the main display area 10 , the sub display area 20 , and the round display area 30 , and provide information unrelated to the aforementioned information to the other area.
- the processor 130 may change the display area of information displayed on each area.
- the processor 130 may change the display area by combining or dividing information displayed on each area.
- the processor 130 may display a UI displayed on the sub display area 20 on the main display area 10 .
- the processor 130 may display a UI displayed on the main display area 10 on at least one of the sub display area 20 and the round display area 30 .
- the processor 130 may adjust a setting value differently according to the touch area. For example, in response to a drag operation being inputted on the main display area 10 or the sub display area 20 while a moving picture is being displayed on the main display area 10 , the processor 130 may adjust a reproduction time or a volume according to the direction of the drag operation. In this case, the processor 130 may adjust the reproduction time or volume more finely in response to the drag operation on the sub display area 20 than in response to the drag operation on the main display area 10 .
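A sketch of that finer-grained mapping; the per-area step sizes are invented, since the disclosure only says the sub display area adjusts the value more finely:

```kotlin
// Hypothetical sketch: the same horizontal drag scrubs the playback
// position, but the rear (sub) area maps pixels to a smaller step.
fun seekDeltaMs(area: DisplayArea, dragPx: Float): Long {
    val msPerPx = when (area) {
        DisplayArea.MAIN -> 200L    // coarse scrubbing on the front
        DisplayArea.SUB -> 20L      // fine scrubbing on the rear
        DisplayArea.ROUND -> 50L
    }
    return (dragPx * msPerPx).toLong()
}
```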
- FIG. 4 merely illustrates an embodiment, and this should not be considered as limiting.
- the main display area 10 , the sub display area 20 , and the round display area 30 of FIG. 4 may be replaced with one another.
- the operations described in FIG. 4 may be applied to any other applications.
- FIG. 5 illustrates respective areas and an example of an operation corresponding to a touch input according to an embodiment of the present disclosure.
- in response to a drag input being received, the processor 130 may display the information provided on the main display area 10 on the sub display area 20 .
- in response to there being information provided on the sub display area 20 before the drag input is received, the processor 130 no longer provides that information on the sub display area 20 . In addition, the processor 130 may display the information previously provided on the sub display area 20 on the main display area 10 or the round display area 30 .
- the processor 130 may display information provided on at least one of the main display area 10 and the sub display area 20 on the round display area 30 .
- the processor 130 may display information provided on the round display area 30 on at least one of the main display area 10 and the sub display area 20 .
- the processor 130 may receive a touch input. For example, in response to a user dragging input on the round display area 30 being received while a broadcast content is being displayed on the main display area 10 , the processor 130 may change a channel or a volume of the broadcast content. In this case, no information may be provided on the round display area 30 .
- FIG. 6 illustrates an operation in response to the orientation of a user terminal apparatus being changed according to an embodiment of the present disclosure.
- the user terminal apparatus 100 includes a display 110 , which includes a main display area 10 which is disposed on the front surface of the user terminal apparatus 100 , a sub display area 20 which extends from one side of the main display area 10 and is disposed on at least one area of the rear surface of the user terminal apparatus 100 , and a round display area 30 for connecting the main display area 10 and the sub display area 20 .
- the user terminal apparatus 100 may be provided with a camera 120 disposed on the rear surface thereof to photograph an image.
- the structure of the user terminal apparatus 100 having the camera 120 disposed on the rear surface thereof will be explained first for convenience of explanation.
- the structure of the user terminal apparatus 100 having the camera 120 disposed on the front surface thereof will be explained thereafter.
- the processor 130 may display a live view which is acquired through the camera 120 on one of the main display area 10 and the sub display area 20 .
- the upper view of FIG. 6 illustrates that the processor 130 displays a live view on the main display area 10 .
- the processor 130 may display the live view on the sub display area 20 , and a method for determining which of the main display area 10 and the sub display area 20 is the area to display the live view will be explained below.
- the processor 130 may control the display 110 to display the live view on the other one of the main display area 10 and the sub display area 20 .
- the lower view of FIG. 6 illustrates that, in response to a rotation by more than a predetermined angle in a specific direction being detected while the live view is being displayed on the main display area 10 , the processor 130 displays the live view on the sub display area 20 .
- the image photographed by the camera 120 may also be changed in response to the orientation of the user terminal apparatus 100 being changed.
- the processor 130 may determine that the orientation of the user terminal apparatus 100 is changed using various methods, and a detailed description thereof will be provided below.
- the processor 130 may display a soft key, a control GUI, and the like, to correspond to the corresponding area. For example, in response to the live view being displayed on the main display area 10 , the processor 130 may display a GUI for photographing on the right side of the main display area 10 . In addition, in response to the orientation of the user terminal apparatus 100 being changed and thus the live view being displayed on the sub display area 20 , the processor 130 may not display the GUI for photographing and may photograph in response to a touch on the sub display area 20 .
- the processor 130 may display a photographing setting value on the main display area 10 .
- the processor 130 may display the live view on the main display area 10 and may not provide information on the sub display area 20 , and, in response to the orientation of the user terminal apparatus 100 being changed, the processor 130 may display the live view on the sub display area 20 and may not provide information on the main display area 10 .
- the processor 130 may not provide information on the sub display area 20 .
- the processor 130 may not provide information on the main display area 10 .
- the processor 130 may display the live view on both the main display area 10 and the sub display area 20 .
- the processor 130 may not provide information by displaying one of the main display area 10 and the sub display area 20 in black. In addition, the processor 130 may not supply power to the display 110 of one of the main display area 10 and the sub display area 20 . Although the processor 130 may not provide information on one of the main display area 10 and the sub display area 20 , the processor 130 may activate a touch function to receive a touch input.
- the processor 130 may change a photographing mode. For example, in response to the orientation of the user terminal apparatus 100 being changed and thus a subject being recognized as being close to the user terminal apparatus 100 , the processor 130 may convert a current mode into a selfie photographing mode.
- the selfie photographing mode is a mode optimized for selfie photographing and may automatically correct the user's skin.
- the processor 130 may convert a current mode into a background photographing mode.
- the background photographing mode is a mode optimized for scene photographing and may automatically adjust contrast and white balance.
- the processor 130 may change an angle of view according to a photographing mode. For example, the processor 130 may narrow the angle of view in the selfie photographing mode and may widen the angle of view in the background photographing mode.
- the processor 130 may change a UI provided on the display according to a photographing mode.
- the processor 130 may provide a UI including a function of correcting user's skin and a function of reducing a red-eye effect in the selfie photographing mode, and may provide a UI including a contrast adjustment function and a white balance adjustment function in the background photographing mode.
- FIGS. 7A and 7B illustrate an example of a method of changing a provided image according to a display area according to various embodiments of the present disclosure.
- the processor 130 may display a live view on the main display area 10 .
- the processor 130 may store the same image as the image displayed on the main display area 10 .
- the processor 130 may display the live view on the sub display area 20 .
- the processor 130 may display only a part of the image which is recognized through the camera 120 .
- FIGS. 7A and 7B illustrate that the same subject is photographed for convenience of explanation. That is, the processor 130 may photograph the image shown in FIG. 7A , but may display only a part of the photographed image on the sub display area 20 as shown in FIG. 7B .
- the processor 130 may rotate the entire photographed image by 90° and display the image on the sub display area 20 .
- the processor 130 may store the entire image. For example, even in response to the user performing selfie photographing with the composition shown in FIG. 7B , the processor 130 may photograph an original image including not only the user's face but also the user's upper body as shown in FIG. 7A .
- a gallery application may be executed and the processor 130 may display the stored image differently according to a display area. For example, in response to the image being displayed on the main display area 10 , the processor 130 may display the original image including the user's upper body. In response to the image being displayed on the sub display area 20 , the processor 130 may display a partial image including only the user's face.
- the processor 130 may store both the original image and the partial image. In addition, the processor 130 may store only the original image, and may crop the image to correspond to the sub display area 20 when necessary and display the cropped image on the sub display area 20 . In this case, information on the part to be cropped may be stored together with the original image.
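- One way to realize storing only the original image and cropping when necessary is to keep the crop rectangle as metadata beside the original image, as in the following sketch. The data layout and field names are assumptions made for illustration, not a format defined by the present disclosure.

```python
# Illustrative sketch: store one original image plus crop metadata (assumed format).
from dataclasses import dataclass

@dataclass
class StoredImage:
    pixels: list     # stand-in for raw image data (rows of pixel values)
    crop_box: tuple  # (left, top, right, bottom) for the sub display area

def view_for_area(image: StoredImage, area: str):
    """Return the full image for the main area, the cropped part for the sub area."""
    if area == "main":
        return image.pixels
    left, top, right, bottom = image.crop_box
    return [row[left:right] for row in image.pixels[top:bottom]]

# 4x4 dummy image; the sub display area shows only the central 2x2 block.
img = StoredImage(pixels=[[c + 4 * r for c in range(4)] for r in range(4)],
                  crop_box=(1, 1, 3, 3))
print(view_for_area(img, "main"))  # entire original image
print(view_for_area(img, "sub"))   # [[5, 6], [9, 10]]
```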
- FIGS. 8A and 8B illustrate an example of a method of executing a specific function through a main display area according to various embodiments of the present disclosure.
- the processor 130 may display a live view on the sub display area 20 and may not provide information on the main display area 10 . In this case, the processor 130 may execute a corresponding function according to a user touch input on the main display area 10 .
- the processor 130 may display a live view on the sub display area 20 and may not provide information on the main display area 10 . In this case, in response to a user touch on the main display area 10 being detected, the processor 130 may photograph and store an image.
- the processor 130 may photograph the image only in response to a user touch on a predetermined area 11 of the main display area 10 being detected.
- the predetermined area 11 may be set by the manufacturer or may be set by the user.
- the processor 130 may activate a touch function of the predetermined area 11 of the main display area 10 and may deactivate a touch function of the other area of the main display area 10 .
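- A minimal sketch of gating the photographing function on the predetermined area 11 is shown below; the rectangle coordinates are assumed values, not values taken from the present disclosure.

```python
# Illustrative sketch: only touches inside a predetermined area trigger photographing.
PREDETERMINED_AREA = (200, 800, 520, 1100)  # (left, top, right, bottom), assumed pixels

def on_main_area_touch(x: int, y: int) -> bool:
    """Photograph only when the touch falls inside the predetermined area."""
    left, top, right, bottom = PREDETERMINED_AREA
    if left <= x <= right and top <= y <= bottom:
        print("photograph and store image")
        return True
    # Touches elsewhere on the main display area are ignored (touch deactivated).
    return False

on_main_area_touch(300, 900)  # inside: photographs
on_main_area_touch(10, 10)    # outside: ignored
```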
- the processor 130 may display a live view on the sub display area 20 and may not provide information on the main display area 10 .
- the processor 130 may change a photographing setting value to correspond to the predetermined direction.
- the processor 130 may change the photographing setting value only in response to a swipe interaction from a certain point in the predetermined area 11 of the main display area 10 in a predetermined direction being detected.
- the predetermined area may be set by the manufacturer or may be set by the user.
- the processor 130 may activate the touch function of the predetermined area 11 of the main display area 10 and deactivate the touch function of the other area of the main display area 10 .
- the processor 130 may change an exposure value and display the live view.
- the processor 130 may display the live view while zooming in or zooming out. However, this should not be considered as limiting.
- the processor 130 may change a focus or change white balance through the swipe interaction.
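- For illustration, the swipe-based adjustment could be implemented as a mapping from swipe direction to a setting change, as in the sketch below. Which direction controls which setting is an assumption, not something specified by the present disclosure.

```python
# Illustrative sketch: map swipe directions to photographing setting changes.
settings = {"exposure": 0.0, "zoom": 1.0}

def on_swipe(direction: str) -> None:
    if direction == "up":
        settings["exposure"] += 0.5                          # brighten
    elif direction == "down":
        settings["exposure"] -= 0.5                          # darken
    elif direction == "right":
        settings["zoom"] = min(settings["zoom"] + 0.5, 8.0)  # zoom in
    elif direction == "left":
        settings["zoom"] = max(settings["zoom"] - 0.5, 1.0)  # zoom out

on_swipe("up")
on_swipe("right")
print(settings)  # {'exposure': 0.5, 'zoom': 1.5}
```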
- the processor 130 may activate the touch function of the entire main display area 10 , and photograph in response to a user touch on a certain point of the main display area 10 being detected.
- the processor 130 may divide the main display area 10 into a plurality of predetermined areas, and may allocate different functions to the predetermined areas.
- the processor 130 may execute corresponding functions in response to various interactions, such as a drag and drop interaction, a multi-touch interaction, and a stretch interaction which refers to pinching out, in addition to the touch and the swipe interaction.
- the interactions and corresponding functions thereof may be set by the manufacturer or may be set by the user.
- the processor 130 executes a specific function in response to a touch on the main display area 10 .
- the processor 130 may display a live view on the main display area 10 and may not provide information on the sub display area 20 .
- the processor 130 may photograph and store an image.
- the processor 130 may change the photographing setting value to correspond to the predetermined direction.
- the processor 130 may display the live view on both the main display area 10 and the sub display area 20 , and may use the area on the opposite side to the area which is viewed by the user as an area to receive a touch input.
- the processor 130 may receive a touch input using the area on the opposite side to the area which is viewed by the user. For example, while displaying a broadcast content on the main display area 10 , the processor 130 may change a channel or a volume through a swipe interaction on the sub display area 20 .
- In FIGS. 8A and 8B , only the main display area 10 and the sub display area 20 are used. However, the round display area 30 may also be used together with one of the main display area 10 and the sub display area 20 . For example, while displaying a live view on the sub display area 20 , the processor 130 may photograph according to a user touch input on the round display area 30 .
- FIG. 9 illustrates a method of displaying a live view on both a main display area and a sub display area according to an embodiment of the present disclosure.
- the processor 130 may display a live view on the main display area 10 , and, in response to the live view being recognized as including a person's face, the processor 130 may control the display 110 to display a face area included in the live view on the sub display area 20 . For example, in response to the user terminal apparatus being far from a subject, the processor 130 may display the live view on the main display area 10 . This will be explained in detail below.
- the processor 130 may recognize a user's face from the live view displayed on the main display area 10 . To recognize the person's face using the live view, the processor 130 may divide a specific still image of the live view into a plurality of pixel blocks, and calculate a representative pixel value for each of the pixel blocks. The representative pixel value may be calculated as an average of all pixels included in the pixel block, or as a maximum distribution value, a median value, or a maximum value. The processor 130 may compare the representative pixel values of the pixel blocks with one another, and determine whether pixel blocks having pixel values falling within a similar range are continuously arranged. In response to the pixel blocks being continuously arranged, the processor 130 may determine that those pixel blocks form a single object.
- the processor 130 may determine whether there exists an object having pixel values falling within a range similar to a person's skin color from among the pixel blocks determined to be objects. In response to such an object existing, the processor 130 may recognize the object as a user's face area or another body area, and determine the other objects as a background.
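- The pixel-block procedure described above amounts to block averaging followed by a range test on the representative values. The sketch below is a simplified illustration operating on a 2D array of pixel values; the block size and the value range treated as skin are assumptions.

```python
# Illustrative sketch of the pixel-block approach: average each block, then flag
# blocks whose representative value falls in an assumed skin-tone range.
BLOCK = 4                # block size in pixels (assumed)
SKIN_RANGE = (120, 200)  # representative-value range treated as skin (assumed)

def block_means(image):
    """image: 2D list of pixel values; returns a grid of per-block averages."""
    rows, cols = len(image), len(image[0])
    grid = []
    for r in range(0, rows, BLOCK):
        row_means = []
        for c in range(0, cols, BLOCK):
            r_end, c_end = min(r + BLOCK, rows), min(c + BLOCK, cols)
            total = sum(image[i][j] for i in range(r, r_end) for j in range(c, c_end))
            row_means.append(total / ((r_end - r) * (c_end - c)))
        grid.append(row_means)
    return grid

def skin_blocks(image):
    """Return (row, col) indices of blocks whose representative value looks like skin."""
    lo, hi = SKIN_RANGE
    return [(r, c)
            for r, row in enumerate(block_means(image))
            for c, mean in enumerate(row)
            if lo <= mean <= hi]

# 8x8 demo image: top half in the assumed skin range, bottom half dark background.
demo = [[130] * 8 for _ in range(4)] + [[40] * 8 for _ in range(4)]
print(skin_blocks(demo))  # [(0, 0), (0, 1)]
```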
- the processor 130 may crop a part 910 of the live view to correspond to the size of the sub display area 20 , and control the display 110 to display the part 910 on the sub display area 20 .
- the processor 130 may crop the part 910 of the live view while maintaining the aspect ratio of the sub display area 20 .
- the processor 130 may crop the part 910 of the live view such that the face area included in the live view is located at the center of the sub display area 20 .
- the processor 130 may crop an area smaller than the part 910 of the live view shown in FIG. 9 , and may magnify the cropped area and display the area on the sub display area 20 .
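- Centering the face area in a crop that keeps the aspect ratio of the sub display area 20 may be computed as in the following sketch; the display dimensions and the face-box format are assumptions made for illustration.

```python
# Illustrative sketch: crop a live-view frame around a face while keeping the
# sub display area's aspect ratio. Dimensions are assumed values.
SUB_W, SUB_H = 640, 360  # sub display area resolution (assumed)

def crop_box_around_face(frame_w, frame_h, face_center, scale=1.0):
    """Return (left, top, right, bottom) keeping the sub area's aspect ratio.

    scale > 1 crops an area larger than the sub display area; scale < 1 crops
    a smaller area that is magnified when displayed, as described above.
    """
    w, h = SUB_W * scale, SUB_H * scale
    cx, cy = face_center
    # Clamp so the box stays inside the frame while preserving its size.
    left = max(0.0, min(cx - w / 2, frame_w - w))
    top = max(0.0, min(cy - h / 2, frame_h - h))
    return (int(left), int(top), int(left + w), int(top + h))

# Face centered at (1000, 700) in a 4000x3000 frame, cropping at 2x size.
print(crop_box_around_face(4000, 3000, (1000, 700), scale=2.0))  # (360, 340, 1640, 1060)
```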
- the processor 130 may display the face area of the person closest to the user from among the people's faces on the sub display area 20 .
- the processor 130 may display a face area of a person who keeps his/her eyes toward the user terminal apparatus 100 from among the people's faces on the sub display area 20 .
- the processor 130 may crop a part of the live view so as to show all people.
- the processor 130 may display the live view on the sub display area 20 without cropping.
- the user who photographs using the user terminal apparatus 100 may identify an image to be photographed through the live view displayed on the main display area 10 , and people who are photographed may identify an image to be photographed through a part of the live view which is cropped and displayed on the sub display area 20 .
- the processor 130 may display the live view on the sub display area 20 and may display a UI informing that selfie photographing is in progress on the main display area 10 .
- FIG. 10 illustrates a method of displaying a live view on a main display area and displaying a specific content on a sub display area according to an embodiment of the present disclosure.
- the processor 130 may display a live view on the main display area 10 , and, in response to the live view being recognized as including a person, the processor 130 may control the display 110 to display an animation on the sub display area 20 .
- the processor 130 may display an animation for children or an animal image on the sub display area 20 .
- the processor 130 may output a sound corresponding to the content displayed on the sub display area 20 . Accordingly, the eyes of the person who is photographed may be kept toward the user terminal apparatus 100 .
- the processor 130 may determine a content to be displayed on the sub display area 20 based on at least one of age and sex of a person who is recognized in the live view. For example, in response to a woman being recognized in the live view, the processor 130 may display jewelry, and the like, on the sub display area 20 .
- the processor 130 may determine a content to be displayed on the sub display area 20 with reference to the closest person from among the people. In addition, the processor 130 may determine a content to be displayed on the sub display area 20 with reference to a person who keeps his/her eyes in a direction other than toward the user terminal apparatus 100 . In addition, the processor 130 may determine a content to be displayed on the sub display area 20 with reference to a person who is located closest to the center of the live view.
- the processor 130 may display a part of the live view, an animation, and the like, on the sub display area 20 .
- the processor 130 may display a part of the live view, an animation, and the like, on the sub display area 20 according to user's manipulation.
- FIGS. 11A and 11B illustrate an example of a method of determining an area to display a live view according to various embodiments of the present disclosure.
- the processor 130 may determine a distance to a subject, and, in response to the determined distance being shorter than a predetermined distance, the processor 130 may control the display 110 to display the live view on the sub display area 20 , and, in response to the determined distance being longer than the predetermined distance, control the display 110 to display the live view on the main display area 10 .
- the processor 130 may determine the distance to the subject using auto focusing. In addition, the processor 130 may analyze a still image at a specific point of time of the live view, and determine the distance to the subject based on an outline of the subject. Accordingly, in response to the user photographing himself/herself while viewing the rear surface of the user terminal apparatus 100 , the live view may be displayed on the sub display area 20 so that the user may photograph while checking an image to be photographed. In addition, in response to the user photographing other people or a scene while viewing the front surface of the user terminal apparatus 100 , the live view may be displayed on the main display area 10 , so that the user may photograph while checking an image to be photographed.
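- The distance test described above reduces to a threshold comparison, as in this minimal sketch. The threshold value and the source of the distance estimate (e.g., an auto focusing result) are assumptions, not values given in the present disclosure.

```python
# Illustrative sketch: pick the display area from an estimated subject distance.
SELFIE_THRESHOLD_M = 0.6  # assumed threshold in meters

def area_for_live_view(subject_distance_m: float) -> str:
    """A close subject suggests selfie photographing on the rear (sub) area."""
    if subject_distance_m < SELFIE_THRESHOLD_M:
        return "sub"   # user views the rear display while photographing himself/herself
    return "main"      # user views the front display while photographing others

print(area_for_live_view(0.4))  # 'sub'
print(area_for_live_view(2.5))  # 'main'
```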
- the processor 130 may display the live view on the opposite area to the area determined in the above-described method or may display the live view on both the main display area 10 and the sub display area 20 .
- the processor 130 may display the live view on an area where a user's touch is performed.
- the processor 130 may determine the area to display the live view by analyzing an image photographed by the camera 120 . For example, the processor 130 may recognize the user from the photographed image and determine the area to display the live view.
- the processor 130 may determine the area to display the live view by recognizing the user's iris.
- FIG. 12 illustrates a method of detecting a change in an orientation of a user terminal apparatus according to an embodiment of the present disclosure.
- the user terminal apparatus 100 may include a plurality of sensors as described above.
- the processor 130 may determine whether the orientation of the user terminal apparatus 100 is changed based on at least one of a location and a motion of the user terminal apparatus 100 and a user's grip, which are detected by the plurality of sensors. For example, the processor 130 may determine whether the orientation of the user terminal apparatus 100 is changed based on a rotation state and a moving direction of the user terminal apparatus 100 detected by the geomagnetic sensor, and a rotation angle of the user terminal apparatus 100 detected by the gyro sensor.
- the processor 130 may determine that the orientation of the user terminal apparatus 100 is changed. For example, in response to the user terminal apparatus 100 being rotated by more than 90°, the processor 130 may determine that the orientation of the user terminal apparatus 100 is changed and perform a corresponding operation.
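- A minimal sensor-side check might accumulate the rotation reported by the gyro sensor and compare it against a threshold. In the sketch below, only the 90° figure follows the example above; the sampling interface is an assumption.

```python
# Illustrative sketch: detect an orientation change from accumulated rotation.
ROTATION_THRESHOLD_DEG = 90.0  # threshold from the example above

class OrientationDetector:
    def __init__(self):
        self.accumulated_deg = 0.0

    def on_gyro_sample(self, angular_velocity_dps: float, dt_s: float) -> bool:
        """Integrate angular velocity; report True once rotation exceeds the threshold."""
        self.accumulated_deg += angular_velocity_dps * dt_s
        if abs(self.accumulated_deg) >= ROTATION_THRESHOLD_DEG:
            self.accumulated_deg = 0.0  # reset after reporting the change
            return True
        return False

detector = OrientationDetector()
# 180 deg/s sampled every 0.25 s crosses the threshold on the second sample.
print(detector.on_gyro_sample(180.0, 0.25))  # False (45 degrees so far)
print(detector.on_gyro_sample(180.0, 0.25))  # True  (90 degrees reached)
```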
- FIG. 12 illustrates a plurality of still images of a live view in sequence.
- the first still image 1210 is an image without a person and the second to fourth still images 1220 , 1230 and 1240 are images with a person.
- the fifth still image 1250 is an image without a person. That is, FIG. 12 illustrates that the camera angle is moved from the left to the right and then is moved back to the left.
- the processor 130 may determine that the orientation of the user terminal apparatus 100 is changed at the time at which the second still image 1220 is displayed and at the time at which the fifth still image 1250 is displayed as shown in FIG. 12 .
- the processor 130 may determine whether the orientation of the user terminal apparatus 100 is changed with reference to other objects rather than the person's face. For example, the processor 130 may extract an outline of a slide from a specific still image of the live view and then extract the outline of the slide from the next still image, and, in response to the outline of the slide being recognized as being moved by more than a predetermined distance on the live view, the processor 130 may determine that the orientation of the user terminal apparatus 100 is changed.
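- The still-image comparison of FIG. 12 may be expressed as detecting transitions in per-frame face-detection results, as in this sketch; the face detector itself is abstracted away here as a boolean input.

```python
# Illustrative sketch: infer orientation changes from whether a sufficiently
# large face is present in successive live-view frames.
def orientation_change_frames(face_present: list) -> list:
    """Return indices of frames where the face appears or disappears."""
    changes = []
    for i in range(1, len(face_present)):
        if face_present[i] != face_present[i - 1]:
            changes.append(i)
    return changes

# Mirrors FIG. 12: no face, then a face in the second to fourth frames, then no face.
print(orientation_change_frames([False, True, True, True, False]))  # [1, 4]
```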
- the processor 130 may continue detecting an absolute posture using various sensors provided in the user terminal apparatus 100 , and determine whether the orientation of the user terminal apparatus 100 is changed.
- the processor 130 may set a reference posture, and, in response to the posture being changed from the reference posture by more than a predetermined value, the processor 130 may determine that the orientation of the user terminal apparatus 100 is changed.
- the reference posture may be the posture at the time when a camera application is selected.
- the processor 130 may determine whether the orientation of the user terminal apparatus 100 is changed according to user's touch manipulation.
- FIGS. 13A and 13B illustrate an example of an operation according to the change of an orientation of a user terminal apparatus according to various embodiments of the present disclosure.
- In response to an orientation of the user terminal apparatus 100 being changed while a photographing function is being executed, a gallery application may be executed.
- the processor 130 may display a live view on the sub display area 20 , and, in response to the user terminal apparatus 100 being rotated by about 180°, the processor 130 may display the gallery application on the main display area 10 .
- the processor 130 may display a telephony function, an Internet function, and the like, unrelated to the photographing function on the main display area 10 .
- the processor 130 may stop the photographing function.
- the processor 130 may display the gallery application or an image correction application on the sub display area 20 .
- the processor 130 may display the photographed images on the sub display area 20 in sequence, and, in response to the user touching the main display area 10 , may change and display the image.
- Referring to FIG. 13B , in response to the orientation of the user terminal apparatus 100 being changed while a moving picture photographing function is being executed, the moving picture photographing function is stopped.
- the upper view of FIG. 13B illustrates a photographing time, a GUI 1310 for stopping photographing the moving picture, and a GUI 1320 for pausing photographing the moving picture.
- the user may touch the GUI 1310 for stopping photographing the moving picture to stop photographing the moving picture.
- the user may change the orientation of the user terminal apparatus 100 to stop photographing the moving picture.
- the lower view of FIG. 13B illustrates a thumbnail image of the moving picture after photographing is finished, and the user may touch a GUI 1330 for executing the moving picture to execute the moving picture.
- the processor 130 may pause or stop photographing the moving picture in response to a touch on the main display area 10 while the moving picture is being photographed.
- FIGS. 14A and 14B illustrate an example of a method of executing a photographing function according to various embodiments of the present disclosure.
- the processor 130 may execute the photographing function in other ways rather than by touching an icon indicating the photographing function.
- the photographing function is executed.
- the processor 130 may execute the photographing function and display a live view on the sub display area 20 .
- In response to a swipe interaction on the sub display area 20 being detected, the photographing function is executed.
- the processor 130 may execute the photographing function.
- FIG. 14B illustrates the swipe interaction moving from the left to the right.
- the swipe interaction is not limited to a specific direction.
- the photographing function may be executed through other interactions than the swipe interaction.
- the photographing function may be executed in various ways in addition to the ways in the embodiments of FIGS. 14A and 14B .
- the processor 130 may recognize a user's intention and execute the photographing function.
- the processor 130 may recognize at least one of a user's face, a user's gesture, and a grip form at the time when a specific area is activated, and execute the photographing function.
- the processor 130 may execute the photographing function.
- FIG. 15 is a view to illustrate a method of using a round display area according to an embodiment of the present disclosure.
- the round display area 30 is disposed between the main display area 10 and the sub display area 20 .
- the round display area 30 may display a UI for controlling the user terminal apparatus 100 .
- the processor 130 may display a GUI 1520 for photographing and storing an image on the round display area 30 .
- the processor 130 may photograph and store the image.
- the processor 130 may change an exposure value and display a live view.
- the processor 130 may display the live view by zooming in or zooming out. However, this should not be considered as limiting.
- the processor 130 may change a focus or change white balance through a swipe interaction.
- the processor 130 may execute the photographing function.
- the processor 130 may change the area to display the live view in response to a touch on the round display area 30 .
- the processor 130 may display a specific notification on the round display area 30 .
- the processor 130 may display a notification informing of the corresponding situation on the round display area 30 .
- the processor 130 may change settings to focus on the subject or retry photographing.
- FIG. 16 is a view to illustrate a user terminal apparatus which has a camera disposed on a front surface thereof according to an embodiment of the present disclosure.
- the user terminal apparatus 100 may include the camera 120 disposed under the main display area 10 .
- In this case, the roles of the main display area 10 and the sub display area 20 in the above-described embodiments are reversed. Therefore, a detailed description is omitted.
- FIGS. 17A to 17C are views to illustrate an example of a case in which other applications are used according to various embodiments of the present disclosure.
- the processor 130 may determine a display area based on the orientation of the user terminal apparatus 100 at the time when the event occurs. For example, in response to a call or a message being received, the processor 130 may detect an absolute posture of the user terminal apparatus 100 at the time when the call or message is received, and display contents of the call or message on a predetermined display area. In particular, in response to the main display area 10 being in contact with a table, the processor 130 may display the contents of the call or message on the sub display area 20 .
- the processor 130 may determine a display area based on a user's location at the time when the event occurs. For example, in response to a call or a message being received, the processor 130 may recognize the user's location at the time when the call or message is received through the camera 120 , and, in response to the user's location being recognized, the processor 130 may display the contents of the call or message on the sub display area 20 , and, in response to the user's location not being recognized, the processor 130 may display the contents of the call or message on the main display area 10 .
- the processor 130 may determine a display area based on a using state of the user terminal apparatus 100 at the time when the event occurs. For example, in response to a call or a message being received while the user is viewing a moving picture through the main display area 10 , the processor 130 may display the contents of the call or message on the sub display area 20 or the round display area 30 .
- the processor 130 may change a display area. For example, in response to the user changing the orientation of the user terminal apparatus 100 in the middle of viewing contents of a text message through the main display area 10 , and then viewing the sub display area 20 , the processor 130 may display the contents of the text message on the sub display area 20 .
- the processor 130 may determine whether the orientation of the user terminal apparatus 100 is changed using various sensors or the camera 120 .
- the change of the orientation of the user terminal apparatus 100 has been described in detail and thus a redundant explanation thereof is omitted.
- the processor 130 may change a UI. Specifically, the processor 130 may change at least one of an amount of information, an information type, and a layout included in the UI. For example, in response to a text message being displayed on the main display area 10 , the processor 130 may additionally display a UI for creating a new text message, information on the other user, and the like, in addition to the contents of the text message. In addition, in response to the text message being displayed on the sub display area 20 , only the contents of the text message may be displayed. In addition, in response to the text message being displayed on the round display area 30 , the processor 130 may display some contents of the text message in sequence.
- the processor 130 may change a function to provide according to a display area. For example, in response to the received text message being displayed on the main display area 10 , the processor 130 may display a UI including a reply message input function, a received message storing function, a received message deleting function, and the like. However, in response to the received text message being displayed on the round display area 30 , only the contents of the received text message may be displayed and other functions may not be provided.
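- Selecting how much of the message UI to expose on each display area may be table-driven, as in the sketch below; the component names are invented for illustration and are not part of the present disclosure.

```python
# Illustrative sketch: table-driven choice of UI features per display area.
UI_FEATURES = {
    "main":  ["message_body", "sender_info", "reply_input", "store", "delete"],
    "sub":   ["message_body"],
    "round": ["message_body_scrolling"],  # some contents shown in sequence
}

def build_message_ui(area: str) -> list:
    """Return the list of UI components to render for the given display area."""
    return UI_FEATURES.get(area, ["message_body"])

print(build_message_ui("main"))   # full UI with reply, store, and delete functions
print(build_message_ui("round"))  # only scrolling contents of the message
```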
- FIGS. 18A and 18B illustrate an example of a configuration of a display according to various embodiments of the present disclosure.
- a flexible display is illustrated.
- the user may fold the flexible display and use the same.
- an area 1810 which is folded and bent back corresponds to the sub display area 20 , and an unfolded area 1820 corresponds to the main display area 10 .
- the folded area 1810 is smaller than the unfolded area 1820 .
- the user may fold the flexible display in half and use the same.
- the round display area 30 may be formed by folding the flexible display two times. In this case, the embodiments of the round display area 30 may be applied.
- the user terminal apparatus 100 which is provided with a plurality of displays on the front surface and the rear surface thereof is illustrated.
- a front display 1830 corresponds to the main display area 10 , and a rear display 1840 corresponds to the sub display area 20 .
- most of the embodiments described above may be applied to the user terminal apparatus 100 having the plurality of displays disposed on the front surface and the rear surface thereof, except for those involving the round display area 30 , and a redundant explanation is omitted.
- FIG. 19 is a flowchart to illustrate a control method of the user terminal apparatus according to an embodiment of the present disclosure.
- the user terminal apparatus 100 displays a live view acquired through a camera on one of a main display area which is disposed on the front surface of the user terminal apparatus, and a sub display area which extends from one side of the main display area and is disposed on at least one area of the rear surface of the user terminal apparatus 100 at operation S 1910 .
- In response to an orientation of the user terminal apparatus 100 being changed, the user terminal apparatus 100 displays the live view on the other one of the main display area and the sub display area at operation S 1920 .
- the displaying on one of the main display area and the sub display area at operation S 1910 may include displaying the live view on the main display area and not providing information on the sub display area, and the displaying on the other one of the main display area and the sub display area at operation S 1920 may include, in response to the orientation of the user terminal apparatus being changed, displaying the live view on the sub display area and not providing information on the main display area.
- the displaying on the other one of the main display area and the sub display area at operation S 1920 may include displaying the live view on the sub display area and not providing information on the main display area, and may further include, in response to a user touch on the main display area being detected, photographing and storing an image.
- the control method may further include, in response to a swipe interaction from a certain point in the main display area in a predetermined direction being detected, changing a photographing setting value to correspond to the predetermined direction.
- the displaying on the other one of the main display area and the sub display area at operation S 1920 may further include: displaying the live view on the main display area; and, in response to the live view being recognized as including a person's face, displaying a face area included in the live view on the sub display area.
- the displaying on the sub display area may include cropping a part of the live view to correspond to a size of the sub display area, and displaying the part on the sub display area.
- the displaying on the other one of the main display area and the sub display area at operation S 1920 may further include: displaying the live view on the main display area; and, in response to the live view being recognized as including a person, displaying an animation on the sub display area.
- the displaying on one of the main display area and the sub display area at operation S 1910 may include: determining a distance to a subject, and, in response to the determined distance being shorter than a predetermined distance, displaying the live view on the sub display area, and, in response to the determined distance being longer than the predetermined distance, displaying the live view on the main display area.
- the control method may further include detecting at least one of a location and a motion of the user terminal apparatus, and a user's grip, and the displaying on the other one of the main display area and the sub display area at operation S 1920 may include determining whether the orientation of the user terminal apparatus is changed based on at least one of the location and the motion of the user terminal apparatus, and the user's grip.
- the displaying on the other one of the main display area and the sub display area at operation S 1920 may include, in response to a person's face being greater than or equal to a predetermined size being recognized in the live view, or in response to the person's face being greater than or equal to the predetermined size being recognized and then the person's face not being recognized, determining that the orientation of the user terminal apparatus is changed.
- the user terminal apparatus may control the displays disposed on the front surface and the rear surface of the user terminal apparatus based on various photographing conditions, so that the user may easily photograph.
- the control method when the photographing function is executed has been mainly explained. However, this should not be considered as limiting.
- the processor may execute a predetermined application.
- the processor may connect the call.
- the processor may display contacts, and the like.
- the processor may display an execution state of an application which is being executed in the background on the sub display area.
- the processor may display an execution state of a specific game or music application, and the like, on the sub display area, and thus may minimize power consumption.
- In response to the orientation of the user terminal apparatus being changed while a specific application is being executed and displayed on the main display area, the processor may display a UI displaying essential functions of the corresponding application on the sub display area.
- a different operation may be performed according to a direction of the changed orientation.
- a different function may be executed in response to rotation of the user terminal apparatus in the upward direction, downward direction, rightward direction, or leftward direction.
- a different function may be executed according to whether the rotation is made in the clockwise direction or the counterclockwise direction.
- a control method of a user terminal apparatus which includes: a display including a main display area which is disposed on a front surface of the user terminal apparatus, and a sub display area which extends from one side of the main display area and is disposed on at least one area of a rear surface of the user terminal apparatus; and a camera configured to photograph an image, according to the above-described embodiments may be implemented as a computer executable program code and stored in various non-transitory computer readable media, and may be provided to servers or devices to be executed by processors.
- a non-transitory computer readable medium which stores a program for sequentially performing the steps of: displaying a live view acquired through the camera on one of the main display area and the sub display area; and, in response to an orientation of the user terminal apparatus being changed, displaying the live view on the other one of the main display area and the sub display area, may be provided.
- the non-transitory computer readable medium refers to a medium that stores data semi-permanently rather than storing data for a very short time, such as a register, a cache, a memory, and the like, and is readable by an apparatus.
- the above-described various applications or programs may be stored in the non-transitory computer readable medium such as a compact disc (CD), a digital versatile disc (DVD), a hard disk, a Blu-ray disc, a USB, a memory card, a ROM, and the like, and may be provided.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
A user terminal apparatus and a control method thereof are provided. The user terminal apparatus includes a display including a main display area which is disposed on a front surface of the user terminal apparatus, and a sub display area which extends from one side of the main display area and is disposed on at least one area of a rear surface of the user terminal apparatus, a camera configured to photograph an image, and a processor configured to display a live view acquired through the camera on one of the main display area or the sub display area, and control the display to display, in response to an orientation of the user terminal apparatus being changed, the live view on another one of the main display area or the sub display area.
Description
- This application is a continuation application of prior application Ser. No. 15/915,696, filed on Mar. 8, 2018, which issued as U.S. Pat. No. 10,609,289 on Mar. 31, 2020; which is a continuation application of prior application Ser. No. 15/176,630, filed on Jun. 8, 2016, which has issued as U.S. Pat. No. 9,936,138 on Apr. 3, 2018; and which was based on and claimed priority under 35 U.S.C. § 119(e) of a U.S. Provisional application Ser. No. 62/198,360, filed on Jul. 29, 2015, in the U.S. Patent and Trademark Office, and under 35 U.S.C. § 119(a) of a Korean patent application number 10-2016-0001684, filed on Jan. 6, 2016, in the Korean Intellectual Property Office, the disclosure of each of which is incorporated by reference herein in its entirety.
- The present disclosure relates to a user terminal apparatus and a control method thereof. More particularly, the present disclosure relates to a user terminal apparatus which uses displays disposed on the front surface and the rear surface of the user terminal apparatus when photographing, and a control method thereof.
- Due to the development of electronic technology, various kinds of electronic devices are being used in various fields. In particular, an electronic device which has a display expanded to the rear surface thereof through out-bending (half round display) is being developed.
- It is common that a camera is provided on the rear surface of the electronic device. As a display is provided on the rear surface of the electronic device, selfie photographing may be easily performed. In addition, even when the electronic device is provided with a single camera, a user may photograph while checking a photography state using the rear display.
- Accordingly, there is a demand for a method of ensuring availability of a rear display when a photographing function is used.
- The above information is presented as background information only to assist with an understanding of the present disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the present disclosure.
- Aspects of the present disclosure are to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present disclosure is to provide a user terminal apparatus which controls displays disposed on the front surface and the rear surface of the user terminal apparatus based on a photographing situation, and a control method thereof.
- In accordance with an aspect of the present disclosure, a user terminal apparatus is provided. The user terminal apparatus includes a display including a main display area which is disposed on a front surface of the user terminal apparatus, and a sub display area which extends from one side of the main display area and is disposed on at least one area of a rear surface of the user terminal apparatus, a camera configured to photograph an image, and a processor configured to display a live view acquired through the camera on one of the main display area or the sub display area, and control the display to display, in response to an orientation of the user terminal apparatus being changed, the live view on another one of the main display area or the sub display area.
- In accordance with another aspect of the present disclosure, a control method of a user terminal apparatus is provided. The user terminal apparatus includes a display including a main display area which is disposed on a front surface of the user terminal apparatus, and a sub display area which extends from one side of the main display area and is disposed on at least one area of a rear surface of the user terminal apparatus, and a camera configured to photograph an image. The control method includes displaying a live view acquired through the camera on one of the main display area or the sub display area, and displaying, in response to an orientation of the user terminal apparatus being changed, the live view on another one of the main display area or the sub display area.
- Other aspects, advantages, and salient features of the disclosure will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses various embodiments of the present disclosure.
- The above and other aspects, features, and advantages of certain embodiments of the present disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
- FIGS. 1A and 1B are views to illustrate an example of a configuration of a display of a user terminal apparatus according to various embodiments of the present disclosure;
- FIG. 2A is a block diagram illustrating a configuration of a user terminal apparatus according to an embodiment of the present disclosure;
- FIG. 2B is a block diagram illustrating an example of a detailed configuration of a user terminal apparatus according to an embodiment of the present disclosure;
- FIG. 2C illustrates various modules stored in a storage according to an embodiment of the present disclosure;
- FIG. 3 illustrates an example of using one of a main display area, a sub display area, and a round display area according to an embodiment of the present disclosure;
- FIG. 4 illustrates an example of using at least two of a main display area, a sub display area, and a round display area according to an embodiment of the present disclosure;
- FIG. 5 illustrates respective areas and an example of an operation corresponding to a touch input according to an embodiment of the present disclosure;
- FIG. 6 illustrates an operation in response to the orientation of the user terminal apparatus being changed according to an embodiment of the present disclosure;
- FIGS. 7A and 7B illustrate an example of a method for changing a provided image according to a display area according to various embodiments of the present disclosure;
- FIGS. 8A and 8B illustrate an example of a method for executing a specific function through a main display area according to various embodiments of the present disclosure;
- FIG. 9 illustrates a method for displaying a live view on both a main display area and a sub display area according to an embodiment of the present disclosure;
- FIG. 10 illustrates a method for displaying a live view on a main display area and displaying a specific content on a sub display area according to an embodiment of the present disclosure;
- FIGS. 11A and 11B illustrate an example of a method for determining an area to display a live view according to various embodiments of the present disclosure;
- FIG. 12 illustrates a method for detecting a change in an orientation of a user terminal apparatus according to an embodiment of the present disclosure;
- FIGS. 13A and 13B illustrate an example of an operation corresponding to a change in an orientation of a user terminal apparatus according to various embodiments of the present disclosure;
- FIGS. 14A and 14B illustrate an example of a method for executing a photographing function according to various embodiments of the present disclosure;
- FIG. 15 illustrates a method for using the round display area according to an embodiment of the present disclosure;
- FIG. 16 illustrates a camera which is provided on a front surface of a user terminal apparatus according to an embodiment of the present disclosure;
- FIGS. 17A to 17C illustrate an example of a case in which other applications are used according to various embodiments of the present disclosure;
- FIGS. 18A and 18B illustrate an example of a configuration of a display according to various embodiments of the present disclosure; and
- FIG. 19 is a flowchart to illustrate a control method of a user terminal apparatus according to an embodiment of the present disclosure.
- Throughout the drawings, like reference numerals will be understood to refer to like parts, components, and structures.
- The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of various embodiments of the present disclosure as defined by the claims and their equivalents. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the various embodiments described herein can be made without departing from the scope and spirit of the present disclosure. In addition, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
- The terms and words used in the following description and claims are not limited to the bibliographical meanings, but, are merely used by the inventor to enable a clear and consistent understanding of the present disclosure. Accordingly, it should be apparent to those skilled in the art that the following description of various embodiments of the present disclosure is provided for illustration purpose only and not for the purpose of limiting the present disclosure as defined by the appended claims and their equivalents.
- It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a component surface” includes reference to one or more of such surfaces.
- The terms such as “first” and “second” used in various embodiments are used to distinguish various elements from one another regardless of an order or importance of the corresponding elements. Accordingly, the order or importance of the elements is not limited by these terms. For example, a first element may be named a second element without departing from the scope of right of various embodiments of the present invention, and similarly, a second element may be named a first element.
- It will be understood that, when an element (e.g., a first element) is “(operatively or communicatively) coupled with/to” or “connected to” another element (e.g., a second element), the element may be directly coupled with/to another element, and there may be an intervening element (e.g., a third element) between the element and the other element. To the contrary, it will be understood that, when an element (e.g., a first element) is “directly coupled with/to” or “directly connected to” another element (e.g., a second element), there is no intervening element (e.g., a third element) between the element and the other element.
- All of the terms used herein including technical or scientific terms have the same meanings as those generally understood by an ordinary skilled person in the related art unless they are defined otherwise. The terms defined in a generally used dictionary should be interpreted as having the same meanings as the contextual meanings of the relevant technology and should not be interpreted as having ideal or exaggerated meanings unless they are clearly defined in the various embodiments. According to circumstances, even the terms defined in the embodiments should not be interpreted as excluding the embodiments of the present disclosure.
- Various embodiments will be explained in detail with reference to the accompanying drawings.
-
FIGS. 1A and 1B are views to illustrate an example of a configuration of a display of a user terminal apparatus according to various embodiments of the present disclosure. - Referring to
FIG. 1A , the left view shows the front surface of auser terminal apparatus 100, and the right view shows the rear surface of theuser terminal apparatus 100. A front display is disposed on the front surface of theuser terminal apparatus 100, and a rear display and a camera are disposed on the rear surface of theuser terminal apparatus 100. The front display and the rear display may be connected with each other, and the rear display may be smaller than the front display. However, this should not be considered as limiting, and the front display and the rear display may have the same size. In addition, the camera may be disposed on the front surface of theuser terminal apparatus 100. The front display will be explained as amain display area 10 and the rear display will be explained as asub display area 20. - Referring to
FIG. 1B , the left view is a front view showing the entire display in which themain display area 10, thesub display area 20, and around display area 30 are connected with one another, and the right view is a rear view of the entire display. - The
sub display area 20 may extend from one side of themain display area 10 and may be disposed on at least one area of the rear surface of theuser terminal apparatus 100. In particular, thesub display area 20 may extend from the top of themain display area 10 to be bent. The bent area may have a curved shape, but is not limited to this and may form an angle according to the type of the display. - The
round display area 30 is an area for connecting themain display area 10 and thesub display area 20. As described above, theround display area 30 may have a curved shape or an angular shape. Theround display area 30 may be distinguished from themain display area 10 and thesub display area 20 by boundary lines 30-1, 30-2. - The boundary lines 30-1, 30-2 shown in
FIG. 1B are merely an example and may be changed. In addition, the boundary lines 30-1, 30-2 may be determined by a manufacturer at the time of manufacturing, and may be changed by a user. In response to the boundary lines 30-1, 30-2 being changed, the sizes of themain display area 10, thesub display area 20, and theround display area 30 may be changed and the size of a content displayed on each area may be changed. - In
FIGS. 1A and 1B , the display encloses the upper side of theuser terminal apparatus 100. However, the display may enclose any one of the lower side, the left side surface, and the right side surface of theuser terminal apparatus 100. In addition, the display may enclose a plurality of side surface other than a single side surface. - The display may enclose a touch pad, and the display and the touch pad may be implemented in the form of a touch screen by forming a mutual layer structure. In this case, the touch pad is also bent similarly to the display and thus a touch input may be inputted opposite to what the user thinks. For example, it may be determined that there is a difference in the touch area but the touch is made in the same direction in response to the user dragging from a certain point of the
main display area 10 to an upper point and in response to the user dragging from a certain point of thesub display area 20 to an upper point. However, since the touch pad is bent, the touch pad may receive the input in the opposite direction to the real input in response to the user dragging from a certain point of themain display area 10 to an upper point and in response to the user dragging from a certain point of thesub display area 20 to an upper point. Therefore, theuser terminal apparatus 100 may be set to recognize the direction of a touch input at a certain area as the opposite direction. This departs from the scope of the present disclosure and thus a detailed description thereof is omitted. - Since the
- Since the main display area 10, the sub display area 20, and the round display area 30 are connected with one another, various sensors and a receiver may be provided on the side surface or lower portion of the user terminal apparatus 100. In particular, a directional receiver may be provided. Alternatively, the receiver may be provided on an integral cover which may interwork with the user terminal apparatus 100. In response to the receiver being provided on the lower portion, a processor 130 may reverse the top and bottom of an image and display the image while the user is talking on the phone.
- First, the present disclosure will be explained on the assumption that the sub display area 20 has a structure extending from the main display area 10 through the round display area 30 of the curved shape, and that the camera is disposed on the rear surface of the user terminal apparatus 100. Additionally, various embodiments will be expanded and described for cases where the display has other configurations and the location of the camera is changed.
- In addition, the direction of the main display area 10 and the sub display area 20 going toward the round display area 30 will be explained as being upward from the user terminal apparatus 100, and the opposite direction will be explained as being downward. The left side and the right side will be explained with reference to the display area which is viewed by the user. Accordingly, the left side and the right side when the user views the main display area 10 are reversed when the user views the sub display area 20.
- FIG. 2A is a block diagram illustrating a configuration of a user terminal apparatus according to an embodiment of the present disclosure.
- Referring to FIG. 2A, a user terminal apparatus 100 includes a display 110, a camera 120, and a processor 130.
- FIG. 2A illustrates the overall elements of the user terminal apparatus 100 when the user terminal apparatus 100 is provided with various functions such as a display function, a control function, and the like. Accordingly, some of the elements shown in FIG. 2A may be omitted or changed, and other elements may be added, according to an embodiment.
- The display 110 may include a main display area 10 which is disposed on the front surface of the user terminal apparatus 100, a sub display area 20 which extends from one side of the main display area 10 and is disposed on at least one area of the rear surface of the user terminal apparatus 100, and a round display area 30 for connecting the main display area 10 and the sub display area 20. However, this should not be considered as limiting. For example, the sub display area 20 may be configured to enclose the entire rear surface of the user terminal apparatus 100.
- The terms “front surface” and “rear surface” are used for convenience of explanation and are not limited by the meaning thereof. For example, the front surface and the rear surface may refer to one side surface and the other side surface of a specific electronic device. In the above-described explanation, the display 110 extends from one side of the main display area 10. However, this should not be considered as limiting. For example, the display 110 may extend from all side surfaces of the main display area 10 and cover the entire user terminal apparatus 100.
- The sub display area 20 of the display 110 may extend from the upper side of the main display area 10 so as to be bent and may be disposed on the upper area of the rear surface. For example, a connection part between the main display area 10 and the sub display area 20 may be formed in the shape of a “U” and have a curved line when the main display area 10 and the sub display area 20 are viewed from the side. However, this should not be considered as limiting. The connection part between the main display area 10 and the sub display area 20 may be formed in the shape of a squared “C” and have an angle of 90° when the main display area 10 and the sub display area 20 are viewed from the side. In addition, various connection parts may be formed based on the type of the user terminal apparatus 100.
- The display 110 may display various user interfaces (UIs) under the control of the processor 130. For example, the display 110 may display a live view, a gallery application, an animation, and the like.
- The display 110 may display different contents on the main display area 10, the sub display area 20, and the round display area 30 under the control of the processor 130. For example, the display 110 may display a moving picture on the main display area 10, an image on the sub display area 20, and a UI for transmitting a message on the round display area 30.
- In addition, the display 110 may display a content by interlocking at least two of the main display area 10, the sub display area 20, and the round display area 30. For example, the display 110 may display a moving picture on the main display area 10 and display a UI for controlling the moving picture on the sub display area 20. In addition, the display 110 may display a UI for providing a function unrelated to the moving picture on the round display area 30.
- In addition, the display 110 may display the same content on at least two of the main display area 10, the sub display area 20, and the round display area 30. For example, the display 110 may display the same content on the main display area 10 and the sub display area 20 and may display a separate content on the round display area 30.
- The display 110 may be implemented by using a liquid crystal display (LCD) panel, an organic light emitting diode (OLED) display, a plasma display panel (PDP), and the like, but is not limited to these. In addition, the display 110 may be implemented by using a transparent display, a flexible display, and the like, according to circumstances.
- The camera 120 is configured to photograph a still image or a moving picture under the control of the user. The camera 120 may photograph a still image at a specific time or may continuously photograph still images. The camera 120 provides the acquired image to the display 110, and a live view may be displayed on at least one of the main display area 10 and the sub display area 20. The camera 120 may photograph the user or a background image according to the orientation of the user terminal apparatus 100. The camera 120 may include a plurality of cameras such as a front camera and a rear camera.
- The camera 120 includes a lens, a shutter, an aperture, a solid state imaging device, an analog front end (AFE), and a timing generator (TG). The shutter adjusts the time at which light reflected from a subject enters the user terminal apparatus 100, and the aperture adjusts the amount of light entering the lens by mechanically increasing or reducing the size of the opening through which light enters. The solid state imaging device accumulates the light reflected from the subject as photo-charge and outputs the image generated by the photo-charge as an electric signal. The TG outputs a timing signal to read out pixel data of the solid state imaging device, and the AFE samples and digitizes the electric signal outputted from the solid state imaging device.
- The processor 130 may control the overall operation of the user terminal apparatus 100.
- The processor 130 may control the display 110 to display a live view acquired through the camera 120 on one of the main display area 10 and the sub display area 20, and to display the live view on the other one of the main display area 10 and the sub display area 20 in response to the orientation of the user terminal apparatus 100 being changed. The processor 130 may determine whether the orientation of the user terminal apparatus 100 is changed using a gravity sensor, an acceleration sensor, a gyro sensor, and the like. In addition, the processor 130 may determine whether the orientation of the user terminal apparatus 100 is changed by analyzing the live view acquired by the camera 120. However, this should not be considered as limiting. The processor 130 may change the area to display the live view in response to the apparatus being moved or shaken in a specific direction, in addition to the orientation of the user terminal apparatus 100 being changed.
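- As a concrete illustration of this orientation-based switching, the following sketch is not part of the original disclosure; the axis convention, the sensor reading, and the threshold are assumptions. It selects the display area from an accelerometer reading whose z axis is positive when the front surface faces the user:

    def select_live_view_area(accel_z, current_area, threshold=3.0):
        """Choose the area for the live view from the accelerometer z axis
        (m/s^2): clearly positive means the main display area faces the
        user, clearly negative means the apparatus has been flipped over."""
        if accel_z > threshold:
            return "main"
        if accel_z < -threshold:
            return "sub"
        return current_area  # near vertical: keep the current area

    area = "main"
    for reading in (9.5, 2.0, -9.3):  # the user flips the apparatus over
        area = select_live_view_area(reading, area)
    print(area)  # -> sub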
- The processor 130 may display the live view and additionally may display a guide line. In addition, the processor 130 may display a graphical UI (GUI) for executing a gallery application, a GUI for executing an image correction application, and the like, in addition to a GUI for photographing.
- The processor 130 may display the live view on the main display area 10 and may not provide information on the sub display area 20. In response to the orientation of the user terminal apparatus 100 changing, the processor 130 may display the live view on the sub display area 20 and may not provide information on the main display area 10. The information may be withheld in various ways. For example, power may not be supplied to the display 110, or black may be displayed on the display 110.
- The processor 130 may activate a touch function of the main display area 10 while displaying the live view on the sub display area 20 without providing information on the main display area 10. In response to detecting a user touch on the main display area 10, the processor 130 may photograph and store an image.
- In response to detecting a swipe interaction from a certain point in the main display area 10 in a predetermined direction, the processor 130 may change a photographing setting value to correspond to the predetermined direction.
- The processor 130 may display the live view on the main display area 10, and, in response to recognizing the live view as including a person's face, the processor 130 may control the display 110 to display the face area included in the live view on the sub display area 20.
- The processor 130 may crop a part of the live view to correspond to the size of the sub display area 20 and control the display 110 to display the cropped part on the sub display area 20.
- In addition, the processor 130 may display the live view on the main display area 10, and, in response to recognizing the live view as including a person, the processor 130 may control the display 110 to display an animation on the sub display area 20.
- The processor 130 may determine a distance to a subject, and, in response to determining that the distance is shorter than a predetermined distance, the processor 130 may control the display 110 to display the live view on the sub display area 20, and, in response to determining that the distance is longer than the predetermined distance, control the display 110 to display the live view on the main display area 10.
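- A minimal sketch of this distance rule follows. It is not from the original disclosure, and the threshold value is an assumption:

    def area_for_subject(distance_m, threshold_m=0.6):
        """Close subjects (typically the user's own face) go to the sub
        display area on the rear; distant subjects go to the main display
        area on the front."""
        return "sub" if distance_m < threshold_m else "main"

    print(area_for_subject(0.4))  # -> sub (selfie-style shot)
    print(area_for_subject(2.5))  # -> main (scene or other people)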
- The user terminal apparatus 100 may further include a plurality of sensors, and the processor 130 may determine whether the orientation of the user terminal apparatus 100 is changed based on at least one of a location of the user terminal apparatus 100, a motion of the user terminal apparatus 100, and a user's grip detected by the plurality of sensors.
- In addition, in response to a person's face of a predetermined size or larger being recognized in the live view, or in response to such a face being recognized and then no longer being recognized, the processor 130 may determine that the orientation of the user terminal apparatus 100 is changed.
- FIG. 2B is a block diagram illustrating an example of a detailed configuration of a user terminal apparatus according to an embodiment of the present disclosure.
- Referring to FIG. 2B, a user terminal apparatus 100 includes a display 110, a camera 120, a processor 130, a global positioning system (GPS) chip 145, a storage 140, a sensor, a communicator 150, a user interface 155, an audio processor 160, a video processor 170, a speaker 180, a button 181, and a microphone 182. The elements of FIG. 2B that are the same as in FIG. 2A will not be described in detail.
- The display 110 may be divided into the main display area 10, the sub display area 20, and the round display area 30 as described above. The display 110 may be implemented by using various types of displays such as an LCD, an OLED display, a PDP, and the like. The display 110 may further include a driving circuit, which may be implemented by using an amorphous silicon thin film transistor (a-si TFT), a low temperature poly silicon (LTPS) TFT, an organic TFT (OTFT), and the like, and a backlight unit. The display 110 may be combined with a touch sensor included in the sensor and may be implemented as a touch screen.
- In this case, the touch sensor may include at least one of a touch panel and a pen recognition panel. The touch panel may detect a user's finger gesture input and output a touch event value corresponding to a detected touch signal. The touch panel may be mounted under at least one of the main display area 10, the sub display area 20, and the round display area 30 of the display 110.
- The touch panel may detect the user's finger gesture input in a capacitive method or a resistive method. The capacitive method calculates touch coordinates by detecting minute electricity excited in the user's body. The resistive method includes two electrode plates embedded in the touch panel, and calculates touch coordinates by detecting an electric current flowing due to contact between the upper and lower plates at a touched point.
- The pen recognition panel may detect a user's pen gesture input according to a user's operation of a touch pen (e.g., a stylus pen or a digitizer pen), and output a pen proximity event value or a pen touch event value. The pen recognition panel may be mounted under at least one of the main display area 10, the sub display area 20, and the round display area 30 of the display 110.
- The pen recognition panel may be implemented in an electromagnetic resonance (EMR) method, for example, and may detect a touch or a proximity input according to a change in the intensity of an electromagnetic field caused by the proximity or touch of the pen. Specifically, the pen recognition panel may include an electromagnetic induction coil sensor (not shown) having a grid structure, and an electronic signal processor (not shown) which provides an alternating current (AC) signal having a predetermined frequency to each loop coil of the electromagnetic induction coil sensor in sequence. In response to a pen having a resonant circuit embedded therein existing in the proximity of a loop coil of the pen recognition panel, the magnetic field transmitted from the corresponding loop coil generates a current in the resonant circuit of the pen based on mutual electromagnetic induction. Based on this current, an induction magnetic field is generated from the coil forming the resonant circuit in the pen, and the pen recognition panel detects the induction magnetic field from the loop coil in a signal reception state, and thus detects the proximity location or touch location of the pen.
- The pen recognition panel may be configured differently according to the display area. For example, both the touch panel and the pen recognition panel may be provided in the main display area 10, and only the touch panel may be provided in the sub display area 20 and the round display area 30.
- The processor 130 may deactivate a specific panel by shutting off the power to the specific panel. In addition, the processor 130 may supply power to the touch panel and the pen recognition panel and receive a touch or a pen input, but may deactivate a specific panel by disregarding an input to the specific panel at the software level.
- In this case, the processor 130 may receive at least one of a hand touch and a pen touch according to whether the touch panel and the pen recognition panel of the main display area 10 are activated. For example, the processor 130 may activate only the touch panel so as to receive only the hand touch input and not the pen touch input, or may activate only the pen recognition panel so as to receive only the pen touch input and not the hand touch input.
- In addition, the touch panel and the pen recognition panel may be provided on the entire display area, and the pen recognition panel of some areas may be deactivated. For example, the processor 130 may deactivate only the pen recognition panel corresponding to the sub display area 20 and the round display area 30.
- The touch panel and the pen recognition panel may be implemented as a single panel. In this case, the processor 130 may detect a touch input and a pen input through the entire display area.
- In response to the touch panel and the pen recognition panel being implemented as a single panel, the processor 130 may perform an operation corresponding to an input based on at least one of an input type and an input area. For example, the processor 130 may disregard a specific input, such as a pen input, at the software level. In addition, the processor 130 may disregard a pen input to the sub display area 20 at the software level.
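- The per-area, per-type filtering described above can be sketched as a small dispatch table. The fragment below is not part of the original disclosure; the area names and the choice of which panels are active are assumptions used for illustration:

    # Panels treated as active per display area (a hypothetical configuration:
    # pen input is disregarded on the sub and round display areas).
    ACTIVE_PANELS = {
        "main":  {"touch", "pen"},
        "sub":   {"touch"},
        "round": {"touch"},
    }

    def dispatch_input(event_type, area, handler):
        """Forward the event only if its panel is active for the area;
        otherwise disregard it at the software level."""
        if event_type in ACTIVE_PANELS.get(area, set()):
            handler(event_type, area)

    dispatch_input("pen", "sub", print)    # disregarded: nothing happens
    dispatch_input("touch", "sub", print)  # prints: touch sub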
- The processor 130 may control the overall operations of the user terminal apparatus 100 using various programs stored in the storage 140.
- Specifically, the processor 130 may include a random access memory (RAM) 131, a read only memory (ROM) 132, a main central processing unit (CPU) 133, a graphic processor 134, first to n-th interfaces 135-1 to 135-n, and a bus 136.
- The RAM 131, the ROM 132, the main CPU 133, the graphic processor 134, and the first to n-th interfaces 135-1 to 135-n may be connected with one another via the bus 136.
- The first to n-th interfaces 135-1 to 135-n may be connected with the above-described various elements. One of the interfaces may be a network interface which is connected with an external device via a network.
- The main CPU 133 may access the storage 140 and perform booting using an operating system (O/S) stored in the storage 140. In addition, the main CPU 133 may perform various operations using various programs stored in the storage 140.
- The ROM 132 may store a set of instructions for booting a system. In response to a turn-on command being inputted and power being supplied, the main CPU 133 may copy the O/S stored in the storage 140 into the RAM 131 according to a command stored in the ROM 132, and boot the system by executing the O/S. In response to the booting being completed, the main CPU 133 may copy various application programs stored in the storage 140 into the RAM 131, and perform various operations by executing the application programs copied into the RAM 131.
- The graphic processor 134 may generate a screen including various objects such as an icon, an image, a text, and the like, using a calculator (not shown) and a renderer (not shown). The calculator (not shown) may calculate attribute values of the objects to be displayed according to the layout of the screen, such as a coordinate value, a shape, a size, a color, and the like, based on a received control command. The renderer (not shown) may generate a screen of various layouts including the objects based on the attribute values calculated by the calculator (not shown). The screen generated by the renderer (not shown) may be displayed in the display area of the display 110.
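- As a toy illustration of the calculator/renderer split just described (not part of the original disclosure; the layout rule and object fields are invented for the example):

    def calculate_attributes(objects, screen_width):
        """Calculator stage: derive coordinate, size, and similar attribute
        values for each object from a simple vertical layout."""
        laid_out = []
        for i, obj in enumerate(objects):
            laid_out.append({**obj, "x": 10, "y": 10 + 40 * i,
                             "w": screen_width - 20, "h": 32})
        return laid_out

    def render(laid_out):
        """Renderer stage: draw each object at its calculated coordinates
        (printing stands in for actual drawing)."""
        for obj in laid_out:
            print(f'{obj["kind"]} "{obj["label"]}" at ({obj["x"]}, {obj["y"]}), '
                  f'{obj["w"]}x{obj["h"]}')

    render(calculate_attributes(
        [{"kind": "icon", "label": "camera"},
         {"kind": "text", "label": "Gallery"}], screen_width=480))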
- The above-described operations of the processor 130 may be achieved by a program stored in the storage 140.
- The storage 140 may store a variety of data such as an O/S software module for driving the user terminal apparatus 100, a photographing module, and an application module.
- In this case, the processor 130 may process and display an input image based on information stored in the storage 140.
- The GPS chip 145 is an element for receiving a GPS signal from a GPS satellite and calculating the current location of the user terminal apparatus 100. In response to a navigation program being used or a user's current location being required, the processor 130 may calculate the user's location using the GPS chip 145.
- The communicator 150 is configured to communicate with various kinds of external devices according to various kinds of communication methods. The communicator 150 includes a Wi-Fi chip 151, a Bluetooth chip 152, a wireless communication chip 153, and a near field communication (NFC) chip 154. The processor 130 may communicate with various external devices using the communicator 150.
- In particular, the Wi-Fi chip 151 and the Bluetooth chip 152 communicate in a Wi-Fi method and a Bluetooth method, respectively. When the Wi-Fi chip 151 or the Bluetooth chip 152 is used, a variety of connection information such as a service set identifier (SSID) and a session key may be exchanged first, communication may be established using the connection information, and then a variety of information may be exchanged. The wireless communication chip 153 refers to a chip which communicates according to various communication standards such as the Institute of Electrical and Electronics Engineers (IEEE), ZigBee, 3rd generation (3G), 3G partnership project (3GPP), long term evolution (LTE), and the like. The NFC chip 154 refers to a chip which operates in an NFC method using a band of 13.56 MHz from among various radio frequency identification (RF-ID) frequency bands such as 135 kHz, 13.56 MHz, 433 MHz, 860-960 MHz, and 2.45 GHz.
- The communicator 150 may perform unidirectional communication or bidirectional communication with an electronic device. When the communicator 150 performs unidirectional communication, the communicator 150 may receive signals from the electronic device. When the communicator 150 performs bidirectional communication, the communicator 150 may receive signals from the electronic device or transmit signals to the electronic device.
- The user interface 155 may receive various types of user interaction. When the user terminal apparatus 100 is implemented by using a touch-based portable terminal, the user interface 155 may be implemented in the form of a touch screen forming a mutual layer structure with a touch pad. In this case, the user interface 155 may be used as the above-described display 110.
- The sensor (not shown) may include a touch sensor, a geomagnetic sensor, a gyro sensor, an acceleration sensor, a proximity sensor, a grip sensor, and the like. The sensor may detect various operations such as rotation, tilt, pressure, approach, grip, and the like, in addition to the above-described touch.
- The touch sensor may be implemented in a capacitive method or a resistive method. The capacitive type touch sensor is a sensor which calculates touch coordinates by detecting minute electricity excited in the user's body when a part of the user's body touches the surface of the display, using a dielectric substance coated on the surface of the display. The resistive type touch sensor includes two electrode plates embedded in the user terminal apparatus 100, and, when the user touches the screen, calculates touch coordinates by detecting an electric current flowing due to contact between the upper and lower plates at the touched point. In addition, an infrared beam, a surface acoustic wave, an integral strain gauge, a piezoelectric element, and the like, may be used to detect a touch interaction.
- In addition, the user terminal apparatus 100 may determine whether a touch object such as a finger or a stylus pen touches or approaches using a magnetic field sensor, an optical sensor, a proximity sensor, and the like, instead of the touch sensor.
- The geomagnetic sensor is a sensor for detecting the rotational state, the moving direction, and the like, of the user terminal apparatus 100. The gyro sensor is a sensor for detecting the rotational angle of the user terminal apparatus 100. Both the geomagnetic sensor and the gyro sensor may be provided, but, even when only one of them is provided, the user terminal apparatus 100 may detect its rotation state.
- The acceleration sensor is a sensor for detecting the degree to which the user terminal apparatus 100 is tilted.
- The proximity sensor is a sensor for detecting a motion which approaches without directly contacting the display surface. The proximity sensor may be implemented by using various types of sensors, such as a high-frequency oscillation type proximity sensor which forms a high-frequency magnetic field and detects an electric current induced by a magnetic characteristic which changes when an object approaches, a magnetic type proximity sensor which uses a magnet, and a capacitive type proximity sensor which detects the capacitance that changes when an object approaches.
- The grip sensor may be disposed on the rear surface, an edge, or a handle part, separately from the touch sensor provided on the touch screen, and detects a user's grip. The grip sensor may be implemented as a pressure sensor in addition to the touch sensor.
- The audio processor 160 is an element for processing audio data. The audio processor 160 may perform various processing operations such as decoding, amplification, noise filtering, and the like, with respect to the audio data.
- The video processor 170 is an element for processing video data. The video processor 170 may perform various image processing operations such as decoding, scaling, noise filtering, frame rate conversion, resolution conversion, and the like, with respect to the video data.
- The speaker 180 is an element for outputting not only various audio data processed by the audio processor 160 but also various notification sounds, voice messages, and the like.
- The button 181 may include various types of buttons, such as a mechanical button, a touch pad, a wheel, and the like, formed on a certain area of the user terminal apparatus 100, such as the front surface, the side surface, or the rear surface of the body exterior of the user terminal apparatus 100.
- The microphone 182 is an element for receiving an input of a user voice or other sounds and converting the user voice or sound into audio data.
- Although not shown in FIG. 2B, the user terminal apparatus 100 may further include a USB port to which a universal serial bus (USB) connector is connected, various external input ports for connecting to various external terminals such as a headset, a mouse, and a local area network (LAN), a digital multimedia broadcasting (DMB) chip for receiving and processing a DMB signal, various sensors, and the like.
- FIG. 2C illustrates various modules stored in a storage according to an embodiment of the present disclosure.
- The software of FIG. 2C may be stored in the storage 140, but is not limited to this. The software may be stored in various kinds of storing means used in the user terminal apparatus 100.
- Referring to FIG. 2C, software including an O/S 191, a kernel 192, middleware 193, an application 194, and the like, may be stored in the user terminal apparatus 100.
- The O/S 191 controls and manages the overall operations of the hardware. That is, the O/S 191 is a software layer which is responsible for basic functions such as hardware management, memory management, and security.
- The kernel 192 serves as a channel to transmit various signals, including a touch signal and the like detected by the display 110, to the middleware 193.
- The middleware 193 includes various software modules to control the operations of the user terminal apparatus 100. Referring to FIG. 2C, the middleware 193 includes a main UI framework 193-1, a window manager 193-2, a sub UI framework 193-3, a security module 193-4, a system manager 193-5, a connectivity manager 193-6, an X11 module 193-7, an APP manager 193-8, and a multimedia framework 193-9.
- The main UI framework 193-1 is a module which provides various UIs to be displayed on the main display area 10 of the display 110, and the sub UI framework 193-3 is a module which provides various UIs to be displayed on the sub display area 20. The main UI framework 193-1 and the sub UI framework 193-3 may include an image compositor module to configure various objects, a coordinates compositor module to calculate coordinates for displaying the objects, a rendering module to render the configured objects on the calculated coordinates, and a two dimensional (2D)/three dimensional (3D) UI toolkit to provide a tool for configuring a UI in 2D or 3D form.
- The window manager 193-2 may detect a touch event made with the user's body or a pen, or other input events. In response to such an event being detected, the window manager 193-2 transmits an event signal to the main UI framework 193-1 or the sub UI framework 193-3 such that an operation corresponding to the event is performed.
- In addition, various program modules may be stored, such as a writing module which, when the user touches and drags on the screen, draws a line following the trace of the dragging, and an angle calculation module which calculates a pitch angle, a roll angle, a yaw angle, and the like, based on a sensor value detected by the sensor.
- The security module 193-4 is a module which supports certification, permission, and secure storage for the hardware.
- The system manager 193-5 monitors the states of the elements in the user terminal apparatus 100 and provides the result of the monitoring to the other modules. For example, in response to the remaining battery level being low, an error being generated, or communication being disconnected, the system manager 193-5 provides the result of the monitoring to the main UI framework 193-1 or the sub UI framework 193-3 to output a notification message or a notification sound.
- The connectivity manager 193-6 is a module which supports wired or wireless network connection. The connectivity manager 193-6 may include various sub modules such as a DNET module, a universal plug and play (UPnP) module, and the like.
- The X11 module 193-7 is a module which receives various event signals from a variety of hardware provided in the user terminal apparatus 100. The event recited herein refers to an event in which a user operation is detected, an event in which a system alarm is generated, an event in which a specific program is executed or ends, and the like.
- The APP manager 193-8 is a module which manages the execution states of the various applications installed in the storage 140. In response to an event in which an application execution command is inputted being detected by the X11 module 193-7, the APP manager 193-8 may call and execute an application corresponding to the event. That is, in response to an event in which at least one object is selected being detected, the APP manager 193-8 may call an application corresponding to the object and execute the application.
- The multimedia framework 193-9 is a module which reproduces multimedia contents which are stored in the user terminal apparatus 100 or provided from external sources. The multimedia framework 193-9 may include a player module, a camcorder module, a sound processing module, and the like. Accordingly, the multimedia framework 193-9 may reproduce various multimedia contents, generating and reproducing a screen and a sound.
- The software structure shown in FIG. 2C is merely an example and is not limited to this. Therefore, some of the elements may be omitted or changed, or an element may be added, when necessary. For example, the storage 140 may be additionally provided with various programs such as a sensing module to analyze signals sensed by various sensors; a messaging module such as a messenger program, a short message service (SMS) and multimedia message service (MMS) program, and an email program; a call information aggregator program module; a voice over internet protocol (VoIP) module; a web browser module; and the like.
- As described above, the user terminal apparatus 100 may be implemented by using various kinds of devices such as a mobile phone, a tablet personal computer (PC), a laptop PC, a personal digital assistant (PDA), a Moving Picture Experts Group phase 1 or phase 2 (MPEG-1 or MPEG-2) audio layer 3 (MP3) player, an electronic album device, a television (TV), a PC, a kiosk, and the like. Accordingly, the elements described in FIGS. 2B and 2C may be changed in various ways according to the type of the user terminal apparatus 100.
- As described above, the user terminal apparatus 100 may be implemented in various shapes and configurations.
- Hereinafter, a basic configuration and various embodiments will be explained for easy understanding of the present disclosure.
- FIG. 3 illustrates an example of using only one of a main display area, a sub display area, and a round display area according to an embodiment of the present disclosure.
- Referring to FIG. 3, the processor 130 may determine the area to provide information based on the orientation of the user terminal apparatus 100. In response to the processor 130 providing information to one area, the processor 130 may not provide information to the other two areas.
- In response to a user input of pressing a power button, the processor 130 may determine the area to provide information. In addition, in response to a call or a text message being received, the processor 130 may determine the area to provide information.
- In the above-described example, the processor 130 determines the area to provide information based on the orientation of the user terminal apparatus 100. However, this should not be considered as limiting. For example, in response to a user input of pressing the power button, the processor 130 may recognize the user and provide information through the area which is closest to the user's eyes. In this case, the provided information may vary according to the executed application.
- The processor 130 may change the display area providing information in response to a user's touch input on an area on which no information is provided. For example, in response to a user input of continuously touching a certain area of the sub display area 20 being received while information is provided on the main display area 10, the processor 130 may display the information provided on the main display area 10 on the sub display area 20.
- The processor 130 may display the same information in different ways according to the area displaying the information. For example, the processor 130 may change the layout of a home screen according to the display area, as shown in FIG. 3.
- In response to the orientation of the user terminal apparatus 100 being changed, the processor 130 may change the area displaying information. For example, in response to the user terminal apparatus 100 being rotated by more than a predetermined angle while the home screen is being displayed on the main display area 10, the processor 130 may display the home screen on the sub display area 20.
- In particular, in response to the orientation of the user terminal apparatus 100 being changed and thus the display area being changed, the processor 130 may change the layout of the information to be provided. In response to the area providing information being changed, the processor 130 may change not only an object but also the size, content, and layout of the object.
- In response to the orientation of the user terminal apparatus 100 being changed and thus the display area being changed, the processor 130 may change the operation state of an application which is being executed and provide the application to another display area. In addition, in response to the orientation of the user terminal apparatus 100 being changed, the processor 130 may divide the displayed information and display the divided pieces of information on the other display areas.
- FIG. 4 illustrates an example of using at least two areas of a main display area, a sub display area, and a round display area according to an embodiment of the present disclosure.
- Referring to FIG. 4, the processor 130 may provide pieces of information related to each other to two areas from among the main display area 10, the sub display area 20, and the round display area 30. For example, the processor 130 may display a moving picture on the main display area 10 and may display a UI for controlling the moving picture on the round display area 30.
- However, this should not be considered as limiting. The processor 130 may provide pieces of information unrelated to each other to two areas from among the main display area 10, the sub display area 20, and the round display area 30.
- In addition, in response to a call being received while the UI for controlling the moving picture is displayed on the round display area 30 as shown in FIG. 4, the processor 130 may display a call reception UI on the round display area 30, and move the UI for controlling the moving picture to the sub display area 20 and display the UI on the sub display area 20.
- In response to the call being connected by touching the call reception UI, the processor 130 may display a telephone call UI on the round display area 30 and continue reproducing the moving picture. In this case, the processor 130 may mute the moving picture. In addition, the processor 130 may pause the moving picture.
- The processor 130 may provide pieces of information related to each other using all of the main display area 10, the sub display area 20, and the round display area 30. For example, the processor 130 may display a moving picture on the main display area 10, display a UI for controlling the moving picture on the sub display area 20, and display a UI showing the moving picture reproduction time on the round display area 30.
- In addition, the processor 130 may provide pieces of information unrelated to each other using all of the main display area 10, the sub display area 20, and the round display area 30. In addition, the processor 130 may provide pieces of information related to each other to two areas from among the main display area 10, the sub display area 20, and the round display area 30, and provide information unrelated to the aforementioned information to the remaining area.
- In response to the orientation of the user terminal apparatus 100 being changed, the processor 130 may change the display area of the information displayed on each area. In particular, the processor 130 may change the display area by combining or dividing the information displayed on each area.
- In response to a touch input on the sub display area 20 being received, the processor 130 may display a UI displayed on the sub display area 20 on the main display area 10. In addition, in response to a touch input on the main display area 10 being received, the processor 130 may display a UI displayed on the main display area 10 on at least one of the sub display area 20 and the round display area 30.
- Even in response to the same user touch input being detected, the processor 130 may adjust a setting value differently according to the touch area. For example, in response to a drag operation being inputted on the main display area 10 or the sub display area 20 while a moving picture is being displayed on the main display area 10, the processor 130 may adjust the reproduction time or the volume according to the direction of the drag operation. In this case, in response to the drag operation on the sub display area 20, the processor 130 may adjust the reproduction time or volume more minutely than in response to the drag operation on the main display area 10.
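- The following fragment, which is not part of the original disclosure (the gain values are invented), sketches how the same drag length can change a setting coarsely on the main display area 10 and minutely on the sub display area 20:

    DRAG_GAIN = {"main": 1.0, "sub": 0.2}  # the sub display area adjusts more minutely

    def adjust_volume(volume, drag_amount, area):
        """Apply a drag to the volume; an identical drag changes the value
        less when made on the sub display area."""
        return max(0, min(100, volume + drag_amount * DRAG_GAIN[area]))

    print(adjust_volume(50, 10, "main"))  # -> 60.0 (coarse)
    print(adjust_volume(50, 10, "sub"))   # -> 52.0 (fine)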
- FIG. 4 is merely an embodiment, and this should not be considered as limiting. For example, the main display area 10, the sub display area 20, and the round display area 30 of FIG. 4 may be replaced with one another. In addition, the operations described in FIG. 4 may be applied to any other application.
- FIG. 5 illustrates respective areas and an example of an operation corresponding to a touch input according to an embodiment of the present disclosure.
- Referring to FIG. 5, in response to a drag input going from a certain area of the main display area 10 to the round display area 30 being received, the processor 130 may display the information provided on the main display area 10 on the sub display area 20.
- The processor 130 may display the information provided on the main display area 10 on the sub display area 20 only in response to a drag input longer than a predetermined length being received.
- In response to information having been provided on the sub display area 20 before the drag input is received, the processor 130 may no longer provide that information. In addition, the processor 130 may display the information previously provided on the sub display area 20 on the main display area 10 or the round display area 30.
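- A minimal sketch of the length condition follows. It is not from the original disclosure; the coordinate convention (y decreasing toward the round display area at the top) and the threshold are assumptions:

    import math

    def should_move_to_sub(start, end, min_length=150):
        """Move content from the main display area to the sub display area
        only when the drag toward the round display area is long enough."""
        dx, dy = end[0] - start[0], end[1] - start[1]
        upward = dy < 0  # y decreases toward the round display area at the top
        return upward and math.hypot(dx, dy) >= min_length

    print(should_move_to_sub((240, 800), (240, 560)))  # -> True
    print(should_move_to_sub((240, 800), (240, 700)))  # -> False (too short)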
- In FIG. 5, only the main display area 10 and the sub display area 20 are used. However, this should not be considered as limiting. For example, the processor 130 may display information provided on at least one of the main display area 10 and the sub display area 20 on the round display area 30. In addition, the processor 130 may display information provided on the round display area 30 on at least one of the main display area 10 and the sub display area 20.
- Although information is not provided on a specific area, the processor 130 may receive a touch input there. For example, in response to a user's dragging input on the round display area 30 being received while a broadcast content is being displayed on the main display area 10, the processor 130 may change the channel or the volume of the broadcast content. In this case, no information may be provided on the round display area 30.
- FIG. 6 illustrates an operation in response to the orientation of a user terminal apparatus being changed according to an embodiment of the present disclosure.
- Referring to FIG. 6, the user terminal apparatus 100 includes a display 110, which includes a main display area 10 which is disposed on the front surface of the user terminal apparatus 100, a sub display area 20 which extends from one side of the main display area 10 and is disposed on at least one area of the rear surface of the user terminal apparatus 100, and a round display area 30 for connecting the main display area 10 and the sub display area 20.
- The user terminal apparatus 100 may be provided with a camera 120 disposed on the rear surface thereof to photograph an image. The structure of the user terminal apparatus 100 having the camera 120 disposed on the rear surface thereof will be explained first for convenience of explanation. The structure of the user terminal apparatus 100 having the camera 120 disposed on the front surface thereof will be explained thereafter.
- The processor 130 may display a live view which is acquired through the camera 120 on one of the main display area 10 and the sub display area 20. The upper view of FIG. 6 illustrates that the processor 130 displays the live view on the main display area 10. In addition, the processor 130 may display the live view on the sub display area 20, and a method for determining which of the main display area 10 and the sub display area 20 is the area to display the live view will be explained below.
- In response to the orientation of the user terminal apparatus 100 being changed, the processor 130 may control the display 110 to display the live view on the other one of the main display area 10 and the sub display area 20. The lower view of FIG. 6 illustrates that, in response to a rotation by more than a predetermined angle in a specific direction being detected while the live view is being displayed on the main display area 10, the processor 130 displays the live view on the sub display area 20. The image photographed by the camera 120 may also change in response to the orientation of the user terminal apparatus 100 being changed. The processor 130 may determine that the orientation of the user terminal apparatus 100 is changed by various methods, and a detailed description thereof will be given below.
- In response to the orientation of the user terminal apparatus 100 being changed and thus the area displaying the live view being changed, the processor 130 may display a soft key, a control GUI, and the like, so as to correspond to the corresponding area. For example, in response to the live view being displayed on the main display area 10, the processor 130 may display a GUI for photographing on the right side of the main display area 10. In addition, in response to the orientation of the user terminal apparatus 100 being changed and thus the live view being displayed on the sub display area 20, the processor 130 may not display the GUI for photographing and may photograph in response to a touch on the sub display area 20.
- In response to the orientation of the user terminal apparatus 100 being changed while the live view is being displayed on the sub display area 20, the processor 130 may display a photographing setting value on the main display area 10.
- The processor 130 may display the live view on the main display area 10 and may not provide information on the sub display area 20, and, in response to the orientation of the user terminal apparatus 100 being changed, the processor 130 may display the live view on the sub display area 20 and may not provide information on the main display area 10. For example, in the case of the upper view of FIG. 6, the processor 130 may not provide information on the sub display area 20. In addition, in the case of the lower view of FIG. 6, the processor 130 may not provide information on the main display area 10. However, this should not be considered as limiting. The processor 130 may display the live view on both the main display area 10 and the sub display area 20.
- The processor 130 may not provide information by displaying one of the main display area 10 and the sub display area 20 in black. In addition, the processor 130 may not supply power to the display 110 in one of the main display area 10 and the sub display area 20. Although the processor 130 may not provide information on one of the main display area 10 and the sub display area 20, the processor 130 may activate a touch function there to receive a touch input.
- In response to the orientation of the user terminal apparatus 100 being changed, the processor 130 may change a photographing mode. For example, in response to the orientation of the user terminal apparatus 100 being changed and thus a subject being recognized as being close to the user terminal apparatus 100, the processor 130 may convert the current mode into a selfie photographing mode. The selfie photographing mode is a mode optimized for selfie photographing and may correct the user's skin automatically.
- In addition, in response to the orientation of the user terminal apparatus 100 being changed and thus a subject being recognized as being far from the user terminal apparatus 100, the processor 130 may convert the current mode into a background photographing mode. The background photographing mode is a mode optimized for scene photographing and may adjust contrast and white balance automatically.
- The processor 130 may change the angle of view according to the photographing mode. For example, the processor 130 may narrow the angle of view in the selfie photographing mode and may widen the angle of view in the background photographing mode.
- The processor 130 may also change the UI provided on the display according to the photographing mode. For example, the processor 130 may provide a UI including a function of correcting the user's skin and a function of reducing the red-eye effect in the selfie photographing mode, and may provide a UI including a contrast adjustment function and a white balance adjustment function in the background photographing mode.
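- The mode selection just described can be summarized in a short sketch. It is not part of the original disclosure; the distance threshold is an assumption, and the returned setting names merely restate the behaviors listed above:

    def configure_photographing_mode(subject_distance_m, threshold_m=0.6):
        """Pick the photographing mode from the subject distance and return
        the mode-specific settings described in the text."""
        if subject_distance_m < threshold_m:
            return {"mode": "selfie", "angle_of_view": "narrow",
                    "ui": ["skin correction", "red-eye reduction"]}
        return {"mode": "background", "angle_of_view": "wide",
                "ui": ["contrast adjustment", "white balance adjustment"]}

    print(configure_photographing_mode(0.4)["mode"])  # -> selfie
    print(configure_photographing_mode(3.0)["ui"])    # -> background-mode UI functions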
- FIGS. 7A and 7B illustrate an example of a method of changing a provided image according to a display area according to various embodiments of the present disclosure.
- Referring to FIG. 7A, the processor 130 may display a live view on the main display area 10. In response to the user performing a photographing manipulation, the processor 130 may store the same image as the image displayed on the main display area 10.
- In addition, referring to FIG. 7B, the processor 130 may display the live view on the sub display area 20. In this case, the processor 130 may display only a part of the image which is recognized through the camera 120.
- FIGS. 7A and 7B illustrate that the same subject is photographed, for convenience of explanation. That is, the processor 130 may photograph the image shown in FIG. 7A, but may display only a part of the photographed image on the sub display area 20 as shown in FIG. 7B.
- However, this is merely an embodiment, and the processor 130 may rotate the entire photographed image by 90° and display the image on the sub display area 20.
- In addition, even in response to photographing being performed while the live view is being displayed on the sub display area 20, the processor 130 may store the entire image. For example, even in response to the user performing selfie photographing with the composition shown in FIG. 7B, the processor 130 may photograph an original image including not only the user's face but also the user's upper body, as shown in FIG. 7A.
- In this case, a gallery application may be executed, and the processor 130 may display the stored image differently according to the display area. For example, in response to the image being displayed on the main display area 10, the processor 130 may display the original image including the user's upper body. In response to the image being displayed on the sub display area 20, the processor 130 may display a partial image including only the user's face.
- The processor 130 may store both the original image and the partial image. In addition, the processor 130 may store only the original image, and may crop the image to correspond to the sub display area 20 when necessary and display the cropped image on the sub display area 20. In this case, information on the part to be cropped may be stored with the original image.
- FIGS. 8A and 8B illustrate an example of a method of executing a specific function through a main display area according to various embodiments of the present disclosure. The processor 130 may display a live view on the sub display area 20 and may not provide information on the main display area 10. In this case, the processor 130 may execute a corresponding function according to a user touch input on the main display area 10.
- Referring to FIG. 8A, photographing an image using the main display area 10 is illustrated. For example, the processor 130 may display a live view on the sub display area 20 and may not provide information on the main display area 10. In this case, in response to a user touch on the main display area 10 being detected, the processor 130 may photograph and store an image.
- The processor 130 may photograph the image only in response to a user touch on a predetermined area 11 of the main display area 10 being detected. The predetermined area 11 may be set by the manufacturer or may be set by the user. In this case, the processor 130 may activate the touch function of the predetermined area 11 of the main display area 10 and may deactivate the touch function of the other area of the main display area 10.
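- A minimal hit-test sketch of this behavior (not part of the original disclosure; the rectangle standing in for area 11 and the capture call are hypothetical):

    PREDETERMINED_AREA_11 = (0, 0, 300, 200)  # hypothetical x, y, width, height

    def on_blank_main_area_touch(x, y, capture):
        """While the live view is on the sub display area and the main
        display area is blank, photograph only for touches that land
        inside the predetermined area 11; other touches are deactivated."""
        ax, ay, w, h = PREDETERMINED_AREA_11
        if ax <= x < ax + w and ay <= y < ay + h:
            capture()

    on_blank_main_area_touch(120, 80, lambda: print("photograph and store"))
    on_blank_main_area_touch(500, 80, lambda: print("unreached"))  # outside area 11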
- Referring to FIG. 8B, changing a photographing setting value using the main display area 10 is illustrated. For example, the processor 130 may display a live view on the sub display area 20 and may not provide information on the main display area 10. In this case, in response to a swipe interaction from a certain point in the main display area 10 in a predetermined direction being detected, the processor 130 may change a photographing setting value to correspond to the predetermined direction.
- The processor 130 may change the photographing setting value only in response to a swipe interaction from a certain point in the predetermined area 11 of the main display area 10 in a predetermined direction being detected. The predetermined area may be set by the manufacturer or may be set by the user. In this case, the processor 130 may activate the touch function of the predetermined area 11 of the main display area 10 and deactivate the touch function of the other area of the main display area 10.
- In response to a swipe interaction in the horizontal direction 810-1, 810-2 being detected, the processor 130 may change an exposure value and display the live view. In response to a swipe interaction in the vertical direction 820-1, 820-2 being detected, the processor 130 may display the live view while zooming in or zooming out. However, this should not be considered as limiting. The processor 130 may change the focus or the white balance through the swipe interaction.
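- As an illustration of this direction-to-setting mapping (not part of the original disclosure; the step sizes and the dominant-axis rule are assumptions):

    def apply_swipe(settings, dx, dy):
        """Map a swipe on the blank main display area to a setting change:
        horizontal swipes (810-1, 810-2) change the exposure value, and
        vertical swipes (820-1, 820-2) zoom in or out."""
        if abs(dx) >= abs(dy):
            settings["exposure"] += 0.1 if dx > 0 else -0.1
        else:
            settings["zoom"] *= 1.1 if dy < 0 else 1 / 1.1  # upward swipe zooms in
        return settings

    print(apply_swipe({"exposure": 0.0, "zoom": 1.0}, 80, 5))   # horizontal: exposure
    print(apply_swipe({"exposure": 0.0, "zoom": 1.0}, 3, -60))  # vertical: zoom in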
- In FIGS. 8A and 8B, only the predetermined area 11 of the main display area 10 has been described. However, this should not be considered as limiting. For example, the processor 130 may activate the touch function of the entire main display area 10, and photograph in response to a user touch on a certain point of the main display area 10 being detected. In addition, the processor 130 may divide the main display area 10 into a plurality of predetermined areas, and may allocate different functions to the predetermined areas.
- The processor 130 may execute corresponding functions in response to various interactions, such as a drag and drop interaction, a multi-touch interaction, and a stretch interaction which refers to pinching out, in addition to the touch and the swipe interaction. The interactions and their corresponding functions may be set by the manufacturer or may be set by the user.
- In FIGS. 8A and 8B, the processor 130 executes a specific function in response to a touch on the main display area 10. However, this should not be considered as limiting. For example, the processor 130 may display a live view on the main display area 10 and may not provide information on the sub display area 20. In this state, in response to a user touch on the sub display area 20 being detected, the processor 130 may photograph and store an image. In addition, in response to a swipe interaction from a certain point in the sub display area 20 in a predetermined direction being detected, the processor 130 may change the photographing setting value to correspond to the predetermined direction.
- In addition, the processor 130 may display the live view on both the main display area 10 and the sub display area 20, and may use the area on the opposite side of the area which is viewed by the user as the area to receive a touch input.
- In FIGS. 8A and 8B, only the case in which the camera 120 is used has been described. However, in response to another application being used, the processor 130 may also receive a touch input using the area on the opposite side of the area which is viewed by the user. For example, while displaying a broadcast content on the main display area 10, the processor 130 may change the channel or the volume through a swipe interaction on the sub display area 20.
- In addition, in FIGS. 8A and 8B, only the main display area 10 and the sub display area 20 are used. However, one of the main display area 10 and the sub display area 20 may be used together with the round display area 30. For example, while displaying a live view on the sub display area 20, the processor 130 may photograph according to a user touch input on the round display area 30.
- FIG. 9 illustrates a method of displaying a live view on both a main display area and a sub display area according to an embodiment of the present disclosure.
- Referring to FIG. 9, the processor 130 may display a live view on the main display area 10, and, in response to the live view being recognized as including a person's face, the processor 130 may control the display 110 to display the face area included in the live view on the sub display area 20. For example, in response to the user terminal apparatus 100 being far from a subject, the processor 130 may display the live view on the main display area 10. This will be explained in detail below.
- The processor 130 may recognize a user's face from the live view displayed on the main display area 10. To recognize a person's face using the live view, the processor 130 may divide a specific still image of the live view into a plurality of pixel blocks and calculate a representative pixel value for each of the pixel blocks. The representative pixel value may be calculated based on the average of all pixels included in the pixel block, or may be calculated based on a maximum distribution value, a median value, or a maximum value. The processor 130 may compare the representative pixel values of the pixel blocks with one another, and determine whether pixel blocks having pixel values falling within a similar range are continuously arranged. In response to the pixel blocks being continuously arranged, the processor 130 may determine that those pixel blocks form a single object. The processor 130 may then determine whether, from among the pixel blocks determined to be objects, there exists an object having pixel values falling within a range similar to a person's skin. In response to such an object existing, the processor 130 may recognize the object as the user's face area or another body area, and determine the other objects to be the background.
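- The pixel-block procedure above translates almost directly into code. The following sketch is not part of the original disclosure: it assumes an RGB image given as nested lists, uses the per-block average as the representative value, and uses an invented skin-color range:

    from collections import deque

    def face_blocks(image, block=8, skin=((95, 255), (40, 200), (20, 180))):
        """Group contiguous skin-colored pixel blocks into candidate
        face/body objects; everything else is treated as background."""
        rows, cols = len(image) // block, len(image[0]) // block
        is_skin = [[False] * cols for _ in range(rows)]
        for r in range(rows):
            for c in range(cols):
                pix = [image[r * block + i][c * block + j]
                       for i in range(block) for j in range(block)]
                rep = [sum(ch) / len(pix) for ch in zip(*pix)]  # block average
                is_skin[r][c] = all(lo <= v <= hi for v, (lo, hi) in zip(rep, skin))
        seen, objects = set(), []
        for r in range(rows):
            for c in range(cols):
                if is_skin[r][c] and (r, c) not in seen:
                    comp, queue = [], deque([(r, c)])
                    seen.add((r, c))
                    while queue:  # flood-fill continuously arranged blocks
                        y, x = queue.popleft()
                        comp.append((y, x))
                        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                            if 0 <= ny < rows and 0 <= nx < cols and \
                               is_skin[ny][nx] and (ny, nx) not in seen:
                                seen.add((ny, nx))
                                queue.append((ny, nx))
                    objects.append(comp)  # one contiguous skin-colored object
        return objects

    # Toy 16x16 image: one 8x8 skin-colored patch on a blue background.
    img = [[(210, 160, 120) if r < 8 and c < 8 else (20, 40, 200)
            for c in range(16)] for r in range(16)]
    print(face_blocks(img))  # -> [[(0, 0)]]: a single one-block object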
- The processor 130 may crop a part 910 of the live view to correspond to the size of the sub display area 20, and control the display 110 to display the part 910 on the sub display area 20. In particular, the processor 130 may crop the part 910 of the live view while maintaining the aspect ratio of the sub display area 20. The processor 130 may crop the part 910 of the live view such that the face area included in the live view is located at the center of the sub display area 20. In addition, the processor 130 may crop an area smaller than the part 910 of the live view shown in FIG. 9, and may magnify the cropped area and display it on the sub display area 20.
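- A minimal sketch of such a crop computation follows; it is not part of the original disclosure, and the image and sub-display dimensions in the usage line are invented:

    def crop_for_sub_area(img_w, img_h, face_cx, face_cy, sub_w, sub_h):
        """Compute a crop window that keeps the aspect ratio of the sub
        display area and centers the recognized face, clamped so the
        window stays inside the image."""
        aspect = sub_w / sub_h
        crop_h = img_h
        crop_w = int(crop_h * aspect)
        if crop_w > img_w:            # image too narrow: fit the width instead
            crop_w = img_w
            crop_h = int(crop_w / aspect)
        x = min(max(face_cx - crop_w // 2, 0), img_w - crop_w)
        y = min(max(face_cy - crop_h // 2, 0), img_h - crop_h)
        return x, y, crop_w, crop_h

    # 4000x3000 frame, face near the upper left, wide 1440x480 sub display area
    print(crop_for_sub_area(4000, 3000, 900, 700, 1440, 480))  # -> (0, 34, 4000, 1333)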
processor 130 may display the closest person's face area to the user from among the people' faces on thesub display area 20. In addition, theprocessor 130 may display a face area of a person who keeps his/her eyes toward theuser terminal apparatus 100 from among the people's faces on thesub display area 20. In addition, theprocessor 130 may crop a part of the live view to show all peoples. In response to the live view including many people, theprocessor 130 may display the live view on thesub display area 20 without cropping. Accordingly, the user who photographs using theuser terminal apparatus 100 may identify an image to be photographed through the live view displayed on themain display area 10, and people who are photographed may identify an image to be photographed through a part of the live view which is cropped and displayed on thesub display area 20. - In
In FIG. 9, the operation of displaying the live view on both the main display area 10 and the sub display area 20 has been described. However, this should not be considered as limiting. For example, in response to the user terminal apparatus 100 being operated in a selfie photographing mode, the processor 130 may display the live view on the sub display area 20 and may display a UI informing that selfie photographing is in progress on the main display area 10.
FIG. 10 illustrates a method of displaying a live view on a main display area and displaying a specific content on a sub display area according to an embodiment of the present disclosure.
Referring to FIG. 10, the processor 130 may display a live view on the main display area 10, and, in response to the live view being recognized as including a person, the processor 130 may control the display 110 to display an animation on the sub display area 20. For example, in response to a child being recognized in the live view, the processor 130 may display an animation for children or an animal image on the sub display area 20. The processor 130 may output a sound corresponding to the content displayed on the sub display area 20. Accordingly, the eyes of the person who is photographed may be kept toward the user terminal apparatus 100.
The processor 130 may determine the content to be displayed on the sub display area 20 based on at least one of the age and sex of a person who is recognized in the live view. For example, in response to a woman being recognized in the live view, the processor 130 may display jewelry, and the like, on the sub display area 20.
In response to multiple people being recognized in the live view, the processor 130 may determine the content to be displayed on the sub display area 20 with reference to the closest person from among the people. In addition, the processor 130 may determine the content to be displayed on the sub display area 20 with reference to a person whose eyes are directed away from the user terminal apparatus 100. In addition, the processor 130 may determine the content to be displayed on the sub display area 20 with reference to the person who is located closest to the center of the live view.
In FIGS. 9 and 10, the case in which a person or a person's face is recognized in the live view has been described. However, this should not be considered as limiting. For example, even in response to an animal being recognized in the live view, the processor 130 may display a part of the live view, an animation, and the like, on the sub display area 20. In addition, the processor 130 may display a part of the live view, an animation, and the like, on the sub display area 20 according to the user's manipulation.
FIGS. 11A and 11B illustrate an example of a method of determining an area to display a live view according to various embodiments of the present disclosure. The processor 130 may determine a distance to a subject, and, in response to the determined distance being shorter than a predetermined distance, the processor 130 may control the display 110 to display the live view on the sub display area 20, and, in response to the determined distance being longer than the predetermined distance, control the display 110 to display the live view on the main display area 10.
Referring to FIGS. 11A and 11B, a case in which the distance to a subject is shorter than the predetermined distance is illustrated in FIG. 11A, and a case in which the distance to a subject is longer than the predetermined distance is illustrated in FIG. 11B. The processor 130 may determine the distance to the subject using auto focusing. In addition, the processor 130 may analyze a still image at a specific point of time of the live view, and determine the distance to the subject based on an outline of the subject. Accordingly, in response to the user photographing himself/herself while viewing the rear surface of the user terminal apparatus 100, the live view may be displayed on the sub display area 20 so that the user may photograph while checking the image to be photographed. In addition, in response to the user photographing other people or a scene while viewing the front surface of the user terminal apparatus 100, the live view may be displayed on the main display area 10, so that the user may photograph while checking the image to be photographed.
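The distance rule of FIGS. 11A and 11B reduces to a simple threshold test, sketched below. The 1.0 m default threshold and all names are illustrative assumptions; the distance itself would come from auto focusing or outline analysis, as described above.

```kotlin
enum class DisplayArea { MAIN, SUB }

// Shorter than the threshold: the user is likely photographing himself/herself
// while viewing the rear surface, so the live view goes to the sub display area.
// Longer: the user is viewing the front surface, so it goes to the main display area.
fun areaForLiveView(subjectDistanceMeters: Double, thresholdMeters: Double = 1.0): DisplayArea =
    if (subjectDistanceMeters < thresholdMeters) DisplayArea.SUB else DisplayArea.MAIN
```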
However, this should not be considered as limiting. The processor 130 may display the live view on the area opposite to the area determined by the above-described method, or may display the live view on both the main display area 10 and the sub display area 20.
In addition, the processor 130 may display the live view on an area where a user's touch is performed.
In addition, the processor 130 may determine the area to display the live view by analyzing an image photographed by the camera 120. For example, the processor 130 may recognize the user from the photographed image and determine the area to display the live view.
In addition, the processor 130 may determine the area to display the live view by recognizing the user's iris.
FIG. 12 illustrates a method of detecting a change in an orientation of a user terminal apparatus according to an embodiment of the present disclosure.
Referring to FIG. 12, in response to an orientation of the user terminal apparatus 100 being changed, an image of a live view may be changed. The user terminal apparatus 100 may include a plurality of sensors as described above. The processor 130 may determine whether the orientation of the user terminal apparatus 100 is changed based on at least one of a location and a motion of the user terminal apparatus 100 and a user's grip, which are detected by the plurality of sensors. For example, the processor 130 may determine whether the orientation of the user terminal apparatus 100 is changed based on a rotation state and a moving direction of the user terminal apparatus 100 detected by the geomagnetic sensor, and a rotation angle of the user terminal apparatus 100 detected by the gyro sensor.
In response to the user terminal apparatus 100 being rotated in a specific direction by more than a predetermined angle, the processor 130 may determine that the orientation of the user terminal apparatus 100 is changed. For example, in response to the user terminal apparatus 100 being rotated by more than 90°, the processor 130 may determine that the orientation of the user terminal apparatus 100 is changed and perform a corresponding operation.
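A minimal sketch of that rotation test follows: gyro-derived rotation deltas about the flip axis are accumulated, and a change is reported once the total passes the threshold. The class name, the reset-on-report policy, and the 90° default are assumptions.

```kotlin
import kotlin.math.abs

class OrientationChangeDetector(private val thresholdDegrees: Double = 90.0) {
    private var accumulatedDegrees = 0.0

    // Feed rotation deltas (in degrees) about the axis on which the device flips.
    // Returns true once the accumulated rotation exceeds the threshold.
    fun onRotation(deltaDegrees: Double): Boolean {
        accumulatedDegrees += deltaDegrees
        if (abs(accumulatedDegrees) >= thresholdDegrees) {
            accumulatedDegrees = 0.0  // reset so the next flip is detected independently
            return true
        }
        return false
    }
}
```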
In response to a person's face greater than or equal to a predetermined size being recognized in the live view, or in response to a person's face greater than or equal to the predetermined size having been recognized but not being recognized afterward, the processor 130 may determine that the orientation of the user terminal apparatus 100 is changed. FIG. 12 illustrates a plurality of still images of a live view in sequence. The first still image 1210 is an image without a person, the second to fourth still images 1220, 1230, and 1240 are images including a person, and the fifth still image 1250 is an image without a person. That is, FIG. 12 illustrates that the camera angle is moved from the left to the right and then moved back to the left.
The processor 130 may determine that the orientation of the user terminal apparatus 100 is changed at the time at which the second still image 1220 is displayed and at the time at which the fifth still image 1250 is displayed, as shown in FIG. 12.
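That face-based test amounts to reporting a change whenever a sufficiently large face appears in the live view or a previously recognized face disappears, as in the sketch below; the size threshold and names are assumptions.

```kotlin
class FaceFlipDetector(private val minFaceAreaPx: Int) {
    private var faceWasVisible = false

    // faceAreaPx: pixel area of the largest face found in the current still, or 0 if none.
    // Returns true when a large-enough face first appears (second still image in FIG. 12)
    // or when a previously seen face is no longer found (fifth still image).
    fun onFrame(faceAreaPx: Int): Boolean {
        val visible = faceAreaPx >= minFaceAreaPx
        val changed = visible != faceWasVisible
        faceWasVisible = visible
        return changed
    }
}
```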
However, this should not be considered as limiting. The processor 130 may determine whether the orientation of the user terminal apparatus 100 is changed with reference to objects other than a person's face. For example, the processor 130 may extract an outline of a slide from a specific still image of the live view and then extract the outline of the slide from the next still image, and, in response to the outline of the slide being recognized as having moved by more than a predetermined distance in the live view, the processor 130 may determine that the orientation of the user terminal apparatus 100 is changed.
In addition, the processor 130 may continuously detect an absolute posture using the various sensors provided in the user terminal apparatus 100, and determine whether the orientation of the user terminal apparatus 100 is changed.
In this case, the processor 130 may set a reference posture, and, in response to the posture being changed from the reference posture by more than a predetermined value, the processor 130 may determine that the orientation of the user terminal apparatus 100 is changed. Herein, the reference posture is the posture at the time when a camera application is selected.
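A sketch of that reference-posture test, assuming the posture is available as Euler angles in degrees; the angle representation, the componentwise comparison (angle wraparound is ignored for brevity), and the 60° default threshold are all assumptions.

```kotlin
import kotlin.math.abs

data class Posture(val azimuth: Double, val pitch: Double, val roll: Double)

// The reference posture is captured when the camera application is selected;
// any later posture deviating by more than the threshold counts as an
// orientation change.
fun orientationChanged(reference: Posture, current: Posture, thresholdDegrees: Double = 60.0): Boolean {
    val maxDelta = maxOf(
        abs(current.azimuth - reference.azimuth),
        abs(current.pitch - reference.pitch),
        abs(current.roll - reference.roll)
    )
    return maxDelta > thresholdDegrees
}
```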
In addition, the processor 130 may determine whether the orientation of the user terminal apparatus 100 is changed according to the user's touch manipulation.
FIGS. 13A and 13B illustrate an example of an operation according to the change of an orientation of a user terminal apparatus according to various embodiments of the present disclosure.
Referring to FIG. 13A, in response to an orientation of the user terminal apparatus 100 being changed while a photographing function is being executed, a gallery application is executed. For example, the processor 130 may display a live view on the sub display area 20, and, in response to the user terminal apparatus 100 being rotated by about 180°, the processor 130 may display the gallery application on the main display area 10. In addition, the processor 130 may display a telephony function, an Internet function, and the like, which are unrelated to the photographing function, on the main display area 10. In addition, the processor 130 may stop the photographing function.
In addition, in response to the user touching the main display area 10 in the state in which photographing is performed and a stored image is displayed on the sub display area 20, the processor 130 may display the gallery application or an application for correcting images on the sub display area 20. In addition, in response to photographing being performed, the processor 130 may display the photographed images on the sub display area 20 in sequence, and, in response to the user touching the main display area 10, may change the displayed image.
Referring to FIG. 13B, in response to the orientation of the user terminal apparatus 100 being changed while a moving picture photographing function is being executed, the moving picture photographing function is stopped. The upper view of FIG. 13B illustrates a photographing time, a GUI 1310 for stopping photographing the moving picture, and a GUI 1320 for pausing photographing the moving picture. The user may touch the GUI 1310 for stopping photographing the moving picture to stop the photographing. Alternatively, the user may change the orientation of the user terminal apparatus 100 to stop photographing the moving picture. The lower view of FIG. 13B illustrates a thumbnail image of the moving picture after photographing is finished, and the user may touch a GUI 1330 for executing the moving picture to play the moving picture.
However, this should not be considered as limiting. The processor 130 may pause or stop photographing the moving picture in response to a touch on the main display area 10 while the moving picture is being photographed.
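The moving-picture control of FIG. 13B can be sketched as a small state machine: recording stops on an orientation change, and touches mirror the GUI 1310 (stop) and GUI 1320 (pause) controls. The state and method names are assumptions.

```kotlin
enum class RecordingState { RECORDING, PAUSED, STOPPED }

class MovieRecorder {
    var state = RecordingState.RECORDING
        private set

    fun onOrientationChanged() { state = RecordingState.STOPPED }  // flipping the device stops recording
    fun onPauseTouched() {                                          // corresponds to GUI 1320
        if (state == RecordingState.RECORDING) state = RecordingState.PAUSED
    }
    fun onStopTouched() { state = RecordingState.STOPPED }          // corresponds to GUI 1310
}
```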
FIGS. 14A and 14B illustrate an example of a method of executing a photographing function. The processor 130 may execute the photographing function in ways other than touching an icon indicating the photographing function.
Referring to FIG. 14A, in response to the orientation of the user terminal apparatus 100 being changed, the photographing function is executed. For example, in response to the orientation of the user terminal apparatus 100 being changed while a lock screen is being displayed on the main display area 10, the processor 130 may execute the photographing function and display a live view on the sub display area 20. In addition, in response to the orientation of the user terminal apparatus 100 being changed while no information is being provided on the main display area 10, the processor 130 may execute the photographing function and display a live view on the sub display area 20.
Referring to FIG. 14B, in response to a swipe interaction on the sub display area 20 being detected, the photographing function is executed. For example, in response to the swipe interaction on the sub display area 20 being detected while the lock screen is being displayed on the sub display area 20 or while no information is being provided on the sub display area 20, the processor 130 may execute the photographing function. FIG. 14B illustrates a swipe interaction moving from the left to the right. However, the swipe interaction is not limited to a specific direction. In addition, the photographing function may be executed through interactions other than the swipe interaction.
The photographing function may be executed in various ways in addition to those of the embodiments of FIGS. 14A and 14B. For example, in response to at least one of the main display area 10 and the sub display area 20 being activated while no information is being provided on the main display area 10 and the sub display area 20, the processor 130 may recognize the user's intention and execute the photographing function. The processor 130 may recognize at least one of the user's face, the user's gesture, and a grip form at the time when a specific area is activated, and execute the photographing function. For example, in response to the user making a V sign with the user's fingers, the processor 130 may execute the photographing function.
FIG. 15 is a view to illustrate a method of using a round display area according to an embodiment of the present disclosure.
Referring to FIG. 15, the round display area 30 is disposed between the main display area 10 and the sub display area 20. The round display area 30 may display a UI for controlling the user terminal apparatus 100. For example, in response to the photographing function being activated, the processor 130 may display a GUI 1520 for photographing and storing an image on the round display area 30. In response to the user touching the GUI 1520 for photographing and storing the image, the processor 130 may photograph and store the image.
In response to a swipe interaction in the horizontal direction 1510-1, 1510-2 on the round display area 30 being detected, the processor 130 may change an exposure value and display the live view accordingly. In addition, the processor 130 may display the live view zoomed in or zoomed out. However, this should not be considered as limiting. The processor 130 may change the focus or the white balance through a swipe interaction.
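One plausible mapping from such a horizontal swipe to an exposure setting is sketched below: the swipe distance is quantized into steps and converted into an exposure-compensation offset. The step size, the EV range, and the function name are assumptions, and the same pattern could drive zoom, focus, or white balance instead.

```kotlin
// swipeDeltaPx: signed horizontal swipe distance on the round display area
// (positive for rightward swipes 1510-2, negative for leftward swipes 1510-1).
fun exposureAfterSwipe(
    currentEv: Double,
    swipeDeltaPx: Float,
    pxPerStep: Float = 40f,   // pixels of swipe per exposure step
    evPerStep: Double = 0.1   // exposure change per step
): Double {
    val steps = (swipeDeltaPx / pxPerStep).toInt()
    return (currentEv + steps * evPerStep).coerceIn(-2.0, 2.0)
}
```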
However, this should not be considered as limiting. In response to the swipe interaction on the round display area 30 being detected, the processor 130 may execute the photographing function. In addition, the processor 130 may change the area on which the live view is displayed in response to a touch on the round display area 30.
The processor 130 may display a specific notification on the round display area 30. For example, in response to a subject being out of focus, the processor 130 may display a notification informing the user of the corresponding situation on the round display area 30. In response to the user touching the notification, the processor 130 may change settings to focus on the subject or retry photographing.
FIG. 16 is a view to illustrate a user terminal apparatus which has a camera disposed on a front surface thereof according to an embodiment of the present disclosure.
Referring to FIG. 16, the user terminal apparatus 100 may include the camera 120 disposed under the main display area 10. Most of the operations described in the above embodiments apply equally, except that the roles of the main display area 10 and the sub display area 20 are reversed. Therefore, a detailed description is omitted.
In the above descriptions, the camera application has been mainly described. However, similar operations may be performed for other applications.
FIGS. 17A to 17C are views to illustrate examples of cases in which other applications are used according to various embodiments of the present disclosure.
Referring to FIG. 17A, in response to an event occurring, the processor 130 may determine a display area based on the orientation of the user terminal apparatus 100 at the time when the event occurs. For example, in response to a call or a message being received, the processor 130 may detect an absolute posture of the user terminal apparatus 100 at the time when the call or message is received, and display the contents of the call or message on a predetermined display area. In particular, in response to the main display area 10 being in contact with a table, the processor 130 may display the contents of the call or message on the sub display area 20.
In addition, referring to FIG. 17B, in response to an event occurring, the processor 130 may determine a display area based on the user's location at the time when the event occurs. For example, in response to a call or a message being received, the processor 130 may attempt to recognize the user's location through the camera 120 at the time when the call or message is received. In response to the user's location being recognized, the processor 130 may display the contents of the call or message on the sub display area 20, and, in response to the user's location not being recognized, the processor 130 may display the contents of the call or message on the main display area 10.
In addition, referring to FIG. 17C, in response to an event occurring, the processor 130 may determine a display area based on the usage state of the user terminal apparatus 100 at the time when the event occurs. For example, in response to a call or a message being received while the user is viewing a moving picture on the main display area 10, the processor 130 may display the contents of the call or message on the sub display area 20 or the round display area 30.
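The three rules of FIGS. 17A to 17C can be combined into one decision, sketched below. The rule ordering and all names are assumptions; the inputs would come from the posture sensors, the camera 120, and the current application state.

```kotlin
enum class Area { MAIN, SUB, ROUND }

fun areaForIncomingEvent(
    mainAreaFaceDown: Boolean,   // FIG. 17A: main display area pressed against a surface
    userSeenByCamera: Boolean,   // FIG. 17B: user's location recognized through the camera 120
    videoPlayingOnMain: Boolean  // FIG. 17C: main display area already in use
): Area = when {
    mainAreaFaceDown -> Area.SUB      // only the rear side is visible
    videoPlayingOnMain -> Area.ROUND  // keep the moving picture uninterrupted
    userSeenByCamera -> Area.SUB      // the camera side faces the user
    else -> Area.MAIN
}
```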
In response to the orientation of the user terminal apparatus 100 being changed, the processor 130 may change the display area. For example, in response to the user changing the orientation of the user terminal apparatus 100 in the middle of viewing the contents of a text message on the main display area 10, and then viewing the sub display area 20, the processor 130 may display the contents of the text message on the sub display area 20.
As described above, the processor 130 may determine whether the orientation of the user terminal apparatus 100 is changed using the various sensors or the camera 120. The change of the orientation of the user terminal apparatus 100 has been described in detail above, and thus a redundant explanation thereof is omitted.
In response to the orientation of the user terminal apparatus 100 being changed and thus the display area being changed, the processor 130 may change the UI. Specifically, the processor 130 may change at least one of the amount of information, the information type, and the layout included in the UI. For example, in response to a text message being displayed on the main display area 10, the processor 130 may additionally display a UI for creating a new text message, information on the other user, and the like, in addition to the contents of the text message. In response to the text message being displayed on the sub display area 20, only the contents of the text message may be displayed. In response to the text message being displayed on the round display area 30, the processor 130 may display portions of the contents of the text message in sequence.
In particular, the processor 130 may change the functions provided according to the display area. For example, in response to the received text message being displayed on the main display area 10, the processor 130 may display a UI including a reply message input function, a received message storing function, a received message deleting function, and the like. However, in response to the received text message being displayed on the round display area 30, only the contents of the received text message may be displayed, and the other functions may not be provided.
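As a sketch of that per-area tailoring for a received text message: the main display area gets the full UI, the sub display area only the message body, and the round display area a short excerpt shown in sequence. The Area enum repeats the one from the earlier sketch, and the field names and the 40-character excerpt length are assumptions.

```kotlin
enum class Area { MAIN, SUB, ROUND }  // as in the earlier sketch

data class MessageUi(
    val body: String,             // message contents (possibly truncated)
    val showSenderInfo: Boolean,  // information on the other user
    val showReplyBox: Boolean,    // reply message input function
    val actions: List<String>     // storing, deleting, and similar functions
)

fun messageUiFor(area: Area, body: String): MessageUi = when (area) {
    Area.MAIN -> MessageUi(body, showSenderInfo = true, showReplyBox = true,
                           actions = listOf("store", "delete"))
    Area.SUB -> MessageUi(body, showSenderInfo = false, showReplyBox = false,
                          actions = emptyList())
    Area.ROUND -> MessageUi(body.take(40), showSenderInfo = false, showReplyBox = false,
                            actions = emptyList())  // shown a portion at a time, in sequence
}
```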
FIGS. 18A and 18B illustrate examples of a configuration of a display according to various embodiments of the present disclosure.
Referring to FIG. 18A, a flexible display is illustrated. The user may fold the flexible display and use it. In this case, an area 1810 which is folded and bent back corresponds to the sub display area 20, and an unfolded area 1820 corresponds to the main display area 10.
In FIG. 18A, the folded area 1810 is smaller than the unfolded area 1820. However, this should not be considered as limiting. For example, the user may fold the flexible display in half and use it.
Most of the embodiments described above may be applied to the flexible display, except those involving the round display area 30, and a redundant explanation is omitted. However, the round display area 30 may be formed by folding the flexible display two times. In this case, the embodiments involving the round display area 30 may also be applied.
Referring to FIG. 18B, the user terminal apparatus 100 which is provided with a plurality of displays on its front surface and rear surface is illustrated. In this case, a front display 1830 corresponds to the main display area 10, and a rear display 1840 corresponds to the sub display area 20. Likewise, most of the embodiments described above may be applied to the user terminal apparatus 100 having the plurality of displays disposed on its front surface and rear surface, except those involving the round display area 30, and a redundant explanation is omitted.
FIG. 19 is a flowchart to illustrate a control method of the user terminal apparatus according to an embodiment of the present disclosure.
Referring to FIG. 19, first, the user terminal apparatus 100 displays a live view acquired through a camera on one of a main display area, which is disposed on the front surface of the user terminal apparatus, and a sub display area, which extends from one side of the main display area and is disposed on at least one area of the rear surface of the user terminal apparatus 100, at operation S1910. In addition, in response to the orientation of the user terminal apparatus 100 being changed, the user terminal apparatus 100 displays the live view on the other one of the main display area and the sub display area at operation S1920. The displaying on one of the main display area and the sub display area at operation S1910 may include displaying the live view on the main display area and not providing information on the sub display area, and the displaying on the other one of the main display area and the sub display area at operation S1920 may include, in response to the orientation of the user terminal apparatus being changed, displaying the live view on the sub display area and not providing information on the main display area.
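The core of FIG. 19 is a two-state toggle, sketched below: the live view starts on one area (operation S1910) and moves to the other whenever an orientation change is detected (operation S1920). The class and method names are illustrative; the orientation detection itself would use any of the sensor- or image-based tests described earlier.

```kotlin
enum class LiveViewArea { MAIN, SUB }

class LiveViewController(initialArea: LiveViewArea) {
    var currentArea = initialArea  // operation S1910: the live view is shown here first
        private set

    // Operation S1920: called when an orientation change is detected;
    // the live view moves to the other display area.
    fun onOrientationChanged() {
        currentArea = if (currentArea == LiveViewArea.MAIN) LiveViewArea.SUB
                      else LiveViewArea.MAIN
    }
}
```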
The displaying on the other one of the main display area and the sub display area at operation S1920 may include displaying the live view on the sub display area and not providing information on the main display area, and may further include, in response to a user touch on the main display area being detected, photographing and storing an image.
The control method may further include, in response to a swipe interaction from a certain point in the main display area in a predetermined direction being detected, changing a photographing setting value to correspond to the predetermined direction.
The displaying on the other one of the main display area and the sub display area at operation S1920 may further include: displaying the live view on the main display area; and, in response to the live view being recognized as including a person's face, displaying a face area included in the live view on the sub display area.
The displaying on the sub display area may include cropping a part of the live view to correspond to a size of the sub display area, and displaying the part on the sub display area.
The displaying on the other one of the main display area and the sub display area at operation S1920 may further include: displaying the live view on the main display area; and, in response to the live view being recognized as including a person, displaying an animation on the sub display area.
The displaying on one of the main display area and the sub display area at operation S1910 may include: determining a distance to a subject; in response to the determined distance being shorter than a predetermined distance, displaying the live view on the sub display area; and, in response to the determined distance being longer than the predetermined distance, displaying the live view on the main display area.
The control method may further include detecting at least one of a location and a motion of the user terminal apparatus, and a user's grip, and the displaying on the other one of the main display area and the sub display area at operation S1920 may include determining whether the orientation of the user terminal apparatus is changed based on at least one of the location and the motion of the user terminal apparatus, and the user's grip.
The displaying on the other one of the main display area and the sub display area at operation S1920 may include, in response to a person's face greater than or equal to a predetermined size being recognized in the live view, or in response to a person's face greater than or equal to the predetermined size having been recognized and then not being recognized, determining that the orientation of the user terminal apparatus is changed.
According to the various embodiments described above, the user terminal apparatus may control the displays disposed on its front surface and rear surface based on various photographing conditions, so that the user may photograph easily.
In the above descriptions, the control method when the photographing function is executed has been mainly explained. However, this should not be considered as limiting. For example, in response to the orientation of the user terminal apparatus being changed while a lock screen is being displayed on the main display area, the processor may execute a predetermined application. In addition, in response to the orientation of the user terminal apparatus being changed while a UI indicating an incoming call is being displayed on the main display area, the processor may connect the call. In response to the user terminal apparatus moving away from the user's ear while the user is talking on the phone, the processor may display contacts, and the like.
In addition, the processor may display the execution state of an application which is being executed in the background on the sub display area. For example, the processor may display the execution state of a specific game or music application, and the like, on the sub display area, and thus may minimize power consumption. In addition, in response to the orientation of the user terminal apparatus being changed while a specific application is being executed and displayed on the main display area, the processor may display a UI presenting the essential functions of the corresponding application on the sub display area.
In the above-described embodiments, only whether the orientation of the user terminal apparatus is changed has been considered. However, a different operation may be performed according to the direction of the change. For example, a different function may be executed in response to rotation of the user terminal apparatus in the upward, downward, rightward, or leftward direction. In addition, even in response to the user terminal apparatus being rotated in the upward direction, a different function may be executed according to whether the rotation is made in the clockwise or counterclockwise direction.
A control method of a user terminal apparatus according to the above-described embodiments, the apparatus including a display having a main display area disposed on a front surface of the user terminal apparatus and a sub display area which extends from one side of the main display area and is disposed on at least one area of a rear surface of the user terminal apparatus, and a camera configured to photograph an image, may be implemented as computer executable program code, stored in various non-transitory computer readable media, and provided to servers or devices to be executed by processors.
For example, a non-transitory computer readable medium which stores a program for performing, in sequence, the steps of: displaying a live view acquired through the camera on one of the main display area and the sub display area; and, in response to an orientation of the user terminal apparatus being changed, displaying the live view on the other one of the main display area and the sub display area, may be provided.
The non-transitory computer readable medium refers to a medium that stores data semi-permanently, rather than storing data for a very short time as a register, a cache, a memory, and the like do, and that is readable by an apparatus. Specifically, the above-described various applications or programs may be stored in and provided on a non-transitory computer readable medium such as a compact disc (CD), a digital versatile disc (DVD), a hard disk, a Blu-ray disc, a USB memory, a memory card, a ROM, and the like.
While the present disclosure has been shown and described with reference to various embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present disclosure as defined by the appended claims and their equivalents.
Claims (20)
1. An electronic device comprising:
a first touch display disposed on a first side of the electronic device;
a second touch display disposed on a second side of the electronic device, the first side and the second side being opposite sides of the electronic device;
a camera configured to obtain one or more images, the camera disposed on the first side of the electronic device; and
at least one processor configured to at least:
provide, to the first touch display, a first screen corresponding to a live view function based on a first image obtained through the camera,
detect whether the electronic device is flipped,
based on detecting that the electronic device is flipped, provide, to the second touch display, a second screen corresponding to the live view function based on a second image obtained through the camera, and
capture a third image obtained through the camera in response to a user command received through one of the first touch display or the second touch display,
wherein the first screen includes a first user interface for a first photography mode, and
wherein the second screen includes a second user interface for a second photography mode, the second user interface is different than the first user interface, and the second photography mode is different than the first photography mode.
2. The electronic device of claim 1, wherein the at least one processor is further configured to:
based on detecting that the electronic device has flipped again while the second screen corresponding to the live view function based on the second image obtained through the camera is provided, provide, to the first touch display, the first screen corresponding to the live view function based on a fourth image obtained through the camera.
3. The electronic device of claim 1, wherein the at least one processor is further configured to receive a touch input on a photography mode button displayed on one of the first touch display or the second touch display.
4. The electronic device of claim 1, wherein the first image is only provided on the first screen and the second image is only provided on the second screen.
5. The electronic device of claim 1, further comprising:
another camera disposed on the second side of the electronic device with the second touch display.
6. The electronic device of claim 1, wherein the user command received through the one of the first touch display or the second touch display is received as a touch input on an image capture button provided on the one of the first touch display or the second touch display.
7. The electronic device of claim 1, wherein the first photography mode is a selfie photography mode.
8. The electronic device of claim 1, wherein the camera is oriented in a same direction as a view in the first screen when operating in the first photography mode, and the camera is oriented in an opposite direction of a view in the second screen when operating in the second photography mode.
9. The electronic device of claim 1, wherein, while the first screen corresponding to the live view function based on the first image obtained through the camera is provided, the at least one processor is further configured to correct a hue of a subject's skin in the first image obtained through the camera.
10. The electronic device of claim 1, wherein, while the second screen corresponding to the live view function based on the second image obtained through the camera is provided, the at least one processor is further configured to adjust a contrast and white balance of the second image obtained through the camera.
11. A method of an electronic device including a first touch display disposed on a first side of the electronic device, a second touch display disposed on a second side of the electronic device, the first side and the second side being opposite sides of the electronic device, and a camera configured to obtain one or more images, the camera disposed on the first side of the electronic device, the method comprising:
providing, to the first touch display, a first screen corresponding to a live view function based on a first image obtained through the camera;
detecting whether the electronic device is flipped;
based on detecting that the electronic device is flipped, providing, to the second touch display, a second screen corresponding to the live view function based on a second image obtained through the camera; and
capturing a third image obtained through the camera in response to a user command received through one of the first touch display or the second touch display,
wherein the first screen includes a first user interface for a first photography mode, and
wherein the second screen includes a second user interface for a second photography mode, the second user interface is different than the first user interface, and the second photography mode is different than the first photography mode.
12. The method of claim 11, further comprising:
based on detecting that the electronic device has flipped again while providing the second screen corresponding to the live view function based on the second image obtained through the camera, providing, to the first touch display, the first screen corresponding to the live view function based on a fourth image obtained through the camera.
13. The method of claim 11, further comprising:
receiving a touch input on a photography mode button displayed on one of the first touch display or the second touch display.
14. The method of claim 11, wherein the first image is only provided on the first screen and the second image is only provided on the second screen.
15. The method of claim 11, wherein another camera is disposed on the second side of the electronic device with the second touch display.
16. The method of claim 11, wherein the user command received through the one of the first touch display or the second touch display is received as a touch input on an image capture button provided on the one of the first touch display or the second touch display.
17. The method of claim 11, wherein the first photography mode is a selfie photography mode.
18. The method of claim 11, wherein the camera is oriented in a same direction as a view in the first screen when operating in the first photography mode, and the camera is oriented in an opposite direction of a view in the second screen when operating in the second photography mode.
19. The method of claim 11, wherein the providing of the first screen corresponding to the live view function based on the first image obtained through the camera comprises correcting a hue of a subject's skin in the first image obtained through the camera.
20. The method of claim 11, wherein the providing of the second screen corresponding to the live view function based on the second image obtained through the camera comprises adjusting a contrast and white balance of the second image obtained through the camera.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/834,848 US10701273B1 (en) | 2015-07-29 | 2020-03-30 | User terminal apparatus and control method thereof |
US16/890,480 US11284003B2 (en) | 2015-07-29 | 2020-06-02 | User terminal apparatus and control method thereof |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562198360P | 2015-07-29 | 2015-07-29 | |
KR10-2016-0001684 | 2016-01-06 | ||
KR1020160001684A KR102174740B1 (en) | 2015-07-29 | 2016-01-06 | User terminal apparatus and control method thereof |
US15/176,630 US9936138B2 (en) | 2015-07-29 | 2016-06-08 | User terminal apparatus and control method thereof |
US15/915,696 US10645292B2 (en) | 2015-07-29 | 2018-03-08 | User terminal apparatus and control method thereof |
US16/834,848 US10701273B1 (en) | 2015-07-29 | 2020-03-30 | User terminal apparatus and control method thereof |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/915,696 Continuation US10645292B2 (en) | 2015-07-29 | 2018-03-08 | User terminal apparatus and control method thereof |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/890,480 Continuation US11284003B2 (en) | 2015-07-29 | 2020-06-02 | User terminal apparatus and control method thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
US10701273B1 US10701273B1 (en) | 2020-06-30 |
US20200228717A1 true US20200228717A1 (en) | 2020-07-16 |
Family
ID=57883736
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/176,630 Active US9936138B2 (en) | 2015-07-29 | 2016-06-08 | User terminal apparatus and control method thereof |
US15/915,696 Active US10645292B2 (en) | 2015-07-29 | 2018-03-08 | User terminal apparatus and control method thereof |
US16/834,848 Active US10701273B1 (en) | 2015-07-29 | 2020-03-30 | User terminal apparatus and control method thereof |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/176,630 Active US9936138B2 (en) | 2015-07-29 | 2016-06-08 | User terminal apparatus and control method thereof |
US15/915,696 Active US10645292B2 (en) | 2015-07-29 | 2018-03-08 | User terminal apparatus and control method thereof |
Country Status (1)
Country | Link |
---|---|
US (3) | US9936138B2 (en) |
Families Citing this family (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6611566B2 (en) * | 2015-11-02 | 2019-11-27 | キヤノン株式会社 | IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, AND PROGRAM |
US10078347B2 (en) * | 2015-11-30 | 2018-09-18 | Dell Products L.P. | Information handling system folded display assembly |
KR102421518B1 (en) * | 2017-06-27 | 2022-07-15 | 엘지전자 주식회사 | Electronic device and method of controlling the same |
JP7031228B2 (en) * | 2017-10-26 | 2022-03-08 | 株式会社リコー | Program, image display method, image display system, information processing device |
WO2019227281A1 (en) | 2018-05-28 | 2019-12-05 | 华为技术有限公司 | Capture method and electronic device |
JP2020527245A (en) * | 2018-06-08 | 2020-09-03 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd | Screen control method and equipment |
CN109040440A (en) * | 2018-07-25 | 2018-12-18 | 努比亚技术有限公司 | A kind of image pickup method, mobile terminal and computer readable storage medium |
CN109274791B (en) * | 2018-09-30 | 2021-06-15 | 联想(北京)有限公司 | Processing method and device and electronic equipment |
KR102542398B1 (en) | 2018-11-29 | 2023-06-13 | 삼성전자 주식회사 | Foldable electronic device and method for displaying information in the foldable electronic device |
CN109862154B (en) * | 2019-01-25 | 2020-10-16 | 武汉华星光电半导体显示技术有限公司 | Electronic device |
CN113412470B (en) * | 2019-04-23 | 2023-09-08 | 华为技术有限公司 | Method and device for processing image layer |
CN110312073B (en) * | 2019-06-25 | 2021-03-16 | 维沃移动通信有限公司 | Shooting parameter adjusting method and mobile terminal |
CN115052094B (en) * | 2019-08-20 | 2024-09-17 | 深圳市大疆创新科技有限公司 | Motion camera, self-timer control method and device, movable platform and storage medium |
CN112486346B (en) * | 2019-09-12 | 2023-05-30 | 北京小米移动软件有限公司 | Key mode setting method, device and storage medium |
CN110519433B (en) * | 2019-09-29 | 2021-09-10 | 上海闻泰信息技术有限公司 | Camera application control method, device, equipment and storage medium |
CN110798568B (en) * | 2019-09-30 | 2022-01-14 | 华为技术有限公司 | Display control method of electronic equipment with folding screen and electronic equipment |
KR20220017284A (en) * | 2020-08-04 | 2022-02-11 | 삼성전자주식회사 | Electronic device and method for controlling screen thereof |
EP4239440A4 (en) * | 2020-12-03 | 2024-05-01 | Samsung Electronics Co., Ltd. | Electronic device comprising flexible display, and method for capturing image in electronic device |
KR20220087659A (en) * | 2020-12-17 | 2022-06-27 | 삼성디스플레이 주식회사 | Electronic device and driving methode of the same |
US11720312B2 (en) * | 2021-03-22 | 2023-08-08 | Motorola Mobility Llc | Manage quickview content for a multi-display device |
US11693558B2 (en) * | 2021-06-08 | 2023-07-04 | Samsung Electronics Co., Ltd. | Method and apparatus for displaying content on display |
US11966659B2 (en) | 2022-03-08 | 2024-04-23 | Motorola Mobility Llc | Context-based display of content on a multi-display system |
Family Cites Families (88)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7173665B2 (en) * | 2001-03-30 | 2007-02-06 | Sanyo Electric Co., Ltd. | Folding mobile communication terminal |
JP2003198676A (en) | 2001-12-28 | 2003-07-11 | Kenwood Corp | Portable terminal apparatus |
JP3910112B2 (en) * | 2002-06-21 | 2007-04-25 | シャープ株式会社 | Camera phone |
KR20050057475A (en) * | 2002-09-20 | 2005-06-16 | 마츠시타 덴끼 산교 가부시키가이샤 | Liquid crystal display device, and portable telephone device using liquid crystal display device |
CN1627765B (en) * | 2003-12-10 | 2010-09-01 | 松下电器产业株式会社 | Portable information terminal device |
JP2006166133A (en) * | 2004-12-08 | 2006-06-22 | Samsung Techwin Co Ltd | Camera and image display method |
JP5151976B2 (en) * | 2006-03-10 | 2013-02-27 | 日本電気株式会社 | Mobile phone |
KR100753397B1 (en) * | 2006-04-04 | 2007-08-30 | 삼성전자주식회사 | Apparatus and method for controlling auto display in a mobile station |
EP2073092B1 (en) * | 2007-11-30 | 2010-05-26 | Telefonaktiebolaget L M Ericsson (publ) | Portable electronic apparatus having more than one display area, and method of controlling a user interface thereof |
KR101497656B1 (en) | 2008-03-25 | 2015-02-27 | 삼성디스플레이 주식회사 | Dual displaying method, dual display apparatus for performing the dual displaying method and dual display handphone having the dual display apparatus |
JP5219929B2 (en) * | 2008-07-31 | 2013-06-26 | ソニー株式会社 | Information processing apparatus and method, and program |
KR101512768B1 (en) | 2008-08-22 | 2015-04-16 | 엘지전자 주식회사 | Mobile terminal and control method thereof |
US8860765B2 (en) * | 2008-09-08 | 2014-10-14 | Qualcomm Incorporated | Mobile device with an inclinometer |
JP4697289B2 (en) | 2008-11-05 | 2011-06-08 | ソニー株式会社 | Imaging apparatus and display control method for imaging apparatus |
KR101521219B1 (en) * | 2008-11-10 | 2015-05-18 | 엘지전자 주식회사 | Mobile terminal using flexible display and operation method thereof |
US8223241B2 (en) * | 2008-12-15 | 2012-07-17 | Robbyn Gayer | Self shot camera |
KR101591524B1 (en) * | 2009-08-25 | 2016-02-03 | 엘지전자 주식회사 | Mobile terminal and method for displaying menu in mobile terminal |
KR101617289B1 (en) * | 2009-09-30 | 2016-05-02 | 엘지전자 주식회사 | Mobile terminal and operation control method thereof |
KR101634247B1 (en) | 2009-12-04 | 2016-07-08 | 삼성전자주식회사 | Digital photographing apparatus, mdthod for controlling the same |
KR101674011B1 (en) * | 2010-06-07 | 2016-11-08 | 삼성전자주식회사 | Method and apparatus for operating camera function in portable terminal |
JP2012114871A (en) | 2010-11-29 | 2012-06-14 | Canon Inc | Imaging apparatus |
US20140310643A1 (en) | 2010-12-10 | 2014-10-16 | Yota Devices Ipr Ltd. | Mobile device with user interface |
KR101660505B1 (en) * | 2011-03-08 | 2016-10-10 | 엘지전자 주식회사 | Mobile terminal and control method therof |
US9927839B2 (en) * | 2011-05-03 | 2018-03-27 | DISH Technologies L.L.C. | Communications device with extendable screen |
US8842057B2 (en) * | 2011-09-27 | 2014-09-23 | Z124 | Detail on triggers: transitional states |
KR20110120858A (en) | 2011-10-19 | 2011-11-04 | 석상호 | Smart motion u.i with the double-faced screen |
US9686088B2 (en) * | 2011-10-19 | 2017-06-20 | Facebook, Inc. | Notification profile configuration based on device orientation |
KR101817656B1 (en) * | 2011-12-13 | 2018-01-12 | 삼성전자주식회사 | Camera with multi-function display |
US9837050B2 (en) * | 2012-03-12 | 2017-12-05 | Lenovo (Beijing) Co., Ltd. | Information processing method, method for driving image collection unit and electrical device |
JP5970937B2 (en) * | 2012-04-25 | 2016-08-17 | ソニー株式会社 | Display control apparatus and display control method |
KR101858604B1 (en) * | 2012-04-30 | 2018-05-17 | 엘지전자 주식회사 | Mobile terminal and control method thereof |
KR101425443B1 (en) * | 2012-05-09 | 2014-08-04 | 엘지전자 주식회사 | Pouch and mobile device having the same |
KR102004409B1 (en) * | 2012-08-23 | 2019-07-29 | 삼성전자주식회사 | Flexible display apparatus and contorlling method thereof |
CN104272234B (en) * | 2012-08-24 | 2018-06-05 | 株式会社Ntt都科摩 | For controlling the apparatus and method in the direction of displayed image |
JP6039328B2 (en) | 2012-09-14 | 2016-12-07 | キヤノン株式会社 | Imaging control apparatus and imaging apparatus control method |
US9148651B2 (en) * | 2012-10-05 | 2015-09-29 | Blackberry Limited | Methods and devices for generating a stereoscopic image |
KR101984683B1 (en) * | 2012-10-10 | 2019-05-31 | 삼성전자주식회사 | Multi display device and method for controlling thereof |
KR102083918B1 (en) * | 2012-10-10 | 2020-03-04 | 삼성전자주식회사 | Multi display apparatus and method for contorlling thereof |
US8896533B2 (en) * | 2012-10-29 | 2014-11-25 | Lenova (Singapore) Pte. Ltd. | Display directional sensing |
JP6278593B2 (en) * | 2012-11-14 | 2018-02-14 | 京セラ株式会社 | Portable terminal device, program, and display control method |
KR20140085048A (en) * | 2012-12-27 | 2014-07-07 | 삼성전자주식회사 | Multi display device and method for controlling thereof |
KR102060155B1 (en) | 2013-01-11 | 2019-12-27 | 삼성전자주식회사 | Method and apparatus for controlling multi-tasking in electronic device using double-sided display |
US9891815B2 (en) * | 2013-02-21 | 2018-02-13 | Kyocera Corporation | Device having touch screen and three display areas |
KR102032347B1 (en) * | 2013-02-26 | 2019-10-15 | 삼성전자 주식회사 | Image display positioning using image sensor location |
KR102070776B1 (en) * | 2013-03-21 | 2020-01-29 | 엘지전자 주식회사 | Display device and method for controlling the same |
JP6229283B2 (en) * | 2013-03-26 | 2017-11-15 | 株式会社リコー | Image processing apparatus, display terminal, image display system, image processing method, display terminal control method, image display system control method, and program for those methods |
KR102127980B1 (en) | 2013-05-10 | 2020-06-29 | 삼성전자주식회사 | Portable electrinic device having dual display |
EP2814234A1 (en) * | 2013-06-11 | 2014-12-17 | Nokia Corporation | Apparatus for controlling camera modes and associated methods |
KR102031142B1 (en) * | 2013-07-12 | 2019-10-11 | 삼성전자주식회사 | Electronic device and method for controlling image display |
KR102063104B1 (en) * | 2013-08-29 | 2020-02-11 | 엘지전자 주식회사 | Mobile terminal |
CN110082940B (en) * | 2013-09-25 | 2022-10-21 | 索尼公司 | Display device and electronic apparatus |
CN105637853A (en) * | 2013-10-07 | 2016-06-01 | 索尼公司 | Information processing device, imaging device, imaging system, method for controlling information processing device, method for controlling imaging device, and program |
US9083860B2 (en) | 2013-10-09 | 2015-07-14 | Motorola Solutions, Inc. | Method of and apparatus for automatically controlling operation of a user-mounted recording device based on user motion and event context |
KR102073379B1 (en) * | 2013-11-26 | 2020-02-05 | 삼성디스플레이 주식회사 | Electronic device having a foldable display |
KR101511117B1 (en) | 2014-01-07 | 2015-04-10 | 주식회사 토비스 | A Dual-sided Display |
KR102166832B1 (en) * | 2014-01-28 | 2020-10-16 | 엘지전자 주식회사 | Apparatus and Method for controlling portable device with flip cover |
KR102278816B1 (en) * | 2014-02-04 | 2021-07-20 | 삼성디스플레이 주식회사 | Display apparatus and method for dring the same |
US9804635B2 (en) * | 2014-02-06 | 2017-10-31 | Samsung Electronics Co., Ltd. | Electronic device and method for controlling displays |
US9712749B2 (en) * | 2014-02-27 | 2017-07-18 | Google Technology Holdings LLC | Electronic device having multiple sides |
KR101642808B1 (en) * | 2014-03-03 | 2016-07-26 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
KR102083597B1 (en) * | 2014-03-10 | 2020-03-02 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
JP5830564B2 (en) | 2014-04-09 | 2015-12-09 | オリンパス株式会社 | Imaging apparatus and mode switching method in imaging apparatus |
JP2015204568A (en) * | 2014-04-15 | 2015-11-16 | キヤノン株式会社 | Imaging apparatus, control method of the same, and program |
EP3872599A1 (en) * | 2014-05-23 | 2021-09-01 | Samsung Electronics Co., Ltd. | Foldable device and method of controlling the same |
US9817549B2 (en) * | 2014-06-25 | 2017-11-14 | Verizon Patent And Licensing Inc. | Method and system for auto switching applications based on device orientation |
KR20160001602A (en) * | 2014-06-26 | 2016-01-06 | 삼성전자주식회사 | Foldable electronic apparatus and method for performing interfacing thereof |
KR102264220B1 (en) * | 2014-09-02 | 2021-06-14 | 삼성전자주식회사 | Electronic apparatus and display method thereof |
KR102317525B1 (en) * | 2014-09-05 | 2021-10-26 | 엘지전자 주식회사 | Protable electronic device and control method thereof |
KR20160033507A (en) * | 2014-09-18 | 2016-03-28 | 엘지전자 주식회사 | Mobile terminal and control method thereof |
KR102287099B1 (en) * | 2014-09-22 | 2021-08-06 | 엘지전자 주식회사 | Foldable display device displaying stored image by folding action or unfolding action and control method thereof |
KR20160041693A (en) * | 2014-10-08 | 2016-04-18 | 엘지전자 주식회사 | Mobile terminal |
KR102348947B1 (en) * | 2014-10-30 | 2022-01-11 | 삼성전자 주식회사 | Method and apparatus for controlling display on electronic devices |
KR102342555B1 (en) * | 2014-11-10 | 2021-12-23 | 엘지전자 주식회사 | Mobile terminal and control method thereof |
US20170357399A1 (en) * | 2014-12-29 | 2017-12-14 | Lg Electronics Inc. | Bended display device of controlling scroll speed of event information displayed on sub-region according to posture thereof, and control method therefor |
US10067666B2 (en) * | 2015-01-08 | 2018-09-04 | Samsung Electronics Co., Ltd. | User terminal device and method for controlling the same |
EP3265885A4 (en) | 2015-03-03 | 2018-08-29 | Prenav Inc. | Scanning environments and tracking unmanned aerial vehicles |
US10263982B2 (en) * | 2015-03-31 | 2019-04-16 | Samsung Electronics Co., Ltd. | Foldable device and method of controlling the same |
TWI708169B (en) * | 2015-06-02 | 2020-10-21 | 南韓商三星電子股份有限公司 | User terminal apparatus and controlling method thereof |
US9961239B2 (en) * | 2015-06-07 | 2018-05-01 | Apple Inc. | Touch accommodation options |
CN106325728B (en) * | 2015-06-30 | 2024-05-28 | 联想(北京)有限公司 | Electronic apparatus and control method thereof |
US10725725B2 (en) * | 2015-06-30 | 2020-07-28 | Lenovo (Beijing) Co., Ltd. | Electronic device and mode switching method |
US10409439B2 (en) * | 2015-08-05 | 2019-09-10 | Samsung Electronics Co., Ltd. | User terminal apparatus and controlling method thereof |
JP2017069776A (en) * | 2015-09-30 | 2017-04-06 | カシオ計算機株式会社 | Imaging apparatus, determination method and program |
US9942367B2 (en) * | 2015-10-13 | 2018-04-10 | Samsung Electronics Co., Ltd. | Electronic device and method for controlling the electronic device thereof |
KR102434865B1 (en) * | 2015-11-27 | 2022-08-22 | 엘지전자 주식회사 | Rollable mobile terminal and control method thereof |
US10015400B2 (en) * | 2015-12-17 | 2018-07-03 | Lg Electronics Inc. | Mobile terminal for capturing an image and associated image capturing method |
KR102168648B1 (en) * | 2016-01-07 | 2020-10-21 | 삼성전자주식회사 | User terminal apparatus and control method thereof |
KR20180023310A (en) * | 2016-08-25 | 2018-03-07 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
Also Published As
Publication number | Publication date |
---|---|
US20180198987A1 (en) | 2018-07-12 |
US10701273B1 (en) | 2020-06-30 |
US10645292B2 (en) | 2020-05-05 |
US9936138B2 (en) | 2018-04-03 |
US20170034446A1 (en) | 2017-02-02 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
| STCF | Information on status: patent grant | Free format text: PATENTED CASE
| MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 4