KR20170006014A - Mobile terminal and method for controlling the same - Google Patents

Mobile terminal and method for controlling the same Download PDF

Info

Publication number
KR20170006014A
Authority
KR
South Korea
Prior art keywords
image
user
mobile terminal
camera
control unit
Prior art date
Application number
KR1020150096319A
Other languages
Korean (ko)
Inventor
권윤미
이기선
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc. (엘지전자 주식회사)
Priority to KR1020150096319A
Publication of KR20170006014A

Links

Images

Classifications

    • H04M1/72522
    • G PHYSICS
    • G04 HOROLOGY
    • G04G ELECTRONIC TIME-PIECES
    • G04G21/00 Input or output devices integrated in time-pieces
    • G06K9/3283
    • G06K9/348
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M2250/00 Details of telephonic subscriber devices
    • H04M2250/22 Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M2250/00 Details of telephonic subscriber devices
    • H04M2250/52 Details of telephonic subscriber devices including functional features of a camera

Abstract

A mobile terminal and a control method of the mobile terminal are disclosed. The mobile terminal of the present invention comprises: a camera; a touch screen; a sensing unit for sensing rotation of the mobile terminal; and a control unit for confirming an imaging direction of the camera changed according to the sensed rotation, obtaining a preview image in the confirmed imaging direction through the camera, displaying the obtained preview image on the touch screen, and correcting the preview image according to a standard set to correspond to the confirmed imaging direction. According to the present invention, a corrected preview image can be provided on the basis of the imaging direction of the camera changed by rotation of a watch-type mobile terminal.

Description

MOBILE TERMINAL AND METHOD FOR CONTROLLING THE SAME

The present invention relates to a watch-type mobile terminal and a control method thereof, which can take an image through a camera in consideration of user's convenience.

A terminal can be divided into a mobile terminal (mobile / portable terminal) and a stationary terminal according to whether the terminal can be moved. The mobile terminal can be divided into a handheld terminal and a vehicle mounted terminal according to whether the user can directly carry the mobile terminal.

The functions of mobile terminals are diversified. For example, there are data and voice communication, photographing and video shooting through a camera, voice recording, music file playback through a speaker system, and outputting an image or video on a display unit. Some terminals are equipped with an electronic game play function or a multimedia player function. In particular, modern mobile terminals can receive multicast signals that provide visual content such as broadcast and video or television programs.

Such terminals, as their functions have diversified, have taken the form of multimedia devices with complex functions, such as shooting photographs and videos, playing music or video files, gaming, and receiving broadcasts.

In order to support and enhance the functionality of such terminals, it may be considered to improve the structural and / or software parts of the terminal.

Recently, as wearable terminals have become widespread, images are increasingly taken with watch-type mobile terminals. Because a watch-type mobile terminal is worn on the user's body, it can be difficult to check the touch screen while taking an image. A user interface is therefore needed that removes the inconvenience of shooting without being able to see the preview image.

The present invention is directed to solving the above-mentioned problems and other problems. Another object of the present invention is to provide a mobile terminal and a control method thereof that can correct a preview image based on a photographing direction of a camera changed in accordance with rotation of a watch-type mobile terminal.

According to an aspect of the present invention, there is provided a watch-type mobile terminal including a camera, a touch screen, a sensing unit for sensing rotation of the mobile terminal, and a control unit for confirming a photographing direction of the camera changed according to the sensed rotation, obtaining a preview image in the confirmed photographing direction through the camera, displaying the obtained preview image on the touch screen, and correcting the preview image according to a reference set to correspond to the confirmed photographing direction.
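The claimed control flow (sense rotation, confirm the changed photographing direction, then correct the preview against a per-direction reference) could be sketched roughly as follows. This is only an illustrative sketch; the direction thresholds, property names, and reference values are hypothetical and not taken from the patent.

```python
# Illustrative sketch of the claimed control flow: classify the sensed
# rotation into a photographing direction, then correct the preview image
# using the reference stored for that direction. All names are hypothetical.

def confirm_direction(roll_deg):
    """Classify the camera's photographing direction from a wrist roll angle."""
    if roll_deg < 30:
        return "forward"    # camera pointing ahead of the user
    elif roll_deg < 120:
        return "downward"   # camera facing the ground or a desk
    else:
        return "user"       # camera turned toward the user's face

def correct_preview(preview, reference):
    """Replace each preview property with the reference value, if one is set."""
    return {prop: reference.get(prop, value) for prop, value in preview.items()}

# Per-direction references, as in the claim ("a reference set to correspond
# to the confirmed photographing direction").
REFERENCES = {
    "forward":  {"brightness": 0.5, "contrast": 0.6},
    "downward": {"brightness": 0.7, "sharpness": 0.9},
    "user":     {"brightness": 0.6, "saturation": 0.5},
}

def on_rotation(roll_deg, raw_preview):
    direction = confirm_direction(roll_deg)
    return direction, correct_preview(raw_preview, REFERENCES[direction])
```

For example, a roll of 150 degrees would be classified as the user direction, and a dim preview would be corrected to that direction's stored brightness and saturation.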

The set reference may include a reference value for the angle of view, brightness, saturation, contrast, or sharpness of the preview image.

The mobile terminal may further include a memory for storing a plurality of images photographed through the camera, and the control unit may extract a common feature from images photographed in the same photographing direction among the plurality of images; the set reference may include the common feature.

When the preview image is photographed after being corrected differently from the set reference, the control unit may update the set reference based on the photographed image.
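The update step above could be sketched as an exponential moving average: when the user shoots with settings that differ from the stored reference, the reference is nudged toward the values actually used. The blending factor and property names are assumptions for illustration, not part of the patent.

```python
# Hypothetical sketch of the reference-update step: nudge each stored
# reference property toward the setting the user actually shot with.

def update_reference(reference, used_settings, alpha=0.3):
    """Return a new reference blended toward used_settings by factor alpha."""
    updated = dict(reference)          # leave the stored reference untouched
    for prop, used in used_settings.items():
        old = updated.get(prop, used)
        updated[prop] = round(old + alpha * (used - old), 3)
    return updated
```

This realizes the stated advantage that later shots follow "the criteria most recently applied", since recent corrections carry the most weight.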

The control unit may output an alarm to guide the preview image to a preset photographing angle when the mobile terminal moves.

The mobile terminal may further include a wireless communication unit for receiving location information of the mobile terminal. When text is recognized in an image photographed through the camera while the mobile terminal is at a preset location, the control unit stores the text, and removes the photographed image when the mobile terminal moves out of the preset location.
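The location-bound retention rule just described could be sketched as follows. Geometry is simplified to a radius check around the preset location, and all class and method names are illustrative.

```python
# Illustrative sketch of the location-bound retention rule: while the
# terminal is inside the preset location, recognized text is stored and
# survives; photographed images are purged once the terminal leaves.
import math

class LocationBoundStore:
    def __init__(self, center, radius_m):
        self.center = center          # (x, y) of the preset location
        self.radius_m = radius_m
        self.texts = []               # recognized text is kept after departure
        self.images = []              # images are removed on departure

    def inside(self, pos):
        return math.dist(self.center, pos) <= self.radius_m

    def capture(self, pos, image, recognized_text=None):
        if self.inside(pos) and recognized_text:
            self.texts.append(recognized_text)
        self.images.append(image)

    def on_move(self, pos):
        if not self.inside(pos):
            self.images.clear()       # free memory once the place is left
```

This matches the stated effect of managing memory capacity: the useful text outlives the bulky image.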

When text inclined by a predetermined angle or more is recognized in an image photographed through the camera while the confirmed photographing direction is ahead of the user, the control unit stores the text, and may delete the photographed image after a predetermined time determined based on the recognized text has elapsed.

While the confirmed photographing direction is downward, the control unit recognizes, in an image photographed through the camera, an object including information for executing an item, and when the mobile terminal is rotated to a predetermined position, an execution screen of the item can be displayed on the touch screen.

The sensing unit is capable of sensing a gesture of the user, and the control unit recognizes translatable text in an image photographed by the camera while the confirmed photographing direction is downward; when a preset gesture of the user is sensed, the recognized text may be translated and displayed on the touch screen.

When the user is recognized in the preview image acquired through the camera while the confirmed photographing direction is toward the user, the control unit can output an alarm to guide the user's face to occupy a predetermined ratio or more of the preview image.

The mobile terminal may further include a wireless communication unit for connecting a video call with another party. The control unit may photograph an image of the user when the photographing direction of the camera changes toward the user according to the rotation of the mobile terminal; when the mobile terminal is rotated to a predetermined position, the photographed image of the user may be transmitted to the other party, and the photographed image of the user and the image of the other party may be displayed on the touch screen.

The sensing unit may sense a gesture of the user, and when a predetermined gesture of the user is sensed, the control unit may replace the displayed image of the user with a preview image acquired through the camera or an image photographed through the camera.

When a first touch input is received while the power of the touch screen is turned off, the control unit displays the preview image acquired through the camera on the touch screen, and may also display an image capture indicator or a moving image capture indicator on the touch screen.

When the control unit receives a second touch input applied to the touch screen, it may change whichever of the image capture indicator and the moving image capture indicator is displayed on the touch screen to the indicator that is not displayed.

The controller may display an image photographed through the camera on the touch screen when the preview image is displayed and a drag input in a predetermined direction is received.
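The touch handling described in the preceding three paragraphs amounts to a small state machine: a first touch wakes the preview with one capture indicator, a second touch toggles between the photo and video indicators, and a drag in a preset direction shows the last captured image. The following sketch is illustrative only; all names and the choice of "left" as the preset direction are assumptions.

```python
# Rough sketch of the described touch handling while the screen is off.

class CaptureUI:
    def __init__(self):
        self.screen_on = False
        self.indicator = None         # "photo" or "video"
        self.showing = None           # "preview" or "last_image"

    def first_touch(self):
        """First touch with the screen off: show preview and one indicator."""
        self.screen_on = True
        self.showing = "preview"
        self.indicator = "photo"      # assume the still-image indicator first

    def second_touch(self):
        """Swap to whichever indicator is not currently displayed."""
        self.indicator = "video" if self.indicator == "photo" else "photo"

    def drag(self, direction, preset="left"):
        """Drag in the preset direction shows the last photographed image."""
        if self.showing == "preview" and direction == preset:
            self.showing = "last_image"
```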

The control unit maintains the displayed preview image when a preset touch input is received while the preview image acquired by the camera is displayed; when the area of the acquired preview image displayed on the touch screen is changed according to the user's input, it may output an alarm to guide the camera to a position corresponding to the changed preview image.

The mobile terminal may further include a wireless communication unit for communicating with an external device. When the camera is driven, the control unit may transmit to the external device a control signal for driving the camera of the external device if the external device is within a predetermined distance, and may display the position of the external device on the touch screen if the external device is farther than the predetermined distance.
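The distance-based handoff above can be reduced to one decision. A minimal sketch, assuming a hypothetical 2-meter threshold and illustrative action names:

```python
# Hypothetical sketch of the external-device handoff: a nearby paired
# device is told to drive its camera; a distant one is only pointed to
# on the touch screen. Threshold and action names are assumptions.

def on_camera_driven(distance_m, threshold_m=2.0):
    if distance_m <= threshold_m:
        return ("send_control_signal", "drive_external_camera")
    return ("display_on_touch_screen", "external_device_position")
```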

The sensing unit may sense a gesture of the user, and the control unit may output an alarm when a preset photographing condition is satisfied, and may photograph an image through the camera according to a sensed gesture of the user.

The mobile terminal may further include a memory for storing a plurality of images photographed through the camera, and the control unit may set the photographing condition based on the time, location, weather, or illumination of the stored images, or on objects included in them.

According to another aspect of the present invention, there is provided a method of controlling a mobile terminal, including: sensing rotation of the mobile terminal; confirming a photographing direction of the camera changed according to the sensed rotation; acquiring a preview image in the confirmed photographing direction through the camera; displaying the obtained preview image on a touch screen; and correcting the preview image according to a reference set to correspond to the confirmed photographing direction.

Effects of the mobile terminal and the control method according to the present invention will be described as follows.

According to at least one of the embodiments of the present invention, there is an advantage that a corrected preview image can be provided based on the photographing direction of the camera changed according to the rotation of the watch-type mobile terminal.

In addition, according to at least one of the embodiments of the present invention, the preview image can be corrected based on a plurality of images photographed in the same photographing direction, so that the user can photograph an image with the characteristics he or she usually applies without checking the preview image.

In addition, according to at least one of the embodiments of the present invention, by updating the reference for correcting the preview image at the time of shooting, an image can be taken according to the criteria most recently applied.

According to at least one of the embodiments of the present invention, an alarm for guiding the preview image to a preset photographing angle is output, so that the photographing can be performed at an appropriate photographing angle without checking the preview image.

In addition, according to at least one embodiment of the present invention, when an image photographed at a predetermined place satisfies a specific condition, the photographed image is deleted once the terminal leaves that place, so that the capacity of the memory can be managed effectively.

According to at least one of the embodiments of the present invention, when text inclined by a predetermined angle or more is recognized in the photographed image, the photographed image is deleted after a predetermined time determined based on the recognized text, so that memory capacity can be managed effectively.

According to at least one of the embodiments of the present invention, when an object including information for executing an item is present in an image photographed with the photographing direction downward, an execution screen of the item is displayed by the operation of rotating the mobile terminal, which is convenient for the user.

According to at least one of the embodiments of the present invention, when translatable text is present in an image photographed with the photographing direction downward, the recognized text is translated according to the user's gesture, which is convenient for the user.

According to at least one of the embodiments of the present invention, by outputting an alarm to guide the user's face to occupy a predetermined ratio or more of the preview image while the photographing direction is toward the user, a self-portrait can be taken without checking the preview image.

In addition, according to at least one embodiment of the present invention, an image of a user is photographed according to the rotation of the mobile terminal and transmitted to the other party of the video call, thereby making it possible to make a video call through the watch-type mobile terminal.

In addition, according to at least one embodiment of the present invention, there is an advantage that an image acquired through a camera can be transmitted to the other party only with a simple gesture during a video call.

In addition, according to at least one embodiment of the present invention, a preview image can be displayed only by simple touch input while the power of the touch screen is turned off, thereby providing convenience for the user.

In addition, according to at least one embodiment of the present invention, it is possible to switch the shooting of an image and the shooting of a moving image with a simple touch input, thereby providing convenience to the user.

In addition, according to at least one embodiment of the present invention, it is possible to identify a previously photographed image by a simple touch input, thereby providing convenience to the user.

In addition, according to at least one embodiment of the present invention, the preview image displayed on the touch screen can be maintained while the user checks the touch screen, solving the inconvenience of not being able to confirm the touch screen during image shooting.

In addition, according to at least one of the embodiments of the present invention, the area of the preview image displayed on the touch screen can be changed, so that an image can be taken at a desired angle.

In addition, according to at least one of the embodiments of the present invention, the camera of a communication-connected external device can be used, so that a terminal for photographing an image can be selected conveniently.

According to at least one of the embodiments of the present invention, there is an advantage that an image can be easily photographed at a specific moment by outputting an alarm when a preset photographing condition is satisfied.

Further scope of applicability of the present invention will become apparent from the following detailed description. However, the detailed description and specific examples, while indicating preferred embodiments of the invention, are given by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art.

1 is a block diagram illustrating a mobile terminal according to the present invention.
2 is a perspective view showing an example of a watch-type mobile terminal 300 according to another embodiment of the present invention.
3 is a flowchart illustrating a method of controlling a mobile terminal according to an exemplary embodiment of the present invention.
4A to 7B are views for explaining a preview image in the photographing direction of the camera changed according to the rotation of the mobile terminal according to the embodiment of the present invention.
FIGS. 8A and 8B are diagrams for explaining outputting an alarm to guide a preview image to a preset photographing angle according to an embodiment of the present invention.
FIGS. 9A and 9B are views for explaining deletion of an image photographed at a preset location according to an embodiment of the present invention when the user leaves the preset location.
FIGS. 10A and 10B are diagrams for explaining deletion of an image photographed at a predetermined angle according to an embodiment of the present invention, based on text included in the image.
11A and 11B are diagrams for explaining correction of text included in a preview image in a state in which the photographing direction of the camera is downward according to an embodiment of the present invention.
12A to 13 illustrate execution of a corresponding item based on an object included in an image photographed in a state in which the photographing direction of the camera is downward according to an embodiment of the present invention.
FIGS. 14A and 14B are diagrams for explaining translation of text included in an image photographed in a state in which the photographing direction is downward according to an embodiment of the present invention.
FIGS. 15 and 16 are diagrams for explaining an alarm for guiding to a preset photographing position in a state in which the photographing direction is toward the user according to an embodiment of the present invention.
FIGS. 17A and 18B are views for explaining display of a user's image when a video call is connected according to an embodiment of the present invention.
FIGS. 19A and 21B are views for explaining image shooting or moving image shooting in a watch-type mobile terminal according to an embodiment of the present invention.
FIGS. 22A through 22C are diagrams for explaining display or sharing of recently photographed images according to an embodiment of the present invention.
FIGS. 23A to 24B are diagrams for explaining maintaining the preview image displayed on the touch screen and changing the area of the preview image displayed on the touch screen according to an embodiment of the present invention.
25A and 25B are diagrams for explaining how to guide an image to be taken on a communication-connected external device according to an embodiment of the present invention.
FIGS. 26A to 27D are diagrams for explaining shooting an image according to preset shooting conditions according to an embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals are used to designate identical or similar elements and redundant description thereof is omitted. The suffixes "module" and "unit" for components used in the following description are given or used interchangeably only for ease of drafting the specification, and do not in themselves have distinct meanings or roles. In describing the embodiments of the present invention, a detailed description of related known art is omitted when it is determined that it would obscure the gist of the embodiments disclosed herein. The accompanying drawings are intended only to facilitate understanding of the embodiments disclosed herein; the technical idea disclosed herein is not limited by them, and should be understood to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present invention.

Terms including ordinals, such as first, second, etc., may be used to describe various elements, but the elements are not limited to these terms. The terms are used only for the purpose of distinguishing one component from another.

When an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements.

The singular expressions include plural expressions unless the context clearly dictates otherwise.

In the present application, terms such as "comprises" or "having" are intended to specify the presence of stated features, numbers, steps, operations, elements, components, or combinations thereof, and do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.

The mobile terminal described in this specification may include a mobile phone, a smartphone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and a wearable device such as a smartwatch, smart glasses, or a head mounted display (HMD).

However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as digital TVs, desktop computers, and digital signage, except where a configuration applies only to mobile terminals.

FIG. 1 is a block diagram illustrating a mobile terminal according to the present invention.

The mobile terminal 100 may include a wireless communication unit 110, an input unit 120, a sensing unit 140, an output unit 150, an interface unit 160, a memory 170, a control unit 180, a power supply unit 190, and the like. The components shown in FIG. 1 are not essential for implementing a mobile terminal, so the mobile terminal described herein may have more or fewer components than those listed above.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and an external server. In addition, the wireless communication unit 110 may include one or more modules for connecting the mobile terminal 100 to one or more networks.

The wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short distance communication module 114, and a location information module 115 .

The input unit 120 may include a camera 121 or an image input unit for inputting a video signal, a microphone 122 or an audio input unit for inputting an audio signal, and a user input unit 123 (e.g., a touch key, a mechanical key, etc.) for receiving information from the user. The voice data or image data collected by the input unit 120 may be analyzed and processed according to a user's control command.

The sensing unit 140 may include at least one sensor for sensing at least one of information in the mobile terminal, information on the environment surrounding the mobile terminal, and user information. For example, the sensing unit 140 may include a proximity sensor 141, an illumination sensor 142, a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor (G-sensor), a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a finger scan sensor, an ultrasonic sensor, an optical sensor (e.g., the camera 121), a microphone (see microphone 122), a battery gauge, an environmental sensor (e.g., a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor, a gas sensor, etc.), and a chemical sensor (e.g., an electronic nose, a healthcare sensor, a biometric sensor, etc.). Meanwhile, the mobile terminal disclosed in the present specification may combine and utilize information sensed by at least two of these sensors.

The output unit 150 is for generating an output related to visual, auditory, or tactile senses, and may include at least one of a display unit 151, an audio output unit 152, a haptic module 153, and a light output unit 154. The display unit 151 may form a mutual layer structure with a touch sensor or may be formed integrally with it, thereby realizing a touch screen. Such a touch screen may function as the user input unit 123 providing an input interface between the mobile terminal 100 and the user, and may also provide an output interface between the mobile terminal 100 and the user.

The interface unit 160 serves as a path to various types of external devices connected to the mobile terminal 100. The interface unit 160 may include at least one of a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio I/O port, a video I/O port, and an earphone port. In the mobile terminal 100, appropriate control related to a connected external device may be performed in response to the external device being connected to the interface unit 160.

In addition, the memory 170 stores data supporting various functions of the mobile terminal 100. The memory 170 may store a plurality of application programs (or applications) run on the mobile terminal 100, and data and commands for the operation of the mobile terminal 100. At least some of these application programs may be downloaded from an external server via wireless communication. At least some of these application programs may also exist on the mobile terminal 100 from the time of shipment for the basic functions of the mobile terminal 100 (e.g., receiving and placing calls, receiving and sending messages). Meanwhile, an application program may be stored in the memory 170, installed on the mobile terminal 100, and driven by the control unit 180 to perform the operation (or function) of the mobile terminal.

In addition to the operations related to the application program, the control unit 180 typically controls the overall operation of the mobile terminal 100. The control unit 180 may process or process signals, data, information, and the like input or output through the above-mentioned components, or may drive an application program stored in the memory 170 to provide or process appropriate information or functions to the user.

In addition, the control unit 180 may control at least some of the components illustrated in FIG. 1 in order to drive an application program stored in the memory 170. Furthermore, the control unit 180 may operate at least two of the components included in the mobile terminal 100 in combination with each other in order to drive the application program.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power to the components included in the mobile terminal 100. The power supply unit 190 includes a battery, which may be an internal battery or a replaceable battery.

At least some of the components may operate in cooperation with one another to implement the operation, control, or control method of a mobile terminal according to the various embodiments described below. The operation, control, or control method of the mobile terminal may be implemented on the mobile terminal by driving at least one application program stored in the memory 170.

Hereinafter, the components listed above will be described in more detail with reference to FIG. 1 before explaining various embodiments implemented through the mobile terminal 100 as described above.

First, referring to the wireless communication unit 110, the broadcast receiving module 111 of the wireless communication unit 110 receives broadcast signals and / or broadcast-related information from an external broadcast management server through a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel. Two or more broadcast receiving modules may be provided to the mobile terminal 100 for simultaneous broadcast reception or broadcast channel switching for at least two broadcast channels.

The mobile communication module 112 transmits and receives wireless signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network established according to technical standards or communication methods for mobile communication (e.g., GSM (Global System for Mobile communication), CDMA (Code Division Multi Access), CDMA2000 (Code Division Multi Access 2000), EV-DO (Enhanced Voice-Data Optimized or Enhanced Voice-Data Only), WCDMA (Wideband CDMA), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), LTE-A (Long Term Evolution-Advanced), etc.).

The wireless signal may include various types of data depending on a voice call signal, a video call signal or a text / multimedia message transmission / reception.

The wireless Internet module 113 is a module for wireless Internet access, and may be built in or externally attached to the mobile terminal 100. The wireless Internet module 113 is configured to transmit and receive a wireless signal in a communication network according to wireless Internet technologies.

Wireless Internet technologies include, for example, WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced), and the wireless Internet module 113 transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above.

From the viewpoint that wireless Internet access by WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, or LTE-A is performed through a mobile communication network, the wireless Internet module 113 that performs wireless Internet access through the mobile communication network may be understood as a kind of mobile communication module 112.

The short-range communication module 114 is for short-range communication, and may support short-range communication using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, and Wireless USB (Wireless Universal Serial Bus) technologies. The short-range communication module 114 may support, through wireless area networks, wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and a network on which another mobile terminal 100 (or an external server) is located. The wireless area networks may be short-range wireless personal area networks.

Here, the other mobile terminal 100 may be a wearable device (e.g., a smartwatch, smart glasses, or a head mounted display (HMD)) capable of exchanging data with (or interworking with) the mobile terminal 100 according to the present invention. The short-range communication module 114 may detect (or recognize) a wearable device capable of communicating with the mobile terminal 100 in the vicinity of the mobile terminal 100. Further, if the detected wearable device is a device authenticated to communicate with the mobile terminal 100 according to the present invention, the control unit 180 may transmit at least a part of the data processed by the mobile terminal 100 to the wearable device through the short-range communication module 114. Accordingly, the user of the wearable device can use the data processed by the mobile terminal 100 through the wearable device. For example, when a call is received by the mobile terminal 100, the user can conduct the telephone conversation through the wearable device, and when a message is received by the mobile terminal 100, the user can check the received message through the wearable device.

The position information module 115 is a module for obtaining the position (or current position) of the mobile terminal, and representative examples thereof are the Global Positioning System (GPS) module and the Wireless Fidelity (WiFi) module. For example, when the mobile terminal utilizes the GPS module, it can acquire the position of the mobile terminal using a signal transmitted from a GPS satellite. As another example, when the mobile terminal utilizes the Wi-Fi module, it can acquire the position of the mobile terminal based on information of a wireless access point (AP) that transmits or receives wireless signals to or from the Wi-Fi module. Optionally, the position information module 115 may, additionally or alternatively, perform a function of another module of the wireless communication unit 110 in order to obtain data relating to the position of the mobile terminal. The position information module 115 is a module used to obtain the position (or current position) of the mobile terminal, and is not limited to a module that directly calculates or obtains the position of the mobile terminal.

Next, the input unit 120 is for inputting image information (or signal), audio information (or signal), data, or information input from a user. For inputting image information, the mobile terminal 100 may be provided with one or a plurality of cameras 121. The camera 121 processes image frames, such as still images or moving images, obtained by the image sensor in the video communication mode or the photographing mode. The processed image frame may be displayed on the display unit 151 or stored in the memory 170. A plurality of cameras 121 provided in the mobile terminal 100 may be arranged in a matrix structure, and a plurality of pieces of image information having various angles or foci may be input to the mobile terminal 100 through the cameras 121 having the matrix structure. In addition, the plurality of cameras 121 may be arranged in a stereo structure to acquire a left image and a right image for realizing a stereoscopic image.

The microphone 122 processes the external acoustic signal into electrical voice data. The processed voice data can be utilized variously according to a function (or a running application program) being executed in the mobile terminal 100. Meanwhile, the microphone 122 may be implemented with various noise reduction algorithms for eliminating noise generated in receiving an external sound signal.

The user input unit 123 is for receiving information from a user, and when information is input through the user input unit 123, the control unit 180 can control the operation of the mobile terminal 100 to correspond to the input information. The user input unit 123 may include a mechanical input means (or a mechanical key, such as a button located on the front, rear or side of the mobile terminal 100, a dome switch, a jog wheel, a jog switch, etc.) and a touch-type input means. For example, the touch-type input means may comprise a virtual key, a soft key or a visual key displayed on the touch screen through software processing, or a touch key disposed on a portion other than the touch screen. Meanwhile, the virtual key or the visual key can be displayed on the touch screen in various forms, for example, as a graphic, text, an icon, a video, or a combination thereof.

Meanwhile, the sensing unit 140 senses at least one of information in the mobile terminal, surrounding environment information surrounding the mobile terminal, and user information, and generates a corresponding sensing signal. The control unit 180 may control the driving or operation of the mobile terminal 100 or may perform data processing, function or operation related to the application program installed in the mobile terminal 100 based on the sensing signal. Representative sensors among various sensors that may be included in the sensing unit 140 will be described in more detail.

First, the proximity sensor 141 refers to a sensor that detects the presence of an object approaching a predetermined detection surface, or the presence of an object in the vicinity of the detection surface, without mechanical contact by using electromagnetic force or infrared rays. The proximity sensor 141 may be disposed in the inner area of the mobile terminal or in proximity to the touch screen, which is covered by the touch screen.

Examples of the proximity sensor 141 include a transmission type photoelectric sensor, a direct reflection type photoelectric sensor, a mirror reflection type photoelectric sensor, a high frequency oscillation type proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. In the case where the touch screen is electrostatic, the proximity sensor 141 can be configured to detect the proximity of the object with a change of the electric field along the proximity of the object having conductivity. In this case, the touch screen (or touch sensor) itself may be classified as a proximity sensor.

On the other hand, for convenience of explanation, the act of recognizing that an object is positioned in proximity to the touch screen without the object touching the touch screen is referred to as a "proximity touch," and the act of actually touching the touch screen with an object is referred to as a "contact touch." The position at which an object is proximity-touched on the touch screen means the position at which the object corresponds vertically to the touch screen when the object is proximity-touched. The proximity sensor 141 can detect a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Meanwhile, the control unit 180 processes data (or information) corresponding to the proximity touch operation and the proximity touch pattern sensed through the proximity sensor 141 as described above, and may further output visual information corresponding to the processed data on the touch screen. Furthermore, the control unit 180 can control the mobile terminal 100 such that different operations or data (or information) are processed according to whether the touch to the same point on the touch screen is a proximity touch or a contact touch.

The touch sensor detects a touch (or touch input) applied to the touch screen (or the display unit 151) using at least one of various touch methods, such as a resistive type, a capacitive type, an infrared type, and an ultrasonic type.

For example, the touch sensor may be configured to convert a change in a pressure applied to a specific portion of the touch screen or a capacitance generated in a specific portion to an electrical input signal. The touch sensor may be configured to detect a position, an area, a pressure at the time of touch, a capacitance at the time of touch, and the like where a touch object touching the touch screen is touched on the touch sensor. Here, the touch object may be a finger, a touch pen, a stylus pen, a pointer, or the like as an object to which a touch is applied to the touch sensor.

Thus, when there is a touch input to the touch sensor, the corresponding signal(s) is sent to the touch controller. The touch controller processes the signal(s) and transmits the corresponding data to the control unit 180. Thus, the control unit 180 can know which area of the display unit 151 has been touched, and the like. Here, the touch controller may be a component separate from the control unit 180, or may be the control unit 180 itself.

On the other hand, the control unit 180 may perform different controls or perform the same control according to the type of the touch object touching the touch screen (or a touch key provided on the touch screen). Whether to perform different controls or to perform the same control according to the type of the touch object may be determined according to the current state of the mobile terminal 100 or an application program being executed.

On the other hand, the touch sensors and the proximity sensors discussed above can be used independently or in combination to sense various types of touches, such as a short touch (tap), a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.

The ultrasonic sensor can recognize position information of an object to be sensed using ultrasonic waves. Meanwhile, the control unit 180 can calculate the position of a wave generating source based on information sensed by an optical sensor and a plurality of ultrasonic sensors. The position of the wave source can be calculated using the fact that light is much faster than ultrasonic waves, that is, the time for light to reach the optical sensor is much shorter than the time for an ultrasonic wave to reach an ultrasonic sensor. More specifically, the position of the wave generating source can be calculated using the time difference between the arrival of the ultrasonic wave and the arrival of the light, which serves as a reference signal.
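The arithmetic behind this time-difference localization can be illustrated with a minimal sketch. This is an assumption for exposition only (the patent gives no implementation): the light arrival is treated as time zero, and a nominal speed of sound converts the ultrasound's extra travel time into a distance.

```python
# Hypothetical sketch of time-difference localization: light arrives
# effectively instantaneously and serves as the reference signal, so
# the ultrasound's extra travel time alone determines the distance.

SPEED_OF_SOUND_M_S = 343.0  # nominal speed of sound in air (~20 °C); an assumption

def distance_to_source(t_light_s: float, t_ultrasound_s: float) -> float:
    """Distance (m) from a sensor to the wave source."""
    dt = t_ultrasound_s - t_light_s
    if dt < 0:
        raise ValueError("ultrasound cannot arrive before light")
    return SPEED_OF_SOUND_M_S * dt

# Example: the ultrasonic wave arrives 2 ms after the light pulse.
d = distance_to_source(0.0, 0.002)
print(round(d, 3))  # 0.686
```

With distances from several ultrasonic sensors, the source position itself could then be found by triangulation.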

The camera 121 includes at least one of a camera sensor (for example, a CCD, a CMOS, etc.), a photo sensor (or an image sensor), and a laser sensor.

The camera 121 and the laser sensor may be combined with each other to sense a touch of the sensing object with respect to a three-dimensional stereoscopic image. The photo sensor may be laminated on the display element, and is adapted to scan the movement of an object to be detected proximate to the touch screen. More specifically, the photo sensor mounts photo diodes and transistors (TRs) in rows and columns, and scans the content placed on the photo sensor using an electrical signal that varies according to the amount of light applied to the photo diodes. That is, the photo sensor calculates the coordinates of the object to be sensed according to the amount of change of the light, and position information of the object to be sensed can be obtained through the calculation.

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program driven by the mobile terminal 100 or UI (User Interface) and GUI (Graphic User Interface) information according to the execution screen information .

In addition, the display unit 151 may be configured as a stereoscopic display unit for displaying a stereoscopic image.

In the stereoscopic display unit, a three-dimensional display system such as a stereoscopic system (glasses system), an autostereoscopic system (no-glasses system), and a projection system (holographic system) can be applied.

The sound output unit 152 may output audio data received from the wireless communication unit 110 or stored in the memory 170 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, or a broadcast reception mode. The sound output unit 152 also outputs sound signals related to functions performed in the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The sound output unit 152 may include a receiver, a speaker, a buzzer, and the like.

The haptic module 153 generates various tactile effects that the user can feel. A typical example of the haptic effect generated by the haptic module 153 may be vibration. The intensity and pattern of the vibration generated in the haptic module 153 can be controlled by the user's selection or the setting of the control unit. For example, the haptic module 153 may synthesize and output different vibrations or sequentially output the vibrations.

In addition to vibration, the haptic module 153 may generate various tactile effects, such as an effect of a pin arrangement moving vertically with respect to the contacted skin surface, a spraying force or suction force of air through an injection port or a suction port, a brush against the skin surface, and an effect of reproducing a cold or warm sensation using an endothermic or exothermic element.

The haptic module 153 can transmit the tactile effect through direct contact, and can also be implemented so that the user can feel the tactile effect through the muscles of a finger or an arm. Two or more haptic modules 153 may be provided according to the configuration of the mobile terminal 100.

The light output unit 154 outputs a signal for notifying the occurrence of an event using the light of the light source of the mobile terminal 100. Examples of events that occur in the mobile terminal 100 may include message reception, call signal reception, missed call, alarm, schedule notification, email reception, information reception through an application, and the like.

The signal output from the light output unit 154 is implemented as the mobile terminal emits light of a single color or a plurality of colors to the front or rear surface. The signal output may be terminated when the mobile terminal detects that the user has confirmed the event.

The interface unit 160 serves as a path for communication with all external devices connected to the mobile terminal 100. The interface unit 160 receives data from an external device, supplies power to each component in the mobile terminal 100, or transmits data in the mobile terminal 100 to an external device. For example, the interface unit 160 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio I/O port, a video I/O port, an earphone port, and the like.

The identification module is a chip storing various information for authenticating the usage right of the mobile terminal 100, and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. A device having an identification module (hereinafter referred to as an "identification device") may be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 100 through the interface unit 160.

The interface unit 160 may serve as a path through which power from an external cradle is supplied to the mobile terminal 100 when the mobile terminal 100 is connected to the cradle, or as a path through which various command signals input by the user from the cradle are transmitted to the mobile terminal 100. The various command signals or the power input from the cradle may operate as signals for recognizing that the mobile terminal 100 is correctly mounted on the cradle.

The memory 170 may store a program for the operation of the controller 180 and temporarily store input / output data (e.g., a phone book, a message, a still image, a moving picture, etc.). The memory 170 may store data on vibration and sound of various patterns outputted when a touch is input on the touch screen.

The memory 170 may include at least one type of storage medium among a flash memory type, a hard disk type, a solid state disk (SSD) type, a silicon disk drive (SDD) type, a multimedia card micro type, a card-type memory (e.g., SD or XD memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, and an optical disk. The mobile terminal 100 may also operate in association with a web storage that performs the storage function of the memory 170 on the Internet.

Meanwhile, as described above, the control unit 180 controls the operations related to the application programs and the overall operation of the mobile terminal 100. For example, when the state of the mobile terminal meets a set condition, the control unit 180 can execute or release a lock state that restricts input of a user's control command to applications.

In addition, the control unit 180 performs control and processing related to voice communication, data communication, video calls, and the like, or performs pattern recognition processing capable of recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively. Further, the control unit 180 may control any one or a combination of the above-described components in order to implement the various embodiments described below on the mobile terminal 100 according to the present invention.

The power supply unit 190 receives external power and internal power under the control of the control unit 180 and supplies the power necessary for the operation of the respective components. The power supply unit 190 includes a battery; the battery may be an internal battery configured to be rechargeable, and may be detachably coupled to the terminal body for charging or the like.

In addition, the power supply unit 190 may include a connection port, and the connection port may be configured as an example of an interface 160 through which an external charger for supplying power for charging the battery is electrically connected.

As another example, the power supply unit 190 may be configured to charge the battery wirelessly without using the connection port. In this case, the power supply unit 190 may receive power from an external wireless power transmission apparatus using at least one of an inductive coupling method based on magnetic induction and a magnetic resonance coupling method based on electromagnetic resonance.

In the following, various embodiments may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

Meanwhile, the mobile terminal can be extended to a wearable device that can be worn on the body beyond the dimension that the user mainly grasps and uses. These wearable devices include smart watch, smart glass, and head mounted display (HMD). Hereinafter, examples of a mobile terminal extended to a wearable device will be described.

The wearable device can be made capable of exchanging (or interworking) data with another mobile terminal 100. The short-range communication module 114 can detect (or recognize) a wearable device capable of communicating with the mobile terminal 100. If the detected wearable device is a device authenticated to communicate with the mobile terminal 100, the control unit 180 may transmit at least a part of the data processed by the mobile terminal 100 to the wearable device through the short-range communication module 114. Accordingly, the user can use the data processed by the mobile terminal 100 through the wearable device. For example, when a call is received by the mobile terminal 100, the user can conduct the telephone conversation via the wearable device, or when a message is received by the mobile terminal 100, the user can check the received message via the wearable device.

FIG. 2 is a perspective view showing an example of a watch-type mobile terminal 300 according to another embodiment of the present invention.

Referring to FIG. 2, the watch-type mobile terminal 300 includes a main body 301 having a display unit 351 and a band 302 connected to the main body 301 and configured to be worn on the wrist. In general, the mobile terminal 300 may include the features of the mobile terminal 100 of FIG. 1 or similar features.

The main body 301 includes a case that forms an appearance. As shown, the case may include a first case 301a and a second case 301b that provide an internal space for accommodating various electronic components. However, the present invention is not limited to this, and one case may be configured to provide the internal space, so that a mobile terminal 300 of a unibody may be realized.

The watch-type mobile terminal 300 is configured to allow wireless communication, and the main body 301 may be provided with an antenna for the wireless communication. On the other hand, the antenna can expand its performance by using a case. For example, a case including a conductive material may be configured to electrically connect with the antenna to extend the ground or radiating area.

A display unit 351 is disposed on a front surface of the main body 301 to output information, and a touch sensor is provided on the display unit 351 to implement a touch screen. The window 351a of the display unit 351 may be mounted on the first case 301a to form a front surface of the terminal body together with the first case 301a.

The main body 301 may include an acoustic output unit 352, a camera 321, a microphone 322, a user input unit 323, and the like. When the display unit 351 is implemented as a touch screen, the display unit 351 may function as a user input unit 323, so that the main body 301 may not have a separate key.

The band 302 is worn on the wrist so as to surround the wrist and can be formed of a flexible material for easy wearing. As an example, the band 302 may be formed of leather, rubber, silicone, synthetic resin, or the like. In addition, the band 302 is detachably attached to the main body 301, and can be configured to be replaceable with various types of bands according to the user's preference.

On the other hand, the band 302 can be used to extend the performance of the antenna. For example, the band may include a ground extension (not shown) that is electrically connected to the antenna and extends the ground region.

The band 302 may be provided with a fastener 302a. The fastener 302a may be embodied by a buckle, a snap-fit hook structure, or Velcro (trademark), and may include a stretchable section or material. In this drawing, an example in which the fastener 302a is embodied as a buckle is shown.

Hereinafter, embodiments related to a control method that can be implemented in a mobile terminal configured as above will be described with reference to the accompanying drawings. It will be apparent to those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof.

FIG. 3 is a flowchart illustrating a method of controlling a mobile terminal according to an exemplary embodiment of the present invention. FIGS. 4A to 7B are views for explaining a preview image in the photographing direction of the camera changed according to the rotation of the mobile terminal according to an embodiment of the present invention.

The method for controlling a mobile terminal according to an embodiment of the present invention may be implemented in the mobile terminal 100 described with reference to FIG. Hereinafter, a method of controlling a mobile terminal according to an exemplary embodiment of the present invention and an operation of the mobile terminal 100 for implementing the method will be described in detail with reference to the accompanying drawings.

Referring to FIG. 3, the sensing unit 140 may sense the rotation of the watch type mobile terminal 100 (S100).

The watch-type mobile terminal 100 may be worn on the user's body. The rotation may mean that the mobile terminal 100 is rotated while being worn on the user's body (e.g., the wrist). However, the present invention is not limited to this, and if the mobile terminal 100 rotates, the following features may be practically the same unless otherwise specified.

As described with reference to FIG. 1, the sensing unit 140 may include an acceleration sensor, a magnetic sensor, a gravity sensor, a gyroscope sensor, and the like. The sensing unit 140 may sense the rotation of the mobile terminal 100 on the user's body using these sensors. Any method can be applied as long as the sensing unit 140 can detect the rotation of the mobile terminal 100, and the present invention is not limited to a specific method. The sensing unit 140 may transmit information about the rotation of the mobile terminal 100 to the control unit 180.
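As a loose illustration of step S100, the rotation sensing described above might be sketched as follows. This is an assumption for exposition only; the threshold, sampling rate, and single-axis gyroscope integration are not from the patent.

```python
# Hypothetical sketch: detect rotation of a wrist-worn terminal by
# integrating gyroscope readings about the forearm axis and reporting
# a rotation event once the accumulated angle passes a threshold.

ROTATION_THRESHOLD_DEG = 45.0  # illustrative threshold, an assumption

def detect_rotation(angular_rates_dps, sample_dt_s):
    """Return (accumulated angle in degrees, whether a rotation occurred).

    angular_rates_dps: gyroscope samples (deg/s) about the wrist axis.
    sample_dt_s: time between samples in seconds.
    """
    angle = sum(rate * sample_dt_s for rate in angular_rates_dps)
    return angle, abs(angle) >= ROTATION_THRESHOLD_DEG

# 0.5 s of samples at 100 Hz, with the wrist turning at ~120 deg/s.
angle, rotated = detect_rotation([120.0] * 50, 0.01)
print(round(angle, 6), rotated)  # 60.0 True
```

In practice the sensing unit would fuse several sensors (acceleration, gravity, magnetic), but the accumulated-angle idea is the same.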

Referring back to FIG. 3, the control unit 180 can confirm the changed photographing direction of the camera according to the detected rotation (S110).

According to one example, the camera 121 may be provided in the band portion of the mobile terminal 100, as shown in FIG. When the mobile terminal 100 is rotated while being worn on the user's wrist, the camera 121 rotates together with it. The control unit 180 can confirm the worn state of the mobile terminal 100 according to the rotation, based on the received information about the rotation of the mobile terminal 100. Based on the worn state, the control unit 180 can acquire the photographing direction of the camera 121 rotated according to the rotation of the mobile terminal 100.

As shown in FIG. 5A, in a state in which the mobile terminal 100 is worn so that the user (a) can see the touch screen 151, the photographing direction of the camera 121 may be the front of the user (a). As shown in FIG. 6A, in a state in which the mobile terminal 100 is rotated so that the user (a) can photograph an object Q1 placed below, the photographing direction of the camera 121 may be downward. As shown in FIG. 7A, in a state in which the mobile terminal 100 is rotated so that the user (a) can photograph his or her own image, the photographing direction of the camera 121 may be toward the user (a).
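Step S110 — mapping the sensed rotation to one of the three photographing directions of FIGS. 5A to 7A — could be sketched as below. The angle ranges are illustrative assumptions, not values from the disclosure.

```python
# Illustrative only: classify the camera's photographing direction
# (forward, downward, or toward the user) from the terminal's rotation
# angle about the wrist, mirroring FIGS. 5A-7A.

def photographing_direction(rotation_deg: float) -> str:
    """Map the wrist rotation angle to a coarse camera direction."""
    angle = rotation_deg % 360
    if angle < 60 or angle >= 300:
        return "forward"    # FIG. 5A: camera faces ahead of the user
    if 60 <= angle < 180:
        return "downward"   # FIG. 6A: camera faces an object below
    return "user"           # FIG. 7A: camera faces the user

print(photographing_direction(30))   # forward
print(photographing_direction(120))  # downward
print(photographing_direction(240))  # user
```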

Referring back to FIG. 3, the control unit 180 may acquire a preview image in the identified photographing direction through the camera (S120). The control unit 180 may display the obtained preview image on the touch screen (S130).

The control unit 180 may display, on the touch screen 151, a preview image in the photographing direction of the camera that changes according to the rotation of the mobile terminal 100. According to one example, in a state in which the camera 121 is driven, the control unit 180 can continuously change the preview image according to the rotation of the mobile terminal 100. According to another example, when the mobile terminal 100 is rotated, the control unit 180 may drive the camera 121 to display a preview image.

FIG. 5B shows, viewed from above, a state in which the mobile terminal 100 is worn when the photographing direction of the camera 121 is the front of the user (a). Referring to FIG. 5B, the touch screen 151 is positioned toward the back of the user (a). Therefore, the camera 121 faces upward in FIG. 5B, that is, toward the front of the user (a).

In this case, the preview image 20 displayed on the touch screen 151 may be an image of an object in front. According to one example, the control unit 180 may display an image capture indicator 10 on the touch screen 151. Upon receiving an input selecting the image capture indicator 10, the control unit 180 may capture the preview image 20 and store the captured image in the memory 170.

FIG. 6B shows, viewed from above, a state in which the mobile terminal 100 is worn when the photographing direction of the camera 121 is downward. Referring to FIG. 6B, the touch screen 151 is positioned toward the front of the user (a). Therefore, the camera 121 is directed downward, that is, in the palm direction of the user (a).

In this case, the preview image 21 displayed on the touch screen 151 may be an image of the object below. According to one example, a QR code Q1 is shown below the camera 121 in FIG. 6A. The control unit 180 may display the QR code Q1 sensed through the camera 121 on the touch screen 151. According to one example, when the control unit 180 determines that the preview image 21 includes the QR code Q1, it may execute an application for scanning the QR code Q1.
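The QR-code branch described above can be illustrated with a hedged sketch. Here `contains_qr_code` and `launch_app` are hypothetical stand-ins for a detector and a platform launcher that the patent does not name.

```python
# Hedged sketch of the QR-handling step: if a preview frame is found to
# contain a QR code, launch a scanning application. The detector and
# launcher are passed in because the patent specifies neither.

def on_preview_frame(frame, contains_qr_code, launch_app):
    """Check one preview frame and start the QR application if needed."""
    if contains_qr_code(frame):
        launch_app("qr_scanner")  # "qr_scanner" is an assumed app name
        return True
    return False

# Toy stand-ins for the detector and the platform launcher.
launched = []
result = on_preview_frame(
    frame={"pixels": "..."},
    contains_qr_code=lambda f: True,
    launch_app=launched.append,
)
print(result, launched)  # True ['qr_scanner']
```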

FIG. 7B shows, viewed from above, a state in which the mobile terminal 100 is worn when the photographing direction of the camera 121 is toward the user (a). Referring to FIG. 7B, the touch screen 151 is positioned toward the palm side of the user (a). Therefore, the camera 121 faces downward in FIG. 7B, that is, toward the user (a). In this case, the preview image 22 displayed on the touch screen 151 may be an image of the user (a).

Referring back to FIG. 3, the controller 180 may correct the preview image according to a reference set to correspond to the identified photographing direction (S140).

The control unit 180 can correct the preview images 20, 21, and 22 according to different references for the cases where the photographing direction of the camera 121 is forward, downward, or toward the user. According to one example, the set reference may include a reference value for the angle of view, brightness, saturation, contrast, or sharpness of the preview images 20, 21, and 22. However, the present invention is not limited thereto. The set reference may include a reference value for any other characteristic, as long as it is a characteristic applicable to photographing an image.

For example, assume that the photographing direction of the camera 121 is the front of the user. In this case, the brightness of the preview image 20 for the front may be adjusted to a preset reference value corresponding to the case where the photographing direction is in front of the user. That is, when the photographing direction is confirmed to be in front of the user, the control unit 180 may control the camera 121 to adjust the brightness of the preview image 20 according to the brightness set corresponding to the front of the user. Accordingly, the user can shoot an image having the preset brightness even without checking the touch screen 151.
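As a toy illustration of step S140, brightness correction against a direction-specific reference might look like the following. The reference table and the simple gain model are assumptions for exposition, not the patent's implementation.

```python
# Minimal sketch of step S140: look up the correction reference for the
# confirmed photographing direction and scale the preview's brightness
# toward it. Reference values are illustrative assumptions.

SET_REFERENCES = {
    "forward":  {"brightness": 0.6},
    "downward": {"brightness": 0.8},
    "user":     {"brightness": 0.7},
}

def correct_preview(pixel_brightnesses, direction):
    """Scale pixel brightness so the mean matches the direction's reference."""
    if not pixel_brightnesses:
        return []
    ref = SET_REFERENCES[direction]["brightness"]
    mean = sum(pixel_brightnesses) / len(pixel_brightnesses)
    gain = ref / mean if mean else 1.0
    return [min(1.0, p * gain) for p in pixel_brightnesses]

corrected = correct_preview([0.2, 0.4, 0.6], "forward")
print([round(p, 2) for p in corrected])  # [0.3, 0.6, 0.9]
```

A real implementation would more likely adjust exposure parameters on the camera itself rather than rescaling pixels, but the lookup-then-apply flow is the point here.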

Here, in relation to the correction of the preview image, it is assumed that the photographing direction of the camera 121 is the front of the user. However, this is not limitative; the above description can be applied substantially equally to the other photographing directions of the camera 121.

According to one example, the set reference may be set by the user. The control unit 180 may display, on the touch screen 151, a screen on which the reference can be set for each rotation direction of the camera 121. Accordingly, the user can set the shooting angle, brightness, saturation, contrast, sharpness, or the like to a desired value for each photographing direction of the camera 121.

According to another example, the set reference may be set according to a photographing pattern of the user. Assume that a plurality of images photographed through the camera 121 are stored in the memory 170 of the mobile terminal 100. The control unit 180 may acquire a common feature from the images taken in the same photographing direction among the plurality of images.

For example, the control unit 180 may classify, from among the plurality of images, the images taken when the photographing direction of the camera 121 was toward the user. For this purpose, the control unit 180 may classify the images that include information indicating that the photographing direction of the camera 121 was toward the user. Alternatively, the control unit 180 may classify the images including the user (a) from among the plurality of images using object recognition technology.

Thereafter, the control unit 180 may acquire the features applied at the time of shooting the images in the user direction. The acquired features may be a photographing pattern, such as the photographing angle, brightness or saturation of the image, applied when photographing images in the user direction. The control unit 180 may then acquire a common feature among the acquired features.

In this case, a feature shared by a certain ratio or more of the images in the user direction may be stored in the memory 170 as the set reference. For example, if a feature appears in 50% or more of the images in the user direction, it may be stored as the set reference. The above 50% is an example, and may be set differently as needed.
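The pattern-learning rule above — a feature value becomes the set reference when it appears in at least the given ratio of same-direction images — can be sketched as follows. The 50% default follows the text; everything else is an illustrative assumption.

```python
# Sketch of learning a set reference from the user's photographing
# pattern: the most common feature value among same-direction images
# becomes the reference if it covers at least `min_ratio` of them.

from collections import Counter

def learn_reference(feature_values, min_ratio=0.5):
    """Return the most common value if it covers >= min_ratio of images."""
    if not feature_values:
        return None
    value, count = Counter(feature_values).most_common(1)[0]
    return value if count / len(feature_values) >= min_ratio else None

# Brightness values used in five images taken toward the user.
print(learn_reference([0.7, 0.7, 0.7, 0.5, 0.6]))  # 0.7
print(learn_reference([0.7, 0.5, 0.6, 0.4]))       # None
```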

Here, in relation to the set reference, it is assumed that the photographing direction of the camera 121 is toward the user. However, the present invention is not limited thereto; the above description can be applied substantially equally to the other photographing directions of the camera 121.

According to one example, when the preview image 20, 21, 22 is photographed after being corrected differently from the set reference, the control unit 180 may update the set reference based on the photographed image. That is, the user can adjust the characteristics of the camera 121 to re-correct the preview image 20, 21, 22 that was corrected by the control unit 180.

For example, in a case where the brightness of the preview image 20 has been adjusted according to the set reference, the user can control the camera 121 to change to a different brightness and take an image. The control unit 180 can check whether the proportion of photographed images corrected to a brightness different from the set reference is equal to or greater than a predetermined ratio. It is assumed that the images photographed with the different brightness meet or exceed the predetermined ratio. In this case, the control unit 180 may update the brightness included in the set reference to the different brightness and store it.
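A minimal sketch of this update rule, under an assumed data layout (the function name, feature key, and threshold handling are illustrative, not the disclosed implementation):

```python
from collections import Counter

def update_reference(reference, recent_images, feature="brightness", min_ratio=0.5):
    """If at least `min_ratio` of recently photographed images used a value
    for `feature` different from the current set reference, adopt the most
    common differing value. Hypothetical sketch of the update rule."""
    differing = [img[feature] for img in recent_images
                 if feature in img and img[feature] != reference.get(feature)]
    if recent_images and len(differing) / len(recent_images) >= min_ratio:
        reference = dict(reference)  # keep the old reference object intact
        reference[feature] = Counter(differing).most_common(1)[0][0]
    return reference

ref = {"brightness": 0.7}
shots = [{"brightness": 0.9}, {"brightness": 0.9}, {"brightness": 0.7}]
print(update_reference(ref, shots))  # {'brightness': 0.9}
```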

According to this, the preview image corrected based on the photographing direction of the camera 121, which changes according to the rotation of the watch-type mobile terminal 100, can be provided. Further, by correcting the preview image based on a plurality of images photographed in the same photographing direction, it is possible to photograph an image according to the characteristics the user mainly applies, without checking the preview image. Further, by updating the reference for correcting the preview image upon image capture, it is possible to take an image according to the criteria the user has applied most recently.

FIGS. 8A and 8B are diagrams for explaining the output of an alarm that guides the preview image to a preset photographing angle, according to an embodiment of the present invention.

The control unit 180 may output an alarm that guides the preview image to be positioned at the shooting angle included in the set reference when the mobile terminal 100 moves. Referring to FIG. 8A, when the photographing direction of the camera 121 is the front of the user, the photographing angle R1 included in the set reference is shown.

As described above, the photographing angle R1 may be an angle at which the user mainly photographs when the photographing direction of the camera 121 is the front of the user. The control unit 180 may determine whether the mobile terminal 100 is located at the photographing angle R1 based on the movement of the mobile terminal 100 detected by the sensing unit 140.

The control unit 180 may output an alarm through the output unit 150 when the mobile terminal 100 has not been raised to the photographing angle R1, as shown in FIG. 8A. For example, the alarm may be a vibration output through the haptic module 153. Alternatively, the alarm may be a sound output through the sound output unit 152. In this case, the controller 180 may output a voice alarm including information on the direction in which the mobile terminal 100 should be moved.

Referring to FIG. 8B, the control unit 180 may output an alarm through the output unit 150 when the mobile terminal 100 has been raised beyond the photographing angle R1. In this case, the alarm may be a vibration different from the vibration in FIG. 8A. Alternatively, the alarm may be a sound different from the sound in FIG. 8A. In this case, the controller 180 may output a voice alarm including information on the direction in which the mobile terminal 100 should be moved.
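The two-sided angle guidance can be sketched as a simple comparison against R1; the vibration names, the tolerance, and the guidance messages below are illustrative assumptions, not part of the disclosure:

```python
def angle_alarm(current_angle, target_angle, tolerance=5.0):
    """Choose an alarm for guiding the terminal to the preset photographing
    angle. Distinct vibration patterns correspond to the two cases of
    falling short of and overshooting the angle; all names are illustrative."""
    if abs(current_angle - target_angle) <= tolerance:
        return None  # within the preset angle: no alarm
    if current_angle < target_angle:
        return ("short-vibration", "raise the terminal")
    return ("long-vibration", "lower the terminal")

print(angle_alarm(20, 40))  # ('short-vibration', 'raise the terminal')
print(angle_alarm(55, 40))  # ('long-vibration', 'lower the terminal')
print(angle_alarm(42, 40))  # None
```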

It is assumed, in relation to the output of the alarm for the photographing angle, that the photographing direction of the camera 121 is the front of the user; however, the present invention is not limited thereto. The above description can be applied substantially equally to the other photographing directions of the camera 121.

According to this, by outputting an alarm that guides the preview image to be set at a predetermined shooting angle, the user can shoot the image according to the appropriate shooting angle without checking the preview image.

FIGS. 9A and 9B are views for explaining deletion of a photographed image at a preset location according to an embodiment of the present invention when the user leaves the preset location.

Referring to FIG. 9A, the preview image 23 obtained in a parking lot is shown in a state in which the photographing direction of the camera 121 is the front of the user. When the user photographs the preview image 23, the control unit 180 can receive the location information of the mobile terminal 100 through the location information module 115 of the wireless communication unit 110. However, this is only an example, and the location information may be received in advance regardless of whether an image is photographed.

The control unit 180 may determine whether the received location information corresponds to a preset location. The preset location may be a place where, by its nature, images are mainly taken to record information. For example, a place where an image is taken to remember where a car is parked, such as a parking lot, may correspond to the preset location.

The control unit 180 may determine whether the text t1 is recognized in the photographed image when the location of the mobile terminal 100 is the predetermined location. If the text t1 is recognized, the control unit 180 may store the text t1. According to one example, the control unit 180 may execute a memo application to store the text t1.

As shown in FIG. 9B, the control unit 180 can display the preset location 31, the recognized text 32, or information 33 such as the received location and time on the execution screen 30 of the memo application. According to an example, the control unit 180 can associate the photographed image with the execution screen 30 and store them together. When a specific touch input is applied to the execution screen 30, the controller 180 may display the photographed image on the touch screen 151. Alternatively, if a specific touch input is applied to the photographed image, the control unit 180 may display the execution screen 30 on the touch screen 151.

Thereafter, when the mobile terminal 100 moves out of the preset location, the control unit 180 may remove the photographed image from the memory 170. According to an example, the control unit 180 may not delete the contents stored on the execution screen 30 of the memo application. However, the present invention is not limited thereto, and the contents stored on the execution screen 30 may also be preset to be deleted.
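The location-based deletion can be sketched as a geofence check: delete place-tagged images once the terminal has moved outside a radius around the photographing location, while the memo contents survive. The place tag, radius, coordinates, and function names below are all illustrative assumptions:

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate ground distance in metres (equirectangular approximation,
    adequate for geofence-sized distances)."""
    dx = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    dy = math.radians(lat2 - lat1)
    return math.hypot(dx, dy) * 6371000  # mean Earth radius in metres

def purge_if_left(gallery, photo_location, current_location, radius_m=150):
    """Drop images tagged with the preset place once the terminal is outside
    the radius; other images (and any memo text kept elsewhere) are untouched."""
    if distance_m(*photo_location, *current_location) > radius_m:
        return [img for img in gallery if img.get("place") != "parking_lot"]
    return gallery

gallery = [{"id": 1, "place": "parking_lot"}, {"id": 2, "place": None}]
still_here = purge_if_left(gallery, (37.5665, 126.9780), (37.5666, 126.9781))
moved_away = purge_if_left(gallery, (37.5665, 126.9780), (37.5800, 126.9900))
print(len(still_here), len(moved_away))  # 2 1
```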

According to another example, the control unit 180 may output an alarm notifying the user before deleting the photographed image. In this case, the control unit 180 may display a screen on the touch screen 151 to select whether or not to delete the photographed image.

According to an example, even when there is another external device set to transmit the photographed image, the control unit 180 may not transmit the photographed image to the external device at the preset location. This is to prevent unnecessary transmission because the image photographed at the preset location is set to be deleted when the user leaves the preset location.

According to the present invention, when an image photographed at a predetermined place satisfies a specific condition, the photographed image is deleted when the user leaves the preset place, thereby effectively managing the capacity of the memory.

FIGS. 10A and 10B are diagrams for explaining deletion of an image photographed in a state tilted by a predetermined angle or more, based on text included in the image, according to an embodiment of the present invention.

Referring to FIG. 10A, a preview image 24 including an object tilted by a predetermined angle or more is shown in a state in which the photographing direction of the camera 121 is the front of the user. The user can photograph a specific object in a state tilted by a predetermined angle or more in order to capture an image for recording the information of the specific object. For example, as shown in FIG. 10A, when the specific object is a performance poster including the text t2, the user can photograph the performance poster tilted by a predetermined angle or more.

The control unit 180 may determine whether the photographed specific object is in a state inclined by a predetermined angle or more. According to an example, the predetermined angle may be set to be larger than an angle that can be tilted by hand trembling or the like when a specific object is photographed. However, this is an example, and the predetermined angle may be set to another value as needed.

The control unit 180 may determine whether the text t2 is recognized in the photographed image when the photographed specific object is inclined by a predetermined angle or more. If the text t2 is recognized, the controller 180 may store the text t2. According to one example, the control unit 180 may execute the memo application to store the text t2.

As shown in FIG. 10B, the control unit 180 may display information 35, 36, 37 on the text t2 included in the specific object on the execution screen 34 of the memo application. According to an example, the control unit 180 may associate the photographed image with the execution screen 34 and store them together. When a specific touch input is applied to the execution screen 34, the controller 180 may display the photographed image on the touch screen 151. Alternatively, when a specific touch input is applied to the photographed image, the control unit 180 may display the execution screen 34 on the touch screen 151.

According to an example, the controller 180 may extract the time information 36 from the text t2 included in the photographed object. For this purpose, known text recognition techniques may be applied. The control unit 180 may remove the photographed image from the memory 170 after the time indicated in the extracted time information 36. According to an example, the control unit 180 may not delete the contents stored on the execution screen 34 of the memo application. However, the present invention is not limited thereto, and the contents stored on the execution screen 34 may also be preset to be deleted.
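The time-based deletion can be sketched as follows, assuming the recognized poster text contains a date in a known format; the date pattern, sample text, and function names are illustrative assumptions, not the disclosed implementation:

```python
import re
from datetime import datetime

def extract_expiry(text):
    """Pull a date out of recognized poster text. The YYYY-MM-DD format
    is an assumption for illustration."""
    m = re.search(r"(\d{4})-(\d{2})-(\d{2})", text)
    return datetime(*map(int, m.groups())) if m else None

def should_delete(image_text, now):
    """An image of a tilted poster becomes deletable once the date found
    in its recognized text has passed."""
    expiry = extract_expiry(image_text)
    return expiry is not None and now > expiry

poster = "Summer Concert / 2015-08-15 / Main Hall"
print(should_delete(poster, datetime(2015, 8, 16)))  # True
print(should_delete(poster, datetime(2015, 8, 10)))  # False
```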

According to another example, the control unit 180 may output an alarm notifying the user before deleting the photographed image. In this case, the control unit 180 may display a screen on the touch screen 151 to select whether or not to delete the photographed image.

According to an example, even when there is another external device set to receive photographed images, the control unit 180 may not transmit the image photographed in the state tilted by the predetermined angle or more to the external device. This is because such an image is set to be deleted when the specific condition is satisfied, so unnecessary transmission is prevented.

According to the present invention, when text tilted by a predetermined angle or more is recognized in the photographed image, the photographed image is deleted after a predetermined time determined based on the recognized text, thereby effectively managing the memory capacity.

11A and 11B are diagrams for explaining correction of a text included in a preview image in a state in which the photographing direction of the camera is downward according to an embodiment of the present invention.

Referring to FIG. 11A, it is shown that the mobile terminal 100 is rotated and worn on the user's wrist as shown in FIG. 6A. The photographing direction of the camera 121 is downward, and is directed to the lower object n. Since the touch screen 151 faces the front of the user, it is difficult for the user to confirm the touch screen 151 when photographing the downward object n.

A preview image 25 for the object n may be displayed on the touch screen 151. According to one example, the object n may be a business card containing text t3. However, the present invention is not limited thereto.

The control unit 180 can recognize the text t3 in the preview image 25 for the object n below. In this case, as shown in FIG. 11B, the control unit 180 can correct the preview image 25 so that the text t3 can be captured more clearly. For this, the control unit 180 may control the camera 121 to adjust the sharpness or contrast of the preview image 25.

According to this, when the photographing direction of the camera is downward, the preview image is corrected so that the text can be recognized well, and the user can photograph an object including text clearly without checking the touch screen 151.
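The text-oriented correction can be approximated by a linear contrast stretch. This sketch operates on a list of grayscale values and merely stands in for whatever sharpness or contrast control the camera 121 actually performs; the function name and value ranges are assumptions:

```python
def stretch_contrast(pixels, lo=0, hi=255):
    """Linearly remap grayscale values to the full [lo, hi] range so that
    dark text stands out from a light background."""
    mn, mx = min(pixels), max(pixels)
    if mn == mx:
        return list(pixels)  # flat region: nothing to stretch
    scale = (hi - lo) / (mx - mn)
    return [round(lo + (p - mn) * scale) for p in pixels]

# Low-contrast business-card crop: values huddle between 100 and 180
print(stretch_contrast([100, 140, 180]))  # [0, 128, 255]
```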

FIGS. 12A to 13 are diagrams illustrating execution of a corresponding item based on an object included in an image photographed while the photographing direction of the camera is downward, according to an embodiment of the present invention.

Referring to Fig. 12A, the photographing of the QR code Q2 is shown in a state in which the photographing direction of the camera 121 is downward. The QR code Q2 may be an object including information for executing an item. According to one example, the item is a specific web page, and the information may be the address of the particular web page. However, the present invention is not limited thereto. According to another example, the item may be a screen for confirming the result of a specific application or a specific event, or the like.

A preview image of the QR code Q2 obtained through the camera 121 may be displayed on the touch screen 151. The control unit 180 may recognize that the QR code Q2, which is an object including the information for executing the item, is included in the preview image. In this case, as described above with reference to FIG. 6B, the control unit 180 may execute an application for scanning the QR code Q2. According to an example, the control unit 180 may output an alarm indicating that the application has been executed and the QR code Q2 has been scanned.

The control unit 180 may display the execution screen of the item on the touch screen 151 when the mobile terminal 100 is rotated to a predetermined position. For example, the user can rotate the mobile terminal 100 to a predetermined position by rotating the wrist forward as shown in FIG. 12A. However, the present invention is not limited thereto. The rotation to the predetermined position may be set differently as needed.

The control unit 180 may display the execution screen 40 of the item on the touch screen 151 as shown in FIG. 12B when the mobile terminal 100 is rotated to a predetermined position. In FIG. 12B, the execution screen 40 of the item is shown as a screen for confirming the result of a specific prize event.

Referring to FIG. 13, a preview image including a QR code Q3 is displayed on the touch screen 151, and an application for scanning the QR code Q3 is executed. The control unit 180 can output an alarm so that the QR code Q3 can be accurately recognized. As shown in FIG. 13, when only a part of the QR code Q3 is displayed and the QR code Q3 cannot be correctly recognized, the control unit 180 can output vibration or sound through the output unit 150.

According to this, if an object including information for executing an item is present in an image photographed while the photographing direction is downward, an execution screen of the item can be displayed by rotating the mobile terminal 100. In addition, by outputting an alarm for accurately recognizing the object, the user can have the object recognized without checking the touch screen 151.

FIGS. 14A and 14B are diagrams for explaining translation of text included in an image photographed while the photographing direction is downward, according to an embodiment of the present invention.

Referring to FIG. 14A, while the photographing direction of the camera 121 is downward, an object including translatable text t4 is placed below. The translatable text t4 may be in a language different from the language set for use on the mobile terminal 100. For example, if the language of the mobile terminal 100 is English, the Korean word for 'baseball' may correspond to the translatable text t4.

The control unit 180 can execute the translation application when translatable text is recognized in the preview image 26. According to one example, the translation application may be executed in the background.

The control unit 180 may sense the user's gesture through the sensing unit 140. The sensing unit 140 may include an acceleration sensor, a magnetic sensor, a gravity sensor, a gyroscope sensor, or the like. The sensing unit 140 may sense the user's gesture using these sensors. Any method by which the sensing unit 140 can sense the user's gesture may be applied, and the present invention is not limited to a specific method.

If the sensed gesture of the user is a predetermined gesture, the control unit 180 may translate the recognized text t4 and display it on the touch screen 151. Referring to FIG. 14B, the predetermined gesture may be an operation of rotating the mobile terminal 100 such that the touch screen 151 faces the back of the hand. As shown in FIG. 14B, the control unit 180 may display a screen 41 in which the Korean word is translated into the English 'baseball'.

However, the operation of rotating the mobile terminal 100 is not limited thereto. The predetermined gesture may be an operation of twisting the wrist to check the touch screen 151. Alternatively, the predetermined gesture may be set to another operation as needed.

According to an example, the screen 41 may be the preview image 26 of the object including the text t4. The control unit 180 may temporarily store the preview image 26 obtained immediately before the predetermined gesture is sensed. The control unit 180 may display the temporarily stored preview image 26 on the touch screen instead of the preview image obtained in real time from the camera 121. The control unit 180 can display the translated text near the text being translated.

However, the present invention is not limited thereto. The control unit 180 may photograph the preview image 26 before the predetermined gesture is detected. The control unit 180 may display the photographed image on the touch screen 151. The control unit 180 can display the translated text near the text being translated in the photographed image.

According to another example, the screen 41 may be an execution screen of the translation application. In this case, when a specific input is applied to the screen 41, the controller 180 may display the temporarily stored preview image 26 or the captured image on the touch screen 151.

According to this, when there is translatable text in an image photographed while the photographing direction is downward, the recognized text can be translated according to the user's gesture, thereby improving user convenience.
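The gesture-triggered translation flow can be sketched as follows; the dictionary, the gesture name, and the overlay format are all illustrative assumptions standing in for the actual translation application:

```python
# A tiny stand-in for the translation application: the dictionary contents,
# gesture name, and overlay format are hypothetical.
DICTIONARY = {"야구": "baseball"}

def on_gesture(gesture, recognized_text, overlay):
    """Translate recognized text and append it to the display overlay when
    the preset gesture (here, rotating the screen toward the back of the
    hand) is sensed; other gestures are ignored."""
    if gesture != "rotate-to-back-of-hand":
        return
    translated = DICTIONARY.get(recognized_text)
    if translated:
        overlay.append(f"{recognized_text} -> {translated}")

screen = []
on_gesture("wrist-twist", "야구", screen)            # not the preset gesture
on_gesture("rotate-to-back-of-hand", "야구", screen)  # preset gesture: translate
print(screen)  # ['야구 -> baseball']
```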

FIGS. 15 and 16 are diagrams for explaining an alarm for guiding a photographing position set in advance in a state in which the photographing direction is the user direction according to an embodiment of the present invention.

Referring to FIG. 15, a preset photographing angle R2 is shown when the photographing direction of the camera 121 is the user direction. The photographing angle R2 may be an angle at which the user mainly photographs when the photographing direction of the camera 121 is the user direction. The control unit 180 may determine whether the mobile terminal 100 is located at the photographing angle R2 based on the movement of the mobile terminal 100 detected by the sensing unit 140.

The control unit 180 may output an alarm through the output unit 150 when the mobile terminal 100 is not positioned at the photographing angle R2, as shown in FIG. 15. For example, the alarm may be a vibration output through the haptic module 153. Alternatively, the alarm may be a sound output through the sound output unit 152. In this case, the controller 180 may output a voice alarm including information on the direction in which the mobile terminal 100 should be moved.

According to this, by outputting an alarm that guides the photographing angle of the camera 121 to a preset photographing angle while the photographing direction is the user direction, the user can take a picture of himself or herself at the desired photographing angle without checking the preview image.

Referring to FIG. 16, a preview image 27 in which only part of the face of the user a is displayed is shown on the touch screen 151 while the photographing direction of the camera 121 is the direction of the user a. The control unit 180 can recognize the face of the user a in the preview image 27. Face recognition in an image is performed according to known techniques, and a detailed description thereof is omitted here.

The control unit 180 can determine whether the face of the recognized user (a) is included in the preview image 27 at a predetermined ratio or more. According to one example, the predetermined ratio may be set to 50% of the face. However, the present invention is not limited thereto, and may be set differently according to need.

As shown in FIG. 16, when the face of the user a is included at less than the predetermined ratio, the control unit 180 may output an alarm that guides the user so that the face of the user a is included in the preview image 27 at the predetermined ratio or more.

According to an example, the alarm may be output differently depending on the direction in which the mobile terminal 100 is moved. For example, the controller 180 may output a different vibration according to a direction in which the mobile terminal 100 is to be moved. Alternatively, the control unit 180 may output a voice informing the direction in which the mobile terminal 100 is to be moved.

According to this, by outputting an alarm guiding the user's face to be included in the preview image at a predetermined ratio or more while the photographing direction is the user direction, the user can accurately photograph himself or herself without checking the preview image.
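The face-ratio check can be sketched with axis-aligned boxes. The box format (x, y, w, h) and the 50% threshold follow the example in the text; everything else, including the function name, is an assumption:

```python
def face_ratio_alarm(face_box, frame_w, frame_h, min_ratio=0.5):
    """Estimate how much of a detected face lies inside the preview frame
    and decide whether to alert the user to move the terminal."""
    x, y, w, h = face_box
    # Intersect the face box with the frame to get the visible portion
    vis_w = max(0, min(x + w, frame_w) - max(x, 0))
    vis_h = max(0, min(y + h, frame_h) - max(y, 0))
    visible = (vis_w * vis_h) / (w * h)
    return visible < min_ratio, visible

# Face partly off the right edge of a 100x100 preview
alarm, visible = face_ratio_alarm((75, 10, 60, 60), 100, 100)
print(alarm, round(visible, 2))  # True 0.42
```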

FIGS. 17A to 18B are views for explaining display of a user's image when a video call is connected, according to an embodiment of the present invention.

The control unit 180 may receive a video call request signal from the counterpart b via the wireless communication unit 110, or may transmit a video call request signal to the counterpart b. Then, the control unit 180 can connect a video call with the counterpart b. The control unit 180 may receive and display the image of the counterpart b for the video call, and may transmit the image of the user a to the counterpart b.

FIG. 17A shows a call connection screen displayed on the touch screen 151 when a video call request signal is received from the counterpart b. The user a can start the video call by selecting the call connection indicator p1. Alternatively, the user a may select the call rejection indicator p2 to reject the video call.

When the user selects the call connection indicator p1, the control unit 180 can display the received image of the counterpart b on the touch screen 151, as shown in FIG. 17B. In order to make a video call, an image of the user a obtained through the camera 121 must be transmitted to the counterpart b in real time.

However, in the case of the watch-type mobile terminal 100, it may be difficult to obtain the image of the user a in real time due to the position of the camera 121. That is, the user a checks the image of the counterpart b displayed on the touch screen 151 when making a video call, and in this case the photographing direction of the camera 121 faces the front of the user. Accordingly, as shown in FIG. 17B, the control unit 180 may display an alarm 50 prompting the capture of an image of the user a in order to transmit the image of the user a to the counterpart b.

Referring to FIG. 17C, the control unit 180 may sense the rotation of the mobile terminal 100 through the sensing unit 140. When the photographing direction of the camera 121 is changed to the direction of the user a according to the rotation of the mobile terminal 100, the control unit 180 can photograph the image of the user a a predetermined number of times. The predetermined number of times may be set differently as needed.

After the image of the user a is photographed, the controller 180 may sense whether the mobile terminal 100 is rotated back to a predetermined position. The predetermined position may be the position before the rotation, as shown in FIG. 17D. In this case, the control unit 180 may transmit the photographed images of the user a to the counterpart b.

According to an example, the control unit 180 may generate a moving image file in which the photographed images of the user a are displayed in sequence. The control unit 180 may transmit the generated moving image file to the counterpart b.

As shown in FIG. 17D, the control unit 180 may display the generated moving image file 52 and the image 51 of the counterpart b on the touch screen 151. The control unit 180 can repeatedly reproduce the generated moving image file 52 during the video call with the counterpart b.

Referring to FIG. 18A, the controller 180 may sense the gesture of the user a through the sensing unit 140. The control unit 180 may determine whether the sensed gesture is a predetermined user gesture. For example, as shown in FIG. 18A, when the operation of swinging the wrist up and down is the predetermined user gesture, the controller 180 may drive the camera 121.

In this case, according to an example, the controller 180 may replace the displayed moving image file 52 of the user with the preview image 53 obtained through the camera 121, as shown in FIG. 18B. According to another example, it may be replaced with an image photographed through the camera 121. In this case, the photographed image may be the most recently photographed image. Alternatively, the control unit 180 may display a screen through which an image photographed through the camera 121 can be selected.

According to an example, when the user takes the specific gesture again, the control unit 180 may restore the preview image 53 to the generated moving image file 52.

According to this, the user can conduct a video call through the watch-type mobile terminal by photographing the user's image according to the rotation of the mobile terminal 100 and transmitting it to the other party of the video call. In addition, during a video call, the user can transmit an image obtained through the camera to the other party with only a simple gesture.

FIGS. 19A to 21B are diagrams for explaining image shooting or moving image shooting in the watch-type mobile terminal 100 according to an embodiment of the present invention.

Referring to FIG. 19A, the power of the touch screen 151 is turned off. When the first touch input is received through the touch screen 151, the control unit 180 can drive the camera 121. The first touch input may be a drag input in the right direction, as shown in FIG. 19A. However, the present invention is not limited thereto. The first touch input may be set to various touches, such as a short touch, a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, or a swipe touch.

As shown in FIG. 19B, the control unit 180 may display the preview image 20 obtained through the camera 121 on the touch screen 151. The control unit 180 may also display the image photographing indicator 10 on the touch screen 151.

Referring to FIG. 20A, the first touch input may be a drag input in the left direction. In this case, as shown in FIG. 20B, the control unit 180 can display the moving image photographing indicator 11 on the touch screen 151 together with the preview image 20 acquired through the camera 121.

Referring to FIG. 21A, the controller 180 may receive a second touch input applied to the touch screen 151 in a state in which the preview image 20 and the image photographing indicator 10 are displayed. The second touch input may be a long touch input on one area of the touch screen 151, as shown in FIG. 21A. However, the present invention is not limited thereto. The second touch input may be set to various touches, such as a short touch, a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, or a swipe touch.

In this case, the control unit 180 may change the image shooting indicator 10 displayed on the touch screen 151 to the moving image shooting indicator 11, as shown in FIG. 21B. Similarly, when the moving image photographing indicator 11 is displayed and the second touch input is received, the controller 180 may change the moving image photographing indicator 11 to the image photographing indicator 10.

According to this, the preview image can be displayed with only a simple touch input while the power of the touch screen 151 is off, improving user convenience. In addition, it is possible to switch between image shooting and moving image shooting with a simple touch input, further improving user convenience.
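The indicator behavior across FIGS. 19A to 21B can be sketched as a small state machine; the class, state names, and touch handling are illustrative assumptions:

```python
class CaptureScreen:
    """State sketch: the first touch from the screen-off state picks the
    indicator (drag right: image, drag left: moving image); a second touch
    toggles between the two indicators."""
    def __init__(self):
        self.indicator = None  # screen off, camera not driven

    def first_touch(self, direction):
        self.indicator = "image" if direction == "right" else "video"

    def second_touch(self):
        if self.indicator:  # toggle only when a preview is showing
            self.indicator = "video" if self.indicator == "image" else "image"

s = CaptureScreen()
s.first_touch("right")   # drag right: preview + image photographing indicator
s.second_touch()         # long touch: switch to moving image indicator
print(s.indicator)  # video
```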

FIGS. 22A to 22C are diagrams for explaining display or sharing of recently photographed images according to an embodiment of the present invention.

Referring to FIG. 22A, a preview image 20 and an image photographing indicator 10 obtained through the camera 121 are displayed on the touch screen 151. The control unit 180 can receive a drag input in a predetermined direction in a state in which the preview image 20 is displayed. In FIG. 22A, the drag input in the predetermined direction is a drag input in the left direction. However, this is an example, and the predetermined direction may be set differently as needed.

In this case, as shown in FIG. 22B, the controller 180 can display, on the touch screen 151, the image 60 most recently captured through the camera 121. The user can continue to apply drag inputs in the left direction, as shown in FIG. 22B. According to an example, the controller 180 may display images on the touch screen 151 in order from the most recently photographed each time the drag input is applied.

When there are no more images to display in response to the drag input, the control unit 180 may display applications 61, 62, 63 for sharing the displayed image 60 on the touch screen 151, as shown in FIG. 22C. The user may transmit the displayed image 60 by applying a drag input in the direction of the desired application 62.

According to an example, the control unit 180 may perform the above-described operation in reverse according to a drag input in the direction opposite to the predetermined direction. For example, whenever the user applies a drag input in the right direction, the control unit 180 can display the images in the order in which they were photographed. In this case, the control unit 180 can finally display the original preview image 20 on the touch screen 151.

According to this, it is possible to check previously taken images or to share an image with a simple touch input, thereby improving user convenience.
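The drag navigation and sharing flow can be sketched as follows; the index convention (-1 for the live preview), names, and return values are illustrative assumptions:

```python
def on_drag(state, direction, images, share_apps):
    """Walk back through recently photographed images with left drags and
    surface share targets when the list is exhausted; right drags go the
    other way, ending at the live preview. `state` is an index into
    `images`, with -1 meaning the live preview is shown."""
    if direction == "left":
        if state + 1 < len(images):
            return state + 1, images[state + 1]
        return state, share_apps          # no older image: offer sharing
    if state >= 0:                        # drag right: step toward the preview
        state -= 1
    return state, images[state] if state >= 0 else "preview"

images = ["newest.jpg", "older.jpg"]
apps = ["msg", "mail", "cloud"]
state, shown = on_drag(-1, "left", images, apps)     # shows newest.jpg
state, shown = on_drag(state, "left", images, apps)  # shows older.jpg
state, shown = on_drag(state, "left", images, apps)  # list exhausted
print(shown)  # ['msg', 'mail', 'cloud']
```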

FIGS. 23A to 24B are diagrams for explaining maintaining the preview image displayed on the touch screen 151 and changing the area of the preview image displayed on the touch screen 151, according to an embodiment of the present invention.

Referring to FIG. 23A, the preview image 20 acquired by the camera 121 is displayed on the touch screen 151. It may be difficult to confirm the preview image 20 displayed on the touch screen 151, for example when the user tries to shoot an image with his or her arm outstretched. In this case, if the user wants to check the preview image 20 at a close distance, the user can apply a preset touch input to the touch screen 151.

According to an example, the preset touch input may be an input that maintains a state in which the touch screen 151 is touched, as shown in FIG. 23A. However, the present invention is not limited thereto. The preset touch input may be set to various touches, such as a short touch, a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, or a swipe touch.

Upon receiving the preset touch input, the controller 180 may temporarily store the preview image 20' acquired through the camera 121, as shown in FIG. 23B. Referring to FIG. 23B, only a partial area 20'' of the preview image 20' obtained from the camera is displayed on the touch screen 151. This is because the angle of view of the preview image 20' acquired by the camera 121 is larger than the angle of view displayed on the touch screen 151.

The control unit 180 can maintain the display of the temporarily stored preview image 20 even when the user pulls his or her arm toward the body, as shown in FIG. 23A. Then, the user can apply a touch input to the touch screen 151 while viewing the touch screen 151, as shown in FIG. 23B. When the touch input is applied, the controller 180 may move the area 20'' displayed on the touch screen 151 within the temporarily stored preview image 20', as shown in FIG. 23C.

For example, when an input to the left is applied, the area 20'' displayed on the touch screen 151 may move to the left within the temporarily stored preview image 20'. Thus, the user can change the area 20'' displayed on the touch screen 151 until a desired image appears on the touch screen 151.
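The panning behavior described above amounts to sliding a fixed-size viewport over the larger temporarily stored frame while clamping it to the frame boundary. A minimal sketch of that clamping logic (the function name, coordinate convention, and pixel sizes are illustrative assumptions, not taken from the patent):

```python
def pan_visible_region(full_size, view_size, offset, drag):
    """Slide the displayed window (area 20'') inside the larger stored frame (20').

    full_size: (w, h) of the temporarily stored preview image
    view_size: (w, h) of the region shown on the touch screen
    offset:    current top-left (x, y) of the visible window
    drag:      (dx, dy) of the user's drag; a leftward drag has negative dx
    Returns the new top-left, clamped so the window never leaves the frame.
    """
    fw, fh = full_size
    vw, vh = view_size
    x = min(max(offset[0] + drag[0], 0), fw - vw)
    y = min(max(offset[1] + drag[1], 0), fh - vh)
    return (x, y)
```

For instance, a drag that would push the window past the left edge simply leaves it pinned at x = 0, matching the behavior of moving the displayed area until the desired image appears.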

According to one example, as shown in FIG. 23D, when the user determines the area 20'' displayed on the touch screen 151, the user may attempt to photograph the determined area 20''. In this case, the control unit 180 may output an alarm that guides the camera 121 to be positioned at the position corresponding to the changed preview image 20''. The user can take an image when the camera 121 is placed in the corresponding position.

According to another example, the user may apply a preset touch input to directly store the area 20'' displayed on the touch screen 151. In this case, the control unit 180 can directly store the changed preview image 20'' in the memory 170.

Referring to FIG. 24A, the most recently photographed image 70 is displayed on the touch screen 151. After photographing the image 70, if the user does not like the photographed image, the user can apply a drag input in the left direction as shown in FIG. 24A. However, the present invention is not limited thereto, and another touch input may be set as needed.

In this case, the controller 180 may display a part of the temporarily stored preview image 20' on the touch screen 151. The user can change the area displayed on the touch screen 151 by applying a drag input. Subsequent operations are substantially the same as those described with reference to FIGS. 23A to 23D, and thus a detailed description thereof will be omitted.

In the above description, the photographing direction of the camera 121 is the forward direction of the user, but the present invention is not limited thereto. The operations of FIGS. 23A to 24B can be applied in substantially the same manner even when the photographing direction of the camera 121 is downward or toward the user.

According to this, the user can check the touch screen 151 while the preview image displayed on the touch screen 151 is maintained, thereby eliminating the inconvenience that the touch screen cannot be confirmed at the time of image capturing using the watch-type mobile terminal 100. Further, since the area of the preview image displayed on the touch screen 151 can be changed, the user can take an image at a desired angle.

FIGS. 25A and 25B are diagrams for explaining how to guide image capture using a communication-connected external device according to an embodiment of the present invention.

Referring to FIG. 25A, the controller 180 can communicate with the external device 200 through the wireless communication unit 110. The external device 200 may be an electronic device having a camera.

The controller 180 can acquire the position of the external device 200 when the camera 121 is driven by the user, as shown in FIG. 25A. The method of acquiring the information about the distance between the external device 200 and the mobile terminal 100 and the location of the external device 200 is based on a known method and will not be described in detail here.

As shown in FIG. 25A, when the external device 200 is within a predetermined distance d, the control unit 180 transmits a control signal for driving the camera of the external device 200 to the external device 200. Upon receiving the control signal, the control unit of the external device 200 can drive its camera and display a preview image on the touch screen 251.

If the external device 200 is farther than the predetermined distance d, as shown in FIG. 25B, the controller 180 may display a screen 80 indicating the position of the external device 200 on the touch screen 151.

According to this, when the camera 121 of the mobile terminal 100 is driven, the external device 200 equipped with a camera is presented to the user, so that the user can conveniently select a terminal for capturing an image.
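The decision described in FIGS. 25A and 25B is a simple per-device branch on distance: drive the remote camera when the device is close enough, otherwise show its position. A minimal sketch under that reading (names, units, and the return encoding are illustrative assumptions):

```python
def on_camera_launch(devices, threshold_d):
    """For each camera-equipped external device, decide whether to send a
    control signal that drives its camera ('drive_camera', cf. FIG. 25A) or
    to display its position on the watch screen ('show_position', cf. FIG. 25B).

    devices: list of (name, distance) pairs; distance in the same unit as
             threshold_d (e.g. meters).
    """
    actions = []
    for name, distance in devices:
        if distance <= threshold_d:
            actions.append((name, 'drive_camera'))   # within distance d
        else:
            actions.append((name, 'show_position'))  # too far: show screen 80
    return actions
```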

FIGS. 26A to 27D are diagrams for explaining capturing an image according to preset shooting conditions according to an embodiment of the present invention.

Referring to FIG. 26A, a user may be running while wearing the mobile terminal 100. The control unit 180 can recognize the user's situation through the sensing unit 140. For example, the user's location, current time, illuminance, weather, movement state, and the like can be recognized.

According to one example, the shooting condition may be preset as the user coming to a stop while in a running state. However, this is merely an example, and the photographing conditions may be set differently as needed. In this case, when the user stops running as shown in FIG. 26B, the controller 180 can determine that the preset shooting condition is satisfied and output an alarm.

The control unit 180 may sense the user's gesture through the sensing unit 140. If the user's gesture sensed after the output of the alarm is a preset gesture, the control unit 180 can shoot an image through the camera 121. For example, the preset gesture may be an operation of lifting the arm in a specific direction and positioning the camera, as shown in FIG. 26C.

According to an example, when the gesture of the user is sensed, the controller 180 may photograph an image a predetermined number of times. Alternatively, in accordance with another example, the control unit 180 may drive the camera 121 and display the preview image 20 and the image capture indicator on the touch screen 151.

A plurality of images photographed through the camera 121 may be stored in the memory 170 of the mobile terminal 100. The control unit 180 may set the photographing conditions based on the time, place, weather, illuminance, user's state, or objects included in the image when the plurality of images are photographed.

For example, if more than a predetermined number of images among the plurality of images were taken at a specific place, at a specific time, and at a specific illuminance, the controller 180 may set the specific place, the specific time, and the specific illuminance as the photographing conditions. That is, as shown in FIG. 27A, the preset photographing conditions may be satisfied when the user is in a coffee shop, the weather is clear, the surroundings are quiet, and the movement state is stopped. In this case, the control unit 180 can output an alarm notifying that the photographing conditions are satisfied. As shown in FIG. 27A, the control unit 180 may also display the shooting conditions on the touch screen 151.
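Deriving shooting conditions from stored photos, as described above, can be modeled as counting recurring combinations of capture-time metadata and registering any combination that appears at least a threshold number of times. A minimal sketch (the metadata field names and the tuple of conditions are illustrative assumptions):

```python
from collections import Counter

def derive_shooting_conditions(image_metadata, min_count):
    """Register (place, hour, illuminance band) combinations that recur in at
    least `min_count` stored photos as preset shooting conditions.

    image_metadata: list of dicts with 'place', 'hour', and 'illuminance_band'
    keys recorded when each image was photographed.
    """
    counts = Counter(
        (m['place'], m['hour'], m['illuminance_band']) for m in image_metadata
    )
    # Keep only combinations common to enough images.
    return [cond for cond, n in counts.items() if n >= min_count]
```

A fuller implementation would also fold in weather, quietness, and the user's motion state, as the paragraph above enumerates.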

Referring to FIG. 27B, after the alarm is output, the controller 180 may sense a specific gesture of the user and drive the camera 121. For example, when the user makes a gesture of lifting the arm, the control unit 180 may display the preview image 91 and the image shooting indicator 10 on the touch screen 151. However, the gesture is not limited to lifting the arm, and may be set differently as needed.

Referring to FIG. 27C, after the alarm is output, the controller 180 may detect a specific gesture of the user and terminate the alarm issued upon satisfaction of the photographing condition. For example, when the user makes a gesture of lowering the arm, the control unit 180 may display an alarm end screen 92 on the touch screen 151. However, the gesture is not limited to lowering the arm, and may be set differently as needed.

Referring to FIG. 27D, after the alarm is output, the controller 180 may detect a specific gesture of the user and postpone the alarm issued upon satisfaction of the photographing condition. For example, when the user makes a gesture of turning the wrist up and down, the control unit 180 may display the alarm reservation screen 93 on the touch screen 151. However, the gesture is not limited to turning the wrist, and may be set differently as needed. Through this, the user can have time to prepare for shooting when the shooting conditions are satisfied.
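FIGS. 27B to 27D together describe a small gesture-to-action dispatch after the alarm: lift the arm to start capture, lower it to dismiss, flip the wrist to postpone. A minimal sketch of that mapping (the gesture labels and action names are illustrative, not from the patent):

```python
def handle_alarm_gesture(gesture):
    """Map the gesture sensed after the shooting-condition alarm to an action
    (cf. FIGS. 27B-27D). Unknown gestures are ignored.
    """
    table = {
        'lift_arm':   'start_capture',  # show preview 91 + shooting indicator 10
        'lower_arm':  'dismiss_alarm',  # show alarm end screen 92
        'flip_wrist': 'snooze_alarm',   # show alarm reservation screen 93
    }
    return table.get(gesture, 'ignore')
```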

In the above description, it is assumed that the photographing conditions are set in accordance with a common condition among a plurality of images, but the present invention is not limited thereto. According to another example, the control unit 180 may provide a screen for setting the shooting conditions. Through this, the user can set the desired shooting conditions in advance.

According to this, by outputting an alarm when a preset shooting condition is satisfied, the user can easily take an image at a specific moment.

The present invention described above can be embodied as computer-readable codes on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data that can be read by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and the medium may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). Also, the computer may include the control unit 180 of the terminal. Accordingly, the above description should not be construed in a limiting sense in all respects and should be considered illustrative. The scope of the present invention should be determined by rational interpretation of the appended claims, and all changes within the scope of equivalents of the present invention are included in the scope of the present invention.

100: mobile terminal 110: wireless communication unit
120: input unit 140: sensing unit
150: output unit 160: interface unit
170: memory 180: control unit
190: Power supply

Claims (20)

A watch-type mobile terminal comprising:
a camera;
a touch screen;
a sensing unit configured to sense rotation of the mobile terminal; and
a control unit configured to check a photographing direction of the camera changed according to the sensed rotation, acquire a preview image in the checked photographing direction through the camera, display the acquired preview image on the touch screen, and correct the preview image according to a reference set to correspond to the checked photographing direction.
The mobile terminal of claim 1,
Wherein the set reference includes a reference value for a shooting angle, brightness, saturation, contrast, or sharpness of the preview image.
The mobile terminal of claim 1, further comprising a memory for storing a plurality of images photographed through the camera,
Wherein the control unit acquires a common feature from images photographed in the same photographing direction among the plurality of images and stores the common feature in the memory, and
Wherein the set reference includes the common feature.
The mobile terminal of claim 3,
Wherein the control unit updates the set reference based on the photographed image when the preview image is corrected differently from the set reference and then photographed.
The mobile terminal of claim 2,
Wherein the control unit outputs an alarm that guides the preview image to be positioned at the shooting angle when the mobile terminal moves.
The mobile terminal of claim 1, further comprising a wireless communication unit for receiving location information of the mobile terminal,
Wherein the control unit stores text when the received location information indicates a preset location and the text is recognized in an image photographed through the camera in a state in which the checked photographing direction is the front of the user, and removes the photographed image when the user leaves the preset location.
The mobile terminal of claim 1,
Wherein the control unit stores text when the text is recognized in an image taken through the camera at a predetermined angle or more in a state in which the checked photographing direction is the front of the user, and deletes the photographed image when a predetermined time elapses.
The mobile terminal of claim 1,
Wherein the control unit recognizes an object including information for executing an item in an image photographed through the camera in a state in which the checked photographing direction is downward and, when the mobile terminal is rotated to a predetermined position, executes the item and displays an execution screen on the touch screen.
The mobile terminal of claim 1,
Wherein the sensing unit senses a gesture of the user, and
Wherein the control unit recognizes translatable text from an image photographed through the camera in a state in which the checked photographing direction is downward and, when a predetermined gesture of the user is sensed, displays the translated text on the touch screen.
The mobile terminal of claim 1,
Wherein the control unit outputs an alarm that guides the user's face to be included in the preview image at a predetermined ratio or more when the user is recognized in the preview image acquired through the camera in a state in which the checked photographing direction is the user direction.
The mobile terminal of claim 1, further comprising a wireless communication unit for connecting a video call with a counterpart,
Wherein the control unit captures an image of the user when the photographing direction of the camera is changed to the user direction according to the rotation of the mobile terminal, transmits the captured image of the user to the counterpart when the mobile terminal rotates to a predetermined position, and displays the captured image of the user and an image of the counterpart on the touch screen.
The mobile terminal of claim 11,
Wherein the sensing unit senses a gesture of the user, and
Wherein the control unit replaces the displayed image of the user with a preview image obtained through the camera or an image photographed through the camera when a predetermined gesture of the user is sensed.
The mobile terminal of claim 1,
Wherein the control unit displays the preview image acquired through the camera on the touch screen when a first touch input is received while the touch screen is turned off, and displays an image capture indicator or a moving image capture indicator on the touch screen.
The mobile terminal of claim 13,
Wherein the control unit changes an indicator displayed on the touch screen to an indicator that is not displayed when a second touch input is applied to the touch screen, and displays the changed indicator.
The mobile terminal of claim 13,
Wherein the control unit displays an image photographed through the camera on the touch screen when a drag input in a predetermined direction is received while the preview image is displayed.
The mobile terminal of claim 1,
Wherein the control unit maintains the displayed preview image when receiving a preset touch input while displaying the preview image acquired by the camera and, when the area displayed on the touch screen among the acquired preview image is changed according to an input of the user, outputs an alarm to guide the camera to be positioned at a position corresponding to the changed preview image.
The mobile terminal of claim 1, further comprising a wireless communication unit for communicating with an external device,
Wherein the control unit transmits a control signal for driving a camera of the external device to the external device when the external device is within a predetermined distance when the camera is driven, and displays a screen indicating the position of the external device on the touch screen when the external device is farther than the predetermined distance.
The mobile terminal of claim 1,
Wherein the sensing unit senses a gesture of the user, and
Wherein the control unit outputs an alarm when a preset photographing condition is satisfied, and photographs an image through the camera according to the sensed gesture of the user.
The mobile terminal of claim 18, further comprising a memory for storing a plurality of images photographed through the camera,
Wherein the control unit sets the photographing condition based on a time, a place, weather, an illuminance, a state of the user, or an object included in the images when the plurality of images are photographed.
A control method of a watch-type mobile terminal, the method comprising:
sensing rotation of the mobile terminal;
checking a photographing direction of the camera changed according to the sensed rotation;
acquiring a preview image in the checked photographing direction through the camera;
displaying the acquired preview image on a touch screen; and
correcting the preview image according to a reference set to correspond to the checked photographing direction.
KR1020150096319A 2015-07-07 2015-07-07 Mobile terminal and method for controlling the same KR20170006014A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150096319A KR20170006014A (en) 2015-07-07 2015-07-07 Mobile terminal and method for controlling the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150096319A KR20170006014A (en) 2015-07-07 2015-07-07 Mobile terminal and method for controlling the same

Publications (1)

Publication Number Publication Date
KR20170006014A true KR20170006014A (en) 2017-01-17

Family

ID=57990532

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150096319A KR20170006014A (en) 2015-07-07 2015-07-07 Mobile terminal and method for controlling the same

Country Status (1)

Country Link
KR (1) KR20170006014A (en)


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019107981A1 (en) * 2017-11-29 2019-06-06 Samsung Electronics Co., Ltd. Electronic device recognizing text in image
CN108632579A (en) * 2018-05-29 2018-10-09 广东小天才科技有限公司 A kind of alarm method, device, intelligent wearable device and storage medium
US11196908B2 (en) 2019-05-23 2021-12-07 Samsung Electronics Co., Ltd. Electronic device having camera module capable of switching line of sight and method for recording video
CN114200817A (en) * 2021-12-09 2022-03-18 歌尔科技有限公司 Photographing system and intelligent wearable device
WO2023102976A1 (en) * 2021-12-09 2023-06-15 歌尔股份有限公司 Photographing system and smart wearable device
CN115457559A (en) * 2022-08-19 2022-12-09 上海通办信息服务有限公司 Method, device and equipment for intelligently correcting text and license pictures
CN115457559B (en) * 2022-08-19 2024-01-16 上海通办信息服务有限公司 Method, device and equipment for intelligently correcting texts and license pictures
