WO2015174612A1 - Mobile terminal and control method thereof - Google Patents
- Publication number
- WO2015174612A1 (PCT/KR2015/000962)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- mobile terminal
- image data
- camera
- area
- command
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- H04N23/62—Control of parameters via user interfaces
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
- H04L51/08—Annexed information, e.g. attachments
- H04L51/52—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail for supporting social networking services
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72439—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
- H04M1/72469—User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
- H04N1/00307—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a telecommunication apparatus, e.g. a switched network of teleprinters for the distribution of text-based information, a selective call terminal with a mobile telephone apparatus
- H04N1/00424—Arrangements for navigating between pages or parts of the menu using a list of graphical elements, e.g. icons or icon bar
- H04N1/0044—Display of information to the user, e.g. menus for image preview or review, e.g. to help the user position a sheet
- H04N1/00445—Simultaneous viewing of a plurality of images, e.g. using a mosaic display arrangement of thumbnails arranged in a one dimensional array
- H04N1/00469—Display of information to the user, e.g. menus with enlargement of a selected area of the displayed information
- H04N1/00472—Display of information to the user, e.g. menus using a pop-up window
- H04N23/45—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N5/2628—Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
- H04M2250/52—Details of telephonic subscriber devices including functional features of a camera
- H04N2201/0084—Digital still camera
- H04N2201/0089—Image display device
- H04N2201/0096—Portable devices
Description
- The present invention relates to a mobile terminal and a method of controlling the same, which enable use of the terminal with greater consideration for user convenience.
- Terminals may be divided into mobile / portable terminals and stationary terminals according to their mobility.
- the mobile terminal may be further classified into a handheld terminal and a vehicle mounted terminal according to whether a user can directly carry it.
- The functions of mobile terminals are diversifying; examples include data and voice communication, capturing photos and video with a camera, recording voice, playing music files through a speaker system, and outputting images or video to a display unit.
- Some terminals have an electronic game play function or a multimedia player function.
- recent mobile terminals may receive multicast signals that provide visual content such as broadcasting, video, and television programs.
- Such a terminal is being implemented in the form of a multimedia player having complex functions such as taking pictures or video, playing music or video files, playing games, and receiving broadcasts.
- Recently, use of social network services (SNS) has spread, and users of such online communities engage not only in the exchange of text data but also in social exchange and information sharing through the exchange of various multimedia contents.
- The multimedia content to be exchanged may be pre-stored content, or content captured on the spot through a camera provided in the mobile terminal. Accordingly, there is a demand for a control method that is more convenient for the user in creating content on the fly using the camera and transmitting the created content.
- To achieve the above or another object, according to one aspect of the present invention, a mobile terminal including a first camera, and a control method of the mobile terminal based on size, are provided.
- When the transmitted multimedia content is an image photographed through a camera on the fly, the capacity of the captured image data can be easily controlled.
- FIG. 1A is a block diagram illustrating a mobile terminal related to the present invention.
- FIGS. 1B and 1C are conceptual views of one example of a mobile terminal, viewed from different directions.
- FIG. 2 is a flowchart illustrating a control method of easily attaching image data photographed through a camera according to an embodiment of the present invention.
- FIG. 3 is a state diagram illustrating the control method of easily attaching image data photographed through a camera, according to an exemplary embodiment.
- FIGS. 4 and 5 illustrate examples of a user command for outputting a camera preview popup window 303 according to an exemplary embodiment of the present invention.
- FIGS. 6 to 9 are diagrams illustrating a control method of controlling the preview popup window 303 according to one embodiment of the present invention.
- FIG. 10 is a diagram for a control method of performing an automatic crop operation based on the position and / or size of the preview pop-up window 303 according to one embodiment of the present invention.
- FIG. 11 is a diagram for a control method of arranging and storing image data attached to a predetermined application in a separate folder according to one embodiment of the present invention.
- The mobile terminal described herein may include a mobile phone, a smartphone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and a wearable device (for example, a smartwatch, smart glasses, or a head mounted display (HMD)).
- FIG. 1A is a block diagram illustrating a mobile terminal according to the present invention
- FIGS. 1B and 1C are conceptual views of one example of the mobile terminal, viewed from different directions.
- The mobile terminal 100 may include a wireless communication unit 110, an input unit 120, a sensing unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180, and a power supply unit 190.
- the components shown in FIG. 1A are not essential to implementing a mobile terminal, so that the mobile terminal described herein may have more or fewer components than those listed above.
- Among these components, the wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and an external server.
- the wireless communication unit 110 may include one or more modules for connecting the mobile terminal 100 to one or more networks.
- The wireless communication unit 110 may include at least one of the broadcast receiving module 111, the mobile communication module 112, the wireless internet module 113, the short range communication module 114, and the location information module 115.
- The input unit 120 may include a camera 121 or an image input unit for inputting an image signal, a microphone 122 or an audio input unit for inputting an audio signal, and a user input unit 123 (for example, touch keys and mechanical keys) for receiving information from a user.
- the voice data or the image data collected by the input unit 120 may be analyzed and processed as a control command of the user.
- the sensing unit 140 may include one or more sensors for sensing at least one of information in the mobile terminal, surrounding environment information surrounding the mobile terminal, and user information.
- For example, the sensing unit 140 may include a proximity sensor 141, an illumination sensor 142, a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor, an optical sensor (e.g., the camera 121), a microphone (see 122), a battery gauge, an environmental sensor, and the like.
- the mobile terminal disclosed herein may use a combination of information sensed by at least two or more of these sensors.
- The output unit 150 is used to generate output related to sight, hearing, or the tactile sense, and may include at least one of a display unit 151, an audio output unit 152, a haptic module 153, and an optical output unit 154.
- the display unit 151 forms a layer structure with or is integrally formed with the touch sensor, thereby implementing a touch screen.
- the touch screen may function as a user input unit 123 that provides an input interface between the mobile terminal 100 and the user, and may also provide an output interface between the mobile terminal 100 and the user.
- the interface unit 160 serves as a path to various types of external devices connected to the mobile terminal 100.
- The interface unit 160 may include at least one of a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video I/O port, and an earphone port.
- the memory 170 stores data supporting various functions of the mobile terminal 100.
- the memory 170 may store a plurality of application programs or applications driven in the mobile terminal 100, data for operating the mobile terminal 100, and instructions. At least some of these applications may be downloaded from an external server via wireless communication.
- At least some of these application programs may exist on the mobile terminal 100 from the time of shipment for basic functions of the mobile terminal 100 (for example, receiving and placing calls, and receiving and sending messages).
- the application program may be stored in the memory 170 and installed on the mobile terminal 100 to be driven by the controller 180 to perform an operation (or function) of the mobile terminal.
- In addition to operations related to the application programs, the controller 180 typically controls the overall operation of the mobile terminal 100.
- the controller 180 may provide or process information or a function appropriate to a user by processing signals, data, information, and the like, which are input or output through the above-described components, or by driving an application program stored in the memory 170.
- controller 180 may control at least some of the components described with reference to FIG. 1A in order to drive an application program stored in the memory 170. Furthermore, the controller 180 may operate by combining at least two or more of the components included in the mobile terminal 100 to drive the application program.
- the power supply unit 190 receives power from an external power source and an internal power source under the control of the controller 180 to supply power to each component included in the mobile terminal 100.
- the power supply unit 190 includes a battery, which may be a built-in battery or a replaceable battery.
- At least some of the components may operate in cooperation with each other to implement an operation, control, or control method of the mobile terminal according to various embodiments described below.
- the operation, control, or control method of the mobile terminal may be implemented on the mobile terminal by driving at least one application program stored in the memory 170.
- the broadcast receiving module 111 of the wireless communication unit 110 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.
- the broadcast channel may include a satellite channel and a terrestrial channel.
- Two or more broadcast receiving modules may be provided to the mobile terminal 100 for simultaneous broadcast reception or switching of broadcast channels for at least two broadcast channels.
- The mobile communication module 112 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network established according to technical standards or communication schemes for mobile communication (for example, Global System for Mobile communication (GSM), Code Division Multi Access (CDMA), CDMA2000, Enhanced Voice-Data Optimized or Enhanced Voice-Data Only (EV-DO), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), and Long Term Evolution-Advanced (LTE-A)).
- The wireless signal may include various types of data according to transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.
- the wireless internet module 113 refers to a module for wireless internet access and may be embedded or external to the mobile terminal 100.
- the wireless internet module 113 is configured to transmit and receive wireless signals in a communication network according to wireless internet technologies.
- Wireless Internet technologies include, for example, Wireless LAN (WLAN), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), and World Interoperability for Microwave Access (WiMAX).
- From this viewpoint, the wireless internet module 113 performing wireless internet access through the mobile communication network may be understood as a kind of mobile communication module 112.
- The short range communication module 114 is for short range communication and may support short-range communication using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, and Wireless Universal Serial Bus (Wireless USB) technologies.
- The short range communication module 114 may support, through wireless area networks, wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and a network in which another mobile terminal 100 (or an external server) is located.
- The short range wireless communication networks may be wireless personal area networks.
- Here, the other mobile terminal 100 may be a wearable device capable of exchanging (or interworking) data with the mobile terminal 100 according to the present invention, for example, a smartwatch, smart glasses, or a head mounted display (HMD).
- the short range communication module 114 may sense (or recognize) a wearable device that can communicate with the mobile terminal 100, around the mobile terminal 100.
- Furthermore, the controller 180 may transmit at least a portion of the data processed by the mobile terminal 100 to the wearable device through the short range communication module 114. Accordingly, a user of the wearable device may use the data processed by the mobile terminal 100 through the wearable device. For example, when a call is received by the mobile terminal 100, the user may perform a phone call through the wearable device, and when a message is received by the mobile terminal 100, the user may check the received message through the wearable device.
- the location information module 115 is a module for obtaining a location (or current location) of a mobile terminal, and a representative example thereof is a Global Positioning System (GPS) module or a Wireless Fidelity (WiFi) module.
- For example, when the GPS module is utilized, the mobile terminal may acquire its location using a signal transmitted from a GPS satellite.
- As another example, when the Wi-Fi module is utilized, the mobile terminal may acquire its location based on information of a wireless access point (AP) that transmits or receives a wireless signal to or from the Wi-Fi module.
- If necessary, the location information module 115 may perform a function of another module of the wireless communication unit 110 in order to obtain, substitutively or additionally, data regarding the location of the mobile terminal.
- the location information module 115 is a module used to obtain the location (or current location) of the mobile terminal, and is not limited to a module that directly calculates or obtains the location of the mobile terminal.
- the input unit 120 is for inputting image information (or signal), audio information (or signal), data, or information input from a user.
- For input of image information, the mobile terminal 100 may be provided with one or a plurality of cameras 121.
- the camera 121 processes image frames such as still images or moving images obtained by the image sensor in the video call mode or the photographing mode.
- the processed image frame may be displayed on the display unit 151 or stored in the memory 170.
- The plurality of cameras 121 provided in the mobile terminal 100 may be arranged to form a matrix structure, and through the cameras 121 forming the matrix structure, a plurality of pieces of image information having various angles or focal points may be input to the mobile terminal 100.
- the plurality of cameras 121 may be arranged in a stereo structure to acquire a left image and a right image for implementing a stereoscopic image.
- the microphone 122 processes external sound signals into electrical voice data.
- the processed voice data may be variously used according to a function (or an application program being executed) performed by the mobile terminal 100. Meanwhile, various noise reduction algorithms may be implemented in the microphone 122 to remove noise generated in the process of receiving an external sound signal.
- The user input unit 123 is for receiving information from a user. When information is input through the user input unit 123, the controller 180 may control the operation of the mobile terminal 100 to correspond to the input information.
- The user input unit 123 may include a mechanical input means (or a mechanical key, for example, a button, a dome switch, a jog wheel, or a jog switch located at the front, rear, or side of the mobile terminal 100) and a touch input means.
- As an example, the touch input means may include a virtual key, a soft key, or a visual key displayed on the touch screen through software processing, or a touch key disposed on a portion other than the touch screen.
- The virtual key or the visual key may be displayed on the touch screen in various forms, for example, as a graphic, text, an icon, a video, or a combination thereof.
- the sensing unit 140 senses at least one of information in the mobile terminal, surrounding environment information surrounding the mobile terminal, and user information, and generates a sensing signal corresponding thereto.
- the controller 180 may control driving or operation of the mobile terminal 100 or perform data processing, function or operation related to an application program installed in the mobile terminal 100 based on the sensing signal. Representative sensors among various sensors that may be included in the sensing unit 140 will be described in more detail.
- the proximity sensor 141 refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface or an object present in the vicinity without using a mechanical contact by using an electromagnetic force or infrared rays.
- the proximity sensor 141 may be disposed in an inner region of the mobile terminal covered by the touch screen described above or near the touch screen.
- Examples of the proximity sensor 141 include a transmission photoelectric sensor, a direct reflection photoelectric sensor, a mirror reflection photoelectric sensor, a high frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.
- the proximity sensor 141 may be configured to detect the proximity of the object by the change of the electric field according to the proximity of the conductive object.
- the touch screen (or touch sensor) itself may be classified as a proximity sensor.
- The proximity sensor 141 may detect a proximity touch and a proximity touch pattern (for example, a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, and a proximity touch movement state).
- The controller 180 processes data (or information) corresponding to the proximity touch operation and the proximity touch pattern detected through the proximity sensor 141, and may further output visual information corresponding to the processed data on the touch screen. In addition, the controller 180 may control the mobile terminal 100 so that different operations or data (or information) are processed according to whether the touch applied to the same point on the touch screen is a proximity touch or a contact touch.
- The touch sensor detects a touch (or touch input) applied to the touch screen (or the display unit 151) using at least one of various touch methods, such as a resistive method, a capacitive method, an infrared method, an ultrasonic method, and a magnetic field method.
- the touch sensor may be configured to convert a change in pressure applied to a specific portion of the touch screen or capacitance generated at the specific portion into an electrical input signal.
- The touch sensor may be configured to detect the position and area at which a touch object touches the touch screen, the pressure at the touch, the capacitance at the touch, and the like.
- the touch object is an object applying a touch to the touch sensor and may be, for example, a finger, a touch pen or a stylus pen, a pointer, or the like.
- When there is a touch input to the touch sensor, corresponding signal(s) are sent to a touch controller; the touch controller processes the signal(s) and then transmits corresponding data to the controller 180.
- the controller 180 can know which area of the display unit 151 is touched.
- the touch controller may be a separate component from the controller 180 or may be the controller 180 itself.
- The controller 180 may perform different control or the same control according to the type of touch object that touches the touch screen (or a touch key provided in addition to the touch screen). Whether to perform different control or the same control according to the type of touch object may be determined according to the operation state of the mobile terminal 100 or an application program being executed.
- The touch sensor and the proximity sensor described above may be used independently or in combination to sense various types of touches on the touch screen, such as a short (or tap) touch, a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.
- the ultrasonic sensor may recognize location information of a sensing object using ultrasonic waves.
- the controller 180 can calculate the position of the wave generation source through the information detected from the optical sensor and the plurality of ultrasonic sensors.
- The position of the wave generation source can be calculated using the property that light is much faster than ultrasonic waves, that is, the time for light to reach the optical sensor is much shorter than the time for an ultrasonic wave to reach the ultrasonic sensor. More specifically, the position of the wave generation source may be calculated from the time difference between the arrival of the ultrasonic wave and the arrival of the light, with the light serving as the reference signal.
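- For illustration only (this sketch is not part of the patent disclosure), the distance calculation can be expressed in a few lines, assuming the speed of sound in air and nanosecond timestamps from the optical and ultrasonic sensors:

```kotlin
// Hypothetical sketch: light arrival is treated as instantaneous and used as the
// reference signal, so the source distance follows from the ultrasound delay alone.
const val SPEED_OF_SOUND_M_PER_S = 343.0 // in air at roughly 20 °C (assumed)

fun distanceToSource(lightArrivalNs: Long, ultrasoundArrivalNs: Long): Double {
    val deltaSeconds = (ultrasoundArrivalNs - lightArrivalNs) / 1e9
    return SPEED_OF_SOUND_M_PER_S * deltaSeconds // metres
}

fun main() {
    // Ultrasound arriving 2.9 ms after the light pulse corresponds to roughly 1 m.
    println(distanceToSource(0L, 2_900_000L))
}
```
- With several ultrasonic sensors, combining such distances allows the position of the source, not only its range, to be estimated.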
- Meanwhile, the camera 121, described above as a component of the input unit 120, includes at least one of a camera sensor (for example, CCD or CMOS), a photo sensor (or image sensor), and a laser sensor.
- the camera 121 and the laser sensor may be combined with each other to detect a touch of a sensing object with respect to a 3D stereoscopic image.
- The photo sensor may be stacked on the display element and is configured to scan the movement of a sensing object in proximity to the touch screen. More specifically, the photo sensor mounts photo diodes and transistors (TR) in rows and columns and scans the content placed on the photo sensor using an electrical signal that varies according to the amount of light applied to the photo diodes. That is, the photo sensor calculates the coordinates of the sensing object according to the change in the amount of light, and thereby position information of the sensing object can be obtained.
- the display unit 151 displays (outputs) information processed by the mobile terminal 100.
- The display unit 151 may display execution screen information of an application program driven in the mobile terminal 100, or user interface (UI) and graphical user interface (GUI) information according to the execution screen information.
- the display unit 151 may be configured as a stereoscopic display unit for displaying a stereoscopic image.
- The stereoscopic display unit may employ a three-dimensional display scheme such as a stereoscopic scheme (glasses scheme), an auto-stereoscopic scheme (glasses-free scheme), or a projection scheme (holographic scheme).
- the sound output unit 152 may output audio data received from the wireless communication unit 110 or stored in the memory 170 in a call signal reception, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, and the like.
- the sound output unit 152 may also output a sound signal related to a function (for example, a call signal reception sound or a message reception sound) performed in the mobile terminal 100.
- the sound output unit 152 may include a receiver, a speaker, a buzzer, and the like.
- the haptic module 153 generates various haptic effects that a user can feel.
- a representative example of the tactile effect generated by the haptic module 153 may be vibration.
- the intensity and pattern of vibration generated by the haptic module 153 may be controlled by the user's selection or the setting of the controller. For example, the haptic module 153 may synthesize different vibrations and output or sequentially output them.
- In addition to vibration, the haptic module 153 may generate various tactile effects, such as effects from a pin arrangement moving vertically against the skin surface, a jetting or suction force of air through a jet or suction opening, grazing of the skin surface, contact with an electrode, and electrostatic force, as well as effects that reproduce a sense of cold or warmth using an element capable of absorbing or generating heat.
- The haptic module 153 may not only deliver a tactile effect through direct contact, but may also allow the user to feel the tactile effect through the muscle sense of a finger or an arm. Two or more haptic modules 153 may be provided according to the configuration of the mobile terminal 100.
- the light output unit 154 outputs a signal for notifying occurrence of an event by using light of a light source of the mobile terminal 100.
- Examples of events occurring in the mobile terminal 100 may be message reception, call signal reception, missed call, alarm, schedule notification, email reception, information reception through an application, and the like.
- the signal output from the light output unit 154 is implemented as the mobile terminal emits light of a single color or a plurality of colors to the front or the rear.
- the signal output may be terminated by the mobile terminal detecting the user's event confirmation.
- the interface unit 160 serves as a path to all external devices connected to the mobile terminal 100.
- the interface unit 160 receives data from an external device, receives power, transfers the power to each component inside the mobile terminal 100, or transmits data inside the mobile terminal 100 to an external device.
- For example, the interface unit 160 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like.
- The identification module is a chip that stores a variety of information for authenticating usage rights of the mobile terminal 100, and may include a user identity module (UIM), a subscriber identity module (SIM), and a universal subscriber identity module (USIM).
- a device equipped with an identification module (hereinafter referred to as an 'identification device') may be manufactured in the form of a smart card. Therefore, the identification device may be connected to the terminal 100 through the interface unit 160.
- When the mobile terminal 100 is connected to an external cradle, the interface unit 160 may serve as a passage through which power from the cradle is supplied to the mobile terminal 100, or as a passage through which various command signals input by the user at the cradle are transferred to the mobile terminal 100.
- The various command signals or power input from the cradle may operate as signals for recognizing that the mobile terminal 100 is correctly mounted on the cradle.
- the memory 170 may store a program for the operation of the controller 180 and may temporarily store input / output data (for example, a phone book, a message, a still image, a video, etc.).
- the memory 170 may store data regarding vibration and sound of various patterns output when a touch input on the touch screen is performed.
- The memory 170 may include at least one type of storage medium among a flash memory type, a hard disk type, a solid state disk (SSD) type, a silicon disk drive (SDD) type, a multimedia card micro type, a card-type memory (for example, SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk.
- the mobile terminal 100 may be operated in connection with a web storage that performs a storage function of the memory 170 on the Internet.
- the controller 180 controls the operation related to the application program, and generally the overall operation of the mobile terminal 100. For example, if the state of the mobile terminal satisfies a set condition, the controller 180 may execute or release a lock state that restricts input of a user's control command to applications.
- The controller 180 may perform control and processing related to voice calls, data communication, video calls, and the like, or may perform pattern recognition processing for recognizing handwriting input or drawing input performed on the touch screen as text and images, respectively. Furthermore, the controller 180 may control any one or a combination of the components described above in order to implement the various embodiments described below on the mobile terminal 100 according to the present invention.
- the power supply unit 190 receives an external power source and an internal power source under the control of the controller 180 to supply power for operation of each component.
- the power supply unit 190 includes a battery, and the battery may be a built-in battery configured to be rechargeable, and may be detachably coupled to the terminal body for charging.
- The power supply unit 190 may be provided with a connection port, and the connection port may be configured as one example of the interface unit 160 to which an external charger supplying power for charging the battery is electrically connected.
- the power supply unit 190 may be configured to charge the battery in a wireless manner without using the connection port.
- In this case, the power supply unit 190 may receive power from an external wireless power transmitter using one or more of an inductive coupling method based on magnetic induction and a magnetic resonance coupling method based on electromagnetic resonance.
- various embodiments of the present disclosure may be implemented in a recording medium readable by a computer or a similar device using, for example, software, hardware, or a combination thereof.
- FIG. 2 is a flowchart illustrating a control method of easily attaching image data photographed through a camera according to an embodiment of the present invention.
- FIG. 3 is a state diagram illustrating the control method of easily attaching image data photographed through a camera, according to an exemplary embodiment.
- Hereinafter, a description will be given with reference to FIGS. 2 and 3.
- the controller 180 outputs an execution screen of a predetermined application through the touch screen.
- the mobile terminal 100 outputs an execution screen of an SNS (Social Network Service) application (hereinafter, a message transmission / reception application) capable of transmitting and receiving messages as an example of the execution screen.
- the message transceiving application is an application capable of transceiving text, image and / or voice data with at least one receiving counterpart, and may transmit and receive such data using the wireless communication unit 110.
- The execution screen of the message transmission/reception application will be described with reference to the execution state diagram of FIG. 3.
- the user of the mobile terminal 100 is in a state of transmitting and receiving a message with a counterpart terminal called "Jane".
- This execution screen includes a transmission/reception recording area 300 in which messages that have already been transmitted and received are displayed.
- The left-aligned message 302-1 is a message (hereinafter, a received message) that the mobile terminal 100 has received from the counterpart terminal, and the right-aligned messages 301-1 to 301-3 are messages (hereinafter, transmitted messages) that the mobile terminal 100 has sent to the counterpart terminal.
- Hereinafter, transmitted messages are denoted by 301-1, 301-2, ..., and received messages are denoted by 302-1, 302-2, ....
- the controller 180 may display a preview screen of the camera in one region of the outputted execution screen.
- the preview screen is output in the form of a popup window 303 (hereinafter, a preview popup window).
- the preview popup window 303 is output to one region on the execution screen of the predetermined application.
- In one embodiment of the present invention, it is proposed that the preview popup window 303 be output as if it is always floating on the touch screen 151, regardless of the output of execution screens of other applications.
- To this end, it is proposed that the layer controlling the output of the preview popup window 303 be implemented at the framework stage rather than at the application stage. Therefore, even while the preview popup window 303 is floating, the execution screen of another application may continue to operate (output of information and/or input of touch gestures, etc.) regardless of the output of the preview popup window 303.
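- For illustration only, this floating, application-independent behavior resembles what an Android system overlay window provides; the following is a minimal sketch, assuming Android's WindowManager API and the SYSTEM_ALERT_WINDOW permission (the patent itself does not specify an implementation):

```kotlin
import android.content.Context
import android.graphics.PixelFormat
import android.view.Gravity
import android.view.View
import android.view.WindowManager

// Hypothetical sketch: add a camera preview view as an overlay that floats above
// other applications, while leaving key focus with the underlying application.
fun addPreviewOverlay(context: Context, previewView: View): WindowManager.LayoutParams {
    val wm = context.getSystemService(Context.WINDOW_SERVICE) as WindowManager
    val params = WindowManager.LayoutParams(
        480, 640,                                             // initial popup size in px (assumed)
        WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,  // framework-level overlay layer
        WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,        // underlying app keeps key focus
        PixelFormat.TRANSLUCENT
    )
    params.gravity = Gravity.TOP or Gravity.START
    wm.addView(previewView, params)
    return params
}
```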
- a method for easily attaching image data on a given application is proposed.
- Generally, in order to attach image data in this way, the screen is switched to a camera preview screen and output while the execution screen of the application is being output; after the picture is taken, the screen is switched back to the execution screen of the originally output application, and the photographed image data is attached.
- In this case, the application that was being executed before the camera was activated may be executed in the background for a while, or the output of its execution screen may be temporarily stopped.
- In contrast, one embodiment of the present invention proposes to output a preview screen of the camera in one region of the execution screen of the application being output, while that execution screen remains output. Furthermore, when a predetermined command is received, it is proposed to photograph an image through the camera and attach the photographed image data directly to the application.
- In step S203, the controller 180 waits to receive an attachment command from the user.
- The attachment command refers to a command from the user to photograph an image through the activated camera and attach the photographed image data directly to the application. If the attachment command is not received in step S203, the controller 180 may return to step S202. If the attachment command is received in step S203, the controller 180 proceeds to step S204.
- the controller 180 may attach image data photographed through the camera onto the predetermined application.
- the controller 180 may control the wireless communication unit 110 to directly transmit the attached image data. That is, according to this embodiment, when the user wants to transmit the photographed image data to the message receiving counterpart, the user can attach and transmit the image simply and easily using one attach command (immediately attached).
- In another embodiment, it is proposed that, in step S204, in response to reception of the attachment command, a popup window asking whether to transmit the photographed image data is output, and the photographed image data is attached when the user inputs a confirmation command through the popup window. If the user does not want to transmit the captured image data, the attachment may be canceled through the popup window (attachment after confirmation).
- the control unit 180 may combine the two embodiments to distinguish between the operation of attaching immediately and the operation of attaching after confirmation through a user command. That is, when the first command is received, the controller 180 may attach the photographed image data directly to the application after capturing through the camera. When the second command is received, the controller 180 may output a pop-up window for attaching image data photographed through the camera, and attach the photographed image data to the application when a confirmation command is received through the output pop-up window.
- An example of the first command is a double touch (two short touches within a predetermined time) received on the preview popup window 303, and an example of the second command is a short touch received on the preview popup window 303.
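- For illustration only, such a distinction between the first and second commands could be sketched with Android's GestureDetector; the callbacks attachImmediately and showConfirmationPopup are hypothetical placeholders, not names from the patent:

```kotlin
import android.content.Context
import android.view.GestureDetector
import android.view.MotionEvent
import android.view.View

// Hypothetical sketch: a double tap on the preview popup attaches the captured image
// immediately, while a single tap first opens a confirmation popup.
fun installAttachGestures(
    context: Context,
    previewPopup: View,
    attachImmediately: () -> Unit,
    showConfirmationPopup: () -> Unit
) {
    val detector = GestureDetector(context, object : GestureDetector.SimpleOnGestureListener() {
        override fun onDoubleTap(e: MotionEvent): Boolean {
            attachImmediately()        // first command: capture and attach at once
            return true
        }
        override fun onSingleTapConfirmed(e: MotionEvent): Boolean {
            showConfirmationPopup()    // second command: ask before attaching
            return true
        }
    })
    previewPopup.setOnTouchListener { _, event -> detector.onTouchEvent(event) }
}
```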
- the controller 180 may change the size of the preview pop-up window 303 in response to a user's command or a predetermined condition.
- For example, when the size of the preview popup window 303 is a first size, the controller 180 may convert the data size of the image data to be attached into a first data size, and when the size of the preview popup window 303 is a second size, the controller 180 may convert the data size of the image data to be attached into a second data size.
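- For illustration only, a minimal sketch of tying the attached data size to the popup size, assuming the captured picture is available as an Android Bitmap:

```kotlin
import android.graphics.Bitmap

// Hypothetical sketch: scale the captured bitmap to the popup's on-screen dimensions,
// so a larger popup yields a larger attached resolution (and thus a larger data size).
fun scaleForAttachment(captured: Bitmap, popupWidthPx: Int, popupHeightPx: Int): Bitmap =
    Bitmap.createScaledBitmap(captured, popupWidthPx, popupHeightPx, true)
```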
- FIG. 3 (c) shows an execution screen to which the photographed image data has been attached. Referring to FIG. 3 (c), the attached image data is transmitted, and a transmission message 301-4 with the image data attached is displayed in the transmission/reception recording area 300.
- the message transmission / reception application has been described as an example, but the present invention is not limited to this example and may be applied to various types of applications.
- an embodiment of the present invention may be applied to an application in which a copy operation on the image data is possible and a paste operation on the copied image data is possible.
- For example, the controller 180 may take a picture and directly attach the photographed image data as an image of a predetermined address book (or contact) entry.
- Furthermore, when a face is recognized in the photographed image data, the controller 180 may search the address book based on the recognized face and provide the result to the user.
- the controller 180 may take a picture and store image data of the taken picture as image data corresponding to a predetermined position on the map.
- the image data corresponding to the predetermined position refers to an image displayed in the form of a pin at the predetermined position when the user later views the map again.
- the controller 180 may attach the photographed image data to an email created to the reception counterpart.
- the controller 180 may attach (store) the photographed image data in the form of clip image data for the playback time.
- the controller 180 can control the photographed image data to be stored in the gallery application.
- the controller 180 may change the storage path so that the captured image data is stored in the corresponding folder.
- the controller 180 may search for an image in the gallery application using the photographed image data and provide a search result. In this case, when a face is included in the captured image data, an image including the face may be searched or may be searched using the location information where the image data is captured.
- the controller 180 can take a picture through a camera and attach image data of the taken picture directly on the memo application.
- In this case, the attached image data may be treated as a single character and deleted by a key button (backspace) input for erasing characters.
- the controller 180 can take a picture through a camera and add image data of the taken picture to a schedule.
- As another example, the controller 180 may set the photographed image data as a cover picture of the music currently being played.
- the music and / or album may be searched using the recognized album image.
- the controller 180 outputs a preview popup window 303 of the camera.
- the preview pop-up window 303 may be output by the controller 180 under a predetermined condition, but in one embodiment of the present invention, it is proposed to be output by a user command.
- An example of a user command for outputting the preview popup window 303 will be described with reference to FIGS. 4 and 5.
- FIGS. 4 and 5 illustrate examples of a user command for outputting a camera preview popup window 303 according to an exemplary embodiment of the present invention.
- The home screen may be defined as the screen first displayed on the touch screen 151 when the lock state of the touch screen 151 is released, and one or more icons or widgets for executing an application or an internal function may be displayed on the home screen.
- the mobile terminal 100 may include not only one but also two or more home screens, and in this case, two or more home screens may be sequentially displayed one by one when a predetermined touch gesture is performed on the touch screen 151. Different icons (widgets) may be disposed on each of the home screens.
- the controller 180 outputs, on the home screen, an icon 401 (hereinafter, a quick shot icon) for outputting the camera preview popup window 303, and the preview popup window 303 may be output in response to an input 10a selecting the quick shot icon 401.
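Purely as an illustration (the disclosure contains no source code), this behavior could be sketched in an Android-style UI as a click listener that reveals the preview popup; the names `bindQuickShotIcon`, `quickShotIcon`, `previewPopup`, and `startCameraPreview` are hypothetical:

```kotlin
import android.view.View

// Minimal illustrative sketch (not from the disclosure): selecting the quick shot
// icon (401) makes the camera preview popup window (303) visible over the screen.
fun bindQuickShotIcon(quickShotIcon: View, previewPopup: View, startCameraPreview: (View) -> Unit) {
    // Input 10a: a tap on the quick shot icon outputs the preview popup window.
    quickShotIcon.setOnClickListener {
        previewPopup.visibility = View.VISIBLE
        startCameraPreview(previewPopup)   // bind the camera preview into the popup
    }
}
```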
- the controller 180 outputs an execution screen of a text message transmission application through the touch screen 151.
- the controller 180 may, in response to a predetermined input received from the user, output the preview pop-up window 303 through a function that outputs at least one execution icon (the Quick Launcher function, see FIG. 5B).
- the user input for calling the quick launcher function may be an input of touching the home button 10b and dragging 10c in a predetermined direction while maintaining the touch.
- the controller 180 may output at least one icon including the quick photographing icon 501 as illustrated in FIG. 5B.
- the controller 180 can output the preview pop-up window 303 as shown in FIG. 5C.
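As a hedged sketch only, the touch-and-drag input that calls the quick launcher could be modeled as a simple threshold check on the drag distance; the helper name, the launcher row view, and the 150 px threshold are assumptions, not part of the disclosure:

```kotlin
import android.view.MotionEvent
import android.view.View

// Illustrative sketch only: touching the home button (10b) and dragging (10c)
// past a threshold reveals the quick launcher row, which includes the quick
// photographing icon (501).
fun enableQuickLauncherGesture(homeButton: View, quickLauncherRow: View, thresholdPx: Float = 150f) {
    var downY = 0f
    homeButton.setOnTouchListener { _, event ->
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> { downY = event.rawY; true }
            MotionEvent.ACTION_MOVE -> {
                // Show the launcher icons once the drag distance exceeds the threshold.
                if (downY - event.rawY > thresholdPx) quickLauncherRow.visibility = View.VISIBLE
                true
            }
            else -> false
        }
    }
}
```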
- FIGS. 6 to 9 are diagrams illustrating a control method of controlling the preview pop-up window 303 according to one embodiment of the present invention.
- for the preview popup window 303, it is proposed to adjust its position, size, and/or transparency. This is because the preview popup window 303 is output in one region on the execution screen of another application and thus may partially obstruct the output of that execution screen.
- a control method for easily switching between a plurality of cameras is also proposed.
- a preview popup window 303 is output on an execution screen of a message transceiving application.
- when a movement command is received, the controller 180 may control the touch screen 151 to change the position of the preview pop-up window 303 being output and to display it at the changed position.
- the movement command may be an input of receiving a touch 10e on the output preview pop-up window 303 and dragging 10f to a desired position while maintaining the touch 10e.
- FIG. 6B illustrates a state diagram in which the preview popup window 303 of FIG. 6A is moved and output in response to the movement command.
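A minimal, hypothetical sketch of such a movement command in an Android-style touch listener (the function name is an assumption) is the following, which moves the popup by the finger displacement while the touch is held:

```kotlin
import android.view.MotionEvent
import android.view.View

// Illustrative sketch only: the movement command (touch 10e, then drag 10f)
// repositions the preview popup window (303) by the drag delta.
fun enableDragToMove(popup: View) {
    var lastX = 0f
    var lastY = 0f
    popup.setOnTouchListener { v, event ->
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> { lastX = event.rawX; lastY = event.rawY; true }
            MotionEvent.ACTION_MOVE -> {
                v.translationX += event.rawX - lastX   // move by the horizontal drag delta
                v.translationY += event.rawY - lastY   // move by the vertical drag delta
                lastX = event.rawX
                lastY = event.rawY
                true
            }
            else -> false
        }
    }
}
```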
- a preview popup window 303 is output on an execution screen of a message transceiving application.
- when a resizing command is received, the controller 180 may control the touch screen 151 to change the size of the preview popup window 303 being output and to display it at the changed size.
- the resizing command may be an input of touching 10g one edge of the output preview pop-up window 303 and then dragging 10h to a desired position while maintaining the touch 10g.
- FIG. 7B illustrates a state diagram in which the size of the preview popup window 303 of FIG. 7A is adjusted and output in response to the size adjustment command.
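For illustration, the edge-drag resize could be sketched as below, assuming a dedicated resize handle view and a minimum size; these names and values are hypothetical, not taken from the disclosure:

```kotlin
import android.view.MotionEvent
import android.view.View

// Illustrative sketch only: the resizing command (touch 10g on one edge, then
// drag 10h) changes the popup's layout size.
fun enableEdgeResize(popup: View, resizeHandle: View, minSizePx: Int = 200) {
    var downX = 0f
    var downY = 0f
    var startWidth = 0
    var startHeight = 0
    resizeHandle.setOnTouchListener { _, event ->
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> {
                downX = event.rawX; downY = event.rawY
                startWidth = popup.width; startHeight = popup.height
                true
            }
            MotionEvent.ACTION_MOVE -> {
                val params = popup.layoutParams
                params.width = (startWidth + (event.rawX - downX)).toInt().coerceAtLeast(minSizePx)
                params.height = (startHeight + (event.rawY - downY)).toInt().coerceAtLeast(minSizePx)
                popup.layoutParams = params   // trigger a re-layout with the new size
                true
            }
            else -> false
        }
    }
}
```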
- a preview popup window 303 is output on an execution screen of a message transceiving application.
- when a camera switching command is received, the controller 180 may control the preview screen output in the preview pop-up window 303 to be switched to the preview screen of another camera. That is, when the camera switching command is received while the preview screen 801-1 of the first camera is being output, the controller 180 may control the preview screen 801-2 of the second camera to be output instead.
- An example of the camera switching command may be an input of flicking 10j and 10k on the output preview pop-up window 303.
- FIG. 8B illustrates a state diagram in which the camera whose preview is output in the preview popup window 303 of FIG. 8A is switched to another camera in response to the camera switching command.
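A hedged sketch of detecting such a flick and toggling the camera follows; the time and distance thresholds and the callback name are assumptions made only for illustration:

```kotlin
import android.view.MotionEvent
import android.view.View
import kotlin.math.abs

// Illustrative sketch only: a quick horizontal flick (10j, 10k) on the popup
// toggles the preview between the first camera (801-1) and the second (801-2).
fun enableFlickToSwitchCamera(popup: View, switchToOtherCamera: () -> Unit) {
    var downX = 0f
    var downTime = 0L
    popup.setOnTouchListener { _, event ->
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> { downX = event.rawX; downTime = event.eventTime; true }
            MotionEvent.ACTION_UP -> {
                val quick = event.eventTime - downTime < 300L      // short gesture
                val far = abs(event.rawX - downX) > 100f           // enough horizontal travel
                if (quick && far) switchToOtherCamera()            // rebind preview to the other camera
                true
            }
            else -> false
        }
    }
}
```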
- a preview popup window 303 is output on an execution screen of a message transceiving application.
- when a transparency control command is received, the controller 180 may control the touch screen 151 to change the transparency of the preview pop-up window 303 being output.
- An example of the transparency control command may include an input for adjusting the transparency control bar.
- a transparency control bar and a transparency control object 901 are further output, and it is proposed to control the transparency by moving the position of the transparency control object 901 on the transparency control bar.
- an example of a control command for controlling the position of the transparency control object 901 may be an input of touching the output transparency control object 901 and dragging it to a desired position while maintaining the touch.
- FIG. 9B illustrates a state diagram in which the transparency of the preview pop-up window 303 of FIG. 9A is adjusted and output in response to the transparency control command.
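As a sketch under stated assumptions (modeling the transparency control bar as a SeekBar and keeping a small minimum opacity), the mapping from the control object's position to the popup's alpha could look like this:

```kotlin
import android.view.View
import android.widget.SeekBar

// Illustrative sketch only: moving the transparency control object (901) along
// the transparency control bar is mapped to the popup's alpha value.
fun bindTransparencyBar(bar: SeekBar, popup: View) {
    bar.setOnSeekBarChangeListener(object : SeekBar.OnSeekBarChangeListener {
        override fun onProgressChanged(seekBar: SeekBar, progress: Int, fromUser: Boolean) {
            val fraction = progress.toFloat() / seekBar.max
            popup.alpha = 1f - 0.8f * fraction   // more progress -> more transparent, never fully invisible
        }
        override fun onStartTrackingTouch(seekBar: SeekBar) {}
        override fun onStopTrackingTouch(seekBar: SeekBar) {}
    })
}
```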
- the image data captured by the camera is automatically cropped according to the position and / or size of the preview pop-up window 303.
- the cropping of an image refers to an operation of separately designating (selecting) a partial region of a whole image desired by a user (or determined by a predetermined condition). That is, the user may store and / or use only the image data of the region designated by the cropping operation in the entire image.
- FIG. 10 is a diagram for a control method of performing an automatic crop operation based on the position and / or size of the preview pop-up window 303 according to one embodiment of the present invention.
- a preview popup window 303 is output on an execution screen of a message transceiving application.
- the preview pop-up window 303 is output in only one region of the application's execution screen (or of the entire touch screen output region).
- it is proposed to crop, from the entire camera image received through the camera, the part that corresponds to the output area of the preview pop-up window 303 within the entire touch screen output area. That is, as shown in FIG. 10B, the entire touch screen output area corresponds to the entire camera image 1001, and the first automatic crop image 1002-1 may be generated to correspond to the output area of the preview pop-up window 303.
- FIG. 10C illustrates a state diagram in which the position of the preview popup window 303 has been changed.
- an image may be automatically captured by the camera, and the second captured image 1002-2 may be generated by cropping the entire photographed camera image 1001.
- the second auto cropped image 1002-2 is cropped to correspond to the position of the moved preview popup window 303.
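The proportional mapping between the popup's on-screen area and the crop region of the full camera image can be summarized, purely as an illustrative sketch (the function `autoCrop` and its parameters are hypothetical, not part of the disclosure), as follows:

```kotlin
import android.graphics.Bitmap
import android.graphics.Rect

// Illustrative sketch only: the full touch screen area is treated as corresponding
// to the whole camera image (1001), and the screen region occupied by the preview
// popup window (303) is mapped proportionally into that image to produce the
// auto-cropped image (1002-1 or, after the popup moves, 1002-2).
fun autoCrop(fullImage: Bitmap, popupOnScreen: Rect, screenWidth: Int, screenHeight: Int): Bitmap {
    // Scale factors from screen coordinates to camera-image coordinates.
    val sx = fullImage.width.toFloat() / screenWidth
    val sy = fullImage.height.toFloat() / screenHeight

    val left = (popupOnScreen.left * sx).toInt().coerceIn(0, fullImage.width - 1)
    val top = (popupOnScreen.top * sy).toInt().coerceIn(0, fullImage.height - 1)
    val width = (popupOnScreen.width() * sx).toInt().coerceIn(1, fullImage.width - left)
    val height = (popupOnScreen.height() * sy).toInt().coerceIn(1, fullImage.height - top)

    // Keep only the region of the full image that corresponds to the popup's output area.
    return Bitmap.createBitmap(fullImage, left, top, width, height)
}
```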
- the image data attached (transmitted) by the control method according to an embodiment of the present invention may be controlled to be deleted immediately without being stored.
- the captured image data can be temporarily stored for attachment to the application.
- it is intended to quickly capture and attach (and transmit).
- it is proposed that the attached (transmitted) image data be stored in a separate folder (so that it can be identified) in the Gallery application. This embodiment will be described with reference to FIG. 11.
- FIG. 11 is a diagram for a control method of arranging and storing image data attached to a predetermined application in a separate folder according to one embodiment of the present invention.
- in FIG. 11A, an execution state diagram of the gallery application is shown, which includes three image folders ("camera roll", "panorama", and "Quick Photo").
- it is proposed that the controller 180 arrange and store the image data attached (transmitted) according to the above-described embodiment in the "Quick Photo" folder 1101.
- the folder name "Quick Photo" is merely an example, and the present invention is not limited to this example.
- FIG. 11 (b) is a state diagram of entering the “Quick Photo” folder 1101, and a list 1100 of image data attached (transmitted) is output according to the above-described embodiment.
- the list 1100 includes three image data items 1102-1 through 1102-3. Each image data item may display the type of application to which it was attached, the date and time of attachment, and/or location information at the time of attachment.
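A minimal sketch of storing a copy of the attached image under such a separate folder follows; the base directory and file-naming scheme are assumptions made only for illustration:

```kotlin
import java.io.File

// Illustrative sketch only: a copy of the attached (transmitted) image data is
// written under a dedicated "Quick Photo" directory so the gallery can list these
// images separately, as in folder 1101.
fun storeInQuickPhotoFolder(imageBytes: ByteArray, picturesDir: File): File {
    val quickPhotoDir = File(picturesDir, "Quick Photo").apply { mkdirs() }
    val outFile = File(quickPhotoDir, "quick_${System.currentTimeMillis()}.jpg")
    outFile.writeBytes(imageBytes)   // persist the image so it appears in the separate folder
    return outFile
}
```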
- the present invention described above can be embodied as computer readable codes on a medium in which a program is recorded.
- the computer-readable medium includes all kinds of recording devices in which data that can be read by a computer system is stored. Examples of computer-readable media include hard disk drives (HDDs), solid state disks (SSDs), silicon disk drives (SDDs), ROMs, RAMs, CD-ROMs, magnetic tapes, floppy disks, optical data storage devices, and the like. This also includes implementations in the form of carrier waves (e.g., transmission over the Internet).
- the computer may include the controller 180 of the terminal. Accordingly, the above detailed description should not be construed as limiting in all aspects and should be considered as illustrative. The scope of the invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the invention are included in the scope of the invention.
Landscapes
- Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Signal Processing (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Computing Systems (AREA)
- Business, Economics & Management (AREA)
- General Business, Economics & Management (AREA)
- User Interface Of Digital Computer (AREA)
- Telephone Function (AREA)
Priority Applications (11)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP17154023.0A EP3185645B1 (en) | 2014-05-13 | 2015-01-29 | Mobile terminal and method for controlling the same |
| EP19174580.1A EP3565377B1 (en) | 2014-05-13 | 2015-01-29 | Mobile terminal and method for controlling the same |
| EP20181941.4A EP3780898B1 (en) | 2014-05-13 | 2015-01-29 | Mobile terminal and method for controlling the same |
| US14/648,169 US9621792B2 (en) | 2014-05-13 | 2015-01-29 | Mobile terminal and method for controlling the same |
| ES15753305T ES2739889T3 (es) | 2014-05-13 | 2015-01-29 | Terminal móvil y método para controlar el mismo |
| CN201580000191.4A CN105264874B (zh) | 2014-05-13 | 2015-01-29 | 移动终端及其控制方法 |
| EP15753305.0A EP2988568B1 (en) | 2014-05-13 | 2015-01-29 | Mobile terminal and control method therefor |
| US15/462,691 US9942469B2 (en) | 2014-05-13 | 2017-03-17 | Mobile terminal and method for controlling the same |
| US15/913,746 US10419660B2 (en) | 2014-05-13 | 2018-03-06 | Mobile terminal and method for controlling the same |
| US16/432,635 US10659678B2 (en) | 2014-05-13 | 2019-06-05 | Mobile terminal and method for controlling the same |
| US16/849,300 US10863080B2 (en) | 2014-05-13 | 2020-04-15 | Mobile terminal and method for controlling the same |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR10-2014-0057095 | 2014-05-13 | ||
| KR1020140057095A KR102105961B1 (ko) | 2014-05-13 | 2014-05-13 | 이동단말기 및 그 제어방법 |
Related Child Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/648,169 A-371-Of-International US9621792B2 (en) | 2014-05-13 | 2015-01-29 | Mobile terminal and method for controlling the same |
| US15/462,691 Continuation US9942469B2 (en) | 2014-05-13 | 2017-03-17 | Mobile terminal and method for controlling the same |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2015174612A1 true WO2015174612A1 (ko) | 2015-11-19 |
Family
ID=54395004
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2015/000962 Ceased WO2015174612A1 (ko) | 2014-05-13 | 2015-01-29 | 이동단말기 및 그 제어방법 |
Country Status (8)
| Country | Link |
|---|---|
| US (5) | US9621792B2 |
| EP (4) | EP3565377B1 |
| KR (1) | KR102105961B1 |
| CN (2) | CN108933865B |
| DE (1) | DE202015009611U1 |
| ES (3) | ES2746332T3 |
| FR (1) | FR3021135A1 |
| WO (1) | WO2015174612A1 |
Families Citing this family (21)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9954996B2 (en) | 2007-06-28 | 2018-04-24 | Apple Inc. | Portable electronic device with conversation management for incoming instant messages |
| KR101999137B1 (ko) * | 2013-01-03 | 2019-07-11 | 삼성전자주식회사 | 카메라를 구비하는 장치의 이미지 처리장치 및 방법 |
| KR102105961B1 (ko) | 2014-05-13 | 2020-05-28 | 엘지전자 주식회사 | 이동단말기 및 그 제어방법 |
| US9207835B1 (en) * | 2014-05-31 | 2015-12-08 | Apple Inc. | Message user interfaces for capture and transmittal of media and location content |
| US9886953B2 (en) | 2015-03-08 | 2018-02-06 | Apple Inc. | Virtual assistant activation |
| US20170024086A1 (en) * | 2015-06-23 | 2017-01-26 | Jamdeo Canada Ltd. | System and methods for detection and handling of focus elements |
| US10003938B2 (en) | 2015-08-14 | 2018-06-19 | Apple Inc. | Easy location sharing |
| US20180152622A1 (en) * | 2015-12-01 | 2018-05-31 | Huizhou Tcl Mobile Communication Co., Ltd | Mobile terminal-based photographing method and mobile terminal |
| US9766803B2 (en) * | 2015-12-31 | 2017-09-19 | Futurewei Technologies, Inc. | Mobile device camera viewfinder punch through effect |
| DK179831B1 (en) * | 2016-05-18 | 2019-07-22 | Apple Inc. | Devices, methods and graphical user interfaces for messaging |
| KR102338357B1 (ko) | 2016-05-18 | 2021-12-13 | 애플 인크. | 그래픽 메시징 사용자 인터페이스 내의 확인응답 옵션들의 적용 |
| US11513677B2 (en) | 2016-05-18 | 2022-11-29 | Apple Inc. | Devices, methods, and graphical user interfaces for messaging |
| US10368208B2 (en) | 2016-06-12 | 2019-07-30 | Apple Inc. | Layers in messaging applications |
| CN106775386B (zh) * | 2016-11-30 | 2019-11-15 | 努比亚技术有限公司 | 无边框移动终端及其触控方法 |
| WO2019074244A1 (en) | 2017-10-09 | 2019-04-18 | Samsung Electronics Co., Ltd. | METHOD AND ELECTRONIC DEVICE FOR AUTOMATICALLY MANAGING THE ACTIVITIES OF AN APPLICATION |
| US11157130B2 (en) * | 2018-02-26 | 2021-10-26 | Adobe Inc. | Cursor-based resizing for copied image portions |
| CN112135016B (zh) * | 2019-06-25 | 2022-07-08 | 北京小米移动软件有限公司 | 拍摄装置及移动终端 |
| US11252274B2 (en) * | 2019-09-30 | 2022-02-15 | Snap Inc. | Messaging application sticker extensions |
| CN110677586B (zh) * | 2019-10-09 | 2021-06-25 | Oppo广东移动通信有限公司 | 图像显示方法、图像显示装置及移动终端 |
| CN111050109B (zh) * | 2019-12-24 | 2021-09-17 | 维沃移动通信有限公司 | 电子设备控制方法及电子设备 |
| US11079913B1 (en) | 2020-05-11 | 2021-08-03 | Apple Inc. | User interface for status indicators |
Family Cites Families (21)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2005012764A (ja) * | 2003-05-22 | 2005-01-13 | Casio Comput Co Ltd | データ通信装置、画像送信方法および画像送信プログラム |
| US20050146631A1 (en) * | 2004-01-07 | 2005-07-07 | Shelton Michael J. | In-camera cropping to standard photo sizes |
| US7782384B2 (en) * | 2004-11-05 | 2010-08-24 | Kelly Douglas J | Digital camera having system for digital image composition and related method |
| JP4926601B2 (ja) * | 2005-10-28 | 2012-05-09 | キヤノン株式会社 | 映像配信システム、クライアント端末及びその制御方法 |
| US8237807B2 (en) * | 2008-07-24 | 2012-08-07 | Apple Inc. | Image capturing device with touch screen for adjusting camera settings |
| KR20100028344A (ko) * | 2008-09-04 | 2010-03-12 | 삼성전자주식회사 | 휴대단말의 영상 편집 방법 및 장치 |
| EP2207342B1 (en) * | 2009-01-07 | 2017-12-06 | LG Electronics Inc. | Mobile terminal and camera image control method thereof |
| KR101527037B1 (ko) * | 2009-06-23 | 2015-06-16 | 엘지전자 주식회사 | 이동 단말기 및 그 제어방법 |
| KR101761613B1 (ko) * | 2010-10-04 | 2017-07-26 | 엘지전자 주식회사 | 이동 단말기 및 이것의 영상 송신 방법 |
| KR101706952B1 (ko) * | 2010-12-29 | 2017-02-15 | 엘지전자 주식회사 | 표시장치 및 객체의 위치 표시방법 |
| JP5360106B2 (ja) * | 2011-03-25 | 2013-12-04 | ブラザー工業株式会社 | 情報処理プログラム、情報処理装置、および情報処理方法 |
| WO2013169851A2 (en) * | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for facilitating user interaction with controls in a user interface |
| EP2664983A3 (en) * | 2012-05-17 | 2018-01-03 | LG Electronics, Inc. | Mobile terminal and control method therefor |
| US20130314558A1 (en) * | 2012-05-24 | 2013-11-28 | Mediatek Inc. | Image capture device for starting specific action in advance when determining that specific action is about to be triggered and related image capture method thereof |
| KR101952684B1 (ko) * | 2012-08-16 | 2019-02-27 | 엘지전자 주식회사 | 이동 단말기 및 이의 제어 방법, 이를 위한 기록 매체 |
| KR101935039B1 (ko) * | 2012-09-11 | 2019-01-03 | 엘지전자 주식회사 | 이동 단말기 및 이동 단말기의 제어 방법 |
| KR20140038759A (ko) * | 2012-09-21 | 2014-03-31 | 삼성전자주식회사 | 이미지를 전송하기 위한 방법 및 그 전자 장치 |
| KR101545883B1 (ko) * | 2012-10-30 | 2015-08-20 | 삼성전자주식회사 | 단말의 카메라 제어 방법 및 그 단말 |
| KR102138516B1 (ko) * | 2013-10-11 | 2020-07-28 | 엘지전자 주식회사 | 이동 단말기 및 그 제어방법 |
| KR20150113572A (ko) * | 2014-03-31 | 2015-10-08 | 삼성전자주식회사 | 영상데이터를 획득하는 전자장치 및 방법 |
| KR102105961B1 (ko) | 2014-05-13 | 2020-05-28 | 엘지전자 주식회사 | 이동단말기 및 그 제어방법 |
- 2014
  - 2014-05-13 KR KR1020140057095A patent/KR102105961B1/ko active Active
- 2015
  - 2015-01-29 EP EP19174580.1A patent/EP3565377B1/en active Active
  - 2015-01-29 EP EP20181941.4A patent/EP3780898B1/en active Active
  - 2015-01-29 CN CN201810796868.5A patent/CN108933865B/zh active Active
  - 2015-01-29 ES ES17154023T patent/ES2746332T3/es active Active
  - 2015-01-29 US US14/648,169 patent/US9621792B2/en active Active
  - 2015-01-29 ES ES19174580T patent/ES2824799T3/es active Active
  - 2015-01-29 WO PCT/KR2015/000962 patent/WO2015174612A1/ko not_active Ceased
  - 2015-01-29 DE DE202015009611.5U patent/DE202015009611U1/de not_active Expired - Lifetime
  - 2015-01-29 EP EP17154023.0A patent/EP3185645B1/en active Active
  - 2015-01-29 EP EP15753305.0A patent/EP2988568B1/en active Active
  - 2015-01-29 ES ES15753305T patent/ES2739889T3/es active Active
  - 2015-01-29 CN CN201580000191.4A patent/CN105264874B/zh active Active
  - 2015-05-11 FR FR1554202A patent/FR3021135A1/fr not_active Withdrawn
- 2017
  - 2017-03-17 US US15/462,691 patent/US9942469B2/en active Active
- 2018
  - 2018-03-06 US US15/913,746 patent/US10419660B2/en active Active
- 2019
  - 2019-06-05 US US16/432,635 patent/US10659678B2/en active Active
- 2020
  - 2020-04-15 US US16/849,300 patent/US10863080B2/en active Active
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20120009581A (ko) * | 2010-07-19 | 2012-02-02 | 엘지전자 주식회사 | 이동 단말기 및 그 제어방법 |
| KR20120089919A (ko) * | 2010-12-23 | 2012-08-16 | 엘지전자 주식회사 | 휴대 단말기 및 그 동작 제어방법 |
| KR20130109466A (ko) * | 2012-03-27 | 2013-10-08 | 엘지전자 주식회사 | 이동 단말기 |
| KR20140039737A (ko) * | 2012-09-25 | 2014-04-02 | 삼성전자주식회사 | 이미지를 전송하기 위한 방법 및 그 전자 장치 |
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP2988568A4 * |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN107437051A (zh) * | 2016-05-26 | 2017-12-05 | 上海市公安局刑事侦查总队 | 图像处理方法及装置 |
Also Published As
| Publication number | Publication date |
|---|---|
| US10863080B2 (en) | 2020-12-08 |
| EP3780898A1 (en) | 2021-02-17 |
| US20160261790A1 (en) | 2016-09-08 |
| ES2739889T3 (es) | 2020-02-04 |
| KR102105961B1 (ko) | 2020-05-28 |
| CN105264874A (zh) | 2016-01-20 |
| US9942469B2 (en) | 2018-04-10 |
| ES2746332T3 (es) | 2020-03-05 |
| US20170195555A1 (en) | 2017-07-06 |
| US9621792B2 (en) | 2017-04-11 |
| ES2824799T3 (es) | 2021-05-13 |
| US20180198974A1 (en) | 2018-07-12 |
| CN108933865A (zh) | 2018-12-04 |
| EP2988568A1 (en) | 2016-02-24 |
| CN105264874B (zh) | 2018-08-14 |
| US20200244870A1 (en) | 2020-07-30 |
| US10659678B2 (en) | 2020-05-19 |
| EP2988568A4 (en) | 2016-07-06 |
| US20190313011A1 (en) | 2019-10-10 |
| EP3780898B1 (en) | 2025-03-26 |
| EP3565377B1 (en) | 2020-07-22 |
| EP2988568B1 (en) | 2019-05-15 |
| FR3021135A1 | 2015-11-20 |
| KR20150130053A (ko) | 2015-11-23 |
| CN108933865B (zh) | 2021-07-23 |
| EP3185645A1 (en) | 2017-06-28 |
| EP3185645B1 (en) | 2019-06-26 |
| EP3565377A1 (en) | 2019-11-06 |
| US10419660B2 (en) | 2019-09-17 |
| DE202015009611U1 (de) | 2018-08-06 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2015174612A1 (ko) | 이동단말기 및 그 제어방법 | |
| WO2017099306A1 (en) | Flexible display device | |
| KR20170016215A (ko) | 이동단말기 및 그 제어방법 | |
| KR20170016165A (ko) | 이동단말기 및 그 제어방법 | |
| KR20160092877A (ko) | 이동 단말기 및 그 제어방법 | |
| WO2021006372A1 (ko) | 이동 단말기 | |
| WO2021006371A1 (ko) | 이동 단말기 | |
| WO2016085139A1 (ko) | 이동단말기 및 그 제어방법 | |
| KR20180017638A (ko) | 이동단말기 및 그 제어방법 | |
| KR101667452B1 (ko) | 이동 단말기 | |
| WO2020050432A1 (ko) | 이동 단말기 | |
| WO2015163548A1 (ko) | 이동단말기 및 그 제어방법 | |
| KR20170027165A (ko) | 이동 단말기 및 그 제어방법 | |
| KR102232429B1 (ko) | 이동단말기 및 그 제어방법 | |
| WO2016021744A1 (ko) | 이동단말기 및 그 제어방법 | |
| WO2018070627A1 (ko) | 단말기 및 그 제어 방법 | |
| WO2018128199A1 (ko) | 와치타입 단말기 | |
| KR20160039516A (ko) | 이동단말기 및 그 제어방법 | |
| KR101670752B1 (ko) | 이동단말기 및 그 제어방법 | |
| WO2016010169A1 (ko) | 이동 단말기 및 그 제어 방법 | |
| KR20160031341A (ko) | 이동 단말기 및 그 제어 방법 | |
| WO2017119522A1 (ko) | 이동 단말기 및 와치타입 단말기의 제어방법 | |
| KR20170025230A (ko) | 이동단말기 및 그 제어방법 | |
| KR20160004002A (ko) | 이동 단말기 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | WWE | Wipo information: entry into national phase | Ref document number: 201580000191.4; Country of ref document: CN |
| | WWE | Wipo information: entry into national phase | Ref document number: 14648169; Country of ref document: US |
| | WWE | Wipo information: entry into national phase | Ref document number: 2015753305; Country of ref document: EP |
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15753305; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |