CN105100388A - Mobile terminal and method of controlling the same - Google Patents


Info

Publication number
CN105100388A
CN105100388A
Authority
CN
China
Prior art keywords
touch
image
preview image
controller
mobile terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410652555.4A
Other languages
Chinese (zh)
Other versions
CN105100388B (en)
Inventor
赵敬敏
全星翼
宋玟婀
金灿洙
朴书用
李政铉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
LG Electronics Inc
Original Assignee
LG Electronics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc filed Critical LG Electronics Inc
Publication of CN105100388A
Application granted
Publication of CN105100388B
Expired - Fee Related
Anticipated expiration


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547Touch pads, in which fingers can move on a surface
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412Digitisers structurally integrated in a display
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/61Control of cameras or camera modules based on recognised objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/62Control of parameters via user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/633Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H04N23/635Region indicators; Field of view indicators
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/675Focus control based on electronic image sensor signals comprising setting of focusing regions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/765Interface circuits between an apparatus for recording and another apparatus
    • H04N5/77Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
    • H04N5/772Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera the recording apparatus and the television camera being placed in the same enclosure
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04101 2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04104Multi-touch detection in digitiser, i.e. details about the simultaneous detection of a plurality of touching locations, e.g. multiple fingers or pen and finger
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2101/00Still video cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A mobile terminal including a wireless communication unit configured to perform wireless communication; a camera configured to obtain an image; a display unit configured to display a preview image obtained through the camera; and a controller configured to control the display unit to operate in any one of a first state in which a graphic object relating to an image capturing function is displayed overlapping the preview image and a second state in which the graphic object is not displayed while the preview image is displayed, based on a user request, and when a first pre-set type of touch is sensed in a region on which the preview image is displayed in the second state, control the camera to capture the preview image based on the pre-set type of touch.

Description

Mobile terminal and control method thereof
Technical field
The present invention relates to a mobile terminal having an image capturing function and a method of controlling the same.
Background Art
Terminals may generally be classified as mobile/portable terminals or stationary terminals. Mobile terminals may also be classified as handheld terminals or vehicle-mounted terminals. Mobile terminals have also become increasingly more functional. Examples of such functions include data and voice communications, capturing images and video via a camera, recording audio, playing music files via a speaker system, and displaying images and video on a display. Some mobile terminals include additional functionality which supports game playing, while other terminals are configured as multimedia players.
The user interface environment is also configured to allow users to easily search for or select functions. Recently, as the resolution and functionality of cameras provided in mobile terminals have improved, the use of cameras mounted on mobile terminals has increased. However, camera functions and interfaces remain limited, and are sometimes inconvenient for users.
Summary of the invention
Accordingly, one object of the present invention is to address the above-noted and other problems of the related art.
Another object of the present invention is to provide a mobile terminal and corresponding method for providing a graphic user interface (GUI) optimized for image capturing.
Another object of the present invention is to provide a mobile terminal and corresponding method for providing an image capturing function through a simple touch applied to a preview image.
To achieve these and other advantages and in accordance with the purpose of this specification, as embodied and broadly described herein, the present invention provides in one aspect a mobile terminal including: a wireless communication unit configured to perform wireless communication; a camera configured to obtain an image; a display unit configured to display a preview image obtained through the camera; and a controller configured to control the display unit to operate, based on a user request, in any one of a first state in which a graphic object relating to an image capturing function is displayed overlapping the preview image and a second state in which the graphic object is not displayed while the preview image is displayed, and, when a first pre-set type of touch is sensed in a region on which the preview image is displayed in the second state, to control the camera to capture the preview image based on the pre-set type of touch.
In another aspect, the present invention provides a method of controlling a mobile terminal, including the steps of: displaying, via a display unit of the mobile terminal, a preview image obtained through a camera of the mobile terminal; controlling, via a controller of the mobile terminal and based on a user request, the display unit to operate in any one of a first state in which a graphic object relating to an image capturing function is displayed overlapping the preview image and a second state in which the graphic object is not displayed while the preview image is displayed; and, when a first pre-set type of touch is sensed in a region on which the preview image is displayed in the second state, controlling the camera, via the controller, to capture the preview image based on the pre-set type of touch.
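As a rough illustration only, the two-state control described above can be modeled as a small state machine. The class and method names below are hypothetical and do not appear in the patent; the sketch merely models the claimed behavior: a user request toggles the display between a first state (capture-related graphic objects overlap the preview) and a second state (only the preview is shown), and in the second state a pre-set type of touch applied to the preview region triggers image capture.

```python
from enum import Enum

class DisplayState(Enum):
    FIRST = 1   # graphic objects relating to capture are overlaid on the preview
    SECOND = 2  # only the preview image is displayed

class PreviewController:
    """Hypothetical model of the claimed two-state preview controller."""

    def __init__(self):
        self.state = DisplayState.FIRST
        self.captured = []

    def on_user_request(self):
        # A user request toggles which state the display unit operates in.
        self.state = (DisplayState.SECOND if self.state is DisplayState.FIRST
                      else DisplayState.FIRST)

    def on_touch(self, touch_type, in_preview_region=True):
        # In the second state, a pre-set type of touch applied to the region
        # displaying the preview image causes the camera to capture the image.
        if (self.state is DisplayState.SECOND and in_preview_region
                and touch_type == "preset"):
            self.captured.append("image")
            return True
        return False

ctrl = PreviewController()
ctrl.on_touch("preset")    # first state: the touch does not trigger capture
ctrl.on_user_request()     # switch to the second state (graphic objects hidden)
ctrl.on_touch("preset")    # second state: the pre-set touch captures the image
print(len(ctrl.captured))  # → 1
```

In an actual terminal the capture path would of course involve the camera hardware and the display unit; this sketch only fixes the claimed control logic in runnable form.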
Further scope of applicability of the present application will become more apparent from the detailed description given hereinafter. However, it should be understood that the detailed description and specific examples, while indicating preferred embodiments of the invention, are given by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art from the detailed description.
Brief Description of the Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments and together with the description serve to explain the principles of the invention.
In the drawings:
Fig. 1A is a block diagram of a mobile terminal according to an embodiment of the present invention;
Fig. 1B and Fig. 1C are conceptual views of one example of the mobile terminal, viewed from different directions;
Fig. 2 is a conceptual view of a mobile terminal according to an embodiment of the present invention;
Fig. 3 is a flow chart illustrating a method of controlling a mobile terminal according to an embodiment of the present invention;
Fig. 4 is a conceptual view illustrating the control method of Fig. 3;
Figs. 5A to 5D are conceptual views illustrating a method of controlling an image capturing function differently based on various touches;
Figs. 6A to 6G are conceptual views illustrating a method of adjusting focus for a preview image using a touch applied to the display unit;
Figs. 7A to 7C are conceptual views illustrating an embodiment of switching from a second state in which output of a graphic object is limited to a first state in which the graphic object is output;
Figs. 8A to 8C are conceptual views illustrating another embodiment of switching from the second state in which output of the graphic object is limited to the first state in which the graphic object is output;
Figs. 9A to 9E are conceptual views illustrating a method of checking images captured in the second state in which output of the graphic object is limited; and
Figs. 10A to 10D are conceptual views illustrating a method of executing an image capturing function in the second state in which output of the graphic object is limited.
Detailed Description
Description will now be given in detail according to embodiments disclosed herein, with reference to the accompanying drawings. For the sake of brief description with reference to the drawings, the same or equivalent components may be provided with the same or similar reference numbers, and description thereof will not be repeated. In general, suffixes such as "module" and "unit" may be used to refer to elements or components. Use of such a suffix herein is merely intended to facilitate description of the specification, and the suffix itself is not intended to give any special meaning or function. Content well known to those skilled in the relevant art has been omitted for the sake of brevity.
The mobile terminal presented herein may be implemented using a variety of different types of terminals. Examples of such terminals include cellular phones, smart phones, user equipment, laptop computers, digital broadcast terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigators, portable computers (PCs), slate PCs, tablet PCs, ultrabooks, wearable devices (for example, smart watches, smart glasses, head mounted displays (HMDs)), and the like.
By way of non-limiting example only, further description will be made with reference to particular types of mobile terminals. However, such teachings apply equally to other types of terminals, such as those noted above. In addition, these teachings may also be applied to stationary terminals such as digital TVs, desktop computers, digital signage, and the like.
Reference is now made to Figs. 1A-1C, where Fig. 1A is a block diagram of a mobile terminal in accordance with the present invention, and Figs. 1B and 1C are conceptual views of one example of the mobile terminal, viewed from different directions.
Referring to Fig. 1A, the mobile terminal 100 is shown having components such as a wireless communication unit 110, an input unit 120, a sensing unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180, and a power supply unit 190. Implementing all of the illustrated components is not a requirement; greater or fewer components may alternatively be implemented.
The mobile terminal 100 is shown having the wireless communication unit 110 configured with several commonly implemented components. For instance, the wireless communication unit 110 typically includes one or more components which permit wireless communication between the mobile terminal 100 and a wireless communication system or network within which the mobile terminal is located.
As shown in Fig. 1A, the wireless communication unit 110 includes one or more of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115. The input unit 120 includes a camera 121 for obtaining images or video, a microphone 122, which is one type of audio input device for inputting an audio signal, and a user input unit 123 (for example, a touch key, a push key, a mechanical key, a soft key, and the like) for allowing a user to input information. Data (for example, audio, video, images, and the like) is obtained by the input unit 120 and may be analyzed and processed by the controller 180 according to device parameters, user commands, and combinations thereof.
The sensing unit 140 is typically implemented using one or more sensors configured to sense internal information of the mobile terminal, the surrounding environment of the mobile terminal, user information, and the like. For example, in Fig. 1A, the sensing unit 140 is shown having a proximity sensor 141 and an illumination sensor 142. If desired, the sensing unit 140 may alternatively or additionally include other types of sensors or devices, such as a touch sensor, an acceleration sensor, a magnetic sensor, a G-sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a finger scan sensor, an ultrasonic sensor, an optical sensor (for example, camera 121), a microphone 122, a battery gauge, an environment sensor (for example, a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor, and a gas sensor), and a chemical sensor (for example, an electronic nose, a health care sensor, a biometric sensor, and the like), to name a few. The mobile terminal 100 may be configured to utilize information obtained from the sensing unit 140, and in particular, information obtained from one or more sensors of the sensing unit 140, and combinations thereof.
The output unit 150 is typically configured to output various types of information, such as audio, video, tactile output, and the like. The output unit 150 is shown having a display unit 151, an audio output module 152, a haptic module 153, and an optical output module 154. The display unit 151 may have an inter-layered structure or an integrated structure with a touch sensor in order to facilitate a touch screen. The touch screen may provide an output interface between the mobile terminal 100 and a user, as well as function as the user input unit 123, which provides an input interface between the mobile terminal 100 and the user.
The interface unit 160 serves as an interface to various types of external devices that can be coupled to the mobile terminal 100. The interface unit 160, for example, may include any of wired or wireless ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, and the like. In some cases, the mobile terminal 100 may perform assorted control functions associated with a connected external device, in response to the external device being connected to the interface unit 160.
The memory 170 is typically implemented to store data supporting various functions or features of the mobile terminal 100. For instance, the memory 170 may be configured to store application programs executed in the mobile terminal 100, data or instructions for operations of the mobile terminal 100, and the like. Some of these application programs may be downloaded from an external server via wireless communication. Other application programs may be installed within the mobile terminal 100 at the time of manufacturing or shipping, which is typically the case for basic functions of the mobile terminal 100 (for example, receiving a call, placing a call, receiving a message, sending a message, and the like). It is common for application programs to be stored in the memory 170, installed in the mobile terminal 100, and executed by the controller 180 to perform an operation (or function) for the mobile terminal 100.
In order to drive an application program stored in the memory 170, the controller 180 may control at least some of the components described above with reference to Fig. 1A. In addition, in order to drive the application program, the controller 180 may operate two or more of the components included in the mobile terminal 100 in combination.
The power supply unit 190 can be configured to receive external power or provide internal power in order to supply the appropriate power required for operating elements and components included in the mobile terminal 100. The power supply unit 190 may include a battery, and the battery may be configured to be embedded in the terminal body, or configured to be detachable from the terminal body. At least some of the above components may operate in a cooperating manner, so as to implement an operation, a control, or a control method of a mobile terminal according to the various embodiments described below. The operation or the control method of the mobile terminal may be implemented on the mobile terminal by driving at least one application program stored in the memory 170.
Still referring to Fig. 1A, the various components depicted in this figure will now be described in more detail. Regarding the wireless communication unit 110, the broadcast receiving module 111 is typically configured to receive a broadcast signal and/or broadcast associated information from an external broadcast managing entity via a broadcast channel. The broadcast channel may include a satellite channel, a terrestrial channel, or both. In some embodiments, two or more broadcast receiving modules 111 may be utilized to facilitate simultaneous reception of two or more broadcast channels, or to support switching among broadcast channels.
The mobile communication module 112 can transmit and/or receive wireless signals to and from one or more network entities. Typical examples of a network entity include a base station, an external mobile terminal, a server, and the like. Such network entities form part of a mobile communication network, which is constructed according to technical standards or communication methods for mobile communications (for example, Global System for Mobile communications (GSM), Code Division Multi Access (CDMA), CDMA2000 (Code Division Multi Access 2000), EV-DO (Enhanced Voice-Data Optimized or Enhanced Voice-Data Only), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), HSUPA (High Speed Uplink Packet Access), Long Term Evolution (LTE), LTE-A (Long Term Evolution-Advanced), and the like).
Examples of wireless signals transmitted and/or received via the mobile communication module 112 include audio call signals, video (telephony) call signals, and data in various formats to support communication of text and multimedia messages.
The wireless Internet module 113 is configured to facilitate wireless Internet access. This module may be internally or externally coupled to the mobile terminal 100. The wireless Internet module 113 may transmit and/or receive wireless signals via communication networks according to wireless Internet technologies. Examples of such wireless Internet access include Wireless LAN (WLAN), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), HSUPA (High Speed Uplink Packet Access), Long Term Evolution (LTE), LTE-A (Long Term Evolution-Advanced), and the like. The wireless Internet module 113 may transmit/receive data according to one or more of such wireless Internet technologies, and other Internet technologies as well.
In some embodiments, when wireless Internet access such as realizes according to WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, LTE-A etc., as a part for mobile communications network, wireless Internet module 113 performs the access of this wireless Internet.Similarly, internet module 113 can cooperate with mobile communication module 112, or serves as mobile communication module 112.
The short-range communication module 114 is configured to facilitate short-range communications. Suitable technologies for implementing such short-range communications include BLUETOOTH™, Radio Frequency IDentification (RFID), Infrared Data Association (IrDA), Ultra-WideBand (UWB), ZigBee, Near Field Communication (NFC), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, Wireless USB (Wireless Universal Serial Bus), and the like. The short-range communication module 114 in general supports, via wireless area networks, wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal 100, or communications between the mobile terminal and a network where another mobile terminal 100 (or an external server) is located. One example of a wireless area network is a wireless personal area network.
In some embodiments, another mobile terminal (which may be configured similarly to the mobile terminal 100) may be a wearable device, for example, a smart watch, smart glasses, or a head mounted display (HMD), which is able to exchange data with the mobile terminal 100 (or otherwise cooperate with the mobile terminal 100). The short-range communication module 114 may sense or recognize the wearable device and permit communication between the wearable device and the mobile terminal 100. In addition, when the sensed wearable device is a device authenticated to communicate with the mobile terminal 100, the controller 180 may, for example, cause data processed in the mobile terminal 100 to be transmitted to the wearable device via the short-range communication module 114. Hence, a user of the wearable device can use the data processed in the mobile terminal 100 on the wearable device. For example, when a call is received in the mobile terminal 100, the user can answer the call using the wearable device. Also, when a message is received in the mobile terminal 100, the user can check the received message using the wearable device.
The location information module 115 is generally configured to detect, calculate, derive, or otherwise identify a position of the mobile terminal. As an example, the location information module 115 includes a Global Positioning System (GPS) module, a Wi-Fi module, or both. If desired, the location information module 115 may alternatively or additionally function with any other module of the wireless communication unit 110 to obtain data related to the position of the mobile terminal. As one example, when the mobile terminal uses a GPS module, the position of the mobile terminal may be acquired using a signal sent from a GPS satellite. As another example, when the mobile terminal uses the Wi-Fi module, the position of the mobile terminal can be acquired based on information related to a wireless access point (AP) which transmits wireless signals to, or receives wireless signals from, the Wi-Fi module.
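The GPS-or-Wi-Fi positioning described above can be sketched as a simple resolver: prefer a satellite fix when one exists, otherwise estimate from the known coordinates of visible access points. This is an illustrative sketch only; the function name, the AP-database shape, and the naive coordinate averaging are assumptions, not details from the patent.

```python
from typing import Optional, Tuple, List, Dict

Position = Tuple[float, float]  # (latitude, longitude)

def resolve_position(gps_fix: Optional[Position],
                     visible_ap_ids: List[str],
                     ap_database: Dict[str, Position]) -> Optional[Position]:
    """Return the best available position estimate, or None if nothing is known."""
    if gps_fix is not None:
        # A satellite fix takes priority over any AP-derived estimate.
        return gps_fix
    # Fallback: average the known coordinates of currently visible APs.
    known = [ap_database[ap] for ap in visible_ap_ids if ap in ap_database]
    if not known:
        return None
    lat = sum(p[0] for p in known) / len(known)
    lon = sum(p[1] for p in known) / len(known)
    return (lat, lon)
```

In practice a terminal would weight APs by signal strength rather than average them equally, but the fallback ordering is the point of the sketch.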
The input unit 120 may be configured to permit various types of input to the mobile terminal 100. Examples of such input include audio, image, video, data, and user input. Image and video input is often obtained using one or more cameras 121. Such cameras 121 may process image frames of still pictures or video obtained by an image sensor in a video or image capture mode. The processed image frames can be displayed on the display unit 151 or stored in the memory 170. In some cases, the cameras 121 may be arranged in a matrix configuration to permit a plurality of images having various angles or focal points to be input to the mobile terminal 100. As another example, the cameras 121 may be located in a stereoscopic arrangement to acquire left and right images for implementing a stereoscopic image.
The microphone 122 is generally implemented to permit audio input to the mobile terminal 100. The audio input can be processed in various manners according to a function being executed in the mobile terminal 100. If desired, the microphone 122 may include assorted noise removing algorithms to remove unwanted noise generated in the course of receiving external audio.
The user input unit 123 is a component that permits input by a user. Such user input may enable the controller 180 to control operation of the mobile terminal 100. The user input unit 123 may include one or more of a mechanical input element (for example, a key, a button located on a front and/or rear surface or a side surface of the mobile terminal 100, a dome switch, a jog wheel, a jog switch, and the like) or a touch-sensitive input. As one example, the touch-sensitive input may be a virtual key or a soft key, which is displayed on a touch screen through software processing, or a touch key which is located on the mobile terminal at a location other than the touch screen.
Further, the virtual key or the visual key may be displayed on the touch screen in various shapes, for example, graphic, text, icon, video, or a combination thereof. The user input unit 123 may recognize information sensed by the sensing unit 140, as well as information input via the aforementioned mechanical and touch-type input mechanisms, as information input from the user. Accordingly, the controller 180 may control operation of the mobile terminal 100 corresponding to the sensed information.
The sensing unit 140 is generally configured to sense one or more of internal information of the mobile terminal, surrounding environment information of the mobile terminal, user information, and the like. The controller 180 generally cooperates with the sensing unit 140 to control operation of the mobile terminal 100, or to execute data processing, a function, or an operation associated with an application program installed in the mobile terminal based on the sensing provided by the sensing unit 140. The sensing unit 140 may be implemented using any of a variety of sensors, some of which will now be described in more detail.
The proximity sensor 141 may include a sensor to sense the presence or absence of an object approaching a surface, or an object located near a surface, by using an electromagnetic field, infrared rays, or the like without mechanical contact. The proximity sensor 141 may be arranged at an inner region of the mobile terminal covered by the touch screen, or near the touch screen.
The proximity sensor 141 may include, for example, any of a transmissive type photoelectric sensor, a direct reflective type photoelectric sensor, a mirror reflective type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitance type proximity sensor, a magnetic type proximity sensor, an infrared rays proximity sensor, and the like. When the touch screen is implemented as a capacitance type, the proximity sensor 141 can sense proximity of a pointer relative to the touch screen by changes of an electromagnetic field, which is responsive to an approach of an object with conductivity. In this case, the touch screen (touch sensor) may also be categorized as a proximity sensor.
Term " proximity touch " indicates wherein indicant be positioned close to touch-screen and do not contact the situation of touch-screen usually herein, term " contact touch " indicates the situation of wherein indicant and touch-screen physical contact usually at this.Touch corresponding position for indicant relative to the proximity of touch-screen, this position by correspond to wherein indicant perpendicular to the position of touch-screen.Proximity transducer 141 can sense proximity and touch, and proximity touch mode (such as, distance, direction, speed, time, position, moving state etc.).In general, controller 180 is processed and the data that proximity touches and proximity touch mode is corresponding sensed by proximity transducer 141, and makes to export visual information on the touchscreen.In addition, controller 180 can be that proximity touches or contact touches according to the touch relative to any on touch-screen, controls mobile terminal and performs different operating or process different pieces of information.
The touch sensor can sense a touch applied to the touch screen, such as the display unit 151, using any of a variety of touch methods. Examples of such touch methods include a resistive type, a capacitive type, an infrared type, and a magnetic field type, among others. As one example, the touch sensor may be configured to convert changes of pressure applied to a specific part of the display unit 151, or a change in capacitance occurring at a specific part of the display unit 151, into electric input signals. The touch sensor may also be configured to sense not only a touched position and a touched area, but also touch pressure and/or touch capacitance. A touch object is generally used to apply a touch input to the touch sensor. Examples of typical touch objects include a finger, a touch pen, a stylus pen, a pointer, and the like.
When a touch input is sensed by the touch sensor, corresponding signals may be transmitted to a touch controller. The touch controller may process the received signals and then transmit corresponding data to the controller 180. Accordingly, the controller 180 may sense which region of the display unit 151 has been touched. Here, the touch controller may be a component separate from the controller 180, or combined with the controller 180.
In some embodiments, the controller 180 may execute the same or different controls according to the type of touch object that touches the touch screen or a touch key provided in addition to the touch screen. Whether to execute the same or different control according to the object which provides a touch input may be decided based on a current operating state of the mobile terminal 100 or a currently executed application program, for example.
The touch sensor and the proximity sensor may be implemented individually, or in combination, to sense various types of touches. Such touches include a short (or tap) touch, a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, a hovering touch, and the like.
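Several of the touch types just listed can be distinguished from only two raw measurements: how long the touch lasted and how far (and how fast) the pointer moved. The sketch below classifies a tap, long touch, drag, and flick on that basis; the threshold values are illustrative assumptions and do not come from the patent, which does not specify them.

```python
def classify_gesture(duration_s: float, distance_px: float,
                     speed_px_s: float) -> str:
    """Classify a single-pointer touch from its duration, travel, and speed."""
    MOVE_THRESHOLD_PX = 10.0   # below this, the finger "did not move"
    LONG_PRESS_S = 0.5         # hold at least this long for a long touch
    FLICK_SPEED_PX_S = 1000.0  # fast movement distinguishes flick from drag

    if distance_px < MOVE_THRESHOLD_PX:
        return "long touch" if duration_s >= LONG_PRESS_S else "tap"
    return "flick" if speed_px_s >= FLICK_SPEED_PX_S else "drag"
```

Multi-touch gestures such as pinch-in and pinch-out would need a second pointer and a comparison of the inter-pointer distance over time, which this single-pointer sketch omits.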
If desired, an ultrasonic sensor may be implemented to recognize position information relating to a touch object using ultrasonic waves. The controller 180, for example, may calculate a position of a wave generation source based on information sensed by an illumination sensor and a plurality of ultrasonic sensors. Since light is much faster than ultrasonic waves, the time for the light to reach the optical sensor is much shorter than the time for the ultrasonic wave to reach the ultrasonic sensor. The position of the wave generation source may be calculated using this fact. For instance, the position of the wave generation source may be calculated using the time difference of the ultrasonic wave's arrival at the sensor, with the light serving as a reference signal.
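The time-difference idea above can be made concrete with a small worked calculation: since the light reference arrives effectively instantly, the gap between the light arrival and the ultrasonic arrival, multiplied by the speed of sound, gives the distance to the wave source. This is a sketch under the simplifying assumption that light propagation time is zero; the speed of sound is taken as 343 m/s at room temperature.

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at ~20 °C

def source_distance_m(t_light_s: float, t_ultrasound_s: float) -> float:
    """Distance to the wave source, treating the light as a zero-delay reference."""
    dt = t_ultrasound_s - t_light_s
    if dt < 0:
        raise ValueError("ultrasound cannot arrive before the light reference")
    return SPEED_OF_SOUND_M_S * dt
```

For example, an ultrasonic arrival 1 ms after the light reference places the source about 0.34 m from the sensor. A full 2D position fix, as the text implies with "a plurality of ultrasonic sensors," would intersect such distance estimates from several sensors.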
The camera 121 typically includes at least one of a camera sensor (CCD, CMOS, etc.), a photo sensor (or image sensor), and a laser sensor. Implementing the camera 121 with a laser sensor may allow detection of a touch of a physical object with respect to a 3D stereoscopic image. The photo sensor may be laminated on, or overlapped with, the display device. The photo sensor may be configured to scan movement of a physical object in proximity to the touch screen. In more detail, the photo sensor may include photo diodes and transistors arranged in rows and columns to scan content received at the photo sensor using an electrical signal which changes according to the quantity of applied light. Namely, the photo sensor may calculate the coordinates of the physical object according to the variation of light, thereby obtaining position information of the physical object.
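One simple way to turn the row/column photodiode scan described above into object coordinates is an intensity-weighted centroid over the per-diode light-change values. The sketch below is an assumption about how "coordinates according to the variation of light" could be computed, not the patent's actual method.

```python
from typing import List, Optional, Tuple

def object_coordinates(grid: List[List[float]]) -> Optional[Tuple[float, float]]:
    """grid[row][col] holds the light change at one photodiode.
    Returns the intensity-weighted centroid (row, col), or None if no change."""
    total = sum(sum(row) for row in grid)
    if total == 0:
        return None
    r = sum(i * sum(row) for i, row in enumerate(grid)) / total
    c = sum(j * v for row in grid for j, v in enumerate(row)) / total
    return (r, c)
```

A single bright spot yields its own position; light spread across several diodes yields the spot's center of mass, which is why centroiding is a common choice for this kind of scanned sensor grid.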
The display unit 151 is generally configured to output information processed in the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program executing at the mobile terminal 100 or user interface (UI) and graphic user interface (GUI) information in response to the execution screen information.
In some embodiments, the display unit 151 may be implemented as a stereoscopic display unit for displaying stereoscopic images. A typical stereoscopic display unit may employ a stereoscopic display scheme such as a stereoscopic scheme (a glasses scheme), an auto-stereoscopic scheme (a glassless scheme), a projection scheme (a holographic scheme), or the like.
The audio output module 152 is generally configured to output audio data. Such audio data may be obtained from any of a number of different sources, such that the audio data may be received from the wireless communication unit 110 or may have been stored in the memory 170. The audio data may be output during modes such as a signal reception mode, a call mode, a record mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 152 can provide audible output related to a particular function performed by the mobile terminal 100 (for example, a call signal reception sound, a message reception sound, etc.). The audio output module 152 may also be implemented as a receiver, a speaker, a buzzer, or the like.
The haptic module 153 can be configured to generate various tactile effects that a user feels, perceives, or otherwise experiences. A typical example of a tactile effect generated by the haptic module 153 is vibration. The strength, pattern, and the like of the vibration generated by the haptic module 153 can be controlled by user selection or set by the controller. For example, the haptic module 153 may output different vibrations in a combined manner or a sequential manner.
Besides vibration, the haptic module 153 can generate various other tactile effects, including an effect by stimulation such as a pin arrangement vertically moving to contact the skin, a spray force or suction force of air through a jet orifice or a suction opening, a touch to the skin, a contact of an electrode, an electrostatic force, an effect by reproducing the sense of cold and warmth using an element that can absorb or generate heat, and the like.
The haptic module 153 can also be implemented to allow the user to feel a tactile effect through a muscle sensation such as through the user's fingers or arm, as well as to transfer the tactile effect through direct contact. Two or more haptic modules 153 may be provided according to the particular configuration of the mobile terminal 100.
The optical output module 154 can output a signal for indicating an event generation using light of a light source. Examples of events generated in the mobile terminal 100 may include message reception, call signal reception, a missed call, an alarm, a schedule notice, an email reception, information reception through an application, and the like. A signal output by the optical output module 154 may be implemented in such a manner that the mobile terminal emits monochromatic light or light with a plurality of colors. The signal output may be terminated, for example, as the mobile terminal senses that a user has checked the generated event.
The interface unit 160 serves as an interface for external devices to be connected with the mobile terminal 100. For example, the interface unit 160 can receive data transmitted from an external device, receive power to transfer to elements and components within the mobile terminal 100, or transmit internal data of the mobile terminal 100 to such an external device. The interface unit 160 may include wired or wireless headset ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, or the like.
The identification module may be a chip that stores various information for authenticating authority to use the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. In addition, the device having the identification module (also referred to herein as an "identifying device") may take the form of a smart card. Accordingly, the identifying device can be connected with the terminal 100 via the interface unit 160.
When the mobile terminal 100 is connected with an external cradle, the interface unit 160 can serve as a passage to allow power from the cradle to be supplied to the mobile terminal 100, or may serve as a passage to allow various command signals input by the user from the cradle to be transferred to the mobile terminal therethrough. Various command signals or power input from the cradle may operate as signals for recognizing that the mobile terminal is properly mounted on the cradle.
The memory 170 can store programs to support operations of the controller 180 and store input/output data (for example, a phonebook, messages, still images, videos, etc.). The memory 170 may store data related to various patterns of vibrations and audio which are output in response to touch inputs on the touch screen.
The memory 170 may include one or more types of storage mediums, including a flash memory, a hard disk, a solid state disk, a silicon disk, a multimedia card micro type, a card-type memory (e.g., SD or XD memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read-Only Memory (ROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Programmable Read-Only Memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. The mobile terminal 100 may also be operated in relation to a network storage device that performs the storage function of the memory 170 over a network, such as the Internet.
The controller 180 may typically control the general operations of the mobile terminal 100. For example, the controller 180 may set or release a lock state for restricting a user from inputting a control command with respect to applications when a status of the mobile terminal meets a preset condition.
The controller 180 can also perform the controlling and processing associated with voice calls, data communications, video calls, and the like, or perform pattern recognition processing to recognize a handwriting input or a picture drawing input performed on the touch screen as characters or images, respectively. In addition, the controller 180 can control one or a combination of those components in order to implement the various exemplary embodiments disclosed herein.
The power supply unit 190 receives external power or provides internal power and supplies the appropriate power required for operating respective elements and components included in the mobile terminal 100. The power supply unit 190 may include a battery, which is typically rechargeable or detachably coupled to the terminal body for charging.
The power supply unit 190 may include a connection port. The connection port may be configured as one example of the interface unit 160, to which an external charger for supplying power to recharge the battery is electrically connected. As another example, the power supply unit 190 may be configured to recharge the battery in a wireless manner without use of the connection port. In this example, the power supply unit 190 can receive power, transferred from an external wireless power transmitter, using at least one of an inductive coupling method based on magnetic induction or a magnetic resonance coupling method based on electromagnetic resonance.
Various embodiments described herein may be implemented in a computer-readable medium, a machine-readable medium, or a similar medium using, for example, software, hardware, or any combination thereof.
Referring now to FIGS. 1B and 1C, the mobile terminal 100 is described with reference to a bar-type terminal body. However, the mobile terminal 100 may alternatively be implemented in any of a variety of different configurations. Examples of such configurations include watch-type, clip-type, glasses-type, folder-type, flip-type, slide-type, swing-type, and swivel-type, in which two or more bodies are combined with each other in a relatively movable manner, and combinations thereof. Discussion herein will often relate to a particular type of mobile terminal.
The mobile terminal 100 will generally include a case (for example, a frame, a housing, a cover, and the like) forming the appearance of the terminal. In this embodiment, the case is formed using a front case 101 and a rear case 102. Various electronic components are incorporated into a space formed between the front case 101 and the rear case 102. At least one middle case may be additionally positioned between the front case 101 and the rear case 102.
The display unit 151 is shown located on the front side of the terminal body to output information. As illustrated, a window 151a of the display unit 151 may be mounted to the front case 101 to form the front surface of the terminal body together with the front case 101.
In some embodiments, electronic components may also be mounted to the rear case 102. Examples of such electronic components include a detachable battery 191, an identification module, a memory card, and the like. A rear cover 103 is shown covering the electronic components, and this cover may be detachably coupled to the rear case 102. Therefore, when the rear cover 103 is detached from the rear case 102, the electronic components mounted to the rear case 102 are externally exposed.
As illustrated, when the rear cover 103 is coupled to the rear case 102, a side surface of the rear case 102 is partially exposed. In some cases, upon the coupling, the rear case 102 may also be completely shielded by the rear cover 103. In some embodiments, the rear cover 103 may include an opening for externally exposing a camera 121b or an audio output module 152b.
The cases 101, 102, 103 may be formed by injection-molding synthetic resin, or may be formed of a metal, for example, stainless steel (STS), aluminum (Al), titanium (Ti), or the like. As an alternative to the example in which a plurality of cases form an inner space for accommodating components, the mobile terminal 100 may be configured such that one case forms the inner space. In this example, a mobile terminal 100 having a uni-body is formed in such a manner that synthetic resin or metal extends from a side surface to a rear surface.
If desired, the mobile terminal 100 may include a waterproofing unit for preventing the introduction of water into the terminal body. For example, the waterproofing unit may include a waterproofing member located between the window 151a and the front case 101, between the front case 101 and the rear case 102, or between the rear case 102 and the rear cover 103, to hermetically seal the inner space when those cases are coupled.
The mobile terminal 100 may include a display unit 151, first and second audio output units 152a and 152b, a proximity sensor 141, an illumination sensor 142, an optical output module 154, first and second cameras 121a and 121b, first and second manipulation units 123a and 123b, a microphone 122, an interface unit 160, and the like.
Hereinafter, as illustrated in FIGS. 1B and 1C, the mobile terminal 100 will be described as one example in which the display unit 151, the first audio output unit 152a, the proximity sensor 141, the illumination sensor 142, the optical output module 154, the first camera 121a, and the first manipulation unit 123a are arranged on the front surface of the terminal body; the second manipulation unit 123b, the microphone 122, and the interface unit 160 are arranged on the side of the terminal body; and the second audio output unit 152b and the second camera 121b are arranged on the rear surface of the terminal body.
However, those components are not limited to this arrangement. Some components may be excluded, substituted, or arranged on a different surface when needed. For example, the first manipulation unit 123a may not be located on the front surface of the terminal body, and the second audio output unit 152b may be located on the side of the terminal body rather than on the rear surface of the terminal body.
The display unit 151 can display (or output) information processed in the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program driven in the mobile terminal 100, or user interface (UI) or graphic user interface (GUI) information according to the execution screen information.
The display unit 151 may include a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light emitting diode (OLED), a flexible display, a 3-dimensional (3D) display, or an electronic ink (e-ink) display.
The display unit 151 may be implemented using two display devices, which can implement the same or different display technology. For instance, a plurality of the display units 151 may be arranged on one side, either spaced apart from each other or integrated, or these devices may be arranged on different surfaces.
The display unit 151 may also include a touch sensor which senses a touch input received at the display unit. When a touch is input to the display unit 151, the touch sensor may be configured to sense this touch, and the controller 180 may, for example, generate a control command or other signal corresponding to the touch. The content which is input in the touching manner may be a text or numerical value, or a menu item which can be indicated or designated in various modes.
The touch sensor may be configured in the form of a film having a touch pattern, disposed between the window 151a and a display on a rear surface of the window 151a, or as a metal wire patterned directly on the rear surface of the window 151a. Alternatively, the touch sensor may be integrally formed with the display. For example, the touch sensor may be disposed on a substrate of the display or within the display. The display unit 151 may also form a touch screen together with the touch sensor. Here, the touch screen may serve as the user input unit 123 (see FIG. 1A). Therefore, the touch screen may replace at least some of the functions of the first manipulation unit 123a.
The first audio output module 152a may be implemented in the form of a receiver, and the second audio output module 152b may be implemented in the form of a loud speaker to output voice audio, alarm sounds, multimedia audio reproduction, and the like.
The window 151a of the display unit 151 will typically include an aperture to permit audio generated by the first audio output module 152a to pass. One alternative is to allow audio to be released along an assembly gap between the structural bodies (for example, a gap between the window 151a and the front case 101). In this case, a hole independently formed to output audio sounds may not be seen or may otherwise be hidden in terms of appearance, thereby further simplifying the appearance and manufacturing of the mobile terminal 100.
The optical output module 154 can be configured to output light for indicating an event generation. Examples of such events include message reception, call signal reception, a missed call, an alarm, a schedule notice, an email reception, information reception through an application, and the like. When a user has checked a generated event, the controller can control the optical output module 154 to stop the light output.
The first camera 121a can process image frames such as still or moving images obtained by the image sensor in a capture mode or a video call mode. The processed image frames can then be displayed on the display unit 151 or stored in the memory 170.
The first and second manipulation units 123a and 123b are examples of the user input unit 123, which may be manipulated by a user to provide input to the mobile terminal 100. The first and second manipulation units 123a and 123b may also be commonly referred to as a manipulating portion, and may employ any tactile method that allows the user to perform manipulation such as touch, push, scroll, or the like. The first and second manipulation units 123a and 123b may also employ any non-tactile method that allows the user to perform manipulation such as a proximity touch, a hovering touch, or the like.
FIG. 1B illustrates the first manipulation unit 123a as a touch key, but possible alternatives include a mechanical key, a push key, a touch key, and combinations thereof. Input received at the first and second manipulation units 123a and 123b may be used in various ways. For example, the first manipulation unit 123a may be used by the user to provide an input to a menu, a home key, a cancel function, a search function, or the like, and the second manipulation unit 123b may be used by the user to provide an input to control a volume level being output from the first or second audio output module 152a or 152b, to switch to a touch recognition mode of the display unit 151, or the like.
As another example of the user input unit 123, a rear input unit may be located on the rear surface of the terminal body. The rear input unit can be manipulated by a user to provide input to the mobile terminal 100. The input may be used in a variety of different ways. For example, the rear input unit may be used by the user to provide an input for power on/off, start, end, scroll, control of a volume level being output from the first or second audio output module 152a or 152b, switching to a touch recognition mode of the display unit 151, and the like. The rear input unit may be configured to permit a touch input, a push input, or combinations thereof.
The rear input unit may be located to overlap the display unit 151 of the front side in a thickness direction of the terminal body. As one example, the rear input unit may be located on an upper end portion of the rear side of the terminal body such that a user can easily manipulate it using a forefinger when the user grabs the terminal body with one hand. However, the present invention is not limited thereto, and the position of the rear input unit may be changed.
When the rear input unit is provided on the rear surface of the terminal body, a new type of user interface can be implemented. Also, when the touch screen or rear input unit as described above replaces the first manipulation unit 123a located on the front surface of the terminal body, so that the first manipulation unit 123a is omitted from the front surface of the terminal body, the display unit 151 can have a larger screen.
As a further alternative, the mobile terminal 100 may include a finger scan sensor which scans a user's fingerprint. The controller 180 can then use the fingerprint information sensed by the finger scan sensor as part of an authentication procedure. The finger scan sensor may also be installed in the display unit 151 or implemented in the user input unit 123.
The microphone 122 is shown located at an end of the mobile terminal 100, but other locations are possible. If desired, multiple microphones may be implemented, with such an arrangement permitting the receiving of stereo sounds.
The interface unit 160 may serve as a path allowing the mobile terminal 100 to interface with external devices. For example, the interface unit 160 may include one or more of a connection terminal for connecting to another device (for example, an earphone, an external speaker, or the like), a port for near field communication (for example, an Infrared Data Association (IrDA) port, a Bluetooth port, a wireless LAN port, and the like), or a power supply terminal for supplying power to the mobile terminal 100. The interface unit 160 may be implemented in the form of a socket for accommodating an external card, such as a Subscriber Identity Module (SIM), a User Identity Module (UIM), or a memory card for information storage.
The second camera 121b is shown located at the rear side of the terminal body and includes an image capturing direction that is substantially opposite to the image capturing direction of the first camera unit 121a. The second camera 121b may include a plurality of lenses arranged along at least one line. The plurality of lenses may also be arranged in a matrix configuration. Such a camera may be referred to as an "array camera." When the second camera 121b is implemented as an array camera, images may be captured in various manners using the plurality of lenses, and images with better quality can be obtained.
As shown in FIG. 1C, a flash 124 is shown adjacent to the second camera 121b. When an image of a subject is captured with the camera 121b, the flash 124 may illuminate the subject. As also shown in FIG. 1C, the second audio output module 152b can be located on the terminal body. The second audio output module 152b may implement stereophonic sound functions in conjunction with the first audio output module 152a, and may also be used for implementing a speakerphone mode for call communication.
At least one antenna for wireless communication may be located on the terminal body. The antenna may be installed in the terminal body or formed by the case. For example, an antenna which configures a part of the broadcast receiving module 111 may be retractable into the terminal body. Alternatively, an antenna may be formed using a film attached to an inner surface of the rear cover 103, or a case that includes a conductive material.
A power supply unit 190 for supplying power to the mobile terminal 100 can include a battery 191, which is mounted in the terminal body or detachably coupled to an outside of the terminal body. The battery 191 can receive power via a power source cable connected to the interface unit 160. The battery 191 can also be recharged wirelessly using a wireless charger. Wireless charging can be implemented by magnetic induction or electromagnetic resonance.
The rear cover 103 is shown coupled to the rear case 102 for covering the battery 191, to prevent separation of the battery 191 and to protect the battery 191 from external impact or foreign material. When the battery 191 is detachable from the terminal body, the rear cover 103 can be detachably coupled to the rear case 102.
An accessory for protecting the appearance, or for assisting or extending the functions of the mobile terminal 100, can also be provided on the mobile terminal 100. As one example of an accessory, a cover or pouch for covering or accommodating at least one surface of the mobile terminal 100 can be provided. The cover or pouch can cooperate with the display unit 151 to extend the function of the mobile terminal 100. Another example of an accessory is a touch pen (stylus) for assisting or extending touch input to a touch screen.
According to an embodiment of the present invention, a mobile terminal, which can include one or more of the components described above, can display an image received through the camera on the display unit. In more detail, the mobile terminal can display the image received through the camera on the display unit in real time. Here, the image received through the camera may be referred to as a "preview image," an "image," and the like.
The mobile terminal according to an embodiment of the present invention can provide an image capturing function of storing the image (preview image) received through the camera in a memory. Here, the operation of the mobile terminal of storing the image received through the camera in the memory may be expressed as "capturing an image," "acquiring an image," "capturing a preview image," "performing capturing of a preview image," "processing an image capture command for a preview image," "performing an image capturing function for a preview image," and the like. Without being limited to the expressions listed above, any expression may be used freely, as long as it means that an image received through the camera is stored in a memory unit.
In one embodiment, the mobile terminal can perform an image capturing operation based on a user selection. The user selection may be referred to as a "user control command" or a "control command." The user selection can be made in various manners. For example, the user can select the image capturing function by touching or pressing a hard key provided at the mobile terminal, or by touching at least one of a soft key and a visual key output on the display unit 151.
That is, when a hard key associated with the image capturing function is touched or pressed, or when at least one of a soft key and a visual key output on the display unit 151 is touched, the controller 180 can determine that a user control command for performing the image capturing function has been received. Based on the control command, the controller 180 can capture an image input through the camera 121. In addition to these embodiments, the image capturing function can also be performed when a user voice corresponding to a preset command is received, when a specific gesture is applied to the mobile terminal, or when a preset motion of the mobile terminal is sensed.
In an embodiment of the present invention, an image capturing function can be performed. Execution of the image capturing function can refer to executing an application driven for capturing an image. When the image capturing function is executed, the controller 180 can activate the camera 121 to prepare for capturing an image. The controller 180 can also output the image input through the camera 121 on the display unit 151.
In addition, in an embodiment of the present invention, the image which is input through the activated camera 121 and output on the display unit 151 is defined as a "preview image." As a preview image, the image input through the camera 121 can be displayed on the display unit 151 in real time. When an image capturing operation is selected based on a user selection, the controller 180 can store the preview image output on the display unit 151 in the memory 170.
Hereinafter, the operation of the mobile terminal performing the image capturing function will be described with reference to FIG. 2. Specifically, FIG. 2 is a conceptual view of a mobile terminal according to an embodiment of the present invention. As described above, the mobile terminal according to an embodiment of the present invention can perform an image capturing function.
For example, the image capturing function can be performed when an icon associated with the image capturing function (or an icon of an application) is selected (or touched). When the image capturing function is performed through the icon, the display unit 151 can be in an ON state. In addition to the scheme of performing the image capturing function by selecting the icon associated with it, the image capturing function can also be performed when at least one key provided at the mobile terminal (for example, at least one of a hard key and a soft key) is selected. In this case, even though the display unit 151 is in an OFF state, the controller 180 can perform the image capturing function in response to the key provided at the mobile terminal being selected.
When the image capturing function is performed as described above, the controller 180 can display on the display unit 151 a preview image 300 related to the image capturing function together with a graphic object 302, as shown in (a) of FIG. 2. That is, the controller 180 can output the graphic object 302 related to the image capturing function on the display unit 151 in an overlapping manner. Here, the graphic object 302 can be at least one of the soft key and the visual key described above.
Based on a user request, the controller 180 can control the camera to perform capturing of the preview image 300. In more detail, based on a user request applied to the graphic object 302 output on the display unit 151, the controller 180 can set functions related to the image capturing function (for example, settings, an image capturing mode, switching between the front and rear cameras, a flash, switching between a still image and video, entering a gallery of stored images, and the like), and can perform capturing of the preview image 300 based on a touch applied to an image capture button.
In an embodiment of the present invention, the state in which the graphic object 302 related to the image capturing function overlaps the preview image 300 (as shown in (a) of FIG. 2) is defined as a "first state." In the mobile terminal according to an embodiment of the present invention, when the image capturing function is performed, the graphic object 302 may not be output, and only the preview image 300 can be output on the display unit 151, as shown in (b) of FIG. 2.
The state in which only the preview image 300 is output on the display unit 151 (without the graphic object 302, as shown in (b) of FIG. 2) can be expressed as a state in which output of the graphic object 302 on the preview image 300 is restricted. In embodiments of the present invention, this state in which output of the graphic object 302 on the preview image 300 is restricted is defined as a "second state." In the second state, the controller 180 can perform capturing of the preview image 300 based on a user request. For example, when a touch applied to a region of the display unit 151 on which the preview image 300 is output is sensed, the controller 180 can process the sensed touch as an image capture command.
The controller 180 can store the captured preview image 300 in the memory 170. In other words, the controller 180 can perform the image capturing function in either of the first state, in which the graphic object 302 is output together with the preview image 300, and the second state, in which output of the graphic object 302 on the preview image 300 is restricted.
Based on a user request, the controller 180 can determine in which of the first state and the second state the display unit will operate. In more detail, based on a user request, the controller 180 can control the display unit 151 to operate in either the first state, in which the graphic object 302 related to the image capturing function overlaps the preview image 300, or the second state, in which output of the graphic object 302 on the preview image 300 is restricted.
The display unit 151 can operate, according to a user request, in either the first state, in which the graphic object 302 related to the image capturing function overlaps the preview image 300, or the second state, in which output of the graphic object 302 on the preview image 300 is restricted. For example, based on a user request, the controller 180 can output a menu for selecting one of the first state and the second state, and when the user selects one of the two states from the menu, the controller 180 can control the display unit 151 to operate in the selected state. In another embodiment, when a preset type of touch applied by the user to the display unit 151 (for example, a flick touch) is sensed, the controller 180 can switch the state of the display unit 151 from the first state to the second state, or from the second state to the first state.
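The first/second state switching just described can be pictured as a small state machine. The following Python snippet is an editorial illustration only, not code from the patent; the names (`CameraUi`, `on_flick`, the layer strings) are all invented for this sketch:

```python
FIRST_STATE = "first"    # preview + graphic object 302 (capture button etc.)
SECOND_STATE = "second"  # preview only; graphic-object output restricted

class CameraUi:
    """Toy model of the display-state switching described for display unit 151."""

    def __init__(self):
        self.state = FIRST_STATE  # graphic object shown by default

    def on_flick(self):
        # A preset touch type (e.g. a flick) toggles between the two states.
        self.state = SECOND_STATE if self.state == FIRST_STATE else FIRST_STATE

    def visible_layers(self):
        # In the second state only the preview image is output.
        return ["preview"] if self.state == SECOND_STATE else ["preview", "graphic_object"]

ui = CameraUi()
ui.on_flick()
print(ui.state, ui.visible_layers())  # second ['preview']
```

A second flick would return the display to the first state, restoring the overlapped graphic object.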
When the display unit 151 is in the first state, the controller 180 can capture the preview image 300 based on a user selection of (or a touch applied to) the graphic object 302 (an image capture button) output on the display unit 151. Hereinafter, a method of performing the image capturing function for the preview image 300 when the display unit 151 is in the second state will be described in detail.
Specifically, the method of performing image capturing when the display unit 151 is in the second state will be described with reference to FIGS. 3 and 4. FIG. 3 is a flowchart illustrating a method of controlling a mobile terminal according to an embodiment of the present invention, and FIG. 4 is a conceptual view illustrating the control method of FIG. 3. The controller 180 can perform the image capturing function, and the image capturing function can be performed in various manners based on selection of (or a touch applied to) the icon associated with the image capturing function (or the icon of the application).
When the image capturing function is performed, the controller 180 can activate the camera 121. The controller 180 can output the preview image 300 input through the camera 121 on the display unit 151. The display unit 151 can include a display region (or execution screen display region) for displaying the execution screen of an application. Only the preview image 300 may be output on this display region.
That is, a graphic object related to image capturing may not be displayed on the display unit 151. In other words, the controller 180 can restrict output of the graphic object related to the image capturing function. Because output of the graphic object is restricted, only the preview image can be output on the display unit 151, as shown in step S310. Accordingly, the phenomenon in which a part of the preview image is obscured by the graphic object when the graphic object is output does not occur.
That is, the controller 180 can restrict output of the graphic image related to the image capturing function and output the preview image 300 (for example, on a full screen). In more detail, the controller 180 may not output a graphic object which obscures a part of the preview image. Such a graphic object can include an image capture button, a setting button for changing settings for the preview image 300, a button for entering a gallery to check images stored in the memory, a button for switching between a still image capturing mode and a video capturing mode, and the like.
When only the preview image 300 is output, without the graphic object 302 overlapped thereon, the controller 180 can perform the image capturing function. In this case, because output of the graphic object including the image capture button and the like is restricted, the controller 180 can process a user request as an image capture command for the preview image 300.
That is, in the second state in which output of the graphic object on the display unit 151 is restricted, the controller 180 can perform capturing of the preview image 300 based on a user request. Here, the user request can be a user touch applied to the display unit 151. When a preset type of touch is sensed, the controller 180 can perform capturing of the preview image 300.
In other words, in the second state in which output of the graphic object 302 on the display unit 151 is restricted, when a preset type of touch is sensed on the region on which the preview image 300 is output, the controller 180 can process the sensed touch as an image capture command in step S320. Thereafter, the controller 180 can perform image capturing based on the sensed preset type of touch and store the captured image in the memory 170.
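The second-state behavior — a preset touch on the preview region processed directly as a capture command, with the result stored in memory — can be sketched as below. This is an illustrative model, not the patent's implementation; the class, method, and touch-type names are invented:

```python
class SecondStateController:
    """Toy model: in the second state, a preset touch type sensed on the
    preview region is processed as an image capture command, and the
    captured frame is stored (standing in for memory 170)."""

    PRESET_TOUCH = "short"  # hypothetical preset touch type

    def __init__(self):
        self.memory = []

    def on_touch(self, touch_type, frame):
        if touch_type == self.PRESET_TOUCH:
            self.memory.append(frame)  # capture: store the current preview frame
            return "captured"
        return "ignored"

c = SecondStateController()
print(c.on_touch("short", "frame-1"))  # captured
print(c.memory)                        # ['frame-1']
```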
Hereinafter, a method of controlling the image capturing function in the second state based on different touches applied to the region on which the preview image is output will be described. For example, the controller 180 can control the type of the captured image based on different types of touches. The type of the image can include a still image, a video, and the like. A method of controlling the mobile terminal 100 in this manner will be described in detail with reference to FIG. 5.
When the preset type of touch is applied to a part of the region of the display unit 151 on which the preview image 300 is output, the controller 180 can capture an image. In this case, the controller 180 can perform image capturing regardless of the position at which the preset type of touch is applied. When the preset type of touch is applied, the controller 180 can adjust the focus for the preview image 300.
This focus adjustment function can be performed based on which of the preset different types of touches is applied. Different focus adjustment functions can be associated with the preset different types of touches. That is, when any one of the preset different types of touches is sensed, the controller 180 can adjust the focus for the preview image 300 according to the scheme associated with the sensed touch. The controller 180 can also adjust the focus in consideration of the position at which the preset type of touch is applied.
For example, when a preset type of touch applied to a first region of the preview image 300 is sensed, the controller 180 can adjust the focus, and when a preset type of touch applied to a second region of the preview image, different from the first region, is sensed, the controller 180 can perform the image capturing function without adjusting the focus. A control method in which the controller 180 adjusts the focus will be described in detail with reference to FIG. 6. When capturing of the preview image is performed based on the sensed preset type of touch, the captured image can be stored in the memory 170.
This can be understood with reference to FIG. 4. As shown in (a) of FIG. 4, the controller 180 can output the preview image 300 on the display unit 151 in the second state, in which output of graphics related to image capturing is restricted. Subsequently, as shown in (b) of FIG. 4, when a preset type of touch applied to the region on which the preview image 300 is output is sensed, the controller 180 can capture the image, as shown in (c) of FIG. 4. The captured image can be stored in the memory 170.
As described above, in the mobile terminal according to an embodiment of the present invention, output of the graphic object on the preview image is restricted and only the preview image is provided; thus, at the time of image capturing, the preview image is not obscured by the graphic object. Further, even in the state in which output of the graphic object is restricted, the preview image can be captured through a simple manipulation. Accordingly, the user can view a clean image and capture it, which increases user convenience.
Hereinafter, a method of controlling the image capturing function based on different touches will be described in detail with reference to the accompanying drawings. Specifically, FIGS. 5A to 5D are conceptual views illustrating a method of controlling the image capturing function differently based on various touches. The controller 180 can perform different image capturing functions, and can perform the image capturing function in different image capturing modes.
Here, the image capturing modes can include a still image capturing mode; a video capturing mode; a shot-and-clear mode (a mode of removing a specific portion (subject) included in a captured image); a high dynamic range (HDR) mode; a panorama mode; a virtual reality (VR) panorama mode; a burst mode for rapid continuous capturing; a beauty face mode (a mode of changing pixels whose colors are distinguished from adjacent colors to an average value of the adjacent colors); a dual camera mode (capturing images using the front and rear cameras simultaneously); a time machine capturing mode (a mode of capturing images at preset time intervals, storing them in the memory, and displaying a preset number of stored images based on the time point at which image capturing is performed); a smart capturing mode (a mode of setting the most suitable image capturing mode for each situation and performing image capturing accordingly); a sports mode (a mode of setting a faster shutter speed to capture a moment); a night mode (a mode of widening the aperture to increase the amount of input light); and the like.
The controller 180 can capture the preview image 300 in different image capturing modes based on preset different types of touches. Here, the preset different types of touches can include a short touch (or tap), a long touch, a drag touch, a flick touch, a swipe touch, and a hovering touch. Different functions related to image capturing can be associated with the preset different types of touches in the memory 170. That is, when any one of the preset different types of touches is sensed, the controller 180 can perform the function associated with the sensed touch, with reference to the associated information (matching information) stored in the memory 170.
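The matching information described above can be pictured as a simple lookup table. The touch-to-function pairs below are hypothetical examples chosen to mirror the embodiments that follow, not values taken from the patent:

```python
# Hypothetical matching information associating preset touch types with
# image-capturing functions, as described for memory 170.
MATCH_INFO = {
    "short": "capture_still",
    "long": "capture_video",
    "drag_left": "capture_in_mode_1",
    "drag_right": "capture_in_mode_2",
}

def dispatch(touch_type):
    # Perform the function associated with the sensed touch, if any.
    return MATCH_INFO.get(touch_type, "no_op")

print(dispatch("short"))  # capture_still
print(dispatch("hover"))  # no_op
```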
For example, as shown in (a) of FIG. 5A, when a first type of touch among the preset types, corresponding to a control command for capturing an image in a still image mode, is sensed in the region on which the preview image 300 is output, the controller 180 can capture a still image. For example, when a first type of touch (for example, a short touch or tap) applied to the region on which the preview image 300 is output is sensed, a still image can be captured, as shown in (b) of FIG. 5A.
In another embodiment, as shown in (a) of FIG. 5B, when a second type of touch among the preset types, matched with a control command for capturing an image in a video capturing mode and different from the first type of touch, is applied to the region on which the preview image 300 is output, a video can be captured. For example, when a second type of touch (for example, a long touch) applied to the region on which the preview image 300 is output is sensed, a video can be captured, as shown in (b) of FIG. 5B. Thereafter, while the video is being captured based on the second type of touch, when the second type of touch is sensed again or a touch of a type different from the second type (for example, a short touch) is sensed, the controller 180 can stop capturing the video.
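The still/video behavior of FIGS. 5A and 5B can be modeled as a toy state machine: a long touch starts video capture, and while recording, either the same touch again or a different touch stops it. An illustrative sketch with invented names, not the patent's implementation:

```python
class VideoTouchController:
    """Toy state machine for the FIG. 5B embodiment: a long touch starts
    video recording; while recording, any further touch stops it."""

    def __init__(self):
        self.recording = False

    def on_touch(self, touch_type):
        if not self.recording:
            if touch_type == "long":
                self.recording = True
                return "start_video"
            return "ignored"
        # While recording: the same type again, or a different type, both stop.
        self.recording = False
        return "stop_video"

v = VideoTouchController()
print(v.on_touch("long"))   # start_video
print(v.on_touch("short"))  # stop_video
```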
In another embodiment, as shown in Figure 5 C, by the image photograph-ing mode be associated from the touch of the preset type in the middle of different image photograph-ing mode, carries out image shoot function can be carried out for preview image 300.In more detail, the touch of this preset type can be by different moving directions apply multiple touches in the middle of one.The multiple touches applied by different moving directions can comprise: pull touch, flick touch etc.Described multiple touch can be associated from the control command for carrying out photographic images by different image photograph-ing mode respectively.As shown in Figure 5 C, controller 180 can by the middle of different image photograph-ing mode, the image photograph-ing mode that is associated from sensed touch (in the middle of the multiple touches applied by different moving directions), perform the image camera function for preview image 300.
Such as, the touch of this preset type can be touch along pulling of applying of first direction and apply along the second direction being different from first direction pull in touch any one.And what apply along first direction pulls touch and can be associated with the first image photograph-ing mode, and touch can be associated with the second image photograph-ing mode being different from the first image photograph-ing mode along pulling of applying of second direction.In this case, controller 180 can carry out carries out image shoot function according to the image photograph-ing mode joined with the directional correlation pulling touch applied by the first and second image photograph-ing mode.
As shown in Figure 5 C, when the first image photograph-ing mode is associated with the touch applied along left direction, and when applying to touch along left direction to the region exporting preview image 300, controller 180 can carry out carries out image shoot function based on sensed touch by the first image photograph-ing mode.And, as shown in Figure 5 C, when in the second condition, sense and touch (such as along any one in the middle of multiple touches of different moving directions applying, touch along in downward direction applying) time, the state of display unit 151 can be switched to the first state of the wherein Drawing Object that output is relevant with image camera function by controller 180 from the second state.
A front/rear camera switching function can also be associated with a preset type of touch. For example, as shown in FIG. 5C, a drag touch applied in a third direction can be associated with the front/rear camera switching function. When the touch applied in the third direction is sensed, the controller 180 can switch the activated camera from the front camera to the rear camera, or from the rear camera to the front camera. Therefore, in an embodiment of the present invention, instead of additionally outputting a graphic object related to the image capturing function for switching between the front and rear cameras, the front and rear cameras can be switched in the second state in which output of the graphic object is restricted.
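The direction-dependent behavior of FIG. 5C can be sketched as a direction-to-action table. Which concrete directions map to which actions (in particular which direction is the "third direction" for camera switching) is an assumption made for illustration only:

```python
# Hypothetical direction-to-action table for drag touches in the second state,
# mirroring the behavior described for FIG. 5C. The direction assignments
# are assumptions, not taken from the patent.
DIRECTION_ACTIONS = {
    "left": "capture_in_mode_1",
    "down": "switch_to_first_state",
    "up":   "toggle_front_rear_camera",  # assumed "third direction"
}

def on_drag(direction, active_camera="front"):
    action = DIRECTION_ACTIONS.get(direction, "no_op")
    if action == "toggle_front_rear_camera":
        # Switch the activated camera front <-> rear.
        return "rear" if active_camera == "front" else "front"
    return action

print(on_drag("left"))        # capture_in_mode_1
print(on_drag("up", "rear"))  # front
```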
In another embodiment, as shown in FIG. 5D, the controller 180 can perform the image capturing function based on an object approaching the display unit 151. The controller 180 can sense an object approaching a predetermined detection surface through the proximity sensor 141. When the object is sensed, the controller 180 can capture the preview image 300. The object can include a user's finger, a face, and the like.
For ease of description, the term "proximity touch" is used herein to denote a situation in which an object is positioned close to the display unit 151 without contacting the touch screen. The term "contact touch" is used herein to denote a situation in which an object makes physical contact with the display unit 151. When a proximity touch is sensed, the controller 180 can capture the preview image 300 based on the sensed proximity touch.
In more detail, when a proximity touch is sensed for a predetermined period, the controller 180 can capture the preview image 300. In this way, in embodiments of the present invention, the proximity touch and the contact touch can be clearly distinguished. In more detail, a contact touch involves the object being positioned above the display unit 151 for a moment before contacting the display unit 151, and thus includes a proximity touch. In embodiments of the present invention, the preview image is captured only when a proximity touch is sensed for more than the preset period; thus, the problem of the mobile terminal recognizing an intended contact touch as a proximity touch can be solved.
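The dwell-based distinction between a deliberate proximity touch and the brief hover that precedes a contact touch can be sketched as below. The 0.5-second threshold and the event representation are invented for illustration; the patent speaks only of a "predetermined period":

```python
def proximity_capture(events, dwell=0.5):
    """Toy check: capture only if a proximity hover is held for at least
    `dwell` seconds without turning into a contact touch.

    `events` is a list of (timestamp, kind) tuples,
    kind in {"enter", "contact", "leave"}."""
    enter_time = None
    for t, kind in events:
        if kind == "enter":
            enter_time = t
        elif kind == "contact":
            return "contact_touch"  # brief hover before contact is not a capture
        elif kind == "leave" and enter_time is not None:
            if t - enter_time >= dwell:
                return "capture"
            enter_time = None
    return "no_action"

print(proximity_capture([(0.0, "enter"), (0.8, "leave")]))    # capture
print(proximity_capture([(0.0, "enter"), (0.1, "contact")]))  # contact_touch
```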
The controller 180 can output the preview image 300 in the state in which output of the graphic object is restricted. Subsequently, as shown in (a) of FIG. 5D, the controller 180 can sense a proximity touch, in which an object is positioned close to the display unit 151 without contacting it. When the proximity touch is sensed, the controller 180 can capture the preview image 300, as shown in (b) of FIG. 5D. In an embodiment of the present invention, by performing the image capturing function using a proximity touch, foreign material that would otherwise be left on the screen by a contact touch can be prevented in advance, and image capturing can thus be performed while a clean preview image is viewed.
As described above, in the mobile terminal according to an embodiment of the present invention, various functions related to the image capturing function can be performed in the state in which output of the graphic object is restricted, as different preset types of touches are sensed. Thus, even without a graphic object obscuring part of the preview image, the user can still perform the image function he or she desires, which increases user convenience.
In the mobile terminal according to an embodiment of the present invention, when a preset type of touch is sensed in the state in which output of the graphic object is restricted, the focus for the preview image can be adjusted based on the sensed touch. Specifically, FIGS. 6A to 6G are conceptual views illustrating a method of adjusting the focus for a preview image using a touch applied to the display unit.
When only the preview image is output on the display unit 151 (the second state), the controller 180 can adjust the focus for the preview image. In one embodiment, the focusing functions include a continuous auto focus (CAF) function of automatically adjusting the focus when the mobile terminal remains still for a predetermined period, a touch auto focus (TAF) function of adjusting the focus based on the region in which a touch is sensed when a touch applied to the preview image is sensed, and the like. Hereinafter, a method in which the controller 180 sets the region of the preview image for which the focus is adjusted will be described with reference to the accompanying drawings.
As described with reference to FIGS. 3 and 4, the controller 180 can perform the image capturing function through a touch while output of the graphic object related to the image capturing function is restricted. In addition, in the state in which output of the graphic object related to the image capturing function is restricted, when a preset type of touch is sensed in the region on which the preview image 300 is output, the controller 180 can adjust the focus for the preview image before performing the image capturing function.
In one embodiment, when a preset type of touch is sensed in the region on which the preview image 300 is output, the controller 180 can adjust the focus around the portion of that region in which the touch is sensed. For example, when a preset type of touch (for example, a short touch) is sensed in the region on which the preview image 300 is output (as shown in (a) of FIG. 6A), the controller 180 can adjust the focus around the region in which the touch (short touch) is sensed, as shown in (b) of FIG. 6A.
Thereafter, as shown in (c) of FIG. 6A, the controller 180 can capture a still image of the preview image 300 whose focus has been adjusted around the region in which the touch was sensed. That is, as shown in (a), (b), and (c) of FIG. 6A, when a single short touch is applied, the controller 180 can adjust the focus around the region to which the single short touch was applied and then capture a still image.
Alternatively, when a preset type of touch (for example, a short touch) is sensed in the region on which the preview image 300 is output, the controller 180 can adjust the focus around the region in which the touch is sensed, as shown in (b) of FIG. 6A. Thereafter, when the preset type of touch (for example, a short touch) is applied again in the state in which the focus has been adjusted around the region in which the earlier touch was sensed, the controller 180 can capture a still image of the focused preview image 300.
In another embodiment, when a preset type of touch (for example, a long touch) is sensed in the second state in the region on which the preview image 300 is output (as shown in (a) of FIG. 6B), the controller 180 can adjust the focus around the portion of that region in which the touch (long touch) is sensed, as shown in (b) of FIG. 6B. Thereafter, as shown in (c) of FIG. 6B, the controller 180 can capture a video of the focused preview image 300. As shown in FIG. 6B, the controller 180 can perform both focus adjustment and video capturing using only the single touch (long touch); alternatively, when the single touch (long touch) is applied, the controller 180 can adjust only the focus, and thereafter, when an additional touch is applied, the controller 180 can capture a video.
Here, the additional touch can be a touch (long touch) applied to the region on which the preview image 300 is output, or can be a touch different from the initially applied touch (for example, a short touch), performing the function corresponding to the initially applied touch (long touch). That is, based on the type of the initially input touch, the controller 180 can adjust the focus for the region to which the touch is applied and determine the image capturing function for the preview image 300. Further, because the image capturing function is determined based on the initially input touch, when a touch different from the initially input touch (for example, a short touch) is sensed, the controller 180 can process the focused preview image 300 according to the determined image capturing function.
The controller 180 can perform both focus adjustment and image capturing with only a single touch. In more detail, the preset type of touch can include a touch-down applied to the display unit 151 and a touch-up releasing the touch applied to the display unit 151. For example, a short touch can include a short touch-down and a short touch-up, and a long touch can include a long touch-down and a long touch-up.
Whether a touch is a short touch or a long touch can be determined based on the period for which the touch is maintained between the touch-down and the touch-up. For example, when the period for which a touch is maintained between its touch-down and touch-up is shorter than a predetermined period, the controller 180 can determine the touch to be a short touch. When the period for which the touch is maintained between its touch-down and touch-up is longer than the predetermined period, the controller 180 can determine the touch to be a long touch.
When a touch-down on the display unit 151 is sensed, the controller 180 can process the sensed touch-down as a control command for adjusting the focus around the region in which the touch-down is sensed. When a touch-up is sensed, the controller 180 can process the sensed touch-up as a control command for capturing an image. Thus, as described with reference to FIGS. 6A and 6B, the controller 180 can perform both focus adjustment and image capturing with only a single touch.
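The touch-down/touch-up scheme above — down adjusts the focus, up triggers capture, and the held period decides short versus long — can be sketched as follows. The 0.5-second threshold is an assumption; the patent leaves the "predetermined period" unspecified:

```python
LONG_THRESHOLD = 0.5  # hypothetical value for the "predetermined period"

def classify(down_t, up_t):
    """Short vs. long touch, from the period held between touch-down and touch-up."""
    return "long" if (up_t - down_t) >= LONG_THRESHOLD else "short"

def single_touch_sequence(down_t, up_t, region):
    # Touch-down -> focus command for the touched region;
    # touch-up   -> capture command, so one touch does both.
    commands = [("focus", region)]
    kind = classify(down_t, up_t)
    commands.append(("capture_video" if kind == "long" else "capture_still", region))
    return commands

print(single_touch_sequence(0.0, 0.1, "face"))
# [('focus', 'face'), ('capture_still', 'face')]
```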
In another embodiment, as shown in (a) of FIG. 6C, the controller 180 can adjust the focus for the preview image based on a drag touch applied in a preset direction. The controller 180 can adjust the focus based on at least one point included in the path of the sensed drag touch. For example, when a drag touch applied in the preset direction is sensed, the controller 180 can adjust the focus based on at least one of a start point 342 and an end point 344 within the region on which the preview image 300 is output.
For example, as shown in (a) of FIG. 6C, when a drag touch applied in a preset moving direction, including the start point 342 and an end point, is sensed, the controller 180 can adjust the focus based on the start point 342, as shown in (b) of FIG. 6C. Thereafter, as shown in (c) of FIG. 6C, the controller 180 can perform the image capturing function for the preview image 300, focused based on the start point 342, in the image capturing mode (for example, the first image capturing mode shown in FIG. 5C) associated with the drag touch applied in the preset moving direction.
The controller 180 can also adjust the focus based on the end point 344 included in the drag touch applied in the preset moving direction, or based on both the start point 342 and the end point 344. Further, as described with reference to FIG. 6A, the controller 180 can perform focus adjustment using only the single touch (the touch applied in the preset moving direction) and likewise capture an image in the image capturing mode associated with that single touch; alternatively, when the single touch (the touch applied in the preset moving direction) is applied, the controller 180 can perform only focus adjustment, and thereafter, when an additional touch is applied, perform the image capturing function in the associated image capturing mode.
In another embodiment, as shown in (a) of FIG. 6D, when a touch of a preset type is applied (for example, a drag touch extending from a touch start point so as to draw a circular trace, referred to hereinafter as a "circle drag touch"), the controller 180 may adjust focus based on the region to which the touch is applied. For example, as shown in (a) of FIG. 6D, when a circle drag touch is sensed in the region in which the preview image 300 is output, the controller 180 may adjust focus based on the region in which the drag touch is sensed, as shown in (b) of FIG. 6D.
Thereafter, when a touch of a preset type different from the applied touch is sensed (for example, a short touch, a long touch, or a touch applied along a preset direction), the controller 180 may perform the image capturing function on the focused preview image 300. Here, when an image capturing mode is associated with the circle drag touch, the controller 180 may adjust focus based on that single touch (the circle drag touch) alone and subsequently capture the preview image in the associated image capturing mode.
In another embodiment, as shown in (a) of FIG. 6E, when a drag touch including at least one intersection point is applied, the controller 180 may adjust focus based on the at least one intersection point, as shown in (b) of FIG. 6E. Thereafter, when a touch of the preset type is sensed, the controller 180 may, based on the sensed touch, perform the image capturing function on the preview image 300 whose focus has been adjusted based on the at least one intersection point, as shown in (c) of FIG. 6E. Also, the controller 180 may adjust focus based on at least one region 364 including the at least one intersection point.
When a touch of a preset type different from the applied touch is sensed (for example, a short touch, a long touch, or a touch applied along a preset direction), the controller 180 may perform the image capturing function, based on the sensed touch, on the preview image 300 whose focus has been adjusted. Similarly, as described with reference to FIG. 6D, when an image capturing mode is associated with the drag touch including the at least one intersection point, the controller 180 may adjust focus based on that single touch (the drag touch including at least one intersection point) alone and subsequently capture the preview image in the associated image capturing mode.
In another embodiment, as shown in (a) of FIG. 6F, the controller 180 may divide the region of the preview image 300. The controller 180 may then perform different image capturing functions depending on which of the divided regions a touch is sensed in. For example, the controller 180 may adjust focus according to the touch point in the divided regions 332 and 334 and then perform the image capturing function, or may perform the image capturing function directly without focus adjustment.
In addition, the controller 180 may display, on the preview image 300, a guide line 330 indicating that the preview image has been divided. For example, as shown in (b) of FIG. 6F, when a touch of a preset type is applied to the first region 332, the controller 180 may adjust focus based on the touched region and perform the image capturing function on the preview image 300. Also, as shown in (c) of FIG. 6F, when a touch of a preset type is applied to the second region 334, which is different from the first region 332 among the divided regions, the controller 180 may immediately perform the image capturing function on the preview image 300 without performing focus adjustment based on the touch.
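The divided-region behavior of FIG. 6F reduces to a simple spatial dispatch. A hedged sketch, assuming two regions separated by a horizontal guide line (the boundary value and function names are illustrative, not from the patent):

```python
def handle_region_touch(point, boundary_y=240):
    """Return the actions performed for a touch at `point` on a preview
    divided by a horizontal guide line at y = boundary_y (hypothetical)."""
    x, y = point
    if y < boundary_y:
        # First region 332: adjust focus at the touch point, then capture.
        return ["focus", "capture"]
    # Second region 334: capture immediately, without focus adjustment.
    return ["capture"]

print(handle_region_touch((100, 100)))  # ['focus', 'capture']
print(handle_region_touch((100, 300)))  # ['capture']
```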
In an embodiment of the present invention, the image capturing scheme performed when a touch of a preset type is sensed may vary depending on whether focus has already been adjusted for the image. In more detail, the controller 180 may perform different functions for a touch of the same type according to whether the focus of the preview image has been adjusted.
First, when focus for the preview image 300 has been adjusted by a continuous auto focus (CAF) function, which is performed automatically when the mobile terminal remains stationary for a predetermined period, and a touch of a preset type (for example, a short touch) is then sensed, the controller 180 may capture the preview image 300 without performing focus adjustment. Likewise, when focus for the preview image 300 has been adjusted by a touch auto focus (TAF) function, which adjusts focus based on the region in which a touch is sensed on the preview image 300, and a touch of the preset type (for example, a short touch) is then sensed, the controller 180 may capture the preview image 300 without performing additional focus adjustment.
In another embodiment, in a state in which focus has been adjusted for the preview image 300 by the CAF function or by the TAF function, when a touch of a preset type (for example, a short touch) is sensed after a predetermined period has passed, the controller 180 may readjust focus based on the region in which the touch of the preset type is sensed.
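The CAF/TAF behavior can be read as a small decision rule on focus "freshness". A minimal sketch, under our own assumption that the elapsed time since the last focus adjustment is tracked (the unfocused branch and all values are illustrative):

```python
def on_short_touch(focus_adjusted, elapsed_s, refocus_period_s=3.0):
    """Decide what a short touch does, given whether CAF/TAF has already
    focused the preview and how many seconds ago. Values are illustrative."""
    if not focus_adjusted:
        # Assumed fallback, not stated in the patent: focus first, then shoot.
        return "focus_then_capture"
    if elapsed_s > refocus_period_s:
        # Focus is stale: readjust based on the touched region.
        return "refocus"
    # Focus is fresh: capture without readjusting.
    return "capture"

print(on_short_touch(True, 1.0))   # capture
print(on_short_touch(True, 5.0))   # refocus
print(on_short_touch(False, 0.0))  # focus_then_capture
```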
In another embodiment, as shown in FIG. 6G, the controller 180 may adjust focus with respect to the preview image 300 based on an object approaching the display unit 151. The controller 180 may sense, through the proximity sensor 141, that an object approaches a predetermined detection surface. When the object is sensed, the controller 180 may capture the preview image 300. The object may include a user's finger, face, and the like.
When a proximity touch is sensed, the controller 180 may adjust focus with respect to the preview image 300 based on the sensed proximity touch. In more detail, when a proximity touch is sensed, the controller 180 may adjust focus with respect to the preview region corresponding to the sensed proximity touch. Also, when the proximity touch is sensed for more than a predetermined period, the controller 180 may adjust focus with respect to the region of the preview image corresponding to the region in which the proximity touch is sensed. As described above, in an embodiment of the present invention, focus is adjusted after the predetermined period has passed, so that the proximity touch can be clearly distinguished from a contact touch.
As shown in (a) of FIG. 6G, in the state in which output of graphic objects is limited, the controller 180 may sense a proximity touch in which an object is positioned close to the display unit 151 without contacting it. When the proximity touch is sensed, the controller 180 may adjust focus with respect to the region of the preview image corresponding to the sensed proximity touch. Thereafter, as shown in (b) of FIG. 6G, the controller 180 may capture the preview image 300 based on a touch applied to the region of the focused preview image.
In another embodiment, when the focus of the preview image has been adjusted, the controller 180 may capture the preview image 300 based on a voice signal input from the outside. As described above, in the mobile terminal according to an embodiment of the present invention, focus with respect to the preview image can be adjusted and the image capturing function can be performed based on a touch of a preset type sensed in the second state, in which output of graphic objects is limited. Thus, the user can use the image capturing function on a clean preview image not covered by graphic objects and can adjust the focus of the preview image through simple manipulation, thereby meeting the user's need to capture high-quality images.
Hereinafter, a method of switching from the second state, in which output of graphic objects to the display unit is limited, to the first state, in which graphic objects related to the image capturing function overlap the preview image, will be described in detail. Specifically, FIGS. 7A to 7C are conceptual views illustrating an embodiment of switching from the second state, in which output of graphic objects is limited, to the first state, in which graphic objects are output.
In the mobile terminal according to an embodiment of the present invention, switching may be performed between the first state and the second state, and the switching may be performed according to a user request. In more detail, based on a touch of a preset type sensed in the region in which the preview image 300 is output in the second state, the controller 180 may switch the state of the display unit 151 from the second state to the first state, in which graphic objects 302 overlap the preview image 300. When any one of a plurality of touches applied along different movement directions, among the touches of the preset type, is sensed in the second state, the controller 180 may output the graphic objects 302 related to the image capturing function so as to overlap the preview image 300.
For example, as shown in (a) of FIG. 7A and in FIG. 5C, when any one of the plurality of touches applied along different movement directions is sensed (for example, a drag touch applied in a downward direction), the controller 180 may output the graphic objects 302 related to the image capturing function, as shown in (b) of FIG. 7A.
Also, when any one of the plurality of touches applied along different movement directions, among the touches of the preset type, is sensed in the second state in the region in which the preview image 300 is output, the controller 180 may determine the position to which the graphic objects are to be output based on the movement direction of the touch. For example, as shown in (a) of FIG. 7B, when any one of the plurality of touches applied along different movement directions is sensed in the second state (for example, a drag touch applied along a rightward direction), the controller 180 may output graphic objects 302a to a side (the left side) or the opposite side (the right side) corresponding to the movement direction (for example, the rightward direction) of the sensed touch.
In addition, as shown in (b) of FIG. 7B, when the graphic objects 302a are output and any one of the plurality of touches is applied again (for example, a touch applied along the rightward direction), the controller 180 may additionally output graphic objects 302b different from the already-output graphic objects 302a. Here, the different graphic objects 302b preferably correspond to image capturing functions belonging to a subclass of the already-output graphic objects 302a. However, the present invention is not limited thereto, and the other graphic objects 302b may be any graphic objects different from the already-output graphic objects 302a.
Also, when a touch of a preset type (for example, a touch applied along a leftward direction) is applied in the first state, in which the graphic objects are output, the controller 180 may switch the state of the display unit 151 from the first state to the second state, in which output of the graphic objects is limited. In another embodiment, when any one of the plurality of touches applied along different directions, among the touches of the preset type, is sensed in the second state (for example, a drag touch applied along the rightward direction), the controller 180 may perform the function associated with that touch instead of outputting the graphic objects related to the image capturing function.
For example, the function associated with the touch may be a front/rear camera switching function, as shown in FIG. 5C. In addition, (a) of FIG. 7C is a view illustrating a state in which the front camera is activated. As shown in (a) of FIG. 7C, when a drag touch applied along the rightward direction is sensed in the second state, in which output of graphic objects is limited, the controller 180 may switch the activated camera from the front camera to the rear camera by performing the front/rear camera switching function associated with that drag touch, as shown in (b) of FIG. 7C. Also, when the rear camera is activated and the touch associated with the front/rear camera switching function is sensed (for example, a drag touch applied along the rightward direction), the controller 180 may switch the activated camera from the rear camera to the front camera. Here, the mobile terminal may output graphic objects based on a different type of touch.
Next, FIGS. 8A to 8C are conceptual views illustrating another embodiment of switching from the second state, in which output of graphic objects is limited, to the first state, in which graphic objects are output. In the second state, in which output of graphic objects for the preview image 300 to the display unit 151 is limited, when touches are sensed at a plurality of points of the display unit 151, the controller 180 may switch the state of the display unit 151 from the second state to the first state, in which the graphic objects 302 related to the image capturing function overlap the preview image 300. Here, switching the state of the display unit 151 from the second state to the first state may refer to outputting the graphic objects.
Also, when the plurality of points corresponds to a first touch and a second touch, the controller 180 may output the graphic objects in the vicinity of the touch point of at least one of the first touch and the second touch. In an embodiment of the present invention, as shown in (a) of FIG. 8A, when touches are sensed at a plurality of points 352 and 354 of the display unit 151, the controller 180 may output the graphic objects 302 related to the image capturing function in the vicinity of the touch point of any one (for example, the touch 352) of the first touch 352 and the second touch 354 corresponding to the plurality of sensed points, as shown in (b) of FIG. 8A.
Here, the output of the graphic objects 302 related to the image capturing function may depend on the type of the touches sensed at the plurality of points. In more detail, when the touches sensed at the plurality of points of the display unit 151 correspond to any one of the touches of the preset type (for example, a long touch), the graphic objects 302 may be output. The output graphic objects 302 may be plural. When any one of the plurality of graphic objects 302 is selected (or touched), the plurality of graphic objects 302 may disappear. Thereafter, when a touch corresponding to any one of the touches of the preset type (for example, a long touch) is sensed at a single point rather than at a plurality of points, the controller 180 may capture an image using the function corresponding to the graphic object selected (or touched) by the user.
For example, as shown in (a) of FIG. 8A, when a plurality of touches of a preset manner (the long touch type) are sensed, the graphic objects may be output in the vicinity of the touch point of at least one touch 352 of the plurality of touches 352 and 354, as shown in (b) of FIG. 8A. The user may select any one (for example, burst shot) of the output graphic objects. Thereafter, as shown in (a) of FIG. 5B, when a touch corresponding to the preset type (the long touch type) is sensed at a single point, the controller 180 may capture an image using the function (burst shot) corresponding to the selected graphic object.
In another embodiment, as shown in (a) of FIG. 8B, when touches are sensed at a plurality of points 352 and 354 of the display unit 151, the controller 180 may output the graphic objects 302 related to the image capturing function in the vicinity of the touch point of at least one touch (for example, the touch 352) of the first touch 352 and the second touch 354 corresponding to the plurality of sensed points, as shown in (b) of FIG. 8B.
Also, the output of the graphic objects 302 related to the image capturing function may depend on the type of the touches sensed at the plurality of points. In more detail, when the touches sensed at the plurality of points of the display unit 151 are sensed as touches of a certain type (for example, drag touches applied along a preset direction), the controller 180 may output the graphic objects. The graphic objects may be plural. When any one of the plurality of graphic objects is selected, the output graphic objects may disappear. Thereafter, when a touch applied according to any one of the touches of the preset type (for example, a touch applied along the preset direction) is sensed at a single point, the controller 180 may capture an image using the function corresponding to the selected graphic object.
For example, as shown in (a) of FIG. 8B, when a plurality of touches of a preset manner (for example, drag touches applied along a leftward direction) are sensed, the graphic objects may be output in the vicinity of the touch point of at least one touch 352 of the plurality of touches 352 and 354, as shown in (b) of FIG. 8B. The user may select any one (for example, a timer) of the output graphic objects. In this case, the controller 180 may set the function (for example, the timer) corresponding to the selected graphic object as the function to be performed when a single touch, rather than a plurality of touches, is sensed as the preset type (for example, a drag touch applied along the leftward direction).
Thereafter, as shown in FIG. 5C, when a touch corresponding to the preset type (a drag touch applied along the leftward direction) is sensed at a single point, the controller 180 may capture an image using the function (the timer) corresponding to the selected graphic object. In another embodiment, the controller 180 may switch the state of the display unit 151 from the second state to the first state even when touches are not sensed at a plurality of points. That is, when a touch of a preset type is sensed at a single point, the controller 180 may output the graphic objects related to the image capturing function in the vicinity of the touch point of the sensed touch.
For example, when a touch of a preset type (for example, a long touch) is sensed in the region in which the preview image 300 is output (as shown in (a) of FIG. 8C), the controller 180 may output the graphic objects 302 related to image capturing in the vicinity of the sensed touch, as shown in (a) of FIG. 8C, rather than immediately capturing a video of the preview image (as shown in (b) of FIG. 5B) or adjusting focus (as shown in (b) of FIG. 6B). Thereafter, when any one of the plurality of graphic objects 302 is selected according to a user request, the controller 180 may perform the function (for example, burst shot) corresponding to the selected graphic object (for example, a burst shot icon).
As shown in (b) of FIG. 8C, when a drag touch (extending from a long touch) applied from any point in the region in which the preview image 300 is output is held at a certain point, the controller 180 may output the graphic objects 302 related to image capturing in the vicinity of the point at which the drag touch is held. As described above, in the mobile terminal according to an embodiment of the present invention, the mobile terminal 100 can be switched to the first state through simple manipulation even in the second state.
Also, when a touch of a preset type is sensed at a plurality of points, graphic objects for setting the image capturing function to be performed when the same touch of the preset type is sensed at a single point may be output. Thus, even without graphic objects, the user can easily associate a desired image capturing function with a touch of a desired type, and can easily perform the associated image capturing function by applying that touch again.
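The "set the function for a gesture via a multi-point touch" flow of FIGS. 8A to 8C amounts to a mutable gesture-to-function map. A minimal sketch under our own naming assumptions (`GestureMap`, the gesture keys, and the default binding are all illustrative):

```python
class GestureMap:
    """Single-point gestures mapped to capture functions; a multi-point
    touch of the same gesture opens a picker that rebinds the gesture."""

    def __init__(self):
        self.bindings = {"long_touch": "normal_shot"}  # assumed default

    def on_multi_point(self, gesture, chosen_function):
        # Graphic objects are shown near a touch point; the user's choice
        # becomes the new binding for this gesture.
        self.bindings[gesture] = chosen_function

    def on_single_point(self, gesture):
        # Later, the same gesture at one point runs the bound function.
        return self.bindings.get(gesture)

g = GestureMap()
g.on_multi_point("long_touch", "burst_shot")   # user picks burst shot
print(g.on_single_point("long_touch"))         # burst_shot
```

The design choice worth noting is that the gesture itself carries the meaning afterward, so the screen can stay free of graphic objects during normal shooting.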
Hereinafter, a method of providing additional functions for a captured image after the image is captured in the mobile terminal according to an embodiment of the present invention will be described in detail. Specifically, FIGS. 9A to 9E are conceptual views illustrating a method of checking images captured in the second state, in which output of graphic objects is limited.
As shown in (a) of FIG. 9A, in the second state, when a touch of a preset type is sensed in the region in which the preview image 300 is output, the preview image 300 may be captured. Thereafter, as shown in (b) of FIG. 9A, the controller 180 may output a thumbnail 400 of the captured image to a portion of the region in which the preview image 300 is output. The thumbnail 400 may overlap the preview image 300.
The thumbnail 400 may be a thumbnail of the most recently captured image. Also, the thumbnail 400 may disappear when a predetermined period has passed or according to a user request. When a short touch is sensed on the thumbnail 400, the controller 180 may enter a gallery for outputting images stored in the memory 170.
In addition, while the thumbnail 400 is output, when a touch of a preset type (for example, a long touch) is applied to the thumbnail 400 (as shown in (c) of FIG. 9A), the controller 180 may output the captured image 500 corresponding to the thumbnail 400 to the region in which the preview image 300 is output, as shown in (d) of FIG. 9A. The output image 500 may overlap the preview image 300.
Also, as shown in (d) of FIG. 9A, the thumbnail 400 of the captured image, together with thumbnails 401 and 402 of images captured earlier than the most recently captured image 500 and stored in the memory, may be output along with the captured image 500, which is output in an overlapping manner. Also, the controller 180 may output, to the display unit 151, a trash-bin graphic object 600 for performing a function of deleting captured images.
The controller 180 may also determine, according to circumstances, whether to output at least one of the thumbnails 400, 401, and 402 and the trash-bin graphic object 600. For example, as shown in (d) of FIG. 9A, while the long touch is held, the controller 180 may output at least one of the thumbnails 400, 401, and 402 and the trash-bin graphic object 600 to the display unit 151. Also, when release of the long touch is sensed, the controller 180 may switch the state of the display unit 151 to the second state so that only the preview image 300 is output, as shown in (a) of FIG. 9A.
As shown in (a) of FIG. 9B, when a long touch is applied to the thumbnail 400 of the captured image, the controller 180 may output the image 500 corresponding to the thumbnail 400 to the region in which the preview image 300 is output, in an overlapping manner. Thereafter, as shown in (b) of FIG. 9B, when a touch extending from the applied long touch is sensed on the thumbnail 401 of a previously captured image, the controller 180 may switch the output image 500 to the image 501 corresponding to the thumbnail 401 on which the extended touch is sensed.
In other words, when the long touch is maintained and extends from the first thumbnail 400 to the second thumbnail 401, the controller 180 may switch the image 500 corresponding to the first thumbnail 400 to the image 501 corresponding to the second thumbnail and output the image 501 to the display unit 151.
As shown in (c) of FIG. 9B, while the sensed long touch is held on the thumbnail 401, when a drag touch formed by extending the long touch from the thumbnail 401 to the trash-bin graphic object is sensed, the previously captured image 501 corresponding to the long-touched thumbnail 401 may be deleted. Here, the thumbnail 401 may be managed as data separate from the original image 501.
In this case, when the long-touched thumbnail 401 is dragged to the trash-bin graphic object 600, the controller 180 may delete both the thumbnail 401 and the original image 501 corresponding to the thumbnail 401. Alternatively, instead of deleting the thumbnail 401 and its corresponding original image together, only one of the thumbnail 401 and the corresponding original image may be deleted.
For example, when a long touch is sensed on any one thumbnail 401 of the plurality of thumbnails 400, 401, and 402, and a touch extending from the long-touched thumbnail 401 to the trash-bin graphic object 600 is sensed, the controller 180 may delete the selected thumbnail 401 and the previously captured image 501 corresponding to the selected thumbnail 401 from the memory unit.
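Because the thumbnail is managed as data separate from the original, the drag-to-trash behavior of FIG. 9B could be sketched as follows — a toy model only, with all names (`Gallery`, `drag_to_trash`) assumed for illustration:

```python
class Gallery:
    """Thumbnails kept as data separate from the original images."""

    def __init__(self, images):
        self.originals = dict(images)                      # id -> image data
        self.thumbnails = {i: f"thumb:{i}" for i in images}

    def drag_to_trash(self, image_id, delete_original=True):
        # Dragging the long-touched thumbnail onto the trash-bin object 600
        # deletes the thumbnail, and (optionally) the original with it.
        self.thumbnails.pop(image_id, None)
        if delete_original:
            self.originals.pop(image_id, None)

gal = Gallery({400: "img400", 401: "img401", 402: "img402"})
gal.drag_to_trash(401)
print(sorted(gal.thumbnails))  # [400, 402]
print(401 in gal.originals)    # False
```

The `delete_original` flag mirrors the embodiment in which only one of the thumbnail and the original is deleted.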
When the long touch is released in any one of the states shown in (a), (b), and (c) of FIG. 9B, the controller 180 may return the state of the display unit 151 to the second state, in which output of graphic objects is limited, as shown in (d) of FIG. 9B. In addition, in an embodiment of the present invention, images previously stored in the memory may be output not only after the preview image is captured but also before it is captured.
In more detail, in the state in which output of the graphic objects related to the image capturing function is limited, when a touch of a preset type is sensed, the controller 180 may output a thumbnail of a captured image to the display unit. In this case, at least one thumbnail may be output. The thumbnail may be output according to a touch of a preset type applied to a preset region, and the preset region may be the region to which the thumbnail is to be output.
At least one region to which the thumbnail is to be output (hereinafter referred to as a "thumbnail region") may be set in at least a portion of the region in which the preview image is output. That is, the controller 180 may designate at least a portion of the region in which the preview image is output as the thumbnail region. The thumbnail region may be set when an application or software is created, or may be set according to a user request.
As shown in (a) of FIG. 9C, the controller 180 may set at least one region 700 (a thumbnail region) for outputting thumbnails. Thereafter, when a touch of a preset type is sensed in the thumbnail region, the controller 180 may output at least one thumbnail 400 of the images stored in the memory 170 to the thumbnail region 700, as shown in (b) of FIG. 9C. While the at least one thumbnail 400 is output, when a touch (a short touch) on any one of the output thumbnails 400 is sensed, the controller 180 may enter the gallery for outputting the images stored in the memory 170.
Also, while the thumbnails 400 are output, when a touch (a long touch) on any one of the output thumbnails 400 is sensed, the controller 180 may provide control to perform the functions described above with reference to FIGS. 9A and 9B.
In addition, the display size and number of output thumbnails may be determined based on the extent of the touch sensed on the display unit 151. Also, the thumbnail region 700 may be determined based on the extent of the touch sensed on the display unit 151. In more detail, the controller 180 may sense the extent of the touch sensed on the display unit 151 and, based on that extent, adjust the size of the at least one region (the thumbnail region) for outputting thumbnails.
For example, the size of the thumbnail region may be proportional to the extent of the touch sensed on the display unit 151. The controller 180 may determine at least one of the number and the display size of the output thumbnails according to the size of the thumbnail region, and output thumbnails to the thumbnail region based on the determined number and display size. For example, as shown in (a) of FIG. 9D, when the extent of the touch sensed in the thumbnail region 700 is narrow, the controller 180 may adjust (for example, reduce) the size of the thumbnail region 700 and output thumbnails 400 based on the adjusted size of the thumbnail region 700.
As shown in (b) of FIG. 9D, when the extent of the touch sensed in the thumbnail region 700 is large, the controller 180 may adjust the size of the thumbnail region 700 to a size corresponding to that extent and output thumbnails 400 based on the adjusted size of the thumbnail region 700. Comparing (a) and (b) of FIG. 9D, (a) has a narrower touch extent than (b), so a larger number of small-sized thumbnails may be output; (b) has a larger touch extent than (a), so a smaller number of large-sized thumbnails may be output. With this structure, in an embodiment of the present invention, the user's need to check images stored in the memory, even before capturing the preview image, is met while output of graphic objects remains limited.
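The relationship in FIG. 9D — a wider touch extent yielding fewer, larger thumbnails and a narrower extent yielding more, smaller ones — can be expressed with a couple of illustrative formulas (all constants and names are assumptions, not values from the patent):

```python
def layout_thumbnails(touch_extent_px, strip_width=480):
    """Pick a thumbnail display size and count from the extent of the
    sensed touch. The thumbnail size tracks the touch extent; the count
    is how many fit in a fixed-width strip. Numbers are illustrative."""
    thumb = max(40, min(touch_extent_px, 200))   # clamp thumbnail size
    count = max(1, strip_width // thumb)         # how many fit in the strip
    return thumb, count

print(layout_thumbnails(60))   # (60, 8)  narrow touch: small thumbs, many
print(layout_thumbnails(300))  # (200, 2) wide touch: large thumbs, few
```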
In addition, in an embodiment of the present invention, in the second state, in which output of graphic objects is limited, thumbnails stored in the memory may be output using an image analysis function. In more detail, when a touch of a preset type is sensed on a subject included in the preview image, the controller 180 may output thumbnails related to images obtained by capturing the subject corresponding to the sensed touch.
For example, as shown in (a) of FIG. 9E, when a touch of a preset type (for example, a long touch) is sensed on a subject in the output preview image 300, the controller 180 may perform image analysis on the subject. Based on the image analysis result, the controller 180 may extract images corresponding to the result from the images stored in the memory 170 and output the extracted images to the display unit 151. In this case, the controller 180 may output the extracted images near the region in which the touch of the preset type is sensed, or to a preset region (a region for outputting thumbnails).
With this structure, in an embodiment of the present invention, images related to a subject included in the preview image can be output even though output of graphic objects is limited. Thus, in an embodiment of the present invention, captured images of the subject can easily be checked through simple manipulation, thereby meeting the user's need to view previously captured images of that subject.
As described above, in the mobile terminal according to an embodiment of the present invention, a thumbnail for checking a captured image can be output, and the captured image can be checked and deleted using the thumbnail. Thus, the user can check and delete captured images even in the second state, in which graphic objects are not output, thereby enhancing user convenience.
Hereinafter, another embodiment of performing the image capturing function in the second state, in which output of graphic objects is limited, will be described in detail. Specifically, FIGS. 10A to 10D are conceptual views illustrating a method of performing the image capturing function in the second state, in which output of graphic objects is limited.
The controller 180 may capture the preview image in the second state by utilizing different hardware components provided in the mobile terminal. For example, when a touch of a preset type is applied to the microphone 122 of the mobile terminal, the controller 180 may capture the preview image 300 output in the second state. When a touch or a touch of a preset type on the microphone 122 is sensed, as shown in FIG. 10A, or when the microphone 122 is covered so as to block noise, as shown in FIG. 10B, the controller 180 may capture the preview image 300 displayed on the display unit.
In another embodiment, as shown in FIG. 10C, the controller 180 may sense a user's motion through the camera unit 121a and perform the image capturing function on the preview image 300 based on the sensed motion. Also, the controller 180 may recognize the user's face and the user's motion through the camera 121a and capture the preview image 300.
Also, the controller 180 may perform the image capturing function on the preview image 300 based on a user motion sensed by the proximity sensor 141. The user's motion may include a motion of covering the proximity sensor 141 for a predetermined period, as shown in FIG. 10D.
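The hardware-based triggers of FIGS. 10A to 10D all reduce to sensor events mapped onto the same capture command. A hypothetical dispatcher (the event names are ours, not the patent's):

```python
# Assumed set of hardware events that should trigger a capture.
CAPTURE_EVENTS = {"mic_touched", "mic_covered", "camera_motion", "proximity_covered"}

def on_sensor_event(event, preview):
    """Capture the preview when any of the listed hardware events fires;
    ignore everything else."""
    if event in CAPTURE_EVENTS:
        return f"captured:{preview}"
    return None

print(on_sensor_event("mic_covered", "preview300"))  # captured:preview300
print(on_sensor_event("volume_key", "preview300"))   # None
```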
In addition, the embodiments of the present invention described above may be installed as a basic function of the mobile terminal when the mobile terminal is released, or may be provided in the form of an application downloadable from an external server using wireless communication. Thus, when the downloaded application is installed in the mobile terminal, the functions according to the embodiments of the present invention can be provided in the mobile terminal.
In the embodiments of the present invention, the foregoing method may be implemented as processor-readable code in a program-recorded medium. The processor-readable medium may include ROM, RAM, CD-ROM, magnetic tape, floppy disks, optical data storage devices, and the like. The processor-readable medium also includes implementations in the form of carrier waves (for example, transmission via the Internet).
The mobile terminal according to the embodiments of the present invention is not limited in the application of the foregoing configurations and methods; rather, all or part of the embodiments may be selectively combined to form various modifications.
The foregoing embodiments and advantages are merely exemplary and are not to be considered as limiting the present invention. The present teachings can be readily applied to other types of apparatuses. This description is intended to be illustrative and not to limit the scope of the claims. Many alternatives, modifications, and variations will be apparent to those skilled in the art. The features, structures, methods, and other characteristics of the embodiments described herein may be combined in various ways to obtain additional and/or alternative embodiments.
As the present features may be embodied in several forms without departing from the characteristics thereof, it should also be understood that the above-described embodiments are not limited by any of the details of the foregoing description, unless otherwise specified, but rather should be construed broadly within the scope defined in the appended claims; therefore, all changes and modifications that fall within the metes and bounds of the claims, or equivalents of such metes and bounds, are intended to be embraced by the appended claims.

Claims (20)

1. A mobile terminal comprising:
a wireless communication unit configured to perform wireless communication;
a camera configured to obtain an image;
a display unit configured to display a preview image obtained by the camera; and
a controller configured to:
based on a user request, control the display unit to operate in any one of a first state in which a graphic object related to an image capture function is displayed overlapping the preview image, and a second state in which the graphic object is not displayed while the preview image is displayed, and
when a touch of a first preset type is sensed in a region on which the preview image is displayed in the second state, control the camera to capture the preview image based on the touch of the preset type.
2. The mobile terminal according to claim 1, wherein the controller is further configured to, when the display unit is in the second state, perform different functions based on different preset types of touches sensed in the region on which the preview image is displayed.
3. The mobile terminal according to claim 2, wherein the controller further controls the camera to capture a still image of the preview image based on the touch of the first preset type, and to capture a video of the preview image based on a touch of a different preset type.
4. The mobile terminal according to claim 1, wherein the touch of the first preset type is any one of a plurality of touches applied along different moving directions, the plurality of touches being respectively associated with different image capture modes, and
wherein the controller is further configured to perform the image capture function on the preview image in the image capture mode, among the different image capture modes, that is associated with the sensed touch of the first preset type.
5. The mobile terminal according to claim 4, wherein the touch of the first preset type is any one of a drag touch applied along a first direction and a drag touch applied along a second direction different from the first direction,
wherein the drag touch applied along the first direction is associated with a first image capture mode, and the drag touch applied along the second direction is associated with a second image capture mode different from the first image capture mode, and
wherein the controller is further configured to perform the image capture function on the preview image in the first image capture mode or the second image capture mode associated with the direction of the applied drag touch.
6. The mobile terminal according to claim 1, wherein the controller is further configured to, when a touch of a second preset type is sensed in the region on which the preview image is displayed in the second state, adjust the focus of the preview image based on the region in which the touch is sensed.
7. The mobile terminal according to claim 1, wherein the controller is further configured to, when a touch of a second preset type corresponding to a drag touch applied along a preset moving direction is sensed in the second state, adjust the focus of the preview image based on at least one of a start point and an end point of the drag touch on the preview image.
8. The mobile terminal according to claim 1, wherein the controller is further configured to, when the touch of the first preset type comprises touches sensed at a plurality of points on the display unit in the second state, switch the state of the display unit from the second state to the first state.
9. The mobile terminal according to claim 8, wherein the touches at the plurality of points comprise a first touch and a second touch, and
wherein the controller is further configured to display the graphic object near at least one of the touch points of the first touch and the second touch.
10. The mobile terminal according to claim 1, wherein the controller is further configured to capture the preview image as a still image when the touch of the first preset type is a short touch, and to capture a video of the image obtained by the camera when the touch of the first preset type is a long touch.
11. A method of controlling a mobile terminal, the method comprising:
displaying, via a display unit of the mobile terminal, a preview image obtained by a camera of the mobile terminal;
based on a user request, controlling, via a controller of the mobile terminal, the display unit to operate in any one of a first state in which a graphic object related to an image capture function is displayed overlapping the preview image, and a second state in which the graphic object is not displayed while the preview image is displayed; and
when a touch of a first preset type is sensed in a region on which the preview image is displayed in the second state, controlling the camera, via the controller, to capture the preview image based on the touch of the preset type.
12. The method according to claim 11, further comprising: when the display unit is in the second state, performing different functions based on different preset types of touches sensed in the region on which the preview image is displayed.
13. The method according to claim 12, further comprising:
controlling the camera to capture a still image of the preview image based on the touch of the first preset type, and capturing a video of the preview image based on a touch of a different preset type.
14. The method according to claim 11, wherein the touch of the first preset type is any one of a plurality of touches applied along different moving directions, the plurality of touches being respectively associated with different image capture modes, and
wherein the method further comprises: performing the image capture function on the preview image in the image capture mode, among the different image capture modes, that is associated with the sensed touch of the first preset type.
15. The method according to claim 14, wherein the touch of the first preset type is any one of a drag touch applied along a first direction and a drag touch applied along a second direction different from the first direction,
wherein the drag touch applied along the first direction is associated with a first image capture mode, and the drag touch applied along the second direction is associated with a second image capture mode different from the first image capture mode, and
wherein the method further comprises: performing the image capture function on the preview image in the first image capture mode or the second image capture mode associated with the direction of the applied drag touch.
16. The method according to claim 11, further comprising: when a touch of a second preset type is sensed in the region on which the preview image is displayed in the second state, adjusting the focus of the preview image based on the region in which the touch is sensed.
17. The method according to claim 11, further comprising: when a touch of a second preset type corresponding to a drag touch applied along a preset moving direction is sensed in the second state, adjusting the focus of the preview image based on at least one of a start point and an end point of the drag touch on the preview image.
18. The method according to claim 11, further comprising: when the touch of the first preset type comprises touches sensed at a plurality of points on the display unit in the second state, switching the state of the display unit from the second state to the first state.
19. The method according to claim 18, wherein the touches at the plurality of points comprise a first touch and a second touch, and
wherein the method further comprises: displaying the graphic object near at least one of the touch points of the first touch and the second touch.
20. The method according to claim 11, further comprising: capturing the preview image as a still image when the touch of the first preset type is a short touch, and
capturing a video of the image obtained by the camera when the touch of the first preset type is a long touch.
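The touch dispatch recited in the claims above — a multi-point touch restores the first (control-visible) state, a short touch captures a still image, a long touch records video, and the direction of a drag touch selects the image capture mode — can be sketched as a single dispatcher. All names and return values below are illustrative assumptions for the sketch, not the patented implementation.

```python
def dispatch_touch(points, kind, direction=None):
    """Map a touch sensed on the preview image in the second state to an
    action, following the behavior recited in claims 1-10."""
    if points > 1:
        return 'switch_to_first_state'      # claim 8: multi-point touch
    if kind == 'short':
        return 'capture_still_image'        # claim 10: short touch
    if kind == 'long':
        return 'record_video'               # claim 10: long touch
    if kind == 'drag':
        # Claims 4-5: the drag direction selects the image capture mode.
        return ('first_capture_mode' if direction == 'first'
                else 'second_capture_mode')
    return 'ignore'
```

A controller following this scheme needs no on-screen shutter button in the second state, since every capture action is distinguished purely by the type of touch applied to the preview image.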
CN201410652555.4A 2014-05-19 2014-11-17 Mobile terminal and its control method Expired - Fee Related CN105100388B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2014-0059902 2014-05-19
KR1020140059902A KR102158214B1 (en) 2014-05-19 2014-05-19 Mobile terminal and control method for the mobile terminal

Publications (2)

Publication Number Publication Date
CN105100388A true CN105100388A (en) 2015-11-25
CN105100388B CN105100388B (en) 2019-06-21

Family

ID=51846442

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410652555.4A Expired - Fee Related CN105100388B (en) 2014-05-19 2014-11-17 Mobile terminal and its control method

Country Status (5)

Country Link
US (1) US9787890B2 (en)
EP (1) EP2947867B1 (en)
KR (1) KR102158214B1 (en)
CN (1) CN105100388B (en)
FR (1) FR3021133B1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107800878A (en) * 2017-10-19 2018-03-13 广东欧珀移动通信有限公司 image display method and device
CN108279832A (en) * 2017-01-06 2018-07-13 三星电子株式会社 Image-pickup method and electronic device
CN110462572A (en) * 2017-03-28 2019-11-15 三星电子株式会社 Electronic device and its control method
CN111201772A (en) * 2017-10-09 2020-05-26 深圳传音通讯有限公司 Video recording method, device and terminal
CN108279832B (en) * 2017-01-06 2024-05-28 三星电子株式会社 Image acquisition method and electronic device

Families Citing this family (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102091161B1 (en) * 2013-12-05 2020-03-19 엘지전자 주식회사 Mobile terminal and control method for the mobile terminal
USD767520S1 (en) * 2014-03-13 2016-09-27 Lg Electronics Inc. Cellular phone
KR20170041188A (en) 2014-08-12 2017-04-14 소니 주식회사 Information-processing device, program, and information processing method
US9503846B2 (en) * 2014-08-14 2016-11-22 Nicholas Sandin Embedded location tracking systems for sports equipment
US11567626B2 (en) * 2014-12-17 2023-01-31 Datalogic Usa, Inc. Gesture configurable floating soft trigger for touch displays on data-capture electronic devices
KR102302197B1 (en) * 2015-04-01 2021-09-14 삼성전자주식회사 Photographing apparatus, method for controlling the same, and computer-readable recording medium
US10122914B2 (en) * 2015-04-17 2018-11-06 mPerpetuo, Inc. Method of controlling a camera using a touch slider
US9838607B2 (en) 2015-04-17 2017-12-05 mPerpetuo, Inc. Passive optical electronic camera viewfinder apparatus
WO2016196988A1 (en) * 2015-06-05 2016-12-08 Apple Inc. Quick review of captured image data
US20170078240A1 (en) * 2015-09-16 2017-03-16 Whatsapp Inc. Techniques to select and configure media for media messaging
US9871962B2 (en) * 2016-03-04 2018-01-16 RollCall, LLC Movable user interface shutter button for camera
US10225471B2 (en) * 2016-03-18 2019-03-05 Kenneth L. Poindexter, JR. System and method for autonomously recording a visual media
US20190174069A1 (en) * 2016-03-18 2019-06-06 Kenneth L. Poindexter, JR. System and Method for Autonomously Recording a Visual Media
US9800975B1 (en) 2016-04-18 2017-10-24 mPerpetuo, Inc. Audio system for a digital camera
KR20170131101A (en) * 2016-05-20 2017-11-29 엘지전자 주식회사 Mobile terminal and method for controlling the same
US10009536B2 (en) 2016-06-12 2018-06-26 Apple Inc. Applying a simulated optical effect based on data received from multiple camera sensors
US10547776B2 (en) * 2016-09-23 2020-01-28 Apple Inc. Devices, methods, and graphical user interfaces for capturing and recording media in multiple modes
KR20180061652A (en) * 2016-11-30 2018-06-08 엘지전자 주식회사 Mobile terminal and method for controlling the same
DK180859B1 (en) 2017-06-04 2022-05-23 Apple Inc USER INTERFACE CAMERA EFFECTS
KR102586170B1 (en) * 2017-08-01 2023-10-10 삼성전자주식회사 Electronic device and method for providing search result thereof
EP3438853A1 (en) * 2017-08-01 2019-02-06 Samsung Electronics Co., Ltd. Electronic device and method for providing search result thereof
CN107563316A (en) * 2017-08-22 2018-01-09 努比亚技术有限公司 A kind of image pickup method, terminal and computer-readable recording medium
CN107562347B (en) * 2017-09-07 2021-04-13 北京小米移动软件有限公司 Method and device for displaying object
US11112964B2 (en) 2018-02-09 2021-09-07 Apple Inc. Media capture lock affordance for graphical user interface
JP7143098B2 (en) * 2018-03-23 2022-09-28 キヤノン株式会社 Electronic device and its control method
US10375313B1 (en) 2018-05-07 2019-08-06 Apple Inc. Creative camera
US11722764B2 (en) 2018-05-07 2023-08-08 Apple Inc. Creative camera
US11314410B2 (en) * 2018-05-29 2022-04-26 Asustek Computer Inc. Mobile device
JP6703057B2 (en) * 2018-08-31 2020-06-03 キヤノン株式会社 Electronic device, control method thereof, and program thereof
DK201870623A1 (en) 2018-09-11 2020-04-15 Apple Inc. User interfaces for simulated depth effects
US10674072B1 (en) 2019-05-06 2020-06-02 Apple Inc. User interfaces for capturing and managing visual media
US11770601B2 (en) 2019-05-06 2023-09-26 Apple Inc. User interfaces for capturing and managing visual media
US11321857B2 (en) 2018-09-28 2022-05-03 Apple Inc. Displaying and editing images with depth information
US11128792B2 (en) 2018-09-28 2021-09-21 Apple Inc. Capturing and displaying images with multiple focal planes
CN111124236B (en) * 2018-10-30 2023-04-28 斑马智行网络(香港)有限公司 Data processing method, device and machine-readable medium
US11604575B2 (en) * 2018-12-14 2023-03-14 Honeywell International Inc. Systems and methods for managing configurations of multiple displays of a vehicle
CN113168676A (en) 2019-04-18 2021-07-23 贝克曼库尔特有限公司 Protecting data of objects in a laboratory environment
US11706521B2 (en) 2019-05-06 2023-07-18 Apple Inc. User interfaces for capturing and managing visual media
CN112449101A (en) * 2019-09-05 2021-03-05 华为技术有限公司 Shooting method and electronic equipment
KR20210046323A (en) * 2019-10-18 2021-04-28 엘지전자 주식회사 Mobile terminal and assistance device attached to the same
US11054973B1 (en) 2020-06-01 2021-07-06 Apple Inc. User interfaces for managing media
US11212449B1 (en) * 2020-09-25 2021-12-28 Apple Inc. User interfaces for media capture and management
US11539876B2 (en) 2021-04-30 2022-12-27 Apple Inc. User interfaces for altering visual media
US11778339B2 (en) 2021-04-30 2023-10-03 Apple Inc. User interfaces for altering visual media
CN115484386B (en) * 2021-06-16 2023-10-31 荣耀终端有限公司 Video shooting method and electronic equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040189856A1 (en) * 2002-12-26 2004-09-30 Sony Corporation Apparatus and method for imaging, and computer program
CN101052939A (en) * 2004-07-30 2007-10-10 苹果电脑有限公司 Mode-based graphical user interfaces for touch sensitive input devices
US20100317410A1 (en) * 2009-06-11 2010-12-16 Yoo Mee Song Mobile terminal and method for controlling operation of the same
CN103002219A (en) * 2011-09-16 2013-03-27 卡西欧计算机株式会社 Imaging device, imaging method
US20140040822A1 (en) * 2007-12-20 2014-02-06 Samsung Electronics Co., Ltd. Mobile terminal having touch screen and function controlling method of the same

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20040083788A (en) * 2003-03-25 2004-10-06 삼성전자주식회사 Portable communication terminal capable of operating program using a gesture command and program operating method using thereof
KR101491594B1 (en) * 2007-11-05 2015-02-09 삼성전자주식회사 Portable terminal having touch screen and method for processing image thereof
US10503376B2 (en) * 2007-12-20 2019-12-10 Samsung Electronics Co., Ltd. Method and apparatus for adjusting an image and control guides displayed on a display
JP5083049B2 (en) * 2008-06-05 2012-11-28 富士通株式会社 Portable terminal device, preview display method, and program having display function
TWI422020B (en) * 2008-12-08 2014-01-01 Sony Corp Solid-state imaging device
KR20100071754A (en) * 2008-12-19 2010-06-29 삼성전자주식회사 Photographing method according to multi input scheme through touch and key manipulation and photographing apparatus using the same
EP2207342B1 (en) * 2009-01-07 2017-12-06 LG Electronics Inc. Mobile terminal and camera image control method thereof
EP2393000B1 (en) * 2010-06-04 2019-08-07 Lg Electronics Inc. Mobile terminal capable of providing multiplayer game and method of controlling operation of the mobile terminal
US11165963B2 (en) * 2011-06-05 2021-11-02 Apple Inc. Device, method, and graphical user interface for accessing an application in a locked device
JP5816571B2 (en) * 2012-02-21 2015-11-18 京セラ株式会社 Mobile terminal, shooting key control program, and shooting key control method
KR20130143381A (en) * 2012-06-21 2013-12-31 삼성전자주식회사 Digital photographing apparatus and method for controlling the same
CN102902452A (en) 2012-08-06 2013-01-30 北京小米科技有限责任公司 Method for photographing image and mobile terminal
JP6039328B2 (en) 2012-09-14 2016-12-07 キヤノン株式会社 Imaging control apparatus and imaging apparatus control method
US9232127B2 (en) * 2013-04-28 2016-01-05 Moshe Lior Alkouby Loupe accessory and viewing method

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040189856A1 (en) * 2002-12-26 2004-09-30 Sony Corporation Apparatus and method for imaging, and computer program
CN101052939A (en) * 2004-07-30 2007-10-10 苹果电脑有限公司 Mode-based graphical user interfaces for touch sensitive input devices
US20140040822A1 (en) * 2007-12-20 2014-02-06 Samsung Electronics Co., Ltd. Mobile terminal having touch screen and function controlling method of the same
US20100317410A1 (en) * 2009-06-11 2010-12-16 Yoo Mee Song Mobile terminal and method for controlling operation of the same
CN103002219A (en) * 2011-09-16 2013-03-27 卡西欧计算机株式会社 Imaging device, imaging method

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108279832A (en) * 2017-01-06 2018-07-13 三星电子株式会社 Image-pickup method and electronic device
CN108279832B (en) * 2017-01-06 2024-05-28 三星电子株式会社 Image acquisition method and electronic device
CN110462572A (en) * 2017-03-28 2019-11-15 三星电子株式会社 Electronic device and its control method
CN110462572B (en) * 2017-03-28 2023-03-14 三星电子株式会社 Electronic device and control method thereof
CN111201772A (en) * 2017-10-09 2020-05-26 深圳传音通讯有限公司 Video recording method, device and terminal
CN107800878A (en) * 2017-10-19 2018-03-13 广东欧珀移动通信有限公司 image display method and device

Also Published As

Publication number Publication date
EP2947867A1 (en) 2015-11-25
EP2947867B1 (en) 2019-09-18
US20150334291A1 (en) 2015-11-19
US9787890B2 (en) 2017-10-10
FR3021133B1 (en) 2019-08-30
KR102158214B1 (en) 2020-09-22
FR3021133A1 (en) 2015-11-20
CN105100388B (en) 2019-06-21
KR20150133056A (en) 2015-11-27

Similar Documents

Publication Publication Date Title
CN105100388A (en) Mobile terminal and method of controlling the same
CN110266874B (en) Mobile terminal and control method thereof
KR101832966B1 (en) Mobile terminal and method of controlling the same
EP2999128A1 (en) Mobile terminal and control method therefor
CN105830012A (en) Mobile terminal and control method therefor
CN105278745A (en) Mobile terminal and control method thereof
CN105450848A (en) Mobile terminal and controlling method thereof
CN105393522A (en) Mobile Terminal And Method For Controlling The Same
CN105141742A (en) Mobile terminal and control method for the mobile terminal
CN105278855A (en) Mobile terminal and method for controlling the same
KR20170006559A (en) Mobile terminal and method for controlling the same
CN105376396A (en) Mobile terminal and controlling method thereof
CN104915136A (en) Mobile terminal and method of controlling the same
KR20160014226A (en) Mobile terminal and method for controlling the same
KR20160090186A (en) Mobile terminal and method for controlling the same
US10915223B2 (en) Mobile terminal and method for controlling the same
CN104935723A (en) Mobile terminal and method of controlling the same
CN106406688B (en) Mobile terminal and its control method
CN105491220A (en) Mobile terminal and control method thereof
CN104978136A (en) Mobile terminal and control method for the mobile terminal
CN105323372A (en) Mobile terminal and method for controlling the same
KR20170033706A (en) Mobile terminal and control method for the mobile terminal
CN105282277A (en) Mobile terminal and method of controlling the same
KR20170022490A (en) Mobile terminal and method for controlling the same
CN107924284A (en) Mobile terminal and its control method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20190621

Termination date: 20211117